Archive for the 'Research' Category

Co-creating impact: why universities and communities should work together

Kate Pahl and Keri Facer, editors of Valuing Interdisciplinary Collaborative Research, discuss the value of co-production and collaboration between academic researchers and community projects.

Valuing Interdisciplinary Collaborative Research will be launched at the Co-Creating Cities & Communities Summer Event today in Bristol #ahrcconnect #citiesandcoms2017 @ahrcconnect

Kate Pahl

Keri Facer

Increasingly, universities are being asked to work with communities in more inclusive, collaborative and ethical ways, but the processes and practices involved are often overlooked, particularly within the arts and humanities.

University ways of knowing and doing are only one part of research, and new conceptual tools are needed to make sense of this. This makes for a new and exciting research landscape.

“Impact isn’t just about academics doing brilliant, original research… impact is co-created.”

The ‘impact’ agenda needs to shift to recognise the nature of ‘co-produced impact’. That is, impact isn’t just about academics doing brilliant, original research which is written up in articles and then reproduced in different forms for a grateful community that draws on it.

Instead, impact is co-created. People have ideas, in communities and in universities and they work on these together, bringing different knowledges and practices to those questions and ideas. This then produces a different kind of knowledge – richer, more diverse, more carefully located in real and everyday contexts and more relevant.

Connected Communities

The Connected Communities (CC) programme, a cross-research-council programme led by the AHRC, has funded over 300 projects and worked with over 500 collaborating organisations and over 700 academics from universities across the UK, on topics ranging from festivals to community food, from everyday creativity to care homes, and from hyper-local journalism to community energy.

‘Valuing Interdisciplinary Collaborative Research’ (Policy Press, 2017), the latest volume in the Connected Communities book series, brings together a diverse and rich set of research projects, ranging from community evaluation, to how community values play out in collaborative research, to how decisions on heritage should be made, to what artists do when they work with academics and communities, and to the role of performance in highlighting community concerns.

Many different people contributed to the projects, from the Heritage Lottery Fund and the Science Museum to people working within communities as well as within universities.


Some themes which emerge in the book include translation, co-production, dialogic modes of research, and tacit and embodied knowledge. A key theme is the nature of knowledge and its production practices. Ways of capturing everyday knowledge – through stories, maps, material objects, conversations and performances – are discussed and considered.

In the book we attempt to map out this new world. We offer a set of helpful ideas and ways forward that articulate what is needed to do this sort of work. We argue that projects like this need to include an element of productive divergence.

“Perhaps if this kind of research was funded more often, surprises like the recent election result wouldn’t have come as so much of a shock.”

The projects are often grounded in the world materially, and objects play a strong part. They often involve mess, uncertainty, complexity and a focus on practice, and they involve translating across different fields, as well as stories as a mode of exchange. Many of the projects draw on tacit and embodied learning informed by arts methodologies, as well as ideas from sensory and phenomenological perspectives.

Perhaps if this kind of research was funded more often, surprises like the recent election result wouldn’t have come as so much of a shock. Universities need to become more attuned to the voices of communities, to their accounts of what is important and necessary to research. The Connected Communities programme and this book make a start in redressing the balance.

 

Valuing Interdisciplinary Collaborative Research, edited by Keri Facer and Kate Pahl, is available with a 20% discount on the Policy Press website. Order here for £19.99.

Find out more about impact, influence and engagement at Policy Press here.

Policy Press newsletter subscribers receive a 35% discount – sign up here.

The views and opinions expressed on this blog site are solely those of the original blog post authors and other contributors. These views and opinions do not necessarily represent those of the Policy Press and/or any/all contributors to this site.

Academic Work, Fast and Slow

Should academics strive to be ‘fast’ or ‘slow’? Helen Kara, author of Research and evaluation for busy students and practitioners, argues that there is no one clear answer.

Helen Kara

In recent years there has been an increasingly heated debate, in the blogosphere and elsewhere, about whether academia is – or should be – ‘fast’ or ‘slow’.

This is linked to other discourses about speed such as Slow Food and Slow Cities.

Some commentators aver that the pace of life in academia is speeding up because of managerialism, the REF and its equivalents in other countries, and the ensuing pressure to conduct and publish interesting research with significant results. All of this, in addition to the increasing casualisation of employment in academia, and the increasing speed of digital communication, has led to toxic working conditions that cause academics to have breakdowns and burn out.

This doesn’t only affect academics, but also non-academics doing academic work such as undergraduate and postgraduate degrees. Also, to some academics’ surprise, this doesn’t only apply in academia, but also in the public sector more widely, and parts of the private sector too. Perhaps this is because, as the saying goes, the speed of change is faster than it’s ever been before, yet it will never be this slow again.

Continue reading ‘Academic Work, Fast and Slow’

Why we need morality included in our public policy

In today’s guest post, Clem Henricson, author of Morality and public policy, which publishes this month, demands that we put the discussion and inclusion of moral issues back into government decision making and law formation…

With an increasingly bitter secular–religious divide we need a radical shift in our take on morality – not breast-beating about the state of morals, but an enhanced understanding of the nature of morality and a way forward to remedy what is a seriously defective relationship with public policy.

Have you ever questioned why the moral sphere is segregated from core public policy? Why, in the gestation of policy, is morality hived off as the province of private conscience and the clerisy?

We have separate development, with the relegation of moral issues to some zone outside the mainstream of governmental concerns. Are governments too cowardly or ill-equipped to address these matters?

Legislation and change

It emphatically should not take so long for legislation to keep up with changes in social mores – changes in attitudes to matters such as abortion, homosexuality, cohabitation and the issue that has exercised us so much recently – assisted dying – with its haunting images of campaigners such as Tony Nicklinson and Terry Pratchett.

Why does government hide behind the private member’s bill, judicial rulings, loud protracted campaigns and flouting of the law that are so often the necessary prelude to change? Why is government dilatory and evasive, instead of embracing the essence of human relations – handling fluctuations and tensions head on?

“…an illusory dividing line drawn between […] public policy and conventional ‘morality’”

Continue reading ‘Why we need morality included in our public policy’

The Illusion of China’s Economic Liberalization

China has become an economic superpower by following the East Asian development model rather than through genuine liberalization, argues academic Andrzej Bolesta, author of China and Post-Socialist Development.

Dr Andrzej Bolesta

The leading theme of the proceedings of the Chinese rubber-stamp legislature – the National People’s Congress – is always reform.

Recently, hopes have been high. The administration of President Xi Jinping and Premier Li Keqiang has been promising more market forces in China’s economy. Despite this, in recent times China’s economic liberalization has been an illusion rather than a fact, and this will not change any time soon.

The assumption has been that the interventionist model – characterized by the state’s heavy involvement in the economy, and dominant during the first 30 years of reform and opening up – has exhausted itself.

Why? The implementation of the state-interventionist model has led to a growth in social inequalities, significant damage to the natural environment and, most recently, a drop of almost three percentage points in economic growth – now at its lowest rate in 24 years.

Market forces

The logic has been that if a lack of market forces has brought upon China these, to put it mildly, worrying trends, then more market forces are needed to remedy the situation. “We need more market forces” has been the message of the current state administration. And what has happened since the calls for more market reforms began? In terms of economic liberalization, not much.

Since the commencement of economic transformation under Deng Xiaoping in 1978, China has undergone an extensive process of liberalization. A somewhat market-based economy was constructed. But with the administration of President Hu Jintao and Premier Wen Jiabao from 2003, the liberalization process essentially halted, and China has since only reluctantly fulfilled the obligations of its WTO membership.

“The authorities have little intention of continuing economic liberalization”

The current drive towards economic liberalization is also an illusion. Some reforms will continue but the progress towards a greater role of the market in economic affairs will be slow, painful and perhaps full of retrenchments. The authorities have little intention of continuing economic liberalization. There is an important reason for this.

Shanghai – economic capital of China. Photo credit: Wikipedia

The higher echelons of the Chinese Communist Party chose China’s development model long ago, and since the beginning of the transformation the general idea has hardly been altered, despite the plethora of analyses claiming the contrary.

This model can essentially be summarized as an attempt to employ the systemic, institutional and policy solutions used by Japan and Korea during their high growth periods.

East Asian development model

It is true that China is very different from both countries. It is much larger and more decentralized; it has a different historical and institutional background, having been a socialist country; and it attempts to imitate Japan and Korea at a time when the advance of globalization imposes a great deal of openness on national economies, making some of the historical Japanese and Korean policies incompatible with the arrangements of the contemporary world.

Nevertheless, for all its indecisiveness and reform retrenchments, China’s leadership has vigorously implemented the East Asian development model as extensively as internal and external conditions have allowed – a model responsible for the most spectacular developmental advances of mankind in the second half of the twentieth century.

“In this model, however, there is hardly any space left for further economic liberalization”

This implementation is clearly visible when we examine China’s trade policies, which support exports and discriminate against imports; when we observe the deliberate development of selected sectors of export-orientated production; when we see economic nationalism becoming, alongside political nationalism, the leading state ideology; and when we see how the leadership keeps society subordinate and obedient, yet supportive of the national development trajectory, by keeping the social sphere weak and unorganized while nevertheless creating conditions for gradual improvements in welfare.

This model has largely been a success: China has become an economic superpower. In this model, however, there is hardly any space left for further economic liberalization. Further liberalization will only be possible once domestic companies reach a level of sophistication that enables them to control the domestic market in a free-market environment and to compete effectively in the global arena, as was the case with the Japanese keiretsu and Korean chaebol.

That moment has yet to arrive. The Chinese authorities continue to believe that more competition from foreign actors will negatively affect the domestic business sector. They also believe that the social and environmental problems their country is currently facing cannot be solved by market forces. And this is the perfect excuse to continue following the long-term model of development, with an intrusive and interventionist state. The Chinese leadership will talk about economic liberalization as it plays the global game, for China is part of the global economy, and the global game is to praise more market, more free trade, more economic liberalization. But it will not liberalize.

Dr. Andrzej Bolesta
@a_bolesta

China and Post-Socialist Development is available for purchase from our website here (RRP £70.00). Don’t forget, Policy Press newsletter subscribers get a 35% discount when ordering through our website. If you’re not a subscriber yet, why not sign up here today and join our Policy Press community?


Are you skilled in the dark art of Social Media?

In this blog post, Kim Eggleton, our Journals Executive, explains why she believes social media is the researcher’s new best friend

Kim Eggleton, Journals Executive

Whenever I talk to researchers about using social media, the most common “objection” I hear is that it’s self-promotion, and nothing more than vanity.

The second most common protest is that good research should stand on its own merits, and that if it’s good enough, people will find it. I can understand where both these opinions come from, but I think the world has moved on considerably and neither of these concerns is valid any longer.


What’s Social Media all about?

…in 2012, over 1.8 million articles were published in 28,000 journals.

With the inevitable information overload that came with the Internet for the masses, it has become harder and harder to make good work stand out. It is estimated that in 2012 over 1.8 million articles were published in 28,000 journals, and that in 2010, in the US alone, over 320,000 books were published. Continue reading ‘Are you skilled in the dark art of Social Media?’

Deprivation of necessities has become more widespread in Britain since 1999

Eldin Fahmy, co-editor of the Journal of Poverty and Social Justice, recently wrote a blog post for the LSE Politics and Policy blog based on a themed issue of the journal. In case you missed it on the LSE blog, here it is again!

The 2008 financial crisis and subsequent austerity measures have seen the most sustained decline in household incomes since the 1930s. In this post, Eldin Fahmy examines their impact on public perceptions of minimally adequate living standards, and on the extent of deprivation. Based on analysis of survey data for 1999 and 2012, it seems that as households have been forced to ‘tighten their belts’, perceptions of minimum living standards have become less generous. At the same time, the extent of deprivation has increased dramatically.

The 2012 UK Poverty and Social Exclusion survey (2012-PSE) is the latest and most comprehensive in a series of household surveys, conducted since the early 1980s, that adopt a ‘consensual’ approach to poverty reflecting public views on minimally adequate living standards. Since our last survey in Britain in 1999, public perceptions of what constitute the ‘necessities of life’ have become less generous. Nevertheless, the proportion of adults in Britain deprived of these necessities has increased substantially since 1999.

Poverty in Britain today is widely understood in relative terms, as an inability to take part in lifestyles and activities which are customary or widely approved in contemporary society due to insufficient resources. This requires direct observation of living standards and cannot be established simply by using arbitrary income thresholds. Since Mack and Lansley’s ground-breaking 1983 survey, surveys on poverty in 1990, 1999 and 2012 have therefore examined public views on minimally acceptable living standards and have incorporated these views within the definition and measurement of poverty itself.

One consistent finding emerging from these surveys has been the striking degree of public consensus across social groups (e.g. by gender, age, social class, income level, etc.) concerning the relative importance of different items and activities. Nevertheless, as deprivation is here understood to be relative to prevailing societal standards, we should expect that perceptions of necessities will vary across time to reflect changing living standards, tastes and customs. What, then, do the British public view as necessities of life today and in what ways has this changed since our last survey in 1999?

Table 1 shows the percentage of adults in 2012 and 1999 describing a comparable set of items and activities as ‘necessities’. In both 1999 and 2012 there is widespread agreement on many items, and perceptions of necessities extend far beyond what might be described as ‘basic’ needs to encompass a range of ‘social’ necessities. As predicted by relative deprivation theory, perceptions of necessities also reflect changes in prevailing living standards and consumption norms, for example, in relation to technological items which have become more widely available (and widely encouraged) over the 1999-2012 period.

However, one implication of a relative approach is that during periods of declining living standards public perceptions of necessities may also become less generous. Given the sustained decline in household incomes and living standards arising from the 2008 financial crisis, it would be astonishing if this was not also reflected in public attitudes to the necessities of life. Table 1 suggests that this is indeed the case.

Many items record a substantial fall in the proportion of respondents who view them as necessities in 2012 compared with 1999, with those items where public support was more equivocal in 1999 witnessing an especially dramatic decline in approval. As household incomes have become more constrained, more basic necessities (towards the top of Table 1) are increasingly prioritised over more discretionary items. As we argue in our preliminary report, it seems that the public have scaled back their expectations regarding minimum living standards in ways which reflect the prevailing climate of austerity and pessimism. One consequence of recession and the austerity programme may be that the British public have ‘tightened their belts’ and now consider many things which in the past were viewed as essential to no longer be necessities.

However, even though public perceptions of minimum living standards became less generous, the extent of deprivation of necessities has nevertheless increased for adults in Britain over this period.  Table 2 shows the percentage of adults in Britain who lack different necessities in 1999 and 2012 because they cannot afford them. The proportion of adults unable to afford items and activities considered by the British public to be ‘necessities of life’ in 2012 has increased dramatically compared with 1999. For example, the percentage of adults unable to adequately heat their home has increased seven-fold, and the percentage unable to afford a damp-free home, or to replace broken electrical goods, or to afford appropriate clothes for job interviews has at least doubled over this period.

There is now widespread agreement on what constitutes a minimally acceptable diet for adults, including two meals a day, fresh fruit and vegetables daily, and meat and fish every other day.  However, an increasing number of adults are unable to afford to eat properly, with the percentage of British adults who are unable to afford at least one of these dietary essentials increasing from 5 per cent in 1999 to 8 per cent in 2012. Since Table 2 focuses on the same items measured in comparable ways in 1999 and 2012, there has been an absolute increase in social and material deprivation over this period amongst the British adult population.

Underpinning the growth in deprivation over this period has been a rising tide of income inequality between 1999 and 2008, which ensured that, despite sustained economic growth until 2008, the benefits of growth were for the most part not enjoyed by poorer households, whose incomes and wages fell further and further behind those of the better-off in relative terms.

Following the 2008 recession there has been a modest decline in income inequality and relative income poverty, but this reflects an overall decline in societal standards rather than any absolute improvement in the circumstances of poorer households. Although this decline in living standards is also reflected in more restrictive public perceptions of necessities, the extent of social and material deprivation amongst adults in Britain has clearly increased substantially since 1999.  Indeed, these findings reflect the situation in 2012 before the majority of proposed changes to welfare benefits came into effect. Since these measures are set to hit the poor hard, our findings almost certainly underestimate the true extent of social and material deprivation in Britain today.

Note: A longer version of this article was published in the Journal of Poverty and Social Justice (Vol 22, Issue 2) in October 2014. This article gives the views of the author, and not the position of the British Politics and Policy blog, nor of the London School of Economics.

About the Author

Dr Eldin Fahmy is Senior Lecturer in the School for Policy Studies at the University of Bristol. He is a member of the ESRC-funded 2012 UK Poverty and Social Exclusion Survey research team (Ref: RES-060–25–688 0052) and co-editor of the Journal of Poverty and Social Justice.


There are fewer people registered to vote in 2015 than there were in 2010: is that to Labour’s advantage?

Policy Press authors and academics Ron Johnston and Charles Pattie have teamed up with David Rossiter to write a recent LSE General Election 2015 blog post, ‘There are fewer people registered to vote in 2015 than there were in 2010’. We were fascinated to read about the discrepancies in voter registration between this election and the last one, especially as we have been supporting the Bite the Ballot campaign, encouraging people to register to vote. Here is their post reblogged in full…

The 2010 general election result was considerably biased in Labour’s favour: if they and the Conservatives had won equal shares of the total vote, Labour could have obtained as many as 54 more seats than their Tory opponents. This bias partly reflected unequal electorates across the country’s constituencies.

Recently published data show that the number of registered electors nationally has since declined. But is Labour’s advantage still there? Ron Johnston, Charles Pattie and David Rossiter analyse those data and show that, unless the Conservatives win a lot of seats from Labour on 7 May, if the two parties are roughly equal in their number of votes Labour could again benefit from the inherent biases in the electoral system, perhaps by as many as 30 seats.

All UK general election results since the 1970s have been biased, favouring Labour over the Conservatives – bias being defined as the difference in the number of seats each would have gained if they had equal shares of the votes cast. If that had occurred in 2010 – with votes distributed across Britain’s constituencies in the same proportions as the votes actually cast – Labour would have obtained 54 more seats than the Conservatives.
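To make the arithmetic behind this definition concrete, here is a minimal sketch of the calculation described above, written in Python. It is not the authors’ code: the data structure, the field names and the simplifying decision to ignore third parties when identifying constituency winners are all assumptions for illustration. Each party’s constituency votes are rescaled so that the two parties end up with equal national totals, while each constituency keeps its share of its party’s national vote; the bias is the resulting difference in seats.

```python
def two_party_bias(constituencies):
    """constituencies: list of dicts with 'con' and 'lab' vote counts.

    Simplification: seats actually won by third parties are ignored here,
    so only the two main parties are compared in each constituency.
    """
    total_con = sum(c['con'] for c in constituencies)
    total_lab = sum(c['lab'] for c in constituencies)
    target = (total_con + total_lab) / 2  # equal shares of the two-party vote

    con_seats = lab_seats = 0
    for c in constituencies:
        # Rescale so the national totals are equal, keeping each constituency's
        # share of its party's national vote unchanged.
        notional_con = c['con'] * target / total_con
        notional_lab = c['lab'] * target / total_lab
        if notional_con > notional_lab:
            con_seats += 1
        elif notional_lab > notional_con:
            lab_seats += 1

    # A positive result indicates a pro-Labour bias: Labour wins more seats
    # than the Conservatives despite equal national vote totals.
    return lab_seats - con_seats


# Illustrative usage with made-up figures for three constituencies:
bias = two_party_bias([
    {'con': 25000, 'lab': 20000},
    {'con': 15000, 'lab': 18000},
    {'con': 14000, 'lab': 16000},
])
```

Run over all 650 constituencies with the actual 2010 results, a calculation of this kind is what yields the 54-seat figure quoted above.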

Pro-Labour bias

Several factors create this pro-Labour bias; the most consistent have been differences between constituencies won by the two parties in their average electorates and turnout rates. Small constituencies can be won with fewer votes than large ones, as can those with low turnouts compared with those with high turnouts. The mean electorate in Conservative-won seats was 72,304 in 2010, but 68,672 in those won by Labour; average turnout in those two groups of seats was 68.2% and 61.2% respectively. The former difference was worth 18 seats to Labour in the total bias of 54; the latter was worth 31 seats.

The Conservatives tried to remove the impact of differences in average electorates: the 2011 Parliamentary Voting System and Constituencies Act required all constituencies to have electorates within 5% of the national average by the time of the 2015 general election, and the Boundary Commissions’ revised recommendations for new seats applying this rule would have removed any pro-Labour bias. But the redistribution was aborted, the Liberal Democrats voting with Labour and against their coalition partners to delay the redistribution until 2016, in retaliation for the lack of progress on House of Lords reform.

But has that difference in mean electorates been reduced, if not eliminated, by changes since 2010 in the distribution of the electorate across Britain’s 650 constituencies? Labour’s advantage over the Conservatives was a consequence of:

  • Smaller constituencies on average in Scotland and Wales (65,234 and 58,627 electors respectively) – where Labour won 67 seats and the Conservatives only 9 – compared to England (average 71,918), where the Conservatives won 297 seats to Labour’s 191;
  • A decline since the constituency boundaries were defined – using data for 2000 in England and Wales, and 2004 in Scotland – in the average electorate in seats won by Labour (most of which are in urban areas) compared to those won by the Conservatives.

In general, Labour won the smaller constituencies and those with declining electorates: they needed fewer votes to win there than did the Conservatives in the larger constituencies and those with expanding electorates.

As the 2015 election is to be fought in the same constituencies as 2010, these differences presumably remain in place – and might even be exaggerated, thereby enhancing Labour’s advantage – which could be crucial in determining the largest party in a close-run election. But has there been any clear pattern of change over the five years?

The Office for National Statistics recently published the number of registered electors in each constituency in December 2014 (except that the Scottish data will not be available until May 2015). These will not be the final figures at the 2015 election, because enrolment is open until mid-April, but comparing them with those for December 2009 (before the 2010 election) provides insights into trends since then. (For Scotland, we have had to use the 2013 data.)

“…there are as many as 1 million new ‘missing voters’, joining the several million who were not registered before 2010”

Across Britain, despite overall population growth in recent years, the average constituency electorate declined by 228 individuals – in part because a large number of people have moved home but not registered at their new address (especially young people who were registered as students but have since graduated and moved away): others qualify to vote but have not registered (again, many of these are probably young people). The Electoral Commission estimates that because of these patterns there are as many as 1 million new ‘missing voters’, joining the several million who were not registered before 2010.

The first graph shows a very strong correlation between each constituency’s electorate in 2009 and 2014 – the overall pattern of constituency sizes has not changed – but with one very clear variation: average electorates declined in both England and Wales (by 558 and 888 respectively) but increased by 2,669 electors in Scotland (no doubt reflecting Scots’ keenness to vote in the 2014 Independence Referendum).

There were considerable variations around these averages, however: 286 constituencies experienced an increase, 158 of them by more than 1,000 electors; 346 experienced a decline – 213 of them by more than 1,000 electors and 96 by more than 2,500. Have the declines been concentrated in Labour-held seats, thus increasing their advantage over the Conservatives? Or has the recent population growth in many UK cities diluted the pro-Labour bias?

The answers – as illustrated in the second diagram – are yes, but only slightly, to the first question, and thus no to the second. Only constituencies won by the SNP in 2010 have, on average, increased in size. The mean electorate in Conservative-held seats declined by 224 between 2009 and 2014, compared to 1,179 in Labour-held seats (despite the growth in Scotland, where Labour holds 41 seats).

The difference between the two parties’ mean electorates was 4,016 in 2009; by 2014 it had grown to 4,971. Thus if the Conservatives and Labour each won the same seats in 2015 as in 2010, Labour could anticipate a favourable bias of some 18-20 seats from this factor alone, if the parties have near-equal vote shares.

An unlikely outcome

That is an unlikely outcome, of course. Labour’s initial strategy for 2015 targeted 106 seats. If it won them all, and all other seats stayed with their 2010 winner, the average Labour constituency in 2014 would have 68,098 electors and the average Conservative constituency 72,810 – the gap would be 4,712 electors, and the pro-Labour bias probably larger than five years ago. (The 106 seats that Labour would win – 89 of them from the Conservatives, 12 from the Liberal Democrats, 4 from Nationalist parties and one from the Greens – had an average electorate in December 2014 of 68,682.)

On the other hand, the 40 seats that the Conservatives have targeted as potential gains – 32 from Labour and 8 from the Liberal Democrats – averaged electorates of 67,475 in 2014. If all were won, the average electorate in Labour-won seats would be 68,112, whereas in Conservative-won seats it would be 71,442, a slightly smaller gap between the two of only 3,330: there would still be a pro-Labour bias, but reduced because some smaller constituencies had crossed into the Tory camp.

The marginal seats therefore have, on average, smaller electorates than those that are relatively safe for the two parties. The more of them the Conservatives win, the smaller the gap between the two parties’ mean electorates and the smaller the likely pro-Labour bias in the outcome.
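The scenario arithmetic above boils down to recomputing each party’s mean electorate after a set of target seats changes hands. Below is a minimal, hypothetical sketch of that calculation in Python; the data structure and the seat names are invented for illustration, and this is not the authors’ code.

```python
from statistics import mean

def mean_electorates(seats, flips=None):
    """seats: dict mapping seat name -> (holding party, electorate).
    flips: dict mapping seat name -> new winning party (e.g. a target list)."""
    flips = flips or {}
    by_party = {}
    for name, (holder, electorate) in seats.items():
        party = flips.get(name, holder)          # reassign flipped seats
        by_party.setdefault(party, []).append(electorate)
    return {party: mean(sizes) for party, sizes in by_party.items()}

# Illustrative usage with three made-up seats:
seats = {
    'Seat A': ('Con', 72000),
    'Seat B': ('Lab', 67000),
    'Seat C': ('Con', 69000),   # a marginal seat targeted by Labour
}
before = mean_electorates(seats)
after = mean_electorates(seats, flips={'Seat C': 'Lab'})
gap_before = before['Con'] - before['Lab']   # 70500 - 67000 = 3500
gap_after = after['Con'] - after['Lab']      # 72000 - 68000 = 4000
# Here Labour gains a relatively small Conservative seat, so the gap widens,
# as in the Labour-targets scenario above; Conservative gains of small Labour
# marginals shrink it. Applied to the December 2014 registers and the parties'
# actual target lists, this is the kind of calculation behind the 4,712- and
# 3,330-elector gaps quoted above.
```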

One other scenario worth exploration concerns Scotland, where the average electorate increased after 2009. In 2010 Labour won 41 seats there, the Liberal Democrats 11, the SNP 6 and the Conservatives 1. Some commentators suggest that the SNP might win most of the Scottish seats. If, to take the extreme case, the SNP won all 59, the average electorate in England and Wales would be 67,381 for Labour and 71,795 for the Conservatives. Labour would still have an advantage over the Conservatives in the translation of votes into seats should the two parties get approximately the same number of votes overall.

How about turnout variations? The average in 2010 was 61.2% and 68.2% in Labour- and Conservative-held seats respectively. In Labour’s 106 target seats it was 66.3%, whereas in the Conservatives’ targets it was 64.9%. If Labour won all of its targets and the 2010 pattern were replicated in 2015, turnout in the seats it won would average 62.7%, whereas in the remaining seats in Conservative hands it would be 68.8%. If the Conservatives won all of their targets, turnout in all of their seats would average 67.8%, whereas in those retained by Labour it would be 60.9%.

Once again, the conclusion is clear – Labour would be advantaged by the same pattern of turnout differentials across the constituencies in 2015 as in 2010 (even if the SNP won all of Scotland’s seats, in which case average turnout in Labour- and Conservative-held seats in England and Wales would be 60.9% and 68.2% respectively).

“Turnout differences gave Labour a further – and more substantial – advantage over its main rival in 2010…”

Labour had a considerable advantage over the Conservatives in 2010 – as at previous elections – because its seats had fewer electors on average. (Which is not to deny that some Labour-held seats have large electorates: two of the biggest in 2014 were Manchester Central and Ilford South.) That situation will not change markedly in 2015, unless the Conservatives win a large number of Labour-held marginals. Turnout differences gave Labour a further – and more substantial – advantage over its main rival in 2010, and that too is unlikely to change markedly in 2015.

In conclusion, if, as all the opinion polls suggest, the two parties are close in their vote shares on 7 May, Labour could get as many as 30 more seats than the Conservatives (with the size of that gap dependent on the outcome in Scotland). This could be sufficient to make Labour the largest party, giving Ed Miliband the first attempt to form a government – even if Labour came only second in the vote tally. Such an outcome is almost certain because of the lower turnout in Labour seats. The Conservatives’ failure to get the differences in constituency size changed, because the creation of new constituencies was aborted in 2013, makes Labour’s advantage even more certain.

About the authors

You can read more on this subject by Ron Johnston and Charles Pattie in their book Money and electoral politics, available to buy from the Policy Press website here. Don’t forget, newsletter subscribers receive a 35% discount on all our titles purchased through our website. Not a subscriber? Don’t feel left out – sign up here!

Ron Johnston is Professor of Geography at the University of Bristol. Charles Pattie is Professor of Geography at the University of Sheffield, specialising in electoral geography. David Rossiter has worked in a research capacity at the Universities of Sheffield, Oxford, Bristol, Leeds and Essex. He has been involved in the redistricting process both as academic observer (for example The Boundary Commissions, MUP, 1999) and as advisor to the Liberal Democrats at the time of the Fourth Periodic Review.

This blog was originally posted on the LSE blog here. #bitetheballot #imvotingbecause #whyvote


