Political Science journals: acceptance rates & turnaround time

What are the prospects for getting work published in political science related journals, and how long can authors expect to wait for a first decision on a manuscript submission?  The table below lists, for peer reviewed political science journals, final acceptance rates and turnaround times for manuscripts sent for external review.  Data are for the most recent collection period editors could provide (mostly 2011) and exclude journal special issues.  'Turnaround time' is measured from the date a manuscript was first received by the journal to the date the author was provided with a first decision.

Columns in the table:

Journal (italics: new or updated entry in the last 7 days)
Publisher
Acceptance rate (% mean average)
Turnaround (mean number of days to first decision)
Region (Europe, US, other; based on publisher address)
Acta Politica Palgrave 27 92 Eur
Admin. Sci. Quart. Sage 9 62 US
African Affairs Oxford 12 60 US
Am. J. Pol. Sci. Wiley 11 91 US
Am. Pol. Res. Sage 21 49 US
Am. Pol. Sci. Rev Cambridge 8 70 US
Am. Rev. Pub. Admin. Sage 17 44 US
Brit. J. Pol. Sci. Cambridge 11 90+ Eur
Brit. J. Pol. & I.R. Wiley 35 63 Eur
Camb. Rev. Intl. Affairs T&F 25 90 Eur
Canadian J of Pol. Sci. Cambridge 27 72 Canada
Citizenship Studies T&F 25 74 Eur/US-Can
Comparative Eur. Pol Palgrave 30 45-50 Eur/US
Comparative Pol. Sheridan 9 86 US
Comparative Pol. Stud. Sage 14 53 US
Contemp. Pol. Theory Palgrave 10-15 60 Eur/US
Environmental Pol. T&F 19 n/a Eur
Ethics & Global Pol. CoAction 17 61 Eur
EiOP   29 66 Eur
Europ. J. Devel. Research Palgrave 29 60 Eur
Europ. J. Int. Rel. Sage 11 94 Eur/Aus
Europ. J Pol. Econ. Elsevier 15 40 Eur
Europ. J. Pol. Res. Wiley 6 42 Eur
Europ. Union Pol. Sage 20 45 Eur
Foreign Policy Anal. Wiley 27 64 US
French Pol. Palgrave 37 68 Eur/US
Governance Wiley 13 33 US
Govt. & Opposition Cambridge 20 54 Eur
Govt. & Policy (E&P:C) Pion 25 92 Eur
Intl. Fem. J. of Pol. T&F 18 75 US
Intl. Interactions T&F 13 45 US
Intl. Org. Cambridge 10 35 US
Intl. Polit. Sociol. Wiley 13 126 US
Intl. Stud. Quart. Wiley 9 58 US
Intl. Stud. Perspec. Wiley 25 68 US
Intl. Stud. Rev. Wiley 48 73 US
Irish Political Studies T&F 46 54 Eur
J. Comms. Mgmt. Emerald 29 65 Eur
J. Conflict Resolution Sage 14 105 US
J. Europ. Integration. T&F 26 80+ Eur
J. Int. Bus. Stud. Palgrave 6 76 US
J Polit. Philosophy Wiley 5 90% in 60 Aus
J. of Politics Cambridge 12 55 US
J. Poverty & Soc. Justice Policy 30 80 Eur
J. Public Affairs Wiley 32 70 Eur
J. Pub. Adm. Res. & Theory Oxford 11 82 US
J. Social Policy Cambridge 20 91 Eur
J. Theoretical Politics Sage 15-20 90 Eur
J. Transatlantic Studies T&F 36 98 Eur
J. World Business Elsevier 12 65 US-Eur
Latin American Politics & Society Wiley 15 40 US
Legis. Stud. Quart. Wiley 14 60 US
New Polit. Econ. T&F 28 50 Eur
Org. & Mgmt. T&F 25 60 US
Organization Sage 20 112 Eur/US
Party Politics Sage 21 57 Eur
Philosophy & Public Affairs Wiley 3 95% in 60 US
Policy Sciences Springer 26 39 US
Policy Stud. J. Wiley 14 75 US
Polis   25 90 Eur/US
Politics Wiley 22 44 Eur
Polit. Behav. Springer 14 85 US
Polit. Comm. T&F 11 42 US
Polit. Geography Elsevier 22 37 Eur
Polit. Psychology Wiley 13 67 US
Polit. Quart. Wiley 60 40 Eur
Polit. Res. Quart Sage 12 75 US
Polit. Stud. Wiley 33 100 Eur
Polit. Theory Sage 9 97 US
Politics Philosophy Economics Sage 10 120 US
Problems of Post Communism ME Sharpe 26 60 US
Pub. Admin. Wiley 13 40 Eur/US
Pub. Admin. Review Wiley 12 47 US
Public Choice Springer 10 44 US/Eur
Public Opinion Quarterly Oxford 13 78 Eur
Publius Oxford 20 58 US
Quart. J. Pol. Sci. Now 14 n/a US/Eur
Regulation & Governance Wiley 12 51 Eur/US
Rev. of Intl. Orgs. Springer 15 47 Eur
Rev. of Intl. Polit. Econ. T&F 13 90 Eur/US-Can
Rev. of Intl. Stud. Cambridge 11 70 Eur
Rev. of Policy Research Wiley 36 58 US
Scand. Polit. Stud. Wiley 24 59 Eur
Security Studies T&F 10 60 US
Socio-Economic Review Oxford 13 35 Eur
Soc. Pol. & Admin Wiley 18 34 Eur
Soc. Pol & Soc. Cambridge 38 88 Eur
Soc. Sci. Quart. Wiley 14 133 US
South Europ. Soc & Pol. T&F 6 63 Eur
State Pol. & Policy Quart. Sage 25 50 US
Studies in Conflict & Terrorism T&F 36 57 US
Swiss Pol. Sci. Rev. Wiley 30 78 Eur
Terrorism & Political Violence T&F 26 39 US
The Forum De Gruyter 25 30 US
Work Employ & Soc Sage 18 47 Eur
World Politics Cambridge 7 83 US
Total: 96 titles
Mean averages (all titles): acceptance 20%; turnaround 66 days
Europe: 37 titles; mean acceptance 24%
US & Canada: 45 titles; mean acceptance 16%
Europe/US/Can/Aus: 14 titles
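As a rough illustration of how the summary figures above are derived (this is not the site's own tooling, just a sketch over a handful of rows copied from the table), the 'mean average' is a simple arithmetic mean of the per-journal values, and the regional rows group titles by the region column:

```python
# A handful of rows from the table above: (journal, acceptance %, turnaround days, region)
rows = [
    ("Acta Politica", 27, 92, "Eur"),
    ("Admin. Sci. Quart.", 9, 62, "US"),
    ("Am. J. Pol. Sci.", 11, 91, "US"),
    ("Europ. J. Pol. Res.", 6, 42, "Eur"),
    ("Intl. Org.", 10, 35, "US"),
    ("World Politics", 7, 83, "US"),
]

def mean(values):
    """Simple arithmetic mean, as used for the table's 'mean average' rows."""
    return sum(values) / len(values)

acceptance = mean([r[1] for r in rows])   # mean acceptance rate, %
turnaround = mean([r[2] for r in rows])   # mean days to first decision

print(f"mean acceptance: {acceptance:.1f}%")
print(f"mean turnaround: {turnaround:.1f} days")

# Regional breakdown, analogous to the Europe / US & Canada summary rows
by_region = {}
for name, acc, days, region in rows:
    by_region.setdefault(region, []).append(acc)
for region, accs in sorted(by_region.items()):
    print(region, len(accs), "titles;", f"mean acceptance {mean(accs):.1f}%")
```

On the full 96-title dataset this same calculation yields the figures reported above (overall acceptance 20%, turnaround 66 days); the six rows here are only a worked subset.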


The RSS feed for the table above is http://www.reviewmyreview.eu/feed

Review My Review


Thanks to all the editors concerned for contributing data for their titles to the commons. 

Titles in political science and related fields which did not provide (or pledge) information following two or more contacts are: Armed Forces & Society; Aust. J. Pol. Sci.; Brit. Pol.; Bus. & Pol.; Critical Policy Stud.; Critical Soc. Pol.; Cultural Pol.; Democratization; East Europ. Pol. & Soc.; Ephemera; Europ. Urban & Reg. Stud.; Europe-Asia Studies; Evidence & Policy; German Pol.; Global Envt. Pol.; Intl. J. Pub. Admin.; Intl. Politik; Intl. Pol.; Intl. Theory; J. Civil Soc.; J. Common Market Stud.; J. Contemp. Europ. Res.; J. Europ. Pub. Policy; J. Europ. Soc. Policy; J. Strat. Stud.; Local Govt. Stud.; Mediterranean Stud.; New Polit. Sci.; Policy & Pol.; Polit. Anal.; Polit. Sci. Quart.; Pol. & Soc.; Politikon; Polity; Post Soviet Affairs; Pub. Policy & Admin.; Reg. Stud.; Rev. African Polit. Econ.; Scot. J. Polit. Econ.; Survival; Theory, Culture & Soc.; Urban Affairs Review; West Europ. Pol.  Only 6 of these are US based titles.

A third category of replies indicated that the data were not readily available, accompanied by varying degrees of intent to respond or to consider the request at a later stage; these included European Political Science, History of Political Thought, Journal of Global Governance, Journal of Latin American Studies, International Affairs, Millennium, Parliamentary Affairs, and PS: Political Science and Politics.  Two journals added that they had a new editorial team (Journal of Public Policy, Regional & Federal Studies).

Data on any missing titles is welcome; editorial management and peer review software systems provide this information to editorial teams and publishers.  Alternatively, readers of any title who recall a recent editorial commentary containing such data can help identify missing information.

More about this site

This is an information tool for authors of manuscripts written for academic peer-reviewed political science related journals, as well as a resource for editors, publishers, reviewers and learned societies.  Beyond this page you can also find, via the link below, an opportunity to debate the review process, or to share a submission and review experience responsibly on this moderated site.  Results from surveys will also be posted in due course.

The growth in the number of trained social science researchers, together with research performance assessments, has increased the significance of the peer review process as well as the varied politics which can lurk behind it.  The essential components of the peer review system have remained largely unchanged over decades: reviewers are anonymous and volunteer journal editors have substantial executive powers.  As social scientists we know the problems such circumstances can cause, yet there are few opportunities to debate the issues.  The impact on medicine and the natural sciences is well aired in popular coverage (see also the comments page), but there is little public discussion about the social sciences.

The publishing process has always involved substantial information asymmetries between editors and authors.  Editors now have an opportunity to share more information with authors easily via the editorial management software systems (such as ScholarOne/Manuscript Central) which most journals now use.  These could include progress updates allowing authors to track a manuscript through each stage of the review process, but the systems are almost never used to inform authors.  The same systems can also provide information about acceptance rates and turnaround times.

… and the data: publishers need to develop common standards

Political scientists are well versed in the politics which can lie behind any data collection exercise and its presentation.  Respondents will be skewed towards those with reasonable figures to report, and data can be collected and presented in various ways to best serve perceived interests.  Beyond this, an industry standard is required for full comparability.  In its absence, data requests in this first round of collection have been standardised to common parameters, such as external review; for instance, some titles (mainly US based) have extensive 'screening' practices of internal review which restrict the number of manuscripts which go for external review.

Editors will be invited at a later stage to provide data covering different types of average, ranges of variation in turnaround times, different categories of acceptance responses, 'screening' times (before a manuscript is sent for external review) and rejection rates during screening.  However, highly tailored requests for data calculations will reduce response rates and limit the extent of comparability.  Ultimately, an industry-funded third party agency is required to capture data of this sophistication, together with the means for independent verification.  Until a comprehensive industry standard emerges among publishing houses, private initiatives such as this are the only way to provide authors with any information.  In any event, an indicative mean average for final acceptance rates, and for turnaround times of manuscripts sent for external review, is the data which most editors currently have readily available through editorial management software systems, and using this information is the best place from which to get the process established.

The territorial base of a title (US or otherwise, mostly derived from the Thomson ISI list linked in the Comments section) also seems to influence the data reported.  Until now, the available public domain data has mainly been provided by US based journals attached to professional societies, leading to greater familiarity with public reporting (commentators attribute this to claims that some US titles disproportionately favoured quantitative approaches, resulting in pressure upon titles linked to membership societies to release data).  There has been much less of a tradition of public domain reporting from titles based in other territories.  For editors based elsewhere, mostly Europe, this has been a first opportunity to share the data widely, and for a few even the very first occasion to collect and scrutinise it.  Typically, such titles have fewer administrative resources with which to collect data than US based titles attached to membership societies.  Once known, sharing the information has been a challenge for some, and remains so for a few; titles based outside the US account for a disproportionate number of the non-responses, while US based titles (around half of the total) account for only 10% of non-responding titles.

We’re all Authors

There are frequent reports of long waiting times before a decision is communicated; I recently waited almost 300 days for a single review from European Urban and Regional Studies.  The mean average reported by editors for articles sent for review is a little over two months; editors say unacceptable times are confined to a small number of inevitable outlier cases caused by reviewer delay.  Yet survey data collected in the summer of 2012 highlight a difference between the reports of editors and reviewers and the experiences of authors.  Around one-fifth of authors (n=500) report an average turnaround time over the last two years in excess of 110 days.  See the link 'Survey Results' at the top left of the page for more.

A common perspective among authors is that editors could do more to exercise judgement about individual reviews when reaching a decision on whether to publish, particularly about 'outlier' reviews betraying a lack of professionalism.  Journals with low acceptance rates often apply a crude rule of rejecting any article which attracts a single negative review, leading to unfair outcomes and the possibility of work passing its sell-by date without ever reaching the public domain.  There is scope for greater involvement of editorial board members in scrutinising outlier reviews, so as to share the workload which editors carry.

A reasonable premise is that the confidence stakeholders have in a journal is related to the amount of public information the journal is willing to provide.  At present, Palgrave and Elsevier are the only publishers which identify on their journal web pages the names and contact details of the publishing managers responsible for different titles.  In medicine, The Lancet provides a model of good publishing practice by having an Ombudsman empowered to investigate complaints of administrative malpractice.  There are ongoing examples on these pages, such as turnaround times exceeding a year.  Some editors have reported cases of their own articles on which a decision was never made.  There are reports of decisions taking longer to communicate to authors than the time spent in external review.  In such cases, independent third party oversight can prevent excess.  The lack of response from any publisher with a stable of social science titles when this suggestion was put to them directly illustrates the need for such a measure; given a reasonable grievance and a wall of silence, what redress is possible?

I have experience on all sides, as a journal editor, author and reviewer, and undertake this endeavour at my own initiative and from my own resources.  You will find on this site reports from cases where arrangements work well and where they don't, together with suggestions for improvements.

Justin Greenwood, Professor of European Public Policy (info@reviewmyreview.eu)

Continue to Post or Read Reviews >>