One particularly annoying aspect of doing reviews for learned journals is that assignments tend to arrive in clusters. Six months ago, I found myself in a bit of a pickle, with loads and loads of requests arriving within a short time. And just five weeks ago, another volley of invitations to review hit my mailbox within the space of hours, in one instance within minutes, which looked suspiciously like a glitch in the matrix. As these systems are fully computerised, automated (and increasingly urgent) reminders are now clustering in my mailbox, too.
This morning, I got around to reading the first two of them, only to realise that I had already read and rejected them during the last campaign, when they had been submitted to other journals. As I would have done in their stead, the authors had addressed some minor points but left the basic structure as it was, meaning that I could basically cut and paste my old reviews. Something like this has happened to me in the past, but with a single manuscript and a two-year hiatus between the incarnations I reviewed. Getting this twice in a single morning is a little creepy. Now I’m looking forward to my afternoon reading.
Sixteen months ago, we started the Political Science Peer-Review Survey. This week, the input form was shut down. That is about three quarters of a year later than expected, but then again, I underestimated the fallout of my move back to Germany. Moreover, until a few weeks ago there was still a tiny trickle of replies coming in. So far, we have found few major problems with the data. The RA has spotted two instances where the respondent somehow managed to save the data at various stages of the interview, thereby inflating the number of respondents. Moreover, it’s amazing how many political scientists read ‘percent’ and give absolute numbers 😉
Right now, the RA is enjoying his well-deserved holiday. He’ll be back in four weeks’ time, and we hope to have a data set ready for distribution by June.
With about 100 new respondents, yet another brilliant week for the Political Science Peer-Review Survey draws to a close. While the snowball is still rolling, and while we cannot know for certain because the survey is anonymous after all, we might soon reach a point of saturation: I have received a number of very friendly replies from people who tell me that they have already heard about the survey once (or twice) from someone else. The Netherlands in particular seem to be a hotspot of peer-review-survey-related activity. You could guess that much from the distribution of our respondents. While the US dominate the field (as they should), Switzerland and the Netherlands come an amazing 5th and 6th, accurately reflecting the standing of these countries as Social Science strongholds.
On Monday, the Political Science Peer-Review Survey had 506 respondents. Between Tuesday and Friday, we sent out 1,100 new invitations. Five days and many contacts with helpful colleagues later the number stands at 626. Feel free to join them.
The title says it all: yesterday, respondents 500-506 took the Political Science Peer-Review Survey, which is obviously great. A neat detail is that so far, more than 60 current or previous editors of political science journals have taken part in the survey. Tomorrow, we will resume our email campaign (aimed at those who have published in SSCI journals over the last eight years or so) to get even more people on board.
On Monday, we started a new initiative to boost response to the Political Science Peer Review Survey. Thanks to some very industrious research students, we were able to identify about 21,000 individual authors who have published in Social Science Citation Index-covered Political Science Journals between 2000 and 2008. For about 8,000 of these, the SSCI lists their email addresses (that’s the EM field in the SSCI records), and so we started contacting them and asked them to participate in the survey. Obviously, some addresses are no longer valid because people have moved on to different places or have left academia altogether. Nonetheless, I was slightly surprised by the rather poor quality of the address data supplied by Thomson. In some cases, letters were missing, whereas in other cases similar-looking letters (e.g. ‘v’ and ‘y’) had been confused. This looks like the work of either a weak OCR routine or a non-native and underpaid data-typing slave. Overall, we have contacted 962 people so far. About 200 of our messages have bounced, and we have 61 new responses to the survey (assuming that without the mailout, no one would have responded during these four days), which brings us to a new total of 238 responses.
While we are in the mood of surveying the peer-review process in political science, here is a quick link to the Political Science Journal Monitor. The site itself is a Blogspot blog converted into a makeshift forum, and activity is low. Nonetheless, this is an interesting and potentially relevant resource for many of us.
Almost exactly three years ago, a major political science journal asked me to review a manuscript. I recommended rejecting the paper on the grounds that a) its scope was extremely limited and b) it largely ignored the huge body of existing political science literature on its topic. The editors followed my suggestion (presumably, the other reviewers did not like the piece either). A couple of days ago, an obscure national journal sent me the very same (though slightly updated and upgraded) manuscript to review. Is this sad or funny? How often did the authors have to downgrade their ambitions before finding a decent outlet? And how common is this?
Thanks to the all new, all shiny political science peer-review survey, there is at least an answer to the last question: about 30 per cent of our respondents say that they would submit a rejected manuscript to a less prestigious journal. But what really strikes me is the proportion of reviewers who have reviewed (and rejected?) the same manuscript for at least two different journals: 29 per cent. This squares nicely with my personal experience (sometimes I seem to hit the same wall twice or more) and points to the fact that political science is a small world. Too small perhaps.
The survey is still open, so if you are an active political scientist, please, please participate and share your experience with us! We will publish preliminary results of the peer review survey online and will eventually put the data into the public domain.
If you edit, review or author manuscripts for political science journals, the peer-review process is at the centre of your professional life. Unfortunately, for most of us the process is largely a black box. While everyone has heard (or lived through) tales from the trenches, there is very little hard evidence on how the process actually works. This is why a number of colleagues and I started the peer-review survey project that aims at collecting information on the experience of authors, reviewers and editors of political science journals.
If you are an active political scientist, this survey is for you: we need your expertise, and your input is greatly appreciated. Filling in the form is fun and will typically take less than ten minutes of your time. It is also a great way to release some steam 🙂
Ready? Then proceed to the Political Science Peer-Review Survey.
We also put some (very) preliminary results of the political science peer-review survey online and will release further findings and eventually the data set in the future.
If you think this is worthwhile (and who wouldn’t?), please spread the word. To make this easier, we have created short URLs for the survey (http://tinyurl.com/peer-review-survey) and the results (http://tinyurl.com/peer-review-results) that you can forward to your colleagues. Thanks again for your support. It is greatly appreciated.