On Monday, we started a new initiative to boost response to the Political Science Peer Review Survey. Thanks to some very industrious research students, we were able to identify about 21,000 individual authors who published in Social Science Citation Index-covered political science journals between 2000 and 2008. For about 8,000 of these, the SSCI lists an email address (the EM field in the SSCI records), so we started contacting them and asking them to participate in the survey. Obviously, some addresses are no longer valid because people have moved on to different institutions or have left academia altogether. Nonetheless, I was slightly surprised by the rather poor quality of the address data supplied by Thomson. In some cases, letters were missing; in others, similar-looking letters (e.g. ‘v’ and ‘y’) had been confused. This looks like the work of either a weak OCR routine or an underpaid, non-native typist. Overall, we have contacted 962 people so far. About 200 of our messages have bounced, and we have 61 new responses to the survey (assuming that without the mailout, no one would have responded during these four days), which brings us to a new total of 238 responses.
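For the curious, the mailout preparation can be sketched in a few lines. This is a minimal sketch only, assuming SSCI records exported in the plain-text field-tagged format (a two-letter tag such as EM at the start of a line, with multiple addresses separated by semicolons); the sample record and the exact field layout are assumptions, not the actual data.

```python
import re

# Loose syntactic check; it will not catch the 'v'/'y' confusions described
# above, but it filters obviously malformed addresses before the mailout.
EMAIL_RE = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def extract_emails(records_text):
    """Collect plausible addresses from EM lines of a field-tagged export."""
    emails = set()
    for line in records_text.splitlines():
        if line.startswith("EM "):
            for addr in line[3:].split(";"):
                addr = addr.strip().lower()
                if EMAIL_RE.match(addr):
                    emails.add(addr)
    return sorted(emails)
```

Deduplicating via a set also matters here, since authors with several indexed articles would otherwise be contacted more than once.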
Almost exactly three years ago, a major political science journal asked me to review a manuscript. I recommended rejecting the paper on the grounds that a) its scope was extremely limited and b) it largely ignored the huge body of existing political science literature on its topic. The editors followed my suggestion (presumably, the other reviewers did not like the piece either). A couple of days ago, an obscure national journal sent me the very same (though slightly updated and upgraded) manuscript for review. Is this sad or funny? How often did the authors have to downgrade their ambitions in their search for a decent outlet? And how common is this?
Thanks to the all new, all shiny political science peer-review survey, there is at least an answer to the last question: about 30 per cent of our respondents say that they would submit a rejected manuscript to a less prestigious journal. But what really strikes me is the proportion of reviewers who have reviewed (and rejected?) the same manuscript for at least two different journals: 29 per cent. This squares nicely with my personal experience (sometimes I seem to hit the same wall twice or more) and points to the fact that political science is a small world. Too small perhaps.
The survey is still open, so if you are an active political scientist, please, please participate and share your experience with us! We will publish preliminary results of the peer review survey online and will eventually put the data into the public domain.
More preliminary findings on Social Networks in Political Science: from our analysis of collaboration patterns in the British Journal of Political Science (BJPS) and Political Studies (PS), we conclude that co-publication is much more widespread and intense in Britain than in Germany (not a huge surprise). Yet, at least on the basis of these two journals, collaboration networks in British political science look rather fragile when compared to the sciences. Obviously, further research is needed.
Like most social scientists I am a little bit obsessed with social networks. I’m also interested in the sociology of knowledge, which is a little more original. So some time ago, a colleague and I embarked on a project called “Networks in Political Science”, which rather unsurprisingly tries to apply network analysis to publications in Political Science. Our basic idea is that everyone seems to take subfields, theoretical schools and even citation circles for granted, but unlike in some other disciplines, little empirical work has been done so far. More specifically, we want to identify
- highly cited articles that form the core of subfields
- individual influential scholars
- sub-networks of scholars that cite each other and/or collaborate frequently, thereby forming an “invisible college” and
- individuals who are able to bridge sub-disciplinary divides by publishing in a whole host of subfields.
Ideally, we would build a huge database of articles, chapters, and monographs. However, this requires lots of research assistants, and so for the time being, we use the Social Science Citation Index, which covers at least the core journals. We are soon due to deliver a paper at a conference, so I started writing it up. I’ve already put some preliminary findings on co-publication in Politische Vierteljahresschrift (PVS), arguably the most important German political science journal, on the web. The summary is very short and perhaps not very surprising: it doesn’t happen on a large scale.
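The first step of a co-publication analysis of this kind can be sketched very simply: each article is reduced to its author list, and every pair of co-authors becomes a (weighted) edge in the network. The sketch below is illustrative only; the author names and articles are invented, not taken from the PVS data.

```python
from collections import Counter
from itertools import combinations

def coauthorship_edges(articles):
    """Count how often each pair of authors publishes together."""
    edges = Counter()
    for authors in articles:
        # sorted() gives each unordered pair one canonical key
        for a, b in combinations(sorted(set(authors)), 2):
            edges[(a, b)] += 1
    return edges

# Invented example data for illustration
articles = [
    ["Mueller", "Schmidt"],
    ["Mueller", "Schmidt", "Weber"],
    ["Weber"],  # single-authored papers contribute no edges
]
edges = coauthorship_edges(articles)
# ("Mueller", "Schmidt") co-occur in two articles, so that edge has weight 2
```

On a journal where most papers are single-authored, as appears to be the case for the PVS, this edge list stays almost empty, which is exactly the "it doesn't happen on a large scale" finding in prose form.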