The editor of the Economics Bulletin, John Conley, has noted that many things go wrong with economic journals. Here is the abstract of his letter:
This letter calls attention to a recent trend in economics publishing that seems to have slipped under the radar: large increases in submission rates across a wide range of economics journals and steeply declining acceptance rates as a consequence. It is argued that this is bad for scholarly communication, bad for economics as a science, and imposes significant and wasteful costs on editors, referees, authors, and especially young people trying to establish themselves in the profession. It is further argued that the new “Big Deal” business model used by commercial publishers is primarily responsible for this situation. Finally, it is argued that this presents a compelling reason to take advantage of new technologies to take control of certifying and distributing research away from commercial publishers and return it to the scholarly community.
According to Conley,
The purpose of academic journals is to facilitate scholarly communication, filter for errors, and maintain the record of scientific advance.
This is, in my opinion, an idealized conception that does not reflect the purpose of economic journals anymore. For economic research, the current economic journals are largely redundant. Conley himself notes this:
I seldom actually read journals any more. I research topics using Google Scholar, RePEc, SSRN, and so on. It is inconvenient to sign up with publishers to get tables of contents emailed to me or to login to my university’s library web portal to search a journal issue by issue. I find it adds very little value over a more general search in any event. In short, certification remains important to help people gain tenure and promotion and to get a sense of the quality and centrality of individual scholars. However, neither certification by a journal, nor the collection of similar papers within the bound or even electronic pages of a specific journal has very much meaning to me when I am trying to understand where the debate in a subfield is at any given moment. As a result, I was beginning to come to the conclusion that while they are irritating, commercial publishers are “mostly harmless” to the research enterprise itself as publishing itself is becoming mostly irrelevant.
This coincides with my own observation: researchers don’t need journals. The main purpose of the journals is currently to ease the work of hiring committees. People publish in order to get a job. The wish to communicate new findings appears secondary in most cases.
Journals could serve worthier aims, however: they are needed by students, college teachers, and others who would like to obtain reliable information but cannot as easily separate the wheat from the chaff as active researchers can.
The important point Conley is making is, however, that the current journal system, although largely irrelevant for research, is nevertheless
bad for scholarly communication, bad for economics as a science, and imposes significant and wasteful costs on editors, referees, authors, and especially young people trying to establish themselves in the profession.
I fear, however, that John Conley’s suggestion to increase the number of journals would not improve the situation very much. As long as hiring committees rely on the reputation of journals, rather than the reputation of individuals, a useful system of “communication, filter for errors, and maintain the record of scientific advance” is practically blocked.
What can be done besides increasing the number of journals? Here are some further suggestions.
1. Hiring committees can restrict the number of papers to be considered for judging an applicant to, say, three and disregard all other writings. This may help to reduce the number of publications and thereby reduce the need for further journals; it would also tilt the quality-quantity trade-off in favor of quality. (I think this has been the practice at Berkeley.)
2. Hiring committees that feel incompetent to judge the substantive quality of a contribution and have to resort to statistics of some sort may turn to citation counts of individual authors, as obtainable through Google Scholar, Web of Science, or RePEc. This is a better solution than the current practice of relying on the prestige of journals, and it would take account of the fact that many papers in top journals are not so good, while medium-quality journals publish excellent articles.
I am a full professor and not really planning to move anywhere, but I still try to publish single-author articles in journals (multi-author articles could be seen as trying to help other people get jobs). I am out here in Australia, not a member of the NBER etc., and think that getting a paper into an international journal will get it a lot more attention than just putting it into a local working paper series. Even if people often end up reading the working paper version, they’ll see on Google Scholar that it was certified by the journal. And if they use Web of Science or Scopus to search, they’ll only find journal articles. Grant agencies also only consider actual journal publications when looking at your track record. And you’ll see full professors even at Harvard and MIT etc. publishing in journals, though they could just stop at an NBER working paper. So, obviously, journal publication still has an important function beyond getting jobs. Or is the latter just a bad habit on their part?
I agree with point 1. At most one paper per year can be considered really innovative and well worked out. ‘Me too’ papers are only helpful in generating citations. The situation in statistics seems better. Limiting paper length and the number of references is good for good journals. Getting cites from top journals is what matters. Review articles are obsolete if they cannot be updated online.