The academic journal system and its method of quality control, peer review, have been widely criticised. Dirty tricks and perverse decisions are often suspected.
The problems are well known. Most journals are prohibitively expensive for those outside the academy. There is typically a longish delay between submitting an article and getting it published, particularly if the author has to submit to several journals (simultaneous submissions not being allowed). The peer-review process is fraught with problems, such as unreliability (two reviewers are really not enough), the possibility that an author's pet theory may be pinched or blocked by anonymous reviewers, and a hostility to new ideas from editors and peers steeped in the prejudices of their disciplines.
The multiplicity of journals, each with its own format, makes life complicated for authors and readers. And mistakes are made - things are published that probably shouldn't be, and vice versa.
The conventional wisdom is that there's no viable alternative. However, particularly now that the internet is challenging so many established practices, it does seem worth considering other possibilities.
One scenario starts from the idea of a repository for all academic papers, both working papers and those that have been published in peer-reviewed journals - such as the Social Science Research Network (http://ssrn.com/).
However, there are two problems - one minor and one major. The minor problem is that currently these repositories are subject-specific, which means that authors must decide where their work belongs, and readers need to guess where work of interest is likely to be found. This problem could easily be solved: an organisation with global ambitions, such as Google, could create a repository covering all academic areas. Let's call it the General Journal (GJ).
The major problem is quality control. How could readers be sure that what they read has been vetted by the gatekeepers of the academy?
This role could be taken on by third parties. Let's imagine we have a Bungee Jumping Science Association (BJSA), which publishes the respected Annals of Bungee Jumping. This journal is expensive, and libraries are increasingly unwilling to pay the subscription. Furthermore, prospective authors and readers often stray to rival publications, such as Suicide Studies and The Rubber Review. All the BJSA would have to do is transfer the allegiance of its writers and readers to the GJ and set up its own journal quality certification scheme.
Authors would submit their papers to the GJ and apply to have them certified. If, as is the usual practice, the BJSA insists on revisions to the original article before awarding its stamp, the certified paper would be a new entry in the GJ - linked to the original so that interested readers could see the impact of the certification process on the original.
This system would have big advantages. Readers could go to the GJ and search for papers with the BJSA quality stamp. The GJ would be open access and everything - the BJSA papers and those certified by Suicide Studies and The Rubber Review - would be in one place. The BJSA would face only the costs associated with certification.
Authors would also like the new system because it would save them time and effort. They would post their articles on the GJ and apply for multiple stamps of approval.
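The workflow described in the last few paragraphs - authors posting to a central repository, third-party associations awarding quality stamps, and certified revisions linked back to the originals - can be sketched as a toy data model. Everything here is illustrative and invented for the purpose (the class names, the `certify` and `search` operations); neither the GJ nor any real repository such as SSRN works this way.

```python
# A purely illustrative sketch of the proposed General Journal (GJ) scheme.
# All names and structures are hypothetical, not a real repository API.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Stamp:
    certifier: str   # e.g. "BJSA"
    aspect: str      # e.g. "statistics", "writing", "argument"

@dataclass
class Paper:
    title: str
    stamps: List[Stamp] = field(default_factory=list)
    revises: Optional["Paper"] = None  # link back to the pre-certification version

class GeneralJournal:
    def __init__(self):
        self.papers: List[Paper] = []

    def submit(self, paper: Paper) -> Paper:
        self.papers.append(paper)
        return paper

    def certify(self, original: Paper, revised_title: str, stamp: Stamp) -> Paper:
        # Certification produces a *new* entry linked to the original,
        # so readers can see the impact of the certification process.
        revised = Paper(revised_title, stamps=[stamp], revises=original)
        return self.submit(revised)

    def search(self, certifier=None, aspect=None) -> List[Paper]:
        # Readers filter the repository by quality stamp.
        return [p for p in self.papers
                if any((certifier is None or s.certifier == certifier) and
                       (aspect is None or s.aspect == aspect)
                       for s in p.stamps)]

gj = GeneralJournal()
draft = gj.submit(Paper("Elasticity limits in bungee cords"))
final = gj.certify(draft, "Elasticity limits in bungee cords (certified)",
                   Stamp("BJSA", "statistics"))
# Only the certified version carries the BJSA stamp; the uncertified
# draft remains visible and is reachable via final.revises.
```

The one design point worth noting is that certification adds a new linked entry rather than overwriting the original, which is what lets readers compare the pre- and post-review versions.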
How the system would evolve is difficult to predict. Peer-review approval stamps could be fairly specific. When an article appears in a conventional peer-reviewed journal, we don't know whether it has been checked for the quality of the writing, the soundness of the argument, the rigour of the statistical analysis, or simply for consistency with the editor's prejudices. Under the new system, there could be a dedicated statistical stamp of approval, and readers would be able to see if research on, for example, the MMR vaccine lacked such certification.
On the other hand, articles exploring possibilities, as yet unproven, could be subject to different, perhaps more flexible, criteria. And there could be a certificate based on review by outsiders from other disciplines as a way of countering the introspective - and often bizarre - evaluation criteria used in some academic disciplines.