Thirteen ways to spot a ‘predatory journal’ (and why we shouldn’t call them that)

Larissa Shamseer and David Moher have taken a close look at what it is that sets dodgy journals apart from the rest

March 27, 2017

Are predatory journals the publishing equivalent of what Donald Trump might call “a bad hombre”? So asked our colleague in a recent commentary. In research just published, we dived in to have a look.

For the unfamiliar, the term “predatory journals” has been used to refer to online-only scholarly publishing entities with murky operations. They claim to be open access and to provide basic editorial services (peer review, perpetual archiving), yet they are failing on all counts. They make money by collecting fees (article processing charges) from authors who publish with them.

However, we’re calling them “illegitimate publishing entities” instead of “predatory journals”. In our opinion, these entities should be considered similar to the pseudo banking sites that sprang up in the early days of email, enticing people to send money abroad. They are not technically “journals”, as they appear not to be authorised by, or in accordance with, accepted publishing standards or rules.

In addition, the term “predatory” suggests that the problem is one-way (that such entities are preying on innocent researchers), when, in fact, the relationship might be more symbiotic (some authors knowingly participate). “Predatory” also implies nefarious intent. Yet, in our observations, a subgroup of highly localised, usually amateur, publishing efforts is also caught up inadvertently in this labelling. Their inability to abide by (or their indifference to) the basic standards of publishing does not absolve them, however.

Thousands of these impostors are infiltrating the world of scientific publishing. And they are possibly giving a voice to very low-quality or bogus research (aka “alternative facts”), whereby agendas can be spun and spurious data released stealthily, free from the watchful eye of peer review. Even more disturbing is the likelihood that they are publishing real research that will never truly see the light of day. This jeopardises a basic tenet of scholarly publishing – to enable the results of research to be known for others to build upon.

From where we sit, at the Ottawa Hospital’s Centre for Journalology (ie, within biomedicine), the moral stakes of publishing in such shoddy entities are possibly highest, given the ethical and societal implications (such as impacts on future patient treatment outcomes). Taxpayer dollars are being used to fund research and pay for publications that are buried in the pages of illegitimate entities, never to be incorporated into health decisions or care.

In January 2017, the “blacklist” of scholarly publishing containing the names and URLs of these entities went dark, leaving a gaping hole. Beall’s List, as it was called (named after its curator), provided those who were interested with a reference list of, essentially, where not to publish. What most did not consider, however, was the serious lack of science and rigour in the way that Beall made decisions about which entities made the list. The criteria were: (1) not evidence-based; (2) applied arbitrarily; and (3) applied unilaterally by Beall. This is concerning given the amount of trust so many placed in the list.

With a few of our publishing colleagues, we rolled up our sleeves to find out how legitimate journals and their illegitimate counterparts stack up against Beall’s criteria (and a few of our own). We compared a few hundred biomedical journals – some from Beall’s List, some taken from known legitimate sources (such as PubMed Central, Abridged Index Medicus), including open access journals. Our full study methods are available here.

We found that not all criteria were unique to so-called “predatory journals” and that some were equally attributed to both types of journals. 

Based on our findings, we propose a list of 13 easy-to-assess “red flags” that researchers should look for. Taken collectively, we hope that they may help researchers sniff out a potentially illegitimate publishing entity. We specifically describe what to look for when assessing each criterion.

Please note the following caveats:

    • These recommendations are based on the methods of assessment that we used in our study
    • We comment on the importance and rationale for these items, including how legitimate journals differ, in our study
    • We assessed biomedical journals, but the items are likely applicable more broadly
    • In our experience, illegitimate publishing entities are dynamic in nature, and these criteria may not apply perpetually to a given journal/entity
    • The information that we suggest examining is not always in an obvious location, including in legitimate journals. Authors should thoroughly examine the websites of journals they are planning to submit to. 

 

1. The scope of interest includes non-biomedical subjects alongside biomedical topics. Look to the “aims and scope” section of a potential journal’s website. If multiple, wide-ranging and unrelated fields of study are combined (eg, agriculture, geology, astrophysics, health), you should be concerned.

2. The website contains spelling and grammar errors. If you see multiple, obvious English-language typos and grammatical errors on a journal’s home page, this is probably a sign of low quality (and poor translation).

3. Images are distorted/fuzzy, intended to look like something they are not, or are unauthorised. Low-resolution or stretched images, screenshots, or images that resemble or replicate legitimate industry images are a likely sign of a low-quality, if not illegitimate, publishing entity. It may be handy to know, or be able to refer to, the logos of legitimate publishers in your field. Google Chrome’s similar-image search feature may also help here.

4. The home page language targets authors. If the wording on a potential journal’s web page is aimed at attracting authors and submissions (eg, prominently inviting submissions, promoting quick peer review or publication), this is probably a good warning to stay away. You likely want your research to be read by a specific target audience; publishing it in a venue with little focus (ie, vastly different topic areas) reduces this possibility.

5. The Index Copernicus Value is promoted on the website. A metric called the Index Copernicus Value (ICV) is associated almost exclusively with illegitimate entities; legitimate journals do not appear to use this questionable metric. Not all illegitimate entities display it, but if you see this metric listed on a potential journal’s website, it is probably best to stay away.

6. Description of the manuscript handling process is lacking. A journal should tell authors what to expect after they submit a paper, including details about peer review. If you cannot find this information anywhere on the journal’s or publisher’s website, be concerned.

7. Manuscripts are requested to be submitted via email. If the submission area of a journal’s website instructs you to send your manuscript by email to the journal or editor, rather than through an electronic submission system, this may be a sign that the journal is unfamiliar with standard, legitimate practice. Submitting your manuscript via email is probably unwise; if you encounter this, check with peers about the norms in your field.

8. Rapid publication is promised. At present, most legitimate journals are unlikely to give any indication that your manuscript will be published rapidly; whether an article is published at all typically depends on the outcome of peer review.

9. There is no retraction policy. Every journal should have a mechanism for recalling or retracting an article. This information may be found in the journal policies section, or even in the instructions to authors. Even in the best of journals, errors, omissions and fraud are possible, and journals should have an explicit, transparent statement of how they intend to handle such instances.

10. Information on whether and how journal content will be digitally preserved is absent. Perpetual preservation of content, following one of a variety of industry digital archiving protocols, is a technical obligation of scholarly journals. It is a prerequisite for any journal seeking to be indexed in databases such as PubMed; it is also a vital part of the inclusion criteria for entities such as the Directory of Open Access Journals. This information can be difficult to locate or to understand (if it can be located at all). A good indication of preservation is whether the journal’s publisher deposits content in a central repository (in medicine, PubMed Central is an example).

11. The article processing/publication charge is very low (eg, <$150). In open access biomedical journals, or those with open access options (hybrid), article processing charges (APCs) can be quite high (upwards of $800, about £650). Information about APCs may be found within journal sections on open access, journal policies or instructions to authors. If a journal claiming to be open access does not indicate a fee, or if the fee is very low (eg, <$150), check with your peers about the going rate for APCs in your area of research or publishing.

12. Journals claiming to be open access either retain copyright of published research or fail to mention copyright. Pure open access journals do not require authors to sign over the copyright for their manuscripts to the journal; authors should be able to retain copyright of published open access work. Look for this information before submitting to or publishing in a journal; it is often found in the journal policies or instructions to authors sections.

13. The contact email address is non-professional and non-journal-affiliated (eg, @gmail.com or @yahoo.com). Journals, journal editors and journal staff should all have institutional or journal-affiliated email addresses as a marker of professionalism. Check the “contact us” email address(es) as a first pass.
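A couple of these flags lend themselves to a quick automated first pass before you read a journal's website in depth. As a toy illustration only (this is our sketch, not part of the study's methods, and the domain list and keyword below are assumptions for illustration), a short script could scan a page's visible text for free-provider contact addresses (item 13) and mentions of the Index Copernicus Value (item 5):

```python
# Toy first-pass screen for two machine-checkable red flags.
# Illustrative only: a hit here is a prompt for closer human
# reading of the journal's website, never a verdict on its own.
import re

# Assumed list of common free email providers (item 13).
FREE_EMAIL_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com", "outlook.com"}

def screen_page_text(text):
    """Return a list of red flags found in a journal page's visible text."""
    flags = []
    # Item 13: contact addresses at free providers rather than
    # institutional or journal-affiliated domains.
    for domain in re.findall(r"[\w.+-]+@([\w-]+(?:\.[\w-]+)+)", text):
        if domain.lower() in FREE_EMAIL_DOMAINS:
            flags.append(f"contact email at free provider: {domain}")
    # Item 5: the Index Copernicus Value metric is promoted.
    if "index copernicus" in text.lower():
        flags.append("promotes the Index Copernicus Value")
    return flags

page = "Contact the editor at editor@gmail.com. Index Copernicus Value: 87."
print(screen_page_text(page))
```

The remaining items (scope, peer-review description, archiving, copyright) resist automation and still require the careful manual inspection described above.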

 

The bottom line is this. Authors must take responsibility for the places they submit their work, in the same way that we need to pay attention to where we spend money as consumers. These illegitimate entities are not scholarly journals.

Authors and readers can arm themselves against such entities by examining these 13 features closely. We also encourage all authors, when selecting a journal to submit to, to consult the checklist at Think, Check, Submit, put forward by leaders in publication ethics and the publishing industry.

Larissa Shamseer is a PhD candidate at the University of Ottawa and a senior research associate with the Centre for Journalology at the Ottawa Hospital Research Institute. David Moher is a senior scientist and director of the Centre for Journalology.


POSTSCRIPT:

The authors wish to thank their colleagues, Jason Roberts and Ginny Barbour, for their help with this article.
