I am drawn to research that develops a sociology of the web. It is easy – too easy – to assume that “everyone” is online and using digitised platforms in similar ways. There is great differentiation in the identities, languages, behaviours and functions summoned by analogue and digital media.
Researchers, librarians and teachers are probing how specific groups and communities claim and use the Nintendo DS Lite, Second Life, Facebook, the iPhone and even email in innovative ways.
The Pew Internet and American Life Project has produced an expansive range of longitudinal studies, but provocative work also exists in Ofcom reports on adult and child media literacy.
The Oxford Internet Institute at the University of Oxford and the Center for Information and Society (formerly the Center for Internet Studies) at the University of Washington align scholarship and professional development with international outreach.
The Joint Information Systems Committee (Jisc) is releasing papers, projects and studies whose results can frame further research in web sociology. Perhaps the best known, released in January this year, was the paper Information Behaviour of the Researcher of the Future, better known as the Google Generation report.
An under-cited finding in this document emerged from tracking the reading behaviour of both students and academics. The report revealed that it was not only “the Google Generation” that is reading less, but academics more generally.
The researchers have observed a tendency to skim read and, in particular, a tendency to glance at an abstract but not progress further into a paper. They note with a flourish that “Society is dumbing down” – but it is not “the young people” who are the problem; it is the academic community generally, and older researchers in particular, who “give the game away”.
Building on this report, Jisc has announced a new project that will increase our understanding of how research behaviour is changing. It has called for a “Scoping study on issues relating to quality-control measures within the scholarly communications process”. Basically, it is interested in how refereeing and peer review operate in traditional and online environments.
Jisc recognises that “the networked content environment has raised issues about quality control, many of the current quality-control measures having been designed in the paper era.” I look forward to reading the results. The consequences of not understanding how peer review, expertise and refereeing operate online mean that academics and students will continue to “access” the easiest source, rather than the best.
Less recognised in both this call for tenders and the Jisc reports are the barriers that block academic and student attempts to read full text, online, refereed material. Google Scholar, a search engine that indexes scholarly literature, is providing an important function in the online environment, transforming a Fordist search engine into a post-Fordist sorter and sifter of information. It operates best not through conversational phrases but through an author search, a specific journal name or date of publication. Yet there is a profound problem with Google Scholar, and it is not of Google’s making.
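To make the point concrete, the structured searches that suit Google Scholar – author, journal title, date range – can be sketched as query URLs. This is an illustrative sketch only: the author: and source: operators and the as_ylo/as_yhi year parameters reflect Scholar’s commonly used query syntax and are assumptions of this example, not claims made above.

```python
# Illustrative sketch: composing structured Google Scholar queries
# (author, journal "source", date range) rather than conversational
# phrases. The operators and parameters shown are assumptions about
# Scholar's query syntax; verify against the live service.
from urllib.parse import urlencode

SCHOLAR = "https://scholar.google.com/scholar"

def scholar_url(query: str, year_from: int = None, year_to: int = None) -> str:
    """Build a Google Scholar search URL for a structured query."""
    params = {"q": query}
    if year_from is not None:
        params["as_ylo"] = year_from  # lower bound on publication year
    if year_to is not None:
        params["as_yhi"] = year_to    # upper bound on publication year
    return SCHOLAR + "?" + urlencode(params)

# An author search, and a journal ("source") search with a date range:
print(scholar_url('author:"J Willinsky"'))
print(scholar_url('source:"First Monday"', 2005, 2008))
```

The design point is simply that a precise field-scoped query narrows the result set far more effectively than a conversational phrase typed into the search box.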
Commercial aggregators such as Sage, Routledge and Elsevier have bought the journals that many of us used to read both in print and through screen-based delivery, and are bundling them into packages for purchase.
For example, in February I agreed to write an article for a freely available online refereed journal, but before the piece could be submitted to the editor, the publication had changed its name and publisher. A subscription fee will now be levied.
This selling of scholarship is creating a two-tier system between the information-rich and the information-poor in the higher education sector. Some institutions can afford to buy the range of journals from the commercial aggregators, while others are left with a reduced list. When moving into Google Scholar, researchers can see their university’s position in the digital pecking order.
So many titles are unavailable to read in full text because publishers control the journal and restrict availability on the basis of paid subscription. A range of material is also searchable through Scirus, CiteSeer and getCited, but frequently more results are returned through subscription services such as Scopus and Web of Knowledge than via Google Scholar.
While the Google Generation report notes that scholars are reading abstracts, it leaves unmentioned the fact that frequently only the title, author and abstract are available online without the use of a credit card.
The report is correct to diagnose superficial reading, but it did not ask why this trend is emerging. Since the report’s publication, both of Microsoft’s scholarly “products” or “services” – Live Search Academic and Live Search Books – have been discontinued as part of a wider corporate agenda to “wind down our digitizing initiatives”.
Even allowing for the way commercial aggregators limit the potential of Google Scholar, the specialist search engine has made available diverse materials that were once withheld from a geographically dispersed audience. The dissemination of freely available international doctoral theses and conference papers provides a gateway into new research.
However, the elite journals that maintain(ed) a leadership role in disciplines will face a reduced readership because fewer university libraries have the budget to subscribe and fewer academics will read the research either in paper copies in a library or digitally delivered to their desktops.
There are many solutions to this problem. For those of us who do not want restrictions on peer-reviewed scholarship in the online environment, this is the moment to demonstrate activism, courage and commitment.
The way to counter the commercial aggregators, improve the quality of online research and draw students away from blogs and Wikipedia is to commit to reading, citing, embedding in curricula and publishing in online, open-access, refereed journals. These are freely listed on Google Scholar and other portals.
An open-access journal is defined by both its peer review and its funding model for distribution. Open access means that users can read, download, print and link to an article without charge to either themselves or their institutions. Journals with a period of embargo – charging for subscription for a year before releasing content freely to the web – are rarely included in this definition.
Besides Google Scholar, the other increasingly important entrance into these quality online research materials is the Directory of Open Access Journals (DOAJ). Currently, 3,500 free, full-text refereed journals are on its list. While it is a smaller data set than Google Scholar’s, the full articles are available, rather than merely the abstracts. DOAJ also aims to cover not only all subjects but a diversity of languages. It is – as described on the site – “a one-stop shop for users of open-access journals”.
DOAJ was founded as a result of the First Nordic Conference on Scholarly Communication in Lund/Copenhagen. The rationale for the project was that these free online journals had proved difficult to integrate into library and information services, and that international scholarship would benefit if a way could be found to shape and organise this material. DOAJ is hosted, maintained and partly funded by the Lund University Libraries. Two other open and free directories also commit to this mode of publishing: Newjour and the Librarians’ Internet Index.
The national ATHENS system allows people to use a single sign-on to access web resources and reduces the administrative burden for librarians and information managers. With ATHENS no longer existing in its current form after August 2008, there will be confusion in deploying new methods of access to university subscriptions to online journals, databases and e-books.
While this temporary problem will be overcome in the next few months, it does highlight the instabilities in controlling and managing corporatised research material.
An important organisation assisting academics in operating outside of commercial aggregators is the Public Knowledge Project (PKP). Its goal is “a contribution to the public good on a global scale”.
PKP was formed to assist academics in developing online journal and conference publishing, and it has created free, open-source software to manage, publish and index journals and conferences. It is an integrated system, globally developed but locally managed. The software handles the whole process of managing content, from online submission of articles to the organisation of reader reports, presentation of copy and indexing of articles. The project has committed to the African academic community, with 250 journals from the continent already using PKP software.
The PKP is built around Open Journal Systems (OJS), a software application developed by John Willinsky, who founded the project in 1998 while employed in the University of British Columbia’s Faculty of Education.
His rationale was that knowledge from academics should be disseminated widely and freely, with a particular consciousness of the scholarly needs of developing nations where publication costs and conference attendance may be prohibitive. Currently there are 1,400 titles using OJS in ten languages (http://pkp.sfu.ca/ojs-journals).
The project has moved through pilot and beta status, and Version 2 has just been released. To reach this stage, Willinsky entered into a partnership with the Simon Fraser University Library, the SFU Canadian Centre for Studies in Publishing and Stanford University to ensure support for the burgeoning system.
The Public Knowledge Project provides academics committed to high-quality and freely disseminated academic research with the tools to create a new generation of journals, and DOAJ and Google Scholar are providing pathways into this material.
If we want to improve the calibre of the web for scholars and researchers, and if we want to provide alternatives to Wikipedia and sponsored links, then it is important not only to support these journals at the level of citation, but also to submit articles to journals committed to open access.
We should also give these publications our best articles, not the pieces that have bounced around editors’ desks for a few years. I remember a referee for one of my books commenting that not only did he not recognise the names of many of the authors I referenced in the text, but he frequently did not know the journals. In committing to the next generation of scholars, we must also value peer review and high-quality research while providing space for new names, voices and views to be read and applied by a wide audience.
Complaining about the digital nonsense online is like abusing Fred Astaire because he was a better dancer than singer. It misses the point. We are far better off locating and diagnosing the problem – the flattening of expertise in digital environments – and then systematically and carefully improving the quality of the scholarly material that is available online. Researchers, writers, activists and citizens can complain about this corporate restriction of scholarship, or we can actually do something.
Examples of academics who have courageously answered this challenge and require our support include Ben Agger’s Fast Capitalism, Gerry Coulter’s International Journal of Baudrillard Studies, Paul Stortz’s History of Intellectual Culture and Samar Habib’s Nebula. In assisting these editors to improve the quality, range and influence of their journals, we can reverse the flattening and compression of expertise in the online environment.
By answering this call, we can provide alternative scholarly pathways for our students. If they still answer with Wikipedia, then we know that the wrong question has been asked.