A bit more byte!

October 13, 2006

The digital computer was one of the greatest inventions of the 20th century. Computing became a significant feature of almost every UK higher education institution, with students wanting to study it in large numbers encouraged by the job prospects. Nearly 30,000 applied to read the subject in 2000-01. Computing was pre-eminent.

But in the 21st century the climate changed. The dot-com crash precipitated a severe drop in the number of applications, resulting in departments being reduced in size. Computer scientists saw their invention perceived as a routine piece of domestic furniture rather than something that retained complex technical challenges.

Simultaneously, funding was cut by 35 per cent. In research, the exciting interdisciplinary initiatives using e-science - the use of very large data sets, large computing resources and high-performance visualisation across global collaborations - facilitated new application areas in astronomy, biosciences, environmental science, finance and healthcare, diverting cash and attention from core computing. Computing was at risk of becoming subordinate.

Demand for computer scientists, computing and information technology specialists and skilled computer users remains high. A 2004 report on perceived skills shortages highlighted the significant number of people needed each year for increasingly complex, high added-value jobs; the rapidly changing skills necessary in that workforce; the fundamental strategic importance of computing technologies to UK firms; and a poor take-up of advanced skills by business.

Further, research is buoyant, with computing retaining its potential to have a profound impact on biomedical and other sciences, economics and the environment. There is also no reason to doubt that computing power will continue to increase as research into quantum computers, more extensive networks of connected devices and the more widespread distribution of computational power bears fruit.

The challenge for universities is how to promote computing's resurgence as a discipline that offers exciting opportunities. Here are four suggestions:

* Think computationally. Such thinking represents a universally applicable attitude and skill set everyone would be eager to learn. Skills include problem-solving, system design, multi-tasking, abstraction, the anticipation of unwelcome events and the understanding of how humans communicate. To develop these skills, "just-in-case" learning (where someone teaches you something in case you need it later) is replaced by "just-in-time" learning (where you find out for yourself at the point you need the information). The real and the virtual worlds become less distinguishable. Students need to be educated to think as well as to know.

* Think small. Small devices contribute towards making the real and virtual worlds indistinguishable by providing data about the real world on a large scale. Devices are typically embedded within the real world, within buildings or even within human beings. Coupled with analytical technology, they offer the opportunity to study very large, complex systems on an unprecedented scale. They also provide the basis of an exposition of computing science that students can readily understand, or even study in the context of their own small devices (mobile phones, iPods).

* Think big. Modern computing applications are among the most intellectually complex systems humans have produced. Their complexity derives not only from the functionality they are expected to deliver, but also from the functionality they are not expected to deliver, even when users try to use them in unexpected ways. They are unique combinations of theory and practice, scholarly and professional, supporting large and small organisations, and largely invisible until something goes wrong. Enthusing students with the breathtaking scale of these complex applications, and equipping them with the skills to tackle them professionally, is one of the greatest challenges educators face.

* Think out. The graduates of the 21st century were born into a digital world and want to study programmes that are modern (rather than traditional), more academic than vocational, in a friendly environment. They seek curriculums that prepare them for a career, delivered in an environment that provides an exceptional student experience and a social network for life.

Universities must rise to this challenge by providing a curriculum that engages these students more than those of law, psychology, creative arts, sports studies and business if they are to produce enough IT graduates to satisfy the national and international need.

Keith Mander is chair of the Council of Professors and Heads of Computing and deputy vice-chancellor of Kent University.
