Keyboards could become an optional extra if City University research produces a computer that can interpret the slightest hand movement.
The team, which includes experts in mime and computer science, will create a machine that can recognise and act on gestures, body language and sign language, benefiting deaf people and those with severe physical disabilities.
The machine would detect particular movements that corresponded to computer function commands. This could pave the way for a commercial version which would scan movements with a camera.
David Roy, a systems scientist and one of the London-based university's six-strong Gesture, Sign Language and Technology Group, said: "We are in danger of spending most of our lives tapping away at a keyboard. It is important that we greatly expand the range of human actions that can be used for computer input."
The group is seeking grants from the Engineering and Physical Sciences Research Council and industrial sponsorship.