Keyboards could become an optional extra if City University research produces a computer that can interpret the slightest hand movement.
The team includes experts in mime and computer science who will create a machine that can recognise and act on gestures, body language and sign language, a development that would benefit deaf people and those with severe physical disabilities.
The machine would detect particular movements that corresponded to computer function commands. This could pave the way for a commercial version which would scan movements with a camera.
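The article does not describe how such a mapping would be implemented; purely by way of illustration, the idea of translating recognised movements into commands can be sketched as a simple lookup. The gesture labels and command names below are hypothetical examples, not details of the City University system.

```python
# Illustrative sketch only: a table mapping recognised gesture labels to
# computer commands. All names here are hypothetical, not from the research.
GESTURE_COMMANDS = {
    "swipe_left": "previous_page",
    "swipe_right": "next_page",
    "open_palm": "pause",
    "fist": "select",
}

def interpret(gesture: str) -> str:
    """Translate a recognised gesture label into a command; ignore unknowns."""
    return GESTURE_COMMANDS.get(gesture, "no_op")
```

In such a scheme, a camera-based recogniser would emit a label like `"fist"`, and `interpret` would return the corresponding command, `"select"`.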
David Roy, a systems scientist and one of the London-based university's six-strong Gesture, Sign Language and Technology Group, said: "We are in danger of spending most of our lives tapping away at a keyboard. It is important that we greatly expand the range of human actions that can be used for computer input."
The group is seeking grants from the Engineering and Physical Sciences Research Council and industrial sponsorship.