If robots are to work effectively in homes and other non-industrial environments, the way they are instructed to perform their jobs, and especially how they can be told to stop, will be of critical importance. The people who interact with them may have little or no training in robotics, so any interface will need to be extremely intuitive. Science fiction authors typically assume that robots will eventually communicate with humans through speech, gestures and facial expressions rather than a command-line interface. Although speech would be the most natural way for the human to communicate, it is quite unnatural for the robot. It will be quite a while before robots interact as naturally as the fictional C3P0.
- Speech recognition: Interpreting the continuous flow of sounds coming from a human, in real time, is a difficult task for a computer, mostly because of the great variability of speech. The same word, spoken by the same person, may sound different depending on local acoustics, volume, the previous word, whether or not the speaker has a cold, and so on. It becomes even harder when the speaker has an unfamiliar accent. Nevertheless, great strides have been made in the field since Davis, Biddulph, and Balashek designed the first "voice input system", which recognized "ten digits spoken by a single user with 100% accuracy" in 1952. Currently, the best systems can recognise continuous, natural speech at up to 160 words per minute with an accuracy of 95%. (A minimal sketch of a spoken "stop" command appears after this list.)
- Gestures: One can imagine, in the future, explaining to a robot chef how to make a pastry, or asking directions from a robot police officer. In both cases, hand gestures would aid the verbal descriptions. In the first case, the robot would be recognising gestures made by the human, and perhaps repeating them for confirmation. In the second, the robot police officer would gesture to indicate "down the road, then turn right". It is quite likely that gestures will make up a part of the interaction between humans and robots, and a great many systems have already been developed to recognise human hand gestures (see the gesture-detection sketch after this list).
- Facial expression: Facial expressions provide rapid feedback on the progress of a dialogue between two humans, and soon they may do the same between humans and robots. A robot should know how to approach a human, judging by their facial expression and body language: whether the person is happy, frightened or crazy-looking affects the type of interaction expected of the robot. Likewise, a robot like Kismet can produce a range of facial expressions, allowing it to have meaningful social exchanges with humans.
- Personality: Many of the robots of science fiction have a personality, something which may or may not be desirable in the commercial robots of the future. Nevertheless, researchers are trying to create robots which appear to have a personality: they use sounds, facial expressions and body language to convey an internal state such as joy, sadness or fear. One commercial example is Pleo, a toy robot dinosaur, which can exhibit several apparent emotions.
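
To make the speech channel concrete, here is a minimal sketch of how a spoken "stop" command could be captured and acted upon. It assumes the third-party Python `speech_recognition` package (with a working microphone backend such as PyAudio) and a network-backed recogniser; the `emergency_stop()` handler is a hypothetical placeholder for whatever halt mechanism a particular robot exposes, not part of any real robot's API.

```python
import speech_recognition as sr

def emergency_stop():
    # Hypothetical placeholder: signal the robot's controller to halt.
    print("STOP command received - halting all motion.")

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    # Compensate for room acoustics and background noise before listening.
    recognizer.adjust_for_ambient_noise(source)
    print("Listening...")
    audio = recognizer.listen(source)

try:
    # Send the captured audio to a speech-to-text service.
    text = recognizer.recognize_google(audio)
    if "stop" in text.lower():
        emergency_stop()
    else:
        print("Heard:", text)
except sr.UnknownValueError:
    # The audio could not be interpreted as speech.
    print("Could not understand the speaker.")
```

Even this toy example shows why intuitiveness matters: the person only has to say the word, and all of the acoustic variability described above is handled behind the scenes by the recogniser.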
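For the gesture channel, the sketch below shows one way a robot might decide whether a hand is raised in a single camera frame, assuming Google's MediaPipe Hands model and OpenCV for image handling. The file name and the "raised hand means stop" interpretation are illustrative assumptions only, not a standard gesture vocabulary.

```python
import cv2
import mediapipe as mp

# Hypothetical capture from the robot's camera.
image = cv2.imread("frame.jpg")
rgb = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

# Run the MediaPipe hand-landmark model on the single frame.
with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    results = hands.process(rgb)

if results.multi_hand_landmarks:
    hand = results.multi_hand_landmarks[0]
    wrist = hand.landmark[0]      # landmark 0: wrist
    index_tip = hand.landmark[8]  # landmark 8: index fingertip
    # Image y-coordinates grow downwards, so a smaller y means higher in the frame.
    if index_tip.y < wrist.y:
        print("Hand raised - interpreting as an 'attention' or 'stop' gesture.")
    else:
        print("Hand detected, but not raised.")
else:
    print("No hand detected in the frame.")
```

A real system would track gestures over time and combine them with speech, but the same building blocks (landmark detection followed by a simple geometric rule) underlie many of the gesture-recognition systems mentioned above.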