
A physiology study at Brighton and Sussex Medical School shows that computers can judge a person’s level of interest from non-instrumental movements – the tiny involuntary movements people constantly make while using a computer. In the paper, published in the journal Frontiers in Psychology, body language expert Dr Harry Witchel reports that when someone is absorbed in what they see on the screen – in a state of rapt engagement – the body makes fewer of these involuntary movements.
For the study, researchers recruited 27 participants and asked them to view a range of three-minute stimuli on a computer, from playing games to tedious readings from EU banking regulation, while using a handheld trackball to minimise instrumental movements.
Using video motion tracking, the researchers quantified the participants’ movements over the three minutes. In two comparable reading tasks, they found that the more engaging reading produced a 42 percent reduction in non-instrumental movement.
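To make the idea of quantifying movement from video a little more concrete, here is a minimal, hypothetical sketch in Python using OpenCV frame differencing. This is not the method used in the paper, which relied on its own video motion-tracking setup; the file names and the simple mean-absolute-difference score are assumptions for illustration only.

```python
# Illustrative sketch only: quantify per-frame motion in a video by
# greyscale frame differencing. Not the study's actual pipeline.
import cv2
import numpy as np

def motion_score(video_path: str) -> float:
    """Return the mean absolute pixel change between consecutive
    greyscale frames, a crude proxy for how much the person moved."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        raise ValueError(f"Could not read video: {video_path}")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

    diffs = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Mean absolute difference between consecutive frames.
        diffs.append(np.mean(cv2.absdiff(gray, prev)))
        prev = gray
    cap.release()
    return float(np.mean(diffs)) if diffs else 0.0

if __name__ == "__main__":
    # Hypothetical comparison of two reading conditions;
    # the file names are made up for illustration.
    boring = motion_score("banking_regulation.mp4")
    engaging = motion_score("engaging_reading.mp4")
    reduction = 100.0 * (boring - engaging) / boring
    print(f"Non-instrumental movement reduced by {reduction:.0f}%")
```

A comparison like the one above would show less frame-to-frame change in the more engaging condition, which is the kind of reduction the study reports.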
Unlike non-instrumental movements, instrumental movements require eye-hand coordination; moving the mouse or typing on the keyboard are good examples.
“Our study showed that when someone is really highly engaged in what they’re doing, they suppress these tiny involuntary movements. It’s the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle,” said Witchel in a news release. “Being able to ‘read’ a person’s interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process.”
“Further ahead it could help us create more empathetic companion robots, which may sound very ‘sci fi’ but are becoming a realistic possibility within our lifetimes.”
Researchers believe their discovery could have a significant impact on the development of artificial intelligence, for example in online tutoring programmes that adapt to a person’s level of interest in order to “re-engage them if they are showing signs of boredom.”
It also creates huge opportunities for manipulating human attention and emotional responses without people even realizing it. I’m sure this technology will first be used for advertising, rendering the “like” button obsolete, since computers will read users’ emotional responses directly. They could also track users’ eye movements to determine which section of the screen a person is reading and which ad they might be looking at. The second application is political propaganda, which is really just another application of the same marketing techniques.
I think it is becoming increasingly important to teach people how to control their attention and emotional responses through meditation, breathing exercises, or other techniques. Addiction to devices, gadgets, social media, and the like is becoming the new reality. Just go to any public place and look around: you’ll easily spot several people standing with their ears plugged, holding their devices and staring at the screen. The palantír from The Lord of the Rings is not fantasy any more.
I admire your forward-thinking abilities. Never thought the research could be applied to marketing/advertising. You’re just brilliant. Thanks for your wonderful comment. 🙂