A new study from a body-language expert at Brighton and Sussex Medical School (BSMS) in the UK shows that a computer can tell if you’re bored by how much you twitch while reading something on screen, tracking your tiny, involuntary movements to…
HEY! Are you even listening?!?
Dr. Harry Witchel, Discipline Leader in Physiology, has got you pegged, oh ye of wandering attention.
According to a release the school put out on Wednesday, we send out “rapt engagement” vibes by more or less freezing solid, like a slack-jawed kid in front of a TV screen when SpongeBob SquarePants is on.
“Our study showed that when someone is really highly engaged in what they’re doing, they suppress these tiny involuntary movements. It’s the same as when a small child, who is normally constantly on the go, stares gaping at cartoons on the television without moving a muscle,” Witchel said.
The school thinks the discovery could have an impact on the development of artificial intelligence (AI).
One example: AI online tutoring programs could discern when they’re boring students silly and could adapt to a given viewer’s level of interest in an attempt to re-engage them.
Another possible application: teaching companion robots how to better gauge somebody’s state of mind.
“Being able to ‘read’ a person’s interest in a computer program could bring real benefits to future digital learning, making it a much more two-way process,” Witchel said. “Further ahead it could help us create more empathetic companion robots, which may sound very ‘sci fi’ but [which] are becoming a realistic possibility within our lifetimes.”
That would have come in handy for Pepper, the emotion-reading, joke-telling AI customer service robot that didn’t know enough to take cover when a drunk beat it up.
Or, come to think of it, maybe hitchBOT, the smiling, privacy-invading, hitchhiking robot, could have been saved from its barbaric dismantling if it had been able to read micro-movements?
Then again, probably not. Smashing and kicking seem more like macro-movements.
BSMS suggests another possible use of micro-movement tracking: movie directors or game makers could use the technology to read, moment-by-moment, whether the events on the screen are interesting.
While viewers can be asked subjectively what they liked or disliked, a non-verbal technology would be able to detect emotions or mental states that people either forget or prefer not to mention.
The BSMS study included 27 participants who faced a range of 3-minute stimuli on a computer, from fascinating games to tedious readings from banking regulations.
They were given a handheld trackball to minimize instrumental movements, such as those we make when we move a mouse.
Then, their movements were quantified over the three minutes with the use of video motion tracking.
In two comparable reading tasks, BSMS says, the less boring reading produced a significant reduction in movement: 42%.
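The release doesn’t publish the motion-tracking pipeline, but the basic idea, quantifying fidgeting as frame-to-frame change in a video, is simple enough to sketch. Below is a minimal, hypothetical illustration: synthetic “video” arrays stand in for real footage, the noise levels are invented, and `movement_index` is an assumed name, not anything from the BSMS study.

```python
import numpy as np

rng = np.random.default_rng(0)

def movement_index(frames):
    """Crude motion score: mean absolute pixel difference between
    consecutive frames. Real video motion tracking is far more
    sophisticated; this just illustrates the principle."""
    frames = np.asarray(frames, dtype=float)
    return float(np.mean(np.abs(np.diff(frames, axis=0))))

# Hypothetical data: 90 frames of 32x32 grayscale "video".
# The engaged viewer suppresses movement, so their frame-to-frame
# pixel noise is smaller than the bored viewer's.
engaged = rng.normal(128, 1.0, size=(90, 32, 32))
bored = rng.normal(128, 2.0, size=(90, 32, 32))

m_engaged = movement_index(engaged)
m_bored = movement_index(bored)
reduction = 100 * (1 - m_engaged / m_bored)
print(f"movement down {reduction:.0f}% when engaged")
```

With these made-up noise levels the toy metric shows roughly half as much movement for the engaged viewer; the study’s 42% figure came from real participants, not anything like this sketch.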
This certainly isn’t the first computer user micro-tracking experiment.
Back in late 2013, Facebook mulled silently tracking users’ cursor movements to see which ads we like best.
Google’s yet another company interested in our micro-wiggling.
In 2014, it was experimenting with swapping text CAPTCHAs for our human quiveriness when we click a mouse.
Readers, how much do you value your micro-movement privacy? Would you kiss it goodbye if it meant no more snoring your way through online content?
Let us know below!