Some Google Glass users have faced a bit of a backlash for being perceived as, well, a bit of a tosser. So here is some wearable tech aimed at making you seem more sociable instead – the AgencyGlass.
The AgencyGlass is a prototype developed by Professor Hirotaka Osawa of the University of Tsukuba which is effectively the reverse of a Google Glass – whereas Google Glass projects information to the wearer, the AgencyGlass is designed to project information outwards to the people the wearer is interacting with.
The glasses feature two OLED screens, which display a set of socially responsive “eyes” to anyone the wearer is interacting with. The projected eyes are controlled by either a smartphone or a PC over a Bluetooth wireless connection, and the computer is also connected to a camera that takes readings from the wider environment.
The glasses themselves also have gyroscope and accelerometer sensors fitted to one arm to monitor the wearer’s behaviour, while a battery sits on the other arm to power the device.
If the wearer nods, the AgencyGlass “eyes” blink. If the wearer shakes their head, the eyes look upwards.
Then, entering the realms of the slightly unsettling, facial recognition software detects when someone is looking straight at the wearer, and the AgencyGlass eyes gaze back at them. If the wearer inclines their head, the eyes still look upward – so basically the wearer can move their head around and the “eyes” on their glasses will maintain a fixed stare at anyone looking at them…
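The sensor-to-eye logic described above could be sketched roughly like this. To be clear, this is a purely hypothetical toy in Python, not the actual AgencyGlass firmware – all the function names and thresholds are made up for illustration:

```python
# Hypothetical sketch of how head-motion readings and the camera feed
# might be mapped to the projected eye behaviour. Thresholds are
# invented; the real prototype's logic is not public.

def eye_state(pitch_delta, yaw_delta, face_detected,
              nod_threshold=15.0, shake_threshold=15.0):
    """Decide which animation the OLED 'eyes' should show.

    pitch_delta / yaw_delta: change in head angle (degrees) since the
    last sensor reading. face_detected: whether the camera sees
    someone looking straight at the wearer.
    """
    if abs(pitch_delta) >= nod_threshold:    # wearer nods -> eyes blink
        return "blink"
    if abs(yaw_delta) >= shake_threshold:    # wearer shakes head -> eyes look up
        return "look_up"
    if face_detected:                        # someone looks at the wearer -> gaze back
        return "gaze_back"
    return "idle"                            # nothing happening -> neutral eyes
```

A loop polling the gyroscope/accelerometer would call something like `eye_state(20.0, 0.0, False)` and get `"blink"`, while a detected face with a still head would yield `"gaze_back"` – the fixed-stare effect.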
Before you file this in the “weird and wacky inventions from Japan” basket, the professor who invented the glasses is clearly aware that the AgencyGlass prototype makes people look a little… odd. Or at least the demonstration video he put together for it doesn’t take itself too seriously (you’ve gotta love the computer-voice narration and the close-ups of creepy eyes).
On a more serious note, this invention is actually part of a much bigger field of research into understanding how we send and receive emotional information via our faces. This non-verbal communication in turn has a wide range of applications:
- Developing more emotionally intuitive computer systems.
Computers that can understand our facial cues, as well as project their own, will find it easier to understand us, in turn making it easier for us to interact with software and voice recognition applications.
- Developing a better understanding of conditions like autism.
Those on the autism spectrum often have difficulty recognising and/or accurately assessing the emotional cues in the faces of people they interact with, which can create barriers to social interaction. Emotionally intelligent computers can act as tools for autistic people to translate the emotional states of those around them in real time.
- Developing better language translation tools.
It appears that facial expressions of emotion are not culturally universal, and that this can lead to cross-cultural misunderstandings and language translation errors. Tools that can translate not only words and grammar but also the emotional context of the speakers can improve the accuracy of real-time language translators.
Looking for your own AgencyGlass?
Well you can’t get one.
AgencyGlass is even more exclusive than Google Glass, since at this stage it is a prototype consisting of just one set. But fear not, for society has already provided us with the low-tech version – Googly Eyes.
As demonstrated by Seth Green, these can be used to sneak a nap, while looking completely busy to everyone else.
Too much use of Googly Eyes can result in a permanent state of Googly Eyes.