Augmented Human International Conference

Focuses on scientific contributions towards augmenting human capabilities through technology for increased well-being and an enjoyable human experience. The topics of interest include, but are not limited to: Brain-Computer Interfaces, Muscle Interfaces and Implanted Interfaces; Wearable Computing and Ubiquitous Computing; Augmented and Mixed Reality; Human Augmentation, Sensory Substitution and Fusion; Hardware and Sensors for Augmented Human Technologies; Safety, Ethics, Trust, Privacy and Security Aspects of Augmented Humanity.[1]

  • First Held: 2010
  • Last Held: February 25-27, 2016 in Geneva

The topics below are drawn from presentations held at the Augmented Human Conference, February 25-27, 2016, in Geneva, that are relevant to Augmented Reality.

Wearability Factors for Skin Interfaces [2]
There are two aspects of wearable skin interfaces to consider.
Body Aspect: location, body movements and body characteristics
Device Aspect: attachment methods, weight, insulation, accessibility, communication, interaction, aesthetics, conductors, device care and connection, battery life
Skin interfaces example

A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors [3]
This presentation introduced a lifelog system that continuously measures biological information with wearable devices. In the experiment, glasses equipped with a temperature sensor measured nasal skin temperature while simultaneously recording video. With this information, the team could identify stressful situations.
stress diagram
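The paper's exact detection method is not described here, but the core idea (inferring stress from changes in nasal skin temperature over time) can be sketched as a simple baseline-comparison rule. All names and thresholds below are illustrative assumptions, not the authors' implementation:

```python
# Hedged sketch (not the authors' method): flag a sample as "stressed" when
# nasal skin temperature drops noticeably below a moving baseline, since
# peripheral skin temperature tends to fall under acute stress.
def detect_stress(temps, baseline_window=5, drop_threshold=0.5):
    """Return indices where temperature falls more than drop_threshold (degC)
    below the mean of the preceding baseline_window samples."""
    stressed = []
    for i in range(baseline_window, len(temps)):
        baseline = sum(temps[i - baseline_window:i]) / baseline_window
        if baseline - temps[i] > drop_threshold:
            stressed.append(i)
    return stressed

# Example: a brief temperature dip is flagged at indices 5 and 6.
readings = [34.0, 34.1, 34.0, 34.1, 34.0, 33.2, 33.1, 34.0]
print(detect_stress(readings))  # → [5, 6]
```

A real system would also have to compensate for ambient temperature and motion, which is presumably why the glasses record video alongside the sensor data.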

Augmented Visualization for Guiding Arm Movement in the First-Person Perspective [4]
The motivation behind the guided arm movement system is learning physical activities such as Tai-Chi, but it can be used to learn any other movement as well. The user wears AR glasses and sees the movement as an augmented overlay of body parts. Occlusion caused by other students or objects becomes irrelevant, since there is no longer a traditional trainer demonstrating the movements.
Physical activities can be learned in two steps:
1. First, learn the new movement roughly, i.e. the complete form of the moves.
2. Once the basic movements are known, learn the details by correcting small deviations.

Exploring Eye-Tracking-Driven Sonification for the Visually Impaired [5]
The idea is that the user can decide which information is relevant. Control is done by tracking the user's eye movements across the exploration field. The device can play sounds for color, text, and facial expressions.
In color sonification, colors are mapped to instruments and the pitch represents brightness.
If text appears in the eye-tracker's view, it is converted to spoken sounds. Using pitch and stereo panning, it can be located at a 2D position (see picture below).
Facial expressions are mapped to instruments, similarly to color recognition.
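The mapping described above (brightness to pitch, horizontal position to stereo panning) can be sketched as follows. The function name, frequency range, and averaging scheme are assumptions for illustration, not the paper's actual parameters:

```python
# Hedged sketch of the sonification mapping described in the text:
# a pixel's brightness selects a pitch, its horizontal position a stereo pan.
def sonify_pixel(r, g, b, x, width, f_min=220.0, f_max=880.0):
    """Return (frequency_hz, pan) for an RGB pixel at column x.

    Brightness in [0, 1] is mapped linearly onto [f_min, f_max];
    pan ranges from -1.0 (full left) to +1.0 (full right).
    """
    brightness = (r + g + b) / (3 * 255)           # simple average brightness
    frequency = f_min + brightness * (f_max - f_min)
    pan = 2.0 * x / (width - 1) - 1.0              # left edge → -1, right edge → +1
    return frequency, pan

# A white pixel at the right edge: highest pitch, panned fully right.
print(sonify_pixel(255, 255, 255, x=639, width=640))  # → (880.0, 1.0)
```

Vertical position could analogously be encoded in a second pitch dimension, which matches the text's claim that pitch plus stereo suffices to locate a sound in 2D.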

text AR


[2] Wearability Factors for Skin Interfaces. Xin Liu, Katia Vega, Pattie Maes, Joe A. Paradiso. MIT Media Lab
[3] A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors. Hiroki Yasufuku, Tsutomu Terada, Masahiko Tsukamoto
[4] Augmented Visualization for Guiding Arm Movement in the First-Person Perspective. Ping-Hsuan Han, Kuan-Wen Chen, Chen-Hsin Hsieh, Yu-Jie Huang, Yi-Ping Hung
[5] Exploring Eye-Tracking-Driven Sonification for the Visually Impaired. Michael Dietz, Maha El Garf, Ionut Damian, Elisabeth André
