Accurate OnSite Georeferenced Subsurface Utility Model Visualisation

Public utilities such as gas, electricity and water are essential in modern life. Maintaining this infrastructure is a challenge, since pipelines are usually buried under the surface. To properly maintain gas pipelines, it is important to plan the maintenance work carefully to avoid traffic congestion and to prevent gas from leaking and harming the local environment. The paper “Accurate OnSite Georeferenced Subsurface Utility Model Visualisation”, which was presented at SALENTO AVR 2015, presents a solution based on AR technology.

For gas pipes, electrical wires and similar assets, the recorded positions are usually only approximate. In most cases the terrain has to be scanned to locate the exact position and depth of the pipes. The scanning process uses reference features such as buildings, hydrants and street borders to locate the utilities. The next step is usually marking the presumed position of the utilities with spray paint on the road surface. The paper proposes using the scanning data as input for an AR system that renders the presumed locations in the real world. Furthermore, for exact measurement of the street, the use of a robotic total station (RTS) is proposed, which makes it possible to measure the geolocation of an asset with millimetre accuracy. The system uses the RTS, a survey prism mounted on a pole, and a tablet computer attached to the same pole.
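As a rough illustration of the georeferencing step, the surveyed site coordinates of a utility model point can be expressed relative to the prism before rendering. The sketch below is hypothetical (the coordinate frames, the 2D heading convention and the example values are assumptions for illustration, not taken from the paper):

```python
import math

def site_to_local(point, prism_pos, heading_rad):
    """Translate a surveyed site-frame point into the tablet's local
    frame: shift by the prism position, then rotate by the pole heading."""
    dx = point[0] - prism_pos[0]
    dy = point[1] - prism_pos[1]
    # 2D rotation into the device frame (depth passes through unchanged)
    x = dx * math.cos(heading_rad) + dy * math.sin(heading_rad)
    y = -dx * math.sin(heading_rad) + dy * math.cos(heading_rad)
    z = point[2] - prism_pos[2]
    return (x, y, z)

# Hypothetical pipe vertex 2 m north of the prism, buried 1.2 m deep
local = site_to_local((100.0, 102.0, -1.2), (100.0, 100.0, 0.0), 0.0)
print(local)  # (0.0, 2.0, -1.2)
```

With millimetre-accurate prism positions from the RTS, a transform like this keeps the rendered virtual excavation anchored to the real street as the user walks around.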



The user is able to move the prism around; the display updates in real time and shows a 3D virtual excavation near the prism location.


Although the system is in an early stage, it shows potential for future development. Future steps could involve the use of model and terrain renderings to help interpret the data; the system could be used to collect on-site data such as hydrant and manhole locations and load it into a database; and the technology could be applied in other contexts such as construction work, building-site monitoring, or the augmentation of building walls. [1]



[1] S. Côté and A. Girard-Vallée, “Accurate OnSite Georeferenced Subsurface Utility Model Visualisation”, SALENTO AVR 2015


AR applications in medical training

Medical education is one of the application fields of augmented reality. As mentioned in the use-case section, the head-mounted device HoloLens released by Microsoft can be used to visualize a fully virtual body and its parts in a room for studying anatomy. There are scientific approaches in this area as well.

In 2014, a study about skills assessment in minimally invasive surgery with AR was conducted. The scientists developed a system consisting of a box trainer, a camera and surgery tools equipped with sensors. Basic skills, hand-eye coordination and bimanual operation were tested in this study by comparing experienced surgeons with novices in the field. Differences between the two groups regarding smoothness and economy of motion were found. The authors concluded by highlighting the potential of the technology for medical training. [1]

The study by Kim, Chan, & Du did not deal with the potential of AR for surgical training, but for interaction with a radar screen. For this study, too, an AR system was developed; in addition, a training textbook for the same task was written. Two groups of undergraduates trained either with the AR system or with the textbook. During a practice session, the performance of the students was measured, and significant differences were found between the two groups: students practicing with the augmented reality application were more aware of the situation than students training with the textbook. This indicates that AR can have a positive learning effect on trainees in the medical field. [2]

The above studies suggest that the potential of augmented reality in medical training is high. In 2016, Lahanas et al. conducted a review of the educational opportunities of this technology, searching for relevant articles and evaluating them. Up to August 2015, they found twenty-seven relevant studies. The articles could be assigned to the following categories:

  • Laparoscopic surgical training
  • Neurosurgical procedures
  • Echocardiography

Even though face- and construct-validity was proven for most training methods, none of the applications was proven to have the ability to transfer information to the user. [3]

In conclusion, there are promising possibilities for applying AR to medical training. Possible areas of application in the medical education field are minimally invasive surgery, interaction with a radar screen, and anatomy. It seems that AR applications have a positive learning effect. However, there is no evidence in the literature that AR systems are able to transfer information to medical personnel.


[1] Lahanas, Loukas, Smailis, & Georgiou, 2015

[2] Kim, Chan, & Du, 2015

[3] Lahanas et al., 2015

Visual aspects of Augmented Reality

Vision Summit 2016

The International Conference on Augmented Reality, Virtual Reality and Computer Graphics (SALENTO AVR) takes place annually in the city of Otranto in Italy. The main goal of this conference is to bring together researchers, scientists, and practitioners to discuss key issues, approaches, ideas, open problems, innovative applications and trends in virtual and augmented reality, 3D visualization and computer graphics. The main areas of research are the fields of medicine, cultural heritage, arts, education, entertainment, and the industrial and military sectors. Some of the currently discussed Augmented Reality topics at this conference concern the visual aspects of AR and research on perceptual issues in AR. [1]

Legibility Issues within Augmented Reality Applications

A lot of newly developed Augmented Reality applications in the industrial environment use Head-Worn Displays (HWDs) to provide the user with technical documentation by adding textual graphics, 3D models and animations to the screen. A main problem of augmented reality in the industrial sector is the legibility and readability of the text displayed on HWDs. Because it is very important to display text style, colour and illustrations properly, efficient text visualisation is critical for industrial AR applications. Currently there is no standard defining how text should be displayed on HWDs, so many developers do not know the optimal text style for specific displays and applications.

According to the literature, the legibility of text depends mainly on the following aspects: the background, the display technology, and the text style, including font, size and colour. There are currently four approaches to increasing text legibility in AR:

  • Move the text on the display or direct user’s gaze towards high contrast areas.
    • For assembly and maintenance operations, where the user must stay focused on the task, this is not very feasible. Some workstations may not offer darker areas of the scene.
  • Modify the text contrast with hardware solutions in the display (e.g. LCD masking)
    • This solution, while the most promising approach, is still in the research stage.
  • Adapt the text color according to the background
    • For industrial applications this is not always feasible because it could violate color-coding guidelines that can be regulated by international standards (e.g. “ASME A13.1, 2007”, “ISO 3864”-Safety Symbols)
  • Employ outline/billboard technique
    • Based on the latest results, this is suggested to be a feasible solution for industrial applications because of its flexibility and ease of implementation.
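The third approach, adapting the text colour to the background, can be sketched with a standard contrast computation. The snippet below is a minimal illustration; the WCAG relative-luminance and contrast-ratio formulas are an assumption here, since the paper does not prescribe a specific metric. It samples the background colour behind the label and picks the maximum-contrast style:

```python
def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB colour (0-255 channels)."""
    def channel(c):
        c /= 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

def pick_text_style(bg):
    """Choose the maximum-contrast style for a sampled background patch."""
    white = contrast_ratio((255, 255, 255), bg)
    black = contrast_ratio((0, 0, 0), bg)
    return "white text" if white >= black else "black text on white billboard"

print(pick_text_style((30, 30, 30)))     # dark scene  -> white text
print(pick_text_style((220, 220, 200)))  # bright scene -> billboard style
```

Note that, as the list above points out, such per-background colour switching may conflict with colour-coding standards, which is why the outline/billboard technique is preferred in industrial settings.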

Figure: Comparison of the two text styles, plain text on the left, billboard on the right.

The latest research suggests that enhancing text contrast via software, using an outline or a billboard, is an effective practice to improve legibility in many situations. Maximum-contrast styles, like “black text on white billboard” or “white text only”, are suggested when the reading time of a scenario is important. In conclusion, billboards provide the best performance, but at the slight cost of scene occlusion. [2]



[2] A. Uva, M. Fiorentino, G. Monno, “Addressing Legibility Issues in Industrial Augmented Reality”, 2014


3D real time capturing combined with AR


Communication between two remote parties has always been an interesting research topic. With the emerging technologies of augmented reality and the devices based on it, the opportunities have again reached a new height. The aim of creating a virtual face-to-face experience is now achievable with the help of 3D capturing and augmented reality.


“holoportation is a new type of 3D capture technology that allows high-quality 3D models of people to be reconstructed, compressed and transmitted anywhere in the world in real time. When combined with mixed reality displays such as HoloLens, this technology allows users to see, hear, and interact with remote participants in 3D as if they are actually present in the same physical space. Communicating and interacting with remote users becomes as natural as face-to-face communication.” [1]


The Room2Room technology makes it possible to create telepresence between two remote participants. Its aim is to recreate the experience of a real face-to-face conversation with the help of 3D capturing and augmented reality technologies. Unlike the holoportation approach stated above, Room2Room telepresence does not rely on any wearable device. The local user is captured by colour and depth cameras, which allows the system to create a virtual copy of the user by projecting an image with the correct perspective. This setup enhances non-verbal communication and, as shown in a research environment, allows physical collaborative tasks to be performed more efficiently. [2]

Fig. 1: Room2Room Setup



[2] T. Pejsa, J. Kantor, H. Benko, E. Ofek, and A. Wilson, “Room2Room: Enabling Life-Size Telepresence in a Projected Augmented Reality Environment,” 2016.

Augmented Human International Conference

The conference focuses on scientific contributions towards augmenting human capabilities through technology for increased well-being and an enjoyable human experience. The topics of interest include, but are not limited to: Brain-Computer Interfaces, Muscle Interfaces and Implanted Interfaces; Wearable Computing and Ubiquitous Computing; Augmented and Mixed Reality; Human Augmentation, Sensory Substitution and Fusion; Hardware and Sensors for Augmented Human Technologies; Safety, Ethics, Trust, Privacy and Security Aspects of Augmented Humanity. [1]

  • First Held: 2010
  • Last Held: February 25-27, 2016 in Geneva

The topics below were chosen from presentations held at the Augmented Human Conference on 25-27 February 2016 in Geneva, selected for their relevance to Augmented Reality.

Wearability Factors for Skin Interfaces [2]
There are two aspects of wearable skin interfaces to consider.
Body Aspect: location, body movements and body characteristics
Device Aspect: attachment methods, weight, insulation, accessibility, communication, interaction, aesthetics, conductors, device care and connection, battery life
Figure: Skin interface example

A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors [3]
This presentation introduces a lifelog system that enables biological information to be measured at all times with wearable devices. The experiment used glasses that measure nasal skin temperature while recording video at the same time. With that information the team could identify stress situations.
Figure: Stress diagram

Augmented Visualization for Guiding Arm Movement in the First-Person Perspective [4]
The motivation behind guiding arm movement is to learn physical activities such as Tai Chi, but it can also be used to learn any other movements. The user wears AR glasses and sees the movement of the body as an augmented shape of body parts. Occlusion caused by other students or objects becomes irrelevant, as there is no longer a traditional trainer showing the movements.
Physical activities can be learned in two steps:
1. Learn the new movement roughly, i.e. roughly learn the complete form of the moves.
2. Once the basic movements are known, learn the details by correcting small deviations.
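The correction step (2.) can be sketched as a per-joint comparison against the reference movement. The following is a minimal illustration under assumed joint-angle inputs; the actual system works on full-body poses seen in the first-person perspective:

```python
def pose_deviations(user, reference, tolerance_deg=10.0):
    """Compare per-joint angles (degrees) of the user's pose against the
    reference movement and report joints that deviate too much."""
    corrections = {}
    for joint, ref_angle in reference.items():
        delta = user[joint] - ref_angle
        # wrap into (-180, 180] so 350 deg and -10 deg compare sensibly
        delta = (delta + 180.0) % 360.0 - 180.0
        if abs(delta) > tolerance_deg:
            corrections[joint] = delta
    return corrections

reference = {"elbow": 90.0, "shoulder": 45.0}
user = {"elbow": 112.0, "shoulder": 47.0}
print(pose_deviations(user, reference))  # {'elbow': 22.0}
```

Joints within tolerance are left alone, so the trainee only sees highlights for the small deviations that still need correcting.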

Exploring Eye-Tracking-Driven Sonification for the Visually Impaired [5]
The idea is that the user can decide which information is relevant. The control is done by tracking the eye movements across the user's exploration field. The device can play sounds for colour, text and facial expressions.
For colour sonification, colours are mapped to instruments and the pitch represents the brightness.
If text appears in the eye tracker's view, it is mapped to spoken sounds; with pitch and stereo panning, the text can be located at a 2D position.
Facial expressions are mapped to instruments, similar to colour recognition.
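The mappings described above can be sketched as simple functions. The brightness weights, MIDI pitch range and panning scheme below are assumptions for illustration, not values from the presentation:

```python
def brightness(rgb):
    """Perceived brightness of an RGB colour, 0.0-1.0 (Rec. 601 weights)."""
    r, g, b = rgb
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0

def color_to_note(rgb, low_midi=48, high_midi=84):
    """Map brightness to a MIDI pitch: dark colours low, bright colours high."""
    return round(low_midi + brightness(rgb) * (high_midi - low_midi))

def text_to_stereo(x, frame_width):
    """Pan a detected text region: 0.0 = hard left, 1.0 = hard right."""
    return x / frame_width

print(color_to_note((255, 255, 255)))  # 84, brightest -> highest pitch
print(text_to_stereo(320, 1280))       # 0.25, text left of centre
```

The instrument choice per colour hue and per facial expression would sit on top of these primitives.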



[2] Wearability Factors for Skin Interfaces. Xin Liu, Katia Vega, Pattie Maes, Joe A. Paradiso. MIT Media Lab
[3] A Lifelog System for Detecting Psychological Stress with Glass-equipped Temperature Sensors. Hiroki Yasufuku, Tsutomu Terada, Masahiko Tsukamoto
[4] Augmented Visualization for Guiding Arm Movement in the First-Person Perspective. Ping-Hsuan Han, Kuan-Wen Chen, Chen-Hsin Hsieh, Yu-Jie Huang, and Yi-Ping Hung
[5] Exploring Eye-Tracking-Driven Sonification for the Visually Impaired. Michael Dietz, Maha El Garf, Ionut Damian, Elisabeth André

Smart eyewear and the opportunities of the future

At Augmented Reality and technology conferences, one recurring topic is the evolution of wearable technologies and the opportunities they open up in the consumer and enterprise areas.

One upcoming subject under discussion is the next generation of smart eyewear. After Google presented its Google Glass, the important question is what happens next with this mobile computing platform and what its applications in enterprises are.

These topics are currently discussed at Augmented and Virtual Reality conferences. Below is a selection of presentations and discussions about Augmented Reality wearables, especially smart glasses and smart eyewear, from 2016 technology conferences.

Conference Name: Wearable Technology Show, March 2016, London, UK [1]

Presentation title: Sony Smart Eyeglass – Designing for the Human Being
Description: Customers don't know the possibilities of smart eyeglasses, and the question is what good solutions and applications for them would look like. This keynote expands on Sony's philosophical design considerations for the Smart Eyeglass and presents Augmented Reality application designs and use cases.

Presentation title: An Introduction to Smart Eyewear
In this presentation, several of the leading protagonists in Augmented Reality eyewear present their smart glasses and discuss the possibilities these new mobile computing platforms offer consumers and enterprises.

Conference Name: Augmented World Expo, June 2016, Santa Clara CA, USA [2]

Presentation title: Smart Glasses – Opportunities for the Enterprise Market
The main question of this short presentation is “what are the valuable opportunities for smart glasses in the enterprise?” The presentation reviews the evolution of the technology and how the industry has adopted this change to boost human productivity.

Presentation title: The butterfly dream: Smart eyewear in 2031
The population of Augmented Reality devices will increase over the next 15 years, and the eyewear will look almost like regular glasses. That will make the Augmented Reality industry very powerful and important, and it will also bear great responsibilities. These topics and the future design choices are discussed in this presentation.

Presentation title: The business impact of smart glasses for work
This presentation shares real stories of how businesses are deploying smart glasses in production environments. It also gives an outlook on how smart glasses are beginning to transform workplaces and how workers will use technologies like the Internet of Things (IoT) and big data analytics. Smart glasses connect the human workforce with the intelligence of machines and data.

Conference Name: ARVR Innovate, April 2016, Dublin, Ireland [3]

This conference has one presentation on the topic of smart glasses and Augmented Reality. The presentation's title is “Immersive AR & Smart Glasses – the opportunities on the road to consumer adoption“, but there is no description of its content.

A short discussion about smart glasses and smart eye wear

Since Google launched its Google Glass project, it has been clear that the use of smart eyewear will be inevitable in the near future. The question is not “if”, only “when” the technology will be available and technically mature.

Smart glasses will link available information and humans closer together. They feed augmented live information into the normal view during people's activities. Another question concerns the application areas of this technology: where is it helpful and useful, and where can enterprises generate money? It is only a matter of time before smart glasses become a part of our daily lives.

Here is a list of Augmented Reality smart glasses which are in development or already available. The list is of course not exhaustive. [4]

Epson Moverio Smart Eyewear

Epson Moverio BT-200 Smart Glass

Meta Augmented Reality Headset

Meta 2 Smart Glass (Dev Kit)

Vuzix Smart Glasses

Vuzix M300 Smart Glass

LaForge Shima

La Forge Optical - Shima Smart Glass

Optinvent ORA-2

Optinvent Ora-X Smart Glass


[1] Wearable Technology Show, Augmented Reality & VR Show Conference 2016, [28.04.2016]
[2] AWE, Augmented World Expo,[30.04.2016]
[3] ARVR Innovate: Where Augmented and Virtual Reality Get Down to Business, [01.05.2016]
[4] Hongkiat, Ten Forthcoming Augmented Reality & Smart Glasses You Can Buy, [01.05.2016]

Augmented Reality – Real Estate


Augmented reality is adding value to a variety of aspects of the real estate sector. The following chapters introduce the most common fields.

AR for Planning and Constructing

When planning an architectural object, many different people are involved throughout the whole process. Since not all of them can read construction plans easily, augmented reality can be a very useful support tool.

The system setup to provide the mentioned function consists of two elements.

  • Collaborative Design Platform (Fig 1, left)
  • On Site AR Application (Fig. 1, right)
Fig. 1: System setup

Collaborative Design Platform

The Collaborative Design Platform (CDP) is based on a multi-touch screen surface enhanced with real 3D object recognition. In the planning phase, 3D models can be placed on the CDP, where they are automatically recognized and digitally replicated in the central system.

On Site AR Application

The On Site AR Application is linked to the central system via wireless communication. This enables an operator on site to receive changes to the model in real time. With the help of a mobile device (e.g. a tablet, phone or augmented reality glasses) and AR technology, the virtual objects can be integrated into the physical environment.

The concept provides a possible solution that allows design changes to be reflected directly on site with the capabilities of Augmented Reality. [1]


AR for Real Estate Agents and Buyers

Augmented reality can be used as a sales support tool for estate agents as well as for buyers. Agents benefit from faster communication with potential customers. Furthermore, it is possible to get real-time feedback whenever a potential buyer is looking at an AR advertisement. This can help to generate new customers with the right targeting.

Buyers can get a virtual tour of the property before construction is even finished. It is also possible to change the interior design without actual furniture being present. This is a huge improvement compared to looking at an outline design. Taking a virtual tour is a helpful decision support tool in the process of buying new property. [2]




[1] R. Velasco, A. P. Brakke, and D. Chavarro, “Computer-Aided Architectural Design Futures. The Next City – New Technologies and the Future of the Built Environment,” Springer-Verlag Berlin Heidelb., vol. 527, pp. 172–191, 2015.


Application Areas of Virtual Reality

This article introduces the different application areas of virtual reality. Unlike augmented reality, virtual reality does not extend the physical environment with virtual elements; the aim is to create a completely independent virtual reality the user is able to interact with. The sensory experience can include sight, touch, hearing and even smell. [1]

Application Areas

The most popular fields in which virtual reality is present are:

  • Entertainment
  • Urban Design and Architecture
  • Training and Education
  • Medicine

The following chapters cover example applications of virtual reality to give a high level overview.


Entertainment

The first product which really caught the attention of the public was the 2012 Kickstarter crowdfunding campaign for the Oculus Rift. [2]

In the meantime, many different products have been introduced to the market. On April 15th, 2016, the first virtual reality exhibition opened in Tokyo for a limited time, allowing anyone to experience the fascination of virtual reality. [3]

Urban Design and Architecture

Virtual reality can be used to plan large-scale urban projects as well as individual buildings. The core element is a three-dimensional digital model of the planned development. This is especially useful for long-term projects and has been used for the development of the campus of the University of Cambridge. [4]

3D Model University Campus [4]
Training and Education

The US Army is using virtual reality setups to train for combat situations in an enclosed environment. This minimizes potential risks and allows specific situations to be reset and simulated in a short amount of time, which reduces costs and increases efficiency. [4]

US Army – Virtual Reality Training [5]

Medicine

Virtual reality is used to help individuals recovering from a stroke or neurological disorder. A patient is able to manipulate objects in a virtual environment, which can be structured to provide audio-visual feedback. This promotes motor learning and improves the rehabilitation process. Specific daily living tasks are practiced within the virtual environment, e.g. making a cup of coffee in a kitchen. Real-time motion capture helps the therapist to document changes with the support of software. [6]

Virtual Kitchen Setup [6]




[4] S. Roudavski, “VIRTUAL ENVIRONMENTS AS SITUATED TECHNO- SOCIAL PERFORMANCES Virtual West Cambridge case-study,” 2010.


[6] D. White, K. Burdick, G. Fulk, J. Searleman, and J. Carroll, “A virtual reality application for stroke patient rehabilitation,” IEEE Int. Conf. Mechatronics Autom. 2005, vol. 2, no. July, pp. 1081–1086, 2005.



Medical Augmented Reality

Health care is one of the application areas where augmented reality can be applied by all actors in the field. AR is already widely used in the training of new doctors, and there are lots of applications for medical education. Patients also benefit from therapy applications, helping them to regain full functionality of their body. In the future, AR will have a more profound impact on the health care sector: doctors will use AR to conduct surgeries more efficiently and with more accuracy than today. [1]

Medical education

Learning how the human body works is difficult. Medical students not only need to know how all the parts of the body function, but also where they are located. Since the human body is a three-dimensional object, augmented reality makes it easier to visualize than a two-dimensional screen or piece of paper.

At the Build 2016 keynote, Microsoft demonstrated an application for its AR glasses “HoloLens” which is used in medical education. Students are able to see a model of the human body including the parts relevant to the course. The model can be manipulated through gestures; for example, it is possible to zoom in and out, rotate the model, or switch between the parts that should be visible. Besides interaction, it is possible to collaborate in a group: a model can be manipulated by more than one person, and these people do not necessarily need to be in the same room. With the HoloLens application, it is possible to work with people in different geographical locations. Learning the functions of the human body with a 3D model, compared with studying books, saves valuable time and allows students to understand the functions more efficiently. [2]

Video: Microsoft HoloLens: Build 2016 Keynote. Medical education application demonstrated at 6:00 to 10:40.


Patients are an additional user group which can benefit from AR in the health care area. For example, AR can be used in the stroke rehabilitation process. Patients need intensive treatment, usually with a therapist. To lower therapy costs, patients practice arm and hand movements with an AR application instead of with a therapist; the app cannot fully replace a therapist, however. Compared with virtual reality solutions, an AR system feels more natural because the patient is able to see their own body and interact more intuitively. [3]

As early as 2010, a group of researchers developed a prototype of a game for upper-limb rehabilitation. Objects on a table and the hand movements are recognized by a webcam, and different movements can be practiced with various objects. The equipment for the training is affordable and is suitable not only for rehabilitation in a hospital but also for home use.



[3] Mousavi, H., Khademi, M., & Dodakian, L. (2013). A Spatial Augmented Reality Rehab System for Post-Stroke Hand Rehabilitation, 279–285.

Augmented Reality in Manufacturing Industry

Augmented reality in the manufacturing industry is a small part of the fourth industrial revolution, also known as Industry 4.0. The previous three industrial revolutions were the mechanization of production using water and steam power, mass production with the help of electric power, and the digital revolution with the help of electronics and IT. The term “Industry 4.0” was coined in 2011 by the German government to promote the computerization of manufacturing. Outside the German-speaking countries it is known as “digitization” and related to the IT hypes “Internet of Things” [1] and “Cyber-Physical Systems (CPS)” [2].

What is needed for AR in manufacturing industries?
Augmented reality in manufacturing industries will show real-time data of machines, workplaces and equipment and provide a Human-Machine Interface (HMI). A connected shop floor is a basic requirement for any further digitisation solutions. There is no standard interface to collect data from the shop floor; this service is provided by companies who sell SCADA, MES and automation software.

The picture shows the first five levels of a connected shop floor. Every level exchanges data with the surrounding levels in both directions.

  • The first and second level are at the physical shop floor.
  • SCADA-Network is the third level and stands for Supervisory Control And Data Acquisition. It is an industrial control system (ICS) that monitors and controls industrial processes in the physical world. It is a system for remote monitoring and control that operates with signals over communication channels to PLCs, PCs and PID controllers. The control system may be combined with a data acquisition system that acquires information about the status of the remote equipment. [3]
  • MES is the fourth level and stands for Manufacturing Execution Systems. MES are computerized systems used in manufacturing to track and document the transformation of raw materials into finished goods. MES works in real time with data from SCADA systems to measure and control activities in the production areas in order to increase productivity and improve process quality. It provides information that helps manufacturing decision makers understand how current conditions on the plant floor can be optimized to improve production output. [4]
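As a minimal illustration of how these levels interact, a SCADA-level read can feed an MES-level KPI. The snippet below is a hypothetical sketch; a real deployment would talk to PLCs via protocols such as OPC UA or Modbus rather than simulating values, and the tag names and counters here are invented:

```python
import random

def read_scada_tag(tag):
    """Stand-in for a SCADA-level read. A real system would query a PLC
    over a protocol such as OPC UA or Modbus; here we simulate a value."""
    return random.uniform(0.0, 100.0)

def mes_quality_rate(good_parts, total_parts):
    """MES-level KPI derived from shop-floor counters."""
    return good_parts / total_parts if total_parts else 0.0

# Level 3 (SCADA) delivers raw signals; level 4 (MES) turns them into KPIs.
temperature = read_scada_tag("oven/zone1/temperature")
print(f"quality rate: {mes_quality_rate(942, 1000):.1%}")  # quality rate: 94.2%
```

An AR overlay would then render such KPIs next to the machine the operator is looking at.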

Who are the target customers?
Users can incorporate this new technology into several different types of existing products and solutions. Augmented reality applications are easiest to implement and use for companies with experience in SCADA and MES systems, as they can use the existing connectivity to equipment, devices, machines, actuators and sensors. These benefits can be exploited by a variety of industries, including manufacturing, water and wastewater, and oil and gas. [5]

Production data can be displayed as augmentations on a tablet or smart glasses. Interaction between the operator and the equipment will also be possible. Some implementation effort is needed to visualize 3D instructions for maintenance or to show the steps needed to correct a machine fault. Users of such an application can be production managers, operators and maintenance teams.

What is offered to the customer?
Recognizing the potential for mobile displays to be used anywhere, a wide range of mobile device technology is used. For object identification, barcodes and QR codes are used; for environmental awareness, near-field communication (NFC) and the Global Positioning System (GPS) can be used. New 3D technology allows users to place models in their geographical context. This combination of easily adjustable services, available from a central location, provides users with a nearly unlimited range of tools for interacting with their environment. Tablets will become a cost-effective way to replace aging mounted displays, and managers can monitor on their existing smartphones, further driving efficiency without any infrastructure upgrade. [5]
The picture shows an overview of wearable devices which can be used for augmented reality use cases.[5]

Based on the user's position, data from a device or machine is loaded automatically. Once the operator's position is located at a workstation, the real camera image is overlaid in real time with the associated current and historical information about the device. By scanning a barcode or QR code, or by reading alphanumeric characters (OCR), the integrated augmented reality technology automatically loads all relevant data and widgets, for example the fuel consumption of engines or the actual motor current. With NFC, data can be displayed even when components are not directly accessible, for example a fan installed in a ventilation shaft. [6]
Other examples:

Use of GPS to load relevant information based on a detected location.

Iconics example in manufacturing
A manufacturing company might use NFC to identify specific pieces of machinery, or a supplier might use barcodes to identify anomalous batches in an SPC system.

Ergosign together with Inosoft: display of operating aids in the right place. An animated 3D hand presents the necessary steps in the camera image; operators understand this better than an instruction sheet.
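The identification-and-lookup step described above can be sketched as a simple registry lookup keyed by the scanned tag. The asset IDs, field names and values below are invented for illustration; a real system would query live SCADA/MES data instead of a static table:

```python
# Hypothetical asset registry, keyed by the identifier encoded in the
# QR code, barcode or NFC tag mounted on the equipment.
ASSET_DB = {
    "FAN-0042": {"name": "Ventilation fan, shaft B", "motor_current_a": 3.7},
    "ENG-0007": {"name": "Diesel engine, line 2", "fuel_l_per_h": 12.4},
}

def load_asset_widgets(scanned_id):
    """Resolve a scanned identifier to the data widgets the AR overlay
    should display next to the device."""
    asset = ASSET_DB.get(scanned_id)
    if asset is None:
        return ["Unknown asset - rescan tag"]
    return [f"{key}: {value}" for key, value in asset.items()]

print(load_asset_widgets("FAN-0042"))
# ['name: Ventilation fan, shaft B', 'motor_current_a: 3.7']
```

The same lookup works regardless of whether the identifier came from a QR code, a barcode, OCR or an NFC tag, which is what makes the approach attractive across device types.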

How is value achieved?
To capture value, suppliers can use their existing business model: the new augmented reality feature can be sold in the same way as their existing features for mobile devices. Value is created by selling more licenses, which in turn increases the customer's yearly license costs and generates more gross margin. Along with the new feature, the customer needs assistance with the product launch; these services and training costs can be delivered and invoiced separately.

How to create value?
Suppliers of HMI, MES and SCADA systems are the first players with augmented reality applications for the manufacturing industry. Their core business is to collect and show real-time data and to control equipment via a screen. These suppliers build new applications for wearable devices that show information as augmented reality in a production area. [5] Iconics, Inosoft and Progea are suppliers who can expand their mobile device software with a new augmented reality feature.

  • Iconics: HMI/SCADA/MES supplier
    The HMI/SCADA system of ICONICS will be extended with smart factory functions in 2016. Augmented reality will be implemented in the mobile HMI. Localisation via GPS, NFC or QR code makes it possible to identify the position and overlay the real picture with the appropriate data. [7]
  • Inosoft GmbH – HMI/SCADA supplier
    Inosoft presented an augmented reality application for maintenance and monitoring of a 3D printer at SPS IPC Drives 2014. [8]
  • Progea – HMI/SCADA supplier
    Progea presented their showcase at SPS IPC Drives 2014 of how the interaction between human and machine could look in the future. With Google Glass, additional visual information about production lines is shown to the operator. He is informed in real time about the status of the system and receives the necessary information, from production data and alarms to maintenance information and data sheets. The operator can interact via touchpad to perform actions. [9]


[5] Iconics White Paper. (2016). Augmented Reality and Wearable Devices, January 2016
[6] Iconics. (2015). Augmented Reality vor Ort. SPS IPC Drives 2015
[7] Neuheiten in HMI/SCADA, Analytics und Augmented Reality. (24.11.2015). Press release from: ICONICS Germany GmbH
[8] Ergosign Augmented Reality. (2015)
[9] Progea. Augmented Reality per Google Glass. (2014)