Technology is an integral part of everyday life. The "Personal Internet of Things" has become a reality and consists of connected devices such as smartwatches, smartphones and smart glasses. These devices serve as communication interfaces to other people and will become increasingly advanced over time.

This progressive digitization through connected smart devices is also an interesting and promising field for industrial production. Interconnected machines, robots, tools and humans form cyber-physical systems (CPS) that have the potential to significantly accelerate the product lifecycle. Humans will operate within these systems as flexible problem solvers and intelligent decision makers. However, the ever-growing number of connected devices and advanced sensors in the "Industrial Internet of Things" creates huge amounts of data that must be understood and managed by the decision maker. Communication with the CPS must therefore be easy, intuitive and efficient.

The goal of iKPT 4.0 is to develop and investigate new ways of interacting with the complex data of the "Industrial Internet of Things" in order to shorten production times, react flexibly to problems and changes in the production chain, and dynamically create different variants of a product (custom manufacturing). In this context, novel technologies are used to create new multimodal interaction paradigms that allow complex data to be perceived much faster and more easily, and to be managed much more intuitively.

The project iKPT 4.0 is divided into three major parts:

M1) User-centered adaptation of the working space in the CPS: The worker's state is observed via physiological and body-tracking technologies, which can also be employed to measure the environment. Based on this information, digital content and visualizations are processed appropriately for the worker. The result will be an interactive, adaptable system for its users.
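The adaptation loop in M1 can be illustrated with a minimal sketch. This is a hypothetical example, not the project's actual system: the sensor inputs (`heart_rate_bpm`, `motion_level`), the workload formula and the detail-level thresholds are all illustrative assumptions.

```python
# Hypothetical sketch of M1's sense -> estimate -> adapt loop:
# map assumed physiological readings to a workload score, then
# reduce on-screen detail as the estimated workload rises.

def estimate_workload(heart_rate_bpm: float, motion_level: float) -> float:
    """Combine two assumed sensor readings into a workload score in [0, 1].

    motion_level is expected in [0, 1]; the 60-120 bpm range and the
    0.7/0.3 weighting are illustrative choices, not project parameters.
    """
    hr_score = min(max((heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    return min(1.0, 0.7 * hr_score + 0.3 * motion_level)

def select_detail_level(workload: float) -> str:
    """Adapt the visualization: show less content under higher workload."""
    if workload < 0.3:
        return "full"      # all dashboards and notifications
    if workload < 0.7:
        return "reduced"   # only task-relevant content
    return "minimal"       # critical alerts only

# Example: a calm worker gets the full view, a strained one a minimal view.
print(select_detail_level(estimate_workload(65, 0.1)))   # -> full
print(select_detail_level(estimate_workload(110, 0.8)))  # -> minimal
```

In a real deployment the two functions would be fed by the physiological and body-tracking pipeline and would drive the rendering layer; the point here is only the closed loop from observed state to adapted presentation.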

M2) Multimodal interaction for cooperative human-robot interaction: In series production (e.g. the automotive sector), the actions of humans and robots have to be synchronized and coordinated so that the human can act as a cooperation partner of the robot. This cooperation should also be possible in unpredictable situations.
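One common way to realize such coordination is a distance-based cooperation policy. The following is a hypothetical sketch, not the project's controller: the distance thresholds and speed factors are illustrative assumptions.

```python
# Hypothetical sketch of a human-robot coordination policy for M2:
# the robot pauses or slows down depending on the tracked distance
# to the human co-worker, so cooperation stays safe even in
# unpredictable situations.

from dataclasses import dataclass

@dataclass
class RobotCommand:
    speed_factor: float  # 0.0 = stopped, 1.0 = full speed
    note: str

def coordinate(human_distance_m: float) -> RobotCommand:
    """Map the human's distance to a robot speed (assumed thresholds)."""
    if human_distance_m < 0.5:
        return RobotCommand(0.0, "pause: human in shared workspace")
    if human_distance_m < 1.5:
        return RobotCommand(0.3, "slow: human nearby, cooperative mode")
    return RobotCommand(1.0, "full speed: human outside cooperation zone")

# Example: the robot stops when the human reaches into the workspace.
print(coordinate(0.2).note)  # -> pause: human in shared workspace
```

A real system would replace the single distance value with the multimodal tracking data from M1 and add hysteresis so the robot does not oscillate between modes at a threshold boundary.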

M3) Design engineering through mixed-reality technologies: This part enhances the design process of technical products, services and systems with innovative digital augmented- and virtual-reality solutions. A user-centered design process will be used to develop pre-visualization tools (sketches, modelling, illustration, simulation) that simulate human-machine cooperation in a collaborative manner.

All parts will be processed in similar steps:

  • Empathize: Get in contact with a specific working environment/field
  • Define/Analyze: Analyze the findings of the empathize phase and define the problems
  • Ideate: Create ideas to solve the problems with appropriate methods (e.g. co-design workshops)
  • Prototype: Build a practical solution
  • Test: Perform quantitative and qualitative user evaluations

These steps must be individualized for each project part. Moreover, adaptable solutions will be developed through iterative processes.

In summary, the project focuses on relieving humans in industrial production and on the use of body-worn interaction in Industry 4.0. Existing technologies will be enhanced and used in novel combinations with appropriate methods in order to find innovative solutions, which will then be tested in industry.