Mixed Reality (MR) describes the blending of the real and the virtual world. Applied to science, this merger has enormous potential. “MR.Dali” supports natural scientists in collecting, editing and sharing data on site.
Through object recognition, information can be bound to objects that are captured by the tool “stolus”. Eye tracking and muscle sensors help the scientist navigate the interface quickly and intuitively, even while walking. The collected data can be shared among researchers and used for analysis, education and planning.
DATE — March – May 2018
CATEGORY — Mixed Reality
PLACE — ZHdK, ETHZ & Technopark Zurich
TEAM — Stella Mühlhaus, Manuel Kallen, Micha Weber, Marco Ketzel
OUR GOAL
The aim was to find an intuitive control system for Mixed Reality. Based on natural human behavior, our project proposes a new control combination: the eye gaze as the cursor and the muscle as the trigger. The advantages of this approach were then validated and visualized through a use case.
Early on, our team and project needed a name. We chose MR for Mixed Reality plus the surrealist artist Salvador Dalí: “MR.Dali”.
NATURAL SCIENCES
The topic of inventory in connection with the use case of natural sciences fascinated us throughout the entire project.
At first it may seem rather contradictory to use this modern piece of technology (mixed reality) outdoors in nature. But since natural scientists often have to carry technical equipment with them anyway, it makes particular sense for them to wear a head-mounted display (HMD) instead of carrying a laptop.
Furthermore, in the course of the work it turned out that mixed reality may also be an asset that brings technology and nature together: “seeing the data of nature — making nature readable in human language …”
For example, our idea would make the following possible:
MR.Dali supports researchers in collecting, processing and sharing measurement data on site. Using image recognition, information can be linked to objects that are captured with a pen. Eye-tracking control helps the user navigate the interface swiftly and intuitively. MR.Dali enables researchers to exchange the collected geographical and climatic data and to use it for analysis and planning.
“Humans are eye-animals.” The first gesture humans normally make is to gaze at the object of interest. So why drag a cursor after it? Especially when we have a computer right in front of our eyes, the eyes are predestined to take over control.
We combined the Microsoft HoloLens with the eye trackers from Pupil Labs. Through self-written code, the captured eye gaze can now be used as the cursor. This makes navigation in mixed reality faster and more intuitive. One advantage that was not possible before is that the interface can now also be used while walking.
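To illustrate the idea (this is a minimal sketch, not the project’s actual code), a gaze direction can be turned into a cursor position by intersecting the gaze ray with a virtual plane in front of the headset. The function name, coordinate convention and plane distance below are our own assumptions:

```python
import math

def gaze_to_cursor(gaze_dir, plane_distance=2.0):
    """Project a gaze direction onto a virtual plane `plane_distance`
    metres in front of the headset, returning (x, y) cursor coordinates
    in head space. Assumes +z points forward from the user's head."""
    dx, dy, dz = gaze_dir
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / norm, dy / norm, dz / norm
    if dz <= 0:
        return None  # looking backwards: no cursor on the plane
    t = plane_distance / dz  # ray parameter where the plane is hit
    return (dx * t, dy * t)
```

Looking straight ahead yields the cursor at the centre of the plane; in a real system the resulting point would then be fed to a raycast against the holograms.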
The feeling that a computer object recognizes it is being watched was an incredibly joyful moment. A new level of proximity had been created.
To take action we need our muscles. Even when we only think about an action, we minimally tense our muscles. So this is our trigger … also for mixed reality purposes.
And let’s not forget that it is also natural for us to use tools instead of our bare hands to perform a task. Especially in the case of MR, a tangible tool can be a beneficial extension of the interaction.
Therefore we designed “stolus”.
The wristband captures the muscle contractions and comes with a device in the form of a biface.
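The trigger logic behind such a wristband can be sketched as a simple threshold with hysteresis on the rectified muscle signal, so that one contraction fires exactly one event. The class name and the level values below are illustrative assumptions, not the sensor’s real calibration:

```python
class MuscleTrigger:
    """Fires once when the rectified EMG signal crosses `fire_level`,
    then re-arms only after the signal drops below `release_level`
    (hysteresis), so a single contraction triggers a single event."""

    def __init__(self, fire_level=0.6, release_level=0.3):
        self.fire_level = fire_level
        self.release_level = release_level
        self.armed = True

    def update(self, sample):
        level = abs(sample)  # rectify the raw EMG sample
        if self.armed and level >= self.fire_level:
            self.armed = False
            return True  # trigger event (e.g. a "click")
        if not self.armed and level <= self.release_level:
            self.armed = True  # contraction released: re-arm
        return False
```

Hysteresis is the design choice that matters here: a bare threshold would fire repeatedly while the muscle stays tensed.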
In MR, holograms are integrated into the physical world, so they can sit under/on/behind/… real objects. But if you move a physical object, its hologram stays in its spatial place. The reason: MR does not yet distinguish objects from the room surfaces, so the hologram remains unaffected.
In our work we defined a parenting system that categorizes the connection between holograms and the physical world in 3 stages:
Space-Child, My-Child, Object-Child.
Through “Object-Child” the holograms gain a bit more reality and a whole new level of interaction.
In our project we worked on all 3 stages. With Vuforia we were able to develop our prototype of an Object-Child: the object was now recognizable and holograms could stick to it.
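The parenting system can be sketched as a small data model (the three stage names come from the project; the anchor semantics encoded here are our own illustrative reading):

```python
from enum import Enum


class Parenting(Enum):
    SPACE_CHILD = "anchored to a fixed position in the room"
    MY_CHILD = "anchored to the user (follows the head/body)"
    OBJECT_CHILD = "anchored to a recognized physical object"


class Hologram:
    def __init__(self, name, parenting, offset=(0.0, 0.0, 0.0)):
        self.name = name
        self.parenting = parenting
        self.offset = offset  # position relative to its parent anchor

    def world_position(self, anchor_world_pos):
        """A Space-Child keeps fixed room coordinates; a My-Child or
        Object-Child follows its moving anchor (user or tracked object)."""
        if self.parenting is Parenting.SPACE_CHILD:
            return self.offset
        ax, ay, az = anchor_world_pos
        ox, oy, oz = self.offset
        return (ax + ox, ay + oy, az + oz)
```

In this reading, an Object-Child’s label moves wherever the recognized object moves, which is exactly what the Vuforia prototype made possible.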
Depending on the purpose of a menu or feature, it is essential to create an appropriate appearance that enables proper interaction. Here I introduce three interaction levels:
The satellite is a kind of “start menu” that always follows its user, hence its name. It is generally invisible so the field of view remains unobstructed. Only when the user looks up to the left*, to the satellite’s position, does it become visible and unfold into the main menu.
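The visibility rule can be sketched as a check on the gaze angles; the sign convention (negative yaw = left, positive pitch = up) and the threshold values are illustrative guesses, not the values used in the project:

```python
def satellite_visible(yaw_deg, pitch_deg, yaw_min=-60.0, pitch_min=25.0):
    """Show the satellite menu only when the gaze points up and to the
    left. Convention assumed here: negative yaw = left, positive
    pitch = up; thresholds are illustrative."""
    return yaw_deg <= yaw_min and pitch_deg >= pitch_min
```

Looking anywhere else keeps the field of view unobstructed, which is the whole point of the satellite.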
Example: drawing tool. If the drawing tool is active, the tool palette can be opened with a double click. The palette appears around the hand that called it. Its C shape allows quick and practical handling by simply tilting the hand in a certain direction. The open side is not needed, as it would be covered by the hand, and ergonomic tests showed that tilting towards it is perceived as a rather uncomfortable movement.
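The tilt-to-selection mapping can be sketched as follows: the hand-tilt vector is converted to an angle, a gap is left for the open side of the C, and the remaining arc is divided among the palette slots. The segment count, dead zone and gap placement are our own assumptions, not the project’s values:

```python
import math

def palette_selection(tilt_x, tilt_y, segments=5, dead_zone=0.15):
    """Map a hand-tilt vector onto one of `segments` slots arranged in a
    C shape. Returns the slot index, or None if the hand is roughly level
    or tilted towards the open side of the C. All values illustrative."""
    magnitude = math.hypot(tilt_x, tilt_y)
    if magnitude < dead_zone:
        return None  # hand roughly level: nothing selected
    angle = math.degrees(math.atan2(tilt_y, tilt_x)) % 360.0
    # leave a 72-degree gap for the open side of the C (assumed at the
    # bottom, towards the wrist)
    open_start, open_end = 234.0, 306.0
    if open_start <= angle <= open_end:
        return None  # tilting into the open side selects nothing
    # remap the remaining 288-degree arc onto the segments
    usable = (angle - open_end) % 360.0
    return int(usable / (360.0 - (open_end - open_start)) * segments)
```

The dead zone prevents accidental selections while the hand hovers, matching the ergonomic reasoning above for excluding the open side.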