MR.Dali - Stella Mühlhaus

MR.Dali    A new approach to the use of mixed reality.

Mixed Reality (MR) describes the blending of the real and the virtual world. Applied to science, this merger has enormous potential. “MR.Dali” supports natural scientists in collecting, editing and sharing data locally.

Through object recognition, information can be bound to objects that are captured with the tool named “stolus”. Eye tracking & muscle sensors help the scientist navigate the interface quickly and intuitively, even while walking. The collected data can be shared among researchers and used for analysis, education and planning.

DATE  —  March – May 2018

CATEGORY  —  Mixed Reality

PLACE — ZHdK, ETHZ & Technopark Zurich

TEAM — Stella Mühlhaus, Manuel Kallen, Micha Weber, Marco Ketzel

Project details
Foreword

OUR GOAL

The aim was to find an intuitive control system for Mixed Reality. Based on natural human behavior, our project proposes a new control combination: the eye gaze as the cursor and the muscles as the trigger. The advantages of our work were then validated and visualized through a use case.

THE NAME

Early on, our team and project needed a name. We chose MR for Mixed Reality plus the surrealist artist Salvador Dalí: “MR.Dali”.

Use Case

NATURAL SCIENCES

The topic of inventory in connection with the use case of natural sciences fascinated us throughout the entire project.

At first it may seem rather contradictory to use this modern piece of technology (mixed reality) outdoors in nature. But since natural scientists often have to carry technical equipment with them, it makes particular sense for them to wear a head-mounted display (HMD) instead of carrying a laptop.

Furthermore, in the course of the work it turned out that mixed reality may also be an asset that brings technology and nature together: “seeing the Data of Nature — making nature readable in human language …”

For example, our idea would make the following possible:

  • watching changes and symbioses
  • carrying less equipment with you
  • location-based / attached data
  • easy recovery of marked objects
  • clearer comparisons
  • learning/reading directly from the environment
  • easier planning by judging visible effects locally
Making of
Exhibition

MR.Dali supports researchers in collecting, processing and sharing measurement data on site. Using image recognition, information can be linked to objects that are captured by a pen. Eye-tracking control helps the user navigate the interface swiftly and intuitively. MR.Dali enables researchers to exchange the collected geographical and climatic data and thus to use it for analysis and planning.

the cursor

Eye Cursor

“Humans are eye-animals.” The first gesture humans normally make is to gaze at the object of interest. So why drag a cursor after it? Especially when we have a computer in front of our eyes, the eyes are practically predestined to serve control purposes.

We combined the Microsoft HoloLens with the eye trackers from PupilLabs. Through self-written code, the captured eye gaze can now be used as a cursor. This makes navigation in mixed reality faster and more intuitive. One advantage, which was not possible before, is that it can now also be used while walking.
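
A minimal sketch of how such a gaze cursor could be fed with data, assuming Pupil Capture’s publicly documented network API (ZMQ + msgpack); it is an illustration only, not our actual integration, and the hand-off to the HoloLens cursor is merely indicated by a placeholder.

# Minimal sketch: read gaze positions from Pupil Capture's network API
# and treat them as a 2D cursor. Assumes Pupil Capture is running locally
# with Pupil Remote on its default port (50020).
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote for the port of the data subscription socket.
remote = ctx.socket(zmq.REQ)
remote.connect("tcp://127.0.0.1:50020")
remote.send_string("SUB_PORT")
sub_port = remote.recv_string()

# Subscribe to gaze datums only.
sub = ctx.socket(zmq.SUB)
sub.connect(f"tcp://127.0.0.1:{sub_port}")
sub.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

MIN_CONFIDENCE = 0.6  # ignore low-confidence samples (value chosen for illustration)

while True:
    topic, payload = sub.recv_multipart()
    gaze = msgpack.unpackb(payload, raw=False)
    if gaze["confidence"] < MIN_CONFIDENCE:
        continue
    # norm_pos is the gaze point in normalized scene-camera coordinates (0..1).
    x, y = gaze["norm_pos"]
    # Here the normalized gaze point would be forwarded to the HoloLens
    # (e.g. over a socket) and mapped onto the holographic cursor.
    print(f"gaze cursor at ({x:.2f}, {y:.2f})")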

and above all…

The feeling that a computer object recognizes that it is being watched was an incredibly joyful moment. A new level of proximity has been created.

HoloLens with Eye-tracker
calibration
functional prototype
the mouse

“stolus”

To take action we need our muscles. Even if we only think about an action, we minimally tense our muscles. So this is our trigger … also for mixed reality purposes.

And let’s not forget that it is also natural for us to use tools instead of our bare hands to perform a task. Especially in the case of MR, a tangible tool can be a beneficial extension for the interaction.

Therefore we designed “stolus”.

The wristband captures the muscle contractions and comes with a device in the form of a biface.
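
To illustrate the principle (a sketch, not the project’s actual implementation), a muscle trigger can be built as a threshold on the short-term RMS of the EMG signal, with a lower release threshold so that one contraction produces exactly one click. All values below are placeholders.

# Illustrative sketch: turning a raw EMG stream from the wristband into a
# discrete "click" trigger by comparing the short-term RMS of the signal
# against a threshold with hysteresis.
from collections import deque
import math

WINDOW = 50          # samples per RMS window (assumed ~200 Hz sensor)
ON_THRESHOLD = 0.30  # RMS level that counts as a deliberate contraction
OFF_THRESHOLD = 0.15 # lower release level to avoid chattering

class EmgTrigger:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)
        self.active = False

    def feed(self, sample: float) -> bool:
        """Feed one normalized EMG sample; return True on a new trigger event."""
        self.samples.append(sample)
        if len(self.samples) < WINDOW:
            return False
        rms = math.sqrt(sum(s * s for s in self.samples) / WINDOW)
        if not self.active and rms > ON_THRESHOLD:
            self.active = True
            return True          # rising edge -> one click
        if self.active and rms < OFF_THRESHOLD:
            self.active = False  # released, ready for the next click
        return False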

Manu testing “Myo”
prototyping
vision
tagging

Object recognition

In MR, holograms are integrated into the physical world, so they can sit under/on/behind/… real objects. But if you move a physical object, its hologram stays at its place in space. The reason: MR does not yet distinguish objects from the room’s surfaces, so the hologram remains unaffected.

In our work we defined a parenting system that categorizes the connection between holograms and the physical world into three stages:

Space-Child, My-Child, Object-Child.

Through “Object-Child” the holograms gain a bit more reality and a whole new interaction level.

In our project we work on all three stages. With Vuforia we were able to develop our prototype of an object-child: the object was now recognizable and holograms could stick to it.
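
As an illustration of the idea (the names and the pose handling are simplified assumptions, not the project’s code), the three stages can be thought of as different anchors from which a hologram’s world position is resolved:

# Illustrative sketch of the three parenting stages. The pose sources
# (room anchor, user head, tracked object) are stand-ins for whatever the
# MR runtime provides; only the idea of resolving a hologram's position
# from its parent is shown.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Tuple

Vec3 = Tuple[float, float, float]

class Parent(Enum):
    SPACE_CHILD = auto()   # fixed in the room's coordinate system
    MY_CHILD = auto()      # follows the user (e.g. the satellite menu)
    OBJECT_CHILD = auto()  # follows a recognized physical object

@dataclass
class Hologram:
    parent: Parent
    offset: Vec3  # position relative to the chosen parent

    def world_position(self, room: Vec3, head: Vec3, tracked_object: Vec3) -> Vec3:
        anchor = {
            Parent.SPACE_CHILD: room,
            Parent.MY_CHILD: head,
            Parent.OBJECT_CHILD: tracked_object,  # moves when the object moves
        }[self.parent]
        return tuple(a + o for a, o in zip(anchor, self.offset))

# Example: a note hologram hovering 10 cm above a tracked plant sample.
note = Hologram(Parent.OBJECT_CHILD, (0.0, 0.1, 0.0))
print(note.world_position(room=(0, 0, 0), head=(1.0, 1.7, 0.0), tracked_object=(2.0, 0.5, 1.0)))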

parenting system
vision
functional prototype
interaction

Depending on the purpose of a menu or feature, it is essential to create an appropriate appearance that enables proper interaction. Here I introduce three interaction levels:

calling > Satellite

The satellite is a kind of “start menu” that always follows its user, hence its name. The satellite is generally invisible so the field of view remains unobstructed. Only when the user looks up to the left*, to the satellite’s position, does it become visible and unfold into the main menu.

* This position was chosen, among other factors, as an analogy to psychology: when a person unconsciously looks up to the left, it means that he or she is visually remembering something.
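
As a simple illustration of this behavior (the angle thresholds are placeholders, not measured values), the visibility check could look like this:

# Illustrative sketch: show the satellite menu only while the user looks
# up and to the left. Gaze angles are relative to the head's forward
# direction; thresholds are placeholders.
def satellite_visible(gaze_yaw_deg: float, gaze_pitch_deg: float) -> bool:
    """gaze_yaw_deg < 0 means looking left, gaze_pitch_deg > 0 means looking up."""
    LOOK_LEFT_DEG = -20.0
    LOOK_UP_DEG = 15.0
    return gaze_yaw_deg < LOOK_LEFT_DEG and gaze_pitch_deg > LOOK_UP_DEG
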
navigating > Main menu

When the satellite is enabled, the main menu unfolds vertically downwards, so that the gaze can be lowered again. The clear line arrangement takes up less space in the field of view and ensures an overview, which saves time.

handling > C-tool table

Example: Drawing tool    If the drawing tool is active, the tool palette can be opened by a double click. The tool palette appears around the hand that called it. The C shape allows quick and practical handling by simply tilting the hand in a certain direction. The open side is not needed, as it would be covered by the hand and, according to ergonomic tests, tilting towards it is perceived as a rather uncomfortable movement.
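
To illustrate the C-shaped arrangement (the 270° arc, the slot count and the tool names are assumptions for this sketch, not the project’s implementation), a hand-tilt direction can be mapped to a palette slot, while tilting towards the open side selects nothing:

# Illustrative sketch of the C-tool palette: tools sit on a 270° arc around
# the hand, and tilting the hand toward a slot selects it. The remaining
# 90° (the open side of the "C", toward the wrist) selects nothing.
import math
from typing import Optional, Sequence

def select_tool(tilt_x: float, tilt_y: float,
                tools: Sequence[str],
                open_side_center_deg: float = 270.0) -> Optional[str]:
    """Map a hand-tilt direction (x right, y up) to a tool slot, or None."""
    angle = math.degrees(math.atan2(tilt_y, tilt_x)) % 360.0
    # Skip the 90° gap that forms the open side of the C.
    gap_start = (open_side_center_deg - 45.0) % 360.0
    rel = (angle - gap_start) % 360.0
    if rel < 90.0:
        return None                      # tilted toward the open side
    arc_pos = rel - 90.0                 # position within the 270° usable arc
    slot = int(arc_pos / (270.0 / len(tools)))
    return tools[min(slot, len(tools) - 1)]

# Example: four drawing tools around the hand; tilting "up" picks one of them.
print(select_tool(0.0, 1.0, ["pen", "eraser", "color", "thickness"]))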
