How do you design for a system that breaks all existing principles?

by Daria Lanz on 8 June 2017

Meet the mi.mu gloves: a gestural musical interface. In plain English, these are gloves that let you control your digital music software (such as Ableton or Logic) by making hand gestures in the air. You can bang invisible drums around you to produce bongo sounds, play air guitar and strum its invisible strings, bend the pitch of your instrument by raising and lowering your arm, or fade a filter on and off by pinching your fingers (to name a few). See the mi.mu gloves in action.

Gestural control has been emerging for some time now, and there is plenty of debate around the pros and cons of its use and interaction. Digital musical instruments have been around for a while too, and they come with their own set of do's and don'ts. The challenge comes when these two worlds collide: principles clash as traditional UI interactions, gestural technology and established musical research all try to co-exist harmoniously.

We've spent the last few months working with the mi.mu team and researching best practices for the usability and interaction of these gloves. We've brushed up on the challenges of gestural interfaces and the vast opportunities that come with carving a new path within technology and digital products. Here's what we've found...

[Image: glovesbody2.jpg, source: https://www.virgin.com/music/all-you-need-is-glove-imogen-heap-and-the-kickstarter-dream]

Gestural interfaces

The Nielsen Norman Group state that gestural interfaces ignore and violate well-tested and understood standards of interaction design. The lack of consistency and inability to discover operations, coupled with the ease of accidentally triggering actions from which there is no recovery, threatens the viability of these systems.

They argue that there are several important fundamental principles of interaction design that are rapidly disappearing as new technology emerges. For a comprehensive list of these principles, check out this article.

Digital musical interfaces

There's lots of academic literature out there on digital musical interfaces. The biggest takeaway, and the one most applicable to the mi.mu gloves, is around control and mapping. It states that, as a rule, digital musical interfaces shouldn't give complete control to the user to program how musical outputs are triggered.

Think about it this way: you're learning how to play the piano. Your muscles are getting used to the hand positions and movements required to play the keys, and your brain is on overload trying to understand the rules of music theory. With little effort and time commitment, you can at least make a few good-sounding notes by pressing the keys your teacher tells you to, and you can replicate the tune by memorising the order in which to play those keys.

Now imagine that every time you played the piano, the action required to hammer the strings (hitting a key with your finger) changed. One time you might need to press the foot pedals to hammer the strings. The next time you might pull the keys upwards to strike the hammers. Or perhaps you must press two different keys in combination while activating the foot pedal to trigger a single hammer. Or the key you press might simply produce a different pitch than it did the last time you played. This would make learning to play the piano exceedingly frustrating.

This is exactly what a musician does with the mi.mu gloves. The musician chooses which gestural movement will trigger whichever sound they like. It means they can take a song they've composed and choreograph the movements which feel most natural to them, triggering different parts of their song as necessary. Musicians are liberated to perform live without being tied to a computer to trigger pre-recorded tracks.
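To make that mapping idea concrete, here's a rough illustrative sketch of a performer-defined gesture-to-sound mapping, written in Python with the mido MIDI library. The gesture names, the detection layer and the note numbers are hypothetical stand-ins, not how the mi.mu software is actually built; they just show the general shape of letting the musician choose which movement fires which sound.

# Illustrative sketch only: hypothetical gesture names mapped to MIDI note triggers.
import mido

out = mido.open_output()  # open the default MIDI output port

# The performer decides which gesture fires which sound.
GESTURE_TO_NOTE = {
    "fist": 36,          # e.g. a kick drum sample mapped to MIDI note 36
    "open_palm": 38,     # snare
    "finger_pinch": 46,  # open hi-hat
}

def on_gesture(gesture_name):
    """Called by some (hypothetical) gesture recogniser; fires the assigned sound."""
    note = GESTURE_TO_NOTE.get(gesture_name)
    if note is not None:
        out.send(mido.Message('note_on', note=note, velocity=100))
        out.send(mido.Message('note_off', note=note, velocity=0))

The important part is that the table belongs to the performer: swapping which gesture fires which sound is a configuration choice, not a change to the instrument itself, and that is exactly the freedom (and the risk) the academic literature warns about.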

It also opens up an entirely new type of music composition. Essentially, it brings the power of digital music to acoustic instruments. Traditionally, these two streams have developed in isolation. Digital music (DJ sets or any electronic music production) is full of software that allows musicians to build entirely new sounds, layer effects, and trigger different filters to manipulate the sound as the track plays out. This is done entirely on the computer, no instruments necessary.

Acoustic instruments like piano, acoustic guitar, or voice, are all about the physical instrument, and the mastery of playing the instrument. Going to see a pianist perform isn't about the different sounds or effects she can make using the piano, but rather her expert skills in performing difficult repertoire.

Some contemporary artists push the ability of their instrument by playing it in unconventional ways – they slap their guitar to make percussive sounds, or they use a loop pedal to build and layer their voice or instrument on top of itself. But these techniques are limited. The former is still constrained by the limitations of the physical instrument. And the latter is dependent on gear that you must awkwardly trigger with the push of a button.

As a rule, digital musical interfaces shouldn't give complete control to the user to program how musical outputs are triggered.

Many artists, like Imogen Heap (the original creator of the mi.mu gloves), manipulate acoustic instruments digitally when recording their albums. They do this in the studio by recording their instrument, then digitally altering it on the computer using their music software. The problem comes when they need to replicate this sound on stage. They can't do it live, so they must trigger the pre-recorded tracks at the right moment of the song. They do this using a control panel of buttons and faders that they must constantly return to on stage (or they have a technician doing it behind the scenes).

This is where the mi.mu gloves answer a huge user need. By programming certain gestures to activate different tools within their music software, musicians can manipulate the acoustic sounds live on stage. Of course, they could also simply trigger pre-recorded tracks as they would with their control panel, but now they aren't constantly walking back across the stage to their computer to push a button. With the mi.mu gloves, they can do it with a poetic spin of the wrist. They can communicate their music better to their audience, making the performance more engaging.
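The continuous controls mentioned earlier (bending pitch with the arm, fading a filter with a pinch) follow the same pattern, just with continuous MIDI messages instead of one-off note triggers. Again, a hedged sketch using mido, where the normalised inputs are hypothetical stand-ins for whatever sensor data a gestural system actually exposes:

# Illustrative sketch only: continuous glove readings mapped to continuous MIDI messages.
import mido

out = mido.open_output()

def on_arm_height(height):
    """Map a normalised arm height (0.0 to 1.0) to MIDI pitch bend (-8192 to 8191)."""
    bend = int(height * 16383) - 8192
    out.send(mido.Message('pitchwheel', pitch=bend))

def on_pinch(amount):
    """Map a normalised pinch amount (0.0 to 1.0) to a filter cutoff CC (0 to 127)."""
    out.send(mido.Message('control_change', control=74, value=int(amount * 127)))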

[Image: glovesbody1.jpg, source: https://www.virgin.com/music/all-you-need-is-glove-imogen-heap-and-the-kickstarter-dream]

But all this customisability comes at a cost: the learning curve is incredibly steep when you first open the software and plug your gloves in.

Which is exactly where UX comes in. Our challenge is to explore how to create an experience with a low learning curve, a low rate of abandonment, and high learnability and discoverability, while still allowing deep customisation and flexibility across a wide range of musicians. Watch this space to follow our process as we break through these new challenges.
