As a UX professional, I suffer/benefit from the syndrome that it’s impossible to see a designed interaction and not make an instant evaluation of it. It makes frustrating interactions even more frustrating because I’m aware of the lack of care taken by the designer – I know it’s preventable. Like being suckered into pulling open a door that has a handle, then discovering it can only be pushed open. The handle is a compelling affordance for pulling, but the designer didn’t bother to think it through or, heaven forbid, user-test it. Of course, the same syndrome gives me a kick when something is brilliantly designed, so it balances out. But it’s a syndrome that can’t be switched off – even when watching films.

Man-machine interfaces can make for special moments in movies: the high tension when they go wrong, the effort of making something essentially dull, like a computer screen, look interesting, and the sheer invention that comes from speculating about what our future will look like. So, in celebration of the expert-review-syndrome-that-once-awakened-can-never-be-switched-off for UX professionals everywhere, here’s my list of classic man-machine moments – good and bad.
So, Arnie’s terminator is holed up in grim rented accommodation because wounds on his flesh coating are showing the machine bits underneath. The landlord is hassling him. Arnie could perhaps punch through the door and silence the pest but he needs to blend in. We witness the terminator selecting from a worryingly limited set of verbal responses. It isn’t really plausible that a sophisticated AI masquerading as a human being would have to read these English text options from some kind of retina display – a case of spectacular over-engineering. But we get the sense of how functional Arnie is. And, hey, he makes the right choice all right.
Luckily, when Bond is poisoned during the poker game he has a poison-antidote kit with a direct line back to the office. Unluckily, the kit is quite complicated and surprisingly not very robust. He has to link his mobile phone to a blood sample detector and spike his skin whilst making an emergency call to the boffins at HQ. The poison is duly analysed, but after selecting the right antidote he has to restart his heart with a separate stand-alone defibrillator (wouldn’t it be better if the boffins could trigger the defibrillator remotely?). Worse still, the defibrillator is attached to a power pack via a very flimsy removable lead. No matter that the defibrillator has a nice clear call to action – the good old red button. It’s a nice display of user frustration, with Bond repeatedly pressing the useless control. Handy then that the Bond girl has spent some time working in IT support – the problem is clear. Plug it in, Bond.
A master class in using poor technology to ramp up the tension. A motion tracker is a great feature to have on a massive pulse gun when you’re trapped on all sides by advancing aliens. But the display only shows the horizontal distance and direction in which your worst enemy is lurking. It seems to be an equipment malfunction when the display tells our heroes that aliens are within touching distance. If they’re in the room, then why can’t we see them? But the equipment doesn’t lie – the pesky critters are just 5 metres away, in the ceiling. The display doesn’t give a reading for the vertical range – or maybe the display has been deliberately simplified for combat situations and the fault is with training. Or the manual. Seems like a reasonable user mistake to me – one that should definitely have been picked up at the early prototype stage.
You may well wonder at the point of an early scene in the film where Ripley gets to display her prowess at shifting containers around with an exo-skeleton power loader. But it all becomes clear when she’s got an angry alien queen to dispose of. Whenever you see technology of this kind being showcased with its intended use you can be sure that it will be used for less conventional purposes later on. I love the clunky JCB look of this device – it’s clearly a no-nonsense piece of functional kit, built on a budget, hence the rather rudimentary push-button keypad for choosing tools. Ripley shows admirable user patience under pressure in choosing the blowtorch function when locking horns with a lethal monster. Expert users only.
Deckard is on the trail of rogue replicants. He analyses an electronic image with a scanner using voice commands. It’s a fascinating speculation about the future from 1982, when Blade Runner was made. And it makes for good cinema. Deckard isn’t just silently manipulating the image with a mouse or pointer – we watch his thought processes and his skill, the blind alleys he explores and backtracks from. But does it make UX sense? Well, as a hands-free interaction it allows the hero to drink his way through a bottle of the hard stuff. But manipulating rich visual data with voice commands? What are these dimension numbers he’s quoting – and without scale guides, how does he know which numbers to quote? Touchscreen or hand gestures would be the more efficient way to go – the task is spatial. It’s a bit like asking Siri to make delicate edits in Photoshop. There is no consistent or reliable way to describe what you want to achieve in such a visual application using natural language – the opportunities for the interface to misinterpret are too great. So you need pre-determined commands, which are inflexible and have to be memorised. It looks cool, but the drawbacks of voice control are laid bare. Voice-activated interfaces are often tipped to be the future, but I’m not convinced.
The President has two identical, unlabelled, enormous red buttons positioned next to each other in his emergency command-room bunker – one to launch all nuclear missiles, one to make a refreshing latte. A classic case of poor usability! The calls to action are easily noticed but indistinguishable from each other – not dissimilar to placing OK and Cancel buttons next to each other without using visual design to distinguish them. So, maybe the consequences of choosing the wrong button aren’t quite as catastrophic as in Monsters vs. Aliens, but there are times when, as a user, I’ve lost my precious work to such design shortcomings and felt like unleashing all-out nuclear war…
Shaw has a reasonably pressing user need – to remove the alien growing inside her. Luckily she has access to a state-of-the-art self-surgery machine. She frames the solution to her problem as a caesarean section. It makes sense – she’s been contaminated through sexual intercourse, and her original diagnosis was pregnancy, but the birth isn’t likely to be a beautiful experience. Unluckily, the machine is configured only for men. Watching Shaw wrestle with the interface to find an alternative solution that the machine is configured for is one of the most gripping scenes in the film. How many times have we squared up to an interface that won’t do what we want, but because the need is great we persist, looking for more and more desperate ways to hit success? Shaw rephrases the problem – she makes it gender-non-specific. Sci-fi is littered with fantastically annoying, rigid and over-literal interfaces – they serve to highlight the difference between humans and so-called smart machines.
Fancy massive-scale computer screens or holographic interfaces that showcase the visual design credentials of special effects companies are ten-a-penny. Given that their true function in a film is to fill in potentially tedious back-story, they can grate on a UX practitioner. Why the storm of digits when all the machine is doing is some internal processing that wouldn’t be comprehensible to a person anyway? It’s like throwing half the film’s budget at making the equivalent of a revolving hourglass look pretty. The holographic console of the Engineer ship in Prometheus is a refreshing exception – a truly stunning 3D map of galaxies that resolves itself to planet Earth. It’s a chilling moment – the Engineers have programmed our home as their destination for biological weapons of mass destruction. The bewitching mass of swirling data helps communicate how alien the Engineers are – for them this complex interface is actually user-friendly! They must be a different order of smart.
HAL is the original malfunctioning AI in this influential sci-fi vision. As the all-powerful master of the US spaceship Discovery One, HAL illustrates the ultimate risk of abdicating our responsibility for control to machines. HAL calculates that the human crew are jeopardising the ship’s mission and since it considers concerns for human safety as secondary to the mission objectives, the crew must go. Its devious intelligence in committing murder is made more disturbing by its perfectly rational voice and disembodied presence. You can’t see HAL – just its camera eye observing human behaviour dispassionately. When the last crew member, Bowman, duels HAL for supremacy of the ship, there is no easy way to power HAL down. Oh for CTRL-ALT-DEL or a power button or a plug in a socket! When machines become autonomous, smarter than us and embedded in every aspect of our networked, manufactured world, there will be no easy way to switch them off. Perhaps switching them off will be more dangerous than leaving them on? Who would fly the ship without HAL to run things? HAL’s slow demise to kindergarten rhymes and machine death as Bowman grimly pulls out every one of its processors is compelling. In fact, when I force my flaky Windows PC to shut down after all the applications have frozen, I too want the satisfaction of hearing it sing Daisy in a warped voice.
It’s the Holy Grail of the retail world – to deliver location-specific personalised advertising content. So the technology has now duly arrived to push notifications to customers as they pass by physical stores, but witness the nightmare conclusion of this in Minority Report and you’ll wish to return to pre-industrial barter economy days. Anderton has a black market eye transplant to evade identification on the run. But biometric devices pick up his stolen identity and push cheesy content meant for his dead donor. There are no barriers between user and content in this vision – no simple way to opt out, change settings or even look away. The personalised holographic hoardings jump out of nowhere – the equivalent of insistent visual spam that lurks in waiting for us wherever we turn. The more we trade privacy for convenience or cost, the closer we edge to this reality. Google Glass is just around the corner. Perhaps in the same way that we opt for free access to Spotify content in return for being fed adverts, we will trade the control of what we see for access to augmented vision or customer discount.
Casting the world through a user experience lens is the curse of the UX professional! So, now you know that we never switch off – even when we’re at the movies. And beware: if you sign up for an HCI qualification, be prepared for your tolerance of badly designed interfaces to plummet.