I’m sure at least a few of us can recall using a Palm Pilot or an Apple Newton. Their monochrome displays and touch recognition were new, even groundbreaking, at the time. Since then, those PDAs, as they became known, have slowly matured into what we now call smart phones — all-encompassing mobile computers that carry access to virtually all of our personal data. Hence the name, personal digital assistant.
Still, easily the most important innovation that PDAs brought to the market was the touchscreen. Originally, touchscreens could recognize only a single point of input, and they often required a stylus for writing or drawing.
That’s changed almost entirely. Today’s touchscreens typically recognize input from multiple points (fingers) at the same time. Essentially, they let us interact with data and applications in a way that is both natural and easy for us — with our hands. And there are a few lessons that touchscreens have, especially recently, begun teaching almost all forms of mainstream computing.
1. Touch Should Be An Inherent Part of Mobile Operating Systems, Not a Feature
The iPad has undoubtedly had a huge impact on the way we interact with our devices. Its simplicity has spawned dozens of imitators. Right now, it’s the gold standard of tablets. Companies such as LG, Motorola, HP and others have constantly striven to create the perfect tablet to compete with the iPad, and all of them have come up short. Amazon, meanwhile, is planning to release its pseudo-competitor, the Kindle Fire, soon.
But the most significant thing about Apple’s success is that it has shown how important it is to build an operating system from the ground up with touch in mind. Microsoft’s desktop operating system, Windows 7, although functional on tablet computers, was often cumbersome and strange to manipulate. Even Android, an operating system built for smart phones, had its issues when it was stretched to fit tablets significantly larger than the phones it was initially made for.
Apple has even begun to take a page out of its own book, bringing much of the interactivity of iOS into the newest version of its OS X operating system. There, users can swipe through full-screen apps with a three-finger swipe, or get a bird’s-eye view of all of their running applications by swiping up on the trackpad with three fingers.
2. Creating Intuitive User Interfaces Has Become The Central Focus
I’ve lauded the iPad time and time again. For me, it’s almost become an interactive piece of art: something that serves a wide range of functions, from watching TV to playing Angry Birds, but that also has an aesthetic appeal I can simply admire. And that appeal isn’t just about how the iPad looks from the outside. On the software side of things, it’s also extremely pleasing to the eyes, and even more pleasing to use. Switching between apps is as easy as double-clicking the home button and tapping another app. Browsing articles is as easy as flipping pages. Listening to music can be as simple as two taps and a swipe.
Yet even though desktop applications and software are created with a mouse and keyboard in mind, they too have begun to succumb to the importance of working well for touch-based users.
3. Accessibility Is Key
I remember one time when I brought an iPad to my friend’s house. His family was there, of course, but sitting on the couch was his short, wispy-haired and nearly entirely deaf grandmother, fiddling away with the TV remote. I sat down next to her, and we talked for a bit. At one point she asked me, “What’s that in your arms?” I showed it to her. Now, mind you, his grandmother is in her mid-80s and has rarely even seen a computer, let alone used one. Needless to say, within seconds she was fiddling with apps, scrolling through lists in iTunes and having what seemed like a fun time. Sure, she didn’t understand how to manipulate pictures or visit particular sites, but the operating system gave her a level of accessibility far more attainable than desktop operating system alternatives. By interacting with something she was already so familiar with — her own hands — she was able to find her way around the simple interface in no time.
Most importantly, what all of this has taught us is just how important our hands are when it comes to experiencing things. The experience becomes far more intimate when our hands are involved, simply because we know our hands so well.
And when user interfaces are built around our hands, that experience becomes even more intuitive. Magical, even.