Posts tagged ‘Mobile Design’

Yahoo Weather App for iOS is ‘good design’


(Video: Yahoo Weather App on Vimeo. Music is LANDRA’S DREAM by Jason Shaw.)

I’ve been looking at software design through new glasses – ones with a much better prescription. Put another way, I am examining software with a much more critical eye, looking at both the function and the form.

While I agree that form without function is frustrating and wasteful, there is no excuse for function without form either. Even more to the point, good form can be the tipping point when competing apps offer similar function.

Case in point: the Yahoo Weather App for iOS. I’ve used the paid WeatherBug app for a couple of years, but with recent iterative changes it is becoming more and more tedious to use. There are too many taps and gestures to get to the relevant information.

Yahoo made an entirely new weather app, and my guess is that they hired top-notch designers and artists alongside unbridled developers to create their trifecta – a comprehensive set of functions with truly beautiful execution.

In the video you will notice subtle details such as the spinning sun graphic used as the refresh indicator. The background images are pulled from Flickr and are relevant to the location, its weather, and its local time. The background blurs when the screen is scrolled vertically to view details of the weather. The wind is elegantly rendered with both the wind speed and direction as well as animated windmills. Even the time of day is animated with a rising sun.
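That blur-on-scroll effect is simple to approximate. Here is a minimal sketch of one way to do it in Swift – cross-fading a pre-blurred copy of the photo as the user scrolls. The view names and the 200-point fade distance are my own illustrative assumptions, not details of Yahoo’s implementation.

```swift
import UIKit

final class WeatherViewController: UIViewController, UIScrollViewDelegate {
    // Two stacked image views showing the same Flickr photo:
    // `sharpView` is the original; `blurredView` is a pre-blurred copy
    // (e.g. rendered once ahead of time with a CIGaussianBlur filter).
    let sharpView = UIImageView()
    let blurredView = UIImageView()

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        // Fade the blurred copy in over the first 200 points of scrolling.
        let progress = min(max(scrollView.contentOffset.y / 200, 0), 1)
        blurredView.alpha = progress   // 0 = sharp photo, 1 = fully blurred
    }
}
```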

From anywhere in the app, horizontal scrolling takes you to successive locations while maintaining the context. Scrolling left while looking at precipitation for one location shows you precipitation for the next location.
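That behavior maps naturally onto a paging scroll view. The sketch below shows the basic setup, assuming a `locations` array of the user’s saved places; carrying the vertical scroll position over to the newly revealed page is what preserves the context.

```swift
import UIKit

// Minimal paging setup: one full-screen page per saved location.
// A swipe left or right snaps to the next or previous location.
func makeLocationPager(locations: [String], in view: UIView) -> UIScrollView {
    let pager = UIScrollView(frame: view.bounds)
    pager.isPagingEnabled = true   // snap exactly one location per swipe
    pager.contentSize = CGSize(width: view.bounds.width * CGFloat(locations.count),
                               height: view.bounds.height)
    return pager
}
```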

What strikes me about the Yahoo Weather App is that, as beautiful as it is to look at, it is completely intuitive to use. The interface never gets in the way, and more important, it “does what you expect”.

User Experience – the challenge to satisfy left- and right-handed consumers

It’s hard to define “user experience”, but they say “you know it when you see it”. I say, “you know it when you feel it”. User experience is more than visual. Perhaps the best urban description is to say it is visceral.

Recently, I was reviewing ideas with a group of user interface designers. We were discussing software applications for smartphones.

While the design patterns being discussed were important, what was missing was attention to the person who would eventually hold a device in their hand. All of us in the meeting had a smartphone, and yet there was very little thought given to “would I like to use this thing we are designing?”

Precision vs non-precision gestures

Mobile devices support a number of different inputs. While shaking, bumping, and rotating the phone can be used, the majority of input comes from tapping and swiping. Tapping is typically a precision gesture – tapping a button, selecting from a list, typing on a virtual keyboard. The user must hit a specific target for a specific amount of time and not move while doing it. Contrast this with a swiping gesture, where the user can typically start anywhere in a large part of the screen and only needs to slide a finger (or thumb) in the general desired direction. Thus, swiping is a non-precision gesture – scrolling a list, sliding between two screens, opening a menu at the side or bottom, exposing a drop-down from the top (for example, the iPhone notifications page).

When given the choice of implementing a precision vs non-precision gesture, it is “kinder to the user” to go with the non-precision choice.
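In UIKit terms, the contrast is easy to see. Below is a minimal sketch (the handler names are illustrative): the tap is bound to one small, button-sized target, while the swipe is attached to the whole screen and only its general direction matters.

```swift
import UIKit

final class GestureDemoViewController: UIViewController {
    let detailButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(detailButton)

        // Precision gesture: the tap must land on this one specific target.
        let tap = UITapGestureRecognizer(target: self, action: #selector(openDetails))
        detailButton.addGestureRecognizer(tap)

        // Non-precision gesture: the swipe may begin anywhere on the screen.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(showNextLocation))
        swipe.direction = .left
        view.addGestureRecognizer(swipe)
    }

    @objc func openDetails() { /* show the detail view */ }
    @objc func showNextLocation() { /* page to the next location */ }
}
```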

Left- vs right-handed users

A smartphone is not a large device – even the comparatively huge 5″ devices. Most smartphones are designed to be usable with one hand. The user is cradling the phone and only has their thumb for interaction. The thumb naturally follows an arc from the upper corner closest to the hand to the lower corner farthest from the hand. The challenge is that these locations depend on which hand holds the phone. In the right hand, the ease of access starts with the upper right, then lower left, then upper left, finishing with the lower right as the most difficult location to reach. A user holding a phone in the left hand would reverse these to upper left, lower right, upper right, and then lower left.
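One way to make this concrete is to encode the ordering. The sketch below is just the prose above turned into a lookup (1 = easiest corner to reach, 4 = hardest); the type and function names are illustrative.

```swift
// Reach difficulty of each screen corner for a one-handed grip,
// derived directly from the ordering described in the text.
enum Corner { case upperLeft, upperRight, lowerLeft, lowerRight }
enum Hand { case left, right }

func reachRank(of corner: Corner, holdingWith hand: Hand) -> Int {
    switch (hand, corner) {
    case (.right, .upperRight), (.left, .upperLeft):  return 1
    case (.right, .lowerLeft),  (.left, .lowerRight): return 2
    case (.right, .upperLeft),  (.left, .upperRight): return 3
    case (.right, .lowerRight), (.left, .lowerLeft):  return 4
    }
}

// Example: reachRank(of: .lowerRight, holdingWith: .right) == 4
```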

So where do you put input controls when you have both left- and right-handed users?

Part of the answer flows back to precision vs non-precision input. The more non-precision input you use, the less challenging the app is for left- and right-handed users alike. Next, the comfort of top buttons exceeds that of bottom buttons.

Conclusion

There is no perfect answer. However, by thinking about “how” a user will interact with a mobile application (in addition to why and what for), you will achieve a better user experience and a happier user.


Mobile Design is about to change again – get used to it

I recently read a New York Times article on police using a smartphone app to get information “on the street”. What I took from the article is that law enforcement officers (LEOs) would love fast, simple access to large amounts of data to get the snippet that is of value “right there; right then”.

One aspect of Mobile Computing that is “the game changer” is what I refer to as “just-in-time information”. Ten years ago it was a concept that was not possible. Now it is. The key to just-in-time information is that the device already knows some of the parameters of the search – the “when” and “where”. The user specifies only the “what”.
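As a sketch, the division of labor looks something like this; `JustInTimeQuery` and `makeQuery` are illustrative names, not a real API.

```swift
import CoreLocation

// The device supplies the "when" and "where"; the user supplies the "what".
struct JustInTimeQuery {
    let what: String                    // user-supplied search term
    let when: Date                      // device-supplied time
    let place: CLLocationCoordinate2D   // device-supplied position
}

func makeQuery(term: String, from location: CLLocation) -> JustInTimeQuery {
    JustInTimeQuery(what: term, when: Date(), place: location.coordinate)
}
```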

While this model is too simple to cover all information requests, it does simplify most searches. Another capability mobile designers have been slow to incorporate is voice.

Voice dictation/input is what I call a “first order” voice capability. We need to be thinking beyond that. For iOS users, this would be Siri-style integration. There are obvious security issues (we can’t give Apple access to all the necessary data), but it is a concept to be thinking about. All of this will be converging – and soon.

Two technologies are going to stretch our current thinking of Mobile First – smart watches and smart glasses. These will exploit voice, GPS, and constant display I/O to a much greater extent than anything we are targeting today. They will emphasize “just-in-time” information.

What is important is to be thinking about “what’s next”. When I lead discussions on designing for mobile, I frequently remind the audience that they cannot think of their designs the way they did for the desktop. Many of the current mobile design patterns didn’t exist even 12 months ago. Change is happening much more quickly in mobile design than it ever did in physical design or previous generations of computer design.

Assume this to continue … at least until the next big change :-)