Can you imagine a teapot with the handle and the spout on the same side? As cognitive scientist and usability engineer Donald Norman writes in his best-selling book The Design of Everyday Things, such a teapot can, at best, brew tea – but not serve it. In other words, it's not usable. In a technology-driven environment, life is more complicated than teapots. While tactile and responsive web design may be the current trend, many innovations are just around the corner, making user-centric design more important than ever.
E-commerce has revolutionized the sales cycle, and more businesses are selling online. Landing pages are replacing across-the-counter sales representatives, while checkout and payment pages are playing the role of billing cashiers. In a brick-and-mortar model, you could train your staff to be polite and to sell customers more items. At your online store, you turn to design to do that job: usable design, based on principles of human-computer interaction (HCI).
Over the years, user interfaces have evolved steadily, perhaps stealthily – from punch cards and keyboards to mice, touch pads, icons, and kiosks. Newer HCI technologies such as voice recognition and eyeball-tracking are gaining momentum. Usability experts – especially those who've heard about the Florida ballot – also focus on the importance of brain-computer interaction (BCI).
The Florida ballot story is a classic example of BCI (or failed HCI!). It is proof that users don't always read instructions – so interfaces need to be intuitive. Someone lost an American presidential election because voters were too lazy to read! And this happened not because the election process wasn't validated or assured. The missing piece was a simple usability or user test. A minor usability flaw on the ballot card led to the disqualification of 29,000 votes. Here's what actually happened: voters were given a two-column ballot, with arrows pointing to holes that had to be punched to cast a vote. Al Gore was the second name in the left column, after George Bush. The first name in the right column was Pat Buchanan of the Reform Party. However, in the center of the ballot, the arrow for Al Gore corresponded to the third hole, while the arrow for Pat Buchanan led to the second hole. Small, short arrows made the visual correlation even weaker and harder to follow. The result: voters who wanted to vote for Al Gore punched the second hole, and ended up voting for Pat Buchanan. The designers thought they had it right: they had drawn the arrows and given instructions – punch the hole next to the arrow next to the number next to the candidate you wish to vote for. But users didn't read the complex instructions; they went with their gut feel and intuition. Psychologists even have a term – familiarity blindness – for people's tendency to overlook the obvious. You can judge the peculiar layout of the ballot for yourself here.
Coming back to familiar territory in the software development environment, research shows that when confronted with bad interfaces, users slow down, which hurts productivity. The cost of that lost productivity far exceeds the cost of a user test. The pressure of software release cycles is well known to all of us. We have seen support teams being ramped up before every release to handle user calls and support tickets. User-centric, intuitive design can eliminate this onslaught of calls from frustrated users and reduce customer support costs.
The same principle holds true for quality assurance and testing interfaces as well. As automation and tool-driven testing take center stage, usability assumes greater importance. So how do we build usability into the assurance process?
In the assurance space, traditional user-interface and usability tests – covering screen design, aesthetics, and transactional design – have been around for a while now. UX is the term we use for the notion of designing for usability, or ease of use. However, there are many facets to assuring elegant user interfaces. One such aspect is 'information usability'. For example, most users view advertisements as disruptions and find them annoying. But leading search engines have reversed this perception with information usability: they deliver timely, relevant, and meaningful advertisements on the search results page (and earn revenue as a result!). Users actually appreciate the sponsored links and view them as high-value, relevant content. Seen through the usability lens, this offers an important lesson on blending business expectations with user experience.
Traditional testing also had stubs and drivers to simulate the behavior of prerequisite software components that the module under test depends on. These enabled testing early in the cycle, free of dependencies. But now you also need to address the virtualization challenge. With mobile and cloud applications making calls to web services, you need to develop the front end early in the process, in isolation, without access to the back end. For complex systems, such virtualized environments are difficult to create and manage.
As I discussed in my earlier blog, 'U and I in Assurance', assurance for usability and superior consumer experience is fast moving from simplistic visual validation and verification to complex, sophisticated engineering for human- and brain-computer interaction. Lines are blurring in this high-velocity world of high-frequency, multi-architecture, disaggregated transactions. Interfaces need to be intuitive, elegant, sound and, most importantly, simple. Looking ahead, HCI and assurance present many learning opportunities, besides being an investment that offers fast and significant returns.
Ever-evolving market conditions and a changing technology landscape present intriguing challenges for quality assurance. The new assurance era demands higher standards of precision engineering, instrumentation, and sophistication for HCI assurance. As an assurance professional, you need to understand the requirements, use cases, design, technology, and happy and sad paths of applications built on varied technologies, as well as the variety and volume of test strategies, plans, scenarios, data, and environments.