Remote User Testing: Strengths and Weaknesses


Remote user testing has a reputation for being a quick-and-dirty evaluation method, but like other methods it has strengths, weaknesses, and best use cases. Remote user testing can be divided into two types: moderated and unmoderated. In unmoderated remote user testing, the evaluator and user are not in the same place, and the user evaluates the interface independently. In moderated remote user testing, the evaluator and user are also in different locations, but the evaluator shares the “same ‘virtual’ space” as the user – whether via screen sharing, chat, or some other channel (Schade).

 

Strengths

Remote user testing can be significantly cheaper than face-to-face user testing. There are still costs associated with purchasing software or incentivizing users, but even these can be mitigated with free tools and recruiting services.

With remote user testing, it is easier to recruit either a more diverse or a more specific pool of users. User testing sites often offer options to customize the demographics of the users sampled, and the internet’s connected nature widens the pool itself: it is far easier to recruit users in India over the internet than to fly to India and recruit them face-to-face.

Evaluation can yield both qualitative and quantitative data. With technology that analyzes a user’s computer data and physiological responses, a variety of quantitative data can be obtained through remote user testing. Evaluators can also collect qualitative data, through their own observations in moderated testing and through self-reporting from users. Remote user testing does not yield only one type of data.

Results can be obtained quickly. As mentioned, recruiting is easier, and because all of the data is collected digitally, analysis can be sped up through automation. Qualitative data may still require a human evaluator, but a computer that already holds the raw data can produce initial results almost immediately.

Users are captured in their own use contexts. As user testing has evolved, the importance of use context has become more and more apparent. With remote user testing, the user can test an interface within their normal environment: on their own machine, among their other open programs, and in their own space (work, home, or wherever they would normally use the interface).

 

Weaknesses

With unmoderated remote user testing, there is no support or help for the user if they get stuck. With no evaluator present – virtually or otherwise – the user may go down the wrong path and the data collected will not be useful to the evaluator.

Most remote user tests are kept short, which means the interface to be evaluated, or the section of it, needs to be small. Evaluating a full website is most likely not feasible with remote user testing.

Security of information shared over the internet can be a concern in remote user testing. Evaluators have less control over the security of the information about the interface they are sharing with the users when they are not in the same room.

Some remote user testing methods require special equipment or software that a user may not have. For example, not all PC laptops have webcams installed, which prevents those users from participating in moderated tests with video. This can skew or limit results.

Because remote user testing relies on technology, from an internet connection to specialized software, the possibility of technical problems increases. With face-to-face user testing, even if one technological aspect of the test fails, for instance, the webcam, the evaluator can still record observations with a pencil and paper.

As with other data-driven design decisions, quantitative data obtained from unmoderated testing does not explain why participants took a certain action, and with no evaluator present or in contact with the user, it is not possible to ask follow-up questions (Barnum, 2011). However, this weakness can double as a strength: the sheer volume of quantitative data that unmoderated testing produces can impress managers who are bent on making design decisions with data.

 

Remote user testing is not inherently better or worse than face-to-face user tests, and the evaluator’s discretion should still be used in selecting a method of testing. We have a habit of becoming enamored of the newest technology, and while remote user testing is the best option in many cases, it still does not completely replace or replicate face-to-face user testing.

 

Referenced Sources

Barnum, C. M. (2011). Usability testing essentials: Ready, set…test! Burlington, MA: Morgan Kaufmann.

Bolt, N. (2010). “Pros and Cons of Remote Usability Testing.” Johnny Holland.

Keep It Usable. “What’s the real difference? Face-to-face versus Remote user testing.”

Schade, A. (2013). “Remote Usability Tests: Moderated and Unmoderated.” Nielsen Norman Group.

We Are Personal. “Remote Vs. Local User Testing.”

 

Sources to Help Select a Tool for Remote User Testing

Nielsen Norman Group. (2014). “Selecting an Online Tool for Unmoderated Remote User Testing.”
Bolt, N. (2010). “Quick and Dirty Remote User Testing.” A List Apart.

Design Critique: iMovie


iMovie is a program designed by Apple, a company known for its design aesthetic, though less so for the usability of its products. iMovie attempts to make creating films from movie clips easy, but several problems with the interface make executing those tasks difficult. The most pressing design problems I have found involve saving work and navigating the left menu.

 

Design Problem #1: Saving Work


To save a project created in iMovie, the user would, drawing on knowledge acquired through repeated use of computer programs, go to File in the menu bar, then Save or Save As. It has become an accepted norm in computer culture to find Save under File. But iMovie, with its poor visibility and broken logical constraints, has no Save option in the File menu. Don Norman’s principle of visibility, “making what needs to be done obvious,” fails here: it is not apparent what needs to be done to save the project, because no Save option exists. The correct action to save a project in iMovie is Share. This is a problem in Norman’s Gulf of Execution: the answer to the question “Does the system provide actions that correspond to the intentions of the person?” is no, and the user cannot execute their intention to save the project.

 

Design Solution #1: Provide a Save option.

Adding a Save option to the File menu would best solve the design problem.

Other Apple programs (Keynote and Pages, for example) autosave changes and provide a Save option in the File menu.

 

 

Design Problem #2: Left Menu

In iMovie, an Event is a group of clips imported or recorded on the same day at the same time. A new Event is automatically created each day, or can be created by the user. In order to view all Events in the main pane, a user can either choose All Events or iMovie Library. In Norman’s words, this does not “provide a good conceptual model for the user” because it does not present the options with consistency. The other options in the menu display different groups of clips, but here two differently named options display the same group of clips.

The iMovie Library option, which also reveals a drop-down list of all Events, is redundant and does not properly constrain the user. There is more than one way to access all of the Events; there is no natural or logical constraint. Without constraints, a user can be confused when clicking either iMovie Library or All Events yields the same result, while each of the other menu options yields a unique one.

“The difficulty of dealing with novel situations is directly related to the number of possibilities,” Norman says.

 

Design Solution #2: Remove iMovie Library option from left menu.

Removing the iMovie Library option and adding its drop-down functionality to the All Events option would solve the design problem by constraining the user to a single way of displaying all of the Events. It would make clear that each option on the menu yields a unique result.

 

All quotes drawn from Norman, D. (2002). The Design of Everyday Things. New York: Basic Books.

Innovative Interactions: Vickie Culbertson @ Fuzz

Vickie Culbertson

I was connected to Vickie Culbertson, a UX Designer at Fuzz Productions, through a friend who was recently hired at Fuzz after completing General Assembly’s User Experience Design Immersive. Vickie describes herself as “A Baltimore native living and working in New York City. I love seeing live music and going to art exhibits. You can also find me in the water with my longboard, Tomatillo. I guarantee I will laugh within the first 30 minutes of meeting you.” Vickie also completed a General Assembly course in UX Design (part-time) and has a visual design background. I asked her big-picture questions about working at Fuzz and about life as a UX designer.

Fuzz Productions is located in Williamsburg, Brooklyn and specializes in mobile apps – mCommerce, publishing, and enterprise particularly. Notable clients include New York Post, Wegman’s, Capital One, Xbox, and L’Oreal.

 

Marlee: How did you become interested in UX?

Vickie: I was looking for a career change and re-discovered UX as a combination of art and science, which I was missing at my old job as an in-house print advertising art director at a luxury beauty company.

 

Do you have training in UX design?

I took the General Assembly part time UXD course and have a BFA in Graphic Design and Painting from MICA.

 

How long have you been working at Fuzz?

I have been at Fuzz since December…a little over 6 months.

 

What does your typical day look like at Fuzz?

In the morning, I walk over the Williamsburg Bridge to work and think about my plan for the day, while listening to podcasts. When I get to the office, I drink some water, check emails and go to stand-ups. Then, I either start with research, user flows, and wireframes; or go to meetings to discuss project statuses and get together a game plan with my team. Some days we present to clients or run client meetings with features ratings, etc. We eat lunch together, as a company, so that midday break is important for talking with people outside of your project teams. My afternoon mostly consists of wireframing and meetings. Thrown into the mix is talking with other UX Designers about a screen that might be troubling and how to approach it. We’re always bouncing ideas off of each other.

 

How do you define success on a project? When are you “finished” with a project?

Success is doing the best I could and coming up with innovative interactions. I am “finished” when I hand wires over to the developers. However, I am still involved with the project while the UI and Development continues and I start on a new project. At an agency, my role is considered complete when the wires are approved by the client. I personally like to continue being involved as long as I can. In general, success to me could mean that a project failed or was loved by the client. What matters most is doing the work and discovering the best solution for your end user.

 

What other types of positions do you work with? What does a project team look like?

I work with Producers (1-3 on a given project), UI Designers (1-2 at most), and Developers (iOS, Web, and Android; numbers vary per project). We all come together to create the best product for client wants and user needs.

 

What is the most rewarding part of your job?

When I see delight on people’s faces as they interact with an app I have designed.

 

Are there problems or UX trends you see popping up in your day-to-day work?

As always, the great debate of a hamburger vs menu. 

I’m also interested in creating interactions that aren’t super standard for each platform. Getting boxed in makes me want to explore how to break the rules within the system. 

Another trend I see is hiring managers not really knowing what UX is and asking for a developer or visual designer and UX designer all in one. The unicorn can exist, but it’s a challenge for the designer. It’s something we should address as an industry to let smaller companies know that while there are overlaps, those overlaps don’t necessarily define the role of what a UX Designer does.

 

You can follow Vickie on Twitter or visit her website; I’d especially recommend following her on Twitter.

Back Pocket Apps

One morning last week my phone was dead, so on my commute to work I looked around at the other people on the bus: everyone else was buried in their cell phones. Every now and then an idealist pops up, haranguing against this and pining for the good old days when people actually talked face to face and cared about each other. But that is idealism, because our world – business, social interactions, everything – now takes place with our phones (and tablets, wearables, laptops) in front of us. Golden Krishna isn’t idealistic about getting back to face-to-face interactions, because he has a solution for changing the paradigm of how the world works with technology: back pocket apps.


Back pocket apps, as detailed in Krishna’s new book The Best Interface Is No Interface and a UX Booth article Craig tweeted, are apps that work best while in the user’s pocket. In other words, they do not require the user to pull out their phone and interact with the app on screen. Krishna cites Lockitron as an example of what a back pocket app – and rethinking our current apps – can do.

 

Lockitron’s first iteration was designed to do away with remembering your keys and make it easier for users to lock and unlock their doors. But it required users to pull out their phones, unlock their phones, find the Lockitron app, open the app, and tap to unlock their door. The 12-step process Krishna diagrams is somewhat startling because it is not exclusive to Lockitron. It’s not a design flaw in the Lockitron app – it’s how all of our apps work. And it does take work. The second iteration of Lockitron used Bluetooth to let the app – inside the phone, inside the user’s pocket – talk to the lock and open it as the user approached. No 12-step process; just keep walking (sound like automatic doors to anyone?).

[Diagrams: Lockitron’s 12-step process vs. the 2-step process]
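The proximity logic behind the second Lockitron iteration can be sketched in a few lines. This is a minimal illustration of the “back pocket” pattern, not Lockitron’s actual implementation; the function names and the signal-strength threshold are assumptions chosen for the example. The idea is simply that the app watches Bluetooth signal strength (RSSI, which grows toward 0 as the phone nears the lock) and fires the unlock once, with no screen interaction at all:

```python
# Hypothetical sketch of a "back pocket" proximity unlock.
# The threshold and function names are illustrative assumptions,
# not Lockitron's real API.

UNLOCK_THRESHOLD_DBM = -60  # closer to 0 dBm means nearer the lock


def should_unlock(rssi_dbm: float, already_unlocked: bool) -> bool:
    """Unlock exactly once, when the phone first comes within range."""
    return rssi_dbm >= UNLOCK_THRESHOLD_DBM and not already_unlocked


# Simulated walk toward the door: the signal strengthens as the user approaches.
readings = [-90, -80, -70, -60, -55]
unlocked = False
for rssi in readings:
    if should_unlock(rssi, unlocked):
        unlocked = True  # the phone never leaves the pocket

print(unlocked)  # True
```

The whole 12-step on-screen flow collapses into one background check per signal reading, which is the point Krishna is making.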

The Lockitron example, as a pioneer of the back pocket approach, does pose problems, mostly around security. We put passcodes on our phones to keep others out, and if that passcode also guards our front door, a stolen phone is less likely to let a thief into our home. There is a convenience problem, too: what if you come home and your phone is out of battery? Still, the idea of interacting less with apps and letting our technology actually make our lives easier and more seamless without taking all of our attention is appealing. Krishna also notes that “thinking beyond screens isn’t applicable to every type of problem,” which is an important concession. He isn’t an idealist railing against everyone being absorbed in their phones; rather, he is proposing ways for technology to do what it can do: make our lives easier, seamlessly.


Applying back pocket apps to the museum context – the context we are working within in this class – could be useful. Combining aspects of the Pen from Cooper Hewitt and the interactive app Artsy used at the Armory Show, a back pocket app could track where users are in a museum and where they spend the most time (i.e., in front of which painting), sending data both to the museum (hey, this painting is really popular) and to the user (oh yeah, that was the painting I really liked and couldn’t remember the name of). A back pocket audio tour might be more intrusive: imagine walking through a gallery when the phone in your pocket suddenly starts talking really loudly about Monet – embarrassing. Then again, we all have our headphones in at all times anyway, and automating an audio tour could streamline the current process: walking up to a work of art, pulling out your phone, unlocking your phone, finding the audio tour website, waiting for it to load (the wifi is slow), signing in to the wifi, being redirected back to the audio tour website, scrolling through to find the painting you’re standing in front of, clicking play, waiting for it to buffer, then finally getting a one-minute blurb about the painting. This is what back pocket apps are designed to alleviate, and in a museum setting – where curators (and some patrons) view the gallery as a sanctuary not to be intruded upon by technology – they would be both unobtrusive and useful.

Material Design

“Material design is a visual language that synthesizes the classic principles of good design with the innovation and possibility of technology and science.”


Google uses material design to guide the interactions and interfaces in its apps, mobile and web-based, and makes the standards available to other Android app developers. Material design was unveiled in summer 2014 along with Lollipop, Google’s new Android operating system. Since then, Google has rolled out redesigned apps – Gmail, Maps, and more – using material design principles. Google’s designers made physical prototypes of their icons before designing the digital counterparts, to capture correct, real-world dimensions.


 

There are three main principles to material design: material is the metaphor; bold, graphic, intentional; motion provides meaning. Material is so called because design and interaction within an app mimic the physical (material) world. Two windows within an app cannot occupy the same space: this is a law of physics. To express this, Google provides shadow specifications that separate levels, as shadows do in 3D space. Elements also do not pass through each other – “cards” in the Google Now interface behave as index cards would in the physical world. Motion between elements, bounded by real-world rules, provides meaning: a swipe in an app does what a hand swiping cards across a table would do in the physical world. Designers also aim for the user experience to be relevant to the individual and, a word that appears frequently in the documentation, delightful.
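The shadow rule above can be made concrete with a small sketch: each surface sits at an elevation, and its shadow grows with that elevation so layers read as stacked paper. This is an illustration of the principle only – the dp values and the shadow formula below are my own assumptions, not Google’s published specification:

```python
# Illustrative sketch of material design's elevation idea: higher surfaces
# cast larger, softer shadows. The elevation values and the shadow math
# are assumptions for demonstration, not Google's exact spec.

ELEVATIONS_DP = {"card": 2, "app_bar": 4, "fab": 6, "dialog": 24}


def shadow_css(elevation_dp: int) -> str:
    """Approximate a CSS box-shadow whose offset and blur scale with elevation."""
    offset = max(1, elevation_dp // 2)  # vertical offset grows with height
    blur = elevation_dp * 2             # blur grows faster, softening the edge
    return f"0 {offset}px {blur}px rgba(0,0,0,0.24)"


# Surfaces ordered from lowest to highest: each casts a progressively
# larger shadow, which is what separates the layers visually.
for name, dp in sorted(ELEVATIONS_DP.items(), key=lambda kv: kv[1]):
    print(name, shadow_css(dp))
```

The single rule (shadow scales with elevation) is what lets two surfaces never appear to occupy the same space, mirroring the physics metaphor.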


 

The use of graphics and color is also important in material design. Icons (“stickers” provided by Google) are meant to remain consistent across apps: a messaging icon in Google Now would be the same as the Google Hangouts icon, to make the connection clear. Color palettes are also specifically chosen, and generators exist online to aid designers. Colors should be limited and the choices bold. Google’s apps rely on large, color-blocked headers, with buttons in an accent color. The typography is consistent as well.


Material design may be a Google standard, but designers at other companies may resist. The documentation is extensive and prescriptive, and on Design Shack Carrie Cousins questions whether “experienced designers need this level of guidance.” She suggests playing with the concepts but not marrying oneself to them. While Google’s designers are required to marry into the family of material design, others still have a choice. Google would like to make material a standard for all Android apps, but its influence, though great, may not be that great yet.

I find material design clean and pleasing to look at, but I think it sells digital space short. Bounding digital space by the rules of physical space is unnecessary. Digital space exists apart from the laws of physics, and literally any experience can be created. Objects passing through one another is of course impossible in the physical world, but carrying that limit into the digital world is unnecessary – they can pass through each other with a few lines of code! That said, I believe using physical rules can increase usability, since these are laws we already understand. Someone unfamiliar with a digital environment may find an interface that behaves the way they expect their Moleskine day planner and pencil to behave easier to adapt to. This raises the question of who the user is. Are designers creating an app for a user group that is more comfortable with physical tools? Material design could be a selling point for that group. Is the app going to be used by digital natives who have never owned a physical address book? Will the concepts of material design be lost on those users, creating a clunky experience?