Issues Evident in Contemporary Internet & Social Media Research

This blog is part of the series Usability in the Real World: Ethics in Usability Research

Abstract: While there are many arguments for a ‘unified code’, the information science and technology industry has few centralized ethics codes. Two significant explanations are commonly offered for this absence. First, the internet itself is not regarded by the community as a moral agent. Second, information professionals already follow many principles that are perceived to outline moral behavior. Still, these hardened views on ethical responsibility, which amount to little more than an outlook, are not enough to negate technology’s impact on ethical systems. Even where jurisdiction seems to falter in the face of the perpetual, rapid advancement and distribution of technology, there are still real legal and ethical implications (Koehler and Pemberton, 2000). If the internet itself is not representative of moral principles, then why should we, as information professionals, be concerned with identifying and sharing codes of ethics (Kelsen, 1990)? This question applies across all categories and facets of the information science and technology industries. More specifically, what does a code of ethics mean for user data and its analysis by researchers? The small collection of contemporary cases below is indicative of the ethical challenges currently confronting internet and social media users and researchers.

Introduction

Foremost, the research that UX researchers usually conduct is deemed proprietary information and does not fall under the legal definition of research provided by the US Department of Health and Human Services. Why not? The results are not meant to contribute to UX as a generalizable body of knowledge. Moreover, the data collected is rarely tied to personal identification, which lends a sense of anonymity to human subjects (Bowman, 2014). Sounds chill, right? Think again!

The Current State of Ethical Responsibility in Usability Research

If you need proof of the wavering state of moral responsibility in usability research on social media channels, look no further than the widely cited “emotional contagion” study. The study, published in June 2014, revealed that “researchers from Facebook and Cornell University manipulated the news feed of nearly 700,000 Facebook users for a week in 2012 to gauge whether emotions spread on social media” (Albergotti and Dwoskin, 2014). One critique points out that the structure of the study broke ethical guidelines for informed consent, which is required when involving human subjects. Facebook may have hidden behind its Terms & Conditions, but, plainly put, users were never informed and never signed consent forms (Arthur, 2014). Facebook and Cornell both maintained that there was no direct access to users’ profiles, so no harm could come to users, an ethical perspective that many behavioral scientists subscribe to. However, anonymity does not guarantee that users are shielded from negative effects. In fact, the test was reported to cause emotional distress in some users.

To offer a contrasting example, Microsoft performed a similar test on Twitter users. By aggregating certain words in tweets from users who had not been informed, Microsoft identified tweets that exhibited patterns of depression. Although no one consented to having their tweets analyzed by Microsoft, the potential benefit of contacting people in need to offer medical support was judged to outweigh the negative consequences. In this case, Microsoft assessed the risks and decided that it was its moral responsibility to bypass consent and reach out to particular people.
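The Microsoft case is described here only at a high level, and its actual method and criteria are not given in this post. Purely as an illustration of what keyword aggregation over public tweets could look like, here is a minimal Python sketch; the keyword list, threshold, and data format are all hypothetical assumptions, not Microsoft’s approach.

```python
# Hypothetical sketch of keyword aggregation over public tweets.
# The keyword list, threshold, and data format are illustrative only.

DISTRESS_KEYWORDS = {"hopeless", "worthless", "exhausted", "alone", "can't sleep"}
FLAG_THRESHOLD = 3  # flag a user if this many keyword hits appear across their recent tweets


def count_keyword_hits(tweets):
    """Count how many distress keywords appear across a user's recent tweets."""
    hits = 0
    for tweet in tweets:
        text = tweet.lower()
        hits += sum(1 for word in DISTRESS_KEYWORDS if word in text)
    return hits


def flag_users(tweets_by_user):
    """Return the users whose aggregated keyword count crosses the threshold."""
    return [user for user, tweets in tweets_by_user.items()
            if count_keyword_hits(tweets) >= FLAG_THRESHOLD]


if __name__ == "__main__":
    sample = {
        "user_a": ["feeling hopeless and alone lately", "so exhausted", "can't sleep again"],
        "user_b": ["great day at the park", "new recipe turned out well"],
    }
    print(flag_users(sample))  # -> ['user_a']
```

Even this toy version makes the ethical tension obvious: the users being scored never opted in, which is exactly the trade-off the case above describes.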

Ethical Perspectives & How To Be More Responsible

The aforementioned cases provide clues on how to approach testing in a responsible and ethically sound manner. In Accepting the Challenges of Social Media Research, Weller (2015) collected the voices of many social media users and found little consensus on when users feel researchers are justified in collecting data. Most users do agree, however, that Terms of Service agreements are not an appropriate way for companies to gain consent to collect user data. So what should a researcher’s approach be?

First, “each Internet research project requires an individual assessment of its ethical issues and selection of the most appropriate methodological approach” (Golder, Ahmed, Norman, & Booth, 2017). Always inform participants that you require their consent before they can participate in your study. Keep all participants’ identities separate from the data and the analysis. Never expose your users to any type of harm or negative effects; if a potential for harm exists, then the societal benefits of the test must outweigh the risks (Bowman, 2014).
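To make the “keep identities separate from data and analysis” principle concrete, one common tactic is to pseudonymize participant records before analysis and store the key that links pseudonyms back to identities in a separate, access-controlled location. The Python sketch below is only one way to do this, under assumed field names (name, email, task_time_s, notes); it is an illustration, not a prescribed standard.

```python
# Minimal pseudonymization sketch: replace identifying fields with a keyed
# pseudonym before analysis, and keep the lookup table stored separately.
# Field names and the salt-handling scheme are illustrative assumptions.

import hashlib
import secrets


def pseudonymize(records, salt=None):
    """Return (analysis_records, identity_map).

    analysis_records carry only a pseudonym plus study data; identity_map
    links pseudonyms back to identities and should live in a separate,
    access-controlled store.
    """
    salt = salt or secrets.token_hex(16)
    analysis_records, identity_map = [], {}
    for record in records:
        raw = (record["email"] + salt).encode("utf-8")
        pseudonym = hashlib.sha256(raw).hexdigest()[:12]
        identity_map[pseudonym] = {"name": record["name"], "email": record["email"]}
        analysis_records.append({
            "participant": pseudonym,
            "task_time_s": record["task_time_s"],
            "notes": record["notes"],
        })
    return analysis_records, identity_map


if __name__ == "__main__":
    raw = [{"name": "Ada", "email": "ada@example.com",
            "task_time_s": 42, "notes": "hesitated on checkout"}]
    data, key = pseudonymize(raw)
    print(data)   # safe to share with analysts
    # `key` stays behind stricter access controls, separate from the analysis data
```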

Need more context on this critical juncture? Here’s a listicle from Forbes on how Facebook has failed many of us through its unfair, immoral behavior as user researchers: 10 Other Facebook Experiments On Users, Rated On A Highly-Scientific WTF Scale.

References

Albergotti, R., & Dwoskin, E. (2014). Facebook study sparks soul-searching and ethical questions. Wall Street Journal, 30.

Arthur, C. (2014, June). Facebook emotion study breached ethical guidelines, researchers say. Retrieved April 01, 2018, from https://www.theguardian.com/technology/2014/jun/30/facebook-emotion-study-breached-ethical-guidelines-researchers-say

Bowman, N. (2014, August). The ethics of UX research. UX Booth. Retrieved April 01, 2018, from http://www.uxbooth.com/articles/ethics-ux-research/

Golder, S., Ahmed, S., Norman, G., & Booth, A. (2017). Attitudes toward the ethics of research using social media: A systematic review. Journal of Medical Internet Research, 19(6).

Kelsen, H. (1990). General theory of norms.

Design Critique: Cash (iPhone–iOS application)

Cash application logo

Introduction

Square Cash, known simply as Cash on iPhone iOS, is a service that allows US debit card users to send and receive payments in US dollars. Users do not necessarily create an account with a username. Instead, new users are prompted to enter their existing debit card information and email address. Cash then pulls debits from and pushes credits to the bank account tied to that debit card. Users also have the option to personalize a Cash Card, which can be used like an ATM/debit card, and the app now supports buying and selling Bitcoin.

A Breakdown of Human Cognition and Emotion Present in the Interface and Experience of Cash

Cash successfully toggles between the visceral and behavioral levels of the user experience. Signifiers such as arrows and ‘x’ marks act as breadcrumbs that help you map your way. Equally, confirmation screens reassure users of their decisions. Even sensibility checks, in the form of pop-ups that obscure secure information, appear as you traverse the application and move between screens.

Load and home screen

A sea of green: The load screen with the logo is your initial signifier, then the home screen populates sans hamburger menu.

Although a clear map is not initially apparent, Cash gives feedback from the minute it loads the home screen. Users can immediately type numbers on a calculator-style keypad. Buttons sit in the footer and other screens are accessible at the top. That’s it. There is no menu tucked away in a hamburger icon. It’s as if there’s nothing to hide. The home screen lets users easily fill in the conceptual model of the software by applying their own mental model and map.

Cash is so cool, it even allows you to cash out your balance and deposit it into your bank account immediately! Amazing.

Enforcing Semantic Structure to Assist Behavioral Actions

In moving from the visceral to the behavioral level, the software provides a seamless transition by applying a cultural constraint and then a semantic constraint. Visually, there are no cues of this transition, unless you count the brief lag before the calculator screen populates. Users can jump immediately from their perception of what they are about to do to specifying and interpreting those actions as a real-life transaction.

Diminishing Learned and Taught Helplessness

Unlike applications such as Venmo, Cash does not require you to create a profile attached to a username. There is no news or social feed where friends gather to divulge their activity.

It’s your profile; you can do what you want with it. Nobody cares, because no one is looking at it.

The absence of a social media feed can alleviate the anxiety that some people face in sharing their information online. Going without a social feed, in this instance, does not render the application less relevant. While a social feed can provide positive psychological feedback for some users, bringing them back to the app more often than others, it can also be a disadvantage for another set of users. In fact, going without a social feed that publishes your activity and money exchanges can help diminish the phenomenon of learned helplessness. Users don’t have to worry about publicly announcing their monetary exchanges. Although the product already involves a form of mathematics, a phobia for some, the task at hand is clear, and users do not have to spend time and attention on other aspects that exacerbate anxieties about using online software and sharing information publicly.

I believe this particularly important decision is an exercise of a favorable cultural constraint to impose on users. Specifically, as a US consumer, I am not too keen on publicly sharing my purchases and other transactions that have a price tag assigned to them. Intuitively, I am able to use the app for two sensitive tasks: sending and receiving money. I don’t have to think about any other behavior. Thus, cultural constraints and conventions are satisfied, and I can focus on the flow and features of the software.

Recommendation: Upgrade to Reflect

Visually, the semantic constraint of featuring only the amount paid or received, the timestamp, and the name of the recipient or sender makes all the info easy to digest. Users can process information quickly and effectively, discerning between being paid, sending money, or cashing out. In this case, Cash treats the description note as secondary information, unless you are the sender and remember to enter a description before you send the money. On the other hand, this adds an extra step when you try to determine, for example, when and why you cashed out.

From left to right: the second screen shows your activity, but it isn’t very descriptive. Instead of users having to populate another screen, perhaps an option that shows a small description of the transaction would eliminate this step. See the farthest right screen for an example of what a one-liner descriptor would look like.

I suggest that the Cash app incorporate a short descriptor in the activity feed, so that users avoid the additional step of retrieving that information. A simple application can become overwhelming when more screens are added to our mental mapping of the UI. Adding the suggested descriptor line to the activity feed would diminish the need to populate another screen. Users could still open the window that displays each transaction’s details separately, at their own discretion. This not only eliminates extra steps, but it also creates a simpler map for users who are not invested in the details.

Conclusion

Overall, the strategies that Cash uses in its interface fulfill the seven stages of action. As mentioned in the recommendations, allowing for alternative sequences can give the user more flexibility in their actions when moving between screens. Allotting control where designers may not have given the user enough can alleviate some of the frustration that comes with having a broad range of users.