Design Story (Alexander Street)

Alexander Street Landing Page Redesign:


Project Brief:

The goal of this project was to give the users of Alexander Street Press, a video and primary-source archive for educational programming, the most satisfying landing page possible: one that would aid sign-up numbers and show users what Alexander Street has that no one else does.


The Current Site:



Our Opportunity:

How might the Alexander Street marketing landing page serve our users better?


The Process:


User Research:

We carried four main questions with us through the entire redesign process:

  • Understand what drives people to seek out cultural or educational programming
  • Discover which channels people use to watch video and how frequently
  • Understand how people define educational programming versus entertainment
  • Learn how much of a priority self-improvement is for people in general

We conducted user research to find answers to these questions, implementing a number of different user tests and insight-gathering methods.

We created personas to get into the mindset of what typical and atypical users of the site might encounter while viewing the page. A user journey map was also helpful for understanding the perspective of an Alexander Street user.

Card sorting was done to see which content was most intriguing and most useful to Alexander Street's visitors.

Through all of this we identified a number of features that could be implemented on the Alexander Street site to make it more impactful for users.

From all of this we gathered a few key insights into our users.

  • People often feel that self-improvement is a priority but don’t take action towards that goal.
  • People have strong preconceptions regarding functionality for searching and playing videos.
  • Users like engaging content, but can be turned off by packaging.
  • People associate videos with unwinding and relaxing rather than an opportunity to educate themselves.


Content Structuring:


We developed an organizing principle for the content on the landing page.

The three main areas are:

  1. Membership
  2. Content
  3. Research Tools and Resources

We also kept "Institutions" as an item for visitors from libraries or faculty, since the site is currently geared towards them; however, we felt that giving that messaging a prominent position on the page might confuse general visitors.



Competitive Analysis:

In our competitive analysis we looked at other video services with layouts similar to what we wanted, though the tone varied from site to site. Our three main inspirations for login inputs and layout were Netflix, Kanopy, and TED.

Some insights we gained:

  • Having a visible and easy to locate account area of the page provides a clear signal to users that there is a membership component to the site.
  • Breaking down the sign-in process into parts can be helpful if there are choices.
  • It’s beneficial to provide a few different paths to browsing content. Curation is particularly important when users are not familiar with a genre or topic.



Our prototyping process began with a paper prototype, where we set general guidelines for all of the content we wanted to appear on the page. Later in the process we refined the proportions and added copy to reflect the tone we wanted Alexander Street to express.

Final Design:

The final design proposal incorporates all of our features: curated playlists to show off the content, a smooth sign-in process, and a streamlined explanation of some of the advanced features Alexander Street has to offer.


The entire design process is a constant push and pull. Iterating at every step makes each small victory feel monumental, and that is what makes the process so fulfilling.

Benchmarking: Worth it?

There are many differences to take into consideration when choosing whether to set up your test as a qualitative or a quantitative study. Qualitative testing yields results that answer questions like "why is this happening?" or "how did that interaction make you feel?" The main goal of a quantitative usability test, by contrast, is to gather metrics: the success rate of a certain task or the time taken to complete it, to name only a few.

Broadly speaking, then, the main difference between the two is the question each answers: "why?" versus "how much?"

This is especially true of the quantitative usability test known as benchmarking. It follows the same methodology as qualitative usability testing, but the focus is placed on the metrics to be gained rather than on insights. Users are asked to perform realistic tasks while the testers record the time it takes to complete those tasks, as well as the success and failure rates of those tasks.
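As a rough sketch of the arithmetic behind those two core metrics, here is a minimal example with entirely hypothetical session records (the participants, task times, and outcomes are made up for illustration):

```python
# Sketch of the two core benchmarking metrics: task success rate and
# mean time-on-task. The session data below is hypothetical.

def success_rate(sessions):
    """Fraction of task attempts marked successful."""
    return sum(1 for s in sessions if s["success"]) / len(sessions)

def mean_time_on_task(sessions):
    """Average completion time in seconds across all attempts."""
    return sum(s["seconds"] for s in sessions) / len(sessions)

sessions = [
    {"participant": "P1", "success": True,  "seconds": 42},
    {"participant": "P2", "success": False, "seconds": 90},
    {"participant": "P3", "success": True,  "seconds": 55},
    {"participant": "P4", "success": True,  "seconds": 61},
]

print(f"Success rate: {success_rate(sessions):.0%}")          # 75%
print(f"Mean time on task: {mean_time_on_task(sessions):.1f}s")  # 62.0s
```

Run against each benchmark round, these same two numbers are what get charted over time.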

Mainly, though, benchmarking is used to chart a site or app's progress over time! That can mean the progress of a prototype's iterations, different versions of an application, or different sites running parallel to one another.

The benefits of benchmarking lie not only in the information gained, but in the way that information can be shared. Numbers and figures are concrete and carry a bit more weight than insights gained from qualitative testing alone. Charts and direct comparisons can also be more persuasive when trying to move higher-ups in any one direction. The metrics are readily available at a glance, so the need to sit through user test videos is lessened, if only a bit.

“How is it done?”, I hear you ask. Well, it starts with a plan. This is vital: there should be no deviation from the plan once you have started down the benchmarking path, so full agreement on what you and your team are trying to achieve is paramount. Some of the main questions to ask: “Do we have the budget to keep this up?”, “What are we measuring?”, “Is the insight gained worth it?”. Then you create the test with a script that you keep consistent. Goal-oriented tasks that avoid over-explanation are used, much as they are in qualitative tests. The difference is the use of evaluative questions after each task: numerical values can be assigned to how “hard” or “easy” a task was. This rhythm of task and question needs to be maintained through every future test.
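Those post-task ease ratings can then be rolled up per task and compared across rounds. A minimal sketch, using made-up 1–7 ratings (in the style of a single ease question) for two hypothetical tasks over two benchmark rounds:

```python
# Hypothetical post-task ease ratings (1 = very hard, 7 = very easy),
# collected after each task in two benchmark rounds.
ratings = {
    "round_1": {"find_video": [3, 4, 2, 3], "sign_up": [5, 6, 5, 4]},
    "round_2": {"find_video": [5, 5, 6, 4], "sign_up": [6, 6, 5, 6]},
}

def mean_rating(scores):
    """Average ease rating for one task in one round."""
    return sum(scores) / len(scores)

for rnd, tasks in ratings.items():
    for task, scores in tasks.items():
        print(f"{rnd} / {task}: {mean_rating(scores):.2f}")
```

Because the same questions are asked in the same rhythm every round, a rising mean from round to round is directly comparable.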

Demographics should also be chosen. Whatever criteria you use for your first test should remain constant through all future tests. The testers don't need to be the same people again and again, but the demographic should be the same (e.g., college students under 25). The sample size of a quantitative usability test is much larger than that of a qualitative one, because a larger sample is required to give the findings any statistical relevance. How frequently you run new tests depends on how much the product you are testing changes; if it is a stable build, tests can be run less often.
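To give a feel for why quantitative sample sizes grow so quickly, here is one standard planning calculation (not something from the study above): the number of participants needed to estimate a mean task time within a chosen margin of error. The confidence level, spread, and margin in the example are assumed values.

```python
import math

def sample_size_for_mean(z, sigma, margin):
    """Participants needed so the estimated mean falls within ±margin
    at the confidence level implied by z. Planning values are assumed."""
    return math.ceil((z * sigma / margin) ** 2)

# e.g. 95% confidence (z ≈ 1.96), an assumed 20 s spread in task times,
# and a desired ±5 s margin of error:
print(sample_size_for_mean(1.96, 20, 5))  # 62
```

Even modest precision targets can push a quantitative study well past the five-or-so participants typical of a qualitative round, which is where the cost difference discussed below comes from.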

There are glaring differences in price, time, and the size of user groups. Usability testing for quantitative data skews all of the above toward the higher, more time- and money-intensive end of the scale. These costs, of course, depend on the specific tests you run and whether you conduct them in person or remotely. Generally speaking, though, quantitative research will always be more expensive because of the number of participants needed to make the sample size viable.

The end result, however, can be beneficial in showing how your UX has improved over time and how it compares to competitors, and it may be more persuasive when presenting your findings to stakeholders in the company.


Design Critique: GrubHub (Website)



GrubHub is a service that lets people place food orders with local restaurants for pickup or delivery, either online or through their phones. It boasts a huge selection of local restaurants as well as fast food chains for customers to choose from.

Critique 1

When you first enter the site, you are greeted with this page (Fig. 1). It is a good example of human-centered design. It starts by asking the user, “Who delivers in your neighborhood?” rather than a more calculated, computerized question like “Where are you located?” The action of choosing your location is centered on the page and is one of the only things you are able to do, making the next step simple to understand; Norman's design principle of visibility is present here. Along with the location finder, the user is presented with infographics that state exactly what the user is expected to do. Besides making the page more appealing, these make concrete the end goal of using the service. Enter your desired location, hit the “Find food” button, and you are off to the next page.

Fig. 1

Critique 2

Once the user has entered their address or zip code, they are sent to a page overrun with options for food and deals, and different ways to filter the content (Fig. 2). It is a marked difference from how the user first entered the site: the minimalistic view that graced the landing page is gone, and an abundance of options stands in its stead. While the feedback received when a filter is clicked is fine, and the gulfs of execution and evaluation are not hard to bridge, I feel that a more guided experience would be helpful in the first stages of using the site.


If, instead, the site took you through the filter steps (price, rating, distance, etc.) one at a time, the end result of all the options being displayed at once wouldn't be so jarring, since the user would have had a chance to look at each piece individually first. Some constraint would make the experience feel more like gradually entering the pool rather than being thrown into the deep end.


Fig. 2

Critique 3

Once you have chosen your restaurant and food options, there is a handy sidebar on the right of the screen that adds items to your checkout bag in real time and updates the total amount you are spending (Fig. 3). Only as you reach the end of choosing your meal is the idea of signing up for the service presented to you (Fig. 4). The user is given a variety of options to finish the ordering process. It is a streamlined process, made simple by the use of photos and well-proportioned text, as well as grid-like sections that ensure everything has a place and everything is in that place.

Fig. 3

Fig. 4