Local Projects, one of the earliest self-described experience design studios, specializes in facilitating abstract experiences of emotion, memory, and social connection. How, then, does Local Projects continue to make successful design decisions while relying almost entirely on qualitative feedback? The answer, it seems, is a familiar one: it depends. I spoke with Christina Latina, Art Director at Local Projects, about what goes into taking projects from ideation to their final forms.
What are your major responsibilities in your position at Local Projects?
I am an Art Director at Local Projects, a role I often liken to being an octopus. I work in a cross-disciplinary capacity across many of our departments (we have a few: Visual Experience Design, Creative Technology, Content, Strategy, Project Management and UX) to bring our work to life. From concepting and sketching ideas to media production, space build-outs and technical development, I get to work on so many aspects of our projects (hence the octopus).
What drew you to working there?
I really fell in love with the studio's work for Cooper Hewitt, The Tech Museum of Innovation and the 9/11 Memorial — each of these has a depth and sensitivity to content that translates into really embodied, relatable and meaningful moments. I was also impressed by our Principal Jake Barton and our Leadership Team's dedication and fearlessness in exploring and incorporating new creative technologies.
How do prior research and past projects inform the work that Local Projects is presently doing?
Every project gives us the opportunity to refine our methodologies.
What does the process of user testing look like before a prototype is reached?
At Local Projects we use a 'Prototype First' methodology that is crucial to the success of our ideas, so testing is fundamental to our process. We take an iterative approach to prototyping, often incorporating rounds of user testing at different stages of the prototype's development, depending on the nature of the interaction and the design, and using those insights to evolve the work. We are productively critical of each other, democratic and transparent, and we encourage a strong POV to push our concepts as far as we can. Our clients are a crucial part of this process as well.
Local Projects is known for the emotional content and interactive elements of its work. Are there certain metrics, technologies, or methodologies used to evaluate these aspects when testing a prototype?
The primary aim with our work is to create experiences that drive meaningful conversation between humans in public spaces. It's difficult to assign metrics of success to that, as it's often tacit, and sometimes concepts just work or they don't. Each team spends a lot of time researching and developing concepts through sketching. Once we've got a solid concept in place, we can quickly evaluate the idea and our assumptions by creating lo-fi prototypes, setting them up in our studio and testing them on each other. We often create unconventional modes of interaction using unconventional technology, so it's crucial at this stage that our ideas make sense and can be built. After the initial prototyping round, we take it out into the real world and test it with real users. We create many more rounds of prototypes until we get to a conceptual and technical solution that works. While we have a toolkit of methods, each project is unique and suggests different metrics and methodologies.
How do these provide direction in the development of projects?
Our UX team is methodical in conceiving testing methods and documenting findings from these tests. After these qualitative, HCI-style interaction evaluations, they apply an adapted grounded theory methodology to develop insights that inform subsequent iterations of the work.
Do you have a personal preference for a certain method of measuring user emotions / engagement? Why?
My favorite method (which is arguably not even a method) is capturing photo or video documentation and watching the facial expressions of test users as they engage with your prototypes and ideas. Microexpressions convey a lot — often more than the users themselves say in words.
How does Local Projects evaluate the effectiveness of a project once it has launched?
With our clients, we create a set of goals, objectives and requirements for the project from the outset to measure success, each set unique and customized. But to me, the public's engagement and reaction is always the strongest measure.
Are there any methods of evaluating engagement that you anticipate becoming more widely used in the future?
I would love to see something like an open interaction laboratory where guests can come and play out ideas, especially in different and unexpected contexts. In general, design and technology could do a better job of incorporating underrepresented viewpoints, especially for works that live in public spaces.