Design research revisited
We spend a lot of time in front of clients. When we get a chance to sit in the audience and let other researchers and designers talk, it’s a welcome change of pace. Exposure to new models, new tools, and new ideas is one way we keep our work fresh and our thinking current.
Recently, Chris and I attended the Design Research Conference in Chicago. Presented by the IIT Institute of Design, the conference is in its tenth year of exploring the business impact of design research. Here’s a recap of some highlights from two information-packed days…
Donna Flynn, Director of Design Research at Steelcase, kicked off the first day of presentations. She shared an interesting model following the arc of impact that design research can have on an organization under the right conditions, moving from knowledge-building through influence-building to guidance.
Jay Melican, a research scientist on the Interaction & Experience Research team at Intel, began his talk with an interesting observation: When it comes to designing design research tools, we are our own users.
Greg Ames and Joel Kashuba of P&G stressed the importance of content delivery: how design researchers capture and report what we do needs to be easily internalized… and a PDF is not easily internalized, at least not in a way that will drive memory and meaning. As Chris summarized: Great content with mediocre delivery might as well be bad content.
The first ID faculty member to present, Tom MacTavish, shared a viral video to illustrate how our expectations for technology are evolving…
Tom challenged attendees to think about how the meaning of terms such as “privacy,” “friend,” “presence,” “conversation,” and “real” have changed in the era of social media and ubiquitous mobile devices.
In my favorite talk from Day One, Jump Associates’ Peter Mortensen and Joyce Chen explained how a theory developed in 1943 to illustrate patterns in the diffusion of corn seed hybrids could be used to identify appropriate design research methods based on where a product or service is in the adoption lifecycle.
Earning the endorsement of innovators means alpha/beta-testing and co-creation. Curation for early adopters requires an understanding of frames (how do they create meaning?), so much can be learned from contextual interviews as well as narrative or semiotic analysis. By observing the activities of the early majority (through shadowing, activity analysis, and needs analysis), you can discover how they’ve been able to integrate products or services into their lives. Through heuristic walk-throughs or usability testing, you can economize products or services for adoption by the late majority. If you can spot style trends or play with features, you can reach laggards; fashion forecasting and focus groups can be useful here. By that point, it’s time for a refresh, which is where resale markets, social media, and even customer service logs can be useful.
I also enjoyed listening to Nate Bolt of Bolt Peters, who presented next. Besides tipping me off about the next innovation from iPod inventor Tony Fadell, his talk also illuminated the importance of timeliness in research. Finding people in the act of thinking about or performing the behaviors you hope to learn more about can be incredibly valuable. As Nate said, “Attachment to the moment yields immediacy,” a truth he probes more deeply in his book.
Kayne Burke, formerly of MindSwarms, talked about the increasing speed of research. Digital technology enables unfiltered, authentic responses from a potentially global pool of research subjects. On the flip side, digital research is still in its Wild West stage. Kayne’s advice: Trust, but verify.
Michael Winnick and Martha Cotton extended the topic of digital research with a talk about their recently launched project, dscout – a tool designed to richly capture real world experiences in real time. We’re excited about the potential of dscout as it might apply to some of the empathic research projects we have in the pipeline.
There is a huge difference between data as data and data as document, according to Jeff Stanger, Director of the Center for Digital Information. Stanger shared the statistic that 98 percent of all research output is currently delivered in the form of a static PDF. In a digital environment, that is a recipe for irrelevance. Visualization of data is no longer sufficient, according to Stanger. Data that can be shared, manipulated, etc. is now the expectation.
The City of Chicago’s Chief Technology Officer John Tolva wrapped up Day One’s presentations by sharing some of the interesting ways his department’s open data policy allows Chicagoans to access and build upon the massive amount of data generated daily by digital systems across the city.
Highlights from the second day of presentations start with the charmingly matter-of-fact (but eminently knowledgeable) Luis Arnal, managing partner of the international strategic innovation consultancy in/situm. He began by listing the problems he has with the word insight. Besides the fact that it can’t be easily translated into Spanish or Portuguese, the word is often misused to describe other things: data, codified data, analysis, patterns. None of these are insights, according to Luis. His definition? An insight is 30 percent imagination, 30 percent data, 30 percent analysis, and 10 percent luck. Insights are not the end, he cautioned, but the means to an end.
He concluded his talk with this bit of advice: While not all researchers are courageous, and not all researchers care about quality, it always helps to understand the segment you’re talking to.
Arjun Chakravarti, Assistant Professor of Management and Marketing at IIT’s Stuart School of Business, reminded us that moving from design insight to market opportunity is hard, especially when price is a key constraint of the design process. Clients are understandably biased toward return on investment, so in the end failure to design to price is failure.
Sapient/Nitro’s global lead for research and insights, Todd Cherkasky, offered four tactics for making design research more collaborative – especially in the age of data analytics.
- Talk about our work in terms of the services we provide, not in terms of the methods that we use. By reframing how we talk about research, we move away from talking about method – data collection – and toward the interpretation of that data, which is where the real value can be found.
- Use tools that encourage collaboration across disciplines. One of the tools Todd uses is a Customer Eco-system. In his mind, the dialogue generated by using these tools actually becomes more important than the artifacts themselves.
- Build an “asset” library as a way to consolidate and share insights. Think in platforms that can be used over time to accumulate knowledge. I was heartened by how perfectly this advice aligns with efforts currently well under way at Peopledesign.
- Audit an organization’s ability to deliver a cohesive customer experience.
Ken Kellogg and Gina Villavicencio of the User Research group at Marriott raised a few important points, both in the text of their talk and in its subtext. First, they discussed some of the differences between what clients need out of research and what designers need. For me, these were some of the key takeaways from the two days of presentations. Also important: The very fact that big, established, seemingly conventional U.S. corporations like Marriott have strong innovation stories to tell speaks to a changing paradigm. Among this group at least, big corporations are no longer seen as the enemies of innovation.
In the afternoon of Day Two, the talks focused on research deliverables. Robert Zolna and Edwin Lee shared some examples of how gravity tank has shifted out of the PowerPoint mentality.
Research deliverables can run the gamut from full-serve to self-serve, sensual to informative. According to Ted Frank of Backstories Studio, choosing the right tools for the right audience means delivering real value.
Anthropologist/industrial designer/strategist Jacob Simmons of NBBJ used a recent project with Kaiser Permanente to talk about creating a design strategy, while Barbara Denton, Kaiser’s national team manager, followed up with a talk about implementing a brand strategy.
Jacob presented NBBJ’s model for design strategy, beginning with Research and Strategy, moving through Concept Development, and ending with Change Management. On a higher level, it bridges the gap between aspiration/vision and transformation. The interesting layer for me was the relationship of “value” to “project work” across this bridge.