Lessons from the Ethnographic Practice in Industry Conference
I attended the 2017 Ethnographic Practice in Industry Conference, or EPIC for short, in Montreal from October 22-25.
I had two motivations for going:
The first is that the more I find myself tackling digital change at the FT, the more I come to believe that understanding and influencing company culture is at the heart of what I do. I have operated on instinct so far, and I wanted to learn more about approaches, methods, competencies and theoretical frameworks that others have developed.
The second is my long-standing interest in exploring the boundaries of journalism and storytelling. The rise of computer-assisted reporting and data visualisation brought new ways of gathering and presenting data to journalism by borrowing the techniques, language, and ethos of statisticians and those who deal mostly with quantitative data. What can we learn from ethnographers, who gather, structure and present mostly qualitative data?
I took notes of all the sessions I attended (day 1, day 2, day 3, day 4), but they were written in the moment and are therefore largely an exercise in stenography. So, in the interest of synthesis and reduction (a lesson I learnt in Sam Ladner’s Ethnographic Research Design tutorial), here are my main takeaways from the conference. Any misinterpretation or misunderstanding is my own.
A different way to make sense of data
Quantitative data journalism takes a very directed approach towards interrogating data sets. In fact, the whole idea of ‘interrogating data sets’ and ‘treating the data as a source’ presupposes certain things: that you already have a way of coming up with questions to ask, and that there is a way to abstract or manipulate the data into a ‘usable’ form.
But ethnographers often deal with so-called ‘wicked problems’: situations where the initial question might be the wrong one and it is unclear what the right questions are. They have developed a set of methods and competencies to guide them in those circumstances. It is a very different process. There is a lot more emphasis on experiential understanding (Tom Rowley’s pecha kucha on bringing both feet back to ethnography; Vyjayanthi Vadrevu on dance as foundational to her ethnography practice), and a lot more emphasis on letting time do its work (Melissa Cefkin on the pedestrian perspective and slowing things down).
Since ethnography mostly studies people rather than things, there’s also a lot more emphasis on conversation and entering into the participant’s cultural framework (Therese Kennelly Okraku, Valerio Leone Sciabolazza, Raffaele Vacca & Christopher McCarty’s choice to prioritise scientists’ feedback over the network analysis data in identifying emerging communities).
One example of this difference is in the construction of new data sets. Both data journalists and observational researchers sometimes create new data sets through observation, but the purpose is very different. The former creates data sets as a source of truth. Representativeness and accuracy matter. The data set, while a sample, has to say something about the bigger whole. The process (the act of recording) is in service to the outcome (the data set).
The observational researcher, however, uses the process of creating the data set to focus her mind, spark curiosity, and generate questions and hypotheses about the bigger whole. Representativeness and accuracy matter a lot less than salience and what she chooses to focus on. The process matters more than the outcome.
How do you bring rigour to unstructured data? Ethnographers have a set of methods for note-taking, and for organising and structuring notes into files. Not everything is recorded; a firehose of data is overwhelming. Instead, they take notes with an eye towards capturing information about the context in which the observation takes place, and metadata about the observations that will allow them to be turned into data sets later on (e.g. location, date, time and time span, and what is being observed).
It’s a dance between structure and openness, and (compared to data scientists, at least) a much greater willingness to abandon or remake data structures so that the ethnographer can examine the data from different perspectives and in different contexts. This requires them to capture and work with much ‘bigger’ data at the outset. A really good illustration of the importance of embracing openness is Dana Sherwood’s talk on day two. Another is Fiona Moore’s research at the BMW Mini plant in Oxford, which found that long-term vs short-term joiners were more relevant demographic categories than gender or ethnicity.
Communication and articulation
Part of the value of ethnography lies in articulation and explanation. Barry Dornfield (“Delivering the Secret Sauce”: Culture and Identity in a Corporate Merger) noted, for example, that it’s critical to understand what, in the client’s world, counts as culture and how to make it concrete to work on.
Michael Griffin’s talk (Surveillance, Technology, and American Conceptions of Freedom) was about articulating what people really mean when they use the word ‘freedom’.
Josh Kaplan’s talk (When ‘The Emperor Has No Clothes’: Performance, Complicity, and Legitimacy in Corporate Attempts at Innovation) highlights how quantitative data can tell you, for example, that a company is shipping products despite evidence of risk uncovered by its internal research function, but qualitative research can tell you why.
Some of the answers, at a top-line level, may appear obvious or common sense: the company is insular and its research employees have become complicit. But the value comes in the ability to provide detail. (This is similar to quantitative data analysis, where showing the distribution is a lot more valuable than a summary statistic like the mean.) What exactly are the characteristics of insularity? What constraints cause researchers to become complicit?
Emma Saunders, MaiLynn Stormon-Trinh & Stephani Buckland’s talk on the different understandings of productivity between the New Zealand government and small business owners highlights the importance of understanding why as well as what.
But it is easier to communicate a scientific and teleological understanding than an ethnographic and experiential one: you can take people step by step through your data analysis in a way that you cannot when the knowledge is embodied. Ethnographers have found (and are continuing to evolve) methods around this problem: ethnographic film, interactive maps (Ageing Gracefully in Singapore: An Interactive Journey by Gabriella Piccolo & Michele Visciola • Experientia), workshops, providing ‘actionable objects’ and ‘precise measures’, etc.
Ethnography is very much rooted in practices that took place in physical space. A lot still happens in physical space (see the talks on Red Rooster, ridesharing, etc), but the fact that what people are doing within a digital space is increasingly more salient than what they are doing within a physical space at any given time presents a challenge.
Some of this has to do with grappling with the tools and techniques of digital observation (Christopher Golias on remote usability testing). But the bigger question is how to achieve, within a digital context, the sort of embodied, experiential understanding that ethnography is so good at in physical spaces (Julia Hines’s talk on Multi-Dimensional Ethnography). What does it mean to do observational research when the researcher cannot engage all of their senses?
Having an impact
How to advocate and create change within a shifting corporate context was a running theme of the conference (the ethnography/agile session that I missed, for example). What was interesting to me is that the framing and language used were ‘interventions’ and ‘recommendations’, which seem to presuppose that the ethnographer comes in from the outside and is not in charge. Many of the questions and discussions appeared to revolve around how researchers can be leaders and influence strategy.
To me, this echoes similar discussions among designers about ‘design leadership’ or (to grossly stereotype) how software engineers just went ahead and built technology/engineering-driven companies (and who are now having to return to discussing/examining issues of culture).
It’s very hard to both be producing knowledge and producing things at the same time
Autonomous systems are essentially relational systems. We need to be involved in the shifting of relationships
How do we see things and find things that we didn’t know to look for?
Projects that go wrong are the ones that have most feeling and are most memorable
“Oh my god that baboon just stole someone’s birthday cake!”
I’ve seen empathy do nothing time and time again. Empathy has to help the client win. You have to come up with a bunch of commercial implications that are attached to that empathy
One of the interesting things is how to design products and services for populations you know nothing about. If you don’t take the view that you are only qualified to design products for yourself and people like you, then you need a foundation of knowledge about the population
Luckily, all grammars leak. There’s space to tell better stories…Resistance can be fertile, even if it can take decades
Sometimes we know what we believe more than we believe what we know
Interestingly, corporate and startup mindsets are aligned in not understanding the customer mindset
Culture problems are wicked problems: The client is facing something important but you don’t know what you don’t know during the project design phase, nor at the outset of research.
A ghost is not simply a missing or dead person but a social vision. Ghosts can indicate social dynamics just beneath the surface
Behaviours are expressions of feelings. There’s truth and emotions that words obscure, and it would do us well to know what they look like
We have wrongly convinced ourselves that utilitarian economic theory is timeless and mirrors nature. We’ve forgotten it has a before, and has to have an after