The upfront “end-user's perspective” is disturbingly inadequate and highly likely to lead to wasted effort
With statements like “solving problems from an end-user's perspective at our core”, who could possibly argue with the approaches espoused by design thinking and human centred design practitioners? The answer is anyone familiar with sampling error, statistical inference, and the frailty of respondents’ stated responses; that’s who.
The overlapping element between design thinking and human centred design is the approach frequently used to collect the end-user's perspective. A skilled marketing researcher would describe the data collection approach commonly prescribed by design thinking and human centred design as a judgement-based convenience sample used to capture unstructured, qualitative data. Sure, for the novice, terms like ethnography might add to the mystique and standing; however, that does not alter the plain facts, nor the loose foundations upon which design thinking and human centred design initiatives are often built.
If the qualitative data collection was being used purely to help form and broaden hypotheses, then that approach would be fine. However, if the convenience sample is being used for developing CEX initiatives such as designing a new service or product, then the upfront “end-user's perspective” is disturbingly inadequate and highly likely to lead to wasted effort.
Absence of Rigor
There is little rigor in the data collection techniques undertaken at the forefront of human centred design and design thinking. Indeed, it has frequently been observed that, other than management judgement, a convenience sample undertaken in the name of human centred design is often the only justification for undertaking yet another expansive CEX initiative.
I have witnessed human centred design practitioners take the elevator to the ground floor and conduct intercept interviews with people in the café, some of whom were employees, and return upstairs with “insight” used to inform major investment decisions.
In the words of Amos Tversky and Daniel Kahneman in their journal article, ‘Belief in the Law of Small Numbers[i]’: ‘They (in this case, design thinking and human centred design practitioners) tend to extract more certainty from the data than the data, in fact, contain.’ The first guiding principle of scientific inquiry is the ability to replicate results within the margins of experimental error. Collecting numerous convenience samples is likely to produce entirely different results each time.
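To make the ‘law of small numbers’ concrete, here is a minimal, purely illustrative simulation (the population, the 30% preference share, and the ten-person sample size are all invented): it repeats a ten-person intercept survey twenty times against the same population and shows how widely the resulting “insight” swings from one replication to the next.

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical population of 1,000 customers; 30% genuinely prefer
# the proposed new service (1 = prefers it, 0 = does not).
TRUE_SHARE = 0.30
population = [1] * 300 + [0] * 700

def intercept_estimate(n=10):
    """Estimate the preference share from one small intercept sample."""
    sample = random.sample(population, n)
    return sum(sample) / n

# Repeat the "cafe intercept" exercise 20 times with 10 people each.
estimates = [intercept_estimate(n=10) for _ in range(20)]

print(f"True share: {TRUE_SHARE:.0%}")
print(f"Estimates ranged from {min(estimates):.0%} to {max(estimates):.0%}")
print(f"Standard deviation of estimates: {statistics.stdev(estimates):.2f}")
```

With a true share of 30% and samples of ten, the standard error of each estimate is √(0.3 × 0.7 / 10) ≈ 0.14, so replications reporting anywhere from roughly 10% to 50% are entirely routine; any one of them, taken alone, looks like a finding.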
Little wonder organisations suffer from initiative overload. Too many initiatives, and allowing “zombie” initiatives to linger, are a function of poor initial screening built on approaches such as convenience sampling. It is a forlorn hope to expect middle management to kill initiatives that are politically charged, or upon which their livelihood partly depends.
The other symptom of too many initiatives is excessive demand on management, leading to poor execution. ‘Many organizations lack mechanisms to identify, measure, and manage the demands that initiatives place on the managers and employees who are expected to do the work.[ii]’
According to the HBR[ii], ‘Design thinking provides a structured process that helps innovators break free of counterproductive tendencies that thwart innovation.’ I completely accept that there are human tendencies that get in the way of innovation, and that methods like design thinking and its 1980s forebear, total quality management, are ideal for circumventing those blockers. However, substantial investment decisions based on such a poor qualitative substratum undermine the worthy principles of design thinking and human centred design and expose the organisation to unnecessary risk.
The Solution is Clear
Once the qualitative data collection has established the range of options, design thinking and human centred design practitioners should undertake a quantitative study that appropriately captures inferred responses (uncovered through econometric modelling) and thereby reveals underlying behavior. Techniques such as choice modelling are ideal for this purpose. Choice modelling is a scientifically robust approach used to objectively understand the attributes driving choice, applied precisely to avoid the unreliability of respondents’ stated responses.
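As a sketch of the idea behind choice modelling (not any particular commercial implementation), the toy example below simulates respondents choosing between two offers that differ only in price, then recovers the price sensitivity from the observed choices with a simple binary logit fitted by gradient ascent. All numbers, including the true price coefficient, are invented for illustration.

```python
import math
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical ground truth: utility falls by 0.8 per dollar of price.
TRUE_PRICE_COEF = -0.8

def simulate_choice(price_a, price_b):
    """Respondent picks offer A with logit probability from the price gap."""
    u_diff = TRUE_PRICE_COEF * (price_a - price_b)
    p_choose_a = 1 / (1 + math.exp(-u_diff))
    return 1 if random.random() < p_choose_a else 0

# The data are observed choices -- behaviour, not stated opinion.
tasks = [(random.uniform(1, 5), random.uniform(1, 5)) for _ in range(2000)]
choices = [simulate_choice(pa, pb) for pa, pb in tasks]

# Recover the price coefficient by maximum likelihood
# (gradient ascent on the binary-logit log-likelihood).
beta, lr = 0.0, 0.1
for _ in range(200):
    grad = 0.0
    for (pa, pb), y in zip(tasks, choices):
        x = pa - pb
        p = 1 / (1 + math.exp(-beta * x))
        grad += (y - p) * x
    beta += lr * grad / len(tasks)

print(f"Recovered price coefficient: {beta:.2f} (true value {TRUE_PRICE_COEF})")
```

Because the coefficient is inferred from what respondents do rather than what they say, the estimate converges on the underlying preference structure as the number of choice tasks grows; real choice models extend the same logic to many attributes and more than two alternatives.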
It is somewhat irrelevant that the design thinking process of innovation encompasses elements such as concept development, applied creativity, prototyping, and experimentation when the initial qualitative foundations are so rickety. Methods applied in design thinking and human centred design are a “creative” or pseudo-qualitative approach to proposing a customer-generated solution. The trouble is, without the benefit of a properly structured quantitative assessment, there is often little objective, scientifically derived evidence that the “problem” is related to any organisational outcome or, indeed, that the “solution” will change the future behavior of consumers.
Ken Roberts | Executive Chairman | Forethought
[i] Tversky, A., & Kahneman, D. (1971). Belief in the law of small numbers. Psychological Bulletin, 76(2), 105–110.
[ii] Hollister, R., & Watkins, M. (2018). Too many projects: How to deal with initiative overload. Harvard Business Review, September–October 2018.