
What is Uncomfortable Knowledge?

Image attribution: Oxford Martin School, ‘Professor Steve Rayner announced as 2020 Paradigm Award winner’ (2020).

The process of unravelling the drivers of persistent societal and environmental problems involves extraordinary political challenges. Persistent wicked problems include such diverse issues as pervasive anthropogenic pollution, nutrient loss in food, the privatisation of data, climate change, obesity and non-communicable disease.

Wicked problems require that the public, scientists and policy makers ‘wade upstream’ to identify the overlapping and embedded tangles of social life, biological health and commercial activity. Wicked problems are complex socio-biological challenges arising from activities embedded in daily life; their effects are often slow and crescive, expanding and interweaving, and thereby becoming more complex over time.

The process of unpicking a wicked problem tends to reveal ‘uncomfortable knowledge’. Uncomfortable knowledge bridges two separate but related theoretical concepts: ‘wicked problems’ and ‘clumsy solutions’.

Steve Rayner (1953-2020) theorised that uncomfortable knowledge (Rayner, 2012) was knowledge which, if revealed, presented a danger to institutions because such knowledge could potentially undermine institutional principles, arrangements and goals. Rayner considered that uncomfortable knowledge was not merely knowledge lost through an accidental background failure to acquire, store and retrieve knowledge; rather, he theorised that:

‘ignorance, the deliberate exclusion of information and knowledge, was a necessary social achievement rather than as a simple background failure to acquire, store and retrieve knowledge.’

(Rayner, 2012, p. 108).

Rayner drew on Mary Douglas’ work on institutional memory and the application of non-knowledge as a tool of political and social authority (Douglas, 1986; 1994). This work has since been expanded in sociological studies on the social construction of ignorance (Gross & McGoey, 2015).

Bridging wicked problems & clumsy solutions

Uncomfortable knowledge was theorised by Rayner as the bridge between wicked problems and the uncertain and unknowable process of re-solving them – a concept theorised by Michael Shapiro as ‘clumsy solutions’ (Shapiro, 1988). Shapiro applied the term to judicial selection processes in U.S. courts, where different processes of consideration lead to a conclusion. Rayner envisaged clumsy solutions as satisficing (as compared to optimizing) social arrangements which allow:

‘different sub-sections of a society or organization to rub along with each other by not questioning each other’s motivations and worldviews too deeply’

(Rayner, 2012, p. 112; Shapiro, 1988)

Uncomfortable knowledge, that liminal and often sequestered bridge between wicked problems and their solutions, was considered problematic for institutional actors because it carried the potential to disrupt, and hence endanger, institutional function.

Rayner’s work explored the relationship between science and technology, the environment and governance, and his observations on the necessary social construction of ignorance arose through this research. Rayner articulated how, for powerful interests, ignorance is a necessary social achievement, applied as a strategic tool essential to preserving institutional interests (Rayner, 2012).

Rayner theorised that this occurred in two ways. Firstly, when uncomfortable knowledge is brought to the surface and legitimised (made authoritative, such as by publication in a scientific journal), the principles or goals of organizations risk being undermined. Rayner cites the famous case of the O-rings on the Challenger space shuttle.

However, science drawing attention to toxic pollutants in daily life has the potential to undermine strategic interests: for example, if an ingredient in a common household product were recognised as unsafe at a level previously considered safe. Once a pollutant was flagged as unsafe, measures could be taken to deter production, ranging from taxes to withdrawing approval of a synthetic chemical, formulation or emission (Michaels, 2020; Pinto & Hicks, 2019; Proctor, 1995). Restricting or deterring the science that would draw attention to risk was therefore recognised as a strategically important measure. Similarly, if non-communicable disease and severe multimorbidity were flagged as drivers of risk for communicable disease pandemics, policies protective of health, such as taxation of non-nutritive foodstuffs or increased nutrition education, might result. During the COVID-19 pandemic, research into environmentally derived risk, including multimorbidity, substantially lagged behind research advocating new medical treatments.

Such scientific research provides evidence to support policy. If such policy were enacted, the resultant costs, whether in lost profits or through taxation, could be significantly disruptive to a given institution’s economic goals and to its claims of institutional responsibility.

Secondly, Rayner considered that admitting knowledge could open institutions up to criticism that they should have known about an issue or harm. The literature is rife with examples of institutions recognising harm long before that harm was publicly acknowledged; examples include cigarettes, asbestos, Teflon/PFAS, bisphenol plastics, fluoride, biosolids releases, medical opioids, talcum powder, the herbicide glyphosate (Roundup) and neonicotinoid insecticides. Frequently, while early (legacy) chemicals are displaced, industry arrives with substitute compounds – regrettable substitutions – such as vapes, PFAS fire retardants, shorter-chain PFAS, BPZ, selective serotonin reuptake inhibitors (SSRIs), herbicides from other classes presented in multiple formulations, and genetic engineering technologies. These technologies carry similar or different risks, but regulators often lag behind industry knowledge and, for the most part, are dependent upon industry knowledge to regulate.

These institutions include government and regulatory agencies, which can exclude particular knowledges or data from their risk processes, crafting the way chemical risk is perceived and understood by the relevant agency (Iorns Magallanes, 2018; Vandenberg et al., 2013). Regulatory agencies often maintain outdated knowledges. For example, while biomarker technologies are embedded in medical research, chemical regulators do not require biomarker technologies to evaluate systemic exposure risks, and population-level biomonitoring is not supplied to support regulatory decision-making.

A failure to act on new knowledge and evidence in the scientific literature can lead the public to question the legitimacy of an institution and to distrust its future action or inaction (Rayner, 2012, p. 111).

What is politically valued is often a function of influence & power

Powerful institutions can contest actions taken to explore a given wicked problem through implicit and explicit processes. Such knowledges are directly political and dependent on power asymmetries (Proctor, 1991). However, the process of solving a wicked problem is dependent, as Rittel and Webber (1973, p. 166) recognised, on the richness of reasoning. Yet frequently, policy and planning papers discussing wicked problems exclude the function of power and the asymmetries created through its exercise. This is where a sociological perspective can be useful, and why Rayner’s paper is valuable. I argue that wicked problem-solving is not simply a problem of political distance between actors (Turnbull & Hoppe, 2019); nor of improving inclusiveness (Ney & Verweij, 2015); nor a matter of getting down to it (McConnell, 2019) and putting together a research program (Peters, 2017); nor a matter of simply taking determined incremental steps (Levin, Cashore, Bernstein, & Auld, 2012).

Wicked problem solving is political. Institutions may perceive the process of revealing knowledge as dangerous and disruptive, and consequently take strategic action at multiple levels to manage uncomfortable knowledge. Rayner proposed that institutions deployed four tacit information management strategies to alter how knowledge was understood, via processes of denial, dismissal, diversion (or decoy) and displacement:

‘Denial represents a refusal to acknowledge or engage with information. Dismissal acknowledges the existence of information, and may involve some minimal engagement up to the point of rebutting it as erroneous or irrelevant. Diversion involves the creation of an activity that distracts attention away from an uncomfortable issue. Finally, displacement occurs when an organization engages with an issue, but substitutes management of a representation of a problem (such as a computer model) for management of the represented object or activity’ (Rayner, 2012, p. 113).

There is potential that these processes of information management could be happening with respect to endocrine-disrupting chemical (EDC) research in human health in New Zealand. Science, as information, is a strategic tool in framing policy debates (Cordner, 2015), and science can be denied, dismissed, diverted or displaced when it has the potential to bring uncomfortable knowledges to the surface. A scientist can objectively chart her choice of subject, equipment and parameters, but the funding for that science arrives via socially determined processes which are rarely neutral (Proctor, 1991). This process of scientific knowledge production that prevents uncomfortable knowledge arising is not necessarily intentionally malevolent. While some efforts to suppress knowledge are deliberate and manipulative, in many cases the production of non-knowledge or ignorance that shapes responsibility is not deviant but a normative feature of institutional and scientific processes (Croissant, 2014). These processes fragment and diffuse responsibility over time (Heffernan, 2011). The control or funding of the production of that knowledge and non-knowledge, and the authority to determine which science is legitimately included in debate, is tactically important (Croissant, 2014; Hess, 2015; Suryanarayanan & Kleinman, 2013).

Doubt & uncertainty: strategic tools

Often the funding that does not arrive for a chosen project can explain as much about the strategic aims of funding investment streams as the funding that does arrive (Hess, 2015). The positioning of doubt and uncertainty in reasoning and discussion is central to action or inaction by decision-makers and responsible institutions (Michaels, 2020).

The issue of uncomfortable knowledge is not restricted to commercial industries. Governments are tasked with the paradox of both promoting economic development and regulating the goods that development produces. The setting of regulations and standards, while ostensibly a technical act, has direct political consequences, and the economic demands of governments can often be contradictory and at odds with their obligation to maintain public and environmental health (Jasanoff, 2011; Pinto & Hicks, 2019). There can be conflict when industries are tasked with providing the science that proves the safety of their own products (Frickel & Moore, 2006; Michaels, 2020). Government has a role in fostering innovation for growth, and when public resources are applied as a tool to support economic development, ‘research is much more likely to be pulled in directions chosen by industry than to be pushed toward democratically chosen ends, no matter how open the priority-setting process in government become’ (Cozzens & Woodhouse, 1995, p. 3).

For actors seeking public authorisation for economic activity, the release or deployment of new technologies, and the continued release of controversial and potentially harmful technologies, require strategic action to diffuse, delay or restrict regulatory action that might increase the costs of production and alter the financial bottom line.

In this world, science can be a powerful ally, deployed to depoliticise debates (Ezrahi, 2003). As Ulrich Beck observed:

‘[s]cience is one of the causes, the medium of definition and the source of solutions to risks’

(Beck, 1992, p. 155).

Bibliography

Beck, U. (1992). Risk Society: Towards a New Modernity (2013 ed.). London: SAGE.

Cordner, A. (2015). Strategic Science Translation and Environmental Controversies. Science, Technology, & Human Values, 915-938.

Cozzens, S., & Woodhouse, E. (1995). Science, Government, and the Politics of Knowledge. In S. Jasanoff, G. Markle, J. Peterson, & T. Pinch (Eds.), Handbook of Science and Technology Studies (pp. 532-553). SAGE Publications.

Croissant, J. (2014). Agnotology: Ignorance and Absence or Towards a Sociology of Things that Aren’t There. Social Epistemology, 4-25.

Douglas, M. (1986). How Institutions Think. Syracuse, NY: Syracuse University Press.

Douglas, M. (1994). Risk and Blame: Essays in Cultural Theory. Taylor & Francis Group.

Ezrahi, Y. (2003). Science and the Postmodern Shift in Contemporary Democracies. In B. Joerges, & H. Nowotny, Social Studies of Science and Technology: Looking Back Ahead (pp. 63-75). Kluwer Academic Publishers.

Frickel, S., & Moore, K. (Eds.). (2006). The New Political Sociology of Science. Madison WI: The University of Wisconsin Press.

Gross, M., & McGoey, L. (Eds.). (2015). Routledge International Handbook of Ignorance Studies. Oxon: Routledge.

Heffernan, M. (2011). Wilful Blindness. London: Simon & Schuster.

Hess, D. J. (2015). Undone science and social movements: A review and typology. In M. Gross, & L. McGoey (Eds.), Routledge International Handbook of Ignorance Studies (pp. 141-154). Oxon: Routledge. Retrieved from http://www.davidjhess.org/uploads/3/5/1/3/3513369/handbookignorance.finaldraft.pdf

Iorns Magallanes, C. (2018). Permitting Poison: Pesticide Regulation in Aotearoa New Zealand. Environmental and Planning Law Journal, 456-490.

Jasanoff, S. (2011). The Practices of Objectivity in Regulatory Science. In C. Camic, N. Gross, & M. Lamont, Social Knowledge in the Making (pp. 307-338). Chicago: The University of Chicago Press.

Levin, K., Cashore, B., Bernstein, S., & Auld, G. (2012). Overcoming the tragedy of super wicked problems: constraining our future selves to ameliorate global climate change. Policy Sciences, 45(2), 123-152.

McConnell, A. (2019). Rethinking wicked problems as political problems and policy problems. Policy & Politics, 46(1), 165-180.

Michaels, D. (2020). The Triumph of Doubt. Dark Money and the Science of Deception. Oxford University Press.

Ney, S., & Verweij, M. (2015). Messy institutions for wicked problems: How to generate clumsy solutions? Environment and Planning C: Government and Policy, 1679-1696.

Peters, B. (2017). What is so wicked about wicked problems? A conceptual analysis and a research program. Policy and Society, 36(3), 385-396.

Pinto, M., & Hicks, D. (2019). Legitimizing Values in Regulatory Science (Commentary). Environmental Health Perspectives, 127(3), 035001.

Proctor, R. (1991). Value-Free Science? Purity and Power in Modern Knowledge. Cambridge: Harvard University Press.

Proctor, R. (1995). Cancer Wars. How Politics Shape What We Know and Don’t Know About Cancer. New York: Basic Books.

Rayner, S. (2012). Uncomfortable knowledge: the social construction of ignorance in science and environmental policy discourses. Economy and Society, 41(1), 107-125.

Rittel, H., & Webber, M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4(2), 155-169.

Shapiro, M. (1988). Introduction: Judicial selection and the design of clumsy institutions. Southern California Law Review, 1555-1563.

Suryanarayanan, S., & Kleinman, D. (2013). Be(e) coming experts: The controversy over insecticides in the honey bee colony collapse disorder. Social Studies of Science, 43(2), 215-240.

Turnbull, N., & Hoppe, R. (2019). Problematizing ‘wickedness’: a critique of the wicked problems concept, from philosophy to practice. Policy and Society, 38(2), 315-337.

Vandenberg, L., Colborn, T., Hayes, T. B., Heindel, J., Jacobs, D., . . . Zoeller, R. (2013). Regulatory decisions on endocrine disrupting chemicals should be based on the principles of endocrinology. Reproductive Toxicology, 38, 1-5.