28 July 2022

Feeling the Heat: Modern Data for the Critical Decade

    [Header image: Lights across the US at night]

    Early in Michael Mann’s lauded action thriller Heat, Robert De Niro’s master criminal Neil McCauley is persuaded to take on the fateful heist by an off-beat, wheelchair-bound tech expert, Kelso. When the sceptical McCauley asks what the haul is likely to be for his crew, Kelso casually puts the figure between $12.1 million and $12.2 million.

    “That’s not really an estimate. Those are exact figures,” Kelso assures McCauley, as he leafs through the perforated computer printouts of the bank’s balance sheets.

    “How d’ya get this information?” De Niro’s McCauley asks.

    “It comes to you. This stuff just flies through the air,” replies Kelso, with a louche wave of the hand.

    “You’ve just got to know how to grab it.”

    - Heat, Director: Michael Mann, 1995

    In this brief but pivotal interchange, Michael Mann encapsulates the myth of the ‘data age’. All the data we could ever need is just there, sequestered in one cloud or another, just out of reach. If we can access it, we will be able to build our research, forecasts and models on the ‘exact figures’ of which Kelso is so proud. The only difficulty is knowing the right way to pluck it out of the air.

    The Modern Data for the Critical Decade workshop explored the potential applications of data for understanding, mitigating and adapting to climate change and variability, whilst highlighting the challenges and complexity of such approaches. What was perhaps most fascinating, however, was the opportunity to draw on - and elucidate - the transdisciplinary methodologies employed by Critical Decade for Climate Change researchers. In doing so, the workshop encouraged us all to lift our gaze from the spreadsheets, graphs and questionnaires that so often characterise our imagining of ‘data’, and to consider more widely the sources from which we can derive data, the differing qualities those data can have, and the forms their analysis can take.

    A key strength of large datasets is that they allow researchers to identify specific issues and to suggest changes that could mitigate climate change and variability. For example, the departure point for Grace Lin’s research, “Transforming diets for environmental sustainability through experimental interventions”, is the calculation that our global food system is responsible for around 30% of greenhouse gas emissions. To meet climate change targets, whether they are set at 1.5ºC, 1.8ºC or 2ºC, this system will therefore need to be transformed through a widespread shift to plant-based diets. The aim of the project is to design and test potential interventions that will foster such a change. Part of the challenge is understanding dietary trends and measuring changes in dietary behaviour.

    The rather elegant ‘big data’ solution employed by Grace is to draw on loyalty card purchasing data from one of the UK’s biggest supermarket chains. These data include information on quantities and prices of thousands of products on a weekly basis, segmented by geo-demographic characteristics. They capture both specific short-term shifts in consumer behaviour, such as those around Christmas and New Year, and overall longer-term trends. Unlike customer surveys, which suffer from social desirability bias and are less reliable, these data record actual consumption behaviour. As Kelso says, “that’s not really an estimate. Those are exact figures”.
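
    To make this concrete, below is a minimal sketch in R of how such purchase data might be summarised. The real dataset and its schema are not public, so the column names, categories and figures here are purely illustrative assumptions.

        library(dplyr)
        library(tibble)

        # Toy weekly purchase records; the real loyalty-card data span
        # thousands of products and geo-demographic segments.
        purchases <- tribble(
          ~week,        ~category,     ~quantity,
          "2021-12-20", "meat",        120,
          "2021-12-20", "plant_based",  35,
          "2022-01-03", "meat",         80,
          "2022-01-03", "plant_based",  55
        )

        # Weekly share of plant-based purchases: one simple proxy for a
        # shift towards plant-based diets.
        plant_share <- purchases %>%
          group_by(week) %>%
          summarise(plant_share = sum(quantity[category == "plant_based"]) /
                                  sum(quantity))

        print(plant_share)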

    Grace Lin, Critical Decade Leverhulme Scholar, presenting on Supermarket Loyalty Data

    Equally, data analysis has a role to play in determining the efficacy of different mitigation approaches. Through “Promising words, evaluating actions: assessing Greenhouse Gas Removal in national net zero plans”, Harry Smith is assessing differing countries’ relative reliance on carbon removal systems to meet their greenhouse gas emissions commitments. For this project, the data is out there, “flying through the air”, within each country’s national net zero plan. With each nation reporting in a differing format, however, with varying levels of consistency, detail and clarity, drawing out the relevant information is very much a case of “knowing how to grab it.”

    Harry’s initial methodology was to use R to ‘text mine’ the documents, systematically extracting the relevant textual data. The long-term low emission development strategies, however, proved too complex and inconsistent for the information to be extracted accurately in this way. Instead, using NVivo, Harry has applied inductive and deductive coding to review 3,892 pages of policy, coding for carbon dioxide removal methods, policy statements and calls for international cooperation.
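
    As an illustration of the sort of keyword search such a text-mining pass might begin with, here is a minimal sketch in R. The file name and search terms are assumptions for illustration only; the actual pipeline was more involved and, as noted, was ultimately set aside in favour of manual coding.

        library(pdftools)   # pdf_text() returns one character string per page
        library(stringr)    # str_detect() for vectorised pattern matching

        # Hypothetical file name; national strategies are published as PDFs
        # of widely varying structure.
        pages <- pdf_text("example_net_zero_strategy.pdf")

        # Flag pages mentioning carbon dioxide removal methods.
        cdr_terms <- regex("carbon dioxide removal|direct air capture|afforestation",
                           ignore_case = TRUE)
        hits <- which(str_detect(pages, cdr_terms))

        cat("Pages mentioning CDR methods:", paste(hits, collapse = ", "), "\n")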

    This perhaps indicates the limitations of technology and programming for processing large and complex data in non-standardised form - in this case, buried in textual reports. Or, more precisely, the limitations given finite time and resources. No doubt, given a team of AI engineers and ample computing power, a programming approach could successfully be developed to mine the key data, but in the absence of that, the linked system of human eye and mind remains the best tool at our disposal.

    The further we move away from numerical and statistical data, the less we are able to rely on technological or computing solutions to sift through the wealth of data on which we have constructed our information age. What was most nourishing about the workshop was the role that human experience and storytelling have to play in driving the societal changes needed to face the threat of climate change and variability. Through her project “Other Words, Other Worlds: diversifying narratives of a climate-changing planet”, Anna Lau aims to employ workshops and conversations (not ‘interviews’, she highlights) to share in the stories of lived experiences. These will inform and shape her own creative response, in the form of a novel.

    Rather than characterising this as a data gathering exercise, Anna describes it as a listening and interpretation process, encompassing interpretations of emotional experiences. In doing so, the project looks to de-prioritise climate change as a ‘technocratic challenge’ requiring a scientific, technological or policy response. Instead, Anna’s project urges us to understand climate change as the latest sign in centuries-long arcs of earth conversations, signalling the recycling of death and disease, and the disabling of a collective human capacity to listen well to how humans could fulfil a role as carers, or stewards, of the earth.

    For those who have not yet seen Heat, I won’t detail how it unfolds. When you watch it - or watch it again, as I urge you to do - it is interesting to see how the story contrasts the approaches of Robert De Niro’s career criminal Neil McCauley, coolly assessing quantitative data and risks, with Al Pacino’s chaotic police detective, Lt. Vincent Hanna, who relies on instinct to qualitatively analyse interviews with informants, ex-prisoners and eyewitnesses. Taking a step back, you can see how Michael Mann uses both to power his narrative forwards.

    This was the overriding message I took away from the Modern Data for the Critical Decade workshop. The development of the information age has given us access to data on experiences and phenomena in a way unparalleled in human history - an arsenal we can usefully deploy in meeting, and averting, the worst impacts of climate change. Yet the most important question remains how we use these data to construct narratives that depict as accurately as possible how the world is changing, how we can meet those changes, and what is at stake if we fail to act. This will be essential if we are to meet the challenges of this critical decade for climate change. Thankfully, on the evidence of this workshop, there is huge potential to take the knowledge and insight derived from the data and transform it into compelling stories that will be heard and understood, thereby providing a catalyst for real change - and hope for the future.

    This blog post was written by Roland Smith. Roland is a Leverhulme Trust Doctoral Scholar at the University of East Anglia as part of the Critical Decade programme. His research focuses on understanding the impact of climate change on patterns of migration and population displacement.

    Read more about Roland and our other Leverhulme Doctoral Scholars and their research projects on our website. 2023 projects are likely to open for applications in mid-November 2022. If you wish to be notified when projects are live for application, please email criticaldecade.lds@uea.ac.uk.