
Predicting Floods & Drought with AI

Using LSTMs for Climate Change Assessment Studies

One thing is certain: the Coronavirus is not the only crisis we are facing. Although it is an important one, we also have to deal with water to an increasing degree: too much water in the wrong place (floods, heavy rain) or too little water (drought, water scarcity). These changes in our environment, brought on in part by the drastic increase in carbon emissions, are affecting occurrences of floods and droughts around the globe.

This article focuses on a paper written for NeurIPS 2019 called Using LSTMs for climate change assessment studies on droughts and floods.

Why was this paper written? The introduction mentions a few reasons:

  1. Occurrences of floods and droughts are drastically changing.
  2. Predicting climate impacts over individual watersheds is difficult.
  3. Floods and droughts affect more people than any other type of weather-related natural hazard.

The work presented is a large-scale LSTM-based modelling approach.

What is LSTM?

LSTM stands for Long Short-Term Memory, a recurrent neural network architecture designed to learn long-term dependencies in sequential data, such as daily meteorological time series. The authors attempt to have a single model learn a diversity of hydrological behaviours.
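As a rough sketch (not the paper's implementation), a single LSTM step can be written in plain NumPy; the gate names and shapes below follow the standard formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step.
    x: input at time t; h, c: previous hidden and cell state.
    W, U, b hold the parameters of all four gates stacked together."""
    n = h.shape[0]
    z = W @ x + U @ h + b          # pre-activations for all four gates
    i = sigmoid(z[:n])             # input gate: how much new info to write
    f = sigmoid(z[n:2*n])          # forget gate: how much old state to keep
    o = sigmoid(z[2*n:3*n])        # output gate: how much state to expose
    g = np.tanh(z[3*n:])           # candidate cell update
    c_new = f * c + i * g          # long-term memory update
    h_new = o * np.tanh(c_new)     # new hidden state
    return h_new, c_new
```

The cell state `c` is what lets the network carry information over long horizons, which is why LSTMs suit rainfall-runoff series spanning years.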

They mention that the most common current modelling strategy is to calibrate models for individual catchments against historical records.

“A catchment is an area where water is collected by the natural landscape.”

According to the paper, this neglects catchment characteristics that are not modelled correctly.

They refer to Kratzert et al., who built an LSTM with:

  • a modified input gate, trained on meteorological time series data from hundreds of riverine systems, where static catchment characteristics are used to condition the model for a specific site.
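A hypothetical sketch of that conditioning idea, with all names and shapes my own assumptions: the static catchment attributes, rather than the dynamic meteorological inputs, drive the input gate, so the gate decides per catchment how strongly each update is written into memory.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conditioned_lstm_step(x_d, x_s, h, c, params):
    """LSTM step whose input gate is computed from static catchment
    attributes x_s instead of the dynamic meteorological input x_d
    (a sketch of the modified-input-gate conditioning described)."""
    Wi, bi, W, U, b = params
    n = h.shape[0]
    i = sigmoid(Wi @ x_s + bi)     # input gate from static features only
    z = W @ x_d + U @ h + b        # remaining gates from dynamic inputs
    f = sigmoid(z[:n])             # forget gate
    o = sigmoid(z[n:2*n])          # output gate
    g = np.tanh(z[2*n:])           # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Because `x_s` is constant per catchment, the input gate becomes a fixed, site-specific mask, which is one way a single model can specialize to hundreds of different basins.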

According to the article, Prediction in Ungauged Basins was the decadal problem of the International Association of Hydrological Sciences from 2003–2012.

“…there exists a proof-of-concept that deep learning can transfer information about hydrologic processes and behaviors between basins, time and unobserved locations.”

Data Used

  • Models were trained on data from 531 basins of the freely available CAMELS data set.

They used the method of Morris to investigate which catchment characteristics influence droughts and floods:

  1. They calculated the gradients of simulated streamflow w.r.t. the static inputs xs at each day of the simulation.
  2. They averaged the absolute gradients separately for each static input feature (catchment characteristics and climate indexes) over the low- and high-flow periods.
  3. The averaged values were normalized to [0,1] separately in each basin [17], so that the features could be ranked according to their relative influence.
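Steps 2 and 3 can be sketched as follows, assuming the daily gradients from step 1 have already been computed; the array names and period masks are illustrative:

```python
import numpy as np

def rank_static_features(daily_grads, low_flow_mask, high_flow_mask):
    """daily_grads: (n_days, n_features) gradients of simulated streamflow
    w.r.t. the static input features on each day of the simulation.
    Returns per-feature influence scores in [0, 1] for the low- and
    high-flow periods of a single basin."""
    scores = {}
    for period, mask in [("low", low_flow_mask), ("high", high_flow_mask)]:
        # step 2: average the absolute gradients over the selected period
        mean_abs = np.abs(daily_grads[mask]).mean(axis=0)
        # step 3: normalize to [0, 1] within the basin so features can
        # be ranked by relative influence
        lo, hi = mean_abs.min(), mean_abs.max()
        scores[period] = (mean_abs - lo) / (hi - lo)
    return scores
```

The most influential features for, say, high-flow periods could then be read off with `np.argsort(scores["high"])[::-1]`.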

Opening up the possibilities for large-scale impact assessment

They argue that, in the current approach, basin and climate characteristics are derived once for the entire data period.

However: “…the model structure allows for dynamic input features (e.g., dynamic climate and vegetation indexes, or dynamic anthropogenic demand indexes).”

Further, they argue that this flexibility opens up possibilities for large-scale climate impact assessment.

This is #500daysofAI and you are reading article 290. I am writing one new article about or related to artificial intelligence every day for 500 days. My current focus for days 200–300 is national and international strategies for artificial intelligence. I have decided to spend the last 25 days of my AI strategy writing focusing on the climate crisis.

AI Policy and Ethics at www.nora.ai. Student at University of Copenhagen MSc in Social Data Science. All views are my own. twitter.com/AlexMoltzau