Photo by @tiagoaguiar

Impact Transparency and xAI

Explainable Artificial Intelligence (xAI) for a Climate Crisis

Forgetting everything besides human life seems to be the norm, apparently. At least considering the situation we find ourselves in, with both a climate crisis (worsening conditions for life on planet Earth) and what has been termed the sixth extinction (a massive loss of biodiversity).

“Consequently, we believe that it is imperative to develop legislation regarding transparency and accountability of AI, as well as to decide the ethical standards to which AI-based technology should be subjected to.”

Furthermore, referring to an article by Jones N., they say this:

“Advanced AI technology, research, and product design may require massive computational resources only available through large computing centers. These facilities have a very high energy requirement and carbon footprint”

If we talk of impact transparency and xAI together, it becomes both more complicated and more interesting. How can we properly explain the impact AI has on the planet as well as on human beings?

Definition of xAI and Common Issues

Google Cloud says on their website:

  • Info-besity (overload of information)
  • AI systems optimize behavior to satisfy a mathematically specified goal system chosen by the system designers
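The second point above can be made concrete with a toy sketch (hypothetical code, not from any cited source): the "mathematically specified goal" of most machine-learning systems is simply a loss function chosen by the designer, which the training procedure then optimizes — whatever its real-world side effects.

```python
# Minimal sketch: the designer's "goal system" is a loss function,
# here mean squared error, minimized by plain gradient descent.
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Fit y ≈ w * x on data generated by the true relation y = 2x.
xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
w = 0.0
for _ in range(200):
    # Gradient of MSE with respect to the single parameter w.
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= 0.05 * grad  # step toward the mathematically specified goal

print(round(w, 2))  # → 2.0
```

The system "succeeds" the moment the loss is low; anything not encoded in that loss — such as the energy spent computing it — is invisible to the optimization.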

What Different Approaches Have Been Used To Address xAI?

Ways to address this have been:

  • Accumulated local effects (ALE) plots, one- and two-dimensional partial dependence plots, individual conditional expectation (ICE) plots, and decision tree surrogate models.
  • Explainable boosting machines (EBMs).
  • Monotonically constrained gradient boosting machines.
  • Scalable Bayesian rule lists.
  • Super-sparse linear integer models (SLIMs).
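To give a feel for the first family of techniques, here is a minimal sketch of one-dimensional partial dependence in pure Python (the `model` and `data` are hypothetical, and real work would use a library such as scikit-learn): for each grid value of the feature of interest, replace that feature across the whole dataset and average the model's predictions. ICE plots are the per-row versions of the same curves.

```python
def model(x):
    # Hypothetical black-box model whose prediction mixes two features.
    return 3 * x[0] + x[0] * x[1]

data = [[1.0, 0.0], [2.0, 1.0], [3.0, 2.0]]

def partial_dependence(model, data, feature, grid):
    """Average prediction as feature `feature` sweeps over `grid`."""
    curve = []
    for v in grid:
        preds = []
        for row in data:
            row = list(row)
            row[feature] = v           # override the feature of interest
            preds.append(model(row))
        curve.append(sum(preds) / len(preds))  # average over the dataset
    return curve

print(partial_dependence(model, data, feature=0, grid=[0.0, 1.0, 2.0]))
# → [0.0, 4.0, 8.0]
```

The resulting curve shows, on average, how the model's output moves with one feature — exactly the kind of post-hoc explanation these methods offer.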

Could the Climate Crisis Be part of xAI?

Perhaps we have to learn from the different issues that we are facing. What are we trying to explain? At times it seems a symptomatic case of amnesia when those working with AI attempt to explain the actions inherent in the technology being built. Complexity does not absolve xAI of responsibility for the environment; if nothing else, that is certainly a 'black box', its consequences as unknown as those of buying a banana in the store. It would be wrong to blame the industry or field too much, yet it is certainly strange not to see this as part of a good implementation of xAI for people, the planet and the wider ecological footprint.

Written by

AI Policy and Ethics at www.nora.ai. Student at University of Copenhagen MSc in Social Data Science. All views are my own. twitter.com/AlexMoltzau
