
What are the Obstacles to Computational Social Science?

Exploring a recent paper by authors from Germany, the US and the UK

Alex Moltzau
3 min read · Sep 11, 2020


A group of authors from Germany, the US and the UK wrote a short article about obstacles and opportunities in computational social science, released on 28 August 2020. I thought I would write a short post exploring the main arguments of the article, titled “Computational social science: Obstacles and opportunities.”

The article notes the growth of computational social science (CSS). The authors argue that many institutions fall short, especially on ethics, pedagogy, and data infrastructure.

They suggest opportunities to address these issues.

They define computational social science (CSS) as:

“…the development and application of computational methods to complex, typically large-scale, human (sometimes simulated) behavioral data (1). Its intellectual antecedents include research on spatial data, social networks, and human coding of text and images.”

Different people have gathered under this umbrella term.

They argue that incentives and structures are poorly aligned for this multidisciplinary endeavour.

Within this context, integrating computational training into social science, and social science into computational disciplines, has been slow.

Collaboration is often not encouraged, and there are few mechanisms to bring the fields together.

  • The United Kingdom’s Research Excellence Framework, for example, allocates funding within disciplines. In that sense, multidisciplinary approaches are less rewarded.

Computational research infrastructures often cannot support analysis of large-scale, sensitive data sets.

There have been partnerships with governments in relation to data.

They stress the importance of innovation in differential privacy.
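As an aside for readers new to the term: differential privacy works by adding calibrated random noise to query results, so that the output reveals almost nothing about any single individual in the data set. Below is a minimal sketch of the classic Laplace mechanism in Python. The function names and the count query are my own illustration, not taken from the paper, and a real deployment would use a vetted library rather than hand-rolled sampling.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample from a zero-mean Laplace distribution via inverse-CDF sampling."""
    u = random.random() - 0.5          # uniform in (-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(records: list, epsilon: float) -> float:
    """Epsilon-differentially-private count of records.

    A count changes by at most 1 when one person is added or removed
    (sensitivity 1), so Laplace noise with scale 1/epsilon suffices
    for epsilon-differential privacy.
    """
    return len(records) + laplace_noise(1.0 / epsilon)

# Example: release a noisy count of 100 survey responses.
noisy_count = dp_count(list(range(100)), epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the innovation the authors call for is largely about making mechanisms like this practical on shared research infrastructure.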

In addition there is a contrast between public and private when it comes to datasets.

“Public accountability inherent in sharing data is likely seen as a positive for the relevant stakeholders for government agencies, but generally, far less so for shareholders for private companies.”

They find it hard to see collaboration with market research companies as the way forward.

The authors sketch two broad concerns.

  1. Many platforms have been cutting back the data that can be pulled from them (such as in response to GDPR). Yet the authors argue this can shut down potential research.
  2. Data generated by consumer products and platforms are imperfectly suited for research purposes. Users online may be unrepresentative of the general population (and behaviour biased in unknown ways).

The platforms were often not designed to answer research questions, so information of high relevance to researchers may be missing.

“The design, features, data recording, and data access strategy of platforms may change at any time because platform owners are not incentivized to maintain instrumentation consistency for the benefit of research.”

According to the authors, there has been a failure to establish adequate ‘rules of the road’. Few universities provide guidance on containing and managing sensitive data. There are also challenges around consent.

They made a list of incentives and rules that I found valuable to repost:

Resources and rules, incentives and innovations

Strengthen collaboration

  • Develop enforceable guidelines in collaborations with industry around research ethics, transparency, researcher autonomy, and replicability.
  • Develop secure data centers supplemented by an administrative infrastructure for granting access, monitoring outputs, and enforcing privacy and ethics rules.

New data infrastructures

  • Develop large-scale, secure, privacy-preserving, shared infrastructures driven by citizen contributions of time and/or data. Capture the metadata that describe the collection process.
  • Develop infrastructure to capture the dynamic, algorithm-driven behaviour of the major platforms over time.
  • Promote legal frameworks that allow and mandate ethical data access and collection about individuals and rigorous auditing of platforms.

Ethical, legal, and social implications

  • Professional associations should help develop ethical guidelines.
  • Large investments are needed to develop regulatory frameworks and ethical guidance for researchers.

Reorganize the university

  • Develop structures that connect researchers having shared interests in computational approaches.
  • Fundamentally reconceive graduate and undergraduate curricula.
  • Reward collaboration across silos.
  • Appoint faculty with multi-unit affiliations.
  • Physically collocate faculty from different fields.
  • Allocate internal funding to support multidisciplinary collaboration.
  • Empower and enforce ethical research practices, e.g., centrally coordinated, secure data infrastructures.

I think these recommendations are worth checking out, and worth considering what improvements you can make where you are.

What do you think?

This is #500daysofAI and you are reading article 465. I am writing one new article about or related to artificial intelligence every day for 500 days.


Alex Moltzau

AI Policy, Governance, Ethics and International Partnerships at www.nora.ai. All views are my own. twitter.com/AlexMoltzau