### Chapter 1: Examining Power
This chapter examines how power operates in and around data science; its main ideas are summarized below.
#### Key Points
- **Understanding Influence**: It is essential to recognize how racism, sexism, and privilege shape oppression.
- **Naming Oppression**: Examining power involves identifying and explaining the oppressive forces ingrained in our daily lives.
#### Domains of Power
Power systems can be configured and experienced across four domains:
1. **Structural**: Laws, policies, and institutional arrangements that organize oppression.
2. **Disciplinary**: Bureaucratic and administrative practices that enforce norms.
3. **Hegemonic**: Culture, media, and the circulation of oppressive ideas.
4. **Interpersonal**: Day-to-day interactions between individuals.
#### Intersectionality
- Dimensions such as gender, race, sexuality, geography, and ability can lead to unjust oppression or unearned privilege across these four domains.
#### Critical Questions
To understand how power unfolds in and around data, consider:
1. **Who is engaged in data science work (and who is not)?**
2. **Whose goals are prioritized in data science (and whose are not)?**
3. **Who benefits from data science (and who is overlooked or harmed)?**
These questions are uncomfortable but necessary: they reveal that certain groups disproportionately benefit from data science while others are disproportionately harmed.
#### Influential Works
- **Algorithms of Oppression** by Safiya Umoja Noble: Shows how search engines and other information systems reproduce gender and racial bias, rooted in data and models created by small, homogeneous groups.
#### The Problem of Dominance
- When data teams are predominantly from dominant groups, their perspectives unduly influence decision-making processes.
#### Privilege Hazard
- This phenomenon occurs when those in the most privileged positions (holding elite educations, respected credentials, and professional accolades) wield disproportionate influence while being the least well-equipped to recognize the oppression that others experience. The hazard compounds in aggregate as it permeates the structural, disciplinary, and hegemonic domains.
#### The Threat of AI
- Social scientist Kate Crawford argues that the biggest threat from AI systems is not their intelligence but their potential to embed sexism, racism, and other forms of discrimination into our digital infrastructure.
#### Case Studies
- **Serena Williams and the Healthcare System**: Illustrates racial bias in healthcare.
- **Joy Buolamwini and Facial Recognition**: The MIT researcher found that commercial facial-analysis systems failed to detect her darker-skinned face, exposing bias in technology; the sketch after these case studies illustrates the underlying audit idea.
- **CloudWalk and Zimbabwe**: A Chinese facial recognition company's deal for access to Zimbabwean citizens' face data raises ethical concerns about data sharing and human rights.
Researchers have shown that images of immigrants, abused children, and deceased individuals have been used to train software without consent, raising serious ethical issues.
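The audit approach behind findings like Buolamwini's boils down to disaggregated evaluation: instead of reporting one aggregate accuracy number, error rates are computed separately for each demographic subgroup. Below is a minimal sketch of that idea in Python, using entirely hypothetical records and group labels for illustration; it is not the methodology of any specific study.

```python
# Minimal sketch of a disaggregated audit: compare error rates
# across subgroups instead of one aggregate accuracy figure.
# All records and group labels below are hypothetical.
from collections import defaultdict

# Each record: (subgroup, true label, predicted label)
predictions = [
    ("lighter-skinned male",   "male",   "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned male",    "male",   "male"),
    ("darker-skinned female",  "female", "male"),   # misclassified
    ("darker-skinned female",  "female", "male"),   # misclassified
    ("darker-skinned female",  "female", "female"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, truth, predicted in predictions:
    totals[group] += 1
    if truth != predicted:
        errors[group] += 1

for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group}: error rate {rate:.0%} ({errors[group]}/{totals[group]})")
```

Here the aggregate error rate is 2/6 (33%), which hides the fact that every error falls on a single subgroup; disaggregating by group makes that concentration visible.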
#### Further Reading
- **Invisible Women** by Caroline Criado Perez: Documents the gender data gap and the biases that result when data default to male subjects.
- **Data2X**: A United Nations Foundation initiative focused on improving the quality and availability of data about women and girls.
#### Current Data Goals
Data science is predominantly used for:
1. **Profit (for a few)**
2. **Surveillance (of the marginalized)**
3. **Efficiency (amidst data scarcity)**
Universities focus on science, governments on surveillance, and corporations on selling.
#### The Goal of Examining Power
The objective is not only to understand power but also to challenge and change it.