Embed Ecological Accountability in AI Governance Now
A Brazilian study has found that ecological accountability must be embedded in AI governance if AI is to contribute to climate justice efforts. The study highlights how Big Tech's use of algorithmic systems obscures the material, energetic, and extractive dimensions of digital infrastructures, thereby reinforcing environmental injustice.
The issue is particularly relevant to the Global South, according to the study, which examined institutional reports, sustainability claims, and advertising campaigns from major tech corporations, including Google, Amazon, and Microsoft, between 2023 and 2025.
Key Findings of the Study
The recent study, titled “Algorithms on Fire: Leadership, Power and Climate Collapse in the Age of AI,” published last month in the Leadership & Organization Development Journal (LODJ), reveals that corporate narratives construct a grammar of ecological denial that conceals the environmental costs of AI and legitimizes unsustainable practices.
It shows that “algorithms are not merely computational tools but discursive-material formations that organize meaning, legitimize unsustainable practices, and reinforce environmental injustice.”
Practical and Social Implications
The findings of the LODJ study have both practical and social implications. Practically, the study encourages tech corporations, developers, and policymakers to embed ecological accountability into AI governance. Understanding how discourse shapes perceptions can help institutions craft more transparent and responsible environmental policies.
The study advocates for a shift from computational efficiency toward an ethics of technological care in AI design, development, and deployment.
Socially, the study positions this shift as a contribution to broader climate justice efforts, with particular relevance to the Global South.
Epistemic Struggle in Climate Change
In line with related research, the LODJ study emphasizes that AI systems and platforms do not merely transmit ecological information; they also configure its meaning, emotional resonance, and political visibility. AI systems and digital platforms have become co-producers of environmental truth, reshaping the conditions under which climate policy, public debate, and democratic decision-making occur.
Regulating Algorithmic Infrastructures
Experts suggest that climate governance must expand its scope to include the regulation of algorithmic infrastructures as part of climate policy. This includes transparency mandates, public-interest design, and accountability mechanisms.
The analysis points out that algorithms shape what becomes thinkable, urgent, and actionable, often in ways that evade democratic scrutiny. It challenges the assumption that leadership resides solely with identifiable actors or institutions, showing how it is increasingly distributed across platforms and infrastructures.
Call for Further Research and Action
To cultivate reflexive capacities in future researchers and leaders, the study recommends embedding critical perspectives on algorithms, power, and communication into climate education and research practice.
The LODJ study is significant as it shifts the debate on AI and climate change from technical efficiency to questions of power, discourse, and environmental justice. It emphasizes the urgent need to integrate ecological accountability into AI governance, research agendas, and curricula.
Overall, the study serves as a foundational intervention in digital climate justice, highlighting the empirical and governance work required to translate critical insights into effective policy action.