published on 22.10.25
For over a decade, this international workshop has brought together a community of scholars engaged in critical and alternative approaches to governance, challenging 20th-century paradigms centered on efficiency, growth, and profit maximization. Today, the stakes are higher than ever. Political instability, armed conflicts, and the rapid spread of artificial intelligence across economic, political, and organizational domains have transformed the landscape of governance. From geopolitical risks and algorithmic decision-making to automated management, surveillance infrastructures, and generative systems, the question is no longer whether governance matters but what kind of governance is still possible.
Governance has often been presented as neutral—a technical system detached from politics. History demonstrates the opposite. From colonial enterprises to contemporary wars, corporate power has been closely intertwined with violence, domination, and profit extraction, generating both social unrest and ecological devastation. Denying the political nature of governance is, paradoxically, its most political act: it normalizes destruction as inevitable and safeguards entrenched structures of inequality.
We argue that governance now faces a “double or nothing” moment. Either we collectively rethink governance through the lens of algorithmic systems, or we allow governance itself to be rewritten by them. The rise of AI raises profound and urgent questions: Who governs the governors of algorithms? What forms of accountability are possible in a world where agency is increasingly distributed, opaque, and machine-led?
This workshop calls for a fundamental transformation in governance logic – one that embraces alternative epistemologies and places life, rather than profit, at the core of decision-making. We seek contributions that critically examine how AI and sustainability intersect with governance structures, offering pathways toward more moral, glocal, interconnected, and reflective approaches.
We invite research papers, theoretical essays, empirical studies, and practitioner and activist papers that advance critical and alternative thinking on governance at the intersection of AI and sustainability. We particularly welcome contributions that challenge Western-centric frameworks and propose transformative alternatives rooted in diverse epistemological traditions, while also addressing the urgent organizational and institutional dimensions of AI governance.
We encourage critical investigations addressing the challenges of governance with, by, and through AI—that is, how algorithmic systems govern and are governed. We invite contributions under several interrelated themes:
1. Epistemologies and Ontologies
2. Environmental and Material Dimensions
3. Corporate Power and Institutional Capture
4. Agency, Accountability, and Justice
5. Alternatives, Resistance, and Transformation
We encourage submissions in diverse formats that challenge conventional academic boundaries:
Format: Papers should embrace interdisciplinary approaches, welcoming contributions from diverse methodological traditions including ethnography, participatory action research, speculative design, and activist scholarship.
Target audience: PhD students, established scholars, independent researchers, activists, and practitioners working across governance, technology, sustainability, organizational theory, and critical management studies.
Submission deadline: April 5, 2026
Notification of acceptance: April 30, 2026
Workshop dates: June 22–23, 2026
Location: ISC Paris
Selected papers and essays will be considered for publication in an edited volume.
Contact: [email protected]
For further information, please contact: [email protected]
Submissions should be sent to: [email protected]