Unsecurities Lab: Art as Environment for Rethinking Security: No1: Charybdis/Abiogenesis_Unknown Incident Response in Deep-Sea Contexts

Jones, Nathan (2025) Unsecurities Lab: Art as Environment for Rethinking Security: No1: Charybdis/Abiogenesis_Unknown Incident Response in Deep-Sea Contexts. [Report]

Text (UnsecuritiesLab1)
UnsecuritiesLab1.pdf - Published Version
Available under License Creative Commons Attribution.

Download (31MB)

Abstract

What is Unsecurities Lab?

Unsecurities Lab is a new platform for exploring how immersive artworks can help us rethink security in an era of global complexity. Developed at Security Lancaster, the Lab brings together researchers, artists, technologists, and policymakers to engage with speculative artworks as if they were real-world events, treating art as a research environment where urgent questions about ecology, intelligence, and resilience can be rehearsed.

What happened in the March 2025 workshop?

In the first Lab, held in Lancaster University’s 180° immersive data suite, over 20 participants from neuroscience, marine biology, defence, cybersecurity, political theory, and the arts encountered two immersive films by artist Joey Holder.

→ In Session One, the film Charybdis was presented as a security incident. Participants applied adapted incident response protocols, revealing how traditional methods break down when faced with unfamiliar, emotionally intense data.
→ In Session Two, groups developed “stabilisation protocols” for fictional marine intelligences introduced in the film Abiogenesis, prompting participants to think like nonhuman entities and to design radically different models of security.

Why does this matter now?

Our security institutions are structurally unprepared for the challenges already emerging: deepfakes that destabilise visual evidence, AI systems that exceed human comprehension, climate disruptions that operate on ecological timescales, and synthetic biology that blurs the boundaries between natural and artificial. Unsecurities Lab reveals three critical failures in current security practice:

1. Emotional disruption breaks expert analysis: when incidents are genuinely unprecedented and emotionally destabilising, traditional frameworks fail.
2. Disciplinary silos cannot process complex threats: cyber-physical-biological challenges require sustained interdisciplinary collaboration.
3. Human-centred models are inadequate for nonhuman actors: AI, ecological systems, and synthetic life require new forms of negotiation and coexistence.

What could change?

The findings point toward potential institutional reforms:

Crisis Training Revolution: incident response should prepare analysts for scenarios involving unreliable visual evidence, emotional disorientation, and threats that do not fit existing categories.

Interdisciplinary Environments: security institutions can establish standing teams that bring together technical experts, social scientists, ecologists, and creative practitioners as core operational capacity.

Post-Human Governance: consider frameworks for engaging with autonomous systems, ecological actors, and synthetic intelligences that do not conform to human assumptions about agency and negotiation.

Institutional Adaptation: organisations could use creative and immersive mechanisms to explore functioning effectively under conditions of fundamental uncertainty, when the nature of the threat itself is unclear but intuition and ‘hunches’ are amplified.

What’s next?

→ A second Lab will run in July 2025, centred on the speculative film LUMI by Abelardo Gil-Fournier and Jussi Parikka.
→ A co-authored discussion paper is in development, drawing on transcripts and participant responses.
→ Future Labs will deepen the method, strengthen cross-sector partnerships, and develop new frameworks for embedding art into security research, strategy, and policy.

Item Type:
Report
Subjects:
security; immersive art; artistic research; immersive
ID Code:
229980
Deposited By:
Deposited On:
11 Jun 2025 13:55
Refereed?:
No
Published?:
Published
Last Modified:
15 Jun 2025 23:11