Press, PR & Media

Somabotics AAAI Creative Live Interactive Performance 2026 Workshop

On January 26th, researchers, artists and industry practitioners gathered in Singapore for the first Workshop on Creative AI for Live Interactive Performances (CLIP). Held as part of the fortieth Association for the Advancement of Artificial Intelligence (AAAI) conference, the workshop brought together a diverse mix of perspectives from creative practitioners, AI researchers and industry to explore how AI systems can meaningfully participate in live, real-time creative processes.

Generative AI has made huge progress in producing creative outputs such as images, music and text, but live performance remains underexplored. It demands systems that respond in the moment, adapt to human improvisation and collaborate with performers rather than replace them. CLIP was designed to showcase this type of work, bringing together contributions exploring dance, music, visual arts, brain-computer interfaces and embodied interaction.

The morning opened with oral presentations covering a broad range of approaches to creative AI. Papers included DanceChat, which uses large language models to guide music-to-dance generation, and MOVE-ME, exploring AI-assisted dance choreography. TalkSketch demonstrated generative AI for real-time sketch ideation using speech, while another paper presented an autonomous policy debating system, pushing the boundaries of what persuasive AI looks like in a live setting.

The highlight of the morning was Dr Richard Ramchurn’s presentation of the Somabotics Fellowship AI Lens project – a live generative AI camera system that transforms images in real time. Following the presentation, a demonstration of AI Lens took place during which attendees could see and interact with the transformations as they happened, showcasing the possibilities of creative AI.

The afternoon keynote was delivered by Dr Jose Luis Contreras-Vidal, who spoke on the challenges and opportunities for BCI-driven generative AI in the performing arts. His case study of a Balinese Gamelan performance, with video of dancers wearing EEG headsets whose brain signals drove live visualisations, vividly illustrated the potential for brain-computer interfaces to open up entirely new forms of creative expression.

What made CLIP distinctive was its emphasis on interactive demos and discussions. Across two poster and demo sessions, attendees had the opportunity to engage directly with working systems. TalkSketch invited people to try real-time sketch generation combined with speech. The autonomous debating system allowed participants to test its persuasive capabilities, while another demo explored directing smart home devices using explainable AI, offering a unique interactive experience.

Other contributions included TradJockey, a live remixing system for traditional music; SightDog, exploring AI-enhanced guide dogs through creative dialogue; and work on 3D human-centred video generation, visual editing, and energy-based image modelling, together painting a picture of just how broad creative AI has become.

The closing discussion turned to a question that had been present throughout the day: how do we better involve artists in this research? There was a strong sense that the most compelling work at the workshop emerged where technical and creative perspectives came together, and that future events should look to involve artists as collaborators and co-designers. Plans for future events were discussed, with plenty of enthusiasm for continuing to build this community.

The workshop was thoroughly interactive, and the mix of people in the room, including researchers, artists and engineers, meant that conversations were interdisciplinary: exactly the kind of exchange creative AI needs if it is to move beyond technical novelty towards work that is artistically meaningful.

Thank you to the workshop committee, Prof Steve Benford, Dr Alicia Falcon-Caro and Dr Richard Ramchurn, and to all the authors, presenters and attendees who made the day really special.

Accepted papers from the workshop will be published as Springer CCIS proceedings, expected in March 2026. The full programme is available here.

Written by Kieran Woodward
