

Welcome Alastair!

Alastair Howcroft is a PhD student in the School of Computer Science at the University of Nottingham. He recently joined the Somabotics programme in partnership with BLUESKEYE AI, a company developing affective computing technologies with advanced emotional intelligence for healthcare, wellbeing, and social robotics. His supervisory team includes Professor Steve Benford, Professor Michel Valstar (industry supervisor), Dr Maria Elena Giannaccini, and Professor Holly Blake from Health Sciences.

Alastair holds a BSc in Computer Science from the University of Nottingham and is interested in building AI that feels real and emotionally intelligent, exploring how people respond to systems that can talk and listen naturally. His work examines how such interactions can help users feel understood or accompanied, and how emotional connection can shape human–AI relationships.

For his dissertation, Alastair created an immersive Mars survival game, Exploring Mars, featuring an AI companion that guides players through exploration and problem-solving. The character could chat naturally, offer help and encouragement, and show emotion through expressive facial animations, giving the impression of a living, responsive presence within the game. More recently, he published AI Chatbots Versus Human Healthcare Professionals: A Systematic Review and Meta-Analysis of Empathy in Patient Care in the British Medical Bulletin. The review found that people often perceived AI chatbots as empathic in text-based healthcare interactions – and in many cases rated them as more empathic than human healthcare professionals.

Building on these foundations, his PhD brings emotionally intelligent AI into the physical world through robotics. Alastair will design, build, and evaluate social robots that combine natural conversation with responsive touch, aiming to create interactions that people experience as empathic and emotionally supportive. Large language models (LLMs) already generate dialogue that many users find empathic; Alastair's research takes this a step further by embodying such capabilities in robots that can also communicate empathy through touch. By integrating speech, affect sensing, and tactile interaction, his work will explore how embodied AI can help people feel understood and emotionally connected.
