A Low-Cost and Accessible Digital Twinning System for Virtual Reality and Embedded Devices
Dr. Duke Gledhill
Dr. Duke Gledhill is a Senior Lecturer in Digital Media at the University of Huddersfield, specialising in computer vision, panoramic imaging, and human-computer interaction. He earned his PhD in 3D panoramic imaging from the University of Huddersfield in 2009. His research contributions include a widely cited review of panoramic imaging techniques and pioneering work in virtual and augmented reality applications, with an emphasis on usability and accessibility. With over two decades of academic experience, Dr. Gledhill has published extensively in visual computing and currently leads games development courses, integrating cutting-edge research insights into his teaching.
Abstract
Digital twinning bridges the gap between virtual environments and real-world systems, yet many implementations demand significant technical expertise and financial investment. This paper presents a simple, low-barrier digital twinning system that integrates Unreal Engine 5 (UE5) in virtual reality (VR) with a Raspberry Pi, enabling real-time interaction with physical hardware. The system utilises a lightweight Python server on the Raspberry Pi to receive plain-text commands from a VR application and translate them into physical actions. By leveraging free Unreal Engine plugins and open-source software, this approach provides an affordable and accessible framework for researchers, educators, and hobbyists looking to explore digital twinning without requiring extensive networking or security expertise.
A proof-of-concept implementation is demonstrated using an RGB LED Christmas tree, which responds dynamically to VR inputs. The architecture prioritises ease of use and cost-effectiveness over advanced security measures, making it ideal for prototyping and experimentation. The entire system, including UE5 integration scripts and server code, will be made available as an open-source resource on GitHub to encourage further development and customisation.
This work contributes to the field of virtual worlds and digital twins by offering a practical, reproducible example of a low-cost, real-world interaction system. It serves as a foundation for future research into more robust and scalable digital twin applications while remaining accessible to those with limited programming or networking experience.
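The abstract above describes a lightweight Python server on the Raspberry Pi that receives plain-text commands from the VR application and translates them into physical actions. As a rough illustration of that architecture (not the authors' published code, which is to appear on GitHub), the sketch below shows one way such a server could look; the port number, command grammar, and LED command are all hypothetical assumptions.

```python
# Minimal sketch of the kind of plain-text command server the paper
# describes. All specifics (port, command names) are illustrative
# assumptions, not the published implementation.
import socket

HOST, PORT = "0.0.0.0", 5050  # hypothetical port for the VR client


def handle_command(command: str) -> str:
    """Translate a plain-text command into a (stubbed) physical action."""
    parts = command.strip().lower().split()
    if not parts:
        return "ERR empty command"
    if parts[0] == "led" and len(parts) == 3:
        index, colour = parts[1], parts[2]
        # On a real Raspberry Pi this would drive GPIO / the LED tree.
        return f"OK led {index} -> {colour}"
    return f"ERR unknown command: {parts[0]}"


def serve():
    """Accept one connection at a time and answer each command."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        while True:
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)
                if data:
                    reply = handle_command(data.decode("utf-8"))
                    conn.sendall(reply.encode("utf-8"))


if __name__ == "__main__":
    serve()
```

In UE5, the VR application would open a TCP connection (e.g. via a free socket plugin, as the abstract suggests) and send strings such as `led 3 red`; keeping the protocol to plain text is what lets the system remain approachable for users without networking expertise.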
Augmented Reality Storytelling as a Catalyst for Mental Health Education: Encouraging Open Dialogue and Emotional Coping in Children
Melissa Ann Marie James and Delas Santano
Melissa Ann Marie James is a Lecturer at the School of Computing and Creative Media, UOW Malaysia. She began her academic career while pursuing her master’s degree at Sunway University, where she served as a Teaching Assistant and a volunteer Field Assistant at the Centre for Research–Creation in Digital Media. She contributed to significant projects, including Transmedia Exhibition 2, which showcased innovative mixed media works from faculty and external collaborators, and The Malaysian Textile Museum Project, which involved digitising treasured Malay textiles and accessories. Before transitioning into academia, she accumulated six years of professional experience in advertising, public relations, and media agencies. As a creative designer and art director, she specialised in branding, advertising, and digital media production. Now, she integrates her industry expertise into her research and teaching, focusing on advocating for mental health awareness in Malaysia, where cultural stigmas often hinder open discourse on the subject.
Delas Santano is a Senior Lecturer at the Department of Art, Design & Media, Sunway University. He also lectures on the Digital Film Production course in the School of Arts. Prior to Sunway, he was at the Centre for Creative Content and Digital Innovation at the University of Malaya, where he was involved in the Transmedia and Polysensory Exhibition, which showcased notable exhibits such as Mah Meri Unmasked and Textile Tales of Pua Kumbu. His short documentary The Traditional Boatbuilder of Pangkor Island, produced in 2016, was screened at the 2016 Canada-China Film Festival in Montreal, where it was nominated for Best Documentary. Besides a wealth of experience in video content production, design and video editing, he is actively involved in research, curriculum development and teaching (notably Creative Multimedia at several private colleges in Malaysia). His main research area is audiovisual production, particularly on culture and heritage, and video production combined with digital humanities.
Abstract
Mental health remains a sensitive and stigmatised issue in Malaysia, often discouraging individuals from seeking professional help. This is particularly concerning for children and adolescents, whose mental health needs are frequently overlooked. Early intervention is crucial, as unaddressed emotional and psychological struggles in childhood can lead to severe mental health issues in adulthood. It is therefore vital to prioritise mental health education and accessible intervention strategies from a young age to foster resilience and emotional well-being. In today's digital world, emerging technologies such as Augmented Reality (AR) offer new routes to innovation and accessibility. This paper explores the use of AR storytelling as a catalyst for mental health education, highlighting its potential to foster open dialogue and support emotional coping in children.
This research explores the potential of AR storytelling to enhance mental health education for children in Malaysia by making it more engaging and accessible. It focuses on the planning and design process required to develop a relevant AR storytelling experience. Drawing on interviews with clinical psychologists, an in-depth review of children’s mental health literature, and established design principles, this research examines how narrative techniques such as storyline development, character design, and thematic elements, combined with open reading techniques, can help address childhood anxiety. By merging traditional mental health insights with AR technology, the findings highlight the potential of digital interventions to foster open conversations and emotional resilience. The insights from this study will guide the development of an AR prototype in the next phase of research.
Showcasing extended reality applications in the policing and security domains: insights, innovations and challenges from research and real-world initiatives
Arthur Jones, James Fenwick, Steffi Davey, and Paul Hancock
Arthur Jones: With a BSc in Social Computing and an MSc by Research in Computer Science from the University of Lincoln, and a background in the use of VR for exposure to confrontation, Arthur has an interest in the utilisation of Virtual Reality for policing, security and defence applications. At CENTRIC, Arthur has led on the CoLabXR and CyberSpotlightVR projects and contributed to several EU projects, including INFINITY, WELCOME, GRACE, POLIIICE, TESTUDO and VANGUARD. Arthur has presented at NATO's Modelling and Simulation Group MSG-210 in Rome, and at the Immersive Learning Research Network’s Conference in Glasgow.
James Fenwick: James is a Research Fellow and the Serious Games Lead at CENTRIC. He holds a BSc in Games Design and has a decade of experience within research organisations, translating theory into applied design. James has a leading role in multiple UK- and EU-funded projects, predominantly focused on VR and experiential learning through the use of games. His projects have covered a broad range of domains, including command and control, data visualisation, tracing cryptocurrency, and training in decryption techniques and best practices.
Steffi Davey: After graduating with a Bachelor of Science in Interactive Media with Animation, Steffi worked in industry as an e-learning developer before beginning her career at CENTRIC. During her time there she has completed a Postgraduate Certificate in Psychology, to deepen her knowledge of the psychology of learning and of design for enhanced usability and user experience. Her work at CENTRIC involves designing serious games and interactive applications for enhanced learning experiences. She has extensive experience in 3D asset and environment creation, contributing significantly to the asset creation of more than 20 projects.
Paul Hancock: Paul completed an MSc in Entertainment Software Development after a BSc in Computing, followed by over 10 years working as a programmer in the games industry before starting a career in research at CENTRIC. His work at CENTRIC has included being lead developer on a serious game developed in collaboration with Europol, which was awarded a European Commission Security Innovation Award. He also led development of an AR application for the EU Horizon 2020-funded CREST project and has worked as a developer on many other research projects, both domestically and internationally.
Abstract
The increasing accessibility of extended reality (XR) technologies has prompted growing interest in their potential applications within the security and policing domains. Exploring the feasibility of immersive technology in crime prevention and counterterrorism has become an increasingly prevalent area of research. From simulation-based training and educational applications to enhanced situational awareness and remote collaboration, the adoption of XR in these domains is expanding. This talk provides insight into a number of projects utilising XR in the fight against crime and terrorism, offering a high-level overview of the CoLabXR, CREST, VANGUARD, TESTUDO, Cyber Spotlight VR and INFINITY projects. These case studies showcase applications of XR technologies for policing and security, and how immersive technologies can be utilised for good: from completed and ongoing proofs of concept to applications used in real-world crime prevention initiatives by practitioners. We also present common challenges faced, alongside our best practices for developing XR applications for Law Enforcement Agencies.
Exploring Quill VR: Pushing the Boundaries of Animation Production in Immersive Formats
Dr Monireh Astani
Dr Monireh Astani is a researcher and practitioner in animation, creative, and immersive media, with a PhD in Media Studies. With over a decade of professional experience in animation and media production, Monireh is now a Senior Lecturer in the Department of Creative Industries at the University of Staffordshire, leading the Animation Program’s 2D pathway. Monireh’s research and practice interests include the application of art and technology for education and mental and physical well-being, immersive and interactive education and storytelling, employability in animation and creative industries, cultural and creative practices, animation production, and experimental media arts.
Abstract
Quill is a virtual reality application that enables artists to create and animate painterly visual concepts within a three-dimensional space (Quill by Smoothstep, n.d.). Released in 2016 by the Oculus Story Studio team (Quill, n.d.), Quill was initially developed as an innovative tool for producing one of the studio’s acclaimed virtual reality shorts, Dear Angelica (2017). Since its release, Quill has been used extensively to produce immersive animations by award-winning studios and indie teams such as Studio Syro, Baobab, Lavamachine, and Reimagined VR. This study explores the Quill VR application and the works of these studios, alongside interviews with leading VR artists, to understand Quill’s features and construct a production pipeline centred around the tool. By examining this innovative application and its workflows, this research aims to demonstrate how Quill pushes the boundaries of screen-based animation into an immersive format.
Beyond Polygons: Rethinking VR Art Asset Pipelines with Gaussian Splatting
Matthew Novak
Matthew Novak is a Senior Lecturer in Games Art at University of Staffordshire, specializing in digital capture techniques and recreation of real-world 3D assets. His research focuses on optimizing photogrammetry data for virtual environments, enhancing the integration of high-fidelity assets into immersive applications. Matthew has extensive experience in games art development and game education.
Abstract
High-fidelity 3D assets, whether captured via photogrammetry or created synthetically, can greatly enhance the immersion of VR games and virtual worlds. However, such detailed models often exceed the rendering budgets of current VR hardware, especially standalone low-powered VR headsets, necessitating aggressive optimization by artists. Traditional approaches, such as mesh decimation, normal map baking, texture optimization, and Level of Detail (LOD) systems, help maintain frame rates but are time-consuming and, without the skills of experienced 3D and game artists, often significantly degrade visual fidelity.
Building upon the foundation of artist-driven methodology for optimizing photogrammetric data for low-power VR, we explore Gaussian Splatting as a novel approach that further streamlines asset optimization by representing models as volumetric Gaussian splats instead of conventional polygons. This technique yields a more compact data representation than dense meshes, significantly reducing storage and computational demands while enabling near real-time rendering of complex scenes on lower-end VR hardware.
Importantly, Gaussian Splatting integrates seamlessly into existing art workflows. Artists can input high-detail photogrammetric scans, digital sculpts, or complex 3D models without requiring manual retopology or LOD creation, as the technique inherently adjusts levels of detail dynamically. While Gaussian Splatting is an emerging technology too new for comprehensive VR benchmarking, early demonstrations show it can handle scenes with millions of splats with minimal performance loss.
Gaussian Splatting bridges the gap between high-end offline 3D rendering and real-time VR performance, offering a transformative approach to asset optimization. By streamlining workflows, it potentially enables developers and 3D artists to create and deploy photorealistic assets on mobile VR platforms without compromising detail or frame rate. Gaussian Splatting presents a promising solution for achieving high visual fidelity within the constraints of current and future devices.
The Poetics of Place in the Age of Virtual Production: Preserving Embodied Experience in Digital Filmmaking
Linda Curtin
Linda Curtin is an award-winning filmmaker with an understanding of media production that comes from practical experience combined with technical and academic qualifications. Her films have been funded by bodies such as CREATE and the Arts Council of Ireland, and have been broadcast on TV and exhibited internationally in galleries and festivals. She has worked with a variety of communities, including kids from underprivileged and traveller backgrounds, migrants, older people and differently-abled groups. Linda sits on the board of two film organisations, has a successful funding track record and is currently exploring creative VR design. She is passionate about democratising access to new technologies and has brought over 3000 people on free learning journeys in XR. She is also a PhD Researcher with Ulster University.
Abstract
This presentation examines the transition of the "poetics of place" from location-based filmmaking into emerging extended reality (XR) and virtual production (VP) environments. Drawing on phenomenological frameworks from Merleau-Ponty, Heidegger, and Ingold's concept of the "meshwork," I investigate how the sensory, affective, and experiential qualities that transform settings into resonant cinematic landscapes can be preserved in increasingly digital workflows.
Through a dual methodological approach combining qualitative interviews with film practitioners (Location Scouts, Production Designers, Cinematographers, Directors, and World Builders) and autoethnographic reflection on my own creative practice, this research explores the tension between physical authenticity and digital representation. As exemplified in Werner Herzog's embodied filmmaking approach, the landscape functions not merely as a backdrop but as an active participant that shapes meaning through its material resistance to human intention.
This research addresses the fundamental challenge facing contemporary filmmaking: how to maintain the tactile, sensory connection to place as production environments become more virtual. By examining the evolution from traditional embodied practices to real-time technologies, I propose frameworks for translating the genius loci or "spirit of place" into digital worlds without sacrificing the atmospheric and experiential dimensions that give cinematic landscapes their emotional resonance.
Virtual(ly) Music? Exploring a Hybrid Local-Virtual Environment for Collaborative Music Making
Chris Payne, Mark Edwards, and Mat Dalgleish
Dr Mat Dalgleish studied with accessible-instruments pioneer Rolf Gehlhaar before completing a PhD in digital musical instrument design. After working at The Open University Music Computing Lab, Mat joined the University of Wolverhampton, where he designed and led its MSc Audio Technology (2013-2023) and Masters courses in the Performing Arts (2019-2023). Mat is currently a Senior Lecturer in Game Audio and Technical Design at Staffordshire University. He is also a trustee of The OHMI Trust and a one-handed musician.
Dr Chris Payne spent six years working as a full-time dance music producer and DJ in the Hard House scene in the early 2000s. Since then, Chris has worked for 15 years in the further education and higher education sectors, mostly as a music technology lecturer and more recently as a games design lecturer. He has experience of teaching, course leading and writing programmes, completed a PhD in music computing for education in 2023, and is currently a Lecturer in Game Design at Staffordshire University.
Abstract
The earliest musical instruments date back tens of thousands of years but, until recently, there have been no notable instruments designed for use by multiple (simultaneous) players. Moreover, only a few instruments have been found to usefully support multiplayer usage, most notably the piano. In contrast, multiplayer gameplay has been an integral part of video games since their earliest days and, although online multiplayer titles have come to dominate the industry landscape, local multiplayer games still have a sizeable (niche) audience.
TYG 2.0 blends elements of a digital musical instrument (DMI) with a virtual environment. Building on an earlier prototype and created in Unreal Engine 5.4, it enables three local players to explore two distinct modes: one intended for compositional activity and the other for real-time performance activity. Although each player has an individual presence in the virtual environment in both modes, players must communicate locally to coordinate their efforts. Thus, social dynamics are heavily entangled with the ‘instrument-environment’, and invisibly but inevitably shape the collective musical output. As interested in these human dynamics as in the system’s musical output, and in process as much as product, we discuss the potential ‘game’ and ‘play’ aspects of TYG. To do so, we draw on Huizinga’s concept of the magic circle, where locations of play are determined as a list of designated ‘playgrounds.’ Furthermore, we investigate TYG’s qualities as a ‘frame,’ defined by Tekinbaş and Zimmerman as “responsible not only for the unusual relationship between a game and the outside world, but also for many of the internal mechanisms and experiences of a game in play.” We are particularly interested in the notion that the boundary of the magic circle is permeable, allowing for an exchange between TYG and the world outside its frame. Lastly, we discuss a variety of emergent issues around human-technology and human-human interactions and outline paths for future development.
Spaces of Meaning: Recreating Physical Environments in Virtual Worlds to Design Immersive and Engaging User Experiences
Callum Lee Roberts
Callum Roberts is an Associate Lecturer at The University of Staffordshire Games Institute. Callum's PhD is centred around exploring how physical locations can be recreated and utilised within virtual worlds. Callum's research also delves into what benefits virtual worlds can have for enhancing education, training and architectural preservation.
Abstract
This research explores the effectiveness of recreating physical locations within virtual worlds as meaningful spaces of engagement. Using qualitative and quantitative research methods, this study aims to contribute to the understanding of virtual worlds as transformative, meaningful digital social spaces.
Recreating physical environments virtually has great potential across a range of applications, including education, training, and architectural preservation, and can provide an excellent alternative to, and improvement on, traditional methods of engagement. In this proposal, I set out to discover the value of this cutting-edge technology, specifically the use of virtual world-building tools such as Unreal Editor for Fortnite (UEFN). These tools enable the creation of immersive and accessible environments that closely replicate real-world spaces, enhancing user engagement and interaction.
A core data source for this research will be StaffsVerse, a virtual campus designed as a faithful recreation of the University of Staffordshire. StaffsVerse builds on legacy research into virtual spaces for student engagement. This modern interpretation of the virtual campus provides an excellent opportunity to gather high-quality research data in the form of playtesting, interviews, and engagement tracking, thanks to its ease of access and playability for both current and prospective students, and the accessibility of its host platform, Fortnite.
This research will contribute valuable insights into developing effective frameworks for enhancing user engagement and accessibility within virtual environments.
VR Delusion vs Reality
Claire Blackshaw
Claire Blackshaw is the Technical Director at Flammable Penguins Games, currently working on an unannounced title. Based in London with her wife and cats, she's passionate about building a new VR Studio. Claire's career spans back to the PS2 era, bringing a wealth of experience in game development, particularly in networking, motion controllers, R&D, and Virtual Reality. Her strong technical background in networking and platform R&D is complemented by design and consulting work.
With previous roles at industry giants like Adobe and Sony, as well as a variety of smaller studios, Claire brings a unique perspective to game engine technology. Her passion for pushing the boundaries of what's possible in interactive entertainment drives her to continuously explore and evaluate new tools and methodologies in game development.
Abstract
Building a VR studio now... bit of a punt, isn't it? Quitting a proper job to gamble on XR might seem a touch mad, but veteran Claire Blackshaw reckons there's something to it. Forget all the hype versus reality chat – let's actually have a proper look at the VR market as it is, separate the wishful thinking from genuine opportunity, and see what's what.
There's a bit of a gap, isn't there, between some developers' ideas and the actual VR space. Drawing on her journey from PlayStation VR, via Adobe's Substance Modeller and tangles with the likes of Apple, Pico, and Meta, Claire will take us through the XR landscape. We'll move past the usual surface-level stuff to get a handle on why things might just be lining up to make VR, well, interesting.
Let's have a peek at the often-underestimated potential of VR, not just in games, but beyond. And maybe have a bit of a chat about why, frankly, VR isn't for everyone developer-wise. Walk away with a clear-eyed view of the XR future, and perhaps a few practical thoughts for building a studio in this ever-evolving, rather bonkers medium. Is it really time? Let's have a think.
Invisible Wars: Paths of Resistance in a Society of Control within Virtual Worlds
Ivan Phillips and Maria Paszkot
Ivan is the Technical Specialist (Games) and Early Career Scholar at the University of Staffordshire, London campus. His day job involves the maintenance, teaching and development of new gaming technologies. He also works within games production as a narrative designer, writer and tech lead on various independent projects.
Maria is an artist and independent researcher with a speciality in investigative journalism and Eastern European Studies.
Abstract
Technology has launched the 21st century into a proto-cyberpunk reality. This age is marked by advanced technologies such as total surveillance apparatuses and autonomous drones controlled by private corporations obsessed with the fastest and easiest ways of generating capital. This research paper will examine the ‘Invisible Wars’ that are waged in order to promote this scenario. Furthermore, we discuss how advanced technologies can be perceived through a Deleuzian lens and the concept of ‘Societies of Control’. Through the analysis of depictions of technologically advanced warfare in video game media, with discussions of “Military Shooter” genre games, we reveal how these representations manufacture consent and contribute to the fragmentation of individuals into "dividuals."
We argue that virtual worlds, which are often used to perpetuate these mechanisms of control, can also serve as platforms for resistance (with discussion of game media such as Death Stranding, and specific examples of the “Serious Games” genre). In addition we propose that "Serious Game" production can be utilised as a subversive tool of resistance, fostering critical engagement with technological advancements and challenging the moral justifications for their existence.
Through our research, we advocate for responsible and subversive depictions in games media that avoid valorising these technologies and seek to speak about, rather than for, impacted communities. Our objective is to encourage the players of our experiences to question the underlying power structures that utilise and propagate these advanced military technologies. Ultimately, this work aims to demonstrate how virtual worlds can be repurposed as a path to illuminate and resist the invisible wars of control, and offer alternative viewpoints that could assist in the ethical work of pushing us out of our dystopian trajectory.
Keynote Talk - Crafting Immersive Worlds: Blending History, Technology, and Creativity in Warwickshire
Exploring Augmented Reality, Virtual Reality, and Photogrammetry in Local Immersive Installations
Alex Harvey
Join Alex Harvey, co-founder of RiVR (Reality in Virtual Reality), for an inspiring journey through the cutting-edge world of immersive technology. This talk showcases how RiVR combines local history with innovative installations across Warwickshire, including augmented reality experiences featuring 3D-printed models created from drone scans of towns such as Warwick and Leamington. Discover how RiVR leverages Unity for virtual reality products, employs photogrammetry scanning techniques accessible via your smartphone, and collaborates with the fire service and police on immersive content. Drawing from Alex’s 10 years as a video editor at Codemasters crafting game trailers and 15 years as a drone pilot specializing in cinematography and FPV flying, this session will showcase how creativity and technology converge to bring history to life. Learn how these tools can be harnessed to create and monetize 3D models, and get inspired to shape the future of immersive storytelling.