Simulated Reality
Because the National Portrait Gallery has always reflected how portraiture evolves alongside changes in technology, I thought it would be fitting to explore where portraiture might be headed next — especially as AI and social media increasingly influence how we see ourselves and how we choose to present our identities.
The images were generated using Midjourney and ChatGPT, then processed in TouchDesigner to create a dispersal effect, making them appear to break apart into particles.
As it was my first time using TouchDesigner, it was definitely a challenge to learn the software and figure out how to make it work with my concept, but overall, I’m really glad it came together in the end!
I was able to use MediaPipe along with TouchDesigner’s particle system to create an interactive piece where visitors can see themselves fragmented and dispersed into particles in real time.
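TouchDesigner's particle system is built from nodes rather than written code, but the core idea behind the effect can be sketched in plain Python: sample particle positions from a person-shaped mask (standing in for the segmentation mask MediaPipe would produce from the camera feed), then let each particle drift and jitter every frame so the silhouette disperses. The tiny hand-written mask and all function names here are illustrative assumptions, not the actual project setup.

```python
import random

def make_particles(mask, n=100, seed=42):
    # Sample starting positions from the "on" cells of a 2D mask.
    # In the real piece, MediaPipe segmentation of the visitor
    # would supply this mask; here it's a hard-coded stand-in.
    rng = random.Random(seed)
    filled = [(x, y) for y, row in enumerate(mask)
                     for x, v in enumerate(row) if v]
    return [{"pos": list(rng.choice(filled)),
             "vel": [rng.uniform(-1, 1), rng.uniform(-1, 1)]}
            for _ in range(n)]

def step(particles, drag=0.98, jitter=0.2):
    # Advance one frame: damped drift plus random jitter,
    # which is what makes the silhouette appear to disperse.
    for p in particles:
        p["vel"][0] = p["vel"][0] * drag + random.uniform(-jitter, jitter)
        p["vel"][1] = p["vel"][1] * drag + random.uniform(-jitter, jitter)
        p["pos"][0] += p["vel"][0]
        p["pos"][1] += p["vel"][1]

# A 4x4 "silhouette" mask standing in for a segmentation result.
mask = [[0, 1, 1, 0],
        [1, 1, 1, 1],
        [1, 1, 1, 1],
        [0, 1, 1, 0]]
particles = make_particles(mask)
for _ in range(30):
    step(particles)
```

Running this for a few dozen frames scatters the points outward from the mask, which is roughly the look the TouchDesigner network produces in real time from the live camera image.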
That’s me in the video — this was the moment I got it working and saw it projected onto a large surface for the first time. So exciting!
Using Adobe Aero, I was able to create an AR poster where viewers can get a glimpse of the exhibition.