Hello
I am exploring ways to push interactive visuals beyond just motion tracking and was wondering if anyone has attempted or considered integrating VYV Photon with real-time environmental sensor data (like temperature, humidity, or ambient light) to dynamically adjust projection visuals on the fly.
For example, imagine an outdoor installation where the projection subtly shifts its color tones or particle behaviors based on environmental input, creating a living canvas that mirrors its surroundings.
While Photon handles video tracking and projection mapping superbly, I'm curious whether there's a documented method (or even a clever workaround) for pulling live data from IoT devices or microcontrollers and feeding it into Photon's control logic or triggering system. I've checked the guide at https://forum.vyv.ca/t/general for reference.
Has anyone experimented with this kind of setup, or would it require a custom middleware layer (e.g. TouchDesigner, Max/MSP, or even a custom OSC bridge)? Any advice, experiences, or potential pitfalls would be greatly appreciated!
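For context, here's a rough sketch of the kind of bridge I was picturing for the last option: a small Python script (using pyserial and python-osc) that reads sensor values from a microcontroller over serial and forwards them as OSC messages. The serial port, baud rate, OSC port, and the /env/* address layout below are just placeholders on my end, since I don't know yet what Photon (or a middleware layer in front of it) would actually expect to receive.

```python
# Minimal sketch: read comma-separated sensor values (temperature, humidity, lux)
# from a microcontroller over serial and forward them as OSC messages.
# Port names, baud rate, OSC host/port, and /env/* addresses are placeholders.
# Requires: pyserial, python-osc

import serial
from pythonosc.udp_client import SimpleUDPClient

SERIAL_PORT = "/dev/ttyUSB0"   # placeholder: wherever the Arduino/ESP32 enumerates
BAUD_RATE = 115200
OSC_HOST = "127.0.0.1"         # placeholder: machine running Photon / middleware
OSC_PORT = 9000                # placeholder: whatever port the receiver listens on


def main():
    client = SimpleUDPClient(OSC_HOST, OSC_PORT)
    with serial.Serial(SERIAL_PORT, BAUD_RATE, timeout=1) as ser:
        while True:
            line = ser.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue
            try:
                # Expecting lines like "23.4,61.0,830.5" from the microcontroller
                temperature, humidity, lux = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines
            client.send_message("/env/temperature", temperature)
            client.send_message("/env/humidity", humidity)
            client.send_message("/env/lux", lux)


if __name__ == "__main__":
    main()
```

The open question for me is whether Photon can consume OSC like this directly, or whether those messages would first need to land in TouchDesigner or Max and be translated into something Photon's triggering system understands.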
Thank you!!