I’m currently studying how to build VR systems, and thinking about how real-world data could be captured and extrapolated in order to present data visualisations more effectively.
Depending on the granularity of sensing you need, have you thought of trying something like a Kinect-style depth-camera system?
With fabrics chosen to stand out to the sensor hardware (something highly reflective in infrared, for example), it should be possible to get effective data capture, so the movement of the ribbons would be measurable.
I know there are open-source libraries that can do this, as I have seen versions of it done at the local hackspace.
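To make that concrete, here is a rough sketch of what the capture side might look like in Python, assuming you already have an 8-bit IR or depth frame as a NumPy array (libfreenect and OpenNI are two of the open-source options for getting one off a Kinect). The threshold value here is a placeholder to be tuned per sensor and fabric, not a known-good number.

```python
import cv2
import numpy as np

def track_ribbons(ir_frame: np.ndarray, threshold: int = 200):
    """Find bright ribbon blobs in a single 8-bit IR/depth frame.

    threshold: brightness cutoff for the reflective fabric --
               an assumption to be tuned per sensor and material.
    Returns a list of (x, y) centroids, one per detected blob.
    """
    # Isolate pixels brighter than the cutoff -- reflective fabric
    # should read much brighter than skin or ordinary clothing in IR.
    _, mask = cv2.threshold(ir_frame, threshold, 255, cv2.THRESH_BINARY)

    # Clean up sensor speckle so each ribbon becomes one blob.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate zero-area blobs
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```

Once you have those centroids frame by frame, feeding the ribbon motion into the visualisation becomes a streaming problem rather than a vision problem.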
As for the display software/hardware, Unity has build targets optimised for mobile phones and a mature mobile-game development pipeline, so the proofs of concept already exist.
I was working on physical theatre performances that use fire, but the maths needed to fully simulate plasma flows is incredibly expensive in terms of machine resources. When creating digital simulations of fire, it’s always a compromise between accuracy and usability.
The dust and fire effects you see in games are almost always cheap, energy-cost-effective approximations rather than full physical simulations.
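The classic demoscene/Doom-style fire effect is a good illustration of that compromise: no fluid dynamics at all, just heat values propagating up a grid with random cooling and sideways jitter. A minimal sketch (the grid size and decay range are arbitrary choices, not tuned values):

```python
import numpy as np

WIDTH, HEIGHT = 80, 50  # arbitrary grid size for the sketch

def step(heat: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Advance the fire one frame: each cell copies the cell below it,
    cooled by a small random amount and jittered sideways.  This is a
    cellular automaton, not a fluid simulation -- the whole point is
    that it costs almost nothing per frame."""
    new = np.zeros_like(heat)
    new[-1, :] = 255  # bottom row is the constant heat source
    for y in range(HEIGHT - 1):
        for x in range(WIDTH):
            src_x = min(max(x + rng.integers(-1, 2), 0), WIDTH - 1)
            cooled = int(heat[y + 1, src_x]) - int(rng.integers(0, 4))
            new[y, x] = max(cooled, 0)
    return new

heat = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
heat[-1, :] = 255
rng = np.random.default_rng()
for _ in range(100):  # run a few frames
    heat = step(heat, rng)
# Map 'heat' through a black->red->yellow palette to draw the flames.
```

It looks convincingly like fire at a tiny fraction of the cost of solving the actual physics, which is exactly the trade-off described above.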
It’s why I still prefer the physicality of the analogue universe…