Are we or any member a catalyst network partner?

I attended a rapid ideation event with techforgood.live yesterday and our paired charity wants to take one of our ideas forward.

DCDC doesn’t meet the criteria to be a partner on its own (not enough non-profit project work), but if CoTech or a CoTech member can meet the criteria, we will put you in touch with the charity and be available as tech specialists to consult (with a few days pro bono to assess viability).

These are the existing partners: https://www.dovetail.network/

These are the criteria: https://thecatalyst.org.uk/open-projects


Hi @ben-dcdcio,
This sounds really interesting. Agile Collective is on the list and I’m sure we’d be up for knowing more (we may or may not be the best fit, depending on what the tech is etc.).
Want to ping some more info over to me? Email is aaron[at]agile[dot]coop
Cheers,
Aaron


I think Alpha Communications are involved in Catalyst in some way


Yes and Abi at Outlandish is involved, as are Cat and Annie from the Dot Project (I think they barely use this forum).


Wow, thanks everyone for getting back to me. I wrote that post on my phone while dog walking so I could only fumble around to see the network list.

Anyway…

I wrote to the charity to let them know there are these options. I presume they will be in touch themselves if they take it forward.

I won’t mention the charity because the forum is public, but the idea itself can be out in the open; anybody is welcome to run with it anyway!


The (really brief version of the) Problem:

They are a group that does inclusive performing arts work, and during the pandemic they have had to translate that work to video conferencing. It doesn’t work well, and they want to bring some human connection to products like Zoom etc.

The Idea:

Create a mobile video-calling app that embeds gyro/accelerometer data in the broadcast stream so that participants can move towards and away from one another, thereby creating a virtual physical space. This space would be realised by painting the video feeds onto surfaces in a 3D scene.
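To make the idea concrete, here’s a minimal sketch of how device tilt could drive movement through the virtual space. Everything here is illustrative (the function name, the tilt/speed constants, and the heading-relative mapping are my assumptions, not a spec): it maps `DeviceOrientationEvent`-style angles to a velocity on the virtual floor plane, which a renderer like three.js could then integrate into a position.

```typescript
// Hypothetical helper: map device tilt (DeviceOrientationEvent angles, in
// degrees) to a velocity in the virtual floor (XZ) plane.
// beta  = front/back tilt, gamma = left/right tilt,
// alpha = compass heading, used so "forward" follows where the phone points.
interface Orientation { alpha: number; beta: number; gamma: number; }
interface Velocity { x: number; z: number; }

const MAX_TILT = 45;   // degrees of tilt that maps to full speed (assumed)
const MAX_SPEED = 2.0; // metres/second in the virtual space (assumed)

function orientationToVelocity(o: Orientation): Velocity {
  // Clamp tilt and normalise to [-1, 1]
  const fwd = Math.max(-1, Math.min(1, o.beta / MAX_TILT));
  const side = Math.max(-1, Math.min(1, o.gamma / MAX_TILT));
  // Rotate by compass heading so movement is heading-relative
  const rad = (o.alpha * Math.PI) / 180;
  return {
    x: MAX_SPEED * (side * Math.cos(rad) - fwd * Math.sin(rad)),
    z: MAX_SPEED * (side * Math.sin(rad) + fwd * Math.cos(rad)),
  };
}
```

In the browser you’d feed this from a `deviceorientation` event listener; the point of keeping it a pure function is that both the mapping and any smoothing can be unit-tested without a phone in hand.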

I gathered a short list of links to prove this stuff (a) is technically viable and (b) has open-source pedigree:

  1. Realtime video to surface in three.js: https://threejs.org/examples/webgl_materials_video_webcam.html

  2. WebRTC video chat room: https://apprtc.appspot.com/

There’s doubtless loads of FOSS AR/VR work out there that could inform this, and there may be video apps with advanced enough APIs that this could be just a frontend for something else.
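One natural way to ship the sensor data alongside the WebRTC media stream is a per-sample message over an `RTCDataChannel`. As a sketch (the wire format, field names, and version field below are all my assumptions, not anything the linked projects define), the encode/decode pair can be pure functions shared by both peers, with `channel.send(encodeOrientation(...))` being the only browser-specific call:

```typescript
// Hypothetical wire format for orientation samples sent over a WebRTC
// data channel: one compact JSON message per sample, with a version
// field so peers can ignore formats they don't understand.
interface OrientationSample {
  peerId: string;                     // which participant this sample is from
  t: number;                          // capture time, epoch milliseconds
  alpha: number; beta: number; gamma: number;
}

function encodeOrientation(s: OrientationSample): string {
  return JSON.stringify({ v: 1, ...s });
}

function decodeOrientation(msg: string): OrientationSample | null {
  try {
    const o = JSON.parse(msg);
    if (o.v !== 1) return null;       // ignore unknown versions
    const { peerId, t, alpha, beta, gamma } = o;
    if (typeof peerId !== "string") return null;
    if ([t, alpha, beta, gamma].some((n) => typeof n !== "number")) return null;
    return { peerId, t, alpha, beta, gamma };
  } catch {
    return null;                      // malformed message: drop it
  }
}
```

Returning `null` rather than throwing keeps a single garbled packet from taking down the render loop, which matters on flaky mobile connections.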

Other Useful Links:

eduMeet – an open-source, full-stack video conferencing app (mediasoup-based): https://github.com/edumeet/edumeet

three.js device orientation example: https://threejs.org/examples/?q=orientation#misc_controls_deviceorientation


That is super interesting!

One of my best friends is a university dance tutor and is really struggling to find a way to have a real creative virtual space for dance so this is brilliant.

I wonder if @SzczepanOfAnimorph has any thoughts? Given mention of AR/VR…


Yeah, I was supposed to be an actor but IT got in the way. I know a few people desperate for pandemic-safe performing arts.

Happy to start a GitHub repository for this if anyone wants to contribute.

Edit: because we could do it anyway, sans a commission, right?

If I’m to start it, my stack du jour is Svelte/Sapper, TypeScript, and Node.js. AFAIK every API this would need is available on the web platform, so a PWA is doable.


We (Go Free Range) are also in the catalyst network.


Have you come across any of the early Machinima Movement’s work?

They were a group of artists, collaborating over the internet around 2000, who used game engines as the basis for puppet theatre.

The “actors” would be wearing headsets using Teamspeak-style communication software to do the audio, while moving their own avatars around the game landscape.

The cameras were avatars set to be invisible, recording constantly, while the director told everyone what to do.

It was a nice cross-over between digital puppet theatre and radio dramas. :slight_smile:


Hey they already said my idea was “blue sky” thinking. Any more would have blown their minds.

But that said, it did just occur to me tonight that the data track you capture could be used for some fun realtime visualisation… like dancing with ribbons. That would be a neat way to indicate where in the virtual space another performer is working. And then, when you get close enough to them, you “join” their video chat and face them to work one-to-one.
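A minimal sketch of those two ideas, assuming the position data from the sensor track is already available (the names `pushTrail`/`shouldJoin` and the buffer length and join radius are all made up for illustration): a bounded trail of recent positions is exactly the geometry a renderer would need to draw a ribbon, and the proximity test is the trigger for “joining” another performer’s chat.

```typescript
// Sketch: keep a bounded trail of a performer's recent positions (the
// "ribbon" to draw), and a proximity test for deciding when two performers
// are close enough to join one another's video chat. All constants assumed.
interface Point { x: number; z: number; }

const TRAIL_LENGTH = 60;  // ~2 s of samples at 30 Hz (assumed rates)
const JOIN_RADIUS = 1.5;  // metres in the virtual space (assumed)

// Append a new sample, dropping the oldest once the buffer is full.
function pushTrail(trail: Point[], p: Point): Point[] {
  const next = [...trail, p];
  return next.length > TRAIL_LENGTH
    ? next.slice(next.length - TRAIL_LENGTH)
    : next;
}

// True when two performers are within "joining" distance of each other.
function shouldJoin(a: Point, b: Point): boolean {
  return Math.hypot(a.x - b.x, a.z - b.z) <= JOIN_RADIUS;
}
```

Returning a new array from `pushTrail` rather than mutating in place suits a Svelte-style reactive store, where reassignment is what triggers a re-render.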

I suppose these things don’t have to be too spectacular, just interesting enough to invite participation. Also, access to a smartphone is about the only thing we can assume participants have.


I’m currently studying how to build VR systems, and thinking about how it would be possible to extrapolate from real-world data, in order to present data visualisations more effectively.

Depending upon the granularity of the sensing hardware, have you thought of trying something like a Kinect-style system?

With appropriate use of fabrics that stand out to the sensing hardware, it should be possible to get an effective form of data capture, so the movement of the ribbons would be measurable.

I know that there are some open-source libraries that can do this, as I have seen versions of it done at the local hackspace.

As for the display software/hardware, there are versions of Unity optimised for mobile phones and mobile-phone game development, so the proofs of concept already exist. :slight_smile:

I was working on physical theatre for performances using fire, but the maths needed to fully simulate plasma flows is incredibly expensive in machine resources, so digital simulations of fire are always a compromise between accuracy and usability.

Most of the digital dust/fire effects you see in games are cost-effective approximations rather than full replications.

It’s why I still prefer the physicality of the analogue universe… :smiley: