Yes - all clients connecting to the same renderer instance see the same video.
> What made you do it this way rather than render on the client and update positioning over WebRTC?
This is a part of a larger [research project](https://forge.fi-ware.eu/projects/miwi/) to create and test the viability of different kinds of UI enablers.
One use case for cloud rendering is low-end mobile devices that can't handle rendering complex 3D scenes themselves. Another is creating a live presentation inside a 3D world, e.g. a walkthrough of an architectural design.
Other people in the project are building tools on top of WebGL to do the rendering in the browser - mainly porting the [realXtend toolset](http://realxtend.org/) to JavaScript.
Everything I do will be released as open source later on. I'll give you a shout when we've got something to show.