It’s running HDMI to a frame-grabber, which compresses the image and sends it over USB to a Linux box, which then decompresses and displays it. There is zero value in looking at performance at this point. No, a released driver wouldn’t work like this at all.
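For context, the path described there can be sketched in a few lines. This is a hypothetical illustration only, not iVRy's actual code: the synthetic frame, the zlib codec, and the simulated USB hop are all assumptions, chosen just to show why every stage (compress, transfer, decompress) adds latency that a real driver wouldn't have.

```python
import time
import zlib

WIDTH, HEIGHT, BPP = 1280, 720, 3  # assumed display resolution, for illustration

def grab_frame() -> bytes:
    # Stand-in for the HDMI frame-grabber: a synthetic raw RGB frame.
    return bytes((x * 7) % 256 for x in range(WIDTH * HEIGHT * BPP))

def send_over_usb(payload: bytes) -> bytes:
    # Simulated USB transfer; a real link adds latency and jitter on top.
    return payload

frame = grab_frame()
t0 = time.perf_counter()
compressed = zlib.compress(frame, level=1)  # grabber side
received = send_over_usb(compressed)
restored = zlib.decompress(received)        # Linux side
elapsed = time.perf_counter() - t0

assert restored == frame
print(f"{len(frame)} -> {len(compressed)} bytes, round trip {elapsed * 1000:.1f} ms")
```

Even this toy round trip spends measurable time per frame, which is the point being made: benchmarking a rig like this tells you nothing about how a native driver would perform.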
And of course, it needs external trackers, controllers etc.
EDIT: ok that’s more about how they recorded the footage than the implementation. I don’t know how it works!
External trackers… so it sounds like they literally just use it as a display that has no idea how your head is oriented or moving. All of that has to be added through other hardware. Which makes sense, since that is probably one of the most challenging things to implement here. Still a very long way to go before anything is actually usable, if it ever happens.
Sorry … they made the change after I posted it.
The explanation from iVRy: “It’s running HDMI to a frame-grabber, which compresses the image and sends it over USB to a Linux box, which then decompresses and displays it. There is zero value in looking at performance at this point. No, a released driver wouldn’t work like this at all.”
@kbinator @virtualreality