Thanks! I only watched the first few minutes, but that was enough to answer some of the questions I had about the Rift. The performance at higher settings is disappointing. I was expecting that the unit itself would process the visuals and that the GPU would only need to drive a GUI. Obviously I haven't been following the technology closely, but it sounds like it has more potential on console, where there is tight coordination between developers and complete consistency of hardware.
The most I could expect processing-wise out of the HMD itself is barrel distortion for the eye buffers, and it's likely they do that on the GPU too because doing it HMD-side would introduce more latency.
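To illustrate what that distortion step actually is: a minimal sketch of a polynomial radial warp of the kind used to pre-compensate for HMD lens distortion. The coefficients here are made up for illustration, not real Rift values.

```python
def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Warp a point in normalized eye-buffer coordinates (origin at the
    lens center) using a simple polynomial radial distortion model.
    k1 and k2 are illustrative placeholders, not actual Rift constants."""
    r2 = x * x + y * y                    # squared distance from lens center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial scale grows with distance
    return x * scale, y * scale

# The center is untouched; points farther from center get pushed outward,
# canceling the pincushion distortion the lenses would otherwise introduce.
print(barrel_distort(0.0, 0.0))  # (0.0, 0.0)
print(barrel_distort(0.5, 0.5))
```

This is cheap per-pixel work in a fragment shader, which is exactly why it makes sense to do it on the GPU in the same pass as everything else rather than adding a processing stage (and more latency) inside the headset.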
Needless to say, latency is BAD for VR. Having a screen or two consume most of your FOV ruthlessly reveals deficiencies in responsiveness and head-tracking that a typical monitor/TrackIR setup wouldn't, and a lot of the advancements they've been trying to make over the years involve cutting down motion-to-photon latency (or in other words, input lag) as much as possible.
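To put rough numbers on that: even before tracking, rendering, and pixel switching are counted, the display refresh rate alone sets a floor on motion-to-photon latency, since a head movement can have to wait up to a full refresh interval before it reaches the screen. A quick back-of-the-envelope calculation (refresh rates chosen for illustration):

```python
def min_photon_delay_ms(refresh_hz):
    """Worst-case wait (in ms) for the next scanout at a given refresh
    rate. This is just one term in the motion-to-photon budget; tracker
    latency, render time, and panel response all stack on top of it."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 90):
    print(f"{hz} Hz: up to {min_photon_delay_ms(hz):.1f} ms before scanout")
```

This is one reason VR headsets push refresh rates well above the usual 60 Hz: each bump shaves a few milliseconds off a budget that needs to stay very small overall.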
On the PC side of things, it'll take serious optimization from developers to make a good VR experience, because the whole user experience needs to be designed with VR in mind. It looks like Eagle Dynamics has come a fair way with DCS 1.5, even offering a virtual monitor for the menus, but cockpit readability and distant unit visibility may still be issues even at the CV1's resolution. I'm extrapolating from my Gear VR experience here: it works with 2560x1440 phone screens divided into 1280x1440 per eye, technically higher than the CV1's 1080x1200 per eye, but Gear VR apps render the eye buffers at 1000x1000 for performance reasons.
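Laying out the per-eye pixel counts from those numbers makes the comparison concrete:

```python
# Per-eye resolutions cited above: the Gear VR panel half, the Rift CV1
# panel, and the smaller render target Gear VR apps actually use.
displays = {
    "Gear VR panel (per eye)": (1280, 1440),
    "Rift CV1 (per eye)":      (1080, 1200),
    "Gear VR eye buffer":      (1000, 1000),
}

for name, (w, h) in displays.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels")
```

The Gear VR panel half has roughly 40% more pixels per eye than the CV1, but since apps render at 1000x1000, what you actually see on Gear VR is noticeably below what the CV1 can display, which is why I'd expect visibility on the CV1 to be somewhat better than my Gear VR impressions suggest.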
Nevertheless, I'm excited enough that when the Rift CV1 finally hits the market, I'll be snapping one up and playing lots of DCS with it. Probably Elite: Dangerous too, since I told myself I wouldn't buy that until the CV1 releases. Watching YouTube footage of it and DCS 1.5 through the Gear VR utterly spoiled me; it's like actually being in the cockpit.