Thanks @Attiq for taking the time to do that research; it's nice to have people here on the same wavelength, and it's clear you've done your homework.
It's awful that I need to use those marketing terms to keep the story short, but as you can read in the details on the project's Hackaday profile, we are against all those corporate strategic decisions that cap technology progress for the rest of the people who would benefit as end users or as technology SMEs, just like here in the EVE Community.
It's no secret that a HoloLens kit is neither cheap nor easy to get through the partner developer program, and the same goes for all the top full-featured VR hardware like the Oculus DK + Constellation and the HTC Vive + Lighthouse, with the bill getting even bulkier once you add a VR-ready desktop PC.
Just like the Pyramid Flipper concept you have here, we are looking to make those technologies more convenient and cheaper for mass adoption, and we are open to suggestions from the future user base.
The other day someone asked me whether he would be able to use our technology with his Samsung Gear VR, and then you realize that this kind of device may be the most affordable and widely adopted option for VR content, but it will ALWAYS BE AN INDIVIDUAL EXPERIENCE, with a screen blocking your view of your surroundings. Some solutions have been developed for that problem, like adding cameras to the HMD for inside-out tracking so the VR app becomes aware of the obstacles and dimensions of the room, as in Intel's Project Alloy.
Like you say about the current hype around these "new" technologies: first, they are not even new, some older than others; even the core concept of an HMD for virtual reality is older than computer graphics itself. As you say, those technologies have taken a long time to mature, with millions invested along the way, but isn't that just progress? As companies invest in their products and technologies they need to recoup the ROI for as long as their patents allow, but another accelerator for the adoption of new technologies is the standardization effort across industries.
And that, my friend, open standards, is the key to the future of interoperability and integration of technologies. We are building our software and hardware stack on open standards that guarantee future-proof operability and compatibility with a diverse ecosystem of devices. With all the work we are getting close to testing in Uruguay, we are following the certification path to become an OSVR industrial partner, and our hardware will be listed in a completely different group from HMDs.
This strategy of sticking close to standards runs from the topmost abstraction level, where OSVR sits, down to the physical level, with connectors (USB-C, USB 3.1 Gen 2, Power Delivery 2.0, etc.) and system-on-modules like the SMARC 2.0 standard on the hardware side. On the software side, we are addressing the issue you pointed out about parallelism and multi-core sharing with an asynchronous heterogeneous parallel programming model called OmpSs, which works on Intel x86 and ARM architectures with support for CUDA on Nvidia GPUs and OpenCL on Mali GPUs.
The OpenCL part is crucial because it is a software-development industry standard for offloading GPGPU tasks that benefit from the massive parallelism of GPUs, such as the novel algorithms in machine learning and its cousins, deep neural networks, which will accelerate all the fancy stuff like object/face/voice/language/body/hand/gesture recognition. For that we have the "Khronos royalty-free, open standards for 3D graphics, Virtual and Augmented Reality, Parallel Computing, Neural Networks, and Vision Processing", which include the well-known OpenGL for graphics and its WebGL counterpart on the Web, where it serves as the base for the WebVR specification; and the glue between all that and the AR/VR hardware is OpenXR.
For the operating system, we are merging a Chromium OS base with WebCL optimizations and a direct Vulkan implementation following the Daydream specifications for latency and hardware integration, plus the ARCore libraries for the rest of the interactions with local augmentation layers. This operating system will be present on the BYOD HUB docking station and the portrait-tablet-like device. For integration with the other operating systems of the devices connected to our BYOD HUB, there will be an application ported to each OS with sufficient permissions to manage network, memory and CPU on demand, like a virtual machine, where each resource from each device is exposed to every other device through an API served from the docking station. Just like a local, private microcloud with accelerators.