Stick with teardowns.
The stereo image is pretty poor. Was this actually taken with stereo-offset camera sensors, or with one camera whose image you shifted in Photoshop?
Thanks for this. It also explains the process they are using to derive the six planes, which incidentally corresponds to the six color fields the OmniVision LCOS is capable of generating, at (I assume) 1/360th of a second per field. However, this does not explain how they can sustain two complete RGB frames in 1/60th of a second unless the image resolution is 1280x720 (allowing 120 fps) instead of 1920x1080 (60 fps), or unless they multiplex the six LEDs at 1/360th of a second with two frames of image information. Essentially, if the resolution is 1080p (unlikely), the NVIDIA TX2 is processing two (1920 x 2) x 1080 images (one for each eye) every 1/60th of a second. Pretty impressive, but I think it is actually just 720p-class, as the Unity SDK confirms the image is 1280 x 960.
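To put rough numbers on that bandwidth argument, here is a quick back-of-the-envelope sketch (my own calculation, not anything confirmed by the teardown) comparing the raw pixel throughput the processor would need at the two candidate resolutions:

```python
# Back-of-the-envelope pixel throughput for the display pipeline.
# Assumptions (mine, not from the teardown): two eyes, 60 Hz frame rate.

def pixels_per_second(width, height, eyes=2, fps=60):
    """Raw pixels that must be delivered per second for both eyes."""
    return width * height * eyes * fps

throughput_1080p = pixels_per_second(1920, 1080)  # if it were true 1080p
throughput_unity = pixels_per_second(1280, 960)   # per the Unity SDK

print(f"1920x1080 both eyes: {throughput_1080p / 1e6:.0f} Mpix/s")
print(f"1280x960 both eyes:  {throughput_unity / 1e6:.0f} Mpix/s")
# The 1280x960 case needs roughly 59% of the 1080p throughput, which is
# one reason the lower resolution is the more plausible way to sustain
# six sequential color fields per 1/60th-of-a-second frame.
```

The ratio, not the absolute numbers, is the point: halving the per-eye pixel count is what buys headroom for the field-sequential multiplexing described above.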
I wish there were more detailed images of a complete breakdown of the projection system instead of attempts to explain the process. Some of us already have a good understanding of the process and have relied on iFixit to show us the components so we can verify our assumptions. With only a few images, we are still relying on vague block diagrams.
Actually, I did my own teardown, and you have incorrectly stated the purpose of the Cypress CYUSB3064-BZXC by just quoting its part title. It is actually the MIPI controller for the two LCD panels.
Nope, that didn’t happen, and I didn’t confuse the two headsets either.
The ridging on a Fresnel lens is not a "process." A Fresnel lens has ridges because of the way Fresnel lenses work, and most are smooth on one side. So the accurate description is that these are Fresnel lenses with a focal length of approximately 35 mm (I don’t remember the exact number I came up with). I don’t have any of the other specs, like physical/chromatic curvature. They are also about 45 mm at their widest point, making them smaller than both the Oculus and HTC Vive lenses.
I have not personally had a chance to review or tear down the Samsung Odyssey, which not only has larger OLED displays but uses aspheric PMMA lenses that are part of the reason for its sharper viewing experience.
I commented on your HP XR headset teardown not having complete component identification, including the chips that process the cameras that make up the parallax-based inside-out positional tracking.
A little disappointed you didn’t spend more time identifying more of the other components, like the HDMI-to-MIPI controller (not a Toshiba component) as well as the processor chip used to handle the two cameras that make up the parallax-based inside-out positional tracking. But I have not checked out your Acer teardown, so you might have done that already.
Having designed a lens system for Occipital’s Structure Sensor (which is also a PrimeSense derivative), I can say the structured-light emitter, which is actually a laser, vibrates to create the pattern that bathes your face. I am pretty sure the structured-light source does not have the power to do anything besides scan your face, and the infrared camera's lens system is probably tuned for 300 mm to 500 mm, unlike the Kinect (1), ASUS Xtion, or PrimeSense’s own early versions, which have a much longer range. If anyone is interested in doing more than just face scanning, check out Occipital’s website: https://structure.io/.