
I am interested in virtual and augmented reality technologies. But I have fairly bad eyes with a complicated prescription, and these technologies all cause me some (quite literal) headaches as well as nausea. That limits my interest to the theoretical: I own exactly zero such devices.
I was reading through my news feed today and came across this: a company called AllFocal Optics is working on a new technology that ‘bypasses’ the eye lens and projects directly onto the retina (Wired). This has applications for VR, AR, and even heads-up displays. Although the technology itself is new, the optics behind it are well-established.

The AR and VR aspects of the AllFocal technology make sense to me: projecting an image directly onto my retina seems quite doable if I were already wearing a device on my face. But I'm less clear on how the HUD technology is expected to work: HUDs are not projected into your eyes directly, they are reflected off a windshield or similar surface.
Wired’s article (linked above) references some other HUD technologies, like the holographic surfaces being developed by Hyundai and Zeiss. But what is AllFocal doing in this space? That is less clear to me from my reading of the article. Getting HUD-like information directly to your retina without something attached to your face (e.g., glasses) seems problematic, but maybe I’m missing something obvious here.
I don’t know if AllFocal will have any products or collaborations anytime soon, but I’m definitely interested. I’d love to be able to use some of the newfangled ‘realities’ that are all the rage amongst folks with perfect vision and no motion sickness. Maybe one day I will have that opportunity!