Intel has unveiled a virtual reality headset that allows nearby objects from the real world to be integrated into its computer-generated views.
It describes Project Alloy as being a “merged reality” device. One key advantage, Intel says, is that users will be able to see their own hands.
It intends to offer the technology to other manufacturers next year, but will not sell the headsets itself.
The company is keen to avoid repeating the mistake it made with smartphones.
The California-based company has previously acknowledged that it botched an opportunity to supply Apple with chips for its original iPhone.
It subsequently lost out to rival ARM-based technology that now powers the vast majority of handsets.
Project Alloy marks an opportunity to pitch its RealSense depth-sensing cameras, Replay graphics-creation software and other proprietary inventions to others before virtual reality (VR) and augmented reality (AR) devices – which superimpose graphics over a real-world view – go mainstream.
Even so, Intel is far from being the only technology giant involved in the sector.
AMD – a rival chipmaker to Intel – is developing an untethered headset of its own that mixes together AR and VR technologies.
Intel’s chief executive, Brian Krzanich, offered a first look at Project Alloy at his company’s developer forum in San Francisco, where he suggested the technology could “redefine what is possible with computing”.
One of the benefits of its approach, he said, was that the headset’s RealSense cameras could detect a user’s finger movements and allow them to appear in a virtual world and manipulate simulated objects.
“Merged reality is about more natural ways of interacting with and manipulating virtual environments,” he said in a blog post later published on Medium.
“[That liberates] you from the controllers and the nunchucks of today’s VR systems by immersing your hands – your real-life hands – into your simulated experiences.”
In an on-stage demonstration, the hands could be seen only if they were held near to the centre of the user’s field of view.
When Mr Krzanich’s own face appeared within the VR world, it also became apparent that “merged reality” objects appeared only as low-resolution graphics, at least for now.
He added that the technology had benefits over rival systems that required a user to install external sensors in their room to detect their movements.
And he suggested that going wireless would prevent an owner being “jolted” out of their experience because they had reached the limit of the cord used to transmit data from a PC or games console.
However, he acknowledged that one trade-off of relying on wi-fi would be that the computer involved would take slightly longer to respond to a user’s actions – something that might concern gamers.
Microsoft has pledged to support the headset in a forthcoming version of Windows 10.
One expert said Project Alloy had promise, but it might only have limited appeal.
“Having a real-time rendition of your hands or other objects in VR could have appeal to enterprise applications, such as a surgeon training with a body diagram or a mechanic having graphics overlaid onto an engine part,” said Ed Barton, from the technology consultancy Ovum.
“But when it comes to gaming, there hasn’t been much clamour to be able to see your hands in real-time.
“Vive, for example, has addressed the issue with special controllers. It’s not something that people have been crying out for.”
The unveiling of Project Alloy comes seven months after Intel disclosed some of its other VR and AR efforts.
At the Consumer Electronics Show in Las Vegas, it showed off an augmented reality helmet for construction workers, made by the start-up Daqri, and a smartphone virtual reality accessory made by IonVR.
Both incorporated its RealSense sensors.