It seems that everyone is talking about the metaverse at the moment, despite the fact that most people aren’t quite sure what it is or what purposes it could serve in the real world. Facebook’s name change to Meta helped to kickstart this trend, though many companies have been developing, and will continue to develop, technologies and products for it.
News sites are awash with discussions of the possibilities the metaverse could bring. One of the most talked-about areas it could transform is gaming, as the lines blur between what happens in the real world and what happens inside a digital environment.
Concerts with real-life musicians within video games are one area of a metaverse world that has already come to fruition. In recent years, Minecraft, Fortnite, and Roblox have all put together a programme of performances from artists like Lil Nas X, Royal Blood, 333, IDLES, and Ariana Grande.
In the future, a metaverse gaming environment could make its way into other genres. For example, online casinos in the USA might adopt virtual reality technology to create a fully immersive 3D environment for players to step into. Instead of selecting a slot game from a list of options, they could walk through a virtual gaming floor and place their digital credits into the machine.
But while all of these prospects may sound exciting, we need to spare a thought for what is necessary to make them possible.
The most commonly accepted definition of the metaverse is one that operates in a “perpetually available 3D environment”. For most people, this means using either virtual or augmented reality.
Such technologies already exist today, but a lot of work is required to bring them into the mainstream. Facebook’s Oculus has significantly improved its VR headsets in recent years by removing the need to be constantly plugged into mains power or a computer.
That said, even the most expensive models still aren’t quite fast enough with their head tracking, and they need more powerful graphics processors to improve this.
Rather than building more powerful GPUs into VR headsets, eye-tracking technology could help to reduce the demands placed on the hardware. This approach, known as foveated rendering, uses a camera to detect where you are looking and then turns down the quality of the display in the areas surrounding that point.
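To get a feel for why this helps, here is a back-of-the-envelope sketch of the pixel savings from foveated rendering. All the numbers (field of view, foveal window size, peripheral resolution scale) are illustrative assumptions, not the specs of any real headset.

```python
# Illustrative estimate of the rendering work saved by foveated
# rendering. The figures below are assumptions for illustration,
# not specifications of any real device.

def foveated_pixel_fraction(fov_deg: float, fovea_deg: float,
                            peripheral_scale: float) -> float:
    """Fraction of full-resolution pixel work remaining when only a
    central square of `fovea_deg` degrees is rendered at full
    resolution and the rest at `peripheral_scale` linear resolution."""
    foveal_area = (fovea_deg / fov_deg) ** 2      # share of frame at full res
    peripheral_area = 1.0 - foveal_area           # share at reduced res
    return foveal_area + peripheral_area * peripheral_scale ** 2

# Assume a 100-degree field of view, a 20-degree foveal window, and
# peripheral rendering at a quarter of the linear resolution:
fraction = foveated_pixel_fraction(100.0, 20.0, 0.25)
print(f"{fraction:.1%} of the original pixel work")  # → 10.0% of the original pixel work
```

Even with generous assumptions like these, the GPU ends up shading only a small fraction of the pixels it would otherwise have to, which is why eye tracking is seen as an alternative to simply strapping bigger processors to your head.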
On top of that, VR headsets are still much too heavy to wear for any extended periods of time, meaning your trips to the metaverse would have to be kept brief.
Gaming GPUs like the upcoming RTX 3090 Ti are way too big to use in a device you strap to your head, so VR headset manufacturers need to find ways to balance the needs for beefy hardware and a lightweight product.
Beyond the processing power needed for hardware that transitions seamlessly between AR, VR, and real life, another big step forward is required: display technology.
If you’ve used VR before, you may have noticed gaps in the image, as if you were looking through a fine mesh. This is because the pixels that make up the image are not close enough together.
A smartphone’s display is fine for looking at from an arm’s length away, but when the display is a couple of centimetres from your eyeball, it needs at least 60 pixels per degree of field of view.
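A quick calculation shows why that figure is so demanding. Assuming a hypothetical headset with a 100 by 100 degree per-eye field of view (an illustrative figure, not a real product spec), the 60 pixels-per-degree target implies:

```python
# Rough per-eye resolution needed to hit a given pixels-per-degree
# target. 60 ppd is the figure cited in the article; the field-of-view
# numbers below are assumptions for illustration.

def required_resolution(h_fov_deg: float, v_fov_deg: float,
                        ppd: float = 60.0) -> tuple:
    """Per-eye pixel counts needed for the given field of view."""
    return round(h_fov_deg * ppd), round(v_fov_deg * ppd)

# A hypothetical 100 x 100 degree per-eye field of view:
w, h = required_resolution(100, 100)
print(f"{w} x {h} pixels per eye")  # → 6000 x 6000 pixels per eye
```

That is 36 million pixels per eye, far beyond what today’s headset panels deliver, which is why display density is such a sticking point.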
On top of that, most current display technologies aren’t bright enough to provide a clear AR image. Some in the industry are hypothesising the use of microLED technology to overcome this, but microLED displays can’t currently render colour images very well.
Most people involved in the industry believe that hardware constraints will be what holds us back from realising the metaverse, and that the companies involved in its development will play a crucial role in closing this technological gap.
Given how much needs to be improved in this area, we could be waiting five to ten years for a working metaverse to be realised.