Reimagining the path to the Metaverse

The new buzzword “metaverse” looks set to change the way we live in the future. With Facebook’s transition into a Metaverse company and Chinese tech giant ByteDance’s acquisition of VR headset maker Pico, tech companies have already forayed into the VR segment and are claiming that a virtual reality universe is the future of the Internet.

Author of this article: Andreas Lifvendahl, the CEO of Imint

Derived from the sci-fi literature of the 1990s, the Metaverse refers to an immersive digital space where you can have interactive experiences under a particular identity. Regardless of how you define the concept, many of us have already lived in a metaverse-like ecosystem during the COVID-19 pandemic. For instance, Microsoft Teams and Zoom have just about crossed over from communication tools to metaverse-like platforms, and we immerse ourselves in incredible virtual reality games, concerts or sports matches through a VR headset.

Now, with the increasing adoption of VR/AR devices and the maturing of 5G and AI, we are approaching another phase shift in the way camera video is used and optimized to build a Metaverse.

Connecting to Metaverse through VR/AR devices

VR and AR technologies are vital components of the Metaverse. You need hardware such as VR headsets or smart glasses to see and experience the virtual world. These devices use cameras to track user movement and provide an immersive simulation that gives users the feeling of being inside a virtual world. In this way, camera sensors become the entry points that bridge the virtual and physical worlds.
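
To make the tracking step concrete, here is a minimal, purely illustrative sketch (not Imint’s or any vendor’s actual implementation) of how raw motion-sensor samples can be turned into a head-orientation estimate, the pose signal a headset needs before it can render the virtual scene from the right viewpoint. Real trackers fuse camera and inertial data with far more sophisticated filters.

```python
import numpy as np

def integrate_gyro(orientation, gyro_rad_s, dt):
    """Dead-reckon head orientation from one IMU sample.

    orientation : np.array([yaw, pitch, roll]) in radians
    gyro_rad_s  : angular velocity reported by the headset IMU, in rad/s
    dt          : time since the previous sample, in seconds

    Production trackers use quaternions and fuse camera (inside-out
    tracking) and IMU data to correct drift; this only shows the basic
    idea of turning raw sensor samples into a pose estimate.
    """
    return orientation + gyro_rad_s * dt

# Hypothetical usage: 1 kHz IMU samples while the head turns ~30 deg/s in yaw.
orientation = np.zeros(3)
for _ in range(1000):  # one second of samples
    sample = np.array([np.radians(30.0), 0.0, 0.0])
    orientation = integrate_gyro(orientation, sample, dt=0.001)

print(np.degrees(orientation))  # roughly [30, 0, 0] degrees
```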

We are increasingly seeing VR/AR applied in almost all aspects of society, especially in education, telecommuting, healthcare and gaming. With the Chinese government’s goal of becoming a global leader in VR technologies by 2025, the industry is poised for robust growth: according to IDC, the compound annual growth rate of China’s VR and AR market over the next five years is expected to be 67.5 percent.

And this trend is accelerating further as the Metaverse is built. In particular, the VR/AR ecosystem in China is evolving beyond its early stage of addressing specific issues toward a fully functioning ecosystem that can benefit all industries.

Overcoming the challenges of VR/AR video technologies

While the development of the VR/AR industry seems promising, widespread industry application still faces challenges, especially in terms of video technology.

First, the cameras in a VR/AR device are constantly in motion during use, and the captured images must be kept clear and stable to give users the best video experience. By integrating Vidhance video stabilization software into VR/AR devices, the video content stays smooth, sharp and stable.
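
As a rough illustration of the general principle behind video stabilization (a simplified sketch, not Vidhance’s actual algorithm), the idea is to estimate the camera’s motion path, low-pass filter it, and offset each frame by the difference between the raw and smoothed paths:

```python
import numpy as np

def smooth_path(path, window=15):
    """Low-pass filter a per-frame camera motion path with a moving average."""
    kernel = np.ones(window) / window
    return np.convolve(path, kernel, mode="same")

def stabilizing_offsets(raw_path, window=15):
    """Per-frame offsets that move the perceived camera path onto the
    smoothed trajectory instead of the raw, shaky one."""
    return smooth_path(raw_path, window) - raw_path

# Hypothetical 1-D example: a slow intended pan plus high-frequency hand shake.
frames = np.arange(300)
intended = 0.2 * frames                 # slow pan, in pixels per frame
shake = 3.0 * np.random.randn(300)      # jitter from device motion
raw = intended + shake

stabilized = raw + stabilizing_offsets(raw)
print("residual jitter (pixels):", np.std(stabilized - intended))
```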

Second, a major challenge in VR video processing today is the excessive power consumption of VR/AR devices. The power requirement will only grow as users demand higher resolutions, a practical challenge for energy- and thermal-constrained mobile VR devices. One of the strengths of Vidhance is its low power consumption, partly thanks to the intelligent use of sensor data instead of image-based calculations.
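
To see why sensor-driven motion estimation saves power, consider a back-of-the-envelope comparison. The functions and numbers below are illustrative assumptions, not measurements of Vidhance or of any particular device: a gyroscope sample yields the camera’s rotation with a handful of arithmetic operations, while estimating the same motion from pixels (here, worst-case full-search block matching) touches billions of pixel values per 4K frame.

```python
# Back-of-the-envelope comparison; illustrative numbers only, not
# measurements of Vidhance or of any specific device.

def gyro_rotation(gyro_rad_s, dt):
    """Per-frame rotation straight from an IMU sample: a few multiplies."""
    return tuple(w * dt for w in gyro_rad_s)

def block_matching_cost(width, height, block=16, search=16):
    """Approximate pixel comparisons for worst-case full-search block
    matching over one frame, a classic image-based motion estimate."""
    blocks = (width // block) * (height // block)
    candidates = (2 * search + 1) ** 2
    return blocks * candidates * block * block

print(gyro_rotation((0.10, 0.02, 0.0), dt=1 / 30))                 # three multiplies
print(f"{block_matching_cost(3840, 2160):,} pixel ops per 4K frame")
```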

Third, VR/AR devices need to perform all video processing in parallel and in real time. You should not have to wait for time-consuming post-processing of the video material; instead, you can watch the video as a livestream, or, if that is not an option, the video is ready as soon as you pick up the device.
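
The difference between a real-time pipeline and post-processing can be sketched in a few lines. This is a toy illustration, not how any particular device schedules its work: in the streaming case each frame is enhanced as soon as it is captured, so the result can be viewed live; in the batch case the whole clip must exist before processing can even start.

```python
import time

def capture(num_frames):
    """Stand-in for a camera: yields frames one by one as they are captured."""
    for i in range(num_frames):
        time.sleep(0.001)            # pretend each capture takes 1 ms
        yield f"frame-{i}"

def enhance(frame):
    """Stand-in for per-frame video processing work."""
    return frame + "-enhanced"

# Real-time style: every frame is processed (and could be displayed or
# streamed) the moment it arrives, so there is nothing left to wait for.
for frame in capture(5):
    print(enhance(frame))

# Post-processing style: the whole clip has to be captured first and then
# processed in a second pass before anyone can watch it.
clip = list(capture(5))
processed = [enhance(f) for f in clip]
```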

In addition, since a key use case of VR is 360° video, it is important to leverage 360° camera sensors to optimize the video experience. Vidhance comes with a flexible, clear-cut and modular API, which leaves you free to tune the parameters of the different features to suit your specific needs. You can test and tune the processing to get the best possible result for your device.
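
We cannot reproduce the actual Vidhance API here, so the snippet below is hypothetical, with invented parameter names. It is only meant to illustrate the kind of per-device tuning profile that a modular stabilization API makes possible:

```python
from dataclasses import dataclass

@dataclass
class StabilizationConfig:
    """Hypothetical tuning parameters; real API parameter names will differ."""
    smoothing_strength: float = 0.8   # 0 = off, 1 = maximum path smoothing
    crop_margin: float = 0.10         # fraction of the frame reserved for compensation
    horizon_lock: bool = True         # keep the horizon level in 360-degree capture
    low_light_mode: bool = False      # trade some sharpness for noise reduction

# Example profiles tuned for different devices and use cases.
action_360_profile = StabilizationConfig(smoothing_strength=0.95, crop_margin=0.15)
remote_mentoring_profile = StabilizationConfig(smoothing_strength=0.6, low_light_mode=True)
```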

Accelerating the path to Metaverse through Vidhance

Building the Metaverse aligns with our vision of continuing to push the boundaries of the video experience. We are working on a new generation of video enhancement technologies to create even better video experiences and enable more flexible optimizations.

Vidhance can be applied to all kinds of VR/AR devices equipped with moving cameras. It offers low power consumption, high performance in low-light conditions, both real-time and post-processing modes, and 4K video support, providing users with clear, stabilized video.

These technologies are already used in smart glasses. With stable, high-quality, real-time video processing for remote mentoring, smart glasses allow remote technicians to “see” what frontline workers see while maintaining a safe distance during the pandemic, or to bring their expertise to bear without incurring travel costs.

This year, we announced a unique collaboration with RealWear, the world’s leading developer of industrial-grade assisted reality connected devices. The purpose of the collaboration is to bring superior video quality to RealWear’s award-winning assisted reality wearable computer for frontline workers.

A Metaverse scenario and what it will require from future camera devices

The Metaverse is a concept rather than a ready blueprint. In one longer-term scenario, principles like “immersive” will be coupled with “immediate” and “omnipresent”. In a world of connected, smart, camera-equipped devices, we could decouple our sensory presence from our physical presence; in this future scenario we would bring the senses from the sensors. By instantly coupling all available camera data and stitching it seamlessly, we could be virtually present in many other environments, for both entertainment and productive purposes. This world would have full information on each camera-carrying device, both image data and device metadata, and would provide it to the metaverse cloud. Imint is well positioned to provide key pieces of this puzzle.
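
As a toy illustration of the stitching step in this scenario, the sketch below maps viewing directions from two hypothetical camera-carrying devices into a shared equirectangular panorama. The device names, directions and resolution are invented; a real system would also need calibration, synchronization, blending and much more:

```python
def direction_to_equirect(yaw_deg, pitch_deg, pano_w=2048, pano_h=1024):
    """Map a viewing direction to pixel coordinates in an equirectangular
    panorama (longitude along x, latitude along y)."""
    u = (yaw_deg % 360.0) / 360.0 * pano_w
    v = (90.0 - pitch_deg) / 180.0 * pano_h
    return int(u), int(v)

# Two hypothetical camera-carrying devices, each reporting the direction it
# is looking in as metadata, contribute to the same shared panorama.
devices = {"device_a": (0.0, 5.0), "device_b": (180.0, -10.0)}
for name, (yaw, pitch) in devices.items():
    print(name, "->", direction_to_equirect(yaw, pitch))
```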

A true Metaverse is not here yet. We still need time for 5G, AI, VR/AR, IoT and other emerging technologies to converge into a completely immersive environment that combines the virtual and real worlds, and we still need to figure out the best way to integrate the Metaverse with the real economy. But with continuous technological innovation we are on the right track. The future is revved up for exciting opportunities, and we are ahead of the pack.

Let’s redefine video stabilization testing together

At Imint, we are leading the transition to a new paradigm of how video stabilization is defined, tested, optimized and used. We’re working on a new generation of our world-leading video enhancement platform Vidhance to create more realistic video experiences and enable more flexible optimizations. Learn more in our guide, “Redefining video stabilization quality”.

Nobody yet has all the answers to questions such as how video stabilization test criteria should be adapted, but let’s work on them together. We want to share our knowledge and help craft meaningful test criteria for next-generation video stabilization. Contact us to continue the dialogue. For inspiration, insights and best practices for the next generation of video stabilization, enter your email address below and subscribe to our newsletter.
