“The Future is Always Beginning Now” — Mark Strand, US Poet Laureate
What a week this has been. The Consumer Electronics Show 2022, which took place in Las Vegas from Wednesday to Saturday, featured a plethora of newsmaking unveilings. While we did appreciate the announcement of many amusing new devices, such as a smart lock that lets you unlock your front door with your Apple Watch, we appreciated far more the sheer magnitude of the attention given to all things Metaverse this year.
This exceptional attention helps explain why CES named the evolution of the Metaverse and intelligent automation as the two most compelling technology megatrends of the future. We're glad to have convinced them when it comes to the Metaverse. Exhibitors at the show ran the gamut from gaming and vehicle technology to Augmented Reality eyeglasses. So many exhibitors tried to seize on the hype around the Metaverse that industry veteran Nima Zeighami put together quite a collection of the most egregious uses of the theme in this fun Twitter thread.
Without doubt, now that the Metaverse has become the buzzword it has, many companies will seek to exploit it to their advantage. But only some are building real products and tools that will contribute to making it a reality. Thankfully, we are here to sort out the chocolate chips from the computer chips, or in proper (read: archaic) English, to separate the wheat from the chaff, so you don’t have to.
In 2016, Dirk Van Gelder, then a senior engineering lead at Pixar Animation Studios, presented a new technology called Universal Scene Description (USD for short, not to be confused with Uncle Sam's green notes) at the prestigious SIGGRAPH computer graphics conference to cheers from the crowd. USD, first used in production on Pixar's Finding Dory feature film, would soon become a critical tool, streamlining the animated filmmaking process by merging the many different tools used in producing animated movies into one integrated pipeline.
Fast forward to 2020, when Mr. Van Gelder jumped ship to join the NVIDIA team, which was incorporating USD as the basis for a new platform it had been developing, the culmination of 20 years of work, named Omniverse. Omniverse brings together graphics, Artificial Intelligence, simulation, and computing power into a single online toolkit that powers 3D workflows and creative applications. In simpler terms, it provides the plumbing needed to create functional virtual worlds, just as pipes are needed to make a building functional. Thanks to Omniverse and USD, developers and artists can collaboratively create 3D assets and scenes that can be incorporated and accurately represented across virtual worlds, even when those worlds are not hosted on the same platform.
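For the curious, here is a sketch of what a USD layer can look like on disk, in its human-readable `.usda` text format. The prim names and the referenced file path are purely illustrative, not taken from any Pixar or NVIDIA example:

```usda
#usda 1.0
(
    doc = "Illustrative scene layer"
)

def Xform "Scene"
{
    # A simple sphere primitive with a radius and a display color
    def Sphere "Ball"
    {
        double radius = 2.0
        color3f[] primvars:displayColor = [(0.2, 0.5, 0.8)]
    }

    # Composition: this prim pulls in geometry authored in a separate
    # layer, which is how different tools and artists contribute to
    # one scene without overwriting each other's work
    def "Props" (
        references = @./props.usda@
    )
    {
    }
}
```

Because each contributor authors a separate layer and USD composes the layers non-destructively, an artist in one application and a developer in another can work on the same scene concurrently. That layered collaboration model is precisely what Omniverse builds on.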
By sharing the power of the Omniverse with the global community, NVIDIA hopes to achieve a grand ambition: becoming the irreplaceable plumbing underlying all the Metaverses being developed. Such an achievement would guarantee it a major role in the next decades of computing innovation. There is a reason why it has seen its stock increase more than 100% over the past 12 months vs. Intel’s 3%. The race is on among the computer hardware companies to provide the critical infrastructure for the Metaverse, and NVIDIA has taken a sharp lead.
One of the most interesting and thought-provoking presentations at CES was Hyundai Motor Company's unveiling of its vision for the future, which includes a concept it has coined Metamobility. We should have guessed they had big ambitions back in 2020, when they announced the acquisition from SoftBank of a controlling interest in Boston Dynamics, the dancing-robots company, at a valuation of $1.1 billion. But we could not have guessed the exact form those ambitions would take.
If Hyundai has its way, the following will be possible: you could be in the middle of touring a Parmigiano Reggiano dairy in picturesque Modena, Italy, when, while tasting different ages of the heavenly substance, you feel some remorse about not sharing this acute sensory experience with your pet Labrador back home in NYC. So you naturally plug into the Metaverse, access the digital twin of your home, and retrieve a treat to give him before petting him, all through the use of a real-life avatar robot.
Quite seriously, Hyundai intends to lead the charge in connecting robots to the Metaverse in order to allow individuals to move freely between the real and virtual worlds. Their goal is nothing less than to enable “unlimited freedom of movement for humankind” and to “allow people to overcome the physical limitations of movement in time and space.” Some of the more practical uses (though we have nothing against petting dogs) will be seen in the manufacturing sector, where linking robots to VR will allow remote workers and experts anywhere in the world to perform physical tasks at a distance, a capability that will also prove useful for troubleshooting. The dancing robots will do more than just dance after all, even though dancing was good enough for us.
In addition to sharing some of the most significant events of the week, our goal with this newsletter is to give our readers a deeper knowledge of the digital sausage-making involved in creating the Metaverse. We will therefore experiment with the inclusion of a Metaverse 201 section, where we give a brief overview of an important, but more technical, subject. This week's focus is one of the five elements highlighted by investment bank and financial services company Jefferies as key to successfully building the Metaverse: Technical Infrastructure.
One of the main obstacles on the way to the utopian vision of the Metaverse is obtaining the immense computational power needed to run a scaled Metaverse that is accessible to billions of people concurrently. Raja Koduri, the head of Intel's Accelerated Computing Systems and Graphics Group, estimated that computing capacity needs to increase by more than 1,000x to achieve this feat. A quick calculation using Moore's Law, defined as computational capacity doubling every two years, indicates that roughly 20 more years of innovation are needed to get there: a 1,000x increase requires just under ten doublings, and ten doublings at two years apiece makes twenty years.
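For readers who like to check the arithmetic, the back-of-the-envelope calculation can be reproduced in a few lines of Python. The only inputs are the 1,000x target and the two-year doubling period, both taken from the paragraph above:

```python
import math

required_increase = 1000  # Koduri's estimate: >1,000x today's capacity
years_per_doubling = 2    # Moore's Law as defined above

# How many doublings does a 1,000x increase require?
doublings = math.log2(required_increase)

# Translate doublings into calendar years
years = doublings * years_per_doubling

print(f"{doublings:.2f} doublings")  # ~9.97
print(f"{years:.1f} years")          # ~19.9, i.e. roughly 20 years
```

In other words, 2^10 = 1,024, so ten doublings comfortably cover a 1,000x increase, and the two-decade horizon follows directly.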
Fear not, however: this doesn't mean we simply have to wait, arms crossed, until then. We would be out of business quite quickly if that were the case, and that would be a shame, as we have much more in store. As evidenced by the 2020 Travis Scott concert that saw, albeit with serious planning and clever technical tricks, 12.3 million people attend concurrently, it is already possible to figuratively taste the Metaverse. Nevertheless, it is the pace of hardware breakthroughs, in areas such as cloud offloading and edge computing, that will determine the timeline for realizing the Reality-Virtuality Continuum first pondered by Professors Paul Milgram and Fumio Kishino in the winter of 1994. All of which underscores the statement by Jefferies' Global Head of Thematic Research Simon Powell that Metaverse investors should initially direct their attention to the hardware companies involved, as they will be the cornerstones of it all.
Thanks for reading, until next week,
The team at Introducing Meta.
If you enjoyed this newsletter, you can join our community of readers by subscribing on our website to receive each weekly issue as soon as it goes out.
Disclaimer: This newsletter is distributed for general informational and educational purposes only and the opinions expressed therein are not intended to constitute investment advice.
Sign up to stay informed about the most impactful events, the latest innovations, and the actors working diligently to shape our tomorrow.