
Nvidia has created Earth-2 to predict the weather using AI



Nvidia’s real-time 3D graphics platform, Omniverse, is taking a leap forward at Computex 2024. With Earth-2, a simulation of our entire planet, Team Green leverages accelerated computing and generative artificial intelligence (AI) to predict the weather and protect you from the elements.

Dubbed a ‘digital twin’ of our very own home, Nvidia uses AI to “predict the future of our planet to better avert disasters or understand climate change.” Currently, it’s only implemented in Taiwan via its local Central Weather Administration to spot typhoon landfalls, but the aim is to take the model global so we can all “better understand the impact of today’s actions on tomorrow’s world.”

Using the CorrDiff generative AI model, trained on Weather Research and Forecasting (WRF) data, Nvidia says it can already generate weather patterns at 12 times the resolution of other tools, narrowing from 25km down to just 2km. This makes it 1,000 times faster and 3,000 times more efficient than traditional physical simulation alternatives. The next step is to narrow this even further, predicting the weather to within tens of metres.
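To put those numbers in context, here’s a minimal, hypothetical sketch (Python with NumPy; none of this is Nvidia’s actual code) contrasting the naive way to fill in a finer grid with what a generative downscaler has to do instead:

```python
import numpy as np

# Figures from the article: CorrDiff downscales forecasts from a
# 25km grid to a 2km grid, which Nvidia rounds to a 12x resolution jump.
coarse_res_km = 25.0
fine_res_km = 2.0
factor = round(coarse_res_km / fine_res_km)  # ~12x per axis

# A toy coarse field covering a 1,000km x 1,000km region (40 x 25km cells).
coarse = np.random.rand(40, 40)

# Naive baseline: repeat each coarse cell across the fine grid. A
# generative model like CorrDiff instead synthesises the plausible
# fine-scale structure (fronts, convection) that this copying misses.
baseline = np.kron(coarse, np.ones((factor, factor)))

print(f"{coarse.shape} -> {baseline.shape} ({factor}x per axis)")
```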

Once Omniverse has the information, it feeds it into PALM, a physics model that simulates atmospheric and oceanic boundary layers, to give you a visual representation. How this is then presented to the public is in the hands of whoever runs the simulation, but my guess is that it’ll probably use a digital human.
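As a rough sketch of that flow (every function body below is a stand-in I’ve invented for illustration, not Nvidia’s or PALM’s real APIs):

```python
# Placeholder stubs sketching the Earth-2 flow described above; none
# of these names correspond to real Nvidia or PALM interfaces.

def corrdiff_downscale(coarse):
    """Stand-in for CorrDiff's generative 25km -> 2km downscaling."""
    return {"resolution_km": 2.0, "fields": coarse["fields"]}

def palm_simulate(fine):
    """Stand-in for PALM's atmospheric/oceanic boundary-layer physics."""
    return {"boundary_layer": "near-surface detail"}

def omniverse_render(fine, layers):
    """Stand-in for Omniverse turning the data into visuals."""
    return f"{fine['resolution_km']}km fields + {layers['boundary_layer']}"

coarse_forecast = {"resolution_km": 25.0, "fields": "WRF-style inputs"}
fine = corrdiff_downscale(coarse_forecast)
print(omniverse_render(fine, palm_simulate(fine)))
```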

Image: An Nvidia digital human looks at the user.

What is a digital human?

In an effort to make machine interaction as realistic and empathetic as possible, Nvidia has created what it calls a digital human. It goes several steps beyond ChatGPT, starting by accepting voice prompts in conversation rather than typed text. It then crafts a response in real time and delivers it through a humanoid avatar on your screen. The avatar can also react to your surroundings using your camera. This opens up a whole host of privacy concerns, but that’s par for the course these days.
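Sketched as a loop, the interaction might look something like this (every function here is a hypothetical placeholder, not Nvidia’s API):

```python
# Hypothetical placeholders for the digital-human turn described above:
# speech in, real-time response out through an animated on-screen avatar.

def transcribe(audio: bytes) -> str:
    """Stand-in for a speech-to-text model."""
    return "What's the weather like tomorrow?"

def generate_reply(text: str, camera_frame) -> str:
    """Stand-in for the LLM step, optionally aware of the camera feed."""
    context = f" (camera sees: {camera_frame})" if camera_frame else ""
    return f"Here's tomorrow's forecast...{context}"

def speak_and_animate(reply: str) -> None:
    """Stand-in for text-to-speech plus facial animation."""
    print(f"[avatar says] {reply}")

def digital_human_turn(mic_audio: bytes, camera_frame=None) -> None:
    speak_and_animate(generate_reply(transcribe(mic_audio), camera_frame))

digital_human_turn(b"<audio>", camera_frame="user's desk")
```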


Digital humans will supposedly occupy the AI interior design space, helping you renovate your home. They’ll be your personal healthcare worker, providing curated care in a more timely manner. You’ll find them as customer service agents, tutors, and, perhaps my least favourite, AI brand ambassadors who tell you what you should wear and what the next trendy tech thing is.

This is possible because each uses a pre-trained AI model called an Nvidia Inference Microservice (NIM). For simplicity, you can think of them as individual digital people who are experts in their field. Eventually, you’ll be able to assemble a team of NIMs to combine their expertise, enhancing your company or just your general day-to-day. You won’t even need to think too carefully about how to set up the team, as you can give a task to the leader and let it pick the right candidates to figure out the problem.
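A toy sketch of that leader-and-specialists idea (the routing rules and the “NIM” stand-ins below are entirely hypothetical):

```python
# Hypothetical specialist "NIMs" as plain functions; real NIMs are
# containerised models served over standard inference endpoints.
SPECIALISTS = {
    "design":     lambda task: f"[interior-design NIM] plan for: {task}",
    "healthcare": lambda task: f"[healthcare NIM] guidance on: {task}",
    "support":    lambda task: f"[customer-service NIM] answer to: {task}",
}

def leader(task: str) -> str:
    """Stand-in for the leader model picking the right candidate."""
    if any(word in task for word in ("renovate", "room", "decor")):
        pick = "design"
    elif any(word in task for word in ("symptom", "pain", "care")):
        pick = "healthcare"
    else:
        pick = "support"
    return SPECIALISTS[pick](task)

print(leader("help me renovate my living room"))
```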

Since the brand packages and optimises each NIM to run across the current CUDA install base, you’ll even be able to run a lot of these AI efforts locally on your own device, provided you have an Nvidia GPU. Your mileage may vary, though, as newer graphics cards with more TOPS will handle tasks better.
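If you want a quick, generic check of what your machine could handle (using PyTorch here purely as a convenient probe; it’s not an official NIM tool):

```python
import torch

# Generic CUDA capability probe; local NIM requirements vary per model.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"CUDA GPU: {props.name}, {props.total_memory / 1e9:.1f} GB VRAM")
else:
    print("No CUDA GPU detected; local acceleration isn't available.")
```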

Current examples of digital humans are a little off, to say the least, but it’s early days. Nvidia CEO Jensen Huang caveated their appearance before the presentation, highlighting that while they are more engaging, you still need to “cross the uncanny chasm of realism” first. Credit where it’s due, though: it’s not easy to create something like this. It’s essentially AIs piggybacking on AIs to create the animated 3D mesh, complete with real-time path-traced subsurface scattering to simulate light hitting the skin.


The untapped potential of NIMs is both exciting and terrifying. On one hand, it would be nice to get timely medical attention without putting strain on the NHS here in Blighty or mounting extraordinary bills in the US, but the inaccuracy of large language models (LLMs) thus far doesn’t inspire confidence when handling such sensitive topics. It also raises the question of where Nvidia gathers the data to train its NIMs, and whether it’s sourced ethically.

For more from Nvidia this Computex, check out the new SFF Enthusiast GeForce branding for compact PCs and Project G-Assist, which gives you an AI gaming assistant.


