Nvidia has announced the launch of the EGX Edge Supercomputing Platform, designed to let organisations easily deploy the hardware and software necessary for high-performance, low-latency AI workloads. Instead of sitting inside big data centres, an EGX deployment is designed to run at the edge of the cloud, which, Nvidia believes, makes it ideal for the next generation of use cases.
“We’ve entered a new era, where billions of always-on IoT sensors will be connected by 5G and processed by AI,” Jensen Huang, Nvidia founder and CEO, said at a keynote ahead of MWC Los Angeles earlier this week. “Its foundation requires a new class of highly secure, networked computers operated with ease from far away.
“We’ve created the Nvidia EGX Edge Supercomputing Platform for this world, where computing moves beyond personal and beyond the cloud to operate at planetary scale,” he added.
The EGX stack includes an Nvidia driver, a Kubernetes plug-in, the Nvidia container runtime, and GPU monitoring tools, delivered through the Nvidia GPU Operator, which lets organisations standardise and automate the deployment of all the components needed to provision GPU-enabled Kubernetes systems.
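Once the GPU Operator has provisioned a cluster, workloads request GPUs through Kubernetes' standard extended-resource mechanism. As a minimal illustrative sketch (the pod name and container image are placeholders, not part of Nvidia's announcement):

```yaml
# Hypothetical pod spec: requests one GPU via the device plugin
# that the GPU Operator deploys (extended resource "nvidia.com/gpu").
apiVersion: v1
kind: Pod
metadata:
  name: gpu-test                 # placeholder name
spec:
  restartPolicy: Never
  containers:
  - name: cuda-sample
    image: nvidia/cuda:12.2.0-base-ubuntu22.04   # placeholder image/tag
    command: ["nvidia-smi"]
    resources:
      limits:
        nvidia.com/gpu: 1        # pod is scheduled only onto GPU-enabled nodes
```

The scheduler places such a pod only on nodes where the operator has installed the driver and advertised GPU capacity, which is what makes the "deploy anywhere at the edge" model workable.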
Nvidia will certify hardware as ‘NGC Ready for Edge’ that customers will be able to buy from partners such as Advantech, Altos Computing, ASRock RACK, Atos, Dell Technologies, Fujitsu, GIGABYTE, Hewlett Packard Enterprise, Lenovo, MiTAC, QCT, Supermicro, and TYAN.
Nvidia says EGX is already being used by customers. At Walmart’s Intelligent Retail Lab in Levittown, New York, for example, EGX enables real-time processing of the more than 1.6 terabytes of data generated each second to “automatically alert associates to restock shelves, open up new checkout lanes, retrieve shopping carts, and ensure product freshness in meat and produce departments.”
Nvidia lists Samsung, BMW, NTT East, and Procter & Gamble among the early adopters of EGX.
The EGX platform features software to support a wide range of applications, including Nvidia Metropolis, which can be used to power smart cities and build intelligent video analytics applications. The city of Las Vegas, for example, is using EGX to capture vehicle and pedestrian data to make its streets safer. San Francisco’s Union Square Business Improvement District is using EGX to capture real-time pedestrian counts for local retailers.
“We use our smartphones sporadically — we type into it, or watch a movie now and then — and frankly there are only seven and a half billion of us,” Huang said. “In the case of sensors, it will be streaming all the time. It’s always on, continuously on, and it’s impractical to stream all of that data to the cloud from all of those devices, which we believe someday will be trillions of devices, not billions of devices. And so it’s too costly, it’s not cost-effective, and it’s not practical to stream all of that sensor information up to the cloud.”
“We’ve been working the last several years on building a computing system, a computing platform we call the EGX — Edge GPU Acceleration Computing, that’s what it stands for — and to be able to put that type of computing, which is the same architecture that we put into a self-driving car,” he added. “It’s the same architecture that we use to run the world’s most powerful supercomputers. We’ve industrialised it, turned it into a data centre form-factor, and we’ve incorporated security technology, networking technology, storage technology that is necessary for an extremely secure data centre in a box.”
Disclosure: Nvidia sponsored the correspondent’s flights and hotel for the event in Los Angeles.