• NVIDIA’s new edge computing solution uses built-in AI capabilities to analyze and process data generated by IoT sensors close to the points of data collection.
• Leveraging a range of infrastructure partners and integrating with the leading public cloud providers should help to establish NVIDIA as a leading player in the growing edge computing subsector.
In recent weeks it feels as though every major technology company has become intent on capturing the emerging opportunities associated with edge computing, a burgeoning technology subsector being driven partly by rising enterprise innovation with the Internet of Things (IoT). Companies with notably strong edge computing strategies include telecoms network operators like Deutsche Telekom and AT&T, public cloud service providers such as Amazon Web Services and Microsoft, and IT infrastructure vendors like Dell EMC and Hewlett Packard Enterprise.
In May 2019, graphics processing unit (GPU) specialist NVIDIA became one of the latest tech firms to target the edge computing opportunity, announcing the launch of the NVIDIA EGX server, a solution that uses built-in artificial intelligence (AI) capabilities to analyze and process data generated by IoT sensors close to the points of data collection.
With ever larger volumes of data expected to be generated by IoT sensors and 5G mobile networks, it will become essential for cost and performance reasons to process that data close to where it is generated, rather than transporting it over long distances to central data centers. This is where edge computing comes in: an approach that brings computing power closer to the places where data is collected and where digital content and applications are consumed. The benefits of using AI at the edge include the ability to make more intelligent, real-time decisions about which data to keep and how to use it.
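The filtering idea behind this can be illustrated with a short sketch. The Python example below is purely hypothetical: the sensor feed, scoring function and threshold are stand-ins rather than NVIDIA APIs, and the "model" is reduced to a simple normalization so the sketch stays self-contained. The point is only to show how an edge device can score readings locally and forward just the interesting ones, discarding low-value data before it ever crosses the network.

```python
import random
import time

ANOMALY_THRESHOLD = 0.8  # hypothetical cut-off for "interesting" readings


def read_sensor() -> float:
    """Stand-in for an IoT sensor feed (e.g. a temperature or vibration probe)."""
    return random.uniform(0.0, 100.0)


def score_reading(value: float) -> float:
    """Stand-in for an AI model running on the edge device.

    In a real deployment this would be a trained model executing on local
    hardware; a simple normalization keeps the sketch self-contained.
    """
    return value / 100.0


def main() -> None:
    kept, dropped = 0, 0
    for _ in range(20):  # process a short burst of readings
        value = read_sensor()
        score = score_reading(value)
        if score >= ANOMALY_THRESHOLD:
            kept += 1
            print(f"forwarding reading {value:.1f} (score {score:.2f}) to the data center")
        else:
            dropped += 1  # low-value data never leaves the edge
        time.sleep(0.01)
    print(f"kept {kept} readings, dropped {dropped} at the edge")


if __name__ == "__main__":
    main()
```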
The NVIDIA EGX leverages server infrastructure from a choice of providers, including Cisco, Dell EMC, Fujitsu, HPE, Inspur, and Lenovo, combining compute with storage, networking and security technologies from Cisco and Mellanox, and with a choice of NVIDIA AI processors. NVIDIA offers several AI processors, ranging from high-performance T4 products designed for real-time speech recognition and other real-time AI operations, through to the 70mm x 45mm Jetson Nano, announced in March, which is designed to power small, low-power AI systems such as those found in embedded IoT applications.
The new EGX server will also make use of a new container-based software platform called NVIDIA Edge Stack. This allows users to manage complex distributed IT footprints comprising multiple edge servers and IoT sensor locations. NVIDIA Edge Stack also integrates with Red Hat OpenShift and connects to major cloud IoT services such as AWS IoT Greengrass and Microsoft Azure IoT Edge. This ensures that AI applications developed in the cloud can run on NVIDIA EGX and vice versa.
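As a rough illustration of the "develop in the cloud, run at the edge" pattern described above, the Python sketch below selects its data sink purely from configuration, so the same containerized application could run on an edge node or in a public cloud unchanged. The endpoint URLs, the DEPLOY_TARGET variable and the payload fields are illustrative assumptions and not part of NVIDIA Edge Stack or any cloud IoT service.

```python
"""Minimal sketch of configuration-driven portability: identical application
code targets an edge collector or a cloud ingestion endpoint depending on how
the deployment is configured. All endpoint names here are hypothetical."""
import json
import os
import urllib.request

# Hypothetical ingestion endpoints: a local collector on the edge server
# versus a managed cloud IoT service.
ENDPOINTS = {
    "edge": "http://localhost:8080/ingest",
    "cloud": "https://iot.example-cloud.com/ingest",
}


def publish_result(result: dict, target: str) -> None:
    """Send an inference result to whichever tier this deployment targets."""
    request = urllib.request.Request(
        ENDPOINTS[target],
        data=json.dumps(result).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(request, timeout=2)
    except OSError as exc:  # the sketch has no real endpoint to talk to
        print(f"would publish to {ENDPOINTS[target]}: {result} ({exc})")


if __name__ == "__main__":
    # The container image is identical in both cases; only DEPLOY_TARGET changes.
    publish_result(
        {"camera": "dock-3", "event": "anomaly", "score": 0.91},
        os.environ.get("DEPLOY_TARGET", "edge"),
    )
```

In practice the orchestration layer, rather than an environment variable, would decide where such a workload runs, but the design choice is the same: keep the application agnostic about its location so it can move between cloud and edge without modification.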
NVIDIA’s approach to edge computing, which leverages partnerships with a broad range of technology vendors and integration with the leading public cloud providers, should help to establish it as a leading player in this growing technology subsector. It is notable that NVIDIA already has more than 40 early adopters of its new edge computing solution, including major multinationals like BMW Group and Foxconn.
Nevertheless, despite this strong start in providing tailored AI-powered solutions for emerging edge computing requirements, NVIDIA and other edge technology providers will likely have to help potential customers address a range of adjacent concerns around edge computing and distributed IT, including all-important questions about security, architecture design, and the placement and management of diverse sets of workloads.