
Red Hat Unveils NVIDIA NIM Integration on Red Hat OpenShift AI

DENVER - RED HAT SUMMIT 2024 - May 7, 2024 -

Red Hat, Inc., the world's leading provider of open source solutions, today announced upcoming integration support for NVIDIA NIM microservices on Red Hat OpenShift AI to enable optimized inferencing for dozens of artificial intelligence (AI) models backed by a consistent, open source AI/ML hybrid cloud platform. Organizations will be able to use Red Hat OpenShift AI with NVIDIA NIM - a set of easy-to-use inference microservices included as part of the NVIDIA AI Enterprise software platform - to accelerate the delivery of generative AI (GenAI) applications for faster time to value.


Support for NVIDIA NIM on Red Hat OpenShift AI builds on existing optimization for NVIDIA AI Enterprise on Red Hat's industry-leading open hybrid cloud technologies, including Red Hat Enterprise Linux and Red Hat OpenShift. As part of this latest collaboration, NVIDIA will enable NIM interoperability with KServe, an open source, Kubernetes-based model serving project for highly scalable AI use cases and a core upstream component of Red Hat OpenShift AI. This will help fuel continuous interoperability for NVIDIA NIM microservices within future iterations of Red Hat OpenShift AI.
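
In practice, KServe exposes model servers through an InferenceService custom resource, so a NIM container could be declared the same way other models are served on the platform. The sketch below is a minimal illustration of that idea using the Kubernetes Python client; the namespace, resource names and image reference are hypothetical placeholders rather than details from this announcement.

```python
# Hypothetical sketch: registering a NIM container as a KServe InferenceService
# with the Kubernetes Python client. Namespace, names and image are placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in-cluster

inference_service = {
    "apiVersion": "serving.kserve.io/v1beta1",
    "kind": "InferenceService",
    "metadata": {"name": "nim-llm-example", "namespace": "demo"},
    "spec": {
        "predictor": {
            "containers": [
                {
                    "name": "kserve-container",
                    "image": "example.registry/nim/llm-model:latest",  # placeholder image
                    "resources": {"limits": {"nvidia.com/gpu": "1"}},
                }
            ]
        }
    },
}

api = client.CustomObjectsApi()
api.create_namespaced_custom_object(
    group="serving.kserve.io",
    version="v1beta1",
    namespace="demo",
    plural="inferenceservices",
    body=inference_service,
)
```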

This integration enables enterprises to increase productivity with GenAI capabilities such as expanding customer service with virtual assistants, summarizing IT support cases and accelerating business operations with domain-specific copilots.

By using Red Hat OpenShift AI with NVIDIA NIM, organizations can benefit from:

  • Streamlined integration path for deploying NVIDIA NIM in a common workflow alongside other AI deployments, for greater consistency and easier management.
  • Integrated scaling and monitoring for NVIDIA NIM deployments in coordination with other AI model deployments across hybrid cloud environments.
  • Enterprise-grade security, support, and stability to ensure a smooth transition from prototype to production for enterprises that run their business on AI.

NVIDIA NIM microservices are designed to speed up GenAI deployment in enterprises. Supporting a wide range of AI models, including open source community models, NVIDIA AI Foundation models, and custom models, NIM delivers seamless, scalable AI inferencing on-premises or in the cloud through industry-standard application programming interfaces (APIs).
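
As a minimal sketch of what consuming such an API can look like, the snippet below sends a request to an OpenAI-compatible chat completions endpoint, which NIM services commonly expose; the service URL and model name are assumed placeholders, not values from this announcement.

```python
# Minimal sketch of calling a deployed NIM microservice over its HTTP API.
# Assumes an OpenAI-compatible /v1/chat/completions endpoint; URL and model
# name below are placeholders.
import requests

NIM_URL = "http://nim-llm-example.demo.svc.cluster.local:8000/v1/chat/completions"

response = requests.post(
    NIM_URL,
    json={
        "model": "example-model",
        "messages": [{"role": "user", "content": "Summarize this IT support ticket: ..."}],
        "max_tokens": 256,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```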

Red Hat Summit

Join the Red Hat Summit keynotes to hear the latest from Red Hat executives, customers and partners.

Supporting Quotes

Chris Wright, chief technology officer and senior vice president, Global Engineering, Red Hat

"In this collaboration with NVIDIA, Red Hat is hyper-focused on breaking down the barriers and complexities associated with rapidly building, managing and deploying gen AI-enabled applications. Red Hat OpenShift AI provides a scalable, flexible foundation to extend the reach of NIM microservices, empowering developers with pre-built containers and industry-standard APIs, all powered by open source innovation."

Justin Boitano, vice president, Enterprise Products, NVIDIA

"Every enterprise development team wants to get their generative AI applications into production as quickly and securely as possible. Integrating NVIDIA NIM in Red Hat OpenShift AI marks a new milestone in our collaboration as it will help developers rapidly build and scale modern enterprise applications using performance-optimized foundation and embedding models across any cloud or data center."
