Nutanix extends AI platform to public cloud

Nutanix Enterprise AI provides an easy-to-use, unified GenAI experience on-premises, at the edge and now in public clouds.

Nutanix has extended its AI infrastructure platform with a new cloud-native offering, Nutanix Enterprise AI (NAI), which can be deployed on any Kubernetes platform: at the edge, in core data centres and on public cloud services such as AWS EKS, Azure AKS and Google GKE.

The NAI offering delivers a consistent hybrid multicloud operating model for accelerated AI workloads, enabling organisations to leverage their models and data in a secure location of their choice while improving return on investment (ROI).

Leveraging NVIDIA NIM for optimised performance of foundation models, Nutanix Enterprise AI helps organisations securely deploy, run and scale inference endpoints for large language models (LLMs), enabling GenAI applications to be stood up in minutes.
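
NVIDIA NIM microservices expose an OpenAI-compatible inference API, so applications typically talk to such an endpoint with standard client tooling. The sketch below is a minimal illustration of that pattern only; the endpoint URL, API key and model name are hypothetical placeholders, not values taken from Nutanix Enterprise AI.

```python
# Minimal sketch: querying an OpenAI-compatible LLM inference endpoint
# (the API style exposed by NVIDIA NIM microservices). The URL, key and
# model name below are placeholders, not Nutanix Enterprise AI specifics.
from openai import OpenAI

client = OpenAI(
    base_url="https://nai.example.internal/v1",  # hypothetical endpoint URL
    api_key="YOUR_ENDPOINT_API_KEY",             # hypothetical credential
)

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",          # example model identifier
    messages=[{"role": "user", "content": "Summarise this customer ticket..."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```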

Generative AI is an inherently hybrid workload, with new applications often built in the public cloud, fine-tuning of models using private data occurring on-premises, and inferencing deployed closest to the business logic, which could be at the edge, on-premises or in the public cloud. This distributed hybrid GenAI workflow can present challenges for organisations concerned about complexity, data privacy, security, and cost.

“With Nutanix Enterprise AI, we’re helping our customers simply and securely run GenAI applications on-premises or in public clouds. Nutanix Enterprise AI can run on any Kubernetes platform and allows their AI applications to run in their secure location, with a predictable cost model,” said Thomas Cornely, SVP, Product Management, Nutanix.

Justin Boitano, Vice President of Enterprise AI, NVIDIA, said: “Nutanix Enterprise AI can be deployed with the NVIDIA full-stack AI platform and is validated with the NVIDIA AI Enterprise software platform, including NVIDIA NIM, a set of easy-to-use microservices designed for secure, reliable deployment of high-performance AI model inferencing. Nutanix GPT-in-a-Box is also an NVIDIA-Certified System, ensuring reliable performance.”

Key use cases for customers leveraging Nutanix Enterprise AI include: enhancing customer experience with GenAI through analysis of customer feedback and documents; accelerating code and content creation with co-pilots and intelligent document processing; fine-tuning models on domain-specific data to accelerate code and content generation; strengthening security by applying AI models to fraud detection, threat detection, alert enrichment and automatic policy creation; and improving analytics with fine-tuned models on private data.

Jeff Boudier, Head of Product, Hugging Face, said: “Thanks to the deep collaboration between the Nutanix and Hugging Face teams, customers of Nutanix Enterprise AI are able to seamlessly deploy the most popular open models in an easy-to-use, fully tested stack – now also on public clouds.”

Dave Pearson, Infrastructure Research VP, IDC, said: “By providing a consistent experience from the enterprise to public cloud, Nutanix Enterprise AI aims to provide a user-friendly infrastructure platform to support organisations at every step of their AI journey, from public cloud to the edge.”
