Nutanix Enables Agentic AI Anywhere With Latest Release Of Nutanix Enterprise AI

Scoop · 08-05-2025

Nutanix (NASDAQ: NTNX), a leader in hybrid multicloud computing, today announced the general availability of the latest version of the Nutanix Enterprise AI (NAI) solution, adding deeper integration with NVIDIA AI Enterprise, including NVIDIA NIM microservices and the NVIDIA NeMo framework, to speed the deployment of Agentic AI applications in the enterprise.
NAI is designed to accelerate the adoption of generative AI in the enterprise by simplifying how customers build, run, and securely manage models and inferencing services at the edge, in the data centre, and in public clouds on any Cloud Native Computing Foundation® (CNCF)-certified Kubernetes® environment.
The latest NAI release extends a shared model service methodology that simplifies agentic workflows, making deployment and day-two operations easier. It consolidates the resources and models required to deploy multiple applications across lines of business into a secure, common set of embedding, reranking, and guardrail models for agents. This builds on the NAI core: a centralised LLM model repository that creates secure endpoints, making it simple and private to connect generative AI applications and agents.
'Nutanix is helping customers keep up with the fast pace of innovation in the Gen AI market,' said Thomas Cornely, SVP of Product Management at Nutanix. 'We've expanded Nutanix Enterprise AI to integrate new NVIDIA NIM and NeMo microservices so that enterprise customers can securely and efficiently build, run, and manage AI Agents anywhere.'
'Enterprises require sophisticated tools to simplify agentic AI development and deployment across their operations,' said Justin Boitano, Vice President of Enterprise AI Software Products at NVIDIA. 'Integrating NVIDIA AI Enterprise software including NVIDIA NIM microservices and NVIDIA NeMo into Nutanix Enterprise AI provides a streamlined foundation for building and running powerful and secure AI agents.'
NAI for agentic applications can help customers:
Deploy Agentic AI Applications with Shared LLM Endpoints - Customers can reuse existing deployed model endpoints as shared services for multiple applications. This re-use of model endpoints helps reduce usage of critical infrastructure components, including GPUs, CPUs, memory, file and object storage, and Kubernetes® clusters.
Leverage a Wide Array of LLM Endpoints - NAI enables a range of agentic model services, including NVIDIA Llama Nemotron open reasoning models, NVIDIA NeMo Retriever, and NVIDIA NeMo Guardrails. NAI users can leverage NVIDIA AI Blueprints, which are pre-defined, customisable workflows, to jumpstart the development of their own AI applications that use NVIDIA models and AI microservices. In addition, NAI enables function calling for the configuration and consumption of external data sources, helping AI agentic applications deliver more accurate and detailed results.
Support Generative AI Safety - This new NAI release helps customers implement agentic applications in ways consistent with their organisation's policies using guardrail models. These models can filter initial user queries and LLM responses to prevent biased or harmful outputs, and can also maintain topic control and detect jailbreak attempts. For example, NVIDIA NeMo Guardrails provide content filtering that screens out unwanted content and other sensitive topics. Guardrails can also be applied to code generation, providing improved reliability and consistency across models.
Unlock Insights From Data with NVIDIA AI Data Platform - The Nutanix Cloud Platform solution builds on the NVIDIA AI Data Platform reference design and integrates the Nutanix Unified Storage and the Nutanix Database Service solutions for unstructured and structured data for AI. The Nutanix Cloud Infrastructure platform provides a private foundation for NVIDIA's accelerated computing, networking, and AI software to turn data into actionable intelligence. As an NVIDIA-Certified Enterprise Storage solution, Nutanix Unified Storage meets rigorous performance and scalability standards, providing software-defined enterprise storage for enterprise AI workloads, through capabilities such as NVIDIA GPUDirect Storage.
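The shared-endpoint, guardrail, and function-calling patterns described above can be sketched in a few lines of Python. This is an illustrative, self-contained example only: the endpoint registry, keyword-based guardrail, and tool-dispatch logic are hypothetical stand-ins for the behaviour the release describes, not NAI or NVIDIA APIs.

```python
# Illustrative sketch of the agentic patterns described above:
# (1) multiple applications reusing one shared model endpoint,
# (2) a guardrail filter screening queries before they reach the model,
# (3) function calling dispatching a model-requested tool to external data.
# All names here are hypothetical, not the NAI or NVIDIA APIs.

BLOCKED_TOPICS = {"credentials", "exploit"}  # toy guardrail policy


class SharedEndpoint:
    """One deployed model endpoint reused by many applications."""
    _instances: dict = {}

    @classmethod
    def get(cls, model: str) -> "SharedEndpoint":
        # Reuse an existing endpoint instead of provisioning a new one,
        # mirroring how endpoint re-use conserves GPUs, CPUs, and memory.
        if model not in cls._instances:
            cls._instances[model] = cls(model)
        return cls._instances[model]

    def __init__(self, model: str):
        self.model = model


def guardrail(query: str) -> bool:
    """Return True if the query passes the (toy) content policy."""
    return not any(topic in query.lower() for topic in BLOCKED_TOPICS)


# Function calling: the agent maps a model-requested tool name to code
# that fetches external data, then returns the result for the model to use.
TOOLS = {"lookup_order": lambda order_id: {"order": order_id, "status": "shipped"}}


def handle(query: str, tool_call=None):
    if not guardrail(query):
        return {"error": "query blocked by guardrail"}
    endpoint = SharedEndpoint.get("llama-nemotron")  # shared service
    if tool_call:
        name, arg = tool_call
        return {"endpoint": endpoint.model, "tool_result": TOOLS[name](arg)}
    return {"endpoint": endpoint.model}


app_a = handle("where is order 42?", tool_call=("lookup_order", "42"))
app_b = handle("summarise my tickets")
# Both applications resolve to the same underlying endpoint object.
assert SharedEndpoint.get("llama-nemotron") is SharedEndpoint.get("llama-nemotron")
```

The design point is the singleton-style registry: because both applications resolve to the same endpoint object, only one set of accelerator resources backs them, which is the resource-reuse benefit the release highlights.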
NAI is designed to use additional Nutanix platform services while allowing flexible deployments on HCI, bare metal, and cloud IaaS. NAI customers can also leverage the Nutanix Kubernetes Platform solution for multicloud fleet management of containerised cloud native applications, and Nutanix Unified Storage (NUS) and Nutanix Database Service (NDB) as discrete data services, offering a complete platform for agentic AI applications.
'Customers can realise the full potential of generative AI without sacrificing control, which is especially important as businesses expand into agentic capabilities,' said Scott Sinclair, Practice Director, ESG. 'This expanded partnership with NVIDIA provides organisations with an optimised solution for agentic AI that minimises the risk of managing complex workflows while safeguarding deployment through secure endpoint creation for APIs. AI initiatives are employed to deliver strategic advantages, but those advantages can't happen without optimised infrastructure control and security.'
To learn more about how to get started with the latest NAI version and new NVIDIA capabilities, visit our latest blog post.
NAI with agentic model support is now generally available.
About Nutanix
Nutanix is a global leader in cloud software, offering organizations a single platform for running applications and managing data, anywhere. With Nutanix, companies can reduce complexity and simplify operations, freeing them to focus on their business outcomes. Building on its legacy as the pioneer of hyperconverged infrastructure, Nutanix is trusted by companies worldwide to power hybrid multicloud environments consistently, simply, and cost-effectively. Learn more at www.nutanix.com or follow us on social media @nutanix.


Related Articles

Mirantis unveils architecture to speed & secure AI deployment

Techday NZ · 3 days ago

Mirantis has released a comprehensive reference architecture to support IT infrastructure for AI workloads, aiming to assist enterprises in deploying AI systems quickly and securely. The Mirantis AI Factory Reference Architecture is based on the company's k0rdent AI platform and designed to offer a composable, scalable, and secure environment for artificial intelligence and machine learning (ML) workloads. According to Mirantis, the solution provides criteria for building, operating, and optimising AI and ML infrastructure at scale, and can be operational within days of hardware installation.

The architecture leverages templated and declarative approaches provided by k0rdent AI, which Mirantis claims enables rapid provisioning of required resources. This, the company states, leads to accelerated prototyping, model iteration, and deployment, thereby shortening the overall AI development cycle. The platform features curated integrations, accessible via the k0rdent Catalog, for various AI and ML tools, observability frameworks, continuous integration and delivery, and security, all while adhering to open standards.

Mirantis is positioning the reference architecture as a response to rising demand for specialised compute resources, such as GPUs and CPUs, crucial for the execution of complex AI models. "We've built and shared the reference architecture to help enterprises and service providers efficiently deploy and manage large-scale multi-tenant sovereign infrastructure solutions for AI and ML workloads," said Shaun O'Meara, chief technology officer, Mirantis. "This is in response to the significant increase in the need for specialized resources (GPU and CPU) to run AI models while providing a good user experience for developers and data scientists who don't want to learn infrastructure."
The architecture addresses several high-performance computing challenges, including Remote Direct Memory Access (RDMA) networking, GPU allocation and slicing, advanced scheduling, performance tuning, and Kubernetes scaling. Additionally, it supports integration with multiple AI platform services, such as Gcore Everywhere Inference and the NVIDIA AI Enterprise software ecosystem.

In contrast to typical cloud-native workloads, which are optimised for scale-out and multi-core environments, AI tasks often require the aggregation of multiple GPU servers into a single high-performance computing instance. This shift demands RDMA and ultra-high-performance networking, areas which the Mirantis reference architecture is designed to accommodate.

The reference architecture uses Kubernetes and is adaptable to various AI workload types, including training, fine-tuning, and inference, across a range of environments. These include dedicated or shared servers; virtualised settings using KubeVirt or OpenStack; public cloud; hybrid or multi-cloud configurations; and edge locations. The solution addresses the specific needs of AI workloads, such as high-performance storage and high-speed networking technologies, including Ethernet, InfiniBand, NVLink, NVSwitch, and CXL, to manage the movement of large data sets inherent to AI applications.
Mirantis has identified and aimed to resolve several challenges in AI infrastructure, such as:

- Time-intensive fine-tuning and configuration compared to traditional compute systems
- Support for hard multi-tenancy to ensure security, isolation, resource allocation, and contention management
- Maintaining data sovereignty for data-driven AI and ML workloads, particularly where models contain proprietary information
- Ensuring compliance with varied regional and regulatory standards
- Managing distributed, large-scale infrastructure, which is common in edge deployments
- Effective resource sharing, particularly of high-demand compute components such as GPUs
- Enabling accessibility for users such as data scientists and developers who may not have specific IT infrastructure expertise

The composable nature of the Mirantis AI Factory Reference Architecture allows users to assemble infrastructure using reusable templates across compute, storage, GPU, and networking components, which can then be tailored to specific AI use cases. The architecture includes support for a variety of hardware accelerators, including products from NVIDIA, AMD, and Intel.

Mirantis reports that its AI Factory Reference Architecture has been developed with the goal of supporting the unique operational requirements of enterprises seeking scalable, sovereign AI infrastructures, especially where control over data and regulatory compliance are paramount. The framework is intended as a guideline to streamline the deployment and ongoing management of these environments, offering modularity and integration with open standard tools and platforms.
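GPU allocation in Kubernetes, one of the challenges the architecture addresses, is conventionally expressed as an extended-resource request in a pod spec. The fragment below is a generic illustration using the standard NVIDIA device-plugin resource name, not configuration taken from the Mirantis architecture; the pod name and image are hypothetical.

```yaml
# Illustrative pod spec requesting a single GPU via the NVIDIA device plugin.
apiVersion: v1
kind: Pod
metadata:
  name: training-worker          # hypothetical name
spec:
  containers:
    - name: trainer
      image: example.com/trainer:latest   # hypothetical image
      resources:
        limits:
          nvidia.com/gpu: 1     # one GPU allocated by the device plugin
```

Templated platforms like the one described above typically generate specs of this shape declaratively, so users request accelerators without hand-writing Kubernetes manifests.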

Hitachi Vantara named leader for AI storage in GigaOm Radar 2025

Techday NZ · 3 days ago

Hitachi Vantara has been recognised as a Leader and Fast Mover in the 2025 GigaOm Radar for High-Performance Storage Optimised for AI Workloads. The newly released GigaOm Radar report evaluates advanced storage platforms designed specifically for artificial intelligence (AI) workloads. This is the first time the report has assessed high-performance storage optimised for AI, and Hitachi Vantara's positioning underscores the company's capabilities in supporting enterprise-scale data requirements for AI and machine learning (ML).

Report evaluation

The GigaOm Radar for High-Performance Storage Optimised for AI Workloads reviews market solutions based on their key capabilities, foundational criteria, feature completeness, and readiness for future demands. Leaders in this assessment are noted for their maturity and strategic alignment with evolving enterprise AI needs, as well as their ability to execute at scale.

Hitachi Vantara's placement as a Leader and Fast Mover was attributed to its Hitachi iQ portfolio, which provides AI-ready infrastructure solutions. The report highlighted several strengths, including quality of service (QoS) and workload isolation, GPU-direct storage integration, and AI-optimised data layout and management. The report noted, "Hitachi iQ offers world-class QoS and workload isolation capabilities." By combining file system-level policies with flexible cluster architecture, Hitachi iQ enables effective resource allocation, supporting consistent and reliable performance for high-priority AI and ML workloads, even in multi-tenant or shared environments.

The platform's GPU-direct storage integration was identified as another area of strong performance. Optimised drivers within Hitachi iQ support efficient data transfer between storage and GPU memory, enhancing outcomes for AI and ML workflows.
Additionally, Hitachi iQ's data management strategy utilises intelligent placement across storage tiers, such as TLC/QLC NVMe flash storage for high performance and object storage for additional capacity. Real-time monitoring and user-defined policies are built in to balance performance and cost efficiency for varying data patterns.

Industry commentary

"What really stood out about Hitachi Vantara's offerings is the quality of service and the ability to isolate workloads," said Whit Walters, Field CTO and Analyst at GigaOm. "They've delivered a well-integrated, scalable platform in Hitachi iQ, backed by enterprise-proven storage. This combination gives organisations a powerful and flexible foundation for operationalising AI at scale, especially as large-scale AI and GenAI workloads will require the ability to manage data and performance as demands continue to grow."

The GigaOm report also referenced ongoing collaborations, including a partnership with NVIDIA on the NVIDIA Data Platform for Hitachi iQ, which adds to the capabilities of the platform. The roadmap for Hitachi iQ includes new hardware and AI solutions, among them the Hitachi iQ M Series, announced earlier in the year. Integration with the Virtual Storage Platform One (VSP One) enables further intelligent tiering between NVMe flash and object storage, providing additional flexibility and performance optimisation.

Octavian Tanase, Chief Product Officer at Hitachi Vantara, commented on the recognition. "AI isn't just pushing the boundaries of what infrastructure needs to do; it's completely redrawing them," he said. "Our goal with Hitachi iQ is to give customers a high-performance foundation that removes complexity, accelerates outcomes, and adapts to whatever their AI journey requires next. By integrating Hitachi iQ with our VSP One platform, we're enabling a flexible, intelligent storage strategy that's ready for what's now and what's next."
Ongoing awards

This rating follows recent industry recognition for Hitachi Vantara. In May 2025, the company was awarded the Sustainable Technology Award at the Global Tech & AI Awards for its efforts in sustainable data infrastructure with the VSP One Block solution. Earlier in the year, GigaOm also recognised Hitachi Vantara as a Leader and Outperformer in the GigaOm Radar for Primary Storage, related to its VSP One hybrid cloud data platform.

The GigaOm Radar for High-Performance Storage Optimised for AI Workloads aims to shed light on platforms capable of addressing the increased performance, scalability, and operational requirements integral to enterprise AI and machine learning deployments.

Nutanix Study Finds Public Sector Embraces Generative AI, But Faces Security, Skills, And Infrastructure Gaps

Scoop · 4 days ago

Sydney, NSW – June 18, 2025 – Nutanix (NASDAQ: NTNX), a leader in hybrid multicloud computing, announced the findings of its seventh annual global Public Sector Enterprise Cloud Index (ECI) survey and research report, which measures enterprise progress with cloud adoption in the industry. The research showed that 83 per cent of public sector organisations have a GenAI strategy in place, with 54 per cent actively implementing and 29 per cent preparing for implementation. As public sector organisations ramp up GenAI adoption, 76 per cent of IT decision-makers say their current infrastructure needs moderate to significant improvement to support modern, cloud native applications at scale.

This year's public sector ECI found that infrastructure modernisation emerged as a top priority, underscoring the growing demand for systems capable of meeting GenAI's requirements for enterprise-ready data security, data integrity, and resilience. The report also revealed that public sector leaders are increasingly bringing GenAI applications and workloads into their organisations. Real-world GenAI use cases across the public sector gravitate towards constituent/employee support and experience solutions (e.g., chatbots) and content generation. However, concerns remain, with 92 per cent of public sector leaders highlighting the need for their organisations to do more to secure GenAI models and applications. As a result, 96 per cent of respondents say security and privacy are becoming higher priorities for their organisations.

'Generative AI is no longer a future concept, it's already transforming how we work,' said Greg O'Connell, VP, Federal Sales, Public Sector at Nutanix. '94 per cent of public sector organisations are already putting AI to work and expect returns in as little as one year. As public sector leaders look to see outcomes, now is the time to invest in AI-ready infrastructure, data security, privacy, and training to ensure long-term success.'
Public sector survey respondents were asked about GenAI adoption and trends, Kubernetes and containers, how they're running business and mission critical applications today, and where they plan to run them in the future. Key findings from this year's report include:

GenAI solution adoption and deployment in the public sector will necessitate a more comprehensive approach to data security. Public sector respondents indicate a significant amount of work needs to be done to improve the foundational levels of data security/governance required to support GenAI solution implementation and success. 92 per cent of public sector respondents agree that their organisation could be doing more to secure its GenAI models and applications. Luckily, many IT decision-makers in the public sector are aware of this impending sea change, with 96 per cent of respondents agreeing that GenAI is changing their organisation's priorities, with security and privacy becoming more important.

Prioritise infrastructure modernisation to support GenAI at scale across public sector organisations. Running modern applications at enterprise scale requires infrastructure solutions that can support the necessary requirements for complex data security, data integrity, and resilience. Unfortunately, 76 per cent of respondents in the public sector believe their current IT infrastructure requires at least moderate improvement to fully support cloud native apps/containers. Furthermore, IT infrastructure investment was ranked as a top area of improvement among public sector respondents, a sign that IT decision-makers are aware of the need to improve.

GenAI solution adoption in the public sector continues at a rapid pace, but there are still challenges to overcome. When it comes to GenAI adoption, public sector metrics show progress, with 94 per cent of respondents saying their organisation is leveraging GenAI applications/workloads today.
Most public sector organisations believe GenAI solutions will help improve levels of productivity, automation, and efficiency. However, organisations in the public sector also note a range of challenges and potential hindrances regarding GenAI solution development and deployment, including data security and privacy, and the need for continued upskilling and hiring to support new GenAI projects/solutions.

Application containerisation and Kubernetes deployment are expanding across the public sector. Application containerisation is increasingly pervasive across industry sectors and is set to expand in adoption across the public sector as well, with 96 per cent of segment respondents saying their organisation is at least in the process of containerising applications. This trend may be driven by the fact that 91 per cent of respondents in the public sector agree their organisation benefits from adopting cloud native applications/containers.

For the seventh consecutive year, Nutanix commissioned a global research study to learn about the state of global enterprise cloud deployments, application containerisation trends, and GenAI application adoption. In the Fall of 2024, U.K. researcher Vanson Bourne surveyed 1,500 IT and DevOps/Platform Engineering decision-makers around the world. The respondent base spanned multiple industries, business sizes, and geographies, including North and South America; Europe, the Middle East and Africa (EMEA); and the Asia-Pacific-Japan (APJ) region. To learn more about the report and findings, please download the full Public Sector Nutanix Enterprise Cloud Index here and read more in the blog here.
