
Microsoft integrates Nvidia’s AI Enterprise Suite with Azure Machine Learning


Microsoft is integrating Nvidia’s AI Enterprise software suite with its Azure Machine Learning service to help enterprise developers build, deploy, and manage applications based on large language models, the company said Tuesday.

Developers and enterprises will have access to over 100 frameworks, pretrained large language models, and development tools as part of the AI Enterprise Suite’s integration with Microsoft’s Azure Machine Learning service, the companies said in a joint statement. For now, the integration is only available through an invitation-only preview in the Nvidia community registry.
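In practice, consuming such a registry would follow the standard Azure Machine Learning workflow. The sketch below is a rough, hypothetical illustration of how a developer granted preview access might list models shared through an Azure ML registry using the azure-ai-ml Python SDK; the registry name shown is an assumption, since the announcement does not give the preview registry's actual identifier.

```python
# Minimal sketch: enumerating models in an Azure Machine Learning registry
# with the azure-ai-ml SDK v2. The registry name below is hypothetical;
# access to the Nvidia preview registry is invitation-only.
from azure.identity import DefaultAzureCredential
from azure.ai.ml import MLClient

# Scope the client to a registry rather than to a workspace.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    registry_name="nvidia-ai",  # hypothetical name for illustration
)

# Print the models published to the registry.
for model in ml_client.models.list():
    print(model.name)
```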

Nvidia’s AI Enterprise Suite accelerates the data science pipeline and streamlines the development and deployment of production AI, including generative AI, computer vision, and speech AI, the chip maker said.

The suite comes with tools such as Nvidia RAPIDS for accelerating data science workloads, Nvidia Metropolis for accelerating Vision AI model development, Nvidia Triton Inference Server for standardizing model deployment, and NeMo Guardrails software that enables developers to add safety and security features for AI chatbots, it added. Users will also have access to Nvidia experts and a support service.
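NeMo Guardrails is distributed as an open-source Python package, so its basic usage pattern can be sketched independently of the Azure integration. The example below is a minimal sketch, assuming the nemoguardrails package is installed and an OpenAI-backed model is available via an API key; the topic rule and model name are illustrative, not part of the announcement.

```python
# Minimal sketch of NeMo Guardrails: wrap an LLM-backed chatbot with a
# simple topical rail. Assumes `pip install nemoguardrails` and an
# OPENAI_API_KEY in the environment; the rule and model are illustrative.
from nemoguardrails import LLMRails, RailsConfig

yaml_content = """
models:
  - type: main
    engine: openai
    model: text-davinci-003
"""

colang_content = """
define user ask about politics
  "what do you think about the election?"
  "which party should I vote for?"

define bot refuse politics
  "I'm a support assistant, so I can't discuss politics."

define flow politics
  user ask about politics
  bot refuse politics
"""

# Build the guardrails configuration from inline content and wrap the model.
config = RailsConfig.from_content(yaml_content=yaml_content,
                                  colang_content=colang_content)
rails = LLMRails(config)

# Messages that match the rail are deflected before reaching the model.
response = rails.generate(
    messages=[{"role": "user", "content": "Which party should I vote for?"}]
)
print(response["content"])
```

In a deployed chatbot, the same rails object would sit in front of whichever endpoint serves users, so out-of-scope requests are intercepted before a completion is ever generated.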

As part of the collaboration between the two companies, Microsoft will make Nvidia’s AI Enterprise software suite available on its Azure Marketplace.

The two companies are working to combine their software offerings in other areas too. Nvidia’s Omniverse Cloud platform-as-a-service (PaaS) is now available on Microsoft Azure as a private offer for enterprises. Omniverse Cloud provides developers and enterprises with a full-stack cloud environment to design, develop, deploy, and manage industrial metaverse applications at scale, the companies said.

In recent months, Nvidia has partnered with several technology companies, including Oracle, Google Cloud, ServiceNow, and Dell, to provide services for developing AI and generative AI applications. And in March, the chip maker said it would make its DGX Pods, which power ChatGPT, available in the cloud.

Copyright © 2023 IDG Communications, Inc.


