Microsoft and Hugging Face deepen generative AI partnership (2024)

We are dedicated to our mission of empowering generative AI developers and global organizations with the best AI infrastructure, open and proprietary foundation models, AI orchestration, and developer tools as they build and scale their copilot stacks.

Microsoft Build 2024: Deepening our generative AI partnership with Hugging Face

Today, Microsoft is thrilled to announce our deepened generative AI partnership with Hugging Face! Building on our previous collaborations, our extended partnership underscores our joint commitment to making AI more accessible through product updates and integrations with the Azure AI model catalog, the latest Azure infrastructure in partnership with AMD, VS Code, and HuggingChat.

By combining Microsoft's robust cloud infrastructure with Hugging Face's most popular Large Language Models (LLMs), we are enhancing our copilot stacks to provide developers with advanced tools and models to deliver scalable, responsible, and safe generative AI solutions for custom business needs.


Today, at Microsoft Build 2024, we're excited to deepen our collaboration in four key areas:

  1. We are introducing 20 new leading open Hugging Face models - like Rhea-72B-v0.5 from David Kim and Multiverse-70B from MTSAIR - into the Azure AI model catalog, further diversifying open model choice for our customers.
  2. In partnership with AMD, we are enhancing Azure AI infrastructure to support the availability of Hugging Face Hub on the ND MI300X v5, powered by AMD GPUs.
  3. Our integration of Phi-3-mini in HuggingChat marks a significant enhancement in our interactive AI offerings.
  4. Our integration of Hugging Face Spaces with Visual Studio Code streamlines the development process for AI developers.

Hugging Face is the creator of Transformers, a widely popular library for building large language models. In 2022, we announced our partnership with Hugging Face to integrate state-of-the-art AI capabilities into Azure, increasing efficiency for developers to deploy, operationalize, and manage models. As we progress, our partnership continues to deepen, in service of our joint mission to ensure useful and seamless integrations that enable generative AI developers, machine learning engineers, and data scientists to deploy open models of their choice to secure and scalable inference infrastructure on Azure.

Azure AI Model Catalog: 20 new leading open-source Hugging Face models added

The Azure AI Model Catalog is the hub to discover, deploy and fine-tune the widest selection of open source and proprietary generative AI models for your use cases, RAG applications, and agents. In addition to other model providers like Cohere, Meta, and Mistral, the Hugging Face collection has a wide selection of base and fine-tuned models, like tiiuae-falcon-7b.

Today, we’re adding 20 new popular open models - either trending or drawn from the Open LLM Leaderboard, like Smaug-72B-v0.1 from Abacus AI and the Japanese/English text generation model Fugaku-LLM-13B from Fugaku-LLM - from the Hugging Face Hub to the Azure AI model catalog. These models offer a premium user experience when deployed from the model catalog for inferencing, driven by advanced features, software, and optimizations. For example, some of the new models are supported by Hugging Face’s Text Generation Inference (TGI) or Text Embeddings Inference (TEI) – optimized inference runtimes for efficient deployment and serving of LLMs and embeddings models, respectively.

TGI enables high performance text generation of LLMs like Falcon and StarCoder through tensor parallelism, continuous batching of incoming requests, and optimized transformers code using Flash Attention and Paged Attention. TEI achieves efficient serving of text embeddings models – like BERT and its variants – through skipping the model graph compilation step in deployment and incorporating token-based dynamic batching. For more information, see Hugging Face’s articles and list of supported models for TGI and TEI.
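The token-aware batching that TEI relies on can be illustrated with a toy sketch. This is not TEI's actual implementation (TEI is written in Rust); the function name and token budget below are hypothetical, and the point is only the packing strategy: group requests by total token count rather than by a fixed request count.

```python
# Toy illustration of token-based dynamic batching, the serving idea
# behind TEI: pack requests into batches whose summed token counts stay
# under a budget, so short and long inputs share hardware efficiently.
# The function name and the budget of 16 tokens are hypothetical.

def batch_by_tokens(requests, max_batch_tokens=16):
    """Greedily pack (request_id, token_count) pairs into batches whose
    summed token counts do not exceed max_batch_tokens."""
    batches, current, used = [], [], 0
    for req_id, n_tokens in requests:
        if current and used + n_tokens > max_batch_tokens:
            batches.append(current)  # budget exceeded: start a new batch
            current, used = [], 0
        current.append(req_id)
        used += n_tokens
    if current:
        batches.append(current)
    return batches

requests = [("a", 6), ("b", 5), ("c", 9), ("d", 4), ("e", 3)]
print(batch_by_tokens(requests))  # [['a', 'b'], ['c', 'd', 'e']]
```

A real server additionally regroups requests continuously as they arrive and complete, which is what "continuous batching" refers to in TGI.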

In partnership with the engineering team at Hugging Face, we plan to deepen the integrations between the Hugging Face Hub and Azure AI, starting with the tenets of model discoverability, custom deployment, and fine-tuning.

  • If you’d like to be engaged in the private preview program of these features, please share your contact information here.
  • Get started with Hugging Face inference through Azure AI using these Python samples.
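As a rough sketch of what such a sample looks like, the snippet below builds the JSON body for TGI's `/generate` route, which a TGI-backed deployment exposes. The helper function is our own illustration, not part of any SDK, and the endpoint URL and key in the comments are placeholders.

```python
# Hypothetical sketch of calling a TGI-backed model deployment. The
# payload shape follows TGI's /generate API ("inputs" plus "parameters");
# the helper name, endpoint_url, and api_key below are placeholders.
import json

def build_tgi_payload(prompt, max_new_tokens=64, temperature=0.7):
    """Build the JSON body for a TGI-compatible /generate request."""
    return {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,
            "temperature": temperature,
        },
    }

payload = build_tgi_payload("Write a haiku about Azure.")
print(json.dumps(payload, indent=2))

# Against a live endpoint you would POST this payload, e.g.:
#   import requests
#   r = requests.post(f"{endpoint_url}/generate",
#                     headers={"Authorization": f"Bearer {api_key}"},
#                     json=payload)
#   print(r.json()["generated_text"])
```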


Partner on Azure's latest AI infrastructure with AMD

At Microsoft Build this year, Microsoft unveiled the GA of Azure's latest AI infrastructure offering, the ND MI300X v5, which is powered by AMD Instinct™ MI300X GPUs. Hugging Face is one of the first AI partners to harness this new AI infrastructure and achieved a new benchmark for performance and efficiency of their models in just one month. Through a deep engineering collaboration between Hugging Face, AMD and Microsoft, Hugging Face offers Azure customers the full acceleration capabilities of AMD’s ROCm™ open software ecosystem when using Hugging Face models with its libraries on the new MI300X instances. Hugging Face users and customers of its premium Enterprise Hub service can run over 10,000 pre-trained models on Azure without needing to re-write their applications. Hugging Face Enterprise Hub customers will also be able to quickly and easily deploy and scale their workloads with Azure CycleCloud, Azure Kubernetes Service, and other Azure services.

Phi-3-mini on HuggingChat

We're also broadening the reach of Phi-3, a family of small models developed by Microsoft, by making them available on the HuggingChat playground. HuggingChat allows Phi-3 to meet the community where they’re at and gives developers and data scientists a great place to start experimenting with Phi-3 and discover new ways to leverage the power of small models. The deeper integration with Azure AI will enable developers to combine the power of open platforms and communities on Hugging Face with enterprise-grade offerings on Azure AI.
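For developers who want to go beyond the HuggingChat playground, Phi-3-mini can also be tried locally with the transformers library. In the sketch below, the chat-building helper is our own illustration, and the generation step is left as a comment because it downloads several gigabytes of model weights on first run.

```python
# Minimal local-experimentation sketch for Phi-3-mini. The make_chat
# helper is illustrative; the commented generation step uses the public
# microsoft/Phi-3-mini-4k-instruct model via transformers.

def make_chat(prompt, system="You are a helpful assistant."):
    """Build a chat message list in the role/content format that
    instruction-tuned models like Phi-3 expect."""
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]

messages = make_chat("Explain small language models in one sentence.")
print(messages)

# With transformers and torch installed (downloads weights on first run):
#   from transformers import pipeline
#   chat = pipeline("text-generation",
#                   model="microsoft/Phi-3-mini-4k-instruct")
#   print(chat(messages, max_new_tokens=80)[0]["generated_text"])
```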


Visual Studio Code integration with Hugging Face Spaces

Microsoft has been working closely with Hugging Face on the developer experience, and today, we’re introducing a new “Dev Mode” feature for Hugging Face Spaces, designed to streamline the development process for AI developers. Hugging Face Spaces provides a user-friendly platform for creating and deploying AI-powered demos in minutes, with over 500,000 Spaces already created by the Hugging Face community.

With Dev Mode enabled, connecting Visual Studio Code (VS Code) to your Space is seamless, eliminating the need to push local changes to the Space repository using git. This integration allows you to edit your code directly within VS Code, whether locally or in the browser, and see your changes in real time.


For example, if you want to change the color theme of a Gradio Space, you can edit the code in VS Code, and simply click "Refresh" in your Space to see the updates instantly, without the need to push changes or rebuild the Space container.
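As a concrete sketch, a minimal Gradio app for a Space might look like the following; the theme= argument in the comments is exactly the kind of one-line edit Dev Mode lets you preview with "Refresh". The app itself is hypothetical.

```python
# app.py for a hypothetical minimal Gradio Space. With Dev Mode on, you
# can change the theme= argument in VS Code and click "Refresh" in the
# Space to preview it, without pushing changes or rebuilding the container.

def greet(name: str) -> str:
    return f"Hello, {name}!"

print(greet("Spaces"))  # Hello, Spaces!

# In the Space this file would also contain (requires `pip install gradio`):
#   import gradio as gr
#   demo = gr.Interface(fn=greet, inputs="text", outputs="text",
#                       theme=gr.themes.Soft())  # try gr.themes.Default()
#   demo.launch()
```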


Once you are satisfied with your changes, you can commit and merge them to persist your updates, making the development process more efficient and user-friendly.


Get Started with Hugging Face on Azure

  • Explore Hugging Face models in the Azure AI model catalog.
  • Learn more about the model collections in the model catalog.
  • Try Phi-3-mini on the HuggingChat playground.
  • Create a Space in dev mode here.

