aka.ms/apimlove

A library of useful resources about Azure API Management.

Azure API Management ... What is it? 🤔

Azure API Management is a hybrid, multicloud platform designed to manage APIs across various environments. It provides a comprehensive solution for the entire API lifecycle, including creation, deployment, and monitoring. The platform features an API gateway, a management plane, and a developer portal, enabling secure and scalable API management. It helps organizations expose services as APIs, protect and accelerate API traffic, and enhance API discoverability for internal and external users.

What's New ✨

Visit the Azure API Management service changelog for the latest updates and new features. For developer portal updates, visit the developer portal release notes.

Hands-on Labs & Samples for you 💻

AI Gateway (Azure APIM ❤️ OpenAI)

This repository contains a set of experiments that use the Generative AI capabilities of Azure API Management with Azure OpenAI and other services.

Azure OpenAI (Node.js) Sample

This sample shows how to manage Generative AI APIs at scale using Azure API Management and Azure OpenAI Service for a simple chatbot.

AI Hub Gateway Landing Zone accelerator

Reference architecture that provides a set of guidelines and best practices for implementing a central AI API gateway to empower various line-of-business units in an organization to leverage Azure AI services.

To get you started

Featured Videos 📹

Combining Gen AI APIs and Azure APIM is a great idea! 🤖

Adding your AI APIs to Azure API Management is a great move, as there are new features specifically tailored to help you manage these APIs:

  • Cost efficiency with controlled token limits and monitoring
  • High reliability with automatic failover through load balancers and circuit breakers
  • Robust security with isolated user credentials
  • Enhanced governance with runtime policies
Azure OpenAI Token Limit Policy 🚫

The Azure OpenAI Token Limit policy allows you to manage and enforce limits per API consumer, based on their usage of Azure OpenAI tokens.
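As a sketch, the token limit policy can be placed in an API's inbound policy section. The counter key and limit values below are illustrative assumptions, not recommendations:

```xml
<policies>
    <inbound>
        <base />
        <!-- Limit each subscription to an assumed 5,000 tokens per minute.
             estimate-prompt-tokens pre-estimates request tokens before
             forwarding to the backend; the header name is an example. -->
        <azure-openai-token-limit
            counter-key="@(context.Subscription.Id)"
            tokens-per-minute="5000"
            estimate-prompt-tokens="true"
            remaining-tokens-header-name="x-remaining-tokens" />
    </inbound>
</policies>
```

Keying the counter on `context.Subscription.Id` meters each API consumer independently; other context properties (such as the client IP) could serve as the counter key instead.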

Azure OpenAI Emit Token Metric Policy 📊

The Azure OpenAI Emit Token Metric policy lets you send token usage metrics to Azure Application Insights, providing an overview of the utilization of Azure OpenAI models across multiple applications or API consumers.
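A minimal sketch of the emit-token-metric policy, with custom dimensions so usage can be broken down per consumer and per API (the namespace name is an illustrative assumption):

```xml
<policies>
    <inbound>
        <base />
        <!-- Emit prompt/completion/total token counts as metrics,
             dimensioned by the calling subscription and API. -->
        <azure-openai-emit-token-metric namespace="openai">
            <dimension name="Subscription ID" value="@(context.Subscription.Id)" />
            <dimension name="API ID" value="@(context.Api.Id)" />
        </azure-openai-emit-token-metric>
    </inbound>
</policies>
```

With these dimensions, the metrics can be charted and filtered per consumer in Application Insights.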

Load Balancer and Circuit Breaker 💫

The Load Balancer and Circuit Breaker features allow you to spread the load across multiple Azure OpenAI endpoints and to temporarily stop routing traffic to endpoints that are failing.
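The load-balanced pool and its circuit breaker rules are configured on an API Management backend resource; the policy then simply routes to that pool. A sketch, where "openai-backend-pool" is an assumed backend pool name:

```xml
<policies>
    <inbound>
        <base />
        <!-- "openai-backend-pool" is an assumed backend pool resource
             configured with multiple Azure OpenAI endpoints and a
             circuit breaker rule. -->
        <set-backend-service backend-id="openai-backend-pool" />
    </inbound>
    <backend>
        <!-- Retry once on throttling (429) or server errors so the
             request fails over to another pool member. -->
        <retry condition="@(context.Response.StatusCode == 429 || context.Response.StatusCode >= 500)"
               count="1" interval="0">
            <forward-request buffer-request-body="true" />
        </retry>
    </backend>
</policies>
```

Buffering the request body allows the same request to be replayed against the failover endpoint.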

Azure OpenAI Semantic Caching policy 🫙

The Azure OpenAI Semantic Caching policy empowers you to optimize token usage by leveraging semantic caching, which stores completions for prompts with similar meaning.
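Semantic caching uses a pair of policies: a cache lookup on the inbound path and a cache store on the outbound path. In this sketch, "embeddings-backend" is an assumed backend pointing at an Azure OpenAI embeddings deployment, and the threshold and duration are illustrative:

```xml
<policies>
    <inbound>
        <base />
        <!-- Compare the incoming prompt's embedding against cached
             prompts; serve a cached completion on a close-enough match. -->
        <azure-openai-semantic-cache-lookup
            score-threshold="0.05"
            embeddings-backend-id="embeddings-backend"
            embeddings-backend-auth="system-assigned" />
    </inbound>
    <outbound>
        <base />
        <!-- Cache the returned completion for 120 seconds (example value). -->
        <azure-openai-semantic-cache-store duration="120" />
    </outbound>
</policies>
```

A lower score threshold demands closer semantic similarity before a cached completion is reused; tuning it trades cache hit rate against answer relevance.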

Latest Articles 📰

Azure API Management Turns 10: Celebrating a Decade of Customer-Driven Innovation and Success

This September marks a truly special occasion: Azure API Management turns 10! Since our launch in 2014, ...

Sept 19, 2024

Manage your Generative AI APIs with Azure API Management and Azure OpenAI

This is for you who have started with Generative AI APIs and you're looking to take those APIs into production. ...

Aug 8, 2024

Secure Azure APIM and Azure OpenAI with managed identity

Ok, so you might have read somewhere that API keys are not secure, and you might even have heard about this managed identity thing...

Aug 28, 2024

Stories of Success: Real-World Impact 🎉🎉

Our greatest successes are seen in the achievements of our customers.