
Cloudflare AI Gateway Now Generally Available

Cloudflare has recently announced that AI Gateway is now generally available. Described as a unified interface for managing and scaling generative AI workloads, AI Gateway allows developers to gain visibility and control over AI applications.

AI Gateway is an AI ops platform that acts as a proxy between services and inference providers, regardless of where the models run, giving developers a single point through which to route and observe their AI traffic.

[Image: Source: Cloudflare blog]

Kathy Liao, product manager at Cloudflare, Michelle Chen, senior product manager at Cloudflare, and Phil Wittig, director of product at Cloudflare, write:

We've talked to a lot of developers and organizations building AI applications, and one thing is clear: they want more observability, control, and tooling around their AI ops. This is something many of the AI providers are lacking as they are deeply focused on model development and less so on platform features.

Connecting an application to AI Gateway lets developers monitor user interactions through analytics and logging, and adds scaling features such as caching, rate limiting, request retries, and model fallback. Liao, Chen, and Wittig add:

With a single line of code, you can unlock a set of powerful features focused on performance, security, reliability, and observability – think of it as your control plane for your AI ops.
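In practice, the "single line of code" amounts to pointing an existing client at a gateway endpoint instead of calling the provider directly. The sketch below illustrates the idea with the OpenAI SDK for TypeScript, based on the endpoint format described in Cloudflare's documentation; the account ID, gateway name, and model are placeholders, not values from the announcement.

```typescript
// Sketch: routing an existing OpenAI SDK client through an AI Gateway endpoint.
// ACCOUNT_ID, GATEWAY_NAME, and the model are placeholders.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  // Only the base URL changes; requests routed through the gateway pick up
  // the gateway's logging, caching, and rate-limiting configuration.
  baseURL: "https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_NAME/openai",
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Summarize AI Gateway in one sentence." }],
});

console.log(completion.choices[0].message.content);
```

The application code that builds prompts and handles responses stays the same; the gateway sits transparently between the SDK and the provider.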

In addition to Cloudflare Workers AI, the new AI Gateway supports multiple third-party providers, including OpenAI, Google Vertex AI, Azure OpenAI, HuggingFace, Amazon Bedrock, and Anthropic. Amogh Sarda, co-founder at Eesel, comments:

I’m interested to see this in motion. I’m sure there are going to be some fun ways to test out its sensitive data detection capabilities.

The AI Gateway dashboard shows metrics such as the number of requests, tokens, and the cost associated with running an application. It also tracks individual requests, providing information about the prompt, response, provider, timestamps, and whether the request was successful.

[Image: Source: Cloudflare blog]

AI Gateway is not the only recent announcement from Cloudflare in the AI space. The company has also previewed Firewall for AI and made Workers AI generally available, among several other capabilities aimed at simplifying how developers build and deploy AI applications. Janakiram MSV, analyst and advisor, writes:

Cloudflare is challenging Amazon Web Services (AWS) by constantly improving the capabilities of its edge network. Amazon's serverless platform, AWS Lambda, has yet to support GPU-based model inference, while its load balancers and API gateway are not updated for AI inference endpoints.

Brendan Skousen, founder of Credexium, comments:

The latest AI tools I've been building incorporate services from Cloudflare. I'll replace platform-specific API endpoints with Cloudflare, whether it's my own API through a worker or using something like the AI Gateway or Web3 Gateway. Why? Because it's next to free, and it includes features like analytics out of the box in a safe manner. Real-time logs, caching, and rate limiting are essential when building LLM apps.
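For workloads running on Cloudflare's own platform, a Worker can call Workers AI and route the inference through a gateway as well. The following is a minimal sketch assuming the Workers AI binding and its gateway option as described in Cloudflare's documentation; the gateway ID and model name are placeholders.

```typescript
// Sketch of a Cloudflare Worker calling Workers AI through an AI Gateway.
// "my-gateway" and the model name are placeholders.
export interface Env {
  AI: Ai; // Workers AI binding declared in wrangler.toml
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const result = await env.AI.run(
      "@cf/meta/llama-3-8b-instruct",
      { prompt: "Explain rate limiting in one paragraph." },
      // Routing through the gateway adds logging, caching, and rate limiting
      // without changing how the model is invoked.
      { gateway: { id: "my-gateway" } }
    );
    return Response.json(result);
  },
};
```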

The core features of AI Gateway are currently free on all Cloudflare plans, while upcoming premium features, such as persistent logging and secrets management, will be subject to fees.
