API company Kong today launched its open source AI Gateway, an extension of its existing API Gateway that lets developers and operations teams integrate their applications with one or more large language models (LLMs) and access them through a single API. In addition, Kong is launching several AI-specific features, including prompt engineering, credential management, and more.
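The "single API" idea can be sketched in a few lines: the client sends the same chat payload no matter which model backs it, and the target provider becomes just a routing detail. The endpoint URL and header name below are hypothetical placeholders, not Kong's actual configuration.

```python
# Illustrative sketch of accessing multiple LLMs through one API.
# The gateway URL and routing header are invented for this example;
# they are not Kong's real route names.
def build_chat_request(provider: str, prompt: str) -> dict:
    """Build one uniform chat request; only the routing header changes."""
    return {
        "url": "https://gateway.example.com/chat",   # hypothetical gateway route
        "headers": {"X-LLM-Provider": provider},     # hypothetical routing header
        "json": {"messages": [{"role": "user", "content": prompt}]},
    }

openai_req = build_chat_request("openai", "Summarize this ticket.")
mistral_req = build_chat_request("mistral", "Summarize this ticket.")
```

The point of the pattern is that `openai_req` and `mistral_req` carry identical bodies, so switching providers never requires changing application code.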
“We think AI is an additional use case for APIs,” Kong co-founder and CTO Marco Palladino told me. “APIs are driven by use cases: mobile, microservices — AI happens to be the latest. When we look at AI, we really look at APIs everywhere. We look at APIs for consuming AI, we look at APIs for optimizing AI, and AI itself. […] So the more AI, the more API usage we’ll have in the world.”
Palladino argues that while almost every organization is considering how to use artificial intelligence, these companies also fear data leakage. Eventually, he believes, they will want to run their models locally and perhaps use the cloud as a fallback. For now, though, they need to figure out how to manage credentials for accessing cloud-based models, control and log traffic, manage quotas, and more.
“With our API gateway, we wanted to provide an offering that allows developers to be more productive when building AI, by being able to leverage our gateway to consume one or more LLM providers without having to change their code,” he said. The gateway currently supports Anthropic, Azure, Cohere, Meta’s LLaMA, Mistral and OpenAI models.
Kong’s team argues that most other API providers currently manage AI APIs the same way as any other API. By layering these additional AI-specific features on top of the API, however, Kong’s team believes it can enable new use cases (or at least make existing ones easier to implement). Using the AI request and response transformers that are part of the new AI Gateway, developers can modify their prompts and results on the fly, for example to automatically translate them or to remove personally identifiable information.
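A response transformer of the kind described above is conceptually simple. The sketch below redacts two common kinds of PII (email addresses and US-style phone numbers) from a model's output; the function name and regex patterns are illustrative, not Kong's plugin API.

```python
import re

# Minimal sketch of a PII-scrubbing response transformer.
# Only two illustrative patterns are covered here; a real deployment
# would handle many more PII categories.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")

def redact_pii(text: str) -> str:
    """Replace emails and phone numbers in an LLM response with placeholders."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)
```

Running the transformer at the gateway rather than in each application means every consuming service gets the same redaction policy for free.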
Prompt engineering, too, is deeply integrated into the gateway, allowing businesses to enforce their guidelines on top of these models. This also means there is a central point for managing these instructions and prompts.
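Centralizing guidelines can be as simple as prepending a shared system message to every chat request at the gateway, so individual applications never embed (or forget) the organization's rules. The guideline text and function below are invented for illustration.

```python
# Sketch of gateway-level prompt engineering: a shared system prompt
# is injected ahead of whatever messages the application sends.
# The guideline text here is a made-up example.
ORG_GUIDELINES = "Answer concisely. Never reveal internal project names."

def apply_guidelines(messages: list[dict]) -> list[dict]:
    """Prepend the organization-wide system prompt to a chat request."""
    return [{"role": "system", "content": ORG_GUIDELINES}] + messages

request = apply_guidelines([{"role": "user", "content": "Status update?"}])
```

Because the injection happens in one place, updating the guidelines changes behavior for every application behind the gateway at once.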
It’s been almost nine years since the launch of the API management platform Kong. At the time, the company that is now Kong was still known as Mashape, but as Mashape/Kong co-founder and CEO Augusto Marietti told me in an interview earlier this week, this was pretty much a last-ditch effort. “Mashape wasn’t going anywhere, and then Kong is the number one API product on GitHub,” he said. Marietti noted that Kong was cash-flow positive in the last quarter — and isn’t currently looking for financing — so that worked out pretty well.
Now, Kong Gateway is at the core of the company’s platform and also what powers the new AI Gateway. Indeed, existing Kong users only need to upgrade their installation to access all the new AI features.
As of now, these new AI features are available for free. Over time, Kong expects to release paid premium features as well, but the team stressed that’s not the goal for this current version.