Aporia, the leading AI control platform, today announced its partnership with Portkey, a premier provider of AI gateway solutions, to integrate Aporia’s state-of-the-art LLM Guardrails directly into Portkey’s Gateway. This collaboration enables Portkey’s community across the globe to drastically enhance the security and reliability of their generative AI (GenAI) applications, empowering them to utilize AI to the fullest while adhering to stringent security standards.

Processing billions of LLM tokens daily for some of the world’s top AI teams, Portkey helps ship AI apps to production seamlessly. Through this partnership, Portkey users can effortlessly fortify any GenAI application with an additional layer of security and reliability, deploying Aporia’s Guardrails at the click of a button to mitigate AI hallucinations, block prompt injections, prevent data leaks, keep conversations on topic, enforce compliance with company policies, and more. With over 20 pre-configured policies readily available and the option to customize individual guidelines, this integration provides robust control and protection for GenAI applications.
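As a rough illustration of the integration described above, the sketch below shows how a request routed through Portkey’s Gateway might attach Aporia-backed guardrail checks via a gateway config. This is a minimal sketch, not taken from either company’s documentation: the guardrail IDs and the "input_guardrails" / "output_guardrails" config keys are assumptions for illustration, and the exact field names and setup steps are defined in the Portkey and Aporia docs.

```python
# Minimal sketch: calling an LLM through Portkey's Gateway with guardrail
# checks attached via a gateway config. Guardrail IDs and the
# "input_guardrails"/"output_guardrails" keys below are illustrative
# assumptions, not documented values.
from portkey_ai import Portkey

# Hypothetical gateway config: run an Aporia check on the incoming prompt
# (e.g. prompt injection) and on the model's response (e.g. hallucination).
config = {
    "input_guardrails": ["aporia-prompt-injection-check"],   # hypothetical guardrail ID
    "output_guardrails": ["aporia-hallucination-check"],     # hypothetical guardrail ID
}

client = Portkey(
    api_key="PORTKEY_API_KEY",         # Portkey Gateway key
    virtual_key="OPENAI_VIRTUAL_KEY",  # provider credentials stored in Portkey
    config=config,
)

# The request itself is unchanged; the gateway applies the guardrails
# before the prompt reaches the LLM and before the response is returned.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Summarize our refund policy."}],
)
print(response.choices[0].message.content)
```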

"We are thrilled to partner with Portkey to bring our industry-leading solution to their growing community and pave the way for a future in which the full potential of AI is harnessed safely and responsibly," said Liran Hason, CEO and Co-Founder of Aporia. "Together, we are committed to building an AI ecosystem that encourages innovation while safeguarding against risks by empowering users to remain confident that their AI applications are not only effective but also secure and reliable."

Aporia Guardrails' advanced features include real-time hallucination mitigation, preventing incorrect or fabricated responses that could harm users. This is achieved through Aporia's recently launched multi-Small Language Model (SLM) detection engine, which delivers greater accuracy and lower latency in AI interactions. The technology significantly outperforms other solutions, achieving a 98% detection rate for hallucinations. Aporia has also incorporated advanced security policies into its solution to handle sensitive data such as Personally Identifiable Information (PII), prevent prompt injections, and maintain conversation relevance.

“Bringing Aporia Guardrails onto Portkey is a direct reflection of our mission to bring responsible LLMs to production for enterprises worldwide,” said Rohit Agarwal, Co-Founder and CEO of Portkey. “This partnership will further help our community take AI to production by adding the crucial security and reliability features of Aporia’s state-of-the-art guardrails. By partnering with Aporia, we are closing one of the most critical gaps that has kept AI teams from taking their POC AI apps to production: seamless request orchestration that takes care of hundreds of LLM failure scenarios, in production.”

About Aporia

Aporia is on a mission to help AI engineers deliver safe and reliable AI through the use of Guardrails. The company built a multi-SLM (Small Language Model) detection engine with sub-second latency, ensuring the Guardrails do not slow down AI applications. Aporia is recognized as a Technology Pioneer by the World Economic Forum for its mission of driving Responsible AI. Trusted by Fortune 500 companies and industry leaders such as Bosch, Lemonade, Levi’s, Munich RE, and Sixt, Aporia empowers organizations to deliver AI apps that are reliable, responsible, and fair. To learn more about Aporia, visit www.aporia.com.

About Portkey

Privately held and founded in 2023, Portkey is a leading provider of AI gateway, AI observability, AI governance, and prompt management solutions for the world’s best AI teams. Hundreds of companies confidently take their AI apps to production using Portkey. For more information, please visit https://portkey.ai/ and follow them on LinkedIn.

Aporia Media Contact:
Mushkie Meyer
Headline Media
mushkie@headline.media
US: +1 914 336 4035

Portkey Media Contact:
Vrushank Vyas
vrushank.v@portkey.ai
+91-9700888848

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/d3bdcadf-bc00-4185-897c-ad6ea72b6e94