First tool:
It is the interface between your app and hosted LLMs. It streamlines API requests to OpenAI, Anthropic, Mistral, Llama 2, Anyscale, Google Gemini, and more with a unified API (a minimal call sketch follows the comparison below).
- Blazing fast (9.9x faster) with a tiny footprint (~45 kB installed)
- Load balance across multiple models, providers, and keys
- Fallbacks make sure your app stays resilient
- Automatic retries with exponential backoff come by default
- Plug in middleware as needed
- Battle-tested over 100B tokens

Second tool:
- Firewall: Risk Detection & Policy Gate
- Normalize: Sanitize & Rewrite Safely
- Policy Enforcement Layer
- Data Protection (Inbound & Outbound)
- Response Governance & Output Filtering
- Logging & Audit Trail
Tags: Artificial Intelligence, AI, Security, AI Security, Generative AI, AI Tools
Statistics

| | First tool | Second tool |
| --- | --- | --- |
| GitHub Stars | 9.8K | - |
| GitHub Forks | 775 | - |
| Stacks | 2 | 0 |
| Followers | 4 | 1 |
| Votes | 0 | 1 |

Integrations

No integrations available for either tool.
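For illustration only, here is a minimal sketch of calling such a unified gateway through the OpenAI Python SDK by overriding its base URL, a common pattern for OpenAI-compatible gateways. The local address, port, model name, and routing behavior are assumptions, not details taken from the listing above.

```python
# Hypothetical: route an OpenAI-style chat request through a local AI gateway
# by pointing the official OpenAI SDK at the gateway's base URL.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8787/v1",  # assumed gateway address, not a documented default
    api_key="YOUR_PROVIDER_OR_GATEWAY_KEY",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the gateway decides which provider and key actually serve this
    messages=[{"role": "user", "content": "Hello through the gateway"}],
)
print(response.choices[0].message.content)
```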

AWS Key Management Service (KMS) is a managed service that makes it easy for you to create and control the encryption keys used to encrypt your data, and uses Hardware Security Modules (HSMs) to protect the security of your keys. AWS Key Management Service is integrated with other AWS services including Amazon EBS, Amazon S3, and Amazon Redshift. AWS Key Management Service is also integrated with AWS CloudTrail to provide you with logs of all key usage to help meet your regulatory and compliance needs.
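To make the encrypt/decrypt workflow concrete, here is a minimal boto3 sketch; the region and key alias are placeholders for your own, and the KMS Encrypt API itself accepts at most 4 KB of plaintext (larger payloads are typically handled with envelope encryption using data keys).

```python
# Sketch: encrypt and decrypt a small value with AWS KMS via boto3.
import boto3

kms = boto3.client("kms", region_name="us-east-1")  # placeholder region

ciphertext = kms.encrypt(
    KeyId="alias/my-app-key",            # hypothetical key alias
    Plaintext=b"secret configuration value",
)["CiphertextBlob"]

# For symmetric keys, KMS infers the key from the ciphertext metadata.
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]
print(plaintext.decode())
```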

Unleash your creativity with letsmkvideo, the leading AI video generator. Effortlessly create professional videos from text, animate photos, and generate stunning AI video effects. Get started for free: no watermarks, just high-quality results in minutes.

It transforms AI-generated content into natural, undetectable, human-like writing, bypassing AI detection systems with intelligent text humanization technology.

It is a framework built around LLMs. It can be used for chatbots, generative question-answering, summarization, and much more. The core idea of the library is that we can "chain" together different components to create more advanced use cases around LLMs.
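To make the "chaining" idea concrete, here is a minimal sketch using the LangChain Expression Language, composing a prompt template, a chat model, and an output parser with the | operator; the gpt-4o-mini model name and the langchain-openai integration package are assumptions about the environment.

```python
# Sketch: a tiny LCEL chain of prompt -> chat model -> string output parser.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
llm = ChatOpenAI(model="gpt-4o-mini")      # any chat model wrapper could be swapped in
chain = prompt | llm | StrOutputParser()   # components composed with the | operator

print(chain.invoke({"text": "LangChain chains components together around LLMs."}))
```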

It allows you to run open-source large language models, such as Llama 2, locally.
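As a small sketch, the snippet below queries a locally running Ollama server over its REST API; it assumes the server is listening on the default port 11434 and that the llama2 model has already been pulled.

```python
# Sketch: ask a locally hosted model a question through Ollama's HTTP API.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Why is the sky blue?", "stream": False},
    timeout=120,
)
print(resp.json()["response"])
```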

It is a project that provides a central interface to connect your LLMs with external data. It offers a comprehensive toolset for making trade-offs between cost and performance.
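A minimal sketch of that "LLMs plus external data" workflow: load documents from a local folder, build a vector index over them, and query it. The ./data directory, the default OpenAI-backed settings, and the llama_index.core import paths (which vary across versions) are assumptions.

```python
# Sketch: index local documents and query them through an LLM.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("data").load_data()   # load external data from ./data
index = VectorStoreIndex.from_documents(documents)      # embed and index the documents
query_engine = index.as_query_engine()                  # retrieval + LLM synthesis

print(query_engine.query("What do these documents say about pricing?"))
```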

It is a library for building stateful, multi-actor applications with LLMs, built on top of (and intended to be used with) LangChain. It extends the LangChain Expression Language with the ability to coordinate multiple chains (or actors) across multiple steps of computation in a cyclic manner.
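A toy sketch of that cyclic coordination: a single node updates shared state and a conditional edge loops back to it until a stopping condition is met. The state schema, node name, and threshold are illustrative only.

```python
# Sketch: a minimal cyclic graph that loops over one node until a condition holds.
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    count: int

def increment(state: State) -> State:
    return {"count": state["count"] + 1}

def should_continue(state: State) -> str:
    # Loop back to the same node until count reaches 3, then finish.
    return "increment" if state["count"] < 3 else END

builder = StateGraph(State)
builder.add_node("increment", increment)
builder.set_entry_point("increment")
builder.add_conditional_edges("increment", should_continue)
graph = builder.compile()

print(graph.invoke({"count": 0}))   # {'count': 3}
```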

It is a platform for building production-grade LLM applications. It lets you debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework and seamlessly integrates with LangChain, the go-to open source framework for building with LLMs.
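A minimal tracing sketch using the langsmith SDK's traceable decorator; the environment variable values are placeholders, and the traced function stands in for a real chain or agent call.

```python
# Sketch: send a trace of an arbitrary function call to LangSmith.
import os
from langsmith import traceable

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_API_KEY"] = "YOUR_LANGSMITH_API_KEY"  # placeholder

@traceable(name="toy-chain")
def toy_chain(question: str) -> str:
    # In a real app this would call an LLM framework such as LangChain.
    return f"echo: {question}"

print(toy_chain("Is this run visible in LangSmith?"))
```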

Try Grok 4 on GPT Proto. Access xAI's most advanced 1.7T LLM with 130K context, multimodal support, and real-time data integration for dynamic analysis.

Create polished visuals and clips in the browser with Nano Banana Pro using text prompts or reference images.