First tool:
It is a headless LLM chatbot platform built on top of Rasa and LangChain. It is a boilerplate and reference implementation of Rasa and Telegram that uses an LLM library such as LangChain for indexing, retrieval, and context injection.
Document versioning and automatic "re-training" on upload;
Customizable async endpoints and database models via FastAPI and SQLModel;
Full API documentation via Swagger and ReDoc included;
PGAdmin included so you can browse your database.

Second tool:
It is an open-source library designed to help developers build conversational streaming user interfaces in JavaScript and TypeScript. The SDK supports React/Next.js, Svelte/SvelteKit, and Vue/Nuxt, as well as Node.js, Serverless, and the Edge Runtime.
SWR-powered React, Svelte, and Vue helpers for streaming text responses and building chat and completion UIs;
First-class support for LangChain, OpenAI, Anthropic, and HuggingFace;
Edge Runtime compatibility;
Callbacks for saving completed streaming responses to a database (in the same request).
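The last feature above, saving a completed streaming response in the same request, can be sketched in plain Python. The names here (`stream_with_callback`, `on_completion`) are illustrative, not the SDK's actual API:

```python
from typing import Callable, Iterator

def stream_with_callback(chunks: Iterator[str],
                         on_completion: Callable[[str], None]) -> Iterator[str]:
    """Yield chunks to the client while accumulating the full text;
    invoke the callback once the stream is exhausted."""
    parts = []
    for chunk in chunks:
        parts.append(chunk)
        yield chunk                    # forwarded to the client as it arrives
    on_completion("".join(parts))      # e.g. persist the full response here

# Usage: forward the streamed output and capture the completed text.
saved = []
streamed = list(stream_with_callback(iter(["Hel", "lo", "!"]), saved.append))
# streamed == ["Hel", "lo", "!"], saved == ["Hello!"]
```

Because the callback fires inside the same generator that serves the stream, persistence happens within the original request rather than in a separate job.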
Statistics (first tool | second tool):
GitHub Stars: 2.4K | -
GitHub Forks: 255 | -
Stacks: 0 | 454
Followers: 5 | 13
Votes: 0 | 0

Integrations:

A tool that transforms AI-generated content into natural, human-like writing, aiming to bypass AI-detection systems through text humanization.

It is a framework built around LLMs. It can be used for chatbots, generative question-answering, summarization, and much more. The core idea of the library is that we can “chain” together different components to create more advanced use cases around LLMs.
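The "chain" idea can be illustrated without the library itself: each component transforms the output of the previous one. A minimal pure-Python sketch, where the prompt template and stand-in "LLM" are invented for illustration:

```python
def chain(*steps):
    """Compose components left-to-right: each step's output feeds the next."""
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

# Hypothetical components: a prompt template and a stand-in "LLM".
format_prompt = lambda q: f"Answer briefly: {q}"
fake_llm = lambda prompt: f"[model output for: {prompt}]"

pipeline = chain(format_prompt, fake_llm)
result = pipeline("What is a chain?")
# result == "[model output for: Answer briefly: What is a chain?]"
```

More advanced use cases follow the same shape: retrievers, parsers, and model calls slot in as additional steps.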

It allows you to run open-source large language models, such as Llama 2, locally.

It is a project that provides a central interface to connect your LLMs with external data. It offers a comprehensive toolset for trading off cost against performance.
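The core pattern behind connecting an LLM to external data, retrieve the most relevant documents and inject them into the prompt, can be sketched with a naive term-overlap retriever. This is purely illustrative; the real toolset uses proper indexes and embeddings:

```python
def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query (toy scoring)."""
    q = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Inject the retrieved context ahead of the user's question."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = ["Paris is the capital of France.", "Rust is a systems language."]
prompt = build_prompt("What is the capital of France?", docs)
# prompt contains the Paris document, not the Rust one
```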

It is a library for building stateful, multi-actor applications with LLMs, built on top of (and intended to be used with) LangChain. It extends the LangChain Expression Language with the ability to coordinate multiple chains (or actors) across multiple steps of computation in a cyclic manner.
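Cyclic coordination of multiple actors can be sketched as a tiny state graph: nodes transform shared state, and an edge function decides which node runs next, possibly looping back. The node names and loop condition below are invented for illustration:

```python
def run_graph(nodes, edges, state, entry, max_steps=20):
    """Execute nodes until an edge returns None (terminal) or steps run out."""
    current = entry
    for _ in range(max_steps):
        state = nodes[current](state)
        current = edges[current](state)
        if current is None:
            break
    return state

# Two "actors" that cycle until the reviewer approves the draft.
nodes = {
    "draft":  lambda s: {**s, "text": s["text"] + "x"},
    "review": lambda s: {**s, "approved": len(s["text"]) >= 3},
}
edges = {
    "draft":  lambda s: "review",
    "review": lambda s: None if s["approved"] else "draft",  # cycle back
}
final = run_graph(nodes, edges, {"text": "", "approved": False}, "draft")
# final == {"text": "xxx", "approved": True}
```

The cycle (`draft` → `review` → `draft` → …) is exactly what a plain linear chain cannot express.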

It is a platform for building production-grade LLM applications. It lets you debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework and seamlessly integrates with LangChain, the go-to open source framework for building with LLMs.

It is a new scripting language to automate your interaction with a Large Language Model (LLM), namely OpenAI. The ultimate goal is to create a natural language programming experience. The syntax of GPTScript is largely natural language, making it very easy to learn and use.

Build, train, and deploy state-of-the-art models powered by the reference open source in machine learning.

It is an open-source embedding database. Chroma makes it easy to build LLM apps by making knowledge, facts, and skills pluggable for LLMs.
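What an embedding database does at its core, store vectors and return the ones nearest to a query, can be sketched with cosine similarity over plain lists. Real embeddings come from a model; the two-dimensional vectors here are made up:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def query(store, vec, k=1):
    """Return the ids of the k stored vectors most similar to `vec`."""
    ranked = sorted(store, key=lambda item: cosine(item[1], vec), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

store = [("cats", [0.9, 0.1]), ("finance", [0.1, 0.9])]
nearest = query(store, [0.8, 0.2])
# nearest == ["cats"]
```

An actual embedding database adds persistence, metadata filtering, and approximate-nearest-neighbor indexes so this lookup scales beyond a linear scan.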

It is a Rust ecosystem of libraries for running inference on large language models, inspired by llama.cpp. On top of llm, there is a CLI application, llm-cli, which provides a convenient interface for running inference on supported models.