| | | |
|---|---|---|
| Description | It is a language model that aligns a frozen visual encoder with a frozen large language model (LLM), Vicuna, using just one projection layer. It possesses many capabilities similar to GPT-4's, including generating detailed image descriptions and creating websites from handwritten drafts. | It is a framework for building and running generative AI (Gen AI) applications, designed to make it easy to build and run Gen AI applications that process data in real time. |
| Features | Detailed image description generation; website creation from handwritten drafts | Build Q&A chat over unstructured text in minutes; use the latest Gen AI technologies; manage embeddings in your vector database; code and deploy in Visual Studio Code; declare an app and deploy it to dev or prod; bring your existing data to the LLM; fix your day-2 problems with an event-driven architecture |
| GitHub Stars | 25.7K | 427 |
| GitHub Forks | 2.9K | 34 |
| Stacks | 1 | 1 |
| Followers | 2 | 3 |
| Votes | 0 | 0 |

Integrations

It transforms AI-generated content into natural, human-like writing that evades AI detection systems through intelligent text humanization.

It allows you to run open-source large language models, such as Llama 2, locally.
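Ollama serves the models it has pulled over a local HTTP API. As a rough sketch (assuming the default port and that the `llama2` model has already been pulled), a prompt can be sent to its `/api/generate` endpoint like this:

```python
import json
from urllib import request

# Minimal sketch of calling a locally running Ollama server (it listens on
# http://localhost:11434 by default). Assumes the `llama2` model has already
# been pulled, e.g. with `ollama pull llama2`.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """JSON body for Ollama's /api/generate endpoint (non-streaming)."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the local server and return the completion text."""
    req = request.Request(
        OLLAMA_URL,
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# generate("llama2", "Why is the sky blue?")  # needs a running Ollama server
```

With `"stream": False` the server answers with a single JSON object whose `response` field holds the completion; without it, Ollama streams the reply as a sequence of JSON lines.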

It is a platform for building production-grade LLM applications. It lets you debug, test, evaluate, and monitor chains and intelligent agents built on any LLM framework and seamlessly integrates with LangChain, the go-to open source framework for building with LLMs.

It is a new scripting language for automating your interactions with a large language model (LLM), currently OpenAI's models. The ultimate goal is to create a natural-language programming experience: the syntax of GPTScript is largely natural language, making it very easy to learn and use.

It is a framework built around LLMs. It can be used for chatbots, generative question-answering, summarization, and much more. The core idea of the library is that we can “chain” together different components to create more advanced use cases around LLMs.
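The "chaining" idea can be sketched in a few lines of plain Python (the names below are illustrative, not the library's actual API): each component is a text-to-text callable, and a chain is simply their composition.

```python
from typing import Callable

# Plain-Python sketch of the "chain" idea: each component is a function from
# text to text, and a chain composes them left to right. `fake_llm` is a
# stand-in for a real model call.

def prompt_template(question: str) -> str:
    """Component 1: wrap the user input in a prompt."""
    return f"Answer concisely: {question}"

def fake_llm(prompt: str) -> str:
    """Component 2: stand-in for an LLM call."""
    return f"[model output for: {prompt}]"

def chain(*steps: Callable[[str], str]) -> Callable[[str], str]:
    """Compose components into a single callable."""
    def run(text: str) -> str:
        for step in steps:
            text = step(text)
        return text
    return run

qa_chain = chain(prompt_template, fake_llm)
print(qa_chain("What is an LLM?"))
# -> [model output for: Answer concisely: What is an LLM?]
```

A real chain would swap `fake_llm` for a model call and could append further steps, such as an output parser, without changing the composition logic.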

It is an open-source library designed to help developers build conversational streaming user interfaces in JavaScript and TypeScript. The SDK supports React/Next.js, Svelte/SvelteKit, and Vue/Nuxt as well as Node.js, Serverless, and the Edge Runtime.

It lets you build, train, and deploy state-of-the-art machine learning models powered by reference open-source tooling.

It is a project that provides a central interface to connect your LLMs with external data, offering a comprehensive toolset for trading off cost and performance.

It is an open-source embedding database. Chroma makes it easy to build LLM apps by making knowledge, facts, and skills pluggable for LLMs.
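To illustrate what an embedding database provides, here is a toy in-memory sketch: documents are stored as vectors and queries return the nearest ones. The bag-of-words "embedding" is a stand-in for a real embedding model, and the class is illustrative, not Chroma's actual API (which works with collections, `add`, and `query`).

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Trivial bag-of-words embedding (stand-in for a real model)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) \
        * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class ToyEmbeddingDB:
    """Store document embeddings and rank them against a query."""

    def __init__(self) -> None:
        self.docs: dict[str, Counter] = {}

    def add(self, doc_id: str, text: str) -> None:
        self.docs[doc_id] = embed(text)

    def query(self, text: str, n_results: int = 1) -> list[str]:
        q = embed(text)
        ranked = sorted(self.docs, key=lambda d: cosine(q, self.docs[d]),
                        reverse=True)
        return ranked[:n_results]

db = ToyEmbeddingDB()
db.add("d1", "llamas are camelids from South America")
db.add("d2", "chroma stores embeddings for llm apps")
print(db.query("where do llamas come from"))  # -> ['d1']
```

A real embedding database replaces the word counts with dense vectors from an embedding model and the linear scan with an approximate-nearest-neighbor index.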

It is a Rust ecosystem of libraries for running inference on large language models, inspired by llama.cpp. On top of llm, there is a CLI application, llm-cli, which provides a convenient interface for running inference on supported models.