🤘 TT-NN operator library, and TT-Metalium low level kernel programming model.
Updated Jun 3, 2024 · C++
🤖 A collection of practical AI repos, tools, websites, papers, and tutorials: a practical AI treasure chest 💎
🧑🚀 The world's best summary of Chinese-language LLM resources
A high-throughput and memory-efficient inference and serving engine for LLMs
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting wide-range of NLP tasks from research to industrial applications, including 🗂Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis etc.
Drop-in, local AI alternative to the OpenAI stack. Multi-engine (llama.cpp, TensorRT-LLM). Powers 👋 Jan
An Easy-to-use Knowledge Editing Framework for LLMs.
LMDeploy is a toolkit for compressing, deploying, and serving LLMs.
jetson-examples: run AI models and applications on NVIDIA Jetson devices with a one-line command.
Pathways: multi-modal AI/ML models on discord
⚡️Open-source AI LangChain-like RAG (Retrieval-Augmented Generation) knowledge database with web UI and Enterprise SSO⚡️, supports OpenAI, Azure, LLaMA, Google Gemini, HuggingFace, Claude, Grok, etc., chat bot demo: https://demo.casibase.com, admin UI demo: https://demo-admin.casibase.com
MinimalChat is a lightweight, open-source chat application that allows you to interact with various large language models.
OpenAI-style API for open large language models: use LLMs just like ChatGPT! Supports LLaMA, LLaMA-2, BLOOM, Falcon, Baichuan, Qwen, Xverse, SqlCoder, CodeLLaMA, ChatGLM, ChatGLM2, ChatGLM3, etc. A unified backend interface for open-source large models.
With GPT4All-CLI, developers can tap into the power of GPT4All and LLaMa without delving into the library's internals. Simply install the CLI tool, and you're ready to explore large language models directly from your command line.