Turns Data and AI algorithms into production-ready web applications in no time.
A high-throughput and memory-efficient inference and serving engine for LLMs
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Label Studio is a multi-type data labeling and annotation tool with standardized output format
Kedro is a toolbox for production-ready data science. It uses software engineering best practices to help you create data engineering and data science pipelines that are reproducible, maintainable, and modular.
An orchestration platform for the development, production, and observation of data assets.
🔮 SuperDuperDB: Bring AI to your database! Build, deploy, and manage any AI application directly with your existing data infrastructure, without moving your data, including streaming inference, scalable model training, and vector search.
Standardized Serverless ML Inference Platform on Kubernetes
Weaviate is an open-source vector database that stores both objects and vectors, allowing vector search to be combined with structured filtering, backed by the fault tolerance and scalability of a cloud-native database.
NucliaDB, the AI search database for RAG
The easiest way to serve AI/ML models in production - Build Model Inference Service, LLM APIs, Multi-model Inference Graph/Pipelines, LLM/RAG apps, and more!
A common interface for registering, validating and auditing machine learning artifacts
⚡️SwanLab: your ML experiment notebook. Track and visualize your entire machine learning workflow.
RAG (Retrieval Augmented Generation) Framework for building modular, open source applications for production by TrueFoundry
Production-grade NiFi & NiFi Registry. Deploy to VMs (Virtual Machines) with Terraform + Ansible, or to Kubernetes (EKS) with Helm & Helmfile.
Workflow Engine for Kubernetes
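The Kubernetes workflow engine above (Argo Workflows) defines each pipeline as a Kubernetes custom resource. A minimal sketch of a two-step ML pipeline, assuming a cluster with Argo installed; the step names, scripts, and container image are illustrative placeholders:

```yaml
# Illustrative two-step Argo Workflow; names, scripts, and image are placeholders.
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: train-and-evaluate-
spec:
  entrypoint: pipeline
  templates:
    - name: pipeline
      steps:
        # Each inner list is a group of steps that run in parallel;
        # successive groups run sequentially.
        - - name: train
            template: run-script
            arguments:
              parameters: [{name: cmd, value: "python train.py"}]
        - - name: evaluate
            template: run-script
            arguments:
              parameters: [{name: cmd, value: "python evaluate.py"}]
    - name: run-script
      inputs:
        parameters:
          - name: cmd
      container:
        image: python:3.11
        command: [sh, -c]
        args: ["{{inputs.parameters.cmd}}"]
```

Submitted with `argo submit`, the controller schedules each step as its own pod and records status per step.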
📜 Interested in MLOps? Here's your encyclopedia for dummies