A complete unstructured data solution: ETL processing, AI-readiness, open-source LLM hosting, and RAG capabilities in one powerful platform.
Follow the installation steps below, or see the documentation for more details, to build versatile AI applications locally.
Instill Core is an end-to-end AI platform for data, pipeline and model orchestration.
🔮 Instill Core takes care of the infrastructure hassle and provides these core features:
- 💧 Pipeline: Quickly build versatile AI-first APIs or automated workflows.
- ⚗️ Model: Deploy and monitor AI models without GPU infrastructure hassles.
- 💾 Artifact: Transform unstructured data (e.g., documents, images, audio, video) into AI-ready formats.
- ⚙️ Component: Connect essential building blocks to construct powerful pipelines.
- 📖 Parsing PDF Files to Markdown: Cookbook
- 🧱 Generating Structured Outputs from LLMs: Cookbook & Tutorial
- 🕸️ Web scraping & Google Search with Structured Insights
- 🌱 Instance segmentation on microscopic plant stomata images: Cookbook
See Examples for more!
Operating System | Requirements and Instructions |
---|---|
macOS or Linux | Instill Core works natively |
Windows | • Use Windows Subsystem for Linux (WSL2) <br>• Install the latest yq from its GitHub repository <br>• Install the latest Docker Desktop and enable WSL2 integration (tutorial) <br>• (Optional) Install the cuda-toolkit on WSL2 (NVIDIA tutorial) |
All Systems | • Docker Engine v25 or later <br>• Docker Compose v2 or later <br>• Install the latest stable Docker and Docker Compose |
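Before launching, it is worth confirming that your Docker versions meet the requirements above. The yq line is only needed on WSL2 and assumes the mikefarah/yq release layout and an amd64 machine, so adjust it for your setup:

```bash
# Check Docker Engine and Docker Compose versions
$ docker --version          # should report v25 or later
$ docker compose version    # should report v2 or later

# (Windows/WSL2 only) one common way to install the latest yq binary
$ sudo wget -qO /usr/local/bin/yq https://github.com/mikefarah/yq/releases/latest/download/yq_linux_amd64
$ sudo chmod +x /usr/local/bin/yq
```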
Execute the following commands to pull the pre-built images with all the dependencies and launch the services:

```bash
$ git clone -b v0.49.0-beta https://github.com/instill-ai/instill-core.git && cd instill-core

# Launch all services
$ make all
```
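While the containers come up, you can follow their logs. This is a generic Docker Compose command and assumes `make all` drives Docker Compose from the repository root, so run it from the `instill-core` directory:

```bash
# Follow the logs of all services while they start (Ctrl+C to stop following)
$ docker compose logs -f
```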
Note
We have restructured our project repositories. If you need to access 🔮 Instill Core projects up to version v0.13.0-beta, please refer to the instill-ai/deprecated-core repository.
Execute the following commands to build the images with all the dependencies and launch the services from source:

```bash
$ git clone https://github.com/instill-ai/instill-core.git && cd instill-core

# Launch all services
$ make latest PROFILE=all
```
Important
Code in the main branch tracks under-development progress towards the next release and may not work as expected. If you are looking for a stable alpha version, please use the latest release.
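If you cloned the default `main` branch but would rather build from the most recent tagged release, plain git (nothing Instill-specific) can switch you to it:

```bash
# Fetch all tags and check out the most recent release tag
$ git fetch --tags
$ git checkout "$(git describe --tags "$(git rev-list --tags --max-count=1)")"
```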
🚀 That's it! Once all the services are up with health status, the UI is ready to go at http://localhost:3000. Please find the default login credentials in the documentation.
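To check that the services are healthy before opening the Console, you can inspect the container status and probe port 3000; these are generic Docker and curl commands, not Instill-specific tooling:

```bash
# List running containers with their health status
$ docker ps --format "table {{.Names}}\t{{.Status}}"

# The Console should answer on port 3000 once it is ready
$ curl -I http://localhost:3000
```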
To shut down all running services:

```bash
$ make down
```
Visit the Deployment Overview for more details.
- 📺 Console
- ⌨️ CLI
- 📦 SDK:
- Python SDK
- TypeScript SDK
- Stay tuned, as more SDKs are on the way!
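If you plan to use the SDKs against a local instance, they are typically installed from the public package registries. The package names below are assumptions taken from the SDK repositories, so confirm them in each SDK's README:

```bash
# Python SDK (assumed package name: instill-sdk on PyPI)
$ pip install instill-sdk

# TypeScript SDK (assumed package name: instill-sdk on npm)
$ npm install instill-sdk
```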
Please visit our official documentation for more.
Additional resources:
We welcome contributions from our community! Check out the ways to contribute below:
- Cookbooks: Help us create helpful pipelines and guides for the community. Visit our Cookbook repository to get started.
- Issues: Contribute improvements by raising tickets using the templates here, or join the discussion in existing issues you think you can help with.
We are committed to maintaining a respectful and welcoming atmosphere for all contributors. Before contributing, please read:
Get help by joining our Discord community, where you can post any questions in our #ask-for-help channel.
Thank you to all these wonderful people (emoji key):
This project follows the all-contributors specification. Contributions of any kind are welcome!
See the LICENSE file for licensing information.