38 changes: 29 additions & 9 deletions docs/get-started/welcome.md
@@ -1,27 +1,47 @@
---
title: Welcome
description: Welcome to the Quix documentation. Quix lets you consolidate sensor data, configurations, and test metadata from your test rigs and simulations in a single platform. Process data in real time or replay historical runs to accelerate R&D.
---

# Welcome

Quix is a data platform for consolidating measurement data, configuration metadata, and test parameters from test rigs, simulations, and sensors. The platform handles high-frequency telemetry data in real time and provides the ability to replay historical runs for analysis and debugging.

This documentation includes guides, tutorials, and API references for Quix Cloud, Quix Streams (the open-source Python library), and Bring Your Own Cluster (BYOC) deployment options.

## Quix Features

**Data consolidation**: Connect sensors, data acquisition systems, MATLAB/Simulink outputs, and legacy instruments through pre-built connectors or custom Python code. All data streams to a centralized data store.

* [Learn about data integration →](../develop/integrate-data/overview.md)
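As a sketch of the custom-code path, the snippet below wraps a raw sensor reading in a JSON envelope with identity and timestamp before it would be produced to a topic. The field names and the `read_sensor`/`to_record` helpers are illustrative assumptions, not part of the Quix API; a real connector would replace `read_sensor` with your rig's SDK, serial, or OPC-UA read.

```python
import json
import time

def read_sensor():
    # Hypothetical stand-in for a real data-acquisition call;
    # replace with your rig's SDK, serial, or OPC-UA read.
    return {"rpm": 3120.5, "oil_temp_c": 88.2}

def to_record(sensor_id, reading, ts=None):
    """Wrap a raw reading in a JSON envelope with identity and timestamp,
    the shape a downstream topic consumer would typically expect."""
    return json.dumps({
        "sensor_id": sensor_id,
        "timestamp": ts if ts is not None else time.time_ns(),
        "values": reading,
    })

record = to_record("rig-07", read_sensor())
```

The envelope keeps the measurement self-describing, so downstream consumers can route or filter by `sensor_id` without out-of-band context.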

**Configuration and measurement linkage**: Store test configurations alongside sensor measurements. Query by test parameters to find specific runs, or trace measurement anomalies back to their exact test setup.

* [Learn about dynamic configuration →](../quix-cloud/managed-services/dynamic-configuration.md)

**Real-time processing and historical replay**: Process live telemetry as it arrives, then replay historical data later to investigate issues or validate changes without re-running physical tests.

* [Learn about replay →](../quix-cloud/managed-services/replay.md)
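The replay idea can be sketched in plain Python: re-emit recorded samples while preserving their original inter-sample spacing, optionally scaled by a speed factor. `replay` and `emit` here are illustrative stand-ins, not Quix functions; in practice `emit` would be a produce-to-topic call.

```python
import time

def replay(samples, speed=1.0, emit=print):
    """Re-emit recorded (timestamp_s, value) samples, preserving the
    original inter-sample spacing scaled by `speed` (2.0 = twice as fast).
    `emit` stands in for a produce-to-topic call."""
    prev_ts = None
    for ts, value in samples:
        if prev_ts is not None:
            # Wait out the recorded gap, compressed by the speed factor.
            time.sleep(max(0.0, (ts - prev_ts) / speed))
        emit((ts, value))
        prev_ts = ts
```

Because the consumer sees the same message shapes at the same relative pacing, the downstream pipeline cannot tell a replay from a live run, which is what makes debugging against historical data useful.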

**Python-based pipeline development**: Use Quix Streams to build data processing pipelines in Python. Develop and test locally, then deploy to managed infrastructure without requiring Kafka or DevOps expertise.

* [Get started with Quix Streams →](../quix-streams/quickstart.md)
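A per-message transform of the kind such a pipeline runs might look like the following plain-Python sketch. The `enrich` helper and its field names are hypothetical, not from the Quix Streams API; in a real pipeline a function like this would be applied to each message consumed from a topic.

```python
def enrich(row):
    """Per-message transform: derive a Fahrenheit reading and flag
    out-of-range values. Field names are illustrative assumptions."""
    temp_c = row["temp_c"]
    return {
        **row,
        "temp_f": temp_c * 9 / 5 + 32,
        "alert": temp_c > 95.0,  # hypothetical threshold for this sketch
    }

enrich({"sensor_id": "rig-07", "temp_c": 98.6})
```

Keeping transforms as small pure functions like this makes them easy to unit-test locally before wiring them into a deployed pipeline.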

## Typical Workflow

1. **Develop locally**: Build Python data processing pipelines with Quix Streams, your IDE, and the Quix CLI. Test with Docker before deployment.

2. **Debug with replay**: Replay historical test runs locally to reproduce issues or validate fixes.

3. **Deploy to production**: Push pipelines to Quix Cloud for managed infrastructure, monitoring, and data persistence. Alternatively, use BYOC to run on your own Kafka cluster.

<div class="grid cards" markdown>

- __Get started__

---

Install Quix Streams and build your first data processing pipeline.

[Install Quix Streams :octicons-arrow-right-24:](../quix-streams/quickstart.md)
