Improve onboarding pages for ingesting and querying #258

Open · wants to merge 10 commits into `main`
2 changes: 1 addition & 1 deletion apl/entities/entity-names.mdx
Original file line number Diff line number Diff line change
@@ -36,6 +36,6 @@ Quote an identifier in your APL query if any of the following is true:
- Dash (`-`)
- The identifier name is identical to one of the reserved keywords of the APL query language. For example, `project` or `where`.

If any of the above is true, you must quote the identifier by putting it in quotation marks (`'` or `"`) and square brackets (`[]`). For example, `['my-field']`.
If any of the above is true, you must quote the identifier by enclosing it in quotation marks (`'` or `"`) and square brackets (`[]`). For example, `['my-field']`.

If none of the above is true, you don’t need to quote the identifier in your APL query. For example, `myfield`. In this case, quoting the identifier name is optional.
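For example, a quoted identifier in a hypothetical query (`my-dataset` and `my-field` are illustrative names; `my-field` requires quoting because it contains a dash):

```kusto
['my-dataset']
| where ['my-field'] == 'value'
```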
89 changes: 69 additions & 20 deletions apl/introduction.mdx
@@ -6,45 +6,94 @@ icon: door-open
tags: ['axiom documentation', 'documentation', 'axiom', 'APL', 'axiom processing language', 'data explorer', 'getting started guide', 'summarize', 'filter']
---

## Introduction
import Prerequisites from "/snippets/minimal-prerequisites.mdx"

The Axiom Processing Language (APL) is a query language that is perfect for getting deeper insights from your data. Whether logs, events, analytics, or similar, APL provides the flexibility to filter, manipulate, and summarize your data exactly the way you need it.

## Get started
<Prerequisites />

Go to the Query tab and click one of your datasets to get started. The APL editor has full auto-completion so you can poke around or you can get a better understanding of all the features by using the reference menu to the left of this page.
## Build an APL query

## APL query structure
APL queries consist of the following:

At a minimum, a query consists of source data reference (name of a dataset) and zero or more query operators applied in sequence. Individual operators are delimited using the pipe character (`|`).
- **Data source:** The most common data source is one of your Axiom datasets.
- **Operators:** Operators filter, manipulate, and summarize your data.

APL query has the following structure:
Delimit operators with the pipe character (`|`).

A typical APL query has the following structure:

```kusto
DataSource
| operator ...
| operator ...
DatasetName
| Operator ...
| Operator ...
```

Where:
- `DatasetName` is the name of the dataset you want to query.
- `Operator` is an operation you apply to the data.

- DataSource is the name of the dataset you want to query
- Operator is a function that will be applied to the data
<Note>
Apart from Axiom datasets, you can use other data sources:
- External data sources using the [externaldata](/apl/tabular-operators/externaldata-operator) operator.
- Specify a data table in the APL query itself using the `let` statement.
</Note>
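As an illustration of the second option, a Kusto-style sketch of an inline table defined with `let` — the table name and contents are made up, and the exact `datatable` syntax should be verified against the APL reference:

```kusto
let recentDeploys = datatable (service: string, version: string) [
  'api', '1.4.2',
  'web', '2.0.1'
];
recentDeploys
| where service == 'api'
```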

Let’s look at an example query.
## Example query

```kusto
['github-issue-comment-event']
| extend bot = actor contains "-bot" or actor contains "[bot]"
| where bot == true
| extend isBot = actor contains '-bot' or actor contains '[bot]'
| where isBot == true
| summarize count() by bin_auto(_time), actor
```

The query above begins with reference to a dataset called **github-issue-comment-event** and contains several operators, [extend](/apl/tabular-operators/extend-operator), [where](/apl/tabular-operators/where-operator), and [summarize](/apl/tabular-operators/summarize-operator), each separated by a `pipe`. The extend operator creates the **bot** column in the returned result, and sets its values depending on the value of the actor column, the **where** operator filters out the value of the **bot** to a branch of rows and then produce a chart from the aggregation using the **summarize** operator.
[Run in Playground](https://play.axiom.co/axiom-play-qf1k/query?initForm=%7B%22apl%22%3A%22%5B'github-issue-comment-event'%5D%20%7C%20extend%20isBot%20%3D%20actor%20contains%20'-bot'%20or%20actor%20contains%20'%5Bbot%5D'%20%7C%20where%20isBot%20%3D%3D%20true%20%7C%20summarize%20count()%20by%20bin_auto(_time)%2C%20actor%22%7D)

The query above uses a dataset called `github-issue-comment-event` as its data source. It uses the following operators:

- [extend](/apl/tabular-operators/extend-operator) adds a new field `isBot` to the query results. It sets the values of the new field to true if the values of the `actor` field in the original dataset contain `-bot` or `[bot]`.
- [where](/apl/tabular-operators/where-operator) filters for the values of the `isBot` field. It only returns rows where the value is true.
- [summarize](/apl/tabular-operators/summarize-operator) aggregates the data and produces a chart.

Each operator is separated using the pipe character (`|`).

## Example result

As a result, the query returns a chart and a table. The table counts the different values of the `actor` field where `isBot` is true, and the chart displays the distribution of these counts over time.

```
| actor | count_ |
|---------------------|--------|
| github-actions[bot] | 487 |
| sonarqubecloud[bot] | 208 |
| dependabot[bot] | 148 |
| vercel[bot] | 91 |
| codecov[bot] | 63 |
| openshift-ci[bot] | 52 |
| coderabbitai[bot] | 43 |
| netlify[bot] | 37 |
```

<Note>
The query results are a representation of your data based on your request. The query doesn’t change the original dataset.
</Note>

## Quote dataset and field names

If the name of a dataset or field contains at least one of the following special characters, quote the name in your APL query:
- Space (` `)
- Dot (`.`)
- Dash (`-`)

To quote the dataset or field in your APL query, enclose its name with quotation marks (`'` or `"`) and square brackets (`[]`). For example, `['my-field']`.

For more information on rules about naming and quoting entities, see [Entity names](/apl/entities/entity-names).
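For example, the dataset used earlier on this page must be quoted because its name contains dashes:

```kusto
['github-issue-comment-event']
| where actor contains '-bot'
| project _time, actor
```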

The most common kind of query statement is a tabular expression statement. Tabular statements contain operators, each of which starts with a tabular `input` and returns a tabular `output`.
## What's next

- Explore the [tabular operators](/apl/tabular-operators/extend-operator) we support.
- Check out our [entity names and identifier naming rules](/apl/entities/entity-names).
Check out the [list of sample queries](/apl/tutorial) or explore the supported operators and functions:

Axiom Processing Language supplies a set of system [data types](/apl/data-types/scalar-data-types) that define all the types of [data](/apl/data-types/null-values) that can be used with Axiom Processing Language.
- [Scalar functions](/apl/scalar-functions/)
- [Aggregation functions](/apl/aggregation-function/)
- [Tabular operators](/apl/tabular-operators/)
- [Scalar operators](/apl/scalar-operators/)
239 changes: 118 additions & 121 deletions apl/tutorial.mdx

Large diffs are not rendered by default.

Binary file removed doc-assets/shots/overview-of-apl-introduction.png
Binary file not shown.
301 changes: 167 additions & 134 deletions restapi/ingest.mdx
@@ -1,65 +1,47 @@
---
title: "Send data via Axiom API"
description: "Learn how to send and load data into Axiom using the API."
title: "Send data to Axiom via API"
description: "This page explains how to send data to Axiom using the API."
sidebarTitle: Send data
tags: ['axiom documentation', 'documentation', 'axiom', 'axiom api', 'rest api', 'rest', 'authorization', 'headers', 'send data', 'ingesting', 'json', 'arrays', 'nested arrays', 'objects', 'strings', 'csv', 'response']
---

This API allows you to send and load data into Axiom. You can use different methods to ingest logs depending on your requirements and log format.
import Prerequisites from "/snippets/standard-prerequisites.mdx"
import ReplaceDatasetToken from "/snippets/replace-dataset-token.mdx"

## Authorization and headers
The Axiom REST API accepts the following data formats:

The only expected header is `Authorization: Bearer`, which is the token that authenticates the request. For more information, see [Tokens](/reference/tokens).
- [JSON](#send-data-in-json-format)
- [NDJSON](#send-data-in-ndjson-format)
- [CSV](#send-data-in-csv-format)

## Using Axiom JS library to ingest data
This page explains how to send data to Axiom via cURL commands in each of these formats, and how to send data with the [Axiom Node.js library](#send-data-with-axiom-node-js).

Axiom maintains the [axiom-js](https://github.com/axiomhq/axiom-js) library to provide official JavaScript bindings for the Axiom API.
For more information on other ingest options, see [Send data](/send-data/ingest).

Install using `npm install`:
For an introduction to the basics of the Axiom API and to the authentication options, see [Introduction to Axiom API](/restapi/introduction).

```shell
npm install @axiomhq/js
```

If you use the [Axiom CLI](https://github.com/axiomhq/cli), run `eval $(axiom config export -f)` to configure your environment variables.

Otherwise, create an [API token](/reference/tokens) and export it as `AXIOM_TOKEN`.

You can also configure the client using options passed to the constructor of the Client:

```ts
const client = new Client({
token: process.env.AXIOM_TOKEN
});
```

Create and use a client like this:

```ts
import { Axiom } from '@axiomhq/js';
The API requests on this page use the ingest data endpoint. For more information, see the [API reference](/restapi/endpoints/ingestIntoDataset).

<Prerequisites />

async function main() {
const axiom = new Axiom({
token: process.env.AXIOM_TOKEN
});
## Send data in JSON format

await axiom.ingest('my-dataset', [{ foo: 'bar' }]);
To send data to Axiom in JSON format:
1. Encode the events as JSON objects.
1. Enter the array of JSON objects into the body of the API request.
1. Optional: In the body of the request, set optional parameters such as `timestamp-field` and `timestamp-format`. For more information, see the [ingest data API reference](/restapi/endpoints/ingestIntoDataset).
1. Set the `Content-Type` header to `application/json`.
1. Set the `Authorization` header to `Bearer API_TOKEN`. Replace `API_TOKEN` with the Axiom API token you have generated.
1. Send the POST request to `https://api.axiom.co/v1/datasets/DATASET_NAME/ingest`. Replace `DATASET_NAME` with the name of the Axiom dataset where you want to send data.

const res = await axiom.query(`['my-dataset'] | where foo == 'bar' | limit 100`);
}
```
### Example with grouped events

These examples send an API event to Axiom. Before getting started with Axiom API, you need to create a [Dataset](/reference/datasets) and [API Token](/reference/tokens).
The following example request contains grouped events. The JSON payload has the schema `[ { "labels": { "key1": "value1", "key2": "value2" } } ]` where the array contains one or more JSON objects describing events.

## Ingest Events using JSON

The following example request contains grouped events. The structure of the `JSON` payload should have the scheme of `[ { "labels": { "key1": "value1", "key2": "value12" } }, ]`, in which the array comprises of one or more JSON objects describing Events.

### Example Request using JSON
**Example request**

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
@@ -74,9 +56,9 @@ curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
]'
```

### Example Response
<ReplaceDatasetToken />

A successful POST Request returns a `200` response code JSON with details:
**Example response**

```json
{
@@ -85,144 +67,178 @@ A successful POST Request returns a `200` response code JSON with details:
"failures": [],
"processedBytes": 219,
"blocksCreated": 0,
"walLength": 8
"walLength": 2
}
```
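The same request can be constructed programmatically. Below is a minimal JavaScript sketch; the helper name `buildJsonIngestRequest` is illustrative, not part of any Axiom library:

```javascript
// Sketch: build the ingest request for the JSON format described above.
function buildJsonIngestRequest(datasetName, apiToken, events) {
  return {
    url: `https://api.axiom.co/v1/datasets/${datasetName}/ingest`,
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiToken}`,
      'Content-Type': 'application/json',
    },
    // The body is an array of JSON objects, one per event.
    body: JSON.stringify(events),
  };
}

// Usage: pass the resulting request to fetch() or any HTTP client.
const req = buildJsonIngestRequest('my-dataset', 'API_TOKEN', [
  { labels: { key1: 'value1', key2: 'value2' } },
]);
console.log(req.url); // https://api.axiom.co/v1/datasets/my-dataset/ingest
```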

### Example Request using Nested Arrays
### Example with nested arrays

**Example request**

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
{
"axiom": [{
"logging":[{
"observability":[{
"location":[{
"credentials":[{
"datasets":[{
"first_name":"axiom",
"last_name":"logging",
"location":"global"
}],
"work":[{
"details":"https://app.axiom.co/",
"tutorials":"https://www.axiom.co/blog",
"changelog":"https://www.axiom.co/changelog",
"documentation": "https://www.axiom.co/docs"
}]
}],
"social_media":[{
"details":[{
"twitter":"https://twitter.com/AxiomFM",
"linkedin":"https://linkedin.com/company/axiomhq",
"github":"https://github.com/axiomhq"
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
{
"axiom": [{
"logging":[{
"observability":[{
"location":[{
"credentials":[{
"datasets":[{
"first_name":"axiom",
"last_name":"logging",
"location":"global"
}],
"work":[{
"details":"https://app.axiom.co/",
"tutorials":"https://www.axiom.co/blog",
"changelog":"https://www.axiom.co/changelog",
"documentation": "https://www.axiom.co/docs"
}]
}],
"features":[{
"datasets":"view logs",
"stream":"live_tail",
"explorer":"queries"
"social_media":[{
"details":[{
"twitter":"https://twitter.com/AxiomFM",
"linkedin":"https://linkedin.com/company/axiomhq",
"github":"https://github.com/axiomhq"
}],
"features":[{
"datasets":"view logs",
"stream":"live_tail",
"explorer":"queries"
}]
}]
}]
}],
"logs":[{
"apl": "functions"
}]
}],
"logs":[{
"apl": "functions"
}]
}],
"storage":[{}]
}]}
]'
"storage":[{}]
}]}
]'
```

### Example Response
<ReplaceDatasetToken />

A successful POST Request returns a `200` response code JSON with details:
**Example response**

```json
{
"ingested":1,
"failed":0,
"failures":[],
"processedBytes":1509,
"blocksCreated":0,
"walLength":6
"ingested":1,
"failed":0,
"failures":[],
"processedBytes":1587,
"blocksCreated":0,
"walLength":3
}
```

### Example Request using Objects, Strings, and Arrays
### Example with objects, strings, and arrays

**Example request**

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[{ "axiom": {
"logging": {
"observability": [
{ "apl": 23, "function": "tostring" },
{ "apl": 24, "operator": "summarize" }
],
"axiom": [
{ "stream": "livetail", "datasets": [4, 0, 16], "logging": "observability", "metrics": 8, "dashboard": 10, "alerting": "kubernetes" }
]
},
"apl": {
"reference":
[[80, 12], [30, 40]]
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[{ "axiom": {
"logging": {
"observability": [
{ "apl": 23, "function": "tostring" },
{ "apl": 24, "operator": "summarize" }
],
"axiom": [
{ "stream": "livetail", "datasets": [4, 0, 16], "logging": "observability", "metrics": 8, "dashboard": 10, "alerting": "kubernetes" }
]
},
"apl": {
"reference":
[[80, 12], [30, 40]]
}
}
}
}]'
}]'
```

### Example Response
<ReplaceDatasetToken />

A successful POST Request returns a `200` response code JSON with details:
**Example response**

```json
{
"ingested":1,
"failed":0,
"failures":[],
"processedBytes":432,
"blocksCreated":0,
"walLength":7
"ingested":1,
"failed":0,
"failures":[],
"processedBytes":432,
"blocksCreated":0,
"walLength":4
}
```

### Example Response
## Send data in NDJSON format

To send data to Axiom in NDJSON format:
1. Encode the events as JSON objects.
1. Enter each JSON object on a separate line in the body of the API request.
1. Optional: In the body of the request, set optional parameters such as `timestamp-field` and `timestamp-format`. For more information, see the [ingest data API reference](/restapi/endpoints/ingestIntoDataset).
1. Set the `Content-Type` header to `application/x-ndjson`.
1. Set the `Authorization` header to `Bearer API_TOKEN`. Replace `API_TOKEN` with the Axiom API token you have generated.
1. Send the POST request to `https://api.axiom.co/v1/datasets/DATASET_NAME/ingest`. Replace `DATASET_NAME` with the name of the Axiom dataset where you want to send data.

A successful POST Request returns a `200` response code JSON with details:
**Example request**

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/x-ndjson' \
-d '{"id":1,"name":"machala"}
{"id":2,"name":"axiom"}
{"id":3,"name":"apl"}
{"index": {"_index": "products"}}
{"timestamp": "2016-06-06T12:00:00+02:00", "attributes": {"key1": "value1","key2": "value2"}}
{"queryString": "count()"}'
```

<ReplaceDatasetToken />

**Example response**

```json
{
"ingested": 6,
"failed": 0,
"failures": [],
"processedBytes": 236,
"processedBytes": 266,
"blocksCreated": 0,
"walLength": 6
}
```
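The NDJSON body in the request above can also be produced programmatically — one serialized JSON object per line, joined with newlines. A minimal JavaScript sketch:

```javascript
// Sketch: encode an array of events as NDJSON, one JSON object per line.
const events = [
  { id: 1, name: 'machala' },
  { id: 2, name: 'axiom' },
  { id: 3, name: 'apl' },
];

// The joined string becomes the request body sent with the
// `Content-Type: application/x-ndjson` header.
const ndjsonBody = events.map((event) => JSON.stringify(event)).join('\n');

console.log(ndjsonBody);
// {"id":1,"name":"machala"}
// {"id":2,"name":"axiom"}
// {"id":3,"name":"apl"}
```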

## Ingest Events using CSV
## Send data in CSV format

The following example request contains events. The structure of the `CSV` payload uses a comma to separate values `'value1, value2, value3'`.
To send data to Axiom in CSV format:
1. Encode the events in CSV format. The first line specifies the field names separated by commas. Subsequent lines specify the values separated by commas.
1. Enter the CSV representation in the body of the API request.
1. Optional: In the body of the request, set optional parameters such as `timestamp-field` and `timestamp-format`. For more information, see the [ingest data API reference](/restapi/endpoints/ingestIntoDataset).
1. Set the `Content-Type` header to `text/csv`.
1. Set the `Authorization` header to `Bearer API_TOKEN`. Replace `API_TOKEN` with the Axiom API token you have generated.
1. Send the POST request to `https://api.axiom.co/v1/datasets/DATASET_NAME/ingest`. Replace `DATASET_NAME` with the name of the Axiom dataset where you want to send data.

### Example Request using CSV
**Example request**

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: text/csv' \
-d 'user, name
foo, bar'
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: text/csv' \
-d 'user, name
foo, bar'
```

### Example Response
<ReplaceDatasetToken />

A successful POST Request returns a 200 response code JSON with details:
**Example response**

```json
{
@@ -235,4 +251,21 @@ A successful POST Request returns a 200 response code JSON with details:
}
```
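The CSV body can likewise be built from field names and rows. A minimal JavaScript sketch — note the naive join doesn't escape commas or quotes inside values:

```javascript
// Sketch: encode events as CSV. The first line names the fields; each
// following line holds the comma-separated values of one event.
const fields = ['user', 'name'];
const rows = [['foo', 'bar']];

const csvBody = [fields.join(','), ...rows.map((row) => row.join(','))].join('\n');

console.log(csvBody);
// user,name
// foo,bar
```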

Dataset names are usually case sensitive. They must be between 1 and 128 characters and may only contain ASCII alphanumeric characters and the `-` character.
## Send data with Axiom Node.js

1. [Install and configure](/guides/javascript#use-axiomhq-js) the Axiom Node.js library.
1. Encode the events as JSON objects.
1. Pass the dataset name and the array of JSON objects to the `axiom.ingest` function.

```ts
axiom.ingest('DATASET_NAME', [{ foo: 'bar' }]);
await axiom.flush();
```

<ReplaceDataset />

For more information on other libraries you can use to query data, see [Send data](/send-data/ingest).

## What’s next

After ingesting data to Axiom, you can [query it via API](/restapi/query) or the [Axiom app UI](/query-data/explore).
9 changes: 8 additions & 1 deletion restapi/introduction.mdx
@@ -24,7 +24,7 @@ Axiom API follows the REST architectural style and uses JSON for serialization.
For example, the following curl command ingests data to an Axiom dataset:

```bash
curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
curl -X 'POST' 'https://api.axiom.co/v1/datasets/DATASET_NAME/ingest' \
-H 'Authorization: Bearer API_TOKEN' \
-H 'Content-Type: application/json' \
-d '[
@@ -34,6 +34,8 @@ curl -X 'POST' 'https://api.axiom.co/v1/datasets/{dataset_name}/ingest' \
]'
```

For more information, see [Send data to Axiom via API](/restapi/ingest) and [Ingest data endpoint](/restapi/endpoints/ingestIntoDataset).

## Regions

All examples in the Axiom API reference use the base domain `https://api.axiom.co`, which is the default for the US region. If your organization uses the EU region, change the base domain in the examples to `https://api.eu.axiom.co`.
@@ -79,3 +81,8 @@ Below is a list of the types of data used within the Axiom API:
| **Float** | A number with decimals. | 15.67 |
| **Map** | A data structure with a list of values assigned to a unique key. | \{ "key": "value" \} |
| **List** | A data structure with only a list of values separated by a comma. | ["value", 4567, 45.67] |

## What's next

- [Ingest data via API](/restapi/ingest)
- [Query data via API](/restapi/query)
758 changes: 299 additions & 459 deletions restapi/query.mdx

Large diffs are not rendered by default.