Splunk
- Port
- Data format. Allowed values: `JSON`, `DELIMITED`, `AVRO`, `LOG`. Default - `JSON`. See the data formats config examples
- Grok pattern definitions file path - specify a path to a file with predefined grok patterns
- Grok pattern - specify an actual grok pattern to parse the message
- Delimited format type (`CSV`, `CUSTOM`) - `CSV` is the default, comma-separated format
- Custom delimiter character
- Change field names for `DELIMITED` data. Format - `key:val,key2:val2,key3:val3`
- Schema file path for `AVRO` format - specify a file path with a JSON schema
- Max Batch Size (records) - how many records to send to further pipeline stages. Default - 1000 records
- Batch Wait Time (ms) - how long to wait for the batch to reach its size. Default - 1000 ms
```
> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter a unique source name: splunk
Ports: 9999
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]:
Source config created
```
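
With the default `JSON` format, each record arriving on the port should be a JSON object. As a quick smoke test you can push a sample record to the port; this is only a sketch, assuming the source accepts newline-delimited JSON over plain TCP (the field names are borrowed from the example config at the bottom of this page):

```
echo '{"timestamp_unix": 1609459200, "Country": "US", "Clicks": 42}' | nc localhost 9999
```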
```
> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter a unique source name: splunk
Ports: 9998
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]: LOG
Grok pattern definitions file path []:
Grok pattern: %{COMMONAPACHELOG}
Source config created
```
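
`%{COMMONAPACHELOG}` is one of the standard predefined grok patterns, so no definitions file is needed in this example. It parses Apache common log format lines into fields such as `clientip`, `timestamp`, `verb`, `request`, `response`, and `bytes`. A sample line it matches (made up for illustration):

```
127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326
```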
```
> agent source create
Choose source (influx, kafka, mongo, mysql, postgres, elastic, splunk): splunk
Enter unique name for this source config: splunk_csv
Ports: 9996
Data format (JSON, DELIMITED, AVRO, LOG) [JSON]: DELIMITED
Delimited format type (CSV, CUSTOM): CUSTOM
Custom delimiter character: |
Connecting to the source. Check again after 2 seconds...
...
Change fields names (format - key:val,key2:val2,key3:val3) []: 0:timestamp,2:ver,4:Country,7:Clicks
Source config created
```
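
Given the `|` delimiter and the mapping above, a raw record would look like the sample below (values are made up). Columns are numbered from 0, so column 0 is renamed to `timestamp`, column 2 to `ver`, column 4 to `Country`, and column 7 to `Clicks`:

```
1609459200|a|1.2|b|US|c|d|42
```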
| Property | Type | Description |
|---|---|---|
| `type` | String | Specify source type. Value - `splunk` |
| `name` | String | Unique source name - also the config file name |
| `config` | Object | Source configuration |

All properties are required.
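
Together these three properties give every config file the same shape; a minimal skeleton (the source name here is hypothetical, and the contents of `config` are described in the next table):

```
{
  "type": "splunk",
  "name": "my_splunk_source",
  "config": {}
}
```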
| Property | Type | Required | Description |
|---|---|---|---|
| `conf.ports` | Array | Yes | List of ports to listen on |
| `conf.dataFormat` | String | No | Allowed values: `JSON`, `DELIMITED`, `AVRO`, `LOG`. Default - `JSON` |
| `conf.dataFormatConfig.csvFileFormat` | String | No | Allowed values: `CSV`, `CUSTOM`. Default - `CSV` |
| `conf.csvCustomDelimiter` | String | No | Custom delimiter |
| `csv_mapping` | Object | No | Names of columns for delimited data |
| `conf.dataFormatConfig.avroSchemaSource` | String | No | Allowed values: `SOURCE` (schema is present in the data itself), `INLINE` (specify the schema in the `conf.dataFormatConfig.avroSchema` parameter), `REGISTRY` (Confluent schema registry) |
| `conf.dataFormatConfig.avroSchema` | Object | No | Avro schema (JSON object) |
| `conf.dataFormatConfig.schemaRegistryUrls` | Array | No | Schema registry URLs |
| `conf.dataFormatConfig.schemaLookupMode` | String | No | How to look up a schema in the registry. Allowed values: `SUBJECT`, `ID`, `AUTO` |
| `conf.dataFormatConfig.subject` | String | No | Schema subject (specify if `schemaLookupMode` is `SUBJECT`) |
| `conf.dataFormatConfig.schemaId` | String | No | Schema ID (specify if `schemaLookupMode` is `ID`) |
| `grok_definition_file` | String | No | File with grok patterns (see the sketch below the table) |
| `conf.dataFormatConfig.grokPattern` | String | No | Grok pattern to parse the message |
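
For `grok_definition_file`, the referenced file defines one named pattern per line, and those names can then be used in `conf.dataFormatConfig.grokPattern`. A minimal sketch (the pattern names and contents are hypothetical, shown only to illustrate the file format):

```
MY_DATE %{YEAR}-%{MONTHNUM}-%{MONTHDAY}
MY_LOG_LINE %{MY_DATE:date} %{WORD:level} %{GREEDYDATA:message}
```

A full example of a source config file for `DELIMITED` data: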
```
{
  "type": "splunk_server",
  "name": "test_splunk_csv",
  "config": {
    "conf.ports": ["9997"],
    "conf.dataFormat": "DELIMITED",
    "csv_mapping": {"0": "timestamp_unix", "2": "ver", "4": "Country", "6": "Exchange", "7": "Clicks"}
  }
}
```
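
For `AVRO` data with an inline schema, a config might look like the following sketch, built from the properties above (the schema and its field names are hypothetical):

```
{
  "type": "splunk_server",
  "name": "test_splunk_avro",
  "config": {
    "conf.ports": ["9995"],
    "conf.dataFormat": "AVRO",
    "conf.dataFormatConfig.avroSchemaSource": "INLINE",
    "conf.dataFormatConfig.avroSchema": {
      "type": "record",
      "name": "Click",
      "fields": [
        {"name": "timestamp_unix", "type": "long"},
        {"name": "Country", "type": "string"},
        {"name": "Clicks", "type": "int"}
      ]
    }
  }
}
```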