
Pipe Language

A pipe consists of an input, which generates data; the data passes through a series of actions and is finally sent to the desired destination by an output.

In addition, some inputs may also be used as actions, as may some outputs.
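
Putting these together, a minimal pipe definition might look like the sketch below. This is an illustrative assumption, not a schema reference: it presumes a YAML layout with top-level input, actions, and output keys, and the option names shown are guesses; see the individual pages below for the actual options.

```yaml
# Hypothetical minimal pipe (layout and option names are assumptions):
# an exec input generates events, one raw action reshapes them,
# and a print output sends them to STDOUT.
input:
  exec:
    command: date          # run a command and emit its output as an event
actions:
  - raw:
      to-json:
        field: timestamp   # wrap each plain-text line as a JSON field
output:
  print: STDOUT            # print each resulting event to standard output
```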

Inputs

amqp: Read from AMQP queues

azure-blob: Read data from a Microsoft Azure Storage Blob (Block Storage)

echo: Create a simple static event

exec: Execute arbitrary commands

files: Read from multiple files, in order of creation

http-poll: Run HTTP queries (GET and POST); see the sketch after this list

http-server: Run an HTTP server

kafka: Consume events from one or more Kafka topics

redis: Read from the Redis in-memory key-value store

s3: Stream data from an S3 object

scuba: Run BQL queries against a Scuba API endpoint

sql: Query a SQL database

tcp: Listen for incoming TCP connections (or connect to an existing server)

udp: Listen for incoming UDP packets
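
As a sketch of how a single input might be configured, here is a hypothetical http-poll source, referenced from the list above. The option names url and interval are assumptions for illustration; consult the http-poll page for the real fields.

```yaml
# Hypothetical http-poll input: periodically query an HTTP endpoint
# and emit each response as an event. Option names are assumed.
input:
  http-poll:
    url: https://example.com/api/metrics   # endpoint to GET (assumed option)
    interval: 30s                          # polling period (assumed option)
```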

Outputs

amqp: Send events to an AMQP server

azure-blob: Send data to a Microsoft Azure Storage Blob (Block Storage)

azure-monitor: Send data to Azure Monitor

elastic: Send events to an Elasticsearch server

exec: Execute arbitrary commands

file: Write to a file; see the sketch after this list

http-get: Run HTTP GET requests

http-post: Run HTTP POST requests

http-server: Run an HTTP server

kafka: Write to a Kafka topic

print: Print to either STDOUT (the standard output for the terminal) or STDERR

redis: Write to the Redis in-memory key-value store

s3: Write events to a file in an S3 bucket

splunk-hec: Output events to a Splunk HTTP Event Collector endpoint (Splunk HEC)

sql: Insert data into a SQL database

tcp: Send data to a TCP server

udp: Send data to a UDP server
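
For the output side, a hedged sketch of the file output referenced above might look as follows; the path option name is an assumption, not the documented schema.

```yaml
# Hypothetical file output: append each event to a local file.
output:
  file:
    path: /var/log/pipe/output.log   # destination file (assumed option name)
```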

Actions

add: Add new fields to an event

collapse: Convert JSON records to another format, such as CSV or key-value pairs

convert: Convert the data types of values

  • pairs: Convert the given field: type pairs

  • full: Additional conversion options

enrich: Enrich data using a CSV lookup

expand: Convert simple separated data into JSON

extract: Extract data from plain text, using a pattern

exec: Execute arbitrary commands

filter: Remove events, based on given conditions (see the sketch after this list)

  • schema: Accept events that contain only the given fields

  • condition: Accept an event only if the given expression is true

  • patterns: Patterns that must match the field values for an event to pass through

  • exclude: Patterns that must not match for an event to pass through

flatten: Flatten nested JSON objects and arrays into a single JSON object containing only top-level fields

generate: Create new events, specifically for alerts

  • map: A map of events to generate when conditions are met

  • alert: A single event to generate when conditions are met

raw: Operations on raw (non-JSON) data

  • condition: Only run this action if the specified condition is met

  • to-json: Wrap plain text as a JSON field

  • extract: Extract data from plain text, using a pattern

  • replace: Replace data in plain text, using a pattern

  • discard-until: A pattern that must match before data starts to be processed

  • multiline: Combine lines into multi-line events

remove: Remove fields

  • warn-on-missing: Emit a warning if a field expected to be removed is missing

  • condition: Only run this action if the specified condition is met

  • fields: The list of fields to remove if present

rename: Rename fields

  • condition: Only run this action if the specified condition is met

  • fields: The array of fields and the names they should be renamed to

script: Set fields to computed values, perhaps conditionally

stalled: Report when a stream has stopped receiving events for a given duration

stream: Create a new field calculated over historical data

time: Time and timestamp manipulation

transaction: Collect events together, based on a condition, into a single new event

transition: Perform various actions when a field's value changes
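
To show how actions chain together, here is a hedged sketch combining the add and filter actions, as referenced in the list above. The option names (output-fields, condition) and the condition syntax are illustrative assumptions, not the documented schema.

```yaml
# Hypothetical actions chain: tag every event with a static field,
# then keep only events whose level field equals "error".
actions:
  - add:
      output-fields:
        source: web-01              # add a static field (option name assumed)
  - filter:
      condition: level == 'error'   # drop non-matching events (syntax assumed)
```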