Pipes Language
A pipe consists of an input, which generates data; that data passes through a series of actions and is finally sent to the desired destination by an output.
In addition, some inputs and some outputs may also be used as actions.
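To make the shape concrete, here is a minimal sketch of a pipe definition. It assumes a YAML layout with input, actions, and output keys; the pipe name, the echo message, and the add fields are illustrative, not taken from the reference below.

    name: hello-pipe           # illustrative pipe name
    input:
      echo:                    # generate one static event
        message: '{"greeting": "hello"}'
    actions:
      - add:                   # stamp the event with an extra field
          fields:
            source: demo
    output:
      print: {}                # write the result to STDOUT

The pipe reads a single static event from echo, adds a source field, and prints the result.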
Inputs
amqp: Read from AMQP queues
azure-blob: Read data from a Microsoft Azure Storage Blob (Block Storage)
echo: Create a simple static event
exec: Execute arbitrary commands
files: Read from multiple files, in order of creation
http-poll: Run HTTP queries (GET and POST)
http-server: Run HTTP server
kafka: Consume events from one or more Kafka topics
redis: Read from Redis in-memory key-value store
s3: Stream data from an S3 object
scuba: Run BQL queries against a Scuba API endpoint
sql: Query a SQL database
tcp: Listen for incoming TCP connections (or connect to an existing server)
udp: Listen for incoming UDP connections
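As a sketch of how an input is selected and configured, still assuming the YAML layout above, a files input might look like this (the path field name is illustrative):

    input:
      files:                   # read matching files in order of creation
        path: /var/log/app/*.log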
Outputs
amqp: Send events to AMQP server
azure-blob: Send data to a Microsoft Azure Storage Blob (Block Storage)
azure-monitor: Send data to Azure Monitor
elastic: Send events to an Elasticsearch server
exec: Execute arbitrary commands
file: Write to a file
http-get: Run HTTP GET requests
http-post: Run HTTP POST requests
http-server: Run HTTP server
kafka: Write to a Kafka topic
print: Print to either STDOUT (the standard output for the terminal) or STDERR
redis: Write to Redis in-memory key-value store
s3: Write events to a file in an S3 bucket
splunk-hec: Output events to a Splunk HTTP Event Collector endpoint (Splunk HEC)
sql: Insert data into a SQL database
tcp: Send data to a TCP server
udp: Send data to a UDP server
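Outputs are configured the same way. A hedged sketch of a file output, where the path field name is again illustrative:

    output:
      file:                    # append events to a local file
        path: /tmp/events.json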
Actions
abort: Abort the pipe if the condition is met
assert: Validate an event against a JSON Schema, based on IETF's draft v7 (http://json-schema.org)
add: Add new fields to an event
collapse: Convert JSON records into another format, such as CSV or key-value pairs
convert: Convert the data types of values
  pairs: Convert the following field (type pairs)
  full: Additional conversion options
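A sketch of a convert action under the assumed YAML layout; the field names, the target types, and the shape of the option block are all illustrative guesses:

    actions:
      - convert:               # coerce field values to new types
          - duration: num
          - retries: int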
enrich: Use a CSV lookup to enrich data
expand: Convert simple separated data into JSON
extract: Extract data from plain text, using a pattern
exec: Execute arbitrary commands
filter: Remove events, based on given conditions
  schema: Accept events that contain only the given fields
  condition: Accept events only if the given expression is true
  patterns: Patterns that must match the field values for the event to go through
  exclude: Patterns that must not match for the event to go through
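For example, a filter that only passes events whose status field indicates an error might look like this; the condition expression syntax is assumed, not documented here:

    actions:
      - filter:                # drop events that fail the condition
          condition: status == 'error'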
flatten: Flatten nested JSON Objects and Arrays into a single JSON Object containing only top-level fields.
generate: Create new events, specifically for alerts
  map: A map of events to generate when conditions are met
  alert: A single event to generate when conditions are met
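A hedged sketch of a generate action producing a single alert event; the condition expression and the event fields are illustrative:

    actions:
      - generate:
          alert:               # emit one new event when the condition holds
            condition: cpu > 90
            event:
              level: critical
              message: CPU usage above 90 percent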
raw: Operations on raw (non-JSON) data
  condition: Only run this action if the specified condition is met
  to-json: Wrap plain text as a JSON field
  extract: Extract data from plain text, using a pattern
  replace: Replace data in plain text, using a pattern
  discard-until: A pattern that must match before data starts to be processed
  multiline: Combine lines into multi-line events
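A sketch of a raw action that extracts an IP address from plain text; the pattern and the output field name are illustrative:

    actions:
      - raw:
          extract:             # pull the first IPv4-looking token out of the line
            pattern: '(\d+\.\d+\.\d+\.\d+)'
            field: client_ip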
remove: Remove fields
  warn-on-missing: Emit a warning if a field expected to be removed is missing
  condition: Only run this action if the specified condition is met
  fields: The list of fields that should be removed if present
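A sketch of a remove action dropping sensitive fields if they are present:

    actions:
      - remove:
          warn-on-missing: false   # stay quiet if a field is already absent
          fields:
            - password
            - secret_token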
rename: Rename fields
  condition: Only run this action if the specified condition is met
  fields: The array of fields and what they should be renamed to
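A sketch of a rename action, assuming the fields array maps old names to new ones:

    actions:
      - rename:
          fields:              # old name on the left, new name on the right
            - msg: message
            - ts: timestamp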
script: Set fields to computed values, perhaps conditionally
stalled: Report when a stream has stopped receiving events for a given duration
stream: Create a new field calculated from historical data
time: Time and timestamp manipulation
transaction: Collect events together, based on some condition, to make a single new event
transition: Perform various actions based on a changed field
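Putting the pieces together, a complete pipe chains one input, several actions, and one output. Everything below follows the same assumed YAML layout; the file path, pattern fields, and condition expression are illustrative:

    name: error-alerts
    input:
      files:                   # tail application log files
        path: /var/log/app/*.log
    actions:
      - raw:
          to-json:             # wrap each plain-text line as a JSON field
            field: message
      - filter:                # keep only error lines
          condition: message.contains('ERROR')
      - add:                   # tag the event with its origin
          fields:
            source: app-logs
    output:
      print: {}                # write the surviving events to STDOUT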