write_to_file Operator
The write_to_file operator persists pipeline data to disk.
Syntax
| write_to_file(<file_path> [, <options>])
Description
This operator writes documents from the pipeline to a specified file location. The operator supports various output formats and writing modes to accommodate different persistence requirements.
Parameters
file_path
: String expression for the file path

options
: Optional object with file writing options
Examples
Basic Usage
| write_to_file("output.json")
| write_to_file("logs/events.log")
| write_to_file("data/" + date + ".json")
With Options
| write_to_file("output.json", { format: "json" })
| write_to_file("data.csv", { format: "csv", delimiter: "," })
| write_to_file("logs.txt", { append: true })
Dynamic File Paths
| write_to_file("logs/" + year + "/" + month + "/events.json")
| write_to_file("exports/" + product_id + "_data.json")
| write_to_file("backup/" + timestamp + ".json")
In Complete Flows
create flow data_export as
sensor_data
| where temperature > 25
| select { id, temp: temperature, timestamp }
| write_to_file("high_temp_events.json")
create flow log_processor as
application_logs
| where level = "error"
| write_to_file("error_logs.json", { append: true })
File Options
Format Options
format: "json"
- JSON format (default)

format: "csv"
- CSV format

format: "ndjson"
- Newline-delimited JSON
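As a sketch, each format can be selected explicitly (the file names here are hypothetical):
| write_to_file("events.json", { format: "json" })
| write_to_file("events.csv", { format: "csv" })
| write_to_file("events.ndjson", { format: "ndjson" })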
Writing Options
append: true
- Append to existing file

append: false
- Overwrite existing file (default)
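As a sketch (file names hypothetical), the first call below overwrites its target on each run, while the second keeps appending:
| write_to_file("daily_report.json", { append: false })
| write_to_file("audit.log", { append: true })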
CSV Options
delimiter: ","
- Field delimiter

headers: true
- Include column headers
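A sketch combining the CSV options above (file name hypothetical; assumes delimiter and headers can be set in the same options object):
| write_to_file("report.csv", { format: "csv", delimiter: ";", headers: true })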
Performance Considerations
- File I/O can be slow for high-volume data
- Choose a format suited to your workload; newline-delimited JSON works well for line-by-line writes
- Very large output files can degrade system performance
- Use append mode carefully to avoid file corruption, especially when multiple flows target the same file
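For example, a high-volume flow might pair append mode with newline-delimited JSON so each document lands as a single appended line (a sketch; the path is hypothetical):
| write_to_file("logs/events.ndjson", { format: "ndjson", append: true })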
Related Operators
- insert_into - Route data to streams
- assert_or_save_expected - Testing output