diff --git a/README.md b/README.md
index 253c968..dfbef92 100644
--- a/README.md
+++ b/README.md
@@ -3,16 +3,17 @@
 The 12 factor [rule](https://12factor.net/logs) for logging says that an app "should not attempt to write to or manage logfiles. Instead, each running process writes its event stream, unbuffered, to stdout." The execution environment should take care of capturing the logs and perform further processing with it.
 
-Funnel is meant to be a replacement for your app's "logger + [logrotate](http://www.linuxcommand.org/man_pages/logrotate8.html)" pipeline. No more sending SIGHUP signals, or reload commands to open a new file. No more setting up conditional flags to switch between writing to console and file when changing from dev to production. All you need to do is just print to stdout and pipe it to funnel. And let it take care of the rest.
+Funnel is meant to be a replacement for your app's "logger + [logrotate](http://www.linuxcommand.org/man_pages/logrotate8.html)" pipeline. Think of it as a fluentd/logstash replacement (with minimal features!), but one that accepts only stdin as input. All you need to do is print to stdout and pipe it to funnel, and let it take care of the rest.
 
 ### Features quick tour
 
-- Basic feature set of a log rotator:
+- Basic use case of logging to local files. Also acts as a log rotator:
   * Rolling over to a new file
   * Deleting old files
   * Gzipping files
   * File rename policies
 - Prepend each log line with a custom string
-- Live reloading of config file on save. No more messing around with SIGHUP or SIGUSR1.
+- Supports other target outputs like Kafka and ElasticSearch. More info below.
+- Live reloading of the config on file save. No more messing around with SIGHUP or SIGUSR1.
 
 ### Quickstart
 
@@ -26,6 +27,18 @@
 $/etc/myapp/bin 2>&1 | funnel
 
 P.S. You also need to drop the funnel binary to your $PATH.
+
+### Target outputs and use cases
+
+| Output | Description | Log format |
+| ------ | ----------- | ---------- |
+| File | Writes to local files | No format needed |
+| Kafka | Sends your log stream to a Kafka topic | No format needed |
+| Redis pub-sub | Sends your log stream to a Redis pub-sub channel | No format needed |
+| ElasticSearch | Index, search, and analyze structured JSON logs | Logs have to be in JSON format |
+| InfluxDB | Use InfluxDB if your app emits time-series data that needs to be queried and graphed | Logs have to be in JSON format with `tags` and `fields` as the keys |
+
+Further details on the input log format, along with examples, can be found in the sample config [file](config.toml#L49).
 
 ### Configuration
 The config can be specified in a .toml file. The file is part of the repo, which you can see [here](config.toml). All the settings are documented and are populated with the default values. The same defaults are embedded in the app itself, so the app can even run without a config file.
@@ -42,8 +55,26 @@
 Environment variables are also supported and takes the highest precedence. To get the env variable name, just capitalize the config name and replace the dot with an underscore.
 - `rollup.file_rename_policy` becomes `ROLLUP_FILE_RENAME_POLICY`
 ### TODO:
-- Add benchmarks
-- Add new output targets like ElasticSearch, InfluxDB, AmazonS3
+- Add benchmarks.
+- Add new output targets.
 - Add stats endpoint to expose metrics.
 #### Footnote
 - This project was heavily inspired from the [logsend](https://github.com/ezotrank/logsend) project.
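+
+#### Example: structured JSON log line
+
+A minimal sketch of what a log line for the structured targets might look like, assuming the `tags`/`fields` layout from the table above. The inner key names (`host`, `cpu_load`) are illustrative, not from the docs; the sample config holds the authoritative examples.
+
+```sh
+# One structured log line piped to funnel's stdin; it is routed per config.toml
+echo '{"tags": {"host": "server01"}, "fields": {"cpu_load": 0.72}}' | funnel
+```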
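+
+#### Example: environment variable override
+
+A minimal sketch of the precedence rule from the Configuration section, reusing the documented `rollup.file_rename_policy` mapping. The value `serial` is a hypothetical policy name used purely for illustration.
+
+```sh
+# Env vars take the highest precedence, so this overrides the value in config.toml
+ROLLUP_FILE_RENAME_POLICY=serial /etc/myapp/bin 2>&1 | funnel
+```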