Fluent Bit
Fluent Bit is an agent that lets you customize log collection, processing, and upload, as well as import logs from third-party storage. Fluent Bit is extensible with input and output plugins. A list of plugins can be found on the Fluent Bit website (Outputs, Inputs).
With Fluent Bit, you can work with logs in the Logs service: retrieve logs or add logs from your own storage.
Set up Fluent Bit before you start working.
Set up Fluent Bit
- Add a service user.
- Issue an S3 key to the user.
- Configure the agent to receive or add events.
1. Add a service user
Add a service user with the Projects access scope and one of the roles:
- member, or
- reader.
Users can be added by the Account Owner or by users with the iam_admin role.
2. Issue an S3 key to the user
Users with access to the control panel can issue S3 keys to themselves, but we recommend creating service users and issuing S3 keys to them.
Only the Account Owner or a user with the iam_admin role can issue S3 keys to other users. A service user cannot obtain an S3 key on their own, because they do not have access to the control panel; the key must be issued to them by the Account Owner or a user with the iam_admin role.
A separate key must be created for each project. Multiple keys can be issued for one project.
- In the control panel, in the top menu, click Account.
- Go to the section for the required user type:
  - Users - for users with access to the control panel;
  - Service users - for service users.
- Open the user page → Access tab.
- In the S3 keys block, click Add Key.
- Enter a name for the key.
- Select the project for which the key will work.
- Click Generate. Two values will be generated:
  - Access key - Access Key ID, the key identifier;
  - Secret key - Secret Access Key, the secret part of the key.
- Click Copy and save the key: it cannot be viewed again after the window is closed.
3. Configure the agent
Obtaining logs
Adding logs
With the built-in fluent-bit-cloudwatch-input-plugin, you can offload events from the Logs service to your own infrastructure, such as Elasticsearch, Loki, or Kafka.
- Download the config/fluent-bit.yaml configuration file.
- Open the config/fluent-bit.yaml configuration file in a text editor.
- In the configuration file, specify the parameters:
  - region - pool, for example ru-9;
  - endpoint - URL for accessing the logging API. The list of URLs can be found in the List of URLs instructions;
  - log_group_name - log group name, for example s/lbaas/Chromie-lb;
  - log_stream_name - name of the event stream, for example http-21380357;
  - sqlite_path - path to the SQLite database that stores information about the last received events. One database can be used for several groups and streams.
The list of events returned by the GetLogEvents method of the Amazon CloudWatch API is split into separate records using a Lua filter. Each received event is converted into a separate, independent Fluent Bit record, so events can be processed, tagged, and sent to different output plugins individually. The configuration file uses the stdout output plugin, but you can use any other plugin, such as Elasticsearch or Kafka.
- Optional: to collect data from multiple groups and streams within a single configuration file, add the appropriate inputs blocks in the pipeline block.
- Start Fluent Bit:
docker run \
--name fluent-bit-cloudwatch \
--rm \
-v ${PWD}/config/fluent-bit.yaml:/fluent-bit/etc/fluent-bit.yaml:ro \
-v ${PWD}/sqlite:/var/lib/fluent-bit/cloudwatch/sqlite:rw \
-e AWS_ACCESS_KEY_ID=<access_key> \
-e AWS_SECRET_ACCESS_KEY=<secret_key> \
ghcr.io/selectel/fluent-bit-cloudwatch-input-plugin:latest

Specify:
- <access_key> - the Access key field value from the S3 key you issued to the user;
- <secret_key> - the Secret key field value from the S3 key you issued to the user.
You can pass <access_key> and <secret_key> to the container in any convenient way:
- through environment variables. For more information, see Using environment variables to globally configure AWS SDKs and tools in the Amazon documentation;
- through the config and credentials files. For more information, see Using shared config and credentials files to globally configure AWS SDKs and tools in the Amazon documentation.
Example of output to stdout in MessagePack format:
...
[1029] cloudwatch-input.0: [[1751577177.000000000, {}], {"ingestion_time"=>1750606362344, "message"=>"{\"level\":\"System\", \"msg\":\"InnoDB initialization has started.\"}", "timestamp"=>1750606335369}]
[1031] cloudwatch-input.0: [[1751577177.000000000, {}], {"ingestion_time"=>1750608768923, "message"=>"{\"level\":\"System\", \"msg\":\"InnoDB initialization has ended.\"}", "timestamp"=>1750608747624}]
...
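Putting the steps above together, here is a minimal sketch of a config/fluent-bit.yaml that reads two streams of one group into a shared SQLite state database. This is an illustration, not the downloaded file: the input plugin name cloudwatch-input is inferred from the stdout example above, the second stream name is invented, and <log_endpoint> must be replaced with a URL from the List of URLs instructions.

```yaml
# Hypothetical sketch only: the input name (cloudwatch-input) is inferred
# from the stdout example above; stream/group names are illustrative.
pipeline:
  inputs:
    - name: cloudwatch-input
      region: ru-9
      endpoint: <log_endpoint>
      log_group_name: s/lbaas/Chromie-lb
      log_stream_name: http-21380357
      sqlite_path: /var/lib/fluent-bit/cloudwatch/sqlite/state.db
    - name: cloudwatch-input
      region: ru-9
      endpoint: <log_endpoint>
      log_group_name: s/lbaas/Chromie-lb
      log_stream_name: http-21380358      # second stream, same state database
      sqlite_path: /var/lib/fluent-bit/cloudwatch/sqlite/state.db
  outputs:
    - name: stdout
      match: '*'
```

Both inputs share one sqlite_path, which the section above says is supported; each stream still gets its own tag (cloudwatch-input.0, cloudwatch-input.1), so the streams can be matched to different outputs if needed.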
- Install the Fluent Bit agent. Use the Download and install Fluent Bit instructions in the official Fluent Bit documentation.
- Open the config/fluent-bit.yaml configuration file in a text editor.
- Add a configuration for adding logs to the fluent-bit.yaml file. Example configuration:

pipeline:
  inputs:
    - name: tail
      path: /var/log/syslog
      tag: system.logs
  outputs:
    - name: stdout
      match: '*'
    - name: cloudwatch_logs
      match: system.logs
      region: <pool>
      endpoint: <log_endpoint>
      log_group_name: <log_group_name>
      log_stream_name: <log_stream_names> # or ${HOSTNAME}

Specify:
- <pool> - pool, for example ru-9;
- <log_endpoint> - URL for accessing the API of the Logs service. The list of URLs can be found in the List of URLs instructions;
- <log_group_name> - log group name, for example s/lbaas/Loadbalancer-1;
- <log_stream_names> - list of streams to which you want to add logs, for example [http-c48d78e2-6f49-43b5-80b7-2f2b8e5f669d].
- Add the Access key and Secret key field values from the S3 key using one of the methods:
  - through environment variables. For more information, see Using environment variables to globally configure AWS SDKs and tools in the Amazon documentation;
  - through the config and credentials files. For more information, see Using shared config and credentials files to globally configure AWS SDKs and tools in the Amazon documentation.
- Start Fluent Bit:
sudo /opt/fluent-bit/bin/fluent-bit -c /etc/fluent-bit/fluent-bit.conf -vv
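For the environment-variable method, a minimal sketch of exporting the key before the start command above; AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are the standard AWS SDK variable names, and the values shown are placeholders for the Access key and Secret key you issued.

```shell
# Standard AWS SDK environment variables; replace the placeholder
# values with the Access key and Secret key from your S3 key.
export AWS_ACCESS_KEY_ID=EXAMPLE_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=EXAMPLE_SECRET_KEY
```

Run the export commands in the same shell session (or add them to the service environment) so that the Fluent Bit process inherits them.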