Logs

Norsk generates a number of log files during operation. You can access these by mounting a local directory over /var/log/norsk in the container, and then viewing them with your favourite text editor:

> docker run --volume=/tmp/logs:/var/log/norsk --volume=/home/user/license.json:/mnt/license.json:ro norskvideo/norsk:latest --license-file /mnt/license.json

If you did that, then in /tmp/logs you would see a structure something like:

❯ ls -1R /tmp/logs
/tmp/logs:
media
rust_nif.log.2023-05-16
stderr

/tmp/logs/media:
debug.json
debug.json.2023-05-16T16:27:02Z
debug.log
debug.log.2023-05-16T16:27:02Z
human.log
human.log.2023-05-16T16:27:02Z

/tmp/logs/stderr:
current
lock
state

The main files you are likely to be interested in are debug.log and debug.json - they contain the same information, with debug.log aimed at humans and debug.json at automated processing. Norsk contains a significant amount of Rust code for performance reasons, and that code logs to rust_nif.log.* - you would typically be unlikely to need to look in here, but it may be something we ask for in a support ticket. Finally, there’s a stderr folder which contains any output that went to stderr - this is typically from third-party library code outside of Norsk’s control, with Chromium Embedded Framework being the main source.
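For a quick look at what Norsk is doing, you can follow the human-readable log as it is written, or run the structured log through a tool such as jq. A minimal sketch, assuming the /tmp/logs mount from the example above:

> tail -f /tmp/logs/media/debug.log

> jq . /tmp/logs/media/debug.json

The first command follows the human-readable log as Norsk appends to it; the second pretty-prints the machine-readable entries so you can see which fields are available for automated processing.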

FluentBit Log Shipping

To simplify log processing within a production system, Norsk ships with FluentBit log shipping built in (https://fluentbit.io). All you need to do to make use of it is provide a FluentBit configuration file saying where you want the output sent. For example, if you had a file log-output.conf in /home/user/norskConfig:

[OUTPUT]
    name                loki
    match               erlang
    host                [your-loki-host]
    port                3100
    labels              source=norsk_erlang,$level,$pid,$domain,$mfa,$data
    log_level           error

[OUTPUT]
    name                loki
    match               rust
    host                [your-loki-host]
    port                3100
    labels              source=rust

[OUTPUT]
    name                loki
    match               stderr
    host                [your-loki-host]
    port                3100
    labels              source=cef

and provide that to Norsk in your run command, as in:

> docker run --mount type=bind,source=/home/user/norskConfig,target=/mnt/config,readonly norskvideo/norsk:latest --license-file /mnt/config/license.json --log-config-file /mnt/config/log-output.conf

Then FluentBit will capture all three sources of logs and send them to your Loki instance (Loki is a log aggregation system within the Grafana suite - see https://grafana.com/oss/loki/). FluentBit supports many outputs, so there’s almost certainly something available out-of-the-box for your chosen log aggregation service.
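If you prefer docker compose for longer-lived deployments, the same mounts and arguments translate directly. A minimal sketch, assuming the same host paths as in the examples above:

services:
  norsk:
    image: norskvideo/norsk:latest
    command: --license-file /mnt/config/license.json --log-config-file /mnt/config/log-output.conf
    volumes:
      # config directory containing license.json and log-output.conf
      - /home/user/norskConfig:/mnt/config:ro
      # expose Norsk's log files on the host, as in the earlier example
      - /tmp/logs:/var/log/norsk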

FluentBit Configuration Example: CloudWatch

If you’re using AWS, you’ll likely want to send your logs to CloudWatch. Your configuration file would then look something like this:

[OUTPUT]
    Name cloudwatch_logs
    Match *
    region eu-north-1
    log_group_name norsk
    log_stream_name norsk
    log_stream_template $source
    log_retention_days 3
    auto_create_group true
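The cloudwatch_logs output resolves AWS credentials via the usual AWS credential chain (instance profile, task role, environment variables and so on). If you’re running somewhere without an attached role, one option is to pass credentials into the container as environment variables - a sketch with placeholder values, reusing the config mount from the earlier example:

> docker run --mount type=bind,source=/home/user/norskConfig,target=/mnt/config,readonly -e AWS_REGION=eu-north-1 -e AWS_ACCESS_KEY_ID=[your-access-key-id] -e AWS_SECRET_ACCESS_KEY=[your-secret-access-key] norskvideo/norsk:latest --license-file /mnt/config/license.json --log-config-file /mnt/config/log-output.conf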

A few things to note:

  • you would need to grant the container the correct IAM permissions to write to CloudWatch, namely logs:CreateLogGroup, logs:CreateLogStream, logs:PutLogEvents, logs:PutRetentionPolicy and logs:DescribeLogStreams (there is an example policy after this list).

  • you can customize the stream name based on the content of the log event - in this case we’re using the source field, but you could use any other field, a combination of fields, or simply remove the template and have a fixed stream name.
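By way of illustration, a minimal IAM policy granting those actions might look something like the following (a sketch only - tighten the Resource, region and [your-account-id] placeholder to match your own account and log group naming):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:PutRetentionPolicy",
                "logs:DescribeLogStreams"
            ],
            "Resource": "arn:aws:logs:eu-north-1:[your-account-id]:log-group:norsk*"
        }
    ]
}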

For more info see the FluentBit CloudWatch documentation, and the FluentBit Record Accessor documentation for how to configure the log_stream_template field.