Install and set up Graylog in Docker¶
I have a router that produces both a syslog and a firewall log, but the router's own log interface is cumbersome and paged at about 100 lines per page.
I would like to view my logs in a different way, so I am trying Graylog.
My install environment is Docker running on a spare computer, with Portainer to manage containers and stacks (Docker Compose).
Inspiration is from this video from Lawrence Systems:
Graylog: Your Comprehensive Guide to Getting Started Open Source Log Management
Installing Graylog¶
- I am basing my install on this Docker Compose file from this GitHub repo
- Generate variables for the environment using this example (see the commands after this list for generating the actual values)
# You MUST set a secret to secure/pepper the stored user passwords here.
# Use at least 64 characters.
# Generate one by using for example: pwgen -N 1 -s 96
# ATTENTION: This value must be the same on all Graylog nodes in the cluster.
# Changing this value after installation will render all user sessions
# and encrypted values in the database invalid. (e.g. encrypted access tokens)
GRAYLOG_PASSWORD_SECRET=""

# You MUST specify a hash password for the root user (which you only need to initially set up the
# system and in case you lose connectivity to your authentication backend)
# This password cannot be changed using the API or via the web interface. If you need to change it,
# modify it in this file.
# Create one by using for example: echo -n yourpassword | shasum -a 256
# and put the resulting hash value into the following line
# CHANGE THIS!
GRAYLOG_ROOT_PASSWORD_SHA2=""

- Create a new stack in Portainer and paste the Docker Compose file contents
- Add the generated password secret and root password hash as Portainer stack variables
- Change any port assignments that conflict with other services
- Deploy the stack
- Access the web interface on localhost:9000, or another port if you changed it in the stack setup (the stack below binds the web interface to 9100)
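The two values can be generated with the commands mentioned in the example above; a minimal sketch, where yourpassword is a placeholder for the admin password you want to use:

# Password pepper for GRAYLOG_PASSWORD_SECRET, at least 64 characters
# (pwgen is available in the pwgen package on most distributions)
pwgen -N 1 -s 96

# SHA-256 hash of the admin password for GRAYLOG_ROOT_PASSWORD_SHA2
echo -n yourpassword | shasum -a 256
# ...or, on systems without shasum:
echo -n yourpassword | sha256sum

Only the hex digest goes into GRAYLOG_ROOT_PASSWORD_SHA2; drop the trailing " -" that shasum/sha256sum print for stdin input.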
The stack I ended up using was this:
# From https://github.com/Graylog2/docker-compose/tree/main/open-core
version: "3.8"
services:
  mongodb:
    image: "mongo:5.0"
    volumes:
      - "mongodb_data:/data/db"
    restart: unless-stopped
  opensearch:
    image: "opensearchproject/opensearch:2.4.0"
    environment:
      - "OPENSEARCH_JAVA_OPTS=-Xms1g -Xmx1g"
      - "bootstrap.memory_lock=true"
      - "discovery.type=single-node"
      - "action.auto_create_index=false"
      - "plugins.security.ssl.http.enabled=false"
      - "plugins.security.disabled=true"
    ulimits:
      memlock:
        hard: -1
        soft: -1
      nofile:
        soft: 65536
        hard: 65536
    volumes:
      - "os_data:/usr/share/opensearch/data"
    restart: unless-stopped
  graylog:
    hostname: "server"
    image: "${GRAYLOG_IMAGE:-graylog/graylog:5.1.5}"
    depends_on:
      opensearch:
        condition: "service_started"
      mongodb:
        condition: "service_started"
    entrypoint: "/usr/bin/tini -- wait-for-it opensearch:9200 --  /docker-entrypoint.sh"
    environment:
      GRAYLOG_NODE_ID_FILE: "/usr/share/graylog/data/config/node-id"
      GRAYLOG_PASSWORD_SECRET: "${GRAYLOG_PASSWORD_SECRET:?Please configure GRAYLOG_PASSWORD_SECRET in the .env file}"
      GRAYLOG_ROOT_PASSWORD_SHA2: "${GRAYLOG_ROOT_PASSWORD_SHA2:?Please configure GRAYLOG_ROOT_PASSWORD_SHA2 in the .env file}"
      GRAYLOG_HTTP_BIND_ADDRESS: "0.0.0.0:9100"
      GRAYLOG_HTTP_EXTERNAL_URI: "http://localhost:9100/"
      GRAYLOG_ELASTICSEARCH_HOSTS: "http://opensearch:9200"
      GRAYLOG_MONGODB_URI: "mongodb://mongodb:27017/graylog"
    ports:
    - "2055:2055/udp"   # Netflow udp
    - "5044:5044/tcp"   # Beats
    - "514:5140/udp"   # Syslog
    - "5140:5140/tcp"   # Syslog
    - "5555:5555/tcp"   # RAW TCP
    - "5555:5555/udp"   # RAW TCP
    - "9100:9100/tcp"   # Server API
    - "12201:12201/tcp" # GELF TCP
    - "12201:12201/udp" # GELF UDP
    - "13301:13301/tcp" # Forwarder data
    - "13302:13302/tcp" # Forwarder config
    volumes:
      - "graylog_data:/usr/share/graylog/data/data"
      - "graylog_journal:/usr/share/graylog/data/journal"
    restart: unless-stopped
volumes:
  mongodb_data:
  os_data:
  graylog_data:
  graylog_journal:
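If you prefer plain Docker Compose over a Portainer stack, the same file works unchanged. A minimal sketch, assuming the stack above is saved as docker-compose.yml with the two secrets in a .env file next to it:

# .env next to docker-compose.yml should contain:
#   GRAYLOG_PASSWORD_SECRET="<96-character secret>"
#   GRAYLOG_ROOT_PASSWORD_SHA2="<sha256 hash of your admin password>"
docker compose up -d

# Follow the Graylog container while it starts up
docker compose logs -f graylog

# Once it is up, the unauthenticated API root should report the Graylog version
curl http://localhost:9100/api/

Then log in on http://localhost:9100/ as admin with the password you hashed into GRAYLOG_ROOT_PASSWORD_SHA2.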
New user¶
The root user's timezone is UTC, which is also the timezone that log timestamps are stored in.
I prefer to view the logs in my local timezone, and to do that I create a new user.
TIP: to see the configured timezones, click your user and select System->Overview, then scroll down to the time configuration, which lists for example:
User admin: 2023-09-30 13:42:53 +02:00
Your web browser: 2023-09-30 13:42:53 +02:00
Graylog server: 2023-09-30 11:42:53 +00:00  
- Go to System->Users and Teams
- Create a user with admin privileges and change the timezone to your timezone
- Log out and back in with your new user credentials, and confirm the timezone in System->Overview
Configure inputs¶
In order to get data into Graylog you need to create inputs. My router is configured to send syslog data on UDP port 514 and NetFlow data on UDP port 2055.
- Select System->Inputs
- Click the input selector and choose your source type; this example uses Syslog UDP
- Give it a sensible title of your choice
- Select the port; this is the internal port of the Docker container. I have mapped UDP port 514 outside the container to UDP port 5140 inside the container (see the stack config above for details)
- Save
- Check that data is received by clicking the input's Show received messages button; messages should appear
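If the router has not sent anything yet, you can fire a test message at the input yourself. A quick sketch using the util-linux logger tool and netcat; graylog-host is a placeholder for the address of your Docker host:

# One UDP syslog message to host port 514 (forwarded to 5140 inside the container)
logger --server graylog-host --port 514 --udp "Graylog input test from $(hostname)"

# Alternative with netcat, hand-crafting a syslog-style line (<14> = facility user, severity info)
echo "<14>$(hostname) graylog-test: hello from netcat" | nc -u -w1 graylog-host 514

Either message should show up under Show received messages within a few seconds.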
Create indices¶
In order to route your data to a stream other than the Default Stream, you need to configure indices.
- Select System->Indices
- Click Create Index Set
- Fill out Title, Description and Index Prefix (no spaces in these names)
- Configure Index Rotation Configuration and Index Retention Configuration to your preferences; remember that logs can grow big!
- Confirm with Create Index Set at the bottom of the page
Create stream¶
Putting data into specific streams makes it easier to navigate the data later.
- Click Streams and Create Stream
- Give the stream a name and optionally a description
- Select the index set to use (your index set from before should appear in the dropdown menu)
- Check Remove matches from 'Default Stream' to remove matching data from the Default Stream (no need to store it twice)
- Confirm with Create Stream
The status of your new stream is paused, and before you can use it you need to filter what data goes into the stream; to do this you need to grab the input's Field:Value pair.
- Go to System->Inputs and click Show received messages; in the next window copy the string next to the magnifying glass, example: gl2_source_input:6517dd3c0b3aa72ee5489355
- Go back to Streams and click More->Manage Rules on your corresponding stream
- Click Add Stream Rule to create a new rule
- In the Field input, paste the field part of the Field:Value pair, example gl2_source_input
- In the Value input, paste the value part of the Field:Value pair, example 6517dd3c0b3aa72ee5489355
- Test the rule by selecting the input and Load Message, and confirm that the filter matches the message without errors
- Click the stream name and confirm that data is collected in the new stream
- Look at the Default Stream to confirm that new messages from this input no longer end up there
Now, happy monitoring :-) Play around and see how you can query and filter your data.   
I find it quite nice, ~a million times better than the DD-WRT interface log viewer!