Quick Start Guide

Technical Overview

The Syntmon application consists of three parts:

  • The Syntmon application
  • An InfluxDB instance (version 1.11.8) for storing all file information
  • A MySQL database (version 5.7) for application configuration, logging, and authentication

The core application is provided by BrythonicBytes as a Docker container; the database components are provided by third parties and can be set up according to your use case.

The default root password is “Brythonic@1”

Quick Install Walkthrough

This covers how to quickly set up Syntmon for evaluation using Docker on Linux. Here we walk through the install process, host addition, manual report uploading, alert creation, API key creation, and automated report uploading.

This setup is not fit for production and is for evaluation only.
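
For this evaluation, the backing databases can be run as stock containers. A minimal sketch, assuming the official influxdb and mysql images (the tags follow the versions listed above, and the container names, published ports, and credentials are placeholders):

# Example only: container names, ports, and credentials are placeholders
docker run -d --name syntmon-influxdb -p 8086:8086 influxdb:1.11.8
docker run -d --name syntmon-mysql -p 3306:3306 -e MYSQL_ROOT_PASSWORD='P@5sw0rd' -e MYSQL_DATABASE=syntmon mysql:5.7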

Syntmon Service

The Syntmon service is responsible for accepting data from clients, processing information, and serving content for web frontend users.

docker pull brythonicbytes/syntmon:latest

The essential environment variables are:

  • INFLUXDB_HOST: the InfluxDB endpoint, e.g. “http://10.1.2.3:8086”
  • DOMAIN_NAME: the domain name that the backend API (and web frontend) are accessible on, e.g. “syntmon.example.com”
  • MYSQLDB_HOST: the MySQL endpoint, e.g. “mysql.syntmon.svc.cluster.local” or “mysql.example.com”
  • MYSQLDB_USER: a username for accessing the MySQL database, e.g. “admin” or “user01”
  • MYSQLDB_PASSWORD: the corresponding password for MYSQLDB_USER, e.g. “P@5sw0rd”
  • MYSQLDB_PORT: the corresponding port for MYSQLDB_HOST, e.g. “3306”
  • PORT: the port the HTTP service is served on, e.g. 80
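
As a minimal sketch of starting the service with these variables (the values below are the examples from the list above and should be replaced to suit your environment):

# Values are placeholders taken from the examples above; the published port mirrors PORT
docker run -d --name syntmon \
  -e INFLUXDB_HOST="http://10.1.2.3:8086" \
  -e DOMAIN_NAME="syntmon.example.com" \
  -e MYSQLDB_HOST="mysql.example.com" \
  -e MYSQLDB_USER="user01" \
  -e MYSQLDB_PASSWORD="P@5sw0rd" \
  -e MYSQLDB_PORT="3306" \
  -e PORT="80" \
  -p 80:80 \
  brythonicbytes/syntmon:latest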

Quick Start Configs

Client Configuration

Configuring clients to produce and upload reports to Syntmon requires Aide and cURL to be installed along with the ability to POST data to the backend.
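
For example, on a Debian- or Ubuntu-based client the prerequisites can typically be installed from the distribution repositories (package names are an assumption; adjust for your package manager):

# Install AIDE and cURL on a Debian/Ubuntu client
sudo apt-get update && sudo apt-get install -y aide curl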

A suitable Aide configuration to scan all files on a system is:

/ p+u+g+sha512
database_out=file:/var/db/aide.db.new
database_new=file:/var/db/aide.db
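
With this configuration in place at /etc/aide.conf, an initial database can be generated by hand to confirm everything works; this simply mirrors the first steps of the automated command shown below:

# Build the AIDE database (written to database_out) and move it into place
aide -i -c /etc/aide.conf
mv /var/db/aide.db.new /var/db/aide.db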

After generating an API key from the users page within the web frontend, this can then be automated from crontab with something like:

aide -ic /etc/aide.conf && mv /var/db/aide.db.new /var/db/aide.db && curl -k -XPOST https://<ENDPOINT>/api/v1/upload-report -H "Content-Type: multipart/form-data" -H 'Accept: application/json' -H "Authorization: Bearer <APIKEY>" -F type="aidelogfile" -F data=@/var/db/aide.db -F host=$(hostname -f)
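
One way to schedule this (the schedule, file path, and script name below are assumptions) is to save the one-liner as a small script and reference it from a cron file:

# /etc/cron.d/syntmon -- upload an AIDE report nightly at 02:00 (example schedule)
0 2 * * * root /usr/local/bin/syntmon-aide-report.sh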

Reports can be generated and submitted from Windows in a similar fashion; the array of paths to be searched in $P, the ENDPOINT, and the APIKEY all need changing to suit your environment:

$P="C:\","D:\";$T = New-TemporaryFile;Get-Date -Format "yyyy/MM/dd HH:mm:ss K">$T;Get-ChildItem $P -File -Recurse -PipelineVariable File|ForEach-Object{$stream = try {[IO.FileStream]::new( $File.FullName, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::Read )}catch {[IO.FileStream]::new( $File.FullName, [IO.FileMode]::Open, [IO.FileAccess]::Read, [IO.FileShare]::ReadWrite )}if( $stream ) {try {Get-FileHash -InputStream $stream -Algorithm SHA512 | Select-Object @{ Name = 'Path'; Expression = { $File.Fullname } }, Hash}finally {$stream.Close()}}}>>$T;C:\WINDOWS\system32\curl.exe -XPOST https://<ENDPOINT>/api/v1/upload-report -H"content-Type: multipart/form-data" -H'Accept: application/json' -H "Authorization: Bearer <APIKEY>" -Ftype="windowshashlist" -Fdata=@$T -Fhost=$(hostname) -Ftimezone=$(get-date -Format"K ")

For uploading ps(1) output, run the following command either manually or from a cron job:

ps auxww > /tmp/psout; curl -XPOST https://<ENDPOINT>/api/v1/upload-procs -H "Content-Type: multipart/form-data" -H 'Accept: application/json' -H "Authorization: Bearer <APIKEY>" -F type="ps" -F data=@/tmp/psout -F host=$(hostname -f) -F timezone=$(date +"%z")