SIEM Logstash parsing for more than a hundred technologies

Overview

LogIndexer Pipeline

Logstash Parsing Configurations for Elasticsearch SIEM and OpenDistro for Elasticsearch SIEM

Why this project exists

The overhead of implementing Logstash parsing and applying the Elastic Common Schema (ECS) across audit, security, and system logs can be a large drawback when using Elasticsearch as a SIEM (Security Information and Event Management) platform. The Cargill SIEM team has spent significant time developing quality Logstash parsing processors for many well-known log vendors and wants to share this work with the community. In addition to Logstash processors, we have also included log collection programs for API-based log collection, as well as the setup scripts used to generate our pipeline-to-pipeline architecture.

Quick start Instructions

"Quick start" mostly depends on how your Logstash configuration is set up. If you have your own setup already established, it might be best to use the processors that apply to your organization's log collection (found in the "config" directory). If you are seeking to use the architecture in this repo, consult the README found in the build_scripts directory. We will be adding an elaborate setup guide soon.

Contributions

We welcome and encourage individual contributions to this repo. Please see the Contribution.md guide in the root of the repo. Please note that we reserve the right to close pull requests or issues that appear to be out of scope for our project, or for other reasons not specified.

Questions, Comments & Expected Level of Attention

Please create an issue and someone will try to respond within 5 business days. However, note that while we will try to revisit the repository semi-regularly, we are not beholden to this response time (life happens). We welcome answers and input from other individuals as well.

Licensing

Apache-2.0

Comments
  • improved cisco ACI processor

    improved cisco ACI processor

    Improved the Cisco ACI processor with the following changes:

    1. simplified grok parsing
    2. removed complex logic used to detect event and error messages
    3. fixed broken parsing of the hostname of the device sending the logs
    4. tmp.rule does NOT represent a username; it is instead event.reason as described by Cisco: "The action or condition that caused the event, such as a component failure or a threshold crossing."

    Sample messages used for testing:

    <186>Dec 08 21:20:20.614 ABC-DCA-NPRD-ACILEF-104 %LOG_LOCAL7-2-SYSTEM_MSG [F0532][raised][interface-physical-down][critical][sys/phys-[eth1/47]/phys/fault-F0532] Port is down, reason being suspended(no LACP PDUs)(connected), used by EPG on node 104 of fabric ACI Fabric1 with hostname ABC-DCA-NPRD-ACILEF-104
    
    <190>Nov 24 18:20:53.237 ABC-DCB-ACIAPC-003 %LOG_LOCAL7-6-SYSTEM_MSG [E4206143][transition][info][fwrepo/fw-aci-apic-dk9.5.2.6e] Firmware aci-apic-dk9.5.2.6e created
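
    As a rough illustration only (not this PR's actual pattern, and the target field names are hypothetical), a grok expression for the fault-style first sample could look like the sketch below; the event-style second sample carries one fewer bracketed field and would need a separate pattern:

     grok {
       match => {
         "message" => "^<%{POSINT:[tmp][pri]}>%{CISCOTIMESTAMP:[tmp][devicetimestamp]} %{HOSTNAME:[host][hostname]} %%{DATA:[tmp][facility]}-%{INT:[tmp][severity]}-%{DATA:[tmp][mnemonic]} \[%{DATA:[event][code]}\]\[%{DATA:[tmp][lifecycle]}\]\[%{DATA:[event][reason]}\]\[%{DATA:[tmp][fault_severity]}\]\[%{GREEDYDATA:[tmp][dn]}\] %{GREEDYDATA:[tmp][description]}$"
       }
     }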
    
    opened by anubisg1 3
  • [Help / Documentation] - how to classify incoming syslog messages

    [Help / Documentation] - how to classify incoming syslog messages

    As per the title, how would we classify incoming syslog messages so that they end up in the proper processing pipeline?

    Let's take a common use case where the network has Cisco IOS routers and switches, Cisco ACI, Cisco WLC and ISE, plus Checkpoint firewalls, F5 load balancers, etc.

    Generally those devices would all send logs to the syslog server on port 514, but how would we classify where each message is coming from in order to send it to the specific processor?

    Are we supposed to set up a different input queue for each processor (for example, a different port on the syslog server, so that ACI goes to 192.168.10.10 port 5514 while Checkpoint goes to port 5515)?

    Or is there an IP filter that says: if the source IP is X, send to the ACI processor; if it is Y, send to the Checkpoint processor?

    Or what other options are there?
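
    One possible approach, purely as a hedged sketch (the pipeline addresses and sender IPs are placeholders, and the exact source-IP field depends on the syslog input's ecs_compatibility setting), is to route on the sender's IP in the collection pipeline and forward to per-technology processor pipelines:

     input {
       syslog { port => 514 }
     }
     output {
       # [host][ip] assumes ECS-style syslog input fields; adjust to your schema
       if [host][ip] == "10.0.0.1" {
         pipeline { send_to => ["cisco_aci"] }
       } else if [host][ip] == "10.0.0.2" {
         pipeline { send_to => ["checkpoint"] }
       } else {
         pipeline { send_to => ["catch_all"] }
       }
     }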

    question 
    opened by anubisg1 2
  • host split enrichment error

    host split enrichment error

    For certain hostnames, the host split enrichment causes the pipeline to be blocked until grok times out.

    [2022-06-10T15:54:58,563][WARN ][org.logstash.plugins.pipeline.PipelineBus][processor] Attempted to send event to 'enrichments' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.
    [2022-06-10T15:57:22,451][WARN ][logstash.filters.grok ][enrichments] Timeout executing grok '^(?<[host][tmp]>.*?)\.(?<[host][domain]>.*?)$' against field '[host][hostname]' with value 'abc-name123-xyz.domain.com'!
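
    As a hedged suggestion (an assumption, not the repo's current implementation), a dissect filter avoids regex backtracking entirely, so it cannot hit a grok timeout while splitting the hostname at its first dot:

     dissect {
       # "abc-name123-xyz.domain.com" -> [host][tmp]="abc-name123-xyz", [host][domain]="domain.com"
       # hostnames without a dot will get a _dissectfailure tag and can be handled separately
       mapping => { "[host][hostname]" => "%{[host][tmp]}.%{[host][domain]}" }
     }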

    opened by nnovaes 2
  • Fix deprecation warnings

    Fix deprecation warnings

    User Story - details

    For translate we should use source and target instead of field and destination. On boot, Logstash 15 shows these warnings:

    [2021-11-09T16:53:33,518][WARN ][logstash.filters.translate] You are using a deprecated config setting "destination" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `target` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    [2021-11-09T16:53:33,519][WARN ][logstash.filters.translate] You are using a deprecated config setting "field" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `source` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    

    At least up to Logstash 13 the new options are not supported, so let's make this change when we upgrade.
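
    For reference, a minimal sketch of the non-deprecated form (the field names here are purely illustrative):

     translate {
       source     => "[event][code]"       # formerly: field       => "[event][code]"
       target     => "[event][action]"     # formerly: destination => "[event][action]"
       dictionary => { "4624" => "logon" "4634" => "logoff" }
     }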

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    << Any related code here... >>
    
    opened by KrishnanandSingh 2
  • native vlan mismatch and other improvements

    native vlan mismatch and other improvements

    Description

    • Added parsing for Native VLAN mismatch error messages, e.g. 2021-10-14T13:28:06.497Z {name=abc.com} <188>132685: Oct 14 21:28:07.975 GMT: %CDP-4-NATIVE_VLAN_MISMATCH: Native VLAN mismatch discovered on FastEthernet0/1 (1), with xyz GigabitEthernet1/0/1 (36).
    • Lowercased the [actual_msg] field
    • Fixed a typo in the timestamp
    • Added the timezone to [tmp][devicetimestamp]
    • Removed the old parser code for Native VLAN mismatch
    • Removed a catch-all condition in the old parser
    • Lowercased [rule.category]
    opened by nnovaes 2
  • Feature Request: Add known applications + risk score field based off destination.port fields

    Feature Request: Add known applications + risk score field based off destination.port fields

    User Story - details

    As a SIEM engineer, I want to know the applications associated with the destination.port field. This will allow me to quickly identify potential applications communicating on the session and also the risk of the traffic I'm observing.

    Tasks

    • Create a port lookup translation.
    • Add risk category score to application (scale of 1-10 or severity name).

    Examples:

    3389 -> Remote Desktop Protocol (high risk)
    22 -> Secure Shell (high risk)
    3306 -> MySQL (medium risk)
    6881-6889 -> BitTorrent (high risk)
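
    A minimal sketch of how the lookup could be expressed with the translate filter (field names and risk values are illustrative, not a final design):

     translate {
       # destination.port may need to be converted to a string before the lookup
       source     => "[destination][port]"
       target     => "[destination][known_application]"
       dictionary => { "3389" => "Remote Desktop Protocol" "22" => "Secure Shell" "3306" => "MySQL" }
       fallback   => "unknown"
     }
     translate {
       source     => "[destination][port]"
       target     => "[destination][port_risk]"
       dictionary => { "3389" => "high" "22" => "high" "3306" => "medium" }
       fallback   => "unknown"
     }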
    
    opened by ryanpodonnell1 2
  • Cisco IOS (cisco.router and cisco.switch) new rules

    Cisco IOS (cisco.router and cisco.switch) new rules

    Description

    New parsing rules for cisco.router and cisco.switch. The old version of this processor needs some rework; however, there are functioning bits of it that I have preserved, since they mostly work. The new rules provide a good foundation for future "full" parsing and also cover BGP and interface up/down messages. The lookup database for the translate filters is static.

    opened by nnovaes 2
  • Update syslog_log_security_sdwan.app.conf

    Update syslog_log_security_sdwan.app.conf

    Description

    These updates correct the assignment of Versa fields to the ECS model. They also add back Versa-specific fields that do not map to ECS into a separate [labels][all] field that works like tags. I couldn't find a clean way to implement it without using the add_tag command, so I have saved the event tags to another field and then restored them afterwards.
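
    A rough sketch of what that save/restore trick can look like (the field names are hypothetical, not the exact code in this PR):

     mutate { rename => { "[tags]" => "[tmp][saved_tags]" } }    # stash the real event tags
     mutate { add_tag => [ "%{[versa][extra_field]}" ] }         # add_tag now fills a fresh [tags]
     mutate { rename => { "[tags]" => "[labels][all]" } }        # keep the collected values, tag-style
     mutate { rename => { "[tmp][saved_tags]" => "[tags]" } }    # restore the original tags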

    @Akhila-Y please review as well.

    opened by nnovaes 1
  • added space, testing new IDE

    added space, testing new IDE

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added new ECS fields to .csv file

    added new ECS fields to .csv file

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added missing fields for coverage reporting to aws cloudtrail

    added missing fields for coverage reporting to aws cloudtrail

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • [[enrichments]>worker22] ruby - Ruby exception occurred: no implicit conversion of nil into String

    [[enrichments]>worker22] ruby - Ruby exception occurred: no implicit conversion of nil into String

    Describe the bug

    [ERROR] 2022-11-26 07:54:05.540 [[enrichments]>worker14] ruby - Ruby exception occurred: no implicit conversion of nil into String {:class=>"TypeError", :backtrace=>["(ruby filter code):68:in `block in filter_method'", "org/jruby/RubyArray.java:1865:in `each'", "(ruby filter code):67:in `block in filter_method'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:96:in `inline_script'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:89:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178:in `block in multi_filter'", "org/jruby/RubyArray.java:1865:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:301:in `block in start_workers'"]}
    

    X-Reference issues

    (Cross reference any user stories that this bug might be affecting)

    Steps To Reproduce

    Start the enrichment pipeline. I'm using Logstash 8.5.2.

    Expected behavior

    No error should be seen.

    Additional context

    The following components in the enrichments pipeline make use of the ruby filter, but I don't understand which one is the culprit:

    ./02_ecs_data_type.conf
    ./04_timestamp.conf
    ./11_related_hosts.conf
    ./12_related_user.conf
    ./13_related_ip.conf
    ./14_related_hash.conf
    ./16_related_mac.conf
    ./93_mitre.conf
    ./94_remove_empty_n_truncate.conf
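
    As a generic, hypothetical guard (not code taken from any of these files), a ruby filter that builds related.* arrays can drop nil entries before any String conversion, which is the kind of check that avoids this TypeError:

     ruby {
       code => '
         values = event.get("[related][ip]") || []
         values = [values] unless values.is_a?(Array)
         # remove nils before calling String methods on each entry
         event.set("[related][ip]", values.compact.map(&:to_s).uniq)
       '
     }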
    
    bug wontfix 
    opened by anubisg1 2
  • cisco processor fails because of missing hostname and lowercase date

    cisco processor fails because of missing hostname and lowercase date

    I'm working with syslog_audit_cisco.switch.conf and I found the following issues:

    1. the syslog message is assumed here https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L52 to be
      # {timestamp} {facility} {severity} {mnemonic} {description}
      # seq no:timestamp: %facility-severity-MNEMONIC:description
    

    In reality, most people would configure "logging origin-id hostname", which changes the log format to

      # {hostname} {timestamp} {facility} {severity} {mnemonic} {description}
      # seq no: hostname: timestamp: %facility-severity-MNEMONIC:description
    
    2. the parser at line https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L33 is modifying the hostname field before that field is parsed (maybe this is assumed to come from Kafka, instead of being taken from the logs?)

    3. in line https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L48 the message is converted to lower case, but that causes date parse failures later on because of the case mismatch.

    Nov 17 11:44:46.490 UTC matches, but when I have nov 17 11:44:46.490 utc it fails on the date parsing here: https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L77

    Sample log entry for reference:

    <14>4643: Switch-core01: Nov 17 11:44:46.490 UTC: %LINK-3-UPDOWN: Interface GigabitEthernet1/0/27, changed state to up
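
    For illustration only (this is not the repo's pattern, and the target field names are hypothetical), a grok expression that accepts the optional origin-id hostname in the sample above could look like:

     grok {
       match => {
         "message" => "^<%{POSINT:[tmp][pri]}>%{POSINT:[tmp][seq]}: (?:%{HOSTNAME:[host][hostname]}: )?%{CISCOTIMESTAMP:[tmp][devicetimestamp]}(?: %{WORD:[tmp][tz]})?: %%{DATA:[tmp][facility]}-%{INT:[tmp][severity]}-%{DATA:[tmp][mnemonic]}: %{GREEDYDATA:[tmp][description]}$"
       }
     }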

    opened by anubisg1 0
  • GeoLitePrivate2-City.mmdb doesn't exist

    GeoLitePrivate2-City.mmdb doesn't exist

    To use the geoip enrichment, you need two files, specifically:

              database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
              database => "/mnt/s3fs_geoip/GeoLitePrivate2-City.mmdb"
    

    Unfortunately, it seems that GeoLitePrivate2-City.mmdb doesn't exist anywhere on the internet, and MaxMind only provides:

    • GeoLite2-ASN.mmdb
    • GeoLite2-City.mmdb
    • GeoLite2-Country.mmdb

    I'd expect that either more information on where to find GeoLitePrivate2-City.mmdb is added to the documentation, or that the enrichment pipeline is updated to function without that file.
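
    As a hedged interim workaround (an assumption on my side, not an official fix), the private-database lookup could be skipped or made conditional so the enrichment runs with only the public MaxMind file:

     geoip {
       source   => "[source][ip]"
       target   => "[source][geo]"
       database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
       # the GeoLitePrivate2-City.mmdb lookup would only be enabled where an
       # internal/private database has actually been generated and mounted
     }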

    documentation question 
    opened by anubisg1 3
  • Validate ECS fields

    Validate ECS fields

    User Story - details

    There should be an enrichment checking that only permitted values are stored in ECS fields that have a predefined set of values, so those fields can be compliant with ECS. See https://www.elastic.co/guide/en/ecs/1.9/ecs-event.html for more info. I believe the event.* fields are the only ones that have their values predefined. If that's true, the sample code below should take care of this validation.

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    The sample configuration below takes the event.type value that came from the processors and populates ecs_status with either valid or event.type-invalid_field_value. Therefore, if ecs_status is not valid, a tag containing event.type-invalid_field_value is added. For example, if event.type is "process", because "process" is not among the allowed values for event.type, the tag event.type-invalid_field_value: process will be added.

     translate {
                field => "event.type"
                dictionary => [
                "access", "valid", 
                "admin", "valid", 
                "allowed", "valid", 
                "change", "valid", 
                "connection", "valid", 
                "creation", "valid", 
                "deletion", "valid", 
                "denied", "valid", 
                "end", "valid", 
                "error", "valid", 
                "group", "valid", 
                "info", "valid", 
                "installation", "valid", 
                "protocol", "valid", 
                "start", "valid", 
                "user", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.type-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.type}" ]
                remove_field => [ "ecs_status", "event.type"]
            }
        }
    
        #EVENT.CATEGORY
        translate {
                field => "event.category"
                dictionary => [
                "authentication", "valid", 
                "configuration", "valid", 
                "driver", "valid", 
                "database", "valid", 
                "file", "valid", 
                "host", "valid", 
                "iam", "valid", 
                "intrusion_detection", "valid", 
                "malware", "valid", 
                "network", "valid", 
                "package", "valid", 
                "process", "valid", 
                "web", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.category-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.category}" ]
                remove_field => [ "ecs_status", "event.category"]
    
            }
        }
    
        # event.kind
         translate {
                field => "event.kind"
                dictionary => [
                "alert", "valid", 
                "event", "valid", 
                "metric", "valid", 
                "state", "valid", 
                "pipeline_error", "valid", 
                "signal", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.kind-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.kind}" ]
                remove_field => [ "ecs_status", "event.kind"]
    
            }
        }
    
    
        # event.outcome
         translate {
                field => "event.outcome"
                dictionary => [
                "failure", "valid", 
                "success", "valid", 
                "unknown", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.outcome-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.outcome}" ]
                remove_field => [ "ecs_status", "event.outcome"]
    
            }
        }
    
    opened by nnovaes 0
Releases(v0.1-beta)
  • v0.1-beta(May 19, 2021)

    This release lacks elaborate usage documentation, so it is marked as beta. Users can still work with it by going through the Python script. Documentation will be added soon.

    Source code(tar.gz)
    Source code(zip)
Owner
Working to nourish the world. Committed to helping the world thrive