# Sigmac

The Sigmac is one of the most important files, as it sets the correct field names that your backend/database will use after translation from the (original) log source's field names.

Please read below to understand how a Sigmac is constructed. Additionally, see [Choosing the Right Sigmac](#choosing-the-right-sigmac) for an idea of which configuration file and command line options (if applicable) will best suit your environment.

## Configuration File

The configuration file contains mappings for the target environments:
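A minimal sketch of what such a configuration can look like is shown below. The values are illustrative, not a real config shipped with the repo; see the full Zeek mapping file at the end of this page for a complete example.

```yaml
# Illustrative Sigma backend configuration (hypothetical values)
title: Example mappings for my environment
order: 20
backends:
  - es-qs                     # backends this config applies to
logsources:
  windows:
    product: windows
    index: 'winlogbeat-*'     # index the translated query should target
fieldmappings:
  # Sigma (rule) field name -> backend/database field name
  CommandLine: winlog.event_data.CommandLine
  Image: winlog.event_data.Image
```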
## Choosing the Right Sigmac

This section shows you which `-c` option (the Sigmac configuration file) and which `--backend-option`(s) to use. The rest of the sigmac command is run as normal: set `-t` (the target backend) and the rule(s) you are converting just as you usually would.

If your target backend/database does not do much field renaming/normalization, then choosing the correct Sigmac is easier to determine; either way, this section will help guide you through the decision.
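In general, every invocation follows the same shape; the placeholders below are illustrative, not literal values:

`tools/sigmac -t <backend> -c <sigmac-config.yml> [--backend-option key=value ...] <rule file or directory>`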
### Elasticsearch or ELK

For this backend there are two very important components. One is the field name; the other is the way the field's value is analyzed, i.e. how it is searchable in the Elasticsearch database. If you are interested in why this matters, you can read more [here](https://socprime.com/blog/elastic-for-security-analysts-part-1-searching-strings/) about the impact of `keyword` types versus `text` types.

There are a few variations of what could be the correct Sigmac to use, depending on the version of Elasticsearch, whether you use ECS, whether certain Beats settings are enabled, and so on.

To aid in choosing the correct Sigmac, ask yourself the following quick questions; your answers will determine which one to use. Please note your answer to each question. It is OK not to know an answer; in fact, that is very common (the example commands after the list below can help).
1. What version of Filebeat are you using (you may not be using it at all)?
2. Are you using Elastic Common Schema (ECS)?
3. Which index do you store the log source's data in? Some examples:
    - Windows logs are most likely in `winlogbeat-*`
    - Linux logs are most likely in `filebeat-*`
    - Zeek/Bro data is most likely in `filebeat-*`
    - If you are using Logstash, data is most likely in `logstash-*`
4. If you are using Filebeat, do you have the relevant module enabled? Here is a link describing the Windows log [Security Channel](https://www.elastic.co/guide/en/beats/winlogbeat/current/winlogbeat-module-security.html) module.
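If you are unsure of some answers, the following commands may help (the host and port are examples; adjust them to your environment):

- `filebeat version` prints the installed Filebeat version (question 1).
- `curl -XGET "http://127.0.0.1:9200/_cat/indices?v"` lists the indices on your Elasticsearch instance (question 3).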
Now choose your data source:

- [Windows Event Logs](#elastic-windows-event-log--sysmon-data-configurations)
- [Zeek](#elastic---zeek-fka-bro--corelight-data)
#### Elastic - Zeek (FKA Bro) / Corelight Data

- Corelight's implementation of ECS:

    `-c tools/config/ecs-zeek-corelight.yml --backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" --backend-option keyword_whitelist="event.dataset,source.ip,destination.ip,source.port,destination.port,*bytes*"`

    Example of the full command, run against all of the proxy rules and converting to a Kibana (Lucene) query:

    `tools/sigmac -t es-qs -c tools/config/ecs-zeek-corelight.yml --backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" --backend-option keyword_whitelist="event.dataset,source.ip,destination.ip,source.port,destination.port,*bytes*" rules/proxy/*`

- Filebeat version 7 or higher and/or Elastic's own implementation of ECS:

    `-c tools/config/ecs-zeek-elastic-beats-implementation.yml --backend-option keyword_base_fields="*"`

- Using Logstash and NOT using ECS:

    `-c tools/config/logstash-zeek-default-json.yml`
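For either of the last two configurations, the full command mirrors the Corelight example above. For instance, a hypothetical run of the Logstash/non-ECS configuration against the proxy rules:

`tools/sigmac -t es-qs -c tools/config/logstash-zeek-default-json.yml rules/proxy/*`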
#### Elastic Windows Event Log / Sysmon Data Configurations

**Index templates**

If you are able to, run the following command, because this is one of the best ways to determine which options to use. Take your answer from question 3 and replace `winlogbeat` in the example command with that index. You can run this from the CLI against your Elasticsearch instance or from Kibana Dev Tools.
You only need to look at the first index template pattern returned. Look under the `dynamic_templates` section for `strings_as_keyword` and note whether its mapping contains `text`.

`curl -XGET "http://127.0.0.1:9200/winlogbeat-*/_mapping/?filter_path=*.mappings.dynamic_templates*,*.index_patterns"`
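For reference, a Beats index template typically contains a `strings_as_keyword` dynamic template similar to the sketch below (the exact values depend on your template; check your own output):

```json
{
  "strings_as_keyword": {
    "match_mapping_type": "string",
    "mapping": {
      "ignore_above": 1024,
      "type": "keyword"
    }
  }
}
```

If the `mapping` type were `text` instead (or a `.text` sub-field were defined), that is the "contains `text`" case referred to in the options below.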
The next question to ask yourself is: do you mind queries that can be easily bypassed because of case-sensitive searches? Take note of your yes/no answer.

Now let's determine which options and Sigmac to use.

**Sigmac's `-c` option**
1. Using winlogbeat version 6 or lower:

    `-c tools/config/winlogbeat-old.yml`

2. Using winlogbeat version 7 or higher without modules enabled (answer from **question 4**), where `strings_as_keyword` does not contain `text`:

    `-c tools/config/winlogbeat.yml`

3. Using winlogbeat version 7 or higher with modules enabled (answer from **question 4**):

    `-c tools/config/winlogbeat-modules-enabled.yml`
**Backend options `--backend-option`**

You can add the following options depending on your answers above.
1. If you are using ECS, your data is going to a `winlogbeat-*` index, or your default field type is keyword, then add the following to your Sigma command:

    `--backend-option keyword_field=""`

    - If you want to prevent case-sensitive bypasses, you can add the following to your command:

        `--backend-option case_insensitive_whitelist="*"`

    - If you want to prevent case-sensitive bypasses but only for certain fields, you can use an option like this:

        `--backend-option keyword_field="" --backend-option case_insensitive_whitelist="*CommandLine*, *ProcessName*, *Image*, process.*, *FileName*, *Path*, *ServiceName*, *ShareName*, file.*, *Directory*, *directory*, *hash*, *Hash*, *Object*, ComputerName, *Subject*, *Target*, *Service*"`
2. If you are using analyzed (`text`) fields, or the `strings_as_keyword` portion of your index template contains `text`, then you can add the following:

    `--backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text"`

3. If only some of your fields are analyzed, then you would use something like this:

    `--backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" --backend-option analyzed_sub_fields="TargetUserName, SourceUserName, TargetHostName, CommandLine, ProcessName, ParentProcessName, ParentImage, Image"`
#### Elastic - Some Final Examples

Putting it all together, here are some "full" examples:

- Base fields as keyword, no analyzed fields, with case insensitivity (covers Elastic 7 with Beats/ECS default mappings), using winlogbeat with modules enabled:

    `tools/sigmac -t es-qs -c tools/config/winlogbeat-modules-enabled.yml --backend-option keyword_field="" --backend-option case_insensitive_whitelist="*" rules/windows/process_creation/win_office_shell.yml`

- Base fields as keyword with an analyzed `.text` sub-field, using winlogbeat with modules enabled:

    `tools/sigmac -t es-qs -c tools/config/winlogbeat-modules-enabled.yml --backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" rules/windows/process_creation/win_office_shell.yml`

- Base fields as keyword with only some analyzed fields, using winlogbeat without modules enabled:

    `tools/sigmac -t es-dsl -c tools/config/winlogbeat.yml --backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" --backend-option analyzed_sub_fields="TargetUserName, SourceUserName, TargetHostName, CommandLine, ProcessName, ParentProcessName, ParentImage, Image" rules/windows/process_creation/win_office_shell.yml`

- Beats/ECS Elastic 7 with case insensitivity and some `.text` fields, using winlogbeat without modules enabled:

    `tools/sigmac -t es-dsl -c tools/config/winlogbeat.yml --backend-option keyword_base_fields="*" --backend-option analyzed_sub_field_name=".text" --backend-option keyword_whitelist="winlog.channel,winlog.event_id" --backend-option case_insensitive_whitelist="*" --backend-option analyzed_sub_fields="TargetUserName, SourceUserName, TargetHostName, CommandLine, ProcessName, ParentProcessName, ParentImage, Image" rules/windows/process_creation/win_office_shell.yml`
For reference, the Zeek field-mapping configuration file removed in this change (468 lines):
title: Zeek field mappings for default collection of JSON logs with no parsing/normalization done and sending into logstash-* index
order: 20
backends:
  - es-qs
  - es-dsl
  - elasticsearch-rule
  - kibana
  - xpack-watcher
  - elastalert
  - elastalert-dsl
logsources:
  zeek:
    product: zeek
    index: 'logstash*'
  zeek-category-accounting:
    category: accounting
    rewrite:
      product: zeek
      service: syslog
  zeek-category-firewall:
    category: firewall
    conditions:
      '@stream': conn
  zeek-category-dns:
    category: dns
    conditions:
      '@stream': dns
  zeek-category-proxy:
    category: proxy
    rewrite:
      product: zeek
      service: http
  zeek-category-webserver:
    category: webserver
    conditions:
      '@stream': http
    rewrite:
      product: zeek
      service: http
  zeek-conn:
    product: zeek
    service: conn
    conditions:
      '@stream': conn
  zeek-conn_long:
    product: zeek
    service: conn_long
    conditions:
      '@stream': conn_long
  zeek-dce_rpc:
    product: zeek
    service: dce_rpc
    conditions:
      '@stream': dce_rpc
  zeek-dns:
    product: zeek
    service: dns
    conditions:
      '@stream': dns
  zeek-dnp3:
    product: zeek
    service: dnp3
    conditions:
      '@stream': dnp3
  zeek-dpd:
    product: zeek
    service: dpd
    conditions:
      '@stream': dpd
  zeek-files:
    product: zeek
    service: files
    conditions:
      '@stream': files
  zeek-ftp:
    product: zeek
    service: ftp
    conditions:
      '@stream': ftp
  zeek-gquic:
    product: zeek
    service: gquic
    conditions:
      '@stream': gquic
  zeek-http:
    product: zeek
    service: http
    conditions:
      '@stream': http
  zeek-http2:
    product: zeek
    service: http2
    conditions:
      '@stream': http2
  zeek-intel:
    product: zeek
    service: intel
    conditions:
      '@stream': intel
  zeek-irc:
    product: zeek
    service: irc
    conditions:
      '@stream': irc
  zeek-kerberos:
    product: zeek
    service: kerberos
    conditions:
      '@stream': kerberos
  zeek-known_certs:
    product: zeek
    service: known_certs
    conditions:
      '@stream': known_certs
  zeek-known_hosts:
    product: zeek
    service: known_hosts
    conditions:
      '@stream': known_hosts
  zeek-known_modbus:
    product: zeek
    service: known_modbus
    conditions:
      '@stream': known_modbus
  zeek-known_services:
    product: zeek
    service: known_services
    conditions:
      '@stream': known_services
  zeek-modbus:
    product: zeek
    service: modbus
    conditions:
      '@stream': modbus
  zeek-modbus_register_change:
    product: zeek
    service: modbus_register_change
    conditions:
      '@stream': modbus_register_change
  zeek-mqtt_connect:
    product: zeek
    service: mqtt_connect
    conditions:
      '@stream': mqtt_connect
  zeek-mqtt_publish:
    product: zeek
    service: mqtt_publish
    conditions:
      '@stream': mqtt_publish
  zeek-mqtt_subscribe:
    product: zeek
    service: mqtt_subscribe
    conditions:
      '@stream': mqtt_subscribe
  zeek-mysql:
    product: zeek
    service: mysql
    conditions:
      '@stream': mysql
  zeek-notice:
    product: zeek
    service: notice
    conditions:
      '@stream': notice
  zeek-ntlm:
    product: zeek
    service: ntlm
    conditions:
      '@stream': ntlm
  zeek-ntp:
    product: zeek
    service: ntp
    conditions:
      '@stream': ntp
  zeek-ocsp:
    product: zeek
    service: ocsp
    conditions:
      '@stream': ocsp
  zeek-pe:
    product: zeek
    service: pe
    conditions:
      '@stream': pe
  zeek-pop3:
    product: zeek
    service: pop3
    conditions:
      '@stream': pop3
  zeek-radius:
    product: zeek
    service: radius
    conditions:
      '@stream': radius
  zeek-rdp:
    product: zeek
    service: rdp
    conditions:
      '@stream': rdp
  zeek-rfb:
    product: zeek
    service: rfb
    conditions:
      '@stream': rfb
  zeek-sip:
    product: zeek
    service: sip
    conditions:
      '@stream': sip
  zeek-smb_files:
    product: zeek
    service: smb_files
    conditions:
      '@stream': smb_files
  zeek-smb_mapping:
    product: zeek
    service: smb_mapping
    conditions:
      '@stream': smb_mapping
  zeek-smtp:
    product: zeek
    service: smtp
    conditions:
      '@stream': smtp
  zeek-smtp_links:
    product: zeek
    service: smtp_links
    conditions:
      '@stream': smtp_links
  zeek-snmp:
    product: zeek
    service: snmp
    conditions:
      '@stream': snmp
  zeek-socks:
    product: zeek
    service: socks
    conditions:
      '@stream': socks
  zeek-software:
    product: zeek
    service: software
    conditions:
      '@stream': software
  zeek-ssh:
    product: zeek
    service: ssh
    conditions:
      '@stream': ssh
  zeek-ssl:
    product: zeek
    service: ssl
    conditions:
      '@stream': ssl
  zeek-tls: # In case people call it TLS even though orig log is called ssl
    product: zeek
    service: tls
    conditions:
      '@stream': ssl
  zeek-syslog:
    product: zeek
    service: syslog
    conditions:
      '@stream': syslog
  zeek-tunnel:
    product: zeek
    service: tunnel
    conditions:
      '@stream': tunnel
  zeek-traceroute:
    product: zeek
    service: traceroute
    conditions:
      '@stream': traceroute
  zeek-weird:
    product: zeek
    service: weird
    conditions:
      '@stream': weird
  zeek-x509:
    product: zeek
    service: x509
    conditions:
      '@stream': x509
  zeek-ip_search:
    product: zeek
    service: network
    conditions:
      '@stream':
        - conn
        - conn_long
        - dce_rpc
        - dhcp
        - dnp3
        - dns
        - ftp
        - gquic
        - http
        - irc
        - kerberos
        - modbus
        - mqtt_connect
        - mqtt_publish
        - mqtt_subscribe
        - mysql
        - ntlm
        - ntp
        - radius
        - rfb
        - sip
        - smb_files
        - smb_mapping
        - smtp
        - smtp_links
        - snmp
        - socks
        - ssh
        - tls # SSL
        - tunnel
        - weird
defaultindex: 'logstash-*'
fieldmappings:
  # All Logs Applied Mapping & Taxonomy
  dst_ip: id.resp_h
  dst_port: id.resp_p
  network_protocol: proto
  src_ip: id.orig_h
  src_port: id.orig_p
  # DNS matching Taxonomy & DNS Category
  answer: answers
  #question_length: # Does not exist in open source version
  record_type: qtype_name
  #parent_domain: # Does not exist in open source version
  # HTTP matching Taxonomy & Web/Proxy Category
  cs-bytes: request_body_len
  cs-cookie: cookie
  r-dns: host
  sc-bytes: response_body_len
  sc-status: status_code
  c-uri: uri
  c-uri-extension: uri
  c-uri-query: uri
  c-uri-stem: uri
  c-useragent: user_agent
  cs-host: host
  cs-method: method
  cs-referrer: referrer
  cs-version: version
  # Temporary one off rule name fields
  agent.version: version
  c-cookie: cookie
  c-ip: id.orig_h
  cs-uri: uri
  clientip: id.orig_h
  clientIP: id.orig_h
  dest_domain:
    - query
    - host
    - server_name
  dest_ip: id.resp_h
  dest_port: id.resp_p
  #TODO:WhatShouldThisBe?==dest:
  #TODO:WhatShouldThisBe?==destination:
  #TODO:WhatShouldThisBe?==Destination:
  destination.hostname:
    - query
    - host
    - server_name
  DestinationAddress:
  DestinationHostname:
    - host
    - query
    - server_name
  DestinationIp: id.resp_h
  DestinationIP: id.resp_h
  DestinationPort: id.resp_p
  dst-ip: id.resp_h
  dstip: id.resp_h
  dstport: id.resp_p
  Host:
    - host
    - query
    - server_name
  HostVersion: http.version
  http_host:
    - host
    - query
    - server_name
  http_uri: uri
  http_url: uri
  http_user_agent: user_agent
  http.request.url-query-params: uri
  HttpMethod: method
  in_url: uri
  # parent_domain: # Not in open source zeek
  post_url_parameter: uri
  Request Url: uri
  request_url: uri
  request_URL: uri
  RequestUrl: uri
  #response: status_code
  resource.url: uri
  resource.URL: uri
  sc_status: status_code
  sender_domain:
    - query
    - server_name
  service.response_code: status_code
  source: id.orig_h
  SourceAddr: id.orig_h
  SourceAddress: id.orig_h
  SourceIP: id.orig_h
  SourceIp: id.orig_h
  SourceNetworkAddress: id.orig_h
  SourcePort: id.orig_p
  srcip: id.orig_h
  Status: status_code
  status: status_code
  url: uri
  URL: uri
  url_query: uri
  url.query: uri
  uri_path: uri
  user_agent: user_agent
  user_agent.name: user_agent
  user-agent: user_agent
  User-Agent: user_agent
  useragent: user_agent
  UserAgent: user_agent
  User Agent: user_agent
  web_dest:
    - host
    - query
    - server_name
  web.dest:
    - host
    - query
    - server_name
  Web.dest:
    - host
    - query
    - server_name
  web.host:
    - host
    - query
    - server_name
  Web.host:
    - host
    - query
    - server_name
  web_method: method
  Web_method: method
  web.method: method
  Web.method: method
  web_src: id.orig_h
  web_status: status_code
  Web_status: status_code
  web.status: status_code
  Web.status: status_code
  web_uri: uri
  web_url: uri
  # Most are in ECS, but for things not using Elastic - these need renamed
  destination.ip: id.resp_h
  destination.port: id.resp_p
  http.request.body.content: post_body
  #source.domain:
  source.ip: id.orig_h
  source.port: id.orig_p