Failing to decode IPv6 addresses

DrydenK
New Forum User
Posts: 3
Joined: Fri Sep 11, 2020 8:12 am
Location: Sao Paulo

Failing to decode IPv6 addresses

Post by DrydenK »

Hi,

My OSSEC installation is having problems decoding IPv6 addresses. I used /var/ossec/bin/ossec-logtest to test a syslog message and got the following output:


root@syslog-server:~# /var/ossec/bin/ossec-logtest
2020/09/11 09:15:46 ossec-testrule: INFO: Reading local decoder file.
2020/09/11 09:15:46 ossec-testrule: INFO: Started (pid: 17218).
ossec-testrule: Type one log per line.

Sep 10 08:40:51 2801:<x:x>::1 pdns_recursor[45779]: stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103


**Phase 1: Completed pre-decoding.
full event: 'Sep 10 08:40:51 2801:<x:x>::1 pdns_recursor[45779]: stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103'
hostname: '<ossec server name. NOT the remote address, as would be expected!>'
program_name: '(null)'
log: '2801:<x:x>::1 pdns_recursor[45779]: stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103'

**Phase 2: Completed decoding.
No decoder matched.

**Phase 3: Completed filtering (rules).
Rule id: '1002'
Level: '2'
Description: 'Unknown problem somewhere in the system.'
**Alert to be generated.



Sep 10 08:40:51 200.<x.x>.1 pdns_recursor[45779]: stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103


**Phase 1: Completed pre-decoding.
full event: 'Sep 10 08:40:51 200.<x.x>.1 pdns_recursor[45779]: stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103'
hostname: '200.<x.x>.1'
program_name: 'pdns_recursor'
log: 'stats: throttle map: 16, ns speeds: 4345, failed ns: 51, ednsmap: 5103'

**Phase 2: Completed decoding.
No decoder matched.

**Phase 3: Completed filtering (rules).
Rule id: '90010'
Level: '0'
Description: '(null)'


As can be seen in the output above, when the remote IP is an IPv6 address, OSSEC fails to decode the line entirely: it takes the hostname to be the server where OSSEC itself is running, reports the program name as (null), and throws the entire message into the 'log' field. The very same entry, using an IPv4 address, is decoded and matched properly.

Do I need to enable IPv6 decoding, or install some library to do so?

I'm using OSSEC 3.6.0, downloaded from the OSSEC website and installed with the provided 'install.sh' script. Only the 'local_rules.xml' file was modified. The OS is an up-to-date Ubuntu 18.04 LTS. IPv6 is working properly, both on the remote server that is sending this log and on the server where OSSEC is running (which is my central syslog server).

Could somebody help me here?

Thanks,

Roberto
mikeshinn
Atomicorp Staff - Site Admin
Posts: 4149
Joined: Thu Feb 07, 2008 7:49 pm
Location: Chantilly, VA

Re: Failing to decode IPv6 addresses

Post by mikeshinn »

It doesn't look like you have a decoder for that application:

**Phase 2: Completed decoding.
No decoder matched.

Without a decoder, OSSEC doesn't know what each field means.
DrydenK
New Forum User
Posts: 3
Joined: Fri Sep 11, 2020 8:12 am
Location: Sao Paulo

Re: Failing to decode IPv6 addresses

Post by DrydenK »

Mike,

your answer didn't make much sense to me, so I ran a few more tests. To make sure I had a decoder match, I used a well-known program for which there should be an included decoder: Apache. This was the result for IPv6 (I'm not masking the IP addresses this time):

2020/09/16 08:07:06 ossec-testrule: INFO: Reading local decoder file.
2020/09/16 08:07:06 ossec-testrule: INFO: Started (pid: 21035).
ossec-testrule: Type one log per line.

Sep 15 06:45:27 2801:88:fd::3 apache: 191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"


**Phase 1: Completed pre-decoding.
full event: 'Sep 15 06:45:27 2801:88:fd::3 apache: 191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"'
hostname: 'fumaira2'
program_name: '(null)'
log: '2801:88:fd::3 apache: 191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"'

**Phase 2: Completed decoding.
decoder: 'openbsd-httpd'

**Phase 3: Completed filtering (rules).
Rule id: '31100'
Level: '0'
Description: 'Access log messages grouped.'


So we have a decoder: openbsd-httpd. The problem is that the hostname is wrong. 'fumaira2' is the syslog server, and it appears nowhere in the line above. Why did OSSEC take it as the hostname? Also, program_name appears as '(null)' when it should be 'apache', and everything after the date was treated as the log.

Now, the same line using IPv4 as the source address:

Sep 15 06:45:27 200.145.62.3 apache: 191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"


**Phase 1: Completed pre-decoding.
full event: 'Sep 15 06:45:27 200.145.62.3 apache: 191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"'
hostname: '200.145.62.3'
program_name: 'apache'
log: '191.205.109.18 - - [15/Sep/2020:06:45:27 -0300] "GET / HTTP/1.1" 200 6221 "-" "Mozilla/5.0 (Linux; Android 8.0.0; SM-G935F Build/R16NW; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/80.0.3987.132 Mobile Safari/537.36"'

**Phase 2: Completed decoding.
No decoder matched.

As you can see, the different parts were properly decoded and placed in the right fields, yet OSSEC reports that no decoder matched, even though it did report the 'openbsd-httpd' decoder when the address was IPv6.

So what is happening? Do I have to add some configuration to make OSSEC decode IPv6 properly? If so, where is that documented? If not, is this a bug?

Thank you,

Roberto
mikeshinn
Atomicorp Staff - Site Admin
Posts: 4149
Joined: Thu Feb 07, 2008 7:49 pm
Location: Chantilly, VA

Re: Failing to decode IPv6 addresses

Post by mikeshinn »

I think I understand where you might be having trouble. Think of decoders as translators: even though a log might be going through a decoder, if the decoder doesn't understand the log message, it won't translate it correctly. You need the right decoder for that specific log format; even if the log comes from the same application, if its format differs from what the decoder expects, it won't work correctly. Applications generate all manner of unique log messages, and there's no standard format for anything, so log analysis tools have no idea what one field means versus another. They all need help translating what the log message means. That's done via decoders (or similarly named tools).

The decoders tear the log message apart and tell OSSEC not only what a log message means but what each field is, for that uniquely configured application (which might be unique to just you, that user, application, system, or whatever it might be).

For example, let's say we have two applications that generate logs like this:

Application 1:

11:23:57 PM March 28th 2020, ADMIN USER, LOGIN, SUCCESSFUL, WEB_SERVER, SSHD, PUBLIC KEY

Application 2:

Sep 25 10:47:01 web_server sshd[15366]: Accepted publickey for someuser from 1.2.3.4 port 33946 ssh2

As you can see, these applications' logs are very different: the first provides different information from the second, even though they record the same kind of event (someone has logged into sshd successfully). They don't even contain the same amount of information.

So log analysis tools need help to figure out what this means and to put the information into some kind of standard format they can then use. Decoders are how OSSEC does this: they translate the unique fields in unique log messages into standardized variables. For example, they tell the software that field one is the time, field two is the hostname, field three is the user name, field four is the program name, and what the format of the log message is so those fields can be found (delimiters, multi-line layout, and so on). For another application it might be that field ten is the time, field one is the program name, field two is the user name, and so forth. Everything is different, because there is no standard for logging: developers log what they want, how they want.

You can't use the decoder for Application 1 above to translate the log messages of Application 2, or vice versa. They each need their own decoder because, while they are the same application (sshd), they don't report the same information in the same order or even in the same format (the date/time differs between the two examples, Application 2 logs the client port, Application 1 is in all upper case, etc.).

So if OSSEC isn't putting the program name in the right field, or doesn't know what it is, that's because there isn't a decoder for the unique log format you're sending to OSSEC. Without a decoder for that log's format, it has no idea what each field means (the program name, for instance), so it can't populate that variable.

For example, here is the decoder for telnet log messages:

<decoder name="telnetd">
<program_name>^telnetd|^in.telnetd</program_name>
</decoder>

This tells OSSEC to scan the log message for the program name telnetd or in.telnetd, which populates the program_name variable. Then we can write a child decoder to take the log message apart and extract more information from it:

<decoder name="telnetd-ip">
<parent>telnetd</parent>
<regex>from (\d+.\d+.\d+.\d+)$</regex>
<order>srcip</order>
</decoder>

Since OSSEC now knows this is a telnet log message, it scans the message, using a regular expression in this case, for a part that says "from 1.2.3.4", and puts that IP address into the field "srcip".
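
Note that this stock regex only matches dotted IPv4 addresses. As a minimal sketch (my own variant, not from the stock decoders, and assuming the srcip field will accept the value as a plain string), a child decoder that also captures an IPv6 literal at the end of the line could look like this:

<!-- Hypothetical child decoder: \S+ captures any non-space token after
     "from", e.g. an IPv6 literal such as 2801:88:fd::3. How downstream
     rules treat a non-IPv4 srcip may vary by OSSEC version. -->
<decoder name="telnetd-ip6">
  <parent>telnetd</parent>
  <regex>from (\S+)$</regex>
  <order>srcip</order>
</decoder>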

You just need to either create a decoder for the format that the application uses for its log messages, or modify the one you're using so it puts the fields into the right variables. If you need help with this, please contact our support team and we can help you.
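
To make that concrete, here is a minimal sketch of a local decoder for the pdns_recursor stats line from earlier in this thread (decoder names are hypothetical; the conventional place for this is the local decoder file that ossec-logtest reports reading, /var/ossec/etc/local_decoder.xml):

<!-- Parent decoder: match on the program name from the syslog header.
     Note this relies on Phase 1 (pre-decoding) having extracted program_name. -->
<decoder name="pdns-recursor">
  <program_name>^pdns_recursor</program_name>
</decoder>

<!-- Child decoder: pull two of the counters out of the stats message. -->
<decoder name="pdns-recursor-stats">
  <parent>pdns-recursor</parent>
  <regex>^stats: throttle map: (\d+), ns speeds: (\d+)</regex>
  <order>data, extra_data</order>
</decoder>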
DrydenK
New Forum User
Posts: 3
Joined: Fri Sep 11, 2020 8:12 am
Location: Sao Paulo

Re: Failing to decode IPv6 addresses

Post by DrydenK »

mikeshinn wrote: I think I understand where you might be having trouble. Think of decoders as translators [...] For example, here is the decoder for telnet log messages: [...]

Ok, I understood the decoder part. The problem I'm having is that when the source address of the log message is IPv6, the "program_name" field comes back as '(null)'. As you can see in my previous messages, no decoder will work properly, since the parser is not putting the data in the right places: with IPv6, the "hostname" is filled incorrectly, "program_name" is '(null)', and the entire line is thrown into the log field. With an IPv4 line, the fields are filled correctly.

What I mean is that there seems to be a bug BEFORE the decoders are selected. The IP address is located before the log text itself, so it's not a matter of decoders here, but of the parser that separates the line into the fields that are later processed by the decoders.
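
For reference, the BSD syslog header layout that Phase 1 appears to assume, based on the outputs above, is:

<timestamp> <hostname> <program>[pid]: <log>
e.g. Sep 10 08:40:51 200.145.62.3 pdns_recursor[45779]: stats: ...

If the hostname slot contains an IPv6 literal such as 2801:88:fd::3, colons appear before the 'program[pid]:' tag, so a parser that keys on the first ':' (or on characters not expected in a hostname) to find the end of the tag would mis-split the line. That would explain the wrong hostname, the '(null)' program_name, and the whole remainder being thrown into the log field. This is an assumption from the observed output, not a confirmed reading of the code.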

Roberto Greiner
mikeshinn
Atomicorp Staff - Site Admin
Posts: 4149
Joined: Thu Feb 07, 2008 7:49 pm
Location: Chantilly, VA

Re: Failing to decode IPv6 addresses

Post by mikeshinn »

DrydenK wrote: What I mean is that there seems to be a bug BEFORE the decoders are selected. The IP address is located before the log text itself, so it's not a matter of decoders here, but of the parser that separates the line into the fields that are later processed by the decoders.
The parser is the decoder. It takes the raw log and bucketizes it however you want. It doesn't matter what order anything is in; it's the decoder's job to make sense of whatever the log message is.