Getting exchange module working #1

Open
bennerj opened this issue Oct 16, 2023 · 10 comments


bennerj commented Oct 16, 2023

Hello,

I'm not sure what I'm doing wrong, but I have imported everything you created for MS Exchange so that I can ingest the logs via the module (instead of Logstash, as I have been doing) and try out your updated dashboards.

Currently I am running Elastic 8.10.3, so I changed the pipeline to match 8.10.3 instead of 8.6.x. Everything looks like it is working, and the pipeline tests fine both with your test dataset and with my own Exchange server data. But for some reason, looking at the Discover page I just get a bunch of:

error.message "Text '#Date: 2023-09-27T19:00:01.134Z' could not be parsed at index 0", among other errors all related to the same first set of fields.

Looking at my MSGTRK*.LOG files, I see that the first few lines of every file look like this:

#Software: Microsoft Exchange Server
#Version: 15.02.0986.042
#Log-type: Message Tracking Log
#Date: 2023-09-16T20:00:10.026Z
#Fields: date-time,client-ip,client-hostname,server-ip,server-hostname,source-context,connector-id,source,event-id,internal-message-id,message-id,network-message-id,recipient-address,recipient-status,total-bytes,recipient-count,related-recipient-address,reference,message-subject,sender-address,return-path,message-info,directionality,tenant-id,original-client-ip,original-server-ip,custom-data,transport-traffic-type,log-id,schema-version

Any idea how to get those header lines trimmed out?

leweafan (Owner) commented:

Hello @bennerj!
The pipeline can't parse commented lines; I suppose I can add a fix to ignore them.
Have you tried testing the ingest_pipeline with data from your own logs?
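
For anyone following along, a quick way to do that from Kibana Dev Tools is the _simulate API. This is just a sketch, assuming the pipeline was installed as filebeat-8.10.3-microsoft-exchange-pipeline; the message value is a placeholder you would swap for a real line from MSGTRK*.LOG:

POST _ingest/pipeline/filebeat-8.10.3-microsoft-exchange-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "<paste one raw line from MSGTRK*.LOG here>"
      }
    }
  ]
}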


bennerj commented Oct 17, 2023

Yep, when I use data from my log excluding the 4 lines that start with “#”, the pipeline works great.

Unfortunately, I can't get it to go past those first 4 lines.


bennerj commented Oct 17, 2023

@leweafan I was able to get it to start importing data by adding exclude_lines: ['^#'] to exchange.yml inside the module.
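
For anyone else hitting this, an alternative that avoids editing the module files is to override the input settings from the modules.d config. A minimal sketch, assuming the fileset follows the standard Filebeat module conventions:

- module: microsoft
  exchange:
    enabled: true
    input:
      # drop the #Software/#Version/#Log-type/#Date/#Fields header lines
      exclude_lines: ['^#']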

Now my issue is that I can't get the dashboard to load at all. I go to Saved Objects and do an import, but I get the following error when using your .ndjson file: "Sorry, there was an error. The file could not be processed due to error: Bad Request: Unexpected end of JSON input".

I just noticed the event log on my Exchange server has the following error as well:

{"log.level":"warn","@timestamp":"2023-10-17T07:33:43.097-0400","log.logger":"elasticsearch","log.origin":{"file.name":"elasticsearch/client.go","file.line":446},"message":"Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Date(2023, time.October, 17, 7, 32, 57, 702317300, time.Local), Meta:{\"pipeline\":\"filebeat-8.10.3-microsoft-exchange-pipeline\"}, Fields:{\"agent\":{\"ephemeral_id\":\"8e73ee61-c8ae-44d4-8f28-c32635d3519f\",\"id\":\"16618e89-914d-4860-bb68-b3d21f47e39b\",\"name\":\"exch\",\"type\":\"filebeat\",\"version\":\"8.10.3\"},\"ecs\":{\"version\":\"8.5.0\"},\"event\":{\"dataset\":\"microsoft.exchange\",\"module\":\"microsoft\",\"timezone\":\"-04:00\"},\"fileset\":{\"name\":\"exchange\"},\"host\":{\"architecture\":\"x86_64\",\"hostname\":\"exch\",\"id\":\"982acf9c-3858-4362-a367-67cb5e3e66e9\",\"ip\":[\"10.0.0.63\",\"10.0.0.25\"],\"mac\":[\"00-50-56-94-03-F4\",\"02-9C-C6-5D-9A-23\"],\"name\":\"exch\",\"os\":{\"build\":\"17763.4851\",\"family\":\"windows\",\"kernel\":\"10.0.17763.4851 (WinBuild.160101.0800)\",\"name\":\"Windows Server 2019 Standard\",\"platform\":\"windows\",\"type\":\"windows\",\"version\":\"10.0\"}},\"input\":{\"type\":\"log\"},\"log\":{\"file\":{\"path\":\"C:\\\\Program Files\\\\Microsoft\\\\Exchange Server\\\\V15\\\\TransportRoles\\\\Logs\\\\MessageTracking\\\\MSGTRK2023092916-1.LOG\"},\"offset\":2279616},\"message\":\"2023-09-29T16:42:13.979Z,,exch,,,CatHandleFail,,AGENT,AGENTINFO,158248070021802,\\u003c5bb76e99b8bb4dc49170de8f94d2e6db@my.domain\\u003e,d6f30b01-e75b-4fde-dd4f-08dbc10b08b6,member2@mail.domain,,21970,1,,,Meeting Forward Notification: OSS review,member1@my.domain,member1@my.domain,,Originating,,10.0.1.33,10.0.0.61,S:AMA=SUM|action=p|error=|atch=0;S:DeliveryPriority=Normal;S:AccountForest=my.domain,Email,859738a3-1990-401c-55e2-08dbc10b08d1,15.02.0986.042\",\"service\":{\"type\":\"microsoft\"}}, Private:file.State{Id:\"native::95354880-164251-2532818677\", PrevId:\"\", Finished:false, Fileinfo:(*os.fileStat)(0xc0004b2e00), Source:\"C:\\\\Program Files\\\\Microsoft\\\\Exchange Server\\\\V15\\\\TransportRoles\\\\Logs\\\\MessageTracking\\\\MSGTRK2023092916-1.LOG\", Offset:2280095, Timestamp:time.Date(2023, time.October, 17, 7, 16, 31, 472348500, time.Local), TTL:-1, Type:\"log\", Meta:map[string]string(nil), FileStateOS:file.StateOS{IdxHi:0x5af0000, IdxLo:0x2819b, Vol:0x96f7bef5}, IdentifierName:\"native\"}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:mapstr.M(nil)}} (status=400): {\"type\":\"document_parsing_exception\",\"reason\":\"[1:291] failed to parse field [related.ip] of type [ip] in document with id 'J9RpPYsBu7qW3REruKk9'. Preview of field's value: ''\",\"caused_by\":{\"type\":\"illegal_argument_exception\",\"reason\":\"'' is not an IP string literal.\"}}, dropping event!","ecs.version":"1.6.0"}


bennerj commented Oct 17, 2023

@leweafan So I figured out my issue with the dashboard. I just flattened your ndjson file so each { } was on a single line and I was able to import everything just fine.
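
For reference, one way to do that flattening in a single step, assuming jq is available (file names here are placeholders):

jq -c . exchange-dashboards.ndjson > exchange-dashboards.flat.ndjson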

Now there's a new issue that I'm not sure about.

Saved field "microsoft.exchange.total_bytes" of data view "filebeat-*" is invalid for use with the "Sum" aggregation. Please select a new field.

That is what I get for any of your visualizations that do a SUM aggregation. Looking into it, the microsoft.exchange.total_bytes field in the index is mapped as keyword rather than long, which is what would let this work.

I noticed that the component template you had me upload was not attached to the index template, so I went and attached it, but I'm not entirely sure I got that right. Any other ideas?

leweafan (Owner) commented:

@bennerj, if you check exchange-mapping.json you will find that the total_bytes field type is long.
I suppose what happened in your case is that your template has dynamic mapping enabled, so when the new field microsoft.exchange.total_bytes was first created it was mapped as keyword. Then you discovered the component template was not applied and fixed that. Now the field mapping will not change; or, if the index rolls over and the mapping does change, your data view will have a conflict between the two field types (long vs keyword). The best way to fix it is to create a new data stream/index or to reindex your data.
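
Roughly, the reindex route looks like the sketch below: make sure the corrected component template is attached to the index template first, so the destination index gets total_bytes mapped as long, then copy the data over (index names here are placeholders):

POST _reindex
{
  "source": { "index": "filebeat-8.10.3-old" },
  "dest":   { "index": "filebeat-8.10.3-new" }
}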

leweafan (Owner) commented:

I have checked exchange-mapping.json and found that I forgot to add the microsoft prefix to the fields. My bad. Fixed it.

@leweafan leweafan changed the title Getting this working Getting exchange module working Oct 18, 2023
@leweafan leweafan self-assigned this Oct 18, 2023

bennerj commented Oct 18, 2023

Looks like that did the trick! Thanks for taking a look at this! Also thanks for all the work on this. It truly is great to see all this come together here.


bennerj commented Oct 19, 2023

@leweafan So I have gotten everything working at this point for Exchange. However, when I look at the dashboards I am not seeing anything for the "message receive" counters. Looking at the visualization, I notice it is filtering on "event.id" DELIVER, but when inspecting Discover for filebeat-* I do not see that value anywhere.

When I go to the logs themselves I do see DELIVER in MSGTRKMD*-*.LOG, but it is not showing up on the Elastic side.

Any ideas on what I could be missing on my setup?


bennerj commented Oct 19, 2023

I think I found something related to this. Testing the pipeline with my actual log data, I noticed that "related.ip" ends up containing an empty string (""), which prevents Elastic from indexing anything from those logs. I tried messing with the pipeline and got it to ingest my delivered log files, but that seemed to break everything else.

Here is the relevant part of the output:

 "related": {
            "ip": [
              "10.0.49.196",
              ""
            ]
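
As a stopgap until the pipeline itself is fixed, a small script processor near the end of the ingest pipeline could strip empty entries out of related.ip so these events stop being rejected. This is only a sketch, not something shipped with the module:

  {
    "script": {
      "if": "ctx.related?.ip != null",
      "source": "List cleaned = new ArrayList(); for (def ip : ctx.related.ip) { if (ip != null && ip != '') { cleaned.add(ip); } } ctx.related.ip = cleaned;"
    }
  },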


bennerj commented Oct 20, 2023

@leweafan

So, messing with your pipeline, I found the issue.

This block of code:

  {
    "append": {
      "field": "related.ip",
      "value": "{{microsoft.exchange.server_ip}}",
      "allow_duplicates": false,
      "if": "ctx?.microsoft?.exchange?.original_server_ip != null"
    }
  },

Needs to actually be this:

  {
    "append": {
      "field": "related.ip",
      "value": "{{microsoft.exchange.original_server_ip}}",
      "allow_duplicates": false,
      "if": "ctx?.microsoft?.exchange?.original_server_ip != null"
    }
  },
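
On top of the field-name fix, it may also be worth tightening the condition so an empty original_server_ip never ends up in related.ip (that empty string is what triggered the "'' is not an IP string literal" rejections earlier in this thread). A possible variant:

  {
    "append": {
      "field": "related.ip",
      "value": "{{microsoft.exchange.original_server_ip}}",
      "allow_duplicates": false,
      "if": "ctx?.microsoft?.exchange?.original_server_ip != null && ctx.microsoft.exchange.original_server_ip != ''"
    }
  },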
