Log Ingestion via Logstash - Custom table


Hello All,

I am using Logstash to send logs to my MS Sentinel instance. It works fine with standard tables, e.g. Syslog.

Now I am trying to ingest logs from other sources, such as GCP logs. I am using the GCP input plugin, filtering via grok, and using the Microsoft Sentinel output plugin to send the logs to a DCR.

 

I get a success message on the Logstash console:

[INFO ] 2024-01-22 14:44:13.225 [[main]>worker0] microsoftsentineloutput - Successfully posted 1 logs into log analytics DCR stream [Custom-gcplogs24_CL] x-ms-request-id [eeb36e07-bfd7-40ba-becc-fc438a6bfdaa].

 

But I am unable to find the logs when I search for them in the Log Analytics table. Logs are sent in JSON format like the sample below (the query I am using to search is shown after the sample).

I have also attached my custom table schema for reference.

Can someone help? (Screenshot of the custom table schema: Snip20240122_34.png)

{
  "host": "XXXXX",
  "@version": "XXXXX",
  "parsed_data": {
    "severity": "XXXXX",
    "protoPayload": {
      "response": {
        "@type": "XXXXX",
        "service": {
          "state": "XXXXX",
          "parent": "XXXXX",
          "name": "XXXXX",
          "config": {
            "title": "XXXXX",
            "name": "XXXXX"
          }
        }
      },
      "@type": "XXXXX",
      "requestMetadata": {
        "callerIp": "XXXXX",
        "callerSuppliedUserAgent": "XXXXX"
      },
      "resourceName": "XXXXX",
      "serviceName": "XXXXX",
      "authenticationInfo": {
        "principalEmail": "XXXXX"
      },
      "authorizationInfo": [{
        "permission": "XXXXX",
        "granted": "XXXXX",
        "resource": "XXXXX"
      }, {
        "permission": "XXXXX",
        "granted": "XXXXX",
        "resource": "XXXXX"
      }],
      "request": {
        "@type": "XXXXX",
        "name": "XXXXX"
      },
      "status": {},
      "methodName": "XXXXX"
    },
    "timestamp": "XXXXX",
    "logName": "XXXXX",
    "operation": {
      "producer": "XXXXX",
      "last": "XXXXX",
      "id": "XXXXX"
    },
    "resource": {
      "labels": {
        "project_id": "XXXXX",
        "service": "XXXXX",
        "method": "XXXXX"
      },
      "type": "XXXXX"
    },
    "receiveTimestamp": "XXXXX",
    "insertId": "XXXXX"
  },
  "json_data": "{\"insertId\":\"XXXXX\",\"logName\":\"XXXXX\",\"operation\":{\"id\":\"XXXXX\",\"last\":\"XXXXX\",\"producer\":\"XXXXX\"},\"protoPayload\":{\"@type\":\"XXXXX\",\"authenticationInfo\":{\"principalEmail\":\"XXXXX\"},\"authorizationInfo\":[{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"},{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"}],\"methodName\":\"XXXXX\",\"request\":{\"@type\":\"XXXXX\",\"name\":\"XXXXX\"},\"requestMetadata\":{\"callerIp\":\"XXXXX\",\"callerSuppliedUserAgent\":\"XXXXX\"},\"resourceName\":\"XXXXX\",\"response\":{\"@type\":\"XXXXX\",\"service\":{\"config\":{\"name\":\"XXXXX\",\"title\":\"XXXXX\"},\"name\":\"XXXXX\",\"parent\":\"XXXXX\",\"state\":\"XXXXX\"}},\"serviceName\":\"XXXXX\",\"status\":{}},\"receiveTimestamp\":\"XXXXX\",\"resource\":{\"labels\":{\"method\":\"XXXXX\",\"project_id\":\"XXXXX\",\"service\":\"XXXXX\"},\"type\":\"XXXXX\"},\"severity\":\"XXXXX\",\"timestamp\":\"XXXXX\"}",
  "@timestamp": "XXXXX",
  "message": "{\"insertId\":\"XXXXX\",\"logName\":\"XXXXX\",\"operation\":{\"id\":\"XXXXX\",\"last\":\"XXXXX\",\"producer\":\"XXXXX\"},\"protoPayload\":{\"@type\":\"XXXXX\",\"authenticationInfo\":{\"principalEmail\":\"XXXXX\"},\"authorizationInfo\":[{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"},{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"}],\"methodName\":\"XXXXX\",\"request\":{\"@type\":\"XXXXX\",\"name\":\"XXXXX\"},\"requestMetadata\":{\"callerIp\":\"XXXXX\",\"callerSuppliedUserAgent\":\"XXXXX\"},\"resourceName\":\"XXXXX\",\"response\":{\"@type\":\"XXXXX\",\"service\":{\"config\":{\"name\":\"XXXXX\",\"title\":\"XXXXX\"},\"name\":\"XXXXX\",\"parent\":\"XXXXX\",\"state\":\"XXXXX\"}},\"serviceName\":\"XXXXX\",\"status\":{}},\"receiveTimestamp\":\"XXXXX\",\"resource\":{\"labels\":{\"method\":\"XXXXX\",\"project_id\":\"XXXXX\",\"service\":\"XXXXX\"},\"type\":\"XXXXX\"},\"severity\":\"XXXXX\",\"timestamp\":\"XXXXX\"}",
  "event": {
    "original": "{\"insertId\":\"XXXXX\",\"logName\":\"XXXXX\",\"operation\":{\"id\":\"XXXXX\",\"last\":\"XXXXX\",\"producer\":\"XXXXX\"},\"protoPayload\":{\"@type\":\"XXXXX\",\"authenticationInfo\":{\"principalEmail\":\"XXXXX\"},\"authorizationInfo\":[{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"},{\"granted\":\"XXXXX\",\"permission\":\"XXXXX\",\"resource\":\"XXXXX\"}],\"methodName\":\"XXXXX\",\"request\":{\"@type\":\"XXXXX\",\"name\":\"XXXXX\"},\"requestMetadata\":{\"callerIp\":\"XXXXX\",\"callerSuppliedUserAgent\":\"XXXXX\"},\"resourceName\":\"XXXXX\",\"response\":{\"@type\":\"XXXXX\",\"service\":{\"config\":{\"name\":\"XXXXX\",\"title\":\"XXXXX\"},\"name\":\"XXXXX\",\"parent\":\"XXXXX\",\"state\":\"XXXXX\"}},\"serviceName\":\"XXXXX\",\"status\":{}},\"receiveTimestamp\":\"XXXXX\",\"resource\":{\"labels\":{\"method\":\"XXXXX\",\"project_id\":\"XXXXX\",\"service\":\"XXXXX\"},\"type\":\"XXXXX\"},\"severity\":\"XXXXX\",\"timestamp\":\"XXXXX\"}"
  }
}
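
For reference, this is the kind of basic KQL check I am running against the table (gcplogs24_CL is my assumption for the table name behind the Custom-gcplogs24_CL stream; adjust to whatever the schema screenshot shows):

gcplogs24_CL
| where TimeGenerated > ago(24h)
| take 10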

 

1 Reply

Hi - let's walk backwards a second.  

 

You have a schema set up. Did you do this through a DCR, or by manually creating the fields on the table?

 

If you didn't edit the DCR to create a transformKql segment, you'll want to use that - it will generate the fields for you. When you click the three dots next to the table name, you'll get an option for "Edit Transformation". When you click that, it will ask you to drop in a JSON file.
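
For context, the transformation is just a KQL statement that runs over a virtual input table called source (the incoming stream). A minimal sketch, before any field mapping, looks like this:

source
| extend TimeGenerated = now()

Each column the transformation outputs needs a matching column in the destination table.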

 

output {
    microsoft-sentinel-logstash-output-plugin {
        # For testing/table creation only: set create_sample_file to true so the plugin writes a local JSON sample file you can use when building the table; set it back to false afterwards.
        create_sample_file => true
        sample_file_path => "/usr/share/logstash/output_to_host"
    }
}

 

Use the output above in Logstash to have the plugin generate the JSON sample file for you. You can then take that file, drop it into the "Edit Transformation" wizard, and click "Next". There you'll have to define "extend" fields to map each incoming field to a column name, as in the sketch below.
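
As a rough example of what those extends can look like (the incoming message column and the GCP field paths below are illustrative placeholders, not your exact schema):

source
| extend d = parse_json(message)
| extend CallerIp = tostring(d.protoPayload.requestMetadata.callerIp)
| extend PrincipalEmail = tostring(d.protoPayload.authenticationInfo.principalEmail)
| extend MethodName = tostring(d.protoPayload.methodName)
| project-away d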

 

Keep in mind, when you create new fields (i.e., | extend host = extract(...)), the table schema will append an "_s" or "_g" or some such at the end. This is normal, and not something you can change. If you want it pretty, you'll need to use functions to remap each name to something without the _s / _g / _n / etc., along the lines of the sketch below.
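
A minimal sketch of that kind of remapping function - the table and column names here are made up for illustration; save the query as a function and query the function instead of the raw table:

// Saved in Log Analytics as a function, e.g. GcpAuditLogs
gcplogs24_CL
| project-rename
    CallerIp = callerIp_s,
    PrincipalEmail = principalEmail_s,
    MethodName = methodName_s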

 

I found it harder to do Custom-SyslogStream than a normal custom table.  

 

These links may be of use: