How to ingest JSON logs from Kafka streams

Hi Experts, I want to check whether InsightIDR can ingest logs from a Kafka-compatible stream pool; any pointers would be appreciated. I am looking to integrate Oracle Cloud Infrastructure Audit Logs and VCN Flow Logs with the Rapid7 SIEM solution.

Thanks

Bal

Hi @bsharma,

Did you successfully manage to ingest OCI logs in InsightIDR?

I am also looking into this.

Thanks!

InsightIDR is essentially capable of ingesting anything you throw at it. But if you’re specifically looking for a direct integration where the logs will have a default parser and contribute to investigations, that’s a separate conversation.

Currently we do not have default integrations for the items you mentioned, but if those tools have the ability to forward logs, then you should be able to get them into IDR. The main thing is being able to route them to your collector. If these are internal systems, then the routing shouldn’t be much of an issue, so long as the application has the ability to forward on a port or something of that nature. Hosted applications are sometimes a little trickier, as you have to allow the traffic into your environment.

Either way, the collector will accept the logs, but there are some things to consider when sending them, like the format of those logs. InsightIDR does great with JSON because it returns everything as JSON. So, for example, if your application has the ability to send the logs in JSON format, I would choose that method so that InsightIDR can return those same key/value pairs back to you and allow you to search on them within log search. If the application does not support JSON and instead sends a typical flat syslog log line, you can always create custom parsers within InsightIDR after the fact that will create the key/value pairs for you.

https://docs.rapid7.com/insightidr/custom-logs/
https://docs.rapid7.com/insightidr/custom-parsing-tool/
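
If it helps to visualize the JSON route, here’s a minimal sketch (Python) of sending a single JSON-formatted event to a collector over raw TCP. The collector hostname, port, and event fields are all placeholders I made up; substitute whatever listener you configure for your custom log event source.

```python
# Minimal sketch: send one JSON event to an InsightIDR collector over
# plain TCP. Host, port, and event fields below are hypothetical
# placeholders -- use the listener you set up for your event source.
import json
import socket

COLLECTOR_HOST = "collector.example.internal"  # hypothetical collector address
COLLECTOR_PORT = 5140                          # hypothetical listener port

event = {
    "eventType": "vcn_flow",
    "srcAddr": "10.0.1.15",
    "dstAddr": "10.0.2.8",
    "action": "ACCEPT",
}

with socket.create_connection((COLLECTOR_HOST, COLLECTOR_PORT)) as sock:
    # One JSON object per line; log search can then expose each
    # key/value pair for querying without a custom parser.
    sock.sendall((json.dumps(event) + "\n").encode("utf-8"))
```

The point being: if the payload arrives as newline-delimited JSON, you get searchable key/value pairs for free, whereas a flat syslog line needs the custom parsing tool linked above.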

Thanks for your reply!

In essence, we still need something in between that will consume the Kafka streams and send that data to the collector as syslog (if that is possible, or at least forward the data as JSON).
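
For what it’s worth, that in-between piece can be quite small. Here’s a rough sketch using the kafka-python library; the broker address, topic name, and collector endpoint are all placeholders, and a real OCI Streaming connection would also need SASL_SSL authentication, which is omitted here for brevity.

```python
# Rough sketch of the "something in between": consume from a
# Kafka-compatible stream (e.g. OCI Streaming) and forward each record
# to the collector as one JSON line over TCP. All endpoints below are
# hypothetical; real OCI Streaming also requires SASL_SSL auth.
import socket

from kafka import KafkaConsumer  # pip install kafka-python

BROKERS = ["streaming.example.oraclecloud.com:9092"]  # hypothetical broker
TOPIC = "oci-audit-logs"                              # hypothetical topic
COLLECTOR = ("collector.example.internal", 5140)      # hypothetical listener

consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKERS)

with socket.create_connection(COLLECTOR) as sock:
    for record in consumer:
        # OCI audit/flow log records are already JSON, so pass the
        # payload through unchanged, newline-delimited for the collector.
        sock.sendall(record.value + b"\n")
```

Since the records are already JSON, passing them through untouched keeps InsightIDR’s automatic key/value extraction working without any custom parsing.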

There are tools out there that can handle that type of ingestion; for example, NXLog Enterprise Edition offers a Kafka input module: https://docs.nxlog.co/refman/current/im/kafka.html

Unfortunately, the Enterprise Edition does have a cost associated with it, and the Community Edition does not come with that input module.