Last week I needed to configure Splunk to consume JSON log files. The documentation on the Splunk website wasn’t particularly clear, and my first attempt ended in some strange results, with data being repeated. An old colleague of mine (thanks Matt) pointed me in the direction of this Splunk Answers question, which described both the problem that I was having and the solution: fixing the configuration.
So here are the steps required to set up Splunk to consume JSON from a log file. I’ll assume that you already have an instance of Splunk installed.
Step 1 – Install the Universal Forwarder (optional)
The setup that I was working with was a Splunk server running on a virtual machine in Azure, and an on-premises server where the log files to consume were produced. Splunk provides a useful utility called the Universal Forwarder that consumes event data and sends it on to the Splunk server.
Installation is really straightforward so I’m not going to cover that here.
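For context, the forwarder just needs to be told where to send its data, which you can do during installation or in its outputs.conf afterwards. The sketch below is illustrative only; the host name and port are placeholders, not values from my setup (9997 is merely the conventional Splunk receiving port):

[tcpout]
defaultGroup = default-group

[tcpout:default-group]
server = your-splunk-server.example.com:9997

Whichever port you choose here has to match the receiving port enabled on the Splunk server.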
Step 2 – Configuring a custom source type
This is the part that caught me out. From the searching that I did the first time around, I learnt that I needed to set up a custom source type telling Splunk to parse the data as JSON. The mistake that I made was creating this custom source type on the remote node where I had the Forwarder installed.
To do it correctly, you will need to open (or create) a props.conf file on the Splunk server with the following content:
INDEXED_EXTRACTIONS = json
KV_MODE = none
The props.conf file can be found at
If props.conf doesn’t exist in this folder (it didn’t for me), then you will need to create it.
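To show those two settings in context: they sit under a source-type stanza in props.conf. The stanza name below is a placeholder I’ve picked for illustration, not something from my setup; use whatever name you want your custom source type to have:

[json_logs]
INDEXED_EXTRACTIONS = json
KV_MODE = none

As I understand it, INDEXED_EXTRACTIONS = json parses the JSON fields at index time, and KV_MODE = none stops Splunk from extracting the same fields again at search time, which is what was causing the duplicated data.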
Step 3 – Setting up log file monitoring
This is the easy part, and the part that I did do correctly. On the remote node, open the inputs.conf file and add the following:
The inputs.conf file can be found at
With that done, data is going in and nothing is being duplicated.
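For reference, the monitoring entry in inputs.conf looks something like the sketch below. The log path and index are placeholders; the sourcetype just needs to match the name of the custom source type you defined in props.conf on the server:

[monitor:///var/log/myapp/app.log]
sourcetype = json_logs
index = main
disabled = false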