Running Splunk In Docker is about as easy as it should be
You have fully embraced the world of Docker-hosted containers, and no longer have the patience for local installations of “Enterprise” software that requires a developer account to download. Fortunately, there’s a Splunk Enterprise Docker Image just for you. It even comes preconfigured with a 500 MB per day developer license. No account required!
Starting Splunk in Docker
The Splunk Enterprise Docker Image makes installation as simple as this:
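A minimal invocation looks something like the following (a sketch assuming the official `splunk/splunk` image; the password `passw0rd` is the one used throughout this walkthrough, and you should pick your own):

```shell
# Start Splunk Enterprise in the background, exposing the web UI (8000)
# and the HTTP Event Collector port (8088). The official image requires
# accepting the license and setting an admin password via environment variables.
docker run -d --name splunk \
  -p 8000:8000 \
  -p 8088:8088 \
  -e SPLUNK_START_ARGS=--accept-license \
  -e SPLUNK_PASSWORD=passw0rd \
  splunk/splunk:latest
```

Give it a minute or two to finish initializing; `docker logs -f splunk` shows the startup progress.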
You will see some output to the console as Splunk is starting up…
You can log in at http://localhost:8000 with username `admin` and password `passw0rd`.
Create a Splunk HTTP Event Collector for Docker Logging
Before another Docker container can log directly to Splunk, you need to create a new HTTP Event Collector that listens for logging events on port 8088:
- Log in to the Splunk web interface at http://localhost:8000 using `admin` and `passw0rd` for the username and password.
- Navigate to the Settings → Data Inputs → HTTP Event Collector page and click the New Token button.
- Enter the Name `docker` and click the Next button.
- Under the Index section, click the Create a new index link.
- Enter the Index Name `docker` and click the Save button.
- The Default Index will now read `docker`. Click the Review button.
- Review these settings and click the Submit button.
- You will then read Token has been created successfully and can copy the hexadecimal Token Value to your clipboard for use in subsequent `docker run` commands.
- Finally, navigate to Settings → Data Inputs → HTTP Event Collector and click the Global Settings button.
- Select All Tokens Enabled and click Save. This enables the HTTP Event Collector on port 8088.
That sounds like a lot of work, but it’s almost entirely default values and you will only need to do this once after creating a new Splunk container.
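You can sanity-check the collector with a quick curl before wiring up Docker (a sketch: the token shown is a placeholder — substitute the Token Value you copied above):

```shell
# Send a test event to the HTTP Event Collector over HTTPS.
# -k skips certificate verification, since the container uses a self-signed cert.
curl -k https://localhost:8088/services/collector/event \
  -H "Authorization: Splunk 00000000-0000-0000-0000-000000000000" \
  -d '{"event": "hello from curl", "index": "docker"}'
```

A Success response confirms the token and the `docker` index are working.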
Log Another Docker Container Directly To Splunk
Once the Splunk HTTP Event Collector is enabled and you have a valid Token Value for logging, it’s comparatively easy for Docker to log the rest of your containers directly to it.
Running a command similar to the following tells Docker to log the output from your container directly to Splunk:
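For example (a sketch using the `alpine` image; the IP address and token are placeholders — substitute your Docker host's address and your own Token Value):

```shell
# Run a container whose stdout/stderr is sent straight to Splunk's HEC.
# Replace 192.0.2.10 with your Docker host's IP or DNS name (not localhost)
# and the token with the Token Value created above.
docker run --rm \
  --log-driver=splunk \
  --log-opt splunk-token=00000000-0000-0000-0000-000000000000 \
  --log-opt splunk-url=https://192.0.2.10:8088 \
  --log-opt splunk-format=raw \
  --log-opt splunk-insecureskipverify=true \
  alpine echo '{"message": "hello from alpine"}'
```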
The docker run command supports the `--log-driver` and `--log-opt` switches, which allow us to tell Docker how to handle logging:
- `--log-driver=splunk` tells Docker to send logs to Splunk’s HTTP Event Collector.
- `--log-opt splunk-token` is where we put the Token Value we created for logging.
- `--log-opt splunk-url` is the HTTPS link to the Splunk HTTP Event Collector on port 8088.
- `--log-opt splunk-format=raw` tells Docker to send the raw text from our logs.
- `--log-opt splunk-insecureskipverify=true` tells Docker to skip cert verification.
That’s a lot of options, but it’s everything you’ll need to get those logs sent directly to Splunk for further analysis. Use the host IP address of the Docker server, or an appropriate DNS entry (don’t use `localhost`). Docker will exit immediately if this connection fails, providing you with a hint that you may not have completed step 10 above.
When the logging connection fails, the docker command exits with an error from the daemon instead of starting the container.
Start Searching in Splunk
If you were able to start up a Docker container using all of the Splunk logging options in the previous section, the logging output from that container is now searchable in Splunk.
Try this simple search to see what events have been received:
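A minimal search over the index created above (assuming you named it `docker`) is:

```
index="docker"
```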
If you ran the `alpine` example above, you will see its JSON-based output in the results.
Although Splunk handles JSON output exceptionally well, any text output from a logged container will be entered as a searchable event.