SQL Server – Preferred Methods of Indexing MS SQL Server Audit Data into Splunk

Tags: audit, sql-server

What is your preferred method for indexing MS SQL Server audit data into Splunk? I am collecting audit data in various environments by various means (Profiler, app logs, and Extended Events), but I hope to consolidate them all into Extended Events moving forward.

However, one or two environments have Splunk connected, and we want to ensure all security logs are collected and sent to the Splunk server. I have seen a few ways of doing this, but I'm not sure which would be preferred or 'best practice'. I haven't seen much in the Splunk community on this, since it's really up to us to get the data into Splunk however we can, so I figured I'd ask here.


My requirements:

1. Make sure Splunk can index the darn thing.

I'd prefer minimal custom scripting to accomplish this, if possible. Basically, get the audit data into a text or CSV file.
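For what it's worth, once the audit data lands in a flat file, this requirement is mostly solved on the Splunk side: a Universal Forwarder picks the file up with a simple monitor stanza. The path, sourcetype, and index names below are assumptions, not anything from our environment:

```ini
# inputs.conf on the Universal Forwarder
# (path, sourcetype, and index are hypothetical placeholders)
[monitor://D:\AuditLogs\sqlaudit*.log]
sourcetype = mssql:audit
index = sql_audit
disabled = false
```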

2. Minimal risk of 'fudging' the audit logs before Splunk gets them.

I want the logs written to Splunk, or to the text file it indexes, as fast as possible. This reduces the window for 'man in the middle' tampering, where audit logs are fudged in between pulls and syncs. We might also be exposed to duplicate entries in certain scenarios.

3. Minimize DBA access.

I understand that DBAs with SA rights across the enterprise will always be able to work around restrictions, but we could still vastly minimize our access. I am thinking of a solution that security really manages, because sometimes even DBAs go bad.


The options I'm considering:

1. Collect Profiler data and use the .NET trace classes to write a mini-app that exports it to a text file for Splunk.

I haven't done this before, but it seems like it would satisfy the 'data read immediately' requirement. It requires a server-side trace, but in SQL Server 2012 you can start one without enabling startup stored procedures, which would go against our security standard.


Pros:

  • Profiler is easy and everyone knows it inside and out; this won't be dependent on me.

  • Easy .NET integration


Cons:

  • Profiler sucks compared to Extended Events.

  • Will have to ensure it is always collecting data. In 2012 that's easy, but in 2008 you will have issues if you don't use startup stored procs and will have to get creative with jobs. Even then, you might miss some logging.

  • It is being deprecated.
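The polling loop behind option 1 can be sketched roughly as follows. This is Python rather than .NET, purely for illustration; the trace path, output file, and column list are assumptions. The idea is to read the server-side trace with `sys.fn_trace_gettable` and use the `EventSequence` column as a high-water mark, so restarts and re-reads don't produce the duplicate entries mentioned in requirement 2:

```python
"""Sketch: poll a server-side trace and append new rows to a text file
that Splunk monitors. Server, paths, and columns are hypothetical."""
import time


def format_event(row):
    """Flatten one trace row (a dict) into a key="value" line Splunk parses well."""
    return " ".join(f'{k}="{v}"' for k, v in sorted(row.items()) if v is not None)


def poll_trace(conn, trace_path, out_path, last_seq=0, interval=5):
    """Append trace rows newer than last_seq to out_path, forever.

    EventSequence is strictly increasing, so it works as a high-water mark.
    """
    while True:
        cur = conn.cursor()
        cur.execute(
            "SELECT TextData, LoginName, StartTime, EventSequence "
            "FROM sys.fn_trace_gettable(?, DEFAULT) "
            "WHERE EventSequence > ? ORDER BY EventSequence",
            (trace_path, last_seq),
        )
        cols = [c[0] for c in cur.description]
        with open(out_path, "a", encoding="utf-8") as f:
            for r in cur.fetchall():
                row = dict(zip(cols, r))
                f.write(format_event(row) + "\n")
                last_seq = row["EventSequence"]
        time.sleep(interval)


# Usage (pyodbc is an assumed driver; any DB-API connection should work):
#   import pyodbc
#   conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
#                         "SERVER=myserver;Trusted_Connection=yes")
#   poll_trace(conn, r"D:\Traces\audit.trc", r"D:\AuditLogs\sqlaudit.log")
```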

2. Replace Profiler with Extended Events.


Pros:

  • Will always start up in 2008 and later without any special parameters or startup stored procs

  • Very lightweight

  • Preferred new method


Cons:

  • How the heck do you get the file saved in a text or CSV format so Splunk can easily access it? I haven't seen any way to do that, and my Pluralsight sub ran out :/
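One way to tackle this con: SQL Server exposes the `.xel` file target through `sys.fn_xe_file_target_read_file`, which returns each event as an XML document, and that XML can be flattened into CSV for Splunk. A rough sketch (file paths and the driver are assumptions; the XML flattener is the real work):

```python
"""Sketch: read an Extended Events file target via
sys.fn_xe_file_target_read_file and write a CSV Splunk can index."""
import csv
import xml.etree.ElementTree as ET


def flatten_event(event_xml):
    """Turn one XE event_data XML document into a flat dict.

    XE events look like:
      <event name="..." timestamp="...">
        <data name="..."><value>...</value></data>
        <action name="..."><value>...</value></action>
      </event>
    """
    root = ET.fromstring(event_xml)
    row = {"event": root.get("name"), "timestamp": root.get("timestamp")}
    for node in root:  # <data> and <action> children
        value = node.find("value")
        row[node.get("name")] = value.text if value is not None else None
    return row


def export_xel_to_csv(conn, xel_path, csv_path):
    """Dump every event in the .xel target to a CSV file."""
    cur = conn.cursor()
    cur.execute(
        "SELECT event_data "
        "FROM sys.fn_xe_file_target_read_file(?, NULL, NULL, NULL)",
        (xel_path,),
    )
    rows = [flatten_event(r[0]) for r in cur.fetchall()]
    if rows:
        fieldnames = sorted({k for r in rows for k in r})
        with open(csv_path, "w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(rows)
```

Run on a schedule (or track the file offset arguments of `fn_xe_file_target_read_file`) and point Splunk's file monitor at the CSV.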

3. Log the data to a locked-down table and have Splunk read that table. Put proper permissions in place so that only a DBA/security admin can modify the trigger/Service Broker app that logs it, and alert SecOps/DBAs if someone changes anything in that table.

The problem is that I have only found Splunk able to query MySQL, not MS SQL Server. Perhaps I could run a PowerShell script to continually query that table, or get creative with a trigger that starts a PowerShell session (I have never done that before), to reduce man-in-the-middle editing of the files.


Pros:

  • Meets the requirements.

  • Reduces the man-in-the-middle window if we can have it update the file immediately.


Cons:

  • It's perhaps more complicated than it needs to be.
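The 'continually query that table' part of option 3 could look something like this (Python rather than the PowerShell mentioned above, purely for illustration; the table and column names are assumptions). The low-privilege reader account only needs SELECT on the audit table, and the identity high-water mark keeps polls from emitting duplicates:

```python
"""Sketch: poll a locked-down audit table from a low-privilege reader
account and append rows to the file Splunk monitors. Table and column
names (dbo.SecurityAudit, AuditId, ...) are hypothetical."""


def render_line(row):
    """One audit row (dict) -> one key="value" log line."""
    return " ".join(f'{k}="{v}"' for k, v in row.items())


def drain_new_rows(conn, out_path, last_id):
    """Fetch rows above the identity high-water mark; return the new mark.

    A strictly increasing key avoids duplicate entries across polls.
    """
    cur = conn.cursor()
    cur.execute(
        "SELECT AuditId, EventTime, LoginName, EventData "
        "FROM dbo.SecurityAudit WHERE AuditId > ? ORDER BY AuditId",
        (last_id,),
    )
    cols = [c[0] for c in cur.description]
    with open(out_path, "a", encoding="utf-8") as f:
        for r in cur.fetchall():
            row = dict(zip(cols, r))
            f.write(render_line(row) + "\n")
            last_id = row["AuditId"]
    return last_id
```

Persist the returned mark between runs (a small state file is enough) so a restart doesn't replay the whole table.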

4. Use this beta product.


Pros:

  • The security team will manage it, which is preferred since this is a security concern, and part of it, IMO, is protecting the data from DBAs as well.

  • Minimal overhead on the DBA team.


Cons:

  • It's beta.

  • No real documentation.


5. Use Splunk DB Connect.

This seems like the best solution. We just have to ensure that not too many people have access to the table or trigger. I will check to see how 'real time' it is.
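If DB Connect does talk to MS SQL Server, its checkpointed ('rising column') inputs boil down to a query like the following, where the `?` placeholder is filled in with the last checkpoint value on each run. Table and column names here are the same hypothetical ones as in option 3, not anything DB Connect defines:

```sql
-- Hypothetical rising-column input query: ? is replaced with the last
-- AuditId seen, so each run picks up only new rows (near real time,
-- no duplicates).
SELECT AuditId, EventTime, LoginName, EventData
FROM dbo.SecurityAudit
WHERE AuditId > ?
ORDER BY AuditId ASC
```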