I'm working on extracting some key/value pairs from DB2's log files. I have a file like this:
[...snip...]
Buffer pool xda writes = 0
Asynchronous pool xda page writes = 0
Total buffer pool read time (millisec) = 66
Total buffer pool write time (millisec) = 0
Total elapsed asynchronous read time = 46
Total elapsed asynchronous write time = 0
Asynchronous data read requests = 3
Asynchronous index read requests = 0
[...snip...]
While I could use EXTRACT commands/regexes for just the specific fields I want, that would be extremely tedious, especially since this spans several sourcetypes.
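For illustration, the per-field EXTRACT approach I'm trying to avoid would mean a separate props.conf entry for every counter, something along these lines (the class names and field names here are just examples):

EXTRACT-bp_read_time = Total buffer pool read time \(millisec\)\s+=\s+(?<Total_buffer_pool_read_time_millisec>\d+)
EXTRACT-async_data_reads = Asynchronous data read requests\s+=\s+(?<Asynchronous_data_read_requests>\d+)

...multiplied by dozens of counters per sourcetype.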
Ideally, I'd like to take advantage of Splunk's "CLEAN_KEYS" setting and have everything extracted automatically, so I can run a search like this without any further configuration:
search {stuff} | timechart avg(Total_buffer_pool_read_time_millisec)
Here's what I have so far (some of the values were copied from other pre-packaged transforms.conf files).
props.conf:
[db2dynsql]
BREAK_ONLY_BEFORE=Number of executions
SHOULD_LINEMERGE=true
KV_MODE=none
REPORT-kv = db2_kv
transforms.conf:
[db2_kv]
CAN_OPTIMIZE = True
CLEAN_KEYS = True
DEFAULT_VALUE =
DEST_KEY =
FORMAT = $1::$2
KEEP_EMPTY_VALS = False
LOOKAHEAD = 4096
MV_ADD = False
REGEX = ([^=]+)\s+=\s+(.*?)
SOURCE_KEY = _raw
WRITE_META = False
Right now, all that gets me is this error message when I run a search:
[splunkhost] Field extractor name=db2_kv is unusually slow (max single event time=1522ms, probes=14 warning max=1000ms)
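In case it's relevant, I assume the pattern itself could be sanity-checked at search time with something like this (it only produces generic "key" and "value" fields rather than dynamically named ones, but it should at least show what the regex captures):

search {stuff} | rex max_match=0 "(?<key>[^=]+)\s+=\s+(?<value>.*?)"

What I'm really after, though, is getting the REPORT-based extraction above to work so the field names come out cleaned automatically.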