Channel: Latest Questions on Splunk Answers

Lookup File Editor Feature Request: Have the app create a backup before modifying the lookup file

Hi Luke, thank you for sharing this app!

I'd like to find out if it is possible to have your app create a backup before modifying the lookup file. Perhaps when the user clicks Save, a backup with the current timestamp could be created:

mylookup.csv -> mylookup.csv_20140821


How to convert a table into a list format?

Hi,

I would like to convert a crosstable into a list.

Date      | A | B 
01.01.2014| 5 | 2
02.01.2014| 5 | 2
03.01.2014| 8 | 9

The output should be:

col1       |col2| col3
01.01.2014 | A | 5
01.01.2014 | B | 2
02.01.2014 | A | 5
02.01.2014 | B | 2
03.01.2014 | A | 8
03.01.2014 | B | 9

Is there an easy way to do this? I tried out the transpose command, but without success.
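
For reference, a minimal sketch of what might work here, assuming the table comes from a search with the fields Date, A, and B: the untable command pivots the column names into a field (it is the inverse of xyseries), which produces exactly this list shape:

... | untable Date col2 col3 | rename Date as col1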

BR

Heinz

How to edit search so delta command does not return negative results?

Hi everyone, I need to create a delta between the count of id today and the count of id yesterday. Search:

search xyz | timechart count span=1d | sort - _time | delta count AS countdiff

Example result:

    _time   count   countdiff
2014-08-26T00:00:00.000+0300    4   
2014-08-25T00:00:00.000+0300    1   -3
2014-08-24T00:00:00.000+0300    0   -1
2014-08-23T00:00:00.000+0300    0   0
2014-08-22T00:00:00.000+0300    0   0
2014-08-21T00:00:00.000+0300    0   0
2014-08-20T00:00:00.000+0300    0   0
2014-08-19T00:00:00.000+0300    0   0
2014-08-18T00:00:00.000+0300    0   0
2014-08-17T00:00:00.000+0300    0   0
2014-08-16T00:00:00.000+0300    0   0
2014-08-15T00:00:00.000+0300    0   0
2014-08-14T00:00:00.000+0300    0   0
2014-08-13T00:00:00.000+0300    0   0
2014-08-12T00:00:00.000+0300    1   1
2014-08-11T00:00:00.000+0300    0   -1

Using delta I was able to create the diff between today and the day before, but with this sort order delta computes yesterday minus today, and I need today minus yesterday. Any thoughts?
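
A minimal sketch of one fix, assuming the search above: timechart emits rows oldest first, so running delta before the descending sort makes each countdiff equal to that day's count minus the previous day's:

search xyz | timechart count span=1d | delta count AS countdiff | sort - _time

Alternatively, keep the original order and flip the sign afterwards with | eval countdiff=-countdiff.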

Handling a session split up between multiple events

One of my VPN log sources is indexed as many separate events, correlated by a session_id. This is making things very difficult for me when I am looking for specific events in the session. Every time I need a field other than what is in that specific log entry, I have to |transaction on the session to grab a field such as an IP or username from another event in that same session.

For example, let's say I want to alert on a user who has 20 failed VPN login attempts within a 10 minute period. Seems pretty easy? Well, it is pretty easy as long as I only want the timestamps, username, and the number of failures. I simply do the following:

sourcetype="VPN" "Authentication Fail" |transaction user maxspan=10m |where eventcount>=20 |eval numFails=eventcount |eval FirstFailure=_time |eval LastFailure=_time+duration |table user, FirstFailure, LastFailure, numFails

So that's not too bad. But now let's say I want the IPs associated with these failures; this gets a lot more complicated. The only way I have found to do it is the following:

sourcetype="VPN" "Authentication Fail" |transaction user maxspan=10m |where eventcount>=20 |eval numFails=eventcount |eval FirstFailure=_time |eval LastFailure=_time+duration |streamstats count as id |stats values(user) as user, values(FirstFailure) as FirstFailure, values(LastFailure) as LastFailure, values(numFails) as numFails, values(session_id) as session_id by id|append [search sourcetype="VPN" |stats values(user) as user values(external_ip) as external_ip by session_id |search user= external_ip=] |stats values(session_id) as session_id values(FirstFailure) as FirstFailure values(LastFailure) as LastFailure list(external_ip) as external_ip by user

As you can see, I have to subsearch the VPN logs and stats on the session_id to pull an external IP back into the search, then stats again on the results to associate the IPs with the sessions I got in the main search. I also realize I could be missing a much easier way of doing this, since I am fairly new to these advanced searches in Splunk.

I am hoping to find suggestions on how to better deal with the sessions so I don't have to run a subsearch any time I need a field from another log in the session. When getting into more complicated searches and correlations, this also slows the search down by a significant amount.
I have provided a sample of the logs below for everyone to take a look at. Thanks in advance!

Aug 26 18:21:35 (session_id): Received User-Agent header:
Aug 26 18:21:35 (session_id): New session from client IP 111.111.111.111
Aug 26 18:22:05 (session_id): Username 'user'
Aug 26 18:22:05 (session_id): SECURID module: authentication with 'user' failed: Authentication failed.
Aug 26 18:22:36 (session_id): Username 'user'
Aug 26 18:22:36 (session_id): Retry Username 'user'
Aug 26 18:22:38 (session_id): Access policy result: Full
Aug 26 18:22:41 (session_id): Assigned PPP IPv4: 101.101.101.101
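
A sketch of one alternative, assuming the field names above: eventstats can copy session-level fields such as external_ip onto every event that shares a session_id before the transaction runs, which avoids the append subsearch (like the append, it still reads all VPN events):

sourcetype="VPN"
| eventstats values(external_ip) as external_ip by session_id
| search "Authentication Fail"
| transaction user maxspan=10m
| where eventcount>=20
| table user, external_ip, _time, duration, eventcount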

Can a field value be used as a macro name?

Can a field value be used as a macro name?

EX)

index=_internal | table sourcetype | `sourcetype`

I have about 500 sourcetypes, and I want to invoke a different macro for each type.

What should I do?
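
Macros are expanded when the search string is parsed, so a field value produced at run time cannot name a macro directly. One hedged sketch uses the map command, which substitutes each row's values into a new search string that is then parsed; this assumes a zero-argument macro exists for every sourcetype name, and maxsearches caps how many rows run:

index=_internal | stats count by sourcetype
| map maxsearches=500 search="search index=_internal sourcetype=\"$sourcetype$\" | `$sourcetype$`"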

Display graph for Top 5 Process for the metric selected by user

Hi all. If the user selects %_Processor_Time, then I need to show a graph of avg(%_Processor_Time) for the top 5 processes that consume %_Processor_Time. I got the top 5 processes using the search below, but I'm unable to add a timechart command to the same query to display the graph.

index=winserver_process sourcetype="PerfmonMk:Process" NOT instance=Idle NOT instance=_Total host="ddweng09" | eval Process=upper(instance) | table Process %_Processor_Time | sort - %_Processor_Time | dedup Process | head 5
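
A minimal sketch of one approach, assuming the same base search: timechart's limit option keeps only the top N series, ranked by the sum of their values (so it approximates rather than exactly matches the head 5 above), which avoids a separate top-5 step:

index=winserver_process sourcetype="PerfmonMk:Process" NOT instance=Idle NOT instance=_Total host="ddweng09"
| eval Process=upper(instance)
| timechart limit=5 useother=f avg(%_Processor_Time) by Process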

Windows - How to monitor XML files within a sub-directory

I want to monitor XML files residing inside sub-directories.

Files inside Path :

D:\Roll\DIP\SessionLogs\35\1.xml
D:\Roll\DIP\SessionLogs\35\2.xml
D:\Roll\DIP\SessionLogs\35\3.xml
D:\Roll\DIP\SessionLogs\36\1.xml
D:\Roll\DIP\SessionLogs\36\2.xml
D:\Roll\DIP\SessionLogs\36\3.xml

I set inputs.conf (on the universal forwarder):

[monitor://D:\Roll\DIP\SessionLogs\]
index = myindex
sourcetype = session_log

props.conf (on the indexer):

[session_logs]
KV_MODE = xml

I don't get the logs on the search head. Am I missing something here?
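
For reference, a sketch of an inputs.conf that restricts the monitor (which recurses into subdirectories by default) to .xml files; whitelist is a regex matched against the full file path:

[monitor://D:\Roll\DIP\SessionLogs\]
index = myindex
sourcetype = session_log
whitelist = \.xml$

Note also that the props.conf stanza name ([session_logs]) must exactly match the sourcetype assigned in inputs.conf (session_log) for the KV_MODE setting to apply.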

Using transaction or stats to do event correlation like Vlookup?

Hi All,

In my scenario, I have a batch of events that are for a particular Event Code, sorted by time. The fields included in this Event are Account Name, Computer Name, and Account Domain.

There is a separate batch of events for another Event Code, sorted by time. And these fields include Account Name, Computer Name and Client Address.

Basically, I would like to run a search for the first scenario, but I'd also like to include the corresponding Client Address from the second scenario if the Account Name matches (for events that occurred at relatively the same time).

Any ideas or tips on how to go about this would be greatly appreciated!
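
A hedged sketch of a stats-based approach (the EventCode values are placeholders for the two codes described above, and bucketing _time approximates "relatively the same time"):

(EventCode=1111 OR EventCode=2222)
| bin _time span=5m
| stats values(Computer_Name) as Computer_Name values(Account_Domain) as Account_Domain values(Client_Address) as Client_Address by Account_Name, _time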


Are there options other than using comma separation for server lists in wmi.conf input stanzas?

Hi,

Is there anybody out there who has experience defining WMI access from a Splunk heavy forwarder to a rather large number of Windows servers for collecting System and Application event logs?

We are currently doing it, and it works like a charm, but until now it's only been done for a small number of servers. This is about to change...

So, to the question then: are there really no other possibilities than using comma separation for the server setting in the input-specific settings stanzas? E.g.:

[WMI:SystemAndApplicationLog]
index=TestServers
server = host1,host2,host3,host4,host5,host6,host7,host9,host11,host8,host12,host14,host20,host15,host13
current_only = 0

It would have been nice to be able to use wildcards, like

server = *.domain.name.

This is something we have to do out of necessity rather than out of choice.

Any input appreciated.

Rgds,

JVS

PS! The server setting can also only contain a string of less than 10,000 bytes.

Can a searchTemplate in a form use report acceleration?

I have a form dashboard in SimpleXML that has a searchTemplate that references a saved search, but does some extra processing on top of it using a text input:

<searchTemplate> 
    | savedsearch "Fancy Accelerated Search" 
    | search area=$area$ 
    | streamstats dc(object_id) as objects by date_wday type 
</searchTemplate>
<fieldset>
    <input type="text" token="area">
    </input>
</fieldset>

This gives a distinct count over time, by day, for a specific "area" of objects. I then have charts that use searchPostProcess to do some filtering and charting of the results from the saved search:

<searchPostProcess> where type="package" | timechart max(objects)</searchPostProcess>

And this gives a kind of sawtooth-pattern graph.

So the problem is that the search performed by the dashboard and the same search triggered by clicking the magnifying glass at the bottom of the chart panel don't take the same amount of time. At all!

The search in the dashboard does not use the accelerated report from the searchTemplate, resulting in much slower searches. Triggering the search by clicking the magnifying glass and running that in the Search UI does use the summary (I've checked with the Job Inspector that a summary_id was referenced).

  • Is it possible to use acceleration of a saved search in a searchTemplate with searchPostProcess?

How to find the difference between two times in date-time format?

Hi, I have two fields A and B with the time format 1/07/2014 3:41:12 PM. Please let me know how to find the difference between A and B in hours with this format.

e.g., if A is 1/07/2014 3:41:12 PM and B is 2/07/2014 2:41:12 PM, B-A should be 23 hours
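
A minimal sketch, assuming the format is day/month/year with a 12-hour clock: strptime converts each string to epoch seconds, and the difference divided by 3600 gives hours:

| eval A_epoch = strptime(A, "%d/%m/%Y %I:%M:%S %p")
| eval B_epoch = strptime(B, "%d/%m/%Y %I:%M:%S %p")
| eval hours = round((B_epoch - A_epoch) / 3600, 2)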

Embed a dashboard in a webpage

Hi, I have been looking through all of the questions and answers on this subject, and a lot of them seem to be outdated. I have successfully embedded a report in a webpage; however, I would like users to then be able to click on a link and see the whole dashboard. Is there any way to easily embed a dashboard in a webpage and have it update, say, every hour?

rfc5424_syslog

Hi,

I installed the "Security Intelligence for Vormetric Data Firewall (TM)" app on our running Splunk system, and I want to use the predefined tcp://5524 source.

inputs.conf

[tcp://5514]
disabled = false
index = myindex
connection_host = dns
sourcetype = rfc5424_syslog

If I now try to search for the sourcetype "rfc5424_syslog", I get no results. A search on source=tcp:5541 shows the sourcetype "syslog" for the Vormetric data.

Does Splunk overwrite the sourcetype? Why is it syslog and not rfc5424_syslog? In inputs.conf, the sourcetype is correct. Because of this issue, the Vormetric app doesn't work.

I hope somebody has an idea. Thanks in advance.

Regards, Arne

Why am I getting warning "deployment client explicitly disabled through config"?

I have tried restarting everything (forwarder, deploy settings, deployment server) to no avail. I have tested that the machine in question (a Windows box) can connect to the deployment port with ncat. I have enabled it through the configuration files and the command line, to no avail.

I am horribly confused and at a loss on how to troubleshoot this.


deploymentclient.conf

 [deployment-client]
 disabled = false
 clientName = my--custom--name

 [target-broker:deploymentServer]
 targeturi = myserver.com:8089

splunkd.log

08-28-2014 18:59:20.086 -0400 INFO  LMTracker - Setting feature=DeployClient state=ENABLED (featureStatus=1)
...
08-28-2014 18:59:24.095 -0400 INFO  DS_DC_Common - Initializing the PubSub system.
08-28-2014 18:59:24.095 -0400 INFO  DS_DC_Common - Initializing core facilities of PubSub system.
08-28-2014 18:59:29.929 -0400 WARN  DC:DeploymentClient - DeploymentClient explicitly disabled through config.
08-28-2014 18:59:29.929 -0400 INFO  DS_DC_Common - Deployment Client not initialized.

Edit

There is no license attached to these forwarders, if that matters.
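
One hedged troubleshooting step: btool prints the merged on-disk configuration and, with --debug, the file each line comes from, which can reveal another deploymentclient.conf overriding this one:

splunk btool deploymentclient list --debug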


How to use the default SSL certificates to encrypt data between a Splunk server and a universal forwarder

Hi Splunkers,

I am trying to encrypt my data in a lab to learn this feature. I need to apply it for my financial customer, who has critical data. In this case, I am using the default Splunk certificates to test, located in C:\Program Files\Splunk\etc\auth

|| Splunk Server Windows 127.0.0.1:9998 || <---DATA ENCRYPTED--- || Universal Forwarder Windows ||

Universal Forwarder (Windows), C:\Program Files\SplunkUniversalForwarder\etc\system\local\outputs.conf:

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
compressed = true
requireClientCert = false
server = 127.0.0.1:9998
sslCertPath = C:\Program Files\Splunk\etc\auth\server.pem
sslPassword = password
sslRootCAPath = C:\Program Files\Splunk\etc\auth\cacert.pem

Splunk Server (Windows), C:\Program Files\Splunk\etc\apps\search\local\inputs.conf:

[splunktcp-ssl:9998]
connection_host = ip
compressed = true

[SSL]
serverCert = C:\Program Files\Splunk\etc\auth\server.pem
rootCA = C:\Program Files\Splunk\etc\auth\cacert.pem
requireClientCert = false
password = password

When I did a search, I didn't see data in my Splunk.

Anyone have any ideas?
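
One hedged way to verify that the indexer's SSL listener is actually up (openssl ships in Splunk's bin directory, and the address comes from the config above; a successful handshake prints the certificate chain):

splunk cmd openssl s_client -connect 127.0.0.1:9998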

Cheers!

How to add custom message in place of "Search is waiting for input" for a dashboard panel?

I have a dashboard in which a panel waits to load until a selection is made and the tokens are passed.

Until then, the panel displays a warning sign with

"Search is waiting for input... "

Is there a way to customize this message? I want to display a more meaningful message.

The dashboard is developed in Django.

How to blacklist an IP from being indexed for Splunk for Palo Alto Networks?

Hello,

We have some PA devices in our network sending data to our master indexer over UDP:515. This data is being indexed fine, but one of the monitored networks is a guest network, and it is sending a lot of extra information that we would like not to index.

I've attempted to set a transform and property, but all that did was completely eliminate all new data, so I reverted that change.

Here's the inputs.conf:

[udp://515]
connection_host = ip
sourcetype = pan_log
no_appending_timestamp = true
index = pan_logs

The transforms.conf and props.conf exist in the defaults directory and are the defaults that came with the app.

I know you can modify all of the dashboards to include an exception to not include the results in searches, but the requester is asking to modify the data before it's indexed.

Anyone have any ideas on how to do this?
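
For reference, the documented pattern for dropping events before indexing routes them to the nullQueue; a sketch, assuming the guest network is 10.20.30.0/24 (the regex and stanza name are placeholders, and these settings belong in a local/ directory rather than default/):

props.conf:

[pan_log]
TRANSFORMS-null = drop_guest_net

transforms.conf:

[drop_guest_net]
REGEX = 10\.20\.30\.\d+
DEST_KEY = queue
FORMAT = nullQueue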

How to override the sourcetype of events within the same source based on the event format?

I'm trying to override the sourcetype of events within the same source (for now, a file uploaded once and indexed - once I get it figured out, the source will be a scripted input from universal forwarders). I need to override the sourcetype of events in a source, based on the format of the event. If the event contains the word "share" (in a certain place) I'd like the source type to be "share"; likewise "dir" and "ext". The events' sourcetype defaults to "ext" (since this is the bulk of the events).

I am using Splunk documentation as a reference.

Here is a sample of the source data:

2014-08-11 22:14:54Z,foo900.example.com,share,seed,g:\seed,Disk,"General testing."
2014-08-11 17:14:54Z,foo900.example.com,dir,\\foo900.example.com\seed,182445977979,2014-07-17 17:00:28Z,2011-02-15 23:20:45Z
2014-08-11 17:14:54Z,foo900.example.com,ext,\\foo900.example.com\seed,.sgy,163108239992

Here is the props.conf:

[ext]
TRANSFORMS-change_sourcetype = transform_ext_sourcetype
SHOULD_LINEMERGE = false
REPORT-ext = transform_ext, transform_dir, transform_share

Here is the transforms.conf:

[transform_ext_sourcetype]
DEST_KEY = MetaData:Sourcetype
REGEX = [^,]+,[^,]+,(share|dir|ext),
FORMAT = sourcetype::$1

[transform_ext]
REGEX = (?<datetime>[^,]+),(?<hostname>[^,]+),ext,(?<share>[^,]+),(?<file_ext>[^,]*),(?<bytes_used>[^$]+)$

[transform_share]
REGEX = (?<datetime>[^,]+),(?<hostname>[^,]+),share,(?<share_name>[^,]+),(?<path>[^,]*),(?<share_type>[^,]+),(?<share_remarks>[^$]+)$

[transform_dir]
REGEX = (?<datetime>[^,]+),(?<hostname>[^,]+),dir,(?<share>[^,]+),(?<bytes_used>[^,]+),(?<last_access_datetime>[^,]+),(?<creation_datetime>[^$]+)$
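
One hedged note on the search-time side: the sourcetype rewrite happens at index time, but REPORT- extractions are applied by the event's final sourcetype, so the share and dir extractions may need their own props.conf stanzas, e.g.:

[share]
REPORT-share = transform_share

[dir]
REPORT-dir = transform_dir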

How to audit files in Splunk by monitoring security events on Windows Server 2012?

Hi Splunkers,

I need help auditing some files in Microsoft Windows 2012, files like C:\Windows\System32\drivers\etc\hosts, .dlls, and so on. At the moment, I want to monitor the files, for example: Who deleted this file? Who changed this file?

I am having problems understanding the security logs in Windows. Is there any way to solve my problem? Do you have any ideas about that?
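
A hedged sketch, assuming object-access auditing is enabled on the files (a SACL plus the audit policy) and the Security event log is collected into Splunk; the index and field names are assumptions based on the Splunk Add-on for Windows, with EventCode 4663 covering access attempts and 4660 covering deletions:

index=wineventlog sourcetype="WinEventLog:Security" (EventCode=4663 OR EventCode=4660) Object_Name="*hosts"
| table _time, Account_Name, Object_Name, Accesses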

Cheers!
