Channel: Latest Questions on Splunk Answers

AD Field Dates Converting and Searching

Hello All

New to Splunk and I would like a bit of guidance on dealing with Active Directory attributes that have dates, such as accountExpires and pwdLastSet.

For example, this works well:

source="ActiveDirectory" AND accountExpires="12:00.00 AM, Tue 01/01/2013" AND accountExpires>0 | dedup name | search userAccountControl="512"

However, I would really like to see everything that expires prior to this date. "<" does not work, because I suspect Splunk sees this value as a string.
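
One approach (just a sketch, assuming accountExpires is extracted as a plain string in exactly the format shown) is to parse the value into epoch time with strptime and then compare numerically:

source="ActiveDirectory" userAccountControl="512" accountExpires="*"
| eval expires_epoch=strptime(accountExpires, "%I:%M.%S %p, %a %m/%d/%Y")
| where expires_epoch > 0 AND expires_epoch < strptime("12:00.00 AM, Tue 01/01/2013", "%I:%M.%S %p, %a %m/%d/%Y")
| dedup name

The expires_epoch field name is arbitrary; once the string is converted to a number, "<" comparisons behave as expected.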

Anyone have some examples of efficient ways to accomplish what I am looking for?

TY


Assigning a max value from one field as a new field

I am attempting to write a search that creates arbitrary "buckets" for qualifying events using a numeric code (1-5). For this particular search, I'm calculating the number of orders that individuals placed during a specific window of time, then I need to take the maximum number of orders by ANY of the individuals, divide that by 5, and then assign a classification to all of the individuals based on how many orders they placed.

Example:

User Orders Code
   1      2    1
   2      5    3
   3     10    5
   4      4    2

Because of the format that the data is in, finding out how many orders each user has is really complicated and I really don't want to have to run it twice.

To get the largest number of orders in the dataset, I run:

... | stats max(Orders) as MaxOrders

Unfortunately, once I use a stats command, I can't go back and run calculations against the dataset without doing a join or append search that would run ALL the data through the calculations to find out how many orders they have all over again.

I've tried using appendpipe:

... | appendpipe [ stats max(Orders) as MaxOrders ]

This creates a whole new event where everything is null except the "MaxOrders" column. But then I need to create buckets (something like this):

... | eval Buckets=(MaxOrders/5) | 
eval Bucket1=Buckets | 
eval Bucket2=(Bucket1+Buckets) | 
eval Bucket3=(Bucket2+Buckets) | 
eval Bucket4=(Bucket3+Buckets) | 
eval Bucket5=(Bucket4+Buckets)

At this point, I have the one extra event at the end of my dataset which contains the values of MaxOrders, Buckets, Bucket1, Bucket2, Bucket3, Bucket4, and Bucket5.

After the buckets are created, I need to compare Orders (for each user) against the buckets to assign the user their code, which requires an eval if statement:

... | eval Code=if(Orders<=Bucket1, 1, if(Orders<=Bucket2, 2, if(Orders<=Bucket3, 3, if(Orders<=Bucket4, 4, 5))))

But since the values for Bucket1, Bucket2, Bucket3, Bucket4, and Bucket5 only exist in that one event (not in all the events), I get 5 for all the users' Code value.

Is there a way to get the values for Bucket1, Bucket2, Bucket3, Bucket4, and Bucket5 to exist in all the records without having to do a join/append to rerun the calculation for Orders all over again?
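
One possibility (a sketch only, assuming Orders has already been calculated per user by this point in the pipeline) is eventstats, which writes the aggregate onto every event instead of collapsing the results the way stats does:

... | eventstats max(Orders) as MaxOrders
| eval Buckets=MaxOrders/5
| eval Code=if(Orders<=Buckets, 1, if(Orders<=2*Buckets, 2, if(Orders<=3*Buckets, 3, if(Orders<=4*Buckets, 4, 5))))

Because every event keeps its original fields plus MaxOrders, the if() comparison sees the bucket boundaries on every row, with no join or append rerun needed.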

CSV File Issues

Hello, I am having issues with CSV files imported from an S3 bucket. The files get imported and indexed fine; however, when I search what has been indexed, I get something like this:

x95x90@x12xb1xe4;g.xa6xa3xfdx99xdfx88Ixc4cx08xdbx03x00x00\

Is there something I'm missing that prevents the simple CSV file from being read properly? Thanks for your help. Cesar

Splunk_TA_nix install from deployment server

Hi,

I have 9 universal forwarders where I want to install Splunk_TA_nix from the deployment server.

Please let me know if it is possible to install it from the deployment server, or if I have to go to each forwarder one by one to install it.
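
For reference, a minimal sketch of the deployment-server side (assuming Splunk_TA_nix has been copied into $SPLUNK_HOME/etc/deployment-apps/ on the deployment server, and that each forwarder has been pointed at it with splunk set deploy-poll <deployment_server>:8089; the server class name and whitelist below are placeholders):

# serverclass.conf on the deployment server
[serverClass:nix_forwarders]
whitelist.0 = *

[serverClass:nix_forwarders:app:Splunk_TA_nix]
restartSplunkd = true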

The Answer to the Ultimate Question of Life, the Universe, and Everything

Anyone else notice that the default value under the visualization option of "single value" is listed as 42?

en.wikipedia.org/wiki/42_%28number%29#The_Hitchhiker.27s_Guide_to_the_Galaxy

Good job developers :)

Not seeing additional fields when using the browscap TA

Not sure why: everything looks installed correctly, but I am not seeing the additional fields when running this search: sourcetype="iisw3c" | lookup browscap_lookup http_user_agent

We are running Splunk version 5.x. Could it be a version issue?
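
One quick sanity check (a sketch, assuming browscap_lookup is a file-based lookup definition; if it is a scripted/external lookup, inputlookup will not apply) is to confirm the lookup itself returns rows:

| inputlookup browscap_lookup | head 5

If that returns nothing or errors, the problem is more likely lookup permissions or app context than the search itself.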

Mixed version environment possible?

Is it possible to have a Splunk environment with a mix of 5.0.x and 6.0.x versions?

Specifically, have all ES components (master node, indexers, search heads) running the currently compatible 5.0.x version, and have other components (license master, deployment server, universal forwarders, and other apps) running 6.0.x.


Alt-Click not working on selected fields

When I have an event with a selected field value that I want to eliminate, ALT-clicking on the value in the selected field adds it to the search rather than removing it.

For example, I ALT-click machine1 in the selected field host = machine1.

This should add NOT host="machine1" to the search. Instead, it just adds host="machine1". I believe this has been happening since I upgraded to Splunk 6.

Any ideas as to why this happens? If I ALT-click a field in the actual event, it works fine.

How to troubleshoot real-time alerts not working?

Hello, I am having a hard time trying to pin down why most of my real-time alerts have stopped working. I have looked into scheduler.log and python.log and did not find many insightful details about the problem. Here are the symptoms:

  1. Only real-time alerts appear to not fire
  2. Non-real-time alerts appear to be fine, as I am still getting alert emails
  3. Once Splunk is restarted, some real-time alerts appear to fire; then they eventually stop
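
One place to start (a sketch only; field names can vary slightly by version) is the scheduler's internal logs, which show whether the alert searches are still being dispatched and what status they finish with:

index=_internal sourcetype=scheduler savedsearch_name=*
| stats count by savedsearch_name, status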

Applying time modifier (earliest and latest) to multiple searches?

Hi!

Is it possible to do something like the below?

If I have 5 searches:

search A search B search C search D search E

and I specify a time modifier, for example earliest=-2d@d latest=-1d@d, is it possible to apply the time modifier to all of the searches at once and join them?

So what I have in mind is:

earliest=-2d@d latest=-1d@d | join [ search search A] | join [ search search B] | join [ search search C] | join [ search search D] | join [ search search E]

I want the time modifier to be applied as input to the join for each search.
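
If it helps, subsearches used with join normally inherit the time range of the outer search unless they specify their own earliest/latest, so a sketch along these lines (joining on whatever field the searches share; the field name here is a placeholder) may already do what is described:

earliest=-2d@d latest=-1d@d search A
| join common_field [ search search B ]
| join common_field [ search search C ]
| join common_field [ search search D ]
| join common_field [ search search E ]

Each subsearch should run over the same -2d@d to -1d@d window without repeating the time modifier inside the brackets.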

Thanks, Yu

Okta APP will not pull in data

Hi there,

I have requested access to the OKTA API and tested it with curl which works. Then I set up the Splunk App for Okta according to the documentation but it does not pull in any data. Any help would be greatly appreciated!

Rogier

Using an existing OSSEC app with ES

I have an install of "Reporting and Management for OSSEC" working nicely now. Now that we have purchased ES and want to start deploying it, I'm a little lost on how, or if it's even possible, to use the existing OSSEC install with ES.

Can I just make the existing Reporting and Management for OSSEC a heavy forwarder to ES and classify the data as ossec on the ES server?
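
A sketch of the forwarding side (assuming the existing OSSEC instance is converted to a heavy forwarder and the ES indexer is listening on the default receiving port; the hostname is a placeholder) would be an outputs.conf such as:

# outputs.conf on the existing OSSEC/heavy forwarder instance
[tcpout]
defaultGroup = es_indexers

[tcpout:es_indexers]
server = es-indexer.example.com:9997

The sourcetype classification (e.g. ossec) is normally assigned by the inputs/props on the forwarding side rather than on the receiving ES indexer.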

CERN HTTPD Access Control Bypass (Splunkd service)

Vulnerability scanning software returned the following result for a handful of systems in my environment:

"There exists a vulnerability in the CERN web server running on this host that could allow an attacker to gain access to sensitive files on the system. Service: Splunkd CVSSv2: AV:N/AC:L/Au:P/C:N/I:N/A:N (Base Score:5.00)

Remediation Action: Filter out input such as '//' and '/./' from page requests."

Has anyone run across something similar? I'm assuming the service is needed for the Universal Forwarder, but I'm not sure why only a few systems are reporting this vulnerability and not all of them. The hosts in question are running Windows Server 2012.

Wildcard certificate and PDF 1.3 failed to generate PDF: 400 Bad Request

PDF 1.3 under Splunk 4.3.3 was working fine until I replaced the current cert with a new wildcard certificate.

I get the email alert but instead of the expected results, the contents have the error message:

An error occurred while generating a PDF of this report: Failed to generate PDF: Appserver failed to dispatch report request to /services/pdfserver/renderpdf: 400 Bad Request

The python.log also reports the following:

XMLSyntaxError: PCDATA invalid Char value 26, line 164, column 1
2012-09-04 17:00:41,204 ERROR PCDATA invalid Char value 26, line 164, column 1
Traceback (most recent call last):
  File "/opt/splunk/bin/rest_handler.py", line 84, in <module>
    print splunk.rest.dispatch(**params)
  File "/opt/splunk/lib/python2.7/site-packages/splunk/rest/__init__.py", line 52, in dispatch
    requestXml = et.fromstring(requestInfo)
  File "lxml.etree.pyx", line 2532, in lxml.etree.fromstring (src/lxml/lxml.etree.c:48634)
  File "parser.pxi", line 1545, in lxml.etree._parseMemoryDocument (src/lxml/lxml.etree.c:72245)
  File "parser.pxi", line 1424, in lxml.etree._parseDoc (src/lxml/lxml.etree.c:71106)
  File "parser.pxi", line 938, in lxml.etree._BaseParser._parseDoc (src/lxml/lxml.etree.c:67875)
  File "parser.pxi", line 539, in lxml.etree._ParserContext._handleParseResultDoc (src/lxml/lxml.etree.c:64257)
  File "parser.pxi", line 625, in lxml.etree._handleParseResult (src/lxml/lxml.etree.c:65178)
  File "parser.pxi", line 565, in lxml.etree._raiseParseError (src/lxml/lxml.etree.c:64521)
XMLSyntaxError: PCDATA invalid Char value 26, line 164, column 1
2012-09-04 17:00:41,258 ERROR An error occurred while generating a PDF of this report: Failed to generate PDF: Appserver failed to dispatch report request to /services/pdfserver/renderpdf: 400 Bad Request
2012-09-04 17:00:41,259 DEBUG simpleRequest > GET https://127.0.0.1:8089/services/search/jobs/scheduler__admin__search_ZGF2ZWgtcGRmdGVzd_at_1346778000_662428414fef0f15?message_level=warn [] sessionSource=direct
2012-09-04 17:00:41,274 DEBUG simpleRequest < server responded status=200 responseTime=0.0149s
2012-09-04 17:00:41,274 DEBUG getStatus - elapsed=0.0153450965881 nextRetry=0.0500000289067
2012-09-04 17:00:58,024 INFO Sending email. subject="Splunk Alert: booboos-pdftest", results_link="https://mycompany.com:443/app/search/@go?sid=scheduler__admin__search_ZGF2ZWgtcGRmdGVzd_at_1346778000_662428414fef0f15", recepients="['yogibear@mycompany.com']"

splunkd keeps on crashing (crashing thread: archivereader)

Hey, I am currently experiencing severe problems with my Splunk installation since splunkd repeatedly crashes right after starting Splunk. Here's the output of the respective crash log file:

[build 182037] 2013-10-30 23:01:39
Received fatal signal 6 (Aborted).
 Cause:
   Signal sent by PID 8918 running under UID 0.
 Crashing thread: archivereader
 Registers:
    RIP:  [0x00007F7208496037] gsignal + 55 (/lib/x86_64-linux-gnu/libc.so.6)
    RDI:  [0x00000000000022D6]
    RSI:  [0x0000000000002441]
    RBP:  [0x00007F72085E5578]
    RSP:  [0x00007F7201FEC008]
    RAX:  [0x0000000000000000]
    RBX:  [0x00007F72097EC000]
    RCX:  [0xFFFFFFFFFFFFFFFF]
    RDX:  [0x0000000000000006]
    R8:  [0xFEFEFEFEFEFEFEFF]
    R9:  [0x00007F720983FF60]
    R10:  [0x0000000000000008]
    R11:  [0x0000000000000202]
    R12:  [0x0000000001299678]
    R13:  [0x000000000129A300]
    R14:  [0x00007F720638AB20]
    R15:  [0x00007F72060743DB]
    EFL:  [0x0000000000000202]
    TRAPNO:  [0x0000000000000000]
    ERR:  [0x0000000000000000]
    CSGSFS:  [0x0000000000000033]
    OLDMASK:  [0x0000000000000000]

 OS: Linux
 Arch: x86-64

 Backtrace:
  [0x00007F7208496037] gsignal + 55 (/lib/x86_64-linux-gnu/libc.so.6)
  [0x00007F7208499698] abort + 328 (/lib/x86_64-linux-gnu/libc.so.6)
  [0x00007F720848EE03] ? (/lib/x86_64-linux-gnu/libc.so.6)
  [0x00007F720848EEB2] ? (/lib/x86_64-linux-gnu/libc.so.6)
  [0x000000000083AA16] _ZN17ArchiveCrcChecker21seekAndComputeSeekCrcEv + 598 (splunkd)
  [0x000000000083D345] _ZN17ArchiveCrcChecker5writeEPKcm + 357 (splunkd)
  [0x0000000000AA0717] _ZN14ArchiveContext7processERK8PathnameP13ISourceWriter + 855 (splunkd)
  [0x0000000000AA0E95] _ZN14ArchiveContext9readFullyEP13ISourceWriterRb + 1221 (splunkd)
  [0x000000000083CFA2] _ZN16ArchiveProcessor20haveReadAsNonArchiveE14FileDescriptorlPK3Str + 578 (splunkd)
  [0x000000000083EE53] _ZN16ArchiveProcessor4mainEv + 2755 (splunkd)
  [0x0000000000D81A2D] _ZN6Thread8callMainEPv + 61 (splunkd)
  [0x00007F720882EF8E] ? (/lib/x86_64-linux-gnu/libpthread.so.0)
  [0x00007F7208558E1D] clone + 109 (/lib/x86_64-linux-gnu/libc.so.6)
 Linux / ubuntuSplunkHost / 3.8.0-19-generic / #29-Ubuntu SMP Wed Apr 17 18:16:28 UTC 2013 / x86_64
 Last few lines of stderr (may contain info on assertion failure, but also could be old):
    2013-10-30 23:01:25.780 +0100 splunkd started (build 182037)
    Cannot open manifest file inside "/opt/splunk/var/lib/splunk/audit/db/db_1383170465_1383170464_24/rawdata": No such file or directory
    splunkd: /opt/splunk/p4/splunk/branches/6.0.0/src/pipeline/input/ArchiveProcessor.cpp:1044: bool ArchiveCrcChecker::seekAndComputeSeekCrc(): Assertion `(file_offset_t)_seekPtr >= dp->curPos()' failed.
    2013-10-30 23:01:34.813 +0100 splunkd started (build 182037)
    Cannot open manifest file inside "/opt/splunk/var/lib/splunk/audit/db/db_1383170486_1383170485_25/rawdata": No such file or directory
    splunkd: /opt/splunk/p4/splunk/branches/6.0.0/src/pipeline/input/ArchiveProcessor.cpp:1044: bool ArchiveCrcChecker::seekAndComputeSeekCrc(): Assertion `(file_offset_t)_seekPtr >= dp->curPos()' failed.

 /etc/debian_version: wheezy/sid
Last errno: 0
Threads running: 40
argv: [splunkd -p 8089 start]
Thread: "archivereader", did_join=0, ready_to_run=Y, main_thread=N
First 8 bytes of Thread token @0x7f7206074230:
00000000  00 d7 ff 01 72 7f 00 00                           |....r...|
00000008

x86 CPUID registers:
         0: 0000000D 756E6547 6C65746E 49656E69
         1: 000306A9 00010800 9E982203 0FABFBFF
         2: 76035A01 00F0B2FF 00000000 00CA0000
         3: 00000000 00000000 00000000 00000000
         4: 00000000 00000000 00000000 00000000
         5: 00000000 00000000 00000000 00000000
         6: 00000077 00000002 00000009 00000000
         7: 00000000 00000000 00000000 00000000
         8: 00000000 00000000 00000000 00000000
         9: 00000000 00000000 00000000 00000000
         A: 07300401 0000007F 00000000 00000000
         B: 00000000 00000000 0000005D 00000000
         C: 00000000 00000000 00000000 00000000
         D: 00000000 00000000 00000000 00000000
  80000000: 80000008 00000000 00000000 00000000
  80000001: 00000000 00000000 00000001 28100800
  80000002: 20202020 20202020 65746E49 2952286C
  80000003: 726F4320 4D542865 37692029 3737332D
  80000004: 50432030 20402055 30342E33 007A4847
  80000005: 00000000 00000000 00000000 00000000
  80000006: 00000000 00000000 01006040 00000000
  80000007: 00000000 00000000 00000000 00000100
  80000008: 00003028 00000000 00000000 00000000
terminating...

I already tried repairing/rebuilding indexes and buckets as explained here: http://docs.splunk.com/Documentation/Splunk/6.0/Indexer/HowSplunkstoresindexes but with no success yet. After every restart, splunkd crashes again. The error message is almost the same, but the line "Cannot open manifest file inside "/opt/splunk/var/lib/splunk/audit/db/db_1383170486_1383170485_25/rawdata": No such file or directory" keeps changing; the bucket number at the end of the directory name (24, 25, ...) increases after every crash.

Can anybody help solve this issue?

Thanks and regards, Flo

Administration App for NetWitness not working when pulling from Broker

I am getting the following two errors when trying to connect to broker.

<urlopen error _ssl.c:494: The handshake operation timed out>

Displaying results table in tab switcher tab, BEFORE clicking on drilldown field in panel above

I have a dashboard with two panels. The first panel contains a drilldown table. When a value is clicked, the second panel shows three tabs with different searches, each filtered by the clicked item. The drilldown, intention, etc. all work fine. The problem is that before you click, the second panel is hidden, or blank. The customer would like default search results to be shown there (as if a value had already been clicked!). I unfortunately cannot use Sideview Utils and all its wonderfulness :(

This is the XML (partial) for the dashboard:

<module name="StaticContentSample" layoutpanel="panel_row2_col1_grp1">
  <param name="text"><h1>Library</h1></param>
</module>
<module name="HiddenSavedSearch" layoutpanel="panel_row2_col1_grp1" group=" " autorun="True">
  <param name="savedSearch">remote_level1_lib</param>
  <module name="ModifiedSimpleResultsTable">
    <param name="drilldown">all</param>
    <param name="showResetButton">false</param>
    <param name="displayRowNumbers">False</param>
    <module name="EnablePreview">
      <param name="enable">True</param>
      <param name="display">False</param>
      <module name="ConvertToIntention">
        <param name="intention">
          <param name="name">addterm</param>
          <param name="arg">
            <param name="libname">$click.value$</param>
          </param>
        </param>
        <module name="SimpleResultsHeader" layoutpanel="panel_row4_col1">
          <param name="entityName">results</param>
          <param name="headerFormat">Details for Remote Monitor Library $click.value$</param>
        </module>
        <module name="TabSwitcher" layoutpanel="panel_row4_col1">
          <param name="mode">independent</param>
          <param name="selected">Subsystems</param>
          <module name="HiddenSearch" layoutpanel="panel_row4_col1_grp1" group="Subsystem" autorun="True">
            <param name="search">mysearch1here</param>
            <module name="Paginator">
              <param name="count">25</param>
              <param name="entityName">results</param>
              <param name="maxPages">10</param>
              <module name="HiddenFieldPicker">
                <param name="strictMode">True</param>
                <module name="ModifiedSimpleResultsTable" layoutpanel="panel_row4_col1">
                  <param name="showResetButton">false</param>
                  <param name="allowTransformedFieldSelect">True</param>
                  <module name="ModifiedViewRedirector">
                    <param name="viewTarget">flashtimeline</param>
                  </module>
                </module>
              </module>
            </module>
          </module>
          <module name="HiddenSearch" layoutpanel="panel_row4_col1_grp2" group="Functional" autorun="True">
            <param name="search">mysearch2here</param>
            <module name="Paginator">
              <param name="count">25</param>
              <param name="entityName">results</param>
              <param name="maxPages">10</param>
              <module name="HiddenFieldPicker">
                <param name="strictMode">True</param>
                <module name="ModifiedSimpleResultsTable" layoutpanel="panel_row4_col1">
                  <param name="showResetButton">false</param>
                  <param name="allowTransformedFieldSelect">True</param>
                  <module name="ModifiedViewRedirector">
                    <param name="viewTarget">flashtimeline</param>
                  </module>
                </module>
              </module>
            </module>
          </module>
          <module name="HiddenSearch" layoutpanel="panel_row4_col1_grp3" group="BIT" autorun="True">
            <param name="search">mysearch3here</param>
            <module name="Paginator">
              <param name="count">25</param>
              <param name="entityName">results</param>
              <param name="maxPages">10</param>
              <module name="HiddenFieldPicker">
                <param name="strictMode">True</param>
                <module name="ModifiedSimpleResultsTable" layoutpanel="panel_row4_col1">
                  <param name="showResetButton">false</param>
                  <param name="allowTransformedFieldSelect">True</param>
                  <module name="ModifiedViewRedirector">
                    <param name="viewTarget">flashtimeline</param>
                  </module>
                </module>
              </module>
            </module>
          </module>
        </module>
      </module>
    </module>
  </module>
</module>
</view>

Search generated too much data...

Has anyone run into this message?

"Search generated too much data for the current display configuration, results have been truncated"

The search is for collecting and grouping latency times (spent).

source="/opt/splunk/var/log/splunk/web_access.log"
| eval dum=case(spent==0, spent) | eval 0-99(ms)=case(spent>=0 AND spent<=99, spent) | eval 100-199(ms)=case(spent>=100 AND spent<=199, spent) | eval 200-299(ms)=case(spent>=200 AND spent<=299, spent) | eval 300-399(ms)=case(spent>=300 AND spent<=399, spent) | eval 400-499(ms)=case(spent>=400 AND spent<=499, spent) | eval over500(ms)=case(spent>=500, spent) | table spent 0-99(ms) 100-199(ms) 200-299(ms) 300-399(ms) 400-499(ms) over500(ms)
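
As an aside, a more compact way to group the latencies (a sketch only; it changes the shape of the output rather than tuning any display limit) is to bucket spent numerically and count per range:

source="/opt/splunk/var/log/splunk/web_access.log" spent=*
| bin spent span=100
| stats count as requests by spent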

How to get the row values of the table using TableView.BaseCellRenderer

I have a requirement to get all of the row values of a table at once, store them in an array, and use them to specify conditions. Currently, I am only able to retrieve a single table cell value.

Can anyone help with how to get the values into an array and use the array elements?
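
A sketch of one way to get at whole rows (this uses the search manager's results model rather than BaseCellRenderer, since a cell renderer only ever sees one cell at a time; the manager id "mainSearch" below is an assumption):

// In the dashboard's JavaScript; assumes the table's search manager id is "mainSearch"
require(['splunkjs/mvc', 'splunkjs/mvc/simplexml/ready!'], function(mvc) {
    var search = mvc.Components.get('mainSearch');        // hypothetical manager id
    var results = search.data('results', { count: 0 });   // count: 0 requests all rows
    results.on('data', function() {
        var fields = results.data().fields;               // array of column names
        var rows = results.data().rows;                    // array of row arrays
        rows.forEach(function(row) {
            // each row is an array of cell values, in the same order as fields
            console.log(fields, row);
        });
    });
});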
