Extract field with multi-values, is using an "OR" operator with two queries possible?
The error messages in my logs have different formatting, so I'm wondering if there is a way to combine the two queries below with an "OR" statement during my extraction. Is this possible, or are there other ideas that would be better?
query 1)
-\w{9}\s\:\s(?P<pay_fail_rsn>.+)
[2015-07-17T08:16:18.406-05:00] [gw_server12] [NOTIFICATION] [] [com.charter.care.customer.view.payments.backing.PaymentsManager] [tid: [ACTIVE].ExecuteThread: '11' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: dpalmore] [ecid: c6e22fa0-0a11-4641-8c00-9abd11a6b8ec-0004101c,0] [APP: chtrgwy] 2015-07-17 08:16:18.406 - PAYMENT REQUEST FAILED - EFT payments - 4DK - 8245124990214484 - 152.61 -EXCEPTION : MBC50E-RC=R08,PAYMENT STOPPED - 9977
"OR statement"
query 2)
-\s\w{9}\s\:\s(?P<pay_fail_rsn>.+) (for the event below)
[2015-07-17T08:17:10.639-05:00] [gw_server12] [NOTIFICATION] [] [com.charter.care.customer.view.payments.backing.PaymentsManager] [tid: [ACTIVE].ExecuteThread: '21' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: bbarrett] [ecid: c6e22fa0-0a11-4641-8c00-9abd11a6b8ec-000410c2,0] [APP: chtrgwy] 2015-07-17 08:17:10.639 - PAYMENT REQUEST FAILED - CC payments - 2T2 - 8351100660591807 - 90.58 - EXCEPTION : Good response-check reject rsn - Rejected Reason - 2 - Invalid cardholder number - - 5018
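The only difference between the two patterns is the space after the hyphen, so a single combined extraction may be possible by making that space optional instead of using an "OR" (a sketch, untested against the full data set):
-\s?\w{9}\s:\s(?P<pay_fail_rsn>.+)
Regex alternation would also work, but the optional \s? keeps a single pattern with one capture group, which is all the field extractor needs.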
↧
How To List A Column Value Once in a Table?
I'm doing a project to detect click fraud. I created several extractions to pull out the IP address, the web requests from that IP address, and the browser used, from multiple indexes within Splunk. I put them in a table, which displays correctly, but I have one last issue, explained below.
I currently have the same IP listed multiple times throughout the table with its web requests. I need to list each IP address only once, with all of its web requests. How can I make the IP address distinct so that it is listed only one time?
I need all occurrences of the web requests tied to that IP to also be listed. Bonus: how do I add another column counting the number of times that IP occurs?
Here's my search:
index=access OR index=main | transaction RTG_JSession | table RTG_IPmain RTG_WebRequest RTG_Browser | where isnotnull(RTG_IPmain)
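One possible approach (a sketch; field names are taken from the search above): replace the final table with a stats aggregation so each IP becomes a single row, with values() collecting every web request and browser, and count answering the bonus question:
index=access OR index=main
| transaction RTG_JSession
| where isnotnull(RTG_IPmain)
| stats values(RTG_WebRequest) AS RTG_WebRequest values(RTG_Browser) AS RTG_Browser count AS ip_count BY RTG_IPmain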
↧
How to Commit a Change to XML?
If I edit the XML associated with a page with, say, vi, why doesn't the change show up on the display? Is there some sort of time lag, or do I need to "reload" the entire application before the changes take effect?
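If this is dashboard or view XML edited directly on disk (an assumption; the post doesn't say which XML), Splunk Web caches view definitions, so on-disk edits don't appear immediately. One minimal way to force a re-read without a full restart is to hit Splunk Web's refresh endpoint in a browser:
http://<splunk_host>:8000/debug/refresh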
↧
How to add Count Column to a Table?
I have a table with 3 fields (IP address, web request, and browser used). How can I add a column to that table counting the frequency of each IP address?
I suspect that I have to change my search around, because the IP addresses are listed multiple times; I think I have to make each one list one time and then add a column counting the occurrences.
So can someone help me add a column to count the number of times the IP is clicked?
Here's my search:
index=access OR index=main | transaction RTG_JSession | table RTG_IPmain dc(RTG_IPmain) RTG_WebRequest RTG_Browser | where isnotnull(RTG_IPmain)
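A sketch of one way to add the count (assuming each row of the current table should stay visible): dc() is a stats function and does nothing inside table, but eventstats can attach a per-IP count as a new column without collapsing the rows:
index=access OR index=main
| transaction RTG_JSession
| where isnotnull(RTG_IPmain)
| eventstats count AS ip_count BY RTG_IPmain
| table RTG_IPmain ip_count RTG_WebRequest RTG_Browser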
↧
New users per month
Is it possible to find the earliest time for each user over all time, and then do a distinct count of users by month using that earliest time as _time, resulting in the distinct count of new users per month?
I don't really want to do the "state of the world" lookup method. I need to end up with a table: _time, "Users" (where Users = new users and _time is by month). I can then append this in another search and create some calculated metrics using eval.
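A minimal sketch (assuming the field is named user and the search runs over all time): reduce each user to the time of their first event, then count those first appearances per month.
... base search over all time ...
| stats min(_time) AS _time BY user
| timechart span=1mon count AS "Users"
After the stats, there is exactly one row per user, stamped with their earliest time, so the monthly count is the number of new users in that month.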
↧
Datamodel creation
I want to create a data model that will hold aggregated usage. I currently have a data model that holds usage values, and I am running a complex "group by" and sum to get the aggregated usage, which is hurting performance; most jobs are unable to run. I attempted to create a new data model by adding this search as a root search, but I do not know how to use it.
What approach would you suggest to get this done best?
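One alternative worth considering (a sketch only; usage and user are placeholder field names, and usage_summary is a hypothetical summary index): run the expensive aggregation once on a schedule and write its output to a summary index with collect, then point the data model, or the reports, at the cheap pre-aggregated data:
... expensive usage search ...
| stats sum(usage) AS total_usage BY user
| collect index=usage_summary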
↧
In simpleXML is it possible when drilling down to a panel to also provide focus?
I have a dashboard panel that I initially hide with depends=<csv token list>. This panel is at the top of the page, and any of the initially shown panels will drill down to it by setting tokens. Is it possible to also bring that panel into focus on drilldown? Or is my only option sending these to a new window/tab with <link>?
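As far as I know, Simple XML has no built-in focus or scroll directive, but a small dashboard JavaScript extension can scroll the revealed panel into view when the drilldown sets its token. A sketch (the token name show_detail and the panel id detail_panel are hypothetical; substitute whatever your drilldown actually sets):
require(['splunkjs/mvc', 'splunkjs/mvc/simplexml/ready!'], function (mvc) {
    var tokens = mvc.Components.getInstance('default');
    // When the drilldown sets the token that un-hides the panel...
    tokens.on('change:show_detail', function () {
        // ...scroll the panel (given id="detail_panel" in the XML) into view.
        var panel = document.getElementById('detail_panel');
        if (panel) { panel.scrollIntoView(); }
    });
});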
↧
OPSEC LEA - Cannot look up HOME variable
I am seeing this message when trying to use the OPSEC LEA app for Splunk:
ERROR ExecProcessor - message from "/opt/splunk/etc/apps/Splunk_TA_opseclea_linux22/bin/lea-loggrabber.sh --configentity CMA-008" Could not look up HOME variable. Auth tokens cannot be cached.
How might I work around this?
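A possible workaround (an assumption based on the wording of the message, not a confirmed fix): splunkd may be spawning the script without $HOME in its environment, so exporting it explicitly near the top of lea-loggrabber.sh could let the auth-token cache initialize:
# In lea-loggrabber.sh, before the grabber runs; the path is the splunk
# user's home directory on this host -- adjust as needed.
export HOME=/opt/splunk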
↧
Eval Description possibly not working due to special character in field
Query:
index=ctap host=sc58* sourcetype=gateway "PAYMENT REQUEST FAILED" pay_type="PAYMENT REQUEST FAILED - CC payments"
| chart count by pay_fail_rsn
| sort count | reverse
| eval Description = case(pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - 7 - Invalid amount field -","Invalid amount field",
pay_fail_rsn = "CCC03E-AUTHORIZATION DECLINED -", "Authorization Declined",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - 2 - Invalid cardholder number -", "Invalid Cardholder Number",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - 5 - Invalid transaction type -", "Invalid Transaction Type",
pay_fail_rsn = "CCNUMBER; CCE05E-INVALID CREDIT CARD NUMBER -","Invalid Credit Card Number",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - 3 - Invalid expiration date -","Invalid Expiration Date",
pay_fail_rsn = "CCC06E-NOT PROCESSED, AUTH REFERRAL -","Not Processed, Auth Referral",
pay_fail_rsn = "Good response-check reject rsn -","Not Processed, Reason Unknown",
pay_fail_rsn = "CCEXP_DATE; CCE08E-INVALID EXPIRATION DATE -","Invalid Expiration Date",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - M - General message format problem -","Message Format Problem",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - J - Function unavailable -","Function Unavailable",
pay_fail_rsn = "Good response-check reject rsn - Rejected Reason - 6 - [unknown] -", "Not Processed, Reason Unknown",
pay_fail_rsn = "CCC09E-CALL 800-247-4976 -", "Call 1-800-247-4976",
pay_fail_rsn = "[EBS] Invalid request - Amount. Is Required -","Amount Required")
| table Description , count
Result:
Description count
Invalid amount field 300
Authorization Declined 243
Invalid Cardholder Number 190
Invalid Transaction Type 44
Invalid Credit Card Number 37
Invalid Expiration Date 21
Not Processed, Auth Referral 6
5
Not Processed, Reason Unknown 5
Invalid Expiration Date 4
Message Format Problem 3
Function Unavailable 2
Not Processed, Reason Unknown 2
Call 1-800-247-4976
Why is only this one failing to evaluate? Is it the brackets? How do I deal with it?
pay_fail_rsn = "[EBS] Invalid request - Amount. Is Required -","Amount Required"
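One thing worth checking (a guess from the symptoms, not a confirmed diagnosis): square brackets are not special inside an eval string comparison, so the exact-match literal most likely differs slightly from the stored field value (leading/trailing whitespace, an extra character). A looser match with like(), plus a catch-all arm to expose anything unmatched, might look like:
| eval Description = case(
    ...,
    like(pay_fail_rsn, "%Invalid request - Amount. Is Required%"), "Amount Required",
    isnotnull(pay_fail_rsn), "UNMATCHED: " . pay_fail_rsn)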
↧
Splunk Web Framework MultiDropdownView choice limit?
Hello,
I am using Simple XML with the Splunk Web Framework. I've noticed that while using the MultiDropdownView, my results get truncated. The SearchManager that drives the MultiDropdownView returns over 1,300 values. Is there a limit on the length of the choices property? If so, can the limit be increased?
Below is a snippet of code:
// Dropdown
new MultiDropdownView({
    id: "multidropdown1",
    el: $("#instance-select")
}).render();

// Splunk search managers
new SearchManager({
    id: "multidropsearch1",
    latest_time: "now",
    earliest_time: "-1h@h",
    autostart: "true",
    search: "| metadata type=hosts | dedup host | table host"
});

var multiDropdown = splunkjs.mvc.Components.getInstance("multidropdown1");
var dropdownManager = splunkjs.mvc.Components.getInstance("multidropsearch1");

// populates dropdown menu
dropdownManager.on('search:done', function() {
    var results = dropdownManager.data('results');
    results.on('data', function() {
        var rows = results.data().rows;
        var dropOptions = [];
        for (var i in rows) {
            var option = rows[i].toString();
            dropOptions.push({label: option, value: option});
        }
        dropOptions.pop();
        multiDropdown.settings.set('choices', dropOptions);
    });
});
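A possible cause (an assumption, since nothing above confirms it): the results model returned by data('results') fetches only a default page of rows, which would truncate 1,300+ values regardless of any choices limit. Requesting an explicit row count when creating the model may avoid it:
// count: 0 asks the results endpoint for all available rows
var results = dropdownManager.data('results', { count: 0 });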
↧
Error with python scripts in Hurricane Labs app for Shodan
I'm getting an error with the Python scripts, as shown below.
The SA-shodan add-on is installed, and I do have an API key for Shodan.
I have tried configuring with a single IP address (X.X.X.X) as well as a range in CIDR format (X.X.X.X/24).
command="inputjson", Traceback (most recent call last): File "E:\Program Files\Splunk\etc\apps\Hurricane_Labs_App_for_Shodan\bin\inputjson.py", line 24, in <module> main() File "E:\Program Files\Splunk\etc\apps\Hurricane_Labs_App_for_Shodan\bin\inputjson.py", line 17, in main raise ValueError(sys.argv[1]) ValueError: shodan/my_subnets.json
↧
Where do I deploy the ta-forwarderquery app?
Hi,
The app looks very interesting.
We have a central deployment server that pushes out all the apps to all Splunk components.
Can you please give me some guidance on where to push this app: indexers, forwarders, search heads, or all three?
Thanks,
Madan Sudhindra
↧
How would I develop and inject a custom output processor into the indexing pipeline?
I've recently found myself looking at the pipelines in Splunk, through the [How Indexing Works](http://wiki.splunk.com/Community:HowIndexingWorks) wiki page and @amrit and @Jag's [conf2014 talk](http://conf.splunk.com/speakers/2014.html#). It seems the pipelines, from input through indexing, are very modular, wired together through various .xml files in $SPLUNK_HOME/etc that wind up combined in $SPLUNK_HOME/var/run/splunk/composite.xml.
If an event lands in the indexing pipeline, I have a number of options: I could send the event to a remote Splunk instance, send the raw event over TCP to some other server, send the event to a remote syslog server, or just index the event locally. (And depending on the steps taken on the event in previous pipelines and configurations, I could do anywhere from one to all three of these.)
But I have been asked: what if I wanted to ship raw logs during the indexing pipeline to another service that doesn't take just raw TCP or UDP, for example, if I needed to wrap the event in some other protocol before sending? _**Is there a guide on how one could develop a custom processor, install it with a Splunk App, and inject it into the indexing pipeline (or alternatively create a queue and a custom pipeline before the indexing pipeline)?**_
Yes, there is a danger that my bad processor code could back up my indexing pipeline. But if I try to send TCP to a remote instance that cannot accept any more input, [I could back up my indexing pipeline as well](http://answers.splunk.com/answers/266508/is-it-normal-behavior-for-splunk-to-block-queues-a.html), so in that sense the choice between living in Splunk directly and standing up a custom server that accepts TCP and relays it seems a bit moot. And if I could be in the Splunk pipeline, I should be able to take advantage of some of the additional metadata around the events (rather than getting just the raw event, or having to parse things again), which feels like an advantage.
Yes, I could let the event be indexed, then have a scheduled or real-time search ship off the event with either a custom command or an alert script. While that would let me join and transform the event with more sources of data, if my goal is simply to ship certain events to another system, it means I'm potentially taking up search head processing time (and licensing) unnecessarily for this task.
Any ideas?
↧
Error loading logging conf file - can't start Splunk Enterprise Server
Not able to start Splunk.
When running "./splunk start" from the command line, Splunk Enterprise Server fails to start with the following error message:
Error loading logging conf file='/opt/splunk/etc/log.cfg'; runContext=splunkd
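Some possible first checks (assumptions: the file may be missing, unreadable, or corrupted; none of these is a confirmed diagnosis):
# Confirm the file exists and the user running splunk can read it
ls -l /opt/splunk/etc/log.cfg
# If it was hand-edited, compare against a pristine copy from a fresh Splunk
# install of the same version; local logging overrides normally belong in
# /opt/splunk/etc/log-local.cfg rather than in log.cfg itself.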
↧
Need graphic artist or icon expert ASAP to make my app icons work in Splunk v6
I have tried everything, and although my icons worked fine pre-v6, I cannot get them to work in v6. I am happy to pay for your help, but I really need somebody who knows image formats. I have read and (so far as I can tell) complied with all of the below, but nothing has worked, and the deadline for the App contest is in a day or two, so I need help now! You can construct my valid email address by going to my public profile in this forum and using my last name (just the last name), plus the name of my company as the domain, plus dot-com. I will post a follow-up WITH THE ANSWER when somebody figures out what in the world was the problem.
http://docs.splunk.com/Documentation/Splunk/6.0/Installation/ChangesforSplunkappdevelopers#We_have_changed_where_Splunk_looks_for_the_icon_files
http://answers.splunk.com/answers/134726/display-both-app-logo-and-app-name-in-splunk-6-custom-application.html
http://docs.splunk.com/Documentation/Splunk/6.2.4/AdvancedDev/Migration
↧
How do I group events by field (trans ID) and count as a single event?
My apologies if this has been asked and answered.
We have logs that record several error entries for a single transaction. We have mapped the transaction ID as a field, and we would like to group all of the log entries for a particular transaction together so they are counted as a single event. Would this best be done as the logs are consumed by Splunk, or at search time, and how would I go about doing it?
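For the search-time route, a minimal sketch (trans_id is a placeholder for your mapped transaction ID field): transaction merges all entries sharing the ID into one event, after which each group counts as a single result:
index=your_index sourcetype=your_sourcetype
| transaction trans_id
| stats count AS transaction_count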
Thanks
↧