Hi,
I'm getting the following error in my indexers' splunkd.log. I have a cluster with RF=3 and SF=3, 1 master, and 1 search head.
ERROR S2SFileReceiver - event=statSize replicationType=eJournalReplication bid=trendsg~26~89C0FF94-5EB0-410A-9B4D-0E17DBD7FB78 path=/opt/splunk/var/lib/splunk/trendsg/db/26_89C0FF94-5EB0-410A-9B4D-0E17DBD7FB78/rawdata/journal.gz status=failed
Any thoughts?
Thanks,
I would like to limit certain users' access to URL filtering and config change data coming from the PAN. Is there a way to do that?
I would put the data in separate indexes, but the app documentation says to use pan_logs for everything.
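For what it's worth, the kind of thing I had in mind was routing just the URL filtering logs to their own index at index time, roughly like this (the index name, sourcetype, and REGEX are illustrative guesses on my part, and I don't know whether this breaks the Palo Alto app's assumptions):

```
# props.conf -- sourcetype name is a guess
[pan:log]
TRANSFORMS-route_url = route_pan_url_filtering

# transforms.conf -- send URL filtering events to a separate index
[route_pan_url_filtering]
REGEX = ,THREAT,url,
DEST_KEY = _MetaData:Index
FORMAT = pan_url
```

Access could then presumably be limited per role via the indexes a role is allowed to search (srchIndexesAllowed in authorize.conf), with a similar transform for the config change logs.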
Dear Splunkers,
I am trying to persuade my boss to implement Splunk instead of ManageEngine EventLog Analyzer, but he keeps asking me why, and honestly I can't give him a good answer. Can you help me ASAP?
Peace and love...
Good Afternoon All
I am creating a dashboard table with username, skill1, skill2, skill3. The value of each skill column will be one of the text values None, White, Orange, Blue, Red, or Black, and based on that value the cell for that skill should be shaded the matching color.
![alt text][1] From the picture you can get the idea. We share this Splunk instance with other departments and don't want to make a bunch of custom changes, just in case a future version breaks them. So, is there a way to do this without touching application.js and the CSS?
Thanks
Ed
[1]: /storage/temp/52219-table.jpg
Hi, I'm new to Splunk, and I'm currently using Splunk 6.2 on Linux. A few days ago I configured SEP to forward all events (*client, system, agents, etc.*),
and on the Splunk side I downloaded and added "*_splunk-add-on-for-symantec-endpoint-protection_201_*".
All network access is OK and has been tested.
But I don't know whether the logs were actually sent from SEP, or where and how to find them; I'm totally new to Splunk. Note that I followed the configuration steps, but I didn't find the logs in "**$SEPM_HOME/data/dump**".
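To check whether anything has arrived at all, the only idea I've had so far is a broad search that just lists what came in recently, something like:

```
index=* earliest=-24h | stats count by index, sourcetype, source
```

If the SEP data were flowing, I'd expect to see a sourcetype from the add-on show up in that list, but so far I'm not sure what to look for.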
Hi all,
How can I remove the row limit on a MySQL query in DB Connect V2?
I need to search my entire database, but I only get 1000 rows back in the search results.
Can anyone help me, please?
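For reference, this is the kind of search I'm running; I've read that `dbxquery` accepts a `maxrows` argument, but I'm not sure of the right value to lift the limit (the connection and table names here are mine):

```
| dbxquery connection="my_mysql" query="SELECT * FROM my_table" maxrows=100000
```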
Thanks and regards!
Luis Carlos
Hey,
I'm testing the IMAP app in a Windows environment and I keep running into this error: "__main__.LoginError: Could not log into server: imap.gmail.com with password provided". I'm using Gmail here; is there anything that needs to be done in the Python script to make it work? I have enabled IMAP in my Gmail account, and the config file seems to be alright as well. Can someone tell me what's wrong here?
ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" Traceback (most recent call last):
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 717, in <module>
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" parseArgs()
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 709, in parseArgs
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" imapProc.getMail()
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 345, in getMail
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" raise LoginError('Could not log into server: %s with password provided' % self.server)
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" __main__.LoginError: Could not log into server: imap.gmail.com with password provided
I want to look at just 1 hour of yesterday, but I want it to be relative to today, so that no matter when I look at it in the future it will always be yesterday.
So if I look at it today, it will show yesterday's values from 12pm to 1pm.
And if I look at it next week, it will show the day before that day, 12pm to 1pm.
I am thinking of something like `-1d@d` for the earliest and `@d` for the latest, but how do I get the hour I want?
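Following that thought, chaining an hour offset after the day snap is the kind of thing I mean:

```
earliest=-1d@d+12h latest=-1d@d+13h
```

As I understand the time modifier syntax, `-1d@d+12h` snaps to the start of yesterday and then adds 12 hours, so this should always cover yesterday's 12pm to 1pm no matter when the search runs; I'd appreciate confirmation that the chaining works this way.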
I am installing Splunk as the splunk user. I have it mostly worked out, but what directory are people installing it in? Using /opt does not seem like a good idea, because you then need to make the /opt dir 775 or 777, depending on who owns /opt...
I'd welcome hearing where others are installing it. Thanks!
There seems to be duplication/repetition in the world map view.
For instance, half of Australia's east coast is shown on the left, the whole of Australia is shown in the centre, and similarly for other countries.
![alt text][1]
Can I set it up so that this is not the case? I just want to see each country once, not repeated.
How do I do this in Splunk?
[1]: /storage/temp/52220-map-zoom-level.png
I basically have 2 searches that I am combining using `appendcols`, one search per element. It looks something like:
`index=core ... ne=ne1 | stats sum(kpi1) as "kpi1" by ks_countryname | sort - "kpi1" |
appendcols [search
index=core ... ne=ne2 | stats sum(kpi1) as "kpi1_ne2" by ks_countryname | rename ks_countryname as ks_countryname_ne2| sort - "kpi1_ne2"]`
the output looks something like below:
ks_countryname kpi1 kpi1_ne2 ks_countryname_ne2
ZZ_undefined 1615 2631 ZZ_undefined
Australia 1500 1635 China
United States 676 1600 Australia
China 423 410 United States
Vanuatu 295 305 Samoa
Switzerland 220 247 Switzerland
Germany 165 213 Germany
France 157 181 France
Samoa 118 62 Vanuatu
How do I get columns 3 and 4 to line up with columns 1 and 2, so that
row 2 of columns 1 and 2 (`Australia 1500`) lines up with row 3 of columns 3 and 4 (`1600 Australia`)?
Ideally I would still like to sort max to min by kpi1, but mainly I want the countries aligned.
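Something like the following `join`-based variant is what I'm imagining instead of `appendcols` (keeping my original base searches, so the `...` parts are as before); my understanding is that `join` matches rows on `ks_countryname` rather than by row position, which is the alignment I'm after:

```
index=core ... ne=ne1
| stats sum(kpi1) as kpi1 by ks_countryname
| join type=outer ks_countryname
    [search index=core ... ne=ne2
     | stats sum(kpi1) as kpi1_ne2 by ks_countryname]
| sort - kpi1
```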
Hi,
I'm seeing some very unusual behaviour when extracting fields in Splunk 6.2. Basically I can see the fields are extracted successfully, but I can't use them to search. I have the following sample data:
101STUS NVLGCCCPDRf4cc5a8023ce40e28c9f260c376dabe9032134120864 032000123456789 191820013550000000000000000ESBtesSSP191820013550000000000000000abcdefSSD00071468C4875691F2CC0000000102763400095 C02763400095 20150721112211485002KO-001HIHI12345 6_1 ABCD02 20150721102147122201507211754000000000000000007400 AU S2015072120150721CRN MH48A 0201 ACSP 20150721112211485ACSC 2015072111221148511215201121520 BIS 0000000000 00000000
This is a fixed length field log file (from mainframe), with no field separators. Therefore, I am using the following regular expression to extract the fields, which basically just extracts them from their position in the log file:
.{11}(?.{4}).{1}(?.{2}).{32}(?.{6})(?.{28})(?.{6})(?.{28})(?.{36})(?.{36})(?.{20})(?.{8})(?.{15}).{33}(?.{3})(?.{6}).{132}(?.{17}).{17}0*(?\d{1,16}).{90}(?.{4}).{4}(?.{17})(?.{4}).{4}(?.{17})
Now when I search for the log in Splunk, I can see all the fields created with the correct values.
index=main sourcetype=mytype
However, if I try to add the fields to the search string, I am unable to see any results. For example:
index=main sourcetype=mytype Type=GCCC
I've found that if I put a * on either side of the field value, it does find them, which I find strange:
index=main sourcetype=mytype Type=*GCCC*
This indicates that there may be whitespace around the value, but it doesn't appear that way when I look at the values. I've also found that I can successfully search for the fields if I add it as an extra search function after the main search:
index=main sourcetype=mytype | search Type=GCCC
This suggests the field extraction doesn't run until after the main search; however, I can see from a lot of my other sourcetypes that this isn't generally the case, as I can search those directly.
I've also tried a number of other things to try to get this working:
- Separating the regexes so each field has its own extraction. I still get the same issue.
- Extracting only 1 field from the data to simplify it. I still get the same issue.
- Adding the following calculated field. This works, but I don't want to add an EVAL for every field, as I'm sure there would be performance implications:
[mytype]
EVAL-Status = Status
Has anyone seen this before? I've played around a lot with the regex, but could there be a problem with this? Is there a better way to extract the fields for a fixed length file?
I suspect that it's partly because the fields have no separators, therefore Splunk isn't able to do keyword searches on partial matches, can anyone confirm?
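On that theory, one thing I'm considering is telling Splunk not to assume the extracted values exist as whole indexed tokens in the raw data, via fields.conf (I'm not certain this is the right knob, and it would apparently need a stanza per field, shown here just for Type):

```
# fields.conf
[Type]
INDEXED_VALUE = false
```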
Thanks in advance.
Ashley
Good day!
Currently, I am planning on adding a text input to the dashboard through JavaScript, but whenever I run the dashboard it displays an error in the console log: "Uncaught TypeError: mvc.tokenSafe is not a function."
My expected output is to create a text input in the JavaScript with a defined token.
Here is my current code:
    new TextBox({
        id: "textbox1",
        default: "main",
        value: mvc.tokenSafe("$deviceField$"),
        el: $("#searchField")
    }).render();
Thanks!
Assuming I have a dashboard with multiple panels, is there a way to have some visible and some not?
Currently this skeleton has the following view:
check box
row1
row2
row3
Is there a way I can do the following:
check box
row1
row2 show/hide
row3 show/hide
with rows 2 and 3 sharing the same control to toggle them between visible and hidden?
I am looking to do this in Simple XML.
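A sketch of the kind of thing I'm picturing (the token name is mine); my understanding is that a `depends` attribute on a `<row>` hides the row whenever the token is unset, and an unchecked checkbox leaves its token unset:

```
<fieldset>
  <input type="checkbox" token="show_detail">
    <label>Detail rows</label>
    <choice value="show">Show rows 2 and 3</choice>
  </input>
</fieldset>

<row>
  <!-- row1 panels, always visible -->
</row>
<row depends="$show_detail$">
  <!-- row2 panels -->
</row>
<row depends="$show_detail$">
  <!-- row3 panels -->
</row>
```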
In the pic below, is there a way to display the country name in the popup instead of the lat and long values?
![alt text][1]
[1]: /storage/temp/52221-map-show-country-name.png
The desired cipherSuite parameter has been configured in $SPLUNK_HOME/etc/system/local/web.conf, but when I restart Splunk, the web interface is not available. I also see the following warning message in splunkd.log:
WARN HttpListener - Socket error from 127.0.0.1 while idling: error:1408A0C1:SSL routines:ssl3_get_client_hello:no shared cipher
How can I get this to work?
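For reference, the relevant part of my web.conf looks roughly like this (the cipher list here is just an example in OpenSSL cipher-string format, not my exact value):

```
[settings]
enableSplunkWebSSL = true
cipherSuite = TLSv1.2:!eNULL:!aNULL
```

My reading of the "no shared cipher" warning is that the browser and Splunk Web ended up with no cipher in common, which makes me suspect my list is too restrictive, but I don't know which ciphers are safe to allow.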
Hi,
I am currently exploring the R App for Splunk. For a specific analysis, I need to use the 'igraph' library. I tried several times and couldn't get the code to work. When I try to add the 'igraph' package in 'Manage Packages', the "state" is always "Installing", as you can see from the attached pic.
Could you please advise whether this is because the App does not currently support the 'igraph' package, or maybe because I didn't configure the App correctly?
![alt text][1]
Thanks and Best Regards,
Ningwei
[1]: /storage/temp/54174-igraph.jpg
Trying to find the average PlanSize per hour per day.
source="*\\myfile.*" Action="OpenPlan" | transaction Guid startswith=("OpenPlanStart") endswith=("OpenPlanEnd") |
eval PlanSize=case(NumPlanRows>0 AND NumPlanRows<=100, "1. Small", NumPlanRows>100 AND NumPlanRows<=200, "2. Medium", NumPlanRows >200, "3. Large") |
eval weekday=strftime(_time,"%A") | eval hour=strftime(_time,"%H") |
I would like something like
`stats avg(count(PlanSize)) by weekday, hour, PlanSize` or some such
Namely, by day of the week, and hour of the day, what is the average count of each variety of plan size being opened?
I can't seem to find any syntax that works.
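The closest I can get to expressing it is a two-stage stats, first counting opens per actual clock hour, then averaging those counts, continuing from the two eval statements above (which currently end in a trailing pipe):

```
| bin _time span=1h
| stats count as opens by _time, weekday, hour, PlanSize
| stats avg(opens) as avg_opens by weekday, hour, PlanSize
```

The idea is that the first `stats` produces one count per distinct hour that actually occurred, so the second `stats` averages over the different calendar days sharing the same weekday and hour; I'd welcome corrections if that isn't what `avg` ends up doing here.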