Channel: Latest Questions on Splunk Answers
Viewing all 13053 articles

Cluster ERROR S2SFileReceiver

Hi, I'm getting the following error in splunkd.log on my indexers. I have a cluster with RF=3 and SF=3, 1 master and 1 search head.

ERROR S2SFileReceiver - event=statSize replicationType=eJournalReplication bid=trendsg~26~89C0FF94-5EB0-410A-9B4D-0E17DBD7FB78 path=/opt/splunk/var/lib/splunk/trendsg/db/26_89C0FF94-5EB0-410A-9B4D-0E17DBD7FB78/rawdata/journal.gz status=failed

Any thoughts? Thanks.

Limit permissions by log type

I would like to limit certain users' access to URL filtering and config change data coming from the PAN. Is there a way to do that? I would put the data in separate indexes, but the app documentation says to use pan_logs for everything.

What is the advantage of Splunk over ManageEngine EventLog Analyzer?

Dear Splunkers, I'm trying to persuade my boss to implement Splunk instead of ManageEngine EventLog Analyzer, but he keeps asking me why, and honestly I can't give him a good answer. Can you help me ASAP? Peace and love...

Dashboard table cell color based on text value without jscript or custom CSS

Good afternoon all. I am creating a dashboard table with username, skill1, skill2, skill3. The values of skill1, skill2, and skill3 will be one of the text values None, White, Orange, Blue, Red, Black, and based on the value, the cell should be shown in that color for the skill. ![alt text][1] From the picture you can get the idea. We share this Splunk instance with other departments and don't want to make a bunch of custom changes, in case a future version breaks them. So, is there a way to do this without touching application.js and the CSS? Thanks, Ed ![alt text][2] [1]: /storage/temp/52219-table.jpg [2]: /storage/temp/52219-table.jpg

Cannot find SEP data on my Splunk server

Hi, I'm new to Splunk, currently running Splunk 6.2 on Linux. A few days ago I configured SEP to forward all events (*Client, System, Agents, etc.*), and on the Splunk side I downloaded and added "*splunk-add-on-for-symantec-endpoint-protection_201*". All network access is OK and tested, but I don't know whether the logs were sent from SEP, or where and how to find them; I'm totally new to Splunk. Note that I followed the configuration steps, but I didn't find the logs in "**$SEPM_HOME/data/dump**".

DB Connect data is not indexed

I've set up a DB Connect data input. On the surface all is working properly; health shows successful connections, but I can't find any events in my index. Splunk 6.2.4, DB Connect v2; the setup is one search head and one indexer. This is inputs.conf from DB Connect:

[root@splunksrv01 splunk]# cat etc/apps/splunk_app_db_connect/local/inputs.conf
[rpcstart://default]
javahome = /root/apps/jdk1.8.0_51/jre
useSSL = 0
proc_pid = 5927
bindIP = *

[mi_input://SensorData]
connection = site-protector
description = Site protect "RealSecureDB"."dbo"."SensorData"
index = Security
input_timestamp_column_name = time
input_timestamp_column_number = 1
interval = 30
max_rows = 10000
mode = tail
output_timestamp_format = YYYY-MM-dd HH:mm:ss
query = SELECT\
 CONVERT(VARCHAR(50), S.AlertDateTime, 121) as time,\
 S.SensorDataRowID,\
 S.SensorDataID,\
 S.AlertName,\
 S.AlertID,\
 S.SensorName,\
 P.ProdName,\
 A.AlertTypeName,\
 S.AlertPriority,\
 S.AlertFlags,\
 CAST(ROUND( (cast(S.SensorAddressInt as bigint) / 16777216 ), 0, 1) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.SensorAddressInt as bigint) / 65536 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.SensorAddressInt as bigint) / 256 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((cast(S.SensorAddressInt as bigint) % 256 ) AS varchar(4)) as SensIP,\
 S.ProtocolID,\
 S.SourcePort,\
 S.ObjectName,\
 S.ObjectType,\
 S.SourcePortName,\
 S.DestPortName,\
 S.UserName,\
 S.ProcessingFlag,\
 S.Cleared,\
 S.HostGUID,\
 S.HostDNSName,\
 S.HostNBName,\
 S.HostNBDomain,\
 S.HostOSName,\
 S.HostOSVersion,\
 S.HostOSRevisionLevel,\
 V.VulnStatusDesc,\
 S.AlertCount,\
 S.ObservanceID,\
 S.ComponentID,\
 S.SensorGUID,\
 S.LicModule,\
 S.VLan,\
 S.VirtualSensorName,\
 S.TargetID,\
 S.SensorInterfaceName,\
 S.SrcIPv6High,\
 S.SrcIPv6Low,\
 S.DestIPv6High,\
 S.DestIPv6Low,\
 S.SensorIPv6High,\
 S.SensorIPv6Low,\
 S.CVSSBase,\
 S.CVSSTemporal,\
 S.CVSSScore,\
 S.ScanName,\
 S.Imported,\
 S.SourceLocationCode,\
 S.TargetLocationCode,\
 S.QuarantineName,\
 S.QuarantineGUID,\
 O.ObservanceTypeDesc,\
 CAST(ROUND( (cast(S.SrcAddressInt as bigint) / 16777216 ), 0, 1) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.SrcAddressInt as bigint) / 65536 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.SrcAddressInt as bigint) / 256 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((cast(S.SrcAddressInt as bigint) % 256 ) AS varchar(4)) as SrcIP,\
 CAST(ROUND( (cast(S.DestAddressInt as bigint) / 16777216 ), 0, 1) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.DestAddressInt as bigint) / 65536 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((ROUND( (cast(S.DestAddressInt as bigint) / 256 ), 0, 1) % 256) AS varchar(4)) + '.' +\
 CAST((cast(S.DestAddressInt as bigint) % 256 ) AS varchar(4)) as DestIP\
 FROM "RealSecureDB"."dbo"."SensorData" AS S\
 JOIN "RealSecureDB"."dbo"."VulnStatus" AS V\
 ON S.VulnStatus = V.VulnStatus\
 JOIN "RealSecureDB"."dbo"."Products" AS P\
 ON S.ProductID = P.ProductID\
 JOIN "RealSecureDB"."dbo"."AlertType" AS A\
 ON S.AlertTypeID = A.AlertTypeID\
 JOIN "RealSecureDB"."dbo"."ObservanceType" AS O\
 ON A.ObservanceType = O.ObservanceType
source = site_protector
sourcetype = site_protector
tail_follow_only = 1
tail_rising_column_name = SensorDataRowID
tail_rising_column_number = 1
ui_query_catalog = RealSecureDB
ui_query_mode = advanced
ui_query_schema = dbo
ui_query_table = SensorData
tail_rising_column_checkpoint_value = 16000

outputs.conf:

[root@splunksrv01 splunk]# cat etc/apps/splunk_app_db_connect/local/outputs.conf
[tcpout:indx1]
server=192.168.100.80:9997
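The long CAST/ROUND expressions in the query above convert integer-stored IPv4 addresses into dotted-quad strings with divide-and-modulo arithmetic. As a sanity check on that arithmetic, here is a minimal Python sketch of the same conversion (the helper name is mine, not part of DB Connect):

```python
def int_to_ip(addr: int) -> str:
    """Convert a 32-bit integer address to dotted-quad notation,
    mirroring the truncating divide/modulo arithmetic in the SQL above."""
    octets = [
        addr // 16777216,        # addr / 256^3 -> first octet
        (addr // 65536) % 256,   # addr / 256^2, mod 256 -> second octet
        (addr // 256) % 256,     # third octet
        addr % 256,              # fourth octet
    ]
    return ".".join(str(o) for o in octets)

print(int_to_ip(3232261200))  # 192.168.100.80
```

T-SQL's `ROUND(x, 0, 1)` truncates, which for non-negative integers matches Python's floor division, so the two versions agree octet for octet.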

Max 1,000 rows returned - DB Connect V2 - MySQL

Hi all, how can I remove the row limit on a MySQL query in DB Connect v2? I need to search my whole database, but I only receive 1,000 rows in the search results. Can anyone help me, please? Thanks and regards, Luis Carlos

IMAP returns login error "Could not log into server: %s with password provided" for Google account

Hey, I'm testing the IMAP app in a Windows environment and I keep running into this error: "__main__.LoginError: Could not log into server: imap.gmail.com with password provided". I'm using Gmail here; is there anything that needs to be done in the Python script to make it work? I have enabled IMAP in my Gmail account, and the config file seems to be alright as well. Can someone tell me what's wrong here?

ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" Traceback (most recent call last):
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""   File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 717, in <module>
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""     parseArgs()
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""   File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 709, in parseArgs
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""     imapProc.getMail()
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""   File "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py", line 345, in getMail
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py""     raise LoginError('Could not log into server: %s with password provided' % self.server)
08-16-2015 21:09:09.497 +0530 ERROR ExecProcessor - message from "python "C:\Program Files\Splunk\etc\apps\IMAPmailbox\bin\get_imap_email.py"" __main__.LoginError: Could not log into server: imap.gmail.com with password provided

date/time range - look at 1 hour for yesterday relatively

I want to look at just one hour of yesterday, but relative to today, so no matter when I look at it in the future it will always be yesterday. If I look at it today, it will show yesterday's values from 12pm to 1pm, and if I look at it next week, it will show the day before that day from 12pm to 1pm. I am thinking of something like `-1d@d` for the earliest and `@d` for the latest, but how do I get the hour I want?
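Assuming standard Splunk relative-time modifiers, one way to express "yesterday, 12pm to 1pm" is `earliest=-1d@d+12h latest=-1d@d+13h`: go back a day, snap to midnight with `@d`, then offset by 12 and 13 hours. A Python sketch of the same snap-then-offset arithmetic:

```python
from datetime import datetime, timedelta

def yesterday_window(now: datetime, start_hour: int = 12, end_hour: int = 13):
    """Return (earliest, latest) bounds for a fixed one-hour slot yesterday,
    relative to 'now' -- the same snap-then-offset logic as -1d@d+12h/-1d@d+13h."""
    midnight_yesterday = now.replace(hour=0, minute=0, second=0,
                                     microsecond=0) - timedelta(days=1)
    return (midnight_yesterday + timedelta(hours=start_hour),
            midnight_yesterday + timedelta(hours=end_hour))

earliest, latest = yesterday_window(datetime(2015, 8, 17, 9, 30))
print(earliest, latest)  # 2015-08-16 12:00:00 2015-08-16 13:00:00
```

Because the window is computed from "now" each time, the search always covers the same hour of whatever day was yesterday when it runs.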

Running the Splunk process as the splunk user on Linux: where did you install Splunk?

I am installing Splunk as the splunk user. I have it all down, but what directory are people installing it in? Using /opt does not seem like a good idea, because you then need to make the /opt directory 775 or 777 depending on who owns /opt... I'd welcome hearing where others are installing it. Thanks!

maps in splunk + do not want overlap/duplication of the map layer + country showing more than once

There seems to be duplication/repetition in the view of the world. For instance, half of Australia's east coast is shown on the left, the whole of Australia is shown in the centre, and similarly for other countries. ![alt text][1] Can I make it so that this is not the case? I just want to see each country once, not repeated. How do I do this in Splunk? [1]: /storage/temp/52220-map-zoom-level.png

aligning values in a column with values from another column

I basically have 2 searches that I am combining using `appendcols`, one search for each element. It looks something like:

`index=core ... ne=ne1 | stats sum(kpi1) as "kpi1" by ks_countryname | sort - "kpi1" | appendcols [search index=core ... ne=ne2 | stats sum(kpi1) as "kpi1_ne2" by ks_countryname | rename ks_countryname as ks_countryname_ne2 | sort - "kpi1_ne2"]`

The output looks something like this:

ks_countryname   kpi1   kpi1_ne2   ks_countryname_ne2
ZZ_undefined     1615   2631       ZZ_undefined
Australia        1500   1635       China
United States    676    1600       Australia
China            423    410        United States
Vanuatu          295    305        Samoa
Switzerland      220    247        Switzerland
Germany          165    213        Germany
France           157    181        France
Samoa            118    62         Vanuatu

How do I get columns 3 and 4 to line up with columns 1 and 2, so that row 2 of columns 1 and 2 (`Australia 1500`) lines up with row 3 of columns 3 and 4 (`1600 Australia`)? Ideally I would still like to sort max to min by kpi1; I just want the countries aligned as well.
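`appendcols` pastes the subsearch's rows on by position, so two independently sorted result sets drift out of alignment; what keeps each country's values together is a keyed join on ks_countryname (in SPL terms, something like a `join ks_countryname [...]`, or computing both sums in a single `stats ... by ks_countryname`). A small Python sketch, on hypothetical data, of the keyed join that keeps each country's values on one row:

```python
# Per-country sums from the two hypothetical network elements.
ne1 = {"Australia": 1500, "China": 423, "Samoa": 118}
ne2 = {"China": 1635, "Australia": 1600, "Samoa": 62}

# Keyed join: look up each country's ne2 value by name instead of by
# row position, while still sorting the output by kpi1 descending.
joined = [(country, kpi1, ne2.get(country))
          for country, kpi1 in sorted(ne1.items(), key=lambda kv: -kv[1])]
print(joined)  # [('Australia', 1500, 1600), ('China', 423, 1635), ('Samoa', 118, 62)]
```

The row order follows kpi1, but each ne2 value lands next to its own country rather than next to whatever happened to share its rank.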

Unable to search for regex extracted fields in fixed length file format

Hi, I'm seeing some very unusual behaviour when extracting fields in Splunk 6.2. Basically, I can see the fields are extracted successfully, but I can't use them to search. I have the following sample data:

101STUS NVLGCCCPDRf4cc5a8023ce40e28c9f260c376dabe9032134120864 032000123456789 191820013550000000000000000ESBtesSSP191820013550000000000000000abcdefSSD00071468C4875691F2CC0000000102763400095 C02763400095 20150721112211485002KO-001HIHI12345 6_1 ABCD02 20150721102147122201507211754000000000000000007400 AU S2015072120150721CRN MH48A 0201 ACSP 20150721112211485ACSC 2015072111221148511215201121520 BIS 0000000000 00000000

This is a fixed-length-field log file (from a mainframe), with no field separators. Therefore, I am using the following regular expression to extract the fields, which basically just extracts them by their position in the log file:

.{11}(?.{4}).{1}(?.{2}).{32}(?.{6})(?.{28})(?.{6})(?.{28})(?.{36})(?.{36})(?.{20})(?.{8})(?.{15}).{33}(?.{3})(?.{6}).{132}(?.{17}).{17}0*(?\d{1,16}).{90}(?.{4}).{4}(?.{17})(?.{4}).{4}(?.{17})

Now when I search for the log in Splunk, I can see all the fields created with the correct values:

index=main sourcetype=mytype

However, if I add one of the fields to the search string, I get no results. For example:

index=main sourcetype=mytype Type=GCCC

I've found that if I put a * on either side of the field value, it does find them, which I find strange:

index=main sourcetype=mytype Type=*GCCC*

This suggests there may be whitespace around the value, but it doesn't appear that way when I look at the values. I've also found that I can successfully search on the fields if I add the filter as an extra search command after the main search:

index=main sourcetype=mytype | search Type=GCCC

This looks like the field extraction doesn't run until after the main search; however, I can see from a lot of my other sourcetypes that this isn't the case, as I can search on those directly. I've also tried a number of other things to get this working:

- Separating the regexes so each field has its own extraction. I still get the same issue.
- Extracting only one field from the data to simplify it. I still get the same issue.
- Adding the following calculated field. This works, but I don't want to add an EVAL for every field, as I'm sure there would be performance implications:

[mytype]
EVAL-Status = Status

Has anyone seen this before? I've played around a lot with the regex; could there be a problem with it? Is there a better way to extract the fields from a fixed-length file? I suspect it's partly because the fields have no separators, so Splunk isn't able to do keyword searches on partial matches. Can anyone confirm? Thanks in advance. Ashley
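The named groups in the regex above were evidently stripped in transit (the `(?...)` fragments would each have carried a `(?<Name>...)` label). For illustration, here is a minimal Python sketch of position-based extraction over the start of the sample record; the group names `Type` and `Country` are hypothetical stand-ins, not the original field names:

```python
import re

# Fixed-width extraction: skip 11 chars, capture 4 as "Type",
# skip 1, capture 2 as "Country". Group names are illustrative.
record = "101STUS NVLGCCCPDRf4cc5a8023ce40e28c9f260c376dabe9"
pattern = re.compile(r".{11}(?P<Type>.{4}).{1}(?P<Country>.{2})")

m = pattern.match(record)
print(m.group("Type"), m.group("Country"))  # GCCC DR
```

Note that fixed-width fields often carry padding, so trimming the captured values (e.g. with `.strip()`) is usually necessary; that would also be consistent with the `Type=*GCCC*` wildcard search succeeding where the exact match fails.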

How to properly use mvc.tokenSafe?

Good day! Currently I am planning on adding a text input to the dashboard through JavaScript, but whenever I run the dashboard, it displays an error in the console log: "Uncaught TypeError: mvc.tokenSafe is not a function." My expected output is to create a text input in JavaScript with a defined token. Here is my current code:

new TextBox({
    id: "textbox1",
    default: "main",
    value: mvc.tokenSafe("$deviceField$"),
    el: $("#searchField")
}).render();

Thanks!

checkbox to toggle panels/rows hide/visible

Assuming I have a dashboard with multiple panels, is there a way to have some visible and some not? Currently this skeleton has the following view:

check box
row1
row2
row3

Is there a way I can do the following:

check box
row1
row2 show/hide
row3 show/hide

with rows 2 and 3 sharing the same control to toggle them between visible and hidden? I am looking to do this in Simple XML.

maps in splunk + can you show country name instead of lat and long values

In the pic below, is there a way to display the country name in the popup instead of the lat and long values? ![alt text][1] [1]: /storage/temp/52221-map-show-country-name.png

SSH to CentOS instance

I created a key pair when turning up the CentOS instance, but am unable to log in with it. SSH is allowed from my private IP.

How can I force a specific ECDHE cipher to communicate with the Splunk web interface?

The desired cipherSuite parameter has been configured in $SPLUNK_HOME/etc/system/local/web.conf, but when I restart Splunk, the web interface is not available. I also see the following warning message in splunkd.log:

WARN HttpListener - Socket error from 127.0.0.1 while idling: error:1408A0C1:SSL routines:ssl3_get_client_hello:no shared cipher

How can I get this to work?

How to use igraph package in the R App for Splunk?

Hi, I am currently exploring the R App for Splunk. For a specific analysis, I need to use the 'igraph' library. I tried several times and couldn't get the code to work. When I try to add the 'igraph' package in 'Manage Packages', the "state" is always "Installing", as you can see from the attached pic. Could you please advise whether this is because the app currently does not support the 'igraph' package, or because I didn't configure the app correctly? ![alt text][1] Thanks and best regards, Ningwei [1]: /storage/temp/54174-igraph.jpg

How to find the average count of a field per hour per day?

Trying to find the average PlanSize count per hour per day:

source="*\\myfile.*" Action="OpenPlan"
| transaction Guid startswith=("OpenPlanStart") endswith=("OpenPlanEnd")
| eval PlanSize=case(NumPlanRows>0 AND NumPlanRows<=100, "1. Small", NumPlanRows>100 AND NumPlanRows<=200, "2. Medium", NumPlanRows>200, "3. Large")
| eval weekday=strftime(_time,"%A")
| eval hour=strftime(_time,"%H")

I would like something like `stats avg(count(PlanSize)) by weekday, hour, PlanSize` or some such. Namely, by day of the week and hour of the day, what is the average count of each variety of plan size being opened? I can't seem to find any syntax that works.
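One way to read "average count per hour per day" is a two-stage aggregation: first count plan-opens per concrete (date, hour, PlanSize) bucket, then average those counts across weeks by (weekday, hour, PlanSize). In SPL this would be roughly a `stats count by ...` followed by a `stats avg(count) by weekday hour PlanSize` (a sketch, not a tested search). The same two-stage logic in Python, on hypothetical sample events:

```python
from collections import Counter, defaultdict
from datetime import datetime

# Hypothetical (timestamp, PlanSize) events: two Small opens on one
# Monday noon hour, one on the following Monday's noon hour.
events = [
    (datetime(2015, 8, 10, 12, 5), "Small"),
    (datetime(2015, 8, 10, 12, 40), "Small"),
    (datetime(2015, 8, 17, 12, 20), "Small"),
]

# Stage 1: count per concrete (date, hour, size) bucket.
per_bucket = Counter((t.date(), t.hour, size) for t, size in events)

# Stage 2: average those bucket counts per (weekday, hour, size).
sums = defaultdict(lambda: [0, 0])  # key -> [total count, bucket count]
for (date, hour, size), n in per_bucket.items():
    key = (date.strftime("%A"), hour, size)
    sums[key][0] += n
    sums[key][1] += 1

averages = {k: total / buckets for k, (total, buckets) in sums.items()}
print(averages)  # {('Monday', 12, 'Small'): 1.5}
```

The single-stage `avg(count(...))` fails because the inner count needs its own grouping (per concrete hour) before anything can be averaged; the two-stage version makes that grouping explicit.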