Filtering log messages with Splunk

Steadily Refined

For this month's column, I installed Splunk at home on my Ubuntu machine. Figure 1 shows how I searched the log data in /var/log and the Apache access logs from my hosting provider, which I had copied over with rsync. Without further ado, Splunk consumed rotated and zipped logs as well.

The first query I issued was fail* OR error. This full-text search finds messages that contain the term error or a term beginning with fail: the wildcard expression fail* matches failed, failure, and so on. Splunk search strings are not case sensitive, but keywords, such as OR, must be written in uppercase. Without keywords, Splunk connects the strings with a logical AND; foo bar thus finds log entries that contain both foo and bar, whereas foo OR bar matches entries that contain either search term. The results in Figure 1 cover the time frame shown at top right, Last 7 days.
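
To summarize these rules, a few illustrative query variants (my own examples, not taken from the article's screenshots):

```
error                  full-text search for the term "error"
fail*                  wildcard: matches fail, failed, failure, ...
foo bar                implicit AND: both terms must occur
fail* OR error         either a fail... term or "error"
```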

All of these results are from the extremely verbose Nagios debug log (nagios.debug). To filter them out, the dialog in Figure 4 defines the nagios-chatter event type with the search expression source=.../nagios.debug (shown on the darkened background). Splunk then internally associates all entries from the original nagios.debug file with this event type. If you then append the expression NOT eventtype=nagios-chatter to the query, the event search filters out any Nagios chatter.

Figure 4: All lines from the nagios.debug logfile (in the darkened area) are associated with the nagios-chatter event type.
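
Behind the scenes, Splunk keeps event type definitions in a configuration file below its installation directory. As a sketch, the stanza in etc/system/local/eventtypes.conf might look like this (the logfile path is a placeholder for your own):

```
[nagios-chatter]
search = source="/var/log/nagios/nagios.debug"
```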

The next application spamming the logfiles was my nightly backup process. The rsync command it runs apparently could not copy some files because of a lack of access rights, which filled the backup.all.log file every night. Thus, I needed a second event type, rsync-error, to filter that out.

To keep the search commands short, I defined the chatter event type, which uses

eventtype="nagios-chatter" OR
eventtype="rsync-error"

to combine the two filters. This means that I just need to append NOT eventtype=chatter to my query to hide all this log chatter with the help of an arbitrarily extensible filter. The messages from /var/log/messages shown in Figure 5 remain; they report failed password attempts and seem to merit deeper manual analysis.

Figure 5: Without nagios-chatter and rsync-error, only the important events remain.

Scaling with Hadoop

Sure, you could argue that instead of reading a mess of data in /var/log, it would be easier just to read the correct file, /var/log/auth.log. The great advantage of Splunk, though, is precisely that it stores all (!) your logfiles – even those from entire server farms – and lets you run queries against them all at once.

For this approach to work well in large data centers, Splunk uses a Hadoop cluster [2] [3] as its back end, which distributes the complicated computation of the search results across multiple nodes.

The web interface allows end users without programming skills to select fields and define limiting filters graphically. Splunk permanently stores all user entries in the configuration files below etc in the Splunk installation directory (Figure 6). As you gradually develop your own definitions, you would do well to back up these files occasionally with a source control system such as Git, so you can roll back if a change turns out to have been a bad idea.
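
As a sketch, putting the configuration directory under Git could look like the following; a temporary directory stands in for the real location here, because that depends on where Splunk is installed (often /opt/splunk/etc):

```shell
# Stand-in for $SPLUNK_HOME/etc; on a real installation you would
# point SPLUNK_ETC at the actual path, e.g. /opt/splunk/etc.
SPLUNK_ETC=$(mktemp -d)
printf '[nagios-chatter]\nsearch = source="*/nagios.debug"\n' \
  > "$SPLUNK_ETC/eventtypes.conf"

# One-time setup: turn the config directory into a Git repository
git -C "$SPLUNK_ETC" init -q
git -C "$SPLUNK_ETC" add -A
git -C "$SPLUNK_ETC" -c user.name=splunk \
  -c user.email=splunk@localhost commit -q -m "config snapshot"
```

After each round of changes in the web interface, another git add -A and git commit records the new state, and git diff reveals exactly what Splunk wrote.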

Figure 6: Splunk stores definitions created in the web interface in readable configuration files.

The Poor Developer's Splunk Alert

Unlike the enterprise version, the free version of Splunk does not provide alerts that notify the user when queries exceed set limits. The script in Listing 1 [4], however, will help those on a more frugal budget; it runs on the same machine and periodically taps into Splunk's web API. Splunk sends the results in JSON format, and the script reformats them and mails them out. In other words, Splunk can run on your home computer behind a firewall, while a local cronjob periodically fires off queries and sends the results to a recipient anywhere on the Internet.
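
A crontab entry along these lines would fire off the script once a day at 7:00am (the installation path is hypothetical):

```
0 7 * * * /usr/local/bin/daily-incidents
```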

Listing 1

daily-incidents

001 #!/usr/local/bin/perl -w
002 use strict;
003 use XML::Simple;
004 use LWP::UserAgent;
005 use JSON qw( from_json );
006 use Text::ASCIITable;
007 use Net::SMTP;
008 use Email::MIME;
009 use File::Basename;
010
011 my $host = "127.0.0.1";
012 my $port = 8089;
013 my $login_path =
014  "servicesNS/admin/search/" .
015  "auth/login";
016 my $user     = "admin";
017 my $password = "changeme";
018 my $from_email =
019   'm@perlmeister.com';
020 my $to_email = $from_email;
021 my $subject =
022   'Daily Incidents';
023 my $smtp_server =
024   'smtp.provider.net';
025
026 my $ua =
027   LWP::UserAgent->new(
028  ssl_opts =>
029    { verify_hostname => 0 });
030
031 my $resp = $ua->post(
032   "https://$host:$port" .
033   "/$login_path",
034  {
035   username => $user,
036   password => $password
037  }
038 );
039
040 if ($resp->is_error()) {
041  die "Login failed: ",
042    $resp->message();
043 }
044
045 my $data =
046   XMLin($resp->content());
047 my $key =
048   $data->{sessionKey};
049
050 my $header =
051   HTTP::Headers->new(
052  Authorization =>
053    "Splunk $key");
054 $ua->default_headers(
055  $header);
056
057 $resp = $ua->post(
058   "https://$host:$port/" .
059   "servicesNS/admin/search/" .
060   "search/jobs/export",
061  {
062   search =>
063    "search fail* OR error " .
064    "NOT eventtype=chatter " .
065    "earliest=-24h",
066   output_mode => "json",
067  },
068 );
069
070 my $t =
071   Text::ASCIITable->new(
072  { headingText => $subject }
073   );
074 $t->setCols("date", "source",
075  "log");
076 $t->setColWidth("date", 10);
077 $t->setColWidth("log",  34);
078
079 for my $line (split /\n/,
080  $resp->content())
081 {
082  my $data = from_json($line);
083  next
084    if !
085     exists $data->{result};
086  my $r = $data->{result};
087  $t->addRow($r->{_time},
088   basename($r->{source}),
089   $r->{_raw});
090 }
091
092 my $smtp = Net::SMTP->new(
093  $smtp_server);
094
095 my $email =
096   Email::MIME->create(
097  header_str => [
098   From    => $from_email,
099   To      => $to_email,
100   Subject => $subject,
101  ],
102  parts => [
103   Email::MIME->create(
104    attributes => {
105     content_type =>
106       "text/html",
107     disposition => "inline",
108     charset     => "UTF-8",
109     encoding =>
110       "quoted-printable",
111    },
112    body_str =>
113 "<html><pre>$t</pre></html>",
114   )
115  ],
116   );
117
118 $smtp->mail($from_email);
119 $smtp->to($to_email);
120 $smtp->data();
121 $smtp->datasend(
122  $email->as_string);
123 $smtp->dataend();
124 $smtp->quit();

To begin, the script in Listing 1 needs to identify itself to the Splunk REST API. A fresh installation sets admin (account) and changeme (password) for the web GUI by default. The script executes the login as an HTTPS request. Line 11 defines the server running the REST API as 127.0.0.1; line 12 sets the Splunk server port to 8089.

The post() method called in line 31 uses SSL to send the login data to Splunk. Because the LWP::UserAgent CPAN module does not ship with a browser-style CA certificate bundle by default, line 29 sets the verify_hostname option to 0, which disables verification of the server certificate.

Splunk returns the results of the login as XML, and the XMLin() function from the CPAN XML::Simple module in the script converts the XML into a data structure. After a successful login, the structure contains a sessionKey field. The key contains a hexadecimal number that must accompany each following REST request for Splunk to recognize it as belonging to a logged-in user. The UserAgent's default_headers() method handles this automatically for all subsequent requests.

The tutorial on the Splunk website [5] describes the details of the REST API. Although the Splunk SDK has client libraries for Python, Java, JavaScript, PHP, Ruby, and C#, it lacks a Perl kit. A module is available on CPAN, but it does not work with Splunk version 5. Fortunately, REST queries are easy to program, which is why Listing 1 talks to the REST interface directly.
