Splunking Oracle's ZFS Appliance

We have a bunch of Oracle ZFS Appliances. What I really like about them is their integrated, DTrace-based analytics feature.

However, some things are missing or cause problems:

- Storing long-term analytics data on the appliances produces a lot of data on the internal disks. This can fill up the appliance and, in the worst case, slow down the appliance software.

- Scaling the timeline out too far makes peaks invisible. This is probably a limitation of the rendering software used on the appliance (JavaScript).

- Comparing all our appliances is not possible; there is no central analytics console.

As we are heavy Splunk users, I sat down with our friendly storage consultant from Oracle, and together we brought these two great products closer together.

This is how we did it:

1. Set Up Analytics Worksheets

First we had to create the analytics worksheets. This is best done via the CLI, because the order of the drilldowns should always be the same; otherwise the fields in the generated CSV files might come in a different order on every appliance. Doing this in the BUI is possible, but hard...

I would also recommend storing the worksheet under a separate appliance user.

Sample CLI commands:

analytics
worksheets
create Monitor
select worksheet-???


dataset
set name=io.ops[op]
set drilldowns=read,write
set seconds=3600
commit


dataset
set name=nfs4.ops
set seconds=3600
commit
...


2. Fetch Analytics Data

Script Excerpt:

ssh -T ${user}@${ipname} > ${outputdir}/${wsname}.${ipname}.out << --EOF--
script
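// enter the analytics worksheets context and list all worksheets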
run('analytics');
run('worksheets');
var ws=list();
printf("Worksheets:%d\\n",ws.length);
printf("%s\\n",ws);
for(var i=0; i<ws.length; i++) {
  run('select ' + ws[i]);
  var wsname=get('name');
  printf("Worksheet Name:%s\\n",wsname);
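  // export only the worksheet created in step 1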
  if ( wsname == "$wsname" ) {
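    // list the datasets of this worksheet and dump each one as CSV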
    var ds=list();
    for(var j=0; j<ds.length; j++) {
      run('select ' + ds[j]);
      var dsname=get('name');
      printf("zfssa_%s\\n",dsname);
      dump(run('csv'));
      run('done');
    }
  }
  run('done');
}
run('done');


--EOF--
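
The excerpt expects the shell variables ${user}, ${ipname}, ${wsname} and ${outputdir} to be set by the surrounding script; it writes one output file per appliance and worksheet. A minimal sketch of such a surrounding loop, with purely hypothetical user, host and directory names:

#!/bin/bash
# Hypothetical wrapper -- user, appliance names and output directory are examples only.
user="analytics"                # the separate appliance user from step 1
wsname="Monitor"                # the worksheet created in step 1
outputdir="/var/tmp/zfssa"      # directory Splunk will pick the files up from
mkdir -p "${outputdir}"

for ipname in zfs01.example.com zfs02.example.com; do
    # the ssh << --EOF-- block from the excerpt above goes here,
    # producing ${outputdir}/${wsname}.${ipname}.out for each appliance
    :
done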

3. Configure Splunk Inputs
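
One straightforward way to get the output files into Splunk is a file monitor input on the directory the fetch script writes to. A minimal sketch of an inputs.conf stanza; the path, index and sourcetype names are assumptions and have to match your environment:

# inputs.conf on the forwarder that runs the fetch script (names are placeholders)
[monitor:///var/tmp/zfssa]
whitelist = \.out$
sourcetype = zfssa_analytics
index = storage
disabled = false

Alternatively, the fetch script itself could be run as a scripted input on an interval, so that Splunk schedules the collection.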

4. Create Splunk Dashboards
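
Dashboards are then just searches over the new sourcetype. A purely hypothetical example; the index, sourcetype and field names depend entirely on your inputs and field extractions:

index=storage sourcetype=zfssa_analytics
| timechart span=5m avg(ops) by host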

5. Enjoy Analytics Data Under Splunk

Happy Spelunking...
