Heating up the Data Pipeline (Part 4)
In this last part of the "Heating up the Data Pipeline" blog series, we will walk through some potentially useful NiFi dataflows.

Previous parts: Part 1, Part 2, Part 3

Updating Splunk Lookup Files with SQL Query Data

The following simple workflow pulls data from a SQL database using a JDBC connection. The ExecuteSQL processor returns its results in Avro format, so we send them through a ConvertRecord processor to convert from Avro to CSV. Since Splunk does not allow CSV lookup files to be uploaded directly, we have to place the data in a spool directory on the Splunk server. In this case Splunk resides on the same host, so we can use the PutFile processor; for a remote server we could use the PutSFTP processor instead. Finally, we assemble the proper POST request and invoke the Splunk REST endpoint to register the staged file as a lookup (a sketch of this call appears later in this post).

Splunk Event Copier

Sometimes you want to copy a subset of events from a production system t...
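As promised above, here is a hedged sketch of the final step in the lookup-update flow: the POST request that registers the staged CSV as a Splunk lookup. In the actual dataflow this request would typically be issued by NiFi (for example with an InvokeHTTP processor); the Python sketch below just makes the shape of the call explicit. The endpoint path (data/lookup-table-files), the parameters name and eai:data, the staging path under $SPLUNK_HOME/var/run/splunk/lookup_tmp, the app name, and the credentials are all assumptions here and should be checked against the Splunk REST API Reference for your version.

```python
# Hedged sketch: register a staged CSV file as a Splunk lookup via the REST API.
# Assumptions to verify against the Splunk REST API Reference for your version:
#   - the endpoint /servicesNS/nobody/<app>/data/lookup-table-files
#   - the parameters "name" and "eai:data"
#   - the staging directory $SPLUNK_HOME/var/run/splunk/lookup_tmp
import requests

SPLUNK_HOST = "https://localhost:8089"      # management port, not the web UI port
APP = "search"                              # app that should own the lookup (assumption)
STAGED_CSV = "/opt/splunk/var/run/splunk/lookup_tmp/sql_export.csv"  # file written by PutFile (hypothetical name)
LOOKUP_NAME = "sql_export.csv"              # name the lookup will have inside Splunk

resp = requests.post(
    f"{SPLUNK_HOST}/servicesNS/nobody/{APP}/data/lookup-table-files",
    auth=("admin", "changeme"),             # use a dedicated service account or token in practice
    data={"name": LOOKUP_NAME, "eai:data": STAGED_CSV},
    verify=False,                           # only acceptable for self-signed certs in a lab setup
)
resp.raise_for_status()
print("Lookup registered, HTTP status:", resp.status_code)
```

In NiFi, the same call amounts to pointing InvokeHTTP (or a similar HTTP processor) at that URL with the two form parameters; the sketch is only meant to show what the request looks like, not to replace the dataflow itself.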