Data Feeds: Getting Data into DarkLight

The Data Feeds View is used to configure the sources of data that flow into DarkLight and trigger PRO Playbooks. Data Feeds are configured in this view, and then referenced in PRO Playbooks by an Ingest step.

There are several ways to physically move the bits so DarkLight can use them as input for Playbooks.

  • Use middleware to send collected data to a Java Messaging Service (JMS) destination, like an ActiveMQ topic or queue. Useful tools in this space include Logstash, NiFi, Kafka, etc. In DarkLight, create a JMS or Kafka Data Feed to subscribe to the queue.
  • Have a data collection tool output JSON or CSV files to a network folder that DarkLight can also reach, then set up a Folder Data Feed to ingest the files.
  • Create a Playbook with a Schedule Data Feed that launches a Splunk or Elasticsearch Data Source query to bring back results at regular intervals.
  • Have a data collection tool send HTTP POST requests directly to DarkLight and set up a POST Data Feed to ingest the payloads.

The Data Feeds View is on the PRO Playbooks and Data perspectives by default but can be added to any perspective by choosing Window → Show View and choosing the Data Feeds view.

The icons in the Data Feeds view show the status of data flowing through them, or show why a feed is not active. Hover over the icon for more information.

DarkLight can be configured to accept data from a message queue, a locally-accessible folder, a schedule, or an HTTP POST endpoint. Any number of separate feeds of each type can be configured and named separately.

To create a new Data Feed configuration, click the New Data Feed icon and then choose if you want to configure a Java Message Service (JMS) queue/topic, Kafka Subscription, Simple Text Oriented Messaging Protocol (STOMP), Folder, Schedule, or Post. Note: once this choice is made, it cannot be changed for this configuration.

A new configuration with a generated name will appear in the left side list and the settings will be shown on the right side of the view. Fill out the details (Description is optional). The settings will save automatically.

When all the required fields have been entered, the orange asterisk icon will be replaced with a grey circle, indicating the configuration is complete and disabled. Click the checkbox next to the name to activate the feed, instructing DarkLight to begin monitoring it for data.

  • Tip: Using an HTTPS connection? Certificates are automatically downloaded to the server.

Notes on Java Message Service (JMS) Feeds

  • Address (URL) and Topic/Queue name are required
  • A protocol (tcp:// or ssl://) is required
  • Specify the port at the end of the address with a colon, e.g. :61616
  • Name and Password fields are optional
  • Topics operate with a publish/subscribe model where multiple clients can access the same data.
  • Queues operate with a point-to-point model: once one client reads a message, it is not available to any other clients.
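
For example, a completed JMS feed configuration might look like the following; the broker host, port, and topic name here are placeholders, not defaults:

```
Address:     ssl://broker.example.com:61616
Topic/Queue: darklight.events
Name:        dl-reader   (optional)
Password:    ********    (optional)
```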

Notes on Kafka Feeds

  • Server address, Subscription, and Group ID are required
  • No protocol is required before the URL or IP address
  • Specify the port at the end of the address with a colon, e.g. :9092
  • For Secure Connections, check the Secure Connection box (and make sure you've specified the secure port)
  • Group ID is how Kafka does load-balancing, so if you have multiple instances of DarkLight subscribing to the feed and you want them to all receive copies of the data, make sure the Group ID is unique on each instance. If two instances of DarkLight are using the same Group ID, Kafka will only send data to one of them.
  • Use the Ingest Step with this Data Feed
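
To make the Group ID behavior concrete: two DarkLight instances that should each receive a full copy of the stream would be configured with distinct Group IDs (the hostname and IDs below are hypothetical):

```
# Instance A
Server:        kafka.example.com:9092
Subscription:  darklight-events
Group ID:      darklight-node-a

# Instance B
Server:        kafka.example.com:9092
Subscription:  darklight-events
Group ID:      darklight-node-b
```

With the same Group ID on both, Kafka would treat them as one consumer group and deliver each message to only one of them.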

Notes on STOMP Message Queues

  • Address (URL) and Subscription are required
  • A protocol (tcp:// or ssl://) is required
  • Specify the port at the end of the address with a colon, e.g. :61613
  • Subscription must start with /topic/ or /queue/
  • Name and Password fields are optional
  • Topics operate with a publish/subscribe model where multiple clients can access the same data.
  • Queues operate with a point-to-point model: once one client reads a message, it is not available to any other clients.
  • Use the Ingest Step with this Data Feed
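
The /topic/-or-/queue/ rule above is easy to sanity-check before saving a feed. A minimal sketch (the function name and checks are ours, not part of DarkLight):

```python
def check_stomp_subscription(destination: str) -> bool:
    """Return True if the destination names a STOMP topic or queue."""
    if not destination.startswith(("/topic/", "/queue/")):
        return False
    # Require a non-empty name after the prefix.
    return len(destination.split("/", 2)[2]) > 0

print(check_stomp_subscription("/topic/darklight.events"))  # True
print(check_stomp_subscription("darklight.events"))         # False: missing prefix
```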

Notes on Folder Feeds

  • A folder can be any path the server can access.
  • Using multiple folder locations can be a useful way to separate data created by scripts external to DarkLight.
  • Warning: As soon as the Data Feed is checked, DarkLight will consume ALL files in the chosen folder, regardless of file type.
  • Check the "Archive" box if you want DarkLight to create an "Archive" folder in the monitored location in which it moves a copy of any files that have been ingested (recommended especially during initial testing so you don't lose any files).
  • Use the Ingest Step with this Data Feed
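
The consume-and-archive behavior described above can be sketched in a few lines. This is our own approximation of the idea, not DarkLight's implementation; the folder names are placeholders:

```python
import shutil
from pathlib import Path

def ingest_folder(feed_dir: str, archive: bool = True) -> list:
    """Consume every file in the monitored folder, regardless of type.

    With archive=True, each ingested file is moved into an Archive/
    subfolder so it is not lost after ingestion.
    """
    feed = Path(feed_dir)
    ingested = []
    for entry in sorted(feed.iterdir()):
        if entry.is_file():  # directories (including Archive/) are skipped
            ingested.append(entry.name)
            if archive:
                archive_dir = feed / "Archive"
                archive_dir.mkdir(exist_ok=True)
                shutil.move(str(entry), str(archive_dir / entry.name))
    return ingested
```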

Notes on Schedule Feeds

  • The schedule is controlled using a Cron Pattern or a Timer that counts down the number of seconds specified.
  • The Schedule Preset list offers several common Cron patterns that are filled into the box and can be used as-is or edited further.
  • Malformed cron patterns will turn red
  • Use the Schedule Step with this Data Feed
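
As a rough illustration of what "malformed" means, here is a hedged sketch of a checker for a classic five-field cron pattern; DarkLight's actual parser may accept more syntax (e.g. a seconds field, day names like MON, or @-macros):

```python
import re

# One cron field: *, a number, an optional range, optional /step, and comma lists.
CRON_FIELD = re.compile(r"^(\*|\d+)(-\d+)?(/\d+)?(,(\*|\d+)(-\d+)?(/\d+)?)*$")

def looks_like_cron(pattern: str) -> bool:
    """Rough check for a five-field cron pattern (minute hour dom month dow)."""
    fields = pattern.split()
    return len(fields) == 5 and all(CRON_FIELD.match(f) for f in fields)

print(looks_like_cron("0 */6 * * 1-5"))  # True: minute 0, every 6 hours, Mon-Fri
print(looks_like_cron("0 6 *"))          # False: only three fields
```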

Notes on HTTP POST Feeds

  • DarkLight only accepts payloads of JSON-formatted text. Use one of the following Content-Type values:
    • application/octet-stream
    • application/json
    • text/plain (or anything that starts with text/)
  • The URL your appliance should send to is a combination of configurable options.
    • Choose http or https from Window → Preferences: Web Server Settings
    • The address is the address of your DarkLight server
    • The port is set in the Web Server Settings preference (Window → Preferences: Web Server Settings)
    • The first part of the path is always /upload (see Roles Required below)
    • Any further path is defined in the Path input
    • Pattern: http(s)://<darklightserver>:port/upload/<path>/*
    • Example: http://192.168.2.2:48080/upload/mystuff/
  • Is Package Only Feed should be checked if this feed is receiving packages from other DarkLight installations (using the Web Request step).
  • Roles Required should be checked if you want to protect the feed with a username and password.
    • Create a text file called jetty-realm.properties and put it in <path to workspace>/config/application
    • add one line for each Name/Password/Role you want to add using the format:
      • username: password,rolename
      • If you do not want the password in plaintext in this file, you can encode it with either CRYPT, OBF, or MD5. Use that prefix with a colon as a part of the password (example: MD5:34819d7beeabb9260a5c854bc85b3e44)
    • The first part of the URL's path will be the value of rolename. For example, the default role is upload so the path starts with /upload.
  • Use the Ingest Step with this Data Feed
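
Putting the URL pieces together, here is a hedged sketch that assembles the upload URL and prepares (but does not send) a JSON POST; the host, port, and path reuse the example values above and are not defaults:

```python
import json
import urllib.request

def build_upload_url(scheme, host, port, path, role="upload"):
    """Assemble http(s)://<server>:<port>/<role>/<path>/ as described above."""
    return "{}://{}:{}/{}/{}/".format(scheme, host, port, role, path.strip("/"))

url = build_upload_url("http", "192.168.2.2", 48080, "mystuff")
request = urllib.request.Request(
    url,
    data=json.dumps({"event": "test"}).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # one of the accepted types
    method="POST",
)
print(request.full_url)  # http://192.168.2.2:48080/upload/mystuff/
```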

To start a PRO Playbook with a Data Feed, click the Add Step button in the PRO Playbook Editor and choose one of the Data Feed types.

To change the source of an Ingest step, click the step (it will turn blue) and choose a different Data Feed configuration from the menu.

Prevent Data Loss

When a Data Feed is enabled (i.e., checked), any PRO Playbook that has that specific feed at the top will receive any messages that arrive via that feed. If a message comes in from a Data Feed, but no active (checked) Playbooks use that ingest source, the message is dropped. To prevent data loss, uncheck any Data Feeds you do not have active Playbooks for.