The LogScale Azure Event Hub Collector is an open source project and not a CrowdStrike product. As such, it carries no formal support, expressed or implied.
Event Hubs are data/event ingestion services that can be integrated with functions and services, both internal and external to Azure.
As Event Hubs are often used as temporary storage of data/events, we can utilize Azure Logic Apps to forward the data/events to CrowdStrike LogScale for storage, analytics, or other purposes.
Refer to the LogScale documentation for:
- Instructions to set up a new LogScale repository
- Instructions to set up a custom parser
- Instructions to generate the ingest token
The following script can be copied and pasted for the custom parser:
```
parseJson(@rawstring)
| split("ContentData.records", strip=true) // Split into individual events
| drop([@rawstring, _index]) // Clean up
```
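To illustrate what the parser's `split()` step does, the following Python sketch walks through a minimal example message. The payload structure (`ContentData.records`) is assumed to match what the Logic App forwards; the field contents are purely illustrative.

```python
import json

# A minimal example payload in the shape the workflow forwards (assumed structure):
# one batched message whose "ContentData.records" array holds several events.
raw = json.dumps({
    "ContentData": {
        "records": [
            {"operationName": "write", "level": "Information"},
            {"operationName": "delete", "level": "Warning"},
        ]
    }
})

parsed = json.loads(raw)                    # parseJson(@rawstring)
events = parsed["ContentData"]["records"]   # split("ContentData.records")
for event in events:                        # each element becomes its own event
    print(event["operationName"])           # prints: write, then delete
```

Without the `split()` call, the whole batch would land in LogScale as a single log line; with it, each element of the array is searchable as its own event.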
The Logic App will contain all of the workflows, each of which is used to forward data from a single Event Hub instance.
Each Event Hub instance to be ingested into LogScale requires a separate Workflow. To create a workflow, a Trigger and an Action need to be specified.
For the Trigger, we are utilizing the "Event Hub" Trigger in the "Azure" tab.
The trigger requires a connection to an Event Hub Namespace via a Connection String. To obtain one, create a Shared Access Policy with the "Listen" claim in the Event Hub's "Shared access policies" tab. The "Connection string–primary key" will then be generated and can be used for the Workflow trigger.
After the Event Hub Namespace is connected, the trigger can be set up with an Event Hub Instance from the dropdown list, the Content Type "application/json", the default Consumer group name, the Maximum Event Count set to "175", and the interval to check for new events set to "1 Second".
In the settings tab of the Trigger, turn off "Split On" as shown in the screenshot below.
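For reference, the trigger settings above can be collected as follows. This is a hedged sketch in Python dict form, not the exact Logic App workflow-definition schema; the key names and the Event Hub instance name are illustrative.

```python
# Summary of the trigger configuration described above.
# Key names are illustrative, not the Logic App definition schema.
trigger_settings = {
    "eventHubName": "my-event-hub-instance",  # placeholder; chosen from the dropdown
    "contentType": "application/json",
    "consumerGroupName": "$Default",          # the default consumer group
    "maximumEventCount": 175,
    "checkInterval": {"frequency": "Second", "interval": 1},
    "splitOn": False,                         # "Split On" turned off in the Settings tab
}
```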
For the Action, the simple "HTTP" action is needed, with the Method being a "POST" request and the URI being the LogScale HEC URL (which should look similar to this: https://your-logscale-url.com/api/v1/ingest/hec/raw). For the Header key and value, follow the LogScale HEC format:
Key: "Authorization"
Value: "Bearer <Insert your Ingest Token created in Step 1>"
For the Body, "Body" needs to be selected to have the events in the correct format to be parsed into LogScale.
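The request the HTTP action sends can be sketched in Python as follows, using only the standard library. The URL and ingest token are placeholders; substitute your own LogScale HEC URL and the token created earlier.

```python
import json
import urllib.request

HEC_RAW_URL = "https://your-logscale-url.com/api/v1/ingest/hec/raw"  # placeholder URL
INGEST_TOKEN = "YOUR-INGEST-TOKEN"  # placeholder for your ingest token


def build_forward_request(body: dict) -> urllib.request.Request:
    """Build the POST request the workflow Action sends for one trigger batch."""
    return urllib.request.Request(
        HEC_RAW_URL,
        data=json.dumps(body).encode("utf-8"),  # the trigger's "Body" output
        headers={
            "Authorization": f"Bearer {INGEST_TOKEN}",  # LogScale HEC header format
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the request (e.g. with `urllib.request.urlopen`) is what the Logic App does on each trigger poll; the sketch only builds it so the header and method shape are visible.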
Finally, the workflow can be saved, which will start the execution of the workflow and forward the data/events to LogScale.
- Multiple events coming from Azure may be batched as a single log in LogScale, since 175 events per second is the maximum allowed by Azure.
- Dashboards are currently not available; they will be published as a LogScale package on the Marketplace once available.