runZero integrates with Sumo Logic to make your asset inventory available directly in Sumo Logic. This article will show you how to export your runZero inventory into Sumo Logic for use within the SIEM.
Integrating runZero with Sumo Logic
There are three options for setting up the connection between Sumo Logic and runZero, each with its own configuration steps.
Option A: Local script
- Create a Sumo Logic HTTP Source.
- Configure your host to run the provided script.
Option B: AWS Lambda function
- Create a Sumo Logic HTTP Source.
- Configure the AWS Lambda function to run the provided script.
Option C: Sumo Logic script source
- Install a Sumo Logic collector.
- Create a Sumo Logic script source.
Once your asset data is flowing into Sumo Logic, you can start working with it there.
Option A: Local script
Step 1: Create a Sumo Logic HTTP Source
- After logging in to Sumo Logic, navigate to Manage Data > Collection.
- Click Add Collector then select Hosted Collector.
- Provide a name, such as runZero Collector, and click Save.
- If prompted to add a data source, click OK. Otherwise, find your Collector in the list and click Add Source.
- Select the HTTP Logs and Metrics source.
- Provide a name, such as runZero Assets, then click Save.
- Copy the URL provided to use in step 2.
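Optionally, you can confirm the HTTP Source accepts data before wiring up the script by posting a one-line test message to the URL you just copied. This check is not part of the official setup; the sketch below assumes Python 3 with the requests library installed and the URL exported as SUMO_HTTP_ENDPOINT.
import os
import requests

# Optional check (not part of the official setup): post one test message to the
# HTTP Source URL copied above, exported here as SUMO_HTTP_ENDPOINT.
endpoint = os.environ['SUMO_HTTP_ENDPOINT']
resp = requests.post(endpoint, data='runZero integration test message\n')
print(resp.status_code)  # Sumo Logic returns 200 when the message is accepted
Note that the sample searches at the end of this article use _sourceCategory="runzero", so set the Source Category on the HTTP Source accordingly if you plan to reuse them.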
Step 2: Configure your host to run the provided script
- Identify the host you would like to run the script from.
- Ensure the host has Python 3 and Pipenv installed.
- Save the script below to the host it will be run from.
#!/usr/bin/env python3
import json
import requests
import os

# RUNZERO CONF
RUNZERO_EXPORT_TOKEN = os.environ['RUNZERO_EXPORT_TOKEN']
HEADERS = {'Authorization': f'Bearer {RUNZERO_EXPORT_TOKEN}'}
BASE_URL = 'https://console.runZero.com/api/v1.0'

# SUMO LOGIC CONF
HTTP_ENDPOINT = os.environ['SUMO_HTTP_ENDPOINT']

def main():
    # Export the full asset inventory from runZero
    url = BASE_URL + '/export/org/assets.json'
    assets = requests.get(url, headers=HEADERS)

    # Send the assets to Sumo Logic in batches of 500, one JSON object per line
    batchsize = 500
    for i in range(0, len(assets.json()), batchsize):
        batch = assets.json()[i:i+batchsize]
        f = open('upload.txt', 'w')
        f.truncate(0)
        for a in batch:
            json.dump(a, f)
            f.write('\n')
        f.close()
        r = open('upload.txt')
        requests.post(HTTP_ENDPOINT, data=r.read())
        r.close()

if __name__ == '__main__':
    main()
- Create your environment variables by running the following commands:
  - export RUNZERO_EXPORT_TOKEN=XXX: Use your runZero export API token, which can be obtained in your runZero console on an organization detail page. Select the organization you wish to export data from, then click Edit organization to view the export API token.
  - export SUMO_HTTP_ENDPOINT=XXX: Use the Sumo Logic HTTP Source URL obtained in step 1.
- Create your virtual environment to run the script by running pipenv --python /path/to/python3.
- Install the requests library in your virtual environment for making API calls:
  pipenv shell
  pip install requests
- Test the script by running it from the virtual environment.
  - Use the location from the pipenv output to start.
  - Append /bin/python3 to use Python in the virtual environment.
  - Use the full path to the script.
  my-server:~/ $ /home/user/.local/share/virtualenvs/runZero-scripts-mVQtFLDO/bin/python3 /home/user/scripts/script.py
- Configure a crontab task to run at the desired cadence.
  - On the hour: 0 * * * * RUNZERO_EXPORT_TOKEN=XXX SUMO_HTTP_ENDPOINT=XXX /path/to/virtual/env/python3 /path/to/script.py
  - Daily at midnight: 0 0 * * * RUNZERO_EXPORT_TOKEN=XXX SUMO_HTTP_ENDPOINT=XXX /path/to/virtual/env/python3 /path/to/script.py
  - Weekly at midnight on Monday: 0 0 * * 1 RUNZERO_EXPORT_TOKEN=XXX SUMO_HTTP_ENDPOINT=XXX /path/to/virtual/env/python3 /path/to/script.py
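Before leaving the cron job to run unattended, it can be worth confirming that the export token is valid and returns assets. The snippet below is an optional sanity check, not part of the official steps; it reuses the same RUNZERO_EXPORT_TOKEN environment variable and the export endpoint used in the script above.
import os
import requests

# Optional pre-flight check: confirm the export token is accepted and see
# how many assets the scheduled job will upload.
token = os.environ['RUNZERO_EXPORT_TOKEN']
resp = requests.get(
    'https://console.runZero.com/api/v1.0/export/org/assets.json',
    headers={'Authorization': f'Bearer {token}'},
)
print(resp.status_code)  # 200 means the token was accepted
if resp.ok:
    print(len(resp.json()), 'assets will be uploaded')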
Option B: AWS Lambda function
Step 1: Create a Sumo Logic HTTP Source
- After logging in to Sumo Logic, go to Manage Data > Collection.
- Click Add Collector then select Hosted Collector.
- Provide a name, such as runZero Collector, and click Save.
- If prompted to add a data source, click OK. Otherwise, find your Collector in the list and click Add Source.
- Select the HTTP Logs and Metrics source.
- Provide a name, such as runZero Assets, then click Save.
- Copy the URL provided to use in step 2.
Step 2: Configure the AWS Lambda function to run the provided script
- Go to your AWS Console and navigate to the Lambda page.
- Click Create a function.
- Give your function a name.
- Select Python 3.9 as the runtime.
- Everything else can be left at the default settings. Click Create function to move to the next page.
- Click Add Trigger to set up a cron job.
- Select EventBridge to set up a schedule.
- Use an existing rule or select Create new rule.
  - Give it a name and set Rule type to Schedule expression.
  - Use one of these options or create your own based on the desired cadence:
    - Daily: rate(1 day)
    - Every 12 hours: rate(12 hours)
    - Every 3 hours: rate(3 hours)
  - Click Add to return to the main Lambda configuration page.
- Under Configuration, select Environment variables.
- Enter these two environment variables:
  - RUNZERO_EXPORT_TOKEN: your runZero export API token, which can be obtained in your runZero console on an organization detail page. Select the organization you wish to export data from, then click Edit organization to view the export API token.
  - SUMO_HTTP_ENDPOINT: the Sumo Logic HTTP Source URL obtained in step 1.
- Click Save to return to the main Lambda configuration page.
- Click the Code tab and replace the default code with this script.
import json
import urllib3
import os

# RUNZERO CONF
RUNZERO_EXPORT_TOKEN = os.environ['RUNZERO_EXPORT_TOKEN']
HEADERS = {'Authorization': f'Bearer {RUNZERO_EXPORT_TOKEN}'}
BASE_URL = 'https://console.runZero.com/api/v1.0'

# SUMO LOGIC CONF
HTTP_ENDPOINT = os.environ['SUMO_HTTP_ENDPOINT']

def lambda_handler(event, context):
    http = urllib3.PoolManager()

    # Export the full asset inventory from runZero
    url = BASE_URL + '/export/org/assets.json'
    response = http.request('GET', url, headers=HEADERS)
    data = response.data
    assets = json.loads(data)

    # Send the assets to Sumo Logic in batches of 500, one JSON object per line.
    # Lambda only allows writes under /tmp, so the temporary file lives there.
    batchsize = 500
    for i in range(0, len(assets), batchsize):
        batch = assets[i:i+batchsize]
        f = open('/tmp/upload.txt', 'w')
        f.truncate(0)
        for a in batch:
            json.dump(a, f)
            f.write('\n')
        f.close()
        r = open('/tmp/upload.txt')
        http.request('POST', HTTP_ENDPOINT, body=r.read())
        r.close()
- Click Deploy to update the code.
- Click Test to verify the code works.
Your asset data export will now be posted to Sumo Logic at the cadence you configured.
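The Test button in the Lambda console is the easiest way to verify the function end to end. If you would rather exercise the handler on a Linux or macOS machine first, a minimal sketch is shown below; it assumes the code above is saved locally as lambda_function.py (a hypothetical filename), urllib3 is installed, and both environment variables are exported in your shell.
import os

# Assumes RUNZERO_EXPORT_TOKEN and SUMO_HTTP_ENDPOINT are already exported;
# fail early with a clear message if they are not.
for var in ('RUNZERO_EXPORT_TOKEN', 'SUMO_HTTP_ENDPOINT'):
    if var not in os.environ:
        raise SystemExit(f'{var} is not set')

# lambda_function.py is a hypothetical local copy of the code above.
from lambda_function import lambda_handler

# Invoke the handler with an empty event, mimicking a scheduled EventBridge trigger.
lambda_handler({}, None)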
Option C: Sumo Logic script source
Step 1: Install a Sumo Logic collector
Follow the Sumo Logic documentation to install a collector.
Step 2: Create a Sumo Logic script source
Sumo Logic also has documentation on script sources. Once your collector is installed, follow these steps to set up the script source.
- Navigate to the Collection page in Sumo Logic.
- Find your collector and click Add > Add Source.
- Select Script as the source type.
- Input a Name and Source Category.
- Select a Frequency.
- Select /usr/bin/python as the Command type.
- Add the following script in the Script field.
#!/usr/bin/python
import json
import requests
import os

# RUNZERO CONF
RUNZERO_EXPORT_TOKEN = os.environ['RUNZERO_EXPORT_TOKEN']
HEADERS = {'Authorization': 'Bearer ' + RUNZERO_EXPORT_TOKEN}
BASE_URL = 'https://console.runZero.com/api/v1.0'

def main():
    # Export the full asset inventory from runZero and print one JSON object
    # per line so Sumo Logic ingests each asset as a separate message
    url = BASE_URL + '/export/org/assets.json'
    assets = requests.get(url, headers=HEADERS)
    for a in assets.json():
        print(json.dumps(a))

if __name__ == '__main__':
    main()
- Click Save to allow the source to start working.
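Note that this script reads RUNZERO_EXPORT_TOKEN from its environment, so that variable must be available to the user running the Sumo Logic collector. If you want to preview the output before pasting the script into the Script field, a quick local sketch (assuming Python with the requests library and the token exported in your shell) is:
import json
import os
import requests

# Preview what the script source will emit: one JSON object per line, which
# Sumo Logic ingests as one log message per asset.
token = os.environ['RUNZERO_EXPORT_TOKEN']
headers = {'Authorization': 'Bearer ' + token}
url = 'https://console.runZero.com/api/v1.0/export/org/assets.json'
assets = requests.get(url, headers=headers).json()
for a in assets[:3]:  # a few assets are enough to confirm the format
    print(json.dumps(a))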
Working with the asset data in Sumo Logic
Once your asset data is in Sumo Logic, you can use it in any way you would use any other log source. Here are some sample searches that you could use to create scheduled searches and dashboards.
Search distinct assets
_sourceCategory="runzero"
| json field=_raw "id"
| count_distinct(id) as distinct_assets
Search assets with more than 3 services running
_sourceCategory="runzero"
| json field=_raw "addresses_extra"
| json field=_raw "addresses"
| json field=_raw "id"
| concat("https://console.runzero.com/inventory/", id) as runzero_link
| json field=_raw "service_count"
| where service_count > 3
| count addresses, addresses_extra, service_count, runzero_link
Determine counts of different operating systems
_sourceCategory="runzero"
| json field=_raw "os"
| where !isEmpty(os)
| json field=_raw "id"
| count os, id
| count os