Free, serverless, automated security dashboards

Cristian Glavan
2 min read · Sep 5, 2020

This simple experiment shows how to pull data from https://nvd.nist.gov/ and display it in custom Data Studio dashboards, tailored to your needs, like so:

https://datastudio.google.com/reporting/41b955e6-a58d-46c8-88c1-bad0feb5b9bf

Here’s a breakdown of what we’ll use:

  • a GCP project
  • a Docker container that pulls data from NVD and sends it to a bucket (a rough sketch of this step follows the setup diagram below)
  • a bucket to store the NVD results as a JSON file
  • a Docker container that starts a BigQuery import job, loading the JSON file into BigQuery
  • a BigQuery dataset with a simple autodetected schema
  • a Data Studio dashboard with the BigQuery dataset as its data source

The setup would look something like this:

https://clglavan.github.io/nvd-scrapper/
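To make the scraper step more concrete, here is a minimal Python sketch of what it could do, assuming the legacy NVD 1.1 JSON feed, the requests and google-cloud-storage libraries, and the env vars introduced later in this post. The actual clglavan/nvd-scrapper image may be implemented differently.

# Hypothetical sketch of the nvd-scrapper step; the real image may differ.
# Assumes the env vars described later in this post (PROJECTID, BUCKET, FILENAME).
import gzip
import json
import os

import requests
from google.cloud import storage

# Which feed to pull is an assumption; "recent" covers newly added CVEs.
NVD_FEED = "https://nvd.nist.gov/feeds/json/cve/1.1/nvdcve-1.1-recent.json.gz"


def scrape_and_upload():
    # Download and decompress the gzipped feed.
    resp = requests.get(NVD_FEED, timeout=60)
    resp.raise_for_status()
    feed = json.loads(gzip.decompress(resp.content))

    # BigQuery's JSON import expects newline-delimited JSON, so write one
    # CVE entry per line instead of the single large feed object.
    ndjson = "\n".join(json.dumps(item) for item in feed["CVE_Items"])

    # Upload the result to the bucket the bigquery-job container reads from.
    client = storage.Client(project=os.environ["PROJECTID"])
    blob = client.bucket(os.environ["BUCKET"]).blob(os.environ["FILENAME"])
    blob.upload_from_string(ndjson, content_type="application/json")
    print(f"Uploaded {len(feed['CVE_Items'])} CVE entries to gs://{os.environ['BUCKET']}/{os.environ['FILENAME']}")


if __name__ == "__main__":
    scrape_and_upload()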

In this example I’ll provide the Docker Compose file for nvd-scrapper and bigquery-job. Moving this to GitLab or GitHub shouldn’t be that difficult, but it’s outside the scope of this piece.

The Docker Compose file would look something like this:

version: '3'
services:
  nvd-scrapper:
    image: "clglavan/nvd-scrapper"
    volumes:
      - '../key.json:/google/key.json'
    env_file:
      - ../gcp.env
  bigquery-job:
    image: "clglavan/bigquery-job"
    depends_on:
      - "nvd-scrapper"
    volumes:
      - '../key.json:/google/key.json'
    env_file:
      - ../gcp.env

Alongside this, you will need the env_file that the containers require to use the GCP SDK properly; it contains:

GOOGLE_APPLICATION_CREDENTIALS="/google/key.json"
PROJECTID="{your-project-id}"
BUCKET="{your-bucket-name}"
FORMAT="json"
FILENAME="{your-filename}.json"
DATASETID="{your-dataset-name}"
GCSURI="{your-bucket-gcs-uri}"
TABLEID="{your-table-id}"
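To show how these variables might be consumed, here is a hedged Python sketch of the BigQuery import step using the google-cloud-bigquery client with schema autodetection; the published clglavan/bigquery-job image may differ.

# Hypothetical sketch of the bigquery-job step, driven by the env vars above;
# the published image may be implemented differently.
import os

from google.cloud import bigquery


def load_from_gcs():
    client = bigquery.Client(project=os.environ["PROJECTID"])

    # Fully-qualified destination table: project.dataset.table
    table_id = ".".join(
        (os.environ["PROJECTID"], os.environ["DATASETID"], os.environ["TABLEID"])
    )

    # Let BigQuery autodetect the schema from the newline-delimited JSON file.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
        # Overwriting on every run is an assumption; append if you want history.
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    load_job = client.load_table_from_uri(
        os.environ["GCSURI"], table_id, job_config=job_config
    )
    load_job.result()  # block until the import job finishes

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_from_gcs()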

Keep in mind that both containers work with the GCP SDK, so they need to authenticate to your project and have the proper permissions. Create a service account and grant it the Storage Object Admin and BigQuery Admin roles.

One way to authenticate as the service account is with a JSON key: download it from the GCP IAM console, then mount that key into the containers.

volumes:
  - '../key.json:/google/key.json'

Set the env var GOOGLE_APPLICATION_CREDENTIALS to point to the newly mounted key.
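If you want to sanity-check that the mounted key is actually picked up before running the full pipeline, a quick Python snippet run inside either container could look like this; google.auth.default() reads GOOGLE_APPLICATION_CREDENTIALS automatically.

# Quick sanity check that the mounted key is picked up by the client libraries.
import google.auth

# google.auth.default() reads GOOGLE_APPLICATION_CREDENTIALS automatically.
credentials, project = google.auth.default()
print(f"Authenticated, default project: {project}")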

You can now run docker-compose up and watch the stdout: the nvd-scrapper results, followed by the BigQuery import job confirmation.

With this working, go start customizing your dashboards!

At this point you should start looking into GitLab CI/CD or GitHub Actions to automate the runs.

More updates on this project, and the public dashboard, can be found here:

https://clglavan.github.io/nvd-scrapper/
