Offline Update

If the manager does not have a direct connection to the Internet, you can keep the vulnerability feeds updated by fetching the database files from your local environment or network. To achieve this, the relevant vulnerability files must be downloaded, and Wazuh must be configured to locate them.

Canonical

To perform an offline update of the Canonical feeds, you must download the corresponding OVAL files:

OS       Link
Focal    https://people.canonical.com/~ubuntu-security/oval/com.ubuntu.focal.cve.oval.xml.bz2
Bionic   https://people.canonical.com/~ubuntu-security/oval/com.ubuntu.bionic.cve.oval.xml.bz2
Xenial   https://people.canonical.com/~ubuntu-security/oval/com.ubuntu.xenial.cve.oval.xml.bz2
Trusty   https://people.canonical.com/~ubuntu-security/oval/com.ubuntu.trusty.cve.oval.xml.bz2

To fetch the vulnerability feeds from an alternative repository, the configuration is similar to the following:

<provider name="canonical">
    <enabled>yes</enabled>
    <os url="http://local_repo/com.ubuntu.focal.cve.oval.xml.bz2">focal</os>
    <os url="http://local_repo/com.ubuntu.bionic.cve.oval.xml.bz2">bionic</os>
    <os url="http://local_repo/com.ubuntu.xenial.cve.oval.xml.bz2">xenial</os>
    <os url="http://local_repo/com.ubuntu.trusty.cve.oval.xml.bz2">trusty</os>
    <update_interval>1h</update_interval>
</provider>

Note

Since March 2020, Canonical distributes its feeds compressed in bzip2 format. The Vulnerability Detector therefore expects the feed in that format when downloading it from a remote repository. When the feeds are loaded from a local path, however, they must be uncompressed.

Alternatively, the feeds can be loaded from a local path by using the path attribute, as the following example shows:

<provider name="canonical">
    <enabled>yes</enabled>
    <os path="/local_path/com.ubuntu.focal.cve.oval.xml">focal</os>
    <os path="/local_path/com.ubuntu.bionic.cve.oval.xml">bionic</os>
    <os path="/local_path/com.ubuntu.xenial.cve.oval.xml">xenial</os>
    <os path="/local_path/com.ubuntu.trusty.cve.oval.xml">trusty</os>
    <update_interval>1h</update_interval>
</provider>
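For the local-path configuration above to work, each downloaded .bz2 feed must first be decompressed. The following sketch illustrates this with bunzip2, using a stub archive under /tmp in place of a real download (the directory and file contents are placeholders):

```shell
# Stand-in for your local feed directory; substitute your own path.
FEED_DIR=/tmp/local_path
mkdir -p "$FEED_DIR"
# Normally this .bz2 comes from people.canonical.com; a stub stands in here.
echo '<oval_definitions/>' | bzip2 > "$FEED_DIR/com.ubuntu.focal.cve.oval.xml.bz2"
# Decompress, keeping the original archive (-k) and overwriting any previous
# extraction (-f). The resulting .xml file is what the <os path="...">
# attribute should point to.
bunzip2 -kf "$FEED_DIR/com.ubuntu.focal.cve.oval.xml.bz2"
ls "$FEED_DIR"
```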

Debian

To perform an offline update of Debian feeds, you must download the corresponding OVAL files:

OS       Link
Buster   https://www.debian.org/security/oval/oval-definitions-buster.xml
Stretch  https://www.debian.org/security/oval/oval-definitions-stretch.xml
Jessie   https://www.debian.org/security/oval/oval-definitions-jessie.xml
Wheezy   https://www.debian.org/security/oval/oval-definitions-wheezy.xml

To use a local feed file, add the path attribute to the os option as follows:

<provider name="debian">
    <enabled>yes</enabled>
    <os path="/local_path/oval-definitions-buster.xml">buster</os>
    <os path="/local_path/oval-definitions-stretch.xml">stretch</os>
    <os path="/local_path/oval-definitions-jessie.xml">jessie</os>
    <os path="/local_path/oval-definitions-wheezy.xml">wheezy</os>
    <update_interval>1h</update_interval>
</provider>

To update the vulnerability feeds from an alternative repository, the configuration is similar to the following:

<provider name="debian">
    <enabled>yes</enabled>
    <os url="http://local_repo/oval-definitions-buster.xml">buster</os>
    <os url="http://local_repo/oval-definitions-stretch.xml">stretch</os>
    <os url="http://local_repo/oval-definitions-jessie.xml">jessie</os>
    <os url="http://local_repo/oval-definitions-wheezy.xml">wheezy</os>
    <update_interval>1h</update_interval>
</provider>

Red Hat

To perform an offline update of the Red Hat feed, you must make requests to its API to get the feed pages, starting from a specified date. A script that automates the download and handles API downtime is available at wazuh/tools/vulnerability-detector/rh-generator.sh.

How to use the update script

  1. Create a directory to download the feed.

     # mkdir /local_path/rh-feed

  2. Run the script indicating the starting year from which the vulnerabilities will be downloaded (minimum is 1999) and the target path.

     # ./rh-generator.sh 1999 /local_path/rh-feed

It is possible that the script will output error messages like the following:

Page download failed (504), retrying...

This indicates that the Red Hat servers may be temporarily unavailable to you. The script will continue trying to finish the download until it acquires the full feed.

Finally, you will have the feed divided into a succession of numbered files whose names follow the format redhat-feed<number>.json. To update locally, the path to those files must be indicated by a regular expression such as the following:

<provider name="redhat">
    <enabled>yes</enabled>
    <path>/local_path/rh-feed/redhat-feed.*json$</path>
    <update_interval>1h</update_interval>
</provider>
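As a quick sanity check, you can verify that the file names produced by the script match the configured regular expression. This sketch uses dummy files under /tmp in place of the real feed:

```shell
# Create dummy files mimicking rh-generator.sh output.
mkdir -p /tmp/rh-feed
touch /tmp/rh-feed/redhat-feed1.json /tmp/rh-feed/redhat-feed2.json
# The same pattern used in the <path> option should match every file:
ls /tmp/rh-feed | grep -E 'redhat-feed.*json$'
```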

If you want to upload these files to a local server, they must keep the same numerical sequence in the link. Their position is indicated with the [-] tag, while the start and end attributes define the numerical range. For example, if the previous script has produced 15 files, the configuration would look like this:

<provider name="redhat">
    <enabled>yes</enabled>
    <url start="1" end="15">http://local_repo/rh-feed/redhat-feed[-].json</url>
    <update_interval>1h</update_interval>
</provider>
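The local_repo host in the example above can be any static web server. As a minimal sketch, assuming Python 3 is available, the feed directory can be exposed with its built-in HTTP server (the port 8799 and the /tmp paths are arbitrary placeholders):

```shell
# Publish the feed directory over HTTP so the manager can fetch it by URL.
mkdir -p /tmp/rh-feed
touch /tmp/rh-feed/redhat-feed1.json
cd /tmp/rh-feed
python3 -m http.server 8799 &
SERVER_PID=$!
sleep 1
# A manager on this network would then be configured with a URL such as
# url="http://<this_host>:8799/redhat-feed[-].json".
curl -sf http://localhost:8799/redhat-feed1.json -o /dev/null && echo "feed reachable"
kill "$SERVER_PID"
```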

National Vulnerability Database

To perform an offline update of the National Vulnerability Database, you must download its feed files starting from the desired date. A script that automates the download and handles server downtime is available at wazuh/tools/vulnerability-detector/nvd-generator.sh.

How to use the update script

  1. Create a directory to download the feed.

     # mkdir /local_path/nvd-feed

  2. Run the script indicating the starting year from which the vulnerabilities will be downloaded (minimum is 2002) and the target path.

     # ./nvd-generator.sh 2002 /local_path/nvd-feed

It is possible that the script will output error messages like the following:

Page download failed (504), retrying...

This indicates that the National Vulnerability Database servers may be temporarily unavailable to you. The script will continue trying to finish the download until it acquires the full feed.

Finally, you will have the feed divided into a succession of numbered files whose names follow the format nvd-feed<number>.json.gz. These files are compressed and must be extracted. To update locally, the path to those files must be indicated by a regular expression such as the following:

<provider name="nvd">
    <enabled>yes</enabled>
    <path>/local_path/nvd-feed/nvd-feed.*json$</path>
    <update_interval>1h</update_interval>
</provider>
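The extraction step mentioned above can be sketched as follows, with a stub .gz file standing in for a downloaded nvd-feed<number>.json.gz (the paths and file contents are placeholders):

```shell
# Stand-in for your local feed directory; substitute your own path.
NVD_DIR=/tmp/nvd-feed
mkdir -p "$NVD_DIR"
# Normally these files come from the NVD; a stub stands in here.
echo '{"CVE_Items": []}' | gzip > "$NVD_DIR/nvd-feed2002.json.gz"
# Extract every feed file in place; gunzip removes the .gz suffix, leaving
# nvd-feed<number>.json files for the <path> option to match.
gunzip -f "$NVD_DIR"/*.json.gz
ls "$NVD_DIR"
```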

If you want to upload these files to a local server, they must keep the same numerical sequence in the link. Their position is indicated with the [-] tag, while the start and end attributes define the numerical range. For example, if you have the files from 2015 to 2019, the configuration would look like this:

<provider name="nvd">
    <enabled>yes</enabled>
    <url start="2015" end="2019">http://local_repo/nvd-feed[-].json.gz</url>
    <update_interval>1h</update_interval>
</provider>