
Automating SCAP Compliance Checks with SCC Scans and GitLab

Background

If you are required to maintain compliance in a DoD environment or government contracting program, you are most likely already familiar with the Security Content Automation Protocol (“SCAP”), a set of standards and tools for scanning and monitoring your environment for compliance with the required NIST 800-53 controls. There are several ways to implement your compliance scanning. In this article, we discuss how you can automate and consistently run your scans using the SCC tool provided by DISA (Security Content Automation Protocol (SCAP) – DoD Cyber Exchange) coupled with GitLab.


What is GitLab?

GitLab is a complete DevOps/DevSecOps platform. It is feature-packed for Continuous Integration/Continuous Delivery (CI/CD), providing a wide range of integrations and a complete toolset. Aside from CI/CD pipelines and source code management (“SCM”), GitLab also offers a host of other capabilities, including, but not limited to, issue tracking, vulnerability management, monitoring, and package and container registries. GitLab uses YAML for all its CI/CD pipelines, which makes for much cleaner pipeline configurations. GitLab has both paid and free versions, allowing you to pay only for what you need; quite a few features are included in the free tier. You can also choose to host a GitLab instance in your own environment or use GitLab’s cloud offering. We prefer GitLab Ultimate; once you go Ultimate, it is hard to go back.
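
As a quick taste of that YAML syntax, here is a minimal, hypothetical .gitlab-ci.yml with a single job; the job name and command are placeholders, and the rule shown is the same one our scan jobs use later to run only from the UI or a schedule:

stages:
    - test

say-hello:
    stage: test
    script:
        - echo "Hello from a GitLab CI/CD pipeline"
    rules:
        - if: $CI_PIPELINE_SOURCE == "web" || $CI_PIPELINE_SOURCE == "schedule"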


What is SCC?

The SCAP Compliance Checker (SCC) is used to scan systems for vulnerabilities and misconfigurations based on STIG Benchmarks. STIG Benchmarks contain detailed information on what to check for each control on a system, which ultimately determines whether a given control passes or fails a scan. Used together, these two items can be essential for determining whether your environment, or particular systems, meet compliance requirements. SCC is typically run through its UI, but the real automation potential of SCC is unlocked by the command-line version of the tool. The STIG Benchmarks, as well as the SCC tool itself, can all be found here; if you want to start using SCC, just make sure to select the correct operating system that the tool will be run on.
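
To give a feel for the command-line version, a local Linux scan can be kicked off with something along these lines (a sketch based on the options used later in this article; install paths will vary):

# Scan the local system against the installed STIG benchmarks and write results to /opt/scc/output
/opt/scc/cscc -u /opt/scc/output

# On Windows, the command-line scanner also accepts a host file for remote scans, e.g.:
# cscc -f .\windows.txt -u C:\Temp\SCC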


Why automate SCC Scans with GitLab?

The process for scanning systems with SCC can be cumbersome, especially if you have several Linux systems that need to be scanned for compliance. If you have only Windows systems, it isn’t as bad, because you can scan those systems remotely from a single device. Regardless, both can benefit from automation. Setting up the pipeline to run on a schedule can free up a lot of time compared to running everything manually. Not only is running the scans automatically a benefit, but a few extra steps in the pipeline can make managing the output files much easier as well. If you want the files sent to a file share, that can be done. Want them archived in GitLab for easy download on any system? That can be done as well. Now that you know what GitLab and SCC are, as well as why you would use them, let’s talk about how this can all be implemented.


How Can SCC Be Automated with GitLab?

Since the SCC tool is packaged as a ZIP file, it is very easy to deploy to systems for scanning; in this blog, we will focus on scanning Red Hat Enterprise Linux (RHEL) and Windows. With SCC for Windows, you can remotely scan systems as long as the system you are scanning from and the target systems are all in the same domain. NOTE: Remote Windows scans use NTLM authentication. For RHEL scans, and any other Linux SCC scan, there is no remote scan capability. The only option for scanning Linux systems is to deploy the SCC tool to each system locally; we can automate this with the help of Ansible®. Ansible is an automation tool that makes remote code execution across multiple systems over SSH much easier.
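
For example, given a plain-text host file like the ones used later in this article, a quick Ansible connectivity check against every listed system might look like this (the remote user is a placeholder):

# Verify SSH connectivity to every host listed in the inventory file;
# --ask-pass prompts for the SSH password and relies on sshpass being installed
ansible all -i ./scanning/rhel_8.txt -m ping -u svc_ansible --ask-pass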


The steps for automating SCC scans for Windows and RHEL are different due to the capabilities of each tool. The high-level steps for each are outlined below:


Windows:

  1. If there is a host file with Windows systems defined, continue with the pipeline, otherwise, do nothing

  2. Change the working directory to where the SCC Zip file is located in the repository

  3. Create an output directory to redirect all scan files

  4. Create an upload directory to copy all scan files into, then archive them as artifacts once the scans are done

  5. Expand the SCC ZIP archive

  6. Set the Security Protocol to TLS 1.2 (Required for SCC to work properly)

  7. Run the SCC Scans using the host file and other parameters to control the SCC configuration

  8. Archive all XML files output from the SCC scans

RHEL:

  1. If there is a host file with RHEL systems defined, continue with the pipeline, otherwise, do nothing (this host file is used for the Ansible playbook, not the SCC tool)

  2. Create an output directory for all scan files

  3. Check if SCC is already installed on that machine. If not, then copy the SCC installation file to the system

  4. Run the Ansible playbook

    • Install SCC and remove the installation file if needed

    • Create output directory for the SCC Scans

    • Run the SCC Scan on the system

    • Copy the XML files created from the scan back up to the Ansible host

    • Remove the output directory

What you will need

There isn’t a lot that is needed to build this setup. The systems and software needed for each are listed below:


Hardware:

  • 1x Windows Server/Workstation

  • 1x Linux system

Software:

  • Windows:

    - GitLab Runner

  • Linux:

    - Ansible

    - sshpass (only needed if using password authentication)

    - OpenSSH

    - GitLab Runner


As you can see, the setup is rather simple. The one thing the two systems have in common is the need for a GitLab runner. The runner is what executes the pipeline on those specific systems instead of GitLab’s provided shared runners; this is necessary because the shared runners would not have access to the systems that need to be scanned in your environment.
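
Registering a runner is a one-time step on each system. For example, a shell-executor runner on the Linux host might be registered roughly like this (the URL and token are placeholders, and newer GitLab versions use an authentication token via --token instead); the Windows runner is registered the same way with its own tag:

# Register a shell-executor runner on the Linux system; the "ansible" tag is what
# the Linux scan jobs use to select this runner (the Windows runner uses "scc-win")
sudo gitlab-runner register \
    --non-interactive \
    --url https://gitlab.example.com \
    --registration-token <PROJECT_REGISTRATION_TOKEN> \
    --executor shell \
    --description "scc-linux-runner" \
    --tag-list ansible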


Putting it All Together

Now that we have some background on all the hardware and systems needed, let’s look at how it all works. To start, here is the file structure of the GitLab repository:
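
In outline, it looks roughly like this (a sketch reconstructed from the paths referenced in the jobs below; your exact layout may differ):

.gitlab-ci.yml
ansible/
    scan-playbook.yml
scanning/
    rhel_8.txt          # host file for RHEL 8 systems
    windows.txt         # host file for Windows systems
scc/
    rhel-scc/
        8/
            scc.rpm     # SCC installer for RHEL 8
    windows-scc/
        scc.zip         # SCC package for Windows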



As you can see, the naming conventions for most of the files and folders correlate to the operating system, or version, that is going to be scanned. This repository uses manually created host files, but these could also be generated automatically depending on the tools used in your environment. The Linux host files are a little different from the Windows host file because SCC has different executables for different versions and flavors of Linux, so you need to make sure you deploy the correct version to each host.
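
A host file itself is nothing more than a list of systems to scan, one per line. For example, a rhel_8.txt file might look like this (the host names and address are placeholders):

rhel8-web01.example.local
rhel8-db01.example.local
10.10.20.15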


Deploying the correct installer is where the SCC folder comes in. This folder contains the SCC ZIP file for Windows and the RPM file for RHEL, which is used to install the SCC tool on each machine the pipeline runs against. The Linux jobs in the pipeline pass in a variable for the OS, which determines which systems they run against. Additionally, they pass in variables for the file extension of the installation file and the command used to install the package. The reason for this is to make the Ansible playbook as dynamic as possible, to account for just about any Linux flavor. This is not necessary for Windows because there is only one Windows SCC package, which can scan any Windows machine.


Next, we’ll look at the Project variables needed in GitLab. This part is only needed if you are doing username/password authentication with Ansible. This is not the most secure way, but we are using it here for simplicity. The variables needed for this setup would be RHEL_ROOT_PASS and RHEL_USERNAME. These two variables are used to log in to the RHEL machine(s) for remote command execution over SSH. This naming convention is used so that you can use other Linux flavors and just replace RHEL with the new Linux flavor for the next set of credentials.


Now, it’s time to move on to the .gitlab-ci.yml file. Starting from the top, we have the stages and variables:

stages:
    - scc_scans

variables:
    SCC_VERSION: 5.4.2
    SCC_DIR: /opt/scc

We can see that there is just one stage for scanning and two global variables: one for the SCC version and one for the SCC directory. The SCC version variable defines which version of SCC to use, and the SCC directory variable defines where the SCC executable is installed on Linux systems.


Next, we have a job template, identified by the period (‘.’) at the start of its name, which is used to run scans on Linux systems. This template can be reused for several Linux flavors as long as the correct variables are supplied. It first identifies all host files with the OS flavor in their name (rhel_8.txt, for instance, in our repository). It then iterates through each file, copies the SCC installation file to each host if required, and runs the scan process with the Ansible playbook. The Ansible playbook has several variables defined to make it as dynamic as possible.

.linux-scans:
    script:
        - >
            FILES=$(find ./scanning -maxdepth 1 -name "*$OS*" -type f)

            if [[ $FILES ]]; then
                mkdir -p ./$OS-scans

                for FILE in $FILES; do 
                    # Extract the OS version from the host file name (e.g. rhel_8.txt -> 8)
                    VERSION=$(echo $FILE | cut -d'_' -f2 | cut -d'.' -f1)
                    
                    for HOST in $(cat $FILE); do
                        # Check whether the SCC binary is already installed on the target host
                        EXISTS=$(sshpass -p "$ROOT_PASS" ssh \
                        -q $USERNAME@$HOST [[ -f $SCC_DIR/cscc ]] && \
                        echo "true" || echo "false")

                        if [ "$EXISTS" == "false" ]; then
                            echo "Copying files to $HOST"

                            sshpass -p "$ROOT_PASS" scp \
                            -o StrictHostKeyChecking=no \
                            ./scc/$OS-scc/$VERSION/scc.$FILE_EXT $USERNAME@$HOST:/tmp
                        fi
                    done

                    # Run the scan playbook against every host in this host file
                    ansible-playbook ./ansible/scan-playbook.yml -i $FILE --extra-vars \
                    "ansible_sudo_pass=$ROOT_PASS \
                    ansible_ssh_pass=$ROOT_PASS \
                    ci_folder=$CI_PROJECT_DIR \
                    user=$USERNAME \
                    scc_dir=$SCC_DIR \
                    os=$OS \
                    version=$VERSION \
                    scc_version=$SCC_VERSION \
                    command=$COMMAND \
                    file_ext=$FILE_EXT" --ssh-common-args "-o StrictHostKeyChecking=no"
                done
            fi
    tags:
        - ansible
    artifacts:
        paths:
            - ./$OS-scans/*.xml

The Ansible playbook, as seen below, goes through the process of installing SCC (if necessary), running the scan, then copying the scan results back to the Ansible host.

---
- name: Run SCAP Compliance Scan
  hosts: all
  become: true
  remote_user: "{{ user }}"
  tasks:
    - name: Install SCC
      command: "{{ command }} -i /tmp/scc.{{ file_ext }}"
      args: 
        creates: "{{ scc_dir }}/cscc"
    - name: Remove SCC Install File
      file:
        path: /tmp/scc.{{ file_ext }}
        state: absent
    - name: Create Output Folder
      file:
        path: "{{ scc_dir }}/output"
        state: directory
        owner: "{{ user }}"
        group: "{{ user }}"
    - name: Run the SCAP Compliance Scan
      shell: "{{ scc_dir }}/cscc -u {{ scc_dir }}/output \
        --setOpt dirAllSessionsEnabled 0 \
        --setOpt dirContentTypeEnabled 0 \
        --setOpt dirSessionEnabled 0 \
        --setOpt dirSessionResultsEnabled 0 \
        --setOpt dirStreamNameEnabled 0 \
        --setOpt dirTargetNameEnabled 0 \
        --setOpt dirXMLEnabled 0"
    - name: Get File Names that Need to be Copied
      shell: (cd {{ scc_dir }}/output;
        find . -maxdepth 1 -name "*XCCDF*" -type f) | cut -d'/' -f2
      register: files_to_copy
    - name: Fetch Items From Remote Machine
      fetch:
        src: "{{ scc_dir }}/output/{{ item }}"
        dest: "{{ ci_folder }}/{{ os }}-scans/"
        flat: yes
      with_items: "{{ files_to_copy.stdout_lines }}"
    - name: Remove Output Folder
      file:
        path: "{{ scc_dir }}/output"
        state: absent

The last parts of the .gitlab-ci.yml file are the jobs that run the Windows and Linux SCC scans. To start, we’ll look at the Linux job. This job uses the .linux-scans template we saw previously and passes in the variables required by that template and by the Ansible playbook. To conduct scans on other Linux flavors, you would just need to copy the job, change the variables accordingly, and create a host file for the Linux flavor(s) that you want to scan.

rhel:
    stage: scc_scans
    extends: .linux-scans
    variables:
        OS: rhel
        USERNAME: $RHEL_USERNAME
        ROOT_PASS: $RHEL_ROOT_PASS
        FILE_EXT: rpm
        COMMAND: rpm
    rules:
        - if:  $CI_PIPELINE_SOURCE == "web" || $CI_PIPELINE_SOURCE == "schedule"

Lastly, we have the Windows scans. These scans do not have a templated job since there is only one process for all Windows systems. As you can tell, and as we highlighted previously, the process is slightly different from the one for the Linux systems. For Windows, we are just unzipping an archive as opposed to installing software on each system. Additionally, the scans can be done remotely, so they only need to be run from the GitLab runner.

windows:
    stage: scc_scans
    variables:
        OUTPUT_FOLDER: C:\Temp\SCC
    script:
        - >
            if(Test-Path -Path .\scanning\windows.txt -PathType Leaf) {
                cp .\scanning\windows.txt .\scc\windows-scc
                cd .\scc\windows-scc
                New-Item $env:OUTPUT_FOLDER -ItemType Directory -Force
                New-Item .\Upload -ItemType Directory -Force
                Expand-Archive .\scc.zip .
                # TLS 1.2 is required for SCC to work properly
                [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
                # Run the remote scans against every host listed in windows.txt
                & ".\scc_$env:SCC_VERSION\cscc" -f .\windows.txt -u $env:OUTPUT_FOLDER
                Get-ChildItem $env:OUTPUT_FOLDER -Filter "*XCCDF*.xml" -Recurse `
                | % { Move-Item $_.FullName .\Upload }
            }
    tags:
        - scc-win
    rules:
        - if: $CI_PIPELINE_SOURCE == "web" || $CI_PIPELINE_SOURCE == "schedule"
    artifacts:
        paths:
            - ./scc/windows-scc/Upload/*.xml

Final Thoughts

Combining SCC’s scanning capabilities with GitLab’s automation opens up a lot of possibilities for giving time back to those who need it for other tasks. There is not a lot of configuration or infrastructure required, though the Ansible playbook and GitLab YAML can be difficult for those who are unfamiliar with them.


If you think this is something that would be useful to you or your organization, reach out to us! You can set up a meeting by clicking here, and we can further discuss deployment in your environment.
