Merge pull request #2091 from ThalesSiliconSecurity/ci-rework-stages

ci: rework and document
André Sintzoff 2023-07-28 15:44:29 +02:00 committed by GitHub
commit 2db59e20b7
4 changed files with 352 additions and 290 deletions


@ -104,6 +104,7 @@ cva6:
- project: '$CI_PROJECT_NAMESPACE/setup-ci'
ref: '$SETUP_CI_CVA6_BRANCH'
file: 'cva6/core-v-verif-cva6.yml'
- local: .gitlab-ci/core-v-verif-cva6.yml
strategy: depend
variables:
TAGS_RUNNER: $TAGS_RUNNER

.gitlab-ci/README.md Normal file (+252 lines)

@ -0,0 +1,252 @@
<!--
Copyright 2023 Thales Silicon Security
Licensed under the Solderpad Hardware Licence, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
SPDX-License-Identifier: Apache-2.0 WITH SHL-2.0
You may obtain a copy of the License at https://solderpad.org/licenses/
Original Author: Côme ALLART - Thales
-->
# GitLab CI for core-v-verif + CVA6
This document describes the different steps performed automatically when a branch is pushed to a repository.
It is not meant to be a complete description.
It is an entry point to help understand the structure of the pipelines, and to find the information you are looking for or the part of the CI you want to edit.
Please refer to the mentioned files for more details.
Only the GitLab-related tasks are described here.
## Before the branch reaches GitLab
The CVA6 and core-v-verif repositories are mirrored into GitLab repositories owned by Thales in order to run regression tests on pull requests.
> Note: in CVA6, regression tests are also run on the `master` branch, and in core-v-verif, on the `cva6/dev` branch.
## Pipeline boot
When a branch is pushed, the entry point of the CI is the `.gitlab-ci.yml` file at the repository root.
> See [`core-v-verif/.gitlab-ci.yml`] and [`cva6/.gitlab-ci.yml`]
[`core-v-verif/.gitlab-ci.yml`]: https://github.com/openhwgroup/core-v-verif/blob/cva6/dev/.gitlab-ci.yml
[`cva6/.gitlab-ci.yml`]: https://github.com/openhwgroup/cva6/blob/master/.gitlab-ci.yml
Both include files from a `setup-ci` project (to locate tools, etc.), define workflow rules and perform a small environment check.
All pipelines need both CVA6 and core-v-verif to run tests.
By default, the branches used are:
- The branch from the PR
- The main branch of the other repository
The main branch is defined in `setup-ci` (`master` for CVA6 and `cva6/dev` for core-v-verif).
However, the entry points also detect the `cvvdev/*` pattern in branch names, in which case the CVA6 and core-v-verif pipelines both run on branches with the same name.
This is useful to consistently test PRs that impact both repositories (see the sketch below).
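As an illustration only, an entry point implementing this branch selection could look roughly like the following; the file names, rule bodies and variable names are assumptions, not the actual contents of the entry-point files.
```yml
# Hypothetical sketch; not the actual entry-point content.
include:
  - project: '$CI_PROJECT_NAMESPACE/setup-ci'   # locates tools, defines the main branches
    ref: '$SETUP_CI_CVA6_BRANCH'
    file: 'cva6/core-v-verif-cva6.yml'

workflow:
  rules:
    # Branches named cvvdev/<something> run the CVA6 and core-v-verif pipelines
    # on branches with the same name (CVA6_BRANCH is an illustrative variable name).
    - if: '$CI_COMMIT_BRANCH =~ /^cvvdev\//'
      variables:
        CVA6_BRANCH: '$CI_COMMIT_BRANCH'
    # Otherwise the other repository's main branch (defined in setup-ci) is used.
    - if: '$CI_COMMIT_BRANCH'
```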
In the CVA6 pipeline:
1. The `core-v-verif-build` job gets the current commit hash of core-v-verif and sets it as an environment variable.
   It also fetches the list of tests to run, [`core-v-verif/.gitlab-ci/cva6.yml`] (see the next steps).
2. The `core-v-verif` job triggers a child pipeline (sketched below) using:
   - [`core-v-verif/.gitlab-ci/cva6.yml`] fetched by `core-v-verif-build`
   - [`cva6/.gitlab-ci/core-v-verif-cva6.yml`], which defines a `before_script` and an `after_script` to enter the core-v-verif checkout at the hash recorded by `core-v-verif-build`
[`core-v-verif/.gitlab-ci/cva6.yml`]: https://github.com/openhwgroup/core-v-verif/blob/cva6/dev/.gitlab-ci/cva6.yml
[`cva6/.gitlab-ci/core-v-verif-cva6.yml`]: https://github.com/openhwgroup/cva6/blob/master/.gitlab-ci/core-v-verif-cva6.yml
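A minimal sketch of these two jobs is shown below. The overall shape (a dotenv artifact plus a `trigger:include` of the fetched file) follows the description above, but the variable names and exact commands are assumptions, not the actual `cva6/.gitlab-ci.yml` content.
```yml
# Hypothetical sketch, not the actual cva6/.gitlab-ci.yml content.
core-v-verif-build:
  script:
    # Record the core-v-verif commit hash so every job of the child pipeline uses the same one.
    - git clone $CORE_V_VERIF_REPO -b $CORE_V_VERIF_BRANCH --depth=1 core-v-verif
    - echo CORE_V_VERIF_HASH=$(git -C core-v-verif rev-parse HEAD) > .env
    # Keep the test list so that the trigger job below can include it.
    - cp core-v-verif/.gitlab-ci/cva6.yml cva6-tests.yml
  artifacts:
    reports:
      dotenv: .env
    paths:
      - cva6-tests.yml

core-v-verif:
  trigger:
    include:
      - artifact: cva6-tests.yml                  # test list fetched by core-v-verif-build
        job: core-v-verif-build
      - local: .gitlab-ci/core-v-verif-cva6.yml   # before_script/after_script entering core-v-verif
    strategy: depend
```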
In core-v-verif pipelines, the `cva6` job triggers a child pipeline (sketched below) using:
- [`core-v-verif/.gitlab-ci/cva6.yml`] (the list of tests)
- [`core-v-verif/.gitlab-ci/core-v-verif-cva6.yml`] (global `before_script` and `after_script`).
[`core-v-verif/.gitlab-ci/core-v-verif-cva6.yml`]: https://github.com/openhwgroup/core-v-verif/blob/cva6/dev/.gitlab-ci/core-v-verif-cva6.yml
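The `cva6` trigger job is essentially the following. This is a simplified sketch: the real job, visible in the first diff hunk on this page, also pulls configuration from the `setup-ci` project.
```yml
# Simplified sketch of the cva6 trigger job in core-v-verif.
cva6:
  trigger:
    include:
      - local: .gitlab-ci/cva6.yml                # the list of tests
      - local: .gitlab-ci/core-v-verif-cva6.yml   # global before_script / after_script
    strategy: depend
  variables:
    TAGS_RUNNER: $TAGS_RUNNER
```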
## Running the tests
Thanks to the previous steps, in both CVA6 and core-v-verif pipelines the current working directory is the core-v-verif tree, with CVA6 checked out in `core-v-cores/cva6`.
The tests are described in [`core-v-verif/.gitlab-ci/cva6.yml`].
Stages are defined as below (order matters):
- `init env`: only contains `pub_initjob`, which sets the CVA6 commit hash as an environment variable so that all jobs of the pipeline use the same one.
  It only runs in core-v-verif pipelines, as CVA6 pipelines already know the CVA6 commit hash of the pipeline!
- `build tools`: `pub_build_tools` builds Spike and `pub_check_env` prints some environment variables for debugging.
- `smoke tests`: `pub_smoke` runs smoke tests.
- `verif tests`: many jobs run different verif tests.
  The template for them is described later in this document.
- `backend tests`: jobs which use the results of `verif tests`, often synthesis results.
- `report`: `merge reports` merges all reports into a single YAML file.
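For reference, the corresponding stage declaration in [`core-v-verif/.gitlab-ci/cva6.yml`] (also visible in the diff further down this page) is:
```yml
stages:
  - init env
  - build tools
  - smoke tests
  - verif tests
  - backend tests
  - report
```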
### Adding a verif test
A simple test looks like this:
```yml
pub_<name>:
extends:
- .verif_test
- .template_job_short_ci
variables:
DASHBOARD_JOB_TITLE: "<title for dashboard>"
DASHBOARD_JOB_DESCRIPTION: "<description for dashboard>"
DASHBOARD_SORT_INDEX: <index to sort jobs in dashboard>
DASHBOARD_JOB_CATEGORY: "<job category for dashboard>"
script:
- source cva6/regress/<my-script>.sh
- python3 .gitlab-ci/scripts/report_<kind>.py <args...>
```
- `.verif_test` specifies that:
  - The job goes in the `verif tests` stage
  - Before running the script part, in addition to the global `before_script`:
    - Spike is retrieved from the `pub_build_tools` artifacts
    - Old artifacts are cleaned, then `artifacts/reports` and `artifacts/logs` are created
    - A "failure" report is created by default (in case the script exits early)
    - `$SYN_VCS_BASHRC` is sourced
  - All the contents of the `artifacts/` folder are kept as artifacts (even if the job fails)
- `.template_job_short_ci` specifies in which pipeline mode the job should run
- `variables` defines environment variables.
  The four variables above are needed to generate the report for the dashboard.
- `script` defines the script to run:
  1. Run the test, for instance by sourcing a script in `cva6/regress/`
  2. Generate a report by running a script from `.gitlab-ci/scripts/report_*.py`
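For reference, the `.verif_test` template implementing this behaviour is reproduced below (with added comments, and without the YAML anchor) from [`core-v-verif/.gitlab-ci/cva6.yml`]; see the diff further down this page.
```yml
.verif_test:
  stage: verif tests
  before_script:
    - !reference [before_script]                    # run the global before_script first
    - mv artifacts/tools/spike tools                # Spike built by pub_build_tools
    - rm -rf artifacts/                             # clean incoming artifacts
    - mkdir -p artifacts/{reports,logs}
    - python3 .gitlab-ci/scripts/report_fail.py     # default "failure" report
    - echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
  artifacts:
    when: always                                    # keep artifacts/ even on failure
    paths:
      - artifacts/
```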
> Notes:
>
> You can add more environment variables such as:
>
> ```yml
> variables:
> DV_SIMULATORS: "veri-testharness,spike"
> DV_TESTLISTS: "../tests/testlist_riscv-tests-$DV_TARGET-p.yaml"
> ```
>
> You can also have several jobs running in parallel with variables taking different values:
>
> ```yml
> parallel:
> matrix:
> - DV_TARGET: [cv64a6_imafdc_sv39, cv32a60x]
> ```
### Adding a backend test
```yml
pub_<name>:
needs:
- *initjob
- pub_<other_job>
- <...>
extends:
- .backend_test
- .template_job_always_manual
variables:
<same as for verif tests>
script:
- <mv spike from artifacts if you need it>
- <your script>
- python3 .gitlab-ci/scripts/report_<kind>.py <args...>
```
Backend tests are like verif tests; the differences are:
- A `needs` list is required to specify under which conditions the test is run (together with `.template_job_*`).
  It contains:
  - `*initjob`, to make sure the correct CVA6 commit is used.
    Without a `needs` list, all jobs from all previous stages are considered as needed.
    However, when a `needs` list is declared, all useful dependencies must be listed by hand, which is more complex.
  - `pub_build_tools` if you need Spike (don't forget to `mv` it from the artifacts!)
  - The jobs you need artifacts from
- `.backend_test` indicates that:
  - The job goes in the `backend tests` stage
  - It performs the same steps as `.verif_test`, except that:
    - it does not source VCS (so you have to do it yourself if you need it)
    - it does not move Spike (so you have to do it yourself if you need it)
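For reference, the `.backend_test` template is essentially the following (condensed from the same file, with the shared `artifacts` anchor expanded):
```yml
.backend_test:
  stage: backend tests
  before_script:
    - !reference [before_script]
    - mkdir -p artifacts/{reports,logs}
    - python3 .gitlab-ci/scripts/report_fail.py
  artifacts:
    when: always
    paths:
      - artifacts/
```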
## Generating a report
You might want to use `.gitlab-ci/scripts/report_simu.py`.
If it does not suit your needs, the snippets below will help you write a report generator using our Python library.
```python
import report_builder as rb
# Create metrics
metric = rb.TableMetric('Metric name')
metric.add_value('column 1', 'column 2', 'etc')
# Gather them into a report
report = rb.Report('report label')
report.add_metric(metric)
# Create the report file in the artifacts
report.dump()
```
There are 3 kinds of metrics:
```python
# A simple table
metric = rb.TableMetric('Metric name, actually not displayed yet')
metric.add_value('column 1', 'column 2', 'etc')
# A table with a pass/fail label on each line
metric = rb.TableStatusMetric('Metric name, actually not displayed yet')
metric.add_pass('column 1', 'column 2', 'etc')
metric.add_fail('column 1', 'column 2', 'etc')
# A log
metric = rb.LogMetric('Metric name, actually not displayed yet')
metric.add_value("one line (no need to add a backslash n)")
metric.values += ["one line (no need to add a backslash n)"] # same as above
metric.values = ["line1", "line2", "etc"] # also works
# You can fail a metric of any kind at any moment
metric.fail()
```
Failures are propagated:
- one failed line in a `TableStatusMetric` fails the whole metric
- one failed metric fails the whole report
- one failed report fails the whole pipeline report
## Dashboard
The `merge reports` job merges the reports from all jobs of the pipeline into a single file.
It pushes this file to a repository.
This repository has a CI which produces HTML dashboard pages from the latest files.
These HTML pages are published on <https://riscv-ci.pages.thales-invia.fr/dashboard/>.
- Main pages [`dashboard_cva6_0.html`] and [`dashboard_core-v-verif_0.html`] gather results from all processed pipelines.
- Each page `dashboard_<project>_<PR id>.html` gathers results from all pipelines of one PR.
[`dashboard_cva6_0.html`]: https://riscv-ci.pages.thales-invia.fr/dashboard/dashboard_cva6_0.html
[`dashboard_core-v-verif_0.html`]: https://riscv-ci.pages.thales-invia.fr/dashboard/dashboard_core-v-verif_0.html
## PR comment
The `merge reports` job gets the list of open PRs.
It compares the name of the current branch with the name of each PR branch to find the matching PR.
If a PR matches, it triggers the GitHub workflow `dashboard-done.yml` in this repository, providing the PR number and the success/fail status.
> See [`core-v-verif/.github/workflows/dashboard-done.yml`] and [`cva6/.github/workflows/dashboard-done.yml`]
[`core-v-verif/.github/workflows/dashboard-done.yml`]: https://github.com/openhwgroup/core-v-verif/blob/cva6/dev/.github/workflows/dashboard-done.yml
[`cva6/.github/workflows/dashboard-done.yml`]: https://github.com/openhwgroup/cva6/blob/master/.github/workflows/dashboard-done.yml
This GitHub workflow creates a comment in the PR with the success/fail status and a link to the dashboard page.
However, the dashboard page may not be available immediately, as page generation happens later and takes time.
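A `workflow_dispatch`-based workflow of roughly the following shape can implement this behaviour. This is only a sketch: the input names, permissions and comment text are assumptions, not the actual contents of `dashboard-done.yml`.
```yml
# Hypothetical sketch; not the actual dashboard-done.yml.
name: dashboard-done
on:
  workflow_dispatch:
    inputs:
      pr_number:
        description: "Pull request number to comment on"
        required: true
      success:
        description: "'true' if the merged pipeline report passed"
        required: true
permissions:
  pull-requests: write
jobs:
  comment:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v6
        with:
          script: |
            // Post the dashboard status as a PR comment.
            const passed = '${{ github.event.inputs.success }}' === 'true';
            await github.rest.issues.createComment({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: Number('${{ github.event.inputs.pr_number }}'),
              body: (passed ? ':white_check_mark: CI passed.' : ':x: CI failed.') +
                ' Dashboard: https://riscv-ci.pages.thales-invia.fr/dashboard/',
            });
```
The GitLab side triggers such a workflow through the GitHub REST API (`POST /repos/<owner>/<repo>/actions/workflows/<workflow file>/dispatches`), passing these inputs.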


@ -0,0 +1,5 @@
before_script:
- echo no need to enter core-v-verif, already in it
after_script:
- echo no need to move artifacts, already in core-v-verif


@ -14,6 +14,8 @@
# - In this pipeline, do not define before_script and after_script in the global section (avoid in job too).
# - Please prefix all jobs in this file with "pub_" which stands for "public" job.
# Please refer to .gitlab-ci/README.md to add jobs
variables:
GIT_STRATEGY: fetch
@ -63,37 +65,59 @@ variables:
- when: manual
allow_failure: true
stages:
- .pre
- build_tools
- one
- two
- three
- init env
- build tools
- smoke tests
- verif tests
- backend tests
- report
.verif_test:
stage: verif tests
before_script:
- !reference [before_script]
- mv artifacts/tools/spike tools
- rm -rf artifacts/
- mkdir -p artifacts/{reports,logs}
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
artifacts: &artifacts
when: always
paths:
- artifacts/
.initjob: &initjob
job: pub_initjob
optional: true
.backend_test:
stage: backend tests
before_script:
- !reference [before_script]
- mkdir -p artifacts/{reports,logs}
- python3 .gitlab-ci/scripts/report_fail.py
artifacts: *artifacts
# In the scope of a CI triggered by core-v-verif repository:
# This job will get the HASH of the given CVA6 branch
# This HASH will be used by the next jobs instead of the CVA6 given BRANCH name
# This prevents CI to not use the same version of CVA6 in case of new commit in CVA6 branch during the execution of the CI
initjob:
stage: .pre
pub_initjob:
stage: init env
extends:
- .template_job_low_footprint
- .template_job_init_cva6
needs: []
script:
- '[[ -e ./cva6 ]] && rm -rf cva6'
- git clone $CVA6_REPO -b $CVA6_BRANCH --depth=1 cva6
- cd cva6
- echo CVA6_HASH=$(git rev-parse origin/$CVA6_BRANCH) > ../.env
- cd ..
- echo CVA6_HASH=$(git -C cva6 rev-parse origin/$CVA6_BRANCH) > .env
artifacts:
reports:
dotenv: .env
pub_check_env:
stage: .pre
stage: build tools
extends:
- .template_job_low_footprint
- .template_job_full_ci
@ -136,10 +160,9 @@ pub_check_env:
- echo $LIB_VERILOG
pub_build_tools:
stage: build_tools
stage: build tools
extends:
- .template_job_full_ci
needs: []
script:
# ROOT_PROJECT is used by Spike installer and designates the toplevel of core-v-verif tree.
- 'export ROOT_PROJECT=$(pwd)'
@ -161,12 +184,9 @@ pub_build_tools:
- artifacts/tools/spike/*
pub_smoke:
stage: one
stage: smoke tests
extends:
- .template_job_full_ci
needs:
- job: pub_build_tools
artifacts: true
parallel:
matrix:
- DV_SIMULATORS: ["veri-testharness,spike","vcs-testharness,spike","vcs-uvm,spike" ]
@ -175,11 +195,9 @@ pub_smoke:
DASHBOARD_JOB_DESCRIPTION: "Short tests to challenge most architectures with most testbenchs configurations"
DASHBOARD_SORT_INDEX: 0
DASHBOARD_JOB_CATEGORY: "Basic"
before_script:
- !reference [.verif_test, before_script]
script:
- mkdir -p artifacts/reports artifacts/logs
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
# In order to capture logs in case of test failure, the test script cannot fail.
- source cva6/regress/smoke-tests.sh || true
# The list of files must NOT fail on various DV_SIMULATORS values, so use 'v*_sim' to match
@ -187,21 +205,12 @@ pub_smoke:
# at least until new RTL simulator configurations are added.)
- for i in cva6/sim/*/v*_sim/*.log.iss ; do head -10000 $i > artifacts/logs/$(basename $i).head ; done
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- artifacts/reports/*.yml
- artifacts/logs/*.log.iss.head
artifacts: *artifacts
pub_riscv_arch_test:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_TARGET: [cv64a6_imafdc_sv39, cv32a60x]
@ -212,26 +221,13 @@ pub_riscv_arch_test:
DASHBOARD_SORT_INDEX: 0
DASHBOARD_JOB_CATEGORY: "Test suites"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source cva6/regress/dv-riscv-arch-test.sh
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
csr_test:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_TARGET: [cv32a60x]
@ -242,57 +238,31 @@ csr_test:
DASHBOARD_SORT_INDEX: 0
DASHBOARD_JOB_CATEGORY: "Test suites"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source cva6/regress/dv-riscv-csr-access-test.sh
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_hwconfig:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_SIMULATORS: ["veri-testharness,spike"]
DV_HWCONFIG_OPTS: ["--default_config=cv32a60x --isa=rv32imac --a_ext=1",
"--default_config=cv32a60x --isa=rv32imc --RenameEn=1"]
DV_HWCONFIG_OPTS:
- "--default_config=cv32a60x --isa=rv32imac --RenameEn=1"
variables:
DASHBOARD_JOB_TITLE: "HW config $DV_SIMULATORS $DV_HWCONFIG_OPTS"
DASHBOARD_JOB_DESCRIPTION: "Short tests to challenge target configurations"
DASHBOARD_SORT_INDEX: 1
DASHBOARD_JOB_CATEGORY: "Basic"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source ./cva6/regress/hwconfig_tests.sh
- python3 .gitlab-ci/scripts/report_pass.py
artifacts:
when: always
paths:
- artifacts/reports/*.yml
pub_compliance:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_TARGET: [cv64a6_imafdc_sv39, cv32a60x]
@ -303,27 +273,13 @@ pub_compliance:
DASHBOARD_SORT_INDEX: 2
DASHBOARD_JOB_CATEGORY: "Test suites"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source cva6/regress/dv-riscv-compliance.sh
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_tests-v:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_TARGET: [cv64a6_imafdc_sv39]
@ -335,27 +291,13 @@ pub_tests-v:
DASHBOARD_SORT_INDEX: 3
DASHBOARD_JOB_CATEGORY: "Test suites"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source cva6/regress/dv-riscv-tests.sh
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_tests-p:
stage: two
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- DV_TARGET: [cv64a6_imafdc_sv39, cv32a60x]
@ -367,90 +309,28 @@ pub_tests-p:
DASHBOARD_SORT_INDEX: 4
DASHBOARD_JOB_CATEGORY: "Test suites"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source cva6/regress/dv-riscv-tests.sh
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_synthesis:
stage: two
timeout: 2 hours
extends:
- .template_job_always_manual
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- TARGET: [cv32a6_embedded]
PERIOD: ["0.85"]
variables:
INPUT_DELAY: "0.46"
OUTPUT_DELAY: "0.11"
DASHBOARD_JOB_TITLE: "ASIC Synthesis $TARGET"
DASHBOARD_JOB_DESCRIPTION: "Synthesis indicator with specific Techno"
DASHBOARD_SORT_INDEX: 5
DASHBOARD_JOB_CATEGORY: "Synthesis"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
#ack trick to manage float gitlab-ci variables that seems to support only string or integer
- echo $(echo $SYNTH_PERIOD)
- echo $(echo $INPUT_DELAY)
- echo $(echo $OUTPUT_DELAY)
- echo $(echo $NAND2_AREA)
- echo $FOUNDRY_PATH
- echo $PERIOD
- echo $TECH_NAME
- echo $TARGET
- source ./cva6/regress/install-cva6.sh
- echo $SYN_DCSHELL_BASHRC; source $SYN_DCSHELL_BASHRC
- make -C core-v-cores/cva6/pd/synth cva6_synth PERIOD=$(echo $PERIOD) NAND2_AREA=$(echo $NAND2_AREA) FOUNDRY_PATH=$FOUNDRY_PATH TECH_NAME=$TECH_NAME INPUT_DELAY=$(echo $INPUT_DELAY) OUTPUT_DELAY=$(echo $OUTPUT_DELAY) TARGET=$TARGET
- mv core-v-cores/cva6/pd/synth/cva6_${TARGET}_synth_modified.v artifacts/cva6_${TARGET}_synth_modified.v
- python3 .gitlab-ci/scripts/report_synth.py core-v-cores/cva6/pd/synth/cva6_${TARGET}/reports/$PERIOD/cva6_$(echo $TECH_NAME)_synth_area.rpt core-v-cores/cva6/pd/synth/synthesis_batch.log
artifacts:
when: always
paths:
- artifacts/cva6_${TARGET}_synth_modified.v
- "artifacts/reports/*.yml"
pub_synthesis_others:
stage: two
timeout: 2 hours
extends:
- .verif_test
- .template_job_always_manual
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
parallel:
matrix:
- TARGET: [cv64a6_imafdc_sv39]
PERIOD: ["1.1"]
- TARGET: [cv32a60x]
PERIOD: ["0.95"]
variables:
variables: &synth_vars
INPUT_DELAY: "0.46"
OUTPUT_DELAY: "0.11"
DASHBOARD_JOB_TITLE: "ASIC Synthesis $TARGET"
DASHBOARD_JOB_DESCRIPTION: "Synthesis indicator with specific Techno"
DASHBOARD_SORT_INDEX: 5
DASHBOARD_JOB_CATEGORY: "Synthesis"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
script: &synth_script
#ack trick to manage float gitlab-ci variables that seems to support only string or integer
- echo $(echo $SYNTH_PERIOD)
- echo $(echo $INPUT_DELAY)
@ -465,37 +345,38 @@ pub_synthesis_others:
- make -C core-v-cores/cva6/pd/synth cva6_synth PERIOD=$(echo $PERIOD) NAND2_AREA=$(echo $NAND2_AREA) FOUNDRY_PATH=$FOUNDRY_PATH TECH_NAME=$TECH_NAME INPUT_DELAY=$(echo $INPUT_DELAY) OUTPUT_DELAY=$(echo $OUTPUT_DELAY) TARGET=$TARGET
- mv core-v-cores/cva6/pd/synth/cva6_${TARGET}_synth_modified.v artifacts/cva6_${TARGET}_synth_modified.v
- python3 .gitlab-ci/scripts/report_synth.py core-v-cores/cva6/pd/synth/cva6_${TARGET}/reports/$PERIOD/cva6_$(echo $TECH_NAME)_synth_area.rpt core-v-cores/cva6/pd/synth/synthesis_batch.log
artifacts:
when: always
paths:
- artifacts/cva6_${TARGET}_synth_modified.v
- "artifacts/reports/*.yml"
rules:
- when: manual
allow_failure: true
pub_synthesis:
timeout: 2 hours
extends:
- .verif_test
- .template_job_always_manual
variables:
<<: *synth_vars
TARGET: cv32a6_embedded
PERIOD: "0.85"
script: *synth_script
pub_smoke-gate:
stage: three
extends:
- .backend_test
- .template_job_always_manual
needs:
- job: pub_build_tools
artifacts: true
- job: pub_synthesis
artifacts: true
parallel:
matrix:
- TARGET: [cv32a6_embedded]
- *initjob
- pub_build_tools
- pub_synthesis
variables:
DASHBOARD_JOB_TITLE: "Smoke Gate $TARGET"
DASHBOARD_JOB_DESCRIPTION: "Simple test to check netlist from ASIC synthesis"
DASHBOARD_SORT_INDEX: 6
DASHBOARD_JOB_CATEGORY: "Post Synthesis"
TARGET: cv32a6_embedded
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- echo $LIB_VERILOG
- echo $FOUNDRY_PATH
- echo $PERIOD
@ -504,106 +385,61 @@ pub_smoke-gate:
- source ./cva6/regress/install-riscv-dv.sh
- source ./cva6/regress/install-riscv-tests.sh
- mv artifacts/cva6_${TARGET}_synth_modified.v core-v-cores/cva6/pd/synth/cva6_${TARGET}_synth_modified.v
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- cd cva6/sim
- make vcs_clean_all
- python3 cva6.py --testlist=../tests/testlist_riscv-tests-cv32a60x-p.yaml --test rv32ui-p-lw --iss_yaml cva6.yaml --target $TARGET --iss=spike,vcs-gate $DV_OPTS
- cd ../..
- cd -
- python3 .gitlab-ci/scripts/report_simu.py cva6/sim/logfile.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_coremark:
stage: two
extends:
- .verif_test
- .template_job_full_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
variables:
DASHBOARD_JOB_TITLE: "CoreMark"
DASHBOARD_JOB_DESCRIPTION: "Performance indicator"
DASHBOARD_SORT_INDEX: 5
DASHBOARD_JOB_CATEGORY: "Performance"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- bash cva6/regress/coremark.sh --no-print
- python3 .gitlab-ci/scripts/report_benchmark.py --coremark cva6/sim/out_*/veri-testharness_sim/core_main.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_dhrystone:
stage: two
extends:
- .verif_test
- .template_job_full_ci
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
variables:
DASHBOARD_JOB_TITLE: "Dhrystone"
DASHBOARD_JOB_DESCRIPTION: "Performance indicator"
DASHBOARD_SORT_INDEX: 5
DASHBOARD_JOB_CATEGORY: "Performance"
script:
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- bash cva6/regress/dhrystone.sh
- python3 .gitlab-ci/scripts/report_benchmark.py --dhrystone cva6/sim/out_*/veri-testharness_sim/dhrystone_main.log
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
pub_fpga-build:
stage: two
timeout: 90 minutes
extends:
- .verif_test
- .template_job_short_ci
needs:
- job: pub_smoke
artifacts: false
variables:
DASHBOARD_JOB_TITLE: "FPGA Build $TARGET"
DASHBOARD_JOB_DESCRIPTION: "Test of FPGA build flow"
DASHBOARD_SORT_INDEX: 9
DASHBOARD_JOB_CATEGORY: "Synthesis"
parallel:
matrix:
- TARGET: [cv64a6_imafdc_sv39, cv32a60x]
TARGET: cv32a60x
script:
- mkdir -p artifacts/reports
- python3 .gitlab-ci/scripts/report_fail.py
- source $VIVADO_SETUP
- source cva6/regress/install-cva6.sh
- make -C core-v-cores/cva6 fpga target=$TARGET
- mkdir -p artifacts/reports
- mv core-v-cores/cva6/corev_apu/fpga/work-fpga/ariane_xilinx.bit artifacts/ariane_xilinx_$TARGET.bit
- python3 .gitlab-ci/scripts/report_fpga.py core-v-cores/cva6/corev_apu/fpga/reports/ariane.utilization.rpt
artifacts:
when: always
paths:
- "artifacts/ariane_xilinx_$TARGET.bit"
- "artifacts/reports/*.yml"
pub_generated_tests:
stage: two
tags: [$TAGS_RUNNER]
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
extends:
- .verif_test
variables:
DASHBOARD_SORT_INDEX: 11
DASHBOARD_JOB_CATEGORY: "Code Coverage"
@ -626,10 +462,6 @@ pub_generated_tests:
DASHBOARD_JOB_DESCRIPTION: "Generate Random Arithmetic Jump tests using the RISCV-DV"
script:
- mkdir -p artifacts/coverage
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source ./cva6/regress/dv-generated-tests.sh
- mv cva6/sim/vcs_results/default/vcs.d/simv.vdb artifacts/coverage
- mv cva6/sim/seedlist.yaml artifacts/coverage
@ -639,21 +471,12 @@ pub_generated_tests:
allow_failure: true
timeout: 4h
artifacts:
when: always
paths:
- artifacts/coverage/simv.vdb
- artifacts/coverage/seedlist.yaml
- "artifacts/reports/*.yml"
expire_in: 3 week
pub_directed_isacov-tests:
stage: two
tags: [$TAGS_RUNNER]
needs:
- job: pub_build_tools
artifacts: true
- job: pub_smoke
artifacts: false
extends:
- .verif_test
variables:
DASHBOARD_SORT_INDEX: 13
DASHBOARD_JOB_CATEGORY: "Functional Coverage"
@ -664,10 +487,6 @@ pub_directed_isacov-tests:
DASHBOARD_JOB_DESCRIPTION: "Execute directed tests to improve functional coverage of ISA"
script:
- mkdir -p artifacts/coverage
- mkdir -p artifacts/reports
- mv artifacts/tools/spike tools
- python3 .gitlab-ci/scripts/report_fail.py
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- source ./cva6/regress/dv-generated-tests.sh
- mv cva6/sim/vcs_results/default/vcs.d/simv.vdb artifacts/coverage
- python3 .gitlab-ci/scripts/report_pass.py
@ -676,76 +495,62 @@ pub_directed_isacov-tests:
allow_failure: true
timeout: 4h
artifacts:
when: always
paths:
- artifacts/coverage/simv.vdb
- "artifacts/reports/*.yml"
expire_in: 3 week
pub_fpga-boot:
stage: three
tags: [fpga,shell]
extends:
- .backend_test
needs:
- job: pub_fpga-build
artifacts: true
- *initjob
- pub_build_tools
- pub_fpga-build
variables:
VERILATOR_INSTALL_DIR: "NO" # Skip install and checks of verilator
SPIKE_ROOT: "NO" # Skip install and checks of spike
DASHBOARD_JOB_TITLE: "FPGA Linux64 Boot "
DASHBOARD_JOB_DESCRIPTION: "Test of Linux 64 bits boot on FPGA Genesys2"
DASHBOARD_JOB_TITLE: "FPGA Linux32 Boot "
DASHBOARD_JOB_DESCRIPTION: "Test of Linux 32 bits boot on FPGA Genesys2"
DASHBOARD_SORT_INDEX: 10
DASHBOARD_JOB_CATEGORY: "Synthesis"
script:
- mkdir -p artifacts/reports
- python3 .gitlab-ci/scripts/report_fail.py
- source cva6/regress/install-cva6.sh
- source $VIVADO2022_SETUP
- mkdir -p core-v-cores/cva6/corev_apu/fpga/work-fpga
- mv artifacts/ariane_xilinx_cv64a6_imafdc_sv39.bit core-v-cores/cva6/corev_apu/fpga/work-fpga/ariane_xilinx.bit
- mv artifacts/ariane_xilinx_cv32a60x.bit core-v-cores/cva6/corev_apu/fpga/work-fpga/ariane_xilinx.bit
- cd core-v-cores/cva6/corev_apu/fpga/scripts
- source check_fpga_boot.sh
- cd -
- python3 .gitlab-ci/scripts/report_fpga_boot.py core-v-cores/cva6/corev_apu/fpga/scripts/fpga_boot.rpt
artifacts:
when: always
paths:
- "artifacts/reports/*.yml"
code_coverage-report:
stage: three
tags: [$TAGS_RUNNER]
extends:
- .backend_test
needs:
- job: pub_generated_tests
artifacts: true
- job: pub_directed_isacov-tests
artifacts: true
- *initjob
- pub_generated_tests
- pub_directed_isacov-tests
variables:
DASHBOARD_JOB_TITLE: "Report merge coverage"
DASHBOARD_JOB_DESCRIPTION: "Report merge coverage of generated tests"
DASHBOARD_SORT_INDEX: 12
DASHBOARD_JOB_CATEGORY: "Code Coverage"
script:
- mkdir -p artifacts/reports
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- mkdir -p artifacts/cov_reports/
- python3 .gitlab-ci/scripts/report_fail.py
- mkdir -p cva6/sim/vcs_results/default/vcs.d
- mv artifacts/coverage/simv.vdb cva6/sim/vcs_results/default/vcs.d/
- mv artifacts/coverage/seedlist.yaml cva6/sim/seedlist.yaml
- echo $SYN_VCS_BASHRC; source $SYN_VCS_BASHRC
- make -C cva6/sim generate_cov_dash
- mv cva6/sim/urgReport artifacts/cov_reports/
- python3 .gitlab-ci/scripts/report_pass.py
rules:
- when: on_success
artifacts:
when: always
paths:
- "artifacts/cov_reports/urgReport"
- "artifacts/reports/*.yml"
expire_in: 3 week
merge_report:
stage: .post
merge reports:
stage: report
tags: [$TAGS_RUNNER]
rules:
- if: '$DASHBOARD_URL'
@ -759,4 +564,3 @@ merge_report:
when: always
paths:
- "artifacts/reports/pipeline_report_$CI_PIPELINE_ID.yml"