This document explains the recommended checklist items to review when transitioning from one Development Stage to another, for design, verification, and software device interface function (DIF) stages. It is expected that the items in each stage (D1, V1, S1, etc.) are completed.
For a transition from D0 to D1, the following items are expected to be completed.
The specification is 90% complete and all features are defined. The specification is submitted into the repository as a markdown document. It is acceptable to make changes for further clarification or more details after the D1 stage.
The CSRs required to implement the primary programming model are defined. The Hjson file defining the CSRs is checked into the repository. It is acceptable to add or modify registers during the D2 stage in order to complete implementation.
Clock(s) and reset(s) are connected to all sub-modules.
The unit `.sv` file exists and meets comportability requirements.
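As a rough sketch of what such a unit top level looks like, assuming a hypothetical IP named `newip` with a single interrupt (all names here are illustrative, not a prescribed template):

```systemverilog
// Minimal sketch of a comportable unit top level. Port naming follows the
// comportability conventions: _i/_o suffixes, clk_i/rst_ni, TL-UL device port.
module newip
  import tlul_pkg::*;
(
  input  logic    clk_i,
  input  logic    rst_ni,

  // TL-UL device interface
  input  tl_h2d_t tl_i,
  output tl_d2h_t tl_o,

  // Interrupts
  output logic    intr_done_o
);

  // Register file and functional logic go here.

endmodule
```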
The unit is able to be instantiated and connected in top-level RTL files. The design must compile and elaborate cleanly without errors. The unit must not break top-level functionality, for example by propagating X through TL-UL interfaces, continuously asserting alerts or interrupts, or creating undesired TL-UL transactions. To that end, the unit must fulfill the following V1 checklist requirements:
- TB_TOP_CREATED
- SIM_RAL_MODEL_GEN_AUTOMATED
- CSR_CHECK_GEN_AUTOMATED
- SIM_CSR_MEM_TEST_SUITE_PASSING
All expected memories have been identified and representative macros instantiated. All other physical elements (analog components, pads, etc) are identified and represented with a behavioral model. It is acceptable to make changes to these physical macros after the D1 stage as long as they do not have a large impact on the expected resulting area (roughly "80% accurate").
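As a sketch, a representative memory can be instantiated with one of the `prim` RAM primitives; the sizes and signal names below are illustrative, and some ports/parameters of the actual primitive are omitted for brevity:

```systemverilog
// A representative 2 KiB single-port memory (32 bits x 512 words).
logic        mem_req, mem_write;
logic [8:0]  mem_addr;
logic [31:0] mem_wdata, mem_wmask, mem_rdata;

prim_ram_1p #(
  .Width (32),
  .Depth (512)
) u_mem (
  .clk_i,
  .req_i   (mem_req),
  .write_i (mem_write),
  .addr_i  (mem_addr),
  .wdata_i (mem_wdata),
  .wmask_i (mem_wmask),
  .rdata_o (mem_rdata)
);
```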
The mainline functional path is implemented to allow for a basic functionality test by verification. ("Feature complete" is the target for D2 status.)
All the outputs of the IP have `ASSERT_KNOWN` assertions.
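A minimal sketch for the hypothetical unit above (assertion names are chosen by the designer; the macro comes from `prim_assert.sv`, and its clock/reset arguments default to `clk_i`/`rst_ni`):

```systemverilog
// Known-ness assertions on the unit's outputs.
`ASSERT_KNOWN(TlODValidKnown_A, tl_o.d_valid)
`ASSERT_KNOWN(IntrDoneKnown_A,  intr_done_o)
```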
A lint flow is set up which compiles and runs. It is acceptable to have lint warnings at this stage.
The security countermeasures for the IP have been scoped. Taking OpenTitan’s Secure Hardware Design Guidelines and the specification of the IP into account, the designers have compiled a list of assets and their security properties as well as a high-level list of proposed countermeasures and reviewed both with the Security WG.
For a transition from D1 to D2, the following items are expected to be completed.
Any new features added since D1 are documented and reviewed with DV/SW/FPGA.
The GitHub Issue, Pull Request, or RFC where the feature was discussed should be linked in the Notes section.
Block diagrams have been updated to reflect the current design.
All IP block interfaces that are not autogenerated are documented.
Any integration specifics that are not captured by the comportability specification have been documented. Examples include special synthesis constraints or clock requirements for this IP.
Any missing functionality is documented.
Feature requests for this IP version are frozen at this time.
All features specified are implemented.
All ports are implemented and their specification is frozen, except for security / countermeasure related ports that will not affect functionality or architectural behavior (the addition of such ports can be delayed only until D2S).
All architectural state (RAMs, CSRs, etc) is implemented and the specification frozen.
All TODOs have been reviewed and signed off.
The IP block conforms to the style guide regarding X usage.
All CDC synchronization flops use behavioral synchronization macros (e.g. `prim_flop_2sync`), not manually created flops.
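For instance, synchronizing an asynchronous single-bit input into the `clk_i` domain looks roughly as follows (signal names illustrative):

```systemverilog
logic async_req, req_synced;

// Two-stage synchronizer primitive instead of hand-written flops.
prim_flop_2sync #(
  .Width (1)
) u_req_sync (
  .clk_i,
  .rst_ni,
  .d_i (async_req),
  .q_o (req_synced)
);
```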
The lint flow passes cleanly. Any lint waiver files have been reviewed.
A CDC checking run has been set up (if tooling is available). The CDC checking run shows no must-fix errors, and a waiver file has been created.
If there is no CDC flow at the block-level, this requirement can be waived.
An RDC checking run has been set up (if tooling is available). The RDC checking run shows no must-fix errors, and a waiver file has been created.
If there is no RDC flow at the block-level, this requirement can be waived.
An area check has been completed either with an FPGA or ASIC synthesis flow.
A timing check has been completed either with an FPGA or ASIC synthesis flow.
Any custom security countermeasures other than the standardized countermeasures listed under SEC_CM_IMPLEMENTED have been identified and documented, and their implementation has been planned. The actual implementation can be delayed until D2S.
Where the area impact of countermeasures can be reliably estimated, it is recommended to insert dummy logic at D2 in order to better reflect the final area complexity of the design.
For a transition from D2 to D2S, the following items are expected to be completed.
List the assets and corresponding countermeasures in canonical format in the IP Hjson (the canonical naming is checked by the reggen tool for correctness).
This list could look, for example, as follows:
```hjson
# Inside the rstmgr.hjson

countermeasures: [
  { name: "BUS.INTEGRITY",
    desc: "Bus integrity check."
  },
  { name: "RST.INTEGRITY",
    desc: "Reset integrity checks."
  },
  { name: "RST.SHADOW_LOGIC",
    desc: "Shadow reset logic."
  }
  # ...
]
```
For a full list of permitted asset and countermeasure types, see the countermeasure.py script that implements the name checks.
Note the SEC_CM_DOCUMENTED item in the D2 checklist, which is a precursor to this step.
Any appropriate security countermeasures are implemented. Implementations must follow the OpenTitan Secure Hardware Design Guidelines.
In particular, note that:
- For duplicated counters, `prim_count` must be used.
- For duplicated LFSRs, `prim_double_lfsr` must be used.
- For redundantly encoded FSMs, the sparse-fsm-encode.py script must be used to generate the encoding (in conjunction with the `PRIM_FLOP_SPARSE_FSM` macro).
- For multibit signals, the `mubi` types in `prim_mubi_pkg` should be used if possible.
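A sketch combining two of the countermeasures above; the encoding values are illustrative only, since real encodings come from sparse-fsm-encode.py, which guarantees a minimum Hamming distance between states:

```systemverilog
import prim_mubi_pkg::*;

typedef enum logic [5:0] {
  StIdle   = 6'b011101,
  StActive = 6'b100110,
  StError  = 6'b010010
} state_e;

state_e state_d, state_q;

// The macro places the state vector in a specially attributed flop so that
// synthesis cannot optimize the sparse encoding away.
`PRIM_FLOP_SPARSE_FSM(u_state_regs, state_d, state_q, state_e, StIdle)

// A multibit control signal: the strict test treats anything that is not a
// valid "true" encoding as false.
mubi4_t en_mubi;
logic   en_valid;
assign en_valid = mubi4_test_true_strict(en_mubi);
```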
Compile-time random netlist constants (such as LFSR seeds or scrambling constants) are exposed to topgen via the `randtype` parameter mechanism in the comportable IP Hjson file.
Default random seeds and permutations for LFSRs can be generated with the gen-lfsr-seed.py script.
See also the related GitHub issue #2229.
A review of sensitive security-critical storage flops was completed. Where appropriate, non-reset flops are used to store secure material.
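A non-reset storage flop is, in essence, a flop without a reset term, so that an (potentially attacker-induced) reset does not force the stored secret to a known, predictable value. A minimal sketch, with illustrative signal names:

```systemverilog
logic [127:0] key_q;

// Deliberately no reset term: the secret is only updated by an explicit
// qualified write, never forced to a known value by reset.
always_ff @(posedge clk_i) begin
  if (key_we) begin
    key_q <= key_d;
  end
end
```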
Shadow registers are implemented for all appropriate storage of critical control functions.
If deemed necessary by the security council, an offline review of the RTL code sections pertaining to the assets and countermeasures listed in SEC_CM_ASSETS_LISTED has been performed.
Security council has reviewed the asset list and associated documentation (SEC_CM_ASSETS_LISTED, SEC_CM_DOCUMENTED) and deems the defenses implemented appropriate.
The security council decides whether an additional RTL review of the relevant code sections is necessary (SEC_CM_RTL_REVIEWED).
For a transition from D2 to D3, the following items are expected to be completed.
Any approved new features since D2 have been documented and reviewed with DV/SW/FPGA.
All TODOs are resolved.
Deferred TODOs have been marked as "iceboxed" in the code and have an issue attached, as follows: `// ICEBOX(#issue-nr)`.
The lint checking flow is clean. Any lint waiver files have been reviewed and signed off by the technical steering committee.
The CDC checking flow is clean. CDC waiver files have been reviewed and understood. If there is no CDC flow at the block-level, this requirement can be waived.
The RDC checking flow is clean. RDC waiver files have been reviewed and understood. If there is no RDC flow at the block-level, this requirement can be waived.
A simple design review has been conducted by an independent designer.
Any deleted flops have been reviewed and signed off. If there is no synthesis flow at the block-level, this requirement can be waived.
Any software-visible design changes have been reviewed by the software team.
All known "Won't Fix" bugs and "Errata" have been reviewed by the software team.
To transition from V0 to V1, the following items are expected to be completed. The prefix "SIM" is applicable for simulation-based DV approaches, whereas the prefix "FPV" is applicable for formal property-based verification approaches.
A DV document has been drafted, indicating the overall DV goals, strategy, the testbench environment details with diagram(s) depicting the flow of data, UVCs, checkers, scoreboard, interfaces, assertions and the rationale for the chosen functional coverage plan. Details may be missing since most of these items are not expected to be fully understood at this stage.
A testplan has been written (in Hjson format) indicating:
- Testpoints (a list of planned tests), each mapping to a design feature, with a description highlighting the goal of the test and optionally, the stimulus and the checking procedure.
- The functional coverage plan captured as a list of covergroups, with a description highlighting which feature is expected to be covered by each covergroup. It may optionally contain additional details, such as coverpoints and crosses of individual aspects of the covered feature.
If the DUT has a CPU for which a ROM firmware is developed (burnt-in during manufacturing):
- A detailed ROM firmware testplan has been written to adequately verify all functions of the ROM firmware in a pre-silicon simulation environment (i.e. a DV testbench).
- The testing framework may validate each functionality of the ROM firmware discretely as an individual unit test.
- Depending on the ROM firmware development progress, this may be postponed for V2.
A top level testbench has been created with the DUT instantiated. The following interfaces are connected (as applicable): TileLink, clocks and resets, interrupts and alerts. Other interfaces may not be connected at this point (connecting these is part of SIM_TB_ENV_CREATED). Inputs for which interfaces have not yet been created are tied off to the default value.
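A bare-bones tb top for the hypothetical unit from the earlier sketches might look as follows, using OpenTitan's `clk_rst_if` and `tl_if` DV interfaces (instance names illustrative):

```systemverilog
module tb;
  wire clk, rst_n;

  // Clock/reset and TL-UL DV interfaces.
  clk_rst_if clk_rst_if (.clk(clk), .rst_n(rst_n));
  tl_if      tl_if      (.clk(clk), .rst_n(rst_n));

  newip dut (
    .clk_i       (clk),
    .rst_ni      (rst_n),
    .tl_i        (tl_if.h2d),
    .tl_o        (tl_if.d2h),
    .intr_done_o ()  // connected to an interrupt interface as the env grows
  );
endmodule
```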
All available interface assertion monitors are connected (example: tlul_assert).
A UVM environment has been created with major interface agents and UVCs connected and instantiated. TLM port connections have been made from UVC monitors to the scoreboard.
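A sketch of the monitor-to-scoreboard TLM hookup described here (class and type names hypothetical; in OpenTitan testbenches much of this scaffolding comes from dv_lib/cip_lib):

```systemverilog
import uvm_pkg::*;
`include "uvm_macros.svh"

class newip_item extends uvm_sequence_item;
  `uvm_object_utils(newip_item)
  function new(string name = "newip_item");
    super.new(name);
  endfunction
endclass

class newip_scoreboard extends uvm_component;
  `uvm_component_utils(newip_scoreboard)
  // Receives every transaction the monitor observes on the bus.
  uvm_analysis_imp #(newip_item, newip_scoreboard) item_imp;

  function new(string name, uvm_component parent);
    super.new(name, parent);
    item_imp = new("item_imp", this);
  endfunction

  // Called by the monitor's analysis_port.write(); compare the observed
  // item against a predicted model of the DUT here.
  virtual function void write(newip_item item);
  endfunction
endclass

// In the environment's connect_phase():
//   m_agent.monitor.analysis_port.connect(m_scoreboard.item_imp);
```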
A RAL model is generated using regtool and instantiated in the UVM environment.
A CSR check is generated using regtool and bound in the TB environment.
Full testbench automation has been completed if applicable. This may be required for verifying multiple flavors of parameterized designs.
A smoke test exercising the basic functionality of the main DUT datapath is passing. The functionality to test (and to what level) may be driven by higher level (e.g. chip) integration requirements. These requirements are captured when the testplan is reviewed by the key stakeholders, and the test(s) updated as necessary.
CSR test suites have been added for ALL interfaces (including, but not limited to, the DUT's SW device access port, JTAG access port, etc.) that have access to the system memory map:
- HW reset test (test all resets)
- CSR read/write
- Bit Bash
- Aliasing
Memory test suites have been added for ALL interfaces that have access to the system memory map if the DUT has memories:
- Mem walk
All these tests should verify back-to-back accesses with zero delays, along with partial reads and partial writes.
Each input and each output of the module is part of at least one assertion. Assertions for the main functional path are implemented and proven.
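As a sketch of one forward and one backward check, using the `ASSERT` macro from `prim_assert.sv` (the req/gnt/rsp handshake and the fixed two-cycle latency are purely illustrative):

```systemverilog
// Forward: every accepted request produces a response two cycles later.
`ASSERT(ReqFwd_A, req_i && gnt_o |-> ##2 rsp_o)
// Backward: a response only ever results from an accepted request.
`ASSERT(RspBwd_A, rsp_o |-> $past(req_i && gnt_o, 2))
```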
The smoke regression passes cleanly (with no warnings) with one additional tool apart from the primary tool selected for signoff.
A small suite of tests has been identified as the smoke regression suite and is run regularly to check code health. If the testbench has more than one build configuration, then each configuration has at least one test added to the smoke regression suite.
A nightly regression for running all constrained-random tests with multiple random seeds (iterations) has been set up. Directed, non-random tests need not be run with multiple iterations. Selecting the number of iterations depends on the coverage, the mean time between failure and the available compute resources. For starters, it is recommended to set the number of iterations to 100 for each test. It may be trimmed down once the test has stabilized, and the same level of coverage is achieved with fewer iterations. The nightly regression should finish overnight so that the results are available the next morning for triage.
An FPV regression has been set up by adding the module to the `hw/top_earlgrey/formal/top_earlgrey_fpv_cfgs.hjson` file.
A structural coverage collection model has been checked in.
This is a simulator-specific file (i.e. proprietary format) that captures which hierarchies and what types of coverage are collected.
For example, pre-verified sub-modules (including some `prim` components pre-verified thoroughly with FPV) can be black-boxed - it is sufficient to only enable the IO toggle coverage of their ports.
A functional coverage shell object has been created - this may not contain coverpoints or covergroups yet, but it is primed for development post-V1.
VeribleLint for the testbench is set up to run in nightly regression, with appropriate waivers.
- For a constrained random testbench, an entry has been added to `hw/<top-level-design>/lint/<top-level-design>_dv_lint_cfgs.hjson`.
- For an FPV testbench, an entry has been added to `hw/<top-level-design>/lint/<top-level-design>_fpv_lint_cfgs.hjson`.
Sub-modules that are pre-verified with their own testbenches have already reached V1 or a higher stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets or exceeds the V1 stage requirement. They are clearly cited in the DV document and the coverage of these sub-modules can be excluded in the IP-level testbench.
The design / micro-architecture specification has been reviewed and signed off. If a product requirements document (PRD) exists, then ensure that the design specification meets the product requirements.
The draft DV document (proposed testbench architecture) and the complete testplan have been reviewed with key stakeholders (as applicable):
- DUT designer(s)
- 1-2 peer DV engineers
- Software engineer (DIF developer)
- Chip architect / design lead
- Chip DV lead
- Security architect
The following categories of post-V1 tests have been focused on during the testplan review (as applicable):
- Security / leakage
- Error scenarios
- Power
- Performance
- Debug
- Stress
The V2 checklist has been reviewed to understand the scope and estimate effort.
To transition from V1 to V2, the following items are expected to be completed. The prefix "SIM" is applicable for simulation-based DV approaches, whereas the prefix "FPV" is applicable for formal property-based verification approaches.
It is possible for the design to have undergone some changes since the DV document and testplan were reviewed in the V0 stage. All design deltas have been captured adequately and appropriately in the DV document and the testplan.
The DV document is fully complete.
The functional coverage plan is fully implemented. All covergroups have been created and sampled in the reactive components of the testbench (passive interfaces, monitors and scoreboards).
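A sketch of one such covergroup, sampled from a passive monitor (the operation encoding and FIFO status are illustrative):

```systemverilog
covergroup newip_op_cg with function sample(logic [1:0] op, bit fifo_full);
  cp_op:        coverpoint op;
  cp_fifo_full: coverpoint fifo_full;
  // Ensure every operation has been seen both with and without backpressure.
  op_x_full:    cross cp_op, cp_fifo_full;
endgroup
```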
For simulations, interfaces are connected to all ports of the DUT and are exercised. For an FPV testbench, assertions have been added for all interfaces including sidebands.
All planned assertions have been written and enabled.
A UVM environment has been fully developed with end-to-end checks in the scoreboard enabled.
All tests in the testplan have been written and are passing with at least one random seed.
All assertions (except security countermeasure assertions) are implemented and are 90% proven. Each output of the module has at least one forward and one backward assertion check. The FPV proof run converges within reasonable runtime.
All assumptions have been implemented and reviewed.
If the DUT has a CPU for which a ROM firmware is developed (burnt-in during manufacturing):
- The ROM firmware testplan is fully written.
- SIM_ALL_TESTS_PASSING checklist item is met, including these tests.
This checklist item is marked N.A. if the DUT does not have a CPU.
A nightly regression with multiple random seeds is 90% passing.
Line, toggle, fsm (state & transition), branch and assertion code coverage has reached 90%. Toggle coverage of the ports of the DUT and of all pre-verified sub-modules has individually reached 90% in both directions (1->0 and 0->1).
Functional coverage has reached 90%.
Branch, statement and functional code coverage for FPV testbenches has reached 90%.
COI coverage for FPV testbenches has reached 75%.
Sub-modules that are pre-verified with their own testbenches have already reached V2 or a higher stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets or exceeds the V2 stage requirement.
Security countermeasures are planned and documented.
- Common countermeasure features (such as shadowed reg, hardened counter, etc.) can be tested by importing common sec_cm testplans and tests, and adding the bind file `cm_sec_bind`.
- Additional checks and sequences may be needed to verify those features. Document those in the individual testplan.
- Create a testplan for non-common countermeasures.
All high priority (tagged P0 and P1) design bugs have been addressed and closed. If the bugs were found elsewhere, ensure that they are reproduced deterministically in DV (through additional tests or by tweaking existing tests as needed) and have the design fixes adequately verified.
All low priority (tagged P2 and P3) design bugs have been root-caused. They may be deferred to post V2 for closure.
The DV document and testplan are complete and have been reviewed by key stakeholders (as applicable):
- DUT designer(s)
- 1-2 peer DV engineers
- Chip architect / design lead
- Chip DV lead
- Security architect
This review will focus on the design deltas captured in the testplan since the last review. In addition, the fully implemented functional coverage plan, the observed coverage and the coverage exclusions are expected to be scrutinized to ensure there are no verification holes or any gaps in achieving the required verification quality, before the work towards progressing to V3 can commence.
The V3 checklist has been reviewed to understand the scope and estimate effort.
To transition from V2 to V2S, the following items are expected to be completed.
The testplan has been updated with the necessary testpoints and covergroups to adequately verify all security countermeasures implemented in the DUT.
These countermeasures are listed in the comportable IP Hjson file located at `hw/ip/<ip>/data/<ip>.hjson` (or equivalent).
On OpenTitan, a security countermeasures testplan is auto-generated (the first time) by the `reggen` tool for each DUT, and is placed at `hw/ip/<ip>/data/<ip>_sec_cm_testplan.hjson` (or equivalent).
This testplan has been imported into the main testplan written for the DUT.
Tests implemented to verify the security countermeasures have been mapped to these testpoints.
Common countermeasures can be fully verified or partially handled by cip_lib. Follow this document to enable them. Make sure to import the applicable common sec_cm tests and testplans.
All security countermeasure assertions are proven in FPV. The required assertions for each countermeasure are defined in the Security Countermeasure Verification Framework. Follow this document to set up the FPV sec_cm testbench.
All security countermeasures are verified in simulation. Common countermeasures can be fully verified or partially handled by cip_lib. Refer to the cip_lib document for details.
Security countermeasure blocks may have been excluded in order to satisfy the V2 sign-off criteria. If so, these exclusions should be removed.
If a UNR exclusion file has been generated, it needs to be regenerated and reviewed after all security countermeasure tests have been implemented, since fault injection can exercise countermeasures that would otherwise be deemed unreachable code. The V2S coverage requirement is the same as V2.
The security countermeasures testplan and the overall DV effort has been reviewed by key stakeholders (as applicable):
- DUT designer(s)
- 1-2 peer DV engineers
- Security architect (optional)
This review may be waived if not deemed necessary.
To transition from V2 to V3, the following items are expected to be completed. The prefix "SIM" is applicable for simulation-based DV approaches, whereas the prefix "FPV" is applicable for formal property-based verification approaches.
Although rare, it is possible for the design to have undergone some last-minute changes since V2. All additional design deltas have been captured adequately and appropriately in the DV document and the testplan.
X-propagation is enabled in DV simulations. There are no pieces of logic that are reported unsuitable for X-propagation instrumentation by the simulator during the build step.
All assertions are implemented and 100% proven. There are no undetermined or unreachable properties.
A nightly regression with multiple random seeds is 100% passing (with 1 week minimum soak time).
Line, toggle, fsm (state & transition), branch and assertion code coverage has reached 100%.
Functional coverage has reached 100%.
Branch, statement and functional code coverage for FPV testbenches has reached 100%.
COI coverage for FPV testbenches has reached 100%.
There are no remaining TODO items anywhere in the testbench code, including common components and UVCs.
There are no compile-time or run-time warnings thrown by the simulator.
The lint flow for the testbench is clean. Any lint waiver files have been reviewed and signed off by the technical steering committee.
Sub-modules that are pre-verified with their own testbenches have already reached the V3 stage. The coverage level of the pre-verified sub-modules that are not tracked (i.e., not taken through the verification stages), meets the V3 stage requirement.
All design and testbench bugs have been addressed and closed.
For a transition from S0 to S1, the following items are expected to be completed.
Autogenerated IRQ and Alert DIFs have been created with the `util/make_new_dif.py` tool, and exist in `sw/device/lib/dif/autogen/`.
Additionally, a header file, `dif_<ip>.h`, and, optionally, `dif_<ip>.c`, exist in `sw/device/lib/dif/`.
All existing non-production code in the tree which uses the device does so via the DIF or a production driver.
An on-device test exists (in `sw/device/tests`) that uses the DIF.
This test should exercise the main datapath of the hardware module via the DIF, and should be able to run on at least one OpenTitan platform (either on FPGA or in simulation).
For a transition from S1 to S2, the following items are expected to be completed.
The DIF's respective device IP is at least stage D2.
The DIF has functions to cover all specified hardware functionality.
For a transition from S2 to S3, the following items are expected to be completed.
The DIF's respective device IP is at least stage D3.
The DIF's respective device IP is at least stage V3.
The HW IP Programmer's guide references specific DIF APIs that can be used for operations.
The DIF follows the DIF-specific guidelines in `sw/device/lib/dif` and the OpenTitan C style guidelines.
Software unit tests exist for the DIF in `sw/device/tests/dif`, named `dif_<ip>_unittest.cc`.
Unit tests exist to cover (at least):
- Device Initialisation
- All Device FIFOs (including when empty, full, and adding data)
- All Device Registers
- All DIF Functions
- All DIF return codes
All DIF TODOs are complete.