You can use the Question attribute to enter the question to use when you review elements. Write the question so that answering Yes indicates that the element is approved. You can use the following checkpoints when you review different types of elements, such as requirements, use cases, test cases, and trouble reports.
Checkpoints for reviewing requirements:

Name | Question |
---|---|
Uniqueness | Are the title and the ID of the requirement unique? |
Understandability | Can readers understand the requirement? |
Redundancy | Is the requirement free from unnecessary information? |
Completeness | Are all attributes complete? |
Ambiguity | Is the requirement worded clearly? |
Consistency | Is the requirement free from contradictions to other requirements and to overall system requirements? |
Organization | Is the requirement in the correct hierarchical level? |
Traceability | Does the requirement have a source requirement and a target requirement? |
Testability | Can the completion of the requirement be verified by testing, demonstration, review, or analysis? |
Responsibility | Is the person who is responsible for the requirement identified? |
Language | Is the requirement free from grammatical errors and words that are difficult to verify, such as often, at least, and sometimes? |
Type | Is the requirement a requirement and not a design or implementation solution? |
Duplicates | Is the meaning of the requirement unique? |
Splitting | Does this requirement specify one well-defined need? |
Grouping | Is this a stand-alone requirement? |
Scope | Is the requirement in scope for the project? |
Performance | Are the performance objectives of this requirement identified? |
Reference | Are the cross-references of this requirement correct? |
Dependency | If the requirement has any dependencies to other requirements, are these dependencies clearly specified? |
Checkpoints for reviewing use cases:

Name | Question |
---|---|
Discreteness | Is the use case a stand-alone, discrete task? |
Objective | Is the goal, or measurable value, of the use case clear? |
Actor | Are the actors who benefit from the use case identified? |
Level | Is the use case written at the essential (abstract) level, rather than as a specific scenario? |
Type | Is the use case free from design and implementation details? |
Course completeness | Are all anticipated alternative courses documented? |
Exceptions | Are all known exception conditions documented? |
Splitting | Is the use case free from common action sequences that should be split into separate use cases? |
Ambiguity | Is the dialog sequence for each course clearly written, unambiguous, and complete? |
Pertinence | Is every actor and step in the use case pertinent to completing the task? |
Course feasibility | Are the courses in the use case feasible? |
Course verifiability | Are the courses in the use case verifiable? |
Checkpoints for reviewing test cases:

Name | Question |
---|---|
Input and output | Does the test case include a complete description of the expected input and output? |
Reference | Are all dependencies for this test case described? |
Duplicates | Is this condition tested only once? |
Completeness | Is the test case complete? |
Ambiguity | Is the test case free from ambiguities? |
Reproducibility | Is the test case reproducible? |
Design | Is the test case designed to show the presence of failure and not the absence of failure? |
Checkpoints for reviewing trouble reports:

Name | Question |
---|---|
Understandability | Can readers understand the trouble report? |
Completeness | Are all attributes complete? |
Ambiguity | Is the trouble report worded clearly? |
Duplicates | Does a duplicate trouble report exist? |
Reproducibility | Can the error in the trouble report be reproduced? |
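The convention that every question is written so that answering Yes indicates approval can be sketched as a simple checklist model. This is an illustrative example only, not the API of any specific review tool; the checkpoint names are taken from the tables above, and the function and data structures are hypothetical:

```python
# Illustrative sketch: checkpoints are (name, question) pairs, and an
# element is approved only when every question is answered Yes.
# A small subset of the requirement checkpoints from the table above:
REQUIREMENT_CHECKPOINTS = [
    ("Uniqueness", "Are the title and the ID of the requirement unique?"),
    ("Completeness", "Are all attributes complete?"),
    ("Testability", "Can the completion of the requirement be verified by "
                    "testing, demonstration, review, or analysis?"),
]

def is_approved(answers: dict) -> bool:
    """Return True only if every checkpoint question is answered Yes.

    A missing answer is treated as No, so an incomplete review never
    results in approval.
    """
    return all(answers.get(name, False) for name, _ in REQUIREMENT_CHECKPOINTS)

# Example review: one checkpoint answered No, so the element is not approved.
review = {"Uniqueness": True, "Completeness": False, "Testability": True}
print(is_approved(review))  # False
```

Because every question is phrased positively, the approval decision reduces to a single check that all answers are Yes; no per-checkpoint inversion logic is needed.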