Reporting Review Results
Several information-rich items result from technical reviews; these are listed below. The items can be bundled together in a single report or distributed over several distinct reports. Review policies should indicate the formats of the reports required. The review reports should contain the following information.
1. For inspections: the group checklist, with all items covered and comments relating to each item.
2. For inspections: a status, or summary, report (described below) signed by all participants.
3. A list of defects found, classified by type and frequency. Each defect should be cross-referenced to the line, page, or figure in the reviewed document where it occurs.
4. Review metric data (see Section 10.7 for a discussion).
The inspection report on the reviewed item is a document signed by all the reviewers. It may contain a summary of the defects and problems found, a list of review attendees, and some review measures such as the time period for the review and the total number of major/minor defects.
The reviewers are responsible for the quality of the information in the written report [6]. There are several status options available to the review participants on this report. These are:
1. Accept: The reviewed item is accepted in its present form, or with minor rework required that does not need further verification.
2. Conditional accept: The reviewed item needs rework and will be accepted after the moderator has checked and verified the rework.
3. Reinspect: Considerable rework must be done to the reviewed item, and the inspection needs to be repeated when the rework is done.
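To make the dispositions concrete, here is a minimal sketch in Python; the names ReviewStatus and follow_up_action are illustrative assumptions, not terms from any review standard:

    from enum import Enum

    class ReviewStatus(Enum):
        """Possible dispositions on a signed inspection status report."""
        ACCEPT = "accept"                   # no further verification needed
        CONDITIONAL_ACCEPT = "conditional"  # moderator verifies the rework
        REINSPECT = "reinspect"             # a new inspection meeting is required

    def follow_up_action(status: ReviewStatus) -> str:
        """Return the follow-up implied by each disposition."""
        actions = {
            ReviewStatus.ACCEPT:
                "Baseline the item; no further verification.",
            ReviewStatus.CONDITIONAL_ACCEPT:
                "Author reworks; moderator checks and verifies the rework.",
            ReviewStatus.REINSPECT:
                "Author reworks; schedule another inspection meeting.",
        }
        return actions[status]

    print(follow_up_action(ReviewStatus.CONDITIONAL_ACCEPT))

The mapping simply records the rule described above: only a conditional accept or a reinspect decision triggers further verification work.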
Before signing their names to such an inspection report, reviewers need to be sure that all checklist items have been addressed, all defects recorded, and all quality issues discussed. This is important for several reasons. Very often when a document has passed an inspection, it is viewed as a baseline item for configuration
management, and any changes from this baseline item need approval from the
configuration management board. In addition, the successful passing of a review
usually indicates a project milestone has been passed, a certain level of
quality has been achieved, and the project has made progress toward meeting its
objectives. A milestone meeting is usually held, and clients are notified of
the completion of the milestone.
If the software item is given a conditional accept or a reinspect, a follow-up period
occurs where the authors must address all the items on the problem/defect list.
The moderator reviews the rework in the case of a conditional accept. Another
inspection meeting is required to reverify the items in the case of a
"reinspect"
decision. For an inspection type of review, one completeness or exit criterion
requires that all identified problems be resolved. Other criteria may be
required by the organization. In addition to the summary report, other outputs
of an inspection include a defect report and an inspection report. These
reports are vital for collecting and organizing review measurement data. The defect report contains a description of each defect, its type, its severity level, and its location. On the report the defects can be organized so that their types and occurrence rates are easy to determine.
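For illustration, a sketch of how such a defect list might be recorded and tallied; the Defect fields and sample entries below are assumptions, not data from an actual review:

    from collections import Counter
    from dataclasses import dataclass

    @dataclass
    class Defect:
        description: str
        defect_type: str   # e.g., "interface", "standards"
        severity: str      # e.g., "major" or "minor"
        location: str      # line, page, or figure in the reviewed document

    def occurrence_rates(defects: list[Defect]) -> Counter:
        """Tally defects by type so occurrence rates are easy to read off."""
        return Counter(d.defect_type for d in defects)

    report = [
        Defect("Missing error check", "interface", "major", "page 12"),
        Defect("Wrong units in table", "description", "minor", "Table 3"),
        Defect("Parameter order swapped", "interface", "major", "line 240"),
    ]
    print(occurrence_rates(report))  # Counter({'interface': 2, 'description': 1})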
IEEE standards suggest that the inspection report contain vital data such as [8]:
(i) number of participants in the review;
(ii) the duration of the meeting;
(iii) size of the item being reviewed (usually LOC or number of pages);
(iv) total preparation time for the inspection team;
(v) status of the reviewed item;
(vi) estimate of rework effort and the estimated date for completion of the rework.
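A hedged sketch of such a report as a record, with field names assumed for illustration (the IEEE standard does not prescribe these identifiers), plus one simple derived measure of the kind a review metrics program might track:

    from dataclasses import dataclass

    @dataclass
    class InspectionReport:
        participants: int
        meeting_duration_hours: float
        item_size: int               # LOC or number of pages
        total_prep_hours: float      # summed over the inspection team
        status: str                  # accept / conditional accept / reinspect
        rework_effort_hours: float   # estimate
        rework_completion_date: str  # estimate

        def prep_rate(self) -> float:
            """Item size covered per preparation hour (a derived review measure)."""
            return self.item_size / self.total_prep_hours

    # Illustrative values only.
    r = InspectionReport(5, 2.0, 400, 10.0, "conditional accept", 6.0, "2024-03-01")
    print(r.prep_rate())  # 40.0 LOC per preparation hour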
This data will help an organization to evaluate the effectiveness of the review process and to make improvements. The IEEE has recommendations for defect classes [8]. The classes are based on the reviewed software item's conformance to:
• standards;
• capability;
• procedures;
• interface;
• description.
A defect class may describe an item as missing, incorrect, or superfluous, as shown in
Table 10.1. Other defect classes could describe an item as ambiguous or
inconsistent [8]. Defects should also be ranked in severity, for example:
(i) major (these would cause the software to fail or deviate from its specification);
(ii) minor (these affect nonfunctional aspects of the software).
A ranking scale for defects can be developed in conjunction with a failure severity scale
as described in Section 9.1.4.
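A small sketch tying the classification pieces together; the enum names below mirror the lists above but are assumptions, not identifiers from the IEEE standard:

    from enum import Enum

    class DefectClass(Enum):      # conformance area [8]
        STANDARDS = "standards"
        CAPABILITY = "capability"
        PROCEDURES = "procedures"
        INTERFACE = "interface"
        DESCRIPTION = "description"

    class DefectQualifier(Enum):  # how the item deviates (see Table 10.1)
        MISSING = "missing"
        INCORRECT = "incorrect"
        SUPERFLUOUS = "superfluous"
        AMBIGUOUS = "ambiguous"
        INCONSISTENT = "inconsistent"

    class Severity(Enum):
        MAJOR = "major"           # causes failure or deviation from specification
        MINOR = "minor"           # affects nonfunctional aspects

    # Example classification: an incorrect interface item, ranked major.
    defect = (DefectClass.INTERFACE, DefectQualifier.INCORRECT, Severity.MAJOR)
    print(defect)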
A walkthrough review is considered complete when the entire document has been
covered or walked through, all defects and suggestions for improvement have
been recorded, and the walkthrough report has been completed. The walkthrough
report lists all the defects and deficiencies, and contains data such as [8]:
• the walkthrough team members;
• the name of the item being examined;
• the walkthrough objectives;
• a list of defects and deficiencies;
• recommendations on how to dispose of, or resolve, the deficiencies.
Note that the walkthrough report/completion criteria are not as formal as those for an inspection. There is no requirement for a signed status report, and no required follow-up for resolution of deficiencies, although such follow-up could be recommended in the walkthrough report.
A final important item to note: The purpose of a
review is to evaluate a software artifact, not
the developer or author of the artifact. Reviews should not be used to evaluate
the performance of a software analyst, developer, designer, or tester [3]. This
important point should be well established in the review policy. It is
essential to adhere to this policy for the review process to work. If authors
of software artifacts believe they are being evaluated as individuals, the
objective and impartial nature of the review will change, and its effectiveness
in revealing problems will be minimized.