Peer Code Review
From Federal Burro of Information
Latest revision as of 16:48, 18 November 2010
Types of code review
- formal inspection
  - author
  - reviewer (who does the explaining based on the review; no author input till the meeting)
  - observer
- over-the-shoulder
  - one developer walks through the code while the other watches.
- pair programming
  - like in XP/agile
- tool assisted
  - file gathering
  - combined display: diff, comments, defects
  - automated metric collection
  - review enforcement
- email pass-around
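The tool-assisted features above (file gathering, combined display, automated metric collection) can be sketched as one small data structure. This is a hypothetical illustration; the class, method, and metric names are assumptions, not any particular tool's API.

```python
# Hypothetical sketch of a tool-assisted review: files, comments, and
# defects bundled together, with metrics collected automatically.

class Review:
    def __init__(self, files):
        self.files = files            # file name -> diff text
        self.comments = []            # (file, line, text)
        self.defects = []             # (file, line, description)

    def add_comment(self, file, line, text):
        self.comments.append((file, line, text))

    def log_defect(self, file, line, description):
        self.defects.append((file, line, description))

    def metrics(self):
        # automated metric collection: lines inspected and defect density
        loc = sum(len(diff.splitlines()) for diff in self.files.values())
        density = 100 * len(self.defects) / loc if loc else 0.0
        return {"loc": loc, "defects": len(self.defects),
                "defects_per_100_loc": density}

review = Review({"main.c": "int main(void) {\n  return 0;\n}"})
review.log_defect("main.c", 2, "missing error handling")
print(review.metrics())
```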
Phases

- planning
  - files
  - invites
- inspection
  - find defects
  - comment and chat
- rework
  - fix defects
  - upload fixes
- complete
  - check files into version control.

where is testing?
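The four phases above can be sketched as an ordered workflow. A minimal hypothetical sketch (names assumed), where advance() enforces the planning → inspection → rework → complete order:

```python
# Review phases as a simple ordered state machine.
PHASES = ["planning", "inspection", "rework", "complete"]

class ReviewWorkflow:
    def __init__(self):
        self.phase = "planning"

    def advance(self):
        i = PHASES.index(self.phase)
        if i == len(PHASES) - 1:
            raise ValueError("review already complete")
        self.phase = PHASES[i + 1]

wf = ReviewWorkflow()
wf.advance()   # planning -> inspection
wf.advance()   # inspection -> rework
wf.advance()   # rework -> complete
print(wf.phase)
```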
Code Errors
- severity
  - major, minor
- type
  - algorithm, documentation, data-usage, error-handling, input, output
- phase-injection
  - developer error, design oversight, requirements mistake, QA error
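The severity / type / phase-injection taxonomy above maps naturally onto enums plus a defect record. A sketch whose names simply mirror the notes; nothing here is a standard classification scheme.

```python
# Defect taxonomy from the notes, as enums plus a record type.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    MAJOR = "major"
    MINOR = "minor"

class DefectType(Enum):
    ALGORITHM = "algorithm"
    DOCUMENTATION = "documentation"
    DATA_USAGE = "data-usage"
    ERROR_HANDLING = "error-handling"
    INPUT = "input"
    OUTPUT = "output"

class PhaseInjection(Enum):
    DEVELOPER_ERROR = "developer error"
    DESIGN_OVERSIGHT = "design oversight"
    REQUIREMENTS_MISTAKE = "requirements mistake"
    QA_ERROR = "QA error"

@dataclass
class Defect:
    severity: Severity
    type: DefectType
    injected_in: PhaseInjection
    description: str

d = Defect(Severity.MAJOR, DefectType.ERROR_HANDLING,
           PhaseInjection.DESIGN_OVERSIGHT, "return code ignored")
print(d.severity.value, d.type.value)
```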
Dunsmore 2000: Object Oriented Reviews
Three approaches:
- checklist oriented
- systematic review
- use-case
The checklist approach was the most effective and efficient at finding defects.
Reviews should not take longer than 1 hour.
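A checklist-oriented review can be sketched as a set of per-line predicates run over the code. The two checks below are illustrative placeholders, not items from Dunsmore's actual checklist.

```python
# Checklist-oriented review: each item is a named predicate over a line;
# findings are collected as (line number, item name) pairs.

CHECKLIST = [
    ("TODO left in code", lambda line: "TODO" in line),
    ("line over 80 chars", lambda line: len(line) > 80),
]

def run_checklist(source):
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, check in CHECKLIST:
            if check(line):
                findings.append((lineno, name))
    return findings

print(run_checklist("x = 1\n# TODO: handle errors"))
```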
Things to look up
- Path analysis
- beta coefficient from a logarithmic least-squares analysis is used as the measure of pair-wise correlation strength (consider also confidence rate ~ 0.01, p-value?)
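Assuming the note means an ordinary least-squares fit in log-log space, the beta coefficient is the slope of log(y) = alpha + beta * log(x), i.e. a power-law exponent. A sketch using the closed-form slope formula:

```python
# Beta from a logarithmic least-squares fit: slope of the OLS line
# fitted to (log x, log y) pairs, computed as cov / var in closed form.
import math

def log_beta(xs, ys):
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    mx = sum(lx) / len(lx)
    my = sum(ly) / len(ly)
    cov = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    var = sum((a - mx) ** 2 for a in lx)
    return cov / var

# y = x^2 exactly, so the fitted beta is 2
print(log_beta([1, 2, 4, 8], [1, 4, 16, 64]))
```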
Bibliography
- Best Kept Secrets of Peer Code Review - Jason Cohen 2006
- Code Inspection - Michael Fagan IBM 1974 (29 pages)
- http://en.wikipedia.org/wiki/Software_inspection