Peer Code Review
Types of code review
- formal inspection
  - author
  - reviewer (does the explaining based on their review; no author input until the meeting)
  - observer
- over-the-shoulder
  - one person walks through the code while another watches
- pair programming
  - as in XP/agile
- tool assisted (see the sketch after this list)
  - file gathering
  - combined display: diff, comments, defects
  - automated metric collection
  - review enforcement
- email pass-around
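
A rough sketch of what the tool-assisted pieces above might look like in practice. The names below (ReviewComment, Review, changed_files, review_complete) are illustrative assumptions, not any particular tool's API; only the git diff --numstat call is standard git.

 # Sketch of tool-assisted review: gather changed files, attach comments/defects
 # to a review, collect a simple churn metric, and enforce defect resolution.
 import subprocess
 from dataclasses import dataclass, field
 
 @dataclass
 class ReviewComment:
     path: str
     line: int
     text: str
     is_defect: bool = False          # comments can record defects found in review
 
 @dataclass
 class Review:
     base: str                        # e.g. "main"
     head: str                        # e.g. a feature branch or commit sha
     comments: list = field(default_factory=list)
 
 def changed_files(base, head):
     """File gathering + automated metric collection: added/removed lines per file."""
     out = subprocess.run(
         ["git", "diff", "--numstat", f"{base}..{head}"],
         capture_output=True, text=True, check=True,
     ).stdout
     churn = {}
     for row in out.splitlines():
         added, removed, path = row.split("\t")
         if added == "-":             # binary files report no line counts
             continue
         churn[path] = (int(added), int(removed))
     return churn
 
 def review_complete(review):
     """Review enforcement: block sign-off while any defect comment is still open."""
     return not any(c.is_defect for c in review.comments)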
Code Errors
- severity
  - major, minor
- type
  - algorithm, documentation, data-usage, error-handling, input, output
- phase-injection
  - developer error, design oversight, requirements mistake, QA error
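
One way to encode this classification as data, which the automated metric collection above could aggregate over; the enum and class names simply mirror the lists here and are assumptions, not a standard taxonomy.

 # Defect classification as data: severity / type / phase-injection.
 from dataclasses import dataclass
 from enum import Enum
 
 class Severity(Enum):
     MAJOR = "major"
     MINOR = "minor"
 
 class DefectType(Enum):
     ALGORITHM = "algorithm"
     DOCUMENTATION = "documentation"
     DATA_USAGE = "data-usage"
     ERROR_HANDLING = "error-handling"
     INPUT = "input"
     OUTPUT = "output"
 
 class InjectionPhase(Enum):
     DEVELOPER_ERROR = "developer error"
     DESIGN_OVERSIGHT = "design oversight"
     REQUIREMENTS_MISTAKE = "requirements mistake"
     QA_ERROR = "QA error"
 
 @dataclass
 class Defect:
     severity: Severity
     kind: DefectType
     injected_in: InjectionPhase
     description: str
 
 # example: a minor error-handling defect traced back to a design oversight
 d = Defect(Severity.MINOR, DefectType.ERROR_HANDLING,
            InjectionPhase.DESIGN_OVERSIGHT, "missing timeout on an RPC call")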
Dunsmore 2000: Object Oriented Reviews
Three approaches:
- checklist oriented
- systematic review
- use-case
The checklist approach was the most effective and efficient at finding defects.
Reviews should not take longer than one hour.
Things to look up
- Path analysis
- beta coefficient from a logarithmic least-squares analysis is used as the measure of pair-wise correlation strength (consider also confidence rate ~ 0.01, P value?)
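
A small worked example of how such a beta coefficient can be obtained: fit y ≈ a·x^beta by ordinary least squares on log-transformed data and read the slope as beta. This is one interpretation of the note above, run on synthetic data, not the original analysis.

 # Estimate beta from a log-log least-squares fit of y ~ a * x**beta.
 import numpy as np
 
 rng = np.random.default_rng(0)
 x = rng.uniform(1, 100, size=50)
 y = 3.0 * x ** 0.7 * rng.lognormal(sigma=0.1, size=50)   # true beta = 0.7
 
 # log y = log a + beta * log x, so the slope of the linear fit is beta
 beta, log_a = np.polyfit(np.log(x), np.log(y), 1)
 print(f"beta = {beta:.3f}, a = {np.exp(log_a):.3f}")
 
 # A p-value for the slope could come from scipy.stats.linregress:
 #   from scipy.stats import linregress
 #   res = linregress(np.log(x), np.log(y)); print(res.slope, res.pvalue)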
Bibliography
- Best Kept Secrets of Peer Code Review - Jason Cohen 2006
- Code Inspection - Michael Fagan IBM 1974 (29 pages)
- http://en.wikipedia.org/wiki/Software_inspection