Augmenting Code Review Tools: Screen Age and Retina Burn…

Two more ideas to augment the code review process:

Screen Age

Imagine you’re reviewing a piece of code.  There’s something like…500 lines to look at.  So you’re scrolling through the diff, checking things here, checking things there…

And in the background, the review software is recording which sections of the code you’re looking at.

Hm.

I wonder if it’d be a useful metric to know which portions of the code were looked at the most during a review?  I believe it’s possible (at least within Firefox) to determine the scroll position of a window.  I believe it’s also possible to determine the dimensions of the entire diff page, and the offset coordinates of elements on that page.

So shouldn’t it be possible to determine which elements were on the screen, and for how long?
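It should be.  As a rough sketch (the section names and pixel values below are my own invented examples, not from any real review tool), it comes down to a simple overlap test between the viewport and each element’s page-relative offsets:

```typescript
// Sketch: given the window's scroll offset and height, decide which
// diff sections are currently on screen.  Positions are the
// page-relative offsets mentioned above (think offsetTop/offsetHeight).

interface Section {
  id: string;
  top: number;    // offset from the top of the page, in px
  height: number; // rendered height, in px
}

// Return the ids of sections that overlap the visible viewport.
function visibleSections(scrollY: number, viewportHeight: number,
                         sections: Section[]): string[] {
  const viewTop = scrollY;
  const viewBottom = scrollY + viewportHeight;
  return sections
    .filter(s => s.top < viewBottom && s.top + s.height > viewTop)
    .map(s => s.id);
}

// Accumulate "screen age": call this on a timer (say, once a second)
// and add the elapsed interval to every section that was visible.
function accumulateAge(ages: Map<string, number>, visible: string[],
                       intervalMs: number): void {
  for (const id of visible) {
    ages.set(id, (ages.get(id) ?? 0) + intervalMs);
  }
}
```

Polling like this once a second would give exactly the per-section “screen age” a heat map could be drawn from.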

Imagine seeing a “heat map” over the diff, showing where the reviewer spent most of their time looking.

Jason Cohen wrote that code reviews should take at least 5 minutes, and at most 60-90 minutes.  I wonder if collecting this information would help determine whether or not a review was performed carefully enough.

Now, granted, there are plenty of ways to add noise to the data.  If I’m doing a review, and stand up to go get a sandwich, and then my neighbour visits, etc…my computer sits there, gazing at those elements, and the data is more or less worthless…
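One partial fix (a sketch of my own; the one-minute threshold is an arbitrary assumption) is to stop counting time whenever there has been no keyboard or mouse input for a while:

```typescript
// Sketch (my own thresholds, not from any real tool): only count a
// sampling tick toward screen age if some input event happened
// recently, so an unattended screen stops accumulating time.

const IDLE_THRESHOLD_MS = 60_000; // assumption: a minute of silence = away

// Given tick timestamps and input-event timestamps (both in ms,
// both ascending), return the ticks that should count as viewing time.
function activeTicks(ticks: number[], events: number[]): number[] {
  let i = 0;
  let lastEvent = -Infinity;
  return ticks.filter(t => {
    // Advance past every input event that happened before this tick.
    while (i < events.length && events[i] <= t) lastEvent = events[i++];
    return t - lastEvent <= IDLE_THRESHOLD_MS;
  });
}
```

Of course, input activity is only a weak proxy for attention: you can read a long stretch of code without touching anything, and you can wander off without logging out.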

Which brings me to:

Retina Burn

Eye tracking is not a new technology.  One of my roommates works at a lab where they do experiments with eye tracking tools.  From what he tells me, the eye tracking gear in the lab is pretty expensive and heavyweight.

But check this out:

and this:

and this:

So it looks like a webcam and some clever software can do the eye tracking trick too.  This stuff is probably far less accurate than what my roommate uses – but it’s lighter, and cheaper, and therefore more likely to find its way into homes.  So it looks like the day is coming where I’ll eventually be able to use my eyes to control my mouse cursor.

But, more interestingly, this more or less solves the problem with my Screen Age idea:  this technology can tell when you’re looking at the screen.  And it can give a pretty good guess about what area of the screen you’re looking at.

I wonder if collecting this information from code reviewers would be useful – what exact parts of the code have they looked at?  And for how long?  What have they missed?  What did they gloss over?

UPDATE: Karen Reid has also brought to my attention the possibility of using this technology to see how TAs grade assignments by tracking what they’re looking at.  Hm…

5 thoughts on “Augmenting Code Review Tools: Screen Age and Retina Burn…”

  1. Jesse Gibbs

    Wow, the ultimate report for the over-zealous development manager!

    It makes me think of 1984, where people wonder if their televisions are watching to make sure they are doing calisthenics 😉

    Seriously though, I think that this technology is really cool for eye-tracking studies, but it would be overkill in a code review tool. Most developers would shut off their web cams if a tool had this feature, as it really feels intrusive.

    I think the easy-to-use time tracking features in products like Crucible and Smart Bear are probably sufficient for decent metrics.

  2. Gregg Sporar

    I would agree with Jesse’s comment. Further, in Code Collaborator we’ve got a solution for the “I got up and went to the bathroom or started talking to a co-worker” problem already baked into the algorithm we use for tracking the amount of time recorded by the tool.

    Our browser user interface is already watching for keystrokes, focus changes, mouse movement, etc. so we did some detailed testing at an early customer site to fine-tune the algorithm to prevent us from reporting that you were reviewing a piece of code for 16 hours just because you left it up on the screen at the end of the day. 🙂

  3. Mike

    @Jesse and Gregg:

    Thanks for the comments!

    Yes; privacy did come up during my conversations with Karen about this. Imagine you’re reviewing some code, and all of a sudden, your webcam light quietly flicks on…creepy.

    However, instead of using it as a tool for managers, I wonder if it’d be a good tool for reviewers to get a sense of what they’ve really looked at, and what they may have missed.

    Taking a look at the Code Collaborator features listed here: http://smartbear.com/codecollab-features.php, it seems as if I can see how many person-hours everybody spent reviewing my code (and not using the washroom!). I’m also guessing that you took the next logical step, and can show a report of how much time each reviewer spent separately. But did you include what section of the code they looked at the most? The least?

    As a reviewer, I would find the information of where other reviewers have spent most of their time very useful to me.

    Academia-wise, it also opens doors for finding strategies that more experienced reviewers use. What does an experienced reviewer do when they pop open the review? What do they look at first? What do they do? Mountains of data here for study.

    I appreciate both of you for providing your input.

  4. Gregg Sporar

    >can show a report of how much time each reviewer spent separately.

    Yes, Code Collaborator can report that as well.

    >But did you include what section of the code they looked at the most? The least?

    No, that’s not in place today. But there are some interesting similar metrics that are available: where were the most comments and defects entered? That information is available and is a (very) loose approximation of where the majority of the time was spent.

    Your question about “what do they look at first?” is a very interesting area. I assume you are familiar with this study: http://portal.acm.org/citation.cfm?id=1117309.1117357

  5. Nelle

    Another really interesting use of eye tracking is improving the UI: it could be a sort of ReviewBoard for UI issues 🙂

    One thing one of my teachers underlined is that someone can focus on a part of the screen (text or image) either because they don’t understand it, or because it is interesting (reading speed decreases in both cases). Unfortunately, an eye-tracking tool cannot yet tell the difference (I don’t think any exist yet), but a few computer programs should be able to tell whether the “reader” is focused or not by looking at the size of the pupil and facial characteristics.

Comments are closed.