Design and Evaluation of

Interactive Proofreading Tools for Connectomics

Daniel Haehn, Seymour Knowles-Barley, Mike Roberts, Johanna Beyer,
Narayanan Kasthuri, Jeff W. Lichtman, Hanspeter Pfister

Harvard University

MOTIVATION

PROOFREADING

ANALYSIS AND DESIGN

EXPERIMENT

RESULTS

Connectomics

Goal: Fully understand the wiring diagram of the brain

Acquisition

Registration

Labeling

Analysis

Proofreading

Labeling Errors

[Figure: original EM image with cell boundaries, automatic labeling, and correct labeling]

PROOFREADING

Proofreading

"Manual Correction of Automatic Labeling"

[Figure: labeling before and after manual correction]

Proofreading operators

Merge

Split

Adjust

[Figure: segmentation before and after each operation]
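To make the merge and split operators concrete, here is a minimal numpy/scipy sketch on a small, hypothetical label volume. The data and function names are illustrative only; tools like Mojo and Dojo perform split interactively (e.g., from user input), not by connected components as sketched here.

```python
# Minimal sketch of merge and split on a 3D label volume (hypothetical data).
import numpy as np
from scipy import ndimage

labels = np.random.randint(1, 5, size=(10, 400, 400))  # 10 slices of 400x400

def merge(labels, source_id, target_id):
    """Merge: reassign every voxel of source_id to target_id."""
    out = labels.copy()
    out[out == source_id] = target_id
    return out

def split_disconnected(labels, segment_id):
    """Split: give each spatially disconnected piece of a segment its own id.
    (Interactive tools instead split along user-drawn boundaries.)"""
    out = labels.copy()
    pieces, n_pieces = ndimage.label(out == segment_id)
    next_id = out.max() + 1
    for piece in range(2, n_pieces + 1):   # keep the first piece under the old id
        out[pieces == piece] = next_id
        next_id += 1
    return out

corrected = merge(labels, source_id=3, target_id=2)       # join two fragments of one cell
corrected = split_disconnected(corrected, segment_id=4)   # separate wrongly joined pieces
```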

Proofreading Result

[Figure: segmentation before and after proofreading]

Related Work

Seung et al.: EyeWire, http://eyewire.org

Related Work (2)

Sicat et al.: Graph abstraction for simplified proofreading of slice-based volume segmentation, EUROGRAPHICS 2013

Related Work (3)

ANALYSIS AND DESIGN

Mojo

Mojo (2)

Usage of Mojo

Summer interns, researchers

But the data is huge and there is far too much to label..

High school students

Problem: very restrictive IT environments, and Mojo is too complicated and designed for single users

Simple Web-based Solution

Dojo - Live Demo

Dojo (2)

Dojo (3)

Collaborative features

multiple users in 2D

problem marker

multiple users in 3D
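As a purely illustrative sketch of what collaborative proofreading can look like, the in-memory pub/sub class below broadcasts edits and problem markers to all connected users. All names are hypothetical; this is not Dojo's actual client/server implementation, which runs in the browser.

```python
# Minimal in-memory pub/sub sketch of a collaborative proofreading session.
class Session:
    def __init__(self):
        self.users = {}            # user_id -> callback that receives events
        self.problem_markers = []  # (x, y, z, note) tuples visible to everyone

    def join(self, user_id, on_event):
        """Register a user and the callback used to push events to them."""
        self.users[user_id] = on_event

    def _broadcast(self, sender, event):
        """Send an event to every user except the one who triggered it."""
        for uid, callback in self.users.items():
            if uid != sender:
                callback(event)

    def merge(self, sender, id_a, id_b):
        """A user merged two segments; notify everyone else."""
        self._broadcast(sender, {"type": "merge", "ids": (id_a, id_b)})

    def mark_problem(self, sender, x, y, z, note=""):
        """A user flagged a questionable region with a problem marker."""
        self.problem_markers.append((x, y, z, note))
        self._broadcast(sender, {"type": "problem", "pos": (x, y, z), "note": note})


# Usage: two users in the same session see each other's actions.
session = Session()
session.join("alice", lambda e: print("alice sees:", e))
session.join("bob",   lambda e: print("bob sees:", e))
session.merge("alice", 17, 42)                               # bob sees the merge
session.mark_problem("bob", 120, 88, 3, "unclear boundary")  # alice sees the marker
```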

Dojo (4)

Limitation

limited data size (works best on volumes of up to 1024³ voxels)

EXPERIMENT

User Study

Raveler vs. Mojo vs. Dojo

User Study (2)

between-subjects experiment with untrained participants (N=30)

proofread a small dataset in 30 minutes

also asked 2 experts to segment the dataset from scratch

User Study (3)

Participants

of all occupations

no experience with EM data or proofreading

N=30, 17 female, 20-40 years old (M=27)

10 users per tool

2 experts start from scratch using ITK-SNAP

User Study (4)

Dataset

400 × 400 × 10 voxels

cut from a publicly available dataset (ISBI 2013 challenge)

with ground truth available

User Study (5)

Hypotheses

H1  Proofreading will be better with Dojo

H2  Dojo's usability is higher

H3  In a fixed time-frame, proofreading by non-experts gives better results than completely manual annotations by experts

User Study (6)

Measures

Quantitative

Variation of Information

Rand Index

Edit Distance

Qualitative

10 questions regarding software usability

NASA-TLX
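For reference, the two main quantitative measures have standard definitions for comparing a candidate segmentation S against the ground truth G (the study may use normalized variants); the edit distance counts the merge and split operations still required to reach the ground truth:

```latex
% Variation of Information (lower is better):
% H = entropy, I = mutual information of the segment-overlap distribution
\mathrm{VI}(S, G) = H(S) + H(G) - 2\, I(S; G)

% Rand Index (higher is better): fraction of voxel pairs on which S and G agree,
% with a = pairs grouped together in both, b = pairs separated in both, n = #voxels
\mathrm{RI}(S, G) = \frac{a + b}{\binom{n}{2}}
```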

User Study (7)

Analysis

Quantitative

Analysis of Variance (ANOVA)

Parametric Tests if applicable

Qualitative

Created sub-groups

ANOVA (Holm's sequentially-rejective Bonferroni method)

Parametric Tests if applicable
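As an illustration only (with made-up scores, not the study's data), a one-way ANOVA followed by pairwise comparisons corrected with Holm's sequentially-rejective Bonferroni method could be run like this:

```python
# Sketch: one-way ANOVA across three tools, then Holm-corrected pairwise t-tests.
# The score arrays below are hypothetical.
import numpy as np
from scipy.stats import f_oneway, ttest_ind
from statsmodels.stats.multitest import multipletests

dojo    = np.array([4.2, 3.9, 4.5, 4.1, 3.8, 4.4, 4.0, 4.3, 3.7, 4.6])
mojo    = np.array([3.1, 2.9, 3.4, 3.0, 2.8, 3.3, 3.2, 2.7, 3.5, 3.0])
raveler = np.array([2.8, 3.0, 2.6, 2.9, 3.1, 2.7, 2.5, 3.2, 2.8, 2.6])

# Omnibus test across the three groups
f_stat, p_anova = f_oneway(dojo, mojo, raveler)
print(f"ANOVA: F={f_stat:.2f}, p={p_anova:.4f}")

# Pairwise comparisons, corrected with Holm's method
pairs = [("Dojo vs Mojo", dojo, mojo),
         ("Dojo vs Raveler", dojo, raveler),
         ("Mojo vs Raveler", mojo, raveler)]
pvals = [ttest_ind(a, b).pvalue for _, a, b in pairs]
reject, p_corrected, _, _ = multipletests(pvals, alpha=0.05, method="holm")
for (name, _, _), p, r in zip(pairs, p_corrected, reject):
    print(f"{name}: corrected p={p:.4f}, significant={r}")
```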

RESULTS

Results

Variation of Information (the lower, the better)

[Chart: VI for Dojo, Mojo, and Raveler users, the experts, and the automatic input; significant differences marked with *, p=.015]

Results (2)

Rand Index (the higher, the better)

[Chart: Rand Index for Dojo, Mojo, and Raveler users, the experts, and the automatic input; significant differences marked with *, p=.043]

Results (3)

Edit Distance (merges + splits until finished; the lower, the better)

[Chart: edit distance for Dojo, Mojo, and Raveler users, the experts, and the automatic input; a trend marked '+ (in favor)']

Results (4)

Hypotheses

H1  Proofreading will be better with Dojo

H2  Dojo's usability is higher

H3  In a fixed time-frame, proofreading by non-experts gives better results than completely manual annotations by experts

Results (5)

Qualitative (Dojo vs. Mojo vs. Raveler)

[Charts: usability (p=.0408); merge tool, the most used tool (p=.0174); slice visualization; segment visualization (3D); 'generally liked the software' (p=.0416); significant differences marked with *, trends marked '+ (in favor)']

Results (6)

Hypotheses

H1  Proofreading will be better with Dojo

H2  Dojo's usability is higher

H3  In a fixed time-frame, proofreading by non-experts gives better results than completely manual annotations by experts

Discussion

Not many participants improved the labeling; some even made it worse!

3D thinking?

Noisy data (hard to see boundaries)

Users felt rushed

Usability of tools

Training is required

Conclusion

Developed standalone and web-based tools for proofreading of automatic labeling

Performed user study with non-experts

Thank You
