Using Library PCs to Crunch Academic Research Data
Jonathan Younker
Brock University

What We're Doing
- Processing research data using high-performance computing (HPC) and many-task computing (a minimal decomposition sketch follows this list)
- ~120 nodes (468 cores, 632 GB RAM)
- Idle time on existing lab PCs, expandable across campus
- Processing overnight or during other periods of inactivity
- Student use of the PCs always has priority
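
"Many-task" here just means the dataset splits into independent per-file jobs that can run on whichever lab PCs happen to be idle. A minimal Python sketch of that decomposition follows; the share path, file pattern, and process_recording() stand-in are placeholders, not the lab's actual pipeline.

    # Each EEG recording is an independent work unit, so files can be
    # processed in any order and on any available machine.
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    DATA_DIR = Path(r"\\fileserver\eeg\raw")   # placeholder UNC share

    def process_recording(eeg_file: Path) -> str:
        """Stand-in for the per-file analysis (in practice a Matlab/Octave script)."""
        return f"processed {eeg_file.name}"

    if __name__ == "__main__":
        files = sorted(DATA_DIR.glob("*.set"))  # placeholder file pattern
        with ProcessPoolExecutor() as pool:     # local stand-in for the cluster
            for result in pool.map(process_recording, files):
                print(result)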

Why We're Doing It
- “A solution looking for a problem”
- Digital scholarship lab initiative
- Integrating research data/methods into the library collection
- Becoming a ‘baby SHARCNET’
- Accommodating non-STEM researchers


Who's Using It
- Working with Brock University’s Cognitive and Affective Neuroscience Lab to process EEG data
- Prof. Sid Segalowitz & technician James Desjardins
- Pilot project started spring 2014
- First full dataset processed weekend of September 20th, 2014

How We're Doing It
- Mix of commercial and open source software
  - Microsoft's HPC Pack (job-submission sketch after this list)
  - Matlab/Octave
  - Ability to run Python, C++, etc.
- Node Templates
  - Manual, scheduled, or available when idle
- Node Groups (mini clusters)
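
A rough sketch of how per-file Octave tasks could be handed to the HPC Pack scheduler from Python, using the "job submit" command that ships with the HPC Pack client utilities. The head-node name, data share, and process_one.m script are placeholders rather than the actual Brock setup, and the exact CLI options should be confirmed against the local HPC Pack version.

    # Submit one HPC Pack job per EEG file via the "job submit" CLI.
    import subprocess
    from pathlib import Path

    HEAD_NODE = "hpc-head"                       # placeholder head/scheduler node
    DATA_DIR = Path(r"\\fileserver\eeg\raw")     # placeholder UNC share
    SCRIPT = r"\\fileserver\eeg\process_one.m"   # placeholder per-file Octave script

    for eeg_file in sorted(DATA_DIR.glob("*.set")):   # placeholder file pattern
        subprocess.run(
            [
                "job", "submit",
                f"/scheduler:{HEAD_NODE}",
                f"/jobname:eeg-{eeg_file.stem}",
                "octave", "-q", SCRIPT, str(eeg_file),  # command the node will run
            ],
            check=True,
        )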

Challenges
- Steep learning curve
- Proprietary software/licensing
- Existing processes may require change
- Huge support/training/troubleshooting implications
- Need a Mike

To-Do List
- Automation
- Extending to other researchers
- ‘Small’ investment in hardware
- Partnering with Brock ITS/Office of Research Services (via digital scholarship lab initiative)
- Usual SLAs, procedures, etc.

Questions?
- Jonathan Younker (Acting AUL):
  - @jtyounker
  - jyounker [at] brocku.ca
- Mike Tisi (Systems Administrator):
  - @michaelvtisi
  - mtisi [at] brocku.ca
