These are my notes from Day 1, excluding the opening keynote.
Let’s get started.
### Growing the PLN: Challenges and Opportunities
- John Durno
- Bronwen Sprout
Digital preservation is ramping up across member libraries, so this session was very timely.
COPPUL (Council of Prairie and Pacific University Libraries) provides leadership in developing collaborative solutions.
The PLN supports collaborative preservation of digital collections of local interest that are not being preserved elsewhere, acting as a backup.
- Governance: cooperative, self-governed, fees for own resources
- Materials: local e.g. locally hosted journals (OJS especially because of plugin), digitized collections, websites, online resources, theses and dissertations
- Digital Preservation Working Group (DPWG): steward, develop options for common approach to digital preservation
Each member runs a distributed node, with storage replicated across the network. Servers are configured to send copies when content is marked for safekeeping.
Volume requirements are low, partly because of content ingestion constraints. Some plugins are making ingestion easier, e.g. Archive-It, LOCKSSdm, DSpace, LOCKSS-O-Matic (Archivematica, etc.)
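The replicate-and-audit idea behind the PLN can be sketched as follows. This is an illustrative toy, not LOCKSS code: the in-memory `nodes` list, `preserve`, and `audit` are all hypothetical stand-ins for networked member nodes and LOCKSS-style polling.

```python
import hashlib

# Hypothetical in-memory stand-ins for PLN member nodes; a real
# Private LOCKSS Network replicates over the network, not via dicts.
nodes = [{} for _ in range(3)]

def preserve(content: bytes, item_id: str) -> str:
    """Replicate content marked for safekeeping to every node and
    return its checksum for later auditing."""
    digest = hashlib.sha256(content).hexdigest()
    for node in nodes:
        node[item_id] = (digest, content)  # send a copy to each node
    return digest

def audit(item_id: str, expected_digest: str) -> bool:
    """Verify every replica still matches the recorded checksum,
    mirroring (in spirit) how a PLN detects damaged or missing copies."""
    return all(
        item_id in node
        and hashlib.sha256(node[item_id][1]).hexdigest() == expected_digest
        for node in nodes
    )
```

The point of the sketch is the shape of the guarantee: multiple independent copies plus periodic checksum comparison, so a corrupted replica can be detected and repaired from the others.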
Conducted a survey to look at member content. Developing further governance documents (e.g. governance policy, technical specs, collection guidelines).
Proposed broadening the scope beyond storage to a suite of tools that is actively managed under new management (a half-time position). The service needs to transform into platform-as-a-service: expand, coordinate third-party services, support shared software development, assessment, etc.
Next steps: evaluate other digital preservation communities, conduct a needs assessment, look at cost recovery, and more as per the previous points.
### Using Library Lab PCs to Crunch Academic Research Data
- 128 nodes (468 cores, 632GB RAM)
- using idle time on existing lab PCs to process research data using high-performance computing (HPC) and many-task computing
- the library wants to help in the space between research consultation and collection
- integrating research data/methods into library collection
- pilot project started in spring 2014 working with Brock U’s Cognitive and Affective Neuroscience Lab to process EEG data, see bit.ly/AccessICA
- injected library into their processes
- uses Microsoft HPC Pack:
** head node = Windows Server
** no additional licensing; Web/GUI/CLI interfaces
- controlled through ‘Node Templates’ (activation can be manual, scheduled, or on idle)
** can group nodes into mini-clusters
- steep learning curve
- proprietary software/licensing
- existing processes may change
- huge support, training, troubleshooting implications
- Need a Mike
- extend to others
- small investment in hardware
- partner with ITS/research services
- usual procedures, etc.
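The workload here is many-task computing: many independent per-trial jobs (e.g. EEG trials) fanned out across idle machines. A minimal sketch of that pattern, using Python's standard `multiprocessing` pool in place of HPC Pack; the `bandpass_power` function is a made-up stand-in for real EEG processing.

```python
from multiprocessing import Pool

def bandpass_power(trial):
    """Toy stand-in for per-trial EEG processing (here, a sum of
    squares); a real pipeline would filter and transform each trial."""
    return sum(x * x for x in trial)

def run_many_tasks(trials, workers=4):
    """Fan independent per-trial tasks out across worker processes --
    the same embarrassingly parallel pattern HPC Pack schedules
    across idle lab PCs."""
    with Pool(processes=workers) as pool:
        return pool.map(bandpass_power, trials)
```

Because each trial is independent, there is no inter-task communication to manage; the scheduler (HPC Pack, or the pool here) just keeps workers fed, which is exactly why idle lab PCs are a good fit.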
### #HackUOBiblio – libraries, hacking, and open data
Catherine McGoveran, Government Information Librarian, University of Ottawa
- focused on open data
- how to work with it, why
- collaboration with other groups and for participants
- open and accessible as possible
- library data from both the Ottawa Public Library and uOttawa was released
- fostered new interest in open data from staff and students
- map of planes over Ontario (refreshes every 10 minutes)
- pull content from Amazon when an item is not available at the library
- locations of outdoor skating rinks
- where ILL materials are sent to and received from
- more coffee, less salad
- theme? mini-workshops?
- way to share results, projects
- many flavours of hackfest, depending on audience and goals
* overall successful
* data sharing culture
* even talked about creating a website
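Most of the hackfest projects above boil down to pulling a city's open dataset and extracting a few fields. A minimal sketch of that, using a made-up GeoJSON-style sample standing in for something like the skating-rink dataset; the field names and values are illustrative, not from any real portal.

```python
import json

# Made-up GeoJSON-style sample; real open data portals publish similar
# feature collections (these names and coordinates are illustrative).
SAMPLE = json.dumps({
    "features": [
        {"properties": {"name": "Rink A"},
         "geometry": {"coordinates": [-75.69, 45.42]}},
        {"properties": {"name": "Rink B"},
         "geometry": {"coordinates": [-75.75, 45.35]}},
    ]
})

def rink_locations(payload: str):
    """Extract (name, lon, lat) tuples from a GeoJSON feature
    collection -- the typical first step before mapping the data."""
    data = json.loads(payload)
    return [
        (f["properties"]["name"], *f["geometry"]["coordinates"])
        for f in data["features"]
    ]
```

In GeoJSON, coordinates are longitude-first, which trips up many first-time hackfest participants when they plot the results.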
### Hacking the City: Libraries and the Open Data Movement
Alex Carruthers, Digital Public Spaces Librarian, Edmonton Public Library @acecarruthers & Lydia Zvyagintseva, MLIS/MA Candidate, University of Alberta @lydia_zv
- governments release data under licenses that allow use and generally only require attribution
- cities were first to start doing this
- open data movement: increase government transparency, create social and commercial value, increase participation and engagement in government
- hosting and supporting hackathons supports digital literacy and civic engagement, leverages community knowledge, and increases participatory culture
- Open Data Day at EPL: participants got to speak with city staff and others who released open data, worked with the available data
- 73% respondents enjoyed opportunity to network with city representatives
- staff motivated by citizens’ feedback and participation
- need more, better data
- data in civic lives required to further knowledge and innovation
### Useful Usability Panel
- Gillian Byrne, Associate Chief Librarian, Ryerson University
- Jeff Carter, Solutions Architect, University of New Brunswick
- Krista Godfrey, Web Services Librarian, Memorial University of Newfoundland
- James MacKenzie, Associate Director of Libraries (Academic and Scholarly Technologies), University of New Brunswick
Going Beyond User Testing & UX: From Complaints to Opportunities
Presented by Krista & Gillian.
Usability testing tends to surface unsolicited feedback, and user surveys frequently involve complaints; we are getting feedback all the time.
Problems with feedback include:
* letting someone drive the conversation
* treating the problem without looking at the underlying issues
* often emotional, personal, biased, reactionary
Move from feedback to critique; critique implies critical thinking and improves shared vocabulary, among other things.
What do we do with the feedback we’re not ready for? How do we get better feedback? Not everyone uses the web the way you do, or thinks about it the same way.
Keep asking why until you get to the heart of the problem. Essential to use terminology that makes sense to our users.
Review feedback, flag what’s high priority, and make sure low priority ones at least get documented somewhere.
Two categories of issues get solved right away, even during testing:
* a technical issue that’s immediately solvable
* a question where we can immediately refer the user to the right person
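The triage flow described here (fix the immediately solvable, refer people, flag high priority, and still document low priority) can be sketched as a simple bucketing function. The record shape and bucket names are my own illustration, not anything the panel specified.

```python
def triage(feedback_items):
    """Sort incoming feedback into buckets: immediately fixable
    technical issues, referrals to the right person, and a priority
    backlog where even low-priority items get documented."""
    buckets = {"fix_now": [], "refer": [], "high": [], "low": []}
    for item in feedback_items:
        if item.get("technical") and item.get("solvable_now"):
            buckets["fix_now"].append(item)
        elif item.get("needs_person"):
            buckets["refer"].append(item)
        elif item.get("priority") == "high":
            buckets["high"].append(item)
        else:
            buckets["low"].append(item)  # documented, never dropped
    return buckets
```

The design point is the final `else`: low-priority feedback still lands somewhere searchable instead of disappearing.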
Make sure everyone knows testing is going on and what’s being tested.
Don’t just capture the words; also capture the context and how it is said. Collect and correlate feedback from different parts of the library. But what’s the tipping point for acting? It’s more about the nature of the issue than the number of complaints.
Be internally transparent about how feedback is collected, reviewed, and acted upon.
Not all feedback is useful, but even when it isn’t, the person and the connection with that person are important. Encourage people to contribute, e.g. through a beta testing group.
Enter testing with an idea of what you’re looking for, but keep an open mind.
There is no one solution. Think about where, what, and why you’re testing.
Disrupted UX: User Research for Responsive Web Design
Presented by James.
Built user research into the redesign of the website, focused on the mobile user experience.
It’s common to see a mobile site as a bridge to the physical space, but they saw increased use of the regular site. Moved to a responsive site (facets of the same experience).
User-designed prototypes:
- which canvas are you interested in? informed by analytics and campus/institutional programs
- let users create paper prototypes: extreme vertical paper prototyping (desktop content arranged onto mobile-sized paper)
- more than just a colouring exercise: placement can imply weight/importance and identify missing things
- top content was similar to the previous mobile site; not all the paper was used
* test 5 users each time, but test multiple times
* user groups x representative devices: desktop, tablet, smartphone
* be clear on your audience and goals
* guerrilla testing, intentional selection
Testing for Comparability
* across devices offers some opportunities to research from usability to user experience
* how are your mobile users different? what facet of UX meeting/not?
* test and capture consistently
* evidence of difficulties (pass, time), behaviours (scrolling/scanning, device orientation)
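Capturing evidence consistently (pass/fail, time on task) is what makes results comparable across devices. A small sketch of that aggregation; the session record shape `{'device': ..., 'passed': ..., 'seconds': ...}` is an assumption for illustration.

```python
from statistics import median

def summarize(sessions):
    """Aggregate per-device pass rate and median time-on-task from
    consistently captured test records, so results can be compared
    across desktop, tablet, and phone."""
    by_device = {}
    for s in sessions:
        by_device.setdefault(s["device"], []).append(s)
    return {
        device: {
            "pass_rate": sum(r["passed"] for r in recs) / len(recs),
            "median_seconds": median(r["seconds"] for r in recs),
        }
        for device, recs in by_device.items()
    }
```

Median rather than mean time is a deliberate choice here: a single stuck participant shouldn’t dominate the device comparison.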
Targeting Pattern and Design
* requires creative restructuring
* non-visible items get missed
* combination of icon and text tends to do better
Debrief interviews ask follow-up questions, including how likely the user would be to do the task on mobile. Resist over-generalization.
Mind the Gap: Bridging Usability and Accessibility
Presented by Jeff.
Meeting accessibility “standards” won’t automatically create great user experiences.
Usability = how well your interface works, vs. accessibility = whether it works at all, not how easy it is to use.
Accessible UX, or rather UX for accessibility: improve the usability of accessible navigation, page structure, and design.
How? Not sure yet, but have a few ideas.
Users of accessible technologies are no more computer-savvy or technically literate than anyone else; NEVER presume otherwise.
Improve your site to make it work for everyone. Most users browse, so:
* use headings (and landmarks) well
* use descriptive links with a visible focus colour
* mind layout and structure (in source order)
* don’t include useless text (title attributes, access keys)
Also, ensure scalability.
Automated testing is only a good start.
You have real people with real needs.
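To make "automated testing is only a start" concrete, here is a minimal checker built on Python's standard `html.parser` that flags two machine-detectable issues: images without alt text and vague link text. The class and the list of vague phrases are my own illustration; real audits (WCAG and beyond) need human review.

```python
from html.parser import HTMLParser

class QuickA11yCheck(HTMLParser):
    """Flags two machine-detectable issues: <img> without alt text
    and vague link text like 'click here'. Deliberately incomplete --
    it cannot judge whether alt text or a heading actually helps."""
    VAGUE = {"click here", "here", "more"}

    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_link = False
        self._link_text = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append("img missing alt text")
        if tag == "a":
            self._in_link, self._link_text = True, ""

    def handle_data(self, data):
        if self._in_link:
            self._link_text += data

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False
            if self._link_text.strip().lower() in self.VAGUE:
                self.issues.append(
                    f"vague link text: {self._link_text.strip()!r}")

def check(html: str):
    """Run the quick checks over an HTML fragment and return issues."""
    parser = QuickA11yCheck()
    parser.feed(html)
    return parser.issues
```

A tool like this catches the mechanical misses; whether the descriptive link text actually describes the destination is exactly the part only a person can test.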
End of the Day
Jumping off to the party