The hidden meaning of “a great degree of flexibility and customization”

The Code4Lib mailing list has an interesting discussion about Primo, a discovery layer. The thread piqued my interest not because of its technical content, but because of what’s not actually being discussed. Here’s the sentence that intrigued me (emphasis mine):

We use Alma/Primo here at California State University Sacramento and are finding a great degree of flexibility and customization of the local collections.

Flexibility and customization! I do like this. However, something else nagged me as well. Admit it: most of us are tinkerers. We like the idea that we can customize anything to make sure the relevant information is displayed properly, with additional bells and whistles if needed. We cherish the idea of “freedom” in this area, where we can basically create a “perfect” user interface without being constrained by the vendor’s product. After all, each library is different, and cookie-cutter templates could never satisfy us.

Here lies the hidden meaning of the freedom we want so much: we had better know what we’re doing. There will come a time when we have to devote a lot of our time to planning and designing, and to making the careful considerations needed to make the product work effectively. Anybody whose work deals with information architecture and/or user experience knows this. Design decisions should be based on usability studies, data analysis, and user research: understanding how our users interact with our web presence. Most of us already have data from our web logs; our face-to-face or virtual interactions with users attempting to use our web presence give us indications of our website’s pain points; and, if we’re lucky, we have already done one or two usability studies of our web presence.

However, when it comes to working on a totally new service with a new web presence, do those data and analyses apply to this new design? How exactly do we go about designing a totally new user interface? There is no easy answer to that. It is always a good thing to involve our users from the beginning, getting their input and trusting their opinions. Or to create persona stories (stakeholders) and use them at least as a starting point. And this is probably where the paradox happens. We know our services and collections, and we know our systems. So we design how we present our collections and services based on our previous understanding of our past users, which might or might not still be relevant.

[lost my thought here. it might come back later. someday.]

On an information-seeking report

Project Information Literacy released their research report, “Lessons Learned: How College Students Seek Information in the Digital Age,” in 2009. The PDF report can be found at http://projectinfolit.org/pdfs/PIL_Fall2009_Year1Report_12_2009.pdf.

What makes this report interesting is that the group also tries to dig deeper into how students develop strategies for their information needs, both for course-related work and for everyday life. In general, students use course readings, library resources, and things like Google and Wikipedia when conducting course-related research. They tend to use Google, Wikipedia, and friends when it comes to everyday-life research.

One of the findings is that students tend to use the course readings first for their course-related research. This seems like a no-brainer to me. After all, the faculty is their “first contact” in the courses they take.

The report also points out the differences between the guidance librarians provide and the strategies students employ: “All in all, the librarian approach is one based on thoroughness, while the student approach is based on efficiency.” (p. 20) This lines up nicely with what Roy Tennant wrote many years ago: “only librarians like to search; everyone else likes to find.” (Digital Libraries – Avoiding Unintended Consequences, http://www.libraryjournal.com/article/CA156524.html)

As a side note, I’m curious about the time and effort spent on research into students’ information-seeking behavior. Public services librarians seem to understand this already, based on their interactions with students. Interestingly enough, most library collection decisions are based on faculty research needs. So I wonder how familiarity with those resources affects faculty decisions in constructing their course readings, and whether it might also affect students’ information-seeking behavior.

All in all, this is their ultimate conclusion:

This is our ultimate conclusion: Today’s students are not naïve about sources, systems, and services. They have developed sophisticated information problem-solving strategies that help them to meet their school and everyday needs, as they arise.

The report came up with several recommendations, and one of them gave me pause:

We have come to believe that many students see instructors—not librarians—as coaches on how to consult research. This situation seems to occur whether the faculty may qualify as expert researchers in the area of student research methods, or not. Librarians and faculty should see the librarian-student disconnect as a timely opportunity, especially when it comes to transferring information competencies to students.

We recommend librarians take an active role and initiate the dialogue with faculty to close a divide that may be growing between them and faculty and between them and students—each campus is likely to be different. There are, of course, many ways to initiate this conversation that some libraries may already have in use, such as librarian-faculty roundtables, faculty visits, faculty liaison programs, and customized pathfinders to curriculum, to name but a few. And there is always room for creating new ways to facilitate conversation between faculty and librarians, too. No matter what the means of communication may be, however, librarians need to actively identify opportunities for training faculty as conduits for reaching students with sound and current information-seeking strategies, as it applies to their organizational settings.

Personally, I have no objection to the recommendation above. After all, that’s what we (the librarians) are here for. However, the recommendation basically takes for granted that narrowing or closing the librarian-student disconnect would actually improve the outcomes of students’ research. In other words, nowhere does the report indicate that this disconnect “harms” student outcomes. It would be nice to see some kind of assessment of this.

snowy tree

[image: snowy_tree.jpg]

URL shortener’s life

I was perusing some emails that came from a mailing list, old blog posts that I had bookmarked, and old tweets that I had favorited. Many of them contain some kind of shortened link from services like TinyURL, Bitly, and t.co.

While the URL shorteners themselves are still functioning just fine, the target URLs are not always so lucky, and sometimes I get a 404 error from the destination website. I know link rot happens, but somehow this irked me.
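Out of curiosity, this is easy enough to check in bulk. Below is a minimal sketch, assuming Python with the requests library installed; the shortened URLs are placeholders, not my actual bookmarks. It follows each short link’s redirects and reports where it lands and what status code comes back.

```python
# Minimal sketch: follow a shortened URL's redirects and report where it
# lands and whether the target still resolves. Assumes the `requests`
# library is installed; the URLs below are placeholders, not real bookmarks.
import requests

short_urls = [
    "http://tinyurl.com/example",  # placeholder
    "http://bit.ly/example",       # placeholder
]

for short in short_urls:
    try:
        # HEAD keeps the request light; allow_redirects follows the shortener's hop(s).
        resp = requests.head(short, allow_redirects=True, timeout=10)
        print(f"{short} -> {resp.url} [{resp.status_code}]")
    except requests.RequestException as exc:
        print(f"{short} -> request failed: {exc}")
```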

Web Services related terms

(just pulling out stuff from what my brain can come up with at the moment)

API – CSS – DTD – EDI – Elasticsearch – HTML – JSON – Linked Data – Mashup – Metadata – Microformats – OAI – OASIS – OpenURL – OSS – PURL – REST – SaaS – Semantic Web – SOAP – Solr – SRU – SRW – URN – W3C – WAI – WSDL – XML – XPath – XQuery – XSLT – YAZ

text comparison tools

Bookmarking:

Pretty Diff: http://prettydiff.com/
Text Comparison: http://www.textdiff.com/
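When I just need a quick diff without leaving the terminal, Python’s built-in difflib does the same job locally. A minimal sketch; the two text snippets are made-up placeholders:

```python
# Minimal sketch: a quick local text diff with Python's built-in difflib.
# The two snippets below are made-up placeholders.
import difflib

before = """The quick brown fox
jumps over the lazy dog.
""".splitlines(keepends=True)

after = """The quick brown fox
leaps over the sleepy dog.
""".splitlines(keepends=True)

# unified_diff yields lines that already end with newlines, so print with end="".
for line in difflib.unified_diff(before, after, fromfile="before.txt", tofile="after.txt"):
    print(line, end="")
```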

Tools I use when performing accessibility assessment

Below is a list of the various tools I use when doing an accessibility assessment of our web presence. I don’t use all of them at once, though. :-) The tools I use the most for a quick test are WAVE and WebAnywhere. WAVE is most useful for telling web developers if they’re missing anything, and WebAnywhere is most useful for showing how a screen reader would operate on the site. For a thorough test, I collaborate with my blind student: I observe her interactions with the e-resource’s user interface (in a way, a mini usability study) and note the “pain points” where she encounters difficulties in understanding the structure of the web pages, such as interacting with the search box, finding the relevant article within the search results, finding a way to save and send the article citation to herself, reading the article within the page, etc.

WAVE from WebAIM

http://wave.webaim.org
Their web-based tool works fine for websites that don’t require authentication, such as open-access e-resources like PubMed. For subscription-based resources, especially if you append a proxy prefix to the e-resource link, download and install their Chrome extension.

Functional Accessibility Evaluator (FAE) from UIUC

http://fae.cita.uiuc.edu/
This tool uses Illinois’ web accessibility requirements as its evaluation procedure, which tend to be more restrictive than other states’. It’s still a good tool, though. The explanations in the report are quite useful, especially for website designers and developers.

Juicy Studio Accessibility Toolbar (Firefox extension)

https://addons.mozilla.org/en-US/firefox/addon/juicy-studio-accessibility-too/
I use this primarily for analysing color contrast. Many “modern” websites use a grey font, sometimes on a grey background, which makes reading the text quite difficult for those with visual disabilities. It’s useful for checking ARIA (Accessible Rich Internet Applications) markup as well.
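For reference, the contrast check these tools run boils down to the WCAG relative-luminance formula. Here is a minimal sketch of that calculation in Python; the grey-on-grey example colors are made up:

```python
# Minimal sketch of the WCAG 2.x contrast-ratio calculation that contrast
# checkers perform. The grey-on-grey example colors are made up.

def channel(c: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG definition."""
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb: tuple[int, int, int]) -> float:
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Grey text (#777777) on a light grey background (#dddddd).
ratio = contrast_ratio((119, 119, 119), (221, 221, 221))
print(f"{ratio:.2f}:1 -- WCAG AA expects at least 4.5:1 for normal-size text")
```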

The three tools above point out coding problems, especially in the area of using proper tags, labels, etc. The rest of the tools below are useful for pointing out user-interaction challenges caused by design decisions (information architecture, content structure within a page, etc.).

Keyboard manual operation

This is the simplest test. You just use the TAB and arrow keys on your keyboard to move around the page. It’s useful for checking whether the website has a “Skip to Main Content” option, especially if the site contains a lot of navigation links. It can be quite tedious if the site has a lot of links, but then you’d know the pain. ;-)
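If tabbing by hand gets too tedious, the same walk-through can be scripted. Below is a minimal sketch using Selenium with Chrome; the URL is a placeholder, and a script only approximates what a real keyboard user experiences:

```python
# Minimal sketch: tab through a page with Selenium and log which elements
# receive focus, to spot whether a "Skip to Main Content" link comes up early.
# Assumes Selenium and a Chrome driver are installed; the URL is a placeholder.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

driver = webdriver.Chrome()
driver.get("https://www.example.edu/library/")  # placeholder URL

for step in range(1, 21):  # walk the first 20 tab stops
    driver.switch_to.active_element.send_keys(Keys.TAB)
    focused = driver.switch_to.active_element
    label = (focused.text or focused.get_attribute("aria-label")
             or focused.get_attribute("href") or "")
    print(f"{step:2d}. <{focused.tag_name}> {label[:60]}")

driver.quit()
```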

Fangs Screen Reader Emulator (Firefox extension)

https://addons.mozilla.org/en-US/firefox/addon/fangs-screen-reader-emulator/
This is probably the easiest tool for viewing how a screen reader might read the content of a website from top to bottom, without user interaction. You’ll see the output as text rather than a voice-over. If you do use this tool, please consider a donation to the developer.

SATOGO

http://www.satogo.com
SATOGO is a web-based screen reader. Pretty straightforward. You need to use IE on Windows, and download and install their file first. Create an account if you plan to use the service often.

WebAnywhere

http://webanywhere.cs.washington.edu/
Another web-based tool that emulates a screen reader. It works pretty well, but it cannot be used for resources that require you to authenticate first (via a proxy link, etc.) or for e-resources that use your IP address for authorization.

NVDA (NonVisual Desktop Access)

http://www.nvaccess.org/
A free screen reader that is now quite comparable to the JAWS screen reader, without the added $$$$. It works on Windows only. If you use this tool, please consider donating to the developer.

VoiceOver

For Mac/OS X users, the built-in VoiceOver feature is quite useful. Follow Apple’s documentation on how to operate VoiceOver: http://www.apple.com/voiceover/info/guide/

ER&L 2014 workshop: Influencing and improving products: structured interface reviews

I will collaborate with several fine fellow librarians from the Cornell and Columbia University libraries on a workshop titled “Influencing and improving products: structured interface reviews” at the upcoming Electronic Resources & Libraries conference, March 16-19, 2014, at the AT&T Conference Center in Austin, Texas. The workshop will be held on March 19th, 1-5 pm CST.

This workshop will analyse library electronic resource platforms from both usability and accessibility perspectives. Most of our assessments tend to take either a usability or an accessibility perspective; rarely do we perform both types of assessment together when evaluating a library electronic resource. Hopefully the workshop will provide some basic understanding of how to conduct the assessments, document the findings, and communicate them to the vendor/provider.

On diversifying our conference experience

e.g., start expanding your territory. ;-)

John Dupuis (Confessions of a Science Librarian) posted “From the Archives: My theory of conferences” in January 2010 and “A stealth librarianship manifesto” in February 2011.

Good readings.

I’m in agreement with his “theory of conferences” post. I’m a systems librarian for electronic resources, so my focus is on technology as well as access to library resources (be they resources we bought or subscribe to, or the ones we produce in house: digital collections and such). I also do library web presence assessments, so usability studies, accessibility evaluations, and usage statistics fall under my work area as well. In addition to that, I’m the subject librarian for Library & Information Science and the library liaison for the Museum Studies program.

So, when I have to choose which conferences I need to (or want to, or, many times, wish to) go to, there are tons of choices. My suggestion for those who can only afford to go to the same conference(s): try to attend at least one session on a topic that has nothing to do with your area of work. It’s amazing to learn something new from those sessions, and the topic might actually end up being relevant to your work. Even if it isn’t, you’ll learn something new anyway that might inspire some kind of collaboration.

Should you choose to go to a non-library conference, you’ll meet amazing people who also care a lot about their profession and discipline. You’ll either expand your network or at least learn more about the issues and trends in those areas. It’s also a chance to promote your amazing library skills. :-) You don’t have to go to non-library conferences all the time, of course, but at least you’ll understand how people in other disciplines interact and talk about issues and trends, and you can start a conversation with them. I have had luck going to museum-related conferences (at the state [1] and national [2] [3] [4] levels) and THATCamp events, thanks to my liaison responsibilities.

For those who are curious which library-related conferences I usually go to:

I also attended the Michigan Library Association, LITA Forum (ALA), and Computers in Libraries (Information Today, Inc.) conferences. I went to the ACM SIGCHI conference once; it was awesome. Then there are C2E2 and Penguicon, which have had several library-related programs. Last but not least, the MSU Comics Forum.

[1] Michigan Museum Association
[2] American Alliance of Museums
[3] Association of Science-Technology Centers (ASTC)
[4] Museums and the Web

linux man pages

I need a Linux man page system that shows me what command(s) to use if I want to achieve something. The current man pages are not really helpful: you have to know the name of a command before you can even figure out what that command would do for you.

Anyway, I’m listing these as my bookmarks:

  1. 1
  2. 2
  3. 3