2019-12-02

Attendees

  • Nicole DeSarno

  • Courtney Eger

  • Darla Himeles

  • Josue Hurtado

  • Sarah Jones

  • Tom Ipri 

  • Vitalina Nova

  • Caitlin Shanley

  • Guest: Nancy Turner

Agenda

  1. Discussion of instruction statistics with Nancy

    1. Our questions from last meeting:

      1. What current fields are required for reporting out?

1. Nancy created a document with definitions (ARL, ACRL/IPEDS, AAHSL), all based on the same NISO definition

        2. we’re not required to collect stats - we do it to provide our peers with information about what we’re doing

        3. we have the option to provide a sample OR straight numbers 

        4. we have to provide tight documentation for numbers provided externally 

1. e.g., number of tours, attendance

          2. how many volumes the Press published

        5. “we can fudge anything we put in these systems; it’s not like it’s based on reality”

        6. ARL is pretty basic

        7. ACRL asks us to divide between physical and digital

8. AAHSL asks about additional things, e.g., number of databases

        9. Nancy appreciates having trend information, but the more conservative we are about it, the better

1. i.e., our LibGuides could be considered “online instruction,” but if we count all of those, the data becomes meaningless

      2. Are the Outcomes reported out beyond the University? Do we have flexibility to make them more useful internally?

      3. Would having separate fields for “tasks” and “outcomes” be useful?

        1. focus on program outcomes because that’s relevant to more people contributing to the form

        2. there is likely more variation in learning outcomes for particular classes

3. want to be able to say how we are contributing to institutional outcomes for student success

      4. Academic Departments change and need to be updated. What role do these have for reporting?

        1. this is helpful for librarians, but doesn’t need to be reported

        2. make this an open text field?

        3. actually, the course identifier already captures this

      5. How can we streamline HSL and non-HSL data? HSL has somewhat different needs and uses different language (e.g. “Guest Lecture”) for their reporting.

        1. ABA also has particular needs for law libraries 

      6. Can we remove Course Title and Course Instructor Last Name?

      7. If/When we make changes, the data dictionary will need to be updated and perhaps be somewhere easier to get to than Confluence.

        1. you can include definitions on the form

8. Synchronous online classes seem pretty straightforward in that they align with in-person workshops except for location. Asynchronous instruction poses numerous challenges, especially for tutorials and other materials that professors can assign unbeknownst to the libraries. This raises the bigger question of whether the dataset is tracking librarian work or instruction impact.

        1. would be helpful to be able to separate out online vs. in person

        2. creation of a guide could be counted as instruction labor, while the Springshare stats would track usage

9. Can multiple selections be made for Type? For example, if a tour is part of a workshop.

1. is it viable to have fewer options? Answer: yes

        2. Nancy’s recommendation: make it broad, and then she (and others) will parse it out later

        3. there is value in clarifying what a workshop is

      10. Do we need to be more specific with community groups? As of now, they would fall under “General Public” or “Other.” Do we need to say “Community School Group” or is “School Group” sufficient? Should school groups and other non-Temple groups be tracked differently?

Notes

  1. general recommendation: revisit this year’s goals and prioritize

    1. think about impact 

    2. select a few that are doable in these murky times

Action Items

  1. Caitlin will create a dataset/form in LibInsight (or LibAnalytics) to start testing



Other Notes

At next meeting:

  • consider current goals (high vs. low impact and effort), prioritize

  • discuss program outcomes