Page 1

Assessment in Archives: Definitions, Tools, Plans, and Outcomes

GUGM 2011
Deborah S. Davis

Valdosta State University Archives and Special Collections

Page 2

Page 3

Definition

In higher education the term “assessment” can mean a variety of things. It can refer to the process of grading an individual student’s achievement on a test or assignment, or it can refer to the process of evaluating the quality of an academic program. The overall purpose of program assessment does not focus on an individual student.

Assessment is the systematic and ongoing method of gathering, analyzing and using information from measured outcomes to improve student learning.

Rather, the emphasis is on what and how an educational program is contributing to the learning, growth and development of students as a group. There are four levels of assessment:
1. classroom assessment (involves assessment of individual students at the course level, typically by the class instructor),
2. course assessment (involves assessment of a specific course),
3. program assessment (involves assessment of academic and support programs and is the focus of this manual), and
4. institutional assessment (involves assessment of campus-wide characteristics and issues).
(Adapted from Palomba and Banta, 1999; and Stassen, Doherty, and Poe, 2001) Program Assessment Handbook, University of Central Florida, 2008. http://oeas.ucf.edu/doc/acad_assess_handbook.pdf

Page 4

Assessment is defined as data-gathering strategies, analyses, and reporting processes that provide information that can be used to determine whether or not intended outcomes are being achieved.[1]

Evaluation uses assessment information to support decisions on maintaining, changing, or discarding instructional or programmatic practices.[2]

http://www.foundationcoalition.org/home/keycomponents/assessment_evaluation.html

Page 5

“Assessment is a process that follows a continuous cycle of improvement based upon measurable goals, involving data collection, organization and interpretation leading to planning and integration.”

http://www.mcli.dist.maricopa.edu/ocotillo/retreat98/f1.html

Page 6

Teaching Evaluation and Program Evaluation are different measures

Prove you learned it? Assessment against student learning outcomes.

Does it work? Program evaluation.

Page 7

Institutional Effectiveness Plan Final Report – 2009-2010
Academic Department or Division: Odum Library, Valdosta State University Archives and Special Collections
Degree Program:
Contact Person: Deborah S. Davis

Email: [email protected]
Phone: 229 259-7756

Assessment Cycle: 2009-2010

Mission: The VSU Archives and Special Collections supports the University’s commitment to scholarly and creative work, enhances instructional effectiveness, encourages faculty scholarly pursuits, and supports research in selective areas of institutional strength focused on regional need by collecting, preserving, and providing access to records of enduring historical value documenting the history and development of VSU and the surrounding South Georgia region.

Page 8

Expected Student Learning Outcomes: Based on “Assessment of parts of the education programs in the VSU Archives and Special Collections,” submitted in September 2009, our goals for our extra-credit volunteer program and for our History 3000 classes are as follows:

Volunteers: Students should become intimately familiar with primary source documents and understand the difficulty of using those sources without indexing,

Students should have the opportunity to improve their grades through hands-on work in addition to the usual reading and writing,

Students should be introduced to the procedures of using archives as history majors, and

Students should experience Archives work and explore an area that is often a career option for history majors.

Page 9

Assessments/Measures for Volunteer Learning Outcomes 1-4

Table columns: Outcome # | Success Criteria | Measurement Method | Data Results | Data Results Applied

Outcome 1: Primary sources and indexes
Success criteria: Participation in the indexing project, working with primary sources; number of students who volunteered and completed hours in these projects
Measurement method: Excel time and project logs; final printed evaluations; also teacher interviews and printed teacher evaluations
Data results: Total volunteers: 31; total hours: 445; online evaluations** showed 85% connected information from the project to their class; teacher interviews were used to assess success.
Data results applied: In fall 2010, we visited two lower-level classes with an orientation to get more students to come to the Archives. We had been doing volunteer orientation as a tack-on to a regular Archives class, but noted that fewer students participated.

Page 10

Outcome 2: Improve grades
Success criteria: Participation; number of students who completed projects for extra credit
Measurement method: Extra credit hours and grades reported in the report to the teacher
Data results: In fall 2009, 13 volunteers worked 120 hours; in spring 2010, 18 volunteers worked 325 hours.
Data results applied: In spring 2010 we began our changes by making sure one professor brought all classes to the Archives for an orientation. See above for the continuation of that effort with another professor.

Page 11

Outcome 3: Using archives
Success criteria: Participation; responses to written student evaluations showing understanding of archives procedures
Measurement method: Written student evaluations and written teacher evaluations
Data results: We started Archival Metrics online evaluations in spring 2010. Students reported, in the qualitative questions, their understanding of archives and willingness to do more archival research.
Data results applied: We have some data from this online survey. Slightly less than half of our volunteers for spring semester filled it out (although all the long-term volunteers did). We want to re-time the evaluation to get a higher level of participation.

Page 12

Outcome 4: Archives career information
Success criteria: Completion of projects; number of students who work enough hours to report for a final grade, and number of students who request jobs in the Archives, ask for professional mentoring, or ask about archives as a career
Measurement method: Excel time and project logs (summarized in the sketch after this table); the archivist keeps track of students requesting jobs or advising
Data results: 29% of students worked over 30 hours; this is a large share doing the longest work. About the same percentage answered the qualitative question about continuing in archives. Completion numbers include 2,600 records added to 3 databases.
Data results applied: Since spring 2010, I have hired 4 history majors, all with a 3.0 GPA and all with previous archival experience. From what I have seen of other archives, we have one of the best archival work pools in the state.

** Online evaluations were implemented in spring 2010, so we have data for only part of the year.
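A minimal sketch of how the Excel time and project logs mentioned above could be summarized once exported to CSV; the file name and column names ("student", "hours") are hypothetical, not the Archives' actual log layout.

# Hypothetical sketch: summarize a volunteer time log exported to CSV.
# Column names ("student", "hours") are assumptions about the log layout.
import csv
from collections import defaultdict

def summarize(path):
    hours = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            hours[row["student"]] += float(row["hours"])
    total = sum(hours.values())
    over_30 = [s for s, h in hours.items() if h > 30]
    return len(hours), total, over_30

if __name__ == "__main__":
    volunteers, total_hours, over_30 = summarize("volunteer_log.csv")
    print(f"{volunteers} volunteers, {total_hours:.0f} hours total")
    print(f"{len(over_30)} worked over 30 hours ({100 * len(over_30) / volunteers:.0f}%)")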

Page 13

General Orientations: These are classes that may be “one-shot” or multi-meeting. The main focus of these classes is facilitating archival research, not creating archival descriptions, which is the focus of our volunteer program and History 3000.

•Students will understand how to use our online finding aids to locate and request items in the Archives.

•Students will understand the differences between items in Archives and items in the library as a whole.

•Students will have the tools necessary to create a research plan for their projects using archival materials, library materials and the internet.

Or (for those classes only exploring archival materials, not how to find them):

•Students will connect actual archival documents, photographs, and artifacts to time periods and concepts they have studied in their classes.

Page 14

Assessments/Measures for General Orientation Learning Outcomes 1-4

Table columns: Outcome # | Success Criteria | Measurement Method | Data Results | Data Results Applied

Outcome 1: Use finding aids
Success criteria: Task completion; number of students or groups who find materials in our archives or online in others
Measurement method: Observation and written student and teacher evaluations; teacher interviews
Data results: 50 of the 71 students to whom this question applied answered that learning about our holdings or our catalogs was the most important part of the orientation.
Data results applied: Information on our Archon catalog (now that it is more complete) will be stressed in these archives orientations. It had not been publicly highlighted until this past semester.

Page 15

Outcome 2: Library vs. Archives
Success criteria: Task completion; number of students or groups who find materials in our archives or in the library
Measurement method: Observations and written student and teacher evaluations; teacher interviews
Data results: Our evaluations for specific classes did not capture this information in written form. However, in observations and teacher interviews of our most intensive research class, students brought in articles, books, and interlibrary loan requests documenting application of the lessons. I also had five one-hour consultations with graduate students who brought in bibliographies of items based on the previous orientation sessions.
Data results applied: We will try to capture this information in a written evaluation form by changing our forms. We are changing some of our orientations to one day on book sources, one day on journal sources, and one day on internet-based primary sources. Students have to bring in sources from the previous lesson and write up their search strategies. This is working.

Page 16

Outcome 3: Research plan
Success criteria: Participation; students' expression of understanding of different sources on written evaluations
Measurement method: Written student and teacher evaluations; teacher interviews
Data results: Our Archival Metrics information did not capture this as well as observation and teacher interviews did.
Data results applied: This might need to be assessed by a post-test during the longer orientation sessions or by adding some questions to our assessment instrument. We will be able to re-evaluate an effective way to assess this starting in spring 2011.

Outcome 4: Connecting archival documents to class content
Success criteria: Participation; students' and teachers' expression of understanding of primary sources found in archives in written evaluations
Measurement method: Written student and teacher evaluations; teacher interviews
Data results: Evaluations of a variety of classes commented on this aspect of orientation, but none did so directly. Observations were more appropriate to this goal. The materials students brought to follow-up orientations showed this understanding.
Data results applied: Our assessments for these classes were designed by Archival Metrics, and we have been experimenting with them this semester. We need to further explore and modify these tools to capture specific information related to our goals. In some cases we shortened the prepared evaluations and may have cut out queries we needed.

Page 17

Expected Service Outcomes (to assess archival services outside of a classroom setting; these are administrative goals):

•Patrons should find our website informative and easy to navigate

•Patrons should be able to find specific materials from our archival collections using our finding aids, especially our Archon System

•Patrons should be able to access information about all of our holdings electronically.

•Patrons who contact us by email or phone should receive expeditious initial replies to their queries and efficient and timely delivery of answers and documents to longer-term projects.

•Patrons who come into our archives for research purposes should receive answers and information quickly and efficiently.

•Patrons who come to our Archives or contact us by phone or email should feel that they dealt with staff who were helpful and friendly. Our staff should be enthusiastically engaged in patron reference questions.

•Patrons should find our physical environment inviting and comfortable for their research and study needs.

Page 18

EXAMPLE of PROGRAM ASSESSMENT: Expected Preservation Outcomes: In 2009 the VSU Archives and Special Collections participated in a state-wide survey for the program “Georgia Healthy Collections Initiative.” Data from our survey were measured against preservation standards within the archival profession, and different aspects of our program were given a rating, along with information on steps necessary to make our collections and preservation program stronger. Based on that evaluation, we have chosen to set goals/outcomes for 2010 to strengthen our data-gathering practices, to update our disaster plan, parts of which are four years old, and to explore options for a preservation or digitizing grant.

•Data on climate should be available for analysis for our physical space, for our exhibits, and for any special environments we maintain.

•Data on light conditions in all exhibit areas will be logged regularly, with actions taken to bring archival lighting closer to lighting standards in all areas.

•Information on provenance and subsequent preservation actions will be readily available for our collections.

•The VSU Archives shall explore and seek grant funding for preservation and digitizing activities to add to our capacity to care for and make accessible our important collections.

•The VSU Archives will have an up-to-date disaster and emergency preparedness plan and a staff trained in following this plan.

Page 19

Tools: Adopted and Discarded

• Libstats: easy to use; not so easy to pull data from, except counts; adaptable (see the sketch after this list).

• PEM data loggers
• Archival Metrics surveys
• Homemade surveys
• Interviews
• Assignments
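Pulling counts out of Libstats largely means working with its exported transaction data. A minimal sketch of tallying monthly question counts from a CSV export follows; the column names ("timestamp", "question_type") are assumptions for illustration, not Libstats' actual schema.

# Hypothetical sketch: tally monthly question counts from a Libstats CSV export.
# Column names are assumptions, not Libstats' actual field names.
import csv
from collections import Counter
from datetime import datetime

def monthly_counts(path):
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            when = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M")
            counts[(when.strftime("%Y-%m"), row["question_type"])] += 1
    return counts

if __name__ == "__main__":
    for (month, qtype), n in sorted(monthly_counts("libstats_export.csv").items()):
        print(f"{month}  {qtype}: {n}")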

Page 20

Libstats

Page 21

PEM Data loggers

Page 22

Archival Metrics

Page 24

The Web site evaluation

Page 25

Results: Note the Number that Skipped

Page 26

22 questions, some this detailed

Page 27

Biggest Problem with Archival Metrics was Teaching Evaluations

• The Basic Archival Metrics Long Student Survey

• An effective tailored version

• Results:

• This is the “How well did it work” opinion evaluation.

Page 28

Did you learn it?

• Something graded: volunteer project folders, database entries, log book of hours (grade for work done)
• Assignments for History 3000 students turned in after working in the Archives
• Teacher interviews; student interviews
• Embedded archives questions on history tests
• Embedded questions on otherwise general evaluation forms

Page 29

Is it working?

• Evidence of use: Libstats and Google Analytics

• Replies to general and specific evaluation forms.

• Interviews and post-project meetings with staff and patrons

Page 30

• Link to Introduction to Ancient and Near East History Class evals with embedded learning assessment:

• How well did they retain it?

#45. The Mesopotamian clay tablets at VSU are examples of primary sources because:
a) they are stored in the archives, and therefore are archival sources
b) they are original to a Mesopotamian time period and culture **
c) they are studied by world-famous scholars of Mesopotamia
d) they are on the internet as part of an international database of cuneiform texts

#49. According to the presentation that you had in the archives, the collection of cuneiform tablets that VSU owns came from an archaeologist named _______ and the information on the tablets tells about ______:
a) Indiana Jones // the adventures of Gilgamesh
b) Richard Powell // Neolithic agriculture
c) T.E. Lawrence // building ziggurats
d) Edgar Banks // temple administration and income **

** = correct answer

Section A (37 students tested): Question 45: 94.59% correct; Question 49: 62.16% correct
Section B (42 students tested): Question 45: 95.24% correct; Question 49: 59.52% correct
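For reference, the reported percentages follow directly from per-section counts of correct answers. A minimal sketch of the computation; the raw counts (35, 23, 40, 25) are inferred from the percentages above, not taken from the original test records.

# Percent-correct computation; raw correct-answer counts are inferred from the
# reported percentages rather than taken from the original gradebooks.
results = {
    ("Sect A", "Q45"): (35, 37),
    ("Sect A", "Q49"): (23, 37),
    ("Sect B", "Q45"): (40, 42),
    ("Sect B", "Q49"): (25, 42),
}

for (section, question), (correct, tested) in results.items():
    print(f"{section} {question}: {correct}/{tested} = {100 * correct / tested:.2f}% correct")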

Page 31

Spring 2011: History/Archives Work Program Evaluation

• Volunteer evaluation with hours.

• The “did you learn it” part

Page 32

Overall Evaluation of our Assessment Results in 2009-2010: This was the first year that the VSU Archives attempted such a systematic process of assessing its teaching, services, and preservation. Our overall goal was to create a culture of assessment within the Archives. It was an ambitious goal, and I think our results were mixed. While we now have data in most areas, the value of that data in making decisions is still questionable. Observation and talking to professors and students who come to the Archives are still our most powerful tools for assessing weaknesses and planning changes. The data we gathered from surveys strongly supports our findings from observation. Using these two older, fairly informal measures has helped us make changes in the past that our data this year has validated.

Evaluation of tools adopted for assessment by the VSU Archives: Libstats was the most consistently successful tool. Part of the reason is that it is easy to use. Since it should be a part of every transaction, it can be worked into all our patron routines. Studying the questions and answers recorded in this simple tracking system has been very useful in seeing the types of questions we get and the completeness of our answers.

Page 33

Our Archival Metrics Surveys were less successful. These surveys were developed by a team of archivists across several institutions and tested extensively. They were also designed to be sent out periodically, not for every transaction, in several-times-a-year studies. Several weaknesses have emerged. While the surveys collect quite a lot of data, that data is not necessarily tied to our goals and outcomes. Also, the surveys are long and, therefore, somewhat irritating for patrons to fill out. The response rate is low, about 25% of the surveys we sent out, and responses drop off during the survey. Also, with our current workload, it is not practical to gear up for large data-gathering efforts two to three times a year. Data gathering needs to be built more into our transactions. During spring of 2011, I plan to re-evaluate our surveys and try to either revise the Archival Metrics Surveys or replace them over the course of the next year with something created in-house. This will be a very time-consuming activity. For teaching surveys, we shortened the Archival Metrics survey, but it still does not reflect our goals and outcomes adequately, as most types of classes are both very specific and very different.

Our in-house reference tracking database has been abandoned in favor of other projects higher on our priority list. Libstats provides a less robust replacement, but for the time being it is what we need.

Page 34

Google Analytics works very well on websites where it is deeply embedded, such as Vtext. However, its reporting on the Archives website is less effective. It is currently not capturing our finding aids, thus missing a large part of how our patrons look for information. Working with the Automation department, we need to embed Google Analytics more deeply into the Archives website. This will depend on workflow in the Automation department, but it should be a goal for next year.

Our climate measurements: Vault and patron area data continue to be collected; however, our ability to use the data has been somewhat ineffectual. We have tried to work with Plant Operations to verify our data and make changes; however, it seems personnel changes and workload changes keep derailing focus on this long-term project. All responses to climate data are short-term, and this year we have discovered a long-term drift away from temperatures and humidity levels that we have been able to maintain for years. Since this drift seems connected to the piece of our system that draws in air from the main campus system and since HVAC work has been part of the library renovation for the past year, we want to gather data after the renovation is complete in December to see if we need to force a meeting for a long-term solution. Other records, such as light readings, display case readings, and freezer readings, have yet to become routine and thus are less useful than they could be.
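A minimal sketch of how the logger data could be checked for the kind of long-term drift described above, assuming a CSV export with hypothetical "date", "temp_f", and "rh_percent" columns (PEM logger export formats vary):

# Hypothetical sketch: monthly average temperature and RH from a data-logger
# CSV export, to watch for long-term drift. Column names are assumptions.
import csv
from collections import defaultdict

def monthly_averages(path):
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # month -> [temp total, rh total, count]
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            month = row["date"][:7]  # "YYYY-MM"
            sums[month][0] += float(row["temp_f"])
            sums[month][1] += float(row["rh_percent"])
            sums[month][2] += 1
    return {m: (t / n, rh / n) for m, (t, rh, n) in sums.items()}

if __name__ == "__main__":
    for month, (temp, rh) in sorted(monthly_averages("vault_logger.csv").items()):
        print(f"{month}: {temp:.1f} F, {rh:.1f}% RH")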

Page 35

General Plan for Assessment: Our first year of assessment showed us our strengths and weaknesses in this area. While we gathered data, we found that data not uniformly useful, nor did it always lead to the expected changes. We have several overall goals for assessment in 2010-2011:

•Study and modify at least two thirds of the Archival Metrics surveys we are currently using to make them apply more directly to our learning and service outcomes and to make them shorter and less onerous for patrons to complete.

•Begin sending out links to a survey with every reference question answered by email.

•Work to routinize and publicize our data-gathering efforts. The success of Libstats is a model for this. Train students in the critical nature of data gathering.

•Build in preservation data gathering in the areas of lighting, climate and displays, and make sure that data is gathered regularly.

2nd year: 2010-2011 plan

Page 36

So… Assessment

• Remember: assessment is the tool, not the goal itself. Results often get lost.

• Make sure your instruments are tied to your goals.

• Limit the time you spend on assessment—make it reasonable.

• Make sure you apply your results.
• Talk about assessment with your peers. We need more good ideas.