15 Interinstitutional Collaborations to Forge Intracampus Connections: A Case Study from the Duke Endowment Libraries

Sarah Hare, Andrea Wright, Christy Allen, Geneen E. Clinkscales, & Julie Reed

Introduction

Academic libraries are increasingly providing education and support for faculty interested in open educational resources (OER). Some academic libraries offer comprehensive training programs and dedicated staff to support the creation and adoption of OER. Other libraries focus on smaller initiatives seeking to explore the interests of OER stakeholders on their campuses. However, there are still a significant number of academic libraries with little or no experience supporting faculty OER needs. This was the situation at Duke University, Davidson College, Furman University, and Johnson C. Smith University in early 2016.

Most of the librarians at these institutions had minimal to no experience working with OER, and had no clear concept of which of their faculty had adopted or contributed to OER. Fortunately, these institutions have access to a shared endowment that facilitates collaborative programming across their institutions. Beginning in the summer of 2016, librarians from these four schools joined together to develop a unique OER pilot program that pooled resources and created a support structure across institutions with notably different student populations, faculty interests, and library structures.

TDEL OER Pilot Program

The OER pilot program implemented by Duke, Davidson, Furman, and Johnson C. Smith benefited from external funding and interinstitutional collaboration made possible by the Duke Endowment. Established in 1924 by the tobacco and hydroelectric power magnate James B. Duke, the Duke Endowment is a permanent trust fund with designated beneficiaries. Among these beneficiaries are four institutions of higher learning: Davidson College, Duke University, Furman University, and Johnson C. Smith University. In 2001, the libraries from these four institutions established an informal group called the Duke Endowment Libraries (TDEL) to foster collaboration and share knowledge and training across the institutions’ libraries. In 2015, TDEL established a project fund to provide financial support for joint projects among two or more of the Duke-endowed libraries. During preliminary discussions about possible uses of this fund, the topic of OER arose. With the exception of Davidson College, the libraries’ OER experience was fairly limited. Working collaboratively, the Duke-endowed libraries could realize the following benefits:

  • Connect librarians at different institutions with a shared interest in supporting open access and OER.
  • Collect and analyze OER information from several institutions to identify larger trends and interests in higher education.
  • Maximize resource utility by pooling training and funding across institutions, while still facilitating a program that best suits the needs of each individual campus.

Even with all the benefits of collaboration, it is important to note that the four beneficiary institutions of the Duke Endowment are very different from one another. They have varying missions, student bodies, and levels of involvement in scholarly communication initiatives. The TDEL library directors felt this institutional variety was an asset to the project.

Davidson College is a highly selective liberal arts college of almost 2,000 students located just outside of Charlotte, North Carolina. For its size, Davidson has been fairly progressive in the field of open education. In 2015, its Library and Center for Teaching & Learning began offering Open Educational Resources and Open Pedagogy Stipends. This program awarded $500 to five faculty interested in integrating OER into their fall 2016 classes (Center for Teaching & Learning, 2016).

Duke University is a private research university of over 14,800 undergraduate and graduate students located in Durham, North Carolina. While the Duke University Libraries have been active in scholarly communication, including managing the University’s institutional repository and offering funding to support faculty open access publications, they had done very little in the realm of strategic OER programming prior to this project.

Furman University is an undergraduate liberal arts university located in Greenville, South Carolina, serving 2,700 students. While the Furman Libraries have been actively involved in other scholarly communication initiatives, including managing the University’s institutional repository and administering the University’s Open Access Fund, they had done virtually nothing related to OER prior to this project.

Johnson C. Smith University is a historically black college and university (HBCU) serving approximately 1,350 students in Charlotte, North Carolina. The library has a strong reputation of collaboration with faculty; however, they had very little hands-on experience with OER prior to this project.

In the spring of 2016, the TDEL library directors approved the creation of a collaborative OER pilot program for academic year 2016–17. The OER pilot program was developed to have two major elements: a Train the Trainer Workshop and a Faculty OER Review Program. The budget for the program was $12,800 with $10,000 earmarked for faculty stipends. The program had the following goals:

  • Increase knowledge of OER among librarians;
  • Increase awareness of OER among faculty on campus;
  • Assess campus knowledge and climate regarding open access and OER;
  • Inform the development and/or expansion of OER initiatives supported by the libraries.

Train the Trainer Workshop

As noted above, an important goal of this program was to increase knowledge and experience of OER among the Duke-endowed librarians. To that end, they organized a Train the Trainer Workshop where an OER expert educated the librarians on the benefits and limitations of open resources, offered hands-on experience with locating and evaluating OER, provided tips for engaging faculty, and facilitated a brainstorming session on implementing a successful faculty OER review program.

Selecting an expert for the Train the Trainer Workshop was itself a result of interinstitutional collaboration. Each librarian conducted research to identify possible candidates to lead the workshop, and the list of candidates was then discussed and decided upon over the phone. William Cross, Director of the Copyright and Digital Scholarship Center at North Carolina State University (NCSU), was chosen. His knowledge and expertise in implementing the Alt-Textbook Project at NCSU was a contributing factor in his selection (North Carolina State University, 2017). Because he, like the participating institutions, was located in the Carolinas, he also brought familiarity with and a strong frame of reference for the Duke-endowed libraries.

Two librarians from each institution attended the workshop, which was held in the James B. Duke Memorial Library of Johnson C. Smith University. Having two librarians in attendance was extremely beneficial. First, it increased the amount of cross-collaboration and knowledge sharing within and between the institutions. Second, it allowed for a more even distribution of the workload in implementing and running the OER faculty review program at each local institution. Each of the librarian participants had other duties and responsibilities within their libraries, and none of the libraries had dedicated staffing to support open education initiatives, making it even more imperative to build expertise and support across neighboring institutions. Setting up the OER pilot program involved myriad tasks, such as training, logistics, outreach and promotion, budgeting, reporting, and maintaining a commitment to the proposed timeline. Given the amount of work required and the limitations in staffing, sending two librarians from each institution kept the workload manageable.

In the workshop, Cross introduced the concepts of open education and OER. He also incorporated an interactive exercise in finding, evaluating, and using these materials by providing hands-on experience with the OER websites OpenStax, Open Textbook Network, OER Commons, and MERLOT (Multimedia Educational Resource for Learning and Online Teaching). He then illustrated various models for establishing a successful and engaging OER faculty review program, including the Alt-Textbook Project at NCSU. He worked with the TDEL librarians to create an action plan that defined the priorities and timelines for organizing the logistics of an OER faculty review program centrally and implementing it locally. Cross concluded with a brainstorming session on how to conduct an OER training workshop for faculty with tips for promoting the program on each of the campuses.

The materials Cross used to facilitate the TDEL training session can be accessed at https://goo.gl/S3Ac7o.

OER Faculty Review Program Overview

The second component of the OER pilot program was a faculty review program in which faculty were paid a $250 stipend to conduct a written review of one or more OER. The faculty were not required or even encouraged to adopt OER for use in the classroom. The goal of the program was simply to introduce them to the concept of OER and allow them to spend some targeted time assessing an OER to see how it might work in their classroom.

Supporting the creation of faculty OER reviews through stipends had several benefits: it created an opportunity to start conversations about OER on campus; it allowed librarians to build expertise in locating and evaluating OER; and it provided faculty with hands-on experience using OER. Finally, this program was a low-resource, high-impact way for the libraries to slowly transition into campus OER support.

The concept of implementing a review program as a first step in OER support is not a new one. The TDEL libraries were inspired by initiatives like the University of Minnesota’s Open Textbook Network review program (Senack, 2015) and the University of South Carolina’s SCoer! Awards (University of South Carolina, 2017), which demonstrated great success in engaging faculty by paying them stipends to conduct reviews of OER. The University of Minnesota determined that awarding stipends to faculty willing to review OER eventually led to broader faculty adoption. OER expert Ethan Senack (2015), writing about the Minnesota program, stated that “[w]hile the original intent of the project was to build open textbook credibility through reviews, it soon became clear that when faculty engaged with open content to provide a review, they were likely to adopt the open textbook in their class” (p. 13). This model has been highly successful for the Open Textbook Network (OTN), a consortium of over 600 campus members working to increase open textbook adoption and access to course material. While the TDEL libraries considered a consortial OTN membership, both funding and time constraints were barriers. The group ultimately decided that creating and successfully implementing a similar internal program would be a good first step toward potentially asking the TDEL library directors for additional funding for OTN membership.

In summary, even though other libraries had conducted similar faculty review programs, the TDEL program was unique because it was conducted collaboratively across four institutions with varying student populations, faculty interests, and library structures. For the program to be a success, it was necessary to establish cross-institutional program features for standardized results while also accommodating individual institutional customizations.

Cross-Institutional Program Features

Common Documentation

To incorporate both the standardization necessary for cross-institution analysis and the flexibility of campus-specific marketing and data, the group decided to develop base forms for program participation. These base forms included the standardized elements in a format agreed upon by the TDEL librarians. Each institution would then create a copy of the forms to customize. Colors, logos, and identifying information were customized to the individual institutions. If schools wished to gather additional information, they added unique questions to the forms. Since the core of each form was the same across institutions, results could then be easily aggregated and analyzed.

The base forms are publicly available in Google Drive for other institutions to copy, adapt, and reuse under a CC BY 4.0 license.

  • Consent form: Faculty participants completed a consent form prior to official participation in the program. Because of privacy concerns, this form was not created in Google Docs; instead, it was a fillable PDF that faculty printed out and physically signed.
  • Review form: Created in Google Docs, easily allowing the institutions to make copies of the base form and customize as needed.
  • Stipend form: Stipends were paid by Duke University Libraries Business Services, which preferred stipend forms formatted in Microsoft Word.
  • Feedback form: Created in Google Docs, easily allowing the institutions to make copies of the base form and customize as needed.

Institutional Review Board (IRB)

In keeping with research best practices, prior to launching the OER faculty review program, the librarians applied for Institutional Review Board (IRB) approval through Furman University. Originally, the application was submitted as Exempt Status, because the review form was similar in nature to a survey, and because the information being collected was not deemed by the librarians to pose any potential risks. However, after reviewing the application, the IRB requested it be resubmitted as Expedited Review. It is important to note that an IRB at a different institution may have come to a different conclusion and determined that Exempt Status was sufficient.

As part of the Expedited Review proposal, the librarians were required to address the following: a thorough rationale for the program, complete copies of all four forms, detailed procedures and methodologies, privacy and security of the research, and potential risks to participants. One of the major concerns expressed by the IRB related to the level of personally identifiable information that could be publicly included with the faculty reviews. Three out of the four institutions were small in size and the Board was concerned that individual faculty members could be easily identified due to limited demographic information. Given the concerns, the librarians updated the consent form to allow faculty to choose the level of personally identifiable information they were willing to share. The Board also required that the consent forms be paper-based (rather than a Google Form) and stored off-line to protect participant privacy.

The IRB process was a new one for the Furman librarians who took the lead on this portion of the project. As such, they were required to go through extensive training on IRB procedures and faced a substantial learning curve in understanding and completing the application. This preparation added several weeks of work, and because IRB approval was not built into the initial timeline, it delayed the launch of the program by a week. However, approval was well worth the delay because it enabled the librarians to publish and present on the results of the faculty review program.

Faculty Participant Requirements

Faculty participants from all the TDEL institutions went through essentially the same process, although its implementation differed slightly from institution to institution:

  1. Attend a workshop or one-on-one consultation with a librarian, which included identifying possible OER.
  2. Sign a consent form to participate.
  3. Complete a written review for each OER using a standardized form.
  4. Submit personal information required for the $250 stipend to be issued.
  5. Complete a follow-up survey.

The format of the consultation was the most varied part of this process. In some cases, workshops were used to introduce multiple faculty to OER and the program process simultaneously. In other cases, the specific interest of a faculty member, coupled with the difficulty of scheduling group meetings, led to one-on-one consultations. In all formats, the consultation included: an overview of OER, particularly licensing; a discussion of the review criteria; and an introduction to platforms and sites designed for OER discovery.

As required by the IRB, all participants completed a hard copy consent form to participate. This detailed the program and allowed participants to indicate what level of identifying information they would allow to be included in any written or publicly available materials resulting from the study. The form also indicated the title and URL for all OER to be reviewed. Faculty were allowed to choose one large curricular component, such as an online course or textbook, or several small components such as videos or content modules.

A standardized review form was utilized across all four institutions. The first section covered basic demographics and contact information. The second section asked for basic metadata about the OER, including license and content level. The main body included a professional review based on the BCcampus OER review criteria (BCcampus, 2013). Faculty ranked and commented on the following areas:

  • Comprehensiveness
  • Content Accuracy
  • Relevance/Longevity
  • Clarity
  • Consistency
  • Modularity
  • Organization/Structure/Flow
  • Interface
  • Grammatical Errors
  • Cultural Relevance

A fourth section completed the survey with a personal review in which faculty reflected on the strengths and weaknesses of the OER as well as how the resource might be used in their own teaching.

Once librarians at the faculty member’s home institution received the completed review, the faculty member was directed to the Business Services Office at Duke University, which collected the sensitive information required for issuing payment from the Duke Endowment funds. External payment processes are complex, particularly when compensating faculty at different institutions. It is important to plan ahead, document how the compensation process will work, and think through the collection of sensitive information beforehand so that expectations for faculty participants are clear.

Participants who completed a review were sent a follow-up survey in early summer after the program. This survey aimed to determine the success of the program both as a specific OER review exercise and as a larger OER awareness campaign.

The first section asked participants to rank their experience and knowledge of OER before and after the program, as well as reflecting on benefits and challenges to OER adoption in their teaching. A second section asked about perceived OER knowledge and challenges at the department and institutional level, including whether faculty would consider adopting OER in their classroom. The final section rated the various elements of the OER review program individually as well as a whole.

Questions about concrete textbook savings were not asked as this project was an introductory effort intended to build knowledge and understanding of the concept of OER.

Individual Institution Customization

Even though the faculty review program was largely standardized, each institution also had the flexibility to make its own customizations. Librarians utilized their knowledge of successful communication strategies, effective outreach tactics, and existing culture around openness on their campus to maximize success. As previously mentioned, while faculty across institutions were required to meet with a librarian to learn more about OER, each institution could decide what that meeting looked like.

At Davidson College, the program was promoted through the faculty listserv, internal web pages, and an OER tab on the Davidson Open Access guide (https://davidson.libguides.com/open/oer). Librarians met with interested faculty in one-on-one consultations where they could tailor their searches to each faculty member’s interest, as this aligned with the small, liberal arts college culture at Davidson.

At Duke University, the program was promoted through the Center for Instructional Technology (CIT) newsletter, a blog post on the Duke University Libraries website, and a presentation at the CIT Showcase. The program was also promoted to library staff through presentations at a staff digital scholarship discussion group, departmental meetings, and a discussion series about scholarly communication topics entitled “ScholComm in the Edge.” A website was also created: https://scholarworks.duke.edu/open-access/open-educational-resources/.

At Furman University, the OER faculty review program was promoted at the New Faculty Orientation and an Undergraduate Evening Studies Faculty Orientation. An OER Guide (http://libguides.furman.edu/oer), along with information about the review program specifically, was created and posted to the library website. Because of significant interest from these two orientations, the program was not actively marketed elsewhere. Group workshops to introduce OER and the program requirements were held to reach multiple faculty at once. A few faculty members who could not attend the workshop or joined the program later had one-on-one consultations with a librarian.

At Johnson C. Smith University (JCSU), there was no general familiarity with OER. Some faculty were using OER without realizing it, however, when teaching with public domain materials and Creative Commons-licensed journal articles. To publicize the pilot program, liaison librarians for different subject areas attended meetings for their respective departments as well as meetings of the library committee and with departmental chairs. A LibGuide was also created to assist in promoting the initiative (http://jcsu.libguides.com/OER). Word of mouth about the $250 stipend also assisted in recruitment and outreach efforts. Like Davidson, Johnson C. Smith found that meeting with professors for one-on-one OER consultations was a better model than a group workshop, because faculty could more easily receive individualized and modified assistance for their particular courses.

It was important to balance standardization across the TDEL group with institutional customization. Others hoping to create a similar program should aim to balance data collection across institutions with customization to each institution’s goals, mission, and culture in order to be effective.

Results

OER Review Results

As of June 12, 2017, 28 faculty members had completed 37 professional reviews of OER for the program. Johnson C. Smith and Furman had robust faculty responses, with 11 and 10 faculty participants, respectively. Duke and Davidson had smaller responses, with 3 and 4 faculty participants, respectively. Every rank of faculty was represented across the four campuses (see Table 1), with Assistant Professor and Instructor being the most common. The participants also represented a wide variety of disciplines and departments (see Table 2), though the social sciences dominated the group. Also of note, 10 of the participants taught in nontraditional undergraduate programs. Introductory-level material and textbooks were the most popular content level and format, but there was considerable variety in both areas (see Table 3).

Table 1. Participant Ranks

Rank                    Count
Assistant Professor     11
Associate Professor     3
Instructional Designer  1
Instructor              9
Professor               4
Grand Total             28

Table 2. Participant Disciplines & Departments

Disciplines & Departments    Count
Arts & Humanities            6
  Classical Studies          1
  English                    1
  Interdisciplinary Studies  1
  Religion                   1
  Theatre Arts               1
  Visual & Performing Arts   1
Sciences                     6
  Chemistry                  1
  Computer Science           1
  Environmental Sciences     1
  Mathematics                1
  Nursing                    2
Social Sciences              16
  Anthropology               1
  Business & Accounting      6
  Communication Studies      2
  Education                  1
  Ethnic Studies             1
  Health Sciences            1
  Interdisciplinary Studies  1
  Psychology                 1
  Social Work                1
  Sports Management          1
Grand Total                  28

Table 3. OER Formats & Content Levels

Format & Content Level           Count
Article                          2
  Graduate Student/Professional  1
  Introductory/Survey            1
Class Assignment/Exercise        1
  Graduate Student/Professional  1
Online Course                    4
  Advanced Undergraduate         1
  Graduate Student/Professional  1
  Introductory/Survey            2
Other                            1
  Introductory/Survey            1
Textbook                         23
  Advanced Undergraduate         8
  Graduate Student/Professional  2
  Introductory/Survey            13
Video                            6
  Advanced Undergraduate         1
  Graduate Student/Professional  1
  Introductory/Survey            4
Grand Total                      37

As noted previously, the professional review categories were adapted from an OER evaluation rubric developed by BCcampus. This same rubric is utilized by the Open Textbook Library and other organizations, making it a common and accepted tool for the evaluation of OER.

Considering the variety of faculty and resources being reviewed, the professional reviews consistently gave high marks across all categories (see Table 4). On a five-point Likert scale, every category averaged above the scale midpoint.

Table 4. OER Review Rankings

Category                            Average  Median  Mode  Std. Dev.
Comprehensiveness Rating            3.65     4       4     1.23
Content Accuracy Rating             4.22     4       5     0.98
Relevance/Longevity Rating          4.00     4       5     1.11
Clarity Rating                      3.89     4       4     1.10
Consistency Rating                  4.22     5       5     1.11
Modularity Rating                   4.19     5       5     1.17
Organization/Flow/Structure Rating  4.16     5       5     1.14
Interface Rating                    3.95     4       5     1.25
Grammar Rating                      4.41     5       5     0.93
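As an illustrative sketch of how such summary statistics can be computed, the snippet below uses only Python's standard library. The ratings list is invented placeholder data, not the actual TDEL review responses:

```python
from statistics import mean, median, mode, stdev

# Hypothetical 1-5 Likert ratings for a single review category
# (placeholder values only -- not the actual TDEL review data).
ratings = [5, 4, 5, 3, 4, 5, 2, 5, 4, 5]

print(f"Average:   {mean(ratings):.2f}")   # arithmetic mean
print(f"Median:    {median(ratings)}")     # middle value of the sorted ratings
print(f"Mode:      {mode(ratings)}")       # most frequent rating
print(f"Std. Dev.: {stdev(ratings):.2f}")  # sample standard deviation
```

Each review category in Table 4 would be summarized the same way, one list of ratings per category.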

Faculty participants were also asked how they might use the resource, with the ability to choose more than one option. For only four OER did participants indicate that they would not use the resource at all in its current form; the majority indicated that they would use the reviewed OER as supplemental material, and several indicated that they would replace their textbook with the resource (see Table 5). When asked about changes to the resources, suggestions ranged from updating and expanding content coverage to adding supplemental resources such as bibliographies, timelines, and glossaries. With these changes, only two resources were still listed as having no potential use for the faculty member (see Table 6).

Table 5. How might you use this resource in its current state?

Option                         Count
As a textbook replacement      13
As a unit replacement          7
As an assignment or exercise   10
As supplemental material       29
I would not use this resource  4
Grand Total                    63

Table 6. How might you use this resource with your recommended changes?

Option                         Count
As a textbook replacement      20
As a unit replacement          10
As an assignment or exercise   8
As supplemental material       26
I would not use this resource  2
Grand Total                    66

Feedback Survey Results

As of June 12, 2017, 23 faculty participants representing all four institutions had completed a program feedback survey (see Table 7). Paired-samples t-tests were conducted to compare self-reported OER knowledge and OER experience before and after participation in the review program. There was a significant difference between the pre-program knowledge ranking (M=2.22, SD=1.04) and the post-program knowledge ranking (M=4.04, SD=0.77); t(22)=-7.59, p=0.01. There was also a significant difference between the pre-program experience ranking (M=2.09, SD=1.04) and the post-program experience ranking (M=3.91, SD=0.79); t(22)=-6.72, p=0.01. These results support the conclusion that the program met its stated goals of increasing OER knowledge and experience on the campuses.
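For illustration, a paired-samples t statistic like those reported above can be computed from the per-respondent differences using only Python's standard library. The pre/post ratings below are invented placeholder values, not the actual TDEL survey responses, and deriving the p-value would additionally require a t-distribution (e.g., from scipy.stats):

```python
import math
from statistics import mean, stdev

# Hypothetical pre/post self-ratings from the same five respondents
# (placeholder values only -- not the actual TDEL survey data).
pre  = [2, 1, 3, 2, 2]
post = [4, 4, 5, 3, 4]

# A paired-samples t-test operates on the per-respondent differences,
# not on the two groups independently.
diffs = [b - a for a, b in zip(pre, post)]
t = mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))
df = len(diffs) - 1

# Compare |t| against a t-distribution with df degrees of freedom
# to obtain the p-value.
print(f"t({df}) = {t:.2f}")
```

Pairing the observations respondent by respondent is what makes the test sensitive to within-person change, which is the quantity of interest in a pre/post survey design.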

Interestingly, while faculty ranked their own OER knowledge before the program fairly low (average rank=2.2; 1=None, 5=Expert), they perceived the knowledge among their colleagues as moderate (average rank=3.2; 1=None, 5=Expert). Faculty participants also indicated that they were more likely than not to adopt the OER they reviewed for the program (average rank=3.6; 1=Not at all, 5=Guaranteed) and to consider OER in future course development/revision (average rank=4.4; 1=Not at all, 5=Guaranteed). The most consistently cited challenges to adopting OER were lack of time and difficulty discovering appropriate resources. The program itself was well received, with faculty rankings indicating general satisfaction with both the review form and the faculty review program as a whole. The most common suggestion for improving the program was to increase recruitment of participants.

Table 7. Program Feedback Survey Rankings

Question                                                                    Average  Median  Mode  Std. Dev.
Rate your knowledge of OERs prior to your participation in this program     2.22     2       2     1.04
Rate your knowledge of OERs since your participation in this program        4.04     4       4     0.77
Rate your experience with OERs prior to your participation in this program  2.09     2       1     1.04
Rate your experience with OERs since your participation in this program     3.91     4       4     0.79
How likely are you to adopt the OER(s) you reviewed for the program?        3.61     4       4     1.16
How likely are you to consider OERs in future course development/revision?  4.39     5       5     0.72
How would you rate OER knowledge amongst your colleagues?                   3.22     3       4     1.04
Please rate your satisfaction with the OER Review Form                      4.30     5       5     0.82
Please rate your satisfaction with the entire OER Review Program            4.43     5       5     0.66

Engagement Results

In addition to the quantitative outcomes, the review program had some unexpected and exciting outcomes related to library and faculty engagement. The new connections made between faculty members and librarians were valuable, even at institutions that did not see high participation in their review programs.

While all of the library/faculty interactions were positive, some faculty became incredibly engaged with OER as a result of their participation. At Furman University, one program participant was so interested in OER that he began conducting research about OER in his discipline. He developed a survey to determine the impact of OER in his discipline at other liberal arts colleges. He also conducted informal research by teaching one of his classes with OER and another with traditional textbooks, then surveying the students throughout the semester to compare their experiences with the course materials. Finally, he is actively partnering with the librarians at Furman to create presentations and publications on his research. At Johnson C. Smith University, a faculty member is planning a new course on LGBTQ and gender studies using OER exclusively; she is working with the librarians to choose materials for the class. Another professor plans to use the resource she reviewed as the main textbook in her class beginning in fall 2017.

Faculty members from two different institutions were inspired to author their own OER. At Davidson, librarians were embedded in two spring 2017 courses (History 338: Berlin in Translation and Religion 278: Islamic City) where students learned more about copyright, intellectual property, and openness. One librarian at Davidson traveled to Berlin with the class to teach students about open access and privacy as a result of a Faculty OER Review Program consultation. At Furman University, a faculty participant in the program conducted a review of a LibreText. She liked the general content of the LibreText but felt that it needed significant revisions and additions before she could use it in her class. Since her review, she has been working with the LibreText website to create her own LibreText for an upcoming class in the fall.

Perhaps the most exciting outcome for the program was that it sparked broader discussions of OER adoption within the library and at the campus level. For example, at Duke University, the program triggered interest in rolling out a larger strategy for the use of OER in their MOOCs (massive open online courses). Through its MOOCs, Duke University has taught over 2.8 million students (Manturuk & Ruiz-Esparza, 2015). Adopting OER for MOOCs would have significant positive results for these enrollees. In addition, the Furman University Undergraduate Evening Studies (UES) program offers a small selection of bachelor’s degrees to nontraditional students. Five of the faculty from Furman’s OER review program taught in the UES program. Their participation has led UES directors to begin investigating the feasibility of converting the UES classes to OER-only.

All four institutions enjoy strong relationships between librarians and faculty. The faculty review program served to strengthen these relationships in several significant ways. The one-on-one consultations afforded faculty and librarians an opportunity to learn about one another’s expertise more deeply. Librarians gained a deeper understanding of the faculty members, their teaching focus, their classes, and their research. Similarly, faculty members gained a greater appreciation for the services, programs, resources, and research available to them from the library. For example, faculty who initially expressed interest in OER began to have discussions related to the use of the library’s print and electronic subscriptions and databases to support their classes. Moreover, the rapport built among librarians and faculty enabled some faculty members to begin collaborating with the library on unrelated, but equally valuable projects. These conversations would likely not have occurred if not triggered by the Faculty OER Review Program.

Lessons Learned

The TDEL OER pilot program was an experiment in interinstitutional collaboration with the goals of increasing knowledge of OER among librarians and faculty; assessing campus knowledge and climate regarding open access and OER; and informing the development and/or expansion of OER initiatives supported by the libraries. The pilot program achieved these goals all while fostering knowledge sharing, cooperative program management, and distribution of resources among the TDEL libraries. The success of this program can be attributed to four major factors:

  1. Building an OER support network;
  2. Providing opportunities for frequent virtual and in-person collaboration;
  3. Establishing a flexible timeline;
  4. Managing expectations and goals.

Building an OER support network was a critical component to the OER pilot program. The Train the Trainer Workshop served as a catalyst for forming this network, building trust, and giving all of the librarians a baseline understanding of OER. Through the workshop, Will Cross not only provided participating librarians with a wealth of information about OER, but also provided an interactive session allowing them to brainstorm, share ideas, and build a sense of trust and community. The workshop also included an informal lunch, allowing the TDEL librarians to chat personally, thereby increasing their camaraderie. These in-person interactions were a critical component to the success of their future virtual interactions.

To foster the rapport and collaborative spirit that was developed during the workshop, it was important for the TDEL librarians to meet on a frequent basis. These one-hour meetings were held every four to six weeks virtually using the online conference software Zoom. While a conference call would have sufficed, Zoom offered the added benefit of virtual face-to-face discussions and screen-sharing capabilities. During these meetings, the TDEL partners shared their progress in implementing the OER faculty review program. They also brainstormed about next steps, shared ideas about outreach efforts, discussed challenges, and celebrated successes. In between the scheduled meetings, the librarians used email and Google Drive to communicate. As a group, the TDEL librarians found that having a support structure of eight librarians (two librarians from each institution) was invaluable. Being able to share tactics for success and brainstorm solutions to challenges collaboratively has enabled the librarians to be more effective on their campuses.

Another important component to the success of the program was flexibility. As with any program, unexpected difficulties arose, so building in flexibility during the planning process was critical. This was especially true with the timeline. When establishing the timeline, the TDEL librarians failed to take into account the need for IRB approval. Adding this step to the timeline caused delays in rolling out the program, which was originally scheduled to launch at the beginning of the fall semester. Luckily, the general flexibility of the schedule allowed them to easily adjust the deadlines to compensate for the delay.

In addition to flexibility, it was also important for the librarians to manage their goals and expectations. This was especially important when it became clear that all four institutions would not be able to contribute the anticipated 10 faculty reviews. This happened for a variety of reasons, including faculty time constraints, lack of interest, and a lack of OER in niche topic areas. While this was a disappointing outcome, the number of completed reviews was not and never had been the only goal of the program. The reviews were simply a means to broaden faculty awareness of OER and start fruitful conversations about how the library could support teaching and research in nontraditional ways. Keeping this in mind throughout the duration of the program allowed the TDEL librarians to manage their expectations, and to celebrate their successes, even if those successes were different at each institution.

Due to the privacy restrictions of the IRB process, the faculty reviews were not shared publicly. Developing a joint repository of reviews could be a goal of a future partnership; however, due to staffing and managerial changes at our institutions, next steps for further collaborative efforts have not yet been determined. One of the goals of this program was to spark OER interest on our respective campuses, and that objective was definitively met.

Libraries wishing to pursue a cross-institutional collaboration to further their OER outreach should focus on creating not only efficiencies but also community. Starting OER outreach with a faculty review program can be a useful way to gauge campus climate while demonstrating librarian expertise and building connections on campus. Pooling financial resources and creating a shared OER faculty review program proved an effective means of building a support structure, creating shared resources and workflows, and collaboratively working toward a better understanding of OER.

References

BCcampus. (2013). BC open textbooks review criteria. Retrieved from https://www.bccampus.ca/files/2013/10/BC-Open-Textbooks-Review-Criteria_Oct2013.pdf

Center for Teaching & Learning. (2016). Open educational resources and open pedagogy stipends—submission request. Retrieved from https://www.davidson.edu/news/ctl-news/151215-open-educational-resources-and-open-pedagogy-stipends

Manturuk, K., & Ruiz-Esparza, Q. M. (2015). On-campus impacts of MOOCs at Duke University. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/8/on-campus-impacts-of-moocs-at-duke-university

North Carolina State University. (2017). Alt-textbook project. Retrieved from https://www.lib.ncsu.edu/alttextbook

Senack, E. (2015). Open textbooks: The billion-dollar solution. The Student PIRGs. Retrieved from http://studentpirgs.org/sites/student/files/reports/The%20Billion%20Dollar%20Solution.pdf

University of South Carolina. (2017). SCoer! 2017 faculty awards. Retrieved from https://guides.library.sc.edu/OER/affordableclass

License

OER: A Field Guide for Academic Librarians | Editor's Cut Copyright © 2018 by Christy Allen; Nicole Allen; Jean Amaral; Alesha Baker; Chelle Batchelor; Sarah Beaubien; Geneen E. Clinkscales; William Cross; Rebel Cummings-Sauls; Kirsten N. Dean; Carolyn Ellis; David Francis; Emily Frank; Teri Gallaway; Arthur G. Green; Sarah Hare; John Hilton III; Cinthya Ippoliti; DeeAnn Ivie; Rajiv S. Jhangiani; Michael LaMagna; Anne Langley; Jonathan Lashley; Shannon Lucky; Jonathan Miller; Carla Myers; Julie Reed; Michelle Reed; Lillian Hogendoorn; Heather M. Ross; Matthew Ruen; Jeremy Smith; Cody Taylor; Jen Waller; Anita Walz; Andrew Wesolek; Andrea Wright; Brady Yano; Stacy Zemke; and Liza Long, Amy Minervini, Joel Gladd is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.