14 Assessment

When course marking is well communicated, students have uniform and equitable access to details that allow them to make informed decisions about course registration. From an institution’s perspective, course schedules act, at the most basic level, as a mechanism to organize and manage classroom space and resources. Course schedule policies explicitly aim to maximize the efficient use of space and time to meet student needs (Boise State University 2011; University of Iowa 2020). For example, Drexel University’s (n.d.) course schedule policy states that its policies exist to enable students to create conflict-free schedules and to graduate in a timely manner. For many students, graduating in a timely manner is of paramount importance as tuition and fees become increasingly, even prohibitively, expensive. Between the academic years 2006/07 and 2016/17, the cost of undergraduate tuition, fees, and room and board rose by 31% at public institutions and 24% at private institutions (U.S. Department of Education, National Center for Education Statistics 2019). While course markings primarily provide students with information to help them plan their academic programs, institutions can also use them to collect data to evaluate their course marking programs and measure other effects, including teaching loads, student behaviors, and student needs.

Open and affordable course marking initiatives may develop organically within an institution, collaboratively as a part of a consortial effort, or responsively to meet changing state legislative requirements. Recognizing the motivations that drive a particular course marking initiative is imperative in planning for and measuring success. Accordingly, open and affordable course marking initiative assessments will differ.

How the data will be used and to whom the data will be reported determine what type of data should be collected. For instance, gathering and reporting data within a single institution will likely require codification and coordination among different departments. Standardizing the reporting mechanisms can lead to effective and reliable data collection and should be considered best practice. For those working with a consortial or statewide initiative, assessment may be complicated by the variety of reporting mechanisms and types of data collected. This was the case with the Affordable Learning Georgia (2020) program, whose impetus for implementing course markings was, in part, assessing its system-wide OER grant program. Though all 26 institutions involved in Affordable Learning Georgia use a system-wide registration system, differences in data entry processes and a lack of enforcement made consistent assessment impossible (Chae et al. 2019). It is advisable to have initial conversations with relevant stakeholders about what, how, and why data are to be collected and how those data can be reconciled among various systems.

Creating best practices for assessing the impact of open and affordable course marking initiatives is complicated by the fact that even the most established open and affordable course marking projects are still—to some extent—under development. This chapter will outline potential strategies for measuring the impact of open and affordable course marking initiatives by discussing assessment methods, awareness, compliance with mandates, cost savings, student success, and enrollment.

Planning for Assessment

Planning for assessment before implementing the initiative is prudent. The planning can be an informal or formal process, depending on the needs, timeline, and resources available. If the effects of a course marking initiative need to be reported to other entities, such as administrators or peer institutions, a structured approach may facilitate the process. Running through the list of questions below might be sufficient for initiatives operating without reporting requirements.

What are the goals of this course marking initiative?

Developing goals early in the process makes it easier to conceptualize and measure their effects. Consider creating goals that are SMART—that is, specific, measurable, achievable, relevant, and time-bound. SMART goals should be tailored to the needs and capacity of your initiative. A sample SMART goal for an institution looking to start a data-generating course marking program might be to establish, within six months, a list of persons responsible for course marking reporting in every academic department; a sample SMART goal for an institution with an established program might be to double the number of students enrolled in courses using an open or affordable textbook over the course of four semesters.

Who are the stakeholders?

Think about whom this initiative impacts. Identifying stakeholders can help determine the target population for a survey or focus group and thereby reveal the strengths and weaknesses of a course marking initiative. Stakeholders may include students, instructors, registrars, or partners from other units. For more detail, Part II (Stakeholders) provides a substantive overview of these groups.

Who are the collaborators?

Collaborators are stakeholders who help implement the initiative because they have some level of influence or power over the process. Examples of collaborators include administrators, instructors, department heads, libraries, institutional research departments, and registrar and student affairs offices. For example, students are stakeholders as users of course markings, whereas student government is a collaborator through active advocacy and feedback.

What is the timeline of the course marking initiative?

Generally speaking, creating a timeline using established goals can help keep an initiative on track. For evaluative purposes, marking specific times to evaluate and revisit particular goals, collaborations, or processes can help monitor progress and address changing needs and priorities. Since course marking is implemented in phases and depends on collaborations with others, flexible timelines are vital. For example, since course marking requires working with specific departments, it is helpful to schedule initial meetings with key people in those departments, touch base with them throughout implementation, and check back with a post-implementation follow-up.

How will I measure the data?

Specific considerations include what data to collect (keeping stakeholders in mind), whether the evaluation will take a quantitative or qualitative approach, and what resources are needed to collect and measure the data (e.g., survey instruments, incentives, software).

Collecting and Using Student Data

Administrators, admissions, and enrollment management departments collect a large amount of student information. This information might be used to analyze student academic cycles through student profiles, forecast academic offerings and the financial state of an institution, or propose ways to improve student learning. Just as institutional review boards exist to protect research participants, instructors and administrators collecting and using student data should consider whether that collection and use is ethical and necessary for the purposes of their assessment.

Though instructors adhere to the federally mandated Family Educational Rights and Privacy Act, which works to protect the privacy of students’ educational records, some questions about the ethical collection and use of student data remain unanswered (Jones 2019). Newsworthy international cases such as the 2017 Equifax data breach and the 2018 Facebook-Cambridge Analytica data scandal exemplify how large stores of personal data make valuable and sometimes vulnerable targets for exploitation. The ethical collection and use of student data is especially important to consider when contracting with third-party vendors, whose ethics may not necessarily align with those in academia. In August 2019, Senators Dick Durbin, Edward Markey, and Richard Blumenthal sent letters to educational technology companies expressing their concern about these companies’ handling of student data (Durbin 2019).

The proliferation of learning analytics, the process of gathering and analyzing data in order to profile learners, may help instructors better understand the variables that contribute to student success (Alexander et al. 2019). Most student data are collected through the digital interfaces of learning management systems such as Blackboard or Canvas. At an individual level, student profiles could potentially help instructors, advisers, and student success staff provide early intervention through customized emails at critical points in the semester based on individual students’ performance and predictive learning analytics (Sclater and Mullan 2017). These methods of early intervention have gained traction as institutions, facilitated by third-party vendors, work to identify best practices for supporting student persistence, retention, and matriculation. Establishing institutional guidelines for student data collection and use is vital to preserving transparency while optimizing service.

If students’ preferences for open and affordable course markings are collected in student profiles, those data could provide some insight into what types of students—traditional, adult, first-generation, or veteran, for example—might select courses that use open or affordable learning materials. Alongside the potential benefits learning analytics may bring to supporting students, instructors must also critically consider the ways in which learning analytics raise concerns about consent, bias, privacy, and ethics. Many colleges and universities already collect data about students through the learning management system or track their location through swipe systems in order to assist students, but benevolent use does not immunize data collection from ethical scrutiny. Understanding the ways in which institutions collect and use data, while respecting students’ autonomy and privacy, can help open educational resources (OER) practitioners better understand how collecting and analyzing open and affordable course marking data fits into the larger landscape of learning analytics.

Collecting accurate and comprehensive student data is vital in colleges and universities where neoliberal policies have heightened the need for instructors to demonstrate return on investment. Neoliberalism in academia conceptualizes higher education as a free market in which students are consumers and education is a commodity rather than a social or public good (Saunders 2007). Given decreases in funding to state colleges and universities over the last few decades (Chronicle of Higher Education 2014; Pew Trusts 2019), public institutions increasingly rely on revenue generated by tuition and other income streams. Institutions have traditionally tracked factors such as grade point averages, major selections, number of credits enrolled, and number of credits attempted to help determine individual students’ persistence and retention. Marking open and affordable courses can be another factor in attempting to understand student persistence and retention.

Institutions develop initiatives and programs, as well as collect and analyze student data, with the intention of improving students’ higher education experiences. These initiatives can be especially critical in a student’s first year. When administrators mark courses with designations and descriptions, they can track the implementation of institutional initiatives and analyze whether these interventions—such as offering more service learning courses—have had an impact on retention (Gardner 2002, 146). Though this quantitative data is a valuable piece of the evaluation process for retention initiatives, it should be considered in connection with other factors not captured in the student information system (SIS), including external factors at home or work, that also contribute to attrition. Akin to the way service learning designations function, open and affordable course markings are another form of institutional intervention that may be measured against student retention and persistence.

Analyzing Data

Data for course marking initiatives may come from a variety of sources, such as reports from the SIS or focus groups with stakeholders involved in the course marking process. Since many reports draw data from a complex array of information sources, analyzing data requires a basic understanding of quantitative and qualitative methods and the ability to decide which method is most appropriate for the evaluation process. This overview of quantitative and qualitative methods is neither exhaustive nor comprehensive, but it explains some basic principles one must understand when considering how to evaluate an open or affordable course marking initiative.

Qualitative methods measure observations and data that are not numerical. Some methods of gathering qualitative data include focus groups, interviews, and observation. Using qualitative methods can provide insight into processes, experiences, and perceptions. Researchers often choose qualitative methods to explain and/or create a narrative of an experience or situation. A compelling narrative or case study about student agency can be a strong indicator of the success of open and affordable course marking projects.

Quantitative methods are used to measure countable aspects of a course marking initiative—for example, the number of courses marked, the number of students enrolled in marked courses, or the number of programs or departments involved in the marking initiative. Using quantitative methods can provide valuable insight into the reach of the course marking initiative. Measuring the reach of an initiative can be particularly helpful when demonstrating value to administrators, state officials, or other stakeholders who might be potential advocates or partners. Using quantitative methods requires an understanding of descriptive statistics, inferential statistics, and confounding variables.

Statistics can be categorized as descriptive or inferential, and the two serve different purposes. Descriptive statistics report the basics of what is measured. In the case of course marking, descriptive statistics might be as simple as calculating the number of open and affordable course markings. Inferential statistics, on the other hand, use probability theory to infer other meanings and draw new conclusions from the data set. For example, inferential statistics could be used to identify which types of students (e.g., traditional, adult, first-generation, or veteran) are more likely to enroll in courses using an open or affordable textbook in order to graduate more quickly. Having that type of information is helpful when formulating a marketing plan (e.g., partnering with advisers who work with specific student groups). Descriptive and inferential statistics are each valuable for their specific purposes.
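
As a minimal sketch of this distinction, the hypothetical example below assumes an SIS export saved as enrollments.csv with a student_type column and an in_marked_course flag; the file name and column names are illustrative, not taken from any particular system.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical SIS export: one row per enrollment, with the student's
# demographic group and whether the course carried an open/affordable mark.
enrollments = pd.read_csv("enrollments.csv")

# Descriptive statistic: a simple count of enrollments in marked courses.
marked_total = int(enrollments["in_marked_course"].sum())
print(f"Enrollments in marked courses: {marked_total}")

# Inferential statistic: a chi-square test of whether enrollment in
# marked courses is independent of student type.
table = pd.crosstab(enrollments["student_type"], enrollments["in_marked_course"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
# A small p-value suggests some student groups enroll in marked courses
# at different rates than others, an inference that goes beyond raw counts.
```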

If not accounted for, confounding variables introduce bias into the analysis by implying an association where there is none. When analyzing the impact that course marking has on outcomes such as student awareness, course selection, and persistence, researchers identify and control for confounding variables to avoid distorting the association between exposure to course markings and an outcome (Pennsylvania State University 2018).

For example, choosing a course marked as a service learning course does not necessarily reflect a higher interest in service learning per se. The course could be the only one offered during an opening in a student’s schedule, taught by a popular instructor, or offered in a preferred format. The same is true for courses marked as using open and affordable materials. Researchers need to continually identify and control for confounding variables wherever possible to make accurate inferences. When confounding variables are too difficult to control for, transparency requires disclosing that limitation when presenting the data to an audience of stakeholders.
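
To make this concrete, here is a hedged sketch of one common way to control for confounders: adding them as covariates in a regression model. The column names (persisted, in_marked_course, online_format, credit_load) are hypothetical stand-ins for whatever a given SIS actually records.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical SIS export: "persisted" (1 = enrolled the following term),
# "in_marked_course" (1 = took a marked course), plus two candidate
# confounders, "online_format" (1 = online section) and "credit_load".
df = pd.read_csv("enrollments.csv")

# Naive model: any association found here may be driven by confounders
# such as course format or how many credits a student carries.
naive = smf.logit("persisted ~ in_marked_course", data=df).fit()

# Adjusted model: adding the suspected confounders as covariates helps
# isolate the association between marked-course enrollment and persistence.
adjusted = smf.logit(
    "persisted ~ in_marked_course + online_format + credit_load", data=df
).fit()

print(naive.params["in_marked_course"], adjusted.params["in_marked_course"])
# If the coefficient shrinks noticeably after adjustment, much of the
# naive association was attributable to the confounders.
```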

Evaluating an initiative may require some familiarity with the principles and methods of data analysis. A number of open textbooks have been published on the subject. Introductory Statistics (Illowsky and Dean 2020) introduces the basic statistics principles necessary for data analysis.

Assessing Open and Affordable Course Markings

By marking open and affordable courses through the SIS, institutions create an opportunity to perform basic assessment of courses that use open and affordable content and to run reports based on student success metrics. Running reports through the SIS reduces the possibility of sampling error and duplicative reporting processes.

The Open Education Group published the Guidebook to Research on Open Educational Resources Adoption, which outlines ways to measure the impact of OER on student and instructor use, cost savings, student outcomes, and perceptions of OER (Hilton et al. 2016a). The guidebook provides specific research questions and measurable variables, identifies the confounds and offers suggestions for controlling for these variables, and indicates statistical methods for analyzing the data.

The data gathered for these processes may be collected via surveys, questionnaires, reports from instructors, or reports from the SIS. The OER Champion Playbook (2017) includes “plays” created to help one identify and measure goals related to the impact of a program, the amount of cost savings, instructor and student satisfaction, progress and completion, as well as student learning and engagement.

Wiley (2019a) of the Open Education Group also created the “OER Adoption Impact Calculator,” an easy-to-use, web-based interface that allows one to enter data fields, such as the number of enrollments using OER, the average cost of the textbook(s) replaced, and the average amount spent by students using OER. The tool then calculates the total textbook cost to students, the course throughput rate, additional tuition revenue from increased enrollment intensity, tuition revenue refunded to students who drop, and the net change in institutional revenue. The Guidebook to Research, the “OER Champion Playbook,” and the “OER Adoption Impact Calculator” are great resources for those who are new to conducting basic assessment and research on OER.
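
The short sketch below illustrates the kind of back-of-the-envelope arithmetic a tool like the impact calculator performs; the figures and the savings formula are illustrative assumptions, not the calculator’s actual implementation.

```python
# Illustrative inputs (not real data): enrollments in OER sections, the
# average price of the commercial textbook replaced, and the average
# amount students still spend (e.g., on optional print copies).
oer_enrollments = 1200
avg_replaced_cost = 95.00
avg_oer_cost = 5.00

# Estimated total textbook cost avoided by students.
estimated_savings = oer_enrollments * (avg_replaced_cost - avg_oer_cost)
print(f"Estimated student savings: ${estimated_savings:,.2f}")
```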

Adding a course designation for open and affordable content in the course registration system and schedule of classes provides a mechanism for running reports to track open and affordable usage across an institution. Case studies from Houston Community College and the State University of New York explicitly indicate that one goal in developing course marking initiatives at their respective institutions was to develop methods of tracking and reporting open and affordable course material usage. Many instructors independently adopt textbooks without a formal system in place to account for the actual number of courses marked or to assess the impact of open and affordable courses on student success. Marking the courses is the first step in collecting data so that the information can be used. Institutions focused solely on cost savings might use descriptive statistics to measure the amount of potential cost savings, whereas other institutions might use statistical inference to measure the impact of courses using open or affordable textbooks.

Evaluating open and affordable course markings is not always straightforward, especially given the conflation of, and low awareness surrounding, the terms “open,” “OER,” and “affordable.” Notably, Houston Community College stopped using OER as a course marker and began marking Low Cost and Zero Cost courses. Losing the ability to untangle the impacts of OER from those of broader affordable learning initiatives through a schedule search may have unintended detrimental effects on measuring impact. As more institutions fold OER courses into no-cost and low-cost course markings, researchers lose the ability to differentiate OER courses from non-OER courses. The focus on student cost savings, together with the cyclical and internal textbook adoption process, makes measuring impact factors beyond cost savings difficult.

Open and Affordable Resource Awareness

As nascent course marking initiatives expand and new initiatives are created, program coordinators and researchers should focus on awareness of course marking among students and instructors. Assessing awareness reveals whether students and instructors know about open and affordable course markings and use them in making decisions. Coordinators interested in expanding course marking initiatives might consider collecting information that sheds light on the student enrollment decision-making process as a compelling argument in favor of expanding course marking to include open and affordable course designations.

There are a number of ways in which course marking initiatives can contribute to the general awareness of open and affordable concepts, materials, and programs. The act of marking open and affordable courses naturally leads to more awareness of courses that use open and affordable materials as students discover the markings in the SIS and instructors notice their peers using and talking about open and affordable materials in the classroom. Each institution represented in Part VII (Case Studies) made a conscious decision to use specific terminology when identifying OER, no-cost, and low-cost courses for their institutional audience.

The term “OER” does not mean much to the average student. Whether a course marking includes “OER” as a designator typically depends on policy-driven promotion of OER or open education, or on the prioritization of OER as part of an affordable content initiative. Since implementing open and affordable course marking, Houston Community College has seen an increase in the number of courses marked. Though the increase could be attributed to a number of external factors, marking these courses is a significant step in furthering discoverability and overall awareness across the institution.

To promote awareness, Central Virginia Community College’s schedule of classes clearly defines OER at the point of use (see fig. 18.1). Despite the prominent OER definition, some students mistakenly believe that courses marked as OER are delivered online. Even instructors who adopt OER may be confused by the term, particularly at institutions that mark zero- or low-cost materials. At Houston Community College, Smith notes that the reported number of students enrolled in OER courses is not accurate because many instructors were unaware of the differences between open and other affordable course materials, conflating the terms and perhaps overestimating the number of students actually enrolled in OER courses.

For institutions that also use zero- and low-cost designators, marketing materials and communications must be especially careful to prevent confusion around the terms, and program coordinators should develop a regular assessment routine to measure understanding of the terminology. Academic advisers and staff in registrar offices should be prioritized in educational outreach efforts so they can explain the designations to students. Usability tests or cross-sectional focus groups could be useful mechanisms for assessing potential users’ understanding of the language used. Additionally, a brief follow-up survey of users who have encountered marketing materials designed to explain terminology and branding would be useful in assessing the effectiveness of various outreach techniques.

In the State University of New York system, Tompkins Cortland, Fulton-Montgomery, and Dutchess Community Colleges’ course registration systems provide a clickable link to a definition of OER. At Tompkins Cortland Community College, students can filter courses to “Show only OER courses.” Assessing these awareness strategies might include counting the number of clicks on the definition link or measuring how frequently students use the “Show only OER courses” limiter.

Mt. Hood Community College and Nicolet College decided not to use OER as a designator. Instead, these campuses identify classes with affordable course materials as either No Cost or Low Cost in their registration systems. Both institutions concluded that students neither clearly understand the term “OER” nor realize that not all No Cost materials are OER. Erie Community College also avoids the term OER but uses just one designation—an “AIM” badge—for Affordable Instructional Materials, which encompasses both OER and materials that cost less than $30.

Clear and concise descriptions of open and affordable designations are important in understanding students’ and instructors’ awareness of open and affordable concepts. As stakeholders design awareness surveys and questionnaires, they can easily refer to the clear descriptions used in open and affordable course markings to build the most effective assessment tools.

In Chapter 2 (Legislative Implications), a 2018 report for the Oregon Higher Education Coordinating Commission noted that students were not aware of OER courses and/or the information was not available in a timely manner (Freed et al. 2018). The researchers recommended adopting a common form of designation across the state. A City University of New York survey indicated students were not aware they were in a zero textbook cost (ZTC) course. These observations support registrar professionals’ assertion that students do not closely read information in the SIS; rather, they are in the registration system to conduct business (Kitch 2015). Measuring awareness should not stop after reporting the survey results. The assessment of open and affordable course marking programs should be iterative and continuously focus on areas where changes can potentially improve awareness. For example, the 2018 Oregon study recommends using a phrase or icon that is easy to understand and using it consistently in more places than just the registration system (Freed et al. 2018).

At Mt. Hood Community College, administrators specifically asked the student government association (SGA) to differentiate between OER, no cost, and low cost before implementing the course markings in the SIS. The SGA recommended definitions for each term, identifying terminology that would be easy for students to understand. This pre-course marking data collection from the target population not only provides evidence supporting the use of one set of terminology over another but also creates an early awareness of open and affordable characteristics among the target population (students) prior to rolling out the course markings.

Sometimes informal conversations with instructors or students about their usage or understanding of open and/or affordable materials can be illuminating. These comments, which are qualitative in nature, add value to an assessment report by providing more insight into the nuances of how aware instructors and students are of open and affordable concepts. Anecdotally, librarians at Lower Columbia College suggest that course marking, and the associated collaboration, outreach, and marketing performed to implement and advertise the initiative, led to greater awareness and visibility of affordable textbook initiatives on campus. Marking the courses and performing the necessary legwork to disseminate and retrieve information from instructors kept the program an active topic of campus conversations (Hicks and Gillaspy-Steinhilper, Personal communication 2018).

In “Participant Experiences and Financial Impacts: Findings from Year 2 of Achieving the Dream's OER Degree Initiative,” responses to surveys and site visit interviews from 2016/17 and 2017/18 indicated students were unaware of the OER course options before they registered for classes (Griffiths et al. 2018). Seven of the colleges included in the research study marked OER and ZTC courses in the schedule of classes or course catalog at the time of the research, and at least two of these institutions included explanations for the course labels. Twenty-four percent of students reported seeing the OER icon by the course name during registration, and 23% said cost savings were a strong factor in their decision to enroll in the class (Griffiths et al. 2018, 15). For institutions that approach OER course marking as a way to build awareness and promote OER courses to students, the findings from the Achieving the Dream report and several case studies suggest that OER course marking alone is not enough to raise awareness levels.

In addition to SIS analytics, which show enrollment in open and affordable courses, surveys have been developed, implemented, and analyzed to understand student awareness of open and affordable course materials. Several surveys designed to identify student use of open and affordable course materials and perceptions of their quality are available in the OER Research Toolkit (Open Education Group 2017). In Achieving the Dream’s student survey responses, researchers discovered that though student awareness of OER was initially low, a majority of students would enroll in an OER course again (Griffiths et al. 2018). A combination of SIS data and student survey responses provides a more holistic view of open and affordable course initiatives.

Assessment Tools for Cost Awareness

| Assessment | Method | Impact |
| --- | --- | --- |
| SIS usage report | Quantitative | Tracks student hits on links/limiters for open and affordable courses |
| Student survey or questionnaire | Quantitative and qualitative | Descriptive of student awareness (e.g., OER, affordable, and zero-cost courses) |
| Instructor survey or questionnaire | Quantitative and qualitative | Descriptive of instructor awareness (e.g., OER, affordable, and zero-cost courses) |

Compliance with Mandates

Chapter 2 (Legislative Implications) and Chapter 3 (Institutional Policy) discussed state and institutional mandates for institutions to mark courses within the registration system or the schedule of classes. For example, Central Virginia Community College’s OER course marking initiative started because of a grant administered by the Virginia Community College System Chancellor’s Innovation Fund and the Hewlett Foundation, which stipulated that the institution mark the classes for the Virginia Community College System. One way to demonstrate compliance with legislative, institutional, and grant requirements is to run a usage report. Some mandates might require institutions to report the number of courses, the number of programs involved, or the overall student cost savings resulting from marking open and affordable courses. Marking open and affordable courses to demonstrate compliance often leads institutions to realize that collecting this data is advantageous in other ways. For instance, the grant funding from the Virginia Community College System and the Hewlett Foundation not only allowed Central Virginia Community College to implement the program, but also made it easier for students in the system to discover the courses and created a mechanism for reporting simple adoption data back to the funding sources.

Assessment Tools for Compliance

| Assessment | Method | Impact |
| --- | --- | --- |
| SIS usage report | Quantitative | Adherence to governmental mandates; demonstrates achievement of institutional benchmarks and/or goals |
| Survey of instructors (sample survey: Bliss et al. 2013) | Quantitative, qualitative, or mixed methods | Number of students using open and affordable materials; cost of previous course material(s) |

Students’ Cost Savings

Some institutions use course markings to report on student cost savings of open and affordable materials versus traditional textbooks. Mt. Hood Community College, for example, collects data on courses adopting OER and compares those numbers with campus store data to determine a general estimate of student cost savings. City University of New York’s Open Education Librarian Ann Fiddler notes that though the system has seen a dramatic rise in cost savings and in the number of courses taught using OER, it has been difficult to determine how much of the cost savings can be attributed to the 3,000 ZTC sections (about 5% of the total courses offered) and how much to other more long-standing OER initiatives. Nevertheless, existing ZTC course designations can be used to run reports to measure OER usage and cost savings (Fiddler and McKinney, Personal communication 2018).
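
A hedged sketch of a Mt. Hood-style estimate might join an SIS report of marked sections with campus store price data; the file layouts and column names below are invented for illustration, not drawn from either institution’s systems.

```python
import pandas as pd

# Hypothetical exports: marked sections with enrollment counts, and
# campus store data listing the replaced textbook's price per course.
sections = pd.read_csv("marked_sections.csv")  # columns: course_id, enrollment
prices = pd.read_csv("store_prices.csv")       # columns: course_id, replaced_price

merged = sections.merge(prices, on="course_id", how="inner")
merged["estimated_savings"] = merged["enrollment"] * merged["replaced_price"]
print(f"Estimated total savings: ${merged['estimated_savings'].sum():,.2f}")
# This is a general estimate: it assumes every enrolled student would
# otherwise have bought the replaced textbook at the listed price.
```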

The data collected on student cost savings also may feed back into the marketing and communication efforts to promote open and affordable course marking initiatives. Whether the communications plan targets students, instructors, administrators, or external stakeholders, highlighting baseline student cost savings or trends in student cost savings over time can be an appealing part of the messaging, as evidenced by Lower Columbia College’s 2016 campaign to promote the success of their alternative educational resources (fig. 23.1).

Assessment Tools for Cost Savings

| Assessment | Method | Impact |
| --- | --- | --- |
| SIS usage report | Quantitative | Number of courses using open and affordable materials; number of students enrolled in courses using open and affordable materials |
| Student survey (sample questionnaire: Florida Virtual Campus 2016) | Quantitative | How much money students spend on textbooks; compare/track cost savings over time |

Student Success

Though the question of cost requires access to systems outside of the SIS to calculate estimated savings, outcomes can be measured using information contained within the SIS, such as final grades, drops, and withdrawals for sections of courses using open and affordable course materials versus those using a traditional textbook. As mentioned in Chapter 9 (Student Information Systems), the registrar, records office, assessment program, and information technology department may have access to the SIS to run reports. In several case studies, including Mt. Hood Community College, Houston Community College, and State University of New York, institutional OER leaders also have access to the SIS to run reports. Student outcomes can also be measured by assessing course throughput rates—drop rates, withdrawal rates, and C or better rates—for sections of OER courses as compared with sections of courses taught with a traditional textbook.
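
As a rough illustration, the sketch below computes throughput rates from a hypothetical SIS export; the file and column names are assumptions, and any real comparison would need the confounding controls discussed earlier in this chapter.

```python
import pandas as pd

# Hypothetical SIS export with one row per enrollment: "section_type" is
# "oer" or "traditional"; "outcome" is one of "completed_c_or_better",
# "completed_below_c", "dropped", or "withdrew".
enrollments = pd.read_csv("section_outcomes.csv")

# Throughput rate: share of enrollments that neither drop nor withdraw
# and finish with a C or better.
enrollments["throughput"] = enrollments["outcome"] == "completed_c_or_better"
rates = enrollments.groupby("section_type")["throughput"].mean()
print(rates)
# Compare the two rates with caution: section-level differences such as
# instructor or course format may confound any gap between them.
```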

Kwantlen Polytechnic University, an institution which added a course attribute field that allows students and other stakeholders to filter courses that are part of their Zed Cred program (the Canadian equivalent of the Z-Degree), has noted the vast potential of using this filtering mechanism to determine the impact of the overall Zed Cred program. For example, reports can be generated each semester comparing courses that have both participating and non-participating sections in the Zed Cred program. Using these reports, insights can be gained into important metrics, including grade distributions, course withdrawals, and course failure rates.

As Hilton and colleagues (2016b, 19) explain, “while cost-savings are important to some instructors, the more vital issue relates to student learning.” Tidewater Community College implemented ZTC courses (or Z courses), designed for the Z-Degree program. Students see and can choose Z courses during registration. During the Fall 2013 through Spring 2015 terms, researchers compared student course throughput rates in Z courses with rates in non-Z courses (Hilton et al. 2016b) using data generated from SIS reports. The authors acknowledge the study design does not indicate causation, but the results of the study align with previous studies that indicate students perform as well or better in courses using OER as in courses using traditional textbooks (Hilton et al. 2016b, 24).

Retention is also a popular metric among institutional administrators. Nathan Smith at Houston Community College describes his close relationship with the Office of Institutional Research in tracking metrics such as grades, drops, and withdrawals. The courses are marked as low-cost, zero-cost, or Inclusive Access courses, and these distinctions enable the institution to compare student success in open and affordable courses with that in courses that use traditional textbooks. Data collected about open and affordable enrollment does not, by itself, indicate an effect on retention. However, provided confounding variables are controlled for, open and affordable courses can be a data point in the conversation about student retention, along with student engagement practices such as service learning courses (Bringle, Hatcher, and Muthia 2010, 45) and students’ personal financial situations (Hope 2015, 12). With access to reports in the SIS, researchers can also assess student success metrics for Pell-eligible students enrolled in OER, no-cost, and low-cost courses, which is important for assessing the persistence of marginalized students.

Assessment Tools for Student Success

| Assessment | Method | Impact |
| --- | --- | --- |
| SIS reports: final grades, failure rates, withdrawal rates, and Pell eligibility | Quantitative | Compare student success and persistence in open and affordable courses to comparable traditional courses |
| Student survey (sample questionnaire: Jhangiani et al. 2018) | Qualitative and quantitative | Compare student responses with course performance data |
| Instructor report: final grades | Quantitative | Compare open and affordable courses to comparable traditional courses |
| Focus groups or interviews | Qualitative | Descriptive of why students choose courses marked with an open or affordable textbook |

Student Enrollment

Few reported assessments have been performed to determine the impact of open and affordable course marking on student enrollment; it is an area that future stakeholders will likely explore. City University of New York stakeholders have begun to consider whether course marking impacts student enrollment in certain courses; they report that future analysis for the ZTC initiative will focus on assessing whether students register for courses specifically based on searches for the ZTC designation or enroll in these courses for unrelated reasons. Andrew McKinney, the City University of New York open education coordinator, reports that this analysis could be done by conducting student surveys or by requesting queries from the registrar’s office. While these quantitative and qualitative impact studies are still in the developmental phase, measuring the impact of the ZTC course marking initiative on student enrollment will remain a factor in determining future directions for City University of New York’s OER activities (Fiddler and McKinney, Personal communication 2018).

Since implementing course marking that designates courses using cost-free resources, Kwantlen Polytechnic University has seen longer wait-lists for Zed Cred courses than for equivalent courses not participating in the program. This reflects student assertions that courses using cost-free resources are preferable to those with more traditional costs. Wait-lists for Zed Cred or Z-Degree programs are one mechanism for assessing the popularity of, or demand for, courses using open and affordable materials. This type of information can be extremely compelling in demonstrating the value of marking open and affordable courses to administrators and other stakeholders who might be able to assist or expand existing initiatives. While City University of New York and Kwantlen Polytechnic University are exploring this assessment, researchers have not yet published results showing improvement in registration numbers for courses using open and affordable resources versus traditional course materials.

Assessment Tools for Enrollment

| Assessment | Method | Impact |
| --- | --- | --- |
| SIS usage report | Quantitative | Compare open and affordable enrollment to comparable traditional course enrollment |
| Student survey or questionnaire | Qualitative and quantitative | Descriptive of student decision-making |

Conclusion

Though a handful of institutions that have recently implemented OER have shared assessments of their programs’ impact on students, instructors, and institutions, the available information on the effects of marking open and affordable courses is still scarce. Some institutions have expressed future plans to measure the impact of open and affordable course markings. For example, Nicolet College hopes to determine if a correlation exists between No Cost/Low Cost designations and enrollment, as well as what potential effects course marking may have on the degree pathways of students. It is likely that other recent initiatives are also actively collecting and assessing impact measures to be used internally or shared with the larger academic community at a later date. Thus far, the few programs that have assessed and shared their open and affordable course marking initiatives have measured compliance with mandates, cost savings, effects on student enrollment, and awareness of open and affordable initiatives and programs.

Within the growing literature on open and affordable course marking, initiatives frequently report their impact using narrative and case study formats, such as those found in Part VII (Case Studies). A narrative reporting method allows programs to combine their qualitative and quantitative data in a way that delivers statistically meaningful information while also providing critical context connecting the program to local communities. The descriptive nature of the narrative format lends itself well to potentially disparate audiences of administrators and students alike and supports student outreach and promotional communication activities. See Chapter 13 (Implementation) for more details.

Opportunities exist for institutions as they implement course marking to develop new ways to measure their impact. As these programs evolve and literature is published on the subject, demonstrable effects of course marking may encourage stakeholders at other institutions to consider and develop new open and affordable course marking ventures.
