Assessing OER impact across organisations and learners: experiences from the Bridge to Success project

Rebecca Pitt

The Open University
beck.pitt@open.ac.uk

Nassim Ebrahimi

Anne Arundel Community College, USA
nebrahimi@aacc.edu

Patrick McAndrew

The Open University
patrick.mcandrew@open.ac.uk

Tim Coughlan

University of Nottingham
Tim.Coughlan@nottingham.ac.uk

Abstract: Open courses have received considerable attention in the last two years; however, the question of whether they serve learners remains open. This paper explores the challenges and potential in assessing the impact of open educational initiatives, particularly those that produce and share Open Educational Resources (OER). We use a collaborative international project as a case study to explore this issue. Bridge to Success was supported as part of the Next Generation Learning Challenges (NGLC) programme to work with a range of community colleges and other organisations in the US through monitored pilots. The project adapted existing course materials in mathematics and learning/personal development skills and released these as OER. A range of approaches was then used to assess the impact of the materials across a diverse set of users, combining data gathered from interviews and questionnaires with both educators and learners with instructors' ratings of performance and related student results. This approach allows different indicators of performance to be brought together, and so demonstrates the value of OER. However, our findings also highlight tensions in applying robust research methodologies to situations of open use with diverse stakeholders. We provide reflections and suggestions for ways forward in addressing the particular characteristics of openness and how they affect research, and how the multiple perspectives on what constitutes impact can be addressed.

Keywords: OER, Community College, Mathematics

Introduction

This paper explores the challenges and potential in assessing the impact of open educational initiatives, particularly those that produce and share Open Educational Resources (OER). We describe and reflect upon a range of research methods and the resulting data gathered from the Bridge to Success project.[1] Through this, we identify tensions, and suggest ways forward in addressing the need to assess the impact of open projects, the particular characteristics of openness that affect this, and how multiple stakeholder perspectives on what constitutes impact can be included in this assessment process.

Previous research has presented evidence that the use of open materials can benefit students in individual contexts (e.g. Lovett, Meyer & Thille, 2008). However, if we are to relate the impact of openness on learning to its other characteristics, it is important that we consider more holistic approaches that include use across different institutions and outside institutional boundaries. The tension between ensuring materials remain open and collecting meaningful impact data is a critical one for OER. Openness can lead to reduced information on the circumstances around resource use, making research more difficult than in a closed system (Thomas et al., 2012). Some forms of evaluation have been performed with open initiatives, for example using analytics to suggest numbers of users and their geographical distribution (e.g. Rodgers, 2011). Others have taken a qualitative approach to understanding the general impact of the OER movement and the practices developing around it (Masterman et al., 2011). However, there has been limited methodological reflection around this complex but important issue. In this paper, we reflect upon a case study that reveals some of the tensions involved with evaluating the impact of a specific open initiative.

The Bridge to Success Project

Motivations

Enabling learners to achieve their educational goals upon entering higher education is a complex and critical issue in the United States (US), as well as in the wider world (Lingenfelter, 2012). This is particularly the case for community colleges, as students begin their college studies from a wide range of starting points, situations, and backgrounds. While some will have completed High School, or have passed General Educational Development (GED) tests to show equivalent skills, others will have dropped out of High School, or taken time out of education for employment or other reasons. Many also need to combine their studies with employment or with supporting their family.

Despite being willing to continue their studies, "about 60 percent of incoming [community college] students are referred to at least one developmental course" with the large majority of these students failing to achieve any form of certification within 8 years (Bailey and Cho, 2010). It is clear, therefore, that these students face complex and significant barriers to their learning and have gaps in their education that are not being effectively addressed by the education system. As Bailey and Cho (2010) argue, helping all of these learners transition effectively into their community college programmes is of critical importance if the Obama administration is to meet its goal of "America …once again hav[ing] the highest proportion of college graduates in the world"[2] by 2020.

Open Educational Resources (OER) are seen as having the potential to provide new approaches to tackling challenges in the education system (Atkins, Brown & Hammond, 2007). Barriers to entering higher education can include a lack of familiarity with study techniques or the need to meet certain education or qualification requirements. The Bridge to Success project, funded from April 2011 to December 2012 by the Next Generation Learning Challenges (NGLC) programme (Calkins & Vogt, 2013), was an international collaboration led by The Open University (UK), with partners Massachusetts Institute of Technology (MIT), Anne Arundel Community College (AACC), and University of Maryland University College (UMUC). Bridge to Success aimed to "address the barriers to educational innovation and tap the potential of technology to dramatically improve college readiness and completion in the United States."[3] As part of the project, two open courses for first-year college students were created by remixing existing materials in a collaborative process involving UK and US content experts and educators. The courses addressed personal development/learning skills and mathematics matched to college students' needs. The funder, NGLC, was specifically interested in the use of the courses in terms of how many students enrolled, completed, persisted beyond the courses, "mastered subject matter," and "mastered deeper learning." Thus, Bridge to Success had two phases: development of the courses, followed by piloting of the materials.

Course development

Reworking of existing course materials to create two new open courses (Learning to Learn and Succeed with Math) occurred through an iterative, collaborative process (discussed in more detail in Coughlan, Pitt & McAndrew, 2013). The Open University converted the Learning to Learn course from its original print format to an online environment, requiring relatively few changes to the materials. Faculty and staff from the US project partners worked during summer 2011 on modifications, such as replacing the video sequences for three personal case studies with appropriate US versions and adjusting the terminology and spelling to suit the US context. Changes to the content occurred largely in the final section of the original course, which focused on "career development." One of the US educators involved in the reversioning felt that the focus of this part of the course was "…a little bit too far along for some of our students."[4]

The final version of Learning to Learn, consisting of 5 units, asks users to reflect on the way in which they learn before examining alternative strategies. The course is estimated to take approximately 100 hours to complete. Learning to Learn encourages learners to develop study skills and plan their workload, introduces some academic theory on different ways of learning, and is punctuated by practical activities, including mind mapping and time management exercises.

The core mathematics skills course, Succeed with Math, was also reworked for the US context, with more significant changes required than for Learning to Learn in order to accommodate curricular alignment (e.g. fractions are more prevalent in US courses). UK and US mathematics educators/instructors worked collaboratively over a period of 6 months to produce the final version of the course, consisting of 8 units. Succeed with Math is estimated to take approximately 80 hours to complete. The course is designed to increase confidence and reduce anxiety by providing examples that encourage learners to think about math as part of their everyday life. Units start from basic study skills and then focus on the mathematics themes of measurement, negative numbers, fractions, percentages, formulas, geometry and graphs. An online calculator and "hints" were included with the course. Quizzes were used at regular intervals, with practice quizzes available at the start of each unit and post-quizzes at the end of each unit. Succeed with Math also included games and pencasts (short videos showing a problem being drawn on a virtual whiteboard with an accompanying commentary).

Pilot projects

Both courses are freely available as OER for use and remixing on LabSpace (The Open University's platform for open content), giving options for how the courses could be interpreted and used. This underscores the importance of understanding the context of use and the varied ways in which each instructor/institution integrated the resources, ranging from whole-course offerings to instances where the materials were used in part (e.g. particular units) and integrated with pre-existing courses. Learning to Learn, specifically, was used in a variety of ways by piloting institutions. One college offered it as a 6-week, non-credit course for students as preparation for courses in which they had enrolled, facilitating preliminary interactions between students and instructors. Elsewhere, certain sections of Learning to Learn were incorporated into existing courses (as one instructor described it, "we just did a wrap").[5]

Piloting institutions also implemented Succeed with Math in a variety of ways. For example, one project partner organisation offered the complete math course as "a purely online format" and also as "a hybrid format." In the latter instance, students used the materials over a three-week period, meeting weekly to discuss how they had used the course: "…we effectively had what I would call group therapy sessions where they were just openly sharing stories, common stories, of their anxieties about math, I think feeling like they were part of a community of people who shared similar experiences and concerns. That sort of helped support a cohort."[6] The skills and level of experience of learners using Succeed with Math also varied. While it had been anticipated that students with "very basic algebra skills" would participate in the "hybrid format" in reality "…there was a range of … math readiness or competency but still all had the common theme of anxiety around taking the mathematics programme or course."[7]

The low cost and flexibility of the OER also made them suitable for targeted outreach efforts. Many pilots included low-income students, defined as students eligible to receive governmental financial support through federal Pell Grants (Pell-eligible; US Department of Education, 2013). By the end of the project 11 institutions had taken part in pilots (see Appendix 1), connecting with a total of 1,830 students. Of these students, 399 participated in iterations of Succeed with Math, 675 in different versions of Learning to Learn, and 756 in pilots where iterations of both courses were offered. Thirty-one instructors/staff at the 11 institutions were involved in facilitating these course offerings.

Methods

Bridge to Success had to meet the data requirements of several stakeholders: the project funders (NGLC), the piloting institutions who had invested staff and student time in participating in Bridge to Success, educators, students, and the wider OER community. All stakeholders were interested in student success but had different data requirements in relation to it. Quantitative and qualitative research methods were used to evaluate the impact of Learning to Learn and Succeed with Math on both students and educators.

NGLC was interested in how many students enrolled in and completed the courses, how many persisted to the next semester, and the number of students who "mastered subject matter" and "mastered deeper learning" (master core academic content, think critically and solve complex problems, work collaboratively, communicate effectively, and learn how to learn).[8] To address this, data was gathered from instructors, with the "mastery" components reliant on their observation and assessment of students' progress. In addition, a sub-set of data with the same parameters was required in relation to low-income students. Dates of when pilots started and ended, institution names and enrolment dates also formed part of the required data. Particular pilots were also able to measure success in future exams or courses, incorporating comparative data as available.

Piloting institutions, educators, and the research team were interested in the following questions: What were the experiences of institutions, educators and students when participating in the project? How were the materials used? How did the remixing of the materials to make them suitable for a US audience affect their suitability for this context? What was the overall impact of the project? In what contexts did it have a particularly strong impact? Google Analytics was used to evaluate the number of unique users and page views over time, and to gain an understanding of the geographical locations of users. LabSpace analytics produced from the Moodle platform gave researchers further information allowing identification of individual users. This supported analyses such as student progress through the course and the number of views of units, linked to specific users who had created accounts on the platform. In addition, students' LabSpace profiles (which contain limited personal data, e.g. name, associated programme and institution, and email address) were matched with data held by colleges, in order to track individual student performance over subsequent semesters and re-enrolment.

Student feedback on their learning and satisfaction with the materials was captured via pre- and post-questionnaires, interviews, focus groups, and observations. In the interviews and focus groups, students were asked about their views of the materials, their aspirations, and how they used the materials. Researchers also observed cohorts being introduced to both Bridge to Success courses, evaluating ease of use and the student experience.

Following use of the materials, instructors were invited to participate in an online evaluative questionnaire reflecting on their use of the materials, feedback from students, and their own impressions. Additionally, interviews and focus groups were conducted with questions focused on gathering the context and detail of how Bridge to Success materials were used, impressions of how students found the materials, and whether instructors had previous experience of using OER.

Throughout the project researchers conducted a number of face-to-face interviews with UK and US team members and stakeholders. These provided different perspectives on the project's progress. Further research on the project post-completion is being carried out as part of the Open Education Resources (OER) Research Hub project.[9]

Measuring impact on learners

In this section we look at some of the results that have been achieved from different sources. We present three different categories and types of data. First, educator views on learner performance. Second, learner survey responses gathered whilst students used the course materials. Third, an analysis of student performance data in instances where students were studying formal assessed courses either alongside, or after using, the open course material from Bridge to Success. Each of these examples can be used to measure the impact of the open intervention; in each section we look at how these can be quantified and represented before discussing briefly how their messages need to be combined and then used along with the qualitative feedback from research with educators and learners to provide a broader picture of the project in action.

Educator views of impact on Learners

By the end of the project, 1,830 learners had engaged with Learning to Learn and/or Succeed with Math as part of a pilot project. Data was sought on those who completed the courses, those who then persisted with the colleges, and those judged to have achieved mastery of subject matter and/or deeper learning. This data was based on educators reporting back to the project on whether they felt their students had mastered the subject matter or had completed the course. As some institutions piloted Bridge to Success materials on a rolling or open-ended basis, it was not always possible to have a clear definition of the cohort involved or the completion point. In practice this meant that the number of learners for whom data was available varied across the different categories of interest. Using educator reporting nevertheless enabled data to be gathered that would otherwise be difficult to obtain, such as comparisons between low-income and other cohorts of learners.

Table 1: Reported completion, persistence and mastery rates from pilots

                                              All learners          Low-income
Completing students                           1050 of 1235 (85%)    370 of 492 (68%)
Persisting students                           571 of 628   (91%)    340 of 370 (92%)
Students who mastered the subject/materials   628 of 658   (95%)    382 of 400 (96%)
Students who mastered deeper learning         592 of 658   (90%)    357 of 400 (89%)

Table 1 gives an overview of the data for all students who participated in the pilots and for whom the project received complete data. As can be seen in the right-hand column, a high proportion of participants were reported by educators to be "low-income" learners, most likely to be in receipt of financial assistance to help them with their studies. Of these 492 low-income students, 68% (370) completed the relevant Bridge to Success course, compared with 85% (1050) of the 1,235 learners for whom we have complete data. The figures for the other three performance measures are similar. In order to contextualise and understand the real meaning of these figures (e.g. why low-income students were less likely to complete, but were slightly more likely to persist and master their chosen subject), the results would need to be compared with the views of the learners themselves and, if possible, supported by objective data such as student performance in relevant courses.

Feedback from Learners

Over the duration of the project 372 and 344 students completed the pre-questionnaire for Learning to Learn and Succeed with Math, respectively. However, the post-questionnaires, situated at the end of each course, received only 96 responses for Learning to Learn and 30 for Succeed with Math. Whilst Learning to Learn was often offered in its entirety to learners, students studying Succeed with Math were more likely to have focused, or been advised to focus, on particular parts of the content rather than work their way through the entire course. Consequently, it is likely that many students did not realise there was a post-questionnaire available, as they never reached the point where they would have been asked to participate in the survey.

In the Succeed with Math post-survey (n=30), 87% of respondents would recommend the materials to other students (13% undecided), 83% would like to use materials like Succeed with Math as part of enrolling in future courses (13% undecided), and 83% reported overall satisfaction with the quality of the materials (17% undecided). There was also agreement on the value of the activities, pencasts, videos and quizzes, and on the approach of using real-life applications of math, all of which were considered key to the design of Succeed with Math by the US and UK content authoring teams.

Learning to Learn results were less strong in post-survey responses (n=96). In this instance 76% agreed or strongly agreed that they were satisfied with the quality of the materials (18% undecided), whilst 64% would recommend the materials to other students (22% undecided). In instances where there was lower satisfaction, there were suggestions from respondents that the materials were not always appropriate for them, particularly in cases where Learning to Learn had been piloted as a mandatory part of students' entry to college.

Comparative analyses of test scores

Given the timescale of the project, the schedule of semesters, and limited access to institutional data, it was difficult to conduct comparative cohort analyses. There were two exceptions that show the types of analysis that can be conducted when data is available. The first of these occurred at a non-profit organisation with a vocational programme to train the long-term unemployed. Acceptance of applicants on to the programme required passing a graded math test with a score of 11/15 (73.3%) or above. Selected units of Succeed with Math were used over a period of 1-3 weeks to enable groups of applicants who had all previously failed the entrance exam to refresh or develop the math skills needed to pass the resit exam.

On average, scores improved by 32.4 percentage points on the second attempt across this sample, with 28 of the 35 applicants (80%) who originally failed passing the entrance examination after exposure to the Succeed with Math course. The difference in performance is significant at the 0.1% level (see Table 2).

Table 2: Comparing math test results before and after use of Succeed with Math

        Pre-test   Post-test             Pre-test   Post-test
mean    41.9%      74.3%          Pass   0          28
sd      19.4       12.0           Fail   35         7
n       35         35             n      35         35
z=9.86, p<0.001                   χ²=56.9, df=1, p<0.001
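
To make this analysis concrete, the following is a minimal sketch of such a pre/post resit analysis in Python. The scores are hypothetical placeholders, not project data, and where the project reports a z-test on score means, the sketch uses a paired t-test (a standard choice when raw paired scores are available), so its output will not match Table 2.

    # Sketch of a pre/post resit analysis like that summarised in Table 2.
    # Scores below are hypothetical placeholders, not the project's data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 35
    pre = rng.normal(42, 19, n).clip(0, 100)           # first attempt (%)
    post = (pre + rng.normal(32, 15, n)).clip(0, 100)  # resit attempt (%)

    pass_mark = 100 * 11 / 15  # 11 of 15 questions, i.e. 73.3%

    print(f"pre:  mean {pre.mean():.1f}%, sd {pre.std(ddof=1):.1f}")
    print(f"post: mean {post.mean():.1f}%, sd {post.std(ddof=1):.1f}")
    print(f"passing the resit: {(post >= pass_mark).sum()} of {n}")

    # The same applicants sat both tests, so the attempts are paired data.
    t, p = stats.ttest_rel(post, pre)
    print(f"paired t = {t:.2f}, p = {p:.2g}")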

The second example of comparative data was generated by identifying students from particular community college pilot cohorts and then obtaining corresponding performance data from available results for the same students on related courses. The comparison group (Sub-group A, 154 students) took assessed Mathematics courses at the same time as using Bridge to Success materials, while the test group (Sub-group B, 86 students) took the Bridge to Success course first and then took the assessed courses afterwards. Note that the data collection approach did not exclude students who may have had to retake the course. There may therefore be some overlap between the two sub-groups.

The data was treated in two ways in order to assess whether any differences in results were significant. In the first analysis the data was viewed as pass/fail and a two-tailed Chi-squared test with 1 degree of freedom (df=1) was carried out (Table 3); in the second, the grades themselves were analysed (Table 4).

Table 3: Pass/fail results for comparison sub-group A and sub-group B

Math    Sub-group A   Sub-group B
Pass    78  (51%)     59 (69%)
Fail    76  (49%)     27 (31%)
n       154 (100%)    86 (100%)
χ²=7.26, df=1, p<0.01

The Chi-squared test indicates the results based on pass/fail were significantly different.
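
This pass/fail comparison is straightforward to check. The sketch below runs a Pearson chi-squared test on the Table 3 counts using scipy; disabling Yates' continuity correction reproduces the reported value of 7.26, which suggests, though the text does not state, that an uncorrected test was used.

    # Pearson chi-squared test on the pass/fail counts from Table 3.
    from scipy.stats import chi2_contingency

    #            Sub-group A  Sub-group B
    observed = [[78,          59],   # Pass
                [76,          27]]   # Fail

    chi2, p, dof, expected = chi2_contingency(observed, correction=False)
    print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
    # chi2 = 7.26, df = 1, p = 0.0070, consistent with Table 3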

The second analysis codes the grades as numeric values, that is A=4, B=3, C=2, D=1, F/FX=0. Non-completions were also treated as fails with value=0. This method (i.e. mapping grades to numerical values) mirrors that used to calculate Grade Point Averages (College Board, 2013). A two-tailed z-test was then carried out on the available data (see Table 4).

Table 4: Grade results for comparison sub-group A and sub-group B

Math    Sub-group A   Sub-group B
mean    1.72          2.12
sd      1.61          1.58
n       154           86
z=2.52, p<0.05

The z-test indicates the results based on grades were significantly different. This suggests that there was a significant difference on the basis of both grade point and pass/fail. The effect size, d, is estimated by the standardised mean difference, that is, the difference in means divided by the estimated common population standard deviation (Richardson, 1996). In this case, for the pass/fail data, d = 0.37, which implies an effect size in the small (0.2) to medium (0.5) range. Note that the same analysis was also carried out on the students' performance in English/Reading. In this case there was no significant difference in grades. This is as expected, as Succeed with Math does not address that subject area.
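
The sketch below reproduces these effect-size calculations from the summary statistics above. The standardised mean difference on grades assumes a pooled standard deviation. For the pass/fail data the paper does not name its formula; Cohen's h, a standard effect size for a difference between two proportions, gives the reported 0.37 when computed on the Table 3 pass rates, so that is what is shown here, as an assumption.

    # Effect-size sketch for the Table 3 and Table 4 results.
    import math

    # Table 4: grades coded A=4 ... F/FX=0, with non-completions as 0.
    m_a, sd_a, n_a = 1.72, 1.61, 154  # Sub-group A (concurrent use)
    m_b, sd_b, n_b = 2.12, 1.58, 86   # Sub-group B (prior use)

    # Standardised mean difference: difference in means over a pooled sd.
    pooled = math.sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                       / (n_a + n_b - 2))
    print(f"d (grades) = {(m_b - m_a) / pooled:.2f}")     # ~0.25

    # Cohen's h on the Table 3 pass rates (an assumed reading of d=0.37).
    p_a, p_b = 78 / 154, 59 / 86
    phi = lambda p: 2 * math.asin(math.sqrt(p))
    print(f"h (pass rates) = {phi(p_b) - phi(p_a):.2f}")  # ~0.37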

Discussion of Results

The quantitative results presented above all indicate positive value from the open materials. For example, the highly positive feedback from educators could be represented as statements such as: more than 95% of completing students were reported to have gained mastery of the subject matter. The feedback from the learners themselves and the performance data in comparative courses, while less strong, were also positive and, in the case of the performance data, statistically significant. By gathering data in each of these ways, the analysis serves to reinforce the overall lessons from the project.

The relative difficulty in gathering performance data meant we were unable to gather matching data for Learning to Learn. The confirmation of impact for Succeed with Math helps to support the similar findings from the educator and learner survey data across the two courses, but cannot be assumed to generalise to Learning to Learn.

It is clear that further research with a number of different cohorts would be required to fully understand the longer-term impact of Learning to Learn and, in the case of Succeed with Math, to examine the materials' impact in comparison to that of an alternative resource. The quantitative work therefore inevitably provides only part of the picture and needs to be brought together with other sources and with qualitative research with the project team, educators and learners.

Reflections

The Impact of Openness

In order to fully understand the use and impact of OER, it is necessary to evaluate "in the wild" or informal use of the materials. This was an important part of researching the impact of the Bridge to Success materials, as the courses were available openly on LabSpace and could be accessed at any time, by anyone, regardless of whether they were participating in a pilot or not. More detailed information on specific pilots' use of the materials, beyond that collected by Google Analytics, was therefore limited to what instructors voluntarily provided about how they were using the resources.

Consequently, generating robust quantitative research data on specific pilot cohorts whilst ensuring that the resource itself remained "open" presented Bridge to Success researchers with a methodological challenge. The approach taken was to look for ways to work with institutions to generate comparative datasets that could show whether OER materials impact learners. This proved valuable and helped confirm and communicate other results from the more qualitative aspects of the research. However, the scope for comparison depended on identifying parallel measures and matching groups, which was complex. We were only able to achieve this for one pilot, and, even then, the process would have benefited from earlier prioritisation and agreement.

Once the materials were openly available, the project could only do so much to encourage effective participation in many of the data collection activities, as they were not a pre-requisite for accessing the materials. Access to online Learning Journals, Learning Clubs where cohort groups could work together, pre- and post-quizzes, or an automatic record of progress was not a sufficient incentive to encourage people to register by creating a profile on LabSpace. Similarly, consistent collection of learners' data was made difficult as there was no guarantee learners remembered to log in each time they used the materials, particularly as they often accessed them from multiple devices.

Complexities of Data Collection and Analysis

Participation in research was optional for both learners and educators. Additionally, there was a dependency on instructor and institution willingness to participate in any qualitative data collection to complement what was being gathered via analytics. Although the feedback was varied enough to indicate that data was not skewed toward more "positive" experiences, the process of interviewing students and educators was resource intensive. Moreover, in projects such as Bridge to Success, where the potential to scale is high, this approach to gathering qualitative data is impractical beyond the creation of a number of case studies which provide deep understanding of how the resources are being used.

Although data was being collected in a number of different ways, data identifying individual students on LabSpace was often in a different format from that collected by institutions. Consequently, matching different types of data was difficult (e.g. email address vs. student identification number). Despite encouragement to log in when using the course materials, students could access them without logging in, and pre-questionnaire completion rates were low. It was also not clear to researchers why students stopped using the materials at specific points: had they reached the end of the materials selected for use by their instructor? Had users forgotten to log in to LabSpace, or decided not to? Had they decided that a particular section was too difficult, irrelevant or uninteresting?
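
As an illustration of this matching problem, the sketch below joins two toy datasets on a normalised email field. All column names and values are hypothetical, since the actual LabSpace and institutional schemas are not documented here.

    # Sketch: matching LabSpace profiles to institutional records.
    # One source identifies students by email, the other by student ID;
    # columns and values are hypothetical illustrations of the problem.
    import pandas as pd

    labspace = pd.DataFrame({
        "profile_email": ["A.Smith@example.edu ", "b.jones@example.edu"],
        "units_viewed": [8, 3],
    })
    college = pd.DataFrame({
        "student_id": ["0012345", "0067890"],
        "email": ["a.smith@example.edu", "c.wong@example.edu"],
        "semester_gpa": [3.1, 2.4],
    })

    # Normalise the only shared field before joining; records with no
    # common identifier simply cannot be matched, as the project found.
    labspace["email"] = labspace["profile_email"].str.strip().str.lower()
    matched = labspace.merge(college, on="email", how="inner")
    print(matched[["student_id", "units_viewed", "semester_gpa"]])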

Post-project suggestions from those who had accessed data one student at a time, trying to match it with data from a number of other sources (e.g. data from LabSpace and other analytics), indicated that tagging students within a college system when they begin using resources such as Bridge to Success may enable longitudinal tracking and yield the kinds of impact data needed. In addition, the consistent use of identifiers by students across different platforms would make more students identifiable and consequently yield more quantitative data on the project's impact. Minimising the variety of data sources used to track students' progress remains a central concern. However, it is difficult to see how impact data could be gathered without piloting the whole course (which would limit the potential of use inherent in open resources) or without students being encouraged to log in consistently and use consistent identifiers.

Additionally, as students did not complete the courses in a linear fashion, the pre- and post-questionnaires yielded only a small amount of comparative data. Whilst qualitative data from interviews indicated that the materials were impacting on students' learning, data gathered via these questionnaires did not reflect this. Succeed with Math, which contained unit quizzes, may have yielded data indicating that students were improving their math skills. However, without understanding the initial level of students' math skills, there was no comparator data. Additionally, as it was difficult to understand how students were using the Bridge to Success materials, it was difficult to ascertain whether progressive improvement (or otherwise) was directly connected to using the resources. Incorporating enough flexibility to accommodate varied patterns of use, partial use, etc. is a complex issue and one to which researchers must take a flexible approach, with (where possible) the ability to use a variety of ways to gather required data or identify other indicators of impact.

A more direct line of communication with instructors may also have ensured that a higher level of educator feedback was gathered. One option, which would also decrease the burden on individual project team members, would be to automate the process of gathering instructor feedback. This could be achieved by embedding an instructor questionnaire in the platform which houses the OER concerned. Ideally, no login would be required to participate in the questionnaire: data from (for example) the Succeed with Math Self-Reflection questionnaire shows that although 344 responses were received over the duration of the materials' use, the page itself was viewed on 1,428 occasions.

Crucially, it is also important to minimise barriers to participation. The first activity and introduction to both courses were the self-reflection pre-questionnaires and the login process associated with using them. These acted as a barrier to participation, particularly for groups of students with low or no computer literacy.[10] Balancing data collection with an understanding of the needs of users is essential. Gathering data differently via LabSpace or a similar platform would provide more detailed data on non-pilot use, but negotiation with institutions to obtain retention and other information would also be needed.

Lessons Learnt

This section outlines the lessons learnt from the mechanisms and process of researching the Bridge to Success course materials. Overall, the research was able to draw on many perspectives and sources, gained from qualitative and quantitative methods. Monitoring modes of use and interviewing educators showed how pilots were able to exploit the flexibility of OER. Instructor feedback gave a clear indication of their belief that the courses were effective for learners and improved retention. Talking to the learners themselves gave a more mixed message, with positive statements in interviews balanced by more moderate feedback from the questionnaires embedded in the materials. However, objective measures proved harder to find: we were limited by the range of comparative data we were able to use and found it difficult to introduce that aspect into existing studies. Where such data was gathered it supported the view that use of OER as supplementary material improves student performance, but it is not clear that the results can be extrapolated to other contexts.

We therefore advocate the adoption of a two-stage approach to the collection of quantitative data: a negotiation stage followed by the data collection. This quantitative element needs to be matched to qualitative information so it can then be understood in context in specific case studies. The two stages of this approach address both the need for an OER evaluation strategy to gather information on "in the wild" use (e.g. by placing evaluative mechanisms in the OER itself) and the need to formalise pilots from which comparative data can be gathered.

In the negotiation stage, access to quantitative data needs to be agreed with institutions during the initial setting up of a pilot. Researchers should consider their "ideal" data requirements and then agree the format, collection mechanisms, delivery of data, etc. in advance. Negotiation is emphasised because impact may well be captured through indirect measures, such as performance on subsequent courses or return to study, rather than activity within the OER. Researcher involvement in early-stage discussions is needed to ensure that the availability and parameters of data are clear from the outset, that data is gathered consistently, and that similar definitions are used by participating institutions.

It is particularly important to negotiate impact data early in the pilot process. The impact of Learning to Learn on students was more diffuse and difficult to gauge than that of a course such as Succeed with Math, where impact could be more easily measured (e.g. by examining test scores for improvements). In the case of the latter course, data regarding student performance, retention, test scores, etc. and the possibility of generating a non-user comparator group provided a valuable route to assessing the impact of the resources. It is also important to establish how the course will be used and whether a specific pilot will provide robust impact data. The integration of units or activities into pre-existing materials, while providing good examples of how the materials could be utilised in different settings and "wrapped" (see Coughlan, Pitt & McAndrew, 2013), also amplified the more general challenge of defining and understanding impact: how can one identify Bridge to Success materials as being responsible for specific student outcomes?

This concern was applicable to whole-course use of Learning to Learn. The materials focused on developing students' "soft skills" and therefore made gathering quantitative impact data more challenging. In this instance, one possible approach could have been to look at whether students persisted with their studies (which would perhaps indicate that the skills they learnt through the course were being put into practice). However, this would have been too speculative: there are many reasons why students may have persisted with their studies. Moreover, the benefits of undertaking a course such as Learning to Learn may not always be immediately apparent.

Although Bridge to Success was successful in gathering a significant amount of qualitative and anecdotal evidence to support the claim that students were benefitting from their use of the materials, it was only feasible to conduct a cross-comparison with a non-Bridge to Success user group at one project partner community college. Generating the data required by funders calls for close liaison with a select number of organisations willing to track students who participate in pilots, release demographic data, and provide a "control group".

This requires the development of relationships over time, with the reassurance that data would be treated in the strictest confidence. It became clear that some institutions regarded the release of data that could be used for comparative purposes as potentially controversial, as it opened up the possibility of comparison with other colleges. In setting up a project such as Bridge to Success, where student impact data is required to track student progress over time and create comparator groups, an initial negotiation period to discuss what data would be made available for research purposes, and in what format, is necessary. Many programmes utilising Bridge to Success did not have readily available control groups, or used only selected units or activities rather than the whole course. In these cases understanding the impact of the use of one section of a unit was impossible, despite positive anecdotal evidence as to what the material was bringing to a pre-existing course.

Conclusion

In reviewing our approach, we continue to advocate a mixed methods approach capturing different facets and contexts of OER use and leading to insightful use of impact data. Four aspects that made up our own research were:

  1. Extend an action research model to include other educators in the research processes to provide feedback on design, understanding of their ideas for use in context, and their view of the impact on learners.
  2. Increase access to learners through surveys, focus groups, and interviews to understand the effects on motivation and the personal impact.
  3. Highlight personal case studies captured in detail through recorded interviews, preferably with permission to share. These reveal the depth of factors from individual perspectives. Such illustrations apply to all project participants and are particularly valuable in communicating the views of the end user, in our case the learner.
  4. Gather comparative performance data as available. For formal learning such data may well be directly available and the focus for evaluation. For instances where learning takes place informally comparative data may be found by looking for impact in parallel or associated activities.

Combining methods allows a more holistic view to be taken and evidence brought in from multiple sources (McAndrew et al., 2012). The model shown in Figure 1 represents the intended combination of definitions, data collection from institutions, and data collection through the OER repository itself. We expect that this model will prove useful in planning the evaluation strategies of similar projects in the future.

Figure 1: A Model of OER Impact Evaluation

In practice many studies lack comparative data; however, such data is valuable and should not be ignored, and researchers need to acknowledge how it can strengthen findings. Similarly, a focus on improvement in metrics such as pass rates alone will miss the effect on individuals and the way openness can act to change attitudes as well as support results. If this combined approach is taken, the breadth of different contexts, such as community colleges, can be fully explored and analysed.

In this paper, we have explored the opportunities and complexities of evaluating an OER project through a complex case study. Our motivation in this was a realisation through conducting the project that this type of research has fundamental characteristics that require addressing. Although we have begun to unpack these issues, further work is required to refine the conceptual, ethical, social and technical aspects of research in this area.

Acknowledgements

The authors would like to thank all the project partner and pilot institutions, faculty, students, staff and supporters of the Bridge to Success project.

Bridge to Success was funded as part of the Next Generation Learning Challenges programme (funded by the Hewlett Foundation and the Bill and Melinda Gates Foundation). The authors would also like to thank the Horizon Digital Economy Research Hub (Grant EP/G065802/1) for its continued support for Bridge to Success. Further research into the impact of Bridge to Success post-project is being conducted by the Hewlett Foundation-funded OER Research Hub project.

References

Atkins, D. E., Brown, J. S. & Hammond, A. L. (2007). A review of the Open Educational Resources (OER) movement: Achievements, challenges, and new opportunities. pp. 1-84. Available from: http://www.hewlett.org/uploads/files/ReviewoftheOERMovement.pdf [Accessed 21 October 2013].

Bailey, T. & Cho, S.W. (2010). Developmental Education in Community Colleges. Available from: http://www2.ed.gov/PDFDocs/college-completion/07-developmental-education-in-community-colleges.pdf [Accessed 21 October 2013].

Calkins, A. & Vogt, K. (2013). Next Generation Learning: The Pathway to Possibility, EDUCAUSE Library. Available from: http://www.educause.edu/library/resources/next-generation-learning-pathway-possibility [Accessed 21 October 2013].

College Board (2013). "How to Convert Your GPA to a 4.0 Scale". Available from: http://www.collegeboard.com/html/academicTracker-howtoconvert.html [Accessed 21 October 2013].

Coughlan, T., Pitt, R. & McAndrew, P. (2013). Building open bridges: collaborative remixing and reuse of open educational resources across organisations, In 2013 ACM SIGCHI Conference on Human Factors in Computing Systems "changing perspectives" (CHI 2013). 991-1000. Available from: http://oro.open.ac.uk/36473/ [Accessed 21 October 2013].

Lingenfelter, P. (2012). The Knowledge Economy: Challenges and Opportunities for American Higher Education, In Oblinger, D. G. (ed.), Game Changers: Education and Information Technologies, EDUCAUSE, pp. 9-23.

Lovett, M., Meyer, O. & Thille, C. (2008). The Open Learning Initiative: Measuring the Effectiveness of the OLI Statistics Course in Accelerating Student Learning, Journal of Interactive Media in Education.

Masterman, L., Wild, J., White, D., & Manton, M. (2011). JISC Open Educational Resources Programme: OER Impact Study: Research Report. Available from: http://www.jisc.ac.uk/whatwedo/programmes/elearning/oer2/oerimpact [Accessed 21 October 2013].

McAndrew, P., Farrow, R., Law, P. & Elliott-Cirigottis, G. (2012). Learning the lessons of openness, Journal of Interactive Media in Education. Available from: http://jime.open.ac.uk/jime/article/view/2012-10 [Accessed 21 October 2013].

Richardson, J. T. E. (1996). Measures of effect size, Behavior Research Methods, Instruments, & Computers, 28(1), pp. 12-22. Available from: http://www.springerlink.com/index/10.3758/BF03203631 [Accessed 21 October 2013].

Rodgers, E. (2011). Measuring Our Impact: the Open.Michigan Initiative. In Proceedings of OpenCourseWare Consortium Global 2011: Celebrating 10 Years of OpenCourseWare. Available from: http://hdl.handle.net/2027.42/91308 [Accessed 21 October 2013].

Thomas, A., Campbell, L. M., Barker, P., & Hawksey, M. (2012). Technology for Open Educational Resources - into the wild. Available from: http://www.booki.cc/oer-tech/ [Accessed 21 October 2013].

US Department of Education (2013). Federal Pell Grant Program. Available from: http://www2.ed.gov/programs/fpg/index.html [Accessed 21 October 2013].

Appendix 1: Institutions that participated in Bridge to Success pilots

Institution name          Institution type                                     Participating   Low-income
                                                                               students        students
College 1                 2-year Associate degree & certification programmes   648             193
College 2                 4-year degree & Masters programmes                   180             7
College 3                 2-year Associate degree & certification programmes   32              32
College 4                 2-year Associate degree & certification programmes   710             30
College 5                 2-year Associate degree & certification programmes   47              47
College 6                 4-year degree & Masters programmes                   10              10
College 7                 2-year Associate degree & certification programmes   10              10
College 8                 2-year Associate degree & certification programmes   33              33
Non-profit 1              Non-profit organisation                              35              35
Family Support Centre 1   Family Support Centre                                8               8
Governmental Agency 1     Governmental agency                                  117             117

Student numbers are shown where final figures were known.


[4] Instructor Interview, June 2013.

[5] Instructor Interview, March 2012.

[6] Instructor Interview, June 2013.

[7] Instructor Interview, June 2013.

[9] For more on the project, see: http://oerresearchhub.org/

[10] Observation of a student cohort during March 2012 and instructor interview feedback.