- Emily Chapman-Waterhouse, Harper Adams University, UK
- Ayona Silva-Fletcher, Royal Veterinary College, UK
- Kim David Whittlestone, Royal Veterinary College, UK
This intervention study examined how animal- and veterinary nursing students interacted with reusable learning objects (RLO) while preparing for summative assessment. Data were collected from 199 undergraduates using quantitative and qualitative methods. Students accessed the RLO via personal devices to reinforce taught sessions. Interviewees reported that the RLO helped them meet the requirements of the curriculum. The quantitative data supported two key findings: a lack of student engagement when given a free choice, and a reluctance to self-assess. The practical significance of the qualitative outcomes lies in how first-year undergraduates with mixed equine experience, on animal and veterinary nursing-related courses, use RLO designed to address equine management and health topics.
The increased demand from learners in higher education to access study materials at any time, at any location and increasingly on a range of platforms including mobile devices has resulted in considerable development in the use of Reusable Learning Objects (RLO) across the sector (RLO-CETL, 2005; Jenkinson, 2009; Kurilovas et al., 2011; Windle et al., 2011; Windle et al., 2010). RLO, also known as Shared Content Objects (SCO), are self-contained digital resources such as videos, audio, web pages, documents and graphics which are stored and accessed independently and can be used to support web-based learning. Kay and Knaack (2007) expand on this, describing RLO as interactive tools which enhance and amplify the cognitive processes of learners. The literature indicates that one purpose of RLO is to enable students to learn new skills (Windle et al., 2010) within a controlled environment, at a range of difficulty levels and with arrangements for regular feedback (AAMC, 2007). Although a number of studies have examined the role of RLO in higher education, they originate mainly from medicine and health sciences education. Therefore, in the first instance, practice in veterinary education must draw on findings from other subject areas.
A number of researchers have identified that the underpinning rationale for developing RLO is wide-ranging, but those studies have emphasized flexibility (Johnson et al., 2013; AAMC, 2007), achievement of higher grades (Windle et al., 2011; Lymn et al., 2008; Trowler, 2010; Bacsich et al., 2011), meeting the needs of professional practice (Windle et al., 2011; Windle et al., 2010; Keefe & Wharrad, 2012; DoH, 2011; Evans, 2013; Blake, 2010) or those of institutions (Johnson et al., 2013; AAMC, 2007; Concannon et al., 2005; Evans, 2013; Kurilovas et al., 2011), as opposed to attempting to impact student learning as a whole. Firstly, to help students achieve higher marks in summative assessment and/or an improved overall outcome (Trowler, 2010), educators typically supplement face-to-face teaching (Lymn et al., 2008) with additional learning resources. The need to do this may in part be explained by the challenging nature of a subject for some students (Windle et al., 2011; Lymn et al., 2008). It has also been reported that some students feel they lack time to study content-heavy modules, so they take a superficial approach to their studies, over which they feel they have limited control (Windle et al., 2011). To be effective, RLO require students to actively engage with the content (Johnson et al., 2013; AAMC, 2007). Student engagement per se is the extent to which students take an active role in a range of educational activities, a process likely to lead to high-quality, deeper learning (Trowler, 2010). Furthermore, formative assessment as a function within RLO would be advantageous in preparing students for high-stakes summative assessment. RLO have been found to have a significant effect on examination results (Windle et al., 2011; Keefe and Wharrad, 2012), where RLO users have achieved improved performance in assessment over non-users (Johnson et al., 2013).
Secondly, like other vocational disciplines, medical and veterinary sciences are subject to change in professional practice or policies (Windle et al., 2011; Blake, 2010), with typically profession-driven curricula (Keefe and Wharrad, 2012). Both these issues could be effectively addressed via RLO. It is important to note that high examination results and professional competencies have been considered as separate variables affected by RLO use, although based on the principle of active engagement, one could argue that engaged students may achieve both high examination results and the required professional competencies following RLO use. Researchers have found that a number of other desirable outcomes are affected by RLO use, including learning experience (Blake, 2010), critical thinking, practical competence, skills transferability, cognitive and psychological development, self-esteem, formation of identity, moral and ethical development and student satisfaction (Trowler, 2010; Sandlin et al., 2014). Lastly, development of RLO has in some cases been driven by the need for institutions to save money (Johnson et al., 2013; Kurilovas et al., 2014), be more competitive and attract a wider cross-section of the potential student market. In addition, institutions have in some cases needed to reduce staff contact time within a module (Johnson et al., 2013).
The reasons why students choose to use RLO are reported as being affected by a number of factors, one being the student’s prior experiences (Bacsich et al., 2011; Kirkwood, 2008; Littlejohn et al., 2010). Use occurs where students have a positive attitude towards computers and prefer to use technological educational resources (Concannon et al., 2005). Commonalities exist between the reasons students choose to use RLO and how educators should approach the development of such resources. In terms of the practical implications for this study, exploring why and how students use online learning resources is a starting point. Therefore, the present study was designed to answer why and how some students choose to access web-based RLO and others choose not to, in relation to preparing for animal and veterinary science-related assessment.
Data were collected from a total of 205 students, equating to 82% of the total eligible student population in their first year of a degree (BSc) or foundation degree (FdSc) in an animal- or veterinary nursing subject at Harper Adams University (HAU). The eligibility criterion for this study was that students were undertaking the core module, A4016 Large Animal Management (LAM). There was no pre-requisite for evidence of equine-specific learning; participants therefore had mixed prior equine experience.
The course areas represented in this study were Animal Behaviour and Welfare, Animal Health and Welfare, Bioveterinary Science, Veterinary Nursing, Veterinary Nursing with Practice Management, Veterinary Physiotherapy and, lastly, Animal Management and Welfare. All the students who took part in the study volunteered to do so, attended a face-to-face briefing and signed a participant consent form. The study consisted of two cohorts of students who undertook and were assessed in LAM in either 2011/12 (n = 98) or 2013/14 (n = 107). Students were grouped as follows:
- Group i: 2011/12 – who did not have access to the RLO.
Then, of the 107 students undertaking LAM in 2013/14, all of whom were given access to the pre-test and RLO, the following groups were identified based on what students themselves chose to access (student access data were collected from user logs):
- Group ii: 2013/14 – who accessed neither the pre-test nor the RLO.
- Group iii: 2013/14 – who accessed the pre-test, but did not utilize the RLO.
- Group iv: 2013/14 – who accessed both the pre-test and the RLO.
- Group v: 2013/14 – who did not access the pre-test, but did utilize the RLO.
Groups i and ii were the control groups because they did not access either the pre-test or the RLO. Group sizes were pre-determined by the enrolment figures for 2011/12 and 2013/14 respectively, as LAM was a core component of those courses. Data were collected via paper-based questionnaires, an online quiz hosted in the HAU virtual learning environment, Moodle (pre-test), a paper-based written examination (post-test), interviews, and an online survey. Module delivery was not modified for the purpose of this research. Data were not collected during the academic year 2012/13, as this was a developmental year in terms of the RLO. Participants did not receive any payment, but interviewees were offered refreshments at the time of the interviews. The study was approved by the HAU Research Ethics Committee. As one of the research questions within this study related to how students used the RLO, instructions to this effect were not prescribed to participants beforehand. At the point the RLO were launched, as well as during the marking of the post-test, the self-assignment of students to groups was not known by the researchers or the students. This was facilitated by delaying the researchers' access to the Moodle user logs until after the post-test.
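The self-assignment of 2013/14 students to groups ii–v described above reduces to a simple rule over two booleans derived from the Moodle user logs. A minimal sketch (the function name and log representation are illustrative, not part of the study's actual tooling):

```python
def assign_group(accessed_pretest: bool, accessed_rlo: bool) -> str:
    """Map a student's Moodle access pattern to one of groups ii-v.

    accessed_pretest: True if the logs show a pre-test attempt.
    accessed_rlo:     True if the logs show at least one RLO view.
    """
    if not accessed_pretest and not accessed_rlo:
        return "ii"   # accessed neither the pre-test nor the RLO
    if accessed_pretest and not accessed_rlo:
        return "iii"  # pre-test only
    if accessed_pretest and accessed_rlo:
        return "iv"   # both pre-test and RLO
    return "v"        # RLO only

# Example: logs show a pre-test attempt but no RLO views
print(assign_group(True, False))  # -> iii
```

Because the rule is exhaustive over the four access patterns, every 2013/14 student falls into exactly one group; group i (2011/12) is assigned by cohort rather than by log data.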
Within this study, Moodle access logs were collected, comprising which students had selected which RLO, on how many occasions, and at what time and date. The known limitations of these data meant that they were not robust evidence of student engagement and hence could not be used to infer impact on assessment performance.
This was an intervention study using RLO as the intervention, which in this case were five web-based narrated videos (defined as RLO as per the definitions of Valderrama et al. (2005) and Windle et al. (2010)), organized into chapters, each covering a key equine health management topic: farriery, dentistry, weight assessment, body condition scoring and worming. Each RLO lasted between 10 and 25 minutes and was delivered in English. The RLO were produced by two members of academic staff with equine backgrounds and an e-learning technologist during 2011/12 – 2012/13. All staff had prior experience in creating learning resources of this type. All five RLO were made available concurrently and continuously to all students ten days prior to the post-test via Moodle, which was the sole route to accessing them. Notification of RLO availability was initially presented to students via e-mail, including a hyperlink to the relevant course page in Moodle, immediately after the resources were made accessible to view. Instructions on how to use Moodle would have been issued at the start of the academic year and, given the time of year the RLO were launched, the authors expected participants to have accumulated ample Moodle user experience by that stage. A hyperlink to each RLO was placed in a pre-determined topic in the LAM Moodle course page, set up so that a new window opened when an RLO was selected. The incentive for students to use the RLO was as a revision aid for the post-test, and therefore frequency of access to the RLO was unlimited. To encourage engagement with the RLO, a reminder e-mail was circulated two days prior to the post-test. Access data for participants in groups iv-v were exported from Moodle and saved in MS Excel format.
The pre- and post-tests each comprised twenty multiple-choice questions (MCQ) and were moderated prior to use by two separate academics at HAU, in line with the university assessment regulations. The pre-test was administered via Moodle, being available to all students in groups ii-v. The post-test was administered to groups ii-v as a paper-based written examination, under examination conditions. Equivalent post-test performance data for group i were collected from the HAU student records system and recorded anonymously. The pre-test (which from a student perspective was a revision quiz) was made available to groups ii-v via Moodle immediately following the LAM taught sessions in 2013/14 but preceding access to the RLO, with results being exported from Moodle and saved in MS Excel format.
On the basis of the pre- / post-test score difference in combination with RLO access data, students in groups iii and iv were organized into four categories, namely:
- Low score difference and did not use the RLO;
- Low score difference and did use the RLO;
- High score difference and did not use the RLO;
- High score difference and did use the RLO.
For this purpose, a low score difference equated to a negative or nil difference between pre- and post-test scores, whereas a high score difference equated to a positive difference within the range 6 – 68%. In addition, students were categorized as ‘did use the RLO’ where Moodle usage data indicated they had accessed one or more of the RLO. Students in group iii were categorized as ‘did not use the RLO’ where they had not accessed any of the resources. One student from each of the four categories referred to above was randomly selected and invited by telephone (with a follow-up e-mail) to a semi-structured face-to-face interview. The interview questions were piloted three months earlier with three female second-year students, who were outside the study group but studying across the same course areas. The interview questions related to themes informed by the RLO-CETL evaluation toolkit (RLO-CETL, 2005), with the purpose of addressing both the ‘why’ and ‘how’ aspects of accessing RLO. Interviews were conducted by a qualified journalist, who was not the primary researcher. Interviews were recorded digitally, with the permission of the participants, and transcribed by an external organisation. Interview transcripts were then analysed using a process of open coding and theory-related themes.
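The four-way categorization above combines one score-difference band with one RLO-use flag. A minimal sketch of that rule (the function and its labels are illustrative; score differences are expressed in percentage points, as in the study):

```python
def categorize(score_diff: float, used_rlo: bool) -> str:
    """Assign a student to one of the four interview-sampling categories.

    score_diff: post-test score minus pre-test score, in percentage points.
    A 'low' difference is nil or negative; a 'high' difference is positive
    (observed in this study within the range 6-68 percentage points).
    used_rlo: True if Moodle logs show the student accessed any RLO.
    """
    band = "high" if score_diff > 0 else "low"
    use = "did use" if used_rlo else "did not use"
    return f"{band} score difference and {use} the RLO"

print(categorize(-4, False))  # -> low score difference and did not use the RLO
print(categorize(12, True))   # -> high score difference and did use the RLO
```

One student per category was then randomly sampled for interview, so the categories act as a simple purposive sampling frame over score change and RLO use.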
Concurrently with the semi-structured interviews, a modified RLO-CETL evaluation was administered to groups iii-v via a Bristol Online Surveys (BOS) online survey; responses were collated and exported to MS Excel format for later analysis. Use of a web-based platform enabled prompt exporting and filtering of data.
The planned timings for data collection are shown in Table 1.
Table 1. Timings of data collection for each stage of the study
| Data | Week during the 2013/14 Academic Year |
| --- | --- |