THE ROLE OF STAKEHOLDERS’ EVALUATION ON THE QUALITY OF REUSABLE LEARNING OBJECTS FOLLOWING THE ASPIRE PARTICIPATORY FRAMEWORK

As the population is becoming increasingly culturally diverse, there is a growing need for nurses to provide culturally competent care. It has been suggested that health disparities exist among ethnic minorities and that providing culturally competent care could reduce these disparities. The aim of the "TransCoCon: Developing Multimedia Learning for Transcultural Collaboration and Competence in Nursing" (GA No 2017-1-UK01-KA203-036612) ERASMUS+ project has been to create innovative, accessible multimedia learning resources that will enable undergraduate nursing students and registered nurses in five countries to develop knowledge and skills that build the self-efficacy to influence direct patient care. A participatory approach based on the ASPIRE framework was followed in order to develop the multimedia learning resources, namely Reusable Learning Objects (RLOs). ASPIRE stands for Aims, Storyboarding, Population, Implementation, Release and Evaluation, and the framework includes the following steps: i) participatory workshop, ii) specification writing, iii) peer review of the specification, followed by amendments, iv) development of the RLO, v) review of the RLO, followed by amendments, vi) evaluation with stakeholders, followed by amendments, vii) publication of the RLO online. The creation of an RLO is a time-consuming process, but in order to ensure its quality, peer review of the content (specifications) before development, content and technical review once developed, and evaluation by the stakeholders after amendments are considered vital steps in the development process. This paper presents the evaluation of RLOs with stakeholders at two different stages of their development process, aiming to discuss the value of stakeholder evaluation at different stages of the process and not only once complete.
Four RLOs developed by four partners of the TransCoCon project were evaluated: two after the initial specification had been developed using a bespoke specification tool from the University of Nottingham HELM team, which allows the specification to be previewed as an RLO without the interactivity element, and two at the development stage. Each RLO was evaluated by between 23 and 26 nursing students or registered nurses. The result of the evaluation was generally consistent across the four RLOs. Most participants agreed that the RLOs were clear about their objectives, easy to navigate and introduced new concepts. In addition, the majority of the participants agreed that the content was appropriate for the topic and enjoyed learning on their own. In terms of attributes that have contributed, or might contribute once developed, to their learning, interactivity, self-test exercises, working at one's own pace, and the ability to access the RLO anytime and from anywhere were found important by most participants. Since the RLOs were at a prototype stage or presented as a specification preview, technical problems were identified by the participants in all four RLOs. Technical problems seem to influence the participants' perception of the reuse of other RLOs. Over 60% of the participants would like more of these learning objects in their university modules and over 50% intend to use the learning objects again; however, these numbers are low in comparison with evaluations of complete RLOs.


INTRODUCTION
Innovative approaches in health care professional education have received much attention in recent years. There is a recognition, in the evolution of teaching methodologies, of the need to employ interactive e-learning methods that recognise the technological age nurses are practicing in. Following this line of thought, the TransCoCon project [1], an Erasmus+ project, aimed to co-create Reusable Learning Objects (RLOs) to enable undergraduate nursing students and registered nurses to develop their cultural self-efficacy and cultural competency. The high quality of the RLOs was ensured through a peer-review process and through the engagement of stakeholders.

Reusable Learning Object (RLO)
A Reusable Learning Object (RLO), according to Wiley [2], is "a digital resource that can be reused to facilitate learning". The RLO term can be used to describe a range of things depending on the context of its use or the user: anything from a single digital image to a whole online module.
Students in health sciences respond well to e-learning resources that are interactive, visual, small in size and highly aligned with their learning needs [3]. These findings are in line with the cohesion and decoupling model of RLOs proposed by Boyle and Cook [4] and adopted by the TransCoCon project [5]. TransCoCon RLOs are small, granular, and highly focused on a single learning goal, and generally utilize multimedia elements to engage the learner in a visual and interactive learning experience [6]. In contrast to a whole online course, they are small stand-alone units of learning which can be used in many ways and across interprofessional disciplines. They normally represent between 5 and 15 minutes of learning, which makes them more valued and considered more effective by the learner [7]. The use of multimedia is a key characteristic of RLOs. Digital media are very important in the learning process and, according to cognitive load theory [8], design principles have a big impact on how people learn.
Another characteristic of RLOs is that they are designed to be reusable in different contexts [2] and offer flexibility of study through 'anytime, anywhere' availability [9]. This allows learners to access these resources at a convenient time and to revisit them as many times as needed. Moreover, they can be very cost-effective, as they can be reused on different courses and in different institutions, or even across interprofessional disciplines, making the final cost per student minimal.

The TransCoCon ERASMUS+ Project
TransCoCon is an Erasmus+ strategic partnership in Higher Education of five European higher education institutions from the UK, Germany, Portugal, Ireland and Belgium. It focuses on enhancing cultural awareness and promoting transcultural competence in nursing, as it has been suggested that health disparities exist among ethnic minorities and that providing culturally competent care could reduce these disparities [10].
The TransCoCon project partners are experienced nurse educators from across Europe who appreciate that students learn in different ways. Within the project, partners worked collaboratively to identify similarities and differences within each other's health care systems. This process enabled the free flow of knowledge and ideas, and an understanding of the linguistic differences, health care expectations, patient needs and nuances of the individual countries [1], [5].
Five RLOs were developed, each of which represents about 15 minutes of learning addressing one learning objective. Each RLO within the suite is an individual standalone learning tool; together they follow the patient journey and show how nurses can influence patient experience and health outcomes through having an awareness of individual patient needs. The educational content of each RLO primarily focuses on the topic of cultural competence in healthcare, utilizing cultural competency frameworks to enhance nursing cultural competency in clinical practice across the five partner countries and beyond.

The Co-Creation Process and the Quality Assurance of RLOs
A participatory framework called "ASPIRE" was used in the development of the RLOs. ASPIRE stands for Aims, Storyboarding, Population, Implementation, Release, and Evaluation [11]. The concept of involving stakeholders in the development stems from the value of Wenger's community of practice model of organisation [12]–[14]. In participatory design, where stakeholder views and expertise are harnessed in the production of digital learning, three elements are important. First, members of a "community of practice" should have a sense of connectedness and be able to effectively communicate as a group. Second, they should have an active and meaningful role within that community. Third, members of a community of practice should feel empowered. Although such an approach can be time-consuming, Windle and Wharrad [15] highlighted that placing tutors and students at the centre of the development process should ensure that the resources produced are highly aligned to real-world learning needs. They found that this approach is a highly effective way of unlocking content and bringing many people into the e-learning development process for the first time.
The ASPIRE framework, as described in [11], [16], can be translated into a series of concrete steps. Stakeholders come together in a storyboarding workshop and define the aim, i.e. the exact learning objective that the RLO should address. They then storyboard the resource itself. A content expert then creates the specifications, a document describing in detail the content of the RLO. A review of the specifications is then carried out by other experts in the field. Once the adjustments to the specifications have been made, the development process starts, following the RLO specifications. The RLO is then reviewed again, both technically and in terms of content, and once corrections are implemented a user evaluation follows. Packaging and distribution are the last steps in the process.
The evaluation and review of an RLO is embedded at different stages of the development process in order to ensure the quality of the educational resource [17]. The check-approve method usually involves peer review or inspection and can have three different steps: i) self-reflection, ii) peer review and iii) external review [18]. During the RLO co-creation process, the specifications are created with a bespoke HELM tool in which the author has the ability to self-reflect on what is written through a preview mode. The specifications and the developed RLO are peer-reviewed by external experts, who judge them and make suggestions for improvement in two iterative steps of the process. The final RLO is externally reviewed and evaluated by the users in order to receive final approval for distribution.
While peer review and evaluation are indeed important parts of the quality assurance of a Reusable Learning Object, there is no definite answer as to at which point, or how many times, the experts and/or stakeholders should be involved in the process. Existing research has focused on the general benefits of involving stakeholders in the development of RLOs; minimal research has investigated at what stage stakeholder evaluation should be carried out. The current study aims to explore the role and the potential benefit of stakeholders' evaluation at different development stages of an RLO following the ASPIRE framework.

METHODOLOGY
Following the ASPIRE participatory framework, we evaluated four RLOs at two different stages of their development process. We used a modified questionnaire adapted from the RLO-CETL Evaluation Toolkit [19], [20], which includes a part assessing the digital competences of the participants.
The participants in the study were attendees of a TransCoCon training event. Initially, we presented the ASPIRE framework and the development methodology to the participants in order to make them aware of the context. Then, all the RLOs were presented by topic and an explanation of the different implementation stage of each RLO was provided. Participants were given four links, one for each RLO. Each link pointed to a pre-evaluation survey followed by the RLO or the preview of the specifications, followed by a post-evaluation.
The four RLOs used were developed by four partners of the TransCoCon project. The development stages of the RLOs were as follows: two were at the stage after the initial specification had been developed using the University of Nottingham HELM team's bespoke specification tool, which allows the specification to be previewed as an RLO without the interactivity element, and two were at the initial stage of development. While the original development process foresees two review stages, by topic experts and technical experts, we altered it by conducting an evaluation with stakeholders. Figure 1 depicts the stages of the development process at which stakeholders provide their evaluation.
Furthermore, the experts' evaluation of the specification of each RLO, which takes the form of open-ended questions, was compared with the stakeholders' evaluation, which takes the form of a closed questionnaire. The responses from the experts and the stakeholders were matched in the following areas of concern: Content, Text, Visual Elements, Interactivity, Assessment, Sequence and Flow, Complementary Information (Further Resources, Glossary, Keywords) and Knowledge Transfer, while Notes or Other Comments were permitted for both evaluators. Table 1 depicts the correspondence between the categories. Knowledge Transfer was measured both on a 5-point Likert scale, addressing specific gaps in stakeholders' knowledge and retention of knowledge in this area, and as a pre-post evaluation of knowledge and confidence improvement on a 10-point scale. A two-tailed Wilcoxon signed-rank test was conducted on the latter two questions, 'How would you rate your current knowledge and understanding of transcultural nursing?' and 'How would you rate your confidence in practicing nursing in a transcultural environment?', pre- and post-completion of the RLO.
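As an illustration of this analysis step, the paired pre/post comparison can be sketched in a few lines of Python. This is a minimal sketch only: the scores below are invented for illustration (not the study's data), the function name is ours, and the p-value uses a normal approximation, whereas the study's statistical software may compute an exact p-value.

```python
# Hedged sketch of a two-tailed Wilcoxon signed-rank test on paired
# pre/post ratings. Scores are illustrative; normal approximation is used.
import math

def wilcoxon_signed_rank(pre, post):
    """Return (V, z, p): V is the sum of positive ranks (as reported by R),
    z the normal-approximation statistic, p the two-tailed p-value."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]  # drop zero diffs
    n = len(diffs)
    # Rank the absolute differences, averaging ranks over ties
    abs_sorted = sorted(abs(d) for d in diffs)
    ranks = {}
    i = 0
    while i < len(abs_sorted):
        j = i
        while j < len(abs_sorted) and abs_sorted[j] == abs_sorted[i]:
            j += 1
        ranks[abs_sorted[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    v = sum(ranks[abs(d)] for d in diffs if d > 0)  # sum of positive ranks
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (v - mean) / sd
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-tailed
    return v, z, p

# Illustrative self-rated scores (0-10 scale) before and after an RLO
pre  = [3, 2, 4, 3, 5, 5, 3, 4, 2]
post = [5, 4, 6, 5, 4, 6, 5, 6, 4]
V, z, p = wilcoxon_signed_rank(pre, post)
print(f"V = {V:.2f}, z = {z:.2f}, p = {p:.3f}")
```

In practice this would typically be delegated to a statistics package (e.g. R's `wilcox.test` with `paired = TRUE`, or `scipy.stats.wilcoxon`), which report the same V statistic for paired samples.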

RESULTS
Stakeholders evaluated the four RLOs or specifications within a TransCoCon event with 29 participants. Evaluation was voluntary; thus there were between 23 and 26 participants in each RLO evaluation, with the majority of them (between 73% and 87%) being nursing students and the rest registered professionals. The list of RLOs and their development stages is presented in Table 2.
Participant confidence in using computers was very high (96.3%), with only one participant feeling unconfident. All the participants were confident in using web tools (such as browsing a web page or using email), while 13% were not confident in using multimedia (video, audio) and 21.7% were not confident in the use of "Office" tools. The use of eLearning was perceived as rather not useful by 43.4% of the participants, with 34.8% rating it in the middle of a seven-point scale (Strongly disagree - Strongly agree) and only 21.7% perceiving it as useful, while the use of eLearning was considered an easy task by 65% of the participants and not so easy by 13%. eLearning was neither promoted by people whose opinion the majority of participants value or are influenced by, nor considered enjoyable or fun by the majority of the participants, nor something they were required to use.

The feedback from the experts and the stakeholders varied in places. While there are some evaluation areas in which the experts could not provide an actual review, there are others in which the stakeholders do not have the capacity to do so. As depicted in Table 3, in the evaluations of the content of the RLO and whether the RLO meets its objectives, the experts can be more detailed and to the point, while the stakeholders evaluate whether the RLO's content is relevant to the learning objective/topic addressed. The knowledge that can be acquired from the RLO cannot be easily reviewed by the experts, but the stakeholders can already indicate at the specification stage whether an RLO can improve their knowledge. Furthermore, self-declared ratings before and after seeing the specification on this topic revealed an increase in both knowledge and confidence for both RLOs. For the UK RLO, there was no significant increase in knowledge and understanding of transcultural nursing. However, the confidence score increased significantly (V = 45.00, z = -3.21, p = .001).
There was a significant increase in knowledge (mean = 4.86) compared to the pre-measure (mean = 2.6) for the Belgian RLO. No significant increase was found in the confidence score.
Text clarity and accuracy, as can be seen in Table 3 (UK RLO), may be perceived differently by experts and stakeholders. As the stakeholders were non-native English speakers, the UK-English text might be more challenging in comparison with text written in international English, while this is not considered an issue for an expert. Visual elements at the specification stage are not always in their final form, and sometimes there might be a description of an image and what it should depict instead of the image itself. It might therefore be easier for the experts to provide a better review, linking the concept of the image with the content, than for the stakeholders, who would have to think as an academic or a content-expert creator in order to visualise it. The same may apply to interactivity, and the answers of the stakeholders at the specification stage may reveal their need and willingness to see interactive elements in the RLO. High ratings for the Assessment from students reveal the importance and the relevance of the designed assessment to the RLO, while experts' reviews provide comments not only on the relevance to the topic, but also on the type and the number of assessments. Sequence and flow can be equally well reviewed by the participants; however, if the flow contains multiple interactive elements, that would be more difficult for the stakeholders. Complementary Information, such as Further Resources, Glossary and Keywords, is mainly expected from the experts' review, since they have full knowledge of the topic. Notes or additional comments from experts concentrate on improvements for the RLO, while stakeholders' comments concentrate on elements that did not work well during the evaluation review. At the prototype stage, not all interactive elements are in place, and some of the audio and visual elements might be missing or might not have been processed yet.
The two RLOs used were at different stages of development: the Irish one was at a very early stage, with most of the interactive elements incomplete, while the German RLO was at a more advanced stage. While the content had already been reviewed once, comments were expected at this stage as well. Sometimes some wording around the learning objectives needs fine-tuning, or some phrases need to be rewritten to better fit the visual aids, for which experts' opinions might be more helpful. Stakeholders, similarly to the specification stage, can provide better justification of whether knowledge can be enhanced, although the results should always be interpreted carefully at this stage, as there might be other elements (e.g. interactivity) that influence them. The self-declared ratings before and after seeing the RLO on this topic revealed an increase in both knowledge and confidence for both RLOs. For the Irish RLO the increase was not significant in either knowledge or confidence, while for the German RLO there was a significant increase in knowledge and understanding, with 63% of participants rating 'Good' or 'Excellent' in the post-evaluation compared to 33% in the pre-evaluation. Although the result of the two-tailed Wilcoxon signed-rank test was not significant for confidence in practicing nursing in a transcultural environment (V = 9.00, z = -1.63, p = .102), a couple of participants who rated very low in confidence (1-3 on the scale of 0-10, with 10 being very confident) rated 1 or 2 points higher in the post-evaluation. Similarly to the specification stage, the text might be perceived differently by the stakeholders depending on whether they are native speakers and whether they are aware of the specific terminology, while this is not usually an issue for an expert. As at this stage some of the visual elements have already been created, the stakeholders' review can be as valuable as the experts' review.
The dependence on the level of development of the visual elements is evident in Table 4, in which the Irish RLO, which had few visual elements ready, scores lower than the German one, which had all the visual elements created. This is also reflected in interactivity: the importance of including interactivity was rated at 100% for the Irish RLO, which is heavily interactive but whose interactive elements had not been created yet. Thus, the experts' review of this attribute can be considered more important the less developed the interactive elements are. For the Assessment review, similarly to the specification stage, experts' reviews provide comments not only on the relevance to the topic, but also on the type and the number of assessments, while students focus on the relevance part. Sequence and flow are the same as at the specification stage, although for the more complete RLO both evaluators can have a clearer picture. Complementary Information, such as Further Resources, Glossary and Keywords, can usually be seen and explored at this stage by both experts and stakeholders, but the main comments are still expected from the experts' reviews. Notes or additional comments from experts, as at the specification stage of the RLOs, concentrate on improvements for the RLO, while stakeholders' comments concentrate on accessibility issues.

DISCUSSION
The accuracy of the Content is something that experts in the field can better evaluate, while how it links to the objectives, and how clear the learning objectives and their relation to the content are, can be assessed by both in all stages of development. Knowledge Transfer is an attribute that, as shown, can be evaluated better by stakeholders than by experts, and given that the ASPIRE participatory approach is based on the same principles of the wisdom of the community [14], [21], stakeholders' evaluation is needed. The review of text clarity and accuracy can benefit from both roles at the Specification and Prototype stages of development. The experts can easily check for correct terminology or language errors, while the stakeholders can identify whether the language used is understood by them [22].
For the review of Visual Elements at the Specification stage, at which they have not yet been created, RLO development might benefit more from experts' evaluation, as they can more easily link the pedagogy and the concept with the learning outcome [23]. This can change as the RLO moves to the final prototype stage, where the visual elements have been created and the stakeholders can provide more information on whether they are helpful for them or not. The experts can focus more on the actual pedagogy behind the Interactivity of the educational resource and its role in experiential learning [24]; thus, similarly to the visual elements, their review is more beneficial closer to the Specification stage. The experts can better tailor the number and the type of the assessments needed, and find it easier to understand their role in the learning journey of the learner [25], either as self-reflection or as an assessment against a specific learning objective; hence their reviews seem to be more valuable.
Sequence and flow are influenced by the interactivity engaged in the flow, but as revealed by our comparison, both evaluators can provide a beneficial review. The experts are more familiar with Complementary Information, as they are the experts in the topic and know what additional information exists. The stakeholders, while having the opportunity to provide additional information, usually perceive themselves only as users of the RLOs; thus they provide comments around the accessibility of the resources, while experts provide more in-depth comments.
The role of users in the co-design process has been highly valued; however, it has also been criticized [26], with critics claiming that "User studies can easily confuse what users want with what they truly need" [27]. This work also has limitations. We compared open-ended questions with closed ones, and a further evaluation of the RLOs once completed, by both experts and stakeholders, could act as a baseline evaluation, providing more credibility and validity to the results. While these critiques are sound, this piece of work already provides some consideration of the benefits of involving stakeholders during the development of an RLO. Further research is needed to better examine the cost-benefit relationship of including stakeholders at different stages, as this can considerably increase the cost of the RLO.

CONCLUSIONS
The input of experts' reviews has been proven useful in the development of many RLOs [7], [28], but the role of stakeholders' evaluation on the quality of Reusable Learning Objects following the ASPIRE participatory framework has not been widely discussed. We identified 10 areas of concern for receiving feedback and, through an analysis of the evaluations of both experts and stakeholders, aimed to determine whether the stakeholders' evaluation is needed and in which areas it is most beneficial. Table 5 summarizes the preferred role's feedback, without excluding the other role's feedback, but considering which one might be more beneficial for the RLO development. Another aspect that was taken into consideration is the stage of development of the RLO: Specification, Early prototype, or Final prototype.