
Rethinking online education for external audiences using UX design

Short title: Applied UX Design for Educators





Authors: Dora Obodo (graduate student at Vanderbilt),

Deborah McCabe (MICA UX Design Peer), &

Kendra H. Oliver (Me)

 

Overview:

User Experience (UX) design applies a user-centered design process to develop new technologies that improve the student experience. Today, educators and university administrators are interested in leveraging technology to better engage current and new audiences. However, case studies applying UX to course design are limited and largely absent from the chemistry education field. Here, we present a case study detailing the UX-based approach used in the design evolution of a drug discovery course for external audiences. We found that our external users had extremely low completion rates when the courses were delivered through a traditional learning management system. We followed the UX double diamond design process to identify design challenges and address them strategically from a user-centered perspective. We constructed personas of our users, identified key features, and developed low-fidelity, mid-fidelity, and functional prototypes for the new user-based course with iterative usability tests. This case study demonstrates the impact UX design can have on online chemistry content development by universities and institutions that want to engage external users.


 

INTRODUCTION

 

Chemistry is multifaceted, and the pharmaceutical industry (Pharma) creates many innovations that touch all parts of our everyday lives 1,2. Drug discovery therefore requires a multidisciplinary perspective: integration between pharmacology and medicinal, analytical, and biological chemistry is needed to champion the research and development of drugs for common and rare diseases, establish and formulate those drugs, and ultimately bring them to market 3–5. As an ever-growing field in which ideas are constantly changing, Pharma offers many opportunities to young scientists entering the field and to veteran scientists looking to advance their career trajectories 6–8. These chemists seek state-of-the-art training in chemistry and the pharmaceutical industry and turn to continuing education opportunities 9,10.

 

As a result, leaders at the forefront of academic and pharmaceutical innovation are increasingly investing in training for pharmaceutical chemists and professionals 9. Digital and online resources like massive open online courses (MOOCs) have been used to deliver didactic content. Platforms such as Coursera (founded by Stanford faculty), MIT OpenCourseWare, and EdX (from Harvard and MIT) offer several courses related to drug discovery and design, which often aim to teach skills and develop the core competencies required to build a pharmaceutical career 11–13. Compared to degree-granting programs, these systems often prioritize accessibility and inclusivity for non-traditional and non-academic students 14–16. However, designing effective digital chemical education courses remains challenging, compounded by the fact that chemistry is often perceived as hard to learn or difficult to engage with virtually 17,18. Thus, higher education chemistry and science departments often struggle with how best to deliver this content to non-traditional students 15,19,20.

 

A better understanding of how to design online programs centered on user experience (UX) may allow universities to better engage non-traditional pharmaceutical professionals 21. Traditionally, digital teaching occurs through learning management systems (LMS) such as Blackboard, Brightspace, or Canvas. These platforms centralize technologies and teaching resources online for learners 22–24. LMS offer many benefits for teaching chemistry. For example, teachers can use an LMS to organize and share recorded lectures, materials, and assessments. Depending on the instructional design approach, students can use these materials to revisit content (self-directed learning) or to prepare before coming to class (flipped classroom) 25. Typically, LMS support in-person training; however, the COVID-19 pandemic brought to light the limitations of these systems 26–28. This transition also highlighted the distinction between instructional design, which aims to transfer knowledge or skills to learners, and UX design, which aims to create a usable (typically digital) experience 29–32. For digital learning environments to successfully engage learners, they must account for the intended audience's distinct pain points and behaviors. Behavior-based measures of platform success, such as course completion rates, may provide a better indication of educational platform performance than more traditional pedagogical measures. Educators typically do not consider UX-based design strategies, which could serve as alternative, complementary approaches to support accessibility and inclusivity when designing online resources for non-traditional learners.

 

Designing courses based on UX design principles offers several benefits 21,33. In general, trends in digital learning show low student retention, often with 30%–50% completion rates in LMS-style courses and rates approaching 40% for MOOCs 34–39. These rates may be attributed to poor and often unintuitive course design 35,40,41. These courses may also require logging into various platforms, which hinders the learning experience. A UX-centered design approach requires the educator to consider the learner's digital experience. Studies show a correlation between student engagement, as described by components of the learning environment (e.g., interactions between students and the learning systems, technologies, resources, and navigation), and eventual learning outcomes 42–44. Studies also show that continuing education students often value flexibility, simplicity, and a collaborative learning environment; these qualities must be prioritized when designing chemistry content for professionals.

 

Here, we present a case study detailing the UX-based approach used in the design evolution of a drug discovery course that was initially hosted on a traditional LMS. We found that, for our users, the traditional LMS course had low completion rates. We followed the UX double diamond design process to identify design challenges and address them strategically from a user-centered perspective. We constructed personas of our users, identified key features, and developed low-fidelity, mid-fidelity, and functional prototypes for the new user-based course with iterative usability tests. This case study demonstrates the impact UX could have on online chemistry content development by universities and institutions that want to engage external users.


MATERIALS & METHODS

VUSM Basic Sciences Drug Discovery LMS Online Program

The program was initiated in 2018, leveraging an external iteration of an LMS to increase students' drug discovery research literacy and showcase the therapeutic development expertise within the Vanderbilt School of Medicine Basic Sciences. Specifically, the interactive curriculum of the program was designed for external participants interested in using online resources to enhance their understanding of drug discovery. The program was structured as a series of two-week mini courses that covered various topics crucial to drug discovery and development. The program was closed in March of 2023. The course structure and offerings during the first launch in summer and fall 2021 can be seen in Supplemental Figure 1. A sample course syllabus and course schedule for one module in the course can be seen in Supplemental Figure 2.

Participant recruitment and audience

Once the course launched, we recruited students by advertising via LinkedIn, Vanderbilt School of Medicine Basic Sciences email listservs, and a subscription email newsletter offered on the course website homepage. From there, students could register for, enter, and complete the course. Students who enrolled in one or more courses were automatically subscribed to the course newsletter. Demographics of subscribed students from the newsletter can be viewed in Supplemental Figure 3.


IRB Approval

This project’s data collection included compiling web search results, user interviews, and observation of twenty participants’ use of our LMS courses and three iterations of a prototype. We used semi-structured interviews with the participants. This is a low-risk human research project whose ethics were cleared by the Maryland Institute College of Art (MICA) and the Vanderbilt University Research Ethics Committee. The ethics approval numbers are VU220210 and MICA IRB Protocol 117 for Vanderbilt University and MICA, respectively.


User Experience (UX) Design

UX is an emerging and evolving field that offers a process for developing products informed by the audience's (user's) needs. UX design is the process of designing (digital or physical) products that are useful, easy to use, and pleasing to engage with. This process is guided by how a product's design can enhance the user experience and by the value it provides. For a comprehensive overview of UX design for educators, we suggest Earnshaw et al. (2018). Case studies applying UX design in education are limited and nonexistent in the chemistry education field.


Double Diamond UX Design Process and Design Methods

The British Design Council popularized the double diamond process in 2005. The two diamonds represent exploring an issue widely and deeply (divergent thinking) and then taking focused action (convergent thinking; Supplemental Figure 4). The first phase is "discover," which involves taking steps to understand an issue rather than merely assuming a solution. Within this phase, we used three methods: a competitive analysis, a quantitative assessment of student completion rates from the first implementation of the LMS course, and user interviews. The next step is to “define” the problem, during which we refined the insights gathered from the discovery phase to determine a course of action. The method used for this phase was generating personas. The third phase is to “develop” a new prototype based on the feedback collected and to continue testing that prototype with users. The final stage is “deliver,” which involves building out the prototyped solutions at a small scale, rejecting those that will not work, and improving the ones that will. We called this our functional prototype.


Competitive analysis.

To start, we performed a competitive analysis of direct and indirect competitors to the LMS-style program (Table 1). A direct competitor offers the same course or targets the same student audience. An indirect competitor targets a different student type or offers a different course product to the same target audience. To do this, we conducted a Google search for LMS-styled drug design and development programs with direct or indirect infrastructures and narrowed down the comparisons. After reading through course descriptions, we compared them with our program based on their potential appeal to students, objectives, types of content offered, and other factors. We also evaluated key features for each type of competitor (Supplemental Table 2).



Table 1. Competitive Analysis of existing platforms in direct and indirect competition. As the first step in the discovery phase, we compared our program with three direct and three indirect competitors based on specific features important for each competitor type. 


Quantitative evaluation.

We performed a quantitative analysis of the LMS course, extracting usage metrics about user registration and completion rates to build user pathways annotated with those rates (Supplementary Figure 5).
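To illustrate the kind of funnel analysis behind these pathways, the sketch below computes the share of users reaching each stage from an event log. The log structure and event names are hypothetical, not the actual Brightspace export format.

```python
# Minimal sketch of a registration-to-completion funnel from usage events.
# Event names and log structure are hypothetical, for illustration only.
events = [
    {"user": "u1", "event": "registered"},
    {"user": "u2", "event": "registered"},
    {"user": "u1", "event": "enrolled_module"},
    {"user": "u1", "event": "completed_module"},
]

def funnel_rates(events, stages):
    """Share of users reaching each stage, relative to the first stage."""
    users_at = {s: {e["user"] for e in events if e["event"] == s} for s in stages}
    base = len(users_at[stages[0]]) or 1  # guard against an empty first stage
    return {s: len(users_at[s]) / base for s in stages}

rates = funnel_rates(events, ["registered", "enrolled_module", "completed_module"])
# e.g. {'registered': 1.0, 'enrolled_module': 0.5, 'completed_module': 0.5}
```

Applied to real export data, the same per-stage shares yield the dropout points shown in the user pathways.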


User interviews. We interviewed students currently or previously enrolled in the LMS-style course to understand users' needs after taking the LMS-style program. We wanted to understand user motivations, experiences, and key outcomes, as this would allow us to later classify user groups. We interviewed nine people: four advanced science undergraduate students just starting their careers and five working professionals interested in getting into or switching tracks within the drug discovery industry. The questions focused on the users’ prior experiences and present emotions about online education. We also asked a series of questions about their motivations, uses, and engagement with the online courses, including the educational, social, and cultural factors that may have influenced their experience. The interviews were recorded and transcribed. Readers can find a complete interview guide in Supplementary Document 1.


Generating personas. From interviews with participants of the course, we defined and contextualized the qualitative information by creating user types to test whether they fit our target user groups. User groups are primarily defined based on occupation and qualifications. From here, we developed personas, which characterize the behaviors, expectations, and pain points of our defined user groups (Figure 1).



Figure 1. Personas created to identify typical users' pain points and needs. Here we show a typical persona in two columns describing approximate characteristics of users belonging to Type A (gold) and Type B (red). When building the personas, we developed stereotypical backgrounds, goals, habits, and pain points for each persona. The level of understanding of drug discovery and development, interest in the topic, interest in networking and community, and comfort with a traditional course format are shown on a scale from low to high (right to left). From this evaluation we identified three shared problems between these personas, shown in the gold box. A more detailed figure of the personas is available in the supplementary materials.


SWOT analysis and value proposition.

Based on the interviews, we summarized the next steps for the course using a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis, drawing on the users' qualitative and quantitative data (Supplemental Figure 6). SWOT analysis has become a key tool for strategic planning in business and is often used by organizations to evaluate their position and analyze their internal and external environments during times of indecision 45–47. Strengths refer to the internal elements of an organization that facilitate reaching its goals, while weaknesses are internal elements that interfere with organizational success. Opportunities, external aspects that help an organization reach its goals, are not only positive environmental aspects but also chances to address gaps and initiate new activities. Threats, on the other hand, are aspects of the organization’s external environment that are barriers or potential barriers to reaching its goals 48–51. Value can be created through multiple elements, such as price, quality, and location 52.


Prototyping.

Prototyping describes the development of early, tangible models of ideas that can be tested with users to refine the design. We went through two stages of course content and structure prototyping: the first involved creating an app for the course, while the second added features to the app. All prototyping was done in Figma, a free web-based design tool (Figure 3).


Usability Testing. 

We aimed to evaluate our prototype by testing it with participants who had completed the LMS version of the courses, asking them to walk through various platform features. The researchers watched, listened, and took notes as the users completed these tasks. We also asked about the users' expectations and preferences for the platform and asked them to compare it to their previous LMS experience. The goal was to identify usability problems, collect qualitative and quantitative data, and determine the participants’ satisfaction with the new prototype. Readers can find a complete usability guide in Supplementary Document 2.


Functional Prototype. 

We finally developed a temporary website design as a functional model for the new course platform. We designed this final website using Wix, a free website creation tool. The final web-based version of the course was entirely hosted on the Vanderbilt School of Medicine Basic Sciences website. We recruited students to participate in the new online courses through email newsletters attached to the course and LinkedIn posts from a leading faculty member teaching the new course. From these media platforms, we extracted usage metrics about user registration and completion rates.


Thematic evaluation of usability tests. For both the mid-fidelity and high-fidelity usability tests, the users’ comments were evaluated based on summarizing themes developed by the research team and were grouped as positive or negative sentiments. Any comments about individual features and aspects of the interface were added to the “design” theme. “Ease of use” referred to comments about the enjoyability, efficiency, and effectiveness of the user's ability to perform tasks. The “content” theme evaluated users’ expectations about content they thought was present or missing from the experience. “Course structure” summarized comments about the information architecture and structure of the courses. “Ease of navigation” described comments on the design and visual aspects that made the interface intuitive to maneuver. Finally, “accessibility” related to comments on whether the interface allowed for a stimulating learning environment.
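The theme-by-sentiment grouping described above amounts to a simple cross-tabulation of coded comments. The sketch below shows one way to compute it; the coded comments are illustrative, not the actual study data.

```python
# Sketch: tallying coded usability comments by theme and sentiment,
# mirroring the positive/negative grouping described above.
# The comment data here is illustrative, not the real transcripts.
from collections import defaultdict

coded_comments = [
    ("ease of use", "positive"),
    ("ease of use", "positive"),
    ("design", "negative"),
    ("content", "negative"),
]

def tally(coded):
    """Count positive and negative comments per theme."""
    counts = defaultdict(lambda: {"positive": 0, "negative": 0})
    for theme, sentiment in coded:
        counts[theme][sentiment] += 1
    return dict(counts)

summary = tally(coded_comments)
# summary["ease of use"] -> {"positive": 2, "negative": 0}
```

Per-theme counts like these are what the circle grid in the usability-testing figure visualizes, one circle per user test.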

 

RESULTS

DISCOVER

While the idea of an online learning ecosystem for drug development is not new, we aimed to make ours specific to an external audience of working pharmaceutical professionals. To understand how our offering compares with current online learning ecosystems, we performed a competitive analysis with three direct and three indirect competitors in this space (Table 1). Our three direct competitors were all certificate or full online Master’s programs that used an LMS to administer their content. All required a background in biology or chemistry and offered the opportunity to learn the drug development process. In contrast to our system, direct competitors offered synchronous components in addition to their asynchronous online content, and some used various additive technologies in conjunction with the learning management system to improve the student experience. However, the number of systems students must navigate could complicate use and decrease compliance. Similarly, competitors sometimes developed new learning systems to supplement the course delivery platform. Our three indirect competitors offered programs marketed to external and professional users but outside of our content area. Most were hosted on MOOCs that were easy to use, easy to navigate, and offered asynchronously. In contrast to our course, most systems were price-based, with higher-quality products often having higher prices, longer durations, and lower levels of interactivity.


The LMS version of our course officially launched in Summer of 2021 and consisted of nine self-contained modules covering key ideas and processes in the drug development pipeline, with three key pathways: core modules, drug metabolism and pharmacokinetics, and medicinal chemistry (Supplementary Figure 1). The course was hosted on Brightspace, an LMS that supported multimedia and an array of learning resources such as quizzes, activities, and discussion boards and allowed for external, non-institutional access. Each module featured one introductory section and three to five content sections with at least one short video, a discussion board, and several activities and quizzes (Supplementary Figure 2). The introductory section introduced students to the faculty member teaching the class, the proposed learning objectives, instructions for completing the class, and how to get support for any additional questions or concerns. Finally, students earned a completion certificate for the course if they completed all module quizzes with a 100% score. Modules could be completed asynchronously with deadlines for the quizzes and activities. If participants did not complete the quizzes by the course closure date (two weeks after the start of a module), they did not receive the course completion certificate. This structure allowed participants to learn at their own pace and to self-select sections to partake in based on their interests, which was the main draw of the course.


As students registered through the course website, we gauged interest by analyzing traffic to the website. Following course launch, the top webpage gained 1,894 page views. Of these, 1,466 were unique, indicating individual visitors who explored the main program website, while roughly a quarter of views came from people returning to the page more than once, suggesting greater interest. Traffic to the site came from direct access (23.55%), emails and newsletters (24%), and LinkedIn (11.99%) (Supplementary Figure 5). The total number of registrants (495) roughly corresponds to the quarter of visitors who showed greater interest by returning to the site.
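The "roughly a quarter" figure follows directly from the reported view counts, as the short sketch below recomputes. The view counts and source percentages come from the text; everything else is derived.

```python
# Recomputing the page-view summary from the counts reported above.
total_views = 1894    # total page views on the top webpage
unique_views = 1466   # unique views (individual visitors)

# Share of views attributable to returning visits.
returning_share = (total_views - unique_views) / total_views
# ~0.23, i.e., roughly a quarter of views were repeat visits

# Traffic with a known source; the remainder came from other channels.
attributed = 23.55 + 24.0 + 11.99        # direct + email/newsletter + LinkedIn, in percent
other = round(100 - attributed, 2)        # ~40% from other or untracked sources
```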


Finally, 120 registrants enrolled in one or more modules of the course (59% registered for one course, 34% for two to four courses, and 8% for over six courses). We could infer the demographics of those enrolled because they also subscribed to the course newsletter via a sign-up form during enrollment (Supplemental Figure 3). There were 104 subscribers, mostly Vanderbilt University-affiliated male professionals working in Nashville, TN. MailChimp predicted that 29% were female and 55.1% were male, and that most subscribers (51.4%) were between 25 and 34 years old. Of the subscribers, 45.2% were current members of the Vanderbilt community, while 45.1% were external community members. Finally, in the subscription form, subscribers indicated that they were mostly interested in the basic core content (39.4%), drug metabolism, and pharmacokinetics. The next most popular topic was professional development (21.2%), with very little interest in special topics (8.6%) or entrepreneurship (6.7%).


Overall, 0.05% of registrants completed all courses. Students dropped out as they progressed through the modules, with the highest number of learners joining the first module. Completion rates for all individual courses were 0.1% or lower. Additionally, engagement with the material was very low, with most learners logging into the system only once. This suggested a significant issue with our content delivery or platform that resulted in participants not completing or engaging with the material.


DEFINE

Because our user completion rates were so low, we collected qualitative data by interviewing participants to better understand their experience of the course. We defined two hypothetical types of users, which helped us sample participants who could represent the bulk of user experiences. The first user type was an advanced undergraduate student or recent graduate interested in exploring potential careers in drug discovery (Type B). The second was an advanced professional working in the field, interested in building a network with other professionals (Type A). We interviewed four and five users from these groups, respectively. We found that Type A users typically had very little time to dedicate to content and therefore required the platform and content to be easily accessible, flexible, and user-friendly. They often logged on to the website on weeknights and weekends (e.g., to counterbalance time spent with kids and daily life). As such, they also had varied levels of interaction with the course, often selecting only the content they found most relevant. On the other hand, Type B users were often willing to explore and had more time to engage with the course content, and thus required the course to be clear, well thought out, and fulfilling. However, both user types found the free video-based content positive and enjoyable and considered it valuable to their career development. This information was used to develop personas (Figure 1).



Figure 2. Key features and alterations to design. After identifying the key pain points or problems in Figure 1, we identified three features for the new design. The first was to have the registration and course content on the same site, preventing users from having to navigate between different platforms. The second feature was to develop clear expectations and simple tasks for learners that help them identify what they needed to do to complete the course and earn a badge. Finally, the last feature was to increase the accessibility of the videos by hosting them both in the course and on a Vimeo streaming channel.


We summarized the results of our quantitative and qualitative data in an analysis that identified key areas of strength, weakness, opportunity, and threat (SWOT) for our program, which helped define a well-thought-out plan for the next iteration of course design (Supplemental Figure 6). The strengths of the program were our institutional reputation and excellent content created and led by experts in the field. Our weaknesses were in two major areas: content depth and the course platform. The depth and amount of content were often a challenge, as they required a more advanced background in chemistry, and the number of educational activities seemed overwhelming for our target audience. As a result, some courses needed clearer expectations and requirements for completion. Users identified threats to completing the course that included difficulty accessing content, a cumbersome login process, and videos that were not accessible outside of the course. Finally, opportunities to resolve these conflicts included creating an online delivery platform that accentuated our video-based content while clarifying and simplifying the course requirements. Two additional opportunities were the addition of a badging and certificate program and the development of online communities and networks for participants.



Based on our user interviews, we identified three key features for the UX-based redesign of the online courses (Figure 2). These features included having the registration and course on the same site, setting clearer expectations for learners to receive credit for the course, and increasing the accessibility of the videos. Thus, the best design to meet these needs would be an app-based platform for the course, a more accessible mechanism that also delivers content on several devices. The platform would host more videos, with less focus on time-consuming activities. We also introduced micro-credentials, or badges that recognize learning achievements, providing more streamlined course requirements and expectations and giving an opportunity to network outside of the course.

 

DEVELOP

For the new courses, we developed three mobile app prototypes (low-fidelity, mid-fidelity, and high-fidelity), with each fidelity level denoting the prototype's level of detail and closeness to a final product. We performed usability testing on our mid-fidelity and high-fidelity prototypes using Figma, a flexible software tool used by design practitioners for application interface design and prototyping 53,54. The low-fidelity prototype offered a simple user interface based on a series of continuous pages starting from an introductory homepage, with direct links to course pages where one could find all videos, quizzes, and additional links to resources for more information, all flowing back to the home page (Supplemental Figure 7). The mid-fidelity prototype offered a more complex setup, with a homepage that provided more functionality and personalization for users to customize their experience (e.g., creating a list of videos to watch) (Supplemental Figure 8). The high-fidelity prototype included much more functionality, including a search feature and information about the instructors, making it easier for our network to hear lectures, earn badges, and join the online community (Supplemental Figure 9).


To perform usability testing, we recruited users to test the platform and interviewed them during their experience. Our testers (advanced undergraduates, experts in biomedical sciences, and Vanderbilt faculty and staff) included people both familiar (n=2) and unfamiliar with the LMS version of the course. We evaluated testers' perceptions of the prototypes in terms of interface layout, features, ease of use and navigation, course structure and setup, and digestibility.


The testers had an overall positive experience with the mid-fidelity prototype, especially its ease of use and navigation (Figure 3). One tester said, “I really like it; it reminds me of other online video learning platforms.” This suggests that the platform is approachable and easy to navigate. However, there was some confusion about specific features, such as the layout of the home page and how to find content. We also identified the need for a results page after taking quizzes, which impacted how users thought about content presentation and packaging. Finally, although testers were excited by the ease of use on mobile and tablet devices, many wanted a web-based platform as an additional option, as it was a more familiar interface; this raised course setup concerns. During our second round of usability testing with the high-fidelity prototype, we gained deeper insights into users' expectations and behaviors. Overall, users had a positive experience in most of the thematic areas defined. The high-fidelity prototype received improved responses for the feature and layout design and the content available/missing themes, indicating that the prototype was in a more acceptable format that met testers’ intuitions. As the user experience improved, users began to prioritize the course content, allowing us to identify areas to improve instructionally. We identified a few significant problematic themes concerning the initial landing page users engage with (home page vs. profile), the quiz question format, the similarity between the visual elements of the videos, and user expectations about starting a course. The testers also began commenting more on text and aesthetic personal preferences.


DELIVER

For the functional product of the new course, called Ph.ideo, we created a website using Wix, which was also available in app form. The website has a home page providing intuitive access to different user experiences based on the outcomes and interactions a user desires from the program, including earning a badge, earning a certificate, or attending in-person workshops. The website also includes an about page with relevant overview information, an FAQ page, a link to all the course modules, and external links to the full online site. At the bottom of the page are a subscription form for the newsletter and images with links to all videos. The functional model focuses on ease of use, visual appeal, and concise, clear text content (Supplemental Video 1).


Finally, we launched the courses in May 2022 and collected quantitative data about course participation and completion. To pilot Ph.ideo, we offered past participants a free one-month period, emailing 230 of our previous newsletter subscribers to try out the new course. Of the 76 registrants for the course, we found a 45.45% completion rate, a major improvement over the initial 0.05% completion rate on the LMS platform. In addition, students were interested in earning badges, which were specific to each module: 73% of registrants earned badges, meaning they completed one or more modules without necessarily completing the entire course. This suggests that badges are a more intuitive approach for an asynchronous, do-it-your-way online MOOC, where students are not beholden to completing the entire course, just the sections they feel are relevant to them (Table 2).



Figure 3. Usability testing and course completion rates. First, we show a thematic evaluation of the comments from usability testing with the mid- and high-fidelity prototypes. Each circle represents a different user test. Red circles indicate that the user identified that theme as an area needing improvement, teal circles indicate that the user mentioned that theme positively, and grey circles indicate no comments about that theme. The Figma design files for each prototype are available in the supplemental materials. Leveraging UX Design strategy increased the completion rate from an average of 0.05% for the LMS course to 45.45% for the functional prototype.

 

DISCUSSION

Through UX Design approaches, including interviews, persona building, and prototyping, we transitioned an online program from a learning management system to a simple video-based web application better suited to external participants’ needs. Employing evidence-based UX Design tools increased retention and completion rates for an audience of external users and students looking to learn or strengthen foundational concepts in medicinal chemistry and pharmacology for the pharmaceutical industry. Because the subject matter is already challenging, we aimed to lower the activation energy needed to navigate the online learning systems and processes. We found that reducing these barriers increased engagement with the drug discovery content.


We were interested in developing non-credit online courses and resources for working professionals to advance their careers. We also included early-career scientists (potentially master’s, doctoral, and postdoctoral students) interested in learning introductory drug discovery and development concepts. In our discovery process, a comparative analysis showed that we might better meet our intended audience’s expectations if our product had more of the qualities typically associated with MOOCs (i.e., ease of use, accessibility, asynchronous delivery) while also leveraging the most appreciated features of a traditional LMS (content management, interactions, community). Through user interviews, we identified two critical weaknesses in our LMS course: the accessibility of the content and the usability of the course platform. While future efforts could increase the accessibility of the course material, here we focused on developing a digital course platform that better fit user behavior. Through various iterations of prototyping, we delivered a functional online course prototype that improved the completion rate from 0.05% to 45.45% by redesigning the system around user needs.


Our process led us to focus on three key features. First, we wanted registration and course content to be located on the same site or, at a minimum, not to require participants to migrate between two systems. Password and user account exploitation is considered one of the critical issues in network security management 55–58. Previously, two separate systems were used to register students and to host content, and integrating them created many negative experiences for users. We therefore combined registration and content hosting in a single web-based application. Second, we reduced the participants’ tasks and clarified expectations. Others have found that reducing requirements to those essential to the learning objectives decreases stress and anxiety without significantly impacting other learning or performance measures in traditional learning environments 59. During the redesign, considering how learning outcomes would be assessed led to explicit assessment mechanisms that reduced redundancy and clarified requirements. Starting with assessment mechanisms and then building objectives has also been used in other MOOCs to direct learners toward learning outcomes 60. Various learning outcomes in MOOCs have been studied, such as knowledge, skills, academic achievement, engagement, motivation, satisfaction, and perceptions of learning 61–66. Assessment tasks throughout a course differ in difficulty and complexity, which could align with different levels of learner motivation. Particularly for an external audience, learners’ motivations differ, such as completing the course, meeting professional needs, satisfying curiosity, or connecting with people, and their learning needs might differ as well 67. Examples of different learner groups are active learners, who aim to complete and pass the course, and lurkers, who do not involve themselves in learning or complete any assessment tasks 68.
Measuring all learners against the same standard within a course might not satisfy every learner’s needs. One approach would be to apply assessments within each learner group but differentiate them in difficulty and complexity. In short, differentiated learning assessments might align with learners’ motivation levels and lead all of them to fulfill their learning goals. Finally, we moved all video content to Vimeo, which increased the accessibility of the videos outside the course content. Rapid globalization, along with the growing trend of openness and sharing, has created a standard for the accessibility of information through digital platforms 69. The potential of technology to enhance learning is well established 70,71. Particularly for external audiences, hosting videos on platforms like YouTube, Vimeo, and DailyMotion increases their broader accessibility; therefore, it is reasonable to use these platforms to foster new ways of informal teaching and learning 72. These video platforms offer a tremendous opportunity, at an affordable cost, to learn long sequences (e.g., a 20-part video series on learning to play the dulcimer), short knowledge insights (e.g., how to clean a clogged drain), and ongoing routine activities (e.g., exercise classes). A 2021 Pew study of adult Americans found that 81% are YouTube users 73, of whom 86% found YouTube videos useful for informal learning 74. Additionally, resources exist to help educators with low-cost, high-impact video production 75.


This study has a few limitations, including the measurement of knowledge transfer and the digital literacy of our target populations. Future work must more thoroughly evaluate whether learning outcomes were equivalent across the two systems. Learning outcomes describe the measurable skills, abilities, knowledge, or values that students should be able to demonstrate as a result of completing a course; they are student-centered rather than teacher-centered, describing what the students will do, not what the instructor will teach. We created learning objectives for the courses, along with activities that evaluated those objectives. Based on user feedback, we cut redundant outcome-measuring activities to the bare minimum, reducing the time and energy required of participants. Since progression through the course required completing objective-measuring activities, we used completion rates as an indicator of learning outcomes. Additional studies would be required to test whether these changes reduced knowledge transfer more generally. In future studies, we should also assess the digital literacy of our audience or compare digital literacy across personas. Digital literacy refers to an individual’s ability to find, evaluate, and communicate information through typing and other media on various digital platforms. A better understanding of digital literacy in our population could explain why specific design decisions led to higher engagement and completion rates.


Our results emphasize the value of applying UX Design principles, particularly for external learners. Informal education efforts that seek to engage a broader, non-academic audience face alternative motivating factors among learners that affect their engagement. Our findings echo previous work contending that universities should utilize UX Design when creating broad-based online learning systems; our work provides a concrete example of applying this approach and demonstrates its benefits. Reimagining the design of educational content for an external audience requires attention to interactivity, motivational incentives, and presentation. In a sense, UX Design blends communication with pedagogical practice to improve the learner experience. Others can use our process as a model for improving other digital resources within the sciences and beyond.

 

 


 

REFERENCES

76. Earnshaw, Y., Tawfik, A. A., & Schmidt, M. (2018). User Experience Design. In R. E. West (Eds.), Foundations of Learning and Instructional Design Technology. BYU Open Textbook Network. https://open.byu.edu/lidtfoundations/user_experience_design

