Learners’ Perceptions of the Microlearning Format for the Delivery of Technical Training: An Evaluation Study
Running head: LEARNERS’ PERCEPTIONS OF MICROLEARNING

Learners’ Perceptions of the Microlearning Format for the Delivery of Technical Training: An Evaluation Study

by Wendy Peterson

A Dissertation Presented to the FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION, UNIVERSITY OF SOUTHERN CALIFORNIA, In Partial Fulfillment of the Requirements for the Degree DOCTOR OF EDUCATION

December 2017

Copyright 2017 Wendy Peterson

Dedication

This dissertation is dedicated to my entire family, but first and foremost, to my sons Mark Metrakos and Scott Metrakos. Their unquestioning belief in their mom never wavers and never ceases to amaze me. You always hug me in front of your friends and have done so even when you were teenagers. ILY guys. I am so proud of the honorable men you are. Also, through this dedication, I honor my late father and mother, Harold “Pete” Peterson and Jeanette Nisbet Peterson. You loved me unconditionally and patiently. You instilled in me the work ethic that made every bit of my success possible. To my sister, Susan Peterson Nofziger, my brother, Wayne Peterson, and to every member of their diverse, inspiring extended families: You all model excellence in everything that you do, especially in your service to those who need help the most. I am honored that I can call each of you family.

Acknowledgements

Thank you to Dr. Kimberly Hirabayashi for chairing my committee and never failing to reassure me dozens of times over the past three years that I belong with those who have earned a doctoral degree at USC. Your patience with my quirks was unbounded, and your expert guidance led me time and time again out of the rabbit holes and back onto the right path. Your ability to take my spider web of thoughts and help me turn them into clear prose is something I hope to master someday. Thank you also to Dr.
Helena Seli and Dr. Megan Patel. Your willingness to serve on my committee and your intellect, generosity, guidance, and compassion made the journey far less terrifying and far more valuable. Thank you to all of my USC professors in the OCL online program and all of the caring folks in the program support office. You took this unlikely senior citizen student and opened for me a world of scholarly potential. Of special note, Dr. H., you untangled my theoretical and APA style knowledge more times than I can count, and I can’t thank you enough. Dr. Filback, I will be forever grateful for your brilliance and kindness in mentoring and supporting those of us who were suffering mightily from imposter syndrome; you showed us that creativity is an empowering gift we are all capable of continuously giving ourselves. Dr. Sundt, I will continue to endeavor to emulate your altruistic leadership for the rest of my days. Dr. Slayton, your guidance in helping me corral my out-of-control compulsion for detailed data analysis made it possible for me to finish Chapter Four before this decade came to a close. I also thank the staff at the Orange County USC Campus. You always held a study room open for me during the final month of writing to ensure I had the peace, quiet, and whiteboards I needed to focus, even when I was the only student left in the building. To the wonderful folks at the USC Dissertation Support Center: Thank you for holding so many “writing weekends” to give me the gentle we’re-all-in-this-together peer pressure I needed to keep going when I really wanted to do anything but. Finally, but certainly not least of all, my heartfelt appreciation to the “Saturday Crew” of colleagues in OCL’s Cohort #1 who were in class with me online for hours every Saturday morning for 2.5 years, smiling from your Brady Bunch video squares and typing chat messages non-stop until we were asked (repeatedly) to calm down.
That laughter, that collaboration, that support, that brainpower—it has all been absolutely invaluable. I don’t have room to list every one of you, but you know who you are and you are each brilliant. Your doctoral research will improve lives and I am honored to know you. For Vic, Dave, Jody, and Deb, the word “thanks” just isn’t enough. You all pulled me across this finish line when it seemed like someone kept moving it farther away. I owe you one; please collect soon. Fight On!

Table of Contents

List of Tables
List of Figures
Abstract
Chapter One: Introduction
  Context and Mission of Field-Based Study
  Goals for Organizational Training and Professional Development
  Related Literature
  Importance of the Evaluation
  Description of the Stakeholder Group for the Study
  Purpose of the Project and Research Questions
  Methodological Framework
  Definitions
    Chunking
    eLearning
    Gamification
    ILT
    LinkedIn
    Microlearning
    salesforce.com
    Technical Training
    Trailhead
    Web 2.0
  Organization of the Project
Chapter Two: Review of the Literature
  Introduction to the Problem of Practice
  Stakeholder Knowledge, Motivation, and Organizational Influences
    Knowledge Influences
      Knowledge types
      Procedural knowledge influences
      Metacognitive knowledge influences
    Motivation Influences
      Expectancy value theory
      Goal orientation theory
      Self-efficacy theory
    Organizational Influences
      Cultural setting influences
      Cultural model influences
  Summary
Chapter Three: Methodology
  Purpose of the Project
  Research Questions
  Conceptual Framework
  Participating Stakeholders
  Methodological Framework
    Survey Instrument
      Survey recruitment, sampling criteria, and rationale
    Interview
      Interview recruitment, sampling criteria, and rationale
  Data Analysis
  Credibility and Trustworthiness
  Ethics
  Limitations and Delimitations
  Summary
Chapter Four: Results and Findings
  Participating Stakeholders
    Overview of Survey Participants
    Overview of Interview Participants
    Detailed Survey Participant Demographics
      Survey participant age
      Survey participant employment
      Survey participants with technical training experience
      Survey participants without technical training experience
      Survey participants' reasons for taking technical training
      Types of technical roles held or desired by survey participants
      Survey participants' years of technical training
      Survey participants' experience with microlearning
      Survey participants with neither experience nor an understanding of the microlearning format
      Survey participants without microlearning experience but with an understanding of the format
      Number of technical training courses in a microlearning format
  Research Question 1: Influences on Learners’ Perceptions of Microlearning
    Overall Impression of Microlearning
      Microlearning compared to instructor-led courses
      Transfer of knowledge to the workplace
      Learn by listening to recorded training
      Learn offline from a workbook
    Knowledge Influences
      Procedural knowledge
      Metacognitive knowledge
    Motivation Influences
      Expectancy value
      Goal orientation
      Self-efficacy
    Organization Influences
      Cultural setting
      Cultural model
    Interview Participants’ Implementation Recommendations
      Select appropriate topics for microlearning
      Specify pre-requisite knowledge
      Include support information
      Incorporate social learning
      Include robust assessments
      Increase breadth of microlearning courses
      Research microlearning best practices
      Test the course
  Summary
Chapter Five: Recommendations
  Recommendations for Practice to Address KMO Influences
    Knowledge and Motivation Recommendations
      Overview
      Recommendations for knowledge and motivation influences
    Organization Recommendations
      Overview
      Cultural setting recommendations
      Cultural model recommendations
      Summary
  Implementation and Evaluation Framework
    Organizational Purpose, Need and Expectations
    Level 4: Results and Leading Indicators
    Level 3: Behavior
    Level 2: Learning
    Level 1: Reaction
    Evaluation Tools
      Immediate evaluation
      Delayed evaluation
    ELearning Considerations for Assessment and Evaluation Planning
  Summary
  Limitations and Delimitations
  Conclusion
References
Appendix A: Survey Instrument
Appendix B: Interview Protocol
Appendix C: Survey Question Flow Design
Appendix D: Example of an Immediate Program Evaluation Tool
Appendix E: Example of a Delayed Program Evaluation Tool

List of Tables

Table 1: Global and Stakeholder Goals
Table 2: Knowledge, Motivation, and Organizational Influences and Assessments
Table 3: Interview Participants for Qualitative Inquiry
Table 4: Comparison of Primary Microlearning Course Navigation Strategy
Table 5: Summary of Survey Responses Rating the Importance of Course Features
Table 6: Summary of Knowledge and Motivation Influences and Recommendations
Table 7: Summary of Organization Influences and Recommendations

List of Figures

Figure 1. Example of eLearning architecture
Figure 2. Burke & Litwin Causal Model Approaches to Managing Organizational Change
Figure 3. Conceptual framework for this study
Figure 4. Frequency of survey participant age
Figure 5. Frequency of participant employment organization type
Figure 6. Frequency of participants’ employment organization size
Figure 7. Frequency of participants who had taken a technical training course, sorted by organization type
Figure 8. Frequency of participants who had not taken a technical training course, sorted by organization type
Figure 9. Frequency of participants who had and who had not taken a technical training course, sorted by organization type
Figure 10. Frequency of participants’ reasons for taking a technical training course
Figure 11. Frequency of participants’ stated technical roles or desired roles
Figure 12. Frequency of participants’ years of experience with technical training
Figure 13. Frequency of participants’ exposure to microlearning courses but without microlearning experience
Figure 14. Frequency of participants’ interest in taking a microlearning course but without microlearning experience
Figure 15. Frequency of participants’ perception of the use of microlearning for technical training but without microlearning experience
Figure 16. Frequency of the number of technical training courses participants had taken in a microlearning format
Figure 17. Frequency of the participants experienced with microlearning who plan on taking additional microlearning courses in the future
Figure 18. Frequency of the survey participants’ ranking of the importance of instructor-led classroom training for technical content
Figure 19. Frequency of the survey participants’ ranking of the importance of instructor-led webinar training for technical content
Figure 20. Frequency of the survey participants’ ranking of the importance of instructor-led webinar training for technical content
Figure 21. Frequency of the survey participants’ ranking of the importance of learning technical content in collaboration with colleagues
Figure 22. Frequency of the survey participants’ ranking of the importance of learning technical content by listening to a recorded webinar
Figure 23. Frequency of the survey participants’ ranking of the importance of learning technical content from a workbook
Figure 24. Frequency of the ranking of importance for clear course navigation instructions in microlearning courses
Figure 25. Frequency of the ranking of importance for the inclusion of definitions for terminology used in microlearning courses
Figure 26. Frequency of the ranking of importance for the inclusion of a course progress indicator
Figure 27. Frequency of the ranking of importance for the inclusion of time estimates for modules in microlearning courses
Figure 28. Frequency of the ranking of importance for the ability to click through a required microlearning course quickly
Figure 29. Frequency of the ranking of importance for the inclusion of interactive features in microlearning courses
Figure 30. Frequency of the ranking of importance of the ability to bookmark progress in microlearning courses
Figure 31. Frequency of single choice responses to the statement: If I could observe you working through your most recent microlearning course, what is the primary strategy I would see you use?
Figure 32. Frequency of responses regarding the participants’ ability to acquire needed technical skills from a microlearning course
Figure 33. Frequency of responses regarding microlearning as an important use of time
Figure 34. Frequency of responses whether it is worthwhile to be able to make course path decisions
Figure 35. Frequency of responses whether it is worthwhile to be able to have control over time spent learning
Figure 36. Frequency of responses regarding the desire to see more technical content delivered in a microlearning format
Figure 37. Frequency of the ranking of importance of convenience
Figure 38. Frequency of the ranking of importance of learning in short segments
Figure 39. Frequency of the ranking of importance of learning only specific content
Figure 40. Frequency of the ranking of importance of taking the course content in any order
Figure 41. Frequency of responses regarding participants’ ability to achieve their learning goals in a microlearning format
Figure 42. Frequency of responses regarding ease of learning technical content in a microlearning format
Figure 43. Frequency of responses regarding the participants’ ability to navigate through a microlearning course
Figure 44. Percentage range of technical training type delivered by survey participants’ organizations
Figure 45. Frequency of responses ranking the extent of needed professional development provided by organization
Figure 46. Frequency of responses ranking the extent of organization encouragement compared to organizational commitment for professional development
Figure 47. Example display of evaluation findings showing changes before and after training
Figure 48. Presentation of results from ongoing application of learning to job performance
Figure 49. ATD research findings on the use of the five levels of evaluation in organizations between 2009 and 2015
Figure 50. ATD research findings on the effectiveness of evaluation in meeting organizational goals between 2009 and 2015

Abstract

The speed of innovation is challenging instructional designers to meet the individualized goals of their technical learners. One relatively new approach is microlearning: segmenting curricula into small modules (or chunks) and giving the learner control of the delivery of the content to align with their learning objectives.
This study was designed to contribute insight to a field-based problem of practice: specifically, what knowledge, motivation, and organization influences affect learners’ perceptions of the microlearning format when it is applied to technical training? The study employed a modified Clark and Estes (2008) KMO framework and utilized surveys followed by personal interviews to gather data. Both convenience and snowball sampling methods were employed. Survey participants were recruited from online technical user group communities; interview participants were randomly selected from survey respondent volunteers. The findings from this study indicated that three primary factors contribute to learners’ perceptions of microlearning. First, flexibility within the course is paramount; having support features available, but not required, is essential. Second, providing suggestions for recommended course pathways allows learners to engage with the content and align it to their personal learning goals. Finally, offering robust opportunities (but not requirements) for self-assessment gives learners control to evaluate whether their level of mastery of the content meets their objectives.

Keywords: microlearning, technical training, eLearning, KMO framework

CHAPTER ONE: INTRODUCTION

In high-tech industries, there are incessant competitive pressures on employees to stay current on ever-evolving and expanding technical knowledge. The accelerating speed of innovation and the use of social media by individuals to share knowledge have shifted employees to the center of the learning process (Deloitte Consulting, 2014). Training and development organizations have been slow to adjust, anchored, in part, by their adherence to traditional instructional design models (Deloitte Consulting, 2014).
What is needed is training agility—the ability to deliver rapidly changing content dynamically to learners who have little time to learn and may only be interested in the content that is immediately relevant to their jobs (Hsia, Chang, & Tseng, 2014). Recently, the concept of a microlearning instructional format has been proposed as a solution. Microlearning breaks up training content into small segments of short duration and may give the learner flexibility to take as much or as little training at one time as they need, when and where it is convenient to do so (Kerres, 2007). However, there is little if any empirical research regarding the learners' perceived value of technical training delivered in a microlearning format (Hagleitner & Hug, 2007). This lack of evidence has not stopped the corporate training industry from touting microlearning solutions. The search term "microlearning" on the Association for Talent Development (ATD) website alone generates 32 buyer's guide results, nine industry conference sessions, 12 online articles, and an $850 online course (ATD, 2016). The development of training in a complex and dynamic microlearning format is time consuming, technically challenging to design, and, therefore, expensive for organizations to undertake without knowing more from the learners' perspectives. Therefore, the focus of this study was to evaluate learners' perceptions of microlearning as a delivery format for technical skills training.

Context and Mission of Field-Based Study

This project was a field-based study with participants drawn from the online global population of technical content learners; it was not limited to one specific organization. The study focused on the perceptions of a wide variety of individuals in various roles and with diverse levels of technical training experience. The mixed-method study sought to gather their perceptions of technical training delivered in a microlearning format.
Field-based research has many definitions and takes place in countless environments. For purposes of this study, I borrowed from the world of accounting to define it as a form of research that involves "the in-depth study of real-world phenomena through direct contact with the organizational participants … to provide a 'rich' (i.e., thorough) understanding of the relevant phenomena" (Merchant & Van der Stede, 2006, p. 118). In short, the mission of the proposed study was to discover how adult learners, regardless of organization, perceived the value of technical training delivered in a microlearning format.

Goals for Organizational Training and Professional Development

As the types and features of technology continue to expand, so too do the needs of organizations' workers to increase the speed of their acquisition of specific skill sets and close knowledge gaps as efficiently as possible. A 2012 Gartner study states that executives in Information Technology (IT) and human resources acknowledge the value of designing workforce plans that align a company's strategic goals to the career goals and interests of its personnel (Berry & Mok, 2012). The Gartner study specifically states, "bridging the capability gap for the IT organization, while also addressing the interests of the employee, should be the key objectives of every IT professional development program" (p. 3). Since the target population for the study was limited only by individuals' willingness to participate, each participant worked on technical training to meet unique goals for their employer, for themselves, or both. The study sought to discover each participant's goal orientation in taking technical training and elicited their perceptions on the ability of the microlearning format to help them achieve their learning goals.
Related Literature

The speed of innovation requires agility in the delivery of up-to-date training content to technical employees on demand, any place, and at any time (van der Vyver, Pelster, Bersin, & Haims, 2014). In these times of fast-paced work environments, constantly changing technology, and demand for just-in-time and just-for-me content, many employees are not waiting for training to be pushed out to them in traditional formats such as instructor-led classrooms, live webinars, and training manuals. Instead of taking full courses to the point of completion just in case the content will be useful in the future, many workers are pulling just the information they need from the sources they trust and just at the time they need it (van der Vyver et al., 2014). The use of microlearning is being touted by the training industry as a worthwhile solution for this demanding learning environment (ATD, 2016) despite little if any empirical research supporting its use from a learner's perspective (Hagleitner & Hug, 2007). The literature reviewed in Chapter Two provides additional detail on this field-based problem of practice as well as the knowledge, motivation, and organizational influences that impact adult learners' perceptions of the microlearning format for delivery of technical training.

Importance of the Evaluation

It was important to study adult learners' perceptions of microlearning as an appropriate delivery format for technical training for a variety of reasons. First, there is a dearth of empirical studies on the use of microlearning training design in corporate environments (Hagleitner & Hug, 2007). Expanding knowledge on any delivery methodology is in the best interest of the learning community at large.
If learners' perception of the format is positive, microlearning's a-la-carte flexibility could fulfill multiple industry stakeholder requirements, including just-in-time and just-for-me knowledge acquisition. Second, the plug-and-play microlearning structure and its use of reusable learning modules address both the learners' and the organization's requirements that rapidly changing content remain accurate. Third, microlearning may offer training organizations enhanced efficiency by leveraging reusable learner-driven content across multiple teams, courses, global locations, and career paths. For example, the ability to reuse a module of content in multiple courses designed for the same learner profile aligns with research on repetition as a key to making training stick. Technical expertise is best developed via repetitive exposures to dynamic learning experiences that are scalable and customizable by each learner (Knowles, Holton III, & Swanson, 2015). Through brief, iterative exposures to the same content, learners gain advantages from the repetition. Learners are also reassured that the content will be available for retake on a piecemeal basis and on demand in the future if needed. At the same time, the organization benefits from a lower development and distribution cost that can be spread across multiple training programs and result in a lower cost per hour of instruction and enhanced return on investment (Miller, 2015b).
Description of the Stakeholder Group for the Study

The stakeholder group for the study consisted of voluntary participants who were recruited from multiple technical user groups on LinkedIn and the salesforce.com professional network, including the Association for Talent Development (ATD), Learning Education and Training Professionals Group, eLearning Industry, The eLearning Guild, and Salesforce Power Users, as well as on salesforce.com's public training ("Trailhead") online technical discussion boards. As is described in Chapter Three, purposeful and snowball sampling techniques were employed in an attempt to maximize the diversity of the study's survey population and, therefore, its stakeholder group. Table 1 shows the global and stakeholder goals.

Table 1
Global and Stakeholder Goals

Global Goal: One hundred percent of employees will participate in professional development training that will help them achieve their career objectives.

Employees (Goals for Technical Learners): By February 2017, 100% of adult learners will articulate the benefits of microlearning for their technical skill development.

Training and Development Professionals (Goals for Technical Curricula Designers): By February 2017, 100% of technical curricula designer professionals will articulate the benefits of microlearning for professional development programs.

Organizational Leadership Teams (Goal for Organizational Leadership Teams): By February 2017, 100% of organizational leadership teams will provide resources for delivery of professional development content in a microlearning format.

Purpose of the Project and Research Questions

The purpose of this project was to evaluate the perceptions of adult learners on technical training delivered in a microlearning format as a way to achieve their desired professional development goals. As such, the questions that framed this study were the following: 1.
What are the knowledge, motivation, and organization influences that impact adult learners' perceptions of the benefits of microlearning for technical skill training? 2. What are the recommendations for practice in the areas of knowledge, motivation, and organizational resources related to the employees' ability to articulate the benefits of microlearning for their technical skill development?

Methodological Framework

This project employed an explanatory sequential mixed-methods design for data gathering and analysis. First, quantitative data were gathered via a custom Qualtrics online survey instrument hosted by the University of Southern California. Once those data were analyzed, explanatory qualitative data were gathered via ten individual semi-structured telephone interviews with randomly selected participants from survey respondents who volunteered. The Qualtrics survey link was posted in multiple LinkedIn technical practice groups focused on technical training, training development, or the cloud-computing platform known as salesforce.com. A snowball sampling technique was employed to expose the survey to as broad a range of anonymous adult technical learners as possible. Multiple salesforce.com LinkedIn user groups and Trailhead technical training discussion boards were selected for sampling for two reasons. First, I am a member of the salesforce.com technical community and know it to be populated with thousands of highly engaged technical learners. Second, salesforce.com offers significant amounts of technical training in multiple formats including microlearning, live instruction, written documentation, and video.
Most salesforce.com system administrators, developers, consultants, and individual users in these online groups take ongoing technical training as a normal part of their professional development activity, and they know others training on related technical platforms who could be reached through snowball sampling.

Definitions

Provided in this section are terms and definitions used in this study.

Chunking

Limitations in human cognitive capacity lead the brain, either intentionally or automatically, to break content to be learned into small, related pieces (or chunks) to ease the burden on working memory as it stores the information in long-term memory. Over time, related chunks consolidate with each other to form a new single unit. It is through chunking that humans take in new knowledge from their environment, associate it with existing knowledge, and thereby continuously expand their cognitive capacity (Gobet et al., 2001).

eLearning

ELearning involves the delivery of information, knowledge, and instruction through the use of computer technology, often via the Internet (Welsh, Wanberg, Brown, & Simmering, 2003).

Gamification

Gamification refers to the inclusion of game-like features in curriculum (e.g., content completion badges, leaderboards, points, etc.) to promote learners' engagement, motivation, and behavior outcomes (Hamari, Koivisto, & Sarsa, 2014; Landers, Bauer, & Callan, 2017).

ILT

Instructor-led training (ILT) refers to a classroom or online learning event delivered synchronously and facilitated by an instructor or subject matter expert (Singh, 2003).

LinkedIn

A professional business networking website primarily utilized by adults. As of the date of the writing of this study proposal, LinkedIn has over 433 million members in 200 countries (LinkedIn, 2016).
Microlearning

Microlearning is an instructional format that leverages the concept of chunking and applies it to educational and training content. Material to be learned is broken into small, related segments of content or modules. In many cases, the modules are related at some level but stand independent of each other, thereby offering learners the ability to individualize the flow and pace of the instruction. Separate modules also create plug-and-play options for training developers to swap out content as it becomes out of date without the need to replace an entire course (Hug & Friesen, 2007). Microlearning also offers the potential for continuous and ubiquitous learning, especially when delivered on mobile devices (Peschl, 2007).

salesforce.com

Salesforce.com is a San Francisco-based, publicly traded corporation that was an early innovator of cloud computing technology in the year 2000. Its technical platform, known as Force.com, is an open source, freely available, customizable development environment. As of the date of this proposal, over 150,000 businesses and millions of individual users work from salesforce.com online environments (salesforce.com, 2016).

Technical Training

"The boundaries between technical/functional training and non-technical training are somewhat blurred" (Combs & Davis, 2010, p. 15) due to the complexity of employee roles (e.g., sales skill training may include customer relationship management technology). For purposes of this study, the definition utilized was training that includes "content related to any technology and training on content specific to a discipline, function or profession" (Combs & Davis, 2010, p. 7).

Trailhead

Trailhead.salesforce.com (Trailhead) is a free online technical training environment presented in a microlearning format, developed and managed by salesforce.com's in-house corporate university, and designed to facilitate all levels of system knowledge.
Trailhead offers hundreds of individual technical modules from basic introductory content through advanced system programming and technical architecture. "Trails" are optional pathways through the modules to help learners achieve their specific learning goals. "Trailmixes" are custom combinations of trails suggested by "Trailblazers" (learners) and shared within the Trailhead public environment to help others learn. Most trails include text, video, time estimates, progress indicators, multiple-choice knowledge checks, hands-on graded projects, badges, points, links to related resources, users' profiles, and feedback tools (Trailhead, 2017).

Web 2.0

The term "Web 2.0" is used to describe the wide-ranging types of online resources that permit individuals to generate, communicate, and exchange information and knowledge, such as wikis, blogs, social media sites, and others used to "harness collective intelligence" (O'Reilly, 2005, p. 2).

Organization of the Project

This dissertation is composed of five chapters. Chapter One contains the purpose of the study, global and stakeholder goals, terminology, and a brief introduction to the literature on the topic of microlearning as a format for technical training. In addition, this chapter provides the framework of evaluation for the study. Chapter Two provides a review of relevant literature on the benefits and challenges of utilizing eLearning to deliver corporate training and includes a focus on microlearning as a relatively recent evolution of technology-assisted instruction delivery. This literature examines the knowledge, motivation, and organizational influences that act upon the achievement of technical learners' stakeholder goals. Chapter Three describes the detailed methodology utilized on the project including the study population, data collection procedures, and survey instruments. Chapter Four presents the data collected and the analysis of the findings.
This dissertation concludes with Chapter Five, which includes knowledge, motivation, and organization recommendations for the use of microlearning, planning for implementation and evaluation of microlearning programs, and the limitations and delimitations of this study.

CHAPTER TWO: REVIEW OF LITERATURE

Introduction to the Problem of Practice

In their annual survey of over 2,400 human resource professionals, the enterprise learning research firm Bersin by Deloitte found that over 70% of companies across all industries and global regions cited workforce capability as one of their top five challenges for 2015 (van der Vyver, Pelster, Bersin, & Haims, 2014). Despite expenditures of more than $130 billion per year, companies have a difficult time providing a modern-day learning experience to their employees to keep pace with the demands of technical and social innovations (van der Vyver et al., 2014). A 2015 study conducted by ATD found that half of the 400 learning executives surveyed expected to deliver more learning services in the following six months, yet less than one-third expected to increase staff to support the demand (Miller, 2015a). In addition, less than half of the learning executives projected that their organizations could increase the reuse of existing content (Miller, 2015a). In the Bersin by Deloitte study, less than 10% of the learning executives felt equipped to address the needs of overwhelmed employees (van der Vyver et al., 2014). More of the same type of training, however, may not be the solution. Traditional training evaluation looks at completion rates as a measure of eLearning adoption (Hsia, Chang, & Tseng, 2012). Today's technical learners do not always need to complete a full course or understand everything about a particular topic just in case the knowledge is needed in the future.
Instead, many opt for just-in-time, just-for-me training to fill an immediate gap in required knowledge (Bova & Kroth, 2001). While course completion rates make it easier for training teams to evaluate and measure utilization and course acceptance, they do not necessarily capture the impact of transfer on the organization in terms of learner satisfaction and enhancement of corporate culture (Servage, 2005). Fast-paced industries such as high technology require nonstop learning. Training content needs to be easily individualized, always on, and effortlessly integrated into the workday (van der Vyver, Pelster, Bersin, & Haims, 2014) or available from learners' homes. This Bersin by Deloitte research found that learning leaders rated retention and engagement as a highly urgent priority to meet the demand caused by "the shrinking half-life of skills and technical knowledge" (van der Vyver et al., 2014, p. 8). In response, many organizations have adopted one or more forms of eLearning technology. The term "eLearning" incorporates a wide variety of definitions that have one thing in common: the electronic delivery of information, training, or education to an audience of learners (DeRouin, Fritzsche, & Salas, 2005). One common format of eLearning, known as self-paced instruction, involves pre-packaged content often with self-assessment features (Hsieh & Cho, 2011) and was touted as "here to stay" as recently as 2005 (DeRouin, Fritzsche, & Salas, 2005, p. 937). However, the pace of modern technological advances has negatively impacted the packaged courseware market; industry projections estimate a five-year negative growth rate (-6.4%) and a loss of $13.5 billion by 2021 (Adkins, 2016). The universal requirement to avoid content obsolescence demands agility in technical training delivery speed and accuracy.
Companies that previously invested in pre-packaged courseware have looked for more modern solutions for technical training. The shift toward simulation- and game-based content delivered via mobile learning (Adkins, 2016) is well underway. There are few empirical studies, however, that have tested the actual learning that takes place (as compared to self-reported likeability) using technology to train in corporate environments (Hug, 2012; Kim, Lee, & Kim, 2014). Šumak, Heričko, and Pušnik (2011) conducted a meta-analysis of 42 eLearning studies and found only nine involving corporate environments. A separate study found 30% of companies were delaying the use of mobile training via phone technology until there is proof of efficacy (Interactive Services, 2015). Adjustments to delivery technology alone may not be the answer. Instructional system design (ISD) calls for a methodical and sequential content creation process (see, for example, the ADDIE framework described in Morrison, Ross, Kemp, & Kalman, 2010). The end result is the delivery of structured courses with curricula arranged in a sequential flow that commonly run 45 minutes to one hour in length (Kerres, 2007). According to the Deloitte Consulting global workplace study, the time to produce this content is long, and it has resulted in a redundant, rapidly obsolete, and expensive course inventory (Deloitte Consulting, 2014). In response, and as mentioned in Chapter One, the corporate training industry is currently advocating the use of microlearning as a way to keep content current, reusable, mobile, on demand, and learner-specific. The term "microlearning" is relatively new in the United States and a brief discussion of its origin is warranted. The concept of microlearning as an instructional method has its genesis in George A. Miller's 1956 seminal research on the limitations of human working memory (Hug & Friesen, 2007).
Miller proposed that by "chunking" instructional content into meaningful segments no larger than seven units (plus or minus two) in size, intrinsic cognitive load on the learner is reduced (Miller, 1956, p. 92). According to Miller, there is no exact measurement of what constitutes a "chunk" or a unit of meaningful content; it varies depending on the extent of each learner's existing knowledge and recall ability (Miller, 1956). In his research, Miller was able to show that as separate small chunks of content are learned, they can be subsequently combined automatically in working memory and later recalled as one single chunk, thereby lessening the load on working memory over time. The microlearning instructional format employs similar chunking techniques combined with learner-controlled flexible content sequencing that allows students to complete instructional modules in singular or grouped segments, in an order and within an amount of time of the learner's choosing (Hug & Friesen, 2007). Scherer and Scherer (2007) state that microlearning is not new in terms of neuropsychology: "Our brains have always been equipped to make sense of fragments and to integrate pieces of information into contexts" (p. 123). The conceptualization and development of microlearning courses is, however, technically challenging for instructional designers. Traditional linear training design assumes a planned uniformity of learners' instructional content and path. Microlearning, on the other hand, encourages learners to generate their own sequence and extent of content to be learned, even extending it via internal and external resources such as Web 2.0 social learning environments and learner-generated content (Kerres, 2007). Although the line between learners and instructors is imprecise in this process, it is not abandoned (Kerres, 2007).
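The chunking principle discussed above lends itself to a brief, concrete illustration. The following sketch is purely illustrative and is not drawn from this study or its instruments; the function name, topic list, and chunk size of seven are hypothetical choices that echo Miller's seven-plus-or-minus-two guideline.

```python
# Illustrative sketch only (not from the dissertation): partitioning course
# topics into "chunks" of at most seven items, in the spirit of Miller's (1956)
# seven-plus-or-minus-two guideline. The function name, topic list, and chunk
# size are hypothetical choices for demonstration.

def chunk_content(topics, chunk_size=7):
    """Split a list of topics into consecutive segments of at most chunk_size."""
    return [topics[i:i + chunk_size] for i in range(0, len(topics), chunk_size)]

# Sixteen hypothetical topics become three modules of sizes 7, 7, and 2.
topics = ["Topic %d" % n for n in range(1, 17)]
modules = chunk_content(topics)
print([len(m) for m in modules])  # prints [7, 7, 2]
```

In a microlearning design, each resulting segment could become a stand-alone module; the learner, rather than the designer, then decides the order and pace in which the segments are taken.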
It is the job of instructional designers to ensure necessary content is available while allowing students to explore the optional learning strategies that work best for them. As Kerres (2007) stated, "the learning process can be prepared, but not completely prearranged" (p. 108). This makes the role of instructional design more complex, challenging, and costly, since organizational and employee performance goals usually require that students obtain at least a consistent baseline level of knowledge. Making this adjustment to microlearning is expensive; doing so without having insight into whether students actually value the microlearning format adds risk. Figure 1, for example, shows the complex schematic for development of a key performance indicator (KPI)-oriented eLearning program and suggests the need for highly skilled and experienced content developers (Wang, 2011).

Figure 1. Example of eLearning architecture. Reprinted from "Integrating Organizational, Social, and Individual Perspectives in Web 2.0-based Workplace e-Learning" by M. Wang, 2011, Information Systems Frontiers, 13(2), p. 198.

It is important for business organizations to identify how best to produce technical training curricula to support their employees' professional development goals, both for the good of the learners and for the financial bottom line of the company. As was described in Chapter One, today's training industry vendors are actively promoting the application of microlearning instructional design without providing much if any empirical evidence to back their claims. Clark and Estes (2008) stated that organizations need to cut through the marketing hyperbole to avoid "the snake oil that creates performance losses or has no impact" (p. 6). The authors likened the application of unproven training interventions to iatrogenic treatments in health care that end up making patients worse.
Clark and Estes (2008) emphasized that while the use of effective and well-designed employee performance improvement programs has been shown to increase corporate income and shareholder value, more than 60% of organizational change strategies are quickly discarded. Carefully selected and proven training methods are cost effective both from a corporate finance standpoint and through the avoidance of disruptive, ineffective professional development instruction. Before an organization employs a microlearning training methodology, I would propose that more information is needed. The learners are lynchpins in the success or failure of organizational training, and their perceptions of the format are important to hear. The focus of the next section is on the knowledge, motivation, and organizational (KMO) influences that are pertinent to the technical learners' stakeholder goal described in Chapter One—the ability of employees to articulate the benefits of microlearning for their technical skill development.

Stakeholder Knowledge, Motivation, and Organizational Influences

Knowledge Influences

I have analyzed the reviewed literature in terms of the procedural and metacognitive types of knowledge influences that affect students engaged in instruction constructed in a microlearning format. Technical training needs to address both the information missing from employees' accessible skill sets as well as any limitations in their ability to leverage their existing knowledge to solve novel problems (Clark & Estes, 2008; Rueda, 2011). It is not enough for technical learners in professional environments to memorize facts and features. Working through complex business challenges requires learners to develop and apply higher order cognitive problem solving processes that well-formulated instructional design can address (Jonassen, 2000).

Knowledge types.
In 1956, Benjamin Bloom (as cited in Krathwohl, 2002) published a hierarchical framework that identified six categories of human cognition listed in order of increasing complexity (knowledge, comprehension, application, analysis, synthesis, and evaluation) that came to be known as Bloom's Taxonomy. Forty-five years later, Krathwohl (2002) discussed a revision of Bloom's Taxonomy into a two-dimensional framework. First, there is a knowledge dimension with four categories (i.e., factual, conceptual, procedural, and metacognitive). Second, there is a cognitive process dimension that redesigned Bloom's original six categories into classifications labeled with action verbs (i.e., remember, understand, apply, analyze, evaluate, and create). The knowledge to be learned is assigned into the framework after careful consideration because some of these influences are more amenable to one instructional method over another (Rueda, 2011). Declarative knowledge (factual and conceptual) includes the basic information important to how the course is constructed, including terminology, major features, or foundational properties of the material (Krathwohl, 2002; Rueda, 2011). These influences are within the realm of instructional design research and, therefore, are beyond the focus of this study. Procedural influences include information on how to accomplish a task (Krathwohl, 2002; Rueda, 2011). These include the steps, techniques, and methods necessary to achieve the goal (Krathwohl, 2002; Rueda, 2011). Metacognitive knowledge is described by Krathwohl (2002) as the "knowledge of cognition in general as well as awareness and knowledge of one's own cognition" (p. 214). Rueda (2011) expands the definition of metacognitive knowledge to include the ability of a learner to understand why a task is appropriate and when it should be done, which enhances the learner's ability to extrapolate the knowledge into additional contexts.
As stated above, today's complex business environments require more of technical learners than the acquisition of factual and conceptual knowledge. Clark and Estes (2008) state that "findings estimate that between 50 and 90 percent of expert knowledge and skills are automated and unconscious" (p. 66). When learning in a novel environment such as microlearning, however, courses need to provide learners with sufficient knowledge and tools to inform their choices (Kerres, 2007). Although the volunteers for this study came from online technical professional groups, and, therefore, their proficiency with computerized training courses may be greater than that of the general population, it is nonetheless important to address the procedural and metacognitive influences on their learning. In 2001, Piccoli, Ahmad, and Ives found that IT students had a difficult time adjusting to learner-controlled instruction in a virtual learning environment and were dissatisfied with the format. Sixteen years later, my study looked at microlearning and asked about current-day technical learners' perceptions. The next section will review the literature on procedural and metacognitive influences on microlearning.

Procedural knowledge influences.

ELearning courses, while commonly presented in a linear, workbook-style format, can offer the learner the ability to stop, start, advance, and rewind the course as desired. Adequate procedural instructions for course navigation are, therefore, part of the influences that affect eLearners. Students also need to know how to navigate through and among the modules, monitor their progress, download course materials, complete knowledge checks, perform assignments, and obtain other procedural knowledge. There is limited empirical research on the specific use of modular microlearning design to teach technological content in corporate learning environments.
However, the increasing speed of technical innovation demands the accelerated use of new tools and techniques to keep up with the pace of demand for technical training (Brynjolfsson & McAfee, 2014; Siemens, 2007). Clear instructions, including navigation functionality (Knowles, Holton III, & Swanson, 2015), enhance knowledge transfer (Park & Wentling, 2007). Sweller, Kirschner, and Clark (2007) found that there is a need for guidance during training to reduce cognitive load, especially for novices on the topic, and that providing problem solutions helps with content transfer. For example, the microlearning format should be self-explanatory within each module (Eibl, 2007). Providing procedures for learners to view a worked example of a solution is preferable to having learners search for an appropriate resolution (Sweller, Kirschner, & Clark, 2007). Students also can be given instructions on how to group certain microlearning modules together (e.g., a suggested learning path); this is in line with the findings of Gobet et al. (2001) and Mathy and Feldman (2012) that learning similar content at one time enhances the knowledge retrieval process.

Metacognitive knowledge influences.

Part of the study was designed to identify the metacognitive processes learners employ in microlearning courses. In the microlearning format, a course overview and detailed outline can help the learner take a few moments to review the curriculum and plan a personal approach to work through the content. Awareness of the option of self-regulation is important to the effective use of technology in learning (Knowles, Holton III, & Swanson, 2015), especially in a microlearning format (Kerres, 2007). Throughout the course, learners can be asked to reflect on and evaluate their existing strengths on the topic before launching the module.
Students can be given the option to test out of content in which they feel confident in their skills and, depending on the course content, completion of the course may not be required. Hierdeis (2007) found that transfer increases when the learner controls the learning path and recognizes the need to review or test out of content. Students can also be introduced to general learning strategies during problem-solving exercises. The learner is made aware that they are in control of the process of learning the content in whatever order is most effective for them. Cognizance of course flexibility has been found to be a significant contributing factor in eLearner satisfaction (Sun, Tsai, Finger, Chen, & Yeh, 2008). Asynchronous courses are more effective when the learners understand the benefits of being in control, including learning interactively (Knowles, Holton III, & Swanson, 2015), in small steps through a flexible path (Hierdeis, 2007), and via a mobile device (Kinnebrock, Baeßler, & Rössler, 2007). However, when learners become proficient at a task, they are not always aware of the actions they take (Rueda, 2011). The survey instrument in this study asked participants to state the number of years of experience they had both with technical training in general and with microlearning specifically. For those participants with extensive microlearning experience, it was anticipated that some of their metacognitive processes had become automated and would not be easily described. Therefore, several findings described in Chapter Four differentiate metacognitive data between those participants with high and those with low microlearning experience.

Motivation Influences

The focus of this section is on the motivation-related influences that are pertinent to the participants’ engagement with technical training courses delivered in formats that accommodate and facilitate the achievement of their individual learner goals.
In Burke and Litwin’s (1992) causal model for organizational performance and change, employee motivation is at the center of the individual and personal factors that interact with all functions within a company’s internal environment and impact its external environment. A modification in one factor, such as motivation, will eventually have an impact on all of the other factors, including the organization’s ability to perform on its mission and strategy. Motivation is, therefore, important to explore as it relates to an employee’s ability to acquire the skills and knowledge essential to contribute to the organization’s success. I next review the theoretical research on expectancy value, goal orientation, and self-efficacy. See Figure 2 for a diagram of Burke and Litwin’s (1992) model.

Figure 2. Burke and Litwin’s (1992) causal model of approaches to managing organizational change. Recreated from a graphic by Manu Melwin Joy, Assistant Professor, Ilahia School of Management Studies (India), posted on LinkedIn SlideShare, 2015; retrieved October 23, 2016, from http://www.slideshare.net/manumelwin/burke-litwin-change-model.

Motivation is the driving force that makes learners start, maintain, and put energy into completing a task. In their KMO framework, Clark and Estes (2008) identify motivation as one of the most critical factors in the enhancement or degradation of employee performance. The authors explain that motivation is affected by three factors: active choice, persistence, and mental effort. In active choice, intention is transformed into action. A learner’s resistance to starting a task can point to motivational problems involving active choice (Clark & Estes, 2008). Persistence involves the willingness of learners to continue with a task despite obstacles, including distractions (Clark & Estes, 2008).
When less essential tasks are pursued instead of important goals, or when the learner stops the effort entirely, there is likely a motivational problem of persistence (Clark & Estes, 2008). Mental effort involves the learner investing sufficient work in achieving the goal (Clark & Estes, 2008). Confidence plays a role in a learner’s investment of effort; too much confidence often results in a lack of effort, whereas too little confidence can result in problems with active choice and persistence (Clark & Estes, 2008). The most effective balance to ensure an appropriate investment of mental effort requires challenging the learner with a task that is moderately difficult and on which the learner is neither over- nor underconfident (Clark & Estes, 2008). While the range of motivation theories is extensive, for purposes of this study, I focused on expectancy value, goal orientation, and self-efficacy theories.

Expectancy value theory. According to Eccles (2006), the expectancy value theory of motivation posits that a learner’s motivation is controlled by two primary beliefs: confidence (“Can I do it?”) and desire (“Do I want to do it?”; p. 1). A positive response to the first question is a strong predictor of success and provides additional motivation to attempt even greater challenges (Eccles, 2006; Pintrich, 2003). A negative response predicts the opposite. For example, the addition of gamification features in learning is often intended to add intrinsic (e.g., sense of accomplishment) and extrinsic (e.g., badges, rankings) motivation (Karimi & Nickpayam, 2017), and for some learners it does. However, not all learners find value in gamification (Landers, Bauer, & Callan, 2017). Motivation can decline in “gamified” courses as learners encounter conflict due to the loss of points, frustration, disappointment, competitive pressure, and other factors (Roosta, Taghiyareh, & Mosharraf, 2016).
Understanding a learner’s answer to Eccles’ (2006) second question, “Do I want to do it?”, involves investigation into additional sub-factors: intrinsic value, attainment value, utility value, and perceived cost. Citing supporting research, Eccles (2006) stated that the first sub-factor, intrinsic value, concerns the gratification or meaningfulness the individual receives from the activity. The second sub-factor, attainment value, involves the extent of the activity’s contribution to the individual’s self-identity. The third sub-factor, utility value, represents the alignment of the task with the learner’s desires and goals. The fourth sub-factor, perceived cost, is derived from the learner’s calculation of whether the price of participation in the task is worth the effort. In summarizing the research on expectancy value theory, Pintrich (2003) stated that the learner’s belief in the value of a task tends to predict active choice, whereas the learner’s perception of their competence tends to predict achievement of the undertaking once effort has begun. The answer to the “Do I want to?” question is highly individualized, and the factors that contribute to its evaluation are as varied as learners’ perceptions. Sun, Tsai, Finger, Chen, and Yeh (2008) found that a learner’s perceived usefulness of a course (e.g., how the course content can be applied to a job) increases eLearner satisfaction. The extent to which a learner values the flexibility of content, time, and path (inherent in training delivered in a microlearning format) can increase motivation and deliver quick successes that encourage more use (Edge, Fitchett, Whitney, & Landay, 2012).
The infusion of motivational design such as the ARCS model (i.e., attention, relevance, confidence, and satisfaction) for eLearning, and the inclusion of such features as mystery, variability, personally selected goals, challenge, fairness, equity, and rewards, can affect motivation (Keller & Suzuki, 2004). Scherer and Scherer (2007) found that novelty in eLearning content with resulting rewards (extrinsic or intrinsic) may be valued by learners, that short meaningful content may resolve variable learning demands and reward the learner, and that the addition of reflective exercises may help learners create personal meaning. Welsh, Wanberg, Brown, and Simmering (2003) found that learners were motivated by the value they found in the increased convenience of modular learning coupled with just-in-time options, including customizable content flow. In addition, perceived usefulness of eLearning content has been found to have a critical relationship with user satisfaction and motivation to complete the course (Sun, Tsai, Finger, Chen, & Yeh, 2008). However, these factors are all susceptible to each individual’s perception of value. Therefore, it is important to assess students’ perceived value of the microlearning format for technical training.

Goal orientation theory. Yough and Anderman (2006) state that goal orientation theory focuses on the reasons why learners take part in training or educational activities. According to Pintrich, Wolters, and Baxter (2000), it is probable that individuals are aware, or can be made aware, of why they are undertaking the training and what they are trying to achieve. Yough and Anderman (2006) proposed a two-by-two matrix for assessing goal orientation in learners. On one axis, students are evaluated based on whether their goal orientation is focused on mastery or performance.
Mastery orientation involves a student’s goal of actually learning a task, comparing their current performance to their own prior accomplishments. Conversely, performance orientation involves a student’s goal of doing better than anyone else, regardless of whether the content is actually learned or transferred to the job. The other axis focuses on approach goals and avoidance goals. Mastery-approach oriented students demonstrate a desire to learn the content, whereas mastery-avoidance students have a desire to keep from misunderstanding the information. Performance-approach oriented students focus on proving their competency compared to everyone else, whereas performance-avoidance students desire to keep from looking bad or incompetent compared to other students (Yough & Anderman, 2006). Notably, however, Pintrich, Wolters, and Baxter (2000) theorized that the relationship between a learner’s orientation to goal achievement and their desired outcome is too complex to fit a simple good-versus-bad dichotomy of mastery orientation (e.g., a focus on learning and understanding) versus performance orientation (e.g., a focus on being the best in comparison to others). Instead, Pintrich, Wolters, and Baxter (2000) recommended investigating the multiplicity and interrelationships of the assortment of variables that contribute to each student’s motivation. The specific demographics and professional backgrounds of the participants in the present study were identified from survey responses and are presented in Chapter Four. Most but not all participants were required to take technical training for work purposes. Many defined specific motivation factors for completing technical microlearning courses. These included intrinsic goals, such as confidence in the currency and relevancy of their skill sets, as well as extrinsic goals, such as standings on gamification leaderboards.
It was important to assess the goal orientation of the participants as described by Yough and Anderman (2006), and further differentiated by Pintrich, Wolters, and Baxter (2000), in relation to their perceptions of the microlearning format for technical training. For example, if an individual stakeholder’s current role in their organization made attainment of a technical certification optional, it is possible that they took a performance-avoidance orientation to the technical microlearning courses; in other words, they needed to complete the courses successfully to avoid looking bad to their colleagues and their manager. According to Elliot and Harackiewicz (1996), only performance goals with an avoidance approach have been shown to lead more often to poor outcomes. In contrast, it was important to assess whether participants endeavored to learn the course content so that they could apply it to their role and/or achieve a technical certification. It is possible that those learners employed a mastery-approach orientation to become skilled at the modular format and control it to their best advantage. Those participants who were more risk-averse could have taken a mastery-avoidance approach to ensure they did not miss a step in their learning. Finally, learners might have employed a performance-approach orientation to microlearning courses if their goal was to be highly regarded as a top performer.

Self-efficacy theory. A core principle of microlearning is the empowerment of students to control their own learning process (Hug & Friesen, 2007). The theory of self-efficacy (Pajares, 2006) was developed from research by Bandura (1977) on the motivational power of personal belief in the ability to accomplish a task. A student’s perception of self-efficacy is highly predictive of their motivation for learning (Zimmerman, 2000).
Beliefs in self-efficacy positively impact a learner’s performance through the identification of challenging goals, the exertion of effort, and persistence with the task (Pajares, 2006). Conversely, individuals with a low perception of self-efficacy with a task can perceive unrealistic levels of difficulty and experience depression and stress (Pajares, 2006). The level of self-efficacy belief with eLearning systems has been found to be related to the learner’s locus of control (Hsia, Chang, & Tseng, 2014). Those who perceive that they control their environment (“internals”) have greater confidence in their ability to understand eLearning systems, have high expectations for the ease of use of eLearning to gain knowledge, and express increased intentions to continue using eLearning. Those who do not believe they are in control of their environment (“externals”) tend to have less confidence in their ability to use new technology, expend less effort to learn, and demonstrate less persistence at the task (Hsia, Chang, & Tseng, 2014, p. 53). Therefore, it is important to develop a student’s confidence in their ability to learn and be successful with the organization of innovative eLearning courses (Kinnebrock, Baeßler, & Rössler, 2007) such as microlearning.

Organizational Influences

Organizational influences on employee learning can be broken down into two broad categories: cultural settings and cultural models (Gallimore & Goldenberg, 2010; Schein, 2004). Cultural settings are the structural aspects of the company, including the availability of resources (Gallimore & Goldenberg, 2010; Schein, 2004). Cultural models refer to the common understandings among employees within the organization as to how things work and what is valued (Gallimore & Goldenberg, 2010; Schein, 2004). An organization’s internal influences (e.g., structure, leadership, management, systems, mission and strategy, etc.)
and external influences (e.g., economy, clients, competitors, etc.) all affect employees’ ability to learn the tasks and skills they need to perform. Organizations can cultivate different cultures over time as the company matures, grows, and changes (Clark & Estes, 2008). Barriers to organizational learning, such as a lack of corporate goals, financial constraints, and strict rules and regulations, can therefore be transient or persistent (Schilling & Kluge, 2009). The following sections provide additional detail on the cultural setting and cultural model influences that likely affect the adoption of microlearning by stakeholders.

Cultural setting influences. When engaged in the delivery of technical training in a microlearning format, organizations need to provide the resources necessary to support its design, development, and delivery. These resources include (a) technical infrastructure and staff; (b) funding; and (c) time for training.

Technical infrastructure and skilled staff. By definition, any type of eLearning involves the use of technology to deliver training (Welsh, Wanberg, Brown, & Simmering, 2003). For microlearning, organizations need to provide robust, high-speed online learning resources capable of supporting a variety of content (e.g., live instruction, recorded video, digital images, web content, etc.). A key to the successful use of microlearning is its on-demand, any-time, any-place delivery potential. However, technical malfunctions and slow connection speeds can dampen learner engagement and motivation (Sun, Tsai, Finger, Chen, & Yeh, 2008). Forrester Research, Inc. found that 26% of its study’s respondents stated that the complexity of their companies’ IT environments presented a primary barrier to their ability to deliver on-demand access to information for the development of expertise (Owens, 2010).
While not specifically focused on microlearning, the Forrester study respondents’ expertise initiatives included learning and development, knowledge and content management, collaboration, and information architecture. Beyond technology, organizations need to employ expert learning resources to build engaging content. According to the Forrester research, content and collaboration professionals are highly skilled and highly compensated, with expertise beyond training design and delivery that includes cross-departmental influence, communication, empowerment, and engagement (Owens, 2010). In that same Forrester study, 23% of respondents cited a lack of staff as a barrier to the success of their knowledge-sharing initiatives.

Funding. Powerful technical infrastructure, instructional designers, content developers, and IT support employees are expensive (Ali & Magalhaes, 2008). While the price of technology has fallen, the demand for increased capacity has grown, such as system bandwidth to stream video, access and data security, and technical support resources (Ali & Magalhaes, 2008). Cost-saving cloud-based solutions (off-premise infrastructure) are available, yet with the increase in computer hacking, security remains a key and potentially costly concern (Zhang, Cheng, & Boutaba, 2010). Forrester research found that over 25% of its study respondents stated that a lack of funding impeded the success of their content and collaboration initiatives, and only 30% expected any increase in funding the following year (Owens, 2010).

Dedicated time for training. One common organizational barrier to employees’ effective use of learning content is the lack of time set aside for training (Ali & Magalhaes, 2008; Brethower, 2000; Ettinger, Holton, & Blass, 2006).
A frequently cited benefit of training delivered in an eLearning format compared to traditional classroom instruction is the ability for employees to flex their training time to their workload (Klein, Noe, & Wang, 2006; Zhang, Zhao, Zhou, & Nunamaker, 2004). However, organizations that provide time for an employee to attend an instructor-led class may not similarly allocate time for eLearning (Ettinger, Holton, & Blass, 2006). The lack of apportioned training time sets up what Brown (2005) calls an “eLearning paradox” (p. 476): those who need training to become more efficient are often not afforded the time by their organizations to take the eLearning. An organization’s commitment to making training time available to its employees is a key component in the initiation and completion of eLearning courses (Ettinger, Holton, & Blass, 2006; Rossett & Schafer, 2003). This includes the need for a reduction in workload (Brown, 2005) and opportunities for reflection, discussion, debate, and application (Paine, 2014). It is common for technical employees to be tightly constrained by the work hours and locations of their clients and the demands of project deadlines (Donnelly, 2006). For example, in the IT consulting business, few if any clients are willing to pay consultants’ hourly rates in the midst of a project to give them time to take training, even though the client is the ultimate beneficiary of the improved skills and knowledge (Donnelly, 2006). Time for training is an ongoing challenge for consulting-industry employers, as time and expertise are precisely the products being sold (Sarvary, 1999). Giving time for training builds the intellectual capital the organization needs to compete, but simultaneously cuts into its profitable billable hours (Sarvary, 1999).
However, requiring employees to train on their own free time impacts their satisfaction with their work/life balance (Donnelly, 2006; Ettinger, 2006) and presents its own barrier to learning.

Cultural model influences. There is a wide range of cultural models that affect corporate learning, including job involvement, perceived encouragement, and appreciation (Aguinis & Kraiger, 2009). In technical companies, an organization’s learning culture was found to be positively associated with employee job satisfaction (Egan, Yang, & Bartlett, 2004). Yet a Forrester research study found that nearly 40% of responding content and collaboration professionals felt that political aspects of the organization’s culture created a significant barrier to the attainment of their goals (Owens, 2010). Organizations need to promote learning technology within the corporate culture, managers need to support it, and companies need to make a serious commitment to its success (Ali & Magalhaes, 2008). Among the organizational barriers that affect technical training are two cultural models: ingrained traditional learning methodologies and the organization’s commitment to knowledge management.

Ingrained traditional learning methodologies. The organization needs to be willing to expand beyond ingrained learning methodologies and engage in innovative training techniques for professional technical development. The speed of adoption of new methods depends in part upon the extent to which the existing learning culture is ingrained (Ali & Magalhaes, 2008; Paine, 2014) and resistant to change (Ali & Magalhaes, 2008). The adoption of new technologies is impacted by organizational culture and group norms (Coeurderoy, Guilmot, & Vas, 2014). To maximize user adoption, the organization needs to support and embrace eLearning technology within the corporate culture (Ali & Magalhaes, 2008).
Not every employee will immediately prefer the eLearning format for training (Zhang, Zhao, Zhou, & Nunamaker, 2004), the concept of being a self-directed learner will be novel for some students (Piskurich, 2000), and not all content can be reduced to written terms amenable to electronic distribution (Leidlmair, 2007). However, within organizations, the process of learning is changing. By linking specific micro content (including experiential commentary) to macro topics, an organization can lessen learner resistance by providing direct access to the specific nugget of knowledge being sought (Schäfer & Kranzlmüller, 2007), rather than relying on more traditional eLearning courses that require learners to endure a lengthy linear progression through modules. Therefore, organizations need to overcome ingrained methodological barriers and develop a learning culture supportive of innovative training technology.

Organization’s commitment to innovative knowledge management. Organizations need to support their employees’ acquisition and development of technical acumen through innovative learning via the social sharing of highly tacit knowledge (Kohlbacher & Mukai, 2007), such as through Web 2.0 technology and microlearning. In the technical industries, knowledge workers tend to work in teams on multiple simultaneous client projects (Sarvary, 1999). From those unique experiences, lessons are learned that could benefit the rest of the organization if the company has a knowledge management methodology and a commitment to its dissemination (Sarvary, 1999). Synthesis and distribution of this knowledge creates a competitive advantage but takes a large-scale corporate commitment to enable, maintain, and encourage (Sarvary, 1999). Sarvary (1999) provides insight into the internal competition among consultants and their reluctance to share hard-won expertise with those in the organization who stand to gain financial incentives by freely using the work of others.
The barriers to developing a knowledge-sharing culture are not easily breached (Riege, 2005), but it is essential for organizations to do so in order to develop a culture with a strong emphasis on professional and technical development.

Summary

This study sought to add to the eLearning industry’s knowledge repository by specifically evaluating the acceptability of the microlearning format as a means for the delivery of technical training. Research summarized in this chapter indicates support for the concept of chunking content into user-defined segments to reduce the cognitive load on working memory. In addition, the research proposes giving users as much control as possible over their eLearning environment to facilitate personal mastery, encourage self-efficacy, and increase the perceived value of the time spent in the course. Multiple organizational barriers, including time limitations, just-in-time training demands, and fast-paced changes in information technology, all point to the possibility that the flexibility offered within the microlearning format can overcome those barriers and provide an acceptable format for technical training. However, questions remain. How do adult learners perceive their use of microlearning-formatted instruction for technical training in relation to their ability to achieve their professional development goals? What are the perceived knowledge, motivation, and organization influences that the learners report as having an impact on their willingness to accept technical training delivered in a microlearning format? In Chapter Three, I will describe the methodological approach taken in the study, including the research design, sampling procedure, and data collection process.
Table 2 summarizes the knowledge, motivation, and organizational influences on learners’ perceptions of the use of microlearning for technical training, as well as the methods of assessment through a survey instrument and interview sessions.

Table 2

Knowledge, Motivation, and Organizational Influences and Assessments

Assumed Knowledge Influences

Procedural: Learners need to know the steps, techniques, and methods necessary to navigate eLearning courses to achieve their learning goals.
Assessment: Through survey questions, participants were asked about their perceptions of the importance of clear course navigation features and instructions.

Metacognitive: Learners need to recognize metacognitive strategies to facilitate learning, as they are a contributing factor in eLearning user satisfaction.
Assessment: Through survey and interview questions, participants were asked how the features of instruction in an eLearning format in general, and the microlearning format in particular, affect their ability to learn technical content.

Assumed Motivation Influences

Expectancy Value: Learners need to believe that they are capable of achieving their technical training goals and find value in taking instruction via courses in a microlearning format.
Assessment: Through survey and interview questions, participants were asked what features of technical courses delivered in a microlearning format they value and expect to contribute to the achievement of their learning goals.

Goal Orientation: Students need to know why they are taking technical training courses and what they are trying to achieve.
Assessment: Surveys and interviews elicited participants’ reasons and goals for taking technical training.

Self-Efficacy: Learners need to feel efficacious in learning technical content delivered in a microlearning format.
Assessment: Through survey and interview questions, participants were asked what features of technical courses delivered in a microlearning format they felt competent in using to achieve their learning goals.

Assumed Organizational Influences

Cultural Setting Influence: Organizations need to provide the resources (infrastructure and staff, funding, and employee time) required to support microlearning technical courses.
Assessment: Through survey and interview questions, participants were asked about their current company’s commitment of resources to support technical training.

Cultural Model Influence: Organizations need to embrace innovation and overcome ingrained learning methodologies to create a culture focused on organizational learning and professional technical development.
Assessment: Through survey and interview questions, participants were asked about their current company’s training methods, eLearning use, and their perception of the company’s commitment to professional development.

CHAPTER THREE: METHODOLOGY

Purpose of the Project

The purpose of the project was to evaluate adult learners’ perceptions of technical training delivered in a microlearning format. As described in Chapter One, workplace training industry vendors, conferences, and webinars are promoting microlearning as the next best thing for employee training, yet little empirical research can be found to support those claims from the perspective of the learners. The question of learning effectiveness using a microlearning format is beyond the scope of this project due to the myriad of dynamic influences that impact the measurement of effect. The focus is, instead, on the perceptions of users. In other words, if we build technical training courses in a microlearning format, will people find value in them and use them?

Research Questions

1.
What are the knowledge, motivation, and organization influences that impact adult learners’ perceptions of the benefits of microlearning for technical skill training?

2. What are the recommendations for organizational practice in the areas of knowledge, motivation, and organizational resources related to employees’ ability to articulate the benefits of microlearning for their technical skill development?

The first research question formed the basis for both the quantitative and qualitative phases of inquiry in this study. The recommendations at the core of the second research question were developed from the analysis of the quantitative and qualitative data and are discussed in Chapter Five.

Conceptual Framework

The purpose of a conceptual framework is to organize the potential concepts, theories, influences, and settings of the research as they pertain to the stakeholder goals (Maxwell, 2013). The conceptual framework for this study was designed to assess adult learners’ perceptions of the use of the microlearning format for technical training. The framework applied in this study is one developed by Clark and Estes (2008) “to help organizations make effective decisions about performance [improvement] products and services” (p. 1) by assessing how knowledge, motivation, and organizational factors affect performance goals. For this study, knowledge influences included the need for learners to understand the procedural features and metacognitive characteristics built into the microlearning format. These included how to navigate among detached yet related course modules, select specific content to meet learning goals, and employ metacognitive (self-monitoring) options. Motivation influences examined included the participants’ expectancy value (i.e., how much learners value the learning goal and expect to succeed), goal orientation (i.e., mastery of the content or performance measured against other learners), and self-efficacy (i.e.,
ability to successfully achieve performance goals) utilizing the microlearning format. Organizational influences included the participants’ employers’ cultural model (e.g., whether technical training and self-regulated learning are encouraged) and setting (e.g., whether technical courses are offered internally and whether time is allotted to employees for training). Figure 3 represents the conceptual framework utilized for this study.

Figure 3. Conceptual framework for this study.

Participating Stakeholders

Because this study’s problem of practice was field-based, the stakeholder population consisted of anonymous volunteers from multiple online communities that had in common the need for technical training. The total membership of these groups is in excess of 50,000 individuals. Online communities offer researchers access to individuals who share common interests and are often accepting of researchers if the knowledge gained is shared back with the group (Wright, 2005). I am a member of these technical groups and have attained many of the industry certifications and knowledge that others in these communities seek. Following informed consent and an opt-in methodology, members of these virtual groups accepted the study as one of good intentions conducted by an industry colleague. The study therefore avoided most of the negative reaction experienced by outsiders researching online communities (Eysenbach & Till, 2001). I posted several messages to these communities requesting volunteers, included a link to this study’s survey instrument (see Appendix A), asked volunteers to send the link to people they knew who had also taken microlearning courses (snowball sampling), and accepted all participants over the age of 18. The posts clearly stated that participation in the survey was voluntary and anonymous, and requested that volunteers complete the survey for purposes of my dissertation.
Members of these online communities are either already utilizing technical software and have goals to learn more, seek certification on technical software, or want to learn additional technology for their careers (e.g., entry into the cloud computing industry, advancement, consulting, management). While the request for participation listed multiple types of microlearning course sources as examples (e.g., Skillsoft, lynda.com, Coursera, Udemy), salesforce.com’s free public Trailhead training website was listed first as it was likely to be a recognized source used by these specific social group members. As the pool of prospective participants had already demonstrated Internet literacy and Web availability via their participation in online communities, access to a survey delivered via the Web did not present a barrier to participation. Although online surveys of tech-savvy individuals have had response rates as high as 42.3% (Millar & Dillman, 2011), the buy-in percentage for the survey was approximately 0.25% (113 participants) of the total recruited audience. This small response rate may have been caused by the offer of only a chance of a financial reward (a random drawing for one of ten online gift cards) as opposed to a guarantee to all participants. Millar and Dillman (2011) found that the use of a nominal incentive increases participation in online surveys. However, since the volume of participation could not be estimated, a guaranteed financial incentive combined with a high response rate could have created a hardship for the researcher. This volume of participants nonetheless generated a wide range of perspectives, given the diversity of their organizations, experiences, levels of expertise, and technical acumen.
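The response-rate arithmetic above can be illustrated with a short sketch. The membership figure and rates below are the chapter’s own estimates, not exact counts:

```python
# Hypothetical illustration of the response-rate estimate described above.
recruited_audience = 50_000          # approximate total membership of the online groups
estimated_rate = 0.0025              # the 0.25% opt-in estimate
estimated_respondents = recruited_audience * estimated_rate
print(estimated_respondents)         # 125.0

actual_respondents = 113             # eligible participants reported in Chapter Four
actual_rate = actual_respondents / recruited_audience
print(f"{actual_rate:.2%}")          # 0.23%
```

The estimate and the actual count land close together, which is why the chapter describes the 113 respondents as "very close" to the proposal-stage projection.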
Methodological Framework

The study employed an explanatory mixed method data collection strategy utilizing an online survey program (Qualtrics, hosted by the University of Southern California) followed by individual telephone interviews of ten participants who volunteered from the survey response group. An explanatory mixed method research design utilizes qualitative methods to develop a greater understanding of the research questions after quantitative data are collected (Creswell, 2014). The use of multiple methods of data collection allows for the discovery of different facets and depths of the topic being studied (Maxwell, 2013). The survey format has been found to be an effective method to study “hard-to-reach” online participants (Andrews, Nonnecke, & Preece, 2003). Technical communities are populated by individuals who are geographically distributed around the world and who work from business offices, home offices, client locations, hotels, and any number of mobile locations. The fact that this population was widely dispersed in both time zone and geography made any assessment techniques other than online surveys and telephone interviews problematic. This project was a field-based study; therefore, survey participants were sought from a general population of individuals in various technical industries. Specifically, multiple social media posts were made in several technical discussion groups to solicit assistance with this dissertation study. The main survey instrument requested that participants opt in for a chance to be one of ten individuals randomly selected for a short follow-up interview by leaving a contact phone number or email address in a separate survey instrument. Randomly selected interviewees were guaranteed a gift card.
The combination of a survey instrument with interviews was designed to enhance the validity, reliability, and comprehensiveness of the data through triangulation using multiple sources. Two or more measures of data serve to reduce bias, corroborate results, and improve the reliability, validity, and richness of research findings (Jonsen & Jehn, 2009).

Survey Instrument

The primary data collection method for this study was an online survey utilizing the Qualtrics tool hosted by the University of Southern California (see Appendix A). The survey format was selected for data collection for multiple reasons. The set, formal structure of a survey made it possible to compare results across participants (Maxwell, 2013). The survey was also practical for both the researcher and the participants. The microlearning instructional format was at the heart of the research project and the surveys measured how the features made possible by the microlearning design impacted learners; therefore, microlearning’s on-demand format aligned with an on-demand data collection instrument. Data collection design should include feasible methods (Maxwell, 2013). The survey format reduced inconvenience to the participants since it was completed asynchronously, which eliminated the need to find a common date and time for a synchronous data collection method. The survey was designed to take participants less than ten minutes to complete and was open for 60 days. Galesic and Bosnjak (2009) found that online surveys no more than ten minutes in length were more likely to be completed than those that exceeded ten minutes.

Survey recruitment, sampling criteria, and rationale. All volunteers over the age of 18 who opted in for the surveys from the online communities were accepted as study participants. The study utilized an “opt in” format for subjects’ participation obtained via an informed consent form embedded in the survey.
Research has found that survey researchers who employ an “opt in” methodology have lower response rates than those who use an “opt out” format (Hunt, Shlomo, & Addington-Hall, 2013). However, a requirement of this study was to ensure voluntary participation at all times. Ethics required that participation be achieved via informed consent (Glesne, 2013; Merriam & Tisdell, 2009), with each subject given the option to stop participation at any time without retribution or penalty (Glesne, 2013). During this study’s proposal stage, it was estimated that the voluntary opt-in format would result in an approximate 0.25% response rate, or between 125 and 250 study participants depending upon the fluctuating number of total active members in the online groups. As described in Chapter Four, the actual total number of respondents over the age of 18 was very close to that estimate: 113 eligible participants. The participants completed several demographic questions in the survey which showed that they differed in multiple respects: age, years of technical training experience, years of experience with microlearning, profession, gender, organizational setting, motivation, and other factors. In some cases, these factors were used to segment the participants’ perceptions of the microlearning delivery format. For example, the responses of participants who had completed more than 50 microlearning courses were compared to those who had completed 10 or fewer. Although the volume of responses did not support full statistical analysis, the comparisons shed important light on the varying perspectives of the participants.

Survey criterion 1. The first qualification standard for the survey sample of study participants was that the individuals were over the age of 18. This was required to secure informed consent.

Survey criterion 2. The participants needed to be interested in learning technical content delivered via microlearning courses.
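The segmentation described above (heavy versus light microlearning users) can be sketched as a simple filter over response records. The field names and values below are invented placeholders, not the actual Qualtrics export schema:

```python
# Hypothetical response records standing in for the survey export.
responses = [
    {"id": 1, "courses_completed": 55, "perception_score": 4},
    {"id": 2, "courses_completed": 8,  "perception_score": 3},
    {"id": 3, "courses_completed": 12, "perception_score": 5},
    {"id": 4, "courses_completed": 3,  "perception_score": 4},
]

# Segment into the two groups compared in the study.
heavy_users = [r for r in responses if r["courses_completed"] > 50]
light_users = [r for r in responses if r["courses_completed"] <= 10]

def mean_score(group):
    """Average perception score for a segment, or None if empty."""
    return sum(r["perception_score"] for r in group) / len(group) if group else None

print(mean_score(heavy_users), mean_score(light_users))  # 4.0 3.5
```

As the chapter notes, with only 113 responses such comparisons are descriptive rather than inferential.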
The rationale for this criterion was that online surveys achieve higher response rates on topics that interest participants (Wright, 2005). In addition, I offered to share the results back to each group from which participants were recruited. Research should result in mutual gain for both the researcher and the participants (Glesne, 2011; Maxwell, 2013).

Survey criterion 3. The third qualification standard for the participants was that they self-select (opt in) for the study. The rationale for this criterion was that one of the primary doctrines of ethical research is that involvement in the study is voluntary, with an option to stop participating at any time (Glesne, 2011). I respected the personal decision of each participant to opt out of the survey and interview at any time. It is important to note here that 18 of the original 134 people who started the survey elected to stop answering questions beyond the demographic profile; the survey tool did not allow more than one response per IP address. Their decision to stop participation was honored, and data from those individuals were excluded from the results reported in Chapter Four.

Interview

This study followed an explanatory sequential mixed method design, which first employs a quantitative component followed by a qualitative component and ends with data evaluation (Creswell, 2014). While the primary source of data for this study was the online survey responses, it also included an opt-in option for participants to take part in a personal semi-structured telephone interview. It is common for a mixed method study to have one main data source followed by a secondary source such as “a qualitative component whereby you interview a certain subset of survey respondents based on a purposeful criteria” to capture more depth into the participants’ perceptions (Merriam & Tisdell, 2009, p. 44).
The nature of the purposeful criteria is explained in the interview sampling criteria section below.

Ten individual semi-structured telephone interviews were completed with participants who were randomly selected from those who had completed the survey and had opted in for an interview session. Ten interview questions were prepared; at least five of these questions were asked of each interview participant. In all interviews, the responses of the participants led to unscripted follow-up questions to dive deeper into explanations when a respondent’s answers revealed a unique train of thought related to the research questions. A qualitative interview is worth the time that must be invested because it allows the researcher to improve the understanding of the topic by adding “the coherence, depth, and density of the material each respondent provides” (Weiss, 1994, p. 3). Prepared questions addressed the participant’s personal opinions as to the acceptability of the microlearning format for technical training, the participant’s reflection on their decision-making process for the path they chose through the course, and their overall impression of the microlearning format (see Appendix B for the full slate of prepared interview questions).

Interview recruitment, sampling criteria, and rationale. The final question on the Qualtrics survey asked participants to opt in for consideration to be included in a brief telephone interview. The target interview audience was up to 10 participants; 21 survey participants volunteered for interviews. All volunteers were working professionals with limited time; therefore, the telephone interviews lasted between 15 and 30 minutes. The interview portion of the study was conducted after survey participation was closed (60 days after launch). Interviews took place over a two-week period beginning three weeks after the survey closed.
This timing was designed to (a) give me time to conduct an initial evaluation of the survey results to further guide the interviews; and (b) maximize the windows of opportunity for interview participants to find a mutually agreeable time for a telephone interview.

I designed a purposeful sampling technique into the online survey so that the option to participate in the interview phase of the study was offered only to those who indicated microlearning experience, thereby excluding those who stated they had no microlearning experience. The purpose of the interview was to gather a deeper understanding of the microlearning experience; therefore, those without such experience would not have been equipped to answer the interview questions. However, these survey participants were not ignored. Those without microlearning experience were asked a separate set of survey questions to discover their interest in the format and their expectations for taking future microlearning courses. Their responses are included in Chapter Four. Purposeful sampling has among its goals the desire to achieve a representative range of illustrative data and is appropriate for small research populations (Maxwell, 2013). Purposeful sampling is common in qualitative research and is employed to “discover, understand and gain insight and therefore [the researcher] must select a sample from which the most can be learned” (Merriam & Tisdell, 2009, p. 95). The use of purposeful sampling improves the likelihood that the participants can provide information specific to the study’s research questions that would not be forthcoming with another sampling method (Maxwell, 2013).
It is possible that the selection of a small subset of volunteer participants from a large number of volunteer participants led to an increase in volunteer bias (Bell, 1961); however, due to the purposeful sampling method, the research findings will not be generalized beyond the assessment of the study group population’s perceptions about the microlearning format for delivery of technical training.

Interview criterion 1. The first qualification standard for the sample of interview participants was that the individuals were over the age of 18. This was required to secure informed consent.

Interview criterion 2. The interview participants had to complete the online survey portion of the study. This was required to facilitate triangulation of the data as well as to elicit the explanatory narrative that supplements the survey responses and provides deeper meaning for the findings. The in-depth meaning that can be generated by qualitative investigation contributes to the researcher’s understanding of the topic (Maxwell, 2013).

Interview criterion 3. The third qualification standard for the interview participants was that they self-select (opt in) for the study. The rationale for this criterion was that one of the primary doctrines of ethical research is that involvement in the study is voluntary, with an option to stop participating at any time (Glesne, 2011). I respected the personal decision of prospective participants to opt in to the interview portion of this study and verbally advised them of the option to stop at any time.

Data Analysis

Data analysis was conducted through an iterative process. The initial descriptive analysis of the survey data began as the responses were submitted. Interviews were conducted after the survey was closed to further participation but before the survey analysis was complete; therefore, survey responses helped to inform the semi-structured interview sessions.
I wrote analytic memos after each interview and documented my thoughts, concerns, and initial conclusions about the data in relation to my conceptual framework, research questions, and initial survey analysis. All interviews were transcribed verbatim. Although all interview participants were offered the opportunity to review their interview transcript prior to the writing of this dissertation, none made such a request. The survey questions had been designed before the study began to answer both of the research questions; therefore, I created a list of a priori codes from the conceptual framework before the interviews began. After the interviews were complete, I used the NVivo tool to assist with the transcript coding process and conducted the first review while applying the a priori codes. With subsequent reviews of the transcripts, and in tandem with the frequency analysis of the survey responses, empirical codes were added to the interview transcripts and aggregated into analytic/axial codes. In the third phase of data analysis, I identified pattern codes and themes that emerged in relation to the conceptual framework and research study questions. The study results are discussed in Chapter Four.

Credibility and Trustworthiness

This study employed a sequential explanatory mixed method design conducted ethically and with rigorous adherence to traditional qualitative and quantitative methodologies.
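The a-priori-then-empirical coding pass described above can be sketched as a simple frequency tally. The codes and excerpt tags below are invented placeholders; the actual analysis was conducted in NVivo:

```python
from collections import Counter

# Hypothetical a priori codes drawn from the KMO conceptual framework.
a_priori_codes = {"knowledge", "motivation", "organization"}

# Invented example: each transcript excerpt tagged with one or more codes,
# including empirical codes (e.g., "navigation") added on later passes.
coded_excerpts = [
    {"knowledge", "navigation"},
    {"motivation"},
    {"organization", "time-allotted"},
    {"knowledge"},
]

tally = Counter(code for excerpt in coded_excerpts for code in excerpt)

# Separate counts for framework-derived vs. emergent (empirical) codes.
framework_counts = {c: n for c, n in tally.items() if c in a_priori_codes}
emergent_counts = {c: n for c, n in tally.items() if c not in a_priori_codes}
print(framework_counts, emergent_counts)
```

A tally like this mirrors the frequency comparison the study ran between interview codes and survey responses before axial codes were formed.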
Some of the strategies I used to enhance the credibility and trustworthiness of the study included: (a) I piloted the survey instrument and interview questions with technical training learners (as more fully described below) and excluded the pilot data from the study results; (b) I triangulated the study results using both survey and interview protocols; (c) I offered respondent validation to reduce the possibility of misinterpretation of the data; (d) I identified potential researcher bias and assumptions; and (e) I employed purposeful evaluation of the data for outlier results, including the potential for discovery of unique explanations. All data need to be treated analytically and not simply accepted at face value (Maxwell, 2013). The application of rigor to multiple methods of data collection, evaluation, and reporting contributes to the establishment of a study’s credibility and trustworthiness (Merriam & Tisdell, 2009). Results need to be credible and understandable not only to those who read and evaluate a study’s findings, but also to the participants from whom the data are obtained (Maxwell, 2013). The goal was to ensure my analysis of the data and reporting of the results were as accurate to the intentions of the participants as possible. I conducted two separate pilot surveys with 10 different individuals in each, and included people inside and outside of the information technology field to assess the applicability, effectiveness, validity, and reliability of the survey items. None of the pilot data were retained; however, because participation in the study’s live survey was anonymous, it is possible that the pilot participants ultimately participated.
Major modifications were made to the survey instrument flow after the first pilot (e.g., separation of those with and without microlearning experience, creation of a separate tool to gather interview volunteer contact information without contaminating the anonymity of the main survey, clearer formatting of the informed consent page, and automatic exclusion of participants under the age of 18). Pilots are essential for fine-tuning research instruments (Merriam & Tisdell, 2009). Determining the validity and reliability of the final survey instrument was challenging in that the majority of the research was conducted online. The study relied on the triangulation of responses offered by the mixed method design to improve the credibility and internal validity of the research findings.

Ethics

This study was conducted in accordance with the highest ethical standards for human subject research for the protection of the participants, the university, and the investigator. The data were collected via an online survey and personal interviews. Specific ethical concerns for data collection in an online environment included obtaining informed consent from the participants, safeguarding the confidentiality of the participants and the security of the data, and determining what is private and what is public (Eysenbach & Till, 2001; Merriam & Tisdell, 2009). This medium also requires qualitative researchers to specifically address the impact of the environment on the data, the effects of the use of an online tool on data acquisition, and the influence the online setting has on ethical design (Merriam & Tisdell, 2009). First, informed consent confirms the understanding between the researcher and each participant that involvement in the study is voluntary, that they may stop participating at any time, and that any risks to the participants’ well-being are disclosed (Glesne, 2011).
In this study, participants’ informed consent was obtained via a detailed “opt in” screen that appeared prior to the launch of the online survey. The informed consent form included the purpose of the study, a description of how the participant would be involved, the option to stop participating at any time, the lack of compensation for participation other than an option to enter a random drawing for a gift card, the confidentiality of the data, and contact information for the researcher, faculty advisor, and the University of Southern California’s University Park Institutional Review Board (UPIRB). Second, the security and confidentiality of the participants were of concern, as the use of online surveys for data collection carries an “unlikely” yet real potential for a loss of confidentiality (Merriam & Tisdell, 2009, p. 116). Since the researcher is not the “primary instrument for data collection” (Merriam & Tisdell, 2009, p. 186), survey tool selection was of critical concern. The instrument of choice for the surveys and data collection, Qualtrics, employs secure Transport Layer Security (TLS) encryption and high-end firewall protection (Qualtrics, 2016). My dissertation committee and I are the only individuals with password access to the raw data. There will be no public data, and the participants’ private anonymous responses will remain private throughout the study and data retention period. This study investigated a field-based problem of practice, not an organization-based problem. Participants were recruited from online communities with members from technical industries. Although power relationships among study participants are critical for a researcher to recognize and identify (Merriam & Tisdell, 2009), there were few if any issues of power in this study since the survey participants were anonymous to me.
All who volunteered from the online communities were accepted into the survey portion of the study; interview participants were randomly selected from the volunteers. Although three of the interview participants turned out by luck of the draw to be people employed in my work organization, I obtained their verbal confirmation that we did not have any kind of reporting relationship and that I did not exert any kind of power influence over their responses. Issues of race, ethnicity, sexual orientation, religion, and other identifying categories were likewise invisible to the researcher due to the survey participants’ anonymity; gender differences were noted in the survey results included in Chapter Four, but responses were not analyzed by gender. By the nature of the format, interview participants could not remain anonymous to me; however, gender, race, ethnicity, sexual orientation, religion, names, and all other demographic variables were removed from the results included in Chapter Four. I am an experienced member of the eLearning and technical training communities but not well known outside of my organization. I was sincerely interested in learning whether microlearning is an acceptable format for technical training. Before expending significant amounts of time and effort to convert our company’s existing technical training into a microlearning format, I wanted to know learners’ perceptions of its usability in the general technical training ecosystem. I do not feel strongly one way or the other, and my career will not be affected by the results. As Merriam and Tisdell (2009) pointed out, it is important “for researchers to examine their biases and assumptions about the phenomenon of interest before embarking on a study” (p. 27). To reduce the imposition of bias into the data collection, analysis, and reporting of this study’s results, I framed the interview and survey questions in as neutral a tone as possible.
A neutral position when wording questions reduces the imposition of bias into the results (Merriam & Tisdell, 2009). The questions were also worded to reduce the potential for leading a participant toward one answer over another. “Leading questions reveal a bias or assumption that the researcher is making, which may not be held by the participant” (Merriam & Tisdell, 2009, p. 121). As mentioned above, two pilot tests of the survey and interview questions were conducted to further eliminate unintentional bias. A researcher’s bias can be subtle, unintentional, and difficult to uncover (Merriam & Tisdell, 2009). The goal was to ensure the participants’ perceptions of the acceptability of microlearning as a viable platform for delivery of technical training were honestly and ethically collected, analyzed, and reported beyond reproach.

Limitations and Delimitations

Acknowledged limitations included the truthfulness of the survey and interview respondents as well as the exact individuals willing to participate in the study. Online research is a “risky business” (Merriam & Tisdell, 2009, p. 177) as participants can pretend to be anyone they want to be and answer survey questions with little regard for the truth. Myers and Newman (2007) found that information systems interviews can be fraught with limitations, including participants’ “artificiality,” “lack of trust as the interviewer is a complete stranger,” “lack of time” (p. 4), “fear of exploitation,” “fear of silence,” “fear of embarrassment” (p. 12), shyness, showboating, and boredom. Delimitations include potential sampling errors since I gathered survey input from a limited population of online learners. Low response rates have the potential to contribute bias to online survey results (Fan & Yan, 2010). In addition, in an attempt to increase the response rate, I kept the survey short and asked a limited number of questions.
Fan and Yan (2010) found that an online survey completion time under 13 minutes increases response rates. I further limited the number of participants selected for interviews to 10. While the purpose of interviews in information systems qualitative research is to gain meaning beyond the facts (Schultze & Avital, 2011), there are no specific rules as to how many interviews are sufficient to gain meaning in social research. However, one study found saturation (the point where no new information is discovered) to be achieved after 12 interviews (Guest, Bunce, & Johnson, 2006). Therefore, by limiting the number of interviews to 10, I came as close as time would allow to balancing the logistical constraints of the researcher with the saturation of meaning sought by the process.

Summary

Chapter Three described the mixed method study by detailing the stakeholders and the anticipated influences affecting them, the conceptual framework, the research instrumentation and methodology, and the factors affecting credibility, ethics, and study limitations. In Chapter Four, the results of the research surveys and interviews are analyzed and presented. Chapter Five provides the recommendations that resulted from the study as well as suggested plans for the implementation and evaluation of microlearning as a format for technical training.

CHAPTER FOUR: RESULTS AND FINDINGS

The purpose of this study was to evaluate learners’ perceptions of the use of the microlearning format for the delivery of technical training. The study was conducted via a mixed method design employing an online survey followed by telephone interviews to gather enhanced meaning from some of the survey participants. Two research questions guided the project:

1. What are the knowledge, motivation and organization influences that impact adult learners’ perceptions of the benefits of microlearning for technical skill training?

2.
What are the recommendations for organizational practice in the areas of knowledge, motivation, and organizational resources related to the employees’ ability to articulate the benefits of microlearning for their technical skill development?

Participating Stakeholders

The survey participants were recruited utilizing convenience and snowball sampling techniques initiated via posts I made on multiple public online technical user communities requesting anonymous participation in the study. A sweepstakes incentive was included in the requests for participation; it offered survey contributors the chance to win one of ten ten-dollar gift cards to a well-known online shopping website. Within the survey instrument, participants were given the opportunity to volunteer for one of ten live interview sessions, and each interview participant (randomly selected from the full list of volunteers) would receive a ten-dollar gift card to the same well-known online shopping website. There was no restriction on any participant’s potential for winning both the survey and interview incentives.

Overview of Survey Participants

Of the 134 volunteers who started the online survey, two did not agree to the statement of consent to participate, one was too young to give consent (15 years old), and 18 gave their consent but did not answer any of the non-demographic questions. Therefore, the total survey population used for data analysis was 113 qualified participants. The study was specifically designed to evaluate microlearning training perceptions; therefore, the survey questions flowed down varying pathways depending upon each participant’s answers to specific questions. For example, those who did not have microlearning experience were asked questions about their access to and interest in taking microlearning courses and then given an opportunity to enter the survey sweepstakes but not the interview process.
Those who did have microlearning experience were presented with microlearning-specific questions and then given the opportunity to enter both the survey and the interview sweepstakes. Therefore, sets of questions had varying numbers of responses. See Appendix C for the survey question flow design.

Overview of Interview Participants

Of the 57 survey participants with stated technical training and microlearning experience, 22 (39%) volunteered for a personal interview session by entering their email or telephone contact information (not their names) into a separate form to ensure their study survey responses remained anonymous. The contact information for all volunteers (email address or cell phone number) was copied into an Excel spreadsheet (one row per volunteer) and sorted into alphabetical and numerical order. The range of spreadsheet row numbers (one through 22) was entered into an online number randomizer and 10 random row numbers were identified. All 10 volunteers on those row numbers were sent up to three emails or text messages requesting their participation in an interview. Seven of the 10 responded. To fill the list of 10 interviewees, the 12 volunteers not originally selected were sorted alphabetically and three were randomly selected by the same number randomization process. All three of these volunteers agreed to participate in an interview. Therefore, a total of 10 personal interviews were conducted utilizing a semi-structured interview format. All interviews were recorded using a WebEx online conference call account. Nine of the recorded interviews were transcribed using Trint.com, a machine-automated transcription service; the recorded interview for the participant from India was transcribed using Rev.com, a manual transcription service. I then listened to each of the 10 interview recordings while reading along and edited the written transcripts as necessary to improve accuracy.
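The two-stage randomization described above can be sketched in a few lines of Python. This is an illustrative reconstruction only, not the study's actual tooling (the study used an Excel spreadsheet and an online number randomizer); the volunteer identifiers, fixed seed, and assumption that the first seven drawees respond are hypothetical stand-ins:

```python
import random

# Hypothetical stand-ins for the 22 volunteers' contact entries,
# sorted as the spreadsheet rows were before randomization.
volunteers = sorted(f"volunteer_{i:02d}" for i in range(1, 23))

rng = random.Random(0)  # fixed seed only so the sketch is repeatable

# Stage 1: draw 10 rows at random, without replacement.
first_draw = rng.sample(volunteers, k=10)

# Suppose (hypothetically) only the first seven of the 10 respond.
responders = first_draw[:7]

# Stage 2: redraw three from the volunteers not in the original draw,
# re-sorted before randomization, as described above.
remaining = sorted(set(volunteers) - set(first_draw))
second_draw = rng.sample(remaining, k=3)

interviewees = responders + second_draw
print(len(interviewees))  # 10 interviewees in total
```

Because `random.sample` draws without replacement and the second stage draws only from volunteers excluded from the first, no volunteer can be selected twice.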
Although participants were randomly selected from the list of volunteers, all held existing technical roles in their organizations. Refer to Table 3 for a visual representation of the interview participant demographics.

Table 3

Interview Participants for Qualitative Inquiry

Participant    Organization  Size of         Years of   Number of       Age  Country
(Pseudonym)    Type          Organization    Technical  Microlearning
                                             Training   Courses
Participant A  For-Profit    1,001 or more    3         30              40   USA
Participant B  For-Profit    501-1,000        4          2              33   USA
Participant C  For-Profit    1,001 or more    4         50              46   USA
Participant D  For-Profit    Less than 50    30         16              65   Global
Participant E  For-Profit    1,001 or more   30         50              55   USA
Participant F  Non-Profit    501-1,000       20         20              40   USA
Participant G  For-Profit    1,001 or more   --         10              35   India
Participant H  For-Profit    1,001 or more   15         50              40   USA
Participant I  For-Profit    1,001 or more   30         50              56   USA
Participant J  For-Profit    1,001 or more   20         15              48   USA

Detailed Survey Participant Demographics

Survey participant age. Of the 113 qualified survey participants, 65 (57.5%) were female, 47 (41.6%) were male, and one (0.9%) preferred not to answer. The ages of the 113 participants ranged from 23 to 65 years of age with a mean of 44, modes of 33 and 40 (seven participants each), and a standard deviation of 10. See Figure 4 for a visual representation of the participants' age.

Figure 4. Frequency of survey participant age. n=113

Survey participant employment. The majority of the survey participants (103) worked in the United States (91%), two worked in the United Kingdom (2%), and eight (7%) worked in other countries including two who did not specify a country, two who described their work location as global without designating a country, and one each in India, the United Arab Emirates, New Zealand, and Turkey.
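Descriptive statistics like those reported above (a mean, one or more modes, and a standard deviation) can be computed with Python's standard `statistics` module. The age list below is a small hypothetical sample for illustration only, not the study's data:

```python
import statistics

# Hypothetical ages; the study reported a mean of 44, modes of
# 33 and 40, and a standard deviation of 10 for its n=113 sample.
ages = [23, 33, 33, 40, 40, 44, 51, 56, 60, 65]

mean_age = statistics.mean(ages)    # arithmetic mean
modes = statistics.multimode(ages)  # every value tied for highest frequency
sd = statistics.stdev(ages)         # sample standard deviation (n - 1)

print(mean_age, modes, round(sd, 1))  # prints: 44.5 [33, 40] ...
```

`statistics.multimode` (Python 3.8+) returns all values tied for the highest frequency, which is how a bimodal result such as the 33 and 40 reported for the survey can be captured in one call.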
Of the 113 survey participants who responded, 52 (46%) worked in a for-profit company, 38 (34%) worked in a public organization such as a school or in the government, 13 (12%) worked in a non-profit company, six (5%) were self-employed, two (2%) were not currently employed, and two (2%) worked for other types of organizations including a for-profit school and a federally funded research and development center (FFRDC). See Figure 5 for a visual representation of the types of participants' employment organizations.

Figure 5. Frequency of participant employment organization type. n=113

Of the 103 survey participants who worked in for-profit, public, or non-profit organizations, 45 (44%) had 1,001 or more employees, 15 (15%) had 501 to 1,000 employees, 23 (22%) had between 101 and 500 employees, seven (7%) had between 51 and 100 employees, and 13 (13%) had fewer than 50 employees. See Figure 6 for a visual representation of the size of the participants' employment organizations.

Figure 6. Frequency of participants' employment organization size. n=103

Survey participants with technical training experience. Eighty-seven (77%) of the 113 survey participants stated they had taken technical training courses; 46 (53%) worked for a for-profit company, 29 (33%) worked for a public organization such as a school, five (6%) worked for a non-profit organization, four (5%) were self-employed, one (1%) was unemployed, and two (2%) worked for other types of organizations including a for-profit school and a federally funded research and development center (FFRDC). See Figure 7 for a visual representation of the types of organizations for which participants with technical training experience worked.

Figure 7. Frequency of participants who had taken a technical training course, sorted by organization type.
n=87

Survey participants without technical training experience. Of the 113 survey participants, 26 (23%) stated they had not taken technical training. Of those, nine (35%) worked for a public organization such as a school, eight (31%) worked for a non-profit organization, six (23%) worked for a for-profit organization, two (8%) were self-employed, and one (4%) was not currently employed. See Figure 8 for a visual representation of the types of organizations for which participants without technical training experience worked.

Figure 8. Frequency of participants who had not taken a technical training course, sorted by organization type. n=26

See Figure 9 for a visual representation comparing the organization types for survey participants who had and those who had not taken a technical training course.

Figure 9. Frequency of participants who had (top diagram) and who had not (bottom diagram) taken a technical training course, sorted by organization type. n=113

Survey participants' reasons for taking technical training. Of the 87 survey participants who had taken a technical training course, 49 (56%) currently worked in a technical role, 19 (22%) had a general interest in technology, four (5%) wanted to start working in a technical role, and 15 (17%) selected "other" reasons including those who (a) formerly worked in a technical role; (b) needed to learn a new system used in a non-technical role; and (c) needed to fill a knowledge gap. See Figure 10 for a visual representation of participants' reasons for taking technical training.

Figure 10. Frequency of participants' reasons for taking a technical training course.
n=87

Interview Participant C stated a common reason for taking technical training:

I work in an ever-changing [technical] platform…and it is required of me [for] learning the new features and capabilities, the changes of that platform, and maintaining my certifications as well as expanding my certifications from merely the administrative or development side into product specialties.

Other participants took technical training to increase their career potential. Participant H stated:

In the technology space where I work, six to eight months a year new technologies come on board, new methodologies are introduced, and it's important from a career perspective to stay current with what's out there.

One interview participant expressed similar reasons for taking technical training but from a different perspective. Participant E stated an IT leadership role requires current skills:

Self-improvement is the main answer. I have elevated myself up through various years to a role of vice president of IT. But even in this role I feel the need and the requirement to stay current on technical topics especially those around development and the technologies that are used in businesses both to help my business grow and to be a good leader and also to better understand my staff and give them good direction.

Types of technical roles held or desired by survey participants. Of the 87 survey participants who had taken technical training, 67 (77%) identified the roles they held or wanted to hold: 27 (40%) as system administrators, 11 (16%) as programmers, eight (12%) as system configurators without programming, six (9%) as system end users, and 15 (22%) in other roles including project management, teaching, solution architecture, and program management. See Figure 11 for a visual representation of participants' existing or desired technical roles.

Figure 11.
Frequency of participants' stated technical roles or desired roles. n=87

Survey participants' years of technical training. Eighty-three (95%) of the 87 survey participants who had taken a technical training course indicated the number of years they had taken technical training. Participants were given a sliding scale in the survey instrument with options between one and 30 years. The response mean was 11.54 with a standard deviation of 9.56. Ten (12%) participants had 30 years of experience or more. See Figure 12 for a graphical representation of the participants' years of experience with technical training.

Figure 12. Frequency of participants' years of experience with technical training. n=83

Survey participants' experience with microlearning. Of the 113 survey participants, 63 (56%) had taken a course in the microlearning format. The 50 participants without microlearning experience were asked a unique set of four questions to determine their access to and interest in the format; 45 (90%) answered these questions. These participants were neither asked about nor included in the remainder of this study's analysis of learners' perceptions of microlearning features because they acknowledged they did not have experience with the format. These participants were, however, invited to enter the sweepstakes for a gift card for their participation. Their general perceptions of the concept of microlearning and interest in the format were tracked and are presented in the next two paragraphs.

Survey participants with neither experience nor an understanding of the microlearning format. Although the survey included a definition of microlearning, nine (20%) of the 45 participants without microlearning experience either strongly agreed or somewhat agreed that they did not understand what a microlearning course was.
Of those who did not understand, seven (78%) participants acknowledged that they had not had an opportunity to take a microlearning course (two respondents were neutral); however, when asked if they would like to give it a try, one (11%) strongly agreed and four (44%) somewhat agreed. Seven (78%) of these nine participants were neutral in their opinion of the format (understandable since they stated that they did not understand it). The responses of the nine participants without an understanding of microlearning were excluded from the remainder of the study.

Survey participants without microlearning experience but with an understanding of the format. Thirty-six (80%) of the 45 participants did not have microlearning experience but had some understanding of the format. Twenty-four (67%) of these participants either strongly agreed or somewhat agreed that they had not had an opportunity to take a microlearning course. Eight (22%) strongly agreed or somewhat agreed that they had had an opportunity, and four (11%) were neutral. See Figure 13 for a visual representation of these participants' exposure to a microlearning course.

Figure 13. Frequency of participants' exposure to microlearning courses but without microlearning experience. n=36

Twenty-seven (75%) of these 36 participants strongly agreed or somewhat agreed that, given the opportunity, they would like to give the microlearning format a try, five (14%) participants disagreed or somewhat disagreed, and four (11%) were neutral. See Figure 14 for a visual representation of these participants' interest in taking a microlearning course.

Figure 14. Frequency of participants' interest in taking a microlearning course but without microlearning experience.
n=36

None of the 36 participants without microlearning experience but with some understanding of the format stated that they strongly disliked the format for technical training; however, three (8%) had a somewhat unfavorable view. Seventeen (47%) of the participants had either a strongly or somewhat favorable view and 16 (44%) were neutral. See Figure 15 for a visual representation of these participants' perception of the use of the microlearning format for technical training.

Figure 15. Frequency of participants' perception of the use of microlearning for technical training but without microlearning experience. n=36

Number of technical training courses in a microlearning format. Using a sliding scale instrument, 57 participants indicated the number of technical training courses they had taken in a microlearning format. The scale ranged in one-unit increments from one to "50 or more". The participants' responses had a mean of 21, a median of 15, and a mode of 50 or more, with a standard deviation of 17. See Figure 16 for a visual representation of the number of technical courses these participants had taken in a microlearning format.

Figure 16. Frequency of the number of technical training courses participants had taken in a microlearning format. n=57

Survey participants were then asked how likely it was that they would take future technical training in a microlearning format. Thirty-four (60%) of the participants responded "Definitely Will," 18 (32%) "Probably Will," and five (9%) "Might or Might Not." None of the participants stated they would not take technical training in a microlearning format. All 11 of the participants who had taken 50 or more microlearning courses responded that they would definitely take additional technical training in the microlearning format.
See Figure 17 for a visual representation of the likelihood that participants will take future technical training in a microlearning format.

Figure 17. Frequency of the participants experienced with microlearning who plan on taking additional microlearning courses in the future. n=57

Research Question 1: Influences on Learners' Perceptions of Microlearning

1. What are the knowledge, motivation and organization influences that impact adult learners' perceptions of the benefits of microlearning for technical skill training?

The first research question sought to gain the general insight of microlearning consumers as to the use of the format for technical training, and then to identify the specific knowledge, motivation, and organizational influences that impacted learners' perceptions. Microlearning is a relatively new format for eLearning. As such, it was important to the study to understand learners' perceptions of microlearning within a range of training delivery methodologies and the factors that influenced those perceptions. Although all participants were asked questions about a range of eLearning experiences, only respondents who stated they had experience with both technical training and microlearning were given survey questions designed to elicit their perceptions of the microlearning format. The interview volunteers were also recruited from the same population with microlearning experience.

Overall Impression of Microlearning

Because the interview participants were recruited from the survey participants who indicated they had microlearning experience, all were able to articulate their impressions of the microlearning format for technical training. Their years of technical training and microlearning experience, type of organization, size of organization, and age varied as is shown in Table 3 above. Their impressions of the microlearning format for technical training varied as well.
Most interview participants had a positive overall impression. Participant E stated:

There are some perceptions about microlearning that if it is smaller and less expensive, how can it be that good? If I don't have an instructor, how am I going to get it? My experience and the experience of the staff that I'm running is that [microlearning] has really helped.

Participant D described the cost of keeping content current in a microlearning format compared to the efficiency of wide distribution of content:

[Technical knowledge] goes stale real fast…once you create something then you have to stay in a continual creation mode because the knowledge changes or there's new things that you want to add for application or usage. So it is costly from that standpoint. However distribution can be broad. So if you've got a hundred thousand people in your company now that need to learn one thing that kind of platform could be really advantageous.

Interview participants were asked to compare microlearning to other formats of instruction. Their perceptions are described in the following sections.

Microlearning compared to instructor-led courses. Live classroom training. Thirty-two (51%) of the 63 survey respondents with technical training and microlearning experience responded that they felt classroom instructor-led training (ILT) for technical content was very important or somewhat important. Twenty-three (36%) responded that it was not too important or not important at all, and eight (13%) were neutral. See Figure 18 for a visual representation of the survey participants' ranking of the importance of instructor-led classroom training for technical content.

Figure 18. Frequency of the survey participants' ranking of the importance of instructor-led classroom training for technical content. n=63

Live webinar training.
Sixty-two survey participants rated the prompt "Learn technical content by attending a live webinar with an instructor" as follows: 39 (63%) regarded learning by attending a live webinar with an instructor as very important or somewhat important; 23 (37%) either had no opinion or did not find it important. See Figure 19 for a visual representation of all participants' ranking of the importance of learning by attending a live webinar with an instructor.

Figure 19. Frequency of the survey participants' ranking of the importance of instructor-led webinar training for technical content. n=62

Effectiveness of instructor-led training over microlearning. When survey participants were asked to rate the effectiveness of ILT over eLearning (Question 15), the mix of the responses changed. Sixty-one participants responded to the prompt, "Technical training with a live instructor is more effective than eLearning": six (10%) strongly agreed; 15 (25%) somewhat agreed; 21 (34%) were neutral; 16 (26%) somewhat disagreed; and three (5%) strongly disagreed. However, when the results were limited to the 11 most experienced microlearning participants, none of them agreed with the statement. Of those 11, and similar to all participants, four (36%) were neutral; five (45%) somewhat disagreed; and two (18%) strongly disagreed. See Figure 20 for a visual representation comparing the percentage of responses of all 61 participants and the 11 most experienced microlearning users' ranking of the effectiveness of classroom training for technical content over microlearning.

Figure 20. Comparison of all participants' and the 11 most experienced microlearning users' ratings of the effectiveness of instructor-led training over eLearning. n=61

Positive aspects of instructor-led training over microlearning.
Some of the interview participants who preferred ILT mentioned that microlearning may not yet match the benefits of a classroom learning experience for technical content and the guidance of a live instructor. Participant F stated that microlearning is "[N]ot the same as having an expert face-to-face with you where they can sort of give you instantaneous feedback and maybe guide you if you are heading down the wrong path." Participant G also preferred ILT and stated, "You have the instructor with you who…understands where the candidate makes the mistake and how he should correct him in order to make him understand the concepts in a better way." Participant D stated, "I really enjoy in-person instruction because I like the dialogue, the pace, and the fluidity that you can have with an instructor to…either expand the topic at hand or contract it real time. Can't really do that with eLearning." Interview Participant A found live ILT webinars beneficial: "Webinars like lunch time learnings—we used to do those once a month and different people would lead them. So that was helpful. It makes you feel connected to learning and the company and the importance of growing."

Participant J offered a view on the on-demand adaptive limitations of microlearning course content:

Where you have this heterogeneous set of experienced students coming in and a kind of an amorphous goal about what are we trying to actually teach in this class, microlearning tends to miss more of the targets than it hits in my experience. And in those situations, its strict formality and static nature compare poorly with the dynamic structure of instructor-led training because the instructor can tailor the coursework, the amount of time spent on a particular topic, and the discussion, and really every aspect of the course because they're human and adaptive. The microlearning tools that I have seen tend not to be adaptive.
There may be some that are really good, I just don't have experience with them. But that static nature is the thing that means it will either hit or miss but cannot adapt.

Participant C responded with a similar concern about microlearning's ability to adapt:

From a technical training perspective, there might be some architecture or design oriented principles and best practices [that] might be better served in a classroom where someone can demonstrate and convey "this was a good approach" and have an example and walk through where and why and how. And then "this was a bad approach." And show you different methodologies, how do you use tools or standard documentation that's expected of us as engineers or architects. That might be better in a classroom.

Learn in collaboration with colleagues. Sixty-three survey participants rated the prompt "Learn technical content by collaborating with colleagues" as follows: 42 (67%) regarded learning in collaboration with colleagues as very important or somewhat important; 21 (33%) either had no opinion or did not find it important. See Figure 21 for a visual representation of all participants' ranking of the importance of learning in collaboration with colleagues.

Figure 21. Frequency of the survey participants' ranking of the importance of learning technical content in collaboration with colleagues. n=63

Interview Participant D focused on the collaborative and social aspects of ILT that many participants found helpful but lacking in microlearning:

[ILT is] more interactive, there are concepts that I retain much better and for years on end, and I think it had to do with the sort of dialogue and debate that you get in classroom discussion.
And so I think from a learning standpoint when I've learned something, especially when it has to do with conceptual or theoretical stuff or around principles or even processes, when I did it in a classroom setting where they've been discussed and debated and you hear other people's points of view about it, where there's been some sort of ... oh I don't want to just call it role play but you had to demonstrate or use it or apply it, that stuff has stuck with me for years… It would be more deeply embedded and recalled because of the tactileness about actually learning it in a social setting in an engaged setting with other people.

Participant F offered a similar perspective:

I liked the instructor-led because of the social aspects of learning, being physically in the same room, maybe having some snacks, creates a very welcome environment to learn together. It also allows you to create networking with others that are learning the same thing with you.

Positive aspects of microlearning over instructor-led training. Participant I provided a succinct perspective: "Sometimes classes go too fast, sometimes classes go too slow. So you can either get overwhelmed or you're just bored." Participant C offered a similarly concise statement: "It's not [always] a productive use of time if I can do the [content] training in 45 minutes, why torture me for four hours." Participant J concurred:

I'm generally dissatisfied with instructor-led pieces because again I feel like I learn at my own pace and that rarely aligns precisely with the instructor led. If the instructor is the sole party in the room other than me, then we're all good. I can say, "Yeah, yeah, I already know that because of this and what I want to know is that" and then I can compel the instructor-led stuff to be more [me]-paced.
But I feel like any instructional mechanism that requires me to go at a pace that's dictated by another—whether that's instructor-led or video based or some other mechanism that forces me through steps A, B, and C before I can get to D—those cause me frustration and I feel like I lose my interest and I'm much more apt to check email or browse the web or you know play a game or do something instead of pay attention to the elements that I find are less interesting to me.

Participant C offered a similar response on the issue of ILT pace:

[ILT] doesn't adjust well to the pace of the different learners. The instructor is up there going through the book as fast as the instructor can go through the book. And if you're in a classroom setting with 10 people, you're only as fast as your slowest learner. And so there's a gap always between the slowest learner and the fastest learner. And that's where you introduce boredom and distractions.

Participant E also discussed the pace of ILT as problematic for the needs of all learners:

The problem I have with instructor-led courses more times than not is simply the time factor. Because [for] an instructor-led course, you have to meet during the instructor's time. And they're harder to manage and harder to pace out because some topics need more time to sit and understand and absorb. And it's different for everyone so if you're in an instructor-led course with learners who are not pacing at your pace, either slower or faster, that can be problematic. So the micro courses, especially the self-led ones, self-directed, are very nice since you can take your time and work through it on your pace and if that's fast, it's fast, and if it's slow, it's slow, and it can vary on the topic.
Participant D, who liked the pace of ILT as mentioned above, also admitted that "the constraint of the two-hour period for a class is you've got to really be mindful of being able to get through the material in two hours is that you just don't have more time." Participant B offered a similar constraint: "You're kind of limited to the structure that they've come up with." Interview Participant H stated:

I tend to find that if it's instructor led, I don't get as much out of it because they're worried about time clock management. Meaning we've got a time box amount of time on either this section or this content. And so you start taking shortcuts with the material in order to meet the time constraints. So I can do those things and hopefully I pick up what's available and there's really good notes afterwards. But the microlearning allows me to work at my pace in most cases. And I can retake that same information. So if I didn't get the concept the first time I still had the reference material to work from. So I find that from an advantage perspective personally the microlearning tends to work out better and [more] closely mirrors my tinkering approach that I would take sometimes to new concepts. I'm going to try it a way I think I know it. Then if I have the material to go back and look on again rather than having to wait for an instructor to respond or reach out to a broader set of individuals for some feedback on it.

Participant C had a similar perspective:

Sometimes in the regular classroom, they're walking you through a very scripted script ... a very prescribed process. They want to do one, two, three, and four… [With microlearning,] I can go to the catalog and go work on it when I need it and not work on it on May 18th because that's the day everybody is doing that training.
Participant H stated that the primary differentiator of microlearning over ILT was the ability to revisit the course content as needed without incurring additional cost:

Some of the technical courses are very expensive and you get one shot in a class and it's over. Microlearning and other eLearning solutions I think help to break that down so you can consume it as you need it and in most cases from a cost perspective we can subscribe to a service or something along those lines and get access to the same material and consume that a number of times rather than having to pay multiple thousands of dollars to go pay for the class and travel there.

Time and cost. A common theme in the interview sessions was the lack of time and the high expense of live instructor-led training, especially in a classroom setting. Participant G discussed training at the speed of technology: "Gone are the days of classroom training where we actually take time out and spend the entire day in a class… Now we are running faster. We [have] moved to virtual [training]." Participant D preferred microlearning because although it "has some constraints to it, it also has some huge efficiencies." Participant F discussed the cost aspect of classroom training: "The downside [of ILT] of course is the travel arrangements that might be necessary, whether that may be travel or lodging... And, of course, time. Generally, in-person training would require a little bit more time than the online." While Participant A preferred ILT, time and money constraints made microlearning the preferred option:

I actually prefer instructor led training but I just don't have the time to go do it or the money. I find it easier…if you can answer questions, take notes, have feedback, but...I don't like [set time] kinds of training. I find them a little bit stressful. So I find the microlearning better than that…I think I learn more potentially when I read a textbook.
But again I just don't have the time anymore. So I really find microlearning just to be the time saver and getting just exactly what you need in just the right time.

Participant D identified a related time benefit and stated, "I like [microlearning] because it's short in duration. Very focused on single topics or partial parts of topics. It's almost like going to an encyclopedia and you can kind of turn to a certain subject and in a very short period of time get a great overview and then that maybe may lead you to another place to get a more in-depth understanding."

Blended approach--microlearning and instructor-led. One interview participant preferred microlearning's self-directed format over instructor-led courses but could learn using both styles. Participant H stated:

My learning style tends to be closer to get my hands on it and get into it on my own so that I understand it my own way, not necessarily the way some instructor has put it together. But I found that I can learn both ways, right? And so having a little bit of guided learning through different technologies gets me to a baseline understanding of the information that I can then expand further.

Interview Participant E proposed a compromise to combine the interactive benefits of instructor-led instruction with the on-demand availability of microlearning:

So the easy answer is to say they're both valuable. I do prefer the microlearning personally. I know that's not something for everyone. Instructor led is not always convenient but I know a lot of people really need to have that interaction. I think microlearning courses if done properly along with some forum or interactive technology where questions can be posted and either peers or instructor experts can provide feedback are very useful.

Transfer of knowledge to the workplace.
The interview participants were divided in their perceptions of whether microlearning contributed to the transfer of learning to the workplace. Participant C stated:

[Microlearning] helps immediately, considerably. I can go learn and apply same day… If I take the course today and I don't get to use it for three months and I get started on my project and I'm trying to use this skill and I say “Wait a minute I'm stuck.” Instead of just kind of being left out here in the cold, I can go back, pull up the microlearning, find the module, find the unit that I'm stuck on, get a quick refresher, and then move back. So I think that actually helps in the long run.

Participant H’s perspective on workplace transfer was based on microlearning’s just-in-time format:

I think that the microlearning approach …has the potential to be very useful to people you know across the board…[you] have the ability to go back, review the material at your own pace so that if you're six, eight months, a year down the road and you haven't used that and suddenly have an opportunity to do that, you can dive back in from the point where you were. And in some cases where that material may change on a regular basis, you get access to what's been updated since it was first introduced.

Participant J, however, did not think the transfer of skills to the workplace was any different due to the content being in a microlearning format:

I don't think that microlearning really makes a difference. If it was a question I'm learning a skill that I need to master and needs to become part of my set of tools that I bring to the table, then I feel like I need to apply it relatively quickly. But I don't feel like that's a function of microlearning. I feel like that's a true statement of whether it's instructor led, whether [I’m] reading a web page, whether it's microlearning with skill challenges, or someone just told me something in the hall.
Learn by listening to recorded training. Sixty-three survey participants rated the prompt “Learn technical content by listening to recorded training” as follows: 38 (60%) regarded learning by listening to recorded training as very important or somewhat important; 25 (40%) either had no opinion or did not find it important. None of the interview participants commented on the use of a recorded webinar for technical training. See Figure 22 for a visual representation of all participants’ ranking of the importance of learning by listening to recorded training.

Figure 22. Frequency of the survey participants’ ranking of the importance of learning technical content by listening to a recorded webinar. n=63

Learn offline from a workbook. Sixty-three survey participants rated the prompt “Learn technical content offline from a workbook” as follows: 27 (43%) regarded learning offline from a workbook as very important or somewhat important; 36 (57%) either had no opinion or did not find it important. See Figure 23 for a visual representation of all participants’ ranking of the importance of learning offline from a workbook.

Figure 23. Frequency of the survey participants’ ranking of the importance of learning technical content from a workbook. n=63

Two interview participants mentioned the use of a book for technical training. Participant A stated, “I think I learn more potentially when I read a textbook. But again I don't have the time anymore.” Participant B preferred using a book for certification test preparation:

If you're studying for a certification with an exam at the end, I would feel more comfortable doing something a bit more traditional with a textbook and then augmenting that with the microlearning. But I wouldn't rely completely on the microlearning only because I'm not sure if I would get everything that I need in order to pass the certification exam or not.
So, I would kind of defer back to getting a textbook to get that information.

Knowledge Influences

In part, the study sought to identify specific procedural and metacognitive factors that influenced the participants’ perceptions of microlearning. As is discussed in Chapter Two, procedural knowledge concerns the “how to” factors whereas metacognitive knowledge concerns self-reflection and “knowing how I learn” influences on learning. However, the results showed that there was a consistent crossover of procedural, metacognitive, and motivational influences such that it is difficult to separate them. Therefore, although these factors are discussed below as distinct categories, the cross-influence is evident and underscores the composite nature of learning perceptions. For example, the study showed that for some participants, a course that offered time estimates for each microlearning module provided procedural (“How do I work through this course?”), metacognitive (“I use time estimates to plan my course strategy”), and motivational (“I can get this finished before lunch and earn a badge”) influences.

Procedural knowledge. Question 13 of the survey asked, “Thinking back on your overall experience with microlearning courses, how important is each of the following features to you?” Respondents rated statements on a five-point Likert scale with response options ranging from “Very Important” to “Not Important at All.” The responses to each statement are listed separately below and supplemented with data from interviews.

Clear course navigation instructions. Fifty-six (90%) of the 63 survey participants with microlearning experience rated clear course navigation instructions as very important or somewhat important. The findings were very similar for those with the most and those with the least microlearning experience.
Eighty-two percent of the 11 participants who had completed more than 50 microlearning courses and 87% of the 31 participants who had completed 10 or fewer microlearning courses found clear course navigation instructions were very important or somewhat important. See Figure 24 for a visual representation of all participants’ ranking of importance for clear course navigation instructions in microlearning courses.

Figure 24. Frequency of the ranking of importance for clear course navigation instructions in microlearning courses. n=62

In the interviews, multiple participants mentioned the need to keep navigation instructions available, optional, and easily accessible. Participant D stated, “The majority of … microlearning learning experiences I've been in … have been really good and they make it easy enough … that if you've done it two or three times, there's a pattern or a feel to it that is pretty intuitive.” Some participants, however, stated that they have had issues of quality control and gaps in instructions. For example, Participant D stated that there have been problems in some courses with the “fluidness of going from one page or one screen to another.” To explain what can happen when clear navigation instructions are not included, Participant C discussed how even those experienced in a particular microlearning environment can become disoriented:

It had been a while since I had been working on Trailhead [a technical training website]. And now there are playgrounds instead of just dev orgs [practice environments]. And I have no idea what we're supposed to be doing with the playgrounds. I have no idea why the playgrounds are of benefit to us. I had an org. I would like to work in my org. Why do I now want to work in playgrounds? They introduced that in a fashion to make it easier for us without a lot of “why” behind it. Sometimes there needs to be why.
Interview participants also mentioned the need for content search tools to assist navigation in courses designed in the microlearning format. Instead of navigating through an entire technical course, Participant J stated that in many cases, “I'm going to use the search feature on the page to find the keyword that I'm looking for.” However, participants emphasized that in order to be functionally helpful, the search tool needs to return results that include procedural information. Participant I stated:

I have had issues sometimes. … [I]f I search on a video on a particular topic, they have 20 or 30 videos on that particular topic and what they don't seem to [tell] me is that it's really a whole series of videos that they put together and you should watch them in order. And I found it hard to actually try and figure that out. So I [was] presented with 20 videos on the topic with all sorts of different subtitles for the topic and they don't tell me the ordering. But if I stumble, somehow stumble into the whole group, then I get them in order that I'm supposed to get them in.

Training courses that are focused on a specific technology can include in their procedural support introductory information including navigation and the unique tools needed for the course. Participant F stated:

I believe [that] an inventory of [software] tools that are required in order for us to successfully complete the learning module would be helpful because often times in the technical training one may not realize that they simply just don't have all the tools necessary. It's not just the knowledge, but the tool is missing.

Definitions for terminology used in the course. Fifty-two (84%) of the 62 survey participants with microlearning experience rated the availability of definitions for course terminology as very important or somewhat important. The findings were similar for those with the most and those with the least microlearning experience.
Nine (82%) of the 11 participants who had completed more than 50 microlearning courses and 26 (87%) of the 30 participants who had completed 10 or fewer microlearning courses found terminology to be very important or somewhat important. See Figure 25 for a visual representation of all participants’ ranking of importance for the provision of definitions for terminology used in microlearning courses.

Figure 25. Frequency of the ranking of importance for the inclusion of definitions for terminology used in microlearning courses. n=62

Interview Participant D stated, “I do like glossaries. I do like things that I can refer back to … a learning aid or something … I like to have that from a memory standpoint to sort of trigger my thinking.” Participant H also commented on glossaries and stated, “[I]f you jump from a basic understanding into a more advanced, there may be material that you don't capture and so they [may not] explicitly spell out what … they meant by those things.” Participant D discussed the challenge with the inconsistent use of language: “Sometimes there are language problems where language or terminology is not consistent and I'm not sure what I'm answering or what I should answer.” Participant J explained:

I mean we live in a world of three-letter acronyms and if someone or the training is using some three-letter acronym that you don't know then its value is diminished and so a glossary of terms is almost always helpful particularly if you're not compelled to sit through [the entire course]. If you can refer to it upon request that's good. Job aids and such like a table of contents I feel like having them available all makes sense. Making them compulsory, particularly in something where I can't skip ahead, is not good.

Progress indicator.
Fifty-one (82%) of the 62 survey participants with microlearning experience rated the availability of a course progress indicator as very important or somewhat important. The findings were similar for those with the most and those with the least microlearning experience. Ten (91%) of the 11 participants who had completed more than 50 microlearning courses and 28 (93%) of the 30 participants who had completed 10 or fewer microlearning courses found the progress indicator to be very important or somewhat important. See Figure 26 for a visual representation of all participants’ ranking of importance for the inclusion of a course progress indicator in microlearning courses.

Figure 26. Frequency of the ranking of importance for the inclusion of a course progress indicator. n=62

The interview participants concurred with the survey results. Participant C stated, “I would prefer a progress bar.” Participant B agreed and stated, “I really like those. An example of that is Code School[.net]. They have this banner type thing at the top showing you where the end is and all the modules in between where you are. I find that really helpful.”

Time estimates for module completion. Fifty-one (82%) of the 62 survey participants with microlearning experience rated the availability of time estimates for module completion as very important or somewhat important. However, a higher percentage of those newer to the microlearning environment found course time estimates valuable (90% of the 30 participants) than those who had completed more than 50 microlearning courses (82% of the 11 participants). See Figure 27 for a visual representation of all participants’ ranking of importance for the inclusion of time estimates for module completion in microlearning courses.

Figure 27. Frequency of the ranking of importance for the inclusion of time estimates for modules in microlearning courses.
n=62

Interview Participant E found the inclusion of time estimates for module completion to help select courses that fit within available training time:

I'll look at the times that the module is estimated at and then pick the ones that match the time I have available. Because I like to use my microlearning to fill in downtime at a job or weekends or evenings when there's nothing else going on or stuff to read or something. So I'll use it to manage my time and will look at that and I may pick a module. And if one says you're going to need to devote an hour to get through this, I want to make sure I have [an] hour.

However, Participant J stated that module time estimates may be meaningless to learners: “If [the time estimate is] video based, yes. If it's reading based, then the time estimate is meaningless because people read at different paces.” Participant C agreed:

I don't think that their time estimates have ever been accurate…I find that I can spend hours on ones that they think take 30 minutes and I can spend 10 minutes on ones they think should take an hour. It just depends. Their premise with any of the trails [Trailhead modules] is that the person opening them is starting from zero knowledge. Not that “Hey here is a Salesforce professional of five years with [multiple] certifications sitting down in front of Trailhead to do the admin trail,” right, because that ought to be easy for me to just blow right through because I'm already an admin…I'm experienced, but it's written for the person who is brand new. They assume you know nothing when you start the trail.

Ability to click through a required course quickly. Fifty-two (84%) of the 62 survey participants with microlearning experience rated the ability to click through a required course quickly as very important or somewhat important.
The learners’ perceptions were slightly higher among those newer to the microlearning environment (87% of the 30 participants) than those who had completed more than 50 microlearning courses (82% of the 11 participants). None of the interview participants specifically mentioned this functionality as having an effect on their perception of the microlearning format for technical training. See Figure 28 for a visual representation of all survey participants’ ranking of importance for the ability to click through a required microlearning course quickly.

Figure 28. Frequency of the ranking of importance for the ability to click through a required microlearning course quickly. n=62

Interactive features. Fifty-nine (95%) of the 62 survey participants with microlearning experience rated the availability of interactive features (e.g. video, projects, knowledge checks, etc.) as very important or somewhat important. However, a higher percentage of those newer to the microlearning environment found interactive features valuable (97% of the 30 participants) than those who had completed more than 50 microlearning courses (90% of the 11 participants). See Figure 29 for a visual representation of all participants’ ranking of importance for the inclusion of interactive features in microlearning courses.

Figure 29. Frequency of the ranking of importance for the inclusion of interactive features in microlearning courses. n=62

Interview participants found several interactive features important. Video training was mentioned by multiple participants as important for the demonstration of a technical skill as long as the video player offered a customized experience. Participant B highlighted the helpfulness of video speed control as “something I use all the time…Sometimes they're going through things you know about.
So it helps to fast forward and jump to the parts that are a little bit more difficult to understand or you don't know.” Participant F also mentioned the need for speed control and the ability to repeat video content:

[With] the ability to repeat if I missed a particular point, I can always rewind and go back. And when I said self-paced, I also mean literally the pace of the audio. Some of the environments allow you to watch it at a faster pace like 1.5 times normal rate or even slower like a half rate. So I think those features really benefit the microlearning over the traditional.

However, Participant D pointed out that video takes time to create and is challenging to keep current in the world of technology training: “If there's actually audio or some sort of video attached that's embedded sometimes it's just ... you can just tell it's either out of date or the production quality is poor.”

Tests and skill challenges received mixed reviews from interview participants. Participant B stated, “I tend to gravitate [toward] those [modules] that have kind of a hands on component that I can [get] my hands dirty and have some experience.” Participant H stated, “I really do like the practical examination type items more than the theory...The theory of course is important. It's not nearly as entertaining frankly for me personally.” Participant A appreciated knowledge checks but preferred that they were not required: “Things to assess your knowledge I'll read in my head and I'll do it in my head and then that's about it. I'm happy to not submit them unless I need to.” Participant C compared features:

I prefer the projects that help me retain the information. I feel I learn less with just items to read and answering the questions. The ones that are project-driven are more valuable from that perspective. ...
Users are probably at least susceptible to reading the question and looking for the answer rather than reading the content and answering the questions. With a demonstrative type approach, demonstrate the skill, you have to cover it from start to finish or you're not going to understand how to connect the dots when you go do it. So I've put very little weight on their read and answer questions ones.

Multiple interview participants stated that they appreciated interactive features (synchronous or asynchronous) that simulated live instruction such as the availability of an instructor chat or message board tool. Participant G stated, “There should be a chat option that's available between the candidate and the instructor so that if the candidate feels that he has a doubt…he [has an] option available.” Participant A stated, “We all like the immediate feedback of being able to ask an instructor or someone via chat.” Participant H, however, did not feel a message board approach would work: “I found that the message board approach to getting help is not very effective. It's sometimes hard to translate what you're asking for if you don't fully understand the terminology or the technology which leads to people giving you short, inaccurate, go look here for the answer ... that sort of thing without really enhancing your understanding.”

Progress bookmarks. Fifty-eight (94%) of the 62 survey participants with microlearning experience rated the ability to bookmark a course as very important or somewhat important. The percentages were nearly identical for those who had taken at least 50 microlearning courses (10 out of 11 participants, or 91%) and for those who had taken 10 or fewer (30 out of 30 participants, or 100%). Despite the high value survey participants placed on the bookmark feature, only one of the interview participants specifically mentioned it as a valuable tool, stating, “Those are very good” (Participant E).
See Figure 30 for a visual representation of all participants’ ranking of importance for the ability to bookmark course progress in microlearning courses.

Figure 30. Frequency of the ranking of importance of the ability to bookmark progress in microlearning courses. n=62

Course feedback tool. Participant F stated that a course feedback tool is important in a microlearning course:

There comes a point where I will eventually not find what I'm looking for and that's where the disappointment comes in and I wish that somebody had created a video segment or a lesson on this particular topic. And in those cases, I would typically provide feedback to the company that produces those videos and hopefully encourage them to create content for it.

Metacognitive knowledge. Microlearning courses generally offer flexible options that engage learners in metacognitive contemplation before and during a course whether they are aware of it or not. Several survey questions asked about common microlearning course features and those results are summarized in this section. Each is followed by the additional insight provided by the interview participants, who went on to discuss other microlearning course features not asked about on the survey. All interview participants discussed their methods for evaluating a course before starting it and how they calculate their best options to proceed through the content.

Decision to take the course. Interview participants mentioned several cognitive processes they undertake to decide whether the microlearning course is worth starting.

Course overview. Participant B evaluates a microlearning course before launching it by going through a metacognitive planning process using the course overview:

What's important about the overview is to give me an idea of what's to follow. I kind of like to know the nature of the course.
Is it just watching videos, is it question and answer, or is it hands on labs somewhere in that course? So I like to use the overview to kind of get a gauge of that.

Participant D described a similar process:

[If] I have a course that is in multiple units that I've had in the past especially like in viticulture, and I know how long I have when I need to complete it, pretty much I'll look at my schedule and I'll look at OK how long do these things take? Are there assignments in between? If not, is there just an assignment embedded? Is there some sort of evaluation at the end? And I'll look at sort of trying to plan out my time to make sure that I can give it my good attention and pick up the information and be able to complete the thing on time… the planning for those things to me is pretty critical.

Table of contents. Participant J stated that a robust table of contents is essential to assist with determining a course strategy:

In order to be able for me to select the right [modules], there needs to be [a] meaningful table of contents that has something other than step one, step two, step three. It needs to talk about the content of each section so that I can make informed choices about which element I want to participate in or read or watch or whatever the verb is.

Content in chunks. Several participants discussed the benefits of having the course content broken down into small segments or chunks. Participant G stated that by “breaking down the topics into simpler things…the [learner] doesn't feel that this is a very complicated.” Participant D agreed and stated, “I'll take technical training that is pretty short, you know 15, 20 minutes. It's enough to give an overview, learn how to do something and apply it very quickly to solve a problem.” As to the appropriate size for a chunk, most interview participants stated that it depended on the course topic.
Participant D stated, “[A] half an hour would be probably something that would be incredibly long. Usually they're 15 or 20 minutes or even less. You know sometimes even five or three. I like that sort of like bite-sized learning.” Participant A stated, “I will find the sections that I think I need and do those first…picking and choosing what's important and what don't I know or need to learn…I only take the stuff I need.”

Time estimates. Microlearning courses often include time estimates for individual modules. These are designed in part to help the learner plan their course strategy. Participant A stated, “[D]efinitely the first thing I always look to see is the approximate time it will take … whether to even start.” Participant B agreed and stated:

Depending on my time, like if I'm in a hurry or I only have like an hour or two, I'll try to do the shorter things first leaving the longer things to when I can sit down and focus. [I look at] the time to complete and kind of figure out [what to work on] based on how focused I want to be. Do I take a long course now or just a short one because I only have a few hours now? Maybe I'll just [take] a quick hour course now and leave the five or six hour one until later.

Video and visuals. Participant I is drawn to courses with video and recognized greater personal learning efficacy from microlearning courses that employed video instruction for technical (in this case, computer programming) training than from courses that required reading through material, yet acknowledged that other learners may feel differently:

I find that I learn much better by actually watching a video seeing someone possibly apply the coding thing that they're trying to teach in that particular class as opposed to just reading. But I know other people that hate videos and want to just want to read the material.
So for my point of view, I do much better with videos and actually listening to a human about what it is that they're trying to get across and then pointing out the various pieces in the code and what each one is doing.

Participant E had a similar perspective: “Videos are always nice. They kind of give you a mix of an instructor led where you'll have a video or a PowerPoint. We can see something, make it more of a discussion, and see things in practice.”

Participant B self-identified overall as a “visual learner” and valued video as well as written text and diagrams as beneficial when defining a course strategy. However, features that other participants identified as helpful to their learning strategy, Participant B sometimes found frustrating:

I do enjoy reading text a bit. But you know if it's presented with a series of diagrams or with bullet points or with different colors, I'm able to remember that or recall it later on. That's kind of how I break up information. The videos I kind of like only if they're short. If they're an eight-minute long video, I tend to get a little frustrated, like I want to get to the task and start working with the thing. So by “visual,” I meant how they present the information like dot points, bullet points, and so on with colors, or with diagrams... things like that.

Participant A is attracted to visual learning as well:

I don't actually often feel like I want to … actually do what they're telling me to do. But I want to see it. I want to see how things are done. Specific examples of how to use the knowledge. So whether or not you actually have to do it yourself or can just read, I think that's incredibly helpful.

Search tool.
Participant F used a mix of microlearning course features to judge the suitability of the course relating to specific learning goals before investing time and planning a path through the content:

I'll do a search and often times the search results will have a strength indicator to show how appropriate this particular course or this video segment is in relation to my search. And that's how I would typically start. And then I would take a look... before watching the video ... I would take a look at the dictation if the speech or the audio is dictated, I would kind of like skim through that first. And then if it's exactly what I'm looking for, then I would invest the time and watch that particular video. And some videos are even segmented down to the minute. And if I can just jump straight to that to address what I'm looking for, that’s what I would do.

Links to additional content. Like all instruction, microlearning courses cannot include all information needed by all learners. By adding links to related content, asynchronous courses can help learners find more information when desired. Participant A stated, “I find it helpful to have...in the margins some tools and tips... ways to use these things that you're learning.” Participant D stated, “I am sort of an expansionist when I think about these things and so sometimes even in a microlearning course I think wow, that's pretty interesting, I'd like to know more about it.” Participant F stated:

I think that's the beauty of learning is that it leads to something else. And I believe that if a training [course] was designed to be a high quality, it should help the learner expand their knowledge. So ... and it may not be something that the learner wants to complete right away, but I think it should at least raise some awareness for the learner to know that oh there's something else that is related to this that you can learn.

Ability to repeat course.
Participant F discussed the importance of having a training format that allowed learners to repeat the course:

Being able to go back as long as there is no expiration… If I were to compare it to a real live training, when you walk out of that classroom your trainer, while they may say they are accessible, may not be as accessible and I think with online training, if I know that after I have completed the course I can still go back a week later knowing that it's there it would be very helpful.

Decision to use course flexibility. Question 16 on the survey asked: “If I could observe you working through your most recent microlearning course, what is the primary strategy I would see you use?” Sixty-two respondents who indicated they had taken technical training in a microlearning format were asked to select the primary strategy they used out of six options. Despite the ability for students to skip around in the content of microlearning courses, overall 29 (47%) of respondents stated they “worked straight through the segments in order.” Thirteen (21%) participants stated they “skipped segments” they felt they already knew; 10 (16%) stated they “skipped around in segments but took them all”; seven (11%) “skipped around the segments but only took the ones that interested” them; and three (5%) selected “planned out my learning path before starting the course.” The sixth answer option was “other” and allowed participants to fill in a unique response; however, none of the participants selected that response. See Figure 31 for a visual representation of the 62 survey responses to this question.

Figure 31. Frequency of single choice responses to the statement: If I could observe you working through your most recent microlearning course, what is the primary strategy I would see you use?
n=62

However, when the results are filtered to the 11 respondents with the most microlearning experience (50 or more courses), the frequency of response choices shifts. Seven (64%) of those most experienced with microlearning worked straight through the segments in order, three (27%) skipped around the segments but completed them all, and one (9%) planned out the learning path before starting a course. Unlike the less experienced respondents, none of these participants skipped segments they thought they knew or skipped segments and only took the ones that interested them. See Table 4 for a comparison of responses.

Table 4
Comparison of Primary Microlearning Course Navigation Strategy

Strategy: All respondents / 10 or fewer courses / 50 or more courses
Worked straight through the segments in order: 47% / 40% / 64%
Skipped segments I felt I already knew: 21% / 30% / 0%
Skipped around in the segments but took them all: 16% / 10% / 27%
Skipped around in segments but only took the ones that interested me: 11% / 20% / 0%
Planned out my learning path before starting the course: 5% / 0% / 9%
Other: 0% / 0% / 0%

Common features of a microlearning course also contributed to the metacognitive process by which learners decided how to approach the content. Question 13 of the survey asked, “Thinking back on your overall experience with microlearning courses, how important is each of the following features to you?” Respondents ranked flexible path options as among the least important of the eight features given. See Table 5 for a comparison of survey responses rating the importance of course features.
Table 5
Summary of Survey Responses Rating the Importance of Course Features

Feature: Very important / Somewhat important / No opinion / Not too important / Not important
Clear course navigation instructions: 34 / 22 / 3 / 3 / 0
Ability to bookmark progress: 33 / 25 / 0 / 3 / 1
Interactive features (e.g., video): 31 / 28 / 1 / 1 / 1
Click through course quickly: 24 / 28 / 3 / 6 / 1
Definitions for terminology: 23 / 29 / 7 / 3 / 0
Progress bar: 22 / 29 / 6 / 4 / 1
Module time estimates: 21 / 30 / 4 / 6 / 1
Flexible path through course: 21 / 30 / 3 / 6 / 0

Many of the interview participants expanded upon their decisions to use a course’s flexibility options. Of those, several mentioned specific ways in which the format suited or did not suit their preferred way of learning. Others described specific situations in which taking advantage of course flexibility was more appropriate than in others.

Sequential path. Interview Participant C stated, “I have a process driven mindset so I do like to work in order sequentially. Probably a tad bit of OCD.” Participant D takes an all-or-nothing approach to microlearning courses:

I take things in order good, bad or indifferent. And maybe that's just sort of my sort of philosophy. I'm going to put a lot of trust in the designer and also how the course is developed to follow it in order because it would be like a novel. There are certain things in one chapter that lead into the next or they build or whatever. So I tend to take things in the order that are outlined.

Participant E also completed individual units in the order presented but did not always complete entire modules:

I tend to stay the course with the module, and I tend to take modules in the order that they were meant to be. I may jump around in larger courses. I may not follow a trail or a larger group of modules. I may bounce around based upon their time...If you think of modules having units, I will stay the course within the module and do [units] in order.
I find that tends to work best for me. But I may take [the list of] modules out of order.

Flexible path. Participant B expressed an appreciation for course flexibility but added a caution when choosing a flexible path:

I can kind of pick and choose what I do, I can decide to deep dive into one particular area one day and then, the next day choose something completely different if I feel like it. So the flexibility is very good with microlearning. You can come up with your own agenda. But it's also kind of intimidating because again you can come up with your own agenda.

Several interview participants stated that they chose their own path through microlearning course content but ultimately finished it all. Interview Participant I found path flexibility to be a benefit but cautioned against skipping content entirely:

I am not just a picker and chooser or “well just pick a couple of them here and there and then never go back to the other ones” because I have been surprised sometimes where things that I didn't think were something that I needed to learn [turned out to be] important. You know there might have been one little nugget of information that was worthwhile to complete that entire particular microlearning class.

Flexible time. Participant H appreciated microlearning’s options for time flexibility and described completing entire courses in that manner, but not always in one session:

I tend to consume it start to finish. It sort of depends on how big the course is. The smaller, the better. I prefer to take a little bit of time a week. Usually at lunch, I'll take 15 to 20 minutes while I'm eating lunch or some down time. I used to set some time aside in the evenings so that I can focus on new skills from there. Again the flexibility makes it a big difference for me for that perspective. I consume it as I need to. But in most cases I do start to finish.
But you know there are times where I break it down into sections and I may spread it out over the course of a week.

Flexible interest. Participant B employed an “it depends” strategy for course flexibility decisions based upon the level of interest in the topic:

If a topic surprises me and I become interested in it, I'll do more. And kind of just forget about the time portion of it. I might set out to say do an hour, but if it's interesting enough, I might sit down for three hours and try to finish the rest of it… [My] intent is whenever I do start a course is to complete it end to end. I might jump around order. It might take me a few days or weeks or months to complete it…I want to absorb everything that there is to see…I won't just start it then pick and choose what I want and then move on. I would like to complete everything in the outline.

Reason for taking course. Several of the interview participants approached the course flexibility decision-making process by asking themselves why they were taking the course. Participant I stated:

I think usually my goal ultimately is to do all of [the modules], right? If I'm looking at one particular issue that a client is having at that time I might drill in and just do one or two at that time. But I would ultimately go back and probably do the entire thing…

New content. Many interview participants indicated that new technical content was best learned in the order presented in the course. For example, Participant I stated:

If [I’m] learning something from scratch, then it would be best if the authority on the topic put an entire path together that I would follow because they would understand better about...which order to teach me the different parts so that at the end I'd learn the whole thing.
Participant G also determined whether or not to use flexible path options depending on the extent to which the course covered new information:

I have to put it into [two] different [situations]. One is if the technical course is in the same field of work that I'm currently in, then I will pick and choose what content am I supposed to learn. Because I know what is there, because I know what I do not know, and what needs to be learned. So if I'm a beginner starting to learn this course from scratch, then I will follow the path given to me because instructors would have had experience with previous candidates and they will have chosen the right path to make a candidate understand where to start from, how to go ahead, and how to build the expertise in that. So being a beginner is one way. Once you already know something in that, then you can pick and choose sometimes.

Participant F found that course completion depended upon circumstance, including learning new technology:

It really depends on my need at that moment. If I'm looking for a particular skill to fill a need, I would just look for that particular video and if I can acquire that skill from that video, I'm done. But there are also times where perhaps I don't know a whole lot about a particular program and I'm just starting at a very baseline of knowledge, then I would be a lot more open to perhaps looking at an introductory video or introductory segment and then hopefully from there lead to more specific areas.

Repeating content. Participant J stated that learner control over module sequencing is important, especially when a need arises to repeat content:

The multiple elements should be randomly accessible in the sense that I could go through and run segment 2, 4, 17, and go back to 1, whether those are video based or word based or any sort of basis.
I need to be able to go through and choose the order in which I go… and I want to be able to repeat things.

New microlearning user. Participant A, who had completed 30 microlearning technical courses, differentiated between the importance of course flexibility for those who are experienced in the microlearning environment and those who are new:

Definitely a suggestion for newcomers or newbies or especially for people who like to mark their progress, it might be nice to have [documented paths]. I mean there may be some paths but if you know there's a path that I can visually see that it's like just like you're walking a trail and here you want to finish this and then finish this and then finish this. So I think paths based on what your professional career goals are nice to see, you know, how can I get where I want to get. For me personally I don't care about them. I don't need a path. I just want to learn what I want to learn.

Motivation Influences

The survey instrument and semi-structured interview protocol used in this study were designed to uncover a range of motivational factors that influenced the participants’ perceptions of the microlearning format for technical training without restricting their input to align with my expectations. Their responses, summarized in this section, aligned with three motivation theory constructs: expectancy value, goal orientation, and self-efficacy.

Expectancy value. As is discussed in the literature review contained in Chapter Two, expectancy value theory concerns the intersection of an individual’s expectations for taking action and the value of the objective sought. This study looked at learners’ expressed expectations for microlearning and the value they placed on it to meet their learning objectives.
Two questions in the survey instrument and one structured question in the interview protocol sought participants’ perceptions of expectancy and value for both eLearning in general and microlearning in particular. Question 14 in the survey included two prompts, as identified below.

Ability to acquire technical skills from a microlearning course. Sixty-two survey participants rated the prompt “I can acquire the technical skills I need from a microlearning course” as follows: 54 (87%) strongly agreed or somewhat agreed; eight (13%) were neutral or somewhat disagreed. However, of the 11 participants who had taken 50 or more microlearning courses, 10 (91%) agreed strongly with the statement. See Figure 32 for a visual representation of all participants’ ranking of the ability to learn technical content in a microlearning format.

Figure 32. Frequency of responses regarding the participants’ ability to acquire needed technical skills from a microlearning course. n=62

Microlearning as an important use of time. Sixty-two survey participants rated the prompt “Taking microlearning courses is an important use of my time” as follows: 56 (90%) strongly agreed or somewhat agreed; five (8%) were neutral; and one (2%) strongly disagreed. However, of the 11 participants who had taken 50 or more microlearning courses, 10 (91%) agreed strongly with the statement. See Figure 33 for a visual representation of all participants’ ranking of whether microlearning courses are an important use of time.

Figure 33. Frequency of responses regarding microlearning as an important use of time. n=62

Survey question 15 included three prompts to identify how participants valued the microlearning format for technical training. Each is discussed below.

Worthwhile to have flexibility in course path.
Sixty-two survey participants rated the prompt “I find it worthwhile to be able to make decisions on the path I take through the modules” as follows: 51 (82%) strongly agreed or somewhat agreed; 11 (18%) were either neutral or did not agree. However, for the participants who had taken 50 or more microlearning courses, this feature held less value than it did for the general survey participants: eight (73%) agreed or somewhat agreed and three (27%) were neutral. See Figure 34 for a visual representation of all participants’ ranking of whether a flexible path through technical training content is worthwhile.

Figure 34. Frequency of responses regarding whether it is worthwhile to be able to make course path decisions. n=62

Worthwhile to have control of time spent learning. Sixty-two survey participants rated the prompt “I find it worthwhile to have control over how long I spend in training at one time” as follows: 36 (58%) strongly agreed; 22 (35%) somewhat agreed; two (3%) were neutral; one (2%) somewhat disagreed; and one (2%) strongly disagreed. However, of the 11 participants who had taken 50 or more microlearning courses, nine (82%) agreed and two (18%) somewhat agreed. See Figure 35 for a visual representation of all participants’ ranking of whether it is worthwhile to have control over time spent learning.

Figure 35. Frequency of responses regarding whether it is worthwhile to have control over time spent learning. n=62

Desire for more technical content in a microlearning format. Sixty-two survey participants rated the prompt “I would like more technical training to be delivered in a microlearning format” as follows: 51 (82%) strongly agreed or somewhat agreed; 11 (18%) were either neutral or did not agree. However, all 11 (100%) of the participants who had taken 50 or more microlearning courses agreed or somewhat agreed with the statement.
See Figure 36 for a visual representation of all participants’ ranking of the desire for more technical content in a microlearning format.

Figure 36. Frequency of responses regarding the desire to see more technical content delivered in a microlearning format. n=62

The interview participants were asked, “How valuable is the microlearning format to you compared to other training formats for learning technical content?” Several interview participants discussed overcoming an initial reticence to using the microlearning format for technical training. Participant H stated:

Originally I sort of passed on the [microlearning] concept because I didn't think it aligned very well [with] my learning style. But I'm now two years in to doing it on sort of a regular basis and either I've adapted or the material is gotten better. I'm not sure which [laughter] but I tend to consume as much as I can get my hands on.

Participant B described a similar adjustment to the microlearning style of training:

I've kind of struggled with microlearning only because I've grown up used to consuming I guess content in a different way. You know growing up without the Internet, going to college where the Internet was kind of something on the side. I was used to just having to go through textbooks and distill out my own notes and kind of stay away from the computer. So it's been an adjustment in my professional life to kind of work in other ways of getting content such as videos and short courses like the microlearning. So it's been an adjustment and I don't know if I'm completely there yet. But I'm at least kind of accepting of it.

Participant J explained that microlearning is just one tool out of many for getting the technical training appropriate for the individual and for the situation, and not always the preferred option:

[Microlearning] is a very useful tool but it's like one arrow in the quiver.
And I feel like you need a quiver of many distinct elements that are complimentary…maybe a dozen different things. From my perspective, I would preferentially choose other formats first. I feel like microlearning in a specified course causes me to go down the learning path of the vendor or whoever it is who put together the training. And frequently I'm looking to answer a specific question like does this tool offer this feature or can I connect this widget with that widget. And I find that particularly a video-based training course tends to force me down exactly one path. And I'd really rather not have a serial mechanism so I have to go through their steps explicitly. I'd rather have a random access method where I can search for the thing I want, go to step 17 of 30, read step 17, answer my question, carry on with my day.

Value of tangible course features. Several questions on the survey and in the interviews sought the participants’ insight into features of eLearning in general, and microlearning in particular, that they valued when taking technical training. Many of the responses pointed to specific suggestions for microlearning course design. Those are discussed in this section.

Value of intangible course features. In this section, I present the more intangible value found in the microlearning format. Question 7 on the survey asked all participants with technical training experience, “Thinking about your preferences for learning technical content, please identify how important each of the following statements is to you.” Participants were asked to rate the importance of course features on a five-point Likert scale with response options that ranged from “Not important at all” to “Very important.” Each of the statements is discussed below with data from the survey responses supplemented by interview responses.

Convenience.
Sixty-three survey participants rated the prompt “Learn technical content when it is convenient for me” as follows: 59 (94%) regarded convenience as very important or somewhat important; four (6%) either had no opinion or did not find it important. See Figure 37 for a visual representation of all participants’ ranking of the value of convenience in technical training courses.

Figure 37. Frequency of the ranking of importance of convenience. n=63

Interview Participant B described a common perception expressed by the interview participants on the value of convenience offered by online learning in general:

Just the fact that it's available online is very helpful because you can access it anywhere you are. You could be at your workplace or at home or in a different country and kind of pick up where you left off. So that's very useful. You can't do that with books. You kind of have to carry them around with you. And that's a bit of a pain.

Participant C found value in the convenience of time flexibility afforded by online training:

I do like that I can choose my content and work at my own pace [and] focus my learning on things that are most relevant to me. I can do it round the clock at my convenience. I can work at home. I don't have to work at the office to do the training. I'm not in there from 9:00 to 5:00 waiting for lunch, blah blah blah. I can work from 6 to 9 or 9 pm to 11 pm after the kids are put to bed. And I can go learn and work on something if I've got a free evening.

Participant G discussed that microlearning offered the convenience of improved concentration on technical content:

When we look at microlearning, in a short amount of time you learn a good amount of knowledge [and] you don't deviate from what you're concentrating on. So microlearning could be a platform where you get to spend quality amount of time learning knowledge, maybe a refresher or something new that you're learning.
But it will obviously help a person grasp [learning] at a faster pace in a shorter span.

Participant D commented on the convenience of mobile eLearning: “We definitely are training people in companies to have to engage in and develop short term ability to concentrate on things. …they're putting [training] on telephone applications so that you can look at it anytime, anywhere.”

Short segments. Sixty-three survey participants rated the prompt “Learn technical content in short segments” as follows: 53 (84%) regarded learning in short segments as very important or somewhat important; 10 (16%) either had no opinion or did not find it important. See Figure 38 for a visual representation of all participants’ ranking of the value of learning technical content in short segments.

Figure 38. Frequency of the ranking of importance of learning in short segments. n=63

The interview participants were unanimous in their positive perceptions of “chunking” technical content into small segments for both time efficiency and retention. For example, Participant E stated a preference for “topics broken into nice usable chunks, you know, 15, 20, some 30 minutes to an hour. They're very handy.” Participant I, however, questioned the impact on retention when learning small chunks of content:

I mean the chunks just make it possible that if you've got just an hour or two here and there you can sit down and do a small amount of stuff. It's a little bit of a two edged sword, I believe, in that some people don't do a good job of retaining that information so that if they were to move onto the second part they might not totally remember the first part. So it's a little bit on each individual's learning style and what they're capable of learning. Some people don't do very good at learning things out of order. So I guess everybody's a little different. But for me I think it works pretty well.
Participant A expressed a different perspective and stated, “You're going to retain more in a microlearning class than you would in an instructor-led class if you're not using that [knowledge] right away…little snippets are easier to remember.” Participant B offered a perception unique among the interview participants, touching on the value of chunking as it relates to personal accomplishment:

[Short segments] helps with giving a sense of accomplishment like when you start something new, it just feels overwhelming and big and large. So breaking it up into short segments kind of helps you track your progress and it makes you feel that you're on the way to learning what you set out to do. It's very helpful. It also helps with breaks. It kind of gives you a natural time to get up and take a little break if you have to.

Participant D referenced the value of knowing learners can revisit specific content when necessary:

[It] is great for being able to go back to again and again. I like it because it's short in duration. Very focused on single topics or partial parts of topics. It's almost like going to an encyclopedia and you can kind of turn to a certain subject and in a very short period of time get a great overview and then that maybe may lead you to another place to get a more in-depth understanding…[It] doesn't take a lot of time and it's very efficient.

Learn only specific content. Sixty-three survey participants rated the prompt “Learn only the specific content I need” as follows: 43 (68%) regarded the ability to take only specific content as very important or somewhat important; 20 (32%) either had no opinion or did not find it important. See Figure 39 for a visual representation of all participants’ ranking of the value of learning only the content they need.

Figure 39. Frequency of the ranking of importance of learning only specific content.
n=63

All interview participants had a positive perspective on the value of being able to access only the technical content they felt they needed. Participant A stated:

I just approach it piecemeal because I tend to do it for the small learnings. I will find the sections that I think I need and do those first… So yeah, picking and choosing what's important and what don't I know and need to learn.

Participant G mentioned a slight variation depending on existing experience: “If the technical course is in the same field of work that I'm currently in, then I will pick and choose what content am I supposed to learn…I know what I do not know and what needs to be learnt.” However, recall the earlier perspective offered by the very experienced Participant I about the danger of learners assuming they know what they need to learn and, in the process, skipping content that is valuable.

Ability to take content in any order. Sixty-three survey participants rated the prompt “Have the ability to skip around in a technical course to take the content in the order I want” as follows: 45 (71%) regarded the ability to take content in any order as very important or somewhat important; 18 (29%) either had no opinion or did not find it important. However, for the 11 participants who had taken 50 or more microlearning courses, and consistent with the finding in the metacognitive section of this dissertation, the value of the flexible path option decreased: eight (73%) of the most experienced microlearning participants found flexible pathways very important or somewhat important and three (27%) found that feature not too important. See Figure 40 for a visual representation of all participants’ ranking of the value of skipping around in the course content to take it in the order they want.

Figure 40. Frequency of the ranking of importance of taking the course content in any order.
n=63

As described in the metacognitive section of this paper above, the interview participants discussed the flexible path feature at length in response to questions about the processes they employ when deciding to take a microlearning course. As for the feature’s specific value in learning, several interview participants shared the perspective that its primary usefulness is for finding specific information on demand, not for learning the full content of the course. As was stated by Participant I:

I mean there are some things where [I’m] researching an issue that a client's having, I can get in and just look at one particular piece. But if you're learning something from scratch then it would be best if the authority on the topic put an entire path together that I would follow because they would understand better about how to…which order to teach me the different parts so that at the end I'd learn the whole thing.

Goal orientation. As is discussed in the literature review contained in Chapter Two, goal orientation theory relates to an individual’s reason for action. Two types of goal orientation were examined in both the survey and interview questions: mastery and performance. Mastery orientation involves a student’s goal of actually learning a task, comparing their current performance to their own prior accomplishments. Conversely, performance orientation involves a student’s goal of doing better than anyone else regardless of whether the content is actually learned or transferred to the job. Question 15 of the survey asked respondents about the microlearning format’s effect on their goals.

Learning goal achievement. Sixty-two survey participants rated the prompt “The microlearning format helps me achieve my learning goals” as follows: 57 (92%) strongly agreed or somewhat agreed; five (8%) were either neutral or did not agree.
However, all 11 (100%) of the participants who had taken 50 or more microlearning courses agreed or somewhat agreed with the statement. See Figure 41 for a visual representation of all participants’ ranking of the ability to achieve their learning goals in a microlearning format.

Figure 41. Frequency of responses regarding participants’ ability to achieve their learning goals in a microlearning format. n=62

A more in-depth understanding of goal orientation was sought during the interview sessions, including microlearning’s impact on goal achievement. The results are broken down into mastery, performance, and dual goal orientation.

Mastery goal orientation. As is discussed in the “Overview of Survey Participants” and “Overview of Interview Participants” sections in this chapter, while 56% of the survey respondents worked in a technical role, 100% of the randomly selected volunteer interview participants were employed in technical positions. Therefore, the interview data were unintentionally skewed to the perspective of individuals in the field who likely require ongoing training on technical content. Survey questions were not designed to uncover the extent of participants’ mastery orientation other than by role; however, interview participants provided consistent responses on the goal of learning quickly in the IT environment.

Stay current on fast-changing technology. Participant J succinctly stated, “I think that people ought to be taking the training for the knowledge and for the benefits that you derive therefrom.” Participant D had a similar perspective:

[T]he motivation for me for taking any of this training is to know how to solve the problem that I've been given to solve. If it's by a company where I'm given an overview about how to use certain systems or navigate in their IT infrastructure, I want to be competent at it. It's usually required.
So I guess from a status standpoint I don't want to be blacklisted in any way by saying hey you didn't do something and they get on my case about it. I just want to make sure I know how to do what I'm being asked to do.

Participant J’s perspective was that the speed of technological advancement requires a mastery orientation to stay current for career protection: “The nature of my job is technology based and technology moves at a pretty rapid pace. In order to maintain some industry relevance as I grow older, I need to maintain active and current knowledge of technology.” Participant G also discussed how constant training is required to keep current for career growth:

As you know, every single day there's something new coming up and unless I keep myself up to date, I increase my expertise in my field of work, there is no chance that I would grow in my own career…. It’s my responsibility to learn new things and I'm curious to learn something new in the market…Maybe if I start learning a new thing in my own field of work, what happens is the productivity at my work will increase. The efficiency…at my workplace will be better, thereby, my manager and other management will take a look at that, which in turn develops into [opportunities]. People will identify me as a strong resource available in technology. It's a way that you're creating a good record for yourself, which in turn fetches you the prizes, which is obviously the career.

Participant I discussed how the pace of change in technology creates challenges just trying to keep knowledge current: “I'm [a] computer scientist, software developer. And the industry is changing faster than we can possibly keep current. [Technical training] allows me to try and stay current and keep up to date with new technology as it's released.”

Certified competence. Several of the interview participants were technicians on the Salesforce cloud computing platform.
Certifications on Salesforce require a degree of mastery and, therefore, they are simultaneously an opportunity for individuals to challenge their expertise, an external confirmation of technical competence, and, in some circumstances, a requirement for employment. Interview participants who expressed mastery goals (as opposed to performance goals) for certification included Participant G:

Not every technical course [has] a certification exam. It may be beginner, intermediate or advanced level of certification. But once you clear the certification, it assures you that you know something. You know a little about the particular concept. And that is a measure that the management or any other person who is evaluating me in terms of technology, it reminds them that I know something.

However, microlearning was not the training method of choice for Participant I to accomplish mastery for certification: "I wouldn't rely completely on the microlearning only because I'm not sure if I would get everything that I need in order to pass the certification exam...So, I would defer back to getting a textbook to get that information."

Develop confidence and innovative thought. Participant G offered a unique perspective on mastery orientation:

So when we talk about what do we get after completion of the technical courses… it is expertise in the same field. You feel more confident about implementing and you may start thinking in an innovating way that maybe you would have come across such a suggestion earlier, but you find a better solution than earlier.

Performance goal orientation. As mentioned above, many of the interview participants were technicians on the Salesforce platform. As such, they were encouraged to utilize the microlearning platform known as "Trailhead" to acquire up-to-date system technology training.
Trailhead offers badges for course completion, with accompanying points for quizzes and projects as well as global leaderboards. Therefore, when interview questions probed performance orientation, multiple respondents discussed their affinity for or dislike of badges, points, and other gamification features that may be included in microlearning. Two interview participants acknowledged the existence of recognition opportunities but stated that recognition was not their primary goal for taking technical training. Participant B stated, "[Recognition is] usually kind of afterthought I guess." Participant I stated, "I'm an introvert which means that I'm much more motivated by myself. That being said you know I do get some recognition from the company for getting at least certifications."

Gamification and competition. Several interview participants' comments indicated a distinct performance orientation (or lack thereof), including their interest in gamification and competition in microlearning courses. Some participants presented expansive explanations of the benefits, some delivered terse statements of disdain, and others were neutral. Participant B expressed a moderate view of gamification: "It's nice to get a badge or something…I've found that to be effective with motivation. Try to earn a badge and that kind of keeps you going with trying to do the next one and the next one." Participant C was very "motivated by the certifications and the badges and how many likes or questions or comments have I answered in the community, all of those sorts of things ...the social interaction of it." Participant E, an IT team leader, described a positive perspective on performance orientation:

If the tool is fun and full of good information, the staff actually sees it more of a game and learning is a side benefit. And so the gamification, the recognition...and simple recognition, you know.
I use badges, and points of honor where they can brag and say I've done this or I went through this and now I'm doing this over here and it directly benefits them and shows that correlation. It just helps and it keeps that involvement and excitement going, especially as the teams that I direct are learning new technologies and applying new technology and showing benefit to the company. So I fully support it. I think it's a great tool.

Participant C presented the clearest exemplar of a learner with a distinct performance orientation by providing multiple and lengthy discourses on Trailhead point calculation and leaderboards. One such sample is truncated here. A notable emphasis in this transcript is this participant's comment that it is Salesforce's goal, not the participant's, to use Trailhead to learn the technology.

I do like the competitiveness of the badges and the ranks of Trailhead. There are a couple of leaderboards where you can track yourself. The prestige of the badges and your certifications—sharing that out through your social profile and demonstrating your knowledge and your expertise—that's important in the Salesforce space. You don't want to be just an admin, right. You want to be a well thought of admin in the community… But I don't like that … [Trailhead courses] continue to decrease the possible points if you are getting things wrong… I do feel that making mistakes is part of the learning process, and their ultimate goal with us is that we get the content correct… you lose half the points the first time you're wrong, you know. Then you lose half the points again the next time you're wrong. Right? Why not just take 50 points away every time you're wrong?

…Not all modules are created equally. Not all modules are scored equally… So something that takes 10 minutes or 15 minutes is a 100; something that takes two hours there's only 500.
So there's disproportion in that…I have 75 plus badges, 45,000 plus points, I'm one tier away from being the highest level which they rank from various Salesforce users…At one time, I had close to 90 percent of all of the badges completed. Now the badges roll out so quickly that that overall percentage is significantly lower than 90 some percent. But that's where I spend a good part of time looking to make sure "Hey, there's a new badge. This badge is only available temporarily. Got to earn it during spring release." So that when the summer release comes, the spring badge is gone, you've earned that badge and you accumulate your badges over time…If they introduce a new badge to a trail I've already completed, I'll immediately go back and do that one trail to get that one badge to get that trail completed again. I don't like to see on this trail 100 percent now it's only 95 percent because there's a new badge in it.

Those interview participants who held a negative opinion of gamification tended to respond more concisely. Participant A simply declared, "I could care less about badges or anything that comes with it." Participant J was similarly succinct: "I could care less about badges. Gamification drives me nuts. I think it's silly."

Certification for recognition. Participant J explained that technical training can lead to certification, which may be a required performance goal for the individual as well as the company: "Our firm has a partnership with a technology vendor who requires us to maintain specific certifications. And so part of what I have to do is maintain this certification by doing some technology training along the way." However, this participant personally viewed certifications to be as "silly" as gamification badges:

Similar to gamification, Salesforce's approach of requiring certifications, from my perspective is silly and kind of stupid.
I would like them to focus on the knowledge and I understand I have to demonstrate the knowledge in some fashion. But I feel like the exams that they provide don't really do a good job of measuring that. I feel like there's a challenge…in the industry of how to meaningfully measure someone's expertise and share with them the relevant information that they need in order to do their job. And I feel like the five [certification maintenance exam] questions every six months that Salesforce does isn't a very good way to do that.

Both mastery and performance goal orientation. Participant E described a balance between mastery and performance goal orientation:

[Recognition] is a nice to have. It's more important to me to have the knowledge and have the understanding of the technology, to have usable knowledge, than it is to have a certification or a badge or a piece of paper on the wall… But I'm like any other human, right? We like to have our recognition and acknowledgement, especially when we're learning something new, but it's the actual usable information that's more important.

Participant J described situations in which the learner may not have control over goal orientation:

If someone's measuring me on completeness like a Salesforce partner requirement that some vendor or customer or employer requires that I achieve this badge, then I'm going to complete it. If I have interest in the topic area and the course maintains my interest, I'm going to complete it. If I am treating the training as transactional in the sense that I'm looking for a specific answer, I'm gonna answer my question and then I'm going to leave the training and move on to whatever task I was really trying to complete for which the training happened to be a reference manual.

Participant H expressed a similar balance between mastery and performance goal orientation:

There's a fair amount of both [recognition and learning].
In certain scenarios, some of the learning platforms have gamified that a little bit. And so people start publishing leaderboards and that adds some either internal competitions or competitions with people in the industry around consuming and completing the material and then using that information to produce applications or solutions based on that. There's probably just as much a driver for me personally as it is to stay current.

Self-efficacy. The majority of participants in the survey (57%) and the interviews (100%) worked in a technical role. Therefore, their perceptions of self-efficacy within the microlearning format are likely not representative of the general adult learner population. The data on the topic of self-efficacy, however, produced insights into how confidence in the format was developed. Three questions on the survey asked participants about their level of confidence in their ability to use microlearning for technical training. Question 15 of the survey asked, "Referring to your overall opinions on the use of the microlearning format for technical training, please rate the following statements." The rated statement that related to self-efficacy is discussed below.

Easier to learn technical content in a microlearning format. Sixty-two survey participants rated the prompt "The microlearning format makes it easier for me to learn technical content" as follows: 51 (82%) strongly agreed or somewhat agreed; 11 (18%) were either neutral or did not agree. However, all 11 (100%) of the participants who had taken 50 or more microlearning courses agreed or somewhat agreed with the statement. See Figure 42 for a visual representation of all participants' ranking of the ease of learning technical content in a microlearning format.

Figure 42. Frequency of responses regarding ease of learning technical content in a microlearning format.
n = 62

Question 14 of the survey asked, "Referring to your level of confidence specifically in the microlearning format for technical training, please rate the following statements." The three rated statements that related to self-efficacy in that question are discussed below.

Confidence in ability to navigate in a microlearning course. Sixty-two survey participants rated the prompt "I can successfully navigate through a microlearning course" as follows: 61 (98%) strongly agreed or somewhat agreed; 1 (2%) somewhat disagreed. However, of the 11 participants who had taken 50 or more microlearning courses, 10 (91%) strongly agreed with the statement. See Figure 43 for a visual representation of all participants' ranking of the ability to navigate through a microlearning course.

Figure 43. Frequency of responses regarding the participants' ability to navigate through a microlearning course. n = 62

Most interview participants stated that their perception of efficacy with the microlearning format has increased as they have taken more courses. Participant H stated, "[M]y confidence has grown the more that I've used it…I think [it has increased] as the industry and the content around that has matured some, and you've got certainly a variety of different approaches that are being introduced." This statement is consistent with the earlier findings showing that participants with more than 50 microlearning courses expressed a higher level of value in the format. Participant F also credited the improvements the training industry has made over time:

Actually very confident especially at this point in time. I think going back perhaps as far as maybe 10 years ago and up until five years ago, online training was fairly new so the quality may not have been always there, the technologies to support it might not have always been there.
But today with all the different online training resources that are available I feel very confident in accessing it and using it and being able to learn at my own pace.

Other interview participants were more guarded. Participant I stated, "I think it depends a little bit on the microlearning program that they put together. Some are better than others."

Organization Influences

This study asked participants about training in their professional work environments to identify potential cultural settings (e.g., time, tools) and cultural models (e.g., value of professional development, ingrained traditional learning methodologies, commitment to knowledge management) that may impact the use of microlearning for technical training. Both survey and interview responses are discussed in the following sections.

Cultural setting. Survey question 9 asked respondents to estimate the percentage of technical training delivered in their organization in each of eight formats. One hundred ten (110) survey participants responded to the question. The eighth option of "Other" elicited open text responses as follows: training on my own (five), no technical training by the organization (two), and conferences or field based (two). See Figure 44 for a summary of the responses.

Figure 44. Percentage range of technical training type delivered by survey participants' organizations. n = 115

Question 8 of the survey asked participants to rate their current organization's cultural setting for technical training by responding to the prompt, "My organization provides the professional development courses I need." One hundred thirteen (113) participants responded as follows: 21 (19%) strongly agreed; 48 (42%) somewhat agreed; 17 (15%) were neutral; 13 (12%) somewhat disagreed; and 21 (19%) strongly disagreed.
However, the 11 participants with more than 50 microlearning courses had a somewhat different perspective: although the percentage of those who strongly agreed (18%) and strongly disagreed (18%) was similar to the full group, 27% of the most experienced microlearning participants somewhat disagreed that their organization gave them the professional development courses they needed. See Figure 45 for a visual representation of all participants' ranking of the extent to which their organizations provided the professional development the participants needed.

Figure 45. Frequency of responses ranking the extent of needed professional development provided by organization. n = 113

Interview participants expanded on their organizations' cultural settings for technical training, including tools, budget, and time.

Training tools. Most interview participants described technical training as a self-selected process, with resources either available free online or provided by the organization. Participant B's organization was unusual in that the cost of paid resources did not pose a problem:

Quite often we're in charge of our own learning…[however my company is] happy to provide us resources so I'm able to suggest I want to go in this course, or we should buy these books, or we should buy this online service. And they're more than happy to pay for it…What I would like from the workplace is just opportunities to apply the learning that I do in my day to day jobs. But it's more important that they just support me with the resources.

Participant C's response was more common: "The company itself doesn't send us off to a lot of training from a technical perspective so it helps that the resources are available online."

Budget. Three interview participants referenced the barrier of expense for technical training.
Participant E explained how microlearning improves the budgeting process:

We're like a lot of companies. We have a limited budget but we do budget each year for some training for our staff. So we have to be very particular about how we use that money. And so I see computer-based learning as a huge tool to help me optimally manage that. And microlearning gives me a finer scalpel to particularly control how I'm spending that training money…microlearning helps me to give people the learning they need when they need it to better utilize my budget.

Participant I expressed a self-improvement perspective: "Getting [the company] to pay for classes that they agree are appropriate classes to take is great. But a lot of the stuff that I've done through the company I probably would have done on my own anyhow." Participant C articulated an economic cost/value perspective:

Technical training is expensive. The economics of today and the rate at which skills are changing, it seems easier to offboard someone with 15 plus years experience. That's a very expensive employee who needs technical training to keep up with the skills and the new technologies. Offboard them ... that's a gentle word, right? You go onboard a recent college graduate already trained in that ... that'll be more economically beneficial for the company.

Dedicated training time. Interview participants had mixed perceptions when asked whether their organizations afforded them dedicated training time. Some participants confirmed that their organizations allowed them to carve out time within their workday for training. Participant E succinctly stated, "We allocate time. It's part of the cost." Participant F gave a managerial explanation of providing time to train:

We do provide time for employees to train. In my department, the technology department, we have designated times where we cut out and I give my staff freedom to attend the online courses.
We subscribe to Lynda.com and we provide time for the employee to sign on. And as long as they come back and they demonstrate that they've completed the course, and I can also run a report that shows that, I'm very satisfied with that protocol.

Some participants felt it was part of their job to do their technical learning regardless of whether it was done during working hours, and a few preferred learning on their own time. Participant B stated:

They're very open to giving us the time needed [to] pursue whatever we want to learn. So that's very helpful. In some places, you've just got to churn out work and there's not much time given to you to skill up…[This] company allows us to do it somewhat on the clock but we have to do most of it [on] our own time…I feel it's part of the job. I feel I'm actually more productive doing it at home than I am at [work].

Participant H explained how an organization has been building training time into the workflow: "Certain teams have been working it [into] project specific items…but it's not explicitly enforced or required. …But I wouldn't say that there's a lot of visibility to that today." Participant A also mentioned a lack of communication of the organization's policy:

I actually don't know. I'm not sure…I have no idea. I haven't taken any…[N]obody—hardly anyone probably—actually takes time off of work even if we're allotted it. And what I'd like to ask but I've never felt comfortable asking and I don't know if other people feel this way is "Can I just take three days off to study for these exams on my own and to do learning in microlearning formats and different types of things? Does that count as professional development?" So I think as we shift to microlearning and doing things at home, people are just doing them on weekends instead of actually taking their professional development time.
So I think companies need to shift how that works.

Participant C expressed a challenge in a consulting organization between consultants' need for training to stay current and the organization's time billing demands:

[T]here is value in the technical training. There's also value in delivery and meeting deadlines and the business will make decisions over "We need to get to the deadline. I don't have time for you to go learn it ... learn it while you're doing it." Right? And you're salaried. So if it takes you 60-70 hours that week, so be it. That's all part of "your job." It's not "Hey, go learn it for a week and come back the next week" which would be two 40-hour weeks. You're doing your learning and your job in the same week…Nowadays it's more do it on your own time.

Cultural model. Question 8 in the survey asked participants to respond to two prompts regarding the cultural model for training in their organizations. The first prompt was: "My organization encourages me to take ongoing professional development." The second prompt in Question 8 was: "My organization has a strong commitment to employee professional development." Overall, the participants perceived that their organizations provided them slightly more encouragement than commitment to their professional development. See Figure 46 for a comparison between organizational encouragement and organizational commitment for professional development.

Figure 46. Frequency of responses ranking the extent of organizational encouragement compared to organizational commitment for professional development. n = 113

Interview participants provided varying perspectives on their organizations' encouragement and support for professional development and technical training.
Throughout the interviews, responses that described negative or ambivalent cultural models indicated a lack of communication from the organization to its employees about its encouragement of and commitment to professional development. Participant F provided a managerial perspective that mentioned a cultural model of encouragement, commitment, and employee empowerment:

[Technical training] is not so much expected. And it's not so much even a policy. I would say that it's having it as a resource so that our employees feel empowered. If they feel like they need to learn something or they just don't have the answer to something that they know that that resource is there for them…We have a staff of approximately 10 and we strongly encourage the use of online learning for many benefits [including] cost reduction, reduction in travel, and conference costs.

Participant G provided an employee perspective and recognized the win-win benefit of organizational encouragement and commitment:

We have mandatory 90 hours for technical training per financial year…It doesn't matter if it's classroom or whichever…I'm an employee. I'm working towards the growth of the company. I'm putting in my effort. They're helping me by giving me time and reimbursing me for the course fee. So this is a win-win situation for the company and the employee.

Participant A, a virtual employee, was unsure about the company's encouragement and commitment to professional development and expressed a disconnect with the company overall:

I haven't felt like they've supported [technical training] at all. They might support it financially but I just don't know…It definitely decreases my motivation…I'm guessing they might be supportive if I also were more proactive.
But I've been so busy on one project that I haven't been proactive on doing anything other than what I need to do…I don't feel connected to my company at all much around professional development or anything else.

Participant B described an ambivalent cultural model for professional development:

I would say [professional development] is an afterthought. I wouldn't say that they see it as a mission critical thing…I don't think they really make a point of it at all…There's that element of trust where it's kind of, you know, you have the freedom to decide if you need training and when and how.

Interview Participants' Implementation Recommendations

In addition to the structured and semi-structured responses, interview participants offered additional recommendations for implementing microlearning instruction that do not fit within the KMO framework. These are presented below, including specific quotes from participants.

Select appropriate topics for microlearning.

Skill type. Participant F felt the appropriateness of the microlearning format depends on the type of topic: "If someone is trying to learn…a creative experience, it may not be something that microlearning can address...
But…technical skills [where] typically there is a correct or incorrect way of doing things, I think microlearning is the perfect platform." Participant H expressed a similar perspective: "Things that are reliant upon real human interaction… might be better to do in a course so that you have some direct one on one with an instructor or with other students so that you could exchange ideas." Participant E gave a specific example:

I think they still may need an instructor for some of the softer…skills [like] selling… I think I can use microlearning to help people understand our processes and procedures…But I think there's still going to need to be a set of training where they're more interactive with a seasoned sales person that will walk them through having a conversation with the customer.

According to Participant D, "It really depends upon the content…I don't know if that would work in a more abstract knowledge. Concrete knowledge sounds to me in my experience has been extremely helpful and very appropriate. Abstract stuff I'm not sure."

Skill level. Participant A stated that the appropriateness of microlearning depends on the skill level of the topic:

[I]t's a spectrum where more in-depth, more intricate types of learning, or more advanced types of learning I have a feeling that microlearning may be a more difficult format. If you can't ask questions and have someone give you immediate input, I think it may be less beneficial. I think it's more beneficial on the novice to medium versus hard types of activities…I've never tried to go take a microlearning course on anything that I find to be super advanced.
Participant G gave an example of the different levels of learning in technical programming content:

When you start learning programming [languages] over the computer, …if the [virtual] instructor is very good, he has all the knowledge that is required…and will be able to provide you with the basics and how to kick start with something. But you cannot expect the instructor to give you each and every component that you need in order to build it.

Specify pre-requisite knowledge. Most interview participants mentioned the need for courses to include built-in statements of prerequisite foundational knowledge so that learners are advised of what they should bring to the course before attempting higher-level modules. Participant H stated, "[I]n a technical realm, there are certain prerequisites as far as understanding that you have [a] solid … base to work from."

Include support information. Even with extensive testing, a microlearning course can experience errors. Incorporating into the course a list of resources for getting help improves the learners' experience. Participant D observed, "[There are] hiccups of any technology like this we have and not everything works as well as you'd like it to work all the time."

Incorporate social learning. Participant D recommended expanding course discussion beyond the walls of the programmed instruction: "When we talk about a certain either video or a concept that was given I actually do think that that helps with internalization…If you added something to deepen it through dialogue or discussion, I think that would definitely help." Participant E agreed:

Not all sites have a level [of feedback] that I think is beneficial but some sites that I've used have various forums in the terms of bulletin boards or some sort of electronic platform where the students can ask questions either of their peers or to instructors and it helps.
So you kind of get the benefits of the instructor led with the benefit of time management that you have on the micro site computer based learning…You could email or communicate with peers and instructors. And it's important.

However, Participant H stated:

I found that the message board approach to getting help is not very effective. It's sometimes hard to translate what you're asking for if you don't fully understand the terminology or the technology which leads to people giving you short, inaccurate, go look here for the answer...without really enhancing your understanding.

Include robust assessments. Interview participants had mixed feelings about built-in assessment tests and projects; the negative perceptions focused on their quality. Participant F stated:

If I had to evaluate the microlearning platform, one of the areas that is lacking would be assessment. So there aren't really good tools at this point at least that I know of that can thoroughly assess an employee and then immediately sort of assign them videos or lessons to address that need. The assessment part is still done on a self-reflection basis or perhaps even a supervisor saying, "Well you know I really think that this is an area that you can improve and perhaps you can take this course." But the assessment isn't quite there yet.

The idea of including hands-on challenges in technical courses was generally perceived as offering beneficial support for technical learning in asynchronous environments. Participant H stated:

[W]ith technology in many cases you have the ability to introduce the concept and then now do a challenge, right, of some programming assignments or something along those lines and have it technically vetted to see if it meets the technical requirements in the outcome. I think that makes a big difference.

Participant E agreed: "I also like hands-on modules that walk through a concept.
Like build a practice program or you may do something else, not just read and answer questions."

However, Participant J stated that the value of skill challenges is dependent on the learner's goal for taking the microlearning course:

It depends a little bit on why I'm coming to the microlearning. If I'm coming to the course or the instruction because I want to develop mastery in a topic, then yeah, that sort of skill challenge makes sense. And even better if it's something other than just a cursory 'answer these three multiple choice questions' [exercise] so that I actually have to think and maybe go back and refer and figure out how to synthesize the information and build a construction of my own that incorporates that. Then I feel like that's good. If my purpose for coming to the training is that I need the answer to 'can this thing connect to the other using this mechanism,' if I've got just a question of fact, then I don't want to have any sort of skill challenge. I want to be able to go look up and find the response to my question, answer my question 'can I connect these two things using whatever my tool is,' and, if the answer is 'yes,' great. And if the answer is 'no,' that's fine too. I just want to be able to find the answer. And frankly I'm providing my own skill challenge of 'did I find the answer' and I don't want to be compelled to participate in any sort of skill challenge. But that's based on whatever reason it is that I'm coming to the learning.

Participant I stated that courses that include hands-on projects are generally beneficial for technical training; however, there is variation in the quality of the assigned tasks and, therefore, in their efficacy:

Sometimes they have very contrived examples, or they sometimes have an extremely simplistic example and I end up with a client that's got some totally different wonky thing that they want that makes use of this but in a totally different way.
So sometimes some of them are very good and I have gotten a good level of knowledge out of them. And immediately I want to go start making use of that and I can go ahead and immediately start [using] that. But the other ones I can't. But at least if I have had to write the code or apply it then I have better knowledge than if I've just read some dry paper or something like that. I find that there are some [course projects] that pretty much almost babysit you through the solution. So the less that I have to challenge myself in order to solve whatever it is they're trying [to] challenge me on, the less I'll retain. The more I'm challenged, the more I actually have to think about it and craft a solution, the better I will be in retaining it. Increase breadth of microlearning courses. As mentioned above, not all topics are appropriate for instruction in the microlearning format. However, several interview participants proposed that non-technical content should be considered. Participant H proposed: I'd love to continue to see the trend around microlearning expand to more than technology. I think that it's possible to do a number of real world topics and cover those in microlearning and do it in a way that you really feel like you walk out of that experience and really understand it. Research microlearning best practices. Participant D recommended researching the “really good in-class standards. There are organizations that you can even just do some research on to find out who is known for that…[who can] point out some pretty big error things that you would want to avoid.” Test the course. Interview participants recognized that microlearning is a relatively new format for technical training and, as such, its quality can improve over time. Several participants, however, described courses that they felt were rushed to market.
Some interview participants articulated specific recommendations for testing before courses are broadly disseminated. Ensure the quality of the instructional design. Several interview participants described how not all microlearning courses are created equally well and how poor design can inhibit learning. Participant D summarized the disparate nature of course quality as follows: I can go from experiencing one when I've taken a really good short microlearning course that's really laid out well. I mean I really feel like I learned. Some of the other ones I feel like it's a chore, you know, that I'm spending more time navigating through the system than I am learning the content. Participant F stated that the value of the course is dependent upon the quality of the source: We have to evaluate who's the source of that training resource. We often have to compare, shop around, or maybe even consult with others in the field to determine whether or not that particular training is of good quality [and] the quality of education. We find that when we do find something that's of high quality and is produced by experts that really know what they're talking about, that in itself is more valuable than sometimes hiring an outside consultant that may not have all the skills. So having someone that's qualified and be able to provide that training online is something that we strongly encourage. Perform functionality testing by a range of end users. Participant C recalled being frustrated by a new microlearning course: “It's not working! [Try again.] It's not working! [Try again.] It's not working!” There was no guidance or leeway.
I think they build and they learn almost using us as guinea pigs [laughter] to do it, you know, to test it...They can be quicker to market by not providing instruction, looking to see how we're going to end up using it, so that they can fix it around the common uses rather than all the edge cases. Perform accessibility testing by a range of end users. Not all learners will have equal Internet bandwidth to access microlearning, especially courses that rely heavily on video, when taken by those in rural or global locations. However, Participant G stated that the consistency of access is improving: Currently the [international Internet access] situations are really better. If you look at Bangalore and Chennai—cities in India—these are rapidly growing. Last couple of years, the Internet usage among the youth and those who are working in the IT industry has rapidly increased. And we recently had the launch of 4G Internet services, which is the fastest. The people [used to] stay later at work to take online trainings for the fast Internet speeds. [Now] the mobile and the broadband networks at their homes are faster…It’s quite accessible to everybody. Summary In this chapter, I presented the results from the quantitative survey supplemented with the qualitative data obtained through participant interviews. All of the assumed influences described in Chapter Three were either fully or partially validated by the data described in Chapter Four and additional influences were identified. In Chapter Five, I will present recommendations for microlearning course design to assist organizations in moving toward achieving the goal that 100% of employees are able to articulate the benefits of microlearning for their technical skill development. In addition, a four-level plan of training evaluation is presented in accordance with Kirkpatrick and Kirkpatrick’s (2016) New World Model.
CHAPTER FIVE: RECOMMENDATIONS In Chapter One, I described the field-based problem of practice at the heart of this study: Despite the increasing promotion of microlearning to the training industry, the use of the format for technical skills training has not been studied from the perspective of learners. In Chapter Two, I summarized available literature on theoretical concepts inherent in microlearning design including the likely knowledge, motivation, and organizational influences on learners’ perceptions of the format. In Chapter Three, I explained the design and parameters of this mixed-method study that employed an explanatory sequential design and utilized an online survey followed by telephone interviews. In Chapter Four, I presented the results and findings from the study in an effort to answer Research Question One: Identification of the knowledge, motivation, and organization influences that impact adult learners’ perceptions of the benefits of microlearning for technical skill training. In this final chapter, I present my conclusions from the data to answer Research Question Two: Identification of the recommendations for organizational practice in the areas of knowledge, motivation, and organizational influences related to the employees’ ability to articulate the benefits of microlearning for their technical skill development. Finally, I will present a plan for organizations to evaluate current and past technical courses presented in a microlearning format and plan for future curricula. This plan was designed in alignment with adult learning theory and the Kirkpatrick and Kirkpatrick (2016) New World Model of training evaluation.
Recommendations for Practice to Address KMO Influences Three primary categories of recommendations for the design of microlearning curricula evolved from this study: (a) build in options for learners to exercise choice and control; (b) provide recommendations, suggestions, and instructions for learners to help them engage with the content; and (c) offer opportunities but not requirements for self-assessment. The consistent theme is to give learners as much flexibility as possible in a microlearning course to fit their individual learning goals. This goes to the heart of the microlearning concept. No single feature, function, or process stood out as the most important except the flexibility to use or not to use it. For example, the ability to pick and choose content was important if the goal of learning was to grab an answer and exit the course; however, it was not important when the goal of learning was to understand a new topic. As another example, microlearning’s focus on delivering short segments of content was important to this study’s participants, but more from the perspective of convenience than knowledge retention. The individual stakeholder goal stated in Chapter One was that 100% of learners would be able to articulate the benefits of microlearning for their technical skill development. The results of this study described in Chapter Four clearly show that this goal has not yet been achieved. For example, several survey participants expressed that they did not understand what microlearning was even though a definition was provided as part of the question; however, most of those without experience in microlearning expressed an interest in giving it a try. When there is a gap between performance and a goal, Clark and Estes (2008) propose application of their framework to identify the knowledge, motivation, and organization (KMO) influences that may be impacting the goal’s achievement.
Therefore, the recommendations in this chapter will be presented within the KMO framework. It is important to note that this was a small study that utilized a convenience sample methodology in an effort to attract participants from an array of employment environments; therefore, the findings cannot be broadly interpreted to apply to other learners in other contexts. However, because the insights obtained from the study’s participants that form the basis for the recommendations in this chapter were from a variety of individuals with different levels of experience, work environments, and need for technical training, their perceptions and feedback provide at least points for consideration within any organization. Knowledge and Motivation Recommendations Overview. The data in this study signaled an interrelationship between knowledge (procedural and metacognitive) and motivation (expectancy value, goal orientation, and self-efficacy) influences in learners’ perceptions of microlearning; therefore, the recommendations in this section are driven by a combination of both factors. Clark and Estes (2008) stated that knowledge and motivation “must cooperate effectively to handle events that occur in the organizational environment” (p. 44). Pintrich, Wolters, and Baxter (2000) stated that the way people interpret their own specific learning strengths and weaknesses (e.g., better at memorizing than practicing) is a matter of motivation, and “knowledge of strategy variables [including] all the knowledge individuals can acquire about various procedures and strategies for cognition” (p. 46) is metacognitive. These closely related influences were expressed as one in the current study.
For example, some participants described being motivated by the presence of video content in a microlearning course because they believed it could improve their learning via metacognitive strategies (e.g., planning the course based in part on the ability to speed up or slow down video playback, repeat content, etc.). Also, the inclusion of suggested learning paths in microlearning courses motivated some participants by the belief that this knowledge (procedural) would increase their ability to learn (self-efficacy). Therefore, the recommendations identified in this section result from a blend of knowledge and motivation influences. Table 6 describes the knowledge and motivation influences that were identified in the Chapter Two literature review, states whether the influences were confirmed by the study data, and provides research-based recommendations for practice.

Table 6
Summary of Knowledge and Motivation Influences and Recommendations

Influence 1: The inclusion of procedural “how to” knowledge in microlearning courses affects learners’ perceptions of the microlearning format.
Validated: Yes.
Principle and citation: Clark and Estes (2008) stated that training (including instructive feedback and practice) should be employed to help learners obtain knowledge on procedural skills. Eibl (2007) stated that navigation instructions for microlearning courses should be built into the course content.
Context-specific recommendation: Build into microlearning courses optional support tools including job aids, procedural “how to” training, guided practice, definitions, and feedback to teach the steps, techniques, and methods to work through the course. Build in hyperlinked options for learners to access specific content.

Influence 2: Access to worked examples of course path options influences learners’ perceptions of microlearning.
Validated: Partially.
Principle and citation: Providing a worked example allows the learner to borrow knowledge from the course content, thereby reducing the extraneous cognitive load (Clark & Mayer, 2008).
Context-specific recommendation: Provide optional procedural instruction with learning path examples that model use cases and allow learners to practice realistic self-directed learning paths.

Influence 3: The inclusion of metacognitive support in microlearning courses influences learners’ perceptions of the microlearning format.
Validated: Partially.
Principle and citation: Clark and Estes (2008) state that education is the appropriate process to deliver metacognitive knowledge in that it is strategic and designed to help learners manage novel situations.
Context-specific recommendation: Build optional metacognitive education into microlearning courses designed to increase conceptual skills and promote self-regulation of learning.

Influence 4: As learners gain experience with microlearning, they perceive value in the format due to its emphasis on learner choice and control.
Validated: Yes.
Principle and citation: Activating personal interest through opportunities for choice and control can increase motivation (Eccles, 2006).
Context-specific recommendation: Encourage those new to microlearning to take a sample course. Ensure learners are made aware of options for choice (e.g., content sequencing) and control (e.g., take or skip modules).

Influence 5: Learners perceive the novelty and challenge of microlearning courses to help the achievement of their goals (mastery or performance).
Validated: Partially.
Principle and citation: Designing learning tasks that are novel, varied, diverse, interesting, and reasonably challenging promotes mastery orientation (Yough & Anderman, 2006). Including optional gamification with points, badges, and leaderboards can foster goal setting and improve motivation in some learners (Sailer, Hense, Mandl, & Klevers, 2013).
Context-specific recommendation: Include course tasks that inspire persistence and are varied enough to generate ongoing interest in the learning goals but not so challenging that they result in failure and disinterest. For learners with a mastery goal orientation, include options for a variety of challenging, realistic, interesting, and novel exercises. For learners with a performance goal orientation, include options for gamification.

Influence 6: Learners feel efficacious in learning technical content delivered in a microlearning format.
Validated: Partially.
Principle and citation: Provide goal-directed practice coupled with frequent, accurate, credible, targeted, and private feedback on progress in learning and performance (Pajares, 2006).
Context-specific recommendation: Include challenging tasks within microlearning structured with specific SMART goals and private feedback tools (e.g., knowledge checks, hands-on exercises, etc.).

Recommendations for knowledge and motivation influences. The findings from this study align with published research on the importance of the availability of features to support knowledge and motivation in microlearning and eLearning courses. The findings also add a nuance not yet found in the research: the support should be available but not required in technical learning curricula. In other words, technical learners understand that they might not need procedural help in one course or context (others may) but they may need it in a different course or context (others may not). The interview participants added insight that they often had multiple goals in their use of technical training curricula (e.g., quick fact finding, full end-to-end skill attainment, updating existing knowledge, gamification achievement, etc.) that require flexibility in their use of support. It is important to remember that this study’s participants were from technical backgrounds and many were veteran programmers. Still, they expressed the importance of having support available when needed. Job aids. It is recommended that microlearning courses include embedded pop-up job aids that can be accessed from within course content to support learners’ engagement.
Suggested job aid topics include course navigation options, a glossary and definitions of terms, resources for assistance, suggestions for external learning, and other topics appropriate for the specific course. According to Clark and Estes (2008), learners with similar but insufficient experience with procedural knowledge are best supported with job aids. Microlearning, however, is consumed on demand via electronic delivery systems such as computers, smart phones, and tablets. A printed and laminated job aid, while still viable for many instructional needs, is likely to be less practical for mobile technical learners. Performance support tools (PSTs), the job aids of technological environments, offer on-demand, just-in-time, just-enough knowledge delivery solutions electronically (McManus & Rossett, 2006). Research conducted by Sweller, Kirschner, and Clark (2007) found that the cognitive load imposed on an eLearner can be reduced by the addition of procedural guidance features within the course, especially when the topic of instruction is new to the learner. For microlearning courses in particular, Eibl (2007) stated that navigation instructions for microlearning courses should be built into the course content. To test PSTs in different environments, McManus and Rossett (2008) conducted a small study at the dawn of mobile learning. They concluded that, “PSTs are uniquely positioned to meet a rising demand for contextual ‘knowledge’ by serving up … procedures, guidance, and other information precisely when knowledge workers need assistance” (p. 15). As described in Chapter Four, the data from the current study support this recommendation. Several interview participants acknowledged that having job aids available made sense; however, making their review required was not supported. Clear course navigation support.
The recommendation is that detailed navigation instructions should be made available (not required) in all microlearning courses regardless of the complexity of the content and the experience of the learning audience. As described in Chapter Four, the procedural feature that was most often ranked as “very important” by the survey participants was the availability of clear navigation instructions. In the case of eLearning, well-defined instructions for navigating course content are vital for adult learners (Knowles, Holton III, & Swanson, 2015) and they improve knowledge transfer (Park & Wentling, 2007). Unfettered exploration by trainees through eLearning courses (also known as exploratory learning) has been shown to not only impede learning but to potentially result in damage such as the acquisition of jumbled, ineffectual knowledge (Clark & Estes, 2008). Therefore, asking learners to figure out their way around a microlearning course without guidance is not recommended. The data in my study support the literature reviewed. For example, when a new feature was added to a microlearning training website without any procedural assistance to explain how it could be used, an experienced technical learner became confused. Other participants talked about the lack of fluidity in poorly designed microlearning courses that negatively affected their motivation and efficiency. Well-designed technical courses with the goal of supporting learners in novel situations, however, can benefit from flexible pathways. Clark and Estes (2008) point out that “flexible navigation through large bodies of information … might be very beneficial for educational goals” (p. 71, emphasis in original) and can promote learners’ ability to solve novel tasks.
It is interesting to note that in the current study, the level of learners’ experience did not significantly impact the perception that having optional navigation instructions available was an important feature for microlearning courses. Detailed table of contents and site map. It is recommended that microlearning courses include a substantial and detailed table of contents as well as a site map to facilitate learners’ ability to strategize their approach to the course. Mehlenbacher et al. (2005) presented a heuristic tool to evaluate the usability of eLearning based on an analysis of empirical research studies. Among the dozens of recommendations, the authors include the availability of a meaningful table of contents for help as well as a site map to support the learner’s clarity on the overall organization of content. Study participants explained that a meaningful table of contents helps learners to make informed choices including, as will be discussed below, leveraging course length estimates to evaluate the fit of the course to time constraints. However, the recommendation here is to include substantive descriptions of the specific content included in each module so that learners do not have to build in the time it takes to click around to discover it on their own. Time estimates for modules. The recommendation is for inclusion of module time estimates to facilitate learning planning for those who find them helpful; those who do not can ignore them. Grinberg and Hristova (2012) recommend the inclusion of time estimates in each eLearning module’s description to improve the usability of the course. As is shown in Table 5 above, survey participants ranked time estimates on modules to be tied for the least important feature in a microlearning course, and more of the inexperienced learners found them beneficial compared to the more experienced learners. Interview participants had the opposite perceptions.
Some interview participants used module time estimates to strategize the fit to available training time. Others found them to be unreliable for text content and informative only about video length (depending on whether there was a need to rewind and repeat content). The metacognitive use of time estimates was situational. When the learner’s goal was to learn new content, time estimates were leveraged to structure not only the best learning path, but also schedule out the time necessary for each unit. When the learner’s goal was to grab a specific answer out of a course, time estimates were not important. When the learner’s goal was to update existing knowledge, time estimates were rarely helpful because they were viewed as commonly reflecting the speed of a novice to the content. Hyperlinked navigation options. It is recommended that topics within a microlearning course be freely available, easily accessible via hyperlinks, and not severely restricted by course settings. However, it is incumbent on course designers to be cognizant of the potential for the imposition of unnecessary extraneous cognitive load when learners get off target in the content without a visible means to return to a desired place in the course. In summarizing the research on cognitive load theory and hyperlinked text, Madrid, Oostendorp, and Melguizo (2008) opined that there is no agreement on whether hypertext imposes greater cognitive load than straight text. The authors studied whether a larger number of hyperlinks imposed a greater extraneous cognitive load due to increased decision making requirements. Their findings did show a difference due to the number of links; however, they recognized the mediating factor of reading order. When given navigation options, learners can get lost down unintended paths, thereby increasing their extraneous cognitive load.
By providing learners with suggested content links based on their position in the course, as well as links to allow them to choose to return to previously viewed content (e.g., breadcrumbs), extraneous cognitive load is reduced. Grinberg and Hristova (2012) recommend links for free navigation in any path learners choose, links back to the beginning of a course and to the start of individual modules, and links to go back or proceed forward in a pre-defined order. In my study, the second highest ranked procedural knowledge feature was the ability to click through a course quickly. This perception was further clarified in the interviews and interpreted to mean that learners do not want to be forced to complete any content before being allowed to access the topic they desire. Common exceptions would include compliance-type training that requires all learners to complete the same content (e.g., topics that subject the organization to legal action, employee safety instruction, etc.). The data from the interviews suggest that learners assume their access to specific content will not be impeded by wait-to-advance restrictions (e.g., page settings that require the learner to spend a specified amount of time on the page before it will advance). However, when faced with numerous non-linear course path options in a microlearning environment, decreasing the learners’ extraneous cognitive load in selecting the next content to study becomes of prime importance. This can be accomplished with the addition of a search tool with hyperlinked results, a hyperlinked detailed table of contents, in-text hyperlinks to related content within the same course and to external content, and a hyperlinked detailed topic index. Hyperlinks, however, need to be properly encoded to ensure learners experience success reaching intended content and thereby remain motivated to use them.
“[I]nstructional manipulations to improve learning through diminishing extraneous cognitive load and freeing up cognitive resources will only be effective if students are motivated and actually invest mental effort in learning processes that use the freed resources” (Van Merriënboer & Ayres, 2005, p. 7). Learning path examples. It is recommended that microlearning courses include instructional path examples that begin with the necessary foundational material followed by flexible secondary content. As was stated by Langreiter and Bolka (2005, p. 16): [W]e observed that while microcontent reduces the effort to create and therefore the barriers to publish, it shifts the job of organizing disparate pieces of information into a coherent whole to the learner. For microlearning in a decentralized setting to realize its full potential, the emergence of either organizing intermediaries or powerful tools for semiautomatic organization of materials is a necessity, so that learners can invest most energy in learning – instead of spending disproportionate amounts of time hunting down the proverbial herd of chunk-sized cats. Providing a worked example before asking the learner to perform allows the learner to borrow knowledge from the course content, thereby reducing the extraneous cognitive load (Clark & Mayer, 2008). In the microlearning format, as defined within this study, the learners likely have at least some options to customize the path they take for learning what they need. This study’s results suggest that whether learners choose to do so remains contextual. The inclusion of learning path examples was recommended by interview participants, especially for those new to the microlearning format.
The data in my study also showed that learners did not perceive the microlearning format to be acceptable for technical training on content new to the learner unless the course includes instruction on the required foundational content that is presented in a structured format. For example, microlearning courses need to identify specific content pre-requisites. The data also showed that learners have a positive perception of flexible learning path options for more optional learning content. For example, learners new to Excel need to know how to create a new workbook and reference a specific cell (foundational content) before they can be given options to learn about pivot tables, formulas, or cell ranges (secondary content). However, for users who already have some foundational knowledge, it is recommended that they should be made aware of but not be forced to work through the pre-requisite content (e.g., pre-test, self-determined skill level, etc.). The data in my study showed that the learners’ purpose for learning was the primary determining factor as to whether a flexible learning path was a benefit. When the learner’s goal was to acquire extensive new knowledge, even the most experienced microlearning users appreciated suggested learning paths and often completed courses end-to-end, trusting the expertise of the course designer to present the information in the most efficient order. Conversely, when the goal for launching a course was for quick retrieval of specific information, learning path examples were not perceived as important or useful. Therefore, the recommendation is to include optional worked examples of course path navigation regardless of the level of experience of the learners or complexity of the course. This involves trusting learners, at a metacognitive level, to recognize when they need foundational content.
Higher-order learning in asynchronous environments (such as microlearning) is enhanced through the development of self-aware learning strategies including metacognition (Garrison, 2003). Options for creating self-determined learning paths. It is recommended that microlearning courses embed optional course path reflective activities, customizable by the learner, to encourage their active and conscious participation in the individualization of their learning process. As described in Chapter Four, survey participants gave the self-determined learning path feature the fewest number of “very important” rankings out of the seven features listed in Question 13 (see Table 5 above). Several interview participants mentioned the flex path option as one they took under consideration when determining their learning strategy (e.g., the ability to pick and choose content) while others always completed course content in the order in which it was presented. Despite the flexibility offered by the microlearning instructional format, learners who have experience with traditional eLearning may unconsciously proceed in a “click next to continue” linear path and complete the entire course whether or not they need all of the content. Rueda (2011) stated that when learners become proficient at a task, they may be unaware of their actions. With microlearning, making learners consciously aware of their options to customize the path, content, and pace of instruction will encourage them to make efficient and effective use of the technology (Kerres, 2007). The availability and use of reflective prompts embedded into hyperlinked content can positively affect learners’ performance of far transfer tasks (Bannert, 2006). While this metacognitive education can increase learners’ ability to transfer knowledge, care must be taken that it is not so demanding that it restricts their existing method for learning (Bannert, Hildebrand, & Mengelkamp, 2009).
In a subsequent study, Bannert, Sonnenberg, Mengelkamp, and Pieger (2015) found that in computer-based learning environments, “Self-directed metacognitive prompts foster strategic learning activities, in this case, the systematic selection of goal-relevant pages, which results in better learning performance” (p. 303). In other words, give learners the option to design the reflective prompts they want and when they want to receive them. It must be remembered, however, that the learners’ purpose for taking the course is important. For learners who simply want to access a course to get a specific bit of information, required reflective exercises would impose an unnecessary burden. Interactivity. The recommendation is to offer optional interactive features (e.g., video, skill challenges, quizzes, social learning, etc.) in microlearning courses to attract the attention of those who find them beneficial to their learning goals. In the current study, interactivity was ranked as the third most important feature in microlearning courses (see Table 5 above). When strategizing a course based on its inclusion of video (e.g., narrated demonstration of writing code), the interview participants mentioned the need for appropriate indexing, optional playback speed control, and the ability to repeat content. The inclusion of social learning components in microlearning courses was met with mixed perceptions. Instructor chats and collaborative forums were generally perceived as positive options to replicate the personal interaction common in instructor-led delivery. However, some interview participants did not value asynchronous message boards because of the need for learners to be able to articulate their question or position clearly enough for other participants to respond in kind. Self-assessment options.
It is recommended that microlearning courses include options for learners to assess the extent to which they have learned the new information or skills presented in a microlearning course. Self-assessment provides both cognitive awareness of content acquisition and metacognitive accountability for learning (Shepard, 2000). This study’s participants discussed several self-assessment features and gave each varying degrees of support for technical skill training. The existence of rigorous, project-driven (not artificial) skill challenges attracted interview participants, but these were important only if the learning goal was mastery of the topic. The presence of knowledge checks and quizzes helped learners evaluate their knowledge retention, but learners wanted them to be optional and challenging. For example, some participants discounted the usefulness of multiple-choice questions that could be answered easily without having to learn the content (e.g., reading the question first and then looking for the answer); they preferred questions requiring the synthesis of the learned content to formulate the answer.

Gamification. As a specialized form of self-assessment, the inclusion of optional gamification is recommended for microlearning courses, albeit with multiple caveats. Landers and Callan (2011) studied undergraduate students and found that for some, but not all, learners, gamification had strong motivational value. This is supported by the data in the present study. Several of my study participants described in significant detail how they relished gamification as a way to acquire technical knowledge. Other participants dismissed gamification, one going so far as to label it “silly.” Still others had a blended perspective: gamification was a nice side benefit to their learning goal of topic mastery.
Landers and Callan (2011) did not attempt to identify differences between those motivated by gamification and those who were not; however, they included multiple cautionary attributes of gamification (e.g., participants must spend time learning the game instead of the content, and time is often in limited supply) and attributed the following factors to a successful program: (a) the learners found the social context of the game meaningful; (b) the game was integrated with that social context; (c) the rewards within that social context were meaningful; (d) grading and rewards were immediate; (e) activities to earn points or badges were challenging but not impossible; (f) performance was not required but supported other learning toward specified outcomes; (g) tasks were scaffolded in difficulty to offer easier entry to participation; and (h) the game environment was not boring or confusing. The authors also caution against taxing learners with too many highly valued goals, which could result in unintended competition or unscrupulous conduct. As has been mentioned within this paper, time is limited in technical work environments. Those who choose to spend time learning technical content through gamification can be highly motivated and should be given the option to do so, but only if game designers have the expertise to construct the game appropriately to contribute to learning goals.

Links to additional information on the course topics. It is recommended that microlearning courses include hyperlinks to related tips, worksheets, and other learning content to encourage learners to expand their knowledge beyond the four walls of the course, yet do so in a way that allows for immediate return to the primary course (e.g., open external links in a new browser window).
This, coupled with the ability for learners to return to a course as often as necessary, provides metacognitive support: while that extra content might not be accessed immediately, it is available when needed. The cautions above regarding extraneous cognitive load apply to this recommendation as well. For example, Monahan, McArdle, and Bertolotto (2008) described several asynchronous learning exemplars that incorporate external links, including those that direct users to collaborative learning environments. While this external information can further expand the learners’ understanding of the microlearning course content, it can also distract them from their immediate learning goal.

Learning strategy tools. It is recommended that microlearning courses include planning tools and other features to promote metacognitive activities. Key practices include optional self-reflective practice exercises and exposure to appropriately modeled behavior (Schraw, 1998). For example, Schraw (1998) described a learning aid in the form of a strategy evaluation matrix (SEM) to help students understand when, how, and why to use a skim strategy versus a mental integration strategy when learning new content. A second recommendation is the use of a content progress checklist that can be embedded in the microlearning course. Schraw (1998) found that the use of checklists enhanced learners’ metacognitive skills and encouraged them to apply strategic decision making to their learning process. Finally, it is recommended that microlearning courses be presented in a format and environment that promote the learners’ use of metacognitive skills. For example, there is strong empirical evidence supporting enhanced adult learning through the practice of spacing out study time.
Son and Simon (2012) reviewed the empirical literature on spacing (study sessions spread out over time) versus massing (one long session) and found that despite adult learners’ perception that massing was more effective, long-term retention was actually improved through spacing techniques. These authors encourage the addition of intentional spacing strategies and self-testing in instruction. This technique is easily facilitated in microlearning course design through rehearsal, review, and knowledge checks drawn from previously completed content. Son and Simon (2012) also suggest offering contextual variability for a topic to enhance knowledge retrieval cues. Again, microlearning’s modular design can offer multiple scenario-based learning components. Instruction that makes learners aware of the benefits of spacing and contextual variability, practiced through deliberate metacognitive strategies, can be added to microlearning courses. The availability of multiple metacognitive strategies in a learning environment improves performance (Schraw, 1998).

Purpose is key. Learners accessing a course with a goal to grab a specific piece of knowledge (e.g., Interview Participant J’s quest for an answer to the question “Can I connect this widget with that widget?”) would not perceive forced instruction on metacognitive tools as beneficial or useful. According to Kerres (2007), making learners aware of their discretionary learning strategies is a goal within microlearning course instructional design. However, the metacognitive approaches employed by the participants in this study varied contextually from deliberate strategic planning to grab and go. Implicit in the recommendations to include the metacognitive tools described above is the counsel that they remain optional for individual learners.

Organization Recommendations

Overview.
The current field-based study generated data from a range of organizations that varied in size, locale, and type (e.g., global, local, for-profit, government, non-profit, etc.). Empirical research on the organizational influences that specifically affect the introduction, use, and acceptance of microlearning for technical training could not be found. However, research studies on best practices for the introduction of similar types of technical training have been published. The organizational recommendations in this section of Chapter Five are categorized into cultural settings and cultural models; however, the two can be interrelated. As was described in Chapter Two, cultural settings are the structural aspects of the company, including the availability of resources (Gallimore & Goldenberg, 2010; Schein, 2004). Cultural models refer to the common understandings among employees within the organization as to how things work and what is valued (Gallimore & Goldenberg, 2010; Schein, 2004). The effective utilization of professional development resources (cultural setting) can be interrelated with the organization’s learning attitude (cultural model). Schein (2009) presented an example in which a company’s IT training team wasted significant time and money (cultural setting) when they delivered training on a new system, only to find out that its implementation failed due to a revised directive for employees to learn only when they could fit it in around their normal workload (cultural model). In this example, the organization’s unwillingness to provide employees time to take training (cultural setting) affected the value the employees were able to assign to learning (cultural model).

The recommendations provided in this section are designed to focus on best practices independent of a specific organizational environment.
These include the provision of sufficient physical and technical infrastructure for training and dedicated time for employee learning. Practitioners are advised to utilize these recommendations only as a guide and to take significant time to analyze a specific organization’s cultural models and settings as part of the planning process for the introduction of the microlearning format for technical training.

The benefits gained by organizations’ commitment to employee training and development are well documented. Although some organizations consider training a frill and a target for cost cutting, Pfeffer and Veiga (1999), in their extensive review of empirical studies, state that “training is an essential component of high performance work systems” (p. 43). Pfeffer and Veiga (1999) also found that an organization’s competitive advantage could be demonstrably increased when businesses invest in employee training and development by virtue of, among other benefits, increased quality of work and creative problem solving ability. Moreover, decreased employee attrition through increased organizational commitment has been found in organizations whose employees perceive the company’s commitment to training and development as an investment in their success (Paré & Tremblay, 2007). It is assumed, therefore, that organizations in general want to increase their financial performance, gain competitive advantage, and maximize employee retention. The research shows that one way to do that is to improve their commitment to employee training and development. A goal of this study was to identify suggestions and barriers that companies can consider when utilizing microlearning as a delivery format to achieve their technical training and employee development goals.
The findings from this study suggest that the high-performance training needs of technical learners are complex and diverse; so too are the recommendations this study makes to those who design and deliver curricula. As Schein (2009, p. 75) stated:

[W]hy do you need to analyze and assess culture in the first place? This is an important question not to be taken lightly. Understanding your culture is not automatically valuable, just as understanding your personality is not automatically valuable. It only becomes valuable and necessary if such understanding enables you to solve a problem, to make a change, to learn something new. Then you need to know how your culture would aid or hinder you.

Table 7 identifies the organizational influences, both cultural settings and cultural models, that were validated after this study’s data were collected and analyzed.

Table 7

Summary of Organization Influences and Recommendations

Organizational Cause: Cultural Setting Influence: Organizations are challenged to provide sufficient resources to support microlearning including staff, funding, and dedicated time for employee training.
Validated (Y/N): Y
Principle and Citation: Ensuring staff resource needs are being met is correlated with increased student learning outcomes (Waters, Marzano, & McNulty, 2003).
Context-Specific Recommendation: Fund forward-looking technical infrastructure projects, hire and develop training staff skilled in microlearning instructional design, and build employee training time into job responsibilities.

Organizational Cause: Cultural Model Influence: Organizations need to embrace innovation and overcome ingrained learning methodologies to create a culture focused on self-directed professional technical development.
Validated (Y/N): Y
Principle and Citation: A strong organizational culture controls organizational behavior and can block an organization from making necessary changes for adapting to a changing environment (Schein, 2004). The willingness of an organization to adopt novel training methodologies is dependent upon the extent to which existing learning processes are ingrained in the organization’s culture (Ali & Magalhaes, 2008).
Context-Specific Recommendation: Develop a change management program to encourage self-directed microlearning; support risk taking, trial and error, and failure; and solicit input from as many perspectives as possible.

Cultural setting recommendations. It is recommended that organizations provide sufficient infrastructure, funding, and human resources to design, develop, deliver, and evaluate microlearning courses for immediate needs as well as strategically plan for growth and innovation. In addition, it is recommended that companies offer employees dedicated training time apart from regular work demands so that they can focus on their professional development activities. As will be more fully described in the cultural model recommendations that follow in this chapter, it is also important for an organization to communicate to its employees the extent of the resources and policies it continues to pledge in support of their technical learning goals.

Allocate sufficient infrastructure and staff. It is recommended that instructional leaders in organizations evaluate their current technical infrastructure and training development teams to ensure the company has adequate resources to deliver effective microlearning curricula. Waters, Marzano, and McNulty (2003) found that student learning is improved when an organization’s staff is supported with sufficient resources to meet learners’ goals. As was described in Chapter Two, empirical research shows that companies have a difficult time providing a modern-day learning experience to their employees to keep pace with the demands of technical and social innovations (van der Vyver, Pelster, Bersin, & Haims, 2014).
Half of the 400 learning executives surveyed expected to deliver more learning services in the following six months, yet less than one-third expected to increase staff to support the demand (Miller, 2015a). The data in my study showed that nearly two-thirds of participants perceived that their organizations’ professional development resources provided them with the technical training they needed. It is recommended that instructional leaders research and confirm within their own organizations the extent of their employees’ sense of support, as well as strategize methods to achieve 100% positive perception.

Research current formal and informal training practices. It is recommended that corporate instructional leaders research the current methods utilized by their technical learners to identify infrastructure and human resources that can provide additional support before delivering new methodologies. Envision a path worn through grass that circumvents the sidewalk: existing informal habits reveal the routes learners already prefer. For example, my survey data showed that although organizations employed a blend of instructional methods in varying degrees, more than one quarter of them engaged in collaboration for most of their technical training. However, other than public technical training websites, none of the interview participants mentioned that specific tools had been put into place by their employers to support collaborative learning. Organizations with high collaborative cultures, for example, can investigate options (e.g., an internal social learning platform monitored by one or more subject matter experts) to enhance knowledge sharing rather than launch training that implements a new learning paradigm.

Establish realistic budgets.
It is also recommended that organizations define detailed budgets specifically targeted to professional development and, if appropriate, build in not only the cost of work hours lost during professional development activities, but also estimates of the cost savings resulting from the increased workplace performance efficiency gained as a result of training. As was quoted in Schein (2004, p. 117), “a vision without funding is a hallucination.” The existence and extent of funding for professional development varied among the current study participants’ organizations. In the interviews, it was found that some companies entrusted employees to be in charge of their own learning and to find the resources best suited for their professional development goals; once identified and approved, the organizations funded the purchase. Other companies set minimum annual professional development requirements and reimbursed course fees. Still others did not fund any technical training at all. One participant did not know the employer’s policies on either the availability of existing training or the purchase of new resources. Training budgets are often cut in times of economic challenge, both within and external to an organization. According to Bulut and Culha (2010), “[t]he opposite should occur: organizations that invest more on human resources have a distinct competitive advantage over their competitors.” One of the current study participants explained how microlearning offers an organization’s budget a “finer scalpel” to splice in only the content updates needed rather than rebuild entire courses, and to do so much faster to keep technical curricula current.

Offer dedicated time for professional development activities. It is recommended that employees be offered an appropriate amount of time within their normal business hours to work on the training content aligned with their learning goals.
In the current study, the availability of training time during normal working hours was perceived as nice to have as an option but not necessarily required for learner engagement and commitment. For some participants, training time was built into the workday. For others, even though the organization provided training time, technical learners found studying at home to be more productive because they could avoid work interruptions. Still other participants stressed that it was their professional responsibility to stay current on technology and that they would train during their off hours whether or not their employer offered dedicated time.

Consult empirical research recommendations. It is recommended that organizations provide training leaders with reference tools to allow them to stay current on empirical research as microlearning evolves. The research that recommends types of organizational resources for training is extensive, although there is currently a dearth of studies specifically related to microlearning best practices. For example, Esteves (2013) conducted research on best practices in employee training for the introduction of new enterprise resource planning (ERP) technology systems. Esteves conducted semi-structured interviews with a broad range of types and sizes of organizations in various global locations and included input from those holding different project roles (i.e., end users, consultants, project managers, and project team members). The author refined the results into a list of 24 best practices and ranked them in importance.
Organizational resources and constraints dominated the top ten rankings: hands-on use of the customized software instead of a generic training environment (first on the list); “good support center dedicated to the users” (second); “training must respect the company’s critical periods/date” (third); “organizational alignment focus” (fourth); “mixed teams as trainers” (sixth); and “free users from their daily activities” (seventh; p. 677). Similar recommendations can be made for the introduction of microlearning into the cultural settings of corporations, including appropriate technical training environments, training delivery teams experienced in the specific corporate setting, and project scheduling tools to limit competing resource priorities.

Cultural model recommendations. It is recommended that training leaders investigate and evaluate existing cultural norms, attitudes, and valued practices within their organizations to identify avenues for growth as well as barriers to the successful implementation of microlearning for technical training. As is more fully described in Chapter Two’s literature review, an organization’s cultural model refers to the communal knowledge among its employees as to how things work and what is valued (Gallimore & Goldenberg, 2010; Schein, 2004). In the current study, survey participants perceived their organizations’ encouragement of professional development to be slightly higher than the companies’ commitment to support the employees’ efforts. In some interviews, participants described their employers’ cultural models regarding professional development as very proactive, with mandatory professional development hours per year, and as a win-win for both the employee and the company. Others described more passive cultural models with statements such as, “[Employees] know that the resource is there for them” and “it is an afterthought,” without mentioning specific programs to promote its use.
Schein (2004) talked about the power inherent in shared assumptions in a company culture. A quantitative study by Kok (2013) analyzed survey responses from more than 2,700 eLearning users and found that “a different degree of success of the eLearning initiative depends upon its coherence with the organizational culture, and the company’s strategy” (p. 24). The author also found that “training alone is unlikely to produce [full organizational] learning, but when it is backed by other organizational systems such as control, incentive and value systems, it can effectively change the enterprise focus and move to a collective level of achievement” (p. 26). Training leaders can conduct internal research to uncover those conventions as they pertain to technical professional development.

Implement change management strategies. It is recommended that training teams design, develop, and implement focused change management programs to cultivate employees’ use of microlearning for technical training. Cultural models are difficult to change. For example, the data in the current study showed that more than a quarter of the participants’ organizations used instructor-led training (ILT) for technical training 91% to 100% of the time. Organizations and their employees who have long-standing and highly ingrained training processes are resistant to change. Ali and Magalhaes (2008) found that organizations are less likely to adopt new training methodologies if their existing processes have been in place for a significant period of time and are ingrained in the learning culture. Entrenched corporate practices can become “implicit, unconscious, and automated” and part of a “hidden culture” (Clark & Estes, 2008, p. 114). Schein (2004) found that a strong organizational culture controls organizational behavior and can block an organization from making necessary changes for adapting to a changing environment.
This suggests that organizations that introduce and develop novel employee training methodologies such as microlearning will benefit from the delivery of concomitant change management programs to acknowledge, address, and overcome long-standing and deep-rooted cultural norms. Clark and Estes (2008) recommend six types of support for organizational change processes: (a) clarity of goals and measurements; (b) alignment of the organization’s cultural setting and processes with the goals; (c) frequent messages to communicate plans and progress to all involved; (d) continuous involvement of management; (e) adequate support; and (f) careful selection of the change processes to be employed. For example, as noted in the interview participants’ responses, communication to employees of the organization’s policies, procedures, and benefits is of paramount importance. One interview participant in particular was not sure whether the company offered technical training or reimbursement. In this situation, while the company might be budgeting for training in both time and physical resources, the communication of those benefits is not making its way to the employees; therefore, it is possible that the company would see the lack of interest in training as an indication that it was not perceived as valuable and discontinue offering a benefit the employees are unaware exists.

Support risk taking. It is recommended that training teams encourage their technical training end users to take risks and experiment with creative learning methodologies such as microlearning. Cultures that promote risk taking to achieve goals are indicative of highly creative and inventive organizations (Tesluk, Farr, & Klein, 1997). Technical learners are often found in roles demanding creative and complex problem solving expertise.
“Because the creativity process begins by recognizing the need for novel problem solutions, features of organization culture that facilitate such recognition are essential” (Tesluk, Farr, & Klein, 1997, p. 31). Technical training designed to meet today’s fast-paced speed of change is moving away from memorization of facts or application of specific solution methods. Rather, it requires exploration, synthesis, collaboration, and knowledge creation (Peschl, 2007). The survey data in the current study showed that two dozen participants were interested in taking a microlearning course for technical training but had had no opportunity to do so. Those participants with the most experience with microlearning unanimously agreed that they wanted to continue with the format. Encouraging an organizational cultural model that supports risk taking for technical training can lead to creative and innovative results for the company as well as its employees.

Solicit input from the technical training users. It is recommended that training teams conduct surveys, live observations, and interviews with their technical training end users to identify organization-specific cultural norms and attitudes that may impact the use of microlearning. Empirical organizational research is uncommon; Clark and Estes (2008) estimated that it comprises approximately 3% of published studies. Companies are by nature competitive and reluctant to divulge their trade secrets for success. Moreover, the most appropriate training, performance, and change management support program for a company depends on a wide variety of factors, including the type of organization involved (Clark & Estes, 2008).
In a case study researching the challenges involved in the introduction of a novel blended learning methodology in a government setting, Holton, Coco, Lowe, and Dutsch (2006) recommended the incorporation of many of the same change management principles discussed by Clark and Estes (2008). The authors recognized the employees’ reluctance to change their existing and deeply entrenched corporate culture and its training process. The study’s recommendations included a list of 16 specific factors to facilitate the introduction and use of their new compressed video format for training, including the involvement of the eventual learning audience in the curricula design. In addition, to ease the transition, it was suggested that learners be afforded live access to people for assistance with the new methodology.

Summary. It is evident from this study that the just-for-me, just-in-time advantages offered by microlearning are valued by technical learners. Therefore, the recommendation is for organizations to develop and nurture a cultural model of encouragement, support, risk taking, and commitment to technical training delivered in a microlearning format. This may involve dedicated change management initiatives across the organization to ensure executives, managers, training professionals, and learners are made aware of and participate in the exploration of options and development of curricula. I turn next to a proposal for evaluating microlearning programs within the Kirkpatrick and Kirkpatrick (2016) New World Model.

Implementation and Evaluation Framework

The four levels of evaluation (reaction, learning, behavior, and results), initially developed in Dr. Donald Kirkpatrick’s 1954 doctoral dissertation (Kirkpatrick & Kirkpatrick, 2016), have comprised the industry’s standard framework for training assessment since the publication of his seminal work Evaluating Training Programs: The Four Levels (Kirkpatrick, 1994). In 2016, Dr.
Kirkpatrick’s son and his wife published the New World Kirkpatrick Model, which flipped the framework so that “the end is the beginning” (Kirkpatrick & Kirkpatrick, 2016, Foreword). Thus, planning for the evaluation of training along the four levels is performed in the reverse order. The initial planning focuses on how the end results will be evaluated to ensure they align with specific organizational goals (Level Four). These intended results are further broken down into specific leading indicators (or milestones) that will signal whether appropriate progress is being made. The plans for successive levels of evaluation are designed to cascade down from Level Four through Levels Three, Two, and One so that each lower level contributes to the level above. For eLearning and other non-traditional training, the Kirkpatrick New World Model has added evaluation technique adaptations (Kirkpatrick & Kirkpatrick, 2016, ch. 9). For example, the authors recognize that many online training delivery tools offer built-in user-based evaluation features such as quizzes, ratings (e.g., thumbs up or down, number of stars, etc.), surveys, reporting, etc. However, a thorough evaluation of eLearning courses requires going beyond these easy tools of measurement (Kirkpatrick & Kirkpatrick, 2016). For example, with eLearning, the authors caution against evaluating courses too soon; they stated that Level Three (behavior) and Level Four (results) effects take time to emerge. The Kirkpatricks also warn that eLearning course delivery platforms make it much easier to evaluate only the individual and fail to evaluate the team aspect of training. With these cautions in mind, the authors added eLearning evaluation content contributed by Dr. William Horton (author of the 2006 seminal work Evaluating E-Learning).
The following proposed implementation and evaluation plan for microlearning is presented, therefore, within this adapted Kirkpatrick New World Model.

Organizational Purpose, Need and Expectations

The purpose and goal for the use of microlearning in organizational settings vary by institution; therefore, design, implementation, and evaluation strategies will be unique to each company. Typical organizational benefits include wide distribution efficiency, lower delivery cost, consistency of content, frequency of delivery (Hamtini, 2008; Knowles, Holton, & Swanson, 2015; Welsh, Wanberg, Brown, & Simmering, 2003), the ability to keep up with rapidly changing content with a short shelf life (Brynjolfsson & McAfee, 2014), and the delivery of time-sensitive regulatory compliance content (Strother, 2002). Specific evaluation of the return on expectations (ROE) for training depends upon each organization’s desired outcomes in meeting the company’s goals. Generically stated, however, in order for instructional designers to develop impactful microlearning courses for their organizations, their evaluation of curricula will need to go through Kirkpatrick’s four levels.

Level 4: Results and Leading Indicators

Level Four of the New World Kirkpatrick Model measures the extent to which training can be shown to have impacted the organization’s key performance indicators (KPIs) as confirmed by its stakeholders (Kirkpatrick & Kirkpatrick, 2016). Although KPIs can be unique to each company, some common examples include improved customer satisfaction ratings, increased stock price, reduced employee turnover, higher profit margin, fewer production errors, superior safety ratings, etc. For mission-critical KPIs, Kirkpatrick and Kirkpatrick (2016) insist that evaluating Level Four results is not optional; it is essential.
Kirkpatrick’s New World Model starts the evaluation design process at Level Four instead of Level One to ensure that there is alignment between the company’s high-impact KPIs and the objectives of the training curricula. The authors recommend that the company identify a series of leading indicators (interim performance milestones) that, when attained, mark progress toward achievement of the KPI objectives. If a leading indicator (e.g., a two percent reduction in employee attrition at the end of the first fiscal quarter) toward a company KPI objective (e.g., an eight percent reduction in employee attrition at the end of the current fiscal year) is not achieved, adjustments to training curricula can be made mid-cycle to get performance back on track immediately. The microlearning format is well suited to iterative course modifications. Instructional designers work with the business stakeholders to identify, document, and confirm the leading indicator objectives for the organization’s specific KPIs. The goal is not only to understand how best to design technical training that contributes to the organization’s KPI objectives, but also to point out objectives that are not training issues. For example, a leading indicator for a consulting company’s KPI objective to increase customer satisfaction might be that 100% of its customers submit end-of-project surveys. A microlearning training program contributing to that objective could help managers become skilled at using the survey tool. A factor affecting the objective that is not a training issue, however, could be that some customers are unwilling to respond to any surveys. Once the organization’s stakeholders acknowledge the objectives, the Level Four evaluation design requires defining how success will be measured, by whom, and how often. Instructional designers can then collaborate to ensure the specific Level Four training goals are discussed and acknowledged throughout the team.
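The milestone arithmetic described above can be sketched in a few lines of Python. This is purely an illustrative sketch, not part of this study or of the Kirkpatrick model: the function name, the linear pro-rating of the annual target into quarterly milestones, and the example figures (an eight percent annual attrition-reduction KPI) are assumptions made for demonstration only.

```python
# Illustrative sketch: checking a quarterly leading indicator against a
# Level Four KPI objective. All names and the linear pro-rating are
# assumptions, not prescribed by the Kirkpatrick model.

def on_track(quarters_elapsed: int, observed_reduction: float,
             annual_target: float, total_quarters: int = 4) -> bool:
    """Return True if the cumulative reduction meets the pro-rated milestone.

    observed_reduction and annual_target are percentages (e.g., 2.0 for 2%).
    """
    milestone = annual_target * quarters_elapsed / total_quarters
    return observed_reduction >= milestone

# An 8% annual attrition-reduction KPI implies a 2% milestone after Q1.
print(on_track(1, 2.0, 8.0))  # a 2% reduction meets the Q1 milestone
print(on_track(1, 1.0, 8.0))  # a 1% reduction signals a mid-cycle adjustment
```

A dashboard built on such a check would let stakeholders see, each quarter, whether training-related indicators warrant mid-cycle curriculum adjustments.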
From there, curricula ideas can be brainstormed and documented, strategies can be designed, and project plans and assignments can be generated. This application of microlearning best practice knowledge to the design of curricula to meet instructional objectives is the subject of Level Three behavior evaluation.

Level 3: Behavior

Kirkpatrick and Kirkpatrick (2016) stated that “Level 3 Behavior is the most important level because training alone will not yield enough organizational results to be viewed as successful” (p. 49). To evaluate behavior, specific and measurable critical behaviors must be identified at the outset. Critical behaviors are defined in the Kirkpatrick model as those actions individuals or groups need to perform to achieve the organization’s targeted outcomes (Kirkpatrick & Kirkpatrick, 2016). These behaviors are reinforced by required drivers, defined as the tools, practices, and methods that support or promote the critical behaviors in an organization. Required drivers fall into two categories: support and accountability. Each training organization will need to identify the critical behaviors, and their associated required drivers, suited to its distinct performance result goals. In the case of microlearning curricula, instructional designers and their management teams need to conduct ongoing, thorough evaluation of whether they are properly applying the recommended best practices to their microlearning course design, delivery, and assessment to meet the leading indicator objectives. For example, a critical behavior for the creation of a microlearning course would be to hold weekly design and development progress meetings. Supporting required drivers could include reinforcement (e.g., a best practices checklist), encouragement (e.g., coaching from experienced instructional designers), and rewards (e.g., colleague recognition).
Accountability required drivers could include monitoring, observing, and tracking progress toward key performance indicators. Instructional designers’ application of microlearning best practices can be assessed using a variety of tools. One option is to create a checklist of features. For example, do new courses include “how to” learner support, including Internet-based help tools, procedural training, guided practice, and context-appropriate feedback, to teach the steps, techniques, and methods needed to work through the course? Did the designer include worked examples within course content that model use cases and allow learners to practice realistic self-directed learning? Is metacognitive education offered to learners via reflective tasks designed to increase conceptual skills and promote self-regulation of their learning? Are there support features to ensure learners are made aware of the value of having control over content sequencing, course completion, and the amount of time spent within the course? Does the instructional designer ensure microlearning courses promote learners’ mastery orientation through challenging, varied, interesting, diverse, and novel exercises and offer those with a performance orientation rewards such as gamification features? Are learners offered challenging tasks, SMART goals, and private feedback tools (e.g., knowledge checks, hands-on exercises)? The checklist should be customized to align with each organization’s critical behaviors and required drivers.

Level 2: Learning

Level Two of the earlier Kirkpatrick (1994) evaluation model focused on an individual’s acquisition of the knowledge and skills necessary to perform the desired actions for which they are trained. Examples include knowledge checks and exams. Kirkpatrick’s New World Model (2016) adds the evaluation of “attitude, confidence and commitment” (p. 42) and recognizes that memorization is not sufficient to support Level Three behaviors. As an example, an employee can be asked to teach a newly acquired skill to a co-worker and, in the process, provide a novel application to solve a problem. Level Two evaluation of instructional designers tasked with creating technical training courses in a microlearning format can be conducted before they are hired and throughout their employment. Having the ability to design courses with the elements mentioned in Level Three above is only part of the process. The instructional designers’ confidence, attitude, and commitment will be instrumental to the quality of the curricula. The Level Two evaluation, therefore, could be accomplished through performance observation (e.g., contributions made during course design meetings) and documentation review (e.g., course storyboards) to demonstrate not only the requisite knowledge and skills but also the level of confidence, attitude, and commitment that the employee brings to the assigned task. Do the instructional designers’ self-reflection activities signal confidence in their ability to design, develop, deliver, and evaluate microlearning courses that generate confidence in the learners who take their courses? Do they demonstrate an ongoing commitment to deliver microlearning courses aligned with both adult learning theory and microlearning best practices to support the needs of the learners? Each organization will need to design its own professional development programs for instructional designers to successfully meet Level Two expectations. A wide range of options is available, including self-directed learning, internal team training workshops, industry conferences, coaching, informal learning, and formal certification and training (e.g., ATD, eLearning Guild, higher education degrees).
Professional development programs have no set timing or length, as they are ongoing and dependent upon numerous factors, including each individual’s expertise with adult learning theory and technology, organizational goals, and employees’ needs.

Level 1: Reaction

Level One evaluation in the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016) measures the reactions of learners to training and includes three components: the level of engagement, perceived relevance to their jobs, and extent of satisfaction with the curricula. The model recommends the incorporation of a blended evaluation structure so that immediate reaction measurements (Level One and Level Two) are combined with questions asking learners to anticipate how they will apply the knowledge going forward and to forecast their expectations for anticipated results. If the expected results expressed by the learners at the time of Level One and Level Two evaluation do not align with the required drivers (Level Three) and the organization’s goals for return on expectations (Level Four), the course content can be adjusted quickly, or supplemental content can be delivered, before too much time has elapsed. In addition, a blended model can incorporate delayed evaluation processes conducted an appropriate amount of time after the learner undertakes a course; these are designed to provide retrospective analyses of Level Three and Level Four. In other words, looking back on each course, how effectively were the course participants actually able to apply their learning to their jobs (Level Three), and what was the impact on the organization’s return on expectations (Level Four)? This analysis, conducted from both forward-looking and reflective perspectives, helps to keep training programs in line with organizational and learner goals.
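The blended structure described above, in which a learner’s immediate, forward-looking responses are later compared with delayed, retrospective ones, could be supported by a very simple data structure. The following Python sketch is hypothetical (the class, field names, rating scale, and one-point tolerance are illustrative assumptions, not tooling described in this study): it flags items whose retrospective rating fell well short of the learner’s forecast, signaling content that may need adjustment.

```python
# Illustrative sketch of a blended-evaluation comparison. All names and
# thresholds are assumptions made for demonstration only.
from dataclasses import dataclass

@dataclass
class BlendedItem:
    question: str
    immediate_forecast: int  # learner's anticipated rating (1-5) at course end
    delayed_actual: int      # retrospective rating (1-5) some weeks later

def misaligned(items, tolerance: int = 1):
    """Return items whose retrospective rating fell short of the forecast
    by more than the tolerance, flagging content for adjustment."""
    return [i for i in items
            if i.immediate_forecast - i.delayed_actual > tolerance]

items = [
    BlendedItem("I will apply this skill on the job", 5, 2),
    BlendedItem("This course was relevant to my work", 4, 4),
]
print([i.question for i in misaligned(items)])
# -> ['I will apply this skill on the job']
```

In practice, the same forecast-versus-retrospective comparison could be run per course or per cohort, feeding the mid-cycle curriculum adjustments described earlier.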
Designers of technical microlearning training can leverage not only the feedback received from their team colleagues but also the Level One and Level Two course evaluations completed by their course learners. This iterative feedback will serve to fine-tune the designers’ knowledge, skills, confidence, attitude, and commitment to provide the organization with the highest quality curricula possible. Through the use of Level One and Level Two evaluations, design team leaders can foster an environment of mutual encouragement, knowledge sharing, and performance recognition to ensure ongoing improvement of microlearning instruction. I turn next to a discussion of suggested mechanisms for evaluation.

Evaluation Tools

The design of effective course evaluation tools must begin while the program itself is being designed (Kirkpatrick & Kirkpatrick, 2016). As is mentioned throughout these implementation and evaluation recommendations, the most appropriate assessment instruments and processes will be unique to each organization. Starting with identification of the Level Four objectives and outcomes, the selection of evaluation methodologies, timing, and strategy should ensure that they deliver actionable information at all four levels to improve instruction. Both immediate and delayed evaluation tools can be employed, as described below.

Immediate evaluation. Level One and Level Two evaluation can be adapted to non-traditional training environments such as courses delivered via microlearning. “The Kirkpatrick Model concerns itself solely with the results, rather than with the mechanisms used to accomplish those results” (Kirkpatrick & Kirkpatrick, 2016, p. 67). However, learners’ reactions to the novelty of microlearning courses can call for unique evaluative tools and processes.
Because microlearning is often self-directed and on-demand, physical observation may not be possible, and learners can easily overlook or ignore a request to complete a Likert-scale course evaluation sheet. However, much insight can be gained from recorded work-aloud pilot test sessions. This approach can involve an instructional designer using screen-share technology to obtain real-time Level One and Level Two evaluation. For example, a designer can observe as a learner works through a microlearning course, ask the learner to provide verbal narration of their decision-making process, ask Level One and Level Two questions, and solicit course feedback from the learner. In accordance with the Kirkpatrick and Kirkpatrick (2016) Blended Evaluation® approach, it is also recommended that instructional design leaders engage in ongoing discussions with their team members regarding their attitudes toward, confidence in, and commitment to utilizing the microlearning format for technical training. See Appendix D for an example of an immediate evaluation tool.

Delayed evaluation. Kirkpatrick and Kirkpatrick (2016) state that delayed evaluation processes should emphasize deriving information relevant to the impact training has had on the organization’s ability to achieve its performance goals. This delayed evaluation can be performed at all four Kirkpatrick levels, starting with asking learners to offer their reaction to the microlearning courses (Level One) with the benefit of hindsight. Instructional designers can leverage these data to retrofit existing courses and design future curricula. Similarly, course designers should be asked to identify any changes in their confidence, attitude, and competence when creating microlearning courses, as well as any gaps in their knowledge recognized during evaluation review (Level Two).
In addition, designers can be asked to collaborate with colleagues on the changes they made in their behavior as a result of applying microlearning best practices (Level Three). Finally, measuring the impact on the organization may include self-reported impact as well as reports from employees and managers on the effectiveness of the new microlearning courses, statistics documenting improved customer satisfaction with the company, and internal metrics for goal attainment such as reduced errors, an improved safety record, and a steadily increasing stock price (Level Four). To enhance the thoroughness of the evaluation, open-ended questions can be added. See Appendix E for an example of a delayed evaluation tool.

eLearning Considerations for Assessment and Evaluation Planning

Evaluation data do not explain themselves. A thorough analysis is recommended to provide in-depth recommendations back to organizational leaders as well as individual instructional designers. One example of a concise presentation of data is a graphic display comparing immediate and delayed evaluation results. See Figure 47 for a representation of before-and-after evaluation results for a program for instructional designers focused on microlearning course best practices.

Figure 47. Example display of evaluation findings showing changes before and after training.

In this example, a simple graph summarizes many pages of data and provides clear indications of changes immediately after a training program and six months later. Some results are self-explanatory (e.g., an increase in confidence) while others (e.g., a drop in reported skills) may require more detailed explanation (i.e., once the instructional designers tried to implement the new skills, they realized they needed more training). Notice that minimal or no change might be positive or negative and therefore might require additional explanation.
In Figure 47 above, instructional designers remained highly committed to applying the microlearning course recommendations both immediately after the course and at least six months later. Yet their engagement with the program improved only slightly. This type of graph helps formulate the “if so, why” and “if not, why not” questions proposed in Kirkpatrick and Kirkpatrick (2016, ch. 12). A table format can include specific data as well as a simple visible indicator to provide clear information at a glance. See Figure 48 for an example of the representation of data evaluating the ongoing transfer of best practices from the microlearning design course to the instructional designers’ on-the-job performance.

Figure 48. Presentation of results from ongoing application of learning to job performance.

Many other presentation styles and graphic representations can similarly present data in a succinct yet informative way to stakeholders. For example, line graphs, pie charts, stacked bar charts, bubble charts, scatter plots, doughnut charts, and combinations of tools can offer insight into the data without overwhelming the consumer of the data with numbers and text. Some formats display comparisons better than progress, while other formats display a series of data better than static data. To ensure clarity, a common best practice involves testing the visual representation with someone outside of the project and asking for their interpretation of what is displayed.

Summary

Through the ongoing analysis of all four levels of data gained from both the immediate and the delayed evaluation tools, instructional designers can identify and remove specific barriers to learning in the microlearning format through iterative revisions and thereby enhance the organization’s return on microlearning course expectations.
Phillips (1996) added a fifth level of evaluation to the Kirkpatrick Model to calculate the return on investment to the organization. Despite the longevity of the Kirkpatrick and Phillips models, an ATD Research report (Ho, 2016) found that few organizations conduct much more than Level One and Level Two evaluations. See Figure 49 for a visual representation of the ATD Research findings.

Figure 49. ATD research findings on the use of the five levels of evaluation in organizations between 2009 and 2015. Adapted from “ATD Research: Evaluating Learning: Getting to Measurements That Matter,” by M. Ho, 2016, Alexandria, VA: ATD Research. Copyright 2016 by ATD Research.

Ho (2016) stated that some of the reasons training professionals gave for the lack of thorough evaluation included limited funding, limited access to organizational data (e.g., financials), and a lack of personnel trained in data collection and analysis. The study also found that fewer than half of the respondents felt their evaluation processes helped their organization meet learning and business goals. It is not surprising that instructional professionals are not convinced their organizations are meeting desired training outcomes if those outcomes are neither defined nor measured. See Figure 50 from the ATD Research study.

Figure 50. ATD research findings on the effectiveness of evaluation in meeting organizational goals between 2009 and 2015. Adapted from “ATD Research: Evaluating Learning: Getting to Measurements That Matter,” by M. Ho, 2016, Alexandria, VA: ATD Research. Copyright 2016 by ATD Research.

Perhaps the application of the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016) will result in more thorough evaluation practices due to the model’s emphasis on first defining the goals for training outcomes (Level Four) and only then aligning all other levels to support the attainment of those outcomes.
However, when Strother (2002) theoretically applied Kirkpatrick’s four levels of evaluation to eLearning courses, the author pointed out nuances in electronic course delivery that may indicate a need to redefine the scope of the four levels. By extrapolating those distinctions, it might be said that microlearning may require a unique evaluation model. For example, Strother mentions that Level One evaluation is conducted when the course is complete; in microlearning, however, course completion may or may not be the goal. Still, evaluation needs to take place to ensure instructional designers continue to develop their own skills as well as deliver courses that align with the goals of the organization and its learners. How that evaluation is conducted is less important than the fact that it is conducted at all.

Limitations and Delimitations

The limitations of this study included the use of a convenience sampling methodology with volunteer participants recruited from online technical user groups. It is possible that a form of volunteer bias affected the results such that those who held more negative perspectives on the microlearning format for technical training may not have been willing to participate in the study. In addition, although several requests for study volunteers were made via posts in multiple technical user groups relating to a variety of technical platforms, it is possible that an unequal number of participants volunteered from one user group compared to the others. The participants’ experience with and perspectives on microlearning may not have been as diverse as they would have been had volunteers originated from acknowledged dissimilar technical learning communities. Also, the data were obtained through self-reports; actual observation of the participants’ activities may have produced different results. Delimitations involved the decision to exclude from interviews the survey participants who had not taken technical training in a microlearning format.
The focus of the study was to obtain technical learners’ perspectives on the microlearning format; therefore, those without that experience would not be able to contribute the insight I sought. A secondary delimitation involved the decision not to interview all participants who volunteered. I had a limited amount of time to conduct interviews and complete the data analysis for this paper; therefore, I randomly selected ten volunteers. Data saturation became apparent in response to certain interview questions (e.g., whether the microlearning format would work for any kind of training topic) but not others. A third delimitation was the exclusion of participant observation from the data gathering process. Time and financial restrictions (both the researcher’s and the participants’) contributed to that decision. A fourth delimitation was the limited number of questions asked in the survey and during the interview sessions. The objective was to ensure participants remained engaged in the process without requiring lengthy commitments of time from them. At the end of both the survey and the interviews, participants were offered open-ended opportunities to contribute additional insight. Most participants did not take advantage of the offer.

Conclusion

The primary conclusion reached as a result of this study is that microlearning can be a viable format for the delivery of technical training for some learners (but not all) and for some purposes (but not all). To paraphrase one of the interview participants, microlearning should be neither the only arrow in an instructional designer’s quiver nor the only format of training available to learners. Although the training industry is heavily marketing courses, books, events, and webinars on microlearning, the results of this study do not change the long-standing first step in instructional design: Analysis.
Before jumping onto the microlearning bandwagon, it is of primary importance for designers to conduct thorough analyses with organizational stakeholders to determine whether the instructional format fits the objectives, culture, setting, and constraints of the organization, the learners, and the content. If so, this study may be of help to instructional designers due to its lists of detailed recommendations for microlearning design best practices. Once microlearning is included in a company’s instructional methodology, frequent four-level evaluation should be conducted within each organization to ensure the format continues to achieve changing organizational objectives. Best practices, use cases, success stories, risks, and cautions should be shared in the greater instructional design community to ensure other organizations’ designers and learners have the best possible experiences with the microlearning format. It is said that perception is reality. Instructional designers would do well to conduct their own version of this study to stay in touch with the reality of microlearning through the perceptions of their learners.

References

Adkins, S. S. (2016). The 2016 worldwide self-paced eLearning market: Global eLearning market in steep decline. Monroe, WA: Ambient Insight. Retrieved from http://www.ambientinsight.com/Resources/Documents/AmbientInsight_2015-2020_US_Self-paced-eLearning_Market_Abstract.pdf

Aguinis, H., & Kraiger, K. (2009). Benefits of training and development for individuals and teams, organizations, and society. Annual Review of Psychology, 60, 451-474.

Ali, G. E., & Magalhaes, R. (2008). Barriers to implementing e-learning: A Kuwaiti case study. International Journal of Training and Development, 12(1), 36-53.

Andrews, D., Nonnecke, B., & Preece, J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve Internet users.
International Journal of Human-Computer Interaction, 16(2), 185-210.

Association for Talent Development (ATD). (2016). Microlearning [search term]. Retrieved from https://www.td.org/

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191.

Bannert, M. (2006). Effects of reflection prompts when learning with hypermedia. Journal of Educational Computing Research, 35(4), 359-375.

Bannert, M., Hildebrand, M., & Mengelkamp, C. (2009). Effects of a metacognitive support device in learning environments. Computers in Human Behavior, 25(4), 829-835.

Bannert, M., Sonnenberg, C., Mengelkamp, C., & Pieger, E. (2015). Short- and long-term effects of students’ self-directed metacognitive prompts on navigation behavior and learning performance. Computers in Human Behavior, 52, 293-306.

Bell, C. R. (1961). Psychological versus sociological variables in studies of volunteer bias in surveys. Journal of Applied Psychology, 45(2), 80.

Berry, D., & Mok, L. (2012). Five best practices to improve professional development in the IT workforce. Stamford, CT: Gartner, Inc. Retrieved from http://www.gartner.com/document/2266716?ref=solrAll&refval=159142292&qid=61586175f8cf17238b1bffba0c570236

Bova, B., & Kroth, M. (2001). Workplace learning and Generation X. Journal of Workplace Learning, 13(2), 57-65.

Brethower, D. M. (2000). The relevance of performance improvement to instructional design. In G. M. Piskurich, P. Beckschi, & B. Hall (Eds.), The ASTD handbook of training design and delivery: A comprehensive guide to creating and delivering training programs: Instructor-led, computer-based, or self-directed (pp. 473-491). New York, NY: McGraw-Hill.

Brown, K. G. (2005). A field study of employee e-learning activity and outcomes. Human Resource Development Quarterly, 16, 465-480.

Brynjolfsson, E., & McAfee, A. (2014).
The second machine age: Work, progress, and prosperity in a time of brilliant technologies. New York, NY: W. W. Norton & Company.

Bulut, C., & Culha, O. (2010). The effects of organizational training on organizational commitment. International Journal of Training and Development, 14(4), 309-322.

Burke, W. W., & Litwin, G. H. (1992). A causal model of organizational performance and change. Journal of Management, 18(3), 523-545.

Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right performance solutions. Charlotte, NC: Information Age Publishing, Inc.

Clark, R. C., & Mayer, R. E. (2008). E-learning and the science of instruction: Proven guidelines for consumers and designers of multimedia learning. San Francisco, CA: Jossey-Bass.

Coeurderoy, R., Guilmot, N., & Vas, A. (2014). Explaining factors affecting technological change adoption: A survival analysis of an information system implementation. Management Decision, 52(6), 1082-1100.

Combs, W. L., & Davis, B. M. (2010). Demystifying technical training: Partnership, strategy, and execution. San Francisco, CA: John Wiley & Sons.

Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches. Thousand Oaks, CA: SAGE Publications.

Deloitte Consulting. (2014). Global human capital trends: Engaging the 21st-century workforce. Retrieved from http://d2mtr37y39tpbu.cloudfront.net/wp-content/uploads/2014/03/GlobalHumanCapitalTrends_2014.pdf

DeRouin, R. E., Fritzsche, B. A., & Salas, E. (2005). E-learning in organizations. Journal of Management, 31(6), 920-940.

Donnelly, R. (2006). How “free” is the free worker? An investigation into the working arrangements available to knowledge workers. Personnel Review, 35(1), 78-97.

Eccles, J. (2006). Expectancy value motivational theory.
Retrieved from http://www.education.com/reference/article/expectancy-value-motivational-theory/

Edge, D., Fitchett, S., Whitney, M., & Landay, J. (2012). MemReflex: Adaptive flashcards for mobile microlearning. In Proceedings of the 14th international conference on human-computer interaction with mobile devices and services companion (pp. 431-440). New York, NY: ACM. doi:10.1145/2371664.2371707

Egan, T. M., Yang, B., & Bartlett, K. R. (2004). The effects of organizational learning culture and job satisfaction on motivation to transfer learning and turnover intention. Human Resource Development Quarterly, 15(3), 279-301.

Eibl, T. (2007). What size is micro? Using a didactical approach based on learning objectives to define granularity. In T. Hug (Ed.), Didactics of microlearning: Concepts, discourses and examples (pp. 125-138). New York, NY: Waxmann Publishing Company.

Elliot, A. J., & Harackiewicz, J. M. (1996). Approach and avoidance achievement goals and intrinsic motivation: A mediational analysis. Journal of Personality and Social Psychology, 70(3), 461-475. doi:10.1037/0022-3514.70.3.461

Esteves, J. M. (2014). An empirical identification and categorisation of training best practices for ERP implementation projects. Enterprise Information Systems, 8(6), 665-683.

Ettinger, A., Holton, V., & Blass, E. (2006). E-learner experiences: Key questions to ask when considering implementing e-learning. Industrial and Commercial Training, 38(3), 143-147.

Eysenbach, G., & Till, J. E. (2001). Ethical issues in qualitative research on Internet communities. British Medical Journal, 323, 103-105.

Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26(2), 132-139.

Galesic, M., & Bosnjak, M. (2009). Effects of questionnaire length on participation and indicators of response quality in a web survey. Public Opinion Quarterly, 73(2), 349-360.

Gallimore, R., & Goldenberg, C.
(2010). Analyzing cultural models and settings to connect minority achievement and school improvement research. Educational Psychologist, 36(1), 45-56.

Garrison, D. R. (2003). Cognitive presence for effective asynchronous online learning: The role of reflective inquiry, self-direction and metacognition. Elements of Quality Online Education: Practice and Direction, 4(1), 47-58.

Glesne, C. (2011). Becoming qualitative researchers: An introduction (4th ed.). Boston, MA: Pearson.

Gobet, F., Lane, P. C. R., Croker, S., Cheng, P. C-H., Oliver, I., & Pine, J. M. (2001). Chunking mechanisms in human learning. Trends in Cognitive Sciences, 5(6), 236-243.

Grinberg, M., & Hristova, E. (2011). Efficiency and usability of e-learning systems: Project-oriented methodology guide. Retrieved from http://eprints.nbu.bg/635/1/METHOLOGY_EN.pdf

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18(1), 59-82.

Hagleitner, W., & Hug, T. (2007). A generative model for the evaluation of micro-learning processes. In T. Hug (Ed.), Didactics of microlearning: Concepts, discourses and examples (pp. 381-397). New York, NY: Waxmann Publishing Co.

Hamari, J., Koivisto, J., & Sarsa, H. (2014). Does gamification work? A literature review of empirical studies on gamification. In Proceedings of the 47th Hawaii International Conference on System Sciences, Hawaii, USA, January 6-9, 2014.

Hamtini, T. M. (2008). Evaluating e-learning programs: An adaptation of Kirkpatrick’s model to accommodate e-learning environments. Journal of Computer Science, 4(8), 693-698.

Hierdeis, H. (2007). From Meno to microlearning: A historical survey. In T. Hug (Ed.), Didactics of microlearning (pp. 35-52). New York, NY: Waxmann.

Ho, M. (2016). Evaluating learning: Getting to measurements that matter. Alexandria, VA: ATD Research.

Holton, E. F., Coco, M.
L., Lowe, J. L., & Dutsch, J. V. (2006). Blended delivery strategies for competency-based training. Advances in Developing Human Resources, 8(2), 210-228. Horton, W. (2006). Evaluating e-learning. Alexandria, VA: Association for Talent Development Hsia, J. W., Chang, C. C., & Tseng, A. H. (2014). Effects of individuals' locus of control and computer self-efficacy on their e-learning acceptance in high-tech companies. Behaviour & Information Technology, 33(1), 51-64. Hsieh, P. A. J., & Cho, V. (2011). Comparing e-Learning tools’ success: The case of instructor– student interactive vs. self-paced tools. Computers & Education, 57(3), 2025-2038. Hug, T. (2012). Microlearning. In Encyclopedia of the sciences of learning. (Microlearning). US: SpringerLink ebooks. Retrieved from http://tinyurl.com/gvge6db. Hug, T. & Friesen, N. (2007). Outline of a microlearning agenda. In T. Hug (Ed.). New York, NY: Waxmann Publishing Co. Hunt, K. J., Shlomo, N., & Addington-Hall, J. (2013). Participant recruitment in sensitive surveys: A comparative trial of 'opt in' versus 'opt out' approaches. BMC Medical Interactive Services. (2015). Blended learning 2015: Future of learning whitepaper [White paper]. Retrieved from http://interactiveservices.com/wp-content/uploads/2015/06/ blended_learning_whitepaper_2015.pdf Jonassen, D. H. (2000). Toward a design theory of problem solving. Educational Technology Research and Development 48(4), 63-85. Jonsen, K., & Jehn, K. A. (2009). Using triangulation to validate themes in qualitative studies. Qualitative Research in Organizations and Management: An International Journal, 4(2), LEARNERS’ PERCEPTIONS OF MICROLEARNING 206 123-150. Keller, J., & Suzuki, K. (2004). Learner motivation and e-learning design: A multinationally validated process. Journal of Educational Media, 29(3), 229-239. Karimi, K., & Nickpayam, J. (2017). Gamification from the viewpoint of motivational theory. Italian Journal of Science & Engineering, 1(1). Keller, J., & Suzuki, K. 
(2004). Learner motivation and e-learning design: A multinationally validated process. Journal of Educational Media, 29(3), 229-239. Kerres, M. (2007). Microlearning as a challenge for instructional design. In T. Hug (Ed.), Didactics of microlearning: Concepts, discourses and examples (98-109). New York, NY: Waxmann Publishing Co. Kim, D., Lee, S., & Kim, B. O. (2014). Findings from survey on mobile multimedia training. Allied Academies International Conference. Academy of Information and Management Sciences. Proceedings, 18(2), 6-9. Retrieved from http://search.proquest.com.libproxy1. usc.edu/docview/1647624630?accountid=14749 Kinnebrock, S., Baeßler, B., & Rössler, P. (2007). Quality Management for the Implementation of E-Learning. In T. Hug (Ed.), Didactics of microlearning (398-415). New York, NY: Waxmann. Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler Publishers. Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation. Alexandria, VA: ATD Press. Klein, H. J., Noe, R. A., & Wang, C. (2006). Motivation to learn and course outcomes: The impact of delivery mode, learning goal orientation, and perceived barriers and enablers. Personnel psychology, 59(3), 665-702. LEARNERS’ PERCEPTIONS OF MICROLEARNING 207 Knowles, M. S., Holton III, E. F., and Swanson, R. A. (2015). The adult learner: The definitive classic in adult education and human resource development. (8 th ed.). New York, NY: Routledge. Kohlbacher, F., & Mukai, K. (2007). Japan's learning communities in Hewlett-Packard Consulting and Integration: Challenging one-size fits all solutions. The Learning Organization, 14(1), 8-20. Kok, A. (2013). How to Manage the Inclusion of E-Learning in Learning Strategy. International Journal of Advanced Corporate Learning, 6(1). Koivisto, J., & Hamari, J. (2014). Demographic differences in perceived benefits from gamification. Computers in Human Behavior, 35, 179-188. 
Krathwohl, D. R. (2002). A revision of Bloom’s Taxonomy: An overview. Theory Into Practice, 41(4), 212–218. Landers, R. N., & Callan, R. C. (2011). Casual social games as serious games: The psychology of gamification in undergraduate education and employee training. In Serious Games and Edutainment Applications (pp. 399-423). Surrey, UK: Springer. Landers, R. N., Bauer, K. N., & Callan, R. C. (2017). Gamification of task performance with leaderboards: A goal setting experiment. Computers in Human Behavior, 71, 508-515. Langreiter, C. and Bolka, A. (2005). Snips and spaces: Managing microlearning. In Proceedings of the Microlearning 2005 Conference, (pp.79–97). Leidelmair, K. (2007). Blogs and chats: Some critical remarks on electronic communication. In In Hug, T. (Ed.) Didactics of Microlearning. Concepts, Discourses and Examples, 187- 199. New York, NY: Waxman Publishing Company. LinkedIn. (2016). About us. Retrieved from https://www.linkedin.com/about-us?trk=hp-about LEARNERS’ PERCEPTIONS OF MICROLEARNING 208 Madrid, R. I., Van Oostendorp, H., & Melguizo, M. C. P. (2009). The effects of the number of links and navigation support on cognitive load and learning with hypertext: The mediating role of reading order. Computers in Human Behavior, 25(1), 66-75. Mathy, F. & Fedman, J. (2012). What’s magic about magic numbers? Chunking and data compression in short-term memory. Cognition, 122(3), 346-362. Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed.). Thousand Oaks, CA: SAGE Publications. McManus, P., & Rossett, A. (2006). Performance support tools: Delivering value when and where it is needed. Performance Improvement, 45(2), 8-16. Mehlenbacher, B., Bennett, L., Bird, T., Ivey, M., Lucas, J., Morton, J., & Whitman, L. (2005, July). Usable e-learning: A conceptual model for evaluation and design. In Paper presented at the HCI international 2005: 11th international conference on human– computer-interaction, Las Vegas, NV. Merchant, K. 
A., & Van der Stede, W. A. (2006). Field-based research in accounting: Accomplishments and prospects. Behavioral Research in Accounting, 18, 117-134. Merriam, S. B., & Tisdell, E. J. (2009). Qualitative research: A guide to design and implementation (4th ed.). San Francisco, CA: Jossey-Bass. Millar, M. M., & Dillman, D. A. (2011). Improving response to web and mixed-mode surveys. Public Opinion Quarterly 75: 249–69. Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63, 81–97. Miller, L. (2015a). ATD Learning Executive Confidence Index (LXCI) Report (Q2 2015). Retrieved from Association of Talent Development website: https://www.td.org/ LEARNERS’ PERCEPTIONS OF MICROLEARNING 209 Professional-Resources/LXCI?mktcops=c.learning-and-development~c.sr-leader Miller, L. (2015b). Instructional design now: A new age of learning and beyond. [White paper]. Association for Talent Development (ATD) and the Institute for Corporate Productivity (i4cp). Retrieved from Association of Talent Development website: http://files.astd.org/Research/LookInsides/Instructional_Design-Free_Preview.pdf Monahan, T., McArdle, G., & Bertolotto, M. (2008). Virtual reality for collaborative e- learning. Computers & Education, 50(4), 1339-1353. Morrison, G. R., Ross, S. M., Kemp, J. E., & Kalman, H. (2010). Designing effective instruction, (6th ed.). Hoboken, NJ:John Wiley & Sons Myers, M. D., and Newman, M. (2007). The qualitative interview in IS research: Examining the craft. Information and Organization, 17(1), 2-26. O’Reilly, T. (2005). What is Web 2.0: Design patterns and business models for the next generation of software. Retrieved from http://www.oreilly.com/pub/a/web2/archive/what- is-web-20.html Owens, L. (2010). A snapshot of today’s content and collaboration professional. Forrester. 
Retrieved from https://www.forrester.com/report/A+Snapshot+Of+Todays+Content+And +Collaboration+Professional/-/E-RES57833 Paine, N. (2014). The learning challenge: Dealing with technology, innovation and change in learning and development. Philadelphia, PA: Kogan Page Limited. Pajares, F. (2006). Self-efficacy theory. Retrieved from, http://www.education.com/ reference/article/self-efficacy-theory/ Paré, G., & Tremblay, M. (2007). The influence of high-involvement human resources practices, procedural justice, organizational commitment, and citizenship behaviors on information LEARNERS’ PERCEPTIONS OF MICROLEARNING 210 technology professionals' turnover intentions. Group & Organization Management, 32(3), 326-357. Park, J. H. & Wentling, T. (2007). Factors associated with transfer of training in workplace e-learning. Journal of Workplace Learning, 19(5), 311-329. Peschl, M. F. (2007). From double-loop learning to triple-loop learning. Profound change, individual cultivation, and the role of wisdom in the contact of the microlearning approach. In T. Hug (Ed.), Didactics of microlearning: Concepts, discourses and examples (292-312). New York, NY: Waxmann Publishing Co. Pfeffer, J., & Veiga, J. F. (1999). Putting people first for organizational success. The Academy of Management Executive, 13(2), 37-48. Piccoli, G., Ahmad, R., & Ives, B. (2001). Web-based virtual learning environments: A research framework and a preliminary assessment of effectiveness in basic IT skills training. MIS quarterly, 401-426. Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in learning and teaching contexts. Journal of Educational Psychology, 95(4), 667–686. Pintrich, P.R., Wolters, C. A., & Baxter, G. P. (2000). Assessing metacognition and self- regulated learning. In G. Schraw & J. C. Impara (Eds.), Issues in the Measurement of Metacognition (pp. 43-97). Lincoln, NE: Buros Institute of Mental Measurements Piskurich, G. M. (2000). 
Make it easier for them to learn on their own: Instructional design for technology-based self-instructional application. In G. M. Piskurich, P. Beckschi, & B. Hall (Eds.), The ASTD handbook of training design and delivery: A comprehensive guide to creating and delivering training programs: Instructor-led, computer-based, or self- directed (473-491). New York, NY: McGraw-Hill. LEARNERS’ PERCEPTIONS OF MICROLEARNING 211 Phillips, J. (1996). Measuring ROI: The fifth level of evaluation. Technical & Skills Training, 7(3), 10-13. Qualtrics. (2016). Security statement. Retrieved from https://www.qualtrics.com/security- statement/ Riege, A. (2005). Three-dozen knowledge-sharing barriers managers must consider. Journal of Knowledge Management, 9(3), 18-35. Roosta, F., Taghiyareh, F., & Mosharraf, M. (2016, September). Personalization of gamification- elements in an e-learning environment based on learners' motivation. In Telecommunications (IST), 2016 8th International Symposium on (pp. 637-642) Rossett, A., & Schafer, L. (2003). What to do about e-dropouts: What if it’s not the e-learning but the e-learner? T+ D, 57(6), 40-46. Rueda, R. (2011). The 3 dimensions of improving student performance. New York: Teachers College Press. Sailer, M., Hense, J., Mandl, H., & Klevers, M. (2013). Psychological perspectives on motivation through gamification. Interaction design and architecture journal, 19, 28-37. salesforce.com. (2016). Why Salesforce? Retrieved from https://www.salesforce.com/salesforce- advantage/?d=70130000000i7zF&internal=true Sarvary, M. (1999). Knowledge management and competition in the consulting industry. California management review, 41(2), 95-107. Schäfer, M. T., & Kranzlmüller, P. (2007). RTFM! Teach Yourself Culture in Open Source Software Projects. In Hug, T. (Ed.), Didactics of Microlearning. Concepts, Discourses and Examples, 324-340. New York, NY: Waxman Publishing Company. Schein, E. H. (2004). Organizational culture and leadership. 
San Francisco: Jossey-Bass. LEARNERS’ PERCEPTIONS OF MICROLEARNING 212 Schein, E. H. (2009). The corporate culture survival guide. San Francisco: Joseey-Bass. Scherer, F. & Scherer, M. (2007). How “micro” can learning be? A neuropsychological perspective on microlearning. In T. Hug (Ed.), Didactics of Microlearning. Concepts, Discourses and Examples (110-124). New York, NY: Waxmann. Schilling, J., & Kluge, A. (2009). Barriers to organizational learning: An integration of theory and research. International Journal of Management Reviews, 11(3), 337-360. Schraw, G. (1998). Promoting general metacognitive awareness. Instructional science, 26(1-2), 113-125. Schultze, U., & Avital, M. (2011). Designing interviews to generate rich data for information systems research. Information and Organization, 21(1), 1-16. Servage, L. (2005). Strategizing for workplace e-learning: some critical considerations. Journal of Workplace Learning, 17(5/6), 304-317. Shepard, L. A. (2000). The role of assessment in a learning culture. Educational Researcher, 29(7), 4-14. Siemens, G. (2007). Connectivism: Creating a learning ecology in distributed environments. In T. Hug (Ed.), Didactics of Microlearning. Concepts, Discourses and Examples (53-68). New York, NY: Waxmann. Singh, H. (2003). Building effective blended learning programs. Educational Technology, 43, 51-54. Son, L. K., & Simon, D. A. (2012). Distributed learning: Data, metacognition, and educational implications. Educational Psychology Review, 24(3), 379-399. Strother, J. B. (2002). An assessment of the effectiveness of e-learning in corporate training programs. The International Review of Research in Open and Distributed Learning, 3(1). LEARNERS’ PERCEPTIONS OF MICROLEARNING 213 Šumak, B., Heričko, M., & Pušnik, M. (2011). A meta-analysis of e-learning technology acceptance: The role of user types and e-learning technology types. Computers in Human Behavior, 27(6), 2067-2077. Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. 
Y., & Yeh, D. (2008). What drives a successful e- Learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & education, 50(4), 1183-1202. Sweller, J., Kirschner, P. A., & Clark, R. E. (2007). Why minimally guided teaching techniques do not work: a reply to commentaries. Educational Psychologist, 42(2), 115-121. Tesluk, P. E., Farr, J. L., & Klein, S. R. (1997). Influences of organizational culture and climate on individual creativity. The Journal of Creative Behavior, 31(1), 27-41. Trailhead. (2017). Salesforce Trailhead [Website]. Retrieved from trailhead.salesforce.com. van der Vyver, B., Pelster, B., Bersin, J. & Haims, J. (2014). Global human capital trends 2014: Engaging the 21st-century workforce. Deloitte University Press. Retrieved from: https://dupress.deloitte.com/dup-us-en/focus/human-capital-trends/2014.html Van Merriënboer, J. J., & Ayres, P. (2005). Research on cognitive load theory and its design implications for e-learning. Educational Technology Research and Development, 53(3), 5-13. Wang, M. (2011). Integrating organizational, social, and individual perspectives in Web 2.0- based workplace e-learning. Information Systems Frontiers, 13(2), 191-205. Waters, T., Marzano, R. J., & McNulty, B. (2003). Balanced leadership. Aurora, CO: McREL. Weiss, R. S. (1994). Learning from strangers: The art and method of qualitative interview studies. New York, NY: The Free Press. LEARNERS’ PERCEPTIONS OF MICROLEARNING 214 Welsh, E. T., Wanberg, C. R., Brown, K. G. & Simmering, M. J. (2003). E-learning: emerging uses, empirical results and future directions. International Journal of Training and Development, 7, 245–258. Wright, K. B. (2005). Researching Internet‐based populations: Advantages and disadvantages of online survey research, online questionnaire authoring software packages, and web survey services. Journal of Computer‐Mediated Communication, 10(3), 00-00. Yough, M., & Anderman, E. (2006). 
Appendix A

Survey Instrument

Thank you for your interest in helping me with my dissertation research. My name is Wendy Peterson. I'm working on my doctorate in education through the University of Southern California (USC). My research is focused on evaluating student preferences for obtaining technical training. Participation in this study is voluntary. The survey is anticipated to take 10 minutes to complete. The next screen will present an information sheet. Please read through the information and, if you're willing to participate in the study, please indicate your consent and click the ">>" (next) navigation arrow to launch the survey. Thank you for your time.

Survey Questions (Italicized text was not included in the survey instrument)

Question 1: Please indicate the type of organization for which you currently work.
a. For-profit company
b. Public organization (e.g., school, government, etc.)
c. Non-profit organization (e.g., church, charity, etc.)
d. Self-employed (e.g., no other employees)
e. Not currently employed
f. Other: _________________________

Question 1a: (This question was only asked if the type of organization selected in Question 1 was for-profit, public, non-profit, or other.) Please estimate the number of employees at your organization.
• 51 to 100
• 101 to 500
• 501 to 1,000
• 1,001 or more

Question 2: For purposes of data comparison, please move the slider to indicate your age.
The answer option was a slider selection tool with a range from 0 to 100. All participants who indicated an age under 18 were directed to a page thanking them for their interest, and the survey was stopped.

Question 3: For purposes of data comparison, please indicate your gender.
• Female
• Male
• Transgender
• Prefer not to answer

Question 4: Please identify the country in which you work.
• United States
• Canada
• United Kingdom
• Other: __________________________

The following questions are about your experience with technical training.

Question 5: Have you taken technical training courses?
• Yes
• No

Question 6: (This question was only asked if the answer to Question 5 was "Yes." The order of the answer options [except "Other"] was randomly shuffled.) Please let me know your reason for taking technical training courses.
• I work in a technical role
• I want to start working in a technical role
• I have a general interest in technology
• Other: ________________________

Question 6a: (This question was only asked if the answer to Question 6 was that they worked in a technical role, they wanted to start working in a technical role, or other.) Please let me know your reason for taking technical training courses.
• Programming
• Configuration (non-programming)
• Administration
• End User
• Quality Control / Testing
• Other: ________________________

Question 6b: (This question was only asked if the answer to Question 5 was "Yes.") Please drag the slider to indicate how many years you have been taking technical training courses. If more than 30, select 30.
The answer option was a slider selection tool with a range from 0 to 30.
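The branching described in Questions 1 through 6b (an employer-dependent follow-up, an under-18 screen-out, and training-dependent follow-ups) amounts to simple skip logic. The sketch below is a hedged illustration only: the question IDs, answer codes, and function name are assumptions for demonstration, not the survey platform's actual implementation, and only the screening questions are modeled.

```python
# Hypothetical sketch of the survey's skip logic for Questions 1-6b.
# Question IDs and answer codes are illustrative assumptions.

def next_question(answers):
    """Return the ID of the next question to display, given answers so far."""
    if "q1" not in answers:
        return "q1"  # organization type
    # Q1a (employee count) is asked only of for-profit, public,
    # non-profit, or "other" organization types.
    if answers["q1"] in {"for-profit", "public", "non-profit", "other"} and "q1a" not in answers:
        return "q1a"
    if "q2" not in answers:
        return "q2"  # age slider
    if answers["q2"] < 18:
        return "terminate"  # under-18 respondents exit the survey
    if "q5" not in answers:
        return "q5"  # has the respondent taken technical training?
    # Q6 (reason for training) is asked only if Q5 was answered "yes".
    if answers["q5"] == "yes" and "q6" not in answers:
        return "q6"
    return "continue"  # remaining questions follow the same pattern
```

For brevity the sketch omits Questions 3, 4, 6a, and 6b, whose display conditions follow the same pattern.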
Question 7: (This question inquired about the importance the participant placed on various features and formats of technical training courses. It was in the form of a Likert scale matrix and all statement options were randomly shuffled.) Thinking about your preferences for learning technical content, please identify how important each of the following statements is to you.
Answer options: Not Important at All; Not Too Important; No Opinion; Somewhat Important; Very Important
a. Learn technical content in short segments
b. Learn technical content when it is convenient for me
c. Learn only the specific technical content I need
d. Have the ability to skip around in a technical course to take the content in the order I want
e. Learn technical content by collaborating with colleagues
f. Learn technical content in a classroom with a live instructor
g. Learn technical content by attending a live webinar with an instructor
h. Learn technical content by listening to recorded content
i. Learn technical content offline from a workbook

Question 8: (This question inquired about organizational influences on professional development. The statement options were presented in the form of a Likert scale matrix and randomly shuffled.) Thinking about your current organization, please respond to the following statements.
Answer options: Strongly Disagree, Somewhat Disagree, Neutral, Somewhat Agree, Strongly Agree
a. My organization provides the professional development courses I need
b. My organization encourages me to take ongoing professional development courses
c. My organization has a strong commitment to employee professional development

Question 9: (This question inquired about organizational influences on professional development. It was in the format of a constant-sum listing [answers must total 100%] and all statement options [except "Other"] were randomly shuffled.)
In your current company, what percentage of the technical training is conducted with the following features? Your answers must total 100%.
a. Live classroom courses with an instructor
b. Live webinars with an instructor
c. Printed or online training manuals or workbooks
d. E-learning courses that are delivered as full-length training that you have to complete in the order that they're presented
e. E-learning courses that are delivered in small segments structured so you can choose the order in which you go through the content
f. Learning by collaborating with my colleagues
g. Other: ________________________

The following questions inquire about your experience with the microlearning format for training. My definition of the microlearning format is e-learning training content broken up into short segments (generally 5 minutes or less) that give you control over (a) the order in which you take the segments; (b) whether you skip segments completely; and (c) how much time you spend learning at one time. Examples of courses delivered in a microlearning format include Salesforce's Trailhead badges, Skillsoft, Lynda.com, Udemy, Khan Academy, Coursera, and similar programs. I define e-learning courses that are not in a microlearning format as those with long modules (generally more than 5 minutes in length) that require you to (a) go through the content in a specific order; (b) complete the entire course; or (c) stay on a slide for a set period of time before you can advance to another slide.

Question 10: Have you taken any technical training in a microlearning format?
• Yes
• No

Question 10a: (Only asked if the participant's response to Question 10 was "No." The statement options were presented in the form of a Likert scale matrix and randomly shuffled. Once answered, participants were asked if they wanted to participate in a sweepstakes drawing, and the survey was concluded for these participants.)
Please indicate your level of interest in microlearning courses for technical training.
Answer options: Strongly Disagree, Somewhat Disagree, Neutral, Somewhat Agree, Strongly Agree
• I have had the opportunity to take a microlearning course for technical training
• Given the opportunity, I would like to try a microlearning course for technical training
• I do not care for the microlearning course format for technical training
• I do not understand what a microlearning course is

Drawing: Thank you for participating in my research study. To enter our random drawing for a $10 Amazon gift card, the next screen will ask for your email address. Your email will not be sold, distributed, or used for any purpose other than this drawing. Once the drawing is complete, your email information will be deleted. However, if you choose not to enter the drawing, simply select "No" below and the survey will conclude.
• Yes (Participants who submitted this option were sent out of the main survey and into a separate survey to collect sweepstakes entry information via their email or phone.)
• No (Participants who submitted this option were sent to the "Thank you" screen and the survey was concluded.)

The following survey questions were only asked of those participants who indicated in Question 10 that they had taken technical training in a microlearning format.

Question 11: Please drag the slider across and estimate how many microlearning-style courses (as defined above) on technical content you have at least started from Trailhead, Skillsoft, Lynda.com, Udemy, Khan Academy, Coursera, or a similar course provider.
The answer option was a slider selection tool with a range from 0 to 50.

Question 12: (This question inquired about the level of the participant's motivation to take future technical content in a microlearning format. It was presented in a single-answer multiple choice format.)
Do you plan on taking additional microlearning courses in the future?
Answer options: Definitely Will Not, Probably Will Not, Might or Might Not, Probably Will, Definitely Will

Question 13: (This question inquired about the value the respondent placed on features of microlearning courses, including knowledge and motivation factors. It was in the form of a Likert scale matrix and all statements were randomly shuffled.) Thinking back on your overall experience with microlearning courses, how important is each of the following features to you? (Procedural Knowledge; Metacognitive Knowledge; Self-Efficacy; Expectancy Value)
Answer options: Not Important at All; Not Too Important; No Opinion; Somewhat Important; Very Important
a. Clear course navigation instructions
b. Definitions for terminology used in the course
c. Display of estimated time for each segment's completion
d. Progress bar displays time remaining in each segment
e. Interactive features (e.g., videos, projects, knowledge checks, etc.)
f. Flexible path through the course content
g. Ability to click through a required course quickly

Question 14: (This question inquired about the respondent's level of confidence [self-efficacy] and related value found in the microlearning format for technical training. It was in the form of a Likert scale matrix and all statements were randomly shuffled.) Referring to your level of confidence specifically in the microlearning format for technical training, please rate the following statements: (Self-Efficacy; Expectancy Value)
Answer options: Strongly Disagree, Somewhat Disagree, Neutral, Somewhat Agree, Strongly Agree
a. I can successfully navigate through a microlearning course
b. I can acquire the technical skills I need from microlearning courses
c. Taking microlearning courses is an important use of my time

Question 15: (This question inquired about motivational factors by asking for the respondent's overall opinion on the microlearning format for technical training. It was in the form of a Likert scale matrix and all statements were randomly shuffled.) Referring to your overall opinions on the use of the microlearning format for technical training, please rate the following statements: (Self-Efficacy; Expectancy Value)
Answer options: Strongly Disagree, Somewhat Disagree, Neutral, Somewhat Agree, Strongly Agree
a. I would like more technical training to be delivered in a microlearning format
b. The microlearning format makes it easier for me to learn technical content
c. I find it worthwhile to be able to make decisions on the path I take through the modules
d. The microlearning format helps me achieve my learning goals
e. I find it worthwhile to have control over how long I spend in training at one time
f. The microlearning format involves too much starting and stopping
g. When I take technical training, I spend more than 30 minutes at one time
h. Technical training with a live instructor is more effective than e-learning

Question 16: (This question asked about the metacognitive and motivation factors employed while taking a microlearning course. It was a single-answer multiple-choice question and all answer options [except "Other"] were randomly shuffled.) If I could observe you working through your most recent microlearning course, what is the primary strategy I would see you use? Please choose one. (Procedural Knowledge; Metacognitive Knowledge; Self-Efficacy; Expectancy Value)
a. Worked straight through the segments in order
b. Skipped segments I felt I already knew
c. Skipped around in the segments but took them all
d. Skipped around in the segments but only took the ones that interested me
e. Planned out my learning path before starting the course
f. Other: ________________________

Question 17: (This question offered the respondent an opportunity to contribute additional insight. It was a free text field.) Please add any additional feedback you would like to contribute about your impressions of the microlearning format for technical training.

Question 18: (This question asked the participant to volunteer for a live interview. Participants were offered a free text field to enter an email or phone number.) Would you be willing to contribute more depth to this research by having a 10-minute, anonymous phone interview with me? If so, please add your phone number or email so I can contact you to arrange a date and time convenient for you. Thank you. (Once answered, participants were presented with the same sweepstakes screen as identified above.)

Appendix B

Interview Protocol

Good morning (or as appropriate), <name>. Thank you for taking the time to talk with me today. I'm very interested in hearing your feedback about the Salesforce security settings microlearning course that you completed. In particular, I'll be asking for feedback on your specific experience with the microlearning format for technical training and then for your perception of how you think the IT industry as a whole will accept the format. My interest in this study arose when I noticed the volume of recent training industry webinars and tools focused on microlearning, and I became curious to hear how acceptable the format is for technical content. Your input will help a great deal in clarifying the answer to that question. Before we start, I would like your permission to record our conversation so that I can focus on our time together and have an accurate record of your responses. I assure you that our conversation is strictly confidential, your name is coded and will remain anonymous, and no one except perhaps my faculty advisor will be able to access the recording or the full transcript.
Once transcribed, the recording will be deleted. I know this all sounds like I am going to ask you embarrassing questions or try to get you to reveal secrets, but I promise you that nothing of the sort will happen. I am simply a student at a university, and when we do research with people, we are required to explain our commitment to confidentiality. If at any time you want to stop the interview, decline to answer a question, or withdraw from the study, please just let me know, and of course you will be free to do so. After the interview recording is transcribed, you will have the opportunity to review it to ensure I have captured your input correctly. Before we begin, do you have any questions for me?

NOTE: Bolded text will not be spoken during the interview and is listed here to link each question to the knowledge, motivation, and/or organizational influence it addresses. Questions listed as sub-bullets are available as follow-up probing questions should the respondent not provide the depth of information sought.

1. Please tell me a little about why you take technical training courses. (Motivation – Goal Orientation)
• How does technical training impact your job, career, or professional development?
• How important are company or personal status and recognition to your decision to take technical training courses?

2. Please tell me about your overall experiences with technical training courses delivered in an e-learning format.
• How does an e-learning format affect (positively or negatively) your ability to learn technical content? (Knowledge – Metacognitive)
• How valuable is the e-learning format to you compared to other training formats for learning technical content? (Motivation – Value)
• How confident do you feel in the ability of a course delivered via e-learning to give you the technical skills you want? (Motivation – Self-Efficacy)
• What types of instructions do you think should be included in technical e-learning courses? (Knowledge – Procedural)

3. I'm going to ask you a few questions specifically about the microlearning format for technical training courses. Please describe how you decide to proceed through a microlearning-styled course. (Knowledge – Metacognitive)
• What support features should a microlearning course provide to help you plan your path through the content?

4. Please describe how confident you feel about your ability to use a microlearning course to obtain the technical knowledge you're seeking. (Motivation – Self-Efficacy)

5. Regarding your ability to gain technical skills through microlearning-styled training, please compare the value of the microlearning format to other formats of e-learning courses you've taken. (Motivation – Value)
• What did you find to be different? Please give me an example.
• What did you find to be the same? Please give me an example.
• What would influence your decision whether or not to take future technical training in a microlearning format?

6. In the next couple of questions, I'd like to know a little more about the company you work for. Please tell me whether your organization encourages, discourages, or remains neutral about its employees taking technical training. (Organization – Cultural Model)
• How does that affect your perceptions about taking technical training? (Motivation – Goal Orientation)

7. How does your organization provide technical training? (Organization – Cultural Setting)
• Is any of it provided in a microlearning format? (Organization – Cultural Setting)
• Does your organization offer a variety of technical training delivery options? (Organization – Cultural Setting)
• Does your organization give you time away from your regular duties to take technical training? (Organization – Cultural Setting)
• Does your organization ever ask you to take technical training on your own time? (Organization – Cultural Setting)
• How much do you think your organization emphasizes professional training and development? (Organization – Cultural Model)
• How much do you value whether or not your organization gives you time to take technical training? (Motivation – Value)

8. Please tell me any additional feedback you would like to contribute to this study.

Appendix C

Survey Question Flow Design

Appendix D

Example of an Immediate Program Evaluation Tool (Level One and Level Two)

In this example, an organization's training team has just completed an internal program designed to enhance their understanding and application of the latest research behind effective microlearning design and evaluation practices.

Appendix E

Example of a Delayed Program Evaluation Tool

In this blended four-level evaluation example, it has been six months since an organization's training team completed an internal program on effective microlearning design and evaluation practices. A short survey is followed by the individual interview questions noted in the example.
Asset Metadata
Creator Peterson, Wendy L. (author) 
Core Title Learners’ perceptions of the microlearning format for the delivery of technical training: an evaluation study 
Contributor Electronically uploaded by the author (provenance) 
School Rossier School of Education 
Degree Doctor of Education 
Degree Program Organizational Change and Leadership (On Line) 
Publication Date 09/25/2017 
Defense Date 09/06/2017 
Publisher University of Southern California (original), University of Southern California. Libraries (digital) 
Tag elearning, KMO framework, microlearning, technical training
Language English
Advisor Hirabayashi, Kimberly (committee chair), Patel, Megan (committee member), Seli, Helena (committee member) 
Creator Email wlpeters@usc.edu,wpeterson1910@gmail.com 
Permanent Link (DOI) https://doi.org/10.25549/usctheses-c40-433710 
Unique identifier UC11265213 
Identifier etd-PetersonWe-5763.pdf (filename),usctheses-c40-433710 (legacy record id) 
Legacy Identifier etd-PetersonWe-5763.pdf 
Dmrecord 433710 
Document Type Dissertation 
Rights Peterson, Wendy L. 
Type texts
Source University of Southern California (contributing entity), University of Southern California Dissertations and Theses (collection) 
Access Conditions The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law.  Electronic access is being provided by the USC Libraries in agreement with the a... 
Repository Name University of Southern California Digital Library
Repository Location USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Abstract

The speed of innovation is challenging instructional designers to meet the individualized goals of their technical learners. One relatively new approach is microlearning: segmenting curricula into small modules (or chunks) and giving the learner control of the delivery of the content to align with their learning objectives. This study was designed to contribute insight to a field-based problem of practice: specifically, what knowledge, motivation, and organizational influences affect learners' perceptions of the microlearning format when it is applied to technical training? The study employed a modified Clark and Estes (2008) KMO framework and utilized surveys followed by personal interviews to gather data. Both convenience and snowball sampling methods were employed. Survey participants were recruited from online technical user group communities