Measurably Effective Content: Engagement, Complexity, and Curiosity

It’s 2023, and the online learning landscape has evolved greatly over the last decade. Especially in the post-pandemic era, our tooling options for creating and delivering content have become vast, along with our ability to measure that content’s effectiveness. Content creators have many authoring tools to consider in bringing their curriculum to life, and because many of those tools publish content in formats that adhere to behavior tracking standards, we can discern how effective our content actually is.

Still, off-the-shelf content creation tools may not be enough to satisfy the demand we hear from our learners for engaging experiences, and we may find ourselves needing to break free from the limitations of traditional, templatized content creation. Gaining a clear understanding of our learners and the topics they’re expected to comprehend is a crucial first step. With that clarity, we can start making informed decisions about what can be accomplished with off-the-shelf software and what we’ll need to design and build ourselves.

As content creators, we ask ourselves, “what is going to make my content the most effective?” Historically, we’ve made many assumptions at design time, and rarely (if ever) has there been any post-deployment feedback to help us answer that question. Granular quantitative data often doesn’t exist to support claims about content effectiveness one way or the other, so how do we get there? We implement a learner behavior tracking strategy that facilitates quantitative post-deployment feedback. We can then prove or disprove our earlier assumptions, make iterative content improvements, and get the updated content back in front of our learners. With this cycle of content efficacy, we can ensure we’re delivering the best experiences possible in support of our learners’ journey.

Measuring Content Effectiveness

When we’re talking about “measurably effective content,” we’re pointing at the notion of creating content, capturing how students engage and interact with it, and analyzing that data to determine if our content effectively conveys the learning objective. To give an example, let’s start with a simple piece of content: a video. Within the context of a course page, a video can trigger a handful of events based on how the learner is interacting with it:

  • The video is scrolled into view within the learner’s browser
  • The learner clicks the play button
  • The learner presses pause
  • The learner scrubs through the video
  • The learner reaches the end of the video
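
Each of these events maps onto standard browser APIs. Here’s a minimal sketch of how they might be wired up, assuming a hypothetical track() helper (all function names here are invented) that forwards events to whatever analytics pipeline is in place:

    // Minimal sketch: wiring HTMLVideoElement events to a hypothetical
    // track() helper that forwards them to an analytics pipeline.
    function track(event: string, detail: Record<string, unknown> = {}): void {
      console.log(event, detail); // placeholder: would queue an xAPI/Caliper statement
    }

    function instrumentVideo(video: HTMLVideoElement): void {
      // Fires once when the video first scrolls into the learner's viewport.
      const observer = new IntersectionObserver((entries) => {
        if (entries.some((e) => e.isIntersecting)) {
          track("video-viewed");
          observer.disconnect();
        }
      });
      observer.observe(video);

      video.addEventListener("play", () => track("video-played", { at: video.currentTime }));
      video.addEventListener("pause", () => track("video-paused", { at: video.currentTime }));
      video.addEventListener("seeked", () => track("video-scrubbed", { to: video.currentTime }));
      video.addEventListener("ended", () => track("video-completed"));
    }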


As we can see, much data about the learner’s interactions can be collected with a simple piece of content. This behavior capture pattern can be extrapolated to all sorts of activities and simulations. Imagine a more complex activity like an Electrical Engineering simulation, where learners might configure a virtual multimeter to test the wiring in a simulated house. We could capture multimeter settings, which outlets they are testing, how long they spent in a particular room, what other tools they equip and when they equip them, etc. The more data we can collect about the learner’s interactions, the better.
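
Before that data is translated into a tracking standard, it helps to model each interaction as a structured record. Here’s a hedged sketch of what such an event might look like for the multimeter simulation; every field name is invented for illustration, not drawn from any particular product:

    // Hypothetical event shape for the multimeter simulation.
    interface SimulationEvent {
      learnerId: string;
      timestamp: string; // ISO 8601
      action: "configure-tool" | "test-outlet" | "enter-room" | "equip-tool";
      detail: {
        tool?: string;    // e.g. "multimeter"
        setting?: string; // e.g. "AC voltage, 200V range"
        outletId?: string;
        room?: string;
        secondsInRoom?: number;
      };
    }

    const example: SimulationEvent = {
      learnerId: "learner-42",
      timestamp: new Date().toISOString(),
      action: "test-outlet",
      detail: { tool: "multimeter", setting: "AC voltage, 200V range", outletId: "kitchen-3" },
    };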

All this data we’re collecting is formatted against standards like xAPI and Caliper, and makes its way to a Learning Record Store (LRS) where Data Analysts get to have their fun. And nothing is more fun to a Data Analyst than deriving meaningful insights from a huge swath of data. 
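For the video example, a single “completed” event expressed as an xAPI statement might look like the sketch below. The LRS endpoint, credentials, and content IDs are all placeholders; the headers and statement shape follow the xAPI specification:

    // Sends one xAPI statement: "the learner completed the video."
    async function recordVideoCompleted(): Promise<void> {
      const statement = {
        actor: { mbox: "mailto:learner@example.com", name: "Example Learner" },
        verb: {
          id: "http://adlnet.gov/expapi/verbs/completed",
          display: { "en-US": "completed" },
        },
        object: {
          id: "https://example.com/courses/101/intro-video",
          definition: { name: { "en-US": "Course 101 Intro Video" } },
        },
      };

      await fetch("https://lrs.example.com/xapi/statements", {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "X-Experience-API-Version": "1.0.3", // required header per the xAPI spec
          Authorization: "Basic " + btoa("key:secret"), // placeholder credentials
        },
        body: JSON.stringify(statement),
      });
    }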

Let’s go back to the video example. How can we tell if the video effectively conveys its message? After analyzing the data, our Data Analysts report back that of the 5,000 learners who were presented with the video (it scrolled into view), 95% clicked play. Of those 4,750 viewers, only 56% (about 2,660) made it to the end of the video, and 35% of those finishers scrubbed ahead starting about halfway through. That leaves roughly 1,730 learners, about 35% of the original 5,000, who watched the video end-to-end without scrubbing ahead. Meanwhile, a mere 1.3% of all learners scrubbed backward in the video.
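
The funnel arithmetic behind those percentages is simple enough to sanity-check:

    // Sanity-checking the funnel with the hypothetical numbers above.
    const presented = 5000;
    const played = presented * 0.95;               // 4,750 clicked play
    const finished = played * 0.56;                // ~2,660 reached the end
    const finishedNoScrub = finished * (1 - 0.35); // ~1,730 never scrubbed ahead

    console.log(`${((finishedNoScrub / presented) * 100).toFixed(1)}% watched end-to-end`); // ~34.6%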

So, what conclusions about this video can we draw? Likely that it’s not a very engaging video. It may not get to the point fast enough, or it may simply be too long. Given that very few learners went back in the video to revisit information, it may not contain much content worth revisiting. Regardless, it’s clear from the data that this video isn’t adding value to the learner’s experience, and now that we know, we can go back and improve our content.

Consider, too, that all content interactions can (and should) be tracked. Our video example is just one content item among the many that make up the overall learning experience. Our learners may have found little value in the video, but still aced the final exam. We know this because our behavior data gives us both big-picture and very granular insights into learner and content performance.

Although implementing a learner behavior tracking strategy may be the most effective way to measure our content’s impact, it’s not the only way to ensure quality content. Since this strategy only captures learner data after the content is available to learners, it’s up to Learning Experience Designers (LXDs) to make assumptions about the content’s impact at design time.

Designing for Learner Engagement

Understanding our target audience and their learning needs is the first step in designing engaging content. LXDs often create personas for various types of learners to build a clear picture of expected engagement, from which they can make informed design decisions. Young learners might benefit from a vibrant color palette, limited but focused information on the screen, and more prominent interaction elements. Higher education learners might prefer more subtle colors and more information on the page, combined with instructional videos and activities that apply (metaphorically or directly) to real-world scenarios.

LXDs also understand how important accessibility is, and a lot can be accomplished throughout the design process to ensure our content can be consumed by all learners, especially those with disabilities. Validating a design’s color and contrast against Web Content Accessibility Guidelines (WCAG) standards, implementing keyboard navigation, and supporting screen readers all help ensure that visually impaired learners can read what’s on the page. Providing alternate font choices and information display configuration features can help learners with dyslexia and ADHD more readily absorb content while staying focused. By adhering to Universal Design principles, many features and design choices improve the learning experience for everyone, not just those with disabilities. A great example of this is the widespread adoption of “Dark Mode” designs in websites, software, and even our computers’ operating systems. Those of us on our computers after dark certainly appreciate this feature where it’s been implemented (and immediately crank the screen brightness down where it’s not).
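
Color contrast is one of the easier of these checks to automate. Here’s a minimal sketch of the WCAG 2.x contrast-ratio calculation, with colors given as 0-255 sRGB triples (the function names are our own):

    // WCAG 2.x relative luminance and contrast ratio for sRGB colors.
    type RGB = [number, number, number]; // 0-255 per channel

    function relativeLuminance([r, g, b]: RGB): number {
      const [R, G, B] = [r, g, b].map((c) => {
        const s = c / 255;
        // Linearize each sRGB channel per the WCAG definition.
        return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
      });
      return 0.2126 * R + 0.7152 * G + 0.0722 * B;
    }

    function contrastRatio(a: RGB, b: RGB): number {
      const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
      return (hi + 0.05) / (lo + 0.05);
    }

    // WCAG AA requires at least 4.5:1 for normal-size text.
    console.log(contrastRatio([255, 255, 255], [118, 118, 118]).toFixed(2)); // ≈ 4.54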

These days, we’re living in an interactive world, where our learners expect much more than pages of text sprinkled with static graphics and (long, explanatory) videos. Although many learning content platforms provide Tabs, Accordions, Carousels/Slideshows, and other layout components to help break up the monotony of scrolling, they’re not enough to hold most learners’ attention, let alone spark their curiosity. It’s that curiosity we want from the outset of the learner’s journey. The book Grasp by Sanjay Sarma and Luke Yoquinto explores the role curiosity plays in our “readiness to learn,” presenting research that curiosity helps us better remember new information: the more curious you are about a topic, the more likely you are to remember it. Subtle interface effects and animated element state transitions can go a long way toward delighting our learners and giving rise to their curiosity.

[Animated interface examples. Credit: Framer Motion and Rive.]
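
To give a sense of how little code a subtle effect can take, here’s a hedged sketch using Framer Motion’s React API; the component and its content are invented for the example:

    // A card that gently lifts on hover and compresses on press, using
    // Framer Motion. whileHover/whileTap apply the effect declaratively,
    // and the spring transition keeps the motion natural rather than linear.
    import { motion } from "framer-motion";

    export function LessonCard({ title }: { title: string }) {
      return (
        <motion.div
          whileHover={{ scale: 1.03, y: -4 }}
          whileTap={{ scale: 0.98 }}
          transition={{ type: "spring", stiffness: 300, damping: 20 }}
        >
          <h3>{title}</h3>
        </motion.div>
      );
    }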

As our learners’ demand for more exciting and engaging content grows, so too must our ability to design and implement playful interfaces and inviting, encouraging activities, all the while ensuring that every bit of the experience is accessible and that the desire to continue discovering isn’t lost, no matter how the learner experiences our content.

A Formative Experience

A good LXD knows how to construct an experience that guides the learner through a learning objective in a formative way. Formative feedback is feedback provided during the learning process to guide the learner, offer correction, and support their ongoing development. It's more immediate than summative feedback and aims to improve understanding and performance. Let’s play with an example:

The learner is presented with an interactive diagram of an atom and is asked to label its main components: protons, neutrons, and electrons. 

Once they begin interacting with the activity, they would receive the following formative feedback:

Correctly labeling the protons in the nucleus:
"Excellent! You've correctly identified the protons. Remember, protons have a positive charge and are found in the nucleus of the atom."

Mistakenly labeling electrons as being inside the nucleus:
"Not quite. Electrons orbit the nucleus and are not found inside it. Try again!"

The learner hesitates or seems unsure about where to place the neutron label, so a hint appears:
"Neutrons are neutral particles. Think about where in the atom you might find particles with no charge."

Correctly labeling all parts, reinforcing the learning:
"Great job labeling the components of the atom! Did you know that the number of protons in an atom determines its elemental identity? For example, an atom with 1 proton is hydrogen, while an atom with 2 protons is helium."

The learner progresses:
"Now that you've labeled the basic components, can you adjust the number of protons, neutrons, and electrons to represent a carbon atom?"

Feedback after attempting: 
"Well done! A carbon atom has 6 protons, 6 neutrons, and 6 electrons. You've got it right!"

These formative feedback examples are designed to guide the learner in real time, offering corrections, hints, and additional information to enhance understanding and keep them on the right track. A well-constructed formative experience engages our learners’ curiosity, guiding and encouraging them throughout their journey.
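
Under the hood, the feedback logic for an activity like this can stay quite simple. Here’s a minimal sketch, with all type and function names invented for the example:

    // Hypothetical feedback table for the atom-labeling activity.
    type Particle = "proton" | "neutron" | "electron";
    type Region = "nucleus" | "orbit";

    const correctRegion: Record<Particle, Region> = {
      proton: "nucleus",
      neutron: "nucleus",
      electron: "orbit",
    };

    function feedback(label: Particle, placedIn: Region): string {
      if (correctRegion[label] === placedIn) {
        return label === "proton"
          ? "Excellent! You've correctly identified the protons. Remember, protons have a positive charge and are found in the nucleus of the atom."
          : `Correct! ${label}s belong in the ${placedIn}.`;
      }
      if (label === "electron" && placedIn === "nucleus") {
        return "Not quite. Electrons orbit the nucleus and are not found inside it. Try again!";
      }
      return `Not quite. Think about where ${label}s are found, and try again!`;
    }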

Activities like this certainly present more complex accessibility challenges. How might a visually impaired user navigating entirely through a keyboard and screen reader experience this activity? It’s entirely possible to present screen reader users with an audio-only version of this activity, where expanded descriptions of exactly what’s happening are announced. Imagine hearing: “There are eight atomic components grouped together in the middle, and four atomic components orbiting this group. Within the middle group, four components are blue and labeled with the letter N, while the other four are red and labeled with a plus sign. The four orbiting components are green and labeled with a minus sign. Of these three groups, which do you think are the protons? Press 1 for ‘Blue N’, 2 for ‘Red Plus’, or 3 for ‘Green Minus’.” Pressing 2, the user hears the same feedback displayed to our sighted learners: “Excellent! You’ve correctly identified the protons. Remember, protons have a positive charge and are found in the nucleus of the atom.” In this way, we’re presenting the activity as an interactive story, giving audio-only learners a way to interact and stay engaged.
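
Here’s a hedged sketch of that audio-only interaction, using a standard ARIA live region so the screen reader announces each response. The announcer element ID is invented; the key choices follow the narration above:

    // Announce text through an aria-live region so screen readers read it aloud.
    // Assumes the page contains: <div id="announcer" aria-live="polite"></div>
    function announce(text: string): void {
      const region = document.getElementById("announcer");
      if (region) region.textContent = text;
    }

    const choices: Record<string, string> = {
      "1": "Blue N",      // neutrons
      "2": "Red Plus",    // protons (the correct answer)
      "3": "Green Minus", // electrons
    };

    document.addEventListener("keydown", (e) => {
      const choice = choices[e.key];
      if (!choice) return;
      announce(
        choice === "Red Plus"
          ? "Excellent! You've correctly identified the protons. Remember, protons have a positive charge and are found in the nucleus of the atom."
          : `You chose ${choice}. Not quite. Try again!`
      );
    });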

Finally, let’s take a moment to consider what the captured interaction data might say about this activity. Were learners engaged? Were our hints and feedback encouraging enough? How are learners performing on exams after completing this activity? Do we have data from before and after this activity was introduced to compare?

To Build or To Buy

Usually, the decision of whether to design and build your own content or to work with existing learning content creation tools comes down to budget and timelines. With the “buy” approach, a multitude of tools are available (at a reasonable cost) that can greatly enhance the learning experience, shorten time to market, and deliver content in formats suitable for collecting learner interaction data. Here are five popular interactive content creation tools, all of which provide LMS integrations for tracking learner interactions:

  • H5P touts an expansive catalog of interactive content types
  • Articulate Rise offers an easy entry for creating engaging course content
  • Articulate Storyline offers more flexibility than Rise, but with a steeper learning curve
  • iSpring Suite offers many powerful features and a PowerPoint-like authoring experience
  • BranchTrack enables storytelling through customizable, branching scenarios

This list is but a fraction of the eLearning content tools available in the market. If you’re exploring the “buy” approach, ensure that whatever tools you select suit the needs of your learners and content creators, and that the content they produce can be configured to track learner interactions.

With the “build” approach, LXDs have the freedom to use design tools of their choosing (Figma is a popular choice), and User Experience (UX) Engineers are empowered with an almost inexhaustible list of frontend technologies with which to implement the design vision. If we’re building our own experiences, our engineering team likely already understands the technical infrastructure required to get those experiences in front of the learner and to collect and store interaction data.

Finally, “to build or to buy” is a bit of a false choice, as there are few hurdles in the way of taking both approaches. With a solid content management and delivery strategy, we can deliver vibrant courseware containing content created with off-the-shelf tools alongside rich custom-built simulations and activities.

Hot Take(Away)s

Unicon has a lot of great options when it comes to creating content experiences that delight our learners. We want our learners’ journey to be engaging, formative, and accessible, and we want to be able to prove our content’s effectiveness.

The more we can discern about a learner’s journey through our content, the better equipped we are to measure its effectiveness. Through data analysis, many insights can be gleaned about how impactful content is and how it might be improved to increase learner performance.

Ensure that content designs cater to the target audience, are accessible, and follow Universal Design principles to be as inclusive as possible, enabling a positive experience for all learners. Remember that catering to the outliers often benefits everyone.

Spark learners’ curiosity with playful interactions. Implement user interface elements in ways that invite engagement rather than require it. Subtle animations and transitions go a long way here.

Be formative in approach. Ensure that activities provide frequent and encouraging feedback. There is no “wrong” way to learn, and any mistakes the learner makes should provide feedback that guides the learner to a deeper understanding of the topic, to the point that making mistakes is encouraged.

The “Buy” or “Build” decision can be combined into a hybrid approach to deliver effective and measurable content. With a thoughtful content strategy and engaging experience design, we craft online learning that captivates and delights. By tapping into principles of curiosity and playfulness, we transform static content into dynamic journeys of discovery for our learners.

Phillip Ball

Learning Technology Architect
Phillip Ball is a Learning Experience Design Solutions Architect who has been with Unicon for over 10 years. Mr. Ball has worked on projects for many clients, honing his skills in User Experience Design and Software Development. Throughout his career at Unicon, Mr. Ball has worked closely with decision-makers in the edtech space, ensuring they have the details they need to make informed decisions. He particularly enjoys the implementation of design, taking pride in knowing his work has an impact on the learning experience. Mr. Ball holds a Bachelor of Arts degree with a focus on Visual Communication. He enjoys researching new trends in technology and how they might benefit Unicon’s work in education.