LxD Activity Streams Overview

Learning experience design (LxD) brings together a set of activity streams: research, analysis, design, development, implementation, and evaluation.

Technically, not every one of these activity streams has to be included in a given LxD project, but all of them should be.

Often, there are several iterative cycles of each of these activity streams overlapping in nested patterns throughout the “lifetime” of any LxD project.

Figure 01 shows one way to visualize the relationships across activity streams in any given LxD project.

Figure 01. Visualizing relationships across activity streams in an LxD project.

Activity Streams

These activity streams make up the core of what any individual or team should actually do while completing any cycle of an LxD project. The streams all complement each other, especially across iterative cycles. For example, evaluation and research can feed each other, as well as future iterations of analysis, (re)design, and development.

From a big picture perspective, these are the questions addressed in each activity stream:

  • What needs to be analyzed for successful learning experiences, and how?
  • What needs to be designed for successful learning experiences, and how?
  • What kinds of systems need to be developed for successful learning experiences, and how?
  • How should learning systems be implemented in order to deliver successful learning experiences?
  • What needs to be evaluated about learning experiences and systems, and how?
  • Which research topics are relevant to any given learning experience, which methodologies are the best fit, and how should they be applied?

Each of these activity streams has a separate content module taking a deep dive into articulating the activities within the stream.  These modules can be explored in any order, but it can be a good idea to start with research, continue with analysis, and then follow with design, development, implementation, and evaluation.  Each of these activity stream content modules is intended to be regularly iterated, updated, and expanded over time, and continuously linked with additional Projects, Cases, Interviews, and Critiques.

Below I’ll briefly introduce each activity stream.

Research

In the research module, we’ll explore reasons why every learning experience should be designed for experimental research practice, regardless of who will be doing the research. 

We’ll cover research theory and practice, and explore the “state of the state” of learning experience research methods.

We’ll explore spectra of relevance for various broad research topics, such as learning, teaching, assessment, evaluation, implementation, and research, as well as content- or subject-specific topics, such as math, science, and the liberal arts.

We’ll cover the spectrum of research methodologies and approaches, including quantitative, qualitative, mixed methods, design-based research, action research, and other combinations of methods and approaches as they arise in relevance.

We’ll articulate the ways that research can and should inform analysis, design, development, implementation, and evaluation within any learning experience project (and vice versa).  This will be complemented with a discussion of the importance of methodological transparency and open/shared data in any research process.

Analysis

Analysis concerns problems, goals, and preparations, beginning with how to recognize one or more problems associated with a particular learning experience.

We’ll explore what a problem actually is, including its neutrality and its level of complexity (or “wickedness”), as well as how to identify problems: are they learning problems, teaching problems, assessment problems, etc.? We’ll continue with a discussion of how to verify that identified problems are actually problems, whether they should be solved, and if so, how soon.

Following the discussion of the nature, structure, and importance/relevance of problems, we’ll explore how to establish initial goals for any learning experience project.  These include some combination of learning goals, teaching goals, assessment goals, motivation and engagement goals, technology goals, and goals that may be unique for specific (sub)sets of stakeholders associated with the learning experience.

As you might have guessed, these initial goals will need to be refined.  We’ll use some example goals to discuss why and how this refinement can and should take place within the analysis activity stream.

Refined goals lend themselves to solution mapping as a guide for moving forward with at least one iteration of design, development, implementation, and evaluation for the learning experience.  This solution mapping can also be connected to continued research strategies.  The mapped solution can be articulated as a series of initial design sketches for things like interactions and media forms, which can usually be pretty messy.

Often these sketches will visualize angles of the problem and solution that lead to a final refinement of goals, after which it is time to move forward to the design phase of the learning experience project.  The stake is firmly in the ground at this point.
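
To make solution mapping a touch more concrete, here is a minimal sketch in Python. It is purely illustrative: the goal, problem, candidate solutions, and sketch notes are invented examples, not content from the Primer itself.

```python
# A purely illustrative solution map: one refined goal linked to the problem
# it addresses, candidate solution elements, and the early (messy) design
# sketches that explore them.
solution_map = {
    "goal": "Learners can explain photosynthesis with a labeled diagram",
    "problem": "Learners memorize terms without connecting them to the process",
    "candidate_solutions": [
        "Interactive diagram with drag-and-drop labels",
        "Short formative quiz with targeted feedback",
    ],
    "design_sketches": [
        {"form": "interaction", "note": "storyboard of the drag-and-drop flow"},
        {"form": "media", "note": "whiteboard sketch of the diagram layout"},
    ],
}

# Walking the map suggests where the next cycle of design, development,
# implementation, and evaluation effort should go.
for solution in solution_map["candidate_solutions"]:
    print(f"Carry into design: {solution}")
```

Even a structure this simple makes it obvious when a candidate solution has no goal (or no verified problem) backing it.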

We’ll conclude the analysis module with a discussion of how to use what we’ve learned from the analysis phase to help us gear up for at least one cycle of design, development, implementation, evaluation, and research for the learning experience. This discussion will include strategies for building out the team (which may morph from one phase to the next, and from cycle to cycle), as well as more pragmatic matters like using backtiming, timelines, and Gantt charts (or other visualizations) to manage each phase and cycle of the learning experience project.
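
As a small illustration of backtiming, the following Python sketch works backward from a fixed delivery date to propose start dates for each phase. The phase names and durations are hypothetical placeholders; a real project would derive them from the analysis activity stream.

```python
from datetime import date, timedelta

# Hypothetical phases and durations (in days); placeholders for illustration,
# not recommendations from the Primer.
phases = [
    ("design", 20),
    ("development", 30),
    ("implementation", 15),
    ("evaluation", 10),
]

delivery = date(2023, 6, 1)  # the fixed "stake in the ground"

# Backtiming: walk the phases in reverse so each phase ends exactly when
# the next one begins.
end = delivery
schedule = []
for name, duration in reversed(phases):
    start = end - timedelta(days=duration)
    schedule.append((name, start, end))
    end = start

for name, start, finish in reversed(schedule):
    print(f"{name:>15}: {start} -> {finish}")
```

The same computed dates can feed a Gantt chart or whatever timeline visualization the team prefers.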

To continue preparing for the evaluation phase, we’ll connect the mapped solution to assessment and any relevant learning standards, as well as an ethical treatment of any environmental impacts associated with the implementation of the learning experience.  These issues should be addressed before design begins in earnest.

Design

Our exploration of the design phase will begin with human-centered design, and whether or not this perspective is always the most important one to maintain when designing a learning experience.

This will dovetail into a discussion of participatory design, and how (and when) this approach can be applied with any of the stakeholders involved in any learning experience.

We’ll cover aspects of communication (media) design, including visual, auditory, and motion-based media (and combinations of these), as well as many different modalities of interaction design, such as clicking, touching, hearing, speaking, waving, moving, and gazing. This will lead us to user experience (UX) design and how it relates to learning experiences, treating each type of stakeholder as a “user” of the learning experience. That framing also helps us explore the concept of sociotechnical systems design.

We’ll continue with articulations of learning design, instructional design, assessment design, and research design, as well as how all four of these types of design fit together to inform each other and the development, implementation, and evaluation of any learning experience.

We’ll conclude the design discussion with an exploration of several strategies for formalizing the learning experience solution design (in preparation for development), including design documentation, prototyping, and cognitive labs.

Development

Our discussion of the development phase will begin with a general overview of development processes and cycles, including the fuzzy zones where design and development can overlap. 

For example, some elements of development can certainly begin before the design phase is completely “finished”.  This also allows us to take a brief step back for a more philosophical perspective on the relationship between design and development, especially across iterative cycles of any LxD project.

We’ll explore a generalizable model of “digital ecosystems” as a mechanism for building (developing) any designed solution for any learning experience, including the relationships among software, hardware, and services in an ecosystem of digital experiences, and how those relationships will likely differ for different groups of stakeholders or participants. This will lead us into the concept of hybrid experiences, along with a development framework built on the “hybrid simulation layer” and an understanding of how all the developed elements should work together as a machine in support of any learning experience.
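
As one hedged way to picture a digital ecosystem as a data structure, here is a small Python sketch. The element categories and stakeholder groups are assumptions made for illustration, not the Primer’s actual framework.

```python
from dataclasses import dataclass, field

@dataclass
class EcosystemElement:
    name: str
    kind: str  # "software", "hardware", or "service" (illustrative categories)

@dataclass
class DigitalEcosystem:
    elements: list[EcosystemElement] = field(default_factory=list)
    # Maps a stakeholder group to the element names it touches; the same
    # ecosystem can look quite different to learners vs. facilitators.
    stakeholder_views: dict[str, list[str]] = field(default_factory=dict)

eco = DigitalEcosystem(
    elements=[
        EcosystemElement("tablet app", "software"),
        EcosystemElement("classroom sensor kit", "hardware"),
        EcosystemElement("analytics backend", "service"),
    ],
    stakeholder_views={
        "learners": ["tablet app", "classroom sensor kit"],
        "facilitators": ["tablet app", "analytics backend"],
    },
)

for group, names in eco.stakeholder_views.items():
    print(group, "interact with:", ", ".join(names))
```

The stakeholder_views mapping is the point of the sketch: the same ecosystem presents a different working surface to each group of participants.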

Through the lens of perspectives such as the platform revolution and the fourth industrial revolution, we’ll explore the pragmatics of software development, hardware development, media development (production), and assessment development, including a general perspective for each type of development that helps us decide when to create and when to curate instead of reinventing the wheel.

Implementation

The implementation module will focus on the concept that implementation is a bridge between design, development, evaluation, and research. 

We’ll explore when and how formative and summative evidence (data) can and should be collected during implementation, and what purposes these data streams can serve, including validation of design, verification of development, and preparation for evaluation and research.  
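
To picture these data streams concretely, here is a hypothetical Python sketch that tags each piece of implementation evidence by kind and by the purpose it serves. All of the sources and purposes are invented for illustration.

```python
# Illustrative evidence records collected during one implementation cycle;
# the sources and purposes are invented examples.
evidence = [
    {"source": "mid-unit quiz", "kind": "formative", "purpose": "validate design"},
    {"source": "usability log", "kind": "formative", "purpose": "verify development"},
    {"source": "end-of-unit exam", "kind": "summative", "purpose": "prepare evaluation"},
]

# Routing records by kind makes it easy to see which stream feeds which activity.
formative = [e for e in evidence if e["kind"] == "formative"]
summative = [e for e in evidence if e["kind"] == "summative"]

print(f"{len(formative)} formative and {len(summative)} summative records")
for record in evidence:
    print(f'{record["kind"]:>9}: {record["source"]} -> {record["purpose"]}')
```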

We’ll reflect on how all of this should be part of design, and we’ll also explore the pragmatics of managing implementation teams, arenas, and groups of stakeholders during any number of iterative cycles of implementation for any learning experience.

Evaluation

The evaluation module will continue the “formative vs. summative” discussion begun in the implementation module, focusing instead on when and why to conduct formative and summative evaluations of any learning experience. It will also recognize that all summative evaluations are formative on a long enough time scale, even if someone other than the current LxD team uses the summative results to move forward with new learning experiences.

We’ll discuss in detail each of the different aspects of what is evaluated in any learning experience, including content, process, context, people, machines, information, and environments. 

We’ll conclude the evaluation module by articulating the various ways any LxD team can and should make the evaluation process useful for iteration, regardless of who is doing the iterating, including practical applications of analytics and thoughts on when and how to build and deliver reports for various stakeholder groups.
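
As a flavor of stakeholder-specific reporting, here is a small hypothetical Python sketch that filters tagged evaluation findings into per-audience reports. The aspect tags mirror the list above; the findings and audience names are invented.

```python
# Hypothetical evaluation findings, each tagged with the aspect evaluated
# and the stakeholder groups who should see it.
findings = [
    {"aspect": "content", "note": "Unit 3 readings ran too long", "audiences": {"teachers"}},
    {"aspect": "machines", "note": "Tablet battery life limited sessions", "audiences": {"admins", "teachers"}},
    {"aspect": "process", "note": "Peer review step was often skipped", "audiences": {"teachers", "designers"}},
]

def build_report(audience: str) -> list[str]:
    """Collect the findings relevant to one stakeholder group."""
    return [f'[{f["aspect"]}] {f["note"]}' for f in findings if audience in f["audiences"]]

for audience in ("teachers", "admins", "designers"):
    print(f"--- Report for {audience} ---")
    for line in build_report(audience):
        print(line)
```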

NOTE: This is a sample of one module of content that would be available in the LxD Primer course and book(s). Many of the elements of the module would be linked to other modules as relevant.

