Learning experience network analysis for design-based research

Jonan Phillip Donaldson (Center for Teaching Excellence, Texas A&M University College Station, College Station, Texas, USA and School of Education, The University of Alabama at Birmingham, Birmingham, Alabama, USA)
Ahreum Han (Department of Curriculum and Instruction, University of Illinois Chicago, Chicago, Illinois, USA)
Shulong Yan (School of Education – Center for Community and Citizen Science, University of California Davis, Davis, California, USA)
Seiyon Lee (Graduate School of Education, University of Pennsylvania, Philadelphia, Pennsylvania, USA)
Sean Kao (Department of Educational Psychology, Texas A&M University, College Station, Texas, USA)

Information and Learning Sciences

ISSN: 2398-5348

Article publication date: 11 December 2023

Issue publication date: 15 January 2024

Abstract

Purpose

Design-based research (DBR) involves multiple iterations, and innovations are needed in analytical methods for understanding how learners experience a learning experience in ways that both embrace the complexity of learning and allow for data-driven changes to the design of the learning experience between iterations. The purpose of this paper is to propose a method of crafting design moves in DBR using network analysis.

Design/methodology/approach

This paper introduces learning experience network analysis (LENA) to allow researchers to investigate the multiple interdependencies between aspects of learner experiences, and to craft design moves that leverage the relationships between struggles, what worked and experiences aligned with principles from theory.

Findings

The use of network analysis is a promising method of crafting data-driven design changes between iterations in DBR. The LENA process developed by the authors may serve as inspiration for other researchers to develop even more powerful methodological innovations.

Research limitations/implications

LENA may provide design-based researchers with a new approach to analyzing learner experiences and crafting data-driven design moves in a way that honors the complexity of learning.

Practical implications

LENA may provide novice design-based researchers with a structured and easy-to-use method of crafting design moves informed by patterns emergent in the data.

Originality/value

To the best of the authors’ knowledge, this paper is the first to propose a method for using network analysis of qualitative learning experience data for DBR.

Citation

Donaldson, J.P., Han, A., Yan, S., Lee, S. and Kao, S. (2024), "Learning experience network analysis for design-based research", Information and Learning Sciences, Vol. 125 No. 1/2, pp. 22-43. https://doi.org/10.1108/ILS-03-2023-0026

Publisher: Emerald Publishing Limited

Copyright © 2023, Jonan Phillip Donaldson, Ahreum Han, Shulong Yan, Seiyon Lee and Sean Kao.

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction and background

The idea of learning experience by design is central to the learning sciences, marking design-based research (DBR) as one of the signature methods of the field (Hoadley, 2018; Barab, 2006; Fishman et al., 2004; Fischer et al., 2018). It is also an important methodology at the intersection of the information sciences and the learning sciences (e.g. Dennerlein et al., 2020; Howell et al., 2023; Jayatilleke et al., 2018; Proctor and Blikstein, 2019). DBR is an iterative process that involves collaborative efforts of researchers and practitioners to design, develop, implement and evaluate (Brown, 1992; Collins, 1992). Originally stemming from the need for “usable knowledge” for learners and educators (Lagemann, 2002), it offers a unique opportunity to examine how a theory-informed design plays out in practice, evaluate how different variables interact in real-world settings and extract key principles for future implementation. In essence, DBR allows for the development of both theory and design principles, thereby bridging the traditional gap between theory and practice (Campanella and Penuel, 2021). The last three decades have seen the emergence of related methodologies in the learning sciences such as change laboratory (Sannino, 2008), social design experiment (Gutiérrez, 2016) and participatory design research (Bang and Vossoughi, 2016) that share paradigmatic and analytical features with DBR while remaining distinct methodologies (Clark, 2022; Sannino et al., 2016; Vogelstein, 2022). This paper describes our first steps toward an innovative analytical approach for analyzing learner experience data in a way that helps design-based researchers develop data-driven design changes. This learning experience network analysis (LENA) method can be used by design-based researchers to develop design changes from one iteration to the next as part of their DBR methodological toolkit. Our hope is that other researchers will take our work as a starting point for further development of this approach to network analysis in their DBR studies.

DBR complements existing methods in educational research in which design elements and an assessment thereof play a critical role (Collins et al., 2004; Obrenović, 2011). Typically, DBR starts with a needs analysis and theory selection, development of design principles derived from the selected theories and design of the learning experience. After data has been collected during the implementation, data analysis yields design principles or “moves” for the redesign. The design is iteratively refined as key findings are translated into a more nuanced understanding of a theory, development of theory-grounded design principles or even the birth of a new theory. Thus, there is an assumption of an evolution or trajectory in design principles, and multiple iterations are essential (Zheng, 2015). A growing body of researchers has paid attention to conjecture mapping (Sandoval, 2014) as a primary means of generating design principles in DBR. The challenge for researchers is to identify the strengths and weaknesses in a design and, most importantly, the barriers to the affordances and opportunities in the design.

While DBR has garnered interest for its merits, there has also been debate about its limitations, particularly regarding methodological concerns. Scholars have called for reflection focused on establishing concrete metrics as benchmarks that collaborators of different backgrounds and purposes can use as they converse and converge toward successful DBR (Dede, 2004). One of the key challenges concerns how decisions are made from one iteration of the design to the next. Often, the expertise of the researchers and practitioners is assumed to be the guiding force, as they leverage intuition amassed over years of practice (Obrenović, 2011). Therefore, design-based researchers must be attentive to research bias in cherry-picking data that fits the researcher’s existing framework, research goals and assumptions; reactivity in terms of how the presence of the researcher affects the settings under study; and reflexivity regarding their own beliefs, values, life experiences and professional practices (Hammersley and Atkinson, 2007; Maxwell, 2012). Although researcher expertise can be important in analyzing learning experience data to craft design changes, innovations in analytical approaches are needed to facilitate the reflective conversation (Schön, 1992) between the researcher and the design while simultaneously reducing the potential for research bias and reactivity. Furthermore, it often remains unclear how findings can be translated into specific design moves, which presents a greater problem considering that researchers may be involved at varying levels and periods (Cobb et al., 2003). Analytical innovations are needed to provide structure for novices, experts from other fields and those who have no access to formal training in DBR methodology to confidently craft data-driven design moves.

Since the beginning of the field, learning scientists have focused on designs for learning, specifically through design research that investigates how designs for real-life settings are experienced and how these experiences relate to theory. Learning theories produce guiding principles to ground our designs, and analysis of learner experiences of our designs helps us further develop these theories. Theories, particularly in design-based work, “must do real design work in generating, selecting and validating design alternatives at the level at which they are consequential for learning” (diSessa and Cobb, 2004, p. 77). To accommodate this need, we need expansive and holistic theories and methodologies that enable researchers to comprehend how theories come alive and interact through the design and implementation of learning (diSessa and Cobb, 2004). Our own commitment to embracing the complexity and messiness of learning led us to conclude that reductionist methodologies (e.g. analyzing isolated variables) would not align with our ontological stance regarding complexity and the deeply contextualized nature of learning. Thus, we propose the development of new methods for generating data-grounded design moves within the context of DBR studies and describe our early formulation of one such method. We focus our attention on the complex interdependencies among learner experiences (what worked well, struggles and alignment with principles from learning theories) as a source of actionable insights that can formally legitimize the often elusive process of crafting design moves in DBR, and as an affordance for both systematizing and streamlining that process. Our work has pushed us to begin development of one such method that leverages network analysis.

Network analysis

Complex systems theory provides a frame through which to understand phenomena as systems consisting of many interdependent elements. Interactions between these elements lead to emergence that cannot be understood through analytical methods that rely on assumptions of linear causality (Bar-Yam, 2003). Complex systems tend to exist in nested hierarchies, and the elements in a complex system are often emergent from another complex system. How learners experience different aspects of a design for learning can be seen as emergent phenomena that interact as elements in other complex systems (Hilpert and Marchand, 2018). A complex systems approach allows DBR researchers to embrace the complexity of learning. Network analysis is an important analytical tool for understanding complex systems.

A network is a structure that shows how objects (nodes) are related to each other (links). Researchers in many fields have found network analysis useful in different settings because nodes can represent a wide range of elements, from people, locations, organizations and cells to beliefs and concepts (Menczer et al., 2020). This unique capability of network analysis allows researchers to examine structures or interrelations within a particular complex system (Borgatti et al., 2009; Luke and Harris, 2007; Menczer et al., 2020). For example, social scientists have widely used social network analysis (SNA), in which nodes represent people, to investigate patterns of social relations that may not be visible through other research methods (Borgatti et al., 2009). Likewise, educational researchers are now actively exploring the potential benefits of using network analysis as a mode of inquiry.

In all types of network analysis, the focus is on the patterns of relationships between nodes. The three most frequently used types of network analysis in the field of education are SNA, semantic network analysis (SemNA) and epistemic network analysis (ENA). In SNA, a node is usually a person and a map represents a group (network) of interconnected people (Scott, 2017). The analysis aims to illustrate patterns of relationships among actors to examine the effects of the structures on the actors and the effects of actors on structures (Borgatti, 2009; Martínez et al., 2003). In SemNA, a node is usually a word, and a map represents a group of interconnected words (Christensen and Kenett, 2021). This type of network analysis examines “paired associations based on shared meaning as opposed to paired associations of behavioral or perceived communication links” (Doerfel, 1998, p. 16). In ENA, a node is an idea, and a network map represents an epistemic frame (Shaffer, 2018). The focus of ENA is on concepts or people who share similar cognitive framing to reveal the patterns of cognitive connections (Pantić et al., 2022; Shaffer et al., 2016).

Network analysis has been used in DBR projects. Van Staden and Van Der Westhuizen (2013) used SNA to identify collaboration problems in a DBR study investigating a program through which teachers developed skills in technologies for teaching and learning. Bagley and Shaffer (2015), in DBR studying virtual mentoring, used ENA (along with other analytical methods) to investigate the impact of the intervention. Chen et al. (2018) used SNA to provide teachers with feedback about their students’ interactions in online discussions. More recently, SNA was used as a complement to critical discourse analysis in several DBR studies investigating designs for collaborative modeling and engagement in scientific argumentation (Ryu, 2020). Ouyang et al. (2021) used SNA, topic network analysis and cognitive network analysis as visual tools to improve learners’ social-cognitive engagement in a DBR study.

While DBR researchers have used network analysis in the ways mentioned above, we note that network analysis has not typically been used as a means of crafting changes to the design of the next iteration. For example, SNA could be used in DBR to understand how learners are interacting, but this information may be difficult to translate into specific design moves. Similarly, ENA can help DBR researchers understand changes in epistemic frames, but again it might be difficult to translate such findings directly into a set of design moves.

What we felt we needed for our own DBR studies was a network analysis method developed intentionally for helping DBR researchers craft specific design moves. Each feature of a design in a DBR study is experienced by learners, and design moves over multiple iterations are needed to bring into alignment the experience intended by the design team and the experience as lived by learners. Therefore, we believe that setting our focus on understanding the many different ways in which the design is experienced suggests a network analysis in which each node of the network represents a specific type of experience, and the patterns of relationships between many such experiences help us understand the complexity of how the design is experienced.

Latour’s (1996) actor-network theory (ANT) provides a valuable perspective on the relationships between aspects of a design and the ways in which those aspects are experienced. Based on his observations of how scientists work, Latour argued that science is done by scientists together with the tools they use, acting as a network. By tracing the trajectory of the network created by actors (humans) and tools or objects (nonhuman elements), Latour challenged the “objective” view of scientific knowledge and argued that knowledge becomes stable through closure: a balanced set of relations among the functions and positions of each actor in the network. At the same time, ANT is a grounded approach that seeks to rebuild theory based on the network that emerges from collective actions (Latour, 1996). The purpose of using the framing of networks was to avoid the Cartesian divide between matter and spirit. The goal of taking an ANT perspective is not to create a structure based on linear causal relationships between entities but to deconstruct and map the connections between entities that give birth to a specific phenomenon. From a sociocultural perspective, DBR recognizes the need to shift the locus of “experiment” to real classroom settings rather than limiting it to the controlled environments of traditional psychological experiments. Traditional psychological interventions often failed to translate successfully to actual environments because they relied on reductionist methods that simplified conditions for the experiment’s sake and excluded far more complex elements that fell outside the theoretical perspectives or researchers’ interests. In other words, DBR was developed by learning scientists seeking new ways of embracing the complexity and messiness of learning in holistic and expansive ways.

Informed by ANT, Sørensen (2009) argued that defining the elements of a network in a complex system as either social or material is unproductive because of their intricate relations. Sørensen’s study deconstructed how a virtual environment came to be and argued that the term “materiality,” as a new conceptual tool, could bridge the divide between the social and the material. As she put it, “materiality refer[s] to the achieved quality of a hybrid that allows it to relate to other parts” (p. 61). In a DBR setting, we argue for seeking the materiality of students’ experiences as emergent phenomena that come into being through the interdependent connections created by enactments of the theoretical underpinnings, the instructors’ teaching and classroom configurations, researchers’ research interests, the learning designs and much more. From this materialist epistemology perspective, the students’ direct experience would not come to be without the networked connections or the tangible assemblages (Nathan and Swart, 2021).

In much DBR, researchers tend to be interested in how people experience the design and analyze these experiences toward two goals:

  1. to identify aspects of the design which could be changed to better achieve the desired outcomes of the design; and

  2. to identify aspects of user experiences that can be interpreted through theory or to build theory.

DBR requires that the initial design be grounded in theory, that design moves be informed by theory and that overall findings speak back to the theory in which the design was grounded or develop theory (Anderson and Shattuck, 2012). The principle of leveraging the strengths of a design to address its weaknesses is a fundamental approach in the design sciences, based on the argument that weaknesses and strengths are interrelated (Norman, 2013). Our own perspective seeks to analyze not only the relationships between how learners experience strengths and weaknesses of designs for learning, but also these aspects in relation to aspects of learner experiences that align with principles from learning theories. Furthermore, we seek to understand the complexity of the numerous interdependencies in these relationships. There is great potential for innovations using network analysis for these purposes.

Although there are studies in which network analysis has been part of DBR studies, we have not identified any studies that described in detail how network analysis could be used as the primary analytical method through which design moves were constructed. ENA is the closest relative to the method we propose, particularly because it is a network analysis approach often used in the learning sciences and one that has recently gained traction in DBR studies. The studies we identified in the literature used this analytical method to understand learners’ thinking when they engage in constructing design moves (e.g. Arastoopour Irgens, 2021), to track design changes (e.g. Gomez, 2021), or to analyze learning (e.g. Barany et al., 2020; Barany et al., 2021). In these studies, ENA did not appear to be the primary means for the development of design changes. Conversely, some design-based researchers code data in qualitative analysis software and then construct concept maps, but without the use of network analysis (e.g. Parmaxi and Zaphiris, 2020). What we develop in this paper is a novel approach that contributes to the DBR toolkit by using a new variation of network analysis to complement others in the DBR toolkit such as ENA. While quantitative ethnography methods (Zörgő et al., 2022) such as ENA excel at helping researchers understand epistemic aspects of learning experiences, such as identifying and describing epistemic frames, comparing epistemic frames across people, groups or contexts and tracking changes in epistemic frames over time (Arastoopour Irgens and Eagan, 2023; Zörgő et al., 2022), our method seeks to help researchers understand ontological aspects such as how the design of learning is experienced. In other words, ENA is most powerful when we want to understand learning itself, whereas our method is useful when we want to understand the complex patterns of interdependencies between various experiences of learners engaging in a designed learning process. Although both are types of network analysis, there are differences. For instance, in ENA maps the coordinates have meaning, but our method is similar to traditional SNA maps in which the locations of nodes have no meaning beyond proximity and connections to other nodes. Our networks are not weighted, and a larger number of nodes is desirable, while in ENA the networks are weighted and a smaller number of nodes is usually desirable (Tan et al., 2022). Therefore, we see our method as a synergistic companion to other analytical approaches in DBR such as ENA rather than an alternative.

In our DBR projects, the designs are intended to produce learning experiences. Therefore, we seek to understand the complex experiences of learners as they engage in the learning experiences we have designed. In line with the ontological and epistemological assumptions of ANT, we felt the need to develop a new network analysis methodology in which each node is a unique experience (or type of experience) as experienced by learners, and the network map is the overall learning experience as experienced by the group of learners. We use three categories of learning experience. To meet the DBR need for understanding aspects of the design which could be improved, we need to understand aspects of the design in which learners struggled or otherwise experienced as problematic (category 1). We also need to understand aspects in which learners had positive experiences (category 2) to leverage strengths of the design to address aspects in which learners struggled. To meet the DBR need for understanding learner experiences through the lenses of the theories in which the design was grounded, we need to understand learner experiences which align with particular principles derived from those theories (category 3). Therefore, in our LENA approach, each node represents a particular learner experience, and these experiences are categorized as struggles, strengths and theoretical principles. Relationships and clusters of relationships between these nodes help us understand the complex interdependencies at play in how designs for learning are experienced and enable us to make data-informed design changes.

The purpose, reasoning and logic behind learning experience network analysis

We need new methods of analyzing data in DBR that leverage and embrace the complexity of the learning experience, methods that are not themselves so complex as to be difficult for researchers to implement.

Driving principles for learning experience network analysis in design-based research

LENA was developed over the course of many DBR projects in multiple settings including a middle school science program, faculty development programs and university courses in many disciplines including animal science, environmental science, engineering, literature, veterinary medicine, leadership, tourism, psychology and education. The current formalization of LENA occurred through a pre-conference workshop and subsequent collaboration between the facilitators of the workshop and workshop participants.

We wanted to develop analytical methods for use in DBR studies, methods that embrace and illuminate the complexity of learning. Learning environments and the nature of learning itself involve many complex systems in nested levels of interacting complex systems. How learners experience the learning experiences we develop in DBR projects involves numerous types of experiences and interpretations of those experiences that form the interdependent elements of a complex system. Network analysis allows for expansive approaches to holistically understanding the relationships between learning experiences. LENA is concerned only with learner experiences because it is intended solely for generating changes to learning experience designs in DBR studies, and it therefore uses only data that can speak to these experiences as directly as possible. We seek to use data that is as close as possible to authentically and directly representing the experiences of learners, with as little need for interpretation by the researchers as possible.

We started developing LENA by looking only at network maps of correlated struggles and strengths because we wanted to leverage aspects that worked to address struggles. For instance, in Donaldson et al. (2022a, 2022b) and Ganvir and Donaldson (2022), the design was semester-long collaborative project-based learning in environmental science, leadership, education and tourism courses, and network analysis of these relationships helped us construct design changes to address collaboration issues, relevance issues and framing issues. In the Zhao et al. (2022) DBR study of an argumentation activity, analysis resulted in design moves of increasing the range of topics, providing more sources of information and more detailed guidance regarding expectations. During a preconference workshop (Donaldson et al., 2022c) along with work in subsequent studies, we recognized that design moves in DBR should be grounded not only in evidence from the context, but also in principles from learning theories. Subsequently, we have created network maps that include not only aspects that students experienced as working well and aspects in which students experienced struggles, but also aspects of their experiences that aligned with principles from the learning theories in which the design was grounded. For instance, in the Adam et al. (2023) DBR study investigating competency-based grading, we included codes for principles from self-determination theory. In a study on generative game-based learning (Cooper et al., 2023) the network maps included principles from constructivist and constructionist theories, situated learning theory and transformative learning theory.

Design-based research studies where learning experience network analysis is appropriate

LENA may not be suitable for all DBR studies. This type of analysis is appropriate only for investigating the direct experience of learners engaging with a learning experience design, rather than for other types of DBR such as design-based implementation research (Fishman and Penuel, 2018) or DBR focused on infrastructure at the research-practice partnership systems level (Penuel, 2019). It is most appropriate in learning contexts where there is a large enough number of participants, which in our experience tends to be at least 10. Data from fewer than 10 participants makes constructing network maps difficult since the method relies on code co-occurrences or correlations. Furthermore, theoretical saturation usually occurs with more than 12 participants (Guest et al., 2006). It is also important that the context is such that it is possible to gather rich qualitative data from learners about their experiences. Although there is no standard measure for the richness of qualitative data (Ames et al., 2019), we deem data to be rich when it consists of multiple paragraphs of text provided in response to questions specifically targeting learner experiences.

Applying learning experience network analysis

The LENA method is fairly straightforward (see Figure 1). We will discuss the details of each step.

Types of design-based research data for learning experience network analysis

Learner experiences can only be indirectly recorded. Different forms of data vary in the degree to which they represent learner experiences and in the levels and aspects of experiences they capture. For instance, surveys can be used to easily capture and code data from large numbers of participants regarding their own personal perceptions of their experiences. Individual interviews or reflection papers allow for much greater depth of analysis of these experiences than surveys, but require more time for the researchers to code. Video recordings of learning activities are valuable in many other forms of analysis within DBR, but they may be limited in their ability to record learners’ own interpretations of their learning experiences. Most of the recent work through which we developed LENA has been in university courses where we found individual reflection papers to provide the optimal richness of data. These reflections ask learners to reflect on their learning experiences in terms of struggles, what worked particularly well and what the activity meant to them. The major problem we have faced is that coding the data can take several months, which prevents us from producing findings and implementing design moves between semesters. This issue is particularly daunting in courses with hundreds of students. In a few instances we decided that the need for quick between-semester analysis outweighed our desire for rich data, leading us to use surveys with mostly Likert-scale questions accompanied by a few short-response open-ended questions. We only opted for in-depth interviews for small faculty development programs. We have also used teacher memos with varying degrees of success. We encourage researchers to experiment with other types of data through which to record learner experiences. The most important considerations are ensuring that you are capturing learner experiences, and that you do so in a way that allows for subsequent analysis through which to understand the complex interdependencies between many different aspects of the experiences.

Example data – design thinking for engaged learning

In the following sections we will provide examples from a DBR project in which we engaged students from a senior-level undergraduate course in a college of agriculture in team-based design thinking projects. We used the design thinking for engaged learning (DTEL) model which was developed to provide structure for faculty to engage students in collaborative, project-based learning (Donaldson and Smith, 2017; Jamal et al., 2021). The DTEL process combines innovation, collaboration and project-based learning to develop novel solutions to wicked problems.

Our design was a collaborative project-based learning experience spanning 14 weeks, with the design thinking project broken up into 10 stages (finding and understanding the problem; empathy and perspective work for wicked problem framing; divergent thinking; convergent thinking; project planning and low-fidelity prototyping; high-fidelity prototype construction; user testing and data collection; data analysis and design-move planning; implementation of design moves to create multiple iterations; and deployment for real-world impact) that were organized into five phases (name and frame; diverge and converge; prepare and share; analyze and revise; and deploy). Across all stages, the design intentionally focused on facilitating development of designerly ways of knowing (e.g. wicked problem framing, abductive reasoning, contextualized thinking, reflection-in-action and cognitive, affective and conative empathy).

The key theories underlying the DTEL model informed the initial design of the learning experience: situated learning theory (Lave and Wenger, 1991), cognitive constructivist theory (Piaget, 1952), social constructivist theory (Vygotsky, 1978), constructionist learning (Papert and Harel, 1991) and transformative learning theory (Mezirow, 2009). In line with the domain-specificity aspect of situated learning, students identify and frame a problem directly related to the disciplinary scope of the course (Jamal et al., 2021). In our design, the students were expected to go into real-world contexts where their identified problem exists and conduct interviews to understand stakeholders’ experiences of the problem. Project groups consisting of three to four students each were expected to complete all stages of the process together rather than splitting up tasks to be accomplished by individual members. The design thinking learning activities aimed to facilitate transformative learning experiences through collaborative project-based learning that promotes collective problem-solving and co-constructing knowledge.

The example data comes from the second iteration which occurred one year after the initial prototype. Since then, we have completed a total of four iterations, but selected the second iteration to describe here because this was during the period when we developed LENA in its current form. The design moves implemented in this iteration included adding team peer-review activities, spending more time helping learners develop understanding of the nature of wicked problems, frequent reminders for teams to engage in empathy work in all design thinking phases and reframing the interview assignment in Stage 2 to emphasize the human-centered design principles of focusing on the margins and seeking to understand as opposed to collecting information. There were approximately 30 students in the class, but data from only the 10 students who signed informed consent was collected in the form of reflection assignments completed at the end of each of the five design thinking phases, resulting in a corpus of 50 documents with an average of 400–500 words each. Example reflection prompts included “Write a few paragraphs about your experience in the design thinking project, and what it meant to you personally and professionally. What aspects of the design thinking project worked particularly well for you, and why? In what aspects of the project did you struggle, and why?”

Preparing the data, coding

We usually code at the level of the sentence, although the researcher must determine the most appropriate grain level depending on the context. For instance, we have encountered situations where an entire paragraph is the most appropriate level. The number of codes will depend on the study, but generally we find that in the theory category we use around three to eight a priori codes per theory. In the struggles and what worked categories we often end up with 20 to 50 emergent codes per category. We try our best to code every sentence, but often there are sentences that are irrelevant to our study and, therefore, remain uncoded. Sometimes multiple codes are applied to one sentence. This is particularly true of codes for principles from theory because different theories often have principles that are distinct to that theory, but overlap conceptually with principles from other learning theories.

An early step in DBR is to ground the design in appropriate theories. Different aspects of the theories being used are translated into design principles which guide design choices. Therefore, when preparing the codebook, one category of codes includes all the theory-derived principles. The principles being coded are the same as the principles used when designing the learning experience. Learner experiences are coded for indication of alignment with these specific principles. In our example data, our a priori theory codes from situated learning theory included engaging in the community of practice, engaging in the practices of the community of practice, communication in the community and identity exploration in relation to the community. Codes from transformative learning theory included questioning beliefs or assumptions, changing beliefs or assumptions, changing ways of knowing and developing new perspectives. Our cognitive constructivist codes included individual knowledge construction and knowledge consolidation. The social constructivist codes included collaborative knowledge construction, mediating artifacts and scaffolding. Constructionist theory codes included generativity (making), learner agency, tinkering, productive failure and authentic real-world audience and purpose.

We also want to understand what aspects of the design were problematic and could be subject to improvement in subsequent iterations. We can identify these aspects by emergent coding of learner experiences within the category of student struggles. The researcher must be particularly careful to recognize that some struggles are productive, while others inhibit optimal learning. Only struggles that are counterproductive should be coded as struggles, and struggles such as productive failure or cognitive dissonance should be coded under the relevant theory categories. In our example data, the most frequent emergent codes for student struggles were teamwork, peer review, problem framing, user testing, time management, convergent thinking, divergent thinking and clarity of instructions.

Another category of codes includes aspects of learner experiences which “worked” well and aspects of experiences which students found beneficial, enjoyable, generative, transformative, etc. Not only can these aspects of experiences help us understand what aspects of the designed learning experience were appropriate, but they can also be understood in their relationships with problematic aspects of learning, thus enabling us to leverage strengths to address related weaknesses. Emergent codes in our example data for what worked particularly well included confidence building, convergent thinking, deeper understanding of issues, divergent thinking, empathy work, learner agency, problem framing and relevance.

After coding is complete, the codebook often needs to be cleaned. If there are any codes which have been used three or fewer times, we either merge these codes (along with their coded segments) with other codes or delete the codes. In one of our studies, we had infrequently-used codes in the struggles category of “STRUGGLE – process not relevant” and “STRUGGLE – lack of interest” which we merged and renamed as “STRUGGLE – relevance and interest.” If the data set is small (for instance, if we have only 10 reflection papers), we might delete codes that have been used only once or twice. If the data set is larger (we often work with data sets including reflections from hundreds of students, with each student submitting five reflection papers) we usually delete codes that have been used three or fewer times.
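
A quick way to surface candidates for this cleaning step is to tally code frequencies from the coded data. The sketch below assumes a hypothetical document-by-code count matrix exported as codes.csv (this file name and layout are illustrative, not part of our actual workflow) and simply flags rarely used and very frequently used codes.

```python
# Minimal sketch of the codebook-cleaning heuristic, assuming a hypothetical
# "codes.csv" export: rows = documents, columns = codes, values = counts of
# coded segments.
import pandas as pd

counts = pd.read_csv("codes.csv", index_col=0)

# Total number of times each code was applied across the corpus
totals = counts.sum(axis=0)

# Codes used three or fewer times are candidates for merging or deletion
print("Candidates for merging or deletion:")
print(totals[totals <= 3].sort_values())

# Codes used far more often than the rest are candidates for splitting
print("Candidates for splitting into finer-grained codes:")
print(totals[totals > totals.mean() + 2 * totals.std()].sort_values(ascending=False))
```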

We also look for codes which have been used much more often than other codes, evaluate all the segments of text to which one code has been applied and attempt to split the code into two or more codes. In one of our studies, we had a frequently-used code in the what worked category of “WORKED – Teamwork” which we were able to split into multiple codes: “WORKED – teamwork – communication,” “WORKED – teamwork – collaboration,” “WORKED – teamwork – relationships, trust” and “WORKED – teamwork – role negotiation, fulfilling roles.”

During the period in which the codebook is being developed, the research team meets frequently to negotiate differences in coding. It is also advisable to conduct inter-rater reliability testing once the codebook is nearly complete and continue to negotiate differences until the team achieves their desired level of inter-rater reliability. After cleaning, our example data codebook included 22 theory codes, 24 struggle codes and 22 codes for what worked.
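
There are many ways to quantify inter-rater reliability; as one illustrative option (not necessarily the statistic used in our studies), Cohen's kappa on two coders' sentence-level decisions for a single code can be computed as in the sketch below, with entirely hypothetical coding decisions.

```python
# Illustrative inter-rater reliability check using Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

# Hypothetical decisions by two coders for 12 sentences (1 = code applied)
coder_a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0]
coder_b = [1, 0, 1, 1, 1, 0, 1, 0, 0, 0, 1, 0]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # agreement beyond chance
```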

Correlation or co-occurrence analysis

Once the coding is complete and the codebook is clean, we conduct correlation analysis. In our work so far, we have been using MAXQDA Analytics Pro, which has a statistics function whereby we can calculate correlations for pairs of codes, indicating the likelihood that those two codes appear in the same document. We use Pearson’s one-tailed correlations because we have no reason to assume a normal distribution. We conduct this analysis for all possible combinations of pairs of codes in the three categories (theory, what worked and struggles) and export the results as a Microsoft Excel file containing the symmetrical correlation matrix. We have also used other analytical methods such as code co-occurrence frequency analysis. This is useful in studies in which the data includes longer text documents such as interviews because it allows us to narrow the parameters to include only code co-occurrences within a few sentences or paragraphs, rather than both codes occurring anywhere within one text document. Because our studies have tended to include shorter text documents in the form of reflection papers consisting of only a few paragraphs each, we will describe how we prepare the correlation matrices for use in network mapping tools.
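
As an illustration of this step outside MAXQDA, the sketch below computes one-tailed Pearson correlations (with p-values) for every pair of codes from a hypothetical document-by-code matrix (the codes.csv file from the earlier sketch); the file names and column layout are assumptions for the example, not a description of our actual tooling.

```python
# Minimal sketch of the pairwise correlation step, assuming a hypothetical
# "codes.csv" export: rows = documents, columns = codes, values = counts.
from itertools import combinations

import pandas as pd
from scipy.stats import pearsonr  # alternative="greater" requires SciPy >= 1.9

counts = pd.read_csv("codes.csv", index_col=0)

records = []
for code_a, code_b in combinations(counts.columns, 2):
    # One-tailed test for a positive association between the two codes
    r, p = pearsonr(counts[code_a], counts[code_b], alternative="greater")
    records.append({"code_a": code_a, "code_b": code_b, "r": r, "p": p})

pairs = pd.DataFrame(records)
pairs.to_csv("code_pair_correlations.csv", index=False)
print(pairs.sort_values("p").head())
```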

The Excel files produced by the qualitative analysis software we use must be transformed before importing into the network analysis software. The correlation matrix is symmetrical, with the names of the columns and rows indicating the names of the codes. Each cell in the matrix includes the Pearson’s r value and the p-value, for both significant and non-significant pairs. Therefore, we create three copies of the sheet so that we have a sheet for values at each of the p < 0.001, p < 0.01 and p < 0.05 significance levels. With some cleaning (Excel macros are helpful here), each sheet contains only the Pearson’s r values that are significant at that level and all other cells are empty.
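
The same filtering can be scripted. The sketch below, continuing from the hypothetical code_pair_correlations.csv produced above, keeps only the code pairs that are significant at each of the three levels and writes one edge list per level.

```python
# Sketch of splitting the correlation results by significance level, assuming
# the hypothetical "code_pair_correlations.csv" from the previous sketch.
import pandas as pd

pairs = pd.read_csv("code_pair_correlations.csv")

for alpha in (0.001, 0.01, 0.05):
    # Keep only code pairs whose one-tailed p-value falls below this threshold
    edges = pairs[pairs["p"] < alpha][["code_a", "code_b", "r"]]
    edges.to_csv(f"edges_p{alpha}.csv", index=False)
    print(f"p < {alpha}: {len(edges)} links retained")
```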

Network mapping

After the symmetrical correlations are prepared, we create network databases in the network analysis software by importing the data from Excel, with separate network database files for each of the three significance levels (p < 0.001, p < 0.01 and p < 0.05). Then we construct the network maps as one-mode networks. We always start with the most stringent level (p < 0.001) and work our way down if the network is not strong enough. Often we find that at the p < 0.001 level the network consists of many dyads and triads (two or three connected nodes), so we go to the next level (p < 0.01) to get a map with many connected nodes. Usually, we want network maps with all nodes within one network to best understand the complexity of experiences. Sometimes there are nodes which are not connected to any other nodes, so we delete these nodes because we are only interested in patterns of interdependencies between different aspects of the learning experience. The purpose of LENA is to help researchers make changes to aspects of the design with which learners struggled. Therefore, if there are triads or larger groups of nodes not connected to the main network, and if these groups do not include any struggles, we usually delete these nodes as well.
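
An equivalent construction with networkx might look like the sketch below; we use a dedicated network analysis tool, so the file name and the STRUGGLE/WORKED/THEORY code-name prefixes here are illustrative assumptions rather than our exact setup.

```python
# Sketch of building the one-mode code network and pruning components,
# assuming a hypothetical edge list exported at the p < 0.01 level and code
# names prefixed with their category (STRUGGLE, WORKED, THEORY).
import networkx as nx
import pandas as pd

edges = pd.read_csv("edges_p0.01.csv")
G = nx.from_pandas_edgelist(edges, "code_a", "code_b")

# Delete any component (including dyads and triads) that contains no struggle
# node, since design moves are anchored in struggles; isolated codes never
# appear here because the graph is built from significant links only.
for component in list(nx.connected_components(G)):
    if not any(node.startswith("STRUGGLE") for node in component):
        G.remove_nodes_from(component)

print(f"{G.number_of_nodes()} nodes and {G.number_of_edges()} links retained")
```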

Cluster analysis

Our next step is to conduct cluster analysis. We use Girvan–Newman cluster analysis (Girvan and Newman, 2002) because this is the most widely used form of cluster analysis, and although we have attempted using other algorithms, we have found this to be the most appropriate method thus far. Girvan–Newman clustering calculates the edge-betweenness of every link in the network – the number of shortest paths between pairs of nodes that pass through that link – and then removes the link with the highest betweenness. The higher the edge-betweenness value of the link between any particular pair of codes, the more power that link (edge) has in terms of information flow through the network. Edge-betweenness values are then recalculated and the process repeats until distinct clusters are revealed.

We start with a minimum of two Girvan–Newman clusters, and then select the number of clusters which produces the highest Q value while also not producing an unusable number of clusters. Each cluster must contain at least one “struggle” node and at least one “what worked” or theory node. Higher Q values indicate higher confidence that the clusters represent meaningful clustering in the network. The normal range of Q values in our studies is between 0.3 and 0.7, and we try to avoid Q values less than 0.3. For example, if a map has five clusters with a very high Q value, but one or more of the clusters does not contain the necessary types of nodes (struggle + what worked or theory), we try four clusters. Again, we start with the p < 0.001 level map, and if that map does not produce the necessary types of nodes in each cluster, or has a Q value below 0.3, we abandon that level and move on to the p < 0.01 level.
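
A sketch of this selection heuristic, using networkx's Girvan–Newman and modularity functions on the hypothetical code network built above (again assuming the STRUGGLE prefix convention), might look like the following.

```python
# Sketch of the cluster-selection heuristic: generate Girvan–Newman partitions,
# score each with modularity Q and check the struggle + strength/theory rule.
import itertools

import networkx as nx
import pandas as pd
from networkx.algorithms.community import girvan_newman, modularity

edges = pd.read_csv("edges_p0.001.csv")  # hypothetical edge list from the correlation step
G = nx.from_pandas_edgelist(edges, "code_a", "code_b")

def usable(partition) -> bool:
    """Every cluster needs at least one struggle and one 'what worked' or theory node."""
    return all(
        any(node.startswith("STRUGGLE") for node in cluster)
        and any(not node.startswith("STRUGGLE") for node in cluster)
        for cluster in partition
    )

# girvan_newman yields partitions with 2, 3, 4, ... communities in turn
for partition in itertools.islice(girvan_newman(G), 8):
    q = modularity(G, partition)
    print(f"{len(partition)} clusters: Q = {q:.3f}, usable = {usable(partition)}")
```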

In our example study, cluster analysis resulted in a number of maps at the p < 0.001 level (see supplemental materials) starting from two clusters at Q = 0.158 with increasing Q values until we got to a map with seven clusters at Q = 0.602, after which the Q value started going down. Usually, we would be overjoyed with the strong Q value in the seven-cluster map. All of the clusters in this map had at least one struggle and at least one what worked or theory principle node. However, when the research team discussed all the maps, they came to the conclusion that they wanted a map that contained at least two struggles in each cluster, so they settled on the map with four clusters at Q = 0.535.

Betweenness centrality measures

In studies with a large number of codes, it might be difficult to decide where to start when interpreting the maps. To assist with this, we conduct node betweenness centrality analysis, which calculates betweenness centrality for each node (as opposed to edge-betweenness used in the previous step, which calculates betweenness measures for each line). We then set the node size such that nodes with greater betweenness centrality values are larger. This form of analysis helps us identify which nodes are most important in terms of connecting other nodes together (but not to themselves), and, therefore, can be thought of as leverage points.

In our example data (see Figure 2; see supplemental materials for detailed maps and a step-by-step description of the map development process), the leverage points in the largest red circles cluster were WORKED – learning with design thinking is more active and engaging (betweenness value of 318.81), WORKED – relevance in terms of interest in a chosen issue (288.36) and WORKED – divergent thinking (221.08). The leverage points in the next-largest blue triangle cluster were WORKED – building confidence (321.34), WORKED – relevance in terms of future career (265.49), WORKED – relevance in terms of immediate real-world impact and seeing their design come to life (220.0) and WORKED – teamwork in terms of alignment of interests and personality in the team (214.33).
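
In networkx terms, this step corresponds to node betweenness centrality; the sketch below (on the same hypothetical code network) prints the top leverage points, using unnormalized values so they are raw shortest-path counts rather than values scaled to the unit interval.

```python
# Sketch of the leverage-point step: node betweenness centrality on the
# hypothetical code network, with values used to scale node sizes in the map.
import networkx as nx
import pandas as pd

edges = pd.read_csv("edges_p0.001.csv")
G = nx.from_pandas_edgelist(edges, "code_a", "code_b")

# normalized=False keeps raw shortest-path counts instead of [0, 1] values
centrality = nx.betweenness_centrality(G, normalized=False)

# Nodes with the highest betweenness connect the most other nodes together
for node, value in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{value:8.2f}  {node}")
```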

Crafting design moves

The principle behind translating these network maps into design moves is that we can leverage strengths related to weaknesses to address those weaknesses. We create a table with the headings issue, strengths, theory and design moves. Within each cluster, we identify the struggle nodes and write those codes in a row in the table. In the next column of the row, we write out all the strengths (what worked well) in the cluster and do the same for the theory nodes. In the design move column, we craft changes we could make to the design of the learning experience which use the related strengths and theoretical principles to address the weaknesses.
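
Scaffolding that table can be partially automated; the sketch below groups each cluster's codes by category (again assuming the STRUGGLE/WORKED/THEORY prefixes and a hypothetical cluster assignment), leaving the design-move column for the research team to craft by hand.

```python
# Sketch of scaffolding the design-moves table from a cluster assignment.
# The cluster labels and code names below are hypothetical examples.
import pandas as pd

clusters = {
    "STRUGGLE - timing, assignment too slow": "red circles",
    "WORKED - relevance, interest in chosen issue": "red circles",
    "THEORY - SLT communication in community of practice": "red circles",
    "STRUGGLE - low tolerance of ambiguity": "gray squares",
    "WORKED - teamwork, relationships and trust": "gray squares",
}

rows = []
for label in sorted(set(clusters.values())):
    codes = [code for code, cluster in clusters.items() if cluster == label]
    rows.append({
        "cluster": label,
        "issue": "; ".join(c for c in codes if c.startswith("STRUGGLE")),
        "strengths": "; ".join(c for c in codes if c.startswith("WORKED")),
        "theory": "; ".join(c for c in codes if c.startswith("THEORY")),
        "design moves": "",  # crafted by the research team, not computed
    })

print(pd.DataFrame(rows).to_string(index=False))
```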

We interpret clusters as prototypical types of student experiences, which often represent different types of students. As an example that was most striking to us as researchers, in one of our studies we found a cluster that could best be described as representing students who see their engagement in learning activities in terms of employee–employer relationships: the student is the employee who completes tasks requested by the employer (the teacher), with the expectation of being compensated in the form of grades. In the same study, we found another cluster in which the students experienced transformation, deep engagement and powerful learning, along with cognitive dissonance and productive failure as major leverage points. When constructing design moves, we try to remain mindful that design moves that address struggles in a cluster representing a type of student experience may have consequences for student experiences in other clusters.

In our example clustered map (Figure 2), the largest cluster is the red circles cluster. This cluster represents student experiences that were generally fruitful and aligned with theory. There were quite a few alignments in this cluster with situated learning theory and constructionist learning principles. This cluster can be interpreted broadly as involving a lot of collaborative and collective work. The blue triangles cluster, on the other hand, has a more individualistic vibe, aligning primarily with cognitive constructivist and constructionist principles. Some of the most important nodes (highest betweenness values) from the what worked category were focused on the individual including confidence building, exploration of relevance for future career and relevance in terms of real-world impact. The black diamonds cluster included mostly aspects that worked well, and was concerned primarily with understanding from new perspectives, particularly through working with stakeholders, user testing and cognitive (mind) empathy. The gray squares cluster represented many struggles, including issues with communication in teamwork, empathy, low motivation and low tolerance of ambiguity. Students with the experiences in this cluster preferred to work individually but valued the relationships in their team.

Table 1 shows only the issues, strengths, theory principles and design moves for the largest red circles cluster from Figure 2 (see supplemental materials for more).

For example, students in the red circles cluster struggled with the timing of the assignment, particularly in feeling that the problem understanding stages of the design thinking process were too slow. What worked particularly well for these students was relevance in terms of interest in their team’s chosen issue, as well as processes for problem identification. Their experiences aligned with the situated learning principles of communication in a community of practice and engaging in the practices of the community, which in their case was the agricultural leadership community. Leveraging the principles from theory and aspects that worked well, we developed the design move whereby, during the problem finding and problem framing stages, the instructor will repeatedly remind students that human nature is to jump into solution mode, but agricultural leaders spend most of their time understanding the problem from intellectual, empathetic and experiential perspectives. This facilitation strategy change was intended to help students truly understand that effective ways of addressing wicked problems require taking the time to develop deep understanding, and that this is how central members of their community of practice engage with such problems.

Once the design moves are all crafted, the next step is to make those changes to the design and implement the learning experience again. Using this same approach of leveraging theory and strengths to address weaknesses, we developed 10 design moves which were implemented in the following iteration. Over the course of four iterations, we have made over 50 design moves.

Evaluating change across design-based research iterations

LENA was developed to help DBR researchers construct design moves between iterations, and, therefore, in this paper we will not address issues such as evaluating change across iterations, understanding how the design moves affected students in subsequent iterations and translating findings in terms of theory building. For an example of a DBR study in which LENA was used that addresses such issues, see Odom et al. (2023).

Limitations, benefits and insights from learning experience network analysis

There are limitations to the LENA method. This method can be applied only when you can get your data from participant experiences and be confident that the data provides a rich and reliable representation of these experiences. The characteristics of the data (student-generated reflections, for instance) may have unforeseen problematic implications for theory. The purpose of LENA is to help researchers in DBR studies craft design moves and is, therefore, not specifically intended as a means of contributing to theory. However, the complex interdependencies between aspects of learning experiences and their alignments with principles from learning theories produced through LENA may be an area for future design-based researchers to expand the methodology toward theory building. Furthermore, LENA does not afford evaluative functions such as determining the “impact” of a design. Finally, each iteration is a unique and complex context, and there will be many changes from iteration to iteration that are not due to design moves. Therefore, confidence in the impact of design moves from iteration to iteration cannot be as robust as desired. However, this issue may be alleviated to some extent by having many more iterations than is traditionally reported for DBR studies.

Our experience suggests that LENA makes it easier to identify patterns and trends through visualizing interconnections and associations. It also allows us to specify distinctive learning experiences and their interactions within the overall learning design. It enhances learning experience design by providing guidance from current network maps, something particularly helpful for novice DBR researchers. Crafting design moves through this method encourages researchers to address “why” questions in addition to “what” and “how” questions. Through LENA we can understand, from both experiential and theoretical perspectives, how people use our designs, and build theory from that understanding.

Through our work developing LENA, we have gained a few valuable insights. Each learner is experiencing learning on their own terms, each with their own possibilities and limitations. LENA helps bridge those learning experiences and provide interconnected insights for data-driven learning design decisions. In other words, we believe that it allows us to embrace the messiness and complexity of learning without getting overwhelmed.

Conclusion and areas for future development of learning experience network analysis

The development of LENA involved a long (and ongoing) evolutionary process, but we are now confident that in its current form it is robust enough to share publicly in order for other researchers to further refine it. In terms of our own future studies to apply and test the LENA approach, we are currently working with a team of doctoral students in computer science to build an artificial intelligence system to assist us and other researchers who use LENA with large amounts of data that would otherwise take an enormous amount of time to code. Furthermore, in our own research studies the context has usually been large-enrollment courses in higher education, making it nearly impossible to include the students as collaborators in the DBR studies. We will continue to explore innovative approaches to better include students as co-equal colleagues in DBR. For instance, recently we have done a few experiments where we give students the clustered network maps and use them as an aid in metacognitive and reflective practices in learning. Researchers using LENA for further innovations may identify new types and sources of data regarding learner experiences. We acknowledge that LENA is still in a developmental stage, and this article is our call for other researchers to continue the development of this and other related methods.

Figures

Figure 1. The LENA process

Figure 2. Network map of struggles, what worked and theory with four Girvan–Newman clusters at Q = 0.535
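As a point of reference for the statistic in the Figure 2 caption (this gloss is ours, not part of the original figure), Girvan–Newman modularity can be written as Q = \sum_i (e_{ii} - a_i^2), where e_{ii} is the fraction of edge weight falling within cluster i and a_i is the fraction of edge ends attached to nodes in cluster i. Values near zero indicate no more within-cluster connection than expected by chance, whereas values such as 0.535 indicate substantial community structure.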

Design moves table for the red circles cluster

Issue: In the red circles cluster, some students struggled with: timing – assignment too slow, too long; problem finding, identifying problems; instructions, grades and clarity of the process; understanding what to do; problem scope (realistic, can implement); teamwork in terms of incorporating ideas, information and efforts; divergent thinking – idea generation; time management – personal; and convergent thinking – selecting a solution.

Strengths: For these students, what worked well was: teamwork – openness, new ideas; learner agency, freedom, flexibility, creativity; relevance – interest in the chosen issue; instructions, lecture or clarity of process; timing, pace, action plan; teamwork communication, dialog, understanding; learning with design thinking being more active and engaging; and problem identification – picking their topic, problem.

Theory: These students’ experiences aligned with the situated learning theory principles of communication in a community of practice and engaging in the practices of the community. Their experiences also aligned with the cognitive constructivist principle of individual knowledge construction and the social constructivist principle of collaborative knowledge construction. Finally, their experiences aligned with the constructionist principle of learner agency, authority and autonomy.

Design moves:
(1) During the problem finding and problem framing stages, the instructor (facilitation) will repeatedly remind students that human nature is to jump into solution mode, but agricultural leaders spend most of their time understanding the problem from intellectual, empathetic and experiential perspectives. (The students need to understand that addressing issues effectively requires deep understanding, which takes time.)
(2) During problem finding Stage 1, groups will be asked to formulate their question as “How can we (our team) help ______ (who) address ______ (problem)?” (The instructor can remind them that each team is not “solving” the problem but helping a very particular set of stakeholders address the problem, and that agricultural leaders do not solve problems themselves but facilitate problem-solving by others – relationships are key.)
(3) During every group meeting time, have each group member write down what the group is supposed to be doing next (writing), then have each group spend one minute discussing what they are supposed to be doing – have each member read or show their own version of what they are supposed to be doing. If there is disagreement in a group, they must ask the instructor. Fade over time (always do this for the first few weeks, then as needed).
(4) For the divergent thinking stage, have a warmup (to get students in a creative state of mind): how many new uses can we think of for a ______ (paperclip, used car tire, covid mask, etc.)? The whole class engages – students shout out ideas when they come to mind.

Source: Table by authors

Supplementary material

The supplementary material for this article can be found online.

References

Adam, K., Joseph, M., Lightfoot, R., Walker, W.C. and Donaldson, J.P. (2023), “Using a design-based research methodology to improve competency-based grading”, Paper presented at the 2023 International Conference of the Learning Sciences, June 10-15, Montreal, QC.

Ames, H., Glenton, C. and Lewin, S. (2019), “Purposive sampling in a qualitative evidence synthesis: a worked example from a synthesis on parental perceptions of vaccination communication”, BMC Medical Research Methodology, Vol. 19 No. 1, p. 26, doi: 10.1186/s12874-019-0665-4.

Anderson, T. and Shattuck, J. (2012), “Design-based research: a decade of progress in education research?”, Educational Researcher, Vol. 41 No. 1, pp. 16-25, doi: 10.3102/0013189x11428813.

Arastoopour Irgens, G. (2021), “Connected design rationale: a model for measuring design learning using epistemic network analysis”, Instructional Science, Vol. 49 No. 4, pp. 561-587, doi: 10.1007/s11251-021-09551-8.

Arastoopour Irgens, G. and Eagan, B. (2023), “The foundations and fundamentals of quantitative ethnography”, in Arastoopour Irgens, G. and Knight, S. (Eds), Advances in Quantitative Ethnography, Springer, pp. 3-16.

Bagley, E.A. and Shaffer, D.W. (2015), “Stop talking and type: comparing virtual and face‐to‐face mentoring in an epistemic game”, Journal of Computer Assisted Learning, Vol. 31 No. 6, pp. 606-622, doi: 10.1111/jcal.12092.

Bang, M. and Vossoughi, S. (2016), “Participatory design research and educational justice: studying learning and relations within social change making”, Cognition and Instruction, Vol. 34 No. 3, pp. 173-193.

Barab, S. (2006), “Design-based research: a methodological toolkit for the learning scientist”, in Sawyer, R.K. (Ed.), The Cambridge Handbook of the Learning Sciences, Cambridge University Press.

Bar-Yam, Y. (2003), Dynamics of Complex Systems, Westview Press.

Barany, A., Foster, A. and Shah, M. (2020), “Design-based research iterations of a virtual learning environment for identity exploration”, in 2020 6th International Conference of the Immersive Learning Research Network (iLRN), pp. 101-108.

Barany, A., Shah, M. and Foster, A. (2021), “Connecting curricular design and student identity change: an epistemic network analysis”, in Advances in Quantitative Ethnography, pp. 155-169.

Borgatti, S.P., Mehra, A., Brass, D.J. and Labianca, G. (2009), “Network analysis in the social sciences”, Science, Vol. 323 No. 5916, pp. 892-895, doi: 10.1126/science.1165821.

Brown, A.L. (1992), “Design experiments: theoretical and methodological challenges in creating complex interventions in classroom settings”, Journal of the Learning Sciences, Vol. 2 No. 2, pp. 141-178.

Campanella, M. and Penuel, W.R. (2021), “Design-based research educational settings: motivations, crosscutting features, and considerations for design”, in Philippakos, Z.A., Howell, E., Pellegrino, A. and Reinking, D. (Eds), Design-Based Research in Education: Theory and Applications, Guilford Publications, pp. 3-22.

Chen, B., Chang, Y.-H., Ouyang, F. and Zhou, W. (2018), “Fostering student engagement in online discussion through social learning analytics”, The Internet and Higher Education, Vol. 37, pp. 21-30, doi: 10.1016/j.iheduc.2017.12.002.

Christensen, A.P. and Kenett, Y.N. (2021), “Semantic network analysis (SemNA): a tutorial on preprocessing, estimating, and analyzing semantic networks”, Psychological Methods, Vol. 28 No. 4, doi: 10.1037/met0000463.

Clark, H.F. (2022), “Critical climate awareness: re-imagining climate change teaching and learning”, PhD Dissertation, University of California, Los Angeles.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R. and Schauble, L. (2003), “Design experiments in educational research”, Educational Researcher, Vol. 32 No. 1, pp. 9-13, doi: 10.3102/0013189X032001009.

Collins, A. (1992), “Toward a design science of education”, in Scanlon, E. and O’Shea, T. (Eds), New Directions in Educational Technology, Springer Berlin Heidelberg.

Collins, A., Joseph, D. and Bielaczyc, K. (2004), “Design research: theoretical and methodological issues”, Journal of the Learning Sciences, Vol. 13 No. 1, pp. 15-42, doi: 10.1207/s15327809jls1301_2.

Cooper, R.P., Chowdhury, M., Donaldson, J.P. and Barth, M. (2023), “Exploring role-play game-based learning in the literature classroom”, Paper presented at the Constructionism/Fablearn 2023 conference, Oct 7-11, NY.

Dede, C. (2004), “If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; diSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research”, Journal of the Learning Sciences, Vol. 13 No. 1, pp. 105-114, doi: 10.1207/s15327809jls1301_5.

Dennerlein, S.M., Tomberg, V., Treasure-Jones, T., Theiler, D., Lindstaedt, S. and Ley, T. (2020), “Co-designing tools for workplace learning”, Information and Learning Sciences, Vol. 121 Nos 3/4, pp. 175-205, doi: 10.1108/ILS-09-2019-0093.

diSessa, A.A. and Cobb, P. (2004), “Ontological innovation and the role of theory in design experiments”, Journal of the Learning Sciences, Vol. 13 No. 1, pp. 77-103, doi: 10.1207/s15327809jls1301_4.

Doerfel, M.L. (1998), “What constitutes semantic network analysis? A comparison of research and methodologies”, Connections, Vol. 21, pp. 16-26.

Donaldson, J.P. and Smith, B.K. (2017), “Design thinking, designerly ways of knowing, and engaged learning”, in Spector, M.J., Lockee, B.B. and Childress, M.D. (Eds), Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy, Springer International Publishing, doi: 10.1007/978-3-319-17727-4_73-1.

Donaldson, J.P., Odom, S., Stoddard, K., Parker, D. and Zhao, Y. (2022a), “Design thinking as a structure for collaborative project-based learning”, Paper presented at the American Educational Research Association Annual Conference 2022.

Donaldson, J.P., Stoddard, K., Odom, S., Parker, D., Paudyal, S., Thomas, S., Dunlap, K. and Jamal, T. (2022b), “Design thinking as a structure for collaborative project-based learning in multiple disciplines”, Paper presented at the 2022 International Conference of the Learning Sciences, June 6-9 (virtual).

Donaldson, J.P., Walker, W., Zhao, Y. and Kao, S. (2022c), “Using network analysis to develop design moves in design-based research”, Pre-conference workshop at the 2022 International Society of the Learning Sciences Annual Meeting, June 5 (virtual).

Fischer, F., Goldman, S.R., Hmelo-Silver, C.E. and Reimann, P. (2018), “Introduction”, in Fischer, F., Hmelo-Silver, C.E., Goldman, S.R. and Reimann, P. (Eds), International Handbook of the Learning Sciences, Routledge.

Fishman, B. and Penuel, W. (2018), “Design-based implementation research”, in Fischer, F., Hmelo-Silver, C.E., Goldman, S.R. and Reimann, P. (Eds), International Handbook of the Learning Sciences, Routledge.

Fishman, B., Marx, R.W., Blumenfeld, P., Krajcik, J. and Soloway, E. (2004), “Creating a framework for research on systemic technology innovations”, Journal of the Learning Sciences, Vol. 13 No. 1, pp. 43-76, doi: 10.1207/s15327809jls1301_3.

Ganvir, A. and Donaldson, J.P. (2022), “Using semantic network analysis to inform design moves in design-based research”, Paper presented at the American Educational Research Association Annual Meeting, April 21-26 (virtual).

Girvan, M. and Newman, M.E.J. (2002), “Community structure in social and biological networks”, Proceedings of the National Academy of Sciences, Vol. 99 No. 12, p. 7821, doi: 10.1073/pnas.122653799.

Gomez, K. (2021), “The design, development and evaluation of an augmented reality intervention to support collaborative creative thinking at lower secondary school”, Ph.D. dissertation, University of Cambridge.

Guest, G., Bunce, A. and Johnson, L. (2006), “How many interviews are enough? An experiment with data saturation and variability”, Field Methods, Vol. 18 No. 1, pp. 59-82, doi: 10.1177/1525822X05279903.

Gutiérrez, K.D. (2016), “2011 AERA presidential address: designing resilient ecologies: social design experiments and a new social imagination”, Educational Researcher, Vol. 45 No. 3, pp. 187-196.

Hammersley, M. and Atkinson, P. (2007), Ethnography: Principles in Practice, 3rd ed., Routledge.

Hilpert, J.C. and Marchand, G.C. (2018), “Complex systems research in educational psychology: aligning theory and method”, Educational Psychologist, Vol. 53 No. 3, pp. 185-202, doi: 10.1080/00461520.2018.1469411.

Hoadley, C. (2018), “A short history of the learning sciences”, in Hmelo-Silver, C.E., Goldman, S.R., Reimann, P. and Fischer, F. (Eds), International Handbook of the Learning Sciences, Routledge.

Howell, E., Hubbard, K., Linder, S., Madison, S., Ryan, J. and Bridges, W.C. (2023), “Hyflex pedagogy: six strategies supported by design-based research”, Journal of Applied Research in Higher Education, doi: 10.1108/JARHE-02-2023-0050.

Jamal, T., Kircher, J. and Donaldson, J.P. (2021), “Re-visiting design thinking for learning and practice: critical pedagogy, conative empathy”, Sustainability, Vol. 13 No. 2, doi: 10.3390/su13020964.

Jayatilleke, B.G., Ranawaka, G.R., Wijesekera, C. and Kumarasinha, M.C.B. (2018), “Development of mobile application through design-based research”, Asian Association of Open Universities Journal, Vol. 13 No. 2, pp. 145-168, doi: 10.1108/AAOUJ-02-2018-0013.

Lagemann, E.C. (2002), Usable Knowledge in Education: A Memorandum for the Spencer Foundation Board of Directors (Memorandum), Spencer Foundation.

Latour, B. (1996), “On actor-network theory: a few clarifications”, Soziale Welt, Vol. 47, pp. 369-381.

Lave, J. and Wenger, E. (1991), Situated Learning: Legitimate Peripheral Participation, Cambridge University Press.

Luke, D.A. and Harris, J.K. (2007), “Network analysis in public health: history, methods, and applications”, Annual Review of Public Health, Vol. 28 No. 1, pp. 69-93, doi: 10.1146/annurev.publhealth.28.021406.144132.

Martínez, A., Dimitriadis, Y., Rubia, B., Gómez, E. and de la Fuente, P. (2003), “Combining qualitative evaluation and social network analysis for the study of classroom social interactions”, Computers and Education, Vol. 41 No. 4, pp. 353-368.

Maxwell, J.A. (2012), Qualitative Research Design: An Interactive Approach, Sage Publications.

Menczer, F., Fortunato, S. and Davis, C.A. (2020), A First Course in Network Science, Cambridge University Press.

Mezirow, J. (2009), “An overview on transformative learning”, in Illeris, K. (Ed.), Contemporary Theories of Learning: Learning Theorists in Their Own Words, Routledge.

Nathan, M.J. and Swart, M.I. (2021), “Materialist epistemology lends design wings: educational design as an embodied process”, Educational Technology Research and Development, Vol. 69 No. 4, pp. 1925-1954, doi: 10.1007/s11423-020-09856-4.

Norman, D.A. (2013), The Design of Everyday Things, Basic Books.

Obrenović, Ž. (2011), “Design-based research: what we learn when we engage in design of interactive systems”, Interactions, Vol. 18 No. 5, pp. 56-59, doi: 10.1145/2008176.2008189.

Odom, S.F., Donaldson, J.P., Anderson, K.M., Gui, H., Glover, J., Burns, A. and Armenta, V. (2023), “Design principles for sustainable leadership learning: a complex analysis of learner experiences”, Sustainability, Vol. 15 No. 17, p. 12996.

Ouyang, F., Chen, S. and Li, X. (2021), “Effect of three network visualizations on students' social-cognitive engagement in online discussions”, British Journal of Educational Technology, Vol. 52 No. 6, pp. 2242-2262, doi: 10.1111/bjet.13126.

Pantić, N., Galey, S., Florian, L., Joksimović, S., Viry, G., Gašević, D., Knutes Nyqvist, H. and Kyritsi, K. (2022), “Making sense of teacher agency for change with social and epistemic network analysis”, Journal of Educational Change, Vol. 23 No. 2, pp. 145-177, doi: 10.1007/s10833-021-09413-7.

Papert, S. and Harel, I. (1991), “Situating constructionism”, in Papert, S. and Harel, I. (Eds), Constructionism, Basic Books.

Parmaxi, A. and Zaphiris, P. (2020), “Lessons learned from a design-based research implementation: a researcher’s methodological account”, International Journal of Research and Method in Education, Vol. 43 No. 3, pp. 257-270, doi: 10.1080/1743727X.2019.1671327.

Penuel, W.R. (2019), “Infrastructuring as a practice of design-based research for supporting and studying equitable implementation and sustainability of innovations”, Journal of the Learning Sciences, Vol. 28 Nos 4/5, pp. 659-677, doi: 10.1080/10508406.2018.1552151.

Piaget, J. (1952), The Origins of Intelligence in Children, W Norton and Co.

Proctor, C. and Blikstein, P. (2019), “Unfold studio: supporting critical literacies of text and code”, Information and Learning Sciences, Vol. 120 Nos 5/6, pp. 285-307, doi: 10.1108/ILS-05-2018-0039.

Ryu, S. (2020), “The role of mixed methods in conducting design-based research”, Educational Psychologist, Vol. 55 No. 4, pp. 232-243, doi: 10.1080/00461520.2020.1794871.

Sandoval, W. (2014), “Conjecture mapping: an approach to systematic educational design research”, Journal of the Learning Sciences, Vol. 23 No. 1, pp. 18-36, doi: 10.1080/10508406.2013.778204.

Sannino, A. (2008), “From talk to action: experiencing interlocution in developmental interventions”, Mind, Culture, and Activity, Vol. 15 No. 3, pp. 234-257.

Sannino, A., Engeström, Y. and Lemos, M. (2016), “Formative interventions for expansive learning and transformative agency”, Journal of the Learning Sciences, Vol. 25 No. 4, pp. 599-633, doi: 10.1080/10508406.2016.1204547.

Schön, D.A. (1992), “Designing as reflective conversation with the materials of a design situation”, Knowledge-Based Systems, Vol. 5 No. 1, pp. 3-14, doi: 10.1016/0950-7051(92)90020-G.

Scott, J. (2017), Social Network Analysis, Sage Publications Ltd.

Shaffer, D.W. (2018), “Epistemic network analysis: understanding learning by using big data for thick description”, in Fischer, F., Hmelo-Silver, C.E., Goldman, S.R. and Reimann, P. (Eds), International Handbook of the Learning Sciences, Routledge.

Shaffer, D.W., Collier, W. and Ruis, A.R. (2016), “A tutorial on epistemic network analysis: analyzing the structure of connections in cognitive, social, and interaction data”, Journal of Learning Analytics, Vol. 3 No. 3, pp. 9-45, doi: 10.18608/jla.2016.33.3.

Sørensen, E. (2009), The Materiality of Learning: Technology and Knowledge in Educational Practice, Cambridge University Press.

Tan, Y., Hinojosa, C., Marquart, C., Ruis, A.R. and Shaffer, D.W. (2022), “Epistemic network analysis visualization”, in Wasson, B. and Zörgő, S. (Eds), Advances in Quantitative Ethnography, Springer, pp. 129-143.

Van Staden, C.J. and Van Der Westhuizen, D. (2013), “Learn 2.0 technologies and the continuing professional development of secondary school mathematics teachers”, Journal for New Generation Sciences, Vol. 11, pp. 141-157.

Vogelstein, L. (2022), “Choreographic ways of knowing as generative site for STEM learning, design, and analysis”, PhD Dissertation, Vanderbilt University, Nashville.

Vygotsky, L.S. (1978), Mind in Society: The Development of Higher Psychological Processes, Harvard University Press.

Zhao, Y., Kao, S., Donaldson, J.P. and Chaney, K. (2022), “Design of an argumentation-based learning activity: connecting veterinary students to real-world problems”, Paper presented at the 2022 International Conference of the Learning Sciences June 6-9 (virtual).

Zheng, L. (2015), “A systematic literature review of design-based research from 2004 to 2013”, Journal of Computers in Education, Vol. 2 No. 4, pp. 399-420, doi: 10.1007/s40692-015-0036-z.

Zörgő, S., Peters, G.-J.Y., Porter, C., Moraes, M., Donegan, S. and Eagan, B. (2022), “Methodology in the mirror: a living, systematic review of works in quantitative ethnography”, in Wasson, B. and Zörgő, S. (Eds), Advances in Quantitative Ethnography, Springer, pp. 144-159.

Acknowledgements

Since submission of this article, the following author has updated their affiliation: Seiyon Lee is at the School of Teaching and Learning, Institute of Advanced Learning Technologies, Educational Technology, University of Florida, Gainesville, Florida, USA.

Corresponding author

Jonan Phillip Donaldson can be contacted at: jonandonaldson@gmail.com
