Lauren Storz is an Academic Experience Analyst on the Academic Technology Design Team, and a PhD student in cultural anthropology at CU Boulder.
Namita Mehta is a Learning Experience Designer with the Office of Information Technology at CU Boulder. She has over 15 years of experience in education and holds an Ed.D. in Leadership for Educational Equity.
Wicked problems are those with many interdependent factors to consider. In our first blog post, we emphasized that solving these complex problems in technology-mediated teaching and learning requires understanding context and the systems at play in order to respond to human needs. Developing a clear and focused problem statement is essential to guide a research project, and mixed research methods can be used to understand and continually refine problem statements throughout the research process. In this post, we want to expand on how we use mixed research methods to achieve a human-centered approach, in other words, an approach that prioritizes the needs, perspectives, and experiences of people at every phase of research. Drawing on a diversity of methods gives us a more realistic understanding of how students, staff, and faculty experience a situation, interaction, or intervention. Context is not only about understanding the experience of each individual; it also entails a thorough investigation of how individuals relate to and interact with sociotechnical systems. Design-based research must account for how designs function in the everyday, lived experiences of those occupying a given space. It must not only document the success or failure of a space, but also focus on “the interactions that refine our understanding of the learning issues involved” (DBRC, 2003). The evolution of such accounts depends on methods that record and connect lived processes with assessment goals (DBRC, 2003). In other words, a key element of conducting design-based research that centers relationships and embraces complexity is a mixed-methods approach (Anderson & Shattuck, 2012).
The Case for Mixed Methods
Quantitative and qualitative data each contribute something different to our understanding of the human experience. Numerical data is helpful for measuring change over time. Quantitative methods can measure whether an intervention is effective, or reveal the relationship between two facets of an experience. Qualitative data, by contrast, explores new realms and brings greater clarity to what those numbers mean. Together, these complementary methods paint a clearer picture of the human experience, taking the pulse of a population while empathizing with individual stories at the same time.
Mixed-methods approaches can take a variety of forms. One option is to use two distinct methods, such as a survey and a focus group. Some people hesitate at this suggestion, since using two methods for every practice-engaged inquiry could be both time-consuming and exhausting. But this approach need not burden professionals who already have so many duties to juggle. One popular way of integrating these practices is to include both Likert-scale and open-ended questions on a single survey. Another is to offer places within a focus group where students can rate certain questions. The main point is that expanding our methods allows us to more accurately understand what an experience is and how we can improve it. To highlight this concept further, we would like to share a recent study conducted by the Academic Technology Design Team at CU Boulder, which illustrates one of the myriad ways a complex research project can be approached using multiple sources and mixed methods.
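To make the idea of a single instrument carrying both kinds of data more concrete, here is a minimal sketch in Python. All of the data, field names, and qualitative codes are hypothetical; it simply shows how a Likert rating can be summarized numerically while hand-applied codes from the paired open-ended comments are tallied as themes.

```python
from collections import Counter
from statistics import mean

# Hypothetical survey responses: each record pairs a Likert rating
# (1 = not helpful, 5 = very helpful) with a qualitative code a
# researcher applied to the respondent's open-ended comment.
responses = [
    {"rating": 4, "code": "flexibility"},
    {"rating": 5, "code": "recordings"},
    {"rating": 2, "code": "tech_failures"},
    {"rating": 5, "code": "recordings"},
    {"rating": 3, "code": "flexibility"},
]

# Quantitative pass: one summary statistic for the Likert item.
avg_rating = mean(r["rating"] for r in responses)

# Qualitative pass: which themes recur in the open-ended answers.
theme_counts = Counter(r["code"] for r in responses)

print(f"Average rating: {avg_rating:.1f}")
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

The number tells you how students rate the experience overall; the theme counts begin to tell you why, which is exactly the complementarity the mixed-methods approach is after.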
Learning Spaces Project Example
The Learning Spaces Project was conducted in partnership with the Learning Spaces Technology team, which maintains and upgrades academic technology in classrooms. While our partner team was looking for concrete recommendations concerning academic technology upgrades and support, we wanted to broaden our scope to better understand how academic technologies are situated within teaching and learning ecologies across campus. We decided to create case studies of courses across disciplines and in a variety of classroom types (e.g., large lecture halls, flexible classrooms, labs). For each case study, we triangulated our data through a student survey, classroom observations, and contextual inquiry with instructors. The student survey focused on students’ expectations and preferences around academic technology and classroom design. For example, we asked Likert-scale questions such as: “To what extent do the following learning tools impact your learning?” and “Which technology/learning tools do you generally expect to be available in a given classroom?” Each question gave students the option to rate the following classroom technologies: screens and projectors, blackboards and whiteboards, monitors and TVs, clickers, classroom capture, and other (fill in the blank). We also gave students two open-ended prompts: “How do the classroom technology/learning tools and space impact your learning?” and “Please draw and/or describe the technology and layout in your ideal classroom and explain your design choices.” Classroom observations were conducted by a member of our team and focused on classroom dynamics as they played out in real time: student-to-student and instructor-to-student interactions, as well as the ways in which classroom technology and physical classroom design influenced or mediated those interactions.
Lastly, contextual inquiries with instructors delved into each instructor’s pedagogical goals, expectations of available classroom technologies, and any difficulties they had experienced. In conjunction with these case studies, we drew on pre-existing data from a campus-wide survey and anonymized Faculty Course Questionnaire responses.
Through this mixed-methods approach, we were able to identify themes that persist across campus, as well as distill concrete recommendations that our partner team could act on right away. For instance, one theme that emerged was that instructors often have to tailor their teaching to fit the confines of the classroom they have been assigned, which becomes more burdensome when an instructor must teach the same course in two or three different classrooms across campus. This insight has implications for pedagogy, the ways instructors are assigned classrooms, and classroom design, but it remains relatively abstract and in need of further investigation. In contrast, a more granular insight came from the student survey: students across disciplines consistently highlighted access to classroom capture recordings of weekly lectures as highly beneficial to their learning. However, some students cannot access recordings because small classrooms lack the capture equipment instructors would need to provide them.
How it Applies to Other Areas of Higher Education
Although the example we used here relates to academic technology, these principles can be applied to other areas of higher education. The same approach could be used, for instance, to improve students’ residence life experience. A survey with Likert-scale questions could reveal differences in sense of belonging across demographics. To further explore what could help, the survey could include open-ended questions, or a focus group could be conducted later with students. Combining methods allows researchers to delve deeper into a problem of practice, and allows students to contribute more of their ideas and to prioritize them. Two benefits of this approach are that (1) it limits researcher bias by creating a participatory approach, and (2) it encourages an iterative approach to designing an experience that works for all students. For example, a mixed-methods approach could help surface the emerging needs of students during COVID-19 and allow higher education professionals to assess whether new approaches are responsive to those needs, and to improve them iteratively. Regardless of the methods used, it is important for researchers to be reflexive about their own positionality and how it will inevitably influence the research. We will expand on the salience of reflexivity, along with participatory approaches and the assessment cycle, in future posts.
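The residence-life scenario above can also be sketched quantitatively. In this hypothetical Python example (groups, field names, and ratings are all invented for illustration), a gap between group means on a Likert belonging item is the kind of signal that would prompt the follow-up open-ended questions or focus groups described above.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical Likert responses (1-5) to "I feel a sense of belonging
# in my residence hall," tagged with a self-reported demographic field.
survey = [
    {"group": "first_gen", "belonging": 2},
    {"group": "first_gen", "belonging": 3},
    {"group": "continuing_gen", "belonging": 4},
    {"group": "continuing_gen", "belonging": 5},
]

# Group the ratings by demographic category.
by_group = defaultdict(list)
for row in survey:
    by_group[row["group"]].append(row["belonging"])

# A gap between group means flags a question worth exploring further
# through qualitative methods, not a conclusion in itself.
group_means = {g: mean(vals) for g, vals in by_group.items()}
print(group_means)
```

The point of the sketch is the division of labor: the quantitative pass locates where experiences diverge, and the qualitative follow-up explains why and invites students into shaping the response.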
This is the second blog in a four-part series.
For the first blog, click here.
References
Anderson, T., & Shattuck, J. (2012). Design-based research: A decade of progress in education research? Educational Researcher, 41(1), 16-25. doi:10.3102/0013189X11428813
Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5-8.