Title: Concept-Level Design Analytics for Blended Courses
Although many efforts are being made to provide educators with dashboards and tools to understand student behaviors within specific technological environments (learning analytics), there is a lack of work supporting educators in making data-informed design decisions when designing a blended course and planning learning activities. In this paper, we introduce concept-level design analytics, a knowledge-based visualization that uncovers facets of the learning activities being authored. The visualization is integrated into a (blended) learning design authoring tool, edCrumble. This new approach is explored in the context of a higher education programming course, where teaching assistants design labs and home practice sessions with online smart learning content on a weekly basis. We performed a within-subjects user study to compare the use of the design tool both with and without the visualization. We studied the differences in terms of cognitive load, design outcomes, and user actions within the system, comparing both conditions with the objective of evaluating the impact of using design analytics during the decision-making phase of course design.
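As an illustration of the within-subjects comparison described above (not code from the paper), the following Python sketch runs a paired analysis on hypothetical per-participant cognitive-load ratings for the two conditions; the variable names, rating scale, and values are assumptions made for illustration only.

import numpy as np
from scipy import stats

# Hypothetical mental-demand ratings (0-100) for the same participants under both conditions.
with_visualization = np.array([35, 40, 30, 45, 38, 42, 33, 37])
without_visualization = np.array([55, 48, 52, 60, 47, 58, 50, 49])

# Paired t-test: each participant serves as their own control.
t_stat, p_value = stats.ttest_rel(with_visualization, without_visualization)

# Non-parametric alternative if the paired differences are not normally distributed.
w_stat, p_wilcoxon = stats.wilcoxon(with_visualization, without_visualization)

print(f"paired t-test: t={t_stat:.2f}, p={p_value:.4f}")
print(f"Wilcoxon signed-rank: W={w_stat:.1f}, p={p_wilcoxon:.4f}")

A paired analysis like this exploits the within-subjects design: because every participant works under both conditions, individual differences are controlled for rather than added to the error term.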
Award ID(s):
1740775
PAR ID:
10191769
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
Proceedings of 14th European Conference on Technology Enhanced Learning (EC-TEL 2019)
Page Range / eLocation ID:
541–554
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Over the last 10 years, learning analytics have provided educators with both dashboards and tools to understand student behaviors within specific technological environments. However, there is a lack of work to support educators in making data-informed design decisions when designing a blended course and planning appropriate learning activities. In this paper, we introduce knowledge-based design analytics that uncover facets of the learning activities that are being created. A knowledge-based visualization is integrated into edCrumble, a (blended) learning design authoring tool. This new approach is explored in the context of a higher education programming course, where instructors design labs and home practice sessions with online smart learning content on a weekly basis. We performed a within-subjects user study to compare the use of the design tool both with and without visualization. We studied the differences in terms of cognitive load, controllability, confidence and ease of choice, design outcomes, and user actions within the system to compare both conditions with the objective of evaluating the impact of using design analytics during the decision-making phase of course design. Our results indicate that the use of a knowledge-based visualization allows the teachers to reduce the cognitive load (especially in terms of mental demand) and that it facilitates the choice of the most appropriate activities without affecting the overall design time. In conclusion, the use of knowledge-based design analytics improves the overall learning design quality and helps teachers avoid committing design errors.
  2. Currently, there is no formal taxonomy for the activities that users engage in when interacting with and making meaning from spatio-temporal game data visualizations. As data visualization, especially spatio-temporal visualization, becomes more popular for game data analytics, it becomes increasingly crucial that we develop a formal understanding of how users, especially players, interact with and extract meaning from game data using these systems. However, existing taxonomies developed for InfoVis are not directly applicable due to domain differences and a lack of consensus within the literature. This paper presents the beginnings of a taxonomy for user interaction with spatio-temporal data specific to the domain of games, developed from the results of a qualitative user study (n=7) in which experienced players were tasked with using a spatio-temporal visualization system to explore and understand telemetry data from Defense of the Ancients 2 (DotA 2). The taxonomy includes seven activities organized into three categories: Data Interaction, Sense Making, and Validation. We discuss the implications of these activities on design and future research.
  3. Computational analysis methods and machine learning techniques introduce innovative ways to capture classroom interactions and display data on analytics dashboards. Automated classroom analytics employ advanced data analysis, providing educators with comprehensive insights into student participation, engagement, and behavioral trends within classroom settings. Through the provision of context-sensitive feedback, automated classroom analytics systems can be integrated into the evidence-based pedagogical decision-making and reflective practice processes of faculty members in higher education institutions. This paper presents TEACHActive, an automated classroom analytics system, by detailing its design and implementation. It outlines the processes of stakeholder engagement and mapping, elucidates the benefits and obstacles associated with a comprehensive classroom analytics system design, and concludes by discussing significant implications. These implications propose user-centric design approaches for higher education researchers and practitioners to consider.
  4. In educational research, user-simulation interaction is gaining importance as it provides key insights into the effectiveness of simulation-based learning and immersive technologies. A common approach to study user-simulation interaction involves manually analyzing participant interaction in real-time or via video recordings, which is a tedious process. Surveys/questionnaires are also commonly used but are open to subjectivity and only provide qualitative data. The tool proposed in this paper, which we call Environmental Detection for User-Simulation Interaction Measurement (EDUSIM), is a publicly available video analytics tool that receives screen-recorded video input from participants interacting with a simulated environment and outputs statistical data related to time spent in pre-defined areas of interest within the simulation model. The proposed tool utilizes machine learning, namely multi-classification Convolutional Neural Networks, to provide an efficient, automated process for extracting such navigation data. EDUSIM also implements a binary classification model to flag imperfect input video data such as video frames that are outside the specified simulation environment. To assess the efficacy of the tool, we implement a set of immersive simulation-based learning (ISBL) modules in an undergraduate database course, where learners record their screens as they interact with a simulation to complete their ISBL assignments. We then use the EDUSIM tool to analyze the videos collected and compare the tool’s outputs with the expected results obtained by manually analyzing the videos. 
  5. This paper presents an experience report on using an interactive program visualization tool — Dynamic, Interactive Stack-Smashing Attack Visualization (DISSAV) — and a complementary active-learning exercise to teach stack smashing, a key software security attack. The visualization tool and active-learning exercise work synergistically to guide the student through challenging, abstract concepts in the advanced cybersecurity area. DISSAV and the exercise are deployed within the software security module of an undergraduate cybersecurity course that introduces a broad range of security topics. A study is designed that collects and evaluates student perceptions on the user interface of DISSAV and the effectiveness of the two resources in improving student learning and engagement. The study finds that over 80% of responses to user interface questions, 66% of responses to student learning questions and 64% of responses to student engagement questions are positive, suggesting that the resources improve student learning and engagement in general. The study does not find discernible patterns of difference in responses from students of different ages and varying levels of prior experience with stack smashing attacks, program visualization tools and C programming. 
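Item 4 above (EDUSIM) describes classifying screen-recorded frames into pre-defined areas of interest with a multi-class convolutional neural network, plus a binary model that flags frames captured outside the simulation environment. The sketch below is a minimal illustration of that idea, not EDUSIM's actual implementation; the architecture, frame size, and area labels are assumptions.

import torch
import torch.nn as nn

AREAS_OF_INTEREST = ["area_a", "area_b", "area_c", "area_d"]  # hypothetical region labels

class FrameClassifier(nn.Module):
    # Small CNN that assigns each RGB video frame to one of num_classes labels.
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One binary model flags frames recorded outside the simulation environment;
# a second multi-class model maps the remaining frames to areas of interest.
in_environment = FrameClassifier(num_classes=2)
area_classifier = FrameClassifier(num_classes=len(AREAS_OF_INTEREST))

frames = torch.randn(8, 3, 128, 128)                   # a hypothetical batch of frames
valid = in_environment(frames).argmax(dim=1) == 0      # keep frames judged "inside"
areas = area_classifier(frames[valid]).argmax(dim=1)   # predicted area per valid frame

Counting how many consecutive frames fall into each area, multiplied by the frame interval, would then approximate the time-in-area statistics that the tool is described as producing.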