Title: Learn With Martian: A Tool For Creating Assignments That Can Write And Re-Write Themselves
In this paper, we propose Learn, a unified, easy-to-use tool for applying question generation and selection in classrooms. The tool lets instructors and TAs create assignments that can write and re-write themselves. Given existing course materials, for example a reference textbook, Learn can generate questions, select the highest-quality questions, show the questions to students, adapt question difficulty to student knowledge, and generate new questions based on how effectively old questions help students learn. The modular, composable nature of the tools for handling each sub-task allows instructors to use only the parts of the tool necessary to their course, enabling integration into a large number of courses with varied teaching styles. We also report on the adoption of the tool in classes at the University of Pennsylvania with over 1000 students. Learn is publicly released at https://learn.withmartian.com.
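The generate, select, and adapt steps the abstract describes can be sketched roughly as follows. This is a minimal illustration only: the function names, the length-based quality heuristic, and the mastery threshold are our own stand-ins, not the actual Learn API.

```python
def generate_questions(passage, n=5):
    """Stand-in generator: turn each sentence into a simple recall prompt."""
    sentences = [s.strip() for s in passage.split(".") if s.strip()]
    return [f"What does the text say about '{s}'?" for s in sentences[:n]]

def select_best(questions, k=2):
    """Stand-in quality filter: prefer longer (more specific) prompts."""
    return sorted(questions, key=len, reverse=True)[:k]

def adapt_difficulty(question, student_mastery):
    """Attach a hint for students below a mastery threshold."""
    if student_mastery < 0.5:
        return question + " (Hint: re-read the relevant paragraph.)"
    return question

passage = ("Newton's first law describes inertia. "
           "Force equals mass times acceleration.")
pool = generate_questions(passage)
chosen = [adapt_difficulty(q, student_mastery=0.3) for q in select_best(pool)]
print(chosen)
```

In a real deployment, the generation and selection steps would be driven by trained models and by data on how well past questions helped students learn; the point here is only the modular shape, where each stage can be swapped out or skipped per course.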
Award ID(s):
1928474
PAR ID:
10463297
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Page Range / eLocation ID:
267 to 276
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like This
  1.
    This hands-on online workshop will introduce high school and college instructors to CSAwesome, a free Java curriculum and ebook at course.csawesome.org for the Advanced Placement (AP) Computer Science (CS) A course. This course is equivalent to a college-level CS1 course in Java. CSAwesome is an official College Board approved curriculum and professional development provider and has been widely adopted by AP high school teachers. The free ebook on the Runestone platform includes executable Java code examples and a variety of practice problems with immediate feedback: multiple-choice, fill-in-the-blank, write-code, mixed-up code (Parsons), and clickable code. It also includes coding challenges and support for pair programming. The curriculum is designed to help transition students from AP Computer Science Principles, which is equivalent to a CS0 course. Teacher lesson plans and resources are freely available. During this workshop, participants will register for the free ebook and work through example activities using object-oriented programming. If possible, participants will be divided into breakout groups according to their Java expertise. Participants will also learn how to create a custom course on the Runestone platform, create and grade assignments, use the instructor's dashboard to view student progress, contribute to the question bank, and use an interleaved spaced practice tool. We will also discuss online/hybrid teaching and engagement strategies. 
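A mixed-up-code (Parsons) exercise of the kind the ebook offers can be sketched simply: the solution's lines are shuffled, the student reorders them, and the ordering is checked against the original. This is an illustration of the idea only, not the Runestone platform's implementation; the Java snippet inside is a made-up example problem.

```python
import random

# The correct solution, line by line (a hypothetical AP CS A style problem).
SOLUTION = [
    "public static int sumTo(int n) {",
    "    int total = 0;",
    "    for (int i = 1; i <= n; i++) {",
    "        total += i;",
    "    }",
    "    return total;",
    "}",
]

def make_parsons(lines, seed=42):
    """Return a shuffled copy of the solution lines to show the student."""
    shuffled = lines[:]
    random.Random(seed).shuffle(shuffled)
    return shuffled

def check_answer(student_order, solution=SOLUTION):
    """The student passes only by restoring the exact original order."""
    return student_order == solution

puzzle = make_parsons(SOLUTION)
print("\n".join(puzzle))        # the scrambled lines shown to the student
print(check_answer(SOLUTION))   # the correct ordering passes
```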
  2. The ability to identify one’s own confusion and to ask a question that resolves it is an essential metacognitive skill that supports self-regulation (Winne, 2005). Yet, while students receive substantial training in how to answer questions, little classroom time is spent training students how to ask good questions. Past research has shown that students are able to pose more high-quality questions after being instructed in a taxonomy for classifying the quality of their questions (Marbach‐Ad & Sokolove, 2000). As pilot data collection in preparation for a larger study funded through NSF-DUE, we provided engineering statics students training in writing high-quality questions to address their own confusions. The training emphasized the value of question-asking in learning and how to categorize questions using a simple taxonomy based on prior work (Harper et al., 2003). The taxonomy specifies five question levels: 1) an unspecific question, 2) a definition question, 3) a question about how to do something, 4) a why question, and 5) a question that extends knowledge to a new circumstance. At the end of each class period during a semester-long statics course, students were prompted to write and categorize a question that they believed would help them clarify their current point of greatest confusion. Through regular practice writing and categorizing such questions, we hoped to improve students' abilities to ask questions that require higher-level thinking. We collected data from 35 students in courses at two institutions. Over the course of the semester, students had the opportunity to write and categorize twenty of their own questions. After the semester, the faculty member categorized student questions using the taxonomy to assess the appropriateness of the taxonomy and whether students used it accurately. Analysis of the pilot data indicates three issues to be addressed: 1) Student compliance in writing and categorizing their questions varied. 2) Some students had difficulty correctly coding their questions using the taxonomy. 3) Some student questions could not be clearly characterized using the taxonomy, even for faculty raters. We will address each of these issues with appropriate refinements in our next round of data collection: 1) Students may have been overwhelmed with the request to write a question after each class period. In the future, we will require students to write and categorize at least one question per week, with more frequent questions encouraged. 2) To improve student use of the taxonomy in future data collection, students will receive more practice with the taxonomy when it is introduced and more feedback on their categorization of questions during the semester. 3) We are reformulating our taxonomy to accommodate questions that may straddle more than one category, such as a question about how to extend a mathematical operation to a new situation (which could be categorized as either a level 3 or 5). We are hopeful that these changes will improve accuracy and compliance, enabling us to use the intervention as a means to promote metacognitive regulation and measure changes as a result, which is the intent of the larger scope of the project.
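The five-level taxonomy described above can be made concrete with a rough keyword-based classifier. The rules below are our own illustration of how the levels are distinguished, not the coding scheme the study's raters used; as the abstract notes, real student questions can straddle categories, which simple rules like these cannot capture.

```python
def classify_question(question):
    """Heuristic mapping onto the five-level question taxonomy:
    1 unspecific, 2 definition, 3 how-to, 4 why, 5 extension to a new case.
    Keyword rules are illustrative only."""
    q = question.lower()
    if "what if" in q or "would it still" in q:
        return 5  # extends knowledge to a new circumstance
    if q.startswith("why"):
        return 4  # a why question
    if q.startswith("how") or "how do" in q:
        return 3  # how to do something
    if "what is" in q or "define" in q or "mean" in q:
        return 2  # a definition question
    return 1      # an unspecific question

print(classify_question("Why does the moment arm matter here?"))     # 4
print(classify_question("What is a couple moment?"))                 # 2
print(classify_question("What if the support were a pin instead?"))  # 5
print(classify_question("How do I sum moments about point A?"))      # 3
```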
  3. Addressing common student questions in introductory STEM courses early in the term is one way that instructors can ensure that their students have all been presented with information about how to succeed in their courses. However, categorizing student questions and identifying evidence-based resources to address student questions takes time, and instructors may not be able to easily collect and respond to student questions at the beginning of every course. To help faculty effectively anticipate and respond to student questions, we 1) administered surveys in multiple STEM courses to identify common student questions, 2) conducted a qualitative analysis to determine categories of student questions (e.g., what are best practices for studying, how can in- and out-of-course time be effectively used), and 3) collaboratively identified advice on how course instructors can answer these questions. Here, we share tips, evidence-based strategies, and resources from faculty that instructors can use to develop their own responses for students. We hope that educators can use these common student questions as a starting point to proactively address questions throughout the course and that the compiled resources will allow instructors to easily find materials that can be considered for their own courses.
  4. ChatGPT has been at the center of media coverage since its public release at the end of 2022. Given ChatGPT’s capacity for generating human-like text on a wide range of subjects, it is not surprising that educators, especially those who teach writing, have raised concerns regarding the implications of generative AI tools on issues of plagiarism and academic integrity. How do we navigate the already complex discourse around what constitutes plagiarism and how much assistance is acceptable within the bounds of academic integrity? As we contemplate these theoretical questions, a more practical approach is to assess what these tools can do to facilitate students’ learning of existing academic integrity codes. In this short piece, we share our exploratory interactions with ChatGPT relevant to issues of plagiarism and academic integrity, hoping to shed light on how writing instructors can use the tool to facilitate the teaching and learning of ethics in academic writing. 
  5.
    To defend against collaborative cheating in code-writing questions, instructors of courses with online, asynchronous exams can use the strategy of question variants: manually written questions, selected at random at exam time, that assess the same learning goal. To create these variants, instructors currently have to rely on intuition to balance two competing goals: variants must be different enough to defend against collaborative cheating, yet similar enough that students are assessed fairly. In this paper, we propose a data-driven investigation of these variants, which we apply to a dataset of three midterm exams from a large introductory programming course. Our results show that (1) observable inequalities in student performance exist between variants and (2) these differences are not limited to score. Our results also show that the information gathered from our data-driven investigation can be used to provide recommendations for improving the design of future variants.
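The setting of this last abstract can be sketched with synthetic data: each student is deterministically but randomly assigned one of several variants at exam time, and per-variant score distributions are then compared to surface the kind of performance inequality the study reports. The assignment scheme, the variant names, and the scores below are all made up for illustration; they are not the paper's data or method.

```python
import random
import statistics

def assign_variant(student_id, variants, exam_seed=2021):
    """Deterministic per-student random assignment, as at exam time."""
    rng = random.Random(exam_seed * 100003 + student_id)
    return rng.choice(variants)

variants = ["v1", "v2", "v3"]
scores = {v: [] for v in variants}
rng = random.Random(0)
for student in range(300):
    v = assign_variant(student, variants)
    # Synthetic scores: pretend v3 is systematically harder.
    base = 70 if v == "v3" else 85
    scores[v].append(min(100, max(0, rng.gauss(base, 10))))

# Compare per-variant means to flag possible unfairness between variants.
for v in variants:
    print(v, round(statistics.mean(scores[v]), 1), len(scores[v]))
```

In the synthetic data the gap between variants shows up immediately in the per-variant means; the paper's point is that such gaps also occur in real exam data, and in dimensions beyond raw score.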