Title: Beyond Bot Detection: Combating Fraudulent Online Survey Takers
Different techniques have been recommended to detect fraudulent responses in online surveys, but little research has systematically tested the extent to which they actually work in practice. In this paper, we conduct an empirical evaluation of 22 anti-fraud tests in two complementary online surveys. The first survey recruits Rust programmers on public online forums and social media networks. We find that fraudulent respondents exhibit both bot and human characteristics. Among the anti-fraud tests, those designed based on domain knowledge are the most effective. By combining individual tests, we can achieve detection performance as good as that of commercial techniques while making the results more explainable. To explore these tests in a broader context, we ran a different survey on Amazon Mechanical Turk (MTurk). The results show that for a generic survey that does not require users to have any domain knowledge, it is more difficult to distinguish fraudulent responses. However, a subset of tests still remains effective.
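The idea of combining individual anti-fraud tests into an explainable decision can be sketched as follows. This is a minimal illustration, not the paper's actual tests: the check names, thresholds, and the majority-vote rule here are all hypothetical.

```python
# Illustrative sketch of combining simple anti-fraud checks into one
# explainable flag. All check names, thresholds, and answers below are
# hypothetical, not the 22 tests evaluated in the paper.

def attention_check(resp):
    # Instructed-response item, e.g. "Select 'Strongly agree' here."
    return resp.get("attention") == "strongly_agree"

def completion_time_check(resp, min_seconds=120):
    # Implausibly fast completion suggests a bot or careless respondent.
    return resp.get("duration_s", 0) >= min_seconds

def domain_knowledge_check(resp):
    # Domain-specific question a genuine Rust programmer should answer,
    # e.g. which keyword introduces a variable binding ("let").
    return resp.get("rust_q") == "let"

CHECKS = [attention_check, completion_time_check, domain_knowledge_check]

def fraud_score(resp):
    """Return (failed_count, failed_names); the names make the flag explainable."""
    failed = [c.__name__ for c in CHECKS if not c(resp)]
    return len(failed), failed

resp = {"attention": "strongly_agree", "duration_s": 45, "rust_q": "mut"}
n, names = fraud_score(resp)
is_fraudulent = n >= 2   # flag if a majority of checks fail
```

Unlike a commercial black-box score, the list of failed checks tells the researcher why a response was flagged, which is the explainability benefit the abstract describes.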
Award ID(s):
1955965 2030521
NSF-PAR ID:
10321051
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the Web Conference 2022
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
1. The landscapes of many elementary, middle, and high school math classrooms have undergone major transformations over the last half-century, moving from drill-and-skill work to more conceptual reasoning and hands-on manipulative work. However, if you look at a college-level calculus class, you are likely to find that the main difference is the professor now has a whiteboard marker in hand rather than a piece of chalk. Some student work may be done on the computer, but much of it consists of the same type of repetitive skill-building problems. This should seem strange given the advancements in technology that allow more freedom than ever to build connections between different representations of a concept. Several class activities have been developed using a combination of approaches, depending on the topic. Topics covered in the activities include Riemann Sums, Accumulation, Center of Mass, Volumes of Revolution (Discs, Washers, and Shells), and Volumes of Similar Cross-section. All activities use student note outlines that are done either in a whole-group interactive-lecture approach or in a group-work inquiry-based approach. Some of the activities use interactive graphs designed on desmos.com, and others use physical models that have been designed in OpenSCAD and 3D-printed for students to use in class. Tactile objects were developed because they should provide an advantage to students by enabling them to physically interact with the concepts being taught, deepening their involvement with the material, and providing more stimuli for the brain to encode the learning experience. Web-based activities were developed because the topics involved needed substantial changes in graphical representations (e.g., limits with Riemann Sums). Assessment techniques for each topic include online homework, exams, and online concept questions with an explanation response area.
These concept questions are intended to measure students' ability to use multiple representations to answer the question, and are not generally computational in nature. Students are also given surveys to rate the overall activities, as well as finer-grained survey questions to elicit student thoughts on certain aspects of the models, websites, and activity sheets. We will report on student responses to the activity surveys, looking for common themes in students' thoughts toward specific attributes of the activities. We will also compare relevant exam question responses and online concept question results, including common themes present or absent in student reasoning.
  2. This research paper reports the in-progress validation of a quantitative instrument designed to assess the perceived impact of participating in a National Science Foundation (NSF)-funded Engineering Research Center (ERC). A multi-institutional consortium composed of ERC education directors, researchers, and evaluators from six NSF-funded ERCs designed easily accessible evaluation instruments and tools that specifically help measure anticipated outcomes for ERC participants for all ERCs. The total effort underway by the consortium includes creating a suite of qualitative and quantitative instruments, an evaluator toolkit, and a user-friendly online platform to host the inventory materials. This paper focuses on the quantitative instrument created to evaluate the experiences of those who engage with a center. It consists of Likert-type questions assessing the impact of the ERC on participants' self-reported: 1) understanding of the ERC, 2) research and communication skills, 3) climate of inclusion, 4) mentorship experiences, and 5) program satisfaction. The instrument also included additional demographic questions and questions to capture STEM-related future plans. The instrument was designed using multiple rounds of design iterations and pilot tests. Separate surveys used by individual ERCs were compiled and categorized to ensure all requirements from the National Science Foundation were met. The web-based survey was administered to six ERCs during the Summer of 2021, Fall of 2021, and Spring of 2022. A total of 549 responses were collected; 535 were used following data cleaning procedures. Sample sizes for each component of the survey varied because some ERCs chose to only use some parts of the new instrument. Exploratory Factor Analyses (EFA) were performed to identify latent factors and items that needed further revision. 
The following factors emerged from our analyses: 1) general understanding of the ERC; 2) development of research skills; 3) development of professional skills; 4) experience in the ERC; 5) feelings toward the ERC; 6) beliefs about the ERC; 7) mentor performance; and 8) mentorship experience. The results provide preliminary evidence that the survey can be used across ERCs. This effort is the first that has been undertaken to develop a shared ERC instrument. The data collected were used to continue the in-progress validation. The collaborative nature of this effort can provide ways for ERCs to benchmark the impacts of their efforts and share effective practices across ERCs and other similarly structured STEM centers going forward.
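The factor-extraction step behind an analysis like the EFA described above can be sketched with the principal-component method: eigendecomposition of the item correlation matrix, retaining factors with eigenvalues above 1 (the Kaiser criterion). This is a generic illustration on synthetic Likert-type data, not the instrument's actual items or the package the authors used; real EFAs typically also apply rotation.

```python
import numpy as np

# Hypothetical sketch of factor extraction for Likert-type survey data.
# The item count, factor structure, and responses are synthetic; only the
# sample size (535 after cleaning) mirrors the abstract.
rng = np.random.default_rng(0)
n_resp, n_items = 535, 8

# Two latent factors, each loading on a disjoint block of four items.
latent = rng.normal(size=(n_resp, 2))
true_loadings = rng.uniform(0.5, 1.0, size=(2, n_items))
true_loadings[0, 4:] = 0.0   # items 0-3 load on factor 1
true_loadings[1, :4] = 0.0   # items 4-7 load on factor 2
X = latent @ true_loadings + rng.normal(scale=0.5, size=(n_resp, n_items))
X = np.clip(np.round(2.5 + X), 1, 5)      # coarse 1-5 Likert scale

R = np.corrcoef(X, rowvar=False)          # 8x8 item correlation matrix
eigvals, eigvecs = np.linalg.eigh(R)      # ascending order from eigh
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_factors = int(np.sum(eigvals > 1.0))    # Kaiser criterion
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
```

Items that load strongly on the same retained factor are candidates for one scale; items with weak or cross-loadings are the ones flagged for revision, which is how EFA identifies "items that needed further revision."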
3. Objective Over the past decade, we developed and studied a face-to-face video-based analysis-of-practice professional development (PD) model. In a cluster randomized trial, we found that the face-to-face model enhanced elementary science teacher knowledge and practice and resulted in important improvements to student science achievement (student treatment effect, d = 0.52; Taylor et al., 2017; Roth et al., 2018). The face-to-face PD model is expensive and difficult to scale. In this paper, we present the results of a two-year design-based research study to translate the face-to-face PD into a facilitated online PD experience. The purpose is to create an effective, flexible, and cost-efficient PD model that will reach a broader audience of teachers. Perspective/Theoretical Framework The face-to-face PD model is grounded in situated cognition and cognitive apprenticeship frameworks. Teachers engage in learning science content and effective science teaching practices in the context in which they will be teaching. There are scaffolded opportunities for teachers to learn from analysis of model videos by experienced teachers, to try teaching model units, to analyze video of their own teaching efforts, and ultimately to develop their own unit, with guidance. The PD model attends to the key features of effective PD as described by Desimone (2009) and others. We adhered closely to the design principles of the face-to-face model as described by Authors, 2019. Methods We followed a design-based research approach (DBR; Cobb et al., 2003; Shavelson et al., 2003) to examine the online program components and how they promoted or interfered with the development of teachers' knowledge and reflective practice. Of central interest was the examination of mechanisms for facilitating teacher learning (Confrey, 2006).
To accomplish this goal, design researchers engaged in iterative cycles of problem analysis, design, implementation, examination, and redesign (Wang & Hannafin, 2005) in phase one of the project before studying its effect. Data Three small pilot groups of teachers engaged in both synchronous and asynchronous components of the larger online course, which begins with a 10-week summer course that leads into study groups of participants meeting through one academic year. We iteratively designed, tested, and revised 17 modules across three pilot versions. On average, pilot groups completed one module every two weeks. Pilot 1 began the work in May 2019; Pilot 2 began in August 2019; and Pilot 3 began in October 2019. Pilot teachers responded to surveys and took part in interviews related to the PD. The PD facilitators took extensive notes after each iteration. The development team met weekly to discuss revisions. We revised all modules between each pilot group and used what we learned to inform our development of later modules within each pilot. For example, we applied what we learned from testing Module 3 with Pilot 1 to the development of Module 3 for Pilot 2, and also applied what we learned from Module 3 with Pilot 1 to the development of Module 7 for Pilot 1. Results We found that community building required the same incremental trust-building activities that occur in face-to-face PD. Teachers began with low-risk activities and gradually engaged in activities that required greater vulnerability (sharing a video of themselves teaching a model unit for analysis and critique by the group). We also identified how to contextualize technical tools with instructional prompts to allow teachers to productively interact with one another about science ideas asynchronously. As part of that effort, we crafted crux questions to surface teachers' confusions or challenges related to content or pedagogy.
We called them crux questions because they revealed teachers' uncertainty and deepened learning during the discussion. Facilitators leveraged asynchronous responses to crux questions in the synchronous sessions to push teacher thinking further than would otherwise have been possible in a 2-hour synchronous video-conference. Significance Supporting teachers with effective, flexible, and cost-efficient PD is difficult under the best of circumstances. In the era of COVID-19, online PD has taken on new urgency. NARST members will gain insight into the translation of an effective face-to-face PD model to an online environment.
4. This evidence-based practice paper describes a formative response to the needs of engineering faculty and students at Anonymous University. Within two weeks, the pandemic forced the vast majority of the 1.5 million faculty and 20 million students nationwide to transition all courses from face-to-face to entirely online. Never in the history of higher education has there been a concerted effort to adapt so quickly and radically, nor have we had the technology to facilitate such a rapid and massive change. At Anonymous University, over 700 engineering educators were racing to transition their courses. Many of those faculty had never experienced online course preparation, much less taught a course synchronously or asynchronously. Faculty development centers and technology specialists across the university made a great effort to aid educators in this transition. These educators had questions about the best practices for moving online, how their students were affected, and the best ways to engage their students. However, these faculty's detailed questions were answerable only by faculty peers' experience, students' feedback, and advice from experts in relevant engineering education research-based practices. This paper describes the rapid, continuous, and formative feedback provided by the Engineering Education Faculty Group (EEFG) as an immediate response for peer faculty guidance during the pandemic, creating a community of practice. The faculty membership spans multiple colleges in the university, including engineering, education, and liberal arts. The EEFG transitioned immediately to weekly meetings focused on the rapidly changing needs of their colleagues. Two surveys were generated rapidly by Hammond et al. to characterize student and faculty concerns and needs in March of 2020 and were distributed through various means and media.
Surveys 1 and 2 had 3,381 and 1,506 respondents, respectively, most of them students; the 113 faculty respondents to Survey 1 are the focus of this work. The first survey was disseminated as aggregated data to the College of Engineering faculty, with suggested modifications to course structures based on these findings. The EEFG continued to meet and collaborate during the remainder of the Spring 2020 semester and has continued to this day. This group has acted as a hub for teaching innovation in remote online pedagogy and techniques, while also operating as a support structure for members of the group, aiding those members with training in teaching tools, discussing difficult current events, and addressing various challenges they are facing in their professional teaching lives. While the aggregated data gathered from the surveys developed by Hammond et al. were useful beyond measure in the early weeks of the pandemic, little attention at the time was given to the faculty responses to that survey. The focus of this work has been to characterize faculty perceptions at the beginning of the pandemic and compare those responses between engineering and non-engineering faculty respondents, while also comparing reported perceptions pre- and post-transition to remote online teaching. Interviews were conducted with four members of the EEFG with the goal of characterizing some of the experiences they had as members of the group during the pandemic, using grounded theory qualitative analysis.
5. Objective Over the past decade, we developed and studied a face-to-face video-based analysis-of-practice PD model. In a cluster randomized trial, we found that the face-to-face model enhanced elementary science teacher knowledge and practice, and resulted in important improvements to student science achievement (student treatment effect, d = 0.52; Taylor et al., 2017; Roth et al., 2018). The face-to-face PD model is expensive and difficult to scale. In this poster, we present the results of a two-year design-based research study to translate the face-to-face PD into a facilitated online PD experience. The purpose is to create an effective, flexible, and cost-efficient PD model that will reach a broader audience of teachers. Perspective/Theoretical Framework The face-to-face PD model is grounded in situated cognition and cognitive apprenticeship frameworks. Teachers engage in learning science content and practices in the context in which they will be teaching. In addition, there are scaffolded opportunities for teachers to learn from model videos by experienced teachers, try model units, and ultimately develop their own unit, with guidance. The PD model also attends to the key features of effective PD as described by Desimone (2009) and others. We adhered closely to the design principles of the face-to-face model as described by Roth et al., 2018. Methods We followed a design-based research approach (DBR; Cobb et al., 2003; Shavelson et al., 2003) to examine the online program components and how they promoted or interfered with the development of teachers' knowledge and reflective practice. Of central interest was the examination of mechanisms for facilitating teacher learning (Confrey, 2006). To accomplish this goal, design researchers engaged in iterative cycles of problem analysis, design, implementation, examination, and redesign (Wang & Hannafin, 2005). Data We iteratively designed, tested, and revised 17 modules across three pilot versions.
Three small groups of teachers engaged in both synchronous and asynchronous components of the larger online course. They responded to surveys and took part in interviews related to the PD. The PD facilitators took extensive notes after each iteration. The development team met weekly to discuss revisions. Results We found that community building required the same incremental trust-building activities that occur in face-to-face PD. Teachers began with low-risk activities and gradually engaged in activities that required greater vulnerability (sharing a video of themselves teaching a model unit for analysis and critique by the group). We also identified how to contextualize technical tools with instructional prompts to allow teachers to productively interact with one another about science ideas asynchronously. As part of that effort, we crafted crux questions to surface teachers' confusions or challenges related to content or pedagogy. Facilitators leveraged asynchronous responses to crux questions in the synchronous sessions to push teacher thinking further than would otherwise have been possible in a 2-hour synchronous video-conference. Significance Supporting teachers with effective, flexible, and cost-efficient PD is difficult under the best of circumstances. In the era of COVID-19, online PD has taken on new urgency. AERA members will gain insight into the construction of an online PD for elementary science teachers. The full digital poster is available at: https://aera21-aera.ipostersessions.com/default.aspx?s=64-5F-86-2E-15-F8-C3-C0-45-C6-A0-B7-1D-90-BE-46