Title: It’s About Time: Gaining Insights from Turnaround Time Metrics
Rutgers University is an R1 research institution with $929M in sponsored research funding in FY 2023 and $747M in research expenditures in 2022, per the HERD survey. Since forming in 2019, the Data, Analytics and Business Intelligence team within the Office for Research has grown to three members, all with the mission of providing accurate and timely data about the Rutgers research enterprise to internal leadership and decision makers across the university.

Office for Research leadership was receiving complaints from the field that sponsored research awards were taking too long to set up. The Data, Analytics and Business Intelligence team was tasked with finding out:
1. Is this a common issue?
2. Is this a localized issue?
3. What is causing it?

Our team determined that by calculating the start and end of each process, and by categorizing the responsible party within those processes, we could characterize what an average user experiences and then dig deeper into the outliers. We involved IT to extract the data from the award setup system into the data warehouse and, ultimately, into a Tableau data source. We also met with the business team to better understand the data in their system, define how their processes should be categorized, and determine whether any of their processes needed to change to generate the necessary data. The final deliverable was a published Tableau dashboard, accessible to leadership and the business unit, that shows turnaround time trends broken down in various ways, with the option to drill down into outliers without leaving the dashboard. Since then, we have replicated this method across other Office for Research business units, including the teams managing IRB and IACUC protocols.

Our presentation would include:
1. The problem that was presented to us
2. The process we followed
3. Samples of the dashboards we created
4. The outcome and lessons learned
5. Guiding questions on how other schools can use this method and replicate it across multiple systems within their research offices
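The core of the method described above is simple to express in code: compute the elapsed time between each process step's start and end, summarize it by process and responsible party, and surface the records that fall far outside the typical range. The pandas sketch below illustrates that general pattern only; it is not the Rutgers implementation, and the column names (process_step, responsible_party, step_start, step_end) and the percentile cutoff are illustrative assumptions.

```python
# Hypothetical sketch of a turnaround-time calculation, not the actual
# Rutgers award-setup pipeline. Assumes a dataframe with made-up columns:
# process_step, responsible_party, step_start, step_end (datetime dtype).
import pandas as pd


def turnaround_summary(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize turnaround time (in days) by process step and responsible party."""
    df = df.copy()
    df["turnaround_days"] = (df["step_end"] - df["step_start"]).dt.days
    return (
        df.groupby(["process_step", "responsible_party"])["turnaround_days"]
          .agg(median="median", mean="mean",
               p90=lambda s: s.quantile(0.9), n="count")
          .reset_index()
    )


def flag_outliers(df: pd.DataFrame, quantile: float = 0.95) -> pd.DataFrame:
    """Return records whose turnaround time exceeds the chosen quantile
    within their process step, so analysts can drill into them."""
    df = df.copy()
    df["turnaround_days"] = (df["step_end"] - df["step_start"]).dt.days
    cutoffs = df.groupby("process_step")["turnaround_days"].transform(
        lambda s: s.quantile(quantile)
    )
    return df[df["turnaround_days"] > cutoffs]
```

In practice, equivalent logic could also live in the warehouse ETL or in calculated fields of the Tableau data source; the sketch simply shows the summary-plus-drilldown split the dashboard relies on.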
Award ID(s):
2324388
PAR ID:
10563697
Author(s) / Creator(s):
Publisher / Repository:
University of Kentucky Libraries
Date Published:
Subject(s) / Keyword(s):
FOS: Computer and information sciences
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Central administration at Rutgers University requested help from the Office for Research to develop a proof-of-concept tool to count collaborations between faculty, departments, schools and chancellor units across the university. This presentation will discuss how the Data, Analytics and Business Intelligence team focused on research administration and research output data to define a “collaboration,” structure available data, create interactive visualizations and allow end users to customize the level of detail displayed. This presentation will begin by discussing the importance of generating reliable collaboration data in a university setting including use cases for the data. It will then describe the dataset that was used for the proof-of-concept project and discuss why these sources were important. Finally, the presentation will discuss future development goals and collaborators for the collaboration project. 
  2. The use of data and analytics in the field of research administration remains in the exploration stage for many, if not most, higher education institutions. That is evidenced not only by the great demand and attendance for such sessions at NCURA’s Annual Meetings in 2022 and 2023, but also by prior research. This presentation will highlight efforts to implement data-informed decision making based on research administration metrics, analytics, and dashboard examples. Emphasis will be placed on the principle that you can’t manage what you can’t measure. The data analytics team at Emory supports the full vision of our research administration leadership. Challenges, pain points, and lessons learned will be shared. Reasons for collecting data and metrics will be shown. These include improving operational efficiencies and stakeholder satisfaction (e.g., faculty), as well as providing analytical insights to decision makers. The benefits of such initiatives will also be illustrated with examples of successfully implemented metrics and analytics, including dashboards.
  3. As proposal, subaward, award, and agreement volumes continue to grow at your institution, how can you strategize to support the size of your research infrastructure? How can you justify the expansion of your teams and talent? How do you adjust roles and workload to align with the growing portfolio? What technology considerations should you make to track this? This session will cover how to leverage data to make effective business decisions regarding resource needs and allocation methodology to meet growing demands, starting with your internal data and then looking externally. We will cover strategies for using data analytics to efficiently manage the size of your research enterprise and portfolios, as well as to measure the performance of pre-award and post-award functions across the grant lifecycle. Participants will learn how to analyze portfolios beyond sheer volume and evaluate the various criteria that can be objectively assessed to balance a sponsored programs portfolio. We will review common reporting and analytics tools and provide examples of critical data points to consider at both the central office and department level. Presented at the 2024 Research Analytics Summit in Albuquerque, NM.
  4. There is growing interest in the research and use of automated feedback dashboards that display classroom analytics, yet little is known about the detailed processes instructors use to make sense of these tools and to determine their impact on teaching practices. This research was conducted at a public Midwestern university within the context of an automated classroom observation and feedback implementation project. Fifteen engineering instructors engaged in this research. The overarching goal was to investigate instructor teaching beliefs, pedagogical practices, and sensemaking processes regarding dashboard use. A grounded theory approach was used to identify categories related to instructor perceptions. Results revealed that instructor experiences inform both their present use of the dashboard and consequential future actions. A model is presented that illustrates categories included in instructor pre-use, use, and post-use of an automated feedback dashboard. An extension to this model is presented and accompanied by recommendations for more effective future use of automated dashboards. The model’s practical implications inform both instructors and designers on effective design and use of dashboards, ultimately paving the way to improved pedagogical practices and instruction.
  5. It is difficult for instructors, and even students themselves, to become aware in real-time of inequitable behaviors occurring on student teams. Here, we explored a potential measure for inequitable teamwork drawing on data from a digital pedagogical tool designed to surface and disrupt such team behaviors. Students in a large, undergraduate business course completed seven surveys about team health (called team checks) at regular intervals throughout the term, providing information about team dynamics, contributions, and processes. The ways in which changes in students’ scores from team check to team check compared to the median changes for their team were used to identify the proportions of teams with outlier student scores. The results show that for every team size and team check item, the proportion of teams with outliers at the end of the term was smaller than at the beginning of the semester, indicating stabilization in how teammates evaluated their team experiences. In all but two cases, outlying students were not disproportionately likely to identify with historically marginalized groups based on gender or race/ethnicity. Thus, we did not broadly identify teamwork inequities in this specific context, but the method provides a basis for future studies about inequitable team behavior. 
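The comparison this abstract describes, each student's change in team-check score measured against the median change for their team, can be sketched compactly. The snippet below is a hypothetical illustration, not the authors' code; the column names (team_id, student_id, check_number, score) and the deviation threshold are assumptions made for the example.

```python
# Illustrative sketch only: flag students whose change in team-check score
# between consecutive check-ins deviates sharply from their team's median
# change. Column names and the threshold are assumed, not from the study.
import pandas as pd


def outlier_students(scores: pd.DataFrame, threshold: float = 1.0) -> pd.DataFrame:
    """Return rows where a student's score change differs from the team's
    median change by more than `threshold` points for that check-in."""
    scores = scores.sort_values(["team_id", "student_id", "check_number"]).copy()
    # Change in each student's score from the previous team check.
    scores["delta"] = scores.groupby(["team_id", "student_id"])["score"].diff()
    scores = scores.dropna(subset=["delta"])
    # Median change across the whole team for the same check-in.
    scores["team_median_delta"] = scores.groupby(["team_id", "check_number"])["delta"].transform("median")
    scores["deviation"] = (scores["delta"] - scores["team_median_delta"]).abs()
    return scores[scores["deviation"] > threshold]
```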