

Search for: All records

Creators/Authors contains: "Lin, S."


  1. The association between student motivation and learning, and changes in motivation across a course, were evaluated for students enrolled in one-semester foundation-level inorganic chemistry courses at multiple postsecondary institutions across the United States. The Academic Motivation Scale for Chemistry (AMS-Chemistry) and the Foundations of Inorganic Chemistry American Chemical Society Exam (i.e., a content knowledge measure) were used in this study. Evidence of validity, reliability, and longitudinal measurement invariance for data obtained from the AMS-Chemistry instrument with this population was found using methodologies appropriate for ordinal, non-parametric data. Positive and significant associations between intrinsic motivation measures and academic performance corroborate theoretical and empirical investigations; however, the lack of pre/post changes in motivation suggests that motivation may be less malleable in courses primarily populated by chemistry majors. Implications for inorganic chemistry instructors include paths for incorporating engaging pedagogies known to promote intrinsic motivation and methods for incorporating affect measures into assessment practices. Implications for researchers include a need for more work that disaggregates chemistry majors when evaluating relationships between affect and learning and when making pre/post comparisons. Additionally, this work provides an example of how to implement more appropriate methods for treating data in studies using Likert-type responses and nested data (a minimal illustration of such ordinal methods appears after this list).
  2. Graph Neural Networks (GNNs) are based on repeated aggregations of information from nodes’ neighbors in a graph. However, because nodes share many neighbors, a naive implementation performs the same aggregations repeatedly and incurs significant computational overhead. Here we propose Hierarchically Aggregated computation Graphs (HAGs), a new GNN representation technique that explicitly avoids this redundancy by managing intermediate aggregation results hierarchically, eliminating repeated computations and unnecessary data transfers in GNN training and inference. HAGs perform the same computations and produce the same models and accuracy as traditional GNNs, but in much less time because the redundant work is removed. To identify redundant computations, we introduce an accurate cost function and use a novel search algorithm to find optimized HAGs. Experiments show that the HAG representation significantly outperforms the standard GNN representation, increasing end-to-end training throughput by up to 2.8× and reducing the aggregations and data transfers in GNN training by up to 6.3× and 5.6×, respectively, with only 0.1% memory overhead. Overall, our results represent an important advance in speeding up and scaling up GNNs without any loss in model predictive performance (a sketch of the shared-aggregation idea appears after this list).
  3. Free, publicly accessible full text available June 1, 2024
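
The first abstract above highlights methods appropriate for ordinal, non-parametric Likert-type data. As a minimal, hypothetical sketch of what such treatment can look like (the names and numbers below are invented for illustration and are not study data), rank-based statistics such as a Spearman correlation can assess the motivation/performance association, and a Wilcoxon signed-rank test can assess pre/post change:

```python
# Minimal sketch of ordinal-appropriate analysis for Likert-type data.
# All data and variable names here are hypothetical illustrations.
import numpy as np
from scipy import stats

# Hypothetical intrinsic-motivation subscale scores (Likert 1-5 items,
# summed per student) and hypothetical content-exam scores.
motivation_pre = np.array([18, 22, 15, 25, 20, 17, 23, 19])
motivation_post = np.array([19, 21, 16, 25, 22, 17, 24, 18])
exam_scores = np.array([52, 61, 44, 70, 58, 47, 66, 55])

# A rank-based (Spearman) correlation avoids treating ordinal Likert
# responses as interval data, unlike a Pearson correlation.
rho, p_assoc = stats.spearmanr(motivation_post, exam_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_assoc:.3f}")

# Non-parametric paired test for pre/post change in motivation.
w, p_change = stats.wilcoxon(motivation_pre, motivation_post)
print(f"Wilcoxon W = {w:.1f}, p = {p_change:.3f}")
```

The abstract also mentions nested data (students within courses within institutions), which in practice calls for multilevel models rather than the simple tests above; this sketch only illustrates the ordinal part.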
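
The HAG abstract describes eliminating redundant neighbor aggregations by computing shared partial results once and reusing them hierarchically. The following is a minimal, illustrative sketch of that idea (the function and variable names are invented, and a single greedy caching step stands in for the paper's cost-function-driven search over many intermediates):

```python
# Illustrative sketch of HAG-style redundancy elimination in GNN
# aggregation. Names are hypothetical; the actual system uses a cost
# function and a search algorithm to choose many intermediate nodes.
from collections import Counter
from itertools import combinations
import numpy as np

def aggregate_with_shared_pair(nbrs, h):
    """Sum-aggregate neighbor features, caching one shared pair.

    nbrs: dict mapping node id -> list of neighbor ids
    h:    dict mapping node id -> feature vector (np.ndarray)
    """
    # Count how often each neighbor pair co-occurs across aggregations.
    pair_counts = Counter()
    for vs in nbrs.values():
        pair_counts.update(combinations(sorted(vs), 2))

    # Greedily cache the most frequently shared pair once (one step of
    # what a full HAG search would apply repeatedly, hierarchically).
    (a, b), _ = pair_counts.most_common(1)[0]
    cached = h[a] + h[b]

    dim = next(iter(h.values())).shape
    out = {}
    for v, vs in nbrs.items():
        total = np.zeros(dim)
        if a in vs and b in vs:
            total += cached                      # reuse, don't recompute
            vs = [u for u in vs if u not in (a, b)]
        for u in vs:
            total += h[u]
        out[v] = total
    return out

# Tiny example: nodes 0 and 1 both neighbor {2, 3}, so h[2] + h[3]
# is computed once and reused for both aggregations.
nbrs = {0: [2, 3], 1: [2, 3, 4], 4: [0]}
h = {i: np.ones(4) * i for i in range(5)}
print(aggregate_with_shared_pair(nbrs, h))
```

In a full HAG, such intermediate sums are themselves combined into higher-level intermediates, which is what yields the large reductions in aggregations and data transfers the abstract reports.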