Search for: All records

Creators/Authors contains: "Robinson, Daniel"


  1. Free, publicly-accessible full text available December 31, 2025
  2. Abstract: Over the past 30 years, several reviews have examined the scholarly contributions of individual researchers and institutions in the field of educational psychology (Fong et al., Educational Psychology Review 34:2379–2403, 2022; Greenbaum et al., Educational Psychology Review 28:215–223, 2016; Hsieh et al., Contemporary Educational Psychology 29:333–343, 2004; Jones et al., Contemporary Educational Psychology 35:11–16, 2010; Smith et al., Contemporary Educational Psychology 23:173–181, 1998; Smith et al., Contemporary Educational Psychology 28:422–430, 2003). However, no review has specifically examined scholarly impact as measured by citations since Walberg (Current Contents 22:5–14, 1990) did so over 34 years ago. The present review covered the period from 1988 to 2023, identifying the most cited articles and authors since Walberg's study, which covered 1966–1988. Whereas most previous reviews were limited to brief time periods (e.g., six years) and small sets of journals (e.g., five), our scope included 12 educational psychology journals across 36 years. The most cited article (over 9,000 citations), by Ryan and Deci (Contemporary Educational Psychology 25:54–67, 2000), had more than twice as many citations as the second most cited article, by Pintrich and De Groot (Journal of Educational Psychology 82:33–40, 1990). Most of the top 30 most cited articles, including four of the top five, addressed the topic of motivation. With regard to highly cited authors, the top five were John Sweller, Richard E. Mayer, Fred Paas, Richard M. Ryan, and Reinhard Pekrun. Several of the 30 most cited authors have never appeared in previous lists of most productive authors. Finally, keyword and cluster analyses revealed the most popular topics and collaborative networks among many of the most cited authors that may partly explain their productivity. Examining article and author impact is an important complement to productivity when considering scholarly contributions to the field of educational psychology.
    Free, publicly-accessible full text available September 1, 2025
  3. Cognitive load theory (CLT) has driven numerous empirical studies for over 30 years and is a major theme in many of the most cited articles published between 1988 and 2023. However, CLT articles have not been compared with other educational psychology research in terms of the research designs used and the extent to which recommendations for practice are justified. As Brady and colleagues found, a large percentage of the educational psychology articles they reviewed were not experimental, yet frequently made specific recommendations from observational/correlational data. Therefore, in this review, CLT articles were examined with regard to the types of research methodology employed and whether recommendations for practice were justified. Across several educational psychology journals in 2020 and 2023, 16 articles were identified as directly testing CLT. In contrast to other articles, which employed mostly observational methods, all but two of the CLT articles employed experimental or intervention designs. The two observational CLT articles made no recommendations for practice. Reasons for the importance of experimental work are discussed.
    Free, publicly-accessible full text available August 1, 2025
  4. A stochastic algorithm is proposed, analyzed, and tested experimentally for solving continuous optimization problems with nonlinear equality constraints. It is assumed that constraint function and derivative values can be computed but that only stochastic approximations are available for the objective function and its derivatives. The algorithm is of the sequential quadratic optimization variety. Distinguishing features of the algorithm are that it only employs stochastic objective gradient estimates that satisfy a relatively weak set of assumptions (while using neither objective function values nor estimates of them) and that it allows inexact subproblem solutions to be employed, the latter of which is particularly useful in large-scale settings when the matrices defining the subproblems are too large to form and/or factorize. Conditions are imposed on the inexact subproblem solutions that account for the fact that only stochastic objective gradient estimates are employed. Convergence results are established for the method. Numerical experiments show that the proposed method vastly outperforms a stochastic subgradient method and can outperform an alternative sequential quadratic programming algorithm that employs highly accurate subproblem solutions in every iteration. Funding: This material is based upon work supported by the National Science Foundation [Awards CCF-1740796 and CCF-2139735] and the Office of Naval Research [Award N00014-21-1-2532]. 
    Free, publicly-accessible full text available July 1, 2025
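The core iteration described in record 4 can be sketched compactly. The following is a minimal illustration only, not the paper's algorithm: it uses an identity Hessian approximation and an exact KKT solve (whereas the paper's distinguishing feature is allowing *inexact* subproblem solutions), and the test problem, step size, and noise level are our own choices for demonstration.

```python
import numpy as np

def stochastic_sqp(x, grad_est, c, J, steps=2000, alpha=0.1, seed=0):
    """Sketch of an SQP iteration for min E[f(x)] s.t. c(x) = 0 that uses
    only stochastic objective gradient estimates (no function values).
    Identity Hessian; each step solves the KKT system exactly."""
    rng = np.random.default_rng(seed)
    n = x.size
    for _ in range(steps):
        g = grad_est(x, rng)              # stochastic objective gradient
        A = J(x)                          # constraint Jacobian (exact)
        m = A.shape[0]
        # KKT system: [I  A^T; A  0] [d; y] = [-g; -c(x)]
        K = np.block([[np.eye(n), A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-g, -c(x)])
        d = np.linalg.solve(K, rhs)[:n]   # search direction
        x = x + alpha * d
    return x

# Illustrative problem (our choice): min E[0.5*||x - xi||^2] with
# xi ~ N(0, sigma^2 I), i.e. grad = x + noise, s.t. x1 + x2 = 1.
# The minimizer is (0.5, 0.5).
grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.size)
c    = lambda x: np.array([x[0] + x[1] - 1.0])
Jac  = lambda x: np.array([[1.0, 1.0]])
x_star = stochastic_sqp(np.zeros(2), grad, c, Jac)
```

Note that the constraint block of the KKT system forces `A @ d = -c(x)`, so feasibility improves deterministically even though the objective gradient is noisy.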
  5. A sequential quadratic optimization algorithm is proposed for solving smooth nonlinear-equality-constrained optimization problems in which the objective function is defined by an expectation. The algorithmic structure of the proposed method is based on a step decomposition strategy that is known in the literature to be widely effective in practice, wherein each search direction is computed as the sum of a normal step (toward linearized feasibility) and a tangential step (toward objective decrease in the null space of the constraint Jacobian). However, the proposed method is unique from others in the literature in that it both allows the use of stochastic objective gradient estimates and possesses convergence guarantees even in the setting in which the constraint Jacobians may be rank-deficient. The results of numerical experiments demonstrate that the algorithm offers superior performance when compared with popular alternatives. 
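The normal/tangential step decomposition described in record 5 has a standard textbook form, sketched below. This is not the paper's full method (which adds stochastic-gradient safeguards and convergence machinery); using the pseudoinverse simply illustrates why the decomposition remains well defined when the constraint Jacobian is rank-deficient.

```python
import numpy as np

def decomposed_step(g, c_val, A):
    """Search direction d = v + u: a normal step v toward linearized
    feasibility plus a tangential step u (descent direction projected
    onto the null space of the Jacobian A)."""
    A_pinv = np.linalg.pinv(A)            # handles rank-deficient A
    v = -A_pinv @ c_val                   # normal step
    P = np.eye(g.size) - A_pinv @ A       # projector onto null(A)
    u = -P @ g                            # tangential step
    return v, u

# Rank-deficient example (duplicated constraint rows), our own choice:
A = np.array([[1.0, 1.0], [2.0, 2.0]])
c_val = np.array([0.5, 1.0])              # consistent with the duplication
g = np.array([1.0, 0.0])
v, u = decomposed_step(g, c_val, A)
```

By construction `A @ u = 0` (the tangential step preserves linearized feasibility) and `c_val + A @ v = 0` whenever the linearized system is consistent, even though `A` here has rank one.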
  6. Optimization problems with group-sparse regularization are ubiquitous in popular downstream applications such as feature selection and compression for deep neural networks (DNNs). Nonetheless, existing methods in the literature do not perform particularly well when such regularization is used in combination with a stochastic loss function. In particular, it is challenging to design a computationally efficient algorithm that has a convergence guarantee and can compute group-sparse solutions. Recently, a half-space stochastic projected gradient (HSPG) method was proposed that partly addressed these challenges. This paper presents a substantially enhanced version of HSPG, called AdaHSPG+, that makes two notable advances. First, AdaHSPG+ is shown to have a stronger convergence result under significantly looser assumptions than those required by HSPG. This improvement is achieved by integrating variance-reduction techniques with a new adaptive strategy for iteratively predicting the support of a solution. Second, AdaHSPG+ requires significantly less parameter tuning than HSPG, making it more practical and user-friendly. This advance is achieved by designing automatic and adaptive strategies for choosing the type of step employed at each iteration and for updating key hyperparameters. The numerical effectiveness of the proposed AdaHSPG+ algorithm is demonstrated on both convex and non-convex benchmark problems. The source code is available at https://github.com/tianyic/adahspg.
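For readers unfamiliar with group-sparse regularization, the classical building block is the proximal operator of the group-lasso penalty (group soft-thresholding), sketched below. To be clear, this is the standard baseline that half-space methods like HSPG and AdaHSPG+ improve upon, not the half-space update itself; the data and penalty weight are illustrative choices.

```python
import numpy as np

def group_prox(x, groups, lam):
    """Proximal operator of lam * sum_g ||x_g||_2: shrinks each group
    toward zero and zeroes out whole groups whose norm is below lam,
    which is what produces group-sparse solutions."""
    out = x.copy()
    for g in groups:
        nrm = np.linalg.norm(x[g])
        out[g] = 0.0 if nrm <= lam else (1.0 - lam / nrm) * x[g]
    return out

# One proximal-gradient step on f(x) = 0.5*||x - b||^2 + lam * sum_g ||x_g||_2,
# starting from x = b with unit step (so the gradient step lands at b):
b = np.array([3.0, 4.0, 0.1, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
x = group_prox(b, groups, lam=1.0)
```

Here the first group has norm 5, so it is shrunk by the factor 1 - 1/5 = 0.8, while the second group's norm (about 0.14) is below the threshold, so the whole group is set exactly to zero.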