-
Testing for Granger causality relies on estimating the capacity of dynamics in one time series to forecast dynamics in another. The canonical test for such temporal predictive causality is based on fitting multivariate time series models and is cast in the classical null hypothesis testing framework. In this framework, we are limited to rejecting the null hypothesis or failing to reject it -- we can never validly accept the null hypothesis of no Granger causality. This is poorly suited for many common purposes, including evidence integration, feature selection, and other cases where it is useful to express evidence against, rather than for, the existence of an association. Here we derive and implement the Bayes factor for Granger causality in a multilevel modeling framework. This Bayes factor summarizes the information in the data as a continuously scaled evidence ratio between the presence of Granger causality and its absence. We also introduce this procedure for the multilevel generalization of Granger causality testing, which facilitates inference when information is scarce or noisy or when interest centers primarily on population-level trends. We illustrate our approach with an application exploring causal relationships in affect using a daily life study.
Free, publicly-accessible full text available November 1, 2025
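The evidence ratio the abstract describes can be illustrated with a simple single-lag approximation. The sketch below is not the authors' multilevel implementation; it uses the common BIC approximation to the Bayes factor, BF10 ≈ exp((BIC0 − BIC1)/2), to compare an autoregression of y with and without a lagged predictor x. All function and variable names are ours.

```python
import numpy as np

def bic(y, X):
    """BIC of an ordinary least squares fit of y on X (Gaussian likelihood)."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = np.mean(resid ** 2)
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + X.shape[1] * np.log(n)

def granger_bf(x, y, lag=1):
    """Approximate Bayes factor BF10 for 'x Granger-causes y' vs. its absence,
    using a single lag of order `lag` and the BIC approximation."""
    n = len(y)
    yt, ylag, xlag = y[lag:], y[:-lag], x[:-lag]
    X0 = np.column_stack([np.ones(n - lag), ylag])   # restricted: y's own past
    X1 = np.column_stack([X0, xlag])                 # full: adds lagged x
    return np.exp((bic(yt, X0) - bic(yt, X1)) / 2)
```

Values far above 1 favor the presence of Granger causality; values below 1 express evidence for its absence, which is precisely what the classical test cannot do.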
-
Free, publicly-accessible full text available December 31, 2025
-
Background: Mobile health (mHealth) methods often rely on active input from participants, for example, self-report questionnaires delivered via web or smartphone, to measure health and behavioral indicators and to deliver interventions in everyday life settings. For short-term studies or interventions, these techniques are deployed intensively, causing nontrivial participant burden. For cases where the goal is long-term maintenance, limited infrastructure exists to balance information needs against participant constraints. Yet the increasing precision of passive sensors such as wearable physiology monitors, smartphone-based location history, and internet-of-things devices, in combination with statistical feature selection and adaptive interventions, has begun to make such designs possible.
Objective: In this paper, we introduced Wear-IT, a smartphone app and cloud framework intended to begin addressing these limitations by allowing researchers to leverage commodity electronics and real-time decision making to optimize the amount of useful data collected while minimizing participant burden.
Methods: The Wear-IT framework uses real-time decision making to find better tradeoffs between the utility of the data collected and the burden placed on participants. Wear-IT integrates a variety of consumer-grade sensors and provides adaptive, personalized, and low-burden monitoring and intervention. Proof-of-concept examples are illustrated using artificial data, and the results of qualitative interviews with users are provided.
Results: Participants provided positive feedback about the ease of use of studies conducted using the Wear-IT framework. Users were positive about their overall experience with the framework and its utility for balancing burden, and expressed excitement about future studies that real-time processing will enable.
Conclusions: The Wear-IT framework uses a combination of passive monitoring, real-time processing, and adaptive assessment and intervention to balance high-quality data collection against low participant burden. The framework presents an opportunity to deploy adaptive assessment and intervention designs that use real-time processing and provides a platform to study and overcome the challenges of long-term mHealth intervention.
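The burden-versus-information tradeoff described above can be made concrete with a toy decision rule. The sketch below is purely illustrative and is not part of the Wear-IT codebase: it assumes a hypothetical stream of passive heart-rate readings and triggers an active self-report prompt only when the latest reading departs markedly from its recent baseline, so that low-burden passive sensing suffices most of the time.

```python
import statistics

def should_prompt(recent_hr, window=12, z_thresh=2.0):
    """Decide whether to issue an active self-report prompt.

    Prompt when the newest passive reading deviates from the recent
    window by more than z_thresh standard deviations; otherwise rely
    on passive sensing alone to keep participant burden low."""
    if len(recent_hr) < window:
        return True  # not enough passive history yet; collect actively
    hist = recent_hr[-window:-1]
    mu = statistics.mean(hist)
    sd = statistics.stdev(hist) or 1.0  # guard against a zero-variance window
    return abs(recent_hr[-1] - mu) / sd > z_thresh
```

A real deployment would replace this fixed threshold with the personalized, adaptive logic the framework describes, but the shape of the decision, passive data in, prompt-or-not out, is the same.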
-
Assessing several individuals intensively over time yields intensive longitudinal data (ILD). Although ILD provide rich information, they also bring data analytic challenges. One of these is the increased occurrence of missingness as study length grows, possibly under non-ignorable missingness scenarios. Multiple imputation (MI) handles missing data by creating several imputed data sets and pooling the estimation results across them to yield final estimates for inferential purposes. In this article, we introduce dynr.mi(), a function in the R package Dynamic Modeling in R (dynr). The dynr package provides a suite of fast and accessible functions for estimating and visualizing the results of fitting linear and nonlinear dynamic systems models in discrete as well as continuous time. By integrating the estimation functions in dynr with the MI procedures available from the R package Multivariate Imputation by Chained Equations (MICE), the dynr.mi() routine is designed to handle possibly non-ignorable missingness in the dependent variables and/or covariates of a user-specified dynamic systems model via MI, with convergence diagnostic checks. We utilized dynr.mi() to examine, in the context of a vector autoregressive model, the relationships among individuals' ambulatory physiological measures and self-reported affect valence and arousal. The results from MI were compared to those from listwise deletion of entries with missingness in the covariates. When we determined the number of iterations based on the convergence diagnostics available from dynr.mi(), differences in the statistical significance of the covariate parameters were observed between the listwise deletion and MI approaches. These results underscore the importance of considering diagnostic information when implementing MI procedures.
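The pooling step described above, combining estimates across imputed data sets, follows Rubin's rules. A minimal sketch of that combination step (written in Python for illustration, with hypothetical inputs; dynr.mi() itself is an R routine) is:

```python
import math

def pool_rubin(estimates, variances):
    """Pool a parameter across m imputed data sets via Rubin's rules:
    total variance = within-imputation + (1 + 1/m) * between-imputation."""
    m = len(estimates)
    qbar = sum(estimates) / m                                # pooled estimate
    ubar = sum(variances) / m                                # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)    # between-imputation variance
    t = ubar + (1 + 1 / m) * b                               # total variance
    return qbar, math.sqrt(t)                                # estimate and pooled SE
```

Because the between-imputation term inflates the pooled standard error, conclusions about statistical significance can differ from those under listwise deletion, as the comparison in the abstract illustrates.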