Abstract Cyclic degradation in flexible electronic inks remains a key challenge as these inks are increasingly deployed in life-critical applications. The origin of electrical degradation of a screen-printed stretchable conductive ink, consisting of silver flakes embedded in a polyurethane binder, is investigated under uniaxial monotonic and cyclic stretching using in-situ confocal microscopy and scanning electron microscopy, for varying ink thickness (1, 2, and 3 layers, each layer around 8–10 μm) and trace width (0.5, 1, and 2 mm). Cracks form under monotonic stretching, and the evolution of the crack pattern (density, length, and width) with applied strain depends on ink thickness: the 3-layer ink exhibits a larger normalized resistance but a slightly lower absolute resistance than the 1-layer ink up to strains of 125%. Under cyclic stretching, the crack density and length do not evolve with cycling; however, the cracks widen and deepen, leading to an increase in resistance with cycling. There is a strong correlation between fatigue life, i.e., the number of cycles until a normalized resistance of 100 is reached, and the strain amplitude. The rate of increase of normalized resistance with cycling is also found to scale with strain amplitude, and it decreases with ink thickness and trace width. For practical applications, thicker (⩾25 μm) and wider (⩾2 mm) traces should be used to limit resistance increases under repeated deformation.
                    This content will become publicly available on April 28, 2026
                            
                            Reliable Board-Level Degradation Prediction with Monotonic Segmented Regression under Noisy Measurement
                        
                    
    
The increasing complexity of electronic systems in autonomous electric vehicles necessitates robust methods for forecasting the degradation of critical components such as printed circuit boards (PCBs). Various time series forecasting methods have been investigated to predict in-situ resistance degradation under vibration loads. However, these methods fail to capture the degradation trend under strong measurement noise. This paper introduces Monotonic Segmented Linear Regression (MSLR), a novel approach designed to capture monotonic degradation trends in time series data under significant measurement noise. By incorporating monotonic constraints, MSLR effectively models the non-decreasing behavior characteristic of degradation processes. To further enhance the reliability of the predictions, we integrate Adaptive Conformal Inference (ACI) with MSLR, enabling the estimation of statistically valid upper bounds for resistance degradation with high confidence. Extensive experiments demonstrate that MSLR outperforms state-of-the-art time series forecasting baselines on real-world PCB degradation datasets.
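The paper's MSLR implementation is not reproduced here, but the monotonic constraint it relies on can be illustrated with the classic pool-adjacent-violators algorithm (PAVA). This is a minimal sketch, assuming only a noisy but overall non-decreasing resistance trace; `isotonic_fit` is an illustrative name, not the authors' API.

```python
# Illustrative sketch (not the paper's MSLR): fit a non-decreasing trend
# to noisy resistance measurements with the pool-adjacent-violators
# algorithm (PAVA), the standard way to impose a monotonic constraint.
def isotonic_fit(y):
    """Return the non-decreasing least-squares fit to the sequence y."""
    # Each block holds [sum of values, count]; merge while order is violated.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        while len(blocks) > 1 and (blocks[-2][0] / blocks[-2][1]
                                   > blocks[-1][0] / blocks[-1][1]):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit

# A noisy but rising "resistance" trace: the fit smooths the noise
# while never decreasing, matching the physics of degradation.
noisy = [1.0, 1.2, 1.1, 1.5, 1.4, 1.9, 2.5, 2.3, 3.0]
trend = isotonic_fit(noisy)
assert all(a <= b for a, b in zip(trend, trend[1:]))  # monotone
```

In the same spirit as MSLR, the monotone fit filters out measurement noise that would otherwise make the degradation trend appear to reverse.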
- Award ID(s): 1956313
- PAR ID: 10593475
- Publisher / Repository: 2025 IEEE VLSI Test Symposium
- Date Published:
- ISBN: 979-8-3315-2144-8
- Format(s): Medium: X
- Location: Tempe, AZ, USA
- Sponsoring Org: National Science Foundation
More Like this
- Time series forecasting is an important application in various domains such as energy management, traffic planning, financial markets, meteorology, and medicine. However, real-world time series data often present intricate temporal variability and sharp fluctuations, which pose significant challenges for forecasting. Previous models that rely on 1D time series representations usually struggle with complex temporal variations. To address the limitations of 1D representations, this study introduces the Times2D method, which transforms the 1D time series into 2D space. Times2D consists of three main parts: first, a Periodic Decomposition Block (PDB) captures temporal variations within a period and between the same periods by converting the time series into a 2D tensor in the frequency domain. Second, First and Second Derivative Heatmaps (FSDH) capture sharp changes and turning points, respectively. Finally, an Aggregation Forecasting Block (AFB) integrates the output tensors from PDB and FSDH for accurate forecasting. This 2D transformation enables the use of 2D convolutional operations to capture both long- and short-term characteristics of the time series. Comprehensive experiments on large-scale benchmark data demonstrate that the proposed Times2D model achieves state-of-the-art performance in both short-term and long-term forecasting.
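Times2D itself is not shown here; as a hedged sketch of the underlying 1D-to-2D idea, the following folds a series into a (periods × period-length) grid. It estimates the period by autocorrelation for simplicity, whereas the paper works in the frequency domain; `dominant_period` and `to_2d` are illustrative names.

```python
# Illustrative sketch (not the paper's Times2D): reshape a 1D series into
# a 2D grid so variation *within* a period runs along rows and variation
# *across* periods runs down columns, ready for 2D convolutions.
def dominant_period(x, max_lag=None):
    """Estimate the period as the lag (>= 2) maximizing autocorrelation."""
    n = len(x)
    max_lag = max_lag or n // 2
    mean = sum(x) / n
    xc = [v - mean for v in x]

    def ac(lag):
        return sum(xc[i] * xc[i + lag] for i in range(n - lag))

    return max(range(2, max_lag + 1), key=ac)

def to_2d(x):
    """Fold the series into rows of one period each (tail truncated)."""
    p = dominant_period(x)
    rows = len(x) // p
    return [x[r * p:(r + 1) * p] for r in range(rows)]

series = [0, 1, 0, -1] * 8          # a period-4 wave
grid = to_2d(series)
assert len(grid[0]) == 4            # each row spans one period
assert all(row == grid[0] for row in grid)  # identical rows: pure periodicity
```

On real data the rows would differ, and those across-period differences are exactly what a 2D convolution over the grid can exploit.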
- In many scientific fields, such as economics and neuroscience, we are often faced with nonstationary time series and are concerned both with finding causal relations and with forecasting the values of variables of interest, both of which are particularly challenging in nonstationary environments. In this paper, we study causal discovery and forecasting for nonstationary time series. By exploiting a particular type of state-space model to represent the processes, we show that nonstationarity helps to identify the causal structure and that forecasting naturally benefits from learned causal knowledge. Specifically, we allow changes in both causal strengths and noise variances in the nonlinear state-space models, which, interestingly, renders both the causal structure and the model parameters identifiable. Given the causal model, we treat forecasting as a problem of Bayesian inference in the causal model, which exploits the time-varying property of the data and adapts to new observations in a principled manner. Experimental results on synthetic and real-world data sets demonstrate the efficacy of the proposed methods.
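A minimal sketch, not the authors' method, of why nonstationarity can expose causal strength: simulate X driving Y with a coefficient that switches regime, then recover the change with windowed least squares. All names and parameter values here are illustrative assumptions.

```python
# Illustrative sketch: a nonstationary system where X drives Y with a
# causal strength that changes halfway through. Fitting each regime
# separately recovers the drifting coefficient, hinting at how changes
# over time carry information about causal structure.
import random

random.seed(0)
T = 400
x = [random.gauss(0, 1) for _ in range(T)]
strength = [0.2 if t < T // 2 else 0.9 for t in range(T)]  # regime change
y = [strength[t] * x[t] + random.gauss(0, 0.1) for t in range(T)]

def window_slope(xs, ys):
    """Least-squares slope of ys on xs (no intercept; zero-mean inputs)."""
    return sum(a * b for a, b in zip(xs, ys)) / sum(a * a for a in xs)

early = window_slope(x[:T // 2], y[:T // 2])
late = window_slope(x[T // 2:], y[T // 2:])
assert early < late  # the estimated strength tracks the regime change
```

A static model would average the two regimes away; modeling the time-varying coefficient, as the paper does in a state-space framework, preserves the information.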
- We consider forecasting a single time series using a large number of predictors in the presence of a possibly nonlinear forecast function. Assuming that the predictors affect the response through latent factors, we propose to first conduct factor analysis and then apply sufficient dimension reduction to the estimated factors to derive the reduced data for subsequent forecasting. Using directional regression and the inverse third-moment method in the sufficient dimension reduction stage, the proposed methods can capture the nonmonotone effect of factors on the response. We also allow a diverging number of factors and impose only general regularity conditions on the distribution of factors, avoiding the undesired time reversibility of the factors implied by the latter. These features make the proposed methods fundamentally more applicable than the sufficient forecasting method of Fan et al. (2017). The proposed methods are demonstrated both in simulation studies and in an empirical study forecasting monthly macroeconomic data from 1959 to 2016. Our theory also contributes to the literature on sufficient dimension reduction: it includes an invariance result, a path to performing sufficient dimension reduction in the high-dimensional setting without assuming sparsity, and a corresponding order-determination procedure.
- As new grid edge technologies emerge, such as rooftop solar panels, battery storage, and controllable water heaters, quantifying the uncertainties of building load forecasts is becoming more critical. The recent adoption of smart meter infrastructures provided new granular data streams, largely unavailable just ten years ago, that can be utilized to better forecast building-level demand. This paper uses Bayesian Structural Time Series for probabilistic load forecasting at the residential building level to capture uncertainties in forecasting. We use sub-hourly electrical submeter data from 120 residential apartments in Singapore that were part of a behavioral intervention study. The proposed model addresses several fundamental limitations through its flexibility to handle univariate and multivariate scenarios, perform feature selection, and include either static or dynamic effects, as well as its inherent applicability for measurement and verification. We highlight the benefits of this process in three main application areas: (1) probabilistic load forecasting for apartment-level hourly loads; (2) submeter load forecasting and segmentation; (3) measurement and verification for behavioral demand response. Results show the model achieves performance similar to ARIMA, another popular time series model, when predicting individual apartment loads, and superior performance when predicting aggregate loads. Furthermore, we show that the model robustly captures uncertainties in the forecasts while providing interpretable results, indicating the importance of, for example, temperature data in its predictions. Finally, our estimates for a behavioral demand response program indicate that it achieved energy savings; however, the confidence interval provided by the probabilistic model is wide. Overall, this probabilistic forecasting model accurately measures uncertainties in forecasts and provides interpretable results that can support building managers and policymakers with the goal of reducing energy use.
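The BSTS model is not reimplemented here; as a hedged sketch of its simplest building block, a local-level structural model can be filtered with a scalar Kalman recursion. The function name and the values of `q`, `r`, and the diffuse prior are illustrative choices, not the paper's configuration.

```python
# Illustrative sketch (not the paper's BSTS): a scalar Kalman filter for
# the local-level model  x_t = x_{t-1} + N(0, q),  y_t = x_t + N(0, r),
# the simplest structural time series component. Each observation updates
# a Gaussian belief about the latent level, which is what makes the
# forecasts probabilistic.
def kalman_filter(obs, q=0.1, r=1.0):
    """Return the filtered mean of the latent level after each observation."""
    mean, var = 0.0, 1e6           # diffuse prior over the initial level
    means = []
    for y in obs:
        var += q                   # predict: level uncertainty grows
        k = var / (var + r)        # Kalman gain
        mean += k * (y - mean)     # update toward the new observation
        var *= (1 - k)             # posterior variance shrinks
        means.append(mean)
    return means

# A constant 5.0 kW load observed without noise: the filtered level
# locks onto it almost immediately thanks to the diffuse prior.
levels = kalman_filter([5.0] * 50)
assert abs(levels[-1] - 5.0) < 1e-3
```

Full BSTS adds seasonal and regression components and samples them jointly, but every component is filtered by the same predict-update recursion shown here, and the running variance is what yields the credible intervals discussed in the abstract.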
 An official website of the United States government