Autocorrelation

/ˌɔː.toʊ.kɒr.əˈleɪ.ʃən/

noun … “how the past whispers to the present.”

Autocorrelation is a statistical measure that quantifies the correlation of a signal, dataset, or time series with a delayed copy of itself over varying lag intervals. It captures the degree to which current values are linearly dependent on past values, revealing repeating patterns, trends, or temporal dependencies. Autocorrelation is widely used in time-series analysis, signal processing, econometrics, and machine learning to detect seasonality, persistence, and memory effects in data.
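The lag-based comparison described above can be sketched with the standard sample estimator: the covariance between the series and a copy of itself shifted by `lag` steps, normalized by the series' total variance. This is a minimal illustration, not a production implementation (library routines such as those in NumPy or statsmodels handle edge cases and efficiency):

```python
import numpy as np

def autocorrelation(x, lag):
    """Sample autocorrelation of a 1-D series at a given lag.

    Covariance between the series and its lagged copy, normalized by
    the overall sum of squared deviations (so lag 0 gives exactly 1).
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mean = x.mean()
    # Denominator: total sum of squared deviations from the mean
    denom = np.sum((x - mean) ** 2)
    # Numerator: products of deviations `lag` steps apart
    num = np.sum((x[:n - lag] - mean) * (x[lag:] - mean))
    return num / denom

# A periodic signal correlates strongly with itself one full period later,
# and anticorrelates half a period later.
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 20)  # period of 20 samples
print(round(autocorrelation(signal, 20), 2))  # → 0.9
print(round(autocorrelation(signal, 10), 2))  # → -0.95
```

The peak at lag 20 reveals the 20-sample seasonality directly from the data; plotting this value across many lags gives the familiar correlogram used in time-series analysis.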