Pedro Nunes Lectures - 2023

13/12/2023 to 15/12/2023

Richard Davis

 

Richard Davis is the Howard Levene Professor of Statistics at Columbia University. He received his Ph.D. in Mathematics from the University of California at San Diego in 1979 and has held academic positions at MIT and Colorado State University, as well as visiting appointments at numerous other universities. He was Hans Fischer Senior Fellow at the Technical University of Munich (2009-12), Villum Kan Rasmussen Visiting Professor at the University of Copenhagen (2011-13), and Chalmers Jubilee Professor at Chalmers University of Technology.

His research interests include time series, applied probability, extreme value theory, heavy-tailed modeling with applications to network models, and spatial-temporal modeling. He has advised/co-advised 34 PhD students and has delivered numerous short courses on time series and heavy-tailed modeling. 

He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association, and an elected member of the International Statistical Institute. He was president of the IMS in 2015-16 and Editor-in-Chief of the Bernoulli journal in 2010-12. He is co-author (with Peter Brockwell) of the bestselling books "Time Series: Theory and Methods" and "Introduction to Time Series and Forecasting", and of the time series analysis software package ITSM2000. Together with Torben Andersen, Jens-Peter Kreiss, and Thomas Mikosch, he co-edited the "Handbook of Financial Time Series", and with Holan, Lund, and Ravishanker the "Handbook of Discrete-Valued Time Series". In 1998, he won (with collaborator W.T.M. Dunsmuir) the Koopmans Prize for Econometric Theory.

 

13th of December

15:00 University of Aveiro, Edifício Central e da Reitoria, Sala dos Atos Académicos

Using Sample Splitting for Assessing Goodness of Fit in Time Series

Abstract: A fundamental and often final step in time series modeling is to assess the quality of fit of a proposed model to the data. Since the underlying distribution of the innovations that generate a model is often not prescribed, goodness-of-fit tests typically take the form of testing the fitted residuals for serial independence. However, these fitted residuals are inherently dependent since they are based on the same parameter estimates. Thus, standard tests of serial independence, such as those based on the autocorrelation function (ACF) or auto-distance correlation function (ADCF) of the fitted residuals, need to be adjusted. The sample splitting procedure in Pfister et al. (2018) is one such fix for the case of models for independent data, but it fails to work in the dependent case. In this talk we show how sample splitting can be leveraged in the time series setting to perform tests of serial independence on the fitted residuals using the ACF and ADCF. Here the first half of the time series is used to estimate the parameters of the model, and then, using these parameter estimates, the entire time series is used to compute the estimated residuals. The ACF and ADCF tests of serial independence then have the same limit distributions as though the underlying residuals were indeed iid. (This is joint work with Leon Fernandes.)
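The splitting scheme described in the abstract can be illustrated with a short numerical sketch. The example below is not the authors' implementation: it fits a simple AR(1) model by least squares on the first half of a simulated series, computes residuals over the entire series with that estimate, and then compares the residual ACF with the usual iid band of ±1.96/√n. The AR(1) model, the estimator, and the lag range are illustrative assumptions; the same recipe would apply with the ADCF in place of the ACF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series X_t = phi * X_{t-1} + Z_t (illustrative data only).
phi_true, n = 0.5, 2000
z = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + z[t]

# Step 1: estimate the AR(1) coefficient from the FIRST HALF of the series only.
half = n // 2
x_lag, x_lead = x[: half - 1], x[1:half]
phi_hat = np.dot(x_lag, x_lead) / np.dot(x_lag, x_lag)  # least-squares estimate

# Step 2: compute fitted residuals over the ENTIRE series with that estimate.
resid = x[1:] - phi_hat * x[:-1]

# Step 3: test serial independence of the residuals via their sample ACF;
# under the splitting scheme the usual iid band +/- 1.96/sqrt(m) applies.
m = len(resid)
r = resid - resid.mean()
acf = np.array([np.dot(r[:-h], r[h:]) for h in range(1, 21)]) / np.dot(r, r)
band = 1.96 / np.sqrt(m)
print("lags whose ACF exceeds the iid band:", np.nonzero(np.abs(acf) > band)[0] + 1)
```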

Video recording (if the video does not open in HD quality, please select it in the YouTube player):

 

15th of December

15:00 Academia das Ciências de Lisboa, Sala das Sessões

Statistical Learning of Multivariate Extremes

Abstract: In this talk, a spectral clustering algorithm for analyzing the dependence structure of multivariate extremes is proposed. This work studies the theoretical performance of spectral clustering based on a random k-nearest neighbor graph constructed from an extremal sample, i.e., the angular parts of random vectors whose radius exceeds a large threshold. In particular, we derive the asymptotic distribution of extremes arising from a linear factor model and prove that, under certain conditions, spectral clustering can consistently identify the clusters of extremes arising in this model. Leveraging this result, we propose a simple consistent estimation strategy for learning the angular measure. Our theoretical findings are complemented with numerical experiments illustrating the finite-sample performance of our methods. An application to environmental extremes will also be given. (This is joint work with Marco Avella Medina and Gennady Samorodnitsky.)
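As a rough illustration of the pipeline sketched in the abstract (and not the procedure of the paper), the example below simulates a two-factor linear model with heavy-tailed factors, extracts the angular parts of observations whose norm exceeds a high empirical quantile, and runs off-the-shelf spectral clustering on a k-nearest-neighbor graph of that extremal sample. The factor matrix, the threshold, and the choices of k = 15 neighbors and two clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)

# Toy data from a linear factor model X = A Z (illustrative only):
# heavy-tailed factors Z push the extremes toward the directions of A's columns.
n = 20000
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.7, 0.7]])
Z = rng.pareto(2.0, size=(n, 2))        # heavy-tailed, nonnegative factors
X = Z @ A.T

# Extremal sample: angular parts of observations whose radius exceeds
# a high empirical threshold (here the 98th percentile of the norm).
radius = np.linalg.norm(X, axis=1)
mask = radius > np.quantile(radius, 0.98)
angles = X[mask] / radius[mask, None]

# Spectral clustering on a k-nearest-neighbor graph of the angular sample.
labels = SpectralClustering(n_clusters=2, affinity="nearest_neighbors",
                            n_neighbors=15, random_state=0).fit_predict(angles)
print("cluster sizes:", np.bincount(labels))
```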


Edited/published: 05/03/2024