Key research themes
1. How can normalization improve robust and efficient correlation estimation in noisy or small-sample data?
This theme investigates the role of normalization techniques in enhancing the stability, robustness, and accuracy of correlation measures and associated statistical procedures, especially with correlated or noisy data, small sample sizes, or challenging signal conditions. Normalization methods are studied both as preprocessing steps (e.g., batch normalization and its variants in neural networks) and as mathematical adjustments to correlation estimators that ensure correct variance estimates, robustness to nonnormality, and improved inference.
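One classical example of such a mathematical adjustment is the Fisher z-transform, which stabilizes the variance of the sample correlation so that confidence intervals remain usable at small sample sizes. The sketch below is illustrative, not drawn from any specific paper in this theme; the function name and the hard-coded 1.96 normal critical value are assumptions.

```python
import numpy as np

def fisher_z_ci(x, y):
    """Pearson correlation with a Fisher z-transformed 95% confidence interval.

    arctanh(r) has approximately constant variance 1/(n - 3) regardless of
    the true correlation, which is the normalization that makes small-sample
    inference tractable.
    """
    n = len(x)
    r = float(np.corrcoef(x, y)[0, 1])
    z = np.arctanh(r)                 # Fisher z-transform of r
    se = 1.0 / np.sqrt(n - 3)         # variance-stabilized standard error
    crit = 1.96                       # approx. 97.5th percentile of N(0, 1)
    lo = float(np.tanh(z - crit * se))  # map back to correlation scale
    hi = float(np.tanh(z + crit * se))
    return r, (lo, hi)

rng = np.random.default_rng(1)
x = rng.normal(size=40)
y = 0.7 * x + rng.normal(scale=0.5, size=40)
r, (lo, hi) = fisher_z_ci(x, y)
```

Because tanh maps back into (-1, 1), the interval never spills outside the valid correlation range, unlike a naive interval built directly on r.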
2. What advances in multivariate correlation analysis enable detection of complex dependencies beyond pairwise measures?
Multivariate correlation methods extend beyond pairwise correlations to capture complex dependencies among multiple variables or datasets. This research theme centers on canonical correlation analysis (CCA) and its variants, kernel and other nonlinear extensions, and newly proposed concordance-based techniques. The focus is on methodological developments that generalize correlation measures to extract interpretable multivariate relations and improve detection of nonlinear or high-dimensional dependencies in diverse data domains such as genomics, neuroscience, and the social sciences.
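To make the CCA core of this theme concrete, a minimal numpy-only sketch of the classical (linear) formulation follows: whiten each data block by its own covariance, and the singular values of the whitened cross-covariance are the canonical correlations. The ridge term `reg` is a numerical-stability assumption of this sketch, not part of textbook CCA.

```python
import numpy as np

def cca_first_corr(X, Y, reg=1e-8):
    """First canonical correlation between two data blocks X (n x p), Y (n x q).

    Classical linear CCA: whiten each block with the Cholesky factor of its
    covariance, then take the leading singular value of the whitened
    cross-covariance matrix.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / n + reg * np.eye(X.shape[1])  # within-block covariances
    Syy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / n                             # cross-block covariance
    # Whitening: Wx Sxx Wx.T = I, likewise for Y.
    Wx = np.linalg.inv(np.linalg.cholesky(Sxx))
    Wy = np.linalg.inv(np.linalg.cholesky(Syy))
    M = Wx @ Sxy @ Wy.T
    # Singular values of M are the canonical correlations, largest first.
    return float(np.linalg.svd(M, compute_uv=False)[0])

rng = np.random.default_rng(0)
z = rng.normal(size=(200, 1))                     # shared latent signal
X = np.hstack([z + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
Y = np.hstack([z + 0.1 * rng.normal(size=(200, 1)), rng.normal(size=(200, 2))])
rho = cca_first_corr(X, Y)
```

Kernel and concordance-based variants discussed in this theme replace the linear covariances above with kernel Gram matrices or rank-based association measures, but follow the same whiten-then-decompose structure.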
3. How can normalized cross correlation and compression-based distances be applied for image matching and neural synchronization measures?
This research theme focuses on specialized applications of normalized cross correlation (NCC) and normalized compression distance (NCD) in pattern recognition and neuroscience. It covers algorithmic advances that combine normalization with signal processing and compression to improve face matching under varying imaging conditions and to quantify cortico-muscular synchronization in brain signals.
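The illumination robustness that makes NCC attractive for face matching comes from zero-mean normalization: subtracting each window's mean and dividing by the norms makes the score invariant to affine brightness and contrast changes. A minimal sliding-window sketch (function names are illustrative assumptions, not from any cited system):

```python
import numpy as np

def zncc(patch, template):
    """Zero-mean normalized cross-correlation between equal-shape arrays.

    Mean subtraction and norm division make the score invariant to
    per-window brightness offsets and contrast scaling.
    """
    a = patch - patch.mean()
    b = template - template.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 0.0  # flat window: correlation undefined, treat as no match
    return float((a * b).sum() / denom)

def match_template(image, template):
    """Exhaustive sliding-window NCC search; returns (best score, (row, col))."""
    th, tw = template.shape
    best, best_pos = -2.0, (0, 0)
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            s = zncc(image[i:i + th, j:j + tw], template)
            if s > best:
                best, best_pos = s, (i, j)
    return best, best_pos

rng = np.random.default_rng(2)
image = rng.random((20, 20))
# Template is a patch of the image under a brightness/contrast change;
# ZNCC still scores it as a near-perfect match at its true location.
template = image[5:9, 7:11] * 2.0 + 0.3
score, pos = match_template(image, template)
```

Production matchers compute the same quantity in the frequency domain or with running-sum tables rather than this O(n^2) double loop, but the normalization is identical.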