Key research themes
1. How can compression schemes exploit data locality and statistical properties to improve compression ratio and efficiency?
This theme investigates compression methods that exploit locality of reference, data-dependent adaptive coding, and the statistical characteristics of the data to improve compression ratios without sacrificing speed or requiring multiple passes over the data. The focus is on algorithmic improvements and on theoretical guarantees relative to baseline coding schemes such as Huffman coding (a minimal single-pass adaptive-model sketch appears after this list).
2. What are statistical and data-driven methods for predicting or achieving high lossy compression ratios with controlled distortion in scientific data?
This theme covers models, metrics, and techniques tailored to lossy compression of large scientific datasets, focusing on accurate prediction of compression ratios and of quality metrics such as PSNR under prescribed error bounds. It includes computational frameworks and compression pipelines on modern architectures (e.g., GPUs) that balance compression ratio, throughput, and fidelity (the second sketch after this list illustrates the quantities involved).
3. How do domain-specific factors and applications influence compression ratio relevance and interpretation, particularly in medical imaging and physical sciences?
This theme examines how data acquisition parameters and the physical properties of the data source interact with compression ratio metrics, emphasizing the limitations of relying on compression ratio alone as an indicator of fidelity or quality. It draws on experimental and computational studies in medical CT imaging, soil deformation measurement, and physical material compression, illustrating the need for multidimensional assessment of compression impact beyond the ratio itself (see the third sketch after this list).
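To make the first theme concrete, the following is a minimal sketch, not any specific published scheme: it contrasts a two-pass static model (standing in for a Huffman-coded baseline) with a single-pass adaptive model whose exponentially decayed counts favour recently seen symbols and therefore exploit locality of reference. The `decay` parameter, the Laplace-style initial counts, and the synthetic stream are illustrative assumptions.

```python
import math
from collections import Counter

def static_cost_bits(data):
    """Two-pass baseline: ideal code length under the global symbol
    distribution, a rough proxy for static Huffman coding."""
    freq = Counter(data)
    n = len(data)
    return sum(-math.log2(freq[s] / n) for s in data)

def adaptive_cost_bits(data, alphabet, decay=0.99):
    """Single-pass adaptive model: exponentially decayed counts give
    recently seen symbols higher probability (shorter codes), so streams
    with locality of reference cost fewer bits."""
    counts = {s: 1.0 for s in alphabet}          # Laplace-style smoothing
    total = float(len(alphabet))
    bits = 0.0
    for s in data:
        bits += -math.log2(counts[s] / total)    # ideal code length for s
        counts = {k: v * decay for k, v in counts.items()}  # forget old stats
        counts[s] += 1.0                         # reward the symbol just seen
        total = total * decay + 1.0
    return bits

# Synthetic stream with strong locality: long phases dominated by one symbol.
stream = ["a"] * 500 + ["b"] * 500 + ["c"] * 500 + ["a"] * 500
alphabet = sorted(set(stream))

print(f"static  (two-pass): {static_cost_bits(stream) / len(stream):.2f} bits/symbol")
print(f"adaptive (one-pass): {adaptive_cost_bits(stream, alphabet) / len(stream):.2f} bits/symbol")
```

On this stream the static cost is exactly 1.5 bits/symbol regardless of symbol order, while the adaptive model comes out substantially lower because each phase quickly dominates its recent statistics, and it needs only one pass.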
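For the second theme, the sketch below illustrates the kind of quantities such studies predict and measure, under simplifying assumptions: an absolute-error-bounded uniform quantizer (bin width 2·abs_err) followed by zlib as a generic stand-in for the lossless back end of a real scientific compressor, reporting the achieved compression ratio, maximum pointwise error, and PSNR on a synthetic smooth field. All names, the field, and the error bounds are illustrative.

```python
import numpy as np
import zlib

def error_bounded_compress(field, abs_err):
    """Toy pipeline: uniform quantization with bin width 2*abs_err (so the
    pointwise reconstruction error is at most abs_err), then a generic
    lossless byte compressor (zlib) as a stand-in for an entropy coder."""
    codes = np.round(field / (2 * abs_err)).astype(np.int32)
    recon = codes * (2 * abs_err)
    return zlib.compress(codes.tobytes(), 9), recon

def psnr_db(original, recon):
    """PSNR using the data range as the peak value."""
    mse = np.mean((original - recon) ** 2)
    peak = original.max() - original.min()
    return 10 * np.log10(peak ** 2 / mse)

# Smooth synthetic 2-D field with mild noise, standing in for scientific data.
rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 512)
field = np.sin(x)[:, None] * np.cos(x)[None, :] + 0.01 * rng.standard_normal((512, 512))

for abs_err in (1e-2, 1e-3, 1e-4):
    payload, recon = error_bounded_compress(field, abs_err)
    ratio = field.nbytes / len(payload)
    print(f"abs_err={abs_err:.0e}  max|err|={np.abs(field - recon).max():.1e}  "
          f"ratio={ratio:6.1f}x  PSNR={psnr_db(field, recon):5.1f} dB")
```

Tightening the error bound raises PSNR but lowers the achievable ratio; the prediction problem studied in this theme is estimating that trade-off curve for a given dataset without running the full compressor.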
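For the third theme, the sketch below reuses the same toy quantizer, again purely illustrative, on two synthetic disc "phantoms" that differ only in simulated acquisition noise, loosely mimicking high-dose versus low-dose CT of the same object; the noise levels, phantom geometry, and dose labels are assumptions. With an identical error bound, the maximum error is the same by construction and PSNR is similar, yet the compression ratio differs substantially, which is why ratio alone is a poor proxy for fidelity.

```python
import numpy as np
import zlib

def lossy_compress(img, abs_err):
    """Same toy error-bounded quantizer: bin width 2*abs_err, zlib back end."""
    codes = np.round(img / (2 * abs_err)).astype(np.int32)
    recon = codes * (2 * abs_err)
    return zlib.compress(codes.tobytes(), 9), recon

def report(label, img, abs_err):
    payload, recon = lossy_compress(img, abs_err)
    mse = np.mean((img - recon) ** 2)
    psnr = 10 * np.log10((img.max() - img.min()) ** 2 / mse)
    print(f"{label:<16s} ratio={img.nbytes / len(payload):5.1f}x  "
          f"PSNR={psnr:5.1f} dB  max|err|={np.abs(img - recon).max():.2f}")

# Crude disc phantom plus two simulated acquisition-noise levels.
rng = np.random.default_rng(1)
yy, xx = np.mgrid[-1:1:256j, -1:1:256j]
phantom = 1000.0 * ((xx**2 + yy**2) < 0.6)
low_noise_scan = phantom + 5.0 * rng.standard_normal(phantom.shape)    # "high dose"
high_noise_scan = phantom + 40.0 * rng.standard_normal(phantom.shape)  # "low dose"

# Identical error bound, i.e. nominally identical fidelity, for both scans:
report("high-dose scan", low_noise_scan, abs_err=2.0)
report("low-dose scan", high_noise_scan, abs_err=2.0)
```

Because the ratio here tracks the noise injected by the acquisition rather than the distortion introduced by the compressor, a meaningful assessment needs the fidelity metrics alongside the ratio, which is the multidimensional view this theme argues for.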