Papers by Claudio Alberti

IBC 2015 Conference, 2015
Broadcast and broadband networks continue to be separate worlds in the video consumption business. Some initiatives such as HbbTV have built a bridge between the two worlds, but their application is largely limited to providing links over the broadcast channel to content providers' applications, such as catch-up TV services. In practice, the user is using either one network or the other. H2B2VS is a Celtic-Plus project aiming at exploiting the potential of real hybrid networks by implementing efficient synchronization mechanisms and using new video coding standards such as High Efficiency Video Coding (HEVC). The goal is to develop successful hybrid network solutions that enable value-added services with optimum bandwidth usage in each network and with clear commercial applications. An example of the potential of this approach is the transmission of Ultra-HD TV by sending the main content over the broadcast channel and the required complementary information over the broadband network. This technology can also be used to improve accessibility: deaf viewers receive, through the broadband network, a sign language translation of a programme sent over the broadcast channel; the TV set then displays this translation in an inset window. One of the most important contributions of the project is developing and testing synchronization methods between two different networks that offer unequal qualities of service, with significant differences in delay and jitter. In this paper, the main technological contributions of the project are described, including SHVC, the scalable extension of HEVC, with a special focus on the synchronization solution adopted by MPEG and DVB.
The paper also presents some of the implemented practical use cases, such as the sign language translation described above, and their performance results so as to evaluate the commercial application of this type of solution.
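The core synchronization problem described above can be sketched with a small buffering model: broadband units arrive with jitter and possibly out of order, so they are held in a timestamp-ordered buffer and released only when the broadcast clock catches up. All names and the PTS-driven release policy below are illustrative assumptions, not the actual H2B2VS or MPEG/DVB mechanism.

```python
import heapq

class HybridSynchronizer:
    """Aligns broadband units to broadcast presentation timestamps (PTS).

    Broadband packets may arrive out of order because of network jitter,
    so they are buffered in a min-heap keyed by PTS and released only
    when the broadcast clock reaches their timestamp.
    """

    def __init__(self):
        self._buffer = []  # min-heap of (pts, payload)

    def push_broadband(self, pts, payload):
        heapq.heappush(self._buffer, (pts, payload))

    def pop_due(self, broadcast_pts):
        """Return all buffered broadband units due at the current broadcast PTS."""
        due = []
        while self._buffer and self._buffer[0][0] <= broadcast_pts:
            due.append(heapq.heappop(self._buffer)[1])
        return due

sync = HybridSynchronizer()
# Units arrive out of presentation order:
sync.push_broadband(200, "sign-language frame B")
sync.push_broadband(100, "sign-language frame A")

assert sync.pop_due(100) == ["sign-language frame A"]
assert sync.pop_due(250) == ["sign-language frame B"]
```

In a real deployment the release decision would also compensate for the constant broadcast/broadband delay offset measured at session start; the fixed-clock model here is a simplification.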

Intellectual Property Management and Protection for Multimedia Content
Nowadays, network infrastructures are increasingly used to support the commercialization of digital multimedia content. Such non-material goods, namely videos, music, still images and any other type of multimedia information, are ready for the migration from traditional delivery technologies to a fully electronic delivery model. Incidentally, the same features that make the distribution and management of digital content so easy are also responsible for the difficulties of selling it online. Digital information as such can be copied an unlimited number of times and transferred to an unlimited number of people. It is therefore of crucial importance for the development of a digital multimedia market that exploits the full potential of digital technology to ensure that the holders of intellectual property over the content are appropriately rewarded and that only authorized parties can access the valuable information. Moreover, the whole protection system shall be as transparent...

ECMA-407: A New 3D audio codec implementation up to NHK 22.2
ECMA-407, the first 3D audio standard worldwide, introduces a new concept of static models for lower-bitrate coding, which may be equally applied to channels, channels and objects, and Higher Order Ambisonics (HOA). Static models may operate either in the time domain or in the frequency domain, and allow transporting highly complex 3D audio content, up to NHK 22.2, with no or very little side information, as would be necessary with dynamic models, generally referred to as parametric coding. Static models in the time domain, when calibrated with statistical means, would require extensive computational complexity and large amounts of data. A new approach, based on David Hilbert's extensive studies on invariant theory, instead introduces a new "non-random" concept with Gaussian processes. Apart from an already published solution to the problem, an alternative solution is given here for the first time, entirely based on Hilbert's ingenious construction of "kanonische Nullformen" (canonical null forms). ECMA-407 is compatible with waveform-preserving base audio codecs as well as non-waveform-preserving audio codecs such as USAC and HE-AAC v2. Dataflow programming by means of RVC-CAL will furthermore establish a programming environment that is particularly apt for ECMA-407. The development, spatial tuning and testing environment for ECMA-407 at McGill University, conformant to ITU-R Recommendation BS.1116-1, is described, together with a short description of subjective performance with PCM and with non-tuned and tuned non-waveform-preserving base audio codecs.

Porting an MPEG-HEVC decoder to a low-power many-core platform
After several generations of video coding standards, MPEG High Efficiency Video Coding (HEVC) is likely to emerge as the video coding standard for HD and Ultra-HD TV. HEVC decoding is expected to be less computationally demanding and to provide a higher level of potential intrinsic parallelism. A many-core platform such as the STM STHORM appears to be a very good candidate for supporting low-power HEVC implementations capable of exploiting the different intrinsic parallelization options. This work explores the potential of implementing the HEVC wavefront and tiles algorithms on the STHORM. Different partitioning options of an HEVC decoder specified at high level using the standard RVC-CAL dataflow language are presented. Performance is measured and profiled on the STHORM platform by repartitioning and refactoring the dataflow software according to performance objectives.
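The wavefront parallelism mentioned above follows a simple dependency rule: a coding tree unit (CTU) at (row, col) can start once its left neighbour is done and the row above has advanced two CTUs further, so successive rows overlap. The sketch below computes earliest start steps under that rule, assuming unit processing time per CTU for illustration; it is a toy model of the schedule shape, not the STHORM implementation.

```python
def wavefront_schedule(rows, cols):
    """Earliest start step of each CTU under wavefront (WPP) dependencies.

    A CTU depends on the CTU to its left and, in the row above, on the
    CTU one column to the right (i.e. the upper row must be two CTUs
    ahead before the next row can proceed).
    """
    start = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            deps = []
            if c > 0:
                deps.append(start[r][c - 1] + 1)      # left CTU finished
            if r > 0:
                up_c = min(c + 1, cols - 1)           # upper row two ahead
                deps.append(start[r - 1][up_c] + 1)
            start[r][c] = max(deps, default=0)
    return start

s = wavefront_schedule(3, 4)
# Each row starts two steps after the one above, so rows decode in parallel:
assert s[1][0] == s[0][1] + 1
assert s[2][0] == s[1][1] + 1
```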
MPEG-G the emerging standard for genomic data compression
Keywords: genomics; data compression. Reference: EPFL-CONF-233617.
2018 Data Compression Conference, Mar 1, 2018
High-throughput sequencing of RNA molecules has enabled the quantitative analysis of gene expression at the expense of storage space and processing power. To alleviate these problems, lossy compression methods for the quality scores associated with RNA sequencing data have recently been proposed, and the evaluation of their impact on downstream analyses is gaining attention. In this context, this work presents a first assessment of the impact of lossily compressed quality scores in RNA sequencing data on the performance of some of the most recent tools used for differential gene expression.
An Interpreted Approach to Multimedia Streams Protection
An Audio Virtual DSP for Multimedia Frameworks
The new MPEG-4 Audio standard provides two toolsets for synthetic audio generation, audio processing and multimedia content description, called Structured Audio (SA) and BInary Format for Scenes (BIFS).

2013 Conference on Design and Architectures for Signal and Image Processing, 2013
This paper presents a methodology to perform design space exploration of complex signal processing systems implemented using the CAL dataflow language. In the course of the exploration, the critical path of the dataflow program is first presented and then analyzed using a new strategy for computational load reduction. These techniques, together with the detection of design bottlenecks, point to the most efficient optimization directions in a complex network. Following these analyses, several new refactoring techniques are introduced and applied to the dataflow program in order to obtain feasible design points in the exploration space. For software and hardware implementations of an MPEG-4 AVC/H.264 decoder, the multi-dimensional space can be explored effectively for throughput, resource usage and frequency, with real-time decoding ranging from QCIF to HD resolutions.
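The critical-path analysis described above amounts, in its simplest form, to a longest weighted path through the directed acyclic graph of actor firings. The sketch below is a generic version of that computation under assumed per-actor firing costs; names and costs are illustrative, and this is not the tool's actual implementation.

```python
from graphlib import TopologicalSorter

def critical_path(costs, edges):
    """Longest weighted path through a DAG of actor firings.

    costs: {actor: firing cost}; edges: [(src, dst), ...] token dependencies.
    Returns (total cost, path as a list of actors).
    """
    preds = {a: set() for a in costs}
    for src, dst in edges:
        preds[dst].add(src)
    finish, best_pred = {}, {}
    # Visit actors in dependency order; each finishes after its slowest predecessor.
    for actor in TopologicalSorter(preds).static_order():
        prev = max(preds[actor], key=lambda p: finish[p], default=None)
        finish[actor] = costs[actor] + (finish[prev] if prev else 0)
        best_pred[actor] = prev
    end = max(finish, key=finish.get)
    path, node = [], end
    while node is not None:          # walk back along the heaviest chain
        path.append(node)
        node = best_pred[node]
    return finish[end], path[::-1]

# Toy decoder-like pipeline: a parser feeds two branches that merge.
costs = {"parser": 2, "idct": 5, "mv": 3, "merge": 1}
edges = [("parser", "idct"), ("parser", "mv"), ("idct", "merge"), ("mv", "merge")]
total, path = critical_path(costs, edges)
assert total == 8                      # parser -> idct -> merge: 2 + 5 + 1
assert path == ["parser", "idct", "merge"]
```

Actors off the critical path (here, "mv") have slack, which is exactly where refactoring or repartitioning effort is wasted; the heaviest chain is where it pays off.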

Design space exploration and implementation of RVC-CAL applications using the TURNUS framework
2013 Conference on Design and Architectures for Signal and Image Processing, 2013
While research on the design of heterogeneous concurrent systems has a long and rich history, a unified design methodology and tool support have not emerged so far, and thus the creation of such systems remains a difficult, time-consuming and error-prone process. The absence of principled support for system evaluation and optimization at high abstraction levels makes the quality of the resulting implementation highly dependent on the experience or prejudices of the designer. In this work we present TURNUS, a unified dataflow design space exploration framework for heterogeneous parallel systems. It provides high-level modelling and simulation methods and tools for system-level performance estimation and optimization. TURNUS represents the outcome of several years of research in the area of co-design exploration for multimedia stream applications. During the presentation, it will be demonstrated how the initial high-level abstraction of the design facilitates the use of different anal...
Differential Gene Expression with Lossy Compression of Quality Scores in RNA-Seq Data
2017 Data Compression Conference (DCC)
High-throughput sequencing of RNA molecules has enabled the quantitative analysis of gene expression at the expense of storage space and processing power. To help alleviate these problems, lossy compression methods for the quality scores associated with RNA sequencing data have recently been proposed, and the evaluation of their impact on downstream analyses is gaining attention. This work presents a first assessment of the impact of lossily compressed quality scores in RNA sequencing data on the performance of some of the most recent tools used for differential gene expression.

Context-aware consumption of content implies that platforms mediating the access to content must be able to do so independently of the user's location, type of terminal, network conditions and content formats. In addition, user preferences, environmental conditions and content owners' usage rights should be respected. The fulfilment of this concept requires the availability of a comprehensive set of metadata related to both the content and the usage environment. But how can systems coherently and easily exchange, inter-relate and use the different types of required meta-information? This paper presents the solution developed within the ENTHRONE project with this aim, describing the mechanisms designed for publishing multimedia content and all relevant metadata associated with the consumption of that content, in an open and unified form, for accessing that content upon request of end users, and for deciding if and what kind of adaptation operations are needed to provide a context-awa...
Turnus: A unified dataflow design space exploration framework for heterogeneous parallel systems
This paper presents the main features of the TURNUS co-exploration environment, a unified design space exploration framework suitable for heterogeneous parallel systems designed using a high-level dataflow representation. The main functions of this tool are illustrated through the analysis of a video decoder implemented in the RVC-CAL dataflow language.
The MPEG-4 standard provides a complete framework for hybrid coding of natural and structured sound information that permits the description of complete spatial environments with a small amount of data. A new platform, based on a virtual DSP, has been developed to optimize the MPEG decoding and pre-processing interface that feeds an array of loudspeakers rendering spatial audio by Wave Field Synthesis. The virtual DSP has an instruction set adapted to MPEG-4 advanced audio features on superscalar processors.
Journal of Computational Biology
The intrinsic high-entropy sequence metadata, known as quality scores, is largely the cause of the substantial size of sequence data files. Yet, there is no consensus on a viable reduction of the resolution of the quality score scale, arguably because of collateral side effects. In this article, we leverage the penalty functions of the HISAT2 aligner to rebin the quality score scale in such a way as to avoid any impact on sequence alignment, identifying along the way a distortion threshold for "safe" quality score representation. We tested our findings on whole-genome and RNA-seq data, and contrasted the results with three methods for lossy compression of the quality scores.
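The rebinning idea above can be illustrated with a toy model: if the aligner's mismatch penalty is a step function of the Phred quality, then all qualities that map to the same penalty can share one representative value without changing any alignment score. The penalty model below (linear interpolation between a minimum and maximum penalty, capped at Q=40) is an assumption for illustration, not the exact HISAT2 formula.

```python
MN, MX = 2, 6  # assumed min/max mismatch penalties

def mismatch_penalty(q):
    """Toy quality-dependent mismatch penalty: a step function of Phred q."""
    return MN + int((MX - MN) * min(q, 40) / 40.0)

def build_rebin_table(max_q=41):
    """Map each quality to the lowest quality yielding the same penalty."""
    table, rep_for_penalty = {}, {}
    for q in range(max_q + 1):
        p = mismatch_penalty(q)
        rep_for_penalty.setdefault(p, q)   # first quality seen per penalty level
        table[q] = rep_for_penalty[p]
    return table

table = build_rebin_table()
# Alignment penalties are preserved for every quality value...
assert all(mismatch_penalty(q) == mismatch_penalty(table[q]) for q in table)
# ...while the quality alphabet shrinks to one symbol per penalty level.
assert len(set(table.values())) == len({mismatch_penalty(q) for q in table})
```

A smaller alphabet means lower entropy, hence better compression of the quality stream, while the alignment stage behaves identically by construction.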

The MPEG-G standardization project is a coordinated international effort to specify a compressed data format that enables large-scale genomic data to be processed, transported and shared. The standard consists of a set of specifications (i.e., a book) describing a normative decoding process to retrieve the information coded in a compliant file or bitstream. It provides the means to implement leading-edge compression technologies that have been shown to achieve significant compression gains over currently used formats for the storage of unaligned and aligned sequencing reads. Additionally, the standard provides a wealth of much-needed functionality, such as selective access, application programming interfaces to the compressed data, standard interfaces to support data protection mechanisms, support for streaming, and a process to assess the conformance of implementations. ISO/IEC is engaged in supporting the maintenance and availability of the standard specification, which guarantees the...
An Evaluation Framework for Lossy Compression of Genome Sequencing Quality Values
2016 Data Compression Conference (DCC), 2016
This paper provides the specification and an initial validation of an evaluation framework for the comparison of lossy compressors of genome sequencing quality values. The goal is to define the reference data, test sets, tools and metrics that shall be used to evaluate the impact of lossy compression of quality values on human genome variant calling. The functionality of the framework is validated on two state-of-the-art genomic compressors. This work has been spurred by the current activity within the ISO/IEC SC29/WG11 technical committee (a.k.a. MPEG), which is investigating the possibility of starting a standardization activity for genomic information representation.
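One building block such a framework needs is a distortion metric on the quality values themselves. The sketch below computes mean squared error between an original and a lossily reconstructed Phred+33 quality string; it illustrates only the rate-distortion side, since the variant-calling impact described above requires a full analysis pipeline. Function names are illustrative.

```python
def phred_values(qual_string, offset=33):
    """Decode an ASCII-encoded (Phred+33) quality string into integer scores."""
    return [ord(c) - offset for c in qual_string]

def quality_mse(original, reconstructed):
    """Mean squared error between two equally long quality strings."""
    a, b = phred_values(original), phred_values(reconstructed)
    assert len(a) == len(b), "quality strings must be equally long"
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# Lossless reconstruction has zero distortion:
assert quality_mse("IIII", "IIII") == 0.0
# A one-step error on each of four scores gives an MSE of 1.0:
assert quality_mse("IIII", "JJJJ") == 1.0
```

Pairing such a distortion figure with the compressed size gives one point on a rate-distortion curve per compressor setting, which is the usual way to compare lossy codecs before running the far more expensive variant-calling evaluation.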