Residual-error prediction based on deep learning for lossless image compression

This publication appears in: Electronics Letters
Authors: I. Schiopu and A. Munteanu
Volume: 54, Issue: 17, Pages: 1032-1033
Publication Date: Aug. 2018
Abstract: A novel residual-error prediction method based on deep learning, with application in lossless image compression, is introduced. The proposed method uses machine learning to minimise the residual error left by existing prediction tools. Experimental results demonstrate average bitrate savings of 32% over the state-of-the-art in lossless image compression. To the best of the authors' knowledge, this Letter is the first to propose a deep-learning based method for residual-error prediction.
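The core idea of residual-error prediction can be sketched as follows: a classical predictor produces a first estimate of each pixel, and a learned model then estimates the remaining residual so that only the (smaller) refined residual must be entropy-coded. The sketch below is a minimal illustration only; the left-neighbour predictor and the trivial causal residual estimator are hypothetical stand-ins, not the authors' actual deep network.

```python
import numpy as np

def left_neighbour_prediction(img):
    # Classical predictor: each pixel is predicted by its left neighbour.
    pred = np.empty_like(img)
    pred[:, 0] = img[:, 0]       # first column predicts itself (zero residual)
    pred[:, 1:] = img[:, :-1]
    return pred

def estimate_residual(residual):
    # Hypothetical stand-in for the learned residual-error predictor:
    # a simple causal estimate (half of the residual one row above).
    est = np.zeros_like(residual)
    est[1:, :] = residual[:-1, :] // 2
    return est

img = np.array([[10, 12, 13, 15],
                [11, 13, 14, 16],
                [12, 14, 15, 17]], dtype=np.int64)

pred = left_neighbour_prediction(img)
residual = img - pred                        # classical prediction residual
refined = residual - estimate_residual(residual)

# The entropy coder would encode `refined`, which is closer to zero
# than `residual` whenever the estimator is accurate.
print(np.abs(residual).sum(), np.abs(refined).sum())
```

Because neighbouring residuals are correlated, even this crude estimator shrinks the total residual magnitude; the Letter's contribution is to learn that estimator with a deep network instead.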