MATEC Web Conf.
Volume 63, 2016
2016 International Conference on Mechatronics, Manufacturing and Materials Engineering (MMME 2016)
Number of page(s): 6
Section: Information Technology, Control and Application
Published online: 12 July 2016
Fast Image Super-resolution with Sparse Coding
Shandong Vocational College of Science and Technology, Weifang, 261053, China
In this paper, we introduce a novel fast image reconstruction method for super-resolution (SR) based on sparse coding. The method combines online dictionary learning with a fast sparse coding scheme, both of which improve the efficiency of the reconstruction process while preserving visual quality. The online dictionary learning algorithm relies on stochastic approximations, which drastically accelerates training, especially on sets of millions of training samples. Meanwhile, we train a neural network to speed up the reconstruction itself: it is derived from the iterative shrinkage-thresholding algorithm (ISTA), so we call it the learned iterative shrinkage-thresholding algorithm (LISTA). It produces a close approximation to the sparse code at a fixed network depth. We demonstrate that our approach simultaneously improves image fidelity and reduces computational cost.
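As background for the abstract above, a minimal sketch of the classical ISTA iteration that LISTA unrolls is shown below. This is not the authors' implementation; the dictionary `D`, signal `y`, and regularization weight `lam` are illustrative names, and the sparse coding problem assumed is the standard lasso form min_x 0.5·||y − Dx||² + lam·||x||₁.

```python
import numpy as np

def soft_threshold(v, t):
    # Element-wise shrinkage (soft-thresholding) operator.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(D, y, lam, n_iter=100):
    # Iterative shrinkage-thresholding for the lasso-style
    # sparse coding problem: min_x 0.5*||y - D x||^2 + lam*||x||_1.
    L = np.linalg.norm(D, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)       # gradient of the smooth data term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

LISTA replaces this open-ended loop with a fixed number of layers whose matrices and thresholds are learned from data, which is what allows a fixed-depth network to approximate the sparse code quickly.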
Key words: SRIR / sparse coding / super-resolution / fast image super-resolution
© Owned by the authors, published by EDP Sciences, 2016
This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.