Convolutional neural networks have been successfully applied to image denoising. In particular, dilated convolution, which expands the network's receptive field, has been widely used and has achieved good results in image denoising. However, because some image information is lost, a standard network cannot effectively reconstruct fine image details from noisy images. To solve this problem, we propose a pyramid dilated CNN, which consists mainly of three pyramid dilated convolutional blocks (PDCBs) and a gated fusion unit (GFU). Each PDCB uses dilated convolution to expand the network's receptive field and a pyramid structure to capture more image details. The GFU fuses and enhances the feature maps from the different blocks. Experiments demonstrate that the proposed method outperforms comparative state-of-the-art denoising methods on gray and color images. In addition, the proposed method can effectively handle real-world noisy images.
Xinlei Jia, Yali Peng, Jun Li, Yunhong Xin, Bao Ge, Shigang Liu (2022). Pyramid dilated convolutional neural network for image denoising. Journal of Electronic Imaging, 31(2), 023024. DOI: https://doi.org/10.1117/1.jei.31.2.023024.
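For intuition only, the sketch below illustrates the two ideas named in the abstract: a block of parallel dilated convolutions at several rates (a "pyramid") and a gated fusion of the feature maps produced by successive blocks. This is not the authors' implementation; the class names, channel widths, dilation rates, and layer counts are assumptions, and a PyTorch-style API is assumed throughout.

import torch
import torch.nn as nn


class PyramidDilatedBlock(nn.Module):
    """Parallel 3x3 dilated convolutions at several rates, merged by a 1x1 conv.
    Hypothetical stand-in for the paper's PDCB; rates and widths are guesses."""

    def __init__(self, channels: int = 64, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        )
        # 1x1 conv merges the concatenated pyramid branches back to `channels`.
        self.merge = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)

    def forward(self, x):
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.merge(feats) + x  # residual connection


class GatedFusion(nn.Module):
    """Learns per-pixel weights to fuse feature maps from several blocks.
    Hypothetical stand-in for the paper's GFU."""

    def __init__(self, channels: int = 64, num_blocks: int = 3):
        super().__init__()
        self.gate = nn.Conv2d(channels * num_blocks, num_blocks, kernel_size=1)

    def forward(self, block_feats):
        stacked = torch.cat(block_feats, dim=1)
        weights = torch.softmax(self.gate(stacked), dim=1)  # (N, num_blocks, H, W)
        return sum(w.unsqueeze(1) * f
                   for w, f in zip(weights.unbind(dim=1), block_feats))


class PyramidDilatedDenoiser(nn.Module):
    """Head conv -> three pyramid dilated blocks -> gated fusion -> residual output."""

    def __init__(self, in_channels: int = 1, channels: int = 64):
        super().__init__()
        self.head = nn.Conv2d(in_channels, channels, kernel_size=3, padding=1)
        self.blocks = nn.ModuleList(PyramidDilatedBlock(channels) for _ in range(3))
        self.fusion = GatedFusion(channels, num_blocks=3)
        self.tail = nn.Conv2d(channels, in_channels, kernel_size=3, padding=1)

    def forward(self, noisy):
        x = self.head(noisy)
        feats = []
        for block in self.blocks:
            x = block(x)
            feats.append(x)
        fused = self.fusion(feats)
        # Predict the noise residual and subtract it from the noisy input.
        return noisy - self.tail(fused)


if __name__ == "__main__":
    model = PyramidDilatedDenoiser()
    out = model(torch.randn(1, 1, 64, 64))
    print(out.shape)  # torch.Size([1, 1, 64, 64])

Note that padding equal to the dilation rate keeps the spatial size constant for a 3x3 kernel, so the pyramid branches can be concatenated channel-wise without cropping; the softmax gate then produces spatially varying weights that sum to one across the blocks being fused.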
Type: Article
Year: 2022
Authors: 6
Datasets: 0
Total Files: 0
Language: en
DOI: https://doi.org/10.1117/1.jei.31.2.023024