Paper: TP-P5.11
Session: Image and Video Artifact Removal and Denoising
Time: Tuesday, September 18, 14:30 - 17:10
Presentation: Poster
Title: A VARIATIONAL RECOVERY METHOD FOR VIRTUAL VIEW SYNTHESIS
Authors: Akira Kubota, Tokyo Institute of Technology; Takahiro Saito, Kanagawa University
Abstract: This paper presents a novel method based on an image recovery scheme for virtual view synthesis. First, using multiple hypothetical depths, we generate multiple candidate images for the desired virtual view. The generated images suffer from blending artifacts (which appear as blur) due to pixel mis-correspondence. From these blurry images, we recover an artifact-free image (i.e., an all-in-focus image) by minimizing an energy functional of the unknown textures at all the hypothetical depths. The desired image is finally reconstructed as the sum of all the estimated textures. Simulation results show that, although texture color values are spread over all the hypothetical depths (i.e., depth is not uniquely identified for every pixel), the desired image can nevertheless be reconstructed with adequate quality.
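The following is a minimal illustrative sketch of the general idea described in the abstract, not the authors' actual energy functional or solver. It assumes a simplified model in which each candidate image I_d equals the sharp texture at depth d plus a Gaussian-blurred mixture of the textures at the other depths, adds a quadratic smoothness penalty per texture layer, and minimizes the resulting energy by plain gradient descent. All names, the blur model, and the parameters (blur_sigma, lam, step, iters) are assumptions made for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def recover_textures(candidates, blur_sigma=2.0, lam=0.05, step=0.1, iters=300):
    """Sketch of per-depth texture recovery by energy minimization.

    Assumed model (hypothetical, for illustration only):
        I_d  ~=  f_d + blur( sum_{d' != d} f_d' )
    with energy
        E = sum_d ||f_d + blur(sum_{d'!=d} f_d') - I_d||^2
            + lam * sum_d ||grad f_d||^2.
    The synthesized virtual view is the sum of the recovered layers f_d.
    """
    D = len(candidates)
    textures = [c / D for c in candidates]          # rough initial guess
    for _ in range(iters):
        total = sum(textures)
        # residual of each candidate image under the assumed blending model
        residuals = [
            textures[d] + gaussian_filter(total - textures[d], blur_sigma) - candidates[d]
            for d in range(D)
        ]
        new_textures = []
        for d in range(D):
            g = residuals[d].copy()
            for d2 in range(D):
                if d2 != d:
                    # the Gaussian blur is symmetric, so its adjoint is the same blur
                    g += gaussian_filter(residuals[d2], blur_sigma)
            # gradient of the smoothness term lam*||grad f||^2 is -lam * laplacian(f)
            g -= lam * laplace(textures[d])
            new_textures.append(textures[d] - step * g)
        textures = new_textures
    return sum(textures)

# Hypothetical usage: `candidates` would be the blurry virtual-view images,
# one per hypothetical depth, each as a float numpy array of the same shape.
# view = recover_textures(candidates)
```

Note that the final view is formed simply by summing the estimated layers, which mirrors the abstract's statement that the desired image is reconstructed as the sum of all estimated textures even when depth cannot be uniquely assigned per pixel.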