Verleysen, Cédric
[UCL]
Nowadays, when viewing pre-recorded video content, the spectator's viewpoint is restricted to one of the cameras that recorded the scene. To improve the viewing experience, the next generation of video content aims at enabling viewers to interactively define their own viewpoint. This domain is known as view synthesis and consists of interpolating the images that would be seen from viewpoints other than those captured by real cameras. However, current solutions require that the real cameras share very similar viewpoints, which limits the range of synthesized views. To circumvent this issue, this thesis focuses on view interpolation when only two real cameras observe the scene from very different viewpoints. This minimalist and challenging camera configuration, called wide-baseline stereo, makes view synthesis and its underlying 3D estimation problem ill-posed, i.e., multiple reconstructed views are possible. This thesis proposes three new priors to address this problem. The first contribution proposes an energy minimization framework that favors the preservation of the order of the scene's elements while the viewpoint changes. The second contribution assumes that the scene's background is piecewise planar and approximates its 3D structure by a set of 3D planes under sparsity and smoothness constraints. The last contribution considers the view synthesis of dynamic foreground objects. It first learns a prior about plausible 2D silhouettes of the object, based on non-linear dimensionality reduction of their shape descriptors, and then constrains the interpolated views to be consistent with this shape prior.
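To give a flavor of the shape-prior idea in the last contribution, the sketch below builds a toy silhouette descriptor and embeds it with kernel PCA, a standard non-linear dimensionality reduction technique. This is only an illustrative assumption, not the thesis's actual pipeline: the radial descriptor, the RBF kernel, and the synthetic ellipse "silhouettes" are all stand-ins chosen for brevity.

```python
import numpy as np

def radial_descriptor(contour, n_bins=32):
    # Toy shape descriptor (an assumption, not the thesis's descriptor):
    # distances from the centroid to contour points, sampled at n_bins
    # positions and normalized, giving translation/scale invariance.
    centered = contour - contour.mean(axis=0)
    r = np.linalg.norm(centered, axis=1)
    idx = np.linspace(0, len(r) - 1, n_bins).astype(int)
    d = r[idx]
    return d / (d.max() + 1e-12)

def kernel_pca(X, n_components=2, gamma=1.0):
    # Non-linear dimensionality reduction via RBF kernel PCA:
    # build the Gram matrix, double-center it, then project onto
    # the leading eigenvectors scaled by sqrt of their eigenvalues.
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Synthetic "silhouettes": ellipse contours with varying aspect ratio,
# so the learned embedding should vary smoothly with the deformation.
t = np.linspace(0, 2 * np.pi, 200)
contours = [np.stack([np.cos(t), a * np.sin(t)], axis=1)
            for a in np.linspace(0.3, 1.0, 20)]
X = np.stack([radial_descriptor(c) for c in contours])
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (20, 2): one 2D embedding per silhouette
```

A shape prior of the kind described in the abstract could then score how close a candidate interpolated silhouette's embedding lies to this learned low-dimensional manifold.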
Bibliographic reference
Verleysen, Cédric. 3D estimation and view synthesis in wide-baseline stereo. Supervisor: De Vleeschouwer, Christophe
Permalink
http://hdl.handle.net/2078.1/167678