Camera Self-Calibration and 3D Position Estimation Based on Human Detection Results
- 安藤寛哲, 藤吉弘亘 (Hironobu Fujiyoshi)
- IEEJ Technical Meeting on General Industry (電気学会 一般産業研究会), 2009
Download: PDF (Japanese)
A camera self-calibration method based on the results of human detection is proposed. The method extracts the positions and heights of people in the target scene from the results of human detection and human-area segmentation, and estimates camera parameters such as the camera location in world coordinates and the vanishing line in image coordinates. Camera calibration generally requires intensive manual work, but the proposed technique performs self-calibration using parameters that are automatically extracted from the target images. As a result, the method can estimate the three-dimensional position of an object even with a camera that has not been previously calibrated. Experimental results show that the accuracy of the estimated camera parameters is improved by using the results of human-area segmentation.
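The abstract does not give the estimation equations, but the general idea of calibrating from detected people can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes a pinhole camera with zero roll and tilt at unknown height Hc, a flat ground plane, and people of a known nominal height H. Under that model the foot row y_f, head row y_h, and horizon row y0 of each person satisfy y_f = y0 + (Hc/H)(y_f - y_h), which is linear in the unknowns y0 and Hc/H, so both the vanishing (horizon) line and the camera height can be recovered by least squares; the 3D position of a new foot point then follows from its offset to the horizon. All numbers below are synthetic.

```python
import numpy as np

# Hypothetical synthetic setup (not from the paper): pinhole camera,
# zero roll/tilt, focal length f (px), camera height Hc (m), people of
# a known nominal height H (m), flat ground plane.
f, Hc, H = 800.0, 3.0, 1.7
y0_true = 240.0                      # image row of the true horizon line

# Simulated foot/head image rows for detected people at several depths Z.
Z = np.array([5.0, 7.0, 10.0, 14.0, 20.0])
y_foot = y0_true + f * Hc / Z
y_head = y0_true + f * (Hc - H) / Z

# Model: y_foot = y0 + k * (y_foot - y_head) with k = Hc / H,
# linear in the unknowns (y0, k) -> fit by least squares.
d = y_foot - y_head                  # apparent person height in pixels
k, y0_est = np.polyfit(d, y_foot, 1)
Hc_est = k * H                       # camera height from assumed person height

# 3D position of a new foot point (x_f, y_f): depth from the horizon offset,
# lateral position from the image column (principal point cx = 320 assumed).
x_f = 400.0
y_f = y0_est + f * Hc_est / 8.0      # synthetic person standing 8 m away
Z_est = f * Hc_est / (y_f - y0_est)
X_est = (x_f - 320.0) * Z_est / f

print(round(Hc_est, 2), round(y0_est, 1), round(Z_est, 2), round(X_est, 2))
```

With noise-free synthetic observations the fit recovers the horizon row and camera height exactly; with real detections one would instead feed many noisy foot/head pairs into the same least-squares fit, which is presumably where the paper's human-area segmentation improves accuracy by giving cleaner height measurements.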