- 安藤寛哲, 藤吉弘亘
- IEEJ Transactions (電気学会論文誌), vol. 131, no. 4, pp. 482–489, 2011
A camera self-calibration method based on the results of human detection is proposed. The method extracts the positions and heights of people in the target scene from the results of human detection and human-area segmentation, and from these it estimates camera parameters such as the camera location in world coordinates and the vanishing line in image coordinates. Camera calibration generally requires intensive manual effort, but the proposed method performs self-calibration using parameters that are automatically extracted from the target images. As a result, the method can estimate the three-dimensional position of an object even with a camera that has not been calibrated in advance. Experimental results show that the accuracy of the estimated camera parameters is improved by using the results of human-area segmentation.
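As a rough illustration of this kind of pedestrian-based self-calibration (a minimal sketch, not the paper's exact formulation), one can use the standard single-view relation for upright people on a ground plane under a roll-free camera: person_height / camera_height = (y_foot − y_head) / (y_foot − y_horizon). This is linear in the horizon row and the height ratio, so both can be recovered by least squares from several detections, assuming an average person height:

```python
import numpy as np

def calibrate_from_people(foot_y, head_y, avg_height_m=1.7):
    """Estimate the horizon row and camera height from pedestrian detections.

    Hypothetical sketch: assumes a roll-free camera, upright people on a
    ground plane, and the single-view relation
        person_h / camera_h = (y_foot - y_head) / (y_foot - y_horizon),
    rewritten linearly in the unknowns y_horizon and r = camera_h / person_h:
        y_foot = y_horizon + r * (y_foot - y_head).
    """
    foot_y = np.asarray(foot_y, dtype=float)
    head_y = np.asarray(head_y, dtype=float)
    h = foot_y - head_y                        # apparent person height in pixels
    A = np.column_stack([np.ones_like(h), h])  # unknowns: [y_horizon, r]
    (y_horizon, r), *_ = np.linalg.lstsq(A, foot_y, rcond=None)
    return y_horizon, r * avg_height_m         # horizon row, camera height [m]

# Synthetic check: horizon at row 200, camera 8.5 m up, people 1.7 m tall
y_h, cam_h = calibrate_from_people([400.0, 500.0, 600.0],
                                   [360.0, 440.0, 520.0])
```

The paper's method additionally refines the person positions and heights via human-area segmentation before estimation, which is what improves the parameter accuracy in the reported experiments.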