The camera calibration toolbox in MATLAB provides a variety of routines and calibration methods, is documented in detail, and even supplies grid-type calibration targets. Its user interface is convenient and flexible, which makes calibrating a camera very simple. In addition, the C source code of the toolbox is available in the open-source computer vision library OpenCV, which provides ideal conditions for in-depth study and secondary development [7-9].
1 Principle of camera calibration
Three levels of coordinate systems are involved in calibration: the world coordinate system, the camera coordinate system, and the image plane coordinate system (which comprises a physical coordinate system and a pixel coordinate system), as shown in Figure 1.
1.1 World coordinate system
The world coordinate system, also called the real-world coordinate system and represented by XwYwZw, is the absolute coordinate system of the objective world (hence it is also called the objective coordinate system). Three-dimensional scenes are generally described in this coordinate system.
1.2 Camera coordinate system
The camera coordinate system is a coordinate system centered on the camera, represented by XcYcZc, and the optical axis of the camera is generally taken as the Zc axis.
1.3 Image plane coordinate system
The image physical coordinate system is the xy coordinate system on the image plane formed inside the camera; the image plane is generally parallel to the XcYc plane of the camera coordinate system.
The image pixel coordinate system is the uv coordinate system on the same plane, with the upper-left corner of the image plane ∏ generally taken as the origin.
The brightness of each point on the image is related to the intensity of the light reflected at the corresponding surface point of the object, while the position of the image point on the image plane depends only on the relative orientation of the object in camera space and on the internal structure of the camera. That internal structure is determined by the camera's internal parameters. To describe the imaging geometry of the camera, it must be modeled mathematically. The pinhole model, also known as the linear model, is usually used. Mathematically, this model is a central projection from three-dimensional space to a two-dimensional plane, described by a 3×4 matrix. The model is a (degenerate) projective transformation, so it is usually called a projective camera.
1.4 Camera Calibration Principles
Camera calibration refers to establishing the relationship between pixel positions in the camera image and the positions of scene points. This is done by solving for the camera's model parameters from the image coordinates and world coordinates of known feature points, according to the camera model, as shown in Figure 2. The model parameters to be calibrated are divided into internal and external parameters, and the conversion relationship is:
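(The relationship is written out below in the standard pinhole form, reconstructed to match the parameter definitions that follow; $s$ is a non-zero scale factor.)

$$
s\begin{bmatrix}u\\ v\\ 1\end{bmatrix}
= \underbrace{\begin{bmatrix}f_x & \gamma & u_0\\ 0 & f_y & v_0\\ 0 & 0 & 1\end{bmatrix}}_{M_1}
\underbrace{\begin{bmatrix}R & T\end{bmatrix}}_{M_2}
\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix}
$$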
The transformation from a point in the world coordinate system to the camera coordinate system can be represented by an orthogonal rotation matrix R and a translation vector T. fx, fy, γ, u0, and v0 are the internal parameters of the linear model: fx and fy are the equivalent focal lengths in the X and Y directions, respectively; (u0, v0) are the coordinates of the image center (the intersection of the optical axis and the image plane); and γ is the non-perpendicularity (skew) factor between the u and v axes. If the matrices M1 and M2 are known, the correspondence between world coordinates and pixel coordinates is established; the calibration task of the camera is to find the parameters in each transformation matrix.
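As a concrete illustration of this model, the sketch below maps a world point to pixel coordinates. All numeric values (the intrinsic matrix K and the pose R, T) are made up for illustration; they are not the calibration results of this paper.

```python
# Minimal sketch of the pinhole projection (u, v) <- M1 * [R T] * Xw.
# All numbers below are illustrative, not real calibration results.

def project(Xw, K, R, T):
    """Map a 3-D world point Xw to pixel coordinates (u, v)."""
    # World -> camera coordinates: Xc = R * Xw + T
    Xc = [sum(R[i][j] * Xw[j] for j in range(3)) + T[i] for i in range(3)]
    # Apply the intrinsic matrix, then divide by the depth Zc
    x = K[0][0] * Xc[0] + K[0][1] * Xc[1] + K[0][2] * Xc[2]
    y = K[1][1] * Xc[1] + K[1][2] * Xc[2]
    return x / Xc[2], y / Xc[2]

# Identity rotation; camera 1 m in front of the world origin along Zc
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
T = [0.0, 0.0, 1.0]
# K = [[fx, gamma, u0], [0, fy, v0], [0, 0, 1]]
K = [[3500.0, 0.0, 1640.0], [0.0, 3500.0, 1230.0], [0.0, 0.0, 1.0]]

print(project([0.0, 0.0, 0.0], K, R, T))  # world origin -> principal point (1640.0, 1230.0)
```

A point on the optical axis projects onto the principal point (u0, v0), which is a quick sanity check for any implementation.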
Since the camera optical system does not work exactly according to the idealized pinhole imaging principle, there is lens distortion, that is, an optical distortion error between the actual image of an object point on the camera imaging plane and its ideal image [2,3]. There are three main types of distortion error: radial distortion, decentering (eccentric) distortion, and thin prism distortion, represented by δr, δd, and δp, respectively. The first type produces only radial deviations of the image point position, while the latter two produce both radial and tangential deviations.
After considering the distortion, the ideal image point coordinates (Xu, Yu) on the image plane are equal to the sum of the actual image point coordinates (Xd, Yd) and the distortion error, that is:
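(In the standard form, reconstructed here to match the notation above:)

$$
X_u = X_d + \delta_x(X_d, Y_d), \qquad Y_u = Y_d + \delta_y(X_d, Y_d),
$$

where each component of the distortion error is the sum of the radial, decentering, and thin prism contributions, $\delta = \delta_r + \delta_d + \delta_p$.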
2 Camera calibration
The output image resolution of the camera is 3280 × 2460, and a black-and-white chessboard with 30 mm squares is used as the calibration template. The calibration process is as follows:
(1) Run the calibration main function calib_gui, and the mode selection window shown in Figure 3 is displayed.
Through this operation, you can choose to load all calibration photos at once, or load them in batches when computer memory is insufficient. Either mode opens the same user window, in which the rest of the calibration is completed, as shown in Figure 4.
(2) Read the images in the main window to load the photos to be calibrated.
(3) Extract the corner points. The program running interface is shown in Figure 5. The calibration results are as follows:
%--Focal length:
fc=[3463.194803808018200;3807.341090056066200];
%--Principal point:
cc=[1633.861831663415600;1394.235351077526500];
%--Skew coefficient:
alpha_c=0.000000000000000;
%--Distortion coefficients:
kc=[-0.208188511841198;0.035081678657317;0.002387581735940;0.000491712255333;0.000000000000000];
%--Focal length uncertainty:
fc_error=[260.123743256455500;284.746622601852150];
%--Principal point uncertainty:
cc_error=[36.917650368224287;47.589021356646775];
%--Skew coefficient uncertainty:
alpha_c_error=0.000000000000000;
%--Distortion coefficients uncertainty:
kc_error=[0.031723675208984;0.077972615251388;0.002023682615518;0.001567520438212;0.000000000000000];
%--Image size:
nx=3280;
ny=2460;
%--Various other variables (may be ignored if you do not use the Matlab Calibration Toolbox):
%--Those variables are used to control which intrinsic parameters should be optimized
n_ima=12;
% Number of calibration images
est_fc=[1;1];
% Estimation indicator of the two focal variables
est_aspect_ratio=1;
% Estimation indicator of the aspect ratio fc(2)/fc(1)
center_optim=1;
% Estimation indicator of the principal point
est_alpha=0;
% Estimation indicator of the skew coefficient
est_dist=[1;1;1;1;0];
% Estimation indicator of the distortion coefficients
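As a sketch of how these results are used, the code below applies the distortion model to an ideal (normalized) image point and converts it to pixel coordinates. It assumes the toolbox's usual layout kc = [k1; k2; p1; p2; k3], with k1, k2, k3 radial and p1, p2 tangential coefficients; verify this convention against your toolbox version.

```python
# Sketch: apply the estimated distortion to a normalized image point,
# assuming the toolbox layout kc = [k1; k2; p1; p2; k3].

def distort(x, y, kc):
    """Return the distorted normalized coordinates (xd, yd)."""
    k1, k2, p1, p2, k3 = kc
    r2 = x * x + y * y                               # squared radius
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    dx = 2 * p1 * x * y + p2 * (r2 + 2 * x * x)      # tangential part
    dy = p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x * radial + dx, y * radial + dy

# Values taken from the calibration output above (truncated)
fc = [3463.194803808018, 3807.341090056066]
cc = [1633.861831663416, 1394.235351077527]
kc = [-0.208188511841198, 0.035081678657317,
      0.002387581735940, 0.000491712255333, 0.0]

xd, yd = distort(0.1, 0.1, kc)       # an ideal point off the axis
u = fc[0] * xd + cc[0]               # to pixels (alpha_c = 0, no skew)
v = fc[1] * yd + cc[1]
print(u, v)
```

With zero coefficients the mapping reduces to the identity, and a point at the distortion center is unaffected, which gives two easy checks.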
The experiment shows that calibration results can be obtained quickly with the calibration toolbox in MATLAB. The operation is simple and easy to understand, the visualization is good, and the result errors, distortion, and so on can be displayed in graphical form.
References
[1] Zhang Guangjun. Visual Measurement[M]. Beijing: Science Press, 2008.
[2] Wu Fuchao. Mathematical Methods in Computer Vision[M]. Beijing: Science Press, 2008.
[3] TSAI R Y. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses[J]. IEEE Journal of Robotics and Automation, 1987, RA-3(4): 323-344.
[4] TSAI R Y. An efficient and accurate camera calibration technique for 3D machine vision[C]. Proc. of IEEE Conference on Computer Vision and Pattern Recognition, 1986: 364-374.
[5] ZHANG Z Y. Flexible camera calibration by viewing a plane from unknown orientations[C]. In: Proceedings of the International Conference on Computer Vision (ICCV'99), 1999: 666-673.
[6] Jin Zhiguang, Wei Jiandong, Zhang Guanyu, et al. A new camera calibration method based on LCD[J]. Remote Sensing Applications, 2008, 1: 87-90.
[7] Meng Xiaoqiao, Hu Zhanyi. A new easy camera calibration technique based on circular points[J]. Pattern Recognition, 2003, 36(5): 1155-1164.
[8] WU Y H, ZHU H J, HU Z Y, et al. Camera calibration from the quasi-affine invariance of two parallel circles[C]. In: Proc. European Conference on Computer Vision (ECCV'2004), 2004, I: 190-202.
[9] Xu De, Zhao Xiaoguang, Tu Zhiguo, et al. Camera calibration of hand-eye system based on single feature point[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2000, 22(11): 1330-1334.