1 System solution design
The humanoid robot control system consists of two parts: the robot controller and the remote control. The controller is built around a Philips ARM7 processor and includes 17 servo control and drive circuits with their interfaces, a serial communication and debugging circuit, a remote control interface, power supplies, and so on; control information is downloaded and transmitted through the serial port. The remote control uses a wireless module to send instructions to the robot controller, directing the robot to complete specified actions.
2 Hardware design
The humanoid robot controller mainboard consists of the LPC2114 control core, the power supply and detection circuit, an external crystal oscillator, a JTAG debugging interface, a serial port module, a remote control module, the servo interfaces, LED indicators, and so on. The block diagram is shown in Figure 1.
The LPC2114 is based on a 32-bit ARM7TDMI-S CPU that supports real-time emulation and trace. It provides 128 KB of high-speed Flash memory in a very small LQFP64 package with extremely low power consumption, and integrates two 32-bit timers, a 4-channel 10-bit ADC, PWM outputs, 46 GPIOs, and up to 9 external interrupts, making it particularly suitable for industrial control, medical systems, access control, and point-of-sale (POS) terminals [2].
This high level of functional integration and strong port drive capability keep the core circuitry of the robot controller motherboard very simple, essentially a single-chip design. Figure 2 shows the LPC2114 controller circuit and the drivers for four of the servos; the interface circuits of the other 13 servo drivers are similar.
Considering the motion balance control of the educational robot, the PCB is laid out as shown in Figure 3(a). Besides the processor's minimum system and the necessary indication and control circuits, the most important feature of the layout is the 17 servo drive interfaces. Angled 3-row pin headers are used to keep the connections compact and reliable. The 17 servo interfaces are divided into 5 groups and, together with the download interface, are arranged nine to a side on the left and right of the board. The head of the humanoid robot has 1 servo interface, responsible for left-right movement. Each arm has 3 servo interfaces, controlling the 3 degrees of freedom of the shoulder, elbow, and wrist. Each leg has 5 servo interfaces, controlling the left-right and front-back movement of the hip, the knee, and the front-back and left-right movement of the ankle. The interfaces and functions for the right arm and right leg are symmetrical to those of the left half of the robot controller [3].
Finally, the controller and the 7.2 V battery pack are mounted side by side in the aluminum alloy shell, which serves as the robot's torso and as the center of gravity for movement balance. A good layout is crucial to the robot's movement [4].
Considering the weight and required actuation strength of the humanoid robot, the design uses the FUTABA S3050 high-torque, metal-gear digital servo intended for car and marine competition use, shown in Figure 3(b). The servo weighs 48.8 g, is small in size, operates at 6 V, and achieves a rotation speed of 0.16 s/60° and a torque of 6.5 kg·cm.
The 17 servos are arranged symmetrically according to the grouping above and the joint structure of the human body: 1 servo for the head; 3 servos for each arm, serving as the shoulder, elbow, and wrist joints; and 5 servos for each leg, providing left-right and front-back movement of the hip, the knee joint, and front-back and left-right movement of the ankle. The 2 shoulder servos and the 2 hip servos for left-right movement are installed symmetrically and compactly, top to bottom and left to right, as part of the body; all the other servos are fixed into a compact humanoid structure using aluminum alloy brackets [5].
3 Software design
Software design mainly includes software architecture design, program flow design and drive control function design.
3.1 Software architecture
The principle of the humanoid robot control system is as follows: when the robot controller receives an instruction from the remote control, the ARM processor decomposes the action to be executed into a series of PWM signals of varying widths according to the instruction, and outputs them through the drive circuits to the servo interfaces; each external servo rotates to the angle corresponding to the duty cycle of its PWM signal. The different rotation angles of the servos make up one instantaneous posture of the robot, and the continuous execution of many such postures completes the action corresponding to the external command. The software architecture is shown in Figure 4. This article mainly explains the key drive functions.
3.2 Servo driver design
The relationship between drive pulse and servo rotation angle is shown in Table 1: different high-level durations correspond to different servo output positions. Therefore, the LPC2114's internal timer can be used to generate 17 PWM waveforms, each with a 20 ms period and the appropriate high-level width and timing, to drive and control the rotation angles of the 17 servos and so complete the robot's actions [6].
Driving multiple servos requires calculating each rotation time from the robot's posture data, sorting the time data with a suitable algorithm, and setting the start and stop times of each servo's pulse under CPU control.
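To make this concrete, the following is a minimal sketch of one such 20 ms frame, not taken from the article: all servo pins are raised together at the start of the frame, the pulse widths are sorted in ascending order, and each pin is lowered once its width has elapsed. The helper functions timer_restart(), timer_elapsed_us() and servo_pin_write(), the PulseSlot structure, and the pin numbering are assumptions made for illustration.
#include <stdlib.h>                                  /* qsort */
#define SERVO_NUM 17
#define FRAME_US  20000                              /* 20 ms PWM period */
typedef struct { int width_us; int pin; } PulseSlot; /* high-level time and output pin of one servo */
extern void timer_restart(void);                     /* assumed: restart the frame timer from zero */
extern unsigned int timer_elapsed_us(void);          /* assumed: microseconds since timer_restart() */
extern void servo_pin_write(int pin, int level);     /* assumed: set one servo output pin high or low */
static int SlotCompare(const void *a, const void *b)
{
    return ((const PulseSlot *)a)->width_us - ((const PulseSlot *)b)->width_us;
}
void ServoFrame(PulseSlot slot[SERVO_NUM])
{
    int i;
    qsort(slot, SERVO_NUM, sizeof(PulseSlot), SlotCompare);  /* shortest pulse first */
    timer_restart();
    for (i = 0; i < SERVO_NUM; i++)
        servo_pin_write(slot[i].pin, 1);                     /* raise all 17 pins at the frame start */
    for (i = 0; i < SERVO_NUM; i++) {
        while (timer_elapsed_us() < (unsigned int)slot[i].width_us) ;  /* wait for this pulse to end */
        servo_pin_write(slot[i].pin, 0);                     /* lower the pin whose time has expired */
    }
    while (timer_elapsed_us() < FRAME_US) ;                  /* pad out the 20 ms period */
}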
3.3 Design of the overall servo drive function ManMoveFrame()
Function purpose: Convert the input angle of each servo into that servo's high-level pulse time, and call the corresponding servo drivers in order of the 17 high-level times, from shortest to longest. There are 18 entry parameters: 17 servo angles and 1 action dwell time. The conversion relationship is 1 degree ≈ 0.00814 ms; for example, a 90° input adds about 90 × 0.00814 ≈ 0.73 ms to the servo's base pulse width.
Function definition: void ManMoveFrame (int16 Head_Angle, int16 Larm1_Angle, …, int16 Rleg5_Angle, uint8 t)
{uint8 i, t1; int HeadTime, Larm1Time, Larm2Time, …, Rleg4Time, Rleg5Time;/*Time variable corresponding to angle*/
int MotorTime[17]; /*Each servo drive pulse time array*/
MotorTime[0]=HeadTime=Head_0+Head_Angle*Angle1;/*Head drive pulse time*/
/*MotorTime[1] to MotorTime[16] are computed in the same way*/
qsort(MotorTime, 17, sizeof(int), Compare); /*Quickly sort the MotorTime array so that its values are rearranged from small to large*/ …
for (i=0; i<17; i++) {…} /*Drive each servo in order, from the shortest high-level time to the longest*/ }
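The qsort() call above requires a comparison callback Compare() that is not shown in the listing. A minimal version for ascending integer order, together with a purely illustrative call of ManMoveFrame() (all 17 angles set to 0 and a dwell time of 50, values chosen only as placeholders), might be:
#include <stdlib.h>                              /* qsort */
int Compare(const void *a, const void *b)
{
    return *(const int *)a - *(const int *)b;    /* sort pulse times from small to large */
}
/* Illustrative call: 17 joint angles followed by the dwell time */
ManMoveFrame(0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 50);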
3.4 Design of the robot posture data input driver function ManMoveKeyframeData()
The key postures of the robot's actions are determined by the rotation position of each servo. The degree-of-freedom data of the 17 servos must be set with reference to the robot's posture. The drive function converts parameters set as shown in Table 2 into drive signals that control the rotation of the 17 servos, completing the robot's walking action. Blank entries in the table default to 0.
Function purpose: Initialize the angle/time array of each servo, and store the angle data of each servo for each posture of the robot into the corresponding array. There are 19 entry parameters in total: 1 key frame number, 17 servo angles, and 1 action dwell time.
Function definition: void ManMoveKeyframeData(uint16 ID, int16 H_Angle, …, int16 RL5_Angle, uint8 t) {
ID_Max++;/*Global variable recording the number of action frames; each added frame increases ID_Max by 1*/
H[ID]=H_Angle;/*Angle of head*/ … RL5[ID]=RL5_Angle; /*Angle of right leg servo 5*/
T_Key[ID]=t;/*The length of time the frame action is completed*/}
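ManMoveKeyframeData() writes into global per-servo arrays that are not declared in the listing. Under the assumption of a fixed maximum number of key frames (MAX_FRAME below is a hypothetical value) and array names following the H[]/RL5[] pattern above, the supporting declarations might look like:
#define MAX_FRAME 64                     /* assumed upper limit on key frames; not given in the article */
uint16 ID_Max = 0;                       /* number of key frames entered so far */
int16  H[MAX_FRAME];                     /* head angle of each key frame */
int16  LA1[MAX_FRAME], LA2[MAX_FRAME], LA3[MAX_FRAME];  /* left arm: shoulder, elbow, wrist */
int16  RA1[MAX_FRAME], RA2[MAX_FRAME], RA3[MAX_FRAME];  /* right arm: shoulder, elbow, wrist */
int16  LL1[MAX_FRAME], LL2[MAX_FRAME], LL3[MAX_FRAME], LL4[MAX_FRAME], LL5[MAX_FRAME]; /* left leg */
int16  RL1[MAX_FRAME], RL2[MAX_FRAME], RL3[MAX_FRAME], RL4[MAX_FRAME], RL5[MAX_FRAME]; /* right leg */
uint8  T_Key[MAX_FRAME];                 /* dwell time of each key frame */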
3.5 Design of robot motion control driver function ManMoveKeyframeToFrame()
Main function: Combine each posture key frame, which may involve a large range of movement, with the key frame immediately following it, and convert the pair into ordinary frame drive parameters evenly distributed in time, for use by the overall drive function. Entry parameters: none. Export parameters: none.
Function definition: void ManMoveKeyframeToFrame(void)
{uint16 id, h, la1, la2, la3, ra1, ra2, ra3, ll1, ll2, ll3, ll4, ll5, rl1, rl2, rl3, rl4, rl5; uint8 t, k;/*Frame number and temporary servo time variables*/
for(id=0; id<ID_Max; id++) {…} /*Convert each key frame into ordinary frames*/ }
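The body of this loop is not listed. A minimal sketch of the kind of interpolation described above, assuming the globals from Section 3.4, a hypothetical per-pair step count STEP_NUM, and the ManMoveFrame() driver from Section 3.3, could look like this (only the head angle computation is written out; the other 16 joints are handled the same way):
#define STEP_NUM 8                               /* hypothetical number of ordinary frames per key-frame pair */
for (id = 0; id + 1 < ID_Max; id++) {            /* pair each key frame with the one that follows it */
    for (k = 1; k <= STEP_NUM; k++) {
        /* linear interpolation of the head angle between key frame id and id+1 */
        h = H[id] + (int16)(((long)(H[id + 1] - H[id]) * k) / STEP_NUM);
        /* la1 ... rl5 are interpolated from their arrays in the same way */
        ManMoveFrame(h, la1, la2, la3, ra1, ra2, ra3,
                     ll1, ll2, ll3, ll4, ll5, rl1, rl2, rl3, rl4, rl5,
                     (uint8)(T_Key[id] / STEP_NUM));   /* one uniform-time ordinary frame */
    }
}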
3.6 Robot program flow
The robot determines and completes its actions according to the remote control commands it receives; the workflow is shown in Figure 5. Because the humanoid robot runs on batteries and its movements consume considerable power, the working voltage must be checked before each movement. If it meets the working requirement, the robot continues; otherwise the action is not performed and an alarm is issued, since moving with insufficient power would make the robot fall. After receiving the end command the robot stops running and no longer responds to remote control commands sent from outside. After completing an action it returns to the standing posture [7].
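A minimal sketch of this workflow, assuming hypothetical helpers remote_get_command(), adc_read_battery_mv(), alarm(), LoadActionKeyframes() and ReturnToStanding(), as well as an assumed low-voltage threshold and end-command code, might be:
#define VBAT_MIN_MV 6500                            /* assumed minimum working voltage; the article gives no figure */
#define CMD_END     0xFF                            /* assumed code of the end command */
void RobotMainLoop(void)
{
    uint8 cmd;
    uint8 running = 1;
    while (running) {
        cmd = remote_get_command();                 /* assumed: wait for the next remote control command */
        if (cmd == CMD_END) {                       /* end command: stop and ignore further commands */
            running = 0;
            continue;
        }
        if (adc_read_battery_mv() < VBAT_MIN_MV) {  /* check the working voltage before moving */
            alarm();                                /* insufficient power: refuse the action and warn */
            continue;
        }
        LoadActionKeyframes(cmd);                   /* assumed: enter the action's key frames via ManMoveKeyframeData() */
        ManMoveKeyframeToFrame();                   /* convert key frames to frames and drive the servos */
        ReturnToStanding();                         /* assumed: return to the standing posture */
    }
}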
It should be pointed out that when the robot is moving, the program must drive the external servos in batches, in a time-shared manner, to reduce the drive load on the processor and achieve reliable driving.
4 Conclusion
The motion data is designed according to the robot's motion postures. After testing, the control system can complete a variety of gymnastic movements, including forward rolls, backward rolls, and push-ups. The research team plans to further encapsulate the functions and design a visual, block-based graphical programming interface so that motion data can be generated more intuitively on a computer, reducing the difficulty of operation.