In recent years, with the large-scale construction of 3G networks and the rapid popularization of smartphones, the era of the mobile Internet has arrived. Android, the operating system launched by Google, has attracted widespread attention since its release and is favored by many manufacturers and developers. Android is an open, complete, and free mobile platform, and its open-source nature continues to draw in developers. The platform has been upgraded from the initial version 1.1 to the latest 4.0, with increasingly powerful functions, a better user experience, and an ever richer set of applications released for it.
As applications on the Android platform multiply, users' expectations of them keep rising. Touch-screen phones are now the norm, and touch is the mainstream way people interact with applications. Because users demand ever greater sensitivity, simplicity, and convenience from touch, it is necessary to optimize the touch behavior of applications.
1 Touch mechanism of Android platform
Generally speaking, events are actions triggered when users interact with the UI (graphical interface). For example, touching a specific area of the phone screen triggers a corresponding event. In Android, such events are delivered to an event handler, a method that receives the event object, interprets it, and processes it.
In Android, responding to user events is very important. User messages come mainly from three hardware input sources: the touch screen (the onTouch family of methods), keys (the onKey family of methods), and the trackball. Whether we interact through touch or through the trackball, any screen-related interaction is an interaction with the view control at the corresponding position on the screen. Almost all Android phones now have touch screens, physical buttons are becoming fewer and fewer, and touch-screen response has become the main trend. Therefore, when designing applications we consider touch first and focus on optimizing the user's touch response.
Figure 1 Android platform user message processing process
Figure 1 details how the various user operations are captured by the system. The Linux driver first captures the user's message, which is passed through the Activity Manager of the Android framework layer to the corresponding system method in the Activity. Executing that method updates the View class, the message is passed on to the Application Framework layer, and the Linux driver is finally called to draw and update the interface. In Android projects, touch-related interfaces revolve mainly around the android.view.MotionEvent class. When writing touch-screen controls, first import this class and then override the relevant method in the Activity subclass. Android's event-handling mechanism is simple to use: there is no need to understand its internals in detail, and each kind of response is obtained just by implementing the corresponding method, with the concrete implementation placed in the View class. The following sections focus on the implementation of touch response on the Android platform.
2 Implementation of touch response on Android platform
Generally speaking, touch events in Android are handled like other UI events, mainly by one of two methods: the onTouchEvent() method of the View class, or the onTouch() method of the OnTouchListener interface. When both exist, the system gives priority to the OnTouchListener callback. The onTouch() method is usually overridden for simple UI interfaces; when switching among multiple Activities is involved, Activity state can be saved and the jump performed inside this method.
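As a minimal illustration of the listener route (requiring the android.view.View and android.view.MotionEvent imports; the startButton variable and the state-saving step are assumptions for this sketch, not code from the original):

startButton.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            // save Activity state and jump to the next Activity here
        }
        return true;   // consume the event; the view's onTouchEvent() is then skipped
    }
});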
For large applications, the interface state is more complex and touch responses must be handled in detail. For this the system provides the interface function onTouchEvent(), dedicated to processing user touch events. In actual development you only need to declare this function and then override it in the main View class to implement the specific touch effects. The following is the declaration of the interface function:
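public boolean onTouchEvent(MotionEvent event)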
The touch-screen callback has a single parameter, a MotionEvent. An instance of this class records the player's touch actions, including presses, moves, multi-touch, and screen pressure, and the class defines static constants for the action types. The player's action is obtained through the event.getAction() method and matched against the required action constant.
Taking a game's sound-settings interface as an example: event.getAction(), a system method of the MotionEvent class, returns the type of the current touch action. When the screen is pressed, MotionEvent.ACTION_DOWN is reported; when the contact point falls within a given range, the corresponding operation is performed to achieve the game's state jump. When dividing touch ranges it is best to use positions relative to the screen as reference points; this makes landscape/portrait handling and porting more convenient and avoids repeated modification of hard-coded reference coordinates. The width and height of the current device's screen can be obtained at the program entry in the subclass that inherits Activity, so touch ranges can be determined from these attribute values. The flowchart of the entire touch part is as follows:
Figure 2 Specific process of touch screen operation response
Figure 2 shows the specific response flow when the screen is touched during the game, involving mainly the Activity class and the View class. The touch method onTouchEvent() is declared in the Activity class and defined in detail in the View class. When a touch occurs, the event-response mechanism is triggered: the event object reports the action through getAction(), the current touch coordinates are read with event.getX() and event.getY(), and they are compared against the touch ranges defined in the method. If the point lies inside a region, the touch response executes. After it executes, the release must be handled in the MotionEvent.ACTION_UP branch so that the current touch response is released in time.
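Putting this flow together, the sketch below shows one plausible shape of such a View subclass; the screen-size fields, the button bounds, and the gameState values are illustrative assumptions rather than the article's actual code:

import android.content.Context;
import android.view.MotionEvent;
import android.view.View;

public class GameView extends View {
    private final int screenWidth, screenHeight;   // obtained in the Activity at startup
    private int gameState = 0;                     // 0 = menu, 1 = playing (assumed values)

    public GameView(Context context, int w, int h) {
        super(context);
        screenWidth = w;
        screenHeight = h;
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        float x = event.getX();
        float y = event.getY();
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                // Ranges are relative to the screen size, so the same code
                // survives rotation and porting to other resolutions.
                if (x > screenWidth * 0.4f && x < screenWidth * 0.6f
                        && y > screenHeight * 0.7f && y < screenHeight * 0.9f) {
                    gameState = 1;   // jump from the menu into the game
                }
                break;
            case MotionEvent.ACTION_UP:
                // Release the touch promptly so one press cannot also
                // trigger the next interface's response.
                break;
        }
        return true;   // event consumed
    }
}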
3 Optimization of touch response on Android platform
It is worth noting that in the MotionEvent.ACTION_DOWN response above, a temporary touch-count variable keyCount is defined. Only when the counter reaches a certain level is the touch response executed. This effectively prevents a continuous touch from firing repeatedly, and an unreleased touch from jumping straight into the next state. Such a touch counter is very necessary when jumping between interfaces, especially in game menus.
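A hedged sketch of this counter inside the same onTouchEvent(); keyCount is the article's name, while KEY_COUNT_THRESHOLD and doMenuAction() are illustrative stand-ins:

private int keyCount = 0;
private static final int KEY_COUNT_THRESHOLD = 3;   // tuning value, assumed

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction()) {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_MOVE:
            // The response fires only after the press has been observed for
            // several events, buffering against accidental double triggers.
            if (++keyCount >= KEY_COUNT_THRESHOLD) {
                doMenuAction();    // hypothetical menu-jump handler
                keyCount = 0;
            }
            break;
        case MotionEvent.ACTION_UP:
            keyCount = 0;          // reset as soon as the finger lifts
            break;
    }
    return true;
}

private void doMenuAction() {
    // perform the menu jump here
}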
Besides the menu interfaces, the other important touch response is the control of the protagonist in the game. The touch-response principle of the main game interface is the same as that of the menu interfaces; the difference lies in how the touch-judgment range is chosen. The touch ranges of a menu interface are fixed, while the touch areas of the game interface are demarcated dynamically. Since the protagonist is the core character of a role-playing game, the touch response that controls the protagonist is particularly important. Touch is generally designed around the protagonist, with the protagonist's coordinates as the base point for all judgments.
Figure 3 Division of the area when the protagonist is in control
Figure 3 shows the area division when the protagonist moves. The intersection of the horizontal and vertical axes is the protagonist's coordinate center, and the surrounding space is divided into 10 areas. When the touch point lies in areas 9-10 the direction is right; in the symmetric areas 5-6 it is left; in areas 7-8 it is down. Areas 1-4 are subdivided further: areas 2-3 respond only to the up direction, area 1 responds to the right and up keys simultaneously, and area 4 responds to the left and up keys simultaneously. This refinement matches how players actually play, and the angular extent of areas 1 and 4 can be fine-tuned according to the actual situation.
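One way this division might be coded, inside the game View's onTouchEvent(), is by the angle of the touch point around the protagonist; player.x, player.y, keyDown(), and the exact sector boundaries are assumptions (Figure 3's precise angles are not given), while SonicPlayer.KEY_* follows the article's naming:

float dx = event.getX() - player.x;                  // protagonist position: assumed fields
float dy = player.y - event.getY();                  // flip y so "up" is positive
double angle = Math.toDegrees(Math.atan2(dy, dx));   // -180 .. 180 degrees

if (angle > -36 && angle <= 36) {
    player.keyDown(SonicPlayer.KEY_RIGHT);           // areas 9-10: right
} else if (angle > 36 && angle <= 72) {
    player.keyDown(SonicPlayer.KEY_RIGHT);           // area 1: right + up
    player.keyDown(SonicPlayer.KEY_UP);
} else if (angle > 72 && angle <= 108) {
    player.keyDown(SonicPlayer.KEY_UP);              // areas 2-3: up only
} else if (angle > 108 && angle <= 144) {
    player.keyDown(SonicPlayer.KEY_LEFT);            // area 4: left + up
    player.keyDown(SonicPlayer.KEY_UP);
} else if (angle > 144 || angle <= -144) {
    player.keyDown(SonicPlayer.KEY_LEFT);            // areas 5-6: left
} else {
    player.keyDown(SonicPlayer.KEY_DOWN);            // areas 7-8: down
}

The boundary angles here are placeholders and can be fine-tuned exactly as described above for areas 1 and 4.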
After the area around the protagonist has been divided in this way, each region maps to its key control: the two diagonal upper regions respond to the up key together with the left or right key, the side regions respond to the left or right key, and the top and bottom regions respond to the up and down keys respectively, after which the corresponding touch-key operation is performed. Dividing the response by region like this gives a good user experience. Note that every touch-key press controlling the protagonist must be matched by a release of that touch key; otherwise the corresponding logic keeps executing after the system receives the user's touch response. When event.getAction() reports MotionEvent.ACTION_UP, the touch key is released. The release operation looks like this:
if ((player.keyStatus & SonicPlayer.KEY_LEFT) == SonicPlayer.KEY_LEFT) {
    player.keyUp(SonicPlayer.KEY_LEFT);   // release this specific touch response
}
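Presumably the same check-and-release pair is repeated for SonicPlayer.KEY_RIGHT, KEY_UP, and KEY_DOWN inside the MotionEvent.ACTION_UP branch, so that every direction still held is cleared the moment the finger leaves the screen.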
The improvement to the touch-screen interface module therefore lies in refining the regions around the protagonist's center point and performing the operation corresponding to each region, while adding a counting variable to the touch response.
4 Conclusion
Touch response on the Android platform matters greatly. Although the underlying touch-event processing mechanism is fairly involved, implementing a touch response is very simple: you only need to implement the relevant methods. An application's touch-screen response divides mainly into the touch response of the menu interfaces and that of the game interface; the two share the same implementation mechanism and differ only in how the touch range is defined.
In each menu interface, the touch range is determined mainly by the pixel position of each menu image relative to the screen, from which the touch-screen response is realized; in the game interface the focus is the protagonist, and the touch ranges and corresponding operations are determined by the area division described above. When implementing the touch methods, the temporary touch-count variable is essential: it acts as a buffer preventing an unreleased touch key from triggering the responses of multiple interfaces. This paper has implemented and optimized touch response on the Android platform, which greatly enhances the user experience and has strong practical value.