Implementation and optimization of touch screen for Android mobile applications


In recent years, with the large-scale construction of 3G networks and the rapid spread of smartphones, the era of the mobile Internet has arrived. Android, the operating system launched by Google, has attracted widespread attention since its release and is favored by many manufacturers and developers. Android is an open, complete, and free mobile phone platform, and its open-source nature draws ever more developers. The system has been upgraded from the initial version 1.1 to the current 4.0, with increasingly powerful functionality, a better user experience, and an ever richer range of applications released on the platform.

As applications on the Android platform multiply, users' expectations of them keep rising. Touch-screen phones are now the trend, and touch is the mainstream way people interact with applications. Demands on touch sensitivity, simplicity, and convenience keep growing, so the touch behavior of an application is worth optimizing.

1 Touch mechanism of the Android platform

Generally speaking, events are actions triggered when the user interacts with the UI (graphical interface). For example, touching a specific area of the phone screen triggers a corresponding event. In Android, these events are delivered to an event handler, a method that receives the event object and interprets and processes it.

In Android, responding to user events is critical. User messages mainly come from three hardware inputs: touch (the onTouch family of methods), keys (the onKey family of methods), and the trackball. Whether the interaction arrives by touch or by trackball, any screen-related interaction is an interaction with the view control at the corresponding position on the screen. Almost all Android phones now use touch screens, physical buttons are becoming fewer, and touch-screen response is the main direction of development. When designing an application, therefore, the touch experience deserves the primary attention, and optimizing the user's touch response is the focus.

Figure 1 User message processing flow on the Android platform

Figure 1 details how various user operations are captured by the system. The Linux driver captures the user's message, which is passed through the Activity Manager of the Android framework layer to the corresponding system method in the Activity. Executing that method updates the View class, the message is passed back through the Application Framework layer, and finally the Linux driver is invoked to draw and refresh the interface. In an Android project, touch-related data is encapsulated in the android.view.MotionEvent class (key events use android.view.KeyEvent). When writing touch-screen handling, first import this class, then override the handler method in the Activity subclass. Android's event-handling mechanism is simple to use: there is no need to understand its internals in detail, because implementing a response only requires overriding the corresponding methods, with the concrete implementation carried out in the View class, as sketched below. The following focuses on implementing touch response on the Android platform.
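
A minimal wiring sketch under these assumptions (the GameView class name is hypothetical): the Activity installs a custom View, and the framework then delivers touch messages to that view's onTouchEvent():

import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.view.MotionEvent;
import android.view.View;

public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Install the custom View; touch messages will be routed to it.
        setContentView(new GameView(this));
    }
}

class GameView extends View {
    GameView(Context context) {
        super(context);
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        // Touch messages captured by the driver arrive here via the framework.
        return true;   // true = the event has been consumed
    }
}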

2 Implementation of touch response on the Android platform

Generally speaking, in Android the handling of touch events parallels other UI events and is done mainly in one of two ways: overriding the onTouchEvent() method of the View class, or implementing the onTouch() method of the View.OnTouchListener interface. When both are present, the system gives priority to the OnTouchListener callback: onTouch() is invoked first, and if it consumes the event, onTouchEvent() is never called. The listener approach is typically used for simple UI designs; when switching among multiple Activities is involved, the Activity's information can be saved and the jump performed inside that method.
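
A minimal sketch of the listener approach (the R.id.sound_button id is hypothetical), placed in an Activity's onCreate() after setContentView(); because the listener is consulted first, returning true from onTouch() means the view's own onTouchEvent() will not run:

View soundButton = findViewById(R.id.sound_button);   // hypothetical view id
soundButton.setOnTouchListener(new View.OnTouchListener() {
    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            // Save this Activity's state and start the next Activity here.
        }
        return true;   // consumed: the view's onTouchEvent() is skipped
    }
});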

For large applications, interface states are more complex and touch responses need finer-grained handling. The system provides the interface function onTouchEvent() specifically for processing user touch events. In actual development, you only need to declare this function and then override it in the main View class to implement the specific touch effect. The following is the declaration of the interface function:
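
This is the standard signature inherited from the View class; returning true reports that the event was consumed:

@Override
public boolean onTouchEvent(MotionEvent event) {
    // Handle the touch here; return true once the event is consumed.
    return super.onTouchEvent(event);
}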

The touch-screen handler takes a single MotionEvent parameter. An instance of this class records the player's touch-screen actions, including press actions, move actions, multi-touch pointers, and screen pressure. The class also defines many static constants for the action types; the player's action is obtained with the event.getAction() method and matched against the required constant.
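
A minimal sketch of such a handler, assuming hypothetical region bounds, a hypothetical toggleSound() state jump, and screenWidth/screenHeight fields captured at startup in the Activity (for example from getWindowManager().getDefaultDisplay()):

@Override
public boolean onTouchEvent(MotionEvent event) {
    switch (event.getAction() & MotionEvent.ACTION_MASK) {
        case MotionEvent.ACTION_DOWN: {            // finger pressed
            float x = event.getX();
            float y = event.getY();
            float pressure = event.getPressure();  // screen pressure, if reported
            // Hypothetical "sound" button region, expressed relative to the
            // screen size instead of absolute pixels: middle third of the
            // width, bottom quarter of the height.
            if (x > screenWidth / 3f && x < 2 * screenWidth / 3f
                    && y > 3 * screenHeight / 4f) {
                toggleSound();                     // hypothetical state jump
            }
            break;
        }
        case MotionEvent.ACTION_MOVE:              // finger dragged
            break;
        case MotionEvent.ACTION_POINTER_DOWN:      // extra finger (multi-touch)
            break;
        case MotionEvent.ACTION_UP:                // finger lifted
            break;
    }
    return true;
}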

The above illustrates touch-response handling for an interface such as the game's sound menu. event.getAction() is a system method of the MotionEvent class that returns the type of the current touch action. When a finger presses the screen, MotionEvent.ACTION_DOWN is delivered; when the contact point falls within a given range, the corresponding operation is performed to produce the game's state jump. When dividing the touch ranges, it is best to express them relative to the screen dimensions rather than as absolute coordinates: this makes landscape/portrait handling and application porting more convenient and avoids repeatedly editing hard-coded reference values. The device's actual screen width and height can be obtained at the program entry, in the subclass that inherits Activity, so the touch ranges can be derived from those width and height values. The flowchart of the entire touch-handling part is as follows:

Figure 2 Detailed flow of the touch-screen operation response

Figure 2 shows the concrete response flow when the screen is touched during the game, which mainly involves the Activity class and the View class. The touch method onTouchEvent() is declared in the Activity class and defined in detail in the View class. When a touch occurs, the event-response mechanism fires: the handler reads the action with getAction(), obtains the current touch-point coordinates with event.getX() and event.getY(), and tests them against the touch ranges defined in the method. If the point lies within a region, the touch response executes. After the touch response has executed, the release must be handled under MotionEvent.ACTION_UP so that the current touch response is released in time.

3 Optimization of touch response on the Android platform

It is worth noting that in the MotionEvent.ACTION_DOWN response above, a temporary touch counter, keyCount, is defined. The touch response executes only once the counter reaches a threshold. This effectively prevents continuous touch events from a touch that has not yet been released from jumping straight into the next state. Such a touch counter is very necessary wherever interfaces jump between one another, especially in game menus.
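
A minimal sketch of such a counter; the threshold value, the insideMenuItem() hit test, and the jumpToNextState() transition are all hypothetical:

// Temporary touch counter that buffers state jumps between menus.
private int keyCount = 0;
private static final int KEY_COUNT_THRESHOLD = 3;   // hypothetical threshold

@Override
public boolean onTouchEvent(MotionEvent event) {
    int action = event.getAction();
    if ((action == MotionEvent.ACTION_DOWN || action == MotionEvent.ACTION_MOVE)
            && insideMenuItem(event.getX(), event.getY())) {
        keyCount++;                    // a held touch keeps counting up
        if (keyCount >= KEY_COUNT_THRESHOLD) {
            keyCount = 0;              // reset before leaving this menu
            jumpToNextState();         // hypothetical state transition
        }
    } else if (action == MotionEvent.ACTION_UP) {
        keyCount = 0;                  // releasing the touch clears the counter
    }
    return true;
}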

Besides the responses of the menu interfaces, the other important touch response is the control of the protagonist in the game. The touch-response principle of the main game interface is the same as that of the menu interfaces; the difference lies in how the touch-judgment ranges are selected. The menu interface's touch ranges are fixed, while the game interface's touch regions are demarcated dynamically. Since the protagonist is the core character in a role-playing game, the touch response that controls the protagonist is particularly important. Touch is generally designed around the protagonist, with the protagonist's coordinates as the base point for judgment.

Figure 3 Region division for protagonist control

Figure 3 shows the area division when the main character moves. The intersection of the horizontal and vertical coordinate axes is the protagonist's coordinate center, and the space around the protagonist is divided into 10 regions. When the touch point is in regions 9-10 the direction is right; in the symmetrical regions 5-6 it is left; in regions 7-8 it is down. Regions 1-4 are divided further: regions 2-3 respond only to the up operation, region 1 responds to the right and up keys simultaneously, and region 4 responds to the left and up keys simultaneously. This refinement of the regions matches actual play, and the angular extents of regions 1 and 4 can be fine-tuned as needed.
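
A sketch of this division, placing the origin at the protagonist's coordinates and selecting the direction from the touch point's angle. The playerX/playerY fields, the keyDown() counterpart to the keyUp() call shown later, the extra SonicPlayer key constants, and the 30/60-degree boundaries for regions 1 and 4 are all assumptions:

// Inside the ACTION_DOWN (or ACTION_MOVE) branch of onTouchEvent():
float dx = event.getX() - playerX;
float dy = playerY - event.getY();                  // positive dy = above the protagonist
double angle = Math.toDegrees(Math.atan2(dy, dx));  // 0 degrees = right, 90 = up

if (angle > -30 && angle <= 30) {
    player.keyDown(SonicPlayer.KEY_RIGHT);          // regions 9-10: right
} else if (angle > 150 || angle <= -150) {
    player.keyDown(SonicPlayer.KEY_LEFT);           // regions 5-6: left
} else if (angle > -150 && angle <= -30) {
    player.keyDown(SonicPlayer.KEY_DOWN);           // regions 7-8: down
} else if (angle > 60 && angle <= 120) {
    player.keyDown(SonicPlayer.KEY_UP);             // regions 2-3: up only
} else if (angle > 30 && angle <= 60) {
    player.keyDown(SonicPlayer.KEY_UP);             // region 1: up + right
    player.keyDown(SonicPlayer.KEY_RIGHT);
} else {                                            // 120 < angle <= 150
    player.keyDown(SonicPlayer.KEY_UP);             // region 4: up + left
    player.keyDown(SonicPlayer.KEY_LEFT);
}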

After the area around the protagonist has been divided with the protagonist at the center, the right-hand region responds to the right key, the left-hand region to the left key, and the upper and lower regions to the up and down keys respectively, with regions 1 and 4 responding to up together with right or left at the same time; once a key response is recognized, the corresponding touch-key operation is performed. Dividing control by region in this way gives a good user experience. Note that every touch-key press that controls the protagonist must be matched by a release of that touch key, otherwise the corresponding logic keeps executing after the system receives the user's touch response. When event.getAction() reports MotionEvent.ACTION_UP, the touch key is released. The release operation is as follows:

if ((player.keyStatus & SonicPlayer.KEY_LEFT) == SonicPlayer.KEY_LEFT) {
    player.keyUp(SonicPlayer.KEY_LEFT);   // release this specific touch response
}
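
By the same pattern, a sketch that releases every direction on ACTION_UP so that no simulated key stays held; the KEY_RIGHT, KEY_UP, and KEY_DOWN constants are assumed to exist alongside the KEY_LEFT shown above:

if (event.getAction() == MotionEvent.ACTION_UP) {
    int[] keys = { SonicPlayer.KEY_LEFT, SonicPlayer.KEY_RIGHT,
                   SonicPlayer.KEY_UP, SonicPlayer.KEY_DOWN };
    for (int key : keys) {
        if ((player.keyStatus & key) == key) {
            player.keyUp(key);   // release each key that is still held
        }
    }
}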

In summary, the improvements to the touch-screen interface module are: refining the regions around the protagonist's center point, performing the corresponding operation according to the region that is hit, and adding a counting variable to the touch response.

4 Conclusion

Touch response matters greatly on the Android platform. Although the underlying touch-event processing mechanism is relatively complex, implementing a touch response is simple: only the relevant methods need to be implemented. An application's touch-screen response divides mainly into the touch response of the menu interfaces and that of the game interface. The two share the same implementation mechanism but differ in how the touch ranges are defined.

In each menu interface, the touch range is determined mainly by each picture's pixel position relative to the screen, and the touch-screen response follows from that; in the game interface, the focus is the protagonist, with the touch ranges and corresponding operations determined by the region division described above. Within the touch method, the temporary touch counter is very necessary: it acts as a buffer that prevents an unreleased touch key from triggering the responses of multiple interfaces in succession. This paper implements and optimizes the touch response of the Android platform, which greatly enhances the user experience and has strong practical value.
