Touch screen, electronic equipment and display control method

Document No.: 7055 · Published: 2021-09-17 · Views: 26 · Original language: Chinese

1. The display control method is applied to electronic equipment, wherein the electronic equipment comprises a touch screen;

the method comprises the following steps:

when it is determined, according to first acceleration data, that a gesture of a user moving the electronic device is a preset gesture, acquiring a capacitance measurement collected by the touch screen during a second time period, wherein the first acceleration data is acceleration data of the electronic device collected at a first sampling frequency during a first time period;

when the acquired capacitance measurement satisfies a second preset condition, judging again, according to second acceleration data, whether the gesture of the user moving the electronic device is the preset gesture, wherein the second acceleration data is acceleration data of the electronic device collected at a second sampling frequency during a third time period, and the duration of the third time period is greater than the duration of the first time period and/or the first sampling frequency is less than the second sampling frequency;

and when it is determined again that the gesture of the user moving the electronic device is the preset gesture, controlling the display state of the touch screen to switch.

2. The method of claim 1,

the second preset condition includes: the capacitance measurement exhibits a monotonically increasing trend during the second time period, and the capacitance measurement at the end of the second time period is not greater than a first preset threshold;

controlling the display state of the touch screen to switch includes:

controlling the display state of the touch screen to switch from screen-on to screen-off.

3. The method of claim 1,

the second preset condition includes: the capacitance measurement exhibits a monotonically decreasing trend during the second time period, and the capacitance measurement at the end of the second time period is not less than a second preset threshold;

controlling the display state of the touch screen to switch includes:

controlling the display state of the touch screen to switch from screen-off to screen-on.

4. The method according to any one of claims 1-3, wherein the preset gesture is lifting, lowering, or flipping the electronic device.

5. The method of any one of claims 1-4, wherein determining from the first acceleration data that the gesture of the user moving the electronic device is a preset gesture comprises:

determining that the gesture of the user moving the electronic device is the preset gesture if the following are determined according to the first acceleration data: the direction of the acceleration of the electronic device crosses any one of the three axes of a spatial coordinate system; the values of at least part of the first acceleration data increase or decrease monotonically; and the number of peaks and/or troughs in the first acceleration data is smaller than a preset number.

6. The method of claim 3, wherein determining from the first acceleration data that the gesture of the user moving the electronic device is a preset gesture comprises:

when the touch screen is in a low power consumption mode and in a screen-off state, judging, according to the first acceleration data, whether the gesture of the user moving the electronic device is the preset gesture; wherein, when the touch screen is in the low power consumption mode, the touch host processing (THP) algorithm is disabled;

the method further comprises the following steps:

when the gesture of the user moving the electronic device is the preset gesture, controlling the touch screen to exit the low power consumption mode, enabling the THP algorithm, and keeping the touch screen in the screen-off state; wherein the THP algorithm is used to judge the trend of the capacitance measurement during the second time period.

7. The method of claim 6, wherein the method further comprises:

if the capacitance measurement does not exhibit a monotonically increasing trend during the second time period, keeping the touch screen in the screen-off state;

if the capacitance measurement exhibits a monotonically increasing trend during the second time period, further judging whether the capacitance measurement at the end of the second time period is smaller than the second preset threshold. If it is smaller than the second preset threshold, controlling the display state of the touch screen to switch from screen-off to screen-on; if it is not less than the second preset threshold, executing the step of judging again, according to the second acceleration data, whether the gesture of the user moving the electronic device is the preset gesture.

8. An electronic device comprising a touch screen, a processor, and a memory for storing one or more computer programs;

the one or more computer programs stored by the memory, when executed by the processor, cause the electronic device to perform the method of any of claims 1-7.

9. A computer-readable storage medium, comprising a computer program which, when run on an electronic device, causes the electronic device to perform the method of any one of claims 1-7.

10. A program product, characterized in that it comprises instructions which, when run on an electronic device, cause the electronic device to carry out the method according to any one of claims 1-7.

Background

At present, when a user answers a call on an electronic device such as a mobile phone or a smart watch, the device can turn off the screen if it detects that a shielding object (for example, a finger or a face) is close to the screen. This prevents erroneous operations caused by the face contacting the screen during the call, and also saves battery power. When the device detects that the shielding object has moved away from the screen, it can light the screen again so that it can display information or receive the user's input operations.

As users demand ever higher screen-to-body ratios, the bezel above the screen of the electronic device has become narrower (or has disappeared entirely) and can no longer accommodate a proximity light sensor, so the conventional proximity detection scheme of installing a proximity light sensor in the bezel above the screen is no longer applicable. To ensure that narrow-bezel or bezel-less electronic devices still have a proximity detection function, the prior art proposes using the capacitive sensing technology of a capacitive touch screen to implement proximity detection.

In practical applications, however, the capacitance data sensed by the touch screen is easily disturbed by the surrounding environment (e.g., deformation of the metal sheet in the touch screen, ambient temperature, ambient humidity, human sweat, static electricity, and the like), so the proximity detection accuracy is poor; consequently, the display control accuracy of the electronic device is poor and the user experience suffers.

Disclosure of Invention

The embodiments of the present application provide a touch screen, an electronic device, and a display control method, to address the technical problems of poor proximity detection accuracy and poor display control accuracy of electronic devices in the prior art.

In a first aspect, a touch screen is provided, which includes a middle frame screen compartment, a supporting member, a display module, and a touch module. The middle frame screen compartment forms an accommodating space; the display module is arranged in the accommodating space formed by the middle frame screen compartment, and the touch module is arranged on the side of the display module away from the middle frame screen compartment. A metal sheet is arranged on the side of the display module close to the middle frame screen compartment, and the edges of the metal sheet are fixed to the middle frame screen compartment and the display module respectively. The supporting member is arranged between the middle frame screen compartment and the metal sheet and supports the metal sheet.

In this embodiment of the application, when the posture of the electronic device changes, the supporting member supports the metal sheet and weakens or even prevents the deformation of the metal sheet under gravity, so that the capacitance value detected by the touch module more truly reflects the distance between the shielding object and the touch screen. This improves the proximity detection accuracy and allows the display state of the touch screen to be controlled more accurately.

In one possible design, a first projection of the supporting member onto the touch module side and a second projection of the metal sheet onto the touch module side completely coincide.

That is, the supporting member may be arranged over the entire area covered by the metal sheet, so that it supports the full face of the metal sheet. This further improves the proximity detection accuracy and allows the display state of the touch screen to be controlled more accurately.

In a possible design, a first projection of the supporting member on one side of the touch module is located within a second projection of the metal sheet on one side of the touch module, and a distance between an edge of the second projection and a center of the first projection is smaller than a preset value.

That is, the supporting member may be arranged only in a sub-region of the area covered by the metal sheet (for example, the region that deforms most under gravity). This still supports the metal sheet, improving the proximity detection accuracy and allowing the display state of the touch screen to be controlled more accurately, while reducing the volume and weight of the supporting member, which is more conducive to a thin and light touch screen.

In one possible design, the support includes a plurality of sub-members, and the sub-members are arranged at intervals along a first direction, wherein the first direction is any direction perpendicular to the stacking direction of the display module and the touch module.

In this way, the full face of the metal sheet can be supported more uniformly, improving the proximity detection accuracy and allowing the display state of the touch screen to be controlled more accurately, while reducing the volume and weight of the supporting member, which is more conducive to a thin and light touch screen.

In one possible design, the supporting member is a foam with good electrical insulation properties.

In this way, the supporting member provides good support, avoids interfering with other components, and keeps the touch screen thin and light.

In a second aspect, an electronic device is provided, which includes the touch screen of the first aspect or any one of the possible designs of the first aspect.

In this way, when the posture of the electronic device changes, the supporting member in the screen supports the metal sheet and weakens or even prevents the deformation of the metal sheet under gravity, so that the capacitance value detected by the touch module more truly reflects the distance between the shielding object and the touch screen. This improves the proximity detection accuracy of the electronic device and allows its display state to be controlled more accurately.

In a third aspect, a display control method is provided, which is applied to an electronic device including a touch screen. The method includes: when it is determined that a gesture of a user moving the electronic device is a preset gesture, acquiring a capacitance measurement sensed by contacts in a preset area of the touch screen, wherein the preset area is located in the peripheral area of the touch screen; and when the acquired capacitance measurement satisfies a first preset condition, controlling the display state of the touch screen to switch.

In this embodiment of the application, the processor detects the approach/departure of the shielding object relative to the touch screen based on the capacitance measurement generated in a selected preset area of the touch screen, and controls the display state switching of the touch screen accordingly. Because the preset area is located in the peripheral area of the touch screen, the capacitance measurement generated there is less disturbed by metal deformation, so the calculated approach/departure result is more accurate than one calculated from the full-screen capacitance value. This improves the proximity detection accuracy of the electronic device and allows its display state to be controlled more accurately.
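As a rough illustration (not the claimed implementation), selecting the peripheral contacts could look like the following sketch, which assumes the capacitance frame is a 2-D grid of node values and takes the preset area to be the first few rows nearest a screen edge; the function name and the row count are assumptions for illustration only:

```python
def peripheral_capacitance(frame, edge_rows=3):
    """Average the capacitance over the edge rows of the grid, which lie
    over the glued (hence less deformable) border of the metal sheet.

    frame: list of rows, each row a list of per-node capacitance values.
    """
    cells = [v for row in frame[:edge_rows] for v in row]
    return sum(cells) / len(cells)
```

Averaging only the border rows discards the centre of the grid, where the unglued middle of the metal sheet contributes the most deformation-induced interference.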

In one possible design, the first preset condition may include: the capacitance measurement exhibits a monotonically increasing trend during a first duration, and the capacitance measurement at the end of the first duration is greater than a first preset threshold. Correspondingly, controlling the display state of the touch screen to switch may include: controlling the display state of the touch screen to switch from screen-on to screen-off.

In this way, the accuracy with which the electronic device detects the approach of a shielding object can be improved, so the switch from screen-on to screen-off can be controlled more accurately.

In one possible design, the first preset condition may include: the capacitance measurement exhibits a monotonically decreasing trend during a first duration, and the capacitance measurement at the end of the first duration is smaller than a second preset threshold. Correspondingly, controlling the display state of the touch screen to switch may include: controlling the display state of the touch screen to switch from screen-off to screen-on.

In this way, the accuracy with which the electronic device detects a shielding object moving away can be improved, so the switch from screen-off to screen-on can be controlled more accurately.
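The two first preset conditions described above can be sketched together as a single check. This is an illustrative outline only; the function name and threshold values are assumptions, not values from the application:

```python
def display_action(measurements, off_threshold, on_threshold):
    """Decide the screen switch from capacitance samples over the first
    duration. Returns 'screen_off', 'screen_on', or None (no switch)."""
    diffs = [b - a for a, b in zip(measurements, measurements[1:])]
    increasing = all(d > 0 for d in diffs)
    decreasing = all(d < 0 for d in diffs)
    last = measurements[-1]
    if increasing and last > off_threshold:   # shielding object approaching
        return 'screen_off'
    if decreasing and last < on_threshold:    # shielding object moving away
        return 'screen_on'
    return None
```

Requiring both a monotonic trend and a threshold at the end of the window means a single noisy spike, which breaks monotonicity, cannot trigger a switch on its own.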

In a fourth aspect, a display control method is provided, which is applied to an electronic device including a touch screen. The method includes: when it is determined, according to first acceleration data, that a gesture of a user moving the electronic device is a preset gesture, acquiring a capacitance measurement collected by the touch screen during a second time period, wherein the first acceleration data is acceleration data of the electronic device collected at a first sampling frequency during a first time period; when the acquired capacitance measurement satisfies a second preset condition, judging again, according to second acceleration data, whether the gesture of the user moving the electronic device is the preset gesture, wherein the second acceleration data is acceleration data of the electronic device collected at a second sampling frequency during a third time period, and the duration of the third time period is greater than the duration of the first time period and/or the first sampling frequency is less than the second sampling frequency; and when it is determined again that the gesture of the user moving the electronic device is the preset gesture, controlling the display state of the touch screen to switch.

In this embodiment of the application, after the electronic device determines that the acquired capacitance measurement satisfies the second preset condition, an additional gesture judgment step is performed (namely, judging again, according to the second acceleration data, whether the gesture of the user moving the electronic device is the preset gesture). Thus, even if the capacitance value sensed by the touch screen is disturbed by the surrounding environment, this gesture judgment step further ensures the accuracy of the display control of the electronic device and improves the user experience.
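The overall two-stage flow of the fourth aspect might be outlined as follows. The sensor-access helpers (`read_accel`, `read_capacitance`, `switch_display`) and the sampling rates and durations are hypothetical placeholders standing in for the first/third time periods and the first/second sampling frequencies; the gesture and condition checks are passed in rather than implemented here:

```python
def two_stage_display_control(read_accel, read_capacitance, switch_display,
                              gesture_check, condition_check):
    """Sketch of the fourth aspect: gesture check, capacitance check,
    then a second, stricter gesture check before switching the display."""
    # Stage 1: coarse gesture check on low-rate data (first time period)
    first_accel = read_accel(rate_hz=10, duration_s=0.5)
    if not gesture_check(first_accel):
        return False
    # Capacitance trend check over the second time period
    cap = read_capacitance(duration_s=0.3)
    if not condition_check(cap):
        return False
    # Stage 2: re-judge on longer / higher-rate data (third time period)
    second_accel = read_accel(rate_hz=100, duration_s=1.0)
    if not gesture_check(second_accel):
        return False
    switch_display()
    return True
```

Note how the cheap, low-rate stage gates the expensive, high-rate one: the accelerometer is sampled at the higher frequency only after both the first gesture check and the capacitance condition have passed, which is what keeps the scheme power-efficient.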

In one possible design, the second preset condition includes: the capacitance measurement exhibits a monotonically increasing trend during the second time period, and the capacitance measurement at the end of the second time period is not greater than a first preset threshold. Correspondingly, controlling the display state of the touch screen to switch includes: controlling the display state of the touch screen to switch from screen-on to screen-off.

In this way, the accuracy with which the electronic device detects the approach of a shielding object can be improved, so the switch from screen-on to screen-off can be controlled more accurately.

In one possible design, the second preset condition includes: the capacitance measurement exhibits a monotonically decreasing trend during the second time period, and the capacitance measurement at the end of the second time period is not less than a second preset threshold. Correspondingly, controlling the display state of the touch screen to switch includes: controlling the display state of the touch screen to switch from screen-off to screen-on.

In this way, the accuracy with which the electronic device detects a shielding object moving away can be improved, so the switch from screen-off to screen-on can be controlled more accurately.

In a fifth aspect, embodiments of the present application further provide an electronic device, which includes a touch screen, a processor, and a memory, where the memory is used to store one or more computer programs. When the one or more computer programs stored in the memory are executed by the processor, the electronic device implements the solution of the third aspect or any one of its possible designs; or, when the one or more computer programs stored in the memory are executed by the processor, the electronic device implements the solution of the fourth aspect or any one of its possible designs.

In a sixth aspect, embodiments of the present application further provide an electronic device, where the electronic device includes a module/unit that performs the method of any one of the above third aspect and possible designs of the third aspect; alternatively, the electronic device comprises means/unit for performing the method of any of the above-mentioned fourth aspect or possible designs of the fourth aspect. These modules/units may be implemented by hardware, or may be implemented by hardware executing corresponding software.

In a seventh aspect, an embodiment of the present application further provides a chip, where the chip is coupled to a memory in an electronic device and is configured to call a computer program stored in the memory and execute the technical solution of the third aspect or any one of its possible designs; or, the chip is used to call a computer program stored in the memory and execute the technical solution of the fourth aspect or any one of its possible designs. Herein, the term "coupled" in the embodiments of the present application means that two components are directly or indirectly combined with each other.

In an eighth aspect, embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes a computer program. When the computer program runs on an electronic device, the electronic device executes the technical solution of the third aspect or any one of its possible designs; or, when the computer program runs on an electronic device, the electronic device executes the technical solution of the fourth aspect or any one of its possible designs.

In a ninth aspect, a program product in embodiments of the present application includes instructions. When the program product runs on an electronic device, the electronic device executes the technical solution of the third aspect or any one of its possible designs; or, when the program product runs on an electronic device, the electronic device executes the technical solution of the fourth aspect or any one of its possible designs.

Drawings

FIG. 1A and FIG. 1B are schematic structural diagrams of a touch screen of an electronic device in the prior art;

FIG. 2A, FIG. 2B, and FIG. 2C are schematic diagrams illustrating an electronic device with a screen facing upwards, a screen perpendicular to a horizontal direction, and a screen facing downwards, respectively;

FIG. 3 is a schematic structural diagram of a possible electronic device in an embodiment of the present application;

FIG. 4 is a schematic structural diagram of a possible touch screen in an embodiment of the present application;

FIGS. 5A, 5B, and 5C are schematic views of three possible support members in an embodiment of the present application;

FIG. 6 is a software architecture of a possible electronic device according to an embodiment of the present application;

FIG. 7 is a flowchart illustrating a display control method according to an embodiment of the present application;

FIG. 8 is a flow chart of a gesture algorithm in an embodiment of the present application;

FIG. 9 is a schematic diagram of a predetermined area in an embodiment of the present application;

FIG. 10 is a flow chart of another display control method in the embodiment of the present application;

FIG. 11 is a schematic diagram of the time consumed to light the touch screen when a shielding object moves away from the touch screen;

FIG. 12 is a flow chart of another display control method in the embodiment of the present application;

FIG. 13 is a flow chart of another display control method in the embodiment of the present application;

FIG. 14 is a flow chart of another display control method in the embodiment of the present application;

FIG. 15 is a schematic structural diagram of another possible electronic device in the embodiment of the present application.

Detailed Description

As described in the background, in the conventional technical solution, proximity detection of an electronic device is generally achieved by installing a proximity light sensor in the bezel above the screen of the electronic device. A proximity light sensor generally consists of two parts: an infrared light-emitting diode (LED) for emitting near-infrared light, and a photosensitive sensor sensitive to near-infrared light. When a shielding object approaches the electronic device, the near-infrared light received by the photosensitive sensor increases, and when it exceeds an upper limit value, the screen is turned off; when the shielding object moves away from the electronic device, the near-infrared light received by the photosensitive sensor decreases, and when it falls below a lower limit value, the screen is lit. However, as the screen-to-body ratio of the electronic device increases, the bezel above the screen becomes narrower (or even disappears) and cannot accommodate the proximity light sensor. To ensure that such electronic devices also have a proximity detection function, some prior art proposes using the capacitive sensing technology of a capacitive touch screen to implement proximity detection.

Fig. 1A and 1B are schematic structural diagrams of a touch screen of an electronic device in the prior art. Referring to fig. 1A and 1B, the touch screen includes a middle frame screen compartment 11, a display module 12 disposed in an accommodating space formed by the middle frame screen compartment 11, and a touch module 13 disposed above the display module. A metal sheet 14 is provided on the side of the display module 12 near the bottom of the middle frame screen compartment 11, and the edges of the metal sheet 14 are bonded to the middle frame screen compartment 11 and the display module 12 respectively with adhesive; the adhesive is shown as the black filled portions in fig. 1A and 1B. The metal sheet 14 fixes the display module 12 well and ensures the stability of the display module 12.

When the shielding object is close to or away from the touch screen within a certain range, the capacitance value of the capacitive sensor at the position corresponding to the touch point on the touch module 13 changes (for example, the capacitance value increases when the shielding object is close to the touch screen, the capacitance value decreases when the shielding object is away from the touch screen, and the capacitance value is maximum when the shielding object contacts the touch module 13), so that the approach detection can be realized in principle by detecting the change rule of the capacitance value on the touch module 13.

However, in practical applications, the capacitance value sensed by the touch screen is easily interfered by the surrounding environment, such as deformation of the metal sheet in the touch screen, ambient temperature, ambient humidity, sweat on the shielding object, static electricity, and the like.

Taking the deformation of the metal sheet as an example: when the electronic device approaches or moves away from the shielding object, the posture of the electronic device usually changes, for example when a user picks up, puts down, or flips the electronic device while receiving a call. The middle area of the metal sheet 14 in the touch screen is not fixed by adhesive, so it can deform considerably under gravity, and the degree of deformation differs when the electronic device is in different postures. Fig. 2A, 2B, and 2C are schematic diagrams of the electronic device with the screen facing upward, the screen perpendicular to the horizontal direction, and the screen facing downward, respectively. When the screen faces upward, as shown in fig. 2A, the distance between the metal sheet 14 and the touch module 13 increases; when the screen faces downward, as shown in fig. 2C, the distance between the metal sheet 14 and the touch module 13 decreases; when the screen is perpendicular to the horizontal direction, as shown in fig. 2B, the distance between the metal sheet 14 and the touch module 13 changes relatively little. These different degrees of deformation of the metal sheet 14 form sensing capacitances of different sizes with other metal layers in the electronic device, and thus disturb the capacitance of the capacitive sensors on the touch module 13 to different degrees.

Generally, the capacitance value generated by the touch module 13 when the shielding object is near the touch screen but not in contact with it is much smaller than the capacitance value generated when the shielding object contacts the touch screen. In a conventional scenario where the touch module 13 is used only for touch detection (e.g., detecting an input operation performed by a user), the capacitance value generated by contact of a shielding object is relatively large (here, the capacitance value when the distance between the shielding object and the touch module 13 is 0 mm is taken as 100%), so the threshold for detecting whether the shielding object contacts the touch screen is set relatively high. The capacitance interference caused by environmental factors (e.g., deformation of the metal sheet 14) is generally about 20% of the contact capacitance value and can therefore be ignored. However, in a scenario where the touch module 13 is used for proximity detection, when the shielding object is near the touch screen but not in contact with it (the distance between the shielding object and the touch module 13 is approximately in the range of 20 mm to 5 mm), the capacitance generated by the touch module 13 is small, generally 1% to 4% of the contact capacitance, so the capacitance interference produced by these environmental factors greatly affects the proximity detection accuracy of the touch module 13.
Therefore, in the prior art, the accuracy of detecting the proximity of the shielding object based on the touch screen is poor, so the display control accuracy of the electronic device is poor, problems such as mistakenly lighting or mistakenly turning off the screen occur frequently, the power consumption of the electronic device is high, and the user experience is poor.
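The percentages above can be put side by side in a back-of-the-envelope comparison. The 100%, ~20%, and 1%-4% figures come from the discussion above; the contact-detection threshold is an assumed placeholder:

```python
# Normalized capacitance magnitudes (percent of the value at 0 mm contact).
contact_signal = 100.0    # shielding object touching the screen
interference = 20.0       # worst-case environmental interference
proximity_signal = 4.0    # best case at a 20 mm - 5 mm standoff

touch_threshold = 50.0    # assumed contact-detection threshold (placeholder)

# For touch detection the interference sits well below the threshold and can
# be ignored; for proximity detection it exceeds the signal itself.
interference_negligible_for_touch = interference < touch_threshold
interference_swamps_proximity = interference > proximity_signal
```

In other words, the same interference that is a minor nuisance for touch detection is five times larger than the best-case proximity signal, which is why raw full-screen capacitance is unreliable for proximity detection.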

In view of this, the embodiments of the present application provide the following three solutions to improve the accuracy of the display control of the electronic device.

First, in the touch screen structure of the electronic device shown in fig. 1A and 1B, a supporting member can be arranged between the metal sheet 14 and the middle frame screen compartment 11. The supporting member supports the easily deformed portion of the metal sheet and can effectively resist or even prevent deformation of the metal sheet, improving the proximity detection accuracy of the electronic device and thereby allowing its display state to be controlled more accurately.

Second, based on the touch screen structure of the electronic device shown in fig. 1A and 1B, when proximity detection is performed based on the capacitance value generated by the touch module 13, the distance between the shielding object and the touch screen is calculated from the capacitance values sensed by the region of the touch module 13 corresponding to the edge area of the metal sheet 14, i.e., the area that is not easily deformed (for example, the capacitance values sensed at the first three rows of contact positions on the touch module 13). Because the edge area of the metal sheet 14 does not deform easily, the corresponding capacitance values suffer less interference, and the calculated approach/departure result is more accurate than one calculated from the full-screen capacitance value. This improves the proximity detection accuracy of the electronic device and allows its display state to be controlled more accurately.

Third, based on the touch screen structure of the electronic device shown in fig. 1A and 1B, after the distance relationship between the shielding object and the electronic device calculated from the capacitance values sensed by the touch module 13 satisfies the condition for turning off/lighting the screen, a gesture judgment step is added: based on the posture change of the electronic device, it is determined whether the approach or departure of the shielding object relative to the electronic device was triggered accidentally, and the screen is turned off/lit only when the trigger is determined not to be accidental, that is, when the user genuinely intends to bring the electronic device close to or away from the body. In this way, even if the capacitance value sensed by the touch screen is disturbed by the surrounding environment, the gesture judgment step ensures the accuracy of the display control of the electronic device and improves the user experience.
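The gesture judgment could, for instance, apply the criteria listed in claim 5 (a largely monotonic acceleration trace with few peaks or troughs). The sketch below is one simplified reading of those criteria, with hypothetical function names; it checks monotonicity over the whole window rather than "at least part" of it, and the axis-crossing test is assumed to be computed elsewhere from the 3-axis data:

```python
def count_extrema(samples):
    """Count local peaks and troughs in a sequence of acceleration values."""
    n = 0
    for i in range(1, len(samples) - 1):
        if samples[i] > samples[i - 1] and samples[i] > samples[i + 1]:
            n += 1  # peak
        elif samples[i] < samples[i - 1] and samples[i] < samples[i + 1]:
            n += 1  # trough
    return n

def is_preset_gesture(samples, crosses_axis, max_extrema=2):
    """samples: acceleration values over the sampling window;
    crosses_axis: True if the acceleration direction crosses one of the
    three coordinate axes of the spatial coordinate system."""
    if not crosses_axis or len(samples) < 2:
        return False
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    monotonic = all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)
    return monotonic and count_extrema(samples) < max_extrema
```

A deliberate lift or put-down produces a smooth, one-directional acceleration trace, while hand tremor or walking produces an oscillating one with many extrema; the peak/trough bound is what separates the two.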

It should be noted that, in the embodiment of the present application, the three schemes may be implemented separately or in combination with each other, and the embodiment of the present application is not particularly limited.

The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the embodiments of the present application, "at least one" includes one or more, where "a plurality" means two or more. In addition, it is to be understood that the terms "first", "second", etc. in the description of the present application are used to distinguish between descriptions and are not necessarily intended to indicate a sequential or chronological order.

The technical solution provided in the embodiment of the present application may be applied to electronic devices such as a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device, and a virtual reality device, and the embodiment of the present application does not limit the electronic devices at all.

Taking the electronic device as an example of a mobile phone, fig. 3 shows an exemplary structural diagram of the mobile phone. As shown in fig. 3, the mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.

It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.

The controller may be a neural center and a command center of the cell phone 100, among others. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.

A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.

The processor 110 may execute the display control method provided by the embodiment of the present application. The processor 110 can determine the trend of change in the distance between the obstruction and the touch screen (i.e., approaching or moving away) based on the trend of change in the capacitance of the touch screen. For example, when the distance between the obstruction and the touch screen decreases (i.e., the obstruction approaches), the capacitance values of the touch points near the obstruction increase; when the distance increases (i.e., the obstruction moves away), those capacitance values decrease. When the processor 110 determines that the distance between the obstruction and the touch screen satisfies a preset condition, it controls the touch screen to perform a preset operation, such as turning the screen on or off.
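The capacitance-trend interpretation above can be sketched as a simple classifier over successive samples. The monotonicity criterion and the sample values are illustrative assumptions; a real implementation would also filter noise.

```python
def distance_trend(cap_samples):
    """Classify the obstruction's movement from successive capacitance samples.

    Monotonically increasing capacitance -> obstruction approaching;
    monotonically decreasing capacitance -> obstruction moving away;
    otherwise the trend is treated as undetermined.
    """
    pairs = list(zip(cap_samples, cap_samples[1:]))
    if all(a < b for a, b in pairs):
        return "approaching"
    if all(a > b for a, b in pairs):
        return "moving_away"
    return "undetermined"

print(distance_trend([10, 14, 19, 25]))  # approaching
print(distance_trend([25, 19, 14, 10]))  # moving_away
print(distance_trend([10, 9, 12]))       # undetermined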

In some embodiments, after determining that the distance between the obstruction and the touch screen satisfies the preset condition, the processor 110 may further determine whether the gesture performed by the user on the electronic device is a preset gesture, and control the touch screen to execute the preset operation only when both conditions hold. For example, the processor 110 may determine the gesture performed by the user on the electronic device, such as picking up, putting down, or turning over, according to the acceleration data output by the acceleration sensor 180E. The processor 110 may then turn off the touch screen to reduce power consumption only after detecting that the obstruction is approaching the touch screen and determining that the gesture is picking up, and light the touch screen only after detecting that the obstruction is moving away from the touch screen and determining that the gesture is putting down, so as to facilitate user operation. Therefore, the accuracy of the display control of the electronic device can be improved, and the user experience is improved.
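The two-condition control described above can be sketched as follows: the display state is switched only when both the proximity result and the gesture result agree. The gesture classifier below is a stand-in assumption; real gesture recognition from acceleration data is considerably more involved, and the threshold is not a value from this application.

```python
# Sketch of the combined proximity + gesture decision. Both the gesture
# heuristic and the threshold are illustrative assumptions.

def classify_gesture(accel_z_samples, threshold=2.0):
    """Toy classifier: a sustained rise/fall of the z-axis acceleration is
    treated as 'pick_up'/'put_down'. Purely illustrative."""
    delta = accel_z_samples[-1] - accel_z_samples[0]
    if delta > threshold:
        return "pick_up"
    if delta < -threshold:
        return "put_down"
    return "other"

def decide_display_action(proximity, gesture):
    """Switch the display only when both conditions hold."""
    if proximity == "approaching" and gesture == "pick_up":
        return "screen_off"   # e.g. phone raised to the ear during a call
    if proximity == "moving_away" and gesture == "put_down":
        return "screen_on"    # e.g. phone lowered after the call
    return "no_change"

print(decide_display_action("approaching", classify_gesture([0.1, 1.2, 2.8])))
# screen_off
```

Requiring both signals to agree is what guards against accidental triggers: a capacitance change caused by environmental interference alone, with no matching posture change, results in "no_change".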

In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.

The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface, thereby implementing the touch function and the proximity detection function of the mobile phone 100.

The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.

The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.

The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.

MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 110 and the camera 193 communicate through a CSI interface to implement the camera function of the handset 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the mobile phone 100.

The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.

The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the mobile phone 100, to transmit data between the mobile phone 100 and peripheral devices, or to connect earphones for audio playback. The interface may also be used to connect other electronic devices, such as AR devices.

It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.

The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.

The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.

The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.

The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.

The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.

The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.

The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.

In some embodiments, the antenna 1 of the mobile phone 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the mobile phone 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).

The mobile phone 100 implements the display function through the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.

The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel, also referred to as a "display module". The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 100 may include 1 or N display screens 194, N being a positive integer greater than 1.

The touch sensor 180K is also referred to as a "touch module". The touch sensor 180K may be used to detect a touch operation applied thereto or nearby. The touch sensor 180K may communicate the detected touch operation to the processor 110 to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. The touch sensor 180K may further detect a distance value between an obstruction (e.g., a human body part such as a finger or a human face) and the touch screen, and may further transmit the detected distance value to the processor 110, so that the processor 110 controls a display state of the display screen, such as turning on or off, according to the distance between the obstruction and the touch screen.

In the embodiment of the present application, the touch sensor 180K and the display screen 194 may form a touch screen. In some embodiments, the touch sensor 180K may be disposed in the display screen 194; in other embodiments, the touch sensor 180K may be disposed on the surface of the mobile phone 100 at a position different from that of the display screen 194.

Fig. 4 is a schematic structural diagram of a possible touch screen in an embodiment of the present application. The touch screen comprises a middle frame screen compartment 31, a display module 32 arranged in the accommodating space formed by the middle frame screen compartment 31, and a touch module 33 arranged above the display module 32. A metal sheet 34 is provided on the side of the display module 32 near the bottom of the middle frame screen compartment 31; the edges of the metal sheet 34 are fixed (illustratively, by bonding) to the middle frame screen compartment 31 and the display module 32, respectively, and the metal sheet 34 ensures the stability of the display module 32. The touch screen further comprises a support 35 disposed on the side of the middle frame screen compartment 31 facing the display module 32 and located below the metal sheet 34. When the posture of the electronic device changes, the support 35 supports the metal sheet 34, weakening or even avoiding deformation of the metal sheet 34 under gravity, so that the capacitance values detected by the touch module 33 more truly reflect the distance between the obstruction and the touch screen, thereby improving the proximity detection accuracy of the electronic device and allowing its display state to be controlled more accurately.

For example, the material of the supporting member 35 may be implemented in various ways, such as foam with good electrical insulation performance, or other electrical insulation materials, such as polyethylene terephthalate or polyvinyl fluoride. The support 35 may be disposed over the entire area of the metal sheet 34, or may be disposed only in a partial area of the area covered by the metal sheet 34 (e.g., an area deformed by gravity to a greater extent).

For example, referring to fig. 5A, the supporting member 35 is disposed on the whole surface of the metal sheet 34.

For example, referring to fig. 5B, a support 35 is provided only in the central region (where deformation is likely to occur due to gravity) of the metal sheet 34 within the coverage area of the metal sheet.

For example, referring to fig. 5C, the support 35 may include a plurality of sub-members, each sub-member being spaced apart from and within the footprint of the metal sheet 34.

Of course, in the specific embodiment, the shape and size of the supporting member 35 may have other implementations, and the embodiments of the present application are not limited specifically here.

The mobile phone 100 can detect, at each contact point of the touch screen, the capacitance value sensed when an obstruction approaches or moves away from the touch screen. When the mobile phone 100 detects that the capacitance value falls within a preset range, it can determine that an obstruction is close to the touch screen; when the capacitance value shows a decreasing trend, it can determine that the obstruction is moving away from the touch screen; and when the capacitance value shows an increasing trend, it can determine that the obstruction is approaching the touch screen. In this way, the mobile phone 100 can use the touch screen to detect that the user is holding the mobile phone 100 close to the ear to talk, so as to automatically turn off the screen to save power, or detect that the user is holding the mobile phone 100 away from the ear, so as to automatically light the screen to facilitate user operation. The proximity detection function of the touch screen can also be used in the automatic unlocking and screen locking processes of the mobile phone in holster mode and pocket mode.
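The claimed screen-switching conditions (claims 2 and 3) combine such a monotonic trend over the second time period with a threshold on the sample at the end of that period. A minimal sketch, in which the concrete threshold and sample values are placeholders rather than values from this application:

```python
# Sketch of the claimed second preset condition: a monotonic capacitance
# trend over the second time period plus a threshold on the final sample.
# Threshold and sample values below are illustrative placeholders.

def meets_screen_off_condition(caps, first_threshold):
    """Monotonically increasing, and the final sample not greater than the
    first preset threshold -> candidate for switching from on to off."""
    increasing = all(a < b for a, b in zip(caps, caps[1:]))
    return increasing and caps[-1] <= first_threshold

def meets_screen_on_condition(caps, second_threshold):
    """Monotonically decreasing, and the final sample not less than the
    second preset threshold -> candidate for switching from off to on."""
    decreasing = all(a > b for a, b in zip(caps, caps[1:]))
    return decreasing and caps[-1] >= second_threshold

print(meets_screen_off_condition([5, 8, 12, 15], first_threshold=20))  # True
print(meets_screen_on_condition([15, 12, 8, 5], second_threshold=3))   # True
```

When either condition holds, the method proceeds to the gesture re-determination step before actually switching the display state.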

The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.

The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.

The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 193, N being a positive integer greater than 1.

The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.

Video codecs are used to compress or decompress digital video. Handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.

The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.

The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.

The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.

The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.

The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.

The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The cellular phone 100 can listen to music through the speaker 170A or listen to a hands-free call.

The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the ear of the person.

The microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The mobile phone 100 may be provided with at least one microphone 170C. In other embodiments, the mobile phone 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the mobile phone 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.

The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.

The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The handset 100 determines the intensity of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the mobile phone 100 detects the intensity of the touch operation according to the pressure sensor 180A. The cellular phone 100 can also calculate the touched position based on the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
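The pressure-dependent behavior described above for the short message icon can be sketched as a simple mapping from touch intensity to an instruction. The threshold value and the normalized pressure scale are illustrative assumptions:

```python
# Sketch: same touch position, different touch intensities -> different
# instructions. The threshold and pressure scale are illustrative assumptions.

FIRST_PRESSURE_THRESHOLD = 0.5  # hypothetical normalized pressure value

def sms_icon_action(pressure):
    """Return the instruction executed when the SMS application icon is
    pressed with the given touch intensity."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_message"   # light press: view the short message
    return "new_message"        # press >= threshold: create a new message

print(sms_icon_action(0.2))  # view_message
print(sms_icon_action(0.8))  # new_message
```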

The gyro sensor 180B may be used to determine the motion posture of the mobile phone 100. In some embodiments, the angular velocities of the mobile phone 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during photographing. Illustratively, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the mobile phone 100, calculates the distance that the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the mobile phone 100 through reverse movement, thereby achieving image stabilization. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.

The air pressure sensor 180C is used to measure air pressure. In some embodiments, the handset 100 calculates altitude from the barometric pressure measured by the air pressure sensor 180C, to assist positioning and navigation.

The magnetic sensor 180D includes a Hall sensor. The handset 100 can detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the handset 100 is a flip phone, the handset 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking on flip-open can then be set according to the detected open or closed state of the holster or flip cover.

The acceleration sensor 180E can detect the magnitude of the acceleration of the mobile phone 100 in various directions (typically three axes), and can detect the magnitude and direction of gravity when the handset 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied in horizontal/vertical screen switching, pedometers, and other applications.

The distance sensor 180F is used to measure distance. The handset 100 may measure distance by infrared or laser. In some embodiments, in a shooting scene, the mobile phone 100 may use the distance sensor 180F to measure distance for fast focusing.

The ambient light sensor 180L is used to sense the ambient light level. The handset 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with a touch screen to detect whether the mobile phone 100 is in a pocket to prevent accidental touches.

The fingerprint sensor 180H is used to collect fingerprints. The mobile phone 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, answering incoming calls with a fingerprint, and the like.

The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 throttles the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the mobile phone 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the mobile phone 100 due to low temperature. In still other embodiments, when the temperature is lower than a further threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.

The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the bone mass vibrated by the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset and integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vibration signal of the bone mass acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart rate detection function.

The keys 190 include a power key, volume keys, and the like. The keys 190 may be mechanical keys or touch keys. The mobile phone 100 may receive key input and generate key signal input related to user settings and function control of the mobile phone 100.

The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration cues as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.

Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.

The SIM card interface 195 is used to connect a SIM card. The SIM card can be attached to or detached from the mobile phone 100 by inserting it into, or pulling it out of, the SIM card interface 195. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, and with external memory cards. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the handset 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the mobile phone 100 and cannot be separated from it.

It should be understood that the illustrated structure of the embodiment of the present application does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.

The software system of the electronic device may employ a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture. The embodiments of this application take an Android system with a layered architecture as an example to illustrate the software structure of the electronic device.

Referring to fig. 6, the software architecture of the electronic device may include a hardware layer, a driver layer, a Hardware Abstraction Layer (HAL), a framework layer, and an application layer.

The application layer may send an application-registered proximity light event to the framework layer to request that the framework layer enable the proximity light function. The proximity light function refers to the function, described in the background art, of determining through a proximity light sensor that an obstruction is approaching or moving away from the touch screen. Although embodiments of this application may use the touch screen, rather than a proximity light sensor, to sense the approach or departure of an obstruction, an application in the application layer can still activate the proximity detection function by registering a proximity light event. Of course, the name of the proximity light event may also be changed, for example to "proximity detection event", as long as it represents the function of detecting an obstruction approaching or leaving the touch screen; the embodiment of this application is not limited here.

The acceleration data output by the acceleration sensor can be used by a gesture algorithm in the framework layer to judge user gestures such as picking up, putting down, or turning over.

An integrated circuit (IC) of the touch screen may send the raw data from the touch screen sensing an obstruction (the capacitance generated at each contact position) to a Touch Host Processing (THP) daemon.

In this embodiment of the application, gesture determination may include two judgment stages: a small gesture and a large gesture. The large gesture and the small gesture differ in the sampling frequency and corresponding sampling duration used by the gesture algorithm, and the total number of sampling points corresponding to the small gesture is smaller than that corresponding to the large gesture.

Small-gesture judgment stage: a gesture algorithm in the framework layer determines whether the user gesture is a preset gesture (e.g., picking up, putting down, etc.) based on acceleration data sampled by the acceleration sensor over a first time period. If so, the framework layer triggers the THP daemon in the hardware abstraction layer to run the THP algorithm. The THP daemon inputs the capacitance data reported by the touch screen into the THP algorithm to obtain a calculation result (the obstruction approaching or moving away), generates a corresponding approaching or departing event according to the result, and reports it.

Large-gesture judgment stage: while the THP daemon runs the THP algorithm, the gesture algorithm in the framework layer may continue to receive acceleration data reported by the acceleration sensor, and then determine again whether the user gesture is a preset gesture (e.g., picking up, putting down, etc.) based on the acceleration data sampled by the acceleration sensor over a third time period, where the third time period is longer than the first time period and the end time of the third time period is later than the end time of the first time period.

The THP algorithm of the THP daemon derives a proximity event (e.g., approaching, departing) from the user's gesture, the result of scene recognition (e.g., call, anti-false-touch), and the capacitance data from the touch screen, and reports the proximity event to the sensor manager. The capacitance data input into the THP algorithm may be the capacitance data sensed by the contact points in all areas of the touch screen, or only the capacitance data sensed by the contact points in an edge area of the touch screen (for example, the contact points in the first 3 rows, the last 3 rows, the left three columns, or the right three columns of the touch screen).

The sensor manager may send switch control instructions to the THP daemon. For example, the sensor manager may issue an opening instruction according to a user gesture (small gesture) identified by the gesture algorithm; after receiving the opening instruction issued by the sensor manager, the THP driver drives the THP daemon to run the THP algorithm. The sensor manager can also issue a closing instruction according to a proximity event sent by the THP daemon, or according to a user gesture (large gesture) identified by the gesture algorithm, so that the THP algorithm of the THP daemon goes dormant, thereby reducing power consumption.
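The open/close control described above can be sketched as a small state machine. This is an illustrative Python sketch, not the actual implementation; the class and method names (`SensorManager`, `ThpDaemonStub`, `on_switch`, etc.) are assumptions made for illustration.

```python
class ThpDaemonStub:
    """Stands in for the real THP daemon; tracks whether the THP algorithm runs."""
    def __init__(self):
        self.running = False

    def on_switch(self, instruction):
        if instruction == "open":
            self.running = True    # start running the THP algorithm
        elif instruction == "close":
            self.running = False   # put the THP algorithm to sleep

class SensorManager:
    def __init__(self, daemon):
        self.daemon = daemon

    def on_small_gesture(self):
        # Small gesture recognized -> issue an opening instruction.
        self.daemon.on_switch("open")

    def on_proximity_event(self, event):
        # A near/far event was reported -> THP work is done, close it.
        self.daemon.on_switch("close")

    def on_large_gesture(self):
        # Large-gesture confirmation also closes the THP algorithm.
        self.daemon.on_switch("close")
```

Either the reported proximity event or the large-gesture result suffices to put the THP algorithm back to sleep, matching the power-saving behavior described above.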

The THP algorithm and the gesture algorithm may be implemented in the application processor of a system on chip (SoC); further, to reduce power consumption, the gesture algorithm may be implemented in a sensor hub chip of the application processor.

The embodiments of this application can be applied to scenes such as calls and third-party-application (e.g., WeChat) calls. For example, when the processor detects, based on a small amount of acceleration data, that the gesture performed by the user on the electronic device is picking up, it starts to detect whether an obstruction is approaching the touch screen. If an obstruction is detected approaching the touch screen, the touch screen is turned off to reduce the power consumption of the electronic device; alternatively, the touch screen is turned off only after the obstruction is detected approaching the touch screen and the gesture is judged again, based on more acceleration data, to be picking up. Similarly, when the processor detects, based on a small amount of acceleration data, that the gesture performed by the user on the electronic device is putting down, it starts to detect whether the obstruction is moving away from the touch screen. If the obstruction moves away, the touch screen is lit to facilitate user operation; alternatively, the touch screen is lit only after the obstruction is detected moving away and the gesture is judged again based on more acceleration data.

The embodiments of this application can also be applied to a false-touch-prevention scene. For example, when the electronic device is placed in a pocket, the power key may be pressed by mistake due to user movement and the like, lighting the touch screen. The device then detects whether the distance between the obstruction and the touch screen is less than or equal to a preset value. If so, prompt information is displayed to remind the user that an obstruction is covering the touch screen; alternatively, after the distance is detected to be less than or equal to the preset value, if the posture change of the electronic device recognized from the acceleration data exhibits a periodic pattern, the touch screen can be considered falsely touched and is turned off to save power.

It should be understood that software programs and/or modules corresponding to the software architecture in fig. 6 may be stored in the internal memory 121 of the handset 100 shown in fig. 3.

The technical solutions provided in the embodiments of the present application are described in detail below by two specific embodiments. The following embodiments are all exemplified by being applied to the mobile phone 100.

Example one

Referring to fig. 7, a flowchart of a display control method provided in an embodiment of the present application is shown, where the method includes:

S701. When the gesture executed by the user on the mobile phone is determined to be a preset gesture, a capacitance measurement value in a preset area of the touch screen is obtained.

Specifically, an application in the application layer registers a proximity light event (for example, registering one when a call process starts) to request that the framework layer enable the proximity light function, and the framework layer then starts running the gesture algorithm. An acceleration sensor in the hardware layer reports the collected acceleration data to the framework layer, and the gesture algorithm in the framework layer determines, based on that acceleration data, whether the gesture executed by the user on the mobile phone is a preset gesture. The preset gesture may be picking up, putting down, or turning over.

Referring to fig. 8, the gesture algorithm may include three decision logics: (1) Relative motion. Once the acceleration direction crosses any one of the three axes of the spatial coordinate system (i.e., the x, y, and z axes), the posture of the mobile phone is considered to have changed, and a suspected picking-up, putting-down, or turning-over gesture has occurred. This logic does not distinguish picking up from putting down; it only distinguishes a stationary state from a moving state. (2) Monotonicity. When the acceleration data over a certain number of sampling points increases or decreases monotonically, it can be determined that a suspected picking-up, putting-down, or turning-over gesture has occurred. Through this logic, the gesture algorithm can filter out the influence of sudden noise spikes on the accuracy of gesture judgment. (3) Aperiodicity. The number of peaks and troughs in the acceleration data is counted; if it is less than a certain value, the mobile phone is suspected to have been picked up or put out of hand; otherwise, the mobile phone is determined to be in periodic motion. For example, if the user runs at a constant speed, the mobile phone shakes periodically and the number of peaks and troughs is large. This logic reduces the false-trigger rate. The processor determines that the preset gesture has occurred only when all three judgment logics are satisfied.
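The three decision logics above can be sketched on a window of single-axis acceleration samples. This is a hedged illustration, not the patented algorithm: the function names, the peak-count limit, and the zero-crossing test are assumptions made for clarity.

```python
def axis_sign_changed(samples):
    """Relative motion: the acceleration crosses zero on this axis."""
    return any(a * b < 0 for a, b in zip(samples, samples[1:]))

def is_monotonic(samples):
    """Monotonicity: samples rise or fall monotonically (filters spike noise)."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    return all(d >= 0 for d in diffs) or all(d <= 0 for d in diffs)

def is_aperiodic(samples, max_extrema=2):
    """Aperiodicity: few peaks/troughs means a one-off motion, not e.g. running."""
    extrema = sum(
        1 for prev, cur, nxt in zip(samples, samples[1:], samples[2:])
        if (cur > prev and cur > nxt) or (cur < prev and cur < nxt)
    )
    return extrema <= max_extrema

def suspected_preset_gesture(samples):
    # A preset gesture is reported only when all three judgment logics hold.
    return (axis_sign_changed(samples)
            and is_monotonic(samples)
            and is_aperiodic(samples))
```

A monotonic rise through zero (e.g., the phone being lifted) passes all three checks, while an oscillating trace (e.g., shaking while running) fails monotonicity and aperiodicity.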

Further, when the gesture executed by the user on the mobile phone is one of picking up, putting down, or turning over, the sensor manager in the framework layer can send a start instruction to the THP daemon. The THP daemon inputs the capacitance measurement value reported by the touch screen into the THP algorithm to obtain a calculation result (the obstruction approaching or moving away), and generates and reports a corresponding approaching or departing event according to the result.

The capacitance measurement value input into the THP algorithm is the capacitance measurement value in a preset area of the touch screen. The preset area may refer to an area where the capacitance measurement value of the touch screen is less susceptible to interference caused by deformation of the metal sheet. For example, the contacts in the touch screen form a contact array, and the preset area may be the first 3 rows, the last 3 rows, the left 3 columns, or the right 3 columns of contacts in the contact array; the embodiment of this application is not specifically limited. In one possible example, see fig. 9, the preset area includes the contact areas of rows 1-3, plus the three left and three right columns of contacts in rows 4-10.
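The example preset area from fig. 9 can be expressed as a set of contact indices. This is a minimal sketch under the assumption of a 10x10 contact array (the grid size is illustrative, not from the source):

```python
def preset_area_contacts(n_rows=10, n_cols=10):
    """Return 1-based (row, col) indices of the contacts in the preset area:
    rows 1-3 in full, plus the left 3 and right 3 columns of the remaining rows."""
    contacts = set()
    for r in range(1, n_rows + 1):
        for c in range(1, n_cols + 1):
            in_first_rows = r <= 3                    # rows 1-3: full width
            in_edge_cols = c <= 3 or c > n_cols - 3   # left 3 / right 3 columns
            if in_first_rows or in_edge_cols:
                contacts.add((r, c))
    return contacts
```

Only the capacitance values at these edge contacts would then be fed to the THP algorithm, skipping the center of the screen where the metal sheet deforms more easily.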

The capacitance measurement may be a capacitance measurement over a set time period. Illustratively, the set duration may refer to a preset number of frames, such as obtaining capacitance measurements for 5 consecutive frames on a preset area of the touch screen.

S702. When the capacitance measurement value is determined to meet a first preset condition, the display state of the touch screen is controlled to switch, where the switching includes switching the display state from bright screen to off screen, or from off screen to bright screen.

Specifically, after receiving an approaching or departing event reported by the THP daemon, the framework layer in the processor starts a screen-off or screen-on process and notifies the driver layer to complete it. If the event is an approaching event, a screen-off process is started; if it is a departing event, a screen-on process is started.

In embodiments of the application, the THP daemon may determine the trend of the distance between the obstruction and the touch screen (i.e., approaching or departing) according to the trend of the capacitance measurement value. For example, as the distance between the obstruction and the touch screen becomes smaller (i.e., approaching), the capacitance measured at the contact points used to determine proximity becomes larger; as the distance becomes larger (i.e., departing), the capacitance measured at those contact points becomes smaller.

The first preset conditions corresponding to two scenes that the shielding object is close to the touch screen and the shielding object is far away from the touch screen can be different. Specifically, in a scene where the blocking object is close to the touch screen, the first preset condition may include: the capacitance measured value on the preset area of the touch screen meets the monotone increasing change trend in the first time length, and the capacitance measured value at the ending moment of the first time length is greater than a first preset threshold; in a scene where the obstruction is far away from the touch screen, the first preset condition may include: and the capacitance measured value on the preset area of the touch screen meets the monotonically decreasing change trend within the first time length, and the capacitance measured value at the end time of the first time length is smaller than a second preset threshold. And the second preset threshold is smaller than the first preset threshold.
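The two branches of the first preset condition can be sketched as a single classifier over the per-frame capacitance measurements. This is a hedged sketch: the threshold values are invented for illustration, and the source does not specify whether the monotonicity test is strict.

```python
def classify_proximity(capacitances, thresh_near=50.0, thresh_far=20.0):
    """capacitances: per-frame measurements over the first duration.
    Returns 'near' (monotone rise ending above the first threshold),
    'far' (monotone fall ending below the smaller second threshold),
    or None when neither preset condition holds."""
    diffs = [b - a for a, b in zip(capacitances, capacitances[1:])]
    if all(d > 0 for d in diffs) and capacitances[-1] > thresh_near:
        return "near"   # obstruction approaching -> start the screen-off flow
    if all(d < 0 for d in diffs) and capacitances[-1] < thresh_far:
        return "far"    # obstruction departing -> start the screen-on flow
    return None
```

The requirement `thresh_far < thresh_near` mirrors the text's constraint that the second preset threshold is smaller than the first, giving a hysteresis band so that small fluctuations between the two thresholds trigger neither event.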

Optionally, after the framework layer starts the screen-off or screen-on process, the sensor manager in the framework layer may also issue a shutdown instruction, so that the THP algorithm of the THP daemon is dormant, and the purpose of reducing power consumption is achieved.

It should be noted that the touch screen of the mobile phone in this embodiment may be the touch screen shown in fig. 3 or the touch screen shown in fig. 4, which is not limited in this embodiment of the application.

As an optional implementation, since the THP algorithm is implemented in the framework layer, to reduce power consumption, after the touch screen is turned off because an obstruction approached it, the touch screen may not be powered down but instead enter a low power consumption mode in which the THP algorithm is turned off. When a gesture is triggered (for example, the user puts down or turns over the mobile phone), the mobile phone exits the low power consumption mode and the THP algorithm is awakened (i.e., the sensor manager in the framework layer sends a start instruction to the THP daemon).

In this embodiment, the processor uses the capacitance measurements from an area of the touch screen whose corresponding metal sheet is not easily deformed (i.e., the preset area) to detect the obstruction approaching or leaving the touch screen. Because the capacitance values in that area suffer less interference, the computed approach/departure result is more accurate than one computed from the capacitance values of the entire screen. This improves the accuracy of the electronic device's proximity detection and enables more accurate control of its display state.

Example two

Referring to fig. 10, a flowchart of another display control method provided in the embodiment of the present application is shown, where the method includes:

S1001. Whether a gesture executed by the user on the mobile phone is a preset gesture is determined according to first acceleration data, where the first acceleration data is acceleration data collected by an acceleration sensor in a first time period, and the sampling frequency of that acceleration data is a first frequency.

Specifically, an application in the application layer registers a proximity light event (for example, WeChat registers a proximity light event when it starts a call) to request that the framework layer enable the proximity light function, and the gesture algorithm is started. After the framework layer receives the acceleration data collected in the first time period and reported by an acceleration sensor in the hardware layer, the gesture algorithm determines whether the gesture executed by the user on the mobile phone is the preset gesture, i.e., the small-gesture judgment described above. For the logic by which the gesture algorithm judges the small gesture, refer to the judgment logic of the gesture algorithm in the first embodiment, which is not repeated here.

S1002. When the gesture executed by the user on the mobile phone is determined to be the preset gesture, a capacitance measurement value of the touch screen in a second time period is obtained, and whether the capacitance measurement value meets a second preset condition is judged.

Specifically, after the processor determines through the framework layer that the gesture executed by the user on the mobile phone is the preset gesture, the sensor manager in the framework layer can send a start instruction to the THP daemon. The THP daemon inputs the capacitance measurement value acquired by the touch screen in the second time period into the THP algorithm to obtain a calculation result (the obstruction approaching or moving away), and generates and reports a corresponding approaching or departing event according to the result. For the specific implementation of the THP algorithm, refer to the first embodiment, which is not repeated here. For the specific implementation of the preset gesture, likewise refer to the first embodiment.

Wherein the start time of the second time period may be the end time of the first time period, or the start time of the second time period may be after the end time of the first time period.

When the touch screen measures capacitance in the second time period, the capacitance measurement value may be detected at all contact points of the touch screen, or only in a partial region of the touch screen. For example, the capacitance measurement value may be the one detected in the preset area shown in fig. 9.

It should be noted that the second preset condition may be the same as or different from the first preset condition in the first embodiment, and the embodiment of the present application is not limited. For the same situation, refer to a specific implementation manner of the first preset condition in the foregoing embodiment, and details are not described here. The following examples are given for different cases: in a scene where the obstruction is close to the touch screen, the second preset condition may include: the capacitance measured value meets the monotonically increasing change trend in the second time length, and the capacitance measured value at the ending time of the second time length is not greater than a first preset threshold; in a scene that the obstruction is far away from the touch screen, the second preset condition may include that the capacitance measurement value satisfies a monotonically decreasing trend within the second duration, and the capacitance measurement value at the end time of the second duration is not less than a second preset threshold. And the second preset threshold is smaller than the first preset threshold.

S1003. When the capacitance measurement value meets the second preset condition, whether the gesture executed by the user on the mobile phone is the preset gesture is determined again according to second acceleration data, where the second acceleration data is acceleration data collected by the acceleration sensor in a third time period, the sampling frequency of that acceleration data is a second frequency, and the duration of the third time period is greater than the duration of the first time period and/or the first frequency is less than the second frequency.

Specifically, after receiving the approach or departure event reported by the THP process, the framework layer continues to operate the gesture algorithm, and the gesture algorithm determines again whether the gesture of the user is the preset gesture according to the second acceleration data reported by the acceleration sensor, that is, the large gesture determination stage described above. The judgment logic of the large gesture may refer to the judgment logic of the gesture algorithm in the first embodiment, and is not described herein again. The starting time of the third time period may be the starting time of the first time period, and of course, the starting time of the third time period may also be after the starting time of the first time period, which is not limited herein.

In this embodiment of the application, the purpose of requiring that the duration of the third time period be greater than the duration of the first time period and/or that the first frequency be less than the second frequency is to ensure that the data volume (total number of sampling points) of the second acceleration data is greater than the data volume (total number of sampling points) of the first acceleration data, i.e., to ensure that the accuracy of the large-gesture judgment exceeds that of the small-gesture judgment, and thereby ensure the accuracy of the proximity detection. That is, if the duration of the third time period is not greater than the duration of the first time period, the first frequency must be less than the second frequency; if the duration of the third time period is greater than the duration of the first time period, the first frequency may be less than or equal to the second frequency, provided that the amount of the second acceleration data acquired in the third time period is greater than the amount of the first acceleration data acquired in the first time period. Alternatively, the first time period may be a sub-period of the third time period, in which case the second acceleration data may include the first acceleration data.

In the embodiment of the application, the first acceleration data corresponding to the small gesture and the second acceleration data corresponding to the large gesture have different requirements based on the scene. Specifically, the first acceleration data corresponding to the small gesture must avoid false reports and, on that premise, keep missed reports as few as possible; for example, the sampling frequency of the acceleration sensor in the first time period is 100 Hz and 10 points are sampled, so that the sampling time of the first acceleration data plus the small-gesture judgment time totals 230 ms. The second acceleration data corresponding to the large gesture must avoid missed reports (i.e., it requires a larger data volume) and, on that premise, keep false reports as few as possible; for example, the sampling frequency of the acceleration sensor in the third time period is also 100 Hz but 20 points are sampled, so that the sampling time of the second acceleration data plus the large-gesture judgment time totals 400 ms.
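The constraint described above (the large-gesture stage must collect more sampling points than the small-gesture stage) can be checked directly from frequency and duration. A minimal sketch; the 100 Hz / 10-point / 20-point figures come from the example in the text, while the helper names and the 100 ms / 200 ms sampling windows implied by them are illustrative assumptions.

```python
def total_samples(freq_hz, duration_s):
    """Number of sampling points collected at freq_hz over duration_s seconds."""
    return int(freq_hz * duration_s)

def stages_valid(freq1, dur1, freq2, dur3):
    """True when the large-gesture stage (freq2, dur3) yields more sampling
    points than the small-gesture stage (freq1, dur1), as the text requires."""
    return total_samples(freq2, dur3) > total_samples(freq1, dur1)

# Example from the text: both stages sample at 100 Hz.
small = total_samples(100, 0.10)   # 10 points in a 100 ms first period
large = total_samples(100, 0.20)   # 20 points in a 200 ms third period
```

With equal frequencies the constraint forces a longer third period; conversely, with equal durations it forces a higher second frequency, matching the "and/or" formulation in S1003.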

S1004. When the gesture executed by the user on the mobile phone is determined to be the preset gesture again, the display state of the touch screen is controlled to switch, where the switching includes switching the display state from bright screen to off screen, or from off screen to bright screen.

Specifically, after the framework layer determines again, by running the gesture algorithm, that the gesture executed by the user on the mobile phone is the preset gesture, the framework layer starts a screen-off or screen-on process and notifies the driver layer to complete it.

Optionally, after the framework layer starts a screen-off or screen-on process, the sensor manager in the framework layer may also issue a shutdown instruction, so that the THP algorithm of the THP daemon is dormant, and the purpose of reducing power consumption is achieved.

It should be noted that the touch screen of the mobile phone in this embodiment may be the touch screen shown in fig. 3 or the touch screen shown in fig. 4, which is not limited in this embodiment of the application.

For example, fig. 11 is a schematic diagram of the time consumed to light the touch screen in a call or third-party-application call scene when an obstruction moves away from the touch screen (i.e., the distance between the obstruction and the touch screen increases). The processor runs a gesture algorithm in the framework layer. The gesture algorithm preliminarily determines, according to the first acceleration data reported by the acceleration sensor, that the gesture of the user is a put-down gesture (the small-gesture judgment stage, which takes about 230 ms from acquiring the acceleration data to outputting the gesture judgment result). The result is sent to the hardware abstraction layer through the driving layer, and the THP daemon in the hardware abstraction layer starts the THP algorithm. In the THP algorithm, the THP daemon obtains the capacitance measurement values of 5 consecutive frames reported by the touch screen, determines the distance between the obstruction and the touch screen according to the capacitance measurement values, determines that the obstruction is moving away from the touch screen as the distance increases, and reports a far-away event to the framework layer through the driving layer when the distance between the obstruction and the touch screen exceeds a set value. The framework layer continues to run the gesture algorithm, which determines again, according to the second acceleration data reported by the acceleration sensor, that the gesture of the user is a put-down gesture (the large-gesture judgment stage, which takes about 400 ms from acquiring the acceleration data to outputting the gesture judgment result). The framework layer then starts a screen-lighting process and notifies the driving layer to complete it: the framework layer sends a backlight instruction to the driving layer to light the touch screen.
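A minimal sketch of the far-away judgment made by the THP daemon might look like the following; the capacitance-to-distance conversion and the helper names are assumptions for illustration, not the actual THP implementation.

```python
def far_away_event(cap_frames, set_value, cap_to_distance):
    """Illustrative sketch: convert consecutive capacitance frames to
    distances, and report a far-away event only when the distance
    increases monotonically and the last distance exceeds the set value.
    `cap_to_distance` stands in for the real capacitance-to-distance model."""
    distances = [cap_to_distance(c) for c in cap_frames]
    increasing = all(a < b for a, b in zip(distances, distances[1:]))
    return increasing and distances[-1] > set_value
```

With a toy model in which capacitance falls as the obstruction recedes (e.g. `distance = 100 - capacitance`), five frames of steadily falling capacitance would trigger the event once the last distance clears the set value.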

Similarly, because the THP algorithm is implemented in the hardware abstraction layer, in order to reduce power consumption, after the touch screen is turned off because an obstruction approached it, the touch screen may not be powered off but instead enter a low power consumption mode in which the THP algorithm is turned off. When a gesture is triggered (for example, the user puts down or turns over the mobile phone), the mobile phone exits the low power consumption mode, the THP algorithm is awakened (that is, the sensor manager in the framework layer sends a start instruction to the THP daemon), and the method in the embodiment of the present application is executed.

For example, referring to fig. 12, after the touch screen is turned off because an obstruction approached it, the touch screen may enter a low power consumption mode without powering down; in this mode the touch screen remains in the screen-off state and the THP algorithm is turned off. If a small-gesture trigger is detected (for example, a gesture of the user putting down or turning over the mobile phone is recognized based on acceleration data collected over a short time), the low power consumption mode is exited and the THP algorithm is awakened; if no gesture is recognized, the mobile phone remains in the low power consumption mode and the screen stays off. After the THP algorithm is awakened, the capacitance measurement values of the touch screen in the second time period are acquired, and it is judged whether the capacitance measurement values in the preset time period show a monotonically decreasing trend; if not, the touch screen stays off; if so, it is further judged whether the capacitance measurement value at the end of the preset time period is smaller than the second preset threshold. If it is smaller than the second preset threshold, the touch screen is lit; otherwise, it is further judged whether a large-gesture trigger condition is met (for example, a gesture of the user putting down or turning over the mobile phone is recognized based on acceleration data collected over a longer time). If the large-gesture trigger condition is met, the touch screen is lit; otherwise, the screen stays off.
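The fig. 12 decision flow after the THP algorithm wakes up can be sketched as below; the return strings and the `large_gesture_ok` callback are illustrative stand-ins, not names from the actual implementation.

```python
def decide_after_wakeup(cap_frames, second_threshold, large_gesture_ok):
    """Sketch of the post-wakeup flow: check the monotonically decreasing
    trend, then the end-of-window threshold, and fall back to the
    large-gesture check only when the threshold test is inconclusive."""
    decreasing = all(a > b for a, b in zip(cap_frames, cap_frames[1:]))
    if not decreasing:
        return "stay_off"                 # trend not met: keep the screen off
    if cap_frames[-1] < second_threshold:
        return "light_screen"             # clearly far away: light directly
    # End value still at or above the threshold: large-gesture fallback.
    return "light_screen" if large_gesture_ok() else "stay_off"
```

Note the ordering: the cheap capacitance checks run first, and the more expensive (about 400 ms) large-gesture judgment is invoked only in the ambiguous case.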

In this embodiment, after the distance relationship between the obstruction and the mobile phone, calculated based on the capacitance measurement values detected by the touch screen, meets the second preset condition, a large-gesture judgment step is added: whether the relative movement of the obstruction and the electronic device was an accidental trigger is determined based on the posture change of the electronic device, and the screen is turned off/lit only when the trigger is determined not to be accidental, that is, when the user intends to bring the electronic device close to or away from the body. Therefore, even if the capacitance values sensed by the touch screen are interfered with by the surrounding environment, the accuracy of the display control of the electronic device can be ensured by the gesture judgment step, improving the user experience.

With reference to the foregoing embodiments and the related drawings, embodiments of the present application further provide a display control method, which can be implemented in an electronic device (e.g., a mobile phone, a tablet computer, etc.) with a touch screen. For example, the structure of the electronic device may be as shown in fig. 3, fig. 4, fig. 5A, fig. 5B, fig. 5C, or the like. As shown in fig. 13, the method may include the steps of:

S1301, when the gesture of the user moving the electronic device is determined to be a preset gesture, a capacitance measurement value sensed by contacts in a preset area of the touch screen is acquired, where the preset area is located in the peripheral edge area of the touch screen;

S1302, when the acquired capacitance measurement value meets a first preset condition, the display state of the touch screen is controlled to switch.
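As a sketch of the first part of S1301, collecting the capacitance values of the sensing nodes in a peripheral edge region of width `border` (an illustrative parameter; the method itself does not specify the region's width) could look like:

```python
def peripheral_capacitances(cap_grid, border):
    """Illustrative sketch: collect capacitance values of the nodes that
    lie inside the preset peripheral edge region of the touch-screen
    node grid, i.e. within `border` rows/columns of any screen edge."""
    rows, cols = len(cap_grid), len(cap_grid[0])
    return [cap_grid[r][c]
            for r in range(rows) for c in range(cols)
            if r < border or r >= rows - border
            or c < border or c >= cols - border]
```

The values returned for the edge region would then be tested against the first preset condition in S1302.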

With reference to the foregoing embodiments and the related drawings, embodiments of the present application further provide a display control method, which can be implemented in an electronic device (e.g., a mobile phone, a tablet computer, etc.) with a touch screen. For example, the structure of the electronic device may be as shown in fig. 3, fig. 4, fig. 5A, fig. 5B, fig. 5C, or the like. As shown in fig. 14, the method may include the steps of:

S1401, when the gesture of the user moving the electronic device is determined to be a preset gesture according to first acceleration data, a capacitance measurement value acquired by the touch screen in a second time period is obtained, where the first acceleration data is acceleration data of the electronic device acquired at a first sampling frequency in a first time period;

S1402, when the acquired capacitance measurement value meets a second preset condition, whether the gesture of the user moving the electronic device is the preset gesture is judged again according to second acceleration data, where the second acceleration data is acceleration data of the electronic device acquired at a second sampling frequency in a third time period, and the duration of the third time period is greater than that of the first time period and/or the first sampling frequency is less than the second sampling frequency;

S1403, when the gesture of the user moving the electronic device is determined to be the preset gesture again, the display state of the touch screen is controlled to switch.
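The S1401-S1403 flow can be summarized as the following orchestration sketch; every callable parameter is an illustrative stand-in for the corresponding check described above, not an API from the actual implementation.

```python
def display_control(first_accel, second_accel_fn, cap_window,
                    is_preset_gesture, meets_second_condition, switch_display):
    """Illustrative sketch of S1401-S1403: a quick gesture check on the
    first acceleration data, a capacitance check over the second time
    period, then a re-check on the longer/denser second acceleration
    data before the display state is actually switched."""
    if not is_preset_gesture(first_accel):        # S1401: small-gesture check
        return False
    if not meets_second_condition(cap_window):    # S1402: capacitance condition
        return False
    if not is_preset_gesture(second_accel_fn()):  # S1402: large-gesture re-check
        return False
    switch_display()                              # S1403: toggle screen state
    return True
```

Fetching the second acceleration data through a callable reflects that it is only collected when the capacitance condition is met, avoiding the longer sampling window in the common case.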

The above embodiments of the present application can be combined arbitrarily to achieve different technical effects.

The following describes an apparatus provided in the embodiments of the present application with reference to the drawings to implement the above-mentioned embodiments of the method of the present application.

As shown in fig. 15, some further embodiments of the present application disclose an electronic device, which may include: one or more processors 1502, a memory 1503, and one or more computer programs 1504; the above components may be connected by one or more communication buses 1505. The one or more computer programs 1504 are stored in the memory 1503 and configured to be executed by the one or more processors 1502, and the one or more computer programs 1504 include instructions that can be used to perform all or part of the steps described in the embodiments of fig. 3-4, 5A-5C, and 6-14.

The processor 1502 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), one or more integrated circuits for controlling program execution, a baseband chip, or the like. There may be one or more memories 1503, and the memory 1503 may be a read-only memory (ROM), a random access memory (RAM), a disk memory, or the like.

The electronic device shown in fig. 15 may be a mobile phone, a tablet computer (e.g., an iPad), a notebook computer, a smart TV, a wearable device (e.g., a smart watch, a smart helmet, or a smart bracelet), and so on. When the electronic device shown in fig. 15 is a mobile phone, its structure can be as shown in fig. 3.

In the embodiments provided in the present application, the method provided in the embodiments of the present application is described from the perspective of the electronic device (the mobile phone 100) as the execution subject. In order to implement the functions in the method provided by the embodiments of the present application, the terminal device may include a hardware structure and/or a software module, and implement the functions in the form of a hardware structure, a software module, or a combination of both. Whether one of the above functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends on the particular application and the design constraints of the technical solution.

As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to a determination of …" or "in response to a detection of …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)".

In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application are produced, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a network of computers, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that incorporates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., solid state disk), or the like.

The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the exemplary discussions above are not intended to be exhaustive or to limit the application to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the application and its practical applications, to thereby enable others skilled in the art to best utilize the application and various embodiments with various modifications as are suited to the particular use contemplated.
