Game scene reproduction method, electronic device, and system
1. A method of game scene reproduction, comprising:
the electronic device running a first game application;
the electronic device acquiring, through a first performance detection tool, a first device parameter while the electronic device runs the first game application;
when the electronic device determines, through the first performance detection tool, that the first device parameter meets a first condition, the electronic device capturing, through the first performance detection tool, a first image displayed while the first game application runs, wherein the first image comprises a first minimap image, a first game screen, and a first mark corresponding to a target object, and the first mark is located at a first position in the first minimap image;
the electronic device closing the first game application and re-running the first game application;
the electronic device capturing, through a second performance detection tool, a first large map image displayed while the electronic device runs the first game application;
the electronic device mapping, through the second performance detection tool, the first position to a second position in the first large map image based on the first minimap image;
the electronic device acquiring, through the second performance detection tool, a third position at which the target object is currently located in the first large map image;
and when the electronic device controls, through the second performance detection tool and based on the second position and the third position, the game screen displayed by the electronic device to be the same as the first game screen, acquiring a second device parameter of the electronic device, wherein the second device parameter is used for optimizing the performance of the electronic device.
2. The method according to claim 1, wherein the controlling, by the electronic device through the second performance detection tool and based on the second position and the third position, the game screen displayed by the electronic device to be the same as the first game screen specifically comprises:
the electronic device controlling the target object to move from the third position to the second position;
the electronic device controlling rotation of the viewing angle of the game screen displayed by the first game application;
when the electronic device determines, through the second performance detection tool, that the viewing angle of the game screen displayed on the electronic device has been rotated to be the same as that of the first game screen, the electronic device acquiring a second device parameter through the second performance detection tool, wherein the second device parameter is used for optimizing the performance of the electronic device.
3. The method of claim 1, wherein before the electronic device runs the first game application, the method further comprises:
the electronic device displaying a first interface, wherein the first interface comprises icons of one or more applications, and the icons of the one or more applications comprise an icon of the first game application;
the electronic device receiving a first input acting on the icon of the first game application;
in response to the first input, the electronic device running the first game application.
4. The method of claim 1, wherein the first device parameter comprises: one or more of a frame rate of a game screen, a CPU load during running of the first game application, and a CPU temperature during running of the first game application.
5. The method of claim 4, wherein the first condition comprises:
the frame rate of the game screen is less than a specified frame rate threshold; and/or the CPU load during running of the first game application is greater than a specified load threshold; and/or the CPU temperature during running of the first game application is greater than a specified temperature threshold.
6. The method according to claim 1, wherein the second device parameter specifically comprises: one or more of the number of vertices per game frame, shader complexity, and texture resolution.
7. The method of claim 1, wherein before the electronic device maps, through the second performance detection tool, the first position to the second position in the first large map image based on the first minimap image, the method further comprises:
the electronic device acquiring a second image of the same size as the first image, wherein the second image comprises a second minimap image of the same size as the first minimap image;
the electronic device determining coordinate information of the second minimap image in the second image based on a preset minimap image;
wherein the electronic device mapping the first position to the second position in the first large map image based on the first minimap image specifically comprises:
the electronic device acquiring the first minimap image from the first image based on the coordinate information;
the electronic device zooming the first large map image to the same scale as the first minimap image;
the electronic device determining, from the first large map image, a first area whose similarity with the first minimap image is greater than a specified threshold;
the electronic device determining the second position in the first area based on position information of the first position in the first minimap image, wherein the relative position of the second position in the first area is the same as the relative position of the first position in the first minimap image.
8. The method according to claim 7, wherein the acquiring, by the electronic device, the first minimap image from the first image based on the coordinate information specifically comprises:
the electronic device cutting out the first minimap image in a preset shape from the first image based on the coordinate information.
9. The method according to claim 7, wherein the acquiring, by the electronic device, the first minimap image from the first image based on the coordinate information specifically comprises:
the electronic device cutting out a third image of a preset shape from the first image based on the coordinate information, wherein the third image comprises the first minimap image;
the electronic device identifying the edge position of the first minimap image in the third image through a preset algorithm;
the electronic device cropping the first minimap image from the third image based on the edge position.
10. The method according to claim 2, wherein the controlling, by the electronic device, the target object to move from the third position to the second position specifically comprises:
the electronic device outputting a control signal to an external mechanical device through the second performance detection tool;
the electronic device receiving, based on the control signal, an input of the external mechanical device acting on a direction key displayed on a touch screen of the electronic device while the first game application runs;
in response to the input of the external mechanical device, the electronic device controlling the target object to move from the third position to the second position.
11. The method of claim 10, wherein before the electronic device outputs the control signal to the external mechanical device through the second performance detection tool, the method further comprises:
the electronic device storing in advance one or more paths for moving the target object from the third position to the second position.
12. The method of claim 2, wherein after the electronic device controls the target object to move from the third position to the second position, the method further comprises:
the electronic device capturing a fourth image displayed while the first game application currently runs;
when the electronic device determines by comparison that the similarity between the game screen of the fourth image and the first game screen is outside a specified threshold range, the electronic device controlling the target object to move a specified distance in a specified direction;
each time the electronic device controls the target object to move the specified distance in the specified direction, the electronic device capturing an image after the target object moves and comparing the similarity between the game screen in the captured image and the first game screen; and when the similarity between the game screen in the captured image and the first game screen is within the specified threshold range, the electronic device controlling the viewing angle of the game screen displayed by the first game application to rotate.
13. The method according to claim 2, wherein the controlling, by the electronic device, rotation of the viewing angle of the game screen displayed by the first game application specifically comprises:
the electronic device outputting a control signal to an external mechanical device through the second performance detection tool;
when the electronic device receives, based on the control signal, an input of the external mechanical device acting on the target object, the game screen displayed by the running first game application rotating by a specified angle;
each time the game screen displayed by the running first game application rotates by the specified angle, the electronic device capturing an image currently displayed by the running first game application and comparing its similarity with the first image;
and when the similarity is within a specified threshold range, the electronic device determining that the viewing angle of the game screen displayed on the electronic device has been rotated to be the same as that of the first game screen.
14. An electronic device, comprising a communication apparatus, a memory, a processor coupled to the memory, a plurality of applications, and one or more programs; wherein the processor, when executing the one or more programs, causes the electronic device to implement the method of any one of claims 1-13.
15. A computer storage medium, characterized in that the storage medium has stored therein a computer program comprising executable instructions that, when executed by a processor, cause the processor to perform the method of any one of claims 1-13.
Background
With the development of terminal technology, the large-scale popularization of electronic devices such as mobile phones, tablet computers, and wearable devices with mobile communication functions (for example, smart bands, smart watches, and smart glasses) has brought convenience to people's lives; in particular, the rich variety of game applications on electronic devices adds much fun to daily life.
At present, reproducing the game scene at the moment a bug occurs during the running of a game application, so that the electronic device can determine the cause of the bug and repair it based on the reproduced scene, mainly depends on a tester. For example, when the electronic device detects, through a specified performance detection tool (e.g., the PerfDog test tool), an abnormality in a device parameter related to the running of the game (e.g., one or more of the frame rate of the game screen being less than a specified threshold, the CPU load during the running of the game being higher than a specified threshold, or the CPU temperature during the running of the game being higher than a specified threshold), the electronic device may capture the game screen image at the moment the abnormal device parameter is detected. As the game screen displayed by the electronic device changes, the tester must repeatedly compare the captured game screen image with the game scene currently displayed by the electronic device in order to reproduce the game scene at the moment the abnormal device parameter was detected.
As a result, the operation process is very cumbersome, which causes game scene reproduction to be time-consuming and inefficient.
Disclosure of Invention
The present application provides a game scene reproduction method, an electronic device, and a system, which reduce the time consumed by game scene reproduction, improve the efficiency of game scene reproduction, and improve the accuracy of the reproduction result.
In a first aspect, the present application provides a method for game scene reproduction, the method comprising: the electronic device runs a first game application. The electronic device acquires, through a first performance detection tool, a first device parameter while running the first game application. When the electronic device determines, through the first performance detection tool, that the first device parameter meets a first condition, the electronic device captures, through the first performance detection tool, a first image displayed while the first game application runs, wherein the first image comprises a first minimap image, a first game screen, and a first mark corresponding to a target object, and the first mark is located at a first position in the first minimap image. The electronic device closes the first game application and re-runs the first game application. The electronic device captures, through a second performance detection tool, a first large map image displayed while the electronic device runs the first game application. The electronic device maps, through the second performance detection tool, the first position to a second position in the first large map image based on the first minimap image. The electronic device acquires, through the second performance detection tool, a third position at which the target object is currently located in the first large map image. When the electronic device controls, through the second performance detection tool and based on the second position and the third position, the game screen displayed by the electronic device to be the same as the first game screen, the electronic device acquires a second device parameter of the electronic device, wherein the second device parameter is used for optimizing the performance of the electronic device.
In this way, the time consumed by game scene reproduction can be reduced, the efficiency of game scene reproduction can be improved, and the accuracy of the reproduction result can be improved.
In a possible implementation manner, the controlling, by the electronic device through the second performance detection tool and based on the second position and the third position, the game screen displayed by the electronic device to be the same as the first game screen specifically includes: the electronic device controls the target object to move from the third position to the second position. The electronic device controls rotation of the viewing angle of the game screen displayed by the first game application. When the electronic device determines, through the second performance detection tool, that the viewing angle of the game screen displayed on the electronic device has been rotated to be the same as that of the first game screen, the electronic device acquires a second device parameter through the second performance detection tool, wherein the second device parameter is used for optimizing the performance of the electronic device.
In one possible implementation, before the electronic device runs the first game application, the method further includes: the electronic device displays a first interface, wherein the first interface includes icons of one or more applications, including an icon of a first game application. The electronic device receives a first input acting on an icon of the first game application. In response to the first input, the electronic device runs the first game application.
In one possible implementation, the first device parameter includes: one or more of a frame rate of a game screen, a CPU load during the running of the first game application, and a CPU temperature during the running of the first game application.
In one possible implementation, the first condition includes: the frame rate of the game screen is less than a specified frame rate threshold; and/or the CPU load during the running of the first game application is greater than a specified load threshold; and/or the CPU temperature during the running of the first game application is greater than a specified temperature threshold.
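For illustration only (not part of the claimed method), the first-condition check described above could be sketched as follows; the threshold values and function names are hypothetical and chosen purely for the example.

```python
# Hypothetical anomaly-detection predicate; thresholds are illustrative,
# not taken from the application.
FRAME_RATE_THRESHOLD = 30.0   # frames per second
CPU_LOAD_THRESHOLD = 0.85     # fraction of total CPU capacity
CPU_TEMP_THRESHOLD = 45.0     # degrees Celsius

def meets_first_condition(frame_rate, cpu_load, cpu_temp):
    """Return True if any monitored device parameter is anomalous,
    matching the and/or structure of the first condition."""
    return (frame_rate < FRAME_RATE_THRESHOLD
            or cpu_load > CPU_LOAD_THRESHOLD
            or cpu_temp > CPU_TEMP_THRESHOLD)
```

Because the condition is a disjunction, a single anomalous parameter (for example, a frame rate below the threshold) is enough to trigger the screenshot.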
In a possible implementation manner, the second device parameter specifically includes: one or more of the number of vertices per game frame, shader complexity, and texture resolution.
In one possible implementation, before the electronic device maps, through the second performance detection tool, the first position to the second position in the first large map image based on the first minimap image, the method further includes: the electronic device acquires a second image of the same size as the first image, wherein the second image includes a second minimap image of the same size as the first minimap image. The electronic device determines coordinate information of the second minimap image in the second image based on a preset minimap image. The mapping, by the electronic device, of the first position to the second position in the first large map image based on the first minimap image specifically includes: the electronic device acquires the first minimap image from the first image based on the coordinate information. The electronic device zooms the first large map image to the same scale as the first minimap image. The electronic device determines, from the first large map image, a first area whose similarity with the first minimap image is greater than a specified threshold. The electronic device determines the second position in the first area based on position information of the first position in the first minimap image, wherein the relative position of the second position in the first area is the same as the relative position of the first position in the first minimap image.
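For illustration only, once the first area matching the minimap has been located in the large map image, the final step (preserving the relative position of the mark) reduces to simple coordinate scaling. The following sketch uses hypothetical names and assumes the matched area's origin and size are already known.

```python
# Hypothetical position mapping: preserve the relative position of the
# first position (mark on the minimap) inside the matched first area.
def map_minimap_position_to_large_map(first_pos, minimap_size,
                                      area_origin, area_size):
    """Return the second position in large-map coordinates.

    first_pos    -- (x, y) of the mark inside the minimap image
    minimap_size -- (width, height) of the minimap image
    area_origin  -- (x, y) of the matched first area in the large map
    area_size    -- (width, height) of the matched first area
    """
    x, y = first_pos
    mw, mh = minimap_size
    ax, ay = area_origin
    aw, ah = area_size
    # Same relative offset inside the area as inside the minimap.
    return (ax + x / mw * aw, ay + y / mh * ah)
```

For example, a mark at (25, 50) in a 100x100 minimap whose matched area starts at (400, 300) with the same 100x100 size maps to (425, 350).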
In a possible implementation manner, the acquiring, by the electronic device, the first minimap image from the first image based on the coordinate information specifically includes: the electronic device cuts out the first minimap image in a preset shape from the first image based on the coordinate information.
In a possible implementation manner, the acquiring, by the electronic device, the first minimap image from the first image based on the coordinate information specifically includes: the electronic device cuts out a third image of a preset shape from the first image based on the coordinate information, wherein the third image includes the first minimap image. The electronic device identifies the edge position of the first minimap image in the third image through a preset algorithm. The electronic device crops the first minimap image from the third image based on the edge position.
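For illustration only, the preset edge-finding algorithm is unspecified in the application; one minimal stand-in is to treat the minimap as the bounding box of non-background pixels in the third image, as sketched below with hypothetical names (a real implementation might use an edge detector instead).

```python
# Toy edge-position detection: find the bounding box of non-background
# pixels in a 2D pixel grid, then crop it out. Purely illustrative.
def minimap_bounding_box(image, background=0):
    """Return (top, left, bottom, right) of the non-background region."""
    rows = [r for r, row in enumerate(image)
            if any(p != background for p in row)]
    cols = [c for c in range(len(image[0]))
            if any(row[c] != background for row in image)]
    return (min(rows), min(cols), max(rows), max(cols))

def crop(image, box):
    """Cut the bounding box out of the image (inclusive bounds)."""
    top, left, bottom, right = box
    return [row[left:right + 1] for row in image[top:bottom + 1]]
```

Applying `crop(image, minimap_bounding_box(image))` to the third image would yield the first minimap image under this simplified background assumption.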
In a possible implementation manner, the controlling, by the electronic device, the target object to move from the third position to the second position specifically includes: the electronic device outputs a control signal to an external mechanical device through the second performance detection tool. The electronic device receives, based on the control signal, an input of the external mechanical device acting on a direction key displayed on the touch screen of the electronic device while the first game application runs. In response to the input of the external mechanical device, the electronic device controls the target object to move from the third position to the second position.
In a possible implementation manner, after the electronic device controls the target object to move from the third position to the second position, the method may further include: the electronic device captures a fourth image displayed while the first game application currently runs. When the electronic device determines by comparison that the similarity between the game screen of the fourth image and the first game screen is outside a specified threshold range, the electronic device controls the target object to move a specified distance in a specified direction. Each time the electronic device controls the target object to move the specified distance in the specified direction, the electronic device captures an image after the target object moves and compares the similarity between the game screen in the captured image and the first game screen; when the similarity between the game screen in the captured image and the first game screen is within the specified threshold range, the electronic device controls the viewing angle of the game screen displayed by the first game application to rotate.
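For illustration only, this step-and-compare refinement is essentially a bounded loop: nudge the target object, recapture, compare, and stop once the similarity falls within the threshold range. The sketch below stubs out capture-plus-comparison as a single hypothetical `similarity(offset)` callback.

```python
# Hypothetical position-refinement loop. `similarity(offset)` stands in
# for "move to this offset, capture the screen, compare with the first
# game screen"; `step` is the specified distance per move.
def step_until_similar(similarity, step, threshold, max_steps=20):
    """Advance in a fixed direction until similarity reaches the
    threshold; return the final offset, or None if never reached."""
    offset = 0
    for _ in range(max_steps):
        if similarity(offset) >= threshold:
            return offset          # close enough: proceed to rotation
        offset += step             # move the specified distance again
    return None                    # gave up within the step budget
```

A bounded step count avoids walking indefinitely when the captured screen never becomes similar enough, for example if the scene has changed irrecoverably.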
In a possible implementation manner, the controlling, by the electronic device, rotation of the viewing angle of the game screen displayed by the first game application specifically includes: the electronic device outputs a control signal to an external mechanical device through the second performance detection tool. When the electronic device receives, based on the control signal, an input of the external mechanical device acting on the target object, the game screen displayed by the running first game application rotates by a specified angle. Each time the game screen displayed by the running first game application rotates by the specified angle, the electronic device captures an image currently displayed by the running first game application and compares its similarity with the first image. When the similarity is within a specified threshold range, the electronic device determines that the viewing angle of the game screen displayed on the electronic device has been rotated to be the same as that of the first game screen.
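For illustration only, the application does not specify the similarity metric used in these comparisons. A deliberately simple stand-in, the fraction of exactly matching pixels between two equally sized pixel grids, is sketched below; a practical implementation would more likely use template matching or a perceptual metric.

```python
# Toy screen-similarity metric: fraction of matching pixels between two
# equally sized 2D pixel grids. Illustrative only.
def screen_similarity(img_a, img_b):
    """Return a similarity score in [0, 1]; 1.0 means identical grids."""
    total = matches = 0
    for row_a, row_b in zip(img_a, img_b):
        for pa, pb in zip(row_a, row_b):
            total += 1
            matches += (pa == pb)
    return matches / total
```

With such a metric, "similarity within a specified threshold range" becomes a simple numeric comparison, e.g. `screen_similarity(captured, first) >= 0.95`.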
In a second aspect, the present application provides an electronic device comprising a communication apparatus, a memory, and a processor coupled to the memory, a plurality of application programs, and one or more programs. The processor, when executing the one or more programs, causes the electronic device to perform the method of any of the possible implementations of the first aspect.
In a third aspect, the present application provides a computer storage medium having a computer program stored therein, the computer program comprising executable instructions that, when executed by a processor, cause the processor to perform the method of any one of the possible implementations of the first aspect.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an electronic device 100 according to an embodiment of the present application;
Fig. 2 is a flowchart of a game scene reproduction method according to an embodiment of the present application;
Figs. 3A-3E are schematic diagrams of a set of user interfaces according to an embodiment of the present application;
Fig. 3F is a schematic diagram of a position mapping according to an embodiment of the present application;
Figs. 3G-3H are schematic diagrams of a set of user interfaces according to an embodiment of the present application;
Figs. 4A-4B are schematic diagrams of a set of user interfaces according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and in detail with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" in the text merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the application, unless stated otherwise, "plurality" means two or more.
First, a process of game scene reproduction provided by the present application is introduced.
The electronic device may run a game application and display an interface of the game application. While running the game application, the electronic device may also run a specified performance detection tool (e.g., the PerfDog test tool) for detecting changes in device parameters (e.g., game screen frame rate, CPU load during game running, CPU temperature, etc.) during game running. When the specified performance detection tool detects that a device parameter is abnormal (for example, the game frame rate drops, the CPU load is too high during game running, the CPU temperature is too high, and the like), the electronic device may capture image 1 of the game application interface at that moment. After acquiring image 1, the tester may repeatedly compare image 1 with the interface of the game application currently displayed by the electronic device, so as to reproduce the game scene at the moment image 1 was captured.
As can be seen from the above flow, game scene reproduction depends on manual operation, and the operation process is very cumbersome, which causes game scene reproduction to be time-consuming, inefficient, and inaccurate.
Accordingly, the present application provides a method of game scene reproduction. When the electronic device runs the game application and a specified performance detection tool (e.g., the PerfDog test tool), the electronic device may display an interface of the game application. When the specified performance detection tool detects, during the running of the game application, that a device parameter (for example, one or more of the frame rate of the game screen, the CPU load during game running, the CPU temperature during game running, and the like) is abnormal (for example, one or more of the frame rate of the game screen being less than a specified threshold, the CPU load during game running being higher than a specified threshold, the CPU temperature during game running being higher than a specified threshold, and the like), the electronic device may capture the game application interface to obtain screenshot 1. Screenshot 1 may include a minimap and a target object, among other things. Then, when the electronic device needs further device performance analysis through a specified performance detection tool (e.g., the RenderDoc tool), the electronic device can reproduce the game scene corresponding to screenshot 1 through specified algorithms (e.g., an image recognition algorithm, a routing algorithm, etc.) based on screenshot 1. The electronic device may then perform further device performance analysis based on the reproduced game scene. In this way, the time consumed by game scene reproduction can be reduced, the efficiency of game scene reproduction can be improved, and the accuracy of the reproduction result can be improved.
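For illustration only, the reproduction flow just described can be sketched as a pipeline of pluggable primitives. Every name below is hypothetical; capture, locate, mapping, movement, and comparison are passed in as callbacks standing in for the screenshot, image recognition, routing, and control steps of the method.

```python
# High-level sketch of the automated reproduction phase, with the
# capture/recognition/control primitives stubbed out as callbacks.
def reproduce_scene(screenshot_1, locate_on_minimap, map_to_large_map,
                    current_position, move_to, capture_screen,
                    screens_match):
    """Steer the target object back to the scene recorded at anomaly time.

    screenshot_1 is the image saved when the abnormal device parameter
    was detected; returns True if the displayed screen now matches it.
    """
    first_pos = locate_on_minimap(screenshot_1)   # mark on the minimap
    second_pos = map_to_large_map(first_pos)      # position on large map
    third_pos = current_position()                # object's current spot
    move_to(third_pos, second_pos)                # routing + movement
    return screens_match(capture_screen(), screenshot_1)
```

Structuring the flow this way keeps the recognition and control pieces independently replaceable, which mirrors how the method composes image recognition and routing algorithms.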
Next, an exemplary electronic apparatus 100 provided in the embodiment of the present application is described.
Fig. 1 shows a hardware configuration diagram of an electronic device 100.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of the present application do not specifically limit the type of the electronic device.
The electronic device 100 may include a processor 101, a memory 102, a wireless communication module 103, a display screen 104, a sensor module 105, an audio module 106, a speaker 107, and the like. The modules may be connected by a bus or in other manners, and the embodiment of the present application takes the bus connection as an example.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 101 may include one or more processing units; for example, the processor 101 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 101 for storing instructions and data. In some embodiments, the memory in the processor 101 is a cache. The cache may hold instructions or data that the processor 101 has just used or uses cyclically. If the processor 101 needs to use the instructions or data again, it can call them directly from the cache. This avoids repeated accesses, reduces the waiting time of the processor 101, and thereby improves system efficiency.
In some embodiments, the processor 101 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The memory 102 is coupled to the processor 101 and is configured to store various software programs and/or sets of instructions. In a specific implementation, the memory 102 may include a volatile memory, such as a random access memory (RAM); it may also include a non-volatile memory, such as a ROM, a flash memory, a hard disk drive (HDD), or a solid state drive (SSD); the memory 102 may also include a combination of the above kinds of memories. The memory 102 may store program code (e.g., image recognition algorithm code, routing algorithm code, etc.) so that the processor 101 can call the program code stored in the memory 102 to implement the methods of the embodiments in the electronic device 100. The memory 102 may also store an operating system, for example an embedded operating system such as uCOS, VxWorks, or RTLinux.
The wireless communication module 103 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (WLAN) (e.g., wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 103 may be one or more devices integrating at least one communication processing module. The wireless communication module 103 receives electromagnetic waves via an antenna, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 101. The wireless communication module 103 may also receive a signal to be transmitted from the processor 101, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna for radiation. In some embodiments, the electronic device 100 may detect or scan devices in the vicinity of the electronic device 100 by transmitting a signal through a Bluetooth module (not shown in fig. 1) or a WLAN module (not shown in fig. 1) in the wireless communication module 103, establish a wireless communication connection with the nearby devices, and transmit data. The Bluetooth module may provide a Bluetooth communication solution including one or more of classic Bluetooth (BR/EDR) or Bluetooth low energy (BLE). The WLAN module may provide a WLAN communication solution including one or more of Wi-Fi direct, Wi-Fi LAN, or Wi-Fi softAP.
The display screen 104 may be used to display images, video, and the like. The display screen 104 may include a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 104, where N is a positive integer greater than 1.
The sensor module 105 may include a touch sensor 105A and the like. The touch sensor 105A may also be referred to as a "touch device". The touch sensor 105A may be disposed on the display screen 104, and the touch sensor 105A and the display screen 104 form a touch screen, also called a "touch control screen". The touch sensor 105A can be used to detect a touch operation acting on or near it. Optionally, the sensor module 105 may further include a gyroscope sensor (not shown in fig. 1), an acceleration sensor (not shown in fig. 1), and the like. The gyroscope sensor may be used to determine the motion posture of the electronic device 100; in some embodiments, the electronic device 100 may determine its angular velocity about three axes (i.e., the x, y, and z axes) via the gyroscope sensor. The acceleration sensor may be used to detect the magnitude of acceleration of the electronic device 100 in various directions (typically along the x, y, and z axes), and can detect the magnitude and direction of gravity when the electronic device 100 is stationary.
The audio module 106 may be used to convert digital audio information into an analog audio signal for output, and may also be used to convert an analog audio input into a digital audio signal. The audio module 106 may also be used to encode and decode audio signals. In some embodiments, the audio module 106 may also be disposed in the processor 101, or some functional modules of the audio module 106 may be disposed in the processor 101.
The speaker 107, which may also be referred to as a "horn", is used to convert audio electrical signals into sound signals. The electronic device 100 may play music or make a hands-free call through the speaker 107.
Optionally, the electronic device 100 may further include a mobile communication module (not shown in fig. 1). The mobile communication module may provide a solution for wireless communication including 2G/3G/4G/5G, etc. for use on the electronic device 100.
A game scene reproduction method provided in the embodiment of the present application is described below.
Fig. 2 is a flowchart illustrating a game scene reproduction method provided in an embodiment of the present application, where a specific flow of the method may include:
s201, the electronic device 100 runs a first game application and displays an interface of the first game application.
Specifically, the performance detection tool 1 (which may also be referred to as a first performance detection tool, e.g., the PerfDog tool) may select the first game application as the object of performance detection by the performance detection tool 1. The performance detection tool 1 may detect a first device parameter of the electronic device 100 (e.g., one or more of a frame rate of the game screen, a CPU load during game running, a CPU temperature during game running, etc.) while the first game application is running. The electronic device 100 may run the first game application and display an interface of the first game application.
The electronic device 100 can receive an input (which may also be referred to as a first input, e.g., a click) on a first game application icon, and in response to the input, the electronic device 100 can run the first game application and display an interface of the first game application.
Illustratively, as shown in fig. 3A, the electronic device 100 may display a home screen interface 300 (which may also be referred to as a first interface). One or more application icons may be displayed in the interface 300. The one or more application icons may include a weather application icon, a stock application icon, a calculator application icon, a setting application icon, a mail application icon, a theme application icon, a music application icon, a video application icon, a game application icon 301, and the like.
Optionally, a status bar, a page indicator, and a tray icon area may also be displayed in the interface 300. The status bar may include one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), a signal strength indicator for wireless fidelity (Wi-Fi) signals, a battery status indicator, a time indicator, and so forth. The page indicator may be used to indicate the positional relationship of the currently displayed page with other pages. The tray icon area may include a plurality of tray icons (e.g., a dial application icon, an information application icon, a contacts application icon, a camera application icon, etc.) that remain displayed during page switching. A page may also include a plurality of application icons and a page indicator; the page indicator may not be a part of the page and may exist separately, and the tray icons are also optional. This is not limited in this embodiment of the present application.
The electronic device 100 may receive a touch operation (e.g., a click) acting on the game app icon 301, and in response to the touch operation, the electronic device 100 may display the game app interface 310.
As shown in fig. 3B, the game application interface 310 may include a game screen, a direction key area 311, a small map area 312, a target object 313, and the like. The direction key area 311 may include an up key icon, a right key icon, a down key icon, and a left key icon. The direction key area may be used to receive a touch operation (e.g., a click) applied to the area, and in response to the touch operation, the electronic device 100 may display movement of the target object 313 in the corresponding direction (e.g., upward, rightward, downward, or leftward). The small map area 312 may include a marker 312A corresponding to the target object 313. The marker 312A may be used to display the position information of the target object 313 in the game map. The target object 313 is the subject operated in the game application. The electronic device 100 may control the target object 313 to move in the game screen in response to a touch operation on the direction key area 311. The electronic device 100 may also adjust the position, in the game scene, of the game screen displayed on the display screen in response to a drag input (e.g., a long-press drag operation acting on the target object 313).
In one possible implementation, the direction key area 311 may include more or fewer direction keys than shown. For example, the direction key area 311 may also include an upper-right oblique direction key, a lower-right oblique direction key, and the like. In other embodiments, the direction key area 311 may be a circular icon for monitoring a touch operation (e.g., long-pressing the circular icon and dragging in any direction) applied to the circular icon, and the electronic device 100 displays an interface in which the target object 313 moves in any direction in the game screen in response to the operation. This is not limited by the present application.
In one possible implementation, the small map area 312 may be used to receive a touch operation (e.g., a click) that acts within the area, and in response to the touch operation, the electronic device 100 may display an interface that includes the large map of the gaming application.
Optionally, in some embodiments, the game application interface 310 may further include a function key area, which may include one or more function keys (e.g., a "pop" function key, a "bring up" function key, etc.). The electronic device 100 may receive a touch operation (e.g., a click) applied to the one or more function keys, and in response to the touch operation, the electronic device 100 may display a game screen in which the target object applies the corresponding skill.
In a possible implementation manner, the electronic device 100 may also receive a key value of a physical key, where the key value of the physical key has a mapping relationship with the direction key area 311. The electronic device 100 may control the target object 313 to move in the corresponding direction (e.g., up, right, down, or left) based on the key value of the physical key.
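Such a mapping relationship between physical-key values and movement directions can be sketched as follows. This is an illustrative sketch: the key values below are Linux input-event codes used as an assumption, since the embodiment does not specify the actual physical-key values.

```python
# Hypothetical key-value -> movement-direction mapping. The key codes
# are Linux input-event codes (KEY_UP = 103, KEY_LEFT = 105,
# KEY_RIGHT = 106, KEY_DOWN = 108), used here only as an example.
KEY_TO_DIRECTION = {
    103: "up",
    106: "right",
    108: "down",
    105: "left",
}

def direction_for_key(key_value):
    """Translate a physical-key value into the direction in which the
    electronic device moves the target object, or None if unmapped."""
    return KEY_TO_DIRECTION.get(key_value)

print(direction_for_key(106))  # -> right
```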
It should be noted that the game application interface 310 is only used for exemplary explanation of the present application and does not constitute any limitation to the present application.
S202, when the electronic device 100 detects that the first device parameter is abnormal, the electronic device 100 intercepts the interface of the first game application at the current time to obtain a screenshot 1 (which may also be referred to as a first image). The screenshot 1 may include a small map window and a game screen 1 (which may also be referred to as a first game screen). In the small map window, a map image 1 (which may also be referred to as a first small map image) and a mark 1 (which may also be referred to as a first mark) corresponding to the target object are displayed, and the mark 1 may be located at a first position on the map image 1. The game screen 1 includes an image corresponding to the target object.
Specifically, when the performance detection tool 1 (e.g., the PerfDog tool) detects that an abnormality occurs in a first device parameter of the electronic device 100 during the running of the first game application (which may also be referred to as a first condition, for example, one or more of the frame rate of the game screen being smaller than a specified threshold, the CPU load during game running being higher than a specified threshold, the CPU temperature during game running being higher than a specified threshold, and the like), the electronic device 100 may intercept the interface of the first game application at the current time to obtain the screenshot 1.
For example, the electronic device 100 may run the game application corresponding to the game application icon 301 shown in fig. 3A and display the game application interface 310. Meanwhile, the electronic device 100 may also operate the performance detection tool 1, which is used to detect a first device parameter during the game operation, such as a game screen frame rate, a CPU load during the game operation, a CPU temperature during the game operation, and the like. When the performance detection tool 1 detects that the game frame rate is lower than a specified threshold (for example, the game frame rate is lower than 30 frames/second), the electronic device 100 may intercept the game application interface 310 to obtain a screenshot (which may also be referred to as screenshot 1) as shown in fig. 3C. The screenshot may include a minimap window 321 and a game screen 320 (which may also be referred to as game screen 1). The minimap window 321 may include a marker 321A corresponding to the target object (may also be referred to as a marker 1 corresponding to the target object) and a map image 321B (may also be referred to as a map image 1), and the game screen 320 may include an image 322 corresponding to the target object 313 (may also be referred to as a target object).
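The first-condition check described above can be sketched as follows. This is an illustrative sketch: the threshold values and the set of monitored parameters are assumptions, not values fixed by the embodiment (the frame-rate threshold follows the 30 frames/second example above).

```python
# Hypothetical thresholds for the first condition of step S202.
FRAME_RATE_MIN = 30.0   # frames/second (example value from the text)
CPU_LOAD_MAX = 90.0     # percent (assumed)
CPU_TEMP_MAX = 45.0     # degrees Celsius (assumed)

def first_condition_met(frame_rate, cpu_load, cpu_temp):
    """Return True when any first device parameter is abnormal,
    which triggers capturing screenshot 1 of the game interface."""
    return (frame_rate < FRAME_RATE_MIN
            or cpu_load > CPU_LOAD_MAX
            or cpu_temp > CPU_TEMP_MAX)

print(first_condition_met(25.0, 50.0, 40.0))  # -> True
```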
S203, after the electronic device 100 acquires the screenshot 1, the electronic device 100 may control the first game application to display and capture a large map (which may also be referred to as a first large map image).
Specifically, after the electronic device 100 acquires the screenshot 1, the performance detection tool 2 (which may also be referred to as a second performance detection tool, such as the RenderDoc tool, the TraceView tool, the systrace tool, etc.) may select the first game application as the object of detection by the performance detection tool 2. The performance detection tool 2 may detect a second device parameter (e.g., game frame vertex count, shader complexity, texture resolution, etc.) of the electronic device 100 during the running of the first game application. The electronic device 100 may run the first game application and control the first game application to display the large map.
The electronic device 100 can receive a touch operation (e.g., a click) to the map control, in response to which the electronic device 100 can display and intercept a large map of the first gaming application, where the large map can be a map panorama of the first gaming application.
Illustratively, as shown in fig. 3D, the electronic device 100 may run the first game application and display a game application interface 330. The game screens of the game application interface 330 and the game application interface 310 are not the same. The game application interface 330 may include a game screen, a direction key area 311, a small map area 312, a target object 313, and the like. For the description of the direction key area 311, the small map area 312, and the target object 313, reference may be made to the foregoing description of the embodiment shown in fig. 3B, and details are not repeated here. The location of the marker 312A in the small map area 312 in the game application interface 330 is different from its location in the game application interface 310.
The electronic device 100 may receive a touch operation (e.g., a click) on the small map area 312 (which may also be referred to as a map control), and in response to the touch operation, the electronic device 100 may display a game application interface 340.
As shown in fig. 3E, the gaming application interface 340 may include a large map of the gaming application and a close control 342. The large map may include a mark 341 corresponding to the target object 313 shown in fig. 3D, which is used to indicate the position information of the current target object 313 in the large map. Closing control 342 may be used to receive a touch operation (e.g., a click) on the control, in response to which electronic device 100 may no longer display a large map of the gaming application.
S204, the electronic device 100 determines, based on the map image 1 and the large map, that the mark 1 corresponding to the target object in the map image 1 is mapped to the position 1 (which may also be referred to as a second position) in the large map.
The electronic device 100 may extract a map image 1 from the screenshot 1, where the map image 1 may include a mark 1 corresponding to the target object.
Specifically, the electronic device 100 may acquire, in advance, coordinate information of the map image 1 in the screenshot 1. The electronic device 100 may extract the map image 1 from the screenshot 1 based on the coordinate information.
For example, before acquiring the screenshot 1, the electronic device 100 may acquire a screenshot 2 (which may also be referred to as a second image), which is the same size as the screenshot 1 and includes a small map window, and a map image (which may also be referred to as a preset small map image). This map image is the same as the map image displayed in the small map window in the screenshot 2 (which may also be referred to as a second small map image). The electronic device 100 may identify the coordinate information of the map image in the screenshot 2 based on the screenshot 2 and the map image through a specified image recognition algorithm (e.g., the template matching function cv2.matchTemplate() in OpenCV), and store the coordinate information. When the electronic device 100 acquires the screenshot 1, the electronic device 100 may extract a map image 1 of a preset shape (e.g., a square or a rectangle) from the screenshot 1 based on the coordinate information.
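The template-matching step can be illustrated with a pure-Python sum-of-squared-differences search, used here as a minimal stand-in for OpenCV's cv2.matchTemplate() with the TM_SQDIFF method. Grayscale images are represented as 2-D lists; this is an illustrative sketch, not the embodiment's implementation.

```python
def match_template(image, template):
    """Return the (row, col) top-left corner where `template` best
    matches `image`, minimizing the sum of squared differences (SSD).
    A pure-Python stand-in for cv2.matchTemplate(..., cv2.TM_SQDIFF)."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = None, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            ssd = sum(
                (image[r + i][c + j] - template[i][j]) ** 2
                for i in range(th) for j in range(tw)
            )
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Example: find a 2x2 minimap template inside a 4x4 screenshot.
screenshot = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
minimap = [[9, 8],
           [7, 9]]
print(match_template(screenshot, minimap))  # -> (1, 1)
```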
After the map image 1 is extracted, the electronic device 100 may scale the large map displayed in step S203 (e.g., the large map in the game application interface 340 shown in fig. 3E) to the same zoom ratio (which may also be referred to as a scale) as the map image 1. The electronic device 100 may identify a region (which may also be referred to as a first region) similar to the map image 1 from the large map through a specified image recognition algorithm (e.g., the template matching function cv2.matchTemplate() in OpenCV) based on the map image 1.
Next, the electronic device 100 may calculate the position information, in the map image 1, of the mark 1 (for example, the aforementioned marker 321A in fig. 3C) corresponding to the target object. The electronic device 100 may calculate, through a specified algorithm (e.g., the remap algorithm in OpenCV), the position 1 in the large map to which the mark 1 in the map image 1 is mapped.
Illustratively, as shown in fig. 3F, the map image 1 may be an image 351, and the image 351 includes a mark 321A (which may also be referred to as a mark 1 corresponding to the target object). The large map may be an image 352. The image 352 may include a mark 352A and a mark 341 corresponding to the target object 313 shown in fig. 3D. The mark 352A may indicate the position 1 in the large map to which the mark 321A is mapped, and the mark 341 may represent the position information of the current target object 313 in the large map.
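Once the first region has been located in the scaled large map, mapping mark 1 to position 1 reduces to an offset plus a rescale. The helper below is a hypothetical sketch of that arithmetic (the embodiment itself cites OpenCV's remap algorithm; the function name and parameters here are assumptions):

```python
def map_minimap_to_large(mark_xy, region_topleft, scale):
    """Map the marker's (x, y) inside map image 1 to position 1 in the
    full-resolution large map: offset by the matched first region's
    top-left corner in the scaled large map, then multiply by the
    ratio of the full-size large map to the scaled one."""
    mx, my = mark_xy
    rx, ry = region_topleft
    return ((rx + mx) * scale, (ry + my) * scale)

# Marker at (10, 5) in the minimap, first region found at (100, 200)
# in a half-size large map (scale factor 2.0 back to full size):
print(map_minimap_to_large((10, 5), (100, 200), 2.0))  # -> (220.0, 410.0)
```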
In some possible implementations, when the map image 1 in the screenshot 1 is a circle, and the size of the extracted image (which may also be referred to as a third image) including the map image 1 is a rectangle or a square, the electronic device 100 may further extract the map image 1 from the extracted image by taking a center point of the extracted image as a center, and removing an extra portion outside the map image 1 through a specified image recognition algorithm (e.g., an edge detection algorithm).
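For the circular-minimap case, removing the extra portion outside the map image can be sketched as masking every pixel outside the circle inscribed in the extracted square image, centered at its center point. This is a simple geometric stand-in for the edge-detection-based approach named above:

```python
def crop_circle(square):
    """Zero out pixels outside the circle inscribed in a square
    grayscale image (2-D list), keeping only the circular minimap
    centered at the image's center point."""
    n = len(square)
    c = (n - 1) / 2.0   # center coordinate of the pixel grid
    r = n / 2.0         # radius of the inscribed circle
    out = []
    for i in range(n):
        row = []
        for j in range(n):
            inside = (i - c) ** 2 + (j - c) ** 2 <= r * r
            row.append(square[i][j] if inside else 0)
        out.append(row)
    return out

# On a 5x5 all-ones image, the four corner pixels fall outside
# the inscribed circle and are removed.
masked = crop_circle([[1] * 5 for _ in range(5)])
```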
S205, the electronic device 100 acquires a position 2 (which may also be referred to as a third position) of the target object in the large map.
For example, the electronic device 100 may calculate the coordinate information, in the large map, of the mark 341 corresponding to the target object 313 (which may also be referred to as the target object) in the embodiment shown in fig. 3E in the foregoing step S203, so as to obtain the position 2 of the target object in the large map.
S206, the electronic device 100 controls the target object to move from the position 2 to the position 1.
Specifically, the electronic device 100 may output a control signal to an external mechanical device (e.g., a manipulator) through the performance detection tool 2. The electronic device 100 may receive an input operation, performed by the external mechanical device based on the control signal, on a direction control displayed on the display screen, and in response to the input operation, the electronic device 100 may control the target object to move from the position 2 to the position 1.
The electronic device 100 may store, in advance, one or more travel paths from the position 2 to the position 1. The electronic device 100 may uniformly divide the large map into a plurality of regions according to a specified region size (e.g., each region is a set of 9×9 pixel points), and each region may be referred to as a location point. The plurality of location points in the large map include a location point 1 corresponding to the position 1 and a location point 2 corresponding to the position 2. With the location point 2 as a starting point and the location point 1 as an end point, the electronic device 100 may control the target object to travel from the location point 2 to the location point 1 through a specified algorithm (e.g., a route search algorithm) within a specified time range (e.g., 20 seconds). If the target object can move from the location point 2 to the location point 1 within the specified time range (e.g., 20 seconds), the travel path is valid, and the electronic device 100 may store the travel path. If there are multiple travel paths from the position 2 to the position 1, the electronic device 100 may store the multiple paths in a corresponding path set.
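The division of the large map into location points and the bookkeeping of valid travel paths can be sketched as follows. The 9×9 region size and the 20-second limit follow the examples above; the function names and the dictionary-based path set are assumptions for illustration.

```python
CELL = 9            # each location point covers a 9x9 block of pixels
TIME_LIMIT = 20.0   # seconds allowed for a travel path to count as valid

def pixel_to_cell(x, y):
    """Map a large-map pixel coordinate to its location point index."""
    return (x // CELL, y // CELL)

# (start_point, end_point) -> list of stored valid travel paths
path_sets = {}

def store_if_valid(start, goal, path, travel_seconds):
    """Keep a travel path only if the target object completed it within
    the specified time range; multiple paths for the same start/end
    pair accumulate in one path set."""
    if travel_seconds <= TIME_LIMIT:
        path_sets.setdefault((start, goal), []).append(path)
        return True
    return False

print(pixel_to_cell(20, 5))  # -> (2, 0)
```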
The electronic device 100 may store operation information of the external mechanical device (e.g., a manipulator). The electronic device 100 may calculate the distance that the target object moves forward each time the external mechanical device acts on a direction control (e.g., the right direction control, the left direction control, the up direction control, the down direction control, etc.) for 1 second. For example, first, the electronic device 100 may capture a screenshot 3. Based on the screenshot 3, the electronic device 100 may identify the contour of the target object through an image recognition algorithm and calculate a center point position 3 of the target object from the contour. When the electronic device 100 receives a touch operation of the external mechanical device on the right direction control for 5 seconds, the electronic device 100 may control the target object to move to the right in response to the touch operation. Then, the electronic device 100 may obtain a screenshot 4; based on the screenshot 4, the electronic device 100 may identify the contour of the target object through the image recognition algorithm and calculate a center point position 4 of the target object from the contour. The electronic device 100 may calculate the displacement between the center point position 3 and the center point position 4 of the target object and divide it by the 5-second movement time, so as to obtain the distance that the target object moves forward when the external mechanical device operates the right direction control for 1 second.
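The per-second forward distance derived from center point positions 3 and 4 is simply the Euclidean displacement divided by the press duration:

```python
def forward_distance_per_second(p_before, p_after, seconds):
    """Estimate how far the target object travels per 1 second of a
    direction-control press, from its center point before (position 3)
    and after (position 4) the press: Euclidean displacement / time."""
    dx = p_after[0] - p_before[0]
    dy = p_after[1] - p_before[1]
    return ((dx * dx + dy * dy) ** 0.5) / seconds

# A 5-second press that moves the center point 100 px to the right:
print(forward_distance_per_second((40, 80), (140, 80), 5))  # -> 20.0
```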
When the electronic device 100 acquires the position 2 at which the target object is currently located in the large map, the electronic device 100 may, with the position 2 as a starting point, control the target object to move from the position 2 to the position 1 based on a pre-stored travel path from the position 2 to the position 1 and the operation information of the external mechanical device, in response to the received directional input operation of the external mechanical device.
For example, as shown in fig. 3G, the electronic device 100 may receive an input operation by the manipulator on the right key icon in the direction key area 311, and in response to the input operation, the electronic device 100 may control the target object 313 to move to the right. After the electronic device 100 controls the target object to move from the position 2 to the position 1, the electronic device 100 may display a game application interface 360. For the description of the game application interface 360, reference may be made to the description of the embodiment shown in fig. 3B, and details are not repeated here. The game screen in fig. 3G is different from the game screen in the screenshot 1 shown in fig. 3C in the foregoing step S202.
In one possible implementation, if one or more travel paths from the position 2 to the position 1 are not pre-stored in the electronic device 100, the electronic device 100 may control the target object to move from the position 2 to the position 1 through a specified algorithm (e.g., the A* path search algorithm). The electronic device 100 may store the travel path along which the target object moved from the position 2 to the position 1.
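A minimal sketch of the A* path search over location points, assuming 4-way movement and a Manhattan-distance heuristic (the embodiment names the algorithm but does not fix these details):

```python
import heapq

def a_star(grid, start, goal):
    """A* search over location points. `grid[r][c]` is True where the
    target object can travel; returns the list of location points from
    `start` to `goal`, or None if no travel path exists."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_heap = [(h(start), 0, start)]   # (f = g + h, g, node)
    prev, g = {start: None}, {start: 0}
    while open_heap:
        _, cost, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = []
            while cur is not None:       # walk predecessors back to start
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            nr, nc = nxt
            if (0 <= nr < rows and 0 <= nc < cols and grid[nr][nc]
                    and cost + 1 < g.get(nxt, float("inf"))):
                g[nxt] = cost + 1
                prev[nxt] = cur
                heapq.heappush(open_heap, (cost + 1 + h(nxt), cost + 1, nxt))
    return None

# Shortest route across an open 3x3 grid of location points:
route = a_star([[True] * 3 for _ in range(3)], (0, 0), (2, 2))
```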
In a possible implementation manner, after the electronic device 100 controls the target object to move from the position 2 to the position 1, the electronic device 100 may acquire a screenshot 5 (which may also be referred to as a fourth image) at the current time, and calculate the similarity between the game screen in the screenshot 5 and the game screen in the screenshot 1. If the similarity is outside the specified threshold tolerance range, the electronic device 100 may output a control signal to the mechanical device (e.g., a manipulator) through the performance detection tool 2, and the electronic device 100 may receive an input of the mechanical device acting on the target object based on the control signal to control the target object to move a specified distance in a specified direction. Each time the target object moves the specified distance in the specified direction, the electronic device 100 may capture an image after the target object has moved, and compare the similarity between the game screen in the image and the game screen 1 in the screenshot 1, until the similarity is within the specified threshold range. For the operation process, reference may be made to the foregoing steps S203-S206, and details are not repeated here.
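The similarity comparison can be sketched with a normalized mean-absolute-difference metric; the embodiment does not specify the metric or the threshold, so both are assumptions here.

```python
def screen_similarity(img_a, img_b):
    """Similarity of two equally sized grayscale game screens (2-D
    lists of 0-255 values) as 1 - normalized mean absolute pixel
    difference; 1.0 means the screens are identical."""
    total, n = 0, 0
    for row_a, row_b in zip(img_a, img_b):
        for a, b in zip(row_a, row_b):
            total += abs(a - b)
            n += 1
    return 1.0 - total / (255.0 * n)

THRESHOLD = 0.95  # hypothetical tolerance for "same game screen"

def screens_match(img_a, img_b):
    """True when the two game screens are considered the same."""
    return screen_similarity(img_a, img_b) >= THRESHOLD

print(screen_similarity([[10, 10]], [[10, 10]]))  # -> 1.0
```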
S207, the electronic device 100 controls the viewing angle of the game screen displayed by the first game application to rotate.
Specifically, the electronic device 100 may output a control signal to an external mechanical device (e.g., a robot) through the performance detection tool 2. The electronic device 100 may receive an input operation of an external mechanical device (e.g., a manipulator) based on the control signal, and in response to the input operation, the electronic device 100 may control the rotation of the viewing angle of the game screen displayed by the first game application. When the game screen rotates by a specified angle (e.g., 1 degree) in a specified direction (e.g., clockwise), the electronic device 100 may compare the current game application interface image with the screenshot 1 to calculate the similarity, and when the calculation result is within a specified threshold range, the electronic device 100 may determine that the game screen at this time is the same as the game screen 1 in the screenshot 1.
For example, as shown in fig. 3H, the electronic device 100 may receive a drag operation of the manipulator on the target object 313, and in response to the drag operation, the electronic device 100 may control the viewing angle of the game screen displayed by the first game application to rotate. Each time the electronic device 100 controls the viewing angle to rotate by a specified angle (e.g., 1 degree) in a specified direction (e.g., clockwise), the electronic device 100 may compare the current game application interface image with the aforementioned screenshot 1 shown in fig. 3C for similarity calculation, and when the calculation result is within the specified threshold range, the electronic device 100 may determine that the game screen at this time is the same as the game screen 1 in the screenshot 1 shown in fig. 3C.
In one possible implementation, when the electronic device 100 controls the target object to move from the position 2 to the position 1, and the game screen displayed by the electronic device 100 in the game application interface 360 is the same as the game screen 1 shown in the screenshot 1 in fig. 3C, the electronic device 100 may perform step S208.
S208, when the game screen displayed by the first game application is the same as the game screen 1 in the screenshot 1, the performance detection tool 2 may obtain the second device parameter of the electronic device 100.
Specifically, when the game screen displayed by the first game application is the same as the game screen 1 in the screenshot 1, the performance detection tool 2 (e.g., the RenderDoc tool, the TraceView tool, the systrace tool, etc.) may obtain, based on the game screen, second device parameters (e.g., one or more of the game screen vertex count, shader complexity, texture resolution, etc.) of the electronic device 100 in the scene corresponding to the current game screen of the first game application, and perform further device performance analysis based on the obtained second device parameters.
For example, when the game screen displayed by the first game application is the same as the game screen 320 (which may also be referred to as the game screen 1) in the screenshot (which may also be referred to as the screenshot 1) shown in the embodiment of fig. 3C, the RenderDoc tool may obtain, based on the game screen, the game screen vertex count, texture resolution, and the like of the electronic device 100 in the scene corresponding to the current game screen of the first game application for further device performance analysis.
In some embodiments, the electronic device 200 may establish a communication connection with the electronic device 100 (e.g., the electronic device 200 and the electronic device 100 may establish a communication connection based on a universal serial bus). For the hardware structure of the electronic device 200, reference may be made to the hardware structure of the electronic device 100 shown in the foregoing embodiment of fig. 1, which is not described herein again. In the foregoing steps S201 to S202, when the performance detection tool 1 detects the first device parameter of the electronic device 100 in the process of running the first game application, the electronic device 200 may acquire the data information of the first device parameter sent by the electronic device 100 based on the communication connection. The electronic device 200 may display the data information of the first device parameter.
For example, as shown in fig. 4A, the electronic device 200 may establish a communication connection with the electronic device 100 (e.g., based on a universal serial bus). Take as an example that the first device parameter is the CPU temperature value during game running. The electronic device 100 runs the first game application and displays the game application interface 310. The performance detection tool 1 detects the CPU temperature value of the electronic device 100 during the running of the first game application. The electronic device 200 may acquire the CPU temperature value sent by the electronic device 100 based on the communication connection and display the data information of the CPU temperature value. For the description of the game application interface 310, reference may be made to the foregoing description of the embodiment shown in fig. 3B, which is not repeated herein. In the data information of the CPU temperature value displayed by the electronic device 200, the X axis is the running time of the first game application, and the Y axis is the temperature value of the CPU.
In some embodiments, electronic device 200 may establish a communication connection with electronic device 100 (e.g., electronic device 200 and electronic device 100 may establish a communication connection based on a universal serial bus). The hardware structure of the electronic device 200 may refer to the hardware structure of the electronic device 100 shown in the foregoing embodiment in fig. 1, and is not described herein again. When the performance detection tool 2 acquires the second device parameter, the electronic device 200 may acquire data information of the second device parameter transmitted by the electronic device 100 based on the communication connection. The electronic device 200 may display the data information of the second device parameter.
As shown in fig. 4B, when the electronic device 100 displays the game screen 320 (also referred to as game screen 1) in the screenshot (also referred to as screenshot 1) shown in the embodiment of fig. 3C, that is, displays the game application interface 310 shown in the embodiment of fig. 3B, the RenderDoc tool may capture the game screen and obtain vertex data of the game screen based on the captured game screen. The electronic device 200 may acquire the vertex data of the game screen sent by the electronic device 100 based on the communication connection. The electronic device 200 may display the vertex data of the game screen.
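Transferring the captured vertex data from electronic device 100 to electronic device 200 implies some serialization over the communication connection. The wire format below (a 4-byte count followed by three float32 coordinates per vertex) is purely an assumption for illustration; the patent does not specify one.

```python
import struct

def pack_vertices(vertices):
    """Serialize vertex positions: uint32 count, then x/y/z as float32 each."""
    buf = struct.pack("<I", len(vertices))
    for x, y, z in vertices:
        buf += struct.pack("<3f", x, y, z)
    return buf

def unpack_vertices(buf):
    """Inverse of pack_vertices, as electronic device 200 might decode it."""
    (count,) = struct.unpack_from("<I", buf, 0)
    out, offset = [], 4
    for _ in range(count):
        out.append(struct.unpack_from("<3f", buf, offset))
        offset += 12
    return out

frame_vertices = [(0.0, 0.0, 0.0), (1.0, 0.5, -2.0)]
received = unpack_vertices(pack_vertices(frame_vertices))
```

A round trip through `pack_vertices`/`unpack_vertices` reproduces the original positions, so device 200 can display the same vertex data the capture produced on device 100.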
In one possible implementation, the electronic device 200 may send data instructions to the electronic device 100 based on the communication connection to control the electronic device 100 to run the first game application.
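If the USB connection is managed through adb, the control instruction could take the form of an activity-manager launch command, as sketched below. The package and activity names are placeholders, and the actual `subprocess.run` call is left commented out because it requires a connected device; the patent does not prescribe this mechanism.

```python
import subprocess

def launch_game(package: str, activity: str) -> list:
    """Build the command electronic device 200 might issue to make
    electronic device 100 run the first game application."""
    cmd = ["adb", "shell", "am", "start", "-n", f"{package}/{activity}"]
    # subprocess.run(cmd, check=True)  # would execute against a connected device
    return cmd

cmd = launch_game("com.example.game", ".MainActivity")
```

The same channel could carry other data instructions, e.g., to close and re-run the application as in the claimed method.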
In some embodiments, the electronic device 100 may obtain multiple corresponding screenshots when the first device parameter (e.g., one or more of a frame rate of a game screen, a CPU load during game running, a CPU temperature during game running, and the like) is detected to be abnormal by the performance detection tool 1. The plurality of screenshots include screenshot 1 and screenshot 2. After the electronic device 100 executes the foregoing steps 203 to 208 based on screenshot 1 and acquires the second device parameter of the electronic device 100 in the game scene corresponding to the game screen in screenshot 1 for further device performance analysis, the electronic device 100 may also execute the foregoing steps 203 to 208 based on screenshot 2, and acquire the second device parameter of the electronic device 100 in the game scene corresponding to the game screen in screenshot 2 for further device performance analysis.
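Repeating steps 203 to 208 for each abnormal-frame screenshot amounts to a simple loop. In the sketch below, `reproduce_scene` and `read_second_parameter` are hypothetical stand-ins for the second performance detection tool's scene reproduction and parameter sampling; their bodies are placeholders, not the disclosed implementation.

```python
def reproduce_scene(screenshot: str) -> None:
    """Stand-in for steps 203-208: re-run the game and, via the mini-map to
    large-map position mapping, steer the target object back to the scene
    recorded in this screenshot."""
    pass  # placeholder; the real tool drives the game here

def read_second_parameter(screenshot: str) -> dict:
    """Stand-in for sampling the second device parameter in the reproduced
    scene (field names are illustrative only)."""
    return {"screenshot": screenshot, "vertex_count": 0}

def analyze_all(screenshots):
    """Run the reproduction-and-measurement cycle once per screenshot."""
    results = []
    for shot in screenshots:
        reproduce_scene(shot)
        results.append(read_second_parameter(shot))
    return results

results = analyze_all(["screenshot_1", "screenshot_2"])
```

Each entry in `results` would then feed the further device performance analysis described above.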
As used in the above embodiments, the term "when" may be interpreted to mean "if", "after", "in response to determining", or "in response to detecting", depending on the context. Similarly, depending on the context, the phrase "upon determining" or "if (a stated condition or event) is detected" may be interpreted to mean "if it is determined", "in response to determining", "upon detecting (the stated condition or event)", or "in response to detecting (the stated condition or event)".
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application are produced, in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
Those of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The foregoing storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.