Method for identifying lens contamination, processor, and household appliance

Document No.: 9347 Published: 2021-09-17

1. A method for identifying lens contamination, applied to a household appliance provided with an image acquisition device, the method comprising:

acquiring an image detected by the image acquisition device in real time;

performing edge detection on the image to obtain a detection result;

determining the number of edge pixel points in the image according to the detection result;

and determining the contamination state of the camera corresponding to the image according to the number of edge pixel points in the image.

2. The method for identifying lens contamination according to claim 1, wherein performing edge detection on the image to obtain the detection result comprises:

filtering the image;

enhancing the image after filtering processing;

acquiring the gradient magnitude of each pixel point in the enhanced image;

and obtaining the detection result according to the gradient magnitude of each pixel point.

3. The method for identifying lens contamination according to claim 2, wherein obtaining the detection result according to the gradient magnitude of each pixel point comprises:

comparing the gradient magnitude of each pixel point with a first preset threshold;

and determining a pixel point whose gradient magnitude is greater than the first preset threshold as an edge pixel point, as the detection result.

4. The method for identifying lens contamination according to claim 1, wherein determining the contamination state of the camera corresponding to the image according to the number of edge pixel points in the image comprises:

acquiring a ratio of the number of edge pixel points in the image to the number of all pixel points in the image;

and determining the contamination state of the camera corresponding to the image according to the ratio.

5. The method for identifying lens contamination according to claim 4, wherein determining the contamination state of the camera corresponding to the image according to the ratio comprises:

comparing the ratio with a second preset threshold;

and determining the contamination state of the camera corresponding to the image according to the comparison result.

6. The method for identifying lens contamination according to claim 5, wherein determining the contamination state of the camera corresponding to the image according to the ratio comprises:

and determining that the camera corresponding to the image is contaminated in a case where the ratio is smaller than the second preset threshold.

7. The method for identifying lens contamination according to claim 6, further comprising:

sending a cleaning prompt when the camera is determined to be contaminated.

8. The method for identifying lens contamination according to claim 1, further comprising:

determining the degree of contamination of the camera corresponding to the image according to the number of edge pixel points in the image.

9. The method for identifying lens contamination according to claim 1, wherein acquiring the image detected by the image acquisition device in real time comprises:

acquiring the image detected by the image acquisition device in real time before or after the household appliance is used.

10. A processor configured to perform the method for identifying lens contamination according to any one of claims 1 to 9.

11. A household appliance, characterized in that it comprises:

an image acquisition device configured to detect an image in real time; and

the processor of claim 10.

12. The household appliance of claim 11, wherein the household appliance comprises a range hood.

Background

In modern life, the kitchen occupies an increasingly important place in the home. With the trend toward intelligent living, kitchen appliances are becoming more intelligent and user-friendly. The range hood is an important kitchen appliance, and its degree of intelligence is representative of the degree of intelligence of the home as a whole.

Images are an effective and intuitive form of data and are used in many scenarios. Among them, range hoods based on image recognition are also gradually entering the market (for example, using images to recognize food ingredients, or recognizing cooking actions to help the cook complete the cooking process or to direct a cooking machine to complete it autonomously). In these intelligent scenarios, the image serves as the data source for these perception behaviors, and image quality directly affects the accuracy of the perception judgment. During cooking, oil smoke is inevitably generated at high temperature; it adheres to the surface of the camera lens, blurring the lens and degrading the quality of the images captured by the camera sensor. If the range hood system does not notice the dirty camera in time and continues to use the blurred, low-quality images as data for the intelligent-cooking recognition tasks, erroneous results will be produced.

However, in the related art there is no simple and quick way to identify whether the camera is dirty. There is therefore still considerable room for improvement in camera contamination recognition.

Disclosure of Invention

The object of the present invention is to provide a method for identifying lens contamination, a processor, and a household appliance.

In order to achieve the above object, a first aspect of the present invention provides a method for identifying contamination of a lens, which is applied to a home appliance provided with an image capture device, the method comprising:

acquiring an image detected by an image acquisition device in real time;

performing edge detection on the image to obtain a detection result;

determining the number of edge pixel points in the image according to the detection result;

and determining the dirty state of the camera corresponding to the image according to the number of the edge pixel points in the image.

In the embodiment of the present invention, performing edge detection on an image to obtain a detection result includes:

filtering the image;

enhancing the image after filtering processing;

acquiring the gradient amplitude of each pixel point in the enhanced image;

and obtaining a detection result according to the gradient amplitude of each pixel point.

In the embodiment of the present invention, obtaining the detection result according to the gradient amplitude of each pixel point includes:

comparing the gradient amplitude of each pixel point with a first preset threshold value;

and determining a pixel point whose gradient amplitude is greater than the first preset threshold as an edge pixel point, as the detection result.

In the embodiment of the present invention, determining the dirty state of the camera corresponding to the image according to the number of edge pixel points in the image includes:

acquiring the ratio of the number of edge pixel points in the image to the number of all pixel points in the image;

and determining the dirty state of the camera corresponding to the image according to the ratio.

In the embodiment of the present invention, determining the dirty state of the camera corresponding to the image according to the ratio includes:

comparing the ratio with a second preset threshold;

and determining the dirty state of the camera corresponding to the image according to the comparison result.

In the embodiment of the present invention, determining the dirty state of the camera corresponding to the image according to the ratio includes:

and judging that the camera corresponding to the image is dirty when the ratio is smaller than the second preset threshold.

In the embodiment of the present invention, the method further includes:

and when the camera is judged to be dirty, sending a cleaning prompt.

In the embodiment of the present invention, the method further includes:

and determining the dirt degree of the camera corresponding to the image according to the number of the edge pixel points in the image.

In the embodiment of the present invention, acquiring the image detected by the image acquisition device in real time includes:

acquiring the image detected by the image acquisition device in real time before or after the household appliance is used.

A second aspect of the invention provides a processor configured to perform the above-mentioned method for identifying a lens contamination.

A third aspect of the present invention provides a home appliance including:

an image acquisition device configured to detect an image in real time; and the above-mentioned processor.

In an embodiment of the invention, the household appliance comprises a range hood.

With the above technical solution, an image detected in real time is acquired; edge detection is performed on the image to obtain a detection result; the number of edge pixel points in the image is determined according to the detection result; and the contamination state of the camera corresponding to the image is determined according to the number of edge pixel points in the image. The embodiments of the invention can thus identify whether the camera is dirty using the image itself, by a recognition method that is simple and fast.

Additional features and advantages of embodiments of the invention will be set forth in the detailed description which follows.

Drawings

The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention without limiting the embodiments of the invention. In the drawings:

FIG. 1 is a schematic flow diagram of a method for identifying lens contamination in an embodiment of the present invention;

FIG. 2 is a schematic diagram of an edge detection process according to an embodiment of the present invention;

FIG. 3 is a schematic view of a contamination detection process of a camera according to an embodiment of the present invention;

FIG. 4 is a diagram of a hardware configuration of an electronic device according to an embodiment of the present invention;

FIG. 5 is an internal structure diagram of a computer device according to an embodiment of the present invention.

Detailed Description

The following detailed description of embodiments of the invention refers to the accompanying drawings. It should be understood that the detailed description and specific examples, while indicating embodiments of the invention, are given by way of illustration and explanation only, not limitation.

An embodiment of the present invention provides a method for identifying contamination of a lens, as shown in fig. 1, the method includes:

step 101: acquiring an image detected by an image acquisition device in real time;

step 102: performing edge detection on the image to obtain a detection result;

step 103: determining the number of edge pixel points in the image according to the detection result;

step 104: and determining the dirty state of the camera corresponding to the image according to the number of the edge pixel points in the image.

The method in this embodiment can be applied to a household appliance provided with an image acquisition device. Specifically, the household appliance may be a range hood, and the image acquisition device may be a camera. The method of this embodiment is described below taking a range hood as an example.

Specifically, the detection image in this embodiment may be obtained by shooting with a camera. There may be a plurality of cameras, which may be fixedly mounted on the range hood or connected to it through a connecting device. In actual application, the camera can shoot in real time to obtain the detection image.

Furthermore, the camera may be set to capture images before cooking and judge from the captured images whether the camera is dirty, or to capture images after cooking and make the same judgment. In addition, image acquisition may be performed at regular intervals, e.g. every two weeks, every month, and so on. In actual application, the interval duration may be set freely by the user according to need, set by the user following the interval duration recommended by the manufacturer, or preset on the device before it leaves the factory. Specifically, when the manufacturer recommends the interval duration, the recommendation may be based on the user's number of uses, frequency of use, and type of use. For example, when the user's number of uses is between zero and a first value, a first interval duration is recommended; between the first value and a second value, a second interval duration; between the second value and a third value, a third interval duration; and so on. As another example, when the user's usage type is mostly ordinary cooking, interval duration A is recommended; when it is mostly stir-frying, interval duration B is recommended.
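The tiered interval recommendation described above can be sketched as follows; the tier boundaries, durations, and usage-type names are illustrative assumptions, since the disclosure leaves the concrete values open:

```python
# Hypothetical sketch of the manufacturer-recommended check interval:
# tiers on the usage count, plus a usage-type override. All concrete
# numbers here are assumptions for illustration only.
def recommend_interval_days(usage_count, usage_type="cooking"):
    """Return a recommended lens-check interval in days."""
    if usage_count <= 50:        # zero up to a first value
        days = 30                # first interval duration
    elif usage_count <= 150:     # first value up to a second value
        days = 21                # second interval duration
    else:                        # beyond the second value
        days = 14                # third interval duration
    if usage_type == "stir-frying":  # heavier oil smoke: shorter interval
        days = min(days, 14)
    return days
```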

In practical application, in an embodiment, performing edge detection on the detection image to acquire the detection result includes:

filtering the detection image;

enhancing the filtered image;

acquiring the gradient amplitude of each pixel point in the enhanced detection image;

and obtaining a detection result by using the gradient amplitude of each pixel point in the detection image.

Specifically, the filtering processing removes noise from the detection image, eliminating the interference of that noise with the edge detection result and improving detection accuracy. Gaussian filtering may be employed for this filtering processing.
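As a minimal sketch of the Gaussian filtering step, the image (a list of rows of grayscale values) can be convolved with a small Gaussian kernel; the 3x3 kernel and integer arithmetic are simplifying assumptions, and border pixels are left unfiltered for brevity:

```python
# 3x3 Gaussian kernel; the weights sum to 16, so each output pixel is a
# weighted average of its neighborhood (suppressing isolated noise).
GAUSS_3X3 = [[1, 2, 1],
             [2, 4, 2],
             [1, 2, 1]]

def gaussian_filter(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy; border pixels stay unfiltered
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            acc = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    acc += GAUSS_3X3[dy + 1][dx + 1] * img[y + dy][x + dx]
            out[y][x] = acc // 16
    return out
```

A real implementation would pad the image so that border pixels are filtered too.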

In practical application, in order to strengthen the detection result and enhance the edge detection of the detection image, the filtered detection image can be enhanced, so that the differences between the gradient amplitudes of the pixel points in the detection image are increased. The edge points in the image, i.e. the edge contours, can then be detected effectively and quickly, and the image can be identified.

After the gradient amplitudes of the pixel points in the image have been enhanced, the enhanced gradient amplitudes can be used for the judgment, yielding the recognition detection result of the detection image.
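The gradient amplitudes can be computed, for example, with the Sobel operator; the disclosure does not name a specific operator, so Sobel and the L1 magnitude |Gx| + |Gy| are assumptions for illustration:

```python
# Sobel kernels for the horizontal and vertical intensity derivatives.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_magnitude(img):
    """Return per-pixel gradient magnitudes |Gx| + |Gy| (borders left 0)."""
    h, w = len(img), len(img[0])
    mag = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gy = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    p = img[y + dy][x + dx]
                    gx += SOBEL_X[dy + 1][dx + 1] * p
                    gy += SOBEL_Y[dy + 1][dx + 1] * p
            mag[y][x] = abs(gx) + abs(gy)
    return mag
```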

Specifically, in an embodiment, obtaining the detection result by using the gradient amplitude of each pixel point in the detection image includes:

comparing the gradient amplitude of each pixel point in the detection image with a first preset threshold value to obtain a comparison result of each pixel point in the detection image;

and obtaining the detection result of the whole detection image according to the comparison result of each pixel point in the detection image.

Specifically, the first preset threshold may be set empirically. Its setting affects the detection result of the detection image, so an appropriate value must be chosen to prevent an inaccurate setting from compromising the contamination detection of the camera.

In an embodiment, comparing the gradient amplitude of each pixel point in the detection image with the first preset threshold to obtain the comparison result of each pixel point includes:

judging whether the gradient amplitude of each pixel point in the detection image is larger than a first preset threshold value or not;

and under the condition that the gradient amplitude of the pixel point is larger than a first preset threshold value, taking the pixel point as an edge pixel point as a comparison result of the pixel point.

Specifically, since the pixel points on an edge contour in an image tend to have large gradient amplitudes, the gradient amplitude of each pixel point is compared with a preset threshold (which may be called the first preset threshold). If the gradient amplitude of a pixel point exceeds this threshold, the difference between its value and those of the surrounding pixel points is large, which may indicate an edge contour; such a pixel point can therefore be determined to be an edge pixel point of the detection image.

When the gradient amplitude of a pixel point is greater than the preset threshold, the pixel point is judged to be an edge pixel point; when it is smaller than the preset threshold, the pixel point is judged not to be an edge pixel point.

And determining the detection result of the whole detection image according to the comparison result of each pixel point in the detection image.
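The comparison described above reduces to marking each pixel point whose gradient magnitude exceeds the first preset threshold; a minimal sketch, with the threshold value left to the caller:

```python
# Sketch of the first-threshold comparison: a pixel point is an edge
# pixel point when its gradient magnitude exceeds the first preset
# threshold. Returns a same-shaped boolean map as the detection result.
def detect_edges(magnitudes, first_threshold):
    return [[m > first_threshold for m in row] for row in magnitudes]
```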

In an embodiment, determining the contamination state of the camera corresponding to the detected image according to the number of edge pixel points in the detected image includes:

acquiring the ratio of the number of edge pixel points in the detection image to the number of all pixel points in the detection image;

and determining the dirty state of the camera corresponding to the detected image according to the ratio.

Specifically, the number of edge pixel points among all pixel points in the image is counted, and the contamination state of the camera (also called the lens) is determined from this number. Further, the ratio of the number of edge pixel points in the picture to the number of all pixel points in the picture can be obtained; the ratio may be expressed as a fraction or as a percentage, and either form can be used to judge whether the lens is dirty. The specific representation of the ratio can vary and is not limited here.
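Counting and the ratio can be sketched as follows, operating on a boolean edge map such as the one produced in the comparison step:

```python
# Sketch: edge-pixel ratio = number of edge pixel points divided by the
# number of all pixel points; readable as a fraction or a percentage.
def edge_ratio(edge_map):
    total = sum(len(row) for row in edge_map)
    edges = sum(1 for row in edge_map for e in row if e)
    return edges / total
```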

After the ratio is obtained, the judgment is carried out by the following method:

in one embodiment, determining the contamination state of the camera corresponding to the detected image according to the ratio includes:

comparing the ratio with a second preset threshold;

and determining the contamination state of the camera corresponding to the detected image according to the comparison result.

Here, the judgment threshold (which may be understood as the second preset threshold) may be set empirically. Its setting affects the detection result of the detection image, so an appropriate value must be chosen to prevent an inaccurate setting from compromising the contamination detection of the camera. Specifically, the judgment threshold may be the ratio of edge pixel points to all pixel points measured on an image captured by the camera before the device has been used, or on an image captured by the camera just after the user has cleaned it. Of course, the ratio may also be obtained in other situations, which are not limited here.

Specifically, when the statistical ratio is smaller than a judgment threshold, judging that a camera corresponding to the picture is dirty; and when the statistical ratio is greater than or equal to the judgment threshold value, judging that the camera corresponding to the picture is clear.
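The decision rule above, with the second preset threshold assumed to be calibrated on a clean-lens reference image, can be sketched as:

```python
# Sketch of the second-threshold decision: a ratio below the threshold
# means few edges survived, i.e. the picture is blurred and the camera
# is judged dirty; at or above the threshold it is judged clear.
def lens_state(ratio, second_threshold):
    return "dirty" if ratio < second_threshold else "clear"
```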

In an embodiment, the method further includes:

sending a cleaning prompt when the camera is judged to be dirty.

A dirty camera affects the picture detection result, making the result inaccurate and degrading the user's experience. Therefore, when the camera is judged to be dirty, the user can be reminded to clean it in time, which both improves the user experience and safeguards the detection accuracy of the device.

In addition, besides determining the dirty state from the statistical ratio, the degree of contamination of the camera can also be determined from the statistical ratio.

For example, a plurality of contamination grades is preset, each grade corresponding to a range of ratios. When determining the degree of contamination from the ratio, once the statistical ratio is obtained, the grade corresponding to the range in which the ratio falls can be determined. The contamination grades may be divided into three levels (low, medium, and high), into four levels (first, second, third, and fourth), or in some other way.
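Mapping the ratio to a contamination grade can be sketched with the three-level (low/medium/high) division mentioned above; the range boundaries are illustrative assumptions, and a lower edge ratio corresponds to heavier contamination:

```python
# Sketch: each contamination grade corresponds to a ratio range
# (assumed boundaries). Fewer edges -> blurrier picture -> dirtier lens.
def contamination_grade(ratio):
    if ratio >= 0.04:
        return "low"
    elif ratio >= 0.02:
        return "medium"
    else:
        return "high"
```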

With the above technical solution, an image detected in real time is acquired; edge detection is performed on the image to obtain a detection result; the number of edge pixel points in the image is determined according to the detection result; and the contamination state of the camera corresponding to the image is determined according to the number of edge pixel points in the image. The embodiments of the invention can thus identify whether the camera is dirty using the image itself, by a recognition method that is simple and fast.

The present invention will be described in further detail with reference to the following application examples.

An intelligent kitchen and intelligent cooking are the development trend of future kitchens. The range hood is an important kitchen appliance and an important electric tool for cooking, and its degree of intelligence also determines the degree of intelligence of the whole kitchen. Images are used in many scenarios as an effective and intuitive form of data, and range hoods based on image recognition are gradually entering the market. Images are used to recognize food ingredients and to recognize cooking actions, in order to help the cook complete the cooking process or to direct a cooking machine to complete it autonomously. In these intelligent scenarios, the image serves as the data source for these perception behaviors, and image quality directly affects the accuracy of the perception judgment. During cooking, oil smoke is inevitably generated at high temperature and adheres to the surface of the camera lens, blurring the lens and degrading the quality of the images captured by the camera sensor. If the range hood system does not notice the dirty camera in time and continues to use the blurred, low-quality images as data for the intelligent-cooking recognition tasks, erroneous results will be produced. The key to avoiding this situation is to detect lens contamination promptly and to clean the lens surface in time, thereby ensuring that clear image data is collected.

Based on this, to address the problem of recognizing contamination of the range hood lens, this application embodiment adopts a simple image processing technique to identify whether the lens is dirty; compared with other camera contamination recognition methods, this method is simpler and faster. The application embodiment mainly uses the edge detection image processing technique to identify whether the lens is dirty, and can recognize lens contamination accurately and efficiently.

Specifically, the identification process of the present application embodiment is as follows:

First, the oil smoke generated during cooking is mainly a mixture of substances produced when edible oil decomposes at high temperature, together with oil and water. These substances are easily adsorbed onto the surface of the camera lens; once a certain amount has accumulated, the captured image becomes unclear and looks blurred, and if such a low-quality image is used as data to be processed, the image detection result becomes inaccurate. Edge detection can find the points in an image where the intensity changes drastically, and these points often reflect important objects or important information in the image (mostly the edge contours of objects). The degree of contamination of the camera can therefore be identified through edge detection.

Here, as shown in FIG. 2, edge detection includes the following steps:

step 10: inputting a picture, and then executing the step 20;

step 20: filtering, and then executing step 30;

Filtering removes the noise in the image. Edge detection is realized by taking the first or second derivative of the image intensity, and these derivatives are sensitive to noise, so the noise must be removed first.

Step 30: calculating gradient amplitude and direction of pixel points, and then executing step 40;

Here, the image may be subjected to enhancement processing. After enhancement, the intensity change at each point of the image within its neighborhood is determined, the intensity change being measured by calculating the gradient amplitude of the pixel values.

Step 40: judging whether the point is qualified or not by using a threshold value;

if the point is judged to be qualified, executing the step 50; if the point is judged to be unqualified, returning to execute the step 30;

The pixel values with larger gradients are screened by the threshold, selecting the pixel points that meet the criterion.

Step 50: this point is retained, after which step 60 is performed;

step 60: and outputting the result.

Specifically, during recognition, when the lens on the range hood is not dirty, the pictures taken by it are clear, and clear pictures contain more edge information. When the lens is dirty, a film of oil adheres to its surface, the pictures taken are unclear, and such blurred pictures contain few edge contours. Whether the current camera is dirty can therefore be judged through this characteristic.

Based on this, referring to FIG. 3, the identification process of the embodiment of the present application may specifically include the following steps:

step 100: inputting a picture, and then executing step 200;

step 200: gaussian filtering, followed by performing step 300;

and filtering the picture to obtain the filtered picture.

Step 300: edge detection, followed by performing step 400;

and carrying out edge detection on the filtered picture to obtain an edge detection result, and storing the result in a certain data form.

Step 400: counting the amount of edge information in the edge detection result, and then executing step 500;

and counting the number of the detected edge pixel points in the edge detection result.

Step 500: calculating the ratio of the amount of edge information to the number of pixel points in the picture, and then executing step 600;

and calculating the ratio of the number of the edge pixel points to the number of the pixel points of the whole picture.

Step 600: judging whether the ratio reaches the set threshold;

The ratio of the number of edge pixel points to the number of pixel points of the whole picture is compared with a set threshold. If the ratio is lower than the set threshold, the lens is judged to be dirty and in need of scrubbing; otherwise, the detection process is repeated.

If the ratio reaches the set threshold, return to execute step 100; if the ratio does not reach the set threshold, execute step 700;

step 700: and sending out a reminder.

According to the judgment result, a prompt is sent to the user by the range hood; that is, when the lens is dirty and needs scrubbing, the user is prompted to wipe the lens surface of the camera in time.
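The whole flow of steps 100 to 700 can be condensed into one sketch (the Gaussian filtering step is elided for brevity; the Sobel operator and both threshold values are illustrative assumptions):

```python
# End-to-end sketch of the FIG. 3 flow: edge detection on the picture,
# edge-pixel ratio, threshold comparison, and a cleaning reminder.
def check_lens(img, edge_thresh=100, ratio_thresh=0.05):
    h, w = len(img), len(img[0])

    def conv(y, x, k):  # 3x3 convolution at an interior pixel
        return sum(k[dy + 1][dx + 1] * img[y + dy][x + dx]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1))

    sx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # Sobel horizontal
    sy = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # Sobel vertical
    edges = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if abs(conv(y, x, sx)) + abs(conv(y, x, sy)) > edge_thresh:
                edges += 1
    ratio = edges / (h * w)
    # A low edge ratio means a blurred picture, i.e. a dirty lens.
    return "dirty: please clean the lens" if ratio < ratio_thresh else "clean"
```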

The embodiment of the application identifies whether the lens of the camera mounted on the range hood is dirty by a simple digital image processing method (namely, edge detection). The method is simple, efficient, and accurate; it solves the problem of camera contamination recognition well and paves the way for collecting high-quality images.

An embodiment of the present invention further provides a processor, where the processor is capable of implementing the method in any one of the above embodiments when executing a related instruction or a related command.

The embodiment of the invention also provides a household appliance comprising an image acquisition device capable of detecting images in real time. In addition, the household appliance further comprises the above processor, which can implement the method of any of the above embodiments when executing relevant instructions or commands.

Further, the household appliance comprises a range hood.

Embodiments of the present invention further provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the method for identifying lens contamination of any one of the above embodiments is implemented.

All the method processes in the above embodiments may be implemented by a plurality of functional modules. When these functional modules implement the above method, the processing may be allocated to different program modules as needed; that is, the internal structure of the terminal is divided into different program modules to complete all or part of the processing described above. In addition, the apparatus embodiments and the method embodiments provided above belong to the same concept; their specific implementation processes are described in detail in the method embodiments and are not repeated here.

Based on the hardware implementation of the program module, and in order to implement the method according to the embodiment of the present invention, an embodiment of the present invention further provides an electronic device, as shown in fig. 4, where the electronic device 400 includes:

a communication interface 401 capable of performing information interaction with other devices (such as network devices, terminals, and the like);

the processor 402 is connected with the communication interface 401 to realize information interaction with other devices, and is used for executing the method provided by one or more technical schemes when running a computer program;

a memory 403 for storing a computer program capable of running on the processor 402.

It should be noted that: the specific process of the processor 402 for performing the above operations is described in detail in the method embodiment, and is not described herein again.

Of course, in practice, the various components in the electronic device 400 are coupled together by a bus system 404. It is understood that the bus system 404 is used to enable communications among the components. The bus system 404 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 404 in FIG. 4.

The memory 403 in embodiments of the present invention is used to store various types of data to support the operation of the electronic device 400. Examples of such data include: any computer program for operating on the electronic device 400.

The method disclosed in the above embodiments of the present invention may be applied to the processor 402 or implemented by the processor 402. The processor 402 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 402. The processor 402 may be a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The processor 402 may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor or any conventional processor. The steps of the method disclosed in the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software modules may be located in a storage medium in the memory 403; the processor 402 reads the information in the memory 403 and performs the steps of the aforementioned methods in conjunction with its hardware.

In an exemplary embodiment, the electronic device 400 may be implemented by one or more Application-Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field-Programmable Gate Arrays (FPGAs), general-purpose processors, controllers, Micro-Controller Units (MCUs), microprocessors, or other electronic components for performing the foregoing methods.

It is to be understood that the memory 403 of the embodiments of the present invention may be volatile memory or non-volatile memory, and may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Ferromagnetic Random Access Memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a Compact Disc Read-Only Memory (CD-ROM); the magnetic surface memory may be disk storage or tape storage. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of illustration and not limitation, many forms of RAM are available, such as Static Random Access Memory (SRAM), Synchronous Static Random Access Memory (SSRAM), Dynamic Random Access Memory (DRAM), Synchronous Dynamic Random Access Memory (SDRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Enhanced Synchronous Dynamic Random Access Memory (ESDRAM), SyncLink Dynamic Random Access Memory (SLDRAM), and Direct Rambus Random Access Memory (DRRAM). The memory described in the embodiments of the present invention is intended to comprise, without being limited to, these and any other suitable types of memory.

In an exemplary embodiment, the present invention further provides a storage medium, i.e., a computer storage medium, in particular a computer-readable storage medium, for example, the memory 403 storing a computer program, where the computer program is executable by the processor 402 of the electronic device 400 to perform the steps of the aforementioned method. The computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disc, or CD-ROM.

In one embodiment, the processes in the above embodiments may also be implemented by a computer device. The present application further provides a computer device, which may be a terminal; its internal structure diagram may be as shown in fig. 5. The computer device includes a processor a01, a network interface a02, a display screen a04, an input device a05, and a memory (not shown in the figure) connected through a system bus. The processor a01 of the computer device is used to provide computing and control capabilities. The memory of the computer device includes an internal memory a03 and a non-volatile storage medium a06. The non-volatile storage medium a06 stores an operating system B01 and a computer program B02. The internal memory a03 provides an environment for the operation of the operating system B01 and the computer program B02 in the non-volatile storage medium a06. The network interface a02 of the computer device is used for communication with an external terminal through a network connection. The computer program is executed by the processor a01 to implement the method of any one of the above embodiments. The display screen a04 of the computer device may be a liquid crystal display screen or an electronic ink display screen, and the input device a05 of the computer device may be a touch layer covering the display screen, a button, trackball, or touch pad arranged on the casing of the computer device, or an external keyboard, touch pad, or mouse.

Those skilled in the art will appreciate that the structure shown in fig. 5 is merely a block diagram of a portion of the structure associated with the disclosed aspects and does not limit the computer device to which the disclosed aspects apply; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.

An embodiment of the present invention provides an apparatus, where the apparatus includes a processor, a memory, and a program stored in the memory and capable of being executed on the processor, and the processor implements the method according to any one of the above embodiments when executing the program.

As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.

The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.

These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.

In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.

The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.

Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, Phase-change Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.

It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.

The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.
