Display device and screen projection method
1. A display device for displaying screen projection data from a screen projection device, the display device comprising:
a display;
a fixing assembly for rotating the display to place the display in a landscape state or a portrait state;
a controller communicatively coupled with the display, the controller configured to:
start an RTSP (Real Time Streaming Protocol) interaction thread in response to receiving a screen projection signal, so that the display device and the screen projection device perform data interaction;
send an RTSP message containing a screen state identifier to the screen projection device, wherein the screen state identifier indicates the state of the display, and the RTSP message is used to negotiate a video format between the display device and the screen projection device; and
receive screen projection data encoded by the screen projection device according to the screen state identifier, and control the display to play the screen projection data.
2. The display device of claim 1, wherein, in sending the RTSP message containing the screen state identifier to the screen projection device, the controller is further configured to:
detect the current state of the display; and
add the screen state identifier to a video format negotiation field according to the state of the display.
3. The display device of claim 2, wherein, in adding the screen state identifier to the video format negotiation field according to the state of the display, the controller is further configured to:
when the display is in the landscape state, set a reserved data bit in the video format negotiation field to 0; and
when the display is in the portrait state, set the reserved data bit to 1.
4. The display device of claim 3, wherein the controller is further configured to:
use the last bit of data in the video format negotiation field to carry the screen state identifier.
5. A display device for sending screen projection data to a screen-projected device, the display device comprising:
a display;
a controller communicatively coupled with the display, the controller configured to:
send a screen projection signal to the screen-projected device, and start an RTSP (Real Time Streaming Protocol) interaction thread so that the display device and the screen-projected device perform data interaction;
receive an RTSP message containing a screen state identifier sent by the screen-projected device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display in the screen-projected device is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the display device and the screen-projected device;
when the display is in the landscape state, encode the screen projection data according to the landscape form of the maximum screen resolution supported by both the display device and the screen-projected device; when the display is in the portrait state, encode the screen projection data according to the portrait form of that maximum screen resolution; and
feed the encoded screen projection data back to the screen-projected device so that the screen-projected device plays the screen projection data.
6. The display device of claim 5, wherein the landscape resolution is a length-by-width resolution, and the portrait resolution is a width-by-length resolution.
7. A screen projection method, comprising:
in response to receiving a screen projection signal, starting an RTSP (Real Time Streaming Protocol) interaction thread so that a screen-projected device and a screen projection device perform data interaction;
sending an RTSP message containing a screen state identifier to the screen projection device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the screen-projected device and the screen projection device; and
receiving screen projection data encoded by the screen projection device according to the screen state identifier, and controlling the display in the screen-projected device to play the screen projection data.
8. The screen projection method of claim 7, wherein sending the RTSP message containing the screen state identifier to the screen projection device further comprises:
detecting the current state of the display in the screen-projected device; and
adding the screen state identifier to a video format negotiation field according to the state of the display.
9. The screen projection method of claim 8, wherein adding the screen state identifier to the video format negotiation field according to the state of the display further comprises:
when the display in the screen-projected device is in the landscape state, setting a reserved data bit in the video format negotiation field to 0; and
when the display in the screen-projected device is in the portrait state, setting the reserved data bit to 1.
10. A screen projection method, comprising:
sending a screen projection signal to a screen-projected device, and starting an RTSP (Real Time Streaming Protocol) interaction thread so that a screen projection device and the screen-projected device perform data interaction;
receiving an RTSP message containing a screen state identifier sent by the screen-projected device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the screen projection device and the screen-projected device;
when the display in the screen-projected device is in the landscape state, encoding the screen projection data according to the landscape form of the maximum screen resolution supported by both the screen projection device and the screen-projected device; when the display in the screen-projected device is in the portrait state, encoding the screen projection data according to the portrait form of that maximum screen resolution; and
feeding the encoded screen projection data back to the screen-projected device so that the screen-projected device plays the screen projection data.
Background
A display device is a television product that supports bidirectional human-machine interaction and integrates functions such as audio and video, entertainment, and data. To meet users' diverse needs, display devices are provided with various applications, such as screen projection, audio and video, and entertainment, and interact and exchange information with users through a user interface.
For screen projection applications, Miracast is an important way to share resources among display devices; it shares media resources through RTSP (Real Time Streaming Protocol). For example, when casting a screen with Miracast, a mobile phone acquires the video frames of the media resource to be shared, compresses and encodes them, and transmits them to a smart television. The smart television decompresses the received media data and displays it. At present, because the aspect ratio of a mobile phone screen does not match the aspect ratio of a smart television screen, the projected picture may be stretched and scaled during screen projection, resulting in a poor user experience.
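The RTSP exchange that Miracast builds on can be sketched as follows. This is a minimal, illustrative sketch: "wfd_video_formats" is the Wi-Fi Display capability parameter name, but the payload value passed in below is a placeholder, not a real capability string.

```python
# Build a minimal RTSP SET_PARAMETER request of the kind Miracast uses to
# communicate the negotiated video format. The CSeq value and the capability
# payload are illustrative placeholders.

def build_set_parameter(url: str, video_formats: str) -> str:
    """Build an RTSP SET_PARAMETER request carrying a video-format parameter."""
    body = f"wfd_video_formats: {video_formats}\r\n"
    return (
        f"SET_PARAMETER {url} RTSP/1.0\r\n"
        "CSeq: 4\r\n"
        "Content-Type: text/parameters\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )

msg = build_set_parameter("rtsp://localhost/wfd1.0", "placeholder-capability-string")
```

The sink replies with an RTSP 200 OK; the actual media then flows over a separate RTP session, which is outside this sketch.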
Disclosure of Invention
The present application provides a display device and a screen projection method to solve the technical problem in the prior art that the projected picture is stretched and scaled during screen projection, resulting in a poor user experience.
To solve this technical problem, embodiments of the present application disclose the following technical solutions:
In a first aspect, an embodiment of the present application discloses a display device for displaying screen projection data from a screen projection device, the display device comprising:
a display;
a fixing assembly for rotating the display to place the display in a landscape state or a portrait state;
a controller communicatively coupled with the display, the controller configured to:
start an RTSP (Real Time Streaming Protocol) interaction thread in response to receiving a screen projection signal, so that the display device and the screen projection device perform data interaction;
send an RTSP message containing a screen state identifier to the screen projection device, wherein the screen state identifier indicates the state of the display, and the RTSP message is used to negotiate a video format between the display device and the screen projection device; and
receive screen projection data encoded by the screen projection device according to the screen state identifier, and control the display to play the screen projection data.
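The first controller step above, starting an interaction thread when a screen projection signal arrives, can be sketched as follows. The function and thread names are illustrative assumptions, not the embodiment's actual implementation.

```python
# Minimal sketch: on receiving a cast signal, start a dedicated thread for
# the RTSP interaction with the screen projection device.

import threading

def rtsp_interaction():
    """Placeholder for the RTSP handshake and subsequent data interaction."""
    pass

def on_screen_projection_signal() -> threading.Thread:
    """Start the RTSP interaction thread in response to a screen projection signal."""
    t = threading.Thread(target=rtsp_interaction, name="rtsp-interaction", daemon=True)
    t.start()
    return t
```

Running the interaction on its own thread keeps the controller responsive to other user input while format negotiation and streaming proceed.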
In some embodiments, in sending the RTSP message containing the screen state identifier to the screen projection device, the controller is further configured to:
detect the current state of the display; and
add the screen state identifier to a video format negotiation field according to the state of the display.
In some embodiments, in adding the screen state identifier to the video format negotiation field according to the state of the display, the controller is further configured to:
when the display is in the landscape state, set a reserved data bit in the video format negotiation field to 0; and
when the display is in the portrait state, set the reserved data bit to 1.
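The reserved-bit scheme above can be sketched as follows, under the assumption that the video format negotiation field is handled as an integer bitmask and that the screen state identifier occupies the least-significant bit (the "last bit" variant); the bit position and names are illustrative.

```python
# Set or read one reserved bit of the negotiation field as the screen state
# identifier: 0 = landscape, 1 = portrait.

LANDSCAPE, PORTRAIT = 0, 1
SCREEN_STATE_BIT = 0  # assumed reserved bit position (least-significant bit)

def set_screen_state(field: int, state: int) -> int:
    """Clear the reserved bit for landscape; set it for portrait."""
    if state == PORTRAIT:
        return field | (1 << SCREEN_STATE_BIT)
    return field & ~(1 << SCREEN_STATE_BIT)

def get_screen_state(field: int) -> int:
    """Read the screen state identifier back out of the field."""
    return (field >> SCREEN_STATE_BIT) & 1
```

Using a reserved bit keeps the negotiation field backward compatible: a peer that ignores the bit simply sees an unchanged field.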
In a second aspect, an embodiment of the present application discloses a display device for sending screen projection data to a screen-projected device, the display device comprising:
a display;
a controller communicatively coupled with the display, the controller configured to:
send a screen projection signal to the screen-projected device, and start an RTSP (Real Time Streaming Protocol) interaction thread so that the display device and the screen-projected device perform data interaction;
receive an RTSP message containing a screen state identifier sent by the screen-projected device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display in the screen-projected device is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the display device and the screen-projected device;
when the display is in the landscape state, encode the screen projection data according to the landscape form of the maximum screen resolution supported by both the display device and the screen-projected device; when the display is in the portrait state, encode the screen projection data according to the portrait form of that maximum screen resolution; and
feed the encoded screen projection data back to the screen-projected device so that the screen-projected device plays the screen projection data.
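The orientation-aware encoding choice in the second aspect can be sketched as follows: take the maximum resolution both devices support and express it as width-by-height for landscape or height-by-width for portrait. The function name and the (width, height) tuple layout are assumptions for illustration.

```python
# Pick the encode resolution shared by source and sink, oriented to match
# the sink display's current state.

def negotiate_encode_resolution(source_max, sink_max, portrait: bool):
    """Return the common maximum resolution, swapped for portrait displays."""
    width = min(source_max[0], sink_max[0])
    height = min(source_max[1], sink_max[1])
    return (height, width) if portrait else (width, height)

# e.g. a phone supporting up to 1920x1080 casting to a TV supporting up to
# 3840x2160: landscape -> (1920, 1080), portrait -> (1080, 1920)
```

Encoding at the shared maximum avoids upscaling on either side, which is what prevents the stretched picture described in the Background.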
In a third aspect, an embodiment of the present application discloses a screen projection method, comprising:
in response to receiving a screen projection signal, starting an RTSP (Real Time Streaming Protocol) interaction thread so that a screen-projected device and a screen projection device perform data interaction;
sending an RTSP message containing a screen state identifier to the screen projection device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the screen-projected device and the screen projection device; and
receiving screen projection data encoded by the screen projection device according to the screen state identifier, and controlling the display in the screen-projected device to play the screen projection data.
In a fourth aspect, an embodiment of the present application discloses a screen projection method, comprising:
sending a screen projection signal to a screen-projected device, and starting an RTSP (Real Time Streaming Protocol) interaction thread so that a screen projection device and the screen-projected device perform data interaction;
receiving an RTSP message containing a screen state identifier sent by the screen-projected device, wherein the screen state identifier indicates the state of a display in the screen-projected device, the display is in a landscape state or a portrait state, and the RTSP message is used to negotiate a video format between the screen projection device and the screen-projected device;
when the display in the screen-projected device is in the landscape state, encoding the screen projection data according to the landscape form of the maximum screen resolution supported by both the screen projection device and the screen-projected device; when the display in the screen-projected device is in the portrait state, encoding the screen projection data according to the portrait form of that maximum screen resolution; and
feeding the encoded screen projection data back to the screen-projected device so that the screen-projected device plays the screen projection data.
Compared with the prior art, the beneficial effects of the present application are as follows:
When a user sends a screen projection signal to the display device through the screen projection device, the display device responds to the signal, and both devices start RTSP (Real Time Streaming Protocol) interaction threads. The display device sets a screen state identifier according to the state of the display controlled by the fixing assembly and sends the identifier to the screen projection device during video format negotiation. The screen projection device encodes the screen projection data according to the received screen state identifier and sends the data to the display device, which then plays it on the display. In the present application, the display device records whether its display is currently in the landscape state or the portrait state through the screen state identifier; the screen projection device obtains the display's state during the RTSP interaction and encodes the screen projection data to match it, so that the screen projection data is adapted to the state of the display, the resolution and definition of the screen projection data are ensured, and the user experience is improved.
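The flow summarized above can be condensed into a small end-to-end sketch, with the negotiation field simplified to an integer and all names chosen for illustration: the display device records its orientation in a reserved bit, and the screen projection device reads the bit and encodes to match.

```python
# Two-sided sketch of the negotiation: the sink (display device) writes its
# orientation into the reserved bit; the source (screen projection device)
# reads it and picks a matching encode resolution.

SCREEN_STATE_BIT = 0  # assumed reserved bit used as the screen state identifier

def sink_negotiation_field(portrait: bool, field: int = 0) -> int:
    """Display device side: record the display orientation in the reserved bit."""
    if portrait:
        return field | (1 << SCREEN_STATE_BIT)
    return field & ~(1 << SCREEN_STATE_BIT)

def source_pick_resolution(field: int, max_res=(1920, 1080)):
    """Screen projection device side: encode in an orientation that matches."""
    portrait = (field >> SCREEN_STATE_BIT) & 1
    width, height = max_res
    return (height, width) if portrait else (width, height)
```

Because the orientation travels inside the existing format negotiation, no extra round trip is added to the RTSP handshake.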
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To explain the technical solution of the present application more clearly, the drawings used in the embodiments are briefly described below. Obviously, those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to some embodiments;
Fig. 2 is a block diagram of the hardware configuration of the control apparatus 100 according to some embodiments;
Fig. 3 is a block diagram of the hardware configuration of the display device 200 according to some embodiments;
Fig. 4 is a schematic diagram of a software configuration in the display device 200 according to some embodiments;
Fig. 5a is an effect diagram of the display device 200 in a landscape state according to some embodiments;
Fig. 5b is an effect diagram of the display device 200 in a portrait state according to some embodiments;
Fig. 6 is a display effect diagram in which the screen projection data is stretched, according to some embodiments;
Fig. 7 is a flow diagram of a screen projection method according to some embodiments;
Fig. 8 is a flow diagram of another screen projection method according to some embodiments;
Fig. 9 is a timing diagram of a screen projection method according to some embodiments.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller. Communication between the remote controller and the display device includes infrared protocol communication, Bluetooth protocol communication, and other short-range communication methods, and the remote controller controls the display device 200 wirelessly or by wire. The user may input user instructions through keys on the remote controller, voice input, control panel input, and the like, to control the display device 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device 200 may also be controlled in a manner other than by the control apparatus 100 and the smart device 300. For example, a user's voice command may be received directly by a module configured inside the display device 200, or by a voice control device provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various content and interactions to the display device 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in Fig. 2, the control apparatus 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive a user's input operation instruction and convert it into an instruction that the display device 200 can recognize and respond to, serving as an intermediary for interaction between the user and the display device 200.
Fig. 3 shows a hardware configuration block diagram of the display apparatus 200 according to an exemplary embodiment.
In some embodiments, the display device 200 includes at least one of a tuner-demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments, the controller includes a processor, a video processor, an audio processor, a graphics processor, RAM, ROM, and first through nth input/output interfaces.
In some embodiments, the display 260 includes a display screen component for presenting pictures and a driving component for driving image display. The display 260 receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
In some embodiments, the display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection device with a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or servers according to various communication protocol types. For example, the communicator may include at least one of a Wi-Fi module, a Bluetooth module, a wired Ethernet module, other network communication protocol chips or near field communication protocol chips, and an infrared receiver. Through the communicator 220, the display device 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400.
In some embodiments, the user interface may be configured to receive control signals from the control apparatus 100 (e.g., an infrared remote control).
In some embodiments, the detector 230 is used to collect signals from the external environment or signals of interaction with the outside. For example, the detector 230 includes a light receiver, i.e., a sensor for collecting the intensity of ambient light; or the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures; or the detector 230 includes a sound collector, such as a microphone, for receiving external sounds.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface (Component), a composite video input interface (CVBS), a USB input interface (USB), an RGB port, and the like. It may also be a composite input/output interface formed by the above interfaces.
In some embodiments, the tuner-demodulator 210 receives broadcast television signals via wired or wireless reception, and demodulates audio/video signals and EPG data signals from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the tuner-demodulator 210 may be located in separate devices; that is, the tuner-demodulator 210 may also be located in a device external to the main device containing the controller 250, such as an external set-top box.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of the selectable objects, such as a hyperlink, an icon, or another actionable control. The operations related to the selected object include displaying the page, document, or image linked by the hyperlink, or launching the program corresponding to the icon.
In some embodiments, the controller includes at least one of a central processing unit (CPU), a video processor, an audio processor, a graphics processing unit (GPU), random access memory (RAM), read-only memory (ROM), first through nth input/output interfaces, a communication bus, and the like.
The CPU is used to execute the operating system and application program instructions stored in the memory, and to execute various applications, data, and content according to interaction instructions received from external input, so as to finally display and play various audio and video content. The CPU may include multiple processors, e.g., a main processor and one or more sub-processors.
In some embodiments, the graphics processor is used to generate various graphics objects, such as icons, operation menus, and graphics for displaying user input instructions. The graphics processor includes an arithmetic unit, which performs operations in response to the various interactive instructions input by the user and displays various objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, the video processor is configured to receive an external video signal and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played directly on the display device 200.
In some embodiments, the video processor includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like. The demultiplexing module demultiplexes the input audio/video data stream. The video decoding module processes the demultiplexed video signal, including decoding and scaling. The image synthesis module, e.g., a graphics generator, superimposes and mixes the GUI signal input by or generated for the user with the scaled video image, to produce an image signal for display. The frame rate conversion module converts the frame rate of the input video. The display formatting module converts the frame-rate-converted video output signal into a signal conforming to the display format, such as an output RGB data signal.
In some embodiments, the audio processor is configured to receive an external audio signal, decompress and decode it according to the standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification to obtain an audio signal that can be played through a speaker.
In some embodiments, a user may enter user commands through a graphical user interface (GUI) displayed on the display 260, and the user input interface receives the commands through the GUI. Alternatively, the user may input commands through a specific sound or gesture, and the user input interface receives the commands by recognizing the sound or gesture through a sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user; it enables conversion between an internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is the graphical user interface (GUI), i.e., a user interface related to computer operations that is displayed graphically. It may be an interface element such as an icon, a window, or a control displayed on the display screen of an electronic device, where controls may include visual interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
In some embodiments, the system of a display device may include a kernel, a command parser (shell), a file system, and application programs. The kernel, shell, and file system together make up the basic operating system structure that allows users to manage files, run programs, and use the system. After power-on, the kernel is started, the kernel space is activated, hardware is abstracted, hardware parameters are initialized, and virtual memory, the scheduler, signals, and inter-process communication (IPC) are run and maintained. After the kernel starts, the shell and user applications are loaded. After an application is started, it is compiled into machine code, forming a process.
Referring to Fig. 4, in some embodiments the system is divided into four layers: from top to bottom, an application (Applications) layer ("application layer"), an application framework (Application Framework) layer ("framework layer"), an Android runtime and system library layer ("system runtime library layer"), and a kernel layer.
In some embodiments, at least one application program runs in the application layer. These applications may be window programs carried by the operating system, system setting programs, clock programs, and the like, or applications developed by third-party developers. In particular implementations, the application packages in the application layer are not limited to the above examples.
The framework layer provides an Application Programming Interface (API) and a programming framework for the applications. The application framework layer includes a number of predefined functions and acts as a processing center that determines the actions of the applications in the application layer. Through the API interface, an application program can access system resources and obtain system services during execution.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigation fallback functions, such as controlling the exit, opening, and fallback of applications. The window manager is used to manage all window programs, for example obtaining the size of the display screen, judging whether there is a status bar, locking the screen, capturing the screen, and controlling changes to the display window (for example, shrinking the window, or applying shake or distortion effects), and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer. When the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions required by the framework layer.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: an audio driver, a display driver, a Bluetooth driver, a camera driver, a Wi-Fi driver, a USB driver, an HDMI driver, sensor drivers (such as fingerprint, temperature, and pressure sensors), a power driver, and the like.
The hardware or software architecture in some embodiments may be based on the description in the above embodiments, or in some embodiments on other similar hardware or software architectures, as long as the technical solution of the present application can be implemented.
In some embodiments, the display 260 of the display device 200 is configured to display a user interface, images, text, video, etc., the controller 250 of the display device 200 is configured to provide the user interface, images, text, video, etc., to the display 260, and the controller 250 may control the fixing component, thereby implementing the rotation of the display 260 through the fixing component to switch the display device 200 between the landscape state and the portrait state.
In some embodiments, the fixing component is fixed on the back of the display 260, the fixing component is used for fixing with a wall surface, and the fixing component receives the control of the controller 250, so that the display 260 rotates in a vertical plane, and the display 260 is in different screen states. The screen state includes a landscape state and a portrait state, and fig. 5a schematically illustrates an effect of the landscape state of the display apparatus 200 according to some embodiments, and in conjunction with fig. 5a, the landscape state refers to a state in which the length of the display 260 in the horizontal direction is greater than the length of the display 260 in the vertical direction when viewed from the front of the display 260. Fig. 5b schematically illustrates an effect of the vertical screen state of the display device 200 according to some embodiments, and as shown in fig. 5b, the vertical screen state refers to a state in which the length of the display 260 in the horizontal direction is smaller than the length of the display 260 in the vertical direction when viewed from the front of the display 260. Wherein vertical in this application means substantially vertical.
Based on the display device 200, a user can download a screen projection application from the application center of the display device 200. The screen projection application may be an application program based on the Miracast (wireless display) function: Miracast transmits the screen projection data of the screen projection device to the display device 200 for playback using the Wi-Fi Point-to-Point (P2P) network transmission function, thereby achieving resource sharing. The screen projection data may be audio and video data, and the screen projection device may be a terminal device such as a smartphone, tablet computer, or computer. For the display device 200 with the rotatable display 260, when projecting from a portrait-oriented device such as a smartphone or tablet computer, if the screen states of the projecting device and the display device 200 are inconsistent, the projected image is stretched and scaled when the display device 200 plays it. As shown in fig. 6, the screen projection data is stretched, resulting in a poor user experience.
To solve the above problems, the present application provides a display device and a screen projection method in some embodiments. It should be noted that the display device in this application may refer not only to a smart television but also to a computer, a tablet computer, and the like. In addition, the display device 200 may serve as the screen projection device at the projecting end, or as the display device at the projected end.
The screen projection process provided by the embodiment of the present application is first described, with reference to the drawings, from the perspective of the display device at the projected end.
A flowchart of a screen projection method according to some embodiments is illustrated in fig. 7. In some embodiments, the application provides a display device which is a screen-projected device and is used for displaying screen projection data from the screen projection device. The display device 200 includes a display 260 and a controller 250, the controller 250 being communicatively coupled to the display 260 and configured to perform the screen projection process of fig. 7. With reference to fig. 7, the screen projection process is as follows:
s701: and responding to the received screen projection signal, and starting an RTSP interaction thread to enable the display equipment and the screen projection equipment to carry out data interaction.
In some embodiments, a user may input a screen projection signal at a screen projection device, wherein the screen projection device is provided with a screen projection application, and the screen projection application is based on a Miracast screen projection function. For example, a screen projection control is arranged on the screen projection device, and a user clicks or selects the screen projection control through touch control to start a Miracast screen projection function. After the Miracast screen projection function is started, the screen projection device sends a screen projection signal to the display device 200, the display device 200 responds to the screen projection signal, starts the Miracast screen projection function of the display device, and enables the display device and the screen projection device to perform data interaction based on the Miracast.
In some embodiments, when the display device 200 starts the Miracast screen projection function, it starts its own P2P network. Likewise, the screen projection device starts its own P2P network when its Miracast screen projection function is started. Based on the P2P network, the display device 200 establishes a connection with the screen projection device, and the two ends interact over the network to obtain each other's IP addresses.
In some embodiments, Miracast uses the WFD (Wi-Fi Display) interaction protocol to implement media asset sharing when implementing the screen projection function. The WFD interaction protocol uses the RTSP (Real Time Streaming Protocol) to implement capability negotiation between the screen projection device and the display device. After the P2P network connection is established, the display device 200 and the screen projection device each start an RTSP interaction thread and enter a handshake interaction phase, implementing data interaction between the display device 200 and the screen projection device.
S702: and sending an RTSP message containing a screen state identifier to the screen projection equipment, wherein the screen state identifier is used for indicating the state of the display, and the RTSP message is used for negotiating a video format between the display equipment and the screen projection equipment.
In some embodiments, during RTSP-based data interaction, when both ends are in the M3 stage, the screen projection device and the display device define certain capability parameters that the two ends need to negotiate. One of these is the capability negotiation of the video format (wfd_video_formats); during this negotiation, the resolution, frame rate, and other properties of each frame image in the screen projection data to be played are determined. The video format negotiation field is a string of consecutive hexadecimal values of different lengths, in which the different parameters are separated by spaces.
For example, the negotiation fields for the video format are as follows:
00 01 01 08 00000001 00000000 00000003 00 0000 0000 13 none none
The resolution and frame rate of the projection data are negotiated through some of these parameters. For example, the parameter 00000001 describes the image compression and encoding standards supported per the CEA (Consumer Electronics Association), and 00000000 describes those supported per VESA (Video Electronics Standards Association). Parameters of international video coding standards such as CEA and VESA negotiate the image compression resolution and video frame rate of the two ends. The CEA and VESA parameters are each 4 bytes, i.e., 32 bits represented in hexadecimal; each data bit has a specific standard meaning, where 0 indicates unsupported and 1 indicates supported. Table 1 shows the resolution and frame rate represented by each data bit supported by CEA.
Table 1:

Data bit   Index   Description
0          0       640*480p60
1          1       720*480p60
2          2       720*480i60
3          3       720*576p50
4          4       720*576i50
5          5       1280*720p30
6          6       1280*720p60
7          7       1920*1080p30
8          8       1920*1080p60
9          9       1920*1080i60
10         10      1280*720p25
11         11      1280*720p50
12         12      1920*1080p25
13         13      1920*1080p50
14         14      1920*1080i50
15         15      1280*720p24
16         16      1920*1080p24
31:17      -       Reserved
In Table 1, for example, the 6th bit represents a resolution of 1280*720 at a frame rate of 60. If that data bit is 1, the display device supports 1280*720 at a frame rate of 60; if it is 0, it does not. The other standards are similar and are not described again here. If multiple parameters support different resolutions and frame rates, the highest resolution and frame rate are selected.
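The bit-to-mode mapping of Table 1 can be illustrated with a short sketch. The following Python snippet is a hypothetical helper (not part of the embodiments themselves) that decodes the CEA parameter of a wfd_video_formats field into the display modes whose bits are set:

```python
# Hypothetical decoder for the CEA parameter of a wfd_video_formats field,
# using the bit-to-mode mapping of Table 1 (bits 17-31 are reserved).
CEA_MODES = {
    0: "640*480p60",    1: "720*480p60",    2: "720*480i60",
    3: "720*576p50",    4: "720*576i50",    5: "1280*720p30",
    6: "1280*720p60",   7: "1920*1080p30",  8: "1920*1080p60",
    9: "1920*1080i60",  10: "1280*720p25",  11: "1280*720p50",
    12: "1920*1080p25", 13: "1920*1080p50", 14: "1920*1080i50",
    15: "1280*720p24",  16: "1920*1080p24",
}

def decode_cea(hex_field: str) -> list:
    """List the display modes whose data bits are set in the 8-digit hex field."""
    bits = int(hex_field, 16)
    return [mode for bit, mode in CEA_MODES.items() if bits & (1 << bit)]

print(decode_cea("00000001"))  # -> ['640*480p60']
print(decode_cea("00000100"))  # -> ['1920*1080p60']
```

A device advertising several modes would simply set several bits, and the receiver would pick the highest mode both ends decode in common.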
In some embodiments, during the M3 interaction between the display device 200 and the screen projection device, the display device 200 informs the screen projection device of all the resolutions and frame rates it supports in the form of an RTSP message, and after receiving the capability parameters of the display device 200, the screen projection device selects the optimal maximum value for encoding and compression. For example, the display device 200 may support the CEA standard at 1920*1080 and 60 frames/s; by the downward compatibility characteristic, the corresponding lower CEA resolutions are also supported, such as 1920*1080 at 30 frames/s, 1280*720 at 60 frames/s, and 1280*720 at 30 frames/s. When the screen projection device receives the capability parameters of the display device 200, it selects one of them and then notifies the display device 200 of the result, which the display device 200 confirms after receiving it.
In some embodiments, when sending the RTSP message for negotiating the video format between the display device and the screen projection device, the display device 200 first detects the current state of the display 260, i.e., determines whether the display 260 is currently in the landscape state or the portrait state. It then determines a screen state identifier according to the current state of the display 260 and adds the screen state identifier to the video format negotiation field, indicating the state of the display through the screen state identifier.
In some embodiments, the controller 250 controls any reserved data bit identifier in the video format negotiation field to be 0 when the display 260 is in the landscape state, and controls it to be 1 when the display 260 is in the portrait state. In the video format negotiation field, each international video coding standard parameter has 32 data bits, among which are data bits that are not yet developed and utilized, i.e., reserved data bits. It should be noted that, in another implementation, since one data bit has two states (0 or 1), 0 may instead represent the portrait state and 1 the landscape state, as long as the negotiation definitions at both the display device 200 and the screen projection device ends are consistent; this is not limited here.
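Setting and clearing the reserved bit described above reduces to simple bit operations. The sketch below is hypothetical (function and constant names are assumptions), and it assumes the last bit, bit 31, is the reserved bit the two ends agreed on, with 1 meaning portrait:

```python
# Hypothetical sketch of marking the display orientation in a reserved CEA
# data bit. Bit 31 (the last bit) is assumed as the agreed screen state flag.
PORTRAIT_BIT = 31  # reserved bit chosen by both ends; 1 = portrait, 0 = landscape

def set_screen_state(cea_field: str, portrait: bool) -> str:
    """Return the 8-digit hex CEA parameter with the screen state bit set/cleared."""
    value = int(cea_field, 16)
    if portrait:
        value |= 1 << PORTRAIT_BIT    # set the flag for the portrait state
    else:
        value &= ~(1 << PORTRAIT_BIT)  # clear it for the landscape state
    return f"{value:08X}"

print(set_screen_state("000000A0", portrait=True))   # -> 800000A0
print(set_screen_state("800000A0", portrait=False))  # -> 000000A0
```

Note that 000000A0 with the flag set becomes 800000A0, the CEA value used in the portrait-state message example later in this description.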
In some embodiments, as video resolutions are upgraded over time, the data bits in the international video coding standard parameters will gradually be assigned to newly supported resolutions. Therefore, the last data bit in the video format negotiation field can be reserved for carrying the screen state identifier.
In some embodiments, after receiving the RTSP message, the screen projection device only needs to add a judgment statement to the original code that parses and determines the resolution, judging whether the last data bit in the video format negotiation field is 1 or 0. If it is 1, the display of the display device is known to be in the portrait state; if it is 0, it is known to be in the landscape state.
For example, when the display device 200 detects that the current display state is the vertical screen state, the RTSP message sent to the screen projection device is as follows:
wfd_video_formats: 00 01 01 02 800000A0 00000000 0000000 00 0000 0000 00 none none
In this message, 800000A0 is the CEA parameter describing the supported image compression and encoding standards; converted to binary, 800000A0 is 10000000000000000000000010100000. Counting from right to left, the 6th, 8th, and 32nd bits are 1 and the others are 0. Referring to Table 1, the 6th and 8th bits represent 1280*720 p60 and 1920*1080 p60, respectively, and the 32nd bit indicates that the current display state is portrait. After receiving the message, the screen projection device replies to the display device as follows:
wfd_video_formats: 00 01 01 02 00000080 00000000 0000000 00 0000 0000 00 none none
From the message returned by the screen projection device, it can be seen that the screen projection device has chosen 1920*1080 p60, represented by the 8th bit, to encode and compress the screen projection data.
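The judgment the screen projection device applies to the received CEA parameter can be sketched as follows. The function name is hypothetical, and bit positions are counted from the right starting at 1, as in the message analysis above:

```python
# Hypothetical sketch of the receiver-side judgment: read the screen state
# flag from the 32nd bit of the CEA parameter, then find the highest
# capability bit to select for encoding.
def parse_cea(cea_field: str) -> tuple:
    value = int(cea_field, 16)
    portrait = bool(value & (1 << 31))  # 32nd bit from the right: screen state
    payload = value & ~(1 << 31)        # remaining resolution/frame-rate bits
    highest = payload.bit_length()      # highest set bit, counted from 1
    return ("portrait" if portrait else "landscape", highest)

# The M3 message above: portrait state, highest supported mode is the 8th bit
print(parse_cea("800000A0"))  # -> ('portrait', 8)
```

Selecting the 8th bit matches the reply message above, in which only that bit (00000080) remains set.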
S703: and receiving screen projection data coded by the screen projection equipment according to the screen state identification, and controlling a display to play the screen projection data.
In some embodiments, after learning from the RTSP message the current state of the display on the display device side and the video capabilities supported by the display device 200, the screen projection device preferentially selects the resolution and frame rate and encodes and compresses the screen projection data according to the screen state identifier. It then sends the compressed screen projection data to the display device 200, which receives the data and controls the display to play it.
In order to further explain the screen projection process in the present application, an embodiment of the present application further provides a display device, and the screen projection process provided in the embodiment is described from the projecting-end display device in combination with the accompanying drawings.
A flowchart of another screen projection method according to some embodiments is illustrated in fig. 8. In some embodiments, the application provides a display device which is a screen projection device and is used for sending screen projection data to a screen-projected device. The display device 200 includes a display 260 and a controller 250, the controller 250 being communicatively coupled to the display 260 and configured to perform the screen projection process of fig. 8. With reference to fig. 8, the screen projection process is as follows:
s801: and sending a screen projection signal to the screen projected equipment, and starting an RTSP (real time streaming protocol) interaction thread to enable the display equipment to perform data interaction with the screen projected equipment.
In some embodiments, the display device 200 installs a screen projection application based on the Miracast screen projection function. For example, a screen projection control is arranged on the display device, and the user clicks or selects the screen projection control by touch to start the Miracast screen projection function. After the Miracast screen projection function is started, the display device 200 sends a screen projection signal to the screen-projected device and starts its own Miracast screen projection function; based on Miracast, the devices at both ends each start an RTSP interaction thread and enter a handshake interaction stage to implement data interaction.
S802: receiving RTSP (real time streaming protocol) messages which are sent by the screen-projected device and contain screen state identification, wherein the screen state identification is used for indicating the state of a display in the screen-projected device, the display in the screen-projected device is in a horizontal screen state or a vertical screen state, and the RTSP messages are used for negotiating the video format between the display device and the screen-projected device.
In some embodiments, before negotiating the video format capability with the display device 200, the screen-projected device checks whether the display is in the landscape screen state or the portrait screen state, and marks the display in the RTSP message through the screen state identifier. After receiving the RTSP message, the display device 200 obtains the state of the display of the current screen-projected device and the video capability supported by the screen-projected device, and then selects the resolution and the frame rate preferentially.
S803: and when the display is in the horizontal screen state, encoding the maximum screen resolution supported by the display equipment and the projected screen equipment according to the horizontal screen resolution.
In some embodiments, after the display device 200 receives the RTSP message, the state of the display of the screen-projected device is determined according to the screen state identifier in the message, and the maximum screen resolution supported by both the display device and the screen-projected device is selected. Once the maximum resolution is determined, if the display of the screen-projected device is in the landscape state, encoding is performed according to the landscape resolution. The landscape resolution is length by width; for example, if the maximum resolution supported by both ends is 1920*1080, encoding and compression are performed at 1920*1080.
S804: and when the display is in the vertical screen state, encoding the maximum screen resolution supported by the display equipment and the projected screen equipment according to the vertical screen resolution.
In some embodiments, after the display device 200 receives the RTSP message, the state of the display of the screen-projected device is determined according to the screen state identifier in the message, and the maximum screen resolution supported by both the display device and the screen-projected device is selected. Once the maximum resolution is determined, if the display of the screen-projected device is in the portrait state, encoding is performed according to the portrait resolution. The portrait resolution is width by length; for example, if the maximum resolution supported by both ends is 1920*1080, the length and width are swapped and encoding and compression are performed at 1080*1920.
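The choice between S803 and S804 reduces to swapping the two dimensions of the negotiated maximum resolution. A minimal sketch, with hypothetical function and parameter names:

```python
# Minimal sketch of the S803/S804 choice: encode at the negotiated maximum
# resolution, swapping length and width when the remote display is portrait.
def encode_resolution(max_length: int, max_width: int, portrait: bool) -> tuple:
    """Return the (horizontal, vertical) dimensions to encode at."""
    return (max_width, max_length) if portrait else (max_length, max_width)

print(encode_resolution(1920, 1080, portrait=False))  # -> (1920, 1080)
print(encode_resolution(1920, 1080, portrait=True))   # -> (1080, 1920)
```

For the 1920*1080 example above, landscape encodes at 1920*1080 and portrait at 1080*1920, matching the two steps.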
S805: feeding back the encoded screen projection data to the screen projection equipment so that the screen projection equipment plays the screen projection data.
In some embodiments, the display device 200 sends the compressed screen projection data to the screen-projected device, and the screen-projected device receives the screen projection data and controls its display to play it.
The above-described screen projection process is further described below with reference to the drawings.
A timing diagram of a screen projection method according to some embodiments is illustrated in fig. 9. As shown in fig. 9, a user starts the screen projection application installed on the screen projection device and triggers the screen projection function; the screen projection device sends a screen projection signal to the screen-projected device and establishes a TCP connection with it. The screen projection device and the screen-projected device each start an RTSP thread to carry out capability negotiation. The screen-projected device detects the current state of its display and records it, through the screen state identifier, in the video format negotiation field shared by the two ends. The screen-projected device sends the RTSP message containing the screen state identifier to the screen projection device; according to the RTSP message, the screen projection device selects the maximum resolution and frame rate supported by both devices and obtains the screen state identifier of the screen-projected device. After the screen projection device confirms that the resolution and frame rate are suitable, it encodes the screen projection data according to the screen state identifier: if landscape is indicated, it encodes at the landscape resolution; if portrait is indicated, it encodes at the portrait resolution. The screen projection device then sends the encoded screen projection data to the screen-projected device, which receives and decodes the screen projection data and displays it on its display.
In the present application, the screen-projected device records, through the screen state identifier, whether its display is currently in the landscape or portrait state; the screen projection device acquires the state of the display in the screen-projected device during the RTSP interaction and encodes the screen projection data to match it, so that the screen projection data is adapted to the state of the display, the resolution and definition of the screen projection data are ensured, and the user experience is improved.
Corresponding to the above-described projected-end display device, an embodiment of the present application further provides a screen projection method, including: the screen-projected device starts an RTSP (Real Time Streaming Protocol) interaction thread in response to a screen projection signal, so that the screen-projected device and the screen projection device perform data interaction. The screen-projected device sends an RTSP message containing a screen state identifier to the screen projection device, wherein the screen state identifier is used for indicating the state of the display in the screen-projected device, the display being in the landscape state or the portrait state, and the RTSP message is used for negotiating the video format between the screen-projected device and the screen projection device. The screen-projected device receives screen projection data encoded by the screen projection device according to the screen state identifier, and controls the display in the screen-projected device to play the screen projection data.
In some embodiments, in sending the RTSP message containing the screen state identifier to the screen projection device, the method further comprises: the screen-projected device detects the current state of its display and adds the screen state identifier to the video format negotiation field according to the state of the display.
In some embodiments, in adding the screen state identifier to the video format negotiation field according to the state of the display, the method further comprises: when the display in the screen-projected device is currently in the landscape state, the screen-projected device controls any reserved data bit identifier in the video format negotiation field to be 0; when the display is currently in the portrait state, the screen-projected device controls any reserved data bit identifier in the video format negotiation field to be 1.
Corresponding to the projecting-end display device, an embodiment of the present application further provides a screen projection method, comprising the following steps: the screen projection device sends a screen projection signal to the screen-projected device and starts an RTSP interaction thread so that the screen projection device and the screen-projected device perform data interaction. The screen projection device receives an RTSP message containing a screen state identifier sent by the screen-projected device, wherein the screen state identifier is used for indicating the state of the display in the screen-projected device, the display being in the landscape state or the portrait state, and the RTSP message is used for negotiating the video format between the screen projection device and the screen-projected device. When the display in the screen-projected device is in the landscape state, the screen projection device encodes the maximum screen resolution supported by both devices according to the landscape resolution. When the display in the screen-projected device is in the portrait state, the screen projection device encodes the maximum screen resolution supported by both devices according to the portrait resolution. The screen projection device feeds the encoded screen projection data back to the screen-projected device so that the screen-projected device plays the screen projection data.
Since the above embodiments are all described by referring to and combining with other embodiments, the same portions are provided between different embodiments, and the same and similar portions between the various embodiments in this specification may be referred to each other. And will not be described in detail herein.
It is noted that, in this specification, relational terms such as "first" and "second," and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a circuit structure, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such circuit structure, article, or apparatus. Without further limitation, the presence of an element identified by the phrase "comprising an … …" does not exclude the presence of other like elements in a circuit structure, article, or device comprising the element.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
The above embodiments of the present application do not limit the scope of the present application.