US 12,225,177 B2
Screen detection method, apparatus and device, computer program and readable medium
Jian Gao, Beijing (CN); Sen Ma, Beijing (CN); Fang Cheng, Beijing (CN); Tao Hong, Beijing (CN); Jinye Zhu, Beijing (CN); Pengxia Liang, Beijing (CN); and Jing Yu, Beijing (CN)
Assigned to BOE Technology Group Co., Ltd., Beijing (CN)
Appl. No. 17/765,390
Filed by BOE Technology Group Co., Ltd., Beijing (CN)
PCT Filed May 28, 2021, PCT No. PCT/CN2021/096964
§ 371(c)(1), (2) Date Mar. 30, 2022,
PCT Pub. No. WO2022/246844, PCT Pub. Date Dec. 1, 2022.
Prior Publication US 2024/0121369 A1, Apr. 11, 2024
Int. Cl. H04N 13/117 (2018.01); H04N 13/167 (2018.01); H04N 13/189 (2018.01)
CPC H04N 13/117 (2018.05) [H04N 13/167 (2018.05); H04N 13/189 (2018.05)] 15 Claims
OG exemplary drawing
 
1. A screen detection method, wherein the method comprises:
receiving a cylindrical lens detection instruction for a target screen, wherein the cylindrical lens detection instruction at least comprises target viewpoints;
acquiring browsing images shot from the target screen under the target viewpoints in response to the detection instruction, wherein the target screen is a screen of which a light emission side is provided with cylindrical lenses;
taking the browsing images as viewpoint images when the browsing images comprise target contents; and
outputting detection parameters of the cylindrical lenses on the target screen based on image parameters of the viewpoint images, wherein the detection parameters are parameters that are used in detection of the cylindrical lenses and are actual parameters of the cylindrical lenses for replacing design values of the parameters, wherein the detection parameters comprise at least one of placing heights of the cylindrical lenses, a central distance between two adjacent cylindrical lenses, alignment angle deviations of the cylindrical lenses, alignment position deviations of the cylindrical lenses, and curvature radii of the cylindrical lenses;
wherein the detection parameters at least comprise the placing heights of the cylindrical lenses; the step of outputting the detection parameters of the cylindrical lenses on the target screen based on the image parameters of the viewpoint images comprises: acquiring viewpoint positions corresponding to the viewpoint images and pixel point positions on the pixel surface based on the viewpoint images; acquiring a first pixel point distance between the pixel point positions corresponding to two adjacent viewpoint images on the same cylindrical lens; and acquiring the placing heights of the cylindrical lenses on the target screen based on the viewpoint positions, the number of the viewpoints, the first pixel point distance and a refractive index of a medium from the cylindrical lenses to the pixel surface; wherein the step of acquiring the placing heights of the cylindrical lenses on the target screen based on the viewpoint positions, the number of the viewpoints, the first pixel point distance and the refractive index of the medium from the cylindrical lenses to the pixel surface comprises:
establishing space rectangular coordinates (x, y, z) by using a plane where the pixel surface of the target screen is located as an xy plane, acquiring spatial coordinate values of the viewpoint positions in the space rectangular coordinates, and outputting the placing heights of the cylindrical lenses on the target screen according to the following formula:

[Formula rendered as an image in the Official Gazette; not reproduced in this text record.]
wherein T represents the placing heights, N represents the number of the viewpoints, n represents the refractive index of the medium from the cylindrical lenses to the pixel surface, P_sub represents the first pixel point distance between the pixel point positions corresponding to two adjacent viewpoint images on the same cylindrical lens, X_N represents an x-axis spatial coordinate value of an Nth viewpoint image, X_1 represents an x-axis coordinate value of a first viewpoint image, and z represents a z-axis coordinate value of each viewpoint image, wherein N≥2, and N is a positive integer; or
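The claimed formula itself appears only as an image in the Official Gazette record. From the variable definitions above, one plausible reconstruction, assuming a similar-triangles geometry in which the optical path between the cylindrical lenses and the pixel surface is foreshortened by the refractive index n, would be:

```latex
T = \frac{n \, z \, (N-1) \, P_{\mathrm{sub}}}{X_N - X_1}
```

Under that assumption, the pixel-side span (N−1)·P_sub subtends the same angle at the lens center as the viewpoint-side span X_N − X_1 at viewing distance z, with the physical height T divided by n to account for refraction; this is an editorial sketch, not the published formula.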
wherein the detection parameters at least comprise the central distance between two adjacent cylindrical lenses; the step of outputting the detection parameters of the cylindrical lenses on the target screen based on the image parameters of the viewpoint images comprises: acquiring the central distance between two adjacent cylindrical lenses based on the placing heights of the cylindrical lenses and the refractive index of the medium from the cylindrical lenses to the pixel surface; or
wherein the detection parameters at least comprise the alignment angle deviations of the cylindrical lenses; wherein the step of outputting the detection parameters of the cylindrical lenses on the target screen based on the image parameters of the viewpoint images comprises: acquiring the number of the target longitudinal contents, viewpoint positions corresponding to the viewpoint images and pixel point positions on the pixel surface based on the viewpoint images; acquiring a first pixel point distance between the pixel point positions corresponding to two adjacent viewpoint images on the same cylindrical lens and content widths of the target longitudinal contents on the viewpoint images; and acquiring alignment angle deviations of the cylindrical lenses based on the number of the target longitudinal contents, the first pixel point distance and the content widths; wherein the step of acquiring the alignment angle deviations of the cylindrical lenses based on the number of the target longitudinal contents, the first pixel point distance and the content widths comprises: outputting the alignment angle deviations of the cylindrical lenses according to the following formula:

[Formula rendered as an image in the Official Gazette; not reproduced in this text record.]
wherein Δθ represents the alignment angle deviations of the cylindrical lenses, N represents the number of the target longitudinal contents, P_sub represents the first pixel point distance between the pixel point positions corresponding to two adjacent viewpoint images on the same cylindrical lens, and W represents the content widths of the target longitudinal contents on the viewpoint images; or
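This formula is likewise reproduced only as an image in the Official Gazette. One plausible reconstruction from the definitions above, assuming the lens tilt makes the N target longitudinal contents, spanning a total pixel distance of N·P_sub, appear spread across a content width W, would be:

```latex
\Delta\theta = \arctan\!\left(\frac{N \, P_{\mathrm{sub}}}{W}\right)
```

This is an editorial sketch of one geometrically consistent relation, not the published formula.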
wherein the detection parameters at least comprise the alignment position deviations of the cylindrical lenses; wherein the step of outputting the detection parameters of the cylindrical lenses on the target screen based on the image parameters of the viewpoint images comprises: acquiring the alignment position deviations of the cylindrical lenses based on the image parameters of the viewpoint images; or
wherein the detection parameters at least comprise the curvature radii of the cylindrical lenses; wherein the step of outputting the detection parameters of the cylindrical lenses on the target screen based on the image parameters of the viewpoint images comprises: acquiring the viewing angles of the viewpoint images; and by adjusting curvature radii of optical simulation models of the cylindrical lenses, using the curvature radii as the curvature radii of the cylindrical lenses when the viewing angles at the maximum sharpness of the optical simulation models are the viewing angles of the viewpoint images.
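The curvature-radius alternative above describes adjusting the curvature radius of an optical simulation model until the viewing angle at maximum sharpness matches the viewing angle observed in the viewpoint images. A minimal sketch of that fitting loop follows; the function `sharpest_viewing_angle` is a toy stand-in for a real optical simulator (assumed monotonically decreasing in the radius), and all names are illustrative, not from the patent.

```python
# Hypothetical sketch: fit the lens curvature radius by bisection so that the
# simulated viewing angle at maximum sharpness matches the measured angle.

def sharpest_viewing_angle(radius_mm: float) -> float:
    """Toy stand-in for the optical simulation model: returns the viewing
    angle (degrees) at which the simulated lens image is sharpest."""
    return 1000.0 / radius_mm  # assumed monotonically decreasing in radius

def fit_curvature_radius(target_angle_deg: float,
                         lo: float = 10.0, hi: float = 1000.0,
                         tol: float = 1e-6) -> float:
    """Bisect on the radius until the simulated sharpest viewing angle
    matches the viewing angle observed in the viewpoint images."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        angle = sharpest_viewing_angle(mid)
        if abs(angle - target_angle_deg) < tol:
            return mid
        # Angle shrinks as radius grows: too large an angle means the
        # trial radius is still too small.
        if angle > target_angle_deg:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

radius = fit_curvature_radius(25.0)  # observed sharpest viewing angle: 25 deg
print(round(radius, 3))              # ~40.0 in this toy model (1000/40 = 25)
```

Any monotonic root-finding method would serve equally well here; bisection is shown only because it needs no derivative information from the simulator.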