CPC H04N 13/117 (2018.05) [H04N 13/156 (2018.05); H04N 13/268 (2018.05); H04N 13/293 (2018.05); H04N 23/62 (2023.01); H04N 23/632 (2023.01); H04N 23/698 (2023.01); H04N 23/90 (2023.01)]
20 Claims

1. A photographing method comprising:
receiving a photographing instruction of a user;
shooting, in response to receiving the photographing instruction, a primary camera image using a primary camera of an electronic device;
shooting, in response to receiving the photographing instruction, a wide-angle image using a wide-angle camera of the electronic device;
obtaining a first point cloud based on the primary camera image;
obtaining a second point cloud based on the wide-angle image;
calculating a common area of the primary camera image and the wide-angle image based on camera parameters of the primary camera and the wide-angle camera;
calculating, using a stereo matching imaging algorithm, content in the common area to obtain a binocular depth map;
obtaining, using the binocular depth map, a binocular point cloud based on the primary camera image and the wide-angle image;
performing a first depth calibration on the second point cloud based on the first point cloud to obtain a first calibrated point cloud of the wide-angle image;
performing a second depth calibration on the binocular point cloud based on the first point cloud to obtain a second calibrated binocular point cloud;
obtaining a fused three-dimensional (3D) point cloud by:
matching and fusing the first point cloud, the second point cloud, and the binocular point cloud based on a full-scene 3D point cloud; and
fusing the first point cloud with the first calibrated point cloud and the second calibrated binocular point cloud; and
generating a 3D image based on the fused 3D point cloud.
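
The following Python sketch is for orientation only and is not the implementation disclosed for claim 1. It illustrates, under stated assumptions, the general shape of the recited pipeline: stereo matching over the common area to obtain a binocular depth map and point cloud, depth calibration of the wide-angle and binocular point clouds against the primary-camera point cloud, and fusion of the three clouds. The rectified common-area crops, the reprojection matrix `Q`, the median-based depth rescaling, and the concatenation-based fusion are illustrative assumptions, not limitations taken from the claim.

```python
import numpy as np
import cv2  # OpenCV, assumed available for stereo matching and reprojection


def binocular_point_cloud(primary_crop, wide_crop, Q):
    """Stereo-match rectified crops of the common area and reproject the
    resulting disparity map into a binocular 3D point cloud.

    `Q` is a 4x4 disparity-to-depth reprojection matrix derived from the
    camera parameters (an illustrative assumption; the claim only recites
    'camera parameters')."""
    left = cv2.cvtColor(primary_crop, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(wide_crop, cv2.COLOR_BGR2GRAY)
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    # OpenCV returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0
    points = cv2.reprojectImageTo3D(disparity, Q)  # binocular depth -> 3D points
    return points.reshape(-1, 3)


def calibrate_depth(reference_cloud, target_cloud):
    """Placeholder 'depth calibration': rescale the target cloud so its median
    depth (z) matches the median depth of the reference (primary-camera) cloud."""
    scale = np.median(reference_cloud[:, 2]) / np.median(target_cloud[:, 2])
    calibrated = target_cloud.copy()
    calibrated[:, 2] *= scale
    return calibrated


def fuse_point_clouds(first_cloud, calibrated_wide_cloud, calibrated_binocular_cloud):
    """Placeholder fusion: stack the three clouds into one fused array; the
    claimed matching and fusion based on a full-scene 3D point cloud is richer."""
    return np.vstack([first_cloud, calibrated_wide_cloud, calibrated_binocular_cloud])
```

In practice, the calibration and fusion steps would typically also involve outlier removal and point-to-point matching (for example, ICP-style registration) before merging; those details are omitted here because the claim does not specify them.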