US 11,889,052 B2
Method for encoding video information and method for decoding video information, and apparatus using same
Se Yoon Jeong, Daejeon-si (KR); Hui Yong Kim, Daejeon-si (KR); Sung Chang Lim, Daejeon-si (KR); Jin Ho Lee, Daejeon-si (KR); Ha Hyun Lee, Seoul (KR); Jong Ho Kim, Daejeon-si (KR); Jin Soo Choi, Daejeon-si (KR); Jin Woong Kim, Daejeon-si (KR); Chie Teuk Ahn, Daejeon-si (KR); Gwang Hoon Park, Seongnam-si (KR); Kyung Yong Kim, Suwon-si (KR); Han Soo Lee, Yongin-si (KR); and Tae Ryong Kim, Yongin-si (KR)
Assigned to Electronics and Telecommunications Research Institute, Daejeon (KR); and University-Industry Cooperation Group of Kyung Hee University, Yongin-si (KR)
Filed by Electronics and Telecommunications Research Institute, Daejeon (KR); and University-Industry Cooperation Group of Kyung Hee University, Yongin-si (KR)
Filed on Feb. 1, 2022, as Appl. No. 17/590,161.
Application 17/590,161 is a continuation of application No. 17/344,161, filed on Jun. 10, 2021, granted, now 11,388,393.
Application 17/344,161 is a continuation of application No. 14/044,542, filed on Oct. 2, 2013, granted, now 11,064,191, issued on Jul. 13, 2021.
Application 14/044,542 is a continuation of application No. 13/977,520, filed as application No. PCT/KR2011/010379 on Dec. 30, 2011, granted, now 9,955,155, issued on Apr. 24, 2018.
Claims priority of application No. 10-2010-0140721 (KR), filed on Dec. 31, 2010; and application No. 10-2011-0147083 (KR), filed on Dec. 30, 2011.
Prior Publication US 2022/0159237 A1, May 19, 2022
Int. Cl. H04N 19/10 (2014.01); H04N 19/105 (2014.01); H04N 19/136 (2014.01); H04N 19/196 (2014.01); H04N 19/61 (2014.01); H04N 19/46 (2014.01)
CPC H04N 19/10 (2014.11) [H04N 19/105 (2014.11); H04N 19/136 (2014.11); H04N 19/197 (2014.11); H04N 19/196 (2014.11); H04N 19/46 (2014.11); H04N 19/61 (2014.11)] 3 Claims
OG exemplary drawing
 
1. A method of decoding video information, the method comprising:
generating an occurrence probability of a split flag for a current coding unit based on a split depth value for one or more spatial neighboring blocks of the current coding unit,
wherein the split flag indicates whether a coding unit is split and the split depth value indicates a depth level of a coding unit according to the split flag, and
wherein the coding unit is not split when the split flag is equal to 0, and the coding unit is split into four coding units with half horizontal and vertical size of the coding unit when the split flag is equal to 1;
performing entropy decoding of the split flag on the current coding unit based on the generated occurrence probability;
decoding a prediction flag indicating whether information of the current coding unit is the same as prediction information derived from information of a temporal neighboring block of the current coding unit, the information of the current coding unit including a motion vector of the current coding unit;
determining the information of the current coding unit based on the decoded prediction flag; and
performing inter-prediction of the current coding unit based on the determined information of the current coding unit,
wherein, when the information of the current coding unit is the same as the prediction information derived from information of the temporal neighboring block, the information of the current coding unit is determined to be the prediction information, and each of a first reference index and a second reference index for the current coding unit has a specific value indicating a specific reference frame of a current frame to which the current coding unit belongs, based on that two lists of reference frames are used for the current coding unit,
wherein the current coding unit is decoded based on a temporally previous frame and a temporally subsequent frame of the current frame, the temporally previous frame being indicated by the first reference index and the temporally subsequent frame being indicated by the second reference index, and
wherein, when the information of the current coding unit is different from the prediction information derived from information of the temporal neighboring block, the information of the current coding unit is derived by adding a difference value obtained from a bitstream to the prediction information.
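For orientation, the claim combines three decoder-side operations: selecting an occurrence probability (a context) for the split flag from the split depths of spatially neighboring blocks, entropy-decoding the split flag with that probability, and deriving the coding unit's motion information under the control of a prediction flag. The Python sketch below illustrates that control flow only; it is not taken from the patent or from any reference decoder, and every name in it (ToyDecoder, decode_cu, MotionInfo, the fixed reference index 0, and so on) is a hypothetical placeholder introduced for illustration. A real entropy decoder would use the selected context to drive adaptive binary arithmetic decoding; the stub here simply replays a prepared symbol list so the example stays self-contained.

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class MotionInfo:
        mv: Tuple[int, int]          # motion vector (x, y)
        ref_idx_l0: int              # first reference index (reference list 0)
        ref_idx_l1: int              # second reference index (reference list 1)

    @dataclass
    class CodingUnit:
        depth: int                   # split depth of this coding unit
        motion: Optional[MotionInfo] = None
        children: List["CodingUnit"] = field(default_factory=list)

    def split_flag_context(cur_depth: int, neighbour_depths: List[int]) -> int:
        # Pick a context (occurrence-probability model) for the split flag from
        # how many spatial neighbours are split deeper than the current depth.
        return sum(1 for d in neighbour_depths if d > cur_depth)

    class ToyDecoder:
        """Hypothetical stand-in for an arithmetic decoder; replays stored symbols."""
        def __init__(self, symbols: List[int]):
            self.symbols = symbols
            self.pos = 0
        def decode_bin(self, context: int) -> int:
            # A real CABAC engine would use `context` to select an adaptive
            # probability state; this stub just returns the next stored symbol.
            self.pos += 1
            return self.symbols[self.pos - 1]

    def decode_cu(dec: ToyDecoder, cur_depth: int, max_depth: int,
                  neighbour_depths: List[int], colocated: MotionInfo) -> CodingUnit:
        cu = CodingUnit(depth=cur_depth)
        # 1) Split flag: its context depends on the neighbours' split depths.
        split = 0
        if cur_depth < max_depth:
            ctx = split_flag_context(cur_depth, neighbour_depths)
            split = dec.decode_bin(ctx)
        if split == 1:
            # split flag == 1: four sub-CUs, each half the width and height.
            # (Neighbour depths are passed through unchanged for simplicity.)
            for _ in range(4):
                cu.children.append(decode_cu(dec, cur_depth + 1, max_depth,
                                             neighbour_depths, colocated))
            return cu
        # 2) Prediction flag: does the CU reuse the information predicted from
        #    the temporal (co-located) neighbour as-is?
        pred_flag = dec.decode_bin(context=0)
        if pred_flag == 1:
            # Reuse the predicted information; both reference indices take a
            # fixed value (index 0 here, an illustrative choice) when two
            # reference lists are in use (one temporally previous frame,
            # one temporally subsequent frame).
            cu.motion = MotionInfo(mv=colocated.mv, ref_idx_l0=0, ref_idx_l1=0)
        else:
            # 3) Otherwise a difference value is read from the bitstream and
            #    added to the prediction information.
            mvd = (dec.decode_bin(context=1), dec.decode_bin(context=1))
            cu.motion = MotionInfo(mv=(colocated.mv[0] + mvd[0],
                                       colocated.mv[1] + mvd[1]),
                                   ref_idx_l0=colocated.ref_idx_l0,
                                   ref_idx_l1=colocated.ref_idx_l1)
        return cu

    # Example: one CU at depth 0 that is not split, with pred_flag == 1.
    dec = ToyDecoder(symbols=[0, 1])
    col = MotionInfo(mv=(3, -2), ref_idx_l0=0, ref_idx_l1=0)
    root = decode_cu(dec, cur_depth=0, max_depth=3,
                     neighbour_depths=[1, 0], colocated=col)

The sketch keeps only the decision structure of the claim; prediction-sample generation, reference-picture management, and the actual probability update of the entropy coder are outside its scope.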