US 11,947,782 B2
Device, method, and graphical user interface for manipulating workspace views
Julian Missig, Burlingame, CA (US); Jonathan Koch, Lewisville, NC (US); Avi E. Cieplinski, San Francisco, CA (US); B. Michael Victor, Menlo Park, CA (US); Jeffrey Traer Bernstein, San Francisco, CA (US); Duncan R. Kerr, San Francisco, CA (US); and Myra M. Haggerty, San Mateo, CA (US)
Assigned to Apple Inc., Cupertino, CA (US)
Filed by Apple Inc., Cupertino, CA (US)
Filed on Jun. 17, 2022, as Appl. No. 17/843,729.
Application 17/843,729 is a continuation of application No. 17/180,227, filed on Feb. 19, 2021, granted, now 11,366,576.
Application 17/180,227 is a continuation of application No. 16/377,702, filed on Apr. 8, 2019, granted, now 10,928,993, issued on Feb. 23, 2021.
Application 16/377,702 is a continuation of application No. 14/455,303, filed on Aug. 8, 2014, granted, now 10,254,927, issued on Apr. 9, 2019.
Application 14/455,303 is a continuation of application No. 12/567,206, filed on Sep. 25, 2009, granted, now 8,832,585, issued on Sep. 9, 2014.
Prior Publication US 2023/0143113 A1, May 11, 2023
Int. Cl. G06F 3/0484 (2022.01); G06F 3/04817 (2022.01); G06F 3/0488 (2022.01); G06F 3/04883 (2022.01); G06F 3/04886 (2022.01); G06F 9/451 (2018.01); G09G 5/14 (2006.01); G06F 3/01 (2006.01)
CPC G06F 3/0484 (2013.01) [G06F 3/04817 (2013.01); G06F 3/0488 (2013.01); G06F 3/04883 (2013.01); G06F 3/04886 (2013.01); G06F 9/451 (2018.02); G09G 5/14 (2013.01); G06F 3/017 (2013.01); G06F 2203/04808 (2013.01)] 24 Claims
OG exemplary drawing
 
1. A computer system, comprising:
a touch-sensitive display;
a mouse;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
while displaying a user interface region that includes a first graphical object at a first position in the user interface region and a second graphical object at a second position in the user interface region:
detecting, via the touch-sensitive display, a touch input; and
detecting, via the mouse, a mouse-based input, wherein at least a portion of the mouse-based input is detected concurrently with at least a portion of the touch input, wherein the mouse-based input is substantially stationary, and wherein the mouse-based input is directed to the first graphical object; and
in response to detecting the mouse-based input and the touch input, performing an operation that is determined based on both the mouse-based input and the touch input, wherein performing the operation includes moving the first graphical object relative to the first position in the user interface region and maintaining the second graphical object relative to the second position in the user interface region.
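Claim 1 recites an interaction in which a substantially stationary mouse input designates one on-screen object while a concurrent touch input supplies the motion, and the resulting operation moves only the designated object while the other object keeps its position. The following is a minimal, hypothetical Swift sketch of that idea only; it is not the claimed or actual implementation, all type and function names (Point, GraphicalObject, MouseInput, TouchDrag, applyCombinedInput) are invented for illustration, and real platform event handling is omitted.

// All names here are invented for illustration; this is not the patented code.

// Hypothetical position type and on-screen object model.
struct Point { var x: Double; var y: Double }

struct GraphicalObject {
    let id: String
    var position: Point
}

// Hypothetical representations of the two concurrent inputs in claim 1:
// a substantially stationary mouse press directed at one object, and a
// touch gesture that supplies a translation.
struct MouseInput {
    let targetID: String
    let isStationary: Bool
}

struct TouchDrag {
    let dx: Double
    let dy: Double
}

// Performs the operation described in the claim: the object targeted by
// the stationary mouse input moves by the touch translation; every other
// object keeps its original position.
func applyCombinedInput(mouse: MouseInput,
                        touch: TouchDrag,
                        to objects: [GraphicalObject]) -> [GraphicalObject] {
    guard mouse.isStationary else { return objects }
    return objects.map { object in
        guard object.id == mouse.targetID else { return object }  // e.g. the second object
        var moved = object
        moved.position.x += touch.dx
        moved.position.y += touch.dy
        return moved
    }
}

// Example: the mouse is held still over "first" while a concurrent touch
// drags 40 points right and 25 points down.
let before = [
    GraphicalObject(id: "first",  position: Point(x: 10, y: 10)),
    GraphicalObject(id: "second", position: Point(x: 200, y: 80)),
]
let after = applyCombinedInput(
    mouse: MouseInput(targetID: "first", isStationary: true),
    touch: TouchDrag(dx: 40, dy: 25),
    to: before
)
for object in after {
    print(object.id, object.position)  // "first" moves to (50, 35); "second" stays at (200, 80)
}

This sketch deliberately reduces the claim to a pure function over object positions; in a real system the "operation determined based on both the mouse-based input and the touch input" would be chosen by a gesture recognizer that tracks both input streams concurrently.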