Bibliographic Details
Title: HANDLES INTERACTIONS FOR HUMAN-COMPUTER INTERFACE
Document Number: 20110197161
Publication Date: August 11, 2011
Appl. No: 12/703115
Application Filed: February 09, 2010
Abstract: A system is disclosed for providing on-screen graphical handles to control interaction between a user and on-screen objects. A handle defines what actions a user may perform on an object, such as scrolling through a textual or graphical navigation menu. Affordances are provided to guide the user through the process of interacting with a handle.
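As a concrete illustration of the abstract, a handle can be modeled as a small record tying a screen position to an action and the affordances that guide the user. The sketch below is a minimal, hypothetical Python rendering; the application publishes no source code, and every name here is invented for illustration:

    from dataclasses import dataclass, field
    from typing import Callable, List

    @dataclass
    class Handle:
        # Hypothetical model: an explicit on-screen engagement point for one action area.
        x: float                                  # screen position of the handle
        y: float
        action: Callable[[str], None]             # what a gesture does to the associated area
        affordances: List[str] = field(default_factory=list)  # cues guiding the interaction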
Inventors: Mattingly, Andrew (Kirkland, WA, US); Hill, Jeremy (Seattle, WA, US); Dayal, Arjun (Redmond, WA, US); Kramp, Brian (Kirkland, WA, US); Vassigh, Ali (Redmond, WA, US); Klein, Christian (Duvall, WA, US); Poulos, Adam (Redmond, WA, US); Kipman, Alex (Redmond, WA, US); Margolis, Jeffrey (Seattle, WA, US)
Assignees: MICROSOFT CORPORATION (Redmond, WA, US)
Claim: 1. In a system comprising a computing environment coupled to a capture device for capturing user position and providing a human-computer interface, a method of facilitating user interaction with an area of a display for the human-computer interface, comprising: (a) generating a handle associated with the area of the interface; (b) detecting, via a camera sensing user movement, engagement by the user with the handle generated in said step (a); (c) receiving an indication of gesture by the user; and (d) performing an action on the area of the user interface in response to said step (c).
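Read as a per-frame pipeline, steps (a) through (d) of claim 1 might look like the following minimal Python sketch. All names are hypothetical; `display`, `handle`, and the detection calls are assumptions, not the claimed implementation:

    def update(handle, hand_position, gesture, display):
        # Hypothetical per-frame pass through steps (a)-(d) of claim 1.
        display.draw(handle)                # (a) generate/show the handle for its area
        if handle.contains(hand_position):  # (b) engagement detected from sensed movement
            if gesture is not None:         # (c) an indication of gesture was received
                handle.action(gesture)      # (d) perform the action on the area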
Claim: 2. The method of claim 1, said step (a) comprising the step of displaying the handle on the area of the user interface.
Claim: 3. The method of claim 1, said step (a) comprising the step of displaying the handle adjacent the area of the user interface.
Claim: 4. The method of claim 1, said step (a) comprising the step of integrating the handle as part of the area of the user interface so that no separate handle is displayed.
Claim: 5. The method of claim 1, said step (a) comprising the step of displaying the handle as a circular graphical object of two or three dimensions.
Claim: 6. The method of claim 1, said step (a) comprising the step of displaying the handle as a graphical object and further comprising the step (e) of changing an appearance of the handle on the display from said step (a) to said step (b) and from said step (b) to said step (c).
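Claim 6 describes the handle's appearance changing at each transition, which suggests a small state machine. Here is one hypothetical way to encode it; the phase names and appearance strings are invented:

    from enum import Enum

    class HandlePhase(Enum):
        # Hypothetical phases for claim 6; appearance changes at each transition.
        IDLE = "dim"             # after step (a): displayed, not yet engaged
        ENGAGED = "highlighted"  # after step (b): user has engaged the handle
        GESTURING = "active"     # after step (c): a gesture indication arrived

    def appearance(phase: HandlePhase) -> str:
        return phase.value       # e.g. choose a sprite or glow level per phase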
Claim: 7. The method of claim 1, said step (b) of detecting engagement by the user with the handle facilitated by simulating an attractive force around the handle pulling a cursor to the handle, engagement detected upon the cursor being pulled to the handle.
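One plausible reading of the simulated attractive force in claim 7 (and claim 20) is per-frame cursor "magnetism". The following self-contained Python sketch is an assumption about how such a pull could work; the radius, strength, and engagement threshold are arbitrary illustrative values:

    import math
    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

    def attract_cursor(cursor: Point, handle: Point,
                       radius: float = 80.0, strength: float = 0.35) -> bool:
        # Pull the cursor a fraction of the way toward the handle whenever it
        # is within `radius` pixels; report engagement once it has been pulled
        # onto the handle, as the claim describes.
        dx, dy = handle.x - cursor.x, handle.y - cursor.y
        if 0.0 < math.hypot(dx, dy) < radius:
            cursor.x += dx * strength
            cursor.y += dy * strength
        return math.hypot(handle.x - cursor.x, handle.y - cursor.y) < 5.0

Called once per frame, this makes a nearby cursor snap toward the handle, so engagement can be established without pixel-perfect aim.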
Claim: 8. The method of claim 1, said step (a) of generating a handle associated with the area of the user interface comprising the step of associating a handle to the area based on an action to be performed on the area upon detection of a gesture performed by a user while engaging the handle.
Claim: 9. The method of claim 1, further comprising the step (f) of displaying affordances associated with the handle indicating how a user may interact with the handle generated in said step (a).
Claim: 10. A processor readable storage medium for a computing environment coupled to a capture device for capturing user position and providing a human-computer interface, the processor readable storage medium programming a processor to perform a method of facilitating user interaction with an action area of a display for the human-computer interface, comprising: (a) displaying on the display a graphical handle associated with the area of the interface, the graphical handle providing an explicit engagement point for engaging the action area and the graphical handle defining how a user may interact with the action area upon receipt of a predefined gesture by the user; (b) receiving an indication that the user is tracking to the handle as a result of detecting a position of the user; (c) establishing engagement with the handle when a user has tracked to the handle; (d) receiving an indication of gesture by the user; (e) performing an action with respect to the action area of the display defined by the graphical handle where the gesture indication received in said step (d) matches the predefined gesture of said step (a).
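Step (e) of claim 10 fires the handle's action only when the received gesture matches the predefined one. A minimal hypothetical check, where the `predefined_gesture` and `action` attributes are assumptions rather than anything specified by the claims, could be:

    def perform_if_match(handle, observed_gesture: str) -> bool:
        # Hypothetical step (e) of claim 10: act on the action area only when
        # the observed gesture matches the gesture predefined for this handle.
        if observed_gesture == handle.predefined_gesture:
            handle.action(observed_gesture)
            return True
        return False  # non-matching gestures leave the action area untouched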
Claim: 11. The processor readable storage medium of claim 10, the method further comprising the step (f) of displaying affordances defining how a user may interact with the handle upon said step (c) of establishing engagement with the handle.
Claim: 12. The processor readable storage medium of claim 11, wherein said step (f) comprises the step of displaying rails showing directions in which the handle may be moved.
Claim: 13. The processor readable storage medium of claim 12, wherein said step (d) of receiving an indication of gesture by the user comprises the step of receiving a gesture to move the handle in a direction defined by the rails in said step (f).
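Claims 12 and 13 tie handle movement to the displayed rails; one hypothetical way to enforce that is to zero out any movement component the rails do not permit. The direction vocabulary below is invented for illustration:

    def constrain_to_rails(dx: float, dy: float, rails: set) -> tuple:
        # Hypothetical rail constraint: project the user's movement onto the
        # permitted directions, e.g. rails = {"horizontal"} locks the handle
        # to left/right motion only.
        if "horizontal" not in rails:
            dx = 0.0
        if "vertical" not in rails:
            dy = 0.0
        return dx, dy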
Claim: 14. The processor readable storage medium of claim 10, said step (d) comprising the step of receiving an indication to move the handle in one of the directions up, down, left and right with respect to the display.
Claim: 15. The processor readable storage medium of claim 14, said step (e) comprising the step of navigating through a menu of objects upon said step (d) of receiving the indication to move the handle up, down, left or right.
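Claims 14 and 15 map a four-way handle move onto menu navigation. A minimal hypothetical version for a grid-shaped menu, where the grid layout and index-clamping policy are assumptions, might be:

    def navigate_menu(index: int, direction: str, item_count: int,
                      columns: int = 1) -> int:
        # Hypothetical claims 14-15 behavior: an up/down/left/right handle
        # move steps through a menu laid out as a grid `columns` wide.
        step = {"left": -1, "right": 1, "up": -columns, "down": columns}.get(direction, 0)
        return max(0, min(item_count - 1, index + step))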
Claim: 16. In a computer system having a computing environment coupled to a capture device for capturing user position and coupled to a display, a human-computer interface comprising: an action area on the display, the action area capable of at least one of performing an action and having an action performed on it; a handle displayed on the display and associated with the action area, the handle providing an explicit engagement point with an action area and defining how a user may interact with the action area; and rails displayed on the display associated with the handle for defining how a user may manipulate the handle.
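Claim 16 composes three on-screen elements: an action area, a handle bound to it, and rails describing how the handle may be manipulated. A hypothetical structural sketch, with names and fields invented for illustration:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass
    class ActionArea:
        bounds: Tuple[float, float, float, float]  # region that performs, or receives, an action

    @dataclass
    class Rails:
        directions: Tuple[str, ...]  # e.g. ("horizontal",): how the handle may be manipulated

    @dataclass
    class HandleWidget:
        area: ActionArea  # the handle is the explicit engagement point for this area
        rails: Rails      # on-screen cue defining how the handle can be moved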
Claim: 17. The human-computer interface of claim 16, wherein the action area is a navigation menu and the handle defines navigation paths through the navigation menu.
Claim: 18. The human-computer interface of claim 17, wherein the navigation menu is a textual menu.
Claim: 19. The human-computer interface of claim 17, wherein the navigation menu is a graphical menu.
Claim: 20. The human-computer interface of claim 16, wherein a simulated force is generated around the handle to make engagement with the handle by a user easier to establish.
Current U.S. Class: 715/810
Current International Class: 06; 06
Accession Number: edspap.20110197161
Database: USPTO Patent Applications