Gesture Elicitation for Image Editing
Natural User Interfaces (NUIs) represent one of the most important steps computer scientists have taken toward a more intuitive, less device-bound kind of interaction with technology. NUIs, which refer to mouse-less interfaces, are advancing swiftly and seeking to replace graphical user interfaces (GUIs) based on windows, icons, menus, and a pointer (WIMP). The term "natural" refers to the way people interact with the physical world, as it is those experiences that NUIs try to mimic. In this work we focused on gesture-based interaction, which holds greater interest than, for example, touch applications, since much of its territory remains uncharted. Gestures fit fairly well into the context of daily life, but they are not discoverable the way menu options are, which can lead to user confusion or a lack of awareness of a tool's features. One way to address this problem is to identify gestures that are actually intuitive by examining which gestures users spontaneously try when asked to perform certain tasks. This research project focused on image editing as the task domain, in an attempt to accommodate the ever-expanding community of people creating pictures. Instead of providing users with a predefined set of editing gestures, we asked people to supply the gestures they spontaneously and instinctively thought of in the context of image editing. Our main goal was to discover whether people shared common gestures for specific image editing tasks. In many cases users produced similar motions, which validated the approach and provided a starting point for building a gesture-based image editor prototype (an approach that could be extended to many other gesture-based tools).
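In gesture elicitation studies, the degree to which participants converge on the same gesture for a task is commonly quantified with an agreement score: the sum, over each group of identical proposals, of the squared fraction of participants in that group. The sketch below illustrates the computation; the task name and the gesture labels are hypothetical, not data from this study.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one task (referent): the sum of squared
    fractions of identical gesture proposals. Ranges from 1/n (all
    n proposals differ) to 1.0 (perfect agreement)."""
    total = len(proposals)
    counts = Counter(proposals)
    return sum((n / total) ** 2 for n in counts.values())

# Hypothetical proposals from 8 participants for a "rotate image" task.
rotate = ["two-finger-twist", "two-finger-twist", "circle-drag",
          "two-finger-twist", "circle-drag", "two-finger-twist",
          "knob-turn", "two-finger-twist"]
print(round(agreement_score(rotate), 3))  # prints 0.469
```

A high score for a task indicates a strong candidate gesture to adopt in the prototype, while a low score signals that no spontaneous consensus exists and a gesture may need to be taught.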