Show simple item record

dc.contributor: Sheldon, Dan
dc.contributor: Ginsberg, Tatiana
dc.contributor.advisor: Andrews, Christopher
dc.contributor.author: Bancila, Andreea
dc.date.accessioned: 2013-06-13T18:11:15Z
dc.date.available: 2013-06-13T18:11:15Z
dc.date.issued: 2013-06-13
dc.identifier.uri: http://hdl.handle.net/10166/3257
dc.description.abstract: Natural User Interfaces (NUIs) represent one of the most important steps that computer scientists have taken toward shaping a more intuitive, more detached kind of interaction with the technological sphere. NUIs, which refer to mouse-less interfaces, are advancing swiftly and seek to replace graphical user interfaces (GUIs) based on windows, icons, menus, and a pointer (WIMP). The term natural relates to the way people interact with the physical world, as it is those experiences that NUIs try to mimic. In this work we focused on gesture-based interaction, which is of greater interest than, for example, touch applications, since much of the territory remains uncharted. Gestures fit into the context of daily life fairly well, but they are not discoverable in the way that menu options are, which can lead to user confusion or a lack of awareness of the tool's features. One way to address this problem is to attempt to discover gestures that are actually intuitive by examining which gestures users initially try when asked to perform certain tasks. This research project focused on image editing as a task, in an attempt to accommodate the ever-expanding community of people generating pictures. Instead of providing users with a package of previously created editing gestures, we asked people to provide us with a collection of gestures that they could spontaneously and instinctively think of in the context of image editing. Our main goal was to discover whether there were any common gestures that people shared for specific image editing tasks. In many cases users created similar motions, which validated the approach and provided a starting point for building a gesture-based image editor prototype (which could be expanded to many other gesture-based tools). [en_US]
dc.description.sponsorship: Computer Science [en_US]
dc.language.iso: en_US [en_US]
dc.subject: natural user interface [en_US]
dc.subject: gesture elicitation [en_US]
dc.subject: kinect [en_US]
dc.subject: gesture interface [en_US]
dc.subject: image editing using gestures [en_US]
dc.subject: C# visual studio [en_US]
dc.title: Gesture Elicitation for Image Editing [en_US]
dc.type: Thesis
dc.date.gradyear: 2013 [en_US]
mhc.institution: Mount Holyoke College
mhc.degree: Undergraduate [en_US]
dc.rights.restricted: public [en_US]
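The abstract describes collecting spontaneously proposed gestures per image-editing task and checking whether participants share common motions. A minimal sketch of how such responses might be tallied to find the most agreed-upon gesture per task (all data, labels, and function names here are hypothetical illustrations, not taken from the thesis):

```python
from collections import Counter

# Hypothetical elicitation data: for each image-editing task, the gesture
# label each participant spontaneously proposed (labels are illustrative).
proposals = {
    "crop":   ["frame-with-hands", "frame-with-hands", "pinch", "frame-with-hands"],
    "rotate": ["twist-wrist", "twist-wrist", "circle-finger", "twist-wrist"],
    "zoom":   ["pinch", "spread-hands", "pinch", "pinch"],
}

def most_common_gesture(labels):
    """Return the modal gesture and the fraction of participants who chose it."""
    gesture, count = Counter(labels).most_common(1)[0]
    return gesture, count / len(labels)

for task, labels in proposals.items():
    gesture, share = most_common_gesture(labels)
    print(f"{task}: {gesture} ({share:.0%} of participants agree)")
```

A high agreement fraction for a task would suggest a candidate gesture for the prototype editor; ties or low agreement would flag tasks needing further study.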

