Abstract
Human-Computer Interaction (HCI) is a key topic in the Accessibility area, particularly in a Smart-City environment, where humans must interact with artefacts spread throughout a city. In the last decade, HCI has evolved significantly towards the goal of a fully natural user interface, for example the ability to recognize a wide range of hand gestures in real time. Recent efforts in the hardware sector have aimed at deploying sensors that capture information about human body movements in as much detail as possible; one example is the Kinect sensor from Microsoft. Nevertheless, although these devices track full-body movement well, they lack detail for smaller regions, namely the hands, given their small dimension compared to the body as a whole. In other words, the SDKs provided by the vendors of these devices usually do not deliver the level of detail about hand movements that an accurate hand gesture recognition implementation requires. This paper presents an extension to the Kinect SDK based on contour analysis for the estimation of the hand position. These algorithms are then used to build a gesture library that can be reused afterwards.
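To make the contour-analysis idea concrete, the following is a minimal illustrative sketch, not the authors' implementation: all names and the boundary test are assumptions. Given a binary hand mask (e.g. segmented from a depth frame), it collects the contour as the foreground pixels with at least one 4-neighbour outside the hand, takes the contour centroid as a rough palm estimate, and picks the contour point farthest from that centroid as a fingertip candidate.

```python
def estimate_hand(mask):
    """Estimate a palm centre and a fingertip candidate from a binary hand mask.

    mask: list of rows of 0/1 values (hypothetical input format; a real
    pipeline would segment this mask from the Kinect depth stream).
    Returns ((cx, cy), (tip_x, tip_y)).
    """
    h, w = len(mask), len(mask[0])
    contour = []
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            # A foreground pixel is on the contour if any 4-neighbour
            # falls outside the image or outside the hand region.
            neighbours = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if any(ny < 0 or ny >= h or nx < 0 or nx >= w or not mask[ny][nx]
                   for ny, nx in neighbours):
                contour.append((x, y))
    # Rough palm centre: centroid of the contour points.
    cx = sum(p[0] for p in contour) / len(contour)
    cy = sum(p[1] for p in contour) / len(contour)
    # Fingertip candidate: contour point farthest from the centroid.
    tip = max(contour, key=lambda p: (p[0] - cx) ** 2 + (p[1] - cy) ** 2)
    return (cx, cy), tip
```

A real gesture library would of course track several fingertips and match contour features over time; this sketch only shows the kind of geometric reasoning contour analysis enables on top of a body-tracking SDK.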