###### tags: `draft` `thesis` `jptw`

# Chapter 1 Introduction

## 1.1 Background and Motivation

:::info
Keywords | `aging` `monitoring` `video stream` `event detection`
:::

With the rapid growth of the senior population, the aging issue has drawn considerable public attention. To care for the elderly, unobtrusive in-home health monitoring has long been highlighted. As an in-home monitoring device, a video camera captures human behavior during daily life at home. To detect these behaviors automatically, researchers have proposed methods based on a great variety of models for video and audio processing, such as hidden Markov models (HMMs), recurrent neural networks (RNNs), and deep learning (DL) models. Being data-driven, these models normally require large amounts of labeled data. However, labeling data at this scale imposes a substantial cost in money, time, and human resources. To address this limitation, developing labeling-assistance tools is extremely important.

## 1.2 Problem Statement

Typical multimedia event detection (MED) systems focus on processing from a computer vision perspective. Audio analysis, however, can also retrieve rich information for event detection. In this thesis, we focus mainly on the audio analysis domain. We developed a user-friendly annotation tool that highlights event-related regions for labeling, shortening the time needed to label different sound events manually. Considering the insufficiency and diversity of semantic tags, the proposed system starts from a query-by-example (QBE) approach. Finally, a pilot test is conducted to validate the system and to suggest further possibilities.

## 1.3 Thesis Overview

The rest of the thesis is organized as follows. Section 2 discusses related work on audio retrieval and sound event detection. Section 3 describes the proposed system design and audio fingerprinting method.
Section 4 presents the experimental analysis and results, and Section 5 concludes the thesis and outlines directions for future research.
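To make the query-by-example idea in Section 1.2 concrete, the following is a minimal toy sketch: a labeled query clip is slid over an unlabeled audio stream, and the best-matching region is returned as a candidate for annotation. Per-frame energy here is only a stand-in for real acoustic features (e.g. MFCCs or fingerprints), and all function names are illustrative, not part of the proposed system.

```python
import math

def frame_energies(signal, frame_len=4):
    # Toy acoustic feature: energy of each non-overlapping frame.
    return [sum(x * x for x in signal[i:i + frame_len])
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def cosine(a, b):
    # Cosine similarity between two feature vectors (0.0 if either is silent).
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def qbe_match(stream, query, frame_len=4):
    # Slide the query's feature vector across the stream and return
    # (best_similarity, frame_offset) -- the region to highlight for labeling.
    q = frame_energies(query, frame_len)
    s = frame_energies(stream, frame_len)
    best_sim, best_off = 0.0, -1
    for i in range(len(s) - len(q) + 1):
        sim = cosine(s[i:i + len(q)], q)
        if sim > best_sim:
            best_sim, best_off = sim, i
    return best_sim, best_off

# Usage: a rising burst embedded in silence is recovered at frame offset 2.
query = [0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2]
stream = [0.0] * 8 + query + [0.0] * 8
sim, off = qbe_match(stream, query)
print(sim, off)  # best match is an exact one at frame offset 2
```

A real system would replace the energy feature with a robust fingerprint and report all regions above a similarity threshold rather than only the single best match.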