

CSE 572 Assignment #1: Activity Recognition Project


Note: This assignment can be done in either Python or MATLAB.


In this project, we will attempt to develop a computing system that can understand human activities. You will be provided with data for a given activity, specifically eating actions mixed with other, unknown activities. Your aim is to identify the eating activities amongst the noise. We will be working with real-world wristband data and, in the process, we will use all the concepts that we learn in the data mining course. Note that this is only an attempt and we do not expect a full solution yet.


In the first phase, we will do data cleaning, noise reduction, data organization, and feature extraction.


a) Phase 1: Data Cleaning and Organization [20 points] 


The data provided to you is collected from two sources:


A) Wristband data: The subject wears the wristband and performs eating actions periodically, interspersed with unknown non-eating actions. The wristband provides you with i) accelerometer, ii) gyroscope, iii) orientation, and iv) EMG sensor data. The sampling rate is 100 Hz for the EMG sensors and 50 Hz for the IMU sensors.


B) Video recording of the person performing eating actions, used only for establishing ground truth. In the data, you will be provided with the ground truth in the form of the frame numbers where each eating action starts and ends. The actual video data will not be provided due to privacy concerns. The video is recorded at 30 frames per second. Hence you must convert the frame numbers into sample numbers for the wristband sensors through some logic that you have to develop (a minimal sketch follows below). The assumption you can make is that the start frame of the video and sample #1 of the wristband are synchronized. The output of this step will be a set of data snippets labelled as eating actions and a set of data snippets labelled as non-eating.
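For example, under this synchronization assumption, a minimal sketch of the frame-to-sample conversion might look like the following (Python; frame_to_sample is an illustrative helper, not part of any provided code):

def frame_to_sample(frame_number, sensor_rate_hz, video_fps=30):
    """Map a video frame number to the corresponding wristband sample index,
    assuming frame 1 of the video lines up with sample 1 of the sensor stream."""
    return round(frame_number * sensor_rate_hz / video_fps)

# Example: frame 300 of the 30 fps video corresponds to
# sample 1000 of the 100 Hz EMG stream and sample 500 of the 50 Hz IMU stream.
frame_to_sample(300, 100)  # -> 1000 (EMG)
frame_to_sample(300, 50)   # -> 500  (IMU)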


Note: You don’t need to use the Video_info Excel files. You only need the text files in the groundTruth and MyoData folders for this project.


The way to convert the frame numbers into sample numbers is as follows. For the scope of this project you can ignore the timestamp. Consider the ground truth file, which has three columns; ignore the last column. The first column is the start frame of an eating action and the second column is the end frame; each row is one eating action. Each frame number can be multiplied by 100 and divided by 30 for the EMG data (since the EMG is sampled at 100 Hz and the video at 30 fps; by the same logic, for the 50 Hz IMU data you would multiply by 50 and divide by 30). This gives you the corresponding sample numbers that indicate the start and end rows of an eating action. Do this for every row to get the sample numbers of the eating actions for a person. The rows that are left over are the non-eating rows.
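As a point of reference, here is a minimal Python sketch of this labelling step using pandas. The file paths, the comma-delimited, header-less column layout, and the variable names (gt_path, emg_path, eating_mask) are assumptions for illustration, not part of the provided data description; the same conversion works for the 50 Hz IMU files if EMG_RATE is replaced with 50.

import numpy as np
import pandas as pd

VIDEO_FPS = 30   # video frame rate (given)
EMG_RATE = 100   # EMG sampling rate in Hz (given)

# Assumed paths and file layout; adjust to match the actual
# groundTruth and MyoData text files.
gt_path = "groundTruth/.../eating_actions.txt"   # columns: start frame, end frame, (ignored)
emg_path = "MyoData/.../EMG.txt"                 # one EMG sample per row

ground_truth = pd.read_csv(gt_path, header=None)
emg = pd.read_csv(emg_path, header=None)

# Mark every EMG row that falls inside an eating action.
eating_mask = np.zeros(len(emg), dtype=bool)
for _, row in ground_truth.iterrows():
    start = int(row[0] * EMG_RATE / VIDEO_FPS)   # start frame -> start sample
    end = int(row[1] * EMG_RATE / VIDEO_FPS)     # end frame -> end sample
    eating_mask[start:end + 1] = True

eating_snippets = emg[eating_mask]        # labelled eating rows
non_eating_snippets = emg[~eating_mask]   # everything left over is non-eating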



