
My machine learning pet project on Human Activity Recognition Using Multiclass Classification with fitness data from a smartphone tracker


amoshnin/ML-Human.Activity.Recognition


Problem description

  • The dataset used in this project was built from recordings of 30 study participants performing activities of daily living (ADL) while carrying a waist-mounted smartphone (a Samsung Galaxy S II) with embedded inertial sensors.

  • The objective of this project is to classify each observation into one of the six activities performed by the participants: WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING (a minimal classification sketch follows this list).

  • The data were collected with the smartphone's embedded accelerometer and gyroscope, capturing 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50 Hz.

  • The dataset was randomly partitioned into two sets: 70% of the volunteers were selected to generate the training data and the remaining 30% the test data.

  • The experiments were video-recorded so that the data could be labeled manually.
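
The bullets above boil down to a six-class supervised classification task on sensor-derived features. As a point of reference, below is a minimal sketch of how such a model could be trained and evaluated. It assumes the standard UCI HAR text-file layout (X_train.txt, y_train.txt, and so on); the local paths, the use of scikit-learn, and the random-forest model are illustrative assumptions, not necessarily what this repository uses.

```python
# Minimal sketch (assumptions): pre-extracted feature matrices in the standard
# UCI HAR text files, scikit-learn, and a random forest as the multiclass model.
# Paths and model choice are illustrative, not this repository's actual code.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, classification_report

# Label order follows the dataset's activity_labels.txt mapping (1..6).
ACTIVITIES = ["WALKING", "WALKING_UPSTAIRS", "WALKING_DOWNSTAIRS",
              "SITTING", "STANDING", "LAYING"]

# Whitespace-separated feature matrices and integer labels (hypothetical local paths).
X_train = np.loadtxt("UCI_HAR_Dataset/train/X_train.txt")
y_train = np.loadtxt("UCI_HAR_Dataset/train/y_train.txt", dtype=int)
X_test = np.loadtxt("UCI_HAR_Dataset/test/X_test.txt")
y_test = np.loadtxt("UCI_HAR_Dataset/test/y_test.txt", dtype=int)

# Train on the 70% volunteer split, evaluate on the held-out 30% split.
clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

print("Accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred, target_names=ACTIVITIES))
```

Splitting by volunteer, as described above, keeps all of a given participant's recordings in the same set, so test accuracy reflects generalization to unseen people rather than to unseen segments of already-seen people.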

Problem description from a data science perspective

This is a supervised multiclass classification problem: given features derived from the accelerometer and gyroscope signals, predict which of the six activity labels was being performed, training on the 70% participant split and evaluating on the held-out 30%.

Solution description
