Ong, Kian Shon (2022) Music by actions: a music recommender based on activity recognition. Final Year Project, UTAR.
Abstract
Recommendation systems are widely used for personalized movie, music, and product suggestions, typically via collaborative filtering. Current music recommenders suggest music based on listening history and similar genres, and are therefore neither ambient-aware nor actor-aware. This project proposes HitMe, a music recommendation system that suggests songs based on the user's real-time activity. For example, HitMe recommends high-tempo songs to a user running on a treadmill and slow-paced songs to a user relaxing on a couch. First, we build a CNN-LSTM model for indoor activity recognition using a custom activity dataset, which contains pre-processed activity videos drawn from the HMDB51, UCF-101, STAIR Actions, and kinetics-downloader datasets. We train the CNN-LSTM for multiclass classification of nine actions: "Biking", "ComputerWork", "Driving", "Eat&Drink", "PlayInstrument", "Sport", "Studying", "Walking", and "Writing". Before classification, we use VGG16 to extract video features that feed the CNN-LSTM. Early results show that the HitMe model scores 0.6507 accuracy on indoor activity recognition. Besides the activity recognition model, we also implement a content-based music recommender using the Spotify API; it recommends a list of tracks based on user preferences, such as favourite artist and song, and on the activity predicted by the activity recognition model. In a nutshell, the final results show that HitMe is able to obtain the user's context information and create a recommended playlist in Spotify.
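As a rough illustration of the recognition pipeline the abstract describes (VGG16 per-frame features fed into an LSTM for nine-way classification), the following is a minimal Keras sketch. The sequence length, layer sizes, and the offline feature-extraction arrangement are assumptions for illustration, not the project's actual configuration.

```python
# Minimal sketch of a VGG16 + LSTM activity classifier (assumed configuration).
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input
from tensorflow.keras.layers import Input, LSTM, Dense, Dropout
from tensorflow.keras.models import Sequential

SEQ_LEN = 20      # frames sampled per clip (assumed)
NUM_CLASSES = 9   # the nine indoor activities from the abstract

# Frozen VGG16 (ImageNet weights) used as a per-frame feature extractor;
# global average pooling yields a 512-dim vector per frame.
vgg = VGG16(weights="imagenet", include_top=False, pooling="avg",
            input_shape=(224, 224, 3))
vgg.trainable = False

def extract_features(frames):
    """frames: (SEQ_LEN, 224, 224, 3) array -> (SEQ_LEN, 512) feature sequence."""
    return vgg.predict(preprocess_input(frames.astype("float32")), verbose=0)

# LSTM classifier over the per-frame feature sequence.
model = Sequential([
    Input(shape=(SEQ_LEN, 512)),
    LSTM(128),
    Dropout(0.5),
    Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
```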
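The recommender side can likewise be sketched with the spotipy client for the Spotify Web API, seeding recommendations with a favourite artist and track and biasing tempo by the predicted activity. The activity-to-tempo mapping below is purely illustrative; the abstract does not specify the project's actual mapping or endpoint usage.

```python
# Minimal sketch of a content-based recommender via spotipy (illustrative only).
import spotipy
from spotipy.oauth2 import SpotifyOAuth

# Hypothetical mapping: higher-energy activities -> higher target tempo (BPM).
ACTIVITY_TEMPO = {"Sport": 150, "Biking": 140, "Walking": 110,
                  "ComputerWork": 90, "Studying": 70, "Eat&Drink": 80}

# Credentials are read from the standard SPOTIPY_* environment variables.
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-modify-private"))

def recommend_playlist(user_id, seed_artist_id, seed_track_id, activity):
    """Create a private playlist of tracks matching the predicted activity."""
    rec = sp.recommendations(seed_artists=[seed_artist_id],
                             seed_tracks=[seed_track_id],
                             target_tempo=ACTIVITY_TEMPO.get(activity, 100),
                             limit=20)
    playlist = sp.user_playlist_create(user_id, f"HitMe - {activity}",
                                       public=False)
    sp.playlist_add_items(playlist["id"], [t["uri"] for t in rec["tracks"]])
    return playlist["external_urls"]["spotify"]
```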