Ajob Recommender is my bachelor's thesis project at Shahid Beheshti University.
This video is a brief presentation of the project.
This image shows the project structure. Below is a brief description of each phase.
- Collecting Profile URLs
- Crawling Profiles' Data
- Data Preprocessing
- Building a Machine Learning Model
- Building a Website for the Model
In order to crawl profiles on LinkedIn, we first need to collect the URLs of the profiles. These URLs are then fed to the crawler.
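The exact collection method lives in its own repository; as a minimal sketch of the idea, here is how LinkedIn profile URLs (paths containing `/in/`) could be pulled out of a saved search-results page using only the standard library. The HTML snippet and class-free matching rule are illustrative assumptions, not the project's actual code.

```python
from html.parser import HTMLParser

class ProfileLinkCollector(HTMLParser):
    """Collects hrefs that look like LinkedIn profile URLs ("/in/<slug>")."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if "/in/" in href:
            # Strip query parameters so the same profile is not stored twice.
            self.urls.append(href.split("?")[0])

def collect_profile_urls(page_html):
    parser = ProfileLinkCollector()
    parser.feed(page_html)
    # Deduplicate while preserving order.
    return list(dict.fromkeys(parser.urls))

# Example on a saved search-results snippet:
html = '''
<a href="https://www.linkedin.com/in/jane-doe?miniProfile=1">Jane</a>
<a href="https://www.linkedin.com/in/jane-doe">Jane again</a>
<a href="https://www.linkedin.com/feed/">Feed</a>
'''
print(collect_profile_urls(html))  # → ['https://www.linkedin.com/in/jane-doe']
```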
To crawl the profiles' data, we use a crawler written in Python with Selenium. Its source code is here.
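The crawler itself drives a real browser with Selenium (see the linked source for the actual logic). Once a profile page is loaded and its `page_source` is available, the extraction step can be sketched as a pure parsing function like the one below. The CSS class names (`profile-headline`, `skill-name`) are hypothetical placeholders, not LinkedIn's real markup.

```python
from html.parser import HTMLParser

class ProfileParser(HTMLParser):
    """Pulls the headline and skill names out of a loaded profile page.
    The class names matched here are hypothetical placeholders."""
    def __init__(self):
        super().__init__()
        self._field = None
        self.profile = {"headline": None, "skills": []}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if "profile-headline" in cls:
            self._field = "headline"
        elif "skill-name" in cls:
            self._field = "skill"

    def handle_data(self, data):
        text = data.strip()
        if not text or self._field is None:
            return
        if self._field == "headline":
            self.profile["headline"] = text
        else:
            self.profile["skills"].append(text)
        self._field = None

def parse_profile(page_source):
    parser = ProfileParser()
    parser.feed(page_source)
    return parser.profile

page = '''
<h2 class="profile-headline">Data Scientist at Acme</h2>
<span class="skill-name">Python</span>
<span class="skill-name">SQL</span>
'''
print(parse_profile(page))
# → {'headline': 'Data Scientist at Acme', 'skills': ['Python', 'SQL']}
```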
Data preprocessing is one of the main challenges for data scientists. Before a machine learning model can be built, the dirty data from the crawl has to be shaped and transformed into clean data the model can use. This repository manages everything related to data preprocessing.
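The preprocessing repository holds the real pipeline; a minimal sketch of one typical cleaning step — normalizing a crawled skill list by trimming, lowercasing, collapsing spelling variants, and deduplicating — could look like this. The `synonyms` mapping is an illustrative assumption.

```python
def clean_skills(raw_skills, synonyms=None):
    """Normalize a raw skill list: lowercase, trim, collapse synonyms, dedupe."""
    synonyms = synonyms or {}
    cleaned = []
    for skill in raw_skills:
        s = skill.strip().lower()
        if not s:
            continue  # drop empty entries left over from the crawl
        s = synonyms.get(s, s)  # map spelling variants to one canonical name
        if s not in cleaned:
            cleaned.append(s)
    return cleaned

synonyms = {"ml": "machine learning", "py": "python"}
print(clean_skills(["Python ", "py", "", "ML", "machine learning"], synonyms))
# → ['python', 'machine learning']
```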
This phase is the ❤ of the project: a machine learning model that learns to predict a person's final job from their skills, education, and experience. All the other phases exist to support this part. The models used in this project are here.
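The actual models are in the linked repository; as a hedged, self-contained sketch of the core idea — recommending the job held by the most similar known profile — here is a tiny nearest-neighbour recommender over skill sets using Jaccard similarity. The training profiles are made-up examples, and the real project may use an entirely different algorithm and feature set.

```python
def jaccard(a, b):
    """Similarity between two skill sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def predict_job(skills, training_profiles):
    """Return the job title of the most similar training profile."""
    best = max(training_profiles, key=lambda p: jaccard(skills, p["skills"]))
    return best["job"]

training = [
    {"skills": ["python", "pandas", "machine learning"], "job": "data scientist"},
    {"skills": ["html", "css", "javascript"], "job": "front-end developer"},
]
print(predict_job(["python", "machine learning"], training))
# → data scientist
```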
The final phase is building a website (front end + back end) that lets us use the model. It consists of 3 services.
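The three services are described in their own repositories; as a framework-agnostic sketch, the back-end logic that connects a user's request to the model might look like the handler below. The request shape, field names, and the stand-in model are all hypothetical assumptions for illustration.

```python
import json

def recommend_handler(request_body, model):
    """Back-end endpoint logic: parse the JSON request, run the model,
    and build the JSON response. `model` is any callable skills -> job."""
    try:
        payload = json.loads(request_body)
        skills = payload["skills"]
    except (ValueError, KeyError):
        return 400, json.dumps({"error": "expected JSON body with a 'skills' list"})
    return 200, json.dumps({"recommended_job": model(skills)})

# A stand-in model so the handler can be exercised without the real one.
dummy_model = lambda skills: "data scientist" if "python" in skills else "unknown"
status, body = recommend_handler('{"skills": ["python", "sql"]}', dummy_model)
print(status, body)  # → 200 {"recommended_job": "data scientist"}
```

Keeping the handler a pure function of the request body makes it easy to test without starting a web server, whatever framework the real service uses.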
