Car make, model and year detection
Upload a car photo to identify its make, model, and the year in which it was manufactured.
Images uploaded are neither stored nor re-used for model training.
Car make and model detection using the Stanford Cars dataset, which was cleaned up and published on Kaggle:
https://www.kaggle.com/jutrera/stanford-car-dataset-by-classes-folder
The dataset consists of 16k+ images and has 196 distinct labels.
Architecture used: ResNeXt50_32x4d, trained with the fastai API.
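A minimal training sketch of what such a setup could look like, assuming fastai v2, torchvision’s resnext50_32x4d, and the Kaggle folder layout; the dataset path, transforms, epoch count, and export name are illustrative assumptions, not the exact notebook code:

```python
# Sketch only: path, transforms, epochs and export name are assumptions.
from fastai.vision.all import *
from torchvision.models import resnext50_32x4d

# Kaggle layout: car_data/train/<label>/ and car_data/test/<label>/
path = Path('stanford-car-dataset-by-classes-folder/car_data/car_data')
dls = ImageDataLoaders.from_folder(
    path, train='train', valid='test',
    item_tfms=Resize(460), batch_tfms=aug_transforms(size=224),
)
learn = vision_learner(dls, resnext50_32x4d, metrics=accuracy)
learn.fine_tune(5)
learn.export('export.pkl')  # exported model file later loaded by app.py
```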
Aakash Bakhle
Google one of the cars listed under car_names in the sidebar, download any image of it, and upload it for the model to predict.
https://caridentifier.azurewebsites.net/
P.S.: The link may not always be available, as the free tier does not support instances with higher RAM.
This service is not free and works only on S3, P1v2, or higher instance tiers. Pricing can be found here.
You need a free-tier or paid Microsoft Azure subscription at https://portal.azure.com/
Fill in the details. Create a new resource group with any name.
Select ‘Docker Container’ (Region, Linux Plan, and SKU Size are filled in by default) and click ‘Next’:
From the ‘Image Source’ dropdown, choose ‘Docker Hub’. After filling in everything, click ‘Review + create’:
Click ‘Create’. Deployment takes about 5 minutes. Once it is done, click ‘Go to Resource’.
Select ‘Configuration’ under ‘Settings’ in the left side pane:
Click ‘New Application Setting’
Fill in data as given below and click OK:
Click ‘Save’ and then click ‘Continue’ to let the service restart:
Head to the ‘Overview’ tab in the side pane and launch the URL:
Google one of the cars listed under car_names in the sidebar, download any image of it, and upload it for the model to predict.
Play around!
I assume you have Miniconda/Anaconda installed and your system has at least 2 GB of RAM.
Due to GitHub’s upload limits, the model file can be found here
Clone this repo and place the model file in the same folder.
conda create --name demo
conda activate demo
conda install pip
pip install -r requirements.txt
streamlit run app.py
Streamlit prints a localhost URL. Open it and follow steps 11 and 12 from the Azure Deployment section above.
stanford_car_model_dataset_fastai.ipynb:
Contains the main code to train the model. It is specific to Google Colab.
app.py:
Contains the Streamlit code to serve the model on localhost. A rough sketch of the prediction flow appears after this list.
labels.csv:
List of car names from the original dataset
Dockerfile:
The Dockerfile provided has already been built and pushed to Docker Hub.
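For reference, here is an illustrative sketch of the prediction flow in app.py, assuming a fastai v2 learner exported as ‘export.pkl’; the file name, widget layout, and messages are assumptions, not the author’s exact code:

```python
# Illustrative sketch only: 'export.pkl', widget layout and messages are assumptions.
import streamlit as st
from fastai.vision.all import load_learner, PILImage

learn = load_learner('export.pkl')          # model file placed next to app.py

st.title('Car make, model and year detection')
st.sidebar.header('car_names')
st.sidebar.write(list(learn.dls.vocab))     # the 196 labels shown in the sidebar

uploaded = st.file_uploader('Upload a car photo', type=['jpg', 'jpeg', 'png'])
if uploaded is not None:
    img = PILImage.create(uploaded.read())  # bytes -> fastai image
    st.image(img, use_column_width=True)
    pred, pred_idx, probs = learn.predict(img)
    st.write(f'Prediction: {pred} ({probs[pred_idx].item():.2%} confidence)')
```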