Welcome to “Installing TensorFlow with Object Detection API”. In this post we will install TensorFlow and its Object Detection API using Anaconda.

Some time ago, we ran into many issues trying to do the same thing without Anaconda on Windows. Because of that, we chose Anaconda, which makes the process easy and clean. So, let’s begin.

Requirements

  • Anaconda with Python 3 (download here)

Preparation

First of all, we have to create a new conda environment.

This way, our project will live in its own Python installation, and when we’re done we can safely remove it and leave our workstation clean.

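A sketch of the command, assuming we call the environment tensorflow (pick any name you like):

    conda create -n tensorflow python=3.6 anaconda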

Note that we have to use Python 3.6, because TensorFlow doesn’t yet work with the latest Python version.

Note also that it’s a good idea to install the anaconda package, since it bundles lots of packages that can be useful when working with Object Detection.

Installation

Now, let’s install TensorFlow in our conda environment:

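A minimal sketch, assuming the environment name from above and pip as the installer (conda install tensorflow also works):

    activate tensorflow
    pip install tensorflow

On Linux/macOS, use source activate tensorflow instead of activate.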

Since TensorFlow doesn’t include its models when installed through pip or conda, we have to add them manually to our installation folder.

If you work with the default paths on Windows, you should find the installation folder at:

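Assuming a default per-user Anaconda install and the environment name from above, the path should look like this (replace <username> with your own user folder):

    C:\Users\<username>\Anaconda3\envs\tensorflow\Lib\site-packages\tensorflow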

So, to add the models we have to clone the tensorflow/models repo inside the installation folder.

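From inside that installation folder:

    git clone https://github.com/tensorflow/models.git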

Once we have the repo cloned, we must compile the protobuf libraries. The command must be run from tensorflow/models/research/.

If you are on Windows, you have to use this command, because the Windows shell doesn’t expand the * wildcard for protoc:

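A sketch, assuming protoc is on your PATH (use %%i instead of %i if you put this in a batch file):

    for /f %i in ('dir /b object_detection\protos\*.proto') do protoc object_detection\protos\%i --python_out=.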

If you are on Linux or macOS, then use this command:

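Again from tensorflow/models/research/:

    protoc object_detection/protos/*.proto --python_out=.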

If you are running locally, the following step is required: you have to add the tensorflow/models directories to your PYTHONPATH with this command.

Windows:

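A sketch, with <path_to_models> standing in for wherever you cloned the repo:

    set PYTHONPATH=%PYTHONPATH%;<path_to_models>;<path_to_models>\research;<path_to_models>\research\slim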

Linux/macOS:

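Assuming you run it from tensorflow/models/research/:

    export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim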

Validation

Finally, we want to make sure everything went fine.

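The cloned repo ships a test script for exactly this; run it from tensorflow/models/research/:

    python object_detection/builders/model_builder_test.py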
If everything is set up correctly, the tests finish with an OK: the Object Detection API is ready.

Good news! Our environment is now ready to work with TensorFlow and Object Detection.

In the next chapter, we will talk about preparing custom data to train models.