In this blog post, we will install the TensorFlow machine learning library on Ubuntu 18.04 / Debian 9. If you need TensorFlow with GPU support, your Ubuntu 18.04 machine should have a dedicated CUDA-capable graphics card (i.e. NVIDIA), because the extra software installed for TensorFlow GPU is the CUDA Toolkit.
Install TensorFlow (CPU Only) on Ubuntu 18.04 LTS / Debian 9
To install TensorFlow (CPU only) on Ubuntu 18.04, use the version without GPU support, which requires Python 2.7 or Python 3.3+. Install Python and the required modules by running the following commands:
sudo apt update
sudo apt -y install python python-pip python-setuptools python-dev
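You can quickly confirm that the interpreter and pip are available before proceeding (the exact versions printed will depend on your system):
python --version
pip --version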
Then install TensorFlow using the pip Python package manager.
pip install --upgrade tensorflow requests
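To confirm which TensorFlow version pip installed, you can print it from Python:
python -c "import tensorflow as tf; print(tf.__version__)"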
If you have CUDA-enabled GPU cards, then you can install the GPU package.
pip install tensorflow-gpu
But don’t forget that GPU packages require a CUDA®-enabled GPU card.
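If you are unsure whether TensorFlow can actually see your GPU, you can list the devices it detects. This relies on an internal TensorFlow 1.x helper, so treat it as a quick diagnostic rather than a stable API:
python -c "from tensorflow.python.client import device_lib; print(device_lib.list_local_devices())"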
Verify that your TensorFlow installation is working:
python -c "import tensorflow as tf; tf.enable_eager_execution(); print(tf.reduce_sum(tf.random_normal([1000, 1000])))"
Output:
2018-12-19 00:53:36.272184: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
tf.Tensor(820.4219, shape=(), dtype=float32)
Running a test model:
mkdir ~/tensorflow_projects
cd ~/tensorflow_projects
git clone https://github.com/tensorflow/models.git
export PYTHONPATH="$PYTHONPATH:$(pwd)/models"
cd models/official/mnist
python mnist.py
Using TensorBoard
TensorBoard is a group of visualization tools that make it easier to understand, debug, and optimize TensorFlow programs. Use TensorBoard to visualize your TensorFlow graph, plot quantitative metrics about the execution of your graph, and show additional data like images that pass through it.
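Before TensorBoard has anything to show, you need event files in a log directory. The snippet below is a minimal sketch that uses the TensorFlow 1.x summary API to write a tiny graph into ~/tensor_logs, the directory used in the next step:
python - <<'EOF'
import os
import tensorflow as tf

# Build a tiny graph: sum = a + b
a = tf.constant(2.0, name="a")
b = tf.constant(3.0, name="b")
c = tf.add(a, b, name="sum")

# Write the graph definition to ~/tensor_logs so TensorBoard can display it
logdir = os.path.expanduser("~/tensor_logs")
writer = tf.summary.FileWriter(logdir, tf.get_default_graph())
writer.close()
EOF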
Start TensorBoard by running:
mkdir -p ~/tensor_logs
tensorboard --logdir=~/tensor_logs
When you run the tensorboard command, output like the one below will be printed on your screen.
TensorBoard 1.12.1 at http://ubuntu-01:6006 (Press CTRL+C to quit)
You can kill the TensorBoard process by pressing CTRL+C.
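If you plan to reach TensorBoard from another machine and UFW is enabled on the server, you will likely need to open port 6006 first (a sketch assuming UFW is your firewall):
sudo ufw allow 6006/tcp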
Note that by default, TensorFlow outputs are stored under the /tmp directory. When TensorBoard is fully configured, access it at http://[ServerHostname|IPAddress]:6006. The dashboard looks like this:
Running TensorFlow (CPU Only) in a Docker Container
You can also run TensorFlow in a Docker container. If you don’t have Docker Engine installed on Ubuntu 18.04, our guide should come in handy:
How to install Docker CE on Ubuntu / Debian / Fedora / Arch / CentOS
The TensorFlow Docker images are already configured to run TensorFlow. A Docker container runs in a virtual environment and is the easiest way to set up GPU support.
Download the TensorFlow Docker image:
docker pull tensorflow/tensorflow
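The command above pulls the default CPU build. Docker Hub also publishes variant tags, for example Python 3 and GPU builds; tag names can change over time, so confirm them on the tensorflow/tensorflow Docker Hub page:
docker pull tensorflow/tensorflow:latest-py3
docker pull tensorflow/tensorflow:latest-gpu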
Once downloaded, start a Jupyter notebook server by running:
docker run -it -p 8888:8888 tensorflow/tensorflow
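Jupyter prints a URL containing an access token in the container output; open it in your browser to reach the notebook server. To keep your notebooks on the host rather than inside the container, you can also mount a local directory. The /notebooks path below is the directory the TensorFlow 1.x images serve Jupyter from, so adjust it if your image uses a different location:
docker run -it -p 8888:8888 -v $HOME/tensorflow_projects:/notebooks/tensorflow_projects tensorflow/tensorflow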
If you just want to run a TensorFlow test, use:
docker run -it --rm tensorflow/tensorflow \
  python -c "import tensorflow as tf; tf.enable_eager_execution(); print(tf.reduce_sum(tf.random_normal([1000, 1000])))"
Read more about running TensorFlow in Docker on the official website.