numsection Keras

Keras is a Python library that provides a clean and convenient way to create a range of deep learning models on top of powerful libraries such as TensorFlow, Theano or CNTK. Keras was developed and is maintained by François Chollet, a Google engineer, and it is released under the permissive MIT license.
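
As a quick illustration of that convenience, here is a minimal sketch (illustrative only, not part of the lab code) of how a small fully-connected classifier can be defined with the Sequential API:

#Minimal sketch (illustrative only): a two-layer classifier with the Sequential API
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))  #hidden layer with ReLU
model.add(Dense(10, activation='softmax'))                    #output layer for 10 classes
model.summary()  #prints the layer stack and parameter counts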

Below are the tasks of this lab session. If you don’t finish all of them during the session, please read the last task before leaving the classroom.


task: Update the docker image:
#Download the docker image
docker pull jorditorresbcn/dl:latest
task: Create a container:
#Create a container
docker run -it -p 8888:8888 -p 6006:6006 jorditorresbcn/dl:latest

This container has:

  • Port forwarding - 8888 (for Jupyter)
  • Port forwarding - 6006 (for TensorBoard)
task: Clone the course repository:
cd /app/
git clone https://github.com/jorditorresBCN/dlaimet.git
task: Run the Jupyter Notebook server:
#Inside the container
jupyter notebook --ip=0.0.0.0 --allow-root

On your computer, open your browser and go to http://localhost:8888; the password is aidl.

If you are on Windows and you are experiencing connectivity issues, please check THIS.

task: Run your first Keras program

First of all, using your browser with jupyter, open the Keras examples folder and locate the mnist-keras-book file. Try to run all the blocks in order to check your Keras installation.

The output should be something like this (you can stop it at any time):

Using TensorFlow backend.
Downloading data from
https://s3.amazonaws.com/img-datasets/mnist.npz
   8192/11490434 [.] - ETA: 164 ...
60000 train samples
10000 test samples
Train on 60000 samples, validate on 10000 samples
Epoch 1/12
128/60000 [.] - ETA: 102s - loss: 2.2928 - acc: 0.0938
...
task: Analyzing the code

Using your browser with jupyter, look for these parts in the code (a generic sketch of a typical Keras MNIST program follows this list, for orientation):

  1. Identify how Keras reads the data.
  2. Identify where the neural net definition is.
  3. Which layers are used in this net? Which activation functions are used?
  4. Identify the loss and optimizer functions.
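
As an orientation aid, here is a minimal sketch of the typical structure of a Keras MNIST script (an assumption about the general shape, not the notebook's exact code); the comments map each part to the questions above:

#Illustrative sketch only, not the notebook's exact code
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Dropout

#1. Reading the data: Keras downloads MNIST and returns NumPy arrays
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.reshape(60000, 784).astype('float32') / 255
x_test = x_test.reshape(10000, 784).astype('float32') / 255
y_train = keras.utils.to_categorical(y_train, 10)
y_test = keras.utils.to_categorical(y_test, 10)

#2-3. Network definition: the layers and their activation functions
model = Sequential()
model.add(Dense(512, activation='relu', input_shape=(784,)))
model.add(Dropout(0.2))
model.add(Dense(10, activation='softmax'))

#4. The loss and optimizer are set when the model is compiled
model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

#Training: batch size and number of epochs are hyperparameters of fit()
model.fit(x_train, y_train, batch_size=128, epochs=12, validation_data=(x_test, y_test))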

numsection TensorBoard

TensorBoard is a visualization tool included with TensorFlow. You can use TensorBoard to visualize your TensorFlow graph, plot quantitative metrics about the execution of your graph, and show additional data like images that pass through it. In this lab we will use it to visualize information about our Keras network.

The code contains the variables tensorboard_dir and tensorboard_active, which enable TensorBoard logging through the Keras callbacks. If you set tensorboard_active to True, Keras will save TensorBoard data to tensorboard_dir every epoch.
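
A possible sketch of how these two variables could be wired to the Keras TensorBoard callback (the variable names come from the notebook; the folder value and the surrounding code are assumptions, reusing the model and data from the MNIST sketch above):

from keras.callbacks import TensorBoard

tensorboard_dir = '/app/dlaimet/keras/logs/run1'  #hypothetical folder
tensorboard_active = True

callbacks = []
if tensorboard_active:
    #logs scalars such as loss and accuracy (and the graph) once per epoch
    callbacks.append(TensorBoard(log_dir=tensorboard_dir, histogram_freq=0, write_graph=True))

model.fit(x_train, y_train, batch_size=128, epochs=12,
          validation_data=(x_test, y_test), callbacks=callbacks)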

task: Run TensorBoard

Modify the tensorboard_dir value to point to a folder for saving the TensorBoard data and change the tensorboard_active value to True. Before running the script, clear the jupyter kernel (Kernel -> Restart and clear output).

Hint: You will need another terminal for running TensorBoard and Jupyter at the same time. Open a new terminal and then use these commands:

docker ps
docker exec -it YOUR_CONTAINER_ID /bin/bash
cd /app/dlaimet/keras
tensorboard --logdir=YOUR_TENSORBOARD_FOLDER
#OUTPUT
Starting TensorBoard 0.1.6 at http://localhost:6006
(Press CTRL+C to quit)

Go to http://localhost:6006 in your browser and TensorBoard will start. We recommend Google Chrome or Chromium in order to avoid compatibility and lag problems. You will see the TensorBoard dashboard.

You can run TensorBoard and Keras at the same time; TensorBoard will update the data every epoch.

task: How to use TensorBoard
TensorBoard can be a very useful tool during your DLAI project. Now you are ready to learn by yourself the features that could help you:

  1. Going through the jupyter notebook, modify some hyperparameters (batch size, number of epochs, learning rate), the optimizer, the loss function or the layers of the model (a sketch of such an experiment follows this list).
  2. Use TensorBoard and the logs to observe the new accuracy.
  3. Before running the script, clear the jupyter kernel (Kernel -> Restart and clear output) and modify the TensorBoard folder.
  4. Take a look at the TensorBoard docs to see how to merge different runs in one chart.
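
For example, a hypothetical experiment (a sketch reusing the model and data from the earlier sketches; the run and folder names are made up) could swap the optimizer and log each run to its own folder so the runs can be compared on one chart:

from keras.callbacks import TensorBoard
from keras.optimizers import SGD

batch_size = 64           #try a different batch size
epochs = 5                #fewer epochs for a quick experiment
optimizer = SGD(lr=0.01)  #SGD with an explicit learning rate instead of RMSprop

run_name = 'sgd_lr0.01_bs64'
tb = TensorBoard(log_dir='logs/' + run_name)

model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])
model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs,
          validation_data=(x_test, y_test), callbacks=[tb])

#then point TensorBoard at the parent folder to see every run in the same charts:
#tensorboard --logdir=logs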