The Awwvision lab uses Kubernetes and the Cloud Vision API to demonstrate how to classify (label) images from Reddit's /r/aww subreddit and display the labeled results in a web app.

Awwvision has three components:

  1. A simple Redis instance.
  2. A web app that displays the labels and associated images.
  3. A worker that handles scraping Reddit for images and classifying them using the Vision API. Cloud Pub/Sub is used to coordinate tasks between multiple worker instances.
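For reference, the heart of the worker's classification step is a single label-detection call to the Vision API. The snippet below is a minimal sketch of that idea using the google-cloud-vision Python client; it is not the actual worker code from the lab's repository, and the example image URL is made up.

# Minimal sketch of Vision API label detection (not the lab's worker code).
# Assumes the google-cloud-vision client library is installed and the
# environment is already authenticated.
from google.cloud import vision

def label_image(image_uri):
    """Return (description, score) pairs for labels detected in the image."""
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri=image_uri))
    response = client.label_detection(image=image)
    return [(label.description, label.score) for label in response.label_annotations]

if __name__ == '__main__':
    # Hypothetical example; the worker labels image URLs scraped from /r/aww.
    for description, score in label_image('https://example.com/cat.jpg'):
        print('%s: %.2f' % (description, score))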

Codelab-at-a-conference setup

If you see a "request account" button at the top of the main Codelabs window, click it to obtain a temporary account. Otherwise, ask one of the staff for a coupon with a username and password.

These temporary accounts have existing projects with billing already set up, so running this codelab costs you nothing.

Note that all these accounts will be disabled soon after the codelab is over.

Use these credentials to log into the machine or to open a new Google Cloud Console window at https://console.cloud.google.com/. Accept the new account's Terms of Service and any updates to the Terms of Service.

Here's what you should see once logged in:

When presented with this console landing page, please select the only project available. Alternatively, from the console home page, click on "Select a Project":

Activate Google Cloud Shell

From the GCP Console click the Cloud Shell icon on the top right toolbar:

Then click "Start Cloud Shell":

It should only take a few moments to provision and connect to the environment:

This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory, and runs on the Google Cloud, greatly enhancing network performance and authentication. Much, if not all, of your work in this lab can be done simply with a browser or your Google Chromebook.

Once connected to Cloud Shell, you should see that you are already authenticated and that the project is already set to your PROJECT_ID.

Run the following command in Cloud Shell to confirm that you are authenticated:

gcloud auth list

Command output

Credentialed accounts:
 - <myaccount>@<mydomain>.com (active)

You can list the project ID with this command:

gcloud config list project

Command output

[core]
project = <PROJECT_ID>

If the project is not set correctly, you can set it with this command:

gcloud config set project <PROJECT_ID>

Command output

Updated property [core/project].

Enable the Vision API by visiting the APIs & Services Dashboard in the Cloud Console:

In this lab you will use gcloud, Google Cloud Platform's command-line tool, to set up a Container Engine Kubernetes cluster. You can specify as many nodes as you want, but you need at least one. The cloud-platform scope is specified so that the cluster can access the Pub/Sub and Vision APIs.

In Cloud Shell, run the following to set the default zone to us-central1-f, where the cluster will be created:

gcloud config set compute/zone us-central1-f

Then start up the cluster by running:

gcloud container clusters create awwvision \
    --num-nodes 2 \
    --scopes cloud-platform

Run the following to fetch the cluster's credentials, so kubectl can use them:

gcloud container clusters get-credentials awwvision

Verify that everything is working using the kubectl command-line tool:

kubectl cluster-info

Now get the sample code by cloning the cloud-vision repository:

git clone https://github.com/GoogleCloudPlatform/cloud-vision

In Cloud Shell, change to the python/awwvision directory in the cloned cloud-vision repo:

cd cloud-vision/python/awwvision

Once in the awwvision directory, run make all to build and deploy everything:

make all

As part of the process, Docker images are built and uploaded to Google Container Registry, a private container registry. In addition, yaml files are generated from templates, filled in with information specific to your project, and used to deploy the 'redis', 'webapp', and 'worker' Kubernetes resources for the lab.

After you've deployed, check that the Kubernetes resources are up and running.

First, list the pods by running:

kubectl get pods

You should see something like the following, though your pod names will be different. Make sure all of your pods have a status of Running before executing the next command.

NAME                     READY     STATUS    RESTARTS   AGE
awwvision-webapp-vwmr1   1/1       Running   0          1m
awwvision-worker-oz6xn   1/1       Running   0          1m
awwvision-worker-qc0b0   1/1       Running   0          1m
awwvision-worker-xpe53   1/1       Running   0          1m
redis-master-rpap8       1/1       Running   0          2m

Next, list the deployments by running:

kubectl get deployments -o wide

You can see the number of replicas specified for each, and the images used.

NAME               DESIRED   CURRENT   UP-TO-DATE   AVAILABLE   AGE       CONTAINERS         IMAGES                                SELECTOR
awwvision-webapp   1         1         1            1           1m        awwvision-webapp   gcr.io/your-project/awwvision-webapp   app=awwvision,role=frontend
awwvision-worker   3         3         3            3           1m        awwvision-worker   gcr.io/your-project/awwvision-worker   app=awwvision,role=worker
redis-master       1         1         1            1           1m        redis-master       redis                                 app=redis,role=master

Once deployed, get the external IP address of the webapp service by running:

kubectl get svc awwvision-webapp

It may take a few minutes for the assigned external IP to be listed in the output. You should see something like the following, though your IPs will be different.

NAME               CLUSTER_IP      EXTERNAL_IP    PORT(S)   SELECTOR                      AGE
awwvision-webapp   10.163.250.49   23.236.61.91   80/TCP    app=awwvision,role=frontend   13m

Copy and paste the external IP of the awwvision-webapp service into a new browser window to open the webapp, then click the Start the Crawler button.

Next, click go back, and you should start to see images from the /r/aww subreddit classified by the labels provided by the Vision API. You will see some of the images classified multiple times, when multiple labels are detected for them. (You can reload the page in a bit, in case you brought it up before the crawler finished.)
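If you are wondering why one image can show up under several labels, it comes down to how results are stored: conceptually, each detected label maps to a set of image URLs. The snippet below is a small illustrative sketch of that idea using the redis-py client; the host name and key layout are assumptions for illustration, not the repository's actual schema.

# Illustrative sketch: map each Vision API label to a set of image URLs in Redis.
# The 'redis-master' host and 'label:' key prefix are hypothetical.
import redis

r = redis.Redis(host='redis-master', port=6379)

def store_labels(image_url, labels):
    # An image is added to one set per detected label, so it can appear
    # under several labels in the webapp.
    for label in labels:
        r.sadd('label:' + label, image_url)

def images_for_label(label):
    return [url.decode() for url in r.smembers('label:' + label)]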

Your results will look something like this: