Cloud Run allows you to run stateless containers in a fully managed environment. It is built from open-source Knative, letting you choose to run your containers either fully managed with Cloud Run, or in your Google Kubernetes Engine cluster with Cloud Run for Anthos.
Events for Cloud Run makes it easy to connect Cloud Run services with events from a variety of sources. It allows you to build event-driven architectures in which microservices are loosely coupled and distributed. It also takes care of event ingestion, delivery, security, authorization, and error-handling for you which improves developer agility and application resilience.
In this codelab, you will learn about Events for Cloud Run. More specifically, you will listen to events from Cloud Pub/Sub and Audit Logs.
The long-term vision is to be able to deliver events from various sources to Google Cloud sinks and custom sinks.
Google Cloud sources
Event sources that are Google Cloud-owned products
Google sources
Event sources that are Google-owned products such as Gmail, Hangouts, Android Management and more
Custom sources
Event sources that are not Google-owned products and are created by end users themselves
3rd party sources
Event sources that are neither Google-owned nor customer-produced. This includes popular event sources such as GitHub, SAP, Datadog, PagerDuty, etc. that are owned and maintained by 3rd party providers, partners, or OSS communities.
Events are normalized to CloudEvents v1.0 format for cross-service interoperability. CloudEvents is a vendor-neutral open spec describing event data in common formats, enabling interoperability across services, platforms and systems.
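To illustrate the format, a CloudEvent v1.0 carries a small set of context attributes (specversion, type, source, id, time) alongside the event payload. The values below are invented placeholders for a Pub/Sub-sourced event, not output captured from a real delivery:

```json
{
  "specversion": "1.0",
  "type": "com.google.cloud.pubsub.topic.publish",
  "source": "//pubsub.googleapis.com/projects/my-project/topics/my-topic",
  "id": "1234567890",
  "time": "2020-01-01T12:00:00Z",
  "datacontenttype": "application/json",
  "data": {
    "message": "..."
  }
}
```

Because every source is normalized to this envelope, a sink can handle events from any producer with the same parsing logic.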
This preview is the first version which delivers an initial set of the long-term functionality.
You can receive events from Google Cloud sources and from custom applications publishing to Cloud Pub/Sub, and deliver them to Cloud Run sinks.
Events from a breadth of Google Cloud sources are delivered by way of Cloud Audit Logs. The latency and availability of event delivery from these sources are tied to those of Cloud Audit Logs. Whenever an event from a Google Cloud source is fired, a corresponding Cloud Audit Log entry is created.
Custom applications can publish messages, in any format, to a Pub/Sub topic they specify.
The underlying delivery mechanism in Events for Cloud Run is Cloud Pub/Sub. The topics and subscriptions created to facilitate event delivery are visible in the Pub/Sub instance of the customer's project.
Event triggers are the filtering mechanism that specifies which events are delivered to which sink. A trigger includes the event type, the target Cloud Run service, and type-specific filter parameters.
All events are delivered in the CloudEvents v1.0 format for cross-service interoperability.
If you see a "request account button" at the top of the main Codelabs window, click it to obtain a temporary account. Otherwise ask one of the staff for a coupon with username/password.
These temporary accounts have existing projects that are set up with billing so that there are no costs associated for you with running this codelab.
Note that all these accounts will be disabled soon after the codelab is over.
Use these credentials to log into the machine or to open a new Google Cloud Console window https://console.cloud.google.com/. Accept the new account Terms of Service and any updates to Terms of Service.
Here's what you should see once logged in:
When presented with this console landing page, please select the only project available. Alternatively, from the console home page, click on "Select a Project":
While Google Cloud can be operated remotely from your laptop, in this codelab you will be using Google Cloud Shell, a command line environment running in the Cloud.
From the GCP Console click the Cloud Shell icon on the top right toolbar:
It should only take a few moments to provision and connect to the environment. When it is finished, you should see something like this:
This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory and runs on Google Cloud, greatly enhancing network performance and authentication. All of your work in this lab can be done with just a browser.
Inside Cloud Shell, make sure that your project ID is set:
gcloud config set project [YOUR-PROJECT-ID]
Check that the alpha component of gcloud is installed:
gcloud components install alpha
Enable all necessary services:
gcloud services enable run.googleapis.com
gcloud services enable logging.googleapis.com
gcloud services enable cloudbuild.googleapis.com
You also need to enable Data Access Audit Logs (Admin read, Data read, Data write) for all Google Cloud services from which you intend to receive events. In a later step, we will show you how to enable audit logs for Google Cloud Storage.
Cloud Run sinks, Events for Cloud Run, and Google Cloud sources must all be in the same region for event delivery to be guaranteed. Events for Cloud Run is currently available in the following GCP regions:
In Cloud Shell, set the Cloud Run region to one of the supported regions and platform to managed:
gcloud config set run/region europe-west1
gcloud config set run/platform managed
You can check that the configuration is set:
gcloud config list
...
[run]
platform = managed
region = europe-west1
You can discover what the registered sources are, the types of events they can emit, and how to configure triggers in order to consume them.
To see the list of different types of events:
gcloud alpha events types list

TYPE                                   SOURCE                DESCRIPTION
com.google.cloud.auditlog.event        CloudAuditLogsSource  This event is sent when a cloud audit log is emitted.
com.google.cloud.pubsub.topic.publish  CloudPubSubSource     This event is sent when a message is published to a Cloud Pub/Sub topic.
To get more information about each event type:
gcloud alpha events types describe com.google.cloud.pubsub.topic.publish
As an event sink, deploy a Cloud Run service that logs the contents of the CloudEvent it receives.
Clone a repository with samples in different languages (Node.js, Go, Java, Python, C#):
git clone https://github.com/gcpevents/eventsforcloudrun.git
You can build and deploy whichever sample you prefer. As an example, take a look at the Node.js sample in the eventsforcloudrun/node folder. It has 3 files:
index.js: Creates a basic web server that listens on the port defined by the PORT environment variable and logs the body of the incoming HTTP request.
package.json: Contains a start script command and a dependency on the Express web application framework.
Dockerfile: Container image of the app.
Build your container image using Cloud Build. Run the following from the directory containing the Dockerfile:
gcloud builds submit --tag gcr.io/$(gcloud config get-value project)/helloworld
Cloud Build builds the container image and pushes it to Container Registry, all in one command. Once pushed, you can check that the image is in the registry by listing all the container images associated with your project:
gcloud container images list
Deploy your containerized application to Cloud Run:
export SERVICE_NAME=helloworld-events
gcloud run deploy $SERVICE_NAME \
  --image gcr.io/$(gcloud config get-value project)/helloworld \
  --allow-unauthenticated
On success, the command line displays the service URL, for instance:
Service [helloworld-events] revision [helloworld-events-00001] has been deployed and is serving traffic at [SERVICE URL]
You can now visit your deployed container by opening the service URL in any browser window.
One way of receiving events is through Cloud Pub/Sub. Custom applications can publish messages to Cloud Pub/Sub and these messages can be delivered to Google Cloud Run sinks via Events for Cloud Run.
First, create a Cloud Pub/Sub topic. You can replace TOPIC_ID with a unique name you prefer:
TOPIC_ID=cr-topic
gcloud pubsub topics create $TOPIC_ID
Before creating the trigger, get more details on the parameters you'll need to construct a trigger for events from Cloud Pub/Sub:
gcloud alpha events types describe com.google.cloud.pubsub.topic.publish
Create a trigger to filter events published to the Cloud Pub/Sub topic to our deployed Cloud Run service:
gcloud alpha events triggers create trigger-pubsub \
  --target-service $SERVICE_NAME \
  --type com.google.cloud.pubsub.topic.publish \
  --parameters topic=$TOPIC_ID
You can check that the trigger is created by listing all triggers:
gcloud alpha events triggers list
You might need to wait for up to 10 minutes for the trigger creation to be propagated and for it to begin filtering events.
In order to simulate a custom application sending a message, you can use gcloud to fire an event:
gcloud pubsub topics publish $TOPIC_ID --message="Hello there"
The Cloud Run sink we created logs the body of the incoming message. You can view this in the Logs section of your Cloud Run instance:
Note that the message body will be base64-encoded, since that is how Pub/Sub transmits it; you will have to decode it to see the original message.
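For example, the "Hello there" message arrives as the base64 string SGVsbG8gdGhlcmU= inside the event's data field, which you can decode directly in Cloud Shell:

```shell
# Decode the base64-encoded Pub/Sub message data back into plain text
echo "SGVsbG8gdGhlcmU=" | base64 --decode
```

This prints the original "Hello there" message that was published to the topic.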
Optionally, you can delete the trigger once done testing.
gcloud alpha events triggers delete trigger-pubsub
You will set up a trigger to listen for events from Audit Logs. More specifically, you will look for Cloud Storage events in Audit Logs.
First, create a Cloud Storage bucket in the same region as the deployed Cloud Run service. You can replace BUCKET_NAME with a unique name you prefer:
export BUCKET_NAME=cr-bucket
gsutil mb -p $(gcloud config get-value project) \
  -l $(gcloud config get-value run/region) \
  gs://$BUCKET_NAME/
In order to receive events from a service, you need to enable audit logs for it. From the Cloud Console, select IAM & Admin > Audit Logs from the upper left-hand menu. In the list of services, check Google Cloud Storage:
On the right-hand side, make sure Admin Read, Data Read, and Data Write are selected. Click Save:
To identify the parameters you'll need when setting up the trigger, first perform the operation you want to listen for.
For example, create a random text file and upload it to the bucket:
echo "Hello World" > random.txt
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
Now, let's see what kind of audit log this update generated. From the Cloud Console, select Logging > Logs Viewer from the upper left-hand menu. In the Query Builder, choose GCS Bucket, then choose your bucket and its location, and run the query.
Once you run the query, you'll see logs for the storage bucket; one of them should be the storage.objects.create entry. Notice its serviceName, methodName, and resourceName fields. We'll use these in creating the trigger.
You are now ready to create an event trigger for Audit Logs.
You can get more details on the parameters you'll need to construct the trigger:
gcloud alpha events types describe com.google.cloud.auditlog.event
Create the trigger with the right filters:
gcloud alpha events triggers create trigger-auditlog \
  --target-service $SERVICE_NAME \
  --type com.google.cloud.auditlog.event \
  --parameters serviceName=storage.googleapis.com \
  --parameters methodName=storage.objects.create
List all triggers to confirm that trigger was successfully created:
gcloud alpha events triggers list
Wait for up to 10 minutes for the trigger creation to be propagated and for it to begin filtering events. Once ready, it will filter create events and send them to the service.
You're now ready to fire an event.
Upload the same file to the Cloud Storage bucket as you did earlier:
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
If you check the logs of the Cloud Run service in Cloud Console, you should see the received event:
Optionally, you can delete the trigger once done testing:
gcloud alpha events triggers delete trigger-auditlog
Congratulations for completing the codelab.