Google Cloud Run runs stateless containers, serverlessly

Google Cloud Run lets you deploy stateless, HTTP-invocable containers to a managed compute service or to Google Kubernetes Engine

Paul Krill, April 10, 2019

Google has expanded its serverless compute options with the addition of Cloud Run, a managed compute service that lets you run stateless containers that are invocable via HTTP requests. Cloud Run is also available on Google Kubernetes Engine (GKE), allowing you to run containerized HTTP workloads on a managed Kubernetes cluster.
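To make "HTTP-invocable container" concrete, here is a minimal sketch of a stateless HTTP service in Python. Cloud Run passes the port the container should listen on through the `PORT` environment variable; the 8080 fallback here is just a convenience for local runs, and the handler body is illustrative:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    """Answers every GET with a plain-text 200; Cloud Run routes HTTP requests here."""

    def do_GET(self):
        body = b"Hello from Cloud Run"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def make_server():
    # Cloud Run supplies the listening port in the PORT env var;
    # 8080 is a common fallback for running the container locally.
    port = int(os.environ.get("PORT", "8080"))
    return HTTPServer(("", port), Handler)


# In the container's entrypoint: make_server().serve_forever()
```

Because the service holds no local state between requests, Cloud Run is free to start, stop, and replicate instances of it as traffic demands.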

Cloud Run lets developers combine the portability of containers with the velocity of serverless computing. Now in beta, Cloud Run automatically provisions and scales workloads, and users pay only for the resources their containers actually use. On GKE, Cloud Run runs stateless HTTP workloads on existing Kubernetes clusters, giving users access to custom machine types and Google Compute Engine networks, and letting those workloads run side by side with others in the same cluster.

Cloud Run allows developers to build applications in any language, using the tools and dependencies of their choice. Cloud Run is based on Knative, an open API and software layer that lets users move “serverless” workloads across Kubernetes platforms including Google Cloud Platform, GKE, and anywhere else that Kubernetes runs.
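That portability rests on the Knative Service resource. As an illustration, a minimal Knative Service manifest might look like the following; the service name and image path are placeholders, and the exact `apiVersion` depends on the Knative release in use:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
        - image: gcr.io/my-project/hello
```

Because both the fully managed Cloud Run service and Cloud Run on GKE speak this same API, the same manifest can describe a workload in either environment, or on any other Knative-compatible Kubernetes cluster.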

Key features of Cloud Run include:

  • A command-line interface and web console for deploying and managing services.
  • Autoscaling from zero to N instances based on traffic; when running on GKE, autoscaling is bounded by the capacity of the cluster.
  • Support for any language, operating-system library, or binary of the user's choice.
  • Compatibility with standard container workflows and tools, including Cloud Build, Container Registry, and Docker.
  • Built-in redundancy: services are regional and automatically replicated across multiple zones.
  • Integrated Stackdriver monitoring, logging, and error reporting.
  • The ability to map services to custom domains.
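Strung together, a typical workflow with the gcloud CLI might look like the sketch below. The project ID, service name, region, and domain are all placeholders, and during the beta the Cloud Run commands live under the `gcloud beta` group:

```shell
# Build the container image with Cloud Build and push it to Container Registry.
gcloud builds submit --tag gcr.io/my-project/hello

# Deploy the image to the fully managed Cloud Run service.
gcloud beta run deploy hello \
  --image gcr.io/my-project/hello \
  --region us-central1 \
  --allow-unauthenticated

# Map the deployed service to a custom domain (the domain must be verified first).
gcloud beta run domain-mappings create \
  --service hello --domain example.com
```

On success, the deploy step prints the service's auto-generated HTTPS URL, which serves traffic immediately; the domain mapping is optional.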

Cloud Run shares core infrastructure with two other serverless technologies at Google, Google Cloud Functions and Google App Engine. You can go to the Google Cloud Platform website to start a free trial.
