Google Cloud Python Logging Library Improves Serverless Support With Version 3.0 Release

The Google Cloud Python logging library has been updated to version 3.0.0. The release includes improved support for Cloud Run and Cloud Functions, support for both string and JSON payloads, and automatic attachment of environment metadata to logs.

In version 3.0.0, the library uses GCP’s structured JSON logging feature on supported environments such as GKE, Cloud Run, and Cloud Functions. When the logging library detects that it is running in one of these environments, it uses the StructuredLogHandler. This handler writes logs to standard output as JSON strings, which GCP’s built-in agents decode and forward to Cloud Logging. In serverless environments, it is still possible to log over the network as before by manually configuring the library with a CloudLoggingHandler instance.
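
The sketch below illustrates this setup, assuming application default credentials; the log name and log level are illustrative. Calling the client’s setup_logging method attaches the handler appropriate for the environment, while the commented-out lines show how the network-based CloudLoggingHandler could still be attached manually.

import logging

import google.cloud.logging
from google.cloud.logging.handlers import CloudLoggingHandler, setup_logging

client = google.cloud.logging.Client()

# In a supported environment (GKE, Cloud Run, Cloud Functions) this attaches
# the StructuredLogHandler, which prints JSON log lines to standard output
# for the platform's logging agent to pick up.
client.setup_logging(log_level=logging.INFO)

# To keep the previous behaviour of sending logs over the network instead,
# attach a CloudLoggingHandler explicitly (the log name is illustrative):
# handler = CloudLoggingHandler(client, name="my-app")
# setup_logging(handler)

logging.info("application started")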

Previous versions of google-cloud-logging only provided this kind of support for App Engine and Kubernetes Engine. On serverless platforms such as Cloud Run and Cloud Functions, users reported that the library would occasionally drop logs. This happened because the library sent logs in batches over the network, and unsent batches could be lost when a serverless instance shut down.
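
For illustration, in those earlier network-based setups the batching could be avoided by configuring the handler with a synchronous transport, which sends each entry in its own API call. This is a minimal sketch, assuming the SyncTransport class from the library’s transports package and an illustrative log name, not a recommended default.

import logging

import google.cloud.logging
from google.cloud.logging.handlers import CloudLoggingHandler, setup_logging
from google.cloud.logging_v2.handlers.transports import SyncTransport

client = google.cloud.logging.Client()

# SyncTransport sends each log entry in its own API call instead of batching
# on a background thread, trading throughput for durability on shutdown.
handler = CloudLoggingHandler(client, name="my-app", transport=SyncTransport)
setup_logging(handler)

logging.warning("sent immediately, not batched")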

When troubleshooting an application, it is helpful to have as much information about the environment as possible in the logs. google-cloud-logging assists with this by recognizing the environment and appending metadata about it to each log message.


The logging library now detects and attaches details about the environment to each log message automatically. The supported fields now include the GCP resource the log originated from, information about the HTTP request in the log’s context, and the source location (file, line, and function name). While the library attempts to set this data automatically, the fields can also be set explicitly:

logging.info("hello", extra={
    "labels": {"foo": "bar"},
    "http_request": {"requestUrl": "localhost"},
    "trace": "01234"
})

In prior versions of the library, the Python standard library integration could only send logs with string payloads. This release adds the option to log JSON payloads in two ways. The first method passes the JSON data as a JSON-parsable string:

import logging
import json

data_dict = {"hello": "world"}
logging.info(json.dumps(data_dict))

The second method supplies the JSON data as a json_fields dictionary, using the standard logging module’s extra argument:

import logging

data_dict = {"hello": "world"}
logging.info("message field", extra={"json_fields": data_dict})

Among other improvements, a new Logger.log method attempts to infer the type of its argument and log it appropriately. As illustrated in the code example below, arguments to the log method now support a wider range of input formats:

import google.cloud.logging

# Assumes default credentials; the log name "example-log" is illustrative.
client = google.cloud.logging.Client()
logger = client.logger("example-log")

# lowercase severity strings will be accepted
logger.log("hello world", severity="warning")

# a severity will be pulled out of the JSON payload if not otherwise set
logger.log({"hello": "world", "severity": "warning"})

# resource data can be passed as a dict instead of a Resource object
logger.log("hello world", resource={"type": "global", "labels": {}})

For writing logs, the team recommends using the standard Python logging interface. For use cases such as reading logs or managing log sinks, however, google.cloud.logging can be used directly.
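
As a brief sketch of the direct-client path, the snippet below reads recent entries with the client’s list_entries method; the severity filter and the number of entries shown are illustrative.

from itertools import islice

import google.cloud.logging
from google.cloud.logging import DESCENDING

client = google.cloud.logging.Client()

# Fetch the most recent entries at WARNING severity or above, newest first.
entries = client.list_entries(filter_="severity>=WARNING", order_by=DESCENDING)
for entry in islice(entries, 10):
    print(entry.timestamp, entry.severity, entry.payload)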

With version 3.0.0, the Google Cloud Logging Python library supports more computing environments, detects more useful metadata, and provides more comprehensive support for JSON logs. Alongside these major changes are further user-experience improvements, such as the new log method and more permissive argument handling.

References:

  • https://cloud.google.com/blog/products/devops-sre/google-cloud-logging-python-client-library-v3-0-0-release
  • https://www.infoq.com/news/2022/02/gcp-python-logging/

Nitish is a computer science undergraduate with a keen interest in the field of deep learning. He has done various projects related to deep learning and closely follows new advancements taking place in the field.