How to inject custom log files into GCP Cloud Logging?

Chimbu Chinnadurai
4 min read · Jun 30, 2023


Google Cloud Platform (GCP) Cloud Logging is a powerful tool for collecting, storing, and analyzing log data from your applications and infrastructure. By default, however, Cloud Logging only collects logs from your VM instances and other GCP resources. If you want to analyze logs from custom sources, such as third-party applications, you must inject them into Cloud Logging yourself.

Our customers frequently ask how to inject custom log files from third parties, stored in Cloud Storage buckets, into Cloud Logging. While there is no built-in method to achieve this in GCP, it is possible to build a custom pipeline using Cloud Storage object change notifications, Pub/Sub, and Cloud Functions.

This blog post will show you how to deploy a custom solution to inject log files into Cloud Logging.


Prerequisites

  1. Enable the required APIs (Cloud Storage, Pub/Sub, Cloud Functions, and Cloud Logging) in your GCP project

  2. gcloud CLI

  3. git

Cloud Storage bucket and object change notification

  • Set up the necessary environment variables.
export PROJECT_ID="your-project-id"
export REGION="your-region" # ex: us-central1
export LOG_FILES_BUCKET_NAME="bucket-to-upload-log-files" #ex: custom-log-injection-demo
export PUB_SUB_TOPIC_NAME="log-file-bucket-object-change-notification"
export CLOUD_FUNCTION_NAME="log-injection-cf"
  • Create a Cloud Storage bucket to upload the log files.
gcloud storage buckets create gs://$LOG_FILES_BUCKET_NAME --location=$REGION --project=$PROJECT_ID
  • Set up a notification configuration on your bucket. Here we will set up a notification for the OBJECT_FINALIZE event, which is sent when a new object (or a new generation of an existing object) is successfully created in the bucket. This includes copying or rewriting an existing object.
gcloud storage buckets notifications create gs://$LOG_FILES_BUCKET_NAME --topic=$PUB_SUB_TOPIC_NAME --event-types=OBJECT_FINALIZE --project=$PROJECT_ID

A Pub/Sub topic is automatically created to receive the notifications. You can also use an existing topic if required.

$ gcloud storage buckets notifications create gs://$LOG_FILES_BUCKET_NAME --topic=$PUB_SUB_TOPIC_NAME --event-types=OBJECT_FINALIZE --project=$PROJECT_ID
etag: '1'
id: '1'
kind: storage#notification
payload_format: JSON_API_V1
topic: //
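Each notification the Cloud Function receives is a Pub/Sub message whose attributes identify the affected bucket and object. The sketch below shows the relevant attribute names from the Cloud Storage notification format; the attribute values are a hypothetical example matching this walkthrough.

```python
# A Cloud Storage OBJECT_FINALIZE notification arrives as a Pub/Sub
# message; the bucket and object names travel in its attributes.
# Example attribute values below are hypothetical.
SAMPLE_ATTRIBUTES = {
    "eventType": "OBJECT_FINALIZE",
    "bucketId": "custom-log-injection-demo",
    "objectId": "sample-logs.json",
    "payloadFormat": "JSON_API_V1",
}

def extract_object_ref(attributes: dict) -> tuple[str, str]:
    """Return (bucket, object) from a notification's attributes."""
    return attributes["bucketId"], attributes["objectId"]

bucket, obj = extract_object_ref(SAMPLE_ATTRIBUTES)
print(f"gs://{bucket}/{obj}")  # gs://custom-log-injection-demo/sample-logs.json
```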

Deploy the Cloud Function

A sample Python Cloud Function is available in the accompanying repository.

Clone the repository and run the following command to deploy the function. The Cloud Function subscribes to the Pub/Sub topic created in the previous step and is invoked whenever a new event is published to the topic.

gcloud functions deploy $CLOUD_FUNCTION_NAME \
--trigger-event=providers/cloud.pubsub/eventTypes/topic.publish \
--trigger-resource=$PUB_SUB_TOPIC_NAME \
--max-instances=10 \
--runtime=python311 \
--region=$REGION \
--project=$PROJECT_ID \
--entry-point=process_log
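To give a sense of what the deployed function does, here is a minimal sketch of a `process_log` entry point. This is an illustration, not the exact code from the repository: the log name "custom-log-injection" and the severity-normalisation logic are assumptions. It parses the uploaded newline-delimited JSON file and writes each record to Cloud Logging.

```python
# Sketch of a Cloud Function entry point for this pipeline (an
# illustration, not the repository's exact code).
import json

def to_entries(text: str) -> list[tuple[dict, str]]:
    """Parse newline-delimited JSON log lines into (payload, severity)
    pairs, normalising severity to Cloud Logging's uppercase levels."""
    entries = []
    for line in text.splitlines():
        if not line.strip():
            continue
        record = json.loads(line)
        severity = str(record.pop("severity", "DEFAULT")).upper()
        entries.append((record, severity))
    return entries

def process_log(event, context):
    """Triggered by a Pub/Sub message from the bucket notification."""
    # GCP client libraries are imported lazily so the pure parsing
    # helper above can be exercised without GCP credentials.
    from google.cloud import logging as cloud_logging
    from google.cloud import storage

    bucket_name = event["attributes"]["bucketId"]
    object_name = event["attributes"]["objectId"]

    text = (
        storage.Client()
        .bucket(bucket_name)
        .blob(object_name)
        .download_as_text()
    )

    # "custom-log-injection" is an assumed log name.
    logger = cloud_logging.Client().logger("custom-log-injection")
    for payload, severity in to_entries(text):
        logger.log_struct(payload, severity=severity)
```

Normalising `severity` to uppercase matters because Cloud Logging expects levels such as INFO and ERROR, while the sample file below uses "Info".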

Test and validate

Let's upload a sample log file to the Cloud Storage bucket and validate that the logs are injected into Cloud Logging.

cat << EOF > sample-logs.json
{"id": 1,"severity":"Info","msg":"1 success"}
{"id": 2,"severity":"Info","msg":"2 success"}
{"id": 3,"severity":"Info","msg":"3 success"}
{"id": 4,"severity":"Info","msg":"4 success"}
{"id": 5,"severity":"Info","msg":"5 success"}
{"id": 6,"severity":"Info","msg":"6 success"}
{"id": 7,"severity":"Info","msg":"7 success"}
{"id": 8,"severity":"Info","msg":"8 success"}
{"id": 9,"severity":"Info","msg":"9 success"}
{"id": 10,"severity":"Info","msg":"10 success"}
EOF

gcloud storage cp sample-logs.json gs://$LOG_FILES_BUCKET_NAME

The file upload triggers an object change notification to the Pub/Sub topic, which in turn invokes the Cloud Function. You can validate this by checking the Cloud Function's logs.
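You can also check programmatically that the entries arrived. A minimal sketch, assuming the function wrote them under the (hypothetical) log name "custom-log-injection":

```python
# Sketch: query Cloud Logging for the injected entries. The log name
# "custom-log-injection" is an assumption -- use whatever name your
# function actually writes to.

def injection_filter(project_id: str, log_name: str = "custom-log-injection") -> str:
    """Build a Logs Explorer / API filter for the injected entries."""
    return f'logName="projects/{project_id}/logs/{log_name}"'

def show_recent_entries(project_id: str) -> None:
    # Lazy import so the filter helper is usable without GCP credentials.
    from google.cloud import logging as cloud_logging

    client = cloud_logging.Client(project=project_id)
    for entry in client.list_entries(
        filter_=injection_filter(project_id), max_results=10
    ):
        print(entry.severity, entry.payload)
```

The same filter string can be pasted into the Logs Explorer in the console.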

Opening the Logs Explorer confirms that the logs were successfully ingested into Cloud Logging.


In this blog post, we showed you how to inject custom log files into Google Cloud Platform (GCP) Cloud Logging. We provided a small example to illustrate the process. You can customize the solution to meet your specific needs.