There's no denying that cloud computing is gaining more and more traction. A lesser-known but staggering prediction, though, is that 65.9% of all enterprise software spending will go to cloud technologies by 2025, up from less than 60% in 2022.
Unfortunately, for many companies, the cost implications of cloud technologies have not been sufficiently addressed, even with the rising number of FinOps solutions dedicated to reducing cloud spend.
On the Google Cloud Platform (GCP), Cloud Logging is a dedicated service that collects, organizes, analyzes, and monitors log data, making it easier to troubleshoot and monitor applications, infrastructure, and services.
With the rising complexity of cloud systems and the importance of effective logging and monitoring, GCP Cloud Logging plays a crucial role. The downside is that, as with any cloud-based service, it can come at a steep cost when left unmanaged, contributing significantly to your overall cloud spend.
This article will provide insight into the GCP Cloud Logging pricing model, cost optimization techniques, and the right solution to manage costs and maximize your cloud spend.
Like many cloud services across different platforms, the pricing model of GCP Cloud Logging can contribute significantly to the overall cost of your cloud operations.
GCP Cloud Logging pricing is based on two components: log ingestion and log storage (retention).
Let's walk through a practical example to illustrate the GCP Cloud Logging pricing model. Suppose a web application generates 500 GB of log data monthly. With ingestion priced at $0.50 per GB and a free monthly allotment of 50 GB, ingestion costs (500 GB − 50 GB) × $0.50 = $225 per month. Storage is free for the first 30 days; retaining logs for 90 days means each month's data is stored for roughly two additional months at $0.01 per GB per month, which at steady state adds about $10 per month. The total cost for GCP Cloud Logging in this scenario comes to roughly $235 per month.
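Laid out as a back-of-the-envelope calculation (using the example rates above; actual invoices depend on current list prices and how your buckets are configured):
```
Ingestion: (500 GB - 50 GB free) x $0.50/GB                = $225/month
Storage:   ~2 months of logs older than the free 30 days
           (~1,000 GB) x $0.01/GB/month                    = ~$10/month
Total at steady state with 90-day retention                = ~$235/month
```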
Beyond ingestion and storage, if you export log data to BigQuery for further analysis, additional BigQuery fees apply based on the amount of data stored and queried there.
Without adequate visibility and control, the cost of GCP Cloud Logging can become very expensive for your business. Common causes, each addressed below, include ingesting irrelevant or redundant logs, retaining logs longer than necessary, and keeping everything in Cloud Logging rather than routing it to cheaper destinations.
There are many ways to reduce or optimize your GCP Cloud Logging cost, including setting up exclusions for logs that are irrelevant or redundant and creating log sinks to route logs to specific destinations. You can also employ more advanced cost control strategies, such as fine-tuning retention policies based on log type and frequency of access, and using custom metrics to monitor and alert on unusual log patterns.
You can use the GCP Console to set up an exclusion filter that excludes specific logs from being ingested into GCP Cloud Logging. To set up an exclusion filter, navigate to Logging > Logs Router and click "Create Sink" to create a new sink. In the "Create sink" dialog, give your sink a name and select a destination.
If you already have a sink in place, you can edit it to add the exclusion filter instead. Either way, under "Choose logs to filter out of sink", create an exclusion filter matching the logs you want to exclude. For instance, you could exclude logs from a specific resource, logs at a specific severity level, or logs containing specific text. Filters are written in the Logging Query Language, so you can express precise conditions. The following filter, for example, matches all logs from a single Compute Engine instance:
```
resource.type="gce_instance"
AND
resource.labels.instance_id="INSTANCE_ID"
```
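The same exclusion can be attached from the command line. Here's a minimal sketch using the gcloud CLI, assuming the Google Cloud SDK is installed; the exclusion name is illustrative, and the filter is the one shown above:
```
# Add an exclusion to the _Default sink so matching entries are dropped
# before ingestion (and before they are billed).
gcloud logging sinks update _Default \
  --add-exclusion='name=exclude-gce-instance,filter=resource.type="gce_instance" AND resource.labels.instance_id="INSTANCE_ID"'
```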
By creating exclusion filters, you avoid ingesting unnecessary logs, reducing the storage and processing costs associated with that data. Regularly reviewing and updating these filters ensures you ingest only the logs that provide valuable insight into your cloud infrastructure and applications, while minimizing your GCP Cloud Logging costs.
Consider exporting logs to cheaper storage options like Google Cloud Storage or BigQuery; this lets you save money while retaining access to your log data.
To export logs to Google Cloud Storage or BigQuery, configure a Log Sink in the GCP Console. A Log Sink lets you specify where log data should be exported and how it should be formatted. By exporting logs to Google Cloud Storage or BigQuery, you can take advantage of these services' more affordable storage options while still retaining access to your log data.
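As a rough sketch with the gcloud CLI (the sink name, bucket, dataset, and filter are all illustrative; after creating a sink you also need to grant its writer identity access to the destination):
```
# Route WARNING-and-above logs to a Cloud Storage bucket for cheap archival.
gcloud logging sinks create archive-sink \
  storage.googleapis.com/my-log-archive-bucket \
  --log-filter='severity>=WARNING'

# Or route them to a BigQuery dataset for analysis instead:
# gcloud logging sinks create analytics-sink \
#   bigquery.googleapis.com/projects/PROJECT_ID/datasets/logs_dataset
```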
In addition to the cost savings associated with exporting logs to cheaper storage options, you can also leverage the powerful analytics offered by these services to gain deeper insights into your log data. By using BigQuery to analyze log data, for example, you can easily identify trends and anomalies in your cloud infrastructure and applications, leading to more efficient monitoring and troubleshooting.
Another way to optimize GCP Cloud Logging costs is to consider the retention period for logs stored in Cloud Logging. Logs are typically retained for 30 days and don't incur storage costs during this period. However, retaining logs for longer than the default retention period can result in increased storage costs.
To reduce log storage costs, carefully review and adjust your retention policies as needed. For example, if you don't require logs beyond the default retention period, make sure none of your log buckets have been configured with longer retention, so that logs are automatically deleted after 30 days.
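Retention is configured per log bucket. A minimal sketch with the gcloud CLI, assuming the default _Default bucket in the global location:
```
# Dial the _Default bucket back to the free 30-day retention period.
gcloud logging buckets update _Default \
  --location=global \
  --retention-days=30
```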
In GCP Cloud Logging, you can cut costs and reduce the need for long-term storage and processing by creating log-based metrics. Metrics aggregate and summarize log data, reducing the amount of raw log data you need to store and process, which is particularly valuable if you generate a high volume of logs.
This also means you can customize your log data based on your specific operational needs. For example, you can create metrics that track specific application events or errors, or even metrics that offer insights on infrastructure performance.
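As an illustration, a counter metric that tracks application errors can be created from the command line; the metric name and filter below are hypothetical:
```
# Create a log-based counter metric that counts ERROR-and-above entries
# from Compute Engine instances; you can then alert on it in Cloud Monitoring.
gcloud logging metrics create error-count \
  --description="Count of ERROR-level logs from GCE instances" \
  --log-filter='resource.type="gce_instance" AND severity>=ERROR'
```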
IAM policies applied at the resource level can be used to restrict access to logs in GCP Cloud Logging. By granting only the logging roles users actually need, you can reduce the need for expensive log auditing and ensure that only authorized users can access your logs. This helps you meet compliance and regulatory requirements while also reducing your overall costs.
Additionally, resource-based IAM policies can help you maintain better control over your logs, allowing you to better monitor usage and track access.
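For instance, granting a user read-only access to logs (rather than a broader project role) is a one-liner with the gcloud CLI; the project ID and email below are placeholders:
```
# Grant read-only log access with the predefined Logs Viewer role.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:analyst@example.com" \
  --role="roles/logging.viewer"
```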
Logging can be expensive, especially when logging large volumes of data at high throughput, so it's essential to stay aware of the associated costs, and doing so manually is highly burdensome. FinOps tools like Finout can make this easier by identifying wasteful spending and highlighting areas that need improvement.
Finout is a cloud cost monitoring platform that can help you implement the best techniques for optimizing Cloud Logging costs, giving you granular visibility into log usage and data and recommending ways to reduce costs. With Finout's advanced analytics and cost optimization features, you get visibility into your overall cloud costs across all platforms – beyond just GCP (Azure, AWS, Snowflake, Datadog, etc.) – all from a single dashboard.
So, don't let GCP Cloud Logging costs drain your budget; book a demo today to see how Finout can help optimize your GCP Cloud Logging costs.