
Do you want to save on Log Analytics costs?

Howdy Folks

It's time to discuss Azure Monitor costs. As you all know, Microsoft has merged most of its logging solutions under the Azure Monitor umbrella, and most of these services share a common log collection location: the Log Analytics workspace. The following diagram explains the services under each category.


Next, we need to understand how we get charged for Log Analytics workspaces, so that we can work on reducing the costs.

As per the Microsoft documentation, we are charged based on:

  1. Data Retention
  2. Data Ingestion

Let's do a quick calculation and see how this is going to affect our bill. As an example scenario, assume we have a few Azure workloads that generate 2 GB of data every day, and we ingest that into Log Analytics and retain it for 90 days.
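For a rough idea of the numbers, here is a back-of-the-envelope sketch you can paste into the workspace as a query. The rates below are assumptions for illustration only (Pay-As-You-Go ingestion around $2.30 per GB and retention around $0.10 per GB per month after the free 31 days); check the Azure pricing page for your region.

// Illustrative rates only - replace with the prices for your region
let DailyIngestGB = 2.0;
let IngestPricePerGB = 2.30;            // assumed Pay-As-You-Go price per GB ingested
let RetentionPricePerGBMonth = 0.10;    // assumed price per GB per month beyond the free 31 days
let RetentionDays = 90;
// At steady state, only the data older than 31 days is billable for retention
print
    MonthlyIngestionCost = DailyIngestGB * 30 * IngestPricePerGB,
    MonthlyRetentionCost = DailyIngestGB * (RetentionDays - 31) * RetentionPricePerGBMonth

With those assumed rates, ingestion works out to roughly $138 a month, while retention is only around $12.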



As per the above example, we can clearly see that our charges grow mainly with data ingestion and not much with data retention.

How to Find the Utilization?

Now we need to know how much data we are ingesting per day and from which sources. It's fairly easy to find out: you can simply run the following log query in the workspace.

// Quantity in the Usage table is reported in MB, so divide by 1024 to get GB
Usage
| project TimeGenerated, SourceSystem, DataType, Quantity
| summarize TotalGB = sum(Quantity) / 1024 by DataType
| sort by TotalGB desc
| render piechart
(Screenshots: Log Analytics usage rendered as a pie chart and as a table)

Note - if you want to see only the billable usage, use the query below:

Usage
| where IsBillable == true
| project TimeGenerated, SourceSystem, DataType, Quantity
| summarize TotalGB = sum(Quantity) / 1024 by DataType
| sort by TotalGB desc
| render piechart
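
If you also want to see how the billable volume trends from day to day (handy later when thinking about daily caps or Commitment Tiers), a small variation of the same query does the trick. This is only a sketch and looks back over the last 30 days:

Usage
| where TimeGenerated > ago(30d)
| where IsBillable == true
| summarize DailyGB = sum(Quantity) / 1024 by bin(TimeGenerated, 1d)
| render timechart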

How to Save Cost?

So, there are a few things we can do to save on cost.

  1. Consider moving the Log Analytics workspace to the Commitment Tiers pricing model.
  2. Limit the daily ingestion for your workspace.
  3. Consider using logging options other than a workspace for noncritical logs (e.g., blob storage).
  4. Reduce the log retention.

Commitment Tiers

Other than the Pay-As-You-Go model, Log Analytics has Commitment Tiers, which can save around 30 percent compared to the Pay-As-You-Go price. We commit to buying data ingestion for a workspace, starting at 100 GB/day, at a lower price than PAYG pricing. Any usage above the commitment level (overage) is billed at the same per-GB price as the current commitment tier.
But I believe this should be the last option, after doing all the fine-tuning and monitoring your Log Analytics usage for a while, at least a couple of months.
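
Before committing, it's worth checking whether your average daily billable ingestion is anywhere near the 100 GB/day entry tier. Something along these lines (again, just a sketch looking back over the last 31 days) gives you the average and the peak:

Usage
| where TimeGenerated > ago(31d)
| where IsBillable == true
| summarize DailyGB = sum(Quantity) / 1024 by bin(TimeGenerated, 1d)
| summarize AvgDailyGB = avg(DailyGB), MaxDailyGB = max(DailyGB)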

Limit the daily ingestion for your workspace

One more way to reduce cost is to change the log level. You need to work with your teams and find out whether you really need certain logs, debug logs for example, to be readily available in Log Analytics; usually, the more verbose the log level, the higher the volume of logs. On top of that, you can set a daily cap on the workspace so that collection stops once a set volume has been ingested for the day (just keep in mind that data sent after the cap is hit is not collected).
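
To get a feel for how much of your application trace volume comes from the verbose levels, you can break the billed size down by severity. The sketch below assumes your application logs land in the AppTraces table (workspace-based Application Insights); swap in whatever table actually holds your logs:

// _BilledSize is in bytes, so divide by 1024^3 to get GB
AppTraces
| where TimeGenerated > ago(7d)
| summarize TotalGB = sum(_BilledSize) / pow(1024, 3) by SeverityLevel
| sort by TotalGB desc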

Consider using different logging options

Just in the above section, we took debug logs as an example. You only need these debug logs when something goes wrong; nothing beats debug logs when you are in an outage, troubleshooting and trying to find the problem. So now you must be thinking about what you can do to save money and still keep these logs. That is where the other options come into the picture: using low-cost storage as the log destination. You can point the logs you need, but not all the time, to storage accounts. Archiving to storage accounts costs far less than a Log Analytics workspace.
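
To work out which logs are good candidates for a storage account instead of the workspace, it helps to see which resource log categories generate the most billable volume. A rough sketch, assuming your resource logs land in the AzureDiagnostics table (resource-specific tables would need their own versions of this):

// _BilledSize is in bytes, so divide by 1024^3 to get GB
AzureDiagnostics
| where TimeGenerated > ago(7d)
| summarize TotalGB = sum(_BilledSize) / pow(1024, 3) by ResourceProvider, Category
| sort by TotalGB desc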

Reduce the log retention

The other thing you could do is reduce the log retention time. You may already know that retention for the first 31 days is free. But my honest opinion is that unless you dump heaps of log data into Log Analytics, retention won't cost you much compared to ingestion.

Conclusion

Long story short, saving cost again falls on your side :). If you can do the following, you can reduce what you spend on Log Analytics workspaces and still keep a valuable source of logs:

  1. Make sure you are logging only what's absolutely necessary
  2. Move noncritical logs to low-cost storage
  3. Periodically check the logs that you are ingesting into Log Analytics

Until next time....... :D

