by Danielle Vansia
Unit economics is the analysis of the costs associated with producing or providing a single unit of a product or service. The FinOps Foundation notes that unit economics “bridges the gap between what an organization spends on cloud and the fundamental value that cloud spending creates.” Organizations in the cloud can track costs against metrics like user base, transactions, or orders to help align cloud usage with business objectives.
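For example (using hypothetical numbers), if the platform's cloud infrastructure costs $1,200 on a day when customers place 4,000 orders, the cost per order for that day is $1,200 / 4,000 = $0.30. Tracking that figure over time shows whether cloud spend is scaling efficiently with the business.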
You work for an e-commerce company that wants to track certain business-related metrics along with their cloud infrastructure costs. You’ve observed that cloud costs for the e-commerce site typically surge during the week but drop on weekends. You want to view these metrics on top of your cloud costs to understand how your current cloud spending affects per unit costs. You have a view set up in a PostgreSQL database, called total_orders, that aggregates data from other tables, as seen in the image below.
Demo PostgreSQL total_orders table
In this demo, you’ll create a script that queries this table and then uses the Vantage API to import the data as a business metric through a daily automation. You can use the per unit costs feature in Vantage to view business-specific metrics alongside cloud costs on Cost Reports. The workflow for this automation is shown in the diagram below.
Workflow diagram of daily per unit cost automation
This tutorial assumes you already know how to work with and configure SQL databases, particularly PostgreSQL, and that you have a basic understanding of Python, including creating functions and using loops. It won’t cover how to use a scheduler, like Apache Airflow, or how to set up a cron job; however, once the script is created, you can add it to your preferred workflow scheduler to automate a daily execution.
For Vantage, you’ll need at least one provider connected. You’ll also need a Vantage API token with READ and WRITE scopes enabled.
You can find the full script and all demo files in the FinOps as Code demo repo.
To create the initial business metric, you can either create it directly in Vantage or send the following POST request to the /business_metrics endpoint. For the cost_report_tokens_with_metadata parameter, enter the token of the Cost Report you want the business metric to be displayed on.
Tip: The Cost Report could contain filters for things like infrastructure costs (e.g., specific AWS services) so that you can view all the costs related to running the e-commerce platform. Replace the data in the values array with one or more daily data points.
curl --request POST \
  --url https://api.vantage.sh/v2/business_metrics \
  --header 'accept: application/json' \
  --header 'authorization: Bearer <YOUR_VANTAGE_ACCESS_TOKEN>' \
  --header 'content-type: application/json' \
  --data '
{
  "title": "Orders",
  "cost_report_tokens_with_metadata": [
    {
      "unit_scale": "per_unit",
      "cost_report_token": "<COST_REPORT_TOKEN>"
    }
  ],
  "values": [
    {
      "date": "2024-02-01",
      "amount": 12345
    }
  ]
}
'
Note the unique business metric token that’s provided in the API response. You will need this token for the next step.
{ "token": "bsnss_mtrc_e6f23a8d05261351", "title": "Orders", "created_by_token": "usr_f3a7c91fb456d28c", "cost_report_tokens_with_metadata": [], "values": [ { "date": "2024-02-01T00:00:00Z", "amount": "12345.0" } ] }
Export your Vantage API access token and business metric token as environment variables.
export VANTAGE_API_TOKEN=<YOUR_VANTAGE_API_TOKEN>
export ORDERS_METRIC_ID=<BUSINESS_METRIC_TOKEN>
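Optionally, you can sanity-check the exported token before building the script. The snippet below is not part of the tutorial’s workflow and assumes the Vantage API also exposes a GET endpoint for listing business metrics; a 200 response indicates the token is valid and has READ access.

import os
import requests

# Optional sanity check (assumes GET /v2/business_metrics lists existing business metrics)
response = requests.get(
    "https://api.vantage.sh/v2/business_metrics",
    headers={"authorization": f"Bearer {os.environ['VANTAGE_API_TOKEN']}"},
)
print(response.status_code)  # expect 200 if the token is valid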
Create a Python script and add the following package imports:
import requests
from datetime import date
import psycopg2
import os
Create a function that fetches records from the PostgreSQL database. The function takes in the current_date (defined later), a schema, and a table_name. It connects to the PostgreSQL database and executes a query to obtain the metric value for the current date. You can add additional function parameters or export the PostgreSQL database credentials as environment variables. Include error-handling logic in case the database is unreachable.
def fetch_orders(schema, table_name, current_date):
    try:
        # Replace with your database credentials
        conn = psycopg2.connect(
            dbname="<DB_NAME>",
            user="<USER>",
            password="<PASSWORD>",
            host="<HOST>",
            port="<PORT>"
        )
        cur = conn.cursor()

        # Execute the query with the current date
        query = f"SELECT * FROM {schema}.{table_name} WHERE date = %s"
        cur.execute(query, (current_date,))
        records = cur.fetchall()

        cur.close()
        conn.close()
        return records
    except Exception as e:
        print(f"Error fetching records from table {table_name}: {e}")
        return []
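As a quick illustration, calling the function for a single day might look like the following. The row shape in the comment is an assumption based on the total_orders view described above (one row per day with a date and an order count).

# Hypothetical example: fetch the row for February 1, 2024
rows = fetch_orders("public", "total_orders", "2024-02-01")
# Expected shape (assumed): [(datetime.date(2024, 2, 1), 12345)]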
Create a function that sends data to the Vantage API. The function takes in the business metric token you obtained earlier (the metric_id parameter) and a payload for the request body. Include error-handling logic in case the API is unreachable.
def update_business_metric(metric_id, payload):
    try:
        url = f"https://api.vantage.sh/v2/business_metrics/{metric_id}"
        headers = {
            "accept": "application/json",
            "content-type": "application/json",
            "authorization": f"Bearer {os.environ.get('VANTAGE_API_TOKEN')}"
        }
        response = requests.put(url, json=payload, headers=headers)
        response.raise_for_status()
        return response.text
    except Exception as e:
        return f"Error updating business metric {metric_id}: {e}"
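For example, you could call it with a payload shaped like the one the main block below constructs (the date and amount here are illustrative):

payload = {"values": [{"date": "2024-02-01", "amount": 12345}]}
print(update_business_metric(os.environ["ORDERS_METRIC_ID"], payload))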
In the main block, configure the following steps:
- Get the current date.
- Set the schema and table name for the total_orders view.
- Call fetch_orders to retrieve the record for the current date.
- Build the payload from the returned date and order count, and read the ORDERS_METRIC_ID environment variable.
- Call update_business_metric to send the payload to the Vantage API and print the response.
if __name__ == "__main__":
    # Get the current date
    current_date = date.today().isoformat()

    # Schema and table; adjust as needed
    schema = "public"
    table_name = "total_orders"

    # Fetch and update total orders
    orders = fetch_orders(schema, table_name, current_date)
    if orders:
        date_returned, total_orders_returned = orders[0]
        payload = {"values": [{"date": date_returned.isoformat(), "amount": total_orders_returned}]}
        metric_id = os.environ.get("ORDERS_METRIC_ID")
        if metric_id:
            response = update_business_metric(metric_id, payload)
            print("Vantage API response:", response)
        else:
            print("Business metric token not found.")
    else:
        print(f"No records found for {schema}.{table_name} for the current date.")
To ensure the script runs daily, you can schedule it as a cron job or configure it to run within another scheduler, such as Apache Airflow.
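Scheduler setup is out of scope for this tutorial, but as a rough sketch, a minimal Apache Airflow 2.x DAG could invoke the script once a day. The DAG ID, start date, and script path below are placeholders to adapt to your environment.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal sketch (Airflow 2.x): run the metric-update script once per day.
with DAG(
    dag_id="update_orders_business_metric",
    start_date=datetime(2024, 2, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    BashOperator(
        task_id="run_update_script",
        bash_command="python /path/to/update_orders_metric.py",
    )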
As your metric data is imported each day, you can view it on the corresponding Cost Report you configured in Vantage. In this example, a Cost Report filtered to AWS costs for the e-commerce site shows costs spiking during the week, with particularly high peaks on 2/7 and 2/14, possibly correlating with increased sales activity. The cost per order rises alongside these spikes in cloud costs. You could explore further correlations or configure anomaly alerts to manage cost fluctuations and improve operational efficiency. This data helps the team not just view cloud costs but also understand how the cost per order trends in more relatable terms.
Vantage Cost Report with the business metric displayed as a line on the chart
Analyzing the correlation between business metrics and cloud costs offers valuable insights for business operations. An automated pipeline like this can help you easily view changes each day.