Unit economics is the practice of analyzing the costs associated with producing or providing one unit of a product or service. The FinOps Foundation notes that unit economics “bridges the gap between what an organization spends on cloud and the fundamental value that cloud spending creates.” Organizations in the cloud can track costs against metrics such as user base, transactions, or orders to help align cloud usage with business objectives.
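For a concrete sense of the arithmetic, a per unit cost is simply total cloud spend divided by the number of units for the same period. The short sketch below uses hypothetical figures just to illustrate the calculation.

```python
# Hypothetical figures for one day of an e-commerce workload.
daily_cloud_spend = 1850.00   # total cloud spend for the day, in USD
daily_orders = 12345          # orders processed that day

cost_per_order = daily_cloud_spend / daily_orders
print(f"Cost per order: ${cost_per_order:.4f}")  # Cost per order: $0.1499
```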
The Scenario: How to Visualize Unit Economics
You work for an e-commerce company that wants to track certain business-related metrics along with their cloud infrastructure costs. You’ve observed that cloud costs for the e-commerce site typically surge during the week but drop on weekends. You want to view these metrics on top of your cloud costs to understand how your current cloud spending affects per unit costs. You have a view set up in a PostgreSQL database, called `total_orders`, that aggregates data from other tables, as seen in the image below.
In this demo, you’ll create a script that queries this view and then uses the Vantage API to import the data as a business metric through a daily automation. You can use the per unit costs feature in Vantage to view business-specific metrics alongside cloud costs on Cost Reports. The workflow for this automation is shown in the diagram below.
- A Python script is set to run daily and queries the PostgreSQL database to obtain data from the `total_orders` view.
- The result is passed to the `/business_metrics` endpoint of the Vantage API to update a business metric called Orders.
- You can view the business metric and calculated per unit costs on a configured Cost Report in Vantage.
Prerequisites
This tutorial assumes you already know how to work with and configure SQL databases, particularly PostgreSQL, and that you have a basic understanding of Python, including creating functions and using loops. It won’t cover how to use a scheduler, like Apache Airflow, or how to set up a cron job; however, once the script is created, you can add it to your preferred workflow scheduler to automate a daily execution.
For Vantage, you’ll need at least one provider connected. You’ll also need a Vantage API token with `READ` and `WRITE` scopes enabled.
Demo: Automate and Visualize Unit Economics
You can find the full script and all demo files in the FinOps as Code demo repo.
Step 1: Create a Business Metric in Vantage
You can create the initial business metric directly in Vantage, or you can send the following `POST` call to the `/business_metrics` endpoint. For the `cost_report_tokens_with_metadata` parameter, enter the `token` of the corresponding Cost Report you want the business metric to be displayed on.

Tip: The Cost Report could contain filters for things like infrastructure costs (e.g., specific AWS services) so that you can view all the costs related to running the e-commerce platform. Replace the data in the `values` object with one or more daily data points.
```bash
curl --request POST \
  --url https://api.vantage.sh/v2/business_metrics \
  --header 'accept: application/json' \
  --header 'authorization: Bearer <YOUR_VANTAGE_ACCESS_TOKEN>' \
  --header 'content-type: application/json' \
  --data '
{
  "title": "Orders",
  "cost_report_tokens_with_metadata": [
    {
      "unit_scale": "per_unit",
      "cost_report_token": "<COST_REPORT_TOKEN>"
    }
  ],
  "values": [
    {
      "date": "2024-02-01",
      "amount": 12345
    }
  ]
}
'
```
Note the unique business metric `token` that’s provided in the API response. You will need this `token` for the next step.
```json
{
  "token": "bsnss_mtrc_e6f23a8d05261351",
  "title": "Orders",
  "created_by_token": "usr_f3a7c91fb456d28c",
  "cost_report_tokens_with_metadata": [],
  "values": [
    {
      "date": "2024-02-01T00:00:00Z",
      "amount": "12345.0"
    }
  ]
}
```
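If you prefer to stay in Python for this one-time setup call as well, a roughly equivalent request (same endpoint and payload as the curl example above) could look like the following sketch; the token and Cost Report placeholders are yours to replace.

```python
import requests

# Sketch of the same POST request in Python; replace the placeholders with your own values.
payload = {
    "title": "Orders",
    "cost_report_tokens_with_metadata": [
        {"unit_scale": "per_unit", "cost_report_token": "<COST_REPORT_TOKEN>"}
    ],
    "values": [{"date": "2024-02-01", "amount": 12345}],
}
headers = {
    "accept": "application/json",
    "content-type": "application/json",
    "authorization": "Bearer <YOUR_VANTAGE_ACCESS_TOKEN>",
}
response = requests.post("https://api.vantage.sh/v2/business_metrics", json=payload, headers=headers)
response.raise_for_status()
print(response.json()["token"])  # the business metric token used in the next steps
```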
Step 2: Import the Required Packages
Export your Vantage API access token and business metric `token` as environment variables.
```bash
export VANTAGE_API_TOKEN=<YOUR_VANTAGE_API_TOKEN>
export ORDERS_METRIC_ID=<BUSINESS_METRIC_TOKEN>
```
Create a Python script and add the following package imports:
```python
import requests
from datetime import date
import psycopg2
import os
```
- Use `requests` to connect with the Vantage API.
- Use `datetime` to access the current date.
- Use `psycopg2` to connect with the PostgreSQL database.
- Use `os` to access environment variables.
Step 3: Create a Function to Query the PostgreSQL Database
Create a function that fetches records from the PostgreSQL database. The function will take in the `current_date` (defined later), a `schema`, and a `table_name`. The function connects to the PostgreSQL database and then executes a query to obtain the metric value for the current date. You can also add additional function parameters or export the PostgreSQL database credentials as environment variables. Include error logic in case the database is unreachable.
```python
def fetch_orders(schema, table_name, current_date):
    try:
        # Replace with your database credentials
        conn = psycopg2.connect(
            dbname="<DB_NAME>",
            user="<USER>",
            password="<PASSWORD>",
            host="<HOST>",
            port="<PORT>"
        )
        cur = conn.cursor()
        # Execute the query with the current date
        query = f"SELECT * FROM {schema}.{table_name} WHERE date = %s"
        cur.execute(query, (current_date,))
        records = cur.fetchall()
        cur.close()
        conn.close()
        return records
    except Exception as e:
        print(f"Error fetching records from table {table_name}: {e}")
        return []
```
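As noted above, you could read the database credentials from environment variables instead of hardcoding them. A minimal sketch of that variation is shown below; the variable names (`DB_NAME`, `DB_USER`, and so on) are assumptions, not requirements.

```python
# Sketch: build the connection from environment variables instead of hardcoded values.
# The variable names here are assumptions; use whatever names fit your setup.
conn = psycopg2.connect(
    dbname=os.environ["DB_NAME"],
    user=os.environ["DB_USER"],
    password=os.environ["DB_PASSWORD"],
    host=os.environ["DB_HOST"],
    port=os.environ.get("DB_PORT", "5432"),
)
```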
Step 4: Create a Function to Send Data to the Vantage API
Create a function that sends data to the Vantage API. The function should take in the business metric `token` you obtained earlier (the `metric_id` parameter) and an API `payload` to structure the API call. Include error logic in case the API becomes unreachable.
```python
def update_business_metric(metric_id, payload):
    try:
        url = f"https://api.vantage.sh/v2/business_metrics/{metric_id}"
        headers = {
            "accept": "application/json",
            "content-type": "application/json",
            "authorization": f"Bearer {os.environ.get('VANTAGE_API_TOKEN')}"
        }
        response = requests.put(url, json=payload, headers=headers)
        response.raise_for_status()
        return response.text
    except Exception as e:
        return f"Error updating business metric {metric_id}: {e}"
```
Step 5: Set Up a Scheduled Pipeline
In the main block, configure the following steps:
- Retrieve the current date in ISO format.
- Create two variables for the PostgreSQL `schema` and `table_name`.
- Call the `fetch_orders` function using both these variables and the `current_date`.
- Extract the date and total number of orders from the output.
- Create a `payload` object that takes in the returned data and structures it to the format required by the `/business_metrics` endpoint in the Vantage API.
- Retrieve the previously exported `ORDERS_METRIC_ID` environment variable.
- Call the `update_business_metric` function, passing in the `metric_id` and `payload` variables.
- Add error logic in case of API failures or data retrieval issues.
```python
if __name__ == "__main__":
    # Get the current date
    current_date = date.today().isoformat()

    # Schema and table; adjust as needed
    schema = "public"
    table_name = "total_orders"

    # Fetch and update total orders
    orders = fetch_orders(schema, table_name, current_date)
    if orders:
        date_returned, total_orders_returned = orders[0]
        payload = {"values": [{"date": date_returned.isoformat(), "amount": total_orders_returned}]}
        metric_id = os.environ.get("ORDERS_METRIC_ID")
        if metric_id:
            response = update_business_metric(metric_id, payload)
            print("Vantage API response:", response)
        else:
            print("Business metric token not found.")
    else:
        print(f"No records found for {schema}.{table_name} for the current date.")
```
To ensure the script runs daily, you can schedule it as a cron job or configure it to run within another scheduler, such as Apache Airflow.
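Scheduling is outside the scope of this tutorial, but as a rough sketch, a minimal Airflow DAG that runs the script once a day could look like the following. The DAG id, script path, and start date are placeholders, and this assumes Airflow 2.x is already installed and configured.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal sketch of a daily schedule; the id, path, and dates are placeholders.
with DAG(
    dag_id="orders_unit_economics",
    start_date=datetime(2024, 2, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    BashOperator(
        task_id="update_orders_metric",
        bash_command="python /path/to/your_script.py",
    )
```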
View Cost Per Order in Vantage
As your metric data is imported each day, you can view it on the corresponding Cost Report you configured in Vantage. In this example, a Cost Report filtered to AWS costs for the e-commerce site shows costs spiking during the week, particularly on 2/7 and 2/14, possibly correlating with increased sales activity. The cost per order rises alongside these spikes in cloud costs. You could explore further correlations or configure anomaly alerts to manage cost fluctuations and optimize your operational efficiency. This data helps the team not just view cloud costs but also understand how the cost of each order trends, in more relatable business terms.
Conclusion
Analyzing the correlation between business metrics and cloud costs offers valuable insights for business operations. An automated pipeline like this can help you easily view changes each day.