Migrate financials to radian-software org

Radon Rosborough 2022-05-09 15:41:39 -07:00
parent fb6948df72
commit afe6df3cdc
14 changed files with 2 additions and 708 deletions


@ -1,61 +0,0 @@
Riju :: $169.46
  CloudWatch :: $34.80
  EC2 :: $107.01
    Data Transfer :: $0.68
    EBS Snapshot :: $5.45
    EBS Volume :: $46.40
      EBS Volume :: $46.40
        gp2 :: $11.61
        gp3 :: $34.78
    Instance :: $54.48
      t2.small :: $0.04
      t3 :: $0.08
      t3.2xlarge :: $29.80
      t3.medium :: $14.77
      t3.small :: $9.78
  ECR :: $7.31
    Data Transfer :: $3.29
    Storage :: $4.02
  ELB :: $20.05
    Data Transfer :: $0.31
    LCUs :: $0.06
    Load Balancer :: $19.68
  S3 :: $0.29
COMMENTARY: This month was a disaster because AWS makes it really hard
to understand what exactly is going to run up your bill.
The most egregious line item here is CloudWatch. It turns out that if
you follow the official documentation for setting up a CloudWatch
alarm on disk space for your EC2 instance, the default configuration
has SSM Agent creating a metric for *every* filesystem mounted on the
instance, which in practice means one or more per Docker container. As
a result I had tens of thousands of metrics being shipped to
CloudWatch, which is expensive. I fixed this for August, bringing
CloudWatch costs down to effectively zero.
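The fix amounts to constraining which filesystems the agent reports
on. As a sketch (this assumes the standard amazon-cloudwatch-agent
JSON config format; exact keys are worth double-checking against the
agent documentation), restricting the disk plugin's `resources` to the
root mount avoids emitting a metric per Docker overlay filesystem:

```json
{
  "metrics": {
    "metrics_collected": {
      "disk": {
        "measurement": ["used_percent"],
        "resources": ["/"]
      }
    }
  }
}
```

Without the `resources` key the agent defaults to every mount point it
can see, which is where the metric explosion came from.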
We have some charges for a t3.medium from before I scaled the server
down to a t3.small. The charges for that instance are also higher than
you'd expect because I was originally running two of them, before
scaling down to a singleton once I realized I was out of my depth.
We had a couple of gp2 volumes (more expensive) before I migrated
everything to gp3. EBS costs are generally quite high here: not only
did I previously have two instances serving traffic, but I also had a
dev server. Each of those three instances needed the full 256 GB data
volume to store language images, which was ridiculously expensive. I'm
planning to keep Riju as a singleton for a while because of this
issue, relying on vertical scaling until that is no longer feasible.
The persistent dev server will be replaced by a transient CI instance
that can be spun up for large rebuild operations, mitigating EBS
costs.
The t3.2xlarge is the dev server. This is mostly just tough luck: I
did need to spend a lot of time building and rebuilding language
images, and those hours add up. Hopefully that won't be as much of an
issue going forward now that the infrastructure is more stable and we
can get away without a dev server in general. But fundamentally you
can't do builds on your local laptop without a symmetric Internet
plan, because a full rebuild requires uploading on the order of
100 GB.
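To put that in perspective, a quick back-of-the-envelope calculation
(the uplink speed is an assumption for illustration, not a measured
figure):

```python
# Rough time to upload a full set of language images from a laptop.
# The 100 GB figure is from the commentary above; the 10 Mbps uplink
# is a hypothetical residential upload speed.
upload_bytes = 100 * 10**9          # ~100 GB of images
uplink_bits_per_sec = 10 * 10**6    # assumed 10 Mbps residential uplink

hours = upload_bytes * 8 / uplink_bits_per_sec / 3600
print(f"{hours:.1f} hours")  # roughly a full day per rebuild
```

On a symmetric plan with a 100 Mbps uplink the same transfer drops to
a couple of hours, which is the difference the commentary is pointing
at.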


@ -1,25 +0,0 @@
Riju :: $58.75
  EC2 :: $32.26
    Data Transfer :: $0.04
    EBS Snapshot :: $1.67
    EBS Volume :: $18.46
      EBS Volume :: $18.46
        gp2 :: $0.69
        gp3 :: $17.77
    Instance :: $12.09
      t3.small :: $12.09
  ECR :: $6.42
    Data Transfer :: $1.38
    Storage :: $5.05
  ELB :: $19.93
    Data Transfer :: $0.18
    LCUs :: $0.06
    Load Balancer :: $19.68
  S3 :: $0.13
COMMENTARY: I think we could save on ELB costs by migrating to Lambda;
see https://github.com/radian-software/riju/issues/93 for that.
Otherwise, the main thing to note about this month is that I had part
of the infrastructure spun down for a significant part of it, as per
https://riju.statuspage.io/ (Aug 1 through Aug 16). So costs are
liable to increase next month now that we are back in normal
operation.


@ -1,27 +0,0 @@
Riju :: $81.55
  EC2 :: $57.02
    Data Transfer :: $0.02
    EBS Snapshot :: $1.97
    EBS Volume :: $26.82
      EBS Volume :: $26.82
        gp2 :: $1.01
        gp3 :: $25.81
    Instance :: $28.21
      t3.medium :: $19.01
      t3.small :: $9.21
  ECR :: $5.09
    Storage :: $5.09
  ELB :: $19.32
    Data Transfer :: $0.22
    LCUs :: $0.06
    Load Balancer :: $19.04
  S3 :: $0.12
COMMENTARY: We're starting to look pretty stable from month to month.
Naturally the costs are higher because we were operating the
infrastructure for the entire month this time, instead of being down
for half of it, but I think this cost is about what we should expect
to see going forward until changes are made.
I did realize, by the way, that we can't use Lambda to replace the
ELB after all, because Lambda doesn't support WebSockets. Oh well.


@ -1,18 +0,0 @@
Riju :: $106.77
  EC2 :: $81.38
    Data Transfer :: $0.03
    EBS Snapshot :: $2.36
    EBS Volume :: $28.57
      EBS Volume :: $28.57
        gp2 :: $1.07
        gp3 :: $27.49
    Instance :: $50.43
      t3.large :: $23.05
      t3.medium :: $27.38
  ECR :: $5.14
    Storage :: $5.14
  ELB :: $20.14
    Data Transfer :: $0.38
    LCUs :: $0.07
    Load Balancer :: $19.68
  S3 :: $0.11


@ -1,17 +0,0 @@
Riju :: $133.50
  EC2 :: $108.22
    Data Transfer :: $0.02
    EBS Snapshot :: $3.79
    EBS Volume :: $27.58
      EBS Volume :: $27.58
        gp2 :: $1.04
        gp3 :: $26.55
    Instance :: $76.82
      t3.large :: $76.82
  ECR :: $5.34
    Storage :: $5.34
  ELB :: $19.83
    Data Transfer :: $0.69
    LCUs :: $0.10
    Load Balancer :: $19.04
  S3 :: $0.11


@ -1,15 +0,0 @@
Riju :: $134.10
  EC2 :: $108.76
    EBS Snapshot :: $4.40
    EBS Volume :: $27.01
      EBS Volume :: $27.01
        gp2 :: $1.02
        gp3 :: $26.00
    Instance :: $77.35
      t3.large :: $77.35
  ECR :: $5.45
    Storage :: $5.45
  ELB :: $19.77
    LCUs :: $0.09
    Load Balancer :: $19.68
  S3 :: $0.12


@ -1,15 +0,0 @@
Riju :: $134.19
  EC2 :: $108.65
    EBS Snapshot :: $4.58
    EBS Volume :: $26.86
      EBS Volume :: $26.86
        gp2 :: $1.01
        gp3 :: $25.84
    Instance :: $77.21
      t3.large :: $77.21
  ECR :: $5.51
    Storage :: $5.51
  ELB :: $19.92
    LCUs :: $0.24
    Load Balancer :: $19.68
  S3 :: $0.11


@ -1,18 +0,0 @@
Riju :: $118.74
  EC2 :: $95.20
    EBS Snapshot :: $5.44
    EBS Volume :: $19.48
      EBS Volume :: $19.48
        gp2 :: $1.02
        gp3 :: $18.47
    Instance :: $70.27
      t3.large :: $70.25
  ECR :: $5.51
    Storage :: $5.51
  ELB :: $17.92
    LCUs :: $0.14
    Load Balancer :: $17.77
  S3 :: $0.11
COMMENTARY: Costs are down by $16 this month because I realized that I
could cut the EBS data volume from 256 GB to 128 GB!
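A hedged sanity check of that number (this assumes the standard gp3
list price of $0.08 per GB-month, which varies by region; the rest of
the observed difference presumably comes from snapshots and the change
landing mid-month):

```python
# Hypothetical sanity check of the EBS savings, assuming the standard
# gp3 rate of $0.08 per GB-month (an assumed list price, region-dependent).
gp3_rate = 0.08           # USD per GB-month
freed_gb = 256 - 128      # data volume shrunk from 256 GB to 128 GB

monthly_savings = freed_gb * gp3_rate
print(f"${monthly_savings:.2f}/month")  # $10.24/month from the volume alone
```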


@ -1,15 +0,0 @@
Riju :: $122.42
  EC2 :: $96.90
    EBS Snapshot :: $5.70
    EBS Volume :: $13.91
      EBS Volume :: $13.91
        gp2 :: $1.01
        gp3 :: $12.90
    Instance :: $77.28
      t3.large :: $77.28
  ECR :: $5.54
    Storage :: $5.54
  ELB :: $19.87
    LCUs :: $0.19
    Load Balancer :: $19.68
  S3 :: $0.11


@ -1,15 +0,0 @@
Riju :: $119.48
  EC2 :: $94.60
    EBS Snapshot :: $5.70
    EBS Volume :: $13.91
      EBS Volume :: $13.91
        gp2 :: $1.01
        gp3 :: $12.90
    Instance :: $74.98
      t3.large :: $74.98
  ECR :: $5.57
    Storage :: $5.57
  ELB :: $19.20
    LCUs :: $0.16
    Load Balancer :: $19.04
  S3 :: $0.11


@ -1,14 +1,4 @@
# Riju financials
This directory has a Python script that can download and analyze
billing data from AWS to determine how much Riju actually costs. This
information is then made publicly available in per-month
subdirectories here; for some months with unusual charges I've added
commentary to explain what was going on.
This information is then imported into [Riju's master budgeting
spreadsheet](https://docs.google.com/spreadsheets/d/15Us9KLXaJ6B1lNhrM6GV6JmmeKqNc8NNeTnaWiAhozw/edit?usp=sharing)
which compares spending to donations in order to determine whether we
are making a profit (we are not...). Once we start making a profit we
can start donating to the EFF as promised, or scale up Riju's
infrastructure to support more users for free.
This data has all moved to
[radian-software/financials](https://github.com/radian-software/financials).


@ -1,319 +0,0 @@
#!/usr/bin/env python3

import argparse
import collections
import csv
import decimal
import gzip
import io
import json
import logging
import os
import pathlib
import re
import sys
from urllib.parse import urlparse

import boto3

logging.basicConfig(level=logging.INFO)

ROOT = pathlib.Path(__file__).parent


def die(msg):
    raise AssertionError(msg)


def get_csv(year, month, force_download=False):
    target_dir = ROOT / f"{year}-{month:02d}"
    logging.info(f"Using base directory {target_dir}")
    target_dir.mkdir(exist_ok=True)
    latest_csv = target_dir / "latest.csv"
    if force_download or not latest_csv.exists():
        try:
            latest_csv.unlink()
        except FileNotFoundError:
            pass
        s3 = boto3.client("s3")
        o = urlparse(os.environ["BILLING_REPORTS_URL"], allow_fragments=False)
        assert o.scheme == "s3"
        bucket = o.netloc
        base_prefix = o.path.strip("/") + "/"
        report_name = base_prefix.rstrip("/").split("/")[-1]
        logging.info(f"List s3://{bucket}/{base_prefix}")
        month_prefixes = [
            elt["Prefix"]
            for elt in s3.list_objects_v2(
                Bucket=bucket, Prefix=f"{base_prefix}", Delimiter="/"
            )["CommonPrefixes"]
        ]
        if not month_prefixes:
            die("no report prefixes found")
        expected_month_prefix = f"{base_prefix}{year}{month:02d}"
        matching_month_prefixes = [
            p for p in month_prefixes if p.startswith(expected_month_prefix)
        ]
        if not matching_month_prefixes:
            die(f"no report prefix for the specified month ({expected_month_prefix})")
        if len(matching_month_prefixes) > 1:
            die(f"multiple matching report prefixes: {repr(matching_month_prefixes)}")
        (month_prefix,) = matching_month_prefixes
        stream = io.BytesIO()
        manifest_path = f"{month_prefix}{report_name}-Manifest.json"
        logging.info(f"Download s3://{bucket}/{manifest_path} in-memory")
        s3.download_fileobj(bucket, manifest_path, stream)
        manifest = json.loads(stream.getvalue())
        (report_path,) = manifest["reportKeys"]
        if not report_path.endswith(".csv.gz"):
            die(f"unexpected report extension in {report_path}")
        logging.info(f"Get metadata for s3://{bucket}/{report_path}")
        basename = s3.head_object(Bucket=bucket, Key=report_path)[
            "LastModified"
        ].strftime("%Y-%m-%d")
        logging.info(
            f"Download s3://{bucket}/{report_path} to {target_dir.relative_to(ROOT)}/{basename}.csv.gz"
        )
        s3.download_file(bucket, report_path, f"{target_dir}/{basename}.csv.gz")
        logging.info(f"Decompress {basename}.csv.gz")
        with gzip.open(f"{target_dir}/{basename}.csv.gz") as f_read:
            with open(f"{target_dir}/{basename}.csv", "wb") as f_write:
                while chunk := f_read.read(1024):
                    f_write.write(chunk)
        latest_csv.symlink_to(f"{basename}.csv")
    return latest_csv


def read_csv(csv_path):
    rows = []
    with open(csv_path) as f:
        reader = csv.reader(f)
        header = next(reader)
        for row in reader:
            rows.append(dict((key, val) for (key, val) in zip(header, row) if val))
    return rows


def get_tax_key(item):
    service = item["lineItem/ProductCode"]
    usage_type = item["lineItem/UsageType"]
    if "DataTransfer" in usage_type:
        service = "AWSDataTransfer"
    return (service, usage_type)


def embed_taxes(items):
    tax_items = collections.defaultdict(list)
    usage_items = collections.defaultdict(list)
    for item in items:
        item_type = item["lineItem/LineItemType"]
        if item_type == "Tax":
            tax_items[get_tax_key(item)].append(item)
        elif item_type == "Usage":
            usage_items[get_tax_key(item)].append(item)
        else:
            die(f"unexpected line item type {repr(item_type)}")
    for key in tax_items:
        if key not in usage_items:
            die(f"tax for {repr(key)} but no usage for that key")
        tax_cost = sum(item["lineItem/UnblendedCost"] for item in tax_items[key])
        usage_cost = sum(item["lineItem/UnblendedCost"] for item in usage_items[key])
        tax_multiplier = (tax_cost + usage_cost) / usage_cost
        for item in usage_items[key]:
            item["lineItem/UnblendedCost"] *= tax_multiplier
    return [item for group in usage_items.values() for item in group]


def classify_line_item(item, billing_month=None, full=False):
    service = item["lineItem/ProductCode"]
    usage_type = item["lineItem/UsageType"]
    operation = item.get("lineItem/Operation")
    resource = item.get("lineItem/ResourceId")
    project = item.get("resourceTags/user:BillingCategory")
    # In 2021-07, the first month that I was using AWS resources for
    # Riju in a nontrivial capacity, I had subpar billing
    # observability, so a lot of the resources aren't tagged
    # correctly. So for that month specifically, I'm hacking in a
    # couple of heuristics to tag the resources after the fact based
    # on what I know about my usage of AWS.
    if billing_month == "2021-07":
        if resource and "riju" in resource.lower():
            project = "Riju"
        elif resource and "shallan" in resource.lower():
            project = "Shallan"
        elif resource and "veidt" in resource.lower():
            project = "Veidt"
        elif service == "AmazonCloudWatch":
            project = "Riju"
        elif (
            service == "AmazonEC2"
            and resource != "i-077884b74aba86bac"
            and "ElasticIP:IdleAddress" not in usage_type
            and "EBS:SnapshotUsage" not in usage_type
        ):
            project = "Riju"
    # Subpar tagging on my part for some testing resources.
    if billing_month == "2022-02":
        if service == "AmazonEC2" and resource in {
            "i-04af44ee8f8238a00",
            "i-0a16cf6c998e59b88",
            "i-0ec6e28b124698fc0",
            "i-0df1818af33ea1aa9",
        }:
            project = "Riju"
    # AWS does not let you put tags on a public ECR repository,
    # yippee.
    if service == "AmazonECRPublic" and resource.endswith("repository/riju"):
        project = "Riju"
    category = [
        "Uncategorized",
        service,
        usage_type,
        operation or "(no operation)",
        resource or "(no resource)",
    ]
    if not full:
        if service == "AmazonS3":
            category = ["S3"]
        elif service == "AmazonSNS":
            category = ["SNS"]
        elif service in ("AmazonECR", "AmazonECRPublic"):
            category = ["ECR"]
            if "DataTransfer" in usage_type:
                category.append("Data Transfer")
            elif "TimedStorage" in usage_type:
                category.append("Storage")
            else:
                category.extend(
                    [
                        "Uncategorized",
                        usage_type,
                        operation or "(no operation)",
                        resource or "(no resource)",
                    ]
                )
        elif service == "AmazonEC2":
            category = ["EC2"]
            if "ElasticIP:IdleAddress" in usage_type:
                category.append("EIP")
                # Apparently tags on EIPs are ignored for billing
                # purposes, so we just have to know what we were using
                # them for. (Leaving them uncategorized for 2021-07
                # though.)
                if billing_month != "2021-07":
                    project = "Corona"
            elif "EBS:VolumeUsage" in usage_type:
                # Note: an earlier version appended "EBS Volume" twice
                # here, which is why older breakdowns in this directory
                # show a nested duplicate "EBS Volume" line.
                category.extend(["EBS Volume", re.sub(r"^.+\.", "", usage_type)])
            elif "EBS:SnapshotUsage" in usage_type:
                category.append("EBS Snapshot")
            elif (
                "DataTransfer" in usage_type
                or "In-Bytes" in usage_type
                or "Out-Bytes" in usage_type
            ):
                category.append("Data Transfer")
            elif "BoxUsage" in usage_type or "CPUCredits" in usage_type:
                category.extend(["Instance", re.sub(r"^.+:", "", usage_type)])
            else:
                category.extend(
                    [
                        "Uncategorized",
                        usage_type,
                        operation or "(no operation)",
                        resource or "(no resource)",
                    ]
                )
        elif service == "AWSELB":
            category = ["ELB"]
            if "DataTransfer" in usage_type:
                category.append("Data Transfer")
            elif "LCUUsage" in usage_type:
                category.append("LCUs")
            elif "LoadBalancerUsage" in usage_type:
                category.append("Load Balancer")
            else:
                category.extend(
                    [
                        "Uncategorized",
                        usage_type,
                        operation or "(no operation)",
                        resource or "(no resource)",
                    ]
                )
        elif service == "AmazonCloudWatch":
            category = ["CloudWatch"]
        elif service == "awskms":
            category = ["KMS"]
    if not project:
        category.extend(
            [
                usage_type,
                operation or "(no operation)",
                resource or "(no resource)",
            ]
        )
    return [project or "Uncategorized", *category]


def add_to_taxonomy(taxonomy, category, item):
    if category:
        categories = taxonomy.setdefault("categories", {})
        add_to_taxonomy(categories.setdefault(category[0], {}), category[1:], item)
    else:
        taxonomy.setdefault("items", []).append(item)
    taxonomy.setdefault("cost", 0)
    taxonomy["cost"] += float(item["lineItem/UnblendedCost"])


def uncategorized_last(key):
    return (key == "Uncategorized", key)


def print_taxonomy(taxonomy, indent="", file=sys.stdout):
    categories = taxonomy.get("categories", {})
    for category in sorted(categories, key=uncategorized_last):
        subtaxonomy = categories[category]
        cost = subtaxonomy["cost"]
        if cost < 0.01:
            continue
        print(f"{indent}{category} :: ${cost:.2f}", file=file)
        print_taxonomy(subtaxonomy, indent=indent + "  ", file=file)


def classify_costs(csv_path, **kwargs):
    all_items = read_csv(csv_path)
    items = []
    for item in all_items:
        cost = item["lineItem/UnblendedCost"]
        if cost and float(cost):
            items.append({**item, "lineItem/UnblendedCost": float(cost)})
    taxonomy = {}
    for item in embed_taxes(items):
        add_to_taxonomy(taxonomy, ["AWS", *classify_line_item(item, **kwargs)], item)
    return taxonomy


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("date")
    parser.add_argument("-f", "--force-download", action="store_true")
    parser.add_argument("-w", "--write", action="store_true")
    args = parser.parse_args()
    year, month = map(int, args.date.split("-"))
    billing_month = f"{year}-{month:02d}"
    csv_path = get_csv(year, month, force_download=args.force_download)
    taxonomy = classify_costs(csv_path, billing_month=billing_month)
    print_taxonomy(taxonomy)
    if args.write:
        riju_taxonomy = taxonomy["categories"]["AWS"]
        riju_taxonomy["categories"] = {"Riju": riju_taxonomy["categories"]["Riju"]}
        target_dir = ROOT / f"{year}-{month:02d}"
        with open(target_dir / "breakdown.txt", "w") as f:
            print_taxonomy(riju_taxonomy, file=f)


if __name__ == "__main__":
    main()
    sys.exit(0)

financials/poetry.lock generated

@ -1,135 +0,0 @@
[[package]]
name = "boto3"
version = "1.18.23"
description = "The AWS SDK for Python"
category = "main"
optional = false
python-versions = ">= 3.6"
[package.dependencies]
botocore = ">=1.21.23,<1.22.0"
jmespath = ">=0.7.1,<1.0.0"
s3transfer = ">=0.5.0,<0.6.0"
[package.extras]
crt = ["botocore[crt] (>=1.21.0,<2.0a0)"]
[[package]]
name = "botocore"
version = "1.21.23"
description = "Low-level, data-driven core of boto 3."
category = "main"
optional = false
python-versions = ">= 3.6"
[package.dependencies]
jmespath = ">=0.7.1,<1.0.0"
python-dateutil = ">=2.1,<3.0.0"
urllib3 = ">=1.25.4,<1.27"
[package.extras]
crt = ["awscrt (==0.11.24)"]
[[package]]
name = "jmespath"
version = "0.10.0"
description = "JSON Matching Expressions"
category = "main"
optional = false
python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*"
[[package]]
name = "python-dateutil"
version = "2.8.2"
description = "Extensions to the standard Python datetime module"
category = "main"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
[package.dependencies]
six = ">=1.5"
[[package]]
name = "python-dotenv"
version = "0.19.0"
description = "Read key-value pairs from a .env file and set them as environment variables"
category = "main"
optional = false
python-versions = ">=3.5"
[package.extras]
cli = ["click (>=5.0)"]
[[package]]
name = "s3transfer"
version = "0.5.0"
description = "An Amazon S3 Transfer Manager"
category = "main"
optional = false
python-versions = ">= 3.6"
[package.dependencies]
botocore = ">=1.12.36,<2.0a.0"
[package.extras]
crt = ["botocore[crt] (>=1.20.29,<2.0a.0)"]
[[package]]
name = "six"
version = "1.16.0"
description = "Python 2 and 3 compatibility utilities"
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*"
[[package]]
name = "urllib3"
version = "1.26.6"
description = "HTTP library with thread-safe connection pooling, file post, and more."
category = "main"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, <4"
[package.extras]
brotli = ["brotlipy (>=0.6.0)"]
secure = ["pyOpenSSL (>=0.14)", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "certifi", "ipaddress"]
socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"]
[metadata]
lock-version = "1.1"
python-versions = "^3.9"
content-hash = "170b0bcf9f0ae12c4c9e1daa195ecdb39585494414b88e53e3da72916eb52c51"
[metadata.files]
boto3 = [
{file = "boto3-1.18.23-py3-none-any.whl", hash = "sha256:1b08ace99e7b92965780e5ce759430ad62b7b7e037560bc772f9a8789f4f36d2"},
{file = "boto3-1.18.23.tar.gz", hash = "sha256:31cc69e665f773390c4c17ce340d2420e45fbac51d46d945cc4a58d483ec5da6"},
]
botocore = [
{file = "botocore-1.21.23-py3-none-any.whl", hash = "sha256:3877d69e0b718b786f1696cd04ddbdb3a57aef6adb0239a29aa88754489849a4"},
{file = "botocore-1.21.23.tar.gz", hash = "sha256:d0146d31dbc475942b578b47dd5bcf94d18fbce8c6d2ce5f12195e005de9b754"},
]
jmespath = [
{file = "jmespath-0.10.0-py2.py3-none-any.whl", hash = "sha256:cdf6525904cc597730141d61b36f2e4b8ecc257c420fa2f4549bac2c2d0cb72f"},
{file = "jmespath-0.10.0.tar.gz", hash = "sha256:b85d0567b8666149a93172712e68920734333c0ce7e89b78b3e987f71e5ed4f9"},
]
python-dateutil = [
{file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
{file = "python_dateutil-2.8.2-py2.py3-none-any.whl", hash = "sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"},
]
python-dotenv = [
{file = "python-dotenv-0.19.0.tar.gz", hash = "sha256:f521bc2ac9a8e03c736f62911605c5d83970021e3fa95b37d769e2bbbe9b6172"},
{file = "python_dotenv-0.19.0-py2.py3-none-any.whl", hash = "sha256:aae25dc1ebe97c420f50b81fb0e5c949659af713f31fdb63c749ca68748f34b1"},
]
s3transfer = [
{file = "s3transfer-0.5.0-py3-none-any.whl", hash = "sha256:9c1dc369814391a6bda20ebbf4b70a0f34630592c9aa520856bf384916af2803"},
{file = "s3transfer-0.5.0.tar.gz", hash = "sha256:50ed823e1dc5868ad40c8dc92072f757aa0e653a192845c94a3b676f4a62da4c"},
]
six = [
{file = "six-1.16.0-py2.py3-none-any.whl", hash = "sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"},
{file = "six-1.16.0.tar.gz", hash = "sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926"},
]
urllib3 = [
{file = "urllib3-1.26.6-py2.py3-none-any.whl", hash = "sha256:39fb8672126159acb139a7718dd10806104dec1e2f0f6c88aab05d17df10c8d4"},
{file = "urllib3-1.26.6.tar.gz", hash = "sha256:f57b4c16c62fa2760b7e3d97c35b255512fb6b59a259730f36ba32ce9f8e342f"},
]


@ -1,16 +0,0 @@
[tool.poetry]
name = "riju-financials"
version = "0.1.0"
description = "Financial data for Riju hosting"
authors = ["Radian LLC <contact+riju@radian.codes>"]
[tool.poetry.dependencies]
python = "^3.9"
boto3 = "^1.18.23"
python-dotenv = "^0.19.0"
[tool.poetry.dev-dependencies]
[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"