Coding-Challenge: Metering Distributed System

To enable billing of cloud resources to customers, resource consumption needs to be accurately metered. Customers can order various services, such as databases or compute engines, which are defined using Stock Keeping Units (SKUs). A SKU represents an orderable product variant, such as a compute instance with a specific number of vCPUs and a specific amount of memory. Each SKU has a predefined price per unit/hour. For simplicity, we assume that usage is received per hour (i.e. for fixed hourly periods such as 08:00 to 09:00 or 14:00 to 15:00) via the API.
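As a rough illustration of this pricing model (the SKU name, price, and types below are made up for the example and are not taken from this repository), the cost of one hourly usage record is simply the units used multiplied by the SKU's price per unit/hour:

```kotlin
import java.time.Instant

// A SKU is an orderable product variant with a fixed price per unit/hour.
// A real billing system would likely use BigDecimal or integer cents instead of Double.
data class Sku(val id: String, val pricePerUnitHour: Double)

// One hourly usage report from a service instance, e.g. for 08:00 to 09:00.
data class HourlyUsage(
    val instanceId: String,
    val skuId: String,
    val periodStart: Instant,
    val unitsUsed: Double
)

// Cost of a single fixed hourly period.
fun cost(usage: HourlyUsage, sku: Sku): Double = usage.unitsUsed * sku.pricePerUnitHour

fun main() {
    val sku = Sku(id = "compute.4vcpu.16gb", pricePerUnitHour = 0.08)
    val usage = HourlyUsage(
        instanceId = "instance-42",
        skuId = sku.id,
        periodStart = Instant.parse("2024-05-01T08:00:00Z"),
        unitsUsed = 4.0
    )
    println(cost(usage, sku)) // 0.32 for the 08:00 to 09:00 hour
}
```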

Design and implement a REST API for cloud resource metering and cost calculation (preferably in Go, but Kotlin is also possible). The API should adhere to the following requirements:

  • Receive usage data from service instances, including instance ID, SKU, and usage details
  • Implement a validation that rejects duplicate usage reports for the same instance and point in time
  • It should be possible to add additional validations in the future (see the validation sketch after this list)
  • Implement an endpoint that aggregates costs by instance and SKU for a specific time period, with pagination support
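One way to keep validation extensible is a small chain of validation rules, with the duplicate check as just the first implementation. The sketch below is only an illustration under that assumption; the type and function names are not the actual API of this project:

```kotlin
import java.time.Instant

data class UsageRecord(
    val instanceId: String,
    val sku: String,
    val periodStart: Instant, // fixed hourly period, e.g. 08:00 to 09:00
    val unitsUsed: Double
)

// Adding a future validation means adding another implementation of this interface.
fun interface UsageValidation {
    fun validate(record: UsageRecord): List<String> // violation messages, empty if valid
}

// Rejects a second usage report for the same instance and hourly period.
class DuplicateUsageValidation(
    private val alreadyReported: (instanceId: String, periodStart: Instant) -> Boolean
) : UsageValidation {
    override fun validate(record: UsageRecord): List<String> =
        if (alreadyReported(record.instanceId, record.periodStart))
            listOf("usage for ${record.instanceId} at ${record.periodStart} was already reported")
        else
            emptyList()
}

// The API handler runs all registered validations before accepting a record.
class UsageValidator(private val validations: List<UsageValidation>) {
    fun validate(record: UsageRecord): List<String> = validations.flatMap { it.validate(record) }
}

fun main() {
    // In-memory stand-in for a repository lookup; adding to the set also "persists" the record.
    val seen = mutableSetOf<Pair<String, Instant>>()
    val validator = UsageValidator(
        listOf(DuplicateUsageValidation { id, start -> !seen.add(id to start) })
    )
    val record = UsageRecord(
        "instance-42", "compute.4vcpu.16gb",
        Instant.parse("2024-05-01T08:00:00Z"), 4.0
    )
    println(validator.validate(record)) // [] on the first submission
    println(validator.validate(record)) // duplicate violation on the second submission
}
```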

Getting started guide:

  1. This project contains a Docker Compose file named compose.yaml. Start it with Docker Compose before using the project; it includes the following containers: a Kafka broker, PostgreSQL, and Kafka UI
  2. Start the billing-engine and the frontend; both live in this monorepo
  3. Execute all tests in billing-engine/src/test/
  4. Open http://localhost:4200 and choose the first tenant
  5. Watch the dashboard numbers grow
  6. When you want to restart from a clean state (to reset Kafka Streams), delete the Kafka broker together with its volume, empty the BillingHistory table, and remove billing-engine/tmp

Software Concepts:

Relational Model:

(relational model diagram image)

Software Architecture:

(software architecture diagram image)

Key Benefits:

  • Very good scalability
  • Less load on the database, because billing updates are mostly calculated in Kafka and the work is spread more evenly (see the Kafka Streams sketch below)
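To make that benefit concrete, a minimal Kafka Streams topology in this spirit could fold hourly usage events into running cost totals per instance. Note that the topic names, the value encoding, and the price lookup below are assumptions for illustration, not the repository's actual topology:

```kotlin
import org.apache.kafka.common.serialization.Serdes
import org.apache.kafka.common.utils.Bytes
import org.apache.kafka.streams.KafkaStreams
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.StreamsConfig
import org.apache.kafka.streams.kstream.Consumed
import org.apache.kafka.streams.kstream.Grouped
import org.apache.kafka.streams.kstream.Materialized
import org.apache.kafka.streams.kstream.Produced
import org.apache.kafka.streams.state.KeyValueStore
import java.util.Properties

fun main() {
    val builder = StreamsBuilder()

    // usage-events: key = instanceId, value = "<sku>,<unitsUsed>" (simplified encoding)
    builder.stream("usage-events", Consumed.with(Serdes.String(), Serdes.String()))
        .mapValues { value ->
            val (sku, units) = value.split(",")
            // Hypothetical price table: SKU -> price per unit/hour.
            val pricePerUnitHour = mapOf("vm.small" to 0.05, "db.medium" to 0.20)[sku] ?: 0.0
            pricePerUnitHour * units.toDouble()
        }
        .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
        // Running cost total per instance, kept in a local state store.
        .reduce(
            { total, cost -> total + cost },
            Materialized.`as`<String, Double, KeyValueStore<Bytes, ByteArray>>("cost-per-instance")
        )
        .toStream()
        // A downstream consumer persists these totals, so the database only sees the reduced updates.
        .to("billing-updates", Produced.with(Serdes.String(), Serdes.Double()))

    val props = Properties().apply {
        put(StreamsConfig.APPLICATION_ID_CONFIG, "billing-engine-sketch")
        put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
    }
    KafkaStreams(builder.build(), props).start()
}
```

The point is that the running totals live in Kafka Streams state stores, and only the aggregated updates have to be written to PostgreSQL.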

Things I would do if I had more time:

  • Add more tests, specifically for the Kafka Streams processing
  • Use a more advanced solution to simulate instances
  • Use Testcontainers (a possible setup is sketched after this list)
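For the Testcontainers point, a JUnit 5 integration test could look roughly like the sketch below; the image tags and the test class are illustrative and not part of this repository:

```kotlin
import org.junit.jupiter.api.Test
import org.testcontainers.containers.KafkaContainer
import org.testcontainers.containers.PostgreSQLContainer
import org.testcontainers.junit.jupiter.Container
import org.testcontainers.junit.jupiter.Testcontainers
import org.testcontainers.utility.DockerImageName

@Testcontainers
class BillingEngineIT {

    // Throwaway containers per test run instead of the long-lived compose.yaml stack.
    @Container
    val kafka = KafkaContainer(DockerImageName.parse("confluentinc/cp-kafka:7.6.0"))

    @Container
    val postgres = PostgreSQLContainer<Nothing>(DockerImageName.parse("postgres:16"))

    @Test
    fun `containers are reachable`() {
        // A real test would point the billing-engine configuration at these endpoints
        // and then drive the REST API and Kafka topics against them.
        check(kafka.bootstrapServers.isNotBlank())
        check(postgres.jdbcUrl.startsWith("jdbc:postgresql"))
    }
}
```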
