Startups Stack Exchange Archive

How to estimate server costs for a web application?

Maybe I can find some answers here. How would you estimate costs for a
business plan in the context of a web application running in the cloud?

The problem is that I do not know how the application will perform technically; we
have no prototype yet. At some point we will need to scale…

Answer 1493

Approach #1

I can give you an example of how to estimate the costs for a fictitious application that runs in the cloud (Amazon AWS).

Let's say there is a device that regularly sends requests to a server. The server runs in the cloud and needs to process those requests.

Let's assume that requests are sent every 10 minutes, i.e. 144 times per day, or 4,320 requests per month (144 × 30).

Then we assume that it takes the server 0.005 seconds (5 milliseconds) to process one request. This gives 0.72 seconds of machine time per user per day (144 × 0.005) and 21.6 seconds per user per month.

Let's assume we have 10,000 users. This gives a total machine time of 216,000 seconds per month, or 60 machine hours.

Finally, go to the Amazon AWS website and look up the price per machine hour. At the time this example was created (months ago), it was 22.2 roubles.

Multiply the machine-hour price (22.2) by the number of machine hours per month (60) and you get the total cost of the cloud server infrastructure (1,332 roubles per month).

You can download a spreadsheet with these calculations here.

The diagram below shows how assumptions (yellow ellipses) affect expenses and revenue in that spreadsheet.

Assumptions, expenses and revenue

It goes without saying that you should insert different values into your assumptions to see how the final result changes.
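As a minimal sketch of the same calculation, here is a small Python function that wires the assumptions above (request interval, processing time per request, number of users, price per machine hour) into parameters you can vary:

```python
# Rough cloud-cost estimate based on the assumptions above.
# All figures are illustrative; replace them with your own.

def monthly_cloud_cost(request_interval_min=10,      # one request every 10 minutes
                       seconds_per_request=0.005,    # 5 ms of processing per request
                       users=10_000,
                       price_per_machine_hour=22.2,  # roubles, from the AWS price list
                       days_per_month=30):
    requests_per_day = 24 * 60 / request_interval_min            # 144
    seconds_per_user_per_month = (requests_per_day
                                  * seconds_per_request
                                  * days_per_month)               # 21.6 s
    total_machine_hours = users * seconds_per_user_per_month / 3600  # 60 h
    return total_machine_hours * price_per_machine_hour

print(monthly_cloud_cost())  # ~1332 roubles per month
```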

Approach #2

Apart from Amazon AWS, there is DigitalOcean. There, you pay 5 dollars per month for a virtual machine.

If you can estimate the number of virtual machines you need (their specs are available on the DigitalOcean site), multiply that number by 5 to get the monthly cost.
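A minimal sketch of this approach, assuming you can guess how many requests per second a single 5-dollar machine handles (the capacity figure below is a made-up placeholder, not a DigitalOcean spec; measure it once you have a prototype):

```python
import math

# Approach #2: DigitalOcean-style estimate.
# requests_per_second_per_vm is a hypothetical capacity figure.

def digitalocean_monthly_cost(users=10_000,
                              request_interval_min=10,
                              requests_per_second_per_vm=50,  # assumed capacity per machine
                              price_per_vm=5):                # dollars per month
    total_requests_per_second = users / (request_interval_min * 60)
    machines = max(1, math.ceil(total_requests_per_second / requests_per_second_per_vm))
    return machines, machines * price_per_vm

print(digitalocean_monthly_cost())  # (1, 5) -> one 5-dollar machine under these assumptions
```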


All content is licensed under CC BY-SA 3.0.