Why is "power" important
So historically, when someone started talking about "computer power", my brain went straight to "computer performance". Mainly, I guess, because the only other option is electricity and, hey, who cares? PCs all use much the same amount, and we need computers, so electricity usage is just a cost of doing business, right?
Now that there is such a chasm between the power draw of different types of computer, the price of electricity has blown the roof off many budgets, and most applications will run on "anything", my brain jumps from "power" straight to my energy bill.
Just making a comparison between my desktop Raspberry Pi and my desktop PC, assuming 24×7 operation, an even split between idle and max-load power consumption, and the current UK electricity unit rate of 25.73 p/kWh:
- Raspberry Pi:
  - Average power consumption: 8.50 W
  - Average annual energy consumption: 74.46 kWh
  - Average annual running cost: £19.16
- PC:
  - Average power consumption: 82.50 W
  - Average annual energy consumption: 722.70 kWh
  - Average annual running cost: £185.95
This means the average annual running cost for the PC is approximately £166.79 higher than the Raspberry Pi's. In other words, the PC costs about 9.7 times as much to run.
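If you want to plug in your own wattages or unit rate, here's a minimal Python sketch of the same arithmetic. The figures are just the ones above; the rest is only an assumption-laden back-of-envelope calculation, not a measurement.

```python
# Back-of-envelope annual running cost, assuming 24x7 operation and an
# even split between idle and max-load draw (so we just use the average).
UNIT_RATE_GBP_PER_KWH = 0.2573   # current UK unit rate used above
HOURS_PER_YEAR = 24 * 365

def annual_cost(avg_watts: float) -> tuple[float, float]:
    """Return (annual kWh, annual running cost in GBP) for a given average draw."""
    kwh = avg_watts * HOURS_PER_YEAR / 1000
    return kwh, kwh * UNIT_RATE_GBP_PER_KWH

for name, avg_watts in [("Raspberry Pi", 8.5), ("PC", 82.5)]:
    kwh, cost = annual_cost(avg_watts)
    print(f"{name}: {kwh:.2f} kWh/year, roughly £{cost:.2f}/year")
```

Running that reproduces the numbers above: about £19 a year for the Pi against about £186 for the PC.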
Now you can argue the pro’s and con’s of 24×7 operation, and how much time computers spend running under maximum load, but at the same time my slightly older desktop PC idles at 60 W and both machines run low-power fan-less graphics cards.
Either way, a dozen machines in a small office with a few of them running SETI or bitcoin mining in the background, and you’re easily up around the contract-hire cost of a small BMW. Alternatively for home workers, just running your PC is costing you as much as your Netflix subscription.
You may be thinking “yes, but I ‘need’ the computing power of a PC!” .. I’ve been working as a web developer exclusively on Raspberry Pi machines for the last year, so for the majority of people (and developers) the answer is, that you likely don’t “need” that amount of power.
Where does the differential come from?
Chip makers like Intel and AMD typically use computer architectures derived from the x86 model, currently (generally) referred to as AMD64. Raspberry Pi (and others) have opted for a different architecture, ARM, which is typically the architecture used by mobile phones, so a low power draw is in its DNA.
Just to scale up the comparison: in data centres the main cost is the power consumed, typically dwarfing equipment costs over a relatively short period of time. If you could cut your biggest cost by a factor of 9, what would that be worth?
A few months ago we switched cloud hosting from a company using AMD64 systems to a company using ARM systems (the same low-power architecture as the Raspberry Pi, but with compute power equivalent to AMD64) and, strangely, we find our cloud computing costs somewhat lower. In terms of bang-per-buck, a good bit less than half the price!
In the not too dim-and-distant past I worked for a company with ~30 developers whose monthly cloud hosting bill was more than their payroll costs, by quite some margin. ARM must at least be food for thought!
Just a mention …
There are some low-power versions of AMD64-style chips; notably, Intel have two chips in this arena, the N100 and the newer N150. As I understand it they both idle at around the same sort of power as a Raspberry Pi 5 (maybe slightly higher) but use more juice under load (by 2-3x).
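Purely as an illustration of what "idles about the same, 2-3x under load" could mean over a year, you can reuse the `annual_cost()` sketch from earlier. The wattages below are assumptions picked to match that rough description, not measured figures.

```python
# Hypothetical figures only: assume ~6 W idle for both, ~12 W max for a
# Pi 5 and ~30 W max (about 2.5x) for an N100-class box, averaged 50/50.
for name, idle_w, max_w in [("Raspberry Pi 5 (assumed)", 6.0, 12.0),
                            ("N100 mini PC (assumed)", 6.0, 30.0)]:
    kwh, cost = annual_cost((idle_w + max_w) / 2)
    print(f"{name}: about £{cost:.2f}/year")
```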