Gustav Brock
Gustav at cactus.dk
Wed Sep 30 03:13:18 CDT 2009
Hi all

Has anyone worked seriously with minimizing the power consumption of small servers? Here, charges for electricity have reached a new high of about $0.35/0.25 per kWh. This means that power costs for a machine drawing 200 W continuously will reach $610/430 per year. No kidding.

Thus, for ourselves and a couple of clients, we are trying to work out a model for reducing these costs while, of course, not losing features or services. One method is easy: replace existing servers with new models that use less power. However, that is not enough. Other ideas are:

1. Move services to "the cloud". This cuts power costs as well as machine inventory costs to zero but introduces new costs for renting CPU time and probably for higher-bandwidth internet connections.

2. Introduce standby modes for servers. Quite often servers are idle at night, or for most of the night. But is the OS (typically Windows) able to put the machine into some standby level while keeping it responsive?

3. Introduce shutdown of servers. If the OS permits, shut down the server after some period of inactivity. Power must then be turned back on by a remote Wake-on-LAN call, which most machines are capable of.

4. Move the server OS to laptops. Laptops are by definition built to consume as little power as possible, and they have all sorts of hardware that can control, or be controlled to use, only the power needed. As an additional bonus, laptops have a built-in emergency power supply (the battery).

If any of you have had similar considerations and/or practical experience implementing these, I would be pleased to learn.

/gustav
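For reference, the arithmetic behind those yearly figures can be sketched like this (a minimal Python sketch; the function name `annual_cost` is my own):

```python
def annual_cost(watts: float, price_per_kwh: float, hours: float = 24 * 365) -> float:
    """Yearly electricity cost for a machine drawing constant power."""
    kwh_per_year = watts / 1000 * hours  # 200 W continuous -> 1752 kWh/year
    return kwh_per_year * price_per_kwh

# 200 W at $0.35/kWh and at $0.25/kWh:
print(round(annual_cost(200, 0.35)))  # → 613
print(round(annual_cost(200, 0.25)))  # → 438
```

That is roughly the $610/430 quoted above, so a single always-on 200 W box really does cost that much per year at these rates.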
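On option 3: the Wake-on-LAN "magic packet" is simple enough to generate yourself, so the remote wake-up call needs no special software on the sending side. A minimal Python sketch (the function names and the example MAC address are mine, not from any particular library):

```python
import socket

def magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: six 0xFF bytes followed by
    the target MAC address repeated 16 times (102 bytes total)."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Send the magic packet as a UDP broadcast (port 9 is customary;
    port 7 is also seen). The target NIC must have WoL enabled in
    BIOS/driver settings for this to wake the machine."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# send_wol("00:11:22:33:44:55")  # placeholder MAC; use the server NIC's address
```

Note that a plain broadcast like this only works on the local subnet; waking a server across the internet needs a router that forwards directed broadcasts or a small always-on box (which somewhat defeats the purpose) to relay the packet.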