Your Biggest Household Energy Hog? Your Smartphone!

Your iPhone Does NOT Use More Power Than a Fridge

Not too long ago we were marveling at how the iPhone 5 cost only ~$0.41 per year to charge, concluding that “energy consumption of a modern smartphone is minuscule”. But charging is only part of the picture: once the migration to wireless networks and the cloud is factored in, that smartphone accounts for more energy than TWO refrigerators!
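That charging figure is easy to sanity-check with back-of-envelope arithmetic. A rough sketch (the battery capacity, charger efficiency, and electricity price below are assumptions, not figures from this article):

```python
# Rough sanity check of the "~$0.41/year to charge" claim.
# Assumed inputs: iPhone 5 battery ~5.45 Wh, one full charge per day,
# ~75% charger efficiency, US-average electricity at ~$0.12/kWh.
BATTERY_WH = 5.45
CHARGES_PER_DAY = 1
CHARGER_EFFICIENCY = 0.75
PRICE_PER_KWH = 0.12

# Energy drawn from the wall per year, in kWh
annual_kwh = BATTERY_WH * CHARGES_PER_DAY * 365 / CHARGER_EFFICIENCY / 1000
annual_cost = annual_kwh * PRICE_PER_KWH
print(f"~{annual_kwh:.1f} kWh/year, ~${annual_cost:.2f}/year")
```

With these assumptions the result lands in the same ballpark as the quoted figure: a couple of kWh and a few tens of cents per year.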

The new report from IEEE Spectrum notes that all of the transitions occurring as we move from a world of desktops and landlines to a 3-screen world of laptop-tablet-smartphone require more energy.

I have seen this information twisted in the past week to claim that phones themselves use more power than household appliances. That, as noted, is silly – hence the image. Even the most efficient new Energy Star large double-door fridge costs close to $100 per year to run.

So what does this data mean?

As our phones have moved from wired landlines to cordless handsets to analog cell phones to 3G and 4G, the required infrastructure and power have exploded. Data networks have shorter ranges and higher power demands, so we need more towers and transmitters just to serve existing customers as they upgrade. And as more and more people buy smartphones, network bandwidth and power requirements grow, which continues to expand the need for power.

The same is true as we have moved away from wired data connections for tablets and laptops. Our base stations and extenders at home consume considerable power, and as we move more and more of our data to cloud storage, it requires huge server farms and massive data centers – all of which means more power and infrastructure. And guess what? Cloud means wireless – so you need more wireless bandwidth and power to feed that as well!

In the end, something that costs my family less to charge for a year than a single Dunkin’ Donuts coffee ends up being a huge burden on the world’s natural resources.

Here are a few more findings:

• The global ICT ecosystem uses about 1,500 terawatt-hours of electricity annually, equal to the electricity used by Japan and Germany combined. That figure will increase as cloud architecture overtakes wired networks.
• This growth is also driven by “revolutions in everything from healthcare to cars, and from factories to farms.”
• A 50-megawatt enterprise data center in Iowa that gets 70 percent of its energy from coal and 25 percent from wind will cost $350 million less to run over its lifetime compared with one in California, which has no coal in its energy mix.
• However, environmental regulations could eliminate the favored position coal enjoys because of its low cost. For instance, U.S. President Barack Obama has called for existing coal-fired power plants to meet more stringent greenhouse gas emissions standards.
• In India (the second-largest mobile market after China), 75 percent of rural and 33 percent of urban cellphone base stations will need to run on alternative energy by 2020.
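The headline phone-vs-fridge comparison can also be checked with rough arithmetic. A hedged sketch – the per-phone network/data-center figure and fridge consumption below are assumptions commonly quoted in discussions of the report, not numbers from this article:

```python
# Back-of-envelope look at the "more energy than two refrigerators" claim.
# Assumed inputs: ~700 kWh/year of network and data-center energy attributed
# to a heavily used smartphone, ~330 kWh/year for an efficient modern fridge,
# and ~2.7 kWh/year for charging the phone itself.
NETWORK_KWH = 700     # upstream infrastructure share per heavy smartphone user
FRIDGE_KWH = 330      # efficient Energy Star refrigerator, annual
CHARGING_KWH = 2.7    # the phone's own charging, annual

print(f"network/cloud share: about {NETWORK_KWH / FRIDGE_KWH:.1f}x a fridge")
print(f"charging alone: about {CHARGING_KWH / FRIDGE_KWH:.1%} of a fridge")
```

The point the sketch makes is the article's thesis in miniature: the device's own draw is a rounding error; the infrastructure behind it is what dwarfs the fridge.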

This holistic look at power consumption takes us out of our comfort zone – it felt a lot better thinking that, whether we bought Apple or Samsung, we’d be using well under $1 a year in power. The reality, however, is that all of our choices and changes have outcomes and consequences.

Categories: Editorials


2 replies

  1. >Our base stations and expanders at home consume considerable power, and as we move more and more of our data to Cloud storage it requires huge server farms and massive data centers – all of which means more power and infrastructure.

    Point taken! But two things to consider:

    1) A lot of that infrastructure is shared by desktops, other servers, and even large portions of (or entire!) companies as they look for ways to reduce costs. Take for example Amazon Web Services, which powers Netflix, reddit, Instagram, Ericsson, Adobe, and Spotify, just to name a few brands. The push toward the cloud reduces the need for different companies to purchase, power, and cool their own infrastructure – infrastructure that these disparate companies would be highly unlikely to use to its fullest, which by definition means wasted resources.

    2) A common knock against electric vehicles is that for the most part electricity is generated by heavily polluting coal plants (at least that’s the case for the United States; not so much for, say, Germany, Iceland, or France). On the other hand, it’s much easier to control/treat the emissions from one smokestack than a thousand different tailpipes. Additionally, advanced technologies can be applied more easily to a single boiler than a thousand internal combustion engines already on the road.

    In similar fashion, centralized computing (heh, the more things change, the more they stay the same, right?) makes it easier to swap in more efficient processors, storage media, etc. than having the IT department cart away a stack of Dell desktops at the end of another three-year lease. The thin clients at the other end only have to be powerful enough to handle some basic local computing and display the virtual machines they interface with, which means fewer upgrades are necessary. It’s not too far a leap to think the same thing might happen at some point with mobile devices. And since power (for both processing and cooling) is one of the primary expenses in running a data center, its owners have a very persuasive incentive to make their complexes as efficient as practically possible.

  2. Hmm. Makes me wonder how much power my power line adapter uses??

    Also reminds me I need to get cracking on my church’s website. I have it running on a crusty server in the basement for the moment until we migrate it to our host. Once that’s done I am going to move everything else off of it and onto a Raspberry Pi or something similar with much lower power requirements.
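    For what it's worth, the always-on-device question has a one-line answer. A hedged sketch – the wattages and electricity price are illustrative guesses, not measurements:

```python
# Annual electricity cost of a device that runs 24/7.
# Assumed inputs: ~4 W for a powerline adapter, ~35 W for an old server,
# ~3 W for a Raspberry Pi, electricity at ~$0.12/kWh.
def annual_cost(watts, price_per_kwh=0.12):
    """Return yearly cost in dollars for a device drawing `watts` continuously."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * price_per_kwh

print(f"4 W powerline adapter: ~${annual_cost(4):.2f}/year")
print(f"35 W old server:       ~${annual_cost(35):.2f}/year")
print(f"3 W Raspberry Pi:      ~${annual_cost(3):.2f}/year")
```

    So the adapter is a few dollars a year, and swapping the basement server for a Pi would save real money on top of the hosting move.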