> The decrease is almost entirely due to gains in lighting efficiency in households, and particularly the transition from incandescent (and compact fluorescent) light bulbs to LED light bulbs
I replaced all of my incandescent and fluorescent lighting with LEDs years ago. A decent amount of the hypothetical savings from more efficient lighting was eaten by having even more lumens than previously as quality of light upgrades. Despite that, I did not notice much of a difference since my household had been keeping the lights off unless we needed them.
There was a minor dip in the electric bill from other initiatives (e.g. a heat pump dryer) and my solar panels started producing more than my household used. For years, I ran relatively few computers (general purpose; not counting embedded) in my home compared to others in computing, to try to keep this trend going. In the past few years, I got an electric car and replaced my oil heat with heat pumps. Now my solar panels only produce about 60% of the electricity I use and I have given up on limiting usage to match what I produce.
Anyway, no matter how many efficiency initiatives people adopt, electricity usage is likely to increase rather than drop. That is because we not only find new uses for electricity, but the population keeps growing.
> Anyway, no matter how many efficiency initiatives people adopt, electricity usage is likely to increase rather than drop. That is because we not only find new uses for electricity, but the population keeps growing.
It seems that energy use in the US peaked in 2007, though I do wonder how much of that is down to moving manufacturing abroad.
Energy use applies to far more than just electricity. It includes petroleum used in automobiles and heating. While the total energy pie is shrinking (for now), the market share of electricity should be increasing.
When there is no more energy usage to move to electric cars and heat pumps (and no more production to move to China), I would expect energy usage to start increasing again. Jevons paradox cannot be avoided forever.
Jevons paradox is not some universal law. It's not even clear whether it applies very often. One has to separate out the different reasons why consumption of some resource increases as it gets cheaper, and not just observe a correlation and infer causation.
It is related to supply and demand curves: as more supply becomes available at the same cost, demand increases to use it. In any case, technology moves forward, stimulating increased electricity usage, and we are healthier because of it. My heat pump and electric car eliminate nitrogen oxide (NOx) and sulfur oxide (SOx) emissions, as well as particulate emissions. Respiratory health is improved because of it.
That’s only true up to a point. PC prices fell dramatically in real terms over time, as people no longer needed to spend $2,000 in mid-90s dollars to get vastly more performance at home.
But more interesting is the trend of upgrading less frequently.
Not in a way that makes those trends correlate. Worldwide PC sales peaked in 2011, but they continued to get cheaper in real terms.
Sales trends for new technology should be broken down into new customers and replacement devices for existing customers if you want to understand what’s going on. Falling prices did little for either trend, but replacements were providing an echo of earlier growth. Until that was offset by people taking ever longer to buy replacements.
2011 is long after the major price drops of the 90s and early 00s, where demand kept increasing with price drops. Not only that, but every few years, you could buy a computer that was a few times better than the one that preceded it at a lower price.
PC sales began to drop in 2012 due to the iPhone and iPad causing people to not need PCs as much. Improvements in PC technology had also started to slow and would reach a crawl a few years later. All of this meant market saturation. Supply and demand still apply, but increases in sales volume from lower pricing are much more limited than they used to be.
For anyone who does not remember, on May 7, 1997, Intel introduced the Pentium II at 300MHz at a cost of $1,981. On March 8, 2000, Intel introduced the Pentium III at 1000MHz at a cost of $990. It was roughly 3 times faster on average, while being half the price of the older processor that had launched less than 3 years prior. The performance had jumped so much that unfortunately no one seems to have included the two chips in a comparison. Anyway, no three-year period from 2011 onward had performance jump so much while pricing halved. The closest was the Ryzen 7 1800X to the Ryzen 9 3950X, but the performance increase was more tame (~2.5x best case compared to ~3x average case) and the price went up by 50% instead of halving. However, you can see similar improvements if you go further back in time, such as the 3 years from the 100MHz Pentium to the Pentium II launch (I cannot find how much the former cost, sadly).
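To make that concrete, here is a quick back-of-the-envelope sketch in Python (the ~3x average speedup and the launch prices are the figures quoted above; everything else is illustrative):

    # Performance per dollar, using the launch figures quoted above.
    pentium_ii_300   = {"price": 1981, "relative_perf": 1.0}  # May 1997
    pentium_iii_1000 = {"price": 990,  "relative_perf": 3.0}  # March 2000, ~3x average

    ppd_old = pentium_ii_300["relative_perf"] / pentium_ii_300["price"]
    ppd_new = pentium_iii_1000["relative_perf"] / pentium_iii_1000["price"]

    print(ppd_new / ppd_old)  # ~6.0x more performance per dollar in under 3 years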
I noticed a vastly larger jump between HDDs and SSDs, which hit for most people after 2011, than between PIIs and PIIIs. People generally buy systems, not individual components, and in practice it’s mostly bottlenecks that matter. PIIs and PIIIs had the same RAM latency as earlier and later CPUs because that’s bound by the speed of light, thus the explosion in cache size.
> 2011 is long after the major price drops of the 90s and early 00s
Not in terms of actual average PC prices, which are down ~40% from their 2011 average. That’s a big discount in real-world prices. Your price comparison of $1k+ CPUs is irrelevant when so few of them entered customers’ hands at those prices. Sure, nVidia is selling $2,000 GPUs today, but the average PC is ~$650; that’s the number that matters.
Being down 40% is nothing compared to how you used to be able to get a PC that is many times more capable than one from a few years earlier, for half the price.
As for NAND flash SSDs, the Intel X25-M came out in 2008.
Being available isn’t becoming mainstream. Most new computers in 2011 still had HDDs due to cost.
Similarly, the average price people actually paid for a PC didn’t suddenly drop by half over 3 years as you proposed. Over 15 years, that kind of price decline would mean people went from $4,000 in 2025 dollars to $125, which simply didn’t happen.
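To spell out that compounding (purely illustrative numbers, assuming the proposed halving every 3 years):

    price = 4000.0            # hypothetical starting price in 2025 dollars
    for _ in range(15 // 3):  # five 3-year halvings over 15 years
        price /= 2
    print(price)              # 125.0 -- which clearly did not happen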
I could cherry-pick numbers suggesting the opposite trend. nVidia’s top-of-the-line card more than doubled in price, adjusted for inflation, between 1999 and now. But back then most people didn’t even buy 3D graphics cards, and today integrated graphics is by far the most common.
My power usage has dropped by maybe 50% (a bit of a guess, based partly on electricity bills), and quality of life has increased - I simply didn't consider that these options worked better overall until I tried them for power-saving purposes.
It actually taught me a lot about innovation - there is no substitute for just trying things; you just can't know by thinking about them ahead of time. A picture is worth 1,000 words, and an experience is worth 100,000 pictures - you can't really convey it in pictures. As a result, I 'just try things' whenever I have the opportunity, and my learning rate has increased dramatically.
There are two questions about efficiency upgrades:
How much power do you use compared to the alternative? It may increase, but more efficiency means it increases less than the alternative.
How much power do you use absolutely? The physics of climate change responds to absolute usage, and doesn't care about the hypothetical alternative.
My household energy usage has decreased, but my household electricity usage continues to rise.
In the early days, our heating oil usage was 1000 gallons per year. Efficiency initiatives reduced that to about 430 gallons per year. 1000 gallons is about 41 MWh; 430 gallons is about 17 MWh. Going to the heat pump has me using about 7 MWh extra electricity per year, while yearly electricity usage had been around 11 MWh per year (with 10 MWh produced by solar). This does not count the car, for which the numbers would be even more skewed by life changes, since I drove about 1 to 2 orders of magnitude more when I was in college.
Depending on how you decide to do the accounting, my household’s total energy usage dropped by 65% (if we count the oil usage reductions on top of the heat pump) or by 35% (if we count only the heat pump), without even counting the solar panels. Still, my electricity usage has never been higher.
If you were to “cook the books,” so to speak, you could say my household’s total energy usage has decreased by 85%, with an electricity usage decrease of 27%, by treating the energy from the solar panels as if it were free. I do not think that is a correct way of doing the accounting (although it could be a matter of opinion).
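For concreteness, here is a rough sketch of how those percentages fall out of the numbers above (all values are the approximate MWh/year figures already given; only the choice of baseline changes):

    oil_before = 41  # ~1000 gal/yr of heating oil
    oil_after  = 17  # ~430 gal/yr after the old efficiency initiatives
    grid       = 11  # yearly household electricity before the heat pump
    heat_pump  = 7   # extra electricity the heat pump now uses
    solar      = 10  # yearly solar production

    now = grid + heat_pump  # 18 MWh/yr, all electric

    # Counting the old oil reductions on top of the heat pump:
    print(1 - now / (oil_before + grid))     # ~0.65, i.e. 65% lower

    # Counting only the heat pump:
    print(1 - now / (oil_after + grid))      # ~0.36, i.e. ~35% lower

    # "Cooked books": treat solar output as free energy.
    cooked = now - solar                     # 8 MWh/yr
    print(1 - cooked / (oil_before + grid))  # ~0.85
    print(1 - cooked / grid)                 # ~0.27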
By the way, remarks on climate change could encourage people to claim unrealistic improvements in personal/household energy usage, such as the figures I gave for what I could claim if we “cooked the books”. Of the various figures I gave, I think the 35% reduction in total energy usage is the most honest one. It was achieved in the past 2 years, unlike the other factors I mentioned, which are many years old.
Thou hast sinned against the 2nd law of thermodynamics!
1,000 gallons of heating oil hold ~41 MWh of chemical potential energy (if using the HHV). Those 41 MWh are not directly comparable to 7 MWh of electricity. While the units (MWh) are the same, they are measuring two completely different things.
They have as much in common as an American dollar and a Jamaican dollar; yes they're both "dollars", but they refer to different things.
In terms of energy usage, they are the same; that is why the units match after conversion. In any case, I replaced substantial energy usage from heating oil with less substantial usage of electricity. Here is a fun fact: it is more energy efficient to run a heat pump off electricity from a commercial generator burning diesel to heat a home than it is to burn the diesel directly. This includes grid losses. The reason is that heat pumps exceed 100% efficiency, because the environmental energy moved is free as far as accounting is concerned. In my case, the electricity is produced by burning natural gas rather than diesel.
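A rough sketch of that fun fact (the generator and grid efficiencies here are illustrative assumptions, not measured values; the 86% boiler figure and the COP of ~3 come up later in the thread):

    fuel = 1.0  # 1 MWh of chemical energy in the diesel

    # Path 1: burn the diesel directly in an ~86%-efficient boiler.
    direct_heat = fuel * 0.86                  # 0.86 MWh of heat delivered

    # Path 2: generate electricity (~40%-efficient large diesel generator),
    # lose ~5% in the grid, then run a heat pump with an average COP of ~3.
    heat_pump_heat = fuel * 0.40 * 0.95 * 3.0  # ~1.14 MWh of heat delivered

    print(direct_heat, heat_pump_heat)  # the heat pump path wins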
By the way, I realize my numbers are borderline in showing that, but the past year has been colder than the years when I burned oil, and it is hard to account for that when looking at the numbers, as there is no strict control.
They are not the same, though. This leads to issues with prominent people (see Vaclav Smil) making the case that it'll be too difficult to transition away from fossil fuels, because they conflate these two things. Some do it out of ignorance; others know better and do it because it's in their financial interest.
As long as we conflate the two, people will more easily be misinformed and think they need to replace their 41 MWh of heating oil with 41 MWh of electricity. But they don't. They need at most 41 MWh of heating. And as you said, your heat pump is probably getting you an average COP of at least 3, meaning you will pay for far fewer "MWh" to get the same amount of heat into your house.
It is more efficient, just as it's more efficient to charge an EV off a grid running on natural gas than it is to burn petrol in an ICE. Both are also far better for pollution.
The two are the same as far as my computation of my household’s reduction in energy usage is concerned. That is why I converted to the same units.
As for needing 41 MWh of heat, that is incorrect, as my boiler was only 86% efficient and the one it replaced was even less efficient. It is also incorrect because the efficiency gains had already reduced oil usage to 18 MWh. Heat-wise, I only need around 15 MWh per year (although I likely needed more this year, since it was particularly cold).
I have a suspicion that the ducted method of heat delivery used by my heat pump has more losses than the hot water system previously used to deliver heat. I had been sealing the central AC ducts in the winter to save a few hundred gallons of oil usage. I can no longer do that as heat is delivered via those ducts now.
I don’t disagree with you about furnace efficiency.
But my point still stands: you needed 18 MWh of oil or 15 MWh of heating. Neither of those numbers is how much electricity you will need to run a heat pump.
Dividing 15 MWh by the average COP should determine it, although heat demands change from year to year and temperatures vary, causing not only the amount of heat needed to vary, but the average COP to vary too.
This does not matter for total energy usage calculations unless you consider the environmental heat to be an input, but as far as the industry is concerned, it is free, which is why the COP for heat pumps is greater than 1.
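As a minimal example (the COP of 3 is an assumed seasonal average, not a measured figure):

    heat_needed = 15.0  # MWh of heat per year, from the figures above
    avg_cop = 3.0       # assumed seasonal average COP

    print(heat_needed / avg_cop)  # 5.0 MWh of electricity for 15 MWh of heat

Note that the ~7 MWh of extra electricity actually observed would imply an average COP closer to 2, which would be consistent with the duct losses suspected above.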
If you were comparing gasoline in a car to an EV, I would maybe see what you are talking about: engines are like 30% efficient, so the conversion to useful energy requires a large multiple of the potential energy.
But in the case of an oil- or gas-fired furnace, the thermal efficiency is at least 80%, and often more, so its potential energy usage is close enough to be directly comparable to its heating value.
Mine was 86% efficient, although for accounting purposes, I considered the waste energy to be part of energy consumption, which seems to be the most sensible way of doing it.
I did not mean to imply you did. I just stated that my own data on my energy usage shows a decrease, but how much depends on how the accounting is done, and I can “cook the books” to produce some truly absurd figures by considering savings from the not-so-recent past, and treating my solar panels as causing a net decrease.
Electricity use went up in your case, but switching from oil heat and an internal combustion car to heat pumps and an EV should mean that your overall energy use has gone down fairly significantly (down to 1/3 or 1/4 of the energy used on heating and driving).
So that's not quite the Jevons paradox unless you're going to drive three times the distance or expand to heating four times as much space in your house.
Overall energy efficiency has improved by about 35% in my napkin math, but I find that whenever I improve efficiency, my electricity usage increases. This applies to both electric and non-electric sources of energy, as something new always seems to follow the efficiency gain. The reduction in overall energy usage is a side effect of the heat pump's huge efficiency advantage (although on a dollar basis, costs remain about the same), and it is an anomaly.
Yours seems like an unusual case. Lighting is something that’s least vulnerable to Jevons effects because (after electrification reached maturity) people already use all the artificial light they could ever want and don’t look for ways to use more when it becomes cheaper (more lighting per unit cost). Even if they’re less careful about turning off unused light, LEDs are so much more efficient that they will still use less energy for lighting.
In contrast, energy itself is very vulnerable to Jevons effects because there is always a marginal use of energy it can be applied to as more is freed up.
My thinking is that we should look at the electricity usage rather than the subset of electricity used by lighting, since the bill going down because of lighting means the bill can go up because of something new.
Oh yes, I agree with the general point, that the Jevons non-effect in lighting is canceled by the Jevons effect in general energy usage, and that we can't expect energy efficiency improvements alone to reduce the latter.
So there's an important point here: the benefit of energy efficiency improvements is not that they will get total energy usage down by themselves (and thereby hit GHG targets), but rather that they will reduce the total hit to consumption/utility that we experience when regulations disincentivize (GHG-emitting) energy usage.
> people already use all the artificial light they could ever want
Huh ? People put LEDs in so many more places than they were putting incandescent lights. Under desks, beds, in closets, all around walls, over furniture, behind TVs...
Hm, I'd never noticed that phenomenon, but even so, the efficiency gain is so high that, even when you enumerate all those spaces, it doesn't cancel out the energy savings of the shift.
Also, I've noticed that during the transition, when people have a mix of incandescent and LED, they internalize the potential efficiency of the LED, and then use the incandescent bulbs in a similar way. So the incandescent light on the hall table that's more appealing gets left on for longer periods as if it were an LED.
I'm guessing this is more common in households with legacy light fixtures and legacy nostalgias, though, and that's its own diminishing set.
I can't tell whether this is the same as the Jevons paradox, but I can observe that once some economical behaviour is imposed on or incentivized for people, the government taxes the object of that behaviour more. Also, once the government promises a subsidy, the cost of the subsidized good rises by the amount of the subsidy. I would name it the Greedo Paradox.
I highly doubt that Jevons paradox is going to apply much here (and the data in the article seems to show this), simply because lighting was already cheap enough so that most people never gave a second thought to using lightbulbs (indeed, ask any parent that has ever yelled "turn off the lights" to their kids). Jevons paradox applies when there is strong demand for a good, but it is too expensive to be widely deployed, but a decrease in price of that good then allows many more uses of it. While sure there may be some edge cases (a sibling comment mentioned big flood lights for their yard), it's not like most people had dark rooms before where they thought "darn, if only light were cheaper"...
> A decent amount of the hypothetical savings from more efficient lighting was eaten by having even more lumens than previously as quality of light upgrades
I'm not sure what you mean by "quality of light upgrades". To be honest, I hate the quality of light that LEDs give off. It's too "white bright", and I much prefer the quality of incandescent bulbs. I usually run LEDs much dimmer than they can go, because otherwise I feel like there are floodlights in my house. Relatedly, I hate the switch to LED headlights. They're much too bright for oncoming drivers, and many car brands position them much too high.
It was a pun. It should have been quality of life. Our rooms are better lit now, as the lumen output is higher.
On a more serious note, the CRI improved when going from fluorescent lighting to LEDs, when better quality LEDs were used. Thus quality of light is actually meaningful as more than a pun.
I am reminded of Jevons paradox:
https://en.wikipedia.org/wiki/Jevons_paradox