Power-Saving Web Pages: Real Or Myth? 424

An anonymous reader writes "Are dark webdesigns an energy saving alternative to a snow white Google? The theory is websites with black backgrounds save energy, based on the assumption that a monitor requires more power to display a white screen than black. Is this a blatant green washing ploy by Blackle.com, or an earnest energy saving tweak for a search tool we use every day? To find out, PCSTATS hooked up an Extech Power Analyzer to a 19" CRT and a 19" LCD and measured power draw — turns out there is a not insignificant difference ..."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Thursday April 19, 2012 @04:32PM (#39738317)

    The article is about the energy efficiency of various webpages. The only problem is that all the numbers are in Watts.

  • Re:Seriously? (Score:2, Informative)

    by Anonymous Coward on Thursday April 19, 2012 @04:33PM (#39738335)
  • OLEDs (Score:4, Informative)

    by imgod2u ( 812837 ) on Thursday April 19, 2012 @04:36PM (#39738387) Homepage

    The idea is valid for all of the smartphones running OLED displays. OLEDs take no power (or very little) to display a black pixel. It takes full power to display white.
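A rough back-of-the-envelope sketch of this point: on an emissive (OLED) panel, power scales roughly with average pixel brightness, whereas an LCD backlight burns about the same regardless. The function name and wattage figures below are illustrative assumptions, not measurements of any real panel:

```python
# Hypothetical first-order model: emissive-panel power grows linearly
# with average pixel brightness. max_power_w (full-white draw) and
# base_power_w (driver overhead on an all-black screen) are assumed.
def oled_panel_power_w(pixels, max_power_w=1.5, base_power_w=0.1):
    """Estimate panel draw from a list of 0.0-1.0 brightness values."""
    avg = sum(pixels) / len(pixels)
    return base_power_w + avg * max_power_w

print(oled_panel_power_w([0.0] * 100))  # all black: only driver overhead
print(oled_panel_power_w([1.0] * 100))  # all white: full emissive power
```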

  • Re:Seriously? (Score:5, Informative)

    by QuasiSteve ( 2042606 ) on Thursday April 19, 2012 @04:39PM (#39738449)

    Seriously?
    Did anyone here actually believe this? The big power draw is from the backlight, which is still running even with black pixels.

    Yes, anyone here actually believed this. I guess in your hurry to post, you misread the double-negative in the summary...

    turns out there is a not insignificant difference

    ...that actually indicates that there is at least a measurable difference.

    Note that their measurements apply specifically to the two models they tested, a CRT and a particular LCD.
    If 'white' means you have to drive the LCD, then white takes more energy. If 'black' means you have to drive the LCD, then black takes more energy. Most LCD drivers are standardized, though - and given the prevalence of lighter content, it may be worth it to the industry (even if only so they can use it in marketing) to switch the defaults.

  • Re:Seriously? (Score:4, Informative)

    by Anonymous Coward on Thursday April 19, 2012 @04:39PM (#39738451)

    The link was to pcstats.com, which actually tested the claim. There was a ~25% difference between all-white and all-black screens on their test CRT, and a ~12% difference between the two on their test LCD.

    They tested a lot more sites than just Google and Blackle.
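The percentage figures quoted in this thread are simple arithmetic on the measured wattages. The all-white numbers below come from the comments here; the all-black LCD wattage (33.8 W) is an assumed value chosen only to be consistent with the quoted ~12%:

```python
# Percent less power drawn by the all-black screen relative to all-white.
def pct_savings(white_w, black_w):
    return (white_w - black_w) / white_w * 100

crt = pct_savings(85.1, 63.0)   # CRT wattages quoted in this thread
lcd = pct_savings(38.4, 33.8)   # LCD black figure is an assumption
print(f"CRT: {crt:.0f}%  LCD: {lcd:.0f}%")  # roughly 26% and 12%
```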

  • Re:Seriously? (Score:4, Informative)

    by betterunixthanunix ( 980855 ) on Thursday April 19, 2012 @04:41PM (#39738483)
    Some monitors will reduce the brightness of the backlight when the screen displays a very dark image.
  • Re:Really? (Score:5, Informative)

    by X0563511 ( 793323 ) on Thursday April 19, 2012 @04:42PM (#39738501) Homepage Journal

    No, it's not. It's supposed to make the screen feel like it has a higher contrast ratio than it actually does.... and has nothing to do with power consumption.

  • Re:Seriously? (Score:5, Informative)

    by amicusNYCL ( 1538833 ) on Thursday April 19, 2012 @04:51PM (#39738623)

    Unless the screen is OLED, the answer to "does dark sites save power?" is a flat out NO.

    How do you figure? Where's your data? Their data clearly shows that a CRT displaying all white uses 85W, and the same monitor displaying all black uses 63W, which sounds to me like it's using about 25% less power to display the black screen. For an LCD the difference is only about 10%. The grayscale comparisons clearly show a relationship between darkness and power draw.

  • by Xeranar ( 2029624 ) on Thursday April 19, 2012 @04:53PM (#39738655)

    Think of electricity as waves crashing against the beach. Amps are how tall they are, volts how many are arriving in a frame of time or frequency. Watts is a measurement that gives a volumetric answer to power usage. It's a perfectly valid way to measure since we pay based on wattage per hour.

  • Re:No shit... (Score:3, Informative)

    by QuasiSteve ( 2042606 ) on Thursday April 19, 2012 @04:54PM (#39738683)

    I don't really see the problem with "not insignificant".

    Just because something is "not insignificant" doesn't make it "significant".

    Say I give you a papercut. You'll be in a "not insignificant" amount of pain.. in fact, you'll probably curse me all day long.
    But it's not exactly a "significant" amount of pain either.. it's not like you're curled up on the floor begging for somebody, anybody, to put you out of your misery or at least give you an OTC painkiller.

    Perhaps a completely alternative term could have been used - suggestions?
    ( I used 'measurable' in another post - but while 0.01% might be measurable, it but would be insignificant. )

  • Re:Really? (Score:5, Informative)

    by uigrad_2000 ( 398500 ) on Thursday April 19, 2012 @05:08PM (#39738883) Homepage Journal

    Well, he probably wasn't aware exactly which model of monitor you had. Generalizations tend to be bad for this reason.

    I, for example, have an LCD projector with a dynamic iris. It dims the bulb for dark scenes, and it is only for the improvement in contrast ratios. I know this, because it doesn't dim the bulb by decreasing the voltage over the filament, but by closing shutters (the iris) between the bulb and the LCD panel. It's described in more detail here [projectorcentral.com]

    I don't know the full history of the feature on monitors, but I'd assume it was originally to increase contrast ratio. After one marketer slapped an "energy efficient" sticker on the box, the manufacturers realized the marketing benefit of the feature, and probably renamed the menu item for later models.

  • by EmagGeek ( 574360 ) on Thursday April 19, 2012 @05:14PM (#39738955) Journal

    >> Voltage multiplied by current in Amps equals Watts.

    NO. For God's sake will people stop making this mistake.

    Voltage multiplied by current in Amps equals VA, not Watts. If you want watts, you have to multiply Voltage in Volts, Current in Amps, and the cosine of the angle between them (which is more commonly known as the power factor).

    VA = V*A
    Watts = V*A*PF
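A minimal sketch of the distinction, assuming a purely phase-shifted sinusoidal load. The example voltage, current, and phase angle are made-up illustrative values:

```python
import math

def apparent_power_va(v_rms, i_rms):
    # "Volt-amps": the naive product of RMS voltage and RMS current.
    return v_rms * i_rms

def real_power_w(v_rms, i_rms, pf):
    # Real (average) power; pf = cos(phase angle) for a purely
    # phase-shifted sinusoidal load.
    return v_rms * i_rms * pf

va = apparent_power_va(120.0, 2.0)                        # 240 VA
w = real_power_w(120.0, 2.0, math.cos(math.radians(30)))  # ~208 W
print(va, w)
```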

  • by JesseMcDonald ( 536341 ) on Thursday April 19, 2012 @05:15PM (#39738977) Homepage

    Amps are how tall they are, volts how many are arriving in a frame of time or frequency.

    Not that it affects the product (charge/energy), but amps measure transfer of charge over time, and volts measure electrical potential energy, so volts should be the height of the waves (gravitational potential energy) and amps the rate of arrival (in terms of volume of water per unit time, not waves per unit time).

    It's a perfectly valid way to measure since we pay based on wattage per hour.

    I don't know about where you're from, but around here we pay for energy in watt-hours (1 W*h = 3600 J), not watts per hour.
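To put the watt-hours point in numbers, here is the 3.8 W Google-vs-Blackle LCD difference quoted elsewhere in this thread, sustained 8 hours a day for a year. The tariff is an assumed example rate:

```python
def energy_cost(power_w, hours, rate_per_kwh=0.12):
    """Billing is per kilowatt-hour (an energy unit), not 'watts per hour'.
    rate_per_kwh is an assumed example tariff."""
    kwh = power_w * hours / 1000.0
    return kwh, kwh * rate_per_kwh

kwh, cost = energy_cost(3.8, 8 * 365)
print(f"{kwh:.1f} kWh -> ${cost:.2f}")  # about 11.1 kWh, $1.33
```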

  • Re:Seriously? (Score:4, Informative)

    by sunderland56 ( 621843 ) on Thursday April 19, 2012 @05:40PM (#39739317)
    The report is internally inconsistent.
    • First they state figures for an all-white screen in their "Black and White" test: 85.1W (CRT), 38.4W (LCD).
    • Then later they test an all-white screen in their "Greyscales" test: 84.9W (CRT), 40.0W (LCD).

    So they show a 1.6 watt difference (LCD) on the same image, while their stated difference between Google and Blackle is 3.8 watts.
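The internal-consistency complaint amounts to a simple repeatability check, using the figures from the comment above: if two runs on the same all-white image differ by 1.6 W, that spread is the floor any "real" difference has to clear.

```python
# True only if the claimed difference exceeds the spread observed when
# measuring the *same* image twice (figures quoted from this thread).
def exceeds_repeatability(claimed_diff_w, repeat_spread_w):
    return claimed_diff_w > repeat_spread_w

lcd_spread = abs(40.0 - 38.4)   # same all-white image, two measurements
print(exceeds_repeatability(3.8, lcd_spread))  # True, but only by ~2.2 W
```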

  • Re:Double Negative (Score:3, Informative)

    by Anonymous Coward on Thursday April 19, 2012 @06:36PM (#39739979)

    In this case, the double negative has a valid use. By saying "not insignificant" it leaves all other possibilities except for insignificant. This doesn't necessarily mean that the difference is significant, just that it isn't insignificant. If they said there is a "significant difference" then they have left only one option - that the difference is significant, and that statement carries more weight.

  • by Zadaz ( 950521 ) on Thursday April 19, 2012 @08:24PM (#39741001)

    The best energy-saving, battery-life-extending thing I've done is to use FlashBlock. (Or in Chrome, set it up to not load any extension without a click.) This has been the difference between getting 8 hours out of my laptop and getting 2 1/2.

    Now if only web pages would be smarter about using setTimeout [mozilla.org].

  • by willy_me ( 212994 ) on Thursday April 19, 2012 @09:26PM (#39741469)

    >> Voltage multiplied by current in Amps equals Watts.

    NO. For God's sake will people stop making this mistake.

    Voltage multiplied by current in Amps equals VA, not Watts. If you want watts, you have to multiply Voltage in Volts, Current in Amps, and the cosine of the angle between them (which is more commonly known as the power factor).

    VA = V*A
    Watts = V*A*PF

    No, Watts is really Voltage times Current. But when referring to AC systems, definitions get all screwed up. Just look at "kWh" - what a mess. It's like electricians have their own definitions for these units. I suppose it is understandable - using a single number to approximate a waveform and then performing calculations using Ohm's Law makes most tasks much easier.

    So pointing out the difference between Watts and VA is good - thanks for that. But don't be calling the real definition for Watts wrong. Also, your definition for power factor is not correct - or at least it is dated. It only applies to AC systems where the waveform is shifted. Power factor also applies to waveforms that are modified in other ways. For example, a computer power supply without power factor correction consumes pulses of power during the peak points of the sine wave. This changes the shape of the wave without resulting in a phase shift. With power factor correction, a control circuit draws power throughout the entire waveform so that the sine wave is not distorted.

    I wonder what they used to measure power usage for this test. Did the instrument record true RMS power? Those instruments are much more expensive but required for accurate results. Guess I should rtfa.
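The distortion-power-factor point above can be demonstrated numerically: a load that draws current only near the voltage peaks has a power factor well below 1 even with zero phase shift. This is a crude illustrative model (all values made up), not the behavior of any particular supply:

```python
import math

def rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One cycle of a 325 V-peak sine, with a load that draws current only
# near the voltage peaks -- a crude model of an uncorrected PSU front end.
N = 10000
v = [325.0 * math.sin(2 * math.pi * n / N) for n in range(N)]
i = [0.01 * vn if abs(vn) > 300.0 else 0.0 for vn in v]

real_w = sum(vn * cn for vn, cn in zip(v, i)) / N   # true average of v*i
apparent_va = rms(v) * rms(i)
pf = real_w / apparent_va
print(f"PF = {pf:.2f}")  # well below 1, despite no phase shift at all
```

Note the current pulses are centered on the voltage peaks, so the fundamental is perfectly in phase; the low PF comes entirely from waveform distortion, which is why cos(angle) alone is the dated definition.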

  • Re:even more savings (Score:2, Informative)

    by rev0lt ( 1950662 ) on Thursday April 19, 2012 @10:50PM (#39741981)

    (...)of electricity that could be saved (...)

    Saved as in not paid directly. Electricity won't suddenly stop being produced because the average consumption dropped a few kWh, and this is one of the most ignored factors by "the green people". Electricity is generated regardless and must be used almost immediately - there is no efficient way of storing it (and no, batteries aren't that efficient), especially given the sheer scale of the voltages and currents involved.
    That said, it's always good for the consumers to save a few cents. But take it for what it is (a money question) and not as an "environment-friendly" approach.
