Supercomputing Power

Can Computing Clean Up Its Act? (economist.com) 107

Long-time Slashdot reader SpzToid shares a report from The Economist: "What you notice first is how silent it is," says Kimmo Koski, the boss of the Finnish IT Centre for Science. Dr Koski is describing LUMI -- Finnish for "snow" -- the most powerful supercomputer in Europe, which sits 250km south of the Arctic Circle in the town of Kajaani in Finland. LUMI, which was inaugurated last year, is used for everything from climate modeling to searching for new drugs. It has tens of thousands of individual processors and is capable of performing up to 429 quadrillion calculations every second. That makes it the third-most-powerful supercomputer in the world. Powered by hydroelectricity, and with its waste heat used to help warm homes in Kajaani, it even boasts negative emissions of carbon dioxide. LUMI offers a glimpse of the future of high-performance computing (HPC), both on dedicated supercomputers and in the cloud infrastructure that runs much of the internet. Over the past decade the demand for HPC has boomed, driven by technologies like machine learning, genome sequencing and simulations of everything from stockmarkets and nuclear weapons to the weather. It is likely to carry on rising, for such applications will happily consume as much computing power as you can throw at them. Over the same period the amount of computing power required to train a cutting-edge AI model has been doubling every five months. All this has implications for the environment.

HPC -- and computing more generally -- is becoming a big user of energy. The International Energy Agency reckons data centers account for between 1.5% and 2% of global electricity consumption, roughly the same as the entire British economy. That is expected to rise to 4% by 2030. With its eye on government pledges to reduce greenhouse-gas emissions, the computing industry is trying to find ways to do more with less and boost the efficiency of its products. The work is happening at three levels: that of individual microchips; of the computers that are built from those chips; and of the data centers that, in turn, house the computers. [...] The standard measure of a data centre's efficiency is the power usage effectiveness (PUE), the ratio between the data centre's overall power consumption and how much of that is used to do useful work. According to the Uptime Institute, a firm of IT advisers, a typical data centre has a PUE of 1.58. That means that about two-thirds of its electricity goes to running its computers while a third goes to running the data centre itself, most of which will be consumed by its cooling systems. Clever design can push that number much lower.

Most existing data centers rely on air cooling. Liquid cooling offers better heat transfer, at the cost of extra engineering effort. Several startups even offer to submerge circuit boards entirely in specially designed liquid baths. Thanks in part to its use of liquid cooling, Frontier boasts a PUE of 1.03. One reason LUMI was built near the Arctic Circle was to take advantage of the cool sub-Arctic air. A neighboring computer, built in the same facility, makes use of that free cooling to reach a PUE rating of just 1.02. That means 98% of the electricity that comes in gets turned into useful mathematics. Even the best commercial data centers fall short of such numbers. Google's, for instance, have an average PUE value of 1.1. The latest numbers from the Uptime Institute, published in June, show that, after several years of steady improvement, global data-centre efficiency has been stagnant since 2018.
The report notes that the U.S., Britain and the European Union, among others, are considering new rules that "could force data centers to become more efficient." Germany has proposed the Energy Efficiency Act, which would mandate a maximum PUE of 1.5 by 2027, and 1.3 by 2030.
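
To make the PUE arithmetic concrete: PUE is total facility power divided by the power that actually reaches the IT equipment, so the useful share is simply 1/PUE. A quick sketch (only the PUE values quoted above go in; the rest is arithmetic):

```python
# PUE = total facility power / power delivered to the IT equipment,
# so the share of electricity doing useful work is 1 / PUE.
for name, pue in [("typical data centre", 1.58),
                  ("Google average", 1.10),
                  ("Frontier", 1.03),
                  ("LUMI's neighbour", 1.02)]:
    useful = 1 / pue  # fraction of power reaching the computers
    print(f"{name}: PUE {pue:.2f} -> {useful:.0%} useful, {1 - useful:.0%} overhead")

# typical data centre: PUE 1.58 -> 63% useful, 37% overhead
# LUMI's neighbour: PUE 1.02 -> 98% useful, 2% overhead
```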

Comments Filter:
  • ...in Amsterdam. Data centers are next.
    • Re: (Score:3, Informative)

      by guruevi ( 827432 )

      Well, we’re in a situation right now where many datacenters are being abandoned because they were mandated (through subsidy) to be powered by green energy, and a solar/wind-only grid can’t supply sufficient density.

      Datacenters built on solar/wind promises 10 years ago are, thanks to growing compute density, less than 25% full, and the solar/wind provider is saying: “sorry, this is all we’ve got in this region; you told us a decade ago this was what you wanted, so we’re sticking to it.”

      A

      • And in which SF story did that happen?
        Include the planet too, please ... sounds interesting.

        • by guruevi ( 827432 )

          Plenty of information about this in the Netherlands, New York, Northern Virginia, California.

      • Farmers and timberland owners get paid to defer farming/harvesting and those payments come from selling carbon credits. Nuclear power is ~5x more expensive than solar or wind. While uranium is incredibly energy dense, it still costs around $2k per kg once processed into fuel. There are no power generation options that have zero impact. Nobody wants to build/maintain/operate nuclear power plants. They're too expensive. They've always been bad financial investments and they've also never really been safe.
    • by sonlas ( 10282912 ) on Friday August 18, 2023 @06:51AM (#63777342)

      Trams in Amsterdam are powered by electricity. In 2022, the Netherlands electricity mix [nowtricity.com] (scroll down for historical data) was:
      - 47% Gas
      - 27% Coal
      - 15% Wind
      - 6% Nuclear
      - the rest is negligible

      Which resulted in 481g CO2eq/kWh in 2022. One of the worst levels of emissions per kWh in Europe (beaten only by Poland and its 598g CO2eq/kWh).
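
      For a rough sanity check, that figure can nearly be reproduced by weighting each source by a lifecycle emission factor. A sketch only; the factors below are approximate IPCC-style medians, not official Dutch numbers:

      ```python
      # Weight each source's lifecycle emissions (gCO2eq/kWh) by its 2022 share.
      mix = {"gas": 0.47, "coal": 0.27, "wind": 0.15, "nuclear": 0.06}
      factor = {"gas": 490, "coal": 820, "wind": 11, "nuclear": 12}  # assumed medians
      print(sum(share * factor[s] for s, share in mix.items()))  # ~454, same ballpark as 481
      ```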

      But anyway, the question "can computing clean up its act" is not really about that. Most of the impact of computing comes from the sheer amount of devices we keep producing and consuming: smartphones, smartwatches, smart-security systems, laptops... I don't have the exact numbers right now, but I seem to remember that for smartphones, for instance, 75% of the CO2 impact came from manufacturing and shipping. Keeping a smartphone 4 years instead of the average 1.5 years actually has more positive impact than reducing its power consumption even by 50%. Same for a laptop. Same for a lot of things that we use daily.
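
      A toy model of that trade-off (the kg figures are invented for illustration; only the manufacturing-dominates assumption comes from above):

      ```python
      # Annualised footprint: manufacturing is paid once per device,
      # use-phase electricity accrues per year of ownership.
      manufacture = 60.0  # kgCO2eq per device (assumed)
      use = 5.0           # kgCO2eq per year of charging (assumed)

      def per_year(lifetime_years, yearly_use=use):
          return manufacture / lifetime_years + yearly_use

      print(per_year(1.5))           # 45.0 kg/yr: new phone every 1.5 years
      print(per_year(4.0))           # 20.0 kg/yr: same phone kept 4 years
      print(per_year(1.5, use / 2))  # 42.5 kg/yr: halving power draw helps far less
      ```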

      • Re: (Score:3, Insightful)

        by guruevi ( 827432 )

        So you’re saying, buy Apple, whose devices have the longest lifespan in the industry.

        • by sonlas ( 10282912 ) on Friday August 18, 2023 @07:33AM (#63777408)

          So you’re saying, buy Apple, whose devices have the longest lifespan in the industry.

          Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years. You could argue that their old devices could get sold on the used market, which would be a good thing. But unfortunately most devices (not just Apple's) rot in drawers when people replace them.

          So the thing is not just to use a long-lasting device, but to actually use it for a long time. Which means accepting that some of your friends may have shinier things than you, even though you could afford them too.

          In Europe, you can also buy a Fairphone 4 [fairphone.com], which is long-lasting too, and which you can basically repair with a screwdriver. Plus they try to actually source their materials from sustainable sources when possible. And they are the most open about their subcontractors.

          • Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However,

            ...Apple doesn't put enough battery in them to cover the current demand of the devices when the battery ages, so they have to decrease the maximum speed of the device to make it not spontaneously shut down — especially when new versions of the OS which put more demand on the hardware are released. And it costs enough to put a new battery in it that most people (who have a contract and get a "free" phone every couple years) will not do so, because the hardware is designed in such a way that it makes it

            • by EvilSS ( 557649 )

              And it costs enough to put a new battery in it that most people (who have a contract and get a "free" phone every couple years) will not do so, because the hardware is designed in such a way that it makes it that expensive.

              It costs between $69 and $99 to have Apple replace a battery in an iPhone, and that includes the battery and labor. That is not what I would call expensive. And that is if you didn't buy AppleCare+ on it, in which case the battery replacement is free.

              The cost of a battery replacement is not what is holding people back from keeping phones longer.

            • And it costs enough to put a new battery in it that most people (who have a contract and get a "free" phone every couple years) will not do so, because the hardware is designed in such a way that it makes it that expensive.

              I sure hope a good solution comes forward for this soon, because I can't think of one. I'm not an iPhone user, but I don't want to go back to leaky cases. I want a case sealed with adhesive that somehow makes battery replacement still achievable.

              I believe the genesis of the sealed phone was all the warranty denials that Apple had done in the past. The little white sticker turned pink due to humidity so they told their customers they caused "water damage." And really, as hardware becomes more reliable hu

            • Oh, was that not what you were going to say?

              Nope, because your argument about batteries in apple devices (iphones I guess?) being too small is just bullshit. People recharge their phone every day (or night actually). A relative bought an iPhone SE in 2017, and it is still working and lasts the whole day. I am sure if she were watching Netflix all day on it, it wouldn't, but if that is your usage pattern I guess we have nothing else to discuss.

              And the phone is outdated when you buy it, so you don't have to wait for it to become outdated before you want to replace it, you have that urge from day one? Interesting...

              I have grown out of having the "urge" to have the new shiny thing just because it's shiny, since I was a teenager.

              Gu

              • your argument about batteries in apple devices (iphones I guess?) being too small is just bullshit. People recharge their phone every day (or night actually).

                First of all, that's not enough. Second, my cheapass Moto G Power will last for days on a charge.

          • Interesting remark, which shows that reality is a bit more complex. Apple devices are well-built, and could last a long time. However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years.

            Four years, according to this [9to5mac.com] article. Four point three, according to this [techrepublic.com]. That latter article has the astonishing statement "Two out of every three devices ever sold by Apple are still in use."

            You could argue that their old devices could get sold on the used market, which would be a good thing. But unfortunately most devices (not just Apple's) rot in drawers when people replace them.

            Really?? Everybody I know trades their old phone in when they buy a new one. I'd say that the question is, what happens to the old phones that are traded in to Apple (or the Verizon store, or wherever you got the phone.)

            • That latter article has the astonishing statement "Two out of every three devices ever sold by Apple are still in use."

              Of course, if Apple is providing the stats they are unlikely to provide proof via spying in the OS. They are likely basing it on how many are still iCloud-locked, some of which have been shredded already because it couldn't be re-used at the recycler.

              When you trade Apple products back to Apple, I'm pretty sure they shred it to keep it off the secondary market. What little they would earn putting it back out there is more than made up for in new sales.

              • That latter article has the astonishing statement "Two out of every three devices ever sold by Apple are still in use."

                Of course, if Apple is providing the stats they are unlikely to provide proof via spying in the OS. They are likely basing it on how many are still iCloud-locked, some of which have been shredded already because it couldn't be re-used at the recycler.

                According to the article, "Asymco mobile analyst Horace Dediu created a formula for device lifespan based on the number of devices sold versus the number of active devices in use." Not clear where the "number of active devices in use" number comes from.

                When you trade Apple products back to Apple, I'm pretty sure they shred it to keep it off the secondary market. What little they would earn putting it back out there is more than made up for in new sales.

                Hard to say without more data. In addition to the profit from actually selling the phones, there's a good argument that selling low-cost used iPhones to people who can't afford the up-front cost of new iPhones would be a good way to get people into the Apple

                • According to the article, "Asymco mobile analyst Horace Dediu created a formula for device lifespan based on the number of devices sold versus the number of active devices in use." Not clear where the "number of active devices in use" number comes from.

                  Oops, just looked at the article again, and a little later in the article it says that the "number of active devices in use" figure comes from the Apple quarterly investment conference call. So, you were right, the number comes from Apple, and as you point out, could be a biased number.

            • Apple has robots which strip the phones back down to individual components. When a brand new phone becomes just a little too old to be a certified refurb, it becomes part of another iPhone if the part is technically compatible, or further recycled to form brand new compatible parts again.
          • However, due to their high price (relative to other brands) most people buying it are on the wealthy side, and tend to buy a new (Apple) device every 1.5-2 years
            I doubt anyone who has a working computer replaces it just for fun.
            This is a 9-year-old Apple device. The other one is 12 years old; I was about to have the battery replaced, as it could easily run another ten years.

            If you have a running laptop, with a magsafe connector, made from metal with an SSD inside: there is no damn reason to ever buy a new on

      • I see 38% renewables in Holland in 2021, which makes me suspect your figures..
        https://en.m.wikipedia.org/wik... [wikipedia.org]
        "Gas is green" was an oil industry lie; it's as bad as coal. Shift over to real renewable electricity and the trams will be green.
        As for computing: yes, for a start, extend the life of devices and force manufacturers to release source code for older devices so they can be reused..
      • by SpzToid ( 869795 )
        Trams in Amsterdam are powered exclusively by renewable energy. Citation (in Dutch):

        https://www.ovmagazine.nl/nieu... [ovmagazine.nl]

        Here's an English translation, (GVB is the public transportation department):

        GVB switched to Dutch green electricity

        From 2019, the tram and metro in Amsterdam will run on Dutch green electricity. This is the result of the tender that GVB started in May this year together with Metro and Tram from the municipality of Amsterdam.

        The tender was won by energy supplier Nuon/Vatte
    • by Eunomion ( 8640039 ) on Friday August 18, 2023 @07:45AM (#63777444)
      Nerdy quibble: A wind mill is literally a mill powered by wind. If the turning blades generate electricity instead of running a mill, it's a wind turbine.
      • by AmiMoJo ( 196126 )

        Climate change deniers started mockingly calling them windmills, so proponents of wind power adopted the term.

        • Anti-science trolls don't really speak English. It's a mistake to take anything they say literally, since their game is just to firehose gibberish so reality is hidden in noise.
      • It's a good thing language is static and words never change meaning over time. Meanwhile, people still "turn up" the volume on their TV and "dial" their phones.

        • Verbs can survive changes in object. So you can "rewind" a video even if there's literally nothing to "wind" in the first place. But windmill is a noun, and nobody refers to wind turbine rotation as "milling."
          • Archaic terms aren't limited to verbs. Even nouns that have verbs in their names stick around long past their original use. Software bugs are no longer actual moths trapped in a relay, but we still call program glitches bugs.

            • "Bug" works as a metaphor though. A literal bug interferes and annoys, much like a software glitch. A mill doesn't have metaphoric relevance to an electrical generator. People say "windmill" just because it's pithy, not because it applies logically or figuratively.
              • You don't think a thing that uses rotational energy to do work is a metaphor for electrical generation?

                • I've always understood milling to refer to mechanical work done to structurally alter an input material. Slicing wood, pulping paper, grinding grain, etc. The only structural changes in an electric generator are incidental wear and tear. I mean, you wouldn't call a flashlight with a hand-crank generator a "light mill."
          • by Bumbul ( 7920730 )

            But windmill is a noun, and nobody refers to wind turbine rotation as "milling."

            I'm not a native English speaker, but would you really use that term for windmill blade rotation? Isn't milling the thing that happens with the millstones, inside the housing?

            In everyday language, "windmill" does not point to the stuff happening inside (i.e. the mill), but to the shape/figure of the windmill itself. Thus, it is quite acceptable to use the same noun also for these new things with rotating blades, even if they happen to generate electricity (instead of flour from grains) behind the closed d

            • Yes, milling is inside, but that's even less similar to what goes on in a wind power system. The shaft only does work against a field to generate electricity, it doesn't operate a mechanical process to alter a tangible substance. The rotating wind blades on the outside are the only similarity.
    • They must look funny.

    • ...in Amsterdam. Data centers are next.

      So you will only get your data from the cloud when the wind blows?

      If that's the case then I think Leon Musk will have no problems spewing from Twitter...err X...err Who Are They Now.

  • "capable of performing up to 429 quadrillion calculations every second"

    So IOW Python and nodeJS could run at the same speed as C++ on a 20 year old pentium!

    Seriously, if the goal is to reduce the energy usage of systems then a good place to start is the development languages. Scripting languages are fast to develop with and test but they're hideously inefficient from a CPU cycle and hence energy usage POV. Since people won't be willing to give them up perhaps there should be far more emphasis on them being
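
    To make that concrete, here is a minimal, machine-dependent illustration comparing a pure-Python loop with the same reduction done in NumPy's C-implemented kernel (a sketch, not a rigorous benchmark):

    ```python
    # Interpreter overhead vs a C-backed kernel; absolute timings vary by machine.
    import timeit
    import numpy as np

    data = list(range(1_000_000))
    arr = np.asarray(data)

    py = timeit.timeit(lambda: sum(x * x for x in data), number=10)
    c = timeit.timeit(lambda: np.dot(arr, arr), number=10)
    print(f"pure Python: {py:.2f}s  NumPy/C: {c:.2f}s  (~{py / c:.0f}x)")
    ```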

    • by guruevi ( 827432 )

      Energy costs are reflected in total price. Having your team spend 10x longer in the office, driving to work every day, with build environments running, etc., to craft perfectly functioning C or Rust is more expensive and less carbon-efficient than having them bang it out in Python or NodeJS.

      Even so, a lot of Python libraries are already optimized and using C, so it makes very little difference if you swap out a properly written library call for actual pure C.

      • The data center power requirements for even a small compute farm are enormous compared to the office power used by the staff working 9-6.

        For something like this system, with tens of thousands of CPUs plus whatever they're using for networking, not to mention RAM and some sort of storage system, there is no reason to compare to the office. The office use is just noise at these scales.

        First week of freshman CS class we were taught that algorithms are everything. Hardware is secondary. Their example was running

        • by guruevi ( 827432 )

          In most cases, the algorithm runs millions of times per second; having it shave off a little bit of energy at the expense of thousands of man-hours is not relevant. Again, energy cost is in the price: if it costs $500 to run but $50,000 to hire a person who can make it run for $250, the embedded energy expense of having a person is a lot higher.

          Yes, at some scale, if you CAN and NEED to scale something to that extent, it becomes worth it; most datacenters, even large compute facilities, are running thousands i

          • Ok, so what you said is pretty much the quicksort vs bubble sort argument.

            If current AI compute was like bubble sort, then perhaps there is a yet-to-be-discovered AI compute algorithm that doesn't require millions of runs per second. Quicksort didn't always exist. It was a big deal at the time. What if someone came up with an AI compute methodology that required tens of thousands instead of millions of calculations? Or better yet, operated entirely differently and just ran once for a few hours and was do
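
            A toy illustration of how much more the algorithm matters than the hardware, using rough comparison counts only:

            ```python
            # O(n^2) bubble sort vs an O(n log n) sort, by comparison count.
            import math
            for n in (1_000, 1_000_000):
                print(f"n={n:,}: bubble ~{n * (n - 1) // 2:.1e}, n log n ~{n * math.log2(n):.1e}")
            # n=1,000: bubble ~5.0e+05, n log n ~1.0e+04
            # n=1,000,000: bubble ~5.0e+11, n log n ~2.0e+07
            ```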

            • by guruevi ( 827432 )

              I think you're talking about the difference between research and practice. Yes, there is value in research at all levels, but that doesn't apply across the board.

              For most companies it isn't valuable enough to port everything from Python to e.g. Rust just to save a bit of energy, as activist programmers often recommend, because at some point someone will make Python more energy efficient [github.com] and within a few years all the time and money and energy you spent has been undone by faster compilers, better interpreters and o

              • Ok, taken that way, I agree. Points well made about practice at a single company rewriting code in another language vs general field research. I'm with you.

      • Energy costs are reflected in total price.

        The take that this data center is environmentally friendly pretty much ignores that it's directly adding heat to the atmosphere. The warming isn't coming from electrical cooling adding to the grid load indirectly due to greenhouse gas emissions, but it's warming, regardless.

        Heat added directly to the atmosphere and/or the oceans isn't a good thing. It's just better than adding greenhouse gas emissions.

        • The heat added by human activity, regardless of what or how, is completely irrelevant in relation to the CO2 emissions ...

            The heat added by human activity, regardless of what or how, is completely irrelevant in relation to the CO2 emissions.

            No, it isn't. Because the threat is environmental heating; CO2 emissions trap heat in our atmosphere. We create heat directly that will be trapped by CO2 emissions. Reducing either heat trapping or heat generation, or both, serves to address the actual threat to some degree.
            The two issues are directly related.

            • Sorry, the heat coming out of a power plant and making the environment a little bit hotter is essentially ZERO in relation to the amount the sun radiates in that gets trapped by CO2.

              You seriously need to get a clue about dimensions.

              Here: look at this, it's an ice cube. Small? Yepp!
              Now look over there: that's a glacier ... big, don't you think?

              Power plant -> small heat (lots of CO2 though).
              Sun -> lots of heat ... not the sun's fault, CO2's fault.
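
              Sketching those dimensions in numbers (rounded public figures; order-of-magnitude only):

              ```python
              # Direct anthropogenic waste heat vs heat trapped by greenhouse forcing.
              earth_area = 5.1e14  # m^2, surface of the Earth
              forcing = 2.7        # W/m^2, approx. total anthropogenic forcing (assumed)
              direct = 1.9e13      # W, approx. world primary energy use, ~19 TW (assumed)
              print(f"direct heat ~ {direct / (forcing * earth_area):.1%} of trapped heat")  # ~1.4%
              ```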

      • Having your team spend 10x longer in the office, driving to work every day, with build environments running, etc., to craft perfectly functioning C or Rust is more expensive and less carbon-efficient than having them bang it out in Python or NodeJS.

        Unless you're building it once and running it millions/billions/trillions of times. Oh wait, doesn't that apply to a lot of code that runs in datacenters?

    • You obviously have little clue as to current compiler technology, which can provide JIT compilers for a variety of languages targeted at just about any platform. These compilers approach or exceed the performance of native-compiled binaries because they can take advantage of run-time information about the data types and values actually being used by the code rather than trying to infer them (usually poorly) at compile time. The fact that Python does not have one of these yet is more a testament to its develo
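
      For Python specifically, third-party JITs along these lines do exist. A minimal sketch with Numba (assumes `pip install numba`; illustrative only), which compiles a type-specialised native version of a function based on the argument types it first sees:

      ```python
      import numpy as np
      from numba import njit

      @njit
      def dot(xs, ys):
          # Compiled to machine code on first call, specialised to the
          # runtime types of xs and ys (float64 arrays here).
          total = 0.0
          for i in range(len(xs)):
              total += xs[i] * ys[i]
          return total

      a = np.random.rand(1_000_000)
      b = np.random.rand(1_000_000)
      dot(a, b)         # first call triggers compilation
      print(dot(a, b))  # later calls run the compiled code
      ```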

      • by Viol8 ( 599362 )

        "You obviously have little clue as to current compiler technology"

        Ah - a genius speaks!

        "These compilers approach or exceed the performance of native-compiled binaries because they can take advantage of run-time information about the data types and values actually being used by the code rather than trying to infer them (usually poorly) at compile time"

        If it gets it wrong at compile time the program won't work properly at runtime so you're talking out of your backside. Unless you can give an example of one ty

        • One of the things about DPU/GPU (aka AI) programming is that the code flow and data have to be much closer to the typed ideal, or the memory-space and I/O operations with each compute unit get expensive and the whole point of many lower-end cores and shared memory spaces becomes useless. Using a 128-bit register for an integer because the JIT favors the programmer's convenience isn't the way to go when optimizing a work unit that gets sent out a billion times to be processed and filtered only on the int
      • The problem is rarely the language. The people using the scripting-style languages are overall less trained on how to write good code. When your algorithm is an order of magnitude more complex than it needs to be, then better compilation will only get you so far.

          • a) not correct
            b) scientific code is usually a loop - a single loop - going sequentially over lots of data pulled from a file, calling one single library function, or a few, which are implemented in C++

            Most of the time there is seriously nothing a "better programmer" would do better or differently.

    • by DarkOx ( 621550 )

      RFLOL - good friggin luck.

      The industry might *might* occasionally take some time to consider performance on the hottest of hot paths, at huge players running massive-scale applications. The Googles and Metas of the world might spend some time optimizing things like PHP and Node where they can - but nobody is going back to running the web on C/C++. Rust / Go etc. are also not going to get out of the systems space - the web is, just as for C / C++, not a space they're flexible enough to work in. Man-hours cost and will continue to cost

    • Sloppy code written by the cheapest contractors around the globe will beat your energy costs any day. Business goals are usually lower total cost, not lower energy usage.

    • by crgrace ( 220738 )

      The cores of HPC codes are still largely written in Fortran and C. Python and other scripting languages are only used for configuration and program control. Not the heavy lifting.

    • People who use Python on a supercomputer usually know more than you about computing ...
      I could make a long list, but I'll cut it down to one item:
      The Python "script" is a 10-liner. 3 lines load libraries, 3 open files, 3 call functions from those libraries: the Python code doesn't even show up on the performance profiler ...

  • Seriously. They are bad for the environment.
  • by fuzzyfuzzyfungus ( 1223518 ) on Friday August 18, 2023 @06:54AM (#63777346) Journal
    It seems deeply unproductive to ask "can computing clean up its act" as though that were a single question.

    If you want to talk efficiency per unit work, 'cleaning up its act' is something of a desperate imperative in the business. Among large-scale operators energy and cooling costs are simply far too large to ignore (even in places chosen for cheap hydro or the like), and for people doing client hardware it's increasingly laptops and cell phones where the customer cares about battery life, fan noise, and lap burns. Small-scale operations can't justify as much expert attention; you need to amortize thermal or electrical engineering time over a fair number of nodes before it costs less than just running the AC slightly more; but even those guys end up taking advantage of the features demanded by the larger customers that chipmakers and system integrators actually listen to.

    If you want to talk choice of work done, though, there's a lot more room for skepticism; but the stuff that would be indefensible faff would mostly be indefensible faff regardless of how efficient the hardware guys could make it.

    It seems somewhere between unhelpful and actively misleading to treat the methods and the objectives as though they were a single topic. It's certainly worth keeping an eye on datacenter developers looking for sweetheart deals that will end up getting their power subsidized in the name of creating 3 whole screwdriver monkey jobs; or give them the right to use massive amounts of evaporative cooling in some arid area so that they can boast about how environmentally friendly they are for not using AC; but as long as you keep those sorts of perverse incentives at bay the hardware and operations sides are usually pretty interested in efficiency; because power and cooling are big visible expenses for them.

    Most of the more interesting and less flattering questions pertain to just what we do with all the compute time we use.
    • Most of the more interesting and less flattering questions pertain to just what we do with all the compute time we use.

      Watching AI-generated cat videos seems to be the direction we are heading. And porn.

    • by AmiMoJo ( 196126 )

      Going forward no data centres or supercomputers should use any non-renewable energy. They should be obliged to add enough renewable energy to cover their needs. Many do anyway, but it should be a requirement.

      • Many do anyway, but it should be a requirement.

        Data centers that "do anyway," as you say, were just buying carbon offsets (while it was trendy) to appear green.

        Going forward no data centres or supercomputers should use any non-renewable energy. They should be obliged to add enough renewable energy to cover their needs.

        Just no. Data centers don't use renewables; they draw electricity from the grid. The distinction is subtle but significant. What would truly be beneficial is ensuring that no data centers or supercomputers are deployed in countries where the electricity grid produces CO2 emissions exceeding 50g CO2eq/kWh (arbitrary number here, but you get the point). This approach would effectively compel countrie

        • by AmiMoJo ( 196126 )

          I'd have them add enough capacity to cover all their needs all the time. That will likely include some battery storage, although they have that anyway for UPS.

          Then most of the time they will be adding renewable energy to the grid.

          • Then most of the time they will be adding renewable energy to the grid.

            Re-read the last sentence of my last post: "You are conflating the objective (reducing CO2 emissions) with the means (deploying renewables and other low-carbon energy sources, such as nuclear or hydro)."

            This is because your own personal objective/mission is to deploy renewables. Even if that doesn't mean less CO2 emissions. Instead, your objective should be to have low CO2 emissions, and thus a low-emitting grid. Build new datacenters only in countries with low-carbon grids (there are quite a few already, a

            • by AmiMoJo ( 196126 )

              I'd rather see investment go into countries that don't already have low-carbon grids. The more renewable energy going into the grid, the less CO2-emitting generation there will be. That's the case because renewable energy is cheaper, and because in most places fossil fuel plants have to give priority to renewables and cheaper sources when possible.

              • The more renewable energy going into the grid, the less CO2 emitting generation there will be.

                No. In that case, if you just add renewables to cover the needs of your new datacenter (which is what you are proposing), then you just emit as much as before for everything else... Percentage-wise, the CO2-emitting generation will be less, because you added renewables for your new needs, but the amount of CO2 emitted will still be the same (and this is what matters; the % of renewables in a grid does not matter).

                Actually, the CO2 emissions will likely go up even if you just build renewables for your new datacente

                • by AmiMoJo ( 196126 )

                  Read it again carefully. Add as much as needed to cover ALL the datacentre's needs ALL the time. So that would mean enough to cover the datacentre when wind and solar are at their minimums, with battery storage allowed to cover short term dips.

                  In other words they will be required to massively over-build capacity, and over a wide geographic area. Coverage will be determined based on an independent assessment of the proposal. Most of the time the built renewables will be generating far in excess of what the data
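
                  Rough numbers on what "massively over-build" means (illustrative capacity factors; nothing site-specific):

                  ```python
                  # Nameplate capacity needed so *average* output covers a constant load,
                  # before even accounting for multi-day lulls and storage losses.
                  load = 100.0  # MW, assumed constant datacentre draw
                  for src, cf in {"solar": 0.15, "onshore wind": 0.30}.items():
                      print(f"{src}: ~{load / cf:.0f} MW nameplate for {load:.0f} MW of load")
                  # solar: ~667 MW nameplate for 100 MW of load
                  # onshore wind: ~333 MW nameplate for 100 MW of load
                  ```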

                  • In other words they will be required to massively over-build capacity, and over a wide geographic area.

                      Do you realize how stupid your proposition is? You basically mean to tell the datacenters: "hey, even though we keep telling everyone that renewables are so cheap that a decarbonized grid is right around the corner (well, we've been saying that for the last 30 years, must be a big corner I guess), can you please build it for us? Err, I mean, for you, but mainly for us too. And you'll be paying for it too".

                    This is just so out of touch with reality, can't argue more with you for today.

  • by coofercat ( 719737 ) on Friday August 18, 2023 @07:58AM (#63777484) Homepage Journal

    I know why it's the way it is, but I always wondered about the wisdom of putting your datacentre in Silicon Valley (i.e. an almost year-round hot place). Here is an example of someone deliberately putting it where it's cold(er).

    Then there's the possibly competing issue of putting it somewhere you can use the waste heat - domestic or even industrial units nearby are required. People tend not to live in year-round cold places though. You also have to wonder if the climate likes its cold bits getting warmed all year too.

    Lastly, putting it somewhere that you can't get at sensibly because it's too remote isn't useful, and of course you have to think about the network connections to such a place too.

    FWIW, there's a (crappy-looking) industrial unit about 10 minutes' drive from my house that has an 8MW electrical connection (and was previously, I assume, a datacentre, although nothing on the scale of the one in TFS). I'll bet that none of its waste heat was used to even warm the offices on site, let alone the other industrial units that surround it, and definitely not the houses that surround all of it. Again, I can see the reasons why not - but that's sort of the problem - we need to find a way to make doing these sorts of things the obvious and easy solution, rather than the difficult and expensive one.
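
    For a sense of scale on that 8MW site (all numbers assumed; nearly every watt a datacentre draws leaves the building as heat):

    ```python
    # How many homes could an 8 MW facility's waste heat plausibly warm?
    electrical = 8.0e6  # W drawn by the site
    recoverable = 0.7   # assumed fraction capturable (e.g. via liquid cooling loops)
    per_home = 5.0e3    # W, rough average heating demand per home (assumed)
    print(f"~{electrical * recoverable / per_home:.0f} homes")  # ~1120 homes
    ```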

    • If we had a decent internet with good uplink speeds then cloudy data centers could be ludicrously distributed, i.e. it would actually make sense to use their elements for residential heating and the like.

      Unfortunately we don't, which is why we have data centers.

      • It's not as simple as "uplink speeds." These are spots that have many paths out - not just redundancy but efficient routing. Homes are never going to be that.

      • Data circuits to a data center are a million times cheaper than to the utility pole in front of your house. And once a half-dozen conduits are in place, expanding a fiber plant is a trivial number of man-hours compared to wiring a subdivision.
    • That's more about how humans herd toward wherever is popular. One coffee shop springs up, and then just watch as another 3 pop up in the same spot. It doesn't really make sense when you could move away from the competition. It's also the soft power of the US at play. The name Silicon Valley was a great pitch, whoever came up with it. The idea that you can only host your datacenter in one place is funny.
      Remember: a person is smart; people are dumb, irrational animals prone to panic, and you know it.
    • In Chicago, IL there has been a multi-story data center at 350 Cermak for 2 decades now, owned by a leader in data center realty. The building was ideal because it was a warehouse, shipping and print facility for the Chicago phone books, and it is on rail lines and next to mass transit. No amount of weight (gensets) or space requirements was a problem for this space for the first decade. The site was chosen because cooling and power were ideal; the utility had a boondoggle across the alley to utilize.
  • Curious how much energy could be saved by dispensing with inefficient (often Javascript-based) client-side software. Not just websites (though this is probably the majority of it), but things like Adobe's Creative Cloud launcher and the trash that comes preinstalled on most store-bought machines.
    • by caseih ( 160668 )

      Companies are only in favor of reducing energy use if it happens to correspond with lower costs. Otherwise everyone simply does what is the cheapest and makes them the most amount of money. So sadly useless bloatware will continue to ship indefinitely. That's as opposed to my bloatware which isn't useless. My Javascript and electron framework -based products are beautiful and elegant.

    • A little, maybe. I would say it's purely consumer-driven. After all, what are those servers hosting?
      Hint: it's not cat photos that are the most popular content on the internet.
      Javascript is probably a major offender; however, all attempts to replace it may cause their own issues. Firefox swapping out one language for another is a prime example of how to kill all legacy extensions.
      Can we instead look at making a better version of Javascript? After all, making a new standard scripting language will proba
  • Put a small nuclear reactor next to the data center and power it for 50 years.

  • "...everything from climate modeling to searching for new drugs".

    So, different ways of digging for cash. Not very different from bitcoin mining, in that it requires an incredibly expensive computer, incurs massive running costs, wastes huge amounts of electricity, and accomplishes absolutely nothing useful.

    • This is very true. AI is just yet another Bitcoin, a solution based on cheap silicon that another industry is attempting to capitalize on. It does fill in the media-hype gap left by the long time-to-market of 5G universal connectivity for cheap IOT processors. Always-on cheap computers doing a huge amount of sensing, universally around us, was a creepy sale; now AI makes it creepier to the non-technologist. Great job... We got 1984 instead of the Jetsons.
  • Please, show me a link to where I can buy one. LOL!
