Power

America Adds 11.7 GW of New Solar Capacity in Q3 - Third Largest Quarter on Record (electrek.co) 55

America's solar industry "just delivered another huge quarter," reports Electrek, "installing 11.7 gigawatts (GW) of new capacity in Q3 2025. That makes it the third-largest quarter on record and pushes total solar additions this year past 30 GW..." According to the new "US Solar Market Insight Q4 2025" report from Solar Energy Industries Association (SEIA) and Wood Mackenzie, 85% of all new power added to the grid during the first nine months of the Trump administration came from solar and storage. And here's the twist: Most of that growth — 73% — happened in red [Republican-leaning] states. Eight of the top 10 states for new installations fall into that category, including Texas, Indiana, Florida, Arizona, Ohio, Utah, Kentucky, and Arkansas...

Two new solar module factories opened this year in Louisiana and South Carolina, adding a combined 4.7 GW of capacity. That brings the total new U.S. module manufacturing capacity added in 2025 to 17.7 GW. With a new wafer facility coming online in Michigan in Q3, the U.S. can now produce every major component of the solar module supply chain...

SEIA also noted that, following an analysis of EIA data, it found that more than 73 GW of solar projects across the U.S. are stuck in permitting limbo and at risk of politically motivated delays or cancellations.

Power

More of America's Coal-Fired Power Plants Cease Operations (newhampshirebulletin.com) 117

New England's last coal-fired power plant "has ceased operations three years ahead of its planned retirement date," reports the New Hampshire Bulletin.

"The closure of the New Hampshire facility paves the way for its owner to press ahead with an initiative to transform the site into a clean energy complex including solar panels and battery storage systems." "The end of coal is real, and it is here," said Catherine Corkery, chapter director for Sierra Club New Hampshire. "We're really excited about the next chapter...." The closure in New Hampshire — so far undisputed by the federal government — demonstrates that prolonging operations at some facilities just doesn't make economic sense for their owners. "Coal has been incredibly challenged in the New England market for over adecade," said Dan Dolan, president of the New England Power Generators Association.

Merrimack Station, a 438-megawatt power plant, came online in the 1960s and provided baseload power to the New England region for decades. Gradually, though, natural gas — which is cheaper and more efficient — took over the regional market... Additionally, solar power production accelerated from 2010 on, lowering demand on the grid during the day and creating more evening peaks. Coal plants take longer to ramp up production than other sources, and are therefore less economical for these shorter bursts of demand, Dolan said. In recent years, Merrimack operated only a few weeks annually. In 2024, the plant generated just 0.22% of the region's electricity. It wasn't making enough money to justify continued operations, observers said.

The closure "is emblematic of the transition that has been occurring in the generation fleet in New England for many years," Dolan said. "The combination of all those factors has meant that coal facilities are no longer economic in this market."

Meanwhile Los Angeles — America's second-largest city — confirmed that the last coal-fired power plant supplying its electricity stopped operations just before Thanksgiving, reports the Utah News Dispatch: Advocates from the Sierra Club highlighted in a news release that shutting down the units had no impact on customers, and questioned who should "shoulder the cost of keeping an obsolete coal facility on standby...." Before ceasing operations, the coal units had been working at low capacities for several years because the agency's users hadn't been calling on the power [said John Ward, spokesperson for Intermountain Power Agency].
The coal-powered units "had a combined capacity of around 1,800 megawatts when fully operational," notes Electrek, "and as recently as 2024, they still supplied around 11% of LA's electricity. The plant sits in Utah's Great Basin region and powered Southern California for decades." Now, for the first time, none of California's power comes from coal. There's a political hiccup with IPP, though: the Republican-controlled Utah Legislature blocked the Intermountain Power Agency from fully retiring the coal units this year, ordering that they can't be disconnected or decommissioned. But despite that mandate, no buyers have stepped forward to keep the outdated coal units online. The Los Angeles Department of Water and Power (LADWP) is transitioning to newly built, hydrogen-capable generating units at the same IPP location, part of a modernization effort called IPP Renewed. These new units currently run on natural gas, but they're designed to burn a blend of natural gas and up to 30% green hydrogen, and eventually100% green hydrogen. LADWP plans to start adding green hydrogen to the fuel mix in 2026.
"With the plant now idled but legally required to remain connected, serious questions remain about who will shoulder the cost of keeping an obsolete coal facility on standby," says the Sierra Club.

One of the natural gas units started commercial operations last October, with the second starting later this month, IPP spokesperson John Ward told the Utah News Dispatch.
Programming

Rust in Linux's Kernel 'is No Longer Experimental' (thenewstack.io) 90

Steven J. Vaughan-Nichols files this report from Tokyo: At the invitation-only Linux Kernel Maintainers Summit here, the top Linux maintainers decided, as Jonathan Corbet, Linux kernel developer, put it, "The consensus among the assembled developers is that Rust in the kernel is no longer experimental — it is now a core part of the kernel and is here to stay. So the 'experimental' tag will be coming off." As Linux kernel maintainer Steven Rostedt told me, "There was zero pushback."

This has been a long time coming. The shift caps five years of sometimes-fierce debate over whether the memory-safe language belonged alongside C at the heart of the world's most widely deployed open source operating system... It all began when Alex Gaynor and Geoffrey Thomas said at the 2019 Linux Security Summit that about two-thirds of Linux kernel vulnerabilities come from memory safety issues, which Rust could in theory avoid through its inherently safer application programming interfaces (APIs)... In those early days, the plan was not to rewrite Linux in Rust (it still isn't), but to adopt it selectively where it can provide the most security benefit without destabilizing mature C code. In short, new drivers, subsystems, and helper libraries would be the first targets...

Despite the fuss, more and more programs were ported to Rust. By April 2025, the Linux kernel contained about 34 million lines of C code, with only 25 thousand lines written in Rust. At the same time, more and more drivers and higher-level utilities were being written in Rust. For instance, the Debian Linux distro developers announced that going forward, Rust would be a required dependency in its foundational Advanced Package Tool (APT).

This change doesn't mean everyone will need to use Rust. C is not going anywhere. Still, as several maintainers told me, they expect to see many more drivers being written in Rust. In particular, Rust looks especially attractive for "leaf" drivers (network, storage, NVMe, etc.), where the Rust-for-Linux bindings expose safe wrappers over kernel C APIs. Nevertheless, for would-be kernel and systems programmers, Rust's new status in Linux hints at a career path that blends deep understanding of C with fluency in Rust's safety guarantees. This combination may define the next generation of low-level development work.

Virtualization

VMware Kills vSphere Foundation In Parts of EMEA (theregister.com) 19

Broadcom has quietly pulled VMware vSphere Foundation from parts of EMEA, pushing smaller customers toward far more expensive bundles and prompting some to consider jumping to Hyper-V or Nutanix. The Register reports: VVF is a bundle that offers compute, storage, and networking virtualization, and a platform to run containers. It's most useful in hyperconverged infrastructure and hybrid clouds, but is less capable than the Cloud Foundation (VCF) private cloud suite. Virtzilla said EMEA customers would need to check with their local dealer to see if VVF was still on sale in their country. "VVF is no longer available in some EMEA countries, but for the majority it is still available," a Broadcom spokesperson said. "Customers will have to reach out to sales reps or partners to determine availability of a given product in their region. These changes were recent."

Our initial tipster said their reseller clued them into the impending change when VMware's new fiscal year started in November. This anonymous customer told us that their hardware fleet boasts thousands of compute cores and that, without more affordable options, their organization was looking at its annual VMware spend leaping 10x, from around $130,000 to $1.3 million. "We're currently looking to jump ship to either Microsoft's Hyper-V or Nutanix, as we can't eat (that) increase," they told The Register. [...]

For the moment, a Broadcom spokesperson told us it has no plans to ditch VMware vSphere Standard, the basic server virtualization bundle which we're told makes up about 60 percent of the company's licenses and is a lower-cost way to access VMware's hypervisor than buying its full suite of VMware Cloud Foundation products. "We have not announced any changes to the availability of vSphere Standard in EMEA nor end of support for vSphere Standard," the spokesperson said via email. "The product remains fully available across EMEA today. However, Broadcom product availability can vary by region to align with local market requirements, customer demand, and other considerations."

The Courts

Netflix Faces Consumer Class Action Over $72 Billion Warner Bros Deal (reuters.com) 49

Netflix's $72 billion bid to buy Warner Bros Discovery has triggered a consumer class action claiming the merger would crush competition, erase HBO Max as a rival, and hand Netflix control over major franchises. Reuters reports: The proposed class action (PDF) was filed on Monday by a subscriber to Warner Bros-owned HBO Max who said the proposed deal threatened to reduce competition in the U.S. subscription video-on-demand market. "Netflix has demonstrated repeated willingness to raise subscription prices even while facing competition from full-scale rivals such as WBD," the lawsuit said. [...] The lawsuit said the Warner Bros deal would eliminate one of Netflix's closest rivals, HBO Max, and give Netflix control over Warner Bros marquee franchises including Harry Potter, DC Comics and Game of Thrones. On Monday, Paramount Skydance launched a $108 billion hostile bid to buy Warner Bros. Discovery with an all-cash, $30-per-share offer.
Cellphones

New Jolla Phone Now Available for Pre-Order as an Independent Linux Phone (9to5linux.com) 45

Jolla is "trying again with a new crowd-funded smartphone," reports Phoronix: Finnish company Jolla started out 14 years ago where Nokia left off with MeeGo and developed Sailfish OS as a new Linux smartphone platform. Jolla released their first smartphone in 2013 after crowdfunding but ultimately the Sailfish OS focus the past number of years now has been offering their software stack for use on other smartphone devices [including some Sony Xperia smartphones and OnePlus/Samsung/ Google/ Xiaomi devices].
This new Jolla Phone's pre-order voucher page says the phone will only be produced if 2,000 units are ordered before January 4. (But in just a few days they've already received 1,721 pre-orders — all discounted to 499€ from a normal price between 599€ and 699€). Estimated delivery is in the first half of 2026. "The new Jolla Phone is powered by a high-performing Mediatek 5G SoC," reports 9to5Linux, "and features 12GB RAM, 256GB storage that can be expanded to up to 2TB with a microSDXC card, a 6.36-inch FullHD AMOLED display with ~390ppi, 20:9 aspect ratio, and Gorilla Glass, and a user-replaceable 5,500mAh battery." The Linux phone also features 4G/5G support with dual nano-SIM and a global roaming modem configuration, Wi-Fi 6 wireless, Bluetooth 5.4, NFC, 50MP Wide and 13MP Ultrawide main cameras, a front-facing wide-lens selfie camera, fingerprint reader on the power key, a user-changeable back cover, and an RGB indication LED. On top of that, the new Jolla Phone promises a user-configurable physical Privacy Switch that lets you turn off the microphone, Bluetooth, Android apps, or whatever you wish.

The device will be available in three colors: Snow White, Kaamos Black, and The Orange. All the specs of the new Jolla Phone were voted on by Sailfish OS community members over the past few months. Honouring the original Jolla Phone form factor and design, the new model ships with Sailfish OS (with support for Android apps), a Linux-based European alternative to the dominant mobile operating systems that promises a minimum of 5 years of support, no tracking, no calling home, and no hidden analytics...

The device will be manufactured and sold in Europe, but Jolla says that it will design the cellular band configuration to enable global travelling as much as possible, including e.g. roaming in the U.S. carrier networks. The initial sales markets are the EU, the UK, Switzerland, and Norway.

Data Storage

The Last Video Rental Store Is Your Public Library 27

404 Media's Claire Woodcock writes: As prices for streaming subscriptions continue to soar, and as the growing number of streaming services makes it harder to find movies to watch, new and old, people are turning to the unexpected last stronghold of physical media: the public library. Some libraries are now intentionally using iconic Blockbuster branding to recall the hours visitors once spent looking for something to rent on Friday and Saturday nights.

John Scalzo, audiovisual collection librarian with a public library in western New York, says that despite an observed drop-off in DVD, Blu-ray, and 4K Ultra disc circulation in 2019, interest in physical media is coming back around. "People really seem to want physical media," Scalzo told 404 Media. Part of it has to do with consumer awareness: People know they're paying more for monthly subscriptions to streaming services and getting less. The same has been true for gaming.

As the audiovisual selector with the Free Library of Philadelphia since 2024, Kris Langlais has been focused on building the library's video game collections to meet rising demand. Now that every branch library has a prominent video game collection, Langlais says that patrons who come for the games are reportedly expressing interest in more of what the library has to offer. "Librarians out in our branches are seeing a lot of young people who are really excited by these collections," Langlais told 404 Media. "Folks who are coming in just for the games are picking up program flyers and coming back for something like that."
IP disputes are fueling the shift, too.

The report notes how rights and licensing battles are making some films harder to access -- from titles that quietly slip out of commercial circulation, to streaming-only releases that never make it to disc, to entire shows vanishing during mergers like HBO Max-Discovery+. One prominent example is The People's Joker, which was briefly pulled from the Toronto International Film Festival over a conflict with Batman's rightsholders.

Situations like that are pushing librarians to grab physical copies while they still can, before these works risk disappearing altogether.
AI

After Nearly 30 Years, Crucial Will Stop Selling RAM To Consumers 116

Micron is shutting down its Crucial consumer RAM business in 2026 after nearly three decades, citing heavy demand from AI data centers. "The AI-driven growth in the data center has led to a surge in demand for memory and storage," Sumit Sadana, EVP and chief business officer at Micron Technology, said in a statement. "Micron has made the difficult decision to exit the Crucial consumer business in order to improve supply and support for our larger, strategic customers in faster-growing segments." Ars Technica reports: Micron said it will continue shipping Crucial consumer products through the end of its fiscal second quarter in February 2026 and will honor warranties on existing products. The company will continue selling Micron-branded enterprise products to commercial customers and plans to redeploy affected employees to other positions within the company.

Crucial launched in 1996 during the Pentium era as Micron's consumer brand for RAM and storage upgrades. Over the years, the brand expanded to encompass other memory-related products such as SSDs, flash memory cards, and portable storage drives. Micron Technology has been manufacturing RAM since 1981.
Data Storage

Google's Vibe Coding Platform Deletes Entire Drive 95

A Google Antigravity user says the AI-driven "vibe coding" tool accidentally wiped his entire D: drive while trying to clear a project cache. Google says it's investigating, but the episode adds to a growing list of AI tools behaving in ways that "would get a junior developer fired," suggests The Register. From the report: We reached out to the user, a photographer and graphic designer from Greece, who asked we only identify him as Tassos M because he doesn't want to be permanently linked online to what could "become a controversy or conspiracy against Google." [...] Tassos told Antigravity to help him develop software that's useful for any photographer who has to choose a few prime shots from a mountain of snaps. He wanted the software to let him rate images, then automatically sort them into folders based on that rating.

According to his Reddit post, when Tassos figured out the AI agent had wiped his drive, he asked, "Did I ever give you permission to delete all the files in my D drive?". "No, you absolutely did not give me permission to do that," Antigravity responded. "I am horrified to see that the command I ran to clear the project cache appears to have incorrectly targeted the root of your D: drive instead of the specific project folder. I am deeply, deeply sorry. This is a critical failure on my part."

Redditors, as they are wont to do, were quick to pounce on Tassos for his own errors, chief among them running Antigravity in Turbo mode, which lets the Antigravity agent execute commands without user input. Tassos accepted responsibility. "If the tool is capable of issuing a catastrophic, irreversible command, then the responsibility is shared -- the user for trusting it and the creator for designing a system with zero guardrails against obviously dangerous commands," he opined on Reddit.
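The "zero guardrails" complaint points at a concrete, easily implemented check. The sketch below is hypothetical (it is not Antigravity's actual safeguard, and the paths are invented for illustration): before executing any destructive filesystem command, an agent could verify that the target resolves to a path strictly inside the project directory, which would have refused the root-of-D:-drive deletion outright.

```python
from pathlib import Path

def safe_to_delete(target: str, project_root: str) -> bool:
    """Return True only if `target` resolves to a path strictly inside
    `project_root`. A minimal path-containment guardrail an agent could
    require before running any destructive command."""
    root = Path(project_root).resolve()
    path = Path(target).resolve()
    # The project root itself, and anything outside it, is off-limits.
    return path != root and root in path.parents

# A cache folder inside the project is fair game...
assert safe_to_delete("/work/photo-sorter/.cache", "/work/photo-sorter")
# ...but the project root itself, or a whole drive, is refused.
assert not safe_to_delete("/work/photo-sorter", "/work/photo-sorter")
assert not safe_to_delete("/", "/work/photo-sorter")
```

The check is deliberately conservative: it refuses the project root itself, so even a "clear the cache" command mis-targeted one level too high gets blocked.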

As noted earlier, Tassos was unable to recover the files that Antigravity deleted. Luckily, as he explained on Reddit, most of what he lost had already been backed up on another drive. Phew. "I don't think I'm going to be using that again," Tassos noted in a YouTube video he published showing additional details of his Antigravity console and the AI's response to its mistake. Tassos isn't alone in his experience. Multiple Antigravity users have posted on Reddit to explain that the platform had wiped out parts of their projects without permission.
Data Storage

How Bad Will RAM and Memory Shortages Get? (arstechnica.com) 77

Digital Trends reports: A wave of shortages now threatens to ripple across RAM, SSDs, and even hard drives, affecting not only performance-hungry rigs but also everyday systems.

— CyberPowerPC has publicly confirmed it will raise prices on all systems starting December 7th due to RAM costs spiking by 500% and SSD prices doubling since October.

— Memory suppliers warn of a global DRAM and SSD shortage running into late 2026 or even 2027, driven heavily by AI server demand.

— As reported by Bloomberg, Lenovo has already stockpiled memory to ride out the crunch and maintain steadier PC pricing.

— Among other OEMs, HP, in its recent earnings call, flagged possible price increases or lower-spec models on the back of rising component costs.

But Apple "may also be in a good position to weather the shortage," reports Ars Technica, since "analysts at Morgan Stanley and Bernstein Research believe that Apple has already laid claim to the RAM that it needs and that its healthy profit margins will allow it to absorb the increases better than most."

Ars Technica also shows how much RAM and storage prices have jumped — sometimes as much as 2x or even 3x in just three months. "In short, there's no escaping these price increases, which affect SSDs and both DDR4 and DDR5 RAM kits of all capacities (though higher-capacity RAM kits do seem to be hit a little harder)." Memory and storage shortages can be particularly difficult to get through. As with all chips, it can take years to ramp up capacity and/or build new manufacturing facilities... And memory makers in particular may be slow to ramp up manufacturing capacity in response to shortages. If they decide to start manufacturing more chips now, what happens if memory demand drops off a cliff in six months or a year (if, say, an AI bubble deflates or pops altogether)? It means an oversupply of memory chips — consumers benefit from rock-bottom prices for components, but it becomes harder for manufacturers to cover their costs... The upshot is: Not only are memory prices getting bad now, but it's exceptionally difficult to predict when shortage-fueled price hikes might end...

Tom's Hardware reports that AMD has told its partners that it expects to raise GPU prices by about 10 percent starting next year and that Nvidia may have canceled a planned RTX 50-series Super launch entirely because of shortages and price increases.

Canada

Canada Rolls Back Climate Rules To Boost Investments 75

Canada's Prime Minister Mark Carney has signed an agreement with Alberta's premier that will roll back certain climate rules to spur investment in energy production, while encouraging construction of a new oil pipeline to the West Coast. From a report: Under the agreement, which was signed on Thursday, the federal government will scrap a planned emissions cap on the oil and gas sector and drop rules on clean electricity in exchange for a commitment by Canada's top oil-producing province to strengthen industrial carbon pricing and support a carbon capture-and-storage project.

The deal, which was hailed by the country's oil industry but panned by environmentalists, signaled a shift in Canada's energy policy in favour of fossil fuel development and is already creating tensions within Carney's minority government. Steven Guilbeault, who served as environment minister under Carney's predecessor Justin Trudeau, said he was quitting the cabinet over concerns that Canada's climate plan was being dismantled.
Data Storage

Unpowered SSDs in Your Drawer Are Slowly Losing Data (xda-developers.com) 79

An anonymous reader shares a report: Solid-state drives sitting unpowered in drawers or storage can lose data over time because voltage gradually leaks from their NAND flash cells, and consumer-grade drives using QLC NAND retain data for about a year while TLC NAND lasts up to three years without power. More expensive MLC and SLC NAND can hold data for five and ten years respectively. The voltage loss can result in missing data or completely unusable drives.

Hard drives remain more resistant to power loss despite their susceptibility to bit rot. Most users relying on SSDs for primary storage in regularly powered computers face little risk since drives typically stay unpowered for only a few months at most. The concern mainly affects creative professionals and researchers who need long-term archival storage.

Power

One Company's Plan to Sink Nuclear Reactors Deep Underground (ieee.org) 113

Long-time Slashdot reader jenningsthecat shared this article from IEEE Spectrum: By dropping a nuclear reactor 1.6 kilometers (1 mile) underground, Deep Fission aims to use the weight of a billion tons of rock and water as a natural containment system comparable to concrete domes and cooling towers. With the fission reaction occurring far below the surface, steam can safely circulate in a closed loop to generate power.

The California-based startup announced in October that prospective customers had signed non-binding letters of intent for 12.5 gigawatts of power involving data center developers, industrial parks, and other (mostly undisclosed) strategic partners, with initial sites under consideration in Kansas, Texas, and Utah... The company says its modular approach allows multiple 15-megawatt reactors to be clustered on a single site: A block of 10 would total 150 MW, and Deep Fission claims that larger groupings could scale to 1.5 GW. Deep Fission claims that using geological depth as containment could make nuclear energy cheaper, safer, and deployable in months at a fraction of a conventional plant's footprint...

The company aims to finalize its reactor design and confirm the pilot site in the coming months. [Company founder Liz] Muller says the plan is to drill the borehole, lower the canister, load the fuel, and bring the reactor to criticality underground in 2026. Sites in Utah, Texas, and Kansas are among the leading candidates for the first commercial-scale projects, which could begin construction in 2027 or 2028, depending on the speed of DOE and NRC approvals. Deep Fission expects to start manufacturing components for the first unit in 2026 and does not anticipate major bottlenecks aside from typical long-lead items.

In short "The same oil and gas drilling techniques that reliably reach kilometer-deep wells can be adapted to host nuclear reactors..." the article points out. Their design would also streamline construction, since "Locating the reactors under a deep water column subjects them to roughly 160 atmospheres of pressure — the same conditions maintained inside a conventional nuclear reactor — which forms a natural seal to keep any radioactive coolant or steam contained at depth, preventing leaks from reaching the surface."

Other interesting points from the article:
  • They plan on operating and controlling the reactor remotely from the surface.
  • Company founder Muller says if an earthquake ever disrupted the site, "you seal it off at the bottom of the borehole, plug up the borehole, and you have your waste in safe disposal."
  • For waste management, the company "is eyeing deep geological disposal in the very borehole systems they deploy for their reactors."
  • "The company claims it can cut overall costs by 70 to 80 percent compared with full-scale nuclear plants."

"Among its competition are projects like TerraPower's Natrium, notes the tech news site Hackaday, saying TerraPower's fast neutron reactors "are already under construction and offer much more power per reactor, along with Natrium in particular also providing built-in grid-level storage.

"One thing is definitely for certain..." they add. "The commercial power sector in the US has stopped being mind-numbingly boring."


United Kingdom

Britain Sets New Record, Generating Enough Wind Power for 22 Million Homes (thetimes.com) 113

An anonymous reader shared this report from Sky News: A new wind record has been set for Britain, with enough electricity generated from turbines to power 22 million homes, the system operator has said.

The mark of 22,711 megawatts (MW) was set at 7.30pm on 11 November... enough to keep around three-quarters of British homes powered, the National Energy System Operator (Neso) said. The country had experienced windy conditions, particularly in the north of England and Scotland...
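The megawatts-to-homes conversion implies an average draw of about a kilowatt per household, a common rule-of-thumb assumption for such comparisons. A quick check of the arithmetic:

```python
# 22,711 MW of wind output, quoted as enough for 22 million homes.
record_mw = 22_711
homes = 22_000_000

# Convert MW to kW and divide by the number of homes.
kw_per_home = record_mw * 1000 / homes
print(round(kw_per_home, 2))  # ~1.03 kW assumed per home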

Neso has predicted that Britain could hit another milestone in the months ahead by running the electricity grid for a period entirely with zero carbon power, renewables and nuclear... Neso said wind power is now the largest source of electricity generation for the UK, and the government wants to generate almost all of the UK's electricity from low-carbon sources by 2030.

"Wind accounted for 55.7 per cent of Britain's electricity mix at the time..." reports The Times: Gas provided only 12.5 per cent of the mix, with 11.3 per cent coming from imports over subsea power cables, 8 per cent from nuclear reactors, 8 per cent from biomass plants, 1.4 per cent from hydroelectric plants and 1.1 per cent from storage.

Britain has about 32 gigawatts of wind farms installed, approximately half of that onshore and half offshore, according to the Wind Energy Database from the wind industry body Renewable UK. That includes five of the world's biggest offshore wind farms. The government is seeking to double onshore wind and quadruple offshore wind power by 2030 as part of its plan for clean energy....

Jane Cooper, deputy chief executive of Renewable UK, said: "On a cold, dark November evening, wind was generating enough electricity to power 80 per cent of British homes when we needed it most."

Google

Google Must Double AI Serving Capacity Every 6 Months To Meet Demand 57

Google's AI infrastructure chief told employees the company must double its AI serving capacity every six months to meet demand. In a presentation earlier this month titled "AI Infrastructure," Amin Vahdat, a vice president at Google Cloud, included a slide on "AI compute demand" that said: "Now we must double every 6 months.... the next 1000x in 4-5 years." CNBC reports: The presentation was delivered a week after Alphabet reported better-than-expected third-quarter results and raised its capital expenditures forecast for the second time this year, to a range of $91 billion to $93 billion, followed by a "significant increase" in 2026. Hyperscaler peers Microsoft, Amazon and Meta also boosted their capex guidance, and the four companies now expect to collectively spend more than $380 billion this year.
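The two numbers on the slide are consistent with each other: doubling every six months means ten doublings in five years, which compounds to roughly a thousandfold increase.

```python
# Doubling every 6 months = 2 doublings per year.
doublings_per_year = 2
years = 5

# Compound growth after 10 doublings.
growth = 2 ** (doublings_per_year * years)
print(growth)  # 1024, i.e. "the next 1000x in 4-5 years"
```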

Google's "job is of course to build this infrastructure but it's not to outspend the competition, necessarily," Vahdat said. "We're going to spend a lot," he said, adding that the real goal is to provide infrastructure that is far "more reliable, more performant and more scalable than what's available anywhere else." In addition to infrastructure build-outs, Vahdat said Google bolsters capacity with more efficient models and through its custom silicon. Last week, Google announced the public launch of its seventh generation Tensor Processing Unit called Ironwood, which the company says is nearly 30 times more power efficient than its first Cloud TPU from 2018.

Vahdat said the company has a big advantage with DeepMind, which has research on what AI models can look like in future years. Google needs to "be able to deliver 1,000 times more capability, compute, storage networking for essentially the same cost and increasingly, the same power, the same energy level," Vahdat said. "It won't be easy but through collaboration and co-design, we're going to get there."
China

Tech Company CTO and Others Indicted For Exporting Nvidia Chips To China (arstechnica.com) 11

An anonymous reader quotes a report from Ars Technica: The US crackdown on chip exports to China has continued with the arrests of four people accused of a conspiracy to illegally export Nvidia chips. Two US citizens and two nationals of the People's Republic of China (PRC), all of whom live in the US, were charged in an indictment (PDF) unsealed on Wednesday in US District Court for the Middle District of Florida. The indictment alleges a scheme to send Nvidia "GPUs to China by falsifying paperwork, creating fake contracts, and misleading US authorities," John Eisenberg, assistant attorney general for the Justice Department's National Security Division, said in a press release yesterday.

The four arrestees are Hon Ning Ho (aka Mathew Ho), a US citizen who was born in Hong Kong and lives in Tampa, Florida; Brian Curtis Raymond, a US citizen who lives in Huntsville, Alabama; Cham Li (aka Tony Li), a PRC national who lives in San Leandro, California; and Jing Chen (aka Harry Chen), a PRC national who lives in Tampa on an F-1 non-immigrant student visa. The suspects face a raft of charges for conspiracy to violate the Export Control Reform Act of 2018, smuggling, and money laundering. If convicted and given the maximum sentences, they could serve many decades in prison and be forced to forfeit their financial gains. The indictment says that Chinese companies paid the conspirators nearly $3.9 million.

One of the suspects was briefly the CTO of Corvex, a Virginia-based AI cloud computing company that is planning to go public. Corvex told CNBC yesterday that it "had no part in the activities cited in the Department of Justice's indictment," and that "the person in question is not an employee of Corvex. Previously a consultant to the company, he was transitioning into an employee role but that offer has been rescinded."
The Internet

Cloudflare Explains Its Worst Outage Since 2019 57

Cloudflare suffered its worst network outage in six years on Tuesday, beginning at 11:20 UTC. The disruption prevented the content delivery network from routing traffic for roughly three hours. The failure, Cloudflare writes in a blog post, originated from a database permissions change deployed at 11:05 UTC. The modification altered how a database query returned information about bot detection features, and the query began returning duplicate entries. As a result, a configuration file used to identify automated traffic doubled in size and spread across the network's machines. Cloudflare's traffic routing software reads this file to distinguish bots from legitimate users, and it had a built-in limit of 200 bot detection features. The enlarged file contained more than 200 entries, and the software crashed when the entry count exceeded that limit.
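The chain of events is a classic fixed-limit failure. A minimal sketch (hypothetical names, not Cloudflare's actual code) of how duplicate query rows can push a feature file past a hard-coded cap:

```python
MAX_FEATURES = 200  # hard-coded capacity in the routing software

def load_features(rows):
    """Build the bot-detection feature list from database rows.

    Mirrors the failure mode: if an upstream query starts returning
    duplicate rows, the list doubles and blows past the limit.
    """
    features = [name for name, _meta in rows]
    if len(features) > MAX_FEATURES:
        # The real software treated this as unrecoverable and crashed.
        raise RuntimeError(
            f"feature count {len(features)} exceeds limit {MAX_FEATURES}"
        )
    return features

rows = [(f"feature_{i}", {}) for i in range(120)]
load_features(rows)             # fine: 120 <= 200
try:
    load_features(rows + rows)  # duplicates double the list: 240 > 200
except RuntimeError as e:
    print("crash:", e)

# Deduplicating at load time (or in the generating query) avoids the blow-up:
assert len(dict.fromkeys(name for name, _meta in rows + rows)) == 120
```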

Users attempting to access websites behind Cloudflare's network received error messages. The outage affected multiple services. Turnstile security checks failed to load. The Workers KV storage service returned elevated error rates. Users could not log into Cloudflare's dashboard. Access authentication failed for most customers.

Engineers initially suspected a coordinated attack. The configuration file was automatically regenerated every five minutes, and because the database update was rolling out gradually, servers produced either correct or corrupted versions of it. Services repeatedly recovered and failed as the different versions circulated. Teams stopped generating new files at 14:24 UTC and manually restored a working version. Most traffic resumed by 14:30 UTC, and all systems returned to normal at 17:06 UTC.
Social Networks

Jack Dorsey Funds diVine, a Vine Reboot That Includes Vine's Video Archive (techcrunch.com) 20

An anonymous reader quotes a report from TechCrunch: As generative AI content starts to fill our social apps, a project to bring back Vine's six-second looping videos is launching with Twitter co-founder Jack Dorsey's backing. On Thursday, a new app called diVine will give access to more than 100,000 archived Vine videos, restored from an older backup that was created before Vine's shutdown. The app won't just exist as a walk down memory lane; it will also allow users to create profiles and upload their own new Vine videos. However, unlike on traditional social media, where AI content is often haphazardly labeled, diVine will flag suspected generative AI content and prevent it from being posted. According to TechCrunch, a volunteer preservation group called the Archive Team saved Vine's content when it shut down in 2016. The only problem was that everything was stored in massive 40-50 GB binary blob files that were basically unusable for casual viewing.

Evan Henshaw-Plath (who goes by the name Rabble), an early Twitter employee and member of Jack Dorsey's nonprofit "and Other Stuff," dug into those backup files to try and salvage as much as he could. He spent months writing big-data extraction scripts, reverse-engineering how the archived binaries were structured, and reconstructing the original video files, old user info, view counts, and more. "I wasn't able to get all of them out, but I was able to get a lot out and basically reconstruct these Vines and these Vine users, and give each person a new user [profile] on this open network," he said.
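Carving media out of an undocumented binary dump typically starts by scanning for known container signatures. A simplified, hypothetical sketch (not Rabble's actual scripts) of locating MP4 files via their "ftyp" marker:

```python
def find_mp4_offsets(blob: bytes) -> list[int]:
    """Return candidate start offsets of MP4 files inside a binary blob.

    An MP4 file begins with a 4-byte box size followed by the ASCII
    tag 'ftyp', so a hit at offset i implies a file starting at i - 4.
    """
    offsets = []
    i = blob.find(b"ftyp")
    while i != -1:
        if i >= 4:  # need room for the preceding size field
            offsets.append(i - 4)
        i = blob.find(b"ftyp", i + 1)
    return offsets

# Synthetic blob: 100 bytes of junk, a fake MP4 header, more junk.
blob = b"\x00" * 100 + b"\x00\x00\x00\x20ftypisom" + b"\xff" * 50
print(find_mp4_offsets(blob))  # [100]
```

Real carving also has to recover each file's length and validate the box structure, but signature scanning is the usual first pass over an opaque archive.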

Rabble estimates that through this process he was able to successfully recover 150,000-200,000 Vine videos from around 60,000 creators. diVine then rebuilt user profiles on top of the decentralized Nostr protocol so creators can reclaim their accounts, request takedowns, or upload missing videos.
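On Nostr, a user profile is just a signed "kind 0" event whose content is JSON metadata, which is what makes rebuilding accounts from archived data tractable. A sketch of constructing such an event per the protocol's NIP-01 serialization (signing omitted; not diVine's actual code):

```python
import hashlib
import json
import time

def profile_event(pubkey_hex: str, name: str, about: str) -> dict:
    """Build an unsigned Nostr kind-0 (profile metadata) event.

    Per NIP-01, the event id is the SHA-256 of the canonical JSON
    array [0, pubkey, created_at, kind, tags, content].
    """
    content = json.dumps({"name": name, "about": about})
    created_at = int(time.time())
    serialized = json.dumps(
        [0, pubkey_hex, created_at, 0, [], content],
        separators=(",", ":"),
    )
    event_id = hashlib.sha256(serialized.encode()).hexdigest()
    return {
        "id": event_id,
        "pubkey": pubkey_hex,
        "created_at": created_at,
        "kind": 0,
        "tags": [],
        "content": content,
    }

event = profile_event("ab" * 32, "vine_creator", "restored from the archive")
print(event["kind"], len(event["id"]))  # 0 64
```

Because identity is just a keypair, a creator can later "reclaim" a reconstructed profile by proving control of the corresponding key.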

You can check out the app for yourself at diVine.video. It's available in beta form on both iOS and Android.
Hardware

Valve Rejoins the VR Hardware Wars With Standalone Steam Frame (arstechnica.com) 45

Valve is ready to rejoin the VR hardware race with the Steam Frame, a lightweight standalone SteamOS headset that can run games locally or stream wirelessly from a PC using new "foveated streaming" tech. It's set to launch in early 2026. Ars Technica reports: Powered by a Snapdragon 8 Gen 3 processor with 16 GB of RAM, the Steam Frame sports a 2160 x 2160 resolution display per eye with an "up to 110 degrees" field of view and up to 144 Hz refresh rate. That's all roughly in line with 2023's Meta Quest 3, which runs on the slightly less performant Snapdragon XR2 Gen 2 processor. Valve's new headset will be available in models with 256GB and 1TB of internal storage, both with the option for expansion via a microSD card slot. Pricing details have not yet been revealed publicly.

The Steam Frame's inside-out tracking cameras mean you won't have to set up the awkward external base stations that were necessary for previous SteamVR headsets (including the Index). But that also means old SteamVR controllers won't work with the new hardware. Instead, included Steam Frame controllers will track your hand movements, provide haptic feedback, and offer "input parity with a traditional game pad" through the usual buttons and control sticks.

For those who want to bring desktop GPU power to their VR experience, the Steam Frame will be able to connect wirelessly to a PC using an included 6 GHz Wi-Fi 6E adapter. That streaming will be enhanced by what Valve is calling "foveated streaming" technology, which sends the highest-resolution video stream to where your eyes are directly focused (as tracked by two internal cameras). That will help Steam Frame streaming establish a "fast, direct, low-latency link" to the machine, Valve said, though the company has yet to respond to questions about just how much additional wireless latency users can expect.
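The idea behind foveated streaming can be sketched in a few lines: spend encoding quality where the tracked gaze lands and let it fall off with distance. A toy model (hypothetical parameters, not Valve's implementation):

```python
import math

def tile_quality(tile_center, gaze, max_q=100, min_q=20, falloff=0.5):
    """Assign an encoding quality to a screen tile by gaze distance.

    Tiles under the fovea get max_q; quality decays exponentially
    with distance in normalized screen coordinates, never dropping
    below min_q for the periphery.
    """
    dist = math.dist(tile_center, gaze)
    q = min_q + (max_q - min_q) * math.exp(-dist / falloff)
    return round(q)

gaze = (0.5, 0.5)  # normalized gaze point from the eye-tracking cameras
print(tile_quality((0.5, 0.5), gaze))  # focal tile: 100
print(tile_quality((0.9, 0.9), gaze))  # peripheral tile: much lower
```

The payoff is bandwidth: most of the frame can be streamed at reduced quality without the user noticing, which is what enables a low-latency wireless link.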
Further reading: Valve Enters the Console Wars
The Courts

OpenAI Fights Order To Turn Over Millions of ChatGPT Conversations (reuters.com) 69

An anonymous reader quotes a report from Reuters: OpenAI asked a federal judge in New York on Wednesday to reverse an order that required it to turn over 20 million anonymized ChatGPT chat logs amid a copyright infringement lawsuit by the New York Times and other news outlets, saying it would expose users' private conversations. The artificial intelligence company argued that turning over the logs would disclose confidential user information and that "99.99%" of the transcripts have nothing to do with the copyright infringement allegations in the case.

"To be clear: anyone in the world who has used ChatGPT in the past three years must now face the possibility that their personal conversations will be handed over to The Times to sift through at will in a speculative fishing expedition," the company said in a court filing (PDF). The news outlets argued that the logs were necessary to determine whether ChatGPT reproduced their copyrighted content and to rebut OpenAI's assertion that they "hacked" the chatbot's responses to manufacture evidence. The lawsuit claims OpenAI misused their articles to train ChatGPT to respond to user prompts.

Magistrate Judge Ona Wang said in her order to produce the chats that users' privacy would be protected by the company's "exhaustive de-identification" and other safeguards. OpenAI has a Friday deadline to produce the transcripts.
