Microsoft is going nuclear to power its AI ambitions

Microsoft thinks next-generation nuclear reactors can power its data centers and AI ambitions, according to a job listing for a principal program manager who’ll lead the company’s nuclear energy strategy.

Data centers already use a hell of a lot of electricity, which could thwart the company’s climate goals unless it can find clean sources of energy. Energy-hungry AI makes that an even bigger challenge for the company to overcome. AI dominated Microsoft’s Surface event last week.

[…]

The job posting says it’s hiring someone to “lead project initiatives for all aspects of nuclear energy infrastructure for global growth.”

Microsoft is specifically looking for someone who can roll out a plan for small modular reactors (SMR).

[…]

The US Nuclear Regulatory Commission just certified an SMR design for the first time in January, which allows utilities to choose the design when applying for a license for a new power plant. And it could usher in a whole new chapter for nuclear energy.

Even so, there are still kinks to work out if Microsoft wants to rely on SMRs to power the data centers where its cloud and AI live. An SMR requires more highly enriched uranium fuel, called HALEU, than today’s traditional reactors. So far, Russia has been the world’s major supplier of HALEU. There’s a push in the US to build up a domestic uranium supply chain, which communities near uranium mines and mills are already fighting. Then there’s the question of what to do with nuclear waste; even a fleet of SMRs would generate significant amounts of it, and the US is still figuring out how to store it long term.

[…]

Microsoft has also made an audacious deal to purchase electricity from a company called Helion that’s developing an even more futuristic fusion power plant. Both old-school nuclear reactors and SMR designs generate electricity through nuclear fission, which is the splitting apart of atoms. Nuclear fusion involves forcing atoms together, the way stars do to create their own energy. A fusion reactor is a holy grail of sorts — it would be a source of abundant clean energy that doesn’t create the same radioactive waste as nuclear fission. But despite decades of research and recent breakthroughs, most experts say a fusion power plant is at least decades away — and the world can’t wait that long to tackle climate change.

Helion’s backers also include Sam Altman, CEO of ChatGPT developer OpenAI.

[…]

Source: Microsoft is going nuclear to power its AI ambitions – The Verge

Heat pumps twice as efficient as fossil fuel systems in cold weather, study finds – gas lobby misinforms, blocks uptake

Heat pumps are more than twice as efficient as fossil fuel heating systems in cold temperatures, research shows.

Even at temperatures approaching -30C, heat pumps outperform oil and gas heating systems, according to the research from Oxford University and the Regulatory Assistance Project thinktank.

[…]

The research, published in the specialist energy research journal Joule, used data from seven field studies in North America, Asia and Europe. It found that at temperatures below zero, heat pumps were between two and three times more efficient than oil and gas heating systems.
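The "two to three times more efficient" finding can be made concrete with a rough back-of-the-envelope comparison of heat delivered per unit of input energy. The coefficient of performance (COP) of 2.5 below reflects the study's range for sub-zero temperatures, and the ~90% boiler efficiency is a typical assumed figure for a modern condensing gas boiler, not a number from the paper:

```python
# Rough comparison of useful heat delivered per kWh of input energy.
# Assumed illustrative values: a heat-pump COP of 2.5 at sub-zero
# temperatures (within the study's two-to-three-times range) and a
# ~90% efficient condensing gas boiler.

heat_pump_cop = 2.5        # kWh of heat out per kWh of electricity in
boiler_efficiency = 0.90   # kWh of heat out per kWh of gas burned

ratio = heat_pump_cop / boiler_efficiency
print(f"Heat pump delivers {ratio:.1f}x the heat per unit of input energy")
```

A heat pump can exceed 100% "efficiency" because it moves existing heat from outdoors rather than generating all of it by burning fuel.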

The authors said the findings showed that heat pumps were suitable for almost all homes in Europe, including the UK, and should provide policymakers with the impetus to bring in new measures to roll them out as rapidly as possible.

Dr Jan Rosenow, the director of European programmes at the Regulatory Assistance Project and co-author of the report, said: “There has been a campaign spreading false information about heat pumps [including casting doubt on whether they work in cold weather]. People [in the UK] don’t know much about heat pumps, so it’s very easy to scare them by giving them wrong information.”

The Guardian and the investigative journalism organisation DeSmog recently revealed that lobbyists associated with the gas boiler sector had attempted to delay a key government measure to increase the uptake of heat pumps.

[…]

Source: Heat pumps twice as efficient as fossil fuel systems in cold weather, study finds | Energy | The Guardian

Ransomed.vc: Using the GDPR fine as a benchmark to ransom stolen data

On August 15, 2023, the threat actor “Ransomed,” operating under the alias “RansomForums,” posted on Telegram advertising their new forum and Telegram chat channel. On the same day, the domain ransomed[.]vc was registered.

But before activity on Ransomed had even really begun, the forum was the victim of a distributed denial-of-service (DDoS) attack. In response, the operators of the site quickly pivoted to rebrand it as a ransomware blog that, similar to other ransomware collectives, would adopt the approach of publicly listing victim names while issuing threats of data exposure unless ransoms are paid.

[…]

Ransomed is leveraging an extortion tactic that has not been observed before—according to communications from the group, they use data protection laws like the EU’s GDPR to threaten victims with fines if they do not pay the ransom. This tactic marks a departure from typical extortionist operations by twisting protective laws against victims to justify their illegal attacks.

[…]

The group has disclosed ransom demands for its victims, which range from €50,000 to €200,000. For comparison, GDPR fines can climb into the millions and beyond — the highest ever was over €1 billion. It is likely that Ransomed’s strategy is to set ransom amounts lower than the price of a fine for a data security violation, exploiting this discrepancy to increase the chance of payment.
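The economics behind this tactic are easy to sketch. GDPR Article 83 caps the most serious fines at the greater of €20 million or 4% of worldwide annual turnover, so for any sizable company the maximum exposure dwarfs Ransomed's demands. The turnover figure in the example is illustrative and not tied to any actual victim:

```python
# GDPR Article 83(5) caps the most serious fines at whichever is
# higher: EUR 20 million or 4% of worldwide annual turnover.
# The turnover used below is a hypothetical example.

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR Article 83(5) fine, in euros."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A company with EUR 2 billion in turnover faces up to EUR 80 million,
# several hundred times Ransomed's EUR 50,000-200,000 demands.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0
```

Framing the ransom as the "cheaper" of two penalties is the whole pitch, even though paying it does nothing to remove the underlying regulatory exposure.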

As of August 28, Ransomed operators have listed two Bitcoin addresses for payment on their site. Typically, threat actors do not make their wallet addresses public, instead sharing them directly with victims via a ransom note or negotiations portal.

These unconventional choices have set Ransomed apart from other ransomware operations, although it is still unproven if their tactics will be successful.

[…]

It is likely that Ransomed is a financially motivated project, and one of several short-lived projects from its creators.

The owner of the Ransomed Telegram chat claims to have the source code of Raid Forums and said they intend to use it in the future, indicating that while the owner is running a ransomware blog for now, there are plans to turn it back into a forum later—although the timeline for this reversion is not clear.

The forum has gained significant attention in the information security community and in threat communities for its bold statements of targeting large organizations. However, there is limited evidence that the attacks published on the Ransomed blog actually took place, beyond the threat actors’ claims.

[…]

As the security community continues to monitor this enigmatic group’s activities, one thing remains clear: the landscape of ransomware attacks continues to evolve, challenging defenders to adapt and innovate in response.

Source: The Emergence of Ransomed: An Uncertain Cyber Threat in the Making | Flashpoint

The Milky Way’s Mass is Much Lower Than We Thought

How massive is the Milky Way? It’s an easy question to ask, but a difficult one to answer. Imagine a single cell in your body trying to determine your total mass, and you get an idea of how difficult it can be. Despite the challenges, a new study has calculated an accurate mass of our galaxy, and it’s smaller than we thought.

One way to determine a galaxy’s mass is by looking at what’s known as its rotation curve. Measure the speed of stars in a galaxy versus their distance from the galactic center. The speed at which a star orbits is proportional to the amount of mass within its orbit, so from a galaxy’s rotation curve you can map the function of mass per radius and get a good idea of its total mass. We’ve measured the rotation curves for several nearby galaxies such as Andromeda, so we know the masses of many galaxies quite accurately.
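The relation behind a rotation curve is Newtonian: for a star on a roughly circular orbit, v² = G·M(<r)/r, so the mass enclosed within radius r is M(<r) = v²·r/G. As a sketch, plugging in commonly cited approximate values for the Sun's own orbit (these numbers are illustrative, not from the study) gives the mass inside the solar circle:

```python
# Estimate the mass enclosed within the Sun's orbit from the
# circular-orbit relation v^2 = G * M(<r) / r, i.e. M(<r) = v^2 * r / G.
# The speed and radius below are commonly cited approximate values
# for the Sun's orbit, not figures from the Gaia study.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # one solar mass, kg
KPC = 3.086e19         # one kiloparsec, m

v = 220e3              # Sun's orbital speed, ~220 km/s
r = 8.0 * KPC          # Sun's distance from the galactic centre, ~8 kpc

m_enclosed = v**2 * r / G   # mass inside the Sun's orbit, kg
print(f"{m_enclosed / M_SUN:.1e} solar masses")  # roughly 9e10
```

Repeating this for stars at many radii traces out M(<r) across the whole galaxy, which is exactly what a rotation curve provides.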

But since we are in the Milky Way itself, we don’t have a great view of stars throughout the galaxy. Toward the center of the galaxy, there is so much gas and dust we can’t even see stars on the far side. So instead we measure the rotation curve using neutral hydrogen, which emits faint light with a wavelength of about 21 centimeters. This isn’t as accurate as stellar measurements, but it has given us a rough idea of our galaxy’s mass. We’ve also looked at the motions of the globular clusters that orbit in the halo of the Milky Way. From these observations, our best estimate of the mass of the Milky Way is about a trillion solar masses, give or take.

The distribution of stars seen by the Gaia surveys. Credit: Data: ESA/Gaia/DPAC, A. Khalatyan(AIP) & StarHorse team; Galaxy map: NASA/JPL-Caltech/R. Hurt

This new study is based on the third data release of the Gaia spacecraft. It contains the positions of more than 1.8 billion stars and the motions of more than 1.5 billion stars. While this is only a fraction of the estimated 100-400 billion stars in our galaxy, it is a large enough sample to calculate an accurate rotation curve, which is exactly what the team did. Their resulting rotation curve is so precise that the team could identify what’s known as the Keplerian decline. This is the outer region of the Milky Way where stellar speeds start to drop off roughly in accordance with Kepler’s laws, since almost all of the galaxy’s mass is closer to the galactic center.

The Keplerian decline allows the team to place a clear upper limit on the mass of the Milky Way. What they found was surprising. The best fit to their data placed the mass at about 200 billion solar masses, a fifth of previous estimates. The absolute upper mass limit for the Milky Way is 540 billion, meaning the Milky Way is at most about half as massive as we thought. Given the amount of known regular matter in the galaxy, this means the Milky Way has significantly less dark matter than we thought.

Source: The Milky Way’s Mass is Much Lower Than We Thought – Universe Today

Firefox now has private browser-based website translation – no cloud servers required

Web browsers have had tools that let you translate websites for years. But they typically rely on cloud-based translation services like Google Translate or Microsoft’s Bing Translator.

The latest version of Mozilla’s Firefox web browser does things differently. Firefox 118 brings support for Fullpage Translation, which can translate websites entirely in your browser. In other words, everything happens locally on your computer without any data sent to Microsoft, Google, or other companies.

Here’s how it works. Firefox will notice when you visit a website in a supported language that’s different from your default language, and a translate icon will show up in the address bar.

Tap that icon and you’ll see a pop-up window that asks what languages you’d like to translate from and to. If the browser doesn’t automatically detect the language of the website you’re visiting, you can set these manually.

Then click the “Translate” button, and a moment later the text on the page should be visible in your target language. If you’d prefer to go back to the original language, just tap the translate icon again and choose the option that says “show original.”

You can also tap the settings icon in the translation menu and choose to “always translate” or “never translate” a specific language so that you won’t have to manually invoke the translation every time you visit sites in that language.

Now for the bad news: Firefox Fullpage Translation only supports 9 languages so far:

  • Bulgarian
  • Dutch
  • English
  • French
  • German
  • Italian
  • Polish
  • Portuguese
  • Spanish

[…]

Source: Firefox 118 brings browser-based website translation (no cloud servers required… for a handful of supported languages) – Liliputing