UN Cybercrime Treaty does not define cybercrime, allows any definition and forces all signatories to secretly surveil their own population on request by any other signatory (think totalitarian states spying on people in democracies with no recourse)

[…] EFF colleague, Katitza Rodriguez, about the Cybercrime Treaty, which is about to pass, and which is, to put it mildly, terrifying:

https://www.eff.org/deeplinks/2024/07/un-cybercrime-draft-convention-dangerously-expands-state-surveillance-powers

Look, cybercrime is a real thing, from pig butchering to ransomware, and there’s real, global harms that can be attributed to it. Cybercrime is transnational, making it hard for cops in any one jurisdiction to handle it. So there’s a reason to think about formal international standards for fighting cybercrime.

But that’s not what’s in the Cybercrime Treaty.

Here’s a quick sketch of the significant defects in the Cybercrime Treaty.

The treaty has an extremely loose definition of cybercrime, and that looseness is deliberate. In authoritarian states like China and Russia (whose delegations are the driving force behind this treaty), “cybercrime” has come to mean “anything the government disfavors, if you do it with a computer.” “Cybercrime” can mean online criticism of the government, or professions of religious belief, or material supporting LGBTQ rights.

Nations that sign up to the Cybercrime Treaty will be obliged to help other nations fight “cybercrime” – however those nations define it. They’ll be required to provide surveillance data – for example, by forcing online services within their borders to cough up their users’ private data, or even to pressure employees to install back-doors in their systems for ongoing monitoring.

These obligations to aid in surveillance are mandatory, but much of the Cybercrime Treaty is optional. What’s optional? The human rights safeguards. Member states “should” or “may” create standards for legality, necessity, proportionality, non-discrimination, and legitimate purpose. But even if they do, the treaty can oblige them to assist in surveillance orders that originate with other states that decided not to create these standards.

When that happens, the citizens of the affected states may never find out about it. There are eight articles in the treaty that establish obligations for indefinite secrecy regarding surveillance undertaken on behalf of other signatories. That means that your government may be asked to spy on you and the people you love, they may order employees of tech companies to backdoor your account and devices, and that fact will remain secret forever. Forget challenging these sneak-and-peek orders in court – you won’t even know about them:

https://www.eff.org/deeplinks/2024/06/un-cybercrime-draft-convention-blank-check-unchecked-surveillance-abuses

Now here’s the kicker: while this treaty creates broad powers to fight things governments dislike, simply by branding them “cybercrime,” it actually undermines the fight against cybercrime itself. Most cybercrime involves exploiting security defects in devices and services – think of ransomware attacks – and the Cybercrime Treaty endangers the security researchers who point out these defects, creating grave criminal liability for the people we rely on to warn us when the tech vendors we rely upon have put us at risk.

[…]

When it comes to warnings about the defects in their own products, corporations have an irreconcilable conflict of interest. Time and again, we’ve seen corporations rationalize their way into suppressing or ignoring bug reports. Sometimes, they simply delay the warning until they’ve concluded a merger or secured a board vote on executive compensation.

Sometimes, they decide that a bug is really a feature

Note: Responsible disclosure is something people should really “get” by now.

[…]

The idea that users are safer when bugs are kept secret is called “security through obscurity” and no one believes in it – except corporate executives

[…]

The spy agencies have an official doctrine defending this reckless practice: they call it “NOBUS,” which stands for “No One But Us.” As in: “No one but us is smart enough to find these bugs, so we can keep them secret and use them to attack our adversaries, without worrying about those adversaries using them to attack the people we are sworn to protect.”

NOBUS is empirically wrong.

[…]

The leak of these cyberweapons didn’t just provide raw material for the world’s cybercriminals, it also provided data for researchers. A study of CIA and NSA NOBUS defects found that there was a one-in-five chance of a bug that had been hoarded by a spy agency being independently discovered by a criminal, weaponized, and released into the wild.
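That one-in-five figure compounds quickly across a stockpile. A minimal back-of-envelope sketch (the stockpile sizes are invented for illustration; only the ~20% rediscovery rate comes from the study cited above):

```python
# If each hoarded bug has an independent ~20% chance of being rediscovered
# and weaponized by someone else, hoarding more bugs makes at least one
# leak nearly certain. Stockpile sizes below are hypothetical.
P_REDISCOVERED = 0.20

for stockpile in (1, 5, 10, 25):
    p_at_least_one = 1 - (1 - P_REDISCOVERED) ** stockpile
    print(f"{stockpile:>2} hoarded bugs -> {p_at_least_one:.1%} chance at least one leaks")
```

The independence assumption is a simplification, but the direction of the argument holds: stockpiling multiplies the exposure.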

[…]

A Cybercrime Treaty is a good idea, and even this Cybercrime Treaty could be salvaged. The member-states have it in their power to accept proposed revisions that would protect human rights and security researchers, narrow the definition of “cybercrime,” and mandate transparency. They could establish member states’ powers to refuse illegitimate requests from other countries:

https://www.eff.org/press/releases/media-briefing-eff-partners-warn-un-member-states-are-poised-approve-dangerou


Source: Pluralistic: Holy CRAP the UN Cybercrime Treaty is a nightmare (23 Jul 2024) – Cory Doctorow

Dual action antibiotic could make bacterial resistance nearly impossible

A new antibiotic that works by disrupting two different cellular targets would make it 100 million times more difficult for bacteria to evolve resistance, according to new research from the University of Illinois Chicago.

For a new paper in Nature Chemical Biology, researchers probed how a class of synthetic drugs called macrolones disrupt bacterial cell function to fight infectious diseases. Their experiments demonstrate that macrolones can work two different ways—either by interfering with protein production or corrupting DNA structure.

Because bacteria would need to evolve defenses against both attacks simultaneously, the researchers calculated that acquiring resistance is nearly impossible.

“The beauty of this antibiotic is that it kills through two different targets in bacteria,” said Alexander Mankin, distinguished professor of pharmaceutical sciences at UIC. “If the antibiotic hits both targets at the same concentration, then the bacteria lose their ability to become resistant via acquisition of random mutations in any of the two targets.”
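Mankin’s point can be sketched numerically. The mutation rates below are hypothetical round numbers, not figures from the Nature Chemical Biology paper; what matters is that resistance to a dual-target drug requires both mutations in the same cell, so the probabilities multiply:

```python
# Hypothetical per-cell probabilities of a resistance mutation arising
# at each target (illustrative orders of magnitude, not measured values).
p_ribosome = 1e-9   # mutation disabling the protein-synthesis attack
p_gyrase = 1e-8     # mutation disabling the DNA gyrase attack

# A single-target drug is defeated by one mutation; a dual-target drug
# hitting both targets at the same concentration requires both at once.
p_single = p_ribosome
p_dual = p_ribosome * p_gyrase

print(f"single-target resistance: {p_single:.0e}")
print(f"dual-target resistance:   {p_dual:.0e}")
print(f"fold harder:              {p_single / p_dual:.0e}")
```

With these illustrative rates, the dual-target drug is 1e8 (100 million) times harder to evolve resistance against, matching the scale of the researchers’ claim.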

[…]

More information: Elena V. Aleksandrova et al, Macrolones target bacterial ribosomes and DNA gyrase and can evade resistance mechanisms, Nature Chemical Biology (2024). DOI: 10.1038/s41589-024-01685-3

Source: Dual action antibiotic could make bacterial resistance nearly impossible

Google isn’t killing third-party cookies in Chrome after all in move that surprises absolutely no-one.

Google won’t kill third-party cookies in Chrome after all, the company said on Monday. Instead, it will introduce a new experience in the browser that will allow users to make informed choices about their web browsing preferences, Google announced in a blog post. Killing cookies, Google said, would adversely impact online publishers and advertisers. This announcement marks a significant shift from Google’s previous plans to phase out third-party cookies by early 2025.

[…]

Google will now focus on giving users more control over their browsing data, Chavez wrote. This includes additional privacy controls like IP Protection in Chrome’s Incognito mode and ongoing improvements to Privacy Sandbox APIs.

Google’s decision provides a reprieve for advertisers and publishers who rely on cookies to target ads and measure performance. Over the past few years, the company’s plans to eliminate third-party cookies have been riding on a rollercoaster of delays and regulatory hurdles. Initially, Google aimed to phase out these cookies by the end of 2022, but the deadline was pushed to late 2024 and then to early 2025 due to various challenges and feedback from stakeholders, including advertisers, publishers, and regulatory bodies like the UK’s Competition and Markets Authority (CMA).

In January 2024, Google began rolling out a new feature called Tracking Protection, which restricts third-party cookies by default for 1% of Chrome users globally. This move was perceived as the first step towards killing cookies completely. However, concerns and criticism about the readiness and effectiveness of Google’s Privacy Sandbox, a collection of APIs designed to replace third-party cookies, prompted further delays.

The CMA and other regulatory bodies have expressed concerns about Google’s Privacy Sandbox, fearing it might limit competition and give Google an unfair advantage in the digital advertising market. These concerns have led to extended review periods and additional scrutiny, complicating Google’s timeline for phasing out third-party cookies. Shortly after Google’s Monday announcement, the CMA said that it was “considering the impact” of Google’s change of direction.

Source: Google isn’t killing third-party cookies in Chrome after all

Intel has finally figured out its long-standing desktop CPU instability issues, hopefully patches in August

The first reports of instability issues with the 13th-gen Intel desktop CPUs started popping up in late 2022, mere months after the models came out. Those issues persisted, and over time, users reported dealing with unexpected and sudden crashes on PCs equipped with the company’s 14th-gen CPUs, as well. Now, Intel has announced that it finally found the reason why its 13th and 14th-gen desktop processors have been causing crashes and giving out on users, and it promises to roll out a fix by next month.

In its announcement, Intel said that based on extensive analysis of the processors that had been returned to the company, it has determined that elevated operating voltage was causing the instability issues. Apparently, it’s because a microcode algorithm — microcode being the low-level firmware layer inside the processor that controls how it executes instructions — has been sending incorrect voltage requests to the processor.

Intel has now promised to release a microcode patch to address the “root cause of exposure to elevated voltages.” The patch is still being validated to ensure that it can address all “scenarios of instability reported to Intel,” but the company is aiming to roll it out by mid-August.
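To make the failure mode concrete, here is a toy sketch of the kind of bug Intel describes: a voltage/frequency request algorithm that overshoots a safe ceiling, and a patched version that clamps it. Every name, number, and curve here is invented for illustration; Intel has not published the actual algorithm.

```python
# Hypothetical sketch only: invented voltage/frequency curve and limit.
VOLTAGE_SAFE_LIMIT_MV = 1500  # imaginary safe ceiling for the core

def requested_voltage_mv(freq_mhz: int, vf_offset_mv: int) -> int:
    """Buggy request: applies a boost offset without checking the ceiling."""
    base = 800 + freq_mhz // 10  # toy voltage/frequency curve
    return base + vf_offset_mv

def patched_voltage_mv(freq_mhz: int, vf_offset_mv: int) -> int:
    """Patched request: clamps the result to the safe ceiling."""
    return min(requested_voltage_mv(freq_mhz, vf_offset_mv), VOLTAGE_SAFE_LIMIT_MV)

# At a high boost clock with an aggressive offset, the buggy path overshoots:
print(requested_voltage_mv(6000, 150))  # 1550 mV -> elevated, degrades silicon
print(patched_voltage_mv(6000, 150))    # 1500 mV, clamped to the ceiling
```

The real fix lives in microcode rather than software, but the shape of the repair — bounding what the request algorithm may ask for — is the same.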

As wccftech notes, while Intel’s CPUs have been causing issues for users for at least a year and a half, a post on X by Sebastian Castellanos in February put the problem in the spotlight. Castellanos wrote that there was a “worrying trend” of 13th and 14th-gen Intel CPUs having stability issues with Unreal Engine 4 and 5 games, such as Fortnite and Hogwarts Legacy. He also noticed that the issue seems to affect mostly higher-end models, and linked to a discussion on Steam Community. The user who wrote the Steam post wanted to warn those experiencing “out of video memory trying to allocate a rendering resource” errors that their CPU was the faulty component. They also linked to several Reddit threads where people experiencing the same problem had determined that the issue lay with their Intel CPUs.

More recently, the indie studio Alderon Games published a post about “encountering significant problems with Intel CPU stability” while developing its multiplayer dinosaur survival game Path of Titans. Its founder, Matthew Cassells, said the studio found that the issue affected end customers, dedicated game servers, developers’ computers, game server providers and even benchmarking tools that use Intel’s 13th and 14th-gen CPUs. Cassells added that even the CPUs that initially work well deteriorate and eventually fail, based on the company’s observations. “The failure rate we have observed from our own testing is nearly 100 percent,” the studio’s post reads, “indicating it’s only a matter of time before affected CPUs fail.”

Source: Intel has finally figured out its long-standing desktop CPU instability issues

Nvidia’s third-party RTX 40-series GPUs are losing performance over time thanks to rubbish factory-installed thermal paste

Modern graphics cards use lots of power, and all of it is turned into heat. So if you’re paying many hundreds of dollars for a powerful GPU, you’d expect no expense to be spared on the cooling system. It turns out that for many Nvidia RTX 40-series vendors, the expense is being spared: cheap, poorly applied thermal paste is leading to scorching hotspot temperatures and performance degradation over time.

That’s the conclusion hardware tester Igor’s Lab has come to after testing multiple GeForce RTX cards, analysing temperatures and performance, and discovering that the thermal paste used by many graphics card vendors is not only sub-standard for the job but is also poorly applied.

I have four RTX 40-series cards in my office (RTX 4080 Super, 4070 Ti, and two 4070s) and all of them have quite high hotspots—the highest temperature recorded by an individual thermal sensor in the die. In the case of the 4080 Super, it’s around 11 °C higher than the average temperature of the chip. I took it apart to apply some decent quality thermal paste and discovered a similar situation to that found by Igor’s Lab.

In the space of a few months, the factory-applied paste had separated and spread out, leaving just an oily film behind, and a few patches of the thermal compound itself. I checked the other cards and found that they were all in a similar state.

[…]

Removing the factory-installed paste from another RTX 4080 graphics card, Igor’s Lab applied a more appropriate amount of a high-quality paste and discovered that it lowered the hotspot temperature by nearly 30 °C.

But it’s not just about the hotspots. Cheap, poorly applied thermal paste will cause the performance of a graphics card to degrade over time because GPUs lower clock speeds when they reach their thermal limits. PC enthusiasts are probably very comfortable with replacing a CPU’s thermal paste regularly but it’s not a simple process with graphics cards.
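The throttling mechanism behind that degradation can be sketched in a few lines. The temperatures, clocks, and step size below are invented for illustration; real GPUs use firmware-managed curves, not this toy rule:

```python
# Toy model of thermal throttling: when the hotspot sensor crosses the
# thermal limit, the GPU steps its clock down, trading speed for heat.
THERMAL_LIMIT_C = 90    # hypothetical hotspot throttle point
CLOCK_STEP_MHZ = 15     # hypothetical downclock per over-limit reading

def adjust_clock(clock_mhz: int, hotspot_c: float) -> int:
    """Step the clock down whenever the hotspot exceeds the limit."""
    if hotspot_c > THERMAL_LIMIT_C:
        return clock_mhz - CLOCK_STEP_MHZ
    return clock_mhz

# As degraded paste raises the hotspot, the same workload starts throttling:
clock = 2700
for hotspot in (85.0, 92.0, 95.0):  # paste degrading over the months
    clock = adjust_clock(clock, hotspot)
print(clock)  # 2670: two over-limit readings cost two clock steps
```

This is why bad paste shows up as lost frames rather than crashes: the card quietly runs slower to stay inside its thermal envelope.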

[…]

While Nvidia enjoys huge margins on its GPUs, graphics card vendors aren’t quite so lucky. Still, their margins aren’t so thin that spending a few more dollars on better thermal paste would bankrupt the company.

Mind you, if they all started using PTM7950, then none of this would be an issue—the cards would run cooler and would stay that way for much longer. The only problem then is that you’d hear the coil whine over the reduced fan noise.

Source: Nvidia’s third-party RTX 40-series GPUs are losing performance over time thanks to rubbish factory-installed thermal paste | PC Gamer