23andMe is on the brink. What happens to all that genetic DNA data?

[…] The one-and-done nature of Wiles’ experience is indicative of a core business problem with the once high-flying biotech company that is now teetering on the brink of collapse. Wiles and many of 23andMe’s 15 million other customers never returned. They paid once for a saliva kit, then moved on.

Shares of 23andMe are now worth pennies. The company’s valuation has plummeted 99% from its $6 billion peak shortly after it went public in 2021.

As 23andMe struggles for survival, customers like Wiles have one pressing question: What is the company’s plan for all the data it has collected since it was founded in 2006?

[…]

Andy Kill, a spokesperson for 23andMe, would not comment on what the company might do with its trove of genetic data beyond general pronouncements about its commitment to privacy.

[…]

When signing up for the service, about 80% of 23andMe’s customers have opted in to having their genetic data analyzed for medical research.

[…]

The company has an agreement with pharmaceutical giant GlaxoSmithKline, or GSK, that allows the drugmaker to tap the tech company’s customer data to develop new treatments for disease.

Anya Prince, a law professor at the University of Iowa’s College of Law who focuses on genetic privacy, said those worried about their sensitive DNA information may not realize just how few federal protections exist.

For instance, the Health Insurance Portability and Accountability Act, also known as HIPAA, does not apply to 23andMe since it is a company outside of the health care realm.

[…]

According to the company, all of its genetic data is anonymized, meaning there is no way for GSK, or any other third party, to connect a sample to a real person. That, however, could make it nearly impossible for a customer to reverse their decision to let researchers access their DNA data.

“I couldn’t go to GSK and say, ‘Hey, my sample was given to you — I want that taken out’ — if it was anonymized, right? Because they’re not going to re-identify it just to pull it out of the database,” Prince said.

[…]

The patchwork of state laws governing DNA data makes the genetic data of millions potentially vulnerable to being sold off, or even mined by law enforcement.

“Having to rely on a private company’s terms of service or bottom line to protect that kind of information is troubling — particularly given the level of interest we’ve seen from government actors in accessing such information during criminal investigations,” Eidelman said.

She points to how investigators used a genealogy website to identify the man known as the Golden State Killer, and how police homed in on an Idaho murder suspect by turning to similar databases of genetic profiles.

“This has happened without people’s knowledge, much less their express consent,” Eidelman said.

[…]

Last year, the company was hit with a major data breach that it said affected 6.9 million customer accounts, including about 14,000 whose passwords were stolen.

[…]

Some analysts predict that 23andMe could go out of business by next year, barring a bankruptcy proceeding that could potentially restructure the company.

[…]

Source: What happens to all of 23andMe’s genetic DNA data? : NPR

For more fun reading about this clusterfuck of a company and why giving away DNA data is a spectacularly bad idea:

Google’s AI enshittifies search summaries with ads

Google is rolling out ads in AI Overviews, which means you’ll now start seeing products in some of the search engine’s AI-generated summaries.

Let’s say you’re searching for ways to get a grass stain out of your pants. If you ask Google, its AI-generated response will offer some tips, along with suggestions for products to purchase that could help you remove the stain. […]

Google’s AI Overviews could contain relevant products.


Source: Google’s AI search summaries officially have ads – The Verge

License Plate Readers Are Creating a US-Wide Database of Cars – and Political Affiliation, Planned Parenthood Visits, and More

At 8:22 am on December 4 last year, a car traveling down a small residential road in Alabama used its license-plate-reading cameras to take photos of vehicles it passed. One image, which does not contain a vehicle or a license plate, shows a bright red “Trump” campaign sign placed in front of someone’s garage. In the background is a banner referencing Israel, a holly wreath, and a festive inflatable snowman.

Another image taken on a different day by a different vehicle shows a “Steelworkers for Harris-Walz” sign stuck in the lawn in front of someone’s home. A construction worker, with his face unblurred, is pictured near another Harris sign. Other photos show Trump and Biden (including “Fuck Biden”) bumper stickers on the back of trucks and cars across America.

[…]

These images were generated by AI-powered cameras mounted on cars and trucks, initially designed to capture license plates, but which are now photographing political lawn signs outside private homes, individuals wearing T-shirts with text, and vehicles displaying pro-abortion bumper stickers—all while recording the precise locations of these observations.

[…]

The detailed photographs all surfaced in search results produced by the systems of DRN Data, a license-plate-recognition (LPR) company owned by Motorola Solutions. The LPR system can be used by private investigators, repossession agents, and insurance companies; a related Motorola business, called Vigilant, gives cops access to the same LPR data.

[…]

Those with access to the LPR system can search for common phrases or names, such as those of politicians, and be served with photographs where the search term is present, even if it is not displayed on license plates.

[…]

“I searched for the word ‘believe,’ and that is all lawn signs. There’s things just painted on planters on the side of the road, and then someone wearing a sweatshirt that says ‘Believe,’” Weist says. “I did a search for the word ‘lost,’ and it found the flyers that people put up for lost dogs and cats.”

Beyond highlighting the far-reaching nature of LPR technology, which has collected billions of images of license plates, the research also shows how people’s personal political views and their homes can be recorded into vast databases that can be queried.

[…]

Over more than a decade, DRN has amassed more than 15 billion “vehicle sightings” across the United States, and its marketing materials claim it adds more than 250 million sightings per month.

[…]

The system is partly fueled by DRN “affiliates” who install cameras in their vehicles, such as repossession trucks, and capture license plates as they drive around. Each vehicle can have up to four cameras attached, capturing images from all angles. These affiliates earn monthly bonuses and can also receive free cameras and search credits.

In 2022, Weist became a certified private investigator in New York State. In doing so, she unlocked the ability to access the vast array of surveillance software accessible to PIs. Weist could access DRN’s analytics system, DRNsights, as part of a package through investigations company IRBsearch. (After Weist published an op-ed detailing her work, IRBsearch conducted an audit of her account and discontinued it.)

[…]

While not linked to license plate data, one law enforcement official in Ohio recently said people should “write down” the addresses of people who display yard signs supporting Vice President Kamala Harris, the 2024 Democratic presidential nominee, exemplifying how a searchable database of citizens’ political affiliations could be abused.

[…]

In 2022, WIRED revealed that hundreds of US Immigration and Customs Enforcement employees and contractors were investigated for abusing similar databases, including LPR systems. The alleged misconduct in both reports ranged from stalking and harassment to sharing information with criminals.

[…]


Source: License Plate Readers Are Creating a US-Wide Database of More Than Just Cars | WIRED

Insecure Robot Vacuums From Chinese Company Deebot Collect Photos and Audio to Train Their AI

Ecovacs robot vacuums, which have been found to suffer from critical cybersecurity flaws, are collecting photos, videos and voice recordings — taken inside customers’ houses — to train the company’s AI models.

The Chinese home robotics company, which sells a range of popular Deebot models in Australia, said its users are “willingly participating” in a product improvement program.

When users opt into this program through the Ecovacs smartphone app, they are not told what data will be collected, only that it will “help us strengthen the improvement of product functions and attached quality”. Users are instructed to click “above” to read the specifics; however, there is no link available on that page.

Ecovacs’s privacy policy — available elsewhere in the app — allows for blanket collection of user data for research purposes, including:

– The 2D or 3D map of the user’s house generated by the device
– Voice recordings from the device’s microphone
– Photos or videos recorded by the device’s camera

“It also states that voice recordings, videos and photos that are deleted via the app may continue to be held and used by Ecovacs…”

Source: Insecure Robot Vacuums From Chinese Company Deebot Collect Photos and Audio to Train Their AI