Partly AI-generated folk-pop hit barred from Sweden’s official charts

A hit song has been excluded from Sweden’s official chart after it emerged the “artist” behind it was an AI creation.

I Know, You’re Not Mine – or Jag Vet, Du Är Inte Min in Swedish – by a singer called Jacub has been a streaming success in Sweden, topping the Spotify rankings.

However, the Swedish music trade body has excluded the song from the official chart after learning it was AI-generated.


“Jacub’s track has been excluded from Sweden’s official chart, Sverigetopplistan, which is compiled by IFPI Sweden. While the song appears on Spotify’s own charts, it does not qualify for inclusion on the official chart under the current rules,” said an IFPI Sweden spokesperson.

Ludvig Werber, IFPI Sweden’s chief executive, said: “Our rule is that if it is a song that is mainly AI-generated, it does not have the right to be on the top list.”

[…]

IFPI Sweden acted after an investigative journalist, Emanuel Karlsten, revealed the song was registered to a Danish music publisher called Stellar and that two of the credited rights holders worked in the company’s AI department.

“What emerges is a picture of a music publisher that wants to experiment with new music and new kinds of artists, and that likes to push the limits of the audience’s tolerance for artificial music and artificial artists,” wrote Karlsten.

In a statement, Stellar said: “The artist Jacub’s voice and parts of the music are generated with the help of AI as a tool in our creative process.”

[…]

Spotify does not require music to be labelled as AI-generated, but it has been cracking down on AI-made spam tracks, since every play lasting more than 30 seconds generates a royalty for the scammer behind it – and dilutes payments to legitimate artists.

Jacub is not the first AI artist to score a hit with audiences. A “band” called the Velvet Sundown amassed more than 1m streams on Spotify last year before it emerged the group was AI-generated, including its promotional images and backstory as well as the music. Its most popular song has now accumulated 4m streams on the platform.

[…]

Source: Partly AI-generated folk-pop hit barred from Sweden’s official charts | AI (artificial intelligence) | The Guardian

In other news, they have banned the use of synthesisers, DJs and Auto-Tune from the IFPI charts as well. Oh no, they didn’t. It will just take them a few decades to catch up again.

What Happened After Security Researchers Found 60 Flock Cameras Livestreaming to the Internet

A couple months ago, YouTuber Benn Jordan “found vulnerabilities in some of Flock’s license plate reader cameras,” reports 404 Media’s Jason Koebler. “He reached out to me to tell me he had learned that some of Flock’s Condor cameras were left live-streaming to the open internet.”

This led to a remarkable article in which Koebler confirmed the breach by visiting a Flock surveillance camera mounted on a California traffic signal. (“On my phone, I am watching myself in real time as the camera records and livestreams me — without any password or login — to the open internet… Hundreds of miles away, my colleagues are remotely watching me too through the exposed feed.”)

Flock left livestreams and administrator control panels for at least 60 of its AI-enabled Condor cameras around the country exposed to the open internet, where anyone could watch them, download 30 days’ worth of archived video, change settings, see log files, and run diagnostics. Unlike many of Flock’s cameras, which are designed to capture license plates as people drive by, Flock’s Condor cameras are pan-tilt-zoom (PTZ) cameras designed to record and track people, not vehicles. Condor cameras can be set to automatically zoom in on people’s faces… The exposure was initially discovered by YouTuber and technologist Benn Jordan and was shared with security researcher Jon “GainSec” Gaines, who recently found numerous vulnerabilities in several other models of Flock’s automated license plate reader (ALPR) cameras.
Jordan appeared this week as a guest on Koebler’s own YouTube channel, and released a video of his own about the experience, titled “We Hacked Flock Safety Cameras in under 30 Seconds.” (Thanks to Slashdot reader beadon for sharing the link.) Together, Jordan and 404 Media also created another video three weeks ago, titled “The Flock Camera Leak is Like Netflix for Stalkers”, which includes footage he says was “completely accessible at the time Flock Safety was telling cities that the devices are secure after they’re deployed.”

The video decries cities “too lazy to conduct their own security audit or research the efficacy versus risk,” but also calls weak security “an industry-wide problem.” Jordan explains in the video how he “very easily found the administration interfaces for dozens of Flock Safety cameras…” and what happened next:

None of the data or video footage was encrypted. There was no username or password required. These were all completely public-facing, for the world to see… Making any modification to the cameras is illegal, so I didn’t do this. But I had the ability to delete any of the video footage or evidence by simply pressing a button. I could see the paths where all of the evidence files were located on the file system…

During and after the process of conducting that research and making that video, I was visited by the police and had what I believed to be private investigators outside my home photographing me and my property and bothering my neighbors. Jon Gaines, aka GainSec, the brains behind most of this research, lost employment within 48 hours of the video being released. And the sad reality is that I don’t view these things as consequences or punishment for researching security vulnerabilities. I view these as consequences and punishment for doing it ethically and transparently.

I’ve been contacted by people on or communicating with civic councils who found my videos concerning, and they shared Flock Safety’s response with me. The company claimed that the devices in my video did not reflect the security standards of the ones being publicly deployed. The CEO even posted on LinkedIn and boasted about Flock Safety’s security policies. So, I formally and publicly offered to personally fund security research into Flock Safety’s deployed ecosystem. But the law prevents me from touching their live devices. So, all I needed was their permission so I wouldn’t get arrested. And I was even willing to let them supervise this research.

I got no response.

So instead, he read Flock’s official response to a security/surveillance industry research group — while standing in front of one of their security cameras, streaming his reading to the public internet.

“Might as well. It’s my tax dollars that paid for it.”

“‘Flock is committed to continuously improving security…’”

Source: What Happened After Security Researchers Found 60 Flock Cameras Livestreaming to the Internet | Slashdot

For more on why Flock cameras are problematic, read here

CD Projekt Takes Down VR Mod for Cyberpunk – Because It Was Paid

Yes, the TOS don’t allow commercial mods, which has pluses and minuses. So, yes, technically CD Projekt Red is in the right. However, it takes a lot of work and time to make some of these mods, and if you want to get paid for that work, that is your right. Just as much as it is your right not to buy it if you don’t like it. Whatever.

There are loads of paid external services that run on top of Amazon, PayPal, eBay, Discord and the like, and most AI products are built on top of OpenAI. It’s a valid (if risky, due to the dependency) way to create value for people.

It seems to me that the TOS are overextended, though. How can you legally dictate what someone will do with a product they bought? US law is pretty bizarre in that respect, just as companies can get away with banning reverse engineering and locking people into buying hugely overpriced repairs and replacement parts only from them. Maybe look at China to see how this kind of law kills innovation, and look at monopolies to see how it drives costs up and removes choice for consumers.

[…] Now that the dust has settled, I’m even more sorry to announce that we are leaving behind an adventure that so many of you deeply loved and enjoyed. CD PROJEKT S.A. decided that they would follow in Take-Two Interactive Software’s steps and issued a DMCA notice against me for the removal of the Cyberpunk 2077 VR mod.

At least they were a little more open about it, and I could get a reply both from their legal department and from the VP of business development. But in the end it amounted to the same iron-clad corpo logic: every little action that a company takes is in the name of money, but everything that modders do must be absolutely for free.

As usual they stretch the concept of “derivative work” until it’s paper-thin, as though a system that allows visualizing 40+ games in fully immersive 3D VR was somehow built making use of their intellectual property. And as usual they give absolutely zero f***s about how playing their game in VR made people happy, and they cannot just be grateful about the extra copies of the title they sold because of that—without ever having to pour money into producing an official conversion (no, they’re not planning to release their own VR port, in case you were wondering). […]

Source: Another one bites the dust | Patreon