Yep, That SpaceX Crew Capsule Was Definitely Destroyed During Failed Ground Test, Company Confirms

After weeks of speculation, SpaceX has finally admitted that a Crew Dragon capsule was destroyed during an April 20 test of the capsule’s abort thrusters. No cause was given for the anomaly, nor were any new details disclosed about possible delays to NASA’s languishing Commercial Crew Program.

Speaking to reporters at a NASA briefing held earlier this week, Hans Koenigsmann, the vice president of build and flight reliability at SpaceX, said the mishap is “certainly not great news,” in terms of the company’s plan to launch astronauts into space later this year, as CBS News reports. The purpose of the briefing was to discuss an upcoming cargo launch to the ISS, but the incident, in which a Crew Dragon capsule got torched just prior to the firing of launch-abort thrusters, dominated much of the discussion.

The mishap occurred at Cape Canaveral’s Landing Zone 1 on April 20 during static ground tests of the system’s boosters. The Crew Dragon was reportedly engulfed in flames, and thick orange-black smoke, which was probably toxic, could be seen for miles. Both NASA and SpaceX have been tight-lipped about the incident, but Koenigsmann shared some new information with reporters during the briefing.

Tests of the system’s smaller, maneuvering Draco thrusters were done earlier in the day without incident, he said. It was when the focus shifted to the system’s larger SuperDraco boosters—a series of eight thrusters tied to the abort system—that things went sideways.

“At the test stand, we powered up Dragon, it powered up as expected, we completed tests with the Draco thrusters—the smaller thrusters that are also on the cargo Dragon,” said Koenigsmann per CBS News. “And then just before we wanted to fire the SuperDracos there was an anomaly and the vehicle was destroyed.”

Source: Yep, That SpaceX Crew Capsule Was Definitely Destroyed During Failed Ground Test, Company Confirms

Kremlin signs total internet surveillance and censorship system into law, effective Nov 1st.

Russia’s internet iron curtain has been formally signed into law by President Putin. The nation’s internet service providers have until 1 November to ensure they comply.

The law will force traffic through government-controlled exchanges and eventually require the creation of a national domain name system.

The bill has been promoted as advancing Russian sovereignty and ensuring that Runet, Russia’s domestic internet, keeps functioning regardless of what happens elsewhere in the world. The government has claimed that “aggressive” US cybersecurity policies justify the move.

Control of exchanges is seen as an easy way for the Russian government to increase its control over what data its citizens can see, and what they can post. The Kremlin wants all data required by the network to be stored within Russian borders.

ISPs will only be allowed to connect to other ISPs, or peer, through approved exchanges. These exchanges will have to include government-supplied boxes which can block data traffic as required.

There have been widespread protests within the country against the law.

Source: Having a bad day? Be thankful you don’t work at a Russian ISP: Kremlin signs off Pootynet restrictions • The Register

Dark Net’s Wall Street Market Falls to Police

Police from around the world shut down the biggest active black market on the dark web this month, according to announcements from law enforcement agencies in the United States, Germany, and the Netherlands released on Friday.

Wall Street Market, as the black market site was known, was the target of a 1.5-year-long multinational investigation. Three Germans were arrested in Germany on April 23 and 24 for their alleged roles in creating and administering the site, which sold illegal drugs, documents, weapons, and data.

“WSM was one of the largest and most voluminous darknet marketplaces of all time,” FBI Special Agent Leroy Shelton wrote in the criminal complaint released on Friday.

[…]

Wall Street Market had 1.15 million customer accounts and 5,400 registered sellers, according to the U.S. Justice Department. However, don’t take those numbers as an accurate census: users are anonymous, sellers and buyers alike often create multiple accounts, and there’s no way to get a realistic count of the number of individuals active on a market like this.

A better way to understand the scale of a black market like this is to look at the actual money involved. Last month, Wall Street Market administrators stole around $11 million from user accounts, authorities say.

“An ‘exit scam’ was allegedly conducted last month when the WSM administrators took all of the virtual currency held in marketplace escrow and user accounts—believed by investigators to be approximately $11 million—and then diverted the money to their own accounts.”

Source: Dark Net’s Wall Street Market Falls to Police

Amazing AI Generates Entire Bodies of People Who Don’t Exist

A new deep learning algorithm can generate high-resolution, photorealistic images of people — faces, hair, outfits, and all — from scratch.

The AI-generated models are the most realistic we’ve encountered, and the tech will soon be licensed out to clothing companies and advertising agencies interested in whipping up photogenic models without paying for lights or a catering budget. At the same time, similar algorithms could be misused to undermine public trust in digital media.

[…]

In a video showing off the tech, the AI morphs and poses model after model as their outfits transform, bomber jackets turning into winter coats and dresses melting into graphic tees.

Specifically, the new algorithm is a Generative Adversarial Network (GAN). That’s the kind of AI typically used to churn out new imitations of something that exists in the real world, whether they be video game levels or images that look like hand-drawn caricatures.
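For readers unfamiliar with the setup, here is a minimal, purely illustrative GAN training loop in PyTorch. It learns a one-dimensional toy distribution rather than images; the model behind these renders has not been published, so the architecture, sizes, and hyperparameters below are placeholder choices, not the company’s method.

```python
# Minimal toy GAN sketch: a generator learns to imitate "real" data while
# a discriminator learns to tell real from fake. Illustrative only; the
# model in the article is a far larger, unpublished image GAN.
import torch
import torch.nn as nn

latent_dim = 8

# Generator: maps random noise to a (fake) data sample.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))

# Discriminator: scores how "real" a sample looks (1 = real, 0 = fake).
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # "Real" data: samples from N(3, 1), the distribution G should imitate.
    real = torch.randn(64, 1) + 3.0
    noise = torch.randn(64, latent_dim)
    fake = G(noise)

    # Train the discriminator to separate real samples from fakes.
    opt_D.zero_grad()
    loss_D = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_D.backward()
    opt_D.step()

    # Train the generator to fool the discriminator.
    opt_G.zero_grad()
    loss_G = bce(D(fake), torch.ones(64, 1))
    loss_G.backward()
    opt_G.step()

# After training, G(noise) should produce samples clustered near 3.0.
```

The same adversarial tug-of-war, scaled up to convolutional networks and huge photo datasets, is what produces the photorealistic bodies described above.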

Source: Amazing AI Generates Entire Bodies of People Who Don’t Exist

Security lapse exposed a Chinese smart city surveillance system

Smart cities are designed to make life easier for their residents: better traffic management, public transport that runs on time, and cameras keeping a watchful eye from above.

But what happens when that data leaks? One such database was open for weeks for anyone to look inside.

Security researcher John Wethington found a smart city database accessible from a web browser without a password. He passed details of the database to TechCrunch in an effort to get the data secured.

[…]

The system monitors the residents of at least two small housing communities in eastern Beijing, the largest of which is Liangmaqiao, known as the city’s embassy district. The system is made up of several data collection points, including cameras designed to collect facial recognition data.

The exposed data contains enough information to pinpoint where people went, when and for how long, allowing anyone with access to the data — including police — to build up a picture of a person’s day-to-day life.

A portion of the database containing facial recognition scans (Image: supplied)

The database recorded various facial details, such as whether a person’s eyes or mouth are open, whether they’re wearing sunglasses or a mask (common during periods of heavy smog), and whether the person is smiling or even has a beard.

The database also contained a subject’s approximate age as well as an “attractive” score, according to the database fields.
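As a rough illustration of what such records might look like, here is a hypothetical sketch of the per-detection fields the reporting describes. The field names and types are invented for this example; the actual schema of the exposed database was not published.

```python
# Hypothetical sketch of the kind of per-face record the reporting describes.
# Field names and types are invented; the real schema was not published.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class FaceRecord:
    captured_at: datetime         # when the camera logged the detection
    location: str                 # which collection point saw the person
    eyes_open: bool
    mouth_open: bool
    wearing_sunglasses: bool
    wearing_mask: bool            # common during periods of heavy smog
    smiling: bool
    has_beard: bool
    approximate_age: int
    attractiveness_score: float   # the "attractive" score noted in the fields
```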

But the capabilities of the system have a darker side, particularly given the complicated politics of China.

The system also uses facial recognition to detect and label ethnicities, such as “汉族” (Han Chinese, the country’s main ethnic group) and “维族” (Uyghur Muslims, an ethnic minority under persecution by Beijing).

While ethnicity data can help police identify suspects in an area even when they don’t have a name to match, it can just as easily be used for abuse.

The Chinese government has detained more than a million Uyghurs in internment camps in the past year, according to a United Nations human rights committee. It’s part of a massive crackdown by Beijing on the ethnic minority group. Just this week, details emerged of an app used by police to track Uyghur Muslims.

We also found that the customer’s system pulls in data from the police and uses that information to detect people of interest or criminal suspects, suggesting it may be a government customer.

Facial recognition scans would match against police records in real time (Image: supplied)

Each time a person was detected, the database would log a “warning” recording the date, time, and location, along with a corresponding note. Several records seen by TechCrunch include suspects’ names and national identification card numbers.

Source: Security lapse exposed a Chinese smart city surveillance system – TechCrunch