Companies ‘can sack workers for refusing to use fingerprint scanners’ in Australia

Businesses using fingerprint scanners to monitor their workforce can legally sack employees who refuse to hand over biometric information on privacy grounds, the Fair Work Commission has ruled.

The ruling, which will be appealed, was made in the case of Jeremy Lee, a Queensland sawmill worker who refused to comply with a new fingerprint scanning policy introduced at his work in Imbil, north of the Sunshine Coast, late last year.

Fingerprint scanning was used to monitor the clock-on and clock-off times of about 150 sawmill workers at two sites and was preferred to swipe cards because it prevented workers from fraudulently signing in on behalf of their colleagues to mask absences.

The company, Superior Wood, had no privacy policy covering workers and failed to comply with a requirement to properly notify individuals about how and why their data was being collected and used. The biometric data was stored on servers located off-site, in space leased from a third party.

Lee argued the business had never sought its workers’ consent to use fingerprint scanning, and feared his biometric data would be accessed by unknown groups and individuals.

“I am unwilling to consent to have my fingerprints scanned because I regard my biometric data as personal and private,” Lee wrote to his employer last November.

“Information technology companies gather as much information/data on people as they can.

“Whether they admit to it or not. (See Edward Snowden) Such information is used as currency between corporations.”

Lee was neither antagonistic nor belligerent in his refusals, according to evidence before the commission. He simply declined to have his fingerprints scanned and continued using a physical sign-in booklet to record his attendance.

He had not missed a shift in more than three years.

The employer warned him about his stance repeatedly, and claimed the fingerprint scanner did not actually record a fingerprint, but rather “a set of data measurements which is processed via an algorithm”. The employer told Lee there was no way the data could be “converted or used as a finger print”, and that it would only be used to link his payroll number to his clock-on and clock-off times. It said the fingerprint scanners were also needed for workplace safety, to accurately identify which workers were on site in the event of an accident.

Lee was given a final warning in January, and responded that he valued his job a “great deal” and wanted to find an alternative way to record his attendance.

“I would love to continue to work for Superior Wood as it is a good, reliable place to work,” he wrote to his employer. “However, I do not consent to my biometric data being taken. The reason for writing this letter is to impress upon you that I am in earnest and hope there is a way we can negotiate a satisfactory outcome.”

Lee was sacked in February, and lodged an unfair dismissal claim in the Fair Work Commission.

Source: Companies ‘can sack workers for refusing to use fingerprint scanners’ | World news | The Guardian

You only have one set of fingerprints – that’s the problem with biometrics: they can’t be changed, so you really really don’t want them stolen from you
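For what it’s worth, the employer’s claim that the scanner stores “a set of data measurements which is processed via an algorithm” rather than an image describes a biometric template. Here is a minimal sketch of that idea in Python; the feature values, payroll number and exact-match comparison are all invented for illustration (real matchers compare templates approximately), and it also shows why Lee’s objection still stands: unlike a password, the underlying biometric can’t be rotated if the stored data leaks.

```python
# Toy illustration, not Superior Wood's actual system: the scanner keeps only
# a one-way hash of fingerprint measurements, linked to a payroll number and
# clock-on/clock-off times. All values below are made up.
import hashlib
from datetime import datetime

def make_template(measurements: list[float]) -> str:
    """Reduce raw measurements to a hash; the print itself is not stored and
    cannot be reconstructed from this string."""
    raw = ",".join(f"{m:.4f}" for m in measurements)
    return hashlib.sha256(raw.encode()).hexdigest()

# Hypothetical enrolment: payroll number -> stored template
enrolled = {"EMP-1042": make_template([12.3, 7.8, 101.5, 0.42])}

def clock_event(payroll_no: str, scan: list[float], log: list) -> bool:
    """Record a clock-on/clock-off time only if the fresh scan matches the
    enrolled template for that payroll number."""
    if enrolled.get(payroll_no) == make_template(scan):
        log.append((payroll_no, datetime.now().isoformat()))
        return True
    return False

attendance_log: list = []
print(clock_event("EMP-1042", [12.3, 7.8, 101.5, 0.42], attendance_log))  # True
print(attendance_log)
```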

Google wants to spy on you and then report on you

suggesting, automatically implementing, or both suggesting and automatically implementing, one or more household policies to be implemented within a household environment. The household policies include one or more input criteria that is derivable from at least one smart device within the household environment, the one or more input criteria relating to a characteristic of the household environment, a characteristic of one or more occupants of the household, or both. The household policies also include one or more outputs to be provided based upon the one or more input criteria.

https://patents.justia.com/patent/10114351

Source: Patent Images

E.g. page 16, figure 25 – monitor TV watching patterns and report on you

page 20, figure 33 – detect time brushing teeth and report on you
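To make the patent’s idea concrete, here is a toy sketch of what such a “household policy” could look like. The device readings, thresholds and policy wording below are invented for illustration, not taken from the patent.

```python
# Toy rule engine in the spirit of the patent: input criteria derived from
# smart-device data, and an output (here, a report) when a criterion holds.
# Devices, thresholds and policies are all hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Policy:
    name: str
    criterion: Callable[[dict], bool]   # derived from smart-device readings
    output: str                         # action taken when the criterion holds

policies = [
    Policy("tv_time", lambda d: d.get("tv_hours_today", 0) > 3,
           "Report excessive TV watching to the account holder"),
    Policy("toothbrush", lambda d: d.get("brushing_seconds", 0) < 120,
           "Remind the occupant to brush for the full two minutes"),
]

# Hypothetical readings reported by devices in the home
household_state = {"tv_hours_today": 4.5, "brushing_seconds": 45}

for policy in policies:
    if policy.criterion(household_state):
        print(f"[{policy.name}] {policy.output}")
```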

Do you trust Google to be your parent?!

Google patents a way for your smart devices to spy on you, serve you ads, even if your privacy settings say no

In some embodiments, the private network may include at least one first device that captures information about its surrounding environment, such as data about the people and/or objects in the environment. The first device may receive a set of potential content sent from a server external to the private network. The first device may select at least one piece of content to present from the set of potential content based in part on the people/object data and/or a score assigned by the server to each piece of content. The private network may also include at least one second device that receives the captured people/object data sent from the first device. The second device may also receive a set of potential content sent from the server external to the private network. The second device may select at least one piece of content to present from the set of potential content based in part on the people/object data sent from the first device and/or a score assigned by the server to each piece of content. Using the private network to communicate the people/object data between devices may preserve the privacy of the user since the data is not sent to the external server. Further, using the obtained people/object data to select content enables more personalized content to be chosen.

[…]

 

Further, although not shown in this particular way, in some embodiments, the client device 134 may collect people/object data 136 using one or more sensors, as discussed above. Also, as previously discussed, the raw people/object data 136 may be processed by the sensing device 138, the client device 134, and/or a processing device 140 depending on the implementation. The people/object data 136 may include the data described above regarding FIG. 7 that may aid in recognizing objects, people, and/or patterns, as well as determining user preferences, mood, and so forth.

[0144] After the client device 134 is in possession of the people/object data 136, the client device 134 may use the classifier 144 to score each piece of content 132. In some embodiments, the classifier 144 may combine at least the people/object data 136, the scores provided by the server 67 for the content 132, or both, to determine a final score for each piece of content 132 (process block 216), which will be discussed in more detail below.

[0145] The client device 134 may select at least one piece of content 132 to display based on the scores (process block 218). That is, the client device 134 may select the content 132 with the highest score as determined by the classifier 144 to display. However, in some embodiments, where none of the content 132 generate a score above a threshold amount, no content 132 may be selected. In those embodiments, the client device 134 may not present any content 132. However, when at least one item of content 132 scores above the threshold amount and is selected, then the client device 134 may communicate the selected content 132 to a user of the client device 134 (process block 220) and track user interaction with the content 132 (process block 222). It should be noted that when more than one item of content 132 score above the threshold amount, then the item of content 132 with the highest score may be selected. The client device 134 may use the tracked user interaction and conversions to continuously train the classifier 144 to ensure that the classifier 144 stays up to date with the latest user preferences.

[0146] It should be noted that, in some embodiments, the processing device 140 may receive the content 132 from the server 67 instead of, or in addition to, the client device 134. In embodiments where the processing device 140 receives the content 132, the processing device 140 may perform the classification of the content 132 using a classifier 144 similar to the client device 134 and the processing device 140 may select the content 132 with the highest score as determined by the classifier 144. Once selected, the processing device 140 may send the selected content 132 to the client device 134, which may communicate the selected content 132 to a user.

[…]

The process 230 may include training one or more models of the classifier 144 with people/object data 136, locale 146, demographics 148, search history 150, scores from the server 67, labels 145, and so forth. As previously discussed, the classifier 144 may include a support vector machine (SVM) that uses supervised learning models to classify the content 132 into one of two groups (e.g., binary classification) based on recognized patterns using the people/object data 136, locale 146, demographics 148, search history 150, scores from the server 67, and the labels 145 for the two groups of “show” or “don’t show.”

 

Source: US20160260135A1 – Privacy-aware personalized content for the smart home – Google Patents

 

They have thought up around 140 ways that this can be used…
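As a rough illustration of the selection loop the excerpt describes (blend a server-provided score with locally held “people/object” signals, show the top item only if it clears a threshold, and log interactions for retraining), here is a sketch with invented weights, features and threshold. It is not the patent’s actual classifier, which it describes as an SVM.

```python
# Sketch of the client-side selection loop described in the patent excerpt.
# Weights, features and the threshold are invented for illustration only.
THRESHOLD = 0.5

def final_score(item: dict, people_object_data: dict) -> float:
    """Blend the server's score for an item with a local relevance signal."""
    local_relevance = 1.0 if item["topic"] in people_object_data["interests"] else 0.0
    return 0.6 * item["server_score"] + 0.4 * local_relevance

def select_content(candidates: list[dict], people_object_data: dict):
    """Return the highest-scoring item, or None if nothing clears the threshold."""
    scored = [(final_score(c, people_object_data), c) for c in candidates]
    best_score, best_item = max(scored, key=lambda pair: pair[0])
    return best_item if best_score >= THRESHOLD else None

candidates = [
    {"id": 1, "topic": "cooking", "server_score": 0.4},
    {"id": 2, "topic": "camping", "server_score": 0.7},
]
local_data = {"interests": {"cooking"}}   # stays on the device, never sent to the server
interaction_log = []                      # would feed retraining of the classifier

chosen = select_content(candidates, local_data)
if chosen is not None:
    interaction_log.append({"shown": chosen["id"], "clicked": False})
print(chosen, interaction_log)
```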

Dutch Gov sees Office 365 spying on you, sending your texts to US servers without recourse or knowledge

The Dutch government’s report shows that the telemetry function of all Office 365 and Office ProPlus applications forwards, among other things, email subject lines and words/sentences written with the help of the spell checker or translation function to systems in the United States.

This goes so far that, if a user presses the backspace key several times in a row, the telemetry function collects and forwards both the sentence before the correction and the one after it. Users are not informed of this and have no way to stop this data collection or to inspect the collected data.

The Dutch government carried out this investigation together with Privacy Company. “Microsoft may not store this temporary, functional data unless retention is strictly necessary, for example for security purposes,” writes Sjoera Nas of Privacy Company in a blog post.

Source: Je wordt bespied door Office 365-applicaties – Webwereld

Facebook files patent to find out more about you by looking at the background items in your pictures and pictures you are tagged in

An online system predicts household features of a user, e.g., household size and demographic composition, based on image data of the user, e.g., profile photos, photos posted by the user and photos posted by other users socially connected with the user, and textual data in the user’s profile that suggests relationships among individuals shown in the image data of the user. The online system applies one or more models trained using deep learning techniques to generate the predictions. For example, a trained image analysis model identifies each individual depicted in the photos of the user; a trained text analysis model derives household member relationship information from the user’s profile data and tags associated with the photos. The online system uses the predictions to build more information about the user and his/her household in the online system, and provide improved and targeted content delivery to the user and the user’s household.

Source: United States Patent Application: 0180332140

Microsoft slips ads into Windows 10 Mail client – then U-turns so hard, it warps fabric of reality – Windows is an OS, not a service!

Microsoft was, and maybe still is, considering injecting targeted adverts into the Windows 10 Mail app.

The ads would appear at the top of inboxes of folks using the client without a paid-for Office 365 subscription, and the advertising would be tailored to their interests. Revenues from the banners were hoped to help keep Microsoft afloat, which banked just $16bn in profit in its latest financial year.

According to Aggiornamenti Lumia on Friday, folks using Windows Insider fast-track builds of Mail and Calendar, specifically version 11605.11029.20059.0, may have seen the ads in among their messages, depending on their location. Users in Brazil, Canada, Australia, and India were chosen as guinea pigs for this experiment.

A now-deleted FAQ on the Office.com website about the “feature” explained the advertising space would be sold off to help Microsoft “provide, support, and improve some of our products,” just like Gmail and Yahoo! Mail display ads.

Also, the advertising is targeted, by monitoring what you get up to with apps and web browsing, and using demographic information you disclose:

Windows generates a unique advertising ID for each user on a device. When the advertising ID is enabled, both Microsoft apps and third-party apps can access and use the advertising ID in much the same way that websites can access and use a unique identifier stored in a cookie. Mail uses this ID to provide more relevant advertising to you.

You have full control of Windows and Mail having access to this information and can turn off interest-based advertising at any time. If you turn off interest-based advertising, you will still see ads but they will no longer be as relevant to your interests.

Microsoft does not use your personal information, like the content of your email, calendar, or contacts, to target you for ads. We do not use the content in your mailbox or in the Mail app.

You can also close an ad banner by clicking on its trash can icon, or get rid of them completely by coughing up cash:

You can permanently remove ads by buying an Office 365 Home or Office 365 Personal subscription.

Here’s where reality is thrown into a spin, literally. Microsoft PR supremo Frank Shaw said a few hours ago, after the ads were spotted:

This was an experimental feature that was never intended to be tested broadly and it is being turned off.

Never intended to be tested broadly, and was shut down immediately, yet until it was clocked, had an official FAQ for it on Office.com, which was also hastily nuked from orbit, and was rolled out in highly populated nations. Talk about hand caught in the cookie jar.

Source: Microsoft slips ads into Windows 10 Mail client – then U-turns so hard, it warps fabric of reality • The Register

Couple Who Ran retro ROM Site (with games you can’t buy any more) to Pay Nintendo $12 Million

Nintendo has won a lawsuit seeking to take two large retro-game ROM sites offline, on charges of copyright infringement. The judgement, made public today, ruled in Nintendo’s favour and states that the owners of the sites LoveROMS.com and LoveRETRO.co will have to pay a total settlement of $12 million to Nintendo. The complaint was originally filed by the company in an Arizona federal court in July, and has since led to a swift wave of self-censorship by popular retro and emulator ROM sites, which fear they may be sued by Nintendo as well.

LoveROMS.com and LoveRETRO.co were the joint property of couple Jacob and Cristian Mathias, before Nintendo sued them for what it called “brazen and mass-scale infringement of Nintendo’s intellectual property rights.” The suit never went to court; instead, the couple sought to settle after accepting the charge of direct and indirect copyright infringement. TorrentFreak reports that a permanent injunction, prohibiting them from using, sharing, or distributing Nintendo ROMs or other materials again in the future, has been included in the settlement. Additionally, all games, game files, and emulators previously on the site and in their custody must be handed over to the Japanese game developer, along with a $12.23 million settlement figure. It is unlikely, as TorrentFreak has reported, that the couple will be obligated to pay the full figure; a smaller settlement has likely been negotiated in private.

Instead, the purpose of the enormous settlement amount is to act as a warning or deterrent to other ROM and emulator sites surviving on the internet. And it’s working.

Motherboard previously reported on the way in which Nintendo’s legal crusade against retro ROM and emulator sites is swiftly eroding a large chunk of retro gaming. The impact of this campaign on video games as a whole is potentially catastrophic. Not all games have been preserved adequately by game publishers and developers. Some are locked down to specific regions and haven’t ever been widely accessible.

The accessibility of video games and the gaming industry has always been defined and limited by economic boundaries. There are a multitude of reasons why retro games can’t be easily or reliably accessed by prospective players, and by wiping out ROM sites Nintendo is erasing huge chunks of gaming history. Limiting the accessibility of old retro titles to this extent will undoubtedly affect the future of video games, with classic titles that shaped modern games and gaming development being kept under lock and key by the monolithic hand of powerful game developers.

Since the filing of the suit in July EmuParadise, a haven for retro games and emulator titles, has shut down. Many other sites have followed suit.

Source: Couple Who Ran ROM Site to Pay Nintendo $12 Million – Motherboard

Wow, that’s a surefire way to piss off your fans, Nintendo!

How to Quit Google Completely

Despite all the convenience and quality of Google’s sprawling ecosystem, some users are fed up with the fishy privacy policies the company has recently implemented in Gmail, Chrome, and other services. To its credit, Google has made good changes in response to user feedback, but that doesn’t diminish the company’s looming shadow over the internet at large. If you’re ready to ditch Google, or even just reduce its presence in your digital life, this guide is here to help.

Since Google owns some of the best and most-used apps, websites, and internet services, making a clean break is difficult—but not impossible. We’re going to take a look at how to leave the most popular Google services behind, and how to keep Google from tracking your data. We’ve also spent serious time researching and testing great alternatives to Google’s offerings, so you can leave the Big G without having to buy new devices or swear fealty to another major corporation.

Source: How to Quit Google Completely

Chinese ‘gait recognition’ tech IDs people by how they walk

Chinese authorities have begun deploying a new surveillance tool: “gait recognition” software that uses people’s body shapes and how they walk to identify them, even when their faces are hidden from cameras.

Already used by police on the streets of Beijing and Shanghai, “gait recognition” is part of a push across China to develop artificial-intelligence and data-driven surveillance that is raising concern about how far the technology will go.

Huang Yongzhen, the CEO of Watrix, said that its system can identify people from up to 50 meters (165 feet) away, even with their back turned or face covered. This can fill a gap in facial recognition, which needs close-up, high-resolution images of a person’s face to work.

“You don’t need people’s cooperation for us to be able to recognize their identity,” Huang said in an interview in his Beijing office. “Gait analysis can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body.”

Watrix announced last month that it had raised 100 million yuan ($14.5 million) to accelerate the development and sale of its gait recognition technology, according to Chinese media reports.

Chinese police are using facial recognition to identify people in crowds and nab jaywalkers, and are developing an integrated national system of surveillance camera data. Not everyone is comfortable with gait recognition’s use.

Source: Chinese ‘gait recognition’ tech IDs people by how they walk

U.S. Indicts Chinese Hacker-Spies in Conspiracy to Steal Aerospace Secrets

The U.S. Justice Department has charged two Chinese intelligence officers, six hackers, and two aerospace company insiders in a sweeping conspiracy to steal confidential aerospace technology from U.S. and French companies.

For more than five years, two Chinese Ministry of State Security (MSS) spies are said to have run a team of hackers focusing on the theft of designs for a turbofan engine used in U.S. and European commercial airliners, according to an unsealed indictment (below) dated October 25. In a statement, the DOJ said a Chinese state-owned aerospace company was simultaneously working to develop a comparable engine.

“The threat posed by Chinese government-sponsored hacking activity is real and relentless,” FBI Special Agent in Charge John Brown of San Diego said in a statement. “Today, the Federal Bureau of Investigation, with the assistance of our private sector, international and U.S. government partners, is sending a strong message to the Chinese government and other foreign governments involved in hacking activities.”

The MSS officers involved were identified as Zha Rong, a division director in the Jiangsu Province regional department (JSSD), and Chai Meng, a JSSD section chief.

At the direction of the MSS officers, the hackers allegedly infiltrated a number of U.S. aerospace companies, including California-based Capstone Turbine, among others in Arizona, Massachusetts, and Oregon, the DOJ said. The officers are also said to have recruited at least two Chinese employees of a French aerospace manufacturer—insiders who allegedly aided the conspiracy by, among other criminal acts, installing the remote access trojan Sakula onto company computers.

Source: U.S. Indicts Chinese Hacker-Spies in Conspiracy to Steal Aerospace Secrets

Now Apps Can Track You Even After You Uninstall Them

If it seems as though the app you deleted last week is suddenly popping up everywhere, it may not be mere coincidence. Companies that cater to app makers have found ways to game both iOS and Android, enabling them to figure out which users have uninstalled a given piece of software lately—and making it easy to pelt the departed with ads aimed at winning them back.

Adjust, AppsFlyer, MoEngage, Localytics, and CleverTap are among the companies that offer uninstall trackers, usually as part of a broader set of developer tools. Their customers include T-Mobile US, Spotify Technology, and Yelp. (And Bloomberg Businessweek parent Bloomberg LP, which uses Localytics.) Critics say they’re a fresh reason to reassess online privacy rights and limit what companies can do with user data. “Most tech companies are not giving people nuanced privacy choices, if they give them choices at all,” says Jeremy Gillula, tech policy director at the Electronic Frontier Foundation, a privacy advocate.

Some providers say these tracking tools are meant to measure user reaction to app updates and other changes. Jude McColgan, chief executive officer of Boston’s Localytics, says he hasn’t seen clients use the technology to target former users with ads. Ehren Maedge, vice president for marketing and sales at MoEngage Inc. in San Francisco, says it’s up to the app makers not to do so. “The dialogue is between our customers and their end users,” he says. “If they violate users’ trust, it’s not going to go well for them.” Adjust, AppsFlyer, and CleverTap didn’t respond to requests for comment, nor did T-Mobile, Spotify, or Yelp.

Uninstall tracking exploits a core element of Apple Inc.’s and Google’s mobile operating systems: push notifications. Developers have always been able to use so-called silent push notifications to ping installed apps at regular intervals without alerting the user—to refresh an inbox or social media feed while the app is running in the background, for example. But if the app doesn’t ping the developer back, the app is logged as uninstalled, and the uninstall tracking tools add those changes to the file associated with the given mobile device’s unique advertising ID, details that make it easy to identify just who’s holding the phone and advertise the app to them wherever they go.
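A sketch of that uninstall-detection logic, not any vendor’s real code, just the mechanism as described: send a silent push, treat a missing reply as an uninstall, and keep the device’s advertising ID around for win-back ads. The tokens and IDs below are made up.

```python
# Hypothetical uninstall tracker: silent pushes go out to every known install;
# installs that no longer answer are logged as uninstalled and their
# advertising IDs are added to a "win-back" ad audience.
def app_pinged_back(push_token: str) -> bool:
    """Stand-in for sending a silent push and waiting for the app's reply;
    an uninstalled app never answers."""
    uninstalled_tokens = {"token-B"}   # pretend this install is gone
    return push_token not in uninstalled_tokens

installs = [
    {"push_token": "token-A", "advertising_id": "ad-id-111"},
    {"push_token": "token-B", "advertising_id": "ad-id-222"},
]

winback_audience = []
for device in installs:
    if not app_pinged_back(device["push_token"]):
        # No reply to the silent ping: log the uninstall and keep the
        # advertising ID so ads can follow this user into other apps.
        winback_audience.append(device["advertising_id"])

print(winback_audience)   # ['ad-id-222']
```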

The tools violate Apple and Google policies against using silent push notifications to build advertising audiences, says Alex Austin, CEO of Branch Metrics Inc., which makes software for developers but chose not to create an uninstall tracker. “It’s just generally sketchy to track people around the internet after they’ve opted out of using your product,” he says, adding that he expects Apple and Google to crack down on the practice soon. Apple and Google didn’t respond to requests for comment.

Source: Now Apps Can Track You Even After You Uninstall Them – Bloomberg

Oxford study claims data harvesting among Android apps is “out of control”

It’s no secret that mobile apps harvest user data and share it with other companies, but the true extent of this practice may come as a surprise. In a new study carried out by researchers from Oxford University, it’s revealed that almost 90 percent of free apps on the Google Play store share data with Alphabet.

The researchers, who analyzed 959,000 apps from the US and UK Google Play stores, said data harvesting and sharing by mobile apps was now “out of control.”

“We find that most apps contain third party tracking, and the distribution of trackers is long-tailed with several highly dominant trackers accounting for a large portion of the coverage,” reads the report.

It’s revealed that most of the apps, 88.4 percent, could share data with companies owned by Google parent Alphabet. Next came a firm that’s no stranger to data sharing controversies, Facebook (42.5 percent), followed by Twitter (33.8 percent), Verizon (26.27 percent), Microsoft (22.75 percent), and Amazon (17.91 percent).

According to The Financial Times, which first reported the research, information shared by these third-party apps can include age, gender, location, and information about a user’s other installed apps. The data “enables construction of detailed profiles about individuals, which could include inferences about shopping habits, socio-economic class or likely political opinions.”

Big firms then use the data for a variety of purposes, such as credit scoring and for targeting political messages, but its main use is often ad targeting. Not surprising, given that revenue from online advertising is now over $59 billion per year.

According to the research, the average app transfers data to five tracker companies, which pass the data on to larger firms. The biggest culprits are news apps and those aimed at children, both of which tend to have the most third-party trackers associated with them.

Source: New study claims data harvesting among Android apps is “out of control” – TechSpot

SIM Cards That Force Your Mobile Data Through Tor Are Coming

It’s increasingly difficult to expect privacy when you’re browsing online, so a non-profit in the UK is working to build the power of Tor’s anonymity network right into the heart of your smartphone.

Brass Horn Communications is experimenting with all sorts of ways to improve Tor’s usability for UK residents. The Tor browser bundle for PCs can help shield your IP address from snoopers and data-collection giants. It’s not perfect and people using it for highly illegal activity can still get caught, but Tor’s system of sending your data through the various nodes on its network to anonymize user activity works for most people. It can help users surf the full web in countries with restrictive firewalls and simply make the average Joe feel like they have more privacy. But it’s prone to user error, especially on mobile devices. Brass Horn hopes to change that.

Brass Horn’s founder, Gareth Llewelyn, told Motherboard his organization is “about sticking a middle finger up to mobile filtering, mass surveillance.” Llewelyn has been unnerved by the UK’s relentless drive to push through legislation that enables surveillance and undermines encryption. Along with his efforts to build out more Tor nodes in the UK to increase its notoriously slow speeds, Llewelyn is now beta-testing a SIM card that will automatically route your data through Tor and save people the trouble of accidentally browsing unprotected.

Currently, mobile users’ primary option is to use the Tor browser that’s still in alpha-release and couple it with software called Orbot to funnel your app activity through the network. Only apps that have a proxy feature, like Twitter, are compatible. It’s also only available for Android users.

You’ll still need Orbot installed on your phone to use Brass Horn’s SIM card and the whole idea is that you won’t be able to get online without running on the Tor network. There’s some minor setup that the organization walks you through and from that point on, you’ll apparently never accidentally find yourself online without the privacy protections that Tor provides.

In an email to Gizmodo, Llewelyn said that he does not recommend using the card on a device with dual-SIMs. He said the whole point of the project is that a user “cannot accidentally send packets via Clearnet, this is to protect one’s privacy, anonymity and/or protect against NITs etc, if one were to use a dual SIM phone it would negate the failsafe and would not be advisable.” But if a user so desired, they could go with a dual-SIM setup.

You’re also unprotected if you end up on WiFi, but in general, this is a way for journalists, activists, and rightly cautious users to know they’re always protected.

The SIM acts as a provider and Brass Horn essentially functions as a mobile virtual network operator that piggybacks on other networks. The site for Brass Horn’s Onion3G service claims it’s a safer mobile provider because it only issues “private IP addresses to remote endpoints which if ‘leaked’ won’t identify you or Brass Horn Communications as your ISP.” It costs £2.00 per month and £0.025 per megabyte transferred over the network.

A spokesperson for the Tor Project told Gizmodo that it hasn’t been involved in this project and that protecting mobile data can be difficult. “This looks like an interesting and creative way to approach that, but it still requires that you put a lot of trust into your mobile provider in ensuring that no leaks happen,” they said.

Info on joining the beta is available here and Brass Horn expects to make its SIM card available to the general public in the UK next year. Most people should wait until there’s some independent research done on the service, but it’s an intriguing idea that could provide a model for other countries.

Source: SIM Cards That Force Your Mobile Data Through Tor Are Coming

Facebook, Google sued for ‘secretly’ slurping people’s whereabouts – while Feds lap it up

Facebook and Google are being sued in two proposed class-action lawsuits for allegedly deceptively gathering location data on netizens who thought they had opted out of such cyber-stalking.

The legal challenges stem from revelations earlier this year that even after users actively turn off “location history” on their smartphones, their location is still gathered, stored, and exploited to sling adverts.

Both companies use weasel words in their support pages to continue to gather the valuable data while seemingly giving users the option to opt out – and that “deception” is at the heart of both lawsuits.

In the first, Facebook user Brett Heeger claims the antisocial network is misleading folks by providing the option to stop the gathering and storing of their location data but in reality it continues to grab the information and add it to a “Location History” feature that it then uses for targeted advertising.

“Facebook misleads its users by offering them the option to restrict Facebook from tracking, logging and storing their private location information, but then continuing to track, log, and store that location information regardless of users’ choices,” the lawsuit, filed in California, USA, states. “In fact, Facebook secretly tracks, logs and stores location data for all of its users – including those who have sought to limit the information about their locations.”

This action is “deceptive” and offers users a “false sense of security,” the lawsuit alleges. “Facebook’s false assurances are intended to make users feel comfortable continuing to use Facebook and share their personal information so that Facebook can continue to be profitable, at the expense of user privacy… Advertisers pay Facebook to place advertisements because Facebook is so effective at using location information to target advertisement to consumers.”

And over to you, Google

In the second lawsuit, also filed in Cali, three people – Leslie Lee of Wyoming and Colorado residents Stacy Smedley and Fredrick Davis – make the same claim: that Google is deceiving smartphone users by giving them the option to “pause” the gathering of your location data through a setting called “Location History.”

In reality, however, Google continues to gather location data through its two most popular apps – Search and Maps – even when you actively choose to turn off location data. Instead, users have to go to a separate setting called “Web and App Activity” to really turn the gathering off. There is no mention of location data within that setting and nowhere does Google refer people to that setting in order to really stop location tracking.

As such, Google is engaged in a “deliberate, deceptive practice to collect personal information from which they can generate millions of dollars in revenue by covertly recording contemporaneous location data about Android and iPhone mobile phone users who are using Google Maps or other Google applications and functionalities, but who have specifically opted out of such tracking,” the lawsuit alleges.

Both legal salvos hope to become class-action lawsuits with jury trials, so potentially millions of other affected users will be able to join the action and so propel the case forward. The lawsuits seek compensation and damages as well as injunctions preventing both companies from gathering such data without gaining the explicit consent of users.

Meanwhile at the other end of the scale, the ability for the companies to constantly gather user location data has led to them being targeted by law enforcement in an effort to solve crimes.

Warrant required

Back in June, the US Supreme Court made a landmark ruling about location data, requiring cops and FBI agents to get a warrant before accessing such records from mobile phone operators.

But it is not clear which hurdles or parameters need to be met before a court should sign off on such a warrant, leading to an increasing number of cases where the Feds have provided times, dates, and rough geographic locations and asked Google, Facebook, Snapchat, and others, to provide the data of everyone who was in the vicinity at the time.

This so-called “reverse location” order has many civil liberties groups concerned because it effectively exposes innocent individuals’ personal data to the authorities simply because they were in the same rough area where a crime was carried out.

[…]

Leaky apps

And if all that wasn’t bad enough, this week a paper [PDF] by eggheads at the University of Oxford in the UK who studied the source code of just under one million apps found that Google and Facebook were top of the list when it came to gathering data on users from third parties.

Google parent company Alphabet receives user data from an incredible 88 per cent of apps on the market. Often this information was accumulated through third parties and included information like age, gender and location. The data “enables construction of detailed profiles about individuals, which could include inferences about shopping habits, socio-economic class or likely political opinions,” the paper revealed.

Facebook received data from 43 per cent of the apps, followed by Twitter with 34 per cent. Mobile operator Verizon – renowned for its “super cookie” tracker – gets information from 26 per cent of apps; Microsoft 23 per cent; and Amazon 18 per cent.

Source: Facebook, Google sued for ‘secretly’ slurping people’s whereabouts – while Feds lap it up • The Register

Star Wars: KOTOR Fan Remake Shutting Down After Cease And Desist From Lucasfilm

Back in 2016, an ambitious group of fans began work on an Unreal Engine 4 “reboot” of role-playing, light-sabering classic Star Wars: Knights of the Old Republic called Apeiron. The project has made impressive progress since then, but it emitted a tragic Wilhelm scream this week when Lucasfilm lawyers zapped it out of existence.

As is often the case with ambitious fan projects, Apeiron received a cease-and-desist letter from lawyers representing the series its team was trying to pay homage to. Apeiron’s developer, Poem Studios, took to Twitter to share the news. “After a few days, I’ve exhausted my options to keep it [Apeiron] afloat; we knew this day was a possibility. I’m sorry and may the force be with you,” Poem wrote alongside a screenshot of a letter purporting to be from Lucasfilm.

“Notwithstanding Poem Studios affection and enthusiasm for the Star Wars franchise and the original KOTOR game, we must object to any unlicensed use of Lucasfilm intellectual property,” reads Lucasfilm’s letter. It goes on to call Apeiron’s use of Star Wars characters, artwork, and images on its website and social media “infringing” and demands that 1) Star Wars materials are removed, 2) the Apeiron team ceases development and destroys its code, and 3) they don’t use any Lucasfilm properties in future games.

Source: Star Wars: KOTOR Fan Remake Shutting Down After Cease And Desist From Lucasfilm

What a bunch of dicks at Lucasfilm.

EU hijacking: self-driving car data will be copyrighted…by the manufacturer – not to be released by drivers / engineers / researchers / mechanics

Today, the EU held a routine vote on regulations for self-driving cars, when something decidedly out of the ordinary happened…

The autonomous vehicle rules contained a clause that affirmed that “data generated by autonomous transport are automatically generated and are by nature not creative, thus making copyright protection or the right on databases inapplicable.”

This is pretty inoffensive stuff. Copyright protects creative work, not factual data, and the telemetry generated by your car — self-driving or not — is not copyrighted.

But just before the vote, members of the European Peoples’ Party (the same bloc that pushed through the catastrophic new Copyright Directive) stopped the proceedings with a rare “roll call” and voted down the clause.

In other words, they’ve snuck in a space for the telemetry generated by autonomous vehicles to become someone’s property. This is data that we will need to evaluate the safety of autonomous vehicles, to fine-tune their performance, to ensure that they are working as the manufacturer claims — data that will not be public domain (as copyright law dictates), but will instead be someone’s exclusive purview, to release or withhold as they see fit.

Who will own this data? It’s unlikely that it will be the owners of the vehicles. Just look at the data generated by farmers who own John Deere tractors. These tractors create a wealth of soil data, thanks to humidity sensors, location sensors and torque sensors — a centimeter-accurate grid of soil conditions in the farmer’s own field.

But all of that data is confiscated by John Deere, locked up behind the company’s notorious DRM and only made available in fragmentary form to the farmer who generated it (it comes bundled with the app that you get if you buy Monsanto seed) — meanwhile, the John Deere company aggregates the data for sale into the crop futures market.

It’s already the case that most auto manufacturers use license agreements and DRM to lock up your car so that you can’t fix it yourself or take it to an independent service center. The aggregated data from millions of self-driving cars across the EU aren’t just useful to public safety analysts, consumer rights advocates, security researchers and reviewers (who would benefit from this data living in the public domain) — it is also a potential gold-mine for car manufacturers who could sell it to insurers, market researchers and other deep-pocketed corporate interests who can profit by hiding that data from the public who generate it and who must share their cities and streets with high-speed killer robots.

Source: EU hijacking: self-driving car data will be copyrighted…by the manufacturer / Boing Boing

Ancestry Sites Could Soon Expose Nearly Anyone’s Identity, Researchers Say

Genetic testing has helped plenty of people gain insight into their ancestry, and some services even help users find their long-lost relatives. But a new study published this week in Science suggests that the information uploaded to these services can be used to figure out your identity, regardless of whether you volunteered your DNA in the first place.

The researchers behind the study were inspired by the recent case of the alleged Golden State Killer.

Earlier this year, Sacramento police arrested 72-year-old Joseph James DeAngelo for a wave of rapes and murders allegedly committed by DeAngelo in the 1970s and 1980s. And they claimed to have identified DeAngelo with the help of genealogy databases.

Traditional forensic investigation relies on matching certain snippets of DNA, called short tandem repeats, to a potential suspect. But these snippets only allow police to identify a person or their close relatives in a heavily regulated database. Thanks to new technology, the investigators in the Golden State Killer case isolated the genetic material that’s now collected by consumer genetic testing companies from the suspected killer’s DNA left behind at a crime scene. Then they searched for DNA matches within these public databases.

This information, coupled with other historical records, such as newspaper obituaries, helped investigators create a family tree of the suspect’s ancestors and other relatives. After zeroing in on potential suspects, including DeAngelo, the investigators collected a fresh DNA sample from DeAngelo—one that matched the crime scene DNA perfectly.

But while the detective work used to uncover DeAngelo’s alleged crimes was certainly clever, some experts in genetic privacy have been worried about the grander implications of this method. That includes Yaniv Erlich, a computer engineer at Columbia University and chief science officer at MyHeritage, an Israel-based ancestry and consumer genetic testing service.

Erlich and his team wanted to see how easy it would be in general to use the method to find someone’s identity by relying on the DNA of distant and possibly unknown family members. So they looked at more than 1.2 million anonymous people who had gotten testing from MyHeritage, and specifically excluded anyone who had immediate family members also in the database. The idea was to figure out whether a stranger’s DNA could indeed be used to crack your identity.

They found that more than half of these people had distant relatives—meaning third cousins or further—who could be spotted in their searches. For people of European descent, who made up 75 percent of the sample, the hit rate was closer to 60 percent. And for about 15 percent of the total sample, the authors were also able to find a second cousin.

Much like the Golden State investigators, the team found they could trace back someone’s identity in the database with relative ease by using these distant relatives and other demographic but not overly specific information, such as the target’s age or possible state residence.

[…]

According to the researchers, it will take only about 2 percent of an adult population having their DNA profiled in a database before it becomes theoretically possible to trace any person’s distant relatives from a sample of unknown DNA—and therefore, to uncover their identity. And we’re getting ever closer to that tipping point.

“Once we reach 2 percent, nearly everyone will have a third cousin match, and a substantial amount will have a second cousin match,” Erlich explained. “My prediction is that for people of European descent, we’ll reach that threshold within two or three years.”
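A quick back-of-envelope check on why such a small coverage fraction is enough, assuming (purely hypothetically) that a person has on the order of 800 relatives at third-cousin distance or closer, sampled independently into the database:

```python
# Rough estimate: if a database covers a random fraction `coverage` of the
# population and a person has `relatives` third-or-closer cousins, the chance
# that none of them is in it is (1 - coverage) ** relatives.
def match_probability(coverage: float, relatives: int = 800) -> float:
    """Probability that at least one relative appears in the database."""
    return 1 - (1 - coverage) ** relatives

for coverage in (0.001, 0.005, 0.02):
    print(f"coverage {coverage:.1%}: match probability {match_probability(coverage):.1%}")
# At 2% coverage the no-match probability (0.98 ** 800) is vanishingly small,
# which is why a modest database can reach nearly everyone.
```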

[…]

What this means for you: If you want to protect your genetic privacy, the best thing you can do is lobby for stronger legal protections and regulations. Because whether or not you’ve ever submitted your DNA for testing, someone, somewhere, is likely to be able to pick up your genetic trail.

Source: Ancestry Sites Could Soon Expose Nearly Anyone’s Identity, Researchers Say

The US Democracy is turning away so many people at polling stations, they need a What to Do If You’re Turned Away at the Polls guide

Several states have instituted stricter voter ID laws since the 2016 presidential election; more, still, are purging voter rolls in the lead-up to the election, and the recent Supreme Court decision to uphold Ohio’s aggressive purging law means you can expect many more people to be removed. So, even if you’re registered to vote (and you should really double-check), you might find yourself turned away at the polls come November 6.

Source: What to Do If You’re Turned Away at the Polls

One man, one vote? Not so much. Two parties and no voters.

Instagram explores sharing your precise location history with Facebook even when not using the app

Instagram is currently testing a feature that would allow it to share your location data with Facebook, even when you’re not using the app, reports app researcher Jane Manchun Wong (via TechCrunch). The option, which Wong notes is being tested as a setting you have to opt in to, allows Facebook products to “build and use a history of precise locations” which the company says “helps you explore what’s around you, get more relevant ads and helps improve Facebook.” When activated, the service will report your location “even if you leave the app.”

The discovery of the feature comes just weeks after Instagram’s co-founders resigned from the company, reportedly as a result of Facebook CEO Mark Zuckerberg’s meddling in the service. Examples of this meddling include removing Instagram’s attribution from posts re-shared to Facebook, and badged notifications inside Instagram that encouraged people to open the Facebook app. With the two men who were deeply involved in the day-to-day running of Instagram now gone, such intrusions are expected to increase.

Instagram is not the only service whose data Facebook has sought to share across its platforms. Back in 2016 the company announced that it would be sharing user data between WhatsApp and Facebook in order to offer better friend suggestions. The practice was later halted in the European Union thanks to its GDPR legislation, although WhatsApp’s CEO and co-founder later left over data privacy concerns.

Source: Instagram explores sharing your precise location history with Facebook – The Verge

Wait – instagram continually monitors your location too?!

Lawyers for Vizio data grabbing Smart TV owners propose final deal, around $20 per person. Lawyers themselves get $5.6 million.

Lawyers representing Vizio TV owners have asked a federal judge in Orange County, California to sign off on a proposed class-action settlement with the company for $17 million, for an affected class of 16 million people, who must opt in to get any money. Vizio also agrees to delete all data that it collected.

Notice of the lawsuit will be shown directly on the Vizio Smart TVs, three separate times, as well as through paper mailings.

When it’s all said and done, new court filings submitted on Thursday say each of those 16 million people will get a payout of somewhere between $13 and $31. By contrast, their lawyers will collectively earn a maximum payout of $5.6 million in fees.

Source: Lawyers for Vizio Smart TV owners propose final deal, around $20 per person | Ars Technica

New Zealand border cops warn travelers that without handing over electronic passwords ‘You shall not pass!’

Customs laws in New Zealand now allow border agents to demand travellers unlock their phones or face an NZ$5,000 (around US$3,300) fine.

The law was passed during 2017 with its provisions coming into effect on October 1. The security-conscious among you will also be pleased to know Kiwi officials still need a “reasonable” suspicion that there’s something to find.

As the country’s minister of Justice Andrew Little explained to a parliamentary committee earlier this year:

“The bill provides for that power of search and examination, but in order to exercise that power, a customs officer, first of all, has to be satisfied, or at least to have a reasonable suspicion, that a person in possession of such a device—it would be a cellphone or a laptop or anything else that might be described as an ‘e-device’—has been involved in criminal offending.”

That’s somewhat tighter than the rules that apply in America. Border Patrol agents can take a look at phones without giving any reason, but in January this year, a new directive stipulated that a “reasonable suspicion” test applies if the agent wants to copy anything from a phone.

Like the American regulation, New Zealand’s searchers are limited to files held on the phone. A Customs spokesperson told Radio New Zealand “We’re not going into ‘the cloud’. We’ll examine your phone while it’s on flight mode”.

According to Radio NZ, the Council of Civil Liberties criticised the “reasonable cause” protection as inadequate, because someone asked to unlock a device isn’t told what that cause might be, and therefore has no way to challenge the request.

Source: New Zealand border cops warn travelers that without handing over electronic passwords ‘You shall not pass!’ • The Register

Tim Berners-Lee Announces Solid, an Open Source Project Which Would Aim To Decentralize the Web

Tim Berners-Lee, the founder of the World Wide Web, thinks it’s broken and he has a plan to fix it. The British computer scientist has announced a new project that he hopes will radically change his creation by giving people full control over their data.

Tim Berners-Lee: This is why I have, over recent years, been working with a few people at MIT and elsewhere to develop Solid, an open-source project to restore the power and agency of individuals on the web. Solid changes the current model where users have to hand over personal data to digital giants in exchange for perceived value. As we’ve all discovered, this hasn’t been in our best interests. Solid is how we evolve the web in order to restore balance — by giving every one of us complete control over data, personal or not, in a revolutionary way. Solid is a platform, built using the existing web. It gives every user a choice about where data is stored, which specific people and groups can access select elements, and which apps you use. It allows you, your family and colleagues, to link and share data with anyone. It allows people to look at the same data with different apps at the same time. Solid unleashes incredible opportunities for creativity, problem-solving and commerce. It will empower individuals, developers and businesses with entirely new ways to conceive, build and find innovative, trusted and beneficial applications and services. I see multiple market possibilities, including Solid apps and Solid data storage.

Solid is guided by the principle of “personal empowerment through data” which we believe is fundamental to the success of the next era of the web. We believe data should empower each of us. Imagine if all your current apps talked to each other, collaborating and conceiving ways to enrich and streamline your personal life and business objectives? That’s the kind of innovation, intelligence and creativity Solid apps will generate. With Solid, you will have far more personal agency over data — you decide which apps can access it.

In an interview with Fast Company, he shared more on Solid and its creation: “I have been imagining this for a very long time,” says Berners-Lee. He opens up his laptop and starts tapping at his keyboard. Watching the inventor of the web work at his computer feels like what it might have been like to watch Beethoven compose a symphony: It’s riveting but hard to fully grasp. “We are in the Solid world now,” he says, his eyes lit up with excitement. He pushes the laptop toward me so I too can see. On his screen, there is a simple-looking web page with tabs across the top: Tim’s to-do list, his calendar, chats, address book. He built this app — one of the first on Solid — for his personal use. It is simple, spare. In fact, it’s so plain that, at first glance, it’s hard to see its significance. But to Berners-Lee, this is where the revolution begins. The app, using Solid’s decentralized technology, allows Berners-Lee to access all of his data seamlessly — his calendar, his music library, videos, chat, research. It’s like a mashup of Google Drive, Microsoft Outlook, Slack, Spotify, and WhatsApp. The difference here is that, on Solid, all the information is under his control. Every bit of data he creates or adds on Solid exists within a Solid pod — which is an acronym for personal online data store. These pods are what give Solid users control over their applications and information on the web. Anyone using the platform will get a Solid identity and Solid pod. This is how people, Berners-Lee says, will take back the power of the web from corporations.

Starting this week, developers around the world will be able to start building their own decentralized apps with tools through the Inrupt site. Berners-Lee will spend this fall crisscrossing the globe, giving tutorials and presentations to developers about Solid and Inrupt. “What’s great about having a startup versus a research group is things get done,” he says. These days, instead of heading into his lab at MIT, Berners-Lee comes to the Inrupt offices, which are currently based out of Janeiro Digital, a company he has contracted to help work on Inrupt. For now, the company consists of Berners-Lee; his partner John Bruce, who built Resilient, a security platform bought by IBM; a handful of on-staff developers contracted to work on the project; and a community of volunteer coders. Later this fall, Berners-Lee plans to start looking for more venture funding and grow his team. The aim, for now, is not to make billions of dollars. The man who gave the web away for free has never been motivated by money. Still, his plans could impact billion-dollar business models that profit off of control over data. It’s not likely that the big powers of the web will give up control without a fight.

Source: Tim Berners-Lee Announces Solid, an Open Source Project Which Would Aim To Decentralize the Web – Slashdot
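To get a feel for the pod idea, here is a conceptual toy in Python. It is not Solid’s actual API (Solid is built on Linked Data documents served over HTTP); it only shows the shape of the model: data sits in a store the user owns, and each app sees only what it has been granted.

```python
# Conceptual toy of a "personal online data store": the owner keeps the data
# and hands out per-app read grants. Names and data below are hypothetical.
class Pod:
    def __init__(self, owner: str):
        self.owner = owner
        self.data: dict[str, object] = {}
        self.grants: dict[str, set[str]] = {}   # app name -> readable keys

    def grant(self, app: str, keys: set[str]) -> None:
        """The owner allows `app` to read the listed keys."""
        self.grants.setdefault(app, set()).update(keys)

    def read(self, app: str, key: str):
        """Apps only see data they were explicitly granted."""
        if key not in self.grants.get(app, set()):
            raise PermissionError(f"{app} has no access to {key!r}")
        return self.data[key]

pod = Pod("tim")
pod.data.update({"calendar": ["standup 09:00"], "contacts": ["alice", "bob"]})
pod.grant("todo-app", {"calendar"})

print(pod.read("todo-app", "calendar"))   # allowed: the owner granted it
# pod.read("todo-app", "contacts")        # would raise PermissionError
```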

CBS Shuts Down Ambitious Fan Effort To Make A Virtual Starship Enterprise

Before there was Star Trek: Bridge Crew, Ubisoft’s game about piloting the original Enterprise, there was Star Trek Stage-9, a fan project recreating the Enterprise-D from The Next Generation in Unreal Engine. This week the project is no more, following a cease and desist demand by CBS.

One of the leads on the project, who goes by Scragnog, posted a video on YouTube explaining why it would no longer be getting future updates and development was coming to an end. “On Wednesday, September 12, 2018, we received a letter from the CBS legal department,” he said. “This letter was a cease and desist order. The uncertain future we always had at the back of our minds had caught up to us.”

The team immediately shut down the project’s website and began reaching out to the company to try to work out an alternative outcome. After nearly two weeks of not being able to get ahold of anybody, a representative from the legal department confirmed to the Stage 9 team that CBS wasn’t going to budge and the game needed to stay down. CBS did not respond to a request by Kotaku for comment.

Source: CBS Shuts Down Ambitious Fan Effort To Make A Virtual Starship Enterprise

Which goes to show why copyright for such extended periods is such a bad idea. The innovation is killed off and for what? Extended corporate profits, even when they are not making much use of the possibilities? This project was way better than anything CBS churned out.

Facebook Is Giving Advertisers Access to Your Shadow Contact Information – and you can’t find out what that is

Last week, I ran an ad on Facebook that was targeted at a computer science professor named Alan Mislove. Mislove studies how privacy works on social networks and had a theory that Facebook is letting advertisers reach users with contact information collected in surprising ways. I was helping him test the theory by targeting him in a way Facebook had previously told me wouldn’t work. I directed the ad to display to a Facebook account connected to the landline number for Alan Mislove’s office, a number Mislove has never provided to Facebook. He saw the ad within hours.

[Screenshot caption: What Facebook told Alan Mislove about the ad I targeted at his office landline number. Screenshot: Facebook (Alan Mislove)]

One of the many ways that ads get in front of your eyeballs on Facebook and Instagram is that the social networking giant lets an advertiser upload a list of phone numbers or email addresses it has on file; it will then put an ad in front of accounts associated with that contact information. A clothing retailer can put an ad for a dress in the Instagram feeds of women who have purchased from them before, a politician can place Facebook ads in front of anyone on his mailing list, or a casino can offer deals to the email addresses of people suspected of having a gambling addiction. Facebook calls this a “custom audience.”
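For the curious, here is a rough sketch of how this kind of custom-audience matching works in general. It is not Facebook’s internal code; the identifiers are fake, and the hashing step reflects the common practice of uploading normalised hashes (e.g. SHA-256) rather than raw contact details.

```python
# Generic custom-audience match: the advertiser uploads hashed contact details,
# and the platform compares them against every identifier it associates with an
# account, which, per the research below, can include "shadow" numbers the
# user never supplied directly. All identifiers here are fictitious.
import hashlib

def norm_hash(value: str) -> str:
    """Lightly normalise and hash a contact detail before upload."""
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# Identifiers the platform holds for one account: a profile email, a phone
# number given for two-factor auth, and a number harvested from someone
# else's address book.
account_identifiers = {norm_hash(x) for x in
                       ["user@example.edu", "+1 617 555 0100", "+1 617 555 0199"]}

# The advertiser's upload: its own customer list, hashed client-side.
advertiser_list = [norm_hash(x) for x in ["+1 617 555 0199", "someone@else.example"]]

matched = [h for h in advertiser_list if h in account_identifiers]
print(bool(matched))   # True: the shadow number alone is enough to reach the account
```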

You might assume that you could go to your Facebook profile and look at your “contact and basic info” page to see what email addresses and phone numbers are associated with your account, and thus what advertisers can use to target you. But as is so often the case with this highly efficient data-miner posing as a way to keep in contact with your friends, it’s going about it in a less transparent and more invasive way.

Facebook is not content to use the contact information you willingly put into your Facebook profile for advertising. It is also using contact information you handed over for security purposes and contact information you didn’t hand over at all, but that was collected from other people’s contact books, a hidden layer of details Facebook has about you that I’ve come to call “shadow contact information.”

[…]

Giridhari Venkatadri, Piotr Sapiezynski, and Alan Mislove of Northeastern University, along with Elena Lucherini of Princeton University, did a series of tests that involved handing contact information over to Facebook for a group of test accounts in different ways and then seeing whether that information could be used by an advertiser.

[…]

They found that when a user gives Facebook a phone number for two-factor authentication or in order to receive alerts about new log-ins to a user’s account, that phone number became targetable by an advertiser within a couple of weeks. So users who want their accounts to be more secure are forced to make a privacy trade-off and allow advertisers to more easily find them on the social network.

[…]

The researchers also found that if User A, whom we’ll call Anna, shares her contacts with Facebook, including a previously unknown phone number for User B, whom we’ll call Ben, advertisers will be able to target Ben with an ad using that phone number, which I call “shadow contact information,” about a month later.

[…]

“I think that many users don’t fully understand how ad targeting works today: that advertisers can literally specify exactly which users should see their ads by uploading the users’ email addresses, phone numbers, names+dates of birth, etc,” said Mislove. “In describing this work to colleagues, many computer scientists were surprised by this, and were even more surprised to learn that not only Facebook, but also Google, Pinterest, and Twitter all offer related services. Thus, we think there is a significant need to educate users about how exactly targeted advertising on such platforms works today.”

Source: Facebook Is Giving Advertisers Access to Your Shadow Contact Information

Google Chrome Is Now Quietly Forcing You to Log In—Here’s What to Do About It 

Once again, Google has rankled privacy-focused people with a product change that appears to limit users’ options. It’s easy to miss the fact that you’re automatically being logged in to Chrome if you’re not paying attention.

Chrome 69 was released to users on September 5, and you likely noticed that it has a different look. But if you’re the type of person who doesn’t like to log in to the browser with your Google account, you may have missed the fact that it happens automatically when you sign in to a Google service like Gmail. Previously, users were allowed to keep those logins separate. Members of the message board Hacker News noticed the change relatively quickly, and over the weekend several developers called attention to it.

[…]

If you want to disable the forced login, a user on Hacker News points out a workaround that could change at any time. Copy and paste this text into your browser’s address bar: chrome://flags/#account-consistency. Then disable the option labeled “Identity consistency between browser and cookie jar,” and restart your browser. Go to this link to ensure that your Sync settings are configured the way you like them. For now, you have a choice, but it shouldn’t be so difficult or obscure.

Source: Google Chrome Is Now Quietly Forcing You to Log In—Here’s What to Do About It