Iceland Has Tested 13% of Its Population for Coronavirus and Now Has Days with Zero Deaths. Here’s What It Found

Iceland’s testing yielded new leads for scientists about how the virus behaves. Early results suggested 0.6 percent of the population were “silent carriers” of the disease with no symptoms or only a mild cough and runny nose.

Preliminary research suggests one-third of those who tested positive at deCODE infected someone around them, providing evidence that silent carriers do transmit the disease but much less than symptomatic patients.

In a random sample of 848 children under the age of 10, none tested positive, which guided Icelandic authorities’ decision to keep schools open for children under 16.

Alongside the testing, civil defense authorities set up a Contact Tracing Team, including police officers and university students, which used legwork and phone calls to identify people who had come into contact with infected individuals. A mobile phone tracing app was up and running a few weeks later.

Gudnason said the approach’s success is shown by the fact that about 60% of people who tested positive were already in quarantine after being contacted by the tracing team.

Altogether, 19,000 people were ordered into two-week quarantine. Everyone else carried on with a semblance of normality. Primary schools remained open, and some cafes and restaurants kept operating, following social distancing rules: no more than 20 people gathered at once and everyone 2 meters (6.5 feet) apart.

Starting Monday, gatherings of up to 50 will be permitted, high schools and colleges can resume classes and all businesses except bars, gyms and swimming pools can reopen.

The entire country, however, must self-isolate from the rest of the world for the time being. Everyone arriving from abroad faces a 14-day quarantine.

Source: Iceland Has Tested 13% of Its Population for Coronavirus. Here’s What It Found | Time

Researchers create a new system to protect users’ online data by checking if data entered is consistent with the privacy policy

Researchers have created a new system that helps Internet users ensure their online data is secure.

The software-based system, called Mitigator, includes a plugin users can install in their browser that will give them a secure signal when they visit a website verified to process its data in compliance with the site’s privacy policy.

“Privacy policies are really hard to read and understand,” said Miti Mazmudar, a PhD candidate in Waterloo’s David R. Cheriton School of Computer Science. “What we try to do is have a compliance system that takes a simplified model of the privacy policy and checks the code on the website’s end to see if it does what the privacy policy claims to do.

“If a website requires you to enter your email address, Mitigator will notify you if the privacy policy stated that this wouldn’t be needed or if the privacy policy did not mention the requirement at all.”
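As an illustration of that idea (and not Mitigator’s actual implementation), a simplified policy model can be compared against the fields a page asks the user to enter. The policy format, field names and the check_form_against_policy helper below are hypothetical:

```python
# Hypothetical sketch of checking a form's requested fields against a
# simplified privacy-policy model; the names and the policy format are
# illustrative, not Mitigator's actual data structures.

SIMPLIFIED_POLICY = {
    # field name -> whether the policy says the site collects it
    "email": False,       # policy claims e-mail is not collected
    "postal_code": True,  # policy discloses collection of postal codes
}

def check_form_against_policy(requested_fields, policy):
    """Return warnings for requested fields the policy does not cover or denies needing."""
    warnings = []
    for field in requested_fields:
        disclosed = policy.get(field)
        if disclosed is None:
            warnings.append(f"'{field}' is requested but not mentioned in the policy")
        elif disclosed is False:
            warnings.append(f"policy states '{field}' is not needed, yet the form asks for it")
    return warnings

# Example: a sign-up form asking for an e-mail address and a phone number
print(check_form_against_policy(["email", "phone"], SIMPLIFIED_POLICY))
```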

Mitigator can work on any computer, but the companies that own the website servers must have machines with a trusted execution environment (TEE). A TEE is a secure area of modern server-class processors that guarantees the confidentiality and integrity of the code and data loaded into it.
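The secure signal is only meaningful if the browser plugin can first verify that the compliance checker really is running inside a TEE on the server. The sketch below shows that verification step under simplified assumptions: the report fields, the shared key and the HMAC-based check are stand-ins for illustration, not the attestation protocol the paper uses (real TEE remote attestation, such as Intel SGX’s, is considerably more involved).

```python
import hmac, hashlib

# Hypothetical attestation check: the plugin trusts the server only if the
# signed report covers a known-good measurement of the compliance code.
# Field names, the shared key and the HMAC scheme are illustrative only.

TRUSTED_MEASUREMENTS = {"3f8a-policy-checker-v1"}      # known-good code hashes
ATTESTATION_KEY = b"demo-key-not-a-real-secret"

def verify_attestation(report: dict) -> bool:
    """Accept the server's claim only if the signature and measurement check out."""
    expected = hmac.new(ATTESTATION_KEY,
                        report["measurement"].encode(),
                        hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["signature"]):
        return False                     # report was not produced by the TEE
    return report["measurement"] in TRUSTED_MEASUREMENTS

report = {
    "measurement": "3f8a-policy-checker-v1",
    "signature": hmac.new(ATTESTATION_KEY,
                          b"3f8a-policy-checker-v1",
                          hashlib.sha256).hexdigest(),
}
print(verify_attestation(report))  # True -> plugin may show the secure signal
```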

“The big difference between Mitigator and prior systems that had similar goals is that Mitigator’s primary focus is on the signal it gives to the user,” said Ian Goldberg, a professor in Waterloo’s Faculty of Mathematics. “The important thing is not just that the company knows their software is running correctly; we want the user to get this assurance that the company’s software is running correctly and is processing their data properly and not just leaving it lying around on disk to be stolen.

“Users of Mitigator will know whether their data is being properly protected, managed, and processed while the companies will benefit in that their customers are happier and more confident that nothing untoward is being done with their data.”

The study, Mitigator: Privacy policy compliance using trusted hardware, authored by Mazmudar and Goldberg, has been accepted for publication in the Proceedings of Privacy Enhancing Technologies.

Source: Researchers create a new system to protect users’ online data | Waterloo Stories | University of Waterloo

Antwerpen Uni bans video app Zoom – city of Antwerp is stupid enough to keep using it

The Universiteit Antwerpen is banning the use of the video calling app Zoom. The application is said not to be secure enough, and the university does not want to take any risks after already falling victim to a cyberattack last year.

Google and the American space agency NASA also recently decided to stop using Zoom.

The city of Antwerp, however, is still using Zoom extensively. “By taking appropriate security measures and making use of Zoom’s own security options, unnecessary risks were avoided,” says spokesperson Dirk Delechambre.

Source: Universiteit Antwerpen verbiedt videobelapp Zoom – Emerce

Sorry Dirk, you’re wrong. There is no “safe” way to use the app.

UK COVID-19 contact tracing app data may be kept for ‘research’ after crisis ends, MPs told

Britons will not be able to ask NHS admins to delete their COVID-19 tracking data from government servers, digital arm NHSX’s chief exec Matthew Gould admitted to MPs this afternoon.

Gould also told Parliament’s Human Rights Committee that data harvested from Britons through NHSX’s COVID-19 contact tracing app would be “pseudonymised” – and appeared to leave the door open for that data to be sold on for “research”.

The government’s contact-tracing app will be rolled out in Britain this week. A demo seen by The Register showed its basic consumer-facing functions. Key to those is a big green button that the user presses to send 28 days’ worth of contact data to the NHS.

Screenshot of the NHSX COVID-19 contact tracing app

Written by tech arm NHSX, Britain’s contact-tracing app breaks with international convention by opting for a centralised model of data collection, rather than keeping the data on users’ phones and matching contacts locally.
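To make that distinction concrete, the rough sketch below contrasts the two models at a very high level; the record fields and function names are illustrative and are not taken from the NHSX design or any real protocol.

```python
# Illustrative contrast between the two contact-tracing models described above.
# Record fields and function names are hypothetical, not the NHSX protocol.

def centralised_report(contact_log, upload):
    """Centralised model: the whole contact history is uploaded, and matching
    against infected users happens on the health authority's servers."""
    upload({"device_id": "pseudonymous-id", "contacts": contact_log})

def decentralised_check(contact_log, published_infected_keys):
    """Decentralised model: only keys of infected users are published; each
    phone checks its locally stored contacts and nothing leaves the device."""
    return [c for c in contact_log if c["key"] in published_infected_keys]

contact_log = [{"key": "k1", "when": "2020-04-28"},
               {"key": "k2", "when": "2020-05-01"}]

centralised_report(contact_log, upload=print)    # everything goes to the server
print(decentralised_check(contact_log, {"k2"}))  # only a local match is revealed
```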

In response to questions from Scottish National Party MP Joanna Cherry this afternoon, Gould told MPs: “The data can be deleted for as long as it’s on your own device. Once uploaded, all the data will be deleted or fully anonymised in line with the law, so it can be used for research purposes.”

Source: UK COVID-19 contact tracing app data may be kept for ‘research’ after crisis ends, MPs told • The Register

Why smartphones are digital truth serum

Do smartphones alter what people are willing to disclose about themselves to others? A new study in the Journal of Marketing suggests that they might. The research indicates that people are more willing to reveal personal information about themselves online when using their smartphones than when using desktop computers. For example, Tweets and reviews composed on smartphones are more likely to be written from the perspective of the first person, to disclose negative emotions, and to discuss the writer’s private family and personal friends. Likewise, when consumers receive an online ad that requests personal information (such as income), they are more likely to provide it when the request is received on their smartphone than on their desktop or laptop computer.

Why do smartphones have this effect on behavior? Melumad explains that “Writing on one’s smartphone often lowers the barriers to revealing certain types of sensitive information for two reasons; one stemming from the unique form characteristics of phones and the second from the emotional associations that consumers tend to hold with their device.” First, one of the most distinguishing features of phones is their small size, something that makes viewing and creating content generally more difficult compared with desktop computers. Because of this difficulty, when writing or responding on a smartphone, a person tends to focus narrowly on completing the task and becomes less cognizant of external factors that would normally inhibit self-disclosure, such as concerns about what others would do with the information. Smartphone users know this effect well: when using their phones in public places, they often fixate so intently on the phone’s content that they become oblivious to what is going on around them.

The second reason people tend to be more self-disclosing on their phones lies in the feelings of comfort and familiarity people associate with their phones. Melumad adds, “Because our smartphones are with us all of the time and perform so many vital functions in our lives, they often serve as ‘adult pacifiers’ that bring feelings of comfort to their owners.” The effect of those feelings parallels the way people are more willing to disclose feelings to a close friend than to a stranger, or to open up to a therapist in a comfortable rather than an uncomfortable setting. As Meyer says, “Similarly, when writing on our phones, we tend to feel that we are in a comfortable ‘safe zone.’ As a consequence, we are more willing to open up about ourselves.”

The data to support these ideas is far-ranging and includes analyses of thousands of social media posts and online reviews, responses to web ads, and controlled laboratory studies. For example, initial evidence comes from analyses of the depth of self-disclosure revealed in 369,161 Tweets and 10,185 restaurant reviews posted on TripAdvisor.com, with some posted on PCs and some on smartphones. Using both automated natural-language processing tools and human judgements of self-disclosure, the researchers find robust evidence that smartphone-generated content is indeed more self-disclosing. Perhaps even more compelling is evidence from an analysis of 19,962 “call to action” web ads, where consumers are asked to provide private information.

Consistent with the tendency for smartphones to facilitate greater self-disclosure, compliance was systematically higher for ads targeted at smartphones versus PCs.
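A rough sense of what such a self-disclosure measure might look like can be given with a simple word-count heuristic. The sketch below is only a crude stand-in for the automated natural-language-processing tools mentioned above, not the authors’ method, and the word lists and example posts are invented.

```python
import re

# Crude proxy for "self-disclosure" in a post: how often first-person pronouns
# and negative-emotion words appear per word of text. A rough stand-in for the
# automated tools mentioned above; the word lists are illustrative only.

FIRST_PERSON = {"i", "me", "my", "mine", "we", "our"}
NEGATIVE_EMOTION = {"sad", "angry", "hate", "awful", "terrible", "worried"}

def disclosure_score(text: str) -> float:
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in FIRST_PERSON or w in NEGATIVE_EMOTION)
    return hits / len(words)

phone_post = "I hate waiting here, my back is killing me and I'm worried."
desktop_post = "The service was slow; the venue could improve its scheduling."
print(disclosure_score(phone_post), disclosure_score(desktop_post))
# The phone-style post scores higher on this crude measure.
```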

The findings have clear and significant implications for firms and consumers. One is that if a firm wishes to gain a deeper understanding of the real preferences and needs of consumers, it may obtain better insights by tracking what they say and do on their smartphones than on their desktops. Likewise, because more self-disclosing content is often perceived to be more honest, firms might encourage consumers to post reviews from their personal devices. But therein lies a potential caution for consumers: these findings suggest that the device people use to communicate can affect what they communicate. This should be kept in mind when thinking about the device one is using when interacting with firms and others.

Source: Why smartphones are digital truth serum