Spanish renewable energy giant and offshore wind leader Siemens Gamesa Renewable Energy last week inaugurated operations of its electrothermal energy storage system, which can store up to 130 megawatt-hours of energy in volcanic rock for a week.
[…]
The heat storage facility contains around 1,000 tonnes of volcanic rock, which serves as the storage medium. Electrical energy is converted into hot air by a resistance heater and a blower, and that hot air in turn heats the rock to 750°C (1,382°F). When demand requires the stored energy, ETES uses a steam turbine to re-electrify the stored heat and feed it back into the grid.
The new ETES facility in Hamburg-Altenwerder can store up to 130 MWh of thermal energy for a week, and storage capacity remains constant throughout the charging cycles.
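A quick back-of-envelope check using Q = m·c·ΔT, with assumed values (a basalt-like specific heat and an ambient starting temperature, neither given in the article), lands in the same range as the quoted capacity:

```python
# Rough estimate of the heat stored in the rock pile; the specific heat and the
# starting temperature are assumptions, not figures from Siemens Gamesa.
mass_kg = 1_000_000          # ~1,000 tonnes of volcanic rock
specific_heat = 840          # J/(kg*K), typical for basalt (assumed)
delta_t = 750 - 20           # K, heated from roughly ambient to 750 degrees C

energy_joules = mass_kg * specific_heat * delta_t    # Q = m * c * delta T
energy_mwh = energy_joules / 3.6e9                   # 1 MWh = 3.6e9 J
print(f"~{energy_mwh:.0f} MWh of heat")              # ~170 MWh, same order as the quoted 130 MWh
```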
Google Calendar was down for users around the world for nearly three hours earlier today. Calendar users trying to access the service were met with a 404 error message through a browser from around 10AM ET until around 12:40PM ET. Google’s Calendar service dashboard now reveals that issues should be resolved for everyone within the next hour.
“We expect to resolve the problem affecting a majority of users of Google Calendar at 6/18/19, 1:40 PM,” the message reads. “Please note that this time frame is an estimate and may change.” Google Calendar appears to have returned for most users, though. Other Google services such as Gmail and Google Maps appeared to be unaffected during the calendar outage, although Hangouts Meet was reportedly experiencing some difficulties.
Google Calendar is currently experiencing a service disruption. Please stay tuned for updates or follow here: https://t.co/2SGW3X1cQn
Google Calendar’s issues come in the same month as another massive Google outage which saw YouTube, Gmail, and Snapchat taken offline because of problems with the company’s overall Cloud service. At the time, Google blamed “high levels of network congestion in the eastern USA” for the issues.
The outage also came just over an hour after Google’s G Suite Twitter account sent out a tweet promoting Google Calendar’s ability to make scheduling simpler.
However, I recently met other open source developers who make a living from donations, and they helped widen my perspective. At Amsterdam.js, I heard Henry Zhu speak about sustainability in the Babel project and beyond, and it was a pretty dire picture. Later, over breakfast, Henry and I had a deeper conversation on this topic. In Amsterdam I also met up with Titus, who maintains the Unified project full-time. Meeting with these people confirmed my belief in the donation model for sustainability. It works. But what really stood out to me was the question: is it fair?
I decided to collect data from OpenCollective and GitHub, and take a more scientific sample of the situation. The results I found were shocking: there were two clearly sustainable open source projects, but the majority (more than 80%) of projects that we usually consider sustainable are actually receiving income below industry standards or even below the poverty threshold.
What the data says
I picked popular open source projects from OpenCollective, and pulled the yearly income data for each. Then I looked up their GitHub repositories to count stars and to see how many “full-time” contributors they have had in the past 12 months. Sometimes I also looked up the Patreon pages of those few maintainers who had one, and added that data to the yearly income for the project. For instance, it is obvious that Evan You gets money on Patreon to work on Vue.js. These data points allowed me to measure: project popularity (a proportional indicator of the number of users), yearly revenue for the whole team, and team size.
[…]
Those who work full-time sometimes supplement their income with savings, by living in a country with a lower cost of living, or both (Sindre Sorhus).
Then, based on the latest StackOverflow developer survey, we know that the low end of developer salaries is around $40k, while the high end is above $100k. That range represents the industry standard for developers, given their status as knowledge workers, many of whom live in OECD countries. This allowed me to classify the results into four categories:
BLUE: 6-figure salary
GREEN: 5-figure salary within industry standards
ORANGE: 5-figure salary below our industry standards
RED: income below the poverty threshold
The first chart, below, shows team size and “price” for each GitHub star.
More than 50% of projects are red: they cannot sustain their maintainers above the poverty line. 31% of the projects are orange, consisting of developers willing to work for a salary that would be considered unacceptable in our industry. 12% are green, and only 3% are blue: Webpack and Vue.js. Income per GitHub star is important: sustainable projects generally have above $2/star. The median value, however, is $1.22/star. Team size is also important for sustainability: the smaller the team, the more likely it can sustain its maintainers.
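A minimal sketch of how such a classification and the price-per-star figure could be computed; the thresholds below are assumptions taken from the text (the $40k/$100k salary range and a poverty line of roughly $12k/year), not the author's exact script:

```python
# Classify a project by yearly donation revenue per full-time maintainer,
# and compute the "price per GitHub star" used in the first chart.
POVERTY_LINE = 12_000    # USD/year, assumed US individual poverty threshold
INDUSTRY_LOW = 40_000    # USD/year, low end of developer salaries (survey)
INDUSTRY_HIGH = 100_000  # USD/year, high end of developer salaries (survey)

def classify(yearly_revenue: float, team_size: int) -> str:
    per_maintainer = yearly_revenue / max(team_size, 1)
    if per_maintainer >= INDUSTRY_HIGH:
        return "BLUE"    # 6-figure salary
    if per_maintainer >= INDUSTRY_LOW:
        return "GREEN"   # 5 figures, within industry standards
    if per_maintainer >= POVERTY_LINE:
        return "ORANGE"  # 5 figures, below industry standards
    return "RED"         # below the poverty threshold

def dollars_per_star(yearly_revenue: float, stars: int) -> float:
    return yearly_revenue / max(stars, 1)

# Hypothetical project: $30k/year in donations, 2 maintainers, 25k stars.
print(classify(30_000, 2), f"${dollars_per_star(30_000, 25_000):.2f}/star")
```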
The median donation per year is $217, which is substantial when understood at an individual level, but in reality it includes sponsorship from companies that are also doing this for their own marketing purposes.
The next chart shows how revenue scales with popularity.
You can browse the data yourself by accessing this Dat archive with a LibreOffice Calc spreadsheet:
The total amount of money being put into open source is not enough for all the maintainers. If we add up all of the yearly revenue from the projects in this data set, it’s $2.5 million. The median salary is approximately $9k, which is below the poverty line. If that money were split up evenly, each maintainer would get roughly $22k, which is still below industry standards.
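For concreteness, the quoted even split implies a head count on the order of a hundred full-time maintainers in the sample (my inference from the two figures, not a number stated in the text):

```python
# Sanity-check the aggregate figures quoted above.
total_revenue = 2_500_000   # USD/year across the sampled projects
even_split = 22_000         # USD/year per maintainer if shared evenly
implied_maintainers = total_revenue / even_split
print(f"~{implied_maintainers:.0f} full-time maintainers implied")  # ~114
```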
The core problem is not that open source projects are not sharing the money they receive. The problem is that, in total numbers, open source is not getting enough money. $2.5 million is not enough. To put this number into perspective, startups typically raise much more than that.
Tidelift has received $40 million in funding to “help open source creators and maintainers get fairly compensated for their work” (quote). They have a team of 27 people, some of them ex-employees of large companies (such as Google and GitHub). They probably don’t receive the lower tier of salaries. Yet many of the open source projects they showcase on their website are below the poverty line in terms of donation income.
[…]
GitHub was bought by Microsoft for $7.5 billion. To make that quantity easier to grok, the amount of money Microsoft paid to acquire GitHub – the company – is more than 3000x what the open source community is getting yearly. In other words, if the open source community saved up every penny of the money they ever received, after a couple thousand years they could perhaps have enough money to buy GitHub jointly.
[…]
If Microsoft GitHub is serious about helping fund open source, they should put their money where their mouth is: donate at least $1 billion to open source projects. Even a mere $1.5 million per year would be enough to make all the projects in this study green. The Matching Fund in GitHub Sponsors is not enough: it gives a maintainer at most $5k in a year, which is not sufficient to raise a maintainer from the poverty threshold up to the industry standard.
The man heading up any potential US government antitrust probes into tech giants like Apple and Google used to work for… Apple and Google.
In the revolving-door world that is Washington DC, that conflict may not seem like much but one person isn’t having it: Senator Elizabeth Warren (D-MA) this week sent Makan Delrahim a snotagram in which she took issue with him overseeing tech antitrust efforts.
“I am writing to urge you to recuse yourself from the Department of Justice’s (DOJ) reported antitrust investigations into Google and Apple,” she wrote. “Although you are the chief antitrust attorney in the DoJ, your prior work lobbying the federal government on behalf of these and other companies in antitrust matters compromises your ability to manage or advise on this investigation without real or perceived conflicts of interest.”
Warren then outlines precisely what she means by conflict of interest: “In 2007, Google hired you to lobby federal antitrust officials on behalf of the company’s proposed acquisition of online advertising company DoubleClick, a $3.1 billion merger that the federal government eventually signed off on… You reported an estimated $100,000 in income from Google in 2007.”
It’s not just Google either. “In addition to the investigation into Google, the DoJ will also have jurisdiction over Apple. In both 2006 and 2007, Apple hired you to lobby the federal government on its behalf on patent reform issues,” Warren continues.
She notes: “Federal ethics law requires that individuals recuse themselves from any ‘particular matter involving specific parties’ if ‘the circumstances would cause a reasonable person with knowledge of the relevant facts to question his impartiality in the matter.’ Given your extensive and lucrative previous work lobbying the federal government on behalf of Google and Apple… any reasonable person would surely question your impartiality in antitrust matters…”
This is fine
Delrahim has also done work for a range of other companies, including Anthem, Pfizer, Qualcomm, and Caesars. But what has people concerned is that he has specific knowledge of, and connections with, the very highest levels of the tech giants while being in charge of one of the most anticipated antitrust investigations of the past 30 years.
This is ridiculous, of course, because Delrahim is a professional and works for whoever hires him. It’s not as if he would do something completely inappropriate like give a speech outside the United States in which he walks through exactly how he would carry out an antitrust investigation into tech giants and the holes that would exist in such an investigation, thereby giving them a clear blueprint to work against.
He definitely did not do that. What he actually did was talk about how it was possible to investigate tech giants, despite some claiming it wasn’t – which is, you’ll understand, quite the opposite.
“The Antitrust Division does not take a myopic view of competition,” Delrahim said during a speech in Israel this week. “Many recent calls for antitrust reform, or more radical change, are premised on the incorrect notion that antitrust policy is only concerned with keeping prices low. It is well-settled, however, that competition has price and non-price dimensions.”
Instead, he noted: “Diminished quality is also a type of harm to competition… As an example, privacy can be an important dimension of quality. By protecting competition, we can have an impact on privacy and data protection.”
So that’s diminished quality and privacy as lines of attack. Anything else, Makan?
“Generally speaking, an exclusivity agreement is an agreement in which a firm requires its customers to buy exclusively from it, or its suppliers to sell exclusively to it. There are variations of this restraint, such as requirements contracts or volume discounts,” he mused at the Antitrust New Frontiers Conference in Tel Aviv.
So it looks as though he is ignoring most of what makes this conduct predatory, since he’s mainly looking at price, then a bit at quality and privacy. Except he’s not really looking at quality and privacy. Or leverage. Or the waterbed effect. Or undercutting. Or product copying. Or vertical integration. Or aggression.
In an interview this week with CNN, Google CEO Sundar Pichai attempted to turn antitrust questions around by pointing to what tech giants say is the silver lining of size: Big beats China. In the face of an intensifying push for antitrust action, the argument has been called tech’s version of “too big to fail.”
“Scale does offer many benefits, it’s important to understand that,” Google CEO Sundar Pichai said. “As a company, we sometimes invest five, ten years ahead without necessarily worrying about short term profits. If you think about how technology leadership contributes to leadership on a global economic scale. Big companies are what are investing in AI the most. There are many benefits to taking a long term view which big companies are able to do.”
Pichai, who did allow that scrutiny and competition were ultimately good things, made points that echoed arguments from Facebook CEO Mark Zuckerberg, who put the case a lot more frankly.
“I think you have this question from a policy perspective, which is, ‘Do we want American companies to be exporting across the world?’” Zuckerberg said last year. “I think that the alternative, frankly, is going to be the Chinese companies.”
Pichai never outright said the word “China,” but he didn’t have to. China’s rising tech industry and increasingly tense relationship with the United States loom in the background of the whole conversation.
“There are many countries around the world which aspire to be the next Silicon Valley. And they are supporting their companies, too,” Pichai said to CNN. “So we have to balance both. This doesn’t mean you don’t scrutinize large companies. But you have to balance it with the fact that you want big, successful companies as well.”
This has been one of Silicon Valley’s safest fallback arguments since antitrust sentiment began gaining steam in the United States. But the history of American industry offers a serious counterweight.
Columbia Law School professor Tim Wu spent much of 2018 outlining the case for antitrust action. He wrote a book on the subject, The Curse of Bigness: Antitrust in the New Gilded Age, and appeared all over media to make his argument. In an op-ed for the New York Times, Wu called back to the heated Japanese-American tech competition of the 1980s.
IBM faced an unprecedented international challenge in the mainframe market from Japan’s NEC while Sony, Panasonic, and Toshiba made giant leaps forward. The companies had the strong support of the Japanese government.
Had the United States followed the Zuckerberg logic, we would have protected and promoted IBM, AT&T and other American tech giants — the national champions of the 1970s. Instead, the federal government accused the main American tech firms of throttling competition. IBM was subjected to a devastating, 13-year-long antitrust investigation and trial, and the Justice Department broke AT&T into eight pieces in 1984. And indeed, the effect was to weaken some of America’s most powerful tech firms at a key moment of competition with a foreign power.
But something else happened as well. With IBM and AT&T under constant scrutiny, a whole series of industries and companies were born without fear of being squashed by a monopoly. The American software industry, freed from IBM, came to life, yielding companies like Microsoft, Sun and Lotus. Personal computers from Apple and other companies became popular, and after the breakup of AT&T, companies like CompuServe and America Online rushed into online networking, eventually yielding what we now call the “internet economy.”
Silicon Valley’s argument, however, does resonate. The 1980s are not the 2010s, and the relationship between China and the U.S. today is significantly colder and even more complex than that between Japan and the U.S. three decades ago.
American politicians have echoed some of big tech’s concerns about Chinese leadership.
I’d agree with Wu – the China argument is a fear trap. Antitrust history – in the tech, oil and telephony industries, among others – has shown that when titans fall, many smaller, agile and much more innovative companies spring up to take their place, fueling employment gains, exports and better lifestyles for all of us.
Phantom Brigade is a hybrid turn-based and real-time tactical RPG focusing on in-depth customization and player-driven stories. As the last surviving squad of mech pilots, you must capture enemy equipment and facilities to level the playing field. Outnumbered and outgunned, lead The Brigade through a desperate campaign to retake your war-torn homeland.
According to new research, Antlia 2’s current position is consistent with a collision with the Milky Way hundreds of millions of years ago that could have produced the perturbations we see today. The paper has been submitted for publication and is undergoing peer review.
Antlia 2 was a bit of a surprise when it showed up in the second Gaia mission data release last year. It’s really close to the Milky Way – one of our satellite galaxies – and absolutely enormous, about the size of the Large Magellanic Cloud.
But it’s incredibly diffuse and faint, and hidden from view by the galactic disc, so it managed to evade detection.
That data release also showed in greater detail ripples in the Milky Way’s disc. But astronomers had known about perturbations in that region of the disc for several years by that point, even if the data wasn’t as clear as that provided by Gaia.
It was based on this earlier information that, in 2009, astrophysicist Sukanya Chakrabarti of the Rochester Institute of Technology and colleagues predicted the existence of a dwarf galaxy dominated by dark matter in pretty much the exact location Antlia 2 was found nearly a decade later.
Using the new Gaia data, the team calculated Antlia 2’s past trajectory and ran a series of simulations. These reproduced not just the dwarf galaxy’s current position, but also the ripples in the Milky Way’s disc, by way of a collision less than a billion years ago.
Simulation of the collision: The gas distribution is on the left, stars on the right. (RIT)
Previously, a different team of researchers had attributed these perturbations to an interaction with the Sagittarius Dwarf Spheroidal Galaxy, another of the Milky Way’s satellites.
Chakrabarti and her team also ran simulations of this scenario, and found that the Sagittarius galaxy’s gravity probably isn’t strong enough to produce the effects observed by Gaia.
“Thus,” the researchers wrote in their paper, “we argue that Antlia 2 is the likely driver of the observed large perturbations in the outer gas disk of the Galaxy.”
A bug impacting the editors Vim and Neovim could allow trojan code to escape sandbox mitigations.
A high-severity bug impacting two popular command-line text editing applications, Vim and Neovim, allows remote attackers to execute arbitrary OS commands. Security researcher Armin Razmjou warned that exploiting the bug is as easy as tricking a target into clicking on a specially crafted text file in either editor.
Razmjou’s PoC is able to bypass modeline mitigations, which execute value expressions in a sandbox. That sandbox is meant to prevent somebody from turning a text file’s modelines into a trojan horse, the researcher said.
“However, the :source! command (with the bang [!] modifier) can be used to bypass the sandbox. It reads and executes commands from a given file as if typed manually, running them after the sandbox has been left,” according to the PoC report.
“Beyond patching, it’s recommended to disable modelines in the vimrc (set nomodeline), to use the securemodelines plugin, or to disable modelineexpr (since patch 8.1.1366, Vim-only) to disallow expressions in modelines,” the researcher said.
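For reference, the mitigations named in that quote boil down to a couple of lines in a vimrc; a minimal sketch (check :help modeline for which options your Vim or Neovim build supports):

```vim
" Ignore modelines in opened files entirely (the blunt, safest option).
set nomodeline

" Or, on Vim 8.1.1366 and later: keep modelines but forbid expression
" evaluation inside them.
set nomodelineexpr
```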
First off, you can’t click in vi, but OK. Second, the whole idea is that you can run commands from vi. So basically he is calling a functionality a flaw.
To see exactly how inscrutable privacy policies have become, I analyzed the length and readability of privacy policies from nearly 150 popular websites and apps. Facebook’s privacy policy, for example, takes around 18 minutes to read in its entirety – slightly above average for the policies I tested.
The comparison spans many websites, with a focus on Facebook and Google, but the main takeaway, I think, is that almost all privacy policies are complex, because they’re not written for the users.
A novel magnet half the size of a cardboard toilet tissue roll usurped the title of “world’s strongest magnetic field” from the metal titan that had held it for two decades at the Florida State University-headquartered National High Magnetic Field Laboratory.
And its makers say we ain’t seen nothing yet: By packing an exceptionally high-field magnet into a coil you could pack in a purse, MagLab scientists and engineers have shown a way to build and use electromagnets that are stronger, smaller and more versatile than ever before.
Their work is outlined in an article published today in the journal Nature.
“We are really opening a new door,” said MagLab engineer Seungyong Hahn, the mastermind behind the new magnet and an associate professor at the FAMU-FSU College of Engineering. “This technology has a very good potential to entirely change the horizons of high-field applications because of its compact nature.”
[…]
Both the 45-T magnet and the 45.5-T test magnet are built in part with superconductors, a class of conductors boasting special properties, including the ability to carry electricity with perfect efficiency.
The superconductors used in the 45-T are niobium-based alloys, which have been around for decades. But in the 45.5-T proof-of-principle magnet, Hahn’s team used a newer compound called REBCO (rare earth barium copper oxide) with many advantages over conventional superconductors.
Notably, REBCO can carry more than twice as much current as a same-sized section of niobium-based superconductor. This current density is crucial: After all, the electricity running through an electromagnet generates its field, so the more you can cram in, the stronger the field.
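As a rough idealization (the real coil geometry is more complicated than a textbook solenoid), the field at the center of a long solenoid scales with the current threaded through each unit length of winding, which is why current density is the limiting factor:

$$ B \approx \mu_0 \, n \, I $$

Here μ0 is the vacuum permeability, n the number of turns per metre and I the current in each turn. A tape that carries twice the current through the same cross-section roughly doubles nI, and hence the achievable field, without making the coil any bigger.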
Also critical was the specific REBCO product used—paper-thin, tape-shaped wires manufactured by SuperPower Inc.
Credit: Florida State University
MagLab Chief Materials Scientist David Larbalestier, who is also a professor at the FAMU-FSU College of Engineering, saw the product’s promise to pack more power into a potential world-record magnet, and encouraged Hahn to give it a go.
The other key ingredient was not something they put in, but rather something they left out: insulation.
Today’s electromagnets contain insulation between conducting layers, which directs the current along the most efficient path. But it also adds weight and bulk.
Hahn’s innovation: A superconducting magnet without insulation. In addition to yielding a sleeker instrument, this design protects the magnet from a malfunction known as a quench. Quenches can occur when damage or imperfections in the conductor block the current from its designated path, causing the material to heat up and lose its superconducting properties. But if there is no insulation, that current simply follows a different path, averting a quench.
“The fact that the turns of the coil are not insulated from each other means that they can share current very easily and effectively in order to bypass any of these obstacles,” explained Larbalestier, corresponding author on the Nature paper.
There’s another slimming aspect of Hahn’s design that relates to quenches: Superconducting wires and tapes must incorporate some copper to help dissipate heat from potential hot spots. His “no-insulation” coil, featuring tapes a mere 0.043-mm thick, requires much less copper than do conventional magnets.
Britain’s Home Secretary Sajid Javid told BBC Radio today that he has signed the extradition order for Julian Assange, paving the way for the WikiLeaks founder to be sent to the U.S. to face charges of computer hacking and espionage.
“There’s an extradition request from the U.S. that is before the courts tomorrow, but yesterday I signed the extradition order, certified it, and that will be going in front of the courts tomorrow,” Javid said according to Australia’s public broadcaster, the ABC.
Assange is scheduled to appear in a UK court on Friday, though it’s not clear whether he’ll appear by video link or in person.
“It’s a decision ultimately for the courts but there is a very important part of it for the Home Secretary and I want to see justice done at all times, and we’ve got a legitimate extradition request so I’ve signed it, but the final decision is now with the courts,” Javid continued.
Curiously, Home Secretary Javid signed the extradition paperwork despite not being on the best terms with the U.S. government right now. Javid wasn’t invited to attend formal ceremonies when President Donald Trump recently visited the UK and some believe it’s because Javid criticized Trump’s treatment of Muslims in 2017 as well as the American president’s retweets of the far right group Britain First. Javid has a Muslim background, though he insists he doesn’t know why he wasn’t invited to the recent U.S.-focused events in Britain.
Assange is currently being held in Belmarsh prison in south-east London, serving a 50-week sentence for jumping bail in 2012. Assange sought asylum during the summer of 2012 at Ecuador’s embassy in London, where he lived for almost seven years until this past April, when Ecuador revoked his asylum and British police physically dragged the WikiLeaks founder out of the embassy.
WikiLeaks founder Julian Assange, a 47-year-old Australian national, appears to be one step closer to being sent to the United States, but the deal is not done, as Javid notes. Not only does the extradition order need final approval from the UK courts, but there’s also still the question of whether Assange could be sent to Sweden to face sexual assault charges.
The statute of limitations has expired for one of the sexual assault claims made against Assange in Sweden, but a rape claim could still be pursued if Swedish prosecutors decide to push the case. A Swedish court ruled earlier this month that Assange should not be detained in absentia, which would have been the first step under Swedish law toward his extradition.
Assange’s Swedish lawyer has previously claimed that Assange was too ill to even appear in court via video link, but secret video seemingly recorded by another inmate recently showed Assange looking relatively normal and healthy.
Assange has been charged with 18 counts by the U.S. Justice Department, 17 of them under the Espionage Act, which potentially carries the death penalty. But American prosecutors supposedly gave Ecuador a “verbal pledge” that they won’t pursue death in Assange’s case, according to American news channel ABC. Obviously, a “verbal pledge” is not something that would hold up in court.
As far back as 2015, major companies like Sony and Intel have sought to crowdsource efforts to secure their systems and applications through the San Francisco startup HackerOne. Through the “bug bounty” program offered by the company, hackers once viewed as a nuisance—or worse, as criminals—can identify security vulnerabilities and get paid for their work.
On Tuesday, HackerOne published a wealth of anonymized data to underscore not only the breadth of its own program but also the leading types of bugs discovered by its virtual army of hackers who’ve reaped financial rewards through it. Some $29 million has been paid out so far for the top 10 most rewarded types of security weakness alone, according to the company.
HackerOne markets the bounty program as a means to safely mimic an authentic kind of global threat. “It’s one of the best defenses you can have against what you’re actually protecting against,” said Miju Han, HackerOne’s director of product management. “There are a lot of security tools out there that have theoretical risks—and we definitely endorse those tools as well. But what we really have in bug bounty programs is a real-world security risk.”
The program, of course, has its own limitations. Participants have the ability to define the scope of engagement and in some cases—as with the U.S. Defense Department, a “hackable target”—place limits on which systems and methods are authorized under the program. Criminal hackers and foreign adversaries are, of course, not bound by such rules.
“Bug bounties can be a helpful tool if you’ve already invested in your own security prevention and detection,” said Katie Moussouris, CEO of Luta Security, “in terms of secure development if you publish code, or secure vulnerability management if your organization is mostly just trying to keep up with patching existing infrastructure.”
“It isn’t suitable to replace your own preventative measures, nor can it replace penetration testing,” she said.
Not surprisingly, HackerOne’s data shows that cross-site scripting (XSS) attacks—in which malicious scripts are injected into otherwise trusted sites—overwhelmingly remain the top vulnerability reported through the program. Of the top 10 types of bugs reported, XSS makes up 27 percent. No other type of bug comes close. Through HackerOne, some $7.7 million has been paid out to address XSS vulnerabilities alone.
Cloud migration has also led to a rise in exploits such as server-side request forgery (SSRF). “The attacker can supply or modify a URL which the code running on the server will read or submit data to, and by carefully selecting the URLs, the attacker may be able to read server configuration such as AWS metadata, connect to internal services like http-enabled databases or perform post requests towards internal services which are not intended to be exposed,” HackerOne said.
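A minimal illustration of the SSRF pattern described in that quote (a generic, hypothetical endpoint, not code from any HackerOne report): the server fetches whatever URL the client hands it, so internal-only services suddenly become reachable.

```python
from flask import Flask, request
import requests

app = Flask(__name__)

@app.route("/fetch")
def fetch():
    url = request.args.get("url", "")
    # Vulnerable: no allow-list or scheme/host validation, so ?url= can point at
    # internal services, e.g. http://169.254.169.254/latest/meta-data/ (cloud
    # metadata) or an http-enabled database on the private network.
    return requests.get(url, timeout=5).text
```

The usual fix is to validate the requested host against an allow-list, and to block link-local and private address ranges, before fetching anything on the server side.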
Currently, SSRF makes up only 5.9 percent of the top bugs reported. Nevertheless, the company says, these server-side exploits are trending upward as more and more companies find homes in the cloud.
Other top bounties include a range of code injection exploits or misconfigurations that allow improper access to systems that should be locked down. Companies have paid out over $1.5 million alone to address improper access control.
“Companies that pay more for bounties are definitely more attractive to hackers, especially more attractive to top hackers,” Han said. “But we know that bounties paid out are not the only motivation. Hackers like to hack companies that they like using, or that are located in their country.” In other words, even though a company spends more money paying hackers to find bugs, it doesn’t necessarily mean it is more secure.
“Another factor is how fast a company is changing,” she said. “If a company is developing very rapidly and expanding and growing, even if they pay a lot of bounties, if they’re changing up their code base a lot, then that means they are not necessarily as secure.”
According to an article this year in TechRepublic, some 300,000 hackers are currently signed up with HackerOne; though only 1-in-10 have reportedly claimed a bounty. The best of them, a group of roughly 100 hackers, have earned over $100,000. Only a couple of elite hackers have attained the highest-paying ranks of the program, reaping rewards close to, or in excess of, $1 million.
View a full breakdown of HackerOne’s “most impactful and rewarded” vulnerability types here.
The well-known and respected data breach notification website “Have I Been Pwned” is up for sale.
Troy Hunt, its founder and sole operator, announced the sale on Tuesday in a blog post where he explained why the time has come for Have I Been Pwned to become part of something bigger and more organized.
“To date, every line of code, every configuration and every breached record has been handled by me alone. There is no ‘HIBP team’, there’s one guy keeping the whole thing afloat,” Hunt wrote. “It’s time for HIBP to grow up. It’s time to go from that one guy doing what he can in his available time to a better-resourced and better-funded structure that’s able to do way more than what I ever could on my own.”
Over the years, Have I Been Pwned has become the repository for data breaches on the internet, a place where users can search for their email address and see whether they have been part of a data breach. It’s now also a service where people can sign up to get notified whenever their accounts get breached. It’s perhaps the most useful, free, cybersecurity service in the world.
Spain’s data protection agency has fined La Liga, the nation’s top professional soccer league, 250,000 euros ($283,000 USD) for using the league’s phone app to spy on its fans. With millions of downloads, the app was reportedly being used to surveil bars in an effort to catch establishments playing matches on television without a license.
The La Liga app provides users with schedules, player rankings, statistics, and league news. It also knows when they’re watching games and where.
According to Spanish newspaper El País, the league told authorities that when its apps detected users were in bars the apps would record audio through phone microphones. The apps would then use the recording to determine if the user was watching a soccer game, using technology that’s similar to the Shazam app. If a game was playing in the vicinity, officials would then be able to determine if that bar location had a license to play the game.
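A sketch of the general “Shazam-style” idea described above (a simplified illustration, not La Liga’s actual code): hash the strongest spectral peaks of short audio windows and compare them against a reference fingerprint of the live broadcast.

```python
import numpy as np

def fingerprint(samples: np.ndarray, win: int = 1024) -> set:
    """Map an audio clip to a set of (time slot, dominant frequency bin) pairs."""
    hashes = set()
    for i in range(0, len(samples) - win, win):
        spectrum = np.abs(np.fft.rfft(samples[i:i + win]))
        peak_bin = int(spectrum.argmax())      # loudest frequency in this window
        hashes.add((i // win, peak_bin))
    return hashes

def likely_same_audio(mic: np.ndarray, broadcast: np.ndarray, threshold: float = 0.3) -> bool:
    """Crude match score; assumes the two clips are already time-aligned."""
    fa, fb = fingerprint(mic), fingerprint(broadcast)
    return len(fa & fb) / max(len(fa), 1) > threshold
```

Real systems hash pairs of peaks together with their time offsets, so a few seconds of noisy bar audio can still be matched against the broadcast regardless of when the recording started.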
So not only was the app spying on fans, but it was also turning those fans into unwitting narcs. El Diario reports that the app has been downloaded 10 million times.
On June 6, more than 70,000 BGP routes were leaked from Swiss colocation company Safe Host to China Telecom in Frankfurt, Germany, which then announced them on the global internet. This resulted in a massive rerouting of internet traffic via China Telecom systems in Europe, disrupting connectivity for netizens: a lot of data that should have gone to European cellular networks was instead piped to China Telecom-controlled boxes.
BGP leaks are common – they happen every hour of every day – though the size of this one and particularly the fact it lasted for two hours, rather than seconds or minutes, has prompted more calls for ISPs to join an industry program that adds security checks to the routing system.
The fact that China Telecom, which peers with Safe Host, was again at the center of the problem – with traffic destined for European netizens routed through its network – has also made internet engineers suspicious, although they have been careful not to make any accusations without evidence.
“China Telecom, a major international carrier, has still implemented neither the basic routing safeguards necessary both to prevent propagation of routing leaks nor the processes and procedures necessary to detect and remediate them in a timely manner when they inevitably occur,” noted Oracle Internet Intelligence’s (OII) director of internet analysis Doug Madory in a report. “Two hours is a long time for a routing leak of this magnitude to stay in circulation, degrading global communications.”
A team at network security outfit vpnMentor was scanning cyber-space as part of a web-mapping project when they happened upon a Graylog management server belonging to Tech Data that had been left freely accessible to the public. Within that database, we’re told, was a 264GB cache of information including emails, payment and credit card details, and unencrypted usernames and passwords. Pretty much everything you need to ruin someone’s day (or year).
The exposure, vpnMentor told The Register today, is particularly bad due to the nature of Tech Data’s customers. The Fortune 500 distie provides everything from financing and marketing services to IT management and user training courses. Among the clients listed on its site are Apple, Symantec, and Cisco.
“This is a serious leak as far as we can see, so much so that all of the credentials needed to log in to customer accounts are available,” a spokesperson for vpnMentor told El Reg. “Because of the size of the database, we could not go through all of it and there may be more sensitive information available to the public than what we have disclosed here.”
In addition to the login credentials and card information, the researchers said they were able to find private API keys and logs in the database, as well as customer profiles that included full names, job titles, phone numbers, and email and postal addresses. All available to anyone who could find it.
vpnMentor says it discovered and reported the open database on June 2 to Tech Data, and by June 4 the distie had told the team it had secured the database and hidden it from public view. Tech Data did not respond to a request for comment from The Register. The US-based company did not mention the incident in its most recent SEC filings.
The first representatives of Generation Z have started to trickle into the workplace – and like generations before them, they are bringing a different perspective to things.
Did you know that there are now up to five generations working under any given roof, ranging all the way from the Silent Generation (born pre-WWII) to the aforementioned Gen Z?
Let’s see how these generational groups differ in their approaches to communication, career priorities, and company loyalty.
Generational Differences at Work
Today’s infographic comes to us from Raconteur, and it breaks down some key differences in how generational groups are thinking about the workplace.
Let’s dive deeper into the data for each category.
Communication
How people prefer to communicate is one major and obvious difference that manifests itself between generations.
While many in older generations have dabbled in new technologies and trends around communications, it’s less likely that they will internalize those methods as habits. Meanwhile, for younger folks, these newer methods (chat, texting, etc.) are what they grew up with.
Top three communication methods by generation:
Baby Boomers:
40% of communication is in person, 35% by email, and 13% by phone
Gen X:
34% of communication is in person, 34% by email, and 13% by phone
Millennials:
33% of communication is by email, 31% is in person, and 12% by chat
Gen Z:
31% of communication is by chat, 26% is in person, and 16% by email
Motivators
Meanwhile, the generations are divided on what motivates them in the workplace. Boomers place health insurance as an important decision factor, while younger groups view salary and pursuing a passion as being key elements to a successful career.
Three most important work motivators by generation (in order):
Baby Boomers:
Health insurance, a boss worthy of respect, and salary
Gen X:
Salary, job security, and job challenges/excitement
Millennials:
Salary, job challenges/excitement, and ability to pursue passion
Gen Z:
Salary, ability to pursue passion, and job security
Loyalty
Finally, generational groups have varying perspectives on how long they would be willing to stay in any one role.
Baby Boomers: 8 years
Gen X: 7 years
Millennials: 5 years
Gen Z: 3 years
Given the above differences, employers will have to think clearly about how to attract and retain talent across a wide scope of generations. Further, employers will have to learn what motivates each group, as well as what makes them each feel the most comfortable in the workplace.
The investigation will include a series of hearings held by the Subcommittee on Antitrust, Commercial and Administrative Law on the rise of market power online, as well as requests for information that are relevant to the investigation.
A small number of dominant, unregulated platforms have extraordinary power over commerce, communication and information online. Based on investigative reporting and oversight by international policymakers and enforcers, there are concerns that these platforms have the incentive and ability to harm the competitive process. The Antitrust Subcommittee will conduct a top-to-bottom review of the potential of giant tech platforms to hold monopoly power.
The committee’s investigation will focus on three main areas:
Documenting competition problems in digital markets;
Examining whether dominant firms are engaging in anti-competitive conduct; and
Assessing whether existing antitrust laws, competition policies and current enforcement levels are adequate to address these issues.
“Big Tech plays a huge role in our economy and our world,” said Collins. “As tech has expanded its market share, more and more questions have arisen about whether the market remains competitive. Our bipartisan look at competition in the digital markets gives us the chance to answer these questions and, if necessary, to take action. I appreciate the partnership of Chairman Nadler, Subcommittee Chairman Cicilline and Subcommittee Ranking Member Sensenbrenner on these important issues.”
“The open internet has delivered enormous benefits to Americans, including a surge of economic opportunity, massive investment, and new pathways for education online,” said Nadler. “But there is growing evidence that a handful of gatekeepers have come to capture control over key arteries of online commerce, content, and communications. The Committee has a rich tradition of conducting studies and investigations to assess the threat of monopoly power in the U.S. economy. Given the growing tide of concentration and consolidation across our economy, it is vital that we investigate the current state of competition in digital markets and the health of the antitrust laws.”
“Technology has become a crucial part of Americans’ everyday lives,” said Sensenbrenner. “As the world becomes more dependent on a digital marketplace, we must discuss how the regulatory framework is built to ensure fairness and competition. I believe these hearings can be informative, but it is important for us to avoid any predetermined conclusions. I thank Chairman Nadler, Ranking Member Collins, and Chairman Cicilline as we begin these bipartisan discussions.”
“The growth of monopoly power across our economy is one of the most pressing economic and political challenges we face today. Market power in digital markets presents a whole new set of dangers,” said Cicilline. “After four decades of weak antitrust enforcement and judicial hostility to antitrust cases, it is vital for Congress to step in to determine whether existing laws are adequate to tackle abusive conduct by platform gatekeepers or if we need new legislation.”
Basically they are looking at how antitrust works, which is a great thing, because recently antitrust in the US has focused on consumer prices and ignored everything else. With Amazon’s pricing practices, that is not the right way to look at things. Have a look at my talk on this if you’re interested.
Yale researchers have figured out how to catch and save Schrödinger’s famous cat, the symbol of quantum superposition and unpredictability, by anticipating its jumps and acting in real time to save it from proverbial doom. In the process, they overturn years of cornerstone dogma in quantum physics.
The discovery enables researchers to set up an early warning system for imminent jumps of artificial atoms containing quantum information. A study announcing the discovery appears in the June 3 online edition of the journal Nature.
[…]
A quantum jump is the discrete (non-continuous) and random change in the state of a quantum system when that state is observed.
The experiment, performed in the lab of Yale professor Michel Devoret and proposed by lead author Zlatko Minev, peers into the actual workings of a quantum jump for the first time. The results reveal a surprising finding that contradicts Danish physicist Niels Bohr’s established view—the jumps are neither abrupt nor as random as previously thought.
For a tiny object such as an electron, molecule, or an artificial atom containing quantum information (known as a qubit), a quantum jump is the sudden transition from one of its discrete energy states to another. In developing quantum computers, researchers crucially must deal with the jumps of the qubits, which are the manifestations of errors in calculations.
The enigmatic quantum jumps were theorized by Bohr a century ago, but not observed until the 1980s, in atoms.
“These jumps occur every time we measure a qubit,” said Devoret, the F.W. Beinecke Professor of Applied Physics and Physics at Yale and member of the Yale Quantum Institute. “Quantum jumps are known to be unpredictable in the long run.”
“Despite that,” added Minev, “We wanted to know if it would be possible to get an advance warning signal that a jump is about to occur imminently.”
Minev noted that the experiment was inspired by a theoretical prediction by professor Howard Carmichael of the University of Auckland, a pioneer of quantum trajectory theory and a co-author of the study.
In addition to its fundamental impact, the discovery is a potential major advance in understanding and controlling quantum information. Researchers say reliably managing quantum data and correcting errors as they occur is a key challenge in the development of fully useful quantum computers.
The Yale team used a special approach to indirectly monitor a superconducting artificial atom, with three microwave generators irradiating the atom enclosed in a 3-D cavity made of aluminum. The doubly indirect monitoring method, developed by Minev for superconducting circuits, allows the researchers to observe the atom with unprecedented efficiency.
Microwave radiation stirs the artificial atom as it is simultaneously being observed, resulting in quantum jumps. The tiny quantum signal of these jumps can be amplified without loss to room temperature. Here, their signal can be monitored in real time. This enabled the researchers to see a sudden absence of detection photons (photons emitted by an ancillary state of the atom excited by the microwaves); this tiny absence is the advance warning of a quantum jump.
“The beautiful effect displayed by this experiment is the increase of coherence during the jump, despite its observation,” said Devoret. Added Minev, “You can leverage this to not only catch the jump, but also reverse it.”
This is a crucial point, the researchers said. While quantum jumps appear discrete and random in the long run, reversing a quantum jump means the evolution of the quantum state possesses, in part, a deterministic and not random character; the jump always occurs in the same, predictable manner from its random starting point.
“Quantum jumps of an atom are somewhat analogous to the eruption of a volcano,” Minev said. “They are completely unpredictable in the long term. Nonetheless, with the correct monitoring we can with certainty detect an advance warning of an imminent disaster and act on it before it has occurred.”
Tinder users in Russia may now have to decide whether the perks of dating apps outweigh a disconcerting invasion of privacy. Russian authorities are now requiring that the dating app hand over a wealth of intimate user data, including private messages, if and when it asks for them.
Tinder is the fourth dating app in the nation to be forced to comply with the Russian government’s request for user data, Moscow Times reports, and it’s among 175 services that have already consented to share information with the nation’s Federal Security Service, according to a registry online.
Tinder was added to the list of services that have to comply with the Russian data requests last Friday, May 31. The data Tinder must collect and provide to Russia upon request includes user data and all communications including audio and video. According to Tinder’s privacy policy, it does collect all your basic profile details, such as your date of birth and gender as well as the content you publish and your chats with other users, among other information. Which means the Russian government could get its hands on your sexts, your selfies, and even details on where you’ve been or where you might be going if it wants to.
It’s unclear if the possible data requests will apply to just Tinder users within Russia or any users of the dating app, regardless of where they are. If it’s the latter, it points to an unsettling reality in which one nation is able to extend its reach into the intimate data of people all over the world by simply making the request to any complying service that happens to also operate in Russia.
We have reached out to Tinder about which users this applies to, whether it will comply with this request, and what type of data it will share with the Russian authorities. We will update when we hear back. According to the Associated Press, Russia’s communications regulator confirmed on Monday that the company had shared information with it.
The Russian government is not only targeting Tinder. As the lengthy registry online indicates, a large and diverse range of services are already on the list and have been for years. This includes Snap, Wechat, Vimeo, and Badoo, another popular dating app in Russia.
Telegram famously objected to the Russian authorities’ request for its encryption keys last year, which resulted in the government banning the encrypted messaging app. It was an embarrassing mess for Russian internet service providers, which in their attempt to block workarounds for the messaging app, disrupted a litany of services online.
Clinical lab testing titan Quest Diagnostics acknowledged in a press release on Monday that an “unauthorized user” had gained access to personal information on around 11.9 million customers, including some financial and medical data.
Per NBC News, news of the breach comes by way of a Securities and Exchange Commission filing in which Quest wrote that American Medical Collection Agency (AMCA), which provides billing collection services to Quest contractor Optum360, had notified it of the breach in mid-May. NBC wrote that Quest said AMCA’s web payments page had possibly been compromised from Aug. 1, 2018 to March 30, 2019.
In its statement, Quest wrote that compromised information could include “certain financial data,” Social Security numbers, and some medical material—but not the results of laboratory tests on patients. It also wrote the extent of the breach remained unclear:
AMCA believes this information includes personal information, including certain financial data, Social Security numbers, and medical information, but not laboratory test results.
AMCA has not yet provided Quest or Optum360 detailed or complete information about the AMCA data security incident, including which information of which individuals may have been affected. And Quest has not been able to verify the accuracy of the information received from AMCA.
Quest added that it had “suspended” sending collections requests to AMCA. According to the Wall Street Journal, a spokesperson for Optum360 parent company UnitedHealth said their Optum360 systems were unaffected by the breach.
Owners of Supra Smart Cloud TVs are in danger of getting some unwanted programming: it’s possible for miscreants or malware on your Wi-Fi network to switch whatever you’re watching for video of their or its choosing.
Bug-hunter Dhiraj Mishra laid claim to CVE-2019-12477, a remote file inclusion zero-day vulnerability that allows anyone with local network access to specify their own video to display on the TV, overriding whatever is being shown, with no password necessary. As such, it’s more likely to be used by mischievous family members than by hackers.
Mishra told The Register the issue is due to a complete lack of any authentication or session management in the software controlling the Wi-Fi-connected telly. By crafting a malicious HTTP GET request, and sending it to the set over the network, an attacker would be able to provide whatever video URL they desired to the target, and have the stream played on the TV without any sort of security check.
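A sketch of what such an unauthenticated request could look like (the endpoint path and parameter names below are assumptions for illustration, not confirmed details of the Supra firmware):

```python
import requests

TV_IP = "192.168.1.55"  # hypothetical address of the TV on the local network

# A single GET request, with no credentials or session token, telling the set
# which stream to play. The path and parameters here are illustrative only.
requests.get(
    f"http://{TV_IP}/remote/media_control",
    params={"action": "setUri", "uri": "http://attacker.lan/fake_broadcast.mp4"},
    timeout=5,
)
```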
The Australian National University (ANU) today copped to a fresh breach in which intruders gained access to “significant amounts” of data stretching back 19 years.
The top-ranked Oz uni said it noticed about a fortnight ago that hackers had got their claws on staff, visitor and student data, including names, addresses, dates of birth, phone numbers, personal email addresses, emergency contact details, tax file numbers, payroll information, bank account details and passport details. It said the breach took place in “late 2018” – the same year it ‘fessed up to another lengthy attack.
Students will be miffed to find out that someone knows they had to retake second-year Statistics since academic records were also accessed.
The uni insisted: “The systems that store credit card details, travel information, medical records, police checks, workers’ compensation, vehicle registration numbers, and some performance records have not been affected.”
Advanced Driver Assistance Systems (ADAS) in cars, such as automatic braking, road-condition detection, blind-spot monitoring and navigation, will be sharing their data with European countries, car manufacturers and presumably insurers, under the cloak of making driving safer. I’m sure it will make driving safer, but I still don’t feel comfortable having the government know where I am at all times and what my driving style is like.