Software below the poverty line – Open Source Developers being exploited

However, I recently met other open source developers who make a living from donations, and they helped widen my perspective. At Amsterdam.js, I heard Henry Zhu speak about sustainability in the Babel project and beyond, and it painted a pretty dire picture. Later, over breakfast, Henry and I had a deeper conversation on this topic. In Amsterdam I also met up with Titus, who maintains the Unified project full-time. Meeting these people confirmed my belief in the donation model for sustainability. It works. But what really stood out to me was the question: is it fair?

I decided to collect data from OpenCollective and GitHub to take a more scientific sample of the situation. The results were shocking: there were only two clearly sustainable open source projects, while the majority (more than 80%) of projects that we usually consider sustainable actually receive income below industry standards, or even below the poverty threshold.

What the data says

I picked popular open source projects from OpenCollective, and selected the yearly income data from each. Then I looked up their GitHub repositories, to measure the count of stars, and how many “full-time” contributors they have had in the past 12 months. Sometimes I also looked up the Patreon pages for those few maintainers that had one, and added that data to the yearly income for the project. For instance, it is obvious that Evan You gets money on Patreon to work on Vue.js. These data points allowed me to measure: project popularity (a proportional indicator of the number of users), yearly revenue for the whole team, and team size.

[…]

Those that work full-time sometimes complement their income with savings or by living in a country with lower costs of living, or both (Sindre Sorhus).

Then, based on the latest StackOverflow developer survey, we know that the low end of developer salaries is around $40k, while the high end is above $100k. That range represents the industry standard for developers, given their status as knowledge workers, many of whom live in OECD countries. This allowed me to classify the results into four categories:

  • BLUE: 6-figure salary
  • GREEN: 5-figure salary within industry standards
  • ORANGE: 5-figure salary below our industry standards
  • RED: salary below the official US poverty threshold
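As a sketch, the bucketing above can be written out directly. The $40k and $100k cut-offs come from the survey figures quoted in the post; the poverty threshold of $12,490 (the 2019 US federal guideline for a single person) is my assumption, since the post doesn't state the exact figure it used.

```python
# Hedged sketch of the four-bucket classification described above.
POVERTY = 12_490        # assumed: 2019 US poverty guideline, household of 1
LOW_INDUSTRY = 40_000   # low end of developer salaries (StackOverflow survey)
HIGH_INDUSTRY = 100_000 # high end of developer salaries

def classify(salary_per_maintainer: float) -> str:
    if salary_per_maintainer >= HIGH_INDUSTRY:
        return "BLUE"    # 6-figure salary
    if salary_per_maintainer >= LOW_INDUSTRY:
        return "GREEN"   # 5-figure, within industry standards
    if salary_per_maintainer >= POVERTY:
        return "ORANGE"  # below industry standards, above poverty
    return "RED"         # below the official US poverty threshold

print(classify(110_000))  # BLUE
print(classify(9_000))    # RED
```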

The first chart, below, shows team size and “price” for each GitHub star.

Open source projects, income-per-star versus team size

More than 50% of projects are red: they cannot sustain their maintainers above the poverty line. 31% of the projects are orange: their developers are working for a salary that would be considered unacceptable in our industry. 12% are green, and only 3% are blue: Webpack and Vue.js. Income per GitHub star matters: sustainable projects generally earn above $2/star, while the median value is just $1.22/star. Team size also matters for sustainability: the smaller the team, the more likely it can sustain its maintainers.
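To make the per-star figure concrete, here is a minimal sketch; the ~$2/star sustainability mark is the post's figure, while the project numbers used below are purely hypothetical:

```python
def income_per_star(yearly_revenue: float, stars: int) -> float:
    """Dollars of yearly donation income per GitHub star."""
    return yearly_revenue / stars

# Hypothetical project: $24k/year in donations across 30,000 stars.
ratio = income_per_star(24_000, 30_000)
print(f"${ratio:.2f}/star")  # $0.80/star
print(ratio >= 2.0)          # False: below the ~$2/star sustainability mark
```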

The median donation per year is $217, which sounds substantial at the individual level, but in reality it includes sponsorships from companies that donate partly for their own marketing purposes.

The next chart shows how revenue scales with popularity.

Open source projects, yearly revenue versus GitHub stars

You can browse the data yourself by accessing this Dat archive with a LibreOffice Calc spreadsheet:

dat://bf7b912fff1e64a52b803444d871433c5946c990ae51f2044056bf6f9655ecbf
 [...]

The total amount of money being put into open source is not enough to support all the maintainers. If we add up all of the yearly revenue from the projects in this data set, it’s $2.5 million. The median salary is approximately $9k, which is below the poverty line. If that money were split evenly, each project would get roughly $22k, which is still below industry standards.
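A quick back-of-the-envelope check of those aggregates (the project count of roughly 114 is inferred from the post's figures, not stated in it):

```python
total_yearly_revenue = 2_500_000  # sum over all projects in the data set
even_share = 22_000               # the post's "roughly $22k" per project

implied_projects = total_yearly_revenue / even_share
print(round(implied_projects))    # ~114 projects, each still short of the $40k low end
```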

The core problem is not that open source projects are failing to share the money they receive. The problem is that, in total numbers, open source is not getting enough money. $2.5 million is not enough. To put this number into perspective, a single startup typically raises much more than that.

Tidelift has received $40 million in funding to “help open source creators and maintainers get fairly compensated for their work” (quote). They have a team of 27 people, some of them ex-employees of large companies such as Google and GitHub. They probably don’t receive the lower tier of salaries. Yet many of the open source projects they showcase on their website are below the poverty line in terms of income from donations.

[…]

GitHub was bought by Microsoft for $7.5 billion. To make that quantity easier to grok: the amount of money Microsoft paid to acquire GitHub – the company – is more than 3,000x what the open source community receives yearly. In other words, if the open source community saved every penny it ever received, after about 3,000 years it might have enough money to buy GitHub jointly.
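The multiplier checks out directly from the two figures in this data set:

```python
github_acquisition = 7_500_000_000  # Microsoft's price for GitHub
oss_yearly_income = 2_500_000       # total yearly donations in the data set

# 3000.0 — i.e. roughly 3,000 years of saving every penny
print(github_acquisition / oss_yearly_income)
```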

[…]

If Microsoft GitHub is serious about helping fund open source, they should put their money where their mouth is: donate at least $1 billion to open source projects. Even a mere $1.5 million per year would be enough to make all the projects in this study turn green. The Matching Fund in GitHub Sponsors is not enough: it gives a maintainer at most $5k in a year, which is not sufficient to raise them from the poverty threshold up to industry standard.

Source: André Staltz – Software below the poverty line

Unfortunately I’ve been talking about this for years now.

It’s time to make open source open but less free for the big users.

Anyone else find it weird that the bloke tasked with probing tech giants for antitrust abuses used to, um, work for the same tech giants?

The man heading up any potentially US government antitrust probes into tech giants like Apple and Google used to work for… Apple and Google.

In the revolving-door world that is Washington DC, that conflict may not seem like much but one person isn’t having it: Senator Elizabeth Warren (D-MA) this week sent Makan Delrahim a snotagram in which she took issue with him overseeing tech antitrust efforts.

“I am writing to urge you to recuse yourself from the Department of Justice’s (DOJ) reported antitrust investigations into Google and Apple,” she wrote. “Although you are the chief antitrust attorney in the DoJ, your prior work lobbying the federal government on behalf of these and other companies in antitrust matters compromises your ability to manage or advise on this investigation without real or perceived conflicts of interest.”

Warren then outlines precisely what she means by conflict of interests: “In 2007, Google hired you to lobby federal antitrust officials on behalf of the company’s proposed acquisition of online advertising company DoubleClick, a $3.1 billion merger that the federal government eventually signed off on… You reported an estimated $100,000 in income from Google in 2007.”

It’s not just Google either. “In addition to the investigation into Google, the DoJ will also have jurisdiction over Apple. In both 2006 and 2007, Apple hired you to lobby the federal government on its behalf on patent reform issues,” Warren continues.

She notes: “Federal ethics law requires that individuals recuse themselves from any ‘particular matter involving specific parties’ if ‘the circumstances would cause a reasonable person with knowledge of the relevant facts to question his impartiality in the matter.’ Given your extensive and lucrative previous work lobbying the federal government on behalf of Google and Apple… any reasonable person would surely question your impartiality in antitrust matters…”

This is fine

Delrahim has also done work for a range of other companies including Anthem, Pfizer, Qualcomm, and Caesars but it’s the fact that he has specific knowledge and connections with the very highest levels of tech giants while being in charge of one of the most anticipated antitrust investigations of the past 30 years that has got people concerned.

This is ridiculous, of course, because Delrahim is a professional and works for whoever hires him. It’s not as if he would do something completely inappropriate like give a speech outside the United States in which he walks through exactly how he would carry out an antitrust investigation into tech giants and the holes that would exist in such an investigation, thereby giving them a clear blueprint to work against.

Because that would be nuts.

He definitely did not do that. What he actually did was talk about how it was possible to investigate tech giants, despite some claiming it wasn’t – which is, you’ll understand, quite the opposite.

“The Antitrust Division does not take a myopic view of competition,” Delrahim said during a speech in Israel this week. “Many recent calls for antitrust reform, or more radical change, are premised on the incorrect notion that antitrust policy is only concerned with keeping prices low. It is well-settled, however, that competition has price and non-price dimensions.”

Instead, he noted: “Diminished quality is also a type of harm to competition… As an example, privacy can be an important dimension of quality. By protecting competition, we can have an impact on privacy and data protection.”

So that’s diminished quality and privacy as lines of attack. Anything else, Makan?

“Generally speaking, an exclusivity agreement is an agreement in which a firm requires its customers to buy exclusively from it, or its suppliers to sell exclusively to it. There are variations of this restraint, such as requirements contracts or volume discounts,” he mused at the Antitrust New Frontiers Conference in Tel Aviv.

Source: Anyone else find it weird that the bloke tasked with probing tech giants for antitrust abuses used to, um, work for the same tech giants? • The Register

So it looks as though he is ignoring most of what makes this antitrust behaviour predatory: he’s mainly looking at price, with a nod to quality and privacy. Except he’s not really looking at quality and privacy. Or leverage. Or the waterbed effect. Or undercutting. Or product copying. Or vertical integration. Or aggression.

For more on why monopolies are bad, check out

 

Facing Antitrust Pressure, Google Starts Spinning Its Own Too Big to Fail Argument

In an interview this week with CNN, Google CEO Sundar Pichai attempted to turn antitrust questions around by pointing to what he says is the silver lining of size: big beats China. In the face of an intensifying push for antitrust action, the argument has been called tech’s version of “too big to fail.”

“Scale does offer many benefits, it’s important to understand that,” Google CEO Sundar Pichai said. “As a company, we sometimes invest five, ten years ahead without necessarily worrying about short term profits. If you think about how technology leadership contributes to leadership on a global economic scale. Big companies are what are investing in AI the most. There are many benefits to taking a long term view which big companies are able to do.”

Pichai, who did allow that scrutiny and competition were ultimately good things, made points echoing arguments from Facebook CEO Mark Zuckerberg, who put the case far more bluntly.

“I think you have this question from a policy perspective, which is, ‘Do we want American companies to be exporting across the world?’” Zuckerberg said last year. “I think that the alternative, frankly, is going to be the Chinese companies.”

Pichai never outright said the word “China”, but he didn’t have to. China’s rising tech industry and increasingly tense relationship with the United States loomed over the whole exchange.

“There are many countries around the world which aspire to be the next Silicon Valley. And they are supporting their companies, too,” Pichai said to CNN. “So we have to balance both. This doesn’t mean you don’t scrutinize large companies. But you have to balance it with the fact that you want big, successful companies as well.”

This has been one of Silicon Valley’s safest fallback arguments since antitrust sentiment began gaining steam in the United States. But the history of American industry offers a serious counterweight.

Columbia Law School professor Tim Wu spent much of 2018 outlining the case for antitrust action. He wrote a book on the subject, The Curse of Bigness: Antitrust in the New Gilded Age, and appeared all over media to make his argument. In an op-ed for the New York Times, Wu called back to the heated Japanese-American tech competition of the 1980s.

IBM faced an unprecedented international challenge in the mainframe market from Japan’s NEC while Sony, Panasonic, and Toshiba made giant leaps forward. The companies had the strong support of the Japanese government.

Wu laid out what happened next:

Had the United States followed the Zuckerberg logic, we would have protected and promoted IBM, AT&T and other American tech giants — the national champions of the 1970s. Instead, the federal government accused the main American tech firms of throttling competition. IBM was subjected to a devastating, 13-year-long antitrust investigation and trial, and the Justice Department broke AT&T into eight pieces in 1984. And indeed, the effect was to weaken some of America’s most powerful tech firms at a key moment of competition with a foreign power.

But something else happened as well. With IBM and AT&T under constant scrutiny, a whole series of industries and companies were born without fear of being squashed by a monopoly. The American software industry, freed from IBM, came to life, yielding companies like Microsoft, Sun and Lotus. Personal computers from Apple and other companies became popular, and after the breakup of AT&T, companies like CompuServe and America Online rushed into online networking, eventually yielding what we now call the “internet economy.”

Silicon Valley’s argument does resonate, however. The 1980s are not the 2010s, and the relationship between China and the U.S. today is significantly colder and more complex than that between Japan and the U.S. three decades ago.

American politicians have echoed some of big tech’s concerns about Chinese leadership.

Congress just opened what promises to be a lengthy antitrust investigation into big tech that barely talked about China.

Source: Facing Antitrust Pressure, Google Starts Spinning Its Own Too Big to Fail Argument

I’d agree with Wu – the China argument is a fear trap. Antitrust history – in the tech, oil and telephony industries, among others – has shown that when titans fall, many smaller, agile and much more innovative companies spring up to take their place, fueling employment gains, exports and better lifestyles for all of us.

Phantom Brigade – turn based mech game where you can see into the future

Phantom Brigade is a hybrid turn-based & real-time tactical RPG, focusing on in-depth customization and player driven stories. As the last surviving squad of mech pilots, you must capture enemy equipment and facilities to level the playing field. Outnumbered and out-gunned, lead The Brigade through a desperate campaign to retake their war-torn homeland.

 

Source: Phantom Brigade | Brace Yourself Games

We Have Detected Signs of Our Milky Way Colliding With Another Galaxy

According to new research, Antlia 2’s current position is consistent with a collision with the Milky Way hundreds of millions of years ago that could have produced the perturbations we see today. The paper has been submitted for publication and is undergoing peer review.

Antlia 2 was a bit of a surprise when it showed up in the second Gaia mission data release last year. It’s really close to the Milky Way – one of our satellite galaxies – and absolutely enormous, about the size of the Large Magellanic Cloud.

But it’s incredibly diffuse and faint, and hidden from view by the galactic disc, so it managed to evade detection.

That data release also showed in greater detail ripples in the Milky Way’s disc. But astronomers had known about perturbations in that region of the disc for several years by that point, even if the data wasn’t as clear as that provided by Gaia.

It was based on this earlier information that, in 2009, astrophysicist Sukanya Chakrabarti of the Rochester Institute of Technology and colleagues predicted the existence of a dwarf galaxy dominated by dark matter in pretty much the exact location Antlia 2 was found nearly a decade later.

Using the new Gaia data, the team calculated Antlia 2’s past trajectory, and ran a series of simulations. These produced not just the dwarf galaxy’s current position, but the ripples in the Milky Way’s disc by way of a collision less than a billion years ago.

Simulation of the collision: the gas distribution is on the left, stars on the right. (RIT)

Previously, a different team of researchers had attributed these perturbations to an interaction with the Sagittarius Dwarf Spheroidal Galaxy, another of the Milky Way’s satellites.

Chakrabarti and her team also ran simulations of this scenario, and found that the Sagittarius galaxy’s gravity probably isn’t strong enough to produce the effects observed by Gaia.

“Thus,” the researchers wrote in their paper, “we argue that Antlia 2 is the likely driver of the observed large perturbations in the outer gas disk of the Galaxy.”

Source: We Have Detected Signs of Our Milky Way Colliding With Another Galaxy

Storm in a teacup: Linux command-line editors do what they’re supposed to do, are called “vulnerable to high-severity bugs” by ‘researcher’

A bug impacting the editors Vim and Neovim could allow trojan code to escape sandbox mitigations.

A high-severity bug impacting two popular command-line text editing applications, Vim and Neovim, allows remote attackers to execute arbitrary OS commands. Security researcher Armin Razmjou warned that exploiting the bug is as easy as tricking a target into clicking on a specially crafted text file in either editor.

Razmjou’s PoC is able to bypass modeline mitigations, which execute value expressions in a sandbox. That’s to prevent somebody from creating a trojan horse text file in modelines, the researcher said.

“However, the :source! command (with the bang [!] modifier) can be used to bypass the sandbox. It reads and executes commands from a given file as if typed manually, running them after the sandbox has been left,” according to the PoC report.

Vim and Neovim have both released patches for the bug (CVE-2019-12735) that the National Institute of Standards and Technology warns, “allows remote attackers to execute arbitrary OS commands via the :source! command in a modeline.”

“Beyond patching, it’s recommended to disable modelines in the vimrc (set nomodeline), to use the securemodelines plugin, or to disable modelineexpr (since patch 8.1.1366, Vim-only) to disallow expressions in modelines,” the researcher said.
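That hardening advice translates into a few lines of vimrc; this is a sketch rather than the researcher’s exact configuration, and the modelineexpr option assumes Vim patch 8.1.1366 or later:

```vim
" Safest option: ignore modelines entirely
set nomodeline

" Alternative (Vim 8.1.1366+): keep modelines but forbid expressions in them
" set modeline
" set nomodelineexpr
```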

Source: Linux Command-Line Editors Vulnerable to High-Severity Bug | Threatpost

First off, you can’t click in vi, but OK. Second, the whole idea is that you can run commands from vi. So basically he’s calling a feature a flaw.