Software Development Environments Move to the Cloud

If you’re a newly hired software engineer, setting up your development environment can be tedious. If you’re lucky, your company will have a documented, step-by-step process to follow. But even that doesn’t guarantee you’ll be up and running quickly. And whenever you’re tasked with updating your environment, you’ll go through the same time-consuming process again. With different platforms, tools, versions, and dependencies to grapple with, you’ll likely hit bumps along the way.

Austin-based startup Coder aims to ease this process by bringing development environments to the cloud. “We grew up in a time where [Microsoft] Word documents changed to Google Docs. We were curious why this wasn’t happening for software engineers,” says John A. Entwistle, who founded Coder along with Ammar Bandukwala and Kyle Carberry in 2017. “We thought that if you could move the development environment to the cloud, there would be all sorts of cool workflow benefits.”

With Coder, software engineers access a preconfigured development environment in a browser on any device, instead of launching an integrated development environment installed on their own computers. This convenience allows developers to learn a new code base more quickly and start writing code right away.

[…]

Yet cloud-based platforms have their limitations, the most crucial of which is that they require reliable Internet service. “We have support for intermittent connections, so if you lose connection for a few seconds, you don’t lose everything. But you do need access to the Internet,” says Entwistle. There’s also the task of setting up and configuring your team’s development environment before getting started on Coder, but once that’s done, you can share the predefined environment with the whole team.

To ensure security, all source code and related development activities are hosted on a company’s infrastructure—Coder doesn’t host any data. Organizations can deploy Coder on their private servers or on cloud computing platforms such as Amazon Web Services or Google Cloud Platform. This option could be advantageous for banks, defense organizations, and other companies handling sensitive data. In fact, one of Coder’s customers is the U.S. Air Force, and the startup closed a US $30 million Series B funding round last month (bringing its total funding to $43 million), with In-Q-Tel, a venture capital firm with ties to the U.S. Central Intelligence Agency, as one of its backers.

Source: Software Development Environments Move to the Cloud – IEEE Spectrum

Lockdown-Ignoring Sweden Now Has Nordic Europe’s Highest Per-Capita Death Rate and Only 7.3% Antibodies

Sweden’s death rate per million (376) “is far in advance of Norway’s (44), Denmark’s (96) and Finland’s (55) — countries with similar welfare systems and demographics, but which imposed strict lockdowns…” reports the Guardian, “raising concerns that the country’s light-touch approach to the coronavirus may not be helping it build up broad immunity.”

“According to the scientific online publication Ourworldindata.com, Covid-19 deaths in Sweden were the highest in Europe per capita in a rolling seven-day average between 12 and 19 May. The country’s 6.25 deaths per million inhabitants a day was just above the UK’s 5.75.”

Slashdot reader AleRunner writes: Immunity levels in Sweden, which were expected to reach 33% by the start of May, have been measured at only 7.3%, suggesting that Sweden’s lighter lockdown may continue indefinitely whilst other countries begin to revive their economies. Writing about the new Swedish antibody results in the Guardian, Jon Henley goes on to report that other European countries, like Finland, are already considering blocking travel from Sweden, which may increase Sweden’s long-term isolation.

We have discussed before whether Sweden, which locked down earlier than most but with fewer restrictions, could be a model for other countries.

As it is now, the country is looking more like a warning to the rest of the world.

The Guardian concludes that the Swedish government’s decision to avoid a strict lockdown “is thought unlikely to spare the Swedish economy. Although retail and entertainment spending has not collapsed quite as dramatically as elsewhere, analysts say the country will probably not reap any long-term economic benefit.”

Source: Lockdown-Ignoring Sweden Now Has Europe’s Highest Per-Capita Death Rate – Slashdot

A drastic reduction in hardware overhead for quantum computing with new error-correcting techniques

A scientist at the University of Sydney has achieved what one quantum industry insider has described as “something that many researchers thought was impossible”.

Dr. Benjamin Brown from the School of Physics has developed a type of error-correcting code for quantum computers that will free up more hardware to do useful calculations. It also provides an approach that will allow companies like Google and IBM to design better quantum microchips.

He did this by applying an already known code that operates in three dimensions to a two-dimensional framework.

“The trick is to use time as the third dimension. I’m using two physical dimensions and adding in time as the third dimension,” Dr. Brown said. “This opens up possibilities we didn’t have before.”

His research is published today in Science Advances.

“It’s a bit like knitting,” he said. “Each row is like a one-dimensional line. You knit row after row of wool and, over time, this produces a two-dimensional panel of material.”

Fault-tolerant quantum computers

Reducing errors in quantum computing is one of the biggest challenges facing scientists before they can build machines large enough to solve useful problems.

“Because quantum information is so fragile, it produces a lot of errors,” said Dr. Brown, a research fellow at the University of Sydney Nano Institute.

Completely eradicating these errors is impossible, so the goal is to develop a “fault-tolerant” architecture where useful processing operations far outweigh error-correcting operations.

“Your mobile phone or laptop will perform billions of operations over many years before a single error triggers a blank screen or some other malfunction. Current quantum operations are lucky to have fewer than one error for every 20 operations—and that means millions of errors an hour,” said Dr. Brown who also holds a position with the ARC Centre of Excellence for Engineered Quantum Systems.

“That’s a lot of dropped stitches.”
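For a rough sense of scale, the arithmetic behind “millions of errors an hour” can be sketched in a few lines. The one-error-in-20-operations figure comes from Dr. Brown’s comparison above; the one-megahertz operation rate is an assumed, illustrative number, since real machines vary widely.

```python
# Back-of-the-envelope scale of the error problem Dr. Brown describes.
# The ~1-in-20 error rate is from the article; the gate rate is an
# assumed, illustrative figure, not a measured one.
error_rate = 1 / 20          # errors per operation (from the article)
ops_per_second = 1_000_000   # hypothetical 1 MHz operation rate

errors_per_hour = error_rate * ops_per_second * 3600
print(f"{errors_per_hour:,.0f} errors per hour")  # 180,000,000
```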

Most of the building blocks in today’s experimental quantum computers—quantum bits or qubits—are taken up by the “overhead” of error correction.

“My approach to suppressing errors is to use a code that operates across the surface of the architecture in two dimensions. The effect of this is to free up a lot of the hardware from error correction and allow it to get on with the useful stuff,” Dr. Brown said.
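Dr. Brown’s construction itself is far more sophisticated, but the basic trade-off he describes, spending extra hardware on redundancy to suppress errors, can be illustrated with the simplest possible example: a classical three-bit repetition code with majority-vote decoding. Everything in this sketch (the code, the error model, the 1-in-20 flip probability) is an illustrative stand-in, not the scheme from the paper.

```python
import random

def encode(bit):
    """Encode one logical bit as three physical bits (the 'overhead')."""
    return [bit] * 3

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote corrects any single bit flip."""
    return int(sum(bits) >= 2)

p = 0.05          # illustrative physical error rate: 1 in 20
trials = 100_000
failures = sum(decode(apply_noise(encode(0), p)) for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} vs physical rate {p}")
# Expect roughly 3*p^2 ~ 0.007: errors are suppressed, at the cost of
# tripling the hardware -- the overhead trade-off the article describes.
```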

Dr. Naomi Nickerson is Director of Quantum Architecture at PsiQuantum in Palo Alto, California, and is not connected to the research. She said: “This result establishes a new option for performing fault-tolerant gates, which has the potential to greatly reduce overhead and bring practical quantum computing closer.”

Source: A stitch in time: How a quantum physicist invented new code from old tricks

More information: Science Advances (2020). DOI: 10.1126/sciadv.eaay4929, advances.sciencemag.org/content/6/21/eaay4929

Breathing Habits Are Related To Physical and Mental Health

Breathing is a missing pillar of health, and our attention to it is long overdue. Most of us misunderstand breathing. We see it as passive, something that we just do. Breathe, live; stop breathing, die. But breathing is not that simple and binary. How we breathe matters, too. Inside the breath you just took, there are more molecules of air than there are grains of sand on all the world’s beaches. We each inhale and exhale some 30 pounds of these molecules every day — far more than we eat or drink. The way that we take in that air and expel it is as important as what we eat, how much we exercise and the genes we’ve inherited. This idea may sound nuts, I realize. It certainly sounded that way to me when I first heard it several years ago while interviewing neurologists, rhinologists and pulmonologists at Stanford, Harvard and other institutions. What they’d found is that breathing habits were directly related to physical and mental health.

Today, doctors who study breathing say that the vast majority of Americans do it inadequately. […] But it’s not all bad news. Unlike problems with other parts of the body, such as the liver or kidneys, we can improve the airways in our too-small mouths and reverse the entropy in our lungs at any age. We can do this by breathing properly. […] [T]he first step in healthy breathing: extending breaths to make them a little deeper, a little longer. Try it. For the next several minutes, inhale gently through your nose to a count of about five and then exhale, again through your nose, at the same rate or a little more slowly if you can. This works out to about six breaths a minute. When we breathe like this we can better protect the lungs from irritation and infection while boosting circulation to the brain and body. Stress on the heart relaxes; the respiratory and nervous systems enter a state of coherence where everything functions at peak efficiency. Just a few minutes of inhaling and exhaling at this pace can drop blood pressure by 10, even 15 points. […] [T]he second step in healthy breathing: Breathe through your nose. Nasal breathing not only helps with snoring and some mild cases of sleep apnea, it also can allow us to absorb around 18% more oxygen than breathing through our mouths. It reduces the risk of dental cavities and respiratory problems and likely boosts sexual performance. The list goes on.
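The arithmetic is straightforward: roughly five seconds in plus five seconds out is ten seconds per breath, or six breaths a minute. As a toy illustration only (the counts come from the article; this is not medical guidance), a paced-breathing timer could look like this:

```python
import time

# Toy breath-pacing timer using the article's numbers: ~5 s inhaling
# through the nose, ~5 s exhaling, i.e. 10 s per breath and 6 breaths
# per minute. Illustrative only -- not medical guidance.
INHALE_SECONDS = 5
EXHALE_SECONDS = 5
BREATHS_PER_MINUTE = 6

for i in range(1, BREATHS_PER_MINUTE + 1):
    print(f"breath {i}: inhale through the nose...")
    time.sleep(INHALE_SECONDS)
    print(f"breath {i}: exhale...")
    time.sleep(EXHALE_SECONDS)
print("one minute of paced breathing complete")
```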

Source: Breathing Habits Are Related To Physical and Mental Health – Slashdot

Linux not Windows: Why Munich is shifting back from Microsoft to open source – again

In a notable U-turn for the city, newly elected politicians in Munich have decided that its administration needs to use open-source software, instead of proprietary products like Microsoft Office.

“Where it is technologically and financially possible, the city will put emphasis on open standards and free open-source licensed software,” a new coalition agreement negotiated between the recently elected Green party and the Social Democrats says.

The agreement was finalized Sunday and the parties will be in power until 2026. “We will adhere to the principle of ‘public money, public code’. That means that as long as there is no confidential or personal data involved, the source code of the city’s software will also be made public,” the agreement states.

The decision is being hailed as a victory by advocates of free software, who see this as a better option economically, politically, and in terms of administrative transparency.

However, the decision by the new coalition administration in Germany’s third-largest and one of its wealthiest cities is just the latest twist in a saga that began more than 15 years ago, in 2003, spurred by Microsoft’s plans to end support for Windows NT 4.0.

Because the city needed to find a replacement for aging Microsoft Windows workstations, Munich eventually began the move away from proprietary software at the end of 2006.

At the time, the migration was seen as an ambitious, pioneering project for open software in Europe. It involved open-standard formats, vendor-neutral software and the creation of a unique desktop infrastructure based on Linux code named ‘LiMux’ – a combination of Linux and Munich.

By 2013, 80% of desktops in the city’s administration were meant to be running LiMux software. In reality, the council continued to run the two systems – Microsoft and LiMux – side by side for several years to deal with compatibility issues.

As the result of a change in the city’s government, a controversial decision was made in 2017 to leave LiMux and move back to Microsoft by 2020. At the time, critics of the decision blamed the mayor and deputy mayor and cast a suspicious eye on the US software giant’s decision to move its headquarters to Munich.

In interviews, a former Munich mayor, under whose administration the LiMux program began, has been candid about the lengths Microsoft went to in order to retain its contract with the city.

The migration back to Microsoft and to other proprietary software makers like Oracle and SAP, costing an estimated €86.1m ($93.1m), is still in progress today.

“We’re very happy that they’re taking on the points in the ‘Public Money, Public Code’ campaign we started two and a half years ago,” Alex Sander, EU public policy manager at the Berlin-based Free Software Foundation Europe, tells ZDNet. But it’s also important to note that this is just a statement in a coalition agreement outlining future plans, he says.

“Nothing will change from one day to the next, and we wouldn’t expect it to,” Sander continued, noting that the city would also be waiting for ongoing software contracts to expire. “But the next time there is a new contract, we believe it should involve free software.”

Any such step-by-step transition can be expected to take years. But it is also possible that Munich will be able to move faster than most because they are not starting from zero, Sander noted. It can be assumed that some LiMux software is still in use and that some of the staff there would have used it before.

[…]

Source: Linux not Windows: Why Munich is shifting back from Microsoft to open source – again | ZDNet

Libraries Have Never Needed Permission To Lend Books, And The Move To Change That Is A Big Problem

There are a variety of opinions concerning the Internet Archive’s National Emergency Library in response to the pandemic. I’ve made it clear in multiple posts why I believe the freakout from some publishers and authors is misguided, and that the details of the program are very different than those crying about it have led you to believe. If you don’t trust my analysis and want to whine about how I’m biased, I’d at least suggest reading a fairly balanced review of the issues by the Congressional Research Service.

However, Kyle Courtney, the Copyright Advisor for Harvard University, has a truly masterful post highlighting not just why the NEL makes sense, but just how problematic it is that many — including the US Copyright Office — seem to want to move to a world of permission and licensing for culture that has never required such things in the past.

Licensing culture is out of control. This has never been clearer than during this time when hundreds of millions of books and media that were purchased by libraries, archives, and other cultural institutions have become inaccessible due to COVID-19 closures or, worse, are closed off further by restrictive licensing.

What’s really set Courtney off is that the Copyright Office has come out, in response to the NEL, to suggest that the solution to any such concerns raised by books being locked up by the pandemic must be more licensing:

The ultimate example of this licensing culture gone wild is captured in a recent U.S. Copyright Office letter. Note that this letter is not a legally binding document. It is the opinion of an office under the control of the Library of Congress that is tasked, among other missions, with advising Congress when it asks copyright questions, as in this case.

Senator Tom Udall asked the Copyright Office to give its legal analysis of the NEL and similar library efforts, and it did so… badly.

The Office responded with a letter revealing that its recommendation was not going to be the guidance document to “help libraries, authors, and online outlets,” but, ultimately, called for more licensing. It also continued a common misunderstanding of an important case, Capitol Records, LLC v. ReDigi Inc., 910 F.3d 649 (2d Cir. 2018).

We’ve written about the ReDigi case a few times, but as Courtney details, the anti-internet, pro-extreme-copyright folks have embraced it to mean much more than it actually means (we’ll get back to that shortly). Courtney points out that the Copyright Office seems to view everything through a single lens: “licensing” (i.e., permission). So while the letter applauds more licensing, that’s really just a celebration of greater permission when none is necessary. And through that lens, the Copyright Office seems to think that the NEL isn’t really necessary because publishers have been choosing to make some of their books more widely available (via still-restrictive licensing). But, as Courtney explains, libraries aren’t supposed to need permission:

Here’s the problem though: these vendors and publishers are not libraries. The law does not treat them the same. Vendors must ask permission, they must license, this is their business model. Libraries are special creatures of copyright. Libraries have a legally authorized mandate granted by Congress to complete their mission to provide access to materials. Congress put many of these in the copyright exemptions for libraries in the Copyright Act itself.

The Copyright Office missed this critical difference completely when it said digital, temporary, or emergency libraries should “seek permission from authors or publishers prior” to the use. I think this is flat-out wrong. And I have heard this in multiple settings over the last few months: somehow it has crept into our dialog that libraries should have always sought a license to lend books, even digital books, exactly like the vendors and publishers who sought permission first. Again, this is fundamentally wrong.

Let me make this clear: Libraries do not need a license to loan books. What libraries do (give access to their acquired collections of acquired books) is not illegal. And libraries generally do not need to license or contract before sharing these legally acquired works, digital or not. Additionally, libraries, and their users, can make (and do make) many uses of these works under the law including interlibrary loan, reserves, preservation, fair use, and more!

[…]

Source: Libraries Have Never Needed Permission To Lend Books, And The Move To Change That Is A Big Problem | Techdirt