The Linkielist

Linking ideas with the world

Issue with Cloudflare’s DNS service and crappy router shuts down half the web. Again.

Scores of websites and services went down Friday afternoon due to problems with Cloudflare’s DNS service, sparking rampant speculation about the cause. After all, a global DDoS attack would totally fit the real-life apocalypse movie that 2020 is increasingly turning into.

The outage, which started shortly after 5 p.m. ET, brought down popular sites and services like Discord, Politico, Feedly, and League of Legends for roughly half an hour on Friday. Once connections were restored, Cloudflare issued an incident report stating that the issue “was not as a result of an attack” and that it “has been identified and a fix is being implemented.”

Turns out the real explanation’s nothing so nefarious. Evidently, half the internet briefly went dark because of a crappy router in Atlanta.

“It appears that a router in Atlanta had an error that caused bad routes across our backbone. That resulted in misrouted traffic to PoPs that connect to our backbone,” Cloudflare CEO Matthew Prince tweeted Friday. “We isolated the Atlanta router and shut down our backbone, routing traffic across transit providers instead. There was some congestion that caused slow performance on some links as the logging caught up. Everything is restored now and we’re looking into the root cause.”

According to the incident report, this issue with Cloudflare’s 1.1.1.1 DNS service impacted its data centers internationally, from Frankfurt to Paris and Schiphol, as well as several in major U.S. cities, including Los Angeles, Chicago, Seattle, Atlanta, and San Jose. Reports on Downdetector showed the outages appeared to be concentrated in the U.S. and northern Europe.

Source: Issue with Cloudflare’s DNS service shuts down half the web

The Cheap Solution for Pantone Color Picking

Designers often rely on their smartphones for snapping a quick photo of something that inspires them, but Pantone has found a way to turn their smartphone into a genuine design tool. As part of a new online service, it’s created a small card that can be used to accurately sample real world colors by simply holding the card against an object and taking a photo.

[…]

There are existing solutions to this problem. Even Pantone itself sells handheld devices that use highly calibrated sensors and controlled lighting to sample a real-life color when placed directly on an object. After sampling, the device lets you know how to recreate it in your design software. The problem is that they can set you back well north of $700, which is hard to justify unless the design work you’re doing is especially color-critical and accuracy is paramount.

[Image: the $15 Pantone Color Match Card. Photo: Pantone]

At $15, the Pantone Color Match Card is a much cheaper solution, and it’s one that can be carried in your wallet. When you find a color you want to sample in the real world, you place the card atop it, with the hole in the middle revealing that color, and then take a photo using the Pantone Connect app available for iOS and Android devices.

The app knows the precise color measurements of all the colored squares printed on the rest of the card, which it uses as a reference to accurately calibrate and measure the color you’re sampling. It then attempts to closely match the selection to a shade indexed in the Pantone color archive. The results can be shared to design apps like Adobe Photoshop and Adobe Illustrator using Pantone’s other software tools, and while you can use the app and the Color Match Card with a free Pantone Connect account, a paid account is needed for some of the more advanced interoperability functionality.
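
For intuition, here’s a minimal sketch of the general idea behind reference-patch calibration: given the captured and the known values of the card’s swatches, fit a simple correction that maps camera colors toward their true values, then apply it to the sampled color. The patch values and the 3×3 linear model below are illustrative assumptions, not Pantone’s actual algorithm or data.

```python
# Sketch: estimate a 3x3 color-correction matrix from reference patches
# via least squares, then apply it to a sampled pixel.
# All patch values are hypothetical placeholders, not Pantone data.
import numpy as np

# Reference patches as captured by the phone camera under ambient light (RGB rows)
captured = np.array([
    [0.82, 0.10, 0.12],   # "red" patch
    [0.15, 0.78, 0.20],   # "green" patch
    [0.10, 0.12, 0.80],   # "blue" patch
    [0.90, 0.88, 0.85],   # "white" patch
], dtype=float)

# The same patches' known ground-truth values (stand-ins for the precisely
# measured colors printed on the card)
reference = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
    [1.0, 1.0, 1.0],
], dtype=float)

# Solve captured @ M ~= reference for a 3x3 correction matrix M (least squares)
M, *_ = np.linalg.lstsq(captured, reference, rcond=None)

# Correct the color sampled through the hole in the middle of the card
sampled = np.array([0.55, 0.30, 0.25])
corrected = np.clip(sampled @ M, 0.0, 1.0)
print("corrected RGB:", corrected)
```

Real colorimetric pipelines work in a device-independent space such as CIE Lab and model lighting far more carefully, but the reference-swatch principle is the same.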

Source: The Cheap Solution for Pantone Color Picking

Purism’s quest against Intel’s Management Engine black box CPU now comes in 14 inches

This latest device succeeds the previous Librem 13 laptop, which ran for four generations, and includes a slightly bigger display, a hexa-core Comet Lake Intel Core i7 processor, gigabit Ethernet, and USB-C. As the name implies, the Librem 14 packs a 14-inch, 1920×1080 IPS display. Purism said this comes without increasing the laptop’s dimensions thanks to smaller bezels. You can find the full specs here.

Crucially, it is loaded with the usual privacy features found in Purism’s kit such as hardware kill switches that disconnect the microphone and webcam from the laptop’s circuitry. It also comes with the firm’s PureBoot tech, which includes Purism’s in-house CoreBoot BIOS replacement, and a mostly excised Intel Management Engine (IME).

The IME is a hidden coprocessor included in most of Chipzilla’s chipsets since 2008. It allows system administrators to remotely manage devices using out-of-band communications. But it’s also controversial in the security community since it’s somewhat of a black box.

There is little by way of public documentation. Intel hasn’t released the source code. And, to add insult to injury, it’s also proven vulnerable to exploitation in the past.

Source: Purism’s quest against Intel’s Management Engine black box CPU now comes in 14 inches • The Register

T-Mobile US outage finally ends after more than twelve hours (updated)

T-Mobile’s network is having an issue with voice and data service. There was a huge spike in outage reports on Down Detector starting at around 1 PM ET today, with many people across the US reporting on that site and on Twitter that they’re having problems. By around 3:30 PM ET, Down Detector had collected more than 82,000 outage reports.

Some people are unable to make or receive calls, but Wi-Fi calling still seems to work (in case you’re wondering why you can still call someone else from a T-Mobile phone right now). There are problems with data service too. T-Mobile’s president of technology Neville Ray confirmed the issue and said the company’s engineers are working to resolve it:

Source: T-Mobile outage finally ends after more than twelve hours (updated) | Engadget

A drastic reduction in hardware overhead for quantum computing with new error correcting techniques

A scientist at the University of Sydney has achieved what one quantum industry insider has described as “something that many researchers thought was impossible”.

Dr. Benjamin Brown from the School of Physics has developed a type of error-correcting code for quantum computers that will free up more hardware to do useful calculations. It also provides an approach that will allow companies like Google and IBM to design better quantum microchips.

He did this by applying an already known code that operates in three dimensions to a two-dimensional framework.

“The trick is to use time as the third dimension. I’m using two physical dimensions and adding in time as the third dimension,” Dr. Brown said. “This opens up possibilities we didn’t have before.”

His research is published today in Science Advances.

“It’s a bit like knitting,” he said. “Each row is like a one-dimensional line. You knit row after row of wool and, over time, this produces a two-dimensional panel of material.”

Fault-tolerant quantum computers

Reducing errors in quantum computing is one of the biggest challenges facing scientists before they can build machines large enough to solve useful problems.

“Because quantum information is so fragile, it produces a lot of errors,” said Dr. Brown, a research fellow at the University of Sydney Nano Institute.

Completely eradicating these errors is impossible, so the goal is to develop a “fault-tolerant” architecture where useful processing operations far outweigh error-correcting operations.

“Your mobile phone or laptop will perform billions of operations over many years before a single error triggers a blank screen or some other malfunction. Current quantum operations are lucky to have fewer than one error for every 20 operations—and that means millions of errors an hour,” said Dr. Brown who also holds a position with the ARC Centre of Excellence for Engineered Quantum Systems.

“That’s a lot of dropped stitches.”

Most of the building blocks in today’s experimental quantum computers—quantum bits or qubits—are taken up by the “overhead” of error correction.

“My approach to suppressing errors is to use a code that operates across the surface of the architecture in two dimensions. The effect of this is to free up a lot of the hardware from error correction and allow it to get on with the useful stuff,” Dr. Brown said.
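
As a rough illustration of why error correction eats into hardware, here’s a toy classical sketch: a three-bit repetition code with majority voting, not Dr. Brown’s quantum code. Protecting a single logical bit costs three physical bits, but it pushes the error rate from the raw 1-in-20 figure quoted above down to well under 1 in 100.

```python
# Toy classical analogue of error-correction overhead: a 3-bit repetition
# code. Each logical bit is stored as three noisy physical bits and decoded
# by majority vote. Not Dr. Brown's code, just an illustration of the idea.
import random

def noisy_copy(bit, p):
    """Flip the bit with probability p (a faulty physical component)."""
    return bit ^ 1 if random.random() < p else bit

def encode(bit):
    return [bit, bit, bit]                 # 3 physical bits per logical bit

def decode(bits):
    return 1 if sum(bits) >= 2 else 0      # majority vote

def logical_error_rate(p, trials=100_000):
    errors = 0
    for _ in range(trials):
        sent = random.randint(0, 1)
        received = [noisy_copy(b, p) for b in encode(sent)]
        if decode(received) != sent:
            errors += 1
    return errors / trials

p = 0.05   # roughly the "one error for every 20 operations" from the article
print("unprotected error rate:", p)
print("protected error rate:  ", logical_error_rate(p))  # ~ 3p^2 - 2p^3 ~ 0.007
```

Quantum codes like Dr. Brown’s are far more subtle (quantum information can’t simply be copied), but the trade-off is the same in spirit: many physical qubits are spent to make one logical qubit behave reliably.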

Dr. Naomi Nickerson is Director of Quantum Architecture at PsiQuantum in Palo Alto, California, and unconnected to the research. She said: “This result establishes a new option for performing fault-tolerant gates, which has the potential to greatly reduce overhead and bring practical quantum computing closer.”

Source: A stitch in time: How a quantum physicist invented new code from old tricks

More information: Science Advances (2020). DOI: 10.1126/sciadv.eaay4929 , advances.sciencemag.org/content/6/21/eaay4929

Amazon builds UV-light robot to kill coronavirus on surfaces

Amazon has built a robot that is designed to kill the novel coronavirus with ultraviolet light.

The robot looks a little like a hotel luggage cart, with a tall metal frame attached to a rectangular wheeled bottom. One side of the frame is outfitted with at least 10 ultraviolet tube lights.

In a video shared with CBS News’ “60 Minutes,” the robot rolls down the freezer aisle of a Whole Foods store, aiming UV light at the freezer doors.

The robot could be used in warehouses and at Whole Foods stores to kill the virus on surfaces such as food, packaging, and door handles.

Source: Amazon builds UV-light robot to kill coronavirus on surfaces – Business Insider

Three things in life are certain: Death, taxes, and cloud-based IoT gear bricked by vendors. Looking at you, Belkin

Oh look, here’s another cautionary tale about buying cloud-based IoT kit. On 29 May, global peripheral giant Belkin will flick the “off” switch on its Wemo NetCam IP cameras, turning the popular security devices into paperweights.

It’s not unusual for a manufacturer to call time on physical hardware. Like software, hardware has a lifespan, after which it’s no longer economically viable for the vendor to continue providing support.

But this is a little different, because Belkin isn’t merely ending support. It also plans to decommission the cloud services required for its Wemo NetCam devices to actually work.

“Although your Wemo NetCam will still connect to your Wi-Fi network, without these servers you will not be able to view the video feed or access the security features of your Wemo NetCam, such as Motion Clips and Motion Notifications,” Belkin said on its official website.

“If you use your Wemo NetCam as a motion sensor for your Wemo line of products, it will no longer provide this functionality and will be removed as an option from your Wemo app,” the company added.

Adding insult to injury, the ubiquitous consumer network gear maker only plans to refund customers with active warranties, which excludes anyone who bought their device more than two years ago. The window to submit requests is open from now until 30 June.

Source: Three things in life are certain: Death, taxes, and cloud-based IoT gear bricked by vendors. Looking at you, Belkin • The Register

Buyer beware—that 2TB-6TB “NAS” drive you’ve been eyeing might be SMR – and won’t work in your NAS

Storage vendors, including but reportedly not limited to Western Digital, have quietly begun shipping SMR (Shingled Magnetic Recording) disks in place of earlier CMR (Conventional Magnetic Recording) disks.

SMR is a technology that allows vendors to eke out higher storage densities, netting more TB capacity on the same number of platters—or the same capacity on fewer platters.

Until recently, the technology has only been seen in very large disks, which were typically clearly marked as “archival”. In addition to higher capacities, SMR is associated with much lower random I/O performance than CMR disks offer.

Storage vendors appear to be getting much bolder about deploying the new technology into ever-smaller formats, presumably to save a bit on manufacturing costs. A few weeks ago, a message popped up on the zfs-discuss mailing list:

WD and Seagate are both submarining Drive-managed SMR (DM-SMR) drives into channels, disguised as “normal” drives.

For WD REDs this shows as EFRX (standard drive) suffix being changed to EFAX suffix (DM-SMR) […] The only clue you’ll get about these drives being SMR is the appalling sequential write speeds (~40MB/s from blank) and the fact that they report a “trim” function.

The unexpected shift from CMR to SMR in these NAS (Network Attached Storage) drives has caused problems above and beyond simple performance; the user quoted above couldn’t get his SMR disks to stay in his ZFS storage array at all.

There has been speculation that the drives got kicked out of the arrays due to long timeouts—SMR disks need to perform garbage-collection routines in the background and store incoming writes in a small CMR-encoded write-cache area of the disk, before moving them to the main SMR encoded storage.

It’s possible that long periods of time with no new writes accepted triggered failure-detection routines that marked the disk as bad. We don’t know the details for certain, but several users have reported that these disks cannot be successfully used in their NAS systems—despite the fact that the name of the actual product is WD Red NAS Hard Drive.
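
Building on the “trim” clue from the mailing-list quote above, here’s a rough heuristic sketch for flagging suspect drives on a Linux system. It assumes the standard sysfs layout and is only a hint, not a definitive SMR test: a spinning (rotational) drive normally has no reason to advertise discard/TRIM support, so one that does is worth a closer look.

```python
# Heuristic: flag rotational drives that report discard/TRIM support,
# which is one of the tell-tale signs of a drive-managed SMR disk.
# Assumes the Linux sysfs layout under /sys/block; illustrative only.
from pathlib import Path

def suspect_dm_smr(device: str) -> bool:
    """`device` is a block device name such as 'sda'."""
    q = Path("/sys/block") / device / "queue"
    rotational = (q / "rotational").read_text().strip() == "1"
    # A non-zero discard granularity means the drive advertises TRIM/discard,
    # which is unusual for a conventional (CMR) spinning disk.
    discard = int((q / "discard_granularity").read_text().strip()) > 0
    return rotational and discard

for dev in sorted(p.name for p in Path("/sys/block").iterdir()):
    try:
        if suspect_dm_smr(dev):
            print(f"{dev}: rotational drive reporting TRIM support, possible DM-SMR")
    except (FileNotFoundError, ValueError):
        pass  # virtual devices may lack these attributes
```

Checking the model suffix against vendor documentation (EFRX versus EFAX for the WD Reds mentioned above) remains the more reliable route.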

[…]

What really grinds our gears about this is that the only conceivable reason to shift to SMR technology in such small disks—lowered manufacturing costs due to fewer platters required—doesn’t seem to be passed down to the consumer. A comparison of the Amazon prices of the WD Red 2TB EFRX and the WD Red 2TB EFAX (the EFRX is the faster CMR drive, and the EFAX is the much slower SMR drive) bears this out.

Western Digital doesn’t appear to be the only hard drive manufacturer doing this—blocksandfiles has confirmed quiet, undocumented use of SMR in small retail drives from Seagate and Toshiba as well.

We suspect the greater ire aimed at Western Digital is due both to the prominent NAS branding of the Red line and the general best-in-class reputation it has enjoyed in that role for several years.

Source: Buyer beware—that 2TB-6TB “NAS” drive you’ve been eyeing might be SMR | Ars Technica

After 50 Years of Effort, Researchers Made Silicon Emit Light, could improve computer speeds vastly

Modern transistors, which function as a computer’s brain cells, are only a few atoms long. If they are packed too tightly, that can cause all sorts of problems: electron traffic jams, overheating, and strange quantum effects. One solution is to replace some electronic circuits with optical connections that use photons instead of electrons to carry data around a chip. There’s just one problem: Silicon, the main material in computer chips, is terrible at emitting light.

Now, a team of European researchers says they have finally overcome this hurdle. On Wednesday, a research team led by Erik Bakkers, a physicist at Eindhoven University of Technology in the Netherlands, published a paper in Nature that details how they grew silicon alloy nanowires that can emit light. It’s a problem that physicists have grappled with for decades, but Bakkers says his lab is already using the technique to develop a tiny silicon laser that can be built into computer chips. Integrating photonic circuits on conventional electronic chips would enable faster data transfer and lower energy consumption without raising the chip’s temperature, which could make it particularly useful for data-intensive applications like machine learning.

“It’s a big breakthrough that they were able to demonstrate light emission from nanowires made of a silicon mixture, because these materials are compatible with the fabrication processes used in the computer chip industry,” says Pascal Del’Haye, who leads the microphotonics group at the Max Planck Institute for the Science of Light and was not involved in the research. “In the future, this might enable the production of microchips that combine both optical and electronic circuits.”

Source: After 50 Years of Effort, Researchers Made Silicon Emit Light | WIRED

Look ma, no Intel Management Engine: Purism lifts lid on the Librem Mini, a privacy-focused micro PC

Purism has dropped the veil on the latest computer in its privacy-focused lineup – a small form-factor PC designed for space-conscious free software enthusiasts.

Available to pre-order now, the Librem Mini packs an eighth-generation, quad-core Whiskey Lake i7-8565U processor, modified with Purism’s Pureboot technology. At its heart, this aims to minimise any potential third-party interference with the operation of the computer – particularly during the boot phase, where it is potentially vulnerable.

It accomplishes this by thoroughly excising the Intel Management Engine, which Purism regards as an untrustworthy black-box baked into the heart of the processor, along with other software-level approaches. These include the use of the free software Coreboot BIOS, as well as the Purism-developed Heads, which aims to identify potential tampering within the BIOS, Kernel, and GRUB config.

In terms of expandability, the machine packs a SATA and an M.2 slot, and comes with two SODIMM slots, which can be filled with up to 64GB of RAM. There’s no dedicated graphics to speak of, but it does include Intel’s UHD 620 integrated graphics. Aside from a smattering of USB-A and USB-C ports, the Librem Mini also includes both DisplayPort and HDMI outputs.

There’s also a standard RJ45 Ethernet port – although you can add WiFi and Bluetooth via an optional Atheros ATH9k module.

The Librem Mini has a small footprint, measuring just 5 inches across and weighing just 1kg – which is lighter than many laptops.

This machine is the latest in a growing lineup of machines that cater to the privacy-centric punter, including the Librem 13 and 15 laptops. Purism is also in the process of developing a smartphone platform to run on its own Linux-based PureOS operating system, and a baseband fully separate from the CPU. The firm has raised $2 million via crowdfunding for this effort and is expected to ship the first units later this year.

Pre-orders for the Librem Mini are open now. Retailing at $699, the base model packs 8GB of RAM and 256GB of NVMe storage. Units will ship one month after the firm has reached its (relatively modest) $50,000 pre-order goal.

Purism touts the Librem Mini as a potential mini-desktop or media server, although El Reg feels the use-case isn’t really as relevant as the potential customer. Greater awareness of privacy – and the way it’s gradually being eroded – has created an appetite for such devices, as demonstrated by Purism’s previous crowdfunding accomplishments. And if you want to exercise greater control over how you use your computer, this machine will undoubtedly appeal to you. ®

Source: Look ma, no Intel Management Engine: Purism lifts lid on the Librem Mini, a privacy-focused micro PC • The Register

Japanese robot could call last orders on human bartenders

The repurposed industrial robot serves drinks in its own corner of a Japanese pub operated by restaurant chain Yoronotaki. An attached tablet computer displays a smiling face and chats about the weather while the robot prepares orders.

The robot, made by the company QBIT Robotics, can pour a beer in 40 seconds and mix a cocktail in a minute. It uses four cameras to monitor customers and artificial intelligence (AI) software to analyze their expressions.

“I like it because dealing with people can be a hassle. With this you can just come and get drunk,” Satoshi Harada, a restaurant worker said after ordering a drink.

“If they could make it a little quicker it would be even better.”

Finding workers, especially in Japan’s service sector, is set to get even more difficult.

The government has eased visa restrictions to attract more foreign workers but companies still face a labor shortage as the population shrinks and the number of people over 65 increases to more than a third of the total.

Source: Japanese robot could call last orders on human bartenders – Reuters

Sonos CEO apologizes for confusion, says legacy products will work ‘as long as possible’ – however long that is

Sonos CEO Patrick Spence just published a statement on the company’s website to try to clear up an announcement made earlier this week: on Tuesday, Sonos announced that it will cease delivering software updates and new features to its oldest products in May. The company said those devices should continue functioning properly in the near term, but it wasn’t enough to prevent an uproar from longtime customers, with many blasting Sonos for what they perceive as planned obsolescence. That frustration is what Spence is responding to today. “We heard you,” is how Spence begins the letter to customers. “We did not get this right from the start.”

Spence apologizes for any confusion and reiterates that the so-called legacy products will “continue to work as they do today.” Legacy products include the original Sonos Play:5, Zone Players, and Connect / Connect:Amp devices manufactured between 2011 and 2015.

“Many of you have invested heavily in your Sonos systems, and we intend to honor that investment for as long as possible.” Similarly, Spence pledges that Sonos will deliver bug fixes and security patches to legacy products “for as long as possible” — without any hard timeline. Most interesting, he says “if we run into something core to the experience that can’t be addressed, we’ll work to offer an alternative solution and let you know about any changes you’ll see in your experience.”

The letter from Sonos’ CEO doesn’t retract anything that the company announced earlier this week; Spence is just trying to be as clear as possible about what’s happening come May. Sonos has insisted that these products, some of which are a decade old, have been taken to their technological limits.

Spence again confirms that Sonos is planning a way for customers to fork any legacy devices they might own off of their main Sonos system with more modern speakers. (Sonos architected its system so that all devices share the same software. Once one product is no longer eligible for updates, the whole setup stops receiving them. This workaround is designed to avoid that problem.)

Source: Sonos CEO apologizes for confusion, says legacy products will work ‘as long as possible’ – The Verge

An Open Source eReader That’s Free of Corporate Restrictions Is Exactly What I Want Right Now

The Open Book Project was born from a contest held by Hackaday and Digi-Key that encouraged hardware hackers to find innovative and practical uses for the Arduino-based Adafruit Feather development board ecosystem. The winner of that contest was the Open Book Project, which has been designed and engineered from the ground up to be everything devices like the Amazon Kindle or Rakuten Kobo are not. There are no secrets inside the Open Book, no hidden chips designed to track and share your reading habits and preferences with a faceless corporation. With enough know-how, you could theoretically build and program your own Open Book from scratch, but as a result of winning the Take Flight With Feather contest, Digi-Key will be producing a small manufacturing run of the ereader, with pricing and availability still to be revealed.

The raw hardware isn’t as sleek or pretty as devices like the Kindle, but at the same time there’s a certain appeal to the exposed circuit board which features brief descriptions of various components, ports, and connections etched right onto the board itself for those looking to tinker or upgrade the hardware. Users are encouraged to design their own enclosures for the Open Book if they prefer, either through 3D-printed cases made of plastic, or rustic wooden enclosures created using laser cutting machines.

Text will look a little aliased on the Open Book’s E Ink display.
Text will look a little aliased on the Open Book’s E Ink display.
Photo: Hackaday.io

With a resolution of just 400×300 pixels on its monochromatic E Ink display, text on the Open Book won’t look as pretty as it does on the Amazon Kindle Oasis which boasts a resolution of 1,680×1,264 pixels, but it should barely sip power from its built-in lithium-polymer rechargeable battery—a key benefit of using electronic paper.

The open source ereader—powered by an ARM Cortex M4 processor—will also include a headphone jack for listening to audio books, a dedicated flash chip for storing language files with specific character sets, and even a microphone that leverages a TensorFlow-trained AI model to intelligently process voice commands so you can quietly mutter “next!” to turn the page instead of reaching for one of the ereader’s physical buttons like a neanderthal. It can also be upgraded with additional functionality such as Bluetooth or wifi using Adafruit Feather expansion boards, but the most important feature is simply a microSD card slot allowing users to load whatever electronic text and ebook files they want. They won’t have to be limited by what a giant corporation approves for its online book store, or be subject to price-fixing schemes which, for some reason, have still resulted in electronic files costing more than printed books.

What remains to be seen is whether or not the Open Book Project can deliver an ereader that’s significantly cheaper than what Amazon or Rakuten has delivered to consumers. Both of those companies benefit from the economy of scale having sold millions of devices to date, and are able to throw their weight around when it comes to manufacturing costs and sourcing hardware. If the Open Book can be churned out for less than $50, it could potentially provide some solid competition to the limited ereader options currently out there.

Source: An Open Source eReader That’s Free of Corporate Restrictions Is Exactly What I Want Right Now

Apple’s latest AI acquisition leaves some Wyze cameras without people detection

Earlier today, Apple confirmed it purchased Seattle-based AI company Xnor.ai (via MacRumors). Acquisitions at Apple’s scale happen frequently, though rarely do they impact everyday people on the day of their announcement. This one is different.

Cameras from fellow Seattle-based company Wyze, including the Wyze Cam V2 and Wyze Cam Pan, have utilized Xnor.ai’s on-device people detection since last summer. But now that Apple owns the company, it’s no longer available. Some people on Wyze’s forum are noting that the beta firmware removing the people detection has already started to roll out.

Oddly enough, word of this lapse in service isn’t anything new. Wyze issued a statement in November 2019 saying that Xnor.ai had terminated their contract (though its reason for doing so wasn’t as clear then as it is today), and that a firmware update slated for mid-January 2020 would remove the feature from those cameras.

There’s a bright side to this loss, though, even if Apple snapping up Xnor.ai makes Wyze’s affordable cameras less appealing in the interim. Wyze says that it’s working on its own in-house version of people detection for launch at some point this year. And whether it operates on-device via “edge AI” computing like Xnor.ai’s does, or by authenticating through the cloud, it will be free for users when it launches.

That’s good and all, but the year just started, and it’s a little worrying Wyze hasn’t followed up with a specific time frame for its replacement of the feature. Two days ago, Wyze’s social media community manager stated that the company was “making great progress” on its forums, but they didn’t offer up when it would be available.

What Apple plans to do with Xnor.ai is anyone’s guess. Ahead of its partnership with Wyze, the AI startup had developed a small, wireless AI camera that ran exclusively on solar power. Regardless of whether Apple is more interested in its edge computing algorithm, as was seen working on Wyze cameras for a short time, or its clever hardware ideas around AI-powered cameras, it’s getting all of it with the purchase.

Source: Apple’s latest AI acquisition leaves some Wyze cameras without people detection – The Verge

Amazon, Apple, Google, and the Zigbee Alliance joined together to form working group to develop open standard for smart home devices

Amazon, Apple, Google, and the Zigbee Alliance joined together to promote the formation of the Working Group. Zigbee Alliance board member companies IKEA, Legrand, NXP Semiconductors, Resideo, Samsung SmartThings, Schneider Electric, Signify (formerly Philips Lighting), Silicon Labs, Somfy, and Wulian are also on board to join the Working Group and contribute to the project.

The goal of the Connected Home over IP project is to simplify development for manufacturers and increase compatibility for consumers. The project is built around a shared belief that smart home devices should be secure, reliable, and seamless to use. By building upon Internet Protocol (IP), the project aims to enable communication across smart home devices, mobile apps, and cloud services and to define a specific set of IP-based networking technologies for device certification.

The industry Working Group will take an open-source approach for the development and implementation of a new, unified connectivity protocol. The project intends to use contributions from market-tested smart home technologies from Amazon, Apple, Google, Zigbee Alliance, and others. The decision to leverage these technologies is expected to accelerate the development of the protocol, and deliver benefits to manufacturers and consumers faster.

The project aims to make it easier for device manufacturers to build devices that are compatible with smart home and voice services such as Amazon’s Alexa, Apple’s Siri, Google’s Assistant, and others. The planned protocol will complement existing technologies, and Working Group members encourage device manufacturers to continue innovating using technologies available today.

Source: Project Connected Home over IP

Getting Drivers for Old Hardware Is Harder Than Ever

Getting drivers for old hardware is harder than ever, despite the fact that all the drivers generally have to do is simply sit on the internet, available when they’re necessary.

Apparently, that isn’t easy enough for Intel. Recently, the chipmaker took BIOS downloads, the boot-level firmware used for hardware initialization in earlier generations of PCs, for a number of its unsupported motherboards off its website, citing the fact that the programs have reached an “End of Life” status. While it reflects the fact that Unified Extensible Firmware Interface (UEFI), a later generation of firmware technology used in PCs and Macs, is expected to ultimately replace BIOS entirely, it also leaves lots of users with old gadgets in the lurch. And as Bleeping Computer has noted, it appears to be part of a broader trend of preventing downloads for unsupported hardware on the Intel website—hardware that has long outlived its official support. After all, if something goes wrong, Intel can be sure it’s not liable if a 15-year-old BIOS update borks a system.

In a comment to Motherboard, Intel characterized the approach to and timing of the removals as reflecting industry norms.

[…]

However, this is a problem for folks who take collecting or use of old technology seriously, such as those on the forum Vogons, which noticed the issue first, though it’s far from anything new. Technology companies come and go all the time, and as things like mergers and redesigns happen, often the software repository gets affected when the technology goes out of date.

A Problem For Consumers & Collectors

Jason Scott, the Internet Archive’s lead software curator, says that Intel’s decision to no longer provide old drivers on its website reflects a tendency by hardware and software developers to ignore their legacies when possible—particularly in the case of consumer software, rather than in the enterprise, where companies’ willingness to pay for updates ensures that needed updates won’t simply sit on the shelf.

[…]

By the mid-90s, companies started to create FTP repositories to distribute software, which had the effect of changing the nature of updates: When the internet made distribution easier and both innovation and security risks grew more advanced, technology companies updated their apps far more often.

FTP’s Pending Fadeout

Many of those FTP servers are still around today, but the news cycle offers a separate, equally disappointing piece of information for those looking for vintage drivers: Major web browsers are planning to sunset support for the FTP protocol. Chrome plans to remove support for FTP sites by version 82, which is currently in the development cycle and will hit sometime next year. And Firefox makers Mozilla have made rumblings about doing the same thing.

The reasons for doing so, often cited for similar removals of legacy features, come down to security. FTP is a legacy protocol that can’t be secured the way its successor, SFTP, can.

While FTP applications like CyberDuck will likely exist for decades from now, the disconnect from the web browser will make these servers a lot harder to use. The reason goes back to the fact that the FTP protocol isn’t inherently searchable—but the best way to find information about it is with a web-based search engine … such as Google.
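
Even after browsers drop FTP, the servers themselves stay reachable with stock tooling. Here’s a minimal sketch using Python’s standard-library ftplib; the host name, path, and file name are hypothetical placeholders, not a real Intel or Logitech archive.

```python
# Minimal anonymous FTP session with Python's standard library.
# Host, directory, and file name below are hypothetical examples.
from ftplib import FTP

HOST = "ftp.example.com"          # substitute a real vendor archive

with FTP(HOST) as ftp:
    ftp.login()                   # anonymous login
    ftp.cwd("/pub/drivers")       # hypothetical directory of old drivers
    for name in ftp.nlst():       # list what's still sitting on the server
        print(name)
    with open("old_driver.zip", "wb") as f:
        ftp.retrbinary("RETR old_driver.zip", f.write)
```

The harder part, as noted above, is discovering that such a server exists in the first place once web search stops surfacing FTP links.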

[…]

Earlier this year, I was attempting to get a vintage webcam working, and while I was ultimately unable to get it to work, it wasn’t due to lack of software access. See, Logitech actually kept copies of Connectix’s old webcam software on its FTP site. This is software that hasn’t seen updates in more than 20 years; that only supports Windows 3.1, Windows NT, and Windows 95; and that wasn’t on Logitech’s website.

One has to wonder how soon those links will disappear from Google searches once the two most popular desktop browsers remove easy access to those files. And there’s no guarantee that a company is going to keep a server online beyond that point.

“It was just it was this weird experience that FTP sites, especially, could have an inertia of 15 to 20 years now, where they could be running all this time, untouched,” Scott added. “And just every time that, you know, if the machine dies, it goes away.”

Source: Getting Drivers for Old Hardware Is Harder Than Ever – VICE

Nikon Is Killing Its Authorized Repair Program

Nikon is ending its authorized repair program in early 2020, likely leaving more than a dozen independent repair shops without access to official parts and tools, and cutting the number of places you can get your camera fixed with official parts to two Nikon-owned facilities on opposite coasts of the U.S.

That means that Nikon’s roughly 15 remaining Authorized Repair Station members are about to become non-authorized repair shops. Since Nikon decided to stop selling genuine parts to non-authorized shops back in 2012, it’s unlikely those stores will continue to have access to the specialty components, tools, software, manuals, and model training Nikon previously provided. But Nikon hasn’t clarified this, so repair shops have been left in the dark.

“This is very big, and we have no idea what’s coming next,” said Cliff Hanks, parts manager for Kurt’s Camera Repair in San Diego, Calif. “We need more information before March 31. We can make contingency plans, start stocking up on stuff, but when will we know for sure?”

In a letter obtained by iFixit, Nikon USA told its roughly 15 remaining Authorized Repair Station members in early November that it would not renew their agreements after March 31, 2020. The letter notes that “The climate in which we do business has evolved, and Nikon Inc. must do the same.” And so, Nikon writes, it must “change the manner in which we make product service available to our end user customers.”

In other words: Nikon’s camera business, slowly bled by smartphones, is going to adopt a repair model that’s even more restrictive than that of Apple or other smartphone makers. If your camera breaks, and you want it fixed with official parts or under warranty, you’ll now have to mail it to one of two ends of the country. This is more than a little inconvenient, especially for professional photographers.

Source: Nikon Is Killing Its Authorized Repair Program – iFixit

System76 Will Begin Shipping 2 Linux Laptops With Coreboot-Based Open Source Firmware

System76, the Denver-based Linux PC manufacturer and developer of Pop OS, has some stellar news for those of us who prefer our laptops a little more open. Later this month the company will begin shipping two of their laptop models with its Coreboot-powered open source firmware.

Beginning today, System76 will start taking pre-orders for both the Galago Pro and Darter Pro laptops. The systems will ship out later in October, and include the company’s Coreboot-based open source firmware which was previously teased at the 2019 Open Source Firmware Conference.

(Coreboot, formerly known as LinuxBIOS, is a software project aimed at replacing proprietary firmware found in most computers with a lightweight firmware designed to perform only the minimum number of tasks necessary to load and run a modern 32-bit or 64-bit operating system.)

What’s so great about ripping out the proprietary firmware included in machines like this and replacing it with an open alternative? To begin with, it’s leaner. System76 claims that users can boot from power off to the desktop 29% faster with its Coreboot-based firmware.

Source: System76 Will Begin Shipping 2 Linux Laptops With Coreboot-Based Open Source Firmware

MIT Researchers Build Functional Carbon Nanotube Microprocessor

Scientists at MIT built a 16-bit microprocessor out of carbon nanotubes and even ran a program on it, a new paper reports.

Silicon-based computer processors seem to be approaching a limit to how small they can be scaled, so researchers are looking for other materials that might make for useful processors. It appears that transistors made from tubes of rolled-up, single-atom-thick sheets of carbon, called carbon nanotubes, could one day have more computational power while requiring less energy than silicon.

[…]

the MIT group, led by Gage Hills and Christian Lau, has now debuted a functional 16-bit processor called RV16X-NANO that uses carbon nanotubes, rather than silicon, for its transistors. The processor was constructed using the same industry-standard processes behind silicon chips—Shulaker explained that it’s basically just a silicon microprocessor with carbon nanotubes instead of silicon.

The processor works well enough to run HELLO WORLD, a program that simply outputs the phrase “HELLO WORLD” and is the first program that most coding students learn. Shulaker compared its performance to a processor you’d buy at a hobby shop to control a small robot.

[…]

A small but notable fraction of carbon nanotubes act like conductors instead of semiconductors. Shulaker explained that study author Hills devised a technique called DREAM, where the circuits were specifically designed to work despite the presence of metallic nanotubes. And of course, the effort relied on the contribution of every member of the relatively small team. The researchers published their results in the journal Nature today.

[…]

Ultimately, the goal isn’t to erase the decades of progress made by silicon microchips—perhaps companies can integrate carbon nanotube pieces into existing architectures.

This is still a proof-of-concept. The team still hasn’t calculated the chip’s performance or whether it’s actually more energy efficient than silicon—the gains are based on projections. But Shulaker hopes that the team’s work will serve as a roadmap toward incorporating carbon nanotubes in computers for the future.

Source: MIT Researchers Build Functional Carbon Nanotube Microprocessor

Researchers build a heat shield just 10 atoms thick to protect electronic devices

Excess heat given off by smartphones, laptops and other electronic devices can be annoying, but beyond that it contributes to malfunctions and, in extreme cases, can even cause lithium batteries to explode.

To guard against such ills, engineers often insert glass, plastic or even layers of air as insulation to prevent heat-generating components like microprocessors from causing damage or discomforting users.

Now, Stanford researchers have shown that a few layers of atomically thin materials, stacked like sheets of paper atop hot spots, can provide the same insulation as a sheet of glass 100 times thicker. In the near term, thinner heat shields will enable engineers to make electronic devices even more compact than those we have today, said Eric Pop, professor of electrical engineering and senior author of a paper published Aug. 16 in Science Advances.

[…]

To make nanoscale heat shields practical, the researchers will have to find some mass production technique to spray or otherwise deposit atom-thin layers of materials onto electronic components during manufacturing. But behind the immediate goal of developing thinner insulators looms a larger ambition: Scientists hope to one day control the vibrational energy inside materials the way they now control electricity and light. As they come to understand the heat in solid objects as a form of sound, a new field of phononics is emerging, a name taken from the Greek root word behind telephone, phonograph and phonetics.

“As engineers, we know quite a lot about how to control electricity, and we’re getting better with light, but we’re just starting to understand how to manipulate the high-frequency sound that manifests itself as heat at the atomic scale,” Pop said.

Source: Researchers build a heat shield just 10 atoms thick to protect electronic devices

Apple Is Locking iPhone Batteries to Discourage Repair, showing ominous errors if you replace your battery

By activating a dormant software lock on their newest iPhones, Apple is effectively announcing a drastic new policy: only Apple batteries can go in iPhones, and only they can install them.

If you replace the battery in the newest iPhones, a message indicating you need to service your battery appears in Settings > Battery, next to Battery Health. The “Service” message is normally an indication that the battery is degraded and needs to be replaced. The message still shows up when you put in a brand new battery, however. Here’s the bigger problem: our lab tests confirmed that even when you swap in a genuine Apple battery, the phone will still display the “Service” message.

It’s not a bug; it’s a feature Apple wants. Unless an Apple Genius or an Apple Authorized Service Provider authenticates a battery to the phone, that phone will never show its battery health and always report a vague, ominous problem.

Source: Apple Is Locking iPhone Batteries to Discourage Repair – iFixit

Quantum interference allows huge data sets to be sifted through much more quickly

Contemporary science, medicine, engineering and information technology demand efficient processing of data—still images, sound and radio signals, as well as information coming from different sensors and cameras. Since the 1970s, this has been achieved by means of the Fast Fourier Transform algorithm (FFT). The FFT makes it possible to efficiently compress and transmit data, store pictures, broadcast digital TV, and talk over a mobile phone. Without this algorithm, medical imaging systems based on magnetic resonance or ultrasound would not have been developed. However, it is still too slow for many demanding applications.
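
As a quick reminder of why the FFT is so central here, the sketch below compresses a noisy signal by keeping only its strongest frequency components and reconstructing from those; the synthetic signal and the number of coefficients kept are arbitrary choices for illustration.

```python
# FFT-based "compression": transform, keep the strongest coefficients, invert.
import numpy as np

n = 1024
t = np.linspace(0, 1, n, endpoint=False)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
signal += 0.1 * np.random.randn(n)          # two tones plus noise

spectrum = np.fft.rfft(signal)              # O(n log n) thanks to the FFT

keep = 10                                   # keep only the 10 strongest coefficients
weakest = np.argsort(np.abs(spectrum))[:-keep]
compressed = spectrum.copy()
compressed[weakest] = 0

reconstructed = np.fft.irfft(compressed, n)
rms_error = np.sqrt(np.mean((signal - reconstructed) ** 2))
print(f"kept {keep}/{len(spectrum)} coefficients, RMS error {rms_error:.3f}")
```

The point of the quantum approaches described next is to cut that operation count down further still.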

To meet this goal, scientists have been trying for years to harness quantum mechanics. This resulted in the development of a quantum counterpart of the FFT, the Quantum Fourier Transform (QFT), which can be realized with a quantum computer. As the quantum computer simultaneously processes all possible values (so-called “superpositions”) of input data, the number of operations decreases considerably.

[…]

Mathematics describes many transforms. One of them is a Kravchuk transform. It is very similar to the FFT, as it allows processing of discrete (e.g. digital) data, but uses Kravchuk functions to decompose the input sequence into the spectrum. At the end of the 1990s, the Kravchuk transform was “rediscovered” in computer science. It turned out to be excellent for image and sound processing. It allowed scientists to develop new and much more precise algorithms for the recognition of printed and handwritten text (including even the Chinese language), gestures, sign language, people, and faces. A dozen years ago, it was shown that this transform is ideal for processing low-quality, noisy and distorted data, and thus it could be used for computer vision in robotics and autonomous vehicles. There is no fast algorithm to compute this transform, but it turns out that quantum mechanics allows one to circumvent this limitation.

“Holy Grail” of computer science

In their article published in Science Advances, scientists from the University of Warsaw—Dr. Magdalena Stobinska and Dr. Adam Buraczewski, scientists from the University of Oxford, and NIST, have shown that the simplest quantum gate, which interferes between two quantum states, essentially computes the Kravchuk transform. Such a gate could be a well-known optical device—a beam splitter, which divides photons between two outputs. When two states of quantum light enter its input ports from two sides, they interfere. For example, two identical photons, which simultaneously enter this device, bunch into pairs and come out together by the same exit port. This is the well-known Hong-Ou-Mandel effect, which can also be extended to states made of many particles. By interfering “packets” consisting of many indistinguishable photons (indistinguishability is very important, as its absence destroys the quantum effect), which encode the information, one obtains a specialized quantum computer that computes the Kravchuk transform.
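
To make the Hong-Ou-Mandel effect concrete, here’s a small sketch that computes the output statistics of two photons meeting at a 50:50 beam splitter using the standard permanent rule for bosons in a linear-optical network; it’s a textbook toy, not a model of the Oxford experiment.

```python
# Two-photon (Hong-Ou-Mandel) interference at a 50:50 beam splitter,
# computed via matrix permanents. Illustrative only.
import itertools
import math
import numpy as np

def permanent(m):
    n = m.shape[0]
    return sum(np.prod([m[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

U = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)    # 50:50 beam splitter unitary

def prob(inputs, outputs):
    """Probability that photons entering modes `inputs` leave in modes `outputs`."""
    sub = U[np.ix_(inputs, outputs)]
    norm = math.prod(math.factorial(inputs.count(m)) for m in set(inputs)) * \
           math.prod(math.factorial(outputs.count(m)) for m in set(outputs))
    return abs(permanent(sub)) ** 2 / norm

inputs = [0, 1]   # one photon in each input port
for outputs, label in [([0, 1], "one photon per output (coincidence)"),
                       ([0, 0], "both photons in output port 0"),
                       ([1, 1], "both photons in output port 1")]:
    print(f"{label}: P = {prob(inputs, outputs):.2f}")
```

For indistinguishable photons the coincidence probability vanishes and the photons bunch; distinguishable photons would coincide half the time.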

The experiment was performed in a quantum optical laboratory at the Department of Physics at the University of Oxford, where a special setup was built to produce multiphoton quantum states, so-called Fock states. This laboratory is equipped with TES (transition-edge sensors), developed by NIST, which operate at near-absolute zero temperatures. These detectors possess a unique feature: they can actually count photons. This allows one to precisely read the quantum state leaving the beam splitter and thus, the result of the computation. Most importantly, such a computation of the quantum Kravchuk transform always takes the same time, regardless of the size of the input data set. It is the “Holy Grail” of computer science: an algorithm consisting of just one operation, implemented with a single gate. Of course, in order to obtain the result in practice, one needs to perform the experiment several hundred times to get the statistics. This is how every quantum computer works. However, it does not take long, because the laser produces dozens of millions of multiphoton “packets” per second.

Source: Quantum interference in the service of information technology

AMD Ryzen 7 3700X + Ryzen 9 3900X Offer Incredible Linux Performance – if you can get it to boot. Which newer distros seemingly can’t

On newer Linux distributions there’s a hard regression, either within the kernel itself or, more likely, in some cross-kernel/user-space interaction, that leaves them unbootable.

While Ubuntu 18.04 LTS and older Linux distributions boot Zen 2, to date I have not been able to successfully boot the likes of Ubuntu 19.04, Manjaro Linux, and Fedora Workstation 31. On all newer Linux distributions I’ve tried on two different systems built around the Ryzen 7 3700X and Ryzen 9 3900X, each time early in the boot process as soon as trying to start systemd services, all systemd services fail to start.

I’ve confirmed with AMD they do have an open issue surrounding “5.0.9” (the stock kernel of Ubuntu 19.04) but as of writing they hadn’t shed any light on the issue. AMD has said their testing has been mostly focused on Ubuntu 18.04 given its LTS status. I’ve also confirmed the same behavior with some other Windows reviewers who occasionally dabble with Linux.

So unfortunately not being able to boot newer Linux distributions is a huge pain. I’ve spent days trying different BIOS versions/options, different kernel command line parameters, and other options to no avail. On some Linux distributions after roughly 20~30 minutes of waiting after all systemd services fail to start, sometimes there will be a kernel panic but that hadn’t occurred on all systems at least not within that time-frame.

Source: AMD Ryzen 7 3700X + Ryzen 9 3900X Offer Incredible Linux Performance But With A Big Caveat Review – Phoronix

The Asus ZenBook Pro Duo laptop with two 4K screens – for some reason people are comparing it to Apple’s Touch Bar, but it has nothing to do with that.

The ZenBook Pro Duo has not one, but two 4K screens. (At least if you’re counting horizontal pixels.) There’s a 15-inch 16:9 OLED panel where you’d normally find the display on a laptop, then a 32:9 IPS “ScreenPad Plus” screen directly above the keyboard that’s the same width and half the height. It’s as if Asus looked at the MacBook Pro Touch Bar and thought “what if that, but with 32 times as many pixels?”

Unlike the Touch Bar, though, the ScreenPad Plus doesn’t take anything away from the ZenBook Pro Duo, except presumably battery life. Asus still included a full-sized keyboard with a function row, including an escape key, and the trackpad is located directly to the right. The design is very reminiscent of Asus’ Zephyrus slimline gaming laptops — you even still get the light-up etching that lets you use the trackpad as a numpad. HP tried something similar recently, too, though its second screen was far smaller.

Asus has built some software for the ScreenPad Plus that makes it more of a secondary control panel, but you can also use it as a full-on monitor, or even two if you want to split it into two smaller 16:9 1080p windows. You can also set it to work as an extension of the main screen, so websites rise up from above your keyboard as you scroll down, which is pretty unnerving. Or you could use it to watch Lawrence of Arabia while you jam on Excel spreadsheets.

The ZenBook Pro Duo has up to an eight-core Intel Core i9 processor with an Nvidia RTX 2060 GPU. There are four far-field microphones designed for use with Alexa and Cortana, and there’s an Echo-style blue light at the bottom edge that activates with voice commands. It has a Thunderbolt 3 port, two USB-A ports, a headphone jack, and a full-sized HDMI port.

Performance seemed fine in my brief time using the ZenBook Pro Duo, without any hiccups or hitches even when running an intensive video editing software demo. It’s a fairly hefty laptop at 2.5kg (about 5.5lbs), but that’s to be expected given the gaming laptop-class internals. I would also expect its battery life to fall somewhere close to that particular category of products, though we’ll have to wait and see about that.

While both of the screens looked good, I will say they looked different. Part of that is because of the searing intensity of the primary OLED panel, but the ScreenPad Plus is also coated with a matte finish, and usually looks less bright because of how you naturally view it at an off angle.

Asus is also making a cheaper and smaller 14-inch model called the ZenBook Duo. The design and concept is basically the same, but both screens are full HD rather than 4K, there’s no Core i9 option, and the discrete GPU has been heavily downgraded to an MX250.

Asus hasn’t announced pricing or availability for the ZenBook Pro Duo or the ZenBook Duo, but they’re expected to land in the third quarter of this year.

Source: The Asus ZenBook Pro Duo is an extravagant laptop with two 4K screens – The Verge

Why they see any similarity to the Apple Touch Bar is beyond me – this springs from a totally different well. The dual-screen laptop concept has been around for a lot longer than Apple putting a tiny strip somewhere. This is something that’s actually useful.

Tractors, not phones, will (maybe) get America a right-to-repair law at this rate: Bernie slams ‘truly insane’ situation

A person’s “right to repair” their own equipment may well become a US election issue, with presidential candidate Bernie Sanders making it a main talking point during his tour of Iowa.

“Are you ready for something truly insane?” the veteran politician’s account tweeted on Sunday, “Farmers aren’t allowed to repair their own tractors without paying an authorized John Deere repair agent.”

The tweet links to a clip of a recent Sanders rally during which he told the crowd to cheers: “Unbelievably, farmers are unable to even repair their own tractors, and tractors cost what – at least $150,000 – people are spending $150,000 for a piece of machinery. You know what I think? The person who buys that machinery has a right to fix the damn piece of machinery.”

The right-to-repair was also highlighted as one of Sanders’ key policy issues in his plan to “revitalize rural America,” and he promised: “When we are in the White House, we will pass a national right-to-repair law that gives every farmer in America full rights over the machinery they buy.”

Source: Tractors, not phones, will (maybe) get America a right-to-repair law at this rate: Bernie slams ‘truly insane’ situation • The Register

There is hope yet…