Qubits put new spin on magnetism: Boosting applications of quantum computers

[…] “With the help of a quantum annealer, we demonstrated a new way to pattern magnetic states of matter,” said Alejandro Lopez-Bezanilla, a virtual experimentalist in the Theoretical Division at Los Alamos National Laboratory. Lopez-Bezanilla is the corresponding author of a paper about the research in Science Advances.

“We showed that a magnetic quasicrystal lattice can host states that go beyond the zero and one bit states of classical information technology,” Lopez-Bezanilla said. “By applying a magnetic field to a finite set of spins, we can morph the magnetic landscape of a quasicrystal object.”

[…]

Lopez-Bezanilla selected 201 qubits on the D-Wave computer and coupled them to each other to reproduce the shape of a Penrose quasicrystal.

Since Roger Penrose conceived the aperiodic structures named after him in the 1970s, no one had put a spin on each of their nodes to observe their behavior under the action of a magnetic field.

“I connected the qubits so all together they reproduced the geometry of one of his quasicrystals, the so-called P3,” Lopez-Bezanilla said. “To my surprise, I observed that applying specific external magnetic fields on the structure made some qubits exhibit both up and down orientations with the same probability, which leads the P3 to adopt a rich variety of magnetic shapes.”

Manipulating the interaction strength between qubits, and between the qubits and the external field, causes the quasicrystals to settle into different magnetic arrangements, offering the prospect of encoding more than one bit of information in a single object.
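
Those two knobs, the qubit-qubit couplings and the per-qubit external field, are exactly what D-Wave’s machines expose as the J and h terms of an Ising problem. A minimal sketch of programming such a problem with the open-source dimod package (a 4-spin toy ring with invented values, not the paper’s 201-qubit Penrose P3 embedding):

  # Toy Ising problem in the style of a D-Wave annealing run.
  # The 4-spin ring and all h/J values are illustrative only.
  import dimod

  h = {0: -0.5, 1: 0.0, 2: 0.5, 3: 0.0}   # external field on each spin
  J = {(0, 1): -1.0, (1, 2): -1.0,
       (2, 3): -1.0, (3, 0): -1.0}        # ferromagnetic couplings

  bqm = dimod.BinaryQuadraticModel.from_ising(h, J)

  # ExactSolver enumerates all 2^4 states; on real hardware you would
  # use DWaveSampler + EmbeddingComposite from dwave-system instead.
  best = dimod.ExactSolver().sample(bqm).first
  print(best.sample, best.energy)

Sweeping h while holding the couplings fixed is, in miniature, the “morphing of the magnetic landscape” described above.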

Some of these configurations exhibit no precise ordering of the qubits’ orientation.

“This can play in our favor,” Lopez-Bezanilla said, “because they could potentially host a quantum quasiparticle of interest for quantum information science.” A spin quasiparticle is able to carry information immune to external noise.

A quasiparticle is a convenient way to describe the collective behavior of a group of basic elements. Properties such as mass and charge can be ascribed to several spins moving as if they were one.

More information: Alejandro Lopez-Bezanilla, Field-induced magnetic phases in a qubit Penrose quasicrystal, Science Advances (2023). DOI: 10.1126/sciadv.adf6631. www.science.org/doi/10.1126/sciadv.adf6631

Source: Qubits put new spin on magnetism: Boosting applications of quantum computers

A look inside the lab building mushroom computers

At first glance, the Unconventional Computing Laboratory looks like a regular workspace, with computers and scientific instruments lining its clean, smooth countertops. But if you look closely, the anomalies start appearing. A series of videos shared with PopSci shows the weird quirks of this research: on top of the cluttered desks, there are large plastic containers with electrodes sticking out of a foam-like substance, and a massive motherboard with tiny oyster mushrooms growing on top of it.

[…]

Why? Integrating these complex dynamics and system architectures into computing infrastructure could in theory allow information to be processed and analyzed in new ways. And it’s definitely an idea that has gained ground recently, as seen through experimental biology-based algorithms and prototypes of microbe sensors and kombucha circuit boards.

In other words, they’re trying to see if mushrooms can carry out computing and sensing functions.

Inside the lab that’s growing mushroom computers
A mushroom motherboard. Andrew Adamatzky

With fungal computers, mycelium—the branching, web-like root structure of the fungus—acts as the conductor as well as the electronic components of a computer. (Remember, mushrooms are only the fruiting body of the fungus.) Mycelium can receive and send electric signals, as well as retain memory.

“I mix mycelium cultures with hemp or with wood shavings, and then place it in closed plastic boxes and allow the mycelium to colonize the substrate, so everything then looks white,” says Andrew Adamatzky, director of the Unconventional Computing Laboratory at the University of the West of England in Bristol, UK. “Then we insert electrodes and record the electrical activity of the mycelium. So, through the stimulation, it becomes electrical activity, and then we get the response.” He notes that this is the UK’s only wet lab—one where chemical, liquid, or biological matter is present—in any department of computer science.

Inside the lab that’s growing mushroom computers
Preparing to record dynamics of electrical resistance of hemp shaving colonized by oyster fungi. Andrew Adamatzky

Classical computers today see problems as binaries: the ones and zeros of the traditional approach these devices use. However, most dynamics in the real world cannot be captured through that system. This is why researchers are working on technologies like quantum computers (which could better simulate molecules) and living brain cell-based chips (which could better mimic neural networks): they can represent and process information in different ways, utilizing a series of complex, multi-dimensional functions, and provide more precise calculations for certain problems.

Already, scientists know that mushrooms stay connected with the environment and the organisms around them using a kind of “internet” communication. You may have heard this referred to as the wood wide web. By deciphering the language fungi use to send signals through this biological network, scientists might be able to not only get insights about the state of underground ecosystems, but also tap into them to improve our own information systems.

Cordyceps fungi
An illustration of the fruit bodies of Cordyceps fungi. Irina Petrova Adamatzky

Mushroom computers could offer some benefits over conventional computers. Although they can’t ever match the speeds of today’s modern machines, they could be more fault tolerant (they can self-regenerate), reconfigurable (they naturally grow and evolve), and consume very little energy.

Before stumbling upon mushrooms, Adamatzky worked on slime mold computers—yes, that involves using slime mold to carry out computing problems—from 2006 to 2016. Physarum, as slime molds are called scientifically, is an amoeba-like creature that spreads its mass amorphously across space. 

Slime molds are “intelligent,” which means that they can figure out their way around problems, like finding the shortest path through a maze without programmers giving them exact instructions or parameters about what to do. Yet, they can be controlled as well through different types of stimuli, and be used to simulate logic gates, which are the basic building blocks for circuits and electronics.

[Related: What Pong-playing brain cells can teach us about better medicine and AI]

Inside the lab that’s growing mushroom computers
Recording electrical potential spikes of hemp shaving colonized by oyster fungi. Andrew Adamatzky

Much of the work with slime molds was done on what are known as “Steiner tree” or “spanning tree” problems, which are important in network design and are solved using pathfinding optimization algorithms. “With slime mold, we imitated pathways and roads. We even published a book on bio-evaluation of the road transport networks,” says Adamatzky. “Also, we solved many problems in computational geometry. We also used slime molds to control robots.”

When he had wrapped up his slime mold projects, Adamatzky wondered if anything interesting would happen if they started working with mushrooms, an organism that’s both similar to, and wildly different from, Physarum. “We found actually that mushrooms produce action potential-like spikes. The same spikes as neurons produce,” he says. “We’re the first lab to report about spiking activity of fungi measured by microelectrodes, and the first to develop fungal computing and fungal electronics.”  

Inside the lab that’s growing mushroom computers
An example of how spiking activity can be used to make gates. Andrew Adamatzky

In the brain, neurons use spiking activity and patterns to communicate signals, and this property has been mimicked to make artificial neural networks. Mycelium does something similar. That means researchers can use the presence or absence of a spike as their zero or one, and map the different timing and spacing of detected spikes onto the various logic gates used in computing (OR, AND, etc.). Further, if you stimulate mycelium at two separate points, conductivity between them increases, and they communicate faster and more reliably, allowing memory to be established. This is similar to how brain cells form habits.
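
As an illustration of that encoding, with entirely invented thresholds and window lengths (the lab’s actual detection parameters aren’t given here), turning an electrode trace into bits could be as simple as:

  # Hypothetical decoder: treat each fixed time window of an electrode
  # trace as one bit, 1 if it contains a spike and 0 if not.
  # The threshold and window length are made up for illustration.
  def spikes_to_bits(trace, threshold=0.5, window=100):
      bits = []
      for start in range(0, len(trace), window):
          chunk = trace[start:start + window]
          bits.append(1 if any(v > threshold for v in chunk) else 0)
      return bits

  # Two quiet windows around one window with a spike-like excursion
  trace = [0.01] * 100 + [0.02] * 40 + [0.9, 0.8] + [0.02] * 58 + [0.01] * 100
  print(spikes_to_bits(trace))  # [0, 1, 0]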

Mycelium with different geometries can compute different logical functions, and researchers can map these circuits based on the electrical responses they receive from it. “If you send electrons, they will spike,” says Adamatzky. “It’s possible to implement neuromorphic circuits… We can say I’m planning to make a brain from mushrooms.”

Inside the lab that’s growing mushroom computers
Hemp shavings in the shaping of a brain, injected with chemicals. Andrew Adamatzky

So far, they’ve worked with oyster fungi (Pleurotus djamor), ghost fungi (Omphalotus nidiformis), bracket fungi (Ganoderma resinaceum), enoki fungi (Flammulina velutipes), split gill fungi (Schizophyllum commune) and caterpillar fungi (Cordyceps militaris).

“Right now it’s just feasibility studies. We’re just demonstrating that it’s possible to implement computation, and it’s possible to implement basic logical circuits and basic electronic circuits with mycelium,” Adamatzky says. “In the future, we can grow more advanced mycelium computers and control devices.” 

Source: A look inside the lab building mushroom computers | Popular Science

Knots smaller than human hair make materials unusually tough and (de|re)formable

[…] In the latest advance in nano- and micro-architected materials, engineers at Caltech have developed a new material made from numerous interconnected microscale knots.

The knots make the material far tougher than identically structured but unknotted materials: they absorb more energy and are able to deform more while still being able to return to their original shape undamaged. These new knotted materials may find applications in biomedicine and aerospace due to their durability, possible biocompatibility, and extreme deformability.

[…]

Each knot is around 70 micrometers in height and width, and each fiber has a radius of around 1.7 micrometers (around one-hundredth the radius of a human hair). While these are not the smallest knots ever made—in 2017 chemists tied a knot made from an individual strand of atoms—this does represent the first time that a material composed of numerous knots at this scale has ever been created. Further, it demonstrates the potential value of including these nanoscale knots in a material—for example, for suturing or tethering in biomedicine.

The knotted materials, which were created out of polymers, exhibit a tensile toughness that far surpasses materials that are unknotted but otherwise structurally identical, including ones where individual strands are interwoven instead of knotted. When compared to their unknotted counterparts, the knotted materials absorb 92 percent more energy and require more than twice the amount of strain to snap when pulled.

The knots were not tied but rather manufactured in a knotted state by using advanced high-resolution 3D lithography capable of producing structures at the nanoscale. The samples detailed in the Science Advances paper contain simple knots—an overhand with an extra twist that provides additional friction to absorb additional energy while the material is stretched. In the future, the team plans to explore materials constructed from more complex knots.

[…]

More information: Widianto P. Moestopo et al, Knots are not for naught: Design, properties, and topology of hierarchical intertwined microarchitected materials, Science Advances (2023). DOI: 10.1126/sciadv.ade6725

Source: Knots smaller than human hair make materials unusually tough

Researchers propose organoid intelligence (OI): the new frontier in biocomputing and intelligence-in-a-dish

[…] Human brains are slower than machines at processing simple information, such as arithmetic, but they far surpass machines in processing complex information, as brains deal better with few and/or uncertain data. Brains can perform both sequential and parallel processing (whereas computers can do only the former), and they outperform computers in decision-making on large, highly heterogeneous, and incomplete datasets and other challenging forms of processing.

[…]

Fundamental differences between biological and machine learning in the mechanisms of implementation and their goals result in two drastically different efficiencies. First, biological learning uses far less power to solve computational problems. For example, a larval zebrafish navigates the world to successfully hunt prey and avoid predators (4) using only 0.1 microwatts (5), while a human adult consumes 100 watts, of which brain consumption constitutes 20% (6, 7). In contrast, clusters used to master state-of-the-art machine learning models typically operate at around 10^6 watts.

[…]

Biological learning uses fewer observations to learn how to solve problems. For example, humans learn a simple “same-versus-different” task using around 10 training samples (12); simpler organisms, such as honeybees, also need remarkably few samples (~10^2) (13). In contrast, in 2011, machines could not learn these distinctions even with 10^6 samples (14) and in 2018, 10^7 samples remained insufficient (15). Thus, in this sense at least, humans operate at a >10^6 times better data efficiency than modern machines.

[…]

The power and efficiency advantages of biological computing over machine learning are multiplicative. If a human and a machine take the same amount of time per sample, then learning a new task costs the machine roughly 10^10 times more energy.
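
That headline figure follows from the numbers above, using the human’s 100 W total consumption rather than the brain’s 20 W share:

  power ratio:   10^6 W (cluster) / 10^2 W (human)    = 10^4
  sample ratio:  10^7 samples (machine) / 10 (human)  = 10^6
  energy ratio:  10^4 × 10^6                          = 10^10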

[…]

We have coined the term “organoid intelligence” (OI) to describe an emerging field aiming to expand the definition of biocomputing toward brain-directed OI computing, i.e. to leverage the self-assembled machinery of 3D human brain cell cultures (brain organoids) to memorize and compute inputs.

[…]

In this article, we present an architecture (Figure 1) and blueprint for an OI development and implementation program designed to:

● Determine the biofeedback characteristics of existing human brain organoids caged in microelectrode shells, potentially using AI to analyze recorded response patterns to electrical and chemical (neurotransmitters and their corresponding receptor agonists and antagonists) stimuli.

● Empirically test, refine, and, where needed, develop neurocomputational theories that elucidate the basis of in vivo biological intelligence and allow us to interact with and harness an OI system.

● Further scale up the brain organoid model to increase the quantity of biological matter, the complexity of brain organoids, the number of electrodes, algorithms for real-time interactions with brain organoids, and the connected input sources and output devices; and to develop big-data warehousing and machine learning methods to accommodate the resulting brain-directed computing capacity.

● Explore how this program could improve our understanding of the pathophysiology of neurodevelopmental and neurodegenerative disorders toward innovative approaches to treatment or prevention.

● Establish a community and a large-scale project to realize OI computing, taking full account of its ethical implications and developing a common ontology.

FIGURE 1
Figure 1 Architecture of an OI system for biological computing. At the core of OI is the 3D brain cell culture (organoid) that performs the computation. The learning potential of the organoid is optimized by culture conditions and enrichment by cells and genes critical for learning (including IEGs). The scalability, viability, and durability of the organoid are supported by integrated microfluidic systems. Various types of input can be provided to the organoid, including electrical and chemical signals, synthetic signals from machine sensors, and natural signals from connected sensory organoids (e.g. retinal). We anticipate high-resolution output measurement both by electrophysiological recordings obtained via specially designed 2D or 3D (shell) MEA, and potentially from implantable probes, and imaging of organoid structural and functional properties. These outputs can be used directly for computation purposes and as biofeedback to promote organoid learning. AI and machine learning are used throughout to encode and decode signals and to develop hybrid biocomputing solutions, in conjunction with a suitable big-data management system.

To the latter point, a community-forming workshop was held in February 2022 (51), which gave rise to the Baltimore Declaration Toward OI (52). It provides a statement of vision for an OI community that has led to the development of the program outlined here.

[…]

The past decade has seen a revolution in brain cell cultures, moving from traditional monolayer cultures to more organ-like, organized 3D cultures – i.e. brain organoids (Figure 2A). These can be generated either from embryonic stem cells or from the less ethically problematic iPSCs typically derived from skin samples (54). The Johns Hopkins Center for Alternatives to Animal Testing, among others, has produced such brain organoids with high levels of standardization and scalability (32) (Figure 2B). Having a diameter below 500 μm, and comprising fewer than 100,000 cells, each organoid is roughly one 3-millionth the size of the human brain (theoretically equating to 800 MB of memory storage). Other groups have reported brain organoids with average diameters of 3–5 mm and prolonged culture times exceeding 1 year (34–36, 55–59).
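
For scale: dividing the oft-cited (and admittedly rough) ~2.5-petabyte estimate of human brain storage capacity by 3 million gives 2.5 × 10^15 bytes / (3 × 10^6) ≈ 8 × 10^8 bytes, i.e. roughly 800 MB, consistent with the figure above. The 2.5 PB estimate is our assumption, not a number from the paper.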

FIGURE 2
Figure 2 Advances in 3D cell culturing provide the foundation for systems to explore organoid intelligence. (A) 3D neural cell cultures have important advantages for biological learning, compared with conventional 2D monolayers – namely a far greater density of cells, enhanced synaptogenesis, high levels of myelination, and enrichment by cell types essential to learning. (B) Brain organoid differentiation over time from 4 to 15 weeks, showing neurons (microtubule associated protein 2 [MAP2]; pink), oligodendrocytes (oligodendrocyte transcription factor [OLIG2]; red), and astrocytes (glial fibrillary acidic protein [GFAP]; green). Nuclei are stained with Hoechst 33342 (blue). Images were taken with an LSM 880 confocal microscope at 20x and 63x magnification. Scale bars are 100 μm and 20 μm, respectively. The images show the presence of MAP2-positive neurons as early as 4 weeks, while glial cells emerge at 8 weeks and there is a continuous increase in the number of astrocytes over time.

These organoids show various attributes that should improve their potential for biocomputing (Figure 2).

[…]

Axons in these organoids show extensive myelination. Pamies et al. were the first to develop a 3D human brain model showing significant myelination of axons (32). About 40% of axons in the brain organoids were myelinated (30, 31), which approaches the 50% found in the human brain (60, 61). Myelination has since been reproduced in other brain organoids (47, 62). Myelin reduces the capacitance of the axonal membrane and enables saltatory conduction from one node of Ranvier to the next. As myelination increases electrical conductivity approximately 100-fold, this promises to boost biological computing performance, though its functional impact in this model remains to be demonstrated.

Finally, these organoid cultures can be enriched with various cell types involved in biological learning, namely oligodendrocytes, microglia, and astrocytes. Glial cells are integrally important for the pruning of synapses in biological learning (63–65) but have not yet been reported at physiologically relevant levels in brain organoid models. Preliminary work in our organoid model has shown the potential for astroglia cell expansion to physiologically relevant levels (47). Furthermore, recent evidence that oligodendrocytes and astrocytes significantly contribute to learning plasticity and memory suggests that these processes should be studied from a neuron-to-glia perspective, rather than the neuron-to-neuron paradigm generally used (63–65). In addition, optimizing the cell culture conditions to allow the expression of immediate early genes (IEGs) is expected to further boost the learning and memory capacities of brain organoids, since these are key to learning processes and are expressed only in neurons involved in memory formation.

[…]

Source: Frontiers | Organoid intelligence (OI): the new frontier in biocomputing and intelligence-in-a-dish

AMD, Nvidia are ‘undershipping’ chips to keep CPU, GPU prices elevated

[…]

AMD’s client PC sales also dropped dramatically—a whopping 51 percent year-over-year—but the company managed to eke out a small profit despite the sky falling. So why aren’t CPU and GPU prices falling too? In a call with investors Tuesday night, CEO Lisa Su confirmed that AMD has been “undershipping” chips for a while now to balance supply and demand (read: keep prices up).

“We have been undershipping the sell-through or consumption for the last two quarters,” Su said, as spotted by PC Gamer. “We undershipped in Q3, we undershipped in Q4. We will undership, to a lesser extent, in Q1.”

With the pandemic winding down and inflation ramping up, far fewer people are buying CPUs, GPUs, and PCs. It’s a hard, sudden reversal from just months ago, when companies like Nvidia and AMD were churning out graphics cards as quickly as possible to keep up with booming demand from cryptocurrency miners and PC gamers alike. Now that GPU mining is dead, shelves are brimming with unsold chips.

Despite the painfully high price tags of new next-gen GPUs, last-gen GeForce RTX 30-series and Radeon RX 6000-series graphics cards are still selling for very high prices considering their two-year-old status. Strategic under-shipping helps companies maintain higher prices for their wares.

[…]

AMD isn’t the only one doing it, either.

“We’re continuing to watch each and every day in terms of the sell-through that we’re seeing,” Nvidia CFO Colette Kress said to investors in November. “So we have been undershipping. We have been undershipping gaming at this time so that we can correct that inventory that is out in the channel.”

Since then, Nvidia has released the $1,200 GeForce RTX 4080 and $800 RTX 4070 Ti, two wildly overpriced graphics cards, and tried positioning them as enthusiast-grade upsells over the RTX 30-series, rather than treating them like the usual cyclical upgrades. AMD’s $900 Radeon RX 7900 XT offers similarly disappointing value and the company recently released a blog post also positioning its new GPUs as enthusiast-grade upsells.

[…]

We expect—hope?—that as stocks dwindle and competition ramps up, sanity will return to graphics card prices, mirroring AMD and Intel’s recent CPU price adjustments. Just this morning, Intel announced that its Arc A750 graphics card was getting a price cut to $250, instantly making it an all-too-rare tempting target for PC gamers on a budget.

Source: AMD is ‘undershipping’ chips to keep CPU, GPU prices elevated | PCWorld

Perfectly Good MacBooks From 2020 Are Being Sold for Scrap Because of Activation Lock

Secondhand MacBooks that retailed for as much as $3,000 are being turned into parts because recyclers have no way to log in and factory reset the machines, which are often just a couple of years old.

“How many of you out there would like a 2-year-old M1 MacBook? Well, too bad, because your local recycler just took out all the Activation Locked logic boards and ground them into carcinogenic dust,” John Bumstead, a MacBook refurbisher and owner of the RDKL INC repair store, said in a recent tweet.

The problem is Apple’s T2 security chip. First introduced in 2018, the chip makes it impossible for anyone who isn’t the original owner to log into the machine. It’s a boon for security and privacy and a plague on the secondhand market. “Like it has been for years with recyclers and millions of iPhones and iPads, it’s pretty much game over with MacBooks now—there’s just nothing to do about it if a device is locked,” Bumstead told Motherboard. “Even the jailbreakers/bypassers don’t have a solution, and they probably won’t because Apple proprietary chips are so relatively formidable.” When Apple released its own silicon with the M1, it integrated the features of the T2 into those computers.

[…]

Bumstead told Motherboard that every year Apple makes life a little harder for the second hand market. “The progression has been, first you had certifications with unrealistic data destruction requirements, and that caused recyclers to pull drives from machines and sell without drives, but then as of 2016 the drives were embedded in the boards, so they started pulling boards instead,” he said. “And now the boards are locked, so they are essentially worthless. You can’t even boot locked 2018+ MacBooks to an external device because by default the MacBook security app disables external booting.”

Motherboard first reported on this problem in 2020, but Bumstead said it’s gotten worse recently. “Now we’re seeing quantity come through because companies with internal 3-year product cycles are starting to dump their 2018/2019s, and inevitably a lot of those are locked,” he said.

[…]

Bumstead offered some solutions to the problem. “When we come upon a locked machine that was legally acquired, we should be able to log into our Apple account, enter the serial and any given information, then click a button and submit the machine to Apple for unlocking,” he said. “Then Apple could explore its records, query the original owner if it wants, but then at the end of the day if there are no red flags and the original owner does not protest within 30 days, the device should be auto-unlocked.”

[…]

Source: Perfectly Good MacBooks From 2020 Are Being Sold for Scrap Because of Activation Lock

John Deere signs right to repair agreement

As farming has become more technology-driven, Deere has increasingly injected software into its products, with all of its tractors and harvesters now including an autopilot feature as standard.

There is also the John Deere Operations Center, which “instantly captures vital operational data to boost transparency and increase productivity for your business.”

Within a matter of years, the company envisages having 1.5 million machines and half a billion acres of land connected to the cloud service, which will “collect and store crop data, including millions of images of weeds that can be targeted by herbicide.”

Deere also estimates that software fees will make up 10 percent of the company’s revenues by the end of the decade, with Bernstein analysts pegging the average gross margin for farming software at 85 percent, compared to 25 percent for equipment sales.

Just like other commercial software vendors, however, Deere exercises close control and restricts what can be done with its products. This led farm labor advocacy groups to file a complaint to the US Federal Trade Commission last year, claiming that Deere unlawfully refused to provide the software and technical data necessary to repair its machinery.

“Deere is the dominant force in the $68 billion US agricultural equipment market, controlling over 50 per cent of the market for large tractors and combines,” said Fairmark Partners, the groups’ attorneys, in a preface to the complaint [PDF].

“For many farmers and ranchers, they effectively have no choice but to purchase their equipment from Deere. Not satisfied with dominating just the market for equipment, Deere has sought to leverage its power in that market to monopolize the market for repairs of that equipment, to the detriment of farmers, ranchers, and independent repair providers.”

[…]

The MoU, which can be read here [PDF], was signed yesterday at the 2023 AFBF Convention in San Juan, Puerto Rico, and seems to be a commitment by Deere to improve farmers’ access and choice when it comes to repairs.

[…]

Duvall said on a podcast about the matter that the MoU is the result of several years’ work. “As you use equipment, we all know at some point in time, there’s going to be problems with it. And we did have problems with having the opportunity to repair our equipment where we wanted to, or even repair it on the farm,” he added.

“It ensures that our farmers can repair their equipment and have access to the diagnostic tools and product guides so that they can find the problems and find solutions for them. And this is the beginning of a process that we think is going to be real healthy for our farmers and for the company because what it does is it sets up an opportunity for our farmers to really work with John Deere on a personal basis.”

[…]

Source: John Deere signs right to repair agreement • The Register

But… still gives John Deere access to their data for free?

This may also have something to do with the security of John Deere machines being so incredibly piss poor, mainly due to really bad update hygiene.

Europe Won’t Allow Mercedes’ EV Performance Subscription Fee, For Now

Mercedes raised some worried eyebrows with its recent announcement to offer additional power for its EVs via subscription. For electric EQE and EQS models, Mercedes will bump their horsepower if customers pay an additional $1,200 per year. However, that’s going to remain a U.S. market service only for the time being, as Europe currently won’t allow Mercedes to offer it, according to this report from Top Gear NL.

A spokesperson for Mercedes Netherlands told Top Gear NL that legal matters currently prevent Mercedes from offering a subscription-based power upgrade. However, the spokesperson declined to comment further, so it’s currently unknown what sort of laws block such subscription-based services, especially when other subscription services, such as BMW’s heated seat subscription, are available in Europe. Automakers can also update a car’s horsepower via free over-the-air service updates, as both Polestar and Tesla do in Europe. But those come at no extra cost and are one-time, permanent upgrades. So there seems to be some sort of legal issue with charging a yearly subscription for horsepower.

In the U.S. market, Mercedes’ $1,200 yearly subscription gets EQE and EQS owners a gain of nearly 100 horsepower. However, because it’s only software that unlocks the power, it’s obvious that the powertrain is capable of that much power regardless of subscription. So customers might feel cheated that they’re paying for a car with a powertrain that’s intentionally hamstrung from the factory, with its full potential hidden behind a paywall.

Source: Europe Won’t Allow Mercedes’ EV Performance Subscription Fee, For Now: Report

Let’s hope that this gets regulated properly at EU level – it’s bizarre that you can’t use something you paid for because it’s disabled and can be re-enabled remotely.

Intel and AMD did something like this in 2010, in a process called binning, where they artificially disabled features in the hardware:

As Engadget rather calmly points out, Intel has been testing the waters with a new “Upgrade Card” system, which essentially involves buying a $50 scratch card with a code that unlocks features in your PC’s processor.

The guys at Hardware.info broke this story last month, although nobody seemed to notice right away—perhaps because their site’s in Dutch. The article shows how the upgrade key unlocks “an extra megabyte L3 cache and Hyper Threading” on the Pentium G6951. In its locked state, that 2.8GHz processor has two physical cores, two threads, and 3MB of L3 cache, just like the retail-boxed Pentium G6950.

[…]

Detractors of the scheme might point out that Intel is making customers pay for features already present in the CPU they purchased. That’s quite true. However, as the Engadget post notes, both Intel and AMD have been selling CPUs with bits and pieces artificially disabled for years. That practice is known as binning—sometimes, chipmakers use it to unload parts with malfunctioning components; other times, it’s more about product segmentation and demand. There have often been unofficial workarounds, too. These days, for example, quite a few AMD motherboards let you unlock cores in Athlon II X3 and Phenom II X2 processors. Intel simply seems to be offering an official workaround for its CPUs… and cashing in on it.

Source: Intel ‘upgrade card’ unlocks disabled CPU features

Miners flood market with GPUs they no longer need as cryptocurrencies crash

As the cryptocurrency market currently goes through one of its worst nosedives in recent years, miners are trying to get rid of their mining hardware. Due to the crashing prices of popular crypto coins, numerous Chinese miners and e-cafes are flooding the market with graphics cards they no longer need.

Miners, e-cafes, and scalpers are now trying to sell their hardware stock on streams and auctions. As a result, users can snag a second-hand GPU, such as the RTX 3060 Ti, for $350 or even less. Many popular graphics cards going for MSRP or even less is quite a sight to behold after astronomically high prices and scarce availability during the last two years.

As tempting as it might be to snag a powerful Nvidia or AMD GPU for a price lower than its MSRP, it is not the best idea to go after a graphics card that went through seven rings of mining hell. Potential buyers should be aware that mining GPUs are often not in the best condition after spending months in always-on, always-100% mode.

With manufacturers increasing their supply and prices going down like never before, you may be better off spending a little more to get a new graphics card with a warranty and peace of mind. As a bonus, you can enjoy the view of scalpers getting desperate to recoup at least some money from their stock.

Source: Miners flood market with GPUs they no longer need as cryptocurrencies crash – Neowin

VR Controller Lets You Feel Objects Slip Between Your Fingers

[…]

Last year, researchers from the National Taiwan University’s Interactive Graphics (and Multimedia) Laboratory and the National Chengchi University revealed their Hair Touch controller at the 2021 Computer-Human Interaction conference. The bizarre-looking contraption featured a tuft of hair that could be extended and contracted so that when someone tried to pet a virtual cat, or interact with other furry objects in virtual reality, their fingers would actually feel the fur, as far as their brains were concerned.

That was more or less the same motivation for researchers from the Korea Advanced Institute of Science and Technology’s MAKinteract Lab to create the SpinOcchio VR controller. Instead of making virtual fur feel real, the controller is designed to recreate the feeling of slipping something between your fingers. In the researchers’ own words, it’s described as “a handheld haptic controller capable of rendering the thickness and slipping of a virtual object pinched between two fingers.”

To keep this story PG-13, let’s stick with one of the example use cases the researchers suggest for the SpinOcchio controller: virtual pottery. Making bowls, vases, and other ceramics on a potter’s wheel in real life requires the artist to be able to feel the spinning object in their hands in order to make it perfectly cylindrical and stable. Attempting to use a potter’s wheel in virtual reality with a pair of VR joysticks in hand is nowhere near the same experience, but that’s the ultimate goal of VR: to accurately recreate an experience that otherwise may be inaccessible to a user.

[…]

Source: VR Controller Lets You Feel Objects Slip Between Your Fingers

UltraRAM Breakthrough Brings Combined Memory and Storage to a Single Wafer

Scientists from the Physics and Engineering Department of the UK’s Lancaster University have published a paper detailing a breakthrough in the mass production of UltraRAM. Researchers have pondered over this novel memory type for several years due to its highly attractive qualities, and the latest breakthrough means that mass production on silicon wafers could be within sight. UltraRAM is described as a memory technology which “combines the non-volatility of a data storage memory, like flash, with the speed, energy-efficiency, and endurance of a working memory, like DRAM.”

ULTRARAM fabrication

(Image credit: Lancaster University)

Importantly, UltraRAM on silicon could be the universal memory type that will one day cater to all the memory needs (both RAM and storage) of PCs and devices.

[…]

The fundamental science behind UltraRAM is that it uses the unique properties of compound semiconductors, which are commonly used in photonic devices such as LEDs, lasers, and infrared detectors, and which can now be mass-produced on silicon. The researchers claim that the latest incarnation on silicon outperforms the technology as tested on Gallium Arsenide semiconductor wafers.

An ULTRARAM cell

(Image credit: Lancaster University)

Some extrapolated numbers for UltraRAM are that it will offer “data storage times of at least 1,000 years,” and its fast switching speed and program-erase cycling endurance are “one hundred to one thousand times better than flash.” Add these qualities to the DRAM-like speed, energy efficiency, and endurance, and this novel memory type sounds hard for tech companies to ignore.

If you read between the lines above, you can see that UltraRAM is envisioned to break the divide between RAM and storage. So, in theory, you could use it as a one-shot solution to fill these currently separate requirements. In a PC system, that would mean you would get a chunk of UltraRAM, say 2TB, and that would cover both your RAM and storage needs.

The shift, if it lives up to its potential, would be a great way to push forward with the popular trend toward in-memory processing. After all, with UltraRAM your storage would be your memory; it is the same silicon.

[…]

Source: UltraRAM Breakthrough Brings New Memory and Storage Tech to Silicon | Tom’s Hardware

CyberPowerPC case uses Kinetic Architecture to adjust airflow in real-time

[…]

Kinetic Architecture is a concept in which buildings are designed so that parts of the structure can move. CyberPowerPC took this idea and created a KINETIC chassis with 18 individually controlled articulating vents that open and close automatically, based on the computer’s current internal ambient temperature.

“We are entering 2022 with some of our most sophisticated and elegant designs ever. For discriminating gamers our PC Master Builders are ready to hand-build and test new gaming PCs that are ultra-clean, streamlined, and deliver maximum performance for those who want something truly unique.”

Eric Cheung, CyberPowerPC CEO

The vents aren’t a simple case of open or closed, either: they adjust to varying degrees with every degree of internal temperature. Users can customize the temperature ranges as well, and a quick button will let you fully open or close the vents instantly. The KINETIC chassis supports full ATX size motherboards, up to seven 120mm or five 140mm fans, and most extended-length graphics cards.
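
CyberPowerPC hasn’t published its control logic, but the behavior described, vents opening progressively with every degree across a user-adjustable temperature range, amounts to a simple linear mapping. A hypothetical sketch (the 30-50 °C range is a stand-in for whatever range a user configures):

  # Hypothetical vent controller: linearly map internal ambient
  # temperature onto a vent opening percentage.
  def vent_opening(temp_c, t_closed=30.0, t_open=50.0):
      """Return vent opening in percent: 0 = fully closed, 100 = fully open."""
      fraction = (temp_c - t_closed) / (t_open - t_closed)
      return 100.0 * min(1.0, max(0.0, fraction))

  for t in (25, 35, 45, 55):
      print(t, "C ->", vent_opening(t), "% open")   # 0.0, 25.0, 75.0, 100.0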

Key features of the CyberPowerPC KINETIC chassis include:

  • CyberPowerPC exclusive patent pending kinetic design.
  • 18 Individually actuating vents that adjust in real time to ambient case temperatures.
  • Maximizes airflow and cooling when case temps are high.
  • Reduces noise and dust when case temps are low.
  • Temperature sensor ranges can be adjusted to fit your needs.
  • Available in both black and white mid-tower options.

The CyberPowerPC KINETIC Series PC case will ship in Q3 2022 from CyberPowerPC.com and CyberPowerPC’s network of authorized retailers and distributors. The chassis is backed by a one-year warranty and lifetime technical support. The suggested MSRP is US$249.

[…]

Source: [CES 2022] CyberPowerPC case uses Kinetic Architecture to adjust airflow in real-time

Why our electronics break: what we can learn from nearly 10 years of repairs and nearly 50,000 broken items

We now have data on over 21,000 broken items and what was done to fix them. This information comes from volunteers at our own events and others who use our community repair platform, restarters.net.

Thanks to our partners in the Open Repair Alliance who also collect this kind of data, we were able to include extra data from other networks around the world.

Together, this brought the total to nearly 50,000 broken items.

Want to see this data for yourself? Download the full dataset here
(Note: Links to the datasets that contain fault types are further down this page)

That’s a lot of data. So to analyse it, we focused on three types of products that the European Commission would be investigating:

  • Printers
  • Tablets
  • The batteries that power many of our gadgets.

[…]

Thanks to this collective effort, we were able to identify the most common reasons printers, tablets and batteries become unusable.

A diagram showing the most common tablet problems
These findings are based on the analysis of problems in 647 tablets brought to community repair events, but don’t include 131 tablets with poor data quality, making it impossible to confirm the main fault.
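
The Open Repair Alliance publishes this data as plain CSV, so fault-frequency counts like the one behind the diagram above are straightforward to reproduce. A sketch of the idea (the file name and column names are our assumptions; check the header of the dataset you download):

  # Count the most common tablet fault types in an Open Repair
  # Alliance-style CSV export. Column names are assumed, not official.
  import pandas as pd

  df = pd.read_csv("open-repair-data.csv")
  tablets = df[df["product_category"] == "Tablet"]

  # Most common fault types, ignoring rows with no usable diagnosis
  print(tablets["fault_type"].dropna().value_counts().head(10))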

In addition, many of the items we looked at were fairly old, demonstrating that people really want to keep using their devices for longer.

But we also found that there are lots of barriers to repair that make this tricky. Some of the biggest are the lack of spare parts and repair documentation as well as designs that make opening the product difficult without causing extra damage.

You can see our full results and download the data for yourself here:

[…]

We want rules that make products easier to fix. And we’re already using data to push for a real Right to Repair. Just recently, we used previous findings to undermine an industry lobbyist’s anti-repair arguments in an EU policy meeting about upcoming regulations for smartphone and tablet repairability.

As a follow up, we also contributed our findings on common fault types in tablets, making the case for the need for better access to spare parts and repair information for this product category as well.

Next, we hope to increase the pressure on European policymakers for regulating printer repairability and battery-related issues in consumer products. For printers, the European Commission is considering rejecting a “voluntary agreement” proposed by industry, which ignores repairability for consumer printers.

And as for batteries, European institutions are working towards a Batteries Regulation, which must prioritise user-replaceability as well as the availability of spare parts.

[…]

Source: Why our electronics break: what we can learn from nearly 10 years of repairs – The Restart Project

New IBM and Samsung transistors could be key to super-efficient vertical chips

IBM and Samsung claim they’ve made a breakthrough in semiconductor design. On day one of the IEDM conference in San Francisco, the two companies unveiled a new design for stacking transistors vertically on a chip. With current processors and SoCs, transistors lie flat on the surface of the silicon, and electric current flows from side to side. By contrast, Vertical Transport Field Effect Transistors (VTFET) sit perpendicular to one another, and current flows vertically.

[…]

The design leads to less wasted energy thanks to greater current flow. The companies estimate VTFET will lead to processors that are either twice as fast or use 85 percent less power than chips designed with FinFET transistors.

[…]

Source: New IBM and Samsung transistors could be key to super-efficient chips (updated) | Engadget

How to Build a Supersonic Trebuchet

What do you get when you combine ancient designs with modern engineering? An exciting new way to convert time and money into heat and noise! I’m not sure whether to call this a catapult or a trebuchet, but it’s definitely the superior siege engine.

Have you ever sat down and thought “I wonder if a trebuchet could launch a projectile at supersonic speeds?” Neither have we. That’s what separates [David Eade] from the rest of us. He didn’t just ask the question, he answered it! And he documented the entire build in a YouTube video which you can see below the break.

Source: https://hackaday.com/2021/12/01/supersonic-projectile-exceeds-engineers-dreams-the-supersonic-trebuchet/

Replacement Motherboard Brings New Lease Of Life To Classic T60 / T61 Thinkpads

[…] Even the best hardware eventually becomes obsolete when it can no longer run modern software: with a 2.0 GHz Core Duo and 3 GB of RAM you can still browse the web and do word processing today, but you can forget about 4K video or a 64-bit OS. Luckily, there’s hope for those who are just not ready to part with their trusty Thinkpads: [Xue Yao] has designed a replacement motherboard that fits the T60/T61 range, bringing them firmly into the present day. The T700 motherboard is currently in its prototype phase, with series production expected to start in early 2022, funded through a crowdfunding campaign.

Designing a motherboard for a modern CPU is no mean feat, and making it fit an existing laptop, with all the odd shapes and less-than-standard connections, is even more impressive. The T700 has an Intel Core i7 CPU with four cores running at 2.8 GHz, while two RAM slots allow for up to 64 GB of DDR4-3200 memory. There are modern USB-A and USB-C ports, as well as a 6 Gbps SATA interface and two M.2 slots for your SSDs.

As for the display, the T700 motherboard will happily connect to the original screens built into the T60/T61, or to any of a range of aftermarket LED based replacements. A Thunderbolt connector is available, but only operates in USB-C mode due to firmware issues; according to the project page, full support for Thunderbolt 4 is expected once the open-source coreboot firmware has been ported to the T700 platform.

We love projects like this that extend the useful life of classic computers to keep them running way past their expected service life. But impressive though this is, it’s not the first time someone has made a replacement motherboard for the Thinkpad line; we covered a project from the nb51 forum back in 2018, which formed the basis for today’s project. We’ve seen lots of other useful Thinkpad hacks over the years, from replacing the display to revitalizing the batteries. Thanks to [René] for the tip.

Source: Replacement Motherboard Brings New Lease Of Life To Classic Thinkpads | Hackaday

Commercial and Military Applications and Timelines for Quantum Technology | RAND

This report provides an overview of the current state of quantum technology and its potential commercial and military applications. The author discusses each of the three major categories of quantum technology: quantum sensing, quantum communication, and quantum computing. He also considers the likely commercial outlook over the next few years, the major international players, and the potential national security implications of these emerging technologies. This report is based on a survey of the available academic literature, news reporting, and government-issued position papers.

Most of these technologies are still in the laboratory. Applications of quantum sensing could become commercially or militarily ready within the next few years. Although limited commercial deployment of quantum communication technology already exists, the most-useful military applications still lie many years away. Similarly, there may be niche applications of quantum computers in the future, but all known applications are likely at least ten years away. China currently leads the world in the development of quantum communication, while the United States leads in the development of quantum computing.

Key Findings

Quantum technology is grouped into three broad categories: quantum sensing, quantum communication, and quantum computing

  • Quantum sensing refers to the ability to use quantum mechanics to build extremely precise sensors. This is the application of quantum technology considered to have the nearest-term operational potential.
  • The primary near-term application of quantum communication technology is security against eavesdroppers, primarily through a method known as quantum key distribution (QKD); a toy sketch follows this list. Longer-term applications include networking together quantum computers and sensors.
  • Quantum computing refers to computers that could, in principle, perform certain computations vastly more quickly than is fundamentally possible with a standard computer. Certain problems that are completely infeasible to solve on a standard computer could become feasible on a quantum computer.
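
As a toy illustration of the QKD idea flagged in the second bullet, here is a bare-bones BB84-style simulation (no eavesdropper, channel noise, or error correction, and not any deployed protocol stack):

  # Toy BB84 key exchange. Alice encodes random bits in random bases;
  # Bob measures in random bases and recovers Alice's bit only when
  # the bases happen to match. The matching positions form the key.
  import secrets

  n = 32
  alice_bits  = [secrets.randbelow(2) for _ in range(n)]
  alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
  bob_bases   = [secrets.randbelow(2) for _ in range(n)]

  bob_bits = [bit if ab == bb else secrets.randbelow(2)
              for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

  # Sifting: publicly compare bases (never the bits) and keep matches,
  # about half of the positions on average.
  key = [bit for bit, ab, bb in zip(bob_bits, alice_bases, bob_bases)
         if ab == bb]
  print(key)

An intercept-and-resend eavesdropper, guessing bases at random, would corrupt roughly a quarter of the sifted bits, which Alice and Bob can detect by sacrificing and comparing a random subset of the key.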

Every subfield of quantum technology potentially has major implications for national security

  • Some of the primary applications for quantum sensing include position, navigation, and timing and possibly intelligence, surveillance, and reconnaissance.
  • Quantum communication technology could use QKD to protect sensitive encrypted communications against hostile interception, although some experts consider other security solutions to be more promising.
  • Quantum computing could eventually have the most severe impact on national security. A large-scale quantum computer capable of deploying Shor’s algorithm on current encryption would have a devastating impact on virtually all internet security.

There is no clear overall world leader in quantum technology

  • The United States, China, the European Union, the United Kingdom, and Canada all have specific national initiatives to encourage quantum-related research.
  • The United States and China dominate in overall spending and the most-important technology demonstrations, but Canada, the United Kingdom, and the European Union also lead in certain subfields.
  • China is the world leader in quantum communication, and the United States is the world leader in quantum computing.

The highest-impact quantum technologies are still many years away

  • Applications of quantum sensing could become commercially or militarily ready within the next few years.
  • Limited commercial deployment of quantum communication technology already exists, but the most-useful military and commercial applications still lie many years away.
  • There may be niche applications of quantum computers over the next few years, but all currently known applications are likely at least ten years away.

Source: Commercial and Military Applications and Timelines for Quantum Technology | RAND

Oculus Quest VR Goggles Become a Paperweight When Facebook Goes Down

When Facebook went down yesterday for nearly six hours, so did Oculus’ services. Since Facebook owns VR headset maker Oculus, and controversially requires Oculus Quest users to log in with a Facebook account, many Quest owners reported not being able to load their Oculus libraries. “[A]nd those who just took a Quest 2 out of the box have reported that they’re unable to complete the initial setup,” adds PCGamer. As VRFocus points out, with Oculus’ services so closely linked to a Facebook account, your Oculus Quest/Quest 2 is essentially bricked until services resume.

This vividly highlights the problem with having to connect to Facebook’s services to gain access to apps — the WiFi connection was fine. Even the apps already downloaded and taking up actual storage space didn’t show up. It’s why some VR fans began boycotting the company when it made it mandatory that all Oculus Quest 2s be affiliated with a Facebook account.

If you want to unlink your Facebook account from Oculus Quest and don’t want to pay extra for that ability, you’re in luck thanks to a sideloadable tool called “Oculess.” From an UploadVR article published earlier today: you still need a Facebook account to set up the device in the first place, and you need to give Facebook a phone number or card details to sideload, but after that you could use Oculess to forgo Facebook entirely — just remember to never factory reset. The catch is you’ll lose access to Oculus Store apps because the entitlement check used upon launching them will no longer function. System apps like Oculus TV and Browser will also no longer launch, and casting won’t work. You can still sideload hundreds of apps from SideQuest though, and if you want to keep browsing the web in VR you can sideload Firefox Reality. You can still use Oculus Link to play PC VR content, but only if you stay signed into Facebook on the Oculus PC app. Virtual Desktop won’t work because it’s a store app, but you can sideload free alternatives such as ALVR.

To use Oculess, just download it from GitHub and sideload it using SideQuest or Oculus Developer Hub, then launch it from inside VR. If your Quest isn’t already in developer mode or you don’t know how to sideload, you can follow our guide here.

Source: Oculus Quest Becomes a Paperweight When Facebook Goes Down – Slashdot

Scientists Have Successfully Recorded Data to DNA in Minutes, Not Hours

[…]

researchers at Northwestern University have devised a new method for recording information to DNA that takes minutes rather than hours or days.

The researchers utilized a novel enzymatic system to synthesize DNA that records rapidly changing environmental signals straight into its sequences, and this method could revolutionize how scientists examine and record neurons inside the brain.

A faster and higher resolution recording

To record intracellular molecular and digital data to DNA, scientists currently rely on multipart processes that combine new information with existing DNA sequences. This means that, for an accurate recording, they must stimulate and repress the expression of specific proteins, which can take over 10 hours to complete.

The new study’s researchers hypothesized they could make this process faster by utilizing a new method they call “Time-sensitive Untemplated Recording using TdT for Local Environmental Signals,” or TURTLES. This way, they would synthesize completely new DNA rather than copying a template of it. The method enabled the data to be recorded into the genetic code in a matter of minutes.

[…]

Source: Scientists Have Successfully Recorded Data to DNA in a Few Short Minutes

Apple miffed by EU’s ‘strict’ one-size-fits-all charger plan

Smartphones, tablets, and cameras sold within the European Union could be forced to adopt a single standard charging port by the middle of the decade if the latest plans from the European Commission get the go-ahead.

The proposals for a revised Radio Equipment Directive would mean that charging port and fast-charging technology would be “harmonised” across the EU with USB-C becoming the standard for all tech. Quite where this leaves Apple is open to some debate.

Plans to standardise chargers were hatched all the way back in 2011 and by 2014 MicroUSB was the connector design chosen. Vendors signed an MoU but Cupertino went its own way.

Under the EU’s latest effort, the proposal will be legally binding. A bloc-wide common charging standard was put to MEPs in January 2020 and the measure passed by 582 votes to 40, with 37 abstentions.

Today’s announcement also means that chargers would no longer be sold with gadgets and gizmos. The EU calculated seven years ago that 51,000 metric tons of electronics waste across the nation states was attributed annually to old chargers, although that number seems to have fallen dramatically since.

[…]

The direction of travel, however, has flagged concerns for Apple – not for the first time – which appears displeased at being steamrolled into making changes. El Reg understands the tech giant is concerned about the impact this would have on Apple’s bottom line (read: the industry) and the waste it would create (in the short term at least).

Indeed, there are also concerns that if the rules are introduced too quickly it could mean that perfectly good tech with plenty of shelf life gets dumped prematurely.

In a statement, a spokesperson for Apple told The Reg – you heard that right – that while it “shares the European Commission’s commitment to protecting the environment,” it remains “concerned that strict regulation mandating just one type of connector stifles innovation rather than encouraging it, which in turn will harm consumers in Europe and around the world.”

Nevertheless, the EU is prepared to plough on.

[…]

Source: Apple miffed by EU’s ‘strict’ one-size-fits-all charger plan • The Register

Apple’s M1 MacBook screens are stunning – stunningly fragile and defective, that is, lawsuits allege

Aggrieved MacBook owners in two separate lawsuits claim Apple’s latest laptops with its M1 chips have defective screens that break easily and malfunction.

The complaints, both filed on Wednesday in a federal district court in San Jose, California, are each seeking class certification in the hope that the law firms involved will get a judicial blessing to represent the presumed large group of affected customers and, if victorious, to share any settlement.

Each of the filings contends Apple’s 2020-2021 MacBook line – consisting of the M1-based MacBook Air and M1-based 13″ MacBook Pro – has screens that frequently fail. They say Apple knew about the alleged defect or should have known, based on its own extensive internal testing, reports from technicians, and feedback from customers.

“[T]he M1 MacBook is defective, as the screens are extraordinarily fragile, cracking, blacking out, or showing magenta, purple and blue lines and squares, or otherwise ceasing to function altogether,” says a complaint filed on behalf of plaintiff Nestor Almeida [PDF]. “Thousands of users from across the globe have reported this issue directly to Apple and on Apple sponsored forums.”

Photograph from the Almeida complaint of a broken MacBook screen, redacted by the owner

The other complaint [PDF], filed on behalf of plaintiffs Daphne Pareas and Daniel Friend, makes similar allegations.

“The Class Laptops are designed and manufactured with an inherent defect that compromises the display screen,” it says. “During ordinary usage the display screens of the Class Laptops (1) may become obscured with black or gray bars and/or ‘dead spots’ where no visual output is displayed and (2) are vulnerable to cracks that obscure portions of the display. The appearance of black or gray bars on screen may precede, accompany, or follow cracks in the display glass.”

The Almeida complaint says thousands of Apple customers from around the world have reported MacBook screen problems to Apple and in online forums. It claims Apple has often refused to pay for repairs, forcing customers to pay as much as $850 through outside vendors. And where Apple has provided repairs, some customers have seen the problems return.

[…]

Source: Apple’s M1 MacBook screens are stunning – stunningly fragile and defective, that is, lawsuits allege • The Register

Samsung Is the Latest SSD Manufacturer (Crucial, Western Digital) Caught Cheating Its Customers

In the past 11 days, both Crucial and Western Digital have been caught swapping the TLC NAND used for certain products with inferior QLC NAND without updating product SKUs or informing reviewers that this change was happening. Shipping one product to reviewers and a different product to consumers is unacceptable and we recently recommended that readers buy SSDs from Samsung or Intel in lieu of WD or Crucial.

As of today, we have to take Samsung off that list. One difference in this situation is that Samsung isn’t swapping TLC for QLC — it’s swapping the drive controller + TLC for a different, inferior drive controller and different TLC. The net effect is still a steep performance decline in certain tests. We’ve asked Intel to specifically confirm it does not engage in this kind of consumer-hostile behavior and will report back if it does.

The other beats of this story are familiar. Computerbase.de reports on a YouTube channel, 潮玩客, which compared two different versions of the Samsung 970 EVO Plus. Both drives are labeled with the same sticker declaring them to be a 970 EVO Plus, but the part numbers are different. One drive is labeled the MZVLB1T0HBLR (older, good) and one is the MZVL21T0HBLU (newer, inferior).

(Photo: 潮玩客)

Peel the sticker back, and the chips underneath are rather different. The Phoenix drive (top) is older than the Elpis drive on the bottom. Production dates for the drives point to April for the older product and June for the newer. A previous version of this post misstated the dating; ET regrets the error. Thanks to Eldakka for catching it.

(Photo: 潮玩客)

And — just as we’ve seen from Crucial and Western Digital — performance in some benchmarks after the swap is just fine, while other benchmarks crater. Here’s what write performance looks like when measured over much of the drive(s):

(Photo: 潮玩客)

The original 970 EVO Plus starts with solid performance and holds it for the entire 200GB test. The right-hand SSD is even faster than the original until we hit the 120GB mark, at which point performance drops to 50 percent of what it was. Real-world file copies also bear this out, with one drive holding 1.58GB/s and the other at 830MB/s. TLC hasn’t been swapped for QLC, but the 50 percent performance hit in some tests is as bad as what we see when it has been.
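That cliff is easy to reproduce at home: write sequentially well past the drive’s fast cache and log throughput chunk by chunk. Below is a minimal Python sketch of such a sustained-write test; the file path and sizes are placeholders, and a serious benchmark would bypass the OS page cache (e.g., with O_DIRECT) rather than lean on fsync.

```python
import os
import time

def sustained_write_test(path="testfile.bin", chunk_mb=256, total_gb=200):
    """Write total_gb of data in chunk_mb bursts, printing the
    throughput of each burst. Cache-backed drives typically hold a
    high rate until the cache fills, then step down sharply."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    written_mb = 0
    with open(path, "wb") as f:
        while written_mb < total_gb * 1024:
            t0 = time.perf_counter()
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # force the data out to the drive
            dt = time.perf_counter() - t0
            written_mb += chunk_mb
            print(f"{written_mb / 1024:6.1f} GB  {chunk_mb / dt / 1024:5.2f} GB/s")

sustained_write_test()
```

Plotted, the per-chunk log gives a flat plateau followed by a sharp step down once the cache or controller limit is hit, which is the shape described above.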

The only thing worse than discovering a vendor is cheating people is discovering that lots of vendors have apparently decided to cheat people. I don’t know what kind of substances got passed around the last time NAND manufacturers threw themselves a summit, but next time there needs to be more ethics and less marijuana. Or maybe there needs to be more ethics and marijuana, but less toluene. I’m open to suggestions, really.

Source: Samsung Is the Latest SSD Manufacturer Caught Cheating Its Customers – ExtremeTech

Engineers make critical advance in quantum computer design

They discovered a new technique they say will be capable of controlling millions of spin qubits—the basic units of information in a silicon quantum processor.

Until now, quantum computer engineers and scientists have worked with a proof-of-concept model of quantum processors by demonstrating the control of only a handful of qubits.

[…]

“Up until this point, controlling electron spin qubits relied on us delivering microwave magnetic fields by putting a current through a wire right beside the qubit,” Dr. Pla says.

“This poses some real challenges if we want to scale up to the millions of qubits that a quantum computer will need to solve globally significant problems, such as the design of new vaccines.

“First off, the magnetic fields drop off really quickly with distance, so we can only control those qubits closest to the wire. That means we would need to add more and more wires as we brought in more and more qubits, which would take up a lot of real estate on the chip.”

And since the chip must operate at freezing cold temperatures, below -270°C, Dr. Pla says introducing more wires would generate way too much heat in the chip, interfering with the reliability of the qubits.
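The falloff Dr. Pla describes is just Ampère’s law: the field around a long straight wire is B = μ0I/(2πr), so it decays inversely with distance. A quick back-of-the-envelope check in Python, with an illustrative drive current rather than a figure from the research:

```python
import math

MU_0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def field_from_wire(current_a, distance_m):
    """Magnetic field of a long straight wire: B = mu0*I / (2*pi*r)."""
    return MU_0 * current_a / (2 * math.pi * distance_m)

# Illustrative 1 mA drive current; distances span plausible qubit spacings.
for r_nm in (50, 100, 500, 1000, 5000):
    b_mt = field_from_wire(1e-3, r_nm * 1e-9) * 1e3
    print(f"r = {r_nm:5d} nm  ->  B = {b_mt:7.3f} mT")
```

Going from 50 nm to 5 µm away from the wire costs two orders of magnitude in field strength, which is why each wire can only usefully address its nearest qubits.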

[…]

Rather than having thousands of control wires on the same thumbnail-sized silicon chip that also needs to contain millions of qubits, the team looked at the feasibility of generating a magnetic field from above the chip that could manipulate all of the qubits simultaneously.

[…]

Dr. Pla and the team introduced a new component directly above the silicon chip—a crystal prism called a dielectric resonator. When microwaves are directed into the resonator, it focuses the wavelength of the microwaves down to a much smaller size.

“The dielectric resonator shrinks the wavelength down below one millimeter, so we now have a very efficient conversion of microwave power into the magnetic field that controls the spins of all the qubits.

“There are two key innovations here. The first is that we don’t have to put in a lot of power to get a strong driving field for the qubits, which crucially means we don’t generate much heat. The second is that the field is very uniform across the chip, so that millions of qubits all experience the same level of control.”
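To put rough numbers on both points: a dielectric resonator compresses the wavelength by the square root of its relative permittivity, λ = c/(f√εr), and a microwave magnetic field of amplitude B1 drives electron spins at a Rabi rate on the order of γeB1, with γe ≈ 28 GHz/T. The frequency, permittivity, and field amplitude below are illustrative assumptions, not values from the study.

```python
import math

C = 299_792_458.0   # speed of light, m/s
GAMMA_E = 28.0e9    # electron gyromagnetic ratio, Hz per tesla

def wavelength_in_dielectric(freq_hz, eps_r):
    """lambda = c / (f * sqrt(eps_r)) inside a dielectric."""
    return C / (freq_hz * math.sqrt(eps_r))

freq = 12e9      # assumed ~12 GHz spin-control microwaves
eps_r = 1000.0   # assumed low-temperature permittivity of a high-eps_r crystal
print(f"free-space wavelength:    {C / freq * 1e3:5.1f} mm")
print(f"in-dielectric wavelength: {wavelength_in_dielectric(freq, eps_r) * 1e3:5.2f} mm")

b1 = 25e-6  # assumed 25 microtesla drive field
print(f"Rabi rate ~ {GAMMA_E * b1 / 1e3:.0f} kHz")
```

With these assumed numbers the in-dielectric wavelength lands just under a millimetre, consistent with the scale quoted above, and even a modest field amplitude flips spins hundreds of thousands of times per second.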

[…]

Source: Engineers make critical advance in quantum computer design

Chinese scientists develop world’s strongest glass that’s harder than diamond

Scientists in China have developed the hardest and strongest glassy material known so far that can scratch diamond crystals with ease.

The researchers, including those from Yanshan University in China, noted that the new material – tentatively named AM-III – has “outstanding” mechanical and electronic properties, and could find applications in solar cells due to its “ultra-high” strength and wear resistance.

Analysis of the material, published in the journal National Science Review, revealed that its hardness reached 113 gigapascals (GPa), while natural diamond usually scores 50 to 70 GPa on the same test.

[…]

Using fullerenes, which are materials made of hollow football-like arrangements of carbon atoms, the researchers produced different types of glassy materials with varying molecular organisation, among which AM-III had the highest order of atoms and molecules.

To achieve this order of molecules, the scientists crushed and blended the fullerenes together, gradually applying intense heat and pressure of about 25 GPa and 1,200 degrees Celsius in an experimental chamber for about 12 hours, spending an equal amount of time cooling the material.

[…]

Source: Chinese scientists develop world’s strongest glass that’s as hard as diamond | The Independent

Samsung Bricking Original SmartThings Hubs

Samsung is causing much angst among its SmartThings customers by shutting down support for its original SmartThings home automation hub as of the end of June. These are network-connected home automation routers providing Zigbee and Z-Wave connectivity to your sensors and actuators. It’s not entirely unreasonable for manufacturers to replace aging hardware with new models. But in this case the original hubs, otherwise fully functional and up to the task, have intentionally been bricked.

Users were offered a chance to upgrade to a newer version of the hub at a discount. But the hardware isn’t being made by Samsung anymore, after the company redirected its SmartThings group to focus entirely on software. With this new dedication to software, you’d be forgiven for thinking the team implemented a seamless transition plan for its loyal user base — customers who supported and built up a thriving community since the young Colorado-based SmartThings company bootstrapped itself through a successful Kickstarter campaign in 2012. Instead, Samsung seems to have left many of those users in the lurch.

There is no upgrade path for switching to a new hub, meaning that the user has to manually reconnect each sensor in the house, which often involves a cryptic sequence of button presses and flashing lights (the modern equivalent of setting the time on your VCR). Soon after you re-pair all your devices, you will discover that the level of software customization and tools that you’ve relied upon for home automation has disappeared, or is about to. They’ve replaced the original SmartThings app with a new in-house app, which by all accounts significantly dumbs down the features and isn’t being well-received by the community. Another very popular tool, the Groovy IDE, which allowed users to add support for third-party devices and complex automation tasks, is about to be discontinued as well.

Samsung’s announcement from last year laid out the goals of the transition, divided into three phases. After the dust settles, it may well be that new tools will be rolled out which restore the functionality and convenience of the discontinued apps. But it seems that their priority at the moment is to focus on “casual” home automation users, those with just a handful of devices. The “power” users, with dozens and dozens of devices, are left wondering whether they’ve been abandoned. A casual scan through various online forums suggests that many of these loyal users are not waiting to be abandoned. Instead, they are abandoning SmartThings and switching to self-hosted solutions such as Home Assistant.

If this story sounds familiar, it is. We’ve covered several similar IoT service closures in recent years.

Considering the typical home is a decades-long investment, we’d hope that the industry will eventually focus on longer-term approaches to home automation. For example, interoperability of devices using existing or new standards might be a good starting point. If you are using an automation system in your home, do you use a bundled solution like SmartThings, or have you gone the self-hosting route?

Source: Samsung Shuttering Original SmartThings Hubs | Hackaday

Bricking is pretty damn harsh and incredibly wasteful. Also, you bought the hardware; it’s yours!