Kinetic architecture is a design concept in which parts of a building’s structure are allowed to move. CyberPowerPC took this idea and created its KINETIC chassis, with 18 individually controlled articulating vents that open and close automatically based on the computer’s current internal ambient temperature.
“We are entering 2022 with some of our most sophisticated and elegant designs ever. For discriminating gamers our PC Master Builders are ready to hand-build and test new gaming PCs that are ultra-clean, streamlined, and deliver maximum performance for those who want something truly unique.”
Eric Cheung, CyberPowerPC CEO
The vents aren’t a simple case of open or closed, either: they open to varying degrees, adjusting with every degree of change in internal temperature. Users can customize the temperature ranges, and a quick-access button fully opens or closes the vents instantly (a sketch of this control logic follows the feature list below). The KINETIC chassis supports full-size ATX motherboards, up to seven 120mm or five 140mm fans, and most extended-length graphics cards.
Key features of the CyberPowerPC KINETIC chassis include:
18 individually actuating vents that adjust in real time to ambient case temperatures.
Maximizes airflow and cooling when case temps are high.
Reduces noise and dust when case temps are low.
Temperature sensor ranges can be adjusted to fit your needs.
Available in both black and white mid-tower options.
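Here is a minimal sketch of what such control logic could look like, assuming a simple linear temperature-to-angle mapping; the thresholds, angles, and function names are hypothetical illustrations, not CyberPowerPC’s actual firmware:

```python
# Hypothetical sketch of KINETIC-style vent control: linearly map the case
# temperature onto a vent opening angle between user-configurable limits.

def vent_angle(temp_c: float, t_closed: float = 30.0, t_open: float = 50.0,
               max_angle: float = 90.0) -> float:
    """Return a vent opening angle in degrees for a given case temperature.

    Below t_closed the vents stay shut; above t_open they are fully open;
    in between, the angle scales linearly with each degree of temperature.
    """
    if temp_c <= t_closed:
        return 0.0
    if temp_c >= t_open:
        return max_angle
    return max_angle * (temp_c - t_closed) / (t_open - t_closed)

def override(open_fully: bool, max_angle: float = 90.0) -> float:
    """The 'quick button': force fully open or closed regardless of temperature."""
    return max_angle if open_fully else 0.0

if __name__ == "__main__":
    for t in (25, 35, 40, 45, 55):
        print(f"{t} C -> {vent_angle(t):.0f} degrees")
```

Adjusting `t_closed` and `t_open` corresponds to the user-customizable temperature ranges described above.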
The CyberPowerPC KINETIC Series PC case will ship in Q3 2022 from CyberPowerPC.com and CyberPowerPC’s network of authorized retailers and distributors. The chassis is backed by a one-year warranty and lifetime technical support. MSRP is US$249.
We now have data on over 21,000 broken items and what was done to fix them. This information comes from volunteers at our own events and others who use our community repair platform, restarters.net.
Thanks to our partners in the Open Repair Alliance who also collect this kind of data, we were able to include extra data from other networks around the world.
Together, this brought the total to nearly 50,000 broken items.
Want to see this data for yourself? Download the full dataset here
(Note: Links to the datasets that contain fault types are further down this page)
That’s a lot of data. So to analyse it, we focused on three types of products that the European Commission would be investigating:
Printers
Tablets
The batteries that power many of our gadgets.
[…]
Thanks to this collective effort, we were able to identify the most common reasons printers, tablets and batteries become unusable.
These findings are based on analysis of problems in 647 tablets brought to community repair events; a further 131 tablets were excluded because poor data quality made it impossible to confirm the main fault.
In addition, many of the items we looked at were fairly old, demonstrating that people really want to keep using their devices for longer.
But we also found that there are lots of barriers to repair that make this tricky. Some of the biggest are the lack of spare parts and repair documentation as well as designs that make opening the product difficult without causing extra damage.
You can see our full results and download the data for yourself here:
We want rules that make products easier to fix. And we’re already using data to push for a real Right to Repair. Just recently, we used previous findings to undermine an industry lobbyist’s anti-repair arguments in an EU policy meeting about upcoming regulations for smartphone and tablet repairability.
As a follow up, we also contributed our findings on common fault types in tablets, making the case for the need for better access to spare parts and repair information for this product category as well.
Next, we hope to increase the pressure on European policymakers to regulate printer repairability and to address battery-related issues in consumer products. For printers, the European Commission is considering rejecting a “voluntary agreement” proposed by industry, which ignores repairability for consumer printers.
And as for batteries, European institutions are working towards a Batteries Regulation, which must prioritise user-replaceability as well as the availability of spare parts.
IBM and Samsung claim they’ve made a breakthrough in semiconductor design. On day one of the IEDM conference in San Francisco, the two companies unveiled a new design for stacking transistors vertically on a chip. With current processors and SoCs, transistors lie flat on the surface of the silicon, and electric current flows from side to side. By contrast, Vertical Transport Field Effect Transistors (VTFETs) are built perpendicular to the surface of the chip, and current flows vertically.
[…]
the design leads to less wasted energy thanks to greater current flow. They estimate VTFET will lead to processors that are either twice as fast or use 85 percent less power than chips designed with FinFET transistors.
What do you get when you combine ancient designs with modern engineering? An exciting new way to convert time and money into heat and noise! I’m not sure whether to call this a catapult or a trebuchet, but it’s definitely the superior siege engine.
Have you ever sat down and thought “I wonder if a trebuchet could launch a projectile at supersonic speeds?” Neither have we. That’s what separates [David Eade] from the rest of us. He didn’t just ask the question, he answered it! And he documented the entire build in a YouTube video which you can see below the break.
[…]Even the best hardware eventually becomes obsolete when it can no longer run modern software: with a 2.0 GHz Core Duo and 3 GB of RAM you can still browse the web and do word processing today, but you can forget about 4K video or a 64-bit OS. Luckily, there’s hope for those who are just not ready to part with their trusty Thinkpads: [Xue Yao] has designed a replacement motherboard that fits the T60/T61 range, bringing them firmly into the present day. The T700 motherboard is currently in its prototype phase, with series production expected to start in early 2022, funded through a crowdfunding campaign.
Designing a motherboard for a modern CPU is no mean feat, and making it fit an existing laptop, with all the odd shapes and less-than-standard connections, is even more impressive. The T700 has an Intel Core i7 CPU with four cores running at 2.8 GHz, while two RAM slots allow for up to 64 GB of DDR4-3200 memory. There are modern USB-A and USB-C ports, as well as a 6 Gbps SATA interface and two M.2 slots for your SSDs.
As for the display, the T700 motherboard will happily connect to the original screens built into the T60/T61, or to any of a range of aftermarket LED based replacements. A Thunderbolt connector is available, but only operates in USB-C mode due to firmware issues; according to the project page, full support for Thunderbolt 4 is expected once the open-source coreboot firmware has been ported to the T700 platform.
We love projects like this that extend the useful life of classic computers to keep them running way past their expected service life. But impressive though this is, it’s not the first time someone has made a replacement motherboard for the Thinkpad line; we covered a project from the nb51 forum back in 2018, which formed the basis for today’s project. We’ve seen lots of other useful Thinkpad hacks over the years, from replacing the display to revitalizing the batteries. Thanks to [René] for the tip.
This report provides an overview of the current state of quantum technology and its potential commercial and military applications. The author discusses each of the three major categories of quantum technology: quantum sensing, quantum communication, and quantum computing. He also considers the likely commercial outlook over the next few years, the major international players, and the potential national security implications of these emerging technologies. This report is based on a survey of the available academic literature, news reporting, and government-issued position papers.
Most of these technologies are still in the laboratory. Applications of quantum sensing could become commercially or militarily ready within the next few years. Although limited commercial deployment of quantum communication technology already exists, the most-useful military applications still lie many years away. Similarly, there may be niche applications of quantum computers in the future, but all known applications are likely at least ten years away. China currently leads the world in the development of quantum communication, while the United States leads in the development of quantum computing.
Key Findings
Quantum technology is grouped into three broad categories: quantum sensing, quantum communication, and quantum computing
Quantum sensing refers to the ability to use quantum mechanics to build extremely precise sensors. This is the application of quantum technology considered to have the nearest-term operational potential.
The primary near-term application of quantum communication technology is security against eavesdroppers, primarily through a method known as quantum key distribution (QKD); a toy illustration of QKD follows these key findings. Longer-term applications include networking together quantum computers and sensors.
Quantum computing refers to computers that could, in principle, perform certain computations vastly more quickly than is fundamentally possible with a standard computer. Certain problems that are completely infeasible to solve on a standard computer could become feasible on a quantum computer.
Every subfield of quantum technology potentially has major implications for national security
Some of the primary applications for quantum sensing include position, navigation, and timing and possibly intelligence, surveillance, and reconnaissance.
Quantum communication technology could use QKD to protect sensitive encrypted communications against hostile interception, although some experts consider other security solutions to be more promising.
Quantum computing could eventually have the most severe impact on national security. A large-scale quantum computer capable of running Shor’s algorithm against current encryption would have a devastating impact on virtually all internet security.
There is no clear overall world leader in quantum technology
The United States, China, the European Union, the United Kingdom, and Canada all have specific national initiatives to encourage quantum-related research.
The United States and China dominate in overall spending and the most-important technology demonstrations, but Canada, the United Kingdom, and the European Union also lead in certain subfields.
China is the world leader in quantum communication, and the United States is the world leader in quantum computing.
The highest-impact quantum technologies are still many years away
Applications of quantum sensing could become commercially or militarily ready within the next few years.
Limited commercial deployment of quantum communication technology already exists, but the most-useful military and commercial applications still lie many years away.
There may be niche applications of quantum computers over the next few years, but all currently known applications are likely at least ten years away.
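To make the QKD idea above concrete, here is a toy BB84 sketch in Python. It is purely illustrative and not from the report: the two parties keep only the bit positions where their randomly chosen bases match, and an eavesdropper measuring in the wrong basis would corrupt some of those bits, which is how interception gets detected.

```python
# Toy BB84 quantum key distribution sketch (illustrative only).
# Alice encodes random bits in random bases; Bob measures in his own random
# bases. Comparing bases publicly (never the bits) yields a shared secret key.
import random

def bb84_sift(n_bits: int = 32, seed: int = 1):
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]

    # When bases match, Bob reads Alice's bit exactly; when they differ,
    # his outcome is random, so those positions are discarded ("sifting").
    sifted = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
              if a == b]
    return sifted

print("shared key bits:", bb84_sift())
```

In a real deployment the parties would additionally compare a sample of the sifted bits: an elevated error rate reveals an eavesdropper.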
When Facebook went down yesterday for nearly six hours, so did Oculus’ services. Since Facebook owns VR headset maker Oculus, and controversially requires Oculus Quest users to log in with a Facebook account, many Quest owners reported not being able to load their Oculus libraries. “[A]nd those who just took a Quest 2 out of the box have reported that they’re unable to complete the initial setup,” adds PCGamer.

As VRFocus points out, “the issue has raised another important question relating to Oculus’ services being so closely linked with a Facebook account, your Oculus Quest/Quest 2 is essentially bricked until services resume.” From the report: This vividly highlights the problem with having to connect to Facebook’s services to gain access to apps — the WiFi connection was fine. Even the apps downloaded and taking up actual storage space didn’t show up. It’s why some VR fans began boycotting the company when it made it mandatory that all Oculus Quest 2s be affiliated with a Facebook account.

If you want to unlink your Facebook account from Oculus Quest and don’t want to pay extra for that ability, you’re in luck thanks to a sideloadable tool called “Oculess.” From an UploadVR article published earlier today: You still need a Facebook account to set up the device in the first place, and you need to give Facebook a phone number or card details to sideload, but after that you could use Oculess to forgo Facebook entirely — just remember to never factory reset. The catch is you’ll lose access to Oculus Store apps, because the entitlement check used upon launching them will no longer function. System apps like Oculus TV and Browser will also no longer launch, and casting won’t work. You can still sideload hundreds of apps from SideQuest though, and if you want to keep browsing the web in VR you can sideload Firefox Reality. You can still use Oculus Link to play PC VR content, but only if you stay signed into Facebook on the Oculus PC app. Virtual Desktop won’t work because it’s a store app, but you can sideload free alternatives such as ALVR.
To use Oculess, just download it from GitHub and sideload it using SideQuest or Oculus Developer Hub, then launch it from inside VR. If your Quest isn’t already in developer mode or you don’t know how to sideload you can follow our guide here.
Researchers at Northwestern University have devised a new method for recording information to DNA that takes minutes rather than hours or days.
The researchers utilized a novel enzymatic system to synthesize DNA that records rapidly changing environmental signals straight into its sequences, and this method could revolutionize how scientists examine and record neurons inside the brain.
A faster and higher resolution recording
To record intracellular molecular and digital data to DNA, scientists currently rely on multipart processes that combine new information with existing DNA sequences. This means that, for an accurate recording, they must stimulate and repress the expression of specific proteins, which can take over 10 hours to complete.
The new study’s researchers hypothesized they could make this process faster by utilizing a new method they call “Time-sensitive Untemplated Recording using TdT for Local Environmental Signals,” or TURTLES. This way, they would synthesize completely new DNA rather than copying a template of it. The method enabled the data to be recorded into the genetic code in a matter of minutes.
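As a rough conceptual illustration, not the authors’ actual chemistry: think of an enzyme whose choice of nucleotide is biased by an environmental signal, so the signal’s history can be read back from the composition of the freshly synthesized strand. A toy model in Python, with all rates and letters invented:

```python
# Conceptual toy model of untemplated DNA recording: when the environmental
# signal is "on", incorporation is biased toward one nucleotide; reading the
# windowed base composition recovers the signal over time. Illustrative only.
import random

def record(signal, bases_per_step=50, bias=0.9, seed=0):
    rng = random.Random(seed)
    strand = []
    for s in signal:                       # s is 0 or 1 per time step
        p_g = bias if s else 1 - bias      # signal biases G vs A incorporation
        strand += ["G" if rng.random() < p_g else "A"
                   for _ in range(bases_per_step)]
    return "".join(strand)

def decode(strand, bases_per_step=50):
    # Recover the signal by thresholding the G fraction in each window.
    return [int(strand[i:i + bases_per_step].count("G") > bases_per_step / 2)
            for i in range(0, len(strand), bases_per_step)]

signal = [0, 0, 1, 1, 0, 1]
strand = record(signal)
print(decode(strand) == signal)  # True: the signal history was recorded
```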
Smartphones, tablets, and cameras sold within the European Union could be forced to adopt a single standard charging port by the middle of the decade if the latest plans from the European Commission get the go-ahead.
The proposals for a revised Radio Equipment Directive would mean that charging port and fast-charging technology would be “harmonised” across the EU with USB-C becoming the standard for all tech. Quite where this leaves Apple is open to some debate.
Under the EU’s latest effort, the proposal will be legally binding. A bloc-wide common charging standard was put to MEPs in January 2020 and the measure passed by 582 votes to 40, with 37 abstentions.
Today’s announcement also means that chargers would no longer be sold with gadgets and gizmos. The EU calculated seven years ago that 51,000 metric tons of electronics waste across the nation states was attributed annually to old chargers, although that number seems to have fallen dramatically since.
[…]
The direction of travel, however, has flagged concerns for Apple – not for the first time – which appears displeased at being steamrolled into making changes. El Reg understands the tech giant is concerned about the impact this would have on Apple’s bottom line, er, the industry, and that it would create waste (in the short term at least).
Indeed, there are also concerns that if the rules are introduced too quickly it could mean that perfectly good tech with plenty of shelf life gets dumped prematurely.
In a statement, a spokesperson for Apple told The Reg – you heard that right – that while it “shares the European Commission’s commitment to protecting the environment,” it remains “concerned that strict regulation mandating just one type of connector stifles innovation rather than encouraging it, which in turn will harm consumers in Europe and around the world.”
Aggrieved MacBook owners in two separate lawsuits claim Apple’s latest laptops with its M1 chips have defective screens that break easily and malfunction.
The complaints, both filed on Wednesday in a federal district court in San Jose, California, are each seeking class certification in the hope that the law firms involved will get a judicial blessing to represent the presumed large group of affected customers and, if victorious, to share any settlement.
Each of the filings contends Apple’s 2020-2021 MacBook line – consisting of the M1-based MacBook Air and M1-based 13″ MacBook Pro – has screens that frequently fail. They say Apple knew about the alleged defect, or should have known, based on its own extensive internal testing, reports from technicians, and feedback from customers.
“[T]he M1 MacBook is defective, as the screens are extraordinarily fragile, cracking, blacking out, or showing magenta, purple and blue lines and squares, or otherwise ceasing to function altogether,” says a complaint filed on behalf of plaintiff Nestor Almeida [PDF]. “Thousands of users from across the globe have reported this issue directly to Apple and on Apple sponsored forums.”
Photograph from one of the lawsuits of a broken screen, redacted by the owner.
The other complaint [PDF], filed on behalf of plaintiffs Daphne Pareas and Daniel Friend, makes similar allegations.
“The Class Laptops are designed and manufactured with an inherent defect that compromises the display screen,” it says. “During ordinary usage the display screens of the Class Laptops (1) may become obscured with black or gray bars and/or ‘dead spots’ where no visual output is displayed and (2) are vulnerable to cracks that obscure portions of the display. The appearance of black or gray bars on screen may precede, accompany, or follow cracks in the display glass.”
The Almeida complaint says thousands of Apple customers from around the world have reported MacBook screen problems to Apple and in online forums. It claims Apple has often refused to pay for repairs, forcing customers to pay as much as $850 through outside vendors. And where Apple has provided repairs, some customers have seen the problems return.
In the past 11 days, both Crucial and Western Digital have been caught swapping the TLC NAND used for certain products with inferior QLC NAND without updating product SKUs or informing reviewers that this change was happening. Shipping one product to reviewers and a different product to consumers is unacceptable and we recently recommended that readers buy SSDs from Samsung or Intel in lieu of WD or Crucial.
As of today, we have to take Samsung off that list. One difference in this situation is that Samsung isn’t swapping TLC for QLC — it’s swapping the drive controller + TLC for a different, inferior drive controller and different TLC. The net effect is still a steep performance decline in certain tests. We’ve asked Intel to specifically confirm it does not engage in this kind of consumer-hostile behavior and will report back if it does.
The other beats of this story are familiar. Computerbase.de reports on a YouTube channel, 潮玩客, which compared two different versions of the Samsung 970 EVO Plus. Both drives are labeled with the same sticker declaring them to be a 970 EVO Plus, but the part numbers are different. One drive is labeled MZVLB1T0HBLR (older, good) and one MZVL21T0HBLU (newer, inferior).
Peel the sticker back, and the chips underneath are rather different. The Phoenix drive (top) is older than the Elpis drive on the bottom. Production dates for drives point to April for the older product and June for the newer. A previous version of this post misstated the dating, ET regrets the error. Thanks to Eldakka for catching it.
And — just as we’ve seen from Crucial and Western Digital — performance in some benchmarks after the swap is just fine, while other benchmarks crater. Here’s what write performance looks like when measured over much of the drive(s):
The original 970 EVO Plus starts with solid performance and holds it for the entire 200GB test. The right-hand SSD is even faster than the OG drive until we hit the 120GB mark, at which point performance drops to 50 percent of what it was. Real-world file copies also bear this out, with one drive holding 1.58GB/s and the other at 830MB/s. TLC hasn’t been swapped for QLC, but the 50 percent performance hit in some tests is as bad as what we see when it has been.
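This kind of falloff is straightforward to reproduce: write sequentially well past the drive’s cache region and log throughput per chunk. Here is a minimal sketch, assuming a Python environment and a scratch file on the drive under test; the path and sizes are placeholders, and note it really does write around 200 GiB:

```python
# Minimal sustained-write probe: write large chunks sequentially and log
# throughput, revealing cache exhaustion like the ~120GB falloff described above.
import os
import time

CHUNK = 1024 * 1024 * 1024          # 1 GiB per reported chunk
TOTAL = 200                         # chunks to write (~200 GiB, like the test above)
PATH = "testfile.bin"               # placeholder path on the drive under test

buf = os.urandom(64 * 1024 * 1024)  # 64 MiB of incompressible data, reused

with open(PATH, "wb", buffering=0) as f:
    for i in range(TOTAL):
        start = time.perf_counter()
        written = 0
        while written < CHUNK:
            f.write(buf)
            written += len(buf)
        os.fsync(f.fileno())        # force data to media, not just the RAM cache
        secs = time.perf_counter() - start
        print(f"chunk {i + 1}: {CHUNK / secs / 1e9:.2f} GB/s")
```

A healthy drive holds a roughly flat line; a cache-limited one shows a cliff partway through, as in the 潮玩客 results.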
The only thing worse than discovering a vendor is cheating people is discovering that lots of vendors have apparently decided to cheat people. I don’t know what kind of substances got passed around the last time NAND manufacturers threw themselves a summit, but next time there needs to be more ethics and less marijuana. Or maybe there needs to be more ethics and marijuana, but less toluene. I’m open to suggestions, really.
Engineers at UNSW Sydney have discovered a new technique they say will be capable of controlling millions of spin qubits—the basic units of information in a silicon quantum processor.
Until now, quantum computer engineers and scientists have worked with a proof-of-concept model of quantum processors by demonstrating the control of only a handful of qubits.
[…]
“Up until this point, controlling electron spin qubits relied on us delivering microwave magnetic fields by putting a current through a wire right beside the qubit,” Dr. Pla says.
“This poses some real challenges if we want to scale up to the millions of qubits that a quantum computer will need to solve globally significant problems, such as the design of new vaccines.
“First off, the magnetic fields drop off really quickly with distance, so we can only control those qubits closest to the wire. That means we would need to add more and more wires as we brought in more and more qubits, which would take up a lot of real estate on the chip.”
And since the chip must operate at freezing cold temperatures, below -270°C, Dr. Pla says introducing more wires would generate way too much heat in the chip, interfering with the reliability of the qubits.
[…]
Rather than having thousands of control wires on the same thumbnail-sized silicon chip that also needs to contain millions of qubits, the team looked at the feasibility of generating a magnetic field from above the chip that could manipulate all of the qubits simultaneously.
[…]
Dr. Pla and the team introduced a new component directly above the silicon chip—a crystal prism called a dielectric resonator. When microwaves are directed into the resonator, it focuses the wavelength of the microwaves down to a much smaller size.
“The dielectric resonator shrinks the wavelength down below one millimeter, so we now have a very efficient conversion of microwave power into the magnetic field that controls the spins of all the qubits.
“There are two key innovations here. The first is that we don’t have to put in a lot of power to get a strong driving field for the qubits, which crucially means we don’t generate much heat. The second is that the field is very uniform across the chip, so that millions of qubits all experience the same level of control.”
Scientists in China have developed the hardest and strongest glassy material known so far that can scratch diamond crystals with ease.
The researchers, including those from Yanshan University in China, noted that the new material – tentatively named AM-III – has “outstanding” mechanical and electronic properties, and could find applications in solar cells due to its “ultra-high” strength and wear resistance.
Analysis of the material, published in the journal National Science Review, revealed that its hardness reached 113 gigapascals (GPa), while natural diamond usually scores 50 to 70 GPa on the same test.
[…]
Using fullerenes, which are materials made of hollow, football-like arrangements of carbon atoms, the researchers produced different types of glassy materials with varying molecular organisation, of which AM-III had the highest degree of atomic and molecular order.
To achieve this order of molecules, the scientists crushed and blended the fullerenes together, gradually applying intense heat and pressure of about 25 GPa and 1,200 degrees Celsius in an experimental chamber for about 12 hours, spending an equal amount of time cooling the material.
Samsung is causing much angst among its SmartThings customers by shutting down support for its original SmartThings home automation hub as of the end of June. These are network-connected home automation routers providing Zigbee and Z-Wave connectivity to your sensors and actuators. It’s not entirely unreasonable for manufacturers to replace aging hardware with new models. But in this case the original hubs, otherwise fully functional and up to the task, have intentionally been bricked.
Users were offered a chance to upgrade to a newer version of the hub at a discount. But the hardware isn’t being made by Samsung anymore, after the company redirected its SmartThings group to focus entirely on software. With this new dedication to software, you’d be forgiven for thinking the team implemented a seamless transition plan for its loyal user base — customers who supported and built up a thriving community after the young Colorado-based SmartThings company bootstrapped itself with a successful Kickstarter campaign in 2012. Instead, Samsung seems to have left many of those users in the lurch.
There is no upgrade path when switching to a new hub, meaning that the user has to manually reconnect each sensor in the house, which often involves a cryptic sequence of button presses and flashing lights (the modern equivalent of setting the time on your VCR). Soon after you re-pair all your devices, you will discover that the software customization and tools you’ve relied upon for home automation have disappeared, or are about to. The original SmartThings app has been replaced with a new in-house app, which by all accounts significantly dumbs down the features and isn’t being well received by the community. Another very popular tool, the Groovy IDE, which allowed users to add support for third-party devices and complex automation tasks, is about to be discontinued as well.
Samsung’s announcement from last year laid out the goals of the transition, divided into three phases. After the dust settles, it may well be that new tools will be rolled out to restore the functionality and convenience of the discontinued apps. But it seems that the priority at the moment is “casual” home automation users, those with just a handful of devices. The “power” users, with dozens and dozens of devices, are left wondering whether they’ve been abandoned. A casual scan through various online forums suggests that many of these loyal users are not waiting to be abandoned. Instead, they are abandoning SmartThings and switching to self-hosted solutions such as Home Assistant.
If this story sounds familiar, it is. We’ve covered several similar IoT service closures in recent years, including:
Considering the typical home is a decades-long investment, we’d hope that the industry will eventually focus on longer-term approaches to home automation. For example, interoperability of devices using existing or new standards might be a good starting point. If you are using an automation system in your home, do you use a bundled solution like SmartThings, or have you gone the self-hosting route?
A few days ago, the US Federal Trade Commission (FTC) came out with a 5-0 unanimous vote on its position on right to repair. (PDF) It’s great news, in that they basically agree with us all:
Restricting consumers and businesses from choosing how they repair products can substantially increase the total cost of repairs, generate harmful electronic waste, and unnecessarily increase wait times for repairs. In contrast, providing more choice in repairs can lead to lower costs, reduce e-waste by extending the useful lifespan of products, enable more timely repairs, and provide economic opportunities for entrepreneurs and local businesses.
The long version of the “Nixing the Fix” report goes on to list ways that the FTC found firms were impeding repair: ranging from poor initial design, through restrictive firmware and digital rights management (DRM), all the way down to “disparagement of non-OEM parts and independent repair services”.
While the FTC isn’t making any new laws here, they’re conveying a willingness to use the consumer-protection laws that are already on the books: the Magnuson-Moss Warranty Act and Section 5 of the FTC Act, which prohibits unfair competitive practices.
Only time will tell if this dog really has teeth, but it’s a good sign that it’s barking. And given that the European Union is heading in a similar direction, we’d be betting that repairability increases in the future.
What do these terms mean, and how do they translate into control schemes? In this world you generally get what you pay for: if it’s cheap, it’s probably plasticky and nasty; if it’s expensive, it’s probably high quality. Saitek and Logitech have equipment running from the low end to midrange, Thrustmaster from midrange to high end.
The VKB Gladiator NXT, which comes in left- and right-handed versions, is currently the most popular midrange joystick you can find, at around $120-$150.
If you have the money, though, you go for the Virpil (VPC) Constellation Alpha (also available in left- and right-handed versions) and MongoosT-50CM2 grips and bases.
WingWin.cn has a very good F-16 throttle, stick, and instrument panel with desk mounts.
HOTAS
The world of flight sim controls used to be fairly straightforward: ideally you had a stick on the right, a throttle unit on the left, and rudder pedals in the middle. Some stick makers tried to replace the rudder with a twistable stick grip, and maybe a little throttle lever on the stick, so you could get full control more cheaply – the four degrees of freedom (roll, yaw, pitch, and thrust), or 4 DOF, on a single stick. You had fewer buttons, but you used the keyboard and mouse more.
HOSAS / Dual Stick
Now, with the resurgence of space sims (Elite Dangerous, Star Citizen, No Man’s Sky, Star Wars Squadrons, and the Tie Fighter Total Conversion, to name a few), the traditional HOTAS (Hands On Throttle And Stick) is losing ground to the HOSAS (Hands On Stick And Stick). A HOSAS offers six degrees of freedom (6 DOF): roll, yaw, pitch, and thrust, plus horizontal and vertical translation (strafing), which makes sense for a spacecraft that can not only go backwards but can also strafe directly up and down or left and right.
This gives rise to some interesting control schemes (a code sketch of the first mapping follows these lists):
Left stick:
x-axis: translate / strafe left + right
y-axis: throttle
z/twist-axis: translate / strafe up + down

Right stick:
x-axis: roll
y-axis: pitch
z/twist-axis: yaw

A variation that seems to be popular in Star Wars Squadrons:

Right stick:
x-axis: yaw
y-axis: pitch
z/twist-axis: roll

Another variation, with thrust on the left twist axis:

Left stick:
x-axis: translate / strafe left + right
y-axis: translate / strafe up + down
z/twist-axis: thrust

often combined with:

left rudder pedal: throttle backwards / reverse
right rudder pedal: throttle forwards
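As a sketch of how the first mapping above might be wired up, here it is expressed against pygame’s joystick API. The axis indices are placeholders, since numbering varies between devices, and this is an illustration rather than any particular game’s binding file:

```python
# Sketch of the first HOSAS mapping above using pygame's joystick API.
# Axis numbering differs between devices, so the indices are placeholders;
# check your stick's layout before relying on them.
import time
import pygame

pygame.init()
pygame.joystick.init()
left = pygame.joystick.Joystick(0)    # left stick: translation + throttle
right = pygame.joystick.Joystick(1)   # right stick: rotation

while True:
    pygame.event.pump()               # refresh joystick state
    state = {
        "strafe_lr": left.get_axis(0),   # left x-axis: strafe left/right
        "throttle": -left.get_axis(1),   # left y-axis: thrust (push = forward)
        "strafe_ud": left.get_axis(2),   # left z/twist: strafe up/down
        "roll":      right.get_axis(0),  # right x-axis: roll
        "pitch":     right.get_axis(1),  # right y-axis: pitch
        "yaw":       right.get_axis(2),  # right z/twist: yaw
    }
    print(state)
    time.sleep(0.1)
```

Swapping which name points at which axis reproduces the Squadrons-style variation or the thrust-on-twist scheme.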
Different combinations work better or worse depending on the person and how tiring it is for them personally. As Reddit user Enfiguralimificuleur points out: “It worked best for me with Z/twist being the throttle. I found it very efficient to adjust your speed properly. Very easy to stay at that speed as well. However due to wrist issue and tendinitis, some positions are VERY awkward. Try pulling+right+twisting. Ouch. And even without the pain, this is not comfortable.”
Throttling and the Omnithrottle
The throttle can be set up in different ways: a traditional HOTAS throttle stays where you push it, whereas sticks generally have a recentering mechanism. That makes it easy to find reverse, but it gets annoying, because you need to keep pushing the stick forwards to maintain thrust. There are a few solutions to this.
First, the VKB Gunfighter III base has a dry clutch that removes the centering spring-back on the pitch axis, meaning you can assign that axis to thrust and basically have a stick that stays in place, mimicking a throttle, while still keeping the rotation and roll axes.
Second, you can use a traditional throttle as well (so then I guess it becomes a HOTSAS).
Third, you can map a hat to 0, 50, 75, and 100% speed and set speeds that way, as a sort of cruise control.
Fourth, you can use the rudder (left foot for reverse, right foot for forward) or the z-axis (twist) for thrust control, though this does not eliminate the recentering problem.
The omnithrottle is when you angle the left-hand stick around 90 degrees downwards so that it looks like a throttle. You retain all three axes and the extra buttons and hats, giving you more freedom.
Windows 11 was officially unveiled this week, and many eager users are checking to see if their PCs can run the upcoming OS with Microsoft’s PC Health Check app. However, some are surprised to learn that their PCs aren’t “Windows 11 ready,” despite having new, high-end hardware.
What’s a TPM?
The main source of confusion is the TPM (Trusted Platform Module) chip, which was an uncommon hardware requirement until now. A TPM is a security component that monitors your PC for issues and can protect against potential malware and ransomware attacks. It can also securely store encryption keys, passwords, and other sensitive information locally.
TPMs have been a “soft” requirement for Windows 10 for years, but Microsoft is making them a “hard” requirement for Windows 11 to raise the baseline data security of Windows 11 PCs. Users need a TPM version 2.0 or higher to run Windows 11, along with a DirectX 12-compatible GPU; a supported Intel, AMD, or Qualcomm CPU; 4 GB of RAM; and at least 64 GB of storage.
Not everyone needs to upgrade
Microsoft wants Windows 11 to be more resilient against malware, ransomware, and other cybersecurity threats than previous versions of Windows. The company is relying on technologies like TPM 2.0 and UEFI Secure Boot to reach that goal, but a TPM is probably not a component that users consider when buying or building a new PC. This would explain why some PCs are “not Windows 11 ready” even if the rest of the hardware meets the Windows 11 requirements.
However, it’s possible many users already have a TPM without realizing it. Many (but not all) CPUs released in the past few years have a built-in TPM that needs to be enabled in your computer’s BIOS/UEFI settings. These are often turned off by default, and if the TPM isn’t active, it may not show up when PC Health Check scans your hardware. Accessing and enabling your TPM—and even the name of the setting you need to activate—differs greatly between manufacturers. Consult your CPU or motherboard manufacturer for the proper steps.
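For readers who prefer a script to the Health Check app, here is a small, illustrative Python wrapper around tpmtool, the TPM utility that ships with Windows 10 and later; the string matching is naive and meant only as a starting point:

```python
# Check for an enabled TPM from a script on Windows using the built-in
# tpmtool utility (present on Windows 10 and later). Illustrative sketch;
# the tool's output format may vary between Windows builds.
import subprocess

def tpm_info() -> str:
    result = subprocess.run(
        ["tpmtool", "getdeviceinformation"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    try:
        info = tpm_info()
    except (FileNotFoundError, subprocess.CalledProcessError):
        info = ""
    print(info)
    if "2.0" in info:
        print("A TPM 2.0 appears to be present and enabled.")
    else:
        print("No active TPM 2.0 reported; check your BIOS/UEFI settings.")
```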
What to do if you don’t have a TPM chip
If you don’t have a TPM, the next option is to buy one online and install it yourself. Unfortunately, this will be difficult for the average user.
The first task is to find a compatible TPM. Some CPUs can’t support add-on TPMs, so make sure to research before you buy one… or should I say, if you can buy one. Scalpers are hoarding TPM chips and selling them at prices much higher than MSRP. What is normally a $14-$30 component now costs upwards of $100. It’s not as bad as the current GPU and CPU market, but that’s not saying much.
If you find a compatible 2.0 TPM at a fair price, you then have to open your PC and access the motherboard to install it manually. This will be a challenge for some PCs (especially laptops) and impossible on certain tablets and hybrid devices like the Microsoft Surface. Again, do your research before you buy.
If you can’t buy and install a TPM for your current PC, then you’ll need to buy or build a new computer if you want to upgrade to Windows 11. Thankfully, Microsoft intends to support Windows 10 until October 14, 2025, so there’s no pressure to upgrade immediately. Hopefully, the TPM market—and the tech hardware market in general—will stabilize long before then and upgrading won’t be such a hassle.
While a lot of focus has been on the TPM requirements for Windows 11, Microsoft has since updated its documentation to provide a complete list of supported processors. At present the list includes only Intel 8th-generation Core processors or newer and AMD Ryzen (Zen+) processors or newer, effectively limiting Windows 11 to PCs less than four to five years old.
Originally, Microsoft noted that CPU generation requirements are a “soft floor” limit for the Windows 11 installer, which should have allowed some older CPUs to be able to install Windows 11 with a warning, but hours after we published this story, the company updated that page to explicitly require the list of chips above.
Many Windows 10 users have been downloading Microsoft’s PC Health Check app (available here) to see whether Windows 11 works on their systems, only to find it fails the check… This is the first significant shift in Windows hardware requirements since the release of Windows 8 back in 2012, and the CPU changes are understandably catching people by surprise.
Microsoft is also requiring a front-facing camera for all Windows 11 devices except desktop PCs from January 2023 onwards.
“In order to run Windows 11, devices must meet the hardware specifications,” explains Microsoft’s official compatibility page for Windows 11.
“Devices that do not meet the hardware requirements cannot be upgraded to Windows 11.”
Stanford researchers have developed a new technique that produces “atomically-thin” transistors under 100 nanometers long. That’s “several times” shorter than the previous best, according to the university.
The team accomplished the feat by overcoming a longstanding hurdle in flexible tech. While ‘2D’ semiconductors are the ideal, they require so much heat to make that the process would melt a flexible plastic substrate. The new approach covers glass-coated silicon with a super-thin film of the semiconductor molybdenum disulfide, overlaid with nano-patterned gold electrodes. This produces a film just three atoms thick, using a temperature nearing 1,500°F — a conventional plastic substrate would have deformed at around 680°F.
Once the components have cooled, the team can apply the film to the substrate and take a few “additional fabrication steps” to create a whole structure about five microns thick, a tenth the thickness of a human hair. It’s even ideal for low-power use, as it can handle high currents at low voltage.
Featured in Nature Communications, this new research could result in wearable tech that can sense, store, analyze, and infer the activities of its wearer in real time. The study’s senior author, Yoel Fink, believes that digital fibers like those developed in this study could help expand the possibilities for fabrics to “uncover the context of hidden patterns in the human body that could be used for physical performance monitoring, medical inference, and early disease detection.”
Applications for the technology could even expand into other areas of our lives like, for example, storing wedding music within the bride’s gown.
This study is important as, up to now, most electronic fibers have been analog. This means that they carry a continuous electronic signal rather than a purely digital one.
“This work presents the first realization of a fabric with the ability to store and process data digitally, adding a new information content dimension to textiles and allowing fabrics to be programmed literally,” explained Fink.
The fibers are made from chains of hundreds of tiny silicon chips
The fibers were created by chaining hundreds of microscale silicon digital chips into a preform used to make a new “smart” polymer fiber. Using precision control, the authors of the study were able to create fibers with a continuous electrical connection between chips over tens of meters.
These fibers are thin and flexible and can even be passed through the eye of a needle. This would mean they could be seamlessly (pun intended) woven into existing fabrics, and can even withstand being washed at least ten times without degrading.
This would mean this wearable tech could be retrofitted to existing clothing and you wouldn’t even know it’s there.
[…]
The fiber also has a pretty decent storage capacity, all things considered. During the research, it proved possible to write, store, and recall a 767-kilobit full-color short movie file and a 0.48-megabyte music file. The files can be stored for two months without power.
The fibers have also been outfitted with their own neural network
The fibers also integrate a neural network with thousands of connections. This was used to monitor and analyze the surface body temperature of a test subject after the fiber was woven into the armpit of a shirt.
By training the neural network on 270 minutes of such data, the team got it to predict the minute-by-minute activity of the shirt’s wearer with 96% accuracy.
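As a toy illustration of the idea, and nothing more (synthetic data, invented temperatures; the paper’s real features and network are not reproduced here), a small classifier can learn to tell activities apart from windows of skin-temperature readings:

```python
# Toy version of the activity-from-temperature idea: train a small neural net
# on windows of simulated skin-temperature data labeled with activities.
# Entirely synthetic; illustrative only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
WINDOW = 60  # one temperature sample per second, one-minute windows

def make_windows(mean_temp, label, n=200):
    X = rng.normal(mean_temp, 0.15, size=(n, WINDOW))
    y = np.full(n, label)
    return X, y

# Pretend sitting, walking, and running shift armpit temperature slightly.
Xs, ys = make_windows(35.8, 0)
Xw, yw = make_windows(36.3, 1)
Xr, yr = make_windows(36.9, 2)
X = np.vstack([Xs, Xw, Xr])
y = np.concatenate([ys, yw, yr])

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X[::2], y[::2])                          # train on half the windows
print("accuracy:", clf.score(X[1::2], y[1::2]))  # evaluate on the other half
```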
“This type of fabric could give quantity and quality open-source data for extracting out new body patterns that we did not know about before,” Loke added.
With their analytical capabilities, such fibers could, conceivably, provide real-time alerts about a person’s health (like respiratory or heart problems). It could even be used to help deliver muscle activation signals or heart rate data for athletes.
The fibers are also controlled using a small external device that could have microcontrollers added to it in the future.
Jaguar Land Rover (JLR) is shutting its two main car factories temporarily due to a shortage of computer chips.
The difficulties at Britain’s biggest carmaker echo similar problems at other manufacturers, including Ford, who have been hit by a global shortage of chips.
JLR said there would be a “limited period” of closure at its Halewood and Castle Bromwich sites from Monday.
A mixture of strong demand and Covid shutdowns at chipmakers has also hit phone, TV and video games companies.
Companies that sell refrigerators, washers, hairdryers, or TVs in the European Union will need to ensure those appliances can be repaired for up to 10 years, to help reduce the vast mountain of electrical waste that piles up each year on the continent.
The “right to repair,” as it is sometimes called, comes into force across the 27-nation bloc on Monday. It is part of a broader effort to cut the environmental footprint of manufactured goods by making them more durable and energy-efficient.
[…]
“This is a really big step in the right direction,” said Daniel Affelt of the environmental group BUND-Berlin, which runs several “repair cafes” where people can bring in their broken appliances and get help fixing them up again.
Modern appliances are often glued or riveted together, he said. “If you need special tools or have to break open the device, then you can’t repair it.”
Lack of spare parts is another problem, campaigners say. Sometimes a single broken tooth on a tiny plastic sprocket can throw a proverbial wrench in the works.
“People want to repair their appliances,” Affelt said. “When you tell them that there are no spare parts for a device that’s only a couple of years old then they are obviously really frustrated by that.”
Under the new EU rules, manufacturers will have to ensure parts are available for up to a decade, though some will only be provided to professional repair companies to ensure they are installed correctly.
The demand to store ever-increasing volumes of information has resulted in the widespread implementation of data centers for Big Data. These centers consume massive amounts of energy (about 3% of the global electricity supply) and rely on magnetization-based hard disk drives with limited storage capacity (up to 2 TB per disk) and lifespan (three to five years). Laser-enabled optical data storage is a promising and cost-effective alternative for meeting this unprecedented demand. However, the diffractive nature of light has limited the size to which bits can be scaled, and as a result, the storage capacity of optical disks.

Researchers at USST, RMIT, and NUS have now overcome this limitation by using earth-rich lanthanide-doped upconversion nanoparticles and graphene oxide flakes. This unique material platform enables low-power optical writing of nanoscale information bits.

A much-improved data density can be achieved, for an estimated storage capacity of 700 TB on a 12-cm optical disk, comparable to the capacity of 28,000 Blu-ray disks. Furthermore, the technology uses inexpensive continuous-wave lasers, reducing operating costs compared to traditional optical writing techniques that use expensive and bulky pulsed lasers.

This technology also offers the potential for optical lithography of the nanostructures in carbon-based chips under development for next-generation nanophotonic devices.
Seismologists at Caltech working with optics experts at Google have developed a method to use existing underwater telecommunication cables to detect earthquakes. The technique could lead to improved earthquake and tsunami warning systems around the world.
[…]
Previous efforts to use optical fibers to study seismicity have relied on the addition of sophisticated scientific instruments and/or the use of so-called “dark fibers,” fiber optic cables that are not actively being used.
Now Zhongwen Zhan (Ph.D. ’13), assistant professor of geophysics at Caltech, and his colleagues have come up with a way to analyze the light traveling through “lit” fibers—in other words, existing and functioning submarine cables—to detect earthquakes and ocean waves without the need for any additional equipment. They describe the new method in the February 26 issue of the journal Science.
[…]
The cable networks work through the use of lasers that send pulses of information through glass fibers bundled within the cables, traveling at more than 200,000 kilometers per second, to receivers at the other end. To make optimal use of the cables—that is, to transfer as much information as possible across them—one of the things operators monitor is the polarization of the light that travels within the fibers. Like other light that passes through a polarizing filter, laser light is polarized, meaning its electric field oscillates in just one direction rather than any which way. Controlling the direction of the electric field allows multiple signals to travel through the same fiber simultaneously. At the receiving end, devices check the state of polarization of each signal to see how it has changed along the path of the cable, to make sure the signals are not getting mixed.
[…]
On land, all sorts of disturbances, such as changes in temperature and even lightning strikes, can change the polarization of light traveling through fiber optic cables. Because the temperature in the deep ocean remains nearly constant and because there are so few disturbances there, the change in polarization from one end of the Curie Cable to the other remains quite stable over time, Zhan and his colleagues found.
However, during earthquakes and when storms produce large ocean waves, the polarization changes suddenly and dramatically, allowing the researchers to easily identify such events in the data.
Currently, when earthquakes occur miles offshore, it can take minutes for the seismic waves to reach land-based seismometers and even longer for any tsunami waves to be verified. Using the new technique, the entire length of a submarine cable acts as a single sensor in a hard-to-monitor location. Polarization can be measured as often as 20 times per second. That means that if an earthquake strikes close to a particular area, a warning could be delivered to the potentially affected areas within a matter of seconds.
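Conceptually, the detection step can be as simple as flagging sudden jumps in an otherwise stable polarization time series. Here is a toy sketch of that idea on synthetic data, with invented thresholds; it is not the Caltech team’s actual pipeline:

```python
# Toy detector for sudden polarization changes in an otherwise stable channel:
# flag samples whose step change exceeds k standard deviations of the recent
# past. Synthetic data and thresholds; illustrative only.
import numpy as np

def detect_events(signal: np.ndarray, window: int = 200, k: float = 8.0):
    steps = np.abs(np.diff(signal))
    events = []
    for i in range(window, len(steps)):
        baseline = steps[i - window:i]
        if steps[i] > k * (baseline.std() + 1e-12):
            events.append(i)
    return events

rng = np.random.default_rng(0)
sps = 20                               # 20 polarization samples per second
quiet = rng.normal(0, 1e-3, 60 * sps)  # a quiet minute on the deep-sea cable
quiet[800] += 0.05                     # a sudden jump, e.g. a seismic arrival
print("event at sample(s):", detect_events(quiet))
```

The deep ocean’s stability is what makes such a simple threshold plausible there; on land, temperature swings and lightning would swamp it.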
During the nine months of testing reported in the new study (between December 2019 and September 2020), the researchers detected about 20 moderate-to-large earthquakes along the Curie Cable, including the magnitude-7.7 earthquake that took place off of Jamaica on January 28, 2020.
Although no tsunamis were detected during the study, the researchers were able to detect changes in polarization produced by ocean swells that originated in the Southern Ocean. They believe the changes in polarization observed during those events were caused by pressure changes along the seafloor as powerful waves traveled past the cable. “This means we can detect ocean waves, so it is plausible that one day we will be able to detect tsunami waves,” says Zhan.
Zhan and his colleagues at Caltech are now developing a machine learning algorithm that would be able to determine whether detected changes in polarization are produced by earthquakes or ocean waves rather than some other change to the system, such as a ship or crab moving the cable. They expect that the entire detection and notification process could be automated to provide critical information in addition to the data already collected by the global network of land-based seismometers and the buoys in the Deep-ocean Assessment and Reporting of Tsunamis (DART) system, operated by the National Oceanic and Atmospheric Administration’s National Data Buoy Center.
Apple, on its French website, is now publishing repairability scores for its notoriously difficult to repair products, in accordance with a Gallic environmental law enacted a year ago.
Cook & Co score themselves on repairability, however, and Cupertino kit sometimes fares better under this internal interpretation of the criteria [PDF] than it does under ratings awarded by independent organizations.
For example, Apple gave its 2019 model year 16-inch MacBook Pro (A2141) a repairability score of 6.3 out of 10. According to iFixit, a repair community website, that MacBook Pro model deserves a score of 1 out of 10.
Apple’s evaluation of its products aligns more closely with independent assessment when it comes to phones. Apple gives its iPhone 12 Pro a repairability score of six, which matches the middling score bestowed by iFixit.
“It’s self-reporting right now,” said Gay Gordon-Byrne, executive director of The Repair Association, a repair advocacy group, in an email to The Register. “No audit, no validation, yet. I think there is another year before there are any penalties for lying.”