Why is Xiaomi’s fitness tracker detecting a heartbeat from a roll of toilet paper?
Weibo users are confused, but the answer isn’t as wild as it seems
Does a roll of toilet paper have a heart? Obviously not. So why does Xiaomi’s fitness band display a heart rate when it’s wrapped around a roll of toilet paper?
Weibo users have been discussing the phenomenon, with plenty of pictures from mystified users who say the Xiaomi Mi Band 3 fitness tracker is “detecting” a heart rate on toilet paper.
So we decided to get a Mi Band 3 — and of course, a roll of toilet paper — to check it out.
Bizarrely, it’s true.
It didn’t work all the time — only around a quarter of attempts gave us a heartbeat. The numbers were pretty random (ranging from 59bpm to 88bpm), but they were real.
So what about other objects? We tried wrapping the Mi Band 3 around a mug, because we had a mug, and a banana, because the internet likes bananas. Both gave us a heart rate quickly and far more consistently than the toilet paper did.
59bpm? That roll of toilet paper is so chill right now. (Picture: Abacus)
But the Xiaomi band isn’t alone. We also tried the banana and mug with an Apple Watch Series 4 and a Ticwatch, an Android Wear smartwatch. Both also displayed a heartbeat for the two heartless objects, ranging from 33bpm on the banana (Apple Watch) to 130bpm for the mug (Ticwatch).
Researchers from Linköping University and the Royal Institute of Technology in Sweden have proposed a new device concept that can efficiently transfer the information carried by electron spin to light at room temperature—a stepping stone toward future information technology. They present their approach in an article in Nature Communications.
Light and electron charge are the main media for information processing and transfer. In the search for information technology that is even faster, smaller and more energy-efficient, scientists around the globe are exploring another property of electrons—their spin. Electronics that exploit both the spin and the charge of the electron are called “spintronics.”
[…]
“The main problem is that electrons easily lose their spin orientations when the temperature rises. A key element for future spin-light applications is efficient quantum information transfer at room temperature, but at room temperature, the electron spin orientation is nearly randomized.
[…]
Now, researchers from Linköping University and the Royal Institute of Technology have devised an efficient spin-light interface.
“This interface can not only maintain and even enhance the electron spin signals at room temperature. It can also convert these spin signals to corresponding chiral light signals travelling in a desired direction,” says Weimin Chen.
The key element of the device is extremely small disks of gallium nitrogen arsenide, GaNAs. The disks are only a couple of nanometres high and stacked on top of each other with a thin layer of gallium arsenide (GaAs) between to form chimney-shaped nanopillars. For comparison, the diameter of a human hair is about a thousand times larger than the diameter of the nanopillars.
The unique ability of the proposed device to enhance spin signals is due to minimal defects introduced into the material by the researchers. Fewer than one out of a million gallium atoms are displaced from their designated lattice sites in the material. The resulting defects in the material act as efficient spin filters that can drain electrons with an unwanted spin orientation and preserve those with the desired spin orientation.
“An important advantage of the nanopillar design is that light can be guided easily and more efficiently coupled in and out,” says Shula Chen, first author of the article.
Over on the EEVblog, someone noticed an interesting chip that’s been apparently flying under our radar for a while. This is an ARM processor capable of running Linux. It’s hand-solderable in a TQFP package, has a built-in Mali GPU, support for a touch panel, and has support for 512MB of DDR3. If you do it right, this will get you into the territory of a BeagleBone or a Raspberry Pi Zero, on a board that’s whatever form factor you can imagine. Here’s the best part: you can get this part for $1 USD in large-ish quantities. A cursory glance at the usual online retailers tells me you can get this part in quantity one for under $3. This is interesting, to say the least.
The chip in question, the Allwinner A13, is a 1GHz ARM Cortex-A8 processor. While it’s not much, it is a chip that can run Linux in a hand-solderable package. There’s no HDMI support, and you’ll need to add a few more chips (which are probably in a BGA package), but, hey, it’s only a dollar.
If you’d like to prototype with this chip, the best options right now are a few boards from Olimex, and a System on Module from the same company. That SoM is an interesting bit of kit, allowing anyone to connect a power supply, load an SD card, and get this chip doing something.
Currently, there aren’t really any good solutions for a cheap Linux system you can build at home with hand-solderable chips. Yes, you could put Linux on an ATmega, but that’s the worst PC ever. A better option is the Octavo OSD335x SoC, better known as ‘the BeagleBone on a Chip’. This is a BGA chip, but the layout isn’t too bad, and it can be assembled using a $12 toaster oven. The problem with this chip is the price; at quantity 1000, it’s a $25 chip. At quantity one, it’s a $40 chip. NXP’s i.MX6 chips have great software support, but they’re $30 chips, you’ll need some DDR to make them do something useful, and that doesn’t even touch the fiddliness of a 600-ball package.
While the Allwinner A13 beats all the other options on price and solderability, it should be noted that, like all of these random Linux-capable SoCs, the software is a mess. There is a reason those ‘Raspberry Pi killers’ haven’t yet killed the Raspberry Pi, and it’s because the Allwinner chips don’t have documentation and, let’s repeat it for emphasis, the software is a mess.
Metals are widely used for antennas; however, their bulkiness limits the fabrication of thin, lightweight, and flexible antennas. Recently, nanomaterials such as graphene, carbon nanotubes, and conductive polymers came into play. However, poor conductivity limits their use. We show RF devices for wireless communication based on metallic two-dimensional (2D) titanium carbide (MXene) prepared by a single-step spray coating. We fabricated a ~100-nm-thick translucent MXene antenna with a reflection coefficient of less than −10 dB. By increasing the antenna thickness to 8 μm, we achieved a reflection coefficient of −65 dB. We also fabricated a 1-μm-thick MXene RF identification device tag reaching a reading distance of 8 m at 860 MHz. Our finding shows that 2D titanium carbide MXene operates below the skin depth of copper or other metals as well as offers an opportunity to produce transparent antennas.
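For readers not steeped in antenna specs, those reflection-coefficient figures follow the standard RF definition (nothing specific to this paper): the value in dB relates to the fraction of incident power bounced back from the antenna,

    \text{reflected power fraction} = |S_{11}|^2 = 10^{\,S_{11}[\text{dB}]/10},

so −10 dB means roughly 10 percent of the power is reflected (90 percent accepted by the antenna), while −65 dB means the reflected power is essentially negligible.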
Lenovo is now making it easier for customers running Linux to update their firmware on ThinkPad, ThinkStation, and ThinkCentre hardware.
Lenovo has joined the Linux Vendor Firmware Service (LVFS) and, following collaboration with the upstream developers, is beginning to roll out its device firmware on the platform so it can be easily updated by users with the fwupd stack. Kudos to all involved, especially given how popular Lenovo ThinkPads are among Linux users.
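In practice, once a vendor’s firmware is on the LVFS, updating is just the standard fwupd client workflow. The snippet below is a minimal sketch, assuming only that the stock fwupdmgr command-line tool is installed; it is not Lenovo-specific code, just the usual refresh/get-updates/update sequence driven from Python:

    import subprocess

    # Minimal sketch: drive the stock fwupdmgr CLI from Python.
    # Assumes fwupd is installed and the vendor publishes firmware to the LVFS.
    for args in (["refresh"],        # pull the latest metadata from the LVFS
                 ["get-updates"],    # list devices with pending firmware updates
                 ["update"]):        # download and apply the updates
        subprocess.run(["fwupdmgr", *args], check=False)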
Red Hat’s Richard Hughes outlined the Lenovo collaboration on his blog and more Lenovo device firmware will begin appearing on LVFS in the next few weeks.
In his post, Richard also called out HP as now being one of the few major vendors not yet officially backing the LVFS.
There are, to date, seven different HDMI versions, starting with 1.0, which was introduced back in 2002, and currently ending with 2.1, which was only announced back in November of 2017. The amount of bandwidth each version is capable of supporting, as well as any additional cool features a version may possess, is decided by the HDMI licensing group, which is made up of a collection of companies including Toshiba, Technicolor, Panasonic and Sony.
HDMI Version 1.4, which was introduced back in 2009, is the current de facto standard HDMI cable. It supports up to 10Gbps and a 1080p resolution with a 120Hz refresh rate (which means the screen can display 120 frames per second—great for sports and games), but it can only do 4K at 60Hz, and it can’t handle new features like HDR and wide color gamut. That means it’s worthless if you’re trying to hook up the latest set-top box or game console with most TVs made in the last two to three years.
Well, it’s not worthless, but it’s not ideal, either! You’re essentially losing out on the cool features you paid for in that TV and HDMI-connected device.
HDMI 1.4 also has two sub-versions: 1.4a and 1.4b. The former allows the cable to work with 3D televisions in 1080p at 24Hz, and the latter allows it to also handle 3D 1080p at 120Hz. Neither provides any noticeable improvement if you’re using one with a 2D television. As 3D TVs aren’t especially popular anymore, and there’s not a lot of content available, you don’t really need to think too much about these two—they’ll still work just like a vanilla version 1.4 cable.
What does provide an improvement is moving to Version 2.0. With this upgrade, the maximum bandwidth of the cable nearly doubles, from 10Gbps to 18Gbps. This means the cable can theoretically transmit a lot more data—like all the data needed to properly render a wider color gamut or HDR. Unfortunately, you’re still capped at 4K and 60Hz. So if you head into the big box store and they try to sell you on a fancy 4K TV capable of 120Hz, don’t necessarily feel like you need to spend the money. You will not be able to get a 4K 120Hz picture transmitted over HDMI with version 2.0 or earlier.
This might be where you point to Version 2.1, which was announced back in November 2017. It doesn’t just double the bandwidth. At a theoretical max of 48Gbps, it’s almost three times faster than 2.0 and nearly five times faster than 1.4 or earlier. It can actually do 4K and 120Hz and wide color gamut and HDR all at the same time. However, because it was announced in November 2017, there are very, very few TVs with ports that support the standard, or cables made to the standard.
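If you want to sanity-check those bandwidth figures, the back-of-the-envelope math is simple: pixels per frame, times refresh rate, times bits per channel, times three color channels, plus roughly 50 percent on top for blanking intervals and TMDS encoding overhead. The exact timings vary by mode, so treat the sketch below as an estimate rather than the official HDMI arithmetic:

    # Rough HDMI bandwidth estimate; the 1.5 overhead factor approximates
    # blanking intervals plus TMDS encoding, not the exact spec timings.
    def hdmi_gbps(width, height, refresh_hz, bits_per_channel, overhead=1.5):
        raw = width * height * refresh_hz * bits_per_channel * 3  # RGB / YUV 4:4:4
        return raw * overhead / 1e9

    print(hdmi_gbps(3840, 2160, 60, 8))    # ~18 Gbps: right at the HDMI 2.0 ceiling
    print(hdmi_gbps(3840, 2160, 120, 10))  # ~45 Gbps: only HDMI 2.1 can carry it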
HDMI cable standards are hidden, because the world is terrible
At this point, you might think you’ve cracked the code: just go out, find an HDMI 2.0 or 2.1 cable, plug it in, and you’re good to go. Unfortunately, in 2012 the HDMI licensing group pulled a truly boneheaded move and essentially forbade anyone from actually saying which standards their cables support.
You can’t just go to Monoprice or Amazon and choose a nice-looking 2.0 cable and call it a day. But thankfully this guide exists, so you also don’t have to pore over every single number that chases after an HDMI cable when you do a search on Monoprice or Amazon.
The key thing isn’t to look for 4K, or 60Hz, or HDR, or more complex stats like YUV 4:4:4. All you actually need to pay attention to is the bandwidth of the cable. You want to find cables that say they are capable of 18Gbps or higher.
You also want to make sure that those cables are certified, as uncertified cables can make any kind of bandwidth claim they please and not actually deliver. A certified cable will be a little more expensive, but that means a dollar or two more. It’s a small price to pay to make sure your $1,000 TV is showing the picture it was designed to show.
Knowing when to trash a cable
So how do you know if the cables you already have are worthless? There typically aren’t any markers on the cable you can trust to accurately tell you. So if you don’t want to chuck all the cables you currently own and go buy all new ones, you’ll need to check a few things.
First, look at the manual for your TV and see what version of HDMI each port supports. Many TVs, especially cheaper ones, might only have Version 2.0 or higher on one port! That means there’s only one port that can handle 4K and HDR and all the stuff the TV bragged about having when you bought it. So locate a Version 2.0 port on your TV and plug in a device that supports 4K and HDR. Now, confirm that HDR is enabled on the TV. You’ll need to check your manual, as every TV indicates HDR status differently.
If HDR is enabled then you’re probably good to go! But if it is enabled and you notice the picture is pixelating or stuttering, then it means the cable can’t handle all the data and should be replaced. This is especially common with cables over 6 feet that are attempting to transmit 4K 60Hz picture with wide color gamut and HDR. For that reason, it’s rarely a good idea to buy a cable that is longer than 6 feet.
As good certified cables can be found at places like Amazon and Monoprice for under $10, there’s really no reason not to double-check and replace your cables if needed. You spent all that money on a good picture, so why waste it because of a cheap cable?
Nvidia’s flagship Titan V graphics cards may have hardware gremlins causing them to spit out different answers to repeated complex calculations under certain conditions, according to computer scientists.
The Titan V is the Silicon Valley giant’s most powerful GPU board available to date, and is built on Nv’s Volta technology. Gamers and casual users will not notice any errors or issues, however folks running intensive scientific software may encounter occasional glitches.
One engineer told The Register that when he tried to run identical simulations of an interaction between a protein and an enzyme on Nvidia’s Titan V cards, the results varied. After repeated tests on four of the top-of-the-line GPUs, he found two gave numerical errors about 10 per cent of the time. These tests should produce the same output values every time. On previous generations of Nvidia hardware, that generally was the case. On the Titan V, not so, we’re told.
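A straightforward way to surface this sort of problem is a bit-for-bit repeatability check: run the same deterministic computation over and over with identical inputs and compare the outputs exactly. The sketch below is purely illustrative, using a NumPy matrix multiply on the CPU as a stand-in for the real GPU workload; it is not the engineer’s actual test:

    import numpy as np

    # Illustrative repeatability check: a deterministic computation repeated
    # with identical inputs should produce bit-identical results every run.
    def workload():
        rng = np.random.default_rng(0)          # fixed seed -> identical inputs
        a = rng.standard_normal((512, 512))
        b = rng.standard_normal((512, 512))
        return a @ b                            # stand-in for the simulation step

    reference = workload()
    mismatches = sum(not np.array_equal(workload(), reference) for _ in range(20))
    print(f"{mismatches} of 20 runs differed from the reference result")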
We have repeatedly asked Nvidia for an explanation, and spokespeople have declined to comment. With Nvidia kicking off its GPU Technology Conference in San Jose, California, next week, perhaps then we’ll get some answers.
All in all, it is bad news for boffins as reproducibility is essential to scientific research. When running a physics simulation, any changes from one run to another should be down to interactions within the virtual world, not rare glitches in the underlying hardware.
[…]
Unlike previous GeForce and Titan GPUs, the Titan V is geared not so much for gamers but for handling intensive parallel computing workloads for data science, modeling, and machine learning.
And at $2,999 (£2,200) a pop, it’s not cheap to waste resources and research time on faulty hardware. Engineers speaking to The Register on condition of anonymity to avoid repercussions from Nvidia said the best solution to these problems is to avoid using Titan V altogether until a software patch has been released to address the mathematical oddities.
March 19 is the first day of IBM Think 2018, the company’s flagship conference, where it will unveil what it claims is the world’s smallest computer. They’re not kidding: it’s literally smaller than a grain of salt.
But don’t let the size fool you: this sucker has the computing power of an x86 chip from 1990. Okay, so that’s not great compared to what we have today, but cut it some slack — you need a microscope to see it.
The computer will cost less than ten cents to manufacture, and will also pack “several hundred thousand transistors,” according to the company. These will allow it to “monitor, analyze, communicate, and even act on data.”
[…]
According to IBM, this is only the beginning. “Within the next five years, cryptographic anchors — such as ink dots or tiny computers smaller than a grain of salt — will be embedded in everyday objects and devices,” says IBM head of research Arvind Krishna. If he’s correct, we’ll see way more of these tiny systems in objects and devices in the years to come.
It is not much to look at: the nematode C. elegans is about one millimetre in length and is a very simple organism. But for science, it is extremely interesting. C. elegans is the only living being whose neural system has been analysed completely. It can be drawn as a circuit diagram or reproduced by computer software, so that the neural activity of the worm is simulated by a computer program.
Such an artificial C. elegans has now been trained at TU Wien (Vienna) to perform a remarkable trick: The computer worm has learned to balance a pole at the tip of its tail.
[…]
“With the help of reinforcement learning, a method also known as ‘learning based on experiment and reward’, the artificial reflex network was trained and optimized on the computer”, Mathias Lechner explains. And indeed, the team succeeded in teaching the virtual nerve system to balance a pole. “The result is a controller, which can solve a standard technology problem – stabilizing a pole, balanced on its tip. But no human being has written even one line of code for this controller, it just emerged by training a biological nerve system”, says Radu Grosu.
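To get a feel for “learning based on experiment and reward”, the sketch below trains a trivial controller on the classic CartPole benchmark: it simply tries random linear policies and keeps whichever one balances the pole longest. It uses the gymnasium package and has nothing to do with the biologically derived C. elegans network the TU Wien team trained; it is only a minimal illustration of the trial-and-reward loop:

    import numpy as np
    import gymnasium as gym

    def run_episode(env, weights, max_steps=500):
        # Return how much reward (time upright) a linear policy earns in one episode.
        obs, _ = env.reset()
        total = 0.0
        for _ in range(max_steps):
            action = int(np.dot(weights, obs) > 0)   # push cart left (0) or right (1)
            obs, reward, terminated, truncated, _ = env.step(action)
            total += reward
            if terminated or truncated:
                break
        return total

    env = gym.make("CartPole-v1")
    best_w, best_r = None, -1.0
    for _ in range(200):                             # trial-and-error search
        w = np.random.uniform(-1, 1, size=4)         # one weight per observation value
        r = run_episode(env, w)
        if r > best_r:
            best_w, best_r = w, r                    # keep the best-rewarded policy
    print("best episode reward:", best_r)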
The team is going to explore the capabilities of such control circuits further. The project raises the question of whether there is a fundamental difference between living nerve systems and computer code. Are machine learning and the activity of our brain the same on a fundamental level? At least we can be pretty sure that the simple nematode C. elegans does not care whether it lives as a worm in the ground or as a virtual worm on a computer hard drive.
Razer is a vendor that makes high-end gaming hardware, including laptops, keyboards and mice. I opened a ticket with Razer a few days ago asking them if they wanted to support the LVFS project by uploading firmware and sharing the firmware update protocol used. I offered to upstream any example code they could share under a free license, or to write the code from scratch given enough specifications to do so. This is something I’ve done for other vendors, and it doesn’t take long as most vendor firmware updaters all do the same kind of thing; there are only so many ways to send a few kb of data to USB devices. The fwupd project provides high-level code for accessing USB devices, so yet-another-update-protocol is no big deal. I explained all about the LVFS, and the benefits it provided to a userbase that is normally happy to vote with their wallet to get hardware that’s supported on the OS of their choice.
I just received this note on the ticket, which was escalated appropriately:
I have discussed your offer with the dedicated team and we are thankful for your enthusiasm and for your good idea.
I am afraid I have also to let you know that at this moment in time our support for software is only focused on Windows and Mac.
The CEO of Razer, Min-Liang Tan, said recently: “We’re inviting all Linux enthusiasts to weigh in at the new Linux Corner on Insider to post feedback, suggestions and ideas on how we can make it the best notebook in the world that supports Linux.” If this is true, and more than just a sound-bite, shouldn’t supporting the LVFS for firmware updates on the Razer Blade, to solve security problems like Meltdown and Spectre, be a priority?
I have gone off them since they require their products to be connected via their cloud to change settings and receive updates. There is absolutely no reason for a mouse to need to be connected to Razer to change settings.
Engineers at MIT have designed an artificial synapse in such a way that they can precisely control the strength of an electric current flowing across it, similar to the way ions flow between neurons. The team has built a small chip with artificial synapses made from silicon germanium. In simulations, the researchers found that the chip and its synapses could be used to recognize samples of handwriting with 95 percent accuracy.
[…]
Most neuromorphic chip designs attempt to emulate the synaptic connection between neurons using two conductive layers separated by a “switching medium,” or synapse-like space. When a voltage is applied, ions should move in the switching medium to create conductive filaments, similarly to how the “weight” of a synapse changes.
But it’s been difficult to control the flow of ions in existing designs. Kim says that’s because most switching mediums, made of amorphous materials, have unlimited possible paths through which ions can travel — a bit like Pachinko, a mechanical arcade game that funnels small steel balls down through a series of pins and levers, which act to either divert or direct the balls out of the machine.
Like Pachinko, existing switching mediums contain multiple paths that make it difficult to predict where ions will make it through. Kim says that can create unwanted nonuniformity in a synapse’s performance.
“Once you apply some voltage to represent some data with your artificial neuron, you have to erase and be able to write it again in the exact same way,” Kim says. “But in an amorphous solid, when you write again, the ions go in different directions because there are lots of defects. This stream is changing, and it’s hard to control. That’s the biggest problem — nonuniformity of the artificial synapse.”
A perfect mismatch
Instead of using amorphous materials as an artificial synapse, Kim and his colleagues looked to single-crystalline silicon, a defect-free conducting material made from atoms arranged in a continuously ordered alignment. The team sought to create a precise, one-dimensional line defect, or dislocation, through the silicon, through which ions could predictably flow.
To do so, the researchers started with a wafer of silicon, resembling, at microscopic resolution, a chicken-wire pattern. They then grew a similar pattern of silicon germanium — a material also used commonly in transistors — on top of the silicon wafer. Silicon germanium’s lattice is slightly larger than that of silicon, and Kim found that together, the two perfectly mismatched materials can form a funnel-like dislocation, creating a single path through which ions can flow.
The researchers fabricated a neuromorphic chip consisting of artificial synapses made from silicon germanium, each synapse measuring about 25 nanometers across. They applied voltage to each synapse and found that all synapses exhibited more or less the same current, or flow of ions, with about a 4 percent variation between synapses — a much more uniform performance compared with synapses made from amorphous material.
They also tested a single synapse over multiple trials, applying the same voltage over 700 cycles, and found the synapse exhibited the same current, with just 1 percent variation from cycle to cycle.
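To see what those percentages mean in practice, the toy model below writes one normalized weight onto a bank of synapses whose conductances carry roughly 4 percent device-to-device spread and 1 percent cycle-to-cycle noise. The percentages come from the article; everything else (Gaussian noise, 1,000 devices) is an illustrative assumption, not the MIT team’s model:

    import numpy as np

    rng = np.random.default_rng(42)
    n_synapses, n_cycles = 1000, 700
    target = 1.0                                       # normalized target weight

    # ~4% spread between devices, ~1% noise from one write cycle to the next
    device_scale = rng.normal(1.0, 0.04, size=n_synapses)
    cycle_noise = rng.normal(1.0, 0.01, size=(n_cycles, n_synapses))
    written = target * device_scale * cycle_noise

    print("spread across devices:", written.mean(axis=0).std())   # ~0.04
    print("spread across cycles :", written.std(axis=0).mean())   # ~0.01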
A joint China-Austria team has performed quantum key distribution between the quantum-science satellite Micius and multiple ground stations located in Xinglong (near Beijing), Nanshan (near Urumqi), and Graz (near Vienna). Such experiments demonstrate the secure satellite-to-ground exchange of cryptographic keys during the passage of the satellite Micius over a ground station. Using Micius as a trusted relay, a secret key was created between China and Europe at locations separated up to 7,600 km on the Earth.
[…]
Within a year after launch, three key milestones for a global-scale quantum internet were achieved: satellite-to-ground decoy-state QKD with kHz rate over a distance of ~1200 km (Liao et al. 2017, Nature 549, 43); satellite-based entanglement distribution to two locations on the Earth separated by ~1200 km and Bell test (Yin et al. 2017, Science 356, 1140), and ground-to-satellite quantum teleportation (Ren et al. 2017, Nature 549, 70). The effective link efficiencies in the satellite-based QKD were measured to be ~20 orders of magnitude larger than direct transmission through optical fibers at the same length of 1200 km. The three experiments are the first steps toward a global space-based quantum internet.
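That “~20 orders of magnitude” figure is easy to reproduce with a back-of-the-envelope calculation, assuming a typical telecom-fiber attenuation of about 0.2 dB/km:

    T_{\text{fiber}} = 10^{-\alpha L / 10} = 10^{-(0.2 \times 1200)/10} = 10^{-24}.

Combining that with the article’s ~20-orders-of-magnitude figure implies a satellite-link transmittance of very roughly 10^{-4}, i.e. around 40 dB of downlink loss, which is why a trusted-relay satellite beats any direct fiber link at this distance.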
The satellite-based QKD has now been combined with metropolitan quantum networks, in which fibers are used to efficiently and conveniently connect numerous users inside a city over a distance scale of ~100 km. For example, the Xinglong station has now been connected to the metropolitan multi-node quantum network in Beijing via optical fibers. Very recently, the largest fiber-based quantum communication backbone has been built in China, also by Professor Pan’s team, linking Beijing to Shanghai (going through Jinan and Hefei, and 32 trustful relays) with a fiber length of 2000 km. The backbone is being tested for real-world applications by government, banks, securities and insurance companies.
The Japanese company will unveil and test its “brain-to-vehicle” technology at next week’s Consumer Electronics Show in Las Vegas. The “B2V” system requires a driver to wear a skullcap that measures brain-wave activity and transmits its readings to steering, acceleration and braking systems that can start responding before the driver initiates the action.
The driver still turns the wheel or hits the gas pedal, but the car anticipates those movements and begins the actions 0.2 seconds to 0.5 seconds sooner, said Lucian Gheorghe, a senior innovation researcher at Nissan overseeing the project. The earlier response should be imperceptible to drivers, he said.
“We imagine a future where manual driving is still a value of society,” said Gheorghe, 40, who earned a doctorate in applied neural technology. “Driving pleasure is something as humans we should not lose.”
Supports 2.5″ SATA I/II/III hard drives and solid state drives. USB 3.0 supports data transfer speeds up to 5Gbps and is backwards compatible with USB 2.0/USB 1.0.
Efficient UASP transfer protocol. An included cover protects the SATA connector from dust.
Portable and lightweight design makes it easy to carry. An LED shows power and activity status.
Supports hot swapping, with easy, tool-free installation. No drivers or software needed.
What We Offer – Unitek USB 3.0 to SATA 6G Adapter x1, a 2-year warranty, and 24-hour friendly customer service and email support.
The Rival 600 even has its own CPU and storage tucked inside, so that once you get everything configured just the way you like, you can save those settings directly in the mouse, so you won’t need to re-download the SteelSeries app if you play with it on a different machine.
What I really, really dislike about Razer’s offering is that their control panel requires an online account and connection. The settings, and who knows what else, are stored in their ‘cloud’. For a mouse or keyboard driver, this seems to me to be totally unnecessary and an invasion of privacy. This looks like a good alternative.
Mesh routers like Eero, Netgear’s Orbi, and Google Wifi are getting all the hype these days, but replacing your whole network with a bunch of new devices can be kind of expensive. Asus has a good solution with its new AiMesh system, which lets you repurpose your existing Asus routers as part of a mesh network. For now, the mesh support is coming to a few routers today in beta, including the ASUS RT-AC68U, RT-AC1900P, RT-AC86U, RT-AC5300, and the ROG Rapture GT-AC5300, with additional support planned for the RT-AC88U and RT-AC3100 later this year.
Seagate is increasing IO performance in disk drives by separating read-write heads into two separate sets which can operate independently and in parallel.
The heads are positioned at one end of actuator arms which rotate around a post at their other end to move the heads across the platter surfaces. Thus, with an eight-platter drive, each read-write head is positioned above the same cylindrical track on each platter and reads or writes to and from the same disk blocks on each platter’s surface.
Seagate’s Multi Actuator technology divides these eight heads into two sets of four, and they can move independently of each other. An animated graphic here shows them in operation.
A 2-in-1 Windows 10 laptop powered by a smartphone chip
The chipset behind the Asus NovaGo comes straight from smartphones, so we were into the fact that the volume and power keys are aligned along the right side of the laptop. This is shaping up to be the always-connected laptop counterpart to a smartphone in so many ways.
[…]
The Asus NovaGo presents a glimpse of an always-connected laptop future with what promises to be stellar battery life, mixed with last year’s smartphone chipset and older ports.
It has us excited for what this laptop eliminates more than what it introduces. Not having to connect to unsecure Wi-Fi, set up a hotspot or worry as much about battery life is a brilliant change that makes it possible to use this laptop anywhere.
Performance is the wildcard. How does Qualcomm’s smartphone chipset, backed by a lot of RAM, compare to laptops that have the usual Intel CPUs at their heart?
That’s going to require more testing of the Asus NovaGo in a full review coming soon.
Having been very happy with my old Canon printer, I decided to get another one when it died after four years of trusted service. This one is absolutely horrific. It started off with difficulties connecting via WiFi. The ratio of paper jams is around one page printed for every page jammed. The scanner can’t remember whether you want to scan a PDF or a PNG and defaults to PNG. Scans are unceremoniously dumped into the Documents folder. When you open the lid to change pages to scan, you are as likely to open the ink drawer. Occasionally the printer decides to forget what type of paper is in the drawer and asks you to register the paper type (it has never been anything BUT A4!). Sometimes it just randomly prints off blank pages. Because it feels like it. A true frustration, getting behind this damn thing.
An apparent factory cockup has left OnePlus Android smartphones with an exposed diagnostics tool that can be potentially exploited to root the handsets.
Security researcher Robert Baptiste suggested the EngineerMode APK was made by Qualcomm, and was intended to be used by factory staff to test phones for basic functionality before they are shipped out to the public.
Unfortunately, it seems someone at OnePlus forgot to remove or disable the package before kicking the handsets out to the general public, and as a result folks now have access to what is effectively a backdoor in their Android phones.
In addition to basic diagnostic tasks like checking the functionality of the phone’s hardware components – such as the GPS and wireless electronics – the tool can also allow people, using the password ‘angela’, to obtain root access and gain full control over a device:
However, being able to root your phone gives you access to the full functionality of the OS. This is something I think is a good idea – there are plenty of apps (e.g. battery monitors) that require root access to function.
The phone-in-the-closet phenomenon has become a hidden store of e-waste; a two-year-old phone still has value and is still a powerful device. And so it’s great news that Samsung is starting a new “Upcycling” initiative that is designed to take old smartphones and turn them into something brand new.
Behold, for example, this bitcoin mining rig, made out of 40 old Galaxy S5 devices, which runs on a new operating system Samsung has developed for its upcycling initiative.
[…]
The team hooked 40 old Galaxy S5s together to make a bitcoin mining rig, repurposed an old Galaxy tablet into an Ubuntu-powered laptop, used a Galaxy S3 to monitor a fish tank, and programmed an old phone with facial recognition software to guard the entrance of a house in the form of an owl.
[…]
It’s all very cool, and Samsung plans to release both the software it used to unlock the phones and the various plans for the projects online for free.
[…]
Upcycling is a great way to keep old devices alive, and it can’t easily happen without the original manufacturer’s support. “The challenge with keeping old electronics running a long time is software,” Kyle Wiens, CEO of iFixit, told me over the phone. “With phones in particular, the old software is insecure and doesn’t run the new apps.”
[…]
Samsung’s upcycling project has a placeholder GitHub page with a video explaining its process. “They’re setting up a maker-magazine-style portfolio of projects,” Wiens explained. The site will work by allowing users to download software that removes Android and opens the devices up to other forms of software. From there, users can browse a wide variety of homebrew software and projects.
The platform will be open, so users can make and upload their own projects and software once it launches. In an example from a Samsung promotional video, a user downloaded fish-monitoring software to an old Galaxy S3 and ordered the water sensors right from the website. After it’s all set up, the user has a device that monitors the pH and temperature of the fish tank. It even allows the pet owner to snap pics of their swimmers or turn the lights on and off.
Robust support for repurposing devices like this is unheard of in the tech industry. Companies such as Apple have made it hard for users to fix their own broken devices. In most cases, manufacturers would rather people just buy new devices than fix their old ones. It’s a philosophy that’s good for the company, but bad for the environment and bad for the customer.
Manufacturing of the Kinect has shut down. Originally created for the Xbox 360, Microsoft’s watershed depth camera and voice recognition microphone sold ~35 million units since its debut in 2010, but Microsoft will no longer produce it when retailers sell off their existing stock. The company will continue to support Kinect for customers on Xbox, but the future of its developer tools remains unclear. Microsoft shared the news with Co.Design in exclusive interviews with Alex Kipman, creator of the Kinect, and Matthew Lapsen, GM of Xbox Devices Marketing.
The Kinect had already been slowly de-emphasized by Microsoft, as the Xbox team anchored back around traditional gaming to counter the PS4, rather than take its more experimental approach to entertainment. Yet while the Kinect as a standalone product is off the market, its core sensor lives on. Kinect v4 (and soon, v5) powers Microsoft’s augmented reality HoloLens, which Kipman also created. Meanwhile, Kinect’s team of specialists have gone on to build essential Microsoft technologies, including the Cortana voice assistant, the Windows Hello biometric facial ID system, and a context-aware user interface for the future that Microsoft dubs Gaze, Gesture, and Voice (GGV).
A real shame for a truly revolutionary MS product.
During a routine periodic fire suppression system maintenance, an unexpected release of inert fire suppression agent occurred. When suppression was triggered, it initiated the automatic shutdown of Air Handler Units (AHU) as designed for containment and safety. While conditions in the data center were being reaffirmed and AHUs were being restarted, the ambient temperature in isolated areas of the impacted suppression zone rose above normal operational parameters. Some systems in the impacted zone performed auto shutdowns or reboots triggered by internal thermal health monitoring to prevent overheating of those systems.
[…]
However, some of the overheated servers and storage systems “did not shutdown in a controlled manner,” and it took a while to bring them back online.
As a result, virtual machines were shut down rather than kept alive, to avoid any data corruption. Azure Backup vaults were not available, and this caused backup and restore operation failures. Azure Site Recovery lost failover ability, and HDInsight, Azure Scheduler and Functions dropped jobs as their storage systems went offline.
Azure Monitor and Data Factory showed serious latency and errors in pipelines, Azure Stream Analytics jobs stopped processing input and producing output, albeit only for a few minutes, and Azure Media Services saw failures and latency issues for streaming requests, uploads, and encoding.
Tosi’s conceptual breakthrough is the creation of an entirely new type of qubit, using both the nucleus and the electron. In this approach, a qubit ‘0’ state is defined when the spin of the electron is down and the nucleus spin is up, while the ‘1’ state is when the electron spin is up, and the nuclear spin is down.
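Written out, the two basis states described above look like this (using up/down arrows for the spins, with subscripts e for the electron and n for the nucleus; this is just a restatement of the paragraph, not notation taken from the paper):

    |0\rangle = |\!\downarrow_{e}\rangle \, |\!\uparrow_{n}\rangle,
    \qquad
    |1\rangle = |\!\uparrow_{e}\rangle \, |\!\downarrow_{n}\rangle.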
“We call it the ‘flip-flop’ qubit,” said Tosi. “To operate this qubit, you need to pull the electron a little bit away from the nucleus, using the electrodes at the top. By doing so, you also create an electric dipole.”
“This is the crucial point,” adds Morello. “These electric dipoles interact with each other over fairly large distances, a good fraction of a micron, or 1,000 nanometres.
“This means we can now place the single-atom qubits much further apart than previously thought possible,” he continued. “So there is plenty of space to intersperse the key classical components such as interconnects, control electrodes and readout devices, while retaining the precise atom-like nature of the quantum bit.”
Antennas receive information by resonating with EM waves, which they convert into electrical voltage. For such resonance to occur, a traditional antenna’s length must roughly match the wavelength of the EM wave it receives, meaning that the antenna must be relatively big. However, like a guitar string, an antenna can also resonate with acoustic waves. The new antennas take advantage of this fact: they will pick up EM waves of a given frequency if their size matches the wavelength of the much shorter acoustic waves of the same frequency. That means that for any given signal frequency, the antennas can be much smaller.
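The size difference that buys you is dramatic. Here is a rough comparison for a 2.5 GHz WiFi-band signal (the ~5,000 m/s acoustic velocity is an assumed ballpark for a solid resonator, not a figure from the paper):

    # Rough size comparison for a 2.5 GHz signal (illustrative numbers only)
    c_em = 3.0e8           # speed of light in air, m/s
    v_acoustic = 5.0e3     # assumed acoustic velocity in the resonator, m/s
    f = 2.5e9              # WiFi-band frequency, Hz

    em_wavelength = c_em / f          # ~0.12 m -> a conventional antenna is cm-scale
    ac_wavelength = v_acoustic / f    # ~2e-6 m -> an acoustic resonator can be micron-scale
    print(f"EM: {em_wavelength*100:.1f} cm, acoustic: {ac_wavelength*1e6:.1f} um")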
The trick is, of course, to quickly turn the incoming EM waves into acoustic waves. To do that, the two-part antenna employs a thin sheet of a so-called piezomagnetic material, which expands and contracts when exposed to a magnetic field. If it’s the right size and shape, the sheet efficiently converts the incoming EM wave to acoustic vibrations. That piezomagnetic material is then attached to a piezoelectric material, which converts the vibrations to an oscillating electrical voltage. When the antenna sends out a signal, information travels in the reverse direction, from electrical voltage to vibrations to EM waves. The biggest challenge, Sun says, was finding the right piezomagnetic material—he settled on a combination of iron, gallium, and boron—and then producing it at high quality.
The team created two kinds of acoustic antennas. One has a circular membrane, which works for frequencies in the gigahertz range, including those for WiFi. The other has a rectangular membrane, suitable for megahertz frequencies used for TV and radio. Each is less than a millimeter across, and both can be manufactured together on a single chip. When researchers tested one of the antennas in a specially insulated room, they found that compared to a conventional ring antenna of the same size, it sent and received 2.5 gigahertz signals about 100,000 times more efficiently, they report today in Nature Communications.