Powerful antibiotic discovered using machine learning for first time

A powerful antibiotic that kills some of the most dangerous drug-resistant bacteria in the world has been discovered using artificial intelligence.

The drug works in a different way to existing antibacterials and is the first of its kind to be found by setting AI loose on vast digital libraries of pharmaceutical compounds.

[…]

“I think this is one of the more powerful antibiotics that has been discovered to date,” added James Collins, a bioengineer on the team at MIT. “It has remarkable activity against a broad range of antibiotic-resistant pathogens.”

[…]

To find new antibiotics, the researchers first trained a “deep learning” algorithm to identify the sorts of molecules that kill bacteria. To do this, they fed the program information on the atomic and molecular features of nearly 2,500 drugs and natural compounds, and on whether or not each substance blocked the growth of the bug E coli.

Once the algorithm had learned what molecular features made for good antibiotics, the scientists set it working on a library of more than 6,000 compounds under investigation for treating various human diseases. Rather than looking for any potential antimicrobials, the algorithm focused on compounds that looked effective but unlike existing antibiotics. This boosted the chances that the drugs would work in radical new ways that bugs had yet to develop resistance to.
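
As described, the recipe is essentially supervised classification followed by virtual screening. Below is a minimal sketch of that general workflow, not the MIT team's actual model (which used a graph-based neural network over molecular structures); the feature vectors and growth-inhibition labels here are placeholders standing in for real data.

```python
# Minimal sketch of the screening recipe described above, NOT the MIT team's
# actual model. Assumes you already have numeric feature vectors for each
# compound (e.g. molecular fingerprints) and a 0/1 label for "inhibits E. coli".
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: ~2,500 known drugs and natural products.
X_train = rng.random((2500, 1024))        # placeholder fingerprints
y_train = rng.integers(0, 2, size=2500)   # placeholder growth-inhibition labels

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Hypothetical screening library: ~6,000 compounds in clinical development.
X_library = rng.random((6000, 1024))
scores = model.predict_proba(X_library)[:, 1]   # predicted antibacterial activity

# Rank the library and keep high-scoring compounds for lab testing; the study
# additionally filtered for structures dissimilar to known antibiotics.
top_hits = np.argsort(scores)[::-1][:100]
print("Top candidate indices:", top_hits[:10])
```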

Jonathan Stokes, the first author of the study, said it took a matter of hours for the algorithm to assess the compounds and come up with some promising antibiotics. One, which the researchers named “halicin” after Hal, the astronaut-bothering AI in the film 2001: A Space Odyssey, looked particularly potent.

Writing in the journal Cell, the researchers describe how they treated numerous drug-resistant infections with halicin, a compound that was originally developed to treat diabetes, but which fell by the wayside before it reached the clinic.

Tests on bacteria collected from patients showed that halicin killed Mycobacterium tuberculosis, the bug that causes TB, and strains of Enterobacteriaceae that are resistant to carbapenems, a group of antibiotics that are considered the last resort for such infections. Halicin also cleared C difficile and multidrug-resistant Acinetobacter baumannii infections in mice.

To hunt for more new drugs, the team next turned to a massive digital database of about 1.5bn compounds. They set the algorithm working on 107m of these. Three days later, the program returned a shortlist of 23 potential antibiotics, of which two appear to be particularly potent. The scientists now intend to search more of the database.

Stokes said it would have been impossible to screen all 107m compounds by the conventional route of obtaining or making the substances and then testing them in the lab. “Being able to perform these experiments in the computer dramatically reduces the time and cost to look at these compounds,” he said.

Regina Barzilay, an MIT professor and co-senior author of the study, now wants to use the algorithm to find antibiotics that are more selective in the bacteria they kill. This would mean that taking the antibiotic kills only the bugs causing an infection, and not all the healthy bacteria that live in the gut. More ambitiously, the scientists aim to use the algorithm to design potent new antibiotics from scratch.

Source: Powerful antibiotic discovered using machine learning for first time | Society | The Guardian

People Are Killing Puppy Clones That Don’t Come Out ‘Perfect’ – wait you can clone your puppy?!

This is a hugely holier-than-thou article written by a strident anti-abortionist, but it’s quite interesting in that a) you can clone your puppy commercially and b) it’s absolutely not a perfected science.

You have five days after your pet dies to extract its genetic material for cloning, according to the Seoul-based Sooam Biotech Research Foundation, which offers dog and cat cloning services. The company recommends wrapping the deceased in wet blankets and throwing them into the fridge before you send the package. From there, scientists will harvest tissue from the pet along with eggs, usually sourced from slaughterhouses, and then transfer the resulting embryos into surrogate mothers via in vitro fertilization.

It can take dozens of artificial inseminations into a mother animal’s womb to get a single egg to gestation. When that mother finally does give birth — there are scores of these surrogate mothers whose only job is to be filled with needles until they conceive, and then do it again — what’s born might be a genetic copy of the original, but it isn’t a perfect copy.

When I picked up Onruang’s pups and examined them head to hock — they weighed maybe three pounds a piece — I saw surprising amounts of subtle variations in markings and size.

[…]

When an animal is cloned, the donor — the animal that supplies the egg — contributes extremely low levels of mitochondrial DNA. “That’s the variation which can account for differing color patterns and other unknowns,” says Doug Antczak, a veterinary scientist at Cornell University who specializes in horse genetics.

What’s eventually passed to the cloned pet buyer is a reasonable facsimile, something good enough to the naked eye that they’ll say: “That’s my dog!” And here’s where the scale of this production might — or should — give pause.

Many clones are born with defects and genetic disorders, and since those imperfections aren’t what their buyer is spending tens of thousands of dollars on, they end up discarded.

[…]

if that cloned dog does make it through the gauntlet — but is missing the spot over its eye that a deceased pet had, for instance — it still faces a swift death via euthanasia, just another pile of genetic material to harvest.

“There’s too many mistakes, too many stillbirths, deformities, and mutations,” warns Chris Cauble, a Glendale, California, veterinarian whose mobile service offers tissue collection for cloning pets.

Source: People Are Killing Puppy Clones That Don’t Come Out ‘Perfect’

Car ‘splatometer’ tests reveal 80% decline in number of insects over two decades

Two scientific studies of the number of insects splattered by cars have revealed a huge decline in abundance at European sites in two decades.

The research adds to growing evidence of what some scientists have called an “insect apocalypse”, which is threatening a collapse in the natural world that sustains humans and all life on Earth. A third study shows plummeting numbers of aquatic insects in streams.

The survey of insects hitting car windscreens in rural Denmark used data collected every summer from 1997 to 2017 and found an 80% decline in abundance. It also found a parallel decline in the number of swallows and martins, birds that live on insects.

The second survey, in the UK county of Kent in 2019, examined splats in a grid placed over car registration plates, known as a “splatometer”. This revealed 50% fewer impacts than in 2004. The research included vintage cars up to 70 years old to see if their less aerodynamic shape meant they killed more bugs, but it found that modern cars actually hit slightly more insects.

“This difference we found is critically important, because it mirrors the patterns of decline which are being reported widely elsewhere, and insects are absolutely fundamental to food webs and the existence of life on Earth,” said Paul Tinsley-Marshall from Kent Wildlife Trust. “It’s pretty horrendous.”

[…]

The Danish research, published in the journal Ecology and Evolution, used data from an average of 65 car journeys a year on the same stretch of road and at the same speed between 1997 and 2017. Møller took account of the time of day, temperature, wind speed and date of the journey and found an 80% decline in insect abundance over the 21-year period. Checks using insect nets and sticky traps showed the same trend.
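
The covariate adjustment described here is conceptually straightforward. Below is a minimal, synthetic-data sketch of that kind of analysis (regressing insect counts on year while controlling for driving conditions); it is not the model or the data from the actual Ecology and Evolution paper.

```python
# Illustrative covariate-adjusted trend estimate on synthetic journey data,
# roughly in the spirit of the Danish analysis (the paper's model may differ).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200  # hypothetical car journeys
year = rng.integers(1997, 2018, size=n)
temperature = rng.normal(18, 3, size=n)
wind_speed = rng.normal(4, 1.5, size=n)
hour = rng.integers(8, 20, size=n)

# Synthetic insect counts with a built-in downward trend over the years.
rate = np.exp(3.0 - 0.08 * (year - 1997) + 0.02 * temperature - 0.05 * wind_speed)
insects = rng.poisson(rate)

df = pd.DataFrame(dict(insects=insects, year=year, temperature=temperature,
                       wind_speed=wind_speed, hour=hour))

# A count model with year as the trend of interest and the driving conditions
# as nuisance covariates.
model = smf.poisson("insects ~ year + temperature + wind_speed + hour",
                    data=df).fit(disp=0)
print(model.params["year"])   # negative coefficient -> declining abundance
```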

Møller said the causes were likely to be “a bit of everything”, but noted significant changes due to global heating. “In my 50 years, the temperature in April, May and June has increased by 1.5C [2.7F] on average in my study area,” he said. “The amount of rain has increased by 50%. We are talking about dramatic differences.”

The stream research, published in the journal Conservation Biology, analysed weekly data from 1969 to 2010 on a stream in a German nature reserve, where the only major human impact is climate change.

“Overall, water temperature increased by 1.88C and discharge patterns changed significantly. These changes were accompanied by an 81.6% decline in insect abundance,” the scientists reported. “Our results indicate that climate change has already altered [wildlife] communities severely, even in protected areas.”

Source: Car ‘splatometer’ tests reveal huge decline in number of insects | Environment | The Guardian

Antarctica Just Set a New Temperature Record

It’s positively balmy in Antarctica. The National Meteorological Service of Argentina announced on Twitter that its Esperanza weather station recorded a new high for the continent: 18.3 degrees Celsius (64.9 degrees Fahrenheit).

The previous temperature record for Antarctica was set on March 24, 2015, when this same weather station recorded 17.5 degrees Celsius (63.5 degrees Fahrenheit) near the northern tip of the Antarctic Peninsula closest to South America. Antarctica may be one of the coldest zones on Earth, but it’s also one of the fastest-warming places: The World Meteorological Organization reports that the peninsula has warmed almost 3 degrees Celsius (5.4 degrees Fahrenheit) over the last half-century.

Source: Antarctica Just Set a New Temperature Record

Lab-Grown Heart Muscles Have Been Transplanted Into a Human For The First Time

On Monday, researchers from Japan’s Osaka University announced the successful completion of a first-of-its-kind heart transplant.

Rather than replacing their patient’s entire heart with a new organ, the researchers placed degradable sheets containing heart muscle cells onto the heart’s damaged areas – and if the procedure has the desired effect, it could eventually eliminate the need for full heart transplants in some patients.

To grow the heart muscle cells, the team started with induced pluripotent stem (iPS) cells. These are stem cells that researchers create by taking an adult’s cells – often from their skin or blood – and reprogramming them back into their embryonic-like pluripotent state.

At that point, researchers can coax the iPS cells into becoming whatever kind of cell they’d like. In the case of this Japanese study, the researchers created heart muscle cells from the iPS cells before placing them on small sheets.

The patient who received the transplant suffers from ischemic cardiomyopathy, a condition in which a person’s heart has trouble pumping because its muscles don’t receive enough blood.

In severe cases, the condition can require a heart transplant, but the team from Osaka University hopes that the muscle cells on the sheet will secrete a protein that helps regenerate blood vessels, thereby improving the patient’s heart function.

The researchers plan to monitor the patient for the next year, and they hope to conduct the same procedure on nine other people suffering from the same condition within the next three years.

If all goes well, the procedure could become a much-needed alternative to heart transplants – not only is sourcing iPS cells far easier than finding a suitable donor heart, but a recipient’s immune system is more likely to tolerate the cells than a new organ.

Source: Lab-Grown Heart Muscles Have Been Transplanted Into a Human For The First Time

Body movement is achieved by molecular motors. A new ‘molecular nano-patterning’ technique allows us to study these motors and reveals that some of them coordinate differently

Body movement, from the muscles in your arms to the neurons transporting those signals to your brain, relies on a massive collection of proteins called molecular motors.

Fundamentally, molecular motors are proteins that convert chemical energy into mechanical movement, and they have different functions depending on their task. However, because they are so small, the exact mechanisms by which these molecules coordinate with each other are poorly understood.

Publishing in Science Advances, researchers at Kyoto University’s School of Engineering have found that two types of kinesin molecular motors have different properties of coordination. The findings, made in collaboration with the National Institute of Information and Communications Technology, or NICT, were possible thanks to a new tool the team developed that parks individual motors on platforms just tens of nanometers across.

“Kinesin is a protein that is involved in actions such as cell division, muscle contraction, and flagella movement. They move along long protein filaments called microtubules,” explains first author Taikopaul Kaneko. “In the body, kinesins work as a team to transport cargo inside a cell, or allow the cell itself to move.”

To observe the coordination closely, the team constructed a device consisting of an array of gold nano-pillars 50 nanometers in diameter and spaced 200 to 1000 nanometers apart. For reference, a skin cell is about 30 micrometers, or 30,000 nanometers, in diameter.

“We then combined this array with self-assembled monolayers, or SAM, that immobilized a single kinesin molecule on each nano-pillar,” continues Kaneko. “This ‘nano-patterning’ method of motor proteins gives us control of the number and spacing of kinesins, allowing us to accurately calculate how they transport microtubules.”

The team evaluated two kinesins, kinesin-1 and kinesin-14, which are involved in intracellular transport and cell division, respectively. Their results showed that in the case of kinesin-1, neither the number nor the spacing of the molecules changed the transport velocity of microtubules.

In contrast, for kinesin-14, transport velocity decreased as the number of motors on a filament increased, but increased as the spacing between the motors grew. The results indicate that while kinesin-1 molecules work independently, kinesin-14 motors interact with each other to tune the speed of transport.

Ryuji Yokokawa, who led the team, was surprised by the results: “Before we started this study, we thought that more motors led to faster transport and more force. But like most things in biology, it’s rarely that simple.”

The team will be using their new nano-patterning method to study the mechanics of other kinesins and different molecular motors.

“Humans have over 40 kinesins along with two other types of molecular motors called myosin and dynein. We can even modify our array to study how these motors act in a density gradient. Our results and this new tool are sure to expand our understanding of the various basic cellular processes fundamental to all life,” concludes Yokokawa.

Source: A new ‘molecular nano-patterning’ technique reveals that some molecular motors coordinate differently

Turns out that RNA affects DNA in multiple ways. Genes don’t just send messages to RNA which then direct proteins to do stuff.

Rather than directions going one-way from DNA to RNA to proteins, the latest study shows that RNA itself modulates how DNA is transcribed—using a chemical process that is increasingly recognized as vital to biology. The discovery has significant implications for our understanding of human disease and drug design.

[…]

The picture many of us remember learning in school is an orderly progression: DNA is transcribed into RNA, which then makes proteins that carry out the actual work of living cells. But it turns out there are a lot of wrinkles.

Chuan He’s team at the University of Chicago found that messenger RNA molecules, previously thought of as simple couriers that carry instructions from DNA to proteins, were actually making their own impact on protein production. This happens through a chemical modification called methylation; He’s key breakthrough was showing that this methylation is reversible. It isn’t a one-time, one-way transaction; it can be erased and reversed.

“That discovery launched us into a modern era of RNA modification research, which has really exploded in the last few years,” said He. “This is how so much of gene expression is critically affected. It impacts a wide range of biological processes—learning and memory, circadian rhythms, even something so fundamental as how a cell differentiates itself into, say, a blood cell versus a neuron.”

[…]

they began to see that messenger RNA methylation could not fully explain everything they observed.

This was mirrored in other experiments. “The data coming out of the community was saying there’s something else out there, something extremely important that we’re missing—that critically impacts many early development events, as well as human diseases such as cancer,” he said.

He’s team discovered that a group of RNAs called chromosome-associated regulatory RNAs, or carRNAs, was using the same methylation process, but these RNAs do not code for proteins and are not directly involved in translation. Instead, they control how DNA itself is stored and transcribed.

“This has major implications in basic biology,” He said. “It directly affects gene transcriptions, and not just a few of them. It could induce global chromatin change and affects transcription of 6,000 genes in the cell line we studied.”

He sees major implications in biology, especially in human health—everything from identifying the genetic basis of disease to better treating patients.

“There are several biotech companies actively developing small molecule inhibitors of RNA methylation, but right now, even if we successfully develop therapies, we don’t have a full mechanical picture for what’s going on,” he said. “This provides an enormous opportunity to help guide disease indication for testing inhibitors and suggest new opportunities for pharmaceuticals.”

Source: Surprise discovery shakes up our understanding of gene expression

Immune cell which kills most cancers discovered by accident by Welsh scientists in major breakthrough 

A new type of immune cell which kills most cancers has been discovered by accident by British scientists, in a finding which could herald a major breakthrough in treatment.

Researchers at Cardiff University were analysing blood from a bank in Wales, looking for immune cells that could fight bacteria, when they found an entirely new type of T-cell.

That new immune cell carries a never-before-seen receptor which acts like a grappling hook, latching on to most human cancers, while ignoring healthy cells.

In laboratory studies, immune cells equipped with the new receptor were shown to kill lung, skin, blood, colon, breast, bone, prostate, ovarian, kidney and cervical cancer.

Professor Andrew Sewell, lead author on the study and an expert in T-cells from Cardiff University’s School of Medicine, said it was “highly unusual” to find a cell with such broad cancer-fighting potential, and that it raised the prospect of a universal therapy.

“This was a serendipitous finding, nobody knew this cell existed,” Prof Sewell told The Telegraph.

“Our finding raises the prospect of a ‘one-size-fits-all’ cancer treatment, a single type of T-cell that could be capable of destroying many different types of cancers across the population. Previously nobody believed this could be possible.”

[…]

the new cell attaches to a molecule on cancer cells called MR1, which does not vary in humans.

It means that not only would the treatment work for most cancers, but it could be shared between people, raising the possibility that banks of the special immune cells could be created for instant ‘off-the-shelf’ treatment in future.

When researchers injected the new immune cells into mice bearing human cancers and carrying a human immune system, they found ‘encouraging’ cancer-clearing results.

And they showed that T-cells of skin cancer patients, which were modified to express the new receptor, could destroy not only the patient’s own cancer cells, but also other patients’ cancer cells in the laboratory.

[…]

Professor Awen Gallimore, of the University’s division of infection and immunity and cancer immunology lead for the Wales Cancer Research Centre, added: “If this transformative new finding holds up, it will lay the foundation for a ‘universal’ T-cell medicine, mitigating against the tremendous costs associated with the identification, generation and manufacture of personalised T-cells.

“This is truly exciting and potentially a great step forward for the accessibility of cancer immunotherapy.”

Commenting on the study, Daniel Davis, Professor of Immunology at the University of Manchester, said it was an exciting discovery which opened the door to cellular therapies being used for more people.

“We are in the midst of a medical revolution harnessing the power of the immune system to tackle cancer.  But not everyone responds to the current therapies and there can be harmful side-effects.

“The team have convincingly shown that, in a lab dish, this type of immune cell reacts against a range of different cancer cells.

“We still need to understand exactly how it recognises and kills cancer cells, while not responding to normal healthy cells.”

The research was published in the journal Nature Immunology.

Source: Immune cell which kills most cancers discovered by accident by British scientists in major breakthrough 

Local water availability is permanently reduced after planting forests

River flow is reduced in areas where forests have been planted and does not recover over time, a new study has shown. Rivers in some regions can completely disappear within a decade. This highlights the need to consider the impact on regional water availability, as well as the wider climate benefit, of tree-planting plans.

“Reforestation is an important part of tackling climate change, but we need to carefully consider the best places for it. In some places, changes to water availability will completely change the local cost-benefits of tree-planting programmes,” said Laura Bentley, a plant scientist in the University of Cambridge Conservation Research Institute, and first author of the report.

Planting large areas of forest has been suggested as one of the best ways of reducing atmospheric carbon dioxide levels, since trees absorb and store this greenhouse gas as they grow. While it has long been known that planting trees reduces the amount of water flowing into nearby rivers, there has previously been no understanding of how this effect changes as forests age.

The study looked at 43 sites across the world where forests have been established, and used river flow as a measure of water availability in the region. It found that within five years of planting trees, river flow had reduced by an average of 25%. By 25 years, rivers had gone down by an average of 40% and in a few cases had dried up entirely. The biggest percentage reductions in water availability were in regions of Australia and South Africa.

“River flow does not recover after planting trees, even after many years, once disturbances in the catchment and the effects of climate are accounted for,” said Professor David Coomes, Director of the University of Cambridge Conservation Research Institute, who led the study.

Published in the journal Global Change Biology, the research showed that the type of land where trees are planted determines the degree of impact they have on local water availability. Trees planted on natural grassland where the soil is healthy decrease river flow significantly. On land previously degraded by agriculture, establishing forest helps to repair the soil so it can hold more water and decreases nearby river flow by a lesser amount.

Counterintuitively, the effect of trees on river flow is smaller in drier years than wetter ones. When trees are drought-stressed they close the pores on their leaves to conserve water, and as a result draw up less water from the soil. In wetter years the trees use more water from the soil, and also catch the rainwater in their leaves.

“Climate change will affect water availability around the world,” said Bentley. “By studying how forestation affects water availability, we can work to minimise any local consequences for people and the environment.”

Source: Local water availability is permanently reduced after planting forests

Ultrafast camera takes 1 trillion frames per second of transparent objects and phenomena, can photograph light pulses

A little over a year ago, Caltech’s Lihong Wang developed the world’s fastest camera, a device capable of taking 10 trillion pictures per second. It is so fast that it can even capture light traveling in slow motion.

But sometimes just being quick is not enough. Indeed, not even the fastest camera can take pictures of things it cannot see. To that end, Wang, Bren Professor of Medical Engineering and Electrical Engineering, has developed a new camera that can take up to 1 trillion pictures per second of transparent objects. A paper about the camera appears in the January 17 issue of the journal Science Advances.

The technology, which Wang calls phase-sensitive compressed ultrafast photography (pCUP), can take video not just of transparent objects but also of more ephemeral things like shockwaves and possibly even of the signals that travel through neurons.

Wang explains that his new imaging system combines the high-speed photography system he previously developed with an old technology, phase-contrast microscopy, which was designed to allow better imaging of objects that are mostly transparent, such as cells, which are mostly water.

[…]

Wang says the technology, though still early in its development, may ultimately have uses in many fields, including physics, biology, or chemistry.

“As signals travel through neurons, there is a minute dilation of nerve fibers that we hope to see. If we have a network of neurons, maybe we can see their communication in real time,” Wang says. In addition, he says, because temperature is known to change phase contrast, the system “may be able to image how a flame front spreads in a combustion chamber.”

The paper describing pCUP is titled “Picosecond-resolution phase-sensitive imaging of transparent objects in a single shot.”

Source: Ultrafast camera takes 1 trillion frames per second of transparent objects and phenomena

A floating device created to clean up plastic from the ocean is finally doing its job, organizers say

A huge trash-collecting system designed to clean up plastic floating in the Pacific Ocean is finally picking up plastic, its inventor announced Wednesday.

The Netherlands-based nonprofit the Ocean Cleanup says its latest prototype was able to capture and hold debris ranging in size from huge, abandoned fishing gear, known as “ghost nets,” to tiny microplastics as small as 1 millimeter.
“Today, I am very proud to share with you that we are now catching plastics,” Ocean Cleanup founder and CEO Boyan Slat said at a news conference in Rotterdam.

The Ocean Cleanup system is a U-shaped barrier with a net-like skirt that hangs below the surface of the water. It moves with the current and collects faster moving plastics as they float by. Fish and other animals will be able to swim beneath it.

The new prototype added a parachute anchor to slow the system and increased the size of a cork line on top of the skirt to keep the plastic from washing over it.

The Ocean Cleanup’s System 001/B collects and holds plastic until a ship can collect it.

It’s been deployed in “The Great Pacific Garbage Patch” — a concentration of trash located between Hawaii and California that’s about double the size of Texas, or three times the size of France.

Ocean Cleanup plans to build a fleet of these devices, and predicts it will be able to reduce the size of the patch by half every five years.

Source: A floating device created to clean up plastic from the ocean is finally doing its job, organizers say – CNN

During Brain Surgery, This AI Can Diagnose a Tumor in 2 Minutes

Expert human pathologists typically require around 30 minutes to diagnose brain tumors from tissue samples extracted during surgery. A new artificially intelligent system can do it in less than 150 seconds—and it does so more accurately than its human counterparts.

New research published today in Nature Medicine describes a novel diagnostic technique that combines the power of artificial intelligence with an advanced optical imaging technique. The system can perform rapid and accurate diagnoses of brain tumors in practically real time, while the patient is still on the operating table. In tests, the AI made diagnoses that were slightly more accurate than those made by human pathologists and in a fraction of the time. Excitingly, the new system could be used in settings where expert neuropathologists aren’t available, and it holds promise as a technique that could diagnose other forms of cancer as well.

[…]

New York University neuroscientist Daniel Orringer and his colleagues developed a diagnostic technique that combined a powerful new optical imaging technique, called stimulated Raman histology (SRH), with an artificially intelligent deep neural network. SRH uses scattered laser light to illuminate features not normally seen in standard imaging techniques.

[…]

To create the deep neural network, the scientists trained the system on 2.5 million images taken from 415 patients. By the end of the training, the AI could categorize tissue into any of 13 common forms of brain tumors, such as malignant glioma, lymphoma, metastatic tumors, diffuse astrocytoma, and meningioma.
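
For readers curious what “categorizing tissue into 13 classes” looks like in code, here is a minimal sketch of a multi-class image classifier of that general kind. It is not the authors’ network, data, or training setup, just a standard convolutional backbone with hypothetical inputs standing in for SRH image patches.

```python
# Minimal sketch of a 13-class image classifier, illustrative only.
import torch
import torch.nn as nn
from torchvision import models

NUM_TUMOR_CLASSES = 13  # e.g. glioma, lymphoma, metastasis, meningioma, ...

model = models.resnet18(num_classes=NUM_TUMOR_CLASSES)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """One optimisation step on a batch of image patches (hypothetical data)."""
    optimizer.zero_grad()
    logits = model(images)            # (batch, 13) class scores
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()

# Hypothetical batch: 8 three-channel 224x224 patches with integer class labels.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_TUMOR_CLASSES, (8,))
print("loss:", train_step(images, labels))
```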

A clinical trial involving 278 brain tumor and epilepsy patients and three different medical institutions was then set up to test the efficacy of the system. SRH images were evaluated by either human experts or the AI. Looking at the results, the AI correctly identified the tumor 94.6 percent of the time, while the human neuropathologists were accurate 93.9 percent of the time. Interestingly, the errors made by humans were different than the errors made by the AI. This is actually good news, because it suggests the nature of the AI’s mistakes can be accounted for and corrected in the future, resulting in an even more accurate system, according to the authors.

“SRH will revolutionize the field of neuropathology by improving decision-making during surgery and providing expert-level assessment in the hospitals where trained neuropathologists are not available,” said Matija Snuderl, a co-author of the study and an associate professor at NYU Grossman School of Medicine, in the press release.

Source: During Brain Surgery, This AI Can Diagnose a Tumor in 2 Minutes

New evidence shows that the key assumption made in the discovery of dark energy is in error

The most direct and strongest evidence for the accelerating universe with dark energy is provided by the distance measurements using type Ia supernovae (SN Ia) for the galaxies at high redshift. This result is based on the assumption that the corrected luminosity of SN Ia through the empirical standardization would not evolve with redshift.

New observations and analysis made by a team of astronomers at Yonsei University (Seoul, South Korea), together with their collaborators at Lyon University and KASI, show, however, that this key assumption is most likely in error. The team has performed very high-quality (signal-to-noise ratio ~175) spectroscopic observations covering most of the reported nearby early-type host galaxies of SN Ia, from which they obtained the most direct and reliable measurements of population ages for these host galaxies. They find a significant correlation between SN luminosity and stellar population age at a 99.5 percent confidence level. As such, this is the most direct and stringent test ever made for the luminosity evolution of SN Ia. Since SN progenitors in host galaxies are getting younger with redshift (look-back time), this result inevitably indicates a serious systematic bias with redshift in SN cosmology. Taken at face value, the luminosity evolution of SN is significant enough to question the very existence of dark energy. When the luminosity evolution of SN is properly taken into account, the team found that the evidence for the existence of dark energy simply goes away.

Commenting on the result, Prof. Young-Wook Lee (Yonsei Univ., Seoul), who led the project said, “Quoting Carl Sagan, extraordinary claims require extraordinary evidence, but I am not sure we have such extraordinary evidence for dark energy. Our result illustrates that dark energy from SN cosmology, which led to the 2011 Nobel Prize in Physics, might be an artifact of a fragile and false assumption.”

Other cosmological probes, such as the cosmic microwave background (CMB) and baryonic acoustic oscillations (BAO), are also known to provide some indirect and “circumstantial” evidence for dark energy, but it was recently suggested that CMB data from the Planck mission no longer support the concordance cosmological model, which may require new physics (Di Valentino, Melchiorri, & Silk 2019). Some investigators have also shown that BAO and other low-redshift cosmological probes can be consistent with a non-accelerating universe without dark energy (see, for example, Tutusaus et al. 2017). In this respect, the present result showing the luminosity evolution mimicking dark energy in SN cosmology is crucial and very timely.

This result is reminiscent of the famous Tinsley-Sandage debate in the 1970s on luminosity evolution in observational cosmology, which led to the termination of the Sandage project originally designed to determine the fate of the universe.

This work, based on the team’s nine-year effort at the Las Campanas Observatory 2.5-m telescope and the MMT 6.5-m telescope, was presented at the 235th meeting of the American Astronomical Society held in Honolulu on January 5th 2020 (2:50 PM in the cosmology session, presentation No. 153.05). The paper has also been accepted for publication in the Astrophysical Journal and will appear in the January 2020 issue.

Source: New evidence shows that the key assumption made in the discovery of dark energy is in error

This particle accelerator fits on the head of a pin

If you know nothing else about particle accelerators, you probably know that they’re big — sometimes miles long. But a new approach from Stanford researchers has led to an accelerator shorter from end to end than a human hair is wide.

The general idea behind particle accelerators is that they’re a long line of radiation emitters that smack the target particle with radiation at the exact right time to propel it forward a little faster than before. The problem is that depending on the radiation you use and the speed and resultant energy you want to produce, these things can get real big, real fast.

That also limits their applications; you can’t exactly put a particle accelerator in your lab or clinic if it’s half a kilometer long and takes megawatts to run. Something smaller could be useful, even if it was nowhere near those power levels — and that’s what these Stanford scientists set out to make.

“We want to miniaturize accelerator technology in a way that makes it a more accessible research tool,” explained project lead Jelena Vuckovic in a Stanford news release.

But this wasn’t designed like a traditional particle accelerator such as the Large Hadron Collider or the one at SLAC National Accelerator Laboratory, a collaborator on the project. Instead of engineering it from the bottom up, the team fed their requirements to an “inverse design algorithm” that produced the kind of energy pattern they needed from the infrared radiation emitters they wanted to use.

That’s partly because infrared radiation has a much shorter wavelength than something like microwaves, meaning the mechanisms themselves can be made much smaller — perhaps too small to adequately design the ordinary way.

The algorithm’s solution to the team’s requirements led to an unusual structure that looks more like a Rorschach test than a particle accelerator. But these blobs and channels are precisely contoured to guide infrared laser light pulses in such a way that they push electrons along the center up to a significant proportion of the speed of light.
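
“Inverse design” here just means letting an optimiser choose the structure that best produces a desired response, rather than drawing it by hand. A toy sketch of that idea follows; it is a generic fitting loop with a made-up stand-in “physics” model, not the photonics solver the Stanford team actually used.

```python
# Toy illustration of inverse design: pick structure parameters so that a
# simulated response matches a target pattern.
import numpy as np
from scipy.optimize import minimize

target_field = np.array([0.0, 1.0, 0.0, -1.0, 0.0])   # hypothetical desired pattern

def simulate(params):
    """Stand-in 'physics' model: maps design parameters to a field pattern."""
    x = np.linspace(0, 2 * np.pi, target_field.size)
    return params[0] * np.sin(params[1] * x + params[2])

def mismatch(params):
    """Objective: squared error between simulated and desired field."""
    return np.sum((simulate(params) - target_field) ** 2)

result = minimize(mismatch, x0=np.array([0.5, 0.5, 0.0]), method="Nelder-Mead")
print("optimised parameters:", result.x)
print("residual mismatch:", result.fun)
```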

The resulting “accelerator on a chip” is only a few dozen microns across, making it comfortably smaller than a human hair and more than possible to stack a few on the head of a pin. A couple thousand of them, really.

And it will take a couple of thousand to get the electrons up to the energy levels needed to be useful – but don’t worry, that’s all part of the plan. The chips are fully integrated but can easily be placed in series to create longer assemblies that reach higher energies.

These won’t be rivaling macro-size accelerators like SLAC’s or the Large Hadron Collider, but they could be much more useful for research and clinical applications where planet-destroying power levels aren’t required. For instance, a chip-sized electron accelerator might be able to direct radiation into a tumor surgically rather than through the skin.

The team’s work is published in a paper today in the journal Science.

Source: This particle accelerator fits on the head of a pin – TechCrunch

How old ship logs are giving new insights into climate change

In the 19th and early 20th centuries, millions of weather observations were carefully made in the logbooks of ships sailing through largely uncharted waters. Written in pen and ink, the logs recorded barometric pressure, air temperature, ice conditions and other variables. Today, volunteers from a project called Old Weather are transcribing these observations, which are fed into a huge dataset at the National Oceanic and Atmospheric Administration. This “weather time machine,” as NOAA puts it, can estimate what the weather was for every day back to 1836, improving our understanding of extreme weather events and the impacts of climate change.

Source: How old ship logs are giving new insights into climate change

Climate Change Is Already Making Us Sick

The fossil fuels driving climate change make people sick, and so do impacts like extreme heat, wildfires, and more extreme storms, according to research published on Wednesday. In short, the climate crisis is a public health crisis.

A new report from the premier medical journal The Lancet tallies the medical toll of climate change and finds that last year saw record-setting numbers of people exposed to heat waves and a near-record spread of dengue fever globally. The scientists also crunched numbers around wildfires for the first time, finding that 77 percent of countries are facing more wildfire-induced suffering than they were at the start of the decade. But while some of the report’s findings are rage-inducing, it also shows that improving access to healthcare may be among the most beneficial ways we can adapt to climate change.

[…]

Heat waves are among the more obvious climate change-linked weather disasters, and the report outlines just how much they’re already hurting the world. Last year saw intense heat waves around the world, from the UK to Pakistan to Japan, amid the fourth-warmest year on record.

[…]

The report also found that 2018 marked the second-worst year since accurate record keeping began in 1990 for the spread of dengue fever-carrying mosquitoes. The two types of mosquitoes that transmit dengue have seen their range expand as temperatures have warmed.

[…]

wildfire findings, which are new to this year’s report. Scientists found that more than three-quarters of countries around the world are seeing increased prevalence of wildfires and the sickness-inducing smoke that accompanies them.

[…]

there are also the health risks that come from burning fossil fuels themselves. Air pollution ends up in people’s lungs, where it can cause asthma and other respiratory issues, but it has also shown up in less obvious locations, like people’s brains and women’s placentas.

[…]

“We can do better than to dwell on the problem,” Gina McCarthy, the former head of the Environmental Protection Agency and current Harvard public health professor, said on the press call.

The report found, for example, that despite an uptick in heat waves and heavy downpours that can spur diarrheal diseases, outbreaks have become less common. Ditto for protein-related malnutrition, despite the impact intense heat is having on the nutritional value of staple crops, and the effect of ocean heat waves on the coral reefs and fisheries that rely on them. At least some of that is attributable to improved access to healthcare, socioeconomic opportunities, and sanitation in some regions.

We often think about sea walls or other hard infrastructure when it comes to climate adaptation. But rural health clinics and sewer systems fall into that same category, as do programs like affordable crop insurance. The report suggests improving access to financing health-focused climate projects could pay huge dividends as a result, ensuring that people are insulated from the impacts of climate change and helping lift them out of poverty in the process. Of course it also calls for cutting carbon pollution ASAP because even the best equipped hospital in the world isn’t going to be enough to protect people from the full impacts of climate change.

Source: Climate Change Is Already Making Us Sick

The effects of speed on traffic flow – also with a quick look at emissions and fuel consumption

Right now, in the Netherlands, there is talk of reducing the speed limit from 130 kph to 100 kph in order to comply with emissions goals set by the EU (and supported by NL) years ago. Because NL didn’t put the necessary legislation into effect years ago, this is now coming back to bite NL in the arse and they are playing panic football.

The Dutch institute for the environment shows pretty clearly where emissions are coming from:

[Pie chart: nitrogen deposition in the Netherlands, by source]

Source: Stikstof / RIVM

As you can see it makes perfect sense to do something about traffic, as it causes 6.1% of emissions. Oh wait, there’s the farming sector: that causes 46% of emissions! Why not tackle that? Well, they tried to at first, but then the farmers occupied The Hague with loads of tractors (twice) and all the politicians chickened out. Because nothing determines policy like a bunch of tractors causing traffic jams. Screw the will of the people anyway.

So… traffic it is then.

Now, according to the European Environment Agency in its article Do lower speed limits on motorways reduce fuel consumption and pollutant emissions?, reducing speed limits from 120 to 110 kph will realistically decrease fuel usage by only around 2-3%, BUT will have a pretty large impact on certain pollutants.

Speed limit figure 3

Note: emissions expressed relative to their values at 100 km/h, for which the value ‘1’ is assigned.

Source: EMISIA – ETC/ACM

NOx denotes ‘nitrogen oxides’; PM denotes ‘particulate matter’; THC denotes ‘total hydrocarbons’; CO denotes ‘carbon monoxide’.

Speed limit figure 4

Note: emissions expressed relative to their values at 100 km/h, for which the value ‘1’ is assigned.

Source: EMISIA – ETC/ACM

So reducing speed from 120 to 100 kph should result (for diesels) in an approx 15% decrease in particulate matter and a 40% decrease in nitrogen oxides, but an increase in the amount of total hydrocarbons and carbon monoxide.

For gasoline powered cars it’s a 20% decrease in total hydrocarbons, which means that in NL we can knock the 6.1% of the pie generated by cars down to around 4%. Yay. We don’t win much.

Now about traffic flow, because that’s what I’m here for. The Dutch claim that lowering the speed limit will decrease the amount of time spent in traffic jams. Here’s an example of two experts saying so in BNN Vara’s article Experts: Door verlaging maximumsnelheid ben je juist sneller thuis (Experts: by lowering the maximum speed you actually get home sooner).

However, if you look at their conclusions, they come straight out of one of just two studies that seemingly everyone relies on:

Effects of low speed limits on freeway traffic flow (2017)

It is confirmed that the lower the speed limit, the higher the occupancy to achieve a given flow. This result has been observed even for relatively high flows and low speed limits. For instance, a stable flow of 1942 veh/h/lane has been measured with the 40 km/h speed limit in force. The corresponding occupancy was 33%, doubling the typical occupancy for this flow in the absence of speed limits. This means that VSL strategies aiming to restrict the mainline flow on a freeway by using low speed limits will need to be applied carefully, avoiding conditions as the ones presented here, where speed limits have a reduced ability to limit flows. On the other hand, VSL strategies trying to get the most from the increased vehicle storage capacity of freeways under low speed limits might be rather promising. Additionally, results show that lower speed limits increase the speed differences across lanes for moderate demands. This, in turn, also increases the lane changing rate. This means that VSL strategies aiming to homogenize traffic and reduce lane changing activity might not be successful when adopting such low speed limits. In contrast, lower speed limits widen the range of flows under uniform lane flow distributions, so that, even for moderate to low demands, the under-utilization of any lane is avoided.
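
For intuition on why occupancy rises at lower speed limits, it helps to recall the basic traffic-flow identity flow = density × speed, with occupancy roughly proportional to density. A quick back-of-envelope check in Python, using the flow figure quoted above (the speeds below are hypothetical cruise speeds, not values from the study):

```python
# Back-of-envelope check of the occupancy result quoted above, using the
# standard identity flow = density * speed; occupancy is treated as roughly
# proportional to density (it also depends on average vehicle length).
flow = 1942            # veh/h/lane, the stable flow reported in the study

for speed in (80, 40):                 # km/h, hypothetical cruise speeds
    density = flow / speed             # veh/km/lane
    print(f"{speed} km/h -> density {density:.1f} veh/km/lane")

# Halving the speed at constant flow doubles the density, and therefore roughly
# doubles the occupancy -- consistent with the 33% occupancy observed under the
# 40 km/h limit being about twice the typical occupancy without a speed limit.
```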

There are a few problems with this study: First, it’s talking about speed limits of 40, 60 and 80 kph. Nothing around the 100 – 130kph mark. Secondly, the data in the graphs actually shows a lower occupancy with a higher speed limit – which is not their conclusion!

Which is odd, to say the least.

Off to the other study everyone seems to use: Determining optimal speed limits in traffic networks (2014)

This paper aims to evaluate optimal speed limits in traffic networks in a way that economized societal costs are incurred. In this study, experimental and field data as well as data from simulations are used to determine how speed is related to the emission of pollutants, fuel consumption, travel time, and the number of accidents. This paper also proposes a simple model to calculate the societal costs of travel and relate them to speed. As a case study, using emission test results on cars manufactured domestically and by simulating the suburban traffic flow by Aimsun software, the total societal costs of the Shiraz-Marvdasht motorway, which is one of the most traversed routes in Iran, have been estimated. The results of the study show that from a societal perspective, the optimal speed would be 73 km/h, and from a road user perspective, it would be 82 km/h (in 2011, the average speed of the passing vehicles on that motorway was 82 km/h). The experiments in this paper were run on three different vehicles with different types of fuel. In a comparative study, the results show that the calculated speed limit is lower than the optimal speed limits in Sweden, Norway, and Australia.

(Emphasis mine)

It’s a compelling study with great results, which also include accidents.

In a multi-lane motorway divided by a median barrier in Sweden, the optimal speed is 110 km/h. The speed limit is 110 km/h and the current average speed is 109 km/h. In Norway, the optimal speed from a societal perspective is 100 km/h and the speed limit is 90 km/h. The current average speed is 95 km/h [2]. In Australia, the optimum speeds on rural freeways (dual carriageway roads with grade-separated intersections) would be 110 km/h [3]. Table 3 compares the results in Elvik [2] and Cameron [3] with those of the present study.

Table 3. Optimal speed in Norway, Sweden, Australia, and Iran. Source for columns 2 and 3: Elvik [2]. Source for column 4: Cameron [3].

                                                      Norway   Sweden   Australia   Iran
Optimal speed limit (km/h), societal perspective         100      110         110     73
Optimal speed limit (km/h), road user perspective        110      120           –     82
Current speed limit (km/h)                                90      110         110    110
Current mean speed of travel (km/h)                       95      109           –     82

There is a significant difference between the results in Iran and those in Sweden, Norway, and Australia; this difference results from the difference in the costs between Iran and these three countries. Also, the functions of fuel consumption and pollutant emission are different.

If you look at the first graph, you can be forgiven for thinking that the optimum speed is 95 kph, as Ruud Horman (from the BNN Vara piece) seems to think. However, as the author of this study is very careful to point out, it’s a very constrained study and there are per country differences – these results are only any good for a very specific highway in a very specific country.

Now there’s a blog post from the National database of traffic data (Nationale Databank Wegverkeersgegevens): De 130 km/u-maatregel gewikt en gewogen: meer of minder fileleed? / The 130 kph weighed: more or fewer traffic jams? (2019)

They come out with a whole load of pretty pictures based on the following graph:

[Graph from the NDW post: x = intensity, y = speed]

There are quite a lot of graphs like this. So, if the speed limit is 120 kph (red dots) and the intensity is 6000 (heavy), then the actual speed is likely to be around 100 kph on the A16. However, if the speed limit is 130 kph with the same intensity – oh wait, it doesn’t get to the same intensity. You seem to have higher intensities more often with a speed limit of 120 kph. But if we have an intensity of around 3000 (which I guess is moderate), then you see that quite often the speed is 125 with a speed limit of 130, and around 100 with a speed limit of 120. However, with that intensity you see that there are slightly more datapoints at around 20 – 50 kph if your speed limit is 130 kph than if it’s 120 kph.

Oddly enough, they never added data from 100 kph roads, of which there were (and are) plenty. They also never take into account variable speed limits. The 120 kph limit is based on data taken in 2012 and the 130 kph limit on data from 2018.

Their conclusion – that raising the speed limit wins you time when the roads are quiet and puts you into a traffic jam when the roads are busy – is spurious and lacks the data to support it.

Now we come to more recent research from China: The Effect of Posted Speed Limit on the Dispersion of Traffic Flow Speed.

The conclusion is pretty tough reading, but the graphs are quite clear.

What they are basically saying is: we researched it pretty well and we had a look at the distribution of vehicle types. Basically, if you set a higher speed limit, people will drive faster. There is variability (the bars you see up and down the lines), so sometimes they will drive faster and sometimes they will drive slower, but they generally go faster on average with a higher speed limit.

Now one more argument is that the average commute is only about an hour per day, so if you go slower you will only lose a few minutes. The difference between 100 and 130 kph is a 30% difference in speed. Over a 100 km trip, that works out to roughly a 14 minute difference (60 minutes at 100 kph versus about 46 minutes at 130 kph), assuming you can travel that distance at that speed (what they call free flow conditions). Sure, you’ll never get that, but over large distances you can come close, so let’s call it a 10 minute difference in practice. The argument becomes that this is just barely a cup of tea. But it’s 10 minutes difference EVERY WORKING DAY! Excluding weekends and holidays, you can expect to make that commute around 250 times per year, making your net loss 2500 minutes (at least), which is 41 hours – a full working week you now have to spend extra in the car!
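
For anyone who wants to check the arithmetic, here is a quick sketch using the same assumptions as above (a 100 km free-flow trip and 250 working days per year):

```python
# Quick sanity check of the commute arithmetic above (illustrative numbers only).
distance_km = 100          # assumed trip length, covered in roughly an hour at 100 kph
minutes_at_100 = distance_km / 100 * 60   # 60.0 min
minutes_at_130 = distance_km / 130 * 60   # ~46.2 min
saving_per_trip = minutes_at_100 - minutes_at_130   # ~13.8 min under free flow

practical_saving = 10      # assume real-world conditions erode that to ~10 min
working_days = 250
annual_loss_minutes = practical_saving * working_days   # 2,500 min
print(annual_loss_minutes / 60)   # ~41.7 hours, about a full working week
```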

Considering the low net environmental gains and the falsity of the arguments saying you will spend less time in traffic jams as the speed decreases – and don’t get me started on traffic accidents, because they go up and down like a yoyo despite speed limit differences – reducing the speed limit seems like poor populist policy to appease the farmers, look like Something is Being Done ™ and not actually get anything real to happen except piss off commuters.

The EU Has Approved an Ebola Vaccine

The first human vaccine against the often-fatal viral disease Ebola is now an official reality. On Monday, the European Union approved a vaccine developed by the pharmaceutical company Merck, called Ervebo.

The stage for Ervebo’s approval was set this October, when a committee assembled by the European Medicines Agency (EMA) recommended a conditional marketing authorization for the vaccine by the EU. Conditional marketing authorizations are given to new drugs or therapies that address an “unmet medical need” for patients. These drugs are approved on a quicker schedule than the typical new drug and require less clinical trial data to be collected and analyzed for approval.

In Ervebo’s case, though, the data so far seems to be overwhelmingly positive. In April, the World Health Organization revealed the preliminary results of its “ring vaccination” trials with Ervebo during the current Ebola outbreak in the Democratic Republic of Congo. Out of the nearly 100,000 people vaccinated up until that time, less than 3 percent went on to develop Ebola. These results, coupled with earlier trials dating back to the historic 2014-2015 outbreak of Ebola that killed over 10,000 people, secured Ervebo’s approval by the committee.

“Finding a vaccine as soon as possible against this terrible virus has been a priority for the international community ever since Ebola hit West Africa five years ago,” Vytenis Andriukaitis, commissioner in charge of Health and Food Safety at the EU’s European Commission, said in a statement announcing the approval. “Today’s decision is therefore a major step forward in saving lives in Africa and beyond.”

Although the marketing rights for Ervebo are held by Merck, it was originally developed by researchers from the Public Health Agency of Canada, which still maintains non-commercial rights.

The vaccine’s approval, significant as it is, won’t tangibly change things on the ground anytime soon. In October, the WHO said that licensed doses of Ervebo will not be available to the world until the middle of 2020. In the meantime, people in vulnerable areas will still have access to the vaccine through the current experimental program. Although Merck has also submitted Ervebo for approval by the Food and Drug Administration in the U.S., the agency’s final decision isn’t expected until next year as well.

Source: In a World First, the EU Has Approved an Ebola Vaccine

House plants have little effect on indoor air quality, study concludes

New research from a duo of environmental engineers at Drexel University suggests the decades-old claim that house plants improve indoor air quality is entirely wrong. Evaluating 30 years of studies, the research concludes it would take hundreds of plants in a small space to even come close to the air purifying effect of simply opening a couple of windows.

Back in 1989 an incredibly influential NASA study discovered a number of common indoor plants could effectively remove volatile organic compounds (VOCs) from the air. The experiment, ostensibly conducted to investigate whether plants could assist in purifying the air on space stations, gave birth to the idea of plants in home and office environments helping clear the air.

Since then, a number of experimental studies have seemed to verify NASA’s findings that plants do remove VOCs from indoor environments. Michael Waring, a professor of architectural and environmental engineering at Drexel University, and one of his PhD students, Bryan Cummings, were skeptical of this common consensus. The problem they saw was that the vast majority of these experiments were not conducted in real-world environments.

“Typical for these studies a potted plant was placed in a sealed chamber (often with a volume of a cubic meter or smaller), into which a single VOC was injected, and its decay was tracked over the course of many hours or days,” the duo writes in their study.

To better understand exactly how well potted plants can remove VOCs from indoor environments, the researchers reviewed the data from a dozen published experiments. They evaluated the efficacy of a plant’s ability to remove VOCs from the air using a metric called CADR, or clean air delivery rate.

“The CADR is the standard metric used for scientific study of the impacts of air purifiers on indoor environments,” says Waring, “but many of the researchers conducting these studies were not looking at them from an environmental engineering perspective and did not understand how building air exchange rates interplay with the plants to affect indoor air quality.”

Once the researchers had calculated the rate at which plants dissipated VOCs in each study, they quickly discovered that the effect of plants on air quality in real-world scenarios was essentially irrelevant. Air handling systems in big buildings were found to be significantly more effective at dissipating VOCs in indoor environments. In fact, to clear VOCs from just one square meter (10.7 sq ft) of floor space as effectively as the standard outdoor-to-indoor air exchange that already exists in most large buildings would take up to 1,000 plants.
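
To make the comparison concrete, here is a rough sketch of the CADR bookkeeping with made-up numbers; the study derived per-plant CADRs from published chamber decay experiments, and none of the values below are taken from the paper.

```python
# Minimal sketch of a clean air delivery rate (CADR) comparison,
# with hypothetical illustrative numbers only.
chamber_volume_m3 = 1.0        # typical sealed test chamber (~1 m^3, per the quote)
decay_with_plant = 0.10        # hypothetical VOC loss rate with plant, per hour
decay_without_plant = 0.08     # hypothetical natural loss rate (leakage/deposition)

# CADR is estimated from the extra decay the device (here, a plant) adds.
cadr_per_plant = chamber_volume_m3 * (decay_with_plant - decay_without_plant)
print(f"CADR per plant: {cadr_per_plant:.3f} m^3/h")   # 0.020 m^3/h here

# Compare with ordinary ventilation in a small office.
room_volume_m3 = 30.0
air_changes_per_hour = 1.0     # modest outdoor-air exchange rate
ventilation_cadr = room_volume_m3 * air_changes_per_hour   # 30 m^3/h

plants_needed = ventilation_cadr / cadr_per_plant
print(f"Plants needed to match ventilation: {plants_needed:.0f}")   # ~1,500 here
```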

Source: House plants have little effect on indoor air quality, study concludes

Hottest October ever: Earth just experienced its hottest October ever

Last month was the hottest October on record globally, according to data released Friday by the Copernicus Climate Change Service, an organization that tracks global temperatures. The month, which was reportedly 1.24 degrees Fahrenheit warmer than the average October from 1981-2010, narrowly beat October 2015 for the top spot.

According to Copernicus, most of Europe, large parts of the Arctic and the eastern U.S. and Canada were most affected. The Middle East, much of Africa, southern Brazil, Australia, eastern Antarctica and Russia also experienced above-average temperatures.

Parts of tropical Africa and Antarctica, as well as the western U.S. and Canada, were much colder than usual, however.

Source: Hottest October ever: Earth just experienced its hottest October ever – CBS News

Thousands of Scientists Declare a Climate Emergency

It’s only Tuesday, but more than 11,000 scientists around the world have come together to declare a climate emergency. Their paper, published Tuesday in the journal BioScience, lays out the science behind this emergency and solutions for how we can deal with it.

Scientists aren’t the first people to make this declaration. A tribal nation in the Canadian Yukon, the U.K., and parts of Australia have all come to the same grim conclusion. In the U.S., members of Congress have pushed the federal government to do the same, but y’know, we got Donald Trump. Ain’t shit happening with this fool in office. Anyway, this proclamation from scientists is significant because they’re not making it to push a political agenda or as an emotional outcry. They’re declaring a climate emergency because the science supports it.

The signatories, who come from 153 countries, note that societies have taken little action to prevent climate disaster. It’s been business as usual, despite scientific consensus that burning fossil fuels and driving cars is gravely harming the environment—you know, the environment we all have to live in for the foreseeable future. Greenhouse gas emissions continue to enter the atmosphere, and if we don’t stop quickly, we’re doomed.

Source: Thousands of Scientists Declare a Climate Emergency

Scholars Shouldn’t Fear ‘Dumbing Down’ for the Public

The internet has made it easier than ever to reach a lot of readers quickly. It has birthed new venues for publication and expanded old ones. At the same time, a sense of urgency about current affairs, from politics to science, technology to the arts, has driven new interest in bringing scholarship to the public directly.

Scholars still have a lot of anxiety about this practice. Many of those anxieties relate to their university careers and workplaces: evaluation, tenure, reactions from peers, hallway jealousy, and so on. These are real worries, and as a scholar and university professor myself, I empathize with many of them.

But not with this one: The worry that they’ll have to “dumb down” their work to reach broader audiences. This is one of the most common concerns I hear from academics. “Do we want to dumb down our work to reach these readers?” I’ve heard them ask among themselves. It’s a wrongheaded anxiety.

Like all experts, academics are used to speaking to a specialized audience. That’s true no matter their discipline, from sociology to geotechnical engineering to classics. When you speak to a niche audience among peers, a lot of understanding comes for free. You can use technical language, make presumptions about prior knowledge, and assume common goals or contexts. When speaking to a general audience, you can’t take those circumstances as a given.

But why would doing otherwise mean “dumbing down” the message? It’s an odd idea when you think about it. The whole reason to reach people who don’t know what you know, as an expert, is so that they might know about it. Giving them reason to care, process, and understand is precisely the point.

The phrase dumbing down got its start in entertainment. During the golden age of Hollywood, in the 1930s, dumbing down became a screenwriter’s shorthand for making an idea simple enough that people with limited education or experience could understand it. Over time, it came to refer to intellectual oversimplification of all kinds, particularly in the interest of making something coarsely popular. In education, it named a worry about curricula and policy: that students were being asked to do less, held to a lower standard than necessary—than they were capable of—and than is necessary to produce an informed citizenry.

In the process, dumbing down has become entrenched and spread as a lamentation, often well beyond any justification.

[…]

But to assume that even to ponder sharing the results of scholarship amounts to dumbing down, by default, is a new low in this term for new lows. Posturing as if it’s a problem with the audience, rather than with the expert who refuses to address that audience, is perverse.

One thing you learn when writing for an audience outside your expertise is that, contrary to the assumption that people might prefer the easiest answers, they are all thoughtful and curious about topics of every kind. After all, people have areas in their own lives in which they are the experts. Everyone is capable of deep understanding.

Up to a point, though: People are also busy, and they need you to help them understand why they should care. Doing that work—showing someone why a topic you know a lot about is interesting and important—is not “dumb”; it’s smart. Especially if, in the next breath, you’re also intoning about how important that knowledge is, as academics sometimes do. If information is vital to human flourishing but withheld by experts, then those experts are either overestimating its importance or hoarding it.

Source: Scholars Shouldn’t Fear ‘Dumbing Down’ for the Public – The Atlantic

Managed Retreat Buyout Efforts Have Relocated 40,000 Households to avoid rising seawater: Study

The U.S. is slowly being gripped by a flooding crisis as seas rise and waterways overflow with ever more alarming frequency. An idea at the forefront of how to help Americans cope is so-called managed retreat, a process of moving away from affected areas and letting former neighborhoods return to nature. It’s an idea increasingly in vogue as it becomes clearer that barriers won’t be enough to keep floodwaters at bay.

But new research shows something startling: Americans are already retreating. More than 40,000 households have been bought out by the federal government over the past three decades. The research, published in Science Advances on Wednesday, also reveals disparities in which communities opt in to buyout programs and, even more granularly, which households take the offers and relocate. The cutting-edge research answers long-standing questions and raises a whole host of new ones that will only become more pressing in the coming decades as Earth continues to warm.

“People are using buyouts and doing managed retreat,” AR Siders, a climate governance researcher at Harvard and study author, said during a press call. “No matter how difficult managed retreat sounds, we know that there are a thousand communities in the United States, all over the country, who have made it work. I want to hear their stories, I want to know how they did it.”

Source: Managed Retreat Buyout Efforts Have Relocated 40,000 Households: Study

Meet the Money Behind The Climate Denial Movement

Nearly a billion dollars a year is flowing into the organized climate change counter-movement

The overwhelming majority of climate scientists, international governmental bodies, relevant research institutes and scientific societies are unanimous in saying that climate change is real, that it’s a problem, and that we should probably do something about it now, not later. And yet, for some reason, the idea persists in some people’s minds that climate change is up for debate, or that climate change is no big deal.

Actually, it’s not “for some reason” that people are confused. There’s a very obvious reason. There is a very well-funded, well-orchestrated climate-denial movement, one funded by powerful people with very deep pockets. In a new and incredibly thorough study, Drexel University sociologist Robert Brulle took a deep dive into the financial structure of the climate deniers, to see who is holding the purse strings.

According to Brulle’s research, the 91 think tanks, advocacy organizations, and trade associations that make up the American climate denial industry pull down just shy of a billion dollars each year, money used to lobby or sway public opinion on climate change and other issues.

“The anti-climate effort has been largely underwritten by conservative billionaires,” says the Guardian, “often working through secretive funding networks. They have displaced corporations as the prime supporters of 91 think tanks, advocacy groups and industry associations which have worked to block action on climate change.”

Source: Meet the Money Behind The Climate Denial Movement | Smart News | Smithsonian