What we actually know about the vaccines and the delta variant

The Covid-19 pandemic has changed, and with it, so has the effectiveness of the vaccines.

The bottom line remains the same: The mRNA vaccines from Pfizer/BioNTech and Moderna that are most prevalent in the US are still quite effective in preventing any illness from the novel coronavirus, and extremely effective in preventing the kind of severe illness that leads to hospitalization and death. On Wednesday, the Centers for Disease Control and Prevention confirmed those basic facts with its most robust data yet.

But the statistics that were widely publicized when those vaccines were first authorized in December 2020 — the ones that showed the vaccines were 95 percent effective in preventing all illness and 100 percent effective in stopping hospitalization or death — are now thoroughly out of date. The risk that a vaccinated person who contracts Covid-19 will experience symptoms is higher than it was back then, though still significantly lower than the risk for an unvaccinated person. The now-dominant delta variant is likely to blame.

So exactly how effective are the vaccines against this new, more dangerous iteration of the virus? And how long does immunity provided by the vaccines actually last?

We are finally starting to get some concrete answers to these questions. The vaccines did appear to lose some of their effectiveness in preventing any kind of illness as the delta variant became dominant, especially for people who are at highest risk from Covid-19. But the protection against severe illness held steady, according to the three CDC studies published this week.

The number of breakthrough infections is “increasing as the delta wave proceeds,” Eric Topol, director of the Scripps Research Translational Institute, told me. “But still, protection versus hospitalizations and deaths is very solid.”

With this new evidence of waning vaccine efficacy, the Biden administration announced plans this week to immediately make immunocompromised patients eligible for a booster shot and recommended that all vaccinated people get a third shot eight months after their second dose.

The new data and new guidance reflect this new chapter in the pandemic. Vaccinated people should still feel confident that they are protected against the worst outcomes from Covid-19. But the large number of unvaccinated Americans and the delta variant’s potency have contributed to surges in infections.

Some caution — continuing to wear masks and avoiding large indoor gatherings, for example — can help protect against the high level of spread currently in the US, experts said. Even with the powerful protection of vaccines, it’s possible to get sick after a coronavirus exposure.

We now know more than ever about Covid-19 vaccines and the delta variant

The Covid-19 vaccines were initially tested against the original version of the coronavirus, and they performed incredibly well. But delta has proven somewhat more capable of evading the vaccines and may cause more severe illness than its predecessors, based on early research out of the United Kingdom, one of the first places where delta took hold. The new CDC data is a big step forward because it brings our understanding of the vaccines closer to the present.

One of this week’s CDC studies tracked new cases and hospitalizations from early May to late July in New York. The study period covers the transition from the “alpha” variant to delta, which became dominant by the start of July, but only includes part of the recent surge in reported cases.

The vaccines grew less effective at preventing any illness as the delta variant took over, CDC researchers found. In May, the vaccines were an estimated 90 percent effective at preventing new cases; by mid-July, that figure had dropped to just under 80 percent. Breakthrough infections — cases in which vaccinated people get infected and actually feel sick — became more common.

But the vaccines have remained resilient against severe symptoms, with the estimated effectiveness against hospitalization holding steady around 95 percent from the start to the end of the study period.

“There was a reduction in vaccine effectiveness against SARS-CoV-2 infection, but not against hospitalization,” Celine Gounder, clinical assistant professor of medicine and infectious diseases at the NYU School of Medicine, told me. “The vaccines remain highly protective against hospitalization in all age groups.”

It’s important to remember that “severe” illness is a clinical term that might not align with common parlance. A vaccinated person who becomes ill with Covid-19 might still feel very sick.

“Severe disease isn’t that you feel sick like a dog and are laid up in bed,” Gounder said. “Severe disease means your lungs are failing, your oxygen levels are dropping, and you need to be in the hospital.”

In other words, the vaccines have gotten less effective at stopping Covid-19 in its tracks but are still extremely good at protecting people from the kind of severe cases that need hospital beds or ventilators. Many patients with breakthrough infections can recover at home.

How the vaccines affect long Covid remains an open question. Preliminary evidence seems to suggest that they help alleviate (but not always eliminate) those long-term symptoms.

Why are vaccines somewhat less effective against infection than before?

The CDC researchers were careful to say the reasons for diminishing vaccine effectiveness are uncertain, but there are some things they’re confident about. The delta variant causes a substantially higher viral load than its predecessors — there is more of the virus when a person gets infected — and the sheer amount of the virus a person contracts may play a role. At the same time, people are now taking fewer precautions against Covid-19 than they were last fall and winter, the researchers said, making it more likely that they’ll be exposed to a high viral load.

Or to think of it another way: The rapid spread of delta in the unvaccinated population means vaccinated people are getting exposed to the virus more often, and being exposed to more of it than they were before.

Data from other countries shows a wide range of vaccine effectiveness against infection with the delta variant, but studies have generally found the protection is less robust than it was against the alpha variant. Still, in all cases, the vaccines available in the US continue to impress in their ability to prevent the worst outcomes.

A second CDC study examined national data to determine whether the vaccines are becoming less effective at stopping severe illness over time. Like the New York study, it found the vaccines to be extremely effective: about 90 percent effective at preventing hospitalization due to Covid-19.

Reassuringly, there was not a meaningful difference in the vaccine’s ability to stop hospitalization as time wore on. The researchers estimated the vaccine’s effectiveness against hospitalization in two time periods: two to 12 weeks after patients received a second dose, and 13 to 24 weeks after that second dose.

They did not find a meaningful decline almost six months after patients received a second dose of the vaccine — which is very good news.
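For intuition, here is a simplified sketch of how estimates like these can be produced. One common approach, the test-negative design, compares the odds of vaccination among hospitalized patients who test positive for the virus against the odds among similar patients who test negative. The numbers below are hypothetical, and the CDC’s actual analyses adjust for age, location, calendar time, and other factors.

```python
# Hypothetical test-negative calculation (illustrative numbers only).
# Among hospitalized patients tested for SARS-CoV-2, compare the odds of
# vaccination in those who test positive (cases) vs. negative (controls).
vaccinated_cases, unvaccinated_cases = 15, 225        # test-positive patients
vaccinated_controls, unvaccinated_controls = 150, 250 # test-negative patients

odds_cases = vaccinated_cases / unvaccinated_cases           # ~0.07
odds_controls = vaccinated_controls / unvaccinated_controls  # 0.60
odds_ratio = odds_cases / odds_controls                      # ~0.11

ve = 1 - odds_ratio  # vaccine effectiveness
print(f"Estimated VE against hospitalization: {ve:.0%}")  # ~89%
```

Running the same calculation separately for each post-vaccination window (two to 12 weeks, then 13 to 24 weeks) is what lets researchers check whether protection wanes over time.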

“There was no reduction in vaccine effectiveness over time,” Gounder said, “which demonstrates that protection against hospitalization did not drop over time or after the emergence of the delta variant.”

The best protection for the most vulnerable is everybody getting vaccinated

While the vaccines have generally held up well against the delta variant, some of the people most vulnerable to Covid-19 do not receive the same level of protection.

For people with compromised immune systems, the CDC researchers found that the vaccines were less effective at preventing hospitalization. That finding supported the Biden administration’s plan to make those people immediately eligible for a third shot, experts said.

For immunocompromised individuals, it’s also possible that immunity wanes more quickly over time — but that didn’t appear to happen in this research. Their level of protection remained constant during the six-month period covered by the study.

The third CDC study evaluated vaccine effectiveness for nursing home residents, a population particularly vulnerable to Covid-19 and one of the first groups to get vaccinated at the beginning of this year. That study did find declining effectiveness over time against any illness for those Americans, from 75 percent pre-delta to about 50 percent post-delta.

That drop-off may partly reflect the nature of this population. Older people’s immune systems are not as strong as younger people’s; older adults already experienced a lower baseline vaccine effectiveness rate than the general population before delta took over (75 percent versus 90 percent). The decline also probably reflects the basic fact that the delta variant is better at evading the vaccines than the alpha variant was.

“It makes sense to give an extra dose of vaccine to vaccinated nursing home residents,” Gounder said. “But what will have an even bigger impact on protecting those nursing home residents is to vaccinate their caregivers.”

As of late July, about 60 percent of nursing home workers had been vaccinated, substantially lower than the 80 percent rate among residents. The Biden administration announced on Wednesday that it would require nursing homes to mandate all their workers be vaccinated if the facilities want to receive federal health care funding.

Over the long run, so long as a substantial portion of the US population remains unvaccinated, there will be risks to everybody. Currently, 72 percent of the 18-and-over population and 60 percent of the entire US population are vaccinated, according to the New York Times tracker. That leaves millions of people without protection against the virus. Some of them are children not yet eligible for the vaccine, but millions of people who are currently eligible and could receive the vaccine for free still haven’t done so.

“Your risk depends on your vaccination status and what’s happening in your community,” Gounder told me. “Vaccines aren’t an immunity on/off switch for individuals. Vaccines work additively and synergistically across populations.”

Gounder deployed some hypothetical math to explain how risk works under different vaccination scenarios. If a country has a baseline of 1 million “units” of risk for each person, because the virus is very widespread, a 95 percent effective vaccine would reduce that risk to 50,000.

But if the baseline risk is 100, because the country contains the virus through vaccinations and mitigation measures, the vaccinated person faces just 5 units of risk. That huge difference in risk depends on how rampantly the virus is spreading in the overall population.
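Gounder’s example boils down to a single multiplication: the same vaccine applied to very different baselines. A minimal sketch of that arithmetic, using her hypothetical risk “units”:

```python
# Residual risk = baseline risk x (1 - vaccine effectiveness).
# The "units" are hypothetical, as in Gounder's example, not real probabilities.
def residual_risk(baseline_units: float, effectiveness: float) -> float:
    return baseline_units * (1 - effectiveness)

print(residual_risk(1_000_000, 0.95))  # rampant spread: ~50,000 units
print(residual_risk(100, 0.95))        # contained spread: ~5 units
```

The vaccinated person’s relative protection is identical in both scenarios; what changes is how much virus is circulating around them.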

“This isn’t about individualism, individual rights, individual responsibility, and individual protection,” Gounder said. “This is about community immunity.”

The lab leak hypothesis — true or not — should teach us a lesson

The origins of the novel coronavirus that caused the Covid-19 pandemic remain a mystery. US intelligence agencies have now completed a 90-day probe into the origins of SARS-CoV-2, but their classified findings, according to the New York Times, were inconclusive as to whether the virus escaped from a laboratory in Wuhan, China, or made a natural jump from an animal into a human.

Yet to prevent the next pandemic, scientists don’t need a definitive answer about the genesis of Covid-19. Regardless of how the coronavirus outbreak started, researchers say the world urgently needs to do more to prevent both lab leaks and so-called “spillover” infections from animals. Tracing the route of the virus is an important scientific question, but countries can and should take steps to reduce these risks now, even without a final answer.

“We don’t have to wait for all these results to start acting,” said Andrew Weber, senior fellow at the Council on Strategic Risks and a former assistant secretary of defense for nuclear, chemical, and biological defense programs under President Barack Obama. “There’s some big policy decisions that we can make now.”

Right now, the Covid-19 pandemic is continuing to rage around the world; dozens of countries are fighting the spread of the highly contagious delta variant, and many are struggling to get enough vaccines to protect themselves. Figuring out just where the virus came from would do little to mitigate the current crisis.

It may be years before the world gets a satisfying answer, and one may never emerge. But in the meantime, there are many measures that can reduce the risk of future outbreaks, from regulating wildlife markets to increasing transparency around biological research.

Both spillovers and laboratory leaks of pathogens have happened before

There were warnings that humanity was at risk of a pandemic even before Covid-19, and at several points the world has come scarily close.

Several dangerous pathogens have escaped laboratories in the past and gone on to infect people. In 1977, an outbreak of H1N1 influenza erupted on the border between China and the Soviet Union. Based on a genetic analysis of the strain, many researchers concluded the virus escaped from a lab. Smallpox, meanwhile, was eradicated from the wild in 1977; the following year, Janet Parker, a photographer at Birmingham Medical School in the UK, became infected and later died. The building she worked in housed a research laboratory where scientists were studying smallpox.

In 2004, at least two unnamed researchers contracted the SARS virus at the Chinese National Institute of Virology in Beijing. One of the researchers went on to infect her mother, who later died, as well as a nurse at the hospital where her mother was treated. The outbreak led to 1,000 people being quarantined or placed under medical supervision. At the time, the World Health Organization (WHO) reported there may have been similar outbreaks of SARS in Taiwan and Singapore that also may have originated in labs.

There have been even more alarming close calls where scientists were exposed to dangerous pathogens due to equipment failure or lax adherence to containment protocols.

However, every known lab leak to date has involved a pathogen that was previously identified. There has never been a confirmed case of a never-before-seen virus escaping a research facility.

At the other end of the spectrum of possibilities, virologists point out that the vast majority of pathogens that infect humans originate in nature, and almost all come from interactions with animals. Over the past century, about two new viruses per year have been discovered in humans, most of which spilled over from animals, according to a 2012 paper in Philosophical Transactions of the Royal Society B.

“All these spillovers, wherever they are, it’s because human activity is encroaching upon animal activity,” Vincent Racaniello, a virologist at Columbia University, told Vox in June.

Influenza, for example, is found in birds, poultry, pigs, and seals. HIV likely originated in chimpanzees. Measles has an ancestor in cattle.

In fact, another coronavirus, the 2003 SARS virus, was found to have jumped from bats to civet cats before making the leap to humans. And scientists have warned for years that another bat coronavirus could trigger an outbreak in people.

The debate over Covid-19’s origins may never be resolved to everyone’s satisfaction

Many researchers in the early days of the Covid-19 pandemic were quick to attribute the emergence of the SARS-CoV-2 virus to a natural occurrence, likely a human contact with a bat coronavirus via an intermediary. Two recent papers, one in the journal Science and one in the journal Cell, emphasized this conclusion, citing genetic evidence tracing both the lineage of the virus and the circumstances in China in late 2019 that increased the likelihood of humans coming into contact with wild animals that could harbor the pathogen.

Still, some scientists have argued for closer scrutiny of the possibility that the novel coronavirus escaped from a lab. The most basic hypothesis is that a worker at the Wuhan Institute of Virology was infected by a naturally occurring virus under study there. There is no direct evidence for this, nor any indication that SARS-CoV-2 or a progenitor was being studied at the lab, let alone that someone who worked there was infected. Wuhan is about 1,000 miles away from where the bats that harbor a similar virus are found, raising the question of how SARS-CoV-2 crossed that distance. The laboratory was also known to handle its viral samples at a lower safety level than most scientists recommend for such a pathogen. In addition, no one has found an animal that clearly hosted the virus or a precursor just before it leaped into humans.

More fringe ideas are also circulating, like the notion that the virus was deliberately engineered to be more dangerous or deliberately released as a bioweapon. There’s no evidence for these claims.

With the passage of time, however, it’s becoming more difficult for scientists to study the origins of Covid-19. In a recent article in the journal Nature, WHO researchers warned that time is running out: “The window of opportunity for conducting this crucial inquiry is closing fast: any delay will render some of the studies biologically impossible.”

Another complication is that the question of where the virus came from has become a political issue domestically and internationally. Within the US, some politicians have been eager to blame a lab for the pandemic and shift responsibility to China.

That, in turn, has become a major source of friction between the United States and China. Chinese officials stopped cooperating with a WHO investigation into the origins of the coronavirus earlier this year and responded with their own allegations that the virus originated in a US lab (for which there is no evidence).

And without cooperation from Chinese authorities, it’s unlikely that a definitive answer — one way or the other — will emerge anytime soon.

Scientists already know how to stop lab leaks

There are many ways to improve safety in laboratories and install safeguards to prevent dangerous diseases from escaping and wreaking havoc. Even scientists who think the emergence of SARS-CoV-2 was a natural event say that preventing lab leaks should be a high priority.

Gigi Gronvall, senior scholar at the Johns Hopkins Center for Health Security, said it’s important to think about two key concepts: “Biosafety” is about protecting people who work with pathogens from the things they are studying, usually through accidents. “Biosecurity” is preventing misuse of pathogens via deliberate actions.

Both are essential to prevent leaks of biological agents, but they’re often afterthoughts when it comes to conducting research on pathogens. “It’s hard to make this exciting,” Gronvall said. “That is why very often the money goes to research, and biosafety is less emphasized.”

One way to enhance biosafety is to deploy several different methods of containment in a laboratory. For instance, biosafety level 3 precautions for handling pathogens that can spread through the air include only handling samples in “biological safety cabinets” that filter air, controlling lab access with two sets of self-locking doors, and wearing respirators, eye protection, and lab coats. It also includes routine medical screening of lab workers.

“Ideally, if there is any kind of accident, there are still multiple layers before it becomes an issue for anyone outside a laboratory,” Gronvall said.

Biosafety also hinges on the type of studies being conducted. Of particular concern is gain-of-function research, in which pathogens are deliberately modified to become more dangerous to humans. The goal is to map out potential changes that could emerge in the wild and develop ways to counter them before they become major threats. It’s a controversial form of research, and some have alleged that the Wuhan Institute of Virology was conducting such experiments — though, again, there is no evidence this occurred. US officials have also been adamant that they have not funded any gain-of-function research, either in the US or abroad.

Some scientists say this kind of research should not be conducted at all because the risk of an escape is too great, but others say gain-of-function studies can be conducted safely with appropriate precautions.

“Lab incidents will still occur. A robust biosafety and biosecurity system, along with appropriate institutional response, helps to ensure that these incidents are inconsequential,” biosafety experts David Gillum (Arizona State University) and Rebecca Moritz (Colorado State University) wrote for The Conversation. “The challenge is to make sure that any research conducted — gain-of-function or otherwise — doesn’t pose unreasonable risks to researchers, the public and the environment.”

Building a robust biosecurity system requires cooperation from institutions ranging from laboratories to regulators to governments. It demands rigorous oversight to enforce safety standards. Transparency among institutions about the kinds of biological research they’re conducting, as well as potential mishaps and accidents, is also critical. But there’s a pervasive fear among laboratories and the individuals who work there that disclosing problems will hurt their reputations.

“If there is a biosafety incident, it is very hard to get that addressed in a way that doesn’t cause problems for an institution,” Gronvall said. “They have a lot of incentives to keep it under wraps.”

That’s also why it’s been hard to know whether China is stonewalling investigators because of a potential lapse in laboratory protocol or out of general distrust of other countries and institutions like the WHO. “A bat could’ve walked out of a cave wearing a name tag and China would behave the same way,” Gronvall said.

Fixing this requires a change in the culture surrounding biological research to create an environment where mistakes and problems are discussed openly and addressed immediately. Inspections, monitoring, and documentation would also make it harder to sweep any problems under the rug.

What’s tougher is enforcing these principles around the world. There is an international agreement restricting research on biological weapons, the Biological Weapons Convention, but there’s no similar agreement for general biological research. Many countries have their own research programs for pathogens, and it’s the Wild West when it comes to what standards are used and what studies are done.

“There’s really no international body that has the authority to oversee biological security or biological safety,” said Weber, from the Council on Strategic Risks. “We should work with [the] international community to adopt real standards for biosecurity and for high-risk research.”

Some pathogen laboratories, like the US Army’s Fort Detrick in Maryland, already have exchange programs with researchers around the world. More exchanges and international inspections of biological research labs could help ensure that every facility adopts best practices and upholds the highest safety and security standards. But setting up such a regime requires trust and cooperation, and that’s in short supply.

“Natural” spillovers, which often have human causes, can be prevented, too

Pathogens found in the wild have infected people for millennia, but there are ways to tame this force of nature. “What we can do is reduce the rate of exposure of humans,” said Andrew Dobson, professor of ecology and evolutionary biology at Princeton University.

For example, a key route for new human diseases is contact with wildlife. Such interactions increase as cities sprawl into the wilderness and people venture further into remote areas in search of food, fuel, and raw materials. When humans destroy habitats, especially through deforestation, they force animals to flee to new areas and interact with people in cities and suburbs. In one paper published in the journal Science in 2020, Dobson reported that when more than 25 percent of original forest cover is lost, it’s much more likely for humans and their livestock to come into contact with wildlife that may carry diseases.

“That’s what’s exposing people to the hosts and the carriers of these viruses,” said Dobson. Drastically reducing deforestation and placing strict limits on how much people can encroach into forests, grasslands, and deserts can slow the emergence of dangerous new parasites, viruses, bacteria, and fungi.

Domesticated animals, particularly livestock, can also be a source of new diseases. The combination of changes in land use and factory farming of cattle, chicken, and hogs can increase the risk of a pathogen hopping between species.

Another way to reduce the chances of a spillover is to close unregulated markets that sell wildlife as food, ingredients for medicines, or materials for clothing. Phasing out legal wildlife markets and screening the health of animals that humans do come into contact with would also reduce the risks of new diseases jumping into people. “There needs to be [many] more international treaties around that to protect people,” said Dobson.

The hurdle is that the wildlife trade is quite lucrative. The legal wildlife market is worth about $300 billion, while estimates of the value of illegal wildlife trading can be as high as $23 billion. So reducing some of the highest-risk forms of wildlife trade also demands an economic solution for people whose livelihoods would be affected, such as helping them find new jobs.

Surveillance — actively looking for dangerous pathogens in the wild to stay ahead of outbreaks — is another important tactic, but there are some risks. Sending researchers into remote areas to collect samples and study them could expose them to dangerous diseases.

“We need to think more about the precautions people take and the security level in those labs,” Dobson said. “[But] the amount we’ll learn will significantly reduce the risk of future outbreaks.”

The history of narrowly averted pandemics is full of crucial lessons

Beyond probing the roots of the Covid-19 pandemic, it’s also worth investigating close calls of the past. The original SARS virus in 2003, for example, had the potential to go global but didn’t. The virus itself had some traits that prevented it from spreading further, and while many of the countries most acutely affected by the outbreak, like China and Vietnam, learned critical public health lessons, much of the rest of the world remained complacent.

In a 2013 paper in the Journal of Management, researchers looking at problems with uncrewed NASA missions highlighted the important lessons that can be found in near-miss scenarios like this. “Disasters are rarely generated by large causes,” the authors noted. “Instead disasters are produced by combinations of small failures and errors across the entire organizational system.” So whether SARS-CoV-2 came from a lab or a natural event, a lot of other things also had to go wrong for it to become an international crisis.

When these factors fail to align in a particular instance and no disaster results, it’s easy to draw the wrong lesson — that humanity was adequately prepared, or that the status quo of science and public health is good enough. But it’s dangerous to ignore close calls, whether it’s a near collision between satellites or a new pathogen that was narrowly contained.

“If near-misses masquerade as successes, then organizations and their members will only learn to continue taking the risks that produced the near-miss outcome until a tragedy occurs,” the researchers wrote.

That’s why it’s important to study the factors behind not just the Covid-19 pandemic but also other outbreaks like those of Ebola, SARS, and MERS, which raised alarms and revealed weaknesses in national and global public health systems.

We may not unravel the origins of SARS-CoV-2 anytime soon — if ever. But by treating both spillovers and lab leaks as urgent risks right now, scientists, health officials, and governments can protect us all from outbreaks of the future.

What’s causing California’s unprecedented wildfires

Another explosive wildfire season is underway in California, with more than a million acres already burned in 2021. While still short of the unprecedented 2020 fire season, the blazes this year are well above average and are still gaining ground.

The Caldor Fire burning near Lake Tahoe has forced thousands of people to evacuate as it has spread to more than 207,000 acres, an area larger than New York City. The blaze ignited August 14 and was 23 percent contained as of Thursday.

The Dixie Fire near Chico, California, that ignited on July 14 has scorched more than 847,000 acres. It was 52 percent contained as of Thursday.

Spread by winds reaching 40 mph and fueled by abundant dry vegetation, wildfires across the Golden State have whipped up enormous clouds of smoke and ash. Some have even spawned fire tornadoes.

Across the United States as a whole, more than 2.7 million acres have been charred in wildfires this year, according to the National Interagency Fire Center.

Researchers said that the current fires align with what they forecast earlier this year, noting that the region was parched by a massive drought, was facing severe heat, and had plenty of trees, brush, and grass ready to burn.

Even so, the blazes have proved surprising in other ways. The Caldor and Dixie fires are the first wildfires on record to cross the Sierra Nevada mountain range, posing new challenges for firefighters working to contain them. “These two big fires started in very steep canyons [that are] difficult to access, with very dry, overloaded forests that are burning intensely and so it’s very hard to get a handle on this,” said Craig Clements, director of the Wildfire Interdisciplinary Research Center at San Jose State University.

And as humans continue to drive up wildfire risks — from building in fire-prone regions to suppressing natural fires to changing the climate — scientists are having to rethink what’s possible.

Even by California standards, the current wildfires are surprising

There are several key ingredients needed for wildfires. They need favorable weather, namely dry and windy conditions. They need fuel. And they need an ignition source.

The California Department of Forestry and Fire Protection said it is still investigating the origins of most of the blazes underway. But other factors this year stacked the deck in favor of massive conflagrations.

California and much of the western US are in the midst of a years-long drought. With limited moisture, plants dry out and turn into kindling. Ordinarily, vegetation at higher altitudes would still hang onto some moisture and act as a barrier to wildfires in places like the Sierra Nevada. However, the severity of the drought has caused even this greenery to turn yellow and gray.

“That’s the unique thing, that these fires have burned over the Sierra Nevada crest,” said Clements. “The fuel moistures are still at record lows across the state of California. That’s allowing these fires to burn at higher altitudes.”

Moisture in the soil and in vegetation can also act as a cooling mechanism as it evaporates. With the drought, this effect is diminished, allowing even more heat to accumulate and pushing temperatures up to the new record highs achieved this summer. The hot weather in turn drove even more drying across the landscape, reaching aridity levels not typically seen until October.

It’s likely that even more fires are in store for the rest of the year across the West. As the autumn Santa Ana and Diablo winds pick up, the risks remain high.

People make wildfires worse, but can take steps to mitigate them

Wildfires are a natural part of ecosystems across much of the western United States. They serve vital functions like clearing decaying vegetation, regulating forest density, restoring nutrients to soils, and helping plants germinate.

But human activity has made wildfires worse at every step. Climate change caused by burning fossil fuels is increasing the aridity of western forests and increasing the frequency and severity of extreme heat events.

People are also building closer to wildland areas. That means that when fires do occur, they cause more damage to homes and businesses. That proximity also means that humans are more likely to spark new infernos: up to 84 percent of wildfires are ignited by people, whether through errant sparks, downed power lines, or arson.

And for hundreds of years, people have suppressed naturally occurring fires. European settlers also halted cultural burning practices from the Indigenous people of the region. Stopping these smaller fires has allowed forests, grasslands, and chaparral to grow much denser than they would otherwise. Paradoxically, that means more fuel is available to burn when fires do occur, causing blazes to spread farther and faster.

The combination of these factors leads wildfires to keep breaking records, forcing scientists to reevaluate what kinds of fires are possible. Decades ago, a 20,000-acre blaze would have been considered massive. Now, wildfires can gain that much ground overnight. “What we’re finding is that even forecasting the fires is so difficult for us because the domain size is huge,” Clements said. “These fires are massive and we keep having to expand the domain of our weather models.”

There are well-established ways to reduce the risks of destructive wildfires. Directly attacking a fire once it’s ignited can only go so far, so much of the focus has to be on prevention. One key way is to reduce the fuel load, through forest thinning and prescribed burns. Towns and cities in fire-prone regions can build defensible perimeters, cutting fire breaks and clearings to reduce the chances of a fire encroaching.

Building codes that demand fire-resistant materials and avoiding construction in the highest-risk areas can reduce the impact of fires as well. Ignitions can be reduced by hardening or burying power lines.

Restoring Indigenous burning practices can also help mitigate wildfire risk — but that will require governments to address historical wrongs, restoring tribes’ access to and sovereignty over ancestral lands.

Over the long term, slowing climate change by drastically reducing greenhouse gas emissions can also stop fires from becoming even more destructive.

The factors that laid the foundation for massive wildfires took centuries to build and won’t be reversed overnight. But the process of reducing these risks can begin now.

What an enormous global study can tell us about feeling better during the pandemic

During the pandemic, I’ve spent a lot of time alone. I live by myself. I work from home. At times, I experienced fits of fidgetiness and restlessness, contributing to feelings of burnout.

Here’s what helped: reappraising the situation.

What I was feeling was isolation, and the loneliness that comes with it. Instead of letting it gnaw at me, I tried to remember: Loneliness is normal, sometimes even useful. I remembered that sadness existed in part to remind me of something I really value, the company of other people. I knew that when the opportunity arose, I’d immerse myself in that company again. And when that time came, I’d embrace it; the feeling was a reminder that I was still capable of the joy I had been lacking. And as a consolation, that felt good.

Cognitive reappraisal — sometimes called cognitive reframing — is most commonly encountered in therapy, where it’s used to regulate emotions. It’s a component of cognitive behavioral therapy, a whole suite of strategies that can encourage positive patterns of thinking and behavior.

Reappraisals are useful. But they’re not something people learn exclusively in the context of clinical care. It’s arguably a skill we all can benefit from. And by “we all,” I mean just about everyone, all across the globe.

Recently, hundreds of researchers in 87 countries published the results of the largest cognitive reappraisal study to date in Nature Human Behaviour. They were asking a simple question: Could they make people feel better about the pandemic, if only for one moment in time, by teaching reappraisals? The study, which amassed data on more than 20,000 participants, came back with a resounding answer: yes.

The new study validates the concept of reappraisal. But it also suggests that it could potentially be feasible to deploy as a large-scale global health intervention.

It’s a simple skill, but it could help many people foster resilience in a chaotic world.

Cognitive reappraisal, explained

The peer-reviewed paper in Nature Human Behaviour is the most recent project from the Psychological Science Accelerator, a group of hundreds of researchers who combine their resources to pull off psychological studies with massive participant pools and an unusually rigorous methodology.

Near the start of the pandemic, the group put out a call for project proposals to test psychological interventions that could, simply, help people feel better.

“The reason why we choose cognitive reappraisal is because it has been the most widely studied and well-understood strategy,” Ke Wang, the Harvard Kennedy School doctoral student who first proposed this massive project, explains. It’s also a strategy that people don’t always use spontaneously on their own: It helps to be taught.

(The group has two other papers testing different psychological interventions, on how public health messaging in the pandemic can influence behavior. Of note: They’re testing whether “loss aversion,” an influential idea that suggests people respond more strongly when they think they have something to lose, encourages people to protect their health during a pandemic.)

Cognitive reappraisal works because “there’s a link between our thoughts and our feelings,” Kateri McRae, a University of Denver psychologist who studies emotion and who was not involved in this study, says. “A lot of times, our feelings are preceded by certain thoughts.” So when we shift our thoughts, that can precipitate a change in our emotions.

It can be a strategy to cope with a bout of anxiety or depression, or it can just be used to foster mental health resilience. “Individuals who report greater amounts of well-being and daily positive emotion report using reappraisal more frequently than people who report daily negative emotion,” McRae says — though she adds that “there is a little bit of a chicken-and-egg thing here.” What comes first: Do positive people reappraise, or do reappraisers become positive people? “But I certainly think that most people consider it to be something that might serve as a buffer.”

Once you get the hang of the technique, it’s easy to apply reappraisal thinking to many different situations. For instance, sometimes when I felt the excruciating boredom of the pandemic winter lockdown, I tried to reappraise the feeling of boredom as peacefulness, the absence of a bad thing. “I’m lucky to be bored,” I’d think. It would make the bitter pill go down more easily.

Emotion regulation passes a massive worldwide test

Wang and the hundreds of other authors wanted to see if they could teach thousands of people around the world similar coping strategies, to help deal with the stress of the pandemic.

They conducted a preregistered study — meaning a study where the methods and analysis plans are locked into place before data collection begins, to help ensure rigor — and tested two subtly different reappraisal methods, targeting negative emotions associated with living through the pandemic.

The first method is called “refocusing.” It might be better described as “looking on the bright side.”

Let’s say you’re feeling sad, staying home during a lockdown. You can refocus your thoughts to some of the more positive aspects of staying at home. Like: “Staying at home is not that bad,” as Wang explains. “You may find more time to spend with your family, or do things you may not have had time to do, such as cooking.”

Another is called “reconstruing.” This goes a little beyond just looking at the bright side of any particular burden, trying to find an overall less-negative narrative to tell ourselves about the pandemic. It’s less about finding the positive in our individual circumstances and more about looking at the big picture in a new light.

In reconstruing the burdens of the pandemic, for example, you could think: “In the past, people have overcome many challenges that seemed overwhelming at the time, and we will overcome Covid-19 related challenges too,” as the study text suggested to participants.

This isn’t about becoming a blinkered robot that’s only allowed to think positive thoughts. “In our intervention, we’re not forcing them to feel positive all the time,” Wang explains. “We’re teaching them to use it to regulate emotions.” It’s about intervening when thoughts become distressing.

It’s not about never acknowledging negative thoughts, either. “I think there’s a really delicate balance between acknowledging the reality, allowing people to sometimes sit with negativity, but also realizing that positive interpretations of things are possible,” McRae says.

In the study, participants were assigned to read about refocusing, reconstruing, or one of two control conditions. Participants took a survey before they learned the technique to assess their baseline emotional state. Afterward, they were measured again and asked to assess their feelings about the pandemic overall, and how they were responding to it.
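To see what that pre/post, treatment-versus-control design buys you, here is a toy version of the comparison it enables: a difference-in-differences on emotion scores. The numbers are fabricated purely for illustration; the actual paper uses far more sophisticated models.

```python
# Toy difference-in-differences on hypothetical negative-emotion scores
# (1-9 scale). All numbers are fabricated for illustration only.
reappraisal_pre, reappraisal_post = 5.8, 4.6  # group taught a reappraisal technique
control_pre, control_post = 5.8, 5.6          # control group

# Change in the treated group minus change in the control group
effect = (reappraisal_post - reappraisal_pre) - (control_post - control_pre)
print(f"Estimated intervention effect: {effect:+.1f} points")  # -1.0 points
```

Comparing changes rather than raw post-test scores helps separate the intervention’s effect from whatever was happening to everyone’s mood at the time.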

Notably, both techniques fared equally well in decreasing people’s negative emotions, and the effects, the authors report, aren’t just statistically significant — they seemed to make a big practical difference for people.

The difference in feelings between those who learned reappraisals, compared to those who did not, was as big as the difference between people who had faced extreme hardships due to the pandemic, compared to those who had not. That’s a notable improvement. (Of course, the interventions are not “guaranteed” to work for any particular individual. The study reported changes on average.)

Also, the interventions didn’t seem to decrease willingness to engage in Covid-safe behaviors like masking. “Some people may worry that if you improve emotions, people may be less cautionary,” Wang says. “But we don’t find that in our study.”

Notably, too, the interventions — which were translated by a team of hundreds of people into 44 languages — broadly worked in every country tested, though there was some variability. The interventions were most effective in Brazil, Germany, and Hungary, and they were least effective in Russia, Romania, and Egypt. “So far, we haven’t found anything that can systematically explain what country can benefit more or less,” Wang says. (The researchers didn’t have representative samples in all the countries studied, so there could be a lot of reasons why they found the variation.)

A psychological finding you can trust

The narrower conclusion of this study, that cognitive reappraisal works, is not super surprising. “The finding that reappraisal decreases negative emotion and increases positive emotion is something that has been replicated over and over and over ad nauseam,” McRae says. “I couldn’t just get that finding published if I really wanted to, because it’s been so well-established.”

But there were aspects of the study that are new and significant. “I think this scale, scope, and timeliness to speak to the crisis we’re in right now were the most impressive parts about it,” she says.

There’s a burgeoning research movement in psychology dedicated to testing out single-session interventions, delivered either online or remotely. Mental health care is often inaccessible and expensive, so the more psychological interventions that can be unbundled from a whole suite of intensive therapy, the more good they can potentially do around the world. Many people whose distress doesn’t rise to the level of a mental health diagnosis could still benefit, the study suggests.

That said, there’s still more work to do here. Other researchers not involved in the project wish it had studied these participants over time, to see if the intervention had a lasting effect.

“A study this large would have provided a particularly informative test of whether a single-session universal intervention could exert lasting, more generalized effects,” says Jessica Schleider, a Stony Brook University psychologist who specializes in studying single-session psychological interventions. “I do think it’s scientifically valuable to know that reappraisal can provide in-the-moment support this broadly, and it can be recommended as one coping option to try for folks in distress.”

The authors of the paper acknowledge this limitation, and some others. The study had people view photos reminding them of Covid-19 stresses, which “might not represent local situations for different groups of participants,” the authors report. It also doesn’t represent all the myriad emotional triggers we encounter living during a pandemic. But most of all, they see this work as foundational for other questions.

The Psychological Science Accelerator, the group behind the massive undertaking of the paper, was launched in response to psychology’s “replication crisis.” Over the past decade, many famous psychological theories have collapsed under rigorous re-testing. As many as 50 percent of all psychology papers might not be replicable, though no one knows the true extent of the rot in the foundations of psychology. There have also been some high-profile cases of outright data fraud related to some of psychology’s most popular findings. The Accelerator, which operates on a shoestring budget (it reports that this study of tens of thousands of people cost only $17,000, much of which came from individual lab members), is seeking to rebuild the field on a firmer foundation.

It’s a “credibility crisis,” Patrick Forscher, a psychologist and member of the Accelerator who worked on the reappraisal paper, says. “Because there are more issues rather than just replicability. So my personal view is that you can look at a lot of psychological findings and just put a question mark on them — not that they’re definitely false. We know that some of the practices that were used to produce a lot of those findings are, themselves, not all that credible.”

The latest test of cognitive reappraisals puts the science of mental health interventions on a firmer foundation. Psychology encompasses a lot of flimsy ideas that claim to make your life better. Here’s one that seems to actually work.

“Back to normal” puts us back on the path to climate catastrophe

The Covid-19 pandemic upended daily life so drastically that there was a moment when it seemed to be making a dent in the climate crisis. Rush-hour traffic disappeared, global travel slowed to a crawl, and the resulting economic tailspin sent energy-related pollution plummeting almost 6 percent globally. This kind of decline in pollution is unprecedented in modern human history — it’s as though the emissions output of the entire European Union had suddenly disappeared. It led many to wonder if the Covid-19 crisis would at least give us a little extra time to avert a climate emergency.

More than a year after Covid-19 abruptly changed everyone’s routines, the United States is itching to return to “normal,” and some parts of the economy are approaching business as usual. But for the climate, “back to normal” means pollution is rebounding and, worryingly, climate change is accelerating.

“We ultimately need cuts that are much larger and sustained longer than the Covid-related shutdowns of 2020,” said Ralph Keeling, a geochemist who measures carbon pollution at Mauna Loa.

As the Covid-19 pandemic continues to rage globally but starts to abate in the US, here are four ways to understand the new “normal” of the climate crisis.

1. Climate change is accelerating despite the pandemic

While emissions dropped last year, carbon dioxide and methane concentrations in the atmosphere just reached their highest known levels in millions of years. Think of it as filling a plugged bathtub with water: Even if you turn down the faucet for a little while, the water will keep rising.

The atmospheric concentration of carbon dioxide rose to 419 parts per million in June 2021, based on National Oceanic and Atmospheric Administration measurements that have been taken in Hawaii since 1958. This is a level probably not seen since around 4 million years ago — when sea levels were 78 feet higher than they are today.

A chart from NOAA shows that CO2 concentrations from human activities are not only increasing, but going up at a faster rate as time goes on. (The red line shows seasonal fluctuations in CO2.)

Methane concentrations also reached a new peak, seeing the largest annual increase recorded since those measurements began in 1983.

As NOAA explained in its recent press release, “There was no discernible signal in the data from the global economic disruption caused by the coronavirus pandemic.” What’s more, Pieter Tans, a climate scientist with NOAA, told me, “If we managed to keep emissions constant, that’s not enough. Then CO2 would continue to go up at the same rate that we’ve seen in the last decade. Emissions really have to go to zero to stop this problem.”
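Tans’s bathtub point can be made concrete with a toy stock-and-flow model. This is a deliberately crude sketch, not a climate model: it starts from the roughly 419 ppm measured in June 2021, assumes a recent growth rate of about 2.4 ppm per year, and ignores ocean and land carbon sinks.

```python
# Toy "bathtub" model: the concentration (the stock) keeps rising as long as
# net emissions (the flow) are positive. Inputs are rough, illustrative values.
def co2_after(years: int, start_ppm: float = 419.0,
              annual_rise_ppm: float = 2.4, emissions_scale: float = 1.0) -> float:
    ppm = start_ppm
    for _ in range(years):
        ppm += annual_rise_ppm * emissions_scale  # faucet still running
    return ppm

print(co2_after(10, emissions_scale=1.0))  # constant emissions: ~443 ppm
print(co2_after(10, emissions_scale=0.5))  # emissions halved:   ~431 ppm, still rising
print(co2_after(10, emissions_scale=0.0))  # net-zero emissions:  419 ppm, rise stops
```

Even deep cuts only slow the rise; the concentration stops climbing only when the net flow into the atmosphere hits zero, which is exactly Tans’s point.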

Looking at the longer-term trends, it’s clear that the pandemic did not slow the acceleration of climate change in the way that some hoped.

2. Fossil fuels still rule the economy

In 2020, renewable energy overtook coal consumption in the United States, and electric vehicle purchases soared 43 percent over their 2019 level. But fossil fuels still reign in transportation and the power sector, the world’s two biggest pollution sources.

During the pandemic, transportation took the biggest hit. Travel is still down globally, but in May, the TSA recorded the biggest day for US air travel since March 2020. The number of US air passengers reached 90 percent of 2019 levels, according to TSA data. Passenger car travel also plunged by about half during the pandemic, but some measurements compiled from GPS data show car traffic surging even past its 2019 levels.

The pandemic led to a temporary crash in emissions, as shown by the International Energy Agency’s tracker of 2020’s monthly emissions compared to 2019. By around the start of December, monthly emissions had surpassed their 2019 levels.

Of course, these are signs of a rebounding economy. But when fossil fuels still run the underlying fundamentals of the economy, we’re gambling dangerously with climate change.

Worse, it’s still possible for pollution to accelerate if the world chooses “business as usual” in its post-pandemic recovery. Last year, climate scientists from the University of East Anglia, Stanford, and other institutions pointed to the possibility that emissions could rebound to levels worse than before if politicians delay climate action for temporary economic gains. The former Trump administration justified environmental rollbacks in part by citing the pandemic’s impact, and the Wall Street Journal reports that China is limiting the rollout of its national carbon-trading program, set to launch later in June.

3. The global target of 1.5 degrees Celsius is almost out of reach

One of the key developments of the 2015 Paris climate agreement was a new target for containing climate change: restricting warming to 1.5°C, and “well below” the more disastrous 2°C.

In that effort, “normal” won’t cut it. The return to flying, driving, and commuting eats into a limited global carbon budget, the total amount of pollution the atmosphere can absorb before the 1.5°C threshold is crossed. A United Nations agency, the World Meteorological Organization, updated its analysis in May and underscored that we’re essentially out of time. It found a fairly good chance — 44 percent — that the Earth will hit 1.5°C of warming in one of the next five years. That’s double the odds from just one year ago.

Last year was also one of the hottest on record, at 1.2°C above the pre-industrial average, and parts of the world are warming unevenly and have already surpassed 1.5°C. These variations don’t sound huge, but their real-world impacts can be catastrophic and concentrated in particular regions.

Parts of the Middle East hit 125 degrees Fahrenheit in June, a dangerous record even before summer settles in. The American Southwest could see similar temperatures this summer. Extreme heat causes more deaths than any other type of weather disaster and can cause power failures and infrastructure problems such as warping roads and railroad tracks.

Not only is the heat a public health threat, it also exacerbates deepening droughts that fuel the conditions for widespread wildfires.

In a warming world, these are not freakish events. Events like these, and worse ones than we’ve yet experienced, become the new normal.

4. Public opinion hasn’t changed either, which is surprisingly good news

A return to normal doesn’t have to mean climate change careens out of control. It’s a path governments choose if they continue to subsidize fossil fuels and fail to meet the challenge of investing in renewable infrastructure.

The pandemic hasn’t diminished people’s appetite for action on climate change, argues Anthony Leiserowitz, the director of the Yale Program on Climate Change Communication who has studied American opinions on climate change. “Public opinion about climate change hasn’t changed at all. It actually picked up a little bit,” Leiserowitz said. “I don’t see any evidence that people’s views have changed dramatically, either because of the pandemic or the economic crisis.”

Similarly, during the Great Recession, Leiserowitz studied the effect of unemployment, home values, and the economic downturn on the public’s views on climate change. He was surprised to find that the downturn did not dampen voters’ support for climate action. “A majority of Americans actually think that taking action to deal with climate change will grow the economy and increase the number of jobs,” he said.

Most Americans don’t think there has to be a zero-sum trade-off between climate change and economic growth. The Biden administration has capitalized on that view, making the case for “building back better” and trying to boost the economy with a climate-focused infrastructure package. But this can’t happen without large-scale political action. The US may savor a returning sense of normalcy — but the whole world needs to remember that normal was never good enough.

The new Alzheimer’s drug that could break Medicare

Medicare, the federal health insurance program that covers Americans over 65, is facing an impossible dilemma: Should it cover a new and expensive medication for Alzheimer’s disease, which afflicts 6 million Americans and for which there is no existing treatment, even though the drug might not actually work?

It is an enormous question. Alzheimer’s patients, and families whose members endure mild cognitive impairment that may progress to Alzheimer’s, have been waiting decades for an effective treatment. For them, even a few more months of life with improved cognition, one more birthday party or a grandchild’s graduation, is the priority.

But the evidence on whether Biogen’s treatment, called aducanumab, is effective is, at best, mixed; the FDA approved it this week over the objections of its own advisory committee. And with a preliminary announced price of nearly $60,000 per patient annually, covering the treatment could cost upward of $100 billion a year, most of it borne by Medicare, which would almost double the program’s drug spending. Patients themselves could be on the hook for thousands of dollars in out-of-pocket costs.
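The budget math behind that $100 billion figure is simple multiplication. A back-of-envelope sketch (the list price is Biogen’s announced figure; the treated-patient count is a hypothetical assumption, not an official projection):

```python
# Rough cost illustration. The price is Biogen's announced annual list price
# (the "nearly $60,000" above); the patient count is a hypothetical assumption.
annual_price_per_patient = 56_000   # dollars per patient per year
patients_treated = 2_000_000        # hypothetical subset of ~6M Alzheimer's patients
total = annual_price_per_patient * patients_treated
print(f"${total / 1e9:.0f} billion per year")  # ~$112 billion
```

Even far smaller uptake would still rank aducanumab among Medicare’s largest single drug expenditures.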

What Medicare does about aducanumab will have major ramifications not only for the millions of patients who could potentially be eligible for the drug, but for the future of US health care writ large.

The dilemma results from a feature of the American health care system: Unlike in other countries, the federal government has little room to negotiate what Medicare will pay for treatments.

Independent analysts think the drug is worth more like $8,000, but Medicare has no authority to negotiate a lower price. Instead, the federal program is likely obligated, in effect, to cover the new drug now that it has FDA approval. The tools it has to determine whether to cover aducanumab, and for whom, are fraught with legal and ethical risk.

The government now finds itself trying to figure out how to satisfy patients who desperately need help, even though scientists think this particular treatment lacks strong evidence for its effectiveness and policy experts warn it is setting up a budgetary nightmare for Medicare in the future.

“Every conversation we’re going to have for the next few years about health care access is going to be about this drug, whether implicitly or explicitly,” Rachel Sachs, a law professor at Washington University in St. Louis who studies drug pricing, told me this week.

The troubled path to aducanumab’s approval

Alzheimer’s is a terrible disease that robs people of their agency during the final years of their lives and robs families of the loved ones they once knew. The emotional and financial costs are severe. And as the number of Americans over 65 grows, those costs are only expected to increase.

In recent history, the decades-long search for an effective treatment or cure has been driven by what’s known as the amyloid hypothesis, which holds that plaque in the brain found in Alzheimer’s patients is at least in part responsible for the disease and removing that plaque could help relieve the symptoms.

Aducanumab, accordingly, targets the amyloid plaque. Clinical trials of the drug started in 2015 but were halted in March 2019 because the drug did not appear on track to meet the threshold for clinical effectiveness established at the start of the trials. It appeared, in other words, as though the drug didn’t work.

Normally, that would be the end of the story. But an unexpected twist came a few months later when Biogen revealed that, after additional data analysis with the FDA, some patients in one trial had actually seen “better but ultimately mixed results,” as the authors of a Health Affairs post on the controversy put it. Biogen announced it would push ahead with seeking FDA approval in October 2019, with the FDA’s apparent support.

Then, in November 2020, Biogen and aducanumab faced what looked like the ultimate setback: The FDA’s advisory committee on neurological therapies voted that the data did not demonstrate the drug was clinically effective. The vote was all but unanimous: zero in favor, 10 against, and one uncertain. The committee also raised concerns about potential side effects, such as brain swelling in patients who were given high doses.

But, in defiance of its own advisory committee’s recommendation, the FDA granted aducanumab its approval on Monday. The news was welcomed by Alzheimer’s patient groups but roundly criticized by experts in drug development.

“The FDA … has failed in its responsibility to protect patients and families from unproven treatments with known harms,” the Institute for Clinical and Economic Review (ICER), an independent non-government group that gauges the value of new drugs, said in a blistering statement.

And the agency not only approved the drug over the advice of its scientific advisers, but it also put effectively no restrictions on which patients with cognitive impairment should be given the drug, a decision that further stunned experts, as STAT reported.

“For the FDA to approve it and with a very broad indication, I was shocked,” Stacie Dusetzina, who studies drug costs at Vanderbilt University, told me. “I really expected them to say no, based on the body of evidence.”

Medicare almost always covers FDA-approved drugs

Now that aducanumab is approved by the FDA, the issue of coverage falls largely to Medicare; because of the age of the patient population most affected by Alzheimer’s, the federal program is likely to bear the brunt of the drug’s costs.

In practice, if the FDA approves a drug, Medicare will pay for it. Aducanumab would be covered through Medicare Part B, which covers outpatient care, because it is an infusion treatment administered directly by doctors. To be covered by Part B, medical care must be “reasonable and necessary” — a vague standard that has, for medications, historically been mostly synonymous with FDA approval.

Because the drug is covered by Part B, doctors will even have a financial incentive to prescribe it. For prescription drugs, the program pays physicians the average price plus 6 percent, a policy that both Presidents Obama and Trump proposed changing but nevertheless remains in place. Determining which patients would benefit from the drug requires expensive scans, and practices will be able to bill Medicare for those, too.

At the individual level, patients could face out-of-pocket costs anywhere from $0 for patients eligible for both Medicare and Medicaid, to $10,000 annually, since Medicare Part B can hold patients responsible for up to 20 percent of costs, advocates told me.
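To make those incentives concrete, here is a quick sketch of the Part B arithmetic at Biogen’s announced price. Note that actual Part B reimbursement is benchmarked to a drug’s average sales price, which can differ from the list price, so treat these as rough figures.

```python
# Part B payment arithmetic at aducanumab's announced list price.
# (Actual reimbursement is based on the average sales price, which can differ.)
annual_price = 56_000

prescriber_add_on = annual_price * 0.06    # Part B pays average price + 6%
patient_coinsurance = annual_price * 0.20  # patients can owe up to 20%

print(f"Add-on to the prescribing practice: ${prescriber_add_on:,.0f} per patient per year")
print(f"Maximum patient coinsurance:        ${patient_coinsurance:,.0f} per year")
```

That works out to roughly $3,360 per patient per year flowing to the prescribing practice, and up to about $11,200 in coinsurance, the same ballpark as the out-of-pocket figures advocates cited.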

When I asked Russ Paulsen, chief operating officer of UsAgainstAlzheimer’s, about Biogen’s list price, he responded with an audible sigh, saying: “It’s a big number.”

He continued: “We care a lot about making sure the people who are disproportionately affected by this disease, which includes poor people, have the ability to access this drug.”

Medicare’s inability to set the price it pays for aducanumab is a uniquely American problem. Countries like Australia and the United Kingdom have independent boards that evaluate a new drug’s effectiveness and set a price based on that estimated value. The US pharmaceutical industry says the American system is important for encouraging innovation, and companies have made amazing breakthroughs, such as the hepatitis C drugs that effectively cure that disease.

But as standards for approval have sometimes seemed to slip in recent years, the chances of the FDA approving very expensive drugs with only marginal benefits have risen.

“We don’t require prices to reflect the value of treatment, period,” Dusetzina said. “Companies can price their drugs as high as they want. Companies can also get drugs approved with little evidence.”

So Biogen is planning to charge $56,000 annually for aducanumab. ICER estimates, based on the clinical evidence, that the drug is worth more like $8,000 a year, perhaps as little as $2,500 or as much as $23,100. Regardless, the price announced after Biogen secured FDA approval “far exceeds even this optimistic scenario,” ICER concluded.

“If we were talking about a cure for Alzheimer’s disease, we would figure it out,” Dusetzina told me. “It would be so important to address that burden on our society, we would need to figure it out.”

But aducanumab is not that drug, according to the available data. So what is Medicare to do?

Despite the tradition of honoring FDA approval, experts do not expect Medicare to simply announce it is going to cover the drug with no limitations. One option would be for the program to conduct a “national coverage determination,” a lengthy review process for deciding whether to cover the drug and for which patients. (The price would not be on the table.)

What decision that process would lead to is unclear. Many experts are urging Medicare to pursue what is called “coverage with evidence development”: essentially setting up its own clinical trial by authorizing aducanumab for use by some patients and collecting real-world data on their outcomes.

“I think it’d be a really smart move,” Dusetzina, who recently joined Medicare’s payment advisory board, said. “This is the perfect time to reevaluate why we need to consider value when we consider what is a fair price for a treatment.”

Along those lines, the private health insurer Cigna announced it would pursue a value-based contract with Biogen to cover the drug, though it did not provide any more details.

But for Medicare, none of these options are ideal. A previous attempt to set up coverage with evidence development for a new cancer drug in 2017 ended up being scuttled after pushback from the drug industry and doctors. Patients with Alzheimer’s and their families are desperate for treatment and will likely object if Medicare tries to restrict access to the drug while undertaking that data collection.

Alzheimer’s advocates are mindful of aducanumab’s cost to the US health care system as well as individual patients, and its potential limitations. They are not necessarily opposed to more evaluation of its effectiveness.

But their ultimate goal is to buy patients more time. As Paulsen told me: “This drug doesn’t do it perfectly, doesn’t do it amazingly well for every single person. But it’s the first one that does that.”

They say they worry about restricting access to patients who are living with this disease right now, for whom time is running out. They point out that cancer drugs with marginal benefits have also been approved by the FDA, with exponentially higher costs per patient than aducanumab.

“We do not want to see delays in the ability of patients and doctors to begin to discuss whether this treatment is right for them,” Robert Egge, chief public policy officer of the Alzheimer’s Association, said. “And if it is, if that’s their decision together, we want them to have access to it. What we do not want to see is a long protracted process that effectively delays the ability for people to begin this treatment now that approval has been given.”

The stakes are enormous — for everyone. The cost of expensive drugs ultimately trickles down in the form of higher premiums or taxes. As the investment advisory firm Capital Alpha DC pointed out this week in a note that warned the drug “could break the Medicare program,” the Medicare trustees are expected to issue a report any time now with an updated estimate of when the program’s hospital benefit might start to become insolvent — which could be as soon as 2024.

As Sachs told me: “It’s very difficult to see how our health system moves through this without significant negative consequences.”

Medicare’s inability to negotiate pharmaceutical prices has meant that a budget crisis is always just one drug approval away. With aducanumab, that crisis has arrived — even though the evidence so far suggests patients may see minimal benefit in return.

What the Novavax vaccine means for the global fight against Covid-19

Another Covid-19 vaccine, this one from the biotech firm Novavax, has posted superb results in a phase 3 clinical trial, the company announced on Monday. But with more than half of US adults now vaccinated against Covid-19, the biggest impact of these results may be in other countries.

The Novavax vaccine stands out from other Covid-19 vaccines because it uses a protein-based technology that none of the Covid-19 vaccines deployed to date have relied on. It can also be stored at ordinary refrigerator temperatures, unlike some other vaccines whose strict freezer requirements complicate distribution.

Novavax said its vaccine candidate was 90 percent effective overall against Covid-19 cases that produce symptoms, and 100 percent effective against moderate and severe disease. The results, from nearly 30,000 participants across the US and Mexico, could make it the fourth Covid-19 vaccine to begin distribution in the US, following vaccines developed by Pfizer/BioNTech, Moderna, and Johnson & Johnson.
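For context on where efficacy figures like these come from: trials compare the attack rate among vaccinated participants with the rate in the placebo group. The sketch below uses invented case counts for illustration; they are not Novavax’s actual trial data.

```python
# Vaccine efficacy = 1 - (attack rate, vaccinated) / (attack rate, placebo).
# The case counts below are invented for illustration only.
def vaccine_efficacy(cases_vax: int, n_vax: int,
                     cases_placebo: int, n_placebo: int) -> float:
    attack_vax = cases_vax / n_vax
    attack_placebo = cases_placebo / n_placebo
    return 1 - attack_vax / attack_placebo

# e.g., 7 cases among 15,000 vaccinated vs. 70 among 15,000 on placebo:
print(f"{vaccine_efficacy(7, 15_000, 70, 15_000):.0%}")  # -> 90%
```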

But the first approvals of the vaccine will likely come in other countries, Stanley Erck, CEO of the Maryland-based company, told the New York Times. Novavax may not even seek emergency authorization for its vaccine in the US until September. At that point, it may not make much of a difference to the US vaccination effort.

Last July, as part of the US government’s Operation Warp Speed, Novavax was awarded $1.6 billion for vaccine development and production of 100 million doses. At the time, the 20-year-old company faced skepticism for never having brought a vaccine to market.

Novavax now aims to scale up production, with a goal of 150 million doses per month by the end of the year with factories in the US, South Korea, and India. Its two-dose vaccine comes at an expected cost of $16 per injection. That’s more expensive than the adenovirus-based vaccines developed by Johnson & Johnson and AstraZeneca, but around the same price or cheaper than the mRNA vaccines made by Pfizer/BioNTech and Moderna.

The Novavax vaccine did exhibit lower efficacy against some variants of the virus, but the company is studying reformulated versions to target them. With Covid-19 continuing to spread in many parts of the world, having another option to counter the disease will bolster the effort to contain the pandemic.

What makes Novavax’s approach different from other Covid-19 vaccines

Vaccines are like target practice for the immune system: They encourage our bodies to build up defenses against a particular threat, without making us sick. When the real pathogen arrives, immune cells are ready to act, preventing infection altogether or dampening the worst effects of the disease.

Traditional vaccines contain weakened or inactivated versions of viruses or bacteria, or fragments of them. But new approaches have been brought to bear on Covid-19. Moderna and Pfizer/BioNTech vaccines use a snippet of genetic material, mRNA, encased in a nanoparticle. Human cells can read those genetic instructions and manufacture a fragment of SARS-CoV-2, the virus that causes Covid-19, which spurs the immune system to prepare for the virus.

The Covid-19 vaccines developed by AstraZeneca and Johnson & Johnson also shuttle genetic instructions to human cells, encouraging them to make a fragment of SARS-CoV-2, but they use a different virus — an adenovirus — that carries a snippet of DNA.

Novavax’s approach blends old and new techniques. To make the vaccine, the company combines another kind of virus — a baculovirus — with the genetic information needed to make the spike protein, a key fragment of SARS-CoV-2. When moth cells are infected with this virus, they manufacture the spike protein. Scientists then harvest those proteins and fuse them with a nanoparticle; these protein-bearing nanoparticles are what the Novavax vaccine delivers.

According to Novavax, this approach yields a strong immune response with minimal side effects. The main complaints from vaccine recipients were fatigue, headache, and muscle pain lasting less than two days.

How Novavax fits into the vaccination campaign

While new infections, hospitalizations, and deaths are trending downward in the United States, the Covid-19 pandemic continues to rage in other countries. India, currently an epicenter of the pandemic, recently set a new world record of more than 6,000 daily Covid-19 deaths. Part of the toll stems from the Delta/B.1.617 variant of the virus, which appears to be more transmissible. Health officials warn that other countries with limited resources and low vaccination rates remain vulnerable to their own outbreaks. And as long as the virus continues to spread, it risks mutating in dangerous ways that can reverberate to places like the US.

Leaders at the G7 summit last week committed to sharing 1 billion doses of Covid-19 vaccines with other countries, with half coming from the US. For its part, Novavax is partnering with manufacturers in countries like India and South Korea to scale up its production. The company has pledged at least 1.1 billion doses of its vaccine through Gavi, an international vaccination consortium.

Novavax may still have a future role in the US. The company is investigating how its vaccine could work as a booster, bolstering protection from other vaccines as immunity wanes over time. A study last month showed that even mixing shots of different vaccine platforms led to robust immune protection. But it’s not clear yet how long the shielding provided by other Covid-19 vaccines will last.

At the same time, the virus itself is continuing to change. Novavax’s results on Monday showed that its vaccine had 86.3 percent efficacy at preventing disease caused by the Alpha/B.1.1.7 variant of the virus, which first appeared in the United Kingdom. That suggests protection remained high, though not as high as against earlier strains of the virus.

Early phase 2b results from South Africa, however, showed the vaccine yielded 48.6 percent efficacy against the Beta/B.1.351 variant in HIV-negative participants. The company is now investigating a retooled version of its vaccine aimed specifically at the Beta variant.

The ongoing evolution and spread of Covid-19 shows that the pandemic is not over, and it’s too early to become complacent. A new way to immunize against Covid-19 is a welcome development — particularly if it can reach the most vulnerable, and quickly.

The West has all the ingredients for another terrible wildfire season

Summer has not officially started yet, but wildfire season has already arrived in the US. Now an intense heat wave coupled with extreme drought is threatening to make things worse.

Large wildfires have already burned 981,000 acres this year to date, more than the 766,000 acres burned by the same time last year, according to the National Interagency Fire Center.

In Arizona, more than 208,000 acres have burned, sending smoke into Colorado. The 123,000-acre Telegraph Fire now ranks among the 10 largest fires in Arizona’s history.

In Utah, blazes have charred more than 25,000 acres, with a new fire ignited every day for three weeks. California has seen a fourfold increase in year-to-date area burned compared to 2020.

It’s poised to get worse as summer officially begins. While 2021 may not beat the record-setting 2020 season, experts say it will be severe. “It’s probably going to be above-average for sure, but it’s not going to be off-the-charts,” said Craig Clements, director of the Wildfire Interdisciplinary Research Center at San Jose State University.

It’s important to remember that wildfires are a natural part of many ecosystems. They help clear decay, restore nutrients to the soil, and are even required for some plants to germinate. Regular fires are a feature of many healthy forests and grasslands. However, wildfires have been getting more destructive in recent years, and humans are to blame. From building in fire-prone regions to suppressing natural fires to igniting blazes to changing the climate, humanity is making wildfires more expansive, costly, and deadly.

Even so, there are a lot of complicated and surprising factors that contribute to massive infernos, so there is a lot of variability year to year. Here are some of the factors that forecasters are worrying about in the western US.

Why 2021 is expected to be a bad fire year for the West

To start and spread, a wildfire needs fuel, favorable weather, and an ignition source. But whether the overall fire season will be particularly severe or mild depends on variables that interact in complicated and sometimes contradictory ways.

For instance, a wet winter can help encourage more vegetation to grow in the spring, which can then turn into fuel as summer heats up. But a dry winter can add to aridity from ongoing droughts, particularly in areas that already have a lot of flammable fuel, such as forests. “In California, if it’s a dry year, it’s a bad fire season. If it’s a wet year, it’s a bad fire season,” Clements said.

So depending on the particular ecosystem — coastal forest, mountain forest, grassland, chaparral — the same weather and climate conditions can shift fire risk in different directions. But right now, these are the biggest factors driving wildfire risk across the board in the West:

Massive drought
Huge swaths of the western US are experiencing extreme dryness. About 72 percent of the region is considered to be in “severe” drought, while 26 percent is in the worst category of “exceptional” drought. Water levels in reservoirs like Lake Oroville in California and Lake Mead in Nevada have dropped to historic lows. Oregon just experienced its driest spring on record.

This dryness stems from a combination of a 20-year drop in precipitation known as a megadrought and seasonal variation.

Last summer brought extreme heat to the region, which caused more moisture in the soil to evaporate, leaving less water for plants. The following winter then failed to bring much snow and rain, driven in part by a cooling pattern in the Pacific Ocean known as La Niña. The snow that did accumulate melted off faster than average, leaving the Sierra Nevada snowpack at zero percent of normal in May.

Warm weather
California was graced with some cool weather and light rainfall earlier this month, but now the temperature is starting to rise. The Southwest, meanwhile, is bracing for record heat this week. As many as 40 million Americans are poised to swelter as temperatures rise as high as 120 degrees Fahrenheit.

High heat has a close relationship with fire risk. “If it’s really warm, we generally have a higher fire season,” Clements said. “If it’s cooler, it’ll be below average.”

Air can hold about 7 percent more water vapor for every degree Celsius it warms. But if there isn’t much moisture available to begin with, a gap opens between how much vapor the air could hold and how much is actually present. This gap is known as the vapor pressure deficit, and it’s a key warning signal of wildfire risk: a large deficit means the atmosphere is pulling what little moisture remains out of trees, shrubs, and grasses.
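For readers who want the formula, vapor pressure deficit is simply saturation vapor pressure minus the vapor actually present. Here is a minimal sketch using the standard Tetens approximation for saturation vapor pressure; the temperature and humidity values are illustrative, not observations.

```python
import math

def saturation_vapor_pressure(temp_c: float) -> float:
    """Tetens approximation: saturation vapor pressure in kPa at temp_c (Celsius)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit(temp_c: float, rel_humidity_pct: float) -> float:
    """Gap between what the air could hold and what it actually holds, in kPa."""
    return saturation_vapor_pressure(temp_c) * (1 - rel_humidity_pct / 100)

# A hot, dry afternoon vs. a mild, humid one (illustrative values):
print(f"{vapor_pressure_deficit(40, 10):.1f} kPa")  # ~6.6 kPa: severe drying stress
print(f"{vapor_pressure_deficit(20, 60):.1f} kPa")  # ~0.9 kPa: little drying stress
```

The same curve yields the rule of thumb above: saturation pressure computed one degree apart (say, 30°C versus 31°C) comes out about 6 percent higher, close to the 7 percent figure.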

Lots of dry fuel
The combination of heat and aridity has left vegetation parched and primed to ignite. “Fuel moisture content is a critical factor in understanding fire behavior and fire danger,” Clements said.

That exceptionally dry vegetation then causes fires to burn hotter, faster, and longer, which in turn hampers efforts to contain them. It creates a cycle that can end up driving massive, devastating wildfires.

Day-to-day conditions can mitigate some of the long-term wildfire trends

While the deck is stacked in favor of major wildfires again this year, it’s not a guarantee that they will be larger, more frequent, or more destructive. Blazes still require an ignition source, and they depend on wind and persistent dry conditions to spread. “Things are looking scary, but if there’s no ignition, it’s not so bad,” Clements said.

If there isn’t a major wind event as fires ignite, they could remain contained. Similarly, bouts of rainfall or lower temperatures could quench flames. These weather events can drastically change the dynamics of fires, and it’s not clear yet what the coming weeks will hold.

And if there is nothing to spark the flames, then there will be few new fires. The majority of wildfires in the US, upward of 84 percent, are ignited by humans. That can come from arson, unattended campfires, downed power lines, or machinery. So taking steps to reduce ignition, like banning fires in forested areas or limiting routes open to cars in fire-prone chaparral, can go a long way in reducing wildfire risk. Power companies like Pacific Gas & Electric are readying plans to shut off power to their customers to prevent their hardware from lighting new blazes.

But nature can ignite fires too. A dry lightning storm last year triggered a wave of fires in California. July is the peak month for lightning strikes in the West, and that’s one thing humans can’t prevent.

Over time, it’s possible to reduce the destructiveness of wildfires — for example through controlled burns, regular thinning of trees and brush that build up, and relocating homes and businesses away from high-risk areas. But the current situation developed over more than a century of poor planning, and it won’t be fixed overnight. So wildfires in the West are likely to get worse before they get better.

What’s with these invasive “crazy” worms and why can’t we get rid of them?

Tiny, wriggling horrors are hatching right now, under our feet, across the country.

No, not the billions of Brood X cicadas emerging throughout the eastern US. I’m talking instead about baby invasive “crazy worms” that thrash through garden, farm, city, and forest soil, growing to 3 to 6 inches in length, sucking up nutrients, and transforming rich leaf litter into coarse droppings. All while laying nearly 20 hardy worm cocoons a month, without needing a mate.

Variously known as jumping worms, snake worms, Alabama jumpers, and Jersey wrigglers, common Amynthas species are a super-powered counterpart to the familiar, squishy, languid garden-variety European earthworms (whose genus name, Lumbricus, itself sounds plodding). And their rapid spread into new areas has led to a surge of concern about these worms.

This vigorous lifestyle can quickly lead to full-blown infestations — and decimated topsoil. Perhaps it’s no wonder jumping worms have recently been invading the internet, too.

“You can see hundreds of them massing together, eliciting squeals of either horror or delight,” says Bernie Williams, a plant pest and disease expert at the Wisconsin Department of Natural Resources, who has been studying worms for some 20 years (“too many years”). Jumping worms, of the genus Amynthas, have now been spotted in more than half of US states and at least one Canadian province.

Amynthas worms raise not only the frequent disgust of gardeners, but also serious concern among land management experts. By churning through such high volumes of surface mulch and litter (and not allowing it to decompose more naturally into the soil), these worms seem to tie up plant-friendly nutrients in their dry castings, which are then easily washed away. They can physically undermine plants by loosening the top layer of soil — especially when hundreds of them are at work — and make it less able to retain moisture. And wherever they arrive, they seem to eradicate European earthworms, which help mix and aerate healthy soil.

So, it’s panic time, right?

It turns out we know very little about these annelid invaders beyond their self-fertilizing fecundity, physical vigor, and prolific digestive habits. It is true that they are changing the landscapes they enter, but some researchers say that while we should work to control jumping worms, we also need to learn more about them — and, yes, learn how we can live with them, too.