HUNDREDS OF DEADLY BIOLABS WITH DISASTROUS SECURITY RECORDS, RUN BY CDC AND PHARMAFIA IN YOUR BACKYARD

Pharmafia is not only capitalism’s most convicted felon, but also the felon most successful at avoiding justice


If you think this is not your problem because you’re not living in the US, think again:




The USA TODAY Network’s “Biolabs in Your Backyard” investigation, published since 2015, has revealed hundreds of accidents at corporate, university, government and military labs nationwide. It also has exposed a system of fragmented federal oversight and pervasive secrecy that obscures failings by facilities and regulators.

In January 2015, in an effort to determine the extent of lab accidents at the agency’s facilities, USA TODAY filed a FOIA request seeking copies of all incident reports at CDC labs in Atlanta and Fort Collins during 2013 and 2014. The CDC granted the request “expedited” processing status because USA TODAY demonstrated a compelling public need for the information. But the agency has said it will likely be 2018 before the records are released.

The newly disclosed 2009 incident in the BSL-4 decontamination shower is among about 4,000 pages of records the agency released in late January in response to two FOIA requests USA TODAY filed in June 2012. Those requests sought records about airflow and security door incidents at CDC’s $214 million, 368,000-square-foot Emerging Infectious Diseases Laboratory in Atlanta, commonly referred to by the agency as Building 18.

Most of these released records — which focus on airflow engineering issues in labs — involve a 2012 incident that USA TODAY reported four years ago based on documents obtained from sources. The issue involved air from inside a potentially contaminated lab briefly blowing outward into a “clean” corridor where a group of visitors weren’t wearing any protective gear. Among other incidents revealed in the records:

  • In 2011, a worker feeding animals in an enhanced biosafety level 3 lab used for studies on dangerous strains of avian flu was unable to shower out of the lab after a construction contractor mistakenly closed the wrong water valve in a service tunnel. Not knowing when the water would come back on, the worker removed her protective equipment, put on a clean protective suit and left the lab without taking a shower. “I escorted her through the service tunnel to building (redacted) where she signed into our (redacted) select agent laboratory. She disposed of the tyvek suit in a biohazard bag, placed her scrubs in the laundry bin, and took a personal shower.” The CDC told USA TODAY that because the potential for any exposure was considered low risk, a medical evaluation was not required.
  • In 2008, an unvaccinated repair worker was potentially exposed to an undisclosed pathogen when a door unexpectedly opened on a malfunctioning device, called an autoclave, that is used to sterilize equipment and at the time contained contaminated items. The infectious materials inside the device included bedding from infected mice and used laundry. While a report of the incident said that any material that may have escaped through the clean-side door that opened “was likely to be drawn upward toward the exhaust,” the worker was told to shower, and his clothes, shoes, wallet, watch and other personal items were disinfected. He was escorted to the clinic for evaluation. The report notes that the autoclave “was installed backwards during building construction” and that as a result, the manual override controls for doors are reversed, “which ultimately resulted in the incident.”

Building 18, which opened in 2005, has had a series of significant issues over the years. While the building’s many other high-containment and lower security labs were in operation from the start, its suite of BSL-4 labs did not go “hot” and start working with pathogens until around early 2009. The lab complex made news in 2007 when backup generators failed to keep airflow systems running during a power outage, and in 2008 for a high-containment lab door that was being sealed with duct tape. The duct tape was applied after a 2007 incident in which the building’s ventilation system malfunctioned and pulled potentially contaminated air out of the lab and into a “clean” hallway. Nine CDC workers were tested for potential exposure to Q fever bacteria. None were infected.

Read all the records released by CDC in response to USA TODAY’s 2012 Freedom of Information Act requests here and here.

The full coverage of USA TODAY’s investigation used to be hosted on its own separate website, biolabs.usatoday.com, but they deleted it, unsurprisingly.

CDC keeps secret its mishaps with deadly germs

CDC failed to disclose lab incidents with bioterror pathogens to Congress

Newly disclosed CDC biolab failures ‘like a screenplay for a disaster movie’

CDC labs repeatedly faced secret sanctions for mishandling bioterror germs

As we’ve shown in our video too, a wide range of mainstream media outlets, not just USA TODAY, have covered the situation quite critically over the years, but with almost no impact on the general population. Ah, well…

Why some labs work on making viruses deadlier — and why they should stop

The pandemic should make us question the value of gain-of-function research.

By Kelsey Piper, May 1, 2020, Vox

Editor’s note, June 7, 2021: Since this article was originally published in May 2020, scientific consensus has shifted. Now some experts say the “lab leak” theory warrants an investigation, along with the natural origin theory. The article has been updated to reflect this, but other information may be out of date. For our most up-to-date coverage of the coronavirus pandemic, visit Vox’s coronavirus hub.


Earlier this week, Newsweek and the Washington Post reported that the Wuhan Institute of Virology, a lab near the site of the first coronavirus cases in the world, had been studying bat coronaviruses.

The Newsweek report revealed an alarming tidbit: The Wuhan lab at the center of the controversy had for years been engaged in gain-of-function research. What exactly is it? It’s a line of research where scientists take viruses and study how they might be modified to become deadlier or more transmissible. Why would they do this? Scientists who engage in such research say it helps them figure out which viruses threaten people so they can design countermeasures.

To be clear, there is no evidence that the novel coronavirus, SARS-CoV-2, was released on purpose, and many experts believe it is likely to have been the result of accidental transmission through human contact with wild animals, like almost all disease outbreaks in history have been.

But the emerging reports about the lab in Wuhan are making many people aware for the first time that gain-of-function research happens at all. I wouldn’t blame you if your response to this news is this: The government gives grants to researchers to make potentially pandemic viruses deadlier and more easily transmissible between people? Why are we doing that?

The increased attention to gain-of-function research is a good thing. This kind of highly controversial research — banned under the Obama administration after safety incidents demonstrated that lab containment is rarely airtight — began again under the Trump administration, and many scientists and public health researchers think it’s a really bad idea. Our brush with the horrors of a pandemic might force us to reconsider the warnings those experts have been sounding for years.

The US stopped funding gain-of-function research. Then it started again.

In 2019, Science magazine broke the news that the US government resumed funding two controversial experiments to make the bird flu more transmissible.

The two experiments had been on hold since 2012 amid a fierce debate in the virology community about gain-of-function research. In 2014, the US government, under the Obama administration, declared a moratorium on such research.

That year was a bad one on the biohazard front. In June 2014, as many as 75 scientists at the Centers for Disease Control and Prevention were exposed to anthrax. A few weeks later, Food and Drug Administration officials ran across 16 forgotten vials of smallpox in storage. Meanwhile, the “largest, most severe, and most complex” Ebola outbreak in history was raging across West Africa, and the first patient to be diagnosed in the US had just been announced.

It was in that context that scientists and biosecurity experts found themselves embroiled in a debate about gain-of-function research. The scientists who do this kind of research argue that we can better anticipate deadly diseases by making diseases deadlier in the lab. But many people at the time and since have become increasingly convinced that the potential research benefits — which look limited — just don’t outweigh the risks of kicking off the next deadly pandemic ourselves.

While internally divided, the US government came down on the side of caution at the time. It announced a moratorium on funding gain-of-function research — putting potentially dangerous experiments on hold so the world could discuss the risks this research entailed.

But in 2017, the government under the Trump administration released new guidelines for gain-of-function research, signaling an end to the blanket moratorium. And the news from 2019 suggests that dangerous projects are proceeding.

Experts in biosecurity are concerned the field is heading toward a mistake that could kill innocent people. They argue that, to move ahead with research like this, there should be a transparent process with global stakeholders at the table. After all, if anything goes wrong, the mess we’ll face will certainly be a global one.

Should we really be doing this kind of research?

Advocates of this kind of gain-of-function research (not all gain-of-function research uses pathogens that can cause pandemics) point to a few things they hope it will enable us to do.

In general, they argue it will enhance surveillance and monitoring for new potential pandemics. As part of our efforts to thwart pandemics before they start — or before they get severe — we take samples of the viruses currently circulating. If we know what the deadliest and most dangerous strains out there are, the argument goes, then we’ll be able to monitor for them and prepare a response if it looks like such mutations are arising in the wild.

“As coordination of international surveillance activities and global sharing of viruses improve,” some advocates wrote in mBio, we’ll get better at learning which strains are out there. Then, gain-of-function research will tell us which ones are close to becoming deadly.

“GOF data have been used to launch outbreak investigations and allocate resources (e.g., H5N1 in Cambodia), to develop criteria for the Influenza Risk Assessment Tool, and to make difficult and sometimes costly pandemic planning policy decisions,” they argue.

“The United States government weighed the risks and benefits … and developed new oversight mechanisms. We know that it does carry risks. We also believe it is important work to protect human health,” Yoshihiro Kawaoka, an investigator whose gain-of-function research was approved, told Science magazine.

According to this logic, if we’d known for years that the SARS-CoV-2 coronavirus — the virus now keeping us all indoors — was a particularly dangerous one, maybe we could have had disease surveillance systems out to alert us if it made the jump to humans.

Others are skeptical. Thomas Inglesby, director of the Center for Health Security at Johns Hopkins, told me last year that he doesn’t think the benefits for vaccine development hold up in most cases. “I haven’t seen any of the vaccine companies say that they need to do this work in order to make vaccines,” he pointed out. “I have not seen evidence that the information people are pursuing could be put into widespread use in the field.”

Furthermore, there are unimaginably many possible variants on a virus, of which researchers can identify only a few. Even if we stumble across one way a virus could mutate to become deadly, we might miss thousands of others. “It’s an open question whether laboratory studies are going to come up with the same solution that nature would,” MIT biologist Kevin Esvelt told me last year. “How predictive are these studies really?” As of right now, that’s still an open question.

And even in the best case, the utility of this work would be sharply limited. “It’s important to keep in mind that many countries do not have mechanisms in place at all — much less a real-time way to identify and reduce or eliminate risks as experiments and new technologies are conceived,” Beth Cameron, the Nuclear Threat Initiative’s vice president for global biological policy and programs, told me.

With the stakes so high, many researchers are frustrated that the US government was not more transparent about which considerations prompted them to fund the research. Is it really necessary to study how to make H5N1, which causes a type of bird flu with an eye-popping mortality rate, more transmissible? Will precautions be in place to make it harder for the virus to escape the lab? What are the expected benefits from the research, and which hazards did the experts who approved the work consider?

“The people proposing the work are highly respected virologists,” Inglesby said. “But laboratory systems are not infallible, and even in the greatest laboratories of the world, there are mistakes.” What measures are in place to prevent that? Will potentially dangerous results be published to the whole world, where unscrupulous actors could follow the instructions?

These are exactly the questions the review process was supposed to answer, but didn’t.

Sometimes pathogens escape from the lab. Here’s how it happens.

The reason the subject of gain-of-function research can inspire such heated opposition is because the stakes can be so high. Pathogens have escaped labs before.

Take smallpox, once one of the deadliest diseases.

In 1977, the last case of smallpox was diagnosed in the wild. The victim was Ali Maow Maalin of Somalia. The World Health Organization tracked down every person he’d been in face-to-face contact with to vaccinate everyone at risk and find anyone who might have caught the virus already. Thankfully, they found no one had. Maalin recovered, and smallpox appeared to be over forever.

That moment came at the end of a decades-long campaign to eradicate smallpox — a deadly infectious disease that killed about 30 percent of those who contracted it — from the face of the Earth. Around 500 million people died of smallpox in the century before it was annihilated.

But in 1978, the disease cropped back up — in Birmingham, England. Janet Parker was a photographer at Birmingham Medical School. When she developed a horrifying rash, doctors initially brushed it off as chicken pox. After all, everyone knew smallpox had been chased out of the world — right?

Parker got worse and was admitted to the hospital, where testing determined she had smallpox after all. She died of it a few weeks later.

How did she get a disease that was supposed to have been eradicated?

It turned out that the building Parker worked in also contained a research laboratory, one of a handful where smallpox was studied by scientists who were trying to contribute to the eradication effort. Some papers reported the lab was badly mismanaged, with important precautions ignored because of haste. (The doctor who ran the lab died by suicide shortly after Parker was diagnosed.) Somehow, smallpox escaped the lab to infect an employee elsewhere in the building. Through sheer luck and a rapid response from health authorities, including a quarantine of more than 300 people, the deadly error didn’t turn into an outright pandemic.

In 2014, as the Food and Drug Administration did cleanup for a planned move to a new office, hundreds of unclaimed vials of virus samples were found in a cardboard box in the corner of a cold storage room. Six of them, it turned out, were vials of smallpox. No one had been keeping track of them; no one knew they were there. They may have been there since the 1960s.

Panicked scientists put the materials in a box, sealed it with clear packaging tape, and carried it to a supervisor’s office. (This is not approved handling of dangerous biological materials.) It was later found that the integrity of one vial was compromised — luckily, not one containing a deadly virus.

The 1978 and 2014 incidents grabbed attention because they involved smallpox, but incidents of unintended exposure to controlled biological agents are actually quite common. Hundreds of incidents occur every year, though not all involve potentially pandemic-causing pathogens.

In 2014, a researcher accidentally contaminated a vial of a fairly harmless bird flu with a far-deadlier strain. The deadlier bird flu was then shipped across the country to a lab that didn’t have authorization to handle such a dangerous virus, where it was used for research on chickens.

The mistake was discovered only when the Centers for Disease Control and Prevention conducted an extensive investigation in the aftermath of a different mistake — the potential exposure of 75 federal employees to live anthrax, after a lab that was supposed to inactivate the anthrax samples accidentally prepared activated ones.

The CDC’s Select Agents and Toxins program requires “theft, loss, release causing an occupational exposure, or release outside of primary biocontainment barriers” of agents on its watchlist be immediately reported. Between 2005 and 2012, the agency got 1,059 release reports — an average of one incident every few days. Here are a few examples:

  • In 2008, a sterilization device malfunctioned and unexpectedly opened, exposing a nearby unvaccinated worker to undisclosed pathogens.
  • In 2009, a new high-security bioresearch facility — rated to handle Ebola, smallpox, and other dangerous pathogens — had its decontamination showers fail. The pressurized chamber kept losing pressure and the door back into the lab kept bursting open while the scientists leaned against it to try to keep it closed. Building engineers were eventually called to handle the chemical showers manually.
  • In 2011, a worker at a lab that studied dangerous strains of bird flu found herself unable to shower after a construction contractor accidentally shut off the water. She removed her protective equipment and left without taking a decontaminating shower. (She was escorted to another building and showered there, but pathogens could have been released in the meantime.)
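The “one incident every few days” figure cited above is straightforward to check; here is a minimal back-of-envelope sketch, assuming the 2005–2012 window covers eight full years:

```python
# Rough check of the reporting rate cited above: 1,059 release
# reports filed with the CDC between 2005 and 2012.
reports = 1059
years = 8            # assuming eight full calendar years, 2005-2012
days = years * 365   # ignoring leap days for a back-of-envelope figure

days_per_report = days / reports
print(f"about one report every {days_per_report:.1f} days")  # about one report every 2.8 days
```

One report roughly every three days is consistent with the article’s characterization.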

Now, the vast majority of these mistakes never infect anyone. And while 1,059 is an eye-popping number of accidents, it actually reflects a fairly low rate of accidents — working in a controlled biological agents lab is safe compared to many occupations, like trucking or fishing.

But a trucking or fishing accident will, at worst, kill a few dozen people, while a pandemic pathogen accident could potentially kill a few million. Considering the stakes and worst-case scenarios involved, it’s hard to look at those numbers and conclude that our precautions against disaster are sufficient.
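The asymmetry described above can be made concrete with a toy expected-harm calculation. All numbers here are invented for illustration and are not estimates from the source:

```python
# Toy expected-harm comparison: accidents with bounded harm vs. accidents
# with pandemic-scale harm. All probabilities and death tolls below are
# illustrative assumptions only, not real-world estimates.
def expected_deaths(accident_probability_per_year, deaths_per_accident):
    """Expected annual deaths = probability of an accident x its toll."""
    return accident_probability_per_year * deaths_per_accident

# A conventional industrial accident: fairly likely, small toll.
industrial = expected_deaths(0.01, 30)          # 0.3 expected deaths/year

# A pandemic-pathogen escape: far less likely, enormous toll.
pandemic = expected_deaths(0.0001, 10_000_000)  # 1,000.0 expected deaths/year

print(industrial, pandemic)
```

Even with an escape probability a hundred times smaller, the pandemic scenario dominates the expected harm, which is the core of the cost-benefit argument the experts quoted here are making.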

Reviewing the incidents, it looks like there are many different points of failure: machinery that’s part of the containment process malfunctions; regulations aren’t sufficient or aren’t followed; human error means live viruses are handled instead of dead ones.

Now imagine such an error involving viruses enhanced through gain-of-function research. “If an enhanced novel strain of flu escaped from a laboratory and then went on to cause a pandemic, then causing millions of deaths is a serious risk,” Marc Lipsitch, a professor of epidemiology at Harvard University, told me last year.

The cost-benefit analysis for pathogens that might kill the people exposed or a handful of others is vastly different from the cost-benefit analysis for pathogens that could cause a pandemic — but our current procedures don’t really account for that. As a result, allowing gain-of-function research means running unacceptable risks with millions of lives. It’s high time to rethink that.

Problems with disposal of dangerous materials led the government to suspend research at the military’s leading biodefense center.

Denise Braun prepared to demonstrate lab work during a media tour at the Army Medical Research Institute of Infectious Diseases in Fort Detrick, Md., in 2011. Credit: Patrick Semansky/Associated Press

By Denise Grady, Aug 5, 2019, New York Times

Safety concerns at a prominent military germ lab have led the government to shut down research involving dangerous microbes like the Ebola virus.

“Research is currently on hold,” the United States Army Medical Research Institute of Infectious Diseases, in Fort Detrick, Md., said in a statement on Friday. The shutdown is likely to last months, Caree Vander Linden, a spokeswoman, said in an interview.

The statement said the Centers for Disease Control and Prevention decided to issue a “cease and desist order” last month to halt the research at Fort Detrick because the center did not have “sufficient systems in place to decontaminate wastewater” from its highest-security labs.

But there has been no threat to public health, no injuries to employees and no leaks of dangerous material outside the laboratory, Ms. Vander Linden said.

In the statement, the C.D.C. cited “national security reasons” as the rationale for not releasing information about its decision.

The institute is a biodefense center that studies germs and toxins that could be used to threaten the military or public health, and also investigates disease outbreaks. It carries out research projects for government agencies, universities and drug companies, which pay for the work. It has about 900 employees.

The shutdown affects a significant portion of the research normally conducted there, Ms. Vander Linden said.

The suspended research involves certain toxins, along with germs called select agents, which the government has determined have “the potential to pose a severe threat to public, animal or plant health or to animal or plant products.” There are 67 select agents and toxins; examples include the organisms that cause Ebola, smallpox, anthrax and plague, and the poison ricin.

In theory, terrorists could use select agents as weapons, so the government requires any organization that wants to handle them to pass a background check, register, follow safety and security procedures, and undergo inspections through a program run by the C.D.C. and the United States Department of Agriculture. As of 2017, 263 laboratories — government, academic, commercial or private — had registered with the program.

The institute at Fort Detrick was part of the select agent program until its registration was suspended last month, after the C.D.C. ordered it to stop conducting the research.

The shutdown was first reported on Friday by the Frederick News-Post.

The problems date back to May 2018, when storms flooded and ruined a decades-old steam sterilization plant that the institute had been using to treat wastewater from its labs, Ms. Vander Linden said. The damage halted research for months, until the institute developed a new decontamination system using chemicals.

The new system required changes in certain procedures in the laboratories. During an inspection in June, the C.D.C. found that the new procedures were not being followed consistently. Inspectors also found mechanical problems with the chemical-based decontamination system, as well as leaks, Ms. Vander Linden said, though she added that the leaks were within the lab and not to the outside world.

“A combination of things” led to the cease and desist order, and the loss of registration, she said.

Dr. Richard H. Ebright, a molecular biologist and bioweapons expert at Rutgers University, said in an email that problems with the institute’s new chemical-based decontamination process might mean it would have to go back to a heat-based system “which, if it requires constructing a new steam sterilization plant, could entail very long delays and very high costs.”

Although many projects are on hold, Ms. Vander Linden said scientists and other employees are continuing to work, just not on select agents. She said many were worried about not being able to meet deadlines for their projects.

Missteps have occurred at other government laboratories, including those at the Centers for Disease Control and the National Institutes of Health. And in 2009, research at the institute in Fort Detrick was suspended because it was storing pathogens not listed in its database. The army institute also employed Bruce E. Ivins, a microbiologist who was a leading suspect — but who was never charged — in the anthrax mailings in 2001 that killed five people. Dr. Ivins died in 2008, apparently by suicide.

FOLLOW UP:

To be continued?
Our work and existence, as media and people, is funded solely by our most generous readers, and we want to keep it this way.
Help SILVIEW.media survive and grow, please donate here, anything helps. Thank you!

! Articles can always be subject to later editing as a way of perfecting them

