BREAKING STUDY: Pfizer mRNA Found in Over 88% of Human Placentas, Sperm, and Blood — and in 50% of Unvaccinated Pregnant Women

By Nicolas Hulscher, MPH


Human biodistribution study shows Pfizer mRNA penetrates fetal and reproductive tissues, persists long-term in the body, and presents clear evidence of shedding.

For years, the public was told a simple story: the mRNA “stays in the arm,” degrades within hours, never enters the bloodstream, never crosses the placenta, never reaches the reproductive system, and certainly cannot be shed or transferred to others. These claims were repeated endlessly by agencies, fact-checkers, news outlets, and medical institutions, despite the fact that no long-term human biodistribution studies had ever been performed.

A new peer-reviewed study published in Annals of Case Reports, titled “Detection of Pfizer BioNTech Messenger RNA COVID-19 Vaccine in Human Blood, Placenta and Semen,” ends that narrative.

Researchers from Bar-Ilan University and several Israeli medical centers used nested PCR combined with Sanger sequencing—a far more sensitive and specific method than the standard qPCR used in earlier studies—to test for Pfizer mRNA in human tissues from 34 participants, including 22 pregnant women, 4 male sperm donors (8 samples), and 8 additional adults.

Their findings are deeply worrisome: 88% of pregnant women vaccinated within the last 100 days showed detectable Pfizer mRNA in both blood and placental tissue. Among male sperm donors, 100% of those who produced sperm had vaccine mRNA in their sperm cells, and 50% had it detectable in seminal fluid—long after vaccination.

Even more concerning, Pfizer mRNA was detected in 50% of the unvaccinated women tested: two in both placenta and blood, and one in blood alone. The result forces the scientific community to confront the reality of shedding, something officials categorically deny.

Most striking of all, mRNA was still present in 50% of individuals more than 200 days after injection.

https://www.thefocalpoints.com/p/breaking-study-pfizer-mrna-found

STUDY: Common Vaccines Linked to 38-50% Increased Risk of Dementia and Alzheimer’s


The single largest vaccine–dementia study ever conducted (n=13.3 million) finds risk intensifies with more doses, remains elevated for a full decade, and is strongest after flu and pneumococcal shots.

The single largest and most rigorous study ever conducted on vaccines and dementia — spanning 13.3 million UK adults — has uncovered a deeply troubling pattern: those who received common adult vaccines faced a significantly higher risk of both dementia and Alzheimer’s disease.

The risk intensifies with more doses, remains elevated for a full decade, and is strongest after influenza and pneumococcal vaccination. With each layer of statistical adjustment, the signal doesn’t fade — it becomes sharper, more consistent, and increasingly difficult to explain away.

And critically, these associations persisted even after adjusting for an unusually wide range of potential confounders, including age, sex, socioeconomic status, BMI, smoking, alcohol-related disorders, hypertension, atrial fibrillation, heart failure, coronary artery disease, stroke/TIA, peripheral vascular disease, diabetes, chronic kidney and liver disease, depression, epilepsy, Parkinson’s disease, cancer, traumatic brain injury, hypothyroidism, osteoporosis, and dozens of medications ranging from NSAIDs and opioids to statins, antiplatelets, immunosuppressants, and antidepressants.

Even after controlling for this extensive list, the elevated risks remained strong and remarkably stable.
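For readers unfamiliar with the method, “adjusting for confounders” means fitting a survival model in which the suspected confounders enter as covariates alongside the vaccination exposure. Below is a minimal, hypothetical sketch in Python using the lifelines library; the data, column names, and effect sizes are invented for demonstration and are not the study’s actual data or code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 5000

# Invented cohort: age, dose count, and smoking status drawn at random.
age = rng.normal(70, 8, n)
doses = rng.integers(0, 5, n)      # 0-4 vaccine doses
smoker = rng.integers(0, 2, n)

# Simulate time-to-diagnosis with a built-in dose effect plus confounding by age.
hazard = np.exp(0.04 * (age - 70) + 0.10 * doses + 0.20 * smoker)
time_to_event = rng.exponential(20.0 / hazard)
diagnosed = (time_to_event < 10).astype(int)   # event observed within follow-up
follow_up = np.minimum(time_to_event, 10.0)    # censor everyone at 10 years

df = pd.DataFrame({
    "follow_up": follow_up, "diagnosed": diagnosed,
    "doses": doses, "age": age, "smoker": smoker,
})

# "Adjusting" = including the confounders as covariates: the coefficient on
# `doses` is then the dose association with age and smoking held constant.
cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up", event_col="diagnosed")
cph.print_summary()   # exp(coef) on `doses` ~ adjusted hazard ratio per dose
```

The real study’s covariate list is far longer, but the mechanics are the same: each layer of adjustment adds covariates, and the claim above is that the dose coefficient stays elevated through every layer.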

The vaccine does not stay in the arm.

Its products do not remain confined to the individual.

And the biological signals generated in response may behave in ways that resemble “spread,” even though no infectious agent is present.

https://open.substack.com/pub/petermcculloughmd/p/study-common-vaccines-linked-to-38

Ross Perot

His employees got thrown in an Iranian prison. He hired Special Forces to break them out.
A $1,000 investment created a man who almost became president.
Ross Perot was 32 years old.
He quit IBM in 1962. Started Electronic Data Systems with nothing but an idea and a thousand bucks.
Everyone said he was crazy.
“Why would you leave the best sales job in America?”
“Data processing? Nobody’s buying that.”
“You’re throwing away your career.”
He didn’t listen.
But here’s what nobody tells you. They were almost right.
Perot was rejected 77 times before landing his first contract.
Seventy-seven nos. Most people quit after five. Maybe ten.
He kept going.
By 1968, EDS went public. The stock opened at $16. Within days it hit $160.
Forbes called him “the fastest, richest Texan.”
By December 1969, his shares were worth $1 billion.
Then April 1970 happened.
In a single day, Perot lost $445 million on the stock exchange.
The biggest individual loss in NYSE history at that time.
Wall Street laughed. “The Texan got what was coming to him.”
Most people would have crumbled. Sold everything. Gone back to a safe job.
Perot? He just kept building.
Fourteen years later, he sold EDS to General Motors for $2.4 billion.
But that’s not what makes this story interesting.
In 1979, a young Bill Gates approached Perot about buying Microsoft. A tiny software company worth maybe $2 million.
Perot thought the asking price was too high.
He passed.
Microsoft is now worth over $3 trillion.
Perot called it “one of the biggest business mistakes I’ve ever made.”
But here’s what separates good entrepreneurs from great ones.
When Steve Jobs got fired from Apple in 1985, he started a new company called NeXT. Everyone said Jobs was finished. Washed up. A has-been at 30.
Perot watched a PBS documentary about Jobs. Called him the next morning.
“If you ever need an investor, call me.”
Perot invested $20 million into NeXT. Took a board seat.
He didn’t want to miss another Microsoft.
NeXT struggled. The computers didn’t sell. Most people would have written it off as a loss.
But in 1996, Apple bought NeXT for $400 million. Jobs came back.
NeXT’s software became the foundation for macOS. And later, the iPhone.
Perot’s “failed” investment helped create the most valuable company on Earth.
But he wasn’t done.
In December 1978, two of his EDS employees were arrested in Iran as the revolution boiled over. The government wanted $12.7 million in ransom. The U.S. couldn’t help.
Most CEOs would have hired lawyers. Waited it out. Let bureaucracy run its course.
Perot hired a retired Special Forces colonel. Assembled a team of Vietnam veterans who worked for EDS. Flew to Tehran himself.
When negotiations failed, his team helped start a prison riot. 11,000 inmates escaped. Including his two employees.
They drove 500 miles overland to Turkey.
All of them made it home.
Hollywood made a miniseries about it. Ken Follett wrote a bestseller.
Then came politics.
In 1992, Perot announced he was running for president. As an independent. No party. No political machine.
Everyone said it was impossible.
“Third-party candidates never win.”
“He’s wasting his money.”
“Americans don’t vote independent.”
At one point, Perot led the polls. Ahead of both George Bush and Bill Clinton.
He dropped out in July. Came back in October.
Still got nearly 20 million votes. 19% of the popular vote.
The most successful third-party candidate since Theodore Roosevelt in 1912.
He changed how campaigns worked. Used TV infomercials instead of rallies. Spoke directly to voters on talk shows. No handlers. No scripts.
He ran again in 1996. Got 8% of the vote.
Never won an election. Never held office.
But here’s what people miss.
Perot didn’t run to win. He ran to prove a point. That ordinary Americans were tired of being ignored. That you didn’t need permission from the political establishment to have a voice.
When he died in 2019, he was worth $4.1 billion.
Started with $1,000.
Rejected 77 times.
Lost $445 million in a single day.
Missed Microsoft.
Got called crazy for running for president.
Still built companies that sold for billions. Still rescued his people from a foreign prison. Still changed American politics. Still helped fund the technology that powers every iPhone on the planet.
What rejection are you letting stop you?
What “failure” are you treating like the end?
Perot got rejected 77 times before his first yes.
He lost almost half a billion dollars in one day and kept going.
He missed the biggest investment opportunity in tech history and still funded the next one.
He ran for president twice without winning and still made history.
Stop counting your losses.
Start counting your attempts.
The guy who got told no 77 times built a multi-billion dollar empire, bankrolled the iPhone, and ran for president.
Your “impossible” goal doesn’t look so impossible anymore, does it?
Think Big.

Mebendazole

In a 2024 laboratory study, scientists treated human colon cancer cells with the anti-worm drug mebendazole and then measured how many cells died after 48 hours.

Using a special test that separates living cells from dying cells, they found that 78% (±12%) of the cancer cells were pushed into apoptosis, which is the cell’s natural self-destruction process.

This result was statistically very strong (P = 0.0001), meaning a difference this large would be extremely unlikely to arise by chance if the drug had no effect. In simple terms, mebendazole didn’t just slow the cancer; it actively forced most of the cancer cells to shut down and die in the lab.
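For readers who want to see what a figure like P = 0.0001 actually summarizes, here is a minimal sketch of the underlying arithmetic. The replicate percentages below are invented for illustration (chosen to roughly match a 78% ± 12% treated mean); they are not the study’s raw data, and the study’s own statistical test may differ.

```python
from scipy import stats

# Invented apoptosis percentages from replicate wells after 48 hours.
treated = [78, 85, 70, 90, 66, 79]   # mebendazole-treated wells
control = [9, 12, 7, 14, 10, 8]      # untreated control wells

# Welch's two-sample t-test: how surprising is a gap this large
# if the drug actually had no effect on the cells?
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)
print(f"t = {t_stat:.2f}, P = {p_value:.2e}")
```

Note that a tiny P value says nothing about how big the effect is; it only says the data are hard to reconcile with “no effect at all.” The size of the effect is the 78% apoptosis figure itself.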

PMID: 37837472

Finish reading: https://pubmed.ncbi.nlm.nih.gov/37837472/

Astrid Lindgren

Sweden, 1941. A mother sits beside her daughter’s bed. The girl is burning with fever, slipping in and out of delirium. “Tell me a story,” she whispers.
“About what?” the mother asks.
“Tell me about Pippi Longstocking.”
Astrid Lindgren had absolutely no idea what that meant. Her daughter Karin had just invented a name out of thin air. But Astrid started talking anyway—making it up as she went.
She described a girl with bright red pigtails and mismatched stockings. A girl so strong she could lift a horse. A girl who lived alone in a house called Villa Villekulla with a monkey and a horse, with no parents to tell her what to do. A girl who ate candy for breakfast, slept with her feet on the pillow, and told adults “no” whenever she felt like it.
Karin loved her. Astrid kept inventing more Pippi stories every time her daughter asked.
A few years later, Astrid slipped on ice and injured her ankle. Bedridden and bored, she decided to write down all the Pippi stories as a birthday present for Karin. Then she thought: maybe I should try to publish this.
Publishers rejected it immediately.
The character was too wild. Too disrespectful. Too inappropriate. This was 1944 Sweden, where children’s books were about obedient boys and girls learning moral lessons. Pippi Longstocking was pure chaos—a child living without adult supervision, lying when it suited her, defying teachers, physically throwing policemen out of windows, refusing to go to school or follow any rules.
Critics would later call the book dangerous, warning it would teach children to misbehave.
But in 1945, one publisher—Rabén & Sjögren—took a chance. They published Pippi Longstocking.
Children went absolutely wild for it.
Finally, here was a character who represented everything they weren’t allowed to be. Loud. Messy. Free. Independent. Pippi had adventures on her own terms, made her own decisions, and treated adults as equals rather than authorities to be feared.
Some adults were horrified. But other adults—and millions of children—saw something revolutionary: a story that treated children as intelligent, capable people deserving of respect and autonomy.
Astrid kept writing. She created Karlsson-on-the-Roof, Emil of Lönneberga, Ronya the Robber’s Daughter. All of her characters questioned authority, trusted their own judgment, and had rich emotional lives. Astrid never wrote down to children. She didn’t simplify their feelings or pretend life was always happy. Her books dealt with loneliness, fear, injustice, even death—but always with respect for children’s ability to understand complex emotions.
Her books began reshaping how Swedish culture understood childhood itself.
By the 1970s, Astrid Lindgren wasn’t just Sweden’s most beloved children’s author—she was a cultural icon with real political power.
In 1976, she wrote a satirical fairy tale called “Pomperipossa in Monismania,” published in Expressen, then Sweden’s largest newspaper. It mocked the country’s absurd tax system using humor, describing a children’s author being taxed at over 100% of her income.
The piece exploded into national conversation. It sparked fierce debate about tax policy. The Social Democratic government, which had ruled Sweden for over 40 years, lost the election shortly after—partly because of the tax debate Astrid’s satire had triggered.
She’d proven her voice could move mountains.
And she decided to use that power for something that mattered even more than taxes.
In the late 1970s, Astrid turned her full attention to a brutal reality that everyone in Sweden simply accepted as normal: hitting children was legal.
Parents spanked. Teachers used rulers and canes on students. It was called “discipline,” not abuse. It was how things had always been done.
Astrid Lindgren believed it was violence against the most defenseless people in society. And she believed it had to stop.
She began speaking everywhere—newspapers, television, public speeches, interviews. She wrote articles. She appeared on national programs. She used every ounce of her fame to argue one simple point: hitting children teaches them that violence is acceptable. Physical punishment doesn’t create better behavior—it creates fear, shame, and the lesson that might makes right.
Sweden listened to her.
In 1979, Sweden became the first country in the entire world to legally ban corporal punishment of children.
Parents could no longer legally hit their children. Teachers couldn’t use physical punishment in schools. The law didn’t criminalize parents, but it established an absolute principle: children have the right to protection from violence, even from their own parents.
It was revolutionary. No country had ever done this before.
And Astrid Lindgren’s advocacy was absolutely crucial to making it happen.
She didn’t stop there. She campaigned for animal rights, environmental protection, and humane treatment of farm animals. She used her platform to push Sweden toward becoming a more compassionate society—for children, for animals, for anyone vulnerable.
Astrid continued writing into her eighties. She published over 100 books translated into more than 100 languages. Pippi Longstocking became a global icon—a symbol of childhood independence and joy recognized on every continent.
When Astrid Lindgren died in 2002 at age 94, Sweden mourned her like a beloved national grandmother. The Swedish royal family attended her funeral. Thousands lined the streets. The ceremony was broadcast live across the nation.
But her real legacy was what she changed.
Sweden’s 1979 ban on corporal punishment influenced the entire world. Today, more than 60 countries have followed Sweden’s lead and outlawed hitting children. That number grows every year.
And countless millions of children grew up reading about Pippi, Emil, Ronya, and Karlsson—characters who showed them that being a child didn’t mean being powerless, voiceless, or less important than adults.
Think about what Astrid Lindgren actually accomplished.
She created Pippi Longstocking in 1941 to entertain her sick daughter. That girl with red pigtails and superhuman strength became one of the most recognized characters in children’s literature worldwide.
But Astrid’s real achievement was understanding that if you’re going to write stories where children have dignity, you have to fight to build a world where they actually do.
She wrote books that respected children. Then she helped create laws that protected them.
Sweden became the first country to write that respect into law.
Because one author believed children deserved better—and refused to stay quiet until the world agreed.
Astrid Lindgren proved that respecting children wasn’t just good storytelling. It was good policy. It was justice. It was necessary.
And it started with a feverish little girl asking her mother to tell her about a character with a funny name.
That’s how revolutions begin.

When the Vaccinated Body Becomes the Broadcast Tower: The Shedding Paradox


This explains why the unjabbed also need to detox the Spike Protein.

Story at a Glance

The paradigm shift: What we call “contagion” may not require pathogens at all. Cells under stress naturally broadcast molecular signals via extracellular vesicles—biological packets that can transfer information between organisms and create the illusion of infectious transmission.

The mRNA revolution: COVID-19 vaccines have transformed human cells into producers of spike-bearing exosomes that circulate for months, appear in all body fluids, and carry pharmacologically induced signals throughout the population. This is biological broadcasting at an unprecedented scale.

The amplification crisis: Self-amplifying RNA vaccines now multiply this process exponentially, creating replicating genetic instructions that generate vast quantities of synthetic biological signals—potentially turning each injection into a self-perpetuating broadcast system.

The regulatory void: No authority has investigated whether these vesicles influence unvaccinated individuals, despite widespread reports of symptoms following intimate exposure. We have deployed a global biotechnology without understanding its most basic consequence: whether it alters biological communication between humans.

The central revelation: Billions of people may now be involuntary broadcasters of pharmaceutical signals, fundamentally changing the biological information environment of our species.

Finish reading: https://sayerji.substack.com/p/when-the-vaccinated-body-becomes

Here are my two offerings to potentially put a body on the road to recovery:
https://www.healthelicious.com.au/NutriBlast-Anti-Spike.html

https://www.healthelicious.com.au/NutriBlast_DNA_Heart_Mitochondria.html

Neil Diamond

He walked away from a pre-med degree for a $50-a-week songwriting job to chase an impossible dream, and wrote the song that would make stadiums sing for 60 years.

Brooklyn, 1960.

Neil Diamond sat in his NYU dorm room, supposedly studying for his pre-med finals. His parents—humble Jewish immigrants who’d sacrificed everything—were counting on him to become a doctor. Security. Stability. The American Dream.

But Neil couldn’t focus on anatomy textbooks. His mind kept drifting to the melody he’d been humming all week. His fingers kept reaching for his guitar instead of his stethoscope.

That night, he made a choice that terrified him.

He dropped out of NYU, months short of his pre-med degree. Walked away from the scholarship. Left behind his parents’ dreams of medical school and his own guaranteed future.

For what? A job writing songs at Sunbeam Music Publishing for $50 a week.

His parents were devastated. His friends thought he was crazy. He had no backup plan, no connections, no certainty that he’d ever make it.

For six years, he lived on hope and stubbornness. Writing songs nobody wanted. Playing gigs nobody attended. Wondering if he’d made the biggest mistake of his life.

Then 1966 happened.

A song he’d written—“I’m a Believer”—became one of the biggest hits of the decade. Not for him, but for The Monkees. Suddenly, the kid from Brooklyn who’d gambled everything was being played on every radio in America.

But Neil wasn’t done.

He wanted people to hear HIS voice telling HIS stories. So he kept writing. “Solitary Man.” “Cherry, Cherry.” “Cracklin’ Rosie.”

And then, in 1969, he wrote eight simple words that would become bigger than he ever imagined:

“Sweet Caroline… good times never seemed so good.”

Nobody knows for certain who Caroline really was. Some say Caroline Kennedy. Others say it was about his wife. Neil himself has changed the story over the years, almost like he knew the song needed to belong to everyone, not just to him.

Because that’s exactly what happened.

“Sweet Caroline” became the song couples slow-danced to at weddings. The song crowds screamed at baseball games. The song that brought together complete strangers in bars, concert halls, and living rooms across the world.

For over five decades, Neil Diamond gave us the soundtrack to our lives. More than 130 million records sold. A legacy that touched four generations.

In 2018, his voice began to fail him. Parkinson’s disease forced him off the touring stage—the place where he’d felt most alive for 50 years.

He could have disappeared quietly. Retired in peace.

Instead, he keeps writing. Keeps creating. Keeps proving that the fire that made a 20-year-old drop out of college never really goes out.

The kid who risked everything on a dream didn’t just make it.

He made us all believe that impossible dreams are worth chasing.

Because sometimes, the biggest risk isn’t following your heart.

It’s spending your whole life wondering what would’ve happened if you had.

Brigadier General Theodore Roosevelt Jr.

June 6, 1944.

As the landing craft approached Utah Beach, Brigadier General Theodore Roosevelt Jr. gripped his cane and checked his pistol.

He was fifty-six years old. His heart was failing. Arthritis had crippled his joints from old World War I wounds. Every step hurt.

He wasn’t supposed to be there.

But he had insisted—three times—on going ashore with the first wave of troops. His commanding officer, Major General Raymond “Tubby” Barton, had rejected the request twice. Too dangerous. Too risky. No place for a general.

Roosevelt wrote a letter. Seven bullet points. The last one: “I personally know both officers and men of these advance units and believe that it will steady them to know that I am with them.”

Barton relented.

And so Theodore Roosevelt Jr.—eldest son of President Theodore Roosevelt, veteran of World War I, twice wounded, gassed nearly to blindness—became the only general officer to storm the beaches of Normandy in the first wave.

This wasn’t ancient history. This was June 6, 1944.

The ramp dropped. German guns opened fire. Bullets slapped the water. Artillery shells screamed overhead. Men scrambled onto the sand, some falling before they took three steps.

Roosevelt stepped off the boat, leaning on his cane, carrying only a .45 caliber pistol.

One of his men later recalled: “General Theodore Roosevelt was standing there waving his cane and giving out instructions as only he could do. If we were afraid of the enemy, we were more afraid of him and could not have stopped on the beach had we wanted to.”

Within minutes, Roosevelt realized something was wrong.

The strong tidal currents had pushed the landing craft off course. They’d landed nearly a mile south of their target. The wrong beach. The wrong exits. The whole invasion plan suddenly useless.

Men looked around in confusion. Officers checked maps. The Germans kept firing.

This was the moment that could turn the invasion into a massacre.

Roosevelt calmly surveyed the shoreline. Studied the terrain. Made a decision.

Then he gave one of the most famous orders in D-Day history:

“We’ll start the war from right here!”

For the next four hours, Theodore Roosevelt Jr. stood on that beach under relentless enemy fire, reorganizing units as they came ashore, directing tanks, pointing regiments toward their new objectives. His cane tapping in the sand. His voice steady. His presence unshakable.

A mortar shell landed near him. He looked annoyed. Brushed the sand off his uniform. Kept moving.

Another soldier described seeing him “with a cane in one hand, a map in the other, walking around as if he was looking over some real estate.”

He limped back and forth to the landing craft—back and forth, back and forth—personally greeting each arriving unit, making sure the men kept moving off the beach and inland. The Germans couldn’t figure out what this limping officer with the cane was doing. Neither could they hit him.

By nightfall, Utah Beach was secure. Of the five D-Day landing beaches, Utah had the fewest casualties—fewer than 200, compared with more than 2,000 at Omaha Beach just miles away.

Commanders credited Roosevelt’s leadership under fire for the success.

Theodore Roosevelt Jr. had been preparing for that day his entire life.

Born September 13, 1887, at the family estate in Oyster Bay, New York, he was the eldest son of Theodore Roosevelt—the larger-than-life president, war hero, and force of nature. Growing up in that shadow was impossible. Meeting that standard seemed even harder.

But Ted tried.

In World War I, he’d been among the first American soldiers to reach France. He fought at the Battle of Cantigny. Got gassed. Got shot. Led his men with such dedication that he bought every soldier in his battalion new combat boots with his own money. He was promoted to lieutenant colonel and awarded the Distinguished Service Cross.

Then, in July 1918, his youngest brother Quentin—a pilot—was shot down and killed over France.

Ted never fully recovered from that loss.

When World War II began, Theodore Roosevelt Jr. was in his fifties. Broken down. Worn out. He could have stayed home. Taken a desk job. No one would have blamed him.

Instead, he fought his way back into combat command. He led troops in North Africa. Sicily. Italy. Four amphibious assaults before Normandy.

And on D-Day, when commanders tried to keep him off that beach, he refused.

“The first men to hit the beach should see the general right there with them.”

After Utah Beach, General Omar Bradley—who commanded all American ground forces in Normandy—called Roosevelt’s actions “the bravest thing I ever saw.”

General George Patton agreed. Days later, Patton wrote to his wife: “He was one of the bravest men I ever knew.”

On July 11, 1944—thirty-five days after D-Day—General Eisenhower approved Roosevelt’s promotion to major general and gave him command of the 90th Infantry Division.

Roosevelt never got the news.

That same day, he spent hours talking with his son, Captain Quentin Roosevelt II, who had also landed in Normandy on D-Day—the only father and son to come ashore on June 6, 1944.

Around 10:00 p.m., Roosevelt was stricken with chest pains.

Medical help arrived. But his heart had taken all it could take.

At midnight on July 12, 1944—five weeks after leading men onto Utah Beach—Theodore Roosevelt Jr. died in his sleep.

He was fifty-six years old.

Generals Bradley, Patton, and Barton served as honorary pallbearers. Roosevelt was initially buried at Sainte-Mère-Église.

In September 1944, he was posthumously awarded the Medal of Honor. When President Roosevelt handed the medal to Ted’s widow, Eleanor, he said, “His father would have been proudest.”

After the war, Roosevelt’s body was moved to the Normandy American Cemetery at Colleville-sur-Mer—the rows of white crosses overlooking Omaha Beach.

And there’s where the story takes its final, heartbreaking turn.

In 1955, the family made a request: Could Quentin Roosevelt—Ted’s younger brother, killed in World War I, buried in France since 1918—be moved to rest beside his brother?

Permission was granted.

Quentin’s remains were exhumed from Chamery, where he’d been buried near the spot his plane crashed thirty-seven years earlier, and reinterred beside Ted.

Two sons of a president. Two brothers. Two wars. Reunited in foreign soil.

Quentin remains the only World War I soldier buried in that World War II cemetery.

Today, at the Normandy American Cemetery, among the 9,388 white marble crosses and Stars of David, two headstones stand side by side:

THEODORE ROOSEVELT JR.
BRIGADIER GENERAL
MEDAL OF HONOR

QUENTIN ROOSEVELT
SECOND LIEUTENANT
WORLD WAR I

The tide still rolls over Utah Beach. The sand looks the same. Tourists walk where soldiers died.

And somewhere in that vast field of white crosses, two brothers rest together—sons of a president who believed in duty, service, and leading from the front.

Some men lead by orders.

Some lead by rank.

Theodore Roosevelt Jr. led by example—cane in hand, heart failing, utterly unflinching.

He didn’t have to be there.

But he refused to lead from anywhere else.