Jordan Lark On Our Bushfires

It’s devastating to see the environmental destruction we’ve once again faced. Homes lost. Burnt or swallowed. Lives changed. Wildlife and livestock decimated.

I was reading a post from a volunteer who’d been on the land for decades, talking about what’s changed and why it now feels so out of control.

Back then:

• farmers burned firebreaks

• locals cleared fuel

• volunteers mobilised

• no one waited for permission

Then the government stepped in with regulatory creep, bans, restrictions, competing environmental directives, endless “stakeholder considerations” and bureaucratic paralysis. Suddenly burning a roadside or clearing fuel loads required more paperwork than running a company. The volunteer summed it up simply: “We didn’t get stupider – we got forbidden.”

Now we’re going to spend billions trying to keep people afloat after entire communities get smashed. And let me be clear: that’s not the issue. That’s an Australian value. We look after our own. We do that without hesitation and without whining. Good. Mateship.

What’s fucked is that we already know that cheque is coming every damn year… yet we refuse to spend smart money on the front end to stop the destruction in the first place. It’s insanity. It makes me fucking wild.

We know fire season is coming.

We know what burns.

We know flood season will do the opposite but end the same – homes swallowed, lives upended, communities knocked flat.

It isn’t random. It’s patterned.

The Productivity Commission has been saying it for years: we spend many times more on disaster recovery than on mitigation – its 2014 inquiry put it at roughly 97 per cent on recovery and barely 3 per cent on prevention. Royal commissions after Black Saturday and the 2019–20 fires have said the same thing: reduce fuel loads, maintain firebreaks, invest in volunteers, update infrastructure. Governments nod, pose for photos… then quietly move on.

Meanwhile, watch parliament for five minutes and you’ll see why nothing gets done. Ask a straight question and they don’t answer it, they fucking argue about the words surrounding the question. They twist language, redefine terms, hide behind procedural bullshit and burn half the sitting day pretending that “debating” is the same as governing. It’s bureaucratic cardio: lots of movement, no progress. Maximum energy, minimum output.

This is what happens when optics replace competence.

And it’s not just fires and floods.

We have a housing crisis. We have too many people slipping through the cracks. Not a vibe crisis. Not a discourse crisis.

A structural, material, immediate crisis.

Caused by a stack of real-world policy failures:

• land banking

• overseas investment distortions

• infrastructure lag

• regulatory paralysis

• no planning alignment between federal, state and local

• and yes – immigration policy

Criticising immigration policy isn’t the same as criticising migrants. That’s the childish frame the political class pushes so adults can’t have adult conversations. The issue is scale, sequencing, velocity and carrying capacity. You cannot pour more people into a pipeline that is already bursting and call it compassion.

And if your instinct here is to pick that apart, answer me one question:

How does increasing demand on an already strangled housing stock help the people currently sleeping in cars?

Next time you’re at the shops, actually look at the cars in the car park. Blankets. Storage tubs. Clothes. People are quietly living out of vehicles at a rate nobody in Canberra wants to talk about.

Where is the energy for defending our own?

We don’t lack capacity. We put rockets in space and land them on floating platforms. We can fix this country. It’s a resource allocation problem. Every “complex” problem in Australia is just a resource allocation problem. And our resources are hijacked for political optics, not national wellbeing.

Leadership should be judged on outcomes, not vibes.

Climate obviously changes. We can argue human contribution until the cows come home, but it’s irrelevant to the immediate point: we spend all our energy on the uncontrollables and zero on the controllables. We lecture the weather while neglecting the country.

Australia doesn’t burn every year because the climate fairy got angry. It burns because the people in charge would rather manage narratives than manage the land.

And while they’re busy spinning narratives, they somehow have endless time and money to:

• police social media posts

• write new speech laws

• cry about AI-generated pictures of politicians as if the country will collapse if someone gets memed (put your best Albo-in-a-bikini meme in the comments).

All that energy for performative outrage. Almost none for clearing fuel, fixing drainage, aligning planning, or building enough houses for the people already here.

They get away with this because we, as a collective, spend all our energy on each other. Left vs right. City vs country. Native-born vs migrant. Climate vs denial. While we’re busy tearing strips off each other, the basics quietly rot.

At some point we need to hit pause and admit: there’s only so much we can do about the rest of the world right now. We need to reinvest in ourselves for a moment. Clean house. Patch the framework. Take care of our own country.

Because until we stop being domesticated spectators and start demanding competence, this pattern will repeat.

The destruction is not accidental. It is predictable.

And predictability without prevention isn’t bad luck.

It’s policy by negligence.

I appreciate you reading my thoughts.

Grace Groner

She bought $180 worth of stock during the Great Depression—and never touched it for 75 years.

In 1935, Grace Groner made a decision that looked insignificant at the time. She was working as a secretary at Abbott Laboratories, earning a modest income in a world still reeling from economic collapse. Women were rarely encouraged to build wealth. Financial independence seemed like a luxury reserved for men with means.

That year, Grace bought three shares of Abbott Laboratories stock for sixty dollars each. One hundred eighty dollars total.

Then she did something radical for the era. She held them.

Grace never chased trends. She never sold during panics. She never tried to time the market. She simply reinvested every dividend the company paid and trusted time to do what individual effort could not.

While markets crashed in the years that followed, she held. While World War II erupted and the economy shifted to wartime production, she held. While the Cold War raised fears and recessions came and went, she held. While other investors panicked and sold, she stayed still.

Her life remained simple in a way that seemed almost stubborn to those around her. She lived in a small one-bedroom cottage that had been willed to her. She bought her clothes at rummage sales. After her car was stolen, she never bought another one—she just walked everywhere instead, even into old age with a walker in hand.

She carried the mindset of someone who had lived through scarcity and never forgot it. The Great Depression had taught her that security came from living below your means, not above them.

Her wealth grew quietly in the background while her lifestyle never changed. Nobody suspected. Not her neighbors. Not her colleagues at Abbott, where she worked for 43 years before retiring in 1974. Not even most of her friends.

The stock split. The shares multiplied. The dividends compounded. Year after year, decade after decade, that initial $180 investment transformed into something extraordinary—but Grace lived as though it didn’t exist.

She volunteered at the First Presbyterian Church. She donated anonymously to those in need. She attended Lake Forest College football games and stayed connected to the school that had educated her decades earlier. She traveled after retirement, experiencing the world while still maintaining her frugal habits.

In 2008, at age 99, Grace quietly established a foundation. She never told anyone what it would contain.

When Grace died on January 19, 2010, at age one hundred, her attorney opened her will. That’s when everyone discovered the truth.

Her original one hundred eighty dollars—three shares of Abbott Laboratories purchased 75 years earlier—had grown into more than seven million dollars.

The people who knew her were stunned. “Oh, my God,” exclaimed the president of Lake Forest College when he learned the amount.

Grace, the woman who walked everywhere and bought secondhand clothes, who lived in a tiny cottage and volunteered her time quietly, had been a multimillionaire the entire time. She just chose to live as though she wasn’t.

And she didn’t spend that fortune on herself in the end.

She left nearly all of it to the Grace Elizabeth Groner Foundation—created to fund service-learning opportunities, internships, international study, and community service projects for Lake Forest College students. The same college that had educated her in 1931, paid for by a kind family who took her in after she was orphaned at age 12.

Grace had never forgotten that gift of education. Now she was paying it forward, making it possible for students who needed opportunity the way she once had.

The foundation her estate created would generate hundreds of thousands of dollars annually in dividend income—money that would change countless lives for generations. Students who otherwise couldn’t afford to study abroad or take unpaid internships would now have that chance because of three shares of stock a secretary bought during the Depression.

Her cottage—the small one-bedroom home where she’d lived so simply—was renovated by the foundation and is now home to two female Lake Forest College seniors each year, living there as Grace’s guests.

Grace Groner proved something that challenges every assumption we make about building wealth.

She proved that you don’t need a high income to become wealthy. She proved that you don’t need to be born with privilege or connections. She proved that you don’t need perfect timing or insider knowledge or lucky breaks.

Sometimes wealth comes from something much simpler: patience, discipline, and the belief that your future is worth investing in, even when the first step looks small.

Three shares of stock. One hundred eighty dollars. Seventy-five years of not selling.

That’s all it took.

But it wasn’t really about the stock, was it? It was about understanding something most people never grasp: that compounding requires time more than money. That the most powerful investment strategy isn’t activity—it’s stillness. That true wealth comes not from what you earn but from what you keep and let grow.
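
It’s worth pausing on the arithmetic. Here is a minimal sketch, assuming only the story’s own round numbers ($180 in 1935, about $7 million 75 years later – the exact rate depends on details the story doesn’t give, like dividend timing), of the compound annual growth rate that journey implies:

```python
# A rough check, not brokerage math: the compound annual growth rate (CAGR)
# implied by the story's round numbers.
initial = 180.0       # three Abbott shares at $60 each, 1935
final = 7_000_000.0   # approximate value reported at her death, 2010
years = 75            # holding period

# CAGR: the constant yearly rate that turns `initial` into `final`.
cagr = (final / initial) ** (1 / years) - 1
print(f"Implied annual growth rate: {cagr:.1%}")  # prints ~15.1%
```

Roughly 15 percent a year, with dividends reinvested, for three-quarters of a century. No single year was spectacular; the holding period did the heavy lifting.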

Grace worked as a secretary her entire career. She never became an executive. She never got rich from her salary. She never inherited a fortune or won the lottery or built a business empire.

She just bought three shares of a good company and never sold them.

While everyone else was chasing the next hot stock, the next quick profit, the next get-rich scheme, Grace was doing nothing. And in investing, sometimes doing nothing is the most powerful thing you can do.

Her story forces us to confront uncomfortable truths. How many people earn far more than Grace did but will die with far less? How many chase returns instead of letting returns come to them? How many mistake activity for progress?

Grace Groner sat still for 75 years while the world spun around her. She lived modestly while wealth accumulated quietly in the background. She died having touched more lives than most millionaires ever will—not because of what she spent, but because of what she saved and gave away.

Her foundation estimates the opportunities it provides students are worth around $300,000 annually. All from $180 invested in 1935 by a secretary who understood something profound about time, patience, and the power of never quitting.

The next time someone tells you it’s impossible to build wealth without advantages, remember Grace Groner. Remember the woman who bought three shares during the Depression and held them until she was 100.

Remember that sometimes the most radical thing you can do is make a small decision and trust it long enough to prove everyone wrong.

Shari Lewis

She found out her show was cancelled by overhearing executives on an elevator—they didn’t even know she was standing behind them.

In 1963, Shari Lewis was one of the most talented performers in television.

She could sing. She could dance. She could conduct a symphony orchestra. She could perform ventriloquism so precisely that audiences forgot they were watching a woman with a sock on her hand.

She had trained at the American School of Ballet. Studied acting with Sanford Meisner. Learned piano at two years old. Won a Peabody Award. Hosted a show that ran live television without error—week after week, year after year.

And NBC executives decided she was replaceable.

They cancelled The Shari Lewis Show to make room for cartoons.

She wasn’t told directly. She learned about it while standing in an elevator, listening to men in suits discuss the decision as if she wasn’t there.

“All of it… my entire field crashed around my ears,” she said later.

The industry had made its position clear: children’s television was filler. If the audience was young, the work didn’t count. And the woman who created that work? She was a novelty. A mascot. Not an artist.

But here’s what they got wrong about Shari Lewis:

She didn’t need their permission.

When American networks abandoned live children’s programming, Lewis moved to England and hosted a show on the BBC for eight years. When that work dried up, she performed in Las Vegas casinos and in touring companies of Broadway shows, and appeared on variety programs with Ed Sullivan and Johnny Carson.

When those opportunities faded, she reinvented herself again.

She became one of the few female symphony conductors in the world—performing with over 100 orchestras, including the national symphonies of the United States, Canada, and Japan. She learned to speak Japanese for her performances there.

Once, she walked onto a stage at a state fair and found only four people in the audience.

She did the show anyway.

That was who Shari Lewis was.

Not a puppet act. Not a children’s entertainer waiting for permission. A performer who controlled timing, voice, pacing, and audience attention with surgical precision—and refused to stop working just because an industry decided she wasn’t serious.

Then, nearly 30 years after that elevator conversation, PBS came calling.

In 1992, at 59 years old, Lewis launched Lamb Chop’s Play-Along.

The show won five consecutive Emmy Awards. It was the first children’s program in seven years to beat Sesame Street for a writing Emmy. A new generation of children fell in love with Lamb Chop, Charlie Horse, and Hush Puppy—the same characters executives had declared “outdated” three decades earlier.

The audience hadn’t moved on. The industry had simply stopped paying attention.

Lewis didn’t treat this as a comeback. She treated it as what it always was: a correction.

She testified before Congress in 1993 to advocate for children’s television. Lamb Chop was granted special permission to speak. When elementary schools started cutting music programs, Lewis created The Charlie Horse Music Pizza to teach children about music through entertainment.

She was still innovating. Still refusing to be small.

In June 1998, Lewis was diagnosed with uterine cancer and given six weeks to live. She was in the middle of taping new episodes.

She finished them anyway.

Her final performance was a song called “Hello, Goodbye.” Her crew held back tears as she sang. She was saying goodbye to them, to the children watching, and to the character who had been her partner for over 40 years.

Shari Lewis died on August 2, 1998. She was 65 years old.

The industry remembered her fondly. It always does when it’s too late.

But her work didn’t need their remembrance. It endured on its own terms—passed down from parents to children, from one generation to the next, because the audience always knew what the executives never understood:

Precision is not small just because it serves children.

Craft is not diminished by joy.

And the woman who made a sock puppet come alive was never the novelty.

She was the reason it worked at all.

Doug Engelbart

The wooden box had two metallic wheels.

It looked like a toy, or perhaps a piece of scrap assembled in a garage, but the man holding it believed it was the key to the human mind.

It was December 9, 1968.

In the cavernous Brooks Hall in San Francisco, more than a thousand of the world’s top computer scientists sat in folding chairs, waiting. They were used to the roar of air conditioning units cooling massive mainframes. They were used to the smell of ozone and the stack of stiff paper punch cards that defined their working lives.

They were not used to Douglas Engelbart.

He sat alone on the stage, wearing a headset that looked like it belonged to a pilot, staring at a screen that flickered with a ghostly green light. Behind the scenes, a team of engineers held their breath, praying that the delicate web of wires and microwave signals they had cobbled together would hold for just ninety minutes.

If it worked, it would change how humanity thought.

If it failed, Douglas Engelbart would simply be the man who wasted millions of taxpayer dollars on a fantasy.

The world of 1968 was analog.

Information lived on paper. If you wanted to change a paragraph in a report, you retyped the entire page. If you wanted to send a document to a colleague in another city, you put it in an envelope and waited three days. If you wanted to calculate a trajectory, you gave a stack of cards to an operator, who fed them into a machine the size of a room, and you came back the next day for the results.

Computers were calculators. They were powerful, loud, and distant. They were owned by institutions, guarded by specialists, and kept behind glass walls. The idea that a single person would sit in front of a screen and “interact” with a computer in real-time was not just technically difficult; it was culturally absurd.

Engelbart, a soft-spoken engineer from Oregon, saw it differently.

He had grown up in the Depression, fixing water pumps and electrical lines. He understood tools. He believed that the problems facing humanity—war, poverty, disease—were becoming too complex for the unassisted human brain to solve. We needed better tools. We needed to “augment human intellect.”

For years, he had run a lab at the Stanford Research Institute (SRI). While others focused on making computers faster at math, Engelbart’s team focused on making them responsive. They built systems that allowed a user to point, click, and see results instantly.

They called their system NLS, or the “oN-Line System.”

It was a radical departure from the status quo. To the establishment, computing was serious business involving batch processing and efficiency. Engelbart was talking about “manipulating symbols” and “collaboration.”

The pressure on Engelbart was immense.

The funding for his Augmentation Research Center came from ARPA (the Advanced Research Projects Agency), the same government body responsible for military technology. They had poured significant resources into his vision, but results were hard to quantify. There were no enemy codes broken, no missile trajectories calculated. Just a group of men in California moving text around on a screen.

The critics were loud. They called him a dreamer. They said his ideas were “pie in the sky.” Why would anyone need to see a document on a screen when typewriters worked perfectly fine? Why would anyone need to point at data?

This presentation was his answer.

It was an all-or-nothing gamble.

To make the demonstration work, Engelbart wasn’t just using a computer on the stage. The machine itself—an SDS 940 mainframe—was thirty miles away in Menlo Park. He was controlling it remotely.

His team had leased two video lines from the telephone company, a massive expense and logistical nightmare. They had set up microwave transmitters on the roof of the civic center and on a truck parked on a ridge line to relay the signal.

In 1968, sending a video signal and a data signal simultaneously over thirty miles to a live audience was the equivalent of a moon landing.

The computer industry was built on a specific, rigid logic.

Computing Logic: Computers are scarce, expensive resources. Human time is cheap; computer time is expensive. Therefore, humans must prepare work offline (punch cards) to maximize the machine’s efficiency. Interactive computing wastes the machine’s time.

This logic governed the industry. It was why IBM was a titan. It was why office workers sat in rows with typewriters. It was the “correct” way to do things.

It worked perfectly—until it met Douglas Engelbart.

Engelbart believed that human time was the precious resource, not the machine’s. He believed the machine should serve the mind, even if it was “inefficient” for the hardware.

As the lights went down in Brooks Hall, the hum of the crowd faded.

Engelbart looked small on the big stage. The screen behind him, a massive projection of his small monitor, glowed into life.

He spoke into his microphone, his voice steady but quiet.

“If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsive to every action you had, how much value could you derive from that?”

It was a question nobody had ever asked.

He moved his right hand.

On the massive screen, a small dot moved.

The audience froze.

He wasn’t typing coordinates. He wasn’t entering a command code. He was simply moving his hand, and the digital ghost on the screen followed him. He was using the wooden box with the wheels—the device his team had nicknamed “the mouse” because the cord looked like a tail.

Today, a cursor moving on a screen is as natural as breathing. In 1968, it was magic.

But he didn’t stop there.

He clicked on a word. It was highlighted.

He deleted it. It vanished.

The text around it snapped shut to fill the gap.

A murmur ran through the hall. He wasn’t rewriting the page. He was manipulating information as if it were a physical object, yet it was made of light.

He showed them a “grocery list.” He categorized items. He collapsed the list so only the headers showed, then expanded it again to show the details.

He called this “view control.” We call it windowing.

He showed them a map. He clicked a link, and the screen jumped to a detailed diagram of a component. He clicked back, and he was at the map again.

He called it “hypermedia.” We call it the web.

The demonstration continued, each minute adding a new impossibility to the list.

The tension in the control room was suffocating. Every second the system stayed online was a victory against the laws of probability. A single blown fuse, a misaligned microwave dish, a software bug—any of it would have turned the screen black and ended the dream.

Then came the moment that truly broke the room.

Engelbart introduced a colleague, Bill Paxton.

Paxton wasn’t on stage. He was thirty miles away, sitting in the lab at SRI.

His face appeared in a window on the screen, crisp and clear.

The audience gasped.

They were looking at a man in Menlo Park, while listening to a man in San Francisco, both looking at the same document on the same screen.

“Okay, Bill,” Engelbart said. “Let’s work on this together.”

On the screen, two cursors appeared. One controlled by Engelbart, one by Paxton.

They edited the text together. Engelbart would point to a sentence, and Paxton would paste it into a new location. They were collaborating, in real-time, across a distance, using a shared digital workspace.

It was Google Docs, Zoom, and Slack, demonstrated a year before the internet (ARPANET) even existed.

The audience, composed of the smartest engineers in the world, sat in stunned silence. They were watching science fiction become a documentary.

They weren’t just seeing new gadgets. They were seeing the destruction of their entire worldview. The idea of the solitary computer operator was dead. The idea of the computer as a mere calculator was dead.

Engelbart was showing them a window into a world where minds could connect through machines.

He typed, he clicked, he spoke. He operated a “chorded keyset” with his left hand, entering commands as fast as a pianist, while his right hand flew across the desk with the mouse. He was a conductor of information.

For ninety minutes, the system held.

The microwave links stayed true. The software didn’t crash. The mainframe thirty miles away processed every command.

When Engelbart finally took off the headset and the screen went dark, there was a pause.

A hesitation.

Then, the audience stood.

It wasn’t a polite golf clap. It was a roar. It was the sound of a thousand experts realizing that everything they knew about their field had just become obsolete.

They rushed the stage. They wanted to touch the mouse. They wanted to see the keyset. They wanted to know how he did it.

The “Mother of All Demos,” as it was later christened, did not immediately change the market. Engelbart did not become a billionaire. He was a researcher, not a salesman. His system was too expensive and too complex for the 1970s.

But the seeds were planted.

Sitting in the audience were the young engineers who would go on to work at Xerox PARC. They would take the mouse, the windows, and the graphical interface, and they would refine them.

Steve Jobs would visit Xerox PARC a decade later, see the descendants of Engelbart’s mouse, and use them to build the Macintosh.

Bill Gates would see it and build Windows.

Tim Berners-Lee would use the concept of hypermedia to build the World Wide Web.

Every smartphone in a pocket, every laptop in a cafe, every video call made to a loved one across the ocean—it all traces back to that ninety-minute window in 1968.

Douglas Engelbart died in 2013. He never sought fame. He watched as the world caught up to the vision he had seen clearly half a century before.

He proved that the pressure of the status quo—the belief that “this is how it’s always been done”—is brittle. It can be broken by a single person with a wooden box and the courage to show us what is possible.

The system said computers were for numbers.

He showed us they were for people.

Sources: Detailed in “The Mother of All Demos” archives (SRI International). Smithsonian Magazine, “The 1968 Demo That Changed Computing.” New York Times obituary for Douglas Engelbart, 2013. Summary of events from the Doug Engelbart Institute records.

New research explains why some minds stay awake at night

  • Insomnia keeps your mind in daytime problem-solving mode at night, which prevents the natural mental drift that helps you fall asleep
  • New research shows that people with insomnia have weaker circadian signals, making it harder for the brain to shift from alert thinking into the dream-like patterns that support rest
  • Sequential thinking stays elevated at night in insomniacs, creating racing thoughts and mental loops that make it difficult to unwind
  • Strengthening your circadian rhythm through morning light exposure, dim evening lighting, and consistent nighttime cues helps your brain recognize when to power down
  • Simple practices such as cognitive shuffling, sensory-based grounding during nighttime awakenings, and daily movement — especially walking — support a clearer day-night contrast and more restorative sleep

https://nexusnewsfeed.com/article/health-healing/new-research-explains-why-some-minds-stay-awake-at-night/

Brutal breakfast reality check – Oatmeal or Omelette?

Dr. Mark Hyman shares a Harvard study that will make you rethink breakfast:

Overweight teens ate the exact same calories in three different meals — instant oatmeal, steel-cut oats, or an omelette.

Result?

Instant oatmeal group: sky-high insulin, cortisol & adrenaline (like being chased by a tiger) – and they ate 81% more food later.

Steel-cut oats group: still ate 50% more than the omelette group.

Omelette group stayed satisfied longest.

Moral: Start your day with protein + fat — not starch or sugar (no muffins, bagels, oatmeal, pancakes…).

Who’s switching to eggs/bacon/avocado tomorrow?

https://x.com/newstart_2024/status/2009637477158371812?s=20

Millions of your mother’s cells persist inside you, and now we know how

Every human born on this planet is not entirely themselves.

A tiny fraction of our cells – around one in a million – is actually not our own, but comes from our mothers. Given that the human body contains tens of trillions of cells, that means each of us carries millions of cells our immune systems would normally recognize as foreign; yet somehow, in most of us, they hang around peacefully without causing any immune problems.

Now, immunologists have figured out why. A small number of maternal immune cells that cross the placenta during pregnancy actively train the fetus’s immune system to tolerate the mother’s cells for life.

The exchange of cells between a mother and a fetus is a well-documented phenomenon that scientists have known about for more than 50 years. It’s called microchimerism, and it goes both ways: every human who has ever been pregnant retains cells from their fetus, and every human retains cells from their mother.

These lingering cells pose a puzzle for immunology, which is built around the idea that the immune system should mount an attack against foreign cells.

A team led by pediatric infectious disease specialist Sing Sing Way of Cincinnati Children’s Hospital Medical Center wanted to understand more about how these foreign maternal cells keep the immune system in check, and what role they play in shaping the fetus’s immune system.

To find out, the researchers studied maternal microchimerism in mice. Building on their previous studies, they bred mice with immune cells engineered to express specific cell surface markers. This allowed them to selectively deplete those cells and see whether immune tolerance was maintained.

Here’s where it got fascinating. A small subset of the maternal immune cells, with properties similar to bone marrow myeloid cells and dendritic cells, persisted long after birth. They were also strongly associated with both immune activity and the expansion of regulatory T cells – the cells that tell the immune system that everything is copacetic.

To confirm, the researchers next selectively edited out those specific maternal cells in offspring mice.

The results were dramatic. The regulatory T cells disappeared, and with them the immune tolerance of maternal cells.

The implication is that lifelong tolerance to maternal microchimeric cells is probably dependent on just a tiny subset of maternal cells. Take those away, and immune chaos likely ensues. That also means that immune tolerance needs to be continuously and actively maintained; it’s not a one-and-done process during pregnancy.

That’s interesting and exciting in its own right, but the research also offers a way to gain a greater understanding of the broad swath of diseases and conditions to which microchimerism may contribute.

“The new tools we developed to study these cells will help scientists pinpoint exactly what these cells do and how they work in a variety of contexts including autoimmune disease, cancer and neurological disorders,” Way says.

“Microchimerism is increasingly linked with so many health disorders. This study provides an adaptable platform for scientists to investigate whether these rare cells are the cause of disease, or alternatively, found in diseased tissue at increased levels as part of the natural healing process.”

The research has been published in Immunity.

https://nexusnewsfeed.com/article/science-futures/millions-of-your-mother-s-cells-persist-inside-you-and-now-we-know-how/