Astrid Lindgren


Sweden, 1941. A mother sits beside her daughter’s bed. The girl is burning with fever, slipping in and out of delirium. “Tell me a story,” she whispers.
“About what?” the mother asks.
“Tell me about Pippi Longstocking.”
Astrid Lindgren had absolutely no idea what that meant. Her daughter Karin had just invented a name out of thin air. But Astrid started talking anyway—making it up as she went.
She described a girl with bright red pigtails and mismatched stockings. A girl so strong she could lift a horse. A girl who lived alone in a house called Villa Villekulla with a monkey and a horse, with no parents to tell her what to do. A girl who ate candy for breakfast, slept with her feet on the pillow, and told adults “no” whenever she felt like it.
Karin loved her. Astrid kept inventing more Pippi stories every time her daughter asked.
A few years later, Astrid slipped on ice and injured her ankle. Bedridden and bored, she decided to write down all the Pippi stories as a birthday present for Karin. Then she thought: maybe I should try to publish this.
Publishers rejected it immediately.
The character was too wild. Too disrespectful. Too inappropriate. This was 1944 Sweden, where children’s books were about obedient boys and girls learning moral lessons. Pippi Longstocking was pure chaos—a child living without adult supervision, lying when it suited her, defying teachers, physically throwing policemen out of windows, refusing to go to school or follow any rules.
Critics would later call the book dangerous, warning it would teach children to misbehave.
But in 1945, one publisher—Rabén & Sjögren—took a chance. They published Pippi Longstocking.
Children went absolutely wild for it.
Finally, here was a character who represented everything they weren’t allowed to be. Loud. Messy. Free. Independent. Pippi had adventures on her own terms, made her own decisions, and treated adults as equals rather than authorities to be feared.
Some adults were horrified. But other adults—and millions of children—saw something revolutionary: a story that treated children as intelligent, capable people deserving of respect and autonomy.
Astrid kept writing. She created Karlsson-on-the-Roof, Emil of Lönneberga, Ronya the Robber’s Daughter. All of her characters questioned authority, trusted their own judgment, and had rich emotional lives. Astrid never wrote down to children. She didn’t simplify their feelings or pretend life was always happy. Her books dealt with loneliness, fear, injustice, even death—but always with respect for children’s ability to understand complex emotions.
Her books began reshaping how Swedish culture understood childhood itself.
By the 1970s, Astrid Lindgren wasn’t just Sweden’s most beloved children’s author—she was a cultural icon with real political power.
In 1976, she published a satirical fairy tale called “Pomperipossa in Monismania” in Sweden’s largest newspaper. It used humor to mock the country’s absurd tax system, describing a children’s author taxed at a marginal rate of over 100% of her income.
The piece exploded into national conversation. It sparked fierce debate about tax policy. The Social Democratic government, which had ruled Sweden for over 40 years, lost the election shortly after—partly because of the tax debate Astrid’s satire had triggered.
She’d proven her voice could move mountains.
And she decided to use that power for something that mattered even more than taxes.
In the late 1970s, Astrid turned her full attention to a brutal reality that everyone in Sweden simply accepted as normal: hitting children was legal.
Parents spanked. Teachers used rulers and canes on students. It was called “discipline,” not abuse. It was how things had always been done.
Astrid Lindgren believed it was violence against the most defenseless people in society. And she believed it had to stop.
She began speaking everywhere—newspapers, television, public speeches, interviews. She wrote articles. She appeared on national programs. She used every ounce of her fame to argue one simple point: hitting children teaches them that violence is acceptable. Physical punishment doesn’t create better behavior—it creates fear, shame, and the lesson that might makes right.
Sweden listened to her.
In 1979, Sweden became the first country in the entire world to legally ban corporal punishment of children.
Parents could no longer legally hit their children. Teachers couldn’t use physical punishment in schools. The law didn’t criminalize parents, but it established an absolute principle: children have the right to protection from violence, even from their own parents.
It was revolutionary. No country had ever done this before.
And Astrid Lindgren’s advocacy was absolutely crucial to making it happen.
She didn’t stop there. She campaigned for animal rights, environmental protection, and humane treatment of farm animals. She used her platform to push Sweden toward becoming a more compassionate society—for children, for animals, for anyone vulnerable.
Astrid continued writing into her eighties. She published over 100 books translated into more than 100 languages. Pippi Longstocking became a global icon—a symbol of childhood independence and joy recognized on every continent.
When Astrid Lindgren died in 2002 at age 94, Sweden mourned her like a beloved national grandmother. The Swedish royal family attended her funeral. Thousands lined the streets. The ceremony was broadcast live across the nation.
But her real legacy was what she changed.
Sweden’s 1979 ban on corporal punishment influenced the entire world. Today, more than 60 countries have followed Sweden’s lead and outlawed hitting children. That number grows every year.
And countless millions of children grew up reading about Pippi, Emil, Ronya, and Karlsson—characters who showed them that being a child didn’t mean being powerless, voiceless, or less important than adults.
Think about what Astrid Lindgren actually accomplished.
She created Pippi Longstocking in 1941 to entertain her sick daughter. That girl with red pigtails and superhuman strength became one of the most recognized characters in children’s literature worldwide.
But Astrid’s real achievement was understanding that if you’re going to write stories where children have dignity, you have to fight to build a world where they actually do.
She wrote books that respected children. Then she helped create laws that protected them.
Sweden became the first country to write that respect into law.
Because one author believed children deserved better—and refused to stay quiet until the world agreed.
Astrid Lindgren proved that respecting children wasn’t just good storytelling. It was good policy. It was justice. It was necessary.
And it started with a feverish little girl asking her mother to tell her about a character with a funny name.
That’s how revolutions begin.

Neil Diamond


He walked away from a pre-med degree to chase an impossible dream on $50 a week—and wrote the song that has kept stadiums singing for more than half a century.

Brooklyn, 1960.

Neil Diamond sat in his NYU dorm room, supposedly studying for his pre-med finals. His parents—a humble Brooklyn Jewish family who’d sacrificed everything—were counting on him to become a doctor. Security. Stability. The American Dream.

But Neil couldn’t focus on anatomy textbooks. His mind kept drifting to the melody he’d been humming all week. His fingers kept reaching for his guitar instead of his stethoscope.

That night, he made a choice that terrified him.

He dropped out of NYU in his senior year. Walked away from the pre-med track and the scholarship. Left behind his parents’ dreams and his own guaranteed future.

For what? A job writing songs at Sunbeam Music Publishing for $50 a week.

His parents were devastated. His friends thought he was crazy. He had no backup plan, no connections, no certainty that he’d ever make it.

For six years, he lived on hope and stubbornness. Writing songs nobody wanted. Playing gigs nobody attended. Wondering if he’d made the biggest mistake of his life.

Then 1966 happened.

A song he’d written—”I’m a Believer”—became one of the biggest hits of the decade. Not for him, but for The Monkees. Suddenly, the kid from Brooklyn who’d gambled everything was being played on every radio in America.

But Neil wasn’t done.

He wanted people to hear HIS voice telling HIS stories. So he kept writing. “Solitary Man.” “Cherry, Cherry.” “Cracklin’ Rosie.”

And then, in 1969, he wrote eight simple words that would become bigger than he ever imagined:

“Sweet Caroline… good times never seemed so good.”

Nobody knows for certain who Caroline really was. Some say Caroline Kennedy. Others say it was about his wife. Neil himself has changed the story over the years, almost like he knew the song needed to belong to everyone, not just to him.

Because that’s exactly what happened.

“Sweet Caroline” became the song couples slow-danced to at weddings. The song crowds screamed at baseball games. The song that brought together complete strangers in bars, concert halls, and living rooms across the world.

For over five decades, Neil Diamond gave us the soundtrack to our lives. More than 130 million records sold. A legacy that touched four generations.

In 2018, his voice began to fail him. Parkinson’s disease forced him off the touring stage—the place where he’d felt most alive for 50 years.

He could have disappeared quietly. Retired in peace.

Instead, he keeps writing. Keeps creating. Keeps proving that the fire that made a 20-year-old walk away from pre-med never really goes out.

The kid who risked everything on a dream didn’t just make it.

He made us all believe that impossible dreams are worth chasing.

Because sometimes, the biggest risk isn’t following your heart.

It’s spending your whole life wondering what would’ve happened if you had.

Brigadier General Theodore Roosevelt Jr.


June 6, 1944.

As the landing craft approached Utah Beach, Brigadier General Theodore Roosevelt Jr. gripped his cane and checked his pistol.

He was fifty-six years old. His heart was failing. Arthritis had crippled his joints from old World War I wounds. Every step hurt.

He wasn’t supposed to be there.

But he had insisted—three times—on going ashore with the first wave of troops. His commanding officer, Major General Raymond “Tubby” Barton, had rejected the request twice. Too dangerous. Too risky. No place for a general.

Roosevelt wrote a letter. Seven bullet points. The last one: “I personally know both officers and men of these advance units and believe that it will steady them to know that I am with them.”

Barton relented.

And so Theodore Roosevelt Jr.—eldest son of President Theodore Roosevelt, veteran of World War I, twice wounded, gassed nearly to blindness—became the only general officer to storm the beaches of Normandy in the first wave.

This wasn’t ancient history. This was June 6, 1944.

The ramp dropped. German guns opened fire. Bullets slapped the water. Artillery shells screamed overhead. Men scrambled onto the sand, some falling before they took three steps.

Roosevelt stepped off the boat, leaning on his cane, carrying only a .45 caliber pistol.

One of his men later recalled: “General Theodore Roosevelt was standing there waving his cane and giving out instructions as only he could do. If we were afraid of the enemy, we were more afraid of him and could not have stopped on the beach had we wanted to.”

Within minutes, Roosevelt realized something was wrong.

The strong tidal currents had pushed the landing craft off course. They’d landed nearly a mile south of their target. The wrong beach. The wrong exits. The whole invasion plan suddenly useless.

Men looked around in confusion. Officers checked maps. The Germans kept firing.

This was the moment that could turn the invasion into a massacre.

Roosevelt calmly surveyed the shoreline. Studied the terrain. Made a decision.

Then he gave one of the most famous orders in D-Day history:

“We’ll start the war from right here!”

For the next four hours, Theodore Roosevelt Jr. stood on that beach under relentless enemy fire, reorganizing units as they came ashore, directing tanks, pointing regiments toward their new objectives. His cane tapping in the sand. His voice steady. His presence unshakable.

A mortar shell landed near him. He looked annoyed. Brushed the sand off his uniform. Kept moving.

Another soldier described seeing him “with a cane in one hand, a map in the other, walking around as if he was looking over some real estate.”

He limped back and forth to the landing craft—back and forth, back and forth—personally greeting each arriving unit, making sure the men kept moving off the beach and inland. The Germans couldn’t figure out what this limping officer with the cane was doing. Neither could they hit him.

By nightfall, Utah Beach was secure. Of the five D-Day landing beaches, Utah had the fewest casualties—fewer than 200 dead compared to over 2,000 at Omaha Beach just miles away.

Commanders credited Roosevelt’s leadership under fire for the success.

Theodore Roosevelt Jr. had been preparing for that day his entire life.

Born September 13, 1887, at the family estate in Oyster Bay, New York, he was the eldest son of Theodore Roosevelt—the larger-than-life president, war hero, and force of nature. Growing up in that shadow was impossible. Meeting that standard seemed even harder.

But Ted tried.

In World War I, he’d been among the first American soldiers to reach France. He fought at the Battle of Cantigny. Got gassed. Got shot. Led his men with such dedication that he bought every soldier in his battalion new combat boots with his own money. He was promoted to lieutenant colonel and awarded the Distinguished Service Cross.

Then, in July 1918, his youngest brother Quentin—a pilot—was shot down and killed over France.

Ted never fully recovered from that loss.

When World War II began, Theodore Roosevelt Jr. was in his fifties. Broken down. Worn out. He could have stayed home. Taken a desk job. No one would have blamed him.

Instead, he fought his way back into combat command. He led troops in North Africa. Sicily. Italy. Four amphibious assaults before Normandy.

And on D-Day, when commanders tried to keep him off that beach, he refused.

“The first men to hit the beach should see the general right there with them.”

After Utah Beach, General Omar Bradley—who commanded all American ground forces in Normandy—called Roosevelt’s actions “the bravest thing I ever saw.”

General George Patton agreed. Days later, Patton wrote to his wife: “He was one of the bravest men I ever knew.”

On July 11, 1944—thirty-five days after D-Day—General Eisenhower approved Roosevelt’s promotion to major general and gave him command of the 90th Infantry Division.

Roosevelt never got the news.

That same day, he spent hours talking with his son, Captain Quentin Roosevelt II, who had also landed in Normandy on D-Day—the only father and son to come ashore on June 6, 1944.

Around 10:00 p.m., Roosevelt was stricken with chest pains.

Medical help arrived. But his heart had taken all it could take.

At midnight on July 12, 1944—five weeks after leading men onto Utah Beach—Theodore Roosevelt Jr. died in his sleep.

He was fifty-six years old.

Generals Bradley, Patton, and Barton served as honorary pallbearers. Roosevelt was initially buried at Sainte-Mère-Église.

In September 1944, he was posthumously awarded the Medal of Honor. When President Roosevelt handed the medal to Ted’s widow, Eleanor, he said, “His father would have been proudest.”

After the war, Roosevelt’s body was moved to the Normandy American Cemetery at Colleville-sur-Mer—the rows of white crosses overlooking Omaha Beach.

And there’s where the story takes its final, heartbreaking turn.

In 1955, the family made a request: Could Quentin Roosevelt—Ted’s younger brother, killed in World War I, buried in France since 1918—be moved to rest beside his brother?

Permission was granted.

Quentin’s remains were exhumed from Chamery, where he’d been buried near the spot his plane crashed thirty-seven years earlier, and reinterred beside Ted.

Two sons of a president. Two brothers. Two wars. Reunited in foreign soil.

Quentin remains the only World War I soldier buried in that World War II cemetery.

Today, at the Normandy American Cemetery, among the 9,388 white marble crosses and Stars of David, two headstones stand side by side:

THEODORE ROOSEVELT JR.
BRIGADIER GENERAL
MEDAL OF HONOR

QUENTIN ROOSEVELT
SECOND LIEUTENANT
WORLD WAR I

The tide still rolls over Utah Beach. The sand looks the same. Tourists walk where soldiers died.

And somewhere in that vast field of white crosses, two brothers rest together—sons of a president who believed in duty, service, and leading from the front.

Some men lead by orders.

Some lead by rank.

Theodore Roosevelt Jr. led by example—cane in hand, heart failing, utterly unflinching.

He didn’t have to be there.

But he refused to lead from anywhere else.

George Lucas


Hollywood executives laughed when he asked for the toy rights. Then he became richer than all of them combined.
A near-fatal car crash set him on the path to changing cinema forever.
George Lucas was 18 years old.
Three days before high school graduation, his Fiat got crushed by a Chevy Impala at an intersection.
The impact threw him from the car. His seatbelt snapped. That malfunction saved his life.
He should have died. Doctors didn’t know if he’d make it.
He spent weeks in the hospital. Had to watch graduation from a bed.
Everything changed after that.
Lucas stopped racing cars. Started thinking about what he actually wanted to do with the second chance he’d been given.
He decided to make films.
Everyone said it was a waste.
“You barely graduated high school.”
“You’re not connected to Hollywood.”
“Film school is for dreamers.”
He didn’t listen.
Went to USC film school. Made student films that caught attention. Got a scholarship from Warner Bros.
His first feature film, THX 1138, flopped. Studio hated it. Cut it against his wishes. Lost money.
Then American Graffiti became a hit. Made over $200 million on a tiny budget.
But Lucas had a bigger idea. A space opera. Something nobody had ever seen before.
He shopped Star Wars to every major studio.
Universal passed.
United Artists passed.
Disney passed.
Everybody passed.
They said it was too weird. Too expensive. Too risky.
“Space movies don’t sell.”
“The script is confusing.”
“Nobody wants to see robots and aliens.”
Finally, 20th Century Fox took a chance. But they didn’t believe in it either.
Here’s where Lucas did something nobody understood at the time.
Instead of negotiating for a bigger directing fee, he asked for something else.
The merchandising rights. And the rights to any sequels.
The studio laughed. Merchandising? From a weird space movie? Sure, take it.
They thought they were getting a deal. Paying Lucas less upfront for rights they considered worthless.
That decision made George Lucas a billionaire.
Star Wars opened in 1977. Forty theaters.
Lines wrapped around blocks. People saw it ten, twenty, fifty times.
It became the highest-grossing film in history at that point.
The toys alone generated billions. Action figures, lunchboxes, video games, books, theme parks.
All because Lucas believed in something nobody else could see.
But he wasn’t done.
He built Industrial Light & Magic because no special effects company could do what he needed. It has since created the effects for many of the biggest blockbusters in history.
He built Skywalker Sound. Changed how movies sound.
He built Lucasfilm into an empire.
In 2012, he sold it to Disney for over $4 billion.
Then pledged the majority of the proceeds to education.
Today, the Star Wars franchise has generated tens of billions of dollars across films, merchandise, streaming, and theme parks.
All because a kid who almost died in a car crash decided to chase an idea everyone said was impossible.
What dream are you abandoning because the first few studios said no?
What rights are you giving away because you don’t see their future value?
Lucas nearly died at 18. His first film flopped. Every major studio rejected his biggest idea.
He took less money upfront because he believed in what he was building.
He created technology that didn’t exist because he needed it for his vision.
He proved that the people who reject you don’t get to define you.
Your near-miss might be your wake-up call.
Your rejection letters might be proof you’re onto something.
Your “worthless” idea might be worth billions.
Stop letting studios, investors, and doubters write your story.
Start thinking like George Lucas.
Take the rights everyone else thinks are worthless.
Build what doesn’t exist yet.
And never let “no” be the end of the conversation.
Sometimes the biggest wins come from the deals nobody else wanted.
Because when everyone underestimates you, you get to keep everything.
Think Big.

Question Everything

Fake News

by Jeff Thomas

The average person in the First World receives more information than he would if he lived in a Second or Third World country. In many countries of the world, the very idea of twenty-four hour television news coverage would be unthinkable, yet many Westerners feel that, without this constant input, they would be woefully uninformed.

Not surprising, then, that the average First Worlder feels that he understands current events better than those elsewhere in the world. But, as in other things, quality and quantity are not the same.

The average news programme features a commentator who provides “the news,” or at least that portion of events that the network deems worthy to be presented. In addition, it is presented from the political slant of the controllers of the network. But we are reassured that the reporting is “balanced,” in a portion of the programme that features a panel of “experts.”

Customarily, the panel consists of the moderator plus two pundits who share his political slant and a pundit who has an opposing slant. All are paid by the network for their contributions. The moderator will ask a question on a current issue, and an argument will ensue for a few minutes. Generally, no real conclusion is reached—neither side accedes to the other. The moderator then moves on to another question.

So, the network has aired the issues of the day, and we have received a balanced view that may inform our own opinions.

Or have we?

Shortcomings

In actual fact, there are significant shortcomings in this type of presentation:

The scope of coverage is extremely narrow. Only select facets of each issue are discussed.

Generally, the discussion reveals precious little actual insight and, in fact, only the standard opposing liberal and conservative positions are discussed, implying that the viewer must choose one or the other to adopt as his own opinion.

On a programme that is liberally-oriented, the one conservative pundit on the panel is made to look foolish by the three liberal pundits, ensuring that the liberal viewer’s beliefs are reaffirmed. (The reverse is true on a conservative news programme.)

Each issue facet that is addressed is repeated many times in the course of the day, then extended for as many days, weeks, or months as the issue remains current. The “message,” therefore, is repeated virtually as often as an advert for a brand of laundry powder.

So, what is the net effect of such news reportage? Has the viewer become well-informed?

In actual fact, not at all. What he has become is well-indoctrinated.

A liberal will be inclined to regularly watch a liberal news channel, which will result in the continual reaffirmation of his liberal views. A conservative will, in turn, regularly watch a conservative news channel, which will result in the continual reaffirmation of his conservative views.

Many viewers will agree that this is so, yet not recognise that, essentially, they are being programmed to simply absorb information. Along the way, their inclination to actually question and think for themselves is being eroded.

Alternate Possibilities

The proof of this is that those who have been programmed tend to react with anger when they encounter a Nigel Farage or a Ron Paul, who might well challenge them to consider a third option—an interpretation beyond the narrow conservative and liberal views of events. In truth, on any issue, there exists a wide field of alternate possibilities.

By contrast, it is not uncommon for people outside the First World to have better instincts when encountering a news item. If they do not receive the BBC, Fox News, or CNN, they are likely, when learning of a political event, to think through, on their own, what the event means to them.

As they are not pre-programmed to follow one narrow line of reasoning or another, they are open to a broad range of possibilities. Each individual, based upon his personal experience, is likely to draw a different conclusion and, through discourse with others, is likely to continue to update his opinion each time he receives a new viewpoint.

As a result, it is not uncommon for those who are not “plugged-in” to be not only more open-minded, but more imaginative in their considerations, even when they are less educated and less “informed” than those in the First World.

Whilst those who do not receive the regular barrage that is the norm in the First World are no more intelligent than their European or American counterparts, their views are more often the result of personal objective reasoning and common sense and are often more insightful.

Those in First World countries often point with pride at the advanced technology that allows them a greater volume of news than the rest of the world customarily receives.

Further, they are likely to take pride in their belief that the two opposing views that are presented indicate that they live in a “free” country, where dissent is encouraged.

Unfortunately, what is encouraged is one of two views—either the liberal view or the conservative view. Other views are discouraged.

The liberal view espouses that a powerful liberal government is necessary to control the greed of capitalists, taxing and regulating them as much as possible to limit their ability to victimise the poorer classes.

The conservative view espouses that a powerful conservative government is needed to control the liberals, who threaten to create chaos and moral collapse through such efforts as gay rights, legalised abortion, etc.

What these two dogmatic concepts have in common is that a powerful government is needed.

Each group, therefore, seeks to increase the power of its own legislators in order to overpower the opposing group. This ensures that, regardless of whether the present government is dominated by liberals or conservatives, the one certainty will be that the government will be powerful.

When seen in this light, if the television viewer were to click the remote back and forth regularly from the liberal channel to the conservative channel, he would begin to see a strong similarity between the two.

It’s easy for any viewer to question the opposition group, to consider them disingenuous—the bearers of false information. It is far more difficult to question the pundits who are on our own “team,” to ask ourselves if they, also, are disingenuous.

This is especially difficult when it’s three to one—when three commentators share our political view and all say the same thing to the odd-man-out on the panel. In such a situation, the hardest task is to question our own team, who are clearly succeeding at beating down the odd-man-out.

Evolution of Indoctrination

In bygone eras, the kings of old would tell their minions what to believe and the minions would then either accept or reject the information received. They would rely on their own experience and reasoning powers to inform them.

Later, a better method evolved: the use of media to indoctrinate the populace with government-generated propaganda (think: Josef Goebbels or Uncle Joe Stalin).

Today, a far more effective method exists—one that retains the repetition of the latter method but helps to eliminate the open-ended field of alternate points of view. It does so by providing a choice between “View A” and “View B.”

In a democracy, there is always an “A” and a “B.” This illusion of choice is infinitely more effective in helping the populace to believe that they have been able to choose their leaders and their points of view.

In the modern method, when voting, regardless of what choice the individual makes, he is voting for an all-powerful government. (Whether it calls itself a conservative one or a liberal one is incidental.)

Likewise, through the modern media, when the viewer absorbs what is presented as discourse, regardless of whether he chooses View A or View B, he is endorsing an all-powerful government.

Two Solutions

One solution to avoid being brainwashed by the dogmatic messaging of the media is to simply avoid watching the news. But this is difficult to do, as our associates and neighbours are watching it every day and will want to discuss with us what they have been taught.

The other choice is to question everything.

To consider that the event being discussed may not only be falsely reported, but that the message being provided by the pundits may be consciously planned for our consumption.

This is difficult to do at first but can eventually become habit. If so, the likelihood of being led down the garden path by the powers-that-be may be greatly diminished. In truth, on any issue, there exists a wide field of alternate possibilities.

Developing your own view may, in the coming years, be vital to your well-being.

Source: https://internationalman.com/articles/question-everything/

Snowman


In February 1956, Harry deLeyer arrived late to a horse auction in Pennsylvania.

The auction was over. The valuable horses were gone. The only animals left were the ones nobody wanted—skinny, used-up horses being loaded onto a truck bound for the slaughterhouse in Northport.

Harry was a 28-year-old Dutch immigrant who taught riding at a private school on Long Island. He needed quiet horses for his beginner students. Nothing fancy. Just something safe.

Then he saw him.

A gray gelding, eight years old, filthy and covered in scars from years pulling an Amish plow. The owner warned Harry against buying him. “He’s not sound. He has a hole in his shoulder from the plow harness.”

Harry looked at the horse anyway.

Wide body. Calm demeanor. Intelligent eyes. Good legs despite everything.

“How much?”

“Eighty dollars.”

Harry paid it. The horse stepped off the slaughter truck and into history.

His daughter named him Snowman.

For a few months, Snowman was exactly what Harry needed—a gentle lesson horse the children loved. So gentle, in fact, that Harry eventually sold him to a local doctor for double what he’d paid.

The doctor took Snowman home.

Snowman had other plans.

The next morning, Snowman was back in Harry’s barn.

The doctor took him home again. Built higher fences.

Snowman jumped them. Came back.

Five-foot fences. The horse who’d spent his life pulling a plow was clearing five-foot fences like they were nothing.

Harry stared at this $80 plow horse and saw something nobody else had seen.

Maybe this horse could jump.

In the fall of 1958—less than three years after Harry pulled him off that slaughter truck—Snowman and Harry deLeyer walked into Madison Square Garden.

They were competing against America’s elite show jumpers. Horses with perfect bloodlines. Horses worth tens of thousands of dollars. Horses owned by millionaires who’d never looked at a plow, much less pulled one.

Snowman was still that wide, plain gray gelding.

Still had scars on his shoulder.

Still had the thick neck and powerful hindquarters of a working farm horse.

He won.

Not just won—dominated. The AHSA Horse of the Year. The Professional Horsemen’s Association championship. The National Horse Show championship. Show jumping’s triple crown.

The press went wild. LIFE Magazine called it “the greatest ‘nags-to-riches’ story since Black Beauty.”

They called Snowman “The Cinderella Horse.”

In 1959, they did it again.

Back to Madison Square Garden. Back against the blue-blood horses and their millionaire owners.

Snowman won again. Horse of the Year. Again.

The crowd couldn’t get enough of them. This immigrant riding instructor and his $80 rescue horse, beating horses that cost more than houses.

Snowman jumped obstacles up to seven feet, two inches high. He jumped over other horses. He jumped with a care and precision that made it look easy, even when it wasn’t.

And here’s the part that made people love him even more: the same horse who cleared seven-foot jumps on Saturday could lead a child around the ring on Sunday. Snowman could win an open jumper championship in the morning and a leadline class in the afternoon.

He was called “the people’s horse.”

Snowman and Harry traveled the world. They appeared on television shows. Johnny Carson. National broadcasts. Snowman became as famous as any human athlete.

Secretariat wouldn’t be born for another decade. But people compared Snowman to Seabiscuit—another long-shot champion who’d captured America’s heart in darker times.

The Cold War was raging. The country was anxious. And here was this story: an immigrant and a rescue horse, proving that being born into nothing didn’t mean you were worth nothing.

That the $80 horse could beat the $30,000 horses.

That where you came from mattered less than where you were willing to go.

Snowman competed until 1969.

His final performance was at Madison Square Garden, where it had all started. He was 21 years old—elderly for a show jumper. The crowd gave him a standing ovation. They sang “Auld Lang Syne.”

He retired to Harry’s farm on Long Island, where he lived peacefully for five more years.

Children still came to see him. To touch the horse who’d become a legend. To feed carrots to the champion who’d once been hours away from becoming dog food.

Snowman died on September 24, 1974.

He was 26 years old. Kidney failure.

Harry deLeyer kept teaching, kept training, kept competing. He never found another horse like Snowman. Nobody did.

In 1992—eighteen years after Snowman’s death—the horse was inducted into the Show Jumping Hall of Fame.

In 2011, author Elizabeth Letts wrote “The Eighty-Dollar Champion: Snowman, the Horse That Inspired a Nation.” It became a #1 New York Times bestseller.

In 2015, when Harry was 86 years old, a documentary premiered: “Harry & Snowman.” For the first time, Harry told the whole story himself. His childhood in Nazi-occupied Netherlands, where his family hid Jews in a secret cellar beneath the barn. His immigration to America with nothing. His late arrival at that Pennsylvania auction.

That gray horse on the slaughter truck.

Eighty dollars.

That’s what it cost to save Snowman’s life.

It’s also what it cost to prove that champions aren’t always born in fancy stables with perfect bloodlines.

Sometimes they’re born in Amish fields, pulling plows until their shoulders scar.

Sometimes they’re saved by immigrants who arrive late to auctions and see something nobody else saw.

Sometimes the longest long shot becomes the surest thing.

Harry deLeyer died on June 25, 2021, at age 93.

The obituaries called him many things: riding instructor, champion, immigrant, hero.

But the title he probably loved most was simple.

Snowman’s rider.

The man who paid $80 for a plow horse and got a friend, a champion, and a story that would last forever.

You Are Valuable, Unique and Important

“What each must seek in his life never was on land or sea. It is something out of his own unique potentiality for experience, something that never has been and never could have been experienced by anyone else.”
Joseph Campbell – Author (1904 – 1987)

I have collected some ideas that will help you work out your basic purpose in life based on your personality, what you like doing and what you are good at. Check them out:
How To Work Out Your Basic Purpose
https://www.tomgrimshaw.com/tomsblog/?p=37862

Dr Brian May


In 1974, a physics student at Imperial College London was deep into his doctoral research on cosmic dust when he faced an impossible choice.
Brian May had spent three years studying the zodiacal dust cloud—the faint glow of sunlight reflecting off tiny particles scattered throughout the solar system. He’d built his own equipment, collected data, analyzed measurements, and was making genuine progress toward his PhD in astrophysics.
But he was also the guitarist for a rock band that was starting to gain serious attention.
The band was called Queen. They had a record deal, and tours were being planned. The opportunity was real, immediate, and unlikely to wait while May finished his academic work.
Standing at that crossroads, May made a decision that would leave a question unanswered for more than three decades: he chose the guitar over the telescope.
Queen’s rise was meteoric. By the mid-1970s, they were one of the biggest bands in the world. “Bohemian Rhapsody” became one of rock’s most iconic songs. May’s guitar work—his distinctive tone created using a homemade guitar called the Red Special—became instantly recognizable. Albums sold millions. Stadiums filled with fans singing along to “We Will Rock You” and “We Are the Champions.”
May’s academic work sat unfinished, his thesis incomplete, his research abandoned but never quite forgotten.
For most people, that would have been the end of the story. A promising academic career sacrificed for rock stardom—a trade-off that millions would gladly make. The PhD simply wasn’t meant to be.
But Brian May wasn’t most people.
Even as Queen dominated the rock world throughout the 1970s and 80s, May maintained his interest in astronomy and astrophysics. He read scientific journals. He attended lectures when touring schedules allowed. He stayed connected to the academic world he’d left behind, following developments in his field, watching as technology advanced and understanding of the solar system deepened.
He’d been told, when he left, that the door wasn’t closed—that he could always come back and finish.
May had never forgotten those words.
In 2006, more than three decades after walking away from Imperial College to tour with Queen, Brian May decided it was time.
He contacted Professor Michael Rowan-Robinson of Imperial College’s astrophysics group. They discussed whether it was feasible to complete the work May had started in 1970.
The challenge was significant. Astrophysics had advanced enormously in 36 years. The technology May had used for his original observations was obsolete. The data he’d collected was valuable but incomplete by modern standards. Simply picking up where he left off wouldn’t work—he’d need to update his research, incorporate decades of new discoveries, and meet current academic standards.
But the core of his original work remained valid. His observations of the zodiacal dust cloud were still relevant. His research questions were still meaningful. And Rowan-Robinson was willing to supervise him to completion.
May threw himself into the work with the same intensity he’d brought to Queen’s music.
While still maintaining his music career—performing with Queen + Paul Rodgers and working on various projects—May carved out time to update his thesis. He revisited his original data from the early 1970s. He studied the decades of subsequent research on zodiacal dust. He incorporated modern measurements and refined his analysis using contemporary techniques.
The thesis he ultimately submitted was titled “A Survey of Radial Velocities in the Zodiacal Dust Cloud.” It examined the motion of dust particles in the plane of the solar system, work that contributed to understanding how dust behaves in space—research relevant to everything from asteroid studies to the formation of planetary systems.
In August 2007, Imperial College London awarded Brian May a PhD in astrophysics.
Not an honorary degree—universities frequently give those to celebrities and donors without requiring actual academic work. This was a real PhD, earned through genuine research, peer review, and the same rigorous standards applied to any doctoral candidate.
The examination was conducted by experts in the field who evaluated his work on its scientific merits, not his fame as a guitarist. The thesis had to withstand the same scrutiny any astrophysics PhD would face. May had to defend his research, answer technical questions, and demonstrate mastery of his subject.
He passed.
At age 60, Brian May—rock legend, guitarist whose solos had been heard by hundreds of millions—became Dr. Brian May, astrophysicist.
The accomplishment made headlines around the world, but not because a celebrity had purchased a credential or received an honorary title. It made news because it was genuinely remarkable: a world-famous musician had returned to complete legitimate academic work he had set aside more than three decades earlier, proving that it’s never too late to finish what you started.
The story resonated because it defied easy categorization. We’re used to dividing people into categories: artists versus scientists, creative types versus analytical minds, rock stars versus academics. Brian May refused to fit into any single box.
He’d always been both.
As a child, May had been fascinated by the night sky. He built telescopes with his father. He studied physics and mathematics not because he had to, but because he loved understanding how the universe worked. When he got to Imperial College—one of the world’s top science universities—he excelled academically while also playing guitar in bands.
The guitar he played, the legendary Red Special, was itself a fusion of science and art. May and his father had built it by hand when Brian was a teenager, using materials including parts of an old fireplace mantle, motorcycle springs, and knitting needles. Every design choice was carefully calculated for acoustic properties and tonal qualities. The result was an instrument with a unique sound that would become part of rock history.
That blend of scientific thinking and artistic creativity defined everything May did. His guitar solos were technically complex but emotionally powerful. His approach to music was both intuitive and analytical. He didn’t see science and art as opposites—to him, they were different expressions of the same curiosity about the world.
Earning the PhD wasn’t about proving anything to critics or adding credentials to his resume. May didn’t need the degree for career advancement—he was already one of the most successful musicians in history. He pursued it because the unfinished work bothered him, because he’d always wondered what conclusions his research would reach, because he valued knowledge for its own sake.
After earning his PhD, May didn’t treat it as a culmination but as a beginning. He became increasingly active in science advocacy and public education about astronomy. He served as Chancellor of Liverpool John Moores University from 2008 to 2013. He co-founded Asteroid Day, an annual event raising awareness about asteroid impacts. He collaborated with NASA on various projects, including creating stereoscopic images from the New Horizons mission to Pluto.
He published books combining his interests, including academic books about stereoscopy and popular books about astronomy illustrated with historic 3D photographs. He gave lectures at universities worldwide, speaking about both his astrophysics research and the intersection of science and creativity.
And he continued making music, because he never had to choose between being a scientist and being an artist—he was always both.
The decades-long gap in his academic career became part of his story, not a failure but proof that paths don’t have to be linear. You can start something, set it aside for a valid reason, and come back to it decades later if it still matters to you.
That message resonated far beyond the worlds of rock music and astrophysics. Students who’d left school to work could see that returning was possible. People who’d abandoned dreams for practical reasons found encouragement. Anyone who’d ever felt they had to choose between two passions saw an example of someone who ultimately refused to choose.
When May received his doctorate, he joked in interviews that his thesis was “the world’s longest delayed homework assignment.” But beneath the humor was a serious point: intellectual curiosity doesn’t expire. Knowledge you once pursued remains valuable even if you step away from it. And completing something you started, even decades later, brings its own satisfaction independent of external recognition.
The story of Dr. Brian May, astrophysicist and rock legend, stands as a reminder that human beings are not meant to fit into single categories. We can contain multitudes. We can excel in completely different domains. We can be both the person shredding guitar solos in front of 80,000 fans and the person quietly analyzing data about cosmic dust.
In fact, the same qualities that made May an exceptional musician—attention to detail, pattern recognition, creative problem-solving, dedication to craft—translated directly to his scientific work. The disciplines weren’t as separate as they seemed.
Today, when astrophysicists discuss zodiacal dust or musicians analyze Brian May’s guitar technique, they’re talking about the same person—someone who proved that you don’t have to choose between passion and profession, between art and science, between finishing what you started and embracing new opportunities.
You can have both. It might just take 36 years.
But as Dr. Brian May demonstrated: some things are worth coming back to finish, no matter how long the journey takes.