Goldie Hawn

Everyone knows her as the giggling ‘dumb blonde’ from the 1960s who won an Oscar at 23—but almost nobody knows she quietly built a brain science program that has now taught emotional resilience to 6 million children in 48 countries.

In 1968, when Goldie Hawn appeared on TV covered in body paint and a bikini, giggling her way through comedy sketches as the show’s ditzy blonde, a women’s magazine editor confronted her. “Don’t you feel terrible that you’re playing a dumb blonde?” the editor asked.

“While women are fighting for liberation, you’re reinforcing every stereotype.”

Goldie’s response was immediate: “I don’t understand that question because I’m already liberated. Liberation comes from the inside.”

At twenty-two, Goldie Hawn understood something that would define her entire life: you don’t have to play by anyone else’s rules to be free. You just have to know who you are. And she did.

Born in Washington, D.C., Goldie grew up training seriously as a ballet dancer—a discipline requiring precision, control, and relentless self-awareness. When she transitioned to comedy, those skills came with her. Her persona on Rowan & Martin’s Laugh-In was carefully crafted: the giggling go-go dancer delivering punchlines through high-pitched laughter.

She became a 1960s “It Girl” almost overnight. But what looked like spontaneous silliness was actually masterful comedic craft. Her giggle wasn’t random—it was strategic. Her wide-eyed innocence wasn’t naivete—it was performance. She played the dumb blonde so well that people missed the intelligence underneath. And that was exactly the point. For Cactus Flower (1969), Goldie won both the Academy Award and the Golden Globe for Best Supporting Actress.

She was twenty-three years old. Her film career exploded. But by the late 1970s, Goldie recognized an uncomfortable truth: actresses, no matter how successful, rarely controlled their own narratives.

So she became a producer. In 1980, she co-produced Private Benjamin with her friend Nancy Meyers. Studios dismissed it as “too female,” predicting audiences wouldn’t pay to see a woman’s story about independence. Goldie ignored them.

Private Benjamin became a massive box office hit and earned three Oscar nominations. She continued producing and starring in successful comedies throughout the 1980s and 1990s, crafting characters who laughed at their own pain and weaponized humor against aging and sexism.

But offscreen, something even more remarkable was happening. While her peers chased youth through surgery and desperate career moves, Goldie turned inward. She’d been meditating since the 1970s, long before mindfulness became trendy.

She studied neuroscience, positive psychology, and how the brain works. This wasn’t celebrity dabbling. This was serious, sustained study. And in 2003, it led to what might be Goldie’s most important work.

Alarmed by increases in school violence, youth depression and suicide, Goldie founded The Goldie Hawn Foundation. Working with leading neuroscientists and educators, the foundation developed MindUP—an evidence-based curriculum teaching children social-emotional skills and mindfulness.

MindUP teaches children how their brains work, how to manage stress through “brain breaks,” how to regulate emotions, build empathy, and develop resilience.

The program is based on actual neuroscience. Research has shown that students using MindUP demonstrate improved focus, increased empathy, better academic performance, and higher levels of optimism.

“If students take two minutes for a brain break three times a day,” Goldie explained, “optimism in the classroom goes up almost 80 percent.” The program has now served over 6 million children in 48 countries. Read that again: 6 million children. 48 countries.

The “dumb blonde” from the 1960s quietly built a global program that’s teaching emotional resilience to millions of kids—many of whom have no idea who Goldie Hawn even is.

This work—sustained, focused on children most people in Hollywood never think about—might be Goldie’s most enduring legacy. Throughout all of this, she’s maintained remarkable stability.

She’s been with Kurt Russell since 1983—over forty years together without marrying. She raised four children who’ve pursued their own careers with her support.

Now in her late seventies, Goldie remains selective about her projects. She took a fifteen-year break from film, returning in 2017 for Snatched with Amy Schumer—who had grown up watching Goldie’s films and wanted to work with her. When asked about ageism in Hollywood, Goldie’s response was characteristically pragmatic: “You think you’re going to fight the system? Anger doesn’t get you anywhere. It’s not productive.”

Instead of fighting battles she couldn’t win, she changed the battlefield. She produced. She built a foundation. She taught millions of children. She lived life on her own terms.

Looking back, Goldie Hawn’s life reveals a consistent pattern: she never let anyone else define her worth. When critics dismissed her as a dumb blonde, she won an Oscar. When Hollywood tried to limit her to acting, she became a producer.

When fame threatened to consume her, she turned to meditation and neuroscience. When she saw children struggling, she built a global program to help them.

The giggle that made her famous was never the whole story. It was the disguise that let her do everything else. Goldie Hawn proved that you don’t have to shout to be powerful. You don’t have to reject femininity to be feminist.

And you don’t have to choose between success and substance—you can have both, as long as you know who you are. She smiled her way through a system designed to limit her, then quietly built an empire that had nothing to do with that system’s approval.

6 million children in 48 countries have learned emotional resilience from a program created by the woman America knew as the giggling blonde in a bikini. That’s not just a career. That’s a masterclass in playing the long game.

Because the greatest act of resistance isn’t fighting the stereotype. It’s using it as cover while you do the real work. And Goldie Hawn has been doing the real work for more than fifty years.

Clair Patterson

(Tom: We all owe this being a debt of thanks!)

He discovered how old the Earth was. Then he discovered something that could destroy us all.

For thousands of years, humanity wondered about the age of our planet. Religious texts offered one answer. Philosophers debated another. Scientists made educated guesses based on fossils and rock layers. But nobody actually knew.

Until a quiet scientist named Clair Patterson figured it out in 1953.

He should have become instantly famous. His name should have appeared in every textbook. Instead, what he discovered next turned him into a target. He found himself standing alone against one of the most powerful industries on Earth, fighting a battle that would determine whether millions of children would grow up with damaged minds.

And for decades, almost nobody knew his name.

Patterson’s journey began in the late 1940s at the University of Chicago. He was a young geochemist with an impossible assignment: measure the precise ratios of lead isotopes in fragments of the Canyon Diablo meteorite.

The theory was elegant—if he could measure these specific lead ratios accurately, he could calculate when the solar system formed, and therefore, when Earth was born.
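
To make that elegance concrete: uranium-235 and uranium-238 decay into different lead isotopes (lead-207 and lead-206) at different, well-measured rates, so the ratio of radiogenic lead in a sample depends only on how long that decay has been running. Below is a minimal sketch of the relation in Python, using the standard decay constants; the isochron slope of 0.618 is an illustrative value in the range reported for meteorites, not Patterson’s actual measurement.

```python
# Minimal sketch of the lead-lead (Pb-Pb) age relation behind Patterson's method.
# Decay constants and the present-day 238U/235U ratio are standard values; the
# isochron slope used here is illustrative, not Patterson's measured number.

import math

LAMBDA_235 = 9.8485e-10    # decay constant of uranium-235, per year
LAMBDA_238 = 1.55125e-10   # decay constant of uranium-238, per year
U238_U235 = 137.88         # present-day ratio of 238U to 235U

def isochron_slope(t_years: float) -> float:
    """Predicted slope of the 207Pb/204Pb vs 206Pb/204Pb isochron for age t."""
    return (math.expm1(LAMBDA_235 * t_years) /
            math.expm1(LAMBDA_238 * t_years)) / U238_U235

def age_from_slope(slope: float) -> float:
    """Invert the relation by bisection: find the age whose predicted slope matches."""
    lo, hi = 1e8, 1e10   # search between 0.1 and 10 billion years
    for _ in range(200):
        mid = (lo + hi) / 2
        if isochron_slope(mid) < slope:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

slope = 0.618  # illustrative meteoritic-range value
print(f"Implied age: {age_from_slope(slope) / 1e9:.2f} billion years")  # ~4.55
```

Stray lead mixed into a sample shifts those ratios and scrambles the answer, which is exactly the problem Patterson was about to run into.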

But there was a problem that nearly broke him.

Every time he tried to measure the lead in his samples, the numbers were wildly inconsistent. One day high, the next day higher, never stable. His equipment seemed fine. His calculations were correct. Yet the data was chaos.

Most scientists would have given up or blamed the methodology. Patterson was different. He possessed an almost obsessive attention to detail and patience that bordered on stubborn madness.

One day, he realized something shocking: the problem wasn’t his rock sample. The problem was everything else.

There was lead everywhere. On the lab benches. In the air. Tracking in on people’s shoes. Floating as invisible dust particles. The entire world was contaminated, and it was sabotaging his measurements.

So Patterson did something unprecedented. He built the world’s first ultra-clean laboratory.

He scrubbed every surface until his hands bled. He sealed cracks in walls with tape. He installed specialized air filters. He made his assistants wear protective suits and wash repeatedly before entering. For years, he cleaned and refined and eliminated every possible source of contamination.

Finally, in 1953, he achieved it. He got a clean reading. He ran the numbers through a mass spectrometer, performed the calculations, and suddenly held an answer that no human in history had ever known:

4.55 billion years.

The Earth was 4.55 billion years old.

It’s said that in his excitement, he drove straight to his mother’s house in Iowa and told her he’d solved one of humanity’s oldest mysteries. The weight of not knowing had finally lifted.

But while building his clean room, Patterson had stumbled onto something far more disturbing.

Where was all this lead coming from?

Lead is naturally rare on Earth’s surface. It stays locked deep underground in mineral deposits. It doesn’t float freely in the air. It doesn’t coat laboratory tables. Yet it was everywhere—in quantities that made no sense.

Patterson began testing the world outside his lab. Ocean water. Mountain snow. Everywhere he looked, lead levels were hundreds of times higher than natural background levels.

And then he understood.

Since the 1920s, oil companies had been adding a compound called tetraethyl lead to gasoline. It prevented engine knock and made cars run more smoothly. But every car on every road was functioning as a poison dispersal system, spraying microscopic lead particles into the air with every mile driven.

Lead is a neurotoxin. It damages developing brains. It lowers IQ. It causes behavioral problems, aggression, and cognitive impairment. And an entire generation of children was breathing it every single day.

Patterson had to make a choice.

He was a geochemist. His job was studying rocks and isotopes, not fighting corporations or advocating for public health. He had stable funding and a promising academic career. He could have simply published his Earth-age discovery and moved on to the next project.

But he couldn’t unsee what he’d found.

In the mid-1960s, he published papers warning that industrial lead contamination was poisoning the environment and harming human health.

The response was swift and brutal.

The lead industry was massive, wealthy, and had no intention of losing billions in revenue. Their chief scientific defender was Dr. Robert Kehoe, who had spent decades assuring the public that environmental lead was natural and harmless. Kehoe was polished, well-funded, and had the backing of powerful corporations.

When Patterson challenged this narrative, the industry attempted to buy his silence. Representatives visited him offering generous research grants and institutional support. All he had to do was redirect his focus elsewhere.

Patterson refused.

So they tried to destroy him professionally.

His funding from petroleum-connected sources was immediately cut. The industry pressured his university to dismiss him. They used their influence to block his papers from peer-reviewed journals. They publicly derided him as an overzealous geologist stepping outside his expertise.

For years, it worked. Patterson was marginalized, labeled an alarmist, and isolated from mainstream scientific discussions.

But Patterson had something the industry couldn’t counter: evidence from before the contamination began.

He realized he needed a time machine—a way to prove what Earth’s atmosphere was like before automobiles. So he traveled to one of the most remote places on the planet: Greenland.

In brutal, freezing conditions, Patterson and his team drilled deep into ancient glaciers, extracting long cylinders of ice. These ice cores were frozen time capsules. Snow that fell in 1700 was preserved deep in the ice. Snow from 1900 was higher up. Snow from the 1950s was near the surface.

Back in his clean lab, Patterson carefully melted layers of ice from different time periods and measured their lead content.

The results were devastating to the industry’s claims.

For thousands of years, atmospheric lead levels were essentially zero. Then, starting precisely in the 1920s—exactly when leaded gasoline was introduced—the levels shot upward like a rocket. The graph was unmistakable. The contamination wasn’t natural. It was recent, man-made, and accelerating.

Armed with this irrefutable proof, Patterson returned to the fight.

He testified before congressional committees, sitting across from industry lawyers who tried to confuse the science. He wasn’t comfortable with public speaking.

He was nervous, awkward, and preferred the quiet predictability of his laboratory. But he refused to back down.

He told legislators they were poisoning their own children. He showed them the ice core data. He made the invisible visible.

Slowly, reluctantly, the truth broke through.

Other scientists began supporting his findings. Public health advocates took notice. Parents started demanding action. The tide turned.

In the 1970s, the United States passed the Clean Air Act and began the slow process of removing lead from gasoline. It took years of regulatory battles, but eventually, unleaded gasoline became the standard.

The results were nothing short of miraculous.

Within years, blood lead levels in American children dropped by nearly 80%. An entire generation was saved from cognitive impairment, behavioral disorders, and reduced intelligence. Millions of lives were protected from lead-related health problems.

Clair Patterson had won.

Yet when he died in 1995, few outside the scientific community knew his name. He never received a Nobel Prize. He never became wealthy. He simply returned to his laboratory and continued studying the chemistry of the oceans and the history of the Earth.

Patterson’s story is a reminder of what integrity looks like when nobody’s watching.

It’s easy to do the right thing when the crowd is cheering. It’s infinitely harder when powerful interests are trying to ruin you, when your career is threatened, when taking the money would be so much easier.

He could have stayed silent. He could have enjoyed a comfortable, well-funded career studying rocks while children’s minds were damaged. He could have said, “Not my problem.”

But he looked at the data, looked at the world, and decided truth mattered more than comfort.

He gave us the age of the Earth—a number that changed our understanding of time itself.

And then he gave us a future—a world where children could grow up without poison in their lungs.

We often imagine heroes as soldiers, activists, or celebrities. But sometimes a hero is just a stubborn man in a white lab coat, scrubbing a floor over and over, refusing to accept a convenient lie.

He cleaned the room.

And then he cleaned the world.

Twain On Education

Mark Twain tears apart the myth of education as obedience.
A diploma proves you followed instructions.
A grade proves you memorized the map.
Neither proves you understand the territory.
Schools produce workers who know how to comply.
Curiosity produces minds that know how to think.
Twain’s warning is sharp: intelligence is not measured by how well you pass tests — it’s measured by how well you question them.

Public Health Leaders

Public health messaging has never been louder — or more confusing. We’re told how to eat, how to live, and how to stay healthy, but leadership isn’t just charts and slogans. It’s embodiment. When the messengers don’t reflect the message, trust erodes, and people stop listening. That trust gap explains more than most are willing to admit.

Grace Groner

She bought $180 worth of stock during the Great Depression—and never touched it for 75 years.

In 1935, Grace Groner made a decision that looked insignificant at the time. She was working as a secretary at Abbott Laboratories, earning a modest income in a world still reeling from economic collapse. Women were rarely encouraged to build wealth. Financial independence seemed like a luxury reserved for men with means.

That year, Grace bought three shares of Abbott Laboratories stock for sixty dollars each. One hundred eighty dollars total.

Then she did something radical for the era. She held them.

Grace never chased trends. She never sold during panics. She never tried to time the market. She simply reinvested every dividend the company paid and trusted time to do what individual effort could not.

While markets crashed in the years that followed, she held. While World War II erupted and the economy shifted to wartime production, she held. While the Cold War raised fears and recessions came and went, she held. While other investors panicked and sold, she stayed still.

Her life remained simple in a way that seemed almost stubborn to those around her. She lived in a small one-bedroom cottage that had been willed to her. She bought her clothes at rummage sales. After her car was stolen, she never bought another one—she just walked everywhere instead, even into old age with a walker in hand.

She carried the mindset of someone who had lived through scarcity and never forgot it. The Great Depression had taught her that security came from living below your means, not above them.

Her wealth grew quietly in the background while her lifestyle never changed. Nobody suspected. Not her neighbors. Not her colleagues at Abbott where she worked for 43 years before retiring in 1974. Not even most of her friends.

The stock split. The shares multiplied. The dividends compounded. Year after year, decade after decade, that initial $180 investment transformed into something extraordinary—but Grace lived as though it didn’t exist.

She volunteered at the First Presbyterian Church. She donated anonymously to those in need. She attended Lake Forest College football games and stayed connected to the school that had educated her decades earlier. She traveled after retirement, experiencing the world while still maintaining her frugal habits.

In 2008, at age 99, Grace quietly established a foundation. She never told anyone what it would contain.

When Grace died on January 19, 2010, at age one hundred, her attorney opened her will. That’s when everyone discovered the truth.

Her original one hundred eighty dollars—three shares of Abbott Laboratories purchased 75 years earlier—had grown into more than seven million dollars.

The people who knew her were stunned. “Oh, my God,” exclaimed the president of Lake Forest College when he learned the amount.

Grace, the woman who walked everywhere and bought secondhand clothes, who lived in a tiny cottage and volunteered her time quietly, had been a multimillionaire the entire time. She just chose to live as though she wasn’t.

And she didn’t spend that fortune on herself in the end.

She left nearly all of it to the Grace Elizabeth Groner Foundation—created to fund service-learning opportunities, internships, international study, and community service projects for Lake Forest College students. The same college that had educated her in 1931, her tuition paid by a kind family who took her in after she was orphaned at age 12.

Grace had never forgotten that gift of education. Now she was paying it forward, making it possible for students who needed opportunity the way she once had.

The foundation her estate created would generate hundreds of thousands of dollars annually in dividend income—money that would change countless lives for generations. Students who otherwise couldn’t afford to study abroad or take unpaid internships would now have that chance because of three shares of stock a secretary bought during the Depression.

Her cottage—the small one-bedroom home where she’d lived so simply—was renovated by the foundation and is now home to two female Lake Forest College seniors each year, living there as Grace’s guests.

Grace Groner proved something that challenges every assumption we make about building wealth.

She proved that you don’t need a high income to become wealthy. She proved that you don’t need to be born with privilege or connections. She proved that you don’t need perfect timing or insider knowledge or lucky breaks.

Sometimes wealth comes from something much simpler: patience, discipline, and the belief that your future is worth investing in, even when the first step looks small.

Three shares of stock. One hundred eighty dollars. Seventy-five years of not selling.

That’s all it took.

But it wasn’t really about the stock, was it? It was about understanding something most people never grasp: that compounding requires time more than money. That the most powerful investment strategy isn’t activity—it’s stillness. That true wealth comes not from what you earn but from what you keep and let grow.
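
To see how much of the work time itself was doing, here is a rough back-of-the-envelope sketch in Python. It assumes the commonly reported figures from her story ($180 invested in 1935, roughly $7 million at her death 75 years later) and asks what steady annual return, with every dividend reinvested, those numbers imply.

```python
# Back-of-the-envelope sketch: what steady annual return turns $180 into ~$7,000,000
# over 75 years? Figures are the commonly reported ones from Grace Groner's story,
# not audited investment records.

initial = 180.0          # 1935 purchase: three shares at $60 each
final = 7_000_000.0      # approximate value of the bequest in 2010
years = 75

# Compound growth: final = initial * (1 + r) ** years, so solve for r.
cagr = (final / initial) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")        # about 15% per year

# The same rate sustained for only 40 years gives a far smaller sum,
# which is the point: the last few decades do most of the compounding.
after_40_years = initial * (1 + cagr) ** 40
print(f"Value after 40 years at that rate: ${after_40_years:,.0f}")  # roughly $50,000
```

Roughly 15 percent a year is an exceptional run, but the shape of the curve is the lesson: at any positive rate, the later decades add far more dollars than the early ones, which is why the decision not to sell mattered as much as the decision to buy.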

Grace worked as a secretary her entire career. She never became an executive. She never got rich from her salary. She never inherited a fortune or won the lottery or built a business empire.

She just bought three shares of a good company and never sold them.

While everyone else was chasing the next hot stock, the next quick profit, the next get-rich scheme, Grace was doing nothing. And in investing, sometimes doing nothing is the most powerful thing you can do.

Her story forces us to confront uncomfortable truths. How many people earn far more than Grace did but will die with far less? How many chase returns instead of letting returns come to them? How many mistake activity for progress?

Grace Groner sat still for 75 years while the world spun around her. She lived modestly while wealth accumulated quietly in the background. She died having touched more lives than most millionaires ever will—not because of what she spent, but because of what she saved and gave away.

Her foundation estimates that it generates roughly $300,000 a year to fund opportunities for students. All from $180 invested in 1935 by a secretary who understood something profound about time, patience, and the power of never quitting.

The next time someone tells you it’s impossible to build wealth without advantages, remember Grace Groner. Remember the woman who bought three shares during the Depression and held them until she was 100.

Remember that sometimes the most radical thing you can do is make a small decision and trust it long enough to prove everyone wrong.

Shari Lewis

She found out her show was cancelled by overhearing executives on an elevator—they didn’t even know she was standing behind them.

In 1963, Shari Lewis was one of the most talented performers in television.

She could sing. She could dance. She could conduct a symphony orchestra. She could perform ventriloquism so precisely that audiences forgot they were watching a woman with a sock on her hand.

She had trained at the School of American Ballet. Studied acting with Sanford Meisner. Learned piano at two years old. Won a Peabody Award. Hosted a show that ran live, without error, week after week, year after year.

And NBC executives decided she was replaceable.

They cancelled The Shari Lewis Show to make room for cartoons.

She wasn’t told directly. She learned about it while standing in an elevator, listening to men in suits discuss the decision as if she wasn’t there.

“All of it… my entire field crashed around my ears,” she said later.

The industry had made its position clear: children’s television was filler. If the audience was young, the work didn’t count. And the woman who created that work? She was a novelty. A mascot. Not an artist.

But here’s what they got wrong about Shari Lewis:

She didn’t need their permission.

When American networks abandoned live children’s programming, Lewis moved to England and hosted a show on the BBC for eight years. When that work dried up, she performed in Las Vegas casinos and in touring companies of Broadway shows, and appeared on variety programs with Ed Sullivan and Johnny Carson.

When those opportunities faded, she reinvented herself again.

She became one of the few female symphony conductors in the world—performing with over 100 orchestras, including the national symphonies of the United States, Canada, and Japan. She learned to speak Japanese for her performances there.

Once, she walked onto a stage at a state fair and found only four people in the audience.

She did the show anyway.

That was who Shari Lewis was.

Not a puppet act. Not a children’s entertainer waiting for permission. A performer who controlled timing, voice, pacing, and audience attention with surgical precision—and refused to stop working just because an industry decided she wasn’t serious.

Then, nearly 30 years after that elevator conversation, PBS came calling.

In 1992, at 59 years old, Lewis launched Lamb Chop’s Play-Along.

The show won five consecutive Emmy Awards. It was the first children’s program in seven years to beat Sesame Street for a writing Emmy. A new generation of children fell in love with Lamb Chop, Charlie Horse, and Hush Puppy—the same characters executives had declared “outdated” three decades earlier.

The audience hadn’t moved on. The industry had simply stopped paying attention.

Lewis didn’t treat this as a comeback. She treated it as what it always was: a correction.

She testified before Congress in 1993 to advocate for children’s television. Lamb Chop was granted special permission to speak. When elementary schools started cutting music programs, Lewis created The Charlie Horse Music Pizza to teach children about music through entertainment.

She was still innovating. Still refusing to be small.

In June 1998, Lewis was diagnosed with uterine cancer and given six weeks to live. She was in the middle of taping new episodes.

She finished them anyway.

Her final performance was a song called “Hello, Goodbye.” Her crew held back tears as she sang. She was saying goodbye to them, to the children watching, and to the character who had been her partner for over 40 years.

Shari Lewis died on August 2, 1998. She was 65 years old.

The industry remembered her fondly. It always does when it’s too late.

But her work didn’t need their remembrance. It endured on its own terms—passed down from parents to children, from one generation to the next, because the audience always knew what the executives never understood:

Precision is not small just because it serves children.

Craft is not diminished by joy.

And the woman who made a sock puppet come alive was never the novelty.

She was the reason it worked at all.

Doug Engelbart

The wooden box had two metal wheels.

It looked like a toy, or perhaps a piece of scrap assembled in a garage, but the man holding it believed it was the key to the human mind.

It was December 9, 1968.

In the cavernous Brooks Hall in San Francisco, more than a thousand of the world’s top computer scientists sat in folding chairs, waiting. They were used to the roar of air conditioning units cooling massive mainframes. They were used to the smell of ozone and the stack of stiff paper punch cards that defined their working lives.

They were not used to Douglas Engelbart.

He sat alone on the stage, wearing a headset that looked like it belonged to a pilot, staring at a screen that flickered with a ghostly green light. Behind the scenes, a team of engineers held their breath, praying that the delicate web of wires and microwave signals they had cobbled together would hold for just ninety minutes.

If it worked, it would change how humanity thought.

If it failed, Douglas Engelbart would simply be the man who wasted millions of taxpayer dollars on a fantasy.

The world of 1968 was analog.

Information lived on paper. If you wanted to change a paragraph in a report, you retyped the entire page. If you wanted to send a document to a colleague in another city, you put it in an envelope and waited three days. If you wanted to calculate a trajectory, you gave a stack of cards to an operator, who fed them into a machine the size of a room, and you came back the next day for the results.

Computers were calculators. They were powerful, loud, and distant. They were owned by institutions, guarded by specialists, and kept behind glass walls. The idea that a single person would sit in front of a screen and “interact” with a computer in real-time was not just technically difficult; it was culturally absurd.

Engelbart, a soft-spoken engineer from Oregon, saw it differently.

He had grown up in the Depression, fixing water pumps and electrical lines. He understood tools. He believed that the problems facing humanity—war, poverty, disease—were becoming too complex for the unassisted human brain to solve. We needed better tools. We needed to “augment human intellect.”

For years, he had run a lab at the Stanford Research Institute (SRI). While others focused on making computers faster at math, Engelbart’s team focused on making them responsive. They built systems that allowed a user to point, click, and see results instantly.

They called their system NLS, or the “oN-Line System.”

It was a radical departure from the status quo. To the establishment, computing was serious business involving batch processing and efficiency. Engelbart was talking about “manipulating symbols” and “collaboration.”

The pressure on Engelbart was immense.

The funding for his Augmentation Research Center came from ARPA (the Advanced Research Projects Agency), the same government body responsible for military technology. They had poured significant resources into his vision, but results were hard to quantify. There were no enemy codes broken, no missile trajectories calculated. Just a group of men in California moving text around on a screen.

The critics were loud. They called him a dreamer. They said his ideas were “pie in the sky.” Why would anyone need to see a document on a screen when typewriters worked perfectly fine? Why would anyone need to point at data?

This presentation was his answer.

It was an all-or-nothing gamble.

To make the demonstration work, Engelbart wasn’t just using a computer on the stage. The machine itself—an SDS 940 mainframe—was thirty miles away in Menlo Park. He was controlling it remotely.

His team had leased two video lines from the telephone company, a massive expense and logistical nightmare. They had set up microwave transmitters on the roof of the civic center and on a truck parked on a ridge line to relay the signal.

In 1968, sending a video signal and a data signal simultaneously over thirty miles to a live audience was the equivalent of a moon landing.

The computer industry was built on a specific, rigid logic.
Computing Logic: Computers are scarce, expensive resources. Human time is cheap; computer time is expensive. Therefore, humans must prepare work offline (punch cards) to maximize the machine’s efficiency. Interactive computing wastes the machine’s time.

This logic governed the industry. It was why IBM was a titan. It was why office workers sat in rows with typewriters. It was the “correct” way to do things.

It worked perfectly—until it met Douglas Engelbart.

Engelbart believed that human time was the precious resource, not the machine’s. He believed the machine should serve the mind, even if it was “inefficient” for the hardware.

As the lights went down in Brooks Hall, the hum of the crowd faded.

Engelbart looked small on the big stage. The screen behind him, a massive projection of his small monitor, glowed into life.

He spoke into his microphone, his voice steady but quiet.

“If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsive to every action you had, how much value could you derive from that?”

It was a question nobody had ever asked.

He moved his right hand.

On the massive screen, a small dot moved.

The audience froze.

He wasn’t typing coordinates. He wasn’t entering a command code. He was simply moving his hand, and the digital ghost on the screen followed him. He was using the wooden box with the wheels—the device his team had nicknamed “the mouse” because the cord looked like a tail.

Today, a cursor moving on a screen is as natural as breathing. In 1968, it was magic.

But he didn’t stop there.

He clicked on a word. It was highlighted.

He deleted it. It vanished.

The text around it snapped shut to fill the gap.

A murmur ran through the hall. He wasn’t rewriting the page. He was manipulating information as if it were a physical object, yet it was made of light.

He showed them a “grocery list.” He categorized items. He collapsed the list so only the headers showed, then expanded it again to show the details.

He called this “view control.” We call it windowing.

He showed them a map. He clicked a link, and the screen jumped to a detailed diagram of a component. He clicked back, and he was at the map again.

He called it “hypermedia.” We call it the web.

The demonstration continued, each minute adding a new impossibility to the list.

The tension in the control room was suffocating. Every second the system stayed online was a victory against the laws of probability. A single blown fuse, a misaligned microwave dish, a software bug—any of it would have turned the screen black and ended the dream.

Then came the moment that truly broke the room.

Engelbart introduced a colleague, Bill Paxton.

Paxton wasn’t on stage. He was thirty miles away, sitting in the lab at SRI.

His face appeared in a window on the screen, crisp and clear.

The audience gasped.

They were looking at a man in Menlo Park, while listening to a man in San Francisco, both looking at the same document on the same screen.

“Okay, Bill,” Engelbart said. “Let’s work on this together.”

On the screen, two cursors appeared. One controlled by Engelbart, one by Paxton.

They edited the text together. Engelbart would point to a sentence, and Paxton would paste it into a new location. They were collaborating, in real-time, across a distance, using a shared digital workspace.

It was Google Docs, Zoom, and Slack, demonstrated a year before the internet (ARPANET) even existed.

The audience, composed of the smartest engineers in the world, sat in stunned silence. They were watching science fiction become a documentary.

They weren’t just seeing new gadgets. They were seeing the destruction of their entire worldview. The idea of the solitary computer operator was dead. The idea of the computer as a mere calculator was dead.

Engelbart was showing them a window into a world where minds could connect through machines.

He typed, he clicked, he spoke. He operated a “chorded keyset” with his left hand, entering commands as fast as a pianist, while his right hand flew across the desk with the mouse. He was a conductor of information.

For ninety minutes, the system held.

The microwave links stayed true. The software didn’t crash. The mainframe thirty miles away processed every command.

When Engelbart finally took off the headset and the screen went dark, there was a pause.

A hesitation.

Then, the audience stood.

It wasn’t a polite golf clap. It was a roar. It was the sound of a thousand experts realizing that everything they knew about their field had just become obsolete.

They rushed the stage. They wanted to touch the mouse. They wanted to see the keyset. They wanted to know how he did it.

The “Mother of All Demos,” as it was later christened, did not immediately change the market. Engelbart did not become a billionaire. He was a researcher, not a salesman. His system was too expensive and too complex for the 1970s.

But the seeds were planted.

Sitting in the audience were the young engineers who would go on to work at Xerox PARC. They would take the mouse, the windows, and the graphical interface, and they would refine them.

Steve Jobs would visit Xerox PARC a decade later, see the descendants of Engelbart’s mouse, and use them to build the Macintosh.

Bill Gates would see it and build Windows.

Tim Berners-Lee would use the concept of hypermedia to build the World Wide Web.

Every smartphone in a pocket, every laptop in a cafe, every video call made to a loved one across the ocean—it all traces back to that ninety-minute window in 1968.

Douglas Engelbart died in 2013. He never sought fame. He watched as the world caught up to the vision he had seen clearly half a century before.

He proved that the pressure of the status quo—the belief that “this is how it’s always been done”—is brittle. It can be broken by a single person with a wooden box and the courage to show us what is possible.

The system said computers were for numbers.

He showed us they were for people.

Sources: Detailed in “The Mother of All Demos” archives (SRI International). Smithsonian Magazine, “The 1968 Demo That Changed Computing.” New York Times obituary for Douglas Engelbart, 2013. Summary of events from the Doug Engelbart Institute records.

Failure? A Destination or a Progress Marker?

The dictionary has multiple definitions of failure:

  • 1. lack of success.
  • 2. an unsuccessful person or thing.
  • 3. the neglect or omission of expected or required action.
  • 4. a lack or deficiency of a desirable quality.
  • 5. the action or state of not functioning.
  • 6. a sudden cessation of power.
  • 7. the collapse of a business.

The definitions all deliver the impression of a finite conclusion rather than a step in a process. Failure equals being wrong. Being wrong equals death. As a result, failure has an obvious and deeply negative stigma associated with it. Hence most people fear failing.

In fact, many people do not even attempt worthwhile projects for fear of failure. Various motivational speakers have called this sad and lamentable, but it is a natural outcome of the way we are taught to think about failure – it is bad and to be avoided.

And it is a lot easier to say “don’t fear failure” than it is to spend the time necessary to change our thinking about it. So what is a better way to think of failure, and how do we change our thinking about it?

I don’t know how true it is, but I have heard that Edison failed 10,000 times to invent the light bulb before his success. Imagine if he had taken his first failure as an end point rather than a new starting point. In fact, each failure could instead be described as a successful experiment: it established that a particular hypothesis did not work.

I was struck by this when I was doing some pullups in the park with 13 kg of weights on my back. I was doing my third set of 5 repetitions and on the last repetition I could not pull myself up more than 85% of my top range of motion. That was my point of failure. Despite my best effort, I could not pull my body up to get my nose over the bar. I “failed”.

Now, when you are exercising, this is something to aim for. Exercising with good form till you are close to failure (with some capacity left in reserve) builds strength and muscle mass.

At this point I realised every person doing resistance training “fails”. We all hit a point where we are at or close to where we can do no more. We are all “failures”, at different points. Some of us fail after 4 repetitions at 13 kg, as did I. Some of us fail after 44 repetitions or with 50 kg. None of us stop training “because we failed”. We recognise it as a benchmark or a measure of progress rather than a destination. A “That’s where I am up to.” viewpoint rather than a “That is my end result.” viewpoint.

In many situations, such as in exercise, it is not about failure versus success, it is about WHEN you fail.

Some fail before they start, thinking it is too much effort.

Some fail at the first day that is either too hot or too cold for comfort.

Some fail when their results do not match their expectation.

A rare few fail after they win their marathon, receive their trophy, party on and go to bed at 2:00 am.

It’s all about WHEN you fail! This is why persistence is vital for success. The ultra-persistent refuse to fail until after the victory party.

Which reminded me of a quote I heard about people who are successful marketers: “They fail fast and they fail often.” They try a lot of things, knowing that many ideas will fail and need to be abandoned quickly before too much money is wasted on them. By doing that many times, and quickly, they sooner or later find what works without too much wasted money, and can then do lots of that to huge success.

These top marketers know full well that a fear of failure will not lead to success.

They know that in marketing, as in exercising, it is very easy and natural to view failure as a marker, a peg in the board. A “This is where I am up to”. It is not the end of the road; it is the current position of my progress marker.

What if we started doing that in other spheres of activity? What if every time we thought of something and got the negative thought come in about failing, we just looked at it and thought, “That’s only to be expected. Nothing unusual here. Any time I fail it is merely the current position of my progress marker, just another step toward the ultimate success.”

This I wish for you!

Elizabeth Peratrovich

She sat quietly knitting while they called her people savages. Then she stood up and used their own words to destroy them.
Juneau, Alaska. February 8, 1945.
The Alaska Territorial Legislature chamber was crowded and tense. In the gallery sat dozens of Native Alaskans—Tlingit, Haida, Tsimshian—who had traveled to the capital for this moment. They came for a single law. The Anti-Discrimination Act. A bill that would make it illegal to post signs reading “No Natives Allowed.” That would let them enter any restaurant, any hotel, any theater without being turned away.
A law that would recognize them as equal citizens in their own ancestral homeland.
But first, they had to endure a hearing where white senators explained why Native people didn’t deserve equal rights.
This was 1945. Ten years before Rosa Parks. Nineteen years before the federal Civil Rights Act. Most Americans don’t know that the first anti-discrimination law in United States history was won in Alaska by a Tlingit woman facing down a room of hostile legislators.
Her name was Elizabeth Peratrovich.
And she was about to deliver one of the most devastating responses in American political history.
One senator after another rose to oppose the bill. They argued that the races should remain separate. That integration would cause problems. That Native people weren’t ready for full equality.
Then the insults became personal.
One senator complained openly that he didn’t want to sit next to Native people in theaters because of how they smelled. Another suggested that Native peoples lacked the sophistication to deserve equal treatment.
The Native people in the gallery sat in dignified silence. They’d heard these attitudes their entire lives—but never so brazenly, never in an official government chamber, never while forced to listen without recourse.
Then Senator Allen Shattuck stood. He was among the most vocal opponents of the bill. He looked directly at the Native people in the gallery, his voice dripping with contempt.
“Who are these people, barely out of savagery, who want to associate with us whites with five thousand years of recorded civilization behind us?”
The room went silent.
He had just called them savages. Primitives. People barely evolved enough to desire equality with civilized whites.
In the back of the chamber, Elizabeth Peratrovich was knitting. She was thirty-three years old, mother of three, and president of the Alaska Native Sisterhood. She was known for her composure, her quiet dignity even in the face of injustice.
She set her knitting needles down.
She stood.
Elizabeth hadn’t come prepared to testify. She was simply a Native woman who had spent her life seeing signs in windows telling her she wasn’t welcome. Who had been turned away from hotels. Who watched her children learn they were considered less than human in their own homeland.
She walked to the front of the chamber. Every eye followed her. The legislators who had been sneering moments before now watched in heavy silence.
She looked directly at Senator Shattuck. She didn’t raise her voice. She didn’t show anger. Her tone was measured, controlled, devastatingly clear.
“I would not have expected that I, who am barely out of savagery, would have to remind gentlemen with five thousand years of recorded civilization behind them of our Bill of Rights.”
The impact was immediate.
She had taken Shattuck’s insult—“barely out of savagery”—and turned it into a weapon. She used his claim of superior civilization to expose his complete lack of it.
A defensive murmur went through the opposition. They knew they’d been caught. Exposed. Shamed.
But Elizabeth wasn’t finished.
She described what it meant to see signs comparing her people to dogs. To have her children ask why they weren’t allowed in certain stores. To be treated as unwelcome in lands their ancestors had inhabited for thousands of years before any white settler arrived.
Then came what opponents thought would trap her. A senator asked skeptically whether a law could truly change people’s hearts and stop discrimination.
Elizabeth’s response became legendary.
“Do your laws against larceny and murder prevent those crimes?” she asked calmly. “No law will eliminate crimes, but at least you as legislators can assert to the world that you recognize the evil of the present situation and speak your intent to help us overcome discrimination.”
Silence.
She had dismantled every argument. She had proven she understood law, morality, and civilization better than the senators who claimed millennia of it.
The Native people didn’t need education from white legislators. The white legislators needed education from Elizabeth Peratrovich.
When the vote was called, the Anti-Discrimination Act of 1945 passed eleven to five.
The first anti-discrimination law in United States history.
Not in New York. Not in California. In Alaska. Because a Tlingit woman refused to remain silent when called a savage.
The law prohibited discrimination in public accommodations. It made “No Natives” signs illegal. It declared that Alaska would not tolerate racial discrimination.
Nineteen years before the federal Civil Rights Act. Ten years before Rosa Parks became a household name.
Yet most Americans have never heard of Elizabeth Peratrovich.
We learn about Rosa Parks, as we should. We study Martin Luther King Jr., the March on Washington, the Civil Rights Movement of the 1960s. These stories deserve to be taught and remembered.
But the woman who won the first anti-discrimination law in American history? The woman who faced down racist senators and won? She remains virtually unknown outside Alaska.
Why? Because Alaska wasn’t the South where national media focused. Because Native American civil rights struggles didn’t capture headlines the way other movements did. Because Elizabeth didn’t have a national platform or massive organization—just her dignity and her refusal to accept injustice.
But Alaska remembers.
February 16th is Elizabeth Peratrovich Day, an official state holiday. Schools and government offices close. In 2020, she was featured on the Native American $1 coin. In Juneau stands a bronze statue of Elizabeth, captured in quiet dignity—just as she stood in that chamber in 1945.
Yet beyond Alaska, her story remains obscure. That’s tragic, because what Elizabeth proved was fundamental.
Civilization isn’t measured by how many years your history spans. It’s not measured by monuments or recorded achievements or military conquests.
It’s measured by how you treat the vulnerable. By whether you uphold dignity or destroy it. By whether you use law to protect people or to oppress them.
Senator Shattuck claimed five thousand years of civilization. Elizabeth Peratrovich proved he had none.
Because what’s civilized about “No Dogs, No Natives” signs? What’s civilized about denying people access to public spaces in their own ancestral homeland? What’s civilized about a government official calling people savages?
Nothing.
Elizabeth didn’t need five thousand years of history. She needed moral clarity and courage.
She weaponized their own claims against them. She demonstrated that the supposed “savage” in the room understood America’s founding principles better than the “civilized” senators did.
And she won.
Elizabeth Peratrovich died in 1958 at age forty-seven. She didn’t live to see the federal Civil Rights Act. She didn’t see her image on coins or statues erected in her honor.
But she lived long enough to see “No Natives” signs removed throughout Alaska. She lived to know her children could enter any business in Juneau without being turned away. She lived to see the law of an entire territory changed because she refused to be silent.
That’s not just one woman’s victory. That’s proof that dignity is powerful. That moral clarity can defeat bigotry. That sometimes changing history requires one person willing to stand, set down their work, and speak truth to power.
The senators thought they had civilization on their side. They thought their recorded history gave them authority.
Elizabeth Peratrovich taught them that civilization isn’t inherited. It’s earned every single day by how you treat people.
She was knitting quietly while senators called her people savages.
Then she reminded them what civilization actually means.
And she won the first anti-discrimination law in United States history.
Elizabeth Peratrovich deserves to be as famous as any civil rights leader in American history.
Now you know her name.