Public Health Leaders

Public health messaging has never been louder — or more confusing. We’re told how to eat, how to live, and how to stay healthy, but leadership isn’t just charts and slogans. It’s embodiment. When the messengers don’t reflect the message, trust erodes, and people stop listening. That trust gap explains more than most are willing to admit.

Grace Groner

She bought $180 worth of stock during the Great Depression—and never touched it for 75 years.

In 1935, Grace Groner made a decision that looked insignificant at the time. She was working as a secretary at Abbott Laboratories, earning a modest income in a world still reeling from economic collapse. Women were rarely encouraged to build wealth. Financial independence seemed like a luxury reserved for men with means.

That year, Grace bought three shares of Abbott Laboratories stock for sixty dollars each. One hundred eighty dollars total.

Then she did something radical for the era. She held them.

Grace never chased trends. She never sold during panics. She never tried to time the market. She simply reinvested every dividend the company paid and trusted time to do what individual effort could not.

While markets crashed in the years that followed, she held. While World War II erupted and the economy shifted to wartime production, she held. While the Cold War raised fears and recessions came and went, she held. While other investors panicked and sold, she stayed still.

Her life remained simple in a way that seemed almost stubborn to those around her. She lived in a small one-bedroom cottage that had been willed to her. She bought her clothes at rummage sales. After her car was stolen, she never bought another one—she just walked everywhere instead, even into old age with a walker in hand.

She carried the mindset of someone who had lived through scarcity and never forgot it. The Great Depression had taught her that security came from living below your means, not above them.

Her wealth grew quietly in the background while her lifestyle never changed. Nobody suspected. Not her neighbors. Not her colleagues at Abbott where she worked for 43 years before retiring in 1974. Not even most of her friends.

The stock split. The shares multiplied. The dividends compounded. Year after year, decade after decade, that initial $180 investment transformed into something extraordinary—but Grace lived as though it didn’t exist.

She volunteered at the First Presbyterian Church. She donated anonymously to those in need. She attended Lake Forest College football games and stayed connected to the school that had educated her decades earlier. She traveled after retirement, experiencing the world while still maintaining her frugal habits.

In 2008, at age 99, Grace quietly established a foundation. She never told anyone what it would contain.

When Grace died on January 19, 2010, at age one hundred, her attorney opened her will. That’s when everyone discovered the truth.

Her original one hundred eighty dollars—three shares of Abbott Laboratories purchased 75 years earlier—had grown into more than seven million dollars.

The people who knew her were stunned. “Oh, my God,” exclaimed the president of Lake Forest College when he learned the amount.

Grace, the woman who walked everywhere and bought secondhand clothes, who lived in a tiny cottage and volunteered her time quietly, had been a multimillionaire the entire time. She just chose to live as though she wasn’t.

And she didn’t spend that fortune on herself in the end.

She left nearly all of it to the Grace Elizabeth Groner Foundation—created to fund service-learning opportunities, internships, international study, and community service projects for Lake Forest College students. The same college from which she had graduated in 1931, her education paid for by a kind family who took her in after she was orphaned at age 12.

Grace had never forgotten that gift of education. Now she was paying it forward, making it possible for students who needed opportunity the way she once had.

The foundation her estate created would generate hundreds of thousands of dollars annually in dividend income—money that would change countless lives for generations. Students who otherwise couldn’t afford to study abroad or take unpaid internships would now have that chance because of three shares of stock a secretary bought during the Depression.

Her cottage—the small one-bedroom home where she’d lived so simply—was renovated by the foundation and is now home to two female Lake Forest College seniors each year, living there as Grace’s guests.

Grace Groner proved something that challenges every assumption we make about building wealth.

She proved that you don’t need a high income to become wealthy. She proved that you don’t need to be born with privilege or connections. She proved that you don’t need perfect timing or insider knowledge or lucky breaks.

Sometimes wealth comes from something much simpler: patience, discipline, and the belief that your future is worth investing in, even when the first step looks small.

Three shares of stock. One hundred eighty dollars. Seventy-five years of not selling.

That’s all it took.

But it wasn’t really about the stock, was it? It was about understanding something most people never grasp: that compounding requires time more than money. That the most powerful investment strategy isn’t activity—it’s stillness. That true wealth comes not from what you earn but from what you keep and let grow.
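
For anyone curious about the arithmetic behind that stillness, here is a minimal sketch in Python of the compounding the story implies. The $180, the roughly $7 million, and the 75 years come from the account above; the growth rate the sketch prints is simply back-calculated from those figures, not a claim about Abbott’s actual year-by-year returns.

```python
# Minimal sketch of the compounding implied by Grace Groner's story.
# Inputs are the figures from the account above; the rate is derived
# from them, not a record of Abbott Laboratories' actual returns.

initial = 180            # dollars invested in 1935
final = 7_000_000        # approximate value at her death in 2010
years = 75

# Implied compound annual growth rate, from final = initial * (1 + r) ** years
rate = (final / initial) ** (1 / years) - 1
print(f"Implied average annual growth: {rate:.1%}")   # roughly 15%

# The same rate over shorter horizons shows why the 75 years mattered
# far more than the size of the initial stake.
for t in (10, 25, 50, 75):
    value = initial * (1 + rate) ** t
    print(f"{t:>2} years -> ${value:,.0f}")
```

At that implied pace the money doubles roughly every five years, which is why the final decades of holding account for most of the growth.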

Grace worked as a secretary her entire career. She never became an executive. She never got rich from her salary. She never inherited a fortune or won the lottery or built a business empire.

She just bought three shares of a good company and never sold them.

While everyone else was chasing the next hot stock, the next quick profit, the next get-rich-quick scheme, Grace was doing nothing. And in investing, sometimes doing nothing is the most powerful thing you can do.

Her story forces us to confront uncomfortable truths. How many people earn far more than Grace did but will die with far less? How many chase returns instead of letting returns come to them? How many mistake activity for progress?

Grace Groner sat still for 75 years while the world spun around her. She lived modestly while wealth accumulated quietly in the background. She died having touched more lives than most millionaires ever will—not because of what she spent, but because of what she saved and gave away.

Her foundation provides an estimated $300,000 in opportunities to students each year. All from $180 invested in 1935 by a secretary who understood something profound about time, patience, and the power of never quitting.

The next time someone tells you it’s impossible to build wealth without advantages, remember Grace Groner. Remember the woman who bought three shares during the Depression and held them until she was 100.

Remember that sometimes the most radical thing you can do is make a small decision and trust it long enough to prove everyone wrong.

Shari Lewis

She found out her show was cancelled by overhearing executives on an elevator—they didn’t even know she was standing behind them.

In 1963, Shari Lewis was one of the most talented performers in television.

She could sing. She could dance. She could conduct a symphony orchestra. She could perform ventriloquism so precisely that audiences forgot they were watching a woman with a sock on her hand.

She had trained at the American School of Ballet. Studied acting with Sanford Meisner. Learned piano at age two. Won a Peabody Award. Hosted a live television show without error—week after week, year after year.

And NBC executives decided she was replaceable.

They cancelled The Shari Lewis Show to make room for cartoons.

She wasn’t told directly. She learned about it while standing in an elevator, listening to men in suits discuss the decision as if she wasn’t there.

“All of it… my entire field crashed around my ears,” she said later.

The industry had made its position clear: children’s television was filler. If the audience was young, the work didn’t count. And the woman who created that work? She was a novelty. A mascot. Not an artist.

But here’s what they got wrong about Shari Lewis:

She didn’t need their permission.

When American networks abandoned live children’s programming, Lewis moved to England and hosted a show on the BBC for eight years. When that work dried up, she performed in Las Vegas casinos and in touring companies of Broadway shows, and appeared on variety programs with Ed Sullivan and Johnny Carson.

When those opportunities faded, she reinvented herself again.

She became one of the few female symphony conductors in the world—performing with over 100 orchestras, including the national symphonies of the United States, Canada, and Japan. She learned to speak Japanese for her performances there.

Once, she walked onto a stage at a state fair and found only four people in the audience.

She did the show anyway.

That was who Shari Lewis was.

Not a puppet act. Not a children’s entertainer waiting for permission. A performer who controlled timing, voice, pacing, and audience attention with surgical precision—and refused to stop working just because an industry decided she wasn’t serious.

Then, nearly 30 years after that elevator conversation, PBS came calling.

In 1992, at 59 years old, Lewis launched Lamb Chop’s Play-Along.

The show won five consecutive Emmy Awards. It was the first children’s program in seven years to beat Sesame Street for a writing Emmy. A new generation of children fell in love with Lamb Chop, Charlie Horse, and Hush Puppy—the same characters executives had declared “outdated” three decades earlier.

The audience hadn’t moved on. The industry had simply stopped paying attention.

Lewis didn’t treat this as a comeback. She treated it as what it always was: a correction.

She testified before Congress in 1993 to advocate for children’s television. Lamb Chop was granted special permission to speak. When elementary schools started cutting music programs, Lewis created The Charlie Horse Music Pizza to teach children about music through entertainment.

She was still innovating. Still refusing to be small.

In June 1998, Lewis was diagnosed with uterine cancer and given six weeks to live. She was in the middle of taping new episodes.

She finished them anyway.

Her final performance was a song called “Hello, Goodbye.” Her crew held back tears as she sang. She was saying goodbye to them, to the children watching, and to the character who had been her partner for over 40 years.

Shari Lewis died on August 2, 1998. She was 65 years old.

The industry remembered her fondly. It always does when it’s too late.

But her work didn’t need their remembrance. It endured on its own terms—passed down from parents to children, from one generation to the next, because the audience always knew what the executives never understood:

Precision is not small just because it serves children.

Craft is not diminished by joy.

And the woman who made a sock puppet come alive was never the novelty.

She was the reason it worked at all.

Doug Engelbart

The wooden box had three metallic wheels.

It looked like a toy, or perhaps a piece of scrap assembled in a garage, but the man holding it believed it was the key to the human mind.

It was December 9, 1968.

In the cavernous Brooks Hall in San Francisco, more than a thousand of the world’s top computer scientists sat in folding chairs, waiting. They were used to the roar of air conditioning units cooling massive mainframes. They were used to the smell of ozone and the stack of stiff paper punch cards that defined their working lives.

They were not used to Douglas Engelbart.

He sat alone on the stage, wearing a headset that looked like it belonged to a pilot, staring at a screen that flickered with a ghostly green light. Behind the scenes, a team of engineers held their breath, praying that the delicate web of wires and microwave signals they had cobbled together would hold for just ninety minutes.

If it worked, it would change how humanity thought.

If it failed, Douglas Engelbart would simply be the man who wasted millions of taxpayer dollars on a fantasy.

The world of 1968 was analog.

Information lived on paper. If you wanted to change a paragraph in a report, you retyped the entire page. If you wanted to send a document to a colleague in another city, you put it in an envelope and waited three days. If you wanted to calculate a trajectory, you gave a stack of cards to an operator, who fed them into a machine the size of a room, and you came back the next day for the results.

Computers were calculators. They were powerful, loud, and distant. They were owned by institutions, guarded by specialists, and kept behind glass walls. The idea that a single person would sit in front of a screen and “interact” with a computer in real-time was not just technically difficult; it was culturally absurd.

Engelbart, a soft-spoken engineer from Oregon, saw it differently.

He had grown up in the Depression, fixing water pumps and electrical lines. He understood tools. He believed that the problems facing humanity—war, poverty, disease—were becoming too complex for the unassisted human brain to solve. We needed better tools. We needed to “augment human intellect.”

For years, he had run a lab at the Stanford Research Institute (SRI). While others focused on making computers faster at math, Engelbart’s team focused on making them responsive. They built systems that allowed a user to point, click, and see results instantly.

They called their system NLS, or the “oN-Line System.”

It was a radical departure from the status quo. To the establishment, computing was serious business involving batch processing and efficiency. Engelbart was talking about “manipulating symbols” and “collaboration.”

The pressure on Engelbart was immense.

The funding for his Augmentation Research Center came from ARPA (the Advanced Research Projects Agency), the same government body responsible for military technology. They had poured significant resources into his vision, but results were hard to quantify. There were no enemy codes broken, no missile trajectories calculated. Just a group of men in California moving text around on a screen.

The critics were loud. They called him a dreamer. They said his ideas were “pie in the sky.” Why would anyone need to see a document on a screen when typewriters worked perfectly fine? Why would anyone need to point at data?

This presentation was his answer.

It was an all-or-nothing gamble.

To make the demonstration work, Engelbart wasn’t just using a computer on the stage. The machine itself—an SDS 940 mainframe—was thirty miles away in Menlo Park. He was controlling it remotely.

His team had leased two video lines from the telephone company, a massive expense and logistical nightmare. They had set up microwave transmitters on the roof of the civic center and on a truck parked on a ridge line to relay the signal.

In 1968, sending a video signal and a data signal simultaneously over thirty miles to a live audience was the equivalent of a moon landing.

The computer industry was built on a specific, rigid logic.
Computing Logic: Computers are scarce, expensive resources. Human time is cheap; computer time is expensive. Therefore, humans must prepare work offline (punch cards) to maximize the machine’s efficiency. Interactive computing wastes the machine’s time.

This logic governed the industry. It was why IBM was a titan. It was why office workers sat in rows with typewriters. It was the “correct” way to do things.

It worked perfectly—until it met Douglas Engelbart.

Engelbart believed that human time was the precious resource, not the machine’s. He believed the machine should serve the mind, even if it was “inefficient” for the hardware.

As the lights went down in Brooks Hall, the hum of the crowd faded.

Engelbart looked small on the big stage. The screen behind him, a massive projection of his small monitor, glowed into life.

He spoke into his microphone, his voice steady but quiet.

“If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsive to every action you had, how much value could you derive from that?”

It was a question nobody had ever asked.

He moved his right hand.

On the massive screen, a small dot moved.

The audience froze.

He wasn’t typing coordinates. He wasn’t entering a command code. He was simply moving his hand, and the digital ghost on the screen followed him. He was using the wooden box with the wheels—the device his team had nicknamed “the mouse” because the cord looked like a tail.

Today, a cursor moving on a screen is as natural as breathing. In 1968, it was magic.

But he didn’t stop there.

He clicked on a word. It was highlighted.

He deleted it. It vanished.

The text around it snapped shut to fill the gap.

A murmur ran through the hall. He wasn’t rewriting the page. He was manipulating information as if it were a physical object, yet it was made of light.

He showed them a “grocery list.” He categorized items. He collapsed the list so only the headers showed, then expanded it again to show the details.

He called this “view control.” We call it windowing.

He showed them a map. He clicked a link, and the screen jumped to a detailed diagram of a component. He clicked back, and he was at the map again.

He called it “hypermedia.” We call it the web.

The demonstration continued, each minute adding a new impossibility to the list.

The tension in the control room was suffocating. Every second the system stayed online was a victory against the laws of probability. A single blown fuse, a misaligned microwave dish, a software bug—any of it would have turned the screen black and ended the dream.

Then came the moment that truly broke the room.

Engelbart introduced a colleague, Bill Paxton.

Paxton wasn’t on stage. He was thirty miles away, sitting in the lab at SRI.

His face appeared in a window on the screen, crisp and clear.

The audience gasped.

They were looking at a man in Menlo Park, while listening to a man in San Francisco, both looking at the same document on the same screen.

“Okay, Bill,” Engelbart said. “Let’s work on this together.”

On the screen, two cursors appeared. One controlled by Engelbart, one by Paxton.

They edited the text together. Engelbart would point to a sentence, and Paxton would paste it into a new location. They were collaborating, in real-time, across a distance, using a shared digital workspace.

It was Google Docs, Zoom, and Slack, demonstrated a year before the internet (ARPANET) even existed.

The audience, composed of the smartest engineers in the world, sat in stunned silence. They were watching science fiction become a documentary.

They weren’t just seeing new gadgets. They were seeing the destruction of their entire worldview. The idea of the solitary computer operator was dead. The idea of the computer as a mere calculator was dead.

Engelbart was showing them a window into a world where minds could connect through machines.

He typed, he clicked, he spoke. He operated a “chorded keyset” with his left hand, entering commands as fast as a pianist, while his right hand flew across the desk with the mouse. He was a conductor of information.

For ninety minutes, the system held.

The microwave links stayed true. The software didn’t crash. The mainframe thirty miles away processed every command.

When Engelbart finally took off the headset and the screen went dark, there was a pause.

A hesitation.

Then, the audience stood.

It wasn’t a polite golf clap. It was a roar. It was the sound of a thousand experts realizing that everything they knew about their field had just become obsolete.

They rushed the stage. They wanted to touch the mouse. They wanted to see the keyset. They wanted to know how he did it.

The “Mother of All Demos,” as it was later christened, did not immediately change the market. Engelbart did not become a billionaire. He was a researcher, not a salesman. His system was too expensive and too complex for the 1970s.

But the seeds were planted.

Sitting in the audience were the young engineers who would go on to work at Xerox PARC. They would take the mouse, the windows, and the graphical interface, and they would refine them.

Steve Jobs would visit Xerox PARC a decade later, see the descendants of Engelbart’s mouse, and use them to build the Macintosh.

Bill Gates would see it and build Windows.

Tim Berners-Lee would use the concept of hypermedia to build the World Wide Web.

Every smartphone in a pocket, every laptop in a cafe, every video call made to a loved one across the ocean—it all traces back to that ninety-minute window in 1968.

Douglas Engelbart died in 2013. He never sought fame. He watched as the world caught up to the vision he had seen clearly half a century before.

He proved that the pressure of the status quo—the belief that “this is how it’s always been done”—is brittle. It can be broken by a single person with a wooden box and the courage to show us what is possible.

The system said computers were for numbers.

He showed us they were for people.

Sources: Detailed in “The Mother of All Demos” archives (SRI International). Smithsonian Magazine, “The 1968 Demo That Changed Computing.” New York Times obituary for Douglas Engelbart, 2013. Summary of events from the Doug Engelbart Institute records.

Failure? A Destination or a Progress Marker?

The dictionary has multiple definitions of failure:

  1. lack of success.
  2. an unsuccessful person or thing.
  3. the neglect or omission of expected or required action.
  4. a lack or deficiency of a desirable quality.
  5. the action or state of not functioning.
  6. a sudden cessation of power.
  7. the collapse of a business.

The definitions all deliver the impression of a finite conclusion rather than a step in a process. Failure equals being wrong. Being wrong equals death. As a result, failure has an obvious and deeply negative stigma associated with it. Hence most people fear failing.

In fact, many people do not even attempt worthwhile projects for fear of failure. Various motivational speakers have lamented this as sad, but it is a natural outcome of the way we are taught to think about failure: it is bad and to be avoided.

And it is a lot easier to say “don’t fear failure” than it is to spend the time necessary to change our thinking about it. So what is a better way to think of failure, and how do we change our thinking about it?

I don’t know how true it is, but I have heard that Edison failed 10,000 times before succeeding with the light bulb. Imagine if he had taken his first failure as an end point rather than a new starting point. Each failure could instead be described as a successful experiment, one that established that a particular hypothesis did not work.

I was struck by this when I was doing some pullups in the park with 13 kg of weights on my back. I was on my third set of 5 repetitions, and on the last repetition I could not pull myself up more than 85% of my full range of motion. That was my point of failure. Despite my best effort, I could not pull my body up to get my nose over the bar. I “failed”.

Now, when you are exercising, this is something to aim for. Exercising with good form till you are close to failure (with some capacity left in reserve) builds strength and muscle mass.

At this point I realised every person doing resistance training “fails”. We all hit a point where we can do no more, or very nearly so. We are all “failures”, just at different points. Some of us fail after 4 repetitions at 13 kg, as I did. Some of us fail after 44 repetitions, or with 50 kg. None of us stop training “because we failed”. We recognise it as a benchmark or a measure of progress rather than a destination. A “That’s where I am up to” viewpoint rather than a “That is my end result” viewpoint.

In many situations, such as in exercise, it is not about failure versus success, it is about WHEN you fail.

Some fail before they start, thinking it is too much effort.

Some fail at the first day that is either too hot or too cold for comfort.

Some fail when their results do not match their expectation.

A rare few fail after they win their marathon, receive their trophy, party on and go to bed at 2:00 am.

It’s all about WHEN you fail! This is why persistence is vital for success. The ultra-persistent refuse to fail until after the victory party.

Which reminded me of a quote I heard about successful marketers: “They fail fast and they fail often.” They try a lot of things, knowing that many of those ideas will fail and need to be abandoned quickly, before too much money is wasted on them. By failing fast and often, they sooner or later find what works, without too much wasted money, and can then do lots of that to huge success.

These top marketers know full well that a fear of failure will not lead to success.

They know that in marketing, as in exercising, it is very easy and natural to view failure as a marker, a peg in the board. A “This is where I am up to”. It is not the end of the road, it is the current position of my progress marker.

What if we started doing that in other spheres of activity? What if every time we thought of something and got the negative thought come in about failing, we just looked at it and thought, “That’s only to be expected. Nothing unusual here. Any time I fail it is merely the current position of my progress marker, just another step toward the ultimate success.”

This I wish for you!

Elizabeth Peratrovich

She sat quietly knitting while they called her people savages. Then she stood up and used their own words to destroy them.

Juneau, Alaska. February 8, 1945.

The Alaska Territorial Legislature chamber was crowded and tense. In the gallery sat dozens of Native Alaskans—Tlingit, Haida, Tsimshian—who had traveled to the capital for this moment. They came for a single law. The Anti-Discrimination Act. A bill that would make it illegal to post signs reading “No Natives Allowed.” That would let them enter any restaurant, any hotel, any theater without being turned away.

A law that would recognize them as equal citizens in their own ancestral homeland.

But first, they had to endure a hearing where white senators explained why Native people didn’t deserve equal rights.

This was 1945. Ten years before Rosa Parks. Nineteen years before the federal Civil Rights Act. Most Americans don’t know that the first anti-discrimination law in United States history was won in Alaska by a Tlingit woman facing down a room of hostile legislators.

Her name was Elizabeth Peratrovich.

And she was about to deliver one of the most devastating responses in American political history.

One senator after another rose to oppose the bill. They argued that the races should remain separate. That integration would cause problems. That Native people weren’t ready for full equality.

Then the insults became personal.

One senator complained openly that he didn’t want to sit next to Native people in theaters because of how they smelled. Another suggested that Native peoples lacked the sophistication to deserve equal treatment.

The Native people in the gallery sat in dignified silence. They’d heard these attitudes their entire lives—but never so brazenly, never in an official government chamber, never while forced to listen without recourse.

Then Senator Allen Shattuck stood. He was among the most vocal opponents of the bill. He looked directly at the Native people in the gallery, his voice dripping with contempt.

“Who are these people, barely out of savagery, who want to associate with us whites with five thousand years of recorded civilization behind us?”

The room went silent.

He had just called them savages. Primitives. People barely evolved enough to desire equality with civilized whites.

In the back of the chamber, Elizabeth Peratrovich was knitting. She was thirty-three years old, mother of three, and president of the Alaska Native Sisterhood. She was known for her composure, her quiet dignity even in the face of injustice.

She set her knitting needles down.

She stood.

Elizabeth hadn’t come prepared to testify. She was simply a Native woman who had spent her life seeing signs in windows telling her she wasn’t welcome. Who had been turned away from hotels. Who watched her children learn they were considered less than human in their own homeland.

She walked to the front of the chamber. Every eye followed her. The legislators who had been sneering moments before now watched in heavy silence.

She looked directly at Senator Shattuck. She didn’t raise her voice. She didn’t show anger. Her tone was measured, controlled, devastatingly clear.

“I would not have expected that I, who am barely out of savagery, would have to remind gentlemen with five thousand years of recorded civilization behind them of our Bill of Rights.”

The impact was immediate.

She had taken Shattuck’s insult—“barely out of savagery”—and turned it into a weapon. She used his claim of superior civilization to expose his complete lack of it.

A defensive murmur went through the opposition. They knew they’d been caught. Exposed. Shamed.

But Elizabeth wasn’t finished.

She described what it meant to see signs comparing her people to dogs. To have her children ask why they weren’t allowed in certain stores. To be treated as unwelcome in lands their ancestors had inhabited for thousands of years before any white settler arrived.

Then came what opponents thought would trap her. A senator asked skeptically whether a law could truly change people’s hearts and stop discrimination.

Elizabeth’s response became legendary.

“Do your laws against larceny and murder prevent those crimes?” she asked calmly. “No law will eliminate crimes, but at least you as legislators can assert to the world that you recognize the evil of the present situation and speak your intent to help us overcome discrimination.”

Silence.

She had dismantled every argument. She had proven she understood law, morality, and civilization better than the senators who claimed millennia of it.

The Native people didn’t need education from white legislators. The white legislators needed education from Elizabeth Peratrovich.

When the vote was called, the Anti-Discrimination Act of 1945 passed eleven to five.

The first anti-discrimination law in United States history.

Not in New York. Not in California. In Alaska. Because a Tlingit woman refused to remain silent when called a savage.

The law prohibited discrimination in public accommodations. It made “No Natives” signs illegal. It declared that Alaska would not tolerate racial discrimination.

Nineteen years before the federal Civil Rights Act. Ten years before Rosa Parks became a household name.

Yet most Americans have never heard of Elizabeth Peratrovich.

We learn about Rosa Parks, as we should. We study Martin Luther King Jr., the March on Washington, the Civil Rights Movement of the 1960s. These stories deserve to be taught and remembered.

But the woman who won the first anti-discrimination law in American history? The woman who faced down racist senators and won? She remains virtually unknown outside Alaska.

Why? Because Alaska wasn’t the South where national media focused. Because Native American civil rights struggles didn’t capture headlines the way other movements did. Because Elizabeth didn’t have a national platform or massive organization—just her dignity and her refusal to accept injustice.

But Alaska remembers.

February 16th is Elizabeth Peratrovich Day, an official state holiday. Schools and government offices close. In 2020, she was featured on the Native American one-dollar coin. In Juneau stands a bronze statue of Elizabeth, captured in quiet dignity—just as she stood in that chamber in 1945.

Yet beyond Alaska, her story remains obscure. That’s tragic, because what Elizabeth proved was fundamental.

Civilization isn’t measured by how many years your history spans. It’s not measured by monuments or recorded achievements or military conquests.

It’s measured by how you treat the vulnerable. By whether you uphold dignity or destroy it. By whether you use law to protect people or to oppress them.

Senator Shattuck claimed five thousand years of civilization. Elizabeth Peratrovich proved he had none.

Because what’s civilized about “No Dogs, No Natives” signs? What’s civilized about denying people access to public spaces in their own ancestral homeland? What’s civilized about a government official calling people savages?

Nothing.

Elizabeth didn’t need five thousand years of history. She needed moral clarity and courage.

She weaponized their own claims against them. She demonstrated that the supposed “savage” in the room understood America’s founding principles better than the “civilized” senators did.

And she won.

Elizabeth Peratrovich died in 1958 at age forty-seven. She didn’t live to see the federal Civil Rights Act. She didn’t see her image on coins or statues erected in her honor.

But she lived long enough to see “No Natives” signs removed throughout Alaska. She lived to know her children could enter any business in Juneau without being turned away. She lived to see the law of an entire territory changed because she refused to be silent.

That’s not just one woman’s victory. That’s proof that dignity is powerful. That moral clarity can defeat bigotry. That sometimes changing history requires one person willing to stand, set down their work, and speak truth to power.

The senators thought they had civilization on their side. They thought their recorded history gave them authority.

Elizabeth Peratrovich taught them that civilization isn’t inherited. It’s earned every single day by how you treat people.

She was knitting quietly while senators called her people savages.

Then she reminded them what civilization actually means.

And she won the first anti-discrimination law in United States history.

Elizabeth Peratrovich deserves to be as famous as any civil rights leader in American history.

Now you know her name.

The Palais Garnier

In 1861, the engineers hit a problem that should have ended the project. They dug down into the Parisian soil and found a swamp.

By 1875, that swamp sat beneath the most opulent building in Europe.

It was the height of the Second Empire. Napoleon III had commissioned Baron Haussmann to scrub the grime off medieval Paris and replace it with wide boulevards and monuments to Western civilization. At the center of this new urban jewel was to be a grand opera house.

But the ground refused to cooperate.

The site was waterlogged. For months, steam pumps ran day and night, trying to drain the soil, but the water table was too high. Critics sneered. The Emperor grew impatient. Then came the Franco-Prussian War, the fall of the Empire, and the bloody chaos of the Paris Commune. The project sat unfinished, a skeleton of stone in a broken city.

But a young architect named Charles Garnier refused to let the dream die.

Instead of fighting the water, he decided to tame it. He designed a massive double foundation of concrete and brick, creating a gigantic watertight cistern to relieve the groundwater pressure. He built an artificial lake beneath the opera house to float the massive stone structure safely above the muck.

While the government collapsed and regimes changed, Garnier kept building.

On January 5, 1875, the construction walls finally came down.

The Palais Garnier officially opened its doors to a gala that lasted over 13 hours. It was a spectacle of gold leaf, velvet, and marble that cost roughly 36 million francs—about $250 million in today’s money. The auditorium glittered under the light of a massive gas chandelier, seating nearly 2,000 of the city’s elite.

He built it for art.

He built it for France.

He built it for the ages.

The result was not just a theater, but a declaration that beauty and order could triumph over political chaos. The opulent staircase and the gilded statues proved that culture survives even when governments fall.

We see its legacy today in ways most don’t realize. That hidden underground lake Garnier built isn’t just an engineering trick; it became the setting for Gaston Leroux’s novel, “The Phantom of the Opera.” The spooky, subterranean reservoir is real, and it is still used by Paris firefighters in emergencies.

Even the legends were based on fact. In 1896, a counterweight from the chandelier really did fall, killing a concierge—an event Leroux wove into his ghost story.

Today, the Palais Garnier is protected as a French historic monument. It reminds us that true grandeur requires a deep, solid foundation to withstand the shifting sands of time.

Sources: Britannica / History Today

Leonard Bernstein

He was 25, an assistant conductor nobody knew. The star conductor got sick. He had one night to prepare—no rehearsal—then walked onto Carnegie Hall’s stage in front of a national radio audience.

November 13, 1943. Saturday evening. Leonard Bernstein was at home in New York when his phone rang. The voice on the other end belonged to someone from the New York Philharmonic, and they had news that would change his life.

Bruno Walter—one of the most celebrated conductors in the world—had fallen ill with the flu. He couldn’t conduct tomorrow’s concert at Carnegie Hall. The concert was sold out. It would be broadcast live on CBS Radio to millions of listeners across America.

They needed a replacement. Immediately.

Leonard Bernstein was 25 years old. He’d been appointed assistant conductor of the New York Philharmonic just a few months earlier—a promising position, but essentially a backup role. He attended rehearsals, studied scores, and waited for opportunities that rarely came.

Now one had arrived. With less than 24 hours’ notice.

Most conductors would panic. The program was demanding: Schumann’s Manfred Overture, a difficult contemporary work by Miklós Rózsa, Richard Strauss’s Don Quixote, and Wagner’s Prelude to Die Meistersinger. Complex pieces requiring precise communication between conductor and orchestra.

And Bernstein wouldn’t get a single rehearsal.

He’d have to walk onto that stage cold, in front of a packed Carnegie Hall audience and a national radio broadcast, and conduct one of the world’s greatest orchestras through a program he’d never rehearsed with them.

He said yes.

That night, Bernstein barely slept. He pored over the scores, visualizing every tempo change, every entrance, every dynamic shift. He’d studied these works before—he had a photographic memory for music—but studying and conducting are different things entirely.

Sunday, November 14, 1943. Afternoon. Bernstein arrived at Carnegie Hall. No time for a full rehearsal. No time to work out details with the orchestra. Just a brief sound check, a few words with the musicians, and then—showtime.

At 3:00 PM, Leonard Bernstein walked onto the stage of Carnegie Hall.

The audience had come to hear Bruno Walter, a legendary conductor who’d worked with Mahler himself. Instead, they saw a 25-year-old kid they’d never heard of.

Bernstein raised his baton.

What happened next became legend.

From the opening bars of Schumann’s Manfred Overture, it was clear something special was happening. Bernstein didn’t just conduct—he embodied the music. He leaped, twisted, swayed. His gestures were huge, theatrical, passionate. Some conductors are technical. Some are precise. Bernstein was fire.

The orchestra responded. Without rehearsal, they followed his energy, his vision, his interpretation. The Rózsa was thrilling. The Strauss was nuanced. The Wagner surged with power.

When the final note of Die Meistersinger faded, there was a moment of stunned silence.

Then the audience exploded.

Applause thundered through Carnegie Hall. People stood. They cheered. They’d witnessed something extraordinary—not just a good performance, but the arrival of a major talent.

Backstage, Bernstein was mobbed. Musicians congratulated him. Audience members pushed past ushers to shake his hand. The phone lines at CBS were jammed with listeners calling to ask: Who is this guy?

The next morning, The New York Times ran the story prominently. The headline captured what everyone was thinking: a young, unknown conductor had pulled off the impossible. By Monday afternoon, Leonard Bernstein was famous.

Artur Rodzinski, the Philharmonic’s music director who’d hired Bernstein as his assistant, later said he knew immediately that November 14th would be remembered as the day Leonard Bernstein became Leonard Bernstein.

He was right.

The phone calls started immediately. Guest conducting offers poured in. Recording contracts. Commissions. Suddenly, every major orchestra in America wanted this 25-year-old who’d conquered Carnegie Hall with no rehearsal.

But November 14, 1943 wasn’t just about one great performance.

It revealed something essential about Leonard Bernstein: he was fearless.

Most conductors would have been paralyzed by the circumstances. No rehearsal? National broadcast? Substituting for a legend? The pressure alone would crush most people.

Bernstein thrived on it. The impossible odds didn’t scare him—they ignited him.

Over the next 47 years, Leonard Bernstein became one of the most important musicians of the 20th century. He composed West Side Story, Candide, and symphonies that are still performed worldwide. He became music director of the New York Philharmonic—the first American-born conductor to hold that position. He taught generations of musicians. He brought classical music to millions through his televised Young People’s Concerts.

But it all started with one phone call on a Saturday night and one impossible Sunday afternoon.

Leonard Bernstein’s story reminds us that greatness often arrives without warning. You don’t get to choose when your moment comes. You don’t get time to prepare perfectly. The opportunity appears, and you either rise to meet it or watch it pass.

Bernstein rose.

He was 25 years old, an assistant conductor with no rehearsal, facing the most important performance of his young life. He could have played it safe, followed the basics, just gotten through it.

Instead, he conducted like his entire future depended on it.

Because it did.