Archive for the ‘United States’ Category

There isn’t a person from my generation who hasn’t heard of Pac-Man.  And there are very few people from my generation who haven’t played Pac-Man.  Personally, I never was a huge fan of the video game, but that may be because I was never very good at it. Pac-Man, you say?  What is this Pac-Man game to which you refer?  If you’re asking the question (or something similar), you must not be from my generation, but I’ll indulge you with a brief description.

The game begins with you in the middle of a maze as a yellow circle.  The maze is full of little yellow dots that you eat for points and some bigger “power-up” dots.  Above you in a center box are four enemies (Blinky, Pinky, Inky, and Clyde).  Their goal is to track you down.  Your goal is to eat all the little dots before you’re tracked down…pretty straightforward.  The power pellets turn the enemies blue, making them suitable for you to eat.  Once you clear all the dots, there is some funky music and you progress to the next level.  Subsequent levels introduce additional bonuses like fruit, but I rarely saw those because I stunk at the game.

If you were a very adept player, there were basically an unlimited number of levels you could play.  The monsters would get faster and stay under the “power-pellet” influence for shorter periods of time.  I say “basically” because, while the game was meant to have unlimited levels, a bug in the software limited the fun to just 255 levels.  But still, you could play to that point and have wasted several hours of your day for just a quarter…not a bad investment.
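
As an aside for the technically curious, that 255-level ceiling is commonly attributed to the game storing its level counter in a single byte, which can only hold values from 0 to 255.  On the 256th board the counter wraps around, and a drawing routine keyed off that value scribbles garbage over half the maze, producing the famous “kill screen.”  Here is a rough sketch in C (my own illustration, not the actual arcade code) of the kind of wraparound involved:

```c
#include <stdio.h>
#include <stdint.h>

/* A rough illustration (not the actual Pac-Man arcade code) of how an
 * 8-bit level counter wraps around.  A uint8_t can only hold 0-255, so
 * "level 256" simply cannot be represented. */
int main(void) {
    uint8_t level = 255;        /* the 256th board, counted from 0 */
    uint8_t next  = level + 1;  /* 256 wraps around to 0           */

    printf("level counter: %u\n", (unsigned)level);
    printf("counter after one more level: %u\n", (unsigned)next);

    /* In the real game, a fruit-drawing routine fed a wrapped value like
     * this runs far past the few fruit it was meant to draw, overwriting
     * half the maze with garbage: hence the level-256 "kill screen". */
    return 0;
}
```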

So all in all, a pretty simple concept.  On May 22, 1980, the gaming company Namco released this little experiment on the world, probably with no idea that it would become one of the most popular video games of all time.  Kids (of all ages) took to Pac-Man like parachute pants, break-dancing, and The Cosby Show, dropping quarter after quarter down the money-hungry maw of the console.  Hundreds of thousands of consoles were sold and billions of dollars were “invested” in an effort to, as one song-writer penned, “carve my name in a video game.”

I spent a few dollars on Pac-Man, but quickly realized that I didn’t have the patience or skill to advance past the second or third level.  While the game has largely gone the way of…well…parachute pants, break-dancing, and The Cosby Show, there are still those who work to achieve perfection.  That consists of clearing all 255 levels and all the associated bonuses, which earns you a little over three million points…and the loss of a quarter.

The bigger cash out, in my mind, is that Namco’s creation forever changed the landscape of gaming.  If you replace the 2-D maze with a 3-D version, modify your character to be a person with a gun, and change the enemies to monsters, you have created any of the first-person shooters that came on the scene a decade later.  These include best-sellers like id Software’s Wolfenstein/Doom/Quake franchises or Half-Life or any of dozens of other examples.

Expand your vista a bit and replace the maze with a 3-D world and put your character in the military.  You’ve just created Bohemia Interactive’s incredible Operation Flashpoint / Armed Assault series, the Medal of Honor series, or again, any of a number of military-based shooters.

This is not to say that none of these other games would have come into existence without Pac-Man.  But Pac-Man truly showed that these types of games were not only feasible, but undeniably popular.

Read Full Post »

The signing of the Treaty of Paris in 1783 may have officially signaled the end of America’s fight for freedom with Britain, but in the Colonies, there were still battles being fought.  Those that had backed the cause of freedom took a very dim view of the Tories (or Loyalists), whose allegiances remained more strongly with Britain.  And now that the war was concluded, there was throughout America a wave of anti-British, anti-Tory sentiment – a sort of pent-up backlash against those that supported the losing side.

This was especially true in New York City.  The Big Apple – which, while a sizable city in the 1780s, was still a very Small Apple by today’s standards – had probably been the most “Tory-ized” city in the Colonies.  The city’s citizens were, in general, strongly loyal to the British when fighting broke out, and it was very quickly captured by the British.  As we know, a good portion of the city was burned at that time, though no one knows with certainty which side was responsible for striking the match.

Colonists that stood for independence from Britain felt compelled to flee the city, leaving behind homes, property, and possessions, which were subsequently confiscated by the British military.  New York City remained a British / Loyalist stronghold throughout the war.  It was also something of a prison camp, as ships moored in the East River served as jails for captured American soldiers.  Conditions in these prison boats were appalling.  Disease, malnutrition, and general mistreatment aboard these ships led to the deaths of more than eleven thousand Patriots.

When hostilities ended with Britain, they began anew when New York Patriots returned to their city.  Many found their homes badly damaged or destroyed and their possessions plundered.  Bones of dead Patriots littered the shores of the East River (and would continue to do so for years).  And since the British military was gone, most of the fallout landed on those Loyalist citizens remaining.

Persecution broke out against the Loyalists as their opponents vented their rage in a search for vengeance.  Some were killed, more were tarred and feathered, and all were affected by various laws that were passed against them.

There were some who argued for moderation.  Much could be said about a victorious people by how it treated its vanquished enemy.  Many countries around the world had very good relations with British subjects, and might take a dim view of their mistreatment, which could affect future diplomacy and trade.

But for now, anger won out.  The New York legislature passed laws allowing the seizure of Tory estates.  There were laws that allowed returning homeowners to sue their Tory occupiers for any damages.  Legislation robbed Tories of the ability to work, stuck them with heavy taxes, and took away many of their basic rights.  While some of these laws may have made some sense, many were passed simply as acts of retribution…or worse, authored by those who found a way to gain financially at Tory expense.

And on May 12, 1784, the legislature passed a law that rescinded the voting rights of all Tories for two years.  Many returning New Yorkers rejoiced at the measure, but others (besides the Loyalists) were horrified.  They pointed to the Treaty of Paris itself, which called for both sides to “forget all past misunderstandings and differences.”  But more specifically, one of the main points of the Treaty stated that the Congress of the Confederation would “earnestly recommend” to state legislatures that they recognize the rightful owners of all confiscated lands and “provide for the restitution of all estates, rights, and properties, which have been confiscated belonging to real British subjects [Loyalists].”  Voting restrictions were a direct violation of the Treaty.

But still the legislation had passed.  The war may have been over, but the fighting certainly was not.

Read Full Post »

It wasn’t the White House, because it didn’t yet exist.  It wasn’t the Capitol building, because it didn’t yet exist, either.  And it wasn’t even Washington, D.C., because, well, that property still belonged to the states of Maryland and Virginia.  But when we think of a Presidential inauguration, all of those places are usually top of mind.  In 1789, however, they were completely out of mind.  So New York City provided the locale, and Federal Hall (the remodeled old City Hall) provided the venue for the very first Presidential inauguration.

George Washington was a very nervous man; probably way more nervous than when he had spoken his wedding vows.  He had been unanimously elected to lead a new country with a new charter and a completely new form of government.  He had spent the winter talking about how he was unqualified to lead, even while the country believed, almost without exception, that he was the most capable man to do so.  His wife, Martha, didn’t really look forward to being First Lady.  In fact, in his biography of the First President, Ron Chernow writes that Mrs. Washington “talked about the presidency as an indescribable calamity that had befallen her.”

Regardless of feelings, there was no backing out now.  Vice President Adams, in front of the First Congress, turned to the President-elect and said, “Sir, the Senate and the House of Representatives are ready to attend you to take the oath required by the Constitution.”

Washington stepped out onto the balcony shortly after noon on April 30, 1789 to an immense roar and took the oath.  Though not required, a committee thought it appropriate, at the eleventh hour, to have the President place his hand on a Bible.  But where to find one?  In the end, a local Masonic Lodge provided its Masonic Bible and Washington was administered the oath.

Then the President addressed the crowd.  Again, this was not required by the Constitution, but it seemed right.  Washington’s original speech, written by David Humphreys, spent too much time defending his decision to accept the Presidency.  It spent too much time talking about his faith in the American people (not necessarily a bad thing).  It spent too much time downplaying any form of dynasty (Washington was childless, after all).  It delved too close (and again, at too great a length) to legislative matters for executive branch comfort.  In fact, at seventy-three pages, it spent too much time on everything.

When Washington sent the speech to James Madison for his thoughts, Madison promptly tossed it out and wrote a much more succinct address that steered clear of legislative issues, which the President readily accepted and delivered.

Washington had become a household name in the Colonies during the French and Indian War.  He had become the hero of the American Revolution.  He had been a calming force (though he barely spoke) at the critical and sometimes contentious Constitutional Convention.  And now he was the President, chosen by the people (by a wide margin) and the Electoral College (by unanimous consent).

Recommended Reading: Washington – A Life

Read Full Post »

The San Francisco earthquake needs no real introduction.  And that’s true despite the fact that the region surrounding San Francisco Bay is bumped and jostled by a good many quakes each year.  Most of them are rather mild, and I suppose some that can be detected by seismic equipment aren’t even felt by the public.

But many can be felt, even if only a little.  Living in the Midwest, I’ve never experienced an earthquake, so I have no idea what one feels like.  I imagine there’s a low rumble and then some wiggling around for a few seconds.  Maybe one feels a bit woozy and disoriented, sort of like air- or sea-sickness, but again, I’m just guessing.  Californians have a far greater depth of experience than I.

Like I said, most quakes are fairly small, but there have been some biggies.  There was a powerful quake that struck in 1989 as the World Series was getting underway…we’ve talked about that one.  But when someone mentions The San Francisco Earthquake, just one is being referenced.

The earthquake that struck on April 18, 1906.

Residents of the city were jolted awake shortly after 5:00am by a powerful shock that measured 7.9 on the Richter Scale, as the San Andreas Fault (which runs just west of the city and bay) ruptured along 300 of its 800 miles.  I’m not an expert, but from what I’ve read, the San Andreas Fault is where two of the earth’s plates meet.  The western plate tends to edge north while the eastern plate moves south.  Over time, stresses build up as the plates grind against each other.  Then the pressure releases in a quake.  Most are small, but this particular one was not.

It toppled buildings and homes on a grand scale, causing tremendous damage.  But just as devastating was the resultant fire which, combined with the quake, destroyed upwards of 80% of the city.  Most of the pictures of the quake’s aftermath show destruction on par with cities that were heavily bombed during the Second World War.  More than 3,000 lives were lost and more than half the city’s population was left homeless, making it California’s worst natural disaster, and one of the worst disasters in U.S. history.

Today, structures on the West Coast are built with the various fault lines in mind.  Much like Japan, everything is done with “an eye toward the ground.”  In every sense, San Francisco is far more prepared to deal with earthquakes than, say, St. Louis, which also sits in relative proximity to a fault.  But as I said before, the San Andreas still lurks…

Read Full Post »

On April 12, 1770, British Parliament repealed the Townshend Acts.  That’s Today’s History Lesson, and you’re free to go.  Enjoy your evening.

For those of you that would like a wee bit more information, you’re welcome to hang out for a couple minutes longer.

The Townshend Acts were a series of laws passed by the British in the late 1760s.  Their function (like many of the “acts” of the time) involved some form of taxation.  The British were carrying an enormous war debt and needed help paying for it.  They also maintained a sizeable military force in the Colonies, and one of its functions was (ostensibly) to protect the Colonies.  So Parliament believed that the protected citizens should help defray the costs.

The Townshend Acts included the Revenue Act of 1767, the Indemnity Act, the Commissioners of Customs Act, the Vice Admiralty Court Act, and the New York Restraining Act.  In the past, it had been notoriously difficult for the British to collect the taxes they levied against the Colonies, because people didn’t want to pay and found ways around them.  The Townshend Acts were designed to make the people kind of feel better about paying up.

The taxes from these acts were used to pay the salaries of judges and governors, the idea being that since the money no longer came from the colonial assemblies, the people in power would be independent of local pressure and answerable to Britain instead.  Yeah, it seems a little fishy to me, too.  The money was also used to improve enforcement of other trade rules (in other words, to make sure taxes from other laws still in place were collected).  And, in the case of the New York Restraining Act, there was a bit of punishment for the response to the Quartering Act.

Like most other tax laws of the day, these were met with serious opposition.  This led to the call by local British officials for more soldiers.  This led to more unrest, and eventually the city of Boston was occupied by the British.  This led to more angst, and then there was the death of Christopher Seider which, along with the strong British presence, culminated in the Boston Massacre.

At this point, debate began on at least a partial repeal of the “revenue” parts of the Townshend Acts.  That repeal was passed a month later, on April 12th.

One interesting note as we close.  The one tax that remained was the tax on tea, and we all know how that ended up.

Read Full Post »

The French Revolution, which began in 1789, was seen by many American citizens as a chance for another country to throw off the shackles of tyranny.  After all, the Colonies had, in the previous decade, successfully removed British control.  The idea of the French doing the same had big appeal in the Colonies-turned-States.

But as we know, this upheaval quickly turned from “Revolution” to “Reign of Terror”, and the fall of the guillotine’s blade became more common than sunrise and sunset in France.  Thousands of the nation’s leaders were slaughtered and tens of thousands of its civilians massacred in a display of countryman-against-countryman butchery that has rarely been duplicated in history.

King Louis XVI was beheaded in January 1793, his head and body stuffed in a basket, then eventually buried in a box.  One executioner began an impromptu business, selling bits of the King’s hair and clothing, as schoolboys cheered and licked the King’s blood.  Make no mistake, the American Revolution was about freedom, and the French Revolution was a disgusting display of man’s basest inhumanity and brutality.

England watched from across the Channel in horror.  William Pitt the Younger called the King’s execution “the foulest and most atrocious act the world has ever seen.”  France’s response?…a declaration of war on February 1.  News travelled slowly back then, and word of war didn’t arrive in America until early April, but it was immediately felt in the States, as pro-British and pro-French elements took their sides and waited for the government to make its position known.  President Washington very quickly (and very wisely, in my opinion) acted and, in April, offered up the Proclamation of Neutrality.  America would not take any side.

But in between the arrival of the news of war and the government’s decision to remain neutral, there was another arrival…this one in Charleston, South Carolina.  On April 8, 1793, the French Minister to the United States arrived aboard the frigate Embuscade.  His name?…Edmond Charles Genet.  But, as Chernow writes, “he would be known to history, in the fraternal style popularized by the French Revolution, as Citizen Genet.”

For those with British sympathies, Genet was their worst nightmare.  For anyone siding with the French, here was a man to greet with effusive praise and much regalement.  Genet’s pomp and arrogance not only made him an incredibly polarizing figure, it also meant he was “all the news” for a while.  And that made it easier for the Frenchman to move about and peddle his influence, for Citizen Genet didn’t come to America to escape the Reign of Terror.  This man had an agenda.

And over the course of the next year or so, his disregard for American authority and American foreign policy, which under most circumstances would likely have been treated as sedition, would cause no end of trouble.

We’ll check back in on Citizen Genet again…trust me.

Recommended Reading: Alexander Hamilton

Read Full Post »

It will be a brief one this evening.

The very fact that you’ve arrived at this website indicates that either you have an interest in historical events or a search engine believes you do.  But regardless of the exact reason, I’ve (to this point) written nothing about events in the 21st century, so something historical drove you here, willingly or no.

Since you’re here, I’m going to assume you know a little something about Jesse Owens, the famed African-American sprinter who turned the 1936 Olympic Games on their head.  If you want a little more background, I put together a little piece years ago that will flesh out some of the story.

The story of James Cleveland Owens – the name “Jesse” originated from the heavily accented way in which he pronounced the initials “J. C.” – is remarkable both as an athletic story and as a “social consciousness” story.  The 1936 Olympics were, as you might know, held in Berlin, Germany.  Hitler’s National Socialist party was incredibly racist, believing its own German people were ethnically superior.  Others, particularly Jews and Africans, were seen as second class or even sub-human.

So while Owens was shattering records on the track, he was also shattering the racist myth that “blond hair and blue eyes” was better physically and intellectually.  It is said that Hitler’s disdain (and embarrassment) at Owens’ dominance caused him to leave the Games early…I don’t know for sure, but that sounds like the guy.

I wasn’t there, but for Americans back home, the news of Owens’ exploits was probably received with mixed reviews.  Our country’s founding principles stated that, in God’s eyes (ok…the Creator’s eyes if you want to be technical), everyone is created equal.  But in practice, America was way more than a little hypocritical.  Certain people, especially those of a different color, were considerably less free than others, and forced separation of the races (we called it Segregation) was the order of the day in many southern states.

Jesse Owens probably felt that hypocrisy when he returned to America.  Four Gold Medals was an astonishing feat, yet he wasn’t invited to the White House to be honored or celebrated.  Owens was quoted as saying, “…it was FDR who snubbed me.  The President didn’t even send me a telegram.”  It took nearly thirty-five years for him to be enshrined in Alabama’s Hall of Fame.  Now maybe there are timing rules for that, but it seems ridiculously long to me.

Fortunately, our time has been kinder to Owens than Owens’ time was.  President Ford awarded this athletic giant the Presidential Medal of Freedom in 1976 (shown above).  And on this day, March 28, 1990, President Bush posthumously awarded him the Congressional Gold Medal.  These awards together constitute the highest honors that the government can give a civilian.

It was about time.

Read Full Post »

As Garrison Keillor would say, “Well, it’s been a quiet week…“.  I thought about writing every day last week, but I have this silly little birth defect in my lower back that flares up from time to time.  It’s usually not too much trouble – a little discomfort, a little inconvenience – but this time it was worse.  While walking and riding my bike weren’t too bad, it was pretty painful to sit.  So I spent a lot of time standing around the house, and standing isn’t really conducive to typing on the computer.

I’m a little better today, so much so that I was able to do a few things around the house while my wife was off at a baby shower.  The small of my back is still quite tender, but it seems the worst may have passed.  So while things are good, let’s have a quick write here.

We’ll head back to pre-Revolutionary days.  After all, if I’m not on a World War Two battlefield, I’m pretty much in the Colonies.

The Stamp Act was created by a vote of British Parliament in March of 1765.  It was levied on the Colonies in November of that year.  And to say it was unpopular would be a gross understatement.  But it’s not as though taxes were a new thing.  The Thirteen Colonies had seen their share in recent years, particularly since Britain had stopped fighting with France.  The government had put down its sword and taken up its fiscal pen, only to find itself mired in the all-consuming quicksand of debt.

The interest payment alone on the debt amounted to more than half of the overall yearly budget.  And regardless of the actual number, that’s a staggering percentage.  So the British decided to raise taxes.  Sometimes that’s a necessity.  Living in 21st-century America and up to our eyeballs in government debt, we understand the reality of taxes.  If any government (American, British, or Quatloo) wants to spend lots and lots of money, the people outside of the government are going to have to provide that money.  It was no different in the 18th-century British empire.  But the British also maintained a solid military presence in the Colonies, and Parliament believed it was reasonable that the Colonies pay for the benefits they received.

It wasn’t so much that taxes angered the Colonies.  As I just wrote, taxes weren’t new.  But as we all know, the Colonies were required to pay the taxes without any participation in the process.  They weren’t allowed to offer up alternative ideas, no “colonial” representatives were given any voting power in Parliament, and Colonists had no say in how the revenue would be spent.

So while the tax wasn’t really all that evil, the Colonists were pretty unhappy.

And when it went into effect, the British discovered that enforcing the tax was really difficult.  More troubling was the fact that many colonial merchants were now refusing to import British products until the Stamp Act was repealed.  As a result, British companies were feeling a pinch.  Most troubling of all was the discontent that the taxes had created in America.  People were taking to the streets.  There was shouting.  They were burning tax collectors in effigy (see the image above).  There were inflammatory articles in newspapers that fanned the emotions of the readers.

The British realized that no good thing was coming out of what amounted to a one-penny tax.  So on March 18, 1766, Parliament and King George III repealed the Stamp Act.  For the Colonists, it was a victory of principle and the end of a hated four-month tax.  For the British, it was “back to the drawing board” for new ideas on getting the Colonists to pony up and help pay down the debt.

Read Full Post »

Pretty much everybody has heard of the Boston Massacre.  Even if one doesn’t know all the details, almost anyone can put enough facts together to get the gist of the story.  Way back in 2008, when Today’s History Lesson was newborn, my good friend Michael covered the Boston Massacre.  I don’t feel any real need to add to his very good synopsis, but let’s take a couple minutes and cover a related issue.

The Fifth Anniversary of the Boston Massacre.

March 5, 1775 was the date and the Old South Meeting House was the venue.  The gathering included, of course, Samuel Adams and John Hancock.  There were several men that spoke, including Hancock and Benjamin Church.  They were followed by Dr. Joseph Warren, wearing a white toga (reminiscent of the orators in the ancient Roman Senate).  He spoke of the Pilgrims leaving Europe, comparing it to Noah’s year in the ark, leaving a sin-stained world for a fresh, new existence.  He talked about Britain’s commitment to its taxation of the Colonies.

But Warren’s most colorful language was reserved for the memories of those killed on that fateful day five years earlier, and Ira Stoll records it in his biography of Samuel Adams.  “Take heed, ye orphan babes, lest, whilst your streaming eyes are fixed upon the ghastly corpse, your feet glide on the stones bespattered with your father’s brains. . . . We wildly stare about, and with amazement ask, who spread this ruin round us?  what wretch has dared deface the image of his God?  has haughty France, or cruel Spain, sent forth her myrmidons?  has the grim savage rushed again from the far distant wilderness?  or does some fiend, fierce from the depth of hell, with all the rancorous malice, which the apostate damned can feel, twang her destructive bow, and hurl her deadly arrows at our breast?  no, none of these; but, how astonishing!  It is the hand of Britain that inflicts the wound.”

Warren’s goal of winding up those gathered was achieved.  But more than that, the British officers that were present (and seated towards the front) also got excited, but for entirely different reasons.  As he finished, Samuel Adams told those assembled to return the following year to again commemorate the bloody massacre.

And it was the word “bloody” that set the officers off.  A bit of a melee ensued, and some report that Adams was challenged to a duel.  Others report that Adams accepted.  Fortunately, cooler heads prevailed, and a second Boston Massacre was avoided…barely.

But there is little doubt that Colonists like Church, Adams, and Hancock left the Meeting House with big British targets on their backs.  And you could add Joseph Warren to the list as well.  He joined the Massachusetts militia, but his Revolution (and his life) ended just three months later when he was killed at Bunker Hill.

Recommended Reading: Samuel Adams: A Life

Read Full Post »

In our world, there are lots of famous pairs.  There are a lot of things that just work really well together, like they were meant to be.  And as we start the fifth year of Today’s History Lesson, let’s name some.

Chocolate and peanut butter.
Donnie and Marie.
Spaghetti and meatballs.
The Lone Ranger and Tonto.
Calvin and Hobbes.
Blue Falcon and Dog Wonder.
Abbott and Costello.
Sonny and Cher (ok…admittedly, they worked slightly less well together).
Starsky and Hutch.
Brooks and Dunn.

You get the picture.  In the political world, there have been famous pairings, too.  We immediately think of duos like Thomas Jefferson and James Madison, or maybe John and Abigail Adams.  Lexington and Concord.  Valley Forge and Baron von Steuben.  Republicans and tax breaks for the wealthy…I jest, I jest!!!  Hmmm…Democrats and deficits…there, does that even it out?  Anyways, we could go on and on, but I’ll focus instead on one.

George Washington and Alexander Hamilton.  We’ve talked about both of these immensely influential Founders on many occasions, but it’s time we put them together.

Hamilton and Washington were a team for the better part of twenty-five years.  Washington, the first President, was the calm, steady leader.  Hamilton, the first Treasury Secretary, was the impetuous, forceful subordinate.  In fact, it’s very safe to say that during Washington’s first term (and much of his second), Alexander was the second most powerful man in America.  He was more powerful than Vice President Adams.  He was more powerful than Secretary of State Jefferson.

Hamilton’s influence made him a lot of enemies, and Washington’s deference to Hamilton made a great many exceedingly jealous.  Thomas Jefferson, in particular, came to believe that Washington was little more than a marionette, dancing on the strings manipulated from above by a power-maddened Hamilton.

But George Washington’s trust in Hamilton was built on years of experience in close proximity to the man.  Whether you like Hamilton or hate him (or are completely indifferent), you must know that Washington was a pretty good judge of people, and he knew Hamilton better than most.

Their collaboration began on this day in history…March 1, 1777.  George Washington was a General…in fact, he was the General of the army.  Alexander Hamilton was an artillery company Captain, who had distinguished himself in the Battle of White Plains and the Battle of Trenton.  His leadership abilities and good performance under pressure (and under fire) had made him something of a desirable commodity.  General Nathanael Greene had requested his services.  Henry Knox (at that time a Brigadier General) had also sought out Hamilton to be an aide.  Hamilton had refused both, preferring to earn his Revolutionary glory on the field of battle.

But when General Washington invited Hamilton to join his staff as an aide-de-camp, it was an offer he simply couldn’t refuse.  He accepted the General’s offer and joined his staff on this day with the elevated rank of Lieutenant Colonel.  And that’s where this “dynamic duo” got its start.

Speaking of Captains, our son learned today that he has been promoted to the rank of Captain.  Congratulations to him!!

Read Full Post »

Little Christopher Seider probably just wanted to play.  I will wager that, like any ten-year-old boy, he was short on attention and long on energy.  Running through town with his friends, throwing whatever he could fit in his hands, and yelling were not strange activities to him.

I can say all that because I was ten once, though it was quite a while ago.  I did all those things.  I also rode a bike, played with toys, watched a little TV, and so on.  The biggest difference between myself and Christopher Seider (besides the year in which we were ten) is that I lived to see my eleventh birthday.

Christopher Seider did not.

In fact, little Seider died on this day in history.  But his death was not just another in a long line of deaths that has plagued a world where death rates run pretty close to 100%.  This young boy lost his life at a significant time in history, and while you may not have heard of him, he was famous.

To know Christopher, you must also know Ebenezer Richardson.  Well, we can’t fully know him, because there isn’t a lot to know.  He was something of a shady character with a spotted reputation around Boston.  He was a Loyalist, which should give you a hint that we’re heading toward the time of the American Revolution.  He also was an informant to the Attorney General, giving up information about “rebel” activity in town.

February 22, 1770 was a cold, bleak, wintery Thursday that found the Boston townsfolk in an uproar about a local Loyalist merchant.  The standard action was to raise a ruckus at the shopkeeper’s home, yell a lot, throw some rocks, break a window or two, and make their point.  Ebenezer Richardson, wanting to protect a fellow Loyalist, tried to stop the mob, but they simply threw rocks at him, at least one of which hit him in the head.

So Richardson did what all too many people do when something doesn’t go their way:  get a gun and shoot somebody.  More specifically, he went to his house, grabbed his musket, and headed for the shopkeeper’s house, where the mob had gathered.  He climbed to the top of a neighboring building and…

Christopher Seider had little idea what the mob was about, but here was a chance to run down the streets of Boston and throw some rocks.  He and his friends were having a ball.  The people they were with were not only going to let them throw rocks, they were going to do it themselves.  For a ten-year-old, this was pretty exciting.  Exciting, that is, until the bullets started flying.

Richardson, in an effort to break up the mob, began firing randomly into the crowd.  He hit Christopher twice, in the chest and head, and the little boy died that evening.  Ebenezer was immediately apprehended and jailed, but later acquitted.

Needless to say, Seider’s death galvanized Bostonians against the British.  Where there used to be vocal exchanges between the two groups, there were now snowballs, which became rocks and homemade spears.  The tensions rapidly reached the breaking point.

Two weeks later, the rocks and snowballs morphed into a physical group attack, as angry citizens charged into a group of British soldiers.  This most famous of events, which we know as the Boston Massacre, left another five people dead.

Read Full Post »

So…it’s Valentine’s Day.  Card companies and flower companies and candy companies love this day for obvious reasons.  People that work at places that sell cards and flowers and candy probably love it a little less, just because of the manic shopping that takes place in the days leading up to (and especially the day of) the holiday.

In general, it’s a fun day with some treats and time spent with those we love.

But it’s not that way for everybody.  For some, Valentine’s Day conjures up pains or hurts that they’d rather not remember.  That was certainly the case for a young Theodore Roosevelt.

On February 14, 1884, the young man who would be President suffered the most grievous of losses.  It may not be the best source for this type of incident, but since I read about it in Timothy Egan’s The Big Burn, it’s the source I’m using.

“The blow of a lifetime came early, on Valentine’s Day 1884, perhaps the best-known single day of trauma in the formative period of a future president.  In the morning, Teddy’s mother died of typhoid fever at the family house on Fifty-seventh Street; she was forty-six.  A few hours later, the suddenly orphaned Roosevelt lost his bride in the same house, to Bright’s disease, a kidney ailment, which had been masked by her pregnancy.  He scrawled a big, shaky X on a diary page and wrote a single sentence: ‘The light has gone out of my life.'”

The young man, in his mid-twenties and a budding politician, chucked it all and headed west, where friends and family and politics wouldn’t be around, and where the Badlands and open country could maybe concoct an elixir to clear the head of a man crushed by loss.  It would be two years before he returned to Manhattan.

Recommended Reading: The Big Burn

Read Full Post »

There are a lot of things I could say about the Seven Years’ War, but most of those things would be made up.  I vaguely remember discussing it in high school American History class and again in World History, but I was in high school many years ago and the memories have mostly faded.

I seem to recall that this conflict was something of a “world war”, not so much because the conflict spanned the globe (though it was pretty widespread), but more because the players involved were all the “all-star” countries.  Britain and Prussia and Portugal were on one side.  They were opposed by France, Russia, Spain, and Austria on the other.  There were probably some other players, too.  The conflict included the French and Indian War, where George Washington became a colonial hero.

The war formally began in 1756 and, if we hold to the war’s name, simple math tells us it should have ended in 1763.  Sure enough, the Treaty of Paris (not this Treaty of Paris) was signed on February 10, 1763.  And as it turns out, a bunch of territory conquered by each nation was returned to its previous owners, which raises the question as to why they went to war in the first place.  Of course, I don’t know…like I said, high school was twenty-five years ago.

But some countries actually did alright for themselves.  Here in America, Britain (which already owned the Thirteen Colonies) gained from France all the territory east of the Mississippi that it didn’t already possess.  France also ceded New Orleans and, if I recall, the Louisiana Territory to Spain.

So Britain did alright as far as territory was concerned.  But as we well know in the 21st century, war carries baggage of its own, and the 18th century was no different.  In his biography of Samuel Adams, Ira Stoll writes, “One might not know it from the lavish spending or displays of wealth at the coronation, but Britain was mired in war debt . . . The debt was staggering – by 1763 it was £122.6 million, which meant £4.4 million just in annual interest costs, or more than half the total annual budget of £8 million.”

The incredible financial burden not only caused King George III to come to the peace table, it had him looking for ways to pay down this massive debt.  And that meant taxes.  And it meant taxing the Thirteen Colonies.  And that meant imposing the taxes on the Colonists without really seeking their input or getting their approval.

First, steps were taken to better enforce collection of existing tariffs, which didn’t sit well with folks accustomed to having the British mostly look the other way.  But it was the Sugar Act, imposed the following year, that started the Colonial blood boiling.

These were some of the very first steps down the road to the American Revolution, and they began as the ink was still drying on a peace treaty.

Recommended Reading: Samuel Adams: A Life

Read Full Post »

Well, the calendar calls, even though no one will be paying attention to Today’s History Lesson…except maybe me.  The Super Bowl tends to drown out all other distractions.  My favorite commercial was probably that first Doritos commercial with the dog, followed by the VW/Star Wars commercial.  The game was fantastic to watch, and I hope you all enjoyed it as much as I did, even if your Patriots lost.  I was neutral tonight, and that makes the game way more entertaining.

Anyways, I won’t take a lot of your time.

On February 5, 1958, the U.S. Air Force got its B-47 Stratojet in its F-86 Sabre.  Or maybe the U.S. Air Force got its F-86 Sabre in its B-47 Stratojet.  And while the idea has worked incredibly well for Reese’s Peanut Butter Cups, it doesn’t have the same happy result when high-speed aircraft are the two ingredients.  And it’s particularly bad when the center of the result is not delicious – no, scrumptious – creamy peanut butter, but a thermonuclear bomb.

The B-47 had taken off from Florida on a simulated combat mission and in the wee morning hours, collided with the Sabre.  The fighter pilot was able to safely eject from his stricken plane, but the bomber guys had a bit of a problem.  Their aircraft was also badly damaged and barely flyable, and the plane needed to be lightened to keep her in the air.  But the plane’s lone occupant (besides the crew) was a Mk-15 thermonuclear device.

The Mk-15 was a tactical weapon, which meant it was fairly small as nuclear bombs went, weighing 7,500 pounds.  And like other instances we’ve discussed, just dropping a nuclear bomb doesn’t guarantee a nuclear detonation, because of all the safety devices that are in place.  And nearly all of these weapons were “two-stage”, with a small warhead that triggered the nuclear cataclysm.  So the bomb reaches its “trigger height”, the small warhead explodes, and (if all the safeties are turned off) the “big one” goes off.  As it turns out, this particular bomb didn’t have the “small exploder” in it (it was a training mission after all).  But still, hitting the surface (whether land or water) might be enough to break the bomb apart, causing radiation from the uranium core to leach into the surroundings.

Got all that?

All that stuff ran through the minds of the pilots way faster than I could type it, and after contacting their superiors, the decision was made to ditch the bomb.  So they dropped it off the coast of Georgia, presumably off Tybee Island, which sits just a handful of miles from Savannah.

There was no visible explosion, so that was good news.  The bad news?  When search crews tried to find the bomb, they couldn’t.  And now we’re what?…54 years later?  That bomb still hasn’t been found.

Anyway, I’m not an expert, but if I’m going to go on an off-shore fishing trip, it’ll be down in Florida, or maybe Alaska, or anywhere not named Tybee Island.

Recommended Reading:  SAC Chart of Nuclear Bombs – A nice comparison of the various nukes.

Read Full Post »

If you’re in mixed company and you say, “Boy, that Samuel Adams was really something!”, most people will ask about the variety to which you refer.  And that’s a remarkable shame.  I’m probably being old-fashioned and naive, but I think it’s a terrible indictment of our culture to mention the older of the Adams cousins (the younger being our first Vice President and second President) and have most people start listing off the various beers and lagers made by the company that bears his name.

Yes, the Samuel Adams beer company makes dozens of brews, some seasonal, some year-round.  People swear by it, love it, drink it on their cornflakes, and have it with their pumpkin pie.  There’s probably even a game on Sporcle where you get five minutes to list as many flavors of Samuel Adams beer as you can.

But what’s been lost in the beer goggles (and probably in many classrooms) is that Samuel Adams (the beer) wouldn’t even exist if there hadn’t first been Samuel Adams (the man).

If you don’t know the “who” better than the “brew”, I’ll make it super simple for you (whoa!!…a bit of unintentional poetry).

Samuel Adams was quite possibly the single most important driving force behind the initial push for independence in the Thirteen Colonies.  If Twitter had existed in the 1760s (I’m still trying to figure out why Twitter exists today, but one rant at a time), Adams would have been the guy everyone linked to in order to know what was going on.

And while his history has been largely forgotten, Samuel Adams was a giant in his time.  When John Adams went to France in 1779, he was recognized as “not the famous Adams.”  He wrote that his cousin had “the most thorough Understanding of Liberty, and her Resources, in the Temper and Character of the people…”.  Jefferson (the author of The Declaration of Independence) called him “the man of the Revolution…for depth of purpose, zeal, and sagacity, no man in Congress exceeded, if any equalled Sam Adams.”

The British also knew Samuel Adams, and steins of beer were not part of their discussions.  If you want to know their opinion of the man, it’s best explained by example.  In June of 1775, the governor of Massachusetts (Thomas Gage) offered amnesty to all the “rebels” causing trouble…all rebels, that is, but two.  John Hancock and Samuel Adams.  When the British Redcoats met the colonial militias at Lexington and Concord, what was their primary mission?  War?…no.  Territory?…no.  The arrest of Adams and Hancock?…yep.

In the shadow of Thomas Paine’s publication of Common Sense, Samuel Adams (a long-time newspaperman) returned to print.  On February 3, 1776, an article written by Adams under the pseudonym “Candidus” appeared in the Boston Gazette.  It contained what was quite possibly the first call for an actual, formal declaration of independence.  “By declaring independence,” he penned, “we put ourselves on a footing for an equal negotiation.”

And like Paine, he had words for the Quakers.  A devoutly religious man himself, “Candidus” appreciated Quaker piety.  But their tendency to favor the British monarchy under the guise of “pacifist neutrality” irritated him.  “If they profess themselves only pilgrims here, let them walk through the men of this world without interfering with their actions on either side.”

Some of Adams’ words would upset people today.  They upset people in the 1770s.  But when the members of the Continental Congress decided on independence a few months later, the words of “Candidus” were on their lips.

Not beer.

Recommended Reading: Samuel Adams: A Life

Read Full Post »

Most all of us know the tale of Robin Hood.  He’s a man (or a fox, if you know the cartoon version better) who lives in the forest with his friends, none of whom has any money.  Outside the forest are some very unlikeable, greedy, rich people who love to hoard their wealth and never share it with anyone.  Robin’s job is “wealth redistribution”, which is code for taking that money from the rich and giving it to the poor.

At this point, I could go all kinds of directions.  I could discuss how Robin and his men, with all their good intentions and good deeds, are little more than thieves.  Maybe I would talk about how we warp the minds of little children when we tell them stories that glorify criminal behavior as long as it’s done for a noble reason.  Many might expect a transition to politics, as some think the government, on a grand scale, plays the role of the story’s hero, taking money from wealthy people (while making them feel guilty) and giving it to others.

But I’m not doing any of those things.  I’m going to talk about The Dukes of Hazzard.

Any kid from my generation (I was a teen in the 1980s) that had a television watched at least one episode of The Dukes of Hazzard.  I think it was on Friday nights at 7:00pm (right before Dallas maybe?) and the first episode aired on January 26, 1979.  The intro featured scenes from the show and a song by Waylon Jennings that we can all sing in our sleep.  The last line in the song indicates that the good ole’ boys that didn’t mean any harm were “Fightin’ the system like a true modern-day Robin Hood.”

The good ole’ boys were Bo and Luke Duke and, along with Uncle Jesse and cousin Daisy Duke, they took on the law (just like the opening song said they did).  Of course, the show really didn’t follow Robin Hood at all.  As you know, Robin Hood was about the “hero” stealing from the rich while the authorities tried to catch him.  With the Dukes, Bo and Luke were good guys that were constantly chased by the corrupt authorities (led by Hazzard County strongman Boss Hogg) for crimes they didn’t commit.  Got that?

But the central character was a bright-orange Dodge Charger with a Confederate Flag on the top and “01” on the doors.  It was called “The General Lee” and that car could pretty much do anything, whether it be jumping a river, jumping a police car, stopping off at the always-empty Boar’s Nest for some countrified dialogue, ripping over Hazzard County’s gravel roads, running Aunt Bea to Mount Pilot, or…wait, one of those things is not like the others.  It whistled Dixie when you pushed the horn button, and was truly the most entertaining character…if you don’t count Flash (Sheriff Coltrane’s hound).

Actually, I kid a little.  As hour-long television shows in the 80s went, it was actually alright.  And for a teenager just preparing to drive, the show was a hoot, what with all the car chases and wild driving and jumps and stuff.  And while it didn’t really fit the “Robin Hood” mold very well, the General Lee rocked!!

Read Full Post »

Well, it’s been quite a while since I last put fingers to keyboard, but I’ve got a good excuse.  We took a vacation to Clearwater Beach, Florida.  I actually took the laptop with me, figuring I’d have time for a bit of work and maybe bit of typing.  Such was not the case.  The weather was absolutely perfect (bright sunshine, blue skies, beautiful beaches, and temperatures in the 70s), the condo was fabulous, and there were plenty of things to do.

I love to eat fish, and being on the Gulf meant there was plenty to be had…all of it was great.  But then we found The Gondolier, an East Coast chain that specializes in pizza.  Their food was outstanding…so good in fact that on our last evening, we simply went back there a second time.  Had we tried that place first, we may have eaten every meal there.  If we go back to Clearwater (and that’s a pretty serious possibility), we may do just that.

The long and short of it is that the laptop stayed mostly parked on the dresser.  But now we’re back to reality (and single-digit temperatures), so I’m hoping to get going this year.  Last year averaged fewer than eight pieces per month, so I’d like to improve on that.

“On January 20, 1791, a bill to charter the Bank of the United States for twenty years virtually breezed through the Senate.”

It’s a pretty simple statement taken from Chernow’s biography of Alexander Hamilton, and one that’s easy to just gloss over because we’re so used to banks in the 21st century.  We have banks of every shape and size on nearly every corner.  We can bank online, at the teller window, in the lobby, at an ATM machine, or on a smartphone.  Banks are as common as grocery stores.

In the 18th century, that was not the case.  And while there are people today that don’t trust banks and bankers, 18th-century opinion against the banking system was almost violent.  For Founders like James Madison and John Adams, their political differences found common ground in their opposition to banks.  Jefferson wrote, “I think our governments will remain virtuous for many centuries as long as they are chiefly agricultural…”  He would describe banks as “an infinity of successive felonious larcenies.”

For those against, banks were seedbeds of corruption and vice, turning honest men into money-hungry, money-grabbing monsters.  I think of a bank as a place to store our money safely and earn a bit of interest.  Men like our third President, through the lens of the 1780s, saw it as an oppressor of the poor and a creator of a class-based society…somewhat ironic considering Jefferson’s adherence to slavery despite his vocal abhorrence of the practice.

Some would say that Jefferson and Madison and Adams and those on their side were somewhat backwards in their stance.  Sure, America was largely agrarian now.  But was agriculture the only industry with a future in brand-new America?  Manufacturing and heavy industry, while not a major force at the time, would certainly increase in importance.  They required large amounts of capital to get started…the kind of capital only a bank could hold.  Furthermore, a national bank would help establish credit with other countries as well as manage and reduce the nation’s outstanding debt.

But for James Madison, it went beyond class and oppression and ended at the Constitution.  Alexander Hamilton had authored the idea of the bank using that most famous little piece of our founding charter…Article 1, Section 8.  We know it best as the “necessary and proper” clause.  It gave (and still gives) Congress the power to pass legislation “necessary and proper” to exercise its delegated duties.  Madison didn’t see a bank as “necessary”.  Nice?…maybe.  Convenient?…maybe.  Necessary?…absolutely not.

Madison had argued for the Constitution’s elasticity when writing pieces for The Federalist, but he believed a national bank pushed that elasticity beyond the breaking point.  Many agreed with him.  Hamilton had also argued for flexibility in the Constitution and believed the bank fit nicely under that clause.  And more Senators agreed with him than with Madison, so the bill passed the Senate.

Curious about the bank’s ultimate claim to fame?  How about the party system we enjoy (or loathe, depending on your bent) today?  Yep, it was along the banks of the “banking river” that political parties were born.

Read Full Post »

I spent an afternoon at the Grand Canyon in the summer of 1986 and it was pretty awesome.  Of course, that’s akin to saying that I spent an afternoon in the Smithsonian.  Or maybe it’s like saying that I read the first five pages of The Lord of the Rings.  Or I flew over the Himalayas.

Not that I’ve done all those things…I’ve only done two of them.  It’s just that a half day was only a fleeting glance at one of the most incredible natural wonders, and that can’t possibly have allowed me to absorb all that is the Grand Canyon.  Even the name “grand” comes off as woefully inadequate.  “Stupendous” might be better, or maybe “phenomenal”, or maybe “awe-inspiring”.  But mentioning the Awe-Inspiring Canyon still wouldn’t give it the justice it deserves.

Then again, maybe just calling it “grand” is purposely meant to be an understatement.  You know, the whole “under-promise and over-deliver” thing.  It’s named “grand” so when you get there, you’re blown away by the unbelievable, indescribable, awesome incredibleness of the place.

President Theodore Roosevelt, a naturalist at heart who ventured all over the world and saw hundreds of examples of nature’s magnificent beauty, visited the Grand Canyon and was quoted as saying, “The Grand Canyon fills me with awe. It is beyond comparison—beyond description; absolutely unparalleled throughout the wide world…”

That’s pretty much my sentiment, too.  It is beyond description.  There is no way to, in human language, tell someone what the place is like.  There are millions of photos you could look at (I posted a reasonably nice example above), but no photograph, no matter how big or how many megapixels, could possibly capture the spectacle.  You simply have to go visit and be thankful for the two eyes that God gave you, so you can take it in visually.

It’s been a quarter century for me, and that’s a long time.  We’re planning on visiting our son again sometime in the spring (he lives in a Phoenix suburb), and we’ve talked about driving down.  If we do, a stop at the Grand Canyon will not only be suggested, it’s probably required.  It’s just a remarkable place.

Oh, by the way, the Grand Canyon National Monument came into being on January 11, 1908.  I, for one, am grateful for that.  I think there are millions of people who, every year, discover they agree with me.

Read Full Post »

It contains more than 30,000,000 books.  It has more than 100,000,000 items from various collections.  Are you bilingual?  Good, this place has materials written in 460 different languages.  It houses invaluable music collections, including some of the first recorded sounds in existence.  It has one of the original Gutenberg Bibles.

Yep, the Library of Congress has just about anything you could want to read, watch, or listen to, and thousands of items swell the inventory every day.  In fact, a couple of weeks ago, it was announced that the Library of Congress had struck a deal with Twitter, allowing it to keep a digital record of Tweets.  Um…yay?  It spans four buildings, three of which are dedicated to our second, third, and fourth Presidents.

But in 1851, the Library didn’t have four buildings.  It had just one.  There were no Tweets.  And apparently, that building didn’t have a sprinkler system…or maybe it did, and it hadn’t been tested.  Regardless, on December 24, 1851, the Library of Congress caught fire.  Before the flames could be extinguished, more than 35,000 books had been destroyed.  By today’s standards, that’s a mighty small percentage of the total collection.  But 160 years ago, the Library contained just 55,000 books.

What makes the loss more painful to take is that much of Thomas Jefferson’s personal collection was among the charred remains.  If you recall, after the Library was burned for the first time (when the British sacked the capital during the War of 1812), Jefferson sold his books to the government to seed the new library.

Today, you can see what’s left of Jefferson’s collection somewhere on the Library of Congress’ 800+ miles of shelves (a few are shown above).  And I bet if you look up at the ceiling, you’ll see a bunch of sprinkler heads.

I may be back this evening, but if not, have a safe, wonderful Christmas Eve.

Read Full Post »

The other night we watched yet another of those “disasters of the Apocalypse” shows that seem to pop up with almost absurd frequency these days.  It’s usually the Discovery Channel, or the History Channel, or the Learning Channel, but they’re on all the time.  I suppose it has something to do with the ominous approach of 2012, the year the Mayan calendar ends and a bunch of people believe “the big one” is going to go up.

Didn’t the Mayans live a thousand years ago?  Their calendar probably ended in 2012 simply because they found more entertaining ways to occupy their time.  Hopefully the weight that lots of people give to this nonsense is mostly just a figment of my imagination, because if it’s not, then there are a lot of people that haven’t (unlike the Mayans) found more entertaining ways to occupy their time.

But I digress.  Anyways, this show was one we hadn’t seen before and was narrated by Samuel L. Jackson.  It was sort of a countdown of the various ways lots of people could get killed by disasters.  There was a big rainstorm over California at Number 5.  Number 4 I can’t remember, but I’m sure it was worse than a container of duck toys spilling into the Pacific.  Numbers 2 and 1 were completely predictable.  Two was a massive tsunami caused by a volcanic eruption and landslide at La Palma in the Canary Islands…this has been described on a dozen different “what-if” shows.  And of course, Numero Uno was the mega-volcano erupting in Yellowstone, which would lay waste to most of the American existence.  Again, we are not surprised, as this potential disaster is also well-known.

It was Number 3 that most caught my attention…an earthquake.  To be more specific, an earthquake in the Midwest.  Earthquakes in this area aren’t nearly as famous as those occurring around the Pacific Rim and the corresponding Ring of Fire, because they’re so rare.  But when the bigger ones hit, they pack a powerful wallop.

The most famous of the “Midwest” quakes on record was a series of temblors that culminated in a tremendous quake in February of 1812.  Centered over southeast Missouri, northeast Arkansas, and western Tennessee, the biggest ones were felt over 1,000,000 square miles and damage was recorded as far away as Maine.

But it all began 200 years ago today…December 16, 1811.

At 2:15 in the morning, people along the New Madrid Fault were thrown from their beds by a tremendous rumbling.  They scrambled out of their crumbling homes and got a night-time view of the apocalypse, as the landscape heaved and bucked like a drunken man, under the influence of a quake that would have registered close to 8.0 on the Richter Scale.  There were sand blows and landslides, soil liquefaction and, to hear the locals tell it, a brief reversal of the mighty Mississippi River.

Six hours later, another quake similar in scope struck the region again.  Too large to be an aftershock, it classifies as its own separate quake.  People all over the region were terrified, looking heaven-ward and awaiting the arrival of the Four Horsemen.  Damage was extensive, but deaths relatively light because the population was sparse.

For many years, scientists believed that major earthquakes struck along the New Madrid Fault every couple of hundred years.  And guess what?…we’re at exactly 200 years today.  But seismic activity along the fault has dwindled to a relative handful of very small shakers each year.  I read somewhere that geologists think the fault might be seizing up to a point where quakes no longer occur.

But for the millions of residents that live in St. Louis, Kansas City, Memphis, Chicago, and other large cities in the region, there is that small concern.  I live in central Iowa, several hundred miles from the fault, but I think about it all the time.  Homes in the Midwest are built with tornadoes (and in recent years, flooding) in mind, not earthquakes.

A repeat of the quakes that began two centuries ago would be cataclysmic.

Read Full Post »
