The Great Outdoors History

Raising a Forest by Hand

“The hills bear all manner of fantastic shapes,” Charles Bessey observed, noting that they sometimes featured open pockets of bare sand in blowouts and were “provokingly steep and high.” Bessey was describing the Sandhills, the area of post-glacial dunes wrought by mighty winds in north-central and northwestern Nebraska. Aided by his botany students from the University of Nebraska (today’s University of Nebraska–Lincoln), he cataloged a trove of plant species in 1892. Yet besides spurges and gooseberries, herbaceous plants such as smooth beardtongue, and grasses such as Eatonia obtusata, he found the potential for forestation.

“He was convinced that the moist soil of the Sandhills would support forest growth,” the historian Thomas R. Walsh wrote. Nebraska had gained statehood in 1867 but still had enough untouched areas to be “a virgin natural laboratory,” as Walsh described it. And there were so few trees for wood, shelter, or shade. Bessey had been pushing the state legislature to reserve Sandhills tracts for tree planting. In 1891, urged by the top forestry official in Washington, D.C., he started a test plot at the eastern edge of the Sandhills, which encompassed an area about the size of New Jersey. Ponderosa pines were a big component of the experiment’s 13,500 conifers. With the initial indication that they would do fine, he started a campaign to convince people that forestation was practical. After all, as Walsh noted, “the area was once covered by a pine forest that was destroyed by prairie fires.”

The pre-dawn fog rises above the Niobrara River near Valentine, Nebraska. (Pocket Macro/Shutterstock)

Bessey had come to the University of Nebraska in 1884, lured from Iowa Agricultural College (today, Iowa State University) by an offer of $2,500 per year. He was already the author of “Botany for High Schools and Colleges,” the nation’s first textbook on the subject. His motto of “Science with Practice” indicated a teaching philosophy that mixed laboratory and field work with classroom instruction. He was one of a small group of professors at the prairie university, attended by just 373 students in the year he arrived, but he had an outsized and enduring influence through his popular botany seminar. A top student in the 1892 cataloging project was Roscoe Pound, who claimed the university’s first Ph.D. in botany, then distinguished himself as a legal scholar and served two decades as dean of Harvard University’s law school.

Throughout the latter years of the Gilded Age, Bessey kept hammering away at the idea of national forests. To Gifford Pinchot, head of the national Division of Forestry, he wrote, “In the Sandhills, we have a region which has been shown to be adapted to the growth of coniferous forest trees, and here we can now secure large tracts which are not yet owned by private parties.” Pinchot had the ear of President Theodore Roosevelt, who in 1902 set aside 206,028 acres in two reserves in the Sandhills. “This was the first and only instance in which the federal government removed non-forested public domain from settlement to create a man-made forest reserve,” Walsh explained.

The two reserves are 75 miles apart. The northern Samuel R. McKelvie National Forest is on the Niobrara River near the city of Valentine. The southern one, first called Dismal River Forest Reserve, is now the Nebraska National Forest at Halsey and is managed by the Bessey Ranger District. (Nebraskans refer to it as “Halsey Forest.”) Within it are the Bessey Recreation Area and the crucially important Charles E. Bessey Tree Nursery, which yearly produces 1.5 million bare-root seedlings and up to 850,000 container seedlings for distribution in the Great Plains and Rocky Mountain states. Additionally, the nursery acts as the seed bank for Rocky Mountain Region 2, storing about 14,000 pounds of conifer seeds in case of wildfire or insect infestation.

Carson Vaughan, author of “Zoo Nebraska: The Dismantling of an American Dream,” grew up in Broken Bow, about 50 miles from Halsey Forest. It was only after he started writing articles about Bessey and the forest that he comprehended the magnitude of the original undertaking: creating the largest man-made forest in the United States. “Nothing like this has ever happened anywhere else on the planet,” he said. “And it all started because this pioneering botanist, Charles Bessey, had this wild idea and the patience, the dogged persistence, to stick with it over a couple decades and see it come to fruition.”

Vaughan remembered climbing Scott Lookout Tower, near Halsey, and feeling the impact upon viewing a forest amid treeless grasslands. “You get the rolling, billowing Sandhills right next to this very clear, dark, dense forest,” he said. The experience reinforced the concept that “it took human beings planting all of these trees to make this national forest grow out of this sandy, arid region.”

The sun rises over the Dismal River, which runs through the Nebraska Sandhills. (marekuliasz/Shutterstock)

After succeeding in the Sandhills, Bessey turned to other important challenges. In 1903, he was contacted about the effort to save the giant sequoias in certain groves of California’s Sierra Nevada. He tried to interest President Roosevelt in the cause, then introduced the matter into the proceedings of scientific societies, sending their resolutions to congressional representatives. Although he helped to set the conservation process in motion, Bessey would pass away in 1915 without seeing his efforts bear fruit. Sixteen years later, the state of California acquired the Mammoth Tree Grove, which became a principal element of today’s Calaveras Big Trees State Park.

On the other side of the country, Bessey became involved in the effort to create a national forest reserve in the southern Appalachians. “The cutting away and total destruction of the forests is a crime against the community as a whole,” he wrote. In 1908, a bill to authorize the reserves came before the House of Representatives, but soon died. It particularly galled Bessey that one of his former students, Representative Ernest M. Pollard, was on the agricultural committee, which had deferred action. “It does seem as though we had the most stupid and blinded lot of men in charge of our affairs that has ever cursed any country,” Bessey wrote to House Speaker Joseph G. Cannon. Bessey and others kept working, and ultimately, the Weeks Act of 1911 was passed, providing for acquisition and preservation of forested lands nationwide.

Today, visitors to the University of Nebraska–Lincoln can see an image of Bessey in bas-relief on a bronze tablet at—where else?—Bessey Hall. There’s also a Bessey Hall at Iowa State. And at Michigan State University, Ernst Bessey Hall is named for Charles’s son, who became a professor of botany and dean at MSU’s graduate school from 1930 to 1944. The apple didn’t fall far from the tree.


The First Selfie

As of this writing, about 700 billion photographs have been uploaded to the internet. Billions and billions more exist in physical form. Many of these photos fall into the category now known as the “selfie,” a type of photograph often assumed to be no older than Generation Y. In fact, the roots of the selfie go back almost 200 years.

The son of a Dutch immigrant, Robert Cornelius was born in Philadelphia in 1809. As a child, Cornelius was fascinated by chemistry. This interest was surely fanned by the boy’s father, a silversmith, who taught Robert the business of metal polishing and silver plating.

In 1839, the world was taken by storm when French artist Louis Daguerre invented the daguerreotype, a complex process—involving silver-plated copper, mercury vapor, and liquid chemical treatment—that could produce a photographic likeness. An account of Daguerre’s process was published in Philadelphia on October 15, 1839. The next day, Cornelius was approached by a local watchmaker and inventor named Joseph Saxton, who was, at that time, an employee of the Philadelphia Mint. Saxton wanted Cornelius to help him produce a daguerreotype image. Cornelius agreed.

Cornelius created the silver plating for Saxton’s photographic image; and that image, as far as we know, was the first photograph ever taken in the United States. In dark hues of gold and brown, the image was taken from Saxton’s own Philadelphia Mint office window, and it portrays part of the State Arsenal and a section of a neighboring high school. A late-19th century description of the “camera” reveals Saxton’s quick ingenuity: A seidlitz powder (a laxative) box with a few flakes of iodine answered for a coating box, while a cigar box and burning glass were improvised for a camera. One Philadelphia photographer later wrote that the Saxton daguerreotype “created no small excitement among the curious in such matters; and from this date, many of our Philadelphia savants began cultivating the art.”

The experience ignited in Cornelius an abiding interest in photography, and he was determined to improve upon the makeshift daguerreotype he’d helped Saxton throw together. In this effort, he enlisted a physician named Paul Beck Goddard, and later that same month of October, they produced a daguerreotype image of Cornelius himself—the first photographic portrait (picture of a human being) ever taken in the United States. It was probably the first ever in history, although one earlier daguerreotype image, taken a year before in Paris, happened to include a couple of people in the background. But that image wasn’t meant as a portrait.

From the metallic, spotted image, Cornelius, with his head slightly tilted to the left, stares back at us with determined eyes set beneath a prominent forehead partly covered by his thick, disheveled hair. He wears a dark coat with a cravat. His right arm is held upright across his chest, and his right hand is tucked beneath the left side of his coat. The limitations of the technology dictated that for this first-ever selfie, Cornelius had to sit still for up to 15 minutes.

The first photographic portrait made in the United States (and probably the world), by Paul Beck Goddard and Robert Cornelius. The subject is Cornelius himself, allowing him to claim the achievement of the first-ever “selfie.”

Cornelius went on to establish several photo studios, manufacturing his own cameras, plates, and mats to produce portraits for the prominent people (among others) of his time. Many of those photographs survive to this day.

One of his innovations was in harnessing additional light via the use of reflectors. Writing several decades later, one observer of Cornelius described his process: For coating the plates, he used dry iodine exclusively; and by several large reflectors, set at different angles, both within doors and without, he was enabled, in strong sunshine, to concentrate upon his sitter light enough to obtain through a side-window facing south, an impression within from one to five minutes.

Cornelius was later able to improve his process to the point where he could produce “fair impressions, even without reflectors, in from 10 to 60 seconds—and this too within doors.” But success invited imitation, and, as one mid-19th century historian informs us: “Together with the improvements made by [Cornelius] and others in the heliographic apparatus [light reflectors] and manipulative methods, and the great advance consequent thereon in the mode of obtaining portraits from life, quite a number of persons directed their attention to the art from the hope of making it a source of profit.”

As the demand for and interest in photography spread, and as more studios opened, Cornelius opted to move on to other things: specifically, the invention of a solar lamp that proved highly popular across the United States and Europe—but that’s another story. Incidentally, few people knew or understood at the time that Cornelius had taken the first photographic portrait in American history. He wasn’t one to trumpet his own accomplishments.

Luckily for us, however, Cornelius mentored others at his studio. One of them was a young man named Marcus Root. A quarter-century after that first selfie was taken, in 1864, Root published “The Camera and the Pencil, or the Heliographic Art,” which included, along with the theory and practice of photography, a history of the field. That book explicitly, and rightly, credited Cornelius with the first photographic portrait.

Twelve years later, “The Camera and the Pencil” was exhibited at the Centennial Exhibition, where the book was noticed by a photographer named Julius Sachse. Sachse went on to interview Cornelius, and later became editor of the “American Journal of Photography.” In this way, Cornelius’ legacy was secure—and just in the nick of time. He died the following year, in 1877.

So, the next time you take a selfie, take a moment to remember Cornelius—and be glad you don’t have to sit still for 15 minutes.

Founding Fathers History

Washington’s Presidency, the Glorious and the Mundane

George Washington, universally acclaimed nowadays as one of our best presidents, encountered a little bad press in his own day. Even before his inauguration, he knew that facing impossibly high expectations would be a challenge during his time as president.

“My movements to the chair of Government will be accompanied with feelings not unlike those of a culprit who is going to the place of his execution.” These were the unenthusiastic words of George Washington, written to fellow Revolutionary War veteran Henry Knox on April 1, 1789, not long before his nearly inevitable election as president.

For eight years (between 1775 and 1783) and without pay, Washington had led the Continental Army against the British. The aristocratic Virginian might have gone on to leverage his impressive victory to become a “conquering general” and establish a personal dictatorship—an end conceivably within his grasp and even suggested by some in his circle.

Instead, George Washington very emphatically retired. Lest anyone should miss the point, Washington even delivered a public resignation address. His days of service were over, and beloved Mount Vernon was calling.

But now, he was being summoned into service once more. Two weeks after Washington had compared his feelings to those of a culprit on his way to execution, a dispatch arrived at Mount Vernon notifying the retired general of his presidential election. Two days after that, 57-year-old George Washington left Mount Vernon, penning the following in his diary:

About ten o’clock I bade adieu to Mount Vernon, to private life, and to domestic felicity; and with a mind oppressed with more anxious and painful sensations than I have words to express, set out for New York … with the best dispositions to render service to my country in obedience to its call, but with less hope of answering its expectations.

Perhaps no one in America was more familiar with the challenges of directing the new union than George Washington, who had played such a central role in its inception and evolution. As such, he was clearly under no illusion as to the challenges that awaited him. His acquiescence (for so it was) to the presidency was informed less by political ambition and more by solemn duty. There was no relishing of the prospect, no celebration on his part, no reveling in his political achievement. Being the sort of president people wanted—by unanimous vote of the Electoral College, no less!—seemed at the very least a daunting task, and probably an impossible one. He seems to have known this.

Bad Roads and White Robes

New York was to serve as the first temporary capital of the new United States of America, but great distance and bad roads meant that it was quite a journey to get there from Virginia. And if Washington was really weighed down by “expectations” at the moment of his departure, he was certainly more so as the journey progressed. Everywhere he went, crowds cheered his arrival, casting roses and wreaths along his path, or erecting triumphal arches for him to pass through. At Trenton, 13 girls—representing the 13 states—in white robes hailed him as “mighty Chief” in song, while Washington was made to ride beneath a 13-columned arch.

Finally reaching Elizabethtown, New Jersey, across the Hudson from New York City itself, Washington was greeted by an ostentatious barge manned by 13 white-uniformed captains. Upon this gaudy vessel, the president-elect was ferried across the river to where Wall Street met the water. New York Governor George Clinton awaited him there—atop a set of specially prepared steps with their sides draped in lavish cloth.

Engraving depicting George Washington en route to Federal Hall for the first Presidential Inauguration, April 30, 1789. (Archive Photos/Getty Images)

George Washington was sworn in on April 30, his oath of office administered on the balcony of Federal Hall, in front of a massive crowd gathered along Broad and Wall Streets and on balconies and housetops in every direction. All was hushed during the swearing in, after which the officiator exclaimed, “Long live George Washington, President of the United States!”

Thunderous applause echoed throughout the city as a 13-gun salute rang out from the harbor. As the ovation continued, an American flag was hoisted above George Washington himself.

Expectations, indeed.


Of course, the hoped-for utopia to be ushered in by America’s greatest Founding Father never materialized. Even Washington himself had hoped that the new federation would, at the very least, avoid political factions. Instead, the real world offered its usual share of complication and contention—including a highly combative two-party system. By the time Washington left office, his once-invulnerable image had taken a hit among some of his contemporaries. Detractors picked at flaws, real or imagined. American newspapers attacked his perceived disloyalty to republicanism and his personal integrity. They attacked the lavish receptions (or “levees”) he hosted with his wife, his “aristocratic” airs, his alleged “monarchical” pretensions, his cold and aloof manner. Critics accused him of being unintelligent and susceptible to bad advice from his cabinet, of treacherously betraying France by proclaiming neutrality—and of betraying the American Revolution by not eagerly supporting the French one.

A whole series of letters (called the “Belisarius” letters, after their author’s pen name), addressed personally to Washington and published in opposition newspapers, lambasted the president on a wide range of counts: for cultivating “a distinction between the people and their Executive servants”; failing to stand up to (post-war) Britain; overseeing a costly war with the American Indians; maintaining a standing army in peacetime; and supporting internal taxation (then “denouncing” the people most affected by it), among other allegations.

Tempering Expectations

Women laying flowers at George Washington’s feet as he rides over a bridge at Trenton, New Jersey, on the way to his inauguration as first president of the United States on April 30, 1789. (MPI/Getty Images)

It may be that the aspersions cast in his direction were a primary reason George Washington decided to retire after just two terms. Indeed, an earlier draft of his Farewell Address actually included these words:

As some of the Gazettes of the United States have teemed with all the Invective that disappointment, ignorance of facts, and malicious falsehoods could invent, to misrepresent my politics and affections; to wound my reputation and feelings; and to weaken, if not entirely destroy the confidence you had been pleased to repose in me; it might be expected at the parting scene of my public life that I should take some notice of such virulent abuse. But, as heretofore, I shall pass them over in utter silence.

The impossible expectations placed upon the first president demonstrate, perhaps, the futility of investing in one individual the utopian hopes and dreams of an entire people. One of the original American lessons, at least as they pertain to the state, is that political saviors don’t exist; not even the vaunted George Washington could be one! He’d felt the weight of such expectations right from the beginning. That weight probably helped drive him out of the spotlight in the end.

When election cycles come around, perhaps our expectations should be tempered by Washington’s experience.

And when politicians talk like saviors, remember George Washington, too.

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at


The Real Johnny Appleseed

Walt Disney’s 1948 animated short, “The Legend of Johnny Appleseed,” famously depicts its main character as a Pennsylvania farmer yearning to join the pioneers heading west to the frontier. All those settlers would need something to eat—and so Disney’s Johnny is determined to plant apple trees across the land in order to feed them.

The real Johnny Appleseed—a “small, wiry man, full of restless activity” named John Chapman, born in Massachusetts in 1774—was indeed a trained orchardist, and he really did show up in Ohio Territory with “a horse-load of apple seeds” to plant. It’s no myth, either, that John Chapman introduced apple trees to large swathes of frontier territory, from modern Pennsylvania and Ohio to Indiana, Illinois, and even Canadian Ontario.

Orchards for Cider

But Chapman’s apples weren’t meant for eating at all (they would have been far too sour, anyway). No—the apples of the real Johnny Appleseed were meant for making hard cider. As a not-so-subtle Smithsonian Magazine headline framed it in 2014, “The Real Johnny Appleseed Brought Apples—and Booze—to the American Frontier.” Journalist Michael Pollan agreed, explaining in a 2015 interview, “Really, what Johnny Appleseed was doing and the reason he was welcome in every cabin in Ohio and Indiana was he was bringing the gift of alcohol to the frontier. He was our American Dionysus.” Perhaps as early as the 1810s, however, Chapman was already being referred to as “Johnny Appleseed,” recognized as such “in every log-cabin from the Ohio River to the Northern lakes, and westward to the prairies of […] Indiana,” according to an 1871 Harper’s New Monthly Magazine piece. Unfortunately, along with apple trees Chapman also planted countless acres of dogfennel, which he believed to have medicinal qualities; dogfennel is now seen as a pernicious weed.

And the real Appleseed’s motivations were also far more complicated than those of the happy planter portrayed in the Disney musical. Far from acting the altruistic scatterer of apple seeds tossed hither and thither wherever he happened to roam, Chapman was probably a shrewd entrepreneur. The nurseries Chapman planted weren’t open to all, but rather fenced off, each tree meant for selling to settlers for six and a quarter cents. In the late 18th and early 19th centuries, some land speculation companies required prospective colonists to plant fifty apple trees on their land; since it took the better part of a decade for these trees to bear fruit, such planting would serve to prove the colonists’ commitment to develop the land over many years. The real Appleseed saw in this a business opportunity. He would thus plant a nursery, enter into a business partnership with someone in the area to manage the sale of the trees to arriving settlers eager to quickly fulfill the companies’ requirements, and then move on to repeat the process elsewhere. Two or three years later, he might return to tend to the nursery.

Traversing the Frontier and In Touch With Indians

This was a business model that demanded he labor “on the farthest verge of white settlements,” and it wasn’t an easy line of work. Chapman essentially lived life as a nomad—and a barefoot one at that. One newspaper described him as “barefooted and almost naked,” and for a time he apparently wore a coffee sack, having cut out holes for his head and appendages. (There was, too, his signature hat: a tin vessel, with a visor-like peak in the front, that doubled as a pot for cooking.) The work itself could be hazardous. Once, while working in a tree, he fell and got his neck stuck between forking branches. If one of his helpers that day, a mere lad of eight years, hadn’t discovered him and run for help, the real Johnny Appleseed might have died in 1819. And as far away as they might have seemed from America’s economic centers, Chapman’s enterprises weren’t immune from the vicissitudes of the larger economy. When recession racked the United States in 1837, the price of his trees plummeted to just two cents apiece.

The frontier along which he worked, skirting American Indian country, could also be dangerous. But the natives, whom Chapman always admired and whose wilderness trails he often traversed, left the real Appleseed alone, considering him something of a medicine man; how else could one explain the privation and exposure which he so easily endured? Even during the War of 1812, when many natives along the frontier allied with Britain to devastate white frontier communities, Chapman never ceased his wanderings, and he was never harmed. One frontier settler, reporting his experience during this period, remembered with a “thrill” the peripatetic Chapman’s timely warning to his community: “The Spirit of the Lord is upon me, and he hath anointed me to blow the trumpet in the wilderness, and sound an alarm in the forest; for behold, the tribes of the heathen are round about your doors, and a devouring flame followed after them.” The real Appleseed’s warnings may have saved hundreds of lives.

Disney’s cartoon Johnny makes friends with a skunk and is beloved by all animals. In truth, the land he traversed teemed with potentially menacing wild animals, including wolves, rattlesnakes, wild hogs, and bears. One account of Chapman, however, does claim he had partially tamed a pet wolf, which followed him around everywhere he went, and his religious fervor (he followed Swedenborgianism) did eventually cultivate in him an almost Jain-like respect for all living creatures. By the time of his death, he had become a full-fledged vegetarian.

An Orator and a Gift-giver

Illustration of Johnny Appleseed delivering a speech, circa 1820. (Fotosearch/Getty Images).

As an itinerant orchardist-nurseryman, the real Appleseed, with his “long dark hair, a scanty beard that was never shaved, and keen black eyes,” was well-known up and down the frontier. This meant Chapman was often “passing through” the towns and settlements and American Indian villages of the region (in the words of one 19th-century newspaper, he “sauntered through town eating his dry rusk and cold meat”). Frequently, he would stop to entertain groups of children—apparently he was a master storyteller—or preach “on the mysteries of his religious faith” to any adult who might listen. To little girls he gifted bits of ribbon and calico; “Many a grandmother in Ohio and Indiana,” reported an article published a few decades after his death, “can remember the presents she received when a child from poor homeless Johnny Appleseed.” Once, after being gifted shoes for his bare feet by a particularly assertive settler, he was discovered a few days later again walking barefoot in the cold; the settler confronted him “with some degree of anger,” only to find out that Chapman had almost immediately re-gifted the shoes to a poor family, some of them barefoot, traveling west.

Days after strolling through the streets of Fort Wayne, Indiana, the real Appleseed died suddenly. The year was 1845, and John Chapman was around 70 years old. He was hailed in one obituary for “his eccentricity,” his “strange garb,” his material self-denial (apparently his faith had worn away at the material entrepreneurship of his youth), his “cheerfulness,” and his “shrewdness and penetration.” Johnny Appleseed was buried at Fort Wayne.

Within a quarter-century, the life of Johnny Appleseed was featured in the aforementioned Harper’s New Monthly Magazine piece, subtitled “A Pioneer Hero.” Even then, it was admitted that, as the frontier disappeared, “the pioneer character is rapidly becoming mythical.” The story of the nomadic nurseryman “whose whole [life] was devoted to the work of planting apple seeds in remote places” had begun to take on a life of its own—a myth which, by the mid-20th century, had become Disney’s musical legend.


Founding Fathers History

Roger Sherman, Low-Key Founding Father

Among the Founding Fathers, Roger Sherman is one of the best-kept secrets. But he shouldn’t be, especially in light of the cumulative and lasting effect he has had on this nation, including the present-day debates on the meaning and legal effect of the Ninth Amendment.

Most notable is the fact that he is the only Founding Father to have signed all of these prominent founding documents: the Declaration and Resolves (1774), which contain many of the rights that are enumerated in the First Amendment; the Articles of Association (1774), which established a trade boycott of Great Britain; the Declaration of Independence (1776); the Articles of Confederation (1777); and the U.S. Constitution (1787).

Sherman’s influence on the Constitution was greater than most realize. Historian Richard Werther wrote in 2017 in the Journal of the American Revolution that, at the Constitutional Convention debates, “of 39 issues cited, Sherman prevailed on 19, Madison on 10, and 7 resulted in compromises (the other 3 were interpretational issues for which no clear-cut winner is determinable).” Werther adds, “While no one is arguing that Sherman, not Madison, assumes the mantle as ‘Father of the Constitution,’ clearly Sherman had a bigger role than may have been previously understood.”

As a boy in Connecticut, Roger Sherman was self-educated in his father’s library and later by a newly built grammar school. He managed two general stores. Although he had no formal education in law, he passed the bar exam and was admitted to the bar in 1754. He wrote and published an almanac each year from 1750 to 1761. He served as a mayor, a justice of the peace, a county judge, a Connecticut Superior Court judge, and as a delegate to both the First Continental Congress and the Second Continental Congress. After ratification of the Constitution, he served in the U.S. House of Representatives from 1789 to 1791 and in the U.S. Senate from 1791 until his death in 1793.

Sherman’s reputation was stellar. He was described as honest, cunning, a staunch opponent of slavery, a devout Christian who was outspoken about his faith, and a protector of states’ rights. William Pierce, a delegate to the Constitutional Convention who took extensive notes, said of Sherman, “He deserves infinite praise, no man has a better heart nor a clearer head. If he cannot embellish he can furnish thoughts that are wise and useful. He is an able politician, and extremely artful in accomplishing any particular object; it is remarked that he seldom fails.”

Role in the Bill of Rights and the Ninth Amendment

Originally, Sherman opposed adding a bill of rights to the Constitution, deeming it both “unnecessary” and “dangerous.” He, like other Federalists, argued that it was unnecessary because the powers enumerated in the Constitution granted only limited authority; if certain powers were not enumerated and delegated, then the federal government would have no authority to infringe upon the rights in question. Moreover, the states had their own constitutions protecting their citizens’ rights, and the Constitution concerned itself only with federal guarantees, not state guarantees. The Federalists considered a bill of rights dangerous because listing certain rights could be construed to mean that other rights, not singled out, were surrendered to the government; in other words, if they were not written down, those rights would not be considered protected.

The original Constitution was signed by 39 delegates on September 17, 1787. It was during the First Congress on June 8, 1789, that James Madison proposed to “incorporate such amendments in the Constitution as will secure those rights, which they consider as not sufficiently guarded […] to satisfy the public that we do not disregard their wishes.” After Madison persuaded Congress to create a Bill of Rights, the proposals were referred to a House select committee, the Committee of Eleven, which took up the debates. In 1987, the National Archives discovered among Madison’s papers the only known copy of the deliberations of that House Committee, and they are in Sherman’s handwriting, most likely reflecting the thoughts of the committee as opposed to his personal views.

This discovery has created a vigorous debate among legal scholars as to the meaning and legal effect of the Ninth Amendment, the text of which reads, “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people”: namely, to what do the rights “retained by the people” refer, and what legal effect do they have? To give context, it is essential to go back to Madison’s original draft regarding retained rights:

The exceptions here or elsewhere in the Constitution, made in favor of particular rights, shall not be so construed as to diminish the just importance of other rights retained by the people, or as to enlarge the powers delegated by the Constitution; but either as actual limitations of such powers, or as inserted merely for greater caution.

After the House committee’s debates and revisions, Sherman’s notes read:

The people have certain natural rights which are retained by them when they enter into society, such as the rights of conscience in matters of religion; of acquiring property; and of pursuing happiness and safety; of speaking, writing and publishing their sentiments with decency and freedom; of peaceably assembling to consult their common good, and of applying to government by petition or remonstrance for redress of grievances. Of these rights therefore they shall not be deprived by the government of the United States.

According to the Bill of Rights Institute, once the Bill of Rights was drafted, Sherman supported it, just as the people of Connecticut supported it.

Deborah Hommer is a history and philosophy enthusiast who gravitates toward natural law and natural rights. She founded the nonprofit ConstitutionalReflections (website under construction) with the purpose of educating others in the rich history of Western civilization.


The Righteous Revolutionary Thanksgiving ‘Oration’

In mid-1772, a British customs schooner, the HMS Gaspee, attempted to catch an American packet ship off the coast of Rhode Island. The Gaspee was commanded by Lieutenant William Dudingston, hated among Rhode Islanders for his strict enforcement of the Navigation Acts. In Dudingston’s case, rather than “enforcement,” the locals might have used the word “harassment.”

During the chase, the Gaspee ran aground. Stuck on a sandbar, and hence vulnerable, the Gaspee became the target of a group of Providence men, many of them Sons of Liberty. Assembled via the town crier, the patriots rushed toward the ship. The crew attempted to resist, but it was no use. Dudingston himself was shot and wounded, and his ship was burned down to the waterline.

British authorities tried to get the colonial perpetrators of the “Gaspee Affair” extradited to England to stand trial for treason, but the government couldn’t figure out who they were. Even a large reward failed to produce the names. But the Affair had only just begun to run its course.

A Baptist minister named John Allen, a recent arrival from Britain, while preaching in Boston at the end of that year, invoked this incident in his December 3 sermon, entitled “An Oration on the Beauties of Liberty.” Subsequently printed as a pamphlet, this sermon became a best-seller throughout the colonies. The question Allen posed was: Do the Rhode Islanders who destroyed the Gaspee receive their Laws from England?

“O! Amazing!” Allen reflected. “I would be glad to know what right the King of England has to America. It cannot be an hereditary right…; it cannot be a parliamentary right that lies in Britain, not a victorious right, for the King of England never conquered America. Then he can have no more right to America than what the people have, by compact, invested him with, which is only a power to protect them and defend their rights, civil and religious; and to sign, seal, and confirm as their steward such laws as the people of America shall consent to.”

And if this be the case, Allen thundered, “then judge whether the King of England and his ministry are not the transgressors in this affair in sending armed Schooners to America to steal by power and sword the people’s property.”

The message was clear: The British king hadn’t inherited America as some personal property; he’d never conquered it, and the power of Britain’s Parliament lay in Britain, not in the colonies. Rhode Island’s people were free, their rights were enshrined in their charter, and their laws originated in their own assembly. Who, then, was the true aggressor?

Five ‘Observations’

Allen’s “Oration” was built around five observations.

First: that “a craving, absolute Prince, is a great distress to a people.”

Second: that when the three branches of government, “king, judges, and senates unite to destroy the rights of the people by a despotic power… the destruction of the people’s rights is near at hand.”

Third: that “an arbitrary despotic power in a prince, is the ruin of a nation, of the King, of the crown, and of the subjects,” and that neither the King of England nor the Parliament of England can “justly make any laws to oppress or defend the Americans” because “they are not the representatives of America.”

Fourth, Allen channeled his inner John Locke:

“THAT it is not rebellion, I declare it before GOD, the congregation, and all the world, and I would be glad if it reached the ears of every Briton, and every American; That it is no rebellion to oppose any king, ministry, or governor, that destroys by any violence or authority whatever, the rights of the people. Shall a man be deem’d a rebel that supports his own rights? It is the first law of nature, and he must be a rebel to GOD, to the laws of nature, and his own conscience, who will not do it.”

And fifth: “That when the rights and liberties of the people are destroyed, it is commonly by the mischievous design of some great man,” who Allen wisely did not specifically mention by name.

These were radical sentiments—and tens of thousands of Americans read them enthusiastically. According to American Founder John Adams, by mid-1773, patriots like James Otis Jr. were regularly reading Allen’s “Oration” to “large Circles of the common People.”

In his “Oration,” Allen insisted on something Americans must remember today:

“A right to the blessing of freedom, we do not receive from Kings, but from Heaven, as the breath of life and essence of our existence, and shall we not preserve it, as the beauty of our being? Do not the birds of the air expand their wings? the fish of the sea their fins? and the worm of the earth turn again when it is trod upon? And shall it be deem’d rebellion? Heaven forbid it! … It is no more rebellion, than it is to breathe.”

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over.


The Capitol’s Statue of Freedom

As I step outside the House chamber on the second floor of the Capitol, I guide my visitors halfway down the stairs outside, offering them a sweeping view of the Supreme Court building and the Library of Congress. That’s when I call their attention to something else altogether—the crowning achievement, literally, of the Capitol: the Statue of Freedom, perched atop the dome, solitary, magisterial.

It is perhaps the most recognizable feature of the Capitol, an iconic world image of liberty and government by the people. Peering into the distance nearly 300 feet above the East Front Plaza, the bronze statue is of epic dimensions, soaring almost 20 feet high and weighing about 15,000 pounds. Freedom is decked out in an elaborate headdress topped by an eagle head and feathers. Her flowing dress is cinched with a large brooch emblazoned with two letters: U.S. In her right hand, she clasps a sheathed sword, while the other clutches a laurel wreath of victory and a shield.

The Statue of Freedom perched atop the Capitol is something to behold and serves as a symbol of my stewardship as a member of Congress, which is why I selected that image of the Capitol dome to adorn my letterhead. This is what I want my constituents to see, to be reminded of, when I write to them.

The Statue of Freedom also symbolizes the personal quest for freedom of one man, Philip Reid, born into slavery in Charleston, South Carolina, in 1820. In one of the great ironies of American history, Reid, as a slave, was assigned the complex project of creating and placing one of the world’s most powerful symbols of freedom on the most visible building in our nation.

The Statue of Freedom atop the U.S. Capitol dome in Washington, D.C., on July 1, 2010. (Paul J. Richards/AFP via Getty Images)

I’ll admit, I didn’t know the story of Reid until after I became a member of Congress and got the lowdown on the history of the Capitol from those in the know. But once I heard about Reid’s remarkable story, I delved deeper, reading more about it online. I mention all this about Reid during my tours, and though not one of my visitors has ever known the story beforehand, they are surely glad to hear it. Slavery is a terrible stain on our history, but my guests are palpably proud of how far America has come since then.

The statue was commissioned in 1855. Thomas Crawford, an American sculptor, created the plaster model of the statue in Rome, Italy. After his death in 1857, his widow shipped the statue in six crates, and the model was assembled and placed in what is now Statuary Hall. The following year, Clark Mills, a self-taught sculptor, was given the task of casting Freedom. Mills started his business in South Carolina, where he purchased Reid for $1,200. Reid dismantled the model in the Capitol, cast the individual sections, and finally assembled and mounted the bronze sections atop the dome.

On April 16, 1862, as Reid supervised the creation of the statue’s massive bronze sections, Congress passed the District of Columbia Emancipation Act, freeing thousands of slaves living within the district. That included Reid. As a free man, he kept working for Clark Mills. At noon, on December 2, 1863, under Reid’s supervision, the top section of the Statue of Freedom was raised and bolted on top of the Capitol dome.

Many of the experts with whom I have toured the Capitol offered various explanations for the direction the Statue of Freedom faces. Some say Freedom faces east because every morning she watches the sun rise on America with a new day of liberty for all. Others say she faces east because the primary entrance to the Capitol is on the east side, or because most residents of Washington, D.C., at the time lived on the east side. Yet others suggest she faces east because European settlers came from that direction.

The Statue of Freedom on top of the U.S. Capitol dome, silhouetted against the super moon on Jan. 20, 2019. (Brendan Smialowski/AFP via Getty Images)

As the foreman in the casting of the Statue of Freedom, Philip Reid stepped in when an Italian sculptor hired to assemble the five sections refused unless granted a pay raise. It was Reid who figured out how the pieces were separated and put together. He was paid $1.25 a day, though his owner received those payments, except on Sundays, when it was his own. Mills, the man who bought Reid, described him as “short in stature, in good health, not prepossessing in appearance, but smart in mind.”

Reid was a free man by the time the last piece of the Statue of Freedom was assembled in December 1863. He went on to become a respected businessman, identified in census records as a “plasterer.” A plaque honoring Reid resides not at the Capitol but at the National Harmony Memorial Park in Landover, Maryland, where his remains lie; his place in history—and on my tour—remains resolute.

Excerpted from the 2020 book, “Capitol of Freedom: Restoring American Greatness,” by Colorado Rep. Ken Buck.

Arts & Letters History

A Pedestal Waiting for a Monument

The crypt of the U.S. Capitol isn’t the dark, dank dwelling conjured up by its evocative moniker. On the contrary, the crypt is a well-lit circular chamber on the ground floor, under the rotunda, traversed by countless people every day, hurrying on their way—blinders on—to a hearing or meeting of reputed import. George Washington was supposed to be interred here—hence the name of the burial place—but his body never made it. Construction of the crypt was interrupted by the War of 1812. His family decided to honor his wish to be buried at his Mt. Vernon, Virginia, home, just a few miles away from the Capitol.

Magna Carta

Tucked away in the crypt—hidden in plain sight—is a replica of the Magna Carta, the 800-year-old document reining in the monarch. On tours, I make a point of directing my visitors’ attention to this transformational declaration; otherwise, they might miss it, given all the magnificent distractions surrounding it—forty neoclassical columns, and thirteen statues of prominent Americans of the original thirteen colonies.

In all the times I’ve entered the crypt—and it’s been plenty—I’ve never seen people clustered around the gold and glass case containing this most essential document, the greatest relic in the room.

The history of the Magna Carta predates our nation’s founding by more than five hundred fifty years, which might explain how it sometimes escapes people’s attention today. King John of England affixed his seal to the Magna Carta on June 15, 1215, after a severe clash with his barons, who had become frustrated with the monarch’s arbitrary rule and abuses of power. The noblemen set out to craft a document to rein in the king’s powers. The document they formulated prohibited arbitrary arrest and imprisonment, and established individuals’ right to a fair trial and the protection of private property. Those rights are foundational to the rule of law, and essential for limiting the powers of government.

The Magna Carta—Latin for “the Great Charter”—provided the key principles of the supremacy of the rule of law that formed the foundation of our Constitution. In this respect, it is symbolic that the Magna Carta replica lies in the crypt—the literal foundation—of the Capitol, erected to support the rotunda above it. The document’s most important principle—that no man is above the law, not even the king—is the foundation for American rule of law, and the base upon which we have built our system of government.

If those basic rights recognized in the Magna Carta sound familiar, it’s for good reason. America’s founders drew heavily from the ideas in the Magna Carta to write the American Declaration of Independence and the Constitution.

The Compass Star

Only a few feet away from the Magna Carta is a worn white marble stone compass star embedded in the center of the floor of the crypt. While it may seem, at first glance, the two features of the Capitol are unrelated, they each reinforce the primacy of the rule of law and the importance of the legislative body.

That compass star is the point in Washington, D.C. where all four quadrants of the district—northeast, southeast, northwest, and southwest—converge. If you place your foot on the compass, as I have from time to time to demonstrate for my visitors, you are standing in all four quadrants of the city simultaneously. When I take tourists to this spot, the following ritual tends to take place: They stand on the star, which has been worn down below floor level by the passage of time. Then they hop off the star, pull out their smartphones, and take photos of what is, admittedly, a cool symbol. But it holds even greater significance. The compass star is the key to understanding the vital role the legislature plays in our republic.


We must first revisit Pierre-Charles L’Enfant. After he wrote to President George Washington, offering to create a capital “magnificent enough to grace a great nation,” he got the gig in 1791. Influenced by the France of his youth, L’Enfant borrowed ideas from the grand sweep of the Versailles palace, conjuring up what are now distinct D.C. features, such as its broad avenues, designed on a slashing angle. The cheerful L’Enfant sought another epic brush stroke, designing a considerable park in front of the White House, for the benefit of the president, whoever happened to be in residence. But Thomas Jefferson put the kibosh on those plans out of a worry such an exclusive domain didn’t mesh with the nascent nation of the people. Hence, the space became a public gathering spot you might have heard of—Lafayette Park.

L’Enfant, though, got his way on a more vital part of his plan, to make the Capitol the central point of the new capital district. The Capitol was created to be the central focus of the new government, a building perched on a slight hill, elevated above the rest of the city. That hill was known in our nation’s earlier years as “Jenkins Hill,” because a man named Thomas Jenkins apparently once grazed livestock at the site. L’Enfant saw it in a more enchanted way, as “a pedestal waiting for a monument.” That pedestal is known today as Capitol Hill.

The location of the Capitol building speaks volumes about the role our founders intended the legislative branch to play—and the paramount role of the rule of law. Because the Capitol is located on a hill, on one of the highest points in Washington, D.C., it reminds all of us that the legislative branch—the part of the federal government most accountable to the people—is the most important branch of government.

Excerpted from the 2020 book “Capitol of Freedom: Restoring American Greatness” by Colorado Rep. Ken Buck.

Features History

In Their Words: Veterans Who Served in War Tell Their Stories


World War II Veteran

Editor’s note: Stanley Feltman passed away on September 23, shortly before this issue went to press.

(Dave Paone)

In 1945, at age 19, Stanley Feltman was a tail gunner in a B-29 for the U.S. Army Air Corps. He had flown about 15 successful bombing missions in the South Pacific, but come mission number 16, he wasn’t so lucky.

His plane, containing 11 crew members, was shot down by a Zero fighter aircraft of the Imperial Japanese Navy. All 11 men were able to escape the wreckage by inflating a dinghy and paddling away from the aircraft before it sank minutes later.

The dinghy was designed for six. Six men could sit inside; the other five, including Feltman, had to hang onto a rope that ran around the perimeter, their bodies waist-deep in the water.

And then there were the sharks. They had some repellent on hand, but it dissipated over time. At one point, another airman who was hanging on lost his grip and slipped into the shark-infested water. Feltman dived after him and brought him back to the surface. This act of bravery would earn Feltman the Bronze Star.

Several hours later, a submarine spotted them. However, its crew was on a mission elsewhere, and could not take them aboard. The submarine’s crew wired their coordinates to an aircraft carrier, which sent a PBY seaplane to pick up the stranded airmen after a total of about eight hours in the water.

When the United States entered the war on December 7, 1941, Feltman was only 15 and couldn’t enlist, although he wanted to. However, Americans could enlist at 17 with parental consent, which was his plan. Upon his 17th birthday, he told his parents of his intention to volunteer.

Eventually, Feltman found himself in the tail of a B-29 in the South Pacific. His job was to fire at oncoming enemy planes. Often, these were flown by Kamikaze pilots, who would purposely crash their explosive-laden planes into American aircraft carriers.

Feltman recalled his first encounter with the enemy. “I remember somebody saying, ‘There’s planes coming in at six o’clock,’” he said. “I sighted on a plane that I saw coming in. I didn’t know if it was the same plane that they saw because usually they had five, six planes at one time come at you. I fired; I saw the plane blow up, so I figured it has to be a Kamikaze plane. It just exploded.”

Feltman was only 18 at the time, and the youngest member of the crew. After he hit his target, he shouted, “I got him! I got him! I got him!”

Today, at 95, when Feltman thinks about those battles, he’s not so enthusiastic. He’s certain he shot down eight Japanese pilots and thinks there may have been two more.

“I never felt right by taking a life,” he said. “When you’re shooting planes down, you’re taking a life. That’s all. There’s nothing big about that.”

Korean War Veteran

Sal Scarlato (left) with a South Korean counterpart. (Courtesy of Sal Scarlato)

On June 25, 1950, North Korean soldiers crossed the 38th parallel, and the Korean War began.

Sal Scarlato was 17 at the time. He had known of a few boys from his Brooklyn neighborhood who were killed in combat early on, but this didn’t stop him and his pals from enlisting in the Marines after they turned 18.

Private First Class Scarlato landed at Incheon on April 10, 1952. He was 19 and in the infantry.

“All of a sudden, we got hit with small-arms fire and mortar fire,” said Scarlato. “We were firing like crazy. I had the runs. I urinated. I was crying. A couple of guys got hit.”

One night, Scarlato had outpost duty along the 38th parallel. “That night, the CCF (Chinese Communist Forces) really gave us a welcome,” he said. “When they came, I didn’t fire my weapon right away. I froze. So, the guy next to me—actually, he was my squad leader—hit me in the helmet. He said, ‘You better start firing that weapon.’ A couple of minutes later, he got hit in the belly. He fell right on top of me. And when the corpsman came, he said, ‘Give me your hand.’”

Scarlato applied pressure to the squad leader’s liver, which was protruding from his body. Right then and there, he died. “I cried like a baby,” he said. “After this, I was very bitter. I kept saying to myself, ‘What the hell am I doing here?’ And my officers always said, ‘You’ll find out. You’ll find out eventually what you’re doing here.’”

Scarlato witnessed countless casualties, and then, in July 1952, became one. Once again, Scarlato’s unit came under attack by the CCF. An enemy combatant tossed a grenade at him and two other Marines. It exploded, killing one of them and wounding the other two. Scarlato suffered leg, neck, and hand wounds, and a concussion.

A corpsman gave him a shot of morphine and sent him via jeep to an aid station. From there, he was flown via chopper to a hospital ship. He thought this was his ticket home, but the Marines still needed him. Being sent back to his unit made Scarlato bitter. “I hated everybody,” he said, even his South Korean allies. Scarlato once even spat on a soldier when he came close.

Scarlato soon discovered that the officers were correct, and he did indeed find out why he was there. On patrol one day, Scarlato’s unit came upon a small village where several civilians had been killed.

“There was a little boy, maybe 5, 6 years old—he had his hand blown off.” Scarlato immediately picked the boy up and put his severed hand in his own pocket. He bandaged the end of the boy’s arm and a corpsman arrived. The child screamed in pain the entire time. They flagged down a medical jeep and drove to a nearby orphanage that had medical staff.

The nurses placed the boy on a table. Scarlato and the corpsman turned and walked out, having done all they could. Then, Scarlato remembered he still had the child’s hand in his pocket. He stepped back inside, only to find out the boy had died.

This was the defining moment. Out of all the death and carnage Scarlato saw, this was the worst. Now, he knew that the reason he was there was “to save these people’s lives. Before that, I didn’t understand.”

At 88, Scarlato is still sharp as a tack and keeps up with the news, including about current U.S.–North Korea relations. He’s a member of the Korean War Veterans Association, and regularly raises money for Korean War monuments.

Vietnam War Veteran

Col. Robert Certain with his wife, Robbie. (Courtesy of Robert Certain)

It was late 1972, and as the holiday season approached, Colonel Robert Certain, an Air Force B-52 navigator, was preparing to return stateside from war-torn Vietnam. But just days before his departure date, this much-anticipated plan was abruptly changed. Instead of returning home, Certain was now assigned to a large-scale flying mission—one that would radically change his life.

As a navigator, Certain explained that his job was not only to get to the target on time, but also to ensure the task was accomplished in an equally prompt and precise manner. The logistics were critically important for this mission, he said, because he and his crew would be flying toward Hanoi, deep into what was then known as enemy territory. Even so, the newly assigned mission initially got off to a good start and seemed to go according to plan. And then, it didn’t.

When Certain and his crew had almost reached their target, the plane suddenly sputtered into a free fall. They’d been hit. With no time to waste, Certain knew there was only one way to survive the doomed flight—eject into enemy territory. And so, Certain explained, he wasn’t surprised when he was captured, along with another member of the crew. “We were just a few miles north of Hanoi,” Certain said of their precarious landing site, estimating it was within 10 or 20 kilometers of their original target.

Certain would eventually end up in the infamous prison sarcastically dubbed by Americans at that time as the “Hanoi Hilton.” But first, he was forced to endure hours of relentless interrogation. Then, he and his fellow captive crew member were paraded in front of cameras at an international press conference.

Though the North Vietnamese may have been “showing off” their catch of the day, Certain believes this exposure protected him and the other new captives from the type of well-reported, horrendous conditions earlier prisoners were subjected to. After about 10 days, his tiny, shared cell was upgraded to a much larger one, and the prisoners were eventually allowed to gather together on Sundays for a service of sorts.

If the watchful eye of the media played a part in the type of treatment Certain and other newer captives received as prisoners, undoubtedly, so did the actions of the American government. At that time, the United States was in dedicated negotiations to end its involvement in the war. After the signing of the Paris Peace Accords made it official, Certain once again began planning for his return home. This time, his plans were undeterred, and Certain was set free on March 29, 1973.

But this isn’t where the story ends. Certain, who was 25 when he was captured, returned to the United States and hit the ground running, but on a much different path. In 1976, Colonel Certain became Father Certain, an ordained Episcopal priest. He went on to earn his Doctor of Ministry degree in 1999, and as a member of the U.S. Air Force Reserves, he served as chaplain for a number of U.S. bases, including what is now Joint Base Andrews. When former President Gerald Ford passed away in 2006, it was Father Certain who presided over his graveside services.

Certain retired from active-duty service in 1977 but went on to serve in the Reserves until 1999. His exemplary service earned him a number of prestigious honors, including the Purple Heart, Bronze Star, and Distinguished Flying Cross medals, to name just a few. He has also served as a CEO, director, or board member for numerous organizations and governmental committees, such as the Defense Health Board and the Pentagon Task Force on the Prevention of Suicide by Members of the Armed Services. Notably, he remains active as a board member of the Distinguished Flying Cross Society, composed of medal recipients. Over the years, his 2003 autobiography, “Unchained Eagle,” has accumulated a prestigious—and rare—five-star average rating on Amazon.

Yet despite his many successes, Certain admits to one failure. “I’ve tried to retire,” he said with humor in his voice, “but I’ve been a failure at it.” Officially though, Certain is indeed now classified by the military as retired, and lives with his wife of many years, Robbie, in Texas.

Gulf War Veteran

Air Force Lt. Col. Rob Sweet (center right) with his family after he took his final flight on June 5 this year, at the Moody Air Force Base in Georgia. (Andrea Jenkins)

It was February 1991, and U.S. Air Force pilot Lieutenant Colonel Robert Sweet was on his 30th mission in Desert Storm. The goal, simply put, was to eliminate enemy targets. However, his arrival at the targeted area was met with such heavy fire, he was ordered to leave because, as he explained in a later press statement, “if the target area is too hot, you have to leave. It’s not time to be a hero.”

As he and his flight lead, Stephen Phillis, made their way out of the area, he caught sight of what he described as a “pristine array of (enemy) tanks that had not been hit.” He found this downright shocking, he said, “because by that point, everything had been bombed for the past 30 days.” After Sweet began to attack the tanks, an exchange of fire erupted, and the A-10 Thunderbolt he was piloting was hit from behind.

He attempted to keep the damaged plane in the air, but he quickly realized it was not salvageable, and in order to survive, he would have to eject into enemy territory. “I tried a couple of things, and basically, it wasn’t going to work, so I punched out,” Sweet said, explaining how he landed face-to-face with more than a dozen irate Iraqi soldiers, southwest of Basra, Iraq. He was captured and held prisoner for 19 days under brutal conditions, including beatings, starvation, and exposure to disease.

It was clear, he said, that he now had to fight to keep himself both physically and emotionally strong. But it was also clear that the military had prepared him well beforehand for this type of situation. “There were very few surprises,” Sweet said of his time as a prisoner. “The SERE (Survival Evasion Resistance and Escape) we have is outstanding,” he said of the U.S. military’s training. “There were very few surprises in the jailhouse. I knew what to expect.”

And although his expectation included casualties, Sweet still found himself reeling after learning that Phillis had been killed in action. “I had survivor’s guilt, and it took me a long time to get over that,” he said.

Sweet spent 19 days in captivity before being released as part of a prisoner exchange. But it wasn’t without some long-term aftereffects. Most notably, he realized the importance of making good decisions under pressure and taking life as it comes. “Bloom where you’re planted,” he advised. In the military, that often includes assignments to undesirable locales. “Make the most of them and move on,” he said.

And that’s exactly what Sweet himself has done. After spending 20 years on active duty and 13 more as a reservist, Sweet retired in June 2021 as the last former POW still actively serving in the Air Force. Acknowledging this at his retirement ceremony, General Charles Q. Brown captured the sentiment of the nation when he said simply, “We thank you for all you’ve done.”

Dave Paone is a Long Island-based reporter and photographer who has won journalism awards for articles, photographs, and headlines. When he’s not writing and photographing, he’s catering to every demand of his cat, Gigi.

Joni Williams started her career as a real estate reporter. Magazine writing soon followed, and with it, regular gigs as a restaurant and libations reviewer. Since then, her work has appeared in a number of publications throughout the Gulf Coast and beyond.

Features History

A Secret Language That Helped End World War II

In war, information can be more valuable than tanks, planes, ships, or soldiers. Information sent and received without detection can mean the difference between victory and defeat, even between life and death.

Protecting information means developing elaborate codes. One such code, developed and used by Native Americans, played a pivotal role in helping the United States prevail in the Pacific during World War II and bring the conflict to an end.

In the process, it became the only spoken code in military history never to have been deciphered.

Members of the Navajo tribe worked with the Marine Corps to create a code based on the Navajo language. The Navajo Marines who employed that code became known as “Navajo Code Talkers” and participated in every Marine assault in the Pacific, including Guadalcanal, Iwo Jima, and Okinawa.

The code “saved hundreds of thousands of lives and helped win the war in the Pacific,” said Peter MacDonald Sr., a 93-year-old Marine veteran and one of only four Code Talkers still living.

At Iwo Jima, six Code Talkers sent and received more than 800 messages without making a mistake.

“Were it not for the Navajos,” 5th Marine Division signals officer Major Howard Connor once said, “the Marines would never have taken Iwo Jima.”

A Spark of Genius

The idea to use Navajo came to a civil engineer in Los Angeles. Philip Johnston, the son of a missionary, grew up on a Navajo reservation in Arizona and maintained contacts with Navajo friends. Johnston, who fought in World War I, had learned that the U.S. Army used the language spoken by the Comanche tribe for military communications during field maneuvers.

After the Japanese attacked Pearl Harbor in 1941, Johnston contacted the Marines and presented his idea in 1942. The Marines asked him to organize a demonstration, so Johnston chose four Navajos who were working in Los Angeles’ shipyards at the time.

The demonstration succeeded: The Navajos transmitted and decoded three lines of text within 20 seconds.

MacDonald Sr. with his veteran insignia. (Tom Brownold for American Essence)

So the Marines approved Johnston’s plan and recruited 29 Navajos to write a code book. But since Navajo was only spoken, not written, the authors devised an alphabet for written communication and colorful descriptions for military terms.

For example, the Code Talkers used the Navajo word for chickenhawk to describe a dive bomber.

“We had a lot of chickenhawks on the reservation,” MacDonald said. “They fly high, but when they see a raven down below, they dive real fast, and they have a nice lunch. So by using the action of the bird and the action of the airplane, we can help us memorize what those code words are.

“Code words were not very difficult to remember because they were all based on something that we’re all familiar with. All the names of different airplanes took the names of different birds that we are very familiar with on the reservation.”
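The two-layer scheme MacDonald describes — familiar words standing in for whole military terms, with everything else spelled out via a code alphabet — can be sketched as a toy substitution table. Only the chickenhawk/dive-bomber pairing comes from his account; every other entry below, including the placeholder “letter words,” is invented for illustration and bears no relation to the actual (once-classified) vocabulary.

```python
# Toy sketch of the Code Talkers' two-layer substitution scheme.
# "dive bomber" -> "chickenhawk" is the pairing MacDonald mentions;
# the letter words are invented placeholders, not the real alphabet.
TERM_CODE = {
    "dive bomber": "chickenhawk",  # the bird dives on prey as the plane dives
}

# Placeholder code alphabet: each letter maps to a stand-in word.
LETTER_CODE = {ch: f"word-{ch}" for ch in "abcdefghijklmnopqrstuvwxyz"}

def encode(message: str) -> list[str]:
    """Replace a known term with its code word; otherwise spell it out."""
    msg = message.lower()
    if msg in TERM_CODE:
        return [TERM_CODE[msg]]
    return [LETTER_CODE[ch] for ch in msg if ch in LETTER_CODE]

print(encode("Dive Bomber"))  # a known term becomes a single code word
print(encode("abc"))          # unknown text is spelled letter by letter
```

In practice, of course, there was no lookup table at all: the Code Talkers carried the entire vocabulary in memory, which is precisely what made the code both fast and impossible to capture.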

Breaking New Ground

The armed forces used other Native American languages as codes during World War II, but Navajo offered several advantages. First, it remained an unwritten language. Second, only about 30 non-Navajo Americans understood the language when the program began. Third, Navajo’s grammar and syntax differ dramatically from those of other languages.

Though the program began in 1942, MacDonald had no idea it existed when he joined the Marines in 1944.

“It was top secret to begin with,” he said. “None of us knew that there was such a program until after we passed boot camp, combat training, and communication school. Only after that were we then introduced to a very private, top secret, confidential, Navajo code school.”

At that school, instructors who had served overseas taught the students how to use and pronounce code words, how to use the new alphabet, and how to write legibly on a special tablet made for the code, then drilled them in their new skills.

Working Under Fire

The Code Talkers who graduated became as indispensable as rifles or mess kits.

“Every ship used in the landing—battleships, cruisers, destroyers, submarines, aircraft carriers—all had Navajo Code Talkers along with the English [language] network guys,” MacDonald said. “Every Marine air wing, Marine tank unit, and Marine artillery unit also had Navajo Code Talkers assigned to them.”

So how did the whole system work under fire?

“There are two tables [where Marines worked], one for the Navajo communication network, a second table for the English communication network,” MacDonald said. “As soon as the first shot is fired, messages are coming in Navajo as well as in English. All Navajo messages are received by Navajo Code Talkers.

“The message comes in, you write it down in English, and hand it over your shoulder to the runner standing behind us. He takes it up to the bridge and gives it to the general or the admiral. He reads it, he answers, and the runner brings it back down to us.”

The runner had his own special way to determine a communication’s importance.

“If he says ‘Nevada,’ ‘New Mexico,’ or ‘Arizona,’ we send a message back out in Navajo code,” indicating the message was important, MacDonald said. “If there is a top secret or confidential message that needs to be sent to another unit or another location, it’s given to a Navajo Code Talker.”

By the time World War II ended, more than 400 Marines had served as Navajo Code Talkers. Their secret vocabulary grew from 260 code words used during Guadalcanal, the Code Talkers’ first battle, to more than 600, MacDonald said.

Preserving a Legacy

Yet not until 1968, when the government declassified the program, did Americans learn about the Navajo Code Talkers. Now, 80 years after their service, the surviving Code Talkers are trying to preserve their legacy for future generations.

“We have been going across the country, via invitations, to tell our story,” MacDonald said, “and we are making headway to get American people to know this legacy.”

MacDonald Sr. with his grandchildren. (Tom Brownold for American Essence)

Part of that campaign involves plans for building a museum dedicated to that legacy.

“We found that many Americans and foreign nations didn’t know anything about this unique World War II legacy,” said MacDonald, who is spearheading the project. “The museum will tell the story of who we are, our heritage, our culture, our language, and the sacrifices we’ve made like so many other peoples.”

Those sacrifices enabled the United States to help protect the world from tyrants, he added.

Joseph D’Hippolito is a freelance writer based in Fullerton, California. His work has been featured in The Wall Street Journal, The Federalist, The Guardian, The New York Times, and the Jerusalem Post, among other outlets.

Features History

Service in the Time of JFK’s Camelot

This year marks the 60th anniversary of the start of President John F. Kennedy’s administration. When he took office in January 1961, he ushered in a new sentiment for the country. That sentiment was all about youth.

At 43, JFK was the nation’s second-youngest president, and he was good-looking to boot. The First Lady was also young and attractive, and their two young children were adorable.

JFK succeeded President Dwight D. Eisenhower. While both had served in the military during World War II, they were from opposite ends of the age spectrum. Eisenhower, known as Ike, was a career soldier who had reached the rank of five-star general in the U.S. Army by the end of his military career. JFK, an officer in the Navy, was far younger and rose only to lieutenant during the war.

“What had happened in 1960 was that the junior ranks of the military in World War II replaced the generals,” said James Piereson, a historian and fellow at the Manhattan Institute. “That was part of the generational change that happened. Kennedy was, of course, quite pro-military,” he said. “JFK gave luster to military service,” he added, having “very much campaigned on his war record” in 1960.

So, what was it like being young and in the service during the Kennedy administration?

Bob Hogan was a gunnery officer and lieutenant junior grade on active duty in the Navy from 1960 to 1963, essentially the entire duration of JFK’s time in office. He was commissioned at age 22. “I was blown away by JFK’s Navy war record, his charisma, style, and wit,” he said. “I was immensely energized by his call to service, and really believed in it. His seeming idealism, his patriotic values—I was completely taken in.”

Tom Fryer had the thrill of a lifetime when JFK handed him his diploma and commission. They shook hands at Fryer’s graduation ceremony from the U.S. Air Force Academy in 1963. “I felt so honored, so humbled,” said Fryer, who was also 22 at the time.

The American president is also commander-in-chief of the nation’s military. In October 1962, JFK had to make some difficult decisions in that role. The United States and the USSR were fighting the Cold War, and Nikita Khrushchev was JFK’s counterpart in the Soviet Union. A U-2 reconnaissance photo of Cuba confirmed that Khrushchev had placed nuclear missiles on the island, just 90 miles off the coast of Florida.

JFK responded by ordering a naval blockade around Cuba, and essentially told Khrushchev that the missiles had to go. If they didn’t, there would be war. A nuclear war.

This period, known as the Cuban Missile Crisis, was essentially a naval operation. But the entire military, worldwide, was ready for deployment, including a possible invasion of Cuba.

Harry Moritz was at Morse Intercept School at Fort Devens, Massachusetts, at the time. “One day, we marched back to our barracks and were held for an announcement. We were asked if anyone spoke Spanish. Several guys raised their hands. They were pulled to one side, told to pack their gear, and they were sent on a ‘special assignment’ TDY (temporary duty assignment). They disappeared and were never seen again,” he said. “We non-Spanish folks stayed in Morse school, and in the dark, like the rest of the USA, crapping our pants.”

Gary Mahone was a Morse interceptor, stationed in Hakata, Japan. “During that time, we were on red alert and worked 12-hour shifts, 24/7,” he said. “All leaves and terminations were canceled. Very tense times.”

The Air Force Academy that Fryer attended was in Colorado, not far from the North American Aerospace Defense Command (NORAD), which conducts aerospace warning and control for the United States. “If the Russians would have come after us, that was a prime target,” said Fryer.

According to Fryer, however, Soviet missiles weren’t all that accurate at the time; if one aimed at NORAD fell 15 miles short of its target, it could easily hit the academy. “In preparation for that, we held some drills,” he said. The academy was built with underground tunnels that distributed its utilities, and top brass decided the safest place for the cadets was in these tunnels, which few people even knew existed.

Hogan was on a destroyer, which was part of the task force that was going to invade Cuba. His ship was the submarine screen and would provide shore bombardment should the invasion happen.

Hogan detected a Russian submarine tailing them. “I heard his torpedo doors open,” he said. That meant the Soviets were preparing to attack. Hogan had his hand on the trigger, let his captain know he had positive identification, and requested permission to fire.

Had permission been granted, this very action would have kicked off a nuclear war. However, he was “in a system” and “the system has its rules; you follow the rules.” He would have obeyed the order to fire if it had been given.

“I was (expletive) my pants,” Hogan recalled. “There was a long pause, and the captain said, ‘Classify your contact as a whale,’” instead of an enemy submarine. “I was really glad when the captain chickened out.”

With a nuclear war between the two superpowers looming, Khrushchev eventually gave in and agreed to remove the missiles.

Veteran Joe Schmidt of N.Y. (Dave Paone)

Joe Schmidt was a 21-year-old signalman on a destroyer in the blockade. His job was to directly communicate with the Russian merchant ships as they removed the missiles from Cuba. “With a flashing light, we would send a message to them, and we had to ask them, ‘What is your cargo?’” he said. The expected reply was, “Missiles.” Schmidt would relay that message to the captain, who would relay it to the naval air station in Key West, Florida.

It was understood by everyone involved that the Soviet merchant ships were carrying the missiles and nothing else. “Anything coming out of Cuba at that point was only coming out with missiles on it. They weren’t bringing cigars,” said Schmidt with a laugh.

Key West would then dispatch a P2V Neptune anti-submarine aircraft to fly over the Russian ship to photograph its cargo. The only time Schmidt was in contact with a Soviet ship, it was after midnight and completely dark.

“They had these huge searchlights on the wingtips,” he said. “And they lit that ship up—that plane lit it up—it looked like it was 12 o’clock in the afternoon with those lights.” Even though the two sides spoke entirely different languages—ones that don’t even share the same alphabet—there was a code that both understood, which made communication possible.

JFK’s presidency is fondly referred to as “Camelot,” and the consensus among those who served in the military during his administration is that, for different reasons, it was an exciting time. As Hogan put it, “Best and worst experience of my life.”

Dave Paone is a Long Island-based reporter and photographer who has won journalism awards for articles, photographs, and headlines. When he’s not writing and photographing, he’s catering to every demand of his cat, Gigi.

Features History

Photographing President Eisenhower

On a summer’s day in 1955, the stars aligned for an airman second class at Lowry Air Force Base in Denver, Colorado. This was just before the days when Camp David became the official presidential retreat, and President Dwight D. Eisenhower used a property near the base known as the “Summer White House.”

Twenty-one-year-old Al Freni was assigned to the president as his official photographer. On August 16, he and several other photographers were shooting Eisenhower (known as Ike) and his grandson, David, as they spent the day at a nearby ranch owned by one of Ike’s friends, Aksel Neilsen.

Freni took the picture that would kick-start his career. It’s of the pair fishing at a pier, bonding as grandfathers and grandsons do. This picture would be republished in books and magazines and exhibited for decades thereafter.

Freni’s story begins in 1933, when he was the second son born to Italian immigrant parents in the Jackson Heights neighborhood of Queens, New York. His birth name was Alfredo Giuseppe Freni, but several years later, an editor felt it would take up too much space in his publication and, in an Ellis Island-style move, insisted he simply go by Al Freni.

At 10 years old, Freni purchased his first camera, a Clix Deluxe, for $1.79. Soon after, his older cousin purchased a basic darkroom kit for Freni, and he started developing and printing his own pictures in the bathroom and the former coal bin of his family’s house.

Completely by chance, famed Life magazine photographer Alfred Eisenstaedt lived in an apartment building two blocks from the Freni household. Upon learning this, Freni, who had never heard of Eisenstaedt before, scraped up a dime to purchase the latest issue of Life.

Freni attended the School of Industrial Art (now the High School of Art and Design) in Manhattan for high school, where he took four photography classes per day and was named “most probable to succeed” upon graduation in 1951.

At this point, the Korean War was on, and Freni was of draftable age. For the next two years, he worked two different jobs, but he decided to enlist before he could be drafted. He joined the Air Force in 1953 with the plan of working as a photographer.

The Air Force had a different plan. They trained him as a turret mechanic for B-47 bombers. After nine months of it, Freni had had enough and was seriously considering going AWOL. “I couldn’t stand what I was doing,” Freni said. A fellow airman suggested he speak with the base chaplain. Freni took that advice, and the chaplain, a colonel, pulled some strings. He offered Freni a position working for the weekly Air Force newspaper, called Airmen. Freni jumped at the offer.

As airman first class, Al Freni is pictured after his promotion in 1955. (Courtesy of Al Freni)

The good news was Freni was the No. 2 photographer of a two-man photo department. The bad news was that meant he had to shoot the less-glamorous and more difficult assignments, including climbing up a ladder to the roof of a hangar to photograph the president’s plane upon arrival.

“Then the magical thing happened,” said Freni. “The photographer that was assigned to cover the president in 1954 got his orders. They shipped him out. I graduated to base photographer.”

That meant whenever Eisenhower vacationed at the Summer White House, Freni was the official photographer. “Here I am, not even 22 years old,” said Freni, “and I was assigned to be the presidential photographer.”

The day of golfing, horseback riding, and fishing was a photo-op manufactured by the presidential press secretary at the time, James Hagerty. It was so manufactured that, according to Freni, the White House had live trout trucked in and released into the water to ensure the younger Eisenhower would catch a fish.

While the entire day was manufactured, the moment Freni captured was real. David had walked away from his grandfather, and the other half-dozen photographers there, and stood on the pier alone. Ike walked over and joined him. Freni saw this unfolding but was the only photographer to act. “I saw a picture,” Freni said. He then shot the photo that would bring him his most recognition.

All of Freni’s photographs taken while in the Air Force were shot on a Speed Graphic camera, which he purchased in 1949. It was the camera photojournalists had used for decades. It was big, heavy, and cumbersome, and it took one sheet of film at a time, so photographers spent a lot of time inserting and removing the holders that contained the film. If a flash was needed, individual flashbulbs were inserted before and ejected after each use.

The fishing photo ran on the front page of Airmen, as well as the Rocky Mountain News, a Denver daily newspaper. Eisenhower loved it so much that he requested 40 prints. It took Freni three days, but he made 43 11-by-14-inch prints in the darkroom by hand.

An appointment was set up for Freni and the public information officer, a major, to meet in the president’s office, where Eisenhower would sign one of the prints for Freni to keep. Freni got a haircut, shined his shoes, and put on clean fatigues. When they walked into the room, Eisenhower said, “Come in, Sergeant,” and the major’s face turned white.

Freni believes this was the commander-in-chief’s subtle way of saying to the major, “Promote this guy.” Whether it was intentional or not, the major did, indeed, promote Freni soon afterward.

Ike wrote, “For Alfred Freni, with best wishes, Dwight Eisenhower.”

Thirty-nine years later, the grandson, David, signed the photo, writing, “For Al Freni, who took my favorite picture.”

Freni’s photograph is at the Dwight D. Eisenhower Presidential Library, Museum, and Boyhood Home in Abilene, Kansas, and at the Eisenhower Historical Site in Gettysburg, Pennsylvania. It’s been published in one of the titles in the Time-Life series, “The Fabulous Century,” as well as many other books and magazines.

Freni has had a long career as a professional photographer in New York. For many years, he had a combination studio-office-darkroom in the Time-Life Building, seeing Eisenstaedt regularly. As a true New Yorker, he never left his Queens neighborhood and now lives in the building where Eisenstaedt lived. But it’s the fishing picture that Freni remembers most fondly.

He often states how “one two-hundredth of a second” can change a person’s life. That one two-hundredth of a second certainly changed his.

Dave Paone is a Long Island-based reporter and photographer who has won journalism awards for articles, photographs, and headlines. When he’s not writing and photographing, he’s catering to every demand of his cat, Gigi.