The Breakers in Newport, Rhode Island: A Grand Tour of the Vanderbilts’ Italianate Summer Home

In the autumn of 1885, Cornelius Vanderbilt II paid a little over $400,000 for a summer cottage in Newport, Rhode Island. The Queen Anne-style house, built in 1878, was considered the “crown jewel” of Newport. It had been designed by the architectural firm of Peabody and Stearns for Pierre Lorillard IV, whose fortune came from the Lorillard Tobacco Company. Lorillard bred thoroughbred racehorses, financed archaeological expeditions to South and Central America, and helped make Rhode Island a yachting center. The house was situated along Cliff Walk in Newport, with a commanding view of the ocean.

When Cornelius Vanderbilt II acquired the “cottage,” he hired Peabody and Stearns to oversee $500,000 in renovations, but in 1892 a fire that started in the kitchen largely destroyed the house. Vanderbilt decided to demolish the ruin, right down to its foundations, and build anew. He brought in architect Richard Morris Hunt, who had worked for the Vanderbilt family in New York, and impressed upon him his insistence that the new house be fireproof. Hunt responded with a design that would cost $7 million to build, an enormous sum even in 1893.

The entrance gates, manufactured by the William H. Jackson Company of New York, rise 30 feet above the driveway and feature a monogram of Cornelius Vanderbilt’s initials as well as acorns and oak leaves— symbolic of the Vanderbilt family. (Courtesy of The Preservation Society of Newport County)
Designed by Richard Morris Hunt in the style of ancient Rome, the Billiard Room has walls constructed from slabs of Italian cipollino marble with rose alabaster arches, and mosaics of semi-precious stones. The Billiard Room was featured in the second episode of “The Gilded Age” series on HBO. (Courtesy of The Preservation Society of Newport County)

The bones of the estate would be steel, brick, and Indiana limestone. Rather than using wood framing, the architect created masonry arches on steel beams. The boiler room was placed in a detached building, connected to the main house by an underground steam tunnel. What rose from the original foundations was not simply a reconstruction of the old house, but a grand edifice in the style of the Italian Renaissance. It would be the grandest Gilded Age mansion in Newport. In fact, the new Breakers is much larger than the original house; the surviving foundations form only part of the base of Hunt’s masterpiece. Hunt took his inspiration for The Breakers from Peter Paul Rubens’s 1622 book “Palazzi di Genova,” which he had acquired on a trip to Genoa, and he referred to its detailed illustrations as he created a Renaissance villa for the Vanderbilts.

Approached from the street, the mansion appears to be three stories high (it is actually five). As you enter the foyer, a gentleman’s reception room lies to the right and a ladies’ reception room to the left. Continuing straight, you step into the immense Great Hall. Rising 50 feet and ringed with balconies, the Great Hall creates the illusion of an Italian open courtyard, or cortile. Hunt organized the rooms of the mansion around this central space, in the manner of the villas depicted in “Palazzi di Genova.” The firm of Allard and Sons of Paris created the interiors, importing the finest materials for the work. The Austrian sculptor Karl Bitter created the relief sculpture throughout the estate. Ogden Codman, a Boston architect, oversaw the design of the family quarters.

Portrait of Countess Laszlo Szechenyi (Gladys Moore Vanderbilt), the youngest child of Mr. and Mrs. Cornelius Vanderbilt II, by Philip de Laszlo, 1921. The portrait hangs inside the Morning Room at The Breakers. (Public Domain)
The Music Room showcases a gilt-coffered ceiling lined with silver and gold. This room was featured in the season finale of the HBO series “The Gilded Age.” (Courtesy of The Preservation Society of Newport County)

For the grand view of the ocean, Hunt created the double loggia: covered exterior galleries, one above the other, designed primarily as places for sitting. The lower loggia has a vaulted ceiling covered in mosaic, and the upper loggia is painted to resemble canopies against the sky. The spandrels (the panels between the arches) of the loggia feature figures representing the four seasons. Both the materials and the artisans were brought from overseas. Inspired by the palaces and villas of 16th-century Genoa, Hunt drew from classical Greek and Roman motifs to create the splendor of The Breakers. While the exterior is constructed of Indiana limestone, the walls of the Great Hall are made of carved Caen limestone imported from the coast of France. The walls are inset with plaques of rare marbles, such as pink marble from Africa and green marble from Italy.

The Great Hall’s pilasters (embedded columns) and medallions (circular decorations) are decorated with acorns and oak leaves, representing strength and longevity, symbols of the Vanderbilt family. On top sits a massive cornice that frames a ceiling mural of a windswept sky. Hunt enclosed the space in consideration of Rhode Island’s New England climate, but he quite successfully retained the illusion of an open courtyard. The contrast of the elaborately detailed cornice against the painted sky reinforces that feeling, as does the large glass wall between the hall and the loggias.

Portrait of Mrs. Cornelius Vanderbilt II by Raimundo de Madrazo y Garreta, 1880. (Public Domain)
The Dining Room is the most lavish room inside The Breakers, featuring 12 rose alabaster Corinthian columns, a ceiling mural of the goddess Aurora bringing in the dawn on a four-horse chariot, and two Baccarat crystal chandeliers. (Courtesy of The Preservation Society of Newport County)

Projecting from the estate’s south wing is the oval Music Room. Richard Van der Boyen designed the intricate woodwork and furnishings. Jules Allard and Sons built all the woodwork in their shops in Paris and shipped it to America for installation. Used originally for recitals and dances, the Music Room was featured in an episode of Julian Fellowes’s HBO series “The Gilded Age.”

The gardens of the 70-room estate were designed by Boston engineer Ernest W. Bowditch, a student of Frederick Law Olmsted. Trees were carefully placed to increase the sense of distance between The Breakers and the neighboring houses. The property’s enormous gate and wrought iron fence are flanked with rhododendron, mountain laurel, and other flowering shrubs to create a sense of seclusion. Footpaths wind around the tree-shaded grounds, all of which provide a natural backdrop for the more formal terrace gardens.

Facing east to welcome the rising sun, the Morning Room is a communal sitting room designed by Allard & Sons in France, featuring platinum-leaf wall panels adorned with muses from Greek mythology. (Courtesy of The Preservation Society of Newport County)

In homage to the original Breakers, Robert Swain Peabody and John Goddard Stearns, the designers of the original house, were commissioned to create The Playhouse in the garden: a small, Queen Anne Revival-style cottage, reminiscent of their earlier design, that was used as a children’s playhouse.

Cornelius Vanderbilt II died in 1899. He was 56. Alice, his wife, outlived him by 35 years. Not unlike the fictional Crawley family of “Downton Abbey,” the Vanderbilts faced the reality that such an estate, with its army of servants, was becoming increasingly difficult to maintain. Alice gave the mansion to her youngest daughter, Gladys (Countess Széchenyi), an active supporter of the Preservation Society of Newport County. Gladys opened the house to visitors in 1948, leasing it to the society for a dollar a year. The society eventually purchased The Breakers in 1972 for $365,000—slightly less than what Mr. Vanderbilt had paid for the property almost a century before.

This article was originally published in American Essence magazine. 

Not Just Paul Revere: The Unknown Story of the Night Rider in Virginia Who Warned the British Were Coming

It was the spring of 1781, and war had come to Virginia.

Many Virginians were fighting elsewhere with George Washington’s forces, weakening the state’s ability to resist British advances. King George’s troops, some of them commanded by the defector Benedict Arnold, had earlier that winter conducted raids and fought skirmishes with Americans along the James River. In May, these soldiers joined forces with Lord Cornwallis, who had marched his men up from North Carolina. In less than six months, this army would surrender to the Americans and French at Yorktown, but for now, it faced only light resistance and moved freely throughout the eastern part of Virginia.

Driven that winter out of the state’s new capital, Richmond, the Virginia legislature had opted in the spring to meet in Charlottesville, believing themselves secure from the British in that western hamlet. Gathered there were Gov. Thomas Jefferson, now in the last days of his term of office, as well as famous patriots like Patrick Henry and signers of the Declaration of Independence Richard Henry Lee and Benjamin Harrison. Among their number was also Daniel Boone of Kentucky, then considered a part of Virginia.

When Lord Cornwallis learned that the legislature had gathered in Charlottesville, he dispatched Lt. Col. Banastre Tarleton and 200 mounted troopers to ride west and capture these lawmakers. Though despised by colonial patriots for his harsh treatment of militia and civilians in the Carolinas—he was nicknamed “Bloody Ban”—Tarleton was a fine horseman and an aggressive commander. He pushed his men toward Charlottesville, riding much of the time at night to conceal their objective. On June 3, he paused for a few hours at the Louisa County Courthouse to give his men and horses a well-earned rest before advancing into Charlottesville the following day.

And it was on this night that one American would upend this British raid.

“Thomas Jefferson” by Mather Brown, 1786. National Portrait Gallery, Washington, D.C. (Public domain)

Virginia’s Paul Revere

Born in 1754 to John and Mourning Harris Jouett, Jack Jouett had grown up in Charlottesville, where his father operated the Swan Tavern. On this evening of June 3, he was almost 40 miles away in Louisa County at the Cuckoo Tavern, so named because of the clock in that establishment. Jouett had seen the arrival of the British dragoons, overheard talk in the tavern of their plans to proceed to Charlottesville, and decided on his own initiative to race through the hills to that town and alert the threatened legislators.

Mounted on his bay mare Sally, Jouett set out through the dark countryside. Fearing British troops, he took the back roads and trails with which he was well familiar. Just around dawn, his fast-paced horse brought him to Monticello, Thomas Jefferson’s estate. There, he roused the household and explained the dire situation to Jefferson, who, as legend has it, offered Jouett a glass of Madeira to help revive the weary rider before he set out for nearby Charlottesville.

In Charlottesville, Jouett spread the word, including a visit to his father’s popular tavern. The legislators agreed to move west to Staunton, about 40 miles away. Though Daniel Boone and several other members of this body were captured by the British, most of the representatives packed in haste, fled the town, and escaped safely to Staunton.

A Near-Run Disaster

Jefferson himself came close to being taken prisoner as well.

Aided by his body servant, Jefferson slowly packed up important papers and personal items, reluctant to leave the home he’d designed and built for fear the British would burn it. Only when a neighbor who was an officer in the Virginia militia, Christopher Hudson, found him still on the premises and urged him to flee did Jefferson mount his horse, Caractacus, and ride into the forest. Like Tarleton, he was an excellent horseman, knew the terrain, and was confident of his ability to escape Tarleton’s raiders.

The British arrived at Monticello within minutes of his departure, with Jefferson still close enough to hear them and to observe through his telescope. He rode away, but his fears regarding the destruction of his home proved unjustified. Perhaps the British remembered the story of Jefferson’s kind treatment of several captured officers earlier in the war. The troops did threaten to shoot a slave, Martin Hemings, unless he informed them of his master’s whereabouts, at which point the servant demonstrated his loyalty to Jefferson by replying, “Fire away, then.” Hemings was left unharmed, and after a thorough search of the house and grounds, the British headed to Charlottesville.

Lt. Col. Tarleton was determined to capture colonial lawmakers. “Portrait of Sir Banastre Tarleton” by Joshua Reynolds, 1782. National Gallery, London. (Public domain)

As for Jack Jouett, he moved to Kentucky the year after his ride, where he married Sallie Robards, became a father to 12 children, established himself as a successful farmer, and served in the Kentucky legislature. He was a stout advocate for statehood and was undoubtedly pleased when, in 1792, Kentucky became the second new state (after Vermont) to join the newly formed United States of America.

Though honors for his heroism on that night-long ride were belated, Jouett eventually received official recognition from the Virginia government for his exploit and was awarded a brace of fine pistols and a sword for his service.

The Power of One

Jack Jouett isn’t as famous as Paul Revere, in large part because of Longfellow’s poem “Paul Revere’s Ride” with its well-known opening lines “Listen, my children, and you shall hear / Of the midnight ride of Paul Revere.” Yet Jouett’s bravery and boldness that June night and the following day may have helped save the American Revolution. At the time, the Americans had no sure hope of victory—far from it—and the capture of patriots like Jefferson and Richard Henry Lee might have brought disastrous consequences. At the least, such a triumph would have severely damaged American morale.

Jouett also deserves our esteem for demonstrating a particularly American trait: individual initiative. Unlike Paul Revere, who worked with a committee of others to discern and thwart British intentions, Jouett acted alone and spontaneously. No one commanded him to deliver his warning; he asked no one for advice as to what he should do. At great risk to himself, he saddled up that bay mare and set out on his self-imposed mission.

To put aside our fears, doubts, and self-interests in the pursuit of liberty and a righteous cause: That is Jack Jouett’s greatest lesson for us all.

This article was originally published in American Essence magazine. 

Gibson Guitars: Fascinating Stories Behind an American Icon Serving a Century of Musicians

It was Ray Whitley who started the excitement. Throughout the 1930s, Whitley traveled with the World’s Championship Rodeo, providing musical entertainment with his band, the Six Bar Cowboys. In 1937, he prodded the Gibson Mandolin-Guitar Manufacturing Co. to develop a “super jumbo” instrument, one that could go lick for lick with the nearly 16-inch-wide, rosewood-and-mahogany Dreadnought guitar issued by C.F. Martin & Co.

A Gibson L-4 CES, fit for jazz players. (Heath Brandon CC BY 2.0, CreativeCommons.org/licenses/by/2.0)

“Of course, what somebody at Martin saw, and what no one at Gibson apparently did, was that players were forsaking banjos for guitars and demanding louder instruments,” wrote Walter Carter in his history of the Gibson company. Short of amplification, a bigger body was the only way for a guitar to match the volume of a singer at the microphone. Playing live dates, Gene Autry, a star at Chicago’s WLS radio station, was already strumming an elaborately ornamented Martin D-45, which replaced the smaller Martin that had been stolen along with his Buick the year before. Whitley’s demands of Gibson resulted in a 17-inch-wide body with a mosaic pickguard and the slogan “Custom Built for Ray Whitley” inscribed on the headstock. The Super Jumbo 200 took its name from its generous size and steep list price: $200 (a 1938 Ford could be purchased for just over three times that amount). After World War II, the model would be known simply as the SJ-200, and Elvis Presley cradled one when he appeared on “The Ed Sullivan Show” in 1957.

“The Gibson-made instruments were louder and more durable than the competitive, contemporary fretted instruments, and were the go-to instruments demanded by players of the day,” explained ZZ Top’s Billy Gibbons in an email.

Elvis Presley’s Gibson J200 on display at his home, the Graceland mansion in Memphis, Tenn. (Mr. Littlehand CC BY 2.0, CreativeCommons.org/licenses/by/2.0)

Whitley carried his SJ-200 to Hollywood, where he wrote “Back in the Saddle Again” for the cinematic mystery-romance “Border G-Man.” Meanwhile, Gibson made a dozen more SJs for key influencers. Autry bought two at the discounted price of $150 each, and his biographer, Holly George-Warren, wrote of one guitar that it was “embellished with a two-tone mother of pearl border; horses and bucking broncos inlaid with pearl; and his name writ large alongside horseshoes inset on the fingerboard.” It was a spectacular instrument and showpiece, indeed, making a lasting impact.

In 1939, Autry recorded his version of Whitley’s tune and adopted “Back in the Saddle Again” as his enduring theme song. Heard today, the lyrics still evoke feelings of truth and triumph, but cowboy singers would soon fall out of fashion. The music made by electric guitars took over radio airwaves. Autry finished his career with landmark recordings of holiday songs, namely “Here Comes Santa Claus,” “Rudolph, the Red-Nosed Reindeer,” and “Peter Cottontail.”

The integration of Gibson guitars into the upper echelons of popular music deserves some explanation. The company’s founder, Orville Gibson, had migrated from his native New York state to Kalamazoo, Michigan, by 1881, when he was 25 years old. After more than a dozen years as a clerk in a shoe store and a restaurant, he started manufacturing musical instruments. In his small workshop, he made mandolins from a patented design. The patent application of 1895 said existing instruments were made of too many parts, “to the extent that they have not possessed that degree of sensitive resonance and vibratory action necessary to produce the power and quality of tone and melody.” He boasted of having achieved “a sound entirely new to this class of musical instruments.” The first Gibson catalog offered a family of mandolins for the popular mandolin orchestras, as well as round- or oval-hole guitars and harp guitars with 12 or 18 strings. Five stages of ornamentation, from plain to fancy, were available.

A 1964 Gibson Country Western acoustic guitar (L) and a 1963 Southern Jumbo SJ. (Tony 1212 CC BY-SA 4.0, CreativeCommons.org/licenses/by-sa/4.0)
A Gibson magazine advertisement from around 1939 to 1940. (Public Domain)

By 1902, an investor group took over Gibson’s enterprise, and the next year the founder—who had become a consultant for the company—quit, in order to teach music and collect royalties. Eventually, Gibson returned to New York; he died in 1918. His namesake company adopted an innovative marketing approach, turning music teachers into salesmen and letting customers pay small monthly installments. The Gibson banjo was introduced, but the 1911 L-4 and 1923 L-5 guitars were better fits with Jazz Age outfits like Duke Ellington’s Washingtonians at a time when people were losing their heads dancing the Charleston. With the finest materials and craftsmanship, the 1934 Super 400 extended the trend of successful rhythm instruments. Gibson’s first electric, the hollow-body ES-150, made its debut in 1936 and was popularized by the ill-fated jazz player Charlie Christian. Extolling “electrical amplification,” Christian showed the world how to perform a proper solo, before he died—too young at 25—of tuberculosis.

Singers Ray Whitley and Redd Harper, and actor Frank Seeley (far R), with fellow musicians at the Armed Forces Radio Service studio. (Public Domain)
An Orville by Gibson guitar, a line of instruments made for the Japanese market. (Public Domain)

While worthy competition came from the 1950 Fender Telecaster and 1954 Stratocaster—solid-body electrics made in Southern California—Gibson made a wily move in advance of the era of rock ’n’ roll and electric blues: In order to avoid the disdainful label of “plank” guitar, the solid-body 1952 Gibson Les Paul was developed with collaboration from Les Paul (Lester Polsfuss), who was a master player and something of a mad scientist. The guitar that bore his name had a carved maple top with no sound holes, and the gold color was intended to disguise a trade secret: the mahogany back. Like Orville Gibson’s mandolins, the new guitar was an innovative departure and an instant classic. The challenge was to figure out what to do with it, but players stepped to the fore. Bluesman John Lee Hooker, to name one, extracted grit and passion from his Les Paul. Billy Gibbons dubbed his own 1959 example “Pearly Gates,” explaining that the guitar “possesses those rare qualities found in a precise combination of elements which miraculously came together on that fateful day of fabrication.”

Renamed Gibson Guitar Corp., and now Gibson Brands, Inc., the company moved operations from Kalamazoo to Nashville by 1985, with acoustic guitars produced in Bozeman, Montana, since 1989. The company has experienced ups and downs in conjunction with fickleness in the national economy and the guitar industry—even restructuring in Chapter 11 bankruptcy in 2018. However, the pandemic has brought about a surge in guitar sales—“Did Everyone Buy a Guitar in Quarantine or What?” asked Rolling Stone—putting the company in a good position to capitalize on the upswing. Gibson guitars continue to lend their great sound and seriousness of intent to new musical acts. And it all started with Orville Gibson and his carving tools in Kalamazoo.

This article was originally published in American Essence magazine.

Raising a Forest by Hand

“The hills bear all manner of fantastic shapes,” Charles Bessey observed, noting that they sometimes featured open pockets of bare sand in blowouts and were “provokingly steep and high.” Bessey was describing the Sandhills, the area of post-glacial dunes wrought by mighty winds in north-central and northwestern Nebraska. Aided by his botany students from the University of Nebraska (today’s University of Nebraska–Lincoln), he cataloged a trove of plant species in 1892. Yet besides spurges and gooseberries, herbaceous plants such as smooth beardtongue, and grasses such as Eatonia obtusata, he found the potential for forestation.

“He was convinced that the moist soil of the Sandhills would support forest growth,” the historian Thomas R. Walsh wrote. Nebraska had gained statehood in 1867 but still had enough untouched areas to be “a virgin natural laboratory,” as Walsh described it. And there were so few trees for wood, shelter, or shade. Bessey had been pushing the state legislature to reserve Sandhills tracts for tree planting. In 1891, urged by the top forestry official in Washington, D.C., he started a test plot at the eastern edge of the Sandhills, a region encompassing an area about the size of New Jersey. Ponderosa pines were a big component of the experiment’s 13,500 conifers. When early results indicated that they would do fine, he started a campaign to convince people that forestation was practical. After all, as Walsh noted, “the area was once covered by a pine forest that was destroyed by prairie fires.”

The pre-dawn fog rises above the Niobrara River near Valentine, Nebraska. (Pocket Macro/Shutterstock)

Bessey had come to the University of Nebraska in 1884, lured from Iowa Agricultural College (today, Iowa State University) by an offer of $2,500 per year. He was already the author of “Botany for High Schools and Colleges,” the nation’s first textbook on the subject. His motto of “Science with Practice” indicated a teaching philosophy that mixed laboratory and field work with classroom instruction. He was one of a small group of professors at the prairie university, attended by just 373 students in the year he arrived, but he had an outsized and enduring influence through his popular botany seminar. A top student in the 1892 cataloging project was Roscoe Pound, who claimed the university’s first Ph.D. in botany, then distinguished himself as a legal scholar and served two decades as dean of Harvard University’s law school.

Throughout the latter years of the Gilded Age, Bessey kept hammering away at the idea of national forests. To Gifford Pinchot, head of the national Division of Forestry, he wrote, “In the Sandhills, we have a region which has been shown to be adapted to the growth of coniferous forest trees, and here we can now secure large tracts which are not yet owned by private parties.” Pinchot had the ear of President Theodore Roosevelt, who in 1902 set aside 206,028 acres in two reserves in the Sandhills. “This was the first and only instance in which the federal government removed non-forested public domain from settlement to create a man-made forest reserve,” Walsh explained.

The two reserves are 75 miles apart. The northern Samuel R. McKelvie National Forest is on the Niobrara River near the city of Valentine. The southern one, first called Dismal River Forest Reserve, is now the Nebraska National Forest at Halsey and is managed by the Bessey Ranger District. (Nebraskans refer to it as “Halsey Forest.”) Within it are the Bessey Recreation Area and the crucially important Charles E. Bessey Tree Nursery, which yearly produces 1.5 million bare-root seedlings and up to 850,000 container seedlings for distribution in the Great Plains and Rocky Mountain states. Additionally, the nursery acts as the seed bank for Rocky Mountain Region 2, storing about 14,000 pounds of conifer seeds in case of wildfire or insect infestation.

Carson Vaughan, author of “Zoo Nebraska: The Dismantling of an American Dream,” grew up in Broken Bow, about 50 miles from Halsey Forest. It was only after he started writing articles about Bessey and the forest that he comprehended the magnitude of the original undertaking: creating the largest man-made forest in the United States. “Nothing like this has ever happened anywhere else on the planet,” he said. “And it all started because this pioneering botanist, Charles Bessey, had this wild idea and the patience, the dogged persistence, to stick with it over a couple decades and see it come to fruition.”

Vaughan remembered climbing Scott Lookout Tower, near Halsey, and feeling the impact upon viewing a forest amid treeless grasslands. “You get the rolling, billowing Sandhills right next to this very clear, dark, dense forest,” he said. The experience reinforced the concept that “it took human beings planting all of these trees to make this national forest grow out of this sandy, arid region.”

The sun rises over the Dismal River, which runs through the Nebraska Sandhills. (marekuliasz/Shutterstock)

After succeeding in the Sandhills, Bessey turned to other important challenges. In 1903, he was contacted about the effort to save the giant sequoias in certain groves of the Sierra Nevada in California. He tried to interest President Roosevelt in the cause, then introduced the matter into the proceedings of scientific societies, sending their resolutions to congressional representatives. Although he helped to set the conservation process in motion, Bessey passed away in 1915 without seeing his efforts bear fruit. Sixteen years later, the state of California acquired the Mammoth Tree Grove, now a principal element of Calaveras Big Trees State Park.

On the other side of the country, Bessey became involved in the effort to create a national forest reserve in the southern Appalachians. “The cutting away and total destruction of the forests is a crime against the community as a whole,” he wrote. In 1908, a bill to authorize the reserves came before the House of Representatives, but soon died. It particularly galled Bessey that one of his former students, Representative Ernest M. Pollard, was on the agricultural committee, which had deferred action. “It does seem as though we had the most stupid and blinded lot of men in charge of our affairs that has ever cursed any country,” Bessey wrote to House Speaker Joseph G. Cannon. Bessey and others kept working, and ultimately, the Weeks Act of 1911 was passed, providing for acquisition and preservation of forested lands nationwide.

Today, visitors to the University of Nebraska–Lincoln can see an image of Bessey in bas-relief on a bronze tablet at—where else?—Bessey Hall. There’s also a Bessey Hall at Iowa State. And at Michigan State University, Ernst Bessey Hall is named for Charles’s son, who became a professor of botany and dean at MSU’s graduate school from 1930 to 1944. The apple didn’t fall far from the tree.

The First Selfie

As of this writing, about 700 billion photographs have been uploaded to the internet. Billions and billions more exist in physical form. Many of these photos fall into the category now referred to as “selfies,” a genre often assumed to be no older than Generation Y. However, the roots of the selfie actually go back almost 200 years.

The son of a Dutch immigrant, Robert Cornelius was born in Philadelphia in 1809. As a child, Cornelius was fascinated by chemistry. This interest was surely fanned by the boy’s father, a silversmith, who taught Robert the business of metal polishing and silver plating.

In 1839, the world was taken by storm when French artist Louis Daguerre unveiled the daguerreotype, a complex process—involving silver-plated copper, mercury vapor, and liquid chemical treatment—that could produce a photographic likeness. An account of Daguerre’s process was published in Philadelphia on October 15, 1839. The next day, Cornelius was approached by a local watchmaker and inventor named Joseph Saxton, who was at that time an employee of the Philadelphia Mint. Saxton wanted Cornelius to help him produce a daguerreotype image. Cornelius agreed.

Cornelius created the silver plating for Saxton’s photographic image, and that image, as far as we know, was the first photograph ever taken in the United States. In dark hues of gold and brown, the image was taken from Saxton’s own Philadelphia Mint office window, and it portrays part of the State Arsenal and a section of a neighboring high school. A late-19th-century description of the “camera” reveals Saxton’s quick ingenuity: A Seidlitz powder (a laxative) box with a few flakes of iodine answered for a coating box, while a cigar box and burning glass were improvised for a camera. One Philadelphia photographer later wrote that the Saxton daguerreotype “created no small excitement among the curious in such matters; and from this date, many of our Philadelphia savants began cultivating the art.”

The experience ignited in Cornelius an abiding interest in photography, and he was determined to improve upon the makeshift daguerreotype he’d helped Saxton throw together. In this effort, he enlisted a physician named Paul Beck Goddard, and later that same month of October, they produced a daguerreotype image of Cornelius himself—the first photographic portrait (that is, a picture of a human being) ever taken in the United States. It was probably the first in history: One earlier daguerreotype, taken a year before in Paris, happened to include a couple of people in the background, but that image wasn’t meant as a portrait.

From the metallic, spotted image, Cornelius, with his head slightly tilted to the left, stares back at us with determined eyes set beneath a prominent forehead partly covered by his thick, disheveled hair. He wears a dark coat with a cravat. His right arm is held upright across his chest, and his right hand is tucked beneath the left side of his coat. The limitations of the technology dictated that, for this first-ever selfie, Cornelius had to sit still for up to 15 minutes.

The first photographic portrait made in the United States (and probably the world), by Paul Beck Goddard and Robert Cornelius. The subject is Cornelius himself, allowing him to claim the achievement of first-ever “selfie.”

Cornelius went on to establish several photo studios, manufacturing his own cameras, plates, and mats to produce portraits for the prominent people (among others) of his time. Many of those photographs survive to this day.

One of his innovations was harnessing additional light with reflectors. Writing several decades later, one observer described Cornelius’s process: For coating the plates, he used dry iodine exclusively; and by several large reflectors, set at different angles, both within doors and without, he was enabled, in strong sunshine, to concentrate upon his sitter light enough to obtain through a side-window facing south, an impression within from one to five minutes.

Cornelius was later able to improve his process to the point where he could produce “fair impressions, even without reflectors, in from 10 to 60 seconds—and this too within doors.” But success invited imitation, and, as one mid-19th-century historian informs us: “Together with the improvements made by [Cornelius] and others in the heliographic apparatus [light reflectors] and manipulative methods, and the great advance consequent thereon in the mode of obtaining portraits from life, quite a number of persons directed their attention to the art from the hope of making it a source of profit.”

As the demand for and interest in photography spread, and as more studios opened, Cornelius opted to move on to other things: specifically, the invention of a solar lamp that proved highly popular across the United States and Europe—but that’s another story. Incidentally, few people knew or understood at the time that Cornelius had taken the first photographic portrait in American history. He wasn’t one to trumpet his own accomplishments.

Luckily for us, however, Cornelius mentored others at his studio. One of them was a young man named Marcus Root. A quarter-century after that first selfie was taken, in 1864, Root published “The Camera and the Pencil, or the Heliographic Art,” which included, along with the theory and practice of photography, a history of the field. That book explicitly, and rightly, credited Cornelius with the first photographic portrait.

Twelve years later, “The Camera and the Pencil” was exhibited at the Centennial Exhibition, where the book was noticed by a photographer named Julius Sachse. Sachse went on to interview Cornelius, and later became editor of the “American Journal of Photography.” In this way, Cornelius’ legacy was secure—and just in the nick of time. He died the following year, in 1877.

So, the next time you take a selfie, take a moment to remember Cornelius—and be glad you don’t have to sit still for 15 minutes.

Washington’s Presidency, the Glorious and the Mundane

George Washington, universally acclaimed nowadays as one of our best presidents, encountered a little bad press in his own day. Even before his inauguration, he knew he would face impossibly high expectations as president.

“My movements to the chair of Government will be accompanied with feelings not unlike those of a culprit who is going to the place of his execution.” These were the unenthusiastic words of George Washington, written to fellow Revolutionary War veteran Henry Knox on April 1, 1789, not long before his nearly inevitable election as president.

For eight years (between 1775 and 1783) and without pay, Washington had led the Continental Army against the British. The aristocratic Virginian might have gone on to leverage his impressive victory to become a “conquering general” and establish a personal dictatorship—an end conceivably within his grasp and even suggested by some in his circle.

Instead, George Washington very emphatically retired. Lest anyone should miss the point, Washington even delivered a public resignation address. His days of service were over, and beloved Mount Vernon was calling.

But now, he was being summoned into service once more. Two weeks after Washington had compared his feelings to those of a culprit on his way to execution, a dispatch arrived at Mount Vernon notifying the retired general of his presidential election. Two days after that, 57-year-old George Washington left Mount Vernon, penning the following in his diary:

About ten o’clock I bade adieu to Mount Vernon, to private life, and to domestic felicity; and with a mind oppressed with more anxious and painful sensations than I have words to express, set out for New York … with the best dispositions to render service to my country in obedience to its call, but with less hope of answering its expectations.

Perhaps no one in America was more familiar with the challenges of directing the new union than George Washington, who had played such a central role in its inception and evolution. As such, he was clearly under no illusion as to the challenges that awaited him. His acquiescence (for so it was) to the presidency was informed less by political ambition and more by solemn duty. There was no relishing of the prospect, no celebration on his part, no reveling in his political achievement. Being the sort of president people wanted—by unanimous vote of the Electoral College, no less!—seemed at the very least a daunting task, and probably an impossible one. He seems to have known this.

Bad Roads and White Robes

New York was to serve as the first temporary capital of the new United States of America, but great distance and bad roads meant that it was quite a journey to get there from Virginia. And if Washington was really weighed down by “expectations” at the moment of his departure, he was certainly more so as the journey progressed. Everywhere he went, crowds cheered his arrival, casting roses and wreaths along his path or erecting triumphal arches for him to pass through. At Trenton, 13 white-robed girls, representing the 13 states, hailed him in song as “mighty Chief,” while Washington was made to ride beneath a 13-columned arch.

Finally reaching Elizabethtown, New Jersey, across the Hudson from New York City itself, Washington was greeted by an ostentatious barge manned by 13 white-uniformed captains. Upon this gaudy vessel, the president-elect was ferried across the river to where Wall Street met the water. New York Governor George Clinton awaited him there—atop a set of specially prepared steps with their sides draped in lavish cloth.

Engraving depicting George Washington en route to Federal Hall for the first Presidential Inauguration, April 30, 1789. (Archive Photos/Getty Images)

George Washington was sworn in on April 30, his oath of office administered on the balcony of Federal Hall, in front of a massive crowd gathered along Broad and Wall Streets and on balconies and housetops in every direction. All was hushed during the swearing in, after which the officiator exclaimed, “Long live George Washington, President of the United States!”

Thunderous applause echoed throughout the city as a 13-gun salute rang out from the harbor. As the ovation continued, an American flag was hoisted above George Washington himself.

Expectations, indeed.

Complainers

Of course, the hoped-for utopia to be ushered in by America’s greatest Founding Father never materialized. Even Washington himself had hoped that the new federation would, at the very least, avoid political factions. Instead, the real world offered its usual share of complication and contention—including a highly combative two-party system. By the time Washington left office, his once-invulnerable image had taken a hit among some of his contemporaries. Complainers picked at flaws, real or imagined. American newspapers attacked his perceived disloyalty to republicanism and his personal integrity. They attacked the lavish receptions (or “levees”) he hosted with his wife, his “aristocratic” airs, his alleged “monarchical” pretensions, his cold and aloof manner. Critics accused him of being unintelligent and susceptible to bad advice from his cabinet, of treacherously betraying France by proclaiming neutrality—and of betraying the American Revolution by not eagerly supporting the French one.

A whole series of letters (called the “Belisarius” letters, after their author’s pen name), addressed personally to Washington and published in opposition newspapers, lambasted the president on a wide range of counts: for cultivating “a distinction between the people and their Executive servants”; failing to stand up to (post-war) Britain; overseeing a costly war with the American Indians; maintaining a standing army in peacetime; and supporting internal taxation (then “denouncing” the people most affected by it), among other allegations.

Tempering Expectations

Women laying flowers at George Washington’s feet as he rides over a bridge at Trenton, New Jersey, on the way to his inauguration as first president of the United States on April 30, 1789. (MPI/Getty Images)

It may be that the aspersions cast in his direction were a primary reason George Washington decided to retire after just two terms. Indeed, an earlier draft of his Farewell Address actually included these words:

As some of the Gazettes of the United States have teemed with all the Invective that disappointment, ignorance of facts, and malicious falsehoods could invent, to misrepresent my politics and affections; to wound my reputation and feelings; and to weaken, if not entirely destroy the confidence you had been pleased to repose in me; it might be expected at the parting scene of my public life that I should take some notice of such virulent abuse. But, as heretofore, I shall pass them over in utter silence.

The impossible expectations placed upon the first president demonstrate, perhaps, the futility of investing in one individual the utopian hopes and dreams of an entire people. One of the original American lessons, at least as it pertains to the state, is that political saviors don’t exist; not even the vaunted George Washington could be one! He’d felt the weight of such expectations right from the beginning. That weight probably helped drive him out of the spotlight in the end.

When election cycles come around, perhaps our expectations should be tempered by Washington’s experience.

And when politicians talk like saviors, remember George Washington, too.

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com

The Real Johnny Appleseed

Walt Disney’s 1948 animated short, “The Legend of Johnny Appleseed,” famously depicts its main character as a Pennsylvania farmer yearning to join the pioneers heading west to the frontier. All those settlers would need something to eat—and so Disney’s Johnny is determined to plant apple trees across the land in order to feed them.

The real Johnny Appleseed—a “small, wiry man, full of restless activity” named John Chapman, born in Massachusetts in 1774—was indeed a trained orchardist, and he really did show up in Ohio Territory with “a horse-load of apple seeds” to plant. It’s no myth, either, that John Chapman introduced apple trees to large swathes of frontier territory, from modern Pennsylvania and Ohio to Indiana, Illinois, and even Ontario, Canada.

Orchards for Cider

But Chapman’s apples weren’t meant for eating at all (they would have been far too sour, anyway). No—the apples of the real Johnny Appleseed were meant for making hard cider. As a not-so-subtle Smithsonian Magazine headline framed it in 2014, “The Real Johnny Appleseed Brought Apples—and Booze—to the American Frontier.” Journalist Michael Pollan agreed, explaining in a 2015 interview, “Really, what Johnny Appleseed was doing and the reason he was welcome in every cabin in Ohio and Indiana was he was bringing the gift of alcohol to the frontier. He was our American Dionysus.” Perhaps as early as the 1810s, however, Chapman was already being referred to as “Johnny Appleseed,” recognized as such “in every log-cabin from the Ohio River to the Northern lakes, and westward to the prairies of […] Indiana,” according to an 1871 Harper’s New Monthly Magazine piece. Unfortunately, along with apple trees, Chapman also planted countless acres of dogfennel, which he believed to possess medicinal qualities; dogfennel is now seen as a pernicious weed.

And the real Appleseed’s motivations were also far more complicated than those of the happy planter portrayed in the Disney musical. Far from acting the altruistic scatterer of apple seeds tossed hither and thither wherever he happened to roam, Chapman was probably a shrewd entrepreneur. The nurseries Chapman planted weren’t open to all, but rather fenced off, each tree meant for selling to settlers for six and a quarter cents. In the late 18th and early 19th centuries, some land speculation companies required prospective colonists to plant fifty apple trees on their land; since it took the better part of a decade for these trees to bear fruit, such planting would serve to prove the colonists’ commitment to develop the land over many years. The real Appleseed saw in this a business opportunity. He would thus plant a nursery, enter into a business partnership with someone in the area to manage the sale of the trees to arriving settlers eager to quickly fulfill the companies’ requirements, and then move on to repeat the process elsewhere. Two or three years later, he might return to tend to the nursery.

Traversing the Frontier and In Touch With Indians

This was a business model that demanded he labor “on the farthest verge of white settlements,” and it wasn’t an easy line of work. Chapman essentially lived life as a nomad—and a barefoot one at that. One newspaper described him as “barefooted and almost naked,” and for a time he apparently wore a coffee sack, having cut out holes for his head and appendages. (There was, too, his signature hat: a tin vessel, with a visor-like peak in the front, that doubled as a pot for cooking.) The work itself could be hazardous. Once, while working in a tree, he fell and got his neck stuck between forking branches. If one of his helpers that day, a mere lad of eight years, hadn’t discovered him and run for help, the real Johnny Appleseed might have died in 1819. And as far away as they might have seemed from America’s economic centers, Chapman’s enterprises weren’t immune from the vicissitudes of the larger economy. When recession racked the United States in 1837, the price of his trees plummeted to just two cents apiece.

The frontier along which he worked, skirting American Indian country, could also be dangerous. But the natives, whom Chapman always admired and whose wilderness trails he often traversed, left the real Appleseed alone, considering him something of a medicine man; how else could one explain the privation and exposure which he so easily endured? Even during the War of 1812, when many natives along the frontier allied with Britain to devastate white frontier communities, Chapman never ceased his wanderings, and he was never harmed. One frontier settler, reporting his experience during this period, remembered with a “thrill” the peripatetic Chapman’s timely warning to his community: “The Spirit of the Lord is upon me, and he hath anointed me to blow the trumpet in the wilderness, and sound an alarm in the forest; for behold, the tribes of the heathen are round about your doors, and a devouring flame followed after them.” The real Appleseed’s warnings may have saved hundreds of lives.

Disney’s cartoon Johnny makes friends with a skunk and is beloved by all animals. In truth, the land he traversed teemed with potentially menacing wild animals, including wolves, rattlesnakes, wild hogs, and bears. One account of Chapman, however, does claim he had partially tamed a pet wolf, which followed him around everywhere he went, and his religious fervor (he followed Swedenborgianism) did eventually cultivate in him an almost Jain-like respect for all living creatures. By the time of his death, he had become a full-fledged vegetarian.

An Orator and a Gift-giver

Illustration of Johnny Appleseed delivering a speech, circa 1820. (Fotosearch/Getty Images)

As an itinerant orchardist-nurseryman, the real Appleseed, with his “long dark hair, a scanty beard that was never shaved, and keen black eyes,” was well-known up and down the frontier. This meant Chapman was often “passing through” the towns and settlements and American Indian villages of the region (in the words of one 19th-century newspaper, he “sauntered through town eating his dry rusk and cold meat”). Frequently, he would stop to entertain groups of children—apparently he was a master storyteller—or preach “on the mysteries of his religious faith” to any adult who might listen. To little girls he gifted bits of ribbon and calico; “Many a grandmother in Ohio and Indiana,” reported an article published a few decades after his death, “can remember the presents she received when a child from poor homeless Johnny Appleseed.” Once, after being gifted shoes for his bare feet by a particularly assertive settler, he was discovered a few days later again walking barefoot in the cold; the settler confronted him “with some degree of anger,” only to find out that Chapman had almost immediately re-gifted the shoes to a poor family, some of them barefoot, traveling west.

Days after strolling through the streets of Fort Wayne, Indiana, the real Appleseed died suddenly. The year was 1845, and John Chapman was around 70 years old. He was hailed in one obituary for “his eccentricity,” his “strange garb,” his material self-denial (apparently his faith had worn away at the material entrepreneurship of his youth), his “cheerfulness,” and his “shrewdness and penetration.” Johnny Appleseed was buried at Fort Wayne.

Within a quarter-century, the life of Johnny Appleseed was featured in the aforementioned Harper’s New Monthly Magazine piece, subtitled “A Pioneer Hero.” Even then, it was admitted that, as the frontier disappeared, “the pioneer character is rapidly becoming mythical.” The story of the nomadic nurseryman “whose whole [life] was devoted to the work of planting apple seeds in remote places” had begun to take on a life of its own—a myth which, by the mid-20th century, had become musical Disney legend.

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com

Roger Sherman, Low-Key Founding Father

Among the Founding Fathers, Roger Sherman is one of the best-kept secrets. But he shouldn’t be, especially in light of the cumulative and lasting effect he has had on this nation, including the present-day debates on the meaning and legal effect of the Ninth Amendment.

Most notable is the fact that he is the only Founding Father to have signed all of these prominent founding documents: the Declaration and Resolves (1774), which contain many of the rights later enumerated in the First Amendment; the Articles of Association (1774), which established a trade boycott against Great Britain; the Declaration of Independence (1776); the Articles of Confederation (1777); and the U.S. Constitution (1787).

Sherman’s influence on the Constitution was greater than most realize. Historian Richard Werther wrote in 2017 in the Journal of the American Revolution that, at the Constitutional Convention debates, “of 39 issues cited, Sherman prevailed on 19, Madison on 10, and 7 resulted in compromises (the other 3 were interpretational issues for which no clear-cut winner is determinable).” Werther adds, “While no one is arguing that Sherman, not Madison, assumes the mantle as ‘Father of the Constitution,’ clearly Sherman had a bigger role than may have been previously understood.”

As a boy in Connecticut, Roger Sherman educated himself in his father’s library and later attended a newly built grammar school. He managed two general stores. Although he had no formal education in law, he passed the bar examination and was admitted to the bar in 1754. He wrote and published an almanac each year from 1750 to 1761. He served as a mayor, a justice of the peace, a county judge, a Connecticut Superior Court judge, and a delegate to both the First and Second Continental Congresses. After ratification of the Constitution, he served in the U.S. House of Representatives from 1789 to 1791 and in the U.S. Senate from 1791 until his death in 1793.

Sherman’s reputation was stellar. He was described as honest, cunning, a staunch opponent of slavery, a devout Christian who was outspoken about his faith, and a protector of states’ rights. William Pierce, a delegate to the Constitutional Convention who took extensive notes, said of Sherman, “He deserves infinite praise, no man has a better heart nor a clearer head. If he cannot embellish he can furnish thoughts that are wise and useful. He is an able politician, and extremely artful in accomplishing any particular object; it is remarked that he seldom fails.”

Role in the Bill of Rights and the Ninth Amendment

Originally, Sherman opposed adding a bill of rights to the Constitution, considering it “unnecessary” and “dangerous.” Like other Federalists, he argued that it was unnecessary because the powers enumerated in the Constitution granted only limited authority; if certain powers were not enumerated and delegated, then the federal government had no authority to infringe upon the rights in question. Moreover, the states had their own constitutions protecting their citizens’ rights, and the Constitution is concerned only with federal guarantees, not states’ guarantees. The Federalists considered a bill of rights dangerous because listing certain rights could be construed to mean that other rights, not singled out, were surrendered to the government; in other words, if they were not written down, then those rights would not be considered protected.

The original Constitution was signed by 39 delegates on September 17, 1787. It was during the First Congress on June 8, 1789, that James Madison proposed to “incorporate such amendments in the Constitution as will secure those rights, which they consider as not sufficiently guarded […] to satisfy the public that we do not disregard their wishes.” After Madison persuaded Congress to create a Bill of Rights, the proposals were referred to a House select committee, the Committee of Eleven, which took up the debates. In 1987, the National Archives discovered among Madison’s papers the only known copy of the deliberations of that House Committee, and they are in Sherman’s handwriting, most likely reflecting the thoughts of the committee as opposed to his personal views.

This discovery has created a vigorous debate among legal scholars as to the meaning and legal effect of the Ninth Amendment, the text of which reads, “The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people”: namely, what do the rights “retained by the people” refer to, and what legal effect do they have? To give context, it is essential to go back to Madison’s original draft regarding retained rights:

The exceptions here or elsewhere in the Constitution, made in favor of particular rights, shall not be so construed as to diminish the just importance of other rights retained by the people, or as to enlarge the powers delegated by the Constitution; but either as actual limitations of such powers, or as inserted merely for greater caution.

After the House committee’s debates and revisions, Sherman’s notes read:

The people have certain natural rights which are retained by them when they enter into society, such as the rights of conscience in matters of religion; of acquiring property; and of pursuing happiness and safety; of speaking, writing and publishing their sentiments with decency and freedom; of peaceably assembling to consult their common good, and of applying to government by petition or remonstrance for redress of grievances. Of these rights therefore they shall not be deprived by the government of the United States.

According to the Bill of Rights Institute, once the Bill of Rights was drafted, Sherman supported it, as did the people of Connecticut.

Deborah Hommer is a history and philosophy enthusiast who gravitates toward natural law and natural rights. She founded the nonprofit ConstitutionalReflections (website under construction) with the purpose of educating others in the rich history of Western civilization.

Categories
History

The Righteous Revolutionary Thanksgiving ‘Oration’

In mid-1772, a British customs schooner, the HMS Gaspee, attempted to catch an American packet ship off the coast of Rhode Island. The Gaspee was commanded by one Lieutenant William Dudingston, hated among Rhode Islanders for his strict enforcement of the Navigation Acts. In Dudingston’s case, rather than “enforcement,” the locals might have used the word “harassment.”

During the chase, the Gaspee ran aground. Stuck on a sandbar, and hence vulnerable, the Gaspee became the target of a group of Providence men, many of them Sons of Liberty. Assembled via the town crier, the patriots rushed toward the ship. The crew attempted to resist, but it was no use. Dudingston himself was shot and wounded, and his ship was burned down to the waterline.

British authorities tried to get the colonial perpetrators of the “Gaspee Affair” extradited to England to stand trial for treason, but the government couldn’t figure out who they were. Even a large reward failed to produce the names. But the Affair had only just begun to run its course.

A Baptist minister named John Allen, a recent arrival from Britain, while preaching in Boston at the end of that year, invoked this incident in his December 3 sermon, entitled “An Oration on the Beauties of Liberty.” Subsequently printed as a pamphlet, this sermon became a best-seller throughout the colonies. The question Allen posed was: Do the Rhode Islanders who destroyed the Gaspee receive their Laws from England?

“O! Amazing!” Allen reflected. “I would be glad to know what right the King of England has to America. It cannot be an hereditary right…; it cannot be a parliamentary right that lies in Britain, not a victorious right, for the King of England never conquered America. Then he can have no more right to America than what the people have, by compact, invested him with, which is only a power to protect them and defend their rights, civil and religious; and to sign, seal, and confirm as their steward such laws as the people of America shall consent to.”

And if this be the case, Allen thundered, “then judge whether the King of England and his ministry are not the transgressors in this affair in sending armed Schooners to America to steal by power and sword the people’s property.”

The message was clear: The British king hadn’t inherited America as some personal property; he’d never conquered it, and the power of Britain’s Parliament lay in Britain, not in the colonies. Rhode Island’s people were free, their rights were enshrined in their charter, and their laws originated in their own assembly. Who, then, was the true aggressor?

Five ‘Observations’

Allen’s “Oration” was built around five observations.

First: that “a craving, absolute Prince, is a great distress to a people.”

Second: that when the three branches of government, “king, judges, and senates unite to destroy the rights of the people by a despotic power… the destruction of the people’s rights is near at hand.”

Third: that “an arbitrary despotic power in a prince, is the ruin of a nation, of the King, of the crown, and of the subjects,” and that neither the King of England nor the Parliament of England can “justly make any laws to oppress or defend the Americans” because “they are not the representatives of America.”

Fourth, Allen channeled his inner John Locke:

“THAT it is not rebellion, I declare it before GOD, the congregation, and all the world, and I would be glad if it reached the ears of every Briton, and every American; That it is no rebellion to oppose any king, ministry, or governor, that destroys by any violence or authority whatever, the rights of the people. Shall a man be deem’d a rebel that supports his own rights? It is the first law of nature, and he must be a rebel to GOD, to the laws of nature, and his own conscience, who will not do it.”

And fifth: “That when the rights and liberties of the people are destroyed, it is commonly by the mischievous design of some great man,” whom Allen wisely declined to name.

These were radical sentiments—and tens of thousands of Americans read them enthusiastically. According to American Founder John Adams, by mid-1773, patriots like James Otis Jr. were regularly reading Allen’s “Oration” to “large Circles of the common People.”

In his “Oration,” Allen insisted on something Americans must remember today:

“A right to the blessing of freedom, we do not receive from Kings, but from Heaven, as the breath of life and essence of our existence, and shall we not preserve it, as the beauty of our being? Do not the birds of the air expand their wings? the fish of the sea their fins? and the worm of the earth turn again when it is trod upon? And shall it be deem’d rebellion? Heaven forbid it! … It is no more rebellion, than it is to breathe.”

Dr. W. Kesler Jackson is a university professor of history. Known on YouTube as “The Nomadic Professor,” he offers online history courses featuring his signature on-location videos, filmed the world over, at NomadicProfessor.com

Categories
History

The Capitol’s Statue of Freedom

As I step outside the House chamber on the second floor of the Capitol, I guide my visitors halfway down the stairs outside, offering them a sweeping view of the Supreme Court building and the Library of Congress. That’s when I call their attention to something else altogether—the crowning achievement, literally, of the Capitol: the Statue of Freedom, perched atop the dome, solitary, magisterial.

It is perhaps the most recognizable feature of the Capitol, an iconic world image of liberty and government by the people. Peering into the distance nearly 300 feet above the East Front Plaza, the bronze statue is of epic dimensions, soaring almost 20 feet high and weighing about 15,000 pounds. Freedom is decked out in an elaborate headdress topped by an eagle head and feathers. Her flowing dress is cinched with a large brooch emblazoned with two letters: U.S. In her right hand, she clasps a sheathed sword, while the other clutches a laurel wreath of victory and a shield.

The Statue of Freedom perched atop the Capitol is something to behold and serves as a symbol of my stewardship as a member of Congress, which is why I selected that image of the Capitol dome to adorn my letterhead. This is what I want my constituents to see, to be reminded of, when I write to them.

The Statue of Freedom also symbolizes the personal quest for freedom of one man, Philip Reid, born into slavery in Charleston, South Carolina, in 1820. In one of the great ironies of American history, Reid, as a slave, was assigned the complex project of creating and placing one of the world’s most powerful symbols of freedom on the most visible building in our nation.

The Statue of Freedom atop the U.S. Capitol dome in Washington, D.C., on July 1, 2010. (Paul J. Richards/AFP via Getty Images)

I’ll admit, I didn’t know the story of Reid until after I became a member of Congress and got the lowdown on the history of the Capitol from those in the know. But once I heard about Reid’s remarkable story, I delved deeper, reading more about it online. I mention all this about Reid during my tours, and though not one of my visitors has ever known the story beforehand, they are surely glad to hear it. Slavery is a terrible stain on our history, but my guests are palpably proud of how far America has come since then.

The statue was commissioned in 1855. Thomas Crawford, an American sculptor, created the plaster model of the statue in Rome, Italy. After his death in 1857, his widow shipped the model in six crates, and it was assembled and placed in what is now Statuary Hall. The following year, Clark Mills, a self-taught sculptor, was given the task of casting Freedom in bronze. Mills had started his business in South Carolina, where he purchased Reid for $1,200. Reid dismantled the model in the Capitol, cast the individual sections, and finally assembled and mounted the bronze sections atop the dome.

On April 16, 1862, as Reid supervised the creation of the statue’s massive bronze sections, Congress passed the District of Columbia Emancipation Act, freeing thousands of slaves living within the district. That included Reid. As a free man, he kept working for Clark Mills. At noon on December 2, 1863, under Reid’s supervision, the top section of the Statue of Freedom was raised and bolted on top of the Capitol dome.

Many of the experts with whom I have toured the Capitol offered various explanations for the direction the Statue of Freedom faces. Some say Freedom faces east because every morning she watches the sun rise on America with a new day of liberty for all. Others say she faces east because the primary entrance to the Capitol is on the east side, or because most residents of Washington, D.C., at the time lived on the east side. Yet others suggest she faces east because European settlers came from that direction.

The Statue of Freedom on top of the U.S. Capitol dome, silhouetted against the super moon on Jan. 20, 2019. (Brendan Smialowski/AFP via Getty Images)

As the foreman in the casting of the Statue of Freedom, Philip Reid stepped in when an Italian sculptor hired to assemble the five sections refused to continue unless granted a pay raise. It was Reid who figured out how the pieces came apart and fit together. He was paid $1.25 a day, though his owner received those payments except on Sundays, when the pay was Reid’s own. Mills, the man who bought Reid, described him as “short in stature, in good health, not prepossessing in appearance, but smart in mind.”

Reid was a free man by the time the last piece of the Statue of Freedom was assembled in December 1863. He went on to become a respected businessman, identified in census records as a “plasterer.” A plaque honoring Reid resides not at the Capitol but at National Harmony Memorial Park in Landover, Maryland, where his remains lie; his place in history—and on my tour—remains secure.

Excerpted from the 2020 book, “Capitol of Freedom: Restoring American Greatness,” by Colorado Rep. Ken Buck.

Categories
Arts & Letters History

A Pedestal Waiting for a Monument

The crypt of the U.S. Capitol isn’t the dark, dank dwelling conjured up by its evocative moniker. On the contrary, the crypt is a well-lit circular chamber on the ground floor, under the rotunda, traversed by countless people every day, hurrying on their way—blinders on—to a hearing or meeting of reputed import. George Washington was supposed to be interred here—hence the name of the burial place—but his body never made it. Construction of the crypt was interrupted by the War of 1812. His family decided to honor his wish to be buried at his Mt. Vernon, Virginia, home, just a few miles away from the Capitol.

Magna Carta

Tucked away in the crypt—hidden in plain sight—is a replica of the Magna Carta, the 800-year-old document reining in the monarch. On tours, I make a point of directing my visitors’ attention to this transformational declaration; otherwise, they might miss it, given all the magnificent distractions surrounding it—forty neoclassical columns, and thirteen statues of prominent Americans of the original thirteen colonies.

In all the times I’ve entered the crypt—and it’s been plenty—I’ve never seen people clustered around the gold and glass case containing this most essential document, the greatest relic in the room.

The history of the Magna Carta predates our nation’s founding by more than five hundred fifty years, which might explain how it sometimes escapes people’s attention today. King John of England signed the Magna Carta on June 15, 1215, after a severe clash with his barons, who had become frustrated with the monarch’s arbitrary rule and abuses of power. The noblemen set out to craft a document to rein in the king’s powers. The document they formulated prohibited arbitrary arrest and imprisonment, and established individuals’ right to a fair trial and the protection of private property. Those rights are foundational to the rule of law, and essential for limiting the powers of government.

The Magna Carta—Latin for “the Great Charter”—provided the key principles of the supremacy of the rule of law that formed the foundation of our Constitution. In this respect, it is symbolic that the Magna Carta replica lies in the crypt—the literal foundation—of the Capitol, erected to support the rotunda above it. The document’s most important principle—that no man is above the law, not even the king—is the foundation for American rule of law, and the base upon which we have built our system of government.

If those basic rights recognized in the Magna Carta sound familiar, it’s for good reason. America’s founders drew heavily from the ideas in the Magna Carta to write the American Declaration of Independence and the Constitution.

The Compass Star

Only a few feet away from the Magna Carta is a worn white marble compass star embedded in the center of the floor of the crypt. While it may seem, at first glance, that the two features of the Capitol are unrelated, they each reinforce the primacy of the rule of law and the importance of the legislative body.

That compass star is the point in Washington, D.C., where all four quadrants of the district—northeast, southeast, northwest, and southwest—converge. If you place your foot on the compass, as I have from time to time to demonstrate for my visitors, you are standing in all four quadrants of the city simultaneously. When I take tourists to this spot, the following ritual tends to take place: They stand on the star, which dips below floor level, smoothed down by the passage of time. Then they hop off the star, pull out their smartphones, and take photos of what is, admittedly, a cool symbol. But it holds even greater significance. The compass star is the key to understanding the vital role the legislature plays in our republic.

L’Enfant

We must first revisit Pierre-Charles L’Enfant. After he wrote to President George Washington, offering to create a capital “magnificent enough to grace a great nation,” he got the gig in 1791. Influenced by the France of his youth, L’Enfant borrowed ideas from the grand sweep of the Versailles palace, conjuring up what are now distinct D.C. features, such as its broad avenues, designed on a slashing angle. The cheerful L’Enfant sought another epic brush stroke, designing a considerable park in front of the White House, for the benefit of the president, whoever happened to be in residence. But Thomas Jefferson put the kibosh on those plans out of a worry such an exclusive domain didn’t mesh with the nascent nation of the people. Hence, the space became a public gathering spot you might have heard of—Lafayette Park.

L’Enfant, though, got his way on a more vital part of his plan: making the Capitol the central point of the new capital district. The Capitol was created to be the central focus of the new government, a building perched on a slight hill, elevated above the rest of the city. That hill was known in our nation’s earlier years as “Jenkins Hill,” because a man named Thomas Jenkins apparently once grazed livestock at the site. L’Enfant saw it in a more enchanted way, as “a pedestal waiting for a monument.” That pedestal is known today as Capitol Hill.

The location of the Capitol building speaks volumes about the role our founders intended the legislative branch to play—and the paramount role of the rule of law. Because the Capitol is located on a hill, on one of the highest points in Washington, D.C., it reminds all of us that the legislative branch—the part of the federal government most accountable to the people—is the most important branch of government.

Excerpted from the 2020 book “Capitol of Freedom: Restoring American Greatness” by Colorado Rep. Ken Buck.

Categories
Features History

In Their Words: Veterans Who Served in War Tell Their Stories


World War II Veteran

Editor’s note: Stanley Feltman passed away on September 23, shortly before this issue went to press.

(Dave Paone)

In 1945, at age 19, Stanley Feltman was a tail gunner in a B-29 for the U.S. Army Air Corps. He had flown about 15 successful bombing missions in the South Pacific, but come mission number 16, he wasn’t so lucky.

His plane, containing 11 crew members, was shot down by a Zero fighter aircraft of the Imperial Japanese Navy. All 11 men were able to escape the wreckage by inflating a dinghy and paddling away from the aircraft before it sank minutes later.

The dinghy was designed for six. That meant six could sit inside, while five, including Feltman, had to hang onto a rope that ran around the perimeter, their bodies waist-deep in the water.

And then there were the sharks. The men had some repellent on hand, but it dissipated over time. At one point, another airman who was hanging on lost his grip and slipped into the shark-infested water. Feltman dived after him and brought him back to the surface. This act of bravery would earn Feltman the Bronze Star.

Several hours later, a submarine spotted them. However, its crew was on a mission elsewhere and could not take them aboard. Instead, they wired the airmen’s coordinates to an aircraft carrier, which sent a PBY seaplane to pick up the stranded men after a total of about eight hours in the water.

When the United States entered the war on December 7, 1941, Feltman was only 15 and couldn’t enlist, although he wanted to. However, Americans could enlist at 17 with parental consent, which was his plan. Upon his 17th birthday, he told his parents of his intention to volunteer.

Eventually, Feltman found himself in the tail of a B-29 in the South Pacific. His job was to fire at oncoming enemy planes. Often, these were flown by Kamikaze pilots, who would purposely crash their explosive-laden planes into American aircraft carriers.

Feltman recalled his first encounter with the enemy. “I remember somebody saying, ‘There’s planes coming in at six o’clock,’” he said. “I sighted on a plane that I saw coming in. I didn’t know if it was the same plane that they saw because usually they had five, six planes at one time come at you. I fired; I saw the plane blow up, so I figured it has to be a Kamikaze plane. It just exploded.”

Feltman was only 18 at the time, and the youngest member of the crew. After he hit his target, he shouted, “I got him! I got him! I got him!”

Today, at 95, when Feltman thinks about those battles, he’s not so enthusiastic. He’s certain he shot down eight Japanese pilots and thinks there may have been two more.

“I never felt right by taking a life,” he said. “When you’re shooting planes down, you’re taking a life. That’s all. There’s nothing big about that.”

Korean War Veteran

Sal Scarlato (left) with a South Korean counterpart. (Courtesy of Sal Scarlato)

On June 25, 1950, North Korean soldiers crossed the 38th parallel, and the Korean War began.

Sal Scarlato was 17 at the time. He had known of a few boys from his Brooklyn neighborhood who were killed in combat early on, but this didn’t stop him and his pals from enlisting in the Marines after they turned 18.

Private First Class Scarlato landed at Incheon on April 10, 1952. He was 19 and in the infantry.

“All of a sudden, we got hit with small-arms fire and mortar fire,” said Scarlato. “We were firing like crazy. I had the runs. I urinated. I was crying. A couple of guys got hit.”

One night, Scarlato had outpost duty along the 38th parallel. “That night, the CCF (Chinese Communist Forces) really gave us a welcome,” he said. “When they came, I didn’t fire my weapon right away. I froze. So, the guy next to me—actually, he was my squad leader—hit me in the helmet. He said, ‘You better start firing that weapon.’ A couple of minutes later, he got hit in the belly. He fell right on top of me. And when the corpsman came, he said, ‘Give me your hand.’”

Scarlato applied pressure to the squad leader’s liver, which was protruding from his body. Right then and there, the squad leader died. “I cried like a baby,” Scarlato said. “After this, I was very bitter. I kept saying to myself, ‘What the hell am I doing here?’ And my officers always said, ‘You’ll find out. You’ll find out eventually what you’re doing here.’”

Scarlato witnessed countless casualties, and then, in July 1952, became one. Once again, Scarlato’s unit came under attack by the CCF. An enemy combatant tossed a grenade at him and two other Marines. It exploded, killing one of them and wounding the other two. Scarlato suffered leg, neck, and hand wounds, and a concussion.

A corpsman gave him a shot of morphine and sent him via jeep to an aid station. From there, he was flown by helicopter to a hospital ship. He thought this was his ticket home, but the Marines still needed him. Being sent back to his unit made Scarlato bitter. “I hated everybody,” he said, even his South Korean allies. He once spat on one of their soldiers who came near him.

Scarlato soon discovered that the officers were correct, and he did indeed find out why he was there. On patrol one day, Scarlato’s unit came upon a small village where several civilians had been killed.

“There was a little boy, maybe 5, 6 years old—he had his hand blown off.” Scarlato immediately picked the boy up and put the severed hand in his own pocket. He bandaged the end of the boy’s arm, and a corpsman arrived. The child screamed in pain the entire time. They flagged down a medical jeep and drove to a nearby orphanage that had medical staff.

The nurses placed the boy on a table. Scarlato and the corpsman turned and walked out, having done all they could. Then, Scarlato remembered he still had the child’s hand in his pocket. He stepped back inside, only to find out the boy had died.

This was the defining moment. Out of all the death and carnage Scarlato saw, this was the worst. Now, he knew that the reason he was there was “to save these people’s lives. Before that, I didn’t understand.”

At 88, Scarlato is still sharp as a tack and keeps up with the news, including about current U.S.–North Korea relations. He’s a member of the Korean War Veterans Association, and regularly raises money for Korean War monuments.

Vietnam War Veteran

Col. Robert Certain with his wife, Robbie. (Courtesy of Robert Certain)

It was late 1972, and as the holiday season approached, Colonel Robert Certain, an Air Force B-52 navigator, was preparing to return stateside from war-torn Vietnam. But just days before his departure date, this much-anticipated plan was abruptly changed. Instead of returning home, Certain was now assigned to a large-scale flying mission—one that would radically change his life.

Certain explained that, as a navigator, his job was not only to get to the target on time, but also to ensure the task was accomplished in an equally prompt and precise manner. The logistics were critically important for this mission, he said, because he and his crew would be flying toward Hanoi, deep into what was then known as enemy territory. Even so, the newly assigned mission got off to a good start and seemed to go according to plan. And then, it didn’t.

When Certain and his crew had almost reached their target, the plane suddenly sputtered into a free fall. They’d been hit. With no time to waste, Certain knew there was only one way to survive the doomed flight—eject into enemy territory. And so, Certain explained, he wasn’t surprised when he was captured, along with another member of the crew. “We were just a few miles north of Hanoi,” Certain said of their precarious landing site, estimating it was within 10 or 20 kilometers of their original target.

Certain would eventually end up in the infamous prison Americans of the day sarcastically dubbed the “Hanoi Hilton.” But first, he was forced to endure hours of relentless interrogation. Then, he and his fellow captured crew member were paraded in front of cameras at an international press conference.

Though the North Vietnamese may have been “showing off” their catch of the day, Certain believes this exposure protected him and the other new captives from the type of well-reported, horrendous conditions earlier prisoners were subjected to. After about 10 days, his tiny, shared cell was upgraded to a much larger one, and the prisoners were eventually allowed to gather together on Sundays for a service of sorts.

If the watchful eye of the media played a part in the type of treatment Certain and other newer captives received as prisoners, undoubtedly, so did the actions of the American government. At that time, the United States was in dedicated negotiations to end its involvement in the war. After the signing of the Paris Peace Accords made it official, Certain once again began planning for his return home. This time, his plans were undeterred, and Certain was set free on March 29, 1973.

But this isn’t where the story ends. Certain, who was 25 when he was captured, returned to the United States and hit the ground running, but on a much different path. In 1976, Colonel Certain became Father Certain, an ordained Episcopal priest. He went on to earn his Doctor of Ministry degree in 1999, and as a member of the U.S. Air Force Reserves, he served as chaplain at a number of U.S. bases, including what is now Joint Base Andrews. When former President Gerald Ford passed away in 2006, it was Father Certain who presided over his graveside services.

Certain retired from active-duty service in 1977 but went on to serve in the Reserves until 1999. His exemplary service earned him a number of prestigious honors, including the Purple Heart, Bronze Star, and Distinguished Flying Cross. He has also served as a CEO, director, or board member for numerous organizations and governmental committees, such as the Defense Health Board and the Pentagon Task Force on the Prevention of Suicide by Members of the Armed Services. Notably, he remains active as a board member of the Distinguished Flying Cross Society, composed of medal recipients. Over the years, his 2003 autobiography, “Unchained Eagle,” has accumulated a rare five-star average rating on Amazon.

Yet despite his many successes, Certain admits to one failure. “I’ve tried to retire,” he said with humor in his voice, “but I’ve been a failure at it.” Officially though, Certain is indeed now classified by the military as retired, and lives with his wife of many years, Robbie, in Texas.

Gulf War Veteran

Air Force Lt. Col. Rob Sweet (center right) with his family after he took his final flight on June 5 this year, at Moody Air Force Base in Georgia. (Andrea Jenkins)

It was February 1991, and U.S. Air Force pilot Lieutenant Colonel Robert Sweet was on his 30th mission in Desert Storm. The goal, simply put, was to eliminate enemy targets. However, his arrival at the targeted area was met with such heavy fire that he was ordered to leave because, as he explained in a press statement later, “if the target area is too hot, you have to leave. It’s not time to be a hero.”

As he and his flight lead, Captain Stephen Phillis, made their way out of the area, he caught sight of what he described as a “pristine array of (enemy) tanks that had not been hit.” He found this downright shocking, he said, “because by that point, everything had been bombed for the past 30 days.” After Sweet began to attack the tanks, an exchange of fire erupted, and the A-10 Thunderbolt he was piloting was hit from behind.

He attempted to keep the damaged plane in the air, but he quickly realized it was not salvageable, and in order to survive, he would have to eject into enemy territory. “I tried a couple of things, and basically, it wasn’t going to work, so I punched out,” Sweet said, explaining how he landed face-to-face with more than a dozen irate Iraqi soldiers, southwest of Basra, Iraq. He was captured and held prisoner for 19 days under brutal conditions, including beatings, starvation, and exposure to disease.

It was clear, he said, that he now had to fight to keep himself both physically and emotionally strong. But it was also clear that the military had prepared him well beforehand for this type of situation. “There were very few surprises,” Sweet said of his time as a prisoner. “The SERE (Survival, Evasion, Resistance, and Escape) we have is outstanding,” he said of the U.S. military’s training. “There were very few surprises in the jailhouse. I knew what to expect.”

And although his expectation included casualties, Sweet still found himself reeling after learning that Phillis had been killed in action. “I had survivor’s guilt, and it took me a long time to get over that,” he said.

Sweet’s release came as part of a prisoner exchange, but his captivity wasn’t without long-term aftereffects. Most notably, he learned the importance of making good decisions under pressure and taking life as it comes. “Bloom where you’re planted,” he advised. In the military, that often means assignments to undesirable locales. “Make the most of them and move on,” he said.

And that’s exactly what Sweet himself has done. After spending 20 years on active duty and 13 more as a reservist, Sweet retired in June 2021 as the last American former POW still serving in the Air Force. Acknowledging and congratulating him at his retirement ceremony, General Charles Q. Brown captured the sentiment of the nation when he said simply, “We thank you for all you’ve done.”

Dave Paone is a Long Island-based reporter and photographer who has won journalism awards for articles, photographs, and headlines. When he’s not writing and photographing, he’s catering to every demand of his cat, Gigi.

Joni Williams started her career as a real estate reporter. Magazine writing soon followed, and with it, regular gigs as a restaurant and libations reviewer. Since then, her work has appeared in a number of publications throughout the Gulf Coast and beyond.