http://arstechnica.com/science/2013/11/ancient-siberians-skeleton-yields-links-to-europe-and-native-americans/

Ancient Siberian's skeleton yields links to Europe and Native Americans

One of the stone figures found in the region of Siberia where the new genome originated. Kelly E Graf

All the evidence indicates that the Americas were populated by people who migrated across the Bering Sea at a time when the ice age lowered ocean levels enough to do it. Well, almost all the evidence. Some of the oldest skeletons found in the New World have features that look somewhat European, a link supported by a few pieces of DNA found in some Native American populations. Now, a 24,000-year-old skeleton from Siberia may help clear up the confusion. The skeleton indicates that the confusion may be caused by a combination of migration and population structure within Asia at the time. The end result is that some exchanges of DNA went in only one direction, and the donors then moved on to other locations.

Genetically, Native Americans share a strong affinity with East Asian peoples. There are a couple of exceptions, however, such as a somewhat European-looking mitochondrial DNA sequence found in some Native American populations. But that could be viewed as a product of later contamination by European visitors. What's harder to understand is the presence of European-looking features in skeletons such as that of Kennewick Man, who died over 7,000 years ago in what's now Washington state. Kennewick Man didn't look very much like modern Native American populations, raising questions about how he ended up in the Pacific Northwest at that time.

A skeleton called MA-1, found at the Mal'ta site in south-central Siberia, may provide a way to tidy up this confusion. MA-1 lived in an area northwest of Lake Baikal (north of Mongolia) roughly 24,000 years ago. This was in the heart of the most recent ice age, shortly before a period called the Last Glacial Maximum. Despite the undoubtedly harsh conditions that prevailed, the area was at the time producing some of the earliest known artwork based on human figures.

DNA work yielded a mitochondrial genome from the individual, as well as 1X coverage of the entire nuclear genome. (That means that every base is, on average, sequenced once. But since the sequences are random, some areas don't actually get sequenced at all, while a few end up being sequenced dozens of times.) It's not enough to say anything about what this individual's genes were doing, but it is enough to figure out who he was related to. And that turns out to be a rather complicated story.

Despite his location near Mongolia, the MA-1 sequence looks like it branched off from the group that eventually populated Europe and Western Asia. The mitochondrial genome, although it belongs to a lineage that seems to have died out, looks similar to those of the Paleolithic hunter-gatherers who populated Europe at the time. The Y chromosome also looks like it split off near the base of the tree that includes modern Europeans. But it also resides near the root of Native American populations, while it doesn't show up in any modern East Asian population we've looked at. The same thing happened when the researchers looked at the full nuclear genome: the sequences typically looked like they were near the base of the modern Eurasian family tree, but a few also had strong links to a native Brazilian population.
Statistically, the authors find that somewhere between 20 and 40 percent of MA-1's genome contributed to the people who populated the Americas. Again, the nearby cultures of East Asia, such as the Chinese, do not share these sequences.

How is all of this explained? The authors turn to the concept of population structure, which occurs when groups of the same species end up reproductively isolated from each other. For example, MA-1's population, despite being rather close to China, does not seem to have interbred with the ancestors of the Han Chinese (the Gobi desert possibly having something to do with that). However, the ancestral population of the East Asians split off a group that did interbreed with MA-1's relatives but then promptly left Asia for the Americas. This genetic intermingling seems to have taken place very early, given that all Native American populations tested seem to be about equally distant from MA-1's genome.

To confuse matters further, MA-1's population seems to have wandered in the opposite direction, settling in Western Asia and Europe. A bit of sequence from a 14,000-year-old skeleton from the region, however, suggests that the population was tough enough to ride out the Last Glacial Maximum in Siberia.

The work is another great indication of how our ability to work with ancient DNA is completely revolutionizing how we understand humanity's past. And, as in the case of the Denisovans and Neanderthals, that past is a complicated mix of interbreeding, migration, and population structure.

Nature, 2013. DOI: 10.1038/nature12736 (About DOIs).
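A technical aside on the "1X coverage" point above: the patchiness the article describes, with some bases sequenced many times and others never, is exactly what you would expect from randomly placed reads. Here is a minimal, hypothetical Python simulation; the genome length, read length, and read count are made up for illustration and nothing below comes from the paper:

```python
import random

# Toy simulation of 1X shotgun coverage: drop fixed-length "reads" at
# uniformly random positions on a made-up genome and count how many
# times each base ends up covered. All parameters are illustrative.
GENOME_LEN = 100_000
READ_LEN = 100
N_READS = GENOME_LEN // READ_LEN  # 1,000 reads -> 1X average coverage

coverage = [0] * GENOME_LEN
for _ in range(N_READS):
    start = random.randrange(GENOME_LEN - READ_LEN + 1)
    for pos in range(start, start + READ_LEN):
        coverage[pos] += 1

mean = sum(coverage) / GENOME_LEN
missed = sum(1 for depth in coverage if depth == 0)
print(f"mean coverage: {mean:.2f}X")
print(f"bases never sequenced: {100 * missed / GENOME_LEN:.1f}%")
# Coverage depth is roughly Poisson-distributed, so at 1X about
# e^-1 (~37%) of bases are expected to be missed entirely.
```

Running this typically reports around 35-37 percent of bases with zero coverage, which is why 1X is enough to estimate who an individual was related to but not enough to say anything about particular genes.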
This is the latest in a series of posts on the records of Brooklyn’s Corporation Counsel, which are currently being processed with funding provided by a Council on Library and Information Resources (CLIR) “Hidden Collections” grant.
The U.S. Supreme Court recently upheld New York City's policy of preventing unvaccinated students from attending public schools while another student has a vaccine-preventable disease. This is just the latest in a long line of judicial decisions that address the limits of government control over the health of the individual. Over one hundred years ago, during a smallpox epidemic that ravaged the city, the very same situation was addressed in the courts of Brooklyn. The city's Department of Health was determined to actively control the spread of the disease, often coming into conflict with the very citizens it was trying to protect.
Prior to the development of a vaccine, smallpox was one of the most feared diseases on the planet: it was extremely contagious, and it had killed and disfigured millions since ancient times. The first vaccine was developed in England in the late 18th century, and by 1800 it had been introduced to the United States. While the vaccine was highly effective at reducing the spread of the disease, from the beginning there was resistance to state-imposed vaccination programs. Laws varied from region to region. In Europe, Germany and England had made vaccination compulsory. Massachusetts introduced the first mandatory vaccination policy in the U.S. In New York, there were no compulsory vaccination laws, except for children attending public schools.[i]
In 1892, about 20 years after the last epidemic, smallpox returned to Brooklyn. At first the cases remained isolated. Then, from December 1893 to February 1894, there were about 70 newly diagnosed cases a month. By March of 1894 the number of infections had increased to 150. Brooklyn's Department of Health, led by Dr. Z. Taylor Emery, decided it was time to take action. The department operated under the premise that the masses (i.e., the poor, but also business owners and landlords) "needed the guidance of enlightened and scientifically trained professionals to ensure the public good."[ii] To that end, it began a policy of vaccination and quarantine that sometimes overstepped the bounds of New York law.
The department's strategy of vaccination and quarantine was something akin to the military tactic of "shock and awe." The 1894 annual report of the Department of Health describes the typical response to a reported case of smallpox: "As occupants of infected houses were sometimes known to escape therefrom to other parts of the city, before the removal of the patient, it was found necessary to put a police quarantine on the house pending arrival of the ambulance, the disinfection of the premises, and vaccination of the inmates. As soon as precautions were complied with, quarantine was raised, so as to inconvenience the occupant as little as possible." As you can imagine, many residents found the process to be somewhat more severe than a mere inconvenience, as they were essentially placed under house arrest while their clothes, bedding, and other household goods that might be harboring the disease were destroyed.[iii]
Sometimes the afflicted were allowed to convalesce at home, but more often than not they were sent to the Kingston Avenue Hospital, also known as the Contagious Disease Hospital, in Flatbush (the hospital was located at Kingston Ave. and Fenimore St., today the site of the High School for Public Service). The hospital was soon filled to capacity, and tents were erected to house even more patients. Naturally, local residents were alarmed by the influx of disease carriers. The hospital was threatened with arson, and soon guards were stationed on the premises to protect both the patients and staff.[iv]
The city was proactive in its vaccination efforts. Over two dozen free vaccination clinics were set up across the city. A team of vaccinators was sent to the 27th Ward (Bushwick), home to a large German population that was broadly resistant to vaccination. The city focused on areas where large groups of people congregated, specifically schools, factories, and lodging houses. In one day, 2,000 workers were vaccinated at the Havemeyer & Elder sugar refinery alone.[v]
When a new case of smallpox was reported, not only was the entire household vaccinated on the spot, but teams would canvass all surrounding residences to prevent the spread of the disease. While the health department was not empowered to coerce the vaccination of citizens, it used quarantines (which were allowed by New York state law to prevent the spread of disease) to strong-arm anyone who resisted. Quarantined homes were marked with a yellow flag, and policemen were stationed outside to prevent anyone from entering or leaving. Sometimes even food deliveries were prevented from entering the quarantined homes.[vi]
The policies of Brooklyn’s Department of Health led to a number of legal problems for the city, and there are several cases related to the epidemic in the records of Brooklyn’s Corporation Counsel.
Mary A. Ferrer sued the city for false imprisonment. She claimed to have been misdiagnosed with smallpox (while actually suffering from malaria) and held at the Kingston Avenue Hospital for a week, all the while being exposed to the deadly disease.[vii]
John Salmon sued for injuries received as a result of his vaccination. According to the plaintiff, a health department official came to his home and asked if he was vaccinated. When Salmon indicated that he was not, the health official falsely declared that the vaccination was mandatory, and Salmon reluctantly submitted. Three days later his skin began to blister all over his body and he was confined to a hospital for three months.[viii]
Robert W. Goggin filed suit against the city for the deaths of both his wife and daughter. He claimed that the city failed to remove a smallpox carrier from his apartment building, and as a result his wife and two children contracted the disease and were quarantined at the hospital. His wife soon died, and his daughter, who was later sent to the Home for Destitute Children, died of measles and pneumonia.[ix]
N.Y. state law regarding vaccination and public schools. Scrimshaw, Frederick and Charles A. Walters – Public school admittance and vaccination disputes, 1894-1895. Brooklyn, N.Y., Department of Law, Corporation Counsel records, 2013.015; Brooklyn Historical Society
The most significant legal case found in the collection involves the vaccination of school children. In 1893, the New York state legislature passed an act to provide for the compulsory education of children, which also allowed school boards to appoint physicians to vaccinate students. Children were inspected for vaccination scars by the physicians, and any student who was suspected of being unvaccinated was prevented from attending public school.
This practice was challenged by the Kings County Anti-Compulsory Vaccination League, which was led by Dr. Charles A. Walters. He argued that the city had no right to exclude unvaccinated children from public schools. The case was heard by Judge Bartlett, who ultimately sided with the city. In his decision, he indicated that public school education was a privilege, not a right. Since the public school system was a creation of the state it was subject to reasonable regulation, especially regarding the health and welfare of the community. He still did not go so far as to endorse compulsory vaccination for all citizens, noting, “To vaccinate a person against his will, without legal authority to do so, would be an assault.”[x]
While this case ended in the city's favor, its aggressive quarantine policy would not hold up in the courts. That same year a judge ruled that the health department had no right to quarantine the homes of citizens who had not contracted smallpox. Legal challenges to compulsory vaccination continued into the 20th century, culminating in 1905 when the U.S. Supreme Court "affirmed the right of the majority to override individual liberties when the health of the community required it."[xi] Of course, as the recent Supreme Court ruling regarding school vaccinations indicates, the debate over the government's role in public health remains unsettled to this day.
[i] “Between persuasion and compulsion: Smallpox control in Brooklyn and New York.” Colgrove, J. Bull. Hist. Med. 2004 Summer;78(2):349-78.
[iii] Annual Report of the Department of Health of the City of Brooklyn, 1894.
[iv] “Brooklyn’s Smallpox Outbreak,” N.Y. Sun, 29 March 1894.
[v] "Between persuasion and compulsion: Smallpox control in Brooklyn and New York." Colgrove, J. Bull. Hist. Med. 2004 Summer;78(2):349-78.
[vii] Brooklyn, N.Y., Department of Law, Corporation Counsel records, 2013.015; Brooklyn Historical Society.
[xi] “Between persuasion and compulsion: Smallpox control in Brooklyn and New York.” Colgrove, J. Bull. Hist. Med. 2004 Summer;78(2):349-78.
Each year for the past 28 years I’ve tried to change how I teach my courses (I call them ‘Initiatives’). This means changing the materials I use, adding, changing, or removing the activities I use to engage students, etc. Some of my ideas work wonderfully, others crash and burn miserably. Those initiatives that have survived to the present day have earned their continued use.
My measure of a worthy initiative is based on how well a student can find the 'value' in that knowledge and/or the path to acquiring that knowledge. Nowadays, the worthiness of any classroom instruction is measured by a test score. Low test scores => poor classroom instruction. This is a misguided view and a prime contributor to poor teaching styles. I have succumbed to the pressure from administrators and colleagues in the past. Often, I end up with students performing well on exams and a deep feeling that I did not remain true to a teacher's creed or my responsibilities to the founding fathers of our nation. While some of my most successful "test" years have also been some of my most unappealing years professionally, I continue to try to satisfy both camps: maintaining high test scores vs. sparking self-motivated inquiry. Very difficult indeed.
As I prepare to welcome a new crop of youngsters to my class, I have to brace myself for the inevitable "Why are we doing this?" question. I know that these students fully expect me to stand at the front of the class and "tell them what they need to know." This is the way they've been taught up to now. It's the way their parents expect me to teach them. If I vary from that norm, I'll get questions and stares. If the test scores are high, no one will question me. If they fall short of administrators' or parents' expectations, I'm targeted. Students bear the joys or scars of their education, but they have been 'programmed' to accept a number on a test or report card as the measure of their success. Through experience and natural inclination, I find this scenario futile.
This year, I'm posting this TED Talk by Dr. Sugata Mitra for the student and parent who want to know "Why are we doing this?" I ask that you watch and listen to this man. Then, when you're in class and I employ a technique that is different, unusual, or unknown to you, ask yourself how those children in the slums of India would have reacted.
If you're like me, you can't appreciate the magnitude of a major event until you see the details expressed numerically. As part of my efforts to bring WWI back into the discussions of History lovers during this centennial period, I offer this infographic compiled from the Encyclopedia Britannica, PBS, the US Army Center of Military History, and the Imperial War Museum.
In my efforts to bring as much information as possible to your attention during this WWI centennial period, I offer the following article.
The Enduring Impact of World War I
British infantrymen in a trench before advancing during the Battle of the Somme in July 1916. Credit Associated Press
“I feel like a soldier on the morning after the Somme.” This line of dialogue, from an episode in the second season of the BBC series “Call the Midwife,” caught my ear recently as an especially piquant morsel of period detail. It is uttered by a doctor to a nurse after they have just assisted in a grueling home birth, an experience that is compared to the four-month battle in a muddy stretch of Picardy beginning on July 1, 1916, that was, at the time, the bloodiest episode of combat in human history, generating 60,000 casualties in a single day of fighting on the British side alone. The doctor’s comparison is surely metaphorical overkill, but it also represents a familiar style of wit, a habit of linking the challenges we regularly endure with calamities we can scarcely imagine.
But why choose that particular calamity? “Call the Midwife,” based on a popular series of memoirs by Jennifer Worth, takes place in the late 1950s, not long after a war that, in terms of the sheer scale and extent of global slaughter, far eclipsed its predecessor. It is interesting that for this youngish doctor and nurse, the earlier conflict comes more readily to mind. The Somme is more accessible, and perhaps more immediate, than Dunkirk or D-Day.
German soldiers, rear, offering to surrender to French troops in Massiges, France, during World War I. Credit Denise Follveider/Reuters
The allusion may require a footnote now, but its occurrence in a television program that is acutely sensitive to historical accuracy is a sign of just how deeply, if in some ways obscurely, World War I remains embedded in the popular consciousness. Publicized in its day as "the war to end all wars," it has instead become the war to which all subsequent wars, and much else in modern life, seem to refer. Words and phrases once specifically associated with the experience of combat on the Western Front are still part of the common language. We barely recognize "in the trenches," "no man's land" or "over the top" as figures of speech, much less as images that evoke what was once a novel form of organized mass death. And we seldom notice that our collective understanding of what has happened in foxholes, jungles, mountains and deserts far removed in space and time from the sandbags and barbed wire of France and Belgium is filtered through the blood, smoke and misery of those earlier engagements.
One person who did notice the lasting and decisive cultural influence of World War I was Paul Fussell, a literary scholar and World War II infantry veteran whose 1975 book, “The Great War and Modern Memory,” remains a tour de force of passionate, learned criticism. Fussell, who died in 2012, combed through novels, memoirs and poems written in the wake of the war and found that they established a pattern that would continue to hold, consciously and not, for much of the 20th century.
Many British soldiers and officers arrived at the front steeped in a literary tradition that colored their perception — a tradition that included not only martial epics and popular adventure novels but also religious and romantic allegories like John Bunyan’s “The Pilgrim’s Progress.” The central character in that 17th-century tale of desperate hardship and ultimate redemption is first seen as “a man clothed in rags” with “a great burden upon his back,” a description that seemed uncannily to prefigure the trench-weary conscript with his tattered uniform and heavy pack.
Gino Severini’s “Armored Train in Action,” a Futurist painting from 1915. Credit MoMA/Artists Rights Society (ARS), New York via ADAGP
That soldier, in turn, with some adjustments of outfit and equipment, would march through the subsequent decades, leaving behind a corpus of remarkably consistent firsthand testimony. Whether presented as memoir or fiction, post-1918 war writing returns again and again to the same themes and attitudes. Among them are an emphasis on the tedium and terror of ground combat; the privileging of the ordinary soldier’s perspective over that of officers or strategists; a suspicion of authority and a tendency to mock those who wield it; a strong sense of the unbridgeable existential division between those who fight and the people back home; a taste for absurdity, sarcasm and black humor; and the conclusion that, whatever the outcome or justice of the war as a whole, its legacy for the individual veteran will be cynicism and disillusionment.
Fussell found these traits in the literature of his own war — in “The Naked and the Dead,” “Catch-22” and “Gravity’s Rainbow” — and they saturate the Vietnam narratives that followed the publication of his book. The title of “The Things They Carried,” Tim O’Brien’s cycle of autobiographical stories about life before, during and after combat in Vietnam, carries an echo of “The Pilgrim’s Progress,” and its blend of economical prose, blunt naturalism and surreal terror makes it both a definitive account of its own war and a recapitulation of the Great One.
Like nearly every other male writer in English to have tackled the subject of war, Mr. O'Brien owes a clear debt to Hemingway, who came as close as anyone to striking a template for how it should be dealt with in a famous passage from "A Farewell to Arms":
A 1918 United States Tank Corps recruitment poster. Credit The Huntington Library
“There were many words that you could not stand to hear and finally only the names of places had dignity. Certain numbers were the same way and certain dates and these with the names of the places were all you could say and have them mean anything. Abstract words such as glory, honor, courage, or hallow were obscene beside the concrete names of villages, the numbers of roads, the names of rivers, the numbers of regiments and the dates.”
This tough wisdom — itself curiously abstract, in spite of its insistence on specificity — has remained in effect even as the geography has changed. The imperative to tell what really happened, even to a public or a posterity incapable of fully understanding, has produced a literature full of names and dates. Verdun, Passchendaele, Gallipoli, Guadalcanal, Monte Cassino, Stalingrad, Inchon, Khe Sanh, Kandahar, Fallujah. Nov. 11; June 6; Tet; Sept. 11.
In 1964, 50 years after the war began, Philip Larkin, born in 1922, published a memorial poem called “MCMXIV.” Larkin’s subject is less the war as such than a faded England of “archaic faces” and bygone habits, an England that ceased to exist sometime between the assassination of Archduke Franz Ferdinand in Sarajevo on June 28 and the commencement of full, continent-engulfing hostilities at the beginning of August. The poem tries to freeze the moment when the older world — a world his parents knew intimately but one that lay just beyond the horizon of his own memory — “changed itself to past without a word.”
Ernest Hemingway recuperating at an American Red Cross Hospital in Milan in 1918. Credit Corbis
“Never such innocence again,” Larkin concludes, summarizing what was, then and now, a crucial tenet of the conventional wisdom about the Great War, a notion that informed Hemingway’s rejection of the old, elevated language of honor and glory. Even as he acknowledges the seductive power of the idea of lost innocence, Larkin also suggests that it is complicated, even deceptive. Individuals like the anonymous children and husbands who populate his lines can easily be imagined as innocent. Imperial nation-states that have spent the last few centuries conquering most of the rest of the globe are another story.
This was clear enough to Larkin, whose patriotism rested on the notion that England was the worst place on earth with the possible exception of everywhere else. The first time he uses the phrase “Never such innocence” he qualifies it with “never before or since,” suggesting that the particular Edenic aura that hangs over the prewar months of 1914 may be its own kind of illusion. To imply that Britain (or for that matter any other combatant nation) was somehow more innocent than ever on the eve of catastrophe is to register an aftereffect of the catastrophe itself.
The war was so foul and terrible that it could only have erupted in a landscape of goodness and purity. That, at any rate, is one of the myths it leaves behind. Another, favored at the time by a handful of vanguard intellectuals (notably the Italian Futurists) and adapted by some later historians, was that the war accelerated tendencies already present in modern society: toward mechanized violence, total conflict and the fusion of technology and politics.
American troops wading ashore during the Normandy invasion. Credit Associated Press
Accounts of that summer, especially in France and Britain, frequently emphasize beautiful weather and holiday pleasures. Gabriel Chevalier’s “Fear,” a novel of combat published in 1930, opens with “carefree France” in its “summer costumes.” “There wasn’t a cloud in the sky — such an optimistic, bright blue sky.” A lovely example of the interplay of empirical reality and literary embellishment: the meteorological record will attest to the color and clarity of the sky, but only the cruel, corrective irony of hindsight can summon the word “optimistic.”
And then: “In a few short days, civilization was wiped out.” This brutally concise sentence, a few pages into “Fear,” summarizes the loss of innocence that subsequent chapters of first-person narration will elaborate. But those chapters will also make clear the extent to which that “civilization,” so intoxicated by its own rhetoric of national glory and heroic destiny, was the author of its own extinction. The discrepancy between that lofty language and the horrific reality of war opens a chasm in human experience that, in Fussell’s account, has never closed: “I am saying,” he wrote, “that there seems to be one dominating form of modern understanding; that it is essentially ironic; and that it originates largely in the application of mind and memory to the events of the Great War.”
More recent events, and the imaginative response to them, might indicate the extent to which minds can change and memories fade. Chevalier's "bright blue sky" can't help evoking a certain late-summer sky over Manhattan almost 13 years ago, at another moment that would come to mark a boundary between Before and After.
An American Marine resting in November 2004 during a battle to take control of Falluja, Iraq. Credit Ashley Gilbertson for The New York Times
After Sept. 11, 2001, we were told — we told ourselves — that everything had changed. In a curious reversal of the logic of the Great War, the attacks on the World Trade Center and the Pentagon were widely and quickly understood to herald “the death of irony.” What this meant, at least at first, was that a cultural style dominated (according to Roger Rosenblatt in Time, among others) by “detachment and personal whimsy” would give way to an ethic of seriousness and sincerity. But in retrospect, the obituaries for irony were not only premature; they were also part of an aggressive reassertion of innocence, a concerted attempt to refute the conclusion of Larkin’s “MCMXIV.”
There followed a rehabilitation of the abstract words that Hemingway and his lost generation had found so intolerable. Ordinary soldiers were routinely referred to as “heroes” and “warriors,” even as their deaths and injuries were kept from public view. Those at home were encouraged toward displays of patriotism and support but also urged to continue with the optimistic routines of work, leisure and shopping “as if it were all” (to quote Larkin) “an August Bank Holiday lark.”
But the Great War is not quite finished with us. As the wars in Afghanistan and Iraq have wound down in bloody inconclusiveness, the men and women who served in them have started writing, and what they have produced should return us to the morning after the Somme. “Billy Lynn’s Long Halftime Walk,” Ben Fountain’s award-winning 2012 novel, pushes past irony into farce as it juxtaposes the experiences of a battered platoon plunged from the chaos of Iraq into the vulgar spectacle of the Super Bowl, where their service is honored and exploited. The book belongs in the irreverent company of “Catch-22,” which is to say on the same shelf as “All Quiet on the Western Front” and Chevalier’s “Fear.”
Phil Klay’s “Redeployment,” meanwhile, published this year, follows in the hard-boiled, matter-of-fact line of Hemingway and “The Things They Carried.” A deceptively modest collection of linked short stories, “Redeployment” bristles with place names, military numbers and acronyms, grim humor, sexual frustration, sentimental friendship and contempt for authority. It could only have been written by someone who was there, even if “there,” with some adjustments of technology, idiom and climate, might just as well be Ypres as Ramadi. And the moral might have been written by the British memoirist Edmund Blunden, who derived a stark lesson from his own experience at the Battle of the Somme: “The War had won, and would go on winning.”
Up to this point in US History, the nation had attempted to wipe away the culture of indigenous peoples through boarding schools, missionary work, and violence. The goal was to remove the 'savage' culture from the body and replace it with a civilized Christian culture largely derived from Europe. Had that effort succeeded, we should not convince ourselves that prejudice against native peoples would have ceased. Remember, for all the efforts exerted to replace their culture, nothing could change the way they looked.
It was deeply felt by many in our country, from its very beginnings as a colonial possession, that a ‘Christian’ savage was better than a ‘heathen’/ ‘pagan’ savage. It was considered a moral obligation to attempt to convert these peoples. In reality, deep-seated fear, then hatred, of the “Other” would keep our society, as it matured into a nation, from ever seeing anything of value in the native. You can’t help but think that many, since the time of Andrew Jackson, truly believed that the world would be a better place without the Native American.
Then World War I came.
Suddenly, the American native had something that the US, and arguably the world, needed desperately. In a conflict that would introduce the world to modern warfare and its weapons of mass destruction, German code-breaking held an advantage that kept Allied hopes bleak: encoded information was being deciphered by the Germans. The British, among the great code-breakers of all time, were doing the same to the Germans. But WWI was a 'new' beast. In no previous conflict was the cost in lives and materiel on a par with what was unleashed in 1914.
Before the cultures of the Choctaw and Cherokee could be destroyed, someone had the bright idea to use their languages as the basis for encrypting military communications. Neither the Germans nor, perhaps, any institute of learning outside the US had any linguistic understanding of those languages. It was genius! In addition, Native American soldiers were recruited to encode and transmit communications for Allied forces for the duration of WWI and through WWII.
Let's not use the secretive nature of wartime code work as an excuse to overlook the critical role played by those clandestine heroes. We must not forget that amongst them stood the children of generations that our nation tried to wipe away from History. To them all I say, on behalf of a grateful nation, "Thank You!"
History is full of irony, ain’t it?
One of the so-called “Five Civilized Tribes” of the southeastern United States, the Choctaw traditionally farmed corn, beans and pumpkins while also hunting, fishing and gathering wild edibles. Despite allying themselves with the United States in the War of 1812, they were pressured afterwards into ceding millions of acres of land to the government. Following the passage of the Indian Removal Act in 1830, most members of the tribe were then forced to relocate to present-day Oklahoma in a series of poorly planned and poorly provisioned journeys that left an estimated 2,500 dead. In what would become a catchphrase for all Indian removal west of the Mississippi River, a Choctaw chief described it as a “trail of tears.”
When the United States entered World War I in April 1917, it had not yet granted citizenship to all Native Americans, and government-run boarding schools were still largely attempting to stamp out their languages and cultures. Nonetheless, several thousand Native Americans enlisted in the armed forces to fight the Central Powers. Nearly 1,000 of them representing some 26 tribes joined the 36th Division alone, which consisted of men from Texas and Oklahoma. “They saw that they were needed to protect home and country,” said Judy Allen, senior executive officer of tribal relations for the Choctaw Nation of Oklahoma, “so they went to the nearest facility where they could sign up and were shipped out.”
In summer 1918, the 36th Division arrived in France to participate in the upcoming Meuse-Argonne campaign, a major offensive along the Western Front. At that point, the outcome of the conflict was still in doubt. “World War I really wasn’t decided until very, very late,” explained William C. Meadows, a Native American studies professor at Missouri State University and expert on code talking. “It wasn’t like World War II where we clearly had them on the run.”
One main problem for the Allies was the Germans’ ability to listen in on their communications and to break their codes, which were generally based on either European languages or mathematical progressions. “We couldn’t keep anything secret,” Allen said. An apocryphal story spread around that a German once interrupted a U.S. Signal Corps member sending a message to taunt his use of code words. Sending out human runners proved equally ineffective, since about one in four were captured or killed. And other methods of communications, such as color-coded rockets, electronic buzzers and carrier pigeons, were too limiting, too slow, too unreliable or a combination thereof.
Honorable discharge and Purple Heart of Tobias Frazier, a Choctaw Indian code talker during World War I. (Credit: Rodger Mallison/Fort Worth Star-Telegram/MCT via Getty Images)
Soon after the Meuse-Argonne campaign got underway, a company commander in the 36th Division reportedly happened to overhear two of his soldiers conversing in Choctaw. In a flash, he recognized the military potential of the language, essentially unknown to the Germans, and persuaded his superiors to post a Choctaw speaker at various field company headquarters. On October 26, 1918, the Choctaws were put to use for the first time as part of the withdrawal of two companies from the front. Having completed this mission without mishap, they then played a major role the following two days in an attack on a strongly fortified German position called Forest Ferme. “The enemy’s complete surprise is evidence that he could not decipher the messages,” Colonel A.W. Bloor later wrote in an official report. The tide of battle turned within 24 hours, according to Bloor, and within 72 hours the Allies were on full attack.
At least 19 Choctaws subsequently completed a short training session. Lacking the words for certain modern-day military terms, they used “big gun” for artillery, “little gun shoot fast” for machine gun, “stone” for grenade and “scalps” for casualties, among other substitutions, thereby becoming true code talkers rather than simply communications operators speaking a little-known language. “They create these code words, but they don’t actually get to use them because the war ends on the 11th [of November],” Meadows said. Even so, Colonel Bloor described the results of the training session as “very gratifying.” “It is believed, had the regiment gone back into the line, fine results would have been obtained,” he declared. “We were confident the possibilities of the telephone had been obtained without its hazards.”
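The two layers described here, a language the enemy could not understand plus agreed code words layered on top, can be sketched in miniature. This is a toy illustration only: the English glosses are the documented ones quoted above, the helper names are hypothetical, and the actual Choctaw vocabulary is of course not reproduced:

```python
# Toy sketch of the code talkers' layered scheme. Layer 1 swaps
# military terms for agreed code phrases; layer 2 (speaking the result
# in Choctaw) is what made the scheme opaque and is not modeled here.
CODE_WORDS = {
    "artillery": "big gun",
    "machine gun": "little gun shoot fast",
    "grenade": "stone",
    "casualties": "scalps",
}

def encode(message: str) -> str:
    """Replace known military terms with their documented code phrases."""
    for term, phrase in CODE_WORDS.items():
        message = message.replace(term, phrase)
    return message

print(encode("artillery and machine gun fire caused heavy casualties"))
# -> "big gun and little gun shoot fast fire caused heavy scalps"
```

Even this trivial first layer suggests why an eavesdropper who somehow picked up some Choctaw would still hear only talk of "big guns" and "stones" rather than recognizable military vocabulary.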
A captured German later admitted that his side couldn’t make heads or tails of the Choctaw speakers, whom Allen credited with likely bringing about an earlier end to the war and saving hundreds of thousands of lives. The irony would not have been lost on them, she added, that “the same government that was asking them to use their native language to win the war was punishing people for speaking it back home.” American Indians from at least five other tribes also used their native tongues to transmit messages during World War I in an effort to confuse the Germans, although unlike the Choctaws they are not known to have invented intentionally coded vocabulary.
Code talkers made an even bigger impact during World War II, when the U.S. government specifically recruited Comanche, Hopi, Meskwaki, Chippewa-Oneida and Navajo tribal members for such work. The Navajo developed the most complex code, with over 600 terms, for use in the Pacific Theater, compared with about 250 terms for the World War II-era Comanche and under 20 terms for the World War I-era Choctaw. “Even the other tribe members back home didn’t know what this coded vocabulary meant,” Meadows said. “It was all gibberish to them.” In addition to the handful of intentionally coded Native American languages employed by the Allies, they used two dozen or so others on a more ad hoc basis. The opposition is not believed to have deciphered a single code talker message in either world war.
Only the Navajo, with more code talkers than all other tribes combined, have become relatively well known, in part due to the Hollywood film “Windtalkers.” They received congressional recognition for their exploits in 2000, whereas the remaining tribes had to wait eight more years until a bill passed praising them for their “dedication and valor.” “Honoring Native American code talkers is long overdue,” the bill admitted. Pursuant to the legislation, a medal ceremony took place in November 2013 in Washington, D.C., with 33 tribes known to have had code-talking members in attendance. “My regret,” said Allen, “is that none of the code talkers were alive from our [Choctaw] nation to see this moment, and none of their children were alive.”