“If a person perceives that food comes from a grocery store, gasoline from a pump, shoes from an online retailer, it is reasonable to believe then that this person’s perceptions have been skewed into believing that nothing must ever die for us to consume whatever we want in whatever quantities we desire. As long as the blood is on someone else’s hands in some other land far from sight, then there is no blood at all. It is this willful blindness to the day to day functioning of industrial civilization on the part of the world’s wealthier populations that allows a people draped in slave made textiles who are kept fed by the mechanistic rape of stolen land powered by stolen oil to stare up with their doe eyes and without a hint of irony ask, ‘But why do they hate us?’”
If some extraterrestrial species were compiling a history of Homo sapiens, they might well break their calendar into two eras: BNW (before nuclear weapons) and NWE (the nuclear weapons era). The latter era, of course, opened on August 6, 1945, the first day of the countdown to what may be the inglorious end of this strange species, which attained the intelligence to discover the effective means to destroy itself, but — so the evidence suggests — not the moral and intellectual capacity to control its worst instincts.
Day one of the NWE was marked by the “success” of Little Boy, a simple atomic bomb. On day four, Nagasaki experienced the technological triumph of Fat Man, a more sophisticated design. Five days later came what the official Air Force history calls the “grand finale,” a 1,000-plane raid — no mean logistical achievement — attacking Japan’s cities and killing many thousands of people, with leaflets falling among the bombs reading “Japan has surrendered.” Truman announced that surrender before the last B-29 returned to its base.
Those were the auspicious opening days of the NWE. As we now enter its 70th year, we should be contemplating with wonder that we have survived. We can only guess how many years remain.
Some reflections on these grim prospects were offered by General Lee Butler, former head of the U.S. Strategic Command (STRATCOM), which controls nuclear weapons and strategy. Twenty years ago, he wrote that we had so far survived the NWE “by some combination of skill, luck, and divine intervention, and I suspect the latter in greatest proportion.”
Reflecting on his long career in developing nuclear weapons strategies and organizing the forces to implement them efficiently, he described himself ruefully as having been “among the most avid of these keepers of the faith in nuclear weapons.” But, he continued, he had come to realize that it was now his “burden to declare with all of the conviction I can muster that in my judgment they served us extremely ill.” And he asked, “By what authority do succeeding generations of leaders in the nuclear-weapons states usurp the power to dictate the odds of continued life on our planet? Most urgently, why does such breathtaking audacity persist at a moment when we should stand trembling in the face of our folly and united in our commitment to abolish its most deadly manifestations?”
He termed the U.S. strategic plan of 1960 that called for an automated all-out strike on the Communist world “the single most absurd and irresponsible document I have ever reviewed in my life.” Its Soviet counterpart was probably even more insane. But it is important to bear in mind that there are competitors, not least among them the easy acceptance of extraordinary threats to survival.
Survival in the Early Cold War Years
According to received doctrine in scholarship and general intellectual discourse, the prime goal of state policy is “national security.” There is ample evidence, however, that the doctrine of national security does not encompass the security of the population. The record reveals that, for instance, the threat of instant destruction by nuclear weapons has not ranked high among the concerns of planners. That much was demonstrated early on, and remains true to the present moment.
In the early days of the NWE, the U.S. was overwhelmingly powerful and enjoyed remarkable security: it controlled the hemisphere, the Atlantic and Pacific oceans, and the opposite sides of those oceans as well. Long before World War II, it had already become by far the richest country in the world, with incomparable advantages. Its economy boomed during the war, while other industrial societies were devastated or severely weakened. By the opening of the new era, the U.S. possessed about half of total world wealth and an even greater percentage of its manufacturing capacity.
There was, however, a potential threat: intercontinental ballistic missiles with nuclear warheads. That threat was discussed in the standard scholarly study of nuclear policies, carried out with access to high-level sources — Danger and Survival: Choices About the Bomb in the First Fifty Years by McGeorge Bundy, national security adviser during the Kennedy and Johnson presidencies.
Bundy wrote that “the timely development of ballistic missiles during the Eisenhower administration is one of the best achievements of those eight years. Yet it is well to begin with a recognition that both the United States and the Soviet Union might be in much less nuclear danger today if [those] missiles had never been developed.” He then added an instructive comment: “I am aware of no serious contemporary proposal, in or out of either government, that ballistic missiles should somehow be banned by agreement.” In short, there was apparently no thought of trying to prevent the sole serious threat to the U.S., the threat of utter destruction in a nuclear war with the Soviet Union.
Could that threat have been taken off the table? We cannot, of course, be sure, but it was hardly inconceivable. The Russians, far behind in industrial development and technological sophistication, were in a far more threatening environment. Hence, they were significantly more vulnerable to such weapons systems than the U.S. There might have been opportunities to explore these possibilities, but in the extraordinary hysteria of the day they could hardly have even been perceived. And that hysteria was indeed extraordinary. An examination of the rhetoric of central official documents of that moment like National Security Council Paper NSC-68 remains quite shocking, even discounting Secretary of State Dean Acheson’s injunction that it is necessary to be “clearer than truth.”
One indication of possible opportunities to blunt the threat was a remarkable proposal by Soviet ruler Joseph Stalin in 1952, offering to allow Germany to be unified with free elections on the condition that it would not then join a hostile military alliance. That was hardly an extreme condition in light of the history of the past half-century during which Germany alone had practically destroyed Russia twice, exacting a terrible toll.
Stalin’s proposal was taken seriously by the respected political commentator James Warburg, but otherwise mostly ignored or ridiculed at the time. Recent scholarship has begun to take a different view. The bitterly anti-Communist Soviet scholar Adam Ulam has taken the status of Stalin’s proposal to be an “unresolved mystery.” Washington “wasted little effort in flatly rejecting Moscow’s initiative,” he has written, on grounds that “were embarrassingly unconvincing.” The political, scholarly, and general intellectual failure left open “the basic question,” Ulam added: “Was Stalin genuinely ready to sacrifice the newly created German Democratic Republic (GDR) on the altar of real democracy,” with consequences for world peace and for American security that could have been enormous?
Reviewing recent research in Soviet archives, one of the most respected Cold War scholars, Melvyn Leffler, has observed that many scholars were surprised to discover “[Lavrenti] Beria — the sinister, brutal head of the [Russian] secret police — propos[ed] that the Kremlin offer the West a deal on the unification and neutralization of Germany,” agreeing “to sacrifice the East German communist regime to reduce East-West tensions” and improve internal political and economic conditions in Russia — opportunities that were squandered in favor of securing German participation in NATO.
Under the circumstances, it is not impossible that agreements might then have been reached that would have protected the security of the American population from the gravest threat on the horizon. But that possibility apparently was not considered, a striking indication of how slight a role authentic security plays in state policy.
The Cuban Missile Crisis and Beyond
That conclusion was underscored repeatedly in the years that followed. When Nikita Khrushchev took control in Russia in 1953 after Stalin’s death, he recognized that the USSR could not compete militarily with the U.S., the richest and most powerful country in history, with incomparable advantages. If it ever hoped to escape its economic backwardness and the devastating effect of the last world war, it would need to reverse the arms race.
Accordingly, Khrushchev proposed sharp mutual reductions in offensive weapons. The incoming Kennedy administration considered the offer and rejected it, instead turning to rapid military expansion, even though it was already far in the lead. The late Kenneth Waltz, supported by other strategic analysts with close connections to U.S. intelligence, wrote then that the Kennedy administration “undertook the largest strategic and conventional peace-time military build-up the world has yet seen… even as Khrushchev was trying at once to carry through a major reduction in the conventional forces and to follow a strategy of minimum deterrence, and we did so even though the balance of strategic weapons greatly favored the United States.” Again, harming national security while enhancing state power.
U.S. intelligence verified that huge cuts had indeed been made in active Soviet military forces, both in terms of aircraft and manpower. In 1963, Khrushchev again called for new reductions. As a gesture, he withdrew troops from East Germany and called on Washington to reciprocate. That call, too, was rejected. William Kaufmann, a former top Pentagon aide and leading analyst of security issues, described the U.S. failure to respond to Khrushchev’s initiatives as, in career terms, “the one regret I have.”
The Soviet reaction to the U.S. build-up of those years was to place nuclear missiles in Cuba in October 1962 to try to redress the balance at least slightly. The move was also motivated in part by Kennedy’s terrorist campaign against Fidel Castro’s Cuba, which was scheduled to lead to invasion that very month, as Russia and Cuba may have known. The ensuing “missile crisis” was “the most dangerous moment in history,” in the words of historian Arthur Schlesinger, Kennedy’s adviser and confidant.
As the crisis peaked in late October, Kennedy received a secret letter from Khrushchev offering to end it by simultaneous public withdrawal of Russian missiles from Cuba and U.S. Jupiter missiles from Turkey. The latter were obsolete missiles, already ordered withdrawn by the Kennedy administration because they were being replaced by far more lethal Polaris submarines to be stationed in the Mediterranean.
Kennedy’s subjective estimate at that moment was that if he refused the Soviet premier’s offer, there was a 33% to 50% probability of nuclear war — a war that, as President Eisenhower had warned, would have destroyed the northern hemisphere. Kennedy nonetheless refused Khrushchev’s proposal for public withdrawal of the missiles from Cuba and Turkey; only the withdrawal from Cuba could be public, so as to protect the U.S. right to place missiles on Russia’s borders or anywhere else it chose.
It is hard to think of a more horrendous decision in history — and for this, he is still highly praised for his cool courage and statesmanship.
Ten years later, in the last days of the 1973 Israel-Arab war, Henry Kissinger, then national security adviser to President Nixon, called a nuclear alert. The purpose was to warn the Russians not to interfere with his delicate diplomatic maneuvers designed to ensure an Israeli victory, but of a limited sort so that the U.S. would still be in control of the region unilaterally. And the maneuvers were indeed delicate. The U.S. and Russia had jointly imposed a cease-fire, but Kissinger secretly informed the Israelis that they could ignore it. Hence the need for the nuclear alert to frighten the Russians away. The security of Americans had its usual status.
Ten years later, the Reagan administration launched operations to probe Russian air defenses by simulating air and naval attacks and a high-level nuclear alert that the Russians were intended to detect. These actions were undertaken at a very tense moment. Washington was deploying Pershing II strategic missiles in Europe with a five-minute flight time to Moscow. President Reagan had also announced the Strategic Defense Initiative (“Star Wars”) program, which the Russians understood to be effectively a first-strike weapon, a standard interpretation of missile defense on all sides. And other tensions were rising.
Naturally, these actions caused great alarm in Russia, which, unlike the U.S., was quite vulnerable and had repeatedly been invaded and virtually destroyed. That led to a major war scare in 1983. Newly released archives reveal that the danger was even more severe than historians had previously assumed. A CIA study entitled “The War Scare Was for Real” concluded that U.S. intelligence may have underestimated Russian concerns and the threat of a Russian preventative nuclear strike. The exercises “almost became a prelude to a preventative nuclear strike,” according to an account in the Journal of Strategic Studies.
It was even more dangerous than that, as we learned last September, when the BBC reported that right in the midst of these world-threatening developments, Russia’s early-warning systems detected an incoming missile strike from the United States, sending its nuclear system onto the highest-level alert. The protocol for the Soviet military was to retaliate with a nuclear attack of its own. Fortunately, the officer on duty, Stanislav Petrov, decided to disobey orders and not report the warnings to his superiors. He received an official reprimand. And thanks to his dereliction of duty, we’re still alive to talk about it.
The security of the population was no more a high priority for Reagan administration planners than for their predecessors. And so it continues to the present, even putting aside the numerous near-catastrophic nuclear accidents that occurred over the years, many reviewed in Eric Schlosser’s chilling study Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. In other words, it is hard to contest General Butler’s conclusions.
Survival in the Post-Cold War Era
The record of post-Cold War actions and doctrines is hardly reassuring either. Every self-respecting president has to have a doctrine. The Clinton Doctrine was encapsulated in the slogan “multilateral when we can, unilateral when we must.” In congressional testimony, the phrase “when we must” was explained more fully: the U.S. is entitled to resort to “unilateral use of military power” to ensure “uninhibited access to key markets, energy supplies, and strategic resources.” Meanwhile, STRATCOM in the Clinton era produced an important study entitled “Essentials of Post-Cold War Deterrence,” issued well after the Soviet Union had collapsed and Clinton was extending President George H.W. Bush’s program of expanding NATO to the east in violation of promises to Soviet Premier Mikhail Gorbachev — with reverberations to the present.
That STRATCOM study was concerned with “the role of nuclear weapons in the post-Cold War era.” A central conclusion: that the U.S. must maintain the right to launch a first strike, even against non-nuclear states. Furthermore, nuclear weapons must always be at the ready because they “cast a shadow over any crisis or conflict.” They were, that is, constantly being used, just as you’re using a gun if you aim but don’t fire one while robbing a store (a point that Daniel Ellsberg has repeatedly stressed). STRATCOM went on to advise that “planners should not be too rational about determining… what the opponent values the most.” Everything should simply be targeted. “[I]t hurts to portray ourselves as too fully rational and cool-headed… That the U.S. may become irrational and vindictive if its vital interests are attacked should be a part of the national persona we project.” It is “beneficial [for our strategic posture] if some elements may appear to be potentially ‘out of control,’” thus posing a constant threat of nuclear attack — a severe violation of the U.N. Charter, if anyone cares.
Not much here about the noble goals constantly proclaimed — or for that matter the obligation under the Non-Proliferation Treaty to make “good faith” efforts to eliminate this scourge of the earth. What resounds, rather, is an adaptation of Hilaire Belloc’s famous couplet about the Maxim gun (to quote the great African historian Chinweizu):
“Whatever happens, we have got,
The Atom Bomb, and they have not.”
After Clinton came, of course, George W. Bush, whose broad endorsement of preventative war easily encompassed Japan’s attack in December 1941 on military bases in two U.S. overseas possessions, at a time when Japanese militarists were well aware that B-17 Flying Fortresses were being rushed off assembly lines and deployed to those bases with the intent “to burn out the industrial heart of the Empire with fire-bomb attacks on the teeming bamboo ant heaps of Honshu and Kyushu.” That was how the prewar plans were described by their architect, Air Force General Claire Chennault, with the enthusiastic approval of President Franklin Roosevelt, Secretary of State Cordell Hull, and Army Chief of Staff General George Marshall.
Then comes Barack Obama, with pleasant words about working to abolish nuclear weapons — combined with plans to spend $1 trillion on the U.S. nuclear arsenal in the next 30 years, a percentage of the military budget “comparable to spending for procurement of new strategic systems in the 1980s under President Ronald Reagan,” according to a study by the James Martin Center for Nonproliferation Studies at the Monterey Institute of International Studies.
Obama has also not hesitated to play with fire for political gain. Take for example the capture and assassination of Osama bin Laden by Navy SEALs. Obama brought it up with pride in an important speech on national security in May 2013. It was widely covered, but one crucial paragraph was ignored.
Obama hailed the operation but added that it could not be the norm. The reason, he said, was that the risks “were immense.” The SEALs might have been “embroiled in an extended firefight.” Even though, by luck, that didn’t happen, “the cost to our relationship with Pakistan and the backlash among the Pakistani public over encroachment on their territory was… severe.”
Let us now add a few details. The SEALs were ordered to fight their way out if apprehended. They would not have been left to their fate if “embroiled in an extended firefight.” The full force of the U.S. military would have been used to extricate them. Pakistan has a powerful, well-trained military, highly protective of state sovereignty. It also has nuclear weapons, and Pakistani specialists are concerned about the possible penetration of their nuclear security system by jihadi elements. It is also no secret that the population has been embittered and radicalized by Washington’s drone terror campaign and other policies.
While the SEALs were still in the bin Laden compound, Pakistani Chief of Staff Ashfaq Parvez Kayani was informed of the raid and ordered the military “to confront any unidentified aircraft,” which he assumed would be from India. Meanwhile in Kabul, U.S. war commander General David Petraeus ordered “warplanes to respond” if the Pakistanis “scrambled their fighter jets.” As Obama said, by luck the worst didn’t happen, though it could have been quite ugly. But the risks were faced without noticeable concern. Or subsequent comment.
As General Butler observed, it is a near miracle that we have escaped destruction so far, and the longer we tempt fate, the less likely it is that we can hope for divine intervention to perpetuate the miracle.
A story about perspective on childhood:
A friend I’ve known for over a decade, who’s indigenous to the lands I now call home, saw a high school kid fall off his bicycle. The kid fell backwards, and his helmet slipped towards his nose. My friend, whose recent birthday had inched him towards forty, said he watched the EMTs lift the teenager into an ambulance and hoped he was OK. Later we learned the kid died, and this big brown man’s voice quivered, “Oh no.” He tipped against the wall and started sobbing. He wept at this complete stranger’s death. I watched, unsure how to react, and just told him I was there for him. A few hours later he walked up, hugged me, and said, “I’m sorry. Where I’m from, so many young people die.”
I recently visited the Pine Ridge Reservation to witness and stand on the side of the Lakota people who’d invited us. They invited us as allies to take part in an ongoing struggle against Whiteclay, NE. Before getting into the flesh of that struggle, there are things I need to share, things I think we must look at and try to understand before reaching the actions. See, there are facts, little factoids you learn about a place when you’re there. Sometimes about death and addiction. Statistics, and they mean nothing. It’s hard to grasp a percentage of a thing when you’re not in the thing, when you’re not the one living it.
Pine Ridge Reservation statistics:
- Lakota people have the lowest life expectancy in America.
- Life expectancy is 48 for men and 52 for women; the lowest reported is 45 years. The U.S. average lifespan is 77.5 years.
- Teenage suicide rate is 150% higher than the U.S. average for teens.
- The infant mortality rate is the highest on this continent, 300% higher than the U.S. national average.
- Alcoholism affects 8 out of 10 families.
- The death rate from alcohol-related problems on the Reservation is 300% higher than for the rest of the U.S. population.
We heard these numbers as we crossed onto the Pine Ridge Reservation. These death and alcohol statistics became more comprehensible as we listened to countless stories of the known and recently dead: cousins and siblings, relative upon relative, lost to car accidents, acts of random drunken violence, liver cirrhosis. Died from alcohol. While on Pine Ridge, I listened, and everyone had stories and lists of their known dead. The Pizza Hut worker who hated Whiteclay, whose brother died of cirrhosis, of alcoholism. A 35-year-old woman who lived on the streets of Whiteclay, who’d slur at me that she knew it was bad, that Whiteclay had to go, that her father died there. But we ran into her after she’d made the two-mile trek to Whiteclay, NE from the Pine Ridge Reservation. My trip to the reservation wasn’t only to learn about the history of Whiteclay, NE, but also to listen to stories from the people who struggle against Whiteclay and who call the Pine Ridge Reservation home.
We arrived in the Lakota’s homelands by crossing an imaginary border out of Nebraska into what I’d learn wasn’t the Pine Ridge Reservation to the Lakota who invited us, but was really POW Camp 344.
The name references the tribal identification number assigned to them by the federal government. We were told that each reservation, and thus each tribe, was assigned a numeric sequence to attach them to the lands their people were now isolated to. While the story of POW Camp 344 is far too large for this small article, let it be known that I heard stories of warriors who once fought the United States government and won victories. The Lakota warriors even decimated one of the United States’ armies in the field. Warriors vs. soldiers.
“Sustainability” is the buzzword passed around nearly every environmental and social justice circle today. For how often the word is uttered, those who use it rarely articulate what they are advocating. And because the term is applied so compulsively while remaining undefined, it makes it impossible for our movements to set and actualize goals, let alone to assess the strategies and tactics we employ to reach them.
Underneath the surface, sustainability movements have largely become spaces where well-meaning sensibilities are turned into empty gestures and regurgitations of unarticulated ideals out of mere obligation to our identity as “environmentalists” and “activists.” We mention “sustainability” because to not mention it would undermine our legitimacy and work completely. But as destructive as not mentioning the word would be, so too is the lack of defining it.
When we don’t articulate our ideals ourselves, we not only allow others to define us but also give space for destructive premises to continue unchallenged. The veneer of most environmental sustainability movements begins to wither away when we acknowledge that most of their underlying premises essentially mimic the exact forces we claim to oppose.
The dominant culture currently runs on numerous underlying premises – whether it is the belief in infinite growth and progress, the myth of technological prowess and human superiority, or even the notion that this culture is the most successful, advanced and equitable way of life to ever exist.
These premises often combine to form the basis of an ideological belief in infinite substitutability – when a crisis occurs, our human ingenuity and creativity will always be able to save us by substituting our disintegrating resources and systems with new ones.
And by and large, most of us accept this as truth and never question or oppose the introduction of new technologies and resources into our lives. We never question whom these technologies and resources actually benefit or what their material effects may be. Often, we never question why we need them, and we never think about what problems they purport to solve or, more accurately, conceal entirely.
New negotiations between Israelis and Palestinians may begin next week, with much talk of a “new chapter” in the seemingly intractable conflict. A new chapter, perhaps, but who is writing the book?
Any public discussion about the “peace process” is tense, in part because there is no widely shared understanding of the history and politics of — even an appropriate terminology for — the conflict. That’s as true in the United States as in Palestine and Israel.
I never gave much thought to the question until I was 30 years old, in the late 1980s. Before that, I had a typical view of the conflict for an apolitical American: It was confusing, and everyone involved seemed a bit crazy. With no understanding of the history of the region and no framework for analyzing U.S. policy in the Middle East, it was all a muddle, and so I ignored it. That’s one of the privileges of being in the comfortable classes in the United States — you can remain comfortably ignorant.
But as a frustrated journalist with a newfound freedom to examine the politics of news media in graduate school, I began studying law and human rights, in the domestic and international arenas. I also started digging into the issues I had been avoiding. In the case of Palestine/Israel, I began reading about the roots of the conflict, how the United States was involved, and how U.S. journalists were presenting the issues.
I came to this inquiry with no firm allegiance to either side. As a white U.S. citizen from a centrist Protestant background but with no religious commitments, I felt no cultural or spiritual connection to either national group. I don’t speak Hebrew or Arabic, and I had never traveled to the Middle East. I had no personal relationships that predisposed me to favor one group over the other. Like any human, I was not free of bias, of course. As a relatively unreflective white man rooted in a predominantly Christian culture, I was raised with some level of anti-Semitism and anti-Arab racism, for example, and no doubt that affected my perceptions. But based solely on my personal profile, I didn’t have a dog in that fight, or so I thought.
Thi Sieu says her family lived for generations on a small plot of land studded with cashew trees until they fell victim to an alleged land grab by powerful local elites, a fate shared with many indigenous farmers in Vietnam’s lush central hills.
All land in the communist nation is owned by the state and usage rights are frequently opaque, allowing corrupt local officials and well-connected businessmen to seize land with impunity, according to activists speaking to AFP.
The Central Highlands have long been a hotbed of discontent over land rights, thanks in part to government schemes luring big agricultural firms and lowland migrants seeking their fortunes in booming cashew, coffee and rubber industries.
Official figures show the area’s population surged from 1.5 million in 1975 to around six million in 2010, prompting complaints from indigenous minorities of forced evictions by newly arrived ethnic Kinh, who make up 90 percent of the population.