Tuesday, February 28, 2012

History Channel Packs A Lot Into A Little

I was eating breakfast this morning before going to work and glanced at a program on the History Channel concerning the War of 1812. It was highly informative, but not because it told me anything I didn't already know about the war itself.

In the first minute, a historian disparaged then-President James Madison as a "cold fish" and "nerdy." Of course he was. James Madison was a man of character and intellect, and such a man has little place in modern America, certainly no more of a place than the Constitution he gave us.

In the second minute, another historian lamented that the war occurred only because of the slow pace of communications -- the British repealed their hated impressment laws before our Congress declared war (this was back when Congress still did such things), and modern communications supposedly would have averted the war because we would have known instantly of the repeal. For all our vaunted technology, wisdom apparently remains scarce. When an emotional decision is made, all conflicting data are rationalized away. America's Anglophobia, coupled with its lust for territory, sought an outlet, and no telephones or emails would have sufficed to plug it. Most likely, news of the repeal would have been buried until after the war was on, or a new outrage would have magically surfaced to justify the war even if impressment no longer did.

Thanks, History Channel. In two minutes you conveyed deep insights that most people can't grasp over a lifetime, although this may not have been your intention.

Sunday, February 26, 2012

War -- Part III

Iraq

The invasion of Iraq at the behest of President George W. Bush in the spring of 2003 inflamed worldwide debate, largely because the erratic and inconsistent explanations for the invasion wholly failed to pass legal or logical muster. America had no justification to launch a war against a nation that had neither attacked it nor posed a credible threat of imminent attack, even if America’s pretext was to uphold U.N. resolutions (while condemning the U.N. as uncooperative); to quarantine weapons of mass destruction known as “WMDs” (while failing to establish their existence); to depose a dictator and his Baath party (while originally helping that party to seize power); or to “liberate” an oppressed people (while already having bombed and embargoed those people into abject misery for the previous decade).

First, contrary to the federal government’s initial justification, the U.N. Security Council never approved a resolution empowering the United States or its allies to decide on their own when to use force against Iraq to deal with WMDs. The only Security Council measure that ever authorized any military force against Iraq was adopted in 1990 and had as its goal the removal of Iraqi forces from Kuwait. That resolution (678) did not grant authority to invade Iraq even at the time, and certainly not some thirteen years after Kuwait had regained her independence. As noted by the commander of the British forces in the first Gulf War Coalition: “We did not have a mandate to invade Iraq or take the country over . . . . [I]n pressing on to the Iraqi capital we would have moved outside the remit of the United Nations authority, within which we had worked so far.” But the United States attempted nonetheless to rely on 678 and other inapplicable resolutions to justify its actions. One such measure was Security Council Resolution 687, which offered to lift economic sanctions if Iraq cooperated with WMD inspections – it did not authorize new sanctions, let alone military action. An additional Security Council Resolution that the Bush administration latched onto was 1441, which did not authorize the use of force, but rather provided that the Security Council would re-convene if Iraq failed to cooperate with a full inspection for WMDs. The United States thus violated 1441 by making a unilateral determination to use force on its own, claiming all the while that this usurpation of the Security Council’s power would rescue the United Nations from irrelevance.

Public scrutiny of the invasion eventually peaked, and the United States could no longer hide behind its transparent assertions that the Security Council had somehow authorized the invasion, so the United States switched gears and argued that it was exercising the inherent right of self-defense under Article 51 of the U.N. Charter. Once again, however, this explanation didn’t wash. No proven link connected Iraq to the terrorist attacks of September 11, 2001, and the Bush administration did not even make such an accusation against Iraq during the period leading up to the invasion. What the Bush administration had alleged was the threat of WMDs believed to be in Iraq’s possession. President Bush waxed hysterical about miniature planes that could deliver chemical or biological agents to American soil, and National Security Adviser Condoleezza Rice warned that Iraq possessed aluminum tubes for producing nuclear weapons. Once the invasion was well under way, it became apparent that these claims were false and that suspected stockpiles of WMDs did not exist. Unfortunately, the ensuing public debate missed the point by focusing on whether the American-British coalition had made an honest mistake or whether the pre-invasion intelligence was intentionally “doctored.” Under either scenario – honest mistake versus intentional fraud – the invasion was illegal because even if Iraq had possessed WMDs, this would not trigger a right of self-defense in the United States. Per Article 51 of the U.N. Charter and customary international law, the right of self-defense arises only when an armed attack has occurred or is very likely to occur in the near future, neither of which was the case with Iraq.

In straining even further to legitimize the invasion, the United States went so far as to resurrect the gelatinous “just war” doctrine to imply that aggression can become permissible if in the service of a noble objective. President Bush’s hired theologian Michael Novak made this argument when attempting to secure the Vatican’s blessing, but the Vatican rebuffed him and publicly denounced the invasion as unjust (an episode demonstrating why this highly subjective test was shelved in the first place).

Switching gears yet again, the United States argued for an unprecedented right of “Preventive War” that would allow military force against any potential threats despite the lack of actual or imminent danger. Such a right of armed attack against mere potential threats is as audacious as it is unworkable, since nations always face the classic Security Dilemma: as one nation grows more secure, other nations feel less secure. In a world of such perpetual insecurity, “Preventive War” would grant nations perpetual license to attack each other without provocation. According to the doctrine’s own internal logic, other nations now have good reason to feel threatened and to launch “preventive” attacks against the United States itself, lest they find themselves in the same predicament as Iraq. To deal with this logical conclusion, the United States apparently reserves to itself the right of “Preventive War” while denying it to others – unprovoked attacks by the United States are proper, whereas only unprovoked attacks upon the United States are criminal. One must look to ancient Rome for a double standard of such staggering proportions. Under the legal contours of self-defense, it was actually Iraq who enjoyed the right to use force to defend itself against the United States, and Article 51 of the U.N. Charter authorized other nations to come to Iraq’s “collective defense” to repulse the invasion if they so chose. Other nations may indeed have contributed to the insurgency already, and if so they were not violating the law of the use of force, but rather exercising their right to defend a nation under aggressive attack.

When confounded by arguments such as these, invasion enthusiasts at last shed all pretense of legality and resort to their emotional trump card: Saddam Hussein and his Baathist regime had to be destroyed for the Iraqi people’s own benefit, and anyone who disagrees is a sympathizer and enabler of tyranny. Left unsaid is that the United States in 1963 first “enabled” the Baathists into power by using the Central Intelligence Agency to overthrow the republican government established by Abdul Karim Qasim, a violation of Iraq’s sovereignty and of Article 2(4) of the U.N. Charter. Left unsaid is that the United States had no quarrel with Saddam Hussein when selling him conventional and chemical weapons in the 1980s to fight against Iran. Left unsaid is that the United States has managed to kill more Iraqis since 2003 than Saddam Hussein likely could have, which doesn’t even take into account the hundreds of thousands who died previously as an admitted and unapologetic result of the Clinton administration’s policies. And left unsaid is that the United States has long maintained friendly relationships with despots worldwide . . . so long as they do the United States’ bidding. One need only remember the Shah of Iran, General Musharraf in Pakistan, or Pinochet in Chile to see that spreading “democracy” is a means to the federal government’s ends, not an end in itself.

Yet even if we ignored all these unpleasant facts and agreed that the world is a better place with Saddam Hussein removed from it, we would still face a sobering reality: like Japan, Italy, and Germany during the 1930s, the United States waged an aggressive, illegal war and attempted to justify it with the successful outcome achieved thereby, falling squarely within Justice Robert Jackson’s admonition at Nuremberg.

Saturday, February 25, 2012

War -- Part II

America Violates The Law Consecrated By Its Own Blood

Despite waging a war to defeat the aggression of the Axis powers, and despite constructing a legal framework in the Nuremberg Principles and the U.N. Charter to prevent such calamities from recurring, the United States government has since undertaken its own course of aggression that disrupts international peace and threatens to tear this legal framework apart. In a most unoriginal manner, the United States has justified such actions as necessary to protect oppressed peoples and to foster an overblown concept of national security reaching far beyond national boundaries. These rationales do nothing more than mimic the rhetoric of Japan, Italy, and Germany prior to World War II, and they make a mockery of that war’s chief lesson: aggression is a forbidden method for advancing any agenda, no matter how just it may seem. Because the federal government has discarded most legal restraints on its behavior, and because it faces no serious foreign rival to its military might, it has brazenly proclaimed that its version of post-constitutional “democracy” is superior to all other forms of government and that the United States may impose this system on any other nation for the sake of national security. As President George W. Bush openly admitted in his second inaugural address:
[I]t is the policy of the United States to seek and support the growth of democratic movements and institutions in every nation and culture, with the ultimate goal of ending tyranny in our world.
This is not a crusade to end tyranny. This is tyranny. The sentiment on display in Bush’s pronouncement (which America’s political firmament shares) disregards not only the law of the use of force, but also another rudimentary doctrine of international law: nation-states are not obligated to adopt a particular form of government.* A nation-state’s existence hinges on only four things: 1) a permanent population; 2) a defined territory; 3) a government (not a particular type of government); and 4) the capacity to conduct relations with other nation-states. The United States itself officially acknowledges these criteria. In the eyes of the law it makes no difference whether a given government is a democracy, a republic, a monarchy, a socialist collective, or a military dictatorship – so long as the nation-state satisfies the legal criteria, it shares sovereign equality with all other nation-states, and it has the right to defend itself from unprovoked attacks or interventions.

Even the most despicable nation may make binding treaties; it may refuse to enter treaties with which it disagrees; it may defend itself or seek help from other nations if attacked; and it may come to the aid of any other nation under attack. “Democracies” are not the only nations entitled to such basic rights, and “democracies” are certainly not entitled to deprive other nations of them.

Disregarding all this, federal foreign policy now mimics the Brezhnev Doctrine whereby the old Soviet Union reserved the right to invade its neighbors to restore regimes more friendly to Soviet policies (as with Czechoslovakia in 1968 and Afghanistan in 1979). An American Brezhnev Doctrine in the service of “democracy” is no more legal and no less opportunistic – the federal government topples elected governments while empowering despots whenever expedient to its own interests. A few examples of federal aggression abroad demonstrate that the federal government’s war-making policy is in fact a war on the Nuremberg Principles, on the U.N. Charter, and on national sovereignty.
____________________________________
*The technical method for describing nation-states is to refer to them as “states.” However, since most Americans perceive “states” as the internal components of the United States, I avoid confusion by referring to states on the world stage as “nations,” “nation-states,” or “countries.”

Wednesday, February 22, 2012

Astounding Numbers

The Heritage Foundation estimates that 49.5% of Americans pay no federal income taxes, raising a giant red flag about how democracy is fulfilling its parasitic promise.

But as usual, the truth is even worse. If we factor in all the people who draw a government salary, we must add them to the rolls of non-taxpayers as well. Someone is shrieking right about now that government employees pay taxes too! No, you don't. You are paid with tax money, so the taxes you pay back are nothing more than a slight diminution of your take. For example, if you receive $10 in tax money and pay back $1, you remain a tax consumer to the tune of $9.

Government workers at the federal, state, and local levels comprise about 8% of the population, so we have a total of 57.5% of the population living at the expense of the other 42.5%. The parasite is already larger than the host.
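
For anyone who wants the arithmetic spelled out, here is a minimal sketch in Python. It simply restates the figures above and this post's implicit assumption that the two groups do not overlap; none of the numbers come from anywhere other than this post.

# A toy restatement of the arithmetic above. Assumptions are the post's:
# 49.5% pay no federal income taxes, government workers are about 8% of
# the population, and the two groups are treated as non-overlapping.

non_taxpayers = 0.495       # Heritage Foundation estimate cited above
government_workers = 0.08   # rough share of the population on a government payroll

tax_consumers = non_taxpayers + government_workers
net_taxpayers = 1 - tax_consumers
print(f"Tax consumers: {tax_consumers:.1%}")   # 57.5%
print(f"Net taxpayers: {net_taxpayers:.1%}")   # 42.5%

# The $10 example from above: a government worker paid $10 of tax money
# who remits $1 in taxes remains a net tax consumer of $9.
received, paid_back = 10, 1
print(f"Net tax consumption: ${received - paid_back}")   # $9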

Tuesday, February 21, 2012

Imagine That

A rational, intelligent, objective climate analyst stoops to thievery and fraud in an attempt to smear the opposition. How shocking it must be for some to realize that science is only as good as the people using it -- it is not a sacred pipeline to the truth, but yet another outlet for human fallibility.

Monday, February 20, 2012

War -- Part I

It's the oldest trick in the book: when troubles proliferate, pick a fight. Fights compel us to put aside our differences and join the common cause, so they are catnip for politicians. Senator Arthur Vandenberg famously endorsed this phenomenon during the Cold War when announcing that all politics must stop at the water's edge. The forays into Iraq and Afghanistan have not sufficed to squelch dissent, so Washington is looking to double down in Iran, Syria, or anyplace else where blood might be shed. A stumbling block is the very system of law that the United States helped create after the Second World War, which condemns the aggressive use of force. Not many people understand this system or how it came about, so here's a thumbnail sketch. I wrote this three years ago during the Bush administration, but things have not changed very much.

AMERICA LETS SLIP THE DOGS OF WAR

In recent decades the United States has claimed unfettered discretion to initiate war against mere potential threats to national security, as well as to topple governments that the United States deems oppressive to their own citizens. This attitude represents a grave breach of treaty and customary obligations that the United States itself was instrumental in creating, a tragic irony that projects to the rest of the world a spectacle of unlawful government that has become all too familiar here at home. Enabling this belligerence is a pagan philosophy that equates power with righteousness, as summed up by one of George W. Bush’s spokesmen:
We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality – judiciously, as you will – we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out.
By refusing to be bound by the law or even by the concept of objective truth, and by drawing the sword according to nothing but its own whims, the United States government has joined the company of history’s worst offenders. The purportedly noble goals of this arrogance do nothing to make it righteous or unique.

The Struggle To Define The Lawful Initiation of Force

For centuries man has attempted to frame principles governing when the resort to war becomes justifiable, a largely futile effort that has generated a great deal of literature and that only recently has become standardized into a uniform set of rules.*

From the days of ancient Greece up through the nineteenth century, scholars propounded the theory of “just war” as determining when force becomes proper. As outlined by secular thinkers such as Hugo Grotius and ecclesiastics such as St. Thomas Aquinas, the initiation of military force had to fulfill an array of standards such as approval by a legitimate authority, an intention to advance good and/or avoid evil, force as a last resort only, proportionality between the costs of war and the goals to be attained thereby, a reasonable hope of success, and a formal declaration of war. But these criteria proved highly subjective and open to misinterpretation, so they did little to restrain nation-states once they emerged as autonomous entities at the end of the Middle Ages. National rulers predictably began throwing the mantle of justice over every one of their sordid military ventures, diminishing the “just war” doctrine from a precondition to a pretext.

Only towards the end of the nineteenth century did nations begin making an earnest effort to place an objective boundary on what had become wanton warfare in the service of cynical politics. Early steps were taken at the Hague Conferences of 1899 and 1907, where many of the smaller nations enjoyed their first taste of sovereign equality with the great powers. Although the nations participating in the Hague conferences did not propose any radical changes to the nation-state system, they nevertheless created an arbitration mechanism for resolving international disputes, and they called attention to the barbarities of war, committing themselves to meet regularly to make any further progress possible.

World War I broke out in 1914 and dimmed the hopes of the Hague system, but the war nevertheless produced a far more serious effort to curtail the use of force, namely the League of Nations. This international bureaucracy – the brainchild of President Woodrow Wilson, and forerunner of the modern United Nations – sought to guarantee peace through “collective security”: League members would rush to the aid of any other member victimized by aggression. Under League rules, feuding nations could not legally resort to war unless and until they submitted their disputes to adjudication. But the League suffered major problems, not least of which was the refusal of the United States to join it. Moreover, there remained the stark possibility that a member of the League would resort to war even after a peaceful adjudication, especially if the outcome were unfavorable.

Several nations attempted to cure this defect by entering the much-maligned Kellogg-Briand Pact of 1928, which “outlawed” war as a means for resolving international disputes. Contrary to the guffaws of its detractors, the Pact did not attempt to accomplish the ridiculous objective of eliminating all war; rather, the Pact and its preparatory materials renounced offensive wars, while simultaneously permitting defensive wars. A new litmus test for the lawful use of force thus emerged, one that turned on the distinction between aggression (impermissible) and defense (permissible). Henceforth, all aggressive uses of force were deemed illegal, regardless of the “justice” of the underlying cause. Whether a cause might be “just” ceased entirely to be the relevant inquiry – after all, war always appears justified to the people perpetrating it, so something more concrete than amorphous “justice” had to serve as the standard.

In flagrant disregard for this new standard, Japan, Italy, and Germany launched aggressive attacks on their neighbors during the 1930s in the name of “justice” and “national security,” but certainly not in self-defense. Japan invaded the Chinese province of Manchuria in 1931 and erected a puppet regime under the former Chinese emperor. From the very beginning, Japanese leaders portrayed this invasion as a noble mission to bring liberty and prosperity to the Manchurian people, who were suffering at the hands of rapacious warlords. Similarly, in 1935 Italy flexed its newfound muscle by unleashing tanks, planes, and poison gas on the people of Ethiopia, a massacre supposedly justified by Ethiopia’s persistence in archaic and evil practices such as slavery. As Italian head of state Mussolini phrased it:
The war which we have begun on African soil is a war of civilisation and liberation. It is a war of the people. The Italian people feels it as its own. It is the war of the poor, of the disinherited, of the proletariat. Against us are ranged the forces of conservatism, of selfishness, and of hypocrisy. We have taken on a hard fight against these forces. We shall continue this fight to the end.
And of course it cannot be forgotten that in 1938 Hitler used the language of justice to threaten war against Czechoslovakia on behalf of “oppressed” Germans living there, threats that produced the notorious appeasement by British Prime Minister Neville Chamberlain. In 1939, Hitler directed the same sanctimonious rhetoric at Poland in order to “protect” Germans in Danzig, only this time he confronted an uncompromising foe. In response, Hitler launched an aggressive war under the same mantle of justice, just as Japan and Italy had recently done, thereby breaking the back of the League of Nations and ushering in the bloodiest war ever.

At the end of the war in 1945, the United States resolved to play a more active role on the world stage and condemned such aggressive uses of force for all time. Towards this end, the United States spearheaded the war crimes tribunals in Nuremberg and Tokyo to punish the Axis powers for the incalculable harm that their aggression had wrought, thereby helping to frame the famous Nuremberg Principles that forbade and censured any such “Crimes Against The Peace” and their accompanying “War Crimes” and “Crimes Against Humanity.”** While serving as a prosecutor at Nuremberg, Supreme Court Justice Robert Jackson made it clear that modern nation-states would no longer tolerate aggressive warfare, no matter what justification was offered for it:
Repeatedly, nations have united in abstract declarations that the launching of aggressive war is illegal. They have condemned it by treaty. But now we have the concrete application of these abstractions in a way which ought to make clear to the world that those who lead their nations into aggressive war face individual accountability for such acts. . . .
We must make clear to the Germans that the wrong for which their fallen leaders are on trial is not that they lost the war, but that they started it. And we must not allow ourselves to be drawn into a trial of the causes of the war, for our position is that no grievances or policies will justify resort to aggressive war. It is utterly renounced and condemned as an instrument of policy. [italics added]
In addition to holding these tribunals, the United States demonstrated its commitment to international peace and security by inviting the Allies to San Francisco in order to establish the United Nations, whose founding Charter (a binding treaty) states in the preamble that its purpose is “to save succeeding generations from the scourge of war, which twice in our lifetime has brought untold sorrow to mankind.” To secure this purpose, Article 2(4) of the Charter places a blanket prohibition against the use of force:
All Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations.
So the United States, in unison with the vast majority of the other nations of the world, declared that the international use of force would thenceforth be presumptively illegal. Non-use of force was the rule; use of force was the exception. Nations could resort to force only in those few circumstances provided by the U.N. Charter and customary international law, as demonstrated below.

Individual nations therefore retained the inherent right to use force in their own defense or in the defense of others against armed attack, without the need for external approval. But in order to make the grave decision to initiate war, nations agreed that they would first seek a broad-based consensus among the fifteen member nations of the U.N. Security Council, five of whom are permanent members whose great-power status makes their unanimous consent absolutely necessary: the United States, the United Kingdom, France, China, and the Soviet Union (now Russia, who inherited this permanent-member status).***

____________________________________
* Principles governing the initiation of warfare fall under the generic heading of jus ad bellum, which differ from the principles governing the conduct of warfare once initiated (jus in bello). Strangely, while the rules for initiating warfare have grown more refined, the actual conduct of warfare has grown more barbarous. Ever since the French Revolution, war has ceased functioning as a limited dispute between governments and now functions as an unrestrained, cathartic bloodletting between entire peoples.

** It is crucial to note that under the Nuremberg Principles adopted by the International Military Tribunal, these three crimes hinged on international wrongdoing – there was no liability for purely domestic activities unconnected to illegal warfare, a limitation that has since evaporated.

*** Proving that even the most straightforward legal text buckles under the weight of power politics, the U.N. Security Council swiftly disregarded the requirement that all five permanent members give an affirmative vote for the use of force. When the Truman administration sought the Security Council’s approval (but not Congress’s) for the “police action” in Korea, the administration treated the Soviet Union’s abstention from voting as a silent approval (the Soviet Union had absented itself in a fit of anger over the fact that Taiwan rather than Communist China was on the Council at that time). This illicit practice has never died, and the Council now requires an affirmative veto to derail any such resolution.

Sunday, February 19, 2012

The Death Of Grammar

I was doing a news roundup this morning and ran into a brief column by former Congressman Lee Hamilton, who was questioning the value of presidential debates. Near the beginning of the column appeared these two sentences:

"There's no question debates have value. Structured properly, they make a candidate put forth ideas and give us a glimpse of how they behave under pressure."

The second sentence is highly problematic. When Hamilton writes "they make a candidate put forth ideas," he is using "they" to refer to debates. But Hamilton proceeds in the same clause to say that the debates "give us a glimpse of how they behave under pressure." Who is "they" this time? The only plural antecedents are debates and ideas, but Hamilton isn't referring to either of them. He is referring to the candidate. So not only has he switched the meaning of "they" midstream, but he also has misapplied it to a singular antecedent.

Sloppy language of that kind sounds like nails on a chalkboard to me, yet it infests all formal writing today. If Lee simply had used the plural "candidates," the sentence would have improved greatly, though it would still suffer from the abrupt switch of "they" to signify a different antecedent. Here is how Lee should have expressed himself:

"Structured properly, they make a candidate put forth ideas and give us a glimpse of how he behaves under pressure."

The sentence is now grammatically perfect, albeit politically incorrect. Politics seep into every nook and cranny of modern life, and even language cannot escape their noxious influence.

Thursday, February 16, 2012

Language

Microsoft Word recognizes the word "misogyny," but it labels "misandry" as a misspelling (indeed, the latter term has a red squiggly line under it right here on Blogger). How revealing of the times we inhabit. Take a gander at television for just half an hour and find out which of these two phenomena typifies modern discourse.

Well, Well, Well

More of those stubborn facts and people who will have to be snuffed out if we're going to save the planet:

Here, here, and here.

Saturday, February 11, 2012

Life In Missoula

In Spanish there's a saying, "dime con quién andas y te digo quién eres" (tell me who you walk with, and I'll tell you who you are). Very insightful, but I would offer the corollary of "dime dónde vives y te digo quién eres" (tell me where you live, and I'll tell you who you are).

I have lived in Missoula, Montana, for almost two years and have come to the realization that this place resonates with me far more than my home of Florida ever did.

For one thing, here I can enjoy a life of the mind like never before. As an attorney, I get to participate in appellate work, something I'm particularly good at but never had much access to in Florida because of the paranoid, guild-like protectionism of appellate attorneys there, who consider themselves experts in an area no ordinary attorney could fathom. Apart from ordinary work, I tutor two or three Spanish students at any given time and enjoy chatting with each of them in fun places such as cafés and wineries, and I have gained new friends who are from Latin America but have precious few Spanish speakers nearby for company (in Florida, Spanish speakers are as plentiful as palm trees). I spend a lot of time at the law school with young and intelligent attorneys-in-training, especially nowadays because I coach the international-law moot court team and get to discuss esoteric but crucial legal principles. There is chess, which I have studied and improved at since coming here, and which already has earned me several good friends. And there is my personal writing, some of which I do here and some of which I do on the side, but all of which allows me to express myself in a way that the demands of Florida life would not allow.

There is a sense of community here, not the alienating isolation of being swallowed by a big city. The friends I already mentioned were incredibly easy to meet, yet they have provided a level of warmth and camaraderie that no one in Florida ever matched, save for my childhood friends. Perhaps only one, or at most two, degrees of separation stand between Missoulians. If you mistreat someone here, it will come back to haunt you, as it should.

There are seasons that reflect life, as the splendor of spring and summer gives way to the gloom of autumn and winter, only to be reborn once again. My ancestry has re-emerged and allows me to face the bitter cold without any problem. Sometimes I leave the gym after swimming and walk through snow wearing flip-flops, a damp bathing suit and a pullover, and I feel just fine.

And there are the women. I've dated several here and have been with my current girlfriend for a few months. Each woman is unique, but they all share a quality extremely hard to find in Florida -- character.

Missoula is my new home, and it's a good fit.

Saturday, February 4, 2012

Environmentalism -- Part IV

A Global-Warming Bureaucracy Is Born

The international effort to rid the world of its eternal habit of climate change began in earnest at the 1992 Earth Summit, officially known as the United Nations Conference on Environment and Development. Politicians from more than one hundred fifty nations lounged in lush Rio de Janeiro to collaborate on how we the people should be allowed to pursue our happiness, such as the “proper” number of automobiles we drive; the kinds of fuels we use; the settings where we choose to live; and the industries in which we choose to work. Totalitarian ideology has rarely found so thorough an exposition as this, with reams of declarations operating on the unspoken presumption that the governments of the world have the unfettered right to micromanage human life. While most of this polluted verbiage amounted to mere political theater, the Earth Summit did produce a more noxious offspring: the United Nations Framework Convention on Climate Change (“UNFCCC”), a treaty calling for periodic such meetings into the future so as to devise ways and means of forcing us to obey the global-warming religion. Staying true to his amoral establishment credentials, President George H.W. Bush ratified the UNFCCC upon securing the Senate’s approval, likely because he saw political advantage in claiming that he was “doing something” about the environment.

Predictably, it did not take long for the perpetual UNFCCC gabfests to conjure up something even worse: the Kyoto Protocol of 1997. This proposed addition to the UNFCCC attempts to regulate and reduce the carbon[1] output of entire nations, specifically those few designated under Annex I to the UNFCCC who are wealthy enough to afford the honor (i.e., the ones who have benefited from freedom so much that they are now capable of paying to snuff it out). Over one hundred seventy nations have rushed forward to ratify this suicide pact, but with only forty-odd of them obligated to drink the Kool-Aid.

To its credit, the United States has thus far refused to join, despite furious denunciation from the decaying Western world. Canada recently grew a spine and withdrew in December of 2011. If the United States were to join, it would undertake to straitjacket American citizens so that our carbon-dioxide emissions are reduced to approximately 1990 levels (an entire generation prior). To accomplish that pointless and emasculating objective, the federal government would oversee our collective activities and submit a greenhouse-gas inventory each year to the U.N. in order to demonstrate Kyoto compliance. As an alternative to outright servility of this sort, the United States could opt for international welfare-statism by sending taxpayer money abroad to subsidize carbon reduction where it is cheaper to do so, thereby earning “carbon credits” from the bureaucratic benefactors at the UNFCCC. Better yet, the federal government might graciously allow us to purchase “carbon credits” from one another, so that citizens who need to emit more carbon dioxide than normally permitted could purchase that ability from citizens who emit less than their allotted portion.
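
To make the credit-trading idea concrete, here is a minimal sketch in Python of how such a trade might be tallied. The per-citizen cap and the emission figures are invented purely for illustration; the actual Kyoto mechanisms operate on national inventories and Annex I baselines, not individual allotments.

# Toy illustration of trading "carbon credits" under a hypothetical cap.
# All numbers are invented for illustration only.

CAP = 10.0  # hypothetical allotment per citizen, tonnes of CO2 per year

def credits_to_sell(emissions):
    """Unused allotment available for sale (zero if over the cap)."""
    return max(CAP - emissions, 0.0)

def credits_to_buy(emissions):
    """Extra allotment that must be purchased to stay within the cap."""
    return max(emissions - CAP, 0.0)

heavy_emitter = 14.0   # tonnes: must buy 4.0 credits
light_emitter = 7.0    # tonnes: has 3.0 credits to sell

print(credits_to_buy(heavy_emitter))    # 4.0
print(credits_to_sell(light_emitter))   # 3.0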

The UNFCCC is laying the foundation for a new protocol to succeed Kyoto, which expires this year, and it is apparent that the federal government plans to participate every step of the way. Having embraced the environmentalist religion, the federal government in February of 2007 signed the Washington Declaration with various other nations, agreeing to devise a mutually acceptable plan for imposing emissions controls on us. In June of 2007 the federal government announced that it was seriously considering a pact with Europe to cut our emissions in half by the year 2050. And in late 2007, a U.N. “climate conference” in Bali, Indonesia saw proposals for a global carbon tax, proposals that met with the approval of a former Vice-President of the United States.

Welcome to the future. Our experiment in limited government, which sprouted for one brief moment in history and has already suffered major setbacks, now faces oblivion as the United States marks the days until it adopts Kyoto or its kin as a pretext for massive new deprivations of life, liberty, and property. An American populace stripped of transcendence and schooled with the blunt implements of modern pedagogy stands spiritually and intellectually unequipped to do anything other than swallow environmentalism’s poetic lie. Mark Twain once observed that a lie can travel halfway around the world while the truth is putting on its shoes. Let us hope that the truth gets its shoes on quickly.
_______________________________________

[1] Technically, the Kyoto Protocol casts a net over multiple “greenhouse gases,” of which carbon dioxide is the most prominent. Meanwhile, the Kyoto Protocol ignores water vapor, by far the most abundant greenhouse gas of all, which is not man-made and therefore offers no political rewards.

Thursday, February 2, 2012

There Are No Atheists

"When people stop believing in God, they don't believe in nothing - they believe in anything." ~ G.K. Chesterton

Atheism is the denial of the existence of God, yet of all the self-proclaimed atheists strutting around today, I have never found one who meets this definition. What I find instead are people who have replaced one God with another. Nature abhors a vacuum, and humans abhor the vacuum created by their unique ability to ask questions for which there are no ready answers. Such is the lesson of Genesis, as eating from the tree of knowledge results in expulsion from the paradise of blissful ignorance.

Nobody embraces the vacuum. What people embrace instead is the next all-encompassing narrative -- i.e., the next God -- that their internal vacuums suction onto. As discussed in my recent posts, a popular God these days is environmentalism, which showcases all the standard features of a religion: abnegation, catechism, punishment of heretics, reliance on consensus, and imperviousness to reason. Gods can be as complex as political ideologies or as simplistic as the pursuit of entertainment and pleasure, though the latter often denotes a futile attempt to avoid tough questions and re-enter a state of infancy.

The issue is not whether to believe in God, but rather which God to believe in. All that has changed with modern society and "atheists" is that God now resides in matter -- government, flora and fauna, a bottle, or between the legs -- rather than in ideals. This is hardly progress and embodies a shabby form of paganism utterly lacking the majesty of Greece or Rome.