Abolition of Russian serfdom vs Abolition of US Slavery

It seems like the emancipation of 23 million serfs in Russia in 1861 was a lot better organized and planned out than the emancipation/abolition of U.S. slaves during the American Civil War happening at the same time. In part, this difference would likely have stemmed from the fact that the Imperial Russian government could act by fiat and receive compliance. Moreover, the serf-holding landowners in Russia were way more indebted/obligated toward their government (than the already literally rebelling Southern American slaveholders) and thus couldn’t resist such a decision from the central government.

But, more importantly, the committee that planned the Russian emancipation also did a lot of theorizing on how to handle emancipated serfs in a manner that didn’t trap them on old lands and gave them some economic opportunities. Freed serfs didn’t exactly get 40 acres and a mule either — and it was still a pretty bumpy outcome — but it was a lot closer to a comprehensive and effective dismantling of the system in a responsible manner. The U.S. approach seems to have ended up at “you’re free now, problem solved. ok, next thing on the agenda,” which immediately led to slavery-by-another-name practices like abusive sharecropping contracts.

President Lincoln was elected by a pro-abolition party (even though that wasn’t personally his primary or even secondary campaign plank). Many of his generals repeatedly tried to brainstorm and implement measures — such as the aforementioned, abortive 40-acre land grant proposal — to deal with the slaves encountered in the South while suppressing the rebellion, and he objected to all of them. So obviously, in spite of (and because of) the Civil War going on at the time, a lot of people in the United States were thinking about this issue on some level.

I would have hoped somebody in the Republican Party or government or military would have at least had a working group on implementation of abolition. After all, this wasn’t a foreign concept because the northern states already had plenty of experience with dismantling their slave-inclusive economies with relatively minimal disruption. Yes, they consistently had fewer slaves, but they still figured out something that worked. So the information and ideas needed to plan for this eventuality — foreshadowed as early as the Constitutional Convention of 1787 — should absolutely have been there by 1861.

But instead, U.S. abolition was implemented chaotically and indecisively over the 1860s, with little plan for what to do with/for all the freed people, and with little enforcement (especially after the removal of Federal troops at the end of Reconstruction) to prevent abuses.

Letter: Discrimination in St. Patrick’s Day parade

I submitted this letter to the editor of the Boston Globe last week — I don’t think they published it — regarding this story, which has been brewing for quite some time.

It’s absurd in 2014 that the organizers of the Boston St. Patrick’s Day Parade are still trying to block open LGBTQ participation. The day is an annual cultural and community tradition, and one certainly long-separated from any religious aspect. The parade aims to celebrate one of Boston’s communities – Irish-Americans – that, like every ethnic community, has LGBTQ members within it. The organizers are telling their own community that not everyone is welcome to be proud of their ethnic heritage. LGBTQ people have always been with us and aren’t going away. Every poll indicates that’s ok with about 9 in 10 Bay Staters. Rather than representing an integral part of Boston, the parade organizers have proven themselves deeply unrepresentative. (As an Irish-American, I certainly don’t feel represented by them.) Instead, they prattle on about “wrong messages” like it’s 1980. Get with the times. If you’re going to host a public parade on city streets, with city facilitation, then no discriminating against any of the city’s residents. The organizers should be ashamed of themselves.

While it’s perhaps not my primary self-identity, I am, in fact, old-school Irish-American. Pre-Potato Famine. My first Irish ancestor arrived during the American Revolution to help fight the British, who were still repressively occupying Ireland at the time. He fought in the Battle of Bennington in upstate New York. I don’t take kindly to people trying to suppress other people’s freedoms and identities, particularly when it’s coming from Irish-Americans, who’ve faced their share of terrible discrimination and should do better.

American History: A bloody coup in the U.S.

One of the reasons we have a month set aside to celebrate and remember Black History is because unfortunately the teaching of American History tends to leave it out the rest of the time (which is also why there’s never been a need for a White History Month). However, just because it isn’t taught doesn’t mean it’s unimportant or that it doesn’t count. Here’s one historical event I want to talk about today because — sad to say — I only just recently learned of it myself.

Did you know there was once a bloody coup d’état within the borders of the continental United States?

Replacing the War Powers Act

Senators Tim Kaine (D-VA) and John McCain (R-AZ) want to get rid of the War Powers Act — slogan: “Consistently Ignored by Presidents Since 1973!” — and replace it with something that might actually work and better reflect the realities of U.S. military operations today. Here’s the Wikipedia summary of the existing law, which is officially called the “War Powers Resolution of 1973”:

The War Powers Resolution requires the President to notify Congress within 48 hours of committing armed forces to military action and forbids armed forces from remaining for more than 60 days, with a further 30 day withdrawal period, without an authorization of the use of military force or a declaration of war. The resolution was passed by two-thirds of Congress, overriding a presidential veto.

The failed presidential veto was by Richard Nixon, the year before his resignation. Congress was responding to significant public outrage about the secret, unauthorized bombings in Cambodia during the Vietnam War — a war which, while authorized by Congress, had never been formally declared. (In fact, the last formal Declaration of War was part of World War II.)

Although it’s no surprise that Nixon rejected the legitimacy of the law — given his unusually heightened aversion to the legitimacy of applying any law to the U.S. Presidency — every president since then (except for possibly one incident in 1975 under President Ford, who had fairly recently been elevated directly from and by the legislative branch to the White House via the resignations of Spiro Agnew and Richard Nixon) has also officially refused to acknowledge its constitutionality as a general principle.

Even so, to be on the safe side, presidents have generally unofficially adhered to it by providing the proper notice to Congress more or less as a “courtesy” without acknowledging the resolution as the reason. A few instances are disputed as to whether this notice was provided. Congress has never been able to successfully enforce the resolution or end any conflicts with it, and the Supreme Court won’t get into the middle of that inter-branch fight.

Tim Kaine essentially feels this situation is absurd, as well as out of date, and he wants a compromise that preserves the ability of the executive to act quickly when necessary but also preserves the rights of Congress to have a say and maintain accountability. From the ThinkProgress article (linked above):

Rather than only having to notify Congress after launching military action, Kaine and McCain want to force presidents to consult with legislators prior to sending U.S. soldiers, sailors, and pilots into harm’s way.

Under current law, the president has to notify Congress whenever placing forces in areas where “imminent” hostilities are likely, and is given a sixty-day window to conduct the operation absent Congressional approval, with another thirty days allotted for withdrawal. The new proposal would reduce that autonomy, requiring the Executive Branch to “consult with Congress before ordering deployment into a ‘significant armed conflict,’ or, combat operations lasting, or expected to last, more than seven days.”

That provision would exclude humanitarian missions and covert operations, and the initial consultation could be deferred in time of emergency, but must take place within three days after. The legislation would also raise a new joint committee composed of the heads of the Armed Services, Foreign Relations, Intelligence, and Appropriations in both Houses of Congress “to ensure there is a timely exchange of views between the legislative and executive branches, not just notification by the executive.”

Finally, the law, if passed and signed, would require a vote in Congress in support of or against any military operation within 30 days.

Now is a relatively good time to try to introduce such a revision, not too long after an angry Congress (and a well-timed revolt in the UK parliament) managed to talk down the Obama Administration from launching a major air campaign in Syria, proving that Congress still had at least a shred of influence on U.S. military actions after more than two decades of rubber-stamping.

But, in 2008, the Obama Campaign more or less signaled their opposition to a similar proposal. While unfortunate, this is not a huge surprise. Most presidents (or presidential hopefuls) reject out of hand any legal limitations on their powers as “commander-in-chief,” despite the Constitution’s specific and intentional provision reserving the power to declare war to Congress (a power previously wielded only by the monarch heads of state in the Europe of the day, against which the Framers were comparing their system). President Obama doesn’t want to limit his own power (or that of his successors) to act decisively and quickly in the face of the “unknown unknowns,” as former Defense Secretary Donald Rumsfeld famously called them.

Non-state surveillance

In an op-ed in the NY Times Sunday Review, Jeffrey Rosen discusses James Madison’s views on privacy and surveillance. In particular, Rosen argues that Madison made a slightly odd distinction between government invasions of privacy (which he wanted restricted) and the same from businesses or other people (which he didn’t really care about much). Then Rosen asks whether that distinction is valid or even still up to date.

In practice, the neo-Madisonian distinction between surveillance by the government and surveillance by Google makes little sense. It is true that, as Judge Pauley concluded, “People voluntarily surrender personal and seemingly private information to trans-national corporations which exploit that data for profit. Few think twice about it.”

But why? Why is it O.K. for AT&T to know about our political, religious and sexual associations, but not the government?

[…]

That distinction is unconvincing. Once data is collected by private parties, the government will inevitably demand access.

More fundamentally, continuously tracking my location, whether by the government or AT&T, is an affront to my dignity. When every step I take on- and off-line is recorded, so an algorithm can predict if I am a potential terrorist or a potential customer, I am being objectified and stereotyped, rather than treated as an individual, worthy of equal concern and respect.

Justice Louis Brandeis, the greatest defender of privacy in the 20th century, recognized this when he equated “the right to be let alone” with offenses against honor and dignity.

But he also lamented that American law, unlike European law, was not historically concerned with offenses against what the Romans called honor and what in more modern terms we call dignity. European laws constrain private companies from sharing and collecting personal data far more than American laws do, largely because of the legacy of Madisonian ideas of individual freedom, which focus on liberty rather than dignity.

What Americans may now need is a constitutional amendment to prohibit unreasonable searches and seizures of our persons and electronic effects, whether by the government or by private corporations like Google and AT&T.

Europe is way more aggressive about trying to curb private amassing of data. Meanwhile, both the U.S. government and private mega corporations — aided by the gushing of the American media — are pitching the concept of “big data” as a godsend and cure-all, thus necessitating mass collection and indefinite storage of data. Can’t throw all the data points in the data stew if you haven’t held on to all of them, the logic goes.

And the article raises a fair question. The phone company and the internet businesses are freely allowed to know all our private information (and movements and habits). Yet the government is supposed to follow various restrictions under the Bill of Rights — but why? Why don’t the protections extend to private corporations? We’ve seen time and again that they willingly turn over all their data for “national security” and “public safety” reasons, sometimes without even being asked through a court order.

Our government need not construct a surveillance state unconstitutionally when corporate America will do it for them.

Addendum: On a partially related note, I highly recommend this article by Virginia Eubanks in The American Prospect: “Want to Predict the Future of Surveillance? Ask Poor Communities.”

Marginalized groups are often governments’ test subjects. Here are a few lessons we can learn from their experiences.

Repeating Collective Failure, Long After the Great War

Almost a century after the start of World War I, Italy is still recovering bodies of those killed in action high in the Alps. Starting in the 1990s, the Earth’s mounting temperatures melted enough ice to free some of those long-frozen souls.

In recent weeks, Britons got to read in their newspapers a war of words between Education Secretary Michael Gove and actor Sir Tony Robinson, over the latter’s TV representation of the first world war as a colossal, tragic mistake.

Sadly, that was indeed a fairly accurate summary of a war that began almost accidentally and rapidly drew in every European country, including those that had nothing to do with its origins.

A local assassination, excessive hubris, illogical military plans and a general unwillingness to stop a war’s wheels from grinding into action let things get out of control faster than any diplomats could rein them in – even if they wanted to.

Soon, officers were ordering wave after wave of young men into barbed-wire-tangled moonscapes, as machine guns raked across their ranks and shells exploded around them. The metric for victory became a few feet of meaningless dirt.

It is a cliché to note that the “War to End All Wars” was certainly far from the last conflict, but it seems to have become accepted wisdom that no countries could be so foolish a century later as to initiate a cascade of mistakes on that scale.

The irony, of course, is that the recently recovered Austrian and Italian bodies from the mountain front were likely only disgorged due to the melting of glaciers and once-permanent snow packs as a result of man-made global warming. Will unrestrained climate change be 2014’s tragic answer to the epic, collective failure of 1914?

The phenomenon has until recently been, in effect, a slow-motion collision of the different economic plans of nations everywhere. Our diplomats had more warning this time – but again had no support from their home governments to negotiate a solution that might head off the impact.

Every vanishing glacier that once served millions with drinking water now serves only as a catalyst for more squabbling over limited resources. Every new factory in one nation must be answered with a factory in its competitor. There is no partial mobilization of resources when economic primacy is at stake.

The world’s marginal places – the societies literally living on the margin between existence and extinction from one harvest to the next – are finding themselves drier and more prone to catastrophe than ever. They are an ecological and human powder keg that rivals last century’s Balkans.

The rapidity of South Sudan’s recent collapse – or that of nearby Central African Republic – or northern Mali in 2012 – even the wheat-driven Arab Spring – should be seen as a bigger warning of what is yet to come than any anarchist bomb or gunshot.

This warming is upon us and we are its primary cause. We can ignore the signs until an avoidable global tragedy is fully unleashed once more or we can commit our diplomats, strategists and resources to a collaborative counter-effort that will benefit all mankind.

This summer, as Europe swelters through commemorations of the Great War, we should heed the heavy cost of 1914’s chain of errors or past will again be present.

This essay originally appeared in The Globalist.

Communism, political Islam, and resistance

Yesterday in a post about the Uighurs wrongfully detained in Guantanamo Bay for over a decade, I made an allusion to U.S. Cold War policies toward anti-colonial movements in developing nations, which I want to explore further today:

And the United States in particular needs to stop lumping together every rural Muslim male with a gun as an “Islamic terrorist.” It’s not a helpful approach to the conflicts from southeast Europe to northwest China and everywhere south of that (including much of Africa now). It’s just as bad as our refusal to make nuanced distinctions among different Communist-affiliated nationalist independence movements in Africa and Asia during the Cold War.
[…]
We’ve heard this before, after World War II, when the United States decided to fight pro-American independence groups like the Viet Minh because of their Communist alignment, instead of embracing fellow anti-colonialists.

In hindsight, many American intellectuals can see that there were alliances to be made, which we rejected to our detriment. But even today, a lot of people — particularly on the conservative end of foreign policy — are still refusing to make the distinctions.

We saw this come up a few weeks ago in discussions over the legacy of Nelson Mandela. The Cold War mentality produced a lot of tangled, unlikely, and unfortunate alliances all around, for both the Americans and the Soviets. Nelson Mandela, while not a communist himself, was willing to work with communists who shared his core principles of anti-colonialism, anti-imperialism, and often (but not always) a unitary nationalism.

The United States ought to have joined in that approach, rather than siding with the crazed white supremacist regime that Mandela was resisting. But we couldn’t see past the communist involvement in the movement.

Why communism?

The reality of the situation in much of the developing world after World War II was that the communists were often the best organized members of the wider nationalist independence movements in those nations. And their affiliation with the independence movements — while sort of contrary to original Marxian views that all borders, boundaries, and national divisions were lies intended to divide the global working classes and turn them against each other — was not a huge surprise, given the influences of Lenin on 20th century communism around the world.

Lenin, the founding father of the Soviet Union, certainly had a range of terrible policies as a leader. But before he became a dictator, he was a key contributor to communist theory.

Perhaps his biggest contribution was toward explaining the merger of colonial economics and political imperialism. When other thinkers were still purely obsessed with overthrowing industrial governments in Europe, Lenin was actually raising pointed criticisms of colonialism and explaining how the extractive relationship was harming poor and working class people both in the colonized and colonizer nations.

(The low-cost or enslaved labor of colonized nations, Lenin felt, had made possible the manufacture of so many cheap goodies back home in the colonial powers that The People in those countries were content enough not to rise up. He further argued that colonialism was just one tool of a global financial system geared toward big business capitalism and that it provided the cheap, raw resources necessary to fuel the wealth accumulation of the industrialists’ growing mega-companies. Lenin wanted to cut the legs out from under the big capitalists and spur their European workers to rise up, and colonialism was impeding both goals.)

So the Soviet Union, despite what was essentially its own colonization of the outer periphery of the former Russian Empire, ended up expending a lot of energy and resources in aiding independence movements in developing regions. The deal was that you would get help in exchange for becoming a communist and pledging to support the Soviet Union.

And so it was that many activists in the developing world joined communist movements out of a sense that Lenin’s theories and Soviet assistance offered the best route to political and economic independence for their home countries.

More broadly, they also saw communism as a way toward a more egalitarian and inclusive society than the divide-and-conquer political strategies of the occupying powers and the economic inequality they were fostering as they created ruling elites, whether white or native.

The Soviet Union, while actually quite socially progressive compared to much of the West at the time, was of course deeply flawed in many ways and extremely brutal at times.

But from the perspective of someone already living under a brutal, unequal, and impoverished colonial occupation (or post-colonial system like the apartheid regime in South Africa), it makes sense to consider communism as a way out.

Communism was offering colonized and occupied peoples self-determination, a path toward industrialization, and the promise of widely distributed and improved living standards that were probably higher than what colonialism and apartheid were offering. Even if the improvements communism could achieve might be less than what well-regulated and politically free market-capitalism might have been able to achieve, neither of those — simply put — was on offer under colonial and white supremacist rule. So communism would have looked pretty good at that point.

Rebuffing potential friends

Even so, many communists in emerging nations during their independence and early post-colonial movements actually tried to befriend the United States because they saw it as a freer alternative to following the Soviet model and they believed Americans — who had thrown off mercantilist colonial rule first and held certain truths to be self-evident for all men — would be sympathetic to the struggles of people who wanted freedom from colonialism, national independence, and upward social mobility for all.

Unfortunately, Americans were often too blinded by ideological taxonomies — and, of course, concerns over maintaining business interests of American multinational firms in developing nations. This resulted in classifying everyone as Red or Not-Red, even if that meant opposing friendly movements that identified as predominantly communist during their resistance phase against oppressive colonial and post-colonial regimes.

It also often meant supporting brutally undemocratic regimes that happened to identify as anti-communist, usually because that country’s main national opposition was communist or because pitching one’s self as a guardian of American business and political interests was a convenient means of acquiring military aid to suppress one’s own population.

Nelson Mandela wasn’t communist himself. But in the communists, he saw brothers-in-arms who shared many of the same beliefs and were willing to help oppose the apartheid regime of the Afrikaner white minority rulers. The United States was ambivalent toward the regime at the best of times and actively refused to oppose the apartheid government at the worst of times.

Rather than criticizing these past associations as we consider the passing of Mandela, Americans should reflect on how ideological blinders have warped our global relations in past eras and what we can do to ensure we are helping the right people and not helping the wrong people in the future.

The more we provide help to those who need it and the less we offer aid and comfort to oppressive regimes, the less likely people will be to join radical and violent movements or to associate with movements and ideologies we consider harmful.

The United States must lead by example, not lecture, and must help economically and politically oppressed populations wherever we can.

The challenge today

With the end of the Cold War and the fizzling of many of the remaining “communist” movements in the developing world, “communism” is no longer the source of dangerous American foreign policy conflations. These days, as I suggested in my earlier post, the United States needs to do a better job of making distinctions between resistance movements that use political Islam as a convenient and unifying force against their oppressors or poor living conditions (but which do not pose a threat to the United States and likely don’t even oppose us) and those movements that use an extremist twisting of Islam as part of a delusional plan for a “global caliphate” or whatever.

The latter are angry, unemployed young men who have heard too many conspiracy theories explaining their circumstances and just want to watch the world burn. The former are also dissatisfied with present conditions and see the organized structure and shared identity of political Islam as a means of reorganizing a society away from corrupt and failed rule that benefits a few and toward a system that distributes benefits to the needy and provides basic social services as well as law and order. Some in that category also see a need to incorporate the militarism of early Islam as a motivating force to overthrow the status quo whether it derives from bad local/domestic leadership, a distant and unrepresentative central government (as in Russia or China), or foreign occupation. But this doesn’t automatically mean anti-Western/anti-American views. It’s just one type of response to local conditions.

We shouldn’t be the arbiters of the right to armed resistance against oppression, but we do need to recognize who is resisting their oppressive local circumstances — poverty, corruption, occupation, inequality, dictatorship — and isn’t just trying to burn down everything for the sake of it. Between alienating moderate Islamist political parties and frequently blowing up civilians accidentally in predominantly Muslim nations because they might be near someone with a gun, we’re doing pretty badly on that front right now. And again, we don’t need to arbitrate the issue of whose armed resistance is most legitimate if we are pursuing policies that support liberation of all oppressed populations and encourage non-violent solutions.

Unlike with the past “threat” of “global communism,” there are far more people today who identify as Muslim than ever identified as communist at that ideology’s peak. Learning to employ and display a nuanced understanding of who the real enemies are — the dangerous radicals who seek global revolution, chaos, and general violence — will be crucial to earning trust and good will from a large portion of the planet, whether they live under oppressive or free governments.