Who grows the most Thanksgiving foods these days?

Turkey, pumpkins, sweet potatoes, cranberries, apples, potatoes, green beans, and corn: Where did they originate and which countries grow ’em now? Gobble, gobble.


The United States is the world’s largest producer and exporter of turkey. Turkeys are indigenous to North America (specifically the forested regions of Mexico and the United States). These are the top five producing U.S. states today:

  1. Minnesota
  2. North Carolina
  3. Arkansas
  4. Missouri
  5. Virginia

Pumpkins, squash, and gourds form a collective category covering a wide range of cultivated items. Gourds tend to be Old World in origin — even the pre-Columbian American varieties either migrated across the Bering Strait land bridge from Asian origins or floated across the Atlantic from Africa. “Pumpkins” (the British colonial-era name for a bright orange type of squash) and squash in general are indigenous to North America; pumpkins have been cultivated in Mexico for millennia. Today, however, most of the world’s pumpkins, squash, and gourds come from major emerging-market producers of the Old World. Notably, no African country cracks the top five despite the inclusion of gourds, which are also very common across Asia:

  1. China
  2. India
  3. Russia
  4. Iran
  5. United States

Sweet potatoes (or yams) are sometimes substituted for pumpkin or squash at the Thanksgiving table, or served alongside them. Like ordinary potatoes, sweet potatoes were domesticated in South America. Remarkably, sweet potatoes made the jump to the Polynesian islands of the Pacific well before the Western arrival in the New World, strongly indicating that Polynesian explorers landed in pre-Columbian South America and returned home with the crop. This early start in Polynesia helped the sweet potato later become a major crop in nearby southeast Asia, including Indonesia. While China again tops the present-day producer list, this category is Africa’s moment to shine, as several African countries have incorporated yams firmly into their cuisines.

  1. China
  2. Tanzania
  3. Nigeria
  4. Uganda
  5. Indonesia

Cranberries remain strongly associated, in terms of production, with their natural homes in the United States and Canada. The early United States saw the conversion of the wild marsh crop (previously gathered by Native Americans and First Nations peoples) into a farmable wetland crop, which began to be exported all over the world, where it caught on. The Russian Empire, in particular, tried its own hand at cranberry production, and that legacy can still be seen in the runners-up.

  1. United States
  2. Canada
  3. Belarus
  4. Azerbaijan
  5. Latvia

Apples are one of the few foods commonly associated with modern Thanksgiving that did not originate in the Americas at all, with the exception of crabapples (which are generally not eaten). Wild apples come from Central Asia (including what is now western China), and a number of wild species have been domesticated and bred down into various edible selections. China is far and away the largest producer of apples in the world. The distant second-place United States — “as American as apple pie” — has had edible, domesticated apples for less than four hundred years, unlike most of the rest of the modern Thanksgiving foods. In fact, apples were not grown in New England until several years after the first Thanksgiving.

  1. China
  2. United States
  3. Turkey
  4. Poland
  5. Italy

Potatoes have become a global staple over the past several hundred years, but they originated in South America. Wild species can be found today from Chile to the United States, but domesticated potatoes all descend from a single strain in the Peru-Bolivia region, where they were domesticated many thousands of years ago.

  1. China
  2. India
  3. Russia
  4. Ukraine
  5. United States

Green beans (known elsewhere as string beans or snap beans) are native to Central and South America (domesticated in two separate locations) and were introduced to the rest of the world by Christopher Columbus on his return from his second voyage to the Americas. Today the top producers are:

  1. United States
  2. France
  3. Morocco
  4. Philippines
  5. Mexico

The United States is also, unsurprisingly, the world’s largest producer and exporter of corn (maize), but 97% of U.S. corn production is not for direct human consumption, going instead to various animal and industrial uses. Mexico is a big producer of white corn, particularly for use in tortillas and other staples of Mexican cuisine. Maize was domesticated from a single strain through careful breeding in Mexico many thousands of years ago, diversifying into several varieties, and became important to regional trade between indigenous groups. It remains North America’s largest grain crop, and human-directed genetic modification remains a major influence to the present day.

Statistical Data Sources: FAOSTAT (2013 top 5 producers data for each crop), AgMRC (Turkey and Corn)

Op-Ed | France and the West: Inconvenient Questions

This essay originally appeared in The Globalist.

January 2013: French troops being airlifted to Mali. (U.S. Air Force photo by Staff Sgt. Nathanael Callon)


Nothing can ever justify or excuse an act of terrorism against civilians. But that does not absolve us from truly comprehending the links between the foreign and military policy approaches pursued by Western governments and the reactions this generates.

The aftermath of a terrorist attack is an especially difficult time to ask difficult questions about strategy. But just as the United States has faced a lot of (justified) criticism for refusing to acknowledge the direct linkages between misguided interventions and blowback incidents, we cannot apply a different yardstick to France.

Watch for the warmongers

This is all the more critical as, in the wake of the events in Paris, there are those pundits and policymakers who are trying to let slip the dogs of war or beat the drums by defining the scourge of “radical Islam” and “homegrown terrorism” as the root of all evil.

If we should have learned one thing by now, it is that tough talk is not the same as serious, strategic policymaking. It is irresponsible to undertake foreign policies without accurately representing to the public the risks those policies will likely create for them.

As we assess the future approach, we must also take account of the role that Western governments have played in creating this catastrophe.

This applies especially to all those who glibly claim that ISIS “cannot be contained; it must be defeated,” as Hillary Clinton has just done.

Such an argument conveniently overlooks the fact that it was the U.S. government that inadvertently gave rise to this movement. Its decades of invasions and unpopular interference in the region ultimately culminated in the Pandora’s box war of choice in Iraq. Out of, and in reaction to, these policies grew al Qaeda and ISIS.

The advocates of such a strategy must also explain what can possibly be accomplished by responding with yet more force in an already war-torn region.

An eye for an eye strategy, while sounding principled, makes the whole world blind to the pitfalls such an approach has been triggering.

The French example

France can actually serve as Exhibit A of the pitfalls of a more “muscular” approach. The cruel attacks in Paris are demonstrably reactive in nature.

The unfortunate reality no one wants to discuss at the moment is that France’s Presidents Nicolas Sarkozy (2007-2012) and François Hollande (2012-present) have pushed the envelope for modern France by maintaining a highly aggressive military presence in majority-Muslim countries.

Not since perhaps the Algerian War has France meddled with, sent troops to or bombed so many predominantly Muslim regions in such a short span.

President Sarkozy led regime change in Libya by air campaign in 2011, at the nadir of his domestic popularity. We know what that resulted in. He did it for oil and whatever it was that Iraq War apologist Bernard-Henri Lévy promised him would transpire.

But his successor, President Hollande, went way, way farther — claiming, almost George W. Bush style, that he was fighting ‘them’ over there to protect France from terrorist attacks at home. This approach painted a much bigger target on France’s back.

Hollande’s misadventures

The Hollande record is this: First, he invaded Mali in January 2013, after it collapsed as part of fallout from the Libya meltdown. He did so purportedly to stop terrorism and prevent the creation of a terrorism launching pad near Europe (despite Libya being much closer and truly festering).

In December 2013, he then invaded Central African Republic to ‘save’ Christians from Muslim militias that had already been disbanded. (It did not help that French troops now implicated in widespread child abuse stood by as Christian militias mutilated Muslim civilians’ corpses in front of them.)

In May 2014, Hollande announced a large, permanent rapid strike force deployment to five “Sahel-Sahara” West African nations, all of which were majority or plurality Muslim. He sent jets to bomb Iraq in September 2014. Finally, a year later in September 2015 he sent jets to bomb Syria.

It is difficult to understand Hollande’s declaration that the November 2015 Paris attacks are an “act of war” by ISIS, in view of the reality that France has already been at war with ISIS for more than a year.

Note, too, that the United States was barely involved in half of those misguided efforts.

Whether or not it can match U.S. capacity, France is no longer a junior partner or even hapless “sidekick” to the United States’ mayhem. In that sense, Hollande has gone much further than Tony Blair ever did during the Iraq War episode. Blair restrained himself to just being a sidekick.

France under Hollande has turned itself into an active cyclone by pursuing a militarized foreign policy – a strategy that may prove self-defeating.

In and after Gulf War, US covered up troops’ nerve gas exposure

Late Baathist-era flag of the Republic of Iraq, 1991-2004.


Newsweek:

During and immediately after the first Gulf War, more than 200,000 of 700,000 U.S. troops sent to Iraq and Kuwait in January 1991 were exposed to nerve gas and other chemical agents. Though aware of this, the Department of Defense and CIA launched a campaign of lies and concocted a cover-up that continues today.
[…]
During January and February 1991, when the U.S. bombed Iraq’s weapons plants and storage sites, poisonous plumes floated across the desert to thousands of U.S. troops based on the Saudi border. Sirens wailed daily, but officers in charge announced that the chemical-detection alarms were faulty.

The U.S. government and military continue to obscure or misrepresent the scale of the lasting damage to veterans to this day, to say nothing of the very similar mishandling and cover-up of legacy chemical weapons disposal during the second Iraq War.

Ranked choice voting for statewide executives?

Below I’ll present passages explaining briefly what ranked choice voting is, followed by historical evidence for why the system might be particularly (perhaps uniquely) suited to the constitutional role of a statewide executive. The latter case is drawn from the constitutional debates that shaped the late-18th-century creation of a governorship model that spread from Massachusetts to the eventual 50 states (as well as influencing the original U.S. federal setup).


The New America Foundation and FairVote, June 2008, on what instant runoff ranked choice voting is, procedurally:

Instant runoff voting (IRV) is an election method that determines the choice of a majority of voters in a single round of voting without the need to conduct a separate runoff election. As a majority voting method, IRV is ideal for single-winner offices such as governor […] In recent years, IRV has been implemented for local elections in several American cities, including San Francisco (CA), Cary (NC), Hendersonville (NC), Takoma Park (MD), and Burlington (VT).

An explanation of mechanics and outcomes, also from FairVote:

Ranked choice voting (RCV) describes voting systems that allow voters to rank candidates in order of preference, and then uses those rankings to elect candidates able to combine strong first choice support with the ability to earn second and third choice support. RCV is an “instant runoff” when electing one candidate…
– Gives voters the option to rank as many or as few candidates as they wish without fear that ranking less favored candidates will harm the chances of their most preferred candidate.
– Empowers voters with more meaningful choice.
– Minimizes strategic voting.
[…]
In a single seat ranked choice voting election, sometimes called instant runoff voting, votes are first distributed by first choices. If no candidate has more than half of those votes, then the candidate with the fewest first choices is eliminated. The voters who selected the defeated candidate as a first choice will then have their votes added to the totals of their next choice. This process continues until a candidate has more than half of the active votes or only two candidates remain. The candidate with a majority among the active candidates is declared the winner.

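The elimination-and-transfer procedure FairVote describes above can be sketched in a short Python function. This is a minimal illustration of single-winner instant runoff under the stated rules, not any jurisdiction's certified tally procedure; last-place ties are broken arbitrarily here, and real election codes specify their own tie-breaking and exhausted-ballot rules.

```python
from collections import Counter

def instant_runoff(ballots):
    """Tally a single-winner instant runoff (RCV) election.

    ballots: a list of rankings, each a list of candidate names in
    preference order. Voters may rank as few candidates as they wish.
    Returns the winning candidate.
    """
    active = {c for b in ballots for c in b}
    while True:
        # Count each ballot toward its highest-ranked still-active candidate.
        # Ballots whose ranked candidates are all eliminated are "exhausted"
        # and drop out of the active-vote total.
        tallies = Counter()
        for ballot in ballots:
            for choice in ballot:
                if choice in active:
                    tallies[choice] += 1
                    break
        total_active_votes = sum(tallies.values())
        leader, leader_votes = tallies.most_common(1)[0]
        # Win condition: a majority of active votes, or only two remain.
        if leader_votes * 2 > total_active_votes or len(active) <= 2:
            return leader
        # Otherwise eliminate the candidate with the fewest first choices
        # and redistribute those ballots on the next pass.
        loser = min(active, key=lambda c: tallies.get(c, 0))
        active.discard(loser)
```

For example, with 40 ballots of just [A], 35 of [B, C], and 25 of [C, B]: no one has a majority in round one, C is eliminated, and C's ballots transfer to B, who wins 60-40 even though A led on first choices.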
Now let’s turn to a 1780 theory on the role of a statewide executive office from the constitutional debates of the State of Massachusetts, described and quoted by Eric Nelson in his 2014 book The Royalist Revolution: Monarchy and the American Founding (pp. 176-177):

But the most extraordinary response to the proposed constitution came from the town of Wells, far to the north of Massachusetts (in present-day Maine). Like the Groton committee, the drafters of the Wells report began by proposing that “the Governor might have a [full] Negative on all Acts of the Legislature.” “We think it very necessary,” they explained, “that the Independence of the Executive and Judicial Departments be well secured – Nor can We conceive of how this can be done effectually unless there be a Power lodged somewhere of negativing such legislative Acts as tend to destroy or violate this Independency–And We are clearly of the opinion that the Governor will be the most fit person to be intrusted with this Power; he being the first Magistrate and the Sole Representative of the whole Commonwealth.” But the authors then added an excursus that was entirely their own. The governor, they insisted, will constitute “the Center of the Union to all the several parts and members of the political Body; who is chosen and constituted by the whole Community to be in a peculiar manner the Guardian of the Constitution and of the Rights and Interests of the whole State–All the Individuals have a like Interest in him and stand in a like Relation to him as their common Representative.” […] “when we consider that the several Members of the Legislative Body are to be chosen only by particular Districts as their special Representatives and may not improbably be often chosen for the very purpose of serving and promoting such Views and Designs of their Constituents as would be injurious to other parts of the State,” the dangers of assembly government become perfectly clear.
It follows that “we cannot but think that the Representatives of the Whole People who can have no reason to act under the Influence of such partial Biases and Respects should be furnished with ample and Sufficient Powers to prevent effectively the pernicious Consequences of such narrow Policy, as is calculated to serve the Interest of one part to the injury of another who may happen not to have an equal Interest in the Legislature.”

The authors went on to explain that their heightened sensitivity to the dangers of legislative power, and the corresponding need to invest the chief magistrate with sweeping prerogatives, arose chiefly out of their experience of life on the periphery of a political community. “The distant parts” of the commonwealth, such as Wells, “may Scarce have a single Member to Speak and act on their Behalf” in the legislature, and, accordingly, the two chambers “may be prevailed upon to pass Bills injurious, oppressive and pernicious to a great part of the people.” […] But however estranged they might be from the metropolitan legislature, “we shall always have a Representative in the Person of our Governor, we may claim an equal Interest in him with the other parts of the State.”

In other words, rather than electing a statewide executive (whether the governor or further down the ballot) who effectively represents only the plurality of voters who voted for him or her — often mirroring the geographical distribution of the population itself as the legislature already does — an executive would be elected by the whole people in this system.

Each voter would have cast a vote for that governor, in effect, by ranking a list of candidates. The governor would likely be the first or second choice of a much broader range of people than under the current system. He or she would be accountable to and representing the whole of the state, not just the populous parts.

The War on Chronology

Donald Trump’s quote about George W. Bush was literally as simple as “The World Trade Center came down during his reign” — which is a statement of chronological fact, without even making a judgment upon its significance or lack thereof, yet establishment conservatives are furious about that.

This is emblematic of what we’re up against on a major scale: people who don’t just have an alternate worldview but an alternate view of chronological reality.

I’ve said this before but it bears repeating: So many points of “conventional wisdom” from the political and media establishment in Washington (including both sides of the aisle, but especially conservatives) fall apart when chronology is applied to the cause-and-effect claims they make. It’s not just “correlation is not causation” — it’s that they get the order of historical events consistently wrong in drawing broad conclusions about them. Everything becomes the fault of their opponents (whether on their own side or the other side) because they present the reaction to something as its historical cause.

13 of Truman’s 21 policy points from 1945 are relevant today

In an address to Congress just days after the celebration of V-J Day in the United States, President Harry S. Truman outlined what the country must do after World War II. Thirteen of those 21 policy points remain fully or significantly relevant in 2015, seventy years later.

harry-truman

“Special Message to the Congress Presenting a 21-Point Program for the Reconversion Period” – September 6, 1945
1. Unemployment compensation
2. Fair Labor Standards Act

5. Full Employment
6. Fair Employment (non-discrimination)
7. Harmonious Industrial-Labor relations
8. Job creation for returning veterans and in regions where job opportunities are scarce
9. Sustainable agriculture

11. Housing for all (urban and rural) and socially responsible city planning
12. Support for research (academic, industrial, governmental)
13. Responsible tax policy (matching revenues to expenditure needs, balancing burden distribution)

15. Support for small business
16. Support for returning veterans in all arenas of life (GI Bill and health care)
17. Investment in public works and conservation of national resources

(These points are all elaborated in greater detail at the link above to the full speech. The points not included all relate more specifically to the World War II situation itself or its immediate aftermath.)

Complicated former longtime president of Benin dies

February 2006 Photo: President Mathieu Kérékou (right) of Benin receives Brazil's president.


One of Africa’s most unusual and complicated leaders — Pastor Mathieu Kérékou of Benin — has passed away at age 82. The former radical military dictator and later civilian democratic president led Benin through several major transformations in its history, eventually earning him the surprising nickname “father of democracy.” BBC News:

Mr Kerekou had two spells as president totalling nearly 30 years, first coming to power as the head of a Marxist regime in 1972.

But he then accepted the idea of multi-party democracy and organised elections, which he lost in 1991. […]
He stepped down in 1991 after losing to Nicephore Soglo in a multi-party poll, but returned to power in 1996 having beaten Mr Soglo at the polls and then went on to win a second and final five-year term in 2001.

From 1972 to 1991, Kérékou served as the country’s military president, pursuing a radical new nationalism in his first two years and then a hybrid of nationalism and revolutionary Marxism-Leninism, backed by the Soviet Union. Much of it was marked by totalitarian violence and incompetent policy management. Over the course of his first presidency, the economic doctrines would grow less and less radically leftist and more moderate, eventually moving even to the center-right by the late 1980s.

During the early period, however, he renamed the country from Dahomey to Benin, in an effort to shed the French colonial legacies and avoid favoring one ethnic group over another, although both labels applied to pre-colonial African states in the area. Eventually, after facing down many coup attempts and amid growing economic stagnation and political unrest, he realized that his days were probably numbered if he clung to power — particularly with the Soviet Union’s fading influence and then disintegration — so he accepted a transition to multi-party democracy when it was demanded by a 1990 National Conference to fix the unraveling domestic situation.

Perhaps most importantly, however, Kérékou did not fight or cancel this transition when it became clear he would not be kept in power democratically, and he gracefully exited the political stage, even asking for forgiveness on national TV for whatever errors and crimes his regime had committed. He was permitted to remain president (albeit with an outside prime minister) through the 1991 elections, which he contested but lost by a landslide. Benin’s 1991 election became sub-Saharan Africa’s first successful direct handoff of power by free election since the end of colonialism. This peaceful and stable transition likely helped spark or reinforce the coming wave of democracy in West Africa during the 1990s.

The onetime Marxist and atheist (rumored possibly also to have dabbled in Islam) staged an impressive comeback one term later, in 1996, this time as an evangelical Christian pastor, to become the second civilian president of Benin. This comeback set its own precedent, whereby former African military rulers would rehabilitate themselves as wise and experienced civilian candidates for the offices they once held by force.

Kérékou served two five-year terms as a civilian, from 1996 to 2006, before retiring again. Announcing, in 2005, his planned departure from the presidency per the constitutional term limits, Kérékou explained that a lifetime of high-level service had taught him one lesson many times: “If you don’t leave power, power will leave you.” Once again, he was strengthening democracy in Benin and the region.

His successor, President Thomas Boni Yayi, now nearing the end of his own second term, had widely been rumored to be considering trying to remove the term-limits provision, but he seems to have bowed earlier in 2015 to similar pressure to leave power before it leaves him. That decision was likely reinforced by the 2014 revolution in Burkina Faso over an attempt to lift presidential term limits and by the chaotic political violence in Burundi after its president sought a third term on a technicality. For now, the unexpected legacy of Kérékou, born-again democrat rather than totalitarian dictator, will live to see another day.