Saturday, August 29, 2009

The JvL Bi-Weekly for 083109

LAST ISSUE OF THE JvL BI-WEEKLY

(It will be replaced by Occasional Political Papers)

(See A Special Announcement for The JvL Bi-Weekly for an explanation)

I can be most easily reached through the following email address for suggesting new additions to the subscription list or to cancel your subscription to the Bi-Weekly:

channujames@yahoo.com

The Blog Address for the Bi-Weekly is: http://jvlbiweekly.blogspot.com

Please forward the Blog address for the Bi-Weekly to any who might be interested

Monday, August 31st, 2009

Volume 8, No. 16

4 articles, 33 pages

1. Latin American Social Movements in Times of Economic Crisis

2. Obama on Drugs: 98% Cheney?

3. Health Care Stirs up Whole Foods CEO John Mackey

4. The Collapse Gap

1. LATIN AMERICAN SOCIAL MOVEMENTS IN TIMES OF ECONOMIC CRISIS

BY

JAMES PETRAS

The most striking aspect of the prolonged and deepening world recession/depression is the relative and absolute passivity of the working and middle classes in the face of massive job losses, big cuts in wages, health care and pension payments, and mounting housing foreclosures. Never in the 20th or 21st century has an economic crisis caused so much loss to so many workers, employees, small businesses, farmers and professionals with so little large-scale public protest.

To explore some tentative hypotheses of why there is little organized protest, we need to examine the historical-structural antecedents to the world economic depression. More specifically, we will focus on the social and political organizations and leadership of the working class, the transformation of the structure of labor and its relationship to the state and market. These social changes have to be located in the context of the successful ruling class socio-political struggles from the 1980’s, the destruction of the Communist welfare state, and the subsequent uncontested penetration of imperial capital in the former Communist countries. The conversion of Western Social Democratic parties to neo-liberalism, and the subordination of the trade unions to the neo-liberal state are seen as powerful contributing factors in diminishing working class representation and influence.

We will proceed by outlining the decline of labor organization, class struggle and class ideology in the context of the larger political-economic defeat and co-optation of anti-capitalist alternatives. The period of capitalist boom and bust leading up to the current world depression sets the stage for identifying the strategic structural and subjective determinants of working class passivity and impotence. The final section will bring into sharp focus the depth and scope of the problem of trade union and social movement weakness and their political consequences.

History of Economic Depression and Worker Revolts: US, Europe, Asia and Latin America

The social history of the 20th and early 21st centuries' economic crises and breakdowns is written large with working class and popular revolts, from the left and right. During the 1930's the combined effects of the world depression and imperialist-colonial wars set in motion major uprisings in Spain (the Civil War), France (general strikes, Popular Front government), the US (factory occupations, industrial unionization), El Salvador, Mexico and Chile (insurrections, national-popular regimes) and in China (communist/nationalist, anti-colonial armed movements). Numerous other mass and armed uprisings took place in response to the Depression in a great number of countries, far beyond the scope of this paper to cover.

The post-World War II period witnessed major working class and anti-colonial movements in the aftermath of the breakdown of European empires and in response to the great human and national sacrifices caused by the imperial wars. Throughout Europe, social upheavals, mass direct actions and resounding electoral advances of working class parties were the norm in the face of a ‘broken’ capitalist system. In Asia, mass socialist revolutions in China, Indo-China and North Korea ousted colonial powers and defeated their collaborators in a period of hyper-inflation and mass unemployment.

The cycle of recessions from the 1960’s to the early 1980’s witnessed a large number of major successful working class and popular struggles for greater control over the work place and higher living standards and against employer-led counter-offensives.
Economic Crises and Social Revolts in Latin America

Latin America experienced similar patterns of crises and revolts as the rest of the world during the World Economic Depression and the Second World War. During the 1930's and 1940's, aborted revolutionary upheavals and revolts took place in Cuba, El Salvador, Colombia, Brazil and Bolivia. At the same time 'popular front' alliances of Communists, Socialists and Radicals governed in Chile, and populist-nationalist regimes took power in Brazil (Vargas), Argentina (Peron) and Mexico (Cardenas).

As in Central and Eastern Europe, Latin America also witnessed the rise of mass right-wing movements in opposition to the center-left and populist regimes in Mexico, Argentina, Brazil, Bolivia and elsewhere – a recurrent phenomenon overlooked by most students of ‘social movements’.

The phenomenon of 'crisis' in Latin America is chronic, punctuated by 'boom and bust' cycles typical of volatile agro-mineral export economies and by long periods of chronic stagnation. Following the end of the Korean War and Washington's launch of its global empire building project (mistakenly called 'The Cold War'), the US engaged in a series of 'hot wars' (Korea, 1950-1953, and Indo-China, 1955-1975), overt and clandestine coups d'etat (Iran and Guatemala, both in 1954) and military invasions (Dominican Republic, Panama, Grenada and Cuba), all the while backing a series of brutal military dictatorships in Cuba (Batista), the Dominican Republic (Trujillo), Haiti (Duvalier), Venezuela (Perez-Jimenez) and Peru (Odria), among others.

Under the combined impact of dictatorial rule, blatant US intervention, chronic stagnation, deepening inequalities, mass poverty and the pillage of the public treasury, a series of popular uprisings, guerrilla revolts and general strikes toppled several US-backed dictatorships, culminating in the victory of the social revolution in Cuba. In Brazil (1962-64), Bolivia (1952), Peru (1968-74), Nicaragua (1979-89) and elsewhere, nationalist presidents took power, nationalizing strategic economic sectors, re-distributing land and challenging US dominance. Parallel guerrilla, peasant and workers movements spread throughout the continent from the 1960's to the early 1970's. The high point of this 'revolt against economic stagnation, imperialism, militarism and social exploitation/exclusion' was the victory of the socialist government in Chile (1970-73).

The advance of the popular movements and the electoral gains, however, did not lead to a definitive victory (the taking of state power) except in Cuba, Grenada and Nicaragua, nor did it resolve the crisis of capitalism (the key problem of chronic economic stagnation and dependence). Key economic levers remained in the hands of the domestic and foreign economic elites, and the US retained decisive control over Latin America's military and intelligence agencies.

The US-backed military coups (1964/1971-76), US military invasions (Dominican Republic 1965, Grenada 1983, Panama 1990, Haiti 1994 and 2005), surrogate mercenaries (Nicaragua 1980-89) and right-wing civilian regimes (1982-2000/2005) reversed the advances of the social movements, overthrew nationalist/populist and socialist regimes and restored the predominance of the oligarchic troika: the agro-mineral elite, the 'Generals' and the multinational corporations. US corporate dominance, oligarchic political successes and pervasive private pillage of national wealth accelerated and deepened the boom and bust process. However, the savage repression which accompanied the US-led counter-revolution and restoration of oligarchic rule ensured that few large-scale popular revolts would occur between the mid-1970's and the beginning of the 1990's – with the notable exception of Central America.

Civilian Rule, Neo-liberalism, Economic Stagnation and the New Social Movements

Prolonged stagnation, popular struggles and the willingness of conservative civilian politicians to conserve the reactionary structural changes implanted by the dictatorships hastened the retreat of the military rulers. The advent of civilian rulers in Uruguay, Brazil, Chile, Bolivia and Argentina in the late 1980's was accompanied by the rapid intensification of neo-liberal policies. This was spelled out in the 'Washington Consensus' and was integral to President George H.W. Bush's New World Order. While the new neo-liberal order failed to end stagnation, it did facilitate the pillage of thousands of public enterprises, their privatization and de-nationalization. At the same time the massive outflow of profits, interest payments and royalties and the growing exploitation and impoverishment of the working people led to the growth of 'new social movements' throughout the 1990's.

During the ascendancy of the military dictatorships and continuing under the neo-liberal regimes, while social movements and trade unions were suppressed, non-governmental organizations (NGOs) flourished. Billions of dollars flowed into the accounts of the NGOs from ‘private’ foundations. Later the World Bank and US and EU overseas agencies viewed the NGOs as integral to their counter-insurgency strategy.

The theorists embedded in the NGO-funded feminist, ecology, self-help groups and micro-industry organizations eschewed the question of structural changes, class and anti-imperialist struggles in favor of collaboration with existing state power structures. The NGO operatives referred to their organizations as the ‘new social movements’, which, in practice, worked hard to undermine the emerging class-based movements of anti-imperialists, Indians, peasants, landless workers and unemployed workers. These class-based mass movements had emerged in response to the imperial pillage of their natural resources and naked land grabs by powerful elites in the agro-mineral-export sectors with the full support of voracious neo-liberal regimes.

Toward the end of the 1990's, neo-liberal pillage throughout Latin America had reached its paroxysm: Tens of billions of dollars were literally siphoned off and transferred, especially out of Ecuador, Mexico, Venezuela and Argentina, to overseas banks. Over five thousand lucrative, successful state-owned enterprises were 'privatized' by the corrupt regimes at prices set far below their real value and passed into the hands of select private US and EU corporations and local regime cronies. The predictable economic collapse and crisis following the blatant looting of the major economies in Latin America provoked a wave of popular uprisings, which overthrew incumbent elected neo-liberal officials and administrations in Ecuador (three times), Argentina (three times) and Bolivia (twice). In addition, a mass popular uprising, in alliance with a constitutionalist sector of the military, restored President Chavez to power. During this period mass movements flourished and numerous center-left politicians, who claimed allegiance to these movements and denounced 'neo-liberalism', were elected president.

The deep economic crisis and repudiation of neo-liberalism marked the emergence of the social movements as major players in shaping the contours of Latin American politics. The principal emerging movements included a series of new social actors and the declining influence of the trade unions as the leading protagonist of structural change.

The Crisis of 1999-2003: Major Social Movements at the ‘End of Neo-liberalism’

Major social movements emerged in most of Latin America in response to the economic crisis of the 1990’s and early 2000’s and challenged neo-liberal ruling class control. The most successful were found in Brazil, Ecuador, Venezuela, Argentina and Bolivia.

Brazil: The Rural Landless Workers Movement (MST), with over 300,000 active members and over 350,000 peasant families settled in co-operatives throughout the country, represented the biggest and best organized social movement in Latin America. The MST built a broad network of supporters and allies in other social movements, like the urban Homeless Movement, the Catholic Pastoral Rural (Rural Pastoral Agency) and sectors of the trade union movement (CUT), as well as the left-wing of the Workers Party (PT) and progressive academic faculty and students. The MST succeeded through 'direct action' tactics, such as organizing mass 'land occupations', which settled hundreds of thousands of landless rural workers and their families on the fallow lands of giant latifundistas. They successfully put agrarian reform on the national agenda and contributed to the electoral victory of the putative center-left Workers Party presidential candidate Luiz Inacio 'Lula' Da Silva in the 2002 elections.

Ecuador: The Confederation of Indigenous Nationalities of Ecuador (CONAIE) played a central role in the overthrow of two neo-liberal Presidents, Abdala Bucaram in 1997 and Jamil Mahuad in January 2000, both implicated in massive fraud and responsible for Ecuador's economic crisis of the 1990's. In fact, during the January 2000 uprising, the leaders of CONAIE briefly occupied the Presidential Palace. Beginning in the late 1990's CONAIE had resolved to form an electoral party, 'Pachacuti', which would act as the 'political arm' of the movement. Pachacuti, in alliance with the rightist populist former military officer Lucio Gutierrez in the 2002 elections, briefly held several cabinet posts, including Foreign Relations and Agriculture. CONAIE's and Pachacuti's short-lived experience as a governing movement and party was a political disaster. By the end of the first year, the Gutierrez regime, allied with multi-national oil companies, the US State Department and the big agro-business firms, promoted a virulent form of neo-liberalism and forced the resignation of most CONAIE-backed officials. By the end of 2003, widespread discontent and internal divisions were exacerbated by an army of US and EU-funded NGOs, which infiltrated the Indian communities.

Venezuela: Major popular revolts in 1989 and 1992 culminated in the election of Hugo Chavez in 1999. Chavez proceeded to encourage mass popular mobilizations in support of referendums for constitutional reform. A US-backed alliance between the oligarchy and sectors of the military mounted a palace coup in April 2002, which lasted only 48 hours before being reversed by a spontaneous outpouring of over a million Venezuelans supported by constitutionalist soldiers in the armed forces. Subsequently, between December 2002 and February 2003, a ‘bosses’ lockout’ of the petroleum industry, designed to cripple the national economy, supported by the Venezuelan elite and led by senior officials in the PDVSA (state oil company), was defeated by the combined efforts of the rank and file oil workers with support from the urban popular classes. The failed US-backed assaults on Venezuelan democracy and President-elect Chavez radicalized the process of structural changes: Mass community-based organizations, new class-based trade union confederations and national peasant movements sprang up and the million-member Venezuelan Socialist Party was formed. Social movement activity and membership flourished, as the government extended its social welfare programs to include free universal public health programs via thousands of clinics, state-sponsored food markets selling essential food at subsidized prices in poor neighborhoods and the development of universal free public education including higher education. At the same time numerous enterprises in strategic economic sectors, such as steel, telecommunications, petroleum, food processing and landed estates, were nationalized.

While the ruling class continues to control certain key economic sectors and highly-paid officials in the state sector retain powerful levers over the economy, the Chavez government and the mass popular movements have maintained the initiative in advancing the struggle throughout the decade from the late 1990’s into the first decade of the new millennium.

The Venezuelan social movements retain their vigor in part because of the encouragement of Chavez’ leadership, but the movements are also held back by powerful reformist currents in the regime, which seek to convert the movements into transmission belts of state policy. The movement-state relationship is fluid and reflects the ebb and flow of the conflict and the threats emanating from the US-backed rightist organizations.

The regime-movement relationship deepened during the crisis period of 1999-2003 and was further strengthened by the rise in oil prices during the world commodity boom of 2003-2008. With the unfolding of the world economic crisis in late 2008-2009, the positive relationship between the state and the movements will be tested.

Bolivia: Bolivia has the highest density of militant social movements of any country in Latin America, including high levels of mine and factory worker participation, community and informal market vender organizations, Indian and peasant movements and public employee unions. The long years of military repression from the early 1970’s to the mid 1980’s weakened the trade unions and was followed by intense application of neo-liberal policies.

By the end of the 1990's, new large-scale social movements emerged, but the locus of activity shifted from the historically militant mining districts and factories to the 'sub-proletariat' or 'popular classes' engaged in informal, 'marginal' occupations, especially in cities like El Alto. El Alto, located on the outskirts of La Paz, is densely populated by recent migrants, displaced miners and impoverished Indians and peasants, and receives few public services. The new nexus for direct action challenging the neo-liberal regimes emerged from the coca farmers and Indian communities in response to the brutal implementation of US-mandated programs suppressing coca cultivation and the displacement of small farmers in favor of large-scale, agro-business plantations. In the cities, public sector employees, led by teachers', students' and factory and health workers' unions, fought neo-liberal measures privatizing services, like water, and cutting the public budgets for education and health care.

The economic crises of the late 1990's-2000's led to a major public confrontation in January 2003, followed by a popular revolt in October and an insurrection centered in 'El Alto' that spread to La Paz and throughout the country. Before being driven from power, the Sanchez de Losada regime murdered nearly seventy community activists and leaders. Hundreds of thousands of impoverished Bolivians stormed the capital, La Paz, threatening to take state power. Only the intervention of the coca farmer leader and presidential hopeful, Evo Morales, prevented the mass seizure of the Presidential palace. Morales brokered a 'compromise' in which the neo-liberal Vice President Carlos Mesa was allowed to succeed to the Presidency in exchange for a vaguely agreed promise to discontinue the hated neo-liberal policies of his predecessor, Sanchez de Losada. The tenuous agreement between the social movements and the 'new' neo-liberal President survived for two years due to the moderating influence of Evo Morales.

In May-June 2005, a new wave of mass demonstrations filled the streets of La Paz with workers, peasants, Indians and miners forcing Carlos Mesa to resign. Once again, Evo Morales intervened and signed a pact with the Congress calling for national elections in December 2005 in exchange for calling off the protests and appointing a senior Supreme Court judge (Rodriguez) to act as interim President.

Morales diverted the mass social movements into his party’s campaign machinery, undercutting the autonomous direct action strategies, which had been so effective in overthrowing the two previous neo-liberal regimes. This resulted in his election as President in December 2005.

While the economic crisis abated with the boom in commodity prices, President Evo Morales' social-liberal policies did little to reduce the gross income inequalities, the vast concentration of fertile land in the hands of a small plantation elite and the dispossession of a majority of Indian communities from their lands. Morales' policy of forming joint ventures with foreign multinational gas, oil and mining companies did little to end the massive transfer of profits from Bolivia's natural resources back to the 'home offices' of the MNCs. Nevertheless, Morales' tepid nationalist gestures led to a 'political-economic' confrontation with the US-backed Bolivian oligarchy, which was funded by the enormous private profits gained during the 'commodity boom'.

Argentina: The strongest relationship between a severe economic crisis and a mass popular rebellion took place in Argentina on December 19-20, 2001 and continued throughout 2002.

The conditions for the economic collapse were building up in the 1990s during the two terms of President Carlos Menem. His neo-liberal regime was marked by the corrupt ‘bargain basement’ sale of the most lucrative and strategic public enterprises in all sectors of the economy. The entire financial sector of Argentina was de-regulated, de-nationalized, dollarized and opened up to the worst speculative abuses. The national economic edifice, weakened by the massive privatization policies, was further undermined by rampant corruption and gross pillage of the public treasury. Menem’s policies continued under his successor, President De la Rua, who presided over the banking crisis and the subsequent collapse of the entire national economy, the loss of billions of dollars of private savings and pension funds, a thirty percent unemployment rate and the most rapid descent into profound poverty among the working and middle classes in Argentine history.

In December 2001, the people of Buenos Aires staged a massive popular uprising in front of the Presidential palace, with the demonstrators taking over the Congress. They ousted President De la Rua and subsequently three of his would-be presidential successors in a matter of weeks. Hundreds of thousands of organized, unemployed workers blocked the highways and formed community-based councils. Impoverished, downwardly mobile middle class employees and bankrupt shopkeepers, professionals and pensioners formed a vast array of neighborhood assemblies and communal councils to debate proposals and tactics. Banks throughout the country were stormed by millions of irate depositors demanding the restitution of their savings. Over 200 factories, which had been shut down by their owners, were taken over by their workers and returned to production. The entire political class was discredited and the popular slogan throughout the country was: '¡Que se vayan todos!' ('Out with all politicians!'). While the popular classes controlled the street in semi-spontaneous movements, the fragmented radical-left organizations were unable to coalesce to formulate a coherent organization and strategy for state power.

After two years of mass mobilizations and confrontation, the movements, facing an impasse in resolving the crisis, turned toward electoral politics and elected the center-left Peronist Nestor Kirchner in the 2003 presidential election.

Low Intensity Social Movements: Peru, Paraguay, Colombia, Chile, Uruguay, Central America, Haiti and Mexico

The entire Latin American continent and the neighboring regions witnessed the significant growth of social movement activity of greater or lesser scope. What differentiated these movements from their counterparts in Brazil, Argentina, Ecuador, Bolivia and Venezuela was the absence of political challenges and regime change and the limited scope of their social action.

Nevertheless significant outbreaks of mass popular movements raised fundamental challenges to the reigning neo-liberal hegemony.

In Haiti, a mass popular rebellion to reinstate the democratically elected President Jean-Bertrand Aristide, who had been taken hostage and flown into exile by a joint US-EU-Canadian military operation, was brutally repressed by a multinational mercenary force led by a Brazilian general. Subsequent massacres in crowded slums by the occupying troops aborted the resurgence of the popular 'Lavalas' movement protesting the foreign imposition of neo-liberal 'privatization' and austerity measures.

Mexico witnessed a series of localized rebellions and mass uprisings against the neo-liberal regimes dominating Mexico. In 1994, the Zapatista National Liberation Army (EZLN), based in the Indian communities of rural Chiapas, rose up and temporarily succeeded in gaining control of several towns and cities. With the entry of many thousands of Mexican Federal troops, and in the absence of a wider network of support, the Zapatistas withdrew to their jungle and mountain bases. An unstable truce was declared, frequently violated by the government, in which an isolated EZLN continued to exist confined to a remote area in the state of Chiapas. In Oaxaca, an urban rebellion, backed by trade unions, teachers and popular classes in the capital city and surrounding countryside, organized a popular assembly (comuna) and briefly created a situation of 'dual power' before being suppressed by the reactionary neo-liberal governor of the state using 'death squads' and Mexican troops. Faced with the repressive power of the state, the insurgent popular movements shifted toward the electoral process and succeeded in electing center-left Andres Manuel Lopez Obrador in 2006 in the midst of the neo-liberal economic debacle. Their victory was short-lived, with the election results overturned through massive fraud in the final tally of the votes. Subsequent peaceful protests involving millions of Mexicans eventually lost steam and the movement dissipated.

In Colombia, mass peasant, trade union and Indian protests challenged the neo-liberal Pastrana regime (1998-2002) while the major guerrilla movements (FARC/ELN) advanced toward the capital city. Fruitless peace negotiations, broken off under US pressure, and a $5 billion US counter-insurgency program, dubbed 'Plan Colombia', heightened political polarization and intensified paramilitary death-squad activity. With the election of Alvaro Uribe, the Colombian regime decimated peasant, trade union and human rights movements as it advanced its neo-liberal policies.

The political effects of the economic crisis at the end of the 1990’s, which had precipitated social movement activity throughout the hemisphere, led to brutal repression in Haiti, Mexico and Colombia in order for the neo-liberal regimes to continue their policies.

In several other Latin American countries, namely Peru and Paraguay, as well as in Central America, powerful rural-based peasant and Indian movements engaged in rural road blockages and land occupations against their governments’ neo-liberal ‘free trade’ agreements with the US. Since these rural movements lacked nation-wide support, especially from the urban centers, their struggles failed to make a significant impact even as their economies crumbled under neo-liberal policies.

Social Movements in the Time of the Commodity Boom

The sharp rise of agricultural and mineral commodity prices between 2003-2008, along with the election of center-left politicians, had a major impact on the most active and dynamic social movements.

In Brazil the election of Lula Da Silva (2002-2006) from the putatively center-left Workers Party was backed by all the major social movements, including the MST (Landless Rural Workers Movement), under the mistaken assumption that he would accelerate progressive structural changes like land re-distribution. Instead, Da Silva embraced the entire neo-liberal agenda of his predecessor, President Cardoso, including widespread privatization and tight fiscal policies, which, with the rise of agro-mineral prices, led to a narrowly focused agro-mineral export strategy centered exclusively on large agro-business and mineral extractive elites to the detriment of small businesses and rural producers. The MST's efforts to influence Da Silva over the past decade (2003-2009) were futile – as state, local and federal governments criminalized the movement's direct action tactics of land occupation. Lula's policy of granting subsistence federal food allowances to the extremely poor and his success at co-opting movement leaders, especially from the huge trade union federations, neutralized the landless peasants' and organized workers' capacity to protest and strike. Lula's policies isolated the MST from its 'natural' urban allies in the labor movement.

Lula’s right-turn and the vast increase in export revenues from high commodity prices led to increased social expenditures and reduced the level of activity and support for the MST in its struggle for agrarian reform. While retaining its mass base and continuing its land occupations, the MST no longer had a strategic political ally in its quest for social transformation. Subsequently it pursued more moderate reforms to avoid confrontation with the Lula regime, to which it still offered ‘critical support’.

In Argentina, the massive wave of direct action social movements subsided with the election of Kirchner (2003-2008) and the 7% economic growth rate stimulated by the commodity boom and the recovery from the dramatic economic melt-down of 2001-2002. With the recovery of employment and the return of their savings, the middle class assemblies rapidly disappeared. Kirchner offered subsidies to the unemployed and co-opted their leaders, which led to a sharp reduction of road blockages and membership in the militant unemployed workers organizations. Kirchner won over part of the human rights movement with his policies, which included his public purge of some of the more notorious military and police officials and the granting of subsidies to certain sectors of the human rights movement, including the Madres de la Plaza de Mayo. With the decline of the radicalized movements of 1999-2002, the economic recovery of 2003-2008 led to a partial recovery of trade union activism, whose demands were mostly economic, focusing on the recovery of the workers’ wages and benefits lost during the systemic crisis.

In Bolivia, the economic boom, which began under the neo-liberal regime of Carlos Mesa, continued under 'leftist' populist Evo Morales. He quickly moderated movement demands as he moved to the center-left. As an alternative to the social movement platform calling for the nationalization of the principal resource sectors exploited by multi-national corporations, Morales promoted 'joint ventures', which he demagogically claimed were 'nationalization without expropriation'. Likewise he answered peasant and Indian demands for agrarian reform by opening up mostly uncultivable public lands in the Amazon to the landless peasants. By the same token, he protected the most fertile land in the largest privately owned plantations from expropriation by exempting private land classified as performing a 'social function'. Avoiding structural change, Morales was able to use the windfall of state revenues from the high prices of Bolivian minerals and gas to co-opt movement leaders, provide incremental increases in the minimum wage, finance subsidies to Indian communities, encourage legal and political rights and recognize indigenous jurisdiction over their local communities.

Morales retained his leadership of the coca farmers union and, through his Movement Toward Socialism party (MAS), exercised hegemony over the major community-based movements. His close ties with Presidents Castro in Cuba and Chavez in Venezuela set him in radical opposition to Washington's interventionist policies and its supporters among the five rightist-controlled provinces centered in Santa Cruz. The extreme right gained ascendancy in the latter region and launched a violent racist frontal assault on the Morales government, polarizing the countryside while guaranteeing Morales continued mass support among the popular classes and movements throughout the country.

In Ecuador, the powerful Indian movement (CONAIE) and its allies in the trade unions supported the neo-liberal regime of Lucio Gutierrez and suffered a severe decline in their power, support and organizational cohesion. The recovery has been slow, hindered by interventions of numerous US/EU funded NGOs.

With the demise of the established social movements, a new urban-based ‘citizens’ movement’ led by Rafael Correa overthrew the venal, corrupt, neo-liberal Gutierrez regime, and the electorate voted Correa into power in both 2006 and 2009. Correa adopted center-left political positions, financing incremental wage and salary increases and state-subsidized cheap credit for small and medium-size businesses. He took a nationalist position on foreign debt payments and terminated US military basing rights in Manta. The boom in mining and petroleum prices and ties with oil-rich Venezuela strengthened President Correa’s capacity to fund programs securing support among the Andean bourgeoisie and the popular classes.

In Venezuela, the economic boom, namely the tripling of world oil prices, facilitated Venezuela’s economic recovery after the crisis caused by the opposition coup and the bosses’ lockout (2002-2003). As a result, from 2004 to 2008 Venezuela grew by nearly 9% a year. The Chavez government was able to generously fund a whole series of progressive socio-economic changes that enhanced the strength and attraction of pro-government social movements. The social movements played an enormous role in defeating opposition referendums, which had called for the impeachment of the President. Peasant organizations were prominent in pressuring recalcitrant bureaucrats in the Chavez government to implement the new agrarian laws calling for land distribution. Trade union militants organized strikes and demonstrations and played a major role in the nationalization of the steel industry. Given the vast increase in state resources, the Chavez government was able to both compensate the owners of the expropriated firms and meet workers’ demands for social ownership.

Summary

The economic boom and the ascendancy of center-left governments led to incremental increases in living standards, a decline of unemployment and the co-optation of some movement leaders — resulting in the decline of radical movement activity and the revival of traditional ‘pragmatic’ trade union moderates. During the economic boom and the rise of the center-left, the only major mass mobilization took the form of right wing movements determined to destabilize the center-left governments in Bolivia and Venezuela.

A comparison of the social movements in countries where they played a major role in political and social change (Venezuela, Ecuador, Brazil and Bolivia) with movements in countries where they were marginalized reveals several crucial differences. First of all, the differences are not found in the quantity of public protests, militant direct actions or number of participants. For example, if one adds up the number of social movement protests in Mexico, Peru, Colombia and Central America, they might equal or even surpass the social actions in Brazil, Argentina and Bolivia. What was different and most politically significant was the quality of the mass action. Where the movements were of marginal significance, the organizations were fragmented and dispersed, without significant national leadership or structure and without any political leverage on the institutions of national power. In contrast, the influential social movements operated as national organizations coordinating social and political action: centralized and capable of reaching the nerve centers of political power – the capital cities (La Paz, Buenos Aires, Quito and, to a lesser degree, Sao Paulo). To one degree or another, the high-impact social movements combined rural and urban movements, had political allies in the party system and bridged cultural barriers (linking indigenous and mestizo popular classes).

World Economic Crisis and Social Movements – 2008 Onward

Beginning in late 2008 and continuing in 2009, the world economic crisis spread across Latin America. The crisis came later to Latin America and with less initial severity than in the US or EU. Because it is an ongoing process, the full socio-political implications and economic impact are still far from clear. What we can observe is that, at least initially, the current crisis has not provoked anything like the mass upheavals and the surge of radical social movements that we witnessed during the crisis beginning in 2001.



Gross Domestic Product
($ millions of dollars, constant 2000 prices)
Annual growth rates (%)

Country                            2007    2008    2009*

Argentina                           8.7     7.0     1.5
Bolivia                             4.6     6.1     2.5
Brazil                              5.7     5.1    -0.8
Chile                               4.7     3.2     1.0
Colombia                            7.5     2.6     0.6
Costa Rica                          7.8     2.6     3.0
Cuba                                7.3     4.3     1.0
Ecuador                             2.5     6.5     1.0
El Salvador                         4.7     2.5    -2.0
Guatemala                           6.3     4.0     1.0
Haiti                               3.4     1.3     2.0
Honduras                            6.3     4.0     2.5
Mexico                              3.3     1.3    -7.0
Nicaragua                           3.2     3.2     1.0
Panama                             11.5     9.2     2.5
Paraguay                            6.8     5.8     3.0
Peru                                8.9     9.8     2.0
Dominican Republic                  8.5     5.3     1.0
Uruguay                             7.6     8.9     1.0
Venezuela                           8.9     4.8     0.3
Sub-total Latin America             5.8     4.2    -1.9
Caribbean                           3.4     1.5    -1.2
Latin America and the Caribbean     5.8     4.2    -1.9

* Projections
Source: ECLAC

If anything, we have seen a surge of right-wing movements and electoral organizations in countries like Argentina, a US-backed right-wing military coup supported by rightist business associations in Honduras, and the continued ‘pragmatic’ behavior of mass social movements in Brazil, Bolivia and Ecuador.

The only exception is in Peru, where the organized Indian communities in the Amazonian region have engaged in armed mass confrontations with the US-backed, right-wing regime of Alan Garcia. The Amazonian Indians responded to a series of government decrees handing mineral and gas exploitation rights on Indian lands to foreign mining and energy corporations. From a historical perspective, the struggle was ‘conservative’, insofar as it pitted indigenous communities defending traditional use and ownership of lands and resources against the modern economic predators and the neo-liberal state.

The Lumpen-Bourgeoisie: The Triple Alliance of the Neo-Liberal State, Narco-traffickers and the Unemployed Poor

The least studied but most dynamic, and possibly best organized, social movement in Latin America today is the right-wing drug-trafficking movement. Headed by a powerful narco-bourgeoisie, with strong ties to the military and the neo-liberal state apparatus, and with armed lumpen-cadres drawn from the urban unemployed and landless peasantry, the ‘Lumpen’ Movement has created a powerful geographic and social presence in Mexico, Colombia, Peru, Bolivia, Guatemala, Honduras, El Salvador and elsewhere.

It was agrarian neo-liberal policies that prepared the ground for the ‘mass base’ of the rightist narco-movement. The promotion of mechanized agro-export agriculture in Colombia, Mexico, Peru and Central America uprooted millions. State terror and paramilitary death squads drove millions of peasant families from the land and into urban slums. The large-scale importation of cheap, subsidized agricultural produce from the US wiped out many thousands of small-scale family farms. The stagnant manufacturing sector was unable to absorb the migrants into labor-intensive work. This created a massive pool of young, landless rural and urban unemployed, who could become recruits either for progressive social movements or for the narco-industry. Cultivating coca and opium, refining and smuggling the drugs and soldiering for the drug lords provided a livelihood for these desperate young men and women. The deep economic crisis and stagnation of the 1990’s and early 2000’s created a large mass of young unemployed and under-employed workers in the cities, ripe for employment by the narco-gangs, who paid a living wage for an often deadly occupation.

The links between right-wing political parties, banking, business and landowner associations have been demonstrated repeatedly throughout Latin America. In Colombia, drug traffickers have become large landowners after their death squads devastated peasant communities suspected of supporting leftist or progressive organizations. ‘Sicarios’, or hit-men, are mostly young men from working-class or peasant backgrounds who ‘work’ for business leaders and multi-national corporations as assassins. They have killed hundreds of trade union, peasant and Indian leaders each year in Colombia alone. Over a third of the members of the Colombian Congress, the principal backers of President Uribe, have been financed by the drug cartels. Uribe himself has long-term ties with prominent narco-traffickers and death-squad militia leaders.

In Mexico, drug traffickers have recruited widely among the impoverished peasants. In many Mexican states the narcos have purchased the services of thousands of government officials from top to bottom. In the absence of employment and a social safety-net, many of the poor find work in the narco-trade. Narco-traffickers have established alliances and business associations with upper class financial groups engaging in joint ‘philanthropic’ activities, such as handing out cash and delivering needed services to the poor. Narco-traffickers eventually wash their illegal earnings through major banks in the US, Canada and Europe and then invest in real estate, tourist complexes and landed properties.

Narco-trafficker organizations and death squads have worked closely with rightwing movements in Sta. Cruz (Bolivia), with rightist political parties in El Salvador, Guatemala and Honduras, as well as in Mexico and Colombia.

The ‘lumpenization’ process operates via two routes: In some cases, young unemployed males are directly recruited via neighborhood organizations; in other cases the dispossessed, bankrupt and downwardly mobile farmers and long-term unemployed workers are gradually forced into the ‘illegal’ labor market.

The long-term, large-scale process of stagnation, despite periods of export growth, marginalizes the rural poor and accelerates their impoverishment without generating compensatory stable urban employment paying a living wage. The ‘lumpenization’ of these displaced, marginalized peasants and workers, produced by the crisis and class polarization, is accompanied by the rise of a ‘lumpen culture’ with its own hierarchical structures, in which the few at the ‘top’ develop ties to the economic and state elite while the masses at the ‘bottom’ aspire to a degenerate kind of middle-class consumerist life-style.

By the first decade of the new millennium, the rightist lumpen-narco movement far exceeded the progressive popular movements in terms of power and influence in Mexico, Colombia, Central America and some countries in the Caribbean, like Jamaica. The relationship between the ‘legal’ rightist and the ‘narco’ rightist movements is one of collaboration and conflict: They join forces to oppose powerful rural and trade union movements and progressive electoral regimes. The lumpen-narcos provide the ‘shock troops’ to assassinate progressive leaders, including elected officials and to terrorize supporters among the peasantry and urban poor. On the other hand, violent conflict between the rightists can break out at any time, especially when the lumpen-elite encroach on the state prerogatives, business interests, ties with imperial drug enforcement agencies and raise questions about the legitimacy of the bourgeois class.

Latin America’s Social Movements and the Economic Recession/Depression

Economic crises have multiple and diverse impacts on the popular classes and social movements.

The profound economic crisis of the 1990’s and the first years of the 2000’s radicalized the popular classes and led to widespread ‘high impact’ protests and national rebellions, which overthrew incumbent neo-liberal regimes and replaced them with ‘center-left’ regimes. At the same time, the social changes implicit in the neo-liberal crisis produced a downwardly mobile urban and rural sector. This formed the basis for the growth both of dynamic leftist social movements led by popular mass-based leaders and of rightist movements led by lumpen-narco chiefs and supported by the economic elites. The conservative far-right confronted popular social movements from positions in the state and through the military and para-military death squads.

The commodity boom and the ascendancy of the ‘center-left’ regimes led to the ‘moderation’ of demands from below in the face of cooptation from above. Large-scale job creation and poverty programs, cheap credit and incremental wage and salary increases all contributed to moderating mass politics. The trade unions re-emerged as central actors and collective bargaining replaced mass direct action. Rural movements engaged in militant struggle were relatively isolated. The key political factor in this period was the demobilization of the popular classes, the decline of the direct action movements and the restoration of the power of the business, land-owning and mining elite based on their strengthened economic position. The rejuvenated Right took the lead in directing their own ‘direct action’ movements in Bolivia, Argentina and Central America.

As the crisis of 2008-2009 unfolded, the progressive movements were slow to respond, having been ‘under the tent’ of the center-left electoral regimes. Since these regimes were now being held responsible for the fallout of the commodity crash, the left social movements were in a weak position and unable to pose any radical alternatives.

It is important to remember that the world economic crisis hit the ‘North’ (US/EU) earlier and harder than it hit Latin America. In Latin America, the social impact was weaker – at first. Unemployment grew mainly during the last months of 2008. The gradual unfolding of the crisis contrasted with the system-wide crash of the late 1990’s-2002, which precipitated mass rebellions. In addition, as a consequence of the earlier crisis, capital and finance controls had been imposed that limited the spread of the toxic assets and financial crisis from the US to Latin America.

Moreover, Latin American countries have been diversifying their trade, especially toward Asia, including China, which continues to grow at 8% a year. Diversification and financial controls limited the impact of the US financial melt-down on the Latin American economies. In addition, the early ‘stimulus’ measures, taken in response to the first signs of the crisis, temporarily ameliorated the impact of the global recession/depression on Latin America.

Nevertheless, as the depression deepens in the North, Latin America’s trade has plunged, and the region has fallen into negative growth. As a result, unemployment is growing in the export sectors as well as in production for the domestic economy. In response, the right-wing parties and leaders blame the center-left regimes. Moves are underway in Argentina, Bolivia and Ecuador to oust these regimes through elections or coups, backed by US President Obama’s ‘rollback’ global strategy. The June 2009 coup in Honduras, covertly backed from the strategic US military base in the country, is the first sign that Washington is moving its military clients to overthrow the new independent ‘center-left’ regimes in the region. This is particularly true for the Central American and Caribbean countries linked with Venezuela in the new integration programs, such as ALBA and PetroCaribe.

The first manifestations of progressive mass popular protests in the current economic recession are not directly related to the economic decline. In Peru, the indigenous Amazonian communities organized militant road blockages and confrontations with the military resulting in over one hundred dead and wounded. This mass movement developed in response to the Peruvian government’s granting concessions of mining exploitation rights to foreign multi-nationals, an infringement of the rights of the indigenous people to their lands in the Amazonian region. Demonstrations in solidarity with the Amazonian Indians occurred in most cities, including Lima. The Congress, fearing a mass uprising, temporarily canceled the concessions. This was a major victory for the indigenous communities. Moreover, the success of the Amazonian Indian communities has detonated widespread sustained strikes and protests in most of the major cities of Peru, in response to economic decline resulting from falling commodity prices.

The sustained popular struggle in Honduras is in response to the military coup overthrowing President Zelaya, a moderate reformer pursuing an independent foreign policy. Led by the urban public sector trade unions and peasant movements, the struggle has combined democratic, nationalist and populist demands.

Apart from these two mass popular movements, the economic crisis has yet to evoke mass radical rebellions like those which took place during the earlier crises of 2000-2003. We can posit several possible explanations or hypotheses for the contrasting responses of the mass movements to economic crises.

Hypotheses

1. The full impact of the world crisis has yet to hit the popular classes – it began late in 2008 and only began to register in increased unemployment in the first quarter of 2009.

2. The current crisis, at first, did not hit the lower middle classes, public employees and skilled workers. It has been highly segmented, thus weakening the cross-class solidarity and alliances present in earlier crises.

3. Unlike the previous period, the crisis takes place in many countries ruled by ‘center-left’ regimes with an organized social base backed by the social movements. These regime-movement linkages neutralize mass protests, out of fear of a return of the hard right.

4. The mass movements on the left have responded to the crisis with relative passivity – in part because the governments have intervened with economic stimulus measures and some social ameliorative policies. The continuation and deepening of the crisis and the inadequate coverage of moderate public interventions could eventually lead to the resurgence of mass struggles.

5. The increasing economic vulnerability of the incumbent center-left regimes and the relative passivity of the progressive social movements have opened political space and opportunities for rightwing mass mobilizations, combining electoral and street politics to build a base for a return to power.

6. The crisis will likely accelerate the lumpenization process, as long-term unemployment sets in and if alternate movements fail to organize the chronically unemployed in consequential struggles.

7. As the bourgeoisie and its political supporters find few legitimate sources for profiteering available, they will likely serve as intermediaries and ‘protectors’ of the narco-traffickers and other criminal syndicates and rely on them to eliminate left social movement leaders and activists.

8. The rise of the ‘lumpen-Right’ may lead to a virtual ‘dual power’ situation in which legitimate and illegitimate power configurations cooperate in repressing social movements and compete for influence.

9. The relative passivity of the social movements is likely a transitory phenomenon, influenced by the convergence of circumstances. If the crisis deepens and extends over time and rightist regimes return to power, recent past historical experience strongly suggests that the massive increase in poverty and unemployment, combined with repressive rightist regimes, could lead to mass rebellions on the part of the previously ‘passive’ popular classes.

2. OBAMA ON DRUGS: 98% CHENEY?

BY

GREG PALAST



Eighty billion dollars of WHAT?

I searched all over the newspapers and TV transcripts and no one asked the President what is probably the most important question of what passes for debate on the issue of health care reform: $80 billion of WHAT?

On June 22, President Obama said he'd reached agreement with big drug companies to cut the price of medicine by $80 billion. He extended his gratitude to Big Pharma for the deal that would, "reduce the punishing inflation in health care costs."

Hey, in my neighborhood, people think $80 billion is a lot of money. But is it?

I checked out the government's health stats (at HHS.gov), put fresh batteries in my calculator and totted up US spending on prescription drugs projected by the government for the next ten years. It added up to $3.6 trillion.

In other words, Obama's big deal with Big Pharma saves $80 billion out of a total $3.6 trillion. That's 2%.
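The 2% figure is easy to verify; a minimal arithmetic sketch, using only the $80 billion and $3.6 trillion figures cited above:

```python
# Back-of-the-envelope check of the drug-deal arithmetic above.
savings = 80e9       # Obama/Big Pharma deal: $80 billion over ten years
spending = 3.6e12    # projected ten-year US prescription-drug spending (HHS)

share = savings / spending
print(f"{share:.1%}")  # → 2.2%, i.e. roughly the 2% cited
```

Any rounding aside, the savings amount to about a fiftieth of projected spending.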

Hey thanks, Barack! You really stuck it to the big boys. You saved America from these drug lords robbing us blind. Two percent. Cool!

**************************
ALERT

Now it's Let's Make a Deal with hospital lobbyists.

First the President was caught with his principles down, cutting a scuzzy back-room deal with pharmaceutical lobbyist Billy Tauzin to limit drug price savings to just 2% over 10 years (see attached, "Obama on Drugs: 98% Cheney?"). Now the New York Times reports that another deal was sealed by lobbyist Chip Kahn of the American Hospital Association.

Here are the numbers they don't want you to see: Hospitals will be allowed to hike their prices and revenues by nearly six trillion dollars ($5,853 billion) over the next ten years, only $155 billion less than they had projected before the Obama "reform."

In all, the Obama back-room deal will "reduce" our $26 trillion total hospital bill over the next decade by one-half of one percent.
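The hospital figures check out the same way; a quick sketch using the $155 billion concession and the $26 trillion ten-year hospital bill from the paragraphs above:

```python
# Back-of-the-envelope check of the hospital-deal arithmetic above.
savings = 155e9       # concession relative to projected hospital revenues
total_bill = 26e12    # projected ten-year total US hospital bill

share = savings / total_bill
print(f"{share:.2%}")  # → 0.60%, about the "one-half of one percent" in the text
```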

Once again, the lobbyists got the gold mine, the public got the shaft.

Say it ain't so, Mr. President.

***************************

For perspective: Imagine you are in a Wal-Mart and there's a sign over a flat screen TV, “BIG SAVINGS!” So, you break every promise you made never to buy from that union-busting big box - and snatch up the $500 television. And when you're caught by your spouse, you say, "But, honey, look at the deal I got! It was TWO-PERCENT OFF! I saved us $10!"

But 2% is better than nothing, I suppose. Or is it?

The Big Pharma kingpins did not actually agree to cut their prices. Their promise with Obama is something a little oilier: they apparently promised that, over ten years, they will reduce the amount at which they would otherwise raise drug prices. Got that? In other words, the Obama deal locks in a doubling of drug costs, projected to rise over the period of "savings" from a quarter trillion dollars a year to half a trillion dollars a year. Minus that 2%.

We'll still get the shaft from Big Pharma, but Obama will have circumcised the increase.

And what did Obama give up in return for $80 billion? Chief drug lobbyist Billy Tauzin crowed that Obama agreed to dump his campaign pledge to bargain down prices for Medicare purchases. Furthermore, Obama’s promise that we could buy cheap drugs from Canada simply went pffft!

What did that cost us? The New England Journal of Medicine notes that 13 European nations successfully regulate the price of drugs, reducing the average cost of name-brand prescription medicines by 35% to 55%. Obama gave that up for his 2%.

The Veterans Administration is able to push down the price it pays for patent medicine by 40% through bargaining power. George Bush stopped Medicare from bargaining for similar discounts, an insane ban that Obama said he’d overturn. But, once within Tauzin’s hypnotic gaze, Obama agreed to lock in Bush’s crazy and costly no-bargaining ban for the next decade.

What else went down in Obama's drug deal? To find out, I called C-SPAN to get a copy of the videotape of the meeting with the drug companies. I was surprised to find they didn't have such a tape despite the President's campaign promise, right there on CNN in January 2008, "These negotiations will be on C-SPAN."

This puzzled me. When Dick Cheney was caught having secret meetings with oil companies to discuss Bush's Energy Bill, we denounced the hugger-muggers as a case of foxes in the henhouse.

Cheney's secret meetings with lobbyists and industry bigshots were creepy and nasty and evil.

But the Obama crew's secret meetings with lobbyists and industry bigshots were, the President assures us, in the public interest.

We know Cheney's secret confabs were shady and corrupt because Cheney scowled out the side of his mouth.

Obama grins in your face.

See the difference?

The difference is 2%.

3. HEALTH CARE STIRS UP WHOLE FOODS CEO JOHN MACKEY, CUSTOMERS BOYCOTT ORGANIC GROCERY STORE

BY

EMILY FRIEDMAN

Joshua has been taking the bus to his local Whole Foods in New York City every five days for the past two years. This week, he said he'll go elsewhere to fulfill his fresh vegetable and organic produce needs.

Customers are threatening to boycott Whole Foods stores after the company's CEO, John Mackey, wrote an op-ed discussing his ideas for health care reform.

"I will never shop there again," vowed Joshua, a 45-year-old blogger, who asked that his last name not be published.

Like many of his fellow health food fanatics, Joshua said he will no longer patronize the store after learning the views of Whole Foods Market Inc. CEO John Mackey on health care reform, which were made public this week in an op-ed piece he wrote for The Wall Street Journal.

Michael Lent, another Whole Foods enthusiast in Long Beach, Calif., told ABCNews.com that he, too, will turn to other organic groceries for his weekly shopping list.

"I'm boycotting [Whole Foods] because all Americans need health care," said Lent, 33, who used to visit his local Whole Foods "several times a week." (To join the "Whole Foods Boycott" group on Facebook, click here: http://www.facebook.com/group.php?gid=119099537379)

"While Mackey is worried about health care and stimulus spending, he doesn't seem too worried about expensive wars and tax breaks for the wealthy and big businesses such as his own that contribute to the deficit," said Lent.

In his op-ed, "The Whole Foods Alternative to ObamaCare," published Tuesday, Mackey criticized President Barack Obama's health care plan.

Mackey provided eight "reforms" he argued the U.S. can do to improve health care without increasing the deficit. He suggested that tax forms be revised to "make it easier for individuals to make a voluntary, tax-deductible donation to help the millions of people who have no insurance."

Mackey also called for a move toward "less government control and more individual empowerment" instead of "a massive new health care entitlement that will create hundreds of billions of dollars of new unfunded deficits."

He added that many of the country's health care problems are "self-inflicted" and are preventable through "proper diet, exercise, not smoking, minimal alcohol consumption and other healthy lifestyle choices."

In the op-ed, Mackey outlines Whole Foods' employee health insurance policy. According to Mackey, Whole Foods pays 100 percent of the premiums for all employees who work 30 hours or more per week -- about 89 percent of his workforce.

Additionally, the company gives each employee $1,800 per year in "health-care dollars," says Mackey, that they can use at their own discretion for health and wellness expenses. This money can be put toward the $2,500 annual deductible that must be covered before Mackey says the company's "insurance plan kicks in."

Whole Foods Shoppers Weigh In

The op-ed piece, which begins with a Margaret Thatcher quote, "The problem with socialism is that eventually you run out of other people's money," has left some Whole Foods loyalists enraged. Many say Mackey was out of line to opine against the liberal base that has made his fortune possible.

Christine Taylor, a 34-year-old New Jersey shopper, vowed never to set foot in another Whole Foods again.

"I will no longer be shopping at Whole Foods," Taylor told ABCNews.com. "I think a CEO should take care that if he speaks about politics, that his beliefs reflect at least the majority of his clients."

Countless Whole Foods shoppers have taken their gripes with Mackey's op-ed to the Internet, where people on the social networking sites Twitter and Facebook are calling for a boycott of the store.

A commenter on the Whole Foods forum, identified only by his handle, "PracticePreach," wrote, "It is an absolute slap in the face to the millions of progressive-minded consumers that have made [Whole Foods] what it is today."

"You should know who butters your hearth-baked bread, John," wrote the commenter. "Last time I checked it wasn't the insurance industry conservatives who made you a millionaire a hundred times over."

While Mackey reduced his annual salary to one dollar in 2007, after explaining to employees that he was "no longer interested in working for money," he is still the head of the 10th largest food and drug retailer in the U.S.

Whole Foods Market Inc. reported that sales for the last quarter rose by 2 percent to $1.878 billion. It is consistently ranked as a Fortune 500 company.

And not all Whole Foods customers were upset by Mackey's op-ed.

Many posted online that they agreed with his message and would try to shop at the chain more often.

Frank Federer wrote ABCNews.com, expressing fatigue with the knee-jerk reaction of other shoppers.

"You can count me as one vote FOR Whole Foods' CEO," wrote Federer. "At a time when most folks are more inclined toward rancor than discussion of facts, I applaud John Mackey."

Despite his financial success, this is not the first time Mackey has become fodder for criticism. In 2007, it was discovered that Mackey had been using a pseudonym to post blogs lambasting Whole Foods' competitor, Wild Oats Market, and questioning the worth of the company's stock.

The postings were made public when Mackey announced his desire to buy Wild Oats Market, and a lawsuit was filed by the Federal Trade Commission over concerns that the purchase would violate antitrust laws.

The FTC eventually let the sale go through, provided that Mackey sold 31 of the Wild Oats stores, and the Securities and Exchange Commission, which had launched an investigation into the online postings, did not press charges.

Libba Letton, a Whole Foods spokeswoman, told ABCNews.com that Mackey was unavailable for an interview and said that the op-ed "stands on its own." Letton offered no further comment regarding customers' threats to boycott the store.

When a CEO Speaks Out...

According to Robert Passikoff, the founder of Brand Keys, a N.Y.-based consulting firm, what a CEO says or does can often have a direct impact on consumers' pocketbooks.

"You can have a tremendous effect as a CEO, but it's a double-edged sword in that you'll have people who will support your position and feel better about your brand because of what you say," said Passikoff. "But equally so, you'll have people who think you're crazy and because they can't take it out on you, the CEO, they'll take it out on the company."

It is the risk of losing customers, said Passikoff, which more often than not leads CEOs to keep their mouths shut, at least when it comes to polarizing issues such as health care.

Tom Monaghan, the founder of Domino's Pizza, who was outspoken in the pro-life movement, alienated many of his customers, who weren't sure how much of the money he earned making pizza was going to support that movement.

Lynn Upshaw, a brand marketing consultant at Upshaw Brand Consulting in Kentfield, Calif., said that more often it is the actions of an entire company, and not just of a CEO, that lead to boycotting by consumers.

For example, Upshaw remembers when, in the late 1970s, Nestle angered consumers with a baby formula product it claimed to be a healthy alternative to breast-feeding.

"It's relatively unusual for a CEO to be as outspoken as Mackey has been," said Upshaw. "Because any time you weigh in to something political, you're bound to have loyal customers who will question [your] point of view, and that can have a very negative effect."

Upshaw added that Mackey's op-ed may have done more harm than might be typical because of the unique makeup of his clientele.

"You have more activist consumers going to Whole Foods than other stores," said Upshaw. "They're not just simply expressing an opinion, they do something about it.

"These are people who have already gone out of the way to find a place that is more expensive to buy certain types of food," he said. "So in theory, they might be more willing to take the action to go somewhere else if they don't agree with Mackey."

4. "THE COLLAPSE GAP"

WITH

DMITRY ORLOV

A Lecture
("The Soviet Example and American Prospects")

(The USSR was better prepared for collapse than the US)

(Dmitry Orlov's repeated travels to Russia throughout the early nineties allowed him to observe the aftermath of the Soviet collapse first-hand. Being both a Russian and an American, Dmitry was able to appreciate both the differences and the similarities between the two superpowers. Eventually he came to the conclusion that the United States is going the way of the Soviet Union. His emphasis is on all the things that can still be made to work, and he advocates simply ignoring all that will fall by the wayside.)

Good evening, ladies and gentlemen. I am not an expert or a scholar or an activist. I am more of an eye-witness. I watched the Soviet Union collapse, and I have tried to put my observations into a concise message. I will leave it up to you to decide just how urgent a message it is.

My talk tonight is about the lack of collapse-preparedness here in the United States. I will compare it with the situation in the Soviet Union, prior to its collapse. The rhetorical device I am going to use is the "Collapse Gap" – to go along with the Nuclear Gap, and the Space Gap, and various other superpower gaps that were fashionable during the Cold War.

Slide [2] The subject of economic collapse is generally a sad one. But I am an optimistic, cheerful sort of person, and I believe that, with a bit of preparation, such events can be taken in stride. As you can probably surmise, I am actually rather keen on observing economic collapses. Perhaps when I am really old, all collapses will start looking the same to me, but I am not at that point yet.

And this next one certainly has me intrigued. From what I've seen and read, it seems that there is a fair chance that the U.S. economy will collapse sometime within the foreseeable future. It also would seem that we won't be particularly well-prepared for it. As things stand, the U.S. economy is poised to perform something like a disappearing act. And so I am eager to put my observations of the Soviet collapse to good use.

Slide [3] I anticipate that some people will react rather badly to having their country compared to the USSR. I would like to assure you that the Soviet people would have reacted similarly, had the United States collapsed first. Feelings aside, here are two 20th century superpowers, who wanted more or less the same things – things like technological progress, economic growth, full employment, and world domination – but they disagreed about the methods. And they obtained similar results – each had a good run, intimidated the whole planet, and kept the other scared. Each eventually went bankrupt.

Slide [4] The USA and the USSR were evenly matched in many categories, but let me just mention four.

The Soviet manned space program is alive and well under Russian management, and now offers first-ever space charters. The Americans have been hitching rides on the Soyuz while their remaining spaceships sit in the shop.

The arms race has not produced a clear winner, and that is excellent news, because Mutual Assured Destruction remains in effect. Russia still has more nuclear warheads than the US, and has supersonic cruise missile technology that can penetrate any missile shield, especially a nonexistent one.

The Jails Race once showed the Soviets with a decisive lead, thanks to their innovative GULAG program. But they gradually fell behind, and in the end the Jails Race has been won by the Americans, with the highest percentage of people in jail ever.

The Hated Evil Empire Race is also finally being won by the Americans. It's easy now that they don't have anyone to compete against.

Slide [5] Continuing with our list of superpower similarities, many of the problems that sank the Soviet Union are now endangering the United States as well. Such as a huge, well-equipped, very expensive military, with no clear mission, bogged down in fighting Muslim insurgents. Such as energy shortfalls linked to peaking oil production. Such as a persistently unfavorable trade balance, resulting in runaway foreign debt. Add to that a delusional self-image, an inflexible ideology, and an unresponsive political system.

Slide [6] An economic collapse is amazing to observe, and very interesting if described accurately and in detail. A general description tends to fall short of the mark, but let me try. An economic arrangement can continue for quite some time after it becomes untenable, through sheer inertia. But at some point a tide of broken promises and invalidated assumptions sweeps it all out to sea. One such untenable arrangement rests on the notion that it is possible to perpetually borrow more and more money from abroad, to pay for more and more energy imports, while the price of these imports continues to double every few years. Free money with which to buy energy equals free energy, and free energy does not occur in nature. This must therefore be a transient condition. When the flow of energy snaps back toward equilibrium, much of the US economy will be forced to shut down.

Slide [7] I've described what happened to Russia in some detail in one of my articles, which is available on SurvivingPeakOil.com. I don't see why what happens to the United States should be entirely dissimilar, at least in general terms. The specifics will be different, and we will get to them in a moment. We should certainly expect shortages of fuel, food, medicine, and countless consumer items, outages of electricity, gas, and water, breakdowns in transportation systems and other infrastructure, hyperinflation, widespread shutdowns and mass layoffs, along with a lot of despair, confusion, violence, and lawlessness. We definitely should not expect any grand rescue plans, innovative technology programs, or miracles of social cohesion.

Slide [8] When faced with such developments, some people are quick to realize what it is they have to do to survive, and start doing these things, generally without anyone's permission. A sort of economy emerges, completely informal, and often semi-criminal. It revolves around liquidating, and recycling, the remains of the old economy. It is based on direct access to resources, and the threat of force, rather than ownership or legal authority. People who have a problem with this way of doing things, quickly find themselves out of the game.

These are the generalities. Now let's look at some specifics.

Slide [9] One important element of collapse-preparedness is making sure that you don't need a functioning economy to keep a roof over your head. In the Soviet Union, all housing belonged to the government, which made it available directly to the people. Since all housing was also built by the government, it was only built in places that the government could service using public transportation. After the collapse, almost everyone managed to keep their place.

In the United States, very few people own their place of residence free and clear, and even they need an income to pay real estate taxes. People without an income face homelessness. When the economy collapses, very few people will continue to have an income, so homelessness will become rampant. Add to that the car-dependent nature of most suburbs, and what you will get is mass migrations of homeless people toward city centers.

Slide [10] Soviet public transportation was more or less all there was, but there was plenty of it. There were also a few private cars, but so few that gasoline rationing and shortages were mostly inconsequential. All of this public infrastructure was designed to be almost infinitely maintainable, and continued to run even as the rest of the economy collapsed.

The population of the United States is almost entirely car-dependent, and relies on markets that control oil import, refining, and distribution. They also rely on continuous public investment in road construction and repair. The cars themselves require a steady stream of imported parts, and are not designed to last very long. When these intricately interconnected systems stop functioning, much of the population will find itself stranded.

Slide [11] Economic collapse affects public sector employment almost as much as private sector employment, eventually. Because government bureaucracies tend to be slow to act, they collapse more slowly. Also, because state-owned enterprises tend to be inefficient, and stockpile inventory, there is plenty of it left over, for the employees to take home, and use in barter. Most Soviet employment was in the public sector, and this gave people some time to think of what to do next.

Private enterprises tend to be much more efficient at many things. Such as laying off their people, shutting their doors, and liquidating their assets. Since most employment in the United States is in the private sector, we should expect the transition to permanent unemployment to be quite abrupt for most people.

Slide [12] When confronting hardship, people usually fall back on their families for support. The Soviet Union experienced chronic housing shortages, which often resulted in three generations living together under one roof. This didn't make them happy, but at least they were used to each other. The usual expectation was that they would stick it out together, come what may.

In the United States, families tend to be atomized, spread out over several states. They sometimes have trouble tolerating each other when they come together for Thanksgiving, or Christmas, even during the best of times. They might find it difficult to get along, in bad times. There is already too much loneliness in this country, and I doubt that economic collapse will cure it.

Slide [13] To keep evil at bay, Americans require money. In an economic collapse, there is usually hyperinflation, which wipes out savings. There is also rampant unemployment, which wipes out incomes. The result is a population that is largely penniless.

In the Soviet Union, very little could be obtained for money. It was treated as tokens rather than as wealth, and was shared among friends. Many things – housing and transportation among them – were either free or almost free.

Slide [14] Soviet consumer products were always an object of derision – refrigerators that kept the house warm – and the food, and so on. You'd be lucky if you got one at all, and it would be up to you to make it work once you got it home. But once you got it to work, it would become a priceless family heirloom, handed down from generation to generation, sturdy, and almost infinitely maintainable.

In the United States, you often hear that something "is not worth fixing." This is enough to make a Russian see red. I once heard of an elderly Russian who became irate when a hardware store in Boston wouldn't sell him replacement bedsprings: "People are throwing away perfectly good mattresses, how am I supposed to fix them?"

Economic collapse tends to shut down both local production and imports, and so it is vitally important that anything you own wears out slowly, and that you can fix it yourself if it breaks. Soviet-made stuff generally wore incredibly hard. The Chinese-made stuff you can get around here – much less so.

Slide [15] The Soviet agricultural sector was notoriously inefficient. Many people grew and gathered their own food even in relatively prosperous times. There were food warehouses in every city, stocked according to a government allocation scheme. There were very few restaurants, and most families cooked and ate at home. Shopping was rather labor-intensive, and involved carrying heavy loads. Sometimes it resembled hunting – stalking that elusive piece of meat lurking behind some store counter. So the people were well-prepared for what came next.

In the United States, most people get their food from a supermarket, which is supplied from far away using refrigerated diesel trucks. Many people don't even bother to shop and just eat fast food. When people do cook, they rarely cook from scratch. This is all very unhealthy, and the effect on the nation's girth is visible, clear across the parking lot. A lot of the people, who just waddle to and from their cars, seem unprepared for what comes next. If they suddenly had to start living like the Russians, they would blow out their knees.

Slide [16] The Soviet government threw resources at immunization programs, infectious disease control, and basic care. It directly operated a system of state-owned clinics, hospitals, and sanatoriums. People with fatal ailments or chronic conditions often had reason to complain, and had to pay for private care – if they had the money.

In the United States, medicine is for profit. People seem to think nothing of this fact. There are really very few fields of endeavor to which Americans would deny the profit motive. The problem is, once the economy is removed, so is the profit, along with the services it once helped to motivate.

Slide [17] The Soviet education system was generally quite excellent. It produced an overwhelmingly literate population and many great specialists. The education was free at all levels, but higher education sometimes paid a stipend, and often provided room and board. The educational system held together quite well after the economy collapsed. The problem was that the graduates had no jobs to look forward to upon graduation. Many of them lost their way.

The higher education system in the United States is good at many things – government and industrial research, team sports, vocational training... Primary and secondary education fails to achieve in 12 years what Soviet schools generally achieved in 8. The massive scale and expense of maintaining these institutions is likely to prove too much for the post-collapse environment. Illiteracy is already a problem in the United States, and we should expect it to get a lot worse.

Slide [18] The Soviet Union did not need to import energy. The production and distribution system faltered, but never collapsed. Price controls kept the lights on even as hyperinflation raged.

The term "market failure" seems to fit the energy situation in the United States. Free markets develop some pernicious characteristics when there are shortages of key commodities. During World War II, the United States government understood this, and successfully rationed many things, from gasoline to bicycle parts. But that was a long time ago. Since then, the inviolability of free markets has become an article of faith.

Slide [19] My conclusion is that the Soviet Union was much better-prepared for economic collapse than the United States is.

I have left out two important superpower asymmetries, because they don't have anything to do with collapse-preparedness. Some countries are simply luckier than others. But I will mention them, for the sake of completeness.

In terms of racial and ethnic composition, the United States resembles Yugoslavia more than it resembles Russia, so we shouldn't expect it to be as peaceful as Russia was, following the collapse. Ethnically mixed societies are fragile and have a tendency to explode.

In terms of religion, the Soviet Union was relatively free of apocalyptic doomsday cults. Very few people there wished for a planet-sized atomic fireball to herald the second coming of their savior. This was indeed a blessing.

Slide [20] One area in which I cannot discern any Collapse Gap is national politics. The ideologies may be different, but the blind adherence to them couldn't be more similar.

It is certainly more fun to watch two Capitalist parties go at each other than just having the one Communist party to vote for. The things they fight over in public are generally symbolic little tokens of social policy, chosen for ease of public posturing. The Communist party offered just one bitter pill. The two Capitalist parties offer a choice of two placebos. The latest innovation is the photo finish election, where each party buys 50% of the vote, and the result is pulled out of statistical noise, like a rabbit out of a hat.

The American way of dealing with dissent and with protest is certainly more advanced: why imprison dissidents when you can just let them shout into the wind to their heart's content?

The American approach to bookkeeping is more subtle and nuanced than the Soviet. Why make a state secret of some statistic, when you can just distort it, in obscure ways? Here's a simple example: inflation is "controlled" by substituting hamburger for steak, in order to minimize increases to Social Security payments.

Slide [21] Many people expend a lot of energy protesting against their irresponsible, unresponsive government. It seems like a terrible waste of time, considering how ineffectual their protests are. Is it enough of a consolation for them to be able to read about their efforts in the foreign press? I think that they would feel better if they tuned out the politicians, the way the politicians tune them out. It's as easy as turning off the television set. If they try it, they will probably observe that nothing about their lives has changed, nothing at all, except maybe their mood has improved. They might also find that they have more time and energy to devote to more important things.

Slide [22] I will now sketch out some approaches, realistic and otherwise, to closing the Collapse Gap. My little list of approaches might seem a bit glib, but keep in mind that this is a very difficult problem. In fact, it's important to keep in mind that not all problems have solutions. I can promise you that we will not solve this problem tonight. What I will try to do is to shed some light on it from several angles.

Slide [23] Many people rail against the unresponsiveness and irresponsibility of the government. They often say things like "What is needed is..." plus the name of some big, successful government project from the glorious past – the Marshall Plan, the Manhattan Project, the Apollo program. But there is nothing in the history books about a government preparing for collapse. Gorbachev's "Perestroika" is an example of a government trying to avert or delay collapse. It probably helped speed it along.

Slide [24] There are some things that I would like the government to take care of in preparation for collapse. I am particularly concerned about all the radioactive and toxic installations, stockpiles, and dumps. Future generations are unlikely to be able to control them, especially if global warming puts them underwater. There is enough of this muck sitting around to kill off most of us. I am also worried about soldiers getting stranded overseas – abandoning one's soldiers is among the most shameful things a country can do. Overseas military bases should be dismantled, and the troops repatriated. I'd like to see the huge prison population whittled away in a controlled manner, ahead of time, instead of in a chaotic general amnesty. Lastly, I think that this farce with debts that will never be repaid has gone on long enough. Wiping the slate clean will give society time to readjust. So, you see, I am not asking for any miracles. Although, if any of these things do get done, I would consider it a miracle.

Slide [25] A private sector solution is not impossible; just very, very unlikely. Certain Soviet state enterprises were basically states within states. They controlled what amounted to an entire economic system, and could go on even without the larger economy. They kept to this arrangement even after they were privatized. They drove Western management consultants mad, with their endless kindergartens, retirement homes, laundries, and free clinics. These weren't part of their core competency, you see. They needed to divest and to streamline their operations. The Western management gurus overlooked the most important thing: the core competency of these enterprises lay in their ability to survive economic collapse. Maybe the young geniuses at Google can wrap their heads around this one, but I doubt that their stockholders will.

Slide [26] It's important to understand that the Soviet Union achieved collapse-preparedness inadvertently, and not because of the success of some crash program. Economic collapse has a way of turning economic negatives into positives. The last thing we want is a perfectly functioning, growing, prosperous economy that suddenly collapses one day, and leaves everybody in the lurch. It is not necessary for us to embrace the tenets of command economy and central planning to match the Soviet lackluster performance in this area. We have our own methods that are working almost as well. I call them "boondoggles." They are solutions to problems that cause more problems than they solve.

Just look around you, and you will see boondoggles sprouting up everywhere, in every field of endeavor: we have military boondoggles like Iraq, financial boondoggles like the doomed retirement system, medical boondoggles like private health insurance, legal boondoggles like the intellectual property system. The combined weight of all these boondoggles is slowly but surely pushing us all down. If it pushes us down far enough, then economic collapse, when it arrives, will be like falling out of a ground floor window. We just have to help this process along, or at least not interfere with it. So if somebody comes to you and says "I want to make a boondoggle that runs on hydrogen" – by all means encourage him! It's not as good as a boondoggle that burns money directly, but it's a step in the right direction.

Slide [27] Certain types of mainstream economic behavior are not prudent on a personal level, and are also counterproductive to bridging the Collapse Gap. Any behavior that might result in continued economic growth and prosperity is counterproductive: the higher you jump, the harder you land. It is traumatic to go from having a big retirement fund to having no retirement fund because of a market crash. It is also traumatic to go from a high income to little or no income. If, on top of that, you have kept yourself incredibly busy, and suddenly have nothing to do, then you will really be in rough shape.

Economic collapse is about the worst possible time for someone to suffer a nervous breakdown, yet this is what often happens. The people who are most at risk psychologically are successful middle-aged men. When their career is suddenly over, their savings are gone, and their property worthless, much of their sense of self-worth is gone as well. They tend to drink themselves to death and commit suicide in disproportionate numbers. Since they tend to be the most experienced and capable people, this is a staggering loss to society.

If the economy, and your place within it, is really important to you, you will be really hurt when it goes away. You can cultivate an attitude of studied indifference, but it has to be more than just a conceit. You have to develop the lifestyle and the habits and the physical stamina to back it up. It takes a lot of creativity and effort to put together a fulfilling existence on the margins of society. After the collapse, these margins may turn out to be some of the best places to live.

Slide [28] I hope that I didn't make it sound as if the Soviet collapse was a walk in the park, because it was really quite awful in many ways. The point that I do want to stress is that when this economy collapses, it is bound to be much worse. Another point I would like to stress is that collapse here is likely to be permanent. The factors that allowed Russia and the other former Soviet republics to recover are not present here.

In spite of all this, I believe that in every age and circumstance, people can sometimes find not just a means and a reason to survive, but enlightenment, fulfillment, and freedom. If we can find them even after the economy collapses, then why not start looking for them now?

Thursday, August 13, 2009

The JvL Bi-Weekly for 081509

Saturday, August 15th, 2009

Volume 8, No. 15

5 articles, 27 pages

1. Editor's notice

2. Editor's 2nd notice

3. Our Suicide Bombers

4. Hiroshima Day

5. Hiroshima, Nagasaki Atom Bombs Was Right Decision



(Editor's notice: "Conclusions. At the close of this long and arid survey—partaking of the nature of catalogue—it seems worth while to bring together the important conclusion for political science which the data presented appear to warrant.

The movement for the Constitution of the United States was originated and carried through principally by four groups of personalty interests which had been adversely affected under the Articles of Confederation: money, public securities, manufactures, and trade and shipping.

The first firm steps toward the formation of the Constitution were taken by a small and active group of men immediately interested through their personal possessions in the outcome of their labors.

No popular vote was taken directly or indirectly on the proposition to call the Convention which drafted the Constitution.

A large propertyless mass was, under the prevailing suffrage qualifications, excluded at the outset from participation (through representatives) in the work of framing the Constitution.

The members of the Philadelphia Convention which drafted the Constitution were, with a few exceptions, immediately, directly, and personally interested in, and derived economic advantages from, the establishment of the new system.

The Constitution was essentially an economic document based upon the concept that the fundamental private rights of property are anterior to government and morally beyond the reach of popular majorities.

The major portion of the members of the Convention are on record as recognizing the claim of property to a special and defensive position in the Constitution.

In the ratification of the Constitution, about three fourths of the adult males failed to vote on the question, having abstained from the elections at which delegates to the state conventions were chosen, either on account of their indifference or their disfranchisement by property qualifications.

The Constitution was ratified by a vote of probably not more than one-sixth of the adult males.

It is questionable whether a majority of the voters participating in the elections for the state conventions in New York, Massachusetts, New Hampshire, Virginia, and South Carolina, actually approved the ratification of the Constitution.

The leaders who supported the Constitution in the ratifying conventions represented the same economic groups as the members of the Philadelphia Convention; and in a large number of instances they were also directly and personally interested in the outcome of their efforts.

In the ratification, it became manifest that the line of cleavage for and against the Constitution was between substantial personalty interests on the one hand and the small farming and debtor interests on the other.

The Constitution was not created by "the whole people" as the jurists have said; neither was it created by "the states" as Southern nullifiers long contended; but it was the work of a consolidated group whose interests knew no state boundaries and were truly national in their scope."

Charles Austin Beard, An Economic Interpretation of the Constitution of the United States.)

(Editor's 2nd notice: So you invested a lot of illusions in a Democrat and adopted the standard liberal "sky-is-falling" excuse toward the Republicans. These are the same lesser-evilism rationalizations we've been hearing from centrist liberals for several generations now.

There are two chronic mistakes people like you always make: (a) you overestimate the progressive potential of the Democrats and (b) overestimate how much worse the Republicans are going to be. Go back to 2006, when you were investing such fervent antiwar hopes in electing a Democratic Congress. The Democrats, of course, continued to vote to fund the war in Iraq, and you had to eat crow on your choice in that election, and your ilk issued thunderous post hoc denunciations of the Democrats' treacheries. But then you stepped right back into the same trap in 2008.

Here's the problem--your denunciations of the mainstream liberal Democrats are always POST HOC, always after the elections, when you protest how badly you've been betrayed and wounded by the Democrats' betrayals. But this breast-beating is the result of NEVER LEARNING from past EMPIRICAL REALITY, and always repeating the same mistake--as though your previous post hoc revelations evaporate by election day of the next election cycle. Norman Solomon and David Lindorff follow exactly the same pattern.

As for (b), overestimating the danger of the Republicans--or the Chicken Little argument for voting Democrat--the problem is this: any significant differences you posit between the mainstream elements of the two parties are always CONJECTURAL and COUNTERFACTUAL, based on what you expect the Republicans would do once in office. But EMPIRICALLY, WHEN IN OFFICE, the Democrats ARE NEVER ANY DIFFERENT--on foreign or domestic policy. Yet you keep stubbornly expecting them to be so. This is simply an example of failing to learn from experience—the experience of what both major parties actually do while in office, which refutes both your chronic prospective illusions about the Democrats (always followed by retrospective sense of betrayal!) and your Chicken Little hyperbole about the Republicans (yes, Bush was bad news, but he did NOT institute outright fascism, as you and other Chicken Littles predicted in 2004, and ALL of his policies were seconded and funded and authorized by the mainstream Democrats--all of them not just "Blue Dogs").

Is it possible that the country is a hair less dumb and more sane with Obama rather than McCain in the White House? Yes . . . but only by a hair, and only in ways that are mostly symbolic. Dem apologists like you always pose counterfactual hypotheses about extreme measures you expect the Republicans to make, or moderately progressive ones you expect from the Democrats; but neither imagined course ever comes to pass, and empirically, while in office, these knaves always follow pretty much the same policies in all the areas that matter. So your methodology of rationalizing your votes for Democrats is always nonempirical and always refuted by the facts of actual history.

Moreover, your approach guarantees that you and others will always be trapped by the duopoly shell game. If one group pretends--and I emphasize "pretends"-- to be so much worse than the other, then you and others can easily be scared into supporting the least worst, time after time, with the result that we always get some variant of "worst" and never any alternative. There has to be a decision, at some point, that the entire paradigm of financial fraud and imperial adventure will be repudiated, that people will begin devoting their energies to posing and building an alternative, rather than being bamboozled into settling for what will be at best a marginally-- and only marginally, if at all--less repugnant variant of the reigning barbarism. You have no business ever choosing barbarism--even barbarism with a "human" face--the human face of the focus-group marketers, of course.

If we are ever to break out of this closed paradigm, we must break with it decisively. Given the imminence of total economic collapse, brazen looting of the Treasury, and global-warming disaster, there is no longer any time to indulge in hair-splitting scholasticism over preferred variants of barbarism. We must act boldly to press for those measures that will challenge the barbaric paradigm once and for all. If those measures will not and cannot be taken up by any significant and influential sector of the Democrats--and we have seen over and over and over that this is the case--then we must stop playing their game and begin the hard work of saving this planet--for no less than that is at stake.

That means insisting on single-payer, nationalizing the banks, cutting military spending, and so on. The Democratic Party is a swamp where these demands sink into oblivion. THERE IS NO TIME TO PLAY THIS GAME ANY LONGER. (van Mungo)





3. OUR SUICIDE BOMBERS

BY

JOHN FEFFER AND TOM ENGELHARDT

The way you imagine someone engaged in a suicide attack depends, not surprisingly, on which end of the attack you happen to be on — in cultural, if not literal terms. In American films and pop culture, there were few acts more inexplicable or malevolent in the years of my childhood than those of Japan’s kamikaze pilots (and, in a few cases, submariners), the state-organized suicide bombers of World War II who targeted the U.S. fleet with their weapons and their lives. Americans themselves were "incapable" of such kamikaze acts not because they didn’t commit them, but because, when performed by someone known to us, in the name of a cause we cherished, or to save us from being overrun, such acts were no longer recognizable as kamikaze acts at all. Under those circumstances, each represented a profound gift of life to those left behind.

In the desperate early days of 1942 in the Pacific, for instance, there were a number of reported cases in which American pilots tried to dive their planes into Japanese ships. According to Edward F. Murphy in Heroes of WWII, Captain Richard E. Fleming, the only recipient of the Congressional Medal of Honor for the Battle of Midway, was leading his dive bomber squadron in an attack on the disabled cruiser Mikuma when his plane was hit by anti-aircraft fire. It "rocked wildly… but… soon righted itself and continued down under control. At an altitude of only 350 feet, Fleming released his bomb. Then he followed it straight down to the Japanese carrier." His hometown, St. Paul, Minnesota, later named its airport in his honor.

In the same way, "Colin" became a popular first name for boys (including, evidently, Colin Powell) because of war hero Captain Colin P. Kelly, Jr., who was generally (if incorrectly) believed to have won the Medal of Honor for plunging his B-17 into the smokestack of the Japanese battleship Haruna — he didn’t — in the first days of the Pacific war.

This sort of American heroism, as John Feffer, co-director of the website Foreign Policy in Focus and TomDispatch regular, indicates below, was highlighted in war films of those years. There was even a celluloid version of kamikaze sex. As film critic Jeanine Basinger wrote in The World War II Combat Film, nurse Veronica Lake, trapped by the Japanese on the Bataan peninsula in So Proudly We Hail (1943), "places a hand inside her blouse… and walks slowly toward the enemy in her combat fatigues. As she nears them, she takes off her helmet, and releases her long, very blonde hair over her shoulders. When they come near her in obvious delight, she pulls the pin on her grenade…" In fact, many war films of that time had a kamikaze feel to them, but as "we" were defending "home" and knew ourselves for the individuals we were, the act of diving a plane into a bridge or refusing to leave a platoon certain to be wiped out bore no relation to suicidal enemy acts.

To understand and deal with our world, it’s often less than useful to look on the enemy, in our case today "the terrorist," as something other than human (whether super-human or sub-human) rather than as another one of those strange creatures like ourselves. But let Feffer take it from here. Tom

Our Suicide Bombers

Thoughts on Western Jihad
By John Feffer

The actor Will Smith is no one’s image of a suicide bomber. With his boyish face, he has often played comic roles. Even as the last man on earth in I Am Legend, he retains a wise-cracking, ironic demeanor. And yet, surrounded by a horde of hyperactive vampires at the end of that film, Smith clasps a live grenade to his chest and throws himself at the enemy in a final burst of heroic sacrifice.

Wait a second: surely that wasn’t a suicide bombing. Will Smith wasn’t reciting suras from the Koran. He wasn’t sporting one of those rising sun headbands that the Japanese kamikaze wore for their suicide missions. He wasn’t playing a religious fanatic or a political extremist. Will Smith was the hero of the film. So how could he be a suicide bomber? After all, he’s one of us, isn’t he?

As it happens, we have our suicide bombers too. "We" are the powerful, developed countries, the ones with an overriding concern for individual liberties and individual lives. "We" form a moral archipelago that encompasses the United States, Europe, Israel, present-day Japan, and occasionally Russia. Whether in real war stories or inspiring vignettes served up in fiction and movies, our lore is full of heroes who sacrifice themselves for motherland, democracy, or simply their band of brothers. Admittedly, these men weren’t expecting 72 virgins in paradise and they didn’t make film records of their last moments, but our suicidal heroes generally have received just as much praise and recognition as "their" martyrs.

The scholarly work on suicide bombers is large and growing. Most of these studies focus on why those other people do such terrible things, sometimes against their own compatriots but mainly against us. According to the popular view, Shi’ite or Tamil or Chechen suicide martyrs have a fundamentally different attitude toward life and death.

If, however, we have our own rich tradition of suicide bombers — and our own unfortunate tendency to kill civilians in our military campaigns — how different can these attitudes really be?

Western Jihad

In America’s first war against Islam, we were the ones who introduced the use of suicide bombers. Indeed, the American seamen who perished in the incident were among the U.S. military’s first missing in action.

It was September 4, 1804. The United States was at war with the Barbary pirates along the North African coast. The U.S. Navy was desperate to penetrate the enemy defenses. Commodore Edward Preble, who headed up the Third Mediterranean Squadron, chose an unusual stratagem: sending a booby-trapped U.S.S. Intrepid into the bay at Tripoli, one of the Barbary states of the Ottoman empire, to blow up as many of the enemy’s ships as possible. U.S. sailors packed 10,000 pounds of gunpowder into the boat along with 150 shells.

When Lieutenant Richard Somers, who commanded the vessel, addressed his crew on the eve of the mission, a midshipman recorded his words:

"’No man need accompany him, who had not come to the resolution to blow himself up, rather than be captured; and that such was fully his own determination!’ Three cheers was the only reply. The gallant crew rose, as a single man, with the resolution yielding up their lives, sooner than surrender to their enemies: while each stepped forth, and begged as a favor, that he might be permitted to apply the match!"

The crew of the boat then guided the Intrepid into the bay at night. So as not to be captured and lose so much valuable gunpowder to the enemy, they chose to blow themselves up with the boat. The explosion didn’t do much damage — at most, one Tripolitan ship went down — but the crew was killed just as surely as the two men who plowed a ship piled high with explosives into the U.S.S. Cole in the Gulf of Aden nearly 200 years later.

Despite the failure of the mission, Preble received much praise for his strategies. "A few brave men have been sacrificed, but they could not have fallen in a better cause," opined a British navy commander. The Pope went further: "The American commander, with a small force and in a short space of time, has done more for the cause of Christianity than the most powerful nations of Christendom have done for ages!"

Preble chose his tactic because his American forces were outgunned. It was a Hail Mary attempt to level the playing field. The bravery of his men and the reaction of his supporters could be easily transposed to the present day, when "fanatics" fighting against similar odds beg to sacrifice themselves for the cause of Islam and garner the praise of at least some of their religious leaders.

The blowing up of the Intrepid was not the only act of suicidal heroism in U.S. military history. We routinely celebrate the brave sacrifices of soldiers who knowingly give up their lives in order to save their unit or achieve a larger military mission. We commemorate the sacrifice of the defenders of the Alamo, who could have, after all, slunk away to save themselves and fight another day. The poetry of the Civil War is rich in the language of sacrifice. In Phoebe Cary’s poem "Ready" from 1861, a black sailor, "no slavish soul had he," volunteers for certain death to push a boat to safety.

The heroic sacrifices of the twentieth century are, of course, commemorated in film. Today, you can buy several videos devoted to the "suicide missions" of American soldiers.

Our World War II propaganda films — er, wartime entertainments — often featured brave soldiers facing certain death. In Flying Tigers (1942), for example, pilot Woody Jason anticipates the Japanese kamikaze by several years by flying a plane into a bridge to prevent a cargo train from reaching the enemy. In Bataan (1943), Robert Taylor leads a crew of 13 men in what they know will be the suicidal defense of a critical position against the Japanese. With remarkable sangfroid, the soldiers keep up the fight as they are picked off one by one until only Taylor is left. The film ends with him manning a machine gun against wave upon wave of oncoming Japanese.

Our warrior culture continues to celebrate the heroism of these larger-than-life figures from World War II by taking real-life stories and turning them into Hollywood-style entertainments. For his series of "war stories" on Fox News, for instance, Oliver North narrates an episode on the Doolittle raid, an all-volunteer mission to bomb Tokyo shortly after Pearl Harbor. Since the bombers didn’t have enough fuel to return to their bases, the 80 airmen committed to what they expected to be a suicide mission. Most of them survived, miraculously, but they had been prepared for the ultimate sacrifice — and that is how they are billed today. "These are the men who restored the confidence of a shaken nation and changed the course of the Second World War," the promotional material for the episode rather grandly reports. Tokyo had the same hopes for its kamikaze pilots a few years later.

Why Suicide Missions?

America did not, of course, dream up suicide missions. They form a rich vein in the Western tradition. In the Bible, Samson sacrificed himself in bringing down the temple on the Philistine leadership, killing more through his death than he did during his life. The Spartans, at Thermopylae, faced down the Persians, knowing that the doomed effort would nevertheless delay the invading army long enough to give the Athenians time to prepare Greek defenses. In the first century AD in the Roman province of Judea, Jewish Zealots and Sicarians ("dagger men") launched suicide missions, mostly against Jewish moderates, to provoke an uprising against Roman rule.

Later, suicide missions played a key role in European history. "Books written in the post-9/11 period tend to place suicide bombings only in the context of Eastern history and limit them to the exotic rebels against modernism," writes Nicolo Caldararo in an essay on suicide bombers. "A study of the late 19th century and early 20th would provide a spate of examples of suicide bombers and assassins in the heart of Europe." These included various European nationalists, Russian anarchists, and other early practitioners of terrorism.

Given the plethora of suicide missions in the Western tradition, it should be difficult to argue that the tactic is unique to Islam or to fundamentalists. Yet some scholars enjoy constructing a restrictive genealogy for such missions that connects the Assassin sect (which went after the great sultan Saladin in the Levant in the twelfth century) to Muslim suicide guerrillas of the Philippines (first against the Spanish and then, in the early twentieth century, against Americans). They take this genealogy all the way up to more recent suicide campaigns by Hezbollah, Hamas, al-Qaeda, and Islamic rebels in the Russian province of Chechnya. The Tamil Tigers of Sri Lanka, who used suicide bombers in a profligate fashion, are ordinarily the only major non-Muslim outlier included in this series.

Uniting our suicide attackers and theirs, however, are the reasons behind the missions. Three salient common factors stand out. First, suicidal attacks, including suicide bombings, are a "weapon of the weak," designed to level the playing field. Second, they are usually used against an occupying force. And third, they are cheap and often brutally effective.

We commonly associate suicide missions with terrorists. But states and their armies, when outnumbered, will also launch such missions against their enemies, as Preble did against Tripoli or the Japanese attempted near the end of World War II. To make up for its technological disadvantages, the Iranian regime sent waves of young volunteers, some unarmed and some reportedly as young as nine years old, against the then-U.S.-backed Iraqi army in the Iran-Iraq War of the 1980s.

Non-state actors are even more prone to launch suicide missions against occupying forces. Remove the occupying force, as Robert Pape argues in his groundbreaking book on suicide bombers, Dying to Win, and the suicide missions disappear. It is not a stretch, then, to conclude that we, the occupiers (the United States, Russia, Israel), through our actions, have played a significant part in fomenting the very suicide missions that we now find so alien and incomprehensible in Iraq, Afghanistan, Chechnya, Lebanon, and elsewhere.

The archetypal modern suicide bomber first emerged in Lebanon in the early 1980s, a response to Israel’s invasion and occupation of the country. "The Shi’ite suicide bomber," writes Mike Davis in his book on the history of the car bomb, Buda’s Wagon, "was largely a Frankenstein monster of [Israeli Defense Minister] Ariel Sharon’s deliberate creation." Not only did U.S. and Israeli occupation policies create the conditions that gave birth to these missions, but the United States even trained some of the perpetrators. The U.S. funded Pakistan’s intelligence service to run a veritable insurgency training school that processed 35,000 foreign Muslims to fight the Soviets in Afghanistan in the 1980s. Charlie Wilson’s War, the book and movie that celebrated U.S. assistance to the mujahedeen, could be subtitled: Suicide Bombers We Have Known and Funded.

Finally, the technique "works." Suicide bombers kill 12 times more people per incident than conventional terrorism, national security specialist Mohammed Hafez points out. The U.S. military has often publicized the "precision" of its airborne weaponry, of its "smart" bombs and missiles. But in truth, suicide bombers are the "smartest" bombers because they can zero in on their target in a way no missile can — from close up — and so make last-minute corrections for accuracy. In addition, by blasting themselves to smithereens, suicide bombers can’t give away any information about their organization or its methods after the act, thus preserving the security of the group. You can’t argue with success, however bloodstained it might be. Only when the tactic itself becomes less effective or counterproductive does it recede into the background, as seems to be the case today among armed Palestinian groups.

Individual motives for becoming a suicide bomber or attacker have, when studied, proved to be surprisingly diverse. We tend to ascribe heroism to our soldiers when, against the odds, they sacrifice themselves for us, while we assume a glassy-eyed fanaticism on the part of those who go up against us. But close studies of suicide bombers suggest that they are generally not crazy, nor — another popular explanation — just acting out of abysmal poverty or economic desperation (though, as in the case of the sole surviving Mumbai suicide attacker put on trial in India recently, this seems to have been the motivation). "Not only do they generally not have economic problems, but most of the suicide bombers also do not have an emotional disturbance that prevents them from differentiating between reality and imagination," writes Anat Berko in her careful analysis of the topic, The Path to Paradise. Despite suggestions from Iraqi and U.S. officials that suicide bombers in Iraq have been coerced into participating in their missions, scholars have yet to record such cases.

Perhaps, however, this reflects a narrow understanding of coercion. After all, our soldiers are indoctrinated into a culture of heroic sacrifice just as are the suicide bombers of Hamas. The indoctrination doesn’t always work: scores of U.S. soldiers go AWOL or join the peace movement just as some suicide bombers give up at the last minute. But the basic-training techniques of instilling the instinct to kill, the readiness to follow orders, and a willingness to sacrifice one’s life are part of the warrior ethic everywhere.

Suicide missions are, then, a military technique that armies use when outmatched and that guerrilla movements use, especially in occupied countries, to achieve specific objectives. Those who volunteer for such missions, whether in Iraq today or on board the Intrepid in 1804, are usually placing a larger goal — liberty, national self-determination, ethnic or religious survival — above their own lives.

But wait: surely I’m not equating soldiers going on suicide missions against other soldiers with terrorists who blow up civilians in a public place. Indeed, these are two distinct categories. And yet much has happened in the history of modern warfare — in which civilians have increasingly become the victims of combat — to blur these distinctions.

Terror and Civilians

The conventional picture of today’s suicide bomber is a young man or woman, usually of Arab extraction, who makes a video proclamation of faith, straps on a vest of high explosives, and detonates him or herself in a crowded pizzeria, bus, marketplace, mosque, or church. But we must expand this picture. The September 11th hijackers targeted high-profile locations, including a military target, the Pentagon. Hezbollah’s suicidal truck driver destroyed the U.S. Marine barracks in Beirut on October 23, 1983, killing 241 U.S. soldiers. Thenmozhi Rajaratnam, a female Tamil suicide bomber, assassinated Indian Prime Minister Rajiv Gandhi in 1991.

Suicide bombers, in other words, have targeted civilians, military installations, non-military sites of great significance, and political leaders. In suicide attacks, Hezbollah, Tamil Tiger, and Chechen suicide bombers have generally focused on military and police targets: 88%, 71%, and 61% of the time, respectively. Hamas, on the other hand, has largely targeted civilians (74% of the time). Sometimes, in response to public opinion, such movements will shift focus — and targets. After a 1996 attack killed 91 civilians and created a serious image problem, the Tamil Tigers deliberately began choosing military, police, and government targets for their suicide attacks. "We don’t go after kids in Pizza Hut," one Tiger leader told researcher Mia Bloom, referring to a Hamas attack on a Sbarro outlet in Jerusalem that killed 15 civilians in 2001.

We have been conditioned into thinking of suicide bombers as targeting civilians and so putting themselves beyond the established conventions of war. As it happens, however, the nature of war has changed in our time. In the twentieth century, armies began to target civilians as a way of destroying the will of the population, and so bringing down the leadership of the enemy country. Japanese atrocities in China in the 1930s, the Nazi air war against Britain in World War II, Allied fire bombings of German and Japanese cities, the nuclear attacks against Hiroshima and Nagasaki, U.S. carpet bombing in Cambodia and Laos, the targeted assassinations of the Phoenix program during the Vietnam War, Russian depredations in Afghanistan and Chechnya, and the tremendous civilian casualties during the Iraq War: all this has made the idea of conventional armies clashing in an area far from civilian life a quaint legacy of the past.

Terrorist attacks against civilians, particularly September 11th, prompted military historian Caleb Carr to back the Bush administration’s declaration of a war against terror. "War can only be answered with war," he wrote in his best-selling The Lessons of Terror. "And it is incumbent on us to devise a style of war more imaginative, more decisive, and yet more humane than anything terrorists can contrive." This more imaginative, decisive, and humane style of war has, in fact, consisted of stepped-up aerial bombing, beefed-up Special Forces (to, in part, carry out targeted assassinations globally), and recently, the widespread use of unmanned aerial drones like the Predator and the Reaper, both in the American arsenal and in 24/7 use today over the Pakistani tribal borderlands. "Predators can become a modern army’s answer to the suicide bomber," Carr wrote.

Carr’s argument is revealing. As the U.S. military and Washington see it, the ideal use of Predator or Reaper drones, armed as they are with Hellfire missiles, is to pick off terrorist leaders; in other words, a mirror image of what that Tamil Tiger suicide bomber (who picked off the Indian prime minister) did somewhat more cost effectively. According to Carr, such a strategy with our robot planes is an effective and legitimate military tactic. In reality, though, such drone attacks regularly result in significant civilian casualties, usually referred to as "collateral damage." According to researcher Daniel Byman, the drones kill 10 civilians for every suspected militant. As Tom Engelhardt of TomDispatch.com writes, "In Pakistan, a war of machine assassins is visibly provoking terror (and terrorism), as well as anger and hatred among people who are by no means fundamentalists. It is part of a larger destabilization of the country."

So, the dichotomy between a "just war," or even simply a war of any sort, and the unjust, brutal targeting of civilians by terrorists has long been blurring, thanks to the constant civilian casualties that now result from conventional war-fighting and the narrow military targets of many terrorist organizations.

Moral Relativism?

We have our suicide bombers — we call them heroes. We have our culture of indoctrination — we call it basic training. We kill civilians — we call it collateral damage.

Is this, then, the moral relativism that so outrages conservatives? Of course not. I’ve been drawing these comparisons not to excuse the actions of suicide bombers, but to point out the hypocrisy of our black-and-white depictions of our noble efforts and their barbarous acts, of our worthy goals and their despicable ends. We — the inhabitants of an archipelago of supposedly enlightened warfare — have been indoctrinated to view the atomic bombing of Hiroshima as a legitimate act of war and September 11th as a heinous crime against humanity. We have been trained to see acts like the attack in Tripoli as American heroism and the U.S.S. Cole attack as rank barbarism. Explosive vests are a sign of extremism; Predator missiles, of advanced sensibility.

It would be far better if we opened our eyes when it came to our own world and looked at what we were actually doing. Yes, "they" sometimes have dismaying cults of sacrifice and martyrdom, but we do too. And who is to say that ending occupation is any less noble than making the world free for democracy? Will Smith, in I Am Legend, was willing to sacrifice himself to end the occupation of vampires. We should realize that our soldiers in the countries we now occupy may look no less menacing and unintelligible than those obviously malevolent, science-fiction creatures. And the presence of our occupying soldiers sometimes inspires similar, Will Smith-like acts of desperation and, dare I say it, courage.

The fact is: Were we to end our occupation policies, we would go a long way toward eliminating "their" suicide bombers. But when and how will we end our own cult of martyrdom?

4. HIROSHIMA DAY: AMERICA HAS BEEN ASLEEP AT THE WHEEL FOR 64 YEARS

BY

DANIEL ELLSBERG



It was a hot August day in Detroit. I was standing on a street corner downtown, looking at the front page of The Detroit News in a news rack. I remember a streetcar rattling by on the tracks as I read the headline: A single American bomb had destroyed a Japanese city. My first thought was that I knew exactly what that bomb was. It was the U-235 bomb we had discussed in school and written papers about, the previous fall.

I thought: "We got it first. And we used it. On a city."

I had a sense of dread, a feeling that something very ominous for humanity had just happened. A feeling, new to me as an American, at 14, that my country might have made a terrible mistake. I was glad when the war ended nine days later, but it didn't make me think that my first reaction on Aug. 6 was wrong.

Unlike nearly everyone else outside the Manhattan Project, my first awareness of the challenges of the nuclear era had occurred—and my attitudes toward the advent of nuclear weaponry had formed—some nine months earlier than those headlines, and in a crucially different context.

It was in a ninth-grade social studies class in the fall of 1944. I was 13, a boarding student on full scholarship at Cranbrook, a private school in Bloomfield Hills, Mich. Our teacher, Bradley Patterson, was discussing a concept that was familiar then in sociology, William F. Ogburn's notion of "cultural lag."

The idea was that the development of technology regularly moved much further and faster in human social-historical evolution than other aspects of culture: our institutions of government, our values, habits, our understanding of society and ourselves. Indeed, the very notion of "progress" referred mainly to technology. What "lagged" behind, what developed more slowly or not at all in social adaptation to new technology was everything that bore on our ability to control and direct technology and the use of technology to dominate other humans.

To illustrate this, Mr. Patterson posed a potential advance in technology that might be realized soon. It was possible now, he told us, to conceive of a bomb made of U-235, an isotope of uranium, which would have an explosive power 1,000 times greater than the largest bombs being used in the war that was then going on. German scientists in late 1938 had discovered that uranium could be split by nuclear fission, in a way that would release immense amounts of energy.

Several popular articles about the possibility of atomic bombs and specifically U-235 bombs appeared during the war in magazines like The Saturday Evening Post. None of these represented leaks from the Manhattan Project, whose very existence was top-secret. In every case they had been inspired by earlier articles on the subject that had been published freely in 1939 and 1940, before scientific self-censorship and then formal classification had set in. Patterson had come across one of these wartime articles. He brought the potential development to us as an example of one more possible leap by science and technology ahead of our social institutions.

Suppose, then, that one nation, or several, chose to explore the possibility of making this into a bomb, and succeeded. What would be the probable implications of this for humanity? How would it be used, by humans and states as they were today? Would it be, on balance, bad or good for the world? Would it be a force for peace, for example, or for destruction? We were to write a short essay on this, within a week.

I recall the conclusions I came to in my paper after thinking about it for a few days. As I remember, everyone in the class had arrived at much the same judgment. It seemed pretty obvious.

The existence of such a bomb—we each concluded—would be bad news for humanity. Mankind could not handle such a destructive force. It could not control it, safely, appropriately. The power would be "abused": used dangerously and destructively, with terrible consequences. Many cities would be destroyed entirely, just as the Allies were doing their best to destroy German cities without atomic bombs at that very time, just as the Germans earlier had attempted to do to Rotterdam and London. Civilization, perhaps our species, would be in danger of destruction.

It was just too powerful. Bad enough that bombs already existed that could destroy a whole city block. They were called "block-busters": 10 tons of high explosive. Humanity didn't need the prospect of bombs a thousand times more powerful, bombs that could destroy whole cities.

As I recall, this conclusion didn't depend mainly on who had the Bomb, or how many had it, or who got it first. And to the best of my memory, we in the class weren't addressing it as something that might come so soon as to bear on the outcome of the ongoing war. It seemed likely, the way the case was presented to us, that the Germans would get it first, since they had done the original science. But we didn't base our negative assessment on the idea that this would necessarily be a Nazi or German bomb. It would be a bad development, on balance, even if democratic countries got it first.

After we turned in our papers and discussed them in class, it was months before I thought of the issues again. I remember the moment when I did, on a street corner in Detroit. I can still see and feel the scene and recall my thoughts, described above, as I read the headline on Aug. 6.

I remember that I was uneasy, on that first day and in the days ahead, about the tone in President Harry Truman's voice on the radio as he exulted over our success in the race for the Bomb and its effectiveness against Japan. I generally admired Truman, then and later, but in hearing his announcements I was put off by the lack of concern in his voice, the absence of a sense of tragedy, of desperation or fear for the future. It seemed to me that this was a decision best made in anguish; and both Truman's manner and the tone of the official communiqués made it unmistakably clear that this hadn't been the case.

Which meant for me that our leaders didn't have the picture, didn't grasp the significance of the precedent they had set and the sinister implications for the future. And that evident unawareness was itself scary. I believed that something ominous had happened; that it was bad for humanity that the Bomb was feasible, and that its use would have bad long-term consequences, whether or not those negatives were balanced or even outweighed by short-run benefits.

Looking back, it seems clear to me that my reactions then were right.

Moreover, reflecting on two related themes that have run through my life since then—intense abhorrence of nuclear weapons, and more generally of killing women and children—I've come to suspect that I've conflated in my emotional memory two events less than a year apart: Hiroshima and a catastrophe that visited my own family 11 months later.

On the Fourth of July, 1946, driving on a hot afternoon on a flat, straight road through the cornfields of Iowa—on the way from Detroit to visit our relatives in Denver—my father fell asleep at the wheel and went off the road long enough to hit a sidewall over a culvert that sheared off the right side of the car, killing my mother and sister.

My father's nose was broken and his forehead was cut. When a highway patrol car came by, he was wandering by the wreckage, bleeding and dazed. I was inside, in a coma from a concussion, with a large gash on the left side of my forehead. I had been sitting on the floor next to the back seat, on a suitcase covered with a blanket, with my head just behind the driver's seat. When the car hit the wall, my head was thrown against a metal fixture on the back of the driver's seat, knocking me out and opening up a large triangular flap of flesh on my forehead. I was in a coma for 36 hours. My legs had been stretched out in front of me across the car and my right leg was broken just above the knee.

My father had been a highway engineer in Nebraska. He said that highway walls should never have been flush with the road like that, and later laws tended to ban that placement. This one took off the side of the car where my mother and sister were sitting, my sister looking forward and my mother facing left with her back to the side of the car. My brother, who came to the scene from Detroit, said later that when he saw what was left of the car in a junkyard, the right side looked like steel wool. It was amazing that anyone had survived.

My understanding of how that event came about—it wasn't entirely an accident, as I heard from my father, that he had kept driving when he was exhausted—and how it affected my life is a story for another time. But looking back now, at what I drew from reading the Pentagon Papers later and on my citizen's activism since then, I think I saw in the events of August 1945 and July 1946, unconsciously, a common message. I loved my father, and I respected Truman. But you couldn't rely entirely on a trusted authority—no matter how well-intentioned he was, however much you admired him—to protect you, and your family, from disaster. You couldn't safely leave events entirely to the care of authorities. Some vigilance was called for, to awaken them if need be or warn others. They could be asleep at the wheel, heading for a wall or a cliff. I saw that later in Lyndon Johnson and in his successor, and I've seen it since.

But I sensed almost right away, in August 1945 as Hiroshima and Nagasaki were incinerated, that such feelings—about our president, and our Bomb—separated me from nearly everyone around me, from my parents and friends and from most other Americans. They were not to be mentioned. They could only sound unpatriotic. And in World War II, that was about the last way one wanted to sound. These were thoughts to be kept to myself.

Unlikely thoughts for a 14-year-old American boy to have had the week the war ended? Yes, if he hadn't been in Mr. Patterson's social studies class the previous fall. Every member of that class must have had the same flash of recognition of the Bomb, as they read the August headlines during our summer vacation. Beyond that, I don't know whether they responded as I did, in the terms of our earlier discussion.

But neither our conclusions then nor reactions like mine on Aug. 6 stamped us as gifted prophets. Before that day perhaps no one in the public outside our class—no one else outside the Manhattan Project (and very few inside it)—had spent a week, as we had, or even a day thinking about the impact of such a weapon on the long-run prospects for humanity.

And we were set apart from our fellow Americans in another important way. Perhaps no others outside the project or our class ever had occasion to think about the Bomb without the strongly biasing positive associations that accompanied their first awareness in August 1945 of its very possibility: that it was "our" weapon, an instrument of American democracy developed to deter a Nazi Bomb, pursued by two presidents, a war-winning weapon and a necessary one—so it was claimed and almost universally believed—to end the war without a costly invasion of Japan.

Unlike nearly all the others who started thinking about the new nuclear era after Aug. 6, our attitudes of the previous fall had not been shaped, or warped, by the claim and appearance that such a weapon had just won a war for the forces of justice, a feat that supposedly would otherwise have cost a million American lives (and as many or more Japanese).

For nearly all other Americans, whatever dread they may have felt about the long-run future of the Bomb (and there was more expression of this in elite media than most people remembered later) was offset at the time and ever afterward by a powerful aura of its legitimacy, and its almost miraculous potential for good which had already been realized. For a great many Americans still, the Hiroshima and Nagasaki bombs are regarded above all with gratitude, for having saved their own lives or the lives of their husbands, brothers, fathers or grandfathers, which would otherwise have been at risk in the invasion of Japan. For these Americans and many others, the Bomb was not so much an instrument of massacre as a kind of savior, a protector of precious lives.

Most Americans ever since have seen the destruction of the populations of Hiroshima and Nagasaki as necessary and effective—as constituting just means, in effect just terrorism, under the supposed circumstances—thus legitimating, in their eyes, the second and third largest single-day massacres in history. (The largest, also by the U.S. Army Air Corps, was the firebombing of Tokyo five months before on the night of March 9, which burned alive or suffocated 80,000 to 120,000 civilians. Most of the very few Americans who are aware of this event at all accept it, too, as appropriate in wartime.)

To regard those acts as definitely other than criminal and immoral—as most Americans do—is to believe that anything—anything—can be legitimate means: at worst, a necessary, lesser, evil. At least, if done by Americans, on the order of a president, during wartime. Indeed, we are the only country in the world that believes it won a war by bombing—specifically by bombing cities with weapons of mass destruction—and believes that it was fully rightful in doing so. It is a dangerous state of mind.

Even if the premises of these justifications had been realistic (after years of study I'm convinced, along with many scholars, that they were not; but I'm not addressing that here), the consequences of such beliefs for subsequent policymaking were bound to be fateful. They underlie the American government and public's ready acceptance ever since of basing our security on readiness to carry out threats of mass annihilation by nuclear weapons, and the belief by many officials and elites still today that abolition of these weapons is not only infeasible but undesirable.

By contrast, given a few days' reflection in the summer of 1945 before a presidential fait accompli was framed in that fashion, you didn't have to be a moral prodigy to arrive at the sense of foreboding we all had in Mr. Patterson's class. It was as easily available to 13-year-old ninth-graders as it was to many Manhattan Project scientists, who also had the opportunity to form their judgments before the Bomb was used.

But the scientists knew something else that was unknown to the public and even to most high-level decision-makers. They knew that the atomic bombs, the uranium and plutonium fission bombs they were preparing, were only the precursors to far more powerful explosives, almost surely including a thermonuclear fusion bomb, later called the hydrogen bomb, or H-bomb. That weapon—of which we eventually came to have tens of thousands—could have an explosive yield much greater than the fission bombs needed to trigger it. A thousand times greater.

Moreover, most of the scientists who focused on the long-run implications of nuclear weapons, belatedly, after the surrender of Germany in May 1945, believed that using the Bomb against Japan would make international control of the weapon very unlikely. In turn that would make inevitable a desperate arms race, which would soon expose the United States to adversaries' uncontrolled possession of thermonuclear weapons, so that, as the scientists said in a pre-attack petition to the president, "the cities of the United States as well as the cities of other nations will be in continuous danger of sudden annihilation." (In this they were proved correct.) They cautioned the president, on both moral grounds and considerations of the long-run survival of civilization, against beginning this process by using the Bomb against Japan even if its use might shorten the war.

But their petition was sent "through channels" and was deliberately held back by Gen. Leslie Groves, director of the Manhattan Project. It never got to the president, or even to Secretary of War Henry Stimson until after the Bomb had been dropped. There is no record that the scientists' concerns about the future and their judgment of a nuclear attack's impact on it were ever made known to President Truman before or after his decisions. Still less, made known to the American public.

At the end of the war the scientists' petition and their reasoning were reclassified secret to keep them from public knowledge, and the petition's existence was unknown for more than a decade. Several Manhattan Project scientists later expressed regret that they had earlier deferred to the demands of the secrecy managers—for fear of losing their clearances and positions, and perhaps facing prosecution—and had collaborated in maintaining public ignorance on this most vital of issues.

One of them—Eugene Rabinowitch, who after the war founded and edited the Bulletin of the Atomic Scientists (with its Doomsday Clock)—had in fact, after the German surrender in May, actively considered breaking ranks and alerting the American public to the existence of the Bomb, the plans for using it against Japan, and the scientists' views both of the moral issues and the long-term dangers of doing so.

He first reported this in a letter to The New York Times published on June 28, 1971. It was the day I submitted to arrest at the federal courthouse in Boston; for 13 days previous, my wife and I had been underground, eluding the FBI while distributing the Pentagon Papers to 17 newspapers after injunctions had halted publication in the Times and The Washington Post. The Rabinowitch letter began by saying it was "the revelation by The Times of the Pentagon history of U.S. intervention in Vietnam, despite its classification as ‘secret' " that led him now to reveal:

"Before the atom bomb-drops on Hiroshima and Nagasaki, I had spent sleepless nights thinking that I should reveal to the American people, perhaps through a reputable news organ, the fateful act—the first introduction of atomic weapons—which the U.S. Government planned to carry out without consultation with its people. Twenty-five years later, I feel I would have been right if I had done so."

I didn't see this the morning it was published, because I was getting myself arrested and arraigned, for doing what Rabinowitch wishes he had done in 1945, and I wish I had done in 1964. I first came across this extraordinary confession by a would-be whistle-blower (I don't know another like it) in "Hiroshima in America: Fifty Years of Denial" by Robert Jay Lifton and Greg Mitchell (New York, 1995, p. 249).

Rereading Rabinowitch's statement, still with some astonishment, I agree with him. He was right to consider it, and he would have been right if he had done it. He would have faced prosecution and prison then (as I did at the time his letter was published), but he would have been more than justified, as a citizen and as a human being, in informing the American public and burdening them with shared responsibility for the fateful decision.

Some of the same scientists faced a comparable challenge four years after Hiroshima, addressing the possible development of an even more terrible weapon, more fraught with possible danger to human survival: the hydrogen bomb. This time some who had urged use of the atom bomb against Japan (dissenting from the petitioners above) recommended against even development and testing of the new proposal, in view of its "extreme dangers to mankind." "Let it be clearly realized," they said, "that this is a super weapon; it is in a totally different category from an atomic bomb" (Herbert York, "The Advisors" [California, 1976], p. 156).

Once more, as I learned much later, knowledge of the secret possibility was not completely limited to government scientists. A few others—my father, it turns out, was one—knew of this prospect before it had received the stamp of presidential approval and had become an American government project. And once again, under those conditions of prior knowledge (denied as before to the public), to grasp the moral and long-run dangers you didn't have to be a nuclear physicist. My father was not.

Some background is needed here. My father, Harry Ellsberg, was a structural engineer. He worked for Albert Kahn in Detroit, the "Arsenal of Democracy." At the start of the Second World War, he was the chief structural engineer in charge of designing the Ford Willow Run plant, a factory to make B-24 Liberator bombers for the Air Corps. (On June 1 this year, GM, now owner, announced it would close the plant as part of its bankruptcy proceedings.)

Dad was proud of the fact that it was the world's largest industrial building under one roof. It put together bombers the way Ford produced cars, on an assembly line. The assembly line was a mile and a quarter long.

My father told me that it had ended up L-shaped, instead of in a straight line as he had originally designed it. When the site was being prepared, Ford comptrollers noted that the factory would run over a county line, into an adjacent county where the company had less control and local taxes were higher. So the design, for the assembly line and the factory housing it, had to be bent at right angles to stay inside Ford country.

Once, my father took me out to Willow Run to see the line in operation. For as far as I could see, the huge metal bodies of planes were moving along tracks as workers riveted and installed parts. It was like pictures I had seen of steer carcasses in a Chicago slaughterhouse. But as Dad had explained to me, three-quarters of a mile along, the bodies were moved off the tracks onto a circular turntable that rotated them 90 degrees; then they were moved back on track for the last half mile of the L. Finally, the planes were rolled out the hangar doors at the end of the factory—one every hour: It took 59 minutes on the line to build a plane with its 100,000 parts from start to finish—filled with gas and flown out to war.

It was an exciting sight for a 13-year-old. I was proud of my father. His next wartime job had been to design a still larger airplane engine factory—again the world's largest plant under one roof—the Dodge Chicago plant, which made all the engines for B-29s.

When the war ended, Dad accepted an offer to oversee the buildup of the plutonium production facilities at Hanford, Wash. That project was being run by General Electric under contract with the Atomic Energy Commission. To take the job of chief structural engineer on the project, Dad moved from the engineering firm of Albert Kahn, where he had worked for years, to what became Giffels & Rossetti. Later he told me that engineering firm had the largest volume of construction contracts in the world at that time, and his project was the world's largest. I grew up hearing these superlatives.

The Hanford project gave my father his first really good salary. But while I was away as a sophomore at Harvard, he left his job with Giffels & Rossetti, for reasons I never learned at the time. He was out of work for almost a year. Then he went back as chief structural engineer for the whole firm. Almost 30 years later, in 1978, when my father was 89, I happened to ask him why he had left Giffels & Rossetti. His answer startled me.

He said, "Because they wanted me to help build the H-bomb."

This was a breathtaking statement for me to hear in 1978. I was in full-time active opposition to the deployment of the neutron bomb—which was a small H-bomb—that President Jimmy Carter was proposing to send to Europe. The N-bomb had a killing radius from its output of neutrons that was much wider than its radius of destruction by blast. Optimally, an airburst N-bomb would produce little fallout, nor would it destroy structures, equipment or vehicles, but its neutrons would kill the humans either outside or within buildings or tanks. The Soviets mocked it as "a capitalist weapon" that destroyed people but not property; but they tested such a weapon too, as did other countries.

I had opposed developing or testing that concept for almost 20 years, since it was first described to me by my friend and colleague at the RAND Corp., Sam Cohen, who liked to be known as the "father of the neutron bomb." I feared that, as a "small" weapon with limited and seemingly controllable lethal effects, it would be seen as usable in warfare, making U.S. first use and "limited nuclear war" more likely. It would be the match that would set off an exchange of the much larger, dirty weapons which were the bulk of our arsenal and were all that the Soviets then had.

In the year of this conversation with Dad, I was arrested four times blocking the railroad tracks at the Rocky Flats Nuclear Weapons Production Facility, which produced all the plutonium triggers for H-bombs and was going to produce the plutonium cores for neutron bombs. One of these arrests was on Nagasaki Day, Aug. 9. The "triggers" produced at Rocky Flats were, in effect, the nuclear components of A-bombs, plutonium fission bombs of the type that had destroyed Nagasaki on that date in 1945.

Every one of our many thousands of H-bombs, the thermonuclear fusion bombs that arm our strategic forces, requires a Nagasaki-type A-bomb as its detonator. (I doubt that one American in a hundred knows that simple fact, and thus has a clear understanding of the difference between A- and H-bombs, or of the reality of the thermonuclear arsenals of the last 50 years.)

Our popular image of nuclear war—from the familiar pictures of the devastation of Nagasaki and Hiroshima—is grotesquely misleading. Those pictures show us only what happens to humans and buildings when they are hit by what is now just the detonating cap for a modern nuclear weapon.

The plutonium for these weapons came from Hanford and from the Savannah River Site in South Carolina and was machined into weapons components at Rocky Flats, in Colorado. Allen Ginsberg and I, with many others, blockaded the entrances to the plant on Aug. 9, 1978, to interrupt business as usual on the anniversary of the day a plutonium bomb had killed 58,000 humans (about 100,000 had died by the end of 1945).

I had never heard before of any connection of my father with the H-bomb. He wasn't particularly wired in to my anti-nuclear work or to any of my activism since the Vietnam War had ended. I asked him what he meant by his comment about leaving Giffels & Rossetti.

"They wanted me to be in charge of designing a big plant that would be producing material for an H-bomb." He said that Dupont, which had built the Hanford Site, was to have the contract from the Atomic Energy Commission. That would have been for the Savannah River Site. I asked him when this was.

"Late '49."

I told him, "You must have the date wrong. You couldn't have heard about the hydrogen bomb then, it's too early." I'd just been reading about that, in Herb York's recent book, "The Advisors." The General Advisory Committee (GAC) of the AEC—chaired by Robert Oppenheimer and including James Conant, Enrico Fermi and Isidor Rabi—was considering that fall whether or not to launch a crash program for an H-bomb. That was the "super weapon" referred to earlier. They had advised strongly against it, but President Truman overruled them.

"Truman didn't make the decision to go ahead till January 1950. Meanwhile the whole thing was super-secret. You couldn't have heard about it in '49."

My father said, "Well, somebody had to design the plant if they were going to go ahead. I was the logical person. I was in charge of the structural engineering of the whole project at Hanford after the war. I had a Q clearance."

That was the first I'd ever heard that he'd had a Q clearance—an AEC clearance for nuclear weapons design and stockpile data. I'd had that clearance myself in the Pentagon—along with close to a dozen other special clearances above top-secret—after I left the RAND Corp. for the Defense Department in 1964. It was news to me that my father had had a clearance, but it made sense that he would have needed one for Hanford.

I said, "So you're telling me that you would have been one of the only people in the country, outside the GAC, who knew we were considering building the H-bomb in 1949?"

He said, "I suppose so. Anyway, I know it was late '49, because that's when I quit."

"Why did you quit?"

"I didn't want to make an H-bomb. Why, that thing was going to be 1,000 times more powerful than the A-bomb!"

I thought, score one for his memory at 89. He remembered the proportion correctly. That was the same factor Oppenheimer and the others predicted in their report in 1949. They were right. The first explosion of a true H-bomb, five years later, had a thousand times the explosive power of the Hiroshima blast.

At 15 megatons—the equivalent of 15 million tons of high explosive—it was over a million times more powerful than the largest conventional bombs of World War II. That one bomb had almost eight times the explosive force of all the bombs we dropped in that war: more than all the explosions in all the wars in human history. In 1961, the Soviets tested a 58-megaton H-bomb.

My father went on: "I hadn't wanted to work on the A-bomb, either. But then Einstein seemed to think that we needed it, and it made sense to me that we had to have it against the Russians. So I took the job, but I never felt good about it.

"Then when they told me they were going to build a bomb 1,000 times bigger, that was it for me. I went back to my office and I said to my deputy, ‘These guys are crazy. They have an A-bomb, now they want an H-bomb. They're going to go right through the alphabet till they have a Z-bomb.' "

I said, "Well, so far they've only gotten up to N."

He said, "There was another thing about it that I couldn't stand. Building these things generated a lot of radioactive waste. I wasn't responsible for designing the containers for the waste, but I knew they were bound to leak eventually. That stuff was deadly forever. It was radioactive for 24,000 years."

Again he had turned up a good figure. I said, "Your memory is working pretty well. It would be deadly a lot longer than that, but that's about the half-life of plutonium."

There were tears in his eyes. He said huskily, "I couldn't stand the thought that I was working on a project that was poisoning parts of my own country forever, that might make parts of it uninhabitable for thousands of years."

I thought over what he'd said; then I asked him if anyone else working with him had had misgivings. He didn't know.

"Were you the only one who quit?" He said yes. He was leaving the best job he'd ever had, and he didn't have any other to turn to. He lived on savings for a while and did some consulting.

I thought about Oppenheimer and Conant—both of whom had recommended dropping the atomic bomb on Hiroshima—and Fermi and Rabi, who had, that same month Dad was resigning, expressed internally their opposition to development of the superbomb in the most extreme terms possible: It was potentially "a weapon of genocide ... carries much further than the atomic bomb itself the policy of exterminating civilian populations ... whose power of destruction is essentially unlimited ... a threat to the future of the human race which is intolerable ... a danger to humanity as a whole ... necessarily an evil thing considered in any light" (York, "The Advisors," pp. 155-159).

Not one of these men risked his clearance by sharing his anxieties and the basis for them with the American public. Oppenheimer and Conant considered resigning their advisory positions when the president went ahead against their advice. But they were persuaded, by Dean Acheson, not to quit at that time, lest that draw public attention to their expert judgment that the president's course fatally endangered humanity.

I asked my father what had made him feel so strongly, to act in a way that nobody else had done. He said, "You did."

That didn't make any sense. I said, "What do you mean? We didn't discuss this at all. I didn't know anything about it."

Dad said, "It was earlier. I remember you came home with a book one day, and you were crying. It was about Hiroshima. You said, ‘Dad, you've got to read this. It's the worst thing I've ever read.' "

I said that must have been John Hersey's book "Hiroshima." (I read it when it came out as a book. I was in the hospital when it filled The New Yorker in August 1946.) I didn't remember giving it to him.

"Yes. Well, I read it, and you were right. That's when I started to feel bad about working on an atomic bomb project. And then when they said they wanted me to work on a hydrogen bomb, it was too much for me. I thought it was time for me to get out."

I asked if he had told his bosses why he was quitting. He said he told some people, not others. The ones he told seemed to understand his feelings. In fact, in less than a year, the head of the firm called to say that they wanted him to come back as chief structural engineer for the whole firm. They were dropping the Dupont contract (they didn't say why), so he wouldn't have to have anything to do with the AEC or bomb-making. He stayed with them till he retired.

I said, finally, "Dad, how could I not ever have heard any of this before? How come you never said anything about it?"

My father said, "Oh, I couldn't tell any of this to my family. You weren't cleared."

Well, I finally got my clearances, a decade after my father gave his up. And for some years, they were my undoing, though they turned out to be useful in the end. A decade later they allowed me to read the Pentagon Papers and to keep them in my "Top Secret" safe at the RAND Corp., from which I eventually delivered them to the Senate Foreign Relations Committee and later to 19 newspapers.

We have long needed and lacked the equivalent of the Pentagon Papers on the subject of nuclear policies and preparations, nuclear threats and decision-making: above all in the United States and Russia but also in the other nuclear-weapons states. I deeply regret that I did not make known to Congress, the American public and the world the extensive documentation of persistent and still-unknown nuclear dangers that was available to me 40 to 50 years ago as a consultant to and official in the executive branch working on nuclear war plans, command and control and nuclear crises. Those in nuclear-weapons states who are in a position now to do more than I did then to alert their countries and the world to fatally reckless secret policies should take warning from the earlier inaction of myself and others, and do better.

That I had high-level access and played such a role in nuclear planning is, of course, deeply ironic in view of the personal history recounted above. My feelings of revulsion and foreboding about nuclear weapons had not changed an iota since 1945, and they have never left me. Since I was 14, the overriding objective of my life has been to prevent the occurrence of nuclear war.

There was a close analogy with the Manhattan Project. Its scientists—most of whom hoped the Bomb would never be used for anything but as a threat to deter Germany—were driven by a plausible but mistaken fear that the Nazis were racing them. Actually the Nazis had rejected the pursuit of the atomic bomb on practical grounds in June 1942, just as the Manhattan Project was beginning. Similarly, I was one of many in the late '50s who were misled and recruited into the nuclear arms race by exaggerated, and in this case deliberately manipulated, fears of Soviet intentions and crash efforts.

Precisely because I did receive clearances and was exposed to top-secret intelligence estimates, in particular from the Air Force, I, along with my colleagues at the RAND Corp., came to be preoccupied with the urgency of averting nuclear war by deterring a Soviet surprise attack that would exploit an alleged "missile gap." That supposed dangerous U.S. inferiority was exactly as unfounded in reality as the fear of the Nazi crash bomb program had been, or, to pick a more recent example, as concern over Saddam Hussein's supposed WMDs and nuclear pursuit in 2003.

Working conscientiously, obsessively, on a wrong problem, countering an illusory threat, I and my colleagues distracted ourselves and helped distract others from dealing with real dangers posed by the mutual and spreading possession of nuclear weapons—dangers which we were helping make worse—and from real opportunities to make the world more secure. Unintentionally, yet inexcusably, we made our country and the world less safe.

Eventually the Soviets did emulate us in creating a world-threatening nuclear capability on hair-trigger alert. That still exists; Russian nuclear posture and policies continue, along with ours, to endanger our countries, civilization and much of life itself. But the persistent reality has been that the nuclear arms race has been driven primarily by American initiatives and policies and that every major American decision in this 64-year-old nuclear era has been accompanied by unwarranted concealment, deliberate obfuscation, and official and public delusions.

I have believed for a long time that official secrecy and deceptions about our nuclear weapons posture and policies and their possible consequences have threatened the survival of the human species. To understand the urgency of radical changes in our nuclear policies that may truly move the world toward abolition of nuclear weapons, we need a new understanding of the real history of the nuclear age.

Using the new opportunities offered by the Internet—drawing attention to newly declassified documents and to some realities still concealed—I plan over the next year, before the 65th anniversary of Hiroshima, to do my part in unveiling this hidden history.



5. Hiroshima, Nagasaki Atom Bombs Were Right Decision

According To Majority Of Americans: Poll

By John Christoffersen



A majority of Americans surveyed believe dropping atomic bombs on Japan during World War II was the right thing to do, but support was weaker among Democrats, women, younger voters and minority voters, according to a Quinnipiac University poll.

The poll, released Tuesday, found 61 percent of the more than 2,400 American voters questioned believe the U.S. did the right thing. Twenty-two percent called it wrong and 16 percent were undecided.

The first bomb was dropped Aug. 6, 1945, on Hiroshima. An estimated 140,000 people were killed instantly or died within a few months. Tens of thousands more died from radiation poisoning in the years following.

Three days later, another bomb was dropped on Nagasaki, killing about 80,000 people. Japan surrendered less than a week later.

"Sixty-four years after the dawn of the atomic age, one in five Americans think President Harry Truman made a mistake dropping the bomb," said Peter A. Brown, assistant director of the Quinnipiac University Polling Institute.

The poll asked a single question: "Do you think the United States did the right thing or the wrong thing by dropping the atomic bomb on Hiroshima and Nagasaki?"

Among voters over 55 years of age, 73 percent of those surveyed approved the decision while 13 percent opposed. Sixty percent of voters 35 to 54 approved, while 50 percent approved among voters 18 to 34 years old, according to the poll.

"Voters who remember the horrors of World War II overwhelmingly support Truman's decision," Brown said. "Support drops with age, from the generation that grew up with the nuclear fear of the Cold War to the youngest voters, who know less about WW II or the Cold War."

Only 34 percent of black voters and 44 percent of Hispanic voters approved the decision, according to the poll. But Brown cautioned that the polling sample was smaller for those groups, so officials said the margin of error was 8 percentage points for blacks and 10 percentage points for Hispanics.

Support for Truman's decision was much stronger among Republicans than Democrats and among men than women.

Among Democrats surveyed, 49 percent approved, while 74 percent of Republicans supported Truman's decision.

Among women questioned, 51 percent supported the bombing, compared to 72 percent of men surveyed.

The poll showed about 70 percent of white Protestants, Catholics and evangelical Christians support the bombing, while 58 percent of Jews approved. The margin of error was 12 percentage points for Jewish voters, officials said.

Quinnipiac surveyed 2,409 registered voters from July 27 to Aug. 3. The poll has a margin of error of 2 percentage points.