Ars Technica: Why trust climate models?

2013 September 8

Why trust climate models? It’s a matter of simple science
by Scott K. Johnson with comments from Weaver, Easterbrook, Otto-Bliesner, Schmidt, Del Genio, and Alley.

This is just awesome with awesome sauce on top.

How refreshing to read a lay science piece that’s about the science and not the controversy.

Contemplating Cultural Boundaries

2013 July 18

The Mesh of Civilizations and International Email Flows

PDF: 1303.0045v1.pdf

Abstract: In The Clash of Civilizations, Samuel Huntington argued that the primary axis of global conflict was no longer ideological or economic but cultural and religious, and that this division would characterize the “battle lines of the future.” In contrast to the “top down” approach in previous research focused on the relations among nation states, we focused on the flows of interpersonal communication as a bottom-up view of international alignments. To that end, we mapped the locations of the world’s countries in global email networks to see if we could detect cultural fault lines. Using IP-geolocation on a worldwide anonymized dataset obtained from a large Internet company, we constructed a global email network. In computing email flows we employ a novel rescaling procedure to account for differences due to uneven adoption of a particular Internet service across the world. Our analysis shows that email flows are consistent with Huntington’s thesis. In addition to location in Huntington’s “civilizations,” our results also attest to the importance of both cultural and economic factors in the patterning of inter-country communication ties.
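The construction step described in the abstract, turning individual email records into weighted country-to-country ties, can be sketched as follows. The records and the normalization (dividing each tie's count by the product of the two countries' total email volumes) are invented stand-ins for illustration; this is NOT the paper's actual rescaling procedure.

```python
from collections import Counter

# Toy (sender_country, receiver_country) records; the real study used an
# anonymized worldwide dataset with IP-geolocation.
emails = [("US", "MX"), ("US", "MX"), ("DE", "FR"), ("FR", "DE"),
          ("US", "DE"), ("MX", "US"), ("DE", "FR")]

# Undirected tie counts between distinct countries
flows = Counter(frozenset(pair) for pair in emails if pair[0] != pair[1])
# Per-country email volume (sent + received)
volume = Counter(c for pair in emails for c in pair)

# Normalize each tie by the product of the two countries' volumes --
# a generic adjustment for uneven service adoption, not the paper's method
ties = {tuple(sorted(k)): n / (volume[min(k)] * volume[max(k)])
        for k, n in flows.items()}
for pair, w in sorted(ties.items()):
    print(pair, round(w, 3))
```

Clustering such a weighted graph is what lets one test whether the strong ties fall along Huntington's "civilization" boundaries.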


Changing Mass Priorities: The Link between Modernization and Democracy

PDF: inglehart_welzel(2010).pdf

(modified from original in cited paper)


Huntington: The Clash of Civilizations


I find it interesting that Huntington’s cultural boundaries are to some degree quantifiable.


See also Culturomics 2.0:

Westcott and Jewsion: Weather Effects on Expected Corn and Soybean Yields

2013 June 15

Corn Yield Model
A model for national corn yields was estimated over the past 25 years (1988-2012), thereby including both the 1988 and 2012 droughts. In addition to a trend variable, the model uses as explanatory variables mid-May planting progress, July weather (precipitation and average temperature), and a June precipitation shortfall measure in selected years. Including those variables helps explain previous yield variations and deviations from trend.

Corn plantings by mid-May are important for yield potential because that allows more of the critical stages of crop development, particularly pollination, to occur earlier, before the most severe heat of the summer. Earlier pollination is also generally associated with less plant stress from moisture shortages. Most of the corn crop develops in July, so weather in that month is included in the model, including variables for both precipitation and temperature.

Finally, while weather in June is important for development of the corn crop (and June typically has lower temperatures and more rain than July), effects of June weather are typically small relative to July weather effects. However, extreme weather deviations from normal in June can have larger impacts, as seen in 2012 and in 1988. To represent that effect, the model uses a measure of the precipitation shortfall from average in years when June precipitation is in the lowest 10 percent tail of its statistical distribution. The mid-May planting progress variable is based on weekly data from USDA’s National Agricultural Statistics Service and is prorated to May 15 from adjacent weeks’ results for years that the statistic was not reported for that specific date. The weather data is from the National Oceanic and Atmospheric Administration.

The planting progress and weather data used is for eight key corn-producing States (Iowa, Illinois, Indiana, Ohio, Missouri, Minnesota, South Dakota, and Nebraska). Those eight States typically rank in the top 10 corn-producing States and accounted for an average of 76 percent of U.S. corn production over the estimation period. An aggregate measure for the eight States for each of those variables is constructed using harvested corn acres to weight State-specific observations.

The effects of mid-May planting progress and July temperatures on corn yield are each linear in the model—for those variables, each unit of change has a constant effect on yield. Similarly, the June precipitation shortfall variable is linear for the years it is nonzero. However, the effect of July precipitation is nonlinear in the model to reflect the asymmetric response of corn yields to different amounts of precipitation above and below its average. That is, reductions in corn yields when rainfall is below average are larger than gains in corn yields when rainfall is above average. The model uses a squared term for July precipitation to represent that asymmetric effect. The estimated regression equation (table 6) explains over 96 percent of the variation in national corn yields during the estimation period (more than 91 percent of the variation around the equation’s trend).
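The functional form described above (linear trend, planting progress, and July temperature; a squared July-precipitation term for the asymmetric rainfall response; a June shortfall variable nonzero only in the driest years) can be sketched as an ordinary least-squares fit. The data and coefficients below are synthetic and purely illustrative; they are not the USDA estimates in table 6.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 25  # 1988-2012 estimation period

# Synthetic explanatory variables (invented, not USDA data)
trend = np.arange(n, dtype=float)            # year index
plant = rng.uniform(30, 90, n)               # % of crop planted by mid-May
july_t = rng.normal(75, 3, n)                # July average temperature (F)
july_p = rng.normal(4.0, 1.2, n)             # July precipitation (inches)
june_short = np.where(rng.random(n) < 0.1,   # shortfall, nonzero only in
                      rng.uniform(1, 3, n), 0.0)  # the driest ~10% of years

# Generate yields with the model's functional form: linear terms plus a
# squared July-precipitation term for the asymmetric rainfall response
yld = (100 + 1.8 * trend + 0.10 * plant - 1.5 * (july_t - 75)
       + 6.0 * july_p - 0.6 * july_p ** 2 - 4.0 * june_short
       + rng.normal(0, 2, n))

# Design matrix mirroring the regression equation
X = np.column_stack([np.ones(n), trend, plant, july_t,
                     july_p, july_p ** 2, june_short])
beta, *_ = np.linalg.lstsq(X, yld, rcond=None)

fitted = X @ beta
r2 = 1 - np.sum((yld - fitted) ** 2) / np.sum((yld - yld.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```

With the squared precipitation term, yield peaks at some rainfall level and falls off faster below average than it gains above it, which is exactly the asymmetry the authors describe.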

Weather Effects on Expected Corn and Soybean Yields
Paul C. Westcott, USDA, Economic Research Service
Michael Jewison, USDA, World Agricultural Outlook Board

The model assumes a linear trend for corn yields with weather-forced variation.

Note that as of June 12, 2013: U.S. corn production for 2013/14 was estimated 135 million bushels lower, at 14.0 billion bushels. Corn yields for the upcoming year were projected at 156.5 bushels per acre, a 1.5 bushel decrease from May’s estimate. The decrease in yields is due to delays in planting in some of the highest-producing corn states.

The baseline trend projection for 2013 is 163.6 bushels per acre, if this information from May is accurate: The 2013/14 corn yield is projected at 158.0 bushels per acre, 5.6 bushels below the weather-adjusted trend presented at USDA’s Agricultural Outlook Forum in February. [Edit: Confirmed.]

But even the new, lower projection is well above the 2011 and 2012 yields.

Ansatz: … but first of all, let us begin with the inspiration …

2013 June 11

Introduction: The study of physics requires both scientific observation and philosophy. The tenets of science and its axioms of operation are not themselves scientific statements, but philosophical statements. The profound philosophical insight precipitating the birth of physics was that scientific observations and philosophical constructs, such as logic and reasoning, could be married together in a way that allowed one to make predictions of observations (in science) based on theorems and proofs (in philosophy). This natural philosophy requires a philosophical ‘leap’, in which one makes an assumption or guess about what abstract framework applies most correctly. Such a leap, called an Ansatz, is usually arrived at through inspiration and an integrated usage of faculties of the mind, rather than a programmatic application of certain axioms. Nevertheless, a programmatic approach allows enumeration of the details of a mathematical system. It seems prudent to apply a programmatic approach to the notion of Ansatz itself and to clarify its process metaphysically, in order to gain a deeper understanding of how it is used in practice in science; but first of all, let us begin with the inspiration.
A more general treatment of the philosophy of physics and the existence of universes (Hall, 2013)

In physics and mathematics, an ansatz (initial placement of a tool at a work piece) is an educated guess[1] that is verified later by its results.

An ansatz is the establishment of the starting equation(s), the theorem(s), or the value(s) describing a mathematical or physical problem or solution. It can take into consideration boundary conditions. After an ansatz has been established (constituting nothing more than an assumption), the equations are solved for the general function of interest (constituting a confirmation of the assumption).

An ansatz is an assumed form for a mathematical statement that is not based on any underlying theory or principle.

An example from physics is the Bethe Ansatz (Müller).
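A minimal textbook illustration of the idea (a generic exponential ansatz, not the Bethe Ansatz itself): guess a functional form for a differential equation, substitute to fix the free parameter, and then verify the guess against the equation, here numerically.

```python
import math

# Ansatz: for the ODE y'' = y, guess y(x) = exp(r*x).
# Substituting gives r**2 * exp(r*x) = exp(r*x), hence r**2 = 1, r = +/-1.
def y(x, r):
    return math.exp(r * x)

def second_derivative(f, x, h=1e-4):
    # Central finite difference approximation of f''(x)
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

# Verify the ansatz numerically at a few points for both roots
for r in (1.0, -1.0):
    for x in (0.0, 0.5, 1.0):
        assert abs(second_derivative(lambda t: y(t, r), x) - y(x, r)) < 1e-5
print("ansatz verified: y = exp(+/-x) solves y'' = y")
```

The pattern is exactly the one the quoted definitions describe: the ansatz itself is nothing more than an assumption, and solving (or checking) the equations constitutes its confirmation.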

Adamatzky: Slime Mould Tactile Sensor

2013 June 6

Figure 7. Physarum’s morphological responses towards mechanical contact. (a) A protoplasmic tube is distorted by a glass capillary placed across the tube. (b) A zone of extensive growth of Physarum under and at the edges of the glass capillary. (c) Two segments of glass capillary placed on top of Physarum, on agar blob, are partly colonised by the slime mould. (d) Physarum colonises plastic disc placed on top of Physarum sheet wrapping agar blob, view from below.

Slime mould P. polycephalum is a single cell visible to the unaided eye. The cell shows a wide spectrum of intelligent behaviour. By interpreting the behaviour in terms of computation, one can make a slime mould based computing device. Physarum computers are capable of solving a range of tasks in computational geometry, optimisation and logic. Physarum computers designed so far lack localised inputs. Commonly used inputs (illumination, chemo-attractants and repellents) usually act on extended domains of the slime mould’s body. Aiming to design massively parallel tactile inputs for slime mould computers, we analyse the temporal dynamics of P. polycephalum’s electrical response to tactile stimulation. In experimental laboratory studies we discover how Physarum responds to the application and removal of a local mechanical pressure with electrical potential impulses and changes in its electrical potential oscillation patterns.

Andrew Adamatzky

In a series of previous works, see overview in [2], we developed a concept and fabricated experimental laboratory prototypes of amorphous bio-computing devices: Physarum machines. A Physarum machine is a programmable amorphous biological computing device experimentally implemented in plasmodium of P. polycephalum. Physarum polycephalum belongs to the species of order Physarales, subclass Myxogastromycetidae, class Myxomycetes, division Myxostelida. It is commonly known as a true, acellular or multi-headed slime mould. Plasmodium is a ‘vegetative’ phase, a single cell with a myriad of diploid nuclei. The plasmodium is visible to the unaided eye. The plasmodium looks like an amorphous yellowish mass with networks of protoplasmic tubes. The plasmodium behaves and moves as a giant amoeba. It feeds on bacteria, spores and other microbial creatures and micro-particles [30]. The plasmodium’s foraging behaviour can be interpreted as a computation: data are represented by spatial distribution of attractants and repellents, and results are represented by a structure of Physarum’s protoplasmic network. In such a specification, a plasmodium can solve computational problems with natural parallelism, including optimisation on graphs, computational geometry, logic and robot control, see details in

A Physarum machine is programmed by configurations of repelling and attracting gradients: chemical substances, temperature and illumination. These quantities are often difficult to localise, which makes a precise, fine-grained input of spatial data into Physarum machines problematic. A tactile input of information could be a solution. Thus, in the present work, we evaluate the feasibility of Physarum acting as a transducer: transforming a tactile stimulation, or a mechanical pressure, into a distinctive pattern of electrical activity. We study how parameters of the oscillations change in response to the application and removal of solid, lightweight insulators to Physarum’s protoplasmic tubes or sheet-shaped parts.
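The kind of measurement described, detecting a change in oscillation parameters when a stimulus is applied, can be sketched with a synthetic potential trace. The signal shape, amplitudes, and the crude before/after statistic are all invented for illustration, not Adamatzky's data or analysis.

```python
import math
import statistics

# Synthetic membrane-potential trace: baseline oscillation, then a stimulus
# at t = 50 that changes both amplitude and frequency. All numbers invented.
signal = [1.0 * math.sin(0.5 * t) for t in range(50)] \
       + [2.5 * math.sin(0.8 * t) for t in range(50, 100)]

# A crude "pattern change" statistic: oscillation amplitude (std deviation)
# in windows before and after the stimulus
before = statistics.stdev(signal[:50])
after = statistics.stdev(signal[50:])
print(f"amplitude before: {before:.2f}, after: {after:.2f}")
```

A real analysis would of course work on recorded electrode data and track frequency as well as amplitude, but the windowed comparison is the basic shape of it.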

Plasmodium of Physarum polycephalum was cultivated in plastic lunch boxes (with few holes punched in their lids for ventilation) on wet kitchen towels and fed with oat flakes.

Physarum Machines (YouTube)

Why My Slime Mold is Better than Your Hadoop Cluster

How brainless slime molds redefine intelligence (Nature)

Lu: Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change

2013 May 31

Abstract This study is focused on the effects of cosmic rays (solar activity) and halogenated molecules (mainly chlorofluorocarbons-CFCs) on atmospheric O3 depletion and global climate change. Brief reviews are first given on the cosmic-ray-driven electron-induced-reaction (CRE) theory for O3 depletion and the warming theory of CFCs for climate change. Then natural and anthropogenic contributions are examined in detail and separated well through in-depth statistical analyses of comprehensive measured datasets. For O3 loss, new statistical analyses of the CRE equation with observed data of total O3 and stratospheric temperature give high linear correlation coefficients >=0.92. After removal of the CR effect, a pronounced recovery by 20~25% of the Antarctic O3 hole is found, while no recovery of O3 loss in mid-latitudes has been observed. These results show both the dominance of the CRE mechanism and the success of the Montreal Protocol. For global climate change, in-depth analyses of observed data clearly show that the solar effect and human-made halogenated gases played the dominant role in Earth climate change prior to and after 1970, respectively. Remarkably, a statistical analysis gives a nearly zero correlation coefficient (R=-0.05) between global surface temperature and CO2 concentration in 1850-1970. In contrast, a nearly perfect linear correlation with R=0.96-0.97 is found between global surface temperature and total amount of stratospheric halogenated gases in 1970-2012. Further, a new theoretical calculation on the greenhouse effect of halogenated gases shows that they (mainly CFCs) could alone lead to the global surface temperature rise of ~0.6 deg C in 1970-2002. These results provide solid evidence that recent global warming was indeed caused by anthropogenic halogenated gases. Thus, a slow reversal of global temperature to the 1950 value is predicted for coming 5~7 decades.
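The headline numbers in the abstract (R = -0.05, R = 0.96-0.97) are Pearson correlation coefficients. A minimal sketch of the statistic, on invented toy series, also illustrates a standard caution: two series that share an upward trend correlate strongly whether or not one causes the other.

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy series: a "temperature" with a small linear trend plus noise, and a
# perfectly linear "gas concentration". All numbers are invented.
noise = [0.02, -0.01, 0.03, 0.0, 0.01, -0.02, 0.02, 0.0, 0.01, 0.03]
temp = [0.01 * i + e for i, e in enumerate(noise)]
gas = [5.0 + 0.4 * i for i in range(10)]

r = pearson_r(temp, gas)
print(f"r = {r:.2f}")  # high, because both series trend upward together
```

That correlation alone cannot establish causation is one of the objections raised against this paper's argument.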

Cosmic-Ray-Driven Reaction and Greenhouse Effect of Halogenated Molecules: Culprits for Atmospheric Ozone Depletion and Global Climate Change
Qing-Bin Lu
Comments: 24 pages, 12 figures; an updated version
Subjects: Atmospheric and Oceanic Physics (; Atomic and Molecular Clusters (physics.atm-clus); Chemical Physics (physics.chem-ph)
Journal reference: Int. J. Mod. Phys. B Vol. 27 (2013) 1350073 (38 pages)
DOI: 10.1142/S0217979213500732
Cite as: arXiv:1210.6844 []
(or arXiv:1210.6844v2 [] for this version)

See also: Lu: from ‘interesting but incorrect’ to just wrong (Real Climate)

Dear Willard: Who needs words when one has letters and operators?

2013 May 25

Peer-Reviewed Survey Finds Majority Of Scientists Skeptical Of Global Warming Crisis (Feb 2013)

Regarding ad hominem (circumstantial)
Let X be AGW

1. Person A makes claim ~X.
2. Person B asserts that A makes claim ~X because it is in A’s interest to claim ~X.
3. Therefore claim ~X is false.

If Person B’s assertion is that ~X is false simply because the persons surveyed are petroleum engineers, I agree that argument is fallacious.

But there is a deeper problem. The author obscured the actual scope of the survey, so we aren’t even in agreement about the identity of “Person A”. And the identity of “Person A” has great relevance to the claim, since the whole op-ed is an argument from authority. In the beginning: “these skeptical scientists may indeed form a scientific consensus.” Again at the end: “Now that we have access to hard surveys of scientists themselves”. Not to mention the title of the piece: “Peer-Reviewed Survey Finds Majority Of Scientists Skeptical Of Global Warming Crisis”.

How does the Forbes op-ed construct this consensus of scientists?
Through a fallacy of composition

1. A ‘consensus’ of A makes the claim ~X OR ~Y
2. All A are members of B
3. All B are members of C
4. Therefore a ‘consensus’ of C makes the claim ~X OR ~Y

Where …
A is petroleum engineers from Alberta
B is geoscientists
C is scientists
X is ‘AGW’
Y is ‘crisis’
(note how Taylor mixes skepticism of causes (X) and consequences (Y) to construct his ‘majority’ and ‘consensus’)

The composition fallacy is more apparent when the actual group surveyed is revealed, which is why it wasn’t, and why noting the population surveyed isn’t fallacious. The source of the survey doesn’t prove/disprove ~X OR ~Y; it identifies the composition fallacy.

Selvam: Universal Inverse Power Law Distribution for Indian Region Rainfall

2013 May 24

Space-time fluctuations of meteorological parameters exhibit self-similar fractal fluctuations. Fractal space-time fluctuations are generic to dynamical systems in nature such as fluid flows, spread of diseases, heart beat pattern, etc. A general systems theory developed by the author predicts a universal inverse power law form incorporating the golden mean for the fractal fluctuations. The model-predicted distribution is in close agreement with observed fractal fluctuations of all size scales in the monthly total Indian region rainfall for the 141-year period 1871 to 2011.

Universal Inverse Power Law Distribution for Indian Region Rainfall
From: A. Mary Selvam
[v1] Fri, 3 May 2013 09:52:00 GMT (434kb)
arXiv:1305.1188 [physics.gen-ph]

The Gaussian probability distribution used widely for analysis and description of large data sets underestimates the probabilities of occurrence of extreme events such as stock market crashes, earthquakes, heavy rainfall, etc. The assumptions underlying the normal distribution, such as fixed mean and standard deviation and independence of data, are not valid for real-world fractal data sets exhibiting a scale-free power law distribution with fat tails (Selvam, 2009). There is now an urgent need to incorporate newly identified fractal concepts in standard meteorological theory for realistic simulation and prediction of atmospheric flows.
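The underestimation of extreme events by the Gaussian can be made concrete by comparing its tail with an inverse power law tail. The exponent below is an arbitrary illustrative choice, not the golden-mean form the paper derives.

```python
import math

def normal_tail(z):
    """P(Z > z) for a standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def power_tail(x, alpha=2.0):
    """Inverse power law tail P(X > x) = x**(-alpha) for x >= 1.
    The exponent alpha is an arbitrary illustrative choice."""
    return x ** (-alpha) if x >= 1 else 1.0

# The Gaussian tail collapses super-exponentially; the power law does not.
for z in (2, 5, 10):
    print(f"x = {z:2d}  Gaussian tail = {normal_tail(z):.2e}"
          f"  power-law tail = {power_tail(z):.2e}")
```

At ten standard deviations the Gaussian assigns an essentially impossible probability while the fat-tailed distribution still allows the event, which is the practical point about rainfall extremes.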

Dear Dr Russell …

2013 May 23

Dear Dr Russell,

First, congratulations on your recent analysis and observations regarding the interaction of the massive coronal mass ejection and the thermosphere.

On the other hand, I am sure you must be aware by now how your comments regarding the event are being used to suggest that CO2 in the lower atmosphere does not act as a ‘global warming’ gas. For instance, this article …

Global warming debunked: NASA report verifies carbon dioxide actually cools atmosphere

Do you concur with the author’s conclusion that “The result was an overall cooling effect that completely contradicts claims made by NASA’s own climatology division that greenhouse gases are a cause of global warming”?

Thank you in advance for any response
Ron Broberg

Hi Ron,

Thanks for your question. There has been a widespread misconception about what was discussed in this web release and I welcome the chance to clarify what we said. Nothing could be further from the truth than to say that “The result was an overall cooling effect that completely contradicts claims made by NASA’s own climatology division that greenhouse gases are a cause of global warming.” The cooling due to CO2 being referred to in our web article occurs 60 to 155 miles above the surface of the earth (100s of kilometers in altitude). SABER is looking at the energy balance and climate of the upper atmosphere, not down at the surface. This atmospheric region has no effect on global warming in the lower atmosphere near the earth surface. The earth surface is heated by the sun and then cooled by infrared radiation being radiated back to space. CO2 in the lower atmosphere is a strong absorber of this radiation (as are other greenhouse gases) and it radiates much of this radiation back to the earth surface, causing the warming to occur. I liken CO2 in the lower atmosphere to a thick blanket that traps much of the radiated heat from the surface, preventing it from escaping and resulting in warming in the lower atmosphere. As altitude increases, the “blanket” gets thinner, letting more radiation escape to space. In the 60 to 155 mile altitude range reported on in our article, the “blanket” is very thin, letting most of the CO2 radiation escape to space and causing the cooling we refer to.

So first, the observations we reported on have no bearing on the question of global warming due to the greenhouse gas CO2, and secondly, they do not in any way contradict statements made by NASA, the IPCC or other reputable groups studying climate change that CO2 increases lead to global warming.

I hope this response addresses your question, but if you wish more information, do not hesitate to contact me.


Jim Russell
SABER Principal Investigator

Courtney: Studying the Internal Ballistics of a Combustion Driven Potato Cannon using High-speed Video

2013 May 10

Figure 2. Average velocity of cylindrical potato projectiles vs. barrel position for each experimental propellant.

A potato cannon was designed to accommodate several different experimental propellants and have a transparent barrel so the movement of the projectile could be recorded on high-speed video (at 2000 frames per second). Both combustion chamber and barrel were made of polyvinyl chloride (PVC). Five experimental propellants were tested: propane (C3H8), acetylene (C2H2), ethanol (C2H6O), methanol (CH4O), and butane (C4H10). The amount of each experimental propellant was calculated to approximate a stoichiometric mixture, considering the Upper Flammability Limit (UFL) and the Lower Flammability Limit (LFL), which in turn were affected by the volume of the combustion chamber. Cylindrical projectiles were cut from raw potatoes so that there was an airtight fit, and each weighed 50 (+/- 0.5) grams. For each trial, position as a function of time was determined via frame by frame analysis. Five trials were taken for each experimental propellant and the results were analyzed to compute velocity and acceleration as functions of time. Additional quantities including force on the potato and the pressure applied to the potato were also computed. For each experimental propellant, average velocity vs. barrel position curves were plotted. The most effective experimental propellant was defined as the one which accelerated the potato to the highest muzzle velocity. The experimental propellant acetylene performed the best on average (138.1 m/s), followed by methanol (48.2 m/s), butane (34.6 m/s), ethanol (33.3 m/s), and propane (27.9 m/s), respectively.
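The frame-by-frame analysis described, going from positions sampled at 2000 fps to velocity, acceleration, and force, amounts to finite differencing. A sketch with made-up positions (not the paper's measurements):

```python
# Positions sampled at 2000 fps are differenced to get velocity and
# acceleration, then force via Newton's second law. The position values
# below are invented for illustration.
fps = 2000.0
dt = 1.0 / fps
mass = 0.050  # kg, the 50 g potato cylinder

# Hypothetical positions (m) of the projectile in successive frames
x = [0.000, 0.002, 0.008, 0.018, 0.032, 0.050, 0.072]

# Central differences for the interior frames
v = [(x[i + 1] - x[i - 1]) / (2 * dt) for i in range(1, len(x) - 1)]
a = [(x[i + 1] - 2 * x[i] + x[i - 1]) / dt ** 2 for i in range(1, len(x) - 1)]
force = [mass * ai for ai in a]  # F = m*a

print("velocity (m/s):", [round(vi, 1) for vi in v])
print("force (N):", [round(fi, 1) for fi in force])
```

Dividing the force by the projectile's cross-sectional area would give the pressure applied to the potato, the other quantity the authors compute.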

Studying the Internal Ballistics of a Combustion Driven Potato Cannon using High-speed Video
E.D.S. Courtney and M.W. Courtney
BTG Research, P.O. Box 62541, Colorado Springs, CO, 80962
United States Air Force Academy, 2354 Fairchild Drive, USAF