Friday, February 24, 2012

Afghan Rage and Burning Korans


The public rage now on full display in Afghanistan is ostensibly over what the PC police term the “inadvertent” burning of a number of Korans by several brain-dead American military personnel. But I believe it is a sign of something far more serious and fundamental, something that promises to end any kind or amount of American presence or influence in Afghanistan. In my eyes, that rage is, in general, a repudiation of Western culture and values and, specifically, of American interference in their country, a resentment that had been simmering just under the surface and has now boiled over.

Although I do not wish to offend anyone, I believe that, basically, Afghanistan consists of loosely knit tribes of individuals who for the most part are only a few steps removed from the Stone Age. Those few steps include a patina of Islam covering a core of primitivism so profound most Americans can scarcely comprehend it.

That primitive core includes deep-seated suspicion of and hostility toward everyone not of their clan or tribe; a conception of honor that requires revenge attacks and killings, including the killing of women who have shamed the family; unquestioned and absolutist patriarchal rule; female subordination to men; lack of higher education for most females, except for children of the elite; arranged marriages; individual social status conferred by birth, etc.

It is a culture where asking the name of a man’s wife may be considered a fairly serious insult. Many rural Pashtun men have never known the real names of their mothers, aunts, or even their younger sisters. It is a country where "liberal" judges (meaning ones who are not fundamentalists) are killed and schools for girls blown up. We're talking about a country where if a man speaks directly to a woman in a social context, he is dishonoring her. Where women are advised to avoid looking men in the eyes and to keep their eyes lowered when they walk down the street to maintain their reputations as pure and submissive. Where, outside the home, men and women must never touch one another under any circumstances. Where women are told to always dress properly to avoid unwanted attention by wearing loose-fitting pants under their skirts so their legs cannot be made out and they do not tempt men to rape them. Seriously.

Although the Taliban practice a severe, fundamentalist form of Islam, they are native Afghans and not godless Americans. When push comes to shove, as it has today, Afghans would much prefer to have the Taliban in charge of their country rather than the Americans.

Centuries of conflict between ethnic groups, tribes, clans, and families have made competition and violence an integral part of Afghan culture, a culture now turned against Americans. It's time for us to get out. Not next year, not in six months, but now. They do not want us there and, since Bin Laden and most of his colleagues are dead, we have no business being there. Time to pack up and skedaddle back home.

We have no business trying to act as if we were the world's police force. It's their country. Let them run it the way they want, primitive tribalists or not.

Tuesday, February 21, 2012

Radium Poisoning


Natural radium is produced in the environment through the radioactive decay of uranium and thorium and is found at very low levels in bedrock, soil, plants, the atmosphere, and animals including humans. Its most common isotopes are Ra-226, Ra-224, and Ra-228. High concentrations of radium may be found in bodies of water in certain locations. As a result, radium may be concentrated in fish and other aquatic organisms and be bio-concentrated through the food web. Radium is also present in the environment as a result of human agency, specifically through mining and manufacturing processes that increase exposure to low levels of ionizing radiation.
Author’s Rant: Exposure to low levels of ionizing radiation, whether its source is natural or not, remains a highly controversial topic that is subject to considerable scientific discussion. That heated debate began in the 1950s and early 1960s, when scientists like Alice Stewart, George W. Kneale, Ian MacKenzie, C.K. Wanebo, Ernest Sternglass, and others began questioning levels of radiation exposure certified safe by the Atomic Energy Commission. The debate took on new life when the well-known and highly respected nuclear chemist and cardiologist John Gofman (who at the time was an Associate Director of Lawrence Livermore National Laboratory) and his associate Arthur Tamplin (a research biophysicist at the Lab) first published their findings that no level of ionizing radiation was safe. Not long after that, the Atomic Energy Commission cut off their research funding.
The AEC and its successor agency, the Department of Energy, have a long and shameful history of trying to suppress research deemed inimical to their primary mission of supporting and growing the nuclear industry. The honor roll of prominent scientists the AEC-DOE fired, vilified, or otherwise tried to make professionally miserable because their research results did not meet the AEC-DOE’s agenda includes John Gofman, PhD, MD (nuclear/physical chemist and renowned cardiologist; co-discoverer of protactinium-232, uranium-232, protactinium-233, and uranium-233, who also proved the slow- and fast-neutron fissionability of uranium-233); Arthur Tamplin, PhD (biophysicist); Alice Stewart, MD (epidemiologist); Ernest Sternglass, PhD (radiological physicist); George Kneale, PhD (bio-statistician); Karl Z. Morgan, PhD (physicist, widely regarded as the “Founder” of Health Physics, who in 1972 resigned his position as Head of Health Physics at Oak Ridge National Laboratory when he was ordered by his superiors to suppress information in his possession about the toxicity of plutonium); Greg Wilkinson, MD (epidemiologist); Henry W. Kendall, PhD (physicist, Nobel Prize Laureate, and one of the founders of the Union of Concerned Scientists); and Thomas Mancuso, MD (epidemiologist), among many others. When controversy arose, the AEC-DOE’s tactic was to turn to researchers on its payroll, or researchers whose professional work depended on agency funding, who were more sympathetic to its interests, and to persuade them to demonstrate that its nuclear activities were not harmful rather than to address objectively the issues raised by more independent-minded scientists.
Historical Background: Not long after its discovery, radium was used by doctors and health practitioners of various guises in Europe and the U.S. to treat patients with dozens of diseases and complaints, including everything from acne to insanity. It was administered orally, by inhalation and injection, and even by enema and suppository. Consumer products containing radium included hair tonic, toothpaste, ointments, and a wide range of liquid-based elixirs. For example, in 1901 a French physician used radium in an effort to cure lupus and various skin lesions. Later physicians used it to treat a variety of cancers, unknowingly causing even more cancer. The famous physicist Marie Curie (Polish-born Maria Sklodowska, pronounced sklaw-DAWF-skah), the scientist who first isolated radium in its pure metallic form and won her second Nobel Prize for the effort, died of aplastic anemia, almost certainly as a result of her long exposure to radiation.
But the worst early cases of radium poisoning weren’t those of isolated scientists here or there but hundreds of workers at watch and watch dial factories in the U.S. It all had to do with the natural properties of the metal, which, when purified, glows in the dark. During World War I, well before the risks of radium exposure were widely understood, that property was exploited in the manufacture of dials for clocks, wristwatches, aircraft gauges, and other instruments that needed to be readable in the dark. Without doubt, the general public was absolutely fascinated with radium’s mysterious luminescent properties. Industries sprang up to manufacture hundreds of consumer products containing radium. It was used on glow-in-the-dark numbers for houses, theater seats, and luminous lamp-pulls. At about the same time, the general public discovered that wristwatch dials could be seen more readily at night if the dials were painted with a luminous material that contained radium. Almost overnight luminous watch faces became the rage and the manufacture of luminous dials suddenly became an important and well-paying industry.
Radium dial painting began in 1917 and over the next decade about 2,000 young women were employed as dial-painters. That work occurred mostly in about a dozen locations but especially at larger dial and watch factories in Waterbury, Connecticut; Orange, New Jersey; and Ottawa, Illinois. Ignorant of the health hazards of their jobs, the dial-painters breathed air saturated with radium particles and touched contaminated surfaces every working day. But, much worse, the luminescent radium paint was applied to the dials by the young women with very fine brushes. To keep the brush tips pointed, the dial-painters were instructed to twirl the end of the brush between their lips and shape it with their tongues. Many young women would use the paint on the buttons of their clothing to make them glow in the dark and also applied it to their fingernails, eyelids, and assorted other body parts. As a result, the dial-painters ingested radium almost daily; hundreds contracted malignant cancers, suffered bone disfigurements, became seriously ill with other diseases, and died. Although the technique of lip-pointing the brushes was abolished throughout the industry in 1927, by that time dozens of dial-painters had died from radium exposure and dozens of others had contracted serious illnesses, including disfiguring cancers and osteomyelitis of the upper and lower jaw and buccal cavity.
Former dial and watch factory sites that were and still are contaminated with radium include the site of the former U.S. Radium Corporation factory in Orange, New Jersey, and five plants in Connecticut: the former Waterbury Clock Factory, the former Lux Clock Factory, and the former Benrus Clock Company buildings in Waterbury; the former Sessions Clock Company in Bristol; and the former Seth Thomas Clock Company in Thomaston. Many of those abandoned factories are now Superfund sites. For readers with a sense of history, the U.S. Radium Corporation was responsible for the infamous Radium Girls trial in the late 1920s, which was settled out of court when the company agreed to pay the plaintiffs the paltry sum of $10,000 each and $600 a year for as long as they lived. Which, it turned out, wasn’t very long.[1]


[1] For additional information, see: Claudia Clark, Radium Girls, Chapel Hill, North Carolina: The University of North Carolina Press, 1997; and Ross Mullner, Deadly Glow: The Radium Dial Worker Tragedy, American Public Health Association Publications, 1999.

Sunday, February 19, 2012

Radiometric Dating


Precise method of dating Earth materials by measuring the proportion of a long-lived radioactive parent element with respect to its daughter products, or by measuring the presence of a short-lived radioactive element; those measurements are precise because the rates of decay of many isotopes have been extensively documented and do not vary with the physical conditions found in the Earth’s outer layers. Consequently, each radioactive isotope used in the dating process has been decaying at a known rate since it was incorporated in the rock that contains it, and the decay products have been accumulating at a corresponding rate. For example, when a mineral that contains uranium crystallizes from magma, the newly formed mineral incorporates essentially no lead from any previous decay process. Therefore, the radiometric clock starts ticking at that moment. As the uranium in the new mineral decays, its daughter products are trapped, and measurable quantities of lead eventually accumulate.
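To make the arithmetic concrete, here is a minimal sketch of the standard age equation, t = (1/λ)·ln(1 + D/P), under exactly the assumptions described above (a closed system and no daughter product at crystallization); the function name and the example numbers are illustrative only, not taken from any particular study.

```python
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    """Age of a mineral from one parent/daughter pair, assuming a closed
    system and zero daughter atoms at the time of crystallization."""
    decay_constant = math.log(2) / half_life_years      # lambda, in 1/years
    return math.log(1.0 + daughter_atoms / parent_atoms) / decay_constant

# Example: equal counts of parent and daughter atoms mean exactly one
# half-life has elapsed (here U-238 -> Pb-206, half-life ~4.47 billion years).
print(radiometric_age(parent_atoms=1000, daughter_atoms=1000,
                      half_life_years=4.47e9))          # ~4.47e9 years
```

The same relation underlies every parent/daughter pair discussed below; only the half-life (and, for some systems, a branching correction) changes.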
Historical Background: In 1896 the discovery of the natural radioactive decay of uranium by the French physicist Henri Becquerel opened the door to a cornucopia of scientific discoveries. In 1902 the physicist Ernest Rutherford and the chemist Frederick Soddy, working at Canada’s McGill University, determined that radioactive elements, such as uranium and thorium, broke down at a fixed rate over time into other elements in a predictable sequence or series. Their discovery led to the identification of the half-life and to their disintegration theory of radioactivity, which proposed that the atoms of an unstable element disintegrate over time, transforming into other elements. Their research into radioactive decay, coupled with the work of their colleague, Kasimir Fajans, resulted in the Radioactive Displacement Law of Fajans and Soddy, which described the products of alpha and beta decay.
That discovery intrigued Bertram B. Boltwood (1870-1927), a radiation chemist working at Yale University. Boltwood was spurred on when, during a 1905 lecture at Yale, Rutherford challenged the scientific community to use radioactive decay to date rocks. In 1905 Boltwood began studying the radioactive decay series Rutherford and Soddy had defined and found that lead was always present in uranium and thorium ores. He concluded that lead was the final product of the radioactive decay of uranium and thorium. In 1907, he reasoned that once the rate at which uranium decays is known (the half decay period, or half-life), the proportion of lead in uranium ores could be used as a kind of measuring device, or clock, since it would tell geoscientists when that ore, and therefore the Earth’s crust, formed. Boltwood’s pioneering research, although somewhat crude when measured against today’s far more sophisticated techniques (for example, the use of the mass spectrometer to identify atoms by weight), put the Earth’s age at 1.2 billion years, which, for that time, was a dramatic increase in the direction most scientists believed to be correct.
Shortly after Boltwood’s discovery in 1907 that uranium decayed slowly to stable lead, Arthur Holmes, an undergraduate physics student at the Royal College of Science in London, was so taken by the geological implications of that discovery that he switched majors to geology. By 1911, using only analytical chemistry applied to a few mineral samples, Holmes established a framework for the geologic time scale that proved to be uncannily accurate, considering the relatively primitive nature of his approach (since it predated the discovery of isotopes). Building on Boltwood’s pioneering work, Holmes performed the very first uranium-lead analysis of a rock specifically intended for age-dating purposes. That research resulted in a date of 370 million years for a Devonian specimen. Although only 21 years old and still an undergraduate, Holmes had embarked on a lifetime’s quest “to graduate the geological column with an ever-increasingly accurate time scale.”
In 1913 he wrote The Age of the Earth, a book that almost immediately became justly famous. In that book Holmes estimated the Earth’s age at 1.6 billion years. It is quite extraordinary that at the time he was only 23 and had not yet completed his doctoral studies. After publication Holmes became recognized as the world’s authority on geochronology. But opposition from established geologists who clung to the belief that the Earth was only about 100 million years old was formidable. Key advocates of the opposing position included scientists who supported the ideas the famous British physicist William Thomson, perhaps better known as Lord Kelvin, had advocated shortly before his death in 1907. Other well-known proponents of a much younger Earth had included the German physicist Hermann von Helmholtz, the American astronomer Simon Newcomb, and Charles Darwin’s astronomer-mathematician son, George H. Darwin.
However, by the early to mid-1920s Holmes’s work was vindicated when both the British Association for the Advancement of Science and the National Research Council of the U.S. National Academy of Sciences came down on the side of the Earth being between 1.6 and 2.0 billion years old. From the mid-1920s through the early 1940s, a group of physicists, geophysicists, and geochemists succeeded in devising techniques that continued pushing back the age of the Earth. That group included Holmes, Alfred Nier, E. K. Gerling, Friedrich Georg Houtermans, and Clair C. Patterson, who ultimately produced accurate “primeval” lead isotopic measurements from minerals collected from five meteorite fragments at Canyon Diablo, Arizona.[1] By 1956, Patterson’s research had determined the age of the Earth at almost 4.6 billion years.
Since Patterson’s and Houtermans’ pioneering research in the mid-1950s, additional data have been accumulated, instruments have become more precise, and analytical techniques have improved. Moon rocks and many more meteorites have been sampled and dated. Decay constants have been measured with ever increasing accuracy. Remarkably, certain technical adjustments to and corrections of Patterson’s 1956 computation have canceled each other out. Today’s best estimate of the age of meteorites (4.55 ± 0.02 billion years) is identical to Patterson’s except for a smaller error range. That value has been confirmed by dozens of scientists working independently.
Today it is a nearly universally accepted scientific principle that radioactive decay occurs at a constant rate that is specific to each radioactive isotope. Since the 1950s, geologists and geophysicists have used radioactive elements as natural “clocks” for determining ages of certain types of rocks. Radiometric clocks are set when each rock forms. “Forms” means the moment an igneous rock solidifies from magma, a sedimentary rock layer is deposited, or a rock heated by metamorphism cools. That setting and resetting process allows geoscientists to date rocks that formed at different times and under different circumstances. Another commonly used radiometric dating technique is based on the decay of potassium (K-40) to argon (Ar-40). In igneous rocks, the potassium-argon clock starts the moment the rock crystallized from magma. Precise measurements of the amount of the isotope K-40 relative to Ar-40 in an igneous rock determine the time that has passed since crystallization (knowing that the half-life of K-40 is about 1.3 billion years). If an igneous or other rock is metamorphosed, its radiometric clock is reset. Potassium-argon measurements are then used to determine the number of years that have passed since metamorphism. See isotopic dating.
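As a rough illustration of the potassium-argon arithmetic just described, here is a hedged sketch. The roughly 11 percent branching fraction of K-40 decays that yield Ar-40 (the remainder produce Ca-40) is a standard physical value not mentioned in the text above, and the function name and sample numbers are mine; a real analysis must also correct for any non-radiogenic argon.

```python
import math

HALF_LIFE_K40 = 1.25e9      # years; the text above rounds this to ~1.3 billion
LAMBDA_K40 = math.log(2) / HALF_LIFE_K40
BRANCH_TO_AR40 = 0.109      # fraction of K-40 decays producing Ar-40 (rest -> Ca-40)

def k_ar_age(ar40_atoms, k40_atoms):
    """Years since the rock crystallized (or was last reset by metamorphism),
    assuming a closed system and that all measured Ar-40 is radiogenic."""
    ratio = ar40_atoms / k40_atoms
    return math.log(1.0 + ratio / BRANCH_TO_AR40) / LAMBDA_K40

# Example: one Ar-40 atom for every ten surviving K-40 atoms gives ~1.2 billion years.
print(f"{k_ar_age(ar40_atoms=1.0, k40_atoms=10.0) / 1e9:.2f} billion years")
```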
Author’s Note: It is critical for students to realize that no scientific method is free from ambiguity. In addition, most scientific techniques in and of themselves are subject to considerable latitude in terms of the interpretation of results. Consequently, all physical-chemical methods of dating rocks have uncertainties associated with them. Several basic assumptions are made when geoscientists determine the age of rock samples. The most significant assumption is that the sample is from a closed system in which no parent or daughter isotopes were gained or lost over time. Another assumption involves the amount of daughter isotope present at the time the sample rock was formed. For rare isotopes, that amount is generally assumed to be zero. Because of those and other uncertainties, the strongest evidence for the age of a rock is obtained when two different radiochemical dating methods produce similar results.
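A toy sketch of that cross-check might look like the following; the five percent tolerance and the two example ages are arbitrary illustrative choices of mine, not a published standard.

```python
def ages_concordant(age_a_years, age_b_years, tolerance=0.05):
    """True if two independently determined ages agree within a fractional
    tolerance (5% here, purely illustrative)."""
    return abs(age_a_years - age_b_years) <= tolerance * max(age_a_years, age_b_years)

# e.g. a hypothetical U-Pb age of 1.02 Gyr and a K-Ar age of 0.99 Gyr
# would count as mutually supporting.
print(ages_concordant(1.02e9, 0.99e9))    # True
```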
Since we live in a real world where our convenient, highly intellectualized categories and classifications are seldom found in nature, it is likely that geoscientists sometimes rely, unknowingly (or knowingly, according to many creationist critics), on one or both of those assumptions when they do not strictly hold for the rock samples being dated by radiometric decay techniques. But, despite what Creationists like to assert, the fact that a specific dating technique fails to produce a reliable or verifiable date for a particular rock sample is no reason to reject all radiometric dating techniques. After all, when your car fails to start one winter morning, surely you don't automatically assume that all cars are therefore useless pieces of junk.
Interested students may wish to consult one of the standard works on the topic. To anyone with a high school background in science, I highly recommend G. Brent Dalrymple’s classic The Age of the Earth (Stanford, California: Stanford University Press, 1991). It is well-written, well-reasoned, and powerful in its explanations. Radiometric dating has been widely attacked by Christian fundamentalists, or Creationists, as unreliable, riddled with inaccuracies, and unscientific. A variety of their views may also be found on the internet under the entry, radiometric dating. Curious students owe it to their intellectual development to examine that alternate universe. However, for the point of view of a Christian geophysicist, see: Roger C. Wiens, PhD, Radiometric Dating: A Christian Perspective, written in 1994 and revised in 2002: http://www.asa3.org/ASA/resources/Wiens.html.



[1] Where, quite coincidentally, my great uncle, Earl Cundiff, had been murdered in 1926.