Historical Science & Technology, Science

The Age of the Earth: Geology in the 17th & 18th Centuries

Geology has technically been a thing since the first caveman thought, upon picking up a rock, “Hey, this rock looks different from that one over there. I wonder why that is?” But it wasn’t until the 17th Century CE that scientists actually started using actual science to find some answers.

Geology & the Biblical Flood

In the 17th Century, most people in the “Christian world” still believed that the Bible was an actual, factual historical document. From this, they extrapolated that the big rainstorm that got Russell Crowe all worked up had actually happened, and set out to prove it through science.

In searching for evidence to support this, scientists and researchers learned a good deal about the composition of the Earth and, perhaps more importantly, discovered fossils, and lots of ‘em. Perhaps unsurprisingly, the real information gleaned in this process was often significantly manipulated to support the idea of the Great Flood (as well as other Biblical nonsense). A New Theory of the Earth, written by William Whiston and first published in 1696, used Christian “reasoning” to “prove” that the Great Flood had not only happened, but that it was also solely responsible for creating the rock strata of the Earth’s crust.

Whiston’s book and further developments led to numerous heated debates between religion and science over the true origin of the Earth. The overall upside was a growth in interest in the makeup of our planet, particularly the minerals and other components found in its crust.

What created these mineral strata? A relatively easily explainable scientific process, or Jesus?

Minerals, Mining & More

As the 18th Century progressed, mining became increasingly important to the economies of many European countries. The importance of accurate knowledge about mineral ores and their distribution throughout the world increased accordingly. Scientists began to systematically study the earth’s composition, compiling detailed records on soil, rocks, and, most importantly, precious and semiprecious metals.

In 1774, the German geologist Abraham Gottlob Werner published his book, On the External Characteristics of Minerals. In it, Werner presented a detailed system by which specific minerals could be identified through external characteristics. With a more efficient method of identifying land where valuable metals and minerals could be found, mining became even more profitable. This economic potential made geology a popular area of study, which, in turn, led to a wide range of further discoveries.

Religion vs. Facts

Histoire Naturelle, published in 1749 by French naturalist Georges-Louis Leclerc, challenged the then-popular biblical accounts of the history of Earth supported by Whiston and other theologically-minded scientific theorists. After extensive experimentation, Leclerc estimated that the Earth was at least 75,000 years old, not the 4,000-5,000 years the Bible suggests. Immanuel Kant’s Universal Natural History and Theory of the Heavens, published in 1755, similarly described the Earth’s history without any religious leanings.

The works of Leclerc, Kant, and others drew into serious question, for the first time, the true origins of the Earth itself. With biblical and religious influences taken out of the equation, geology turned a corner into legitimate scientific study.

By the 1770s, two very different geological theories about the formation of Earth’s rock layers gained popularity. The first, championed by Werner, hypothesized that the Earth’s layers were deposits from a massive ocean that had once covered the whole planet (i.e. the biblical flood). Supporters of this theory were called Neptunists, after Neptune, the Roman god of the sea. In contrast, Scottish naturalist James Hutton’s theory argued that Earth’s layers were formed by the slow, steady solidification of a molten mass—a process that made our planet immeasurably old, far beyond the chronological timeframe suggested by the Bible. Supporters of Hutton’s theory, known as Plutonists after Pluto, god of the underworld, believed that Earth’s continual volcanic processes were the main cause of its rock layers.

Photo credit: Taraji Blue / Foter / Creative Commons Attribution-NonCommercial-ShareAlike 2.0 Generic (CC BY-NC-SA 2.0)

Give ‘Em an Inch

If you’re reading this in the US, Canada, or anywhere in the UK, you know the inch as a standard unit of length. If you’re reading it from anywhere else, you know it as the stupid thing that isn’t a centimeter, but should be. (Kind of. Wait. What?) But, either way, what you may not know is: where did the inch come from?

History’s Inchstories

The exact details of where the inch came from are a bit murky. The earliest surviving reference to the unit is a manuscript from 1120 CE, which itself describes the Laws of Æthelbert from the early 7th Century CE. This manuscript relates a law regarding the cost of wounds based on depth: one inch costs one shilling, two inches costs two shillings, and so on. Whether this cost refers to a fine for the inflictor of said wound or the cost of treating the wound is unclear, because of weird Olde English.

Around that time, one of several standard Anglo-Saxon units of length was the “barleycorn,” defined as “three grains of barley, dry and round, placed end to end lengthwise.” One inch was said to equal three barleycorns, a legal definition which persisted for several centuries (as did the use of the barleycorn as the base unit). Similar definitions can be found in contemporaneous English and Welsh legal tracts.

Not to scale.

Attempts at Standardization

Since grains of barley are notoriously nonconformist, the traditional method of measuring made it impossible to truly standardize the unit. In 1814, Charles Butler, a math teacher at the Cheam School in England, revisited the old “three grains of barley” measurement and established the barleycorn as the base unit of the English Long Measure System. All other units of length were derived from it.

George Long, in his Penny Cyclopædia, published in 1842, observed that standard measures had made the barleycorn definition of an inch obsolete. Long’s writing was supported by law professor John Bouvier in his law dictionary of 1843. Bouvier wrote that “as the length of the barleycorn cannot be fixed, so the inch according to this method will be uncertain.” He noted that, as a “standard inch measure” was at this time kept in the Exchequer chamber at Guildhall, this unit should be the legal definition of the inch.

Modern Standardization… in 1959?!

Somehow, it was not until 19friggin59 that the current, internationally accepted length of an inch was established. This measurement, exactly 25.4 millimeters (with the yard fixed at exactly 0.9144 meters), was adopted through the International Yard and Pound Agreement, which sounds ridiculous but was an actual thing.

Before this, there were various, slightly different inch measurements in use. In the UK and British Commonwealth countries, an inch was defined based on the Imperial Standard Yard. The US, meanwhile, had used the conversion factor of 39.37 inches to one meter since 1866.
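Just how different were those competing inches? A quick back-of-the-envelope check in Python, using only the figures from the paragraphs above (the variable names are my own):

```python
# Pre-1959 US definition: 1 meter = 39.37 inches exactly (since 1866).
# 1959 International Yard and Pound Agreement: 1 inch = 25.4 mm exactly.
us_inch_mm = 1000 / 39.37    # US inch derived from the 1866 conversion factor
intl_inch_mm = 25.4          # international inch

diff_nm = (us_inch_mm - intl_inch_mm) * 1_000_000  # ≈ 50.8 nanometers

print(f"US (1866) inch:     {us_inch_mm:.7f} mm")
print(f"International inch: {intl_inch_mm:.7f} mm")
print(f"Difference:         {diff_nm:.0f} nm (about 2 parts per million)")
```

A gap of about two millionths of an inch per inch is invisible on a ruler, but it adds up over long distances, which is why US surveyors kept using a separate "survey foot" based on the old conversion for decades afterward.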

Photo credit: Biking Nikon SFO / Foter / Creative Commons Attribution 2.0 Generic (CC BY 2.0)

A Brief History of Cataract Surgery

Cataract surgery is a surgery in which a cataract is removed. Nailed it! More specifically, cataract surgery is the removal of the human eye’s natural lens necessitated by the lens having developed an opacification (a.k.a. a cataract) which causes impairment or loss of vision. While we tend to think of “advanced” medical procedures such as this as relatively modern developments, cataract surgery has been performed for thousands of years.

Couching in Ancient India

Sushruta, a physician in ancient India (ca. 800 BCE), is the first doctor known to have performed cataract surgery. In this procedure, known as “couching,” the cataract, or kapha, was not actually removed.

First, the patient would be sedated, but not rendered unconscious. He/she would be held firmly and advised to stare at his/her nose. Then, a barley-tipped curved needle was used to push the kapha out of the eye’s field of vision. Breast milk was used to irrigate the eye during the procedure. Doctors were instructed to use their left hand to perform the procedure on affected right eyes, and the right hand to treat left eyes.

Even drawings of cataract surgery look super unpleasant.

When possible, the cataract matter was maneuvered into the sinus cavity, and the patient could expel it through his/her nose. Following the procedure, the eye would be soaked with warm, clarified butter and bandaged, using additional delicious butter as a salve. Patients were advised to avoid coughing, sneezing, spitting, belching, or shaking during and after the operation.

Couching was later introduced to China from India during the Sui dynasty (581-618 CE). It was first used in Europe circa 29 CE, as recorded by the Roman encyclopedist Aulus Cornelius Celsus. Couching continued to be used in India and Asia throughout the Middle Ages. It is still used today in parts of Africa.

Suction Procedures

In the 2nd Century CE, Greek physician Antyllus developed a surgical method of removing cataracts that involved the use of suction. After creating an incision in the eye, a hollow bronze needle and lung power were used to extract the cataract. This method was an improvement over couching, as it always eliminated the cataract and, therefore, the possibility of it migrating back into the patient’s field of vision.

In his Book of Choices in the Treatment of Eye Diseases, the 10th Century CE Iraqi ophthalmologist Ammar ibn Ali Al-Mosuli presented numerous case histories of successful use of this procedure. In 14th Century Egypt, oculist Al-Shadhili developed a variant of the bronze needle that used a screw mechanism to draw suction.

“Modern” Cataract Surgery

The first modern European physician to perform cataract surgery was Jacques Daviel, in 1748.

Implantable intraocular lenses were introduced by English ophthalmologist Harold Ridley in the 1940s, an innovation that made patient recovery more efficient and comfortable.

Charles Kelman, an American surgeon, developed the technique of phacoemulsification in 1967. This process uses ultrasonic waves to facilitate the removal of cataracts without a large incision in the eye. Phacoemulsification significantly reduced patient recovery times and all but eliminated the pain and discomfort formerly associated with the procedure.

Photo credit: Internet Archive Book Images / Foter / Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0)

The Periodic Table of Dmitri Mendeleev

If you’re here, reading a blog about science and technology, I’m going to assume you already know what the periodic table of elements is, and therefore dispense with the introductory information. However, though you may know the table well, do you know where it came from? Read on, friend, read on…

From Russia with Science

Dmitri Ivanovich Mendeleev (1834-1907) was a Russian inventor and chemist. In the 1850s, he postulated that there was a logical order to the elements. As of 1863, there were 56 known elements, with new ones being discovered at a rate of roughly one per year. By that time, Mendeleev had already been working to collect and organize data on the elements for seven years.

Mendeleev discovered that when the known chemical elements were arranged in order of atomic weight, from lowest to highest, a recurring pattern emerged. This pattern showed the similarity in properties between groups of elements. Building off this discovery, Mendeleev created his own version of the periodic table, which included the 66 elements then known. In 1869, he published the first iteration of his periodic table in Principles of Chemistry, a two-volume textbook that would remain the definitive work on the subject for decades.

Mendeleev’s periodic table is essentially the same one we use today, organizing the elements in ascending order by atomic weight and grouping those with similar properties together.
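Mendeleev’s core observation is easy to reproduce with modern data. A minimal sketch (the atomic weights are modern rounded values, and the selection of elements is mine): sort a stretch of light elements by atomic weight, and chemically similar ones, like the alkali metals, turn up at regular intervals.

```python
# Toy version of Mendeleev's insight: order elements by atomic weight
# and chemically similar elements recur periodically.
elements = {
    "Li": 6.94,  "Be": 9.01,  "B": 10.81,  "C": 12.01,
    "N": 14.01,  "O": 16.00,  "F": 19.00,  "Na": 22.99,
    "Mg": 24.31, "Al": 26.98, "Si": 28.09, "P": 30.97,
    "S": 32.06,  "Cl": 35.45, "K": 39.10,
}

ordered = sorted(elements, key=elements.get)  # lightest to heaviest
alkali_positions = [ordered.index(sym) for sym in ("Li", "Na", "K")]

print(ordered)
print(alkali_positions)  # positions 0, 7, 14: the alkali metals recur every 7
```

The interval here is seven rather than eight because the noble gases (neon, argon) are left out, just as they were absent from Mendeleev’s 1869 table, since none had yet been discovered.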

Dmitri Mendeleev

Changing & Predicting the Elements

The classification method Mendeleev formulated came to be known as “Mendeleev’s Law.” So sure was he of its validity and effectiveness that he used it to propose changes to the previously-accepted atomic weights of a number of elements. These changes were later found to be accurate.

In the updated, 1871 version of his periodic table, he predicted the placement on the table of eight then-unknown elements and described a number of their properties. His predictions proved to be highly accurate, as several elements that were later discovered almost perfectly matched his proposed elements. Though they were renamed (his “ekaboron” became scandium, for example), they fit into Mendeleev’s table in the exact locations he had suggested.

From Skepticism to Wide Acclaim

Despite its accuracy and the scientific logic behind it, Mendeleev’s periodic table of elements was not immediately embraced by chemists. It was not until the discovery of several of his predicted elements—most notably gallium (in 1875), scandium (1879), and germanium (1886)—that it gained wide acceptance.

The genius and accuracy of his predictions brought Mendeleev fame within the scientific community. His periodic table was soon accepted as the standard, surpassing those developed by other chemists of the day. Mendeleev’s discoveries became the bedrock of a large part of modern chemical theory.

By the time of his death, Mendeleev had received a number of awards and distinctions from scientific communities around the world, and was internationally recognized for his contributions to chemistry.

Photo credit: CalamityJon / Foter / Creative Commons Attribution-NonCommercial-NoDerivs 2.0 Generic (CC BY-NC-ND 2.0)

Medieval European Herbals

After Theophrastus’ and other ancient Greeks’ significant advances in botany, scholars in China, India, and the Arab world continued to study and expand their knowledge of the science. However, in Western Europe, the study of botany went through a period of inactivity that lasted over 1,800 years. During this time, much of the Greeks’ knowledge and breakthrough discoveries were lost or forgotten.

Moveable Type to the Rescue!

In 15th and 16th Century Europe, the life of the average citizen revolved around and was highly dependent upon agriculture. However, when printing and moveable type were developed, most of the first published works were not strictly about agriculture. Most of them were “herbals,” lists of medicinal plants with descriptions of their beneficial properties, accompanied by woodcut illustrations. These early herbals reinforced the importance of botany in medicine.

Herbals gave rise to the science of plant classification, description, and botanical illustration. Botany and medicine became more and more intertwined as the Middle Ages gave way to the Renaissance period. In the 17th Century, however, books on the medicinal properties of plants eventually came to omit the “plant lore” aspects of early herbals. Simultaneously, other printed works on the subject began to leave out medicinal information, evolving into what are now known as floras—compilations of plant descriptions and illustrations. This transitional period signaled the eventual separation of botany and medicine.

Notable Herbals

The first herbals were generally compilations of information found in existing texts, and were often written by curators of university gardens. However, it wasn’t long before botanists began producing original works.

Herbarum Vivae Eicones, written in 1530 by Otto Brunfels, catalogued nearly 50 new plant species and included accurate, detailed illustrations.

A page from Brunfels’ Herbarum Vivae Eicones

Englishman William Turner published Libellus de Re Herbaria Novus in 1538. Turner’s tome included names, descriptions, and local habitats of native British plants.

In 1539, Hieronymus Bock wrote Kreutterbuch, describing plants the author found in the German countryside. A 1546 second edition of the book included illustrations.

The five-volume Historia Plantarum, written by Valerius Cordus and published between 1561 and 1563, some two decades after the author’s death, became the gold standard for herbals. The work included formal descriptions detailing numerous flowers and fruits, as well as plant anatomy and observations on pollination.

Rembert Dodoens’ Stirpium Historiae, written in 1583, included detailed descriptions and illustrations of many new plant species discovered in the author’s native Low Countries.

Photo credit: ouhos / Foter / Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0)