Pseudoscience, Technology

The South-Pointing Chariot

Though it is surely an apt moniker, the south-pointing chariot is not one-hundred percent true to its name. One might suspect that these carriages could only travel south, which makes little sense but would be spot-on for their unique sobriquet. They are, in fact, named for the movable pointer each one carried, which was designed to point south regardless of the direction of travel. Most likely used for navigation, these pointers were often shaped like a doll or other humanoid figure with an outstretched arm indicating south.

Unfortunately, no historical examples of the south-pointing chariot survive today. A good deal was written about the carriages at the time of their invention and use, however. The oldest reliable source to mention the south-pointing chariot dates to circa 250 CE and credits its construction to the Chinese engineer and government official Ma Jun. This places the vehicle’s creation some eight centuries before the first navigational use of the magnetic compass.

Fully Mechanical, Analog GPS

Multiple variations of these ancient, two-wheeled vehicles likely existed. Most types of south-pointing chariot used a special geared mechanism, connected to the rotating drive wheels, that kept the pointer aimed southward. No magnets were involved, so the mechanism could not detect the correct direction on its own; instead, the pointer was aimed south by hand at the start of a journey.

Please orient your screen so that the figure above points south.

As the chariot turned left or right, the gear mechanism rotated the pointer against the carriage’s turn to keep it aimed south. This dead-reckoning action was far from perfect: it was highly susceptible to cumulative errors, which required manual correction. The curvature of the earth itself and the unavoidable topographical changes along any given route are enough to nudge the pointer well away from true south over a relatively short distance.

It is widely believed that most south-pointing chariots utilized differential gears to maintain their directional pointer, which likely makes them the first devices in human history to use differential gears, centuries before Europeans began using similar mechanisms. Modern recreations of south-pointing chariots that use differential gears have been relatively successful in automatically maintaining their pointer direction.
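
To make the dead-reckoning idea concrete, here is a minimal, purely illustrative Python sketch of our own (it is not drawn from any historical description, and the track width, wheel flaw, and route are invented parameters). It models the differential’s job as subtracting the turn angle inferred from the two wheels’ travel, and shows how a small imperfection makes the pointer drift farther from south with every leg of the journey.

```python
import math

# Toy dead-reckoning model of a south-pointing chariot. Illustrative only:
# the numbers below are assumptions, not historical measurements.
TRACK_WIDTH = 1.5    # distance between the two drive wheels, in metres (assumed)
WHEEL_ERROR = 1.01   # the left wheel reports 1% more travel than it really covers (assumed flaw)

def simulate(turns_deg, segment_length=10.0):
    """Return the pointer's error from true south (in degrees) after a route.

    `turns_deg` is a list of heading changes in degrees; between turns the
    chariot rolls straight ahead for `segment_length` metres.
    """
    pointer_error = 0.0  # radians of drift accumulated so far

    for turn_deg in turns_deg:
        true_turn = math.radians(turn_deg)
        # Wheel travel for the straight leg plus a pivot turn about the axle midpoint.
        d_left = segment_length - true_turn * TRACK_WIDTH / 2
        d_right = segment_length + true_turn * TRACK_WIDTH / 2
        d_left *= WHEEL_ERROR  # the flawed wheel over-reports its travel

        # The gearing can only "know" the turn implied by the wheels (dead reckoning),
        # and counter-rotates the pointer by exactly that measured amount...
        measured_turn = (d_right - d_left) / TRACK_WIDTH
        # ...so any mismatch between the true and measured turn piles up as drift.
        pointer_error += true_turn - measured_turn

    return math.degrees(pointer_error)

# A short meandering route already leaves the pointer noticeably off of south.
print(simulate([90, -45, 30, -75, 90] * 4))
```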

Descriptions in Song Shu

The Song Shu (translated: “Book of Song”), written in 493 CE by the poet and historian Shen Yue of the Southern Qi dynasty, contains extensive descriptions of south-pointing chariots and their use throughout the Three Kingdoms. Shen Yue wrote:

The south-pointing carriage was first constructed by the Duke of Zhou as a means of conducting homewards certain envoys who had arrived from a great distance beyond the frontiers. The country to be traversed was a boundless plain, in which people lost their bearings as to east and west. [The Duke] caused this vehicle to be made in order that the ambassadors should be able to distinguish north and south […]

During the Qing-long reign period [233-237 CE], the emperor Ming Di commissioned the scholar Ma Jun to construct one, and he duly succeeded. [The vehicle and its design were lost] during the troubles attending the establishment of the Jin Dynasty […]

Later on, Shi Hu [Emperor of the Jie Later Zhao dynasty] had one made by Xie Fei [and] Linghu Shang made one for [Emperor of the Later Qin dynasty] Yao Xing. […] Its appearance and construction was like that of a drum-carriage. A wooden figure of a man was placed at the top, with its arm raised and pointing to the south. Although the carriage turned round and round, the pointer-arm still indicated the south. In State processions, the south-pointing carriage led the way, accompanied by the imperial guard.

These vehicles […] did not function particularly well. Though called ‘south-pointing carriages,’ they very often did not point true, and had to negotiate curves step by step, with the help of someone inside to adjust the machinery.

 Photo credit: Internet Archive Book Images / Foter.com / No known copyright restrictions

Pseudoscience

Greek Atomism, Isn’t It?

Atomism is a natural philosophy that theorized that everything in nature is composed of two fundamental principles: atoms and void. Moreover, atomism suggested that not only is everything composed of atoms and void, but that nothing except atoms and void actually exists: atoms ricochet off one another mechanistically to create other forms in an otherwise empty void. Perhaps unsurprisingly, the ancient Greeks were among the first to explore atomism.

Ah, for Democritus’ Sake!

Round about the 5th Century BCE, Greek study of the nature of reality was divided into two conflicting schools of thought. One school, supported by the philosopher Heraclitus, stated that the nature of all existence is change; the other, backed by rival philosopher Parmenides, believed that change is merely an illusion. To reconcile the two, a third philosopher (Greece was lousy with philosophers back in the day), Leucippus, and his mentee, Democritus, postulated that all matter was composed of small, indivisible particles called atoms.

Believing all of existence to be one big all-encompassing and unchanging mass, Parmenides flatly denied the existence of change, motion, and void. His argument against void’s existence is uniquely compelling: he argued that void could not exist, equating it with non-being—in layman’s terms, if the void exists, it is therefore something, not nothing, and therefore is not the void. Even 2,500 years later, it’s hard to argue with that logic.

Democritus disagreed with Parmenides’ philosophy on one key point: the idea that change is an illusion. He argued that change was real, and that if it wasn’t, there must at least be an explanation for the illusion of change. Democritus supported the concept of void, stating that the universe is (essentially) made up of many entities that populate the void, and that the void itself is infinite, providing space for atoms to pack together or scatter to create different forms.

“No, No” – Plato

The Michael Jordan of Ancient Greek philosophers, Plato, argued that atoms could never produce the beauty and order of the world around us just by randomly crashing together. In Plato’s Timaeus (ca. 400 BCE), Timaeus insists that the cosmos was created, but was designed by its creator to resemble an eternal, unchanging model.

Plato (L) & Aristotle taking the stage at Philosopalooza 350.

The four geometrical simple bodies (Fire, Air, Water, Earth) were included in that creation, though Plato did not consider these forms to be the most basic level of reality. Instead, he said, the bodies were built upon an unchanging, mathematical level of reality: they were geometric solids whose faces were constructed solely of triangles in varying configurations. Fire was represented by the tetrahedron, air by the octahedron, water by the icosahedron, and earth by the cube.

Each of the shapes, despite ranging from 4-sided to 20-sided, was solely composed of triangles. The bodies could all be broken down into triangles, and those triangles rearranged into different bodies (or atoms).
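
The arithmetic behind that “breaking down and rearranging” is easy to check. The snippet below is our own toy illustration (the element-to-solid pairings come from the paragraph above; the specific swap of one water particle for two air plus one fire is just an example of a face-preserving exchange, not a quotation from the Timaeus):

```python
# Face counts of Plato's four elemental solids, as listed above.
FACES = {"fire": 4, "air": 8, "water": 20, "earth": 6}  # tetra-, octa-, icosahedron, cube

def face_budget(particles):
    """Total number of faces tied up in a collection of particles.

    In this toy model, fire, air, and water share identical triangular faces,
    so their faces can be traded freely; earth's square faces are counted but
    cannot join the exchange.
    """
    return sum(FACES[element] * count for element, count in particles.items())

# One water particle (an icosahedron, 20 faces) supplies exactly the faces
# needed to build two air particles and one fire particle (2 * 8 + 4 = 20).
assert face_budget({"water": 1}) == face_budget({"air": 2, "fire": 1})
```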

The Aristotelian Rejection of Atoms

Round about 330 BCE, Aristotle, the LeBron James of Ancient Greek philosophers, philosophized that rather than being made of atoms, fire, air, water, and earth were instead continuous. Furthermore, Aristotle stated that the existence of a void violated physical principles. Instead of rearranging atoms into new substances, matter was transformed from its potential shape into its new, actual shape: clay, in the hands of a potter, assumes its potential and becomes an actual vase.

Aristotle’s theory came to be known as hylomorphism. And that, friends, is a tale for another time. Part two of our history of atomism will be available here at SandyHistorical.org on 2 April 2019.

Photo credit: Image Editor / Foter / CC BY

Historical Science & Technology, Pseudoscience

Chymistry in Medieval Islam

While European alchemy was an amalgamation of science, magic, and religion, the practice took a somewhat different form in the Islamic world. Though the Islamic version of alchemy did address the transmutation of metals (turning lead into gold being the most famous example), it also incorporated a good deal of legitimate science that was closer to practical chemistry. The two disciplines, alchemy and chemistry, were so closely aligned in this part of the world during this period that a single word was often used to describe them both: chymistry.

The word “alchemy” is derived from the Arabic word kīmiyā, which, in turn, is derived from kemi, the ancient Egyptian word for “black”.

Chymistry & The Philosopher’s Stone

Building on Aristotle’s elemental studies, Jābir ibn Hayyān (see below) suggested that all metals possess the four basic qualities of hotness, coldness, dryness, and moistness: two interior and two exterior. For example, lead was said to be internally hot and moist and externally cold and dry; gold was said to be internally cold and dry but externally hot and moist.

Hayyān proposed that metals were formed by the fusion of sulfur and mercury—not the elements we now know but ideal, hypothetical substances in chymistry—that imparted them with the qualities of hot/dry and cold/moist, respectively. It was thought that the resulting metal depended on the purity and proportion of sulfur and mercury in the composition. (A later chymist known as al-Rāzī expanded this theory, adding a third component: salt.)

Hayyān theorized that, by rearranging the qualities of a given metal, a different metal could be created. Chemical processes such as distillation, calcination, evaporation, crystallization, and sublimation were employed to this end. This line of research ultimately led to the pursuit of the Philosopher’s Stone of European alchemy.
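
Because the paragraph above describes lead and gold as exact inverses of one another, the “rearranging” idea is easy to illustrate. Here is a minimal sketch of our own (the quality assignments are taken from the text; the swap operation is just a way of picturing the goal, not a recipe any chymist recorded):

```python
# Toy model of the interior/exterior qualities ascribed to metals above.
METALS = {
    "lead": {"interior": ("hot", "moist"), "exterior": ("cold", "dry")},
    "gold": {"interior": ("cold", "dry"), "exterior": ("hot", "moist")},
}

def rearrange(metal):
    """Swap a metal's interior and exterior qualities, the chymist's hoped-for trick."""
    return {"interior": metal["exterior"], "exterior": metal["interior"]}

# On paper, swapping lead's inner and outer qualities produces gold's profile;
# distillation, calcination, and the rest were attempts to do this in practice.
assert rearrange(METALS["lead"]) == METALS["gold"]
```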

Notable Chymists

Khālīd ibn Yazīd is generally considered the first practitioner of chymistry in the medieval Muslim world, though his life and contributions to the science are apocryphal at best. Legend has it that he studied alchemy under the tutelage of Marianos of Alexandria. Yazīd is said to be the author of numerous notable alchemical texts, including The Book of Pearls, The Small Book of the Roll, The Big Book of the Roll, and The Paradise of Wisdom.

As a youth, Jābir ibn Hayyān studied science under Harbī al-Himyarī in Arabia. Later, in Baghdad, he became the official court alchemist for Hārūn al-Rashīd. A huge number of chymistry texts are attributed to Hayyān, though researchers suggest that many of these may have been the work of his students. Works that can be attributed to Hayyān with certainty include The Books of Balances, The Ten Books of Rectifications, The Seventy Books, and The One Hundred and Twelve Books, none of which are math tutorials.

Muḥammad Ibn Umayl al-Tamīmī, born in the 10th century CE, was the author of two of the definitive texts of Islamic chymistry, The Epistle of the Sun and the Crescent and its accompanying volume, The Book on Silvery Water and Starry Earth.

Photo credit: Michael Maier / Foter / Public Domain Mark 1.0

Historical Science & Technology, Pseudoscience

Futurology: The Science of Speculation

Futurology is the study of the future. “How can one study the future?” you may ask. “It hasn’t happened yet, so there’s really nothing to study!” Well, just follow me here, dear reader: to get specific, futurology is the postulation of possible, probable, and/or preferable futures, as well as the current trends that may lead to said futures. A futurologist seeks to understand what parts of the modern world are likely to continue into the future, what could change, and how these potential similarities and changes will affect future society.

There is an ongoing debate as to whether futurology is a science or an art, owing, in part, to the popularity of science fiction, which is often, in its own way, a sort of futurology. The term “strategic foresight” is often used to present the discipline as more genuinely scientific.

A Unique Discipline

Modern futurologists stress the possibility and importance of alternative and plural futures—i.e., they look at multiple possibilities arising from every new futurological prediction (“If A, then probably B, but also possibly C or D or E”). They also emphasize the limitations of prediction and probability, acknowledging that there is, in fact, no way to know for sure what will happen in the future and that they are, at best, making highly informed educated guesses.

There are a number of factors that distinguish futurology’s scientific focus. Primary among them is futurology’s study of not only possible futures, but also probable (what is most likely to happen), preferable (the best future for all), and “wild card” futures (something completely unexpected*).

Futurology generally attempts to create a holistic or systemic view of the future based on information gathered by numerous other scientific disciplines. All available information is considered in postulating futurological hypotheses.

The assumptions behind dominant and contending views of the future are also challenged by futurologists, no matter how well-established the line of thinking may be. For example, it is an accepted scientific fact that our Sun will burn out billions of years in the future; a futurological approach would consider the possibility that this may happen in only 100 years, and what consequences that would bring.

Origins of Futurology

The first writings that could be considered to have a futurological view date back to the first century BCE. The first attempts to make systematic predictions about the future were published in the 18th century CE, the most notable of which was 1733’s Memoirs of the Twentieth Century by Samuel Madden. This book looks only at the politics and religion of the future world, with no speculation on technology or other aspects of life.

H.G. Wells, OG Futurologist

Writers including Jules Verne and H.G. Wells helped establish the science fiction genre in the 19th century, writing of imagined futures with advanced technology and radically altered societies. Many scholars consider Wells to be the unofficial founder of futurology: his best-selling Anticipations of the Reaction of Mechanical and Scientific Progress Upon Human Life and Thought, which speculates about life in the year 2000, correctly predicts a number of innovations that are now a part of everyday life (the book also presents numerous incorrect predictions).

In 1902, following the success of Anticipations, Wells was asked to deliver a lecture at the Royal Institution. Titled The Discovery of the Future, this lecture encouraged the establishment of what came to be known as futurology, suggesting that scientific methodology would be a better conduit for genuine predictions about the future than simple speculation. While acknowledging that it is essentially impossible to provide entirely accurate predictions for the future, Wells stated that a scientific approach would allow for a “working knowledge of things in the future.”

* Although, if futurologists are studying it, it can’t truly be “completely unexpected,” can it? It’s like that (stupid) old saying, “Expect the unexpected”—completely impossible, because if you’re expecting it, it is no longer unexpected. By definition, there is no way to expect the unexpected.

Photo credit: LSE Library / Foter / No known copyright restrictions

Pseudoscience

Transmutation of Species: More or Less Than Meets the Eye?

Before Chuck Darwin developed his theory of natural selection and cracked the nut of evolution, many scientists and the people who trusted them to science followed the logic of transmutation of species (a.k.a., transformism). There was a good deal of opposition to this theory, and many prominent scientists of the 19th century could be found on either side of the debate.

Lamarck My Words…

French naturalist Jean-Baptiste Pierre Antoine de Monet, Chevalier de Lamarck, often helpfully shortened to simply Lamarck, first proposed the theory of the transmutation of species in his 1809 tome Philosophie Zoologique. Lamarck’s theory suggested that, rather than sharing a common ancestor, the simplest forms of life were created via spontaneous generation. He postulated that an innate life force drove species to become more complex over time.

While recognizing that many species were uniquely adapted to their respective environments, Lamarck suggested that the same life force also caused animals’ organs and plants’… whatever-the-plant-equivalent-of-organs-are to change based on how much or how little they’re used, thus creating more specialized species over successive generations.

The British Are Coming! The British Are Coming!

Concurrently, British surgeon Robert Knox and anatomist Robert “Research” Grant developed their own school of thought on comparative anatomy. Closely aligned with Lamarck’s French Transformationism approach, they further developed the idea of transmutation as well as evolutionism, and investigated homology to prove common descent.

Along with a student named Charles Darwin, Grant investigated the life cycle of marine animals. Darwin went on to study geology with professor Robert Jameson. In 1826, Jameson published an anonymous essay praising Lamarck’s “explanation” of higher animals evolving from “the simplest worms.”

Probably not what they meant.

In Eighteen-Hundred and Thirty-Seven, computing pioneer and almost-cabbage Charles Babbage published the Ninth Bridgewater Treatise. In it, he proposed that God had the foresight and the power to create laws (or, as Babbage put it, since he’s all computery, “programs”) that would produce new species at the appropriate times instead of dishing out a “miracle” each time a new species arose.

The Vestiges of the Natural History of Creation

Seven years later, Scottish publisher Robert Chambers published, anonymously, The Vestiges of the Natural History of Creation. This book proved to be both highly influential and extremely controversial, as it proposed an evolutionary hypothesis that explained the origins of life on Earth and the existence of our entire solar system. Chambers claimed that, by studying the fossil record, one could easily see a progressive ascent of animals. Current animals, he posited, branched off a main line that ultimately led to humans. Chambers’ theory suggested that species’ transmutations were part of a preordained plan woven into the very fabric of the universe.

Though slightly less stupid (in 21st-century retrospect) than Grant’s theories, Chambers’ implication that humans were merely the top rung of a predetermined evolutionary ladder, if you will, ruffled the feathers of both conservative thinkers and radical materialists. Numerous scientific inaccuracies were found in The Vestiges and roundly derided. Darwin lamented Chambers’ “poverty of intellect,” ultimately dismissing his book as no more than a “literary curiosity.” He would go on to publish his own since-proven-correct theory of evolution some fifteen years later.

Photo credit: dBnetco / Foter / CC BY-NC-ND