
Water Treatment Technology Through History


Civilization has changed in uncountable ways over the course of human history, but one factor remains the same: the need for clean drinking water. Every significant ancient civilization was established near a water source, but the quality of the water from these sources was often suspect. Evidence shows that humankind has been working to clean up their water and water supplies since as early as 4000 BCE.

Cloudiness and particulate contamination were among the factors that drove humanity’s first water treatment efforts; unpleasant taste and foul odors were likely driving forces, as well. Written records show ancient peoples treating their water by filtering it through charcoal, boiling it, and straining it, among other basic means. Egyptians as far back as 1500 BCE used alum to remove suspended particles from drinking water.

By the 1700s CE, filtration of drinking water was a common practice, though the efficacy of this filtration is unknown. More effective slow sand filtration came into regular use throughout Europe during the early 1800s.

As the 19th century progressed, scientists found a link between drinking water contamination and outbreaks of disease. Drs. John Snow and Louis Pasteur made significant discoveries regarding the negative effects that microbes in drinking water had on public health. Particulates in water were now seen not just as aesthetic problems, but as health risks as well.

Slow sand filtration continued to be the dominant form of water treatment into the early 1900s. In 1908, chlorine was first used as a disinfectant for drinking water in Jersey City, New Jersey. Elsewhere, other disinfectants like ozone were introduced.

The U.S. Public Health Service set federal regulations for drinking water quality starting in 1914, with expanded and revised standards being initiated in 1925, 1946, and 1962. The Safe Drinking Water Act was passed in 1974, and was quickly adopted by all fifty states.

Water treatment technology continues to evolve and improve, even as new contaminants and health hazards in our water present themselves in increasing numbers. Modern water treatment is a multi-step process that involves a combination of multiple technologies. These include, but are not limited to, filtration systems, coagulants (which form larger, easier-to-remove particles called “floc” from smaller particulates), disinfectant chemicals, and industrial water softeners.

For further information, please read:

Planned future articles on Sandy Historical will expand on some of the concepts mentioned here. Please visit this page again soon for links to further reading.


Historic Astronomical Clocks Across the Globe

An astronomical clock is not a timekeeping device in the traditional sense of the word “clock.” Instead of second, minute, and hour hands, an astronomical clock utilizes an array of special dials and mechanisms to display various astronomical information, such as the relative positions of the sun, moon, constellations, and planets. However, many historical examples do display the time of day along with the celestial information.

Throughout history, there have been a number of historically significant astronomical clocks. Though not a true clock in any sense, the very first astronomical clock was developed in the 2nd century BCE by the Ancient Greeks. It could calculate the positions of the sun, moon, and stars via a complex array of gears. Known as the Antikythera mechanism, the device was also, on a technical level, the world’s first analog computer.

Other notable astronomical clocks include:

The Cosmic Engine of Su Sung

Easily the best-named device on this list, the Cosmic Engine was designed and built by Chinese polymath Su Sung in 1092 CE. The original is lost to history, but surviving records of its design and structure are detailed enough that both the Science Museum in London and the National Museum of Natural Science in Taiwan have built working replicas. Painstakingly recreated from Su Sung’s own notes, each of these devices is over 30 feet high and, like the original, powered by a water wheel.

The Castle Clock of Al-Jazari

Considered history’s most sophisticated water-powered astronomical clock, this 11-foot-tall device offered numerous functions in addition to timekeeping. Built by Muslim polymath Ismail Al-Jazari in 1206 in what is now Turkey, the Castle Clock displayed the zodiac and the solar and lunar orbits, and, every hour on the hour, put on a cuckoo clock-like performance of mannequins emerging from hidden, automatic doors, accompanied by a five-piece automaton band.

The 3 Astronomical Clocks of the Strasbourg Cathedral

The world famous Strasbourg Cathedral, in Strasbourg, France, has been home to three different astronomical clocks over the past 700 or so years. The first, completed in 1354 CE, stopped working in the early 16th century. Its replacement, completed in 1574, gave up the ghost in 1789. The third, completed in 1624 and still in operation today, was built inside the casing of the second astronomical clock. It features numerous automata and the world’s first completely mechanized computus—a mathematical device used to calculate the calendar date of Easter each year.
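That computus can also be written out as plain arithmetic. The sketch below is the anonymous Gregorian computus in its standard modern form (the one usually credited to an 1876 Nature correspondent), not a description of the Strasbourg clock's actual gearing:

```python
def gregorian_easter(year: int) -> tuple[int, int]:
    """Return (month, day) of Easter Sunday in the Gregorian calendar,
    per the anonymous Gregorian computus."""
    a = year % 19                       # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)            # century and year-within-century
    d, e = divmod(b, 4)                 # century leap-year terms
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30  # days from March 21 to the paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7  # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451      # correction for rare edge cases
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(gregorian_easter(2024))  # → (3, 31): March 31, 2024
```

The whole calculation is integer arithmetic on cycles—exactly the sort of periodic behavior a train of gears can reproduce mechanically.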

Prague’s Nazi-Proof Astronomical Clock


Perhaps the most famous astronomical clock of all time is located in the Town Hall of Prague, Czech Republic. Originally constructed in 1410, the Prague clock features an elaborate hourly display in which four figures move about its base, with a skeletal representation of Death striking the hour. During World War II, invading Nazis nearly destroyed the clock, though not intentionally—it was more of a collateral damage situation. The people of Prague came together to save the clock, carefully disassembling it and hiding its pieces in various locations until the end of the war. It was reassembled and renovated in 1948.

Rasmus Sørnes’ Astronomical Clock

Quite possibly the most complex device of its kind ever built, this astronomical clock displays the locations of the sun, moon, and zodiac, the Julian calendar, the Gregorian calendar (with built-in leap year adjustments), sidereal time, Greenwich Mean Time, local time (including daylight saving time and accurate sunrise and sunset times year round), solar and lunar cycles, eclipses, tides, sunspot cycles, and a planetarium display including Pluto’s 248-year solar orbit and the 25,800-year precession of the Earth’s axis. All of this is built into a housing just 2.3 feet by 2 feet by 6.9 feet in size.
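Every one of those displays boils down to the same engineering problem: approximating a long celestial period with whole-number gear teeth. A sketch of that search using Python's standard continued-fraction helper; the synodic-month length is a standard astronomical constant, and the resulting tooth counts are illustrative, not Sørnes' actual gearing:

```python
from fractions import Fraction

# Mean synodic month (new moon to new moon) in mean solar days —
# a standard astronomical constant, not a value from Sørnes' notes.
SYNODIC_MONTH = 29.530589

def gear_ratio(period_days: float, max_teeth: int) -> Fraction:
    """Best rational approximation to a period, with the denominator
    kept small enough to cut as a physical gear's tooth count."""
    return Fraction(period_days).limit_denominator(max_teeth)

ratio = gear_ratio(SYNODIC_MONTH, 1000)
print(ratio)                          # candidate tooth ratio for a moon dial
print(float(ratio) - SYNODIC_MONTH)   # residual error, in days per lunation
```

With tooth counts under 1,000 the residual error is a few millionths of a day per lunation—which is how a cabinet-sized machine can track a 25,800-year precession without drifting noticeably in a human lifetime.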

Photo credit: Strasbourg Clock: Foter / CC BY-SA
Photo credit: Prague Clock: Thomas Hee / Foter / CC BY-NC-ND


Paper Beats Rock, Chemicals Beat Metal

Subtractive manufacturing is a fairly common process—basically, any time smaller pieces of material are removed from a larger piece to make something specific, you’ve got subtractive manufacturing. Carving, lathe work, even drilling a hole in something is technically subtractive machining. The flipside of those common practices is the lesser-known family of subtractive manufacturing methods, such as chemical milling, which uses corrosive liquid chemicals to dissolve some sections of a metal sheet while leaving the desired part shape behind.

A chemically machined copper part.

From the Renaissance to Modern Industry

Chemical milling—also known as chemical machining, industrial etching, and numerous variations on this general nomenclature—was first developed during the Renaissance as an alternative to engraving on metal. Similar processes, using organic chemicals like lactic and citric acids, were in use as far back as 400 BCE; however, it was not until the 15th century CE that strong mineral acids and manmade chemicals were used for subtractive manufacturing.

Using etchant concoctions of salt, charcoal, and vinegar, blacksmiths used chemical machining to etch intricate patterns onto the surfaces of metal armor. Linseed-oil paint was used to create a “negative”—a protective coating that the etchant would not eat through—that left the painted areas “raised” in relief. (More accurately, the non-painted areas were lowered as the etchant dissolved a layer of the surface.) Through careful control of the chemical milling process, armor could be decorated as if it were extensively engraved without the raised burrs and physical stress caused by actual engraving.
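Whether on Renaissance armor or a modern part, the process obeys some simple arithmetic: the etchant cuts downward at a roughly constant rate, but it also creeps sideways under the protective mask. A minimal sketch with purely illustrative numbers (the "etch factor"—depth divided by undercut—is a standard chemical-milling figure of merit, but these rates are not from any historical recipe):

```python
def etch_time_and_undercut(depth_mm: float,
                           etch_rate_mm_per_min: float,
                           etch_factor: float) -> tuple[float, float]:
    """Time in the etchant bath, and how far the chemical eats sideways
    under the mask, for a target etch depth."""
    time_min = depth_mm / etch_rate_mm_per_min   # constant removal rate assumed
    undercut_mm = depth_mm / etch_factor         # etch factor = depth / undercut
    return time_min, undercut_mm

# e.g. a 0.5 mm deep relief at 0.025 mm/min with an etch factor of 2
t, u = etch_time_and_undercut(0.5, 0.025, 2.0)
print(t)  # ≈ 20 minutes in the bath
print(u)  # ≈ 0.25 mm of sideways creep under the paint mask
```

The undercut term is why deeply etched lines always come out a bit wider than the painted negative—something those armor-etching blacksmiths would have learned by trial and error.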

In the 17th century, chemical machining was used to create graduation marks on measuring instruments. The process enabled the creation of extremely thin lines that made the instruments more precise and accurate than ever before. A short time later, chemical milling was used to etch trajectory information onto reference plates for cannon and artillery operators, as the durable metal plates held up far better in combat than paper.

In 1782, Swiss chemist Jean Senebier discovered that certain resins hardened when exposed to light, losing their solubility in turpentine. This led to the creation of a more specialized process, called photochemical machining, in which a liquid coating is applied to the entire workpiece and UV light is used to outline the negative of the intended part. This technique was originally used in the development of photography.

It was not until the early 20th century that chemical machining was first used to create parts in a true manufacturing capacity, when Swedish manufacturer Aktiebolaget Separator patented a method of producing edge filters. In the 1940s, industrial etching truly came into its own for the machining of very hard, thin metals and alloys.

Today, chemical etching and photochemical machining are heavily utilized by PCB, semiconductor, aerospace, and other “high tech” manufacturers, as well as in numerous “low tech” applications.

Photo credit: hslphotosync / / CC BY-SA


Get to Know the Gnomon

Surely you’re familiar with the sundial. (And don’t call me Shirley.) But did you know that the term “sundial” actually *technically* only applies to the face of the device (the flat part with the numbers)? Just as a watch is no good without its hands, so, too, is the sundial useless without its gnomon.

Anaximander & The Babylonian Gnomons (Say That 5 Times Fast)

The name “gnomon” comes from an old Greek word that translates to “one who knows or examines”—fitting, as the gnomon is the only part of a sundial that can actually tell you what time it is. (The gnomon knows!) Over time, the term came to have several different meanings, including “perpendicular,” “an L-shaped instrument used to draw right angles,” and “that which, when added to another number or shape, creates a new entity similar to that with which one started.” Fun stuff.

The gnomon was introduced into Greek culture by Anaximander, a 6th-century BCE philosopher who himself took the idea from the Babylonians. Originally, Babylonian gnomons were simple vertical pillars or rods installed on a flat, horizontal surface. These devices, if they can be considered “devices” at all, worked on the same principle as a sundial: as the sun moved across the sky, the shadow of the gnomon indicated the time of day. Shorter shadows appeared in summer, becoming shortest at the summer solstice; longer shadows appeared in winter, with the longest occurring on the winter solstice.


Though Anaximander did not invent the gnomon or the sundial, he was the first person to accurately determine the spring and autumnal equinoxes, using a gnomon and good old geometry.
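The geometry Anaximander relied on is easy to reproduce. A minimal sketch of the noon-shadow calculation (the latitude and declination values are illustrative; the formula assumes a vertical gnomon in the northern hemisphere, with the sun due south at local noon):

```python
import math

def noon_shadow_length(gnomon_height: float, latitude_deg: float,
                       declination_deg: float) -> float:
    """Length of a vertical gnomon's shadow at local solar noon.
    The sun's noon altitude is 90° - latitude + declination, and the
    solar declination swings from about +23.4° (summer solstice)
    through 0° (equinoxes) to -23.4° (winter solstice)."""
    altitude = math.radians(90.0 - latitude_deg + declination_deg)
    return gnomon_height / math.tan(altitude)

lat = 37.97  # roughly Athens
print(noon_shadow_length(1.0, lat, 23.44))   # summer solstice: shortest shadow
print(noon_shadow_length(1.0, lat, 0.0))     # equinox: equals tan(latitude)
print(noon_shadow_length(1.0, lat, -23.44))  # winter solstice: longest shadow
```

At the equinoxes the declination is zero, so a unit gnomon's noon shadow equals the tangent of the latitude exactly—watch for the days when the shadow hits that length and you have found the equinox, no telescope required.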

“What Time Is It?” Time to Buy A Gnomon!

Perhaps the most impressive thing about gnomons is that old-timey people like the ancient Greeks were able to figure out how they worked at all. Sure, the Babylonians invented the gnomon and devised the art of time measurement, but the Greeks (more or less) perfected it. Even if the calendar has changed since those days, seconds, minutes, and hours have all remained the same for thousands of years. (Though it should be noted that sundials don’t provide accuracy down to the minute, let alone the second.)

The key to getting a sundial to show the correct time is orientation. In the northern hemisphere, where Greece has been located since at least the 1960s, the shadow-casting edge of a gnomon must point north and lie parallel to Earth’s rotational axis; i.e., the gnomon’s edge must be inclined from the horizontal at an angle equal to the sundial’s latitude. The easiest way to do this—in ancient Greece and in modern America—is to point the sucker directly at Polaris.
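Once the gnomon is aligned this way, the hour lines on a horizontal dial follow from a single standard formula: tan(θ) = sin(latitude) × tan(hour angle), where the sun's hour angle advances 15° per hour from noon. A sketch (the 38° latitude is illustrative, roughly that of Athens):

```python
import math

def hour_line_angle(latitude_deg: float, hours_from_noon: float) -> float:
    """Angle in degrees, measured from the noon line, of an hour line
    on a horizontal sundial whose gnomon edge is tilted at the local
    latitude and aimed at the celestial pole."""
    hour_angle = math.radians(15.0 * hours_from_noon)  # sun moves 15°/hour
    theta = math.atan(math.sin(math.radians(latitude_deg)) * math.tan(hour_angle))
    return math.degrees(theta)

# Hour lines for a dial at 38° N, one through five hours after noon
for h in range(1, 6):
    print(h, round(hour_line_angle(38.0, h), 1))
```

Note that the lines are not evenly spaced—the sin(latitude) factor squeezes them together near noon and fans them out toward morning and evening, which is why a sundial's face looks nothing like a watch face.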

Most sundials are flat and parallel to the ground, with the gnomon extending vertically from its surface. In some instances, however, a sundial can be mounted vertically (usually on the wall of a building) with the gnomon sticking out sideways, parallel to the ground. These vertical sundials are considerably more difficult to properly align, as they are on friggin’ walls and getting everything set up that way is just all around harder.

Photo credit: Tim Green aka atoach / Foter / CC BY


Greek Fire: Savior of the Byzantine Empire

One of the world’s first truly effective incendiary weapons, Greek fire was developed circa 672 CE by the Byzantine Empire (a.k.a. the Eastern Roman Empire). It was a key component in numerous Byzantine naval victories, most importantly in two separate defenses of Constantinople against Arab sieges.

Greek fire is often referred to in ancient records as “sea fire,” “Roman fire,” “war fire,” “liquid fire,” “sticky fire,” or “manufactured fire.” Its exact formula has been lost to history, but its effects and influence on history are unquestioned.

Ancient Napalm

Incendiary weapons were not unheard of at the time of Greek fire’s invention—the ol’ flaming arrow was popular, of course, as were rudimentary clay-pot fire grenades. Historical records show Assyrian warriors using these and other similar weapons as far back as the 9th century BCE. Manually-operated, metal-tube flamethrowers were seen in battle at the siege of Delium (in Greece) in 424 BCE.

The exact date of Greek fire’s creation is unknown, as is the identity of the chemist or chemists who developed its unique recipe. Modern scientists have deduced that the ingredients likely included some combination of pine resin, naphtha, quicklime, calcium phosphide, sulfur, and niter.

In battle, the Byzantine navy used swiveling flamethrowers called siphōns to blast Greek fire onto the enemy. Mounted at the bow of the navy’s ships, these siphōns were constructed of brass or bronze and were often sculpted in the shape of lions or other ferocious beasts. Most accounts of its use also mention flame-fueled copper cauldrons where the material was heated prior to deployment (likely to make the resin ingredients more liquid and more sprayable), as well as hand-pumped pressure tanks to extend the siphōns’ reach.

Contemporaneous drawing of Greek fire in use against an enemy vessel.

Greek fire burned hot and fast, and, according to contemporaneous reports, could reduce a wooden boat to ash in just minutes. Modern experiments with pseudo-Greek fire have achieved flame temperatures in excess of 1,000°F. It continued to burn on water—in fact, as some tales have it, water only made Greek fire burn more intensely.

Some Like It Hot

The creation of Greek fire came at a very opportune time for the Byzantine Empire. Long wars against a variety of foes had weakened its military, and the Arabs were steadily cutting a swath across what we now call the Middle East, having conquered Syria, Palestine, and Egypt. Without Greek fire in their arsenal, it is likely the Byzantines would have fallen as well; instead, its effect in battle proved more than sufficient to repel invaders and besiegers.

The Eastern Roman Empire went on to use Greek fire in numerous other conflicts, including those against the Saracens (Muslims from the northwestern Arabian region), the Rus (western Russian peoples), and in several Byzantine civil wars. (Nothing like napalming your own guys, amirite?)

Greek fire was so powerful, and became so well known throughout the region, that its discovery was eventually attributed to divine intervention. Then-Emperor Constantine Porphyrogennetos wrote in his book De Administrando Imperio that the substance was “shown and revealed by an angel to the great and holy first Christian emperor Constantine.”

The recipe for Greek fire became a closely guarded state secret for the Byzantines. Few, if any, people at the time knew all of its ingredients and their quantities, and it was often concocted in “shifts”—several separate batches of ingredients were prepared, then those batches were combined to create the end result. Even the methods of construction and operation of the siphōns were compartmentalized, so that the secrets of Greek fire could not be stolen or divulged by a single individual.

Photo credit: Unknown / Foter / Public Domain Mark 1.0


Medieval Contributions to Modern Medicine

The fall of the Roman Empire; the Black Plague; the rise of feudalism. The Middle Ages are known for a great many world-changing developments. Another, not-as-widely-acknowledged area of significant change in Medieval Europe was medicine. Numerous discoveries from this period—from practical medicine to surgery—paved the way for “modern” medical practices.

Medicinal Monks Give Way to University Study

Once upon a time, medicine was learned through oral traditions and apprenticeships. The Middle Ages saw the rise of written medical texts and university instruction. As with most written work from this period, medical texts were most commonly copied out and preserved by monks, who could revise any medical writings they could find and share them with other monasteries.

Monks of the time were often also low-level doctors, tending to other monks, travelers, workers, and the population of the monastery’s hospice. Called “infirmarians,” these medicinal monks treated burns, dislocations, fractures, lacerations, and other minor and major ailments. Infirmarians were also responsible for training future infirmarians in the practice, not only demonstrating techniques but also sharing knowledge of medical remedies made from plants and herbs. In this way, monasteries were a sort of medieval pharmacy.

A number of now-prestigious universities were established in this period, all offering instruction in the practice of medicine. The universities of Paris, Bologna, Oxford, Montpellier, and Padua were all founded between 1150 and 1222 CE, providing the opportunity for learning to Europeans all across the continent. A full course of study to become a Doctor of Medicine took ten years (following the earning of the Medieval equivalent of a Bachelor’s Degree), so while the number of fully qualified physicians remained small, those who did complete study were highly trained.

A 13th century anatomical illustration

Better Living Through Chemistry

Plants and herbs were the main ingredient in most medieval remedies. Advances in medical chemistry, such as the introduction of inorganic materials and processes such as distillation, made these remedies more potent and effective.

One of the foremost medical chemists of the Middle Ages was the alchemist John of Rupescissa. In his efforts to create the philosopher’s stone, he developed a number of distillation techniques that proved useful for making medicine. By removing nonessential elements, plant- and herb-based remedies were made more potent, and new, previously unusable ingredients could be added for greater effect.

Notable Medical Texts

Bald’s Leechbook, published circa 900 CE, was a compendium of medical information drawn from a range of classical works. It also included numerous folk remedies of the day.

One of the first, if not the first, female physicians, Hildegard of Bingen, was born in 1098 CE. By 1112, she had entered the monastery/convent at Disibodenberg, Germany. There, she studied medicine under the infirmarians, and used the knowledge gained there to write the medical text Causae et Curae, which contains information on the diagnosis, treatment, and prognosis of numerous diseases and illnesses. Causae et Curae shows influence from many different areas, and its detailed descriptions of how to perform various medical tasks helped countless others learn the basics of medicine in the Middle Ages.

In 1180, Roger Frugardi, an Italian surgeon, wrote his treatise on surgery. Building off Frugardi’s work, Theodoric Borgognoni, the foremost surgeon of the period, wrote a four-volume treatise on surgery, the Cyrurgia, completed around 1267. The Cyrurgia detailed a number of key innovations, including the use of antiseptics to prevent infection and a method for creating anesthetics from opiates and herbs.

Photo credit: Foter / Public Domain Mark 1.0