Technology, World-Changing Inventions

Water Treatment Technology Through History

Civilization has changed in uncountable ways over the course of human history, but one factor remains the same: the need for clean drinking water. Every significant ancient civilization was established near a water source, but the quality of the water from these sources was often suspect. Evidence shows that humans have been working to clean up their water supplies since as early as 4000 BCE.

Cloudiness and particulate contamination were among the factors that drove humanity’s first water treatment efforts; unpleasant taste and foul odors were likely driving forces, as well. Written records show ancient peoples treating their water by filtering it through charcoal, boiling it, straining it, and using other basic means. Egyptians as far back as 1500 BCE used alum to remove suspended particles from drinking water.

By the 1700s CE, filtration of drinking water was a common practice, though the efficacy of this filtration is unknown. More effective slow sand filtration came into regular use throughout Europe during the early 1800s.

As the 19th century progressed, scientists found a link between drinking water contamination and outbreaks of disease. Drs. John Snow and Louis Pasteur made significant scientific discoveries regarding the negative effects microbes in drinking water had on public health. Particulates in water were now seen to be not just aesthetic problems, but health risks as well.

Slow sand filtration continued to be the dominant form of water treatment into the early 1900s. In 1908, chlorine was first used as a disinfectant for drinking water in Jersey City, New Jersey. Elsewhere, other disinfectants like ozone were introduced.

The U.S. Public Health Service set federal regulations for drinking water quality starting in 1914, with expanded and revised standards being initiated in 1925, 1946, and 1962. The Safe Drinking Water Act was passed in 1974, and was quickly adopted by all fifty states.

Water treatment technology continues to evolve and improve, even as new contaminants and health hazards in our water present themselves in increasing numbers. Modern water treatment is a multi-step process that involves a combination of multiple technologies. These include, but are not limited to, filtration systems, coagulants (which form larger, easier-to-remove particles called “floc” from smaller particulates) and disinfectant chemicals, and industrial water softeners.

For further information, please read:

Planned future articles on Sandy Historical will expand on some of the concepts mentioned here. Please visit this page again soon for links to further reading.

The Science of Film, Music & Art

Da Vinci’s Viola Organista

Leonardo da Vinci was the archetypal Renaissance Man and perhaps the world’s greatest polymath. He notably excelled as a painter, sculptor, geologist, mathematician, engineer, botanist, writer, inventor, and musician. A combination of the latter two of these skills led da Vinci to create the Viola Organista, an experimental bowed keyboard instrument that was the first of its kind ever devised.

Multiple Iterations

As with any successful invention, the viola organista went through several revisions before the design was finalized. It should be noted, however, that in this case, “successful” and “finalized” are not necessarily accurate, as it is unknown if da Vinci ever actually built a working prototype of the instrument. Copious notes and sketches of the viola organista were preserved in his notebooks from 1488-1489, as well as in the Codex Atlanticus and Manuscript H.

The initial design of the viola organista used a mechanical bow, moving side to side, to create friction on the violin-like strings. Da Vinci’s second iteration utilized a rotating wheel to play the strings, much like a hurdy-gurdy. This version of the viola organista contained a large number of strings that were lowered onto the wheel by means of a keyboard. When a key was pressed, the corresponding string moved downward and was “bowed” by the constantly moving wheel. Individual notes as well as full chords could be played in this way.

One of da Vinci’s sketches of the viola organista.

The third and final design did away with the mechanical bow and spinning wheel while simultaneously incorporating variations of both. This version used multiple rotating wheels to pull looping bows (not unlike the fan belt in a car engine) that ran perpendicular to the strings. As before, a keyboard was used to push the strings down onto the wheels. One version of this design gave the viola organista one note per string; another used “fretted” strings, along with multiple keys per string, to produce different notes/pitches from the same string.

Viola Organista IRL

As mentioned above, it is unknown if the inventor ever created an actual, real-life viola organista. No examples matching his notes, descriptions, and sketches have ever been found. It’s basically the Sasquatch of musical instruments invented by Leonardo da Vinci.

The first viola organista-like instrument to actually, definitively exist was the Geigenwerk, created by Hans Heyden in 1575. Heyden’s design is technically an original one, however, and not directly based on da Vinci’s designs. The Geigenwerk notably varies from the viola organista in that it uses several friction wheels to vibrate the strings, instead of a looping bow/belt.

The first official, modern viola organista was built by Akio Obuchi in 1993. In 2004, one of Obuchi’s instruments was used in a live performance in Genoa, Italy. In 2013, Slawomir Zubrzycki performed at the Academy of Music in Krakow, Poland, using a viola organista of his own construction.

Photo credit: Foter / Public Domain Mark 1.0

Historical Science & Technology, World-Changing Inventions

The Julian Calendar

The Julian calendar was a reformed version of the earlier Roman calendar. Introduced by Julius Caesar, for whom the month of July (Julius) is named (the month of August honors his successor, Augustus Caesar), the new calendar took effect in 45 BCE and was used throughout most of the Western world until roughly 1582, when it was replaced by another reformed version, the Gregorian calendar, which we use today.

Origins

The Roman calendar on which the Julian calendar is based was similar in structure, but was less accurate and rather confusing. It consisted of “ordinary” and “intercalary” years, which lasted 12 and 13 months, respectively. The extra month, called Mercedonius, was inserted between February and March. An ordinary, 12-month year consisted of 355 days. A 13-month intercalary year consisted of 377 days, or, in some instances, 378 days. (See what I mean about it being confusing?)

Scholars of the time determined that, to make the Roman calendar match up to the solar year’s 365.25 days, intercalary years should be used every other year, or, ideally, in a ratio of 11 times every 24 years (good luck making a logical pattern out of that). To further befuddle things, there actually was no set system of ordinary/intercalary years; instead, intercalary years were determined by the Pontifices (not the Pope at that time, exactly, but kind of). Under this system, intercalary years were essentially declared whenever the Pontifices wanted, resulting in random, lengthy stretches with no extended years at all, or in intercalary years occurring twice in a row. Again, confusing as all get out.
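For the arithmetically inclined, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration) of why 11 intercalary years per 24 was the target. The 377.5 figure is simply the average of the 377- and 378-day intercalary years; the exact historical cycle is still debated by scholars.

    # Rough check of the Roman intercalation arithmetic (illustrative only).
    ORDINARY = 355        # days in an ordinary, 12-month Roman year
    INTERCALARY = 377.5   # average of the 377- and 378-day intercalary years

    # Alternating ordinary and intercalary years (12 of each per 24 years)
    # overshoots the 365.25-day solar year by a full day per year:
    print((12 * ORDINARY + 12 * INTERCALARY) / 24)   # 366.25

    # Using intercalary years only 11 times per 24 lands much closer:
    print((13 * ORDINARY + 11 * INTERCALARY) / 24)   # 365.3125

Even the “ideal” 11-in-24 ratio overshoots slightly, which is part of why the whole system kept slipping.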

Correcting the Drift

Due to the inconsistency of intercalary years, the Roman calendar eventually drifted far from the solar year. By the end of this calendar’s use, dates and seasons had gotten so out of whack that the era was known as the “years of confusion.” Caesar’s calendar was designed to correct this problem and keep the calendar aligned with the sun without the need for intercalary years.

For centuries, the Egyptians had used a “wandering” calendar with a fixed 365 days. In 238 BCE, the Decree of Canopus attempted (in part) to adjust the calendar to make up for the missing quarter of a day by adding an extra day every fourth year. Though the decree was ultimately repealed hundreds of years later, it was in effect during Caesar’s time in Egypt, and its every-fourth-year extra day became the Leap Year we know today.

In order to enact the new Julian calendar and align its start (1 January) with the start of the solar year, the year preceding its official adoption was lengthened. Thus, 46 BCE was 445 days long, an 80-day extension that was necessary to compensate for missed intercalary years and the unavoidable drift caused by the imperfectly calculated Roman calendar. The additional days were made into two one-time-only months known as Intercalaris Prior and Intercalaris Posterior, which were inserted in the calendar between November and December.

This final, artificially extended year was known as the “Last Year of Confusion.”

The Gregorian Calendar

The Julian calendar was not perfect, however, and Leap Years (though they weren’t called that at the time) were initially applied inconsistently. Over time, the Julian calendar, too, drifted away from the solar year. When it was eventually replaced with the Gregorian calendar in 1582 CE, the Julian calendar had drifted roughly 10 days out of alignment with the seasons.
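As a minimal sketch of where that roughly-10-day figure comes from (assuming a mean tropical year of about 365.2422 days, a modern value unavailable at the time):

    # Rough estimate of the Julian calendar's drift against the seasons (illustrative only).
    JULIAN_YEAR = 365.25      # average Julian year: one leap day every four years
    TROPICAL_YEAR = 365.2422  # approximate mean tropical (solar) year

    drift_per_year = JULIAN_YEAR - TROPICAL_YEAR  # about 0.0078 days per year

    # The Gregorian reform measured the error from the Council of Nicaea (325 CE),
    # when the spring equinox fell on 21 March:
    print(round(drift_per_year * (1582 - 325), 1))  # ~9.8 days, i.e. the ~10 days skipped in 1582

Left uncorrected, the Julian calendar gains roughly three days every four centuries, which is exactly the error the Gregorian leap-year rule was designed to trim away.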

The Gregorian calendar, a slight alteration promulgated by Pope Gregory XIII, was created to better align the Catholic Church’s “moveable feasts,” specifically Easter, with the annual solstices and equinoxes. The final country to utilize the Julian calendar was, somewhat ironically, Egypt, which transitioned to the Gregorian version in 1928.

Photo credit: adactio / Foter / CC BY

Technology

Prototyping: A Fancy Word for “Practice”

For as long as human beings have been creating things, they’ve been creating prototypes of those things first. As such, prototyping has a loooong history, spanning from the rudimentary sculptures of ancient man to the 3D-printed creations made possible by modern technology. The road from concept to tangible item is a long one, and fraught with peril.

Prehistoric Prototyping

The earliest examples of prototyping date back to the first days of human civilization in the fourth millennium BCE. Sculpted clay figures, representing gods and mythological creatures, were used as basic “prototypes” for later, improved versions; roughly-formed shapes made of clay served as prototypes for later construction. In most instances, these prototypes were not to scale, but merely intended to convey the general idea of the creator.

Roughly a thousand years later, pen and paper became a key medium for prototyping. Drawings both simple and complex, combined with then-newly developed written language, allowed for even greater detail in the prototypes of this time. Historical evidence suggests that the pyramids were extensively “prototyped” via pen and paper diagrams prior to their construction.

Da Vinci’s Prototypes

During the Renaissance, Leonardo da Vinci was known to create incredibly complex prototype sketches as proof of concept for his various proposed inventions, many of which would have required technology that had not yet been conceived of, let alone developed. Da Vinci incorporated precise measurements and to-scale drawings in his prototype drawings.

One of da Vinci’s prototype sketches.

A da Vinci prototype drawing from 1502 depicted a 720-foot, single-span bridge designed for the reigning Ottoman Sultan and intended to span a river inlet near Istanbul. The sultan canceled the project for which the prototype was commissioned, believing the bridge’s construction would be impossible. Over five centuries later, the Turkish government restarted the project, building a bridge that matched da Vinci’s prototype schematic exactly.

Modern Prototyping

When he wasn’t busy taking credit for other people’s inventions and discoveries, Thomas Edison was known to use an iterative approach and multiple prototyping methods that are still in use today. Edison’s processes led, in part, to the modern practice of rapid prototyping in software design.

In the mid-20th century CE, American industrial designer Henry Dreyfuss further improved the prototyping process. Dreyfuss’s new take on prototyping became the template on which most modern prototyping methods are based.

Today’s state-of-the-art technology makes prototyping a much more efficient process. Using advanced software and manufacturing equipment, modern prototype engineering has helped mankind reach the Moon, Mars, and beyond.

Photo credit: trevor.patt / Foter / CC BY-NC-SA

Historical Science & Technology, Pseudoscience

Futurology: The Science of Speculation

Futurology is the study of the future. “How can one study the future?” you may ask. “It hasn’t happened yet, so there’s really nothing to study!” Well, just follow me here, dear reader: to get specific, futurology is the postulation of possible, probable, and/or preferable futures, as well as the current trends that may lead to said futures. A futurologist seeks to understand what parts of the modern world are likely to continue into the future, what could change, and how these potential similarities and changes will affect future society.

There is an ongoing debate as to whether futurology is a science or an art, owing, in part, to the popularity of science fiction, which is often, in its own way, a sort of futurology. The term “strategic foresight” is often used to present the discipline as more genuinely scientific.

A Unique Discipline

Modern futurologists stress the possibility and importance of alternative and plural futures—i.e., they look at multiple possibilities arising from every new futurological prediction (“If A, then probably B, but also possibly C or D or E”). They also emphasize the limitations of prediction and probability, acknowledging that there is, in fact, no way to know for sure what will happen in the future and that they are, at best, making highly informed educated guesses.

There are a number of factors that distinguish futurology’s scientific focus. Primary among them is futurology’s study of not only possible futures, but also probable (what is most likely to happen), preferable (the best future for all), and “wild card” futures (something completely unexpected*).

Futurology generally attempts to create a holistic or systemic view of the future based on information gathered by numerous other scientific disciplines. All available information is considered in postulating futurological hypotheses.

The assumptions behind dominant and contending views of the future are also challenged by futurologists, no matter how well-established the line of thinking may be. For example, it is an accepted scientific fact that our Sun will burn out billions of years in the future; a futurological approach would consider the possibility that this may happen in only 100 years, and what consequences that would bring.

Origins of Futurology

The first writings that could be considered to have a futurological view date back to the first century BCE. The first attempts to make systematic predictions about the future were published in the 18th century CE, the most notable of which was 1733’s Memoirs of the Twentieth Century by Samuel Madden. This book looks only at the politics and religion of the future world, with no speculation on technology or other aspects of life.

H.G. Wells, OG Futurologist

Writers including Jules Verne and H.G. Wells helped establish the science fiction genre in the 19th century, writing of imagined futures with advanced technology and radically altered societies. Many scholars consider Wells to be the unofficial founder of futurology: his best-selling Anticipations of the Reaction of Mechanical and Scientific Progress Upon Human Life and Thought, which looks ahead to the year 2000, correctly predicts a number of innovations that are now a part of everyday life (the book also presents numerous incorrect predictions).

In 1902, following the success of Anticipations, Wells was asked to deliver a lecture at the Royal Institution. Titled The Discovery of the Future, this lecture encouraged the establishment of what came to be known as futurology, suggesting that scientific methodology would be a better conduit for genuine predictions about the future than simple speculation. While acknowledging that it is essentially impossible to provide entirely accurate predictions for the future, Wells stated that a scientific approach would allow for a “working knowledge of things in the future.”

* Although, if futurologists are studying it, it can’t truly be “completely unexpected,” can it? It’s like that (stupid) old saying, “Expect the unexpected”—completely impossible, because if you’re expecting it, it is no longer unexpected. By definition, there is no way to expect the unexpected.

Photo credit: LSE Library / Foter / No known copyright restrictions

Science

The Gentleman Scientists

Gentleman scientists are financially independent scientists who practice their craft of their own accord—that is, with no direct affiliation, financial support, or direction from public institutions, government entities, or universities. Gentleman scientists rose to prominence following the Renaissance, and though they exist to this day, government and private funding caused a significant decrease in their numbers beginning in the early- to mid-20th century.

Though they are also known as “independent scientists,” “gentleman scientists” has a much better ring to it, so we’ll stick with that. The term is in no way intended to imply that women (or “ladies,” which would be the logical accompaniment to “gentlemen”) cannot also be scientists of this ilk.

A Brief History

Self-funded scientists have been around at least as long as science has been studied by mankind. Obviously, some of the very first scientists were self-funded, as they were working in uncharted territory and essentially “inventing” science as they went along.

Truer to the definition of the term, however, are the scientists from the end of the Renaissance through the Victorian age who paid for their own research. Gentleman scientists were most prominent in England, which explains the fancy pants nomenclature* (many of the first fellows of the Royal Society of London were gentleman scientists), but they practiced throughout the world.

While some gentleman scientists were independently wealthy, and could use their personal fortunes to finance their research, many derived or supplemented their income with funds from other science-related sources (some very loosely so). Galileo sold scientific instruments of his own design, Johannes Kepler wrote and published horoscopes, and many others practiced and/or taught medicine.

Notable gentleman scientists throughout history include Robert Boyle, Benjamin Franklin, Charles Darwin, Alessandro Volta, and Thomas Jefferson, though he probably still puts “Third President of the United States” first on his resume.

You can’t get much more gentleman scientisty than this fellow. (Whoever he is.)

Pros & Cons of the Gentleman Scientist Game

Though outside funding would certainly have been obtainable by most gentleman scientists, they generally chose to go solo for the freedoms it allowed them. With no patron to determine where their research should be focused, gentleman scientists were able to follow their own interests. The gentleman scientist was free to pursue any project they wanted, including those with a high potential for failure, to which anyone else footing the bill might have been averse.

Going the gentleman scientist route most often meant that far less money was available for research and experimentation, but it also eliminated the inconveniences associated with working for a funds-supplying university: teaching obligations, administrative mumbo jumbo, grant request writing, etc.

Additionally, in many cases, any inventions developed under the patronage of a university or government body would become the intellectual property of said patron. A gentleman scientist inventor’s inventions were entirely his or her own.

Science often requires a good deal of complicated, and usually prohibitively expensive, equipment. These devices could be difficult for the gentleman scientist to obtain outside of a university or government research lab setting. This, of course, made research and experimentation difficult. Some gentleman scientists circumvented this obstacle by working with funded colleagues with access to the necessary equipment, or through equipment-only grants.

* Highlighted here by the use of the equally fancy pants word “nomenclature.” Vocabulary FTW!

Photo credit: Internet Archive Book Images / Foter / No known copyright restrictions