Technology, World-Changing Inventions

Water Treatment Technology Through History

Civilization has changed in uncountable ways over the course of human history, but one factor remains the same: the need for clean drinking water. Every significant ancient civilization was established near a water source, but the quality of the water from these sources was often suspect. Evidence shows that humankind has been working to clean up their water and water supplies since as early as 4000 BCE.

Cloudiness and particulate contamination were among the factors that drove humanity’s first water treatment efforts; unpleasant taste and foul odors were likely driving forces as well. Written records show ancient peoples treating their water by filtering it through charcoal, boiling it, straining it, and employing other basic methods. Egyptians as far back as 1500 BCE used alum to remove suspended particles from drinking water.

By the 1700s CE, filtration of drinking water was a common practice, though the efficacy of this filtration is unknown. More effective slow sand filtration came into regular use throughout Europe during the early 1800s.

As the 19th century progressed, scientists found a link between drinking water contamination and outbreaks of disease. Drs. John Snow and Louis Pasteur made significant scientific discoveries regarding the negative effects microbes in drinking water had on public health. Particulates in water were now seen not just as aesthetic problems, but as health risks as well.

Slow sand filtration continued to be the dominant form of water treatment into the early 1900s. In 1908, chlorine was first used as a disinfectant for drinking water in Jersey City, New Jersey. Elsewhere, other disinfectants like ozone were introduced.

The U.S. Public Health Service set federal regulations for drinking water quality starting in 1914, with expanded and revised standards following in 1925, 1946, and 1962. The Safe Drinking Water Act was passed in 1974, and was quickly adopted by all fifty states.

Water treatment technology continues to evolve and improve, even as new contaminants and health hazards in our water present themselves in increasing numbers. Modern water treatment is a multi-step process that combines multiple technologies. These include, but are not limited to, filtration systems, coagulants (which form larger, easier-to-remove particles called “floc” from smaller particulates), disinfectant chemicals, and industrial water softeners.
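
For the code-inclined among us, here’s a toy Python sketch of that multi-step idea. Every stage and number below is invented purely for illustration (a real treatment plant is a carefully engineered and monitored system, not a handful of functions), but it shows how each technology in the chain tackles a different part of the job:

    # A cartoon of a multi-step treatment train. Each stage is a small
    # function that improves one aspect of a made-up water sample.
    # All figures are illustrative, not real treatment performance data.

    def coagulate(water):
        # Coagulants clump fine particulates into larger "floc" that is
        # far easier to remove downstream.
        water["particulates"] *= 0.3
        return water

    def sand_filter(water):
        # Filtration physically removes the floc and other particles.
        water["particulates"] *= 0.1
        return water

    def disinfect(water):
        # Chlorine, ozone, or other disinfectants knock down microbes.
        water["microbes"] *= 0.001
        return water

    def soften(water):
        # Industrial softeners reduce mineral hardness.
        water["hardness"] *= 0.2
        return water

    sample = {"particulates": 100.0, "microbes": 100.0, "hardness": 100.0}
    for stage in (coagulate, sand_filter, disinfect, soften):
        sample = stage(sample)

    print(sample)  # every value is now far lower than in the raw intake

The structure is the point: raw water passes through a sequence of independent stages, each aimed at a different kind of contaminant.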

For further information, please read:

Planned future articles on Sandy Historical will expand on some of the concepts mentioned here. Please visit this page again soon for links to further reading.

Technology

Prototyping: A Fancy Word for “Practice”

For as long as human beings have been creating things, they’ve been creating prototypes of those things first. As such, prototyping has a loooong history, spanning from the rudimentary sculptures of ancient man to the 3D-printed creations made possible by modern technology. The road from concept to tangible item is a long one, and fraught with peril.

Prehistoric Prototyping

The earliest examples of prototyping date back to the first days of human civilization in the fourth millennium BCE. Sculpted clay figures, representing gods and mythological creatures, were used as basic “prototypes” for later, improved versions; roughly formed shapes made of clay served as prototypes for later construction. In most instances, these prototypes were not to scale, but merely intended to convey the general idea of the creator.

Roughly a thousand years later, pen and papyrus became a key medium for prototyping. Drawings both simple and complex, combined with then-newly developed written language, allowed for even greater detail in the prototypes of this time. Historical evidence suggests that the pyramids were extensively “prototyped” via diagrams prior to their construction.

Da Vinci’s Prototypes

During the Renaissance, Leonardo da Vinci was known to create incredibly complex prototype sketches as proof of concept for his various proposed inventions, many of which would have required technology that had not yet been conceived of, let alone developed. Da Vinci incorporated precise measurements and to-scale drawings into his prototypes.

One of da Vinci’s prototype sketches.

A da Vinci prototype drawing from 1502 depicted a 720-foot, single-span bridge designed for the reigning Ottoman Sultan and intended to span a river inlet near Istanbul. The sultan canceled the project for which the prototype was commissioned, believing the bridge’s construction would be impossible. Over five centuries later, the Turkish government restarted the project, building a bridge that matched da Vinci’s prototype schematic exactly.

Modern Prototyping

When he wasn’t busy stealing others’ inventions or having competitors murdered so he could take credit for their discoveries, Thomas Edison was known to use an iterative approach and multiple prototyping methods that are still in use today. Edison’s processes led, in part, to the modern practice of rapid prototyping in software design.

In the mid-20th century CE, American industrial engineer Henry Dreyfuss further improved the prototyping process. Dreyfuss’s new take on prototyping became the template on which most modern prototyping methods are based.

Today’s state-of-the-art technology makes prototyping a much more efficient process. Using advanced software and manufacturing equipment, modern prototype engineering has helped mankind reach the Moon, Mars, and beyond.

Photo credit: trevor.patt / Foter / CC BY-NC-SA

Historical Science & Technology, Pseudoscience

Futurology: The Science of Speculation

Futurology is the study of the future. “How can one study the future?” you may ask. “It hasn’t happened yet, so there’s really nothing to study!” Well, just follow me here, dear reader: to get specific, futurology is the postulation of possible, probable, and/or preferable futures, as well as the current trends that may lead to said futures. A futurologist seeks to understand what parts of the modern world are likely to continue into the future, what could change, and how these potential similarities and changes will affect future society.

There is an ongoing debate as to whether futurology is a science or an art, owing, in part, to the popularity of science fiction, which is often, in its own way, a sort of futurology. The term “strategic foresight” is often used to present the discipline as more genuinely scientific.

A Unique Discipline

Modern futurologists stress the possibility and importance of alternative and plural futures—i.e., they look at multiple possibilities arising from every new futurological prediction (“If A, then probably B, but also possibly C or D or E”). They also emphasize the limitations of prediction and probability, acknowledging that there is, in fact, no way to know for sure what will happen in the future and that they are, at best, making highly informed educated guesses.

There are a number of factors that distinguish futurology’s scientific focus. Primary among them is futurology’s study of not only possible futures, but also probable (what is most likely to happen), preferable (the best future for all), and “wild card” futures (something completely unexpected*).
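
For the programmers in the audience, here’s one way to picture that kind of plural-futures thinking. This is strictly a toy sketch in Python; the scenarios and likelihood figures below are invented for illustration, not actual futurological data, but the structure (several tagged futures held in mind at once, none treated as certain) is what matters:

    # A toy model of plural futures. Each scenario carries a category and a
    # rough, subjective likelihood; every entry here is purely illustrative.

    scenarios = [
        {"future": "Most drinking water is treated at the point of use", "category": "probable", "likelihood": 0.50},
        {"future": "Clean, affordable water for everyone", "category": "preferable", "likelihood": 0.30},
        {"future": "Most new buildings are assembled from modules", "category": "possible", "likelihood": 0.20},
        {"future": "Plastic is abandoned for an unknown new material", "category": "wild card", "likelihood": 0.02},
    ]

    # A futurologist weighs every branch, not just the single likeliest one.
    for s in sorted(scenarios, key=lambda s: s["likelihood"], reverse=True):
        print(f'{s["category"]:>10}: {s["future"]} (~{s["likelihood"]:.0%}, a guess)')

Swap in your own futures and your own guesses; the numbers matter far less than the habit of keeping every branch on the table.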

Futurology generally attempts to create a holistic or systemic view of the future based on information gathered by numerous other scientific disciplines. All available information is considered in postulating futurological hypotheses.

The assumptions behind dominant and contending views of the future are also challenged by futurologists, no matter how well-established the line of thinking may be. For example, it is an accepted scientific fact that our Sun will burn out billions of years in the future—a futurological approach would consider the possibility that this might happen in only 100 years, and what consequences that would bring.

Origins of Futurology

The first writings that could be considered to have a futurological view date back to the first century BCE. The first attempts to make systematic predictions about the future were published in the 18th century CE, the most notable of which was 1733’s Memoirs of the Twentieth Century by Samuel Madden. This book looks only at the politics and religion of the future world, with no speculation on technology or other aspects of life.

H.G. Wells, OG Futurologist

Writers including Jules Verne and H.G. Wells helped establish the science fiction genre in the 19th century, writing of imagined futures with advanced technology and radically altered societies. Many scholars consider Wells to be the unofficial founder of futurology—his best-selling Anticipations of the Reaction of Mechanical and Scientific Progress Upon Human Life and Thought, which speculated on what the world might look like by the year 2000, correctly predicts a number of innovations that are now a part of everyday life (the book also presents numerous incorrect predictions).

In 1902, following the success of Anticipations, Wells was asked to deliver a lecture at the Royal Institution. Titled The Discovery of the Future, this lecture encouraged the establishment of what came to be known as futurology, suggesting that scientific methodology would be a better conduit for genuine predictions about the future than simple speculation. While acknowledging that it is essentially impossible to provide entirely accurate predictions for the future, Wells stated that a scientific approach would allow for a “working knowledge of things in the future.”

* Although, if futurologists are studying it, it can’t truly be “completely unexpected,” can it? It’s like that (stupid) old saying, “Expect the unexpected”—completely impossible, because if you’re expecting it, it is no longer unexpected. By definition, there is no way to expect the unexpected.

Photo credit: LSE Library / Foter / No known copyright restrictions

Science

The Gentleman Scientists

Gentleman scientists are financially independent scientists who practice their craft of their own accord—that is, with no direct affiliation, financial support, or direction from public institutions, government entities, or universities. Gentleman scientists rose to prominence following the Renaissance, and though they exist to this day, government and private funding caused a significant decrease in their numbers beginning in the early- to mid-20th century.

Though they are also known as “independent scientists,” “gentleman scientists” has a much better ring to it, so we’ll stick with that. The term is in no way intended to imply that women (or “ladies,” which would be the logical accompaniment to “gentlemen”) cannot also be scientists of this ilk.

A Brief History

Self-funded scientists have been around at least as long as science has been studied by mankind. Obviously, some of the very first scientists were self-funded, as they were working in uncharted territory and essentially “inventing” science as they went along.

Truer to the definition of the term, however, are the scientists from the end of the Renaissance through the Victorian age who paid for their own research. Gentleman scientists were most prominent in England, which explains the fancy pants nomenclature* (many of the first fellows of the Royal Society of London were gentleman scientists), but they practiced throughout the world.

While some gentleman scientists were independently wealthy, and could use their personal fortunes to finance their research, many derived or supplemented their income with funds from other science-related sources (some very loosely so). Galileo sold scientific instruments of his own design, Johannes Kepler wrote and published horoscopes, and many others practiced and/or taught medicine.

Notable gentleman scientists throughout history include Robert Boyle, Benjamin Franklin, Charles Darwin, Alessandro Volta, and Thomas Jefferson, though he probably still puts “Third President of the United States” first on his resume.

You can’t get much more gentleman scientisty than this fellow. (Whoever he is.)

Pros & Cons of the Gentleman Scientist Game

Though outside funding would certainly have been obtainable by most gentleman scientists, they generally chose to go solo for the freedom it allowed them. With no patron to determine where their research should be focused, gentleman scientists were able to follow their own interests. The gentleman scientist was free to pursue any project he or she wanted, including those with a high potential for failure that anyone else footing the bill might have been averse to funding.

Going the gentleman scientist route most often meant that far less money was available for research and experimentation, but it also eliminated the inconveniences associated with working for a funds-supplying university: teaching obligations, administrative mumbo jumbo, grant request writing, etc.

Additionally, in many cases, any inventions developed under the patronage of a university or government body would become the intellectual property of said patron. A gentleman scientist inventor’s inventions were entirely his or her own.

Science often requires a good deal of complicated, and usually prohibitively expensive, equipment. These devices could be difficult for the gentleman scientist to obtain outside of a university or government research lab setting. This, of course, made research and experimentation difficult. Some gentleman scientists circumvented this obstacle by working with funded colleagues with access to the necessary equipment, or through equipment-only grants.

* Highlighted here by the use of the equally fancy pants word “nomenclature.” Vocabulary FTW!

Photo credit: Internet Archive Book Images / Foter / No known copyright restrictions

Technology

Cruisin’ for Extrusion

As I’m sure you’re aware, dear reader, plastic is used for roughly 154 million applications in the modern world. Among them: the weather stripping at the bottom of your front door that keeps drafts out; the film you cut off the frozen pizza you had for dinner last week; the insulation on the power cable for the computer upon which you’re reading this article. So, how is plastic transformed from its “raw” pelletized form into these and myriad other products?

There are many different ways to process plastics. The method used to manufacture the products listed above, those that require a continuous, one-piece profile, is known as extrusion.

Process Adaptation

The plastic extrusion process has not changed much since it was first developed. The process, invented by Joseph Bramah and patented in 1797, was originally used to produce lead pipes. The material was heated to a molten state and forced through a die, where it cooled into its final shape.

In 1820, Thomas Hancock built his first rubber masticator, a device designed to recycle scrap rubber for reuse. Sixteen years later, Edwin Chaffee developed a double-roller machine that could mix additives with raw or reclaimed rubber materials.

Thomas Hancock, master of the masticator.

Nearly a century later, in the mid-1930s, Bramah’s and Hancock’s processes were combined and adapted for use with plastic, which was just coming into widespread use at the time. German scientists Paul Troester and Ashley Gershoff created the first thermoplastic extrusion in 1935. Bramah’s method required little alteration for use with plastic materials, and this newly modified process is essentially the same as the one used today.

The use of plastics exploded with the rise of American consumer culture in the wake of World War II. It was not until the late 1950s that plastic co-extrusion was developed, allowing two different materials to be blended into a single product. Technological improvements now allow up to five different thermoplastic materials and/or additives to be co-extruded.

Extrusion Materials & Processes

Numerous thermoplastics can be used in extrusion, including PE (polyethylene), PVC (polyvinyl chloride), polypropylene, acrylic, nylon (polyamides), ABS (acrylonitrile butadiene styrene), and others. Each material exhibits certain characteristics that make it better suited for specific extrusion processes.

Blown film extrusion is used to produce shopping bags and other plastic film products from PE. This process utilizes specialized dies, pressurized air, rollers, and cooling systems to create the final product.

Tubing extrusion uses similar dies, in addition to special pins and vacuum chambers, to produce long, thin pipes and similar products from PVC. Tubes can be extruded on top of other extruded tubes for specialty applications.

To produce sheets or films that are too thick to be blown, sheet/film extrusion is used. The dies in this process are used to reorient and guide the flow of the melted polymer into a thin, uniform sheet. Chiller rolls both cool the extrusion and help give it the desired thickness and surface texture.

Overjacketing extrusion is used to apply an outer layer of plastic over an existing item, usually wire or cable, and almost exclusively one made of metal. Two different tooling methods are used: pressure tooling adheres the extruded plastic to the substrate product; jacketing tooling does not.

Photo credit: NPGpics / Foter / CC BY-NC-ND

Technology

Modular Buildings Are Modular

Modular design is a method that subdivides a system into smaller parts (or “modules”) that can be produced independently and used in different combinations to create different, larger systems. In very basic terms, modular design for building construction involves the manufacture of numerous small, and therefore more manageable, modules that are later brought together and assembled into a larger whole.
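
Software designers will recognize the same principle, so here’s a small, purely hypothetical Python sketch of it. The module names and floor areas are invented for the example; the takeaway is that independently produced parts can be recombined into very different larger systems:

    # A toy illustration of modular design: small, independently produced
    # parts ("modules") combined in different ways into larger systems.
    # Names and sizes are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Module:
        name: str
        floor_area: int  # square feet, illustrative only

    # Each module is built on its own, assembly-line style.
    OFFICE = Module("office unit", 300)
    KITCHEN = Module("kitchen unit", 200)
    BEDROOM = Module("bedroom unit", 250)

    def assemble(building_name, modules):
        """Combine independently built modules into one larger structure."""
        total = sum(m.floor_area for m in modules)
        parts = ", ".join(m.name for m in modules)
        return f"{building_name}: {parts} ({total} sq ft total)"

    # The same handful of modules can become very different buildings.
    print(assemble("Site office", [OFFICE, OFFICE, KITCHEN]))
    print(assemble("Small home", [BEDROOM, BEDROOM, KITCHEN, OFFICE]))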

A Brief History

The use of pre-fabrication and modular design for building construction dates back over a century, and started gaining popularity in the early 20th century. Sears Roebuck actually offered catalogs from which customers could order their very own, semi-custom modular homes, selling tens of thousands of them between 1910 and 1940. After World War II, the market for modular homes grew exponentially as returning soldiers sought an affordable way to buy homes for their families.

Modular buildings of all kinds remain popular to this day, as more sophisticated design and manufacturing techniques have made it faster, easier, and more economical to produce buildings of greater size and complexity. Most modern free-standing modular structures consist of three to six individual modules, though this can vary greatly from structure to structure.

Construction Process & Applications

In most instances, the individual modules are constructed assembly-line style. The modules themselves may take anywhere from a few days to a few months to build, and independent building inspectors supervise the construction to ensure that all building codes and other standards are met. Once complete, the modules are transported to the building site and assembled; depending on the size and complexity of the final design, assembly may take anywhere from a few hours to a week or more.

A modular building being assembled.

Modular design is not limited to the construction of homes and other free-standing buildings. Individual workstations can be built for office or industrial applications, and it’s even possible to construct a complete modular office in outdoor or indoor workspaces such as construction sites or warehouses. Other, more specialized modular structures, such as a portable cleanroom, can be erected quickly and easily as an application warrants.

Advantages

The two biggest advantages of modular design for building construction are lower prices and faster completion. Numerous factors contribute to these advantages, including:

  • the ability to build modules indoors regardless of weather
  • less material waste
  • the flexibility to rearrange or add/subtract modules to meet requirements
  • the ability to work on site prep projects and modular component construction simultaneously

Photo credit: werdsnave / Foter / CC BY-NC-ND