Technology, World-Changing Inventions

Water Treatment Technology Through History

Civilization has changed in uncountable ways over the course of human history, but one factor remains the same: the need for clean drinking water. Every significant ancient civilization was established near a water source, but the quality of the water from these sources was often suspect. Evidence shows that humankind has been working to clean up its water and water supplies since as early as 4000 BCE.

Cloudiness and particulate contamination were among the factors that drove humanity’s first water treatment efforts; unpleasant taste and foul odors were likely driving forces, as well. Written records show ancient peoples treating their water by filtering it through charcoal, boiling it, straining it, and using other basic methods. Egyptians as far back as 1500 BCE used alum to remove suspended particles from drinking water.

By the 1700s CE, filtration of drinking water was a common practice, though the efficacy of this filtration is unknown. More effective slow sand filtration came into regular use throughout Europe during the early 1800s.

As the 19th century progressed, scientists found a link between drinking water contamination and outbreaks of disease. Drs. John Snow and Louis Pasteur made significant scientific discoveries regarding the negative effects microbes in drinking water had on public health. Particulates in water were now seen not just as aesthetic problems, but as health risks as well.

Slow sand filtration continued to be the dominant form of water treatment into the early 1900s. In 1908, chlorine was first used as a disinfectant for drinking water in Jersey City, New Jersey. Elsewhere, other disinfectants like ozone were introduced.

The U.S. Public Health Service set federal regulations for drinking water quality starting in 1914, with expanded and revised standards being initiated in 1925, 1946, and 1962. The Safe Drinking Water Act was passed in 1974, and was quickly adopted by all fifty states.

Water treatment technology continues to evolve and improve, even as new contaminants and health hazards in our water present themselves in increasing numbers. Modern water treatment is a multi-step process that involves a combination of multiple technologies. These include, but are not limited to, filtration systems, coagulant chemicals (which bind smaller particulates into larger, easier-to-remove particles called “floc”), disinfectant chemicals, and industrial water softeners.

For further information, please read:

Planned future articles on Sandy Historical will expand on some of the concepts mentioned here. Please visit this page again soon for links to further reading.

Historical Science & Technology

Astronomy in Ancient Egypt: A Brief Look

To say that the Ancient Egyptians were way ahead of the astronomy game would be an understatement. Stone circles at Nabta Playa, dating back to the 5th millennium BCE, were likely built to align with specific astronomical bodies. By the 3rd millennium BCE, they had devised a 365-day calendar that, save for a string of five “extra” days at the end of the year, was quite similar to the one we use today. And the pyramids (heard of those?) were built to align with the pole star.

Cataloging Yearly Events

In Ancient Egypt, astronomy was used to determine the dates of religious festivals and other yearly events. Numerous surviving temple books record, in great detail, the movements and phases of the sun, moon, and stars. The start of annual flooding of the Nile River was determined by careful notation of heliacal risings—the first visible appearance of stars at dawn. The rising of Sirius was of particular importance in predicting this event.

The temple of Amun-Ra at Karnak was built to align with the sunrise on the day of the winter solstice. A lengthy corridor in the temple is brightly illuminated by the sun on that day only—for the rest of the year, only indirect sunlight enters.

The remains of Karnak temple.

Logging the Hours

In the shared tomb of pharaohs Ramses VI and Ramses IX, tables carved into the ceiling tell the exact hour that the pole star will be directly overhead on any given night. The tomb includes a statue called the Astrologer; those using the star chart must sit in a precise position, facing the Astrologer, for the chart to be accurate.

Aligning the Pyramids

The Egyptian pyramids were aligned with the pole star. Owing to the precession of the equinoxes, the pole star at the time (ca. 3rd millennium BCE) was Thuban, a faint star found in the constellation Draco.

Later temples were built or rebuilt using the Ramses star chart to find true north. When used correctly, this star chart still provides surprisingly good accuracy.

Astronomical Innovations

The theory that the Earth rotates on its axis, and that the planets Mercury and Venus revolve around the sun, was initially devised by the Ancient Egyptians. In his writings, Roman philosopher Macrobius (395-423 CE) referred to this as the “Egyptian System.”

The Ancient Egyptians also used their own complex constellation system. Though there is some influence from others’ observations, the system was almost completely devised by the Egyptians themselves.

Photo credit: Malosky / Foter / Creative Commons Attribution-NonCommercial-ShareAlike 2.0 Generic (CC BY-NC-SA 2.0)

Important People, Important Discoveries, World-Changing Inventions

Big John Gutenberg & the Printing Press

Johannes Gensfleisch zur Laden zum Gutenberg (1398-1468) was a man of many talents. Known in his native Germany as a blacksmith, a goldsmith, a printer, an engraver, a publisher, and an inventor, he made his greatest contribution to the world with the mechanical, moveable-type printing press. Introduced to Renaissance Europe in the mid-1400s, his printing press ushered in the era of mass communication and permanently altered the structure of society.

Full-Court Press

Johannes Gutenberg began working on his printing press in approximately 1436, in partnership with Andreas Dritzehn and Andreas Heilmann, a gem cutter and a paper mill owner, respectively.

Gutenberg’s experience as a goldsmith served him well, as his knowledge of metals allowed him to create a new alloy of tin, lead, and antimony that proved crucial to producing durable, high-quality printed books. His special alloy proved far better suited to printing than all other materials available at the time.

He also developed a unique method of quickly and accurately molding new type blocks from a uniform template. Using this process, Gutenberg produced over 290 separate letter boxes (a letter box being a single letter or symbol), including myriad special characters, punctuation marks, and the like.

Gutenberg’s printing press itself consisted of a screw press that was mechanically modified to produce over 3,500 printed pages per day. The press could print with equal quality and speed on both paper and vellum. Printing was done with a special oil-based ink Gutenberg developed himself, which proved more durable than previous water-based inks. The vast majority of printed materials from Gutenberg’s press were in simple black, though a few examples of colored printing do exist.

One of the surviving Gutenberg Bibles, Huntington Library, San Marino, California

Changing the World

The moveable-type printing press is considered the most important invention of the second millennium CE, as well as the defining moment of the Renaissance period. It sparked the so-called “Printing Revolution,” enabling the mass production of printed books.

By the 16th Century, printing presses based on Gutenberg’s invention could be found in over 200 cities spanning 12 European countries. More than 200 million books had been printed by that time. The 48 surviving copies of the Gutenberg Bible, the first and most famous book Gutenberg himself ever printed, are considered the most valuable books in the world.

Literacy, learning, and education throughout Europe rose dramatically, fueled by the now relatively free flow of information. This information included revolutionary ideas that reached the peasants and middle class, and threatened the power monopoly of the ruling class. The Printing Revolution gave rise to what would become the news media, and was the key component in the gradual democratization of knowledge.

Photo credit: armadillo444 / Foter / Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0)

Technology, World-Changing Inventions

Your Keyboard is Made of Dinosaur Bones

Plastic is everywhere, plastic is everything. So many of the things we use every day are made of plastic, it’s kind of ridiculous. Especially when you consider that plastic is, essentially, made from the fossilized bones of the mighty dinosaurs (like all petroleum products). Once we’ve extruded all that fantastic plastic from a stegosaurus patella, though, how the heck do we turn it into something useful?

Why, Via Injection Molding, Of Course!

Simply put, injection molding is the process of manufacturing objects by injecting molten plastic into a mold that is shaped like the desired end product. Glass and some metals can be used in injection molding, as well, but plastics are by far the most commonly used material in this process. (There are, of course, many other methods for turning “raw” plastic into a useful product, but we’re not writing about those ones here, so let’s just pretend they don’t exist.)

The first manmade plastic was invented in Eighteen-Hundred and Sixty-One by British metallurgist and scientist Alexander Parkes. Dubbed “Parkesine,” this material could be heated and molded into shape, and would retain that shape when it cooled. Parkesine was far from perfect, however, as it was highly flammable, prone to cracking, and expensive to produce.

Improving on Parkes’ Parkesine, American inventor John Wesley Hyatt created a new plastic material, celluloid, in 1868. Just a few years later, Hyatt and his brother from the same mother, Isaiah, patented the first injection molding machine. Though relatively simple compared to the advanced technology used today (that was over 140 years ago, so…), it operated on the same general principle: the heated plastic (celluloid) was forced through a tube and into a mold. The first products manufactured with the Hyatt Bros’ machine were fairly simple ones: buttons, combs, collar stays, and the like.

WWII Drives Innovation

Injection molding technology remained more or less unchanged for decades; so, too, did the products manufactured with it. It was not until the outbreak of World War II, when every last scrap of metal materials was needed for war efforts, that plastic injection molding came into its own.

As was the case with WWII in general, it was up to the Americans to do the heavy lifting. Inventor James Watson Hendry built the first screw injection machine in 1946. This new device afforded much greater control over the speed with which plastic was injected into the mold, and subsequently greatly improved the quality of end products. Hendry’s screw machine also allowed materials to be mixed prior to injection. For the first time, recycled plastic could be added to “raw” material and thoroughly mixed together before being injected; this, too, helped to improve product quality. And, hey, recycling, right?!

JSYK, this is an atypical injection molding system.

Further Advances

Today, most injection molding is performed using Hendry’s screw injection technology. Hendry later went on to develop the first gas-assisted injection molding process—this enabled the production of complex and/or hollow shapes, allowing for greater design flexibility and reducing costs, product weight, material waste, and production times.

Advances in machining technology have made it possible to create highly complex molds from a range of materials—carbon steel, stainless steel, aluminum, and many others. Most of these molds can be used for years on end with little to no maintenance required, and can produce several million parts during their lifetime. This further reduces the cost of injection molding.
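
To put rough numbers on that last point, here’s a back-of-the-envelope sketch of mold-cost amortization in Python. Every figure is made up for illustration (real mold, resin, and machine costs vary wildly), but the shape of the math holds:

# All figures below are hypothetical, for illustration only.
mold_cost = 50_000.00         # one-time cost to design and machine a steel mold (assumed)
material_per_part = 0.04      # resin cost per molded part (assumed)
machine_time_per_part = 0.02  # press time, energy, and labor per part (assumed)

def per_part_cost(parts_produced: int) -> float:
    """Cost per part once the one-time mold cost is spread over a production run."""
    return mold_cost / parts_produced + material_per_part + machine_time_per_part

for run in (10_000, 100_000, 1_000_000, 5_000_000):
    print(f"{run:>9,} parts -> ${per_part_cost(run):.3f} each")

At small volumes the mold dominates the price of every single part; by the multi-million-part runs mentioned above, it all but disappears.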

Numerous variations of injection molding have been developed, as well. Insert molding uses special equipment to mold plastic around other, preexisting components—these components can be machined from metal or molded from different plastics with higher melting points. Thin wall injection molding saves on material costs by mass-producing plastic parts that are very thin and light, such as clamshell produce containers (like the ones blueberries come in, for example).

Plastic injection molding has evolved over the decades. From its simple beginnings, churning out combs and other relatively basic products, the process can now be used to produce high-precision, tight-tolerance parts for airplanes, medical devices, construction components… and computer keyboards.

Photo credit: jarnott / Foter / CC BY-NC

Historical Science & Technology, Science

A Brief History of Cataract Surgery

Cataract surgery is a surgery in which a cataract is removed. Nailed it! More specifically, cataract surgery is the removal of the human eye’s natural lens, necessitated by the lens having developed an opacification (a.k.a. a cataract) that causes impairment or loss of vision. While we tend to think of “advanced” medical procedures such as this as relatively modern developments, cataract surgery has been performed for thousands of years.

Couching in Ancient India

Sushruta, a physician in ancient India (ca. 800 BCE), is the first doctor known to have performed cataract surgery. In this procedure, known as “couching,” the cataract, or kapha, was not actually removed.

First, the patient would be sedated, but not rendered unconscious. He/she would be held firmly and advised to stare at his/her nose. Then, a barley-tipped curved needle was used to push the kapha out of the eye’s field of vision. Breast milk was used to irrigate the eye during the procedure. Doctors were instructed to use their left hand to perform the procedure on affected right eyes, and the right hand to treat left eyes.

Even drawings of cataract surgery look super unpleasant.

When possible, the cataract matter was maneuvered into the sinus cavity, and the patient could expel it through his/her nose. Following the procedure, the eye would be soaked with warm, clarified butter and bandaged, using additional delicious butter as a salve. Patients were advised to avoid coughing, sneezing, spitting, belching, or shaking during and after the operation.

Couching was later introduced to China from India during the Sui dynasty (581-618 CE). It was first used in Europe circa 29 CE, as recorded by the Roman medical writer Aulus Cornelius Celsus. Couching continued to be used in India and Asia throughout the Middle Ages. It is still used today in parts of Africa.

Suction Procedures

In the 2nd Century CE, Greek physician Antyllus developed a surgical method of removing cataracts that involved the use of suction. After creating an incision in the eye, a hollow bronze needle and lung power were used to extract the cataract. This method was an improvement over couching, as it always eliminated the cataract and, therefore, the possibility of it migrating back into the patient’s field of vision.

In his Book of Choices in the Treatment of Eye Diseases, the 10th Century CE Iraqi ophthalmologist Ammar ibn Ali Al-Mosuli presented numerous case histories of successful use of this procedure. In 14th Century Egypt, oculist Al-Shadhili developed a variant of the bronze needle that used a screw mechanism to draw suction.

“Modern” Cataract Surgery

The first modern European physician to perform cataract surgery was Jacques Daviel, in 1748.

Implantable intraocular lenses were introduced by English ophthalmologist Harold Ridley in the 1940s, an innovation that made patient recovery more efficient and comfortable.

Charles Kelman, an American surgeon, developed the technique of phacoemulsification in 1967. This process uses ultrasonic waves to facilitate the removal of cataracts without a large incision in the eye. Phacoemulsification significantly reduced patient recovery times and all but eliminated the pain and discomfort formerly associated with the procedure.

Photo credit: Internet Archive Book Images / Foter / Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0)

Historical Science & Technology, World-Changing Inventions

Cuneiform for Me & Youneiform

Cuneiform, sometimes called cuneiform script, is one of the world’s oldest forms of writing. It takes the form of wedge-shaped marks, and was written or carved into clay tablets using sharpened reeds. Cuneiform is more than 5,000 years old, and was used for over three millennia.

The Endless Sumerians

The earliest version of cuneiform was a system of pictographs developed by the Sumerians in the 4th millennium BCE. The script evolved from pictographic proto-writing used throughout Mesopotamia and dating back to the 34th Century BCE.

In the middle of the 3rd millennium BCE, cuneiform’s writing direction was changed from top-to-bottom in vertical columns  to left-to-right in horizontal rows (how modern English is read). For permanent records, the clay tablets would be fired in kilns; otherwise, the clay could be smoothed out and the tablet reused.

A surviving cuneiform tablet.

Over the course of a tight thousand years or so, the pictographs became simplified and more abstract. Throughout the Bronze Age, the number of characters used decreased from roughly 1,500 to approximately 600. By this time, the pictographs had evolved into a combination of what are now known as logophonetic, consonantal alphabetic, and syllabic symbols. Many pictographs slowly lost their original function and meaning, and a given sign could provide myriad meanings, depending on the context.

Sumerian cuneiform was adapted into the written form of numerous languages, including Akkadian, Hittite, Hurrian, and Elamite. Other written languages were derived from cuneiform, including Old Persian and Ugaritic.

CuneiTRANSform

Cuneiform was gradually replaced by the Phoenician alphabet throughout the Neo-Assyrian and Roman imperial eras. By the end of the 2nd Century CE, cuneiform was essentially extinct. All knowledge of how to decipher the script was lost until the 1850s—it was a completely unknown writing system upon its rediscovery.

Modern archaeology has uncovered between 500,000 and 2 million cuneiform tablets. Because there are so few people in the world who are qualified to read and translate the script, only 30,000 to 100,000 of these have ever been successfully translated.

Photo credit: listentoreason / Foter / Creative Commons Attribution-NonCommercial-ShareAlike 2.0 Generic (CC BY-NC-SA 2.0)

Important People, Important Discoveries, Science

The Periodic Table of Dmitri Mendeleev

If you’re here, reading a blog about science and technology, I’m going to assume you already know what the periodic table of elements is, and therefore dispense with the introductory information. However, though you may know the table well, do you know where it came from? Read on, friend, read on…

From Russia with Science

Dmitri Ivanovich Mendeleev (1834-1907) was a Russian inventor and chemist. In the 1850s, he postulated that there was a logical order to the elements. As of 1863, there were 56 known elements, with new ones being discovered at a rate of roughly one per year. By that time, Mendeleev had already been working to collect and organize data on the elements for seven years.

Mendeleev discovered that when the known chemical elements were arranged in order of atomic weight, from lowest to highest, a recurring pattern developed. This pattern showed the similarity in properties between groups of elements. Building off this discovery, Mendeleev created his own version of the periodic table that included the 66 elements that were then known. He published the first iteration of his periodic table in Principles of Chemistry, a two-volume textbook that would be the definitive work on the subject for decades, in 1869.

Mendeleev’s periodic table is essentially the same one we use today, grouping elements with similar properties together; the one significant difference is that the modern table arranges the elements in ascending order by atomic number rather than atomic weight.
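
To get a feel for what Mendeleev actually noticed, here’s a toy recreation of the exercise in Python. The atomic weights are rounded modern values and the family labels are modern shorthand (Mendeleev worked from measured weights and observed chemical behavior, famously on note cards), so treat this as an illustration rather than a faithful reconstruction:

# Sixteen of the lighter elements: (symbol, rounded atomic weight, modern family label).
elements = [
    ("Li", 6.9, "alkali metal"),    ("Be", 9.0, "alkaline earth"),
    ("B", 10.8, "metalloid"),       ("C", 12.0, "nonmetal"),
    ("N", 14.0, "nonmetal"),        ("O", 16.0, "nonmetal"),
    ("F", 19.0, "halogen"),         ("Na", 23.0, "alkali metal"),
    ("Mg", 24.3, "alkaline earth"), ("Al", 27.0, "metal"),
    ("Si", 28.1, "metalloid"),      ("P", 31.0, "nonmetal"),
    ("S", 32.1, "nonmetal"),        ("Cl", 35.5, "halogen"),
    ("K", 39.1, "alkali metal"),    ("Ca", 40.1, "alkaline earth"),
]

# Arrange by ascending atomic weight, as Mendeleev arranged his element cards.
elements.sort(key=lambda e: e[1])

for symbol, weight, family in elements:
    note = "  <-- recurs at regular intervals" if family in ("alkali metal", "halogen") else ""
    print(f"{symbol:>2}  {weight:5.1f}  {family}{note}")

Sorted by weight, the alkali metals and halogens land at regular intervals, which is exactly the kind of repetition that let Mendeleev leave gaps for elements nobody had found yet.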

Dmitri Mendeleev

Changing & Predicting the Elements

The classification method Mendeleev formulated came to be known as “Mendeleev’s Law.” So sure of its validity and effectiveness was he that he used it to propose changes to the previously accepted values for the atomic weights of a number of elements. These changes were later found to be accurate.

In the updated, 1871 version of his periodic table, he predicted the placement on the table of eight then-unknown elements and described a number of their properties. His predictions proved to be highly accurate, as several elements that were later discovered almost perfectly matched his proposed elements. Though they were renamed (his “ekaboron” became scandium, for example), they fit into Mendeleev’s table in the exact locations he had suggested.

From Skepticism to Wide Acclaim

Despite its accuracy and the scientific logic behind it, Mendeleev’s periodic table of elements was not immediately embraced by chemists. It was not until the discovery of several of his predicted elements—most notably gallium (in 1875), scandium (1879), and germanium (1886)—that it gained wide acceptance.

The genius and accuracy of his predictions brought Mendeleev fame within the scientific community. His periodic table was soon accepted as the standard, surpassing those developed by other chemists of the day. Mendeleev’s discoveries became the bedrock of a large part of modern chemical theory.

By the time of his death, Mendeleev had received a number of awards and distinctions from scientific communities around the world, and was internationally recognized for his contributions to chemistry.

Photo credit: CalamityJon / Foter / Creative Commons Attribution-NonCommercial-NoDerivs 2.0 Generic (CC BY-NC-ND 2.0)

Technology, World-Changing Inventions

Toastory: The Tale of the Unsung Hero of the Kitchen

As long as there has been bread and fire, there has been toast. Since time immemorial, sliced bread was toasted over open flames with the help of special metal frames or long-handled forks. In the early 19th Century, simpler, easier-to-use handheld utensils for toasting toast were invented. In 1893, everything changed.

The (Mac) Toast Master of Scotland

That fateful year, Alan MacMasters—the man, the myth, the legend—invented the first electric bread toaster in Edinburgh, Scotland. The key to MacMasters’ macmasterpiece was the development of a heating element that could repeatedly be heated to red-hotness without breaking. The then-fairly recent invention of the incandescent light bulb had proved that such an element was possible—however, adapting the technology to the toasting conundrum would prove difficult.

Light bulbs’ metal elements benefited from being sealed within a vacuum, a method MacMasters could not employ, for obvious reasons. Additionally, the kinks were still being worked out of wiring for electrical appliances—the oft-used iron wiring frequently melted at high temps, causing severe fire hazards. Moreover, electricity was not readily available in the 1890s, leading to further difficulties.

Albert Marsh, a young engineer, created an alloy of nickel and chromium, called nichrome, that could withstand the rapid, repeated heating and cooling a toasting element required. Nichrome was swiftly implemented into MacMasters’ toaster design. Soon after, the device was commercialized by Crompton & Co., under the name Eclipse.

Marsh, along with George Schneider of the American Electrical Heater Company, received the first US patent for an electric toaster. In 1909, General Electric introduced the Model D-12, America’s first commercially successful electric toaster.

Toastervations: Technological Innovations

Truly, the toaster game was a cutthroat one in its early days. Inventors continually piggybacked off each other’s successes, one-upping previous advancements and fighting tooth and nail for even the smallest piece of the golden goose that was the toasted bread market.

"Two toasts, coming right up!"

The earliest electric toasters toasted toast on one side at a time—the bread had to be flipped manually to continue the toasting process on the other side. American inventor Lloyd Groff Copeman and his wife, Hazel Berger Copeman, received a number of patents for toaster technology in 1913. Later that year, the Copeman Electric Stove Company introduced the “toaster that turns toast,” a device with a mechanical bread turner that flipped toast automatically.

Building off of Copeman’s advances, Charles Strite invented and patented the first automatic pop-up toaster, which included a mechanism that ejected the toast after toasting, in 1919.

Working from Strite’s design, the Waters-Genter Company developed the Model 1-A-1 in 1925. The 1-A-1 was the first commercially-sold toaster that could toast both sides simultaneously, deactivate the heating element after a set time, and eject the toasted toast after toasting.

Starting in the 1940s, Sunbeam Products began offering a high-end toaster that would perform the entire toasting process of its own accord; all the user had to do was drop in a slice of pre-toast. The Sunbeam T-20 would lower the bread into the toasting chamber without the use of an external control mechanism. The details of how the T-20 and subsequent models worked are too dry and technical to recount here—for simplicity’s sake, we’ll chalk it up to magick.

In recent years, there have been many toaster innovations that are less darkly mysterious, but no less useful. The ability to use only one toast slot as needed saved Americans over $30 billion in electricity costs in the 1980s alone. Wider slots make the toasting of bagels possible. Modifications that “print” logos or images onto toasted toast mark the zenith of toaster technology.

What a time to be alive!

Photo credit: litlnemo / Foter / Creative Commons Attribution-NonCommercial-ShareAlike 2.0 Generic (CC BY-NC-SA 2.0)

Historical Science & Technology, Science

Medieval European Herbals

After Theophrastus’ and other ancient Greeks’ significant advances in botany, China, India, and the Arabian nations continued to study and expand their knowledge of the science. However, in Western Europe, the study of botany went through a period of inactivity that lasted over 1,800 years. During this time, much of the Greeks’ knowledge and breakthrough discoveries were lost or forgotten.

Moveable Type to the Rescue!

In 15th and 16th Century Europe, the life of the average citizen revolved around and was highly dependent upon agriculture. However, when printing and moveable type were developed, most of the first published works were not strictly about agriculture. Most of them were “herbals,” lists of medicinal plants with descriptions of their beneficial properties, accompanied by woodcut illustrations. These early herbals reinforced the importance of botany in medicine.

Herbals gave rise to the science of plant classification, description, and botanical illustration. Botany and medicine became more and more intertwined as the Middle Ages gave way to the Renaissance period. In the 17th Century, however, books on the medicinal properties of plants eventually came to omit the “plant lore” aspects of early herbals. Simultaneously, other printed works on the subject began to leave out medicinal information, evolving into what are now known as floras—compilations of plant descriptions and illustrations. This transitional period signaled the eventual separation of botany and medicine.

Notable Herbals

The first herbals were generally compilations of information found in existing texts, and were often written by curators of university gardens. However, it wasn’t long before botanists began producing original works.

Herbarum Vivae Eicones, written in 1530 by Otto Brunfels, catalogued nearly 50 new plant species and included accurate, detailed illustrations.

A page from Brunfels’ Herbarum Vivae Eicones

Englishman William Turner published Libellus De Re Herbaria Novus in 1538. Turner’s tome included names, descriptions, and local habitats of native British plants.

In 1539, Hieronymus Bock wrote Kreutterbuch, describing plants the author found in the German countryside. A 1546 second edition of the book included illustrations.

The five-volume Historia Plantarum, written by Valerius Cordus and published between 1561 and 1563, some two decades after the author’s death, became the gold standard for herbals. The work included formal descriptions detailing numerous flowers and fruits, as well as plant anatomy and observations on pollination.

Rembert Dodoens’ Stirpium Historiae, written in 1583, included detailed descriptions and illustrations of many new plant species discovered in the author’s native Holland.

Photo credit: ouhos / Foter / Creative Commons Attribution-ShareAlike 2.0 Generic (CC BY-SA 2.0)

Technology

Time Switches: Not What They Sound Like

What would be super cool, what would be just about the most amazingly fantastic and fantastically amazing thing ever, is if time switches switched time on and off, as light switches do for lights. “Fudge it to dern, I’m running late”—time switch *off*—“Never mind.”

Of course, if everyone had one, no one would ever get anything done because time would constantly be switching on and off at random. (That is, assuming that whoever switched the switch was the only one for whom time stopped, which would…you know what, that’s a whole ‘nother can of worms that would take literally thousands and thousands of words to get to the bottom of. I digress…)

“Timer Switches” is Considerably More Accurate

What time switches actually do is control other electric switches, turning them on or off at certain times or after set intervals. This can help save energy, by turning equipment and devices off when they’re not needed, and can also improve security, by switching lights on and off in a pattern that gives the impression that a building is occupied, for example.

Lighting, HVAC systems, cooking devices like ovens, and numerous other devices with electronic controls can be operated via timer switches. Most modern time switches are electronic themselves, utilizing semiconductor timing circuitry to activate or deactivate connected devices.

Originally, time switches were clockwork devices that were wound like watches and set to operate at or after a certain time. Many manufacturers of clockwork time switches used technology adapted from other devices—Intermatic timers, for example, were originally built with modified streetcar fare register mechanics. The vast majority of clockwork timers were capable of only a single on or off switching operation per cycle—i.e., they had to be manually reset after each cycle.

GE Model 3T27 Time Switch (circa 1948)

From there, timer switches evolved into electromechanical devices, utilizing slowly rotating, electrically-powered, geared motors that mechanically operate switches. Electromechanical time switches offered more programming options than clockwork versions had, allowing for multiple on-off cycles during a given period of time.

Modern, electronic time switches can be programmed to run their set cycles indefinitely. Cycles are usually delineated in 24-hour or seven-day periods. Electronic timers can, for example, manage a central heating system to supply heat during specified times of the morning and evening during the week (often programmable down to the exact desired minute), and all day during weekends.
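
For a loose illustration of that scheduling logic, here’s a small Python sketch of a seven-day heating program. It isn’t based on any particular manufacturer’s firmware; the times and structure are invented for the example:

from datetime import time

# A sketch of a seven-day heating program: two windows on weekdays, all day on weekends.
WEEKDAY_WINDOWS = [(time(6, 30), time(8, 30)), (time(17, 0), time(22, 0))]
WEEKEND_WINDOWS = [(time(0, 0), time(23, 59))]

SCHEDULE = {
    "Mon": WEEKDAY_WINDOWS, "Tue": WEEKDAY_WINDOWS, "Wed": WEEKDAY_WINDOWS,
    "Thu": WEEKDAY_WINDOWS, "Fri": WEEKDAY_WINDOWS,
    "Sat": WEEKEND_WINDOWS, "Sun": WEEKEND_WINDOWS,
}

def heat_on(day: str, now: time) -> bool:
    """Return True if the timer should switch the connected heating load on."""
    return any(start <= now <= end for start, end in SCHEDULE[day])

print(heat_on("Tue", time(7, 15)))   # True  - inside the weekday morning window
print(heat_on("Tue", time(13, 0)))   # False - midday on a weekday
print(heat_on("Sun", time(13, 0)))   # True  - weekends run all day

The whole job of the device, clockwork or silicon, boils down to answering that one “should the switch be on right now?” question over and over.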

Photo credit: Telstar Logistics / Foter / CC BY-NC

Technology

Do V-Belts Keep Your V-Pants Up?

Come on, title, don’t be ridiculous.

If you’ve spent any amount of time under the hood of an automobile, you’re likely familiar with v-belts. Often called “serpentine belts,” these sturdy, rubber loops are both the cheapest and the easiest way to transmit power between two or more rotating shafts that are not aligned axially but that run parallel to each other. V-belts are one of the most important components for the operation of automotive engines.

OG Leather V-Belts of 1916

The earliest mentions of v-belts as used in automobiles date back to 1916. Originally, v-belts were manufactured from leather, and were made with any number of “V” angles, as the burgeoning auto industry had not yet standardized designs for these components (as well as countless others).

In 1917, the endless rubber v-belt was developed by John Gates of the Gates Rubber Company, which would go on to become Gates Corporation, the world’s largest non-tire rubber manufacturer.

Walter Geist of Allis-Chalmers would develop the first multiple-v-belt drive in 1925, and a patent for his design (US Patent #1,662,511) was awarded three years later. Allis-Chalmers then marketed Geist’s creation under the “Texrope” brand name.

Modern V-Belts

Today, v-belts are made from advanced (or “engineered”) rubber materials that provide better durability and flexibility and longer service life. For further improved strength, many designs include special embedded fibers; commonly used fiber materials include nylon, polyester, cotton, steel, Kevlar, and others.

Elaborate treasure map, or V-belt routing diagram?

Modern v-belts are manufactured with essentially universal “V” angles (there are, of course, exceptions). This standardized shape was scientifically developed to provide an optimum combination of traction and speed. Bearing load is also optimized by this design—as the load increases, the V shape wedges into the corresponding groove in the shaft pulley, providing improved torque transmission.
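
That wedging effect can be put in rough numbers. A standard textbook approximation treats a V-belt as a flat belt whose friction coefficient is divided by the sine of half the groove angle; the Python sketch below uses plausible-looking values chosen purely for illustration:

import math

# Rough comparison of grip for a flat belt vs. a V-belt riding in a grooved pulley.
mu = 0.3                   # belt-on-pulley friction coefficient (assumed)
groove_angle_deg = 38.0    # total included "V" angle of the pulley groove (assumed)
wrap_angle = math.pi       # belt wraps 180 degrees around the pulley

# Wedging boosts effective friction by a factor of 1 / sin(half the groove angle).
mu_effective = mu / math.sin(math.radians(groove_angle_deg / 2))

# Belt (capstan) friction equation: maximum tension ratio before the belt slips.
flat_ratio = math.exp(mu * wrap_angle)
vee_ratio = math.exp(mu_effective * wrap_angle)

print(f"Effective friction:  flat belt {mu:.2f}, V-belt {mu_effective:.2f}")
print(f"Max tension ratio:   flat belt {flat_ratio:.1f}x, V-belt {vee_ratio:.1f}x")

Same belt tension, several times the grip; that, in a nutshell, is what the “V” buys you.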

Endless v-belts are the current standard, but jointed and linked v-belts can also be utilized for special applications. Jointed and linked v-belts are made from a number of composite rubber links joined together—links can be added or removed as needed for adjustability. Most of these specialty v-belts can operate at the same speed and power ratings as standard types, and require no other special equipment.

Photo credit: johnsoax / Foter / Creative Commons Attribution-NonCommercial-NoDerivs 2.0 Generic (CC BY-NC-ND 2.0)