Technology, World-Changing Inventions

You Down with PTFE?

Good ol’ polytetrafluoroethylene. It’s just the best, amirite? All right, good talk. See ya next time…

…Wait, hold on. Yeah, so polytetrafluoroethylene, a synthetic fluoropolymer of tetrafluoroethylene (don’t even get me started on that $#!t), is better known as PTFE, and even better known as Teflon®. But because Teflon® is a registered trademark of DuPont Co. (hence the ®) that, like Band-Aids, has pretty much become the everyday name for what it is, and because we hate when that kind of bulljazz happens (lookin’ at you, too, Jell-O), we’ll just keep on referring to it as PTFE.

Yeah, You Know Me

You know PTFE quite well, even if you’re unaware of it. Its main application is as the non-stick coating on cooking pans and the like. Because it’s so slippery, it’s also used as a coating on catheters. Hopefully you’ve only experienced it when cooking and not whilst in the hospital.

Legend has it that PTFE was invented by accident. Back in 1938 CE, Roy “Big Roy” Plunkett was working at a DuPont lab in New Joisey. Whilst he was attempting to create a new type of refrigerant, his pressurized bottle of tetrafluoroethylene gas malfunctioned and stopped flowing before it was empty. Curious as to the cause of the failure, Plunkett eventually cut the tank in half and discovered that its interior was coated with a waxy, whitish, extremely slippery substance. Under the high pressure, the gas had reacted with iron from the inside of the bottle, which acted as a catalyst, and polymerized into solid polytetrafluoroethylene.
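
For the chemistry-curious, what happened inside that bottle boils down to a textbook addition polymerization. The equation below is a general sketch of that reaction, not anything pulled from Plunkett’s lab notes:

n C₂F₄ → –(CF₂–CF₂)ₙ–

Thousands of tiny tetrafluoroethylene molecules link up into long, heavy chains, which is why a free-flowing gas turned into a waxy white solid.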

This is your brain on PTFE. (Or brains, maybe? Not sure of the egg-to-brain ratio. I assume it’s 1:1, so yeah, that would be brainS. Right…?)

After further R&D, the resulting PTFE material was patented in 1941; the name “Teflon®” was trademarked in 1945. By 1948, as part of a joint venture with General Motors, DuPont was cranking out more than two million pounds of their patented PTFE substance. One of its earliest uses was coating the pipe valves and seals that held the uranium hexafluoride used in the Manhattan Project. The first PTFE-coated cooking pan, “The Happy Pan,” went on sale in 1961.

PTFE PTFToday

PTFE is still most commonly used on cookware; however, it has found countless other uses since its debut. It is used to waterproof fabric for camping equipment like tents and rain jackets, and is often applied as a spray-on stain repellent for high-end fabrics.

Powdered PTFE is used in infrared decoy flares and rocket fuel igniters. In its solid form, the material can be used to make a wide range of products and parts. Though difficult, PTFE machining can produce strong but lightweight parts in almost any shape, form, or size.

Perhaps the best use PTFE was ever put to was as the inflatable roof of the Hubert H. Humphrey Metrodome in Minneapolis, Minnesota. But that’s a technological tale of terror for another time, Timothy.

Photo credit: JPC24M via Foter.com / CC BY-SA

Technology, World-Changing Inventions

600 Words on the History of Machine Tools

A “machine tool” is, as the name suggests, both a machine and a tool, one that shapes, cuts, bores, grinds, shears, or otherwise deforms metal or another rigid material. Though there are a wide variety of machine tools—from drill presses to lathes to electrical discharge machining systems—all utilize some method of constraining the material being worked and provide guided movement of the parts of the machine. (A circular saw, for example, is not a machine tool, as it allows for unguided, or “freehand”, movement.)

Most modern machine tools are electrically, hydraulically, or otherwise externally powered; very few rely on good ol’ elbow grease. That fact may make it seem as though machine tools are a relatively new invention; however, they have been around for millennia.

Early Forerunners

The first kinda-sorta machine tools were the bow drill and the potter’s wheel, which were used in ancient Egypt at least as far back as 2500 BCE. Rudimentary lathes were known throughout Europe as early as 1000 BCE.

However, it was not until the Late Middle Ages/the early Renaissance that true machine tools exhibiting the features noted above began to appear. A chap by the name of Leonardo “Big Leo” da Vinci helped pioneer machine tool technology, with further advancements championed by clockmakers of the time.

Driven By Industry

In its early days, machine tool development was spurred by a number of nascent industries which more or less needed the devices to grow. The first was firearms, because war never goes out of fashion, followed by the textile market and transportation—first steam engines, then bicycles, then automobiles, then aircraft.

Textile manufacturing was perhaps the biggest driver of machine tool innovation. Prior to the Industrial Revolution in England, most textile machinery was constructed from wood (even gears and shafts). However, these early machines couldn’t withstand the rigors of increased mechanization, and parts were replaced by cast or wrought iron. For large parts, cast iron was generally cast in molds (hence the name), but was all but impossible to work on a smaller scale. Wrought iron could be blacksmithed into shape when red hot from the forge, but after cooling was very difficult to hand-machine into the more complex shapes required.

The Watt steam engine, brainchild of James “Big Game James” Watt and the godfather of all modern engines, would never have come about without machine tools. Watt was unable to manually machine a correctly-bored cylinder for his engine until John “Big Bad John” Wilkinson invented a boring machine in 1774.

Selling Out

Portion of an advert for an early lathe machine.

Throughout the 18th, 19th, and early 20th centuries CE, machine tools were generally built by the same people who would use them. Eventually, people realized that there was a significant market for machine tools, and machine tool builders began to offer their creations for sale to the general public. The first commercially available machine tools were built by English steam engine manufacturer Matthew “Fat Matt” Murray, starting in 1800. Others soon followed suit, including Scottish engineer James “Big Jim” Nasmyth, English inventor Joseph “Big Joe” Whitworth, and Henry “Big Hank” Maudslay, whose skill and innovation would eventually lead him to be dubbed “the father of machine tool technology.”

Among the earliest commercially available machine tools were the metal planer, the milling machine, the pattern tracing lathe, the screw cutting lathe, the slide rest lathe, the shaper, and the turret lathe. These devices and those that followed allowed for the realization of a long-sought-after goal in manufacturing: the production of identical, interchangeable parts such as nuts and bolts. This, in turn, paved the way for mass production, assembly lines, and modern manufacturing as we know it.

Photo credit: Internet Archive Book Images via Foter.com / No known copyright restrictions

Historical Science & Technology, World-Changing Inventions

Bloomery: Iron, Not Underpants

Iron is, for lack of a better word, good. If you haven’t spotted any iron around you today, it’s only because we’re so used to it that it has become essentially invisible. But before iron became ubiquitous in architecture, transportation, and elsewhere, the people of Earth had to make do without and hold their buildings and bridges up with rocks or trees or whatever. And so, tired of an ironless lifestyle, ancient humans created the bloomery, with which they could smelt iron to their hearts’ content.

A Bloom of One’s Own

Consisting of a pit or chimney (generally made of earth, clay, stone, or other heat-resistant material) with one or more pipes entering through the side walls near the base, a bloomery was the earliest manmade method of smelting iron. Preheated charcoal is used to “fire” iron ore inside the bloomery, and the pipes allow air to enter the furnace via natural draft or with assistance from bellows. The product of a bloomery is a porous mass of iron and slag known as “bloom.” The so-called “sponge iron” that results from the process can be further forged to create wrought iron.
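
In rough chemical terms (a simplified sketch of the overall smelting chemistry, not a period recipe), the charcoal burns to carbon monoxide, which then strips the oxygen out of the ore at temperatures well below iron’s melting point:

Fe₂O₃ + 3 CO → 2 Fe + 3 CO₂

Because the iron never fully melts, it remains a spongy solid tangled up with slag, hence the bloom.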

Not coincidentally, the development and widespread use of the bloomery ushered in the Iron Age. Earlier samples of processed iron do exist, but these artefacts have been identified as meteoric iron, which required no smelting, or happy accidents produced in bronze smelting processes.

The surviving remnants of an early American bloomery.

A History of Bloomery

The earliest archaeological evidence of the use of bloomeries comes from East Africa, where bloomery-smelted iron tools have been dated to 1000 to 500 BCE. In sub-Saharan Africa, forged iron tools dating back to 500 BCE have been found amongst relics from the highly advanced and mysterious Nok culture.

In Europe, the first bloomeries were small by necessity, capable of smelting only about 1 kg of iron at a time, as the furnaces simply could not be built any bigger. By the 14th century CE, large bloomeries with capacities up to 300 kg had been developed. Some even used waterwheels to power their bellows.

At a larger scale, bloomeries expose iron ore to burning charcoal for longer. Combined with the more powerful air blast required to adequately heat the charcoal in these larger chambers, this often led to the accidental production of pig iron. This pig iron was naught but a waste product for roughly a century, until the arrival of the blast furnace, which enabled smelters to oxidize pig iron and turn it into cast iron, wrought iron, or steel.

Eventually, the bloomery would be replaced for nearly all smelting processes by the blast furnace. Developed in China in the 5th century BCE, the blast furnace did not make its way to the West until the 15th century CE. It was long thought that the ancient Chinese did not use bloomeries, and instead went straight to blast furnacin’. However, recent evidence suggests that bloomeries were in use in China as early as 800 BCE, having migrated eastward from Europe.

Photo credit: mixedeyes via Small Kitchen / CC BY-NC-SA

Technology, World-Changing Inventions

The Cleanest Rooms in History

A cleanroom—or clean room—is just that: a very, very clean room. More specifically, a cleanroom is a controlled environment in which environmental pollutants and particulates are kept at extremely low levels via air filtration and purification, amongst other means. Cleanrooms are critical to the manufacture of a large number of products, from pharmaceuticals to semiconductors.

Compared to conditions in the early days of mass manufacturing, almost any modern room is clean. True clean rooms, though, are held to a higher standard. But who first developed the cleanroom as we know it? Whence did these havens of cleanliness originate? Read on to learn more!

Big Willy Whitfield Cleans Up

There is considerable historical evidence showing that rudimentary contamination control efforts were being made in hospital operating rooms as early as the mid-19th century CE. Building off earlier discoveries by Louis “Big Lou” Pasteur, British surgeon Joseph “Big Joe” Lister was the first to introduce sterile and antiseptic surgical measures.

Lister’s efforts only affected the medical procedures themselves, however, and did not encompass the entire room in which said procedures were being performed. The concept of the modern cleanroom was developed by an American physicist named Willis “Big Willy” Whitfield.

Whitfield’s invention was spurred by the need for high precision manufacturing during World War II, which required clean environments to ensure the quality and reliability of military instrumentation. Previously, dirty production environments had contributed to poor performance and malfunctions in bomb sights, aircraft bearings, and other equipment crucial to the war effort.

Then known as “controlled assembly areas,” early clean rooms often suffered problems with airborne particulates and unpredictable airflows. Whitfield, an employee of Sandia National Laboratories, created an effective solution through the use of a constant, highly filtered airflow that flushed out impurities in the atmosphere. Air was filtered using high-efficiency particulate air (HEPA) filtration devices developed during the previous decade.

Whitfield’s first clean room prototype appeared in 1960, well after the war had ended but just in time for the space race, which would create a huge market for cleanroom technology. By the mid-1960s, more than $50 billion worth of cleanrooms had been installed throughout the United States and the world.

NASA workers operating in a cleanroom environment.

Today’s Cleanrooms Today

Modern cleanrooms have continued to improve Whitfield’s technology, and can provide particulate filtration as low as twelve particles of 0.3μm diameter or smaller per cubic meter. (Ambient air in a typical urban environment contains roughly 35 million of these particles per cubic meter.)
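
For the curious, and assuming the modern ISO 14644-1 classification scheme (which the figures above don’t explicitly cite), the maximum allowed concentration of particles of size D micrometers or larger in a class-N cleanroom follows roughly:

Cₙ = 10^N × (0.1 / D)^2.08 particles per cubic meter

In other words, each step down in class number means roughly ten times fewer permitted particles, and the bigger the particle, the fewer of them are allowed.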

Modern clean rooms can enclose thousands of square meters, and use extensive filtration and airlock systems to ensure and maintain cleanliness. Specialized HVAC systems can control humidity levels, and ionizers are utilized to prevent electrostatic discharge (ESD) and other similar hazards. A modular clean room can be used for temporary or permanent operations, or can be relocated as production processes require.

Photo credit: NASA Goddard Photo and Video via Scandinavian / CC BY

Important People, Important Discoveries, World-Changing Inventions

A Ridiculously Brief History of Nuclear Fusion Research, Part III

Part I can be found here, whilst Part II can be found here.

The Nineteen Hundred & Eighties

In 1983, the NOVETTE laser was completed at Lawrence Livermore, followed a year later by the multi-beam NOVA laser. Thanks to massive advances in laser technology throughout the decade, by 1989 NOVA would be capable of producing 120 kilojoules of infrared light in a single nanosecond pulse.

Scientists at the Laboratory for Laser Energetics at the University of Rochester developed frequency-tripling crystals that would transform infrared laser beams into ultraviolet beams. A further leap in laser amplification came in 1985 from Canadian scientist Donna “Big Donna” Strickland and French scientist Gerard “Big Gerry” Mourou. Their “chirping” technique stretched a single short laser pulse out in time, spreading its component wavelengths apart so the now-longer, gentler pulse could be safely amplified, then recompressed it back into one ultra-intense burst. Chirped pulse amplification, as the process came to be known, was instrumental in further technological advances, particularly for weaponized fusion.
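
As a back-of-the-envelope illustration of why the stretching step matters (the numbers here are purely illustrative, not taken from Strickland and Mourou’s experiments): a pulse’s peak power is simply its energy divided by its duration,

P_peak = E / τ

so stretching a 1-joule, 1-picosecond pulse out to 1 nanosecond drops its peak power from roughly 10¹² watts to 10⁹ watts, low enough to amplify without frying the amplifier’s own optics, after which recompression brings the full punch back.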

Pew pew!

In 1987, data collected by NASA’s Voyager 2 probe as it passed by Uranus was analyzed by Akira Hasegawa. Hasegawa noted that, in a dipolar magnetic field, fluctuations compressed plasma sans the usual energy loss, an observation that would become the basis for the Levitated Dipole branch of fusion technology and research.

The Nineteen Hundred & Nineties

The world’s first controlled release of fusion power took place at the Culham Centre for Fusion Energy in Oxfordshire, England, in 1991 at the Joint European Torus, the world’s largest operational magnetic confinement plasma physics experiment. Thankfully, this release of fusion power was intentional and did not annihilate anything at all.

Beginning in 1992 and continuing throughout the decade, numerous advocates for laser fusion technology drummed up support for ongoing research. In the late ‘90s, the United States Congress approved funding for the US Naval Research Laboratory (NRL) to continue their work in the field.

In 1995, a massive fusor—a device which uses an electric field to heat ions to conditions suitable for nuclear fusion—was built at the University of Wisconsin-Madison. Called HOMER, the device is still in operation today. Smaller, similar fusors were also built around this time at the University of Illinois at Urbana-Champaign and in Europe.

1996 saw Sandia National Laboratories giving the public its first look at the Z-machine, a device that discharges a pulse of roughly 18 million amperes of current through a liner of fine tungsten wires in less than 100 nanoseconds, vaporizing the wires into a plasma that the resulting magnetic field crushes inward. The Z-machine is an effective way to test the very high energy, very high temperature conditions (up to 2 BILLION degrees Fahrenheit) of nuclear fusion.

A levitated dipole fusion device was developed by a team of Columbia University and MIT researchers in the late ’90s. Consisting of a superconducting electromagnet floating in a saucer-shaped vacuum chamber, the device swirls plasma around the chamber and fuses it along the center axis.

The Two Thousand & Aughts

In early 2002, Rusi Taleyarkhan published the findings of his team at the Oak Ridge National Laboratory. Taleyarkhan reported that he and his cohorts had recorded measurements of neutron and tritium output consistent with successful fusion, following a series of acoustic cavitation experiments. When these results were later discovered to have been falsified, Taleyarkhan was found guilty of misconduct by the Office of Naval Research and debarred from receiving federal funding for more than two years.

A machine capable of producing fusion and small enough to fit “on a lab bench” was developed by a group of researchers at UCLA in 2005. The device used lithium tantalate to generate sufficient voltage to smash deuterium atoms together, though the process generated no net power.

In the early- to mid-2000s, researchers at MIT started investigating the possibility of using fusors, similar to the UCLA team’s device, for powering and propelling vehicles in space. In separate trials, a team from Phoenix Nuclear Labs developed a fusor that could be used as a neutron source for medical isotope production.

To infinity and BEYOND!

In 2008, an exceptionally brainy 14-year-old whippersnapper by the name of Taylor Wilson achieved successful nuclear fusion using a homemade fusor.

The Two Thousand & Tens

In 2012, a published paper demonstrated a method of dense plasma focus that could achieve temperatures of 1.8 billion degrees Celsius. This temperature is sufficient for boron fusion, and the method produced fusion reactions that occurred almost exclusively within a contained plasmoid (a cage of current and plasma trapped in a magnetic field), which is a necessary condition for generating net power.
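
For context (and as an aside not drawn from the paper itself), “boron fusion” refers to the aneutronic proton–boron reaction:

p + ¹¹B → 3 ⁴He + ~8.7 MeV

Its appeal is that the energy comes out as charged helium nuclei rather than neutrons, which in principle is easier to convert directly into electricity, but it demands far higher temperatures than deuterium–tritium fusion, hence the 1.8-billion-degree figure.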

Lockheed Martin’s Skunk Works announced the development of a high-beta fusion reactor in October 2014. This compact fusion reactor could make possible higher-velocity, lower-cost space travel, including deep space exploration, as the above-mentioned MIT team had postulated. Skunk Works researchers hope to have a functioning prototype of a 100-megawatt version built by 2017, and to have the device ready for regular operation by 2022.

And then it’s now and I don’t know what happens next!

Laser Photo credit: melissa.meister via DesignHunt / CC BY-SA
Space Exploration Photo credit: NASA Goddard Photo and Video via Foter.com / CC BY