A popular opinion about technologies headed for the recycling bin: the utility of new, emerging products far outweighs that of their predecessors. The effects of this mindset are quite conspicuous. Surging demand for "new" products and declining demand for "obsolete" ones are statistics ubiquitous in tech review reports. Electronics practitioners then tend to deem the study of old technologies unnecessary, focusing instead on the new. Anything related to the former is dismissed as ancient and negligible.

A closer look at how technology evolved from then to now might shed more light on such consumer behavior. Perhaps one can still remember black and white television sets built around vacuum tubes? Vacuum tube technology was dominant through the early 1900s (sidenote: Vladimir Zworykin patented his television camera tube, the iconoscope, in 1923; John Fleming invented the vacuum tube diode in 1904). Vacuum tubes supplied the computing power for most digital electronics of the time. Each logic element required its own bulky tube, and because of this the first computers occupied so much space they filled entire rooms! These massive machines were considered state-of-the-art until the late 1940s, when the transistor was invented. Logic operations then migrated to solid-state electronics, paving the way for smaller digital equipment, and computers finally became affordable for personal use. These developments augured improvements by leaps and bounds; Gordon Moore predicted that transistor density on chips would double roughly every two years. Everything digital became smaller and smaller until one could finally play their favorite music on a device the size of a wristwatch. A popular example is Apple's iPod, which was received en masse with stellar enthusiasm. With these new technologies came lower market prices. Suddenly, everyone forgot about those bulky vacuum tubes and bipolar junction transistors. Now, what most are familiar with are CMOS, 3D/tri-gate transistors, Ivy Bridge, memristors, Core i7 processors, graphene-based transistors, smart materials, and fiber optics, to name a few.
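To make that prediction concrete, here is a minimal sketch of what doubling every two years implies. The starting count (roughly that of an early-1970s microprocessor) and the time spans are illustrative assumptions, not historical data.

```python
# Illustrative sketch of Moore's prediction: transistor count doubling
# roughly every two years. The starting count and year spans below are
# assumed round numbers for illustration, not measured data.
def projected_transistors(years_elapsed, start_count=2_300, doubling_period_years=2):
    """Project transistor count after `years_elapsed` years of steady doubling."""
    return start_count * 2 ** (years_elapsed / doubling_period_years)

if __name__ == "__main__":
    for years in (0, 10, 20, 30, 40):
        print(f"after {years:2d} years: ~{projected_transistors(years):,.0f} transistors")
```

Even from a modest starting point, forty years of doubling every two years multiplies the count by about a million, which is why "everything digital became smaller and smaller" so quickly.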

There seems to be one common denominator across all these developments: they are all innovations in the digital domain. Faster speeds, larger memories, smaller sizes... all focused on the computing market. But what about communications and other applications? Could a CMOS device do better than older technology at, say, modulating broadcast signals at transmitter stations?

Some may argue the point is blatantly obvious, a mere tenet of intuition in materials science, but it bears repeating because neglecting it is tempting amid all the ads and promotion for the latest trends. Obsolescence does not always mean uselessness. In fact, the term "obsolete" means something no longer in use, and to claim old electronics are no longer in use is an outright fallacy.

Old technologies can still find value in a myriad of other applications. Take the bulky vacuum tubes that once served ENIAC and other early computing equipment. In broadcasting, high-power signals are needed to reach audiences at great distances, and such signals cannot be handled by small solid-state components alone: their minuscule size means insufficient power ratings, so they would likely fail spectacularly. Vacuum tubes, on the other hand, whose bulk is unwanted in digital applications, find great use in communications that require such high power (take the klystron amplifier, for example).
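As a rough, back-of-envelope comparison, consider how many devices of a given rating would be needed just to carry a broadcast-level output. The power figures below are assumed for illustration only, not taken from any datasheet.

```python
import math

def devices_needed(target_power_w, per_device_rating_w):
    """Minimum device count to share the load, assuming ideal power combining."""
    return math.ceil(target_power_w / per_device_rating_w)

# Assumed, illustrative figures only -- not datasheet values.
transmitter_output_w = 50_000      # a hypothetical high-power broadcast transmitter
small_transistor_rating_w = 0.5    # a small-signal solid-state device
power_tube_rating_w = 25_000       # a single high-power vacuum tube stage

print("small transistors needed:", devices_needed(transmitter_output_w, small_transistor_rating_w))
print("vacuum tube stages needed:", devices_needed(transmitter_output_w, power_tube_rating_w))
```

Under these assumed numbers, the small-signal device would need to be replicated (and its outputs combined without loss) tens of thousands of times, while a couple of high-power tube stages would suffice, which is the intuition behind the klystron example above.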

Aside from differences in application, there are other reasons why obsolete technology remains relevant. If inventors invented haphazardly, with manufacturers mindlessly following suit, bankruptcy would become a likely outcome. For example, say Charles invented a cool new phone with 99% higher energy efficiency, 99% faster processing speed, and a 99% cheaper price tag than its competitors. But when it launched, no one would buy it. Why? Charles' "super-efficient" phone used a completely different modulation scheme, a completely different machine architecture, and a completely different way of charging its battery, so no other phone or base station could interact with it, and it even required a unique charger. Furthermore, if the phone needed repairs, the only company that could repair it would be the manufacturer. Sound familiar? Ask a non-Apple user why he or she isn't an Apple patron and compare the answer with the previous scenario. Yet Apple thrived, perhaps because its products still communicated with other phones; the key was that Apple didn't phase out backward compatibility. What would have happened if the iPhone or iPad couldn't communicate with mobile devices from Samsung, Nokia, and other smartphone manufacturers? If it didn't feature Wi-Fi?

In order for a product to be backward compatible, the inventor of the product must know how old technologies worked and what principles governed their operation. This way, inventions can function together with older technologies.
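Here is a minimal sketch of that idea, with invented device and protocol names: a new product prefers its newest protocol but can still negotiate down to an older one that a legacy peer understands, which is only possible if its designer understood the old protocol in the first place.

```python
# Hypothetical sketch of backward compatibility via protocol negotiation.
# Device and protocol names are invented for illustration.
class Device:
    def __init__(self, name, supported_protocols):
        self.name = name
        # Ordered from newest (preferred) to oldest (fallback).
        self.supported_protocols = supported_protocols

    def negotiate(self, peer):
        """Return the newest protocol both devices support, or None."""
        for protocol in self.supported_protocols:
            if protocol in peer.supported_protocols:
                return protocol
        return None

new_phone = Device("new_phone", ["v3-fast", "v2", "v1-legacy"])
old_phone = Device("old_phone", ["v1-legacy"])

print(new_phone.negotiate(old_phone))   # falls back to "v1-legacy"
print(new_phone.negotiate(new_phone))   # both new: picks "v3-fast"
```

Drop the legacy entries from the new device's list and negotiation with older peers returns nothing at all, which is essentially the fate of Charles' phone in the scenario above.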

There are a number of other reasons why obsolete technologies should not be ignored, but I believe the two key ones are applicability in other areas and backward compatibility. Do you agree?