In the higher education system, analog electronics is the first engineering course related to the semiconductor field and serves as the introductory text for that technology. It differs from basic circuit theory, an electrical entry-level course grounded in general physics that took shape during the Second Industrial Revolution. From static electricity to the voltaic cell, Oersted, Faraday, Ampère, Maxwell, and many other physicists established a whole new branch of physics, electromagnetism, which stands alongside traditional Newtonian mechanics and Kelvin's thermodynamics. Circuit theory is thus largely an extension of physics, learned logically with the support of mathematical theorems. Since high schools already teach physics, university-level circuit courses are relatively easy.
Analog circuits, on the other hand, are purely technical subjects that emerged alongside semiconductor technology. The known circuits, topologies, and application methods are all purely technical, more like a compilation of work notes. These documents record a series of achievements in analog electronics invented by humans over the past century. Clearly, as an introductory text for the semiconductor field, analog electronics textbooks are inadequate. Without introducing the development of the discipline, the technological background, and application scenarios, merely listing technological achievements amounts to asking students to memorize all the content.
If we look back at the history of analog electronics, we can understand what analog electronics is, who invented it, where it stands now, and who is still using it.
Analog electronics was not initially meant for semiconductors; it was not intended for diodes, transistors, or amplifiers, nor was it created for audio amplifiers or telegraph systems. The original purpose of analog electronics was to build computers directly. Yes, you read that right: to build computers directly.
In 1930, MIT created the world's first general-purpose analog computer, mainly used for solving differential equations of up to sixth order. This machine had no relation to electricity at all; it was a purely mechanical solver. As global productivity rapidly developed, humanity's appetite for computational power grew quickly, and by the time World War II began, computational power had even started to affect the course of the war.
In 1940, Bell Labs developed the M-9, its first electrical analog computer, an anti-aircraft gun director that was immediately deployed with radar-equipped gun batteries. Fed radar data, it calculated enemy aircraft trajectories in real time and aimed the anti-aircraft fire automatically. This cut the number of rounds needed to shoot down an aircraft roughly tenfold; in other words, shooting accuracy increased ten times. A single computer could rival a top gunner, and computational power truly made everyone begin to pay attention.
In the European theater, the Allies came to dominate the air, and by 1944 the German V-1 attacks on London had finally been defeated, with the M-9 playing a significant role. The core component of the M-9 was a vacuum-tube amplifier with negative feedback. Ragazzini, Randall, and Russell made the historic invention of the operational amplifier, and with it they simultaneously built integration, inversion, addition, and subtraction circuits. In their paper, they wrote: “When the output terminal of the amplifier is connected back to the input terminal, mathematical operations on voltages can be performed…”
In 1952, the operational amplifier, which had gained fame during the war, finally entered the civilian market when George A. Philbrick introduced the first commercially available op-amp, the K2-W. Powered from ±300 V supplies, it offered a ±50 V output swing, could drive a 50 kΩ load, and was priced at $22.
From this, we can see that analog electronics at this time was mainly used to build computational circuits performing addition, subtraction, multiplication, division, integration, differentiation, and inversion; the corresponding chapters in your textbooks derive from this work. To make these operations faster and more accurate, companies worldwide set out to improve op-amp performance, and this is when the Miller effect and oscillation problems emerged. Engineers gradually built physical models and systematically researched how to avoid these issues. At this time, op-amps were still based on vacuum tubes, as semiconductors had not yet been commercialized.
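The computing blocks named above can be sketched numerically. The ideal inverting summer obeys Vout = -Rf * sum(Vi/Ri), and the ideal inverting integrator obeys Vout = -(1/RC) * integral of Vin dt. This is a minimal sketch of those textbook relations; the component values are illustrative, not taken from any historical design.

```python
def inverting_summer(v_inputs, r_inputs, r_feedback):
    """Ideal inverting summer: Vout = -Rf * sum(Vi / Ri)."""
    return -r_feedback * sum(v / r for v, r in zip(v_inputs, r_inputs))

def integrate(v_in_samples, dt, r, c):
    """Ideal inverting integrator: Vout = -(1/RC) * integral of Vin dt."""
    v_out, outputs = 0.0, []
    for v in v_in_samples:
        v_out -= v * dt / (r * c)   # accumulate -(Vin/RC) each time step
        outputs.append(v_out)
    return outputs

# Equal resistors give plain negated addition: -(1 + 2) V
print(inverting_summer([1.0, 2.0], [10e3, 10e3], 10e3))

# A constant 1 V into an RC = 1 ms integrator ramps down 1 V per ms
print(round(integrate([1.0] * 100, 1e-5, 10e3, 0.1e-6)[-1], 6))
```

With multiplication and division added (via logarithmic elements), these few blocks were enough to solve differential equations, which is exactly what the early analog computers did.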
In 1962, a young man named Bob Widlar had just graduated from the University of Colorado. He worked at a research institution, where his natural talent for circuit design immediately caught the attention of one of its suppliers. Although poaching a client's engineer was frowned upon, the supplier managed to hire Widlar and put him in charge of developing a new product.
In 1964, Widlar led the successful development of the µA702, the world's first monolithic semiconductor operational amplifier, marking the beginning of a new era. That era was the era of the integrated circuit, and the company that had hired him was called “Fairchild Semiconductor.”
Like the K2-W, the µA702 used two stages of voltage amplification, which raised a problem: how to convert a differential signal into a single-ended one without sacrificing gain. Done naively, half the signal is lost, and the K2-W had simply discarded that half. Widlar, however, produced another revolutionary invention: the current mirror. Using nine NPN transistors, he built a circuit that even today is hard to fully comprehend, recovering the discarded half of the signal without adding distortion. This is the origin of the current mirror in textbooks. Widlar's craft of integrating several transistors into a single device is now called “analog chip design.”
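The basic idea is easiest to see in the simplest two-transistor mirror (a far cry from Widlar's nine-transistor circuit, but the same principle): the output transistor copies the reference current, with a small error because both base currents are stolen from the reference leg. A rough numerical sketch, with illustrative values:

```python
def simple_mirror_output(i_ref, beta):
    """Two-transistor NPN current mirror: Iout = Iref * beta / (beta + 2).
    The factor beta / (beta + 2) accounts for the two base currents
    drawn from the reference leg."""
    return i_ref * beta / (beta + 2)

# The copy gets better as transistor current gain (beta) rises
for beta in (50, 100, 200):
    i_out = simple_mirror_output(1e-3, beta)
    error_pct = (1e-3 - i_out) / 1e-3 * 100
    print(f"beta={beta}: Iout={i_out * 1e3:.4f} mA, error={error_pct:.2f}%")
```

Used as an active load on a differential pair, such a mirror steers the "wasted" half of the signal current back into the output node, which is how the gain is preserved.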
At this point, the story of this god-like figure of analog electronics had only just begun. After the operational amplifier, Widlar went on to develop op-amps specialized for comparison, the 710 and 711, cutting the comparison time to 40 ns, roughly ten times faster than conventional op-amps. Op-amps used specifically for comparison were called comparators. This is the origin of the comparator chapter in your textbooks.
Later, he developed the µA726, which drastically reduced temperature drift to 0.2 µV/°C, meeting military temperature requirements (−55 °C to +125 °C). This is the origin of the temperature drift chapter in your textbooks. Why was there such a huge demand for computers at this time? Because the Cold War between the U.S. and the Soviet Union had begun, and the world's military and aerospace industries advanced rapidly. Countries were spending a significant share of GDP on technological innovation, and the semiconductor industry became a matter of national strategy.
In 1965, Widlar was denied a salary increase at Fairchild, so he left with his colleague Talbert and joined National Semiconductor. His first products after the move were the LM100 and LM101A. Widlar reasoned that if a current mirror could process signals, it could also control power, so he applied the same principles to design a voltage regulator built around an operational amplifier. This is the origin of the DC voltage regulator chapter in your textbooks, and the beginning of linear voltage regulators. The LM317, which you are familiar with, is a descendant of this series. Its defining feature is an output voltage that stays constant regardless of load changes, a major contribution to the development of power supplies.
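The LM317 just mentioned shows how simple the programming of such a regulator is: per its datasheet, the part holds a 1.25 V reference between its OUT and ADJ pins, so two resistors set the output. The resistor pair below is a common example, not a requirement:

```python
V_REF = 1.25   # volts held between OUT and ADJ (datasheet value)
I_ADJ = 50e-6  # typical ADJ-pin current in amps (datasheet value)

def lm317_vout(r1, r2):
    """Programmed output: Vout = 1.25 * (1 + R2/R1) + Iadj * R2."""
    return V_REF * (1 + r2 / r1) + I_ADJ * r2

# The classic 240 / 720 ohm pair programs roughly 5 V
print(round(lm317_vout(240, 720), 3))
```

Because the output depends only on the internal reference and the resistor ratio, it is insensitive to load current, which is exactly the property described above.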
Widlar's linear-regulation idea also gave birth to another product: the voltage reference chip. Analog circuits need an absolutely accurate voltage standard as a reference; 5 V, for example, should be exactly 5 V. If it drifts to 5.01 V, the final calculation is off, much like a floating-point precision error in a computer. Widlar's approach greatly improved the precision of reference voltages in analog electronics. This is the origin of the voltage reference chapter in your textbooks. How did Widlar nail a reference to a single value? He started from the diode's forward voltage of about 0.6 V, which is fairly stable; however, this voltage still varies with current and temperature, falling by roughly 2 mV per degree Celsius (about 0.3%/°C), which makes it complementary to absolute temperature (CTAT). Drawing on his current-mirror experience, Widlar added a voltage proportional to absolute temperature (PTAT) with an equal and opposite slope, so that physics itself cancels the drift and the sum is nearly temperature-independent. The voltage accuracy was pinned to within a few millivolts.
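The cancellation is easy to verify numerically. This is a toy sketch of the bandgap principle, not Widlar's actual circuit: the coefficients (0.6 V at 300 K, a -2 mV/K slope) are illustrative round numbers, and the PTAT term is a scaled thermal voltage kT/q.

```python
K_OVER_Q = 8.617e-5  # Boltzmann constant over electron charge, V/K

def v_be(t_kelvin):
    """Toy CTAT model of a diode drop: 0.6 V at 300 K, -2 mV/K slope."""
    return 0.6 - 2e-3 * (t_kelvin - 300)

def v_ptat(t_kelvin, gain):
    """PTAT term: a scaled thermal voltage, gain * (kT/q)."""
    return gain * K_OVER_Q * t_kelvin

# Pick the gain so the PTAT slope exactly cancels the -2 mV/K CTAT slope
gain = 2e-3 / K_OVER_Q
for t in (250, 300, 350):
    print(t, round(v_be(t) + v_ptat(t, gain), 4))  # ~1.2 V at every T
```

The temperature-independent sum lands near 1.2 V, the silicon bandgap voltage, which is why such circuits are called bandgap references.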
At the same time, other manufacturers were making significant strides of their own. Hans Camenzind, for example, working for Signetics, specialized in phase-locked loops (PLLs). Wanting a controllable frequency source of his own, he created the 555 timer. Every electrical engineering student has heard of or used this device: the 555 became the world's best-selling IC for 35 years, with around a billion units sold annually.
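In its most common (astable) configuration, the 555's output frequency and duty cycle follow the standard datasheet relations, set by two resistors and a capacitor. The component values below are an arbitrary example:

```python
def ne555_astable(r1, r2, c):
    """Standard datasheet relations for the 555 in astable mode."""
    freq = 1.44 / ((r1 + 2 * r2) * c)   # oscillation frequency, Hz
    duty = (r1 + r2) / (r1 + 2 * r2)    # fraction of period output is high
    return freq, duty

f, d = ne555_astable(1e3, 10e3, 0.1e-6)
print(f"{f:.0f} Hz, duty {d:.1%}")  # a few hundred hertz with these parts
```

Three passive components pinning the frequency is a big part of why the chip sold in such numbers.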
Similarly, Bill Hewlett and Dave Packard started out using an amplifier meant to suppress oscillation, but a feedback connection wired positive made the circuit oscillate continuously. After thinking it through, they decided to specialize in oscillators and founded a company named after their initials: HP (Hewlett-Packard). The area where their company stood later became known as Silicon Valley.
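The condition behind such sustained oscillation is the Barkhausen criterion: a loop gain of exactly one at zero phase shift. For the Wien-bridge network Hewlett used in HP's first audio oscillator, zero phase shift occurs at f = 1/(2πRC), with the amplifier supplying a gain of 3. A small sketch, with illustrative component values:

```python
import math

def wien_bridge_freq(r, c):
    """Zero-phase-shift frequency of the Wien network: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2 * math.pi * r * c)

# With R = 10 kOhm and C = 16 nF, the bridge balances near 1 kHz
print(round(wien_bridge_freq(10e3, 16e-9), 1))
```

Hewlett's twist was using a light bulb's resistance to hold the gain at exactly 3, which kept the output amplitude stable.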
By now, the chapters in your analog electronics textbook have pretty much all been covered. Based on these technologies, various products began to emerge, such as radios, audio systems, televisions, and telephones.
Later, in the 1970s, the maturation of MOSFETs further enhanced the scale and speed of integrated circuits, and the world began the digitalization process. Digital electronics began to take center stage globally, while once-glorious analog electronics began to step back from the spotlight. It is clear that the development of semiconductors was a result of humanity’s constant pursuit of computational power.