Market Insights

Does the Internet of Things mean that analog electronics is on the way out?

March 03, 2015 by Steve Roberts

The hype surrounding the Internet of Things (IoT) is now reaching fever pitch. According to a well-known model of modern technology evolution, it will soon hit the “peak of inflated expectations” before descending into the “trough of disillusionment” and finally levelling out at the “plateau of productivity”. We know that we have reached the inflated-expectations peak when we read comments like: “The Internet of Things will transform everything - in the future, the next generations will look at us and wonder how we ever survived our ‘dumb’ homes”. Considering that millions of people in the world make it through their lives quite satisfactorily without even the benefit of electricity, we can see how rash such comments are. A more level-headed commentator on the recent surge of questionable IoT products - such as plant pots that send SMS messages, feeding bottles that monitor your baby’s milk intake and fridges that inform you that they are empty - questions whether people are really so lazy or unobservant that they cannot tell that their plants are dry, their baby is full or that they need to buy more beer. The same commentator more realistically suggests that only about 4% of the current IoT proposals will actually turn out to be useful. Nevertheless, those few products that do succeed will be very profitable, hence the excitement in the electronics industry over the IoT.

The corollary concept to IoT is that everything will be dominated by digitally interconnected devices with local and group intelligence, thus the future will be binary, not analog. Any traditionally analog function will be successively replaced by its digital equivalent; all sensors will transmit their information directly as digital data, any analog signals will be filtered by DSPs and “dumb” analog power supplies will be exchanged for digital power supplies that can intelligently adapt themselves to the load conditions. The expectation is that soon analog electronics will no longer be relevant in the modern world – the end of analog electronics as we know it.

However, analog electronics is not going to go away. We live in the real world, not a digital simulation of it. After all, at the most basic level, all electronics is analog and digital is just fast analog. What is on a sharp growth curve is mixed-signal electronics: microprocessors with on-board operational amplifiers, PWM generators and ADCs, for example. Real-world voltages and currents often exceed the 3.3V limits of most digital circuitry and their micro-amp supply currents - and as long as there is a need for analog-to-digital and digital-to-analog interfaces, there will always be a need for analog electronics. This is doubly true for power electronics, as there is no such thing as a digital switch.
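To make the interfacing point concrete: a real-world signal that swings well above 3.3V must be attenuated by an analog divider before a digital ADC can touch it. The numbers below (a 0-24V signal, a 3.3V 12-bit ADC) are illustrative assumptions, not figures from any particular part; this is only a back-of-envelope sketch of the analog-to-digital boundary.

```python
# Sketch: scaling a 0-24 V real-world signal into a 3.3 V ADC's input range.
# All values (24 V range, 3.3 V reference, 12-bit ADC) are illustrative.

V_MAX_IN = 24.0     # worst-case sensor voltage (assumed)
V_ADC_REF = 3.3     # ADC reference voltage (assumed)
ADC_BITS = 12       # ADC resolution (assumed)

def divider_ratio(v_max_in: float, v_adc_ref: float) -> float:
    """Analog attenuation needed so v_max_in maps to ADC full scale."""
    return v_adc_ref / v_max_in

def adc_code(v_in: float, ratio: float) -> int:
    """Ideal ADC code for an input voltage after the divider."""
    full_scale = (1 << ADC_BITS) - 1
    return round(v_in * ratio / V_ADC_REF * full_scale)

def reconstructed_voltage(code: int, ratio: float) -> float:
    """Firmware converts the code back to the original voltage."""
    full_scale = (1 << ADC_BITS) - 1
    return code / full_scale * V_ADC_REF / ratio

ratio = divider_ratio(V_MAX_IN, V_ADC_REF)   # 3.3/24 = 0.1375
code = adc_code(6.0, ratio)                  # a 6 V input, quarter scale
print(ratio, code, reconstructed_voltage(code, ratio))
```

The divider itself is the analog part that no amount of digital cleverness removes: the 12-bit code only carries about 6mV of resolution per step after the attenuation, which is exactly the kind of trade-off a mixed-signal designer has to weigh.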

In a recent interview, the CEO of one of the larger power supply manufacturers stated that the future for his company lies in power modules with digital control and digital feedback, and that consequently he was diverting resources away from analog design. This may be a good strategy for higher-power units, where the expectations for exceptional efficiency, high power factor and a fast response to dynamic loads can hardly be met with non-digital control systems any more. But all power supplies have performance artefacts that arise from analog interactions - whether it be simply the inductance of the cabling, unwanted cross-coupling between components or stray leakage capacitances, not to mention the complex layered EMC interference signals that many digital simulations do not or cannot take into account. As long as the heart of any power supply is the transformer, analog electronics will remain at the center of power supply design.
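What "digital control and digital feedback" amounts to is replacing the analog error amplifier with sampled arithmetic. The toy loop below shows the idea: a discrete PI regulator driving a crude first-order model of a converter's output stage. Every number in it - the gains, the 20 kHz loop rate, the 10 V rail, the plant constant - is an invented illustration, not a real design, and a real converter would add exactly the analog artefacts (cabling inductance, stray capacitance, EMC) that this simulation leaves out.

```python
# Toy digital voltage-mode control loop: a discrete PI regulator driving a
# crude first-order model of a converter's output stage. All gains, the
# time step and the plant model are illustrative assumptions.

V_REF = 5.0        # target output voltage (assumed)
KP, KI = 0.4, 40.0 # PI gains (assumed)
DT = 50e-6         # control period: a 20 kHz loop (assumed)

def run_loop(steps: int = 2000) -> float:
    v_out = 0.0
    integ = 0.0
    for _ in range(steps):
        err = V_REF - v_out
        integ += err * DT
        # Clamp the command to [0, 1] like a PWM duty cycle.
        duty = max(0.0, min(1.0, KP * err + KI * integ))
        # First-order plant: output relaxes toward duty * 10 V input rail.
        v_out += (duty * 10.0 - v_out) * 0.05
    return v_out

print(run_loop())  # settles near 5.0 V
```

The point of the sketch is the trust problem in the next paragraph: the regulation behaviour lives in those few lines of arithmetic and their chosen constants, not in anything visible on the schematic.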

There is also a significant trust issue with digital power supplies - many engineers do not feel comfortable working on designs that resist empirical analysis. An electronics engineer needs to be able to look at a circuit schematic and, with the help of the performance specifications in the datasheets, figure out how the circuit is supposed to work or troubleshoot why it does not. As soon as there is a digital processing or control stage in the design, the logical link between input and output is lost, hidden in some cryptic part of the code.

Returning to the theme of the Internet of Things: how will all of these devices be powered? The accepted answer is to use either a single-cell battery or energy scavenging from thermal gradients, vibration or solar energy, or by tapping into the electro-smog of all the other RF networks. All of these energy sources generate either very low or very variable voltages, so there is still a need for boost converters/regulators with high efficiency at very low loads. A coin-cell battery may be able to power an IoT processor for many years (after which, presumably, the device will simply be thrown away), but the 1.5V battery output also severely limits the sensor choice. Temperature and visible light measurements are generally OK, but many other sensors require significantly higher voltages, and there are very few power supplies on the market that can efficiently generate higher output voltages from single-cell supplies or low-voltage energy scavenging circuits. For the few that do exist, the severe cost restrictions of a disposable product mean that only simple analog DC/DC converters are suitable. So as the Internet of Things gathers momentum, the requirement for very low power analog power supplies will actually increase. For the few remaining analog power supply designers out there: analog is not on the way out, and long live the IoT!
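As a back-of-envelope footnote to the powering question: an ideal boost converter obeys Vout = Vin / (1 - D), so stepping a 1.5 V cell up to a higher sensor rail demands a large duty cycle, and the battery-life arithmetic shows why efficiency at micro-amp loads matters. The 12 V rail, 85% efficiency and 220 mAh capacity below are illustrative assumptions, not figures from any specific cell or converter.

```python
# Back-of-envelope boost-converter and battery-life figures for a coin-cell
# IoT node. The 12 V sensor rail, 85 % efficiency and 220 mAh capacity are
# illustrative assumptions.

def boost_duty(v_in: float, v_out: float) -> float:
    """Ideal continuous-mode boost: V_out = V_in / (1 - D)."""
    return 1.0 - v_in / v_out

def battery_life_years(capacity_mah: float, load_ma: float,
                       efficiency: float = 1.0) -> float:
    """Crude estimate: usable capacity divided by average current draw."""
    hours = capacity_mah * efficiency / load_ma
    return hours / (24 * 365)

d = boost_duty(1.5, 12.0)     # 1 - 1.5/12 = 0.875
# A 220 mAh coin cell feeding a 5 uA average draw at 85 % efficiency:
life = battery_life_years(220, 0.005, 0.85)
print(d, life)
```

At an 87.5% duty cycle the switch, inductor and diode losses dominate, which is why a cheap, simple analog DC/DC converter that stays efficient at these levels is the part this article argues will stay in demand.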