Technical Article

Rethinking Power Analysis

December 01, 2015 by Bernd Neuner

This article introduces ZES ZIMMER's innovations in power analyzer design with respect to the physical interface to the DUT, the user interface, and the interface to peripheral devices.

The quest for increased power efficiency permeates every level of the electronics industry, from individual semiconductor components through circuits and modules to entire devices and even complex systems. From a power analysis point of view, every part of this spectrum has different requirements and poses different challenges. Keeping up with these diverging demands has become increasingly difficult for conventional architectures – reason enough for ZES ZIMMER Electronic Systems GmbH to come up with a radically changed power analysis platform that addresses the needs of the future.

ZES ZIMMER has been designing and manufacturing precision power analyzers for more than three decades. Since we have always been exclusively dedicated to power analysis, we have accumulated a considerable amount of experience and have received a wealth of meaningful customer feedback across all industries and geographies. Our design philosophy is centered on how an instrument interacts with its environment. We believe that in order to increase an instrument’s value to our customers, we must strive to improve those interactions on all levels. All interaction occurs via interfaces, which can be categorized into three distinct types:

  • the physical interface to the device under test
  • the user interface
  • the interface to other devices like PCs, peripherals etc.

Over the years, these interfaces have gradually been enhanced and modified in most power analyzers on the market, driven by innovations in the devices under test and by general changes in IT infrastructure. But sometimes gradual change is no longer good enough and a step change is required: one needs to go back and take a fresh direction. In the following, we describe major innovations we feel have become necessary to successfully cope with the power analysis demands of tomorrow. To illustrate the extent of these changes, we will pick one example for each type of interface.

 

The physical interface to the device under test (DUT)

The defining parameter of any power analyzer is the RMS power of the measured signal. In addition, harmonics, or more generally speaking, information about the power distribution in the frequency domain, are crucial when it comes to optimizing efficiency and avoiding damage to components. Unfortunately, RMS values and frequency domain information cannot always be obtained under the same circumstances: sometimes you have to decide which one you are interested in, modify the measurement setup accordingly, carry out the measurement, and then repeat it with modified settings to obtain the rest of the required values.

Where does the problem come from? Harmonics, filtered values etc. are constrained by the Nyquist-Shannon theorem, whereas the RMS power value is not necessarily. In practice, this means you can still correctly derive RMS power values for signals whose bandwidth exceeds half the sampling frequency, while this is impossible for harmonics etc. To look at the frequency distribution, you need to sample fast enough to reconstruct the original signal from the sample values. To merely measure RMS power, this is not mandatory. A loss of information is unproblematic, as the exact signal shape does not matter: an infinite number of different signals can end up having the same RMS power value. What is important is to obtain a statistically significant number of random power samples to calculate the right average.

However, when looking at the frequency domain (or calculating harmonics, applying digital filtering etc.), one has to be careful to obey the Nyquist-Shannon criterion. Any signal content above half the sampling frequency has to be removed to avoid erroneously undersampling it and mixing it up with the rest of the signal after digitization. This order cannot be reversed: it is impossible to sample first and filter later, since the sample values of an undersampled high-frequency signal are indistinguishable from those of a “genuine” lower-frequency signal. The result of such erroneous undersampling is aliasing, which jeopardizes the measurement results. Generally, there is no fool-proof way to predict its influence on overall accuracy, so it is best to eliminate it entirely.
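The effect is easy to reproduce numerically. The following minimal sketch (an illustrative numpy example, not instrument code; the sampling rate and tone frequency are arbitrary assumptions) undersamples a 70 kHz tone at 100 kS/s: the RMS value is still recovered correctly, but the spectrum shows the tone aliased down to 30 kHz.

import numpy as np

fs = 100e3                        # sampling rate: 100 kS/s, so the Nyquist limit is 50 kHz
f_sig = 70e3                      # tone frequency above the Nyquist limit (assumed)
t = np.arange(0, 0.1, 1 / fs)     # 0.1 s of samples
x = np.sqrt(2) * np.sin(2 * np.pi * f_sig * t)   # 1 V RMS sine

# the RMS value is computed correctly even though the signal is undersampled
rms = np.sqrt(np.mean(x ** 2))
print(f"RMS from undersampled data: {rms:.4f} V (true value: 1.0000 V)")

# the spectrum, however, shows the tone folded down to 30 kHz (aliasing)
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(len(x), 1 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
print(f"Apparent spectral peak: {peak / 1e3:.1f} kHz (true tone: {f_sig / 1e3:.1f} kHz)")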

One way to avoid the above dilemma is to carry out two separate measurements – which creates a fresh quandary: for most devices under test, it is close to impossible to reach exactly the same point of operation in two independent measurements. Theoretically, the only safe way out would be two parallel measurements on the same DUT at the same time. What sounds like an instrument manufacturer’s dream is unattractive in practice due to the doubled cost for measurement instruments. In reality, you do not necessarily have to duplicate the entire instrument: it is sufficient to sample/filter twice in parallel. This underlying idea eventually led to the birth of ZES ZIMMER’s DualPath architecture depicted in figure 1. The incoming signal is split up and sampled concurrently by two independent A/D converters – users get both results they are interested in, acquired at exactly the same time under identical operating conditions.

 


Figure 1: Two A/D converters/channel for faster & better results
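The principle behind Figure 1 can be mimicked in software. The following conceptual numpy/scipy sketch (with assumed sampling rate, filter order and corner frequency; it illustrates the idea and is not the actual LMG600 signal chain) feeds one waveform into two paths: a wideband path used for the RMS value and a band-limited path that is anti-alias filtered before harmonic analysis.

import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1_000_000                          # assumed wideband sampling rate: 1 MS/s
t = np.arange(0, 0.1, 1 / fs)
# stand-in signal: 50 Hz fundamental plus a 180 kHz switching residue
u = 325 * np.sin(2 * np.pi * 50 * t) + 15 * np.sin(2 * np.pi * 180e3 * t)

# Path A: wideband samples, used directly for the RMS value
rms_wideband = np.sqrt(np.mean(u ** 2))

# Path B: anti-alias low-pass filter (assumed 100 kHz corner) before the FFT
sos = butter(8, 100e3, btype="low", fs=fs, output="sos")
u_filt = sosfiltfilt(sos, u)
rms_filtered = np.sqrt(np.mean(u_filt ** 2))
spectrum = np.abs(np.fft.rfft(u_filt)) * 2 / len(u_filt)
freqs = np.fft.rfftfreq(len(u_filt), 1 / fs)

print(f"Path A (wideband) RMS:   {rms_wideband:7.2f} V")
print(f"Path B (filtered) RMS:   {rms_filtered:7.2f} V")
print(f"Fundamental from path B: {spectrum[np.argmax(spectrum)]:.1f} V "
      f"at {freqs[np.argmax(spectrum)]:.0f} Hz")

Path A keeps the high-frequency content in its RMS result, while path B delivers an alias-free spectrum – both from the very same acquisition.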

 

As long as the measurement bandwidth required by the application does not approach half of the instrument’s sampling frequency, there is no problem at all, since no frequency content is in danger of getting undersampled. However, this does not reflect the current state of affairs for most power analysis applications. Products as diverse as LED circuits, switch-mode power supplies, variable frequency drives (VFDs) and welding equipment share pulse-width modulation (PWM) as a common feature. While PWM has many advantages in terms of efficiency and flexibility, it also comes with harmonics and other by-products that completely redefine the conditions for power analysis. The measurement bandwidth no longer depends on the frequency of the application alone; it also has to accommodate the switching frequency of the PWM, which often ranges from 2 kHz to 100 kHz and beyond. This in turn means that the harmonic frequencies reach well into the MHz range. How important their contribution is cannot be determined in general; it depends on the requirements of the application at hand. If there is no need for high accuracy, the high-frequency portions might be omitted. Typically, the higher you go in frequency, the lower the remaining share of the power spectrum. Figure 2 shows an example of the output power distribution of a frequency converter in an electric drive application. (Please note that both the x-axis (frequency) and the y-axis (power) are scaled logarithmically.)

 


Figure 2: VFD power spectrum

 

There are three distinct areas that are typical for this kind of application: the tall bar to the left represents the fundamental frequency of the motor, which is where most of the power is intended to end up, since only this portion can potentially contribute to torque. The successively shorter bars in the center represent the switching frequency of the converter and its harmonics. While this frequency is in the lower kHz range in the above example, it can easily be located at 20 kHz and beyond, with the 40th harmonic already close to the MHz range. The blue “triangle” on the right is made up of line reflections and other by-products. Due to the logarithmic scaling it appears more substantial than it actually is, but it still contributes about 2% of the total power. The depicted distribution illustrates that insufficient measuring bandwidth will truncate the power spectrum and omit portions of the true power output. There is no point in measuring the lower-frequency content with 0.1% accuracy while completely cutting off 2% at the upper end. Thus, ample analog bandwidth is crucial for precise measurements in high-frequency applications.
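To make the truncation effect concrete, the short sketch below tabulates the power captured as a function of measurement bandwidth for an assumed, highly simplified VFD-like line spectrum (the frequencies and power shares are illustrative assumptions, not the data behind Figure 2).

import numpy as np

# assumed spectral lines: (frequency in Hz, share of total power)
lines = [(50,    0.950),                                   # motor fundamental
         (16e3,  0.020), (32e3, 0.007), (48e3, 0.003),     # switching frequency + harmonics
         (2e6,   0.012), (8e6,  0.008)]                     # HF reflections and by-products

freqs = np.array([f for f, _ in lines])
share = np.array([p for _, p in lines])

for bw in (10e3, 100e3, 1e6, 10e6):                         # candidate measurement bandwidths
    captured = share[freqs <= bw].sum()
    print(f"bandwidth {bw / 1e6:5.2f} MHz -> captured {captured:6.1%}, missed {1 - captured:5.1%}")

With these assumed numbers, a bandwidth of a few hundred kHz already misses about 2% of the total power, no matter how accurately the remainder is measured.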

Still, expanding the bandwidth at the expense of accuracy does not yield any benefits either. With the increasing shift to variable-frequency drives for motor control, the use of modified oscilloscopes has become more popular for power measurements, since they typically offer superior analog bandwidth compared to “native” power analyzers. However, a closer look reveals that this increased bandwidth does not result in more precise power measurements. What is gained by adding the spectrum between, e.g., 10 and 20 MHz (often at unspecified accuracy) is more than lost by deteriorating the basic measuring accuracy from typical values for a high-end power analyzer (0.03% or better) to those of an oscilloscope (typically around 0.1%). Since the vast majority of the measured power is to the left of the graph, a measurement error roughly three times worse at the fundamental frequency of the motor makes a much bigger difference than the entire contribution from 10 MHz upward. The conclusion is obvious: both accuracy and bandwidth need to be taken to the maximum; one cannot replace the other.
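A back-of-the-envelope sketch illustrates the trade-off. All figures below are assumptions chosen to mirror the argument above: the share of power above 10 MHz, the 0.03% basic accuracy of a high-end power analyzer, and the 0.1% basic accuracy of a modified oscilloscope.

# all figures are assumptions for illustration, not instrument specifications
share_above_10mhz = 0.0005      # assumed share of total power between 10 and 20 MHz

# high-end power analyzer: 0.03 % basic accuracy, bandwidth limited to 10 MHz
analyzer_error = 0.0003 + share_above_10mhz     # reading uncertainty + truncated power

# modified oscilloscope: 0.1 % basic accuracy, bandwidth up to 20 MHz
scope_error = 0.001

print(f"power analyzer, worst case: {analyzer_error:.2%} of total power")
print(f"oscilloscope,   worst case: {scope_error:.2%} of total power")

Even when the analyzer is charged with the full power share it cannot see above 10 MHz, its overall error remains smaller than the oscilloscope's basic uncertainty on the dominant low-frequency power.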

 

The user interface 

The above example hopefully serves to hint at the many degrees of freedom and associated choices involved in measuring power in real-life scenarios. Yet it only captures one aspect of a single application on one device under test. It does not take a lot of imagination to picture the numerous ramifications opened up by the diverse range of devices found in the market. Unless one wants to develop application-specific power analyzers, every power analyzer needs to offer all the capabilities required by all applications across industries. The individual user, however, typically accesses and uses only a tiny fraction of these features. As long as they are easy to access, there is no issue. Once they become hard to find because they are buried under a multitude of options, usability suffers severely.

In our experience, very few users work with desktop power analyzers on a daily basis. (Integrated test systems are a different story.) In a worst-case scenario, the power analyzer gets dusted off every few weeks or even months to qualify the impact of some design or design change. When that happens, there is little time to re-familiarize oneself with its functions. In an ideal world, the user would set up the test, connect the instrument, power it up, configure it, start measuring, save the data and call it a day. In reality, people often get stuck as early as the configuration stage. Of course, there is the user manual, and hopefully also the instrument manufacturer’s competent customer support team. Still, more often than not, the entire procedure is quite time-consuming and nerve-racking.

Since users have little time to adapt to the instrument, the logical consequence is that the instrument needs to adapt to the user. In other words, the user needs to be able to shape the instrument in such a way that it becomes the perfect tool for the job at hand: easy to understand, easy to use. This means first and foremost eliminating features and functions that are not relevant to the task at hand and thus become mere distractions. It also means creating a meaningful context for the measurement task by adding useful information, and it means being able to structure the results obtained during the course of a measurement so that excessive jumps between different menu items and screens are minimized or eliminated altogether. Figures 3-6 show some exemplary implementations of this principle.

 


Figure 3: Conventional "spreadsheet" view


Figure 4: Individualized application-specific GUI layout

 

It is not sufficient to simply reduce or reformat the amount of displayed data; combining data from different domains on one page is crucial to switching from an analyzer-centric to an application-centric view. In the context of a given application it might be useful to collocate RMS values, external inputs such as torque and speed, harmonics and even derived variables calculated from the electrical measurements. A widespread example of the latter are properties such as the field strength, flux density and losses of magnetic cores. Displaying them next to the electrical parameters concentrates all desired results in a single spot, making it easy to check the plausibility of the obtained values. There is no longer any need to navigate through different menus and submenus to collect various bits of information and aggregate them manually later on. Nothing needs to be remembered regarding the structure of the instrument’s menu tree: the user simply connects the device under test, loads the individualized GUI together with the right settings, and starts testing. No “programming” is necessary, apart from simple drag-and-drop operations when defining the ideal GUI setup, which can then be re-used for subsequent tests.
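As an illustration of such derived variables, the sketch below computes field strength, flux density and core loss from sampled winding voltage and current using the standard two-winding method. All winding and core geometry data, as well as the stand-in waveforms, are assumptions for demonstration purposes and do not represent any particular instrument or DUT.

import numpy as np

fs = 1_000_000                     # assumed sampling rate of the recorded waveforms
N1, N2 = 20, 20                    # primary / secondary turns (assumed)
l_e = 0.10                         # effective magnetic path length in m (assumed)
A_e = 1.0e-4                       # effective core cross-section in m^2 (assumed)

t = np.arange(0, 0.01, 1 / fs)     # 100 cycles of a 10 kHz excitation
i1 = 2.0 * np.sin(2 * np.pi * 10e3 * t) + 0.1 * np.cos(2 * np.pi * 10e3 * t)  # magnetizing + loss current
u2 = 5.0 * np.cos(2 * np.pi * 10e3 * t)                                        # induced secondary voltage

H = N1 * i1 / l_e                                  # field strength H(t) in A/m
B = np.cumsum(u2) / fs / (N2 * A_e)                # flux density B(t) in T, from integrating u2
p_core = np.mean(u2 * i1) * N1 / N2                # core loss in W (two-winding method)

print(f"H_peak = {H.max():.0f} A/m, B_peak = {B.max():.4f} T, P_core = {p_core:.3f} W")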

 


Figure 5: From power analyzer to magnetic core analyzer


Figure 6: Combining graphs and tables

 

The interface to the periphery

Improving the usability of an instrument does not stop at the built-in screen: sometimes operation from the front panel is not advisable, for instance because the unit under test is situated in an environment that is hazardous to human health or otherwise unsuitable. This is typical when machine-to-machine communication comes into play: data needs to be transferred from the analyzer to a control computer or a peripheral device for visualization, storage etc. Of course there are other scenarios in which instruments need to be remote-controlled – first and foremost in the context of automated test systems, where the analyzer plugs into a larger umbrella structure. Those shall not concern us here; for the moment we would like to focus on how a single user can carry out, at a distance, all the actions normally initiated from the front panel.

As with all the other functions described, remote operation of the power analyzer should be easy to set up and use. Remote-control software tools have long been established for these scenarios and are readily available for most power analyzers on the market. If the proper infrastructure – mainly a PC or notebook and the software – is in place, controlling the instrument should not be an issue. Sometimes not all front panel features can be made available remotely; sometimes the remotely available feature set is even more powerful. Often, however, users do not have access to a separate PC, and connecting the power analyzer and control computer to the existing LAN may require approval from the IT department, which tends to complicate and slow down the process. Thus, many customers have jokingly confessed they wished they could just remove the front panel, take it to their desk, and operate the unit from there. With conventional power analyzer architectures this would probably have remained a joke forever. The only part of the front panel that could effectively be redirected was its graphical output, e.g. via a VGA or DVI interface. Although this turned out to be useful for sharing screen content, it was merely a one-way street: while it was possible to look at a remote screen to see what was going on, there was no way to influence what was happening – not without an additional computer.

The only way to achieve true “remote front panel” operation without at least an additional keypad was to make screen interaction bidirectional: the screen had to serve as both output and input device. After evaluating all available options, ZES ZIMMER decided on a touchscreen interface. Not only did it turn out to be enormously advantageous in simplifying the use of the power analyzer, it also solved the remote operation problem in a very elegant way: all you need to do is plug a touchscreen into the LMG600, and you can control the instrument the same way you would from the built-in front panel.

 

Figure 7: "Remote front panel" operation

 

Conclusion

The increasing sophistication of applications, in combination with a general trend away from specialization among test engineers, has brought conventional power analyzers to their limits and tempted users to resort to other means of measurement to make up for the perceived shortcomings. However, a few decisive changes to the traditional architecture and feature set of power analyzers can successfully address and overcome these obstacles. Offering the customizability and flexibility otherwise found only in generic data acquisition systems in an off-the-shelf power analyzer is a huge step towards quick, low-cost and low-effort out-of-the-box test automation for engineers.

 

About the Author

Bernd Neuner has been Director of Global Sales at ZES ZIMMER Electronic Systems GmbH since June 2013, where he specializes in building and maintaining global business relationships, managing distribution networks, and dealing with multinational key accounts. He is also skilled in business development, photovoltaics, and solar energy. He earned his diploma in computer science at the Julius Maximilians University of Würzburg in Germany.

 
