- September 03, 2013
- Feature
By Shaun Luper
The stunning and sometimes revolutionary strides made in the consumer and corporate computing industries over the last decade (and more) have Automation clients, customers, and insiders wondering why the evolution of tech in Automation has not paralleled these gains to a greater degree.
Certainly there are no significant technical reasons holding back the evolution rate of Automation systems. All of the items on Bill Lydon's Wish List are more than achievable with today’s (and even yesterday’s) hardware and software technologies. So what is it then?
The current state of Automation is driven less by technological limits and supplier protectionism and more by Risk, Plant Lifecycle/Adoption rates, De-Facto Standards, Economics of Market Scale, and Human Risk Aversion, a.k.a. Resistance to Change.
Vendor Cartels?
Some are of the view that the rising tide of innovation in the consumer and corporate computing industry has not floated the Automation industry boat (which certainly floats in the same general seas) at a faster pace because of the self-interests of system vendors. This does not seem to be a credible explanation as long as there is sufficient competition in the marketplace. While it is certainly possible that Automation suppliers are colluding with one another in a collectivized effort to tame the forces of market competition to their advantage, conspiracies of this type seem highly unlikely (if not illegal). If the history of free markets has taught anything…it is that market cartels are notoriously difficult to manage to all the participants' benefit in the long term (see the history of OPEC).
Industry Consolidation
Yes, there has been a wave of consolidation in the Automation supplier industry that started in the late 1980s and continues today. It is also true that consolidation can retard competition like an effective cartel. Still, in most sectors and parts of the world, competition in the Automation industry remains adequate.
Adequate does not mean that there are dozens of different suppliers to choose from. Historically, in the consumer and corporate computing industries, the market (read: millions of individual decision makers) ultimately tends to shape the competition into a few large players that set the industry de-facto standards while lumbering along, trading periodic technological counter punches. These few big dogs' heels are constantly nipped at by dozens of young and scrappy pups that hope to be just enough of an annoyance (or threat) to get bought out before they get stepped on. The explosion of internet companies in the late 1990s was a good example. Now market forces have weeded out the very numerous bad ideas (pets.com) and consolidated the rest into the major players (new and old) we are all familiar with today.
True, consolidation tends to mean less choice, less innovation, and in the short run, sometimes higher prices. But, it also means more (even if forced) industry standardization which is not always a bad thing. Indeed, it seems to be what the market has told suppliers it prefers (in the long run).
De-Facto Standardization and Economies of Scale
For those of you that are old enough…recall the computing world before the emergence of DOS and MS Windows 3.1 running on Intel processors as a de-facto consumer and corporate computing standard in the late 1980s and early 1990s. Before this period it truly was difficult or even practically impossible to integrate computing hardware and software from different suppliers. The dominance of "Wintel" over the last 25 years, despite its frequent curse-inducing shortcomings and periodic inferiorities (Windows ME or the Pentium FDIV bug, anyone?) vs. competing, technically superior platforms with much less market share (Linux, Mac on PowerPC, Solaris on SPARC, Unix), is a lasting testament to the fact that economic decisions, on balance, tend to favor the winning (dominant) standard over a technically superior minority (VHS vs. Betamax).
In another decade, Wintel domination will probably sound as ridiculous as US Steel or International Harvester domination. But in their respective days they were no less real. With the rise of Linux, cloud computing, and Android or iOS on mobile devices, Wintel domination has never been so fragile.
Another, often overlooked, benefit of the consolidation / standardization business cycle is that it allows economies of scale to come into play for both the Hardware and Software of the lucky owners of the winning platform. Settling on one platform for mass adoption drives unit prices down making the use of the technology economically feasible in many more applications. This puts the hardware and software into more hands (and minds) and the resultant exposure to its benefits and shortcomings ultimately ignites another round of innovation and competition from disruptive upstarts (a good thing).
As Bill Lydon points out in his article, the ease with which today's users can buy a powerful peripheral device for $30, connect it to a USB port, and have it working in minutes with minimal fuss, risk, or technical know-how is the product of both standardization (due to a couple of dominant market players…heck, even Apple eventually got on the Intel bandwagon) and economies of scale. Remember what it was like to add an expansion card to your computer's Industry Standard Architecture (ISA) bus in the mid 1980s? It cost a fortune and probably ate your whole weekend (and more), and that was if you were even technically capable of attempting it in the first place. So much for standards…with experience you will find that most standards…aren't. (RS-485, RS-232…RS means Recommended Standard.)
Indeed, the ease with which the average consumer can plug and play with devices is a double-edged sword for Automation. Yes, it makes some of our tasks much easier and cheaper. But it also sets expectations pretty high. Now all integration tasks in the Automation world are expected to be as easy as plugging in a new mouse. It is often difficult to explain to non-technical customers or to managers unfamiliar with the Automation world that "Just because it says 'Profibus Compatible' on the spec does not mean that you just bolt the thing to the wall, floor, or instrument stand, connect the cable, turn it on, and it works." * See the bottom of this article for a humorous aside on this subject.
Displacing De-Facto Standards
Of course, de-facto standards are just that, and will eventually either evolve or be supplanted when a sufficiently superior replacement comes along. Small incremental improvements in competing tech are usually not enough to hoist the massive weight of entrenched incumbent standards from their thrones and into the pages of history. The change must be more revolutionary for people to give up their considerable investments in the past and place a bet on the future. Think Cassette tapes vs. Reel To Reel, CDs vs. Pre-Recorded Cassette tapes, MP3s vs. CDs, digital cameras vs. Film. In these cases the incumbent did not stand a chance. The advantages of the new technology were non-linear improvements in cost and functionality that were sufficient to displace the considerable inertia of the entrenched market standard quickly.
One of the favorite topics in Automation is the various pretenders to the 4-20mA throne. How is the latest Fieldbus (take your pick) faring vs. traditional hardwired interfaces (4-20mA and discrete I/O)? Is it being adopted only begrudgingly here and there in small pilot installations? And on the projects you have been on where it was used, did you find that all the extra features (which might justify the associated risk, complexity, and cost) were not often used, and in the end it became effectively an expensive and complex version of the traditional hardwired standard, passing only the process value or a discrete signal up to the control system for further processing? Yes, it saved on wiring and cabinets, but the tools and expertise needed to implement and manage it were more expensive.
Effectively, the current crop of fieldbus technologies are the 8-track cartridges of the past - technically better than LP records and reel-to-reel recorders for some tasks - but not by much, and they carry many disadvantages. Outside of the US and UK, where they enjoyed a brief period of popularity in the early 70s, 8-tracks never really caught on, and they were quickly wiped out when their smaller, simpler, cheaper, 4-track cousins - more commonly known as cassette tapes - appeared on the scene. Today's fieldbus technologies are in that 8-track zone. They are not compelling enough to sweep the incredibly massive installed base of traditional Automation architecture and I/O into history's dustbin. Most are just niche players moldering on the sidelines.
Risk
If your iOS or Android device, IP telephone, or Win 8 computer doesn't work correctly because of a software or hardware problem, it's an annoyance, not a life-threatening situation. When Automation systems go awry, and particularly safety (SIL) systems, the consequences can be life threatening. There is typically much more at risk. Therefore these systems must be much more robust than your average consumer or corporate tech. Think in terms of the kind of software and hardware testing and development that goes into medical devices (pacemakers, heart-lung machines) - or at least in terms of the testing you hope goes into these platforms should your life ever depend on them. In terms of the rate of change of technology, this means that older, fully debugged (and usually cheaper) technology will always be favored over the new hot thing. Practical large-scale Automation is not concerned with bleeding-edge or even leading-edge tech. It is the realm of the trailing edge - the tortoise, not the hare. The rising tide of consumer computing innovation does float the Automation tech boat, but only partially and with a lag of several years.
If it ain’t broke, don’t fix it.
The cost of downtime for any industrial facility is usually very high. If the Automation group can maintain or improve uptime by only a few minutes (or sometimes even seconds) a year, you have already paid for yourself as an Automation engineer. Other measures of Automation's value are safety and operational efficiency (process yield, quality, improvements in maintenance expenses, waste / emissions minimization). Excluding the safety aspect, operational efficiency improvements are usually not as attractive, in terms of risk and reward, as boring, reliable, stable (even if sub-optimal) uptime operation. Management and investors notice when the plant is down and can easily quantify its real costs. Promises by Automation engineers of marginal improvements in operational efficiencies, if they are allowed to tweak things, are more diffuse, not easily measured, and thus not always explicitly visible in the bottom line. They may be there, but they can easily get attributed to things other than improvements in Automation. The only thing that is for sure is that attempting to make those improvements, by definition, requires changes to the way the plant operates, control strategies, software, hardware, etc. And that means certain cost and RISK. So, much to the consternation of Automation engineers who want to "tweak," management frequently takes the safe route. If it ain't broke (and works well enough), don't fix it - at least not on my watch/shift or rotation. The message is basically: leave the innovation and its risks to others. If it works, we will consider following. Of course, everyone is doing this, so not much happens.
Lifecycle
The design life of most industrial process plants is usually on the order of 20 years, and of course there are many that are much older (look at some of the nukes and refineries). If owners can get away with leaving the same automation systems in place for the entire life of the plant, that is the least-risk option. Replacing an entire automation system during a planned outage is a high-risk proposition that owners are right to avoid if possible. This means the replacement cycle (if you can call it that) for industrial automation hardware and software is on the order of the life of the plant (say 10 to 20 years). Any changes to the plant automation system are likely to be bolt-on additions, entirely new systems for new expansions with gateways to the existing system, or partial updates to replace a particularly troublesome or obsolete portion of the system (hence the need for all these legacy gateways between the old and the new). This longer lifecycle means that evolutionary improvements are slower to come when compared with the 2 to 4 year lifecycles in the corporate and consumer computing markets. Older tech can stick around longer because the risk factor tends to drive conservative choices when designers and engineers select technologies, thus reducing the pressure on suppliers to introduce new tech.
Size
In comparison with the consumer and corporate computing markets, the Automation market is minuscule. Aside from piggybacking on advances made in corporate and consumer computing technologies, the specialized software and additional hardware features required (redundancy, explosion proofing, environmental hardening) to produce automation equipment suitable for industrial use limit the economy-of-scale effects seen in the consumer and corporate markets. Certainly scale effects are present, but they are not comparable to the scale efficiencies present in the consumer and corporate tech market. This drives prices up and choice down when compared to the consumer and corporate tech market. It also contributes to making product lifecycles longer, as mentioned above.
Selling a larger quantity of the same systems over a longer period of time means development costs can be spread over more systems, reducing the impact of initial fixed development capital costs on the per unit pricing. This can translate either into lower customer prices and/or higher profits for the supplier.
Human Inertia
Finally, there is the human element. Study after study of human choice reveals that, again and again, when people have a choice they will choose an imperfect known over an unknown. After all, despite all of your personal well-known and glaring shortcomings, your partner keeps choosing you, and you them ;-). In short, people who have responsibility for the outcomes of decisions are naturally conservative. They know that there could be big payoffs to a new approach but would prefer to let someone else take the risks initially, and then follow quickly in their footsteps if they prove to be successful.
I remember my early days in the industry, infatuated with whatever the latest and greatest tech was being promoted by suppliers. It was the future, it was cool, and it put me on an even playing field with the more experienced guys because it was new for them too. I felt that their experience and prior knowledge of existing systems was rendered useless. Of course this was not entirely true. While our detailed knowledge of the latest tech was equal, their experience on the risks of new tech was something I lacked.
I could see only the touted benefits and none of the potential complexities, costs, and risks. I was continually frustrated when the final decisions were made by my superiors with regard to the systems that would be installed, how they would be structured and configured, how much automation would be used, etc. It always seemed they chose the oldest, least cool tech and put the minimal amount of Automation into the design. Of course, as I gained experience (and more importantly responsibilities), I began to understand and appreciate my superiors' conservative approach. While the WOW factor was missing, we weren't being paid for WOW; we were paid for safe, low-risk, reliable, inexpensive automation solutions that were "good enough" - otherwise infamously known as "fit for purpose" solutions. After all, the company was in business to make money, and my bosses (not me) were responsible for the end result. I was there to learn, help, and hopefully eventually earn my keep. No gold-plated hi-tech Rolls Royce was being asked for, and therefore none would be offered. The difference between bleeding edge and leading edge was not yet apparent to me.
As time has marched on, I am approaching the experience level of the old guard I worked under when I first started. I do try to balance my naturally more conservative stance with openness to changes. I also make an effort not to lose my own enthusiasm or to tamp down on the enthusiasm of the me of yesteryear for the latest and greatest. One of these bleeding edge approaches will eventually become the new de-facto standard starting the cycle of innovation all over again.
Summary
In summary, there are many forces that conspire to make the current state of Automation what it is today. Nearly all of them have little to do with technological hurdles or nefarious plots by suppliers. They are nearly all based on natural human tendencies, risk, the power and demands of the markets, economies of scale, and the natural lifecycles of the process plants/systems.
I know my ramblings here might be misconstrued as a defense of the lack of innovation. Comparisons between the relatively riskless world of consumer tech and the Automation world need to be examined through the lens of risk. Risk drives conservative approaches, which in turn have a damping effect on automation innovation (and with good reason). The market rewards reliability and quality, not speed and features that work most of the time, and that shapes the current offerings. However, I agree with Bill Lydon that the status quo is getting stale. The de-facto standard approaches have been on the Automation throne for a long time. The industry populace is restless and hungry for a new de-facto standard. Many movements to that end are afoot, and (r)evolution is on the lips of the young and old.
*Humor…Ha Ha
In the mid 1990s, I was on a Cogeneration Power project. We were relatively early in the design, and the group had just finished bid evals and purchased the control system hardware. It was at this time that the project manager (for clarity…not an engineer…a business manager) called the control systems group into a private meeting to announce that he was able to considerably reduce project cost and schedule because he had "got a copy" of the control system configuration (programming) from a previous power project he had been on. His plan for the current project was to just load it directly onto the system hardware when it arrived. What was left unsaid was that, now that we had purchased the hardware, his cost and schedule savings would be achieved by getting rid of the Automation Engineers (us).
The naiveté was truly shocking. He genuinely believed that you could essentially just run down to Circuit City (at the time still in business), get a copy of Microsoft Cogen 2.0 off the shelf for $99 (or a copy from the previous project) and the "Cogens for Dummies" book, and that was it.
Ah, if only life were so simple - control system software so Java-like and portable, and cogeneration plants built to an identical spec, with a mass market for Cogen Control System software so the cost of developing it would be divided among millions of users. The temptation for a snarky reply from my boss must have been nearly irresistible. Of course, wiser heads prevailed, and no snarkiness was even faintly evident in his tone while the project manager was politely and patiently educated on the ridiculousness of this plan (they were totally different plants, the control system vendors were different, etc., etc.). Still, I fantasized about what that snarky reply could have been…"Yes! Brilliant idea, sir! That is why they pay you the big money. And while we are down at Circuit City picking up that copy of MS Cogen 2.0, would you also like us to pick up the latest version of MS Project so you too can have the opportunity to save the project lots of money?"
About the Author
Shaun Luper has a B.S. in chemical engineering and has been working in process control in the EPCM (engineering, procurement, and construction management) business for more than 20 years. He has worked in the USA and abroad in Indonesia, Australia, Peru, and the Democratic Republic of Congo, and is currently working in Kazakhstan. His experience includes the Power, Pharmaceuticals, Mining and Metals, Nuclear, and Oil and Gas industries. Luper is married and has one son attending the University of Washington.