I recently went to a factory and saw a PLC I programmed 12 years ago; it runs 24/7/365 (a Siemens S5, but it could just as easily have been AB, etc.). At the same factory they run a recently installed sterilising system under XP on good brand-name industrial PCs - it crashes every week, and the systems company practically lives on-site. They recently added a small PLC which cycles power on a watchdog check; this allows a return to on-line operation in just a few minutes, unless the crash falls during a critical measuring time, when the restart can take 45 minutes. Incidentally, I have quoted for a PLC version of this system; I'll likely get it too.
I'm not an OSS zealot - I would never try to play Tomb Raider on a PLC, or trust a critical process to Windows. Yes, Curt and Jiri, Linux (my Linux box has been crash-free for 17 months), but I've looked at all the offerings and they are nowhere near usable yet. I'm experimenting at the moment with an S7-200 and IT module connected to a Linux PC running Firefox, with all the processing still done in the PLC - it's as far as I'll go at the moment, because if I make a mistake, 20,000 litres of milk could end up as yogurt!!
Indeed, that's my whole point. People won't feel good about PC reliability until 5 years _after_ they dump Windows. That gives the PLC folks a pretty good edge. Of course, Windows infelicities cause them a good deal of grief as well. The first PLC vendor to support Linux will have quite a competitive advantage where reliability matters. What I think is hilarious are the folks who argue to the death about the reliability of PLCs and then build systems dependent on Windows or Windows technology. In a lot of applications, it doesn't matter if the PLCs keep on truckin' if your people can't see what the PLCs are doing. Obviously, reliability isn't very important yet. I think there will be tremendous synergy when you install Linux and PLCs together, and which you choose for which role will depend on what needs to be done and the I/O count. I have already seen the extraordinary power of this approach. It makes you want to tell people about it.
It's just not fair!
IF an SLC5/nn had to run or drive a scanner, a digital camera, a color screen, a graphic printer, and a keyboard (you know where I'm leading you...)
AND IF Bill Gates were able to make us connect ONLY a limited number of devices which HE had built,
THEN who would crash more often?
I run Wintel boxes for months and years without any BSOD. Unfortunately, when I do get in trouble it's often really hard to find out why. Most of the time it's human-related: either the operators or coders have been "playing" with my systems, or they migrate the application to another PC.
Never do I find a Wintel PC that runs well suddenly go bad.
My first choice is PLC controls with a SCADA. This way we can reboot safely and no one notices.
Win2000 is very stable. WinXP is very stable. Even Win98SE was stable.
BUT DON'T PUT ANY CRAP IN THEM.
Pierre Desrochers Montréal, Canada
Unfortunately for all of you die-hard Linux users and believers, Windows is a very stable platform. I have several (15 to 18) very large water districts (20-50 MG/day usage) and the associated plants, pump stations, wells, tanks and other associated equipment or systems running Motorola Moscad tied to XP- and 2000-based SCADA computers running iFix or Fix32, and there has never been a Windows-related crash or failure - ever, in over 5+ years. So much for the 2-day MTBF comments.
Linux is not the end-all of operating systems, and from what I have seen so far, Linux systems require 10x more development time and expense, are harder to get support for, and the developers usually fall off the planet after a couple of years.
You can go ahead and develop and install Linux systems, but all of my customers are sold on rock solid Windows based systems - period!
I suspect that in a few years, many will be bashing Linux for its instability and the lack of drivers, applications and similar related comments we now hear about Windows. To each their own.
The only other thing I can say is: stop whining about Windows. I don't whine about Linux all the time, even though my experiences have been bad in every case. In fact, I have seen more Linux systems replaced by Windows systems for a host of reasons, just as I am sure you have seen the opposite.
Personally I think you're all sore because Microsoft is more widely accepted than Linux and will be for some time to come; by the time it is not, I will have retired and will be fishing somewhere up north.
I call "bullshit" on 5+ years of experience with XP and 2000.
As for the 2 day MTBF comments, those are derived from the manufacturer's published figures. Such as they are. I suspect that as far as most people are concerned, the manufacturer's "5% crash often" figure will weigh more heavily than your anecdotal report.
I agree, but I was waiting for the Windows folks to call him on that one. I don't think anyone would believe that since almost everybody uses Windows.
Win 2000 and XP for 5+ years? If this is 2004, did I miss some early release of 2000 or XP that would have come out 5 years ago? Or maybe a time warp?
A PLC has been optimized for use in an environment
where there is limited sophistication in maintenance
and setup. There is typically very little
configuration involved with PLC cards - most of them
are plug and play - something easy to do if you are
using cards which all come from a single vendor for a
single hardware platform. So very seldom are there
issues with bad drivers.
From a programming standpoint, RLL is a simple
language with a huge population of competent
maintenance personnel - so unless you need the
flexibility of an open system, or want a lot of power
for not a lot of money, why retrain your maintenance staff?
Delta Tau Data Systems
Because Microsoft poisoned the well with instability, and this has caused many people to think PCs are unreliable. Also, suitable I/O and software have been hard to come by until recently. But the point is becoming moot, as many recent PLCs are essentially a PC under the covers. Not that PLCs aren't a good solution; it's just that PCs get a bad rap from the Windows experience. Many large automation systems ran and still run on UNIX hardware, and a PC with Linux is as reliable and vastly more powerful than, say, a VAX. So it's mostly perception, and the fact that vendors have made PLCs very convenient, if somewhat pricey. I expect a trend towards more PCs as Windows fades. They can do a lot more, with higher bang/buck and far greater flexibility.
Having used Windows platforms for automation since Win
NT 3.51, I would have to take issue with Curt's
statement about Windows instability. When properly
set up and administered (i.e. lock the operators out
of everything except for the absolute minimum of
required apps) I have found Windows to be more stable
with each iteration - and XP Pro is the most stable of
the bunch. If the system administrator and programmer
do not have the requisite level of competence, then
there can be significant stability issues. The
biggest problem I have seen in 13 years of Wintel
automation programming is that there are far too many
people out there who think that since they can log on
to AOL and chat at home, and maybe install a USB
camera driver, they have the knowledge required to set
up and administer a factory system.
I have also programmed and administered HP-UX and
Solaris based automation systems - and I found both to
be only a little more stable than the Wintel systems
from Win 2000 Pro onwards. Not to imply that
they were problematic, but only that they were not
much better than a properly set up Wintel system. I
find it unlikely that Linux is any more stable than a
properly set up XP Pro system.
Delta Tau Data Systems
With regards to Davis Gentry's comments on Windows crashes:
I believe that it was widely published in the IT press last year that according to Microsoft's own figures from their new automatic reporting system, 5 percent of Windows installations crash more than twice per day. They did not reveal the figures for what proportion of Windows PCs crashed at various less frequent intervals.
These figures are for the most recent version of Windows only, as there was no equivalent reporting system in previous versions. Presumably we can conclude from your statement that the situation was worse for Windows NT4 or 98.
However, the people I know in the IT business would dispute your statement about Windows XP being the most "stable" so far. They all seem to be of the opinion that Windows 2000 was much better in that regard, and that Windows XP is a distinct step backwards. This is just opinion, though. All we really know for sure is that 5 percent of Windows XP installations crash more than twice per day.
Whether this level of reliability is acceptable of course depends upon the application. In its intended market as an OS for e-mail and entertainment applications, a few crashes now and again are probably of no consequence.
London, Ont. Canada
This is a little unfair; the uptime project, where users try to keep machines up for as long as possible, currently gives the average uptime for Windows XP as 2 days.
Well the Win2k I use at work does a little better than that. Sometimes more than a week :^). Between the servers and the desktop, the average is probably 3 or 4 days anyways. Eerily long for Windows, but I don't think I'll be switching my home and business machines over to MS anytime soon. Do they have an uptime command for Windows? It'd be interesting to get an accurate figure, I only see the problems on my shift. My new Linux server has everything there beat except the Linux/IBM mainframes.
There is a freeware one at:
Neat! Considering the wild assumptions I made in my calculations, I think I can feel quite happy that my figure (1.2 days) is so close to yours.
Do you have the other figures for the distribution? How much does it deviate from Poisson? Is there correlation between successive uptimes on the same computer? (How much?)
In any case, though, this is largely academic in this forum; 2 days MTBF is ridiculous.
There seems to be a lot of Windows bashing here, when the question was about PLCs and PCs. I have used both over the years; each has its own advantages and disadvantages. The biggest differences are cost, development time and integrated I/O. Many users decide to go with industrial PC-based control systems because a PLC may not handle the extensive amount of communications required to control the various systems of a modular piece of production equipment.
As far as Windows being unstable, I have 3 systems running at home: a Win3.11 (which has run non-stop for 6 years and 7 months, with no crashes, no lockups and no viruses), a Win98 system that has been online for over 4 years, and my own XP Pro system, which has been up without failure for over 1 year. So much for the 2-day uptime comment. It is not always Windows that is the root cause of the lockups and "crashes"; it is typically related to application issues, memory leakage, inadequate system administration, driver conflicts and operators surfing the web with their SCADA machine.
As far as other applications with Windows-based SCADA systems, I can provide anyone who is interested with a list of customers with large SCADA networks which have operated without a Windows-related failure for 5+ years.
I have also found that keeping software up to date, running anti-virus software, maintaining your HDD and doing a little machine-related upkeep goes a long way towards keeping a machine up and running. Having a UPS and modem line protection, keeping SCADA boxes off the internet, and not allowing operators to play games or load software is the best way to ensure a machine will remain fully operational and clean for its intended purpose.
As far as Unix- and Linux-based systems, there are pros and cons, and yes, they are a little more stable. But given my experience, I will still recommend, install, maintain, support, purchase and program SCADA systems for and around Windows-based machines - 90+% of all software and drivers are designed for Windows, there are more Windows-based developers than for other operating systems, and I can get worldwide support for any Windows-based SCADA application, and for drivers as well.
Of the 5 Linux systems I have had the headache of working with, only one had a support group, and it took 5 days to get someone to call back. The other developers had fallen off the planet, and getting driver problems resolved was a nightmare.
My point is this - if you don't maintain it, it will fail. More people put more effort into maintaining their car and home than they do their computer, and when the computer fails they blame the software. I have seen more failures related to power surges, dust, overheating, viruses and spyware than Windows-related issues.
Now back to the poster's question - PLCs offer a standalone, dedicated control solution which provides integrated I/O for a reasonable cost and is scalable as the user's needs grow.
A PC must be outfitted with dedicated software, has limited I/O capabilities, may be prone to incompatible software and driver issues, and has a higher investment cost and lower ROI. It is typically much larger, uses more power, and can be more time-consuming to troubleshoot and repair. It is also more difficult to get support for PC-based systems than for a PLC.
So, for all those out there bashing PCs: don't blame Windows, Bill Gates, Microsoft or the various other application developers and manufacturers. You may want to look at how your customer is using their SCADA machine - keep it clean, protect it from power-related issues, install anti-virus and anti-spyware software, update your drivers and applications, and keep your SCADA box off the internet.
Actually, I believe it was Windows 95 and 98 which crashed after 49.7 days. The crash was caused when the 32 bit milli-second uptime counter rolled over. The only way to avoid the crash was to re-boot before that happened. It took years for anyone to notice this problem. Not too many people ever had to worry about reaching that limit in practice though.
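For anyone who wants to check the arithmetic behind that 49.7-day figure, it falls straight out of a 32-bit millisecond counter. A quick sketch in Python:

```python
# Win95/98 kept uptime in a 32-bit counter of milliseconds.
# The counter wraps after 2**32 ms, which is where the famous
# 49.7-day figure comes from.
ms_per_day = 1000 * 60 * 60 * 24        # 86,400,000 ms in a day
rollover_days = 2**32 / ms_per_day
print(f"2**32 ms = {rollover_days:.1f} days")   # -> 49.7 days
```

So any machine left running for about seven weeks straight would hit the rollover.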
London, Ont. Canada
You should apply at Microsoft; they obviously could use your talents. I'm sure Bill Gates doesn't see that kind of reliability, so you must know something they don't.
So you want us to believe that Windows 98 will run continuously for 4 years in a row, and that the way to do this is to maintain it by keeping the software up to date. Yet when I look at the patches released for Windows 98, several have been released in the last 4 years, and they required that you reboot your computer to implement them. Thus you are contradicting yourself.
That is the place to get the uptime command for Windows. However, even this simple program is broken: it reports my system up for 267 days, but I can see in the event log that I rebooted a few days ago.
[snip from screendump]
06/04/2004 11:43:18 AM Boot Prior downtime:0d 0h:1m:37s
06/04/2004 2:35:10 PM Boot
Current System Uptime: 267 day(s), 10 hour(s), 15 minute(s), 9 second(s)
Estimate based on last boot record in the event log.
See UPTIME /help for more detail.
Total Reboots: 119
Mean Time Between Reboots: 2.25 days
Total Bluescreens: 3
There is insufficient data in the event log to calculate
system availability. Please see UPTIME /help for more detail.
They've declared their system Open, they've declared their system costs less, they've declared their system secure. What reason would they have to doubt that people will believe the uptime figure?
Was it also reported what percentage of those crashes were on machines improperly set up and/or administered? I have seen VERY few crashes on XP Pro, and the experience of the developers working for my company indicates that XP Pro is very stable. My laptop has been running XP Pro for over a year now, and is used for code development in VC++ and VB, not just operation. I also do debugging of code which my
clients write, and while I have had apps crash, I have almost never had the OS crash. So I cannot comment on the rest of the XP Pro machines running out there, but in our experience a properly set up and maintained XP installation is a very stable platform.
Delta Tau Data Systems
On March 25, 2004 12:02, Davis Gentry wrote: <clip>
> Was it also reported what percentage of those crashes
> were on machines improperly set up and/or
No, there were no further details of interest. The reports were from a speech by Bill Gates where he mentioned in passing the results of their new automatic crash reporting system. The 5% crashing twice a day number for Windows was significant, because this was the first large scale statistical sample from real world use.
Prior to this, most people simply had their opinions or impressions from what they recalled seeing with a handful of PCs. There are lots of reasonably reliable statistics about server uptimes, since this is an important business statistic and there are people logging this data. However, server use is different from desktop use, and furthermore Windows XP does not have a server version (i.e. there are Windows 2000 and 2003 servers, but no Windows XP server) to compare to.
> I have seen VERY few crashes on XP Pro,
> and the experience of the developers working for my
> company indicates that XP Pro is very stable.
I won't argue with this, although I'll say that I thought Windows 98 wasn't all that bad either. I will also say that the "reliability" issue can be overemphasized. The question really ought to be whether the frequency and consequences of crashes are tolerable for a given application.
> laptop has been running XP Pro for over a year now,
> and is used for code development in VC++ and VB, not
> just operation. I also do debugging of code which my
> client's write, and while I have had apps crash, I
> have almost never had the OS crash.
I won't dispute your experiences with this, although I would wish to state that computer programmers are usually the wrong people to ask about things like this. The problem is that they are more like the people who wrote the software (Windows), and less like the typical people using it in other applications. They tend to do the same sorts of things which the original developers did, and so are less likely to find the other bugs which crop up under different circumstances.
> So I cannot
> comment for the rest of the XP Pro machines running
> out there, but in our experience a properly set up and
> maintained XP installation is a very stable platform.
I think that in this last sentence you have more or less answered the original question. You have stated that you have a great deal of expertise in setting up and maintaining Windows systems, and that it takes this level of expertise to make Windows XP reliable. However, for most people involved in the automation business their expertise is in other areas and they don't know enough to set up or maintain such a system. That is why they use PLCs instead of PCs with Windows. Greater use of PCs in industry will require, among other things, the use of operating systems which are reliable when set up and maintained by people who are not experts.
London, Ont. Canada
Well, *if* the crashes were random, that would give an MTBF of 1.2 days. It's not clear from the press report whether those days are 24-hour or 8-hour, thus giving an MTBF of either 29 hours or 10 hours respectively.
That would give 44% not crashing on any given day, 36% crashing once, 15% crashing twice, 4% crashing three times, 1% crashing 4 or 5 times, and a negligible number crashing 6 or more times.
The crashes are unlikely to be random, so the real percentages are likely to be different; for instance, I would expect it to be more tail-heavy. However, this is the best one can do with the figures given.
As a manufacturer-reported MTBF, it's certainly cause for thought.
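For anyone who wants to reproduce those percentages, the arithmetic can be sketched in a few lines of Python: assume crashes are Poisson-distributed, solve for the daily rate at which 5% of machines see more than two crashes, and the MTBF and per-day probabilities fall out. (The 5% figure and the randomness assumption come from the discussion; everything else is just arithmetic.)

```python
import math

# Reported figure: 5% of installations crash more than twice per day.
# Under a Poisson model, find the rate lam (crashes/day) such that
# P(X >= 3) = 0.05; then MTBF = 1/lam.

def p_at_least_3(lam):
    # P(X >= 3) = 1 - P(0) - P(1) - P(2) for Poisson(lam)
    return 1.0 - math.exp(-lam) * (1.0 + lam + lam * lam / 2.0)

# Bisection: p_at_least_3 is increasing in lam
lo, hi = 0.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2.0
    if p_at_least_3(mid) < 0.05:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2.0

mtbf_days = 1.0 / lam
print(f"lambda = {lam:.2f} crashes/day, MTBF = {mtbf_days:.2f} days")
for k in range(6):
    p = math.exp(-lam) * lam**k / math.factorial(k)
    print(f"P({k} crashes/day) = {p:.0%}")
```

Running this gives a rate of about 0.82 crashes/day, an MTBF of about 1.2 (24-hour) days, and the 44% / 36% / 15% / 4% split quoted earlier.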
I can't discount the possibility that in 15 years
or so I've never seen a properly set up and
administered Windows PC. So, I'll take you at your word. I sure have seen a lot of the other type though. Glad my Linux boxen aren't that fussy.
On March 19, 2004, Davis Gentry wrote:
> Having used Windows platforms for automation since Win
> NT 3.51, I would have to take issue with Curt's
> statement about Windows instability. When properly
> set up and administered (i.e. lock the operators out
> of everything except for the absolute minimum of
> required apps) I have found Windows to be more stable
> with each iteration - and XP Pro is the most stable of
> the bunch. <
So in other words, the Windows platforms have not been
stable, but if current trends continue, some day the
windows platform may become stable.
> If the system adminstrator and programmer
> do not have the requisite level of competence, then
> there can be significant stability issues. The
> biggest problem I have seen in 13 years of Wintel
> automation programming is that there are far too many
> people out there who think that since they can log on
> to AOL and chat at home, and maybe install a USB
> camera driver, they have the knowledge required to set
> up and administer a factory system. <
I would go on to say that most 'professional' system
administrators who focus on Wintel systems are not
qualified to administer a factory, or even the IT
systems they are hired to maintain. A lot of them got into
the business because IT was a hot commodity that was
hiring lots of people. They got into it for the money, not
because they love the field. It still amazes me how many
IT people think Unix/Linux is hard.
> I have also programmed and administered HP-UX and
> Solaris based automation systems - and I found both to
> be little more stable than the Wintel systems
> including and post Win 2000 Pro. Not to imply that
> they were problematic, but only that they were not
> much better than a properly set up Wintel system. I
> find it unlikely that Linux is any more stable than a
> properly set up XP Pro system. <
A properly set up Windows system that is stable
is a lot like Santa Claus: there are a lot of people with
childish notions who believe it exists, due to deception
from a lot of other people who know better.
On March 24, 2004, Mark Blunier wrote:
> So in other words, the Windows platforms have not
> stable, but if current trends continue, some day the
> windows platform may become stable. <
I think that my English was fairly clear - I have personally set up and administered Wintel systems which have been and continue to be stable.
> A properly set up set up Windows system that is
> is a lot like Santa Claus. There are a lot of
> childish notions that believe it exists, due to
> from a lot of other people that know better. <
I think I stated this problem fairly clearly as well - there are a lot of people out there who are setting up Wintel systems who should not be. Under those circumstances, then yes, a stable system (Wintel or other) is much like Santa Claus - it shows up once a year for a short period, or so it has been reported, if never documented.
Delta Tau Data Systems
PLCs have an architecture optimized towards servicing I/O, while PCs are designed more for computing speed, with less real-time I/O servicing. The applications PLCs are used for require near-real-time processing of I/O, which PCs will never be able to do.
Now that's silly too. Lots of QNX, VxWorks and RTLinux users, as well as other RTOS users, would dispute that - after they finished arguing about what near-real-time means :^) Even "normal" Linux can meet 10 ms deadlines with kernel preemption to a very high degree of confidence. In fact, Wind River just linked up with Red Hat and is aggressively pursuing the Linux business:
This is the very perception I'm talking about. Most would dispute the "not built for reliability" part as well, especially when you configure a PC for reliability - or use an embedded-class PC.
???? This is the dumbest thing I've heard all week - it's now going to rattle around in my brain all day.
There are tens of THOUSANDS of PCs doing 'real-time I/O servicing' - I have three on my desk now, two of them 'embedded' PCs, one is even synching several drives at 1ms cycle times and only about 3us jitter.
Get your facts straight before making global generalizations, especially ones that are dead wrong.
Robert Trask, PE
Los Angeles, CA
It's because a PLC is a dedicated PC for industrial control. It has a number of digital and analog inputs/outputs and a system to handle them.
Simply put: where you need a truck, you will not prefer a Mercedes-Benz.
Hope you get the difference.
A PLC programmer
Here are two reasons.
1. PLCs are the direct descendants of the pre-solid-state relay control panels that were used in all control applications. So by inertia, PLCs have an advantage.
2. The architecture of a PLC is especially well suited to handling a large number of simultaneous parallel functions (the kind that are so common in industrial control applications) with a high degree of reliability. Even though the PLC is internally implemented as a sequential processing device, it appears to the user as a highly parallel processor.
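To make that "sequential inside, parallel outside" point concrete, here is a minimal sketch in Python (the function names and the two rungs are invented for illustration): each scan freezes the inputs into an image table, evaluates every rung against that snapshot, and only then exposes the outputs, so the rungs behave as if they all executed at once.

```python
def scan_cycle(inputs, rungs, outputs):
    """One PLC scan: snapshot inputs, evaluate all rungs, commit outputs."""
    image = dict(inputs)          # input image table (frozen snapshot)
    for rung in rungs:            # rungs execute sequentially...
        rung(image, outputs)      # ...but all against the same snapshot
    return dict(outputs)          # output image at the end of the scan

# Two "simultaneous" interlocks, expressed as rung functions
def motor_rung(img, out):
    out["motor"] = img["start"] and not img["estop"]

def alarm_rung(img, out):
    out["alarm"] = img["estop"]

result = scan_cycle({"start": True, "estop": False},
                    [motor_rung, alarm_rung], {})
print(result)  # {'motor': True, 'alarm': False}
```

Because inputs are only sampled once per scan, no rung can observe an input changing mid-cycle - which is a large part of why RLL programs are easy to reason about.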
Embedded Systems Consulting
I think it's simply because the PLC is really meant for process and automation control. With its I/O and special function modules, it takes less time to program, because one only needs to be concerned with sequence and loop control. Commercially, a PLC is more expensive, but the way it has been designed makes it more reliable in the long run. If what a PLC does were passed to a PC, the PC would be overloaded with the rapid processing work required for industrial control.
That last statement, Sir, is absurd. If you look at the throughput of a typical PLC-class processor vs. typical PC processors, you'll begin to grasp where the problem is. It takes an almost inconceivable amount of bloat and cruft to slow a PC-class processor to the point that it can't scan 10 times the I/O, 10 times faster than most PLCs. And only running the world's _least_ reliable software makes the reliability argument credible, in my _real world_ experience. Indeed, a typical PC processor with reasonably efficient software could easily saturate any automation backplane I've seen, and nearly any network designed for automation as well. If you were to port the PLC software directly to a PC - which is probably quite doable, since many PLCs run x86 engines these days - all your perceptions of PC speed and efficiency would be blown out of the water. The reliability would amaze you as well.
My point is: if you tried to run the equivalent of Windows on your PLC, it would be both extremely slow and extremely unreliable. But that is not a problem with the hardware. A typical PC with rotating disks and fans might give away 60% of the MTBF of a PLC - more in a really bad environment. But without those, the two should be much closer, as the electronics are more similar than different.
The other 99% of the "failures" come from simply running the default software rather than making an intelligent selection _for_ reliability. That's why the telcos and most ISPs and financial exchanges and the massively parallel supercomputer folks don't run Windows. They choose software for reliability.
In other words, if your PC is slower and much less reliable than your PLCs, you have a dire need for different software. But don't blame the PC. Borrow a Tivo and do the math :^)
In my experience this is simply not true, what makes you say this?
Because PLCs are designed for automatic control (relay replacement) and PCs are designed for human interface (video, keyboard, pointing device) and data storage (rotating media).
Hope this helps!
Rexel / Central Florida
It is simply because PLCs can operate 20 years without any major problems! What PC can do that?
Go to the main page (http://www.control.com), and on the lower right side (under "Thermal Overload") there is an article called "PC based control vs PLC". Or just click on this link: http://www.control.com/941648158/index_html
There are over 50 comments on the subject.
My experience is that a PLC is a much tougher platform and will hold up with industrial apps.
Additionally, Microsoft constantly changing the platform causes nothing but long-term problems.
We have PCs on the floor that had DOS 2.2 running on them; when a PC failed, finding a replacement was very tough and could be very expensive.
Most newer hardware, i.e. temperature controllers etc., does not communicate with old PCs and software, and that is a major pain.
On units that I have replaced, we use a PC for data collection and PLCs to perform the work. This has been very successful and very cost friendly.
Proven reliability is the main reason. Another is that the main operating systems from MS have had problems, and the reliability of PCs is nowhere near as good as a PLC's.
1) PLCs have rugged housing, broader temperature range, and no cooling fans.
2) PLCs have I/O capabilities. Adding I/O to a PC means integrating a third-party solution. (It also typically means opening the case of the PC; it's much easier to add a card to a PLC rack.)
3) PLCs don't have harddrives, and do have battery backed and/or non-volatile memory.
4) Historical reasons. Why change? This also encompasses training issues, e.g. your maintenance people all know Allen-Bradley but can't change a video driver.
5) Security. Turn the key on a PLC to "Run", then pull the key out. Now nobody can change anything going on in the PLC without the key. PCs tend to be harder to lock down.
6) Stability. This is a perception problem as much as anything; I've seen plenty of rock-solid PCs on the factory floor. Still, if you buy a PC and it screws up, your bosses will ask you why you didn't use a PLC.
7) Boot time. PLCs come to life much quicker than PCs.
Naturally, there are people who will disagree with this list, and some who will say "You forgot reason X!" Still, this gives you an idea.
Incidentally, I'm a big fan of PC-based control, and my company has been doing it for almost ten years now.
Sage Automation, Inc.
Sorry to disappoint you, but this is just an urban legend Linux users mindlessly promote.
I've had many Linux "deaths" over the last two years that are just as serious as a "blue screen of death". In fact, I stopped using one RedHat Linux distribution a month ago on an old IBM system with an AMD K6 processor because it had the bad habit of suddenly thinking it was running on a Mac. I'd get the "Sick Mac" icon in the middle of my display (you know, the Mac with an ice-bag on its head...) and a fatal error popup because it couldn't find some PowerPC drivers. The screen wasn't blue, but the PC sure was dead - at that point there was no way to dump a core or anything useful, because I don't have a Mac. I don't have time to troubleshoot Linux kernels to fix such deaths, especially when I have no details to offer the makers of the distro so they can fix it either.
- LynnL, www.digi.com
Sounds like someone's sense of humor. Where in the world would a Mac icon come from? I don't doubt you, Lynn, but that's about the weirdest thing I've heard of. The only thing I can guess is that it misidentified the processor - Linux does run on the newer Macs, and AMD had a signature problem a while back. I'll bet a run through Deja News would find it.
Sounds like you encountered the BSOD screensaver in the xscreensaver package... "Systems depicted include Microsoft's Windows 95 and Windows NT, Commodore-Amiga's AmigaDOS 1.3, SPARC Linux, SCO UNIX, the Apple Macintosh (both the MacsBug debugger and the rarer "Sad Mac"), and the Atari ST."
XWindows may have crashed, but the system was probably still OK. If you couldn't recover control from the console, you could always telnet/ssh in to kill the offending process.
And I am sure that the average machine operator has
root access to do this... I suspect that the action
taken would be the same as if/when Windows freezes -
reboot and keep on gettin' it.
Probably because the PLCs arrived first in the automation business. But nowadays PCs are being preferred for automation tasks because they can store a lot of acquired data and export that data to any other PC (over the internet) or to any software program. Besides, it is a lot easier to develop a program on a PC than on a PLC.
One simple reason - open a PC you bought just 6 months ago and see how many of the EXACT same cards and components you can buy today. I doubt many of them are still available; everything is "new and improved". Functionally, a new Ethernet card or CD-ROM drive works the same, but if
you start experiencing subtle problems in 1 of 20 industrial PCs, it will be hard to rule out the diverse hardware. True "industrial PC" suppliers hold component supply constant, but ultimately their price will be higher, so users will decide buying a "new" CD-ROM for $29 is better than an exact replacement for $169.
I remember working with a compressor company in Singapore using industrial PCs for some realtime "call-for-service" systems, and the field techs even complained that different models needed different screwdrivers to service. And the modem cards - some had the "line" RJ11 as the top of 2, while others had it as the bottom of 2. ;^) These things sound small, but when you are supporting hundreds of sites - and may have to DRIVE BACK to one if you goof up reconnecting cables - these kinds of inconsistencies can be costly.
- LynnL, www.digi.com
That is a good and valid point. But I've only seen a couple of cases where I needed exact replacements. Both were shrinkwrap packages that only supported obsolete hardware because of driver availability, and they couldn't be recompiled for the updated OS. It's tough to find a Hercules video card anymore, or a 40 MB hard drive. But when you start experiencing subtle problems in 1 of 20 PLCs, you're stuck, because you know nothing useful about the hardware. In either case, complete replacement is probably most cost effective. OSS helps, but it doesn't completely mitigate the problem. I guess there is some price for progress.
To those who have less exposure to PLCs and have recently been using PCs, the best analogy is a tractor and a car. Most people nowadays are so familiar with cars that they do everything using their car. A limited number of people, though, still use tractors for very specific applications. There is a range of tractors, from a few horsepower to several hundred horsepower, built for specific uses.
Some jobs, like pulling a trailer, can be done using either a car or a tractor. But pulling a trailer on rugged terrain is a different matter for a car than for a tractor. Definitely the tractor would be the best choice in this situation.
Pulling a light baggage trailer on a highway may call for a car, for speed and maneuverability, because tractors are not built for this and never will be.
There may be huge advancements in PCs with regard to speed and processing power, but from an automation point of view this is trivial. The PC cannot replace the PLC in even one percent of current applications, because it is not built for reliability, as some have already pointed out. It is not built strictly with industrial-specification components, while PLCs are. It has horrendous hardware and software overheads before it ever processes I/O.
PLCs run crash-free 99.99% of the time, while PCs run only 90% of the time without hanging up.
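Availability percentages like these are worth translating into hours. A small sketch of the yearly downtime each figure implies (the 99.99% and 90% numbers are the claims above, not measured data):

```python
# Yearly downtime implied by an availability fraction.
# The 99.99% (PLC) and 90% (PC) figures are the claimed
# numbers from the discussion above, not measurements.

HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours(availability: float) -> float:
    """Hours of downtime per year for a given uptime fraction."""
    return HOURS_PER_YEAR * (1.0 - availability)

for label, a in [("PLC at 99.99%", 0.9999), ("PC at 90%", 0.90)]:
    print(f"{label}: about {downtime_hours(a):.1f} hours down per year")
```

Even granting the PC a far more generous 99% figure, that is still roughly 88 hours of downtime a year against the PLC's one.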
Just a few that I can think of right now:
1. The PLC was invented as a modular control solution right at the onset (this has to do with what it was meant to replace). The inputs are in modules, the outputs are in modules, the rack is another module and so is the power supply. Each PLC maker usually has a variety of CPU (and
communications) modules to suit a wide range of applications. The programming in modern PLCs also allows modular development.
2. There is a long term hardware manufacturing commitment by the maker, usually in the range of 7 to 10 years.
3. The hardware is tested to rigorous standards; the better manufacturers' modules withstand serious abuse (electrical & mechanical).
4. Components of the more recent PLC systems are hot pluggable.
5. There are fully developed on-line programming & diagnostic tools available.
6. There is a ready supply of competent electrician/programmers available. The big item that matters here is that the person standing at the PLC
programming terminal more likely thoroughly understands the electrical & control system that the PLC is interacting with.
7. While there are lots of people that like to argue about it, it's easy to do decent process control with PLCs, and it's done with the same hardware/software that is there for the rest of the system.
8. Decent PLC makers work on a continuum when they bring out new lines of hardware, building on previous models and capabilities. This allows gradual upgrading of the workforce and retains the value of installed interfacing.
9. There are more likely to be local training opportunities for getting into PLC programming. I know I have taken lots of electricians, techs and engineers (even mechanics) to competent PLC programming in a one-week
training course. I have not seen any comparable course that would take one of those guys to a competent computer programmer in a week.
10. There had to be ten at least, so let's make up another reason: competent sales staff. Where I buy my PLC hardware, I can get some help with applications problems (which is fortunately quite rare - but valuable when needed). Is your local computer dealer going to be of help (or the operating system vendor, or the application software vendor)?
Rockwell Solution Provider
Personally, I prefer using a PC (although I have used PLCs for years). With a PC and a comm card (such as a Profibus master) you can program in a single language (such as VB.Net) and have direct, fast access to the world (plant devices on a network, database access, operator interfacing, animation, a great debugging environment, a multitude of built-in functions and more).
I might also add that I have heavy-duty applications running on a standard Dell PC that have been running for several years without a glitch. These applications run a Profibus master card to I/O blocks, 8 servo axes, vision lighting controllers, pneumatic valve blocks and AC variable speed drives. Additionally, the PC has an imaging card for vision with live video, runs animated graphics, and reads and writes parameters to a remote database server (MS SQL Server).
If you try to do this with a PLC, you comparatively pay a much higher cost in hardware and software packages. The PC with VB or C offers seamless integration in a single language, whereas a PLC (which I have used for such applications) means many pieces to integrate. This often means a debugging nightmare when one vendor points the finger at the other because of a bug in software that you have no debugging access to, making it hard to determine who is the offender.
Just my take after over 25 years experience from relays to PLCs, to PCs.