Why is the reliability of analog computers low?

Actually, reliability is not the real issue with analog computers; indeed, they can at times be more reliable than digital ones, since they don't suffer from the round-off errors inherent in digital computation. Analog devices have the advantage of being continuous rather than discrete, which makes them better models for certain problems.
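To make the round-off point concrete, here is a minimal sketch in Python (the values are just illustrative): binary floating point cannot represent 0.1 exactly, so repeatedly adding it accumulates a small error that an ideal continuous device would not have.

```python
# Repeatedly add 0.1 in binary floating point. Because 0.1 has no exact
# binary representation, the running total drifts away from the ideal
# mathematical result of 1.0.
total = 0.0
for _ in range(10):
    total += 0.1

print(total)         # slightly off from 1.0
print(total == 1.0)  # False
```

The error is tiny here, but in long iterative computations such errors can compound, which is exactly the class of problem an analog computer sidesteps by working with continuous quantities.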

Reliability can be an issue, of course, in that most analog devices are mechanical and subject to wear. However, with well-machined parts, an analog computer is unlikely to wear out within its useful lifespan.

The real limit of analog computers is that they are always special-purpose devices; no practical Turing-equivalent analog computer has ever been designed. This isn't to say it isn't possible, but given the advantages of digital computing and the experience gained with it over the past 60 years, it is unlikely that any general-purpose analog devices will be developed. The only reason to use an analog computer today is if something needs to be modelled very accurately in a continuous manner, and since current digital devices provide very high-resolution (if still discrete) accuracy, that need almost never arises.
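As a sketch of that last point: a digital computer approximates a continuous problem, such as the differential equation dy/dt = -y, by stepping through it in discrete increments. The forward Euler method below (with an arbitrarily chosen step size, purely for illustration) shows how shrinking the step raises the resolution of the approximation, while the computation remains discrete, unlike the continuous physical quantity an analog machine would use.

```python
import math

# Forward Euler integration of dy/dt = -y with y(0) = 1.
# The exact solution is y(t) = e^(-t).
h = 0.001  # step size: an arbitrary illustrative choice
y = 1.0
t = 0.0
while t < 1.0:
    y += h * (-y)  # discrete update approximating the continuous derivative
    t += h

# y now approximates e^(-1) to within roughly h/2 relative error.
print(y, math.exp(-1.0))
```

Halving `h` roughly halves the error, so a digital machine can make the discrete answer as close to the continuous one as desired, which is why the discreteness rarely matters in practice.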