
HELP! We are 8 years into a 12-year study, and our original computers are failing. There is a serious possibility that the study will not be completed, because we cannot find computers which are compatible with our equipment and which can do the job.

We need to be able to measure and control scientific processes with millisecond accuracy. We can find nothing available which can do our task. Specifically we need to be able to do the following:

1. Collect real world analog data from 8 signal lines once every millisecond, over a period of 4 seconds.

2. Calculate a value from those data, and determine whether or not it exceeds a value determined at the beginning of the experiment.

3. Output the boolean result of the comparison to a digital port before the next millisecond ends.

4. When selected numbers of milliseconds have passed, output certain boolean signals to the digital port to operate equipment during different stages of the experiment.

5. After the 4 seconds are over, save the collected data to disk.
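To be concrete, the per-millisecond logic of steps 1-4 looks like the sketch below (in C, since that is closest to what the DOS code does). The derived value and the hardware I/O are stand-ins; our real equation and card registers are different.

```c
#include <assert.h>
#include <string.h>

#define CHANNELS 8
#define TICKS    4000                 /* 4 seconds of 1 ms samples */

typedef struct {
    double threshold;                 /* fixed before the run (step 2) */
    int    stage_tick[3];             /* ticks at which stage lines assert (step 4) */
    double data[TICKS][CHANNELS];     /* buffered here, written to disk after (step 5) */
} experiment;

/* Stand-in derived value: sum of the 8 channels. The study's real
 * equation is different; only the shape of the computation matters here. */
static double derive(const double *sample)
{
    double v = 0.0;
    for (int i = 0; i < CHANNELS; i++)
        v += sample[i];
    return v;
}

/* One 1 ms tick. Returns the boolean that must reach the digital port
 * before the next millisecond begins (step 3); *stage_bits carries the
 * stage signals for step 4. The actual A/D read and port write would
 * wrap this function. */
static int control_tick(experiment *e, int tick,
                        const double *sample, unsigned *stage_bits)
{
    memcpy(e->data[tick], sample, sizeof(double) * CHANNELS);  /* step 1 */
    *stage_bits = 0;
    for (int s = 0; s < 3; s++)                                /* step 4 */
        if (tick >= e->stage_tick[s])
            *stage_bits |= 1u << s;
    return derive(sample) > e->threshold;                      /* step 2 */
}
```

The whole of that has to complete, including the port write, inside each millisecond.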

We were able to do this under MS-DOS. But now we are not able to find a machine capable of handling this task. Every computer we have tried messes up on the timing.

- We tried a specialized computer running Unix and C++. If someone who was watching hadn't intervened, the equipment might have been destroyed. It changed the order of the experiment, collecting all of the data first and saving it to disk. THEN it did the calculations and output the boolean variable from the comparison. Last, it output the experiment stage variables. Somehow, the operating system did not seem to understand that the timing of all of these operations was important, and it rearranged them to process the data faster.

- Windows lets us do I/O only once every 55 milliseconds.

- We have similar problems in Mac OS.

- We have a little board level computer which can do this, which is currently on the market. It has a serial port so a computer can be its console. But it can't communicate with Windows. You have to have a DOS computer to use it.

- We tried E-Prime, but it has too much latency, with the output coming several milliseconds after the input, and sometimes the output is bunched up.

I originally posted this in the Quickbasic forum, and they said to try here.

5 Contributors, 14 Replies, 16 Views, 12 Years Discussion Span, Last Post by Real-tiner

DOS still runs fine on the fastest, most modern PCs. Why not stick with what works, but on new hardware?

Don't forget that a free copy of DOS is in every Windows system (well, it was easy under Windows 95 to reboot as DOS, not sure if that is still true on XP). But you should be able to find a DOS that runs on a new machine. For example:

http://www.drdos.com/


Hello,

I would look at a store in your area that deals with older equipment. Believe it or not, I ran across a shop here in Wisconsin that does a lot of work with Commodore 64 computers. Simple serial port testing stuff. They have a closet with about 40 of them, and if one breaks, they can go to the closet and get another.

It sounds like your software was written to do the timings based on a certain CPU/clock chip, and that the software is not very portable at all to other levels of hardware. I have seen that too, with Motorola programming software that requires ancient computers because of internal timing dependencies.

Time to invest in some older computers, and get a few of them working, as your environment ages.

Christian


Ah, I see, one of those old timing issues. Well, your local public school probably has hideously old computers that they would love to unload on you, especially if you bought them a brand new one to replace it.

Heck, Apple just released a $500 'Mac mini'; you could upgrade your local school to the latest, coolest machine on the planet and in return get some old 'piece of junk' that could be just what you want! You both win! And the Mac mini would be tax-deductible!


We tried most of those:

- You can't buy a new printer which works with DOS. We have only one printer left which works with DOS, and they just discontinued the ribbons for that. I bought up a stock, but they do go bad over time even if they are not used.

- We already scavenged all of the old PCs and DOS copies we could find. But most of those old PCs are getting unreliable, as they are all over 10 years old. Maintenance is becoming an expensive problem with them. We just spent over $300 getting older monitors (which are compatible with those computers) repaired. And most of the old copies of DOS I found are on 5.25" floppies. We can't get the drives anymore.

- I actually have a few Tandy Color Computers, which I got because the program rules over the operating system (which is how it should be, not the other way around like Microsoft did it). It is the only small computer I ever found which has no timesharing jiffy counter and no interrupt-driven I/O. But it is not fast enough to do the calculation before the 1 millisecond time expires. Also, the disk format is proprietary, and it can't run modern disk or printer equipment.

- Yes, we can boot DOS on new computers. I actually made some disks for that purpose. But the hard disk seems to be off limits to DOS, since it is not a FAT disk. None of the new printers work with DOS either, and DOS has no idea what to do with a CD burner. What good is running the experiment if we can't store, print, or use the data? We would have to read the screen and copy down the numbers.

- Our existing hardware is doing the timings, by signalling when the A/D conversion cycle is done. It has an internal 1 millisecond clock. The problem is that the Windows computer is off doing Windows mousekeeping during several of these 1 millisecond periods, and so the timing is botched up by the time Windows finishes doing its twiddling. Also, the card won't fit in the new computers.

- The professors who are running the study want to be able to use the latest analysis software. They also want point and click abilities. They are not very computer literate, and do not understand that adding that capability is part of what destroys the timing accuracy.

- The IT department wants all computers on campus to be standardized, and is threatening to take away our DOS computers and replace them with the uniform computer they have chosen. They already will not support the DOS computers.

- Some guy tried to sell us a system which uses "timestamping" to realign the data internally in the computer after the trial ends. But how do you timestamp the external chemical process, or the reaction time of a subject, to indicate that the signal should have gone out 23 milliseconds earlier than it did? Those extra 23 milliseconds have already ruined the process by that time.

I see new computers which are able to record and mix audio digitally. Is there some program similar to that, which can do analog data without 12, 20, 55, or 110 millisecond latencies??


Ah HAH! So you have ancient hardware boards that don't buffer up signals, which is why you need real-time response in a PC to read those signals from your hardware board.

How about investing in a new hardware device that IS compatible with modern computers? This may be impractical, but so was starting a 12 year study with computer hardware that couldn't change!

Ok, you mention that DOS won't deal with modern disks and such; well, you CAN format a Windows drive as FAT, and the DOS that came with Windows 95 was just fine with that. But then there's the printer problem.

So, here's an idea....

Suppose you have a sorta-modern PC running Windows 95 in DOS boot mode, so you have (A) newer hardware and (B) a low-impact OS. Now, you do not even attempt to print on that computer; maybe you don't bother to write to disk even. Instead you write to the serial port. That works great in DOS. Then, the serial cable goes over to ANOTHER modern computer, this one running Windows XP, and it takes the serial data, writes it to the latest in network drives, prints it on a laser printer, etc. Since the serial buffers are, well, buffered, the time latency is much less of an issue. Still, you don't want to be running massive programs on that PC either, I'm guessing.

By writing the data to a network drive, the professors can run their whiz-bang analysis software on their own desktop machines and access the latest data.

Your original question mentioned that you need REAL real-time. Well, that is true but you also have a bunch of other limiting requirements (like supporting old PC hardware cards) that limit your options. PC's are cheap enough now that you can afford to have some dedicated machines to this problem, I'm guessing (or else how important is this data, really?).

In summary:
- Some machine old enough to support your PC card but otherwise capable of running Windows 95 (maybe an old 486 machine).
- Boot Windows 95 in DOS mode to run DOS.
- Read the hardware card, add data to a buffer to be written to a serial port. Maybe you need several buffers queued up because of the other machine running Windows.
- Send the data to a second new machine running a new OS via the serial port.
- That machine writes the data to a network drive and prints to modern printers and the like; maybe it also writes the data locally for backup purposes.
- Professors and others access the disk data from the networked machine.
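If you go this route, the serial leg needs some framing so the XP side can tell good records from garbage and spot dropped data. A sketch in C; the format here (sync bytes, sequence number, additive checksum) is made up for illustration, not any existing protocol:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Illustrative frame for one record on the serial link: 2 sync bytes,
 * 2-byte sequence number, 2-byte payload length, payload, 1-byte
 * additive checksum. Gaps in the sequence number tell the receiving
 * machine that a frame was dropped. */
enum { SYNC0 = 0xAA, SYNC1 = 0x55 };

static uint8_t checksum(const uint8_t *p, size_t n)
{
    uint8_t c = 0;
    while (n--)
        c = (uint8_t)(c + *p++);
    return c;
}

/* Encode into out[] (must hold len + 7 bytes); returns frame length. */
static size_t frame_encode(uint16_t seq, const uint8_t *payload,
                           uint16_t len, uint8_t *out)
{
    out[0] = SYNC0;                out[1] = SYNC1;
    out[2] = (uint8_t)(seq >> 8);  out[3] = (uint8_t)seq;
    out[4] = (uint8_t)(len >> 8);  out[5] = (uint8_t)len;
    memcpy(out + 6, payload, len);
    out[6 + len] = checksum(out, (size_t)len + 6);
    return (size_t)len + 7;
}

/* Returns payload length on success, -1 on a bad or corrupted frame. */
static int frame_decode(const uint8_t *in, size_t n,
                        uint16_t *seq, uint8_t *payload)
{
    if (n < 7 || in[0] != SYNC0 || in[1] != SYNC1)
        return -1;
    uint16_t len = (uint16_t)((in[4] << 8) | in[5]);
    if (n < (size_t)len + 7 || checksum(in, (size_t)len + 6) != in[6 + len])
        return -1;
    *seq = (uint16_t)((in[2] << 8) | in[3]);
    memcpy(payload, in + 6, len);
    return (int)len;
}
```

The DOS side only encodes and pushes bytes out the port; all the verification happens at the XP machine's leisure.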


"Ah HAH! So you have ancient hardware boards that don't buffer up signals, which is why you need real-time response in a PC to read those signals from your hardware board."

BUFFERING AND TIMESHARING ARE WHAT CAUSE THE TROUBLE!

We need to respond in real time because we need to respond to a sudden real-world change within one millisecond. The buffer holds the data until it is too late. The process itself needs FEEDBACK from the computer at millisecond speed.

Think of it as being like a servo loop. Put a delay in the servo loop, and the servo goes crazy.
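Here is a toy simulation of exactly that, in C. The gain and the plant are invented for illustration, not our actual process; the point is that the same proportional loop that settles fine with immediate feedback blows up when the measurement is held back for 55 ticks:

```c
#include <assert.h>
#include <math.h>

/* Toy servo loop: the state x is driven toward 0 by proportional
 * feedback u = -gain * x, but the value the controller sees is
 * `delay` ticks old (delay = 0 models DOS-style immediate response;
 * delay = 55 models the Windows timer period). Returns the largest
 * |x| seen over `steps` ticks, starting from x = 1. */
static double servo_peak(double gain, int delay, int steps)
{
    double hist[64];                    /* measurement history; delay < 64 */
    double x = 1.0;
    double peak = x;
    for (int i = 0; i <= delay; i++)
        hist[i] = x;                    /* pre-fill history with the start state */
    for (int k = 0; k < steps; k++) {
        hist[k % (delay + 1)] = x;                      /* record this tick */
        double measured = hist[(k + 1) % (delay + 1)];  /* x from `delay` ticks ago */
        x -= gain * measured;                           /* feedback update */
        if (fabs(x) > peak)
            peak = fabs(x);
    }
    return peak;
}
```

With delay 0 the error only ever shrinks; with delay 55 the correction keeps arriving for a state that no longer exists, and the oscillation grows without bound.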

I repeat again. The computer needs to read the data, calculate an equation, and RESPOND to changes, all WITHIN ONE MILLISECOND. We cannot wait for a buffering system or for Windows to get done with its mousekeeping.

We even tried to do the calculation in analog circuitry. The problem is that the analog circuit can't do the changes the experimenter wants to do from trial to trial. It has to be recalibrated for each new experimental condition. The experimenter wants to make the changes within half a minute, and it's a two-hour job to recalibrate. Some of the preparations we use go bad after an hour, and if you substitute another preparation, the previous data can't match it.

"How about investing in a new hardware device that IS compatible with modern computers? This may be impractical, but so was starting a 12 year study with computer hardware that couldn't change!"

Who Knew?

Nobody thought of Windows as much of anything but a toy when we started the design of the project. MS-DOS had been a mainstay, without much real change, for the 10 years before we started. Then Bill Gates got greedy, designed OUT the parts of DOS which were in the public domain, and started replacing operating systems every 3 years.

Besides, even Windows 3.1 could not do the job. In fact, an MS DOS computer would work this job fine, until Windows 3.1 was started on that computer (they used a spreadsheet to crunch the numbers). After you quit Windows, the timing was wrong, giving numbers which were off on the order of several seconds and making the servo loop go wild. You had to reset the computer and reboot DOS to regain accuracy. The Windows drivers were still running and eating up time. We almost damaged the equipment one time when somebody forgot to reboot.

Nobody makes such a board anymore, because boards which do what we need are automatically incompatible with Windows. Windows requires buffering, because it takes I/O in bursts. Those bursts destroy the feedback loop.

The basics of our science also prevent the use of buffering.

"Ok, you mention that DOS won't deal with modern disks and such; well, you CAN format a Windows drive as FAT, and the DOS that came with Windows 95 was just fine with that. But then there's the printer problem."

You CAN format a Windows drive as FAT, but if Windows is already installed, then you can't change the format. And we don't get to keep the Windows disks. IT management keeps those. And if we ask them for a new install, we automatically get XP, NTFS. Note that the business school runs IT management.

"Suppose you have a sorta-modern PC running Windows 95 in DOS boot mode, so you have (A) newer hardware and (B) a low-impact OS. Now, you do not even attempt to print on that computer; maybe you don't bother to write to disk even. Instead you write to the serial port. That works great in DOS. Then, the serial cable goes over to ANOTHER modern computer, this one running Windows XP, and it takes the serial data, writes it to the latest in network drives, prints it on a laser printer, etc. Since the serial buffers are, well, buffered, the time latency is much less of an issue. Still, you don't want to be running massive programs on that PC either, I'm guessing."

If we could get DOS and XP to run on the same machine, and find a place to put DOS files, we could write to disk, and then reboot in XP and print.

On the serial port, what do we configure the port as on the XP side? It would want to know what kind of device it is.

Windows 95 still runs the Windows timings in the DOS mode, so it can't be used. We already tried that back in ... 1996.

"By writing the data to a network drive, the professors can run their whiz-bang analysis software on their own desktop machines and access the latest data."

The existence of a network card on the data collection computer uses enough time through interrupts to destroy the critical timing - even if it is not installed on the DOS side. But I am content to save data to floppies - if I can. Our DOS boot system requires the DOS disk to be in the drive to work. We can't read the DOS files from the hard drive, because of the file system difference.

"Your original question mentioned that you need REAL real-time. Well, that is true but you also have a bunch of other limiting requirements (like supporting old PC hardware cards) that limit your options. PC's are cheap enough now that you can afford to have some dedicated machines to this problem, I'm guessing (or else how important is this data, really?)."

We can get all the dedicated machines we want, but we can't buy anything anymore which will do the job. They took all the stuff off the market. I even thought of collecting the data on two machines, one to control the experiment, and the other to save the data. But the response within one millisecond is the part which seems to have been engineered out of today's computers, because "business doesn't need it".

The kind of research we do is central to our department's continued existence. That kind of scientific research could become a thing of the past, because there isn't enough of a market for the kind of computer we need.


Well, it seems a bad design decision is coming back to haunt you...
You would do best to look into purpose-built hardware; think of PLCs, for example.
Or get some new data acquisition cards with their own logic controllers, which you can use in parallel with the computer they're in.

Believe it or not, your requirements aren't unique in the world, and someone somewhere will have thought of how to solve this and put it on the market.
If not, it may be time to contact the electronics engineering department and get them to design the hardware you need for you.

"Besides, even Windows 3.1 could not do the job. In fact, an MS DOS computer would work this job fine, until Windows 3.1 was started on that computer (they used a spreadsheet to crunch the numbers). After you quit Windows, the timing was wrong, giving numbers which were off on the order of several seconds and making the servo loop go wild. You had to reset the computer and reboot DOS to regain accuracy. The Windows drivers were still running and eating up time. We almost damaged the equipment one time when somebody forgot to reboot."

That tells me a lot about the quality of your system design, and none of it is good.

Go for a dedicated unit that does your stuff and buffers the data in an internal buffer when it's done with it.
Get a modern computer to read out that unit at its leisure.
Decouple the parts of your system so that changes in one have minimal impact on the rest.


This is a liberal arts and sciences college. We do not have an engineering school.

"that tells me a lot about the quality of your system design, and none of it is good."

No, it tells you about the rigors of the experiment.

Our original experimental hardware was a machine and computer package designed by a major scientific company and costing over $30,000. We just wrote the software for the experiment. It was the expected way to do things AT THAT TIME. The hardware card provided can do DMA buffering, but the machine needs that immediate feedback, so DMA is not allowed. That scientific company has since discontinued this kind of machine. They tell me that they discontinued it because they couldn't figure out how to make it work with new computers. There are currently no machines capable of handling the experiment on the market.

The setup was built before any form of Windows except 1.0 existed, and at the time Windows was just a toy for business. There was no timeslicing or time sharing on PCs at that time, except for the keyboard scan every 55 ms. Most computers did not yet have mice. The original computer which came with the first machine (we have 4) was an IBM AT. It, surprisingly, is the only original supplied computer left which still works.

The reason the numbers were off by seconds was that the software was designed to read in each data item as it became available, and provide the feedback value immediately. This worked fine, as long as the Windows drivers were not running. When the Windows drivers were running, the A/D converters in the machine kept cranking out the data at the 1 ms rate, but during the Windows timeslice, no values were read. There was a flag raised if the computer missed a value, but there was no way provided for the program to know how many values it missed. When we test-ran the card with a triangle wave, we discovered that it was alternating reading the values and not reading the values every 55 ms. And of course, the feedback was not only missing, but 55 ms late, letting the machine get way outside its normal limits.
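For what it's worth, that failure is easy to reproduce in simulation. In the sketch below (C), the program is blind during alternating 55-tick windows, just like under the Windows drivers. The free-running sample counter is hypothetical hardware our card did not have; with one, the program could at least know the exact size of each gap instead of just a lost-data flag:

```c
#include <assert.h>

/* The converter produces one sample per tick, but the program is blind
 * during alternating `window`-tick spans (the Windows timeslices). With
 * only a one-bit "missed" flag you know data was lost but not how much;
 * a free-running hardware sample counter (hypothetical here) lets the
 * program compute the exact size of every gap. */
static int count_missed(int total_ticks, int window)
{
    int last_counter = -1;              /* counter value at the last successful read */
    int missed = 0;
    for (int t = 0; t < total_ticks; t++) {
        int visible = (t / window) % 2 == 0;    /* blind every other window */
        if (!visible)
            continue;
        if (last_counter >= 0 && t - last_counter > 1)
            missed += t - last_counter - 1;     /* gap size from the counter */
        last_counter = t;
    }
    return missed;
}
```

Counting the losses would not have saved the feedback, of course; it only tells you afterward how badly the run was ruined.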

"Go for a dedicated unit that does your stuff and buffers the data in an internal buffer when it's done with it.
Get a modern computer to read out that unit at its leisure.
Decouple the parts of your system so that changes in one have minimal impact on the rest."

OK. Where do we get the dedicated unit? Because of the high failure rate, old computers are out. And, like I said, no scientific hardware is made for this experiment anymore.

What do you mean "Decouple the rest of the system?"

Maybe I could hire you to be shrunk and inserted into the process, carrying a little stop sign. Then you can hold up the sign every 55 ms and tell the little ATP molecules to stop reacting long enough for Windows to finish its mousekeeping.


Update:

- I emailed DR DOS. They have not returned my emails.

- Another old computer just trashed its hard disk. Replacements are impossible to find - nobody makes a drive SMALL enough to work under DOS anymore.

We really need a solution that works in a hurry.


There are several real-time, DOS-like operating systems available, such as RTOS:
http://www.smxinfo.com/index.html

These support modern disks and such; perhaps they could be a solution. Otherwise, it sounds like a real pickle, because
a) your hardware board limits your hardware chassis options
b) the real-time requirements limit your OS options

I'm just about out of ideas here, so I'll bow out of the conversation. Good luck to you!


I can get a new hardware board, provided I know what it fits.

RTOS is just another stupid multitasking system. Multitasking is what KILLS our application.

I need something which does NOT have multitasking in any form.



This is prob. no help at all but...several years back I was using an STM that was controlled by a pretty old DOS box. The system had two monitors...one to control the instrument and a second for graphics output. When we initially got the system, we bought our own computer and shipped it to the STM builder, who added some custom cards. To make a long story short, the computer died and we needed a replacement motherboard *identically configured* to the original. It was a Dell. It turned out that the motherboard was actually a Dell design (it had funky DELL insignia and patterns soldered on excess space on the motherboard!!!) rather than a third-party component. After some back and forth with Dell, they shipped us an *8-year-old BRAND NEW replacement motherboard* that they had sitting in some warehouse somewhere...just waiting for the desperate academic willing to shell out ~$500 for antiquated hardware. We got the part and everything worked fine.


I had an insight on this while studying how the human hearing system can respond to sounds up to 20 kHz under conditions restricting processing speed to much slower values:

- The maximum pulse rate of a nerve cell is about 1 kHz.
- The human brain runs at a speed in the range of 2 Hz to 10 Hz.

The cochlea of the ear does a mechanical Fourier transform of the audio signal, and sends THAT to the brain. The brain sends control information to the cochlea and ossicles, but that information is determined in advance by existing conditions. The cochlear nucleus then does the actual control function at a much higher speed. This gave me the idea.

I am currently working on a solution which uses an analog computing device for the critical calculation. As I see it, this will have:

- addressable digital output ports so the computer can set the parameters
- A/D converters to provide the analog parameter values for the calculation
- some analog calculus (PID) circuits to actually do the calculations
- a data collector card for the computer
- a trigger line to start the data collector card
- a switch so the user can lock out parameter changes during a trial
- one program to calibrate the parameter values
- one program to set the values for the next trial
- one program to collect one trial of the experiment
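For illustration, here is the discrete equivalent (in C) of what the analog PID stage would compute. The gains and the first-order plant below are placeholders, since the real calculation runs in analog circuitry with parameters loaded over the digital ports between trials:

```c
#include <assert.h>
#include <math.h>

/* Discrete stand-in for the analog PID stage. Between trials the
 * computer would write kp/ki/kd through the digital parameter ports;
 * during a trial the loop runs with no operating system in the path. */
typedef struct {
    double kp, ki, kd;          /* parameters set before the trial */
    double integral, prev_err;  /* controller state */
} pid;

static double pid_step(pid *c, double setpoint, double measured)
{
    double err = setpoint - measured;
    c->integral += err;
    double deriv = err - c->prev_err;
    c->prev_err  = err;
    return c->kp * err + c->ki * c->integral + c->kd * deriv;
}

/* Drive a toy plant (y accumulates 5% of the control output each
 * tick) for `steps` ticks; returns the final plant output. */
static double pid_run(pid *c, double setpoint, int steps)
{
    double y = 0.0;
    for (int k = 0; k < steps; k++)
        y += 0.05 * pid_step(c, setpoint, y);
    return y;
}
```

The computer only touches the parameters between trials; during the trial the loop itself has no buffering and no timesharing in its path, which is the whole point.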
