I am a lab engineer for a university.

Up until recently, we have been using MS-DOS 6.22 and QuickBasic 4.5 to control our experiments and record our data. We did not change to Windows because Windows would not let us control our processes at the millisecond level. This series of experiments is supposed to run over a 20-year period, and any large change threatens the continuity of the data collected.

But recently, we have had some problems:

- Many of our 486 and Pentium I systems have failed, and new computers are not compatible with our hardware.
- We just lost our last DOS compatible printer due to inability to buy ink cartridges for it.
- The company which made our hardware for the experiment has gone out of business (they said at the time that it was because their equipment could not be used on any modern computer).

We are now 15 years into this study, and are seriously wondering if we will be able to complete it. Our needs are as follows:

- We need to be able to collect data continuously over 4 seconds per trial, with eight 12-bit data points being read each millisecond. The data are saved to an array. The hardware we have controls the timing of the collection cycle, raising a bit every millisecond.

- The program must notice, within one millisecond, any abrupt change in one particular data item out of the eight being read. It must then time off an interval selected by the user, in exact milliseconds, and then activate an output bit to turn on a function on the hardware. (A rough sketch of this loop follows the list.)

- We store the data to disk after the trial ends.

- The hardware is ISA bus.
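
For concreteness, here is a minimal sketch of the acquisition loop as it runs under DOS, in C. The port addresses and the Borland-style inportb()/outportb() calls from <dos.h> are placeholders (our card's actual register map differs); it only illustrates the poll-read-watch-fire pattern described above.

#include <dos.h>
#include <stdlib.h>

#define STATUS_PORT 0x300  /* hypothetical: bit 0 strobes high each ms */
#define DATA_PORT   0x302  /* hypothetical: sample bytes, low then high */
#define CTRL_PORT   0x304  /* hypothetical: output bit to the hardware */

#define CHANNELS 8
#define SAMPLES  4000      /* 4 seconds at one sample set per ms */

unsigned int data[SAMPLES][CHANNELS];

void run_trial(int watch_ch, int threshold, int delay_ms)
{
    int t, ch, armed = 1, fire_at = -1;

    for (t = 0; t < SAMPLES; t++) {
        /* spin until the card raises its millisecond strobe
           (real code would also wait for the strobe to drop) */
        while (!(inportb(STATUS_PORT) & 0x01))
            ;
        for (ch = 0; ch < CHANNELS; ch++) {
            unsigned int lo = inportb(DATA_PORT);
            unsigned int hi = inportb(DATA_PORT + 1);
            data[t][ch] = ((hi & 0x0F) << 8) | lo;  /* 12-bit value */
        }
        /* watch one channel for an abrupt change, then time off the
           user-selected interval before raising the output bit */
        if (armed && t > 0 &&
            abs((int)data[t][watch_ch] - (int)data[t - 1][watch_ch]) > threshold) {
            fire_at = t + delay_ms;
            armed = 0;
        }
        if (t == fire_at)
            outportb(CTRL_PORT, 0x01);
    }
}

The data array is written to disk only after the loop finishes, so no disk I/O competes with the polling.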

Attempts to do this on any Windows system, or even a DOS system which has not been rebooted since running Windows 3.1, resulted in the timing values being way off.

Questions:

1. Are there ways to get accurate software timing on a Windows machine? Windows I/O totally messes up all attempts to do this, forcing all times to be multiples of 55 ms and missing much of the data.

2. Is there a way to make new printers run on DOS machines? We have been able to get new machines to run MS-DOS, but they won't print.

3. Is there a way to use our existing hardware on a new machine? We couldn't get it to run on a Pentium II which has ISA slots, because the bus timing was too fast for the card.


All 11 Replies

I don't know if it will help, or if this function has any overhead, but the Windows API function GetTickCount() (it's not standard C/C++) returns the number of milliseconds since the system started.

The program

#include <windows.h>  // GetTickCount()
#include <cstdio>

DWORD the_time = 0;
DWORD last_time = 0;
DWORD frame_time = 0;

void frame(void)
{
    the_time = GetTickCount();         // milliseconds since system startup
    frame_time = the_time - last_time; // time elapsed since last call
    if (frame_time == 1)               // one millisecond
    {
        // perform processing of data
    }
    last_time = the_time;
}

int main(void)
{
    for (;;) { frame(); }
    return 0;
}

gets the time every time frame() is called. If you print the time, you'll find that on most recent systems the loop runs fast enough that even on my 2.5GHz Celeron I could get quite a few calls in before one millisecond was up. The if statement is effectively entered every millisecond, on paper anyway. I can't guarantee the accuracy or the overhead. It might be worth taking this code over to the computer science forum, as I believe they could help a lot more with the timing and accuracy of code :)

BTW, this compiles on Win32 and above...
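
For what it's worth, GetTickCount typically only ticks every 10-16 ms on NT-class systems, so the frame_time == 1 test will rarely fire. If the goal is just to measure an interval finely (it does nothing about Windows taking the CPU away), the high-resolution counter is the better tool. A minimal sketch in C using the Windows QueryPerformanceCounter API:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, now;

    QueryPerformanceFrequency(&freq);   /* counter ticks per second */
    QueryPerformanceCounter(&start);

    /* busy-wait until at least one millisecond has elapsed */
    do {
        QueryPerformanceCounter(&now);
    } while ((now.QuadPart - start.QuadPart) * 1000 / freq.QuadPart < 1);

    printf("elapsed: %.4f ms\n",
           (now.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart);
    return 0;
}

This measures elapsed time to sub-microsecond resolution, but the scheduler can still preempt the loop for longer than a millisecond, which is the real complaint in this thread.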

SORRY!!! It was late at night and I forgot this was the Basic/VB forum... I think that to get that accuracy you might need to go to a lower-level language anyway... ;)

The main problem is that Windows takes control of the computer away from the program for periods of time longer than a millisecond each time the 55 millisecond jiffy timer goes off. With DOS, the interruption was far less than a millisecond.

We have tried the following, and they do not work:

- We tried a Unix-based system using C++, and it changed the order of the operations to suit its own sense of efficiency, collecting and storing the data before outputting the signal to start the equipment.

- We tried a system of timestamping, which was said to be able to compensate for the times of the events. But it is impossible to put a time stamp on the real world to tell it that the control signal should have occurred 23 milliseconds before the computer actually output it. This ruined the data from an analysis standpoint.

- Any system of data collection taken through the Windows system restricted our control points to be 55 milliseconds apart, and produced a random delay at the beginning, as the first real-world event was asynchronous to the first 55-millisecond jiffy. If we needed a read-modify-write operation, the control points had to be 110 milliseconds apart.

- We were successful with a DMA data collector card, but we were not able to scan the data on the fly as we need to do.

- We tried e-Prime, and found that it introduced delays of up to 10 milliseconds if the real-world event occurred during the Windows jiffy.

Someone told me it was possible to use the old QuickBasic 4.5 with Windows, and it could take control away from the Windows timing. Is this true?

I was able to boot a Windows XP computer from a DOS 6.22 floppy in the A: drive, but I couldn't use the hard disk with it (it's not FAT). The printers would not work either. And I can't take the DOS floppy out to put in a data disk, or the program crashes if it makes certain DOS calls.

I would therefore recommend passing the problem to the university's engineering department. You will need specialised timing electronics made for the nature of the research! Something that has an accurate internal oscillator and can store some data. Then write a driver to get the data off the hardware into Windows. As long as you buffer enough in the hardware, the overhead from Windows will not matter!

I would therefore recommend passing the problem to the university's engineering department. You will need specialised timing electronics made for the nature of the research!

Several problems with that:

- This university has no engineering department (other than the physical plant department, which takes care of the buildings and pipes). It's a liberal arts and sciences college.

- I already built a timing machine myself. The problem is that the scientist involved wants to use the computer to randomly select timings, rather than turn knobs on a control panel.

- It would take a computer to do the calculation he wants to do to determine one of the settings. That puts it back in the computer's flawed timing, since the answer would still not be available to the hardware until 55 milliseconds later.

- The scientist wants to make frequent changes in the experiment.

I need a way to do at least some of these:

- Run DOS on an XP machine in such a way that I can store data to the XP hard drive.

- Use a current printer and disk drive on a DOS machine (no drivers available).

- Run QuickBasic under XP without being stopped by Windows timing or fouling up the hard disk.

- Connect and use our current ISA-slot data collection system with an XP machine.

The only thing I can help you with there is with the printing.

Currently you can still buy some printers that are MS-DOS compatible. But be careful: you need a printer that has a parallel port, not just a USB one (I think DOS can only output to a parallel port). Also make sure the box says it is DOS compatible. When you install the printer driver, there should be an option to enable MS-DOS printing. Then make sure Windows XP is set to send the output to LPT1 and connect via a parallel printer cable.
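
Once LPT1 is mapped, the DOS side is just writing bytes to the device. A minimal sketch in C, assuming the printer accepts plain text (a GDI-only "Windows printer" will not); the output string is just an illustration:

#include <stdio.h>

int main(void)
{
    /* under DOS, LPT1 can be opened like a file */
    FILE *lpt = fopen("LPT1", "w");
    if (lpt == NULL)
        return 1;

    fprintf(lpt, "Trial 1 complete\r\n");
    fputc('\f', lpt);  /* form feed ejects the page */
    fclose(lpt);
    return 0;
}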

Or why don't you just install XP on a FAT drive?

Is it not possible just to replace the old 486 or Pentium hardware? That stuff is SUPER cheap now. Even if the university wouldn't authorize funds for it, you could probably dumpster dive and find the base equipment or systems you need.

The way I look at it, why change what you've got too much? I'd just keep trying to run with the old hardware. If you're really dead-set on changing, have you thought about some kind of embedded solution, or a PC running QNX, an actual RTOS?

Ah, the joys of doing psychology research with modern operating systems.

No modern operating system is going to give you reliable millisecond timing, no matter what you do. You'd need to preempt the kernel to do that, which will require some nasty coding.

There is a company that makes software that lets you build a real-time virtual OS that runs next to Windows. I have no idea if it works (I'm a Mac person), but it looks like the kind of thing you're looking for:

http://www.tenasys.com/intime.html

On another note, make sure the professor you're working with has read this paper:

Ulrich, R., & Giray, M. (1989). Time resolution of clocks: Effects on reaction time measurement: Good news for bad clocks. British Journal of Mathematical and Statistical Psychology, 42, 1-12.

It basically shows that even older systems that introduced 16ms of error don't really mess with your data. Another useful discussion on RT accuracy can be found here:

http://www.visionscience.com/documents/rt.html

Hope this helps,

-Jon


Aside from OS issues, there are also issues with the x86 architecture. It has a tendency to lose track of time, so to speak. I would think that if someone wanted to create an app that needed extreme timing accuracy, they would go with RISC-based hardware, like something from IBM or Sun. Expensive, I know, but if you amortize the purchase over 20 years :)
