Hi there, I'm wondering how to use a time delay in my code. What I have done is search through a saved file, pick out information, and send it to the RS-232 port. What I want to do is create a time delay of 100 microseconds for every 18 characters sent. Here's my code so far.

//print file on screen
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#define STRICT
#include <tchar.h>
#include <windows.h>




//write to the port

BOOL WriteABuffer(HANDLE hComm, char * lpBuf, DWORD dwToWrite)
{
   OVERLAPPED osWrite = {0};
   DWORD dwWritten;
   BOOL fRes;

   // Create this write's OVERLAPPED structure hEvent.
   osWrite.hEvent = CreateEvent(NULL, TRUE, FALSE, NULL);
   if (osWrite.hEvent == NULL)
      // Error creating overlapped event handle.
      return FALSE;



   // Issue write.
   if (!WriteFile(hComm, lpBuf, dwToWrite, &dwWritten, &osWrite)) {
      if (GetLastError() != ERROR_IO_PENDING) { 
         // WriteFile failed, but it isn't delayed. Report error and abort.
         fRes = FALSE;
      }
      else {
         // Write is pending.
         if (!GetOverlappedResult(hComm, &osWrite, &dwWritten, TRUE))
            fRes = FALSE;
         else
            // Write operation completed successfully.
            fRes = TRUE;
      }
   }
   else
      // WriteFile completed immediately.
      fRes = TRUE;

   CloseHandle(osWrite.hEvent);
   return fRes;
}




int main () {
  FILE * pFile;
  long lSize;
  char * buffer;

  pFile = fopen ( "LCD.aup" , "rb" );
  if (pFile==NULL) exit (1);

  // obtain file size.
  fseek (pFile , 0 , SEEK_END);
  lSize = ftell (pFile);
  rewind (pFile);

  // allocate memory to contain the whole file, plus a terminating NUL
  // (strstr() below expects a C string, so the buffer must be terminated).
  buffer = (char*) malloc (lSize + 1);
  if (buffer == NULL) exit (2);

  // copy the file into the buffer and terminate it.
  if (fread (buffer, 1, lSize, pFile) != (size_t)lSize) exit (3);
  buffer[lSize] = '\0';

   char *pdest = NULL;
   char *tmp;
   char Trackname[80];
   char *wavetrackname;

   tmp = buffer;

   memset(Trackname, ' ', sizeof(Trackname));   // pre-fill with spaces

// open the port
   char gszPort[18];
   memset(gszPort, ' ', sizeof(gszPort));


HANDLE hComm;

strcpy(gszPort, "COM1");
hComm = CreateFile( gszPort,
                    GENERIC_READ | GENERIC_WRITE,
                    0,
                    0,
                    OPEN_EXISTING,
                    FILE_FLAG_OVERLAPPED,
                    0);
if (hComm == INVALID_HANDLE_VALUE)
{
   // error opening port; abort
   fclose (pFile);
   free (buffer);
   exit (4);
}


// find each wavetrack name in the file
   do
   {
      pdest = strstr( tmp, "wavetrack name" );

      if (pdest != NULL)
      {
         wavetrackname = pdest + 16;   // skip past 'wavetrack name="'
         tmp = pdest + 1;              // resume searching after this match

         // find the length of the name, up to the closing quote
         int pos = strcspn( wavetrackname, "\"" );
         if (pos > 61) pos = 61;       // clamp so the padded record fits in Trackname[80]

         // copy the track name into a fixed, space-padded 18-byte record
         strncpy( Trackname, wavetrackname, pos );
         Trackname[pos] = '\0';
         memset( Trackname + pos + 1, ' ', 18 );

         WriteABuffer(hComm, Trackname, 18);



//print results on different lines

	//	printf("\n\n\n");
	//		printf(Trackname);
	//		printf("\n");

	   
	   }
   }
   while(pdest != NULL);


  // terminate
  fclose (pFile);
  free (buffer);
  CloseHandle(hComm);

  int c = getchar();

  return 0;
}

Well you could state your OS and compiler.

There is no standard way to achieve sub-second delays.

I'm using Windows XP Home and Visual C++.

Hint: look for the function Sleep(), which takes milliseconds. And 100 ms is about as accurate as it can get, depending on what other processes are running on the computer.

Yeah, I've seen the Sleep() function, but it only works with milliseconds, not microseconds.

MS-Windows is not a real-time operating system, so you can't get any better accuracy than a couple hundred milliseconds. If you want microseconds, you have to use a different, real-time OS. One reason is that it takes the OS thread scheduler a few milliseconds just to switch threads and do other OS maintenance. Then of course other programs may not play nice and hog more CPU time than they probably should, and there are some operations that cannot be interrupted, such as disk I/O -- you have no control over any of those events.

I saw an add-on a few years ago that would emulate real-time, but it cost quite a few $$$.

> what i want to do is create a time delay of 100 micro seconds for every 18 charecters sent
Do this calculation:
Your port is set to 19200 baud, and framing is set to 8-N-1.
So each character has a start bit, 8 data bits, a stop bit and no parity (10 in all).
That gives you a line speed of 1920 characters per second.
At that speed, sending each character takes a little over 500 microseconds.

So a 100uS delay is basically the time taken to send a couple of bits of data. There's no way to guarantee that the OS won't insert much bigger delays due to scheduling of other tasks, and it certainly isn't enough time for the receiver to do much about it either.

Microscopic programmed delays are usually the wrong approach. There is simply too much variability in the average OS to make it anything like reliable.
You should be looking to use some other method of flow control, say xon-xoff.

Perhaps you need to explain what a delay of 100uS is supposed to get you, then perhaps we could suggest a better answer.

I found this via Google because it was EXACTLY what I need in my program - a delay of 100uS. The idea was to write a program to simulate the output of a microcontroller, which sends data bytes at 100uS intervals. But based on the replies regarding OS timing, I don't think this will be possible anymore...

You won't be able to simulate it on *nix, MS-Windows, or Mac. But you might be able to do it in MS-DOS 6.x and earlier, because it doesn't have the multi-process activity that the other OSes have.
