Well... I guess you could have a valid reason for wanting to do it :P. However, it probably wasn't included because .NET is intended for application development, not so much real-time operations like this. You can time events every 0.5 ms or however you want, but I think your next timer will frequently fire the event before the last call has finished. You will need to do a considerable amount of testing on this. Also, you will want to test without the debugger attached and with code optimizations on, because the added overhead may very well delay execution. Take a look at the Stopwatch class; I think that is the most accurate way of controlling timing included in the .NET Framework.
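For the curious, Stopwatch wraps the OS high-resolution performance counter, so about the closest you can get in pure .NET is a busy-wait against it. A rough sketch (the helper name is mine, and note it pins a core at 100% while spinning):

```csharp
using System;
using System.Diagnostics;

static class MicroDelay
{
    // Busy-wait for roughly the given number of microseconds.
    // Stopwatch.Frequency is ticks per second, so convert first.
    public static void SpinWaitMicroseconds(long microseconds)
    {
        long target = microseconds * Stopwatch.Frequency / 1000000;
        var sw = Stopwatch.StartNew();
        while (sw.ElapsedTicks < target)
        {
            // spin -- deliberately no Sleep(), which would hand
            // the core back to the scheduler
        }
    }
}
```

Even that only controls when your own code runs, not when the scheduler pre-empts you, so you'd still want to verify it with real measurements.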
The Timer.Interval property is defined as a double in milliseconds, so if you want 10 microsecs you could try Timer.Interval = 0.01;
But as sknake pointed out, I also doubt it will work; I'd say give it a try, and good luck!
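Something like this would let you see for yourself what you actually get (expect disappointment -- the underlying OS timer typically resolves to about 15.6 ms on Windows, regardless of what Interval says):

```csharp
using System;
using System.Diagnostics;

var sw = Stopwatch.StartNew();
var timer = new System.Timers.Timer();
timer.Interval = 0.01;  // 0.01 ms = 10 microseconds, in theory
timer.Elapsed += (s, e) =>
    Console.WriteLine($"fired at {sw.Elapsed.TotalMilliseconds:F3} ms");
timer.Start();
Console.ReadLine();  // watch the spacing between firings
```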
I'm going to break protocol here somewhat, perhaps (since the question has already been begged), and ask what kind of scenario you are wanting to manipulate with a timer in microseconds? My teeny-macro mind cannot fathom what it could be...:icon_redface: Any real-world explanation would suffice for me if you cannot divulge the actual purpose.
Now that makes sense to me because I would think you would use a dedicated circuit/processor for that kind of speed. I had no idea those 555 timers are still so popular. I have built some fun stuff with them way-back-when (decibel meter, strobing LED's, binary clock).;)
I suggest your best bet is to program something like a PIC microcontroller. Get it to accept a serial comms message that tells it the frequency and mark-space ratio. Then your C# just sends a message to the dedicated hardware when it wants to change the modulation.
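The C# side of that could be as simple as the sketch below, assuming a made-up one-line message format ("F&lt;hz&gt;,M&lt;ratio&gt;") -- the real protocol would be whatever you define in the PIC firmware:

```csharp
using System.IO.Ports;

// Port name and baud rate are placeholders for however the
// PIC's UART is actually configured.
using var port = new SerialPort("COM3", 115200);
port.Open();
port.WriteLine("F100000,M0.25");  // 100 kHz, 25% mark-space (hypothetical format)
```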
Fundamentally, 10 microseconds is much shorter than the interval at which an operating-system scheduler can switch tasks.
But do you need your PWM rate to be 100,000 Hz? If each pulse is to carry an independent value, you need to export a complete message to a microcontroller - and then you'll need to be double buffering to keep up with the required work rate. If the PWM stream represents a relatively slowly varying signal level, does the repeat rate need to be so high?
Sorry, get the oscilloscope out. The fact that the syntax allows you to specify a 100 ns interval doesn't prove you can deliver that to the outside world. You have to allow for scheduling overhead. In order to beat the other threads to the processor, you've got to raise your thread priority to make sure you're not the one being pre-empted.
The danger is you end up committing one of the processor cores to this objective, because you can't allow the overhead of a task context switch.
When a thread sleeps, it is de-scheduled: the processor core that's running your thread unloads the working context of that thread, figures out which thread to run next, loads its context, and switches to that. That's going to take more than a microsecond.
You can try a sub-millisecond sleep in C# -> Thread.Sleep(new TimeSpan(10)); This TimeSpan ctor takes a long count of 100-nanosecond ticks, so 10 ticks is 1 microsecond. - SB
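That compiles, but it's worth timing what you actually get -- a minimal check:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// Request a 1 microsecond sleep (10 ticks of 100 ns) and measure it.
var sw = Stopwatch.StartNew();
Thread.Sleep(new TimeSpan(10));
sw.Stop();
Console.WriteLine($"asked for 0.001 ms, got {sw.Elapsed.TotalMilliseconds:F3} ms");
```

On a typical desktop OS this reports something in the millisecond range, not microseconds, because Sleep rounds to whole milliseconds and the scheduler adds its own latency on top.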
I understand what you want. I am in the same situation here. I'm using a Measurement Computing PCI DIO-24 card with 24 I/O bits. It comes with a DLL and libraries for .NET. I've been using it to send data to some external FIFOs, but it is too slow with the timer at 1 ms. It means that my clock has a period of 2 ms at best, which is only 0.5 kHz. My FIFOs can go 100 MHz! It seems pretty stupid that a GHz computer can only stopwatch 1 ms intervals. --The Bug
Processors these days can retire several instructions for every tick of the system clock's crystal oscillator, so if anything it's not the processor. I would say it's a limitation of the .NET runtime sandboxing all the code it executes. If the OP seriously needs accuracy at less than 1 ms, he either needs to switch to a native language or, more realistically, create a dedicated circuit in the device that uses a timing crystal of its own.