Hey Everyone,

I'm doing a little test for a presentation I'm giving on Tuesday. I wanted to show the performance difference between VB2008, C# and C++.

To do this, I have three programs. Each one runs the Fibonacci sequence out to X terms, and can do it either iteratively or recursively. As expected, both C# and C++ blow VB away. However, my test between C# (with a GUI) and C++ (command line) was a little surprising.

I ran 40 Fibonacci numbers, both iteratively and recursively. The iterative tests show almost the same time to the millisecond. However, in the recursive test, C# completed the calculation in 3.8 seconds, while C++ took about 12.1 seconds. Is this expected? I would have thought C++, especially as a command-line program, would have outperformed C#.

Here is my code... maybe you can see an 'unfairness' that might help clear things up. Or am I wrong in thinking C++ is quicker than C#?

C++ Code:

#include <iostream>
#include <ctime>

using namespace std ;

unsigned long first = 0;
unsigned long second = 1 ;
unsigned long result ;

time_t start, end ;
double total ;

int x ;
int ans ;

unsigned long fib( int n ) ;

int main()
{
	cout << endl << " Fibonacci Sequence Test -> C++ " << endl ;
	cout << "_____________________________________" << endl ;


	cout << endl << endl << endl ;

	cout << "Number of iterations to calculate:  " ;
	cin >> x ;
	cout << endl << endl ;

	cout << "1.  Iterative" << endl ;
	cout << "2.  Recursive" << endl ;
	cout << "Select method:  " << endl ;
	cin >> ans ;
	cout << endl ;

	if ( ans == 1 )
	{
		start = clock() ;
		for ( int i = 0; i <= x ; i++ )
		{
			//cout << result << endl ;
			first = second ;
			second = result ;
			result = first + second ;
		}
		end = clock() ;
		total = (end - start) * .001 ;

		cout << "Result:  " << endl ;
		cout.precision(10) ;
		cout << "Total Time:  " << total << endl ;
	}
	else if ( ans == 2 )
	{
		start = clock() ;
		result = fib(x) ;
		end = clock() ;
		total = (end - start) * .001 ;

		cout << "Result:  " << result << endl ;
		cout.precision(10) ;
		cout << "Total Time:  " << total << endl ;
	}
		
		
	return 0 ;
}

unsigned long fib( int n )
{
	if (n <= 1)
		return n;
	else
		return fib(n-1)+fib(n-2);
}

C# Code:

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace FibCsharp
{

    public partial class Form1 : Form
    {
        double startTime;
        double endTime;
        double totalTime;

        ulong first = 0;
        ulong second = 1;
        ulong result;

        public Form1()
        {
            InitializeComponent();
        }

        private void button1_Click(object sender, EventArgs e)
        {
            startTime = DateTime.Now.Hour * 3600 + DateTime.Now.Minute * 60 + DateTime.Now.Second + DateTime.Now.Millisecond * .001;
            for (int i = 1; i <= 40; i++)
            {
                first = second;
                second = result;
                result = first + second;
            }
            endTime = DateTime.Now.Hour * 3600 + DateTime.Now.Minute * 60 + DateTime.Now.Second + DateTime.Now.Millisecond * .001;
            totalTime = endTime - startTime;

            label1.Text = Convert.ToDouble((totalTime)).ToString();
            textBox1.Text = result.ToString();
            

        }

        private void button2_Click(object sender, EventArgs e)
        {
            startTime = DateTime.Now.Hour * 3600 + DateTime.Now.Minute * 60 + DateTime.Now.Second + DateTime.Now.Millisecond * .001;
            result = fib(40);
            endTime = DateTime.Now.Hour * 3600 + DateTime.Now.Minute * 60 + DateTime.Now.Second + DateTime.Now.Millisecond * .001;
            totalTime = (endTime - startTime);

            label1.Text = Convert.ToDouble(totalTime).ToString();
            textBox1.Text = result.ToString();
        }

        private ulong fib(ulong n)
        {
            if (n <= 1)
                return n;
            else
                return fib(n - 1) + fib(n - 2);
        }
    }
}

Thanks

Let me start by saying I am sort of a C# aficionado, with interests in other languages. I am as surprised as you are, even more so because you did extra calculations in the C# version to time the thing.
Perhaps that's the problem. If possible, try to use the same class for timing in both. I don't know about C++, but in C# you have the Timer classes and the Stopwatch class. An example of the use of Stopwatch can be found at http://www.daniweb.com/code/snippet979.html.

Oh, and a tip:
Put your calculation in a loop and perform it 1,000,000 times or so; that way the milliseconds become less important.

Is this really a fair comparison?

With C# you are using the CLR to access your OS indirectly, while in C++ you are using the OS directly.

If you want this to be a fair comparison, try running the C++ code under the CLR as well.

Firstly, your iterative versions aren't equivalent either: you iterate over a different interval in each loop.

As to the differences with the recursive versions, there are all sorts of possible explanations.

1) The way you're measuring time may have different granularity. Clock time does not have infinite resolution; hence the tip by ddanbe to do a large number of calculations. For short calculations, the error in the computed interval may be larger than the interval itself, i.e. your timing results may be meaningless because you're not doing enough work.

2) An interval measured with two calls of clock() generally has to be divided by the value of the macro CLK_TCK to convert it to seconds. You're not doing that, so it's possible you've introduced a scaling factor into the computed times.

3) Compiler optimisation settings can have a significant effect: for example, comparing a "Release" build in one language with a "Debug" build in the other. "Release" builds tend to be optimised for speed; "Debug" builds are not.

My guess, in your case, is a combination of all three.

VC++ now defines the CLOCKS_PER_SEC (formerly CLK_TCK) macro as 1000, so there is no bad scaling in that case. However, the real measured resolution (i.e. the VC++ RTL clock() precision) is about 15 milliseconds, so it's impossible to measure short intervals with it (the iterative variant takes under 2 MICROseconds!).
For example, look at my snippet based on CPU Performance Clock Counter API (~1 microsecond precision):
http://www.daniweb.com/code/snippet916.html

The Release version of this test runs in 1.3 seconds on my CPU (the Debug version takes about 8 seconds). So I think recursive fib() code compiled with VC++ 2008 (and others) is faster than its C# incarnation; that's not the point.

Let's look at the "yellow press style" post title. The question: what is the object of this test? Is it C++, or C#? Obviously neither. It's a test of 331,160,281 function calls. Of course, function-call overhead is a very important parameter. Of what? The language? No: it's a parameter of the language implementation (compiler + RTL).

It's a well-known fact that different compilers have different code-generator quality. Visual C++ is not the champion of C++ code optimisation.

So be careful with post titles ;)...
PS. It does not matter whether a GUI or console environment is used for the function calls...

Here are my results for exactly your code on my Dell Inspiron 6400 laptop with an Intel T2500 CPU and 2 GB of RAM.

Results for VC++ 9.0

Fibonacci Sequence Test -> C++
_____________________________________



Number of iterations to calculate: 40


1. Iterative
2. Recursive
Select method:
2

Result: 102334155
Total Time: 2.5
Press any key to continue . . .

Results for Intel C++ compiler 10

Fibonacci Sequence Test -> C++
_____________________________________



Number of iterations to calculate: 40


1. Iterative
2. Recursive
Select method:
2

Result: 102334155
Total Time: 2.391
Press any key to continue . . .

And the result for C# 3 with .NET Framework 3.5:

Best result for C# 3 and .NET 3.5 was 3.562 seconds.


As you can see, on my computer the C++ test runs about 31% faster than C#.
Perhaps you forgot to enable optimization when compiling the C++ version.

PS the post title isn't appropriate.

Just for the record, I wasn't trying to do yellow journalism or anything of that sort. If the title is offensive somehow, I apologize. Actually, I expected and was rooting for C++, so I assure you I wasn't trying to make false accusations. If a Mod would like to change the title to something more appropriate, feel free. Doesn't make a difference to me.


Thanks for the replies. I can understand DateTime and clock() causing a timing difference in the iterative versions, but not in the recursive ones; not that big, anyway. The difference in time was about 8 seconds. I'm sure the different timing methods didn't cause an 8-second delay.

The iteration tests were the same (as far as the number of iterations goes), for whoever said they weren't. I posted the last code I used, but I made sure to change it before my tests so that they were fair between the two.

I wasn't aware of an optimization capability. Thanks. Where do I access that? Also, is there the same sort of capability for C#?

thanks again for everyone's help

Comments
Don't worry.
Doing experiments and asking questions to find answers beyond your own is the best way to learn, so don't you dare be sorry! =)

By default, all optimizations are enabled in the Release configuration in Visual Studio Standard Edition and above. Remember that all optimizations are off in the default Debug configuration.

You can access the optimization settings in the project property pages, under C/C++ -> Optimization and Linker -> Optimization.

And if you are using the cl command line, these switches relate to optimization:
/Od
/O1
/O2
/Ox
/Oi
and for linker:
/OPT:
Consult MSDN for more information.

This thread seems to be pretty old but lacks a satisfactory answer.

In the C++ version, cout is used during the calculation to show the progress.

In the C# version a gui is used to do the job.

This alone would explain why the C++ version is slower in this case:
cout is blocking I/O, which means the C++ version spends most of its time waiting for I/O. Basically, cout ends up calling the write syscall directly.

A C# GUI uses a separate thread to handle display tasks (like almost all GUI libraries).

So if you want to make a fair comparison, you should at least place all your messages in a FIFO and print them from another thread. Then you'll see the power of C++.

This case is very interesting because it perfectly illustrates the main pros and cons of both languages:
* C# development is quicker and easier than C++ development.
* C++ is more powerful, but also more painful, and needs more effort and a higher level of expertise to use efficiently.

