Please explain how that's supposed to work. The expression is still evaluated in an integer context, so the result has an integer type. You then pass that integer to printf() with the %f conversion specifier, which expects a floating-point argument.
So aside from the original bug, you've introduced another: passing an argument of the wrong type to printf(), which is itself undefined behavior. In practice the output is often 0.000000, but since the behavior is undefined it is completely unpredictable. What you can say is that 2.333333 is the least likely result, because you've stacked this new undefined behavior on top of the original integer division.