#include <stdio.h>

int main()
{
    float f1, f2, f;
    double d1, d2, d;
    char s1[] = "2.0045", s2[] = "1.00056";

    f1 = atof(s1);
    f2 = atof(s2);
    f = f1 - f2;

    d1 = atof(s1);
    d2 = atof(s2);
    d = d1 - d2;

    printf("\n%s %s", s1, s2);
    printf("\n%f %f : %f", f1, f2, f);
    printf("\n%ld %ld : %d\n", d1, d2, d);
    return 1;
}

Output:

2.0045 1.00056
1271310336.000000 869988544.000000 : 401321792.000000
-67108864 1104343465 : 1870659584

Desired Output:

2.0045 1.00056
2.0045 1.00056 : 1.00394
2.0045 1.00056 : 1.00394

----------------
My questions:

  1. Why is 'atof' working so strangely? Isn't it supposed to assign the value 2.0045 to f1 and 1.00056 to f2 and then compute f? :sad:
  2. How else could I do this?

PS: Basically I'm reading a trace file and performing some arithmetic while parsing it.

