I wrote this program for insertion sort. I compiled the source file in a Xubuntu terminal, but when I try to run the a.out executable, the terminal shows "segmentation fault". I don't know what a segmentation fault is, when it occurs, or what in my source file is causing it.

Thanks in advance.

int i, j, n, key;
int a[n];

printf("Enter size:");
scanf("%d", &n);
printf("\n Enter elements: separate each by space\n");
for (j = 0; j < n; j++) {
    scanf("%d", &a[j]);
    printf("%d ", a[j]);
}


It's because of this line:

int a[n];

You cannot declare arrays like that.
Either 'n' should be initialized prior to declaring the array:

printf("Enter size:");
scanf("%d", &n);
int a[n];

or n should be replaced by some constant value, like this:

#define BUFFER 100
int a[BUFFER];

You cannot declare arrays like that.

Yes, but only because the OP is clearly not compiling as C99. I can tell because main() uses implicit int, which was removed in C99.

Here either 'n' should be initialized prior to declaring array

printf("Enter size:");
int a[n];

Um, no. Since this is C90[1], array sizes must be a compile-time constant. If the above code compiles, it's due to a compiler extension. The C99 standard added variable-length arrays (VLAs, for googlers), but I strongly discourage their use in robust code. If you want an array sized at runtime, C99 best practice and C90's only portable practice is dynamic allocation:

#include <stdlib.h>   /* for malloc() and free() */

int *a;
int n;

scanf("%d", &n);
a = malloc(n * sizeof *a);
if (a == NULL)
    return 1;         /* malloc() can fail */
/* ... use a[0] through a[n-1], then free(a); */


On a side note, C90 doesn't allow mixing declarations and code. C99 does, but we've already established that this code isn't C99. C++ allows it as well, which suggests that you're compiling as the wrong language (on top of relying on compiler extensions). The problem is that C++ and C are subtly different, and if you don't know where the differences lie, you could end up with difficult-to-trace bugs.

or n should be replaced by some constant value.

That's better, but only if the size is fixed and unlikely to change. Beginners and poor programmers often subscribe to the theory of "640K ought to be enough for anybody", where if you allocate a large enough array to handle all expected input, the code is simpler. However, if/when they grow up and start writing robust code, constant sized arrays tend to be used less, or as intermediates in producing a variable sized "array". An example of the latter case is using fgets()--which requires a fixed buffer--as part of an algorithm to read arbitrarily long strings.

[1] Or C89, or C95. C89 was the ANSI standard before ISO got a hold of it, and C90 is virtually identical to C89 (but it's the "official" international standard). The two are interchangeable. However, when someone talks about C89 or C90, they nearly always mean C94/95, which is C90 + Normative Addendum 1 ("C95" is more commonly used than "C94", but the value of the __STDC_VERSION__ macro says 1994). Normative Addendum 1 basically made C more international-friendly through language and library additions targeting character sets.


A segmentation fault is due to an attempt to access an area of memory that the program is not allowed to access.
Common causes are:
Improper format control string in printf or scanf statements.
Forgetting to use & on the arguments to scanf.
Accessing beyond the boundaries of an array.
Failure to initialize a pointer before accessing it.
Incorrect use of the "&" (address of) and "*" (dereferencing) operators.


I can tell because main() uses implicit int, which was removed in C99.

Even compilers supporting C99 features (e.g. LCC-Win32, which is what I'm using right now) give a warning if int is omitted before main():
"no type specified. Defaulting to int."

Also, with compiler language extensions on and the warning level set to max,

int main()

gives warnings like
"old-style function definition for 'main'."
"missing prototype for 'main'"

That was not the case with

int main(int argc, char *argv[])

though (since I was running out of 'main' definitions to try).


Thanks for the replies.
I think the reason here is a garbage value. Since I declared a[n] before scanning n, the array size came from an uninitialized n, which usually holds a very large garbage value. The system couldn't allocate that much space, hence the segmentation fault.
