I just want to know why no one uses C these days?

I do embedded systems exclusively, and keeping memory and time consumption tight requires using C and/or assembler.

And some use Forth, Ruby, Lisp, Fortran, C++ ...

There are good reasons to choose a particular language, and they are all about matching the needs of the project to the language(s) best suited to the problem and the team. There are bad reasons too, but let's not go there.

So why is C not used in the mainstream?

It was simply beaten... C lost a lot of popularity to C++, and with the .NET Framework and iPhone OS supporting C-based languages (C# and Objective-C, respectively) but not C itself, C has just lost popularity. It is still used, though... C is still used for OS development and application development... I think it was used, in combination with other languages, on Windows 7.

So is there any point in having C as a starting programming language?

"is there any point in using {this language} for beginning programmers" is one of the ways to start a weeks-long discussion.

Yes: Because history says this was the first reasonably accessible systems-level programming language, and it is still in use. It is also the earliest still-in-use precursor to the many 'C type languages' still on the forefront (C#, Java, C++). Some would also argue that the difficulty itself is valuable as a filter to remove students early who will likely not succeed with later classwork.

No: Because it is full of arcane things like macros, and it is missing lots of useful things like objects; and because the edit / compile / debug loop is pretty slow; and because you really need make and lint. Some would also argue that because it is difficult, it discourages a significant number of students who would eventually love programming if they were given a better first language.
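To make the "arcane macros" point concrete, here is the classic pitfall (a hypothetical toy, not from anyone's real code): a function-like macro is plain text substitution, so operator precedence silently breaks it.

    #include <stdio.h>

    #define SQUARE(x) x * x            /* looks like a function, is text substitution */
    #define SQUARE_SAFE(x) ((x) * (x)) /* the parenthesized version every C course teaches */

    int main(void)
    {
        int n = 3;
        printf("%d\n", SQUARE(n + 1));      /* expands to n + 1 * n + 1, prints 7 */
        printf("%d\n", SQUARE_SAFE(n + 1)); /* prints 16, as intended */
        return 0;
    }

A beginner has to learn the expansion rules before the bug is even visible, which is part of why C can be a rough first language.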

With proper programmer discipline, anything can be written in ANSI C. Granted, it has lost out to other languages, but they are not necessarily better.

"is there any point in using {this language} for beginning programmers" is one of the ways to start a weeks-long discussion.

Yes: Because history says this was the first reasonably accessible systems-level programming language, and it is still in use. It is also the earliest still-in-use precursor to the many 'C type languages' still on the forefront (C#, Java, C++). Some would also argue that the difficulty itself is valuable as a filter to remove students early who will likely not succeed with later classwork.

No: Because it is full of arcane things like macros, and it is missing lots of useful things like objects; and because the edit / compile / debug loop is pretty slow; and because you really need make and lint. Some would also argue that because it is difficult, it discourages a significant number of students who would eventually love programming if they were given a better first language.

Objects, data hiding, code reuse, and design for test can all be accomplished in C with proper programmer discipline.
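As a sketch of what that discipline can look like (hypothetical names, the standard opaque-pointer idiom, error handling omitted): hide the struct layout behind a forward declaration, so callers can only touch the data through functions.

    #include <stdio.h>
    #include <stdlib.h>

    /* In a real project this part lives in counter.h: an opaque type
       plus functions. Callers never see the struct members. */
    typedef struct Counter Counter;
    Counter *counter_new(void);
    void     counter_increment(Counter *c);
    int      counter_value(const Counter *c);
    void     counter_free(Counter *c);

    /* ...and this part lives in counter.c, the only file that knows the layout. */
    struct Counter { int value; };

    Counter *counter_new(void)               { return calloc(1, sizeof(Counter)); }
    void     counter_increment(Counter *c)   { c->value++; }
    int      counter_value(const Counter *c) { return c->value; }
    void     counter_free(Counter *c)        { free(c); }

    int main(void)
    {
        Counter *c = counter_new();
        counter_increment(c);
        counter_increment(c);
        printf("%d\n", counter_value(c)); /* prints 2 */
        counter_free(c);
        return 0;
    }

Change the representation in counter.c and callers don't need to change at all; that is data hiding, just enforced by convention instead of by the compiler.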

I'm not in any way arguing that C is inadequate or even less than good (for some things). It is arguably not the right introduction language for new programmers, who by definition don't (yet) have appropriate discipline.

This is in no way about C or any other language: It is about appropriate strategies for teaching programming.

My first language convinced me I would never like computers: it was Fortran on an IBM 360, punch cards and the whole bit. Later I went to school and they showed me a PDP-11 with BASIC on an interactive terminal. Needless to say, that was my introduction to an addiction to programming.
I started my professional career programming the 8048 in hand-assembled hex. Later languages were assembler for most microprocessors. C is a feature-rich luxury to me. I am showing my age here, but I think for a beginning student an interpreted language like BASIC would still be excellent to learn with.

Then I don't know why we learn C for the first year of uni and then move straight to Java in the second year. One year of C is not enough.

moroccanplaya: I agree that C isn't the ideal first-year language. You might find it amusing or instructive to ask your prof, in the nicest possible way, why they do it that way. One year of C or C++ would be enough (one term, even) if you were already a decent programmer. You have to spend a term or two learning what it means to write programs, and the language really doesn't matter very much.

If it were up to me, I would start with Python or Scheme (honest) for the early programming and theory courses. If you wanted to do the compilers course or the systems course, you would have to pass either a C++ course or a C++ test (so you could do it on your own if you wished). I think Java has little or no use in the university setting (C is better for learning about 'near the machine' issues, and something less like C++ is better for expanding your programming horizons).

Python because it is a nice orthogonal language with almost all the features of many other languages; and if you 'get' programming, you can learn it in a week (if not, add a week).

Scheme because it is simple enough to learn in about 2 weeks; and close enough to a functional language to give you some good experience in that realm (and functional languages are likely to be the precursors to whatever we end up using to do massively parallel computation).

Either one because the whole point of introductory programming courses is to give you some experience, and the tools to test out various algorithms and to think about big O issues and other things that are theoretically interesting. The quicker you can get past the 'how to compile Hello World' level, the sooner you get to the part that is useful.

I partially agree with Griswolf, because you can learn the programming basics faster with a language like Python. I believe that C, however, gives you a unique insight into the workings of your computer (especially memory usage), which you won't get from Java or Python. I recommend everyone do a year of C. :)
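A tiny illustration of that insight (a made-up toy, but standard C): addresses, sizes, and lifetimes are all explicit, where Java and Python keep them hidden.

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int on_stack = 42;                      /* released automatically at scope exit */
        int *on_heap = malloc(sizeof *on_heap); /* freeing this is our job */
        if (on_heap == NULL)
            return 1;
        *on_heap = 42;

        printf("an int takes %zu bytes\n", sizeof(int));
        printf("stack variable lives at %p\n", (void *)&on_stack);
        printf("heap  variable lives at %p\n", (void *)on_heap);

        free(on_heap); /* forget this and you have a leak; garbage-collected languages hide the issue */
        return 0;
    }

After a term of this, "what is a reference, really?" in Java stops being mysterious.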

What I know is "C is the mother of all languages."

wow...you know so much...:)

Who is the father, btw?

At least I am not... :)

One more thing: I am doing computer security, so I'm guessing C is good to start with?

Then I don't know why we learn C for the first year of uni and then move straight to Java in the second year. One year of C is not enough.

Java is a terribly hard language to learn as your first language. Their goal might have been to teach you Java all along, but teaching you a little C first so you understand basic programming makes it easier to understand Java later.

And I don't know much about computer security, but the language choice would probably depend on the type of security you're working on. If the security is network-based, I'd imagine that Java could do a fine job, but if you want to keep people from logging in to a computer, I'm sure C would be my first choice. I hope that helps somewhat.

All of the UNIX OS is written and maintained in C - well, some asm. GLIBC and all of the Linux stuff as well.

What was your sample size when you came to this conclusion?

There are many embedded systems that are written in C as well.

What do you mean by sample size?

So why is C not used in the mainstream?

"mainstream" alone doesn't mean anything at all and should be used only in context.

For example:
- Embedded systems programming: C is the mainstream choice.
- GPU programming: take a look at the GLSL shader language, which is based on C. C is the mainstream answer here.

So one needs to pick a particular field and then look for the mainstream language in THAT field.

What do you mean by sample size?

Anyone who programs should have at least a small amount of statistics knowledge. Sample size: How many items you looked at.

One of the wonderful things about statistics is that you don't really need to know much about the size of the "universe" (all possible items) to know how well your sample represents the universe. You just need to know how big the sample is and how well chosen it is. I am being deliberately simplistic, which can also be spelled 'wrong'... so if you want to know more, please learn some statistics for yourself. I just did a quick search and found this, which is free, seems pretty comprehensive, and was at least a little bit approachable (for me, skimming for about 5 minutes):
http://www.artofproblemsolving.com/LaTeX/Examples/statistics_firstfive.pdf
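To put one number behind the "sample size is what matters" claim (a standard textbook result, not something from the linked PDF): for a simple random sample, the worst-case 95% margin of error on an estimated proportion is roughly

    margin of error ≈ 1 / sqrt(n)        (95% confidence, worst case p = 0.5)

so n = 100 gives about ±10% and n = 2,500 about ±2%, whether the "universe" has ten thousand items or ten billion.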

C is the base of all programming languages. And it is wrong nobody using the C... There are so many companies working with C.

deysubrat gets it wrong(*) on two counts:

  1. C is not the basis of all programming languages. In fact, to mention just some that we will all recognize: Fortran, Lisp, Cobol, and Basic were all earlier than C.
  2. It is wrong to say "nobody using the C". Look here: http://langpop.com/ (man, I hadn't thought of this issue in years; these folks do an amazing job considering the difficulties, and they show you what they did, unlike most so-called 'statistics')

And if deysubrat meant to say "We should all learn C because it is the basis for many modern programming languages", that is wrong again. It is like saying "We should all learn to cook over an open fire because that is the basis for modern cooking." Yes, I enjoy food cooked on the barbie, and I think C is a useful language. But neither one should be a requirement.

(*) Unless I misunderstand the translation into English. If the statement is "it is wrong to say that nobody is using C", then I agree with that part.

So what language did you start learning, griswolf?
