Sometimes I feel like I may be pushing myself along a little too quickly with computer programming. For instance, I'm in the process of learning VB.NET and have gotten to where I feel like I can start trying a few things, so I decided to look at some of the code for our custom tools here at work. They kind of make sense to me, but I don't even know how to begin figuring out what objects I need to reference. I work in Geographic Information Systems, and the software we use (ArcGIS) has an object model that is just insane. Here's the ArcObjects model if anyone wants to take a look: http://edndoc.esri.com/arcobjects/9.1/ArcGISDesktop/AllDesktopOMDs.pdf

All of our stuff is done in VB.NET so I was trying to find some sort of documentation to at least get me started. It was then that I realized that I don't have the background yet to even understand the jargon they use to explain things. For instance, I don't know what "hashing" is. I know these are basic concepts for anyone in computer science, but I haven't had that many CS courses. Does someone have to know all the theory to be able to program or is it just something that you can learn on your own through experience? Does it take a long time to get to the point where you can just jump right in and do exactly what you want?

I'm thinking that the learning curve for this stuff is a lot higher than I expected and that it will take a while to get the hang of it. I feel like I should be able to just jump right in, but all that's done is make me frustrated and want to give up. Kind of the "I'm too stupid to do this" feeling. It's overwhelming at times.

Jread,

Chill, fella, you are most definitely *NOT* alone; it's like that for heaps of us. Sometimes I convince myself I'm in completely the wrong job and a total programming fake (usually after looking at an object model like yours, he he). I only have the vaguest idea what a hash table is myself, but I'm not worried; I know I can find out all about it one day.

You do have to be aware of the low-level stuff; there are a million layers of flaky abstraction between the bits floating around in the hardware and your high-level OOP code. It is a steep learning curve, but stick with it and you will get a little further each day. You will have days - weeks, even - when you will feel you have achieved nothing...

Oh, I'm getting kicked out of the office. I'll do more tomorrow.

>I'm thinking that the learning curve for this stuff is a lot higher than I expected
That's very likely.

>I feel like I should be able to just jump right in
After a couple of years of learning you can jump right in and still be overwhelmed. There's really no such thing as a comfort zone when it comes to real world programming. ;)

And I'm exceedingly curious as to whether you got permission to make internal company documents public, and what the penalty to you would be if you didn't have such permission and your boss found out...

> And I'm exceedingly curious as to whether you got permission to make internal company documents public, and what the penalty to you would be if you didn't have such permission and your boss found out...

What are you talking about?

I recently got hired as a .NET programmer, and compared to C++ and Java, it's like a walk in the park. .NET languages are extremely easy, in my opinion. If I can do it, anyone can do it. You just have to personally want to make it as a programmer.

Server crash, I seem to follow you around DaniWeb and post after you. But I hope and believe you're right. I believe that anyone is capable if they really try, and I hope you're right, because that's what I want to learn next: programming.

Some on my own, but I'd like to go back to school for it when my company picks up the tab.

I can't say I've disagreed with you so far.

If you're just starting out then don't go for .NET right away and stay away from tools that will make you lazy until you actually get a good feel for doing it without the added cushion.

What tools???

IDEs where you can just drag 'n' drop controls on a form, stuff like Visual Studio. They're great, but they hide stuff: they take care of all the boring crap like include paths and compiler optimisation switches, and generate code in the background. That's not a problem if you know what they're hiding, but if you don't, the day they let you down you're gonna be stuffed. That's why so many people recommend starting with C or C++: you learn about bits and compilers and how memory is laid out and important stuff like that, so that when you go for the big tools and a freak mouse accident somehow wipes out your classpath, you know why the darn thing refuses to build and you can straighten it out.

So is something like learning Python instead of C++ using tools and taking the easy way out? Or are they just two different languages?

Well, I'd argue there is no easy way out. It's not about learning Python, or learning C++ or Java or whatever; that's the easy bit, it's just syntax, and it doesn't make you a programmer. I can speak English, but that doesn't make me a poet. To be a poet I need to understand what makes the language what it is: nouns, verbs, adjectives, and then the techniques of rhyme, stanza, etc. These are the key. So don't fret about programming languages when starting; just pick one and get going.

It's like I can scrawl a load of words on a page easy peasy, but getting a good grade from my teacher is a bit more tricky.

I can type a load of correct syntax in a source file, but getting a good grade from my compiler/interpreter is a bit more tricky.

See what I'm saying?

Very much so, and you did clarify: they're both languages of equal integrity, so just go for it and start learning. I found a tutorial for Python that looks thorough, so I'm gonna start there.

Thanks again for the info

> If you're just starting out then don't go for .NET right away and stay away from tools that will make you lazy until you actually get a good feel for doing it without the added cushion.

I'd have to disagree here.

You could most certainly use a .NET language (C#/VB.NET/C++) and not use Visual Studio. In fact, I could fire up Notepad, write some C# code, save the file as myprogram.cs, and then feed it to csc.exe to compile it into an executable. If you want to learn the nitty-gritty of the language's syntax, you can do it without Visual Studio.
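For example (just a sketch of my own; the class name and message are whatever you like):

// myprogram.cs -- written in plain Notepad, no IDE involved.
// From a Visual Studio command prompt, compile it with:
//   csc.exe myprogram.cs
// and you get myprogram.exe.
using System;

class MyProgram
{
    static void Main()
    {
        Console.WriteLine("Hello from Notepad and csc.exe!");
    }
}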

In that respect, it's not that different from, say, learning C in a text editor and then compiling using gcc. In my opinion, though, using an IDE like Visual Studio when learning can yield a great benefit: rather than focusing on the syntax so heavily, you could then focus on learning fundamental object-oriented programming concepts. Such concepts illustrate the power of languages like C#, VB.NET, and Java, but many new programmers don't utilize the features, because they feel they're too difficult to grasp.
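For instance, here's the sort of concept I mean -- a little inheritance sketch I made up on the spot, not from any real library:

// Inheritance: Shape declares what every shape can do,
// and Circle fills in the details.
using System;

abstract class Shape
{
    public abstract double Area();
}

class Circle : Shape
{
    private double radius;

    public Circle(double radius) { this.radius = radius; }

    public override double Area() { return Math.PI * radius * radius; }
}

class Demo
{
    static void Main()
    {
        Shape s = new Circle(2.0);     // a Circle can go anywhere a Shape is expected
        Console.WriteLine(s.Area());   // prints 12.566...
    }
}

Getting comfortable with that kind of thinking early pays off in any OO language, whether or not an IDE is involved.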

Well, for me the Visual Studio environment has been a good place to learn. I realize that I need to understand what's going on behind the scenes (kinda like writing HTML in Notepad instead of using FrontPage), but the one good thing about using an IDE is the "instant gratification" that comes along with it. I took my first programming class over a decade ago, and I could only take so much of writing code at the command prompt. I think that the .NET environment is nice because it's encouraging for new programmers... you can create something more quickly and see instant results. This is motivating to me and makes me want to do more and more. I still, however, plan to take the concept courses in data structures, etc., because the fundamentals are very important for all the reasons you guys have listed.

I wasn't just referring to the editor when I said stay away from .NET from the get-go. The general syntax and way of doing things is just way too easy, and it gives the misconception that all languages are that easy, and it will make you lazy... It doesn't work that way for everyone, but I've seen it happen too often.

> rather than focusing on the syntax so heavily, you could then focus on learning fundamental object-oriented programming concepts.

Sadly, for most people it ends up being not learning the syntax but learning where all the buttons and wizards are located to generate code for them, and then crying like babies when they end up having to write some code by hand, because they don't know how.

Hmm.. this leads me to another question:

I'm debating between a degree in CIS or Software Engineering. The CIS degree provides more business courses but isn't as heavy on programming. I'm thinking of doing the CIS degree and taking programming courses at the community college along the way. I hear that CS degrees can be a little too dry and theory-based, and I'm not sure how necessary it would be to go that route.

I cannot advise amongst courses, as I have never done a computer science degree of any kind; I studied business and wish I hadn't. Here's some general advice you can take or leave.

Don't do the course you think you should do; do the one that most interests you. You are more likely to do well in it.

At degree level it is not SO important which one you do. Obviously it depends on your future plans somewhat, but those plans should also be aimed at what interests you, because again you are more likely to excel.

A GOOD grade is what's important, along with the quality of the college/university where you get it. Learning to teach yourself new things by researching, reading, and experimenting is the main asset you take away with you from a degree course, and those are the skills you will use most in life thereafter (as well as how to cook, he he...).

> I wasn't just referring to the editor when I said stay away from .NET from the get-go. The general syntax and way of doing things is just way too easy, and it gives the misconception that all languages are that easy, and it will make you lazy... It doesn't work that way for everyone, but I've seen it happen too often.

Don't take this like I'm drilling you or anything, but can you please cite a specific example of the phenomenon you're describing? The main differences I've seen are that you aren't required to do your own memory management (the garbage collector handles it), and that the .NET framework allows you to write modules in one .NET language that are usable in another .NET language. C# is really similar to Java, and it's not that far off from C++. In fact, the syntax is not all that different from Perl or PHP. If you can understand VB.NET, you could probably understand Python fairly well, or even Ruby.

I don't entirely think that the choice of language necessarily makes someone "lazy". I like the fact that I don't have to fool around with memory management, for instance -- the garbage collector frees me to focus on my programming logic and not mundane housekeeping. What "way of doing things" makes a .NET language any different from another non-.NET language?

I'm genuinely curious. If anyone else wants to weigh in, I'd be really glad to hear. I'm open to any feedback that's presented.

I'm not going to go through everything, or even into much detail. This thread wasn't started to have a language war.


One example:

If x = 2 Then
  'blah blah
End If

Besides the really old If-Then-End If syntax, what happens if you do the same in about any other language not on the .NET platform? I've seen this billions of times. It can occur as an accidental mistype, or, in a lot of cases, it comes from .NET programmers, especially VB ones.

> I'm not going to go through everything, or even into much detail. This thread wasn't started to have a language war.
>
> One example:
>
> If x = 2 Then
>   'blah blah
> End If
>
> Besides the really old If-Then-End If syntax, what happens if you do the same in about any other language not on the .NET platform? I've seen this billions of times. It can occur as an accidental mistype, or, in a lot of cases, it comes from .NET programmers, especially VB ones.

What does happen? Just out of curiosity...

> I'm not going to go through everything, or even into much detail. This thread wasn't started to have a language war.
>
> One example:
>
> If x = 2 Then
>   'blah blah
> End If
>
> Besides the really old If-Then-End If syntax, what happens if you do the same in about any other language not on the .NET platform? I've seen this billions of times. It can occur as an accidental mistype, or, in a lot of cases, it comes from .NET programmers, especially VB ones.

I'm not asking to start a language war-- I know full well how not to do that :D

I understand the example that you provided. Every language has examples of that-- little "gotchas" that you have to watch out for. C#, a .NET language, does not allow you to do that in an if statement (it throws a warning or an error). I'm genuinely asking for an opinion on a language/language family and what drawbacks/limitations it may have. If it really takes the thread far off-topic, I'll split it out, but I think if you make a comment to avoid a particular framework, you owe everyone a good explanation when someone questions it. I really think that the original poster would benefit from such an exposition.
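For example (a quick sketch of my own, nothing more):

using System;

class GotchaDemo
{
    static void Main()
    {
        int x = 2;

        // if (x = 3)   // C# rejects this outright: the assignment has
        //               // type int, and an if condition must be a bool.

        if (x == 3)     // the comparison you actually meant
            Console.WriteLine("x is 3");
        else
            Console.WriteLine("x is still " + x);   // prints "x is still 2"
    }
}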

The .NET framework has a lot of handy and powerful classes/libraries. I think it's simple enough that you could learn basic syntax and programming technique, but it's powerful enough that you don't have to simply drop the language once you really need to start getting work done.
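As one small illustration (a toy example I just made up -- the entries mean nothing), the framework's Dictionary class, which happens to be a hash table under the hood (the very thing mentioned at the start of this thread), gets real work done in a few lines:

using System;
using System.Collections.Generic;

class DictionaryDemo
{
    static void Main()
    {
        // Dictionary is backed by a hash table: each key is "hashed" into a
        // number that decides where its value is stored, so lookups stay fast
        // no matter how many entries you add.
        var phoneBook = new Dictionary<string, string>();
        phoneBook["Alice"] = "555-0100";      // made-up entries for illustration
        phoneBook["Bob"] = "555-0199";

        Console.WriteLine(phoneBook["Bob"]);  // prints 555-0199
    }
}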

> What does happen? Just out of curiosity...

A single = sign is an assignment statement in almost all languages. A double == is supposed to be for comparison, so if you did this in C++:

if (x = 3)   // assigns 3 to x instead of comparing


then it's not comparing x to the value of three, it's setting x to the value of 3 -- and since 3 is nonzero, the condition silently evaluates to true every time.

> I'm not asking to start a language war-- I know full well how not to do that :D
>
> I understand the example that you provided. Every language has examples of that-- little "gotchas" that you have to watch out for. C#, a .NET language, does not allow you to do that in an if statement (it throws a warning or an error). I'm genuinely asking for an opinion on a language/language family and what drawbacks/limitations it may have. If it really takes the thread far off-topic, I'll split it out, but I think if you make a comment to avoid a particular framework, you owe everyone a good explanation when someone questions it. I really think that the original poster would benefit from such an exposition.
>
> The .NET framework has a lot of handy and powerful classes/libraries. I think it's simple enough that you could learn basic syntax and programming technique, but it's powerful enough that you don't have to simply drop the language once you really need to start getting work done.

The biggest reason I despise the language for beginners is because I made the mistake of starting with VB.NET years ago. It made me extremely lazy, and I didn't realize the importance of not programming in an IDE with a GUI builder and IntelliSense right off the bat. From VB.NET I went to Java and just about gave up, because it was too much work compared to VB.NET. Luckily, I'm stubborn and stick with things, so I learned Java from the ground up and have an excellent grasp on it. You can't tell me someone who's just starting to program will want to learn VB.NET without all the added tools... The chances of you knowing not to do so are very slim.


If you have a little experience under your belt then it's fine. The .NET languages are the best around for RAD and really do a great job of allowing a lot of power with little knowledge. I'm not bashing the .NET framework in any way. I actually love it and think it's something special, but I don't think it's good to start out with if you plan on becoming a programmer. Heck, I didn't even consider VB a real language until they added inheritance.

> but it's powerful enough that you don't have to simply drop the language once you really need to start getting work done.

Of course not. I think you've misunderstood what I've said. .NET is one of the best development tools out there, but it hides A LOT of details that any novice should be familiar with. Unless it's a huge application like Photoshop, I would rather use .NET than C++. I do have to say that its web application development hasn't pulled me away from JSP, Servlets, and Java yet.


The best language for a beginner, in my opinion, is Java. I haven't seen a language yet that displays all the concepts of OO in such a friendly and understandable manner. If you want proof, just look around at what language the universities are switching to. If you can master Java and understand its elements and features, then you've got it made. You can go into about any programming language, and it's just learning syntax from there.

Thank you. That was exactly the kind of explanation I was looking for :)

> The best language for a beginner, in my opinion, is Java. I haven't seen a language yet that displays all the concepts of OO in such a friendly and understandable manner.

I don't understand this OO-mindedness that people have. Learning CS isn't about learning how to think in an object-oriented fashion, or how to organize things in that particular fashion (because that's just one way of doing it). It's really only about learning how to solve problems -- being able to state problems precisely, and being able to encode algorithms that you could perform ad hoc into precise definitions. This ability comes from thinking about problems, and particularly, thinking about different ways of describing problems and their solutions. It also helps to have a general understanding of how important parts of the computer, like operating systems, work. And for that it greatly helps to have used vastly different languages, from C to Lisp.
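For what it's worth, here's the kind of thing I mean (a toy example of my own): taking something you'd do ad hoc -- scan a list and remember the biggest number so far -- and encoding it as a precise definition:

using System;

class LargestDemo
{
    // The ad hoc procedure "scan the list, remember the biggest so far",
    // pinned down as a precise definition.
    static int Largest(int[] numbers)
    {
        int best = numbers[0];          // assumes a non-empty array
        foreach (int n in numbers)
            if (n > best)
                best = n;               // keep the larger value
        return best;
    }

    static void Main()
    {
        Console.WriteLine(Largest(new int[] { 3, 9, 4 }));  // prints 9
    }
}

The language barely matters here; it's the habit of stating the problem and its solution precisely that transfers.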

> If you want proof, just look around at what language the universities are switching to. If you can master Java and understand its elements and features, then you've got it made. You can go into about any programming language, and it's just learning syntax from there.

This is just patently untrue. Besides rudimentary forms of BASIC, the only popular languages you could jump to are C# and Python, unless you want to willfully limit yourself to a subset of other languages' abilities.

Universities switching to Java is a good thing? I don't feel like listing a rant here, and fortunately, somebody else has replicated my opinion.

> I don't understand this OO-mindedness that people have. Learning CS isn't about learning how to think in an object-oriented fashion, or how to organize things in that particular fashion (because that's just one way of doing it). It's really only about learning how to solve problems -- being able to state problems precisely, and being able to encode algorithms that you could perform ad hoc into precise definitions. This ability comes from thinking about problems, and particularly, thinking about different ways of describing problems and their solutions. It also helps to have a general understanding of how important parts of the computer, like operating systems, work. And for that it greatly helps to have used vastly different languages, from C to Lisp.
>
> This is just patently untrue. Besides rudimentary forms of BASIC, the only popular languages you could jump to are C# and Python, unless you want to willfully limit yourself to a subset of other languages' abilities.
>
> Universities switching to Java is a good thing? I don't feel like listing a rant here, and fortunately, somebody else has replicated my opinion.

I did say as a beginner language. There's no university that front-loads its courses with tough theory and complex algorithms.


> I'm thinking that the learning curve for this stuff is a lot higher than I expected and that it will take a while to get the hang of it. I feel like I should be able to just jump right in, but all that's done is make me frustrated and want to give up. Kind of the "I'm too stupid to do this" feeling. It's overwhelming at times.

It is overwhelming at times-- I feel the same way. I find myself thinking about it almost constantly (in the shower, while watching a movie, while falling asleep), but what is exciting and wonderful is that after all this processing in my mind (by reading tutorials and writing code) I solve problems that I thought were too advanced for me. It is intellectually rewarding.

I know that this does not really address your concern. I have to tell myself that there is only so much I can do concerning programming in one 24-hour period, while also seeing to other responsibilities: relationship, pets, laundry, meals, etc. Believe me, I can (and do) spend ALL day and night on the computer and compiler, but sometimes I just have to step away and read a book, take a walk with my girlfriend, or cook a nice dinner. The interesting thing is, I usually solve my largest programming problems during these times, casually thinking over the problem, not while I am at the compiler.

Regards.

I did Physics at university, and the course included some numerical modelling work. Guess what language we were given to learn first... Fortran. It has some neat features, like having loads of fancy mathematical functions built into the core of the language. Since learning about other languages I've realised it's not really worth using unless you're doing heavy number crunching. But something I've noticed is that every other language I know about has many similarities to it. This is not surprising, as they all came after it.

So, I'd recommend learning Fortran to a wannabe programmer. It's the original and best. :p

Steven.
