Look at this: http://www.codinghorror.com/blog/2007/01/how-to-become-a-better-programmer-by-not-programming.html

Has anyone here who simply couldn't write efficient code (complexity-wise) before, and who felt intimidated by people who could, decided to work really hard, tried to solve an insane number of problems, and actually gotten better?

I've been in touch with programming for some 4-5 years. Some of my friends who only recently picked up C, and who talk total rubbish about it, can solve those hard programming problems. I end up writing a long, well-polished program that employs brute force, while they implement the best algorithm with some really messy code.

I understand programming structures quite well, can read documentation, and write understandable code.
Could reading CLRS and working day and night at topcoder help me?

Thanks


Define "better". Do you want to win contests or be a good programmer in general? Do you want to write professional quality code or code that just solves the problem as quickly as possible, and maintainability be damned?

That's the difference between competition and software development. Most of the time, code that wins contests would miserably fail a code review. Likewise, good production code will consistently lose contests.

Tagging on to deceptikon:
"Better" is subjective, and thus in the eye of the beholder.
So who is beholding your "better"? Is it you? Is it your co-workers?
In 15 years of field service the question always hung in the air: who decides what is good enough?
The answer is almost always: The Customer.
Do you know who your customer is? Do you know what your customer requires?

The better you serve your customer's requirements, the more valuable you are. It can be a measurable process once you know what you are looking for. That makes it objective.
As always, know where you are going!

If one can write an algorithm that is significantly faster than a brute-force one, that is better.
Satisfying the customer is the goal, yes. But there are so many roles that contribute to that: a UX designer would, a software tester would. But all my peers, and the companies that recruit interns, look for people who can write algorithms that are fast.

I just fail to see the wonderful patterns that arise out of the better (faster) algorithms.
I wonder if solving tons of problems on topcoder, codeforces, etc. would actually help.

If one can write an algorithm that is significantly faster than a brute-force one, that is better.

Is it? What if the brute force algorithm was fast enough already? What if the faster algorithm is so godawful complicated that the original author barely understood it and left the company? One of the first things I try to teach college grads is cost benefit analysis for code. It may be cool, or theoretically superior, but if the cost exceeds the benefit, it's an inferior solution.
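
Just to make that cost/benefit point concrete, here is a small, hypothetical C++ example of my own (not taken from anyone's actual project): a brute-force "does any pair sum to a target" check next to the asymptotically faster hash-based version. If the input is a few hundred elements checked once a day, the O(n^2) version is already fast enough, and it's the one a reviewer understands at a glance.

    #include <cstddef>
    #include <unordered_set>
    #include <vector>

    // Brute force: check every pair. O(n^2), but obvious at a glance.
    bool has_pair_sum_brute(const std::vector<int>& v, int target) {
        for (std::size_t i = 0; i < v.size(); ++i)
            for (std::size_t j = i + 1; j < v.size(); ++j)
                if (v[i] + v[j] == target) return true;
        return false;
    }

    // "Faster" version: O(n) on average with a hash set, but more moving parts
    // to review, test, and maintain.
    bool has_pair_sum_hashed(const std::vector<int>& v, int target) {
        std::unordered_set<int> seen;
        for (int x : v) {
            if (seen.count(target - x)) return true;
            seen.insert(x);
        }
        return false;
    }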

But all my peers, and the companies that recruit interns, look for people who can write algorithms that are fast.

Obviously you shouldn't intentionally write slow code, but fast has a price, and if an employer doesn't understand that, they're not worth working for.

I just fail to see the wonderful patterns that arise out of the better (faster) algorithms.

Studying algorithm and data structure design would be a good start. Without a strong foundation in the theory, I'm not sure you'd suddenly start seeing those wonderful patterns just because you're under a time or resource crunch.

The better you serve your customer's requirements, the more valuable you are.

Again, it depends on what the customer's requirements are. Before I retired, a programmer in another department always met his deadlines with code for a critical project (water reservoir management as it pertains to hydro-electric generation). He wrote his code using modules that he developed in PL/1. Eventually he left the corporation. Within a year of his leaving, they had to hire him back as a contractor at several times his original salary, because none of his code was documented and it was written so horribly that nobody else could understand it. Under the new contract his rewritten code had to be peer-reviewed (at yet further cost) and approved at every stage.

Strictly speaking he satisfied the customer's original requirements. The code met all the specs and ran correctly. However, the code was impossible to maintain or modify so in the long run the customer was poorly served.

Could reading CLRS and working day and night at topcoder help me?

I would say that it doesn't hurt. From time to time, I pick up some simple or classic problems to challenge myself a little bit (without having to devote too much time to it). But this is mostly a matter of reviewing the basics, which is useful: it feeds into real-world programming as a place to draw new ideas (from the old ideas) and keeps you from missing some obviously better ways to do simple day-to-day programming tasks. But I would not say that it is any kind of significant part of what will make you a "better" programmer, in whichever sense of the word.

If one can write an algorithm that is significantly faster than a brute-force one, that is better.

Really? Did you know that you often need quite a large amount of data before a binary search starts to out-perform a simple linear search? A large part of being a good programmer is being able to judge when the effort is worth it and when it is not. This is the hardest thing to learn, and I would say I've still got plenty to learn in that domain. If you can come up with the best possible algorithm for every single problem you face, and you don't have enough common sense to restrain yourself, you are likely to waste inordinate amounts of time for no good reason. As they say, better is the enemy of good.
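
To put a rough number on that (this is only a sketch I would use to check for myself, not a rigorous benchmark; results vary with compiler and hardware): on a sorted array of a few dozen elements, a plain linear scan is often as fast as, or faster than, std::lower_bound, thanks to cache locality and branch prediction. Something like this lets you measure it yourself:

    #include <algorithm>
    #include <chrono>
    #include <cstddef>
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main() {
        std::vector<int> data(64);                 // small sorted array
        std::iota(data.begin(), data.end(), 0);    // fill with 0, 1, ..., 63

        const int iterations = 1000000;
        long long sink = 0;                        // printed later so the work isn't optimised away

        // Linear scan
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < iterations; ++i) {
            int target = i & 63;                   // vary the key so the search isn't hoisted
            for (std::size_t j = 0; j < data.size(); ++j)
                if (data[j] == target) { sink += static_cast<long long>(j); break; }
        }
        auto t1 = std::chrono::steady_clock::now();

        // Binary search via std::lower_bound
        for (int i = 0; i < iterations; ++i) {
            int target = i & 63;
            sink += std::lower_bound(data.begin(), data.end(), target) - data.begin();
        }
        auto t2 = std::chrono::steady_clock::now();

        using ms = std::chrono::duration<double, std::milli>;
        std::cout << "linear: " << ms(t1 - t0).count() << " ms\n"
                  << "binary: " << ms(t2 - t1).count() << " ms\n"
                  << "(checksum " << sink << ")\n";
        return 0;
    }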

About the Jeff Atwood (erk..) article you linked to, I think there is a bit of a misinterpretation here. The way I understand Bill Gates' quote is that after a few years of programming you should know whether you "get it" or not (and you could say the same of pretty much anything else, like math, sports, languages, etc.). I don't interpret it as meaning that after those few years you have nothing significant left to learn; that would be ridiculous, IMHO. One thing I will say is that there are always learning plateaus throughout a programmer's learning curve / work experience / lifetime. These plateaus are characterized by a moment when you declare "I'm the best, I'm a programming machine, and no one can teach me anything anymore", and often a few months or years later you have an epiphany that sends you on another hike up the learning curve to the next plateau. After a few of those you start to realize that there is probably no actual summit to that mountain.

Anyways, my point is that when you hear people say something like "after the first few years I had pretty much learned everything I use now, and the rest of the time I've spent programming didn't teach me much more", it is probably because they are in a very comfortable plateau. In other words, what they've learned so far is sufficient for their job and they haven't faced anything that required them to explore more unknown territory. Sometimes this is good and effective; sometimes it means that you are stalling and are not producing the best code you could (which is, again, subjective: is "best" the most maintainable or the most clever, etc.).

In any case, I would say that it takes at least 10 years of sustained coding in a few different languages and on medium to large size projects to really acquire any kind of significant mastery of the overall craft. Now, those years of experience could be dedicated to learning to produce the most complex and clever code known to man, to learning to effectively produce robust and maintainable code, or to learning to construct very ingenious software architectures; that really all depends on what your job requires and what your capabilities are. These things are just different crafts; we just happen to call them all "programming".

Also, just to add, the bulk of day-to-day programming is really really trivial from an algorithmic / computer-science point of view. So, it's hard to make an argument that this is really such an important skill that you should devote most of your "learning" to it. In my opinion, software engineering (i.e., design of the architectures and such), maintainability and quality assurance are the three most important skills to focus your learning on. Most programmers start out being passionate about solving those puzzles (like the topcoder stuff), and that initial learning is probably sufficient for the bulk of all day-to-day programming tasks. What makes the biggest difference are those three areas I mentioned.

Obviously you shouldn't intentionally write slow code, but fast has a price, and if an employer doesn't understand that, they're not worth working for.

I've had one instance where we did intentionally have to write slow code, but that was a very special case where the code had to be slowed down so as not to get ahead of the ADC that fed it with data from an external sensor.
The code could read, say, 1000 bytes per second, but the ADC could only supply 100 bytes per second, and the sensor supplied only 10 bytes per second.
So we had to slow our code to the point where it wouldn't flood the ADC and cause the hardware to crash.
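
The pacing itself was nothing exotic. Here is a hypothetical C++ sketch of the idea (read_byte_from_adc is a stand-in I made up, not the real driver call): space the reads roughly 10 ms apart so the loop never asks for more than about 100 bytes per second.

    #include <chrono>
    #include <cstdint>
    #include <thread>
    #include <vector>

    // Stand-in for the real ADC driver call; the actual hardware interface is
    // not part of this sketch.
    std::uint8_t read_byte_from_adc() { return 0; }

    int main() {
        using namespace std::chrono;
        const auto interval = milliseconds(10);   // budget: ~100 bytes per second
        std::vector<std::uint8_t> buffer;

        auto next = steady_clock::now();
        for (int i = 0; i < 100; ++i) {           // read one second's worth of data
            buffer.push_back(read_byte_from_adc());
            next += interval;
            std::this_thread::sleep_until(next);  // pace reads so we never outrun the ADC
        }
        return 0;
    }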

Similarly: I've had to rework underperforming code more than once. Most significant here was an application that needed to run once every 24 hours but took 36 hours to run to completion.
I could have done more, but by the time it ran to completion in 16 hours the customer was happy and we stopped trying to optimise further, rather than throw more money at what was no longer a problem.

Overall, you write code to be fast ENOUGH, good ENOUGH. Perfection is unattainable and striving for it only leads to missed deadlines, cost overruns, and exploding budgets.

Although possibly questionable ethically and technically, there is some mileage in the reasoning behind being the only one who understands your coding. It worked out quite well for Jim's former colleague. But antics like that, which would really only be advantageous in a corporate setting where you could actually profit from such a situation or be protected from 'downsizing', could mean you never get another high-level job in that line of work, or even in that country, again.

Your last job might be the bridge to your next job. Don't burn your bridges.

Although possibly questionable ethically and technically, there is some mileage in the reasoning behind being the only one who understands your coding.

True, it's usually perceived as job security. However, I find that attitude so repulsive I'd be willing to fire an employee for it and give the new employee bonuses for the pain and suffering of figuring out or rewriting the obtuse code. A programmer that writes bad code as job security isn't the type I want to employ.

Although my comment may have appeared like I was advocating such behaviour, I did stress that anyone doing such a thing would lose in the long run. They would become unemployable in what was their field of interest and expertise. So, keeping what you quoted of my comment in context: although such antics are doable, they're inadvisable.

All that said, it's not unusual for employees to make themselves irreplaceable in many areas of employment. I suppose it comes down to the sort of relationship and contractual agreements that exist between the employer and employee. A contractual agreement would protect an employer from a rogue coder, and protect the employer from any breach of employment law in terms of unfair/constructive dismissal.

In my job I was always willing (even eager) to show others how to do the things that apparently only I was doing. A lot of that was using vbScript to automate processes by using scriptable components from other applications. Our corporate dbadmin specialist, in particular, was excited to see how I was doing the automation. I believed that I made myself more valuable to the corporation by sharing my expertise rather than by hoarding it.

Bro, don't worry! I know how you are feeling; you are in the same boat as me. I also do topcoder, codeforces, codechef and all, and I know what your problem is. I also thought like you initially, but within one year I started thinking in terms of algorithms. I even apply algorithms in daily life. Trust me, just give it time. In the metro, on the train, on a plane, while walking, try to think about some problem from SPOJ etc. Start thinking: take one problem and try to do it in 1 day, or 2, or 3, or 4. You will not believe me, but in the beginning I once took 11 days to solve one problem which looks so easy now. I am also going through my internship and placement season, and those stupid companies take those who will give fast algos. :p Take one problem and think on it for hours, and days. Believe me, when you finish that problem you will say "woopieee!!" Then you will enter a cycle: DO PROBLEMS --> INCREASE CONFIDENCE --> DO MORE PROBLEMS --> INCREASE MORE CONFIDENCE... and it never ends. Just try my method once, and after some time it will be you saying "yes! now I can solve them and can see the patterns" :) Trust me, this is my practical experience and I have gone through the same feelings you have written about above. Thanks.

I believed that I made myself more valuable to the corporation by sharing my expertise rather than by hoarding it.

That is the best way to be for job satisfaction; the only problem is that if you shine too brightly you can find that some people get nervous about their position and your motives. I'm not interpreting your comment that way, but it's important for the uninitiated to keep this in mind.

In my particular case, the job position I had when I retired was the same one I had when I was hired. My salary, of course, was considerably higher upon leaving. My actual job title was Process Control System Software Specialist and over the years I progressed from level 1 through level 4 (with each level having several salary steps). Because of technology changes my job requirements changed substantially and at every employee evaluation I was asked where I wanted to be in five years. Each time I made it clear that my interests lay in the technical rather than managerial aspects of the job. It wasn't that long before they got the idea that I wasn't interested in anyone else's job. In spite of that, I had to assume a partly managerial position when my group was temporarily divided in 1997-1998. Half of the group was put on the development team to replace our aging AGC/SCADA system. The remainder maintained that old system. When the new system was commissioned I went back to only technical work.

Short version - nobody saw me as a threat.

Short version - nobody saw me as a threat.

That's the crux. If people think you're after their job, they'll start trying to get rid of you, and if those people are higher up in the corporate pecking order than you are (which they will be if they think you're after their job, of course) they'll have better access to people who can make that happen.

I always stress that I'm not interested in management positions; I lack the personality for it.
Sadly, too many people just can't get their heads around the idea that some people do NOT want to be the boss, and think you're just being sneaky and going behind their backs when you say so.

What I've seen change, and this is only over the past fifteen years, is that job titles and descriptions have become quite blurry. Many jobs are interwoven, and there are many more employee disputes against employers. In fact, it's become so bad that the Government in the UK has recently introduced a regulation that makes it harder for an employee to take legal action against the employer.

To bring us back to the thread: experience is becoming less and less significant. Increasingly, employers are more interested in having employees who will simply follow the system. Trust and commitment are irrelevant because the system's 'monitoring' methods (performance reviews, etc.) will reveal whether your output is acceptable. For an increasing number of multinational companies, it doesn't matter whether you are enjoying your job or not; they just want to see stats that show you are doing it.

Thanks everyone.
I watched the first lecture of MIT's Design and Analysis of Algorithms course (OpenCourseWare). He starts off from the very basics. He says the purpose of that course is to study the efficiency of algorithms. Then he lists what is more important than efficiency: all the stuff you mentioned (maintainability, readability, simplicity, user experience, etc.). Why study efficiency then? He says it is to buy those more important commodities :)

Why study efficiency then? He says it is to buy those more important commodities :)

Or maybe just because it's fun ;)
It's fun and challenging to learn about and apply efficient algorithms. We're entitled to have some fun too, right? Even if it isn't objectively the best use of your "learning-time".

It's a humble request: may I know the reason for the 2 negative votes on my previous post in this thread, so that I can improve in this area too? Thanks if you can do that.

If someone wanted to give a reason for the downvote they would have given one. If I ever downvote you I promise that I will tell you why.

If I ever feel strongly enough about a downvote to give a reason, I make sure to provide a comment and it becomes negative rep.

Hmm, maybe. But sir, now I am just improving myself as a human, as a coder, as a user on DaniWeb, and everything. My aim now is to improve myself as much as I can. No matter how many failures I get, success will be there for me at some point. :)

(With experience) we get more pragmatic, reliable and conventional, but not better - "better" requires creativity, originality and so on...

when you are experienced you tend to avoid trials and go straight to results.

"better" requires creativity, originality and so on...

Only in situations where creativity, originality, and so on are required. "Better" is subjective, and it's perfectly reasonable to say that a code monkey can get "better" without being more creative or original.

when you are experienced you tend to avoid trials and go straight to results.

When you're experienced, your trials just become that much more complex. ;)

thank you for your point of view.

In my experience, with experience you get more cynical, more fatalistic, and more focused on getting the job done at all, rather than on whatever high ideals about beautiful results you may have had in the past.

Your mind gets better with good experience.
Eventually you learn to avoid painting yourself into a corner.

My last (ever) boss told me "you're going to get sh!t no matter what you do" so experience taught me to do it my way. What he actually meant was "I'm going to give you sh!t no matter what you do" so, yes, he was close to being the worst boss ever. Fortunately my users were a different breed entirely.

You get better with experience if

  1. you have the capacity to improve
  2. you are willing to learn from your mistakes