An introduction to algorithms

Dani

There is little doubt that one of the most daunting tasks in all of computer programming is developing your own algorithms. Indeed, it is here that the term Computer Science comes to the fore: designing an algorithm is a step-by-step process so intricate and precise that it truly is a science to master.

So what exactly are algorithms? Simply put, they are step-by-step methods that a programmer tells a computer to follow, often branching in different directions under certain conditions. As humans, we follow countless algorithms in our daily lives without necessarily thinking about them. For example, you follow an algorithm to add two numbers, to walk down a flight of stairs, or even just to turn your head.

Let's take walking down a flight of stairs as a good example to examine more closely. You begin by placing one foot in front of you and then following with the other. Then you notice that there is another step in front of you, so you do the same again. You keep repeating this process until you see there are no steps left, at which point you have reached the bottom of the stairway. This is the most basic form of an algorithmic process: a loop that repeats until a condition is met. The next step is to understand variables. All programming languages involve the use of variables, named placeholders that represent numbers, letters, and symbols held in the computer's random access memory (RAM).
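To make the stair-descent loop concrete, here is a minimal sketch in Python (the step count is an arbitrary value chosen for illustration):

# Repeat the stepping action until no steps remain.
steps_remaining = 12   # arbitrary number of stairs

while steps_remaining > 0:       # is there another step in front of you?
    steps_remaining -= 1         # place one foot down, then follow with the other
    print(f"Took a step; {steps_remaining} to go")

print("You have reached the bottom of the stairway.")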

Variables are quite similar to their algebraic counterparts, and are recognized by both their type and their name. The type of a variable states what kind of data it holds, be that a number, a word, a letter, and so on. The name of the variable can be virtually anything, although there are some restrictions depending on the particular programming language being used. Variables are used in algorithms to store values that vary each time the program is run. For example, a program that simply multiplied the number 5 by two would have no need for a variable, but if that program asked the user for a number N and then multiplied N by two, things would be different. A variable would be needed because N is different each time the program is executed.
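As a short Python illustration (the prompt text is invented):

# Without a variable: the value 5 is fixed in the code.
print(5 * 2)

# With a variable: n can hold a different value on every run.
n = int(input("Enter a number: "))   # store the user's number in the variable n
print(n * 2)                         # the result varies with the input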

Now suppose you wanted to write a program which, given a number, prints out double the value of that number if it is less than ten, and triple the value if it is ten or more. Assume that the variable N represents the original number, and the variable X represents the new number you eventually want to print out. You would begin by using a branch to decide whether N is less than ten; this is known as a decision statement. When your program is executed, the computer compares N to ten and decides which way to go. You have predetermined in your code that if N is less than ten, variable X is set to twice N; otherwise, variable X is set to three times N.
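A minimal Python sketch of this branch, using the variable names from the description above:

n = int(input("Enter a number: "))

if n < 10:        # the decision statement: compare n to ten
    x = n * 2     # less than ten: double it
else:
    x = n * 3     # ten or more: triple it

print(x)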

The truth is that it takes a lot of practice to be able to work out your own algorithms, because the process includes thinking ahead about what your users might do. Imagine the program described above actually existed, where a user enters a number into a text field, which becomes variable N, and a new variable X is then computed and printed to the screen. What would happen if the user entered "dog" in the text box instead of a number? The computer would fail while trying to compare "dog" to ten. For this reason, you need to build error checks into your code: have the computer verify that variable N is indeed a number before comparing it to ten.
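One way to sketch such an error check in Python (using exception handling; the messages are invented):

text = input("Enter a number: ")

try:
    n = int(text)            # check that the input really is a number
except ValueError:
    print(f"'{text}' is not a number, please try again.")
else:
    x = n * 2 if n < 10 else n * 3
    print(x)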

Remember that developing algorithms is crucial for a computer programmer because it is always the first step taken before actual code can be written. So before beginning any programming endeavor, make sure you have a clear idea of exactly how you are going to accomplish the task at hand, from start to finish, step by step.

mattyd commented: Algos \ well-written and concise +2
majestic0110 commented: nice :) +3
ourchiliean commented: You have helped me to understand a little more, thank you +0
morten42 commented: Excellent post +0
pritaeas commented: Well written +4
The ICE Man commented: Very Good, and well written :) +0
jalucci 0 Newbie Poster

This is the best way to get your data types.

mattyd 89 Posting Maven Featured Poster

Thank you for this tutorial. This is just the thing I have been pondering over lately, and it helped to pull the basics of writing and using algos properly into place again. Well-written and concise.


segunisreal 0 Newbie Poster

Well detailed and concise. A good start for a novice like myself; this is getting interesting.

mobman80 0 Newbie Poster

Excellent start for algos. I have been a software developer at Kremsoft and I still always have dramas building algos. Thanks for the help!

morten42 0 Newbie Poster

Dear Daniweb readers:

I think this is a cool example of an interesting, easy and
understandable algorithm. I'd like to note that the digits 1-9 were used
by the Arabs, and zero was invented later. The number zero was invented
independently in India and by the Maya. In India a decimal system was
used, like ours, but an empty space stood for zero up to the 3rd century
BC.

This was confusing, for an empty space was also used to separate numbers,
and so the dot was invented for zero. The first evidence for the use
of the symbol that we now know as zero stems from the 7th century AD.
The Maya invented the number zero for their calendars in the 3rd century
AD. The number zero reached European civilization through the Arabs
after 800 AD.

The Greeks and Romans did not need the number zero, for they did their
calculations on an abacus. The name 'zero' comes from the Arabic
language. We did, however, invent the nil pointer.

The word 'algorithm' itself is derived from the name of an Arab mathematician.

Sedgewick and Knuth have written important and comprehensible
introductory texts on algorithms in computer science.

Using an abacus also follows an algorithm.

Here is my current reading on the hailstone (3n+1) algorithm:

Cliff Pickover's Patterns in the Mysterious Hailstone (3n+1) Numbers

http://sprott.physics.wisc.edu/pickover/hailstone.html

A particularly famous problem in number theory, the hailstone problem,
has fascinated mathematicians for several decades. It has been studied
primarily because it is so simple to state yet apparently intractably
hard to solve. This problem is also known as the 3n+1 problem, the
Collatz algorithm, and the Syracuse problem.
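A minimal Python sketch of the hailstone rule (the starting value 27 is arbitrary; its sequence famously takes 111 steps to reach 1):

def hailstone(n):
    # Return the hailstone (Collatz) sequence starting from n.
    sequence = [n]
    while n != 1:
        if n % 2 == 0:
            n = n // 2       # even: halve it
        else:
            n = 3 * n + 1    # odd: triple it and add one
        sequence.append(n)
    return sequence

print(len(hailstone(27)) - 1)  # prints 111, the number of steps to reach 1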

===


The greatest common divisor is, for a beginner, maybe a little complex to
prove, verify and code, but if you omit the verification and proof,
Sedgewick presents it on the fly, right out of the box.
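Stated without proof, Euclid's algorithm for the greatest common divisor is indeed short; a Python sketch:

def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b).
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # prints 21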

The importance of algorithms is undisputed, but alongside algorithmic
thinking, and different from it, stands object-oriented thinking.
It is hard to find any algorithmic explanation of inheritance in
object-oriented design, even single inheritance, or of delegation.

Beyond what is possible to prove, you may still do a lot of interesting
coding of medical, linguistic, cryptographic or physical problems
that do not need an exact algorithmic proof.

We are all human and have to understand that we cannot take the outcome
of any algorithm for granted and present it as an answer in, say, a
medical context, without taking responsibility for it.

An attorney cannot simply say "Google says so", or even worse,
"Wikipedia says so".

Google's results are based on a very large Markov chain algorithm, yet
the results still have to be interpreted, like everything we may find
in any book.

In object-oriented design and analysis you have:

Abstraction;
Encapsulation;
Modularity;
Hierarchy;
Typing;
Concurrency;
Persistence;
...

Besides these, there are newer paradigms, like closures and parallel
computing. Even an iPhone application for tracking the spread of a
virus, whether bird flu or a computer virus, could be a topic of study:
one that attempts to estimate the distribution of information, or the
presence of a virus, across the globe.

The only algorithm that draws my attention in this case is Dijkstra's
routing algorithm, sketched below.
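A minimal Python sketch of Dijkstra's shortest-path algorithm (the graph is a made-up example):

import heapq

def dijkstra(graph, source):
    # graph maps each node to a list of (neighbor, edge_weight) pairs.
    distances = {source: 0}
    queue = [(0, source)]                  # priority queue of (distance, node)
    while queue:
        dist, node = heapq.heappop(queue)
        if dist > distances.get(node, float("inf")):
            continue                       # stale queue entry; a shorter path is known
        for neighbor, weight in graph[node]:
            new_dist = dist + weight
            if new_dist < distances.get(neighbor, float("inf")):
                distances[neighbor] = new_dist
                heapq.heappush(queue, (new_dist, neighbor))
    return distances

graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}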


None of these are object-oriented. They do not inherit or learn anything
by themselves. This means that even if you find classical algorithms
hard to understand due to a lack of formal mathematical training, an
object-oriented approach to a similar or related problem may be just as
interesting or revealing.

As another example: Google works like a huge Markov chain. The PageRank
of a web page, as used by Google, is defined by a Markov chain. A
sufficient proof of concept is the success of Google's results, but a
formal proof that it will always give only correct hits is difficult,
and in this context not useful.
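A toy Python sketch of PageRank as a Markov chain (the three-page link structure is invented; 0.85 is the commonly cited damping factor):

def pagerank(links, damping=0.85, iterations=50):
    # links maps each page to the list of pages it links to.
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:          # spread this page's rank over its links
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(links))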

In my understanding, a formally correct algorithm may be object-based
at most, but, due to its lack of inheritance, never object-oriented.


===

On the other hand, I am reading Data Structures and Algorithms with
Object-Oriented Design Patterns in C++
(http://www.brpreiss.com/books/opus4/html/page449.html), which says:
"its implementation must be given in a derived class."


It has an abstract class Solver, but I cannot see how inheritance as a
hierarchy of classes can be used in any useful way to express this
algorithm.
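For reference, here is a minimal Python sketch of that abstract-class pattern (the method name and the trivial derived class are my own inventions, not the book's actual interface):

from abc import ABC, abstractmethod

class Solver(ABC):
    # Abstract base: the solving strategy is left to derived classes.
    @abstractmethod
    def solve(self, problem):
        raise NotImplementedError  # derived classes must provide the implementation

class DoublingSolver(Solver):
    # A trivial concrete solver, just to show the inheritance mechanics.
    def solve(self, problem):
        return problem * 2

print(DoublingSolver().solve(21))  # prints 42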


My understanding of the topic is that we can learn a lot from the study
of algorithms, but that object-oriented design, with its class
hierarchies and inheritance, cannot itself be used in an algorithmic
description.

monaxisomcio 0 Newbie Poster

At first it was hard for me to understand computer languages, but I have now learned that I have to get an overview of this. Thanks, by the way.
