Next week I am going to begin learning about linked lists, and I currently know very little about them. Are there any well-written explanations of linked lists online? Or can someone explain the logic to me? I'd like to have some background before my professor covers it.

A linked list is a series of objects (nodes) that are linked together because each one holds a pointer to the next.

//a basic structure for a linked list node
typedef struct linkedList
{
	int data;
	struct linkedList* next; //points to the next node; in C the typedef name isn't visible yet, so use the struct tag
} linkedList;

I found this online: http://cslibrary.stanford.edu/103/
I didn't really take a good look at it, but it looks reliable.

Edited 5 Years Ago by chrjs: n/a

I'm a little biased, but I believe this is a well written explanation. The code is in C, but the changes necessary to compile the examples as C++ are minor and pretty much just constitute an explicit cast.

Just note (I may be wrong, but it's worth thinking about):

They will teach you that, in theory, linked lists are fast for insertion at the front, middle, and end of the list, fast in the asymptotic (big-O) sense. While this is true in theory, in practice you may find that front and end insertion are potentially faster with an array implementation.

So I'm not trying to pollute your brain or anything; I'm just hoping you appreciate the practical side of data structures. And question everything you can, because in my experience teachers make mistakes every so often, so be aware when they try to pass their beliefs on to you.

Just thought I'd give you a little food for thought. Enjoy.

Edited 5 Years Ago by firstPerson: n/a

in practice you may find that front and end insertion are potentially faster with an array implementation.

There's an implicit trade off with any data structure where the cost of maintaining it (to a certain size threshold) is greater than the benefit of using a smarter data structure. Translation: when the data set is small enough, the amortized cost of arrays is usually smaller than linked lists for any operation.

In my experience, the variability of sizes generally falls into one of two categories:

  1. Constant or variable, but extremely unlikely to grow very large.
  2. Variable, could be teeny tiny, or über huge.

With the first category, arrays or some variant thereof are nearly always the best solution under the KISS principle. With the second category, it's not as clear-cut, and many war stories are told about programmers who chose poorly. ;)

>>There's an implicit trade off with any data structure where the cost of maintaining it (to a certain size threshold) is greater than the benefit of using a smarter data structure

Good point.
