mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Now when I look at the code and see that I don't return anything from the function, I don't understand why it worked at all. The compiler should give a serious error, or the program should crash immediately, if it doesn't break the laws of C++.

Yeah, this is something worth knowing about C++. There are a few different levels of "serious error" in C++.

First, there are compilation errors, i.e., when the code just doesn't compile. These are usually restricted to syntax errors (code that breaks the syntax rules of C++), not logical errors or "bugs" (although there are advanced techniques in C++ for getting the compiler to detect logical errors and bugs, I won't get into that here).

Then, you have run-time errors, but these are also limited by how much error checking you do in your code. C++ tries to be an "opt-in" language in the sense that most run-time checks are things the programmer can choose to adopt in his code (or enable through compiler options); they are generally not there by default, because they hinder performance. And because C++ is native (it does not run in an interpreter or virtual machine), there are very few built-in safety checks, only the most critical ones (like memory overflows or access violations), and these generally just terminate the application with no further information about the error.

Then, arguably the most important class of "serious errors" is …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

What do you guys think?

I think there must be a misunderstanding somewhere. Your boss' idea of your job responsibilities and skills, and your own idea of them, are definitely not the same. Maybe, as far as he knows, he hired an "IT guy", and if his idea of IT is limited to using Outlook, then maybe he thinks it's part of your job description. On the other hand, maybe you thought your job description was to be a pure coder, when in reality you are expected to do more.

If you're a programmer like me, what would you do?

Get a meeting with your boss and try to clarify the confusion. Clearly, there must be some confusion somewhere, because nobody would think that being taught about Outlook and then going off to teach people about Outlook is a productive way for you to spend your time.

That said, if you are lacking in the "people skill" areas, then you might benefit from an opportunity to develop them.

But still, teaching Outlook is a real waste of time; it's a really extreme example. The fact that your boss even considered this a good idea (and worth the investment of an employee's time) makes me seriously question his intelligence. But then again, managers are not known for being smart... quite the opposite.

ddanbe commented: Great! +0
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I have to disagree with JasonHippy... the debugger is not going to help you in this case.

Even before looking at the code, just from the description of the symptoms, it was very obvious to me that this is a memory issue, most likely a combination of leaks and uninitialized memory. That means you are not deleting memory that you've allocated, and you are accessing memory that you have not allocated or not initialized. This is why the debugger won't help you much: these kinds of errors are really hard to trace even with a memory profiler, and the error is not one thing, it's everything.

After looking at your code (not in detail, obviously, because there is a lot to look at), it only confirmed my suspicions. You have an enormous number of leaks in that code. Are you aware that, in C++, you have to delete anything that you've allocated with new? You allocate objects with new everywhere (far more than you should), and there is not a single delete statement anywhere. This is bound to leak memory all over the place, and it is why the program runs for a while but eventually becomes extremely slow, almost at a stand-still. Also, OS or library protections are sometimes in place to prevent your program from consuming so much memory that it overwhelms the entire system, and usually, these protections will "terminate the program in …

JasonHippy commented: Ooops, I missed that! Good save! +9
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

When we are talking about global or namespace-scope variables or functions, the static and extern specifiers pertain explicitly to the "linkage" of the entity. When extern is specified, it means that the entity has "external linkage", i.e., it becomes a symbol that is visible at link-time across multiple translation units (hence "external"). When static is specified, it means the opposite, "internal linkage": the entity is only visible within the translation unit in which it is defined (not the same as declared). If you are confused by some of the terms I used, you should read my tutorial that explains the compilation and linking process in great detail.

By default, everything is extern, so there isn't much point in specifying it, except to make things more explicit. Making entities static is mainly useful to "hide" things that are not relevant beyond a single translation unit, or when you want each translation unit to have its own separate copy of the entity (much less common).

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I always use "were" in that context. It just sounds more "right" to me (but then again, I'm not a native). I was always curious about this, and whether it was acceptable.

The following is just speculation, but I think it might explain some of the discomfort with this form ("If I were .."). English used to be structured like German, even as late as Old English. It got simplified due to Scandinavian and Norman influences, and lost most of its annoying Germanic inversions and inflections as a result. But there are still traces of Germanic structures, and they still remain kind of acceptable sometimes (they don't sound completely wrong). Just think of Yoda and the way he speaks (fun fact: in dubbed German, Yoda speaks normally). And, under those structural rules, it should be "If I a carpenter was,..." or "If I carpentry to take was,...". The point is, under those rules, the verb does not belong at the third position, after "If" (conjunction) and "I" (subject), as it naturally belongs in the second position, and is thus thrown back to the end of the clause. So, maybe, just maybe, the discomfort is due to some innate sense that that verb doesn't belong there (at third position, which comes from the French structural rules, i.e., the "normal" rules).

Does that make me crazy? Possibly... but maybe we're crazy, probably...

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

An Apple was sold to a woman who already had everything, and at the highest possible price: the first strike of genius marketing, we ought to call it the original sin. A befitting name for a company, Apple.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

What does 'Pay It Forward' mean? Some sort of jargon? Never heard it before.

It's a pretty well-known expression; it's a play on words on "pay it back". If you have a debt to someone, you pay them back. But if you've been granted something by someone (like an endorsement), you "pay it forward", i.e., grant the same to someone else, to propagate the chain of kindness. We have the same expression in French too: "payer au suivant". I'm surprised you've never heard it before.

why the variations of 'Career Profile' and 'Curriculum Vitae'/'CV'.

I think the Career Profile is supposed to be a bit more complete, "CV + DaniWeb activity", since it shows the CV plus DaniWeb endorsements / latest articles / posts. The CV page is more minimal. But it's true that they seem a bit redundant.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Returning the length of the passed string was just me testing; after you mentioned it, I will instead return the length of the string copied to the buffer.

Yes, you should follow conventions. Most functions like this return the number of characters (bytes) written to the destination buffer. At least, that's the C convention (the C++ convention is to return an iterator (pointer or otherwise) to one past the last element written), and the C convention is what you need to follow if you are writing a C API function like that.

BTW, the reason it is convenient to return the number of characters written (as sprintf does) is that you can do things like this:

#include <stdio.h>

/* Illustrative option masks (values made up for the example): */
#define FIRST_OPTION_VALUES  0x00FF
#define SECOND_OPTION_VALUES 0xFF00

void manufactureComplexString(int someOption) {
  char buffer[1024];
  char* p_str = buffer;
  int consumed = 0;

  consumed = sprintf(p_str, "This is just a test.\n");
  p_str += consumed;

  if(someOption & FIRST_OPTION_VALUES) {
    consumed = sprintf(p_str, "First option with value = %d\n", someOption & FIRST_OPTION_VALUES);
    p_str += consumed;
  }

  if(someOption & SECOND_OPTION_VALUES) {
    consumed = sprintf(p_str, "Second option with value = %d\n", someOption & SECOND_OPTION_VALUES);
    p_str += consumed;
  }

  fputs(buffer, stdout); /* print result (never pass a buffer as a printf format string) */
}

The point is, it creates a convenient way to chain operations without having to keep track of too much information. The C++ convention is far nicer, by the way, but hey, if you're stuck with C, you're stuck with C.

Returning the length written is also a convenient way to express that "nothing was done" in case of an error.

I just get …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

First of all, we don't do people's homework here. You have to show effort and ask questions about specific problems you are encountering with your code.

Second, please mention the language in question, so the thread can be placed in the correct forum (by language).

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Have you installed the Nvidia Linux driver? See here. It would seem that the latest appropriate version is 331.38.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

If you invite people on a Linux/Unix forum to ask them questions about Ubuntu, the results won't be very objective, because most people here are probably quite familiar with Linux and Ubuntu. So, you should not take the number of people who said they knew or used Ubuntu as representative of people in general.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

A modest CV is better than no CV.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Here is my career profile: http://www.daniweb.com/profiles/787121/mike200017

Pretty cool new feature! Thanks Dani!

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Just install a version of Linux, and start playing around with it. Do stuff in the terminal, write code, write scripts, etc.. It all depends on what you want to do with it (sys admin, coding, etc.).

There are basically two main families of Linux distributions: Red Hat and Debian. That split is where most of the differences lie, but even those are not many. On the Red Hat side, you can get distributions like Fedora (and its variations, "spins") or other derivatives. On the Debian side, the main family is Ubuntu and its derivatives. Frankly, any of them will do; Linux is pretty similar between distributions or families of distributions. Think of different distributions as different "flavours" of Linux, because under the hood they are all nearly the same. I suggest you just research around a bit, look at screenshots / feature lists to figure out which distributions might best suit your taste, and then try one, or a few.

You can first try out different distros by installing them in a VirtualBox. But, ultimately, you really should do a dual-boot installation with the distro that you prefer.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

so it would be best to ask an expert in that field to get the newest info on the subject

You are pretty much talking to one, in all modesty.

I know of the series that converges to the square root of a number, and a few years ago, I even saw that some people use that algorithm in C++, so when you compute a square root, that thing gets called. I also know that you could compute it with a Chebyshev polynomial, and that way, it is way faster.

You're half right. Those converging series are used to compute square roots and, in fact, any transcendental function (exponentials, logs, roots, trigonometric functions, etc.). But those series are implemented in hardware, not in software. Basically, if you ask for the square root of a number, there is a module on the CPU that will compute about 40 terms of the Taylor series all at once and add them up to obtain the result. That takes many more clock cycles than a typical addition or multiplication, but overall, it is very fast, about as fast as it can be without sacrificing precision.

However, in some cases, most notably in 3D game programming, precision is not important, but speed is. In those cases, people write functions that compute a truncated series (10 terms or so) instead, to spare a few clock cycles at the expense of precision in the results. These functions are usually written directly in assembly.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

That also means all you young folks would never get jobs because there were so many old people still working.

If the retirement age was 950, then you would also be part of the "young folks". They would say: "Pfff, he only got gray hair like a couple of decades ago.. he's almost an infant."

Earth is already overpopulated; what would it be like if the life span were doubled or tripled?

My guess is that there would have to be severe population control, i.e., the number of children would have to be limited severely (and I can't imagine how you would go about doing that, short of sterilizing most people and things like that). That's the sad part of this "living 1000 years" thing: there wouldn't be many children around, and not many people "allowed" to have children either. That would take away one major source of purpose and fulfillment, not to mention that it would be boring if only adults existed (would we still have amusement parks and such?).

Reverend Jim commented: The new motto would be "don't trust anyone over 500". +0
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Yeah, VLAs are a C99 feature that was made an optional extension in C11. Many C++ compilers, like GCC, support this extension in C++ too; that is how this code is allowed to compile. In MSVC (which only supports C up to C90), you can use alloca to allocate stack memory; it's just a little less convenient. And in any case, it is not really worth making optimization checks with the MSVC compiler; it is well known to be dismal at optimization.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

So, I did those tests and because the results were so interesting, I decided to post it as a tutorial of its own. Check it out.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Following up on a discussion in this thread, I did some tests on the performance of recursion vs. iteration, and some interesting things happened, so I thought I would share the results. I'm going to structure this a bit like a lab report, because that's really the best way to put it. But I hope the discussion can continue with additional ideas or insights into the performance issues discussed here, or with objections to my analysis.

Experimental Method

So, I took Schol-R-LEA's coin-combination example as a test case, and timed the execution of three different versions: recursive, iterative with dynamic allocation, and iterative with VLA (stack-based arrays) (because of this discussion). Here is the code:

#include <vector>
#include <utility>
#include <chrono>
#include <iostream>

// recursive version
int count_combinations_rec(int sum, const int* coins) {
  if (!*coins || sum < 0)
    return 0;
  int result = count_combinations_rec(sum, coins + 1);
  if (sum == *coins)
    return 1 + result;
  else
    return count_combinations_rec(sum - *coins, coins) + result;
}


// equivalent iterative version (shared by other two functions):
template <typename TaskIter>
int count_combinations_iter_impl(TaskIter first_task) {
  TaskIter cur_task = first_task;
  int result = 0;
  while( cur_task >= first_task ) {
    const int cur_sum = cur_task->first;
    const int* const cur_coins = cur_task->second;
    if( !*cur_coins || (cur_sum <= 0) ) {
      --cur_task;
      continue;
    }
    ++(cur_task->second);
    if(*cur_coins == cur_sum)
      ++result;
    else {
      ++cur_task;
      cur_task->first = cur_sum - *cur_coins;
      cur_task->second = cur_coins;
    }
  }
  return result;
}

// iterative version with dyn-alloc
int …
Moschops commented: Thanks Mike. This sort of practical knowledge about the tool is always good to know and makes me look far more competent at work than I really am. Much appreciated :) +11
Ancient Dragon commented: Great job :) +14
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

You a connoisseur of spelling too? ;)

Hehe.. Actually, in this case, I'm a connaisseur of French (my native language), and also quite of the opinion that if you guys (English-speakers) are going to borrow a word from French, and still try, but fail miserably, to pronounce it as it is in French, then you can at least spell it correctly. When anglophones say "connoisseur", they actually pronounce it "connaisseur" (or at least, try to), so they should spell it like that (and I believe it is an accepted spelling too). Glad you caught it, I wrote it like that on purpose, but no one else caught it. ;)

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

What do you get when D is close to 0?

When D is close to zero, you are close to an under-determined case and you have to switch to a minimum-norm solution. When the determinant is zero, the matrix is called rank-deficient (in row rank) in mathematical analysis. In numerical analysis, however, we generally talk about the numerical rank or the condition number (the ratio of the largest to the smallest eigenvalue, in absolute value). The condition number has, in general, a direct effect on the amplification of numerical round-off error. Any operation inflates the round-off error; it's just a matter of how much. In a linear-system solver, the round-off error on the input is, at best, multiplied by a factor proportional to the condition number of the matrix; e.g., if the condition number is 1000 and the error on the input is 1e-6, then the error on the output (solution) is roughly 1e-3. So, if the condition number is too high (e.g., the determinant is near zero), the whole calculation is meaningless, because the error is larger than the value, meaning that the solution is garbage. This is why it is important to worry about this. Also, some algorithms are worse than others in terms of round-off error amplification. For instance, the determinant method produces terrible errors (the amplification factor is the square of the condition number, if memory serves me right), while a more decent method like QR or RRQR achieves the …

Paul.Esson commented: Informative ! +4
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Welcome!

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

what someone was thinking when they wrote the contract

Probably: "Yay, I'm getting someone to agree to be my slave for life."

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Yeah, but I mean only programmed code.

That's your mistake right there: you cannot control or restrict the means by which attackers attack. It's like asking whether it is possible to build an unbreakable wall to keep intruders out, when in reality intruders will get in by digging a tunnel under the wall, climbing over the wall, getting through the door or window, looking through the wall, piggy-backing on people who are allowed to cross the wall, etc... The point is, the strength of the wall has no bearing on the overall security, as long as there are weaker loop-holes around it.

On the basis of program hacks themselves (e.g., software exploits, encryption-breaking, etc.), security has become very solid overall, but that only means that attackers don't attack that line of defense as much as other weak points. The biggest myth about hackers is the impression that all they do is exploit software loopholes and get into computers or networks programmatically. For the most part, they don't; at least, not those who are interested in the gain ($), as opposed to the technical challenge.

In terms of hacking code, yes, anything is breakable, with enough time, skill and knowledge. For most things, the hardest part is the reverse engineering, but everything can be reverse engineered, and then exploited. At some point, though, the investment (time, money, resources) required to break the system that way becomes so large that nobody will choose to break the …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

But the clock cycles needed to perform recursion are most definitely more than the clock cycles needed to perform a loop.

Yes, the recursive code will include function-call prologues and epilogues, unless they are partially or completely optimized away. But unless it's extremely trivial code (like a tail call), you are going to need some dynamically-allocated memory to replace the build-up of stack memory in the recursive form. And that is really the main performance difference: allocating stack memory is a cheap constant-time operation (just a few integer arithmetic operations, i.e., a few clock cycles), while allocating dynamic memory is a more expensive amortized-constant operation (it invokes the heap). So, the performance difference really boils down to whether the dynamic allocation is significant compared to the rest of the code.

You can also take the dynamic allocation out of the equation by using something like C99 VLAs (variable-length arrays), which are dynamically-sized, single-allocation, stack-based arrays. In that case, the iterative solution would probably be faster on a non-optimizing compiler. But on a good optimizing compiler (with some special optimizing flags set, e.g., -fomit-frame-pointer), the recursive code will probably be turned into machine code nearly identical to the iterative version. It would be interesting to find that out; maybe I'll set up some test code for that.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Ok... I'm reluctant to jump into this boat again, and just repeat what I've already said, but here we go.

That recursion should not be practised, or are you saying recursion is worse than other loops? Are you serious? Do you want to see a benchmark?

When I read that post / those questions, I was a bit startled, because I thought "oh no, did I make unsupported claims about iterative implementations being faster than their recursive counterparts?", because that would have been a mistake, as it is wrong in general. Except for tail-call recursions, where iteration is definitely faster, recursion will be faster in general. It varies from being quite a bit faster (in simple cases) to making almost no difference at all, but it's very rare, with a good optimizing compiler, that a recursive implementation will be much slower.

However, I was quickly reassured when I looked back at my previous posts and found that I never made such a claim. You must have misinterpreted what I said. I'm arguing that iterative implementations are preferable to recursive implementations, most of the time, but performance isn't a factor here.

All basic loops or recursions can crash.

All code can have bugs; is that what you are saying? But we are not talking about crashes due to bugs. Assuming a correct, bug-free implementation, you can prove that an iterative implementation will always execute correctly and completely. But you can never prove that for a …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

You might want to look at the Wisdom of Chopra, which is a random quote generator built from a database of mumbo-jumbo terms collected from the nonsensical tweets of Deepak Chopra (the metaphysics "guru"). They say you can contact them to get the sources.

I think that generating natural language is a very difficult topic, depending on the level of sophistication. It really depends on how much effort you want to put into forming complex grammatical sentences (as opposed to very simple "subject verb complement" forms), and if you want to actually have meaningful sentences (as opposed to just random, but grammatically correct, sentences). On the grammar side, you really want to look into linguistics, especially generative grammars, and there are libraries out there that can help you in this regard, like Boost.Spirit (which has grammar construction, parsing, and generation, in one unified set of DSEL C++ libraries). As for meaning, that's a whole different ball-game, and I would only have a vague idea of where to start. There are also ways to mimic meaning, drawing from machine learning concepts, such as predictive modeling and expectation-maximization (EM) algorithms, like for example, scanning databases of written work (books) and drawing correlations between words and their likelihood of being found together in sentences and stuff, such that you can generate sentences that kind of make sense (e.g., you are more likely to generate "a tree with green leaves" than "a car with blue leaves").

The problem really …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Why would you want to put ubuntu on your mac?

Because you are ready to grow up. You bought an expensive toy, and now you want to have a useful computer. ;)

Are we talking wiping the Mac and installing Ubuntu or a virtual machine?

I don't think the OP is talking about wiping out Mac OS X, or at least, I would not recommend doing so. Installing Ubuntu either as a dual-boot or in a virtual machine is the way to go. It depends on how much the OP wants to use Ubuntu. He could probably start by trying it out in VirtualBox.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Seeing how my nephew and niece (3 and 2 years old, now) are developing, it is pretty clear to me that consciousness is something that emerges as higher-order thinking develops. It's clearly a product of the brain's development. As you watch an infant develop from a kind of instinctual automaton that cries when hungry, sleeps when tired, and giggles when stimulated by sounds or funny faces, to becoming aware of its actions, aware of other people, aware of the passing of time, able to imagine scenarios, and so on, you are looking at the development of consciousness right there, and it's a beautiful thing. This natural process is beautiful enough in my opinion; I don't see a reason to want some additional cosmic explanation for it, which only spoils the extraordinary beauty of what consciousness really is.

iamthwee commented: Wow +0
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Yes, there is a fairly straightforward way to write this as an iteration. This is a depth-first traversal, and the recursion turns into an explicit stack of the nodes waiting to be visited:

#include <stack>
#include <cstdio>

void process_node(node_t* node)
{
    std::stack<node_t*> tasks;
    tasks.push(node);
    while(!tasks.empty()) {
        node_t* cur_node = tasks.top(); tasks.pop();
        if (!cur_node)
            continue;
        std::printf("%s\n", cur_node->name);
        tasks.push(cur_node->right);   // push right first so that left is traversed first (it ends up on top).
        tasks.push(cur_node->left);
    }
}

Of course, this does not look as nice. But the main point here is that the maximum depth of the recursion is equal to the maximum depth of the tree being traversed (the depth of the deepest leaf node). In the recursive form, the "stack" (i.e., "tasks" in the iterative form) is formed on the call-stack (in successive activation records of the recursive function calls). Forming the stack on the call-stack is more efficient than forming it in heap-allocated memory (as the iterative form does), but call-stack memory is limited, and if you run out of it, the program suffers an immediate halt (crash). Predicting the amount of space available on the stack is not really possible in a reliable way, and running out is very hard to handle even if you could predict it. This is not a problem with heap-allocated memory, which is only limited by the amount of memory you have on the system as a whole (RAM). So, if you cannot guarantee that the depth of …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I was looking around the web to try to find code that does the same as stdafx, but I did not find any.

You will not find any, because other compilers do not need it. The stdafx header does not provide anything more than some MSVC-specific things that prepare your headers for being pre-compiled. Other compilers implement pre-compiled headers differently, mostly through compiler options (command-line options) instead of stdafx. In other words, you will never need that header on any other platform / compiler. So, don't look for a replacement, just forget that it exists, and take it out of any code that you have.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

But if the body receives consciousness in the same way that a cable box receives satellite signals

A cable box receives satellite signals?!? Ok... Well, I guess he meant the way a high-gain antenna receives VHF/UHF signals. I must assume that he is very poorly informed about the way radio telecommunications work. On one end, there is a satellite with a high-power, high-gain, directional antenna that broadcasts powerful VHF/UHF signals from geosynchronous orbit (36,000 km altitude), directed at a region of Earth, which requires about a few arc-seconds of pointing accuracy from its perspective. Then, the very faint remnant of the signal that actually makes it to the Earth, past the atmospheric interference, is picked up by a high-gain antenna (the dish on your roof) pointing directly at the satellite. The signal, which is full of noise, is fed to a demodulator, which attempts to lock on to the phase and frequency of the carrier wave, filter the noise around the frequency band of interest, demodulate the signal, filter and demodulate a couple more times, and finally produce a digitally-encoded signal of good enough quality to use.

The point of that lengthy explanation was to expose a few important problems. First of all, to send or receive any signal (of any kind) over a long distance requires either a tremendous amount of power, or a very large piece of hardware (like a very huge antenna). The laws that govern this are conservation of energy and basic …

tamcan commented: Kudos - I believe its all about the frequencies +0
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Have you looked up your USB dongle's make and model to see if there are any reported issues with it in Linux? Usually, there are lots of active forums out there that contain reports of problems, and solutions to them, related to specific hardware. There might be a solution out there (like a special driver, a config modification, a patch, or an NDISwrapper solution). If there is, then you would most likely be able to apply that solution to any distro you choose to install: these close-to-kernel things are usually the same regardless of distribution (except for what is installed by default); they pertain to the kernel and its modules (and some basic configs, like "blacklists"), which work the same on all Linux systems, AFAIK.

I think that ElementaryOS is tailored to work "out of the box", which probably means that they package a lot of different drivers into the OS installation, just to make sure that all hardware will be covered. If it is able to make your dongle work, it probably means that there is a driver out there that will work for it, it's just that it is not part of the default distribution package of other Linux distros (like Ubuntu / Lubuntu / Kubuntu), but that doesn't mean that you can't install it too. You are probably one sudo apt-get install away from having a working dongle on any Ubuntu distro. That would be my guess.

Generally-speaking, for hardware …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

stdafx.h is a Microsoft header that is specific to a feature called "pre-compiled headers" that the Visual C++ compiler (among others) provides. This is not a standard header, and you should not expect it to exist on any platform other than Windows + Visual C++. If you are very much used to working with Visual Studio, you will soon realize how (intentionally) non-standard that platform is when it comes to C++.

The preferred IDE for Mac is definitely Xcode. If you don't like it, you might try the Mac port of CodeBlocks, which is a popular IDE (on any OS). You might also try Qt Creator, especially if you want to do some cross-platform GUI programs. Eclipse is another very "noob-friendly" IDE (I would say "noob-only", but that's just my opinion). Or, you can do like many people do and just use an enhanced text editor, like Sublime Text.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

And so it goes again, Microsoft keeping up with their tradition of alternating between one bad version and one good version: Win95 (OK), Win98 (bad), Win2000 (good), WinME (dismal), WinXP (very good), WinVista (bad), Win7 (OK), Win8 (bad), ... I guess it wouldn't be too bad if they didn't take such a long time between versions.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Because somewhere in the implementation of the std::find function, there is a comparison of two classObjects objects, i.e., there is a line like (o1 == o2) where o1 and o2 are objects of class classObjects. The error tells you that the operands to the == operator (binary expression) are invalid, i.e., there is no valid == operator for the class classObjects. You need to define an operator overload for that class, something like this:

bool operator==(const classObjects& lhs, const classObjects& rhs) {
  // Return the result of comparing the data members that define
  // equality, e.g.: return lhs.someMember == rhs.someMember;
  return true;  // placeholder; replace with a real comparison
}

Or it could be a friend function too, if you need access to private members of the class. It could also be a member function, but that's not recommended in this case.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

You missed a punctuation mark; there should be a comma between and and and and and and, as so: "between Pig and and, and and and Whistle." ;)

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

True, there is a lot more room to breathe when writing end-user (GUI) applications. But if things are really moving to the "cloud", then that might change, as those end-user apps are now supposed to run on the server (and then the economic argument applies). In any case, I still prefer writing well-performing code regardless, even if it's just for the elegance of it. I guess that is another perspective on this "recursion vs. iteration" debate. Just as richieking pointed out the elegance of recursive implementations, I would point out the elegance of efficient implementations. In other words, which is more elegant: a 4-line implementation of bubble-sort, or a highly-optimized, couple-hundred-line implementation of intro-sort? If we're talking elegance, I prefer the latter (which, ironically, is a rare case of a good use of recursion).

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

It's odd that you say Kubuntu is faster than gnome. I always thought gnome was leaner than kde?

I know, it's weird, but it's my impression in general. I know that KDE is "heavier" (i.e., more RAM usage, a bit longer start-up time), and it's still a bit heavier than Gnome 3 (which is significantly heavier than previous versions of Gnome). However, for some reason, I find it to be just faster overall when you use it. At least, that's my impression (note that my experiences with Gnome are a bit limited, since I love KDE and use it almost exclusively). My guess is that because KDE relies on Qt (which is where the relative "heaviness" comes from), which is a much more comprehensive GUI library, things are more homogeneously implemented, and thus, faster. Gnome is a lot more of a patchwork of different components, which leads to overhead for gluing them together. At least, that's the best explanation I can find.

The graphics is an onboard amd raedon chipset according to the datasheet.

That graphics card will require a proprietary driver (which you should be able to find through "Additional Drivers..."). The driver for Radeon cards is the Catalyst driver. I guess this is the one to download (but try going through Additional Drivers.. first).

I think that the open-source graphics drivers for Radeon cards are pretty bad, but only because the drivers that AMD provides for Linux are very good, so there …

iamthwee commented: Perfect +14
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Ahh, young padawan, you have much to learn.

I think the main question is when to use recursion.

There are some cases where it is OK to use recursion, like Ancient Dragon's example with walking through the file system because the file system has very limited depth and the individual steps are quite expensive (the HDD is the performance bottleneck here). These are the basic requirements for recursion to be an acceptable way to implement a solution.

The other case where it is good to use recursion is when (1) the depth of recursion is guaranteed to be very limited and (2) you want to avoid dynamic allocation (on the heap) for storing the states, in favor of using dynamic growth of the call-stack as a faster means of storing them. Basically, if you use an iterative solution, you will need a dynamically allocated queue or stack in order to store the progress (or states at each step) of the algorithm. If you use the recursive version, that progress (or those states) is stored in the activation records of the recursive function calls. Allocating on the stack is faster than on the heap, but the difference is rarely significant enough to warrant the use of a recursive solution. One example where this does apply is the intro-sort algorithm (the sorting algorithm used by the standard std::sort function, a variant of the quick-sort algorithm). In that case, it is better to use recursion because the …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Arrrgh!
Virtuuus
RRR (Recycle Reduce Reuse)
...

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

If you don't know if you need additional drivers, just use the "Additional Drivers.." application. See here. It will tell you if there are any additional drivers that are needed or appropriate for your hardware.

Onboard graphics (I assume Intel graphics) generally works quite well and does not require additional drivers, but I'm not sure.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Notice that myframe is never deleted! None of the controls they create are ever deleted! Every example I ever see uses new and somehow never deletes a single thing.

In that wxWidgets example, the frame is created (with new) and then registered as the "top-window" using the SetTopWindow function (a member of wxApp, the base class of the MyApp class). I would assume that once this top-window is set, it will be deleted in the destructor or finalizer of the wxApp class.

This is how a lot of "old-school" C++ libraries work, and I would assume wxWidgets works that way. Certainly, Qt does. This is not great, because "modern" C++ has a number of established practices that are much better than that, such as using smart-pointers, but once you are committed to a particular system, you have to keep using it and live with its flaws or ambiguities. And GUI libraries like wxWidgets and Qt carry around quite a bit of that "old-school" baggage.

I asked on StackOverflow and someone said: "The memory is deleted when the application is closed so just leave it".

Not exactly true. The memory is reclaimed by the operating system when the application is closed, meaning that you won't get a system-wide problem as a result of this (i.e., it won't consume RAM after it is closed). However, objects are not deleted when the application is closed. In other words, any object that was dynamically allocated and constructed …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Does that mean we have too many Ph.D. researchers???

Too many of us?... :( But, we're so lovable..

What we have is too much horse-racing pressure to get the most publications, the most patents, the most grants, etc... It has reached the point, today, where everything and anything is published, patented, and put forth for research grants. The problem with keeping up with the 1000 articles on medicine that get published every day is that 999 of them are pretty much irrelevant, or redundant, or incomplete, or too meager, because where people used to do research work for a few years before publishing an article about it, they now do 2 to 4 weeks of tinkering with some code, write a new article about it, and get hailed as champions of science for being so prolific. The peer review process is being swarmed with junk and pressured to let everything through. The incentives are too strong and in the wrong direction.

That said, it is true that science and technology are currently evolving far too fast for anyone to really be able to keep up with it all. You have to pick a specialization and hope you can hang on in it, and even that is hard.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I can only think things must be different at McGill

I guess they must be. But it's not like I'm only talking about McGill either. I've been around a bit (e.g., in Europe), and also, from interactions here on Daniweb, it seems that recursion is a recurring topic (pun intended!) with the mass of computer science students that meet on this forum (but maybe that's not a representative sample of the overall population).

Anyways, at my university, it seems that the focus is a lot more on computer science and software engineering, rather than any kind of practical teaching. Just to illustrate: if you mention things like sorting algorithms, recursion, linked-lists, and other basic computer science topics, the average computer science student here will start talking all about them as if he were a world-expert on these subjects (which I have seen happen many times). But if you give them a simple programming exercise that involves linking with an external library, an exercise that I solved within an hour, the average computer science student will spend an entire week or two working full-time on it, simply because they have zero real-world programming skills. That's been my impression, but it is certainly not a complete generalization nor a thorough investigation, it's just from my limited experience. Mind you, I have no formal computer science education at all, it's all practical experience for me (and I only picked up some CS knowledge here and there).

The number of …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Ubuntu, like most Linux distributions, is great for programming. Fundamentally, Linux is an OS by programmers, for programmers (even though it has gotten very user-friendly for anyone else too, now). I tend to prefer Kubuntu (the KDE version of Ubuntu), as it is a bit more development-oriented, and the nativeness of Qt on it makes it all that much easier.

Is it good for game development?

Sort of. It is a good platform for a few reasons. First, it is a great platform for programming in general (game or otherwise). Second, game programming relies on either DirectX or OpenGL, and in Linux, OpenGL is available and generally up-to-date (but DirectX is Windows-only). Third, third-party libraries are very easy to install and use in Ubuntu, which is crucial to speed up the development of your games. Fourth, Linux is a Unix-like environment similar to Mac OSX, meaning that porting your game to Mac platforms will be easy.

However, it isn't a perfect platform for a couple of reasons. First, game development often involves using a game engine (a library of code that does a lot of the ground-work for you), and game engines are most widely available on Windows, but still, the majority (about 60%, I would say) are available on Linux too. Second, if you want to distribute your game, Windows would probably be your target, not Linux, but still, you could develop the game mostly in Linux, and only port it to Windows once you have …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Yeah, computer languages are the "universal" languages, like C++:

/** Ceci est une classe pour représenter l'état d'une particule.
 * \note Den här klassen är bara för två dimensioner.
 */
struct TeilchenStand {
  double position[2];
  double geschwindigkeit[2];
};
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I don't understand the point of this. Are you publishing code on Daniweb? Usually, code snippets are for demonstration purposes only. Daniweb is not exactly a platform for code publication in general; you would do better to try platforms like GitHub or SourceForge.

If you want to publish code (in the sense of "hey, here is some code, use it"), then that requires a licensing notice (e.g., BSD, (L)GPLv(2-3), Boost, etc.). And on Daniweb it is a bit fuzzy when it comes to that, because technically speaking, Daniweb gets the copyright to the code.

That said, if what you want is constructive criticism about your code, then this is a lot to chew on. I'll have to make a few broad observations:

(1) Use namespaces. If you want to write "library code", you really, absolutely, need to put your library code into a namespace, with a name unique to your library (more or less).

(2) PImpl the windows header. One of the main reasons I can think of for wrapping the Win32 API code into classes (or other code) is to insulate the windows.h header. This header is one of the dirtiest pieces of bad-C / barely-C++ code ever put in an "official" header file. The number one step would be to put it away, deep. The compilation firewall, or PImpl, is the idiom that you should be using for this purpose. You might also be able to get away with forward declarations or by only including the windef.h header, since the Win32 API …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Sin City

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Like rubberman said, if you have a 64-bit system, run a 64-bit OS. Basically, a 64-bit CPU is made to use 64-bit numbers and instruction sets. It will support the 32-bit numbers and instruction sets, but it's like running in a "compatibility mode" or something like that. The point is, it's not "native" or "natural" for the CPU to run 32-bit instructions. Also, an OS that is compiled to be optimized for a 64-bit instruction set will be able to do more with fewer instructions or make use of modern extensions, making things faster overall. Finally, it could cause some issues, like mathijs said, with RAM and things like that. There was a brief period when some PCs were 32-bit, yet had more than 4 GB of RAM (which requires more than 32 bits to address), and that caused some issues. After that, 32-bit OSes and CPUs implemented counter-measures to solve this problem, but I'm not sure that a 32-bit OS running on a 64-bit CPU would be able to use those counter-measures, and thus, weirdly enough, the issue would re-surface... but anyways, this is mostly speculation on my part. Long story short, you really should use a 64-bit OS for your 64-bit system.

Well it still feels very sluggish, there doesn't seem to be that much difference. Perhaps I'll go with an Xbuntu 64bit install.

There must be something wrong. Make sure you have installed the "Additional Drivers.." from the menus. I suspect, like you said, that …