mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

it would be netcat, not netcat.exe, although changing the name wouldn't harm anything, or you could make sure that the Makefile created it as netcat.exe.

I don't quite understand that sentence, but there is at least one interpretation of it that is very wrong. Linux executables are not the same as Windows executables, no matter what the extension is (with or without .exe). Linux uses the ELF format (in the Unix/BSD tradition), while Windows uses the PE format. These are completely different formats, and unless you run under an emulation layer (like Wine in Linux, or Cygwin in Windows), there is no way to run one format on the other OS, AFAIK. Changing the extension does not do anything.

MinGW cross compiler under linux

The main problem really is to find a way to tell GCC to generate Windows code. And by Windows code, I really mean two things: it needs to use Windows libraries, and it needs to be packaged in PE format (.dll, .exe, etc.). The executable code itself depends only on the processor, not the OS. I have very limited experience with setting up a cross-compilation environment. I just know that it's common for embedded systems and the like, where you can't really compile stuff on the target platform (it's too small), but in general, those are still Linux-to-Linux cross-compilations, just with a different target architecture and linking against specific libraries.
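For what it's worth, using a MinGW-w64 cross-compiler on a Linux box looks roughly like this (a sketch; the package name is the Debian/Ubuntu one, and the exact toolchain triplet and `file` output can vary):

$ sudo apt-get install mingw-w64                  # install the cross-compiler toolchain
$ x86_64-w64-mingw32-gcc hello.c -o hello.exe     # links against Windows libraries, emits PE format
$ file hello.exe
hello.exe: PE32+ executable (console) x86-64, for MS Windows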

I would imagine that cross-compiling anything serious for Windows but under Linux …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I could see how a tablet can be useful for some types of work where you need a really mobile smart-something for reading or taking notes or maybe light work. Basically, I don't think that the tablet will replace the laptop; it will replace the "pen and paper" stuff like notebooks, note-pads, folders, etc., which used to be the only reasonable way to easily carry some documents to a meeting or take notes "in the field".

For example, a friend of mine once had an internship where his job was to go around an entire power-plant and note down every temperature / pressure reading on the piping, then come back to the office and enter them all into Excel, and then repeat. He would probably have been much more productive with a tablet into which he could enter the values directly.

I see this as pretty much the only work-related purpose for tablets. And that wasn't something people did with laptops before, or it was really awkward to do with laptops.

As for leisure, I don't see where a tablet fits between a smart-phone, an e-reader, a TV, a laptop, and a desktop computer. The fun little games on the tablet are just as good on a smart-phone. E-readers are better for reading. A TV is nicer for watching stuff. A laptop is nicer for sitting down and doing some work on the go (or in the office). And a desktop computer is better for serious work. I guess …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Another good thing to do is to block ports in your router. There are tons of ports that are not useful for anything but exploits (e.g., ports used for some old or obscure feature that nobody really uses, except that these "features" contain holes that hackers can use to get in). There are reports out there that detail all the ports you should block... the NSA has a number of public reports of that nature that you can follow. Router or computer firewalls generally don't block those ports by default because they are mostly focused on blocking torrents and other p2p protocols; they don't block "official" protocols, which is where the real hacks come from.
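As a sketch of what that kind of blocking looks like (here with Linux's iptables; the ports shown are just classic examples of legacy services, not a complete list):

$ sudo iptables -A INPUT -p tcp --dport 23 -j DROP     # telnet (legacy remote login)
$ sudo iptables -A INPUT -p tcp --dport 139 -j DROP    # NetBIOS session service
$ sudo iptables -A INPUT -p tcp --dport 445 -j DROP    # SMB (a historical favorite of worms)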

It's also important to understand that 99.9% of "hacking" uses the "shotgun approach". The idea is that they just diffuse their malicious software all over the place and catch the most vulnerable people. As long as there are enough vulnerable people to make it worthwhile, they won't try a more aggressive attack. In other words, why try to attack some random guy who runs a secured version of Linux behind an uber-paranoid port-blocking router when you can just attack the grandma who thinks that the anti-virus she installed 3 years ago and never updated / renewed is keeping her safe, as she clicks on any random thing that pops up on her screen?

And at the end of the day, whatever the hacker is doing, the data must come out of your computer onto your …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Similarly, another great movie which got bum reviews was 'Signs' by M. Night Shyamalan. I especially love his movies, and that one had a similar theme which reveals itself at the end.

I think his early movies were entertaining on the first watch (because of the twist), but that's about it.

The Sixth Sense was definitely good, especially on first watch, but I cannot say that it really "stuck" for me, in the sense that I don't think I've watched it more than once or twice.

Then, I found Unbreakable to be probably M. Night Shyamalan's best movie: it was interesting the whole time, and had a great twist ending.

Then, as for Signs, I wouldn't call it "abysmal" like RJ did; I thought it was entertaining (or at least intriguing) on the first watch, and thus not a terrible movie. I remember enjoying it when it came out, and I think I caught it a couple of times on TV, and it was still somewhat enjoyable to watch. But still, it's just a so-so movie. And, yes, in retrospect, there are a lot of plot-holes in it, as RJ pointed out.

As far as all the rest of M. Night Shyamalan's movies go, I haven't paid much attention to them (I don't think I saw any of them). But considering that they got worse and worse reviews, culminating in The Last Airbender and After Earth, which are both considered among the worst movies ever made, I can't say …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

What do you mean by "statically compile"? I don't understand what you are referring to.

If you are referring to static linking, i.e., not using dynamic libraries (DLL) as run-time dependencies, then I don't think that any GUI library will allow you to do that easily. Qt is probably the one with the most chance of being able to be statically linked to your application. For WinForms (Visual Studio), you can forget it completely.
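To illustrate the distinction (a generic GCC sketch with a made-up library name, foo; it is not Qt-specific):

$ g++ main.cpp -o app -lfoo           # dynamic linking: app needs libfoo.so / foo.dll at run-time
$ g++ main.cpp -o app -static -lfoo   # static linking: the code from libfoo.a is baked into app

The catch is that static linking requires static builds (.a) of the library and of everything it depends on, which is exactly what most GUI libraries don't ship by default.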

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

The negative voltages that the PSU provides are an old requirement in the standard supply wiring of the motherboard. However, they are no longer used on most reasonably "modern" hardware. Basically, a few old hardware components, such as floppy-disk drives and serial / parallel ports, used those negative voltages, but they have essentially been phased out, and probably nothing still uses them except maybe very specialized hardware. On most modern motherboards, the -5V pin goes nowhere (no longer connected to a bus), and I'm not sure the -12V pin ever went anywhere (though some motherboards used it for integrated components).

It is very likely that your PSU simply does not provide the negative voltages and only has dummy wires (of floating voltage) connected to those terminals, just so that the "standard" connector can plug into the motherboard. This is very common with modern PSUs, because providing negative voltages is no longer required, and it's just that the connectors have remained the same. You could check your PSU's specifications to verify that.

All the other voltages seem within reasonable margins.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

As rubberman says, in Linux, drivers are implemented as kernel modules. Most drivers are open-source. Basically, the development community didn't wait for hardware manufacturers to create Linux drivers for their hardware, because they could very well have waited forever; instead, they developed open-source drivers. For the most part, these drivers are written by reverse-engineering the Windows drivers and making "lowest common denominator" drivers (manufacturers usually re-use the same basic set of commands between models, so a "basic" driver will work for a whole series of products).

So, this means that most drivers in Linux are open-source and can therefore be packaged with the Linux distribution (Ubuntu, Fedora, Debian, RHEL, etc.) installation and package-repository. This means that as you install Linux (most popular distributions anyways), it will automatically check your hardware, automatically download / install / enable all the appropriate drivers, and you will probably not have to do anything after that. It is possible that a few peripheral things are not working (e.g., wireless, microphone, webcam, etc.) or not working as well as they could (e.g., graphics card, etc.). If that's the case, you can check if there are proprietary drivers for those specific things (and installing them is easy, and there are usually simple instructions). If there are any issues after that, well, you know where to ask for help ;)

RikTelner commented: Thank you. I sure know where to ask help :). +2
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Then there's the whole "get hit by a bullet and get thrown violently backward".

What's worse with all this is that this movie myth is essentially the whole reason for the "second shooter" conspiracy about JFK's murder: the whole idea that he must have been shot from the front because his head rocked backward. But his head rocked backward because his brains were spilled forward, just basic conservation of momentum.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Another widespread inaccuracy in TV/movies is the sniper bullet hitting the target (or missing it) after you hear the "bang". The thing is, sniper rifles all fire bullets at supersonic speeds (Mach 2.5 to 3.5) for range / accuracy reasons. In other words, from the point of view (or hearing) of the target, the bullet arrives before the "bang". At the farthest ranges, it can even take more than a second (or even two) for the sound to reach the target after the bullet already has.
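To put rough numbers on it (a back-of-the-envelope calculation that ignores the bullet slowing down in flight): take a muzzle velocity of about Mach 3 (roughly 1000 m/s), the speed of sound at roughly 340 m/s, and a target 800 m away.

t_bullet = 800 m / 1000 m/s = 0.8 s
t_sound  = 800 m / 340 m/s  ≈ 2.4 s

So the bullet lands about a second and a half before the "bang" arrives.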

Obviously, in movies there is sometimes an ambiguity as to where the "bang" is supposed to be heard from, whether it takes a point-of-view near the target or near the shooter, or neither in particular. But sometimes it's obviously impossible, like when the bodyguard hears the shot and immediately dives and saves the VIP he's guarding. Or, when you see the target reacting to the "bang" a fraction of a second before being hit by the bullet.

In real life, if you hear the shot, it's already too late.

I guess it might just be too weird for people who are conditioned for the "bang! you're dead" scenario, and would be weirded out by a "you're dead... bang!" situation.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

While we are discussing grammar, I thought I'd share this article I stumbled on some time ago. For those of you who might be, like me, both grammar buffs and Star Wars fans, here's a nice article discussing Yoda's speech patterns, and in particular, explaining why the prequel trilogy had such an odd-sounding Yoda compared to the original movies.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I think that you will have a bit of a problem because, of all the algorithms you cited, only "merge-sort" is parallelizable, AFAIK. In any case, parallelizing an algorithm is very tricky business, especially if your aim is to compare performance. In fact, the overhead coming from synchronization is generally so significant, even in really good parallel sort algorithms, that you won't be able to beat a single-threaded lean implementation like std::sort for data sets smaller than about 100 thousand elements. That's how significant the overhead is.
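For reference, the basic shape of a parallel merge-sort is something like this (a minimal sketch using std::thread; a serious implementation would tune the depth cutoff and fall back to std::sort below it, precisely because of the overhead mentioned above):

#include <algorithm>
#include <iterator>
#include <thread>

// Sort [first, last) by sorting the two halves in parallel, then merging.
// 'depth' caps how many levels spawn threads (2 levels -> up to 4 threads).
template <typename Iter>
void parallel_merge_sort(Iter first, Iter last, int depth = 2) {
    const auto len = std::distance(first, last);
    if (len < 2)
        return;
    Iter mid = first + len / 2;
    if (depth > 0) {
        std::thread left([=] { parallel_merge_sort(first, mid, depth - 1); });
        parallel_merge_sort(mid, last, depth - 1);
        left.join();  // synchronization overhead lives here
    } else {
        std::sort(first, mid);   // sequential fallback below the cutoff
        std::sort(mid, last);
    }
    std::inplace_merge(first, mid, last);
}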

I would guess that you must have misunderstood the question because I doubt very much that anyone would ask you to write a parallel bubble sort algorithm. So, I'm not sure what that "parallel" word really refers to.

In any case, we won't do your homework for you. You have to show that you are making efforts to solve this problem by yourself and ask us questions about specific problems that you are encountering with your code.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

There are sites that will charge a fee for hosting a git repository just the same as there are sites that will charge a fee for hosting an SVN server, or any other version control server / repository for that matter. However, you can use git for free on your own infrastructure, whether it be on your local hard-drive only or using your own server (local, from your organization, or a VPS). There is absolutely no requirement to tie yourself to a paid "cloud" service or hosting site. jwenting is clearly misinformed about this.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

but unfortunately I dont know how can I edit and compile those source code.

I'm a bit baffled by the simplicity of your question... Do you know how to program? You shouldn't try to delve into some open-source project's source code if you don't already have enough programming knowledge to understand a fraction of it. Learn to program first, then look into open-source projects.

Source files are just edited with anything from a simple text editor to a full-blown IDE (Integrated Development Environment). Assuming you are familiar with the language used in the projects, you should be able to see which files are source files and can be opened and edited.

When it comes to compiling, all decent open-source projects will have a "README.txt" file or similar instruction files in the top source directory. They will usually rely on configuration scripts or build-systems such as cmake, autoconf, or plain makefiles to automate the configuration of the compilation for your system and to check that you have all the required dependencies (which you may have to download, build, and install too). And, of course, you need the appropriate compilers.
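For instance, for a typical cmake-based project, the compilation boils down to something like this (illustrative; always check the project's own README):

$ mkdir build && cd build
$ cmake ..              # configure: detect compilers, locate dependencies
$ make                  # compile
$ sudo make install     # (optional) install system-wide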

But overall, it is impossible for us to give you anything more specific without knowing the specific project, languages involved, and platforms involved.

Hey buddy I am currently working on Windows environment.. can you put some more light on this issue? Like how can I proceed further with Windows environment?

Windows is generally a bad development environment, for just about …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

When your computer does power on, how well does it work?

A motherboard does not really have a battery on it, except for a small round battery that keeps a few things alive at all times (the BIOS memory and the clock, I think). What supplies the power for the computer is, well, the power-supply, which just converts the AC power from the wall-plug into the DC power (12V / 5V) that the computer hardware needs. Power-supply failures are very common (the PSU is often under-designed or neglected when selecting components). I'm not sure that your particular symptoms really indicate a power-supply failure, but they could, especially if the computer seems to function normally otherwise. But your symptoms are weird. Replacing the power-supply would be a fairly easy and somewhat inexpensive fix. But I'm not sure it will solve your problem.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I think that by the time a 2TB RAM stick costs only $30, we will already have switched to memristor technology (or one of its alternatives), and the RAM + HDD paradigm will be a thing of the past. We are clearly moving towards a unified "better than both" storage technology that will replace both RAM and disk, and that will be a major revolution (e.g., booting the computer in a second, no more "sleep" required (if RAM is non-volatile), thousands of times faster disk I/O, etc.). And there is too much potential in this technology for it to remain at the prototype stage for long; all the major players are fast-tracking it and trying to patent and commercialize it ASAP.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I installed the ScriptSafe extension for Chrome and I blacklisted the "apis.google.com" domain. The Google+ button is gone and so is the memory leak!

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Woah! Since when and for what do you need to BUY a license to host a repository?!?!?!

There are some proprietary version control systems for which you need to pay a license fee. As far as I am aware, none of them are really good enough compared to the open-source / free options out there, such as Git, SVN, CVS, Mercurial, Bazaar, etc.. I think that at this point, most proprietary version control systems have more or less given up on trying to compete, except maybe BitKeeper. I know that the Microsoft version control (Team Foundation Version Control) has even abdicated in favor of Git, and the version before that (Vault) is also dead.

Long story short, these days, you would have to be pretty stupid to actually pay money for licensing version control software or a repository, because the open-source options (especially Git) are clearly superior products.

SVN is just gawd awful compared to the more recent solutions (like Git)

Yeah. SVN was OK, but a bit annoying. And clearly, it is being phased out in favor of Git everywhere. I constantly see older open-source projects and libraries that used to use SVN now using Git. It's gonna disappear slowly... I think that people share jwenting's concerns about where their data is, and that's why they like Git so much: for the flexibility, the decentralization, the full backups it produces everywhere, and also the general security and control over any server …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

So, from doing tests with Dani in the chat, it appears that the culprit for this issue is the Google+ button. For instance, even this tutorial page about the Google+ button leaks memory at every refresh (for me). So, this appears to be a bug between google+ and google-chrome... ;)

For the moment, the bug can be fixed by removing the google button, which comes from this file: https://apis.google.com/js/plusone.js

If, in your browser, you black-list it or something, I guess that would fix the leak. Anyone have an easy suggestion on how to do that (I'm not much of a javascript guy)?

I might file a bug report with Google.. if I care enough to do so. The problem with that, though, is that the issue isn't very consistent (easy to reproduce) across platforms (even with the same Chrome version).

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

The thing is, I have enough "skills" with the command-line that I find many tasks are accomplished faster that way. Often, between clicking through buttons, configuration GUI-panels, or wizards, and doing the equivalent task on the command-line or in a config file (CMakeLists.txt, or a bash script), I find that the latter is quicker. I think that the effectiveness of working in the command-line / terminal is underappreciated.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Alright, maybe Survivor has a lot of fakery too.. but I still find it fun to watch, for the thrills and strategy in it.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I imagine it existed solely so that people whose lives were complete crap could watch and say "well, at least my lot is better than that."

Yup.. that's pretty much what it is. I call it "Trainwreck TV". I think you watch it for the same reason you slow down while passing a car accident on the road. Good or bad, there is definitely a strong curiosity in people for watching that.

I also believe that most of the people who got on the show were just making shit up to get their 15 minutes of fame (and a paycheque).

There were probably some people making up the stories, but from what I have gathered, Jerry Springer was like most reality TV today, that is, the people are "real" (real people, real stories (at least, some part of it)), but what they do on the show is extremely forced, prompted, exaggerated, staged and edited for higher dramatic value. That's the "deal" with most reality TV, people sign waivers saying that they won't sue the show for making them look like fools, and then they follow directions (from the show's crew) to cause more drama / fighting. And once you agree to look like a fool, you might as well make the most $ of it. Pretty much all reality TV shows are done like that, such as Jersey Shore, Kardashians, The Hills, The Bachelor(ette), 16 and pregnant, Real Housewives, Big Brother, ... and so on. I …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

but until there is an IDE friendly front end I doubt I can see the point in adding still more overhead to my coding.

Most decent IDEs have support for version control. In my IDE (KDevelop), you can view your code under version control (the "Review" view) to see the history of changes and things like that (basically, most of the stuff you can do on the command-line) through the GUI. There is also a "commit" button to commit your code after a session of coding. The IDE automatically detects if your folder is under version control (and which kind), and adds those functions to the GUI if it is. I personally don't use that much because I'm more productive on the command-line, but if you're a GUI-monkey then you have that choice too.

Even Visual Studio has GUI support for git.

The reason I prefer an IDE like Visual Studio over managing code and makefiles by manually creating config files and compiling/linking via command line is because I prefer coding to managing code.

Weird. I use command-line tools, version control and build-systems exactly for the same reason that you like using an IDE. I like IDEs that can natively use the tools that I use (such as KDevelop), because, then, the IDE works for me, I don't work for the IDE. For example, I like Git (for many of the reasons mentioned here) and I like cmake (a robust and flexible cross-platform build-system), and when …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

We are going to use a game building kit/platform that works on mac, windows and linux. So first, I would like to know if someone can tell me of the platform he might be talking about.

There are a few possibilities, depending on the scope. As Alex mentioned, he could be talking about Unity3D, which is a cross-platform 3D game engine that is kind of popular for entry-level home-made 3D games. But frankly, I doubt that in your first year of CS you would be asked to do a 3D game, even a simple one. It just seems a bit over the top, and also, you would be spending a lot of time just working out the kinks of 3D graphics and modeling, and not much time coding.

It is possible that he is referring to a simpler 2D platform, something like SDL, or maybe even Flash (which is basically just interactive animations, really).

What programming languages have you been focusing on? Because that would be quite telling of which platform your prof has in mind.

I imagined if the game can support the three platforms then it might not be too graphic intensive

The fact that it supports all three platforms has no bearing on the intensity of the graphics. In fact, Windows is the worst of the three platforms for graphics, and it's still pretty good, so there isn't much of a limit here.

and the fact that it is a 9 weeks.

That's …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

It's a mathematical theory about the optimality of strategies in a game. A game is generally defined as a set of rules that determine the possible actions that each player can make and the rewards given to each player in a round of play (each player makes a move). John Nash is best known for the "Nash equilibrium": a set of strategies in which each player's strategy is locally optimal given the strategies of the other players (and he proved that every finite game has at least one such equilibrium, possibly in mixed strategies). More interesting, however, is that the global optimum (most rewards for all) might not be one of those Nash equilibrium points, which is how you can show that cooperation can beat pure competition (the Prisoner's Dilemma being the classic example).

This theory has implications in economics, sociology, ethics, etc., as it provides a mathematical framework to evaluate the optimality and stability (stable equilibria) of a collective set of strategies. For example, the stock market can be seen as a game in which all players try to make the best investments (actions), and depending on each other's investments, they get rewards (returns on investments). And so, being able to analyze which moves are best in that context is quite important (for the investors), and analyzing what the best collective investment strategies would be is also quite important (for society at large). In ethics, there are similar considerations, i.e., maximizing the well-being of everyone.
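To make that concrete, here is a small sketch (in C++, with the classic Prisoner's Dilemma payoffs) that brute-forces the pure-strategy Nash equilibria of a 2x2 game by checking, for each cell, whether either player could gain by unilaterally switching:

#include <array>
#include <iostream>

// Prisoner's Dilemma payoffs (higher is better), indexed by
// [rowPlayer][colPlayer] with 0 = cooperate, 1 = defect.
int main() {
    // payA[r][c] = row player's payoff, payB[r][c] = column player's payoff
    const std::array<std::array<int, 2>, 2> payA{{{3, 0}, {5, 1}}};
    const std::array<std::array<int, 2>, 2> payB{{{3, 5}, {0, 1}}};

    for (int r = 0; r < 2; ++r) {
        for (int c = 0; c < 2; ++c) {
            const bool rowBest = payA[r][c] >= payA[1 - r][c];  // row can't gain by switching
            const bool colBest = payB[r][c] >= payB[r][1 - c];  // column can't gain by switching
            if (rowBest && colBest)
                std::cout << "Nash equilibrium at (" << r << "," << c
                          << ") with payoffs (" << payA[r][c] << ","
                          << payB[r][c] << ")\n";
        }
    }
    // Prints only (1,1): mutual defection, payoffs (1,1) -- even though
    // mutual cooperation (0,0) would give both players 3. That gap is
    // exactly the difference between the Nash equilibrium and the
    // global optimum described above.
}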

Now, if you think that this has much to do with computer games or things like that, then you …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Exactly, talking about version control is completely different from talking about cloud storage or anything of the sort. You can use version control any way you want, either just locally on one computer (e.g., a "version-controlled folder"), across a small local network, across a company network, to a private server (e.g., VPS) somewhere, or using one of those hosting sites like github / bitbucket / etc.. But you don't have to, that's important to understand.

A git repository is, in fact, nothing more than a folder that contains a hidden sub-folder (.git) holding the entire history of revisions since the creation of that repository. That's all it is. All the "magic" happens in the way the git programs can manipulate and present that data to you, and in how they can synchronize (push / pull) between different clones of the repository. Where you choose to put those repositories is entirely up to you. Basically, git can deal with repositories on the same computer or somewhere remote that is accessible by ssh (or https, I think).
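You can see this first-hand (a quick sketch; the folder and server names are made up):

$ cd myproject
$ git init                   # turns this folder into a repository
$ ls -a
.  ..  .git  src  readme.txt     # the whole history lives inside ".git"

$ git remote add usb /media/usb/myproject.git                 # a "remote" can be a plain local path...
$ git remote add work ssh://user@myserver/srv/myproject.git   # ...or an ssh location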

The buzzword "cloud" in this context is pretty misleading. To me, "cloud" is just a buzzword to sell the idea of storing stuff on a server to people who don't know what a server is. But to people who know what a server is, then clearly "the cloud" is just another word for "a server". And (savvy) people have been using servers to store their files for decades. They just call it "the …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Is it possible that you were before updating the browser?

No.

Do you have the issue with all windows closed except for the chat?

I tried to close all windows, and open only a Chrome with Daniweb, not logged in and with all extensions disabled, and the problem is the same.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

no

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I'm running Linux. And it seems that this issue started when I updated Chrome to version 33.0.1750.117. I updated two days ago, which coincides roughly with the start of this issue.

I have also tried disabling all extensions and various other things (like a new browser window, etc..), and the issue persists. Every page navigation seems to allocate an additional 10 to 20 MB of memory which remains allocated permanently (i.e., it's leaking). I took a few javascript heap snapshots... but this is far from my field of expertise, so I really have no idea what I'm looking at with those, but I can at least tell that the "object counts" and "retained sizes" of the stuff that leaks seem to correspond to the 10-20 MB I estimated.

Also, I consistently get the following javascript warnings:

Invalid App Id: Must be a number or numeric string representing the application id. all.js:56
FB.getLoginStatus() called before calling FB.init(). all.js:56
Consider using 'dppx' units instead of 'dpi', as in CSS 'dpi' means dots-per-CSS-inch, not dots-per-physical-inch, so does not correspond to the actual 'dpi' of a screen. In media query expression: (-webkit-min-device-pixel-ratio: 1.5), (min-resolution: 144dpi) tweet_button.1392079123.html:1
event.returnValue is deprecated. Please use the standard event.preventDefault() instead. 

not that I have much idea what they mean, but that's what I get any time I do anything.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Are you noticing a difference being on pages with the editor vs pages without?

No. It seems to be the same regardless of the kind of page, whether in a thread (with the editor) or in a forum listing (without the editor).

Being logged in vs not logged in?

No. Being logged in or not makes no difference.

When did this start happening??

I just noticed it today... but it's possible that it has been happening for a few days. I would have noticed if it had been going on much longer than that, so I guess it must be quite recent.

Btw, I'm now at 1.2 GB (same tab since last post).

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

But my mac usb file system isn't compatible with linux, or at least doesn't allow it to be written too.

Well, that's easy to solve. To read/write Mac file-systems, you just need to install the "hfsplus" package in Linux... normally, I think it should be there by default. And if you don't have the correct file-permissions set up on your folder, then that's easy to fix too. And I'm also a bit baffled as to why you would use a Mac file-system (HFS+) on a USB stick... why not a more portable file-system like NTFS?
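On a Debian/Ubuntu-style system, that would look something like this (package and device names are illustrative; check the device with lsblk):

$ sudo apt-get install hfsplus hfsprogs
$ sudo mount -t hfsplus -o rw /dev/sdb1 /mnt/usb

One caveat: a journaled HFS+ volume will typically only mount read-only in Linux unless journaling is disabled on the Mac side.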

Basically, doing some work in mac, then doing some work in linux but making sure both are now identical. I couldn't find an easy solution... but I'm all ears.

So I have my git folder setup in mac. I do some work on it. Back it up to usb. Go home open my linux machine. And overwrite git folder in linux with the one on my mac usb.

That's not the way you should do it at all. You should never "overwrite" folders like that, not when you have git setup on them.

I have a similar work-flow (except it's Linux on both ends) between my work computer (at office) and my home computer. Here is how I normally do things. I'll explain it in details, because it can be useful to people.

Initial Setup (which may seem long, but it's just once, and it isn't that much work, just a few …
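The description is cut off above, but the gist of such a setup (one common way to do it, not necessarily the exact steps of the truncated post) is to keep a bare repository on the USB stick and treat it as a remote on both machines:

$ git clone --bare ~/myproject /media/usb/myproject.git   # one-time: put a bare repo on the stick

# on each machine:
$ cd ~/myproject
$ git remote add usb /media/usb/myproject.git

# daily workflow:
$ git push usb master       # before leaving the office
$ git pull usb master       # when you get home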

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

The word-size of a computer architecture is just the number of bits that the computer can digest at one time (for each instruction). So, for an 8-bit processor, it just means that it can only perform operations on 8-bit numbers at a time. This does not mean that it cannot digest bigger numbers, it just means that if it needs to digest a bigger number, it must break up that number into 8-bit chunks and digest each one at a time.

If we take the analogy of eating food, then the limit on the amount of food you can put in your mouth at a time does not limit the total amount of food you can consume; it just means that it will take longer to eat a big plate of food if your bites are smaller.

The standard Unix representation of time (dates) has always (AFAIK) been a 32-bit signed integer (and lately, a 64-bit integer) counting the number of seconds since the epoch (1970). On an 8-bit platform, this means that in order to manipulate a date (e.g., adding a year to it), the computer has to add two 32-bit numbers by adding 8-bit chunks individually: four 8-bit additions plus three carry additions (seven additions in total, if each carry is added as a separate step). But the point is, it can still deal with numbers larger than 8 bits; it just needs more work to do so.
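As a sketch of what the machine effectively does (written in C++ for illustration; an actual 8-bit CPU would do this in assembly with an add-with-carry instruction):

#include <cstdint>
#include <cstdio>

// Add two 32-bit numbers the way an 8-bit processor would:
// one 8-bit chunk at a time, propagating the carry between chunks.
uint32_t add32_via_8bit(uint32_t a, uint32_t b) {
    uint32_t result = 0;
    unsigned carry = 0;
    for (int byte = 0; byte < 4; ++byte) {
        uint8_t x = (a >> (8 * byte)) & 0xFF;
        uint8_t y = (b >> (8 * byte)) & 0xFF;
        unsigned sum = x + y + carry;   // at most 0xFF + 0xFF + 1
        carry = sum >> 8;               // 1 if the 8-bit addition overflowed
        result |= uint32_t(sum & 0xFF) << (8 * byte);
    }
    return result;
}

int main() {
    // e.g., add one (non-leap) year's worth of seconds to a Unix timestamp
    std::printf("%u\n", add32_via_8bit(1000000000u, 31536000u)); // prints 1031536000
}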

If we take the analogy of doing additions like we did …

RikTelner commented: Mike saves world again :D. +2
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

By the way, I've tried navigating back and forth on other websites while watching the memory consumption. This is clearly a problem with Daniweb, because none of the other websites I tried have that issue. In other words, this is not a bug in Chrome or one of its extensions. But it could be a Chrome-only bug of Daniweb.

Currently, since I closed and re-opened the Daniweb tab (just after creating this thread), my memory consumption is at about 600 MB and rising.. now 605 MB... and rising.. definitely leaking something.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

It's not a problem with the editor. It's a memory-leak problem. When the memory consumption starts to reach a couple of gigabytes, everything starts to become slower, including the editor, of course. Everything lags, from the editor to the forum drop-down menus (forum categories).

But the problem is not with the editor. Basically, every time I load a new page (while browsing between forums and posts), the memory consumption of the Daniweb Chrome tab increases by 30 MB to 50 MB. And so, after about 50 to 100 page loads (which can easily happen since I almost never shut down this computer or my browser windows), the memory consumption is huge and slows everything down. That's the problem.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Issues... there was no good client for bitbucket on the mac at the time. As a result I spent a lot of time in the terminal.

As far as I'm concerned, there will never be good GUI clients for version control, because nothing beats the speed, effectiveness, and power of terminal-based interactions with it. That said, I think that GUIs for Git have gotten a bit better lately, but I don't know.

Additionally, because it was cloud based it seemed to slow everything down.

This is really odd. That statement is extremely surprising to me because I have absolutely no idea how bitbucket (or github) could "slow everything down". This makes no sense. All your coding is done locally on your local folders and files. There is absolutely nothing running in the background or connecting with the "cloud" (i.e., a server) as you are coding. In fact, you shouldn't even need to have an internet connection at all. You have to explain this a bit more, because I really don't understand what this means or how it is even possible. I suspect there is something wrong with the way you used it, but I can't imagine what.

To get to an old piece of code you had to do a roll back... then a roll forward.

Maybe your limited GUI client or something required you to do that, but this is certainly not necessary. In git, if you just want to get an …
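The post is truncated here, but the usual git way to peek at an old version, with no "roll back / roll forward" involved, is along these lines (the commit id and file name are illustrative):

$ git show abc1234:src/main.cpp           # print that file as it was at commit abc1234
$ git checkout abc1234 -- src/main.cpp    # or restore just that file into your working copy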

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

If I close the tab, the memory is released and all is fine (no lag, no problems). And then, I've noticed that every time I load a new page (e.g., go to a new forum, or thread, etc.), the memory consumption increases by roughly 50 MB, and it just accumulates and accumulates.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Hi all,

I just noticed a significant slow-down with Daniweb, not with the server, but with the site (javascript?), in particular with the editor, which jams every few seconds. So, I checked my task manager and found that one Chrome tab was taking quite a bit of resources (memory + CPU), and lo and behold, it's the Daniweb tab. Here is what I get from "chrome://memory-redirect":

PID     Name                                     Memory
                                                  Private      Proportional
3906    Tab                                      2,042,964k   34,376k
        DaniWeb Community Feedback | DaniWeb

In other words, the Daniweb tab was taking up 2 GB of RAM. And whenever I browse around a bit, the memory consumption just keeps increasing and increasing. It seems like a memory leak to me.

You might want to look into it.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

First of all, in an unsigned byte, the maximum decimal value is 255 (0xFF). And second, it's not because the word-size is only 8 bits (1 byte) that you cannot create numbers larger than that; it just means that if the numbers are larger than the word-size, you have to handle them word-by-word. For example, if you add two multi-word numbers, you just have to add the least-significant words from each number, then keep the carry for the next word addition, and so on..

Think of it this way. You, as a human, when you were in elementary school, you could only represent a number between 0-9 through writing a digit down (i.e., that is your native "word-size"). But, you could still represent very large numbers and do many complicated operations with them, right? Well, it was the same for these computers. And it's still the same today for very large numbers that exceed the 32bit / 64bit word-sizes.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Well, that must be part of the training at the academy... learning to switch your high-heel shoes for flat soles in a split second. I mean, that's a necessary skill for a female cop, right?

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

2 gold medals for Canada, and 1 silver for Sweden.... I couldn't be happier!

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Line 99 won't do what you think it does (whatever that is). This line:

std::vector<Car*> print();

actually declares a function called "print" that returns a vector of Car pointers. In other words, the line does not do anything concrete.

I believe that what you meant to write was this:

for (int i=0; i<CAR_SZ; i++){
    cars[i]->print();
}

which calls the print function on all the car objects.

Also, you should get into the habit of cleaning up after your code, so as not to let memory leak (here, it doesn't matter, but later it might). So, you should clean up your allocations (at the end, before the return 0; line):

// release the objects that were allocated with 'new':
for (int i=0; i < CAR_SZ; i++){
    delete cars[i];
}

for (int i=0; i < PERSON_SZ; i++){
    delete people[i];
}
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

But I don't work in a team. I found the process interfered with my work flow. Instead I just create a new revision with a folder name plus an incremented number on the end.

You don't need github or bitbucket, or any other "host" for using Git. For a very simple "one man" project, it could be as simple as turning your project's folder into a git repository (which you do with $ git init in the top-level folder), and then, just "commit" your changes once in a while (e.g., every day or every time you complete a good chunk of code). This requires very little work, and does not really disturb any kind of work flow (unless your work flow is extremely disorganized!). Doing things like copying folders (and tagging them by version number or date) or stashing tar-ballz of older versions is far worse in terms of work needed and far less convenient in terms of features.

Here is a typical situation:

I'm working on a project (just me, local repo), and I run some piece of code which fails (error, crash, whatever...). I find this odd because it used to work at some point, i.e., the last time I ran that particular test code, maybe a couple of months ago or so. So, I'm pretty sure I didn't change too many things, but I can't figure out or remember what could be causing this error. So, I go to my project directory, and check the …
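The story is cut off, but for exactly this "it worked two months ago" situation, git has a purpose-built tool worth knowing about: git bisect, which binary-searches the history for the commit that broke things (a sketch; the tag name is illustrative):

$ git bisect start
$ git bisect bad            # the current version is broken
$ git bisect good v0.9      # a commit / tag known to work
# git checks out a commit halfway in between; re-run your test, then:
$ git bisect good           # or: git bisect bad
# ... repeat until git names the first bad commit, then:
$ git bisect reset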

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

How does someone or a group of people create a public folder on dropbox or some sort of similar service and sync the code with their buddies?

I did that a few times for non-code stuff (word documents), and it's the worst thing ever. One time, it was really bad, in a team of 9 people working on a word document of 900 pages, over a drop-box style thing. We had to basically do the check-out / commit through emails (e.g., send an email to say "I'm currently working on it.", download the latest doc, make your changes one-by-one from your local (old) version to the latest doc, then upload the latest doc back, and email to say "Ok, I'm done!"). This was hell! And the sad thing is, we were following the recommended methodology that our supervisors at the European Space Agency told us to follow! So, this whole problem of people being oblivious to the existence of version control is worse than you think (not to mention the absurdity of creating large documents in MS Word, as opposed to a scalable text-engine like LaTeX).

This can be handy if you mess some code up and need to revert to a previous time

I remember making that (very obvious) point to someone and getting the reply: "I already have that with the 'undo' button in my editor"... talk about not knowing what you're missing.

Git

Just another thing I want to mention about …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

It is our policy not to delete posts at the poster's request. If your post violates the rules, flag it as such (use the little flag icon next to "edit article"), and we will consider deleting it, but only if it violates the rules.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

okay so how many rep points are needed to grab a top level position of daniweb amigo??

Take the number of rep points that Ancient Dragon has, multiply that by 10%, and that pretty much means you are at top level.. i.e., if you can come up to AD's ankles (10%), then that's a major achievement here. ;)

Stuugie commented: nothing like you being a grown man and some dude calling you Mikey. I'm a Mike and that shit stopped 25 years ago, unless it's a friend calling me Mikey. +0
Mike_danvers commented: tit for tat +0
mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I have always suspected that "till" was actually the older version of "until" (because in Swedish, until is "till"). And, it appears that I am right:

"Till is the earlier form, attested as early as 1330; Until is actually derived from till, not the other way around as in ’til."

So, that gives a redeeming quality to the word "till". So, I have no problems with it.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

@rubberman, traditionally (i.e., until recently), to my understanding, the driver situation with the three main graphics-card companies has been more or less like this:

Intel's attitude has been to dedicate a team of developers to contribute open-source drivers for Linux, i.e., full cooperation with the open-source kernel / driver development community. Their drivers are first-class (installed by default and VERY stable).

AMD/ATI's attitude has been to hold their proprietary drivers to the same standard as their Windows drivers, and to cooperate with the open-source kernel / driver development community to make sure the basic open-source drivers are OK and that their proprietary drivers work well with the core parts of Linux.

Nvidia's attitude has been to agree to release some basic and untested proprietary drivers (ported from Windows) without putting too much effort into it, and with no cooperation with the Linux-dev community, forcing them to reverse-engineer the Windows drivers to produce the "Nouveau" drivers (open-source).

Now that Linux gaming and popularity in general are ramping up (and due to a middle finger from a prominent figure), Nvidia has had to revise that stance, produce better drivers, and become more open. I believe that this is starting to bear fruit.

Case in point, a few years back, with a Nvidia-powered laptop, I had quite a few issues with the drivers. Essentially, the choice was between the slow Nouveau drivers or the unstable (flickering, crashing) proprietary Nvidia drivers. Now, that laptop runs well because the newer kernel and Nvidia driver …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

I highly recommend Qt. It's a library with a few additional tools. There is also a plug-in for Visual Studio, or you can use its native IDE (Qt Creator, which uses MinGW/GCC). There are plenty of tutorials online to show you how to do it step by step. It's pretty much as easy as it gets when it comes to GUI programming in C++.
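To give a flavor of it, a complete Qt "hello world" application is about this short (a minimal sketch against Qt 5, built with the Widgets module enabled):

#include <QApplication>
#include <QPushButton>

int main(int argc, char** argv) {
    QApplication app(argc, argv);       // one application object per program

    QPushButton button("Hello, Qt!");   // a widget with no parent becomes a window
    QObject::connect(&button, &QPushButton::clicked,
                     &app, &QApplication::quit);   // clicking the button quits
    button.show();

    return app.exec();                  // enter the event loop
}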

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

@AD: More recently, IBM's super-computer "Watson" beat the world champions of Jeopardy (which is much harder than chess)... and it runs SUSE Linux Enterprise.

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

is the ',' after 'Dutch' and before 'and'.

Yeah, I wasn't 100% sure on that. It's my understanding that if you repeat or change the subject across the "and" or "or", you have to put a comma. As in, "We ate in a restaurant and then went to the movies" (no change / repeat of subject "we"), compared to "We ate in a restaurant, and then we parted ways". But I might be wrong. I'm a bit of a comma-abuser, I like sentences to breathe.

I think I like the original better -- "à propos" an attempt to mix French words with English. I've never ever heard anyone say or write that, had to google it to find out what it meant.

The original was "albeit a bit disturbing perhaps in the discussion at hand", which makes no sense and was weirdly redundant. One way or another, this had to be reformulated.

The "à propos" expression is one that I have heard a lot in English, and in fact, much more so than in French, weirdly enough. In French, it is used as an introduction (circumstantial adverb), as in "À propos, je voulais vous dire..", where "à propos" essentially means "by the way". Or, it is used in "à propos de" to mean "about". But, it is very rarely used as an adjective, as it is used in English (that means "right on the subject of", "opportune" or "pertinent"). To be honest, I don't like too …

mike_2000_17 2,669 21st Century Viking Team Colleague Featured Poster

Did you try to see what you could do with the instructions on Migrating your KMail/Kontact setup to a new distro? I think that just copying all those config files/folders from the old distro (Mint) to your new system should do the trick, at least, that's what it's intended for. But to be safe, you should back up your Kubuntu configs before overwriting them with the configs from your Mint drive.

All I can see is that you might have some issues if the versions of the KDE suite are very different between the distros, or if you use different user-names / accounts. In that case, you might have to manually edit those files to make the repairs. Usually, those kinds of config files are just simple text files with lots of fields and stuff (though maybe emails / contact-lists are encrypted). So, it is usually quite easy to open them up in a text editor and just modify them accordingly (e.g., just open a "fresh" Kubuntu config file and the corresponding Mint config file, and it should be quite obvious what you need to change to make the Mint config work on the Kubuntu system). But I don't think that this will be a problem.

Gribouillis commented: good help +14