Latest Information Security Posts

Dani 1,954

I recently got burned by this as well.

It sounds like a major flaw considering that the product is specifically designed to be a remote access tool.

Reverend Jim 3,203

I think the term "hacker/hack" has been watered down to the point where it has returned to its original meaning. It was originally what you stated, then it became associated with the black-hats. Now it is so overused it has lost its negative connotation. Life hack? Seriously? What's wrong with the words "advice" or "tip"?

Reverend Jim 3,203

I recently got burned by this as well. Unlike you, the remote computer was only four houses down the road (father-in-law). I agree. It's a serious flaw. I do have another system I occasionally remote into (Cambridge, UK) but at least that one is not unattended and can be upgraded by the person at the other end as needed.

Dani 1,954 The Queen of DaniWeb

Living in California, I periodically need to log into my computer that's back in my home in New York. Not that often, typically just a couple times a year.

But today, when attempting to connect, I got an error message saying, "The remote TeamViewer is running an old version which is out of date. Therefore, you cannot connect to this version anymore."

Soooo, firstly, why is TeamViewer not backwards compatible?! Secondly, I haven't updated TeamViewer on my local computer recently either, so both local and remote computers should be running versions of TeamViewer that were released at roughly the same time. Thirdly, why does TeamViewer not provide a way to remotely upgrade?!


Dani 1,954

I certainly don't associate the word hacker with anything illegal or nefarious. To me, a hacker is someone who doesn't just follow the manual but is an out-of-the-box thinker who is capable of using things (could be technologies, products, etc.) in non-traditional, creative ways to his or her benefit. In the realm of computer hacking, it's simply being creative to get into functionality not designed to be easily accessible to the end user, or not designed to be accessible at all. You can hack your registry, for heaven's sake. MacGyver was a hacker. There's even a TV show called Hack My Life which is a how-to series about various life hacks. My mom watches the show, which airs on the same network as Impractical Jokers and COPS, meaning its target audience is non-technical baby boomers. If that doesn't mean the word has made it mainstream, I don't know what does.

Seevenup -3

I second pty on this. I'd describe it exactly the same way!

haimen -3

"Hacker" may refer to someone with technical skills, or to someone who uses that ability to gain unauthorized access to systems in order to commit crimes.

rproffitt commented: If I have technical skills, I would be suspected of committing crimes. What a way to look at things today. Then again scientists. -3

baabroz1 -3

A hacker is a person who uses computer, networking, or other skills to overcome a technical problem. The term hacker may refer to anyone with technical skills, but it often refers to someone who uses those abilities to gain unauthorized access to systems or networks in order to commit crimes.

Reverend Jim commented: This is wrong on so many levels. -3

Ron Peters

Interesting read.

Thanks for sharing!

Bllistic 15

Haha, wow, savage. Didn't expect this, but I guess thinking of many usage scenarios all the time can get annoying, and they can't think of every single thing a user would do.

rproffitt commented: I didn't tell all. Later the JPEG issue turned out to be exploitable. +15

MickeyD 15

If you find an answer, do let us all know. I had a 4 gig SD card that locked up my entire system every time I inserted it into the slot.

rproffitt commented: I bet the Microsoft engineer's answer will be "don't do that." +15

Steven Kenneth

Absolutely! Legality depends on the scenario, the permissions, and the access rights involved.
If the hacking takes place under a white-hat arrangement, i.e. penetration-testing activities, how could it be illegal? It is simply a part of testing, sometimes QA. So a better term to differentiate the two would be welcome.
But at the same time, when we hear the word "hacker", the definition that instantly pops up in our minds is an informal way of accessing something that results in harm.


In the modern world, it’s easy to understand that data is absolutely essential to many corporations and organizations. There is an ever-growing need to understand the needs and desires of your customer base, so that you may tailor your products/services/platform as much as possible to meet these particular needs. However, we also are coming to a point where human beings are beginning to realize that the way that their data is being used can be quite harmful. One of the most obvious examples is the Facebook-Cambridge Analytica scandal, in which data from Facebook’s user base was used to influence a United States political election.

It’s easy to understand the idea that “we are the customer” these days. There are literally billions of individuals that are active on social media platforms, whether they are using Facebook, Twitter, or Instagram. We know that we sign up for these services for free, and in return, companies might use or even sell our information to the highest bidder. It’s also understood that there are many different countries that have different laws when it comes to the Internet, and censorship, as well.

This makes for an interesting tug-of-war between tech companies and governments who want access to more information than ever. It isn’t a new problem by any means. India notably banned Blackberry in 2010, only to reach an agreement with the company regarding the interception of messages in 2013. The tech companies might not be excited to give information to these governments, but they ...

rproffitt 1,687

For those very new to this area, please google SIM SWAP and discover a very nasty security issue with all phones we use today. At first glance the new security researcher might think I'm exaggerating. Do your own research and tell me you don't find this to be one of the most foul, nasty exploits I've seen in years.

This exploit was recently used to hijack a writer's phone at
His case is still developing, with loss of accounts, tax returns he stored in the cloud, and a USD 25,000 Bitcoin purchase.

And we're not talking thousands of dollars in losses but millions. Take, for instance, a loss of over 23 million USD at

The current state of affairs appears to be DENIAL by the carriers that this is a problem. You can do your own research, and if you are like most people I've talked to about this, it will shake your faith in all things smartphone and cloud based. So many are using their phone as their wallet, and until there are lawsuits that cost the makers of this disaster a few billion, they won't see a reason to fix it.

Reverend Jim 3,203

Unfortunately, movies and TV have co-opted the term to mean anyone who is capable of gaining unauthorized access to a system in under two minutes, even a completely unfamiliar system. The hacker is basically the movie/TV version of deus ex machina. Check out Die Hard 4 and just about any episode of How to Get Away With Murder.


It really has a kind of negative meaning to me. I think a hacker is a person who is the best at cheating internet users and enjoys doing harm. The difference between a qualified programmer and a hacker is like the difference between good and evil.

rproffitt 1,687

For credit card charges here they must add a shipping address. That is the norm here in the USA, even if you don't ship a physical product. I'm sure somewhere it's explained that this is part of keeping everyone on the up and up.

Ayush_5 commented: Thanks for answering +0


Ok, so I sell some online services. Basically, there is no need for my clients to enter a shipping address, but PayPal still asks for one on the checkout page they are taken to from the PayPal button. Is there a way to remove that option? I'm sorry if I am asking this in the wrong place; I am kind of a newbie here. Thanks
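If the checkout goes through PayPal's classic button HTML (Website Payments Standard), the no_shipping variable controls that prompt; a minimal sketch with placeholder values (business email, item, and amount are all examples):

```html
<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
  <input type="hidden" name="cmd" value="_xclick">
  <input type="hidden" name="business" value="you@example.com">
  <input type="hidden" name="item_name" value="Online service">
  <input type="hidden" name="amount" value="25.00">
  <input type="hidden" name="currency_code" value="USD">
  <!-- no_shipping=1 asks PayPal not to prompt the buyer for a shipping address -->
  <input type="hidden" name="no_shipping" value="1">
  <input type="submit" value="Pay Now">
</form>
```

If you're using a different PayPal integration (hosted buttons, the newer checkout SDK), the equivalent setting lives in the button configuration instead.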

useruno1 39

I appreciate your help very much, A++ tech community.

Dani 1,954

Sorry, I don’t know how to do that off the top of my head. If the page contents have changed, you can use a sitemap file. But I don’t think Googlebot wants your sitemap to contain dead pages. I think you just have to wait for them to naturally come around and recrawl you.
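For reference, a minimal sitemap entry looks like this; a fresh <lastmod> date hints to Googlebot that the page changed (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/updated-page</loc>
    <lastmod>2019-01-15</lastmod>
  </url>
</urlset>
```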

useruno1 39

Sorry to keep replying, guys. Any idea how I can ping all the links from the search engine with a crawlbot or something? In order to let the bots know the pages are 410 Gone, so they should remove them.

useruno1 39

I managed to clean up a lot of links with the 410 Gone status. There are a few more, but I will check these days and see if they all disappear. I appreciate your help! I found this to be the only solution, and it kind of works, I guess. :)

    // 410 Gone tells crawlers the page was removed on purpose
    header("HTTP/1.0 410 Gone");

Dani 1,954

That robots.txt file looks correct to me. But will not deindex from what I've read.

I think you've misunderstood what I was saying. A robots.txt file, alone, will not deindex. It was imperative that useruno1 was able to get all the pages to return a 410 Gone HTTP status. The advantage to robots.txt is that Googlebot won't be able to crawl spammy URLs on the domain, consider them spammy, and negatively affect the quality score of the domain as a whole. Therefore, it helps preserve the integrity of the domain (which can take months or years to recover) while figuring out how to 410.
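The gating logic for that 410 is the same in any language (sketched here in Python for brevity; the foo parameter name is a stand-in for whatever the spammy URLs actually carry). A page handler would run this check before deciding whether to emit the 410:

```python
from urllib.parse import urlparse, parse_qs

def should_410(url: str) -> bool:
    """True when the URL carries the spammy query parameter.

    'foo' is a placeholder for whatever parameter the malware injected;
    keep_blank_values catches bare '?foo=' URLs too.
    """
    query = parse_qs(urlparse(url).query, keep_blank_values=True)
    return "foo" in query

# In the PHP front controller, a true result here is where you would call
# header("HTTP/1.0 410 Gone"); and exit before rendering anything.
```

If should_410 returns true, the server answers 410 Gone; otherwise it serves the page normally.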

rproffitt 1,687

It is working here, but I'm in the USA using a Google DNS, so folks in China, North Korea, and who knows where else may not be able to get there.

useruno1 39

I managed to get all the /?foo= links to return HTTP 410. Let's hope this is gonna solve everything.

Dani commented: Great!! +16

useruno1 39

Thank you guys for discussing my problem. I read everything you wrote above, but I still don't know which solution is best. I've removed over 1,000 links manually through Google Search Console, but I still can't find a good option to remove them in bulk. I've attached some of the files I found infesting my WordPress install.

Is there a way to write code so that all the links from return a 410? >> here are some of the infested files.

This link doesn't work :(

Dani 1,954

And unfortunately that brings us back to how the original post was snipped to remove a crucial part of the question. It had some foul language as well as links to spammy pages, so I'll try to use example URLs instead:

Ex: + many more

I want somehow to block indexing all links from:

Check out and you can see there that you can do something like:

User-agent: *
Disallow: *?foo=

(At least, I think so. Please correct me if I'm wrong.)
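For anyone who wants to sanity-check which paths a wildcard rule catches, here is a simplified model of Google's matching in Python, where * matches any run of characters and $ anchors the end of the path (this is an approximation for quick checks, not Google's actual parser):

```python
import re

def blocked_by_rule(path: str, rule: str) -> bool:
    """Rough model of Googlebot's robots.txt pattern matching:
    '*' matches any sequence of characters, '$' anchors the end of the path.
    """
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

# The rule above catches any URL carrying ?foo= :
# blocked_by_rule("/?foo=spam", "*?foo=")       -> True
# blocked_by_rule("/page?foo=123", "*?foo=")    -> True
# blocked_by_rule("/legitimate-page", "*?foo=") -> False
```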

rproffitt commented: That robots.txt file looks correct to me. But will not deindex from what I've read. +15

rproffitt 1,687

Thanks Dani.

Sites with spammy content will suffer so I think we're on the same track there.

If useruno1 wants to keep spammy pages that's up to them and let's hope that what's been covered here can clean it up.

Dani 1,954

"using a robots.txt won’t remove pages from Google’s index." was his point and again why I wrote no.

rproffitt, your quote is taken out of context. robots.txt will not remove valid pages from Google's index. If you have a webpage that you want visitors to be able to access, but you simply don't want it to appear in Google's index, then adding it to your robots.txt file won't remove it from the index.

However, when dealing with the issue the OP is suffering from, this is not the solution to the problem. He needs to make sure all the spammy pages no longer exist on his domain, and then use robots.txt immediately in order to do damage control so as to not get hit by a Google algorithm update that doesn't like domains with lots of spammy pages.

Dani 1,954

rproffitt, I still believe that robots.txt is the best solution here.

It seems as if malware has created many spammy pages which have subsequently been indexed by Google. The article you link to suggests the best way to deindex pages that you still want Google to be able to crawl and visitors to access. In such a case, I would agree with John Mueller, who is Google's ambassador to the SEO community.

However, I would not recommend that strategy in this case. Basically, the strategy involves manually adding a <meta name="robots" content="noindex"> tag to every page, and then updating the sitemap to tell Google to recrawl the page soon and notice the change.

The problem with doing that, in this case, is firstly, I would hope that the spammy pages have already been permanently removed. Secondly, if for some reason they haven't been, manually modifying every spammy page doesn't seem like a reasonable solution ... if it were that easy, one would just as easily be able to remove the pages altogether or change their content to be non-spammy.

Instead, a robots.txt file is the best way to quickly tell Google not to crawl that section of the site. This is imperative when the pages are spammy, and you don't want Googlebot to ding you for having spammy pages on your domain. If the pages no longer exist, they'll eventually fall out of the index over time; don't worry about that.