Latest Information Security Posts

Interesting read.

Thanks for sharing!

Haha wow, savage. Didn't expect this, but I guess thinking of many usage scenarios all the time can get annoying, and they can't think of every single thing a user would do.

rproffitt commented: I didn't tell all. Later the JPEG issue turned out to be exploitable. +15

If you find an answer, do let us all know. I had a 4 gig SD card that locked up my entire system every time I inserted it into the slot.

rproffitt commented: I bet the Microsoft engineer's answer will be "don't do that." +15

In the modern world, it’s easy to understand that data is absolutely essential to many corporations and organizations. There is an ever-growing need to understand the needs and desires of your customer base so that you can tailor your products, services, and platform as closely as possible to those needs. However, we are also coming to a point where people are beginning to realize that the way their data is being used can be quite harmful. One of the most obvious examples is the Facebook-Cambridge Analytica scandal, in which data from Facebook’s user base was used to influence a United States political election.

It’s easy to understand the idea that “we are the product” these days. There are literally billions of individuals active on social media platforms, whether they are using Facebook, Twitter, or Instagram. We know that we sign up for these services for free, and in return, companies might use or even sell our information to the highest bidder. It’s also understood that different countries have different laws when it comes to the Internet and censorship as well.

This makes for an interesting tug-of-war between tech companies and governments that want access to more information than ever. It isn’t a new problem by any means. India notably threatened to ban BlackBerry services in 2010, only to reach an agreement with the company regarding the interception of messages in 2013. The tech companies might not be excited to give information to these governments, but they ...

For those very new to this area, please google SIM SWAP and discover a very nasty security issue with all phones we use today. At first glance the new security researcher might think I'm exaggerating. Do your own research and tell me you don't find this to be one of the most foul, nasty exploits I've seen in years.

This exploit was recently used to hijack a writer's phone: https://www.zdnet.com/article/sim-swap-horror-story-ive-lost-decades-of-data-and-google-wont-lift-a-finger/
His case is still developing, with lost accounts, tax returns he stored in the cloud, and a 25,000 USD Bitcoin purchase.

And we're not talking thousands of dollars in losses but millions. Take, for instance, a loss of over 23 million USD: https://www.vice.com/en_us/article/pawwkz/bitcoin-investor-sues-att-23-million-sim-swap-hack

The current state of affairs appears to be DENIAL by the carriers that this is a problem. Do your own research and, if you are like most people I've talked to about this, it will shake your faith in all things smartphone and cloud based. So many are using their phone as their wallet, and until lawsuits cost the makers of this disaster a few billion, they won't see a reason to fix it.

Unfortunately, movies and TV have co-opted the term to mean anyone who is capable of gaining unauthorized access to a system in under two minutes, even a completely unfamiliar system. The hacker is basically the movie/TV version of deus ex machina. Check out Die Hard 4 and just about any episode of How to Get Away With Murder.

It really has a kind of negative meaning to me. I think of a hacker as someone who is best at cheating Internet users and enjoys doing harm. The difference between a qualified programmer and a hacker is like the difference between good and evil.

For credit card charges here, they must add a shipping address. That is the norm here in the USA, even if you don't ship a physical product. I'm sure it's explained somewhere that this is part of keeping everyone on the up and up.
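That said, if the checkout is driven by a classic PayPal Payments Standard button form, the documented no_shipping button variable may be worth testing; whether PayPal honours it for a given account or region isn't guaranteed. A minimal sketch, with the seller email, item name, and amount as placeholder values:

<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
  <!-- Standard Buy Now button; all values below are placeholders -->
  <input type="hidden" name="cmd" value="_xclick">
  <input type="hidden" name="business" value="seller@example.com">
  <input type="hidden" name="item_name" value="Online service">
  <input type="hidden" name="amount" value="10.00">
  <input type="hidden" name="currency_code" value="USD">
  <!-- no_shipping=1 asks PayPal not to prompt the buyer for a shipping address -->
  <input type="hidden" name="no_shipping" value="1">
  <input type="submit" value="Buy Now">
</form>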

Ayush_5 commented: Thanks for answering +0

OK, so I sell some online services, so basically there is no need for my clients to enter a shipping address, but PayPal still asks for one on the checkout page they are taken to from the PayPal button. Is there a way to remove that option? I'm sorry if I am asking this in the wrong place; I'm kind of a newbie here. Thanks.

I appreciate your help very much, A++ tech community.

Dani 1,775

Sorry, I don’t know how to do that off the top of my head. If the page contents have changed, you can use a sitemap file. But I don’t think Googlebot wants your sitemap to contain dead pages. I think you should just wait for them to naturally come around and recrawl you.
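For what it's worth, the kind of sitemap meant here is just the standard XML format; a minimal sketch, using an example.com URL and an illustrative lastmod date as placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/updated-page.html</loc>
    <lastmod>2019-02-01</lastmod>
  </url>
</urlset>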

Sorry for the late reply, guys. Any idea how I can ping all the links from the search engine with a crawl bot or something? In order to let the bots know the pages now return a 410 (Gone) status, so they should remove them.

I managed to clean up a lot of links with the 410 responses; there are a few more, but I will check these days and see if they all disappear. I appreciate your help! I find this to be the only solution, and it kind of works, I guess. :)

// Send "410 Gone" for any request that carries the spammy query parameter.
if (isset($_GET['foo'])) {
    header("HTTP/1.0 410 Gone");
    exit();
}
Dani 1,775

That robots.txt file looks correct to me. But will not deindex from what I've read.

I think you've misunderstood what I was saying. A robots.txt file, alone, will not deindex. It was imperative that userunfo was able to get all the pages to return a 410 Gone HTTP status. The advantage of robots.txt is that Googlebot won't be able to crawl the spammy URLs on the domain, consider them spammy, and negatively affect the quality score of the domain as a whole. Therefore, it helps preserve the integrity of the domain (a hit that can take months or years to recover from) while figuring out how to 410.
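To sketch the 410 half concretely (assuming the spammy URLs all share a query-string prefix like the foo placeholder used elsewhere in this thread), a small check placed early in the request, for example in a WordPress must-use plugin or near the top of the theme's header.php, could look something like this:

<?php
// Placeholder pattern: adjust 'foo' to whatever the spammy links actually start with.
$qs = isset($_SERVER['QUERY_STRING']) ? $_SERVER['QUERY_STRING'] : '';
if (strpos($qs, 'foo') === 0) {
    header('HTTP/1.1 410 Gone'); // the page is permanently gone
    exit;
}

The robots.txt wildcard rule shown further down the thread then covers the crawling side while the URLs drop out of the index.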

https://www.seoinc.com/seo-blog/fastest-way-to-deindex-pages/ is working here but I'm in the USA using a Google DNS 8.8.8.8 so folk in China, North Korea and who knows where else may not be able to get there.

I managed to return HTTP 410 for all /?foo= links. Let's hope this is gonna solve everything.

Dani commented: Great!! +16

Thank you guys for discussing my problem. I read everything you wrote above, but I still don't know which solution is most promising. I've removed over 1,000 links manually through Google Search Console but still can't find a good way to handle them in bulk. I've attached some of the files I found infecting WordPress.

Is there a way to write some code so that all links from www.website.com/?dic return a 410?

https://ufile.io/r8uayfsz >> here are some of the infected files.

This link doesn't work :( https://www.seoinc.com/seo-blog/fastest-way-to-deindex-pages/

Dani 1,775

And unfortunately that brings us back to how the original post was snipped to remove a crucial part of the question. It had some foul language as well as links to spammy pages, so I'll try to use example URLs instead:

Ex:
https://www.example.com?foo=bar+html
https://www.example.com/?foo=bar.html + many more

I want somehow to block indexing all links from:
"foo="

Check out https://geoffkenyon.com/how-to-use-wildcards-robots-txt/ and you can see there that you can do something like:

User-agent: *
Disallow: *?foo=

(At least, I think so. Please correct me if I'm wrong.)

rproffitt commented: That robots.txt file looks correct to me. But will not deindex from what I've read. +15

Thanks Dani.

Sites with spammy content will suffer so I think we're on the same track there.

If useruno1 wants to keep spammy pages that's up to them and let's hope that what's been covered here can clean it up.

Dani 1,775

"using a robots.txt won’t remove pages from Google’s index." was his point and again why I wrote no.

rproffitt, your quote is taken out of context. robots.txt will not remove valid pages from Google's index. If you have a webpage that you want visitors to be able to access, but you simply don't want it to appear in Google's index, then adding it to your robots.txt file won't remove it from the index.

However, when dealing with the issue the OP is suffering from, this is not the solution to the problem. He needs to make sure all the spammy pages no longer exist on his domain, and then use robots.txt immediately in order to do damage control so as to not get hit by a Google algorithm update that doesn't like domains with lots of spammy pages.

Dani 1,775

rproffitt, I still believe that robots.txt is the best solution here.

It seems as if malware has created many spammy pages which have subsequently been indexed by Google. The article you link to suggests the best way to deindex pages that you still want Google to be able to crawl and visitors to access. In such a case, I would agree with John Mueller, who is Google's ambassador to the SEO community.

However, I would not recommend that strategy in this case. Basically, the strategy involves manually adding a <meta name="robots" content="noindex"> tag to every page, and then updating the sitemap to tell Google to recrawl the pages soon and notice the change.
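For reference, the same noindex signal can also be sent as an HTTP response header instead of a meta tag; a minimal PHP sketch of that variant, purely as an illustration:

<?php
// Equivalent noindex hint delivered as an HTTP response header.
header('X-Robots-Tag: noindex');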

The problem with doing that, in this case, is that, firstly, I would hope the spammy pages have already been permanently removed. Secondly, if for some reason they haven't been, manually modifying every spammy page doesn't seem like a reasonable solution ... if it were that easy, one would just as easily be able to remove the pages altogether or change their content to be non-spammy.

Instead, a robots.txt file is the best way to quickly tell Google not to crawl that section of the site. This is imperative when the pages are spammy and you don't want Googlebot to ding you for having spammy pages on your domain. If the pages no longer exist, they'll eventually fall out of the index over time; don't worry about that.

My point here is robots.txt doesn't appear to be the answer. Looking around for more clarity, I found this discussion in which Google's own Webmaster Trends Analyst, John Mueller, notes how to mass deindex.

"using a robots.txt won’t remove pages from Google’s index." was his point and again why I wrote no.

The full discussion is at https://www.seoinc.com/seo-blog/fastest-way-to-deindex-pages/ and IMO should be a fine answer on how to deindex where needed.

Dani 1,775

I think he’s trying to show the format of the links to see if there is a regex or something that can be used to mass deindex them. I believe wildcard characters are supported. I can’t provide more advice without seeing the format of the original links that were snipped, unfortunately.

I'm going with no since robots do not have to honor this file.
Noted at http://www.robotstxt.org/faq/blockjustbad.html

On top of that your links don't show me the issue. In fact they seem more like forum spam to me. Tell me what I should be seeing at those links.

Hello guys,
Recently my site was infected with malware, which caused me a lot of problems. In particular, many spam links have been created and indexed. I managed to remove a lot of them with Google Search Console, but they still appear in some key searches. Is there any chance of blocking the link prefix in robots.txt so they get removed from Google?

I want to somehow block indexing of all such links. I know I can block like this:

User-agent: *
Disallow: /product/categories

But this one is different; it's not like a parent page/category. I would very much appreciate it if you can help me, cheers!

"I've never backed up, why is that my problem?"

And yet the same person, if you asked him why he locks his house when he leaves for work, would just look at you like you were a moron.

Let me share how I began to understand Microsoft. A long time back, I went to Microsoft's Redmond campus for a seminar. I had two things I wanted to share, since I thought Microsoft would want to look into them:

  1. A CD that, when put into the PC, would cause the PC to lock up. I didn't know why; I just thought it showed a bug.
  2. A JPEG file that, when copied to the desktop, would render that user's account dead, even in Safe Mode, until the file was deleted.

I waited my turn to meet with a couple of Microsoft engineers (lucky me!) and their response told me all I needed to know about Microsoft. Maybe you can figure it out too. The engineers' answer to these issues was "Don't do that."

Microsoft didn't consider things like these to be bugs. It's user error.

One of the mantras of computing and just about anything data related is "We only lose what we don't back up," but some take offense that this is still the current state of computing today. Recently, some owners have called this out as "blaming the user," "you're holding it wrong," or snobby. Everyone I know will try their best to help you get your machine back in working order even when people say such things. They've lost it all and are upset that they can't get their stuff back.

Last week's example was another smartphone owner who forgot their password and unlock code; the only option folk could offer was to factory reset the phone. No backups, ever. Owner's statement: "I've never backed up, why is that my problem?"

I don't mind all that but it is your data. Keep it safe. No one else will.

Q1. Is such thinking out of date?

Q2. Is the industry really that out of touch?

The Conservative party issued a statement on Saturday which apologised for "any concern caused" and confirmed that "the technical issue has been resolved and the app is now functioning securely." That was not, however, before Boris Johnson's profile image had been changed to a pornographic one and that of Environment Secretary Michael Gove swapped for a picture of Rupert Murdoch. Some ministers, and other MPs, apparently reported receiving nuisance calls following the app breach.

The Information Commissioner's Office has confirmed that it is investigating the incident, and it could bite the Tories with a large fine. Under the EU General Data Protection Regulation (GDPR), which the app stated it complied with in its privacy policy, that could be in the millions.

@Bostjan_K. I read four pages and you can see why I give them the nod. It's not fast, as so many need that sort of help, but if you can follow their rules it works. Thanks for updating here so folk can see how much work these pests create for us.