Hello DaniWeb Community,

Can a site get out of the G Sandbox using RSS feeds to fuel the site with content? In other words, if a webmaster does not have the resources to write original content, can syndicated content do the trick?

Any other feedback on getting out of the sandbox, please post here. Thanks!

All 13 Replies

It's not so much the sandbox effect as Google's new algorithm to not rank a new site until it's been "broken in" for a couple of months (some say about 6 months). There really isn't anything you can do about it - regardless of how great a site is or how much content it has.

Once you've simply waited it out, then content is key. Google absolutely loves unique content. RSS syndication will help a tad, but in the end, it doesn't take Google a lot to figure out that it's syndicated. (After all, it's going to be duplicate content that Google can find elsewhere.) Don't confuse the statement "content is key" with the truth that "unique content is key."
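
To make concrete what "fueling a site with RSS feeds" means in practice, here is a minimal Python sketch (the feed URL is a made-up placeholder) that pulls items from a standard RSS 2.0 feed. Notice that everything it republishes already exists, word for word, on the source site, which is exactly why Google can recognise syndicated content as duplicate content.

    import urllib.request
    import xml.etree.ElementTree as ET

    FEED_URL = "https://example.com/news/rss.xml"  # hypothetical source feed

    # Fetch and parse the feed.
    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)

    # Standard RSS 2.0 layout: <rss><channel><item>...</item></channel></rss>
    for item in tree.getroot().iter("item"):
        title = item.findtext("title", default="")
        link = item.findtext("link", default="")
        description = item.findtext("description", default="")
        # Anything republished from here is a copy of content that Google
        # has almost certainly already indexed at `link`.
        print(title, "-", link)
        print(description[:200])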

That is a great answer Dani. Thank you as always.

Great reply cscgal, and I agree with you 100%.

You can't really get out of the Sandbox with anything done on your side...

The Sandbox helps Google stop new sites from using ethical/unethical means to get top rankings...

So the only way is to wait and watch... fingers crossed.

I had a site get out of the sandbox in about three weeks... I don't think the 6 months is a rule.

There's no rule. It can be a few weeks to nearly a year. There's nothing you can do once you're in it. The best theory people have for avoiding it in the first place is to show a completely natural web development process, i.e. don't over-optimise the pages, don't suddenly point thousands of links at the site in a short time, build them slowly, and make regular updates.

Supposedly the theory is that the more competitive the term(s) you are targeting, the harder the sandbox hits. Anyone who says they've avoided the 'sandbox' is likely targeting low-competition terms anyway. Smart webmasters will put up a site expecting to land in it, and thus not pin their hopes on anything real coming from the site sooner than 3-6 months.

While it does seem that the Sandbox is mostly a matter of time, I do think that it is good to have some links pointing to you so that you can be found by Google.
If others have any input or experience on this, I would love to hear your suggestions.

Well, the only way to get out of the Sandbox is to do what you have been doing, in a natural way.

Secure links, but only from quality, theme-based websites. Do NOT secure too many links too soon; that would be a burst link strategy, a no-no.

Keep adding fresh content to your website, etc.

Keep doing organic directory submissions.

Thanks for the tips, SEO expert. Does Yahoo have a sandbox?

Nobody really knows specifically how any search engine operates (unless you work for them, of course), but from my experience with Yahoo, it seems that a fair answer to your question would be yes and no.

Yahoo uses several data pools when calculating results: some come from their own crawlers, some from back-door submissions, some from paid submissions, some from other search engine databases, some that their own editors throw in ... and more.

The sites that their human editors approve or insert can get stuck in a queue seemingly forever, while some that come from other search engines are already available in the SERPs.

The Google Sandbox is also a myth; Google just doesn't encourage new sites, so as your domain ages you will automatically start to do well on Google.

Life in SEO, it's so on the edge.

I've finished experimenting with a technique. I now have only two sandboxed sites at this time, and I'm in no hurry to get them out as they are both in development. I'll share my experience with you, but use of the second technique is at your own risk.

There are a couple of things you can try that have worked for me, but it is only fair to warn you that you are playing with fire here. Use at your own risk; there are no guarantees in SEO. Don't come back crying to me if you try this and get penalized. I'm talking about situations where you have no choice but to get a web site out of the sandbox right now.

First of all, the safest way is to keep an updated Google Sitemap and make sure Google verifies it regularly. Make new pages every day, modify your sitemap to reflect them, and force-feed it to Google. Make it all seem sooo natural. They provide the tools to do this. Make new content daily and force Google to spider the sandboxed site daily. Try this first for a couple of weeks, then ... if all else fails, try the second technique described further down.
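
For anyone who wants to see what "keep an updated Google Sitemap" looks like mechanically, here is a minimal Python sketch using made-up page URLs. It regenerates a sitemaps.org-style sitemap.xml each time pages are added; re-submitting the refreshed file through the tools Google provides is the "force-feed" step mentioned above.

    import datetime
    import xml.etree.ElementTree as ET

    # Hypothetical pages on the sandboxed site; list every URL you want crawled.
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/articles/todays-new-page.html",
    ]

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)

    today = datetime.date.today().isoformat()
    for page in PAGES:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = page
        # Refreshing <lastmod> whenever a page is added or changed is what
        # signals that the sitemap is fresh and worth re-fetching.
        ET.SubElement(url, "{%s}lastmod" % NS).text = today

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)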

If the web site is able to migrate easily, embed it in a historically credible web site. Put it a couple of levels deeper than the entrance page. Provide one link to it from the host domain's entrance page and make sure the spider can find it, not in its face but casually mentioned in the content, looking natural. Make it look innocent.

What this appears to do is trigger the duplicate content filters, because the content now exists exactly somewhere else. This causes a re-spider of the sandboxed web site, and a positive reaction can occur at this point: Google has to decide which is the better content, the sandboxed or the non-sandboxed. Do not alter the sitemap for the host domain. Make it look like you had an idea and then changed your mind a couple of days later. Check the host domain's cache, and when you are certain that it has found the link, remove the link and the embedded site immediately after they both get re-spidered. Now the sandboxed site has the best content by default and seems to get out of the sandbox because of this.

Ethical SEOs will point out that what in fact is done here is similar to creating ghost pages. I have to agree. It isn't exactly ethical and I do not endorse it.

Now some clever SEOs are thinking to themselves: ah, duplicate content, naughty naughty, and up for penalization. Not necessarily. Google instructs us (not in these words, of course) that duplicate content naturally exists on the web, and it will provide results according to whichever copy it determines is most original and has the best content for the keyphrase search.

Remember too, folks, it may have been coincidental that I tried this technique at the same time that my web sites were scheduled to get released. If you try this and it doesn't work, well, it was an idea that didn't work for you. Perhaps a better one than what you may have in your own head if you read this far down the thread. :rolleyes:

Nobody really knows ... available in the SERPs.

I am no longer sure that my suggestions earlier in this thread significantly improved the speed at which these web sites popped out of the Sandbox.

It was an idea that really was in development at the time and I was over-enthusiastic in the manner in which I expressed my position.

I don't particularly endorse my previous take on this anymore. In the end, it will all be for the best when page positioning occurs naturally, in Google's time and in Google's way.
