
Get a new domain (second-level, that you actually own this time) and accurately 301 all your URLs to the new domain. You cannot "regain" your Google traffic. You'll need to rebuild, but 301s will at least help put your inbound links back on solid footing.
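As a concrete illustration, the per-URL mapping behind a mass 301 can be sketched like this (a hypothetical helper; in practice the mapping would live in your server config as a catch-all rewrite rule):

```python
from urllib.parse import urlsplit, urlunsplit

def redirect_target(old_url, new_host):
    """Return the 301 Location for a URL on the burned domain:
    same path and query string, new second-level domain."""
    parts = urlsplit(old_url)
    # Keep path and query so each old URL maps 1:1 to its new home,
    # rather than dumping everything on the new homepage.
    return urlunsplit(("https", new_host, parts.path, parts.query, ""))
```

A blanket redirect of every old URL to the new homepage throws away the per-page inbound links; the 1:1 mapping is what puts them back on solid footing.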


This has an excellent chance of contaminating the new domain. Consider Big Daddy G's view of the world: they've just declared *.co.cc persona non grata. 301 redirecting is designed to tell them that foo.co.cc is now doing business at example.net. I would not do that to a domain I didn't want to lose.


Google can't take this extreme position, as competitors could do this to your site as well. While I would personally not 301 to the new site, technically the benefit should outweigh the possible negatives. See paid links and directories for past reference on this subject. If getting a site down-ranked was as easy as linking TO them from a bad neighborhood, black-hat SEO would be all over it and would abuse the technique enough to make Google adjust (which most likely has happened many times).


It is certainly possible to burn a new domain by pointing a penalized domain at it. If domain #2 contains the same content as domain #1 pre-burning, and domain #1 301s to domain #2, that's almost proof-positive that they're under the same control. Google's standard of proof isn't neaaaaaaaaaaarly that high.


_If getting a site down-ranked was as easy as linking TO them from a bad neighborhood, black-hat SEO would be all over it_

It has happened, and it can work (but not always).


Having removed *.co.cc from their index, is Google going to continue to index it? There doesn't seem to be any reason for them to do so (they would just be throwing the information away), and if they don't then they would never become aware of the 301 redirect.


You know how Dropbox accidentally sent out DMCA notifications to someone because that was an unexpected side effect of using an internal tool to take down a particular file out of a public Dropbox? It may be the case that Google engineers are fully cognizant of the side effects of their internal tools, and they may never have unintended consequences or room for optimization.

A lot of SEO comes down to risk/reward guesstimations. Heads, Google does what you think would be optimal and you get a wee bit of extra juice to your newly un-burned website. Tails, Google hired a human at some point and you're on your third website this week.

I generally get paid to advise companies for whom losing the website is sort of a big deal, so I'm fairly risk averse with these things. I know folks with different models.


How about a redirect that filters out Google's bot?

Or even a JavaScript one. Even Google cannot solve the halting problem.
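A minimal sketch of that user-agent filter (hypothetical function and return values; note that the User-Agent string alone is trivially spoofable, Googlebot can be verified via reverse DNS, and detected cloaking is itself a penalty offense):

```python
def choose_response(user_agent, new_site_url):
    """Cloaking sketch: 301 human visitors to the new domain,
    but serve crawlers a plain 200 with no redirect."""
    if "Googlebot" in user_agent:
        # The crawler sees the old site as-is, so there is no 301
        # associating the burned domain with the new one.
        return (200, None)
    return (301, new_site_url)
```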


Google can and does include a JavaScript interpreter in their crawler: http://www.seomoz.org/ugc/new-reality-google-follows-links-i... . They don't need to solve the halting problem in order to run a JavaScript interpreter for a bounded amount of time (maybe 10-15 seconds) and see how that changes the page content or whether it redirects to a different site.

If you want to make your visitors wait until after the heat death of the universe before you actually redirect them, I'm sure you'll be able to fool Google too.
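The point about not needing to solve the halting problem generalizes: a crawler just puts a budget on the work it will do. A toy sketch, assuming a hypothetical redirect_map already extracted from fetched pages:

```python
def resolve_redirects(start_url, redirect_map, max_hops=10):
    """Follow a redirect chain under a fixed hop budget.
    Cycles or endless chains exhaust the budget instead of
    hanging the crawler -- no halting-problem oracle needed."""
    url = start_url
    for _ in range(max_hops):
        nxt = redirect_map.get(url)
        if nxt is None:
            return url  # chain terminated: this is the destination
        url = nxt
    return None  # budget exhausted: treat the redirect as unresolved
```

The same budgeting applies to script execution: run the interpreter for a fixed wall-clock slice and take whatever the page looks like when time is up.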


I wonder if 301s would help, as in: Do deindexed sites still get crawled and have other impacts on search results (like feeding link juice to other sites)?



