Url removal: yah!

For now, I’m just going to say “hot damn.” The smart folks on the webmaster console team have migrated Google’s url removal tool into the webmaster console. Along the way, it’s picking up a *lot* of nice new functionality. I’ll talk about it more pretty soon, because I have a fun story to tell, but in the meantime you can read more about it from the official webmaster blog or on Search Engine Land.

Yah!

64 Responses to Url removal: yah! (Leave a comment)

  1. Yeh, I was just reading that on Google Webmaster Central and your post popped up in Google Reader. I’m about to test this new functionality, as I have some URLs on my site which I don’t want included in the index, and although I’ve blocked them with the relevant meta tags, it’s taking too long for them to be removed from Google’s index. So I think I’ll love this new tool :)
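For readers wondering what “the relevant meta tags” refers to, it is presumably the robots meta tag; a minimal sketch, assuming you can edit each page’s head:

```html
<!-- Keeps this page out of Google's index once it is recrawled -->
<meta name="robots" content="noindex">
```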

  2. O.K. Now I am feeling uber powerful! First I leave the comment on your blog at
    http://www.mattcutts.com/blog/robotstxt-analysis-tool/

    about what to do about the 404 errors I find in Google Tools for URLs that I don’t have.

    And now this happens! Boy is google being attentive to my needs and filling my requests!

    So while I am at it can I also request a slider bar on the google tools to adjust my PR, and also check boxes to click for requests of which search terms I want to be number one for?

    Thanks Google for being so quick to handle my personal requests!

  3. This is so-o obvious that it is sad it was not implemented years ago.

    BUT —- One suggestion…

    A URL REPLACEMENT option….

    Perfect for frustrated :-? webmasters moving to a new URL or domain name, or moving many things to a new directory (perhaps those moving to a new URL, once verified, can keep their newer backlinks and PageRank)

    If this idea is implemented, please acknowledge SearchEnginesWeb

  4. It would be good that you could put the URL with some type of parameter. For example http://www.mydomain.com/*.jpg or http://www.mydomain.com/*.mdb

  5. SearchEnginesWeb, what does Google do with the new domain’s old backlinks?

  6. fantastic – what a great idea. i love what webmaster central is becoming.

  7. tuf

    A URL replacement tool is useless.
    Just use 301 redirects.
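For anyone following this advice, a 301 can be set up in an Apache .htaccess file; a minimal sketch with hypothetical paths:

```apache
# Permanently (301) redirect an old URL to its replacement
Redirect 301 /old-page.html http://www.example.com/new-page.html
```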

  8. I love all of these cool new features. But it would be great to assign user accounts to your webmaster tools page. If there are different users, some may have less of an understanding of what a simple tool like remove may do to the wrong page.

    With my account I have only shared the login with a few people and made sure they knew exactly what each part does.

    Cheers!

  9. Hi Matt,

    Thanks for this. I just went through a document related to user behavior.
    One suggestion related to the SERPs: shuffle the sponsored list on the left side and you will get good ad clicks.

    cheers

  10. This came out of a great session at SES last week in NY, you should have been there Matt, not only was it fun, it was productive!

  11. I’d like to sneak in a question on a related subject (relating to ‘removal’ and ‘webmaster console’).

    In the webmaster console, when I go to Statistics > Page Analysis and look at the Keywords list, I see many words that are not present in my site but are present in an unrelated subdomain.

    Let’s say that my site is at http://example.com and an unrelated site, run by someone else and covering completely different matters, is at http://subdomain.example.com.

    I normally find the keywords list useful, however with the recent inclusion of a substantial amount of other terms this tool is becoming much less useful – about 60% of the keywords listed are not mine but from the unrelated subdomain.

    Is there anything that can be done about such issues? I can’t find any existing options within the webmaster console to deal with it.

    Could changes be made to the webmaster console such that, for a given site, one could explicitly specify subdomains that are part of the site, therefore automatically excluding all other subdomains?

    For example, if I add a site for example.com, I would specify that support.example.com and secure.example.com are part of the same site, causing information for unrelated1.example.com and unrelated2.example.com to not be displayed in the webmaster console for the given site.

  12. Hey Matt,

    This is great and thanks for this tool. I would also like to see something in there that shows 100% of the pages Google knows about.

    I run a Yahoo store and it’s not hard for us to get orphaned pages that we honestly don’t know about. (I still think this is part of the reason for my drop to the bottom of the results.) As far as I know my site has about 1,900 pages, but Google always shows more. I can only dig so deep, and every time I do I find orphaned pages. I remove them, wait a few more days, and look again in hopes of finding more to get removed.

    It would be an easier task if we could somehow see 100% of the pages Google knows about someplace.

    I know if we had FTP access like most webmasters do it would be easy but we don’t with yahoo stores.

    Maybe you guys should start Google stores for us…..

    Thanks for reading this,
    Bob

  13. woo hoo… hey, maybe it was my fault – i suggested this on a ‘suggestions for webmaster tools’ thread on webmaster world last year, and either adam or vanessa thought it was a good idea and said they’d pass it on to the gwt people. it was a pain having to search through h and highwater to find the url removal console every time i wanted to remove a bunch of stuff.

    something wrong with my keyboard – i’m shiftless today

  14. Thanks for the heads up, and it’s about time! And as SearchEnginesWeb said, how obvious. (His other suggestion too is long overdue – we want to change domains or URLs without all the pain)

    Putting more of the tools we need all in one place is much appreciated, and it’s very cool to watch the evolution of the console. It keeps getting better every month.

    I also want to second Aaron’s request for multiple user accounts/logins. He’s braver than I am, giving logins to other people…

  15. German

    SUPER!!!!!!

    What do you do when you can’t verify your own website due to some 404 file tests? Should we all enter all our sites in the webmaster console?

    Seriously, I do prefer the way with the robots.txt.

  16. Yeah, Matt, does that mean that it is now only available to those of us with Google accounts? In other words (don’t mind the tinfoil hat here please :) ) paranoid people won’t get the benefit of the tool?

  17. JLH

    They’ve also expanded the Page Analysis statistics, which now include common phrases in external links. You can check them out and see how they actually look on the page. Vanessa must have used those extra two days to get her taxes done by rolling out some new products… what’s next? One can only speculate… methinks it ain’t gunna be live PR.

  18. German

    Hi Michael,

    Don’t mind me being paranoid :-) I think most people in my branch are.
    Maybe it has something to do with our business of peeking into others’ lives every day; we can’t stand the thought of another Big Brother peeking into ours the way we (unwillingly) do it to them.

    Matt knows about all my sites. I’m not hiding them; they are interlinked or linked in some way, and I do have an account.

    I am just spending my life on the Internet, seeing what is trying to go through my firewall, from every piece of software I am using to a Google Toolbar notifier when I am sure I didn’t download the toolbar. My Windows computer has been corrupted for a week because of an automatic update that I didn’t ask for. Yes, I am paranoid, and I can’t wait to try Linux to stop all these check-ins, check-outs, and updates here and there.

    I will surely prefer avoiding any tool that means I have to enter the names of my websites and change their configuration because of Google.

  19. Thanks for the lovely article, I just burned your feed.

  20. Hey Search EnginesWeb, I think you should use a 301 redirect.

  21. JohnD.

    Now if only Webmaster Tools would update correctly. It’s still saying we have 3100 incorrect links on one of our sites. We did do some restructuring, but made sure to update all of our links and even updated the sitemap.xml. In the process our site that has been #1 for targeted terms for many years on Google has dropped completely off of all results. Still shows up #1 on Yahoo and MSN, just not Google.

    The good news is our traffic hasn’t dropped terribly bad since being dropped from Google. Just shows when you have a useful site people will get to it one way or another, Google Analytics just shows traffic coming from other places, and not Google now.

  22. good stuff, Vanessa and Rand should talk more often :)

    Sidenote: I am confused why the Google webmaster guidelines tell us to avoid helping the elderly and infirm – http://feedthebot.blogspot.com/2007/04/why-google-has-to-work-very-hard.html

  23. It would be nice to see it tied into the Web crawl errors page, or at least the Not found section. That way you could select the URL you wanted to remove if that’s the appropriate action.

  24. Great. I have used that tool a lot to fix strange problems over the years.

    Only thing is, it is kind of deadly. Maybe there should be some speed bumps if you are removing more than a single page, like requiring an email confirmation.

  25. I have to agree with SearchEnginesWeb. I’ve not long since changed a site to a new URL and I’m having problems with the old URL still being indexed and showing up in the results. Although I’ve got a 301 redirect in place, it does not seem to have the effect I was hoping for.

    Having an option to change URLs would help a lot of web designers.

  26. I know this is way off topic, Matt, but there’s nowhere else to go to ask or complain. What is the story with Google Apps referrals? Why US only? I live in Europe, but most of the visitors to my main site are from the US.

    Give us all a break, please…

  27. Harith

    Matt

    How do I subscribe to your mattcutts Twitter? Don’t you need all the friends you can get? Well, I do :)

  28. Deb

    Hi Matt
    I have already used it and it works fine.

  29. Matt – I’m impressed! I added the removal request for some old pages as soon as I read this entry of yours, and a day later they’ve been removed! Brilliant! It’s about time :-)

  30. This is a really useful tool. I have a couple things right out of the gate that it would work perfectly for. I also am in love with the Google Walking Maps you posted a few days ago. Really interesting stuff

  31. It is a useful tool. Once, my site, which contains fewer than 20 pages, started showing almost 2,000 indexed pages. At that time I wasn’t using Sitemaps, but after adding a sitemap, those pages were automatically removed over a period of time.

    Matt, I have a question. Though it’s irrelevant to this post, I was curious about it.
    It seems to me that a major backlink export is underway; I have noticed a few backlinks added to newly launched sites just now. Is it?

  32. The fact is that the completed URL removals that I have on the older removal page do not show up in the Google webmaster console. This is important for me, because I have about 2 weeks left for reinclusion, and because of this I need to see if any of the “completed” removals have timed out.

  33. One more thing has been added in the webmaster console, under Page analysis!
    First it was only keywords, then only phrases, and now both. Also, you can view phrases with variations.
    A very handy addition.

  34. Excellent stuff! It’s a handy tool that Yahoo have had for a while, so I’m glad Google have followed suit.

  35. Serp

    Very good. It works fine

  36. Well since we’re on the topic of webmaster tools, is there a tool or technique that will allow you to update the title and description in the Google search results quickly?

  37. Dyce

    I like it…. will have to see how much I use it over time… but already been having a go :)

  38. Let’s say I have a client that is susceptible to the whole ” a little knowledge is a dangerous thing” brain-bomb and got over-enthusiastic.

    So they decided that since it sometimes takes months for a redirected link to come back up to its original position, they would cleverly *remove* the original URL and then put in the redirect. :(

    I don’t suppose there is an “undelete” button somewhere? Or maybe a dunce cap with a Google logo I can hand to the offending party as a reminder to listen to the nice SEO they have hired rather than doing it themselves?

    I’m assuming it will now need to be re-spidered and that this would take much longer than a simple redirect would have. I’d be thrilled to find out this is not the case, but I’m not holding my breath.

    Ian

    PS Now that I’m thinking about it, I REALLY would like a Google logoed Dunce Cap. I can think of a ton of opportunities for it.

  39. KJW

    Wow – that is great! I already used it… :-)

  40. Vermut

    Mmmm… it does not work fine. I’ve deleted one entire directory from my server and the HTTP response header of my Apache web server says “404 Not Found”, but the report says: “Negato” (it’s Italian for “Denied”)

  41. some insight about Google web history!

  42. g1smd

    There is an oddity in Webmastertools that is probably very easy to fix.

    Steps to reproduce:

    1. Register a domain, and get a site online.

    2. Add a robots.txt file of some sort.

    3. Sign up for webmastertools and ask for the status of robots.txt for the site. At this point Google has not crawled the site.

    4. Webmastertools will give a date and time that it says the robots file was crawled, and it will say that it is “404 – not found”.

    The date and time given is that of when you signed up for webmastertools, and the “404” is incorrect. The file is actually there; Google just hasn’t actually looked at it.

    The message should say: “not yet crawled”.

  43. I really like Ian’s suggestion. I would actually wear one from time to time.

  44. ec

    I have recently changed the navigation for my entire site from dynamic to static, i.e. from page.php?categoryid=3&pageindex=22 to /cat/3/22.html. How do I tell Google to reset my entire site and start from scratch?
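A hedged sketch of the kind of mapping described above, assuming Apache with mod_rewrite enabled (the 301 lets Google carry the old dynamic URLs over to the new static ones rather than starting from scratch):

```apache
RewriteEngine On
# page.php?categoryid=3&pageindex=22  ->  /cat/3/22.html
RewriteCond %{QUERY_STRING} ^categoryid=([0-9]+)&pageindex=([0-9]+)$
RewriteRule ^page\.php$ /cat/%1/%2.html? [R=301,L]
```

The trailing `?` in the target strips the old query string from the redirected URL.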

  45. It’s good to finally see the engines doing more to work with SEO types. I think the SEs’ desire to keep as much info from SEOs as possible contributed to just about every SEO playing the black hat game, even if only a little. The more we know about how to rank, the more time we can spend doing it right.

  46. Definitely should have been implemented a long time ago!

  47. Dr. Dave, Ian, we have the ability to make our own fabrics; wouldn’t you rather have a shirt with Matt’s likeness accompanied by a few Google logos and some Hibiscus flowers? Just think Hawaiian Cuttlette shirts, how about it Matt?

  48. I wonder if this means that when I want to change a page and its title so the new URL reflects the changes, I can use the tool to delete the original URL without penalty. As we know, doing this in the past caused a 404 and harmed our rankings.

  49. Harith

    Matt

    Long time no post. Haven’t you earned enough husband bonus points :)

  50. I like it, it’s working fine ;)

  51. It took a while but it’s good to know that Google listens. Keep up the good work.

  52. I would say instead of going for URL removal, go for redirecting via .htaccess; at least this way one can retain a URL and its inbound links. Same as you do for dynamic URL rewriting.

  53. Will Howard

    Question: If I remove a subdomain URL from the main URL, will this affect my main URL in a negative way? Or what effect will it have?

  54. I made use of the new URL removal tool this past weekend and it worked great. The pages in question were out of Google’s search index very quickly. For those very rare times when it is needed, this is a great tool.

  55. Matt, I just used the function to try to remove 12 URLs, but when I went back to check on it, I got a message that said the URL removals were denied. I’m not sure what happened ???

  56. Hampstead

    Here’s a feature suggestion for you:

    Somehow there are 7,500 404 errors being reported in my Webmaster Tools. I have run many link checkers over the site and I can’t find these pages.

    I’m able to download a list of the URLs and it would be very useful to be able to upload said file to the URL removal tool.

    Could these 404s be external links?

  57. Very nice addition for Google.
    I always used 301s and I will keep using them, but this seems a nice improvement. I guess pages that aren’t spidered often can be removed faster this way.

  58. Brilliant!! (And yes, long awaited, but still brilliant).

    Thanks for posting this, Matt, and thanks to Vanessa and crew for a set of tools that just get better and better. The Webmaster console can totally change the relationship between sites and engines from combative to cooperative. Those of us willing to play by the rules (and I’ll admit to having been on both sides in the past) are now given a rich set of tools that allow us to *declare* our intentions, rather than to have Google have to try to figure them out. The tinfoil hat types will find life increasingly difficult.

    The timing of the Delete tool is perfect. Over several years, we had tried to make a large and useful site based on content that was not original but which to a lesser or greater degree engendered the creation of useful and original content. It didn’t work. So we recently have killed tens of thousands of pages so that we can focus our resources and efforts on a few really remarkable and outstanding pages. Within a few days, we should be able to clear the decks of cruft, letting Googlebot spend only the time needed to find what’s still actually there. Efficiency is good.

    In the absence of the webmaster console, and this new tool, the changes we have made would be hard to understand, and even downright suspicious looking. Now presumably everything is out in the open.

    Hooray!

  59. The URL removal tool does not work when robots.txt exceeds 5000 bytes.
    If some URL is blocked in such a file, the removal request is denied.
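For context, the kind of robots.txt rule the removal tool checks against looks like this (hypothetical path); per the comment above, the check reportedly fails when the file itself exceeds 5000 bytes:

```
User-agent: *
Disallow: /private/
```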

  60. I just submitted a few URL removal requests but they are now pending. Does anyone know how long it takes before they are actually removed?

  61. Hi Matt,
    I am finding this Google webmaster tools error section very confusing:

    I have errors under “URLs restricted by robots”, “Not Found” & “HTTP Errors” – the errors under Not Found and HTTP are 410, 404 and 403 – I have tried using the removal tool for some of the URLs I want removed.

    What I would like to know is: if 410, 404 and 403 errors are displayed, does this mean that Google will remove the URLs, and if so, how long does it take?

    Thank You!

  62. James

    Hi Matt,

    One server at my hosting company got hacked, and I have a few sites on that server; the hacker created hundreds (or thousands) of subdomains under each of my domains. The issue has lately been fixed by the host, but I can see some of these spammy subdomains being indexed by Google. Is there a way I can use webmaster tools to remove these subdomains from Google’s index?

    Thanks,
    James

  63. Just read this, requested some removals, and still waiting for things to happen. Anybody know how long this should take?
