If you have a lot of urls that you don’t want in Google anymore, you can make the pages return a 404 and wait for Googlebot to recrawl/reindex the pages. This is often the best way. You can also block out an entire directory or a whole site in robots.txt and then use our url removal tool to remove the entire directory from Google’s search results.
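For example, if everything you want removed happens to live under one directory (I'm using a hypothetical /private/ directory here), a couple of lines in robots.txt will block the whole thing:

```
User-agent: *
Disallow: /private/
```

With that in place, you can submit a single directory-level request in the url removal tool instead of thousands of individual ones.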
What I would not recommend is sending tons (as in, thousands or even tens of thousands) of individual url removal requests to the url removal tool. And I would definitely not recommend creating lots (as in, dozens or more) of Webmaster Central accounts just to remove your own urls. If we see that happening to a point that we consider excessive or abusive, we reserve the right to look at those requests and respond by, for example, broadening, cancelling, or narrowing them.
So if you’re sending huge numbers of requests to our url removal tool, it might be a good idea to take a step back and ask whether you should be removing at the directory level instead.