I read a post the other day by John-Henry Sherck on finding directories that survived the recent Google updates, and I found myself agreeing with his philosophy of grabbing links wherever they’re available. It also led me to another thought: “What if we made a search engine that only pulled results from directories?”
I set Xenu loose on Directory Critic, and after a REALLY long crawl I had all of the outbound links in one great big file. It took a while to weed out a bunch of junk (and this is still far from perfect), but eventually I got it working fairly well. Unfortunately, Google Custom Search Engine only lets you pull from 1,000 sites without paying (if you want to send money, just let me know and I’ll blow this thing up to its full size). I have over 13,000 directories in the main file that I could add, but this seems like a functional sample of what could be done with the idea.
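If you want to try the same clean-up yourself, here’s a rough sketch of the boil-down step: take a raw list of outbound links (like a Xenu export), collapse them to unique root domains, drop obvious junk, and cap the list at the 1,000-site CSE limit. The `JUNK_HOSTS` set and the sample URLs are just placeholders — my actual filtering was messier than this.

```python
from urllib.parse import urlparse

# Hypothetical junk filter -- hosts you don't want in a directory list.
JUNK_HOSTS = {"google.com", "facebook.com", "twitter.com"}

def unique_domains(urls, limit=1000):
    """Reduce a raw list of outbound links to de-duplicated root
    domains, keeping first-seen order and stopping at `limit`
    (the free CSE tier caps the site list at 1,000)."""
    seen, domains = set(), []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):  # treat www.example.com as example.com
            host = host[4:]
        if not host or host in seen or host in JUNK_HOSTS:
            continue
        seen.add(host)
        domains.append(host)
        if len(domains) >= limit:
            break
    return domains

# Placeholder links standing in for a real Xenu export:
links = [
    "http://www.dir-one.com/submit.html",
    "http://dir-one.com/category/home-improvement/",
    "http://www.google.com/search?q=x",
    "http://another-directory.net/add-url",
]
print(unique_domains(links))
# -> ['dir-one.com', 'another-directory.net']
```

Each resulting domain then goes straight into the CSE “sites to search” list.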
The quick overview for the A.D.D. crowd:
For those of you who didn’t click over to the post I referenced, the general idea is that you only want links from directories Google is still indexing. I set up Google Custom Search to return results from a specified list of domains, all of them directories I scraped from the Directory Critic site. Search for “home improvement” and you’ll get results from Google-indexed directory pages that mention “home improvement.” Rather than adding a bunch of search footprints like “submit URL” or “add URL,” you just type in keywords and get back directory pages that Google indexes. A result might be a category page or a page specific to one listing, but either way you’ll know Google is indexing the listings, and you can build a list of new targets for your link building efforts (it’s a little messy, but you can use the Simple Google Results bookmarklet to scrape your results from this as well).
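If pasting keywords into the search box gets tedious, the same engine can be queried programmatically through the Google Custom Search JSON API. This is a minimal sketch of building the request URL; `API_KEY` and `ENGINE_ID` are placeholders you’d replace with your own API key and the `cx` ID of the directory-only engine.

```python
from urllib.parse import urlencode

# Placeholders -- supply your own API key and engine (cx) ID.
API_KEY = "YOUR_API_KEY"
ENGINE_ID = "YOUR_CX_ID"

def cse_query_url(keywords, start=1):
    """Build a request URL for the Google Custom Search JSON API.
    The engine itself restricts results to the listed directory
    domains, so only keywords are needed -- no footprints."""
    params = {"key": API_KEY, "cx": ENGINE_ID, "q": keywords, "start": start}
    return "https://www.googleapis.com/customsearch/v1?" + urlencode(params)

print(cse_query_url("home improvement"))
```

Fetching that URL returns JSON with the same directory pages the web interface shows, which is handy if you’d rather build your target list in a script than via the bookmarklet.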
Try the directory search engine