How To Allow Google/Bing to Index Your Site

A few months ago I changed my host from wordpress.com to bluehost.com, though I’m still running the WordPress software, now installed on my Bluehost server. I hadn’t had any problems getting my site indexed before the move.

First, I asked Google and Bing to index my site. I created webmaster tools accounts on both services (https://www.google.com/webmasters/tools/ and https://www.bing.com/toolbox/webmaster/) so I could see when their bots had crawled my site and what its index status was. I verified that both had crawled the site, but when I found my site in the index (by searching for my domain name), the description read:

A description for this result is not available because of this site’s robots.txt
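As an aside, a handy way to find your own pages in the index is the site: search operator, which both Google and Bing support. With example.com standing in for your domain, the query looks like:

site:example.com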

So I added a robots.txt to my webserver’s root directory that allowed robots to index the site, and verified it in webmaster tools. But after the site was recrawled, I still got the same message when I found my site in search.
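For reference, a minimal robots.txt that allows every crawler to fetch everything looks like this (an empty Disallow line blocks nothing):

User-agent: *
Disallow: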

Finally, I found a tool on Bing webmaster tools called ‘Fetch as Bingbot’. It let me type in a URL and see the exact HTML that my webserver sent to Bingbot. That’s how I noticed this META tag in the page’s <head>:

<meta name='robots' content='noindex, follow'>
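You don’t strictly need Bing’s tool to check for this tag; fetching the page yourself works too. A quick sketch using curl and grep, with example.com standing in for your own domain:

curl -s https://example.com/ | grep -i "meta name=.robots."

If that prints a tag containing noindex, your server is telling search engines not to index the page.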

I realized WordPress itself was adding the noindex tag. It took me a while to find the right place to change that setting, but I finally found it on my wp-admin page. Under Settings -> Reading there was a line that said:

Search Engine Visibility    [checkbox]  Discourage search engines from indexing this site

Apparently the checkbox had been checked by default. So I unchecked it, saved, and tried ‘Fetch as Bingbot’ again. Voilà: the meta tag with the noindex directive was no longer being sent by my webserver.
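One last note: that checkbox corresponds to WordPress’s blog_public option, so if you have WP-CLI installed on your server you can check and change it from the command line without touching wp-admin:

wp option get blog_public        # prints 0 if search engines are discouraged, 1 if allowed
wp option update blog_public 1   # allow search engines to index the site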