Saturday, December 09, 2006

Google Pages' killer robots

I am still quite ardent about Google and its services, yet I found it quite absurd that Google Pages has started this annoying practice of putting up an auto-generated robots.txt. Of course the problem does not end there: this auto-generated robots.txt file blocks the root directory of every hosted account. Google might claim it's a bug, but I am not convinced.
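For reference, a robots.txt that blocks the entire site from crawlers (presumably what the auto-generated file amounted to) is just two lines:

```
User-agent: *
Disallow: /
```

A `Disallow: /` under `User-agent: *` tells every well-behaved crawler to stay out of everything under the root.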

Please note that the steps given here are still experimental and I have not completed the workaround. I have finally hosted my web page at . I had to reset the CNAME entries and @ entries on my name server (hosted by ). The workaround was a bit hectic (I had to search for another free hosting provider), and I am certainly going to ditch Google Pages. The workaround I am going to describe is a bit complicated for people who are not that savvy about web development.

My domain is registered with , and my domain pages are hosted on Google Pages (I take the liberty of assuming people understand what I mean). So the first thing I did was register for a free domain account. After registering, I had to update my GoDaddy account to point to the new name server. Though I preferred not to change the existing name server, I added the new name server to the list (which fortunately got reflected within 5 minutes). I left my MX record as it is at .
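For readers unfamiliar with the record types mentioned above, the changes amount to zone entries of roughly this shape. Every host name and the IP address below is an illustrative placeholder, not my actual setup:

```
; Illustrative zone entries only; real names and addresses will differ.
@      IN  A      203.0.113.10          ; apex (@) record pointed at the new web host
www    IN  CNAME  new-host.example.     ; CNAME repointed from Google Pages to the new host
@      IN  MX 10  aspmx.l.google.com.   ; MX left untouched so mail stays with Google
```

The point is that the web records (A/CNAME) move to the new host while the MX record is left alone, so mail delivery is unaffected by the migration.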
Next, log on to your fateback account and add the new domain to the list of domains that 50webs itself offers. You will need to play around with the control panel to learn exactly how fateback behaves. Then transfer the content to the fateback account through FTP or the web interface.
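The FTP transfer step can be scripted instead of done by hand. The sketch below uses Python's standard ftplib to mirror a local site folder to the server root; the host name and credentials are placeholders, and this is only one way to do it, not how the host requires it to be done:

```python
import os
from ftplib import FTP, error_perm

def collect_files(local_root):
    """Walk local_root and return (local_path, remote_path) pairs for upload."""
    pairs = []
    for dirpath, _dirnames, filenames in os.walk(local_root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            # Remote paths are relative to the site root, with forward slashes.
            remote_path = os.path.relpath(local_path, local_root).replace(os.sep, "/")
            pairs.append((local_path, remote_path))
    return pairs

def upload_site(host, user, password, local_root):
    """Upload every file under local_root to the FTP server's root directory."""
    ftp = FTP(host)           # placeholder host, e.g. "ftp.example.com"
    ftp.login(user, password)
    for local_path, remote_path in collect_files(local_root):
        # Create intermediate directories; ignore "already exists" errors.
        path = ""
        for part in remote_path.split("/")[:-1]:
            path = f"{path}/{part}" if path else part
            try:
                ftp.mkd(path)
            except error_perm:
                pass
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_path}", f)
    ftp.quit()
```

Calling `upload_site("ftp.example.com", "user", "secret", "site/")` would push the whole local folder up in one go, which beats re-uploading files one at a time through a web interface.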

There are quite a few drawbacks to this workaround. 50webs does not offer an editor, unlike Google Pages. If you have a full-blown site created entirely or partly with the Google Pages editor, it becomes quite cumbersome to transfer the content to the 50webs server. There are a number of advantages as well: you get FTP access, you get folder options, and of course you have full control over robots.txt.
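With that control you can, for example, serve a robots.txt that allows everything; an empty Disallow line means no restriction at all:

```
User-agent: *
Disallow:
```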
I hope to update this blog in the future as I learn of anything interesting. I also request people to post their opinions and suggestions on how to improve this workaround.


  1. After making this post I realized that the forced ad unit from fateback is quite annoying. I have dropped the idea of a complete migration of my domain space. Cloaking could be a solution, but it is very unsatisfactory, as I would still have indexing issues with search engines.

  2. I chose to host my domain. This host is ad-free and provides FTP access as well.
    So my domain settings are as follows.
    1) Domain name registered at
    2) Website hosted at instead of Google Apps (Google Page Creator)
    3) Name server
    4) Email server: Google Apps

  3. It seems to be a bug, and they're working on it, see

  4. Thanks for the update, Gobán Saor. It does seem like a bug. Maybe it is a bug, but I have moved away from GPC anyway (maybe for good); other free hosts give me better options to experiment. Still, I wonder what is taking Google so long to fix it if they claim it is a bug.