I have set up a site and have been trying to submit it to Google over the last few weeks.
I put the URL in (
) and then request that it crawl/index the site, which takes a few days, but all I get is a load of crawl errors and no indexed pages.
I have read several things about making changes to the robots.txt or .htaccess files, but I can't find them anywhere in the interface. Would it be a better idea to submit my own XML sitemap? If so, are there any good, free sites I can use?
Sorry if these are newbie questions; I am more of a designer, so I get a little lost with it sometimes!
The system auto-generates the robots.txt file for you, so you don't have to worry about it. We determine which pages are best to index, which also prevents duplicate content on your site; for example, we do not index the designer page but do index the individual product pages.
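For reference, an auto-generated robots.txt along those lines might look something like the sketch below. The paths and domain here are hypothetical, purely to illustrate excluding a designer page while leaving product pages crawlable:

```txt
# Hypothetical auto-generated robots.txt (illustrative paths only)
User-agent: *
Disallow: /designer/      # keep the designer tool out of the index
Allow: /products/         # individual product pages stay crawlable
Sitemap: https://www.example.com/sitemap.xml
```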
There are some URLs in the sitemap that shouldn't be there, which I'm sure are the warnings you are seeing. Our next major update, 8.5, removes these unused URLs from the sitemap, which should stop the warnings from showing. In the meantime you can ignore the warnings, as it's fine for those non-existent pages not to be indexed.
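If you're curious what a valid sitemap entry looks like, a minimal example following the sitemaps.org protocol is shown below; the URL is a placeholder, not one from your site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- placeholder product page URL -->
    <loc>https://www.example.com/products/widget</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```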