Hi,
Can you please suggest a solution for this? Google is not crawling my website, www.ilovemyphone.com.au. Everything there looks correct: robots.txt, sitemap, etc. Google only indexes the page URLs I submit manually through the Google webmaster tool.
Looks ok to me:
Thank you, Erik, for your support.
But we have 4,000+ pages to index in Google. Only the 550 pages I submitted manually via the Google webmaster tool are showing in Google. I need Google to index all the remaining pages automatically; I don't want to do it manually. Please share a solution for that. Can you check the robots.txt and sitemap on my website, www.ilovemyphone.com.au?
Your robots.txt file is very confusing. Have you messed with it? I'd suggest going back to Neto's default.
The final line is invalid, it should be:
Sitemap: https://www.ilovemyphone.com.au/sitemap.xml
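In a complete robots.txt, that directive sits on its own line, typically after the user-agent rules. For reference, a minimal, generic file of the shape Google expects looks like this (the rules here are placeholders, not Neto's actual defaults):

```
User-agent: *
Disallow:

Sitemap: https://www.ilovemyphone.com.au/sitemap.xml
```

An empty `Disallow:` means nothing is blocked; the `Sitemap:` line must use the full absolute URL.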
How long ago did you submit the site to Google? If you have valid settings, you just need to wait. No one can make Google crawl your site any faster.
Have you submitted the sitemap to Google? You do that in Search Console (https://search.google.com/), under Indexing > Sitemaps. Does that have a successful status? How many pages are indexed?
How did you create this sitemap? It doesn't look like Neto's default, but they may have changed how that works for new accounts. In Neto, go to Settings > All Settings and search for "Google XML Sitemap". What do you see in there?
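If you want a quick local sanity check that your rules actually let Googlebot through, Python's standard-library robots.txt parser can do it. This is just a sketch against sample rules (the `Disallow` path below is an invented example, not your site's real file):

```python
# Sketch: verify robots.txt rules locally with the Python standard library.
# The sample rules are illustrative; paste in your real file's contents to test it.
from urllib.robotparser import RobotFileParser

sample = """\
User-agent: *
Disallow: /checkout/
Sitemap: https://www.ilovemyphone.com.au/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(sample.splitlines())

# Googlebot should be allowed to fetch ordinary pages...
print(rp.can_fetch("Googlebot", "https://www.ilovemyphone.com.au/some-product"))  # True
# ...but blocked from any path the file disallows.
print(rp.can_fetch("Googlebot", "https://www.ilovemyphone.com.au/checkout/cart"))  # False
```

If `can_fetch` returns False for pages you want indexed, the robots.txt is the problem; if it returns True everywhere, the issue is elsewhere (sitemap or crawl scheduling).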
Hi Erik, thank you for the solution.
I have made the above changes. Hopefully they will work. Can you please cross-check them?
https://www.ilovemyphone.com.au/robots.txt
It looks like Google can see those pages now. You'll just have to wait for them to be indexed.