Indexing of Shops - Google Search Console

Yeah, that's really strange. When we use the plugin to remove the #! from the URLs, the browser reports a 404 error but still shows the page. When the 404 is triggered, a noindex tag is then added to the page, preventing Google from indexing it.

If we remove that and use the #! in the URL and test in Google Search Console, it shows this error.

So I'm not sure how those other shops have gotten crawled if Google says it can't crawl the URLs we are testing.

I will need to speak to our dev team to ensure that this is not a bug coming from recent changes on Google's side.
I hope to get back to you with more info soon.

Great, thanks, Thomas.

For reference, here is someone from a year ago reporting the same issue I'm having when removing the #! from URLs.

Is there an older version of the plugin I can use?

I'm also seeing this meta tag added on the category pages:
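Based on the later discussion in this thread about category pages being set to noindex, the tag in question is presumably the robots noindex directive, which in its usual form looks like this (a generic example, not a copy of the actual Spreadshop output):

```html
<meta name="robots" content="noindex">
```

A search engine that sees this tag will keep the page out of its index even if it is able to crawl it.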

Hey there!
Your case is still in the queue. As soon as I have some details to share, I will get back to you.

And yes, it is intentional that not all shop pages are set to be indexed. This is correct.


Thanks. I also noticed in the Rich Results Test that the carousel items are missing the url property.
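For comparison, a carousel marked up as a schema.org ItemList in which each item does carry its url would look roughly like this (the URLs and positions are placeholders, not values taken from the actual shop):

```json
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "url": "https://example.com/shop/product-1" },
    { "@type": "ListItem", "position": 2, "url": "https://example.com/shop/product-2" }
  ]
}
```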

Also of note is this post from the Google Webmaster blog - Official Google Webmaster Central Blog: Frequently asked questions about JavaScript and links

Specifically about the URL fragments and their proper use.

Does Googlebot understand fragment URLs?

Fragment URLs, also known as “hash URLs”, are technically fine, but might not work the way you expect with Googlebot.

Fragments are supposed to be used to address a piece of content within the page and when used for this purpose, fragments are absolutely fine.

Sometimes developers decide to use fragments with JavaScript to load different content than what is on the page without the fragment. That is not what fragments are meant for and won’t work with Googlebot. See the JavaScript SEO guide on how the History API can be used instead.
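To make the guide's suggestion concrete, here is a small illustrative sketch of what moving from hashbang fragments to History API paths involves. The function name and URLs are made up for this example; this is not part of any Spreadshop plugin.

```typescript
// Sketch: rewriting a hashbang URL into the clean path form that a
// History API based router would use. Purely illustrative.
function hashbangToPath(url: string): string {
  const marker = "#!";
  const idx = url.indexOf(marker);
  if (idx === -1) return url; // no hashbang, nothing to rewrite
  const base = url.slice(0, idx).replace(/\/+$/, ""); // drop trailing slashes
  const fragmentPath = url.slice(idx + marker.length); // e.g. "/category/123"
  return base + fragmentPath;
}

// In the browser, a router would then apply the clean path with the
// History API instead of changing the fragment, e.g.:
//   history.pushState(null, "", hashbangToPath(location.href));

console.log(hashbangToPath("https://example.com/shop/#!/about-us"));
// → "https://example.com/shop/about-us"
```

Because the resulting path is a real URL rather than a fragment, Googlebot can treat each page as a distinct crawlable document, which is the point the blog post is making.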

If we could get the #! removed without it triggering a 404 error, that would be great, as the pages would most likely crawl better.

Hello Thomas!
Thank you for your help.
Could you bring a supervisor from your company into the chat to resolve this issue?

Sincerely,
Denis

Sorry for my late response on that topic.

First, to the question of crawling.

Our dev team is supervising the indexing progress of shops in general. We see clear patterns showing that the Google bots are able to crawl your pages, even if we are not 100% sure why they cannot access the information in some cases.
So rest assured, they'll find their way through.

Removing the hashbang manually is, and will always be, a use case for advanced users. We do not give advice regarding that solution, as supporting it mostly turns into an endless back and forth.

Indexing:
We are deliberately deciding against indexing each and every page and list page of all shops.
Here's why: to Google, the set of all active shops shares an identical content structure.
That means, regardless of each shop's topic and the designs offered inside, Google's bots recognize the deep-link structure of every shop as identical. Therefore, they will most likely only crawl and rank content that is really well maintained by the shop owner.

How to overcome this?
My advice here is clearly to use the blessing of your own domain by building user-relevant content around your shop. That's what Google is looking for.

That means you should start building a content home. Create landing pages for your best niche designs. Do storytelling about your shop, your brand, etc.
This is where the SEO effect kicks in and where a sitemap really makes sense.
Not a sitemap for your shop only, but for your whole website! That is the answer I can give here.

My tip:

Any time you upload a new design, you can release a new landing page for it. If you have a blog, feel free to write posts and link to your landing pages where possible.
That way your website can gain much more link juice than your stand-alone shop ever could.

So to confirm: you actually have no idea why Google is crawling some Spreadshop sites but not others?

We know about the SEO weakness of standalone shops, and it can't simply be solved by setting all subpages to index. Basically, all shops look the same to Google in terms of their page structure.

And yes, we know why some shops have a higher chance of being ranked.
Embedded shops have a much higher chance because the surrounding site provides context for the shop.
If a website feels like a content desert, Google literally won't spend much time trying to understand the things that are hidden deeper inside the page or shop.
But if backlinks point to your page, the link structure is built logically, and there are differentiated landing pages for important designs/products, this will be highly beneficial.

If the shop is used standalone, Google will only crawl the surface and check whether there are any backlinks or other indicators of a minimum of relevance, since all standalone shops are nearly identical to Google from a structural point of view.

This is a non-issue. The majority of sites use WooCommerce, Shopify, or other platforms on which all stores share the same code base, and they easily get crawled, indexed, and ranked.

If you provided a sitemap for all the products, this issue would be resolved immediately, as Google will index the shop pages even with the #! in the URL. Any chance that's going to be a feature?
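For illustration, a minimal sitemap of the kind being requested might look like this (the URLs are placeholders; the sitemap protocol expects one loc entry per page):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/shop/product/123</loc>
  </url>
  <url>
    <loc>https://example.com/shop/product/124</loc>
  </url>
</urlset>
```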

Thanks,

There have been tests with sitemaps for shops that did not turn out to be as successful as promised, so we took the feature off the roll-out list.
I believe the chances of a comeback are low.

How can we remove the noindex tag from our various category pages?
If we can't, then how can Google crawl the shop category and product pages?
We don't want to have to build hundreds of pages in WordPress, one for each landing page, when this is already done by the Spreadshop store using categories.

We definitely need to reconsider our options there, as there are use cases that require the categories to be indexed, and some that don't.
But this will not be something that I can promise to be solved within the next two weeks.

2 posts were split to a new topic: Merchant Center - Microdata missing?

Aha, so you recommend not using sitemaps, for the same reason, even for our shops that are without our own domain and embedding? That is, https://shop.spreadshirt.se//myshophere @Thomas_Spreadshop

This topic was automatically closed 365 days after the last reply. New replies are no longer allowed.