Grantz

SEO optimization


Hi guys,

What is the best method for doing SEO?

I would like plenty of details so I can try to build something good,

if you can help me.

Thank you.


It's not simple, and I am not a specialist in this area.

 

To start, you must fill in the Communication / SEO tab and the Product / SEO tab, for example for this element.

You must also check that the title and description tags are activated under Configuration / Module / Network.

You can also submit all your sitemaps in Google Search Console.
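If your shop does not already generate sitemaps for you, a sitemap is just an XML file listing your URLs. A minimal sketch of building one with Python's standard library (the URLs below are placeholders, not your real pages):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs -- replace with your real product/category pages.
print(build_sitemap(["https://www.example.com/",
                     "https://www.example.com/product-1"]))
```

The resulting file is what you would upload and then declare in Search Console.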

 

This is the first thing to do.

Then, in your product descriptions, you should have, I think, more than 300 words. I am not sure of the exact figure, but Google likes substantial content.
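To follow that advice, a quick way to flag thin product descriptions is to strip the HTML and count words. A sketch (the 300-word threshold is the rough figure mentioned above, not an official number):

```python
import re

def is_thin(description_html, min_words=300):
    """Return True if the description has fewer than min_words words."""
    text = re.sub(r"<[^>]+>", " ", description_html)  # strip HTML tags
    words = re.findall(r"\w+", text)
    return len(words) < min_words

print(is_thin("<p>Short description.</p>"))  # True: only two words
```

Running this over your catalog export tells you which products to expand first.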

 

Finally, you must build some links pointing to your website. It's a lot of work.

 

Ah, I forgot: in Configuration, you must also activate SEO rewriting.

 

You can start this way; afterwards, adjust your descriptions and title tags accordingly.

 

 


Look at the forum; I think there is a post listing some tools.

For SEO, you must update your content regularly and publish high-quality material.

Also optimize your images and build a link-worthy site.

Edited by Julie


There is a new hook that allows you to generate a summary on your product page. Take a look; it can save you some time.

 

https://www.clicshopping.org/forum/files/file/164-hooks-marketing-seo-analyse/

 


Ok,

Thank you for all your recommendations.

I will proceed step by step, as described above.


You can install SEO modules via the dashboard; they can help you with this process.


I found these points to check. What do you think about them for SEO?

1. Optimize for page speed
2. Secure your website with SSL
3. Create an FAQ page for your website
4. Create short-form content
5. Create long-form content
6. Use simple words and short sentences
7. Be active on social media
8. Use schema metadata
9. Update your Google My Business listing
10. Optimize your website's structured data markup
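For points 8 and 10, schema/structured-data markup is usually added as a JSON-LD block in the page head. A sketch of generating a schema.org Product snippet (the product name and price are made-up examples):

```python
import json

def product_jsonld(name, price, currency="EUR"):
    """Build a schema.org Product JSON-LD snippet wrapped in a <script> tag."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "offers": {
            "@type": "Offer",
            "price": str(price),
            "priceCurrency": currency,
        },
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

print(product_jsonld("Example product", 19.99))
```

You can paste the output into Google's Rich Results Test to check it validates.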


I found this content in my language, and it seems relevant to me:

- Duplicate content
Risk: Search engines, by definition, hate duplicate content, which prevents them from understanding how to accurately index the relevant pages. In terms of linking, there is also a loss: "if ten links point to two identical pages, it is less rewarding than if they all point to a single page with unique content."

Detection and tools: To find duplicate content manually, take a short excerpt from the page and search for it in quotation marks on Google ("your search") to see if it appears multiple times. "Software like Killer Duplicate pushes the tests further by automating the verification," says Aurélien Bardon, founder of the Aseox agency.

Solution: You must either rewrite or delete the content of the duplicate page, or deindex, via a canonical tag, the page that matters least for ranking.
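The manual quote-search test above can be approximated locally by comparing pages' word shingles, which is roughly how duplicate-detection tools work. A rough sketch (the 5-word shingle size is an arbitrary choice, and the texts are toy examples):

```python
def shingles(text, n=5):
    """Set of n-word shingles from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity of two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Pages scoring close to 1.0 are near-duplicates worth rewriting or canonicalizing.
print(similarity("the quick brown fox jumps over the lazy dog",
                 "the quick brown fox jumps over the lazy cat"))
```

In practice you would run this over the stripped text of every page pair and review the highest-scoring pairs.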

- Slow page loading
Risk: Loading time is a ranking criterion for Google, especially since the Speed Update of July 2018, which penalizes sites that are too slow on mobile.

 

- The bot could not crawl the page
Risk: If Googlebot cannot crawl a URL, it can never be indexed. "If the URL generated significant traffic, then after several failed crawls Google will degrade its positioning... often very strongly," says the founder of Aseox.

Detection: The site:exampledomain command provides a sample of indexed URLs. Searching for the URL on Google also lets you test whether it is indexed. If a URL is indexed, it has been crawled.

 

- Title / description tag empty, missing or duplicate
Risk: The title tag is placed in the header of the page; it is displayed in the browser tab and as the clickable title on the search engine's SERP. It is also a source of information about the page's content for crawlers.

Detection and tools: Since Google's new Search Console no longer offers this valuable information, you have to run a crawl of your site with tools like Screaming Frog, MyRankingMetrics, OnCrawl or Botify.
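The tools mentioned above essentially crawl each page and extract the title and meta description. The core of that check can be sketched with Python's standard HTML parser (fetching pages is omitted; the HTML here is an inline example):

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Extract the <title> text and the meta description from an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name") == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return (title, description); empty strings flag missing/empty tags."""
    p = TitleMetaParser()
    p.feed(html)
    return p.title.strip(), p.description.strip()

print(audit('<html><head><title>My shop</title>'
            '<meta name="description" content=""></head></html>'))
```

Collecting these pairs across all pages also lets you spot duplicates: just group pages by identical title or description.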

 

- Bad pages in the XML sitemap
Risk: Errors in XML sitemaps have relatively little impact; their goal is simply to help indexing robots identify all the important pages of a website, including recent ones.

Detection: In the Search Console, the "Sitemaps" tab helps you understand the errors Google encountered.
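Before looking at the Sitemaps tab, you can sanity-check a sitemap yourself by parsing out its URLs; checking that each one returns HTTP 200 is left out here to keep the sketch offline, and the sample sitemap is a made-up example:

```python
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text):
    """Extract all <loc> URLs from a sitemap XML string."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/old-page</loc></url>
</urlset>"""

print(sitemap_urls(sample))  # the list of URLs to verify with an HTTP client
```

Any URL in this list that redirects, 404s, or carries a noindex tag is a "bad page" worth removing from the sitemap.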

