April 12, 2019
Hi guys, what is the best method to do SEO? I would like as many details as possible to try to make something good, if you can help me. Thank you.
April 12, 2019
It's not simple and I am not a specialist in that. First, you must fill in the communication / SEO tab and the product / SEO tab, for example, this element. You must also check that the title and description tags are activated in Configuration module / network. You should also submit all your sitemaps in Google Search Console; this is the first thing to do. Then, in your product descriptions, you should have, I think, more than 300 words (not sure of the exact number, but Google likes content). Finally, you must build some links pointing to your website. It's a big job. Ah, I forgot: in Configuration, you must activate the SEO rewriting. You can start this way, then adjust your descriptions and title tags accordingly.
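As a side note, before submitting a sitemap to Google Search Console you can sanity-check it yourself. A minimal sketch in Python using only the standard library (the sitemap content and example.com URLs below are hypothetical; in practice you would load your shop's real sitemap.xml):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; in practice, read your shop's sitemap.xml.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/product-1</loc></url>
  <url><loc>http://example.com/insecure-page</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def list_urls(xml_text):
    """Return every <loc> entry found in the sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

urls = list_urls(SITEMAP)
print(len(urls), "URLs in sitemap")

# Flag entries that are not served over HTTPS (mixed schemes can hurt indexing).
insecure = [u for u in urls if not u.startswith("https://")]
print("non-HTTPS entries:", insecure)
```

This only checks that the XML parses and that every URL uses HTTPS; Search Console will report deeper issues after submission.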
April 12, 2019
Look at the forum, I think there is a post with some tools. For SEO, you must adapt your content regularly and publish high quality. Also optimize your images and have a link-worthy site. Edited April 12, 2019 by Julie
April 12, 2019
There is a new hook that allows you to make a summary on your product page. Look at that, it can save you some time: https://www.clicshopping.org/forum/files/file/164-hooks-marketing-seo-analyse/
April 12, 2019 (Author)
Ok, thank you for all your recommendations. I will start step by step as described above.
April 13, 2019
You can install SEO modules via the dashboard modules section. They can help you with this process.
April 20, 2019
GTmetrix is one tool that can help you, but there are also some others; just do a search on the internet.
April 20, 2019
A thread has some interesting tools about SEO. I invite you to read it: https://www.clicshopping.org/forum/topic/2-interesting-tool-to-analyse-your-website/
August 2, 2019
I found these points to check. What do you think of this SEO checklist:
1. Optimize for page speed
2. Secure your website with SSL
3. Create an FAQ page for your website
4. Create short-form content
5. Create long-form content
6. Use simple words and short sentences
7. Be active on social media
8. Use schema metadata
9. Update your Google My Business listing
10. Optimize your website's structured data markup
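For points 8 and 10, structured data is usually emitted as a JSON-LD block in the page's `<head>`. A minimal sketch in Python; the field names follow schema.org's Product vocabulary, but the product values here are purely hypothetical examples:

```python
import json

# Hypothetical product data; in a real shop this would come from the catalog.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Blue Widget",
    "description": "A sample product used to illustrate JSON-LD markup.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "EUR",
        "price": "19.90",
        "availability": "https://schema.org/InStock",
    },
}

# The resulting snippet goes inside a
# <script type="application/ld+json"> tag in the product page's HTML head.
snippet = json.dumps(product, indent=2)
print(snippet)
```

You can paste the generated snippet into Google's Rich Results Test to verify it before deploying.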
August 6, 2019
I found this content in my language and it seems relevant to me:

- Duplicate content
Risk: Search engines, by definition, hate duplicate content, which prevents them from understanding how to accurately index the relevant pages. In terms of linking there is also a loss: "if ten links point to two identical pages, it is less rewarding than if they all point to a single page with unique content."
Detection and tools: To find duplicate content manually, simply take a short excerpt from the page and search for it in quotation marks on Google ("your search") to see if it appears multiple times. "Software like Killer Duplicate pushes the tests further by automating the verification," says Aurélien Bardon, founder of the Aseox agency.
Solution: You must either rewrite or delete the content of the duplicate page, or deindex the less important page through a canonical tag.

- Slow page loading
Risk: Loading time is a ranking criterion for Google, especially since the Speed Update in July 2018, which penalizes sites that are too slow on mobile.

- The bot could not crawl the page
Risk: If Googlebot cannot crawl a URL, then it can never be indexed. "If the URL generated significant traffic, then after failing on several crawls, Google will degrade its positioning... often very strongly," says the founder of Aseox.
Detection: The site:yourdomain.com command provides a sample of indexed URLs. Searching for the URL on Google also lets you test its indexing. If a URL is indexed, it has been crawled.

- Title tag / description empty, missing or duplicate
Risk: The title tag is placed in the header of the page, displayed in the browser tab and in the clickable title on the search engine's SERP. It is also a source of information about the content of the page for the crawlers.
Detection and tools: Since the new Google Search Console no longer offers this valuable information, you need to run a crawl of your site with tools like Screaming Frog, MyRankingMetrics, OnCrawl or Botify.

- Bad pages in the XML sitemap
Risk: Errors in XML sitemaps have little impact; their goal is simply to help indexing robots identify all the important pages of a website, including recent ones.
Detection: In the Search Console, the "Sitemaps" tab helps you understand the errors encountered by Google.
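The empty/duplicate title check mentioned above can also be automated without a commercial crawler. A minimal sketch in Python using only the standard library; the page URLs and HTML below are hypothetical stand-ins for pages you would fetch from your own shop:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the <title> text of one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(pages):
    """pages: {url: html}. Return (missing-title URLs, duplicate titles)."""
    titles = {}
    missing = []
    for url, html in pages.items():
        parser = HeadAudit()
        parser.feed(html)
        title = parser.title.strip()
        if not title:
            missing.append(url)
        else:
            titles.setdefault(title, []).append(url)
    duplicates = {t: urls for t, urls in titles.items() if len(urls) > 1}
    return missing, duplicates

# Hypothetical pages; in practice you would crawl your own shop's URLs.
pages = {
    "/a": "<html><head><title>Red shoes</title></head></html>",
    "/b": "<html><head><title>Red shoes</title></head></html>",
    "/c": "<html><head><title></title></head></html>",
}
missing, duplicates = audit(pages)
print("missing titles:", missing)
print("duplicate titles:", duplicates)
```

The same pattern extends to meta descriptions by also inspecting `<meta name="description">` attributes in `handle_starttag`.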
August 14, 2019
Look at this website. It's a complete analysis of SEO tools with short reviews: https://backlinko.com/seo-tools
October 16, 2019
To complete the topic, I found an interesting document about SEO and robots.txt: https://ahrefs.com/blog/robots-txt/#example-robots-txt
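If you want to verify how crawlers will interpret your robots.txt rules, Python's standard library can parse them directly. A minimal sketch; the rules below are a hypothetical example in the spirit of the article linked above, not anyone's real robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; replace with your own file's contents.
RULES = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(RULES)

# Check which URLs a generic crawler may fetch under these rules.
print(rp.can_fetch("*", "https://example.com/products"))
print(rp.can_fetch("*", "https://example.com/admin/login"))
```

This is handy for catching an overly broad `Disallow` rule before it silently blocks Googlebot from your product pages.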
January 20, 2020
To contribute to the topic, I'd like to introduce an ideal choice of SEO tools, as well as provide a great coupon for it.
May 7, 2020
When I want to find new and relevant information about SEO, I open Moz; they have a great blog, and if you are a beginner you will surely find answers to your questions there. Recently the SEOquake blog opened ( https://www.seoquake.com/blog/ ), and I read the latest news there.
August 21, 2020
+Choose keywords based on competition.
+Build quality content from your chosen keywords.
+Optimize on-page elements for the main website.
+Build quality backlinks.
+Continue creating content and wait.
*** link removed ***
August 23, 2020
Hello @ecityworkshalad, thank you for the summary, but doing that correctly is not simple and takes a lot of time.
March 23, 2023
Hello, now that ChatGPT is integrated inside ClicShopping, it will be easier to create a good approach for that.