Tempo

Members
  • Posts: 32
  • Joined
  • Last visited: Never
  • Days Won: 4

Tempo last won the day on March 3, 2020

Tempo had the most liked content!

Tempo's Achievements

Newbie (1/14)

Recent Badges

  • First Post
  • Collaborator (Rare)
  • Conversation Starter (Rare)

4 Reputation

  1. Thank you for your explanation, it's clearer to me now and I will try it.
  2. Hello, I installed a payment app and I get a Composer warning. What is the problem?
  3. Is it difficult to install PHP 8?
  4. Hello, PHP 8 arrives soon. Have you tried running ClicShopping under PHP 8.0? Does it work?
    Easy to use once you know ClicShopping a little. You can insert text and images. If you want to display some information easily, this module can be useful.
  5. Hello @JKaz, have a look at this, I think it could work for you. This app lets you set different prices depending on the delivery zone and the weight (a rough sketch of that kind of lookup follows after this list). https://www.clicshopping.org/forum/files/file/109-apps-shipping-table/
  6. Ah OK, I did not see it. Thank you.
  7. Hello, I read somewhere (I do not remember where) that meta data should be included inside the product descriptions. I do not see anything like that in the marketplace.
  8. Hi @Engy, ClicShopping handles WebP: when you upload an image, compression is applied (see the sketch after this list). Look at the size of your image; it will be different from the original.
  9. There are some tools on the Internet that can optimize the image for you. You can use them before creating your product.
  10. If I understand well, the module is not activated by default and it performs a silent analysis?
  11. Agreed, Square or Stripe could be a very good alternative. But Square is only CAD and US, I think. Stripe has a nice admin tool, easy to understand and configure. I use it and I recommend it.
  12. I am looking at tarteauxcitrons but I cannot find it on the marketplace. Could you include the link, please?
  13. I just updated my store to have "persistent cart" and was then told by a colleague that I need to make sure our store is GDPR Compliant.
  14. I found this content in my language and it seems relevant to me:
    - Duplicate content. Risk: search engines, by definition, hate duplicate content because it prevents them from understanding which page to index accurately. In terms of linking, Aurélien Bardon, founder of the Aseox agency, adds that "there is a loss: if ten links point to two identical pages, it is less rewarding than if they all point to a single page with unique content." Detection and tools: to spot duplicate content manually, take a short excerpt from the page and search for it in quotation marks on Google ("your search") to see whether it appears several times. "Software like Killer Duplicate pushes the tests further by automating the verification," he says. Solution: either rewrite or delete the content of the duplicate page, or deindex the less important page with a canonical tag.
    - Slow page loading. Risk: loading time is a ranking criterion for Google, especially since the Speed Update of July 2018, which penalizes sites that are too slow on mobile.
    - The bot could not crawl the page. Risk: if Googlebot cannot crawl a URL, it can never be indexed. "If the URL generated significant traffic, then after failing on several crawls, Google will degrade its positioning, often very strongly," says the founder of Aseox. Detection: the site:exampledomain command gives a sample of indexed URLs, and searching the URL on Google also lets you test its indexing. If a URL is indexed, it has been crawled.
    - Title tag / description empty, missing or duplicate. Risk: the HTML title tag sits in the head of the page, is shown in the browser tab and as the clickable title on the search engine's results page (SERP), and is also a source of information about the page's content for the crawlers. Detection and tools: since Google's new Search Console no longer offers this information, you have to crawl the site yourself with tools like Screaming Frog, MyRankingMetrics, OnCrawl or Botify (a rough title check is sketched after this list).
    - Bad pages in the XML sitemap. Risk: errors in XML sitemaps have little impact, since their goal is simply to help indexing robots identify all the important pages of a website, including recent ones. Detection: in the Search Console, the "Sitemaps" tab helps you understand the errors Google encountered.
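Post 5 above points at a shipping app that sets the price from the delivery zone and the parcel weight. As a minimal sketch of that kind of zone/weight lookup, assuming a hypothetical rate table and function name (this is not the actual schema or API of the Apps Shipping Table module), it could look like this in plain PHP:

```php
<?php
// Hypothetical zone/weight rate table: for each zone, rates are keyed by the
// maximum weight (kg) they cover, e.g. anything up to 1 kg costs 4.90 in zone FR.
$rates = [
    'FR' => [1 => 4.90, 5 => 7.50, 30 => 12.00],
    'EU' => [1 => 8.90, 5 => 14.00, 30 => 25.00],
];

// Return the shipping cost for a zone and weight, or null when nothing matches.
function shippingCost(array $rates, string $zone, float $weightKg): ?float
{
    if (!isset($rates[$zone])) {
        return null;
    }
    ksort($rates[$zone]); // scan the brackets from lightest to heaviest
    foreach ($rates[$zone] as $maxWeight => $cost) {
        if ($weightKg <= $maxWeight) {
            return $cost;
        }
    }
    return null; // heavier than the largest bracket
}

var_dump(shippingCost($rates, 'EU', 3.2)); // float(14)
```

Keying each zone's brackets by their maximum weight keeps the lookup to a single ordered scan.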
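Post 8 above says ClicShopping applies compression when an image comes in and can produce WebP. The snippet below is only a rough illustration of that idea using PHP's standard GD extension (imagecreatefromjpeg / imagewebp); it is not ClicShopping's internal image code, and the file paths and quality value are assumptions:

```php
<?php
// Rough illustration: convert an uploaded JPEG into a compressed WebP copy with GD.
// Requires PHP built with GD and WebP support.
$source = 'uploads/product.jpg';   // assumed upload path
$target = 'uploads/product.webp';

$image = imagecreatefromjpeg($source);
if ($image === false) {
    exit('Could not read the source image.');
}

// Quality 80 is an arbitrary example; lower values compress more aggressively.
imagewebp($image, $target, 80);
imagedestroy($image);

printf("Original: %d bytes, WebP: %d bytes\n", filesize($source), filesize($target));
```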
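Post 14 above mentions detecting empty, missing or duplicate title tags, which crawlers such as Screaming Frog do at scale. As a small self-contained sketch of the same idea (the URLs are placeholders and this is no substitute for a real crawler), a PHP script can fetch a few pages and compare their title values:

```php
<?php
// Quick check for missing, empty, and duplicate <title> tags over a handful of URLs.
// Needs allow_url_fopen; the URLs below are placeholders.
$urls = [
    'https://example.com/',
    'https://example.com/products',
    'https://example.com/contact',
];

$titles = [];
foreach ($urls as $url) {
    $html = @file_get_contents($url);
    if ($html === false) {
        echo "CRAWL FAILED: $url\n";
        continue;
    }
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from imperfect real-world HTML
    $nodes = $doc->getElementsByTagName('title');
    $title = $nodes->length ? trim($nodes->item(0)->textContent) : '';
    if ($title === '') {
        echo "MISSING OR EMPTY TITLE: $url\n";
    }
    $titles[$title][] = $url;
}

// A title shared by several URLs is a duplicate-content warning sign.
foreach ($titles as $title => $pages) {
    if ($title !== '' && count($pages) > 1) {
        echo "DUPLICATE TITLE \"$title\" on: " . implode(', ', $pages) . "\n";
    }
}
```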