
Creating an XML Shopping Feed for an Ecommerce Product Site

Here are simple steps to get your eCommerce site's product pages indexed by Google and other major search engines.

Please note: Make your URLs search engine friendly by using product names in them, and avoid stop characters (for example, /products/blue-cotton-shirt rather than /product.php?id=123&cat=45).

(1) Create an RSS feed of your product pages using http://www.toucanmultimedia.com/rssmaker.php

(2) Save the file as ecommerce_feed.xml (this tells search engines that the feed is all about your ecommerce product pages). A minimal sample feed is shown after these steps.

(3) Submit ecommerce_feed.xml as a sitemap in Google Webmaster Tools (https://www.google.com/webmasters/sitemap) by logging in with your Google account, and do the same in Yahoo Site Explorer.

(4) Burn the XML shopping feed through FeedBurner (http://www.feedburner.com); the resulting feed URL will look like http://feeds.feedburner.com/blogspot/SEOPodCast

(5) Integrate the XML shopping feed into your site with an image or a plain text link so that visitors can subscribe and come back whenever they need to. This brings you more returning visitors, and it also helps your entire set of product pages get indexed by major and local search engines, as well as by feed readers and feed finders. A sample subscribe snippet follows the sample feed below.
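To make step 2 concrete, here is a minimal sketch of what ecommerce_feed.xml could look like as an RSS 2.0 feed. The store name, product titles, and URLs are placeholders for illustration; a generator like the one linked in step 1 will produce something along these lines for your own products.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Store - Product Feed</title>
    <link>http://www.example.com/</link>
    <description>Latest products from Example Store</description>
    <!-- One <item> per product page; keep each <link> search engine friendly -->
    <item>
      <title>Blue Cotton Shirt</title>
      <link>http://www.example.com/products/blue-cotton-shirt</link>
      <description>Soft blue cotton shirt, available in all sizes.</description>
    </item>
    <item>
      <title>Leather Wallet</title>
      <link>http://www.example.com/products/leather-wallet</link>
      <description>Handmade brown leather wallet.</description>
    </item>
  </channel>
</rss>
```

Each product page gets its own item, and the link element should use the search engine friendly product URL mentioned in the note above.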
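For step 5, a simple way to integrate the feed is an auto-discovery tag in the page head plus a visible subscribe link in the page body. The markup below is a minimal XHTML sketch; the FeedBurner URL and feed title are placeholders, so substitute the URL you get in step 4.

```xml
<!-- In the <head> of every page: lets browsers and feed readers auto-discover the feed -->
<link rel="alternate" type="application/rss+xml"
      title="Example Store Product Feed"
      href="http://feeds.feedburner.com/ExampleStoreProducts" />

<!-- Anywhere in the page body: a visible subscribe link (or wrap an RSS icon image in the same way) -->
<a href="http://feeds.feedburner.com/ExampleStoreProducts">Subscribe to our product feed</a>
```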

The same approach can be followed for large websites with lots of resources.

Related Posts: SEO Friendly E Commerce Store

Thanks for your time and support, leave a reply!
