
Google Paid Shopping Ads - Negative & Converting Keywords

Google Shopping campaigns promote products through product listing ads, which include an image, product title, price, shipping details, and business name, among other elements. By submitting a product feed to Google Merchant Center and linking it to your AdWords account, you can create and manage product information and settings for various targets.

Unlike web search ads, Google Shopping does not let you manage keywords directly. In most industries shopping ads perform better than web search ads, but at the same time you lose money every time a user clicks on a product ad triggered by an irrelevant search query. That is where negative keywords come in.

Negative keywords can be added at the campaign or ad group level to block the irrelevant search queries for which your product ads are shown. You can add, edit, or remove negative keywords from the Keywords tab.
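For bulk changes, negative keywords can also be added programmatically instead of one by one in the Keywords tab. The sketch below is a minimal example against the legacy AdWords API using the googleads Python client library; the campaign ID, keyword text, and credentials file are placeholders you would replace with your own, and the same idea applies to whichever API version your account uses.

```python
from googleads import adwords

# Placeholder values; replace with your own campaign ID and irrelevant query.
CAMPAIGN_ID = 'INSERT_CAMPAIGN_ID_HERE'
NEGATIVE_KEYWORD = 'INSERT_IRRELEVANT_QUERY_HERE'


def add_negative_keyword(client, campaign_id, keyword_text):
    """Attach a phrase-match negative keyword to a shopping campaign."""
    service = client.GetService('CampaignCriterionService', version='v201809')
    operations = [{
        'operator': 'ADD',
        'operand': {
            'xsi_type': 'NegativeCampaignCriterion',
            'campaignId': campaign_id,
            'criterion': {
                'xsi_type': 'Keyword',
                'matchType': 'PHRASE',
                'text': keyword_text,
            },
        },
    }]
    result = service.mutate(operations)
    for criterion in result['value']:
        print('Added negative keyword criterion %s to campaign %s'
              % (criterion['criterion']['id'], criterion['campaignId']))


if __name__ == '__main__':
    # Assumes a googleads.yaml file with your AdWords credentials.
    adwords_client = adwords.AdWordsClient.LoadFromStorage()
    add_negative_keyword(adwords_client, CAMPAIGN_ID, NEGATIVE_KEYWORD)
```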

How to find negative keywords?

In Google Analytics, navigate to Acquisition > AdWords > Search Queries. Add the secondary dimension Advertising > Campaign, then use the advanced filter's search option to segment your Shopping campaign and apply the filter.



This report lists every matched search query for your shopping campaigns, along with behavior metrics such as bounce rate. Click the Bounce Rate column to sort from 100% downward; the queries at the top are often irrelevant to your business. You can also compare date ranges month over month to get a clearer picture of which negative keywords to choose. Adding those negative keywords will save a lot of money, and it should be a continuous process. In the same report, identify the converting queries and re-optimize your best-converting product pages around those relevant keywords.
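If you export this search-query report to CSV, you can automate the first pass of that review. The sketch below is a minimal Python example; the column headers ("Search Query", "Bounce Rate", "Transactions") and the 90% threshold are assumptions, so adjust them to match your actual export. It flags high-bounce, non-converting queries as negative keyword candidates.

```python
import csv

# Assumed column names from a Google Analytics search-query export;
# change these to match the headers in your own CSV file.
QUERY_COL = "Search Query"
BOUNCE_COL = "Bounce Rate"
TRANSACTIONS_COL = "Transactions"


def parse_percent(value):
    """Convert a value like '87.50%' into a float (87.5)."""
    return float(value.replace("%", "").replace(",", "").strip() or 0)


def find_negative_candidates(path, bounce_threshold=90.0):
    """Return queries with a high bounce rate and zero transactions."""
    candidates = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            bounce = parse_percent(row[BOUNCE_COL])
            transactions = int(row.get(TRANSACTIONS_COL, "0").replace(",", "") or 0)
            if bounce >= bounce_threshold and transactions == 0:
                candidates.append((row[QUERY_COL], bounce))
    # Worst offenders first, just like sorting the report from 100% down.
    return sorted(candidates, key=lambda item: item[1], reverse=True)


if __name__ == "__main__":
    for query, bounce in find_negative_candidates("shopping_search_queries.csv"):
        print(f"{bounce:.1f}%  {query}")
```

Review the flagged queries manually before adding them as negatives; a high bounce rate alone does not always mean a query is irrelevant to your business.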

Take advantage of both negative keywords and converting keywords!
