
Create Search Engine Friendly URLs using .htaccess

In this post, we will see how to use .htaccess to create search engine friendly URLs. With .htaccess, mapping dynamic pages to static-looking addresses is easy, and it helps search engines index your pages faster and more effectively.

First, create a file named .htaccess in your favorite editor and add rewrite rules like the ones below to map id-passing dynamic URLs to a static version. Use this code to make your dynamic pages friendlier to search engine spiders:

**************************************************************
RewriteEngine on

## Rewrite Rules ##

RewriteRule ^category/([0-9]*)/([0-9]*)/([a-z_]*)$ category.php?id=$1&page_id=$2 [L]
RewriteRule ^category/([^/]*)/([a-z_]*)/([^/]*)$ category.php?r_name=$2&id1=$1 [L]


***************************************************************

For example, the original page URL looks like category.php?id=5&page_id=1

Using the above .htaccess code, you can present the URL as

category/5/1/product_name
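To see how the first rule maps the friendly URL back to query parameters, here is a quick sanity check of the same pattern in Python (a sketch: Apache uses its own PCRE engine, but this particular pattern behaves the same in Python's re module):

```python
import re

# The same pattern used in the first RewriteRule above.
pattern = re.compile(r"^category/([0-9]*)/([0-9]*)/([a-z_]*)$")

match = pattern.match("category/5/1/product_name")
if match:
    cat_id, page_id, slug = match.groups()
    # Apache substitutes $1 and $2 into the target URL; the third
    # group (the product-name slug) is matched but not passed on.
    target = f"category.php?id={cat_id}&page_id={page_id}"
    print(target)  # category.php?id=5&page_id=1
```

The slug exists only to make the URL readable; the script is still driven by the numeric id and page_id.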

Here,

RewriteRule ^category/([0-9]*)/([0-9]*)/([a-z_]*)$

^ - anchors the match at the start of the URL path
category - a literal path segment; no real folder is created
/ - path separator in the virtual URL
[0-9] - matches a single digit
[0-9]* - matches zero or more digits (a whole number, such as an id)
[a-z_] - matches a single lowercase letter or underscore
[a-z_]* - matches zero or more lowercase letters or underscores (e.g. a product slug)
$ - anchors the match at the end of the URL path
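The anchors and quantifiers matter in practice. A short illustration in Python (again a sketch of the same pattern): `*` means "zero or more", so multi-digit ids match, while the `$` anchor rejects anything after the slug, and uppercase letters fall outside the `[a-z_]` class:

```python
import re

pattern = re.compile(r"^category/([0-9]*)/([0-9]*)/([a-z_]*)$")

# "*" means "zero or more", so multi-digit ids match:
assert pattern.match("category/123/45/shoes") is not None

# Uppercase letters are outside the [a-z_] class:
assert pattern.match("category/5/1/Product_Name") is None

# "$" anchors the end, so extra trailing segments fail to match:
assert pattern.match("category/5/1/shoes/extra") is None

print("all pattern checks passed")
```

Keeping slugs lowercase with underscores (or widening the class to allow hyphens) is a design choice you make once in the pattern; URLs that do not fit simply fall through to the next rule.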


Additional information:

Your id-passing pages might already have been indexed through some referral, so use the code below to avoid error pages: create a 404pagenotfound.php and add a message directing your visitors to the home page for more information.

*****************************************************************

## Trace Errors
ErrorDocument 404 /404pagenotfound.php


*****************************************************************

Hope you find this post useful. :)
