How to Create Search Engine Friendly PHP Pages Without Technical Hiccups

PHP is a powerful server-side language that offers greater scalability, dynamism, and ease of maintenance than static HTML pages. However, PHP can be a hard nut to crack when it comes to search engine optimization, and developers run into a number of technical issues when writing PHP scripts that must follow SEO guidelines.

Here are some issues that can hurt the optimization of PHP pages, along with guidelines on how to overcome them without technical hiccups:

Latency of PHP scripts: The execution time of your PHP code counts for a lot in determining how search engine friendly a script is. If a search engine spider follows a link to your PHP page but is forced to wait too long for the server to execute the code behind it, it may skip the page or move on without crawling the rest of it.

To avoid slowdowns, reduce the number of SELECT * queries you run. Using SELECT * on a table that contains 10 columns when you want only one is an invitation to a slow script; instead, name exactly the columns you want to retrieve. If you are using MySQL, test your queries with the EXPLAIN statement to see how they are executed. Finally, keep your loops lean: hoist work that does not change between iterations, such as count() calls, out of the loop body instead of repeating it on every pass.
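The advice above can be sketched in a few lines of PHP; the table, column, and variable names here are placeholders, not from the original:

```php
<?php
// A minimal sketch (no real database connection): the query strings
// illustrate the column-naming advice, and the loop shows hoisting an
// invariant count() call out of the loop body.

// Avoid: SELECT * pulls every column across the wire.
$slow = 'SELECT * FROM products WHERE active = 1';

// Prefer: name only the columns you actually use.
$fast = 'SELECT id, title FROM products WHERE active = 1';

// With MySQL, prefix a query with EXPLAIN to inspect its execution plan.
$plan = 'EXPLAIN ' . $fast;

// Hoist loop-invariant work: count() runs once, not on every iteration.
$titles = ['Widget', 'Gadget', 'Gizmo'];
$n = count($titles);
for ($i = 0; $i < $n; $i++) {
    echo strtolower($titles[$i]), "\n";
}
```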

Session ID issue: If the “enable-trans-sid” option is turned on, PHP appends session ID numbers to your links. Not only do your links grow nonsensically long, but spiders are also presented with different URLs for the same content, which in turn may cause problems in the indexing of pages.

To keep session IDs out of your URLs, disable the trans-sid feature in php.ini by setting “session.use_trans_sid” to 0. Alternatively, you can turn it off by adding the line “php_flag session.use_trans_sid off” to the .htaccess file in your root directory.
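If you cannot edit php.ini or .htaccess, the same setting can usually be changed at runtime. This is a sketch under the assumption that your host allows ini_set() before the session starts:

```php
<?php
// Runtime alternative to the php.ini / .htaccess settings above.
// use_only_cookies is an extra precaution: sessions ride on cookies
// only, so session IDs never leak into URLs.
ini_set('session.use_trans_sid', '0');
ini_set('session.use_only_cookies', '1');
session_start();
```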

Search engine friendly URLs: URL cleanliness is a crucial issue for dynamic PHP pages. Because such pages are built from GET variables, their URLs can look clumsy and nearly unreadable to search spiders. To make your PHP pages look like static pages to search engines, you can take either of two approaches: use Apache to fake static-looking URLs, or keep your GET variables to a minimum. Using fewer GET variables helps you avoid URLs like “Page.php?var=abc&var1=def&var2=ghi”.

You can also make the GET variables themselves more relevant by using keyword-rich names and values. If a page requires more variables, combine them into one by delimiting them with a hyphen or another unused character, then split them apart again on the target page.
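A minimal sketch of that combine-and-split trick; the helper names pack_params()/unpack_params() and the parameter values are hypothetical:

```php
<?php
// Pack several values into one hyphen-delimited URL segment, and
// unpack them again on the target page. Helper names are made up.
function pack_params(array $values): string
{
    return implode('-', $values);
}

function unpack_params(string $segment): array
{
    return explode('-', $segment);
}

// Link to /browse/books-42-3 instead of page.php?act=books&id=42&page=3
$segment = pack_params(['books', '42', '3']);
[$act, $id, $page] = unpack_params($segment);
echo "$act / $id / $page"; // books / 42 / 3
```

The delimiter must be a character that never appears inside the values themselves, which is why the text suggests a hyphen or another unused character.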

Mod_rewrite: Rewriting your URLs deserves careful planning up front, since you cannot keep changing your links over and over again. First decide how you want your URLs to look, then implement the scheme in your .htaccess file. For instance, a sample mod_rewrite rule would look like this:

RewriteEngine On

RewriteRule ^([^/]+)/([^/]+)/([^/]+)\.html$ /index.php?act=$1&id=$2&page=$3 [L]

The first line switches on the rewrite engine, while the second does the actual URL modification: it maps a clean-looking URL such as /books/42/3.html to the real script /index.php?act=books&id=42&page=3, filling in the query string from the three captured parts of the pattern.
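On the PHP side, index.php simply reads the query string the rule produced. A sketch, assuming the parameter names act, id, and page from the rule; the sanitization shown is an added precaution, not part of the original:

```php
<?php
// index.php — receives act/id/page from the rewritten URL.
// The whitelisting and casting below are illustrative safety measures.
$act  = preg_replace('/[^a-z0-9_-]/i', '', $_GET['act'] ?? '');
$id   = (int) ($_GET['id'] ?? 0);
$page = (int) ($_GET['page'] ?? 1);

echo "section=$act, id=$id, page=$page";
```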


