okrogius
04-21-2003, 11:00 PM
Similar to the "ebay style feedback system" kind of thread. Now to the point... :)
Search engine friendly URLs are always wanted on forums. Why shouldn't your thousands of posts be indexed in Google, after all? There are hacks here that give a search-engine-friendly "archive", and some mod_rewrite hacks.
I've yet to find a hack that covers the actual forums, not just some measly archive, without requiring mod_rewrite and in-depth knowledge of POSIX regular expressions. So, I've been playing around here and there with a modification of my own.
Demo:
http://www.cgshockforums.com/
The modification does require Apache (specifically the "lookback" feature of mod_mime). So far I've only modified the showthread and forumdisplay templates to use the "cleaner" URLs.
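The post doesn't show the server-side setup, but one common mod_rewrite-free way to get extensionless script URLs on Apache of that era (and a plausible reading of the mod_mime "lookback" trick) is a ForceType block: a PHP script named simply `go` handles every request under `/go/`, with the rest of the URL arriving as PATH_INFO. This is an illustrative sketch, not the hack's confirmed mechanism:

```apache
# Hypothetical httpd.conf / .htaccess fragment -- assumes a PHP script
# named "go" (no extension) sits in the forum root.
<Files go>
    # Tell Apache to run "go" as PHP despite the missing .php extension.
    ForceType application/x-httpd-php
</Files>
```

A request for `/go/showthread/42` would then execute the `go` script with `/showthread/42` available in `PATH_INFO`, which the script can parse to call the real showthread code.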
Why? Part of this system is a customized robots.txt file. Googlebot (or any other crawler that respects robots.txt) is only allowed to index the /go/ "folder", as all other files are disallowed. Thus the search engine indexes only the relevant pages :).
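A robots.txt along these lines would achieve that; the `/go/` path is from the post, but the specific Disallow entries below are illustrative (the original robots.txt standard has no `Allow` directive, so each real script is disallowed individually):

```
User-agent: *
# Block the regular vBulletin scripts (illustrative list)...
Disallow: /showthread.php
Disallow: /forumdisplay.php
Disallow: /member.php
Disallow: /search.php
Disallow: /newreply.php
# ...leaving /go/ as the only crawlable entry point.
```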
Additionally, the system works nicely by hiding the sessionhash depending on whether it is needed or not. The sessionhash is always hidden from search engine crawlers, which are detected by matching HTTP_USER_AGENT against an array defined in the phpinclude template.
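The real check lives in vBulletin's phpinclude template (in PHP); as a language-neutral sketch of the same idea, the crawler list and URL-building logic might look like this. The agent strings and the `/go/showthread/{id}` URL shape are assumptions for illustration, not taken from the hack:

```python
# Hypothetical list of crawler User-Agent substrings -- the actual array
# in the phpinclude template may differ.
CRAWLER_AGENTS = ["googlebot", "slurp", "ia_archiver", "msnbot"]


def is_crawler(user_agent: str) -> bool:
    """Return True if HTTP_USER_AGENT looks like a search engine crawler."""
    ua = user_agent.lower()
    return any(bot in ua for bot in CRAWLER_AGENTS)


def thread_url(threadid: int, user_agent: str, sessionhash: str = "") -> str:
    """Build a clean showthread URL, omitting the session hash for crawlers
    (or when there is no session hash to append)."""
    url = f"/go/showthread/{threadid}"
    if sessionhash and not is_crawler(user_agent):
        url += f"?s={sessionhash}"
    return url
```

Regular visitors still get their session hash appended, so sessions keep working for users without cookies, while crawlers only ever see the stable, hash-free URL.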
Comments? Ideas? Suggestions? :p