As I move forward, one thing stands out above all else: learning never stops. The rules of website building and Search Engine Optimization continue to evolve in the effort to produce ever more relevant search results. For a long time the buzz was all about keywords and the tools that let you analyze just how effective those keywords were at driving traffic to a site. The words are still important, but tracking the results has become an order of magnitude more difficult.
Google is the most widely used tool for evaluating the effectiveness of keyword campaigns, but now that they encrypt all search traffic, they have effectively obscured the tracking of individual keywords. It takes special tactics to learn which keywords are effective and which ones need improvement. If a specific web page targets a single keyword or a small group of keywords, then you can pull the stats for the URL of that page and thereby get a glimpse of how a specific keyword or phrase is performing. There may be other ways, but I have yet to discover them. If you have better insight, please comment on this blog entry.
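To make the idea concrete, here is a minimal sketch of that workaround: tally traffic per dedicated landing page and treat each page's hit count as a proxy for its keyword group. The page paths, keyword groups, and log lines below are all hypothetical examples, and the parsing assumes simple Apache-style combined log lines.

```python
# Sketch: estimate keyword effectiveness from landing-page traffic instead of
# the (now-encrypted) per-keyword referrer data. All paths and keyword
# groups here are made-up examples.
from collections import Counter

# Hypothetical mapping of dedicated landing pages to the keyword group
# each page was built to target.
PAGE_TO_KEYWORDS = {
    "/services/web-design": "web design",
    "/services/seo-audit": "seo audit",
}

def count_landing_hits(log_lines):
    """Tally hits per targeted keyword group from access-log lines."""
    hits = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) < 7:
            continue
        path = parts[6]  # request path field in Apache combined log format
        if path in PAGE_TO_KEYWORDS:
            hits[PAGE_TO_KEYWORDS[path]] += 1
    return hits

sample_log = [
    '1.2.3.4 - - [10/Oct/2013:13:55:36 -0700] "GET /services/web-design HTTP/1.1" 200 2326',
    '1.2.3.5 - - [10/Oct/2013:13:56:01 -0700] "GET /services/seo-audit HTTP/1.1" 200 1204',
    '1.2.3.6 - - [10/Oct/2013:13:57:12 -0700] "GET /services/web-design HTTP/1.1" 200 2326',
]

print(count_landing_hits(sample_log))
```

The same idea works with any analytics tool that reports pageviews per URL; the log parsing is just the simplest self-contained way to show it.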
Add to this the localization that Google applies to search engine result pages (SERPs), and the analysis gets even more difficult. In Google's defense, though, their goal is to give you the most relevant and useful results possible. The formula they use is constantly being revised, and what works today may not work tomorrow.
I also discovered that when submitting a sitemap to Google, the SEF (search-engine-friendly) URL will not work as expected; you need to submit the non-SEF URL to get Google to properly recognize and index your site. This is just one more little tidbit of knowledge worth passing on to the interested geeks of the web world. Many of you may already know this, but I am sure there are struggling developers out there who will find it helpful.
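For anyone unsure what the two URL forms look like, here is a sketch of a sitemap entry using the non-SEF form. The domain, component name, and article id are hypothetical Joomla-style examples, not taken from my actual site; the `urlset` namespace is the standard one from the Sitemaps protocol.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Non-SEF form: the raw query-string URL, which Google indexed
       correctly in my experience. Note the &amp; escaping required in XML. -->
  <url>
    <loc>http://www.example.com/index.php?option=com_content&amp;view=article&amp;id=42</loc>
  </url>
  <!-- The SEF (rewritten) form of the same page would look something like
       http://www.example.com/blog/my-article -->
</urlset>
```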
Feel free to chime in on this topic.