Detect AJAX postback in Global.asax

by demtron on Friday, January 16, 2009 08:19 AM

I was recently working on a page hit logging scenario that, for reasons having to do with URL rewriting, needed to have the logging code executed in the Application.BeginRequest event of Global.asax.  The application relies heavily on ASP.Net Ajax, and I needed a way to ferret out the AJAX requests and not log them.  Ordinarily, in the page lifecycle, this would be accomplished with ScriptManager's IsInAsyncPostBack property.  In this case, I needed to perform the logging before any page code was invoked.

To determine whether a request was generated by AJAX, I check the request header that ASP.Net Ajax partial postbacks carry.  The following expression evaluates to true for an AJAX request:

VB:

Request.Headers("X-MicrosoftAjax") = "Delta=true"

C#:

Request.Headers["X-MicrosoftAjax"] == "Delta=true"
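As a rough sketch of how this fits into Global.asax (the LogPageHit helper is a hypothetical stand-in for whatever logging code you use), the check sits in Application_BeginRequest:

```csharp
// Global.asax.cs -- sketch; LogPageHit is a placeholder for your own logging routine
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // ASP.Net Ajax partial postbacks send this header, so skip logging them
    bool isAjaxRequest = Request.Headers["X-MicrosoftAjax"] == "Delta=true";

    if (!isAjaxRequest)
    {
        LogPageHit(Request.RawUrl);  // log ordinary page requests only
    }
}
```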

Hope that helps someone out there who has struggled to find an easy answer.

Domain Scams and Beijing Himense Part 2

by demtron on Wednesday, January 14, 2009 07:57 PM
Here we are, two months after receiving the Beijing Himense Domain Scam E-Mail and what has happened?  NOTHING.  No other top-level domains have been registered with Demtron.  Just a word to the wise... ignore these jokers.

Killing Your Search Engine Ranking in 7 Easy Steps

by demtron on Wednesday, January 14, 2009 07:40 PM
Have you ever wanted to completely destroy your search engine ranking, or to do it for someone else?  Maybe you never want your site to get found again?  Believe it or not, in the last six months I've helped clients fix each of these problems, which were torching their search engine profiles and strangling their organic traffic.
1) Domain masking: I took over one site where the entire site was using domain masking.  In seven years, the client had absolutely no idea that the previous designer was doing this to save himself a buck on a hosting plan for the site.  Only the home page was found in search engines.  It turns out the designer did exactly the same thing with all the rest of his clients' sites that have been live for years.  They, too, only have the home page to show for it in search engine results.
2) Use only a JavaScript menu for linking pages: Sure, JavaScript menus are cool.  They can drop down, slide across, have pictures and generally spice up a site.  But they can't be crawled by search engines.  What's more, the links in them don't contribute anchor text, either.  One recent client had over a hundred pages in a JavaScript menu and practically no linking using anchor tags.  Most of the 200 pages of the site had not been crawled at all.
3) Use a JavaScript pager strip: ASP.Net is famous for offering page navigation strips built on its __doPostBack JavaScript method.  Another client I acquired had over 6,000 pages on a site, but only 32 pages actually crawled.  The remaining pages were all accessible only through paginated tables.  Another great waste of code that a search engine ignores.
4) All pictures and Flash with little text: Some designers with a flair for graphic design take sites a little overboard.  If you're a famous pop star and have zillions of fans finding your site every day, then who cares?  In the real world, most sites are not wildly popular and are only found through search engine results.  SEs love text, especially keyword-rich, backlinked text.  Pictures and Flash sure are pretty, but they basically tell nothing to a crawler.
5) Renaming pages without redirects: One site I redesigned earlier this year had tons of links from other sites pointing to a page that no longer existed.  What a complete waste of free traffic and promotion!  Neither search engines nor human visitors could follow those links to the site.  Oh, what a little bit of 301-redirect action did to help out that one.
6) Leaving title tags blank: One of the aforementioned sites had about 60 of its 200 pages with blank titles.  How is anyone going to find those pages, and why would anyone click on them?  Here, let's write a book, then tear the front cover off and leave one of those "this page intentionally left blank" pages as the new front cover.  Real slick.
And last, but not least...
7) User-agent: * / Disallow: / in the ROBOTS.TXT file: This one didn't actually happen, although it was close.  The site had a disallow-all rule set for Google's user agent.  So, they kissed 81% of their traffic goodbye through a simple screw-up by the former designer.
And there you have it.  If you implement these seven key steps, your success with annihilating your search engine exposure and traffic is pretty much guaranteed.  Good luck and happy destroying!
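For reference, here is what the step-7 footgun looks like in robots.txt.  The first rule blocks every crawler from the whole site; the second (roughly what that client actually had) blocks only Google's crawler.  Either one on its own is enough to do the damage:

```
# Blocks ALL crawlers from the entire site
User-agent: *
Disallow: /

# Blocks only Google's crawler -- the near-miss described in step 7
User-agent: Googlebot
Disallow: /
```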

Milwaukee SEO: Keywords for Real Estate Web Sites

by demtron on Wednesday, January 14, 2009 04:38 PM

Real estate agents can achieve significant exposure on the Web through their Web sites and blogs.  According to the National Association of Realtors' 2006 survey of Internet use, 24% of all buyers first learned about the home they purchased through the Internet, and a whopping 77% of all buyers used the Internet to search for homes.  An increasing number of home buyers (and sellers) are savvy with technology and conduct their initial investigation of homes on real estate Web sites.

Unfortunately, a surprising number of real estate Web sites don't rank well in search engine results, even for properties in their own communities.  The agents who own these sites likely don't know how much traffic and Web visibility they're missing.

Real Estate Location Keywords for SEO

So, how do we go about determining which keywords will work well for an agent's site?  First and foremost, it's critical to know the specific cities to target, the size and relative importance of each, and what variations exist for each name.  For example, let's use Waukesha County, Wisconsin as a sample.  Here's the list of the most populous cities in descending order according to the 2000 U.S. Census:



New Berlin

These should form the core of the keywords used for a Waukesha County real estate site, as most buyers and sellers will target specific communities when conducting searches.  Variations should be included, such as Saint and St., Mount and Mt., directionals such as North and N., and Falls and Fls.  Don't forget to include Waukesha County itself!
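As a rough sketch (the abbreviation pairs are just the ones mentioned above, and the expansion rule is deliberately simplistic), generating those name variants can be automated:

```csharp
// Sketch: expand a city name into its common abbreviation variants
using System;
using System.Collections.Generic;

class KeywordVariants
{
    // Full form / abbreviated form pairs from the examples above
    static readonly (string Full, string Abbrev)[] Pairs =
    {
        ("Saint", "St."), ("Mount", "Mt."), ("North", "N."), ("Falls", "Fls.")
    };

    static IEnumerable<string> Variants(string city)
    {
        yield return city;  // always keep the name as given
        foreach (var (full, abbrev) in Pairs)
        {
            if (city.Contains(full))
                yield return city.Replace(full, abbrev);
            if (city.Contains(abbrev))
                yield return city.Replace(abbrev, full);
        }
    }

    static void Main()
    {
        // Prints "Menomonee Falls" and "Menomonee Fls."
        foreach (var v in Variants("Menomonee Falls"))
            Console.WriteLine(v);
    }
}
```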

Subject Keywords for Search Engine Optimization

The next set of words and phrases to consider is subject words.  These represent the subject matter of the site (and their derivations) for which visitors would search.  Typical phrases that should be at the top of the list are:

Real Estate

Home Selling

Foreclosures (if these are important)

Again, this is just the standard list that immediately comes to mind.  Consider using a keyword suggestion tool or the services of a Milwaukee Real Estate SEO Expert to help determine which of these are most valuable and what others belong on the list.

Secondary Real Estate Keywords for Long-Tail Searches

The last set of words to consider is secondary words that will likely appear in long-tail searches.  Those are searches that go beyond the typical two or three core words already chosen, and they'll generally account for the majority of the search traffic coming to a real estate site.  A few examples are:




New Homes

For Sale

Selecting keywords is one of the steps required in creating a strong organic search engine profile for real estate Web sites.  For more information on optimizing real estate web sites for search engines, contact Demtron, your Milwaukee and Waukesha SEO Experts.

Powered by BlogEngine.NET
Theme by Mads Kristensen · Adapted by Demtron
