Killing Your Search Engine Ranking in 7 Easy Steps

by demtron on Wednesday, January 14, 2009 07:40 PM
Have you ever wanted to completely destroy your search engine ranking, or do it for someone else?  Maybe you never want your site to be found again?  Believe it or not, in the last six months I've helped clients recover from every one of these problems, each of which was torching their search engine profile and strangling their organic traffic.
1) Domain masking: I took over one site that was served entirely through domain masking (the whole site displayed inside a frame on another domain).  In seven years, the client had absolutely no idea that the previous designer had done this to save himself a buck on a hosting plan for the site.  Only the home page was found in search engines.  It turns out the designer did exactly the same thing with the rest of his clients' sites, which had been live for years.  They, too, have only the home page to show for it in search engine results.
2) Use only a JavaScript menu for linking pages: Sure, JavaScript menus are cool.  They can drop down, slide across, show pictures and generally spice up a site.  But they can't be crawled by search engines, and the links in them don't contribute anchor text, either.  One recent client had over a hundred pages linked only through a JavaScript menu and practically no linking using plain anchor tags.  Most of the site's 200 pages had not been crawled at all.
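The fix is straightforward: render the menu as plain anchor tags first, then let a script dress it up.  A minimal sketch — the page names and URLs are hypothetical:

```html
<!-- Plain anchor links: crawlable, and each one passes anchor text.
     A script can progressively enhance this list into a drop-down menu;
     with JavaScript off (or for a crawler), the links still work. -->
<ul id="mainNav">
  <li><a href="/services.html">SEO Services</a></li>
  <li><a href="/portfolio.html">Client Portfolio</a></li>
  <li><a href="/contact.html">Contact Us</a></li>
</ul>
```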
3) Use a JavaScript page strip: ASP.NET is famous for offering page-navigation strips driven by its __doPostBack JavaScript method.  Another client I acquired had over 6,000 pages on a site, but only 32 pages actually crawled.  The remaining pages were all accessible only through paginated tables.  Another great waste of code that a search engine ignores.
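For contrast, here is roughly what the two kinds of pager look like in the rendered HTML — URLs, control names, and the query-string parameter are illustrative:

```html
<!-- Postback pager: the href runs script, so a crawler goes nowhere.
       <a href="javascript:__doPostBack('grid','Page$2')">2</a>
     Crawlable pager: every page of the table gets a real URL. -->
<div class="pager">
  <a href="/products.aspx?page=1">1</a>
  <a href="/products.aspx?page=2">2</a>
  <a href="/products.aspx?page=3">3</a>
</div>
```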
4) All pictures and Flash with little text: Some designers with a flair for graphic design take sites a little overboard.  If you're a famous pop star and zillions of fans find your site every day just by typing your name, then who cares?  In the real world, most sites are not wildly popular and are found only through search engine results.  Search engines love text, especially keyword-rich, backlinked text.  Pictures and Flash sure are pretty, but they tell a crawler basically nothing.
5) Renaming pages without redirects:  One site I redesigned earlier this year had tons of links from other sites pointing to pages that no longer existed.  What a complete waste of free traffic and promotion!  Neither search engines nor human visitors could follow those links.  Oh, what a little bit of 301-redirect action did to help that one.
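For readers on ASP.NET (the platform most of these sites ran on), a minimal sketch of issuing a 301 from a renamed page's code-behind — the page names are placeholders:

```csharp
// old-page.aspx.cs: the content moved to new-page.aspx.
// A 301 tells crawlers and browsers alike that the move is permanent,
// so inbound links keep passing visitors and link equity to the new URL.
protected void Page_Load(object sender, EventArgs e)
{
    Response.Status = "301 Moved Permanently";
    Response.AddHeader("Location", "/new-page.aspx");
    Response.End();
}
```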
6) Leaving title tags blank:  One of the aforementioned sites had about 60 of its 200 pages with blank titles.  How is anyone going to find those pages, and why would anyone click on them?  Here, let's write a book, then tear the front cover off and use one of those "this page intentionally left blank" pages as the new front cover.  Real slick.
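Even a one-line title helps.  A sketch of what each page's head should carry — the wording here is purely illustrative:

```html
<head>
  <!-- The title is the clickable headline in search results.
       Make it descriptive, keyword-bearing, and unique per page. -->
  <title>Custom Kitchen Remodeling in Milwaukee | Example Builders</title>
</head>
```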
And last, but not least...
7) User-agent: * / Disallow: / in the robots.txt file: This one didn't actually happen, although it came close.  The site had a disallow-all rule set for Google's user agent.  So they kissed 81% of their traffic goodbye through a simple screw-up by the former designer.
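You can check exactly what a robots.txt blocks before it goes live.  A small sketch using Python's standard-library parser, fed the same disallow-all rules:

```python
from urllib.robotparser import RobotFileParser

# The destructive rules from the story: every crawler barred from every page.
rules = ["User-agent: *", "Disallow: /"]

parser = RobotFileParser()
parser.parse(rules)

# No user agent may fetch anything on the site.
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))
print(parser.can_fetch("*", "http://example.com/"))
```

Two lines in a text file, and the whole site vanishes from the index — which is why robots.txt belongs on every pre-launch checklist.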
And there you have it.  If you implement these seven key steps, your success with annihilating your search engine exposure and traffic is pretty much guaranteed.  Good luck and happy destroying!

Custom Reporting in Google Analytics

by demtron on Thursday, November 13, 2008 09:36 AM
Google recently began offering a beta test version of their custom reporting tool for Google Analytics.  With this enhancement, Analytics just got a whole lot better.

Like any reporting tool, Analytics has drawn one common complaint: Google’s limitations on reporting data with multiple variables or criteria.  The Custom Reporting feature is a way to move beyond the canned reports supplied by Google and create reports that fit specific business needs.

Custom reports are built by first picking dimensions and metrics.  A dimension is an attribute (time of day, geographic location, or page name) to report on.  There is sometimes confusion between dimensions and metrics.  One easy way to remember the difference is that a dimension is like a column title (such as "city") and metrics are the numbers that appear in the column (such as the number of visits generated from that city).

Any metric can be chosen for reporting, but one important point to note is that not all dimensions can be paired with all metrics.  Google supplies a chart identifying which pairings are possible.  Some combinations, such as network location, page name, and visits, would be highly useful but aren’t possible.

All in all, this is a great addition to the free Google Analytics tool and a "must learn" for those who are serious about using Analytics as a reporting tool.

Committing to The SEO Process

by demtron on Thursday, October 02, 2008 06:13 PM

The power of the Internet and search engine technologies has increased interest in making Web sites appear prominently in search engine results.  According to Opinion Research Corp. for Performics Inc., 58% of adults conduct on-line research prior to making a purchase.  This underscores how important it is for a business to have a Web site that can be found.

The majority of traffic from a search query will go to those sites listed on the first SERP (Search Engine Results Page).  Those sites have a clear advantage in capturing attention, clicks and conversions on their sites.  Sites that do not appear there will likely be ignored completely.

Those well-positioned sites didn’t get there overnight.  SEO takes consistent action and reaction.  What works today for SEO may not work as well in the future.  Competition for high positioning intensifies over time for any keyword phrase as more businesses find out about SEO.

By committing to an SEO strategy over time, a small business has a great opportunity to drive significant amounts of traffic to its site and land great positioning.  What do you need to know to make this commitment?

  1. Assess your current site and positioning.  Where have you been, where are you at, and where are you going?  By knowing your current positioning, you have a reference point for understanding the results of your present and future efforts.  You may be surprised at how well some parts of your site are performing!
  2. Know what to change.  There are over 200 factors that Google considers when ranking pages in its results.  Many can be controlled through your site's design and profile, and a handful of them provide the greatest return on your investment.  You might consider hiring an SEO consultant to help you identify what will work best for your site.
  3. Make meaningful changes in increments.  Unless a site has poor content, navigation, and suffers from serious neglect, there are probably areas that are doing reasonably well to attract traffic.  Making too many across-the-board changes too quickly may negatively impact those well-performing areas.  Plus, it may be difficult to identify what changes made the biggest difference.
  4. Measure your effectiveness.  It’s important to periodically check your ranking among the major search engines to gauge the impact of your changes.  Do this for the first week or two after your changes.
  5. Measure your traffic.  Typical analytics tools can tell you statistics such as number of visitors, traffic sources, which search terms yielded visits and page popularity.  The metrics derived from these tools are important for guiding your ongoing SEO enhancement efforts.
  6. It takes time.  Don’t expect the results to happen overnight.  If your site has not ranked well in the past or is not established, it may be several months before your site’s ranking rises substantially.  Search engines need time to crawl your pages and your competitors’ pages to pick up changes and analyze them.  Ranking for high-value phrases may prove to be a significant challenge.

The proverb “good things come to those who wait” is particularly important in SEO.  If you need help developing a comprehensive search engine optimization plan, contact us to get started.

Duplicate Content and SEO

by demtron on Wednesday, October 01, 2008 07:34 PM
Duplicate content is bad for SEO - Demtron can help!

I was recently asked to review a site that was ranking poorly and not indexed well in search engines, especially Google.  The site looked well designed, had nice internal linking, and a fair amount of SEO performed on it.  A quick search on Google and Yahoo uncovered the problem - duplicate content!

Duplicate content plays tricks on search engines, and that's nothing but bad for your SEO efforts.  For example:

  • The popularity of any one page that's duplicated is diluted, reducing the likelihood that it comes up in search results
  • Some or all of the URLs may not appear friendly to a visitor, thereby reducing the likelihood of that visitor clicking on one

When Google finds duplicate content, it removes the duplicated copies from its results and shows only the one link that appears to be the "best".  Larger sites often have dynamically generated pages with little change in content, URL, title, or keywords.  This is an immediate RED FLAG and will cause Google to drop the offending pages.  Any SEO done for the tossed pages has also been tossed out the window.
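One widely supported remedy for near-duplicate dynamic pages is to declare which URL is the "real" one.  A sketch — the URL below is a placeholder:

```html
<!-- Placed in the <head> of every duplicate or variant page, this tells
     search engines to consolidate ranking signals onto the one URL you
     want indexed, instead of splitting them across the duplicates. -->
<link rel="canonical" href="http://www.example.com/products/widgets.aspx" />
```

Where variant URLs serve no purpose at all, a 301 redirect to the preferred URL accomplishes the same consolidation more forcefully.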

If you think your site may have duplicated content, our SEO services can help you.  Contact us for more information.
