Sunday, December 13, 2009

What's Better: PPC or SEO?

Who in their right mind would not want to be number one in organic search?

The beauty of the search business for Google is that there can be only one number "one". The rest of us have to pay. For all those who cannot be number one, there is still hope, e.g. PPC. When some experts say to use PPC only, it reminds me of all those people who will tell you why it is not all that great to be rich, yet I have never seen a rich person who wants to be poor. The best people to ask would be the ones who are already number one in organic search.

The ease and simplicity of PPC make it perfect for all those who are not number one. What a great business for the search engine companies.

Organic search is the engine that drives ALL the PPC business. Without it, there would be no PPC business. Could anyone imagine a search engine that presented only PPC results? Would anyone want to use such a product? Has anyone ever built a business plan around this concept, never mind attempted to market such a product? If you ask me, SEO should always be the first thing you do, but search long and hard for a great SEO company to hire, unless you have the time and money to invest in in-house expertise.

To be fair, here are some important advantages of PPC:

- Gives immediate online presence
  - Have a new site? Have ads in an hour
  - Start getting ROI sooner
  - No ramp-up time
  - Great for seasonal items or time-sensitive promotions
- Great for testing
  - Easily test the effectiveness of a new marketing message or site design change
  - Quickly gather feedback
- Regulate traffic volume
  - Sales pipeline empty? Use PPC to push traffic
  - Overloaded? Pause campaigns or cut back spend
  - Have a limited sales season? Saturate the market while demand is high

Some experts believe that although PPC gets only about 10% of search clicks, it captures 90% of search marketing spend.

Since there is no clear-cut consensus, many believe the most prudent approach is to do both, throttling one back in favor of the other based on your current results.

By: Nick the SEO guy

Can You "Rank" in Google if Everyone Has Different Search Results?

Personal Searches

Google has extended its personalized search functionality to users who are not even signed in. This applies to Google users around the world, in over 40 languages. What this means is that when you search with Google, it will provide results aimed at higher relevance for you as an individual, as opposed to relevance for the average person.

Although Google would like to increase sales and profits, it also has to focus on its customers. It is hard to see how 1 or 2 websites out of thousands ranking higher on a keyword would have any negative impact on people searching. I doubt the average person will switch to another search engine as a result of this change, and I am sure Google conducted focus group research prior to making such an important move. As a result, other search engines are more likely to follow than to buck the trend.

On the other hand, this can have a huge impact on SEO experts who are trying to get a site highly ranked, if they cannot influence the outcome. I put anyone doing their own SEO work in the same category. If, as a side consequence, this forces more companies to spend more on PPC to get the same amount of traffic, well, all the better for Google.

"This addition enables us to customize search results for you based upon 180 days of search activity linked to an anonymous cookie in your browser," Google says. "It's completely separate from your Google Account and Web History (which are only available to signed-in users). You'll know when we customize results because a 'View customizations' link will appear on the top right of the search results page. Clicking the link will let you see how we've customized your results and also let you turn off this type of customization."

If you're worried about privacy, Google lets you turn personalized search off altogether. For signed-in users, all you have to do is remove Web History from your Google account. For signed-out users, click "web history" in the top right corner of a search results page, then click "disable customizations." You can also just clear your browser's cookies.

There are ways to counter Google's change if you are worried about your organic ranking, but these techniques require a larger investment in organic SEO, which makes the higher cost of PPC more acceptable. Most SEO experts will keep these techniques secret so they can use them for their clients only.

By: Nick the SEO guy

Thursday, November 12, 2009

I Want To Add 100 Content Pages. How Should They Be Organized?

I am sure you have heard: Content Is King!

What exactly does that mean? Most search engines like to rank authoritative websites highly. What is an authoritative website? It is a website full of informative content related to the site's topic (primary keyword).

The best way to do this is to create supporting content for your main (primary) keyword. Let's suppose your keyword is "golf". Even if you wrote a book's worth of information related to golf, you would probably not reach the number one position unless it was extremely well organized, with all supporting keywords in a hierarchical structure. If you address all the supporting keywords, the result will be hundreds of pages. Many large companies track thousands of keywords. How should these pages be organized?

The more organized the content pages, the more likely these pages will be indexed and provide juice to your main keyword. So is there any simple methodology for creating a power structure? Yes, there are several methods. One method is to create an "outline" of main topics and subtopics. After you have created the outline, you can create folders based on the topics and subtopics, and assign each keyword (content page) to one of the folders. For each content page to be effective, meta tags, e.g. the meta title, must be unique, or the value of creating a unique page will be defeated.

Another method is to create folders based on a list of keywords. Let's say you have 100 keywords. Some keywords may be one word, some two words, some three words, and so on. Folders are created from the first word of each keyword, sub-folders from the first and second words of a keyword phrase, and so on. This creates an ordered folder/file structure that is not only well organized around keywords but, more importantly, can be generated automatically. An example folder/file structure for the keyword "chicago ladies golf shoes" would be: www.domain-name.com/chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html.
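For illustration, here is a minimal Python sketch of this second method; the keywords, file extension, and naming are just examples:

    # Minimal sketch: build a nested folder path from a keyword phrase,
    # one folder level per word, with the full hyphenated phrase as the
    # file name.

    def keyword_to_path(keyword, extension=".html"):
        words = keyword.lower().split()
        folders = "/".join(words)
        filename = "-".join(words) + extension
        return folders + "/" + filename

    for kw in ["golf", "ladies golf shoes", "chicago ladies golf shoes"]:
        print(keyword_to_path(kw))
    # golf/golf.html
    # ladies/golf/shoes/ladies-golf-shoes.html
    # chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html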

Which method is better? From a search engine's perspective, they may both produce equal results, but the second method is straightforward and can be done by anyone. In addition, it can be automated, which has been done by Web Coast Design.

Why do most companies resist creating many content pages? Is it because it is hard to write them? I don't believe so. The real problem is all the little things that must be put in place, e.g. the Meta titles, Meta keywords, Meta descriptions, menus, and cross links, but even more important is the ongoing maintenance. What happens when you want to update the site, e.g. add a page or delete a page, and everything has to be re-published? You want a system that can quickly and automatically update all pages with the correct menus and cross links, like SEO Page Pro.

Many companies use Content Management Systems (CMS) to manage page content, but they do not focus on keywords, and they provide no guidance in creating Meta Titles, which are the most important of all the Meta tags.

Nick, the SEO guy

Tuesday, November 10, 2009

What Is The Better Type of BackLink?

Sometimes clients ask why some websites rank higher with fewer backlinks. Although there are many reasons why this can happen, it is also the case that not all backlinks are of equal value.

In a Link Building post from September 10th, 2009, randfish wrote a thorough explanation of the many ways search engines evaluate the value of links.
Here is a summary of his points. For a more detailed discussion, visit his blog, where he has several good illustrations and graphs.

First, it is good to understand that there is both page-level and domain-level link value.

Search engines have become more and more dependent on metrics for the entire domain, rather than just an individual page. That's why you'll see new pages, or pages with very few links, ranking highly simply because they're on an important, trusted, well-linked-to domain. This is sometimes referred to as "domain authority" and is viewed as the single largest factor.

#1 - Internal vs. External

When search engines first began valuing links as a way to determine the popularity, importance, and relevance of a document, they followed the classic citation-based rule that what others say about you is far more important (and trustworthy) than what you say about yourself. Thus, while internal links (links that point from one page on your site to another) do carry some weight, links from external sites matter far more.

This doesn't mean it's not important to have a good internal link structure, or to do all that you can with your internal links (good anchor text, no unnecessary links, etc.), it just means that a site/page's performance is highly dependent on how other sites on the web have cited it.

As a matter of fact, one of the most overlooked SEO activities is creating context-based internal linking, which is an activity you can do yourself.

#2 - Anchor Text

"exact match" anchor text is more beneficial than simply inclusion of the target keywords in an anchor text phrase.

#3 - PageRank

Whether they call it StaticRank (Microsoft's metric), WebRank (Yahoo!'s), PageRank (Google's) or mozRank (Linkscape's), some form of iterative, Markov-chain-based link analysis algorithm is part of all the engines' ranking systems. PageRank and its kin use the analogy that links are votes, and that pages with more votes have more influence with the votes they cast.

The nuances of PageRank are well covered in The Professional's Guide to PageRank Optimization. A general understanding is required if you want to be effective:

  1. Every URL is assigned a tiny, innate quantity of PageRank
  2. If there are "n" links on a page, each link passes that page's PageRank divided by "n" (and thus, the more links, the lower the amount of PageRank each one flows)
  3. An iterative calculation that flows juice through the web's entire link graph dozens of times is used to reach the calculations for each URL's ranking score
  4. Representations like those shown in Google's toolbar PageRank or SEOmoz's mozRank on a 0-10 scale are logarithmic (thus, a PageRank/mozRank 4 has 8-10X the link importance of a PR/mR 3)
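To make the iterative idea concrete, here is a toy Python sketch of the power-iteration calculation. The three-page link graph is made up, and real engines layer many refinements on top of this:

    # Toy PageRank: links are votes; each page passes its rank divided
    # by its number of outbound links (point 2 above).
    damping = 0.85
    graph = {           # page -> pages it links to
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
    }

    ranks = {page: 1.0 / len(graph) for page in graph}  # innate starting quantity

    for _ in range(50):  # iterate until the scores settle (point 3 above)
        new_ranks = {}
        for page in graph:
            incoming = sum(ranks[src] / len(links)
                           for src, links in graph.items() if page in links)
            new_ranks[page] = (1 - damping) / len(graph) + damping * incoming
        ranks = new_ranks

    print(ranks)  # converges to roughly A: 0.39, B: 0.21, C: 0.40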

#4 - TrustRank

The basics of TrustRank are described in the Stanford paper Combating Web Spam with TrustRank. The basic tenet of TrustRank is that the web's "good" and "trustworthy" pages tend to be closely linked together, and that spam is much more pervasive outside this "center." Thus, by calculating an iterative, PageRank-like metric that only flows juice from trusted seed sources, a metric like TrustRank can be used to predict whether a site/page is likely to be high quality vs. spam. Linkscape uses this intuition to build mozTrust (mT) and Domain mozTrust (DmT). The point being: get links from high-trust sites and don't link to potential spam.

#5 - Domain Authority

Though the phrase "domain authority" is often discussed in the SEO world, a formal, universal definition doesn't yet exist. Most practitioners use it to describe a combination of popularity, importance and trustworthiness calculated by the search engines and based largely on link data (though some also feel the engines may use the age of the site here as well).

#6 - Diversity of Sources

No single metric has a more positive correlation with high rankings than the number of linking root domains. This appears to be both a very hard metric to manipulate for spam (particularly if you need domains of high repute with diverse link profiles of their own) and a metric that indicates true, broad popularity and importance. Having said this, it is far easier (in time and money) to get on all the pages of one website than to get on one page each of many different, unique websites. So although having many unique domains link to your website should be your primary goal, do not overlook the potential of links from all the web pages within the same domain.

#7 - Uniqueness of Source + Target

The engines have a number of ways to judge and predict ownership and relationships between websites. These can include (but are certainly not limited to):

  • A large number of shared, reciprocated links
  • Domain registration data
  • Shared hosting IP address or IP address C-blocks
  • Public acquisition/relationship information
  • Publicized marketing agreements that can be machine-read and interpreted

#8 - Location on the Page

Microsoft was the first engine to reveal public data about their plans to do "block-level" analysis (in an MS Research piece on VIPS - VIsion-based Page Segmentation). Links from the "content" of a piece are the most valuable, both for the ranking value the link passes and, fortuitously, for click-through traffic as well.

Internal links in the footer of web pages may not provide the same beneficial results as those same links placed in top/header navigation. One way the engines appear to be fighting pervasive link advertising is by diminishing the value that external links carry from the sidebar or footer of web pages.

#9 - Topical Relevance

There are numerous ways the engines can run topical analysis to determine whether two pages (or sites) cover similar subject matter. Years ago, Google Labs featured an automatic classification tool that could predict, based on a URL, the category and sub-category of virtually any type of content (from medical to real estate, marketing, sports, and dozens more). It's possible that engines use these automated topical-classification systems to identify "neighbourhoods" around particular topics and count links for more or less based on this attribute.

#10 - Content & Context Assessment

Though topical relevance can provide useful information for engines about linking relationships, it's possible that the content and context of a link may be even more useful in determining the value it should pass from the source to the target. In content/context analysis, the engines attempt to discern, in a machine parse-able way, why a link exists on a page.

#11 - Geographic Location

The geography of a link is highly dependent on the perceived location of its host, but the engines, particularly Google, have been getting increasingly sophisticated about employing data points to pinpoint the location-relevance of a root domain, subdomain or subfolder. These can include:

  • The host IP address location
  • The country-code TLD extension (.de, .co.uk, etc)
  • The language of the content
  • Registration with local search systems and/or regional directories
  • Association with a physical address
  • The geographic location of links to that site/section

Earning links from a page/site targeted to a particular region may help that page (or your entire site) perform better in that region's searches, e.g. "web design las vegas". Likewise, if your link profile is strongly biased toward a particular region, it may be difficult to appear prominently in another, even if other location-identifying data is present (such as hosting IP address, domain extension, etc).

#12 - Use of Rel="Nofollow"

Although in the SEO world it feels like a lifetime ago since nofollow appeared, it has actually only been around since January of 2005, when Google announced it was adopting support for the new link attribute. Very simply, rel="nofollow", when attached to a link, tells the engines not to ascribe any of the editorial endorsement or "votes" that would boost a page/site's query-independent ranking metrics. Today, Linkscape's index notes that approximately 3% of all links on the web are nofollowed, and that of these, more than half are on sites using nofollow on internal rather than external-pointing links.
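As a small illustration, here is how a crawler might separate followed links from nofollowed ones, using only the Python standard library; the sample HTML is made up:

    from html.parser import HTMLParser

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.followed, self.nofollowed = [], []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            href = attrs.get("href")
            if not href:
                return
            if "nofollow" in (attrs.get("rel") or "").lower().split():
                self.nofollowed.append(href)  # passes no endorsement
            else:
                self.followed.append(href)    # counts as a "vote"

    parser = LinkParser()
    parser.feed('<a href="http://example.com/a">good</a> '
                '<a rel="nofollow" href="http://example.com/b">ad</a>')
    print(parser.followed)    # ['http://example.com/a']
    print(parser.nofollowed)  # ['http://example.com/b']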

#13 - Link Type

Links can come in a variety of formats. The big three are:

  1. Straight HTML Text Links
  2. Image Links
  3. Javascript Links

It appears that straight, HTML links with standard anchor text pass the most value, followed by image links with keyword-rich alt text and finally, Javascript links (which still aren't universally followed or considered as an endorsement).

#14 - Other Link Targets on the Source Page

When a page links out externally, both the quantity and targets of the other links that exist on that page may be taken into account by the engines when determining how much link juice should pass.

"PageRank"-like algorithms from all the engines divide the amount of juice passed by any given page by the number of links on that page. In addition to this metric, the engines may also consider the quantity of external domains a page points to as a way to judge the quality and value of those endorsements. If, for example, a page links to only a few external resources on a particular topic, spread out amongst the content, that may be perceived differently than a long list of links pointing to many different external sites. One is not necessarily better or worse than the other, but it's possible the engines may pass greater endorsement through one model than another (and could use a system like this to devalue the links sent from what they perceive to be low-value-add directories).

#15 - Domain, Page & Link-Specific Penalties

Search engines apply penalties to sites and pages, ranging from the loss of the ability to pass link juice/endorsement all the way up to a full ban from their indices. If a page or site has lost its ability to pass link endorsements, acquiring links from it provides no algorithmic value for search rankings. Be aware that the engines sometimes show penalties publicly (inability to rank for obvious title/URL matches, lowered PageRank scores, etc.) but keep these signals inconsistent, so systematic manipulators can't acquire solid data points about who gets "hit" and who does not.

#16 - Content/Embed Patterns

As content licensing & distribution, widgets, badges and distributed, embeddable links-in-content become more prevalent across the web, the engines have begun looking for ways to avoid becoming inundated by these tactics. It is likely that content pattern detection and link pattern detection plays a role in how the engines evaluate link diversity and quality. If the search engines see, for example, the same piece of content with the same link across thousands of sites, that may not signal the same level of endorsement that a diversity of unique link types and surrounding content would provide.

#17 - Temporal / Historical Data

Timing and data about the appearance of links is the final point on this checklist. As the engines crawl the web and see patterns about how new sites, new pages and old stalwarts earn links, they can use this data to help fight spam, identify authority and relevance and even deliver greater freshness for pages that are rising quickly in link acquisition.

Any good SEO campaign must include a long-term link-building strategy based on your monthly budget. Although there is no one-size-fits-all approach, don't expect to get the same results from directory submissions as you would from well-placed, context-based, theme-related links from high-PR pages on multiple domains. An experienced search engine expert is capable of evaluating the cost/benefit trade-offs based on your competitive landscape.

By: Nick the SEO guy

Meta Data - Do I really need it? Yes, No, Yes, No...

OK, this is highly debatable.

So I will say, just like everything else, that this is only my opinion. I can only go on how I interpret what I have seen.

We all know that you need the trinity to rank:

  • Great Content
  • Focused Meta Data
  • Links, Links, Links (not all links are the same)

You just NEED all three.

When it comes to Meta Data, I am a fanatic. There are so many variations of how to write it. There is the obvious approach where you put the keyword term first, but what goes after that? How long should it be? Total length, keyword density, and keyword prominence can all be factors. Can you squeeze another keyword in there?
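As a toy illustration, here is a Python sketch that sanity-checks a meta title against those factors. The 65-character limit and the checks themselves are my own illustrative assumptions, not published rules:

    def check_title(title, keyword):
        words = title.lower().split()
        kw_words = keyword.lower().split()
        density = sum(w in kw_words for w in words) / len(words)
        return {
            "length_ok": len(title) <= 65,   # rough display-length guideline
            "keyword_first": title.lower().startswith(keyword.lower()),
            "density": round(density, 2),
        }

    print(check_title("Golf Shoes | Chicago Ladies Golf Store", "golf shoes"))
    # {'length_ok': True, 'keyword_first': True, 'density': 0.43}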

I always suggest using Meta Data; however, is it necessary? NO.
Is it necessary in order to rank highly? Also NO.

"So wait, I do not get it, you just said you need it..What is wrong with you?"

Well, first let's look at Meta Data.
META DATA, in any medium, is just a guide to what you are going to see, hear, or experience. It is the pamphlet you get when you attend a concert, graduation, or other event. It tells you what you are going to see and how it is organized. However, it is not the actual experience or the 'content of the event.' That experience is like the text and/or content of your website. That is why content is king. They come to see the actual event, not the pamphlet.

Is the pamphlet required for you to understand the event? NO
Is the Meta Data required by the search engines to understand your website? Also NO

"OK enough with the analogy I get it"

Good. The reason I like Meta Data is that you are telling Google, Yahoo!, and Bing exactly what you want them to look for. You are telling them that you feel you are most relevant for this keyword or that keyword. And most of the time they listen, unless your content is simply irrelevant.

Now, if your content is long, thorough, organized, informative, and authoritative, then the search engines will easily understand your focus based on your content alone. They will not need the Meta Data. Furthermore, they are not restricted to, or focused on, one keyword of your choice. They may index that one page for numerous keywords! Also, for every different keyword indexed, your title and description in the results will differ based on what the search engine thought you were trying to show.

Sometimes having no meta data can be more powerful; however, unless you are experienced enough to write your content correctly, having meta data will be more productive for you.

by Marketsite Pro

Thursday, November 5, 2009

Just let go and change it. URLs that are already ranked, but not enough

If some URLs on your website are listed on a Search Engine, then that is great.

However, what if you are stuck on page 3 (position 27) of Google and you feel that you have peaked? The page does not want to break past page 3. You go over your SEO plans and realize that your URL is not fully optimized! What do you do?

Example:
You fix eyeglasses. Your site is, let's say, fixeg3.com (which, by the way, would be a horrible domain for an SEO specialist to work with).

One of your internal pages will focus on fixing hand crafted frames, and the keyword you are focusing on is 'hand crafted frames repair.' The current URL string for that page is: fixeg3.com/hcf-repairs.htm

AGAIN, you are on the 3rd page of GOOGLE. So close! You have already spent so much money on backlinks, your META Data is on point, and your content is good. I say keep working on those three things, as they are the meat and potatoes of your SEO campaign (review). However, everything should be optimized, including your URL string.

so......

Go for it. Bite the bullet and make the change. Most likely, you do not have backlinks pointing to your internal pages (if you do, I will go over that in a future post). But you have to do it if you want a better future for your website's ranking.

Look at the URL of one of Nick's recent posts:
http://marketsitepro.blogspot.com/2009/11/what-really-matters.html

'/what-really-matters.html'

There is a reason why Google does this, and you NEED to do the same.
So in our example, an optimized version might look like this: fixeg3.com/hand-crafted-frames-repair.htm

What happens now? What do I do with the old page? What will happen with my ranking?

The good news is that the page was already ranked on Google, and you can use that momentum. Create the new page and then delete the old one. Next, set up a 301 redirect from the old URL to the new page (email us or look for it in a future post if you do not know how to do this). This will provide a smooth transition and allow Google to come in and re-index the new page.
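To show the mechanics, here is a minimal Python sketch of a 301 using only the standard library, with the old and new paths from our example. On a typical Apache host you would do this in an .htaccess file instead; this just illustrates the permanent redirect itself:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    OLD = "/hcf-repairs.htm"
    NEW = "/hand-crafted-frames-repair.htm"

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == OLD:
                self.send_response(301)            # 301 = moved permanently
                self.send_header("Location", NEW)  # where the page lives now
                self.end_headers()
            else:
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"new page content")

    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()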

Lastly, you had better make sure the URL is exactly the way you want it this time, because once it is optimized to your satisfaction, you need to start sending backlinks to that specific page as if it were a whole new site.

by: (\/)arketsite Pro



Monday, November 2, 2009

What Really Matters?

I am asked all the time: what really matters to get ranked number "one" in Google for a key term?

I still see a lot of controversy between "Backlinks" and "Content". Why? Because one SEO expert will explain how they added thousands of backlinks and got highly ranked, while another expert will tell you how they added tons of content pages and got highly ranked. So what is the truth?

Although only Google knows the truth, it is primarily a function of how competitive the keyword is. To illustrate the point: if yours were the only website, you would rank number one. Try this: type any 7 random letters into Google search, and you will usually get fewer than 10 search results (you probably thought you would get none). With so few competitors, don't you think you could easily rank higher for that same keyword than any of those listed, with little effort?

Now, if you have competitors, Google must evaluate how you stack up against them. The stronger they are, the "more" you must do to compete. What is that "more"? There is good news and bad news. The bad news is, there are hundreds of factors. The good news: there are hundreds of factors. Why is this both good and bad?

Bad, because at first you might be overwhelmed by the amount of work it will take to compete on so many factors; but the good news is, you can play to your strengths. For example, if you have a lot of great content, this can make up for a lack of links. There are many sites that don't have good content and make up for it by working hard to create a lot of backlinks.

As time goes on, older sites that have been doing this for a long time gain an overwhelming advantage because of their continuing incremental improvements. A newcomer will take a look at what they are up against and might become thoroughly discouraged, since the length of time and/or the amount of effort (money) can look staggering.

What are the alternatives?
  • Paid advertising, e.g. PPC or other online ads, or
  • Look for less competitive keywords
If you plan to be in business for the long haul, then start now. Invest what you can, but don't look for quick results, and use an expert until you can get up to speed; in the meantime, ask how you can assist your SEO expert, e.g. by writing articles or making blog comments.

Oh, and what really matters? This is what really matters:

Major:
  • Title tag is the most important aspect of "on page" SEO
  • Anchoring text is the only way to rank high for keywords
  • Masses of links will assist you in getting top rankings
  • Directories work: Use an outside service while simultaneously working on your other tasks
Minor:
  • Heading tags are overrated, however keep them in mind when targeting less competitive keywords
  • Keyword density, as long as you meet the minimums of 1 to 2 percent.
Note:
Getting a good rank is a mixture of domain age, link popularity, and optimized anchor links.
Keep in mind that when shooting for the number one SERP, everything matters, even the small things, unless you are number one in backlinks or number one in content pages, in which case little else matters.

By: Nick the SEO guy

Wednesday, October 28, 2009

SEO Automation

There are plenty of SEO software applications, from site evaluation software to article generation and submission software. Does this mean we are covered? Existing software will of course continue to improve, but are there still areas that could be automated?

What happens when websites get bigger, or more complex? Are new tools required to manage the content, not from a design approach (such as Adobe Photoshop), but rather from an SEO approach?

For example, it is one thing when you have content covering 5 or 10 keywords, but what about the SEO strategy of targeting "long tail" keywords when the more competitive keywords are not attainable? In that case, there could be several hundred or more keywords. How is all this created, managed, and modified, when what is required is a well-organized website linked in a clean, well-integrated structure?

Now, more than ever, there is a need not only to blog on your area of expertise, but also to make blog comments, both to show your authority on a subject and to get links back to your site. Doing this can be time consuming, especially if you are trying to find good sites worthy of your comments, and hard to track as the number of blogging sites grows. Most SEO experts seem to track this in a spreadsheet or a Word document. I suppose this is fine if there is nothing better, but it seems like a perfect database application.
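As a rough sketch of that database idea, here is what a minimal SQLite table for tracking blog comments might look like in Python; the schema and field names are illustrative assumptions:

    import sqlite3

    conn = sqlite3.connect("linkbacks.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS blog_comments (
            id INTEGER PRIMARY KEY,
            blog_url TEXT NOT NULL,     -- where the comment was left
            anchor_text TEXT,           -- the link text used
            target_page TEXT,           -- the page on our site it points to
            date_posted TEXT,
            approved INTEGER DEFAULT 0  -- did the comment go live?
        )
    """)
    conn.execute(
        "INSERT INTO blog_comments (blog_url, anchor_text, target_page, date_posted)"
        " VALUES (?, ?, ?, ?)",
        ("http://example-seo-blog.com", "golf shoes", "/golf/shoes.html", "2009-10-28"),
    )
    conn.commit()

    # Which comments are still awaiting approval?
    for (url,) in conn.execute("SELECT blog_url FROM blog_comments WHERE approved = 0"):
        print(url)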

I am sure plenty of software tools can be developed to help the serious SEO expert automate these tasks, but of course one question must always be asked:
will the automation be valuable enough to charge enough to cover the time it takes to develop the product? What are your thoughts?

By: Nick the SEO guy

Sunday, October 25, 2009

What Counts the Most?

We have seen that what you place in the title bar counts for a lot. There are two common approaches to the title bar.

1) Put two keywords (or keyword phrases) separated by a pipe symbol (|), plus the business name, or

2) Your category of business, plus business name, plus keyword.
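As a quick illustration, here is a tiny Python sketch of the two patterns, using a made-up business name and keywords:

    # Approach 1: keyword | keyword | business name
    def title_approach_1(kw1, kw2, business):
        return kw1 + " | " + kw2 + " | " + business

    # Approach 2: business category, business name, keyword
    def title_approach_2(category, business, kw):
        return category + " - " + business + " - " + kw

    print(title_approach_1("Golf Shoes", "Ladies Golf Apparel", "Chicago Golf Shop"))
    print(title_approach_2("Golf Retailer", "Chicago Golf Shop", "ladies golf shoes"))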

But let's assume the first 10 results in the SERP all follow approach one above. What then? How will Google differentiate between them? By content? By backlinks? Or by the hundreds of other parameters?

Everyone knows, or should know by now, what the most important factors are, especially if you are in that select group that wants to be ranked on the first page. Just doing a few things correctly that your competitors don't do can make all the difference in the world. Of course, if you have ten times better content, or ten times better backlinks, the little things may not matter much. This is because not all parameters are equally weighted.

One of the little things that can make a difference, for example, is the organization of your content. I call this the power of the pyramid: less common keyword phrases are used to support more common keywords, like the building blocks of a pyramid. There are three common errors when creating content:

  1. Building content for highly competitive keywords, without using less competitive keywords in a supporting role, or the opposite,
  2. Creating content for long tail keywords, and ignoring the highly competitive keywords, when you now have the supporting keywords that can make a difference, and
  3. Not organizing your content in a supporting role, by using folders and files the search engines can easily drill down into.

I have heard that some highly ranked sites with lots of content don't even use their keywords in their content pages. This may work if you have lots of content, but that may not be where you are right now, so you should always prioritize your activities: work on the most important things first, then work your way down your priority list.

Also, you do not have to finish one task before starting another. In other words, if your goal is to have a lot of good content, that doesn't mean you must finish this task before starting to work on backlinks. On the other hand, when doing content pages, it is good to think through the organization of your content before you start. Having organized pages is better than having one long list of blog pages if what you want is to rank highly on multiple keywords.

By: Nick the SEO guy

Friday, October 23, 2009

SERP Factors: Best Guess by Experts

No one knows for sure what factors influence Search Engine Rank Positions. It is surprising we don't know more from ex-employees. Probably because it has become so complex, it is a little like the Internal Revenue Code: no one person could possibly know all the code (play on words).

What is possible is to get some idea of the relative importance of different factors by asking the opinions of as many SEO experts as possible. Impossible, you say? Well, it has been done. Check it out at: http://www.seomoz.org .

Even now, I see blogs arguing over which is more important: content or linking.

Rather than debate this, a better question to ask yourself is whether your website is "authoritative" for the keyword you want to rank highly on.

Within the privacy of Google, I am sure they weigh the "value" of everything they do against one notion: will this help us select the most authoritative website for that keyword? If that means a website should have relevant content, then they measure it. If it means a website should have links from other related websites, then they measure that too.

So if you don't know anything about SEO, be guided by this question: "Does what I am doing, or about to do, help make my website the authoritative website for my keywords?"

If you need more insight, then seek out an expert, or do a lot of research at your local library or on the internet.

By: Nick the SEO guy

Thursday, October 22, 2009

What is my SERP?

Recently, we started a campaign to move a client higher in Google's search results.

When we started, his Google position was 245 for keyword 1. Not a very enviable position at all. We started on a mixture of activities, from redefining his page title to getting links back to his site. At the same time, we started monitoring his SERP (Search Engine Rank Position).

After one week, he had already moved up to position 102, which we thought was a remarkable advancement in such a short period of time. A week after that he moved to position 75, so we now knew we were on the right track, but would he drop back down? By checking his backlinks with Yahoo, we could now see some of the linkbacks starting to kick in.

After the third week and a 10% increase in backlinks, he moved up to position 52. Nice. But what was very strange was that when we looked at Google Webmaster Tools for the same keyword 1, it listed him at position 9! When you actually did a search, he still showed up in position 52, and when we used other SERP tools, he came up in position 52. So he was really in position 52, right? Wrong! A few days later, when we did an actual search for his keyword 1, he was in position 2! WOW. Was this right? Wrong! It turns out I was logged into Google Webmaster Tools. For some strange reason, if you are logged into Google Webmaster Tools, you will get a false reading when doing searches. After logging out, my SERP results yielded a position of 49.

So what gives? Why the different SERP results?

Google Webmaster Tools: 9
Actual search results: 49
SERP tools: 52
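For repeatable checks, you could script a rough rank lookup like the sketch below. Be warned: this is fragile and unofficial, the URL parameters and the crude link-counting are assumptions, and as we just saw, the result will still vary with personalization and data center:

    import urllib.parse
    import urllib.request

    def serp_position(keyword, domain, num_results=100):
        query = urllib.parse.urlencode({"q": keyword, "num": num_results})
        req = urllib.request.Request(
            "http://www.google.com/search?" + query,
            headers={"User-Agent": "Mozilla/5.0"},  # plain scripts get blocked
        )
        html = urllib.request.urlopen(req).read().decode("utf-8", "ignore")
        position = 0
        for chunk in html.split('<a href="')[1:]:  # crude: order of result links
            url = chunk.split('"', 1)[0]
            if url.startswith("http"):
                position += 1
                if domain in url:
                    return position
        return None  # not in the first num_results

    print(serp_position("hand crafted frames repair", "fixeg3.com"))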

Is Google forecasting future results? Or is that just my wishful thinking?

By: Nick the SEO guy

Tuesday, October 20, 2009

Google Is No.1, and Should Always Be No. 1

Should there be only one search engine?

Have search engines become like utilities, e.g. cable, water, or electric, where we end up with one large search engine company? From an advertiser's point of view, it would be great. The simplicity of it all.

Other advantages would be the ability to get all the available traffic for low-volume keywords from just one source. In addition, all your PPC could be managed in one place.

The disadvantages would be all the problems associated with a monopoly, e.g. higher costs, lower innovation, government regulation, and the potential for abuse of power. What abuse of power, you say? For example, G could start forcing the use of its other applications under the threat of lower SERPs, or cause other unforeseen problems. You say this scenario is unlikely, so why am I raising it as an issue?

Yahoo and MSN, the only other potential competitors to G, are so far behind that it seems like we are almost there. How can the consumer shift this power, when most websites run ads by Google?

By: Nick the SEO guy

How do I rank high on the search engines?


Potential Clients ask me 'How do I rank #1 on Google?'


Most 'SEO companies/individuals' probably feel the same way I do. How do you answer a question so general, when the answer depends on so many variables?

That is like asking a friend, who is a mechanic, "How do I change the oil in my car?" It is something that requires a lot of detailed explanation. However, I have come up with a very basic analogy of what is needed to gain ranking.
(I will get into further details on future posts)

So, there are 3 basic areas:
1) Content
2) Meta Data
3) Link Popularity

[Before I go any further, I would like to note that no one can guarantee a #1 spot on Google. Google uses a complex algorithm to determine your placement, and no one really knows the exact formula; they can only speculate. However, experienced SEO companies have a general idea of what works well enough to move up high.]

I relate Search Engine Placement to applying to Universities.
There are no guarantees that you will be accepted.

Content - Content is king. This is like the essay part of your application. In your own words, you are providing them information about who you are. It should be original, authoritative, and show quality.

As with your application, your site should not:
- Have duplicate content
- Violate copyright
- Be 'stealthy' with your writing. Write honestly, not what you think they 'want to hear.'

Meta Data - This is code written on your site, which can be related to your grades, extracurricular activities, etc.

Link Popularity - When someone puts a link on their website that points to yours, that is a backlink. It is similar to getting a reference on your university application; the more references, the better. Also, the source of the reference matters.

Those three things are the basic building blocks of a successful organic/natural marketing campaign. Some SEO specialists argue that content is all you need to focus on. Others are convinced that link popularity is the answer.

Bottom line: you need all three.