Thursday, November 12, 2009

I Want To Add 100 Content Pages. How Should They Be Organized?

I am sure you have heard: Content Is King!

What exactly does that mean? Search engines like to rank authoritative websites highly. What is an authoritative website? It is a website full of useful information related to the site topic (primary keyword).

The best way to do this is to create supporting content for your main (primary) keyword. Suppose your keyword is "golf". Even if you wrote a book's worth of information related to golf, you will probably not reach the number one position unless it is extremely well organized, with all supporting keywords in a hierarchical structure. Addressing all the supporting keywords will result in hundreds of pages, and many large companies track thousands of keywords. How should these pages be organized?

The more organized the content pages, the more likely these pages will be indexed and provide juice to your main keyword. So is there any simple methodology for creating such a power structure? Yes, there are several methods. One method is to create an "outline" of main topics and subtopics. After you have created the outline, you can create folders based on the topics and subtopics, and assign each keyword (content page) to one of the folders. For each content page to be effective, its meta tags, e.g. the meta title, must be unique, or the value of creating a unique page is defeated.
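To make method one concrete, here is a minimal sketch in Python. The outline contents and the title pattern are hypothetical, purely for illustration; the point is that every page lands in a topic folder and gets its own unique meta title.

    # A toy version of method one: map a hand-written outline of topics
    # and subtopics to folders, giving every page a unique meta title.
    outline = {
        "golf-clubs": ["drivers", "putters"],
        "golf-shoes": ["mens", "ladies"],
    }

    for topic, subtopics in outline.items():
        for subtopic in subtopics:
            path = f"/{topic}/{subtopic}/index.html"
            # one unique meta title per page, so no two pages collide
            meta_title = f"{subtopic.title()} {topic.replace('-', ' ').title()}"
            print(path, "->", meta_title)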

Another method is to create folders directly from a list of keywords. Let's say you have 100 keywords. Some keywords may be one word, others two or three words, and so on. Top-level folders are created from the first word of each keyword; sub-folders are created from the first and second words of a keyword phrase, and so on. This produces an ordered folder/file structure that is not only well organized around keywords but, more importantly, can be generated automatically. An example folder/file structure for the keyword "chicago ladies golf shoes" would be: www.domain-name.com/chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html.
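Because the rule is purely mechanical, it is easy to automate. Here is a minimal sketch in Python, assuming lowercase keyword phrases with no punctuation (the function name is made up):

    # A toy version of method two: one folder level per word in the
    # keyword phrase, with the hyphenated phrase as the file name.
    def keyword_to_path(keyword):
        words = keyword.lower().split()
        folders = "/".join(words)               # chicago/ladies/golf/shoes
        filename = "-".join(words) + ".html"    # chicago-ladies-golf-shoes.html
        return f"/{folders}/{filename}"

    print(keyword_to_path("chicago ladies golf shoes"))
    # -> /chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html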

Which method is better? From a search engine's perspective they may produce equal results, but the second method is straightforward and can be done by anyone. In addition, it can be automated, as Web Coast Design has done.

Why do most companies resist creating many content pages? Is it because it is hard to write them? I don't believe so. The real problem is all the little things that must be put in place, e.g. the meta titles, meta keywords, meta descriptions, menus, and cross links, and even more important is the ongoing maintenance. What happens when you want to add or delete a page and everything has to be re-published? You want a system that can quickly and automatically update all pages with the correct menus and cross links, like SEO Page Pro.

Many companies use a Content Management System (CMS) to manage page content, but these systems do not focus on keywords, and they provide no guidance in creating meta titles, the most important of all the meta tags.

Nick, the SEO guy

Tuesday, November 10, 2009

What Is The Better Type of BackLink?

Clients sometimes ask: why are some websites ranked higher with fewer backlinks? Although there are many reasons why this can happen, it is also the case that not all backlinks are of equal value.

In a post from September 10th, 2009 (filed under Link Building), randfish wrote a thorough explanation of the many ways search engines evaluate the value of links. Here is a summary of his points. For a more detailed discussion, visit his blog, where he has several good illustrations and graphs.

First, it is good to understand that link value exists at both the page level and the domain level.

Search engines have become more and more dependent on metrics for the entire domain, rather than just an individual page. That is why you will see new pages, or pages with very few links, ranking highly simply because they are on an important, trusted, well-linked-to domain. This is sometimes referred to as "domain authority" and is viewed as the single largest factor.

#1 - Internal vs. External

When search engines first began valuing links as a way to determine the popularity, importance and relevance of a document, they applied the classic citation-based rule that what others say about you is far more important (and trustworthy) than what you say about yourself. Thus, while internal links (links that point from one page on your site to another) do carry some weight, links from external sites matter far more.

This doesn't mean it's not important to have a good internal link structure, or to do all that you can with your internal links (good anchor text, no unnecessary links, etc.); it just means that a site/page's performance is highly dependent on how other sites on the web have cited it.

As a matter of fact, one of the most overlooked SEO activities is creating context-based internal linking, which is something you can do yourself.

#2 - Anchor Text

"exact match" anchor text is more beneficial than simply inclusion of the target keywords in an anchor text phrase.

#3 - PageRank

Whether they call it StaticRank (Microsoft's metric), WebRank (Yahoo!'s), PageRank (Google's) or mozRank (Linkscape's), some form of an iterative, Markov-chain based link analysis algorithm is a part of all the engines' ranking systems. PageRank and its cousins use the analogy that links are votes, and that pages with more votes have more influence with the votes they cast.

The nuances of PageRank are well covered in The Professional's Guide to PageRank Optimization. A general understanding is required if you want to be effective:

  1. Every URL is assigned a tiny, innate quantity of PageRank
  2. If there are "n" links on a page, each link passes that page's PageRank divided by "n" (and thus, the more links, the lower the amount of PageRank each one flows)
  3. An iterative calculation that flows juice through the web's entire link graph dozens of times is used to reach the calculations for each URL's ranking score
  4. Representations like those shown in Google's toolbar PageRank or SEOmoz's mozRank on a 0-10 scale are logarithmic (thus, a PageRank/mozRank 4 has 8-10X the link importance of a PR/mR 3)
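To see points 1 through 3 in action, here is a toy PageRank calculation in Python. The four-page link graph and the 0.85 damping factor are illustrative assumptions, not real search engine data:

    # Iterative PageRank over a tiny, made-up link graph.
    damping = 0.85
    links = {                 # page -> pages it links to
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
        "D": ["C"],
    }
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}    # innate starting quantity

    for _ in range(50):                            # iterate until stable
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)   # rank divided by "n"
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank

    for page in sorted(rank, key=rank.get, reverse=True):
        print(page, round(rank[page], 4))

Page C ends up on top because the most rank flows into it, even though every page started with the same amount.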

#4 - TrustRank

The basics of TrustRank are described in a paper from Stanford, "Combating Web Spam with TrustRank". The basic tenet of TrustRank is that the web's "good" and "trustworthy" pages tend to be closely linked together, and that spam is much more pervasive outside this "center." Thus, by calculating an iterative, PageRank-like metric that only flows juice from trusted seed sources, a metric like TrustRank can predictively state whether a site/page is likely to be high quality vs. spam. Linkscape uses this intuition to build mozTrust (mT) and Domain mozTrust (DmT). The takeaway: get links from high-trust sites and don't link to potential spam.
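The same iterative loop can sketch the TrustRank intuition: the reset portion of the calculation flows only to hand-picked trusted seeds. The graph and seed set below are made up; notice that the spam cluster, unreachable from the seed, ends up with essentially zero trust:

    # A toy TrustRank-style calculation: trust teleports only to seeds.
    damping = 0.85
    links = {
        "trusted-hub": ["site-a", "site-b"],
        "site-a": ["site-b"],
        "site-b": ["trusted-hub"],
        "spam-1": ["spam-2"],
        "spam-2": ["spam-1"],
    }
    seeds = {"trusted-hub"}
    trust = {p: (1.0 if p in seeds else 0.0) for p in links}

    for _ in range(50):
        new_trust = {p: ((1 - damping) if p in seeds else 0.0) for p in links}
        for page, outlinks in links.items():
            share = damping * trust[page] / len(outlinks)
            for target in outlinks:
                new_trust[target] += share
        trust = new_trust

    print({p: round(t, 4) for p, t in trust.items()})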

#5 - Domain Authority

Though the phrase "domain authority" is often discussed in the SEO world, a formal, universal definition doesn't yet exist. Most practitioners use it to describe a combination of popularity, importance and trustworthiness calculated by the search engines and based largely on link data (though some also feel the engines may use the age of the site here as well).

#6 - Diversity of Sources

No single metric has a more positive correlation with high rankings than the number of linking root domains. This appears to be both a very hard metric to manipulate for spam (particularly if you need domains of high repute with diverse link profiles of their own) and a metric that indicates true, broad popularity and importance. Having said this, it is far easier (in time and money) to get on all pages of one website than to get on one page each of many different websites. So although links from many unique domains should be your primary goal, do not overlook the potential of links from all the pages within a single domain.

#7 - Uniqueness of Source + Target

The engines have a number of ways to judge and predict ownership and relationships between websites. These can include (but are certainly not limited to):

  • A large number of shared, reciprocated links
  • Domain registration data
  • Shared hosting IP address or IP address C-blocks
  • Public acquisition/relationship information
  • Publicized marketing agreements that can be machine-read and interpreted

#8 - Location on the Page

Microsoft was the first engine to reveal public data about their plans to do "block-level" analysis (in an MS Research piece on VIPS - VIsion-based Page Segmentation). Links from the "content" of a piece are the most valuable, both for the ranking value the link passes and, fortuitously, for click-through traffic as well.

Internal links in the footer of web pages may not provide the same beneficial results as those same links would when placed in the top/header navigation. One way the engines appear to be fighting pervasive link advertising is by diminishing the value that external links carry from the sidebar or footer of web pages.

#9 - Topical Relevance

There are numerous ways the engines can run topical analysis to determine whether two pages (or sites) cover similar subject matter. Years ago, Google Labs featured an automatic classification tool that could predict, based on a URL, the category and sub-category for virtually any type of content (from medical to real estate, marketing, sports and dozens more). It's possible that engines use such automated topical-classification systems to identify "neighbourhoods" around particular topics and weight links more or less accordingly.

#10 - Content & Context Assessment

Though topical relevance can provide useful information for engines about linking relationships, it's possible that the content and context of a link may be even more useful in determining the value it should pass from the source to the target. In content/context analysis, the engines attempt to discern, in a machine parse-able way, why a link exists on a page.

#11 - Geographic Location

The geography of a link is highly dependent on the perceived location of its host, but the engines, particularly Google, have been getting increasingly sophisticated about employing data points to pinpoint the location-relevance of a root domain, subdomain or subfolder. These can include:

  • The host IP address location
  • The country-code TLD extension (.de, .co.uk, etc)
  • The language of the content
  • Registration with local search systems and/or regional directories
  • Association with a physical address
  • The geographic location of links to that site/section

Earning links from a page/site targeted to a particular region may help that page (or your entire site) to perform better in that region's searches, e.g. "web design las vegas". Likewise, if your link profile is strongly biased toward a particular region, it may be difficult to appear prominently in another, even if other location-identifying data is present (such as hosting IP address, domain extension, etc.).

#12 - Use of Rel="Nofollow"

Although in the SEO world it feels like a lifetime ago since nofollow appeared, it has actually only been around since January of 2005, when Google announced it was adopting support for the new HTML attribute. Very simply, rel="nofollow", when attached to a link, tells the engines not to ascribe any of the editorial endorsement or "votes" that would boost a page/site's query-independent ranking metrics. Today, Linkscape's index notes that approximately 3% of all links on the web are nofollowed, and that of these, more than half are on sites using nofollow on internal rather than external-pointing links.

#13 - Link Type

Links can come in a variety of formats. The big three are:

  1. Straight HTML Text Links
  2. Image Links
  3. Javascript Links

It appears that straight HTML links with standard anchor text pass the most value, followed by image links with keyword-rich alt text and, finally, Javascript links (which still aren't universally followed or considered an endorsement).
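As a minimal sketch of how #12 and #13 can be detected mechanically, the Python snippet below classifies the links in a made-up HTML fragment using only the standard library; the URLs and markup are hypothetical:

    # Classify links as text / image / Javascript and flag rel="nofollow".
    from html.parser import HTMLParser

    SAMPLE = '''
    <a href="http://example.com/">straight HTML text link</a>
    <a href="http://example.com/" rel="nofollow">nofollowed link</a>
    <a href="http://example.com/shoes"><img src="shoe.jpg" alt="ladies golf shoes"></a>
    <a href="javascript:openPage()">Javascript link</a>
    '''

    class LinkClassifier(HTMLParser):
        def __init__(self):
            super().__init__()
            self.current = None                  # the <a> being parsed

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "a":
                href = attrs.get("href", "")
                kind = "javascript" if href.startswith("javascript:") else "text"
                self.current = {"href": href, "kind": kind,
                                "nofollow": "nofollow" in (attrs.get("rel") or "")}
            elif tag == "img" and self.current:
                self.current["kind"] = "image (alt: %s)" % attrs.get("alt", "")

        def handle_endtag(self, tag):
            if tag == "a" and self.current:
                print(self.current)
                self.current = None

    LinkClassifier().feed(SAMPLE)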

#14 - Other Link Targets on the Source Page

When a page links out externally, both the quantity and targets of the other links that exist on that page may be taken into account by the engines when determining how much link juice should pass.

"PageRank"-like algorithms from all the engines divide the amount of juice passed by any given page by the number of links on that page. In addition to this metric, the engines may also consider the quantity of external domains a page points to as a way to judge the quality and value of those endorsements. If, for example, a page links to only a few external resources on a particular topic, spread out amongst the content, that may be perceived differently than a long list of links pointing to many different external sites. One is not necessarily better or worse than the other, but it's possible the engines may pass greater endorsement through one model than another (and could use a system like this to devalue the links sent from what they perceive to be low-value-add directories).

#15 - Domain, Page & Link-Specific Penalties

Search engines apply penalties to sites and pages ranging from the loss of the ability to pass link juice/endorsement all the way up to a full ban from their indices. If a page or site has lost its ability to pass link endorsements, acquiring links from it provides no algorithmic value for search rankings. Be aware that the engines sometimes show penalties publicly (inability to rank for obvious title/URL matches, lowered PageRank scores, etc.) but keep these signals inconsistent so systematic manipulators can't acquire solid data points about who gets "hit" and who does not.

#16 - Content/Embed Patterns

As content licensing and distribution, widgets, badges and distributed, embeddable links-in-content become more prevalent across the web, the engines have begun looking for ways to avoid becoming inundated by these tactics. It is likely that content-pattern and link-pattern detection play a role in how the engines evaluate link diversity and quality. If the search engines see, for example, the same piece of content with the same link across thousands of sites, that may not signal the same level of endorsement that a diversity of unique link types and surrounding content would provide.

#17 - Temporal / Historical Data

Timing and data about the appearance of links is the final point on this checklist. As the engines crawl the web and see patterns about how new sites, new pages and old stalwarts earn links, they can use this data to help fight spam, identify authority and relevance and even deliver greater freshness for pages that are rising quickly in link acquisition.

Any good SEO campaign must include a long-term backlink strategy based on your monthly budget. There is no one-size-fits-all approach: don't expect to get the same results from directory submissions as from well-placed, context-based, theme-related links from high-PR pages on multiple domains. An experienced search engine expert can evaluate the cost/benefit trade-offs based on your competitive landscape.

By: Nick the SEO guy

Meta Data - Do I really need it? Yes, No, Yes, No...

OK, this is highly debatable.

So I will say, just like everything else, that this is only my opinion. I can only go on how I interpret what I have seen.

We all know that you need the trinity to rank:

  • Great Content
  • Focused Meta Data
  • Links, Links, Links (not all links are the same)

You just NEED all three.

When it comes to Meta Data, I am a fanatic. There are so many variations of how to write it. There is the obvious rule of putting the keyword term first, but what goes after that? How long should it be? Total length, density, and prominence can all be factors. Can you stuff another keyword in there?
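As one illustration of the "keyword first" habit, here is a minimal sketch. The helper name, the separator, and the ~65-character budget are my assumptions (a commonly cited display limit, not a published rule):

    # Draft a title tag: target keyword first, qualifier after,
    # trimmed to an assumed ~65-character display budget.
    MAX_TITLE = 65

    def draft_title(keyword, qualifier):
        title = f"{keyword.title()} | {qualifier}"
        return title[:MAX_TITLE].rstrip()

    print(draft_title("hand crafted frames repair", "Fast Chicago Repairs"))
    # -> Hand Crafted Frames Repair | Fast Chicago Repairs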

I always suggest using Meta Data; however, is it necessary? NO.
Is it necessary to rank highly? Also NO.

"So wait, I do not get it, you just said you need it..What is wrong with you?"

Well, first let's look at Meta Data.
META DATA, in any medium, is just a guide to what you are going to see, hear, or experience. It is the pamphlet when you attend a concert, graduation, or other event. It tells you what you are going to see and how it is organized. However, it is not the actual experience or the 'content of the event.' That experience is like the text and/or content of your website. That is why content is king. People come to see the actual event, not the pamphlet.

Do you need the pamphlet to understand the event? NO.
Do the search engines need the meta data to understand your website? Also NO.

"OK enough with the analogy I get it"

Good. The reason I like Meta Data is that you are telling Google, Yahoo!, and Bing exactly what you want them to look for. You are telling them that you feel you are most relevant for 'this keyword or that keyword.' And most of the time they listen, unless your content is simply irrelevant.

Now, if your content is long, thorough, organized, informative, and authoritative, then the search engines will easily understand your focus from the content alone. They will not need the Meta Data. Furthermore, they are not restricted to, or focused on, one keyword of your choice. They may index that one page for numerous keywords! And for each different keyword indexed, your title and description may be shown differently, based on what the search engine thought you were trying to show.

Sometimes having no meta data can be more powerful. However, unless you are experienced enough to write your content correctly, having meta data will be more productive for you.

by Marketsite Pro

Thursday, November 5, 2009

Just let go and change it: URLs that are already ranked, but not well enough

If some URLs on your website are listed on a Search Engine, then that is great.

However, what if you are placed on page 3 (result 27) of Google and you feel that you have peaked? The page does not want to break past page 3. You go over your SEO plans and realize that your URL is not fully optimized! What do you do?

Example:
You fix eyeglasses. Your site is, let's say, fixeg3.com (which, by the way, would be a horrible domain for an SEO specialist to work with).

One of your internal pages will focus on fixing hand crafted frames, and the keyword you are focusing on is 'hand crafted frames repair.' The current URL string for that page is: fixeg3.com/hcf-repairs.htm

Again: you are on the 3rd page of Google. So close! You have already spent so much money on backlinks, your meta data is on point, and your content is good. I say keep working on those three things, as they are the meat and potatoes of your SEO campaign. However, everything should be optimized, including your URL string.

so......

Go for it. Bite the bullet and make the change. Most likely, you do not have backlinks pointing to your internal pages (if you do, I will go over that in a future post). But you have to do it if you want a better future for your website's ranking.

Look at the URL of one of Nick's recent posts:
http://marketsitepro.blogspot.com/2009/11/what-really-matters.html

'/what-really-matters.html'

There is a reason why Google does this, and you NEED to do the same.
So, in the example, an optimized version might look like this: fixeg3.com/hand-crafted-frames-repair.htm

What happens now? What do I do with the old page? What will happen with my ranking?

The good news is that the page was already ranked on Google, and you can use that momentum. Create the new page and then delete the old one. Next, set up a 301 redirect from the old URL to the new page (email us, or look for it in a future post, if you do not know how to do this). This provides a smooth transition and allows Google to come in and re-index the new page.
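As a sketch, on an Apache server the redirect can be a single line in the site's .htaccess file (this assumes mod_alias is enabled; other servers have their own equivalents):

    Redirect 301 /hcf-repairs.htm /hand-crafted-frames-repair.htm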

Lastly, make sure this time that the URL is exactly the way you want it, because once it is optimized to your satisfaction, you need to start sending backlinks to that specific page as if it were a whole new site.

by: (\/)arketsite Pro



Monday, November 2, 2009

What Really Matters?

I am asked all the time: what really matters to get ranked number one in Google for a key term?

I still see a lot of controversy between "Backlinks" and "Content". Why? Because one SEO expert will explain how they added thousands of backlinks and got highly ranked, while another expert will tell you how they added tons of content pages and got highly ranked. So what is the truth?

Although only Google knows the truth, ranking is going to be primarily a function of the competitive nature of the keyword. To illustrate this point: if yours were the only website, you would rank number one. Try this: type any 7 random letters into Google search, and you will usually get fewer than 10 search results (you probably thought you would get none). With so few competitors, don't you think you could easily rank higher for that keyword than any of those listed, with little effort?

Now, if you have competitors, Google must evaluate how you stack up against them. The stronger they are, the "more" you must do to compete. What is that "more"? There is good news and bad news. The bad news is, there are hundreds of factors. The good news is, there are hundreds of factors. Why is this both good and bad?

Bad, because at first you might be overwhelmed by the amount of work it will take to compete on so many factors. Good, because you can play to your strengths. For example, if you have a lot of great content, that can make up for a lack of links. Many sites don't have good content and make up for it by working hard to create a lot of backlinks.

As time goes on, older sites that have been doing this for a long time gain an overwhelming advantage because of their continuing incremental improvements. A newcomer will look at what they are up against and might become thoroughly discouraged, since the length of time and/or the amount of effort (money) required can look staggering.

What are the alternatives?
  • Paid advertising, e.g. online ads or PPC, or
  • Look for less competitive keywords
If you plan to be in business for the long haul, then start now. Invest what you can, but don't look for quick results. Use an expert until you can get up to speed, and in the meantime ask how you can assist your SEO expert, e.g. by writing articles or making blog comments.

Oh, and what really matters? This is what really matters:

Major:
  • Title tag is the most important aspect of "on page" SEO
  • Anchor text is the only way to rank high for keywords
  • Masses of links will assist you in getting top rankings
  • Directories work: Use an outside service while simultaneously working on your other tasks
Minor:
  • Heading tags are overrated, however keep them in mind when targeting less competitive keywords
  • Keyword density matters as long as you meet the minimum of 1 to 2 percent (a quick check is sketched after the note below).
Note:
Getting a good rank is a mixture of domain age, link popularity and optimized anchor links.
Keep in mind, when shooting for the number one SERP position, everything matters, even the small things, unless you are number one in backlinks or number one in content pages, in which case little else matters.
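For the keyword-density point in the "Minor" list above, here is a quick check, assuming density is counted as matched keyword words over total words (the 1 to 2 percent target is this post's rule of thumb, not a standard):

    # Rough keyword-density check for a block of page copy.
    def keyword_density(text, keyword):
        words = text.lower().split()
        kw = keyword.lower().split()
        hits = sum(1 for i in range(len(words) - len(kw) + 1)
                   if words[i:i + len(kw)] == kw)
        return 100.0 * hits * len(kw) / len(words) if words else 0.0

    copy = "our ladies golf shoes ship free and every pair is hand checked"
    print(round(keyword_density(copy, "golf shoes"), 1), "%")
    # real pages have far more words, so real densities run much lower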

By: Nick the SEO guy