Tuesday, January 25, 2011

How can you promote hundreds of keywords at a time?


This is a great question, because everyone talks about going after long-tail keywords, since there is less competition, but they never tell you how to do it cost-effectively.

The simple answer is that there are three ways:

1. Use Anchor Text that matches each keyword, with links pointing back to your HOME page.

2. Use Anchor Text that matches each keyword, with links pointing back to its own internal page, optimized for that keyword.

3. A mixture of 1 and 2: use multiple internal pages and assign multiple keywords (with anchor-text-based backlinks) to each page.

Which approach do you think is most effective?

There is no question that solution 2 is the most effective solution (best SERP results for each keyword).

Unfortunately, there are pros and cons to everything, and solution 2 is far more time consuming than solution 1 or 3, both in the initial setup and in the ongoing maintenance of each page, so most people trade off the cost of maintenance against effectiveness. The fallback is alternative 3, as a compromise.

SEO PAGE PRO does everything to reduce the time to implement solution 2, except writing the content for each page.

That might not seem like much at first glance, but optimizing and managing hundreds of keywords can be extremely time consuming. If you can automate everything except the content, then you only have to spend your time creating useful content for each keyword. It also does some things that are very useful but would be very complex for the average person. For example, it can create URLs based on the keyword string, by automatically creating folders based on the keyword phrase. You can even set the folder depth. Alternatively, you can group your keywords into categories; the categories create folders that are used as part of the keyword page URL address.
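To make the folder idea concrete, here is a minimal sketch of the concept (my own illustration, not SEO PAGE PRO's actual code; the function name and depth default are assumptions):

```python
import re

def keyword_to_url(domain, keyword, max_depth=3):
    """Turn a keyword phrase into a folder-based URL.

    Each word becomes a folder (up to max_depth folders deep) and the
    page itself is named after the full hyphenated phrase.
    """
    words = re.sub(r"[^a-z0-9 ]", "", keyword.lower()).split()
    folders = "/".join(words[:max_depth])
    slug = "-".join(words)
    return f"http://{domain}/{folders}/{slug}.html"

print(keyword_to_url("example.com", "los angeles criminal defense attorney", 2))
# -> http://example.com/los/angeles/los-angeles-criminal-defense-attorney.html
```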

Also, after the content is added for each keyword page, SEO PAGE PRO can search all pages against the keyword list, looking for the keywords in the content. If it finds a keyword match, it links it back to the correct keyword page, excluding its own keyword page. This might not sound so tough, but try it when you have keywords like: los angeles criminal defense attorney, criminal defense attorney, defense attorney, and attorney. The order in which this is done is very important; then add or delete a page, and you will realize how much time is required to edit all the right links.
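A sketch of the trick involved (again my own simplified illustration, not the product's code): process keywords longest first and protect each match with a placeholder token, so that 'attorney' can never be linked inside an already-linked 'los angeles criminal defense attorney', and skip the page's own keyword:

```python
import re

def autolink(content, keyword_urls, own_keyword):
    """Link the first occurrence of each keyword to its page.

    Keywords are processed longest first and swapped for placeholder
    tokens, so shorter keywords never match inside an already-linked
    longer phrase. The page's own keyword is skipped, since a page
    should not link to itself.
    """
    placeholders = {}
    for i, kw in enumerate(sorted(keyword_urls, key=len, reverse=True)):
        if kw.lower() == own_keyword.lower():
            continue
        token = f"\x00LINK{i}\x00"  # a marker that cannot occur in real text
        content, n = re.subn(re.escape(kw), token, content,
                             count=1, flags=re.IGNORECASE)
        if n:
            placeholders[token] = f'<a href="{keyword_urls[kw]}">{kw}</a>'
    for token, anchor in placeholders.items():
        content = content.replace(token, anchor)
    return content

pages = {"criminal defense attorney": "/criminal-defense-attorney.html",
         "attorney": "/attorney.html"}
print(autolink("Hire a criminal defense attorney today.", pages, "attorney"))
# -> Hire a <a href="/criminal-defense-attorney.html">criminal defense attorney</a> today.
```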

Also, all keywords are linked from page to page in the footer, so landing on any keyword page ends up linking to all keyword pages.

Although none of these features is critical on its own, each contributes to a more optimized page and website. I have noticed that when adding lots of content pages, many people will not even optimize the title tag. I can only guess that it just seems like too much time and effort.

By using a product like this (SEO PAGE PRO), you can quickly set up and publish keyword pages that are highly optimized, except for the content. Best of all, from a website point of view it is well structured, and you can keep it all organized in one place. The product will even show you when any of the keyword pages starts ranking in the top 100. This does not eliminate the need for good content or backlinks, but you can begin to focus on the pages that are either starting to rank quickly or have the most traffic potential.

A great attribute of this approach is that you don't have to take a "ONE SIZE FITS ALL" approach. I have seen companies overspend by thinking they need to purchase PR6 (pick your own number) backlinks for all their keywords. Instead, with this approach, you can tailor the number of backlinks for each keyword based on the results, or focus on other on-page activities, e.g. adding video or more content.

Many have argued that you should only focus on 5 to 10 keywords, because you cannot concentrate on 100 keywords: it will take too much of your time to optimize them all, with no assurance of results. But if the cost were low enough, "no assurance of results" would not matter, because even if only half of the 100 keywords gained traction, that would make it more than worth the effort. No doubt you should not spread yourself too thin, but what's wrong with setting it all up, if the time involved is small, and then sequencing through the keywords when you have more time?

Remember, someone is number one for every keyword. These pages do not have to link to a main site; they can also be stand-alone pages. Also, all of the links show up (even if there were 200 pages), allowing users to view all the pages instead of guessing. The links can also be put into categories or separated alphabetically. They do not redirect, nor are they meant to be doorway pages. They are just additional content, based on your keywords, that allows you to add value to your site, or they can stand as a site by themselves.

The problem with restricting yourself to 5 to 10 keywords is getting the right 5 to 10. Sometimes you cannot get above position 3 no matter what you try.

Nowadays, with sponsored results on top, maps, sidebar PPC, videos, news, and local preference all mixed in, if you are not number 1, you might as well be on another planet. Also, it is pretty well known that you will get better conversion with long-tail keywords, but it is equally true that long tails have much lower traffic. Most people focus on absolute traffic numbers instead of traffic that will convert. If you restrict yourself to 5 or 10 keywords, you will miss out on most of the long-tail traffic, and you will spend a lot more money on general keywords that do not convert very well. This is equally true for PPC AdWords.

Oh, I forgot: it also REALLY helps with the Google Quality Score on CPC, because you have individual pages with meta and page titles that match the keyword you are paying for. This can save a lot of money compared with running CPC to one page using many different keywords.

Sunday, December 13, 2009

What's Better: PPC or SEO?

Who in their right mind would not want to be number one in organic search?

The beauty of the search business for Google is that there can only be one number "one". The rest of us have to pay. For all those who cannot be number one, there is still hope, e.g. PPC. When some experts say to use PPC only, it reminds me of all those people who will tell you why it is not all that great to be rich, yet I have never seen a rich person who wanted to be poor. The best people to ask would be the ones who are already number one in organic search.

The ease and simplicity of PPC makes it perfect for all those who are not number one. What a great business for the search engine companies.

Organic search is the engine that drives ALL the PPC business. Without it, there would be no PPC business. Could anyone imagine a search engine that presented only PPC? Would anyone want to use such a product? Has anyone ever built a business plan around this concept, never mind attempted to market such a product? If you ask me, SEO should always be the first thing you do, but search long and hard for a great SEO company to hire, unless you have the time and money to invest in in-house expertise.

To be fair, here are some important advantages for PPC:

- Gives immediate online presence
  - Have a new site? Have ads in an hour
  - Start getting ROI sooner
  - No ramp-up time
  - Great for seasonal items or time-sensitive promotions
- Great for testing
  - Easily test the effectiveness of a new marketing message or site design change
  - Quickly gather feedback
- Regulate traffic volume
  - Sales pipeline empty? Use PPC to push traffic
  - Overloaded? Pause campaigns or cut back spend
  - Have a limited sales season? Saturate the market while demand is high

Some experts believe that although PPC gets only 10% of clicks, it accounts for 90% of the spend.

Since there is no clear-cut consensus, many believe the most prudent approach is to do both, throttling one back relative to the other based on your current results.

By: Nick the SEO guy

Can You "Rank" in Google if Everyone Has Different Search Results?

Personal Searches

Google has extended its personalized search functionality to users who are not even signed in. This applies to Google users around the world, in over 40 languages. What it means is that when you search with Google, it will provide results aimed at higher relevancy for you as an individual user, as opposed to relevancy for the average person.

Although Google would like to increase sales and profits, it also has to focus on its customers. It is hard to see how 1 or 2 websites out of thousands being ranked higher on a keyword would have any negative impact on people searching. I doubt the average person will switch to another search engine as a result of this change, and I am sure Google conducted focus group research prior to making such an important move. As a result, other search engines are more likely to follow than to buck the trend.

On the other hand, this can have a huge impact on SEO experts who are trying to get a site highly ranked if they cannot influence the outcome. I am putting anyone who is doing their own SEO work in the same category. If, as a side consequence, this forces more companies to spend more on PPC to get the same amount of traffic, well, all the better for Google.

"This addition enables us to customize search results for you based upon 180 days of search activity linked to an anonymous cookie in your browser," Google says. "It's completely separate from your Google Account and Web History (which are only available to signed-in users). You'll know when we customize results because a 'View customizations' link will appear on the top right of the search results page. Clicking the link will let you see how we've customized your results and also let you turn off this type of customization."

If you're worried about privacy, Google lets you turn personalized search off altogether. For signed-in users, all you have to do is remove Web History from your Google account. For signed-out users, click "web history" in the top right corner of a search results page, then click "disable customizations." You can also just clear your browser's cookies.

There are ways to counter Google's change if you are worried about your organic ranking, but these techniques require a larger investment in organic SEO, making the higher cost of PPC more acceptable. Most SEO experts will keep their techniques secret, so they can use them for their clients only.

By: Nick the SEO guy

Thursday, November 12, 2009

I Want To Add 100 Content Pages. How Should They Be Organized?

I am sure you have heard: Content Is King!

What exactly does that mean? Most search engines like to rank authoritative websites highly. What is an authoritative website? It is a website full of useful information related to the site topic (primary keyword).

The best way to do this is to create content that supports your main (primary) keyword. Let's suppose your keyword is "golf". Even if you wrote a book's worth of information related to golf, you will probably not reach the number one position unless it is extremely well organized, with all supporting keywords in a hierarchical structure. If you address all the supporting keywords, it will result in hundreds of pages, and many large companies track thousands of keywords. How should these pages be organized?

The more organized the content pages, the more likely these pages will be indexed and provide juice to your main keyword. So is there a simple methodology for creating a powerful structure? Yes, there are several. One method is to create an "outline" of main topics and subtopics. After you have created the outline, you can create folders based on the topics and subtopics, and each keyword (content page) can be assigned to one of the folders. For each content page to be effective, its meta tags, e.g. the meta title, must be unique, or the value of creating a unique page is defeated.

Another method is to create folders based on the keyword list itself. Let's say you have 100 keywords. Some keywords may be one word, some two words, some three words, and so on. Folders are created from the first word of each keyword; sub-folders are created from the first and second words of the keyword phrase, and so on. This creates an ordered folder/file structure that is not only well organized around keywords but, more importantly, can be built automatically. An example folder/file structure for the keyword "chicago ladies golf shoes" would be: www.domain-name.com/chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html.
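As a rough illustration of how simple this is to automate (a hypothetical script, not any particular product's code), one short function can create the nested folders and drop a placeholder page with a unique title at the bottom:

```python
import os
import re

def create_keyword_page(root, keyword):
    """Create one nested folder per word, then write the keyword page."""
    words = re.sub(r"[^a-z0-9 ]", "", keyword.lower()).split()
    folder = os.path.join(root, *words)
    os.makedirs(folder, exist_ok=True)
    slug = "-".join(words)
    path = os.path.join(folder, slug + ".html")
    with open(path, "w") as f:
        # a unique title per page, per the point about meta tags above
        f.write(f"<html><head><title>{keyword.title()}</title></head>\n")
        f.write(f"<body><!-- content for '{keyword}' goes here --></body></html>\n")
    return path

print(create_keyword_page("site", "chicago ladies golf shoes"))
# -> site/chicago/ladies/golf/shoes/chicago-ladies-golf-shoes.html (on Unix)
```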

Which method is better? From a search engine's point of view, they may produce equal results, but the second method is straightforward and can be done by anyone. In addition, it can be automated, which has been done by Web Coast Design.

Why do most companies resist creating many content pages? Is it because the content pages are hard to write? I don't believe so. The real problem is all the little things that must be put in place, e.g. the meta titles, meta keywords, meta descriptions, menus, and cross links, but even more important is the ongoing maintenance. What happens when you want to update the site, e.g. add or delete a page, and everything has to be re-published? You want a system that can quickly and automatically update all pages with the correct menus and cross links, like SEO Page Pro.
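As a toy example of the kind of automation that solves this (hypothetical code, not SEO Page Pro itself), the footer link block for every page can be regenerated from the current keyword list, so adding or deleting a page is one change to the list followed by a republish:

```python
def footer_html(all_pages, current_slug):
    """Build one page's footer block: links to every keyword page
    except the page itself."""
    links = [f'<a href="/{slug}.html">{kw}</a>'
             for kw, slug in sorted(all_pages.items())
             if slug != current_slug]
    return '<div class="footer-links">' + " | ".join(links) + "</div>"

pages = {"golf shoes": "golf-shoes",
         "ladies golf shoes": "ladies-golf-shoes",
         "golf gloves": "golf-gloves"}

# regenerate every footer whenever a keyword page is added or deleted
footers = {slug: footer_html(pages, slug) for slug in pages.values()}
print(footers["golf-shoes"])
```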

Many companies use Content Management Systems (CMS) to manage page content, but they do not focus on keywords, and they provide no guidance in creating meta titles, which are the most important of all the meta tags.

Nick, the SEO guy

Tuesday, November 10, 2009

What Is The Better Type of BackLink?

Sometimes clients ask why some websites rank higher with fewer backlinks. Although there are many reasons why this can happen, it is also the case that not all backlinks are of equal value.

In a Link Building post from September 10th, 2009, randfish wrote a thorough explanation of the many ways search engines evaluate the value of links.
Here is a summary of his points. For a more detailed discussion, you should visit his blog, where he has several good illustrations and graphs.

First, it is good to understand that link value exists at both the page level and the domain level.

Search engines have become more and more dependent on entire-domain metrics, rather than just an individual page. That is why you'll see new pages, or pages with very few links, ranking highly simply because they're on an important, trusted, well-linked-to domain. This is sometimes referred to as "domain authority" and is viewed as the single largest factor.

#1 - Internal vs. External

When search engines first began valuing links as a way to determine the popularity, importance and relevance of a document, they borrowed the classic citation rule: what others say about you is far more important (and trustworthy) than what you say about yourself. Thus, while internal links (links that point from one page on your site to another) do carry some weight, links from external sites matter far more.

This doesn't mean it's not important to have a good internal link structure, or to do all that you can with your internal links (good anchor text, no unnecessary links, etc.), it just means that a site/page's performance is highly dependent on how other sites on the web have cited it.

As a matter of fact, one of the most overlooked SEO activities is creating context-based internal linking, which is an activity you can do yourself.

#2 - Anchor Text

"exact match" anchor text is more beneficial than simply inclusion of the target keywords in an anchor text phrase.

#3 - PageRank

Whether they call it StaticRank (Microsoft's metric), WebRank (Yahoo!'s), PageRank (Google's) or mozRank (Linkscape's), some form of an iterative, Markov-chain-based link analysis algorithm is part of all the engines' ranking systems. PageRank et al. use the analogy that links are votes, and that pages which have more votes have more influence with the votes they cast.

The nuances of PageRank are well covered in The Professional's Guide to PageRank Optimization. A general understanding is required if you want to be effective (a toy calculation follows the list below):

  1. Every URL is assigned a tiny, innate quantity of PageRank
  2. If there are "n" links on a page, each link passes that page's PageRank divided by "n" (and thus, the more links, the lower the amount of PageRank each one flows)
  3. An iterative calculation that flows juice through the web's entire link graph dozens of times is used to reach the calculations for each URL's ranking score
  4. Representations like those shown in Google's toolbar PageRank or SEOmoz's mozRank on a 0-10 scale are logarithmic (thus, a PageRank/mozRank 4 has 8-10X the link importance of a PR/mR 3)
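Those four properties can be seen in a toy power iteration (a simplified sketch of the published PageRank formulation with the commonly cited 0.85 damping factor; the engines' real implementations differ):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank over a dict of page -> list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}   # innate quantity (point 1)
    for _ in range(iterations):                   # iterative flow (point 3)
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for t in targets:
                # each link passes PR(page) / n (point 2)
                new[t] += damping * rank[page] / len(targets)
        rank = new
    return rank

# A links to B and C; B and C link back to A
print(pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]}))
```

The 0-10 toolbar numbers (point 4) would then be roughly a logarithm of raw scores like these.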

#4 - TrustRank

The basics of TrustRank are described in a paper from Stanford, Combating Web Spam with TrustRank. The basic tenet of TrustRank is that the web's "good" and "trustworthy" pages tend to be closely linked together, and that spam is much more pervasive outside this "center." Thus, by calculating an iterative, PageRank-like metric that only flows juice from trusted seed sources, a metric like TrustRank can be used to predict whether a site/page is likely to be high quality vs. spam. Linkscape uses this intuition to build mozTrust (mT) and Domain mozTrust (DmT). The point being: get links from high-trust sites, and don't link to potential spam.

#5 - Domain Authority

Though the phrase "domain authority" is often discussed in the SEO world, a formal, universal definition doesn't yet exist. Most practitioners use it to describe a combination of popularity, importance and trustworthiness calculated by the search engines and based largely on link data (though some also feel the engines may use the age of the site here as well).

#6 - Diversity of Sources

No single metric has a more positive correlation with high rankings than the number of linking root domains. This appears to be both a very hard metric to manipulate for spam (particularly if you need domains of high repute with diverse link profiles of their own) and a metric that indicates true, broad popularity and importance. Having said this, it is far easier (in time and money) to get on all pages of one website than to get on one page each of many different, unique websites. So although having many unique domains link to your website should be your primary goal, do not overlook the potential of links from all web pages within the same domain.

#7 - Uniqueness of Source + Target

The engines have a number of ways to judge and predict ownership and relationships between websites. These can include (but are certainly not limited to):

  • A large number of shared, reciprocated links
  • Domain registration data
  • Shared hosting IP address or IP address C-blocks
  • Public acquisition/relationship information
  • Publicized marketing agreements that can be machine-read and interpreted

#8 - Location on the Page

Microsoft was the first engine to reveal public data about its plans to do "block-level" analysis (in an MS Research piece on VIPS - VIsion-based Page Segmentation). Links from the "content" block of a page are the most valuable, both for the ranking value the link passes and, fortuitously, for click-through traffic as well.

Internal links in the footer of web pages may not provide the same beneficial results as those same links placed into top/header navigation. One way the engines appear to be fighting pervasive link advertising is by diminishing the value that external links carry from the sidebar or footer of web pages.

#9 - Topical Relevance

There are numerous ways the engines can run topical analysis to determine whether two pages (or sites) cover similar subject matter. Years ago, Google Labs featured an automatic classification tool that could predict, based on a URL, the category and sub-category for virtually any type of content (from medical to real estate, marketing, sports and dozens more). It's possible that the engines use these automated topical-classification systems to identify "neighbourhoods" around particular topics and count links more or less heavily based on those classifications.

#10 - Content & Context Assessment

Though topical relevance can provide useful information for engines about linking relationships, it's possible that the content and context of a link may be even more useful in determining the value it should pass from the source to the target. In content/context analysis, the engines attempt to discern, in a machine parse-able way, why a link exists on a page.

#11 - Geographic Location

The geography of a link is highly dependent on the perceived location of its host, but the engines, particularly Google, have been getting increasingly sophisticated about employing data points to pinpoint the location-relevance of a root domain, subdomain or subfolder. These can include:

  • The host IP address location
  • The country-code TLD extension (.de, .co.uk, etc)
  • The language of the content
  • Registration with local search systems and/or regional directories
  • Association with a physical address
  • The geographic location of links to that site/section

Earning links from a page/site targeted to a particular region may help that page (or your entire site) perform better in that region's searches, e.g. web design las vegas. Likewise, if your link profile is strongly biased to a particular region, it may be difficult to appear prominently in another, even if other location-identifying data is present (such as hosting IP address, domain extension, etc.).

#12 - Use of Rel="Nofollow"

Although in the SEO world it feels like a lifetime ago since nofollow appeared, it has actually only been around since January of 2005, when Google announced it was adopting support for the new HTML attribute. Very simply, rel="nofollow", when attached to a link, tells the engines not to ascribe any of the editorial endorsement or "votes" that would boost a page/site's query-independent ranking metrics. Today, Linkscape's index notes that approximately 3% of all links on the web are nofollowed, and that of these, more than half are sites using nofollow on internal, rather than external, links.
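In markup terms, nofollow is just a value of the anchor tag's rel attribute. Here is a quick sketch of applying it in bulk, assuming the third-party BeautifulSoup library is available (the simplistic domain check is for illustration only):

```python
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

def nofollow_external(html, own_domain):
    """Add rel="nofollow" to every link that points off-site."""
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        # naive off-site test, good enough for illustration
        if a["href"].startswith("http") and own_domain not in a["href"]:
            a["rel"] = "nofollow"
    return str(soup)

print(nofollow_external('<a href="http://example.org/">a link</a>', "mysite.com"))
# -> <a href="http://example.org/" rel="nofollow">a link</a>
```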

#13 - Link Type

Links can come in a variety of formats. The big three are:

  1. Straight HTML Text Links
  2. Image Links
  3. Javascript Links

It appears that straight, HTML links with standard anchor text pass the most value, followed by image links with keyword-rich alt text and finally, Javascript links (which still aren't universally followed or considered as an endorsement).

#14 - Other Link Targets on the Source Page

When a page links out externally, both the quantity and targets of the other links that exist on that page may be taken into account by the engines when determining how much link juice should pass.

"PageRank"-like algorithms from all the engines divide the amount of juice passed by any given page by the number of links on that page. In addition to this metric, the engines may also consider the quantity of external domains a page points to as a way to judge the quality and value of those endorsements. If, for example, a page links to only a few external resources on a particular topic, spread out amongst the content, that may be perceived differently than a long list of links pointing to many different external sites. One is not necessarily better or worse than the other, but it's possible the engines may pass greater endorsement through one model than another (and could use a system like this to devalue the links sent from what they perceive to be low-value-add directories).

#15 - Domain, Page & Link-Specific Penalties

Search engines apply penalties to sites and pages ranging from the loss of the ability to pass link juice/endorsement all the way up to a full ban from their indices. If a page or site has lost its ability to pass link endorsements, acquiring links from it provides no algorithmic value for search rankings. Be aware that the engines sometimes show penalties publicly (inability to rank for obvious title/URL matches, lowered PageRank scores, etc.) but keep these penalties inconsistent, so systematic manipulators can't acquire solid data points about who gets "hit" vs. not.

#16 - Content/Embed Patterns

As content licensing & distribution, widgets, badges and distributed, embeddable links-in-content become more prevalent across the web, the engines have begun looking for ways to avoid becoming inundated by these tactics. It is likely that content pattern detection and link pattern detection plays a role in how the engines evaluate link diversity and quality. If the search engines see, for example, the same piece of content with the same link across thousands of sites, that may not signal the same level of endorsement that a diversity of unique link types and surrounding content would provide.

#17 - Temporal / Historical Data

Timing and data about the appearance of links is the final point on this checklist. As the engines crawl the web and see patterns about how new sites, new pages and old stalwarts earn links, they can use this data to help fight spam, identify authority and relevance and even deliver greater freshness for pages that are rising quickly in link acquisition.

Any good SEO campaign must include a long-term link-building strategy based on your monthly budget. Although there is no one-size-fits-all approach, don't expect the same results from directory submissions as you would get from well-placed, context-based, theme-related links from high-PR pages on multiple domains. An experienced search engine expert can evaluate the cost/benefit trade-offs based on your competitive landscape.

By: Nick the SEO guy

Meta Data - Do I really need it? Yes, No, Yes, No...

OK, this is highly debatable.

So I will say, just like everything else, that this is only my opinion. I can only go on how I interpret what I have seen.

We all know that you need the trinity to rank:

  • Great Content
  • Focused Meta Data
  • Links, Links, Links (not all links are the same)

You just NEED all three.

When it comes to Meta Data, I am a fanatic. There are so many variations of how to write it. There is the obvious approach where you put the keyword term first, but what goes after that? How long should it be? Total length, density, and prominence can all be factors. Can you stuff another keyword in there?

I always suggest using Meta Data. However, is it necessary? NO.
Is it necessary to rank highly? Also NO.

"So wait, I do not get it, you just said you need it..What is wrong with you?"

Well, first let's look at Meta Data.
META DATA, in any medium, is just a guide to what you are going to see, hear, or experience. It is the pamphlet you get when you attend a concert, graduation, or other event. It tells you what you are going to see and how it is organized. However, it is not the actual experience or the 'content of the event.' That experience is like the text and/or content of your website. That is why content is king. People come to see the actual event, not the pamphlet.

Is the pamphlet required for you to understand the event? NO
Is the Meta Data required by the search engines to understand your website? Also NO

"OK enough with the analogy I get it"

Good. The reason I like Meta Data is that you are telling Google, Yahoo!, and Bing exactly what you want them to look for. You are telling them that you feel you are most relevant for 'this keyword or that keyword.' And most of the time they listen, unless your content is simply too irrelevant.

Now, if your content is long, thorough, organized, informative, and authoritative, then the search engines will easily understand your focus from the content alone. They will not need the Meta Data. Furthermore, they are not restricted to, or focused on, one keyword of your choice. They may index that one page for numerous keywords! And for each different keyword indexed, your title and description in the results may differ, based on what the search engine thought you were trying to show.

Sometimes having no Meta Data can be more powerful. However, unless you are experienced enough to write your content correctly, having Meta Data will be more productive for you.

by Marketsite Pro

Thursday, November 5, 2009

Just let go and change it. URLs that are already ranked, but not enough

If some URLs on your website are listed on a Search Engine, then that is great.

However, what if you are stuck on page 3 (SERP position 27) of Google and you feel that you have peaked? The page does not want to break past page 3. You go over your SEO plans and realize that your URL is not fully optimized! What do you do?

Example:
You fix eyeglasses. Your site is, let's say, fixeg3.com (which, by the way, would be a horrible domain for an SEO specialist to work with).

One of your internal pages focuses on fixing hand-crafted frames, and the keyword you are targeting is 'hand crafted frames repair.' The current URL string for that page is: fixeg3.com/hcf-repairs.htm

AGAIN, you are on the 3rd page of GOOGLE. So close! You have already spent so much money on backlinks, your META data is on point, and your content is good. I say keep working on those three things, as they are the meat and potatoes of your SEO campaign (review them). However, everything should be optimized, including your URL string.

So...

Go for it. Bite the bullet and make the change. Most likely, you do not have backlinks pointing to your internal pages (if you do, I will go over that in a future post). But you have to do it if you want a better future for your website's ranking.

Look at the URL of one of Nick's recent posts:
http://marketsitepro.blogspot.com/2009/11/what-really-matters.html

'/what-really-matters.html'

There is a reason why Google does this, and you NEED to do the same.
So, in our example, an optimized version might look like this: fixeg3.com/hand-crafted-frames-repair.htm

What happens now? What do I do with the old page? What will happen to my ranking?

The good news is that the page was already ranked on Google, and you can use that momentum. Create the new page and then delete the old one. Next, do a 301 redirect from the old URL to the new page (email us or look for it in a future post if you do not know how to do this). This provides a smooth transition and allows Google to come in and re-index the new page.
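How you configure a 301 depends on your server (on Apache it is typically a single Redirect 301 line in .htaccess), but the mechanics are simple: the old URL answers with status code 301 and a Location header pointing at the new page. A minimal, purely illustrative Python sketch of those mechanics:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# old URL -> new, keyword-optimized URL
REDIRECTS = {"/hcf-repairs.htm": "/hand-crafted-frames-repair.htm"}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)  # 301 = moved permanently
            self.send_header("Location", REDIRECTS[self.path])
        else:
            self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```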

Lastly, you had better make sure the URL is the way you want it this time, because once it is optimized to your satisfaction, you need to start sending backlinks to that specific page as if it were a whole new site.

by: (\/)arketsite Pro