
Buying Expired Domains with PR Value

I hear the question asked all the time. And I know from my own experience that there is no simple answer to the question, "How do you buy expired domains with PR value?"

For years now, domainers have kept this information close to the vest, as heavily guarded information, because they could buy expired domains and flip them for a quick profit. While I don't consider myself a domainer, I am in the business of buying domains that have PR value, developing useful, unique content on them, marketing them, and then either reselling them or selling advertising on them. So I don't particularly want more competition (and believe me, there's tons of it).

That being said, I don't see many people doing the amount of research on domains that I do, and I'm usually willing to pay a premium for domains I really want, so I don't expect that providing some basic information here is going to hurt me much.

The first thing to understand is that domains with good PR value never actually expire. Nearly all registrars nowadays will auction domains off before they expire, and it has become a profitable business. The expiring domains are purchased by people specifically watching for them who know where to buy them.

You also need to know that the days of picking up a valid PR5 expiring domain for less than a few hundred dollars are long gone.
Not only will the domain itself cost you money, but you'll probably also pay in the form of a 'membership' to a place where you can buy these expiring domains. Not to mention there's the registration fee involved.

There are basically four different things you get for the 'membership fees'.

1. The opportunity to bid on domains that may expire in the next 14 days. There are a few different places you can go to get domains like this, such as www.tdnam.com and www.snapnames.com. Unfortunately, they don't provide much in the way of domain statistics. They don't give you the PR, traffic, or any other stats; you have to research them yourself.

To help with this, they will usually give you a downloadable list of thousands of domains that will be up for auction. Probably about 97% of these domains will have a PR of 0.

Also, there aren't many tools out there that help you check the PR value of domains in bulk. The best I've ever found is at www.digixmas.com, and even that doesn't get all the way through a full list of domains. Try to check the PR on a few thousand domains a couple of times in the same day, and you're likely to get your IP banned from Google.
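If you roll your own bulk checker, the main thing is to throttle your requests so Google doesn't ban your IP. Here's a minimal sketch of that throttling pattern; `check_pagerank` is a hypothetical placeholder, since there is no official PR lookup API and any real lookup endpoint would be an assumption on your part.

```python
import time

def check_pagerank(domain):
    """Hypothetical placeholder for a real PR lookup -- swap in
    whatever lookup method you actually use. Returns 0 here."""
    return 0

def bulk_check(domains, delay=2.0, batch_size=50, pause=60.0):
    """Check PR for a list of domains, spacing requests out so the
    checker is less likely to get its IP banned."""
    results = {}
    for i, domain in enumerate(domains):
        results[domain] = check_pagerank(domain)
        time.sleep(delay)              # pause between individual requests
        if (i + 1) % batch_size == 0:
            time.sleep(pause)          # longer pause between batches
    return results
```

With a list of a few thousand domains, a run like this takes hours, which is exactly why people pay for the aggregated services instead.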

Finally, keep in mind that if you do win the auction, these domains are not expired yet, and the original owner may still renew for up to a week or so after the auction.

2. Access to dropped domains that a domain broker has purchased with intent to resell. These guys go to the actual live domain auctions and buy expiring domains in bulk, then sell memberships that give people the opportunity to bid on the domains. In my experience, membership is a little more expensive, and you still have to pay about the same amount for the domain. And you still aren't given statistics on the domains, like PR or Alexa rank. You have to do all the research yourself.

3. The opportunity to attend live domain auctions, such as at www.pool.com. This is probably the cheapest way to buy a good domain with PR, but you'd have to be in the market to buy in bulk for it to be worth traveling to a live auction. You can bid by proxy, but I find the rules for this a little confusing and complex. And again, PR and Alexa rank are not provided. They give you a list of thousands of domains beforehand, and you research them yourself. Out of 1,000 domains, there might only be 100 with PR value.

4. Aggregated analysis of domains from the aforementioned services. These guys take all of the domains from all of the services above, check various stats, like PR, traffic, and Alexa rating, and you pay to see the results. Then you can go to the source and purchase the domain yourself. My favorite in this class is at www.freshdrop.com.

If you have other methods, please feel free to list them in the comments below.

New Trends in Search Engine Optimization and SERP Manipulation

As the world wide web evolves and changes, so too do the keepers of the internet, the search engines. With the latest Google PageRank update, we've learned that Google has begun to change what it perceives as important in determining the order of a SERP (Search Engine Results Page). It's a Search Engine Optimization professional's job to stay on top of these variables, constantly analyzing and testing, and figuring out which factors are most important in getting a site higher in the rankings for a SERP.

Many of the factors in SEO have been known for a very long time. But the question is, with the internet shifting to an architecture more conducive to social marketing, how much has Search Engine Optimization really changed? What do we know that's new? Here's a list of my top five factors in Search Engine Optimization that weren't necessarily very important just a few years ago.

New SEO Factors to Be Aware Of

1. Evidence of Selling Links - Probably the biggest change in Google's algorithm in 5 years or more: it is now confirmed that they are strongly penalizing sites that they suspect are selling text links. There's been a huge outcry from webmasters across the internet whose sites have been completely dropped from Google altogether. If you are selling advertising on your sites, make sure you remove all mention of it, just to be on the safe side.

2. Outbound Links Without the 'nofollow' Attribute - As Google attempts to monopolize advertising on the internet, they've begun to penalize sites with outbound links that lack the 'nofollow' attribute. They appear to be justifying this by saying that these sites may be selling links. That is, of course, unless the links are from AdSense; then they are perfectly OK. We advise you to use the nofollow attribute often unless the links are actually embedded in content.

3. Overuse of Outbound Links - Once upon a time, it was safe to have a dozen or more outbound links in the footer or blogroll of your site without repercussion. Now it looks like Google is beginning to penalize sites with too many outbound links, especially if you link to sites that are questionable or unrelated. Keep your outbound links to a minimum, and outbound links without a nofollow attribute to even fewer.

4. Inbound Links from Paid Directories - I know directory owners will hate to hear this one, but in Google's holy crusade against paid links, they are not only discounting inbound links they suspect were paid for; we have strong evidence that they are actually penalizing both the directory and the sites they think purchased the links. Don't use any of the massive paid directory submission services. In fact, I now recommend that clients avoid directories altogether, except the really good ones.

5. All Inbound Links Using the Same Anchor Text - This one is kind of interesting. As Google continues to push webmasters to link to sites more naturally, they are beginning to discount inbound links that always have the same anchor text. It's a good idea to vary the key phrases in your anchor text. Vary the capitalization and order of words, and substitute synonyms occasionally so your links look more natural.
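If you want to audit your own pages against points 2 and 3, you can count outbound links and flag the ones missing rel="nofollow" with a few lines of Python's standard HTML parser. This is a rough sketch of that audit (the class and host names are my own, not from any library):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class OutboundLinkAudit(HTMLParser):
    """Collect outbound links on a page and flag those that are
    missing a rel="nofollow" attribute."""

    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.outbound = []   # all links pointing off-site
        self.followed = []   # off-site links without nofollow

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href", "")
        host = urlsplit(href).netloc
        # A non-empty host different from our own means an outbound link.
        if host and host != self.own_host:
            self.outbound.append(href)
            if "nofollow" not in a.get("rel", "").split():
                self.followed.append(href)
```

Feed it a page's HTML and compare `len(audit.followed)` against your comfort level; relative links to your own pages are ignored automatically.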

What's More Important for Inbound Links: Related Content or High PR?

When considering inbound links in your SEO efforts, what affects Google SERPs more: related content or high PR?

The correct answer isn't one or the other. You should actually be looking for both. You need links from related content with high PR.

One without the other isn't nearly as effective. If you have links from unrelated sites with PR, the link juice will most likely be discounted, especially if the link is from a directory. If you have links from related sites with no PR, you may get a little more traffic directly, but it won't help much with your SERPs.

Keeping a Purchased Domain's PR Value

What's the best way to keep the PR value of a purchased domain?

There are a few things you might consider.

First, were they dropped domains? Dropped domains have a higher chance of losing PR than non-dropped domains. I like to buy expiring, but not dropped, domains. Don't ask me how I do that; I won't tell you, because I don't want more people to compete with when I do. But you can buy domains with PR from sites going out of business, or sites that never really got off the ground.

PR Update In Progress

As expected, a few days after Google updated backlinks, we are beginning to see PR movement, not just penalties, on various Google data centers.

As confirmation, we found this new post on Matt Cutts' blog yesterday.

"New Toolbar PageRanks coming

July 24, 2008 @ 4:26 pm · Filed under Google/SEO

Hey folks, I wanted to let you know that new toolbar PageRank values should become visible over the next few days. I’m expecting that also in the next few days that we’ll be expiring some older penalties on websites."

Where Are All My Google Backlinks?

So, you've noticed that neither the 'link:' command nor Google Webmaster Central shows all of the backlinks to your site that you see when you perform the same sort of query on Yahoo.

You might ask yourself, "Where are all my backlinks?!" "Why doesn't Google give me a list of ALL my backlinks?" Or even, "What does Google consider a backlink?"

It's pretty simple, really. The links you see via the 'link:' command, or in Google's webmaster tools, are the backlinks that are actually counted towards that domain's PR rating. That's why you never see 'nofollow' backlinks listed in either of those places.

Keep in mind that sometimes Google 'resets' the backlinks that count towards a domain's PR when a domain expires. Essentially, Google doesn't want someone who picks up an expired domain to reap the rewards of the footwork that the old domain owner put in. That's why you sometimes see a domain with a PR4, but with no backlinks visible on Google.

The VAST majority of backlinks you see using the Yahoo backlink checker are crap. Google knows it, so they don't count them when they calculate PR. That's why, for the most part, you only see QUALITY backlinks from the Google 'link:' command, or in Google Webmaster Central.

I've seen Matt Cutts allude to the backlinks shown by the 'link:' command or in webmaster tools as being a 'subset' of the number of backlinks a site has, but I think he means it's a subset of all of the backlinks, with the subset being the ones that actually count towards PR.

Session IDs Are Spider Killers

Continuing our discussion of session IDs from our last article, History of the Session IDs: you, like many other webmasters, may be wondering what is wrong with passing a session ID in the query string. After all, it sounds like a great way to keep sessions intact when surfers do not have cookies enabled in their web browsers.

Well, the problem is that no one clued the search engines in. Or perhaps it's better stated to say that search engines didn't have the foresight to see this coming and develop a way to better handle it.

In order to better understand the problem, you really need to better understand how search engines get the pages of your website into their index for searchers to find. I've oversimplified this process in the following description so it will be easier to follow along.

History of the Session IDs

If you've read my previous articles, then you probably know that I encourage people to refrain from allowing session IDs to be passed through the URL. In my experience, these session IDs can wreak havoc on a search engine's 'spider', and negatively impact your search engine listings.

But what exactly is a 'session id'? Why do we need them? What do they do?

Session IDs became popular in the early 2000s, when consumers and other web surfers began to find out that websites were actually placing a little file on their computer called a 'cookie'. This cookie was used for all kinds of things, from keeping track of what pages you've visited to storing your login credentials. You know when you go to log into a website and it has that little check box that asks you if you want to stay logged in? Well, it uses a 'cookie' in order to do so. When surfers became aware of these 'cookies' being placed on their computers (they would have been aware earlier had they read the website's conditions of use, but when was the last time anyone ever did that?), they balked a bit. Many users actually went into their browser's internet settings and turned off the ability for websites to store cookies.

With many users having cookies turned off, that meant that website owners needed a new way to be able to keep track of important details, like what you have in your shopping cart. Without a session, after you put something in your cart, then click on another page, your shopping cart would be empty again.

And so came the ill-fated idea that the session ID could be passed from URL to URL as a surfer goes from one page to another. This way, the details of a surfer's visit could be kept in the database, and the website would be able to 'remember' who you are, where you've been, and what you have in your shopping cart without having to rely on a cookie. This became a fairly popular solution, as site owners wanted their websites to 'remember' details of visitors whether cookies were turned off or not.
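Mechanically, the trick was just to tack the session ID onto the query string of every internal link. A minimal sketch of that, assuming a parameter named 'sid' (real carts used names like 'osCsid'; the function name here is my own):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def add_session_id(url, sid):
    """Append a session ID to a URL's query string, the cookie-less
    session-tracking approach described above."""
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))  # keep any existing parameters
    query["sid"] = sid
    return urlunsplit(parts._replace(query=urlencode(query)))
```

Every link on the page gets rewritten this way, which is exactly what makes each visitor's view of the site a unique set of URLs.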

This seemed like a pretty good idea at the time. Unfortunately, search engines didn't think so, and in the long run, it may have ended up costing small business owners quite a bit in extra bandwidth fees.

Be sure to visit again to read my next article, "Session IDs, the Spider Killer," to be published later this week.

Google Hates Dynamic URLs

How many times have you heard that? I want to smack someone every time I hear it.

Actually, this was true... in the late 1990s. Back then, when a search engine came across a URL that contained a 'special character' such as '?' or '=', it usually didn't index it properly. Of course, back then most websites didn't have dynamic query strings yet. Nearly the entire web was built with static HTML.

These days, Google and every other search engine out there handle dynamic URLs just fine. In fact, the vast majority of URLs in Google are dynamic, with parameters passed in the query string using characters like '=' and '?'.

What Google doesn't like are SESSION IDS. There's an important distinction: dynamic URLs are still unique per web page. However, if your website is passing a session ID through the query string, Google's crawlers can get stuck (and actually cost you quite a bit of bandwidth). Many of the older shopping cart software programs (osCommerce, X-Cart, CubeCart) allow the passing of session IDs in the query string to keep a customer's session alive (as an alternative to forcing customers to have cookies enabled).

If your site is passing session IDs in the query string, then I'd say you have a problem. As long as it's not, dynamic URLs with '?' and '=' in them are just fine.
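One practical fix is to strip session-ID-style parameters from URLs before a crawler sees them, while leaving legitimate dynamic parameters like '?item=3' alone. Here's a sketch; the list of parameter names is an assumption based on common cart software (osCommerce, for instance, uses 'osCsid'), so adjust it to whatever your own site emits.

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Parameter names commonly used for session IDs (assumed list --
# extend it with whatever your cart software actually uses).
SESSION_PARAMS = {"sid", "sessionid", "session_id", "phpsessid", "oscsid"}

def strip_session_ids(url):
    """Remove session-ID parameters from a URL's query string while
    keeping the genuinely dynamic parameters intact."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Run this over the links you serve to known search engine user agents, and the spider sees one stable URL per page instead of an endless stream of session-stamped duplicates.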

Force Meta Descriptions into Search Results

I was recently asked if I know how to force search engines to display the meta description tag in the search results, rather than the section of the page text that matched the searcher's query.

For example, he elaborated, his meta description is "Buy Hot New PC Games". Elsewhere in the content of the page appears the text "PC game information". If someone goes to Google and searches for "PC game information", he would prefer to have his site listed with the meta description "Buy Hot New PC Games" next to it.
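For anyone unclear on which tag we're even talking about: the meta description is a single tag in the page head, and you can pull it out with Python's standard HTML parser. A minimal sketch (the class and function names are my own):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Extract the content of <meta name="description"> from a page --
    the snippet text search engines once displayed verbatim."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content")

def get_meta_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description
```

Whether any search engine chooses to display that string next to your listing is, as the rest of this post explains, entirely up to the search engine.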

I couldn't help but laugh. After all, Google and other search engines have spent years, and millions of dollars trying to get away from this very thing.

It used to be that all search engines used meta descriptions. Go back and check out WebCrawler in its earlier years, and you'll see website descriptions that looked more like the link sales section of DP than any real search results.

Somewhere along the way, Google decided that given the freedom to provide their own website descriptions, website owners couldn't be trusted to provide a reliable description of what the website actually contained. Search results would deteriorate into nothing more than spam filled, often deceitful, promotional slang phrases.

The bottom line is that Google, along with the other successful modern-day search engines, is not worried about what you, the website owner, want the search results to be. They are much more worried about what the searcher's desired results are. A search engine's number one priority is giving the searcher content that most closely matches their search. They can't trust webmasters to provide that in the meta description, so they rely on, you know, your actual content.
