Monday 30 July 2012

New notifications about inbound links


Lots of site owners use our webmaster console to see how their site is doing in Google. Last week we began sending new messages to sites with a pattern of unnatural links pointing to them, and I wanted to give more context about these new messages.

Original Link Messages

First, let's talk about the original link messages that we've been sending out for months. When we see unnatural links pointing to a site, there are different ways we can respond. In many severe cases, we reduce our trust in the entire site. For example, that can happen when we believe a site has been engaging in a pretty widespread pattern of link spam over a long period of time. If your site is notified for these unnatural links, we recommend removing as many of the spammy or low-quality links as you possibly can and then submitting a reconsideration request for your site.

In a few situations, we have heard about directories or blog networks that won't take links down. If a website tries to charge you to put links up and to take links down, feel free to let us know about that, either in your reconsideration request or by mentioning it on our webmaster forum or in a separate spam report. We have taken action on several such sites, because they often turn out to be doing link spamming themselves.

New Link Messages

In less severe cases, we sometimes target specific spammy or artificial links created as part of a link scheme and distrust only those links, rather than taking action on a site’s overall ranking. The new messages make it clear that we are taking "targeted action on the unnatural links instead of your site as a whole." The new messages also lack the yellow exclamation mark that other messages have, which tries to convey that we're addressing a situation that is not as severe as the previous "we are losing trust in your entire site" messages.

How serious are these new link messages?

These new messages are worth your attention. Fundamentally, they mean we're distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it (widgetbait, paid links, blog spam, guestbook spam, excessive article directory submissions, excessive link exchanges, other types of linkspam, etc.). So while the site's overall rankings might not drop directly, the site might not be able to rank for some phrases. I wouldn't classify these messages as purely advisory, something to be ignored, or only for innocent sites.

On the other hand, I don't want site owners to panic. We do use this message some of the time for innocent sites, such as when people are pointing hacked anchor text at a site to try to make it rank for queries like [buy viagra].

Example scenario: widget links

A fair number of site owners emailed me after receiving one of the new messages, and I think it might be helpful if I paraphrased some of their situations to give you an idea of what it might mean if you get one of these messages.

The first example is widget links. An otherwise white-hat site emailed me about the message. Here's what I wrote back, with the identifying details removed:

"Looking into the very specific action that we took, I think we did the right thing. Take URL1 and URL2 for example. These pages are using your EXAMPLE1 widgets, but the pages include keyword-rich anchortext pointing to your site's url. One widget has the link ANCHORTEXT1 and the other has ANCHORTEXT2.


If you do a search for [widgetbait matt cutts] you'll find tons of stories where I discourage people from putting keyword-rich anchortext into their widgets; see http://www.stonetemple.com/articles/interview-matt-cutts-061608.shtml for example. So this message is a way to tell you that not only are those links in your widget not working, they're probably keeping that page from ranking for the phrases that you're using."

Example scenario: paid links

The next example is paid links. I wrote this email to someone:

"I wouldn't recommend that Company X ignore this message. For example, check out SPAMMY_BLOG_POST_URL. That's a link from a very spammy website, and it calls into question the linkbuilding techniques that Company X has been using (we also saw a bunch of links due to widgets). These sorts of links are not helping Company X, and it would be worth their time to review how and why they started gathering links like this."

I also wrote to another link-building SEO who got this message, pointing out that the SEO was getting links from a directory that appeared to offer only paid links that pass PageRank, and so we weren't trusting links like that.

Here's a final example of paid links. I emailed about one company's situation as follows:

"Company Y is getting this message because we see a long record of buying paid links that pass PageRank. In particular, we see a lot of low-quality 'sponsored posts' with keyword-rich anchortext where the links pass PageRank. The net effect is that we distrust a lot of links to this site. Here are a couple examples: URL1 and URL2. Bear in mind that we have more examples of these paid posts, but these two examples give a flavor of the sort of thing that should really be resolved. My recommendation would be to get these sort of paid posts taken down, and then Company Y could submit a reconsideration request. Otherwise, we'll continue to distrust quite a few links to the site."

Example scenario: reputation management

In some cases we're ignoring links to a site where the site itself didn't violate our guidelines. A good example of that is reputation management. We had two groups write in; one was a large news website, while the other was a not-for-profit publisher. Both had gotten the new link message. In one case, it appeared that a "reputation management" firm was using spammy links to try to push up positive articles on the news site, and we were ignoring those links to the news site. In the other case, someone was trying to manipulate the search results for a person's name by buying links on a well-known paid text link ad network. Likewise, we were just ignoring those specific links, and the not-for-profit publisher didn't need to take any action.

What should I do if I get the new link message?

We recently launched the ability to download backlinks to your site sorted by date. If you get this new link message, you may want to check your most recent links to spot anything unusual going on. If you discover that someone in your company has been doing widgetbait, paid links, or serious linkspam, it's worth cleaning that up and submitting a reconsideration request. We're also looking at some ways to provide more concrete examples to make these messages more actionable and to help narrow down where to look when you get one.

Just to give you some context, less than 20,000 domains received these new messages—that's less than one-tenth the number of messages we send in a typical month—and that's only because we sent out messages retroactively to any site where we had distrusted some of the sites' backlinks. Going forward, based on our current level of action, on average only about 10 sites a day will receive this message.

Summing up

I hope this post and some of the examples above will help to convey the nuances of this new message. If you get one of these new messages, it's not a cause for panic, but neither should you completely ignore it. The message says that the current incident isn't affecting our opinion of the entire website, but it is affecting our opinion of some links to the website, and the site might not rank as well for some phrases as a result.

This message reflects an issue of moderate severity, and we're trying to find the right way to alert people that their site may have a potential issue (and it's worth some investigation) without overly stressing out site owners either. But we wanted to take this extra step toward more transparency now so that we can let site owners know when they might want to take a closer look at their current links.


source: googlewebmastercentral.blogspot.in

Wednesday 25 July 2012

Google Analytics: 3 Tests to Ensure Your Checkout Process Is Working


There’s a reason that an ecommerce site’s checkout process is one of the most popular analytics topics. Once a person initiates a checkout, she has identified herself as a visitor with a high intent to purchase. Chances are, you’ve already spent money acquiring that potential customer. You've likely invested in (a) marketing efforts to make her aware of your site, (b) the paid search click that got her to your site on this or a previous visit, and (c) the overall investment in your site so that she was able to find a compelling product or service. But visitors are often fickle. Even a minor speed bump on their checkout paths can be enough to cause them to abandon the effort, in which case the investments you made were for naught.

There are many ways to identify those speed bumps so they can be flattened out or removed. This article will cover three of them: funnel analysis, error message capture, and simple usability testing.

Google Analytics Funnel Visualization

Hopefully, you already have a Google Analytics funnel set up for your checkout process. The process is straightforward.

1. Set the order confirmation page as a destination URL goal.
2. For that goal, configure each step — i.e., page — in the checkout process as a step in the funnel.

Depending on how your site works, you may want to set the first step in the funnel as your shopping cart page, or you may want to set it as the first step in the actual checkout process.

For more information on the mechanics of setting up goals and funnels, refer to Google’s documentation on the subject.

Once you have the goal and funnel configured, you can view the fallout that is occurring along the purchase path with the "Funnel Visualization" report under "Conversions > Goals."

The funnel visualization can help find low-functioning steps in the conversion process.



The funnel visualization shows, of all the visitors who started down the purchase path — viewing the cart in this case — how many reached each step in the process and how many fell out of the process at each step. A visitor can view other pages in between two steps, such as navigating to the site’s return policy before completing the checkout. As long as she gets to the next step in the process during the visit, she will be counted as having progressed through the funnel.

A second visualization of the funnel that is available in Google Analytics is the "Goal Flow" view, which is also found under "Conversion > Goals."

The "Goal Flow" view allows you to follow the checkout path of specific groups.


The "Goal Flow" view allows you to follow the checkout path of specific groups.



Two things that this view provides that the standard funnel visualization doesn’t are:

1. A visualization of “loopbacks”: how often visitors move backwards in the funnel. In the example above, one of these is shown in the arrow that goes from "Start Checkout" back to "View Cart."
2. The ability to segment traffic in a range of ways, either by viewing the funnel only for visitors in a specific segment, or by changing the dropdown to split out the traffic by one of many dimensions. In the example above, the traffic is split by "Source."

If there is more falloff in the process than you would reasonably expect to see, digging into the "Goal Flow" may help you isolate whether the problem is a particular step in the process, traffic from a particular source, users of a particular browser, or some other factor.

The example above shows a one-step checkout. One-step checkouts aim to simplify the checkout process, which is good, but they introduce a challenge from an analytics perspective: a funnel visualization only shows that some percentage of people are abandoning the process on the key page, without providing any specifics as to what it is on that page that tripped them up. One way to dig into that page is by capturing and analyzing warnings and error messages that get presented to visitors.
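One way to capture those messages is to record each error shown to the visitor as a Google Analytics event. A minimal sketch, assuming the classic ga.js snippet (the current version as of 2012) is already on the page so the `_gaq` command queue exists; the function name is hypothetical:

```javascript
// Fall back to a plain array if the ga.js queue has not loaded yet;
// ga.js will replay queued commands when it initializes.
var _gaq = _gaq || [];

// Queue one event per error message shown to the visitor.
// Category "Checkout", action "Error", label = the message text.
// The final `true` flags the event as non-interactive, so it does
// not affect bounce-rate calculations.
function trackCheckoutError(message) {
  _gaq.push(['_trackEvent', 'Checkout', 'Error', message, undefined, true]);
}
```

On a real checkout page you would call `trackCheckoutError()` wherever your validation code displays a message, then review the counts in the Events reports in Google Analytics to see which errors trip visitors up most often.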




source: practicalecommerce.com

Tuesday 17 July 2012

Why is SEO Important For Your Business?


Before asking why SEO (Search Engine Optimization) is so important to the success or otherwise of our websites, let's consider a few facts.

When people surf the internet, they are typically looking for specific information.

Very few people will look beyond page 2 of the search results returned.

Whatever niche or market you are in there is going to be competition.

If your website isn't on page 1, your competition stands a far better chance of getting the sales.

Even if you are targeting the exact key phrases the surfer is searching for, there could still be thousands, even millions of hits.

So, if you wish to run any form of online business, your primary role in life is to get recognized by the major search engines. SEO is really the only way to do that.

That's exactly why this article contains more free SEO tips for you.

One other very important fact before moving on: Google, Yahoo, Bing (formerly MSN) and Ask account for over 95 percent of all search traffic on the internet. Why would you invest your precious time and hard-earned money chasing the other 5 percent? Forget about the promotions for automatic submission software and any other fancy gizmos, and concentrate on the job at hand.

With so much emphasis at the moment on getting as much traffic as possible, people can get confused when it comes to search engine optimization. SEO is not about getting as much traffic as possible. SEO is about using keywords to focus on attracting the right traffic for your business.

Are you a robot or a real person?

SEO is not a one-off activity; it is an ongoing process. Because of that, lots of marketers try to shortcut the work involved by using software packages. They don't work. You have to strike the right balance between what the search engine robots are looking for and the value-added content you are providing to your site visitors.

Content created by software may well contain the exact keywords you wish to optimize but the chances of that content being readable and informative are slim to say the least. Plus, believe me, search engines can tell when content has been contrived and packed with keywords. "Keyword Stuffing" was a technique used by many in the past with limited success. The main search engines recognized this and modified their processes accordingly. Trying this method today will only land your site in deep water - no-one will find you!

The balance comes from ensuring your primary keywords are included for the robots and used constructively and meaningfully throughout your content.

Let me re-emphasize, SEO is NOT about cramming your website with every possible keyword. Big, big mistake. Trying to get search engine recognition for as many keywords as possible takes an awful lot of effort and is a total waste of time and resources.

Think about it, why do you have a website? I am thinking you want to get as many of the right visitors as possible and you want to convert those visitors into buyers. Am I right?

Then make sure your website is listed for the right keywords.

Make sure your website is clear, easy to read and easy to navigate through.

Build trust through your content, for your visitors and the search engines.

Use a free tracking tool like Google Analytics to find out how your site is performing, what is working for you and what isn't. Dump what isn't.

Continue to build your site's popularity by adding fresh, informative content. If possible, get as many backlinks to your site as you can. As your site's popularity grows, the search engines will increase your ranking. Don't keep making the mistakes that millions of internet marketers make every day: concentrate on your target market, and focus only on those keywords that relate directly to your site and your business.


source:practicalecommerce.com

Saturday 14 July 2012

SEO: Understanding XML Sitemaps



XML sitemaps serve a very niche purpose in search engine optimization: facilitating indexation. Posting an XML sitemap is kind of like rolling out the red carpet for search engines and giving them a roadmap of the preferred routes through the site. It’s the site owner’s chance to tell crawlers, “I’d really appreciate it if you’d focus on these URLs in particular.” Whether the engines accept those recommendations of which URLs to crawl depends on the signals the site is sending.

What Are XML Sitemaps?

Simply put, an XML sitemap is a bit of Extensible Markup Language (XML), a standard machine-readable format consumable by search engines and other data-munching programs like feed readers. XML sitemaps convey information about one thing: the URLs that make up a site. Each XML sitemap file follows the same basic form. A one-page site located at www.example.com would have the following XML sitemap:

Sample XML sitemap file
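Reconstructed from the tag descriptions that follow, that one-page site's sitemap would look roughly like this sketch (the date, change frequency, and priority values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-07-14</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```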



The XML version declaration and the urlset element are the same for every XML sitemap file. For each URL listed, a <url> and <loc> tag are required, with optional <lastmod>, <changefreq> and <priority> tags. This per-URL information is the only part that changes from one entry to the next. The <loc> tag simply contains the absolute URL, or locator, for a page. The <lastmod> tag specifies the file’s last modification date. The <changefreq> tag indicates the frequency with which a file is changed, and <priority> indicates the file’s importance within the site. Avoid the temptation to set every URL to daily frequency and maximum priority. No multi-page site is structured and maintained this way, so search engines will be more inclined to ignore the whole XML sitemap if the frequency and priority tags do not reflect reality.

The URLs in an XML sitemap can be on the same domain or on different subdomains and domains. However, each XML file can contain at most 50,000 URLs and is limited to 10MB in size. To conserve bandwidth and limit file size, XML sitemaps can be compressed using gzip. When a site contains more than 50,000 URLs or reaches 10MB, multiple XML sitemaps need to be generated and referenced together from an XML sitemap index file. In the same way that an XML sitemap lists the URLs in a site, the XML sitemap index lists the XML sitemaps for a site. A sample index file appears below.

Sample XML sitemap index file
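Reconstructed from the description above, a sitemap index for the example site might look like this sketch (the sitemap file names and dates are illustrative, and the files are gzip-compressed as discussed):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
    <lastmod>2012-07-14</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml.gz</loc>
    <lastmod>2012-07-14</lastmod>
  </sitemap>
</sitemapindex>
```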

For more examples of XML sitemaps, peruse any site and enter sitemap.xml after the domain. For example, http://www.domesticpackersandmovers.com/sitemap.xml is the XML sitemap index for this site. If adding sitemap.xml doesn’t work, the XML sitemap may be named differently. Try checking the robots.txt file to see if the XML sitemap address is there. For example, check out http://www.domesticpackersandmovers.com/robots.txt for a huge list of XML sitemaps.
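When a site does advertise its sitemap in robots.txt, it uses the Sitemap directive, one sitemap or sitemap index URL per line. A minimal sketch (the file name is illustrative):

```
User-agent: *
Sitemap: http://www.example.com/sitemap_index.xml
```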

What to Exclude

Because XML sitemaps serve as a set of recommended links to crawl, any noncanonical URLs should be excluded from the XML sitemap. Any URLs that have been disallowed in the robots.txt file — such as secure ecommerce pages, duplicate content, and print and email versions of pages — should also not be included in the XML sitemap. Likewise, any files that are excluded from the crawl by robots noindex meta tags and canonical tags should not be included in the XML sitemap. If the crawlers find URLs in the XML sitemap that have been purposely excluded from the crawl by one of these means, it sends a mixed signal. “Don’t crawl this URL. But do consider it more important than the other URLs on my site.” The crawlers will obey the crawl exclusion commands issued by robots.txt disallows and meta robots noindex. But if enough of these mixed signals are present, the XML sitemap may be discredited and lose its recommending ability.


source:practicalecommerce.com