Hi,
If you want fast indexation, your best bet is to promote it via Google+, as Google obviously has direct access to that database.
Facebook and Twitter certainly don't hurt, but Google+ is usually faster.
Yes, Google rolled out this update globally (which is rare for them; I think this is the first time they have done so with a big algo update, but I might be wrong on that).
Source: http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html
Hi Lisa,
The only site I work with that runs on WordPress is my personal blog, http://thogenhaven.com, so I don't need a lot of schema data on it. This being said, WP About Author does a pretty good rel=author job.
I have also played around with GD star rating, which is good for product stars.
What are you using?
Hi Steve,
Avinash's Web Analytics 2.0 is by far the best place to start IMO: http://www.amazon.com/Web-Analytics-2-0-Accountability-Centricity/dp/0470529393
Two honorable mentions go to Bryan Eisenberg's Always Be Testing http://www.amazon.com/Always-Be-Testing-Complete-Optimizer/dp/0470290633/ref=pd_sim_b_7 and Brian Clifton's Advanced Web Metrics with Google Analytics: http://www.amazon.com/Advanced-Metrics-Google-Analytics-Edition/dp/0470562315/ref=pd_sim_b_5
Overseas as in Europe? I have worked with Distilled (www.distilled.net) on several occasions, and can vouch for them. But there are many good SEO companies. Luckily, SEOmoz did the work and compiled a good resource of good SEO companies: http://www.seomoz.org/article/recommended
Breadcrumbs are usually integrated in the site template. You can also add links in product descriptions when relevant. But you might want to A/B test it, to make sure it doesn't hurt your conversion rate.
If you use WordPress, you can use this breadcrumb plugin by Yoast: http://yoast.com/wordpress/breadcrumbs/
Hi Greg,
Google usually discovers pages via links, so if a page does not have any links, it is hard for Google to discover. This being said, you can try submitting an XML sitemap containing the pages to Google, and they might crawl and index them.
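In case it helps, a bare-bones sitemap file for a handful of unlinked pages looks like this (example.com, the path, and the date are placeholders; submit the file through Google Webmaster Tools):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/orphan-page/</loc>
    <lastmod>2012-05-01</lastmod>
  </url>
</urlset>
```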
However, if a page does not receive any links from your own site, it does signal that you do not consider the page to be particularly good/important, making it quite unlikely that it will rank well.
Hope this helps.
Hi Bill,
I have the same experience as you. I have tried it for a couple of fan pages to build critical mass, but those pages have had a harder time getting started than pages where I didn't do it.
Thomas
Yes, it seems like Google has penalized a bunch of sites with unnatural link profiles.
Patrick Altoft wrote a good post on it yesterday: http://www.branded3.com/seo/the-new-google-link-algorithm/
And both Alan and SEOclient12 are right: do not send a reconsideration request before cleaning up.
Best of luck.
Thomas
I think you need to invest time in implementing schema data eventually. When the SERPs you compete in start being filled up with rich snippets, you will have a hard time getting clicks without them.
Some blog posts report a 30% higher CTR after marking up their snippets with schema/RDF. See for example: http://searchengineland.com/how-to-get-a-30-increase-in-ctr-with-structured-markup-105830
This being said, I really doubt the validity of such posts. It is very hard to measure, and before/after experiments are almost always flawed.
Hi Stevene,
OpenSiteExplorer - which is powered by Linkscape - is only updated around once a month. So there is some delay between getting links and these links being crawled by SEOmoz.
Furthermore, SEOmoz does not crawl the web as deeply as Google. Low-quality pages (and good pages with very few good links) will not necessarily show up in OpenSiteExplorer.
You can see the update calendar here: http://apiwiki.seomoz.org/w/page/25141119/Linkscape Schedule
Three layers isn't too bad. As long as you have decent domain authority, indexation should be okay.
Just make sure that all category pages are structured nicely and link to the pages in their subcategories.
I'll go with this post by Rand: White Hat SEO: It F@$#ing Works
Hi Greg,
In my experience, direct outreach is a better long-term strategy than guest blogging sites. Not only do you get access to better blogs, you also get a network of people who'll help promote you on social media.
It does take some months to build outreach this way, but it's totally worth it.
LinkedIn groups seem like a very good idea too!
Thomas
Hi Justin,
I am pretty sure the numbers are extracted directly from FB/Twitter/Google+ APIs. You can, for example, get the raw number for SEOmoz.org FB shares here: graph.facebook.com/http://www.seomoz.org
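If you want to pull that raw number yourself, a rough sketch looks like this (it assumes the old unauthenticated Graph API endpoint that returned a "shares" field; Facebook may change or restrict this at any time, so treat the endpoint and field name as assumptions):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

GRAPH_ENDPOINT = "https://graph.facebook.com/"

def share_count_url(page_url: str) -> str:
    """Build the unauthenticated Graph API lookup URL for a page."""
    return GRAPH_ENDPOINT + quote(page_url, safe="")

def parse_share_count(graph_json: str) -> int:
    """Pull the raw share count out of a Graph API response body."""
    data = json.loads(graph_json)
    return int(data.get("shares", 0))

if __name__ == "__main__":
    # Live lookup (only works while Facebook keeps this endpoint open):
    # with urlopen(share_count_url("http://www.seomoz.org")) as resp:
    #     print(parse_share_count(resp.read().decode()))
    sample = '{"id": "http://www.seomoz.org", "shares": 12345}'
    print(parse_share_count(sample))
```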
If you want better data about your competitors, you want to go to Topsy. You can see details about SEOmoz here: analytics.topsy.com/?q=seomoz.org
Thomas
Totally agree, EGOL.
If you can't beat the competitors' content, you don't really deserve to beat them in the rankings. Although it certainly is possible to beat competitors with good links in the short term, it's really hard in the long term.
This being said, if I feel I have (or can create) the content to beat the competitors, I certainly do look at the competitors' links to scout for opportunities.
I usually use OSE + the SERP analysis tool (http://pro.seomoz.org/tools/keyword-difficulty). That's enough to get a good overview of all the good links to the competitors. You can go to Majestic for a larger index, but that will probably only add the links that won't matter much to you anyway.
I agree with Robert. The ranking difference between .com / .net and no-hyphen / one-hyphen is going to be minimal. So go for the domain that is easier to read. That will probably benefit you in the long run.
Yes, the custom URL variables do indeed overwrite the original referral source.
As far as I know, there is no easy solution to this (I looked and asked around). So I am afraid it is an either/or choice here.
Here is a Facebook question: I would like to hide some content behind a Facebook wall - i.e. only make it visible to users who press the like button.
Something like the New Yorker did with a Jonathan Franzen story.
My question is: how do I do that?
Thanks!
Thomas
+1 for Rackspace. Very good uptime + super quick and dedicated support.
Totally agree with Keith on this one.
Press releases are not a suitable channel for infographics. They make it way too obvious that you are trying to get links.
I'd also recommend a structured outreach to blogs.
Best,
Thomas
Hi Mozzers,
Any of you know a way to bulk check which CMS a list of sites run on?
Let's say I'd like to develop a plugin for a CMS (e.g. WordPress or vBulletin). But that'd only make sense if some of the sites in the niche use this platform.
And I'd like to avoid checking them manually with WASP.
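For context, the kind of fingerprint check I'm hoping to automate looks roughly like this (a rough sketch; the fingerprint strings are just common defaults and easy to fool):

```python
import urllib.request

# Very rough CMS fingerprints -- common defaults, not reliable,
# but good enough for a first bulk pass over a list of sites.
FINGERPRINTS = {
    "WordPress": ["wp-content", "wp-includes"],
    "vBulletin": ["vBulletin", "vbulletin"],
}

def detect_cms(html: str) -> str:
    """Guess the CMS from page source by simple substring matching."""
    for cms, needles in FINGERPRINTS.items():
        if any(needle in html for needle in needles):
            return cms
    return "unknown"

def check_sites(urls):
    """Fetch each homepage and report a best-guess CMS per URL."""
    results = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
            results[url] = detect_cms(html)
        except OSError:
            results[url] = "error"
    return results
```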
Thanks!
Thomas
Happy to help. I think the problem might be that it looks too much like a lead gen site, and too little like a site with great content / serious services.
Hi Bill,
I am sorry if this sounds a bit harsh, but I would not like to give any information to your website. Although the design is simple, it looks kinda spammy to me. It looks like a site that wants my email address and will then sell it to others.
Good CRO starts with data collection. What I'd do if I was you is to ask a bunch of Amazon mTurkers some questions about your site. A good list of questions is the one Google uses - you can find it here.
Also, a small thing: the animated favicon - I would definitely find a new, static one.
So the short answer to your question: invest in better design. But base it on data.
Best,
Thomas
That's what I meant, Dejan. Submit the same site twice to the same account. Thanks for putting it more concisely than me.
You need to submit your site with and without www to the same account, and thereafter select which version you'll use.
Also, make sure you are consistent: set up a corresponding .htaccess 301 redirect to either the www or non-www version.
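For reference, the redirect part can be sketched like this (assumes Apache with mod_rewrite; example.com is a placeholder, and you'd swap the two hostnames around if you prefer the non-www version):

```apache
# .htaccess: 301-redirect the non-www hostname to www
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```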
Best,
Thomas
Hi Gareth,
From my experience, it doesn't really matter that much. The second one looks slightly less spammy, which might work to your advantage in terms of interacting with users.
Also, you might want to look at the SERP before developing the domain. If there are already similar exact match domains ranking on the keyword, it's harder to get a top ranking.
Best,
Thomas
Needless to say - gaining Twitter followers through sincere outreach and good content is the right way to do it.
This being said, it can be a bit hard to get started on Twitter. One tool that lets you buy "real followers" is Twiends.
It's a points based platform. You can either earn points by following other people or buy the points if you don't want to.
Hope it works for you!
Best,
Thomas
Hi Mozzers,
I am working with a website with very decentralized ownership. There are two different languages, each with a different owner.
Owner A keeps linking to crap sites that hurt the entire site.
My question is this: Is there a way - through .htaccess or robots.txt - that Google can be asked NOT to crawl the links to external crap sites?
The problem is that Owner B cannot control Owner A's HTML, and thus cannot implement rel="nofollow" on the links.
Thanks!
I am about to start some research on software for experimental design on websites, so I'd really like to know: which A/B testing software do you prefer, and why?
Yes, rel=canonical seems perfect for this job.
And I highly recommend doing it, as so many pages might be seen as low-quality content by Google post-Panda, and thereby hurt your entire site.
Hi there,
I have previously worked with Distilled (www.distilled.net), who are also an official partner of SEOmoz. They have an office in Seattle, and just opened a new one in New York today.
I have heard great things about Seer (www.seerinteractive.com), but I haven't worked with them. They are based in Philly.
Best,
Thomas
Thank you for the inputs.
I fully agree it depends on me and my products - this is why I am asking for ideas to find metrics rather than asking for numbers.
Thanks!
Hi Mozzers,
What variables could be used to calculate the value of a Twitter follower? I'm thinking something along the lines of:
Something else?
Full disclosure: I am a bit biased on this one.
If you are located in the US, you should go for MozCon. The primary reason is that the format at MozCon is presentations instead of panels. This gives the presenters an opportunity for greater depth, which I really appreciate.
If you are based in Europe, you might want to check out Distilled's SEO conference in London in the fall.
But it's a matter of taste - some people prefer panels over presentations. If you do, SMX is worth attending.
I would not recommend SES (I have only been to one, but it was subpar compared to the SEOmoz/Distilled/SMX conferences).
In my experience, there are two ways to pull off the B2B badges tactic:
1. For B2B badges to work with big brands, you probably need to be some sort of industry leader being able to give out awards that actually mean something.
2. If you are not an industry leader, you can give awards/badges to those companies not receiving a whole lot of attention usually. They tend to be used a lot.
The best example I can think of is Google's AdWords / Analytics certifications used by quite a lot of SEM agencies.
I have the same problem. Running multiple campaigns can result in quite a few email notifications.
I know SEOmoz is aware of this and looking into it.
Thomas
I'd probably go with a book - either Inbound Marketing or Friends With Benefits
Oh by the way - there is a similar question here: http://www.seomoz.org/q/link-building-companies-for-super-hard-niches. You might want to try some of the companies.
Interesting question. I have tried several top SEO companies for link building. And none of them have really delivered anything. As Tom Critchlow usually says, the problem is "you can't outsource giving a shit."
What usually happens is this:
1. You pay a premium for a reputable SEO company.
2. They pass you on to a junior-level SEO to do the link building.
3. They send a monthly report with all sorts of on-site metrics and keyword tracking you probably don't need.
4. They send a list of links - and it turns out the links are sidebar links from irrelevant websites (or worse) with over-optimized anchor text.
Sorry about being so pessimistic on this one. But after trying multiple reputable companies, I have given up on outsourcing link building. The risk of receiving low-value-for-money or a penalty is too high for me.
I think the context is hugely influential on how much registration change matters.
Let's say I buy Site X from M Johnson, and I don't change anything on the website. In this case I am sure that nothing will happen.
Let's say I buy Site X from M Johnson, and change all the content on the website. In fact, I turn an old NGO website into a pharmaceutical website selling all sorts of pills. In this case, I am sure the registration data matters, as it helps Google understand that something shady is going on.
Hah okay. Me too. Anyone outside Google?
I was wondering about which people in the industry you'd like to ask a question the most in the q&a?
On my list is Matt Cutts (obviously), Marshall Simmonds, Jimmy Wales and Biz Stone.
Yes, but we need to remember that the 100-links limit is very arbitrary. Some sites can have fewer than 100 links on a page while others can have thousands and still be crawled. It really comes down to the authority of the website.
Cool - seems like we all agree that linking out is good / the right thing to do.
I am still amazed that so many people fear they will lose all their "PageRank juice" if they give a single link to Wikipedia or whatever.
Indeed - please mark the question as answered if you are satisfied at this point.