The SERP is the front-end to Google's multi-billion dollar consumer research machine. Google knows what searchers want. In this data-heavy talk, Rob will teach you how to uncover what Google already knows about what web searchers are looking for. Using this knowledge, you can deliver the right content to the right searchers at the right time, every time.
QUOTE: “I don’t think we even see what people are doing on your website – if they’re filling out forms or not, if they’re converting to actually buying something – so if we can’t really see that, then that’s not something that we’d be able to take into account anyway. So from my point of view, that’s not something I’d really treat as a ranking factor. Of course, if people are going to your website and they’re filling out forms or signing up for your service or for a newsletter, then generally that’s a sign that you’re doing the right things.” – John Mueller, Google, 2015
After a while, Google will know about your pages, and keep the ones it deems ‘useful’ – pages with original content, or pages with a lot of links to them. The rest will be de-indexed. Be careful – too many low-quality pages on your site will impact your overall site performance in Google. Google is on record talking about good and bad ratios of quality content to low-quality content.
Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google has to find that content, and it finds it by following links, just as you do when you click one. So your first job is to tell the world about your site so that other sites link to yours. Don’t worry about reciprocating links to more powerful sites, or even to real sites – I think this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.
Being ‘relevant’ comes down to keywords and key phrases – in domain names, URLs, title elements, the number of times they are repeated in the text on the page, text in image alt tags, rich markup and, importantly, in keyword links to the page in question. If you are relying on manipulating hidden elements on a page to do well in Google, you’ll probably trigger spam filters. If a signal is ‘hidden’ in on-page elements, beware of relying on it too much to improve your rankings.
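A quick way to sanity-check the on-page placements listed above is a small audit script. The sketch below (a hypothetical helper, not any tool mentioned here, assuming keyword matching is a simple case-insensitive substring check) counts a key phrase in the title element, visible body text, and image alt attributes, using only Python's standard library:

```python
# Minimal on-page keyword audit: title element, body text, image alt text.
from html.parser import HTMLParser

class KeywordAudit(HTMLParser):
    """Count occurrences of a key phrase in a few on-page elements."""

    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self._in_title = False
        self.title_hit = False   # does the <title> contain the phrase?
        self.body_hits = 0       # occurrences in visible text
        self.alt_hits = 0        # img alt attributes containing the phrase

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        if tag == "img":
            alt = dict(attrs).get("alt") or ""
            if self.keyword in alt.lower():
                self.alt_hits += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        text = data.lower()
        if self._in_title:
            if self.keyword in text:
                self.title_hit = True
        else:
            self.body_hits += text.count(self.keyword)

page = """<html><head><title>Blue Widgets | Shop</title></head>
<body><h1>Blue widgets</h1><p>Our blue widgets are durable.</p>
<img src="w.png" alt="a blue widget on a desk"></body></html>"""

audit = KeywordAudit("blue widget")
audit.feed(page)
print(audit.title_hit, audit.body_hits, audit.alt_hits)  # True 2 1
```

This only measures presence, not quality – as the paragraph above notes, stuffing the phrase into hidden elements is more likely to trigger spam filters than to help.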
This broken-link checker makes it easy for a publisher or editor to make corrections before a page is live. Think about a site like Wikipedia, for example. The Wikipedia page for the term "marketing" contains a whopping 711 links. Not only was Check My Links able to detect this number in a matter of seconds, but it also found (and highlighted) seven broken links.
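The core of a checker like this is simple: collect every anchor href on the page, request each one, and flag any that return an error status. The sketch below (an illustration, not how Check My Links actually works; `find_broken` and the stubbed status lookup are assumptions) separates link collection from status fetching, so a real HTTP HEAD request could be swapped in for the stub:

```python
# Minimal broken-link check: collect anchor hrefs, flag those with 4xx/5xx status.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken(html, status_of):
    """Return (all_links, broken_links); status_of(url) -> HTTP status code."""
    collector = LinkCollector()
    collector.feed(html)
    broken = [url for url in collector.links if status_of(url) >= 400]
    return collector.links, broken

page = ('<p><a href="https://example.com/ok">ok</a>'
        '<a href="https://example.com/gone">gone</a></p>')

# Stubbed statuses stand in for real HEAD requests.
statuses = {"https://example.com/ok": 200, "https://example.com/gone": 404}
links, broken = find_broken(page, statuses.get)
print(len(links), broken)  # 2 ['https://example.com/gone']
```

On a page the size of Wikipedia's "marketing" article, the slow part is the hundreds of network requests, not the parsing – which is why browser extensions fire them concurrently.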