Brand-new keywords sound tricky to find, except for a batch of easy ones that comes around every January: simply add the year to whatever keyword you’re targeting. You can start getting traffic from “2020” keywords long before they show any search volume in typical keyword-research tools, since those tools’ data lags behind. (Hat tip to Glen Allsopp, who I got that idea from.)
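As a rough illustration, the year-suffix idea above can be automated in a few lines of Python. The seed keywords here are invented for the example:

```python
from datetime import date

def year_variants(keywords, year=None):
    """Append the current (or a given) year to each seed keyword."""
    year = year or date.today().year
    return [f"{kw} {year}" for kw in keywords]

# Hypothetical seed list for illustration only
seeds = ["best running shoes", "seo checklist"]
print(year_variants(seeds, year=2020))
# → ['best running shoes 2020', 'seo checklist 2020']
```

You could feed the output straight into a rank tracker before the terms show any measurable volume.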
However, that’s totally impractical for established sites with hundreds of pages, so you’ll need a tool to do it for you. For example, with SEMRush, you can type your domain into the search box, wait for the report to run, and see the top organic keywords you are ranking for. Or, use their keyword position tracking tool to track the exact keywords you’re trying to rank for.
Use the Keyword Planner to flag any terms on your list that have too little (or too much) search volume and don't help you maintain the healthy mix discussed above. Before you delete anything, though, check each term's trend history and projections in Google Trends. You may find that some low-volume terms are actually worth investing in now, so you can reap the benefits later.
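To make that filtering step concrete, here is a minimal sketch, assuming you've exported keyword/volume pairs from Keyword Planner. The thresholds and sample data are illustrative assumptions, not recommendations:

```python
def flag_by_volume(keyword_volumes, low=100, high=50_000):
    """Bucket keywords into too-low, too-high, and healthy volume ranges."""
    buckets = {"too_low": [], "too_high": [], "healthy": []}
    for kw, volume in keyword_volumes.items():
        if volume < low:
            buckets["too_low"].append(kw)
        elif volume > high:
            buckets["too_high"].append(kw)
        else:
            buckets["healthy"].append(kw)
    return buckets

# Invented sample data for illustration
sample = {
    "vegan protein powder": 12_000,
    "protein": 450_000,
    "vegan protein powder for cyclists": 40,
}
print(flag_by_volume(sample))
```

Terms landing in `too_low` are the ones worth cross-checking in Google Trends before you drop them.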
QUOTE: “So it’s not something where we’d say, if your website was previously affected, then it will always be affected. Or if it wasn’t previously affected, it will never be affected.… sometimes we do change the criteria…. category pages…. (I) wouldn’t see that as something where Panda would say, this looks bad.… Ask them the questions from the Panda blog post….. usability, you need to work on.“ John Mueller, Google.

QUOTE: “Ultimately, you just want to have a really great site people love. I know it sounds like a cliché, but almost [all of] what we are looking for is surely what users are looking for. A site with content that users love – let’s say they interact with content in some way – that will help you in ranking in general, not with Panda. Pruning is not a good idea because with Panda, I don’t think it will ever help mainly because you are very likely to get Panda penalized – Pandalized – because of low-quality content…content that’s actually ranking shouldn’t perhaps rank that well. Let’s say you figure out if you put 10,000 times the word “pony” on your page, you rank better for all queries. What Panda does is disregard the advantage you figure out, so you fall back where you started. I don’t think you are removing content from the site with potential to rank – you have the potential to go further down if you remove that content. I would spend resources on improving content, or, if you don’t have the means to save that content, just leave it there. Ultimately people want good sites. They don’t want empty pages and crappy content. Ultimately that’s your goal – it’s created for your users.” Gary Illyes, Google 2017


When you write a page title, you have a chance right at the beginning of the page to tell Google (and other search engines) if this is a spam site or a quality site – such as – have you repeated the keyword four times or only once? I think title tags, like everything else, should probably be as simple as possible, with the keyword once and perhaps a related term if possible.
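The "keyword once, keep it simple" advice above can be sanity-checked programmatically. This is a hedged sketch with assumed thresholds (one keyword occurrence, roughly 60 characters), not a Google rule:

```python
def check_title(title, keyword, max_repeats=1, max_length=60):
    """Return a list of warnings for a page title (thresholds are assumptions)."""
    warnings = []
    repeats = title.lower().count(keyword.lower())
    if repeats > max_repeats:
        warnings.append(f"keyword repeated {repeats} times")
    if len(title) > max_length:
        warnings.append(f"title is {len(title)} characters (over {max_length})")
    return warnings

# A deliberately spammy example title
print(check_title("Cheap Shoes - Cheap Shoes Online - Buy Cheap Shoes", "cheap shoes"))
# → ['keyword repeated 3 times']
```

Run it over a crawl export of your titles to spot pages that look stuffed before Google does.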
TASK – If running a blog, first clean it up. To avoid creating pages that might be considered thin content in six months, consider planning a wider content strategy. If you publish 30 ‘thinner’ pages about various aspects of a topic, you can then fold them together into a single topic-centred page that helps a user understand something related to what you sell.

Everyone knows intent behind the search matters. In e-commerce, intent is somewhat easy to see. B2B or, better yet, healthcare, isn't quite as easy. Matching persona intent to keywords requires a bit more thought. In this video, we'll cover how to find intent modifiers during keyword research, how to organize those modifiers into the search funnel, and how to quickly find unique universal results at different levels of the search funnel to utilize.


This relationship between rankings and clicks (and traffic) is strongest amongst the top 3 search results. However, the layout of the search results pages is constantly changing, with the inclusion of Google’s Knowledge Graph data and the integration of Universal Search elements (SERP Features) like videos, maps and Google Shopping ads. These developments can mean that the top 3 organic rankings are no longer the 3 best positions on the SERP, as heatmap and eye-tracking tests have demonstrated.
Some page titles do better with a call to action – one which reflects exactly a searcher’s intent (e.g. to learn something, buy something, or hire someone). THINK CAREFULLY before auto-generating keyword-phrase footprints across a site using boilerplating and article-spinning techniques. Remember, the title is your hook in search engines if Google chooses to use it in its search snippet, and there are a lot of competing pages out there in 2019.
At the moment, I don’t know you, your business, your website, your resources, your competition or your product. Even with all that knowledge, calculating ROI is extremely difficult because ultimately Google decides on who ranks where in its results – sometimes that’s ranking better sites, and sometimes (often) it is ranking sites breaking the rules above yours.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'Conversational Search', where the system pays more attention to each word in the query in order to better match the pages to the meaning of the query, rather than a few words.[39] With regards to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to produce high-quality content and rely on them to be 'trusted' authors.

QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story (VIDEO)

Because someone who is looking for something that specific is probably a much more qualified searcher for your product or service (presuming you're in the blogging space) than someone looking for something really generic. And because long-tail keywords tend to be more specific, it's usually easier to tell what people who search for those keywords are really looking for. Someone searching for the head term "blogging," on the other hand, could be searching it for a whole host of reasons unrelated to your business.
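As a rough heuristic, the head-term vs long-tail distinction above might be sketched with word count as a proxy. This is an illustrative assumption, not a formal definition:

```python
def classify(keyword, long_tail_min_words=3):
    """Crude split: short queries as head terms, longer ones as long-tail."""
    return "long-tail" if len(keyword.split()) >= long_tail_min_words else "head"

for kw in ["blogging", "how to start a food blog"]:
    print(kw, "→", classify(kw))
# → blogging → head
# → how to start a food blog → long-tail
```

Real intent can't be read off word count alone, but a split like this is a quick first pass for sorting a large keyword export.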
Domain authority is an important ranking phenomenon in Google. Nobody knows exactly how Google calculates, ranks and rates the popularity, reputation, intent or trust of a website, outside of Google, but when I write about domain authority I am generally thinking of sites that are popular, reputable and trusted – all of which can be faked, of course.
Don’t break Google’s trust – if your friend betrays you, depending on what they’ve done, they’ve lost your trust, sometimes altogether. If you do something Google doesn’t like, such as manipulating its results in a way it doesn’t want, you will lose trust, and in some cases, all trust (in some areas). For instance, your pages might still be able to rank, but your links might not be trusted enough to vouch for another site. DON’T FALL OUT WITH GOOGLE OVER SOMETHING STUPID
Mobile-first design has been a best practice for a while, and Google is finally about to support it with mobile-first indexing. Learn how mobile-first indexing will give digital marketers their first real swing at influencing Google’s new AI (Artificial Intelligence) landscape. Marketers who embrace an accurate understanding of mobile-first indexing could see a huge first-mover advantage, similar to the early days of the web, and we all need to be prepared.
Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines like Google and Yahoo.[citation needed] Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3]

I’ve got by by thinking external links to other sites should probably be on single pages deeper in your site architecture, with those pages receiving all your “Google Juice” once it’s been “soaked up” by the higher pages in your site structure (the home page, your category pages). This tactic is old school, but I still follow it. I don’t think you need to worry about that too much in 2019.
Google engineers are building an AI – but it’s all based on simple human desires to make something happen or indeed to prevent something. You can work with Google engineers or against them. Engineers need to make money for Google but unfortunately for them, they need to make the best search engine in the world for us humans as part of the deal. Build a site that takes advantage of this. What is a Google engineer trying to do with an algorithm? I always remember it was an idea first before it was an algorithm. What was that idea? Think “like” a Google search engineer when making a website and give Google what it wants. What is Google trying to give its users? Align with that. What does Google not want to give its users? Don’t look anything like that. THINK LIKE A GOOGLE ENGINEER & BUILD A SITE THEY WANT TO GIVE TOP RANKINGS.
QUOTE: “Returning a code other than 404 or 410 for a non-existent page (or redirecting users to another page, such as the homepage, instead of returning a 404) can be problematic. Firstly, it tells search engines that there’s a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site’s crawl coverage may be impacted (also, you probably don’t want your site to rank well for the search query” GOOGLE
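The quote's point about status codes can be summarised in a small classifier. The categories below are my paraphrase of that guidance, and "soft 404" is the standard term for a 200 returned on a non-existent URL:

```python
def classify_missing_page(status_code):
    """Classify the HTTP status a server returns for a non-existent URL."""
    if status_code in (404, 410):
        return "correct: tells crawlers the page is gone"
    if status_code in (301, 302):
        return "suspect: redirecting missing URLs to another page can mislead crawlers"
    if status_code == 200:
        return "soft 404: search engines may crawl and index a non-existent page"
    return f"unexpected status {status_code}"

# Simulated check: what a server returned for a deliberately bogus URL
print(classify_missing_page(200))
```

In practice you would request a made-up URL on your own domain and pass the response's status code to a check like this.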

Understanding the balance between terms that are more difficult due to competition and terms that are more realistic will help you maintain the same kind of balance that the mix of long-tail and head terms allows. Remember, the goal is to end up with a list of keywords that provides some quick wins but also helps you make progress toward bigger, more challenging SEO goals.