Reposted from RxDigitalMarketing.com. Click here to read the full post and listen to the podcast: http://rxdigitalmarketing.com/index.php/2017/12/11/seo-sem-digital-marketers-part-2/
This is Part 2 of the Search Engine Optimization and Search Engine Marketing series, or SEO and SEM as they’re more widely known. In the first part of the SEO/SEM series I covered the basics of what SEO and SEM are, how they are alike, and how they differ. I also discussed a few tools from Google that will help you get started. If you haven’t checked that out yet, you definitely should follow this link: SEO & SEM for Digital Marketers — Part 1. In this post I’m going to build on that by introducing a few new tools, then getting into SEO and SEM strategies, and ending with some tips on how to keep you and your business out of trouble.
Google Search Console
Google Search Console Overview
After Google Analytics, the second tool you’ll want to get familiar with is the Google Search Console. Again, this is a free tool. Those of you who may know this as the Google Webmaster Tools should note that Google changed the name to Google Search Console back in 2015. The tool itself, however, still does a lot of the same things.
At its core, Google Search Console is a tool that helps you monitor and maintain your presence in Google’s Search results. Sounds a lot like SEO, doesn’t it? Per Google, through Search Console webmasters can discover how Google Search – and the world – sees their website. By monitoring performance in the Google Search results you can:
- Make sure that Google can access your content
- Submit new content for crawling and remove content you don’t want shown in search results
- Create and monitor content that delivers visually engaging search results
- Maintain your site with minimal disruption to search performance
- Monitor and resolve malware or spam issues so your site stays clean
- Optimize your ranking
- Make informed decisions about the appearance of your site’s search results
As a service, Search Console is quite a good tool. Once you have it set up, Search Console will actually email you if anything unusual happens with one of your properties, like you’ve been hacked or Google has issues crawling or indexing your site. Unfortunately, getting set up can be a little bit of a pain if you’re not technical. This is for security reasons and it’s only a one-time deal, so just go with it. In short, you’ll have to:
- Create a Google Search Console account
- Add your property (or website)
- Prove that you own it by uploading an HTML file with a specific code that Google provides you, altering your DNS records, adding a <meta> tag to your HTML, or one of several other methods.
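As an illustration, the <meta> tag method looks like this. The token below is a placeholder; Google generates a unique value for each property:

```html
<!-- Hypothetical example: Google supplies the actual verification token -->
<head>
  <meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
</head>
```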
Sitemaps

Sitemaps, in principle, are fairly simple. A sitemap is a list of a website’s pages, typically organized in a hierarchical fashion, that is accessible to web crawlers or users. When planning a website, the sitemap may be used for user experience (UX) and content planning. Once the website is live, the two primary versions are the HTML sitemap and the XML sitemap. Most of us have used an HTML sitemap at some time. HTML sitemaps are designed for the user, specifically organized to help them find content on the site.
An XML Sitemap is a structured sitemap not meant for humans. XML, Extensible Markup Language, is a way of marking up data such that it is machine readable. The structured format is designed to tell the search engine about the pages on the site, their relative importance to each other, as well as how often they are updated.
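A minimal XML sitemap following the sitemaps.org protocol might look like this (the URLs, dates, and priority values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; priority is relative to your own pages -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-12-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```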
Robots.txt

Pages disallowed in the robots.txt file can still appear in search engine results. Many times this is because the page is linked to from a page that is indexed. Directives in the robots.txt file can target specific search engines, disallow the entire website or just a directory or file, and even delay crawling of the site so crawlers do not visit it too often if the content isn’t changing. In the end, however, it is ultimately up to the search engine itself to decide how to interpret the robots.txt file.
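For example, a simple robots.txt might look like this (the paths are hypothetical; note that Google ignores the Crawl-delay directive, though some other engines honor it):

```text
# Applies to all crawlers
User-agent: *
Disallow: /private/          # keep this directory out of crawls
Disallow: /drafts/page.html  # block a single file

# Honored by some engines, ignored by Google
Crawl-delay: 10

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```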
Google Tag Manager
Do you need Google Analytics and Google Tag Manager?
Google Tag Manager is especially useful if you have several different websites or other digital properties to manage. Recall that SEO and SEM are not “set it and forget it” tactics. To do them right they need to be actively managed. Some organizations have many digital properties, making the efficiency benefit of Google Tag Manager clear.
The other benefit of Google Tag Manager is the flexibility to customize tracking. Analytics tracking systems like Google Analytics offer a whole lot of tracking points “out of the box”, but there are still elements that need to be added through customizations. An example of this is understanding how far down on the page someone scrolls or tracking outbound links. Through Google Tag Manager administrators can “push” this customization out to many websites without ever modifying the websites’ code directly.
Google AdWords

If Google Analytics is a tracking and analytics service, and Google Tag Manager is a service used to manage marketing and analytics tags such as the ones that Google Analytics creates, then Google AdWords is the equivalent advertising platform. In short, the Google AdWords platform allows paid advertisers to create and distribute online ads. If you recall, SEO refers to organic, or unpaid, search and SEM refers to paid. Google AdWords is where you go to pay.
Pay Per Click (PPC)
The first thing to understand is what you pay for with Google AdWords. Pay-per-click (PPC), also sometimes known as cost-per-click (CPC), is a model in which you only pay when someone clicks on your ad. Contrast this to more traditional cost per impression (CPI) advertising, where you pay just to display the ad to someone.
Note that CPI is sometimes referred to as CPM, with “M” standing for the Latin mille, which translates to 1,000. So CPM is “Cost Per 1,000 Impressions”. This was an early attempt to mirror Internet advertising models to television, radio, and print, where you can’t actually track if someone sees your ad or takes an action because of it.
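To make the billing difference concrete, here is a quick sketch (all prices and volumes are made up for illustration):

```python
def cpc_cost(clicks, price_per_click):
    """Pay-per-click: you pay only when someone clicks the ad."""
    return clicks * price_per_click

def cpm_cost(impressions, price_per_mille):
    """Cost per mille: you pay per 1,000 impressions, clicked or not."""
    return impressions / 1000 * price_per_mille

# 100,000 impressions with a 2% click-through rate yields 2,000 clicks
impressions = 100_000
clicks = int(impressions * 0.02)

print(cpc_cost(clicks, 0.50))       # 2,000 clicks at $0.50 each -> 1000.0
print(cpm_cost(impressions, 4.00))  # 100 "mille" at $4.00 each  -> 400.0
```

With a low click-through rate, PPC can be the cheaper model; with a very high one, CPM can be. Which model wins depends entirely on your numbers.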
In their simplest form, keywords are terms that capture the essence or topic of a document and are used to organize and retrieve documents. Within organic search, or SEO, keywords for your website are derived automatically by the search engine index based on the content. In SEM, you can buy the ones you want. The two models for buying keywords are flat-rate and bid-based. Flat-rate is just as it sounds – publishers have a rate card and advertisers agree to pay a set price per keyword. I won’t go deep into this topic because it is typically done through specialized search engines, not the more generalized Google.
Bid-Based Keyword Buying
In bid-based advertising, rather than pay a set fee for a keyword, advertisers compete against each other in an auction held by the publisher or search engine. Each advertiser names the maximum they are willing to pay for the keyword. When a user searches for that keyword, search engine algorithms automatically determine which ad to show. To make it more targeted, advertisers can also specify geo-location, dates, and times along with their maximum bid. All this is considered in a fraction of a second to determine if your ad will be displayed, and what position on the SERP it will get.
As you can imagine, quite frequently there are multiple advertisers vying for the same keywords. In other situations, advertisers attempt to buy keywords that are not reflective of the content on their website. To handle both situations, search engines use a quality score.
None of the major search engines have revealed the secret formula for quality scoring, which makes sense for two reasons. First, as soon as they reveal their formula, people will naturally adjust their content to the algorithm rather than simply publish high-quality content. Remember, Google makes its money by being a good search engine, so it does everything it can to ensure users have a good experience with the results it chooses to display. Second, the algorithms are constantly changing. Even if they were to publish it, it’d change before you had a chance to alter your content.
What we do know about quality scoring is that it is used in two places. First, when the algorithm is deciding which of two competing advertisements to show, quality score is considered. In simple terms, the formula becomes your bid * the quality score of your content. The higher the result, the higher your ranking on the SERP.
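That ranking idea can be sketched in a few lines. This is a simplification with made-up numbers; the real auction considers many more signals, and the exact formula is not public:

```python
# Simplified sketch: rank competing ads by bid * quality score.
ads = [
    {"advertiser": "A", "bid": 2.00, "quality_score": 4},  # rank value 8.0
    {"advertiser": "B", "bid": 1.50, "quality_score": 7},  # rank value 10.5
    {"advertiser": "C", "bid": 3.00, "quality_score": 2},  # rank value 6.0
]

ranked = sorted(ads, key=lambda ad: ad["bid"] * ad["quality_score"], reverse=True)
print([ad["advertiser"] for ad in ranked])  # -> ['B', 'A', 'C']
```

Notice that advertiser B wins the top position despite having the lowest bid, because its higher quality score more than compensates. This is the sense in which good SEO makes SEM cheaper.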
The second place quality scores are used is in bid pricing. Again, in an effort to keep the quality of results high, Google will often set a minimum bid based on the quality score. A higher quality score for a given keyword may allow you to bid a lower amount, while a low quality score for the same keyword will cost you extra. So, in a way, good SEO will help give you more affordable SEM.
What We Know About Quality Scores
There are a lot of theories and several guidance documents to help publishers generate higher quality scores, and subsequently higher rankings. Again, there are no specific directions from providers, but the following factors are a good start if you’re new to SEM or SEO:
- Ad copy should be relevant to the copy on the page. Search engines don’t like bait and switch.
- Target landing pages should be of good quality with easy navigation and original content.
- Websites should be mobile ready and have a reasonable load time.
- Geography is also considered, depending on the topic.
- Historic click-through rates of an ad are used, along with the CTR of the overall account.
It’s been said that the best place to hide a dead body is on page two of the Google search results. According to research done by Ignite Visibility (which can be found here: https://ignitevisibility.com/ctr-google-2017/), the first position on the first page of results has an approximately 45% click-through rate (CTR). Advanced Web Ranking’s CTR study (which can be found here: https://www.advancedwebranking.com/cloud/ctrstudy/) puts the CTR of the first position on Google’s SERP closer to 30% on desktop and 22% on mobile devices. On average the 4th position has about a 5.5% CTR, which drops to below 1% by the 10th position. Both of these studies indicate that if you’re not in the first three results, or at least on the first page, your chance of someone finding your website is pretty low.
SEO Strategies Overview
Per Wikipedia, as an Internet marketing strategy, SEO considers how search engines work, what people search for, and the actual search terms or keywords typed into search engines. Optimizing a website may involve editing the content itself (more on this later) to increase its relevance to specific keywords. On the backend, SEO could mean properly structuring the HTML and associated code to remove barriers to the indexing activities of search engines. Outside the site, some people try promoting a site to increase the number of links to it, which is a historic measure of a page’s ranking. Unfortunately, many mal-intended individuals have tried abusing this over the years by creating thousands of interlinked fake websites with the sole intent of increasing their SEO ranking. Needless to say, Google is wise to this, and I do not recommend SEO “tricks” of this nature.
Knowing Your Competition
One of the first things you’ll want to do is understand your competition. You might think you know, but do a quick Google search on terms you think you may want to rank highly in. What you’ll most likely find is that the competitive set only loosely resembles the competitive list you’d use for a physical sales team. Many times companies are micro-focused on their industry and forget that there are lots of third parties. Wikipedia routinely ranks high in many organic search keywords because of both the amount and quality of their content. There are also government sites, organizations, industry competitors, other industries that happen to share the same keyword, news organizations, and bloggers.
Knowing your competition in combination with the tools outlined above will help you focus your content on what people searching for your targeted keywords want. Research before you author your content because the best strategy is to make sure you’re giving users what they want.
SEO and Mobile
Google is developing and pushing mobile search as the future in all of its products. By May 2015, mobile search had surpassed desktop search and, in response, many brands have begun to take a different approach to their internet strategies. Around that same time Google announced that “Google Search will be expanding its use of mobile-friendliness as a ranking signal. This change will affect mobile searches in all languages worldwide and will have a significant impact in Google Search results.”
To help you with this Google offers an automated “mobile-friendly test” (you can find the test here: https://search.google.com/test/mobile-friendly). Do the analysis and pay attention to the recommendations.
Duplicate content is a huge issue for search engine indexers. If you have a website that can be found at www.example.com, example.com, www.example.com/index.php, example.com/index.php, and example.com/index.php?referrer=twitter, that’s a lot of ways to get at the same content. If you have a second domain, which some sites do, this can get even more complex. This can create ambiguity for the search engine, and it may miss some of your unique content. Further, having too much non-unique content can be interpreted as a “black-hat” effort and you may actually be penalized for it. Last, but not least, you should decide how users view your content, not a random algorithm in the search engine.
Canonical tags are a way to tell search engines that a specific URL is the main one that you want indexed. The tag is a simple HTML tag that you can plan for and code into your website. Some developers may recommend using 301 redirects, but those are different. Just be sure to plan well. Page A pointing to itself as the canonical is fine, but if A points to B and B points to A, or if A is removed one day there can be some confusion, and ambiguity is bad for SEO.
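As an illustration, every variant of a page (say, example.com/index.php and example.com/index.php?referrer=twitter) would carry the same tag in its <head>, pointing at the one preferred URL:

```html
<!-- Illustrative canonical tag; example.com is a placeholder domain -->
<link rel="canonical" href="https://www.example.com/" />
```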
White-hat vs. Black-hat Techniques
While I 100% in no way endorse the use of “black-hat” or shady SEO practices, it is important to know about them so you know when someone is selling you a bill of goods. While standard SEO focuses on the content and the human audiences, there are some nefarious characters who attempt to subvert the system by focusing on weaknesses in the search engine algorithms.
Make no mistake, search engines are wise to these practices, so they either don’t work, or don’t work for long until you get caught or the algorithms are updated. Recall that seven years ago Google updated their algorithm an average of 1.5 times per day! Now, with newer AI implementations, we can be confident this is happening much faster than it used to. The reality is that these practices either do not work, or are a lot of effort and money for short-term gains that would have been better put toward above-board practices for long-term gain.
Some examples of black-hat techniques that no longer work, so don’t bother trying them, are:
- Keyword stuffing: using specific keywords often sounds like it can get you relevance, but use them too often and you may be penalized for poor “netiquette”.
- Invisible text: another trick was to try adding white text on a white background so the engines could see it but users could not. Search engines got wise to this a decade ago.
- Doorways: are webpages designed solely to get a high ranking and then redirect you to the content the publisher wants to show you. Google views this as illicit and actually has a Federal Trade Commission (FTC) link to report this kind of activity in its SEO guidelines.
- Shadow domains: are essentially non-existent web domains that continually redirect users in order to funnel traffic and create the appearance of more traffic than is actually there. If someone offers to increase your SEO quickly, make sure they’re not doing this.
Be Aware of Shady SEO
Again, per Google’s own support pages (https://support.google.com/webmasters/answer/35291), there are certain steps you can take to ensure you don’t get taken for a ride by a deceptive SEO company. According to them, here are a few things that you should know:
- No one can guarantee a #1 rank on Google. There is no “special relationship”, Google does not give anyone “priority”, and there is no secret paid way to submit content to Google for indexing. Learn to use the tools I’ve outlined above; those are the ones.
- Random, out-of-the-blue emails that “noticed you’re not listed in most major search directories…” and offer to help should be taken with the same seriousness as an email that promises a miracle pill that will burn off fat while you sleep or requests help transferring funds for a foreign company. It’s not real, don’t fall for it.
- If a company is secretive about EXACTLY what they are going to do for you, walk away. If something is unclear, ask for clarification. Remember, if they take black-hat action on your behalf and get you banned from Google, they can walk away, you can’t.
- SEO and SEM are different, and Google doesn’t sell organic ranking. So make sure you know what you’re getting. If they promise to list you at the top then it may be a simple matter of burning through your budget on high-cost keyword bids. You could do that yourself, but I wouldn’t, and I certainly wouldn’t pay someone else to do it for you.
- Be on the lookout for long and very specialized keyword phrases. A lot of times you’ll get these anyway, so be aware not to pay for something you don’t need to.
SEM Strategies Overview

There are many good reasons to go with an SEM strategy. First, SEM done right can build traffic quickly. Second, there is a much higher degree of certainty to SEM than there is with SEO. Unfortunately, many times because of limited budgets, people and businesses tend to go with an SEO strategy first. But that’s much slower and far more uncertain, so often they do not start an SEM strategy until they are already in trouble.
SEM is for when you need traffic to your website now and don’t have time to slowly grow a social following, or don’t have enough content or history to grow your SEO ranking. In some cases, you may be up against a well-established brand or significant third party, which is going to make organic growth a lot harder even if you have a better product. SEM levels the playing field and lets smaller players compete with industry incumbents quickly.
Have Good Content
Probably the single best thing you can do to increase your SEO and SEM ranking is actually pretty simple – have good content. The goal of any search engine is to provide a great experience for those that use it to find content. If a user tries and fails to find content too often they might stop trying and switch to a different search engine. Further, if users are regularly burned by paid adverting they’ll also stop clicking on it. Either way, everyone benefits from higher quality content.
Price and Budget
Within SEM, price and budget (along with a few other things) determine the level of traffic directed to your website. From there you can grow by increasing your keyword bid or adding more keywords. In the end, the more you are willing to pay for clicks, the higher and more frequent the ranking, which leads to higher traffic.
From there you may consider a pay per position strategy. As described above, what this comes down to is how much you’re willing to pay for the click. A higher position in the paid search results will cost more, but it will also convert a higher percentage of views into clicks. This is where the art begins to meet the science. It is up to your SEM team to determine the proper investment that will give you the ROI you’re looking for. In some cases it may be worth a higher bid to get more traffic. Others may opt for a lower bid. It is up to you to determine the value of what you can gain from a click. Keep in mind, this may be different for each keyword, and your strategy should reflect that.
For example, suppose the top position for a given keyword costs $5.00 per click and the third position costs $4.50. The third-position advertiser pays 10% less per click than the top advertiser, but may receive 50% less traffic. Advertisers must consider their return on investment and then determine whether the increase in traffic is worth the increase in cost.
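Working through numbers like those directly (the per-click value and click volumes below are assumptions for illustration):

```python
# Hypothetical comparison of two paid positions; all numbers are illustrative.
VALUE_PER_CLICK = 6.00  # assumed revenue attributable to one click

positions = {
    "position 1": {"cpc": 5.00, "clicks": 1000},
    "position 3": {"cpc": 4.50, "clicks": 500},  # 10% cheaper, half the traffic
}

for name, p in positions.items():
    cost = p["cpc"] * p["clicks"]
    revenue = VALUE_PER_CLICK * p["clicks"]
    print(f"{name}: cost=${cost:.2f}, profit=${revenue - cost:.2f}")

# position 1: cost=$5000.00, profit=$1000.00
# position 3: cost=$2250.00, profit=$750.00
```

With these particular assumptions the pricier top position yields more absolute profit; with a lower value per click the math could easily flip, which is exactly the judgment call described above.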
Again, per Wikipedia, one of my favorite resources, to maximize success and achieve scale, automated bid management systems can be deployed. These systems can be used directly by the advertiser, though they are more commonly used by advertising agencies that offer PPC bid management as a service. These tools generally allow for bid management at scale, with thousands or even millions of PPC bids controlled by a highly automated system. The system generally sets each bid based on the goal that has been set for it, such as maximize profit, maximize traffic, get the very targeted customer at break even, and so forth. The system is usually tied into the advertiser’s website and fed the results of each click, which then allows it to set bids. The effectiveness of these systems is directly related to the quality and quantity of the performance data that they have to work with — low-traffic ads can lead to a scarcity of data problem that renders many bid management tools useless at worst, or inefficient at best.
As noted above, different keywords may yield you very different results. Where one keyword may have a high conversion rate, another may have a high conversion amount, and a third may yield high traffic amounts. It will be up to you to determine what your strategy is (i.e., maximizing traffic vs. maximizing profit) and adjust your keywords and bids accordingly.
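One way to compare keywords against a chosen strategy (the keyword names and performance numbers are invented for illustration):

```python
# Invented keyword performance data for illustration only.
keywords = {
    "buy widgets":        {"clicks": 200, "conversion_rate": 0.08, "value_per_sale": 40.0,  "cpc": 1.20},
    "cheap widgets":      {"clicks": 900, "conversion_rate": 0.01, "value_per_sale": 25.0,  "cpc": 0.60},
    "industrial widgets": {"clicks": 50,  "conversion_rate": 0.10, "value_per_sale": 400.0, "cpc": 2.50},
}

def profit(k):
    """Revenue from conversions minus the cost of the clicks."""
    d = keywords[k]
    revenue = d["clicks"] * d["conversion_rate"] * d["value_per_sale"]
    cost = d["clicks"] * d["cpc"]
    return revenue - cost

best_for_traffic = max(keywords, key=lambda k: keywords[k]["clicks"])
best_for_profit = max(keywords, key=profit)
print(best_for_traffic)  # -> cheap widgets
print(best_for_profit)   # -> industrial widgets
```

Here the high-traffic keyword actually loses money on every click, while the low-traffic one is the most profitable. Which one you favor depends on whether your strategy is traffic or profit.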
Google Analytics gives you some good information on how people are finding your website. But, once you start buying ads, Google AdWords gives you GREAT information. Use these tools to adjust your SEM often and ensure you stay on strategy.
Also, consider SEO alongside SEM. Remember that SERP ranking is determined by the amount of your bid multiplied by your quality score. Having good SEO will give you a higher quality score and allow you to pay less for the same keywords your competitors may have to pay more for.
The Right Time to Start Thinking About SEO and SEM
A Google search for the term “SEO” returns 480 million results. There are about 12 on the first page and, as the click-through studies above showed, the majority of the clicks go to the top 3-5 links. Google’s job is to sort through those 480 million results and return the best ones to its users. Your goal is to be the first result, just like everyone else. So, to be the best result you’ll need to start planning for that at the beginning.
A lot of people ask the question, “how do I make my site appear first in Google Search results?”. Google’s answer is: there is no secret, only best practices. And best practices cannot be viewed as an afterthought. Best practices like:
- Provide high quality content on your website, because Google tries to find the best answer to its users’ requests.
- Build a mobile friendly website, because Google considers not only the quality of the content, but also the user experience in getting to it.
- Write and use good, clear titles and accurate meta tag descriptions.
- Add structured data if additional experience enhancing search result features such as stars, event information, or site search boxes apply to you.
- And follow Google guidelines instead of trying to circumvent them or beat the system.
Unfortunately, so many teams I’ve worked with think about SEO and SEM after they’ve got a website. This is waaaay too late. Remember, SEO ranking is heavily influenced by both your copy and your code. Working with a Content Strategist before you write, design, and build your website will go a long way. And having requirements for your technical build will prevent a lot of expensive re-work later.