November 2021’s biggest SEO topics include: a brand new Semrush feature, a new PageSpeed Insights interface, how to generate topic clusters quickly, and how to navigate the most recent Google algorithm update.
Semrush Provides Instant SEO Market Overview of Competitors
Although Semrush users need to spend $200 per month to access this new reporting format, it opens up the door for SEOs to leverage out-of-the-box reporting that identifies market leaders and their winning keyword tactics, establishes benchmark KPIs to march to, and pinpoints potential in up-and-coming trends.
Best practice – When using this tool, manual tactics such as an in-depth SERP analysis should still be used to paint a more detailed picture of keyword intent and metadata patterns.
When Migrating an Old Website To a New Website, 301 Redirects are the Gold Standard
With 2020 encouraging businesses to funnel marketing budgets to improve the overall digital experience, whether that be user experience, site speed, or rebranding, website migrations have become a common initiative. This podcast walks through why the golden rule in any website migration is to always 301 an old URL to a new URL—even when the option to canonicalize the old URL to the new URL is on the table.
Bottom line – Sitemaps and backlinks point to the old URL anyway and Google may or may not honor the canonical, whereas it has to honor the 301.
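As a concrete sketch of that golden rule, here’s what per-URL 301s can look like in an nginx config. The hostnames and paths below are hypothetical, and your server or CMS may expose redirects differently:

```nginx
server {
    listen 80;
    server_name old-site.example;  # hypothetical legacy hostname

    # One-to-one permanent (301) redirects for individual migrated pages
    location = /old-page { return 301 https://new-site.example/new-page; }

    # Pattern-based permanent redirect for an entire migrated section
    rewrite ^/blog/(.*)$ https://new-site.example/articles/$1 permanent;
}
```

Because the redirect is permanent, Google consolidates signals from old backlinks and sitemap entries onto the new URL rather than being left to decide whether to honor a canonical hint.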
Google’s New Version of PageSpeed Insights is Now Live
Google just launched a more user-friendly interface for the PageSpeed Insights tool on November 16th, 2021. The new tool, like the old tool, is a report based on a combination of real-world data (helpful for measuring how actual Chrome users experience the world’s most visited websites) and lab data (helpful in that it creates a debugging environment where developers can reproduce results). Here are the largest differences in the new tool:
✔️ the 10-year-old legacy code was cleaned up to streamline UI
✔️ the navigation labels for mobile and desktop are laid out more intuitively
✔️ the origin summary provides the aggregated real-world data score for the root domain (all pages)
✔️ there’s a summary report card that provides details around data collection for the combined real-world and lab data including: data collection period, length of visit, device type, sample size and more
Bottom line – neither segment of data is more accurate than the other; they still need to be taken into account together. Beyond the UI, the origin summary is the newest feature, allowing users to understand the root domain’s metrics and not just the individual URL being analyzed. Try the new tool for yourself here. Same tool, better looking.
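For anyone who wants the same report programmatically, the data behind the UI is also exposed through the PageSpeed Insights v5 API. Here’s a minimal Python sketch; the sample response is abbreviated and illustrative, but the endpoint and the top-level keys (`loadingExperience` for page-level field data, `originLoadingExperience` for the origin summary, `lighthouseResult` for lab data) are the API’s real ones:

```python
# Sketch: building a PageSpeed Insights v5 API request and splitting field vs. lab data.
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the GET URL for a PageSpeed Insights run (strategy: 'mobile' or 'desktop')."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url, "strategy": strategy})

def summarize(response: dict) -> dict:
    """Pull page-level field data, origin-level field data, and the lab performance score."""
    return {
        "page_field": response.get("loadingExperience", {}).get("overall_category"),
        "origin_field": response.get("originLoadingExperience", {}).get("overall_category"),
        "lab_performance": response["lighthouseResult"]["categories"]["performance"]["score"],
    }

# Abbreviated, illustrative sample of what the API returns:
sample = {
    "loadingExperience": {"overall_category": "AVERAGE"},
    "originLoadingExperience": {"overall_category": "FAST"},  # the new "origin summary"
    "lighthouseResult": {"categories": {"performance": {"score": 0.92}}},
}
print(summarize(sample))
```

Fetching the request URL with any HTTP client returns the full JSON report, so the page-level field data, origin summary, and lab score can be tracked over time without opening the UI.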
The Largest Scientific SEO CTR Study Ever Reveals That the CTR for Position 1 Decreases from 8% on Desktop to 6% on Mobile
The SERP landscape is always evolving, yet this type of study is helpful in that it establishes a foundation for forecasting SEO ROI and projecting market opportunity. The study has many profound takeaways (read those here), but arguably the biggest finding is that users are scrolling more than ever before. The biggest reason for this is likely the recent expansion of page one search results on mobile, which now accommodates upwards of 40 positions (whereas desktop accommodates 10). The evidence: average CTRs for positions 17-20 are higher than for positions 11-16.
The mobile model demonstrates that users are less likely to click listings at the very top of the SERPs compared to when browsing on desktops. In fact, the click-through rate for position 1 decreases from 8.17% on desktop to 6.74% on mobile devices.
Best practice – top use cases for CTR are:
- Use CTR as a directional model for ROI as it relates to organic performance
- Use CTR to inform future SEO tactics and initiatives
- If CTR is high for organic search, consider lowering budget for paid and allocating saved budget to improve organic efforts
- Use CTR to identify room for page title performance opportunities
- Measure your CTR against industry standard CTRs
- Compare behavioral changes through click-throughs across different devices (mobile vs desktop vs iPad)
- Use CTR to determine whether or not on-page content is speaking to the right intent of the searcher
- Use CTR to confirm seasonality trends for specific keywords
Bottom line – searchers are scrolling on Google the same way they scroll through other online platforms, namely websites themselves and social media, and are becoming more sophisticated in the way they search. At the same time, Google continues to provide a richer library of SERPs. Momentic predicts that SEOs talking about position in relation to page 1 won’t be as relevant or prevalent in the future.
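To make the forecasting use case concrete, here’s a toy Python sketch that turns the study’s position-1 CTRs (8.17% desktop, 6.74% mobile) into expected monthly clicks. The search volume is invented for illustration, and a real model would carry the full CTR curve for every position:

```python
# Toy SEO ROI forecast: expected monthly clicks = search volume x CTR at the ranking position.
CTR = {
    "desktop": {1: 0.0817},  # position-1 CTR from the study cited above
    "mobile": {1: 0.0674},
}

def expected_clicks(monthly_volume: int, position: int, device: str) -> float:
    """Estimate monthly organic clicks for a keyword at a given SERP position."""
    return monthly_volume * CTR[device].get(position, 0.0)

# A keyword with 10,000 monthly searches ranking #1:
print(round(expected_clicks(10_000, 1, "desktop")))  # 817
print(round(expected_clicks(10_000, 1, "mobile")))   # 674
```

The same desktop keyword is worth roughly 140 more clicks per month than its mobile counterpart at position 1, which is exactly the kind of gap that should feed device-specific forecasts.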
Semrush Now Provides Keyword Intent as a Dimension
Semrush just released a new intent feature that allows SEOs to quickly unearth, qualify and translate competitor success into actionable strategies. Keyword dimensions worth considering in keyword research include:
✔️ keyword intent – provides insight into audience behavior, the way they have been socialized to discover a line of products or services, and how they’ll likely use (or want to take action on) your website when they land on it.
✔️ keyword monthly search volume – reveals market demand for that keyword
✔️ keyword difficulty – can be used to highlight gaps in target keywords that the competition isn’t currently taking advantage of.
Best practice – It’s still difficult for Semrush to identify navigational intent, namely because discerning a brand from a common term can be difficult, so take these results with a grain of salt and when in doubt, spot check with SERPs.
Bottom line – Semrush’s Intent gauge seems to be accurate the majority of the time based on Momentic’s use of the tool and cross references with the actual SERPs. Keyword research and SERP analysis can be done more quickly with the addition of intent in Semrush’s keyword magic tool, but should always be considered in context with other important dimensions.
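To show how those three dimensions can work together, here’s a hypothetical Python filter over rows shaped like a Semrush keyword export. The keywords, intents, and thresholds are all illustrative, not Semrush’s API:

```python
# Hypothetical keyword-research filter combining intent, volume, and difficulty.
keywords = [
    {"keyword": "buy running shoes", "intent": "transactional", "volume": 12000, "difficulty": 78},
    {"keyword": "how to lace running shoes", "intent": "informational", "volume": 3600, "difficulty": 35},
    {"keyword": "best trail runners 2021", "intent": "commercial", "volume": 8100, "difficulty": 52},
]

def opportunities(rows, intents, min_volume=1000, max_difficulty=60):
    """Keep keywords matching a target intent with enough demand and a winnable difficulty."""
    return [
        r for r in rows
        if r["intent"] in intents and r["volume"] >= min_volume and r["difficulty"] <= max_difficulty
    ]

for row in opportunities(keywords, {"informational", "commercial"}):
    print(row["keyword"])
```

Note that the high-volume transactional keyword drops out on difficulty alone, which is the gap-hunting behavior described under keyword difficulty above.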
Google has Finally Published IP Addresses that Googlebot Uses
Google has released an official list of IP addresses for Googlebots and non-Google bots. This is helpful because Google Analytics segments can be put in place to remove the traffic inflation associated with fake rogue bots (bad thing), allowing more accurate tracking of real visitors (good thing).
The downside is that Google will likely update the file holding the full list of IP addresses daily. The upside is that fake bots that slow a site down by crawling it (which can indirectly impact keyword rankings) can now be identified and blocked.
Best practice – Never assume that filtering out bot traffic is automatically set up in Google Analytics. Real Googlebots aren’t going to send session data to Google Analytics, but fake bots can. If you currently have server rules in place that block traffic from other countries, make sure you allow for Googlebot IPs. You can do this manually, or by using third-party cybersecurity services like Cloudflare to protect your site from rogue bots, which is what Momentic uses for our own website.
Bottom line – If you suspect your website is being crawled by a malicious bot or a spam bot, you now have a list of confirmed IP addresses available from Google. This will allow you to create the necessary filters for the most accurate data as well as avoiding any negative consequences as a result of the non-Google bots.
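A quick way to use that list: check a suspicious visitor IP against Google’s published ranges. The JSON file Google serves uses `ipv4Prefix`/`ipv6Prefix` entries; the sketch below hard-codes two example prefixes rather than fetching the live (daily-updated) file:

```python
# Sketch: verifying whether a crawler IP falls inside Google's published ranges.
import ipaddress

# Illustrative stand-in for Google's published JSON (same shape, example prefixes):
published = {"prefixes": [{"ipv4Prefix": "66.249.64.0/27"}, {"ipv6Prefix": "2001:4860:4801:10::/64"}]}

def is_googlebot_ip(ip: str, prefixes: dict) -> bool:
    """Return True if the IP lands inside any published Googlebot CIDR range."""
    addr = ipaddress.ip_address(ip)
    for entry in prefixes["prefixes"]:
        cidr = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if addr in ipaddress.ip_network(cidr):
            return True
    return False

print(is_googlebot_ip("66.249.64.5", published))   # inside a published range
print(is_googlebot_ip("203.0.113.9", published))   # likely a fake "Googlebot"
```

In practice you’d refresh the prefix list on a schedule (since Google updates it), then feed the failures into your firewall rules or analytics filters.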
Google is Rolling out Its 3rd Core Update of the Year on Thursday, November 18th
Google’s November 2021 core update is coming just weeks before the biggest online traffic time of the entire year—Black Friday and Cyber Monday. Typically, these core updates roll out over the course of two weeks. Any algorithm change is always going to be an attempt for Google to serve relevant, helpful content to its users, but this algorithm change in particular is meant to target rankings.
Remember, core updates are never rolled out to penalize websites, their intention is always to make the quality of search results better.
Best practice – these changes are an aim for Google to assess pages and content more efficiently with greater accuracy, not to penalize pages that are doing well. Instead of keeping an eye out for drops in rankings, visibility or traffic to pages that typically perform well, monitor pages that have historically underperformed and get a pulse on whether or not they start to do better over the next month or two.
Bottom line – Google might be more mindful of its community (SEOs, devs, etc.) for future rollouts, as seen in Twitter feedback; however, it’s important to note that Google has never announced a core update this promptly in the past; they used to go unannounced. One additional thing we’ll leave you with: another rollout like this continues to put emphasis on UX being more important than ever before. Gone are the days of “Google likes ugly as long as it’s helpful to the user.”
Recent Keyword Tag Testing Shows a Site’s Content & Structure Are the Most Important Thing
Bolded, capitalized keyword tags that resemble headings have started to appear on mobile results pages, and these pseudo-rich results aren’t appearing due to the addition of structured data or tag page features on the backend of a CMS.
Bottom line – Taken into account with Google’s August 2021 Page Title Rewrite algorithm update, this provides further evidence that Google is getting smarter in the way that it’s presenting products with transactional intent—through metadata rewrites and now, pseudo header classification.
An Automated Approach to Topic Cluster Generation is Possible, Just Don’t Forget to Use Manual Research for Checks & Balances
Topic clusters can be used to inform content structures within a page and even more broadly, can help highlight gaps in site structure (missing pages) in order for Google to better understand the context of a website and the relationship between its pages. If you’re crunched for time, use Jake’s recommendations and create a topic cluster by:
1️⃣ Finding a Wikipedia page that corresponds to the target keyword.
2️⃣ Running the wiki page through Semrush (or Ahrefs) to identify keyword buckets.
3️⃣ Running the wiki page through this free tool to pinpoint missing keywords (helps you understand how a bot sees a page).
4️⃣ Adding these keywords into Semrush or Ahrefs and filtering by Questions.
Best practice – All in all, this exercise should take 5 to 10 minutes, and it provides a great jumping-off point for queries you might be less knowledgeable about.
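The steps above lean on Semrush/Ahrefs and a third-party tool, so no single script replicates them; as a toy stand-in, this Python sketch buckets a step-4-style list of question keywords into clusters by a shared head term. The keywords and topics are invented for illustration:

```python
# Toy topic-cluster builder: group question keywords by the topic term they contain.
from collections import defaultdict

questions = [
    "what is espresso",
    "how is espresso made",
    "what is cold brew",
    "how long does cold brew last",
]

def cluster_by_topic(queries, topics):
    """Assign each query to the first topic term it mentions."""
    clusters = defaultdict(list)
    for q in queries:
        for topic in topics:
            if topic in q:
                clusters[topic].append(q)
                break
    return dict(clusters)

print(cluster_by_topic(questions, ["espresso", "cold brew"]))
```

Each resulting bucket maps naturally to a pillar page (the topic) with supporting content (the questions), which is the site-structure payoff described above.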
Why Alt Text is Important & Kind
At Momentic, one of our core values is providing a positive impact for our clients, employees and community. Connor Scott-Gardner’s first-hand account of how long and grueling it is to run images through special desktop software and screen readers is a testament to why ADA-compliant sites are not only favored by Google, but are first and foremost necessary for those who are visually impaired to experience a website with as much information and context as their neighbors.
Think of writing descriptive, helpful alt text as a way to provide an additional venue of ease when it comes down to putting forth the best user experience possible.
“Woman in chair” is not as helpful as “elderly woman sitting in red leather chair.”
Best practice – Paint a rich picture for everyone, by taking the onus off of those who shouldn’t have the onus to begin with. For any image with a chart or graph, make the alt text the high-level takeaway or analysis. For any photograph, picture, meme, GIF, or animated GIPHY, create alt text that lends itself to the joke or give it a sense of flow.
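To make auditing for this practical, here’s a small Python sketch (stdlib only) that flags images with missing or suspiciously short alt text. The sample markup and the three-word threshold are illustrative, not a standard:

```python
# Sketch: flag <img> tags with missing or very short alt text.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        alt = (attrs.get("alt") or "").strip()
        if not alt:
            self.issues.append((attrs.get("src"), "missing alt"))
        elif len(alt.split()) < 3:  # rough proxy for too-short descriptions
            self.issues.append((attrs.get("src"), "thin alt"))

page = """
<img src="chair.jpg" alt="elderly woman sitting in red leather chair">
<img src="logo.png" alt="">
<img src="chart.png" alt="sales chart">
"""

audit = AltAudit()
audit.feed(page)
print(audit.issues)
```

A word count is only a heuristic; the real test is whether the alt text carries the takeaway of a chart or the point of a joke, as described above. But a sweep like this is a fast way to find the images that need a human pass.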