Brian Donahue, Author at Go Fish Digital
https://gofishdigital.com/blog/author/brian-donahue/

Identifying and Addressing Drops in Organic Visibility
Published Mon, 15 Feb 2021 | https://gofishdigital.com/blog/addressing-drops-in-organic-visibility/


Working in digital marketing and, in particular, SEO comes with its own unique set of challenges. Client relationships, shifts in organic algorithms, and an ever-evolving requisite skillset are just a few of the daily challenges. And whether you’re new to digital marketing or a seasoned vet, almost every SEO has been confronted with the dreaded morning notification of a drop in organic visibility. Below are steps for identifying a drop in organic visibility for a website and a process for assessing (and addressing) this dilemma.

1. Identify the drop in organic visibility

Imagine you’ve sat down at your desk to find an email alert from Google Analytics stating that organic traffic has taken a dip for your website. The next step is to identify where your website is experiencing this drop. For our example, we’ll assume we’re dealing with a year-over-year decline in organic visibility.

Perform the following steps:

  1. Log in to Google Analytics.
  2. Navigate to your primary GA View.
  3. Set your default channel to “organic.”
  4. Set your primary dimension to “landing page.”
  5. Decide on a date range that aligns with the period that was flagged in your GA alert.

At this point, you can review landing pages with the largest YoY drops for the selected span of time. Sift through and select a URL that has a steep YoY decline:

In this example, our selected web page is down 35% YoY, while organic behavioral metrics are performing better YoY. Our next step is to review this URL in Google Search Console.
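If you prefer working outside the GA interface, the same YoY comparison can be scripted. Below is a minimal Python sketch that ranks landing pages by steepest decline; the page paths and session counts are hypothetical stand-ins for an exported GA report:

```python
# Sketch: rank landing pages by year-over-year organic traffic decline.
# Assumes you've exported organic sessions per landing page for two
# comparison periods (the pages and numbers here are hypothetical).

def yoy_declines(current, previous):
    """Return (page, pct_change) pairs sorted by steepest decline."""
    changes = []
    for page, prev_sessions in previous.items():
        cur_sessions = current.get(page, 0)
        if prev_sessions == 0:
            continue  # no baseline to compare against
        pct = (cur_sessions - prev_sessions) / prev_sessions * 100
        changes.append((page, round(pct, 1)))
    return sorted(changes, key=lambda pair: pair[1])

# Hypothetical session counts per landing page
previous = {"/services/": 2000, "/blog/post-a/": 1200, "/about/": 300}
current = {"/services/": 1300, "/blog/post-a/": 1150, "/about/": 310}

for page, pct in yoy_declines(current, previous):
    print(page, pct)
```

The steepest decliners surface at the top of the list, which is exactly the shortlist you'd carry into the next step.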

2. Review Google Search Console

Now that we’ve identified the underperforming URL, we can utilize Google Search Console to investigate where the drop in organic visibility stems from.

Once you’ve logged into the GSC account for your website, navigate to the “search results” section (beneath performance in the left-hand navigation). Within this window, let’s make some modifications:

  1. Set search type to “web”.
  2. Compare the past 28 days to the previous year.
  3. Set the page field to the target URL.

From here, select “CTR” and “Average Position” to gauge how the target page is trending for these metrics. If the average position had been improving and CTR had been decreasing, our next step would be to review the search engine result page landscape to analyze our page’s meta description. We’d also want to review the search intent of the target keyword to ensure that our page’s content matches the SERP intent.

In this case, with both CTR and average position performing worse year-over-year, we can intuitively conclude that a ranking shift has occurred and that competitors have surpassed our website’s organic ranking (you can also confirm this via a quick SERP review).

As a final step within this window, we should review the specific organic keywords that our target URL is performing worse for. To do so, select “queries” – this will show us the click-through rate and position of our target URL for specific terms. We can now sort by position (by clicking the column header) to identify the keywords that are now performing worse YoY.
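For larger sites, eyeballing that comparison gets tedious. Here is a rough Python sketch of the same query review; the query-to-average-position mappings are hypothetical stand-ins for two GSC "queries" exports, one per comparison period:

```python
# Sketch: flag queries whose average position worsened year over year.
# A higher position number means a worse ranking, so a positive delta
# is a slip. The data below is hypothetical.

def worsened_queries(current, previous, threshold=1.0):
    """Return (query, prev_pos, cur_pos, delta) rows for queries that
    slipped by more than `threshold` spots, worst drop first."""
    drops = []
    for query, prev_pos in previous.items():
        cur_pos = current.get(query)
        if cur_pos is None:
            continue  # query no longer reported; review separately
        delta = cur_pos - prev_pos  # positive = ranking slipped
        if delta > threshold:
            drops.append((query, prev_pos, cur_pos, round(delta, 1)))
    return sorted(drops, key=lambda row: row[3], reverse=True)

previous = {"blue widgets": 3.2, "buy widgets": 5.0, "widget guide": 8.4}
current = {"blue widgets": 9.7, "buy widgets": 5.3, "widget guide": 14.1}

for row in worsened_queries(current, previous):
    print(row)
```

Queries below the threshold (like "buy widgets" above, down only 0.3 spots) are filtered out so you can focus on the meaningful slips.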

3. Review the SERP Landscape and Make Necessary Adjustments

By this point, we’ve noted a drop in visibility, pinpointed the page that is showing a YoY decline, and identified an organic query (or set of queries) that our target page is performing worse for. Next, review top-ranking URLs within the search engine result page for our specific keyword.

When reviewing the SERP, make a note of the page types that are ranking on page 1 for the target keyword. Are the best ranking URLs informational or transactional in nature? If informative, how are they constructed? Keep an eye out for page elements such as additional menus, navigational assets, videos, unique image types, social media feeds, etc. Also note how recently top-ranking URLs have been updated. For the most part, you’ll likely find that fresh, comprehensive content is present for URLs on page 1 of the SERP.

Our team recently followed this step-by-step process to address keyword drops for a client website. We noted that additional fresh content and a slight structural reconfiguration would better align the page with other top-ranking URLs. After adding new content and adjusting the page’s layout to be more straightforward (and similar to top-ranking competitors), the result was an improvement in organic positioning: the target URL improved from ranking #36 organically on 11/15/2020 to #4 as of 2/9/2021. Below is a STAT screenshot of the improvement:

 

So there you have it; from receiving a traffic decline notification to remedying the issue, you now have a better idea of where to begin the process of identifying drops in organic visibility. Note that the requisite on-page adjustments to regain organic ranking positions will not be uniform across the board. As is the case for nearly every aspect of SEO, each hurdle you face should be approached on a case-by-case basis.

If you’re looking to improve your site’s search visibility, feel free to reach out to our SEO agency at Go Fish Digital.

Let us know of questions or comments below – thanks for reading!

Identifying and Addressing Drops in Organic Visibility is an original blog post first published on Go Fish Digital.

Using the Amelia Booking Plugin for Services and Appointments
Published Mon, 09 Mar 2020 | https://gofishdigital.com/blog/using-the-amelia-booking-plugin/


Most site owners are familiar with the name brand plugins primarily used for eCommerce transactions. WordPress plugins such as WooCommerce, BigCommerce, and Jigoshop have become synonymous with the selling of goods online. Online transactions become more complicated, however, when a website is attempting to sell services that require appointment times.

Websites that provide service offerings are not barred from using the aforementioned plugins, but workarounds are required in most cases. Instead of (figuratively) jamming the square peg of service offerings into the round hole of eCommerce plugins, we’ve found that utilizing a calendar booking plugin helps streamline the process of online sales. In this case, the Amelia WordPress plugin fills a void for connecting online consumers with businesses, from dentist offices to yoga studios. While not perfect, the team at Go Fish Digital has used this plugin to address client website needs. Check out the areas in which the Amelia plugin excels below.

1. User Experience

The Amelia plugin comes “out of the box” ready to enhance a website’s UX and boost CRO opportunities. The following screenshot, for example, is the catalog view used by a gym/fitness website to display class offerings:

As opposed to beginning a more standard cart process for selecting a good or service, Amelia allows a site visitor to select a service and date/time without having to leave the page in which the Amelia shortcode exists. Also, Amelia allows site owners to upload custom images for each offering, further enhancing the aesthetic appeal of the services available to customers. And, if you don’t like the Amelia booking’s default color, gradient or font, you can adjust the tool to your liking with the “Customization” tab:

The Amelia Customize tab

There’s even a calendar view if you want to direct website visitors straight into the date selection funnel:

A view of the Amelia booking calendar

These UX enhancements by Amelia assist in boosting page interaction and dwell time for a website. While the exact impact of dwell time on organic positioning is debatable, Amelia’s ability to boost engagement only works to the benefit of websites using the tool. Amelia keeps visitors engaged and, often, converting.

2. Employee Scheduling

One tricky aspect of appointment bookings that Amelia addresses is pairing a customer’s service selection with available employees. Amelia accomplishes this goal by allowing for the creation of individual employee accounts, each with their own respective appointment calendar. As a result, companies with several employees can easily manage their appointments, availability, and provided services.

A view of the Amelia employee scheduling functionality

The plugin allows for the configuration of employee work hours and days off, too, further “shouldering the load” when it comes to calendar-keeping and employee schedule management.

3. Setup and Deployment

After downloading the Amelia plugin, setup is straightforward. The “Settings” section offers options for nearly everything a site owner can think of. Setting up payment acceptance, for example, only requires a couple of PayPal or Stripe credentials. And, if desired, Amelia allows site owners to create coupons or adjust the (listed) accepted currency within this same section:

A view of the Amelia payment configuration

As mentioned earlier, deploying the plugin is also straightforward. After downloading Amelia and adjusting the configurations, a shortcode button will appear within the WordPress page editor. Select it and a list of options will appear for embedding Amelia. Shortcodes are as simple as “[ameliacatalog]”, but can be amended based on which services you would like visible.

Keep in mind that this list is not exhaustive! Amelia has a range of other options I didn’t cover, from SMS/email notifications to WooCommerce integration. And, while the plugin isn’t always perfect (few ever are), Amelia provides a ticket support system for site owners to use when encountering issues.

I hope you enjoyed the read! Let us know of questions or comments below!

Using the Amelia Booking Plugin for Services and Appointments is an original blog post first published on Go Fish Digital.

You’ve Crawled Your Site…Now what?
Published Tue, 16 Jul 2019 | https://gofishdigital.com/blog/analyze-your-website-crawl/

Whether you are optimizing your own website or working on a client’s, it’s imperative to simulate search engine bot activity, specifically with regard to site crawling. You, being a proactive digital marketer, have probably reviewed (and maybe even tried using) site crawl tools in the past. So, let’s say you’ve done the heavy lifting: you downloaded the latest version of Screaming Frog, read through the extensive user guide, and properly configured the tool to crawl key elements of the site in question. Now that you have the results, what should you do with them? Below are three data points from simulated crawls that I review to glean architectural SEO action items worth addressing.

Malformed URL Structures

Upon completion of a full crawl, my first review is of webpage URL structures. I scan for outliers that could be causing page duplication, and malformed internal hyperlinks are often the key offender. More specifically, when a website’s internal hyperlinks mix uppercase and lowercase versions of the same path, the best-case scenario is that an excessive number of 301 redirects are triggered. In the worst case, two versions of every webpage exist across the entire site. The result, unfortunately, is diluted SEO equity and wasted search engine crawl budget.

Simply scroll through the “Address” column within the “Internal – All” portion of the crawl and keep an eye out for URL inconsistencies. Make note of each offending case and then review the “Inlinks” tab to identify the source of the issue.
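That manual scan can be scripted, too. The sketch below groups URLs that differ only by letter case; the URL list is a hypothetical stand-in for the “Address” column of your export:

```python
# Sketch: group crawled URLs that differ only by letter case, a common
# source of duplicate pages and redirect chains.

from collections import defaultdict

def case_duplicates(urls):
    """Map each lowercased URL to every case variant found for it,
    keeping only URLs with more than one variant."""
    groups = defaultdict(set)
    for url in urls:
        groups[url.lower()].add(url)
    return {key: sorted(variants)
            for key, variants in groups.items() if len(variants) > 1}

# Hypothetical "Address" column values from a crawl export
crawled = [
    "https://example.com/Services/",
    "https://example.com/services/",
    "https://example.com/blog/",
]

for key, variants in case_duplicates(crawled).items():
    print(key, variants)
```

Any group with more than one variant is a candidate for the “Inlinks” review described above.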

Spider Traps

If you walk away from your crawl to return to an inordinate number of identified URLs, you might have an issue with spider traps. For reference, a spider trap is an architectural SEO pitfall in which search engine bots get caught in an infinite (or excessively large) loop of crawled webpages.

E-Commerce websites are most susceptible, particularly if employing faceted navigation. Faceted navigation grants website visitors the ability to refine the number of displayed products on a given webpage by applying “filters.” To do so, dynamic parameters are appended to the page’s URL. This becomes an issue of wasted crawl budget when preventative SEO directives are not applied.

So, if you find that you crawled over 300k URLs for a 150-page website, start down this rabbit hole. If faceted navigation is the root issue, look into applying the proper crawl directive (robots.txt disallow, robots meta tag with a value of noindex, canonical tag, etc.) to corral crawls of your site.
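To put a number on the problem before diving down that rabbit hole, you can count parameterized variants per path. The sketch below runs over a hypothetical URL list; a real check would use your full crawl export:

```python
# Sketch: spot likely spider traps by counting how many variants the
# crawl found for each URL path. A path with hundreds of faceted
# (parameterized) variants is a strong crawl-budget red flag.

from collections import Counter
from urllib.parse import urlparse

def facet_variants(urls):
    """Count crawled URLs per path, keeping only paths that were
    crawled more than once (i.e., with query-string variants)."""
    counts = Counter(urlparse(url).path for url in urls)
    return {path: n for path, n in counts.items() if n > 1}

# Hypothetical crawl results from a faceted category page
crawled = [
    "https://example.com/shoes/",
    "https://example.com/shoes/?color=red",
    "https://example.com/shoes/?color=red&size=9",
    "https://example.com/shoes/?size=9&color=red",
    "https://example.com/about/",
]

print(facet_variants(crawled))
```

Note that the last two URLs above are the same filters in a different parameter order — exactly the kind of combinatorial blow-up that wastes crawl budget.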

Title Tags and Meta Descriptions

Yes, it’s straightforward. But this easy-win opportunity is often overlooked when performing a simulated site crawl. Not only are the title tag and meta description fields directly tied to click-through rates, but title tags also offer an opportunity to embed targeted keywords within a page’s HTML elements.

First things first when reviewing title tags and meta descriptions: optimize these fields for webpages that are ranking or meant to rank within SERPs. Prioritize based on traffic generated, write unique copy, and ensure that a title and meta description exist for every page on your site. To review these fields within your Screaming Frog crawl, just scroll through the right-hand “Overview” section to “Page Titles” and “Meta Description.” For best practice advice, check out Moz’s SEO fundamentals pages.
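As a rough illustration of that review, the sketch below flags missing and duplicate titles and descriptions; the page records are hypothetical stand-ins for rows from the crawl export:

```python
# Sketch: audit title tags and meta descriptions from a crawl export.
# Flags missing fields and duplicate titles per URL.

from collections import Counter

def audit_pages(pages):
    """Return {url: [issue, ...]} for pages with title/description issues."""
    title_counts = Counter(p["title"] for p in pages if p["title"])
    issues = {}
    for p in pages:
        flags = []
        if not p["title"]:
            flags.append("missing title")
        elif title_counts[p["title"]] > 1:
            flags.append("duplicate title")
        if not p["description"]:
            flags.append("missing description")
        if flags:
            issues[p["url"]] = flags
    return issues

# Hypothetical crawl-export rows
pages = [
    {"url": "/a", "title": "Widgets | Example", "description": "Buy widgets."},
    {"url": "/b", "title": "Widgets | Example", "description": ""},
    {"url": "/c", "title": "", "description": "About our team."},
]

print(audit_pages(pages))
```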

While not exhaustive, the three points detailed above should give you a solid start to analyzing your website crawl. If you have questions or issues using Screaming Frog, you can contact their team here. Also, feel free to leave comments or questions below!

You’ve Crawled Your Site…Now what? is an original blog post first published on Go Fish Digital.

5 Steps for Identifying Thin Content
Published Mon, 14 Jan 2019 | https://gofishdigital.com/blog/five-steps-to-fix-thin-content/

There is a lot to juggle when addressing the SEO of a website. If you’re a marketing manager, you’ve probably heard dozens of different industry buzzwords. Backlinks, domain authority, featured snippets and canonical tags are all terms that, almost certainly, have been mentioned during one of your SEO consultations.

We find it worthwhile to step back from this noise every so often to prioritize SEO endeavors. At its most basic level, SEO is about “optimizing your site to serve your users’ needs.” It’s imperative, then, that the content we serve to users (and search engine bots) is worthwhile and not thin, i.e., low-quality content that adds little to no value.

How do you identify pages with thin content? Is it allowable for certain pages to be light on content? Follow the 5 steps below to answer these questions and more as you optimize your site.

1. Crawl Your Site For Thin Content

To optimize your site’s content for search engines, you have to think like a search engine. To do so, you should employ an SEO crawler tool, such as DeepCrawl or SEO Crawler. At Go Fish Digital, we use Screaming Frog.

Before crawling your website, make sure that you’ve configured your simulated crawl correctly. Within the main navigation, select “Configuration” and then, within the dropdown, select “Spider.”

A view of the Screaming Frog configuration settings.

For the purposes of this exercise, we want to perform a quick crawl of just HTML pages to review their content. We will also want to compare some pages against the sitemap’s submitted URLs, so you will need to supply your website’s sitemap location within the configuration menu’s bottom field.

After configuring your crawl, ensure that the main navigation’s mode setting is set to “Spider.” Then, enter your home page’s URL into the field marked as “Enter URL to Spider” and select “Start.”

A screenshot of the Screaming Frog Crawl Menu

2. Review Word Count To Find Thin Content

After your simulated crawl of the site is complete, select “HTML” within the right-hand menu’s navigation.

Next, review any pages with a suspiciously low word count. It depends on the website, but I usually like to review pages with fewer than 200 words.

NOTE: A low word count in and of itself does not qualify a page’s content as thin. Sometimes certain pages, such as blog articles, warrant a short word count. In those cases, a manual check should be performed to ensure that the page actually contains thin content.

Back to Screaming Frog. You can select the column header titled “Word Count” to organize your website’s URLs and then begin reviewing.

A Screaming Frog simulated crawl ready for reviewing.

3. Review status codes and canonical tags

As you investigate these pages that (potentially) contain thin content, we need to ensure that the status code isn’t a 301-redirect response and that the canonical tag doesn’t point to another page. The rationale is straightforward: 301 redirects and canonical tags pointing to other URLs direct a page’s link equity to those other web pages. As a result, a page with either of these directives is already properly accounted for (in the case of this task) from an SEO perspective. Instead, review the 301-redirect/canonical tag URL destination. There is a column within your SF crawl titled “Indexability Status” that will supply you with this information:

A review of a web page's indexability status via Screaming Frog.
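The filtering logic from steps 2 and 3 can be sketched in a few lines of Python. The field names and rows below are hypothetical stand-ins for Screaming Frog export columns:

```python
# Sketch: shortlist thin-content candidates from a crawl export,
# skipping pages that 301 redirect or canonicalize elsewhere, since
# their link equity already flows to another URL.

def thin_candidates(rows, max_words=200):
    """Return URLs worth a manual thin-content review."""
    candidates = []
    for row in rows:
        if row["status"] != 200:
            continue  # redirects and errors are handled elsewhere
        if row["canonical"] and row["canonical"] != row["url"]:
            continue  # equity already points at the canonical target
        if row["word_count"] < max_words:
            candidates.append(row["url"])
    return candidates

# Hypothetical crawl-export rows
crawl = [
    {"url": "/thin/", "status": 200, "canonical": "/thin/", "word_count": 90},
    {"url": "/old/", "status": 301, "canonical": "", "word_count": 0},
    {"url": "/dupe/", "status": 200, "canonical": "/main/", "word_count": 120},
    {"url": "/solid/", "status": 200, "canonical": "/solid/", "word_count": 850},
]

print(thin_candidates(crawl))
```

Only the 200-status, self-canonical, low-word-count page survives the filter, which mirrors the manual checks described above.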

4. Compare pages with (potentially) thin content against the XML sitemap

After confirming that a page (a) has a low word count, (b) does not 301 redirect and (c) does not contain a canonical tag pointing to another URL, we will want to see if the page that we’re investigating is submitted within the XML sitemap.

As a reminder, a sitemap is an XML file that lists the core URLs that webmasters want indexed by online search engines. So, if the web page that we’re investigating is included within the XML sitemap, we can conclude that it should be searchable by online users and can potentially rank within SERPs (search engine result pages) for organic keywords.

To see if a page is included within the XML sitemap, scroll to the section titled Sitemaps within the right-hand navigation menu. Then, search the URL that you are reviewing by entering it into the search field titled “Search…”

Sitemap review via Screaming Frog.
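If you’d rather check sitemap membership programmatically, the Python standard library can parse the XML directly. The inline sitemap below is a minimal example; in practice you would fetch your live sitemap file:

```python
# Sketch: check whether a URL appears in an XML sitemap, using only
# the standard library.

import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Extract every <loc> value from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

# Minimal inline sitemap (bytes, as fetched from a server)
sitemap = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/thin-page/</loc></url>
</urlset>"""

print("https://example.com/thin-page/" in sitemap_urls(sitemap))
```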

5. Ensure that the page with thin content is intended for organic listings

Now that you’ve properly identified a page with thin content, the last step is to ensure that the web page is meant to rank for organic keywords within SERPs.

For example, if it’s an About Us or Contact Us page, it probably doesn’t rank for an organic keyword with substantial search volume. You’ll want to check and ensure that these pages have enough text to validate their existence on the site, but they probably will not need further optimization/more breadth added to avoid being flagged as thin.

Conclusion

And there you have it. You’ve crawled your website, organized pages by word count, and verified that the page with thin content is meant to supply users with valuable information and rank within organic SERPs.

You’re well on your way to remedying this issue! Your next step is to now circle back and optimize these thin content pages. Ensure that you’ve performed proper organic keyword research before adding text. I suggest utilizing the expertise of an SEO specialist focused on content optimization.

Feel free to leave your comments/questions below! Thanks for reading!

5 Steps for Identifying Thin Content is an original blog post first published on Go Fish Digital.
