Traffic drops are rarely what they look like. A site loses 60% of its organic visits overnight, someone calls it a Google penalty, and the instinct is to start fixing everything at once. That's usually the wrong call — because "penalty" covers two very different situations, each requiring a completely different response.
I manage 44 websites across different niches and tech stacks. When I audited the portfolio earlier this year, 11 of those 44 had indexation or canonicalisation issues significant enough to suppress traffic. None of them had manual actions. The suppression was quiet, algorithmic, and far easier to fix once diagnosed correctly.
Here's how to diagnose what you're actually dealing with — and what recovery looks like for each type.
Manual Actions vs Algorithmic Penalties: Not the Same Thing
The distinction matters because the recovery path is completely different.
A manual action is a human reviewer at Google finding that a specific page or your whole site violates their spam policies. You get a notification in Google Search Console (GSC). It's explicit. Common causes: unnatural links, hidden text, cloaking, pure spam, structured data abuse. Fix the issue, submit a reconsideration request, and Google reviews it manually.
An algorithmic update — Panda, Penguin, the Helpful Content Update (HCU), or a core update — is an automated system reassessing your site's signals. There's no notification. No reconsideration request. You won't find it in Search Console. You'll find it by correlating your traffic drop date against Google's update history.
These are not interchangeable. Businesses routinely spend months trying to recover from a "penalty" that was never a penalty — it was a core update, or a technical issue, or a content quality signal. Fixing the wrong thing wastes time and sometimes makes things worse.
Step One: Check GSC for Manual Actions
Before doing anything else: Security & Manual Actions > Manual Actions in GSC. Takes 30 seconds.
If there's an active manual action, it'll be listed there with a description (site-wide or page-level), the date it was applied, and what policy it violates. If that screen is clean, you don't have a manual action. Move on to diagnosing the real cause.
Step Two: Match the Drop Date to an Update
If GSC is clean, pull your traffic graph from GSC Performance over the past 12 months and identify the exact date the drop began. Then cross-reference that against Google's confirmed update history.
If your drop started within 1–3 days of a confirmed core update or the HCU rollout, that's your culprit. If it started on a random Tuesday with no corresponding update, it's more likely technical — a crawl issue, a hosting change, or an indexation problem.
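The date-matching step is simple enough to script. A minimal sketch, in which the update names and start dates are illustrative entries rather than a maintained list (pull the real dates from Google's published update history):

```python
from datetime import date

# Illustrative update windows. Replace with the real start dates from
# Google's published Search ranking update history.
CONFIRMED_UPDATES = {
    "March 2024 core update": date(2024, 3, 5),
    "August 2024 core update": date(2024, 8, 15),
}

def likely_update(drop_date: date, window_days: int = 3) -> list[str]:
    """Return any updates whose start date is within window_days of the drop."""
    return [
        name for name, start in CONFIRMED_UPDATES.items()
        if abs((drop_date - start).days) <= window_days
    ]

print(likely_update(date(2024, 3, 7)))   # falls inside the March update window
print(likely_update(date(2024, 6, 11)))  # no match: suspect a technical cause
```

An empty result doesn't prove the cause is technical, but it tells you to check crawl and indexation signals before rewriting content.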
Diagnosing Algorithmic Hits
Algorithmic penalties aren't announced. You have to reconstruct what happened from the data.
Panda (content quality) — Affects sites with thin content, duplicate pages, or content that adds little over what's already ranking. Check: how many of your pages have fewer than 300 words? How many are near-duplicates of each other? What's your ratio of content pages to thin pages?
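The thin-content and near-duplicate checks above can be run against a crawl export. A rough sketch using only the standard library, where the page texts are made-up samples and the 0.9 similarity threshold is a judgment call, not a Google number:

```python
from difflib import SequenceMatcher

THIN_WORD_LIMIT = 300  # pages under this word count get flagged for review

def audit_pages(pages: dict[str, str], dup_threshold: float = 0.9):
    """Flag thin pages and near-duplicate pairs from a {url: body_text} map."""
    thin = [url for url, text in pages.items()
            if len(text.split()) < THIN_WORD_LIMIT]
    dupes = []
    urls = list(pages)
    for i, a in enumerate(urls):
        for b in urls[i + 1:]:
            if SequenceMatcher(None, pages[a], pages[b]).ratio() >= dup_threshold:
                dupes.append((a, b))
    return thin, dupes

# Made-up sample pages; in practice, load these from a crawler export.
pages = {
    "/guide": "word " * 500,
    "/stub": "thin page with a few words",
    "/stub-copy": "thin page with a few words!",
}
thin, dupes = audit_pages(pages)
print(thin)    # both stub pages fall under the word limit
print(dupes)   # the two stubs are near-duplicates of each other
```

Pairwise comparison is quadratic, so for large sites you'd shard by URL pattern or use hashing first; for a few hundred pages this is fine.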
Penguin (link spam) — Targets unnatural backlink profiles. Penguin is now built into Google's core algorithm and runs in near-real time. Check: does your backlink profile have an unusually high proportion of exact-match anchor text links from low-quality sites? Ahrefs' anchor text distribution report shows this clearly.
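If you'd rather compute the anchor distribution yourself from a backlink export, the arithmetic is trivial. A sketch with hypothetical export rows (a real audit would load them from an Ahrefs or GSC CSV):

```python
from collections import Counter

def anchor_distribution(backlinks: list[dict]) -> dict[str, float]:
    """Share of each anchor text across a backlink export."""
    counts = Counter(link["anchor"].lower().strip() for link in backlinks)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical rows for illustration only.
backlinks = [
    {"anchor": "best payday loans"},
    {"anchor": "best payday loans"},
    {"anchor": "best payday loans"},
    {"anchor": "example.com"},
]
dist = anchor_distribution(backlinks)
print(dist["best payday loans"])  # 0.75: an exact-match share this high is a red flag
```

Healthy profiles skew heavily toward branded and naked-URL anchors; a commercial exact-match phrase dominating the distribution is the pattern Penguin targets.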
Helpful Content Update (HCU) — Sitewide signal targeting content written for search engines rather than people. Check: how much of your content relies on templates, data pulls, or formulaic structures without original insight? HCU hit a lot of programmatic SEO sites hard in 2023-2024.
Core updates — General reassessment of quality signals across E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness). Harder to pin to a single fix. Usually means improving multiple signals simultaneously: author credentials, original research, citation quality, content depth.
The Acquisition Scenario: You Bought a Penalised Site
This situation is more common than people expect, particularly in acquihires and digital asset acquisitions where SEO traffic is the primary value driver.
If you acquired a site and traffic dropped immediately — or you've pulled the GSC data and found a manual action was already in place — here's the order of operations.
First, establish what you actually bought. Request the full GSC history from the previous owner before close. Look at the 12-month trend, not the headline number. A site showing 50,000 monthly clicks may be in a slow decline that the seller knows about. Manual actions also show in GSC history — they don't disappear when ownership transfers.
If there's a manual action: The previous owner's behaviour is now your problem. Google doesn't care that you're new. You need to:
- Audit the backlink profile fully. Disavow every low-quality, paid, or spammy link. This is not optional — reconsideration requests backed by a partial clean-up get rejected.
- Fix whatever the manual action specifies. If it's link spam, the disavow. If it's thin content, the content. If it's cloaking, the code.
- Submit a reconsideration request through GSC that documents exactly what was wrong and what you've done to fix it. Be specific. Vague requests get rejected.
- Wait. Manual action review typically takes 2–8 weeks. Some come back faster. Some take longer if the reviewer finds additional issues in your response.
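For reference, the disavow file itself is just a plain-text upload: `#` lines are comments, `domain:` lines disavow every link from a domain, and bare URLs disavow single pages. The domains below are placeholders, not real examples:

```
# Disavow file submitted after link-spam manual action.
# Whole domains where every link is paid or spammy:
domain:spammy-directory.example
domain:paid-links.example

# Individual URLs where the rest of the site is fine:
https://forum.example/thread/12345
```

Keep a dated copy of each version you upload; the reconsideration request should reference the clean-up work the file represents.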
If there's no manual action but traffic dropped after acquisition: The most likely causes are technical, not algorithmic. I've seen this pattern repeatedly. Someone "tidies up" the site during handover — changes hosting, updates the CMS, cleans up URLs — and breaks the signals Google relied on. Check:
- Did the URL structure change? Even adding or removing trailing slashes can cause indexation issues across a large site.
- Are redirects fully in place? Any broken internal redirect is a lost page.
- Did robots.txt change? A single misplaced Disallow rule can block thousands of pages.
- Is the sitemap current and submitted? An outdated sitemap confuses Google's crawl prioritisation.
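The robots.txt check is scriptable with Python's standard library. A sketch that feeds the rules in directly for illustration; in practice you'd point the parser at the live file with set_url() and read():

```python
from urllib.robotparser import RobotFileParser

# Example rules: a single Disallow that silently blocks an entire section.
rules = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Test a handful of key URLs against the rules.
for url in ["https://example.com/blog/post-1", "https://example.com/about"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "crawlable" if allowed else "BLOCKED")
```

Run this against your most valuable URL patterns after any handover; a blocked section shows up immediately instead of weeks later in GSC.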
I fixed this exact set of issues on a 200-page site earlier this year. Indexed page count had dropped from 186 to 94 during a CMS migration. The cause was a trailing-slash redirect misconfiguration that created duplicate URL paths — Google was seeing both versions and canonicalising to the wrong one. Fixing the redirect rules and resubmitting the sitemap brought indexed pages back to 178 within 6 weeks.
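The shape of the fix in that case was a single canonical-direction rewrite rule. A sketch of the kind of rule involved, assuming Apache with mod_rewrite (the actual directives depend on your server and CMS):

```apache
# Force the trailing-slash version so only one URL path exists per page.
# The file check stops slashes being appended to real assets like /style.css.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]
```

Whichever direction you pick, pick one: the damage comes from both versions resolving with status 200 and competing for canonicalisation.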
Realistic Recovery Timelines
This is where most recoveries go wrong. People fix the issue correctly, then expect traffic to return in two weeks. When it doesn't, they assume the fix didn't work and start changing other things.
| Penalty Type | Fix Timeline | Recovery Timeline |
|---|---|---|
| Manual action (link spam) | 2–4 weeks to audit + disavow | 2–8 weeks post-reconsideration |
| Manual action (content) | Varies by content volume | 2–8 weeks post-reconsideration |
| Penguin (algorithmic) | 1–2 weeks to disavow | Near-real-time once processed |
| HCU / content quality | 2–6 months of content work | Next core update (3–6 months) |
| Core update | Multiple signal improvements | Next core update (3–6 months) |
| Technical (crawl/index) | 1–2 weeks to fix | 4–12 weeks for re-crawl + re-index |
Algorithmic recovery is not in your control once the fix is done. Google's core updates run on Google's schedule. You can do everything right and still wait 4 months to see the results. That's not a failure — it's how the system works.
What Not to Do During Recovery
A few patterns I see repeatedly that extend recovery instead of accelerating it.
Don't change everything at once. If you're recovering from an HCU hit and you simultaneously revamp your content, change your URL structure, and redesign the site, you'll never know what worked. Make changes in isolation where possible. Measure each one.
Don't delete pages in bulk. Removing content feels like "cleaning up." But if those pages have backlinks or historical authority, you're destroying signals, not improving them. The right call for thin content is usually to improve it or consolidate it via 301 redirects — not to delete it.
Don't submit a reconsideration request before the fix is complete. GSC tracks your submissions. If you submit early, Google reviews and rejects, and you wait 30+ days before you can resubmit. Submit once, after the full fix is in place, with full documentation.
Don't treat a traffic recovery as complete when it's partial. I've seen sites recover 50% of lost traffic after fixing a manual action, plateau, and never recover the rest — because the second half of the drop was actually an algorithmic signal the manual action masked. Keep monitoring for 3–6 months after the technical fix.
The One-Page Diagnosis
If you're a CEO or investor trying to work out where you stand, this is the shortest path to an answer:
- Open GSC > Security & Manual Actions > Manual Actions. Clean? Not a manual action.
- Compare traffic drop date to Google update history. Match? Algorithmic. No match? Likely technical.
- If technical: run a Screaming Frog crawl, check indexed page count in GSC, review robots.txt, verify redirects.
- If algorithmic: audit content quality, backlink profile, and E-E-A-T signals. Make improvements. Wait for the next core update.
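For the technical branch, one of the quickest checks is comparing what your sitemap claims against GSC's indexed count. A sketch that parses a sitemap with the standard library, using an inline sample document for illustration:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract the <loc> URLs a sitemap declares."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")]

# Inline sample; in practice, fetch the live sitemap.xml instead.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls), "URLs in sitemap")  # compare against the indexed count in GSC
```

A large gap between the sitemap count and GSC's indexed count, like the 186-to-94 drop described above, is the signal to dig into redirects, canonicals, and robots rules.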
If you're not sure which category you're in, or you've acquired a site with existing issues and want a clean assessment of what you're dealing with, get in touch. I've worked through this diagnosis across dozens of sites — the patterns repeat more than most people expect.
