Campaign Overview

Campaign Details

Client industry: Personal finance / comparison
Target market: United Kingdom
Campaign duration: 10 weeks (link building) + 6 weeks (post-indexing)
Links built: 2,411 Tier 2 links across 180 domains
Tier 1 links: 38 niche edits and guest posts on finance-adjacent sites
Indexing tool used: Rapid Indexer (Standard Queue)
Indexing budget: $48.22 (2,411 URLs at $0.02 each)

The Problem

The agency, a seven-person link building outfit based in Manchester, had been running tiered link building campaigns for around three years. Their standard workflow involved building high-authority Tier 1 placements, then supporting those with a Tier 2 network of contextual links pointing at the Tier 1 pages.

The logic is well-established. When Tier 2 links are indexed, they pass link equity through to the Tier 1 pages, which in turn pass it to the client's money site. The strength of a tiered structure depends on how many of those second-tier links are actually crawled and indexed by Google.

The problem surfaced during a routine GSC audit in week 11 of the campaign. The agency cross-referenced their list of 2,411 built links against Google's index using a combination of Google Search Console for the pages they controlled and bulk site: operator checks for the third-party domains.
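The cross-referencing step amounts to a set difference between the list of built URLs and the list of URLs confirmed in the index. A minimal sketch, assuming two plain-text files with one URL per line (the file layout here is an illustration, not the agency's actual tooling):

```python
def load_urls(path: str) -> set[str]:
    """Read one URL per line into a set, skipping blank lines."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def unindexed_report(built_path: str, indexed_path: str) -> dict:
    """Cross-reference built links against index-check results."""
    built = load_urls(built_path)
    indexed = load_urls(indexed_path)
    missing = built - indexed  # live links with no evidence of indexing
    return {
        "built": len(built),
        "indexed": len(built & indexed),
        "unindexed": len(missing),
        "unindexed_pct": round(100 * len(missing) / len(built), 1),
    }
```

Run against the campaign's numbers, a report like this is what surfaced the 74% figure.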

The findings were not great. Roughly 74% of the Tier 2 links, around 1,784 URLs, were showing no evidence of having been crawled or indexed. The links existed. They were live and accessible. But Googlebot had not visited them, which meant they were contributing nothing to the Tier 1 pages they were supposed to be strengthening.

The client had a six-week window before a competitive review period. They needed the link network to be functional, not just built.

Why So Many Links Were Unindexed

This is a common situation in Tier 2 campaigns, and it is worth understanding why it happens rather than treating it as a failure.

Googlebot allocates crawl budget according to a site's perceived value and crawl demand. Sites with strong authority, good link profiles, and high update frequency get visited more often. A web 2.0 property, forum post, or article directory page that was created specifically to host a Tier 2 link is, from Googlebot's perspective, a low-priority destination. It has little inherent authority, low update frequency, and often no internal links pointing at it from within the host domain.

Google's own crawl budget documentation acknowledges that low-value pages are deprioritised. For Tier 2 link pages, this is the structural challenge: by their nature, they sit at the low end of Google's crawl priority queue.

Without an external trigger, Googlebot may not visit these pages for weeks or months. Some will never be crawled at all if the host domain's crawl budget is already fully allocated to higher-priority content.

The Approach

The agency had used manual GSC submission in the past for Tier 1 placements, but GSC only works for properties you own and control. For third-party Tier 2 pages, that route is not available.

After reviewing the options, they used Rapid Indexer's Standard Queue for the bulk submission. The decision came down to three factors: the per-URL cost at scale ($0.02/URL), the auto-refund policy for any URLs that fail to index, and the ability to submit up to 10,000 URLs in a single batch via CSV upload.

They exported the full list of 2,411 Tier 2 URLs, uploaded the CSV, and set the submission to Standard Queue. The total cost came to $48.22, with the expectation that any URLs failing to index would have their credits returned automatically.

The submission was made on a Monday morning. The agency then monitored indexing status using a combination of Screaming Frog SEO Spider (set to check cache dates and index status) and manual spot-checks using the site: operator in Google Search.
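Monitoring at this scale usually means working from a crawler's CSV export rather than checking URLs one by one. A minimal sketch of tallying index status from such an export; the column names ("URL", "Status") are assumptions for illustration, not Screaming Frog's actual export schema:

```python
import csv
from collections import Counter

def tally_status(export_path: str) -> Counter:
    """Count URLs per index-status value in a crawler CSV export.

    Assumes a header row with "URL" and "Status" columns; adjust the
    field names to match whatever tool produced the export.
    """
    with open(export_path, newline="") as f:
        return Counter(row["Status"] for row in csv.DictReader(f))
```

Re-running a tally like this daily is enough to produce the 24/48/72-hour checkpoints reported below.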

Indexing Results

By the end of the first 24 hours, the agency confirmed that approximately 38% of the submitted URLs had been crawled by Googlebot, based on cache date checks and index status responses.

At the 48-hour mark, the confirmed crawl rate had reached 71%.

At 72 hours, 89% of the 2,411 submitted URLs were confirmed as indexed. The remaining 11% (approximately 265 URLs) were not indexed at the point of final check. Of those, around 140 appeared to have technical issues (redirects, thin content flags, or no-index tags on the host pages) that would have prevented indexing regardless of the crawl trigger. The remaining 125 or so credits were automatically refunded by Rapid Indexer under their standard policy.

Timepoint         | URLs Crawled | % of Total
24 hours          | ~916         | 38%
48 hours          | ~1,712       | 71%
72 hours          | ~2,146       | 89%
Final (unindexed) | 265          | 11% (125 refunded)
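As a sanity check, the table's percentages and the stated costs follow directly from the 2,411-URL total and the $0.02 per-URL price:

```python
TOTAL = 2411           # Tier 2 URLs submitted
PRICE_PER_URL = 0.02   # Standard Queue rate

crawled = {"24h": 916, "48h": 1712, "72h": 2146}
pct = {t: round(100 * n / TOTAL) for t, n in crawled.items()}
# pct -> {"24h": 38, "48h": 71, "72h": 89}

unindexed = TOTAL - crawled["72h"]           # 265 URLs not indexed
cost = round(TOTAL * PRICE_PER_URL, 2)       # $48.22 total spend
refund = round(125 * PRICE_PER_URL, 2)       # $2.50 back for the refunded URLs
```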

The 89% figure is consistent with what Rapid Indexer publishes as their average success rate. The 11% that did not index broke down roughly as: 6% with technical issues on the host pages (outside the tool's control) and 5% where no clear explanation was apparent, with those credits refunded automatically.

Ranking Impact Over the Following 6 Weeks

Rankings were tracked weekly using Ahrefs Rank Tracker across 14 target keywords. The baseline was the ranking position at the point of indexing submission. The client's site was a personal finance comparison property targeting commercial-intent terms in the UK market.

It is important to be clear about what caused what. No new links were built during the six-week observation period. The only change was that existing Tier 2 links, previously unindexed, were now in Google's index and passing equity through to the Tier 1 pages.

Week 1 showed no significant change, which is expected. Freshly indexed links need time to be processed by the ranking algorithm.

By week 2, three of the 14 target keywords had moved up between one and three positions. These were mid-competition terms where the Tier 1 pages supporting them had the highest density of newly indexed Tier 2 links.

By week 4, eleven of the 14 keywords showed positive movement. Five keywords moved up by five or more positions. Two keywords crossed into the top 10 from positions 14 and 17 respectively. One keyword moved from position 8 to position 3.

By week 6, the pattern had stabilised. Twelve of the 14 keywords held their improved positions or continued to climb. One keyword that had briefly entered the top 10 dropped back to position 12, possibly a freshness twiddler adjustment or a competitor response. One keyword showed no meaningful movement throughout.

Important caveat: This is a single campaign observation, not a controlled experiment. Other variables, including seasonal search trends, competitor activity, and Google updates during the period, could have influenced the results. The directional pattern is consistent with the expected impact of activating a large number of previously-dead Tier 2 links, but attributing the changes entirely to the indexing intervention would overstate the certainty.

Key Takeaways

Building links and indexing links are separate activities. The agency in this case had treated the campaign as complete once the links were placed. In reality, an unindexed backlink is a neutral event from a ranking perspective. The build and the index confirmation are both part of the deliverable.

Organic crawl discovery at Tier 2 is unreliable. 74% of newly built Tier 2 links sitting unindexed after 10 weeks is not unusual. Google's crawl budget allocation deprioritises pages with low authority and low update frequency, which describes most Tier 2 link properties by design.

The cost of indexing at scale is low relative to the build cost. The campaign had a link building budget in the thousands. The indexing cost was $48.22. The ratio makes the economics straightforward. Leaving a large fraction of the link investment dormant because of an omitted $48 step is not a rational trade-off.

Auto-refund policies matter for bulk submissions. At Tier 2 scale, some percentage of links will not index regardless of the crawl trigger, usually because of technical issues on the host pages. A tool that refunds credits for those failures removes the financial risk from bulk submissions in a meaningful way.

For a full comparison of the current indexing tools available for Tier 2 campaigns, including pricing, success rates, and drip-feed options for large batches, see our ranking of the best Google indexers in 2026. For the practical mechanics of how indexing tools trigger Googlebot crawls, the backlink indexing methods guide covers the underlying mechanisms in detail.

Start Indexing Your Tier 2 Links

Rapid Indexer Standard Queue handled 2,411 URLs for $48.22 with an 89% success rate and automatic refunds on the rest. Pay-as-you-go, credits never expire.

Get Started with Rapid Indexer →