In this article, we’re exploring one of the newer strategies for maximizing the impact of your on-page optimization for Google: Progressive Optimization. When implemented fully, agencies and website owners are seeing some great SEO wins, so it’s definitely worth exploring.

What is Progressive Optimization?

Often, we have pages that were created a few years ago, written according to the SEO strategies of the time. But life has moved on considerably since then. Google’s perspective on content and on-page optimization has changed; it’s now a major focus.

The industry has taken advantage of TF-IDF (Term Frequency-Inverse Document Frequency), which uses the top-ranking content for a particular search to determine how you can optimize your own on-page content. More recently, Google introduced Natural Language Processing (NLP) with the BERT algorithm, which plays a major role in how Google learns to understand content and user intent.
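To make the TF-IDF idea concrete, here’s a minimal sketch in Python (not any particular tool’s implementation; the sample pages are hypothetical token lists) of how the score is computed across a set of top-ranking pages:

```python
import math
from collections import Counter

def tf_idf(docs):
    """Score each term in each document by TF-IDF.

    docs: list of token lists (e.g. the tokenized top-ranking pages
    for a query). Returns one {term: score} dict per document.
    """
    n = len(docs)
    df = Counter()  # document frequency: how many docs contain each term
    for doc in docs:
        df.update(set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        total = len(doc)
        scores.append({
            # term frequency * inverse document frequency
            term: (count / total) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return scores

# Hypothetical tokenized pages standing in for the top results
pages = [
    ["seo", "tips", "seo", "guide"],
    ["seo", "guide", "checklist"],
    ["seo", "audit"],
]
scores = tf_idf(pages)
```

A term that appears on every top-ranking page ("seo" here) scores zero, while more distinctive terms score higher; comparing your own page’s scores against the winners suggests which terms to work into your copy.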

Google’s BERT algorithm with Natural Language Processing is data-driven on-page optimization at its best.

As digital marketers, that’s one itch we can’t help but scratch; we love metrics and data. The result is that there are concrete improvements you can make to your page to help increase your rankings.

Most strategies follow a familiar process: make your calculations with your tools, edit your content and on-page elements, publish those changes, and then wait for the Google dance to subside (for those not introduced to the Google dance, this is when Google bounces your result around the SERPs for a few weeks before settling on your new position).

Progressive Optimization operates differently. You make the same calculations, you determine how to optimize your page, and you even draft those changes (offline, in a document). But rather than publishing them all at once, you make a plan to roll out some changes every few weeks over several months.

Essentially, you’re optimizing in steps rather than all at once. 

What Are The Benefits Of Optimizing Progressively?

There are two important reasons why this strategy has value and is worth exploring.

We Want To Be Data-Driven

Let’s say we go to a page and re-optimize everything, from H1 tags to the density of NLP entities, then publish those changes, and we shoot up 5 positions. 

That’s awesome, but which of the changes caused the boost? We can’t be sure; it could be any of them. So how do we know where to focus our time, considering we have other pages and other sites to work on? We’d be guessing. We can’t even say we’re throwing things at the wall to see what sticks, because we don’t know what made anything stick.

When you get a boost in your SERPs, you will be able to identify what caused the increase in results.

With Progressive Optimization, you make a change, then track it. You make the next change, then track again. When you get a boost in the SERPs, you can identify exactly what caused it. Likewise, when you see a drop, you know what to change back.
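The make-a-change-then-track loop can be sketched as a simple change log; the dates, changes, and positions below are hypothetical:

```python
from datetime import date

# Hypothetical log: one on-page change per entry, plus the ranking
# position observed once the SERPs settled afterwards.
change_log = [
    {"date": date(2021, 3, 1),  "change": "baseline",        "position": 14},
    {"date": date(2021, 3, 15), "change": "rewrote H1",      "position": 11},
    {"date": date(2021, 4, 1),  "change": "shortened URL",   "position": 12},
    {"date": date(2021, 4, 15), "change": "added NLP terms", "position": 7},
]

def attribute_movement(log):
    """Pair each change with the rank movement that followed it.

    A positive delta means positions gained (moved up the SERP).
    """
    return [
        (curr["change"], prev["position"] - curr["position"])
        for prev, curr in zip(log, log[1:])
    ]

for change, delta in attribute_movement(change_log):
    print(f"{change}: {delta:+d}")
```

Because only one change separates any two measurements, each rank movement attributes cleanly to a single change, which is the whole point of the strategy.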

You’re in much more control, and over time you’ll have a clear idea of what to focus on. There are over 240 ranking factors, many of them on-page, but not all ranking factors were created equal. Progressive Optimization lets you track which ones are actually making an impact.

You want to be able to employ the Pareto rule, also known as the 80/20 rule: 20% of the changes will give you 80% of the ranking results. We need to identify which 20% that is, so we stay focused on the right activities.
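Once you have per-change gains from tracking, finding the high-impact 20% is a one-pass calculation. A minimal sketch, assuming you’ve already summed the positions gained per change type (the figures are hypothetical):

```python
def pareto_cut(gains, threshold=0.8):
    """Smallest set of change types whose combined gains reach
    `threshold` (default 80%) of the total rank gain.

    gains: dict mapping change type -> total positions gained.
    """
    total = sum(gains.values())
    running, keep = 0, []
    # Greedily take the biggest winners until the threshold is met
    for change, gain in sorted(gains.items(), key=lambda kv: -kv[1]):
        keep.append(change)
        running += gain
        if running >= threshold * total:
            break
    return keep

# Hypothetical totals pulled from a change log
gains = {"page title": 8, "h1": 5, "url": 1, "h3": 1}
focus = pareto_cut(gains)  # the change types worth prioritizing
```

Here two of the four change types account for 13 of the 15 positions gained, so those are where future optimization time should go.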

Google Loves Dynamic Content

You want your pages to be crawled by Google as often as possible. That’s how Google treats the most trusted pages. 

I recently saw someone from the Wall Street Journal complaining that Google had slowed down indexing their pages: it was taking almost 20 minutes to get new pages indexed!

Most pages take 1-3 days; the WSJ gets indexed in under 20 minutes.

Every website has a crawl budget. The more often Google crawls your page and finds changes, the more often it will crawl it in the future, and the larger the crawl budget it will assign to your website.

Part of that crawl budget is related to the structure of your website and how easily Google can crawl it; that’s why sites with a flat structure tend to do very well. Google also likes to see websites and pages changing, growing, and building. It treats this as an improving user experience: there’s more to see and more reason to recrawl your site again and again.

Studies have shown a correlation between increases in rankings and increases in regularity and depth of crawls. With that in mind, we want to train Google that it’s worth coming to our website often because there are interesting changes. Progressive Optimization can help with this.

Where To Start – Introducing Power Factors

The question now becomes how to optimize content progressively. There are so many changes you could make – what’s worth focusing on first?

Sticking to the philosophy of Pareto’s Principle, what’s the most likely 20% of changes that will generate 80% of the results?

Dan Cutteridge, one of the first people to really talk about Progressive Optimization, coined the phrase “power factors.” These are:

  • Page Title
  • URL
  • H1
  • H2, H3, H4, H5, H6
  • Term Frequency
  • “Important Terms”
  • NLP entities

If you’re unsure how to track these factors or decide what changes to make, tools like Surfer, POP, and Clearscope collect the data for you.

A good approach is to focus on those seven factors: make one change and track the results, then once the rankings have settled down, make the next change and track again. An old-school calendar is great for marking down the changes you plan to make, so you don’t forget what you did and when. This, coupled with your regular rank tracker, should work great.


The evidence of a correlation between an increase in crawl regularity and ranking is compelling. I know expert SEOs who spend a lot of time on the technical side of SEO, refining websites to make crawling as easy as possible and to maximize the crawl budget they have.

Now that they’re implementing Progressive Optimization, which essentially adds one or two steps to their process, they’re seeing some significant improvements.

The data set is still small, and Progressive Optimization is not yet a strategy known and understood by the industry as a whole. But if you’re looking for an advantage over competitors, it is definitely worth exploring: ultimately, the only additional investment is some administrative time in mapping out and tracking your changes.