Could old content be dragging down the overall “authority” of your website? We think so.
You have an important decision to make: should you improve your old content or remove it?
Making the right decisions during this process can bring great rewards, in terms of traffic, organic search visibility (rankings, featured snippets, etc.), links, conversions, and engagement.
On March 27, I presented an SEJ ThinkTank webinar to share the process we at Search Engine Journal have been using to improve and remove old content for the last 20 months.
Here’s a recap of the presentation.
Google’s mission since its inception has been to “[o]rganize the world’s information and make it universally accessible and useful.”
On Google’s end, nothing has changed.
But what has changed is this little thing called content marketing. Around the time of the original Google Panda update, a lot of businesses and brands finally bought into the idea that content is king.
They started creating all sorts of content – some of it was great, but most of it was average, far below average, or just outright terrible.
Today, lots of content is being published, but most of it isn’t very useful. A lot of it is redundant.
In 2016, the web was made up of about 130 trillion individual pages. But the Google Search index contains hundreds of billions of webpages – only a portion of the total pages available on the web.
The search engine is filtering out a lot of stuff and you don’t want that to be you.
So this year, content marketers and creators have a new mission:
“Give Google only your best content and make it optimized, useful, and relevant.”
It’s 2019. Our mission can’t stay the same. It’s time we all start thinking about content in a new way.
Rethinking Your Content Marketing Approach
Google spokespeople have downplayed the issue, saying that old content won’t hurt you. They have also warned that removing content is a dangerous SEO strategy.
But is it really?
Not based on our results.
For the last 20 months, we’ve been hacking and slashing our way through our archives, which has resulted in pageviews and organic traffic growing 60+ percent year over year.
Just check out these numbers:
When I started as Executive Editor in July 2017, we had 910,000 pageviews.
In January of this year, we had a record month – 1.7 million pageviews. At that time, we had about 18,000 pages.
And we topped that record again in March – more than 1.9 million pageviews. Today, we still have 18,000 pages indexed – we’re just getting more out of the same amount of content.
So how did we achieve this growth?
Here’s the process we used.
Step 1: Audit Your Content
The process all begins with auditing and evaluating your content.
There are basically three buckets of content:
- Content that helps you.
- Content that does absolutely nothing for you.
- Content that can hurt you.
We need to figure out which bucket each piece of our content falls into.
Since 2003, Search Engine Journal has been creating tons of content, and it reached a point where our archives got really messy and disorganized. We needed to get out of that chaos.
The first step in the process is to crawl your content.
Some options that you can use to crawl your content include:
- Screaming Frog
- DeepCrawl
- Oncrawl
- Sitebulb
- Botify
There are plenty of other crawlers out there as well. Choose whichever one works for you.
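If you want a feel for what these tools collect before committing to one, a few lines of Python can pull some of the same data points. This is only a minimal sketch, assuming your site exposes a standard sitemap.xml; the example domain and the fields gathered are illustrative, not a substitute for a dedicated crawler.

```python
# Minimal illustration of what a crawler gathers for a content audit.
# Assumes a standard sitemap.xml; example.com is a placeholder domain.
import requests
from bs4 import BeautifulSoup
from xml.etree import ElementTree

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical
SITE_ROOT = "https://www.example.com"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def page_urls(sitemap_url):
    """Pull every <loc> entry out of the sitemap."""
    xml = requests.get(sitemap_url, timeout=10).text
    tree = ElementTree.fromstring(xml)
    return [loc.text for loc in tree.findall(".//sm:loc", NS)]


def audit_row(url):
    """Fetch one page and record the basics: title, word count, internal links."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    word_count = len(soup.get_text(" ", strip=True).split())
    internal_links = sum(
        1 for a in soup.find_all("a", href=True)
        if a["href"].startswith("/") or a["href"].startswith(SITE_ROOT)
    )
    return {"url": url, "title": title, "word_count": word_count,
            "internal_links": internal_links}


if __name__ == "__main__":
    for url in page_urls(SITEMAP_URL)[:25]:  # sample the first 25 pages
        print(audit_row(url))
```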
After you get through the crawling process, you need to look at the following elements for each page (there’s a rough sorting sketch after this list):
- Title: Is it optimized? Does it include a reader benefit?
- URL: Is it SEO friendly? Do you need to change it?
- Author: Who wrote it? Is it an expert/authority in the field?
- Publication date: Is it still fresh or out of date?
- Number of reads: The more reads, the better. It’s a sign of good content that connected with your audience.
- Word count: A low word count isn’t necessarily a sign of low-quality content, but it could point to quality issues.
- Number of links: How many inbound and internal links do you have?
- Trust Flow and Citation Flow: These are Majestic’s metrics for quality score and link equity.
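Pulled together, those data points are what let you sort pages into the three buckets from earlier. Here’s a rough sketch of how that sorting might look once the crawl data is exported to CSV; the column names and thresholds are assumptions for illustration only, not the exact rules we used at SEJ.

```python
# Rough bucketing pass over a crawl export.
# Column names (word_count, inbound_links, pageviews, publish_year) and the
# thresholds below are illustrative assumptions, not SEJ's actual criteria.
import csv


def bucket(row):
    words = int(row["word_count"])
    links = int(row["inbound_links"])
    views = int(row["pageviews"])
    year = int(row["publish_year"])

    # Pages that earn traffic or links are helping you.
    if views > 500 or links > 5:
        return "helps you"
    # Thin, old, unread pages are the likeliest candidates to hurt you.
    if words < 300 and year < 2015 and views == 0:
        return "can hurt you"
    # Everything else is dead weight: improve, consolidate, or remove it.
    return "does nothing for you"


with open("crawl_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row["url"], "->", bucket(row))
```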