I’ve been writing and editing for a popular website for several years as a contractor. Most of the site’s writers are also contractors. Let’s say the site started with a focus on dogs and later added a subfolder for cats. I began working on the dogs site and started contributing to the cats subfolder two years ago.
Recently, the entire cats subfolder was de-indexed by Google. It seems Google may think the subfolder is a third-party parasite SEO site, even though it’s owned by the same company and created by the same team as the main dogs site. Some content from the main dogs site was also de-indexed.
Earlier this week, I was told to stop writing for the dogs site (though I can continue editing). I'm still allowed to write for the cats subfolder. A handful of other writers received similar instructions; some had to stop writing for one section, others for both. This affects about 6 of us out of 75-100 contract writers, and those of us affected tend to write the site's most profitable and widely syndicated content.
Rumor has it that the SEO team thinks Google may view our contributions as third-party or external, even though we’re part of the same team. Interestingly, all the affected writers also have other clients in the same niche and disclose this in their bios. Many unaffected writers don’t write for other sites.
For SEOs out there: Any idea why this could be happening? Could Google interpret shared authorship across multiple sites or niches as a red flag? Any advice on how the site could address this issue?
Google might be flagging the subfolder due to perceived author overlap across multiple sites. If your bios are linking to competing publications, it could create confusion about content ownership or originality. Try removing or revising author bios for affected writers.
It's possible Google sees duplicate author profiles or similar content syndicated across your site and others. That could read as a quality problem and make the subfolder look like third-party content. I'd recommend writing more distinct author bios and making sure syndicated copies of your articles point back to your site as the original, so your version isn't the one treated as the duplicate.
If the cats subfolder is newly de-indexed, it might help to audit internal linking and ensure that the subfolder is well-integrated into the main domain. Weak internal linking can make subfolders seem like separate entities.
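If it helps, a quick way to spot-check internal linking is to script it rather than eyeball it. This is a minimal sketch assuming Python with requests and beautifulsoup4 installed; the domain, the /cats/ path, and the page list are placeholders for your real hub and category pages:

```python
# Minimal internal-link spot check (example.com, /cats/, and the page list are placeholders).
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

DOMAIN = "https://www.example.com"   # placeholder domain
SUBFOLDER = "/cats/"                 # the de-indexed subfolder
PAGES_TO_CHECK = [                   # key hub/category pages to sample
    DOMAIN + "/",
    DOMAIN + "/dogs/",
]

for page in PAGES_TO_CHECK:
    resp = requests.get(page, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    # Collect links on this page that resolve into the cats subfolder.
    cat_links = {
        urljoin(page, a["href"])
        for a in soup.find_all("a", href=True)
        if urlparse(urljoin(page, a["href"])).path.startswith(SUBFOLDER)
    }
    print(f"{page}: {len(cat_links)} unique links into {SUBFOLDER}")
```

If your navigation and main category pages barely link into /cats/, that supports the "separate entity" theory.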
This might be related to Google's focus on E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness). If multiple writers are tied to competing sites, Google could interpret this as a dilution of authority. Consider reducing external references in bios and focusing on establishing authority for your site.
I suspect there’s an issue with how the subfolder content is structured or categorized. Google might think it’s a different site or unrelated to the main one. Consolidating the content or merging subfolder topics might help.
If the SEO team believes the issue lies with the writers, they should test that by publishing some content under different author names. That would help confirm whether authorship is actually part of the problem.
This sounds like a site architecture issue. If Google isn’t treating the cats subfolder as part of the main site, it might see it as a low-quality or unrelated section. Fixing the structure and linking between the main and subfolder could help.
Check for any technical issues like duplicate meta tags, improper canonical tags, or crawl errors in Search Console. Google might be misinterpreting the content hierarchy or treating parts of the subfolder as duplicate content.
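For anyone who wants to spot-check that quickly outside Search Console, here's a rough sketch, assuming Python with requests and beautifulsoup4; the example.com URLs are placeholders for a few of the de-indexed cats pages:

```python
# Rough canonical / meta robots / status spot check for a handful of subfolder URLs.
import requests
from bs4 import BeautifulSoup

URLS = [  # placeholders: swap in real de-indexed URLs
    "https://www.example.com/cats/article-1/",
    "https://www.example.com/cats/article-2/",
]

for url in URLS:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    robots = soup.find("meta", attrs={"name": "robots"})
    print(url)
    print("  HTTP status:", resp.status_code)
    print("  canonical:  ", canonical["href"] if canonical else "MISSING")
    print("  meta robots:", robots["content"] if robots else "none")
```

A canonical pointing somewhere unexpected, a stray noindex, or a non-200 status on those pages would explain the de-indexing far more mundanely than authorship.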