/r/bigseo
Welcome! This subreddit was created with the intent to foster growth and knowledge about SEO. This is the largest and most reputable SEO subreddit run by professional SEOs on Reddit.
We encourage you to check the sidebar and rules before posting.
See our full rules at https://www.reddit.com/r/bigseo/wiki/rules.
A community for professional Agency, In-House, and self-employed SEOs looking to discuss strategy, share ideas, case-studies, and learn.
Follow @rBigSEO on Twitter to get a feed of the best threads from /r/bigseo there, including some older threads you may have missed.
Please click report to report abuse, spam, or anything else that you'd like to be flagged for deletion. Our definition of spam is "self-marketing-oriented content" and "low-grade content that offers no value for the average professional".
To remove spam, this subreddit requires that contributors' accounts are over 1 day old and have at least 3 comment karma.
New posters and posters that fall below that threshold will need to have their posts manually approved. This can take up to 24 hours. Read our guide for new users for more details.
Check out past /r/BigSEO AMAs!
Check out this subreddit's wiki for lots of information about SEO. You can contribute to the wiki by having over 100 karma in this subreddit!
Tell us about yourself with user flair!
The color represents your type of employment. Edit the text to include your job title (or whatever identifiable information you want). Please do not put URLs or other spammy material into your user flair.
We also have flair for your @Twitter handle.
Look at your username up above in the sidebar and click the little (edit) link to assign yourself some flair.
The mods are active!
Message the mods and we'll get back to you soon.
I've encountered a strange issue that’s directly affecting my SEO strategy. When I search for an exact phrase with quotes (e.g., "Example Phrase"), my site ranks well and shows up without any problem.
However, when I search for the same phrase without quotes (Example Phrase), my site completely disappears from the search results.
This is concerning from an SEO perspective because it impacts organic visibility. I’ve checked for indexing issues in Search Console, and everything seems fine. The site is optimized for that phrase, with proper on-page SEO, but this discrepancy is affecting my CTR.
Has anyone dealt with something similar? Could it be a technical SEO issue, like canonical tags, content relevance signals, or maybe something related to Google’s recent algorithm changes?
Any insights would be greatly appreciated, as this is having a real impact on my site's performance.
So, this is a case of "The cobbler's children have no shoes" because I've worked in SEO for over 10 years, and for the life of me, I can't figure this out.
I launched a new site for local services in November. I'm under no illusion that I should have rip-roaring search results after only three months; results are going to take a long while. But I've never had a problem getting pages indexed, at least.
If I was working on a client site, I'd work to make sure that
And I've done all of this for my website, but Google is just not indexing these pages, and is not even acknowledging their existence (I'm being dramatic)
Here is what GSC sitemap report looks like:
https://i.postimg.cc/TPKJhXrH/Sitemaps.png
And here is what Page Indexing looks like
https://i.postimg.cc/5NtYfS5s/Page-indexing.png
I've done URL inspections, testing the live URL with Google, and they always come back good; then I resubmit the pages (up until Google tells me to quit it for the day).
Any ideas? Have you seen any issues with indexing lately?
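Not an answer, but a minimal sketch that sometimes surfaces the culprit: parse the submitted sitemap and flag any `<loc>` entries that don't match the canonical host/scheme, since mixed `http`/`https` or `www`/non-`www` entries are a common source of "Discovered – currently not indexed" noise. The sample XML below is made up.

```python
# Sketch: flag sitemap URLs that don't match the expected canonical
# prefix. The sample sitemap and URLs are illustrative only.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> value from a urlset sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def off_canonical(urls: list[str], prefix: str) -> list[str]:
    """URLs that do not start with the expected canonical prefix."""
    return [u for u in urls if not u.startswith(prefix)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/services/</loc></url>
  <url><loc>http://example.com/old-page/</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
bad = off_canonical(urls, "https://www.example.com/")
```

If `bad` is non-empty, Google is being fed URL variants it may treat as duplicates of the canonical ones, which can stall indexing even when live URL tests pass.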
Hi!
So I (too) have installed GTranslate Enterprise on my WordPress site. That was 2.5 years ago, and I thought it was a nice idea. Back then I had no clue about the existence of duplicate content, robots.txt, hreflang, nothing. I was just a blogger who got views via Google and then sales. At first, after installing the GTranslate plugin, it was going great: my first-language blogs still got views in Google, around 300 a day, and my two new languages weren't doing badly either. Then I thought, I'm adding a whole lot more languages. Again: I had no idea of the existence of crawl budget or Google guidelines.
So of course, around September 2022, I noticed some traffic drop. I heard about Google core updates, so I thought that was the cause. Then in April 2024 it went down to around 5 clicks a day in the first language. That's when I quickly had to learn more about SEO; that's going OK.
In April 2024 GTranslate said the traffic drop was because I had hreflang turned on and translated URLs. I didn't exactly know what that all meant, but I turned off the hreflang. Then they told me over and over to wait. Eventually, at the end of December, when I still had 5 clicks a day, I got tired of that and removed the plugin, removed all the foreign sitemaps, and 301'd all the other languages. I also found out that even though my site only consists of 500 blogs/pages (apart from some tags, categories, page 2s and 3s, etc.), it had 500,000 links in GSC, 400,000 of them not indexed. As time passed and we moved into February, it went down to 350,000 non-indexed pages and 6,000 indexed.
In the meantime I keep busy with noindexing tags and images, putting some things in robots.txt, validating errors in GSC, fixing my headers, adjusting my content to make it more interesting, link building, etc. Because maybe, and I truly don't know, my traffic going bad had nothing to do with this plugin and everything to do with the other things I listed.
I'm not sure what my question even is. That it wasn't smart to do the things I did is clear. So, what do you all think? Can my website recover from this? Was it really because of the plugin? I don't have a penalty or any warning in GSC, but I don't know what to do from here: wait more, adjust something, send a signal somewhere, start something else... Do you have any experience with this?
Thank you guys in advance for your time!
I was just analyzing the backlink profile of one of my competitors that gained a lot of traction over the past few months. I noticed that they built quite a few links coming from official city websites.
There's no way those are organic links, since the company is based in a city that's nowhere near the towns pointing to them. All the towns are scattered across the country, so I doubt they have connections with all those websites.
My best guess is that they did an outreach campaign but I'm confused as to what they could be offering them in return. I don't see a town accepting payment in exchange for a backlink, right?
The links are all homepage links as well, coming from the "news" section of the page. Those websites are (and look) very old so they have a ton of authority.
Any ideas how they could have gotten those links? What could they possibly be offering them?
I've been using SEMrush for years, but I've come to the conclusion it's useless for long-tail kw research.
Purely based on guesses, I've created plenty of content that generates highly focused, moderately high-volume traffic from long-tail kw combinations that would have been impossible to find with SEMrush as it either doesn’t show them or quantifies them as 0 volume.
Are there any tools on the market that actually help find long-tail kws?
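Not a full tool, but one common workaround: Google's own autocomplete surfaces long-tail phrases that volume tools often report as zero. A hedged sketch using the unofficial, undocumented `suggestqueries.google.com` endpoint (its response shape can change without notice), expanding a seed with a–z suffixes:

```python
# Sketch: mine long-tail variants from Google's (unofficial) autocomplete
# endpoint. Only the parsing helpers run offline; the network loop is
# guarded under __main__ so you can run it manually.
import json
import string
import urllib.parse
import urllib.request

def parse_suggestions(raw_json: str) -> list[str]:
    """The suggest endpoint returns ["query", ["s1", "s2", ...], ...]."""
    data = json.loads(raw_json)
    return data[1]

def suggest_url(query: str) -> str:
    qs = urllib.parse.urlencode({"client": "firefox", "q": query})
    return "https://suggestqueries.google.com/complete/search?" + qs

def mine_longtails(seed: str) -> set[str]:
    """Network call: expand the seed with a-z suffixes (run manually)."""
    found = set()
    for letter in string.ascii_lowercase:
        with urllib.request.urlopen(suggest_url(f"{seed} {letter}")) as resp:
            found.update(parse_suggestions(resp.read().decode("utf-8")))
    return found

if __name__ == "__main__":
    print(sorted(mine_longtails("standing desk")))
```

The trade-off is that autocomplete gives you real phrases people type but no volume numbers, which is arguably the right trade for long-tail work where the volume tools round to zero anyway.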
Hey everyone,
I run a diabetes-focused blog where I’ve published 400+ articles covering topics like diabetes and weight loss. Here’s what happened:
Initially, impressions grew from 0 to 2,000 per day. After the October Google update, impressions dropped significantly. In December, impressions briefly returned to 2,000 per day for 5 days. Then, impressions plummeted to 50-60 per day and haven’t recovered since. What I’ve already done: ✔ Good page speed ✔ Topical clusters in place ✔ Content quality maintained ✔ Site is ~1 year old
❌ No backlinks yet
I’ve optimized everything I can think of, but I’m still struggling with visibility. Has anyone faced something similar? Any advice on recovery?
Bringing this thread back today because I have my own question to throw in, sorry mods
Google is indexing all translated homepages (e.g., example.com/eu/de/, example.com/eu/fr/, example.com/eu/it/) but is not indexing my main example.com/eu/ homepage.
Even though I’ve set example.com/eu/ as the canonical, Google keeps choosing example.com/eu instead. Has anyone dealt with this before?
I'm fixing broken backlinks, and there's a handful directing to the profile page of a founder who is no longer with the company.
The anchor texts are all his name on those sites so I can’t ask the sites to change the name and direct it to the founder that’s still with us.
I don’t want to redirect and get soft 404’s.
What’s the best solution?
Do I disavow?
I'm hesitant to ask them to change the link to point at our home page, because that's bad SEO, right?
I bought a site that had about 50 pages on it. They were old and ended in .cfm.
I set up a WordPress site and redirected as many of the .cfm files as I could to new URLs, e.g., www.site.com/page.cfm to www.site.com/page/.
I then used Link Juice Keeper, which redirects any other broken links to the home page.
I've run this for about a year now.
Now I'm at the point of incorporating this website into my main site as its own section, e.g., my main site is about dogs and the acquired site is about "dog grooming".
So now I will be adding it to my main site at www.dogs.com/dog-grooming/
And the link structure will stay the same, i.e., www.dogs.com/dog-grooming/page/
The old site is getting good traffic, so I don't want to mess up its SEO and rankings.
Not all of them are just .cfm redirects; some ended like resources?sectorTypeId=8, and I moved those to /resources/section1/
So now I want to 301 redirect the old site to the new site. I'm sure there's a way to move the whole structure to my subdirectory /dog-grooming/
But my concern is the old .cfm and resources? pages. If I use one rule to redirect the whole thing, the old pages, which still receive links and traffic, will probably go to an incorrect URL.
Is there a way I can move the whole structure of the site plus the old links that I also redirected?
Hope that's not too confusing.
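One way to reason about this: build an explicit mapping that sends each old URL straight to its final destination in one hop, handling the `.cfm` and query-string exceptions before the catch-all prefix rule, rather than chaining redirects through the intermediate site. A hedged sketch (all URLs and the `sectorTypeId` mapping are illustrative) of that precedence logic:

```python
# Sketch: map old standalone-site URLs (legacy .cfm and query-string
# forms included) to their final /dog-grooming/ URLs in a single hop.
# Exception rules are checked before the generic prefix rule.
from urllib.parse import urlparse, parse_qs

# Legacy exceptions that don't follow the simple pattern (assumed values).
SPECIAL = {
    "/resources": {"8": "/resources/section1/"},  # keyed by sectorTypeId
}

def migrate(old_url: str, new_root: str = "https://www.dogs.com/dog-grooming") -> str:
    """Map an old standalone-site URL to its new main-site URL."""
    parts = urlparse(old_url)
    path, query = parts.path, parse_qs(parts.query)
    if path.endswith(".cfm"):                      # /page.cfm -> /page/
        path = path[: -len(".cfm")] + "/"
    elif path in SPECIAL and "sectorTypeId" in query:
        path = SPECIAL[path][query["sectorTypeId"][0]]
    elif not path.endswith("/"):
        path += "/"
    return new_root + path
```

The same ordering applies if you implement this as server rewrite rules: specific `.cfm` and query-string rules first, then one catch-all that prepends `/dog-grooming/`, so the old deep links never fall through to a wrong URL.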
Hi there. I have a website where Semrush reports error 406 Not Acceptable on a lot of pages, including indexed pages on the main domain. Is there a way to fix it, and could it be impacting my ranking somehow? I have around 10k traffic but only 10 Domain Authority, while sites with 1k traffic have the same authority as me.
I’ve some months ago disavowed some backlinks that were gambling related or toxic scored - can that be the reason? I saw a huge decline in backlinks last year.
Thanks!
The domain is theportuguesetraveler com
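A 406 that only a crawler sees (while browsers load the page fine) usually points to a WAF or security plugin rejecting non-browser User-Agents, not to anything Google necessarily sees. A hedged sketch for comparing status codes under different User-Agent headers (the URL and bot UA string are examples; run the network part manually):

```python
# Sketch: compare HTTP status codes for a browser-like User-Agent vs a
# crawler User-Agent to diagnose User-Agent-based blocking.
import urllib.request
import urllib.error

def status_for(url: str, user_agent: str) -> int:
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def likely_ua_block(browser_status: int, bot_status: int) -> bool:
    """Browser fine but crawler rejected -> User-Agent filtering."""
    return browser_status == 200 and bot_status in (403, 406)

if __name__ == "__main__":
    url = "https://www.example.com/"  # substitute your own domain
    browser = status_for(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    bot = status_for(url, "Mozilla/5.0 (compatible; SemrushBot)")
    print(browser, bot, likely_ua_block(browser, bot))
```

If only Semrush's bot is blocked, rankings are likely unaffected; the fix would be allow-listing the crawler in the WAF/security plugin rather than changing the site.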
Working with a client who has 2 GMB profiles in adjacent suburbs. The business has many "location pages" for suburbs in its region.
Wondering whether it would be best to delete one of the GMB profiles or keep both? Also, one of the titles has keywords stuffed into it... would it be safe to change this to something more relevant/accurate? (I would change the citations/NAPs accordingly)
If both are kept, the only difference in their titles would be the suburbs... If both are kept, should they both link to the home page or to separate location pages? Should there be a "main" profile which I tell the client to focus on getting more Google reviews for? Should posts/images/services be the same or different?
Does anybody here know how to go about this? This topic came up in a brainstorming session among leaders, and now we've been tasked with coming up with some actionables... any idea how to approach it? I've only got experience in keyword optimization, content gap analysis, and the like. Please help.
Hi,
I've been receiving a bunch of 404 errors in Google Search Console for a ton of URLs ending in /1000. Anyone know if this will affect our ranking?
Thanks in advance!
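Google has generally said 404s for URLs that shouldn't exist don't hurt the ranking of your real pages, but it's still worth finding which template or plugin is generating the `/1000` links. A quick sketch for triaging the exported GSC 404 report by site section (sample URLs are made up):

```python
# Sketch: group 404 URLs ending in /1000 by their first path segment to
# see whether one template or the whole site is generating them.
from collections import Counter
from urllib.parse import urlparse

def is_kilo_404(url: str) -> bool:
    """True when the path's last segment is the literal '1000'."""
    return urlparse(url).path.rstrip("/").endswith("/1000")

def section(url: str) -> str:
    """First path segment, e.g. /blog/post-a/1000 -> /blog/."""
    return "/" + urlparse(url).path.strip("/").split("/", 1)[0] + "/"

urls = [
    "https://example.com/blog/post-a/1000",
    "https://example.com/blog/post-b/1000",
    "https://example.com/shop/item/1000",
]
by_section = Counter(section(u) for u in urls if is_kilo_404(u))
```

If the counts cluster under one section, the malformed links are almost certainly coming from that template's markup, and fixing the template makes the 404s dry up on their own.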
I need guidance on what a news website can do to get genuinely organic visitors and increase distribution of its news articles.
The articles rank on Google / Google News.
They currently get a very modest number of views.
Has anybody had experience syndicating a news website on another news website? How do you go about it?
Any help will be appreciated
Currently with Fathom, but free sounds good
They say they won't sell your info, lol.
Beginner questions welcome.
Post any legitimate SEO question. Ask for help with technical SEO issues you are having, career questions, anything connected to SEO.
Hopefully someone will see and answer your question.
Feel free to post feedback/ideas in this thread also!
r/BigSEO rules still apply, no spam, service offerings, "DM me for help", link exchanges/link sales, or unhelpful links.
Over the past few weeks, I’ve been conducting an experiment. I’ve tracked the websites that consistently appear on the first three pages of Google search results across broad topics like IT, health, space, DIY, and philosophy. After weeks of analysis, one thing became glaringly obvious: the same “elite” group of about 100 authoritative websites dominates nearly all search results.
This raises a crucial question: Why does Google continue to index billions of other websites when they have almost zero chance of appearing in search results?
Google spends billions of dollars maintaining and growing its infrastructure to crawl, index, and serve search results. This includes massive data centers consuming astronomical amounts of energy and resources. Yet, the majority of this effort supports websites that never make it to the visible part of search results.
Here’s the brutal truth:
The cost of crawling and indexing billions of web pages goes beyond money:
The reality is that smaller players rarely rank, no matter how brilliant or original their content is. If this ecosystem already revolves around the big players, why not simplify the system entirely?
Yes, this would mean smaller websites lose any remaining chance of visibility. But let’s be honest: that chance is already minuscule. A passionate hobby blogger with groundbreaking ideas about space exploration or IT solutions is virtually invisible unless they somehow game the SEO system or get lucky.
Google claims to reward great content, but in reality, it rewards authority. That authority is built by big budgets, years of presence, and aggressive SEO strategies—things smaller players simply don’t have.
The harsh truth is this: preserving an illusion of fairness by indexing everything comes at a massive ecological and financial cost.
By limiting the index to the most impactful websites, Google could:
It’s a tough call, but the numbers speak for themselves. Indexing billions of irrelevant pages wastes resources and does nothing for users. Perhaps it’s time to admit that not everything on the internet needs to be indexed.
What do you think? Should Google stop indexing the majority of websites to save money and the planet? Or is it worth keeping the door open for smaller voices, even if they rarely get heard?
I have been testing DataForSEO using Make and a tutorial from their YouTube channel to do some keyword research, but I'm finding the results to be totally irrelevant.
I typed in the keyword "Pet Fitness Plan" and got 100 results for Planet Fitness, gym memberships, and weekly fitness plans! This was after following the setup exactly as they had it in their video.
Has anyone else come across odd examples like this? Wondering what I am doing wrong.
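Keyword APIs often match on any overlapping token, which is how "Planet Fitness" leaks into a "pet fitness plan" query. One workaround, sketched below with made-up data, is a post-filter that keeps only suggestions containing every token of the seed phrase:

```python
# Sketch: post-filter keyword-API results by requiring every seed token
# to appear in the suggestion. Sample data is illustrative only.
def relevant(seed: str, suggestions: list[str]) -> list[str]:
    seed_tokens = set(seed.lower().split())
    return [s for s in suggestions if seed_tokens <= set(s.lower().split())]

results = ["planet fitness membership", "pet fitness plan ideas",
           "weekly fitness plan", "dog pet fitness plan"]
keep = relevant("pet fitness plan", results)
```

This is strict (it drops reworded variants like "fitness plan for pets"); a looser version could require, say, two of three tokens, but the strict filter is a good first pass on noisy API output.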
Casual Friday is back!
Chat about anything you like, SEO or non-SEO related.
Feel free to share what you have been working on this week, side projects, career stuff... or just whatever is on your mind.
Some of you may be familiar with this site - they're pretty big and used to rank very well for a lot of keywords, including keywords in ultra-competitive niches.
However, I've noticed that Google seems to have completely purged the site from its rankings (even if you search Techopedia.com in Google, nothing comes up).
What do you think is the reason for this? I know they used to sell a lot of guest posts - I wonder if Google is finally clamping down on this more aggressively.
Hello! Most of my product images are not showing up on Google because I have put a watermark on them.
Do I need to completely remove the watermark for the images to be accepted by Google Merchant Center? Or can I keep my product branding by moving it to a corner of the image where it no longer overlaps the product?
What is the criteria for it?
Currently, the name of my website is put on the product image as a watermark (5-10% opacity).
If, instead of having the watermark on the product, I move it to the bottom right of the image where it doesn't overlap the product, will it be accepted by Google Merchant Center?
Or do I need to completely remove the branding from the image for it to be accepted?
Can anyone help me with resources to set up a connection in Power BI/Python to extract reports from Conductor?
My website was hacked ~5 days ago with over 50k pages of spam indexed. I did not notice the hack until they were indexed and our traffic dropped over 50% overnight. Prior to the hack, we were doing well on SEO. We were ranking on hundreds of relevant keywords in top spots.
It appears as if the hack changed all the meta tags for every single page.
I have rolled back the website, removed the entry point, and de-indexed the spam pages. Google appears to have already de-indexed the spam and begun re-indexing the correct meta tags (both started within 5-6 hours of my fixes). We still rank high in some search results, but we've obviously lost a lot of them.
Is this going to be recoverable? Will we ever get back to our original rankings?
We all know Google hides data by search queries and clicks. But I've never seen this.
Our SEO team did a great job, and Sitechecker now ranks #1 for the "SERP alerts tool" keyword and related keywords. However, Search Console shows 0 clicks for all of these search queries.
Yes, that happens, but there is always some reason for it. Usually it's paid ads or featured snippets. In this case, it's a simple 10-blue-links SERP (something we all dream of 💰).
I checked the SERP in different tools. It's the same everywhere: 10 blue links, and we rank #1.
Do you see something similar in your data? What do you think is the reason?
My company resells several software vendors, most of which have multiple products: over 20 vendors and over 100 total products. As is the norm in software, there are frequent changes (new products, changes of product names and/or logos, etc.) and, of course, the need to regularly refresh content to keep the SERPs happy.
I am a one-person marketing department, so time is precious, but everything from content creation to SWAG design falls to me. The more pages I have, the harder it is to keep content up-to-date and refreshed often, but I want the best SEO/search results possible.
With the recent changes in Google, AI, etc., how much is the gain in SEO if I have a landing page for every vendor AND a landing page for every product vs having the vendor/products on the same page?
Some back story:
I have a website (launched 2013) that has held the number 1 position (since 2015) for its main keyword, for the past 8 years.
This website had a previous domain: previous.com
I wanted to make a complete rebrand because I wanted to pivot, and I saw an opportunity.
Complete rebrand (2024), changed the domain name from previous.com -> new.com
Made sure all 301 redirects were fine.
It worked: Google didn't care. SERP rankings dipped a bit but recovered.
NOW…
I’m expanding the product and the landing page no longer made sense, it was focused on a single feature and I wanted it to be more broad.
Launched new landing page (1st December 2024) and moved the previous landing page to another slug.
GOOGLE HATED THIS.
He’s coming up with new titles that reflect the previous landing page.
It’s not picking up the previous landing page changed, despite internal links / sitemaps and even 301 redirects to the new slug.
Even worst, now Google won’t even show my website when searching for its own name!
It does however show it, with a complete different title when searching feature related keywords.
What options do I have?
I have had dozens of customers reach out saying they can’t find my website on Google!
Note:
Hi all,
I am in the process of planning how our products should be organized on the website, and I need some feedback (pros/cons, comments, etc.) on the plan I have so far.
We are a B2B platform, and the goal is to enter 20 different markets in Europe over the next couple of years, but I want the foundation to be on point before we start.
I am a firm believer that local content usually resonates and performs better than standardized English content in markets where English is a second language.
I am looking for insight in managing content in multiple languages on a strategic level.
So far my plan is:
Launching the "basic" pages locally in all 20 languages (so: home page, product page, about, contact, some supporting pages). About 20 pages in total per language.
Evolving the English part with additional supporting pages and a blog universe (articles, news, etc.).
Later on, expanding the English supporting content to the other languages one by one.
My biggest question is:
What are the pros/cons of a gTLD with language subdirectories, e.g., .com/[language code] (.com/de/, .com/fr/, .com/nl/), vs. ccTLDs, e.g., .[language code] (.de, .fr, .nl)?
My initial thought is that ccTLDs will have a better impact in the local markets but be more expensive to run link building for, whereas the gTLD will be easier to manage, though with different builds, as we expand the supporting content to all languages.
What are your thoughts?
Anything I need to be careful about?
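One thing to budget for with either structure: at 20 languages, every localized page needs a full reciprocal set of hreflang annotations (all languages plus x-default), which is far easier to automate on one domain with subdirectories than across 20 ccTLDs. A sketch, assuming the subdirectory structure and example language codes:

```python
# Sketch: generate the reciprocal hreflang link tags for one page slug
# across all language subdirectories. LANGS and URLs are examples.
LANGS = ["en", "de", "fr", "nl"]  # ...extend to all 20 languages

def hreflang_tags(root: str, slug: str) -> list[str]:
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{root}/{lang}/{slug}" />'
        for lang in LANGS
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{root}/en/{slug}" />'
    )
    return tags

tags = hreflang_tags("https://example.com", "product/")
```

Every listed URL must also link back to all the others (hreflang is only honored when reciprocal), which is the main reason this gets expensive to maintain by hand across 20 separate ccTLD sites.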
I have a couple of theories to run by you guys, as GSC is doing my head in.
• Theory 1: A reported drop in ranks could also mean a drop in search demand. One of our clients was an office supplies company, and I'd see their ranks drop every weekend and then recover on Monday. I see other (B2C) clients with seasonal products having massive rank drops off-season for no apparent reason (and that aren't necessarily borne out by manual checks through a VPN on an incognito window).
• Theory 2: The first Organic result on a SERPs page isn't in position 1.
-- The AI overview takes position one, and credited sources (to the right) get positions 1.2, 1.3, 1.4, etc.
-- 'People also ask' takes a position
-- Image block takes a position
-- Other Google-provided elements take positions
...so with an AI overview and a PAA block coming before the first Organic result, that first result is actually in position three.
I'm genuinely interested in your opinions, particularly if you have experience that feeds your beliefs. Shoot me down if you think I'm completely off with these (but I kinda think I'm not).
Thanks
I was just checking the traffic and ranking status of two of the most-liked SEO tools, Semrush and Ahrefs.
After continuous Google algorithm updates, Semrush has lost a lot of traffic and keyword rankings. Here I am adding a screenshot of Semrush's website audit status.
On the other hand, Ahrefs is gaining organic traffic and improving its organic keyword rankings. Here I am sharing the screenshot.
Both screenshots are taken from the Semrush tool :)