/r/bigseo
Welcome! This subreddit was created with the intent to foster growth and knowledge about SEO. This is the largest and most reputable SEO subreddit on Reddit, run by professional SEOs.
We encourage you to check the sidebar and rules before posting.
See our full rules at https://www.reddit.com/r/bigseo/wiki/rules.
A community for professional Agency, In-House, and self-employed SEOs looking to discuss strategy, share ideas and case studies, and learn.
Follow @rBigSEO on Twitter to get a feed of the best threads from /r/bigseo there, including some older threads you may have missed.
Please click "report" to flag abuse, spam, or anything else that you'd like considered for deletion. Our definition of spam is "self-marketing oriented content" and "low-grade content that offers no value for the average professional".
To reduce spam, this subreddit requires that contributors' accounts are over 1 day old and have at least 3 comment karma.
New posters and posters who fall below that threshold will need to have their posts manually approved, which can take up to 24 hours. Read our guide for new users for more details.
Check out past /r/BigSEO AMAs!
Check out this subreddit's wiki for lots of information about SEO. You can contribute to the wiki once you have over 100 karma in this subreddit!
Tell us about yourself with user flair!
The color represents your type of employment. Edit the text to include your job title (or whatever identifiable information you want). Please do not put URLs or other spammy material into your user flair.
We also have flair for your @Twitter handle.
Look at your username up above in the sidebar and click the little (edit) link to assign yourself some flair.
The mods are active!
Message the mods and we'll get back to you soon.
Any gamer SEOs who would be interested in squadding up with other SEOs to play Call of Duty on a regular cadence... maybe talk some shop in party chat?
Casual Friday is back!
Chat about anything you like, SEO or non-SEO related.
Feel free to share what you have been working on this week, side projects, career stuff... or just whatever is on your mind.
I have a 5-6 year old healthy Google business listing for my UK-registered online business.
Since everything's online, we don't meet customers or need a physical store, so we use one of those London virtual addresses to get mail etc.
I tried to update our address to the virtual address that we pay for (previously the listing covered the regions we operate in), and now they want us to do a video verification.
Am I fudged? If not, how can I deal with this? Any help would be appreciated 🙏
So I have a business, let's say I operate this business in 50 different cities in the US. Let's say this business sells... Car washing services. I want to rank for "car washing" to people in all 50 cities. What tool can I use to track this performance in these cities?
Often for similar keywords I will see pages related to "car washes" linking to a New York-specific page for all of the competitors on the first SERP - obviously Google was scraped using a New York IP, and these pages don't rank the same for searchers everywhere.
Obviously I don't just add "car washes {city}" onto the keywords I'm tracking, because that is a different search, and there could be multiple cities with that name.
Or do I?
Hi all, I have a website that I want to rank #1. It’s a contractor website. To keep it short, the criteria for the site are below:
I do realize that backlinks need to be there for the site to rank. But I want to know the best course of action.
I have someone who goes into other small local businesses in the same niche (so in this case other contractors) and writes guest posts on all of their blogs. There are around 20-ish sites, all local businesses that are meh-ranked; some rank top 3 in their area, some around position 20. So probably a meh amount of visitors, anywhere from 100 to 1,000+ each month, but they're real.
On top of that, I’m going to buy 1 link each month from Hoth. Should I see results from this in about 2-3 months?
In an e-commerce context with many PDPs, how do you handle pagination? Self canonical on all pages? Or on the first one? Why?
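For reference, a minimal sketch of the two options being asked about (URLs and category structure hypothetical): a self-referencing canonical lets Google treat each page in the series as a distinct, indexable URL, while canonicalizing to page 1 tells Google to ignore the deeper pages:

```html
<!-- On a hypothetical /category/widgets?page=2, self-canonical: -->
<link rel="canonical" href="https://example.com/category/widgets?page=2">

<!-- The alternative, canonicalizing every page in the series to page 1: -->
<link rel="canonical" href="https://example.com/category/widgets">
```

Since Google no longer uses rel=prev/next as an indexing signal, the self-canonical version is usually the safer default when deeper pages link to PDPs you want discovered.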
I've activated WP Rocket on my WordPress website. It significantly boosted my desktop speed to 96, but my mobile speed is still stuck at 8 and hasn't improved at all. Does anyone have suggestions on how I can enhance the mobile page speed?
I have a business that offers sound healing and yoga events, etc., and I have a home page that's a brief overview of what I offer, which then links to the specific service pages.
My service page for sound healing is ranking much better than the home page, which makes sense, as there's more info there and I have optimised it with Surfer SEO. Thing is, it's not showing up in Google results as well as I'd like. This might be due to technical SEO, which I am working on, but I'm worried maybe it's because all the content isn't on the home page. Not sure how to get around this, though? It doesn't really make sense for me to have all that detailed content on the home page...
Hi all,
Anyone understand how the "Shop" filter on the Organic Google Shopping tab is powered?
My client already has tens of thousands of products live in Google Merchant Centre and visible in Google Shopping, but their business is not a filter under "Shop".
Anyone else had this issue?
My website has about 50-60 pages, and I recently redid it in September. The problem is, Google has only indexed 8 pages so far!
Here’s some backstory: I didn’t keep the same URLs from my old site because they were in the "www" format and had a bunch of other differences. I did manage to set up 301 redirects for what I could, but most of the old links weren’t salvageable.
Now, I don’t think I’m exceeding my crawl budget since my site isn’t super big, and it’s not terribly slow either. So, why is Google taking forever to index my site? I’m doing my best here, but it feels like Google’s dragging its feet. Any advice or insight would be awesome!
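For anyone in a similar spot, the kind of 301s described above can be sketched in Apache .htaccess (hostnames and paths are hypothetical, and the post doesn't say what server is in use; adjust to your own old-to-new URL mapping):

```apache
RewriteEngine On
# Send old www URLs to the new non-www host, preserving the path
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]

# One-off 301 for a page whose slug changed
Redirect 301 /old-page/ https://example.com/new-page/
```

Preserving the path in the host-level rule matters: it keeps each old URL pointing at its specific new counterpart instead of dumping everything on the homepage.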
Wondering how they pulled this off. It's a mirror of the homepage only.
Link to the SERP.
Our large e-commerce website has the ability to search in-store stock at other stores from a product page, similar to Best Buy, where you can look at a product and a separate URL is created to search, reserve, and pick up. As you can imagine, with thousands of products, these URLs balloon the crawl demand. We have had major crawl budget issues in the past, and since I was hired I've reduced our crawl load by 80%.
Question is: should these inventory URLs continue to be indexed (these pages aren't really SEO-optimized), would it make more sense to block them in GSC and robots.txt, or is there a better way around this?
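A minimal sketch of the robots.txt option, assuming the inventory/pickup pages share a URL prefix (the /pickup/ path here is hypothetical):

```
# robots.txt - /pickup/ is a placeholder for your actual inventory URL pattern
User-agent: *
Disallow: /pickup/
```

One caveat worth knowing: Disallow stops crawling but does not deindex URLs that are already in the index, so a common sequence is to serve a noindex tag first, wait for Google to recrawl, and only then add the Disallow rule.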
Beginner questions welcome.
Post any legitimate SEO question. Ask for help with technical SEO issues you are having, career questions, anything connected to SEO.
Hopefully someone will see and answer your question.
Feel free to post feedback/ideas in this thread also!
r/BigSEO rules still apply, no spam, service offerings, "DM me for help", link exchanges/link sales, or unhelpful links.
We want to start tracking the number of citations for specific URLs and mentions of particular brand names in ChatGPT search. I saw a couple of solutions and templates, but none exactly match what we're looking for. Also, I'm not sure if something like this is even possible at this point, i.e. whether OpenAI allows access to such usage data through its API. Do you have any suggestions?
I have a website that has been ranking at the top for the last 3 years on high-difficulty keywords. We have consistently worked on SEO, incorporating new updates and strategies according to the latest SEO trends. However, over the last 4 weeks, the rankings have started to drop significantly.
I have checked everything, including technical aspects, web vitals, and more, but I haven’t found any issues. The only difference I noticed is that all the top competitors have started focusing heavily on paid search traffic. For example, my paid traffic is 6k, while my competitors’ paid traffic is 30k. Other than this, I couldn't identify any problems.
Could anyone help me find the error or provide guidance?
Hi all, this is going to be a long one. I'm the SEO Specialist of a big garden products / wood retailer in 5 countries in Europe and the UK.
2 years ago we migrated to a new platform involving microservices/headless and all the buzzwords you can think of. We use the following tech stack:
CMS: Contentful
JS framework: Remix
PIM: Akeneo
ERP: NetSuite
The website had to be migrated just before the high season (C-level decision; it had to be done, and I had no say in it). From experience I know that migrating/redirecting URLs will most likely lead to a small temporary drop in rankings, so I chose to keep the flat URL structure we had and not change URLs: just a platform migration. To me that was the safest option, seeing as the high season was only 2 months away.
The JS framework Remix is known for its nested routing, meaning it's really good at efficiently loading data for pages with certain prefixes in the slug, like /category/, /product/, /blog/, etc. We're not using that; we're kind of (imho) abusing the framework with a solution the external dev agency came up with. Their explanation is: "Since we're not using the nested routing, we have to load everything that can possibly be loaded on any page, on every page."
Before even migrating I asked questions about this, because I saw a copious amount of inline JavaScript being loaded (27,000 lines of JavaScript, unminified) on every page, and the number of JavaScript chunks we were loading exceeded 30. (Now we are loading 64 JS chunks...) But the dev agency assured me nothing was wrong with this.
We're working with this external development agency, which codes everything on our site, and I feel like I don't have the technical know-how to counter any of their arguments. It just seems blatantly obvious to me that it must take Google a lot of time and energy to render our pages.
-----------------
The problem: at the beginning of August this year I noticed that 4 of our websites hosted on 1 instance/server had crazy high response times in Search Console. I'm talking about 1100ms average. Just as our response times went up, Google of course decreased its crawl rate to about half of what it was before.
On these 4 websites, all of our rankings dropped terribly on 9 September. So all of a sudden the pagespeed ticket I had created 3 months prior became top priority, as we're seeing our rankings and revenue plummeting.
Meanwhile, the 5th website, hosted on a different instance, doesn't have this issue and isn't seeing the downranking at all. So I'm seeing a clear pattern here.
But the CEO/CTO don't believe my hypothesis that the pagespeed/performance issue is the big factor here, and instead believe the external dev agency, who are implying that my SEO skills are what's causing the downranking.
Even when I'm crawling with Screaming Frog at 5 threads (the standard setting) I'm getting connection timeouts; it just seems like a major clusterfuck to me on the technical side. No performant site I've worked with before (in-house teams / very big websites) ever had this issue.
Question: Remix uses SSR (server-side rendering), so why do we need to send 27,000 lines of Contentful (CMS) JavaScript code to the client on every page? It seems to me this data should remain on the server and not be sent to the client. Is this really because we aren't using Remix's routing? And if so, shouldn't the dev agency have reported the performance implications to me after I expressed my concerns about this humongous block of inline JS on every page?
Example of the inline JS being loaded on every page
As an estimate, 90% of the JS code you're seeing is also present as plain HTML in our source code.
So, as an example: nav links you see in the inline JS code also appear as simple plain <a href> links in the source code of our pages.
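For contrast, a minimal sketch of how data loading normally looks with Remix's nested routing (the route module and the getProduct helper are hypothetical): each route's loader returns only the data that page needs, and only that payload gets serialized into the HTML for hydration, instead of every possible dataset on every page:

```tsx
// app/routes/products.$slug.tsx -- hypothetical route module
import { json, type LoaderFunctionArgs } from "@remix-run/node";
import { useLoaderData } from "@remix-run/react";
import { getProduct } from "~/models/product.server"; // hypothetical helper

export async function loader({ params }: LoaderFunctionArgs) {
  // Only this product's data is fetched and serialized into the page;
  // nothing unrelated to this route ends up in the inline JS payload.
  const product = await getProduct(params.slug!);
  return json({ product });
}

export default function ProductPage() {
  const { product } = useLoaderData<typeof loader>();
  return <h1>{product.name}</h1>;
}
```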
Question: seeing the ranking drops happen on the 4 sites with high response times and a reduced crawl rate, and not on the 1 website with fast response times and a stable crawl rate, do you think I'm on the right track with solving this issue?
I know it's quite a long-winded post, and it might even be a bit unstructured; it's a lot to take in, but I hope it's understandable. Any help is much appreciated!
I've noticed a significant drop in my website traffic lately. I'm not sure what's causing it. Any ideas?
Currently rebuilding a performing website with quite a large amount of content. The organization of the content, posts, and pages is a little all over the place. All 301 redirects will be correctly put in place prior to launch.
What would be the best structure to follow for URLs when it comes to pages:
website.com/services/specific-service/city-name
website.com/areas-we-serve/city-name/specific-service
We could go by current rankings and relevant performance related to the existing content and silos, but I would like to hear everyone's rationale here for one way or the other, or maybe why it doesn't matter too much.
Casual Friday is back!
Chat about anything you like, SEO or non-SEO related.
Feel free to share what you have been working on this week, side projects, career stuff... or just whatever is on your mind.
Will you create a new one?
I'm no SEO wizard; I'm a webmaster, a web dev. But I work with plenty of SEO companies that clients hire, and let me tell you, they all seem to think the key to success is stuffing every inch of a page with words. Drives me nuts! Homepages and landing pages end up looking like they're competing for the longest written novel: paragraphs stacked on paragraphs, features buried under even more text, and points explained to death. Sure, Google bots might be happy, but come on, what real-life visitor is going to wade through that travesty? Is stuffing text really the only way to do SEO these days?
Hello SEOs,
I found multiple robots.txt URLs in my Google Search Console that I didn't even create. I don't know where GSC is fetching these URLs from.
1- https://example.com/robots.txt
2- https://subdomain.example.com/robots.txt
3- http://www.example.com/robots.txt
4- https://www.example.com/robots.txt
5- http://example.com/robots.txt
The main version of my website is the first one (https://example.com/robots.txt). I don't know how to remove other robots.txt URLs. Need help on this.
Moreover, in Google Search Console >> Settings >> Crawl stats >> Hosts, I can see three different URLs for my site:
1- example.com
2- subdomain-example.com
3- www.example.com
The website is on WordPress. I've worked on a lot of websites and never faced such issues. Can anybody tell me if these are technical issues? The website has more than 900 pages and only 10 are indexed; Google is not crawling my site's pages. Content on the website is related to healthcare, and it's 100% AI-generated.
What should I do in order to make Google crawl my website and index its pages?
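For what it's worth, Google checks robots.txt separately on every protocol/host combination it discovers, so seeing all five URLs is expected; the thing to verify is that the non-canonical variants 301 to the canonical host. A minimal Apache .htaccess sketch, assuming https://example.com is the canonical version (hostnames hypothetical, matching the placeholders above):

```apache
RewriteEngine On
# Redirect http:// and www. variants to the canonical https://example.com
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^(.*)$ https://example.com%{REQUEST_URI} [R=301,L]
```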
Are you seeing Local Google Service Ads performing better than SEO and Facebook ads? I have clients asking about LSAs currently and reporting good returns.
Beginner questions welcome.
Post any legitimate SEO question. Ask for help with technical SEO issues you are having, career questions, anything connected to SEO.
Hopefully someone will see and answer your question.
Feel free to post feedback/ideas in this thread also!
r/BigSEO rules still apply, no spam, service offerings, "DM me for help", link exchanges/link sales, or unhelpful links.
Curious to hear the ways you use VAs for SEO tasks.
I am about to start with a VA for content, using my SOPs.
What else have you had success with when using VAs? Thinking backlink outreach? That sort of thing?
I have two pages on my site which have lots of information (for example www.guidedpeaks.com/guides) and I'd like to share links that are already filtered (for example by country or by mountain) and link to them from other pages on the site.
I thought I'd simply use normal GET params, e.g. www.guidedpeaks.com/guides?country=bo.
Obviously an ISO code works technically, but it doesn't convey much, so I could maybe use the full name (www.guidedpeaks.com/guides?country=bolivia).
My question is around what's most typical, and whether there is any advantage/disadvantage to this structure compared to, for example, www.guidedpeaks.com/guides/bolivia. I decided against that option, since I'm not sure how it would work with multiple params (e.g. country=bolivia & mountain=sajama).
Does Google look at the two differently (URL GET params vs. everything in the path)?
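Whichever form you pick, a canonical tag keeps parameter and path versions from competing. A minimal sketch using the URLs from the post (the choice of canonical target is an assumption, not a recommendation from the thread):

```html
<!-- On /guides?country=bolivia: self-canonical if you want the
     filtered view indexed in its own right... -->
<link rel="canonical" href="https://www.guidedpeaks.com/guides?country=bolivia">

<!-- ...or canonical back to the unfiltered page if you don't -->
<link rel="canonical" href="https://www.guidedpeaks.com/guides">
```

Google generally treats both forms as ordinary distinct URLs; the practical trade-off is that path segments read more cleanly and tend to earn better anchors, while query params combine more easily (country=bolivia&mountain=sajama) and are simpler to bulk-control with canonicals or robots.txt patterns.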
Hey all,
I'm a Semrush user, but I have a hard time justifying $289/month for Semrush Trends when I "only" need the Traffic Analytics feature that comes with it.
Are you guys using it? Any alternatives? (Not to Semrush itself, but to the Semrush Trends Traffic Analytics feature.)
I'm fairly new to digital marketing and have been trying to learn as much as possible. I keep seeing SEMrush mentioned everywhere, especially for SEO stuff. While I get that it's probably amazing for SEO (given how often it's recommended), I'm curious about its other features.
Is this tool mainly popular because of its SEO features, or does it offer other strong functionalities? I’ve heard it can be used for content marketing, advertising, market research, and social media management.
For those of you who've been using it for a while, what are your thoughts? Are there other features beyond SEO that you've found useful? Would love to hear about your experiences.
TIA!
Hi BigSEO! We're noticing big ranking drops on multiple Search Console properties. We've been trying to fix server response times, since they were around 1100ms average on all 3 properties (extremely high); they're all hosted on the same slow server. I thought the August core update finished rolling out on 3 September, so I find it odd that all 3 properties are seeing the same downranking on 9 September. Do you guys know if there was any other update going on around 9 September? Are some of you seeing the same? Is fixing these response times the way to go, or should I be checking content/UX? These are all very big e-commerce websites in multiple countries (NL/BE/DE).
Any help or info is much appreciated!
Kind regards, Casper
Screenshots of the rankings of the 3 websites are below (to me it's basically the same graph, showing a drop on all properties on 9 September).