/r/TechSEO
Welcome to Tech SEO, a subreddit dedicated to the tech-nerd side of SEO.
Wednesday, March 28th, 2018 - John Mueller of Google
I spent the weekend getting ranty about the current state of SEO. I've been a long-time contributor to technical SEO and its importance on the web, but we're now at a point where little is making a difference for digital publishers.
Thought the group might find it interesting or perhaps disagree with my opinions.
Hello everyone, I'm kind of new to SEO, but I'll try to explain this.
As you can see in the screenshot above, Google seems to be picking up footer text and some code and displaying it as the meta description in the SERP snippets (including sitelinks) instead of the original meta description. How can I stop it from doing this? I had suggested a data-nosnippet attribute on the footer section to the tech team, but now some other code is getting picked up in the snippets as well. Not sure where to go from here. I checked the meta descriptions for the PLP pages; they follow a template and are slightly similar to each other, but I'm not entirely sure that's the cause. What should I do next?
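For reference, the data-nosnippet change I suggested for the footer looks roughly like this (just a sketch; the actual footer markup will differ):

<footer>
  <!-- Anything inside data-nosnippet is excluded from Google's search snippets -->
  <div data-nosnippet>
    <p>Free shipping on orders over $50. Sign up for our newsletter...</p>
  </div>
</footer>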
Hi guys, on the 21st of January I requested indexing for some of my website pages that weren't indexed, to get them sorted out. One day went by, and now ALL of the pages, both the already-indexed ones and the non-indexed ones, are GONE, null; there's a literal hole in the analytics for the entire website.
Notes: no major changes were made to the site, no errors are reported by Google, no penalty, no blocked indexing, no security issues. The site is around 2 years old, everything is up to date, and content is added in a meticulous manner.
HELP 😭
I run a comparison blog in Spain and a baby names app. Initially, to reduce maintenance, I kept them together.
They have both grown in traffic, but they are quite unrelated. Some users coming into the names blog do go on to baby products in the comparison part, but the overlap is relatively low.
Would it make sense to move away my baby names section of my site?
Migrating that to a separate site makes sense to improve my SEO and topical authority right?
I was thinking of using a subdomain to start with, e.g. baby.domain.com?
What do you think?
Hello, most of my product images are not showing up on Google because I have put a watermark on them.
Do I need to completely remove the watermark from the image for it to be accepted by Google Merchant Center, or can I keep my product branding by placing it in a corner of the image where it no longer overlaps the product?
What are the criteria for this?
Currently, the name of my website is placed on the product image as a watermark (5-10% opacity).
If, instead of having the watermark over the product, I move it to the bottom right of the image where it doesn't overlap the product, will it be accepted by Google Merchant Center?
Or do I need to remove the branding from the image completely for it to be accepted?
I don't think I've seen this before, but I have a few pages in our blog content that were being flagged as 404 in Ahrefs. When I open them, they load fine. I noticed that the Ayima Redirect Checker is flagging them as 404 too.
I tried several browsers (I only use Chrome regularly), so I don't think it's a caching issue.
Anyone seen this before?
We're about to do a huge site migration, 301-redirecting thousands of URLs and pages. The site gets around 100k monthly visits. The main thing we're doing is removing the /us subfolder for US content (so all US content sits under the root domain) to make it clear to Google that the US is our main market and hopefully fix some ranking issues with our homepage.
We're even considering removing the other countries (/uk, /au, etc.) altogether by 301-redirecting those pages to the relevant US pages, or splitting them out to separate domains, to show Google we are serious about the US and consolidate all of the site's authority there.
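To make the plan concrete, the mapping is basically "strip the /us prefix". A rough Node sketch I would use to spot-check a sample of URLs once the redirects go live (placeholder domain and paths, assumes Node 18+ for fetch, run as an ES module):

// Verify that each /us URL 301s to the same path on the root domain.
const samplePaths = ['/us/', '/us/pricing/', '/us/blog/some-post/'];

for (const path of samplePaths) {
  const oldUrl = `https://www.example.com${path}`;
  const expected = `https://www.example.com${path.replace(/^\/us/, '') || '/'}`;

  const res = await fetch(oldUrl, { redirect: 'manual' });
  const location = res.headers.get('location');
  const ok = res.status === 301 && location === expected;

  console.log(`${ok ? 'OK  ' : 'FAIL'} ${oldUrl} -> ${res.status} ${location ?? '(no Location header)'}`);
}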
This feels like a potentially site-tanking task, and I want to avoid messing anything up. I spoke to an agency, and they suggested we do it in stages: migrating a few hundred pages every couple of weeks, analyzing what happens, and then continuing. I've never heard of this approach before, and it made me wonder if it's the right way to do it. They also quoted $25k+ for this, so I'm wondering if they only suggested this approach to be able to quote such a high price. There is no way we can pay that anyway; we are a small website with limited sales.
How would you handle a migration like this? Also, if we split things up (some US pages still on /us
and others moved to the root domain before migration is complete), wouldn’t that just confuse Google even more?
Any tips, advice, or ideas? Or someone I should talk to about this? Thanks a lot!
EDIT: We get 99% of our sales from the US, and we have no plans to remove any US pages. Since the US is so important, we want the US homepage to start ranking, and we want all of the US content to have every chance it needs to rank better. We were advised that our current international structure, with the US content under a /us subfolder, might be hampering this slightly, and that it is definitely messing with the US homepage.
EDIT 2: People's livelihoods are at stake here, so I definitely want to do this correctly, and it is no easy decision. I have been trying to get in contact with people for help, but finding good help is not easy, and going through an agency is so expensive. If you know someone who knows this area well, that would be super helpful.
----------------------------------
Here are my prior threads on the subject that helped me understand a full migration was necessary.
Since I changed to Hostinger, I'm having massive issues with failed crawl attempts. Are there any issues I can address on my side to fix these? This is my site
Since Nov 18th, I've been unable to successfully submit new pages for indexing in a client's Google Search Console.
We just get the error: Oops! Something went wrong. We had a problem submitting your indexing request. Please try again later.
Eventually the page will get crawled, but sits under the “Crawled - not indexed” category.
I’ve tried adding a new user to the account and having them submit it. Nothing.
I’ve written content myself and passed AI detectors. Nothing.
I’ve added internal linking. Nothing.
I’ve confirmed that the new pages get added to the sitemap, and the sitemap is crawled. Nothing.
I’m at a loss for words here. It’s not like we’re doing anything crazy with this profile. The website itself is still ranking and indexed and doing fine…but indexing new pages for whatever reason is impossible.
I have a website built with plain HTML and Tailwind that I deployed a week ago. Its SEO score is above 80 in Rank Math, and it shows as indexed in Search Console, but it doesn't show up on Google when I search its name, or even when I search its domain name without the extension, like "websiteName".
I checked on other index-checker websites, but they all say it's not indexed.
Hi, I'm a web developer who is helping a client migrate their site. I'm okay at SEO, but not an expert, and I have one situation that I need some advice on.
The client's blog has URLs like /blog/hvac-service-reading/[blog-title] and /blog/hvac-repair-west-chester/[blog-title].
But these are no longer towns that they want to target. They still do business in these towns but they're targeting a new region as their main business center.
Would you recommend I change these to more generic urls like: /blog/[blog-title]
OR should I change the cities names and keep the current structure like: /blog/hvac-service-[newTown]/[blog-title]
OR do I leave it as is?
I have been looking at implementing schema with JSON-LD using various generators and plugins, and I found they are all very similar except for the way Yoast does it.
Most generators and plugins don't use the "@graph" collection of objects; instead they output a single "@type" followed by its properties.
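For comparison, the single-object style most generators produce looks something like this (simplified, with placeholder values):

{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "name": "Example Page",
  "url": "https://www.example.com/example-page/test/",
  "publisher": {
    "@type": "Organization",
    "name": "Example"
  }
}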
This is an example of Yoast's implementation of JSON-LD.
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://www.example.com/#/schema/Organization/1"
    },
    {
      "@type": "WebSite",
      "@id": "https://www.example.com/#/schema/WebSite/1"
    },
    {
      "@type": "WebPage",
      "@id": "https://www.example.com/example-page/test/",
      "url": "https://www.example.com/example-page/test/",
      "isPartOf": {
        "@id": "https://www.example.com/#/schema/WebSite/1"
      }
    }
  ]
}
So why is Yoast doing it this way and does it have any advantages?
Hi guys, I'm migrating my website from a custom CMS to Shopify and I saw something that may potentially be an issue. For all of my URLs, the internal URLs you access through the website are different from the external, indexed URLs Google shows. If I go on my website and search for a product, it takes me to a page with the URL website.com/product. But if I search for that product on Google, it goes to the exact same page, only with the URL website.com/product.html. For every internal URL there is no .html at the end, but for every external indexed URL there is. The URLs are the same in every other way.
Are these treated as the same page? And if they aren't, how much of an issue do you think this has been for my website, given that the indexed and internal links have always been different?
Also, Shopify seems to have a limit on URL redirects and I have quite a few products. Is it alright if I only 301-redirect indexed pages and leave out some non-indexed pages? I have about 70,000 indexed pages, 50,000 of which are unsubmitted. Or is there a way to exceed this redirect limit without upgrading to the Plus plan?
On a side note, does anyone have experience with migrating their website to Shopify that they can share? I just want to know how it went. My current website is in a fairly small industry but is extremely slow, with no customisation and a lot of issues, especially with URLs: on top of the .html issue, each page has 3 or 4 URLs (6 or 8 if you include the duplicate .html external links) that all seem to rank for keywords, usually poorly. I'm just not too sure what to expect when first migrating, and unfortunately I don't have the funds to hire a professional team to do it for me.
Thanks, I would really appreciate it if anyone knows anything about these issues and can share some insight.
5 years ago I scraped 10,000 Google SERPs with SF (Screaming Frog) to compile a list of the top 10 URLs (and other data) for each keyword in the client's portfolio of 10,000 keywords.
I'm trying the same setup today (JS rendering, slowed-down queue, changed user agent, local Google) but it fails. Wondering if anyone has cracked this?
Hi, I have a big website with over 4,800 redirects in the vhost. I'm now looking for a tool that checks the redirects and tells me whether I still need them, whether the URLs are still indexed, etc.
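The kind of check I have in mind, as a rough Node sketch (placeholder file name and format, assumes Node 18+ and an ES module for top-level await; it can only report HTTP status, not whether a URL is still indexed):

// Check the destination of each redirect; a dead target is a candidate for cleanup.
import { readFileSync } from 'node:fs';

// One redirect per line: "source-path target-url" (placeholder format).
const rules = readFileSync('redirects.txt', 'utf8')
  .split('\n')
  .map((line) => line.trim())
  .filter(Boolean)
  .map((line) => line.split(/\s+/));

for (const [source, target] of rules) {
  const res = await fetch(target, { method: 'HEAD', redirect: 'manual' });
  console.log(`${res.status}\t${source}\t->\t${target}`);
}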
many thanks!
Hi there,
I'm absolutely baffled as to what's going on with this. I purchased the domain in November '24. The site went live about a week ago. I added it to GSC and submitted the sitemap.
It's indexed, but the site name has completely and utterly nothing to do with me, and it's only happening on the subpages, even though they're meant to inherit the home page's site name anyway!
I've gone through the steps in the Google documentation on site names, validated the schema, etc. I've requested reindexing, regenerated the sitemap, etc.
I know this was a weird issue in 2023, and there was a submission form for errors, but that's long since been closed. I can't figure out why this is happening, or how to get it fixed, but it looks awful 😭 I'd be glad for just the domain as the sitename at this point.
If anyone has any pointers, I'd be forever grateful!
Hello, u/johnmu - how important are the HTML 5 elements like <header>, <nav>, <main>, <footer> - to detect main content and for UX generally? Thank you.
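For context, I mean a semantic skeleton roughly like this (just an illustration):

<body>
  <header>Site logo, search box</header>
  <nav>Primary navigation links</nav>
  <main>
    <article>The actual page content</article>
  </main>
  <footer>Legal links, copyright notice</footer>
</body>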
Hello,
Our tech team has implemented lazy loading for the homepage content after the third widget. Upon checking the rendered page, I noticed that Google doesn't seem to be receiving content beyond the third widget.
When I raised this with the team, they explained that the lazy loading was introduced to improve page speed, especially since many of the images on the page are GIFs, which can slow down load time. The decision was made to lazy load those images.
Should I ensure that Google can access the full content of the page, rather than just the first few widgets and footer content?
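For reference, the alternative I was thinking of suggesting to the team: keep the widget markup in the server-rendered HTML and only defer the images with the browser's native attribute, roughly like this (a simplified sketch with placeholder paths):

<section class="widget">
  <h2>Widget four</h2>
  <p>The text content stays in the initial HTML, so it is there for Google to render.</p>
  <img src="/img/promo-banner.gif" alt="Promo banner" loading="lazy" width="600" height="300">
</section>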
Any insights or best practices would be greatly appreciated!
Thanks!
I wrote a script in JS that grabs the H-tags and builds a table of contents on the page. I was wondering if this is bad for SEO, or if I should render the table of contents server-side? Any insights would be great.
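Roughly what the script does, as a simplified sketch (it assumes the headings live inside an article element and generates anchor IDs when they are missing):

// Build a table of contents from the H2/H3 headings on the page.
document.addEventListener('DOMContentLoaded', () => {
  const article = document.querySelector('article');
  if (!article) return;

  const headings = article.querySelectorAll('h2, h3');
  if (!headings.length) return;

  const toc = document.createElement('ul');
  headings.forEach((heading, i) => {
    if (!heading.id) heading.id = `section-${i}`; // generate an anchor if missing
    const link = document.createElement('a');
    link.href = `#${heading.id}`;
    link.textContent = heading.textContent;
    const item = document.createElement('li');
    item.appendChild(link);
    toc.appendChild(item);
  });

  article.prepend(toc); // client-side insert; Google only sees this after rendering
});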
Hi all,
I'm dealing with an odd indexing issue and hoping someone has experienced something similar. A bunch of pages on my site suddenly got deindexed and are now showing as "excluded by noindex tag" in Google Search Console. Here's the strange part:
I've been manually requesting indexing through GSC which is slowly getting them back in, but it's a painstaking process. Has anyone run into this issue before? Any suggestions for getting pages reindexed more quickly?
I can confirm they're actually deindexed since they don't appear for any of their previous ranking queries. Looking for any tips or insights from those who might have dealt with something similar.
UPDATE Jan 18, 2025:
The site's getting back in the index slowly but surely - at this point about 70% of pages are back. Here are the steps taken, no clue if any of them helped but for reference:
Requested indexing for all pages through GSC.
Submitted feedback through GSC tool (no response)
Pushed pages through IndexNow (almost certainly did nothing; the call looked roughly like the sketch after this list).
Hammered the pages with links from numerous social media accounts. The idea here was just to get more entry points for crawlers; again, though, not sure if this helped.
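For completeness, the IndexNow submission was roughly this (key, key file location, and URLs are placeholders; run as an ES module on Node 18+; note that Google does not use IndexNow, which is probably why it did nothing):

// Submit a batch of URLs to the IndexNow endpoint (used by Bing, Yandex, etc.).
await fetch('https://api.indexnow.org/indexnow', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json; charset=utf-8' },
  body: JSON.stringify({
    host: 'www.example.com',
    key: 'aaaabbbbccccdddd',                                      // placeholder key
    keyLocation: 'https://www.example.com/aaaabbbbccccdddd.txt',  // hosted key file
    urlList: [
      'https://www.example.com/deindexed-page-1/',
      'https://www.example.com/deindexed-page-2/',
    ],
  }),
});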
I'm reaching out for assistance regarding an issue I've encountered after migrating my website to a new domain. It's been nearly 90 days since implementing 301 redirects, yet my old domain remains indexed and continues to receive traffic, causing significant challenges.
Background Recently, I migrated my website from [old domain] to [new domain]. During the migration, I took the following steps:
Despite these efforts, I’m facing the following issues:
Issues Encountered
Performance and indexing reports indicate the following:
| Metric | Old Domain | New Domain |
|---|---|---|
| Total Clicks (Last 28 Days) | 1.75M | 377K |
| Total Impressions (Last 28 Days) | 6.34M | 1.33M |
| Indexed Pages | 478 | 48 |
Questions to the Community
I have this kind of indexing problem on my page: it gets crawled but not indexed. How can I overcome this problem?
Hey Guys,
I have a specific question about technical SEO. I hope you can tell me your opinion on the best way to rank and get more clicks & conversions.
I’m from Poland but live in Germany and I have a website with a .de domain.
My business is about helping Polish people find work in Germany. Now I want my website to rank in other countries as well, like France and the Netherlands, so that companies there can find me when they search for something like "worker from other countries" or "worker Poland".
Do you think I need a new top-level domain (.nl, .fr)? Or should I use a subdomain like nl.XY.de? Or just a subfolder on the existing domain, like .de/nl/?
What do you think is best? Thank you for your answers. 👍
I have many old webpages and I want to change the URLs of these pages and then use 301 redirect to redirect the old url to the new url.
It's about 20-25 pages. Will this cause any negative effects on the website, like reduced speed or SEO issues?
And this comes under white hat SEO, correct?
As we are currently seeing with Google's AI Overviews, many questions can now be answered with AI-generated summaries. Yes, these summaries aren't always accurate, but ongoing advancements in AI are expected to significantly improve their reliability.
Traditional search has somewhat declined and, to a minor extent, is being supplemented or replaced by AI models like ChatGPT and others (Perplexity).
So, to remain relevant, Google and others will need to adapt their search engines. What are your thoughts on what the next generation of search engines might look like?
Hello,
My website's meta description is not showing properly in Google. I've checked everything and done many things to try to resolve this issue. I need your help, experts, to identify the problem and resolve it.
Hi everyone,
I’m currently working on a website with a large mega menu that links to all major categories and subpages. The menu is visible on both desktop and mobile and includes around 300–350 links per page.
Now I’m wondering if such a large menu could have negative effects on Google rankings, especially regarding:
Do you have any experience or recommendations on how to make mega menus SEO-friendly? Should we display fewer links, or is a well-structured mega menu generally fine?
Thanks in advance for your opinions and tips!
Would I be right in presuming that the PageSpeed Insights tool works on data gathered over a period of time, e.g. the last 30 days?
It's not useful for real-time page speed debugging. WebPageTest.org is telling me I have an LCP of a little over 3 seconds on mobile, while PageSpeed Insights tells me it's over 6 seconds.
How can I let Google know I've fixed my pagespeed issues? Just resubmit my sitemap on GSC?