/r/TechSEO
Welcome to TechSEO, a subreddit dedicated to the tech-nerd side of SEO.
Hey all,
I’m working on a tool idea aimed at automating a common but time-consuming SEO strategy: using expired domains to build authority and drive link equity. Would love to get some feedback from the pros here to see if this is something you'd find useful or would consider using.
Here's the concept:
The tool would streamline the whole process of taking over an expired domain, building it into an SEO asset, and linking naturally to a target site over time. It would handle everything, from content creation to link placement and ongoing updates, so the site maintains authority without the usual maintenance. The idea is to save hours of work and budget while delivering the same link-building benefits.
I’m curious: any feedback—whether on the concept, potential use cases, or general challenges you’ve faced with expired domains—would be super helpful. Thanks in advance for sharing your thoughts!
Cheers,
Sam
Hi,
I'd like to ask more experienced people about the link pyramid strategy, where you get loads of 'foundational links' - from what I can see, these are low-grade free links from various generic sites which provide dofollows, the kind you can buy on Upwork for $20. Then you try to buy/build lots of niche/relevant links which are credible, and these two 'layers' combined are somehow greater than the sum of their parts.
Previously, the general consensus I've seen on these 'buy 100 backlinks' Upwork gigs is that they're trash, and may even do more harm than good.
But a friend of a friend who apparently has lots of SEO experience recommends this pyramid approach and seemingly uses it on all projects.
For context, here's a video where someone explains it: https://www.youtube.com/watch?v=370yB1Iqp_4&t=1s
It does make sense to me, but I also have no idea, and it conflicts with what I've already heard about these kinds of generic links (that they're bad). So I'm keen to get any wisdom from those more experienced.
Does it make any sense to canonicalize a page generated by a filter (without any value of its own) and add noindex and nofollow as well?
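In other words, the filtered page would carry both of these at once (the URLs here are just placeholders):

<!-- filtered URL points its canonical at the clean category page -->
<link rel="canonical" href="https://example.com/category/">
<!-- and is also excluded from the index entirely -->
<meta name="robots" content="noindex, nofollow">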
I have a custom PHP web app at the root of my domain that is doing a great job for SEO, traffic, etc.
I also want a blog - and I decided on WordPress and placed it within a subdirectory - and, well - all good. Many blog posts are indexed and all seems well.
My question is to just make sure that I am "ok" doing what I am doing, in other words, would having a WP installation confuse a crawler? For example, if a crawler goes into the blog and then sees a different menu (with a different HTML structure) then is all well or is this not recommended?
I am inclined to think it isn't a problem - Googlebot is smart enough to crawl URLs ONLY and parse the TEXT (i.e. "content") that it can then render.
Am I overworrying or am I restricting the growth opportunities of my site by having WP as a blog within the subdirectory?
Thanks!
I was looking into the GSC crawling data recently and realized we have dozens of subdomains that are getting crawled very frequently. Most of them have no robots.txt files and thousands of useless pages that should not be accessible. Some have crawling frequency in the millions per day, others have very high download sizes per crawl, significantly higher than that of the main domain.
I'm going to add robots.txt files for the biggest offenders, but I'm also wondering if this will have an actual impact on the main domain, as Google claims it considers them separate entities. Also, the main domain has only a few thousand URLs, so crawl budget should not be a worry.
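The plan for each offending subdomain is just a blanket disallow - a minimal sketch, assuming each subdomain serves its own robots.txt at the subdomain root:

# robots.txt on e.g. useless-subdomain.example.com
User-agent: *
Disallow: /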
Our website has been dealing with CLS issues since August. The main pages pass, but blog pages are all failing, and the issue is present sitewide. Our Squarespace theme (we don't actually have one) appears to shift all the content down after the page loads, resulting in a CLS error. We started the website on version 6.9 back in 2018, so there is no specific theme attached to this website, but it is showing as a "theme issue".
It appears there is still some visual shifting with the ads and Slickstream (from Raptive) removed. We asked our ad partner about this and they state it is solely a Squarespace issue. In testing we saw CLS improve slightly after removing the sidebar, but the shift is still happening. This has been flagged by Google Search Console and our Google traffic has become virtually non-existent.
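One thing we may still try is reserving space for everything that gets injected after load, so the content below it can't move - a rough CSS sketch (the class names are placeholders for our actual ad/sidebar containers):

/* reserve room for late-loading ad and widget slots */
.ad-slot,
.sidebar-widget {
  min-height: 280px; /* roughly the tallest creative that loads here */
}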
Appreciate any insight anyone might be able to offer. Thanks.
Screen recording of issue:
From what I can see, this type of Google Knowledge Graph rich snippet appears to be an aggregated list based on other lists online - there aren't links from each item listed in the snippet to the websites of all the companies named in the list.
Thanks!
We have some clean-up to do on some pages with a good number of external backlinks. This group of pages 404s, doesn't get much traffic, and has good backlinks/SEO juice, but the pages are noindexed.
I assume that to realize that link juice we need to remove the noindex, then 301 them to relevant pages, right?
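What I have in mind is just simple server-level redirects, something like this in .htaccess (paths are placeholders):

# send each old backlinked URL to its closest relevant live page
Redirect 301 /old-resource-page/ https://example.com/relevant-page/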
Any other details to consider?
A Semrush audit raised that warning for many of my pages.
Basically, I have many internal links which just go to an API that logs the click and then redirects the user to an external site. You can think of them as affiliate links. I don't have a nofollow on them, since they ultimately go elsewhere, and I don't want crawlers to hit them and thus mess up my stats.
Is it safe to ignore this semrush warning, or is there a better way (ie SEO correct way) to mark these links but so that crawlers don't follow them?
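For illustration, this is roughly what the links look like and the first option I'm weighing (the /api/click path and id are placeholders for my logging endpoint):

<!-- current: a plain followed link that crawlers will happily hit -->
<a href="https://example.com/api/click?id=123">Partner site</a>

<!-- marked up so engines treat it as a paid/affiliate-style link -->
<a href="https://example.com/api/click?id=123" rel="nofollow sponsored">Partner site</a>

The other option I can think of is keeping crawlers off the endpoint entirely via robots.txt:

User-agent: *
Disallow: /api/click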
Thanks
Today, I got an email from Semrush saying it found a gazillion nofollow internal links, but not on all pages.
And I have no idea how it happened.
example URL
https://ppcpanos.com/about-google-ads-oci/
The site is SSG, SSR, (sveltekit) and is hosted on Cloudflare pages.
Any idea as to how to debug it would be highly appreciated.
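One check I can at least run myself is fetching the served HTML and counting nofollow anchors, to compare against what Semrush reports - a rough Python sketch:

import re
import requests

# fetch the page the same way an anonymous crawler would
html = requests.get("https://ppcpanos.com/about-google-ads-oci/", timeout=10).text

# count anchors carrying rel="nofollow" in the served (pre-hydration) HTML
anchors = re.findall(r'<a\b[^>]*>', html, flags=re.I)
nofollow = [a for a in anchors if re.search(r'rel=["\'][^"\']*nofollow', a, flags=re.I)]

print(f"{len(nofollow)} of {len(anchors)} anchors are nofollow")
for tag in nofollow[:20]:
    print(tag)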
I used a random "check broken links" website to scan my website and it gave me almost 200 links with errors like the ones in the title. Is there an easy/better way to further analyse and remove these?
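For example, I'd like to re-check the flagged URLs with a small script rather than trusting the random site - a rough Python sketch (the URLs are placeholders for the flagged ones):

import requests

urls = [
    "https://example.com/some-page/",
    "https://example.com/another-page/",
]

for url in urls:
    try:
        # HEAD with redirects followed is usually enough to confirm the status code
        r = requests.head(url, allow_redirects=True, timeout=10)
        print(r.status_code, url)
    except requests.RequestException as exc:
        print("ERROR", url, exc)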
Hi all,
I'm wondering what kind of work tech SEO freelancers perform monthly for a long-term client - so not one-off projects (migrations, audits...) - and where you exclusively do tech SEO, meaning you're not a regular SEO manager who also builds backlinks, does content writing/planning/briefs, etc.
With my clients, I'm a "fractional technical SEO", meaning I help in-house teams that don't need a full-time tech SEO, or whose SEO team doesn't have the time to work on it.
As a part of their team, I have access like an employee (Jira, Slack, GitHub, AWS, guest on Teams, etc.), where I, amongst other tasks:
I'm also interested in the pricing - do you get paid by the hour or work on a retainer?
Interested in your approach - what I described above is what came naturally and where I feel like I can bring the most value, as opposed to being a consultant who just does calls and doesn't do any actual work or take any responsibility...
Hey Everyone,
I’m looking for recommendations on any reliable AI tools or software that can analyse the headers, body text, and keywords of my content, give it a comprehensive SEO score, and suggest or even automatically implement improvements to boost optimization. I know of tools like Frase and Surfer SEO but haven’t checked them out yet.
My main focus is finding an AI tool that can rewrite existing content (headers/body text) to improve its on-page SEO and achieve a higher score. Any go-to sites, apps, or software you’d recommend for this?
Thanks in advance
Does a SaaS exist with multiple GSC grid widgets in a dashboard?
I already built something like this for myself, but it's not exactly simple, and I'm looking for more polished solutions that already do this.
It's a fairly good-sized e-commerce website, and I've taken it to a really good place for page speed on desktop!
Just somehow need to improve mobile but I'm lost on where to start.
Hi guys
I noticed in the last few days (I have never checked before) that Google Search Console is not showing all the data it reports on, and I would like to know if there's something I'm missing.
When I look at the top-level report, I can see that the total clicks in GSC are 6.59K,
but when I take all the clicks and sum them up, the number doesn't match (it's not even close to 6.59K).
Can anyone explain what I'm missing?
Thank you
Hello, how are you?
I'm just starting to work with SEO; I'm a young 22M.
It's my first job and it's a great opportunity at a well-known company.
That said, I want to specialize in SEO to have a certain "stability".
Do you recommend a study path in the area of technical SEO? Any additional knowledge? I currently have basic knowledge of HTML, CSS and Python, but it's not very useful to me, at least at the moment.
Do you have any recommendations for a young person just starting out?
Hello everyone, my customer wants to track clicks on a slider on the homepage. He added a cmpcode to the URL, and I was wondering: are we wasting link juice?
Thank you!
I have watched all his content and I enjoy it and have learned a lot. Who else follows him?
I'm planning a major renovation of my website and would like to move all pages (posts, categories, tags, etc.) to the first URL level, including an ID, and then 301-redirect the old URLs.
CMS: Wordpress
Example current structure:
https://domain.com/cat-1/article-name-here/
https://domain.com/cat-1/cat-2/article-name-here-2/
https://domain.com/cat-2/article-name-here-3/
New structure:
https://domain.com/article-name-here-a647/
https://domain.com/article-name-here-2-a698/
https://domain.com/article-name-here-3-a765/
But I have around 9,000 pages that get very high traffic.
I'm definitely a little scared and tense. What are your experiences with so many 301 redirects? Do you have any tips?
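My rough plan is to export an old-URL to new-URL mapping and generate the rules from it rather than hand-writing 9,000 redirects - a sketch, assuming a hypothetical redirects.csv export with old_path,new_path columns:

import csv

# redirects.csv rows look like: /cat-1/article-name-here/,/article-name-here-a647/
with open("redirects.csv", newline="") as src, open("redirects.conf", "w") as out:
    for old_path, new_path in csv.reader(src):
        # Apache mod_alias syntax; a redirect plugin import or nginx map would also work
        out.write(f"Redirect 301 {old_path} https://domain.com{new_path}\n")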
Thanks all
We recently added the sitemap to Google Search Console for a very large website with more than 1 million products. The sitemap is an *.xml index file; within this file, there are multiple sitemap files in *.xml.gz format. Each *.xml.gz file contains 50,000 URLs.
The sitemap has been added successfully; however, Google discovered 0 pages. What could be wrong here?
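For reference, the index file looks roughly like this (domain and file names are placeholders), and each listed .xml.gz file is a gzipped sitemap with 50,000 <url> entries:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemaps/products-1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemaps/products-2.xml.gz</loc>
  </sitemap>
</sitemapindex>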
Hello, I need help with an indexing problem. I have an academic blog that’s 10 years old and multilingual (English, French, and Arabic). When it was first created, it was only in Arabic for several years. My articles in Arabic used to index quickly (in less than 5 minutes). In 2018, I added English and French, and for the past three years, English has been the default language. I’m using the Polylang plugin for WordPress.
Today, articles in English index quickly and rank well on Google, but articles in French and Arabic index slowly or not at all. It’s strange, as with Polylang, it's easy to change the default language. So, I set Arabic as the main language again, then published three articles in Arabic, which indexed quickly and ranked well. I did the same test with French, with the same result. However, English articles no longer index as quickly.
I don’t understand what’s going on, as it’s the same domain with separate links for each language: English (www.website.c**), French (www.website.c**/fr/), and Arabic (www.website.c**/ar/).
So normally, my articles in French and Arabic should benefit from the domain authority, right?
When I check the Arabic and French articles that are published but not yet indexed, GSC shows "Crawled - currently not indexed".
There's a ton of automation happening on the content side of SEO, but feels like the technical side gets less love (or infamy, depending on how you feel about it)
What are things y'all have started automating or using AI to automate? Anything you think is untouchable?
Hello everyone
I don't understand these last few days/weeks.
I have pages that were indexed, and I have not modified the content of these pages in any way.
But they are becoming deindexed.
What do you think this could be due to? How is this possible? And what can I do to resolve this problem?
Thank you in advance for your answers.
I have a page with multiple subpages, each in a specific language. For example:
page/en/sub-name-en
page/de/sub-name-de
Google is crawling (and reporting 404s in Search Console for) pages like:
page/de/sub-name-en
That last page does not exist, nor is it linked anywhere on my site. And Google finds hundreds of such random results. We checked the sitemap and found nothing suspicious.
Any ideas? Does Google just do this randomly?
We have one main site and are now going to create two other sites, one aimed at the UK and the other at European countries. The contents are the same, only the currency differs.
And now I'm struggling on EU hreflangs.
For the main site, we write alternate tags for specific EU countries and regions, such as:
<link rel="alternate" href="www.domain.com/eu/" hreflang="en-LU">
<link rel="alternate" href="www.domain.com/eu/" hreflang="en-DE">
<link rel="alternate" href="www.domain.com/eu/" hreflang="en-BE">
But when I check it with an online hreflang checker, it says it's wrong.
I asked GPT, and it said a tag for general European English (en-EU) might suffice if the content is not country-specific - just one alternate like this: <link rel="alternate" href="www.domain.com/eu/" hreflang="en-EU">.
I'm curious which one is correct, and if I keep the first one (listing specific alternates for specific EU countries), will it affect SEO?
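For comparison, the variant I'm weighing against the checker's complaint - absolute URLs including the protocol, plus an x-default fallback - would look something like this (domain.com and the main-site URL are placeholders, and I'm not sure EU is even a region code Google accepts):

<link rel="alternate" href="https://www.domain.com/eu/" hreflang="en-LU">
<link rel="alternate" href="https://www.domain.com/eu/" hreflang="en-DE">
<link rel="alternate" href="https://www.domain.com/eu/" hreflang="en-BE">
<link rel="alternate" href="https://www.domain.com/" hreflang="x-default">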
I have an informational website on gaming that will soon be three years old. The first few years everything was great: pages were indexed in Google and ranked high. I regularly update articles and write 100% unique texts in simple and understandable language. However, in March, after a core update, traffic plummeted. Later, I started noticing that pages began to drop out of Google's index. Today, there are about 300 articles on the site, 80 of which have fallen out of the index. I tried submitting 10 articles a day for indexing manually through Google Search Console, but it didn't yield results. I also attempted to add pages to the index via the Google Indexing API, but another 5 pages dropped out of the index a few days later.
Regarding speed metrics, 95-100% and everything is in the green zone, except for CLS (in the orange). However, there are no visual layout shifts. I don’t know how to improve CLS on my website; I’ve tried all methods. Please give advice on how to get the pages back into the index, as these pages ranked highly and brought 50% of the traffic to my site.
I am working on a newly built website for a client. Products on the website belong to only one category, and he has no plan of adding any other category in the future.
So would it be helpful if I changed the product links from
xyz.com/product/product_name to
xyz.com/category-name/product_name
I'm using Hostinger for my hosting, and they have a feature to automatically purge the cache every 30 minutes.
I'm wondering if that's something I should use. In my mind, I was thinking that every time the cache is purged, it will slow down the website for the end user, as the page has to be retrieved fresh again.
Am I right in thinking I should only be purging it when there are changes, rather than on a set schedule?
I've been optimising my website as best as I can over the last few months, and believe I've got it down to almost where I want it on desktop, as you can see here:
It's the mobile speed index that is really troubling me:
The problem I'm having is, I just don't know what more I can do for the mobile side of the website to improve its metrics!
For the LCP image, I've added fetchpriority, removed it from any lazy loading, and optimised the image size for mobile displays.
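Concretely, the hero image markup now looks roughly like this (file names and dimensions are placeholders):

<img
  src="hero-mobile.webp"
  srcset="hero-mobile.webp 480w, hero-desktop.webp 1200w"
  sizes="(max-width: 600px) 100vw, 1200px"
  width="480" height="320"
  fetchpriority="high"
  alt="Product hero image">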
Maybe this is all I can do? Suggestions would be welcome of course