John Mueller, Senior Search Analyst and Search Relations team lead at Google, was again out there on New Year's Eve and New Year's Day, providing SEO support for site owners, creators, publishers, SEOs, and others. He has made a tradition of helping webmasters on both Christmas and New Year's, year after year, for over 16 years now. He did New Year's 2023, 2022, 2021, 2020, and 2019, and year after year prior to that. He did it on Christmas 2020, 2019, 2018, 2017, 2016, 2015, 2014, 2013, 2012, 2011, 2010, 2009, 2008, and 2007.

He posted responses to several threads on Reddit over the New Year's break; here are some of them with his responses:

Please help! Website copied : The DMCA process is commonly used in cases like this.
Video is not the main content of the page on GSC : Feel free to dm me some URLs if you want me to forward them to the team.
Why not buy a Fiverr service for DR & DA 50? : Oh, I thought you were saying these were not really sites :-)). I’m sorry, just trolling :-). Hope things are well for you outside of Reddit.
A 4xx Broken Link Error in Semrush : Just ignore it. It's normal that pages come & go on the web – and using an HTTP 404/410 status code for them (essentially just deleting the page) is the standards-conformant way of doing that. If you're renaming a page, then use a redirect. If the page is gone, leave it gone. Any non-trivially sized site will have hundreds, thousands, millions of 4xx errors in their logs.
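Mueller's advice maps onto a few server rules. As a hedged sketch (nginx syntax; the paths here are made up for illustration), "renamed gets a redirect, gone stays gone" might look like:

```nginx
# Hypothetical nginx rules illustrating the advice above.

# Page was renamed: permanent redirect to the new URL.
location = /old-guide {
    return 301 /new-guide;
}

# Page was intentionally removed: 410 says "gone for good".
location = /retired-page {
    return 410;
}

# Anything else that doesn't exist simply falls through to a normal 404.
```

Everything not covered by a rule like these just returns the server's default 404, which — per Mueller — is fine and expected.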
External Links Pointing to URL Versions with Partial Encoding : I’ve set up redirects for accidental typos in links, but I doubt you’d see any SEO effects. Theoretically it might make sense if it’s a really important link, or one that drives a lot of traffic — but if that were the case, they’d just fix the link (either on their own, from user feedback, or from your feedback). That leaves the random links that get dropped in random places – those aren’t going to have recognizable effects. Fixing them (through a redirect on your side) feels good, but that’s about it.
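If you do decide to redirect partially encoded URL variants, normalizing the incoming path before matching it against a redirect map is one way to do it. A minimal Python sketch (the redirect map and paths are hypothetical):

```python
from urllib.parse import quote, unquote

def canonicalize(path: str) -> str:
    """Decode any percent-encoding, then re-encode consistently,
    so '/caf%C3%A9', '/café', and mixed variants all compare equal."""
    return quote(unquote(path), safe="/")

# Hypothetical redirect map, keyed on canonical paths.
redirects = {canonicalize("/café-menu"): "/cafe-menu"}

def lookup_redirect(incoming: str):
    """Return the redirect target for a (possibly mis-encoded) path, or None."""
    return redirects.get(canonicalize(incoming))
```

With this, a link dropped somewhere as `/caf%C3%A9-menu` and one pasted as `/café-menu` both resolve to the same redirect entry.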
How does Google track location? : Do you mean your server? A ccTLD is a good differentiator if you want to target a specific country. If you just want to be a global site, use a gTLD. Some TLDs look like they're country-specific, but they're not (double-check — e.g. ".nyc" is not a ccTLD, it's a gTLD), and some country-code TLDs are seen as gTLDs.
Keyword domain name : If you’re planning on using this for the long run, I would not recommend using just a keyword domain name. It makes it weird to start targeting other terms, and it makes it almost impossible for people to find you by your name. A keyword domain name is not going to give you any recognizable SEO advantage on Google.
Url Inspection Tool can't fetch page after repointing DNS : Probably this is just the old IP address being used; the DNS cache will clear out over time. I like to set my DNS "TTL" (how long the DNS cache can last) low when doing a move to make the switch faster. Some people keep the old server running for a day, which also works, depending on whether it's a static site or not.
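The TTL Mueller mentions lives on your DNS records. As a hedged sketch (hypothetical BIND-style zone entries, using documentation-reserved IP addresses), lowering it ahead of a move might look like:

```zone
; Normal operation: 1-hour TTL on the record.
www.example.com.  3600  IN  A  203.0.113.10

; Before the move: drop the TTL so resolvers re-check quickly.
www.example.com.  300   IN  A  203.0.113.10

; After the switch: point at the new server, then raise the TTL again.
www.example.com.  3600  IN  A  198.51.100.20
```

Note that the lowered TTL needs to be in place for at least one old-TTL period before the actual switch, since resolvers may still be caching the record under the old TTL.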
Why am I suddenly getting a lot of referred traffic from a scammy looking website with no traffic? : I’d just filter it out or block it in whatever you’re using to track.
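Filtering referral spam out of your own tracking can be as simple as a referrer blocklist. A minimal Python sketch (the spam domain and hit records are hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical blocklist of referrer domains to exclude from reports.
BLOCKED_REFERRERS = {"scammy-example.top"}

def keep_hit(referrer: str) -> bool:
    """Return False for hits whose referrer host is on the blocklist."""
    host = urlparse(referrer).hostname or ""
    return host not in BLOCKED_REFERRERS

# Hypothetical analytics records being filtered.
hits = [
    {"path": "/", "referrer": "https://scammy-example.top/spam"},
    {"path": "/", "referrer": "https://www.google.com/"},
]
clean = [h for h in hits if keep_hit(h["referrer"])]
```

The same idea works at the server level (many analytics tools and web servers support referrer-based exclusion rules), which keeps the junk out of your reports without affecting real visitors.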
0.7 DR Website and it keeps falling : To be a bit blunt, just pumping out blog posts is not going to impress search engines, even if the posts are good. Assuming the tool you're working on is doing something unique that no other tool is doing, I'd focus your time on that. Make something awesome that others will want to use, that others will want to talk about, that others will want to promote for you on their own. Don't just write blog posts for the sake of having written another blog post.

John also posted some responses on Mastodon and on X.

Forum discussion at the forum threads linked above.