What’s worse? The crappy website they have, with no content and just super spammy links, or the fact that neither you nor I ever asked for that bad backlink? Charging $35 a link to delete a link to you shows just how they crash your backlink party. But hey, Google knows this, right? They wouldn’t let it hurt your site… wrong!
I wish we could collect money like a nonprofit from SEOs everywhere, raise the $10,000,000 asking price of theglobe.com, and just turn it off. Why do I hate theglobe.com so much? Because their links are bad.
Prior to this post I’ve always said don’t use the Disavow Tool unless you’re facing a manual penalty. It’s the Google answer… but I tried something: I disavowed a few hundred domains all connected to theglobe.net, and over the next 3 weeks I saw a marked improvement in keywords and their positions.
A client had the same issue: 300 domains from Sweden in their link profile, many with theglobe.com in the domain name and almost all with it at least in the page title.
These aren’t backlinks someone purchased, at least not by the target company or their SEO for that matter. They are nuisance links; you could use this site for negative SEO. It regularly pops up in the link portfolio of domains as they are expiring, which has always annoyed me because it portrays a Domain Pop higher than it actually is… the globe of course doesn’t count.
Theglobe.com, last I checked, actually charged for backlink removal! That’s one way to run a profitable site… make it so bad it hurts SEO that webmasters will pay you to leave them alone. Of course… then the terrorist wins.
Noticed a few “theglobe.net” entries in your link portfolio? If so, it’s almost always an infestation, and you should try disavowing.
So you may want to try the Google Disavow Tool if you haven’t yet. Upload a txt file listing the domains you do not want counted. We’re including a copy of one of our disavow lists; you can use it or add to it.
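For reference, the disavow file is plain text: `domain:` prefixes disavow an entire domain, bare URLs disavow a single page, and lines starting with `#` are comments. A minimal sketch with placeholder domains (these are not real entries from our list):

```text
# lines starting with a hash are comments
domain:spammy-directory.example
domain:another-bad-site.example
http://bad-site.example/one-specific-spammy-page.html
```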
Who’s on our naughty list? What should the txt file look like? The long list of domains we’ve disavowed is just below.
<–start below this line–>
# exported from backlink tool
If the title confuses you, don’t worry. I’d expect even intermediate SEOs are not familiar with all of those letters. Domain Authority, or DA, is a well-established SEO metric by Moz.com that many often confuse with Google’s ranking of keywords. Let’s just remind everyone interested in their DA that it has NOTHING to do with Google’s search results. It MAY be based on some of the same metrics Google uses, but it’s just a number someone other than Google guessed.
DS – Domain Score by SEMRush
DR – Domain Rank
DP – Domain Pop
DT – Domain Trust by SEMRush
AS – Authority Score by SEMRush
TF – Trust Flow by Majestic
CF – Citation Flow by Majestic
Moz Spam Score – another Moz metric
For a more in-depth look at each ranking, we’ve included the SEO Metrics category from our FAQ section.
Continuing into metric land, there are seemingly simple scores that you’d be wrong to expect to be equal. Backlinks are reported by Moz, SEMRush, AHREFs and Majestic. Usually these numbers are wildly different, but how could that be?
Keep in mind that the sites reporting these numbers are making their own crawls, which may be different sizes. I’ve found Moz to usually be the lowest count; this could mean they have a smaller sample of sites than the others, or it could be that they have better spam recognition. Another theory could lead you to wonder whether one site includes backlinks from redirects where others don’t.
It’s important to note that these numbers are just made-up criteria that the specific indexing site came up with, and depending on their size and rules, the reported number of backlinks will almost always vary. Some only count backlinks to the specific domain and not subdomains; others do count subdomains.
So what’s right? Well, to know that you would need to work for a search engine like Google to understand the factors used to determine a site’s value.
In the end no one index has it right, and no one metric is accurate enough to serve as a good predictor of Google’s real metrics.
Domain Authority, after all, went through a recalculation just under a year ago, and with DA 2.0 many sites plummeted in the DA metric. That never translated into a drop on Google, as DA isn’t considered by Google for ranking purposes.
Let’s take a look at the metrics of one of my client’s domains, and you may start to understand the disparity between these numbers and see patterns. (Link Opens In New Tab) https://ultimateseo.org/metrics/
Beautiful, is it not? A spreadsheet chock-full of data. First, I’d note that the real benefit of these values comes from reading them collectively: consider how they interrelate, and whether they disagree.
In this metrics report the values are often color coded to reflect my opinion of the values I’d expect from a good SEO site. Red is obviously not what I’d want to see.
If a domain has consistently red metrics, it’s a very strong indicator that there is a problem. If they disagree, which they sometimes do, then the jury is you: you have to decide which values you accept as accurate and which ones are likely irrelevant to ranking your keywords.
I’ve added some overall average score boxes that take into account multiple similar metrics. That’s also important… not all metrics are intended to show the same thing. Domain Trust and Trust Flow are similar, while Domain Score, Domain Rank and Domain Authority are similar.
These do generally correlate to one another but sometimes one is oddly higher than usual and that creates an anomaly in the usual spread of these numbers. The oddity can be a sign that the site is optimized towards a metric and not towards Google.
For instance sites may focus on increasing Domain Authority and by doing so they may leave behind other metrics.
Using the image above, if we glance across the data we can likely agree the least desirable domain is the third. Even with a DA of 10, its average across all of the available numbers is a 4, with Majestic’s CF and TF giving it a 0 and its DT also a 0. It has a total of 9 to 25 backlinks… only 5 to 10 domains refer to that site. So it’s likely not a great site… but the third from the bottom is a different story… still with a DA of 10. This time we have an average metric of 10, an average of 4,934 backlinks across the 3 indexes, and 350 to 417 referring domains. But what stands out is the ratio, which is flagged in red: there are 10 times as many referring domains as IP addresses, a telltale sign a PBN has been used to manipulate the site’s metrics.
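That ratio check is easy to automate. A rough sketch, assuming you already have referring-domain and referring-IP counts from your backlink tool of choice; the threshold of 10 is my own illustrative cutoff, not an official one, and the IP counts below are hypothetical:

```python
def pbn_ratio_flag(referring_domains: int, referring_ips: int,
                   threshold: float = 10.0) -> bool:
    """Flag a link profile when far more referring domains than unique
    IP addresses point at it -- a classic PBN footprint, since many
    PBN sites tend to sit on a handful of servers."""
    if referring_ips == 0:
        # Domains with no IP data at all is itself suspicious
        return referring_domains > 0
    return referring_domains / referring_ips >= threshold

# The suspicious domain from the example: ~417 referring domains
print(pbn_ratio_flag(417, 40))    # ratio ~10.4, flagged
# A healthier spread: domains and IPs roughly track each other
print(pbn_ratio_flag(350, 300))   # ratio ~1.2, not flagged
```

The exact cutoff is a judgment call; the point is that referring domains and referring IPs should roughly track each other on a natural link profile.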
Check back, as this post is likely to expand rapidly along all of the metrics introduced here.
For more reading on a similar topic, check out this article about when to 301 redirect backlinks for an underperforming site, and what metrics are important in SEO.
Here is a recently added FAQ to the Ultimate SEO FAQ section.
Let me show you how important it is….
Why is realtor.com not higher than zumper.com in the mobile search on the right? Consider these metrics:
Realtor.com – Domain Score: 55, Trust Score: 58, Alexa Rank: 763, Registered: 1996, Ads: 3,900, Backlinks: 57,000,000, Traffic Rank: 108
Zumper.com – Domain Score: 37, Trust Score: 44, Alexa Rank: 17,000, Registered: 2004, Ads: 0, Backlinks: 1,700,000, Traffic Rank: 2830
In every metric realtor.com wins, so why is it below zumper.com in the mobile search?
So in this example we clearly see a more popular site beaten by a less established site, and the single factor the smaller site did better on was speed. And we can’t discount this as, well, only important on mobile. In case you missed it…
Now, considering the facts above, let’s also dispel people’s over-fascination with keywords and text optimization, position and frequency of words, content length… on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content to the desktop and mobile versions; they just differ wildly in speed. What are some of the reasons? Realtor.com decided to present 16 rows of 3 images of homes to visitors, while Zumper shows 4 rows of 1 image… and then additional rows load as you scroll down. Lazy Load and 1 image vs 3. That’s how they keep their requests to about a third of the realtor.com page.
I’d suggest you think of requests as if they were shots from a gun at your head. You need to avoid them! Fewer shots is a lot better…
Requests are literally requests of the server before the page can load. If I make a page with one image on it, that is one request. Let’s say I decide to replace that image with a slider with 5 slides; now I have 5 requests… the same page area, but that cool feature quintuples the trips the computer has to make! Let’s say now I add social media icons to the page… Facebook, Twitter, Instagram, LinkedIn and an email icon… small and just up in the right corner. That social media addition just added 5 more requests. Think about all the things on your page; they don’t all come together in one big Amazon package with a smile… they are shipped to the computer individually. Now I have one page with 1 request and another with 10, and the initial visual difference isn’t much… that slider only displays one image at a time.
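You can get a rough count of request-generating tags straight from a page’s HTML with nothing but Python’s standard library. This is a simplified sketch: it ignores CSS background images, fonts, duplicate URLs the browser would cache, and anything injected by JavaScript, and the sample page below is made up:

```python
from html.parser import HTMLParser

class RequestCounter(HTMLParser):
    """Count tags that typically trigger an extra HTTP request."""
    REQUEST_TAGS = {"img": "src", "script": "src", "link": "href", "iframe": "src"}

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        wanted = self.REQUEST_TAGS.get(tag)
        if wanted and any(name == wanted and value for name, value in attrs):
            self.count += 1

# A made-up page: a 5-slide slider plus one social-icons script
page = """
<html><body>
  <img src="slide1.jpg"><img src="slide2.jpg"><img src="slide3.jpg">
  <img src="slide4.jpg"><img src="slide5.jpg">
  <script src="social-icons.js"></script>
</body></html>
"""
counter = RequestCounter()
counter.feed(page)
print(counter.count)  # 6
```

Six requests for what looks, visually, like one slider and a row of icons.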
Servers don’t respond instantly… they take a little while to think and retrieve the requested resource, and then the response has to travel the distance from the server to your computer. It may travel at the speed of light, but light still takes time. This time is called latency. 50 milliseconds is a good latency.
If both servers in the FAQ had a 50 ms latency, we can estimate that the
Realtor.com server will take 50 ms x 301 requests = 15,050 ms, or about 15 seconds
Zumper.com server will take 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds
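That back-of-the-envelope math can be sketched as below. Note this naive model assumes requests happen one at a time; real browsers open several parallel connections, so actual load times are lower, but the proportional gap between the two sites remains:

```python
def naive_load_time_ms(requests: int, latency_ms: float = 50.0) -> float:
    """Worst-case estimate: every request pays the full round-trip
    latency and nothing happens in parallel."""
    return requests * latency_ms

print(naive_load_time_ms(301))  # realtor.com: 15050.0 ms, ~15 s
print(naive_load_time_ms(134))  # zumper.com:   6700.0 ms, ~6.7 s
```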
I hope this explains why you want to limit requests and prioritize speed as much as you focus on keywords.
Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I have 4 icons… I combined them to make one image. That’s 1/4 the requests with no change in user experience, other than a quicker site.
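The classic way to do this is a CSS sprite: ship one combined image and show each icon with a background offset. A minimal sketch, where the class names, filename and pixel offsets are all hypothetical, assuming four 64px-wide icons laid side by side:

```css
.cert-icon {
  background-image: url("certs-combined.png"); /* one request covers all four icons */
  width: 64px;
  height: 64px;
  display: inline-block;
}
.cert-icon.first  { background-position:    0px 0; }
.cert-icon.second { background-position:  -64px 0; }
.cert-icon.third  { background-position: -128px 0; }
.cert-icon.fourth { background-position: -192px 0; }
```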
Lazy Load also helps speed up the initial page load time. If you have a lot of images “below the fold” on a page, the page still needs those images to finish loading, unless you implement lazy load, which essentially tells the computer to load an image only when it is coming into view. This likely makes sense if you have 300 images on the page and plenty of them are scrolled far down… but all in all, I’m on the fence on Lazy Load. I ran speed tests on the homepage of this site with Lazy Load on… 3 test results: 2.3 seconds, 1.9 seconds and 1.9 seconds. I turned off lazy load, reran the test and got 2.3 seconds, 1.9 seconds and 1.7 seconds. So technically the site loaded faster with Lazy Load off… keep in mind it takes a bit of thinking for the server to implement it. This helps speed up a site drastically if there are a ton of images spread vertically, but not much on a normal page. What are the full implications for SEO when a site is crawled?
It’s suggested by “Ask Yoast” that Lazy Load is fine for SEO; the images are rendered as Google scrolls down the page and indexes the content.
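If you do want lazy loading, modern browsers support it natively with the `loading` attribute on images, no JavaScript plugin required (the filename here is just a placeholder):

```html
<!-- Deferred until the image nears the viewport -->
<img src="house-photo.jpg" loading="lazy" alt="Listing photo" width="400" height="300">
```

Specifying width and height alongside it lets the browser reserve the space so the page doesn’t jump as images arrive.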
Tracert is a command that’s elementary to networking and computers. Trace Route, or Tracert, does exactly what it sounds like, and it’s useful because it tells you every IP address a packet passes through between the server and the catcher (not technical terms there). It explains where speed issues are, in a global perspective or in your home.
It’s usually just text, but https://www.monitis.com/traceroute/ made it more fun… and from this map I can see why my fiber connection isn’t seemingly very fast tonight: I’m being routed through London, England to do a domestic “hop” (hops are each leg of a journey in a tracert).
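To try it yourself from a terminal, the command name differs by operating system, and the hostname below is just an example:

```shell
# Windows
tracert example.com

# macOS / Linux
traceroute example.com
```

Each numbered line of output is one hop, with the round-trip time to that router, so a sudden jump in milliseconds between two hops shows you roughly where the slowdown lives.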
Thanks for checking in with Ultimate SEO. This site is a side project, as my clients’ sites are the main one. That means I may often have edits and unfinished elements to circle back to. I encourage you to let me know how this site could be better using the contact form.
UltimateSEO.org has backlinks from about a thousand domains. In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles. That’s how I was introduced to “The Globe,” which makes money from SEOs paying them NOT to backlink to them. At $36 a link they’re likely insane, and I bet it’s bringing in some money. But let’s not go all crazy and start paying for Ransomlinks (if it’s not a word, I claim it… Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay to not be linked to).
In reviewing the situation I ran across a list of the most disavowed sites. I figured I’d share that with you below, but before I do: what outcome did I choose for these bad links pointed at my site?
I’m opting for the third option, leaving them alone, as I don’t have any indication that Google cares about these Ransomlinks. They may actually bring some random traffic of use, so redirecting them would take that from my site.
And now the most disavowed sites…