Link Building: Site Directories

An easy way to build backlinks is through directory submissions.  Keep in mind, though, that easy links are worth less: directory submissions don’t carry the weight that in-content links do.  With that said, there is still some diluted value in link directories; just don’t make them your focus.

Here are some link directories, organized by SEO power.

The button below opens a new window with about 100 sites.  Ultimate SEO also maintains the following directory sites, which are always free to list with.  This site has a built-in directory as well; you can access it from the top menu bar.

https://seoblogs.directory
https://healthblogs.directory
https://candidate.directory
https://workout.directory


PBNs 2019 – Domain Detailing – Picking Expired Domains And 301 Redirects

In PBNs 2019 Part 1 we discussed the logistics of organization at a high level: Cloudflare, spreadsheets and, above all, randomness.  In this second part of our private blog network series we’ll focus on expired domains and assessing their value.

Domain Authority Is One Part

First, we will look at Domain Authority, but before we do, let’s note that Domain Authority in and of itself is worthless.  It IS NOT used by Google.  It is made by a company called Moz, and it is their best guess at how Google interprets the content on a site, based on many of the ranking factors but not all of them, and not in the same ratio.  Why do we use it then?  Google doesn’t have to tell us its formula for ranking a site in its results; in fact, doing so would likely be a disservice to everyone involved.  So, based on observation and conjecture, Moz provides us with an imperfect rating called Domain Authority.  It is, in the end, a made-up number that is flawed.  It was so flawed that DA 2.0 came out in March 2019, and many sites saw their rating plummet.  Search results didn’t change … why would they, when they are not connected to DA?  Now that that’s out of the way…

Expired Domains

We should start with https://expireddomains.net.  It’s a good site that aggregates the domains being deleted across multiple registrars and provides some base metrics we can use to qualify domains for a closer look.  There are millions of deleted domain names, and we need some way to filter them down to something you can work with.

Why do we care about expired domains?  Your PBN is attempting to mimic organically earned backlinks.  In the real world, backlinks would likely come from sites that have their own backlink profiles, and no two organic sites would have the same backlink profile.  A brand-new domain name comes with 0 backlinks.  A used one will likely include some backlinks, and those give your PBN its power.  Think of these like telephone numbers.  If you want someone to call you, you don’t want the old number of someone who didn’t have any friends, but if you get a popular person’s old number, your phone is more likely to ring.  Sure, the callers are after the former owner (or site, in our case), but if you can answer their questions and keep them happy, they may keep calling.  Same with expired domains.

Another site that focuses on expired domains is domcop.com.  It is expensive, and I’ve never used it because I can do all the work it does pretty swiftly myself and save the $100 a month.  Sure, it has a cheaper plan, but you can’t do anything with it: access to the expired and archived sections isn’t included, making it relatively worthless.  Back to expireddomains.net, which is free.  Once we log in, we can choose filters to sort down to the domains we want.  Some of the filters I always put into play are:

expireddomains.net filters

The Main Tab

  1. Domain Name CONTAINS – If I want a domain about a hospital, I’ll type “hospital”.  This may remove some domains that would have worked, but it’s helpful.  If you get too few results, you can remove the filter.
  2. Length – I set this to no more than 15 characters.  Longer domains were likely created for some weird purpose.
  3. Hyphens – I set this to no more than 1.
  4. Backlinks – I set this to 10 or more.  Keep in mind that the reported metrics may not always be accurate, but this at least fishes out all the domains that have 0 links.
  5. Only New Last 7 Days – 7 days is ages in this kind of hunt, but I leave a few extra days because those who set it to 12 or 24 hours may have missed a gem.  After a week, though, the pile is looted beyond use.
  6. Only Available Domains – who cares about a domain that’s already registered to someone else?
  7. Domains per page – change this to the max: 200.
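These Main Tab filters are easy to re-apply later when you’re re-checking an exported list.  Here’s a minimal PHP sketch of the same rules; the field names are made up for illustration and are not expireddomains.net’s actual export format:

```php
<?php
// Hypothetical re-check of the Main Tab filters against a list of candidates.
$candidates = [
    ['domain' => 'hospitalroad.com',                 'backlinks' => 42],
    ['domain' => 'my-very-long-hyphenated-name.com', 'backlinks' => 25],
    ['domain' => 'hosp.com',                         'backlinks' => 0],
];

$keep = array_filter($candidates, function (array $c): bool {
    $name = explode('.', $c['domain'])[0];       // strip the TLD
    return strpos($name, 'hospital') !== false   // 1. contains the keyword
        && strlen($name) <= 15                   // 2. length cap
        && substr_count($name, '-') <= 1         // 3. at most one hyphen
        && $c['backlinks'] >= 10;                // 4. weed out zero-link domains
});

print_r(array_column($keep, 'domain')); // only hospitalroad.com survives
```

Nothing fancy, but once the filters live in code you can re-run them against any export without clicking through the site again.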

The Additional Tab

  1. I check NO CCTLDS and NO SECOND LEVEL DOMAINS – the quality of the domains in these country codes isn’t worth your time unless you are specifically targeting that country, and even then a .org or a .info is usually better than a .bi.

The Adwords & SEO Tab

  1. Domain Pop – how many domains point to the expired domain?  A site with 100 backlinks from 1 domain is less valuable than a site with 10 backlinks from 10 domains, so we want as many referring domains as possible.  I generally set this to 50 or 100.
  2. Wikipedia – if I still have a ton of results after all these filters, I may put a minimum of 1 here, but that’s a secondary filter for when I need to narrow things further.

The Majestic Tab

  1. Trust Flow – I’ll set this to 7.  It’s rare to find a high-trust expired domain, as those usually get picked up at auction, but you don’t want a Trust Flow below 5, so I set a minimum here to ensure the site has some trust.
  2. I may use the drop-down filter Majestic Topical Trust Flow for specific niches, but be careful here, because the categories often lose the context of the domain name.  For instance, a domain named “ThisIsHealthInsurance.com” can easily be miscategorized as transit.

Filtered Results

Search, and you’ll get results like these…

expired domains search results for PBNs

Now, most of these values are worthless to me.  They are usually cached results and not exactly accurate.  So I first sort the results by Domain Pop, or DP in the screenshot, then use the clipboard icon to export just the domain names of the top 200 sites that made it through our filters.

Domain Detailer – Accurate SEO Metrics

With these exported names in order of domain pop, I paste the results into a spreadsheet for safekeeping.  Then I use a program called Domain Detailer, a downloadable PC application.  You buy credits, and each domain looked up takes a credit, but you get a ton for your money.  I’ve probably spent 80 bucks on these credits and looked up over a hundred thousand domains in this tool.  Its metrics are more accurate than expireddomains.net’s.

Domain Detailer

In Domain Detailer I paste the domain names I have selected and click “Add From Text Box.”  With credits, the program builds a spreadsheet of metrics that should be semi-self-explanatory.  The metrics I focus on are Moz DA, Majestic Domains, Majestic TF and the categories.  Backlink counts are often off, so it’s good that we have two sources, Moz and Majestic; if they differ, I believe the higher one.

When I find a domain whose mix of these metrics stands above the group, I move to the final phase.

Anchor Text

I open SEMrush.com, search that domain name and verify that the domain pop is as good as I expected.  I then check the anchor text, and if it is consistent with the keywords I’d like to be pulling in, I say: it’s time to buy.

Remember to use a registrar that offers free private registration, and shop around; they are not all the same price.

We’ll talk soon about PBNs and the on-site things you’ll need to consider when building the site up.  You may also decide that you just want to help a specific site out, skip building a PBN site and 301 redirect all the backlinks to the main domain; we’ll cover that as well in Part 3.


Trace Route Network Tool, Only On A Map

Tracert is a command that’s elementary to networking and computers.  Trace Route, or tracert, does exactly what it sounds like, and it’s useful because it tells you every IP address a packet passes through between the server and the receiver (not technical terms there).  It shows where speed issues are, in a global perspective or in your own home.

It’s usually just text, but https://www.monitis.com/traceroute/ made it more fun.  From this map I can see why my fiber connection doesn’t seem very fast tonight: I’m being routed through London, England to make a domestic “hop” (hops are each leg of a journey in a tracert).

tracert in SEO


How To Rank Your Site On Google…Forget the Keywords


Well, don’t totally forget the keywords, but if you spend more than 5 minutes on keywords, you’re going to be pretty surprised by some purported data I stumbled across.  As you likely know, Google uses about 200 factors in determining your site’s ranking.  I personally have placed a lot of emphasis on speed and backlinks, and while I thought social media was important, I must admit I didn’t give that factor as much attention as I should have.

The first big thing to note is that 10.3% of ranking is CTR.  So if you have ever seen your content jump up in the rankings and then slowly (or quickly) taper off over time, it’s likely that people are clicking on you less and less as you slide down the pages.  It’s the single biggest impact.  I feel Google gives you the benefit of the doubt at first, ranking you higher than average, and then lets people determine if your site is worth it.  That’s important to consider, and similar to conversion rates.

When we take human behavior out, backlinks and social media largely decide your ranking ability.  This makes sense: if no one is talking about you but they are talking about your competition, and more people click on your competitor’s site, which also has the most backlinks, then you’re wasting your time trying to get your keywords exactly right in the headers, description and title.  Combined, those contribute a value of 5, where backlinks are over 120.

If you’d like to access the Data Studio report directly you can visit https://datastudio.google.com/open/1lNt4SYd4jrfXWMo9HPKvrj1FWFO0oxG4

If this graphic surprises you, it might be a good plan to check out our SEO Store or Upwork Profile.


52 Tools And Counting: Mostly Free SEO Tools – I Actually Use

Work In Progress

Thanks for checking in with Ultimate SEO.  This site is a side project, as my clients’ sites are the main ones.  That means I often have edits and unfinished elements to circle back to.  Please feel free to let me know how this site could be better using the contact form.


Bad Backlinks: 100 Sites You Don’t Want A Backlink From.

Bad Backlinks

UltimateSEO.org has backlinks from about a thousand domains.  In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles.  That’s how I was introduced to “The Globe,” which charges sites to NOT list them, making money from SEOs paying them not to backlink.  At $36 a link they’re likely insane, and I bet it’s bringing in some money.  But before we go all crazy and start paying Ransomlinks (if it’s not a word, I claim it: Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay not to be linked to)…

In reviewing the situation I ran across a list of the most disavowed sites.  I figured I’d share that with you below, but before I do: what did I decide to do about these bad links pointed at my site?

  1. Pay: Heck no! Then the terrorists win.
  2. Disavow: No! Don’t use disavow unless Google has placed a manual action against your site.  I’m skeptical of the tool’s purpose anyhow, and Google itself says there is no need to use it unless you’ve been penalized and told by them that you are being penalized.
  3. Do Nothing: Yes! Don’t do anything.  Google likely knows about the Ransomlinks scheme and has already penalized the site by deindexing it.  There are so many random domains that it would be a mess to address, so let it be unless you have seen a negative effect.  In other words, before you saw your leg off wondering if that spot is cancer, stop and find out.
  4. An idea: 301 redirect them.  Seriously: all of these links point to a subdomain that until now hasn’t existed.  Most others talking about this site note a similar targeted subdomain.  I could create the targeted subdomain and redirect all links to it from my site back to theirs. 🙂

I’m opting for the third, as I don’t have any indication that Google cares about these Ransomlinks.  They may actually bring some random traffic of use, so redirecting them would take that from my site.
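For the curious, option 4 would be nearly a one-liner.  A hedged sketch, assuming an Apache host with mod_alias, where spam.example.com stands in for whatever subdomain the links actually target (both domains here are made up):

```apache
# .htaccess served at the newly created subdomain (spam.example.com):
# bounce every request that lands there straight back to the linking site.
Redirect 301 / https://the-globe.example/
```

Since the subdomain never existed before, nothing legitimate breaks; every ransom link simply points its visitors back where they came from.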


And now the most disavowed sites…

Most popular websites disavowed by webmasters

1 blogspot.com
2 blogspot.ca
3 blogspot.co.uk
4 ning.com
5 wordpress.com
6 blog.pl
7 linkarena.com
8 yuku.com
9 blogspot.de
10 webs.com
11 blogspot.nl
12 blogspot.fr
13 lemondir.com
14 blog.com
15 alonv.com
16 tistory.com
17 searchatlarge.com
18 dvpdvp1.com
19 typepad.com
20 nju-jp.com
21 bluehost.com
22 wldirectory.com
23 tumblr.com
24 hyperboards.com
25 directoryfuse.com
26 prlog.ru
27 informe.com
28 ligginit.com
29 theglobe.org
30 pulsitemeter.com
31 articlerich.com
32 weebly.com
33 the-globe.com
34 blogspot.no
35 theglobe.net
36 articledashboard.com
37 dig.do
38 seodigger.com
39 cybo.com
40 fat64.net
41 bravenet.com
42 cxteaw.com
43 askives.com
44 mrwhatis.net
45 insanejournal.com
46 xurt.com
47 freedirectorysubmit.com
48 commandresults.com
49 sagauto.com
50 internetwebgallery.com
51 freewebsitedirectory.com
52 ewbnewyork.com
53 000webhost.com
54 tblog.com
55 directorylist.me
56 analogrhythm.com
57 snapcc.org
58 bravejournal.com
59 weblinkstoday.com
60 m-pacthouston.com
61 linkcruncher.com
62 tripod.com
63 cogizz.com
64 niresource.com
65 over-blog.com
66 ogdenscore.com
67 free-link-directory.info
68 alikewebsites.com
69 folkd.com
70 djsonuts.com
71 uia.biz
72 bangkokprep.com
73 forumsland.com
74 punbb-hosting.com
75 hostmonster.com
76 blogspot.in
77 siteslikesearch.com
78 bookmark4you.com
79 siliconvalleynotary.com
80 listablog.com
81 poetic-dictionary.com
82 linkspurt.com
83 cultuurtechnologie.net
84 azjournos.com
85 exteen.com
86 articletrader.com
87 blogspot.com.au
88 delphistaff.com
89 altervista.org
90 media-tourism.com
91 woodwardatelier.com
92 holdtiteadhesives.com
93 lorinbrownonline.com
94 tech4on.com
95 popyourmovie.com
96 trilogygroveland.com
97 foqe.net
98 directorybin.com
99 eatrightkc.com


Updates That Matter AND Updates That Don’t: SEO Basics

This post was originally published 3/7/2019 but was lost during a restoration from backup while Ultimate SEO was dealing with a DoS attack.

I’ve heard things come in threes, so I’m curious what’s next, because two big updates came out this week.  Unfortunately, the one that matters is unlikely to gain as much attention as the one that doesn’t mean anything.  So let’s begin with what folks are focused on: something that means nothing.

DA 2.0 NEW DOMAIN AUTHORITY

MOZ Domain Authority 2.0

Moz redid their Domain Authority calculations, and the implications are HUGE for those who work at Moz.  If you don’t work there, it’s not a big deal.  Domain Authority is like a fake credit score.  Your bank will likely use your FICO score, but no one can release your FICO score except FICO.  To solve this, banks and other organizations created their own scoring systems that attempt to mimic your FICO score and that they can give out on credit monitoring sites.  These numbers aren’t really that important, as they are guesses at what your FICO score should be.  VantageScore, for instance, gives a person a three-digit score based on their credit history, but it isn’t a FICO score.  If your bank uses FICO scores, who cares what your VantageScore is?

Moz made up Domain Authority, and Google doesn’t use it.  So a change in how Moz calculates Domain Authority does not mean a change in search engine ranking.  Domain Authority is useful to you because it’s an educated guess at what Google thinks, just like Citation Flow and Trust Flow are guesses by Majestic.

I don’t know about everyone else but the new calculations vastly changed the impression some would have of several domains I operate.  Here are some examples:

Moz DA April 2019 – March 2019 – Backlinks

So am I upset that Ultimateseo.org lost 5 points?  No.  It’s like it lost 5 Matt Dollars… but Matt Dollars don’t matter, and NEITHER DOES Moz Domain Authority.

But again, Domain Authority has value when used among multiple other metrics geared at assessing a site’s rank ability.

PHP 7.3

PHP 7.3 Released And Available To Cpanel Servers

It’s out, and you can now run your site on it if you’re using cPanel/WHM servers.  I’m focused on cPanel because it’s highly popular… unless you use Wix or Shopify, which run proprietary server management software that isn’t an industry standard.  Now, you likely don’t even use 7.2, as many sites still operate on PHP 5.6.  BUT here are the advantages of 7.3.

It’s relevant to SEO because WordPress runs on PHP, and WordPress is an SEO favorite.  While new features are great, and PHP 7 has proven much faster than PHP 5, this newest update may require some caution: PHP 7.3 Issue With WordPress 5.

PHP 7.3 And WordPress Speed

We have some testing done by others, who note:

Is PHP 7.3 faster than PHP 7.2? Should I use PHP 7.3 for my WordPress site? We have done our own performance testing for WordPress running with WooCommerce and benchmarked PHP 7.2 against PHP 7.3.

We installed a standard WordPress 5.0 with the Storefront theme and imported the 50 products supplied by WooCommerce as sample data on a standard Servebolt High Performance plan.

We wanted to test whether PHP 7.3 was performing better than PHP 7.2, and therefore bypassed the reverse proxy and ran tests directly against the backend web server running the PHP, effectively bypassing all caching. The tests were run from the server to eliminate network bias.

We used as the testing tool, running 3 000 requests with a concurrency of 1000, with keep alive enabled. ….

We did a few more compiles of PHP 7.3 and benchmarked those as well. We also ran benchmarks on all major versions from 5.6 and up. See the results in the table below.

PHP and Database

Version    Req/s  vs 5.6   vs 7.0   vs 7.1   vs 7.2   vs 7.3   vs 7.3 v2  vs 7.3 v3
PHP 5.6       74
PHP 7.0      177  239.19%
PHP 7.1      183  247.30%  103.39%
PHP 7.2      192  259.46%  108.47%  104.92%
PHP 7.3      221  298.65%  124.86%  120.77%  115.10%
7.3 v2       221  298.65%  124.86%  120.77%  115.10%  100.00%
7.3 v3       223  301.35%  125.99%  121.86%  116.15%  100.90%  100.90%
7.3 FINAL    224  302.70%  126.55%  122.40%  116.67%  101.36%  101.36%   100.45%

We ran this test 3 times on PHP 7.2 and three times on PHP 7.3, and compared the numbers.

PHP 7.2 average: 192 requests per second
PHP 7.3 average: 224 requests per second

The results were consistent with very small variation. WordPress with WooCommerce running PHP 7.3 outperforms PHP 7.2 by 16.67%.

So what are you waiting for? It is time to get that extra performance boost. Upgrade your site to be PHP 7.3 compatible today, and get the 10-17% extra performance boost!”

Techy Stuff In 7.3 Update

From hackernoon.com we get the features listed below:

JSON_THROW_ON_ERROR

Not having an adequate way to handle errors when using JSON has been a problem for quite a long time, and web developers all over the world have seen this as a huge downside of the language.

The PHP 7.3 RFC accepted this update by a 23-to-0 vote, which says a lot about how much this feature had been requested.

Until PHP 7.2 we needed a workaround to get an error out of JSON, and it was neither reliable nor proficient at its job.

The new flag I am about to show you is an excellent alternative, because it gives the programmer the ability to use exceptions, which can be managed within a try-catch block.
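Here’s a minimal sketch of the flag in action on PHP 7.3 or later, alongside the old json_last_error() workaround it replaces:

```php
<?php
// PHP 7.3: JSON_THROW_ON_ERROR makes json_decode()/json_encode() throw
// a JsonException on failure instead of silently returning null.
try {
    $ok = json_decode('{"a": 1}', true, 512, JSON_THROW_ON_ERROR);
    json_decode('{not valid json', true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    echo 'Invalid JSON: ' . $e->getMessage() . PHP_EOL;
}

// The pre-7.3 workaround: remember to check json_last_error() yourself
// after every call, or errors pass silently.
$maybe = json_decode('{not valid json', true);
if (json_last_error() !== JSON_ERROR_NONE) {
    echo 'Old-style check: ' . json_last_error_msg() . PHP_EOL;
}
```

The exception carries the same message json_last_error_msg() would return, so nothing is lost by moving to the try-catch style.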

Is_countable

A countable element in your code can be a variable in array format or an object whose class implements the Countable interface.

Last year, PHP 7.2 added a warning that shows up whenever the web developer counts, or tries to loop over, an uncountable element.

It is possible to avoid this warning, and one of the most common solutions is to apply a check like the snippet below:

The code checks whether the variable is an array or an instance of the Countable interface.

It works, but it seems a little bit “crowded,” and for the many of you who work long hours, lines like these wear your eyes out after a while.

The team developing the new version took account of this and added a new function that helps the web developer immensely.

The is_countable function takes a variable as a parameter and returns a boolean depending on whether that variable is countable or not.

There is no restriction on the parameter’s format; of course, if you pass a non-countable variable, the return will be false.
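A short example on PHP 7.3+, showing the “crowded” old check next to the new function:

```php
<?php
$items = new ArrayIterator([1, 2, 3]); // implements Countable

// The pre-7.3 check described above:
if (is_array($items) || $items instanceof Countable) {
    echo count($items) . PHP_EOL; // 3
}

// PHP 7.3: one readable call.
var_dump(is_countable([1, 2, 3]));  // bool(true)
var_dump(is_countable($items));     // bool(true)
var_dump(is_countable('hello'));    // bool(false)
```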

array_key_first(), array_key_last()

As of version 5.6 of PHP, there are over 75 built-in functions in the arrays category.

Despite the vast number of tools available, at the moment, if we need to retrieve the first or the last key of an array, we have to get all the keys using array_keys() and only then take the first or the last of them.

Another way is to opt for end() or reset().

As you may know, all the methods just described modify the array’s internal pointer, which is something that (besides consuming resources) you may simply not want to do.

The RFC for the upcoming version proposed the introduction of 4 brand-new methods that were set to solve this issue.

The four methods were:

  • array_key_first()
  • array_key_last()
  • array_value_first()
  • array_value_last()

Among the four of them, only the pair that fetches the keys was accepted, with 18 votes to 14.

They work for both numeric and associative arrays.
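A quick demonstration on PHP 7.3+:

```php
<?php
$assoc = ['first' => 10, 'second' => 20, 'third' => 30];

// PHP 7.3: fetch boundary keys without touching the internal pointer.
echo array_key_first($assoc) . PHP_EOL; // first
echo array_key_last($assoc) . PHP_EOL;  // third

// Numeric arrays work too.
$list = [5, 6, 7];
echo array_key_first($list) . PHP_EOL;  // 0
echo array_key_last($list) . PHP_EOL;   // 2
```

Unlike reset() and end(), these are pure lookups, so you can call them freely inside a loop that is also iterating the array.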

The same would have applied to the other two functions illustrated in this chapter, array_value_*.

Just to be clear, let me repeat:

Those functions were refused, with 18 no votes and 15 yes.

In my opinion, these two functions would have been useful as well, but according to several web developers, the returned value would, in certain cases, have been ambiguous.

Here is why:

An additional option that I came across browsing forums and talking to other web developers was to return a tuple like [$key => $value].

Even though this option will not be available in the new version, given the favourable responses it might arrive in a following RFC.

Since these functions did not exist before, there are no backwards-compatibility problems; the only issue could arise if you have created and are using your own versions of array_key_first() and array_key_last().

Same site cookie

Deploying secure applications must always be a main focus for every programmer.

One task each of us faces on a daily basis is diminishing the risk of CSRF and information-leakage attacks.

Same-site cookies declare that cookies must be sent only with requests initiated from the same domain.

This is not an official standard, but Google Chrome and several of the best PHP frameworks already implement it, whereas Firefox and new versions of other frameworks have confirmed that they plan to do so.

Here is the support table for same-site cookies from caniuse.com.

Currently, cookies are issued by the Set-Cookie header; a web developer can set a key-value pair alongside flags that tell the browser whether a cookie should be available or not.

This leaves the door open to CSRF attacks.

The new RFC is supposed to solve the problem in a non-breaking way by adding a new parameter to, and modifying, these functions of the language:

  • setcookie
  • setrawcookie
  • session_set_cookie_params

Two ways were proposed: adding a new argument to the functions, or allowing an array of options that moves all the cookie options inside it.

How will it work?

Two values can be given to the SameSite flag in the functions above.

They are Lax and Strict.

Cookies using Lax will be sent with a GET request coming from another domain, while cookies using Strict will not.
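A hedged sketch of the options-array form of setcookie() that shipped in PHP 7.3; the cookie name and value here are made up:

```php
<?php
// PHP 7.3: setcookie() accepts an options array, including 'samesite'.
setcookie('session_id', 'abc123', [
    'expires'  => time() + 3600,
    'path'     => '/',
    'secure'   => true,   // only send over HTTPS
    'httponly' => true,   // hidden from JavaScript
    'samesite' => 'Lax',  // or 'Strict' to block cross-site GETs as well
]);
```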

Including features that increase security in your code always seems like a no-brainer, but as always, before applying them in our scripts we need to properly evaluate the pros and cons of our choices.

The main risk of using the SameSite flag as a supplementary argument to those functions is that it might never become an official standard.

That means browsers could eventually drop the flag.

If that happens and you have already implemented it, your applications will be left stuffed with junk code that needs to be removed.


Creating A Private Blog Network: PBNs In 2019 For SEO

Typical Parts Of A PBN
Each PBN site will include registering the domain, setting the name servers and hosting the site.

First and foremost, the most important aspect of your Private Blog Network is randomness.  Consider what pattern or footprint your PBN might have, and avoid that commonality.

patterns that give away a PBN
Patterns and commonality to avoid in building a Private Blog Network

Good PBNs Are Random, Start With Different Name Registrars

First off, you need private domain registration; if not private, then you’ll need people and addresses from all over.  If you always use GoDaddy, you’re going to have to try out others to avoid a pattern.  Incidentally, if you always use GoDaddy you’re also getting ripped off, as they charge for privacy and many others don’t.  Some popular name registrars are 1and1.com, namesilo.com, namecheap.com and cosmotown.com.  Each of these can save you a considerable amount over GoDaddy, considering they offer free private registration, and using more than one breaks a pattern.

Each time you add a new site to your PBN, you need to approach it from the beginning, as if you’re playing a character in a story who has never made a website before.  What I mean is: if you know you have a site on Host A and you like that host, you’re making decisions based on previous sites and are more likely to create a pattern.  Forget Host A; how would you find a host for the first time?  Google popular web hosts and pick a cheap new partner.

One thing that’s really beneficial about building PBNs, and more helpful to you in the long run, is the forced exploration.  After you’ve built ten sites on ten hosts using ten registrars and ten WordPress themes, you’ll be able to write three top-ten lists and rank the best of the 1,000 host/registrar/theme combinations that were available to you.  It’s a lot of practice, and as you avoid patterns and repetition you’ll find yourself stepping out of your norm.

Vary Your Web Hosts

The speed of a web host is normally important, but not necessarily when you’re building a PBN.  While you want your primary or money site to load in under 3 seconds, it’s perfectly fine if a PBN site loads in 7 seconds, and that opens the door to all manner of generic no-name web hosts.  Your primary goal with multiple web hosts is to utilize different IP addresses.

Organizing A PBN Gets Complex
Consider the complexity that can quickly arise when seeking randomness across your sites.

The only two big issues with this model …

Organization Of PBN Resources

What site is down?  Oh… well, which domain registrar did I use?  Am I using their nameservers or someone else’s?  Where did I point it to be hosted?  Sure, these aren’t that annoying to answer with a 10-site network, but try answering them when you’ve built and scaled up to 200 sites using 7 registrars, 20 name servers and 150 different IPs.  It becomes unmanageable as you find yourself searching for your sites more than you are building new ones; and why are you having to search?  Maintaining a site is essential: updates roll out to WordPress, plugins get updated and hackers exploit new vulnerabilities.  If you log into every site you own and spend 5 minutes on each, your 200-domain network will take 16 hours, or two days a week; and considering you only spent 5 minutes on a site, you likely didn’t fix any issues and took no breaks!  It’s time to consider an apprentice, or spreadsheets that fully document every aspect of your network, or both.

Uptime Monitoring

Somewhere around 100 domains, I figured out I needed to approach this like an enterprise would and get actual uptime monitoring, allowing me to see the state of the network easily.  UptimeRobot allows you to set up 50 monitors on a free account.

Uptime Monitoring Your PBN

In the real world, 94% uptime is horrible.  Consider that in the last 30 days I recorded 104,765 minutes of downtime across this sample of sites.  I had issues with a server being attacked by someone using 1,700 servers in a DoS attack.  Why?  Anyone’s guess.  Usually it’s a game to them, and they aren’t paying for those 1,700 servers; they’re other people’s hacked resources being used to grow the attacker’s network.

You may be interested in MainWP or InfiniteWP; GoDaddy provides GoDaddy Pro.  Be mindful that these only work when they work, and will they give away a signature pattern?  They can likely create an easier management solution, but easier is dangerous.

Costs Balloon And Randomness Prevents Savings

As you scale up from 10 to 20 to 50 sites, you’re going to wake up one day and realize you’re spending hundreds of dollars a month on infrastructure, and all of your time will now be consumed with maintaining your network.  Adding someone to help is going to increase costs and take your time to train them to maintain the network effectively.  Be careful who you bring in to help.  Friends are obvious choices, but when they get upset about something unrelated to the network, they could leave you high and dry; worse yet, they are the most likely to teach you a lesson by bailing on you for a couple of weeks.  Trust the people who are in it for the money, and pay them more than they can get at a retail job to build loyalty to your mission.  They need not be technical people, but they need to understand that if a site is down, Google can’t index it and that backlink is missing.  They need to be able to follow a logical progression and understand the parts in play to help you maintain the site.

The obvious answer to ballooning costs is to bundle services and make sure you’re utilizing resources in the most effective manner, but that is accomplished by making patterns.  You can’t chase those savings without giving your sites away.

Cloudflare Allows Consolidation And The Pattern Is Indistinguishable

Cloudflare Use For PBNs
Cloudflare allows some consolidation while masking the pattern.

12,000,000 sites utilize Cloudflare’s free services, which include masking your host server’s IP, CDN services and security.

Cloudflare offers the ability to hide among the masses.  Who is Cloudflare?  They stand in front of your server and take the brunt of the internet’s crap.  Upwork.com, Medium.com, Themeforest.net and Chaturbate.com are among the names using Cloudflare’s services.  Some estimates suggest that Cloudflare fronts about 8% of the entire internet.  That’s huge!  At one point they found themselves protecting the Israeli government’s network as well as the PLO’s.  See: Cyber Warfare: Cloudflare In The Middle.

Using Cloudflare is hiding in plain sight, and it’s free.  I recommend it, but in a mixed capacity: still keep some sites outside of their network just to avoid any one bottleneck.  It would seem odd if 100% of the sites linking to a domain were using Cloudflare.  Remember, they are 8%; while the largest chunk of the internet, they aren’t the internet.

This article has focused mainly on the external and infrastructure concerns of building a PBN.  This is really a third of the topic; in the coming weeks I’ll publish two more posts that address on-site content issues and site design considerations for a network of sites.
