Cloud Computing: Digital Ocean vs Google Cloud vs AWS

This may seem off topic, but it's not: technical SEO is imperative, and you're not going to rank number one on Google using Shopify or Wix. It just isn't going to happen.

It's also apparently difficult to get solid advice on SEO hosting from "experts." Best Blog Hosting for SEO is junk … reciting features doesn't make a hosting plan the best. One quote notes that WordPress comes preinstalled with InMotionHosting.com … so what! Our web servers are preconfigured to install WordPress in every new account as well. It only saves maybe 5 minutes per user, and while that time adds up quickly for a web host, you aren't a web host, so it's not a big deal. I'd rather hear about benchmarking tests they ran to decide who is the best.

Features Aren’t Technical Specs

Unlimited bandwidth … sounds great, but what are the limits? There are always limits, and some are beyond the host's control. For instance, if someone uses a CAT5 cable instead of a CAT6, everything will be more speed limited, especially if a bottleneck is designed into the infrastructure. "Unlimited bandwidth" means nothing to me because physical limits exist and can't be avoided. And WordPress preinstalled saves someone 5 minutes, nothing more. These features say nothing about the hosting platform itself.

Cloud Computing: Be Your Own Host

The industry standard in web hosting is cPanel. There's no way around it: with cPanel your support options are bountiful, whereas dreamhost.com has its own proprietary server software. It's no better in actuality, just far less supported by third parties. Ultimate SEO is hosted on a variety of cPanel servers that were easy to build and deploy. I made them from scratch and with templates, but all in all there are 4 AWS servers, 2 Google Cloud Platform and 4 Digital Ocean servers currently powering hundreds of sites, including this site. Cost varies wildly…

It's important to note that your web host is likely run on one of these three services. You're sharing their share of the cloud environment. Why not just skip ahead and be the master of your domain? Sure, it will cost more than $3 a month … but that $3-a-month hosting plan is shit.

A good review comparing AWS with a traditional hosting provider is AWS vs Blue Host.

Amazon Web Services

I don't even know what I am spending, or where and how it is being spent. AWS charges you for every little thing, and no matter what steps you take, rising project costs can seem simply unavoidable. Their platform is NOT intuitive, and it takes some play time to remember that you have to leave the virtual server's configuration area to select an IP address (that will cost you money … each IP address, not the bandwidth, just the number), and then return to the original area to associate it. Don't even think about swapping hard drives and knowing what is attached to what unless you are prepared to write down long strings of numbers and letters.

AWS does provide greater flexibility than the others, with options beyond just a virtual server … but unless you plan to send 100,000 emails a day, you won't benefit from their email service, as an example. Technical-SEO-wise I'd give AWS a D overall. Infrastructure and computing power is an obvious A+, but how you interact with it is what weighs the grade down.

Poor navigation and nickel-and-dime pricing are absurd. Want to monitor your usage so you can understand your bill? Monitoring costs more … it's ridiculous.

They do offer reserved instances, and I loaded up on those, but still my costs never decreased. AWS is so hard to understand billing-wise that IT managed service providers offer free Excel templates to figure out your AWS monthly costs. Think I'm being over the top? Check out this calculator spreadsheet from AWS to forecast your expenses.

Here's something crazy: my April bill was $167, but AWS forecasts it will be $1,020 in May. Why, I have no idea. I'm not adding servers…

AWS costs are high and unpredictable


Google Cloud Platform

Google Cloud Platform is easier to use and wrap your head around, but it is considerably more expensive than either of the other options. For that simple reason, it receives an F. The additional costs come with fewer options and fewer features than AWS. Billing is more transparent, and at least you can understand why your bill is what it is. But Google also makes unilateral decisions for you, like blocking SMTP and SSH access. Sure, it's more secure, but it makes email and server maintenance a nightmare. Documents like Connecting to Instances make it seem like no big deal, but these won't let you move a file from your computer to the server the way SFTP would.

They are expensive, offer less, and needlessly shoot you in the foot with their restrictions. That's why I stand by the F as an overall grade. Now, infrastructure capabilities … A+, no doubt about it.

Digital Ocean

I received no compensation or thank you from anyone for writing this … Digital Ocean is my B+ graded cloud solution. It's the cheapest, and they don't charge a fee for tools that are required for the main product to function, unlike AWS and its static IP addresses. They have the fewest abilities and options outside of a virtual server. If you want a managed database server, that's in the works unless you can use Postgres. That's limiting, but it's also not important if you're just running a few web servers that will already have MySQL installed on them anyhow.

Digital Ocean is the no-frills, no-surprises cloud computing option. The reason I have so many servers is that I am migrating everything off AWS and Google Cloud to Digital Ocean … it'll be cheaper. A lot cheaper…

cloud computing cost comparisons


That's right … $20 vs $121, $177 and $120 from AWS, GCP and Azure. I didn't really consider Microsoft Azure, just because I have reservations about moving into their sphere of control, where everything you need to do is addressed by yet another Microsoft product that usually has little imagination in it.

Test out a server in each environment and I think you’ll quickly take to the Digital Ocean option.


https://ultimateseo.org/digital-ocean-aws-google-cloud/

Speed: Page Load – Technical SEO Out Ranks Most In Mobile

A Case Study of SEO Metrics And Rank

Here is a recently added FAQ to the Ultimate SEO FAQ section.

Let me show you how important it is.

desktop vs mobile search results

Why is realtor.com not higher than zumper.com in the mobile search on the right? Consider these metrics:

Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108

Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830

In every metric realtor.com wins, so why is it below Zumper.com on the mobile search?

Site Speed Test on GTMetrix

Realtor.com Fails Speed

site speed


Zumper.com Passes Speed

page load


So in this example we clearly see a more popular site beaten by a less established site, and the only factor the smaller site did better on was speed. And we can't discount this as "well, it's only important on mobile." In case you missed it…

60% of searches are mobile


Now, when we consider the facts above, let's also dispel people's over-fascination with keywords and text optimization, word position and frequency, content length … on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content in their desktop and mobile versions; they just differ wildly in speed. What are some of the reasons? Realtor.com decided to present 16 rows of 3 images of homes to visitors, while Zumper shows 4 rows of 1 image … and then additional rows load as you scroll down. Lazy load, and 1 image vs 3. That's how they keep their requests to about a third of the realtor.com page.

What Are Requests?

I'd suggest you think of requests as if they were shots from a gun at your head. You need to avoid them! Fewer shots is a lot better…

Requests are literally requests made of the server before the page can load. If I make a page with one image on it, that is one request. Let's say I decide to replace that image with a slider with 5 slides; now I have 5 requests … the same page area, but that cool feature quintuples the trips the computer has to make! Let's say now I add social media icons to the page … Facebook, Twitter, Instagram, LinkedIn and an email icon … small and just up in the right corner. That social media addition just added 5 more requests. Think about all the things on your page; they don't all come together in one big Amazon package with a smile … they are shipped to the computer individually. Now I have one page with 1 request and another with 10, and the visible difference isn't much … that slider only displays one image at a time.

Latency And Requests

Servers don't respond instantly … they take a little while to think and retrieve the requested resource, and then it has to travel the distance from the server to your computer. It may travel at the speed of light, but light still takes time. This delay is called latency. 50 milliseconds is a good latency.

If both servers in the FAQ had a 50 ms latency, and we assume requests are handled one after another (a worst case, since browsers fetch several resources in parallel), then:

Realtor.com's server will take 50 ms x 301 requests = 15,050 ms, or about 15 seconds

Zumper.com's server will take 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds
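A minimal sketch of that arithmetic (assuming a flat 50 ms per request and strictly sequential requests; real browsers do better by parallelizing):

```php
<?php

// Worst-case page load estimate: every request pays the full latency,
// one after another. Real browsers fetch several resources in parallel.
function estimateLoadMs(int $requests, int $latencyMs = 50): int
{
    return $requests * $latencyMs;
}

echo estimateLoadMs(301) . " ms\n"; // the Realtor.com example: 15050 ms
echo estimateLoadMs(134) . " ms\n"; // the Zumper.com example: 6700 ms
```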

I hope this explains why you want to limit requests, and prioritize speed as much as you focus on keywords.

Ways To Decrease Requests

Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I had 4 icons … I combined them into one image. That's a quarter of the requests with no change in user experience other than a quicker site.

technical certifications


Lazy Load

Lazy load also helps speed up the initial page load time. If you have a lot of images "below the fold" on a page, the page still needs those images to finish loading unless you institute lazy load, which essentially tells the computer to load an image only when it is coming into view. This likely makes sense if you have 300 images on the page and plenty of them are scrolled far down … but all in all I'm on the fence on lazy load. I ran speed tests on the homepage of this site with lazy load on … 3 test results: 2.3 seconds, 1.9 seconds and 1.9 seconds. I turned off lazy load, reran the tests and got 2.3 seconds, 1.9 seconds and 1.7 seconds. So technically the site loaded faster with lazy load off … keep in mind it takes a bit of thinking for the server to implement it. Lazy load helps speed up a site drastically if there are a ton of images spread vertically, but not much on a normal page. What are the full implications for SEO when a site is crawled?

It's suggested by "Ask Yoast" that lazy load is fine for SEO, and the images are rendered as Google scrolls down the page and indexes the content.


http://ultimateseo.org/tech-seo-mobile-speed/

Updates That Matter AND Updates That Don't: SEO Basics

This post was originally made 3/7/2019 but was lost during a restoration from backup while Ultimate SEO was dealing with a DoS attack.

I've heard things come in threes, so I'm curious what's next, because 2 big updates are out this week. Unfortunately, the one that matters is unlikely to gain as much attention as the one that doesn't mean anything. So let's begin with what folks are focused on … something that means nothing.

DA 2.0 NEW DOMAIN AUTHORITY

Moz Domain Authority 2.0

Moz redid their Domain Authority calculations, and the implications are HUGE for those who work at Moz. If you don't work there, then it's not a big deal. Domain Authority is like a fake credit score. Your bank will likely use your FICO score, but no one can release your FICO score to you but FICO. To solve this, banks and other organizations created their own scoring systems that attempt to mimic your FICO score and that they can give out to you on credit monitoring sites. These numbers aren't really that important, as they are guesses at what your FICO score should be. VantageScore, for instance, gives a person a score based on their credit history that's also 3 digits, but it isn't a FICO score. If your bank uses FICO scores, who cares what your score is at VantageScore?

Moz made up Domain Authority, and Google doesn't use it. So a change in how Moz calculates Domain Authority does not mean a change in search engine ranking. Domain Authority is useful to you because it's an educated guess at what Google thinks, just like Citation Flow and Trust Flow are guesses by Majestic.

I don’t know about everyone else but the new calculations vastly changed the impression some would have of several domains I operate.  Here are some examples:

Moz DA April 2019 – March 2019 – Backlinks

So am I upset that Ultimateseo.org lost 5 points? No. Because it's like it lost 5 Matt Dollars … but Matt Dollars don't matter, and NEITHER DOES Moz Domain Authority.

But again, Domain Authority has value when used alongside multiple other metrics geared at assessing a site's ability to rank.

PHP 7.3

PHP 7.3 Released And Available To cPanel Servers

It's out, and you can now run your site on it if you're using cPanel/WHM servers. I'm focused on cPanel because it's highly popular … unless you use Wix or Shopify, which use proprietary server management software that isn't an industry standard. Now, likely you don't even use 7.2, as many sites still operate on PHP 5.6. BUT here are the advantages of 7.3.

Now, it's relevant to SEO because WordPress runs on PHP, and WordPress is an SEO favorite. While new features are great, and PHP 7 has proven much faster than PHP 5.6, this newest update may require some caution. PHP 7.3 Issue With WordPress 5

PHP 7.3 And WordPress Speed

We have some testing done by others, who note:

Is PHP 7.3 faster than PHP 7.2? Should I use PHP 7.3 for my WordPress site? We have done our own performance testing for WordPress running with WooCommerce and benchmarked PHP 7.2 against PHP 7.3.

We installed a standard WordPress 5.0 with the Storefront theme and imported the 50 products supplied by WooCommerce as sample data on a standard Servebolt High Performance plan.

We wanted to test whether PHP 7.3 was performing better than PHP 7.2, and therefore bypassed the reverse proxy and ran tests directly against the backend web server running the PHP, effectively bypassing all caching. The tests were run from the server to eliminate network bias.

We used a load-testing tool, running 3,000 requests with a concurrency of 1,000, with keep-alive enabled. ….

We did a few more compiles of PHP 7.3 and benchmarked those as well. We also ran benchmarks on all major versions from 5.6 and up. See the results in the table below.

PHP and Database

Version        Req/s   vs 5.6    vs 7.0    vs 7.1    vs 7.2    vs 7.3    vs 7.3 v2  vs 7.3 v3
PHP 5.6          74
PHP 7.0         177    239.19%
PHP 7.1         183    247.30%   103.39%
PHP 7.2         192    259.46%   108.47%   104.92%
PHP 7.3         221    298.65%   124.86%   120.77%   115.10%
PHP 7.3 v2      221    298.65%   124.86%   120.77%   115.10%   100.00%
PHP 7.3 v3      223    301.35%   125.99%   121.86%   116.15%   100.90%   100.90%
PHP 7.3 FINAL   224    302.70%   126.55%   122.40%   116.67%   101.36%   101.36%   100.45%

We ran this test three times on PHP 7.2 and three times on PHP 7.3, and compared the numbers.

PHP 7.2 average: 192 requests per second
PHP 7.3 average: 224 requests per second

The results were consistent with very small variation. WordPress with WooCommerce running PHP 7.3 outperforms PHP 7.2 by 16.67%.

So what are you waiting for? It is time to get that extra performance boost. Upgrade your site to be PHP 7.3 compatible today, and get the 10-17% extra performance boost!”

Techy Stuff In 7.3 Update

From hackernoon.com we get the features listed below:

JSON_THROW_ON_ERROR

Not having an adequate way to handle errors when using JSON has been a problem for quite a long time, and web developers all over the world have seen this as a huge downside of the language.

The PHP 7.3 RFC accepted this update by a 23-to-0 vote, which says how much this feature has been requested.

Until PHP 7.2 we needed to use a workaround to get an error from JSON, and it was neither reliable nor proficient at its job.

The new flag is an excellent alternative because it gives the programmer the opportunity to use the power of exceptions, which can be managed within a try-catch block of code.
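As a quick sketch, here is the flag in use; the third argument to json_decode() is the nesting depth (512 is the default):

```php
<?php

try {
    // The fourth argument accepts flags; JSON_THROW_ON_ERROR makes
    // json_decode() throw a JsonException instead of silently returning null.
    $data = json_decode('{"broken": ', true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    echo "Invalid JSON: " . $e->getMessage() . "\n";
}
```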

Is_countable

A countable element in your code can be a variable in array format or an object whose class implements the Countable interface.

Last year, PHP 7.2 added a warning that shows up whenever the web developer counts or tries to loop over an uncountable element.

It is possible to work around this problem, and one of the best solutions currently used is to apply a check like the snippet below:
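The snippet itself appears to have been lost from this post; a reconstruction of the standard pre-7.3 check being described would look like this:

```php
<?php

$collection = ['a', 'b', 'c'];

// Pre-7.3 workaround: only count() the variable if it is an array
// or an object implementing the Countable interface.
if (is_array($collection) || $collection instanceof Countable) {
    echo count($collection) . "\n";
}
```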

The code checks whether the variable is an array or is an instance of the Countable interface.

And it works, but it seems a little bit "crowded," and for the many of you who work long hours, after a while seeing these kinds of lines wears your eyes out.

The team developing the new version took account of this and added a new function that will help web developers immensely.

The is_countable function takes a variable as a parameter and returns a boolean depending on whether the variable is countable or not.

There is no restriction on the format of the parameter; of course, if you pass a non-countable variable, the return value will be false.
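A short sketch of is_countable() in action (ArrayObject implements Countable, so it qualifies):

```php
<?php

var_dump(is_countable([1, 2, 3]));         // arrays are countable
var_dump(is_countable(new ArrayObject())); // implements Countable
var_dump(is_countable('hello'));           // strings are not countable
var_dump(is_countable(42));                // neither are integers
```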

array_key_first(), array_key_last()

As of version 5.6 of PHP, there are over 75 built-in functions in the arrays category.

Despite the vast number of tools available, at the moment, if we need to retrieve the first or the last key of an array, we have to get all the keys using array_keys() and only then take the first or last value.

Another way is to opt for end() or reset().

As you may know, all the methods just described modify the array pointer, which is something that (other than consuming resources) you may simply not want to do.

The RFC for the upcoming version proposed the introduction of 4 brand-new methods that were set to solve this issue.

The four methods were:

  • array_key_first()
  • array_key_last()
  • array_value_first()
  • array_value_last()

Among the four of them, only the pair that fetches the keys was accepted, with 18 votes to 14.

They work for both numeric and associative arrays.
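A quick sketch of the two accepted functions against both array types:

```php
<?php

$assoc = ['first' => 10, 'middle' => 20, 'last' => 30];
echo array_key_first($assoc) . "\n"; // first
echo array_key_last($assoc) . "\n";  // last

// On a numeric array the keys are the integer indexes.
$numeric = ['a', 'b', 'c'];
echo array_key_first($numeric) . "\n"; // 0
echo array_key_last($numeric) . "\n";  // 2
```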

The same would have worked for the other two functions illustrated in this chapter, the array_value_* pair.

Just to be clear, let me repeat: those functions were refused, with 18 votes against and 15 in favor.

In my opinion, these two functions would have been useful as well, but according to several web developers, in certain cases the value returned would have been ambiguous.

Here is why:

An additional option that I came across browsing forums and talking to other web developers was to return a tuple like [$key => $value].

Even though this option will not be available in the new version, given the favourable responses, it might arrive in a following RFC.

Since these are functions that did not exist before, there are not going to be any backwards-compatibility problems; the only issue could arise if you have created and are using your own versions of array_key_first() and array_key_last().

Same site cookie

Deploying secure applications must always be a main focus of every programmer.

One task that each of us faces on a daily basis is to diminish the risk of CSRF and information-leakage attacks.

The same-site cookie flag declares that cookies must be sent only with requests initiated from the same domain.

This is not an official standard, but Google Chrome and several of the best PHP frameworks already implement it, while Firefox and new versions of other frameworks have confirmed that they plan to do so.

Here is the support schema for same site cookie from caniuse.com

Currently, cookies are issued by the Set-Cookie header; a web developer can set a key-value pair alongside flags that tell the browser whether a cookie should be available or not.

This way of doing things leaves the door open to CSRF attacks.

The new RFC is supposed to solve the problem in a non-breaking way by adding a new parameter and modifying these main functions of the language:

  • setcookie
  • setrawcookie
  • session_set_cookie_params

Two ways were proposed.

Adding a new argument to the functions, or allowing an array of options into which all of the cookie's settings can be moved.

How will it work?

Two values can be added to the samesite flag in the above functions.

They are Lax and Strict.

Cookies using Lax will be accessible in a GET request that comes from another domain, while cookies using Strict will not.
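As a sketch, here is the options-array form of setcookie() that PHP 7.3 introduced, with the new samesite key (the cookie name and value here are hypothetical):

```php
<?php

// PHP 7.3 alternative signature: setcookie(string $name, string $value, array $options).
// The 'samesite' option accepts 'Lax' or 'Strict'.
setcookie('session_token', 'abc123', [
    'expires'  => time() + 3600, // one hour from now
    'path'     => '/',
    'secure'   => true,          // send over HTTPS only
    'httponly' => true,          // hidden from JavaScript
    'samesite' => 'Lax',         // or 'Strict'
]);
```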

Including features that increase security in your code always seems like a no-brainer, but as always, before deciding to apply them in our scripts we need to properly evaluate the pros and cons of our choices.

The main risk of using the same-site flag as a supplementary argument to those functions is that it might never become an official standard.

That means browsers may eventually drop the flag.

If this happens and you have already implemented it, your applications will be stuffed with junk code that needs to be removed.

Ultimate SEO

https://ultimateseo.org/updates-that-matter-and-updates-that-dont-seo-basics/