Here is a recently added FAQ from the Ultimate SEO FAQ section. Let me show you just how important page speed is.
Why is realtor.com not higher than zumper.com in the mobile search results on the right? Consider these metrics:
Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108
Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830
Realtor.com wins on every metric, so why does it sit below zumper.com in the mobile results?
So in this example we clearly see a more popular site beaten by a less established one, and the only factor the smaller site did better on was speed. And we can't dismiss this with "well, it's only important on mobile." In case you missed it, Google has moved to mobile-first indexing, so the mobile experience is the one that counts.
Now, with the facts above in mind, let's also dispel the over-fascination with keywords and text optimization, word position and frequency, content length, and the rest of on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content to desktop and mobile; they just differ wildly in speed. What are some of the reasons? Realtor.com presents visitors with 16 rows of 3 home images each, while Zumper shows 4 rows of 1 image, with additional rows loading as you scroll down. Lazy load, and 1 image per row versus 3. That's how they keep their requests to about a third of realtor.com's page.
I'd suggest you think of requests as shots from a gun aimed at your head. You need to avoid them! Fewer shots is a lot better…
Requests are literally requests made to the server before the page can load. If I make a page with one image on it, that is one request. Say I replace that image with a slider with 5 slides: now I have 5 requests for the same page area, so that cool feature quintuples the trips the computer has to make. Say I then add social media icons to the page, Facebook, Twitter, Instagram, LinkedIn, and an email icon, small and tucked up in the right corner. That social media addition just added 5 more requests. Think about all the things on your page; they don't all arrive together in one big Amazon package with a smile on it. They are shipped to the computer individually. Now I have one page with 1 request and another with 10, and the visible difference isn't much, since that slider only displays one image at a time.
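To make that concrete, here is a rough sketch (my own illustration, not from the FAQ) that tallies the obvious requests in a chunk of HTML using PHP's DOMDocument: every image, external script, and stylesheet is one more trip to a server.

```php
<?php
// Rough request counter: every <img>, <script src> and stylesheet <link>
// is one more round trip before the page can finish loading.
function countRequests(string $html): int
{
    $dom = new DOMDocument();
    @$dom->loadHTML($html); // suppress warnings from imperfect real-world HTML

    $requests = 0;
    foreach ($dom->getElementsByTagName('img') as $img) {
        if ($img->getAttribute('src') !== '') {
            $requests++;
        }
    }
    foreach ($dom->getElementsByTagName('script') as $script) {
        if ($script->getAttribute('src') !== '') {
            $requests++;
        }
    }
    foreach ($dom->getElementsByTagName('link') as $link) {
        if ($link->getAttribute('rel') === 'stylesheet') {
            $requests++;
        }
    }
    return $requests;
}

// One stylesheet, a 5-slide slider, and 5 social icons: 11 requests already.
$page = '<html><head><link rel="stylesheet" href="style.css"></head><body>
  <img src="slide1.jpg"><img src="slide2.jpg"><img src="slide3.jpg">
  <img src="slide4.jpg"><img src="slide5.jpg">
  <img src="facebook.png"><img src="twitter.png"><img src="instagram.png">
  <img src="linkedin.png"><img src="email.png">
</body></html>';

echo countRequests($page); // 11
```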
Servers don't respond instantly; they take a little while to think and retrieve the requested resource, and then the response has to travel the distance from the server to your computer. It may travel at the speed of light, but light still takes time. This delay is called latency, and 50 milliseconds is a good latency.
If both servers in the FAQ had a 50 ms latency, and we assume for simplicity that requests are handled one after another, then the
Realtor.com server will take 50 ms x 301 requests = 15,050 ms, or about 15 seconds
Zumper.com server will take 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds
(Real browsers run several requests in parallel, so the actual times are lower, but the proportion between the two sites holds.)
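If you want to play with that arithmetic yourself, here is the same back-of-the-envelope estimate as a tiny PHP sketch (it assumes strictly sequential requests, as above):

```php
<?php
// Naive load-time estimate: every request pays the full round-trip latency,
// one after another. Treat it as a worst-case proportion, not a measurement.
function estimateLoadMs(int $requests, int $latencyMs = 50): int
{
    return $requests * $latencyMs;
}

echo estimateLoadMs(301), PHP_EOL; // 15050 ms for realtor.com's 301 requests
echo estimateLoadMs(134), PHP_EOL; // 6700 ms for zumper.com's 134 requests
```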
I hope this explains why you want to limit requests and prioritize speed as much as you focus on keywords.
Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I had 4 icons, so I combined them into one image. That's a quarter of the requests with no change in user experience, other than a quicker site.
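For the curious, here is a hypothetical sketch of that combining step with PHP's GD extension (the file names and the 100x100 icon size are made up for illustration): stitch the four icons into one strip, so the page fetches one file instead of four.

```php
<?php
// Combine four 100x100 PNG icons into a single 400x100 strip.
// One image on the page means one request instead of four.
$icons  = ['icon1.png', 'icon2.png', 'icon3.png', 'icon4.png'];
$width  = 100;
$height = 100;

$strip = imagecreatetruecolor($width * count($icons), $height);

foreach ($icons as $i => $file) {
    $icon = imagecreatefrompng($file);
    imagecopy($strip, $icon, $i * $width, 0, 0, 0, $width, $height);
    imagedestroy($icon);
}

imagepng($strip, 'certifications.png');
imagedestroy($strip);
```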
Lazy load also helps speed up the initial page load. If you have a lot of images "below the fold," the page still needs those images to finish loading, unless you institute lazy load, which essentially tells the computer to load an image only when it is coming into view. This likely makes sense if you have 300 images on the page and plenty of them are scrolled far down, but all in all I'm on the fence on lazy load. I ran speed tests on the homepage of this site with lazy load on; 3 tests returned 2.3 seconds, 1.9 seconds, and 1.9 seconds. I turned lazy load off, reran the tests, and got 2.3 seconds, 1.9 seconds, and 1.7 seconds. So technically the site loaded faster with lazy load off; keep in mind it takes a bit of thinking for the server to implement it. Lazy load speeds up a site drastically if there are a ton of images spread vertically, but not much on a normal page. What are the full implications for SEO when a site is crawled?
"Ask Yoast" suggests that lazy load is fine for SEO: the images are rendered as Google scrolls down the page and indexes the content.
Tracert is a command that's elementary to networking and computers. Trace route, or tracert, does exactly what it sounds like, and it's useful because it tells you every IP address a packet passes through between the server and the catcher (not technical terms there). It shows where speed issues are, whether across the globe or in your own home.
It's usually just text, but https://www.monitis.com/traceroute/ made it more fun, and from this map I can see why my fiber connection isn't seemingly very fast tonight: I'm being routed through London, England to make a domestic "hop" (hops are each leg of a journey in a tracert).
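If you want to try it yourself, the command ships with every major OS (the domain here is just an example):

```
tracert ultimateseo.org       (Windows)
traceroute ultimateseo.org    (macOS / Linux)
```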
Thanks for checking in with Ultimate SEO. This site is a side project, as my clients' sites are the main ones, which means I may often have edits and unfinished elements to circle back to. I encourage you to let me know how this site could be better using the contact form.
UltimateSEO.org has backlinks from about a thousand domains. In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles. That's how I was introduced to "The Globe," which charges sites to NOT list them; in other words, it makes money from SEOs paying them to remove their backlinks. At $36 a link they're likely insane, but I bet it brings in some money. But before we go all crazy and start paying for Ransomlinks (if it's not a word, I claim it: Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay to not be linked to), let's think it through.
In reviewing the situation I ran across a list of the most disavowed sites. I figured I'd share it with you below, but before I do: what outcome did I choose for these bad links pointing at my site?
I'm opting for the third option, as I don't have any indication that Google cares about these Ransomlinks. They may actually bring some random traffic of use, so redirecting them would take that away from my site.
And now the most disavowed sites…
Google Data Studio reports are fun. Here at Ultimate SEO we love visualizations, and that's partially why we like Data Studio. Beyond the looks, it also integrates easily with Google Sheets, Google Analytics, and Search Console, to name a few, and those few combine into a powerful free SEO/PPC tool.
You can check out the report directly by clicking the link above; here is an embedded look at the nine pages of live data that's basically always current. It's nice to be able to pull in data from two very different Google tools. Lots of people know of Google Analytics and think it covers Google Search Console, but it doesn't. I'll discuss that more in another post, but the unique data from these sources can all mix into one handy live report.
You can check out all the information pulled into this report and change the dates as needed using the dropdown. To personalize the report for your own site, simply copy it and set the data sources to your own Google Analytics and Search Console sources. A word of caution on the Search Console side: there are two connections, one for the site and the other, I believe, for the page URLs, so make sure to connect those correctly. Just like in electrical work, it's like to like.
Across these nine pages you'll find insights into any site with an AdWords campaign, including keywords, search terms, CTR, and CPC.
I've heard things come in threes, so I'm curious what's next, because two big updates are out this week. Unfortunately, the one that matters is unlikely to get as much attention as the one that doesn't mean anything. So let's begin with what folks are focused on: something that means nothing.
Moz redid their Domain Authority calculations, and the implications are HUGE for those who work at Moz. If you don't work there, it's not a big deal. Domain Authority is like a fake credit score. Your bank will likely use your FICO score, but no one can release your FICO score to you except FICO. To solve this, banks and other organizations created their own scoring systems that attempt to mimic your FICO score, which they can give out to you on credit monitoring sites. Those numbers aren't really that important, as they are guesses at what your FICO score should be. VantageScore, for instance, gives a person a score based on their credit history that's also 3 digits, but it isn't a FICO score. If your bank uses FICO scores, who cares what your VantageScore is?
Moz made up Domain Authority, and Google doesn't use it. So a change in how Moz calculates Domain Authority does not mean a change in search engine rankings. Domain Authority is useful to you because it's an educated guess at what Google thinks, just like Citation Flow and Trust Flow are guesses by Majestic.
I don't know about everyone else, but the new calculations vastly changed the impression some would have of several domains I operate. Here are some examples:
But again, Domain Authority has value when used alongside multiple other metrics aimed at assessing a site's ability to rank.
It's out, and you can now run your site on it if you're using cPanel/WHM servers. I'm focused on cPanel because it's highly popular, unless you use Wix or Shopify, which run proprietary server management software that isn't an industry standard. Now, likely you don't even use 7.2, as many sites still operate on PHP 5.6. BUT here are the advantages of 7.3.
Now, this is relevant to SEO because WordPress runs on PHP, and WordPress is an SEO favorite. While new features are great, and PHP 7 has proven much faster than PHP 5.6, this newest update may require some caution. PHP 7.3 Issue With WordPress 5
We have some testing done by others, who note:
"We wanted to test whether PHP 7.3 was performing better than PHP 7.2, and therefore bypassed the reverse proxy and ran tests directly against the backend web server running PHP, effectively bypassing all caching. The tests were run from the server to eliminate network bias.
We used as the testing tool, running 3,000 requests with a concurrency of 1,000, with keep-alive enabled. ….
We did a few more compiles of PHP 7.3 and benchmarked those. We also ran benchmarks on all major versions from 5.6 up. See the results in the table below.
Req/s | PHP 5.6 | PHP 7.0 | PHP 7.1 | PHP 7.2 | PHP 7.3 | PHP 7.3 v2 | PHP 7.3 v3
We ran this test three times on PHP 7.2 and three times on PHP 7.3, and compared the numbers.
PHP 7.2 average: 192 requests per second
PHP 7.3 average: 224 requests per second
The results were consistent with very small variation. WordPress with WooCommerce running PHP 7.3 outperforms PHP 7.2 by 16.67%.
So what are you waiting for? It is time to get that extra performance boost. Upgrade your site to be PHP 7.3 compatible today, and get the 10-17% extra performance boost!”
From hackernoon.com we get the features listed below:
Not having an adequate way to handle errors when using JSON has been a problem for quite a long time, and web developers all over the world have seen this as a huge downside of the language.
The PHP 7.3 RFC accepted this update with a vote of 23 to 0, which says a lot about how much this feature has been requested.
Until PHP 7.2 we needed a workaround (checking json_last_error() after every call) to get an error out of JSON, and it was neither reliable nor proficient at its job.
The new flag shown below is an excellent alternative, because it gives the programmer the power of exceptions, which can be managed within a "try-catch" block of code.
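A minimal sketch of the flag in action (the broken JSON string is just an example):

```php
<?php
// PHP 7.3: pass JSON_THROW_ON_ERROR and handle failures as exceptions
// instead of checking json_last_error() after every call.
try {
    $data = json_decode('{"broken json', true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    echo 'Invalid JSON: ' . $e->getMessage();
}
```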
A countable element in your code can be a variable in array format or an object whose class implements the Countable interface.
Last year, PHP 7.2 added a warning that shows up whenever a web developer counts, or tries to loop over, an uncountable element.
It is possible to solve this problem, and one of the best solutions currently used is to apply a check like the snippet below:
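Something along these lines is the standard idiom:

```php
<?php
$items = null; // imagine this came back from a query that found nothing

// Pre-7.3 workaround: verify the variable is actually countable first.
if (is_array($items) || $items instanceof Countable) {
    echo count($items);
}
```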
The code checks whether the variable is an array or is an instance of the Countable interface.
The is_countable function takes a variable as a parameter and returns a boolean depending on whether that variable is countable or not.
There is no restriction on the parameter's format; of course, if you pass in a non-countable variable, the return will be false.
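A quick sketch of it in use:

```php
<?php
// PHP 7.3: is_countable() replaces the manual two-part check above.
var_dump(is_countable(['a', 'b', 'c']));   // true: a plain array
var_dump(is_countable(new ArrayObject())); // true: implements Countable
var_dump(is_countable('hello'));           // false: strings are not countable

$values = ['a', 'b', 'c'];
if (is_countable($values)) {
    echo count($values); // 3
}
```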
As of version 5.6 of PHP, there are over 75 built-in functions in the array category.
Despite the vast number of tools available, at the moment, if we need to retrieve the first or last key of an array, we have to get all the keys using array_keys() and only then take the first or last value of the result.
Another way is to opt for end() or reset().
As you may know, all the methods just described modify the array pointer, which is something that (besides consuming resources) you may simply not want to do.
The RFC for the upcoming version proposed the introduction of 4 brand-new functions meant to solve this issue.
The four functions were array_key_first(), array_key_last(), array_value_first(), and array_value_last().
Among the four, only the pair that fetches keys was accepted, with 18 votes to 14.
They work for both numeric and associative arrays.
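For instance:

```php
<?php
// PHP 7.3: fetch the first or last key directly, without moving the
// internal array pointer or building a full array_keys() copy.
$prices = ['first' => 10, 'middle' => 20, 'last' => 30];
echo array_key_first($prices), PHP_EOL; // "first"
echo array_key_last($prices), PHP_EOL;  // "last"

$list = [5, 6, 7];
echo array_key_first($list), PHP_EOL; // 0
echo array_key_last($list), PHP_EOL;  // 2
```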
The same would have worked for the other two functions illustrated in this chapter, array_value_first() and array_value_last().
Just to be clear, let me repeat: those functions were refused, with 18 no votes and 15 yes.
In my opinion, these two functions would have been useful as well, but according to several web developers, in certain cases the value returned would have been ambiguous.
Here is why:
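The existing reset() already shows the kind of ambiguity they meant; here is a sketch (my illustration, not from the article):

```php
<?php
// reset() returns false both when the array is empty and when the first
// element's value really is false. You can't tell the two cases apart.
$empty     = [];
$withFalse = [false, 1, 2];

var_dump(reset($empty));     // bool(false): means "no elements"
var_dump(reset($withFalse)); // bool(false): but here false is a real value
```

An array_value_first() would have faced the same question whenever an array legitimately starts with false or null.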
An additional option that I came across while browsing forums and talking to other web developers was to return a tuple like [$key => $value].
Even though this option will not be available in the new version, given the favourable responses it might arrive in a following RFC.
Since these functions did not exist before, there are not going to be any backwards compatibility problems; the only issue could arise if you have created, and are using, your own versions of array_key_first() and array_key_last().
Deploying secure applications must always be a main focus of every programmer.
One task each of us faces on a daily basis is diminishing the risk of CSRF and information-leakage attacks.
Same-site cookies declare that cookies must be sent only with requests initiated from the same domain.
This is not an official standard, but Google Chrome and several of the best PHP frameworks already implement it, and Firefox and new versions of other frameworks have confirmed they plan to do so.
Currently, cookies are issued via the Set-Cookie header; a web developer can set a key-value pair alongside flags that tell the browser whether a cookie should be available or not.
Doing things this way leaves an opening for CSRF attacks.
The new RFC is supposed to solve the problem in a non-breaking way by adding a new parameter and modifying four main functions of the language.
Two ways were proposed: adding a new argument to the functions, or accepting an array of options with all the cookie options moved inside.
How will it work?
Two values can be given to the samesite flag in the above functions: Lax and Strict.
Cookies using Lax will be sent with a GET request that comes from another domain, while cookies using Strict will not be sent with a cross-domain GET request.
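Here is a sketch using the options-array form that PHP 7.3 adds to setcookie(); the cookie name and value are just examples.

```php
<?php
// PHP 7.3: setcookie() accepts an options array, including 'samesite'.
setcookie('session', 'abc123', [
    'expires'  => time() + 3600,
    'path'     => '/',
    'secure'   => true,  // only send over HTTPS
    'httponly' => true,  // hide from JavaScript
    'samesite' => 'Lax', // or 'Strict' to also block cross-site GETs
]);
```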
Including features that increase security in our code always seems like a no-brainer, but as always, before applying them in our scripts we need to properly evaluate the pros and cons of our choices.
The main risk of using the same-site flag as a supplementary argument to those functions is that it might never become an official standard, which means browsers could eventually drop the flag.
If that happens after you have already implemented it, your applications will be left stuffed with junk code that needs to be removed.