Hiring An SEO? Hear From Google What SEOs Do

Hire An SEO

This Google video “How to hire an SEO” isn’t new but it’s to the point and vital to setting expectations.  I encourage both SEOs and clients to watch this video and learn what Google says you should look for in an SEO.

How to hire an SEO

Transcript
Hi, I’m Maile Ohye and I work with Google Search. I’d like to share advice to help you hire a useful SEO and prevent hiring a bad SEO: one who you might pay a lot of money without positive results, or even worse, one who implements shady practices on your website that result in a reduction in search rankings. SEO stands for search engine optimization. To some, SEO seems like black magic. Having worked with Google Search for over a decade, what I’ve learned is that, first, it’s not black magic, and second, if you want long-term success, there aren’t any quick, magical tricks that an SEO will provide so that your site ranks number one.

It’s important to note that an SEO’s potential is only as high as the quality of your business or website. Successful SEO helps your website put your best foot forward so that it ranks appropriately, in the spot where an unbiased potential customer would expect your site to be seen. A successful SEO also looks to improve the entire searcher experience, from search results to clicking on your website and potentially converting. A good SEO will recommend best practices for a search-friendly site, from basic things like descriptive page titles for a blog or small business, to more complex things like language markup for a multilingual, global site. SEOs ensure that you’re serving your online customers a good experience, especially those coming from a search engine, and that your site is helpful whether they’re using a desktop computer or mobile phone. In most cases, the SEO will need four months to a year to help your business first implement improvements and then see potential benefit.

My strongest advice when working with an SEO is to request that they corroborate their recommendation with a documented statement from Google, either in a Help Center article, video, or Googler response in a forum, that supports both: one, the SEO’s description of the issue that needs to be improved to help with ranking, and two, the approach they prescribe to accomplishing this task. Requesting these two bits of information will help prevent hiring a poor SEO who might otherwise convince you to do useless things like add more words to the keyword meta tag or buy links. Because if you search for Google’s advice on this topic, you’d see blog posts and videos from us that clearly explain that adding keywords to the meta tag wouldn’t help. Furthermore, while Google uses links for PageRank, our documentation highlights that we strongly advise against the approach of buying links for the purpose of increasing PageRank.

One basic rule is that in a majority of cases, doing what’s good for SEO is also doing what’s good for your online customers: things like having a mobile-friendly website, good navigation, and building a great brand. Additionally, if you’re a more established brand with complicated legacy systems, then good search-friendly best practices likely involve paying off some of your site’s technical debt, such as updating your infrastructure so that your website is agile and able to implement features faster in the long term. If you own a small local business, you can probably do the initial work yourself. Check out our 30-minute video series on how to build an online presence for your local business.

Now, if you still believe you want to hire an SEO, here’s a general process. One, conduct a two-way interview with your potential SEO; check that they seem genuinely interested in you and your business. Two, check their references. Three, request (and you’ll probably have to pay for) a technical and search audit. Four, decide if you want to hire.

Let’s break this down and start with step one, conduct a two-way interview. In the interview, here are some things to look for. A good SEO doesn’t focus only on search engine ranking, but on how they can help your business, so they should ask questions like: one, what makes your business, content, and/or service unique and therefore valuable to customers? They want to know this information to make sure it’s highlighted on your website for your current and potential new audience. Two, what does your common customer look like, and how do they currently find your website? Three, how does your business make money, and how can search help? Four, what other channels are you using: offline advertising, social networks? Five, who are your competitors? What do they do well online, and potentially offline? If the SEO doesn’t seem interested in learning about your business from a holistic standpoint, look elsewhere. It’s difficult to do good SEO without knowing about a business’s goals, their customers, and other existing marketing efforts. SEO should complement your existing work.

The second step in hiring an SEO is to check references. If your potential SEO provides prior clients, be sure to check their references. You want to hear from past clients that the SEO was able to provide useful guidance and worked effectively with their developers, designers, UX researchers, and/or marketers. A good SEO should feel like someone you can work with, learn from, experiment with, and who genuinely cares about you and your business, not just getting your site the highest rank, as ultimately those techniques rarely last long, if they work at all. They’ll want to educate you and your staff on how search engines work, so that SEO becomes part of your general business operations.

Step three is to request a technical and search audit. If you trust your SEO candidate, give them restricted view (not full or write) access to your Google Search Console data and even your analytics data. Before they actually modify anything on your website, have them conduct a technical and search audit to give you a prioritized list of what they think should be improved for SEO. If you’re a larger business, you can hire multiple SEOs to run audits and prioritize improvements; see what each has to say and then determine who you could work with the best. In the audit, the SEO should prioritize improvements with a structure like: one, the issue; two, the suggested improvement; three, an estimate of the overall investment, in other words the time, energy, or money it would take for your developers to implement the improvement and for Google Search, as well as searchers and customers, to recognize the improvement (the SEO will need to talk with your developers to better understand what technical constraints may exist); four, the estimated positive business impact: the impact might be a ranking improvement that will lead to more visitors and conversions, or perhaps the positive impact comes from a back-end change that cleans up your site and helps your brand be more agile in the future; five, a plan of how to iterate and improve on the implementation, or perhaps how to experiment and fail fast should the results not meet expectations.

That covers the structure of the technical and search audit. Now let’s talk about each of these audits individually. In the technical audit, your SEO should be able to review your site for issues related to internal linking, crawlability, URL parameters, server connectivity, and response codes, to name some. If they mention that your site has duplicate content problems that need to be corrected, make sure they show you the specific URLs that are competing for the same query, or that they explain it should be cleaned up for long-term site health, not initial growth. I mention this because lots of duplicate content exists on websites and often it’s not a pressing problem.

In the search audit, your potential SEO will likely break down your search queries into categories like branded and unbranded terms. Branded terms are those with your business or website’s name; a search for Gmail is a branded term, while a search for email is an unbranded or general keyword. An SEO should make sure that for branded queries, such as Gmail, your website is providing a great experience that allows customers who know your brand or website to easily find exactly what they need and potentially convert. They might recommend improvements that help the entire searcher experience, from what the searcher sees in search results to when they click on a result and use your website.

For unbranded queries, an SEO can help you better make sense of the online competitive landscape. They can tell you things like: here are the types of queries it would make sense for your business to rank for, but here’s what your competition has done and why I think they rank where they do. For instance, perhaps your competition has great reviews, really shareable content, or they run a highly reputable site. An SEO will provide recommendations for how to improve rankings for these queries and the entire searcher experience. They’ll introduce ideas like: update obsolete content. They might say your site is suffering because some of your well-ranking content is obsolete, has poor navigation, a useless page title, or isn’t mobile-friendly; let’s improve these pages and see if more website visitors convert and purchase, or if they can micro-convert, meaning that perhaps they subscribe or share content. Improve internal linking: your SEO might say your site is suffering because some of your best articles are too far from the homepage and users would have a hard time finding them; we can better internally link to your content to feature it more prominently. Generate buzz: the SEO might say you have great content, but not enough people know about it; we can try to get more user interaction and generate buzz, perhaps through social media or business relationships, and this will help us attract more potential customers and perhaps garner natural links to your site. Learn from the competition: your SEO might explain here’s what your competitors do well; can you reach parity with this and potentially surpass them in utility, or can you better show customers your business’s unique value? Again, a good SEO will try to prioritize what ideas can bring your business the most improvement for the least investment, and what improvements may take more time but help growth in the long term. Once they talk with you and other members of your team, such as developers or marketers, they’ll help your business forge a path ahead.

The last thing I want to mention is that when I talk with SEOs, one of the biggest holdups to improving a website isn’t their recommendations, but the business making time to implement their ideas. If you’re not ready to commit to making SEO improvements, while getting an SEO audit may be helpful, make sure that your entire organization is on board; else your SEO improvements may be non-existent regardless of who you hire. So that wraps it up. Thanks for watching, and best of luck to you and your business.



Speed: Page Load – Technical SEO Out Ranks Most In Mobile

A Case Study of SEO Metrics And Rank

Here is a recently added FAQ to the Ultimate SEO FAQ section.

Let me show you how important it is with the desktop vs. mobile search results below.

(Image: desktop vs. mobile search results)

Why is realtor.com not higher than zumper.com in the mobile search on the right? Consider these metrics:

Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108

Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830

In every metric realtor.com wins, so why is it below Zumper.com on the mobile search?

Site Speed Test on GTMetrix

Realtor.com Fails Speed

(Image: Realtor.com site speed results)

Zumper.com Passes Speed

(Image: Zumper.com page load results)

So in this example we clearly see a more popular site beaten by a less established site, and the only factor the smaller site did better on was speed. And we can’t discount this as… well, it’s only important in mobile. In case you missed it…

60% of searches are mobile


Now when we consider the facts above, let’s also dispel people’s over-fascination with keywords and text optimization, the position and frequency of words, the content length… on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content in their desktop and mobile versions; they just differ wildly in speed. What are some of the reasons? Realtor.com decided to present 16 rows of 3 images of homes to visitors, while Zumper shows 4 rows of 1 image… and then additional rows load as you scroll down. Lazy Load, and 1 image vs 3. That’s how they keep their requests to about a third of those on the realtor.com page.

What Are Requests?

I’d suggest you think of requests as if they are shots from a gun at your head. You need to avoid them! Fewer shots is a lot better…

Requests are literally requests of the server before the page can load. If I make a page with one image on it, that is one request. Let’s say I decide to replace that image with a slider with 5 slides; now I have 5 requests for the same page area, and that cool feature has quintupled the trips required of the computer! Let’s say now I add social media icons to the page… Facebook, Twitter, Instagram, LinkedIn and an email icon… small and just up in the right corner. That social media addition just added 5 more requests. Think about all the things on your page; they don’t all come together in one big Amazon package with a smile… they are shipped to the computer individually. Now I have one page with 1 request and another with 10, and yet the visible difference isn’t much… that slider only displays one image at a time.

Latency And Requests

Servers don’t respond instantly… they take a little while to think and retrieve the requested resource, and then it has to travel the distance from the server to your computer… maybe at the speed of light, but light still takes time. This time is called latency. 50 milliseconds is a good latency.

If both servers in the FAQ had a 50 ms latency, and requests were handled one after another, we could estimate that the

Realtor.com server will take 50 ms x 301 requests = 15,050 ms, or about 15 seconds

Zumper.com server will take 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds

I hope this explains why you want to limit requests, and prioritize speed as much as you focus on keywords.
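For anyone who wants to play with these numbers, here is a minimal PHP sketch of the same back-of-the-envelope math. Like the example above, it assumes requests are fetched one at a time; real browsers fetch several in parallel, so treat it as a worst-case estimate, and the request counts are simply the ones reported by the GTmetrix tests.

<?php
// Back-of-the-envelope load time: latency per request x number of requests.
// Assumes requests are fetched serially (worst case); browsers actually parallelize.

function estimatedLoadTimeMs(int $requests, int $latencyMs = 50): int
{
    return $requests * $latencyMs;
}

$sites = [
    'realtor.com' => 301, // request counts from the GTmetrix tests above
    'zumper.com'  => 134,
];

foreach ($sites as $site => $requests) {
    printf(
        "%s: %d requests x 50 ms = %d ms (~%.1f seconds)\n",
        $site,
        $requests,
        estimatedLoadTimeMs($requests),
        estimatedLoadTimeMs($requests) / 1000
    );
}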

Ways To Decrease Requests

Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I have 4 icons… I combined them to make one image. That’s 1/4 the requests but no change in user experience, other than a quicker site.

(Image: technical certifications)
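If you’d rather script that kind of merge than open an image editor, here is a rough PHP GD sketch; the icon filenames are placeholders, and it assumes four equally sized PNG icons.

<?php
// Combine four equally sized PNG icons into one horizontal strip with PHP GD.
// Filenames are placeholders; point them at your own icons.

$iconFiles = ['icon-a.png', 'icon-b.png', 'icon-c.png', 'icon-d.png'];

$icons  = array_map('imagecreatefrompng', $iconFiles);
$width  = imagesx($icons[0]);
$height = imagesy($icons[0]);

// One wide canvas with a transparent background.
$strip = imagecreatetruecolor($width * count($icons), $height);
imagealphablending($strip, false);
imagesavealpha($strip, true);
imagefill($strip, 0, 0, imagecolorallocatealpha($strip, 0, 0, 0, 127));

foreach ($icons as $i => $icon) {
    imagecopy($strip, $icon, $i * $width, 0, 0, 0, $width, $height);
}

imagepng($strip, 'certifications-combined.png'); // one image, one request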

Lazy Load

Lazy Load also helps speed up the initial page load time. If you have a lot of images “below the fold” on a page… the page still needs those images to finish loading, unless you institute lazy load, which essentially tells the computer to load an image only when it is coming into view. This makes sense if you have 300 images on the page and plenty of them are scrolled far down… but all in all I’m on the fence on Lazy Load. I ran speed tests on the homepage of this site with Lazy Load on… 3 test results: 2.3 seconds, 1.9 seconds and 1.9 seconds. I turned off Lazy Load, reran the test and got 2.3 seconds, 1.9 seconds and 1.7 seconds. So technically the site loaded faster with Lazy Load off… keep in mind it takes a bit of thinking for the server to implement it. This helps speed up a site drastically if there are a ton of images spread vertically, but not much on a normal page. What are the full implications for SEO when a site is crawled?

It’s suggested by “Ask Yoast” that Lazy Load is fine for SEO and that the images are rendered as Google scrolls down the page and indexes the content.
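If you’re on WordPress and want to experiment with lazy loading without a plugin, here is a minimal sketch of one approach; it’s my own illustration, not an official WordPress or Yoast snippet, and it simply adds the browser-level loading="lazy" hint, which unsupported browsers ignore.

<?php
// Minimal sketch: add loading="lazy" to post images via a content filter.
// Drop into a small plugin or your theme's functions.php to try it out.

add_filter('the_content', function (string $content): string {
    // Only touch <img> tags that don't already declare a loading attribute.
    return preg_replace(
        '/<img(?![^>]*\bloading=)/i',
        '<img loading="lazy"',
        $content
    );
});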


http://ultimateseo.org/tech-seo-mobile-speed/

Trace Route Network Tool, Only On A Map

Tracert is a command that’s elementary to networking and computers. Trace Route, or tracert, does exactly what it sounds like, and it’s useful because it tells you every IP address it passes through between the server and the catcher (not technical terms there). It explains where speed issues are, whether in a global perspective or in your home.

It’s usually just text, but https://www.monitis.com/traceroute/ made it more fun… and from this map I can see why my fiber connection isn’t seemingly very fast tonight: I’m being routed through London, England to do a domestic “hop” (hops are each leg of a journey in a tracert).

(Image: tracert in SEO)
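If you’d rather stay in the terminal, or script it, here is a small sketch that just shells out to the system’s trace route command and prints the plain-text hops; the host is only an example.

<?php
// Run the system trace route command and print each hop.
// Uses tracert on Windows and traceroute elsewhere.

$host = 'ultimateseo.org'; // example host
$cmd  = stripos(PHP_OS, 'WIN') === 0
    ? 'tracert ' . escapeshellarg($host)
    : 'traceroute -n ' . escapeshellarg($host);

echo shell_exec($cmd);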



52 Tools And Counting: Mostly Free SEO Tools – I Actually Use

Work In Progress

Thanks for checking in with Ultimate SEO. This site is a side project, as my clients’ sites are the main ones. This means I may often have edits and unfinished elements to circle back to. I encourage you to let me know how this site could be better using the contact form.


Bad Backlinks: 100 Sites You Don’t Want A Backlink From.

Bad Backlinks

UltimateSEO.org has backlinks from about a thousand domains. In a recent review of these I found an odd recurring link from multiple domains, all with the same content and titles. That’s how I was introduced to “The Globe,” which charges sites NOT to list them; in other words, it makes money from SEOs paying it to remove backlinks. At $36 a link they’re likely insane, and I bet it’s bringing in some money. But before we go all crazy and start paying the Ransomlinks (if it’s not a word, I claim it… Ransomlinks are backlinks from bad sites meant to lower your SEO score unless you pay not to be linked to), let’s think it through.

In reviewing the situation I ran across a list of the most disavowed sites. I figured I’d share that with you below, but before I do: what outcome did I choose for these bad links pointed at my site?

  1. Pay: Heck no! Then the terrorists win.
  2. Disavow: No! Don’t use disavow unless Google has placed a manual action against your site.  I’m skeptical anyhow of the tool’s purpose, and Google itself says there is no need to use it unless you’ve been penalized and told by them that you are being penalized.
  3. Do Nothing: Yes! Don’t do anything. Google likely knows about the Ransomlinks scheme and has already penalized the site by deindexing it.  There are so many random domains it’s going to be a mess to address, so let it be unless you have seen a negative effect.  In other words… before you saw your leg off wondering if that spot is cancer, stop and find out.
  4. An idea: 301 redirect them… seriously… all of these links point to a subdomain that until now hasn’t existed.  Most others who are talking about this site note a similar subdomain being targeted.  I could create the targeted subdomain and redirect all links to it from my site back to theirs. 🙂 (A rough sketch of that redirect follows this list.)
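For the curious, option 4 could be as simple as dropping something like this on the newly created subdomain; it’s a minimal sketch with a placeholder destination, and in practice a one-line redirect rule in the server config would do the same job.

<?php
// Hypothetical index.php for the subdomain the Ransomlinks point at.
// Any request landing here gets a permanent redirect to a destination of your choosing.

$destination = 'https://example.com/'; // placeholder; swap in the linking site

header('Location: ' . $destination, true, 301);
exit;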

I’m opting for the third, as I don’t have any indication that Google cares about these Ransomlinks. They may actually bring some random traffic of use, so redirecting them would take that from my site.


And now the most disavowed sites…

Most popular websites disavowed by webmasters

1 blogspot.com
2 blogspot.ca
3 blogspot.co.uk
4 ning.com
5 wordpress.com
6 blog.pl
7 linkarena.com
8 yuku.com
9 blogspot.de
10 webs.com
11 blogspot.nl
12 blogspot.fr
13 lemondir.com
14 blog.com
15 alonv.com
16 tistory.com
17 searchatlarge.com
18 dvpdvp1.com
19 typepad.com
20 nju-jp.com
21 bluehost.com
22 wldirectory.com
23 tumblr.com
24 hyperboards.com
25 directoryfuse.com
26 prlog.ru
27 informe.com
28 ligginit.com
29 theglobe.org
30 pulsitemeter.com
31 articlerich.com
32 weebly.com
33 the-globe.com
34 blogspot.no
35 theglobe.net
36 articledashboard.com
37 dig.do
38 seodigger.com
39 cybo.com
40 fat64.net
41 bravenet.com
42 cxteaw.com
43 askives.com
44 mrwhatis.net
45 insanejournal.com
46 xurt.com
47 freedirectorysubmit.com
48 commandresults.com
49 sagauto.com
50 internetwebgallery.com
51 freewebsitedirectory.com
52 ewbnewyork.com
53 000webhost.com
54 tblog.com
55 directorylist.me
56 analogrhythm.com
57 snapcc.org
58 bravejournal.com
59 weblinkstoday.com
60 m-pacthouston.com
61 linkcruncher.com
62 tripod.com
63 cogizz.com
64 niresource.com
65 over-blog.com
66 ogdenscore.com
67 free-link-directory.info
68 alikewebsites.com
69 folkd.com
70 djsonuts.com
71 uia.biz
72 bangkokprep.com
73 forumsland.com
74 punbb-hosting.com
75 hostmonster.com
76 blogspot.in
77 siteslikesearch.com
78 bookmark4you.com
79 siliconvalleynotary.com
80 listablog.com
81 poetic-dictionary.com
82 linkspurt.com
83 cultuurtechnologie.net
84 azjournos.com
85 exteen.com
86 articletrader.com
87 blogspot.com.au
88 delphistaff.com
89 altervista.org
90 media-tourism.com
91 woodwardatelier.com
92 holdtiteadhesives.com
93 lorinbrownonline.com
94 tech4on.com
95 popyourmovie.com
96 trilogygroveland.com
97 foqe.net
98 directorybin.com
99 eatrightkc.com


https://ultimateseo.org/bad-backlinks-ransomlinks/

Adwords Template With Search Console, Google Analytics In Data Studio

SEO & PPC Data Studio Report Using Adwords, Google Analytics and Google Search Console All-In-One Template

Google Data Studio Reports are fun things. Here at Ultimate SEO we love visualizations, and that’s partially why we like Data Studio. Beyond the looks, it also integrates easily with Google Sheets, Google Analytics and Search Console, to name a few. Those few together create a powerful free SEO and PPC tool.

You can check out the report directly by clicking the link above; here is an embedded look at the nine pages of live data that’s basically always up to date. It’s nice to be able to pull in data from two very different Google tools. Lots of people know of Google Analytics and think it covers Google Search Console, but it doesn’t (I’ll discuss that more in another post); the unique data from these sources can all mix to form one handy live report.

You can check out all the information pulled into this report and change the dates as needed using the drop-down. To personalize the report for your own site, simply copy it and set the data sources to your own Google Analytics and Search Console sources. A word of caution on the Search Console side: there are two connections; one is the site and the other, I believe, is the page URLs. So make sure to connect those correctly. Just like in electrical work, it’s like to like.

Across these nine pages you’ll find insights into any site with an Adwords campaign including keywords, search terms, CTR and CPC.

https://ultimateseo.org/google-analytics-in-data-studio/

Updates That Matter AND Updates That Don’t: SEO Basics

This post was originally made 3/7/2019 but was lost during a restoration from backup while Ultimate SEO was dealing with a DoS attack.

I’ve heard things come in threes, so I’m curious what’s next, because two big updates are out this week. Unfortunately, the one that matters is unlikely to gain as much attention as the one that doesn’t mean anything. So let’s begin with what folks are focused on… something that means nothing.

DA 2.0 NEW DOMAIN AUTHORITY

MOZ Domain Authority 2.0

Moz redid their Domain Authority calculations and the implications are HUGE for those who work at Moz. If you don’t work there, then it’s not a big deal. Domain Authority is like a fake credit score. Your bank will likely use your FICO score, but no one can release your FICO score to you but FICO. To solve this, banks and other organizations created their own scoring systems that attempt to mimic your FICO score and that they can give out to you on credit monitoring sites. These numbers, though, aren’t really that important, as they are guesses at what your FICO score should be… VantageScore, for instance, gives a person a score based on their credit history that’s also 3 digits, but it isn’t a FICO score. If your bank uses FICO scores, who cares what your score is at VantageScore?

Moz made up Domain Authority and Google doesn’t use it. So a change in how Moz calculates Domain Authority does not mean a change in search engine ranking. Domain Authority is useful to you because it’s an educated guess at what Google thinks, just like Citation Flow and Trust Flow are guesses by Majestic.

I don’t know about everyone else, but the new calculations vastly changed the impression some would have of several domains I operate. Here are some examples:

(Table: Moz DA, April 2019 vs. March 2019, with backlinks)

So am I upset that Ultimateseo.org lost 5 points? No. Because it’s like it lost 5 Matt Dollars… but Matt Dollars don’t matter and NEITHER DOES Moz Domain Authority.

But again, Domain Authority has value when used among multiple other metrics geared at assessing a site’s ability to rank.

PHP 7.3

PHP 7.3 Released And Available To Cpanel Servers

It’s out and you can now run your site on it if you’re using cPanel/WHM servers. I’m focused on cPanel because it’s highly popular… unless you use Wix or Shopify, which use proprietary server management software that isn’t an industry standard. Now, likely you don’t even use 7.2, as many sites still operate on PHP 5.6. BUT here are the advantages of 7.3.

Now it’s relevant to SEO because WordPress runs on PHP, and WordPress is an SEO favorite. While new features are great, and PHP 7 has proven much faster than PHP 5.6, this newest update may require some caution: PHP 7.3 Issue With WordPress 5.

PHP 7.3 And WordPress Speed

We have some testing done by others, who note:

Is PHP 7.3 faster than PHP 7.2? Should I use PHP 7.3 for my WordPress site? We have done our own performance testing for WordPress running with WooCommerce and benchmarked PHP 7.2 against PHP 7.3.

We installed a standard WordPress 5.0 with the Storefront theme and imported the 50 products supplied by WooCommerce as sample data on a standard Servebolt High Performance plan.

We wanted to test whether PHP 7.3 was performing better than PHP 7.2, and therefore bypassed the reverse proxy and ran tests directly against the backend web server running the PHP, effectively bypassing all caching. The tests were run from the server to eliminate network bias.

We used as the testing tool, running 3 000 requests with a concurrency of 1000, with keep alive enabled. ….

We did a few more compiles of PHP 7.3, and benchmarked those as well. We also did benchmarks on all major versions from 5.6 and up. See the results in the table below.

PHP and Database

Version         Req/s   vs 5.6    vs 7.0    vs 7.1    vs 7.2    vs 7.3    vs 7.3 v2   vs 7.3 v3
PHP 5.6         74
PHP 7.0         177     239.19%
PHP 7.1         183     247.30%   103.39%
PHP 7.2         192     259.46%   108.47%   104.92%
PHP 7.3         221     298.65%   124.86%   120.77%   115.10%
PHP 7.3 v2      221     298.65%   124.86%   120.77%   115.10%   100.00%
PHP 7.3 v3      223     301.35%   125.99%   121.86%   116.15%   100.90%   100.90%
PHP 7.3 FINAL   224     302.70%   126.55%   122.40%   116.67%   101.36%   101.36%    100.45%

We ran this test 3 times on PHP 7.2 and three times on PHP 7.3, and compared the numbers.

PHP 7.2 average: 192 requests per second
PHP 7.3 average: 224 requests per second

The results were consistent with very small variation. WordPress with WooCommerce running PHP 7.3 outperforms PHP 7.2 by 16.67%.

So what are you waiting for? It is time to get that extra performance boost. Upgrade your site to be PHP 7.3 compatible today, and get the 10-17% extra performance boost!”

Techy Stuff In 7.3 Update

From hackernoon.com we get the features listed below:

JSON_THROW_ON_ERROR

Not having an adequate way to handle errors when using JSON has been a problem for quite a long time, and web developers all over the world have seen this as a huge downside of the language.

The RFC for PHP 7.3 accepted this update by a 23 to 0 vote, which says how much this feature has been requested.

Until PHP 7.2 we needed to use a workaround to get an error out of JSON handling, and it was neither reliable nor proficient at its job.

The new flag I am about to show you is an excellent alternative, because it gives the programmer the opportunity to use the power of exceptions, which can be managed within a “try-catch” block of code.
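As a quick illustration (my own example, not from the original article), the flag is passed to json_decode() or json_encode(), and failures then surface as a catchable JsonException:

<?php
// PHP 7.3: ask json_decode() to throw instead of silently returning null.

$badJson = '{"name": "Ultimate SEO",'; // deliberately malformed

try {
    $data = json_decode($badJson, true, 512, JSON_THROW_ON_ERROR);
} catch (JsonException $e) {
    // Before 7.3 you had to check json_last_error() after every call.
    echo 'Invalid JSON: ' . $e->getMessage() . PHP_EOL;
}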

Is_countable

A countable element in your code can be a variable in array format or an object whose class implements the Countable interface.

Last year, PHP 7.2 added a warning that shows up whenever the developer counts, or tries to loop over, an uncountable element.

It is possible to avoid this warning, and one of the best solutions currently used is to apply a check like the snippet below:
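Something along these lines, with a hypothetical $items variable standing in for whatever you are about to count:

<?php
// Pre-7.3 style guard before calling count() on a possibly uncountable value.
if (is_array($items) || $items instanceof Countable) {
    echo count($items);
}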

The code checks whether the variable is an array or is an instance of the Countable interface.

And it will work, but it seems a little bit “crowded,” and for those of you who work long hours, after a while seeing these kinds of lines wears your eyes out.

The team developing the new version took account of this and added a new function that will help web developers immensely.

The is_countable function takes a variable as a parameter and returns a boolean depending on whether that variable is countable or not.

There is no restriction on the format of the parameter; of course, if you pass a non-countable variable, the return will be false.
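A quick before-and-after of the new function (again, my own illustration):

<?php
// PHP 7.3: is_countable() replaces the manual is_array()/Countable check.

$values = ['a', 'b', 'c'];
$handle = fopen('php://memory', 'r');

var_dump(is_countable($values));           // bool(true)  -> safe to count()
var_dump(is_countable(new ArrayObject())); // bool(true)  -> implements Countable
var_dump(is_countable($handle));           // bool(false) -> counting it would warn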

array_key_first(), array_key_last()

As of version 5.6 of PHP, there are over 75 built-in functions in the arrays category.

Despite the vast number of tools available, at the moment, if we need to retrieve the first or the last key of an array we have to get all the keys using array_keys() and only then go for the first or last value of the array.

Another way is to opt for end() or reset().

As you may know, all the methods just described modify the array pointer, which is something that (other than consuming resources) you may just not want to do.

The RFC for the upcoming version proposed the introduction of 4 brand-new methods that were meant to solve this issue.

The four methods were:

  • array_key_first()
  • array_key_last()
  • array_value_first()
  • array_value_last()

Among the four of them, only the set that fetches the keys was accepted, with 18 votes to 14.

They work for both numeric and associative arrays.
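For example, with a hypothetical page-view array:

<?php
// PHP 7.3: fetch the first or last key without touching the array pointer.

$pageViews = [
    '/home'    => 120,
    '/about'   => 45,
    '/contact' => 9,
];

echo array_key_first($pageViews); // "/home"
echo PHP_EOL;
echo array_key_last($pageViews);  // "/contact"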

The same would have worked for the other two functions illustrated in this chapter, the array_value_* pair.

Just to be clear, let me repeat:

Those functions were refused, with 18 no votes and 15 yes.

In my opinion, these two functions would have been useful as well but according to several web developers, in certain cases, the value returned would have been ambiguous.

Here is why:

An additional option that I came across while browsing forums and talking to other web developers was to return a tuple like [$key => $value].

Even though this option will not be available in the new version, seeing the favourable responses, it might arrive in a following RFC.

Since these are functions that did not exist before, there are not going to be any backwards compatibility problems; the only issue could arise if you have created, and are using, your own version of array_key_first() and array_key_last().

Same site cookie

Deploying secure applications must always be a main focus of every programmer.

One task that each of us faces on a daily basis is to diminish the risk of CSRF and information leakage attacks.

Same-site cookies declare that cookies must be sent only with requests initiated from the same domain.

This is not an official standard, but Google Chrome and several of the best PHP frameworks already implement it, whereas Firefox and new versions of other frameworks have confirmed that they are planning to do so.

Here is the support table for the same-site cookie attribute from caniuse.com.

Currently, cookies are issued via the Set-Cookie header; a web developer can set a key-value pair alongside flags that tell the browser whether a cookie should be available or not.

This way of doing things leaves an opening for CSRF attacks.

The new RFC is supposed to solve the problem in a non-breaking way by adding a new parameter and also modifying the language’s main cookie functions.

  • setcookie
  • setrawcookie
  • session_set_cookie_params

Two ways were proposed.

Adding a new argument to the functions, or allowing an array of options so that all of a cookie’s options can move inside it.

How will it work?

Two values can be given to the SameSite flag in the functions above.

They are Lax and Strict.

Cookies using Lax will be accessible in a GET request that comes from another domain, while, on the contrary, cookies using Strict will not be accessible in a GET request from another domain.
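Here is a sketch of the options-array form of setcookie() in PHP 7.3; the cookie name and values are placeholders.

<?php
// PHP 7.3: setcookie() accepts an options array, including the new 'samesite' key.

setcookie('session_token', 'abc123', [
    'expires'  => time() + 3600,
    'path'     => '/',
    'domain'   => 'example.com',
    'secure'   => true,      // send only over HTTPS
    'httponly' => true,      // not readable from JavaScript
    'samesite' => 'Strict',  // or 'Lax' to allow top-level cross-site GETs
]);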

Including features that increase security in the code always seems like a no-brainer, but as always, before deciding to apply them in our scripts we need to properly evaluate the pros and cons of our choices.

The main risk implied in using the SameSite flag as a supplementary argument to those functions is that it might never become an official standard.

That means browsers could eventually drop the flag.

If this happens and you have already implemented it, you will end up with applications stuffed with junk code that needs to be removed.

Ultimate SEO

https://ultimateseo.org/updates-that-matter-and-updates-that-dont-seo-basics/

MAJOR GOOGLE SEO CHANGE FOR SOME: Website Traffic CREDITED To Where Google Chooses

Wednesday, February 06, 2019

In Search Console, the Performance report currently credits all page metrics to the exact URL that the user is referred to by Google Search. Although this provides very specific data, it makes property management more difficult; for example: if your site has mobile and desktop versions on different properties, you must open multiple properties to see all your Search data for the same piece of content.

To help unify your data, Search Console will soon begin assigning search metrics to the (Google-selected) canonical URL, rather than the URL referred to by Google Search. This change has several benefits:

  • It unifies all search metrics for a single piece of content into a single URL: the canonical URL. This shows you the full picture about a specific piece of content in one property.
  • For users with separate mobile or AMP pages, it unifies all (or most, since some mobile URLs may end up as canonical) of your data to a single property (the “canonical” property).
  • It improves the usability of the AMP and Mobile-Friendly reports. These reports currently show issues in the canonical page property, but show the impression in the property that owns the actual URL referred to by Google Search. After this change, the impressions and issues will be shown in the same property.

Google Search Console

When will this happen?

We plan to transition all performance data on April 10, 2019. In order to provide continuity to your data, we will pre-populate your unified data beginning from January 2018. We will also enable you to view both old and new versions for a few weeks during the transition to see the impact and understand the differences.

API and Data Studio users: The Search Console API will change to canonical data on April 10, 2019.

How will this affect my data?

  • At an individual URL level, you will see traffic shift from any non-canonical (duplicate) URLs to the canonical URL.
  • At the property level, you will see data from your alternate property (for example, your mobile site) shifted to your “canonical property”. Your alternate property traffic probably won’t drop to zero in Search Console because canonicalization is at the page, not the property level, and your mobile property might have some canonical pages. However, for most users, most property-level data will shift to one property. AMP property traffic will drop to zero in most cases (except for self-canonical pages).
  • You will still be able to filter data by device, search appearance (such as AMP), country, and other dimensions without losing important information about your traffic.

You can see some examples of these traffic changes below.

Preparing for the change

  • Consider whether you need to change user access to your various properties; for example: do you need to add new users to your canonical property, or do existing users continue to need access to the non-canonical properties.
  • Modify any custom traffic reports you might have created in order to adapt for this traffic shift.
  • If you need to learn the canonical URL for a given URL, you can use the URL Inspection tool.
  • If you want to save your traffic data calculated using the current system, you should download your data using either the Performance report’s Export Data button, or using the Search Console API.

Examples

Here are a few examples showing how data might change on your site. In these examples, you can see how your traffic numbers would change between a canonical site (called example.com) and alternate site (called m.example.com).

Important: In these examples, the desktop site contains all the canonical pages and the mobile contains all the alternate pages. In the real world, your desktop site might contain some alternate pages and your mobile site might contain some canonical pages. You can determine the canonical for a given URL using the URL Inspection tool.

Total traffic

In the current version, some of your traffic is attributed to the canonical property and some to the alternate property. The new version should attribute all of your traffic to the canonical property.

(Chart comparing the current view with the new, canonical-URL-based view.)

Canonical property (http://example.com): change of +0.7K | +3K
Alternate property (http://m.example.com): change of -0.7K | -3K

Individual page traffic

You can see traffic changes between the duplicate and canonical URLs for individual pages in the Pages view. The next example shows how traffic that used to be split between the canonical and alternate pages are now all attributed to the canonical URL:

(Chart comparing the old and new views of individual page traffic.)

Canonical property (http://example.com): change of +150 | +800
Alternate property (http://m.example.com): change of -150 | -800

Mobile traffic

In the current version, all of your mobile traffic was attributed to your m. property. The new version attributes all traffic to your canonical property when you apply the “Device: Mobile” filter as shown here:

(Chart comparing the old and new views with the “Device: Mobile” filter applied.)

Canonical property (http://example.com): change of +0.7K | +3K
Alternate property (http://m.example.com): change of -0.7K | -3K

In conclusion

We know that this change might seem a little confusing at first, but we’re confident that it will simplify your job of tracking traffic data for your site. If you have any questions or concerns, please reach out on the Webmaster Help Forum.


https://ultimateseo.org/update-google/
