Romantic Depot operates six, soon to be seven, adult stores offering sex toys and lingerie in the New York City area. Their flagship stores are in Manhattan and the Bronx. They’ve been around for some time, and over the years their website aged, as has happened to many other businesses seeking to drive local foot traffic.
With the move to mobile devices in full swing, the old RomanticDepot.com site was not responsive. This led the owner of the chain to build a new, mobile-friendly site, and it led to Ultimate SEO‘s involvement overseeing the migration to the new site without hurting the site’s strong local SEO presence.
Romantic Depot has an impressive keyword positioning presence in the New York City area. Even nationally they are on page 2 of results for “sex shop“. The goal was to ensure a smooth transition to the mobile site while maintaining the SEO that had been built over the years.
The new site was largely a one-to-one mapping. The static HTML page manhattan.html, for instance, now mapped to /manhattan/ on the WordPress site. We put in place any one-off redirects, a catch-all redirect that took any URL ending in .html and returned it without the .html extension, and a redirect from the old index.html homepage to the WordPress homepage at /.
Backlinks are the lifeblood of a site’s ranking, and it was important to ensure those would be maintained with relevant content as well. Using SEMRush.com we collected all of the backlinks and their existing targets and ensured those had redirect rules as well. While the site’s backlinks numbered in the tens of thousands, it quickly came down to a few hundred target URLs that needed to be accounted for to maintain SEO.
Most of the work involved in preparing for the migration was performance related. The new site, when tested on GTMetrix.com, was loading in 12.6 seconds with over 400 HTTP requests. We targeted a 3 second load, and through Cloudflare.com we were able to utilize a CDN that brought the site closer to users as well as offering other benefits. Cloudflare alone brought the site load time down to about 7 seconds.
We further restricted heavy embeds, such as Google Maps, to the location homepages. Instituting lazy load ended up being the primary driver in speeding up the site. Image optimization was also completed, along with a move from PHP 5.6 to PHP 7.3. Merging CSS and JS files also worked to reduce the number of requests.
With the speed work complete, the migration went ahead, and load times on the site are now under 3 seconds for mobile users.
Multi Domain Strategy Consolidation
The problem that arises from a multiple domain strategy is the segmentation of resources and the confusion it can cause in Google Analytics. An easy example of this is that the bounce rate and pageview metrics on the primary domain are actually hurt.
Consider this… a person searches for “romantic depot” on Google. The first result is the company’s site, and the visitor likely wants to know what items might be at the store. Once the page loads they find the link to the store, maybe even before the page finishes loading. Clicking that link takes them to a new domain.
That visitor’s actions would have counted as a bounce. When someone goes to your site and immediately leaves for another site, it signals to Google that what was on the original site wasn’t what the searcher wanted. To prevent future searchers from landing on a site people leave right after arriving, Google might raise the position of other sites to correct for this in the future. That ultimately means the top position for the keyword is being hampered by the site’s structure.
Further, Google sees that the person wasn’t even interested enough to look at a second page; they just left. In reality the second site is part of the same overall topic or brand, it’s just that Google doesn’t necessarily understand that. The first site is left with an artificially inflated bounce rate and lower page views, and the second site loses out as well, since most of the marketing surrounds the first site’s address… backlinks, social mentions and such.
Lower Bounce Rate
The illustration above shows page views of the main root domain. Guess when on the graph the romanticdepotsuperstore.com site was rolled into the main domain … late July. The thing is, the traffic isn’t any greater, it’s just not split up anymore. The homepage link to the store now goes to a subfolder of the same domain, so it helps by counting as another page view rather than hurting the site as a bounce.
The keywords and authority of this additional site were better utilized under the main domain, RomanticDepot.com, so the site was migrated to the subfolder /store as a separate WordPress site.
That’s important: the site was migrated as a separate site under the original. This was done for multiple reasons, and it creates its own set of unique challenges, but we’ll discuss that in a future post.
The consolidation of the sites further helps with SEO because, after the migration, we put redirects in place from the store’s old domain to its new home within the subfolder. That means all the backlinks now combine to help one site. Let’s consider the following illustration…
Domain A: DA 30 Backlinks: 10,000 Referring Domains: 1,000
Domain B: DA 30 Backlinks: 9,000 Referring Domains: 900
Competitor: DA 35 Backlinks: 13,000 Referring Domains: 1,300
Let’s assume everything else is the same… we’d expect the Competitor to rank higher on Google Search. But what if we combine Domain A and Domain B?
Domain AB: DA 40 Backlinks: 19,000 Referring Domains: 1,900
With everything else still the same, Domain AB will now rank higher than the Competitor.
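The consolidation arithmetic above can be made concrete. A toy sketch, with one caveat: backlink and referring-domain counts add directly, but Domain Authority is a logarithmic score and does not simply sum, so the combined DA 40 in the illustration is an estimate rather than an addition.

```python
# Metrics from the illustration above.
domain_a = {"backlinks": 10_000, "referring_domains": 1_000}
domain_b = {"backlinks": 9_000, "referring_domains": 900}
competitor = {"backlinks": 13_000, "referring_domains": 1_300}

# After the redirect-based merge, the counts combine.
# (DA is intentionally left out: it does not add linearly.)
combined = {k: domain_a[k] + domain_b[k] for k in domain_a}
print(combined)  # {'backlinks': 19000, 'referring_domains': 1900}

# The merged profile now beats the competitor on both counts.
assert combined["backlinks"] > competitor["backlinks"]
assert combined["referring_domains"] > competitor["referring_domains"]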
Two things are important to take away from this post, and they are listed right below. We’ve then included Google’s own content explaining these two changes to backlinks. If the text is italic and navy blue, it’s quoted from Google.
User Generated Content or Sponsored Backlinks Added
UGC and Sponsored are two additional rel=”” attribute options introduced by Google. They can be used together or separately to distinguish between the two kinds of backlinks. Nofollow remains an option: if a link isn’t a paid advertisement or user generated content, yet you want to ensure it is not seen as an endorsement, you may still use nofollow as an attribute.
Every Link Counts Now.
Rel=”nofollow” used to mean: don’t follow this link or give the target any credit for the backlink. That’s changing; all links may now potentially count. Some links have been described as “hints”, with “signals” still in the rankings, and it’s not clear whether those are one and the same or separate.
Today, we’re announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarized below:

rel=”sponsored”: Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.

rel=”ugc”: UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.
rel=”nofollow”: Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search. We’ll use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within our systems.
Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
We know these new attributes will generate questions, so here’s a FAQ that we hope covers most of those.
Do I need to change my existing nofollows?
No. If you use nofollow now as a way to block sponsored links, or to signify that you don’t vouch for a page you link to, that will continue to be supported. There’s absolutely no need to change any nofollow links that you already have.
Can I use more than one rel value on a link?
Yes, you can use more than one rel value on a link. For example, rel=”ugc sponsored” is a perfectly valid attribute which hints that the link came from user-generated content and is sponsored. It’s also valid to use nofollow with the new attributes — such as rel=”nofollow ugc” — if you wish to be backwards-compatible with services that don’t support the new attributes.
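The multi-value rel attributes above can be pictured as a small parser. This is a hypothetical sketch of how a crawler might extract the hint tokens Google describes; the function name is ours and the real implementation is of course not public.

```python
def link_hints(rel: str) -> set[str]:
    """Split a rel attribute value into the recognized hint tokens."""
    recognized = {"nofollow", "sponsored", "ugc"}
    return {token for token in rel.lower().split() if token in recognized}

# rel="ugc sponsored" carries both hints at once.
print(link_hints("ugc sponsored"))
# rel="nofollow ugc" stays readable for parsers that only know nofollow.
print(link_hints("nofollow ugc"))
# Unrelated rel values (e.g. noopener) simply contribute no hints.
print(link_hints("noopener external"))
```

Because rel is a space-separated token list in HTML, combining values this way is valid markup, which is what makes the backwards-compatible rel=”nofollow ugc” pattern work.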
If I use nofollow for ads or sponsored links, do I need to change those?
No. You can keep using nofollow as a method for flagging such links to avoid possible link scheme penalties. You don’t need to change any existing markup. If you have systems that append this to new links, they can continue to do so. However, we recommend switching over to rel=”sponsored” if or when it is convenient.
Do I still need to flag ad or sponsored links?
Yes. If you want to avoid a possible link scheme action, use rel=“sponsored” or rel=“nofollow” to flag these links. We prefer the use of “sponsored,” but either is fine and will be treated the same, for this purpose.
What happens if I use the wrong attribute on a link?
There’s no wrong attribute except in the case of sponsored links. If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint but the impact — if any at all — would be at most that we might not count the link as a credit for another page. In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.
It is an issue going the opposite way. Any link that is clearly an ad or sponsored should use “sponsored” or “nofollow,” as described above. Using “sponsored” is preferred, but “nofollow” is acceptable.
Why should I bother using any of these new attributes?
Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes.
Won’t changing to a “hint” approach encourage link spam in comments and UGC content?
Many sites that allow third-parties to contribute to content already deter link spam in a variety of ways, including moderation tools that can be integrated into many blogging platforms and human review. The link attributes of “ugc” and “nofollow” will continue to be a further deterrent. In most cases, the move to a hint model won’t change the nature of how we treat such links. We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes. We will still continue to carefully assess how to use links within Search, just as we always have and as we’ve had to do for situations where no attributions were provided.
When do these attributes and changes go into effect?
All the link attributes, sponsored, ugc and nofollow, now work today as hints for us to incorporate for ranking purposes. For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020. Those depending on nofollow solely to block a page from being indexed (which was never recommended) should use one of the much more robust mechanisms listed on our Learn how to block URLs from Google help page.
These backlink designations came about to combat link schemes. Yes, one could see any effort to build backlinks as a link scheme, but there are permissible and forbidden ways to build links. Here is Google on what constitutes a link scheme.
Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.
The following are examples of link schemes which can negatively impact a site’s ranking in search results:
- Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link
- Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking
- Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links
- Using automated programs or services to create links to your site
- Requiring a link as part of a Terms of Service, contract, or similar arrangement without allowing a third-party content owner the choice of using nofollow or other method of blocking PageRank, should they wish.
Additionally, creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Here are a few common examples of unnatural links that may violate our guidelines:
- Text advertisements that pass PageRank
- Advertorials or native advertising where payment is received for articles that include links that pass PageRank
- Links with optimized anchor text in articles or press releases distributed on other sites. For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
- Low-quality directory or bookmark site links
- Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites, for example:
Visitors to this page: 1,472
- Widely distributed links in the footers or templates of various sites
- Forum comments with optimized links in the post or signature, for example:
Thanks, that’s great info!
paul’s pizza san diego pizza best pizza san diego
Note that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:
- Adding a rel=”nofollow” or a more specific attribute to the <a> tag
- Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.
If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We’ll use your information to improve our algorithmic detection of such links.
A Case Study of SEO Metrics And Rank
Here is a recently added FAQ to the Ultimate SEO FAQ section.
Let me show you how important it is….
Why is realtor.com not higher than zumper.com in the mobile search on the right? Consider these metrics:
Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108
Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830
In every metric realtor.com wins, so why is it below Zumper.com on the mobile search?
Site Speed Test on GTMetrix
Realtor.com Fails Speed
Zumper.com Passes Speed
So in this example we clearly see a more popular site beaten by a less established one, and the single factor the smaller site did better on was speed. And we can’t dismiss this as something that only matters on mobile.
Now, when we consider the facts above, let’s also dispel the over-fascination with keywords and text optimization, word position and frequency, content length… on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content to the desktop and mobile versions; they just differ wildly in speed. What are some of the reasons? Realtor.com decided to present 16 rows of 3 images of homes to visitors, while Zumper shows 4 rows of 1 image, with additional rows loading as you scroll down. Lazy load and 1 image versus 3. That’s how Zumper keeps its requests to about a third of the realtor.com page.
What Are Requests?
I’d suggest you think of requests as if they are shots from a gun aimed at your head. You need to avoid them! Fewer shots is a lot better…
Requests are literally requests made of the server before the page can load. If I make a page with one image on it, that is one request. Let’s say I decide to replace that image with a slider with 5 slides; now I have 5 requests for the same page area, so that cool feature quintuples the trips the computer has to make. Let’s say now I add social media icons to the page… Facebook, Twitter, Instagram, LinkedIn and an email icon… small and just up in the right corner. That social media addition just added 5 more requests. Think about all the things on your page; they don’t all come together in one big Amazon package with a smile, they are shipped to the computer individually. Now I have one page with 1 request and another with 10, and yet the visible difference isn’t much: that slider only displays one image at a time.
Latency And Requests
Servers don’t respond instantly; they take a little while to think and retrieve the requested resource, and then it has to travel the distance from the server to your computer. It may travel at the speed of light, but light still takes time. This time is called latency, and 50 milliseconds is a good latency.
If both servers in the FAQ had a 50 ms latency, we could estimate that:
Realtor.com’s server will take 50 ms x 301 requests = 15,050 ms, or about 15 seconds
Zumper.com’s server will take 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds
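The back-of-the-envelope math above can be written out directly. One hedge: this assumes every request is made sequentially, while real browsers fetch many resources in parallel, so treat it as an upper-bound illustration rather than a measured load time.

```python
# Assumed latency per request, per the illustration above.
LATENCY_MS = 50

def total_latency_ms(requests: int, latency_ms: int = LATENCY_MS) -> int:
    """Worst-case total latency if every request waited its turn."""
    return requests * latency_ms

print(total_latency_ms(301))  # 15050 ms, roughly 15 seconds (realtor.com)
print(total_latency_ms(134))  # 6700 ms, about 6.7 seconds (zumper.com)
```

Even granting parallel connections, the ratio between the two sites holds, which is why trimming request counts moves load times so much.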
I hope this explains why you want to limit requests, and prioritize speed as much as you focus on keywords.
Ways To Decrease Requests
Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I had 4 icons, so I combined them into one image. That’s a quarter of the requests with no change in user experience other than a quicker site.
Lazy load also helps speed up the initial page load time. If you have a lot of images “below the fold”, the page still needs those images to finish loading, unless you institute lazy load, which essentially tells the browser to load an image only when it is coming into view. This makes sense if you have 300 images on the page and plenty of them are scrolled far down… but all in all I’m on the fence on lazy load. I ran speed tests on the homepage of this site with lazy load on: 3 test results of 2.3 seconds, 1.9 seconds and 1.9 seconds. I turned off lazy load, reran the tests and got 2.3 seconds, 1.9 seconds and 1.7 seconds. So technically the site loaded faster with lazy load off; keep in mind it takes a bit of processing to implement it. Lazy load helps speed up a site drastically if there are a ton of images spread vertically, but not much on a normal page. What are the full implications for SEO when a site is crawled?
It’s suggested by “Ask Yoast” that lazy load is fine for SEO: the images are rendered as Google scrolls down the page and indexes the content.
An easy way to build backlinks is through directory submissions. Anything hard, though, is worth more, so keep in mind that directory submissions don’t carry the weight that in-content links will. With that said, there is still some diluted value in link directories; just don’t focus on them.
Here are some link directories and they are organized by SEO power.
The button below opens a new window with about 100 sites. We also maintain the following directory sites at Ultimate SEO, which are always free to list with. This site also has a built-in directory; you can access it from the top menu bar.