Romantic Depot operates six, soon to be seven, adult stores in the New York City area offering sex toys and lingerie. Their flagship stores are in Manhattan and the Bronx. They've been around for some time, and like many businesses that rely on local foot traffic, their website aged along with them.
With the shift to mobile devices in full swing, the old RomanticDepot.com site was not responsive. This led the owner of the chain to build a new, mobile-friendly site, and it led to Ultimate SEO's involvement overseeing the migration to the new site without hurting the site's strong local SEO presence.
Romantic Depot has an impressive keyword presence in the New York City area. Even nationally, they are on page 2 of results for "sex shop." The goal was to ensure a smooth transition to the mobile site while maintaining the SEO that had been built up over the years.
The new site was largely a one-to-one mapping. The static HTML page manhattan.html, for example, became /manhattan/ on the WordPress site. We put in place any needed one-off redirects, a rule that took any URL ending in .html and returned it without the extension, and a redirect from the old index.html homepage to the WordPress homepage at /.
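As a sanity check after putting rules like these in place, a short script can confirm that the old URLs return a 301 to their new homes. This is a minimal sketch with placeholder URLs, not the actual rules or list we deployed:

```python
import requests

# Hypothetical old-to-new mappings; the real list came from the site's own pages.
mappings = {
    "https://www.romanticdepot.com/manhattan.html": "https://www.romanticdepot.com/manhattan/",
    "https://www.romanticdepot.com/index.html": "https://www.romanticdepot.com/",
}

for old, expected in mappings.items():
    resp = requests.get(old, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{'OK   ' if ok else 'CHECK'} {old} -> {resp.status_code} {location}")
```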
Backlinks are the lifeblood of a site's ranking, and it was important to ensure those were maintained and pointed at relevant content as well. Using SEMrush.com we collected all of the backlinks and their existing targets and made sure those had redirect rules too. While the site's backlinks numbered in the tens of thousands, it quickly came down to a few hundred target URLs that needed to be accounted for to maintain SEO.
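Getting from tens of thousands of backlinks down to a few hundred target URLs is mostly deduplication. A rough sketch of that step, assuming a SEMrush backlink export in CSV form (the column name is a guess and varies by export):

```python
import csv
from urllib.parse import urlparse

targets = set()
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        url = row.get("Target url", "").strip()  # column name varies by export
        if url:
            # Normalize to the path only, ignoring query strings and fragments.
            targets.add(urlparse(url).path.rstrip("/") or "/")

print(f"{len(targets)} unique target paths need a redirect rule")
for path in sorted(targets):
    print(path)
```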
Most of the work involved in preparing for the migration was about speed. The new site, when tested on GTmetrix.com, was loading in 12.6 seconds with over 400 HTTP requests. We targeted a 3-second load, and through Cloudflare.com we were able to use a CDN that brought the site closer to users and offered other benefits as well. Cloudflare alone brought the site load time down to about 7 seconds.
We also limited content such as embedded Google Maps to the location homepages rather than every page. Instituting lazy load ended up being the biggest single factor in speeding up the site. Image optimization was also completed, along with a move from PHP 5.6 to PHP 7.3. Merging CSS and JS files also helped reduce the number of requests.
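For the image optimization piece, a batch recompression pass covers most of it. A minimal sketch using Pillow, where the uploads path, maximum width, and quality setting are assumptions rather than the exact values used on the site:

```python
from pathlib import Path
from PIL import Image

MAX_WIDTH = 1600  # assumed layout width; anything wider gets scaled down

for path in Path("wp-content/uploads").rglob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, int(img.height * ratio)))
    # Re-save with a lower quality setting and Pillow's optimize pass.
    img.save(path, "JPEG", quality=80, optimize=True)
    print(f"optimized {path}")
```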
With the speed work complete, the migration went ahead, and load times on the site are now under 3 seconds for mobile users.
Multi Domain Strategy Consolidation
The problem that arises from a multiple-domain strategy is the segmentation of resources and the confusion it can cause in Google Analytics. An easy example: the bounce rate and pageview metrics on the primary domain are actually hurt.
Consider this: a person searches "romantic depot" on Google. The first result is their site, and the person likely wants to know what items might be at the store. Once the page loads they find the link to the store, maybe even before the page finishes loading. Clicking that link, they are taken to a new domain.
That visit would have counted as a bounce. When someone lands on your site and immediately leaves for another site, it signals to Google that the original page wasn't what the searcher wanted. To keep future searchers away from a page people leave right after arriving, Google may raise the position of other sites to correct for it. That ultimately means the top spot for the keyword is being hampered by the site's structure.
Further, Google sees that the person wasn't even interested enough to look at a second page; they just left. In reality the second site is part of the same overall topic or brand, it's just that Google doesn't necessarily understand that. An artificially inflated bounce rate and lower pageviews are all the first site gets, and the second site loses out as well, since most of the marketing, backlinks, social mentions and such, surrounds the first site's address.
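To make the effect concrete, here are toy numbers (purely hypothetical, not Romantic Depot's analytics) comparing how the same store click is recorded on a separate domain versus a subfolder:

```python
sessions = 1000       # hypothetical homepage visits
store_clicks = 600    # visitors who click the link to the store

# Store on a separate domain: those 600 sessions end after one pageview
# on the main domain, so they are recorded there as bounces.
bounces_split = store_clicks
pageviews_split = sessions

# Store in a /store subfolder: the same click becomes a second pageview
# on the same domain, so those sessions are no longer bounces.
bounces_merged = 0
pageviews_merged = sessions + store_clicks

print("split domains:", bounces_split, "extra bounces,", pageviews_split, "pageviews")
print("one domain:   ", bounces_merged, "extra bounces,", pageviews_merged, "pageviews")
```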
Lower Bounce Rate
The illustration above shows pageviews on the main root domain. Guess when on the graph the romanticdepotsuperstore.com site was rolled into the main domain: late July. The thing is, the traffic isn't any greater, it's just not split up anymore. The homepage link to the store now goes to a subfolder of the same domain, so it helps by registering another pageview rather than hurting the site as a bounce.
The keywords and authority of this additional site were better utilized under the main domain RomanticDepot.com, so the site was migrated to a /store subfolder as a separate WordPress site.
That's important: the site was migrated as a separate site under the original. This was done for multiple reasons, and it creates its own set of unique challenges, but we'll discuss that in a future post.
The consolidation further helps with SEO because, after the migration, we put redirects in place from the store's old domain to its new home in the subfolder. That means all the backlinks now combine to help one site. Let's consider the following illustration:
Domain A: DA 30 Backlinks: 10,000 Referring Domains: 1,000
Domain B: DA 30 Backlinks: 9,000 Referring Domains: 900
Competitor: DA 35 Backlinks: 13,000 Referring Domains: 1,300
Let's assume everything else is the same; we'd expect the Competitor to rank higher in Google Search. But now combine Domain A and Domain B:
Domain AB: DA 40 Backlinks: 19,000 Referring Domains: 1,900
Everything else still the same….Domain AB will now rank higher than the Competitor.
Two things are important to take away from this post, and they are listed right below. We've also included Google's own content explaining these two changes to backlinks. If the text is italic and navy blue, it's quoted from Google.
User Generated Content or Sponsored Backlinks Added
"ugc" and "sponsored" are two additional rel="" attribute values introduced by Google. They can be used together or separately to distinguish between the two kinds of backlinks. Nofollow remains an option: if a link isn't a paid advertisement or user-generated content but you still want to make sure it isn't seen as an endorsement, you can still use nofollow as the attribute.
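If you need to apply these attributes in bulk, for example to every external link inside user comments, it can be scripted. A minimal sketch with BeautifulSoup, where the sample HTML and the policy of marking external comment links as ugc are assumptions for illustration:

```python
from bs4 import BeautifulSoup

html = '<p>Great post! Visit <a href="https://example-sponsor.com">my site</a>.</p>'
soup = BeautifulSoup(html, "html.parser")

for link in soup.find_all("a", href=True):
    if link["href"].startswith("http"):
        # Hypothetical policy: treat all external links in comments as UGC;
        # use "sponsored" instead (or as well) for paid placements.
        link["rel"] = ["ugc"]

print(soup)  # the anchor now carries rel="ugc"
```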
Every Link Counts Now.
rel="nofollow" used to mean: don't follow this link or give the target any credit for this backlink. That's changing; all links can now potentially count. Google describes the attributes as "hints" used alongside other "signals" in ranking, and it's not entirely clear whether those are one and the same or separate.
Today, we're announcing two new link attributes that provide webmasters with additional ways to identify to Google Search the nature of particular links. These, along with nofollow, are summarized below:
rel="sponsored": Use the sponsored attribute to identify links on your site that were created as part of advertisements, sponsorships or other compensation agreements.
rel="ugc": UGC stands for User Generated Content, and the ugc attribute value is recommended for links within user generated content, such as comments and forum posts.
rel=”nofollow”: Use this attribute for cases where you want to link to a page but don’t want to imply any type of endorsement, including passing along ranking credit to another page.
When nofollow was introduced, Google would not count any link marked this way as a signal to use within our search algorithms. This has now changed. All the link attributes — sponsored, UGC and nofollow — are treated as hints about which links to consider or exclude within Search. We’ll use these hints — along with other signals — as a way to better understand how to appropriately analyze and use links within our systems.
Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
We know these new attributes will generate questions, so here’s a FAQ that we hope covers most of those.
Do I need to change my existing nofollows?
No. If you use nofollow now as a way to block sponsored links, or to signify that you don’t vouch for a page you link to, that will continue to be supported. There’s absolutely no need to change any nofollow links that you already have.
Can I use more than one rel value on a link?
Yes, you can use more than one rel value on a link. For example, rel=”ugc sponsored” is a perfectly valid attribute which hints that the link came from user-generated content and is sponsored. It’s also valid to use nofollow with the new attributes — such as rel=”nofollow ugc” — if you wish to be backwards-compatible with services that don’t support the new attributes.
If I use nofollow for ads or sponsored links, do I need to change those?
No. You can keep using nofollow as a method for flagging such links to avoid possible link scheme penalties. You don’t need to change any existing markup. If you have systems that append this to new links, they can continue to do so. However, we recommend switching over to rel=”sponsored” if or when it is convenient.
Do I still need to flag ad or sponsored links?
Yes. If you want to avoid a possible link scheme action, use rel=“sponsored” or rel=“nofollow” to flag these links. We prefer the use of “sponsored,” but either is fine and will be treated the same, for this purpose.
What happens if I use the wrong attribute on a link?
There’s no wrong attribute except in the case of sponsored links. If you flag a UGC link or a non-ad link as “sponsored,” we’ll see that hint but the impact — if any at all — would be at most that we might not count the link as a credit for another page. In this regard, it’s no different than the status quo of many UGC and non-ad links already marked as nofollow.
It is an issue going the opposite way. Any link that is clearly an ad or sponsored should use “sponsored” or “nofollow,” as described above. Using “sponsored” is preferred, but “nofollow” is acceptable.
Why should I bother using any of these new attributes?
Using the new attributes allows us to better process links for analysis of the web. That can include your own content, if people who link to you make use of these attributes.
Won’t changing to a “hint” approach encourage link spam in comments and UGC content?
Many sites that allow third-parties to contribute to content already deter link spam in a variety of ways, including moderation tools that can be integrated into many blogging platforms and human review. The link attributes of “ugc” and “nofollow” will continue to be a further deterrent. In most cases, the move to a hint model won’t change the nature of how we treat such links. We’ll generally treat them as we did with nofollow before and not consider them for ranking purposes. We will still continue to carefully assess how to use links within Search, just as we always have and as we’ve had to do for situations where no attributions were provided.
When do these attributes and changes go into effect?
All the link attributes, sponsored, ugc and nofollow, now work today as hints for us to incorporate for ranking purposes. For crawling and indexing purposes, nofollow will become a hint as of March 1, 2020. Those depending on nofollow solely to block a page from being indexed (which was never recommended) should use one of the much more robust mechanisms listed on our Learn how to block URLs from Google help page.
These backlink designations came about to combat link schemes. You could argue that any effort to build backlinks is a link scheme, but there are permissible and forbidden ways to build links. Here is Google on what constitutes a link scheme.
Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines. This includes any behavior that manipulates links to your site or outgoing links from your site.
The following are examples of link schemes which can negatively impact a site’s ranking in search results:
- Buying or selling links that pass PageRank. This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link
- Excessive link exchanges (“Link to me and I’ll link to you”) or partner pages exclusively for the sake of cross-linking
- Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links
- Using automated programs or services to create links to your site
- Requiring a link as part of a Terms of Service, contract, or similar arrangement without allowing a third-party content owner the choice of using nofollow or other method of blocking PageRank, should they wish.
Additionally, creating links that weren’t editorially placed or vouched for by the site’s owner on a page, otherwise known as unnatural links, can be considered a violation of our guidelines. Here are a few common examples of unnatural links that may violate our guidelines:
- Text advertisements that pass PageRank
- Advertorials or native advertising where payment is received for articles that include links that pass PageRank
- Links with optimized anchor text in articles or press releases distributed on other sites. For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
- Low-quality directory or bookmark site links
- Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites, for example:
Visitors to this page: 1,472
- Widely distributed links in the footers or templates of various sites
- Forum comments with optimized links in the post or signature, for example:
Thanks, that’s great info!
paul’s pizza san diego pizza best pizza san diego
Note that PPC (pay-per-click) advertising links that don’t pass PageRank to the buyer of the ad do not violate our guidelines. You can prevent PageRank from passing in several ways, such as:
- Adding a rel="nofollow" or a more specific attribute to the <a> tag
- Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file
The best way to get other sites to create high-quality, relevant links to yours is to create unique, relevant content that can naturally gain popularity in the Internet community. Creating good content pays off: Links are usually editorial votes given by choice, and the more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it.
If you see a site that is participating in link schemes intended to manipulate PageRank, let us know. We’ll use your information to improve our algorithmic detection of such links.
This may seem off topic, but it isn't: technical SEO is imperative. You're not going to rank number one on Google using Shopify or Wix. It just isn't going to happen.
It's also apparently difficult to get solid advice on SEO hosting from "experts." The typical "Best Blog Hosting for SEO" roundup is junk; reciting features doesn't make a hosting plan the best. One quote notes that WordPress comes preinstalled with InMotionHosting.com. So what! Our web servers are preconfigured to install WordPress in every new account as well. It only saves maybe 5 minutes per user, and while that time adds up quickly for a web host, you aren't a web host, so it's not that big of a deal. I'd like to hear about benchmarking tests they may have run to decide who is the best.
Features Aren’t Technical Specs
Unlimited bandwidth sounds great, but what are the limits? There are always limits, sometimes beyond the host's control. For instance, if someone uses a Cat5 cable instead of Cat6, everything will be speed-limited, especially if a bottleneck is designed into the infrastructure. "Unlimited bandwidth" means nothing to me because physical limits exist and can't be avoided. And WordPress preinstalled saves someone 5 minutes and nothing else. These features aren't what matters in a hosting platform.
Cloud Computing: Be Your Own Host
The industry standard in web hosting is cPanel, and there's no way around it: with cPanel, your support options are bountiful, whereas dreamhost.com has its own proprietary server software that is no better in actuality, just far less supported by third parties. Ultimate SEO is hosted on a variety of cPanel servers that were easy to build and deploy. I made them from scratch and from templates; all in all there are 4 AWS servers, 2 on Google Cloud Platform, and 4 on Digital Ocean, currently powering hundreds of sites including this one. Cost varies wildly.
It's important to note that your web host is likely running on one of these three services anyway. You're sharing their share of the cloud environment. Why not skip ahead and be the master of your own domain? Sure, it will cost more than $3 a month, but that $3-a-month hosting plan is shit.
A good comparison of AWS and a traditional hosting provider is AWS vs Blue Host.
Amazon Web Services
I don't even know what I am spending, or where and how it is being spent. AWS charges you for every little thing, and no matter what steps you take, rising project costs can seem simply unavoidable. Their platform is NOT intuitive, and it takes some play time to remember that you have to leave the virtual server's configuration area to allocate an IP address (which will cost you money, each IP address, not talking about bandwidth, just the address itself) and then return to the original area to associate it. Don't even think about swapping hard drives and knowing what is attached to what unless you are prepared to write down long strings of numbers and letters.
AWS does provide greater flexibility than the others, with options beyond just a virtual server, but unless you plan to send 100,000 emails a day you won't benefit from their email service, as an example. Technical-SEO-wise I'd give AWS a D overall. Infrastructure and computing power are an obvious A+, but it's how you interact with them that weighs down the grade.
The navigation is poor and the nickel-and-dime pricing is absurd. Want to monitor your usage so you can understand your bill? Monitoring costs more. It's ridiculous.
They do offer reserved instances, and I loaded up on those, but my costs still never decreased. AWS billing is so hard to understand that IT managed service providers offer free Excel templates just to figure out your AWS monthly costs. Think I'm being over the top? Check out this calculator spreadsheet from AWS for forecasting your expenses.
Here's something crazy: my April bill was $167, but AWS forecasts it will be $1,020 in May, and I have no idea why. I'm not adding servers.
Google Cloud Platform
Google Cloud Platform is easier to use and wrap your head around, but it is considerably more expensive than either of the other options. For that simple reason, it receives an F. The additional cost comes with fewer options and fewer features than AWS. Billing is more transparent, at least, and you can understand why your bill is what it is. But Google also makes unilateral decisions for you, like blocking SMTP and restricting SSH access. Sure it's more secure, but it makes email and server maintenance a nightmare. Documents like their "Connecting to Instances" guide make it seem like no big deal, but those methods won't let you move a file from your computer to the server the way SFTP would.
They are expensive, offer less, and needlessly shoot you in the foot with their restrictions. That's why I stand by the F as an overall grade. Infrastructure capabilities, though: A+, no doubt about it.
Digital Ocean
I received no compensation or even a thank-you from anyone for writing this: Digital Ocean is my B+ graded cloud solution. It's the cheapest, and they don't seem to charge a fee for tools that are required for the main product to function, unlike AWS and its static IP addresses. They have the fewest capabilities and options outside of a virtual server. If you want a managed database server, that's still in the works unless you can use Postgres. That's limiting, but it's also not important if you're just running a few web servers that will already have MySQL installed on them anyhow.
Digital Ocean is the no-frills, no-surprises cloud computing option. The reason I have so many servers is that I am migrating everything off AWS and Google Cloud to Digital Ocean. It'll be cheaper. A lot cheaper.
That's right: $20 versus $121, $177, and $120 from AWS, GCP, and Azure. I didn't really consider Microsoft Azure, simply because I have reservations about moving into their sphere of control, where everything you need to do is addressed by yet another Microsoft product that usually has little imagination in it.
Test out a server in each environment and I think you’ll quickly take to the Digital Ocean option.
I hate missing out just as much as anyone else. It's why Ultimate SEO has accounts on Twitter, Facebook, LinkedIn, Tumblr, Pinterest, Flickr, YouTube, Blogger, Instagram, Snapchat and more. But the only thing worse than not being on a platform is not appearing active on that platform. If someone's first impression of you or your company is your activity on a forgotten profile, it is more damaging than not having been there to begin with.
IFTTT And Buffer
That's where IFTTT has stepped in, saving time while helping get a message out consistently. From WordPress, IFTTT automatically shares each update to a slew of other sites, and until recently it has been the most effective means of auto-updating social media. Now that Google+ has ended, and with the loss of Gmail applets on IFTTT, it may be a good time to look again at social media auto-posting techniques. Recently it appears LinkedIn may have discontinued its connection to IFTTT as well, which is a shame and hurts both LinkedIn and IFTTT.
In researching IFTTT applets, Buffer.com came to light. It's mostly a paid version of what IFTTT did for free, but it also includes a free option that allows integration with 3 social media platforms. In the case of Ultimate SEO that meant connecting to LinkedIn, with the other two slots going to Facebook and Twitter, since those are the powerhouse social media platforms.
Automated Cross Posting In Social Media
My ultimate goal is to fully automate this process. I haven't yet seen that in Buffer.com, but further testing may reveal that an action from IFTTT completes the chain. It's not enough to be able to post an individual article from one site to all the others; we need something that checks all the sites and then auto-publishes whatever hasn't yet been published to the sites that are missing it, as sketched below.
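That checking step is essentially a diff between feeds. A rough sketch of the idea using feedparser, with placeholder feed URLs:

```python
import feedparser

# Hypothetical feed URLs; swap in the real source and destination feeds.
SOURCE_FEED = "https://ultimateseo.org/feed/"
MIRROR_FEED = "https://example.blogspot.com/feeds/posts/default?alt=rss"

def entry_titles(url):
    # Collect post titles from a feed, normalized for comparison.
    return {entry.title.strip().lower() for entry in feedparser.parse(url).entries}

missing = entry_titles(SOURCE_FEED) - entry_titles(MIRROR_FEED)
for title in sorted(missing):
    # In a full pipeline this is where the post would be pushed to the mirror site.
    print(f"not yet cross-posted: {title}")
```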
You might think of this daisy chain of social media as if it were a PBN. It's a network of separate sites, but all of it is you.
I'll keep you updated on this case study. At present, though, IFTTT still appears to be the best auto-posting option for social media. It can start from a site's RSS feed or an integrated service like WordPress, then post to a central site such as Blogger, which has a lot of versatility thanks to the number of connections available.
Connectors Applets Or Recipes
Some connections utilized for Ultimate SEO include:
(The WordPress to X recipes are available but I’d recommend making these connections through Blogger where possible for consistency)
- WordPress to Blogger
- WordPress to Photostream
- WordPress to Tumblr
- WordPress to Facebook
- WordPress to Twitter
- Blogger to Buffer – this recipe enables the auto posting into LinkedIn
- Blogger to Flickr
- Blogger to Pinterest
- Blogger to Bitly
- Blogger to Diigo
- Blogger to Instapaper
- Blogger to Reddit
- Blogger to Pocket
- Blogger to Trello
- Blogger to Dropbox
- Blogger to OneNote
It may appear that I love Blogger, but it's important to have a centralized distribution point. Consider how easy it would be to accidentally create an auto-updating loop if you didn't have a defined start. I accidentally created one of these months ago, and it was annoying, first discovering it and then working out where in the chain I was picking up the update I was trying to put down. So Blogger serves as a checkpoint against redundancy. I also prefer having a secondary site separate from WordPress: if after publishing something on WordPress you realize the permalink is too long or something just doesn't look right, you at least have another spot to stop that mistake from going out to everything else.
A Case Study of SEO Metrics And Rank
Here is a recently added FAQ to the Ultimate SEO FAQ section.
Let me show you how important site speed is.
Why is realtor.com not higher than zumper.com in the mobile search on the right? Consider these metrics
Realtor.com: Domain Score 55, Trust Score 58, Alexa Rank 763, Registered 1996, Ads 3,900, Backlinks 57,000,000, Traffic Rank 108
Zumper.com: Domain Score 37, Trust Score 44, Alexa Rank 17,000, Registered 2004, Ads 0, Backlinks 1,700,000, Traffic Rank 2,830
In every metric realtor.com wins, so why is it below Zumper.com on the mobile search?
Site Speed Test on GTMetrix
Realtor.com Fails Speed
Zumper.com Passes Speed
So in this example we clearly see a more popular site beaten by a less established one, and the single factor the smaller site did better on was speed. And we can't dismiss this as something that only matters on mobile. In case you missed it…
Now, when we consider the facts above, let's also dispel the over-fascination with keywords, text optimization, word position and frequency, and content length, in short on-site SEO, the SEO of the 1990s as I call it. Both sites present the same content on desktop and mobile; they just differ wildly in speed. What are some of the reasons? Realtor.com presents 16 rows of 3 images of homes to visitors, while Zumper shows 4 rows of 1 image and then loads additional rows as you scroll down. Lazy load, and 1 image versus 3: that's how they keep their requests to about a third of the realtor.com page.
What Are Requests?
I'd suggest you think of requests as if they were shots from a gun at your head. You need to avoid them! Fewer shots is a lot better.
Requests are literally requests to the server before the page can finish loading. If I make a page with one image on it, that is one request. Say I decide to replace that image with a slider with 5 slides; now I have 5 requests for the same page area, and that one cool feature has multiplied the trips the computer has to make by five. Now say I add social media icons to the page, Facebook, Twitter, Instagram, LinkedIn and an email icon, small and just up in the right corner. That addition just added 5 more requests. Think about all the things on your page: they don't all arrive together in one big Amazon package with a smile, they are shipped to the computer individually. Now I have one page with 1 request and another with 10, and the visible difference isn't much; that slider only displays one image at a time.
Latency And Requests
Servers don't respond instantly. They take a little while to think and retrieve the requested resource, and then the response has to travel the distance from the server to your computer, maybe at the speed of light, but light still takes time. This delay is called latency. 50 milliseconds is a good latency.
Suppose both servers in the FAQ had a 50 ms latency and, to keep the math simple, the requests were handled one after another. Then:
Realtor.com: 50 ms x 301 requests = 15,050 ms, or about 15 seconds
Zumper.com: 50 ms x 134 requests = 6,700 ms, or about 6.7 seconds
Browsers make many requests in parallel, so real load times are lower than this worst case, but the proportion holds: more requests means more waiting.
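The same back-of-the-envelope math in a few lines of Python, with 50 ms as the assumed per-request latency:

```python
LATENCY_MS = 50  # assumed per-request latency

sites = {"realtor.com": 301, "zumper.com": 134}

for site, request_count in sites.items():
    # Worst-case serial estimate: every request waits for the previous one.
    total_ms = LATENCY_MS * request_count
    print(f"{site}: {request_count} requests x {LATENCY_MS} ms = {total_ms / 1000:.1f} s")
```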
I hope this explains why you want to limit requests, and prioritize speed as much as you focus on keywords.
Ways To Decrease Requests
Do you need separate images? On ultimateseo.org I wanted to show my CompTIA certifications. I had 4 icons, so I combined them into one image. That's 1/4 the requests with no change in user experience other than a quicker site.
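Combining icons can be done in any image editor or scripted. A small sketch of the idea with Pillow, using hypothetical file names for the badges:

```python
from PIL import Image

# Hypothetical icon files; in practice these were the four certification badges.
icons = [Image.open(name) for name in ("a_plus.png", "net_plus.png", "sec_plus.png", "linux_plus.png")]

width = sum(img.width for img in icons)
height = max(img.height for img in icons)
sprite = Image.new("RGBA", (width, height), (0, 0, 0, 0))

x = 0
for img in icons:
    sprite.paste(img, (x, 0))  # lay the icons out side by side
    x += img.width

sprite.save("certifications.png")  # one request instead of four
```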
Lazy load also helps speed up the initial page load. If you have a lot of images "below the fold," the page still needs those images to finish loading unless you institute lazy load, which essentially tells the browser to load an image only when it is coming into view. That makes sense if you have 300 images on a page and plenty of them are scrolled far down, but all in all I'm on the fence about lazy load. I ran speed tests on the homepage of this site with lazy load on; 3 test results: 2.3 seconds, 1.9 seconds and 1.9 seconds. I turned lazy load off, reran the tests, and got 2.3 seconds, 1.9 seconds and 1.7 seconds. So technically the site loaded faster with lazy load off; keep in mind it takes a bit of extra work for the server to implement it. Lazy load can speed up a site drastically if there are a ton of images spread vertically, but not much on a normal page. And what are the full implications for SEO when the site is crawled?
It's suggested by "Ask Yoast" that lazy load is fine for SEO, and that the images are rendered as Google scrolls down the page and indexes the content.
An easy way to build backlinks is through directory submissions. Anything hard is worth more, though, so keep in mind that directory submissions don't carry the weight that in-content links do. With that said, there is still some diluted value in link directories; just don't focus on them.
Here are some link directories, organized by SEO power.
The button below opens a new window with about 100 sites. We also maintain the following directory sites at Ultimate SEO, and these are always free to list with. This site also has a built-in directory; you can access it from the top menu bar.