Over the next few months I'll be writing a series of posts on what I've found works and what doesn't in SEO. This is installment 1: Top SEO Tools I Use.
This is somewhat of a product endorsement piece, but it's also worth sharing what you've found useful in the hope that others will tell you what works for them. So: tools I use, even a few I don't, and why.
Now, the first tool has NOTHING to do with keywords directly, but everything to do with deciding what your keywords are … if you think my life revolves around keywords, you really need to catch up. Backlinks and anchor text are the hardest things to control and often the most impactful. Consider: back in the 2000s, "miserable failure" was linked to George W. Bush's White House biography page by so many websites that it ranked number one on Google for that phrase. Now, I guarantee his site had NO optimization for "miserable failure," but he held that spot until a counter wave of links pointed the term at Jimmy Carter. Then Google manually shut the term war off. It's backlinks that make or break a site's ranking.
You can optimize content choices and keywords all day long, but if the rest of the world's links don't say you matter, Google won't think so either. That's why this product is essential: it trends your links and your competitors', and it's just $25 a month. Here are screenshots I've taken from time to time to include in client reports.
Notice the colored circles at the top break down the links and provide metrics like CF (Citation Flow), TF (Trust Flow), and DA (Domain Authority). Below that image you can see each link individually, the strength of each link, and its follow or nofollow status.
Below are more links, since the site has added more features. Under the status column, the G shows whether a link is indexed by Google: green is good and indexed; yellow is not indexed but not outright banned; and red is the worst kind of link. Honestly, it's like being caught hanging out with the wrong crowd. Google even allows you to "disavow" the red Gs.
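For the worst offenders, Google's disavow file is just a plain text list of domains and URLs. A minimal sketch, where the domain names are hypothetical placeholders and not from any real report:

```shell
# Build a disavow.txt of toxic links; the domains below are made up.
cat > disavow.txt <<'EOF'
# Spammy directory linking to us sitewide
domain:spammy-directory.example
# One bad page, without disavowing the whole domain
http://link-farm.example/bad-page.html
EOF
```

The file is then uploaded through Search Console's Disavow Links tool; lines starting with # are comments.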
The image above trends your backlinks, your traffic, and your keywords. This graph is also why I don't mind nofollow links, and why I believe they do affect ranking. Notice the tower of purple nofollow domains that linked to us when I released a news story. The orange keyword position swings up from 90 to 50 as the new position.
It's probably the farthest thing from your mind. If you're working for a political campaign, you're pushing forward, and the next five weeks are all-out war. So what do you do with your campaign site after the election? Heck, I suspect some of you are just now getting around to your website, or feel it hasn't helped in the past, so why worry? You'd be wrong on either count; if you're the diligent one, you'll find the rewards are like a garden.
I started renting a house in my hometown after returning from Chicago, and suddenly I had room to grow things. I wanted hydrangeas, so I planted 14 or so … at a gallon a plant, it took a lot of them to make a show. We also planted a grapevine. Not much happened, though, and I could have easily given up after the summer and just ignored them … but they were never going to mature in one season. A grapevine takes three years before it produces grapes, and I learned hydrangeas bloom on "old wood," so first-year growth wouldn't flower. If you see where I'm going, I'll stop with the gardening story. Your website will not produce fruit in its first couple of months.
Domain age has both a direct and an indirect effect on your ranking. For one, a website that's been up and running since the last election has had links from other sites accrue organically. Not a ton if you just leave it sitting there, but definitely more than if you take the site down and just hold on to the domain. In a previous post I mentioned it takes 3 to 6 months to rank a site; you'll be a step ahead if you simply leave yours up. (Better yet, don't leave it entirely alone: post a new article every couple of months.)
Domain age directly affects your Trust Flow/Citation Flow and your Domain Authority, which in turn suggests your Google standing next election cycle will be improved as well. And stop thinking your target keyword is your name; if someone is Googling your name, they've already heard of you. Target the big terms like "election results," "voter guide," or the other candidate's name.
Take seandelahanty.com and judgeseandelahanty.com. The first is younger, but it has 70 times the backlinks, it's been updated religiously, it has the social media mentions, it has the content. The newer address has a Domain Authority of 23 and the older one a Domain Authority of 2! Yet they're still neck and neck in some searches. Just today I Googled the candidate's name: the older site comes up 3rd, while the real site, two months old with 10+ times the domain authority, is 6th. That's a rough measure of how much weight domain age may carry; I link to a case study below. Oh, and BTW, the new site is actually doing pretty well, I think: page 1 rankings on 45 keywords on Google, and 100 ranked keywords altogether. But back to the point…
Now, let me clear the air: no matter how old your domain is or how consistently you've kept a site up, if the content isn't any good, it's a lost opportunity. So do make an effort to convey your continued message through your site, and when you run for the next office your site will be that much more ready. Final note: there are a ton of opinions out there on domain age, but no one would disagree that a site that stays up and is updated periodically is more likely to gain backlinks.
So just get a cheap web hosting plan and post every few months; don't take the site down and box it up until a month before the next election. For further reading on domain authority, see the case study linked above. I'd recommend that article; it goes over several factors.
And I can tell you one person who's still got her site up … a sign of the times.
It's likely no secret we're working hard to rank up Judge Sean Delahanty's site. He had judgeseandelahanty.com, but I've always believed the primary site for any campaign should be the candidate's name alone, with no office referenced, unless the name is hard to spell.
That's why we developed his new election site, seandelahanty.com. Wanting to bring a small level of gamification and interactivity, I added polls and debated a chat section, but due to speed concerns on the domain I moved that idea to fordelahanty.com. It's a site that allows supporters to collaborate and make a difference.
In the image below you can see that the vast majority of his tracked pages are on page one of Google. To draw new traffic from higher-search-volume topics, I added the FAQ and Voter Guide content. The voter guide is not the most innovative page, but it is likely the one campaign site that provides maps and names for races beyond its own. It's a gamble that voters will find and appreciate Sean Delahanty's site.
The site recently received a makeover; notice the "reported as of" date is today.
There are two custom services running on the deployed machines that are essential for the solution to function properly. These services are gcs-sync (running on the WordPress instances, both Admin and Content) and cloudsql-proxy (running on the SQL Proxy instances).
The gcs-sync service runs a script, /opt/c2d/downloads/gcs-sync, that, depending on the role the VM is assigned (Content or Admin), checks in with the GCS bucket tied to the deployment and determines whether content needs to be pushed to or pulled from GCS. If you need to interact with the service, you can do so via systemctl. For example:
systemctl stop gcs-sync
will kill the script checking GCS, and the node will not receive any updates that come from the Administrator Node. Conversely, if the service needs to be started you can do so with the following command:
systemctl start gcs-sync
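To make the push/pull behavior concrete, here is a hypothetical sketch of the kind of sync such a script performs. The bucket name, paths, and role logic are my assumptions (not taken from the deployment's actual script), and the gsutil command is echoed rather than executed:

```shell
# Hypothetical: Admin nodes push content to the bucket, Content nodes pull.
# BUCKET and paths are placeholders; we echo the command instead of running it.
ROLE="admin"                          # or "content"
BUCKET="gs://my-deployment-bucket"    # placeholder bucket name
if [ "$ROLE" = "admin" ]; then
  CMD="gsutil rsync -r /var/www/html ${BUCKET}/html"   # push to GCS
else
  CMD="gsutil rsync -r ${BUCKET}/html /var/www/html"   # pull from GCS
fi
echo "$CMD"
```

The real script may differ in details, but the shape is the same: one bucket acts as the source of truth, and each node's role decides the direction of the sync.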
The cloudsql-proxy service makes use of the Cloud SQL Proxy binary so you can connect to your Cloud SQL instance without having to whitelist IP addresses, which can change when instances are deleted and recreated in a Managed Instance Group. The Cloud SQL binary is located at /opt/c2d/downloads/cloud_sql_proxy and the script that executes the binary is located at /opt/c2d/downloads/cloudsql-proxy. Like the service that runs gcs-sync, it can be interacted with using systemctl. Stopping the service can be done with:
systemctl stop cloudsql-proxy
At this point your instance will not be able to communicate with the Cloud SQL instance, and the application will not function. If you need to manually start the service for any reason, you can do so with the following command:

systemctl start cloudsql-proxy
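For reference, the legacy Cloud SQL Proxy binary is typically launched with an instance connection string. The project, region, and instance names below are placeholders, and the command is echoed as a sketch rather than run:

```shell
# Hypothetical invocation of the proxy binary; the connection name is made up.
CONNECTION="my-project:us-central1:wordpress-db"   # placeholder
CMD="/opt/c2d/downloads/cloud_sql_proxy -instances=${CONNECTION}=tcp:3306"
echo "$CMD"
```

With the proxy listening on tcp:3306, WordPress connects to 127.0.0.1 as if MySQL were local, and the proxy handles the authenticated tunnel to Cloud SQL.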
It Takes About 3 to 6 Months to Rank #1 on Google
In the world of politics everything seems to be about the last week of an election; after that, it's over. In the business world you have to keep going past any given date to remain in business. Political SEO requires keeping the business world in mind, because in Google rankings businesses are your biggest competitors, and the other candidate is likely thinking the same way. I was talking with a politico friend in the last two days about the need to get in contact with November campaigns now … he seemed puzzled and said we had three months and most campaigns are just starting. That may be fine and dandy for yard signs and television commercials, but online, everything needs to be in place well before you need it. That means now.
I will grant you that "Jefferson County Judge Executive" isn't a highly sought-after keyword. But then, who is going to Google that phrase closer to the election? If you thought that was the keyword for that office, you'd be wrong. You need to stop thinking of proper nouns as good Google keywords. I'll pin down a conversation about keywords another time, but for now, focus on timing.
It's not too late to invest in search engine optimization and building an engaging site for voters. It will be too late in October, when you'll wish you had.
Today, most people are searching on Google using a mobile device. However, our ranking systems still typically look at the desktop version of a page’s content to evaluate its relevance to the user. This can cause issues when the mobile page has less content than the desktop page because our algorithms are not evaluating the actual page that is seen by a mobile searcher.
Let me summarize this for you in plain English:
Most people are using Google on a mobile device.
Conducting mobile searches is more common than ever, yet Google was still ranking search results by desktop page content and experience.
The disconnect here is that most users are on mobile, yet website owners are designing for a desktop audience.
This results in bad experiences for users on unoptimized mobile pages.
And Google is all about providing the best search engine to its user base. Which is the majority of the world:
If Google didn’t make the change to rank mobile pages, there would likely be a drop in mobile users not searching with Google due to slow sites and unoptimized mobile pages.
So, they started testing the mobile-first index in 2016, where Google planned to shift its indexing to mobile pages before desktop, providing better browsing for the majority of users.
As you can imagine, this was a pretty significant shift. And, it left the industry with a lot of unanswered questions.
If your website is not optimized for mobile, you could see a drop in traffic and rankings.
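One quick sanity check you can run on any page's HTML: responsive pages declare a viewport meta tag. The sample page below is a stand-in for your own markup:

```shell
# Write a tiny sample page, then check it for the responsive viewport tag.
cat > page.html <<'EOF'
<!doctype html>
<html><head>
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Sample</title>
</head><body>Hello, mobile.</body></html>
EOF
grep -ci 'name="viewport"' page.html   # 1 means the tag is present
```

This is only one signal among many (page speed, tap-target size, and content parity matter too), but a missing viewport tag is a reliable sign a page was never designed for mobile.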
Over the few days that the update was pushed live, Glenn Gabe of GSQi started to notice fluctuations in rankings, rich snippets, and more:
This prompted tweets from Google’s Search Liaison account, confirming the updates:
So, what exactly happened here?
Websites with little content relevance were primarily affected, with rankings for those companies either jumping or falling heavily.
Many SEO marketers suspected this update was targeted at low-quality websites.
As Glenn speculates on the “brackets” update:
“This was one of the biggest updates I’ve seen in a while. It seems Google once again improved how it assesses quality, and with Google always looking to surface the highest quality content for users, that’s a really big deal.”
However, John Mueller cleared things up during a Google Webmaster Hangout on April 6th.
He went on to explain that if your site was affected by the March 7 update, it is not an indication of a low-quality site, but more about content relevance.
Meaning your website is attempting to rank for specific queries that might not be relevant to the user clicking through to read them.
Your content has to connect the dots to search queries. Bounce rates must be reduced. Anything and everything you write and publish should be relevant to the reader.
Anyone who knows me knows I love visualizations. I've been working on SEO strategies for a site, and the beginning of any good strategy is understanding the site and the quality of its content. Some studies suggest 60% of people think in pictures; that is to say, if I said the word "cat," some people will see a cat and some will see the letters c-a-t. It's hard for visual people to grasp abstract concepts such as "millions," because it's hard to produce that word as an image in your mind. Similarly, when we think about a website, it's often hard to grasp the links and relationships between a site's pages. Sure, we can create a boring flow chart, but it often has to be oversimplified and lacks a good representation of the site's content.
Recently, while working for https://pearlharbor.org, I found a tool that creates some really useful visualizations of a site and its pages, with the linkage between them. I believe the software limits its mapping to 10,000 pages, which is astounding. The site offers information and Pearl Harbor tours while serving as a memorial that includes a page for each of the survivors. My focus with this site is to make SEO recommendations and ensure social media is used effectively to improve site ranking.
The free plug for this software: Website Auditor. You can use the free evaluation version for as long as you'd like; you just can't save or print anything. It's still extraordinary for understanding a site's layout. In addition, the software includes tools for link building and more.
I'm excited about my newly discovered free SEO tools. I'm going to make these images for every site I work on. While I want to improve SEO through more authoritative external links, it's essential that those links land on a site with intelligent internal linkage.
I have a solid record of identifying a problem and building a solution that yields measurable results. I was recently the technology manager for a political campaign, where I built a WHM/cPanel server on CentOS using Amazon Web Services to host about 14 WordPress websites. The server also provides mailing lists, a team management site, RSS feed aggregation, and its own name servers. I was able to get the server and sites built in a couple of months.
I’ve managed the SEO and Adwords for these sites and have data that helps demonstrate the value I added to the organization.
I currently host all of these sites on an Amazon Web Services (AWS) EC2 server running CentOS and WHM. I also built an SQL server with large tables of voter records, city crimes, city financial payments, jail population censuses, and web traffic data. This web server is integrated with Google Data Studio, providing easily understood graphics. I used Google Cloud Platform for the SQL database solution.
Driving Traffic With SEO
Brentackerson.com channel results were organic and social media driven
Ackerson2018.com channel results were Adwords and paid searches
Each of these sites is intended to attract a different target crowd, and through SEO these sites were indexed for the keywords we needed. I'm able to see the effect of this through Google Search Console and Google Analytics.
Search Console lets us see what Google indexed
It's important that sites remain fresh and update often. Several RSS feeds are collected into one site, which then publishes posts through feeds matched to the target audience of each site. The sites also have keyword auto-linking. In the political campaign, we built links for the office our candidate was seeking, and links for the other candidate to a negative news site. I also posted custom content, further building content quality.
All of these sites had updated sitemaps and robots.txt files. With content that offers linked keywords tailored to each site's audience, I was able to build page traffic and relevancy. Within weeks we were a top result, between the 3rd and 6th positions, for our competitor's name.
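For reference, a minimal robots.txt of the kind each WordPress site carried, with a sitemap pointer so crawlers can find the sitemap on their own; the domain is a placeholder:

```shell
# Minimal WordPress-style robots.txt; example.com stands in for a real domain.
cat > robots.txt <<'EOF'
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap.xml
EOF
grep -c '^Sitemap:' robots.txt   # 1 confirms the sitemap line is present
```

The Sitemap line means even a crawler that never visits Search Console can discover the sitemap, which matters when a network of sites is updating on different schedules.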
I have built a network of these sites with keywords in the domain names. This can help SEO ranking, and it has also been shown to increase click rates in unpaid searches. Some sites have a newspaper feel, some a magazine feel, others focus on videos, but all are meant to capture users and ultimately direct them to a site targeted at them.
Google Analytics Site Statistics Using Google Data Studio and Adwords
I've integrated YouTube and Facebook accounts, as well as created Tumblr, Pinterest, and other social media accounts that can be auto-updated from the sites' posts. These are intended to continue spreading the word about our sites and activities. The social media links improve our ranking, and these feeder sites funnel users into the campaign's primary site.
I was asked to improve the SEO of https://www.brentackerson.com, and through these sites and social media posts I raised its position from 63rd to an average of 3.6 for searches on the candidate's name on Google, in under two months. Several other keywords are now well associated with the site. The additional targeted sites and social media accounts were essential to improving the main site's ranking, giving me plenty of opportunities to backlink to the original site.
Performance using Google Adwords: we have a highly targeted audience (one county's primary voters). On a weekend day before 5pm, the candidate's name was displayed 14,000 times, and 3.4% of users clicked the search ads, which is actually pretty good. Our videos have a 28% view rate, meaning 1,210 videos were watched. Our targets were a 2% CTR and a 15% view rate.
The numbers above compare the current 3 day period with the previous 3 day period.
MORE ABOUT ME:
EXPERIENCE: Prior to my current position, I worked at Accenture for two years as an Instructional Designer and at HP for two years as a Big Data Technical Trainer. While I was a Help Desk Lead at Enova Financial, I created a technical training program.
MBA in Technology Management
MEd Instructional Design
BS Information Technology