Nick Eubanks
Clarity's Top SEO Expert
Bio

SEO entrepreneur with a passion for products. Currently building web3 tools and coaching SEO agencies on revenue growth. I built and sold 2 software companies before starting my first agency, and I've now purchased 3 agencies on behalf of https://FTF.co. I'm also a Partner at https://TrafficThinkTank.com, the #1 private SEO community, with members from Google, Intuit, HubSpot, Uber, Square, and many other SEO powerhouses. Read my reviews or learn more at https://NickEubanks.com


Recent Answers


Identify your closest online competitors, and forget about brick-and-mortar and bigger competing businesses offline.

Don't just look for other websites that offer your products and services; instead, go wider to find sites where your target persona/audience is likely to be hanging out and spending their time.

Gather 4-5 of these URLs and drop them into SEMRush, use the SE Keyword reports to find the sites with the largest overall footprints, and export those lists of terms. Keep this all in an Excel file for now.

You'll then want to build some simple filters to separate and tag the terms that are commercial/show buying intent and have search volume. The average CPC (the price someone is willing to pay, on average, for ad clicks from Google) is a good leading indicator of a keyword's commercial value -- but it's only really applicable to paid search.
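If it helps, here's a rough Python/pandas sketch of that filtering step. The column names ("Keyword", "Volume", "CPC"), the modifier list, and the thresholds are just assumptions about what a typical SEMRush export looks like -- adjust them to match your actual file.

```python
# Minimal sketch: tag commercial-intent keywords from a SEMRush export.
# Column names and thresholds are assumptions -- match them to your export.
import pandas as pd

COMMERCIAL_MODIFIERS = ("buy", "price", "pricing", "cost", "best", "review", "vs")

df = pd.read_csv("semrush_export.csv")

df["has_volume"] = df["Volume"] >= 100      # minimum monthly searches
df["high_cpc"] = df["CPC"] >= 1.00          # advertisers value the click
df["buying_intent"] = df["Keyword"].str.lower().str.contains(
    "|".join(COMMERCIAL_MODIFIERS), regex=True, na=False
)

shortlist = df[df["has_volume"] & (df["high_cpc"] | df["buying_intent"])]
shortlist.to_csv("commercial_keywords.csv", index=False)
```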

To get organic search competitiveness dialed in, you'll need to enrich this data with additional metrics like links and topic relevance, for which you'll likely want to use Ahrefs or Majestic.
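And a hedged sketch of that enrichment step, assuming you've exported a keyword report from Ahrefs as CSV with a "Keyword" column and a "KD" (keyword difficulty) column -- again, those names are assumptions, so match them to your actual export.

```python
# Minimal sketch: enrich the shortlist from the previous step with
# Ahrefs keyword-difficulty data. Column names are assumptions.
import pandas as pd

shortlist = pd.read_csv("commercial_keywords.csv")
ahrefs = pd.read_csv("ahrefs_keyword_export.csv")[["Keyword", "KD"]]

enriched = shortlist.merge(ahrefs, on="Keyword", how="left")
# Sort so the easiest, highest-volume terms float to the top.
enriched = enriched.sort_values(["KD", "Volume"], ascending=[True, False])
enriched.to_csv("keyword_targets.csv", index=False)
```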

For a much more detailed explanation of the process from soup to nuts check out:
1. https://imfromthefuture.com/keyword-research-now/
2. https://imfromthefuture.com/not-keyword-research/


Stoney is on the money here.

The answer is really yes and no. Yes, you want to get links; but no, you don't want crappy links in high volume all pointing to the same page with the same optimized anchor text.


Funny enough, I actually started a project *JUST FOR THIS PURPOSE* called AppDea.net. It got overloaded and crashed my server, so the database is currently down, but I am moving it over to a telesc.pe install right now and will let you know as soon as it is back up.

Cheers!


I agree with some of the recommendations above:

1) Focus on making sure your XML sitemap is accurate, and make sure your priority weightings are set correctly from parent to child (category > sub-category > product detail) -- see the sketch after this list.
2) Submit through Google Webmaster Tools.
3) Submit partials in increments (again, good suggestion at 50k/time) and pay close attention to your index rate to see how fast these are picked up.
4) You can brute-force a lot of these crawls by submitting and re-submitting your sitemap(s) -- as often as every day.
5) For very important URLs that are not getting picked up, a handy trick is to tweet them out from an account that has a positive share of voice (more followers than following).
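Here's a minimal sketch of what the parent-to-child priority weighting in item 1 could look like if you generate the sitemap yourself in Python. The URLs, depths, and weights are placeholders, not a prescription for your site.

```python
# Minimal sketch of tiered sitemap priorities (category > sub-category > product).
# URLs and weights are placeholders -- generate them from your own catalog.
import xml.etree.ElementTree as ET

PRIORITY_BY_DEPTH = {1: "1.0", 2: "0.8", 3: "0.6"}  # parent > child weighting

urls = [
    ("https://example.com/category/", 1),
    ("https://example.com/category/sub-category/", 2),
    ("https://example.com/category/sub-category/product-123/", 3),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, depth in urls:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "priority").text = PRIORITY_BY_DEPTH[depth]

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```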


I'm with Hartley on this one. But it all depends on your content.

If you are going to have (and maintain) duplicate content on all of these URLs, then the redirects from the old URLs to the new URLs only take care of half of your problem. You will still be cannibalizing the rankings of these pages: they will be 'dinged' for duplicate content (worst-case scenario), or at the very least search engines will not know which version is representative, will not be able to discern any difference, and instead of trying to figure out which version should rank higher, both will receive a lower relevancy score. This is where the rel=canonical tag comes into play. It essentially tells Google, "hey, this is the representative version of this content that I want indexed." To implement it, you would add <link rel="canonical" href="final destination URL goes here" /> to every URL that has duplicate content, all pointing back to the "champion" version.

Whenever you change URLs that you wish to remain indexed, you must put a 301 redirect in place, both for continuity of page-level authority and so users/crawlers can make it to the correct final destination.
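As a hedged illustration only (assuming a Python/Flask stack, which may not be what you're actually running), the redirect and canonical pieces could look like this -- the routes and URLs are placeholders.

```python
# Minimal sketch, assuming Flask: 301-redirect retired URLs and point
# duplicate variants at a single canonical version.
from flask import Flask, redirect

app = Flask(__name__)

CANONICAL = "https://example.com/widgets/"

@app.route("/old-widgets/")
def old_widgets():
    # Permanent redirect preserves page-level authority and sends
    # users and crawlers to the final destination.
    return redirect(CANONICAL, code=301)

@app.route("/widgets/")
@app.route("/widgets/index.html")
def widgets():
    # Every duplicate variant renders the same canonical tag pointing
    # back at the "champion" URL.
    return f'<head><link rel="canonical" href="{CANONICAL}" /></head><body>Widgets</body>'
```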


The basics of SEO are:

Crawlability: proper HTML markup and syntax, starting with the document head right down to the <div> structure. If search engine robots can't parse your code, then it doesn't matter how relevant it is to the search query.
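If you want a quick, rough way to sanity-check that baseline, a small Python script (assuming requests and beautifulsoup4 are installed, with a placeholder URL) could look like this -- a spot check, not a substitute for a real crawler.

```python
# Rough sanity check: verify the basic document structure a crawler
# relies on is present and parseable. URL is a placeholder.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

checks = {
    "title tag": soup.title is not None and bool(soup.title.get_text(strip=True)),
    "meta description": soup.find("meta", attrs={"name": "description"}) is not None,
    "single h1": len(soup.find_all("h1")) == 1,
}
for name, ok in checks.items():
    print(f"{name}: {'OK' if ok else 'missing'}")
```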

Authority: content that is popular tends to rank well. This is due to a high signal-to-noise ratio across the qualitative metrics that Google's algorithm takes into account -- metrics like organic click-through rate from the SERP, adjusted bounce rate on the page, average time on site, etc.

Links: authority is built over time and through trust. One of Google's leading trust indicators is links back to your site or page from other trusted (authoritative) sources.

At the end of the day, Google and search engines in general are looking to provide the best experience for their users, delivering the most relevant and useful results as fast as possible.

If your webpage is delivering value, solving a problem, or answering a question then you are off to a good start.

