Call +1 (872) 256-1610 or fill out this easy Contact Form for a free SEO scope of work estimate.

E-commerce retailer technical & onsite audit package (8/17)

Case study: how I can deliver an SEO consulting “package” to quickly identify & resolve your e-commerce store’s mission-critical site roadblocks.

Opportunity

Business: achieve better visibility in SERPs (search engine results pages), with target pages ranking for product category and geomodified brand searches
SEO: clean up Google’s index of HouserShoes.com, provide an actionable list of next-step fixes, and uncover new opportunities for growth

The Way Forward:

July 2017-present: I’d been helping Charleston, S.C.-based agency Visiture tackle SEO audits for their large e-commerce clients since early July. When I took a look at the crawl for one such site in August, the first thing I noticed was the massive number of URLs: too many to crawl, in fact, as my laptop ran out of memory.

So what was up? Certainly, large e-commerce stores can have a large number of URLs when product filters (size, color, etc.) create URL parameters…

But in this case, all of the product filters were actually unique pages. (!)
/columbia-sportswear/l/adult/
/columbia-sportswear/l/child/
/columbia-sportswear/l/beige/
/columbia-sportswear/l/black/
[+ lots, lots more]

This meant an exponentially large number of page variations and a massive, likely “messy” site crawl. And because these filter pages had duplicate content and were getting indexed the same way as URLs with more value for search, it also meant a likely negative SEO impact (plus a strain on the site’s server, showing up as 500/503 errors).

This is where Robots.txt comes in. If you’ve got a site with what I call a messy crawl (meaning you can see duplicate or low-quality pages in the crawl that should be excluded), you can 1) check that these are actually live, 200-code pages in the site crawl (I use Screaming Frog for this) and 2) confirm whether these unwanted URLs are getting indexed using a “site:domain” Google search. Robots.txt is an incredibly useful tool for larger sites, for example when there are 100,000+ URLs that should be removed from Google’s index.
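As a quick illustration of step 1, here’s a minimal Python sketch for spot-checking that a handful of suspect filter URLs really do return live 200 responses before you write Robots.txt rules for them. The domain and paths are hypothetical stand-ins modeled on the filter patterns above, and this is a supplement to (not a replacement for) a full Screaming Frog crawl.

# Spot-check that suspected duplicate filter URLs return a live 200 response.
# The domain and paths below are hypothetical examples based on the
# filter-page patterns described above.
import requests

SUSPECT_URLS = [
    "https://www.example-store.com/columbia-sportswear/l/adult/",
    "https://www.example-store.com/columbia-sportswear/l/beige/",
    "https://www.example-store.com/columbia-sportswear/l/black/",
]

for url in SUSPECT_URLS:
    try:
        # HEAD keeps the check lightweight; some servers only answer GET.
        response = requests.head(url, allow_redirects=False, timeout=10)
        print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")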

Disallow the whole folder you don’t want indexed in Robots.txt (the /l/ path segment in the example above), and now you’ve got a clean solution to what could otherwise be a very messy situation: a content management system creating an exponential number of filter URLs per product page.
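For reference, the directive itself is short. Here’s a minimal sketch, assuming the filter pages all share the nested /l/ path segment shown above (your store’s exact URL structure may differ, so verify against your own crawl first):

User-agent: *
# The * wildcard (supported by Google and Bing) covers the brand segment
# that appears before /l/ in the example URLs; adjust to your URL structure.
Disallow: /*/l/

Before relying on the rule, it’s worth testing a few representative URLs in Google Search Console’s robots.txt Tester to confirm the pattern blocks exactly what you intend.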

How Can My Experience Directly Help You?

Whether I’m working for clients directly or for agencies, I offer my consulting services as “packages” that both contain scope upfront and provide opportunity analysis beyond the basic checklist audit approach that’s common at larger agencies, which have all the resources except the time to invest in each client’s needs individually.

That’s where an SEO consultant can come in.

For Visiture, for example, I didn’t just help optimize the client’s Robots.txt and provide a technical solution that promised a significant boost for their website and business; I also uncovered insights around…

Structural SEO (solving for non-SEO-friendly URLs)
Click-through & keyword optimization in business-critical page titles
Local SEO insights around site structure – unique to their specific business situation

By focusing on search opportunity analysis, the whole package is designed to offer a “step 1” incremental SEO improvement for you — or your clients — with real business impact.

