Kern Media takes the guesswork out of technical SEO and can audit your website to provide specific recommendations to ensure your website is properly technically optimized.
Benefits of a Technical SEO Audit
- Improve search engine ranking potential by fixing internal broken pages, redirects and other technical SEO issues.
- Improve user experience by fixing broken links, broken images, orphaned pages and other user experience issues.
- Improve compliance with Google’s quality guidelines by ensuring Google renders your pages just as users do.
- Enhance your search engine listings with schema markup to make your content more discoverable in search engines.
- Take the guesswork out of technical SEO, implement professional recommendations and move on to your other online marketing efforts.
Conducting Your Technical SEO Audit
Kern Media offers technical SEO audits that ensure websites are properly optimized for search engine crawling and indexation so that your content strategy and promotional efforts can have maximum impact. The process is very data-heavy, but recommendations are delivered in a manner that knowledgeable website owners and developers can understand and implement.
Crawling Your Website to Identify Technical Issues
The starting point for a technical SEO audit consists of crawling your website in a manner similar to search engines in order to identify issues that may limit indexation, ranking potential and usability. Tools such as Screaming Frog are used to execute the crawl and identify issues such as the following:
- 4XX errors (404, 410, 400, 401, 403, etc.)
- 3XX redirects (301, 302, etc.)
- 5XX errors (500, 502, 503, etc.)
- Redirect chains
- Pages blocked by robots.txt
- Incorrect meta robots tag implementation
- Incorrect canonical URL implementation
- Incorrect rel=prev/next implementation
The crawl data will be thoroughly analyzed and clear recommendations provided to fix any issues discovered.
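The status-code checks above can be sketched in a few lines of Python. This is a minimal, illustrative sketch only (a dedicated crawler such as Screaming Frog also discovers URLs, follows redirect chains, and parses meta robots and canonical tags); the bucket labels are my own:

```python
# Minimal sketch: bucket HTTP status codes into the issue classes listed above.
# Note: urllib follows redirects by default, so surfacing 3XX responses
# directly would require an opener that does not follow them.
from urllib import request, error

def status_of(url: str, timeout: int = 10) -> int:
    """Return the HTTP status code for a URL (HTTPError carries 4XX/5XX)."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as e:
        return e.code

def bucket(code: int) -> str:
    """Group a status code into the issue classes discussed above."""
    if 300 <= code < 400:
        return "3XX redirect"
    if 400 <= code < 500:
        return "4XX client error"
    if 500 <= code < 600:
        return "5XX server error"
    return "OK"
```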
Reviewing Indexation in Search Engines
Your website’s indexation in search engines is important to keep in check. If too many low-quality pages are indexed, or important pages are excluded from search engine indexes, your overall “content quality” (as search engines assess it) can be put at risk.
Situations of “indexation bloat” can potentially hurt your site’s overall ranking potential in search engines (especially Google), and situations of important pages being blocked from indexation (due to any number of factors) can limit your organic search traffic potential.
Kern Media takes a close look at your website’s indexation by analyzing data in Google Search Console and Google Analytics, and comparing to the crawl data mentioned above.
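As a simple illustration of that comparison, the URL sets from a crawl export and an indexed-pages export can be diffed. The data here is hypothetical; adjust to your actual Screaming Frog and Search Console exports:

```python
def indexation_gaps(crawled: set[str], indexed: set[str]) -> tuple[set[str], set[str]]:
    """Compare crawlable URLs against indexed URLs.

    Returns (crawled but not indexed -> possible indexation blocks,
             indexed but not crawled -> possible indexation bloat).
    """
    return crawled - indexed, indexed - crawled

# Hypothetical example: a services page missing from the index,
# plus a session-ID duplicate bloating it.
missing, bloat = indexation_gaps(
    crawled={"https://example.com/", "https://example.com/services/"},
    indexed={"https://example.com/", "https://example.com/?sessionid=123"},
)
```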
Reviewing Google Analytics & Search Console
Google Analytics and Google Search Console provide important data for analyzing SEO efforts. Important data includes traffic sources (especially organic search traffic), engagement metrics, traffic history, demographic and interest data, conversion rate, revenue, goals, etc. Google Analytics settings can also be reviewed and improved as needed.
Google Search Console offers very important data regarding overall indexation, sitemap indexation and errors, HTML improvements, search query and page-level metrics, etc. Google Search Console settings can also be reviewed to ensure a “Preferred domain” is set, URL parameter handling is specified (if needed), etc.
Analyzing Website Architecture & Taxonomy
A well-structured website uses topical hierarchy to provide a logical organization of content for both users and search engines. This also allows internal link equity to be passed through the structure in a manner that will flow PageRank through a site for optimal ranking potential in search engines. Taxonomy will be reviewed and recommendations will be made to improve it (if necessary).
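One rough way to review hierarchy is to histogram pages by URL path depth, since a healthy taxonomy usually keeps important pages shallow. A sketch, with the caveat that path depth is only a proxy for true click depth:

```python
from collections import Counter
from urllib.parse import urlparse

def depth_histogram(urls: list[str]) -> Counter:
    """Count pages at each path-segment depth (a rough proxy for click depth)."""
    depths = Counter()
    for url in urls:
        path = urlparse(url).path.strip("/")
        depths[len(path.split("/")) if path else 0] += 1
    return depths

# Hypothetical URLs: the homepage (depth 0), a hub (depth 1), a leaf (depth 2).
print(depth_histogram([
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/services/seo-audits/",
]))
```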
Optimizing the Robots.txt File
Your website’s robots.txt file offers instructions to search engine “bots” on how to crawl your website. Disallowing crawl of important pages will limit search engine ranking (and traffic) potential. Furthermore, allowing crawl of low-quality pages will cause “indexation bloat” in search engines and put your website at risk of Google’s various algorithms. Kern Media has the expertise to ensure that your robots.txt file is optimized for proper search engine bot crawling. Read my blog post on How to Write a Robots.txt File for insight into how I will approach your robots.txt file optimization.
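As an illustration of how disallow rules behave, Python's standard-library parser can be used to test a rules file before deploying it. The rules below are hypothetical, and note that `urllib.robotparser` handles basic prefix rules only, not Google's wildcard extensions:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block a low-quality internal-search section,
# leave content sections crawlable.
rules = [
    "User-agent: *",
    "Disallow: /search/",
]

rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("*", "https://example.com/blog/post/"))    # True
print(rp.can_fetch("*", "https://example.com/search/query"))  # False
```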
Auditing the XML Sitemap
Your website’s XML sitemap offers a roadmap of URLs for search engines to crawl and index. Having errors in your XML sitemap (4XX response codes, 3XX response codes, etc.) can decrease search engines’ trust in your XML sitemap and limit its effectiveness. Your XML sitemap can be reviewed to ensure accuracy and limit errors reported in Google Search Console. Read my blog post on How to Audit an XML Sitemap to understand how I approach XML sitemap analysis.
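For illustration, extracting a sitemap's URLs for verification can be sketched with the standard library (the sitemap content below is hypothetical); each extracted URL would then be fetched to confirm it returns a 200 rather than a 3XX or 4XX:

```python
import xml.etree.ElementTree as ET

# A hypothetical two-URL sitemap in the sitemaps.org format.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services/</loc></url>
</urlset>"""

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract every <loc> URL from a sitemaps.org-format XML sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return [loc.text for loc in ET.fromstring(xml_text).findall("sm:url/sm:loc", ns)]

print(sitemap_urls(SITEMAP))
# ['https://example.com/', 'https://example.com/services/']
```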
Analyzing Page Load Time
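Page load time affects both users and search engine rankings, and will be reviewed as part of the audit. A full analysis needs a rendering-aware tool such as Google PageSpeed Insights, but raw server response and transfer time can be sketched as a rough first check:

```python
import time
from urllib import request

def timed_fetch(url: str, opener=request.urlopen) -> float:
    """Seconds to download a page's HTML.

    Measures server response + transfer only; it ignores rendering,
    images, CSS and JavaScript, so use a dedicated performance tool
    for full page-speed analysis.
    """
    start = time.perf_counter()
    with opener(url) as resp:
        resp.read()
    return time.perf_counter() - start
```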
Reviewing Mobile Friendliness
In April 2015, Google began incorporating mobile friendliness into its ranking algorithm due to the explosion of mobile usage among internet users. Your website will be reviewed to determine whether it's mobile friendly and whether the mobile optimization is implemented properly. Responsive design is usually recommended and is Google's preferred configuration. However, mobile subdomains can work well as long as the proper rel=alternate and canonical coding is used in the <head> of each page on your desktop and mobile sites.
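For reference, the bidirectional annotations for a separate mobile subdomain look like this (the domains and paths are placeholders). On the desktop page's <head>:

```html
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/">
```

And on the corresponding mobile page's <head>:

```html
<link rel="canonical" href="https://www.example.com/page/">
```

The pairing tells Google the two URLs are the same content, consolidating ranking signals onto the desktop URL while serving the mobile URL to mobile users.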
Identifying Structured Data Opportunities
Search engines, especially Google, encourage websites to use structured data in order to enhance search results for users. Appropriate structured data (schema markup) opportunities will be identified for your website, depending on its content types. There are many different schema markup vocabularies, covering content ranging from e-commerce products to recipes, reviews, and instructional content.
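As a brief example, a recipe page's schema markup in JSON-LD (Google's preferred format) might look like the following; all values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Classic Buttermilk Pancakes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "prepTime": "PT10M",
  "cookTime": "PT15M",
  "recipeIngredient": ["2 cups flour", "2 eggs", "2 cups buttermilk"]
}
</script>
```

Markup like this is what makes recipe listings eligible for enhanced results such as rich snippets with cook times and ratings.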
Reviewing Duplicate Content Issues
Internal and external duplicate content issues limit your pages' value to search engine indexes, and eradicating these issues (as much as possible) can have a positive impact on how search engines assess your overall content quality. If your website has duplicate content issues, specific recommendations will be provided to ensure that you know how to handle them and improve your content and ranking potential in search engines.
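Internal duplicates can often be surfaced from crawl data by hashing normalized page text. A minimal sketch with hypothetical URLs (a real audit also accounts for near-duplicates and cross-domain copies):

```python
import hashlib
from collections import defaultdict

def duplicate_groups(pages: dict[str, str]) -> list[list[str]]:
    """Group URLs whose whitespace/case-normalized body text is identical."""
    groups = defaultdict(list)
    for url, text in pages.items():
        key = hashlib.sha256(" ".join(text.split()).lower().encode()).hexdigest()
        groups[key].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

# Hypothetical crawl data: a tracking-parameter URL duplicating a page.
print(duplicate_groups({
    "https://example.com/page/":         "Welcome to our   site",
    "https://example.com/page/?ref=nav": "welcome to our site",
    "https://example.com/about/":        "About the company",
}))
# [['https://example.com/page/', 'https://example.com/page/?ref=nav']]
```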
Ready for a Technical SEO Audit? I Can Help
Stop the guesswork: get a professional technical SEO audit of your website and start improving your organic search performance. Contact Kern Media for a free consultation.