The gShift Site Audit tool is simple to use and provides a quick way to analyze and evaluate your on-site elements and structured data. It delivers easy-to-digest reports and recommendations, helping you identify and prioritize the items that need the most immediate attention to improve your organic search visibility.
Follow these steps to create a Site Audit:
- Log in to gShift and navigate to the Web Presence you would like to run a Site Audit on.
- Click on Content from the main navigation.
- Click on Site Audit from the sub navigation.
- Click the Create a New Site Audit button.
- Select the website you would like to run the report on. You can choose a site you are following for optimization purposes, or you can specify a Custom URL to audit a competitor's site or conduct pre-sales analysis.
- Select which resources you would like to crawl:
- Pages (content pages within the selected website)
- Images (images within the selected website)
- Stylesheets (CSS Stylesheets within the selected website)
- PDFs (PDFs within the selected website)
- User Agent - Choose which recognizable agent name you would like to use to crawl the site. This helps you identify and/or filter the agent in your web server logs or analytics. SiteCondor is the recommended default.
- Throttling - Choose how fast you would like to crawl the website. This setting enables you to slow down the crawl in the event of performance concerns.
- Max Resources - Your contract limits how many resources you can crawl per month. This option lets you control how many of those resources you allocate to a single Site Audit.
- Concurrency - Specify how many crawlers may crawl your site at once for performance management.
- Timeouts - Specify the longest response timeout acceptable for your web server.
- Include URL Filter - Explicitly specify which pages should be crawled. You can use a regular expression to target specific sections or pages of the website, e.g. a product category.
- Disregards query strings - Leave this checked (the default) to skip crawling of search result pages within your site.
- Scan subdomains - Crawls subdomains of the main website domain selected.
- Include www. - Crawls both the www. and non-www. versions of the website, should both exist.
- Cookies - Allows cookies to be created and applied during a crawl. Leave this checked (the default) to ensure quicker future crawls.
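Because the crawler identifies itself with a recognizable User Agent (SiteCondor by default), you can spot or exclude its requests in your own web server logs. A minimal sketch using sample log lines; the log path and format here are illustrative, so adjust them for your server:

```shell
# Two sample access-log lines (illustrative) so the commands below are runnable.
printf '%s\n' \
  '203.0.113.5 - - [10/May/2024] "GET /products/widget HTTP/1.1" 200 "SiteCondor"' \
  '198.51.100.7 - - [10/May/2024] "GET /about HTTP/1.1" 200 "Mozilla/5.0"' \
  > access.log

# Count Site Audit requests by matching the SiteCondor user agent:
grep -c "SiteCondor" access.log

# Exclude the crawler when analyzing real visitor traffic:
grep -v "SiteCondor" access.log
```

The same filter works in most analytics tools that let you segment by user agent.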
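The settings above can be sketched as a crawl configuration. The field names and helper functions below are hypothetical illustrations of how the options interact, not gShift's actual API; the Include URL Filter regex here targets a made-up `/products/` category:

```python
import re
from urllib.parse import urlsplit, urlunsplit

# Hypothetical crawl configuration mirroring the Site Audit options above.
config = {
    "user_agent": "SiteCondor",       # recognizable agent name for log filtering
    "delay_seconds": 1.0,             # throttling: pause between requests
    "max_resources": 500,             # cap on resources for this single audit
    "concurrency": 2,                 # how many crawlers run at once
    "timeout_seconds": 10,            # longest acceptable server response time
    "include_filter": r"/products/",  # regex targeting one site section
    "disregard_query_strings": True,  # skip search-result-style URLs
}

def should_crawl(url: str, cfg: dict) -> bool:
    """Apply the Include URL Filter regex to decide whether a URL is crawled."""
    return re.search(cfg["include_filter"], url) is not None

def normalize(url: str, cfg: dict) -> str:
    """Drop the query string when 'disregard_query_strings' is set."""
    if not cfg["disregard_query_strings"]:
        return url
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(should_crawl("https://example.com/products/widget", config))  # True
print(should_crawl("https://example.com/about", config))            # False
print(normalize("https://example.com/products/widget?page=2", config))
# https://example.com/products/widget
```

Tightening the include filter and disregarding query strings both reduce how many of your monthly resources a single audit consumes.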