
By: Rick Coterillo – Bayshore Solutions Senior Developer
SEO stands for “search engine optimization” (the practice) or “search engine optimizer” (a person or company who performs it). SEO is the process of improving the visibility of a website or web page in a search engine’s “natural,” unpaid (“organic”) search results. Optimizing a website may involve editing its content, HTML, and associated code, both to increase its relevance to specific keywords and to remove barriers to the indexing activity of search engines. A compatible website lets a search engine spider, or bot, crawl its pages looking for keywords to add to the engine’s index. The search engine then returns a list of pages based on the keywords a user enters.
Search engine spiders, or bots, crawl billions of webpages looking for keywords in HTML such as title tags, itemprop attributes, and alt attributes. These tags are inserted into the HTML of your web pages by your web developer or designer. One popular spider, GoogleBot from Google, uses computer programs to determine which sites to crawl, how often to crawl them, and how many pages to fetch from each site. The purpose of crawling webpages is to let Google build a massive index of all the keywords it sees. That index is then used to return relevant webpages when a user types search words into Google.
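To make this concrete, here is a simplified sketch of a page carrying those tags (the title, the schema.org itemprop values, and the image name are placeholder examples, not taken from any real site):

    <html>
    <head>
      <!-- The title tag tells a spider what the page is about -->
      <title>Blue Widgets | Example Store</title>
    </head>
    <body itemscope itemtype="http://schema.org/Product">
      <!-- itemprop attributes give the content structured meaning -->
      <h1 itemprop="name">Blue Widgets</h1>
      <!-- alt text lets a spider "read" an image -->
      <img src="blue-widget.jpg" alt="Blue widget, front view">
    </body>
    </html>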
Google’s crawl process begins with a list of webpage URLs generated from previous crawls and augmented with Sitemap data provided by webmasters. As GoogleBot crawls each site, it detects the links on each page and adds them to its list of pages to crawl.
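A Sitemap is simply an XML file listing the URLs you want crawled. A minimal one, following the sitemaps.org protocol, looks like this (the URLs and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2013-06-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/services</loc>
      </url>
    </urlset>

Once the file is in place, you can point Google to it through Google Webmaster Tools or a Sitemap line in your robots.txt file.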
Google cannot process the content of some rich media files or dynamic pages. A dynamic page is one whose URL contains a “?”, because its content is generated on the fly from the parameters that follow it.
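For example (both URLs hypothetical), a spider is far more likely to index the second, “clean” form than the first, dynamic one:

    http://www.example.com/products.php?id=42&color=blue
    http://www.example.com/products/blue-widget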
What to do to ensure GoogleBot and the other spiders can crawl your site:
- Remember that bots are simple programs; think version 2.0 of your browser. Avoid JavaScript links (see the first example after this list).
- DHTML menus can be hostile to bots because they rely on JavaScript. Bots cannot reliably crawl JavaScript, especially script located in separate files.
- Ensure your site has a clear hierarchy and text links.
- Every page should be reachable from at least one static text link.
- Your site should have a sitemap that offers your users, and the bots, links to the important parts of your site. An HTML sitemap page serves users; an XML Sitemap like the example above serves bots.
- Try to use text instead of images to display important names, content, or links. If you must use images for textual content, describe them with the alt attribute (see the second example after this list).
- Make sure your HTML tags are descriptive, accurate, and relevant to your page. Ask yourself: what words would a user enter into a search engine to find this page?
- Test your page load times; the longer the load time, the less likely the page is to be crawled. Both bots and humans are impatient.
- Use clean URLs for all your pages, like the example above. Remember, bots do not click dropdowns, select radio buttons, check checkboxes, or enter keywords.
- Generate fresh content.
- Have supplemental navigation: footer links, breadcrumbs, and the like. These make your site easier to crawl.
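Here is the first example mentioned above: two links that look the same to a visitor, but only the second is a static text link a bot can reliably follow (the URL and link text are hypothetical):

    <!-- A JavaScript link: a bot may never discover /services -->
    <a href="#" onclick="window.location='/services'; return false;">Our Services</a>

    <!-- A static text link: any bot can follow this -->
    <a href="/services">Our Services</a>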
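And the second example: descriptive alt text in place of a bare image (the file name and wording are placeholders):

    <!-- Weak: the image says nothing a bot can index -->
    <img src="logo.png">

    <!-- Better: the alt text gives the bot words to index -->
    <img src="logo.png" alt="Bayshore Solutions - Web Design and Digital Marketing">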
The tips above can be daunting and difficult to follow, and they are just the beginning. The experts at Bayshore Solutions can guide you through this HTML maze – contact us today!
Rick Coterillo is a Senior Developer at Bayshore Solutions—a Web Design, Web Development, and Digital Marketing Company.