Most content managers can create and update a site map. Search engines limit the number of links they will follow to around 100 per page, and those links must be plain HTML text links. If we configure our content management system to generate and maintain a hierarchical map of the website within these constraints, we ensure that search engines can crawl every page of our website (see the first sketch after this list).
* Build pages with friendly URLs: some content managers generate static files whose URLs contain no dynamic variables, while others employ multiple URLs with dynamic parameters, such as: /menuitem.6dab26af2ec93c76a68e76b1805101ca/?vgnextoid=88c8c55c1edb7010VgnVCM100000081510acRCRD. Among the latter, some allow you to create friendly URL aliases (keyword-rich paths that hide or eliminate the dynamic parameters), which the system then replaces internally with the dynamic URL it needs; the second sketch below illustrates this. All else being equal, choose a CMS that generates keyword-rich URLs, or URLs with a small number of dynamic parameters. URLs like the one in the example are neither usable nor search-engine friendly.
* Limit the level of subdirectories: search engines give more importance to pages closer to the portal's home page.
That is why we must limit the number of subdirectories shown in the URL. Many content management systems let you organize content hierarchically regardless of the physical location of files on the server, producing URLs that are simpler than the actual directory structure. For example, the URL /Conferences/Congresos_negocios.htm suggests this content sits just one level below the home page, even if the actual file is stored six levels deep on the server; a depth check is sketched after this list.
* Connect the CMS's link-validation controls: most managers check for broken links pointing to content controlled by the manager itself, but few validate that a link pointing to an external website is not broken. The last sketch below shows a basic external check.
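Below are short Python sketches of the four recommendations. First, the site map: a minimal generator that splits the map into pages of at most 100 plain HTML text links. The page list, output file names, and link budget are illustrative assumptions, not any particular CMS's API.

```python
# Sketch: generate a hierarchical site map as plain HTML text links,
# capped at ~100 links per page. Inputs and file names are hypothetical.

from html import escape

MAX_LINKS_PER_PAGE = 100  # conservative per-page link budget

def build_sitemap_pages(pages):
    """Split (title, url) pairs into site-map pages of at most 100 links each."""
    chunks = [pages[i:i + MAX_LINKS_PER_PAGE]
              for i in range(0, len(pages), MAX_LINKS_PER_PAGE)]
    html_pages = []
    for n, chunk in enumerate(chunks, start=1):
        items = "\n".join(
            f'  <li><a href="{escape(url)}">{escape(title)}</a></li>'
            for title, url in chunk
        )
        # Plain HTML text links only: no JavaScript links, no image maps.
        html_pages.append(f"<h1>Site map (page {n})</h1>\n<ul>\n{items}\n</ul>")
    return html_pages

# Usage: regenerate the map whenever content is published.
pages = [(f"Article {i}", f"/articles/article-{i}.htm") for i in range(250)]
for n, body in enumerate(build_sitemap_pages(pages), start=1):
    with open(f"sitemap-{n}.html", "w", encoding="utf-8") as f:
        f.write(body)
```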
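Second, the friendly-URL aliasing: a sketch assuming a CMS that resolves requests through an alias table, where the keyword-rich path is what users and crawlers see and the dynamic URL is used internally. The names `slugify`, `aliases`, and `resolve` are hypothetical; the dynamic URL is the one from the example above.

```python
# Sketch: map keyword-rich friendly paths to internal dynamic URLs.

import re

def slugify(title: str) -> str:
    """Turn a page title into a keyword-rich, lowercase URL segment."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Friendly alias -> internal dynamic URL (hidden from users and crawlers).
aliases = {
    "/business-conferences": (
        "/menuitem.6dab26af2ec93c76a68e76b1805101ca/"
        "?vgnextoid=88c8c55c1edb7010VgnVCM100000081510acRCRD"
    ),
}

def resolve(path: str) -> str:
    """Return the internal URL the CMS should render for a friendly path."""
    return aliases.get(path, path)

print(slugify("Business Conferences"))   # business-conferences
print(resolve("/business-conferences"))  # the internal dynamic URL
```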
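Third, the subdirectory depth: a small publishing-time check that flags URLs nested too deep, regardless of where the file physically lives. The two-level threshold is an assumption for illustration, not a rule from the text.

```python
# Sketch: flag public URLs that expose too many subdirectory levels.

MAX_DEPTH = 2  # assumed budget: at most two directories below the home page

def url_depth(path: str) -> int:
    """Count directory levels in a URL path, excluding the file name."""
    segments = [s for s in path.strip("/").split("/") if s]
    return max(len(segments) - 1, 0)

for path in ["/Conferences/Congresos_negocios.htm",
             "/a/b/c/d/e/f/Congresos_negocios.htm"]:
    depth = url_depth(path)
    status = "ok" if depth <= MAX_DEPTH else "too deep"
    print(f"{path}: {depth} level(s) ({status})")
```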
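Finally, a sketch of the external-link validation the last item says most managers lack, using only the Python standard library. A production checker would add retries, rate limiting, and a fallback to GET for servers that reject HEAD requests.

```python
# Sketch: verify that outbound links still answer with a non-error status.

import urllib.request
import urllib.error

def link_is_alive(url: str, timeout: float = 10.0) -> bool:
    """Return True if the external URL answers with a non-error status."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    # HTTPError is a subclass of URLError, so 4xx/5xx responses land here too.
    except (urllib.error.URLError, ValueError):
        return False

for url in ["https://example.com/", "https://example.com/missing-page"]:
    print(url, "->", "ok" if link_is_alive(url) else "broken")
```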