Disallow: for a wrapper page?

14 years 4 months ago #4078 by vidweb
I have a number of wrappers on my site.

For good SEO, I want to make sure that search-engine crawlers do not spider these wrapper pages, as that would affect my PageRank.

In the Article Manager, when editing an article, the Metadata Information section has a "Robots" field where you can insert a directive such as disallow: to have crawlers exclude that page.
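
From what I can tell, that field just ends up as a robots meta tag in the page's head, so entering noindex, nofollow would produce something like this (exact markup may vary by Joomla version):

    <meta name="robots" content="noindex, nofollow" />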

Is it possible to include a disallow: for a wrapper page, and if so, how?

Cheers


14 years 4 months ago - 14 years 4 months ago #4082 by ivan.milic
If I understood you correctly, you want to tell the crawler to exclude a specific element, like a div. That is not possible for now; rules generally apply to paths.
But you can disallow any page or directory by editing the robots.txt file in your site's root directory.
See: www.rankspirit.com/frobotseng.php
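
For a Joomla wrapper specifically, a minimal robots.txt sketch might look like the lines below. The paths are assumptions for illustration: com_wrapper is the component behind Joomla's wrapper menu item, and /my-wrapper stands in for whatever SEF alias your menu gives the page.

    # Block all crawlers from the wrapper component URL and its SEF alias
    User-agent: *
    Disallow: /index.php?option=com_wrapper
    Disallow: /my-wrapper

Keep in mind that Disallow only stops crawling; a page that other sites link to can still appear in search results, so a noindex meta tag is the more reliable way to keep a page out of the index.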

