Disallow: for a wrapper page?

14 years 11 months ago #4078 by vidweb
I have a number of wrappers on my site.

For good SEO I want to make sure that search-engine crawlers do not spider these wrapper pages, as it would affect my PageRank.

Within the Article Manager, under an article's Metadata Information, there is a "Robots" field where you can insert a directive such as disallow: to have the crawler exclude that page.
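
(From what I can tell, that field just ends up as a standard robots meta tag in the page head, e.g. setting it to noindex, nofollow outputs something like:

<meta name="robots" content="noindex, nofollow">

but I don't see an equivalent for wrappers.)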

Is it possible to somehow include a disallow: for a wrapper and, if so, how?

Cheers


14 years 11 months ago - 14 years 11 months ago #4082 by ivan.milic
If I understood you correctly, you want to tell the crawler to exclude some element, like a div, etc. That is not possible for now; rules generally apply to paths.
But you can disallow any page or directory by editing the robots.txt file in your site's root directory.
See: www.rankspirit.com/frobotseng.php
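
For example, assuming your wrapper pages are served under a path such as /component/wrapper/ (that path is only a guess, match it to your site's actual wrapper URLs), the robots.txt entry would look like:

# block all crawlers from the wrapper pages
User-agent: *
Disallow: /component/wrapper/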
Last edit: 14 years 11 months ago by ivan.milic.
