- Posts: 1
- Thank you received: 0
Disallow: for a wrapper page?
14 years 11 months ago #4078
by vidweb
Disallow: for a wrapper page? was created by vidweb
I have a number of wrapper pages on my site.
For good SEO I want to make sure that search engine crawlers do not spider these wrapper pages, as that would affect my PageRank.
In the Article Manager, each article has a "Robots" field under its Metadata Information where you can enter a directive to have crawlers exclude that page.
Is it possible to set a similar disallow for a wrapper, and if so, how?
Cheers
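For reference, Joomla renders that Robots metadata field into the page head as a standard robots meta tag; a minimal sketch of what a crawler would see (the noindex, nofollow value is an assumed entry for the field, not something from this thread):

    <meta name="robots" content="noindex, nofollow" />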
ivan.milic
- Support Staff
- Moderator
- Posts: 14116
- Thank you received: 1639
14 years 11 months ago - 14 years 11 months ago #4082
by ivan.milic
Replied by ivan.milic on topic Re: Disallow: for a wrapper page?
If I understood you correctly, you want to tell the crawler to exclude some element of a page, like a particular div. That is not possible for now; robots rules generally apply to paths.
But you can disallow any page or directory by editing the robots.txt file in your site's root directory.
See: www.rankspirit.com/frobotseng.php
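As a minimal sketch (the wrapper paths below are placeholders; substitute the real URLs of your wrapper menu items), a robots.txt in the site root could look like this:

    User-agent: *
    Disallow: /my-wrapper-page
    Disallow: /component/wrapper/

Each Disallow line matches a URL path prefix, so one rule can cover a single page or a whole directory of pages.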
Last edit: 14 years 11 months ago by ivan.milic.