Support » Plugin: The SEO Framework » Sitemap

  • Attempting to use your plugin to create a sitemap… however, neither the sitemap nor the robots file seems to work in a multi-site subfolder setup with domain mapping.

    Known issue, or am I doing something wrong?

    Thanks.

Viewing 1 replies (of 1 total)
  • Plugin Author Sybre Waaijer

    (@cybr)

    Hi @jiggaman,

    Does your robots.txt file start with the following?

    # This is an invalid robots.txt location.
    # Please visit: example.com/robots.txt

    If not, then the robots.txt file can’t be adjusted by The SEO Framework.
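    To illustrate the check described above, here's a minimal sketch in Python. The notice text is taken verbatim from this reply; the helper name is hypothetical, not part of The SEO Framework.

    ```python
    # Hypothetical helper: detect the invalid-location notice that
    # The SEO Framework prepends to a robots.txt it cannot control.
    NOTICE_PREFIX = "# This is an invalid robots.txt location."

    def has_invalid_location_notice(robots_txt: str) -> bool:
        """True if the served robots.txt begins with the notice comment."""
        return robots_txt.lstrip().startswith(NOTICE_PREFIX)

    # Example: a file carrying the notice vs. a plain robots.txt.
    with_notice = (
        "# This is an invalid robots.txt location.\n"
        "# Please visit: example.com/robots.txt\n"
    )
    plain = "User-agent: *\nDisallow:\n"

    print(has_invalid_location_notice(with_notice))  # True
    print(has_invalid_location_notice(plain))        # False
    ```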

    Regardless, that notice should be output, because robots.txt files aren’t read from subdirectories.

    For more information, please visit: http://www.robotstxt.org/robotstxt.html

    Where to put it
    The short answer: in the top-level directory of your web server.

    When a robot looks for the “/robots.txt” file for a URL, it strips the path component from the URL (everything from the first single slash), and puts “/robots.txt” in its place.

    For example, for “http://www.example.com/shop/index.html”, it will remove the “/shop/index.html”, replace it with “/robots.txt”, and end up with “http://www.example.com/robots.txt”.
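    That path-stripping rule can be sketched in a few lines of Python using the standard library; this is only an illustration of crawler behavior, not code from the plugin.

    ```python
    from urllib.parse import urlsplit

    def robots_txt_url(page_url: str) -> str:
        """Strip the path component from a URL and append /robots.txt,
        mirroring how crawlers locate the robots file."""
        parts = urlsplit(page_url)
        return f"{parts.scheme}://{parts.netloc}/robots.txt"

    print(robots_txt_url("http://www.example.com/shop/index.html"))
    # http://www.example.com/robots.txt
    ```

    Note that a subdirectory site such as example.com/site2/ resolves to the same example.com/robots.txt as the main site, which is why per-subdirectory robots files are never read.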

    If you’re running a MultiSite subdirectory installation where each subdirectory is dedicated to its own site/business, then I’m afraid your websites will be very confusing for Google and will have a hard time ranking.
    On the other hand, if all directories serve a purpose for the same domain as a whole, then you’re set up great! Alas, the sitemap won’t contain all sites’ items (yet).

    When you’re using a subdomain installation, each site will be seen as its own instance/business (unless the sites are strongly linked to each other).

    Edit: I’ve opened a related GitHub issue (147) regarding expanded MS subdirectory sitemap entries.
