Due to Lemmy's default robots.txt and meta tags, search engines will index even non-local communities. This can produce undesirable results, such as unrelated or unwanted content being associated with your instance.

As of today, lemmy-ui does not allow hiding non-local (or any) communities from Google and other search engines. If you, like me, do not want your instance to be associated with other content, you can add a custom robots.txt and response headers to avoid indexing.

In nginx, add this to your server block:

# Disallow all search engines
location / {
    ...
    # "always" ensures the header is also sent on error responses (4xx/5xx)
    add_header X-Robots-Tag "noindex" always;
}

location = /robots.txt {
    # default_type sets the Content-Type of the returned body;
    # using add_header for Content-Type would emit a duplicate header
    default_type text/plain;
    return 200 "User-agent: *\nDisallow: /\n";
}

Here’s a commit in my fork of the lemmy-ansible playbook. And here’s a corresponding issue I opened in lemmy-ui.

I hope this helps someone :-)

  • Serinus@lemmy.ml · 1 year ago

    If you do this, I’d recommend excluding at least your most common communities. Google searching Reddit has been a great tool over the years and has improved discoverability of the service as a whole, especially for smaller communities.

    Feels kind of like shooting yourself in the foot. Maybe just exclude NSFW communities (though, do those even exist here?)

    • Scrubbles@poptalk.scrubbles.tech · 1 year ago

      I agree, you do you, but IMO if you want to host a lemmy instance (that’s not private), this is kind of part of the deal. If you host communities, you are literally opening yourself up like this.

    • binwiederhier@discuss.ntfy.sh (OP) · 1 year ago

      There is no way to exclude individual communities. The post URLs are generic, like /post/1234. From nginx or other proxies, I cannot tell what community they belong to. I would love to have my own be searchable, but not at the price of tainting my project’s reputation.

  • NXL@lemmy.ml · 1 year ago

    Please don’t do this; keep information easy to google. The best part of Reddit was how many hours it saved when googling for information on stuff.

    • binwiederhier@discuss.ntfy.sh (OP) · 1 year ago

      There are plenty of instances that copy the original content. As an instance owner who runs only a single project-specific community, I should be able to decide what content is available on my domain and what isn’t. Don’t you think?

      Aside from the questionable content, there are also legal issues around it that I’d rather not deal with.

      • NXL@lemmy.ml · 1 year ago

        Yes, it’s your choice. I would prefer that this is rarely done, though, to increase the likelihood of information being indexed and easily found via Google searches.

  • parmesancrabs@lemmy.ml · edited · 1 year ago

    Would it be a better idea to exclude any URLs that match /c/*@*.*? I think that would block external communities but keep local ones indexable at their native locations.
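
    A sketch of what that could look like in nginx (untested; it assumes federated community pages in lemmy-ui live under URLs of the form /c/name@remote.example):

    ```nginx
    # Untested sketch: send a noindex header only for remote (federated)
    # community pages, matching /c/<name>@<instance>
    location ~ "^/c/[^/@]+@[^/]+" {
        add_header X-Robots-Tag "noindex" always;
        ...  # same proxy settings as the main location block
    }
    ```

    Note that, as pointed out earlier in the thread, this only covers the community pages themselves; individual posts use generic /post/<id> URLs that don’t reveal their community to the proxy.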