• kingthrillgore@lemmy.ml
    9 months ago

robots.txt has been an unofficial standard for 30 years, and it's augmented by sitemap.xml to help index uncrawlable pages and by Schema.org to expose content for the Semantic Web. I'm not saying it shouldn't be a law, but citing "it would change norms" as the reason is a pretty weak counterargument, man.
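
For context, a minimal robots.txt showing how the two work together — crawl rules plus a Sitemap directive pointing crawlers at pages they can't reach by following links (the domain and paths here are just placeholders):

```
# Keep crawlers out of a section of the site
User-agent: *
Disallow: /private/

# Point crawlers at a sitemap listing pages that aren't linked from anywhere
Sitemap: https://example.com/sitemap.xml
```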