
Commit 9df1640

add robots.txt to guide crawlers
1 parent 967c0e2 commit 9df1640

File tree

2 files changed: +3 -1 lines


doc/conf.py

Lines changed: 1 addition & 1 deletion
@@ -209,7 +209,7 @@
 # Add any extra paths that contain custom files (such as robots.txt or
 # .htaccess) here, relative to this directory. These files are copied
 # directly to the root of the documentation.
-# html_extra_path = []
+html_extra_path = ['robots.txt']
 
 # If not '', a 'Last updated on:' timestamp is inserted at every page bottom,
 # using the given strftime format.
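For context, `html_extra_path` is a standard Sphinx option: every file it lists is copied verbatim to the root of the built HTML output, which is exactly what a `robots.txt` needs. A minimal sketch of the relevant `doc/conf.py` fragment (the `_build/html` output path is the Sphinx default, not part of this commit):

```python
# doc/conf.py (fragment)
# Files listed here are copied as-is to the output root at build time,
# so doc/robots.txt ends up at e.g. _build/html/robots.txt and is served
# from the site root, where crawlers expect it.
html_extra_path = ['robots.txt']
```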

doc/robots.txt

Lines changed: 2 additions & 0 deletions
@@ -0,0 +1,2 @@
+User-agent: *
+Sitemap: https://contrib.scikit-learn.org/skglm/sitemap.xml
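This two-line `robots.txt` applies to all crawlers, sets no `Disallow` rules (everything may be fetched), and advertises the sitemap. One way to sanity-check the file is Python's stdlib `urllib.robotparser`, which parses the same format (a sketch; the page URL is only an example):

```python
from urllib.robotparser import RobotFileParser

# Parse the two robots.txt lines directly, without any network access.
rp = RobotFileParser()
rp.modified()  # mark rules as loaded; otherwise can_fetch() always returns False
rp.parse([
    "User-agent: *",
    "Sitemap: https://contrib.scikit-learn.org/skglm/sitemap.xml",
])

# With no Disallow rules, every crawler may fetch every path.
print(rp.can_fetch("*", "https://contrib.scikit-learn.org/skglm/index.html"))
# The advertised sitemap is available via site_maps() (Python 3.8+).
print(rp.site_maps())
```

Note the `rp.modified()` call: when rules are fed in via `parse()` instead of `read()`, the parser still assumes nothing may be fetched until it is told the rules are current.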
