[BUG] robots.txt appends lines multiple times #3012

@MyUncleSam

Description

What happened?

I enabled the robots.txt default setting via the global config.

This adds the following block:

# robots.txt generated by BunkerWeb (https://bunkerweb.io)
Disallow: /

But whenever I revisit robots.txt, it appends (as far as I can see) at least 2 new duplicate entries per visit, ending up with something like:

# robots.txt generated by BunkerWeb (https://bunkerweb.io)
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
...

It also seems to get reset from time to time, starting over from a single entry and then filling up again.
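The growth pattern looks like the generator appends the rules to the existing file on each request instead of regenerating it. To illustrate what I mean (a hypothetical sketch, not BunkerWeb's actual generator code):

```python
# Suspected pattern: appending rules on every visit grows the file,
# while regenerating the whole file each time is idempotent.

HEADER = "# robots.txt generated by BunkerWeb (https://bunkerweb.io)"
RULES = ["Disallow: /"]

def append_rules(existing: str) -> str:
    """Buggy variant: blindly appends the rules on every visit."""
    if not existing:
        existing = HEADER
    return existing + "\n" + "\n".join(RULES)

def rewrite_rules(_existing: str) -> str:
    """Idempotent variant: regenerates the whole file each time."""
    return "\n".join([HEADER, *RULES])

content = ""
for _ in range(3):  # simulate three visits to robots.txt
    content = append_rules(content)
print(content.count("Disallow: /"))  # grows with each visit: 3

content = ""
for _ in range(3):
    content = rewrite_rules(content)
print(content.count("Disallow: /"))  # stays at 1
```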

How to reproduce?

As far as I can see, I only set one variable that could affect robots.txt:

  • USE_ROBOTSTXT: yes
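For reference, this is the only robots.txt-related setting in my Docker setup (a minimal excerpt; service and image names here are illustrative, not my full compose file):

```yaml
services:
  bunkerweb:
    image: bunkerity/bunkerweb:1.6.6
    environment:
      USE_ROBOTSTXT: "yes"
```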

Configuration file(s) (yaml or .env)

Relevant log output

BunkerWeb version

1.6.6

What integration are you using?

Docker

Linux distribution (if applicable)

No response

Removed private data

  • I have removed all private data from the configuration file and the logs

Code of Conduct

  • I agree to follow this project's Code of Conduct
