Commit 90cae92

Dark Visitors rebranded to Known Agents
See https://knownagents.com/posts/dark-visitors-is-now-known-agents. The Discourse plugin itself is still named Dark Visitors.
2 parents: 8ac26c0 + eaa5e68

File tree: 8 files changed (+52, -40 lines)


.gitignore

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+bin/rspec

Gemfile.lock

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ GEM
     prettier_print (1.2.1)
     prism (1.4.0)
     racc (1.8.1)
-    rack (3.2.3)
+    rack (2.2.17)
     rainbow (3.1.1)
     regexp_parser (2.11.2)
     rubocop (1.79.2)

README.md

Lines changed: 8 additions & 8 deletions
@@ -1,15 +1,15 @@
-# Discourse Dark Visitors Plugin
+# Dark Visitors: Discourse Known Agents Plugin

-This [Discourse](https://discourse.com) plugin adds an integration with [Dark Visitors](https://darkvisitors.com).
-Via Dark Visitors you will get some insights into which bots or scrapers visit your forum.
+This [Discourse](https://discourse.com) plugin adds an integration with [Known Agents](https://knownagents.com), previously called _Dark Visitors_.
+Via Known Agents you get insight into which bots or scrapers visit your forum.

 It provides the following features:

 - augmenting robots.txt
 - server analytics
 - client analytics

-In order to use this plugin you need to sign up with [Dark Visitors](https://darkvisitors.com).
+In order to use this plugin you need to sign up with [Known Agents](https://knownagents.com).

 For more information and discussion see [this thread](https://meta.discourse.org/t/dark-visitors/365158) on the Discourse Meta forum.

@@ -24,7 +24,7 @@ It is no definite guarantee that there are no issues.

 ## Augmenting robots.txt

-With this enabled the robots.txt file created by Discourse will be augmented with [agents](https://darkvisitors.com/agents) from the configured categories.
+With this enabled the robots.txt file created by Discourse will be augmented with [agents](https://knownagents.com/agents) from the configured categories.
 Once a day the latest list of agents is retrieved and the robots.txt is updated accordingly.
 Only agents which are not already registered in the robots.txt are added.

@@ -34,15 +34,15 @@ This feature only works if you have not manually overridden robots.txt.

 ## Server Analytics

-Requests to the server are reported to Dark Visitors.
+Requests to the server are reported to Known Agents.

 This feature can be enabled for everybody, or only unauthenticated users (recommended).

 ## Client Analytics

-A javascript based tracker is added to the forum which, under certain conditions, will report back to Dark Visitors.
+A JavaScript-based tracker is added to the forum which, under certain conditions, will report back to Known Agents.

-At the moment of writing under the following conditions trigger a callback to Dark Visitors:
+At the moment of writing, the following conditions trigger a callback to Known Agents:
 - User is referred to the forum from an AI service
 - The browser might be a scraper
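The robots.txt augmentation described in the README boils down to emitting one deny rule per agent. A minimal Ruby sketch, purely illustrative (the method name and agent names below are hypothetical, not the plugin's code):

```ruby
# Illustrative sketch of robots.txt augmentation: each agent retrieved from
# the configured categories becomes a User-agent/Disallow pair. Method name
# and agent names are hypothetical.
def robots_block(agents, path: "/")
  agents.map { |agent| "User-agent: #{agent}\nDisallow: #{path}" }.join("\n\n")
end

puts robots_block(["ExampleBot", "SampleCrawler"])
# User-agent: ExampleBot
# Disallow: /
#
# User-agent: SampleCrawler
# Disallow: /
```

The plugin only appends agents that are not already present in the robots.txt Discourse generates, so manually registered agents are left untouched.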

config/locales/server.en.yml

Lines changed: 6 additions & 6 deletions
@@ -1,11 +1,11 @@
 en:
   site_settings:
-    darkvisitors_access_token: "Go to the project settings at Dark Visitors and copy the Access Token."
-    darkvisitors_robots_txt_enabled: "Only when enabled will the robots.txt be updated with agents retrieved from Dark Visitors."
-    darkvisitors_robots_txt_agents: "Agent types to include in the robots.txt. See https://darkvisitors.com/agents for an overview of all agents."
+    darkvisitors_access_token: "Go to the project settings at Known Agents and copy the Access Token."
+    darkvisitors_robots_txt_enabled: "Only when enabled will the robots.txt be updated with agents retrieved from Known Agents."
+    darkvisitors_robots_txt_agents: "Agent types to include in the robots.txt. See the <a href=\"https://knownagents.com/agents\">agent list</a> for an overview of all agents and what they are used for."
     darkvisitors_robots_txt_path: "Path not allowed to be visited by the agents."
-    darkvisitors_server_analytics: "Send request data to Dark Visitors for analysis. With 'anonymous_only' only requests form unauthenticated visitors are reported."
-    darkvisitors_server_analytics_include: "Besides standard tracked requests, what other requests should be reported to Dark Visitors."
+    darkvisitors_server_analytics: "Send request data to Known Agents for analysis. With 'anonymous_only' only requests from unauthenticated visitors are reported."
+    darkvisitors_server_analytics_include: "Besides standard tracked requests, what other requests should be reported to Known Agents."
     darkvisitors_server_analytics_ignore: "User agent substrings (case sensitive) to ignore. Use it to exclude uptime monitoring bots."
     darkvisitors_client_analytics: "Add client-side analytics. Redirects from AI services can be traced this way. It also helps to identify possible hidden AI agents."
-    darkvisitors_client_analytics_project_key: "Go to the project settings at Dark Visitors and copy the project_key value from the \"JavaScript Tag\"."
+    darkvisitors_client_analytics_project_key: "Go to the project settings at Known Agents and copy the project_key value from the \"JavaScript Tag\"."

config/settings.yml

Lines changed: 18 additions & 18 deletions
@@ -16,21 +16,21 @@ discourse_darkvisitors:
     allow_any: true
     list_type: compact
     choices:
-      - AI Agent
-      - AI Assistant
-      - AI Data Scraper
-      - AI Search Crawler
-      - Archiver
-      - Developer Helper
-      - Fetcher
-      - Headless Agent
-      - Intelligence Gatherer
-      - Scraper
-      - SEO Crawler
-      - Search Engine Crawler
-      - Security Scanner
-      - Undocumented AI Agent
-      - Uncategorized
+      - AI Agent
+      - AI Assistant
+      - AI Data Scraper
+      - AI Search Crawler
+      - Archiver
+      - Developer Helper
+      - Fetcher
+      - Automated Agent
+      - Intelligence Gatherer
+      - Scraper
+      - SEO Crawler
+      - Search Engine Crawler
+      - Security Scanner
+      - Undocumented AI Agent
+      - Uncategorized
   darkvisitors_robots_txt_path:
     default: '/'
     client: false

@@ -74,14 +74,14 @@ discourse_darkvisitors:
     default: false
     hidden: true
   darkvisitors_robots_txt_api:
-    default: "https://api.darkvisitors.com/robots-txts"
+    default: "https://api.knownagents.com/robots-txts"
     client: false
     hidden: true
   darkvisitors_server_analytics_api:
-    default: "https://api.darkvisitors.com/visits"
+    default: "https://api.knownagents.com/visits"
     client: false
     hidden: true
   darkvisitors_client_analytics_script:
-    default: "https://darkvisitors.com/tracker.js"
+    default: "https://knownagents.com/tracker.js"
     client: true
     hidden: true
Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
+# frozen_string_literal: true
+
+class HeadlessAgentToAutomatedAgent < ActiveRecord::Migration[7.2]
+  def up
+    execute "UPDATE site_settings SET value = REPLACE(value, 'Headless Agent', 'Automated Agent') WHERE name = 'darkvisitors_robots_txt_agents' AND value LIKE '%Headless Agent%'"
+  end
+
+  def down
+    execute "UPDATE site_settings SET value = REPLACE(value, 'Automated Agent', 'Headless Agent') WHERE name = 'darkvisitors_robots_txt_agents' AND value LIKE '%Automated Agent%'"
+  end
+end
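The migration above renames the stored choice with a SQL `REPLACE`, and the `down` method simply reverses it. The same transformation can be sketched in plain Ruby; note that the pipe-delimited storage format shown here is an assumption about how Discourse persists list-type site settings, not something the diff confirms:

```ruby
# Plain-Ruby sketch of what the migration's REPLACE(...) does to the stored
# darkvisitors_robots_txt_agents value. The pipe-delimited value format is
# an assumption for illustration.
def rename_agent(value, from: "Headless Agent", to: "Automated Agent")
  value.include?(from) ? value.gsub(from, to) : value
end

rename_agent("AI Agent|Headless Agent|Scraper")
# => "AI Agent|Automated Agent|Scraper"
```

The `LIKE '%Headless Agent%'` guard in the SQL plays the same role as the `include?` check: rows without the old name are left untouched.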

lib/robots_txt.rb

Lines changed: 5 additions & 5 deletions
@@ -20,16 +20,16 @@ def self.on_robots_info(robots_info)
       end
     end
     robots_info[:header] = robots_info[:header] +
-      "\n# Augmented by Dark Visitors on #{config[:last_update]} with #{config[:agents].count} agents"
+      "\n# Augmented by Known Agents on #{config[:last_update]} with #{config[:agents].count} agents"
   end

   def self.update_robots_txt
     return unless SiteSetting.darkvisitors_robots_txt_enabled
     if SiteSetting.darkvisitors_access_token == ""
-      Rails.logger.warn "Cannot update robots.txt from Dark Visitors. No access_token configured."
+      Rails.logger.warn "Cannot update robots.txt from Known Agents. No access_token configured."
       return
     end
-    Rails.logger.info "Updating Dark Visitors robots.txt"
+    Rails.logger.info "Updating Known Agents robots.txt"

     uri =
       URI(
@@ -48,7 +48,7 @@ def self.update_robots_txt
       }
       response = Net::HTTP.post(uri, request, headers)
       unless response.code == "200"
-        Rails.logger.error "Dark Visitors robots-txt API failure: #{response.code}"
+        Rails.logger.error "Known Agents robots-txt API failure: #{response.code}"
         return
       end

@@ -76,7 +76,7 @@ def self.update_robots_txt
       ROBOTS_TXT,
       { last_update: DateTime.now.to_s, agents: agents }
     )
-    Rails.logger.info "Received #{agents.count} agents to deny from Known Agents"
+    Rails.logger.info "Received #{agents.count} agents to deny from Known Agents"
   end
 end
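For context, `update_robots_txt` posts the configured agent types to the robots-txts endpoint (the URL comes from `darkvisitors_robots_txt_api` in config/settings.yml) with the access token in a header. A hedged sketch of building such a request; the body field names (`agent_types`, `disallow`) are assumptions for illustration, not the plugin's confirmed request schema:

```ruby
require "json"
require "uri"

# Hypothetical helper that assembles the pieces of the POST request to the
# Known Agents robots-txts API. Only the endpoint URL is taken from the
# diff; the JSON body fields are assumed for illustration.
def build_robots_txt_request(token, agent_types, path: "/")
  {
    uri: URI("https://api.knownagents.com/robots-txts"),
    headers: {
      "Authorization" => "Bearer #{token}",
      "Content-Type" => "application/json"
    },
    body: { agent_types: agent_types, disallow: path }.to_json
  }
end

req = build_robots_txt_request("YOUR_ACCESS_TOKEN", ["AI Data Scraper", "Scraper"])
# The plugin itself sends the request via Net::HTTP.post and logs any
# non-200 response as a failure, as the diff above shows.
```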

plugin.rb

Lines changed: 2 additions & 2 deletions
@@ -1,9 +1,9 @@
 # frozen_string_literal: true

 # name: discourse-darkvisitors
-# about: Connects to Dark Visitors to keep the robots.txt up to date and monitor crawlers and scrapers visiting your forum.
+# about: Connects to Known Agents (formerly known as Dark Visitors) to keep the robots.txt up to date and monitor crawlers and scrapers visiting your community.
 # meta_topic_id: 365158
-# version: 1.2
+# version: 1.3
 # authors: elmuerte
 # url: https://github.com/magicball-network/discourse-darkvisitors
 # required_version: 3.4.0
