Commit de6e42f

Update documentation
1 parent b8a7965 commit de6e42f

File tree

2 files changed: +40 −0 lines changed

docs/Scraper-Reference.md

Lines changed: 39 additions & 0 deletions
@@ -184,3 +184,42 @@ More information about how filters work is available on the [Filter Reference](.
Overrides the `:title` option for the root page only.

_Note: this filter is disabled by default._

## Keeping scrapers up-to-date

To keep scrapers up-to-date, the `get_latest_version(options, &block)` method should be overridden by every scraper that defines the `self.release` attribute. This method should return the latest version of the documentation being scraped. Its result is periodically collected in a "Documentation versions report" issue, which helps maintainers keep track of outdated documentations.
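As a minimal, self-contained sketch of the override shape: the `Scraper` base class and `ExampleScraper` below are stand-ins invented for illustration, and `get_latest_github_release` is stubbed with a canned response instead of performing the real GitHub API request.

```ruby
# Stub standing in for the real scraper base class; the real
# get_latest_github_release performs a GitHub API request.
class Scraper
  def get_latest_github_release(owner, repo, opts)
    yield({ "tag_name" => "v3.4.1" }) # canned response for this sketch
  end
end

class ExampleScraper < Scraper
  # Report the latest released version so the periodic
  # "Documentation versions report" can flag an outdated self.release.
  def get_latest_version(opts, &block)
    get_latest_github_release("example-org", "example", opts) do |release|
      block.call(release["tag_name"].delete_prefix("v")) # "v3.4.1" -> "3.4.1"
    end
  end
end

latest = nil
ExampleScraper.new.get_latest_version({}) { |v| latest = v }
latest # => "3.4.1"
```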
To make life easier, there are a few utility methods that you can use in `get_latest_version`:

* `fetch(url, options, &block)`

  Makes a GET request to the url and calls `&block` with the response body.

  Example: [lib/docs/scrapers/bash.rb](../lib/docs/scrapers/bash.rb)

* `fetch_doc(url, options, &block)`

  Makes a GET request to the url and calls `&block` with the HTML body converted to a Nokogiri document.

  Example: [lib/docs/scrapers/git.rb](../lib/docs/scrapers/git.rb)

* `fetch_json(url, options, &block)`

  Makes a GET request to the url and calls `&block` with the JSON body parsed into a Hash.

* `get_npm_version(package, options, &block)`

  Calls `&block` with the latest version of the given npm package.

  Example: [lib/docs/scrapers/bower.rb](../lib/docs/scrapers/bower.rb)

* `get_latest_github_release(owner, repo, options, &block)`

  Calls `&block` with the latest GitHub release of the given repository ([format](https://developer.github.com/v3/repos/releases/#get-the-latest-release)).

  Example: [lib/docs/scrapers/jsdoc.rb](../lib/docs/scrapers/jsdoc.rb)

* `get_github_tags(owner, repo, options, &block)`

  Calls `&block` with the list of tags on the given repository ([format](https://developer.github.com/v3/repos/#list-tags)).

  Example: [lib/docs/scrapers/liquid.rb](../lib/docs/scrapers/liquid.rb)

* `get_github_file_contents(owner, repo, path, options, &block)`

  Calls `&block` with the contents of the requested file in the default branch of the given repository.

  Example: [lib/docs/scrapers/minitest.rb](../lib/docs/scrapers/minitest.rb)
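To show the kind of logic these helpers feed into, here is a self-contained sketch of extracting the newest version from the tag list that `get_github_tags` yields. The helper name `latest_version_from_tags` is made up for illustration; the tag hashes mimic the GitHub API's `{"name": "v1.2.3"}` shape.

```ruby
# Hypothetical helper: pick the newest numeric version out of a
# list of GitHub tag objects (as yielded by get_github_tags).
def latest_version_from_tags(tags)
  tags
    .map { |tag| tag["name"].sub(/\Av/, "") }         # "v1.10.2" -> "1.10.2"
    .select { |name| name.match?(/\A\d+(\.\d+)*\z/) } # drop non-version tags
    .max_by { |name| name.split(".").map(&:to_i) }    # compare numerically, not lexically
end

tags = [{ "name" => "v1.9.0" }, { "name" => "v1.10.2" }, { "name" => "latest" }]
latest_version_from_tags(tags) # => "1.10.2"
```

Comparing the split integer arrays avoids the classic string-comparison bug where `"1.9.0"` sorts above `"1.10.2"`.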

docs/adding-docs.md

Lines changed: 1 addition & 0 deletions
@@ -16,6 +16,7 @@ Adding a documentation may look like a daunting task but once you get the hang o
9. To customize the pages' styling, create an SCSS file in the `assets/stylesheets/pages/` directory and import it in both `application.css.scss` AND `application-dark.css.scss`. Both the file and the CSS class should be named `_[type]`, where `[type]` is the scraper's `type` attribute (documentations with the same type share the same custom CSS and JS). _(Note: feel free to submit a pull request without custom CSS/JS.)_
10. To add syntax highlighting or execute custom JavaScript on the pages, create a file in the `assets/javascripts/views/pages/` directory (take a look at the other files to see how it works).
11. Add the documentation's icon in the `public/icons/docs/[my_doc]/` directory, in both 16x16 and 32x32-pixel formats. It'll be added to the icon sprite after your pull request is merged.
12. Ensure `thor updates:check [my_doc]` shows the correct latest version.

If the documentation includes more than a few hundred pages and is available for download, try to scrape it locally (e.g. using `FileScraper`). It'll make the development process much faster and avoid putting too much load on the source site. (It's not a problem if your scraper is coupled to your local setup; just explain how it works in your pull request.)
