Commit 07038cc

fix remaining broken anchor links

1 parent f8dadcf commit 07038cc

5 files changed, +6 -6 lines changed

sources/academy/tutorials/apify_scrapers/getting_started.md (2 additions, 2 deletions)

```diff
@@ -39,7 +39,7 @@ Scroll down to the **Performance and limits** section and set the **Max pages pe
 
 Now click **Save & Run**! *(in the bottom-left part of your screen)*
 
-### [](#the-run-detail) The run detail
+### The run detail
 
 After clicking **Save & Run**, the window will change to the run detail. Here, you will see the run's log. If it seems that nothing is happening, don't worry, it takes a few seconds for the run to fully boot up. In under a minute, you should have the 10 pages scraped. You will know that the run successfully completed when the `RUNNING` card in top-left corner changes to `SUCCEEDED`.
 
@@ -108,7 +108,7 @@ Some of this information may be scraped directly from the listing pages, but for
 
 ### The start URL
 
-In the **Input** tab of the task we have, we'll change the **Start URL** from **https://apify.com**. This will tell the scraper to start by opening a different URL. You can add more **Start URL**s or even [use a file with a list of thousands of them](#-crawling-the-website-with-pseudo-urls), but in this case, we'll be good with just one.
+In the **Input** tab of the task we have, we'll change the **Start URL** from **https://apify.com**. This will tell the scraper to start by opening a different URL. You can add more **Start URL**s or even [use a file with a list of thousands of them](#crawling-the-website-with-pseudo-urls), but in this case, we'll be good with just one.
 
 How do we choose the new **Start URL**? The goal is to scrape all Actors in the store, which is available at [apify.com/store](https://apify.com/store), so we choose this URL as our **Start URL**.
```

sources/academy/tutorials/apify_scrapers/puppeteer_scraper.md (1 addition, 1 deletion)

````diff
@@ -129,7 +129,7 @@ async function pageFunction(context) {
 }
 ```
 
-### [](#modified-date) Modified date
+### Modified date
 
 The DevTools tell us that the `modifiedDate` can be found in a `<time>` element.
````

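The hunk above sits next to the tutorial's discussion of reading `modifiedDate` from a `<time>` element. As a rough sketch of that step (the datetime value below is a made-up placeholder, not data from the real page), the element's ISO 8601 `datetime` attribute can be converted to a timestamp like this:

```javascript
// Placeholder standing in for the `datetime` attribute of the <time> element;
// in the actual scraper this value would be read from the page.
const datetimeAttr = '2018-12-13T14:22:00.000Z';

// Convert the ISO 8601 string into a Unix timestamp in milliseconds.
const modifiedDate = new Date(datetimeAttr).getTime();

console.log(modifiedDate);
```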
sources/academy/tutorials/apify_scrapers/web_scraper.md (1 addition, 1 deletion)

```diff
@@ -34,7 +34,7 @@ This will add a `context.jQuery` function that you can use.
 Now that's out of the way, let's open one of the Actor detail pages in the Store, for example
 the [Web Scraper](https://apify.com/apify/web-scraper) page and use our DevTools-Fu to scrape some data.
 
-## [](#building-our-page-function) Building our Page function
+## Building our Page function
 
 Before we start, let's do a quick recap of the data we chose to scrape:
```
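The heading fixed in this hunk introduces the tutorial's Page function, which receives a `context` object carrying `context.jQuery`. A minimal sketch of such a function follows; the selector and the stand-in `context` are illustrative assumptions, not the tutorial's actual code:

```javascript
// Sketch of a Web Scraper pageFunction; 'header h1' is an assumed selector.
async function pageFunction(context) {
    const $ = context.jQuery;
    return {
        url: context.request.url,
        title: $('header h1').text().trim(),
    };
}

// Stand-in context so the sketch can run outside the Apify platform.
const fakeContext = {
    request: { url: 'https://apify.com/apify/web-scraper' },
    jQuery: () => ({ text: () => '  Web Scraper  ' }),
};

pageFunction(fakeContext).then((result) => console.log(result));
```

On the platform, the object returned from the Page function becomes one dataset item per page.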
sources/platform/proxy/usage.md (1 addition, 1 deletion)

```diff
@@ -149,7 +149,7 @@ Use [sessions](#sessions) to control how you rotate IP addresses. See our guide
 
 Sessions allow you to use the same IP address for multiple connections. In cases where you need to keep the same session (e.g. when you need to log in to a website), it is best to keep the same proxy and so the IP address. On the other hand by switching the IP address, you can avoid being blocked by the website.
 
-To set a new session, pass the `session` parameter in your [username](./usage.md#username-parameters) field when connecting to a proxy. This will serve as the session's ID and an IP address will be assigned to it. To [use that IP address in other requests](/platform/proxy/datacenter-proxy##connecting-to-datacenter-proxies), pass that same session ID in the username field.
+To set a new session, pass the `session` parameter in your [username](./usage.md#username-parameters) field when connecting to a proxy. This will serve as the session's ID and an IP address will be assigned to it. To [use that IP address in other requests](/platform/proxy/datacenter-proxy#connecting-to-datacenter-proxies), pass that same session ID in the username field.
 
 We recommend you to use [SessionPool](https://crawlee.dev/api/core/class/SessionPool) abstraction when managing sessions. The created session will then store information such as cookies and can be used to generate [browser fingerprints](/academy/anti-scraping/mitigation/generating-fingerprints). You can also assign custom user data such as authorization tokens and specific headers.
```
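To illustrate the `session` username parameter described in this hunk: a proxy URL pinned to one session might be assembled as below. The session ID is hypothetical, and `<PROXY_PASSWORD>` is left as a placeholder; the `session-...` username pattern follows the proxy usage docs being edited here.

```javascript
// Hypothetical session ID; any ID of your choosing identifies the session.
const sessionId = 'my_session_1';

// The `session-...` part of the proxy username pins connections to one IP.
// '<PROXY_PASSWORD>' is a placeholder for your actual Apify Proxy password.
const username = `session-${sessionId}`;
const proxyUrl = `http://${username}:<PROXY_PASSWORD>@proxy.apify.com:8000`;

console.log(proxyUrl);
```

Reusing the same session ID in later requests asks the proxy for the same IP address, which is what keeps logged-in state intact.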
sources/platform/storage/dataset.md (1 addition, 1 deletion)

```diff
@@ -45,7 +45,7 @@ To view or download a dataset:
 2. Select the format & configure other options if desired in **Export dataset** section.
 3. Click **Download**.
 
-Utilize the **Actions** menu to modify the dataset's name, which also affects its [retention period](./usage.md#data-retention-data-retention), and to adjust [access rights](../collaboration/index.md). The **API** button allows you to explore and test the dataset's [API endpoints](/api/v2#/reference/datasets).
+Utilize the **Actions** menu to modify the dataset's name, which also affects its [retention period](/platform/storage/usage#data-retention), and to adjust [access rights](../collaboration/index.md). The **API** button allows you to explore and test the dataset's [API endpoints](/api/v2#/reference/datasets).
 
 ![Datasets detail view](./images/datasets-detail.png)
```
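As a quick illustration of the dataset API endpoints the **API** button exposes: an export URL for dataset items can be built as below. The dataset ID is a placeholder, and the format list in the comment reflects commonly documented options rather than an exhaustive set.

```javascript
// 'YOUR_DATASET_ID' is a placeholder; substitute a real dataset ID.
const datasetId = 'YOUR_DATASET_ID';
const format = 'csv'; // other documented formats include json, jsonl, xlsx, and xml

// Items endpoint of the Apify API v2 datasets resource.
const exportUrl = `https://api.apify.com/v2/datasets/${datasetId}/items?format=${format}`;

console.log(exportUrl);
```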