Commit bf58a9b

colin2328 authored and meta-codesync[bot] committed
fix crawler example to use updated API (#1427)
Summary: as title

Pull Request resolved: #1427

Test Plan: Imported from GitHub, without a `Test Plan:` line.
Finished - Found 3127 in 85.47 seconds.
ValueMesh({procs: 8}): (({'procs': 0/8}, 392), ({'procs': 1/8}, 393), ({'procs': 2/8}, 387), ({'procs': 3/8}, 392), ({'procs': 4/8}, 394), ({'procs': 5/8}, 386), ({'procs': 6/8}, 394), ({'procs': 7/8}, 389)).

Reviewed By: dcci

Differential Revision: D83884413

Pulled By: colin2328

fbshipit-source-id: 5f26deb63a3f616002161821bdacdfc47bdb55f2
1 parent a57c16e commit bf58a9b

File tree

1 file changed: +3 −3 lines changed

docs/source/examples/crawler.py

Lines changed: 3 additions & 3 deletions
```diff
@@ -126,20 +126,20 @@ async def main():
     start_time = time.time()
 
     # Start up a ProcMesh.
-    local_proc_mesh: ProcMesh = await this_host().spawn_procs(
+    local_proc_mesh: ProcMesh = this_host().spawn_procs(
         per_host={"procs": NUM_CRAWLERS}
     )
 
     # Create queues across the mesh and use slice to target the first one; we will not use the rest.
     # TODO: Once ProcMesh::slice is implemented, avoid spawning the extra ones here.
-    all_queues = await local_proc_mesh.spawn("queues", QueueActor)
+    all_queues = local_proc_mesh.spawn("queues", QueueActor)
     target_queue = all_queues.slice(procs=slice(0, 1))
 
     # Prime the queue with the base URL we want to crawl.
     await target_queue.insert.call_one(BASE, DEPTH)
 
     # Make the crawlers and pass in the queues; crawlers will just use the first one as well.
-    crawlers = await local_proc_mesh.spawn("crawlers", CrawlActor, all_queues)
+    crawlers = local_proc_mesh.spawn("crawlers", CrawlActor, all_queues)
 
     # Run the crawlers; display the count of documents they crawled when done.
     results = await crawlers.crawl.call()
```
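For reference, here is a minimal sketch of the calling pattern this commit moves the example to, inferred only from the diff above: `spawn_procs` and `spawn` are now synchronous (no `await`), while actor endpoint invocations (`call_one`, `call`) are still awaited. The import path and the `EchoActor` class are assumptions for illustration and are not part of the crawler example itself.

```python
import asyncio

# Assumed import path, based on the Monarch project layout; the diff only
# shows that `this_host()` and `ProcMesh` are in scope in the example.
from monarch.actor import Actor, ProcMesh, endpoint, this_host

NUM_PROCS = 8  # matches the {procs: 8} mesh in the commit's test plan


class EchoActor(Actor):
    # Hypothetical stand-in for the example's QueueActor / CrawlActor.
    @endpoint
    async def echo(self, msg: str) -> str:
        return msg


async def main() -> None:
    # Post-change API: spawn_procs and spawn return immediately, no await.
    mesh: ProcMesh = this_host().spawn_procs(per_host={"procs": NUM_PROCS})
    actors = mesh.spawn("echoes", EchoActor)

    # Endpoint calls are still async. Slice to one proc and call it alone,
    # mirroring the target_queue.insert.call_one(...) pattern in the diff.
    first = actors.slice(procs=slice(0, 1))
    print(await first.echo.call_one("hello"))

    # Broadcast to the whole mesh, mirroring crawlers.crawl.call().
    results = await actors.echo.call("hello all")


asyncio.run(main())
```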
