
Commit b9208cb

Add job posts
1 parent b18d681 commit b9208cb

File tree

3 files changed: +77 -0 lines changed

src/content/config.ts

Lines changed: 1 addition & 0 deletions
@@ -246,6 +246,7 @@ const jobs = defineCollection({
      min_requirements: z.array(z.string()).optional().nullable(),
      requirements: z.array(z.string()).nullable(),
      preffered: z.array(z.string()).optional().nullable(),
+     stack: z.array(z.string()).optional().nullable(),
      benefits: z.array(z.string()).nullable(),
      description2: z.string().optional().nullable(),
      apply_link: z.string().url().optional(),
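
For context, the only schema change in this commit is the new optional `stack` field on the `jobs` collection. The `src/content/config.ts` path and the `defineCollection`/`z` calls suggest an Astro content collection; below is a minimal sketch of how the surrounding schema presumably looks. Every field and the export shape not visible in the hunk above is an assumption, not code from this repo.

```ts
// src/content/config.ts (sketch only; fields outside the hunk above are assumptions)
import { defineCollection, z } from "astro:content";

const jobs = defineCollection({
  schema: z.object({
    title: z.string(), // assumed; other fields (location, type, level, tags, salary, ...) omitted
    min_requirements: z.array(z.string()).optional().nullable(),
    requirements: z.array(z.string()).nullable(),
    preffered: z.array(z.string()).optional().nullable(),
    // Added in this commit: an optional, nullable list of tech-stack strings,
    // so existing job posts that have no `stack` key keep validating.
    stack: z.array(z.string()).optional().nullable(),
    benefits: z.array(z.string()).nullable(),
    description2: z.string().optional().nullable(),
    apply_link: z.string().url().optional(),
  }),
});

export const collections = { jobs }; // assumed export shape
```

In the frontmatter below, keys left blank (for example `benefits:`) parse as YAML `null`, which `.nullable()` accepts, while `.optional()` additionally allows the key to be missing entirely, as `stack` is in the first post.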
Lines changed: 35 additions & 0 deletions
@@ -0,0 +1,35 @@
+ ---
+ title: "Open Source Engineer (TypeScript)"
+ location:
+ type:
+ level:
+ tags:
+ salary:
+ description:
+   "Apify has grown as the tool of choice for any Node.js/JS/TS engineer when it
+   comes to web scraping and web automation. We are now also taking on the Python
+   community. Our open-source tooling is used by tens of thousands of people
+   worldwide. Check out the [Crawlee](https://crawlee.dev/) library, our
+   [GitHub](https://github.com/apify), and the vibrant community on
+   [Discord](https://discord.com/invite/jyEM2PRvMU)."
+ responsibilities:
+   - You'll be part of one of our 7 product teams—the team that works on Apify’s
+     open-source tools.
+   - The team is led by Martin Adamek, co-author of Apify’s Crawlee and his very
+     own MikroORM library.
+   - You'll face many challenges regarding the usage of browsers in the cloud,
+     browser fingerprinting, scalability, and more.
+   - But you'll also be in direct touch with the community that uses the tools
+     you build.
+ requirements:
+   - Experience with working on large, complex applications or frameworks
+   - Great skills at developing and debugging in JavaScript/Node.js/TypeScript, or
+     equivalent skills in another programming language and the ability to learn it
+     quickly
+   - Familiarity with Linux
+   - Experience in open-source development is a plus
+   - Experiment-driven and collaborative mindset
+   - Good communication skills in English
+ benefits:
+ apply_link: https://jobs.ashbyhq.com/apify/7ad57272-61d1-49f6-8a00-6708cabb940e/application?utm_source=career_page_apify
+ ---
Lines changed: 41 additions & 0 deletions
@@ -0,0 +1,41 @@
+ ---
+ title: "Web Automation Engineer (Anti-Scraping)"
+ location:
+ type:
+ level:
+ tags:
+ salary:
+ description:
+ responsibilities:
+   - To automate human activities on the web and advance our mission, we need
+     robots that behave and act like real users, and this is where we need
+     your help!
+   - You will be part of one of our 5 product teams—the team that works on
+     Apify’s open-source tooling loved by the community.
+   - The team is led by Martin Adamek, co-author and maintainer of Apify’s
+     Crawlee and his own MikroORM framework.
+   - You will research existing anti-scraping services and implement new
+     open-source libraries or internal services to overcome them by utilizing
+     various proxy services, browser fingerprinting methods, human-like browser
+     interaction, session management, and others.
+ requirements:
+   - 3+ years of industry experience with Node.js/TypeScript or Python (if you
+     are open to learning TypeScript too)
+   - Prior experience in web scraping or web automation
+   - Ability to solve unique, unprecedented challenges
+   - Understanding of how the web works, from the HTTP protocol up to modern
+     browser APIs for drawing an SVG
+   - Experience in open-source development is a plus
+   - Experiment-driven and collaborative mindset
+   - Good communication skills in English
+ stack:
+   - "Frontend: React.js, styled-components, Storybook, Cypress"
+   - "Backend: TypeScript/Node.js, Next.js, Express.js, Docusaurus, Jest"
+   - "Infra: AWS, Kubernetes, Helm, MongoDB, Redis, DynamoDB, S3, GitHub Actions"
+   - "Monitoring: New Relic, LogDNA, Sentry, PagerDuty"
+   - "Tools: GitHub, ZenHub, Notion, GSuite"
+   - "Process: two-week sprints, code reviews, tests, automating whatever we can,
+     deploying multiple times per day"
+ benefits:
+ apply_link: https://jobs.ashbyhq.com/apify/8ab84138-043f-4ef7-b739-977292a9161e/application?utm_source=career_page_apify
+ ---
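
As a usage sketch (assuming the posts above land in the `jobs` collection validated by the schema change earlier in this commit; the module below is illustrative and not a file from this commit), the new `stack` field could be consumed like this:

```ts
// Illustrative consumer, not part of this commit.
import { getCollection } from "astro:content";

// entry.data is typed from the zod schema in src/content/config.ts.
const jobs = await getCollection("jobs");

for (const job of jobs) {
  const { title, apply_link, stack } = job.data;
  // `stack` may be undefined (key omitted, as in the first post) or null
  // (key present but blank), so guard before rendering it.
  const stackLine = stack?.length ? ` | stack: ${stack.join(", ")}` : "";
  console.log(`${title}${stackLine} -> ${apply_link ?? "no apply link"}`);
}
```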
