ByteFlow is an AI-native orchestration and automation platform that connects systems, automates workflows, and deploys AI agents across real-world operations, with or without APIs. It unifies agentic AI, API-to-MCP communication, and no-API automation to turn fragmented enterprise systems into intelligent, end-to-end processes that scale.

Turn any employer careers page into AI-enriched XML job feeds. 11 ATS integrations (Greenhouse, Lever, Workday, and more). Handles JS-heavy custom sites that other aggregators miss. REST API and MCP server. You choose the employers you'd like to feature on your job board (JBoard, Niceboard, Jobboardly, SmartJobBoard, or WordPress) and we do the rest.
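An XML job feed like the one described above could be consumed in a few lines of Python. The element names (`jobs`, `job`, `title`, `location`) are illustrative assumptions, not the product's actual schema; check the feed documentation for the real field names.

```python
import xml.etree.ElementTree as ET

# Hypothetical feed shape -- the real schema comes from your feed provider.
SAMPLE_FEED = """<jobs>
  <job><title>Data Engineer</title><location>Remote</location></job>
  <job><title>Recruiter</title><location>Berlin</location></job>
</jobs>"""

def parse_jobs(xml_text: str) -> list[dict]:
    """Flatten each <job> element into a plain dict."""
    root = ET.fromstring(xml_text)
    return [
        {"title": j.findtext("title"), "location": j.findtext("location")}
        for j in root.findall("job")
    ]

print(parse_jobs(SAMPLE_FEED))
```

The same loop works unchanged whether the feed comes from a file, an HTTP response body, or a string.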

Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command-line utilities make performing complex surgery on DAGs a snap, and the rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
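A minimal DAG sketch, assuming Airflow 2.x; the DAG id, schedule, and `echo` commands are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Three tasks wired into a simple extract -> transform -> load chain.
with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    transform = BashOperator(task_id="transform", bash_command="echo transform")
    load = BashOperator(task_id="load", bash_command="echo load")

    # The >> operator declares the dependency order the scheduler follows.
    extract >> transform >> load
```

Dropping this file into the `dags/` folder is enough for the scheduler to pick it up and run it on the declared schedule.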

It makes it easy to automate all your software workflows, now with world-class CI/CD. Build, test, and deploy your code right from GitHub. Make code reviews, branch management, and issue triaging work the way you want.

import.io is a free web-based platform that puts the power of the machine-readable web in your hands. Using our tools you can create an API or crawl an entire website in a fraction of the time of traditional methods, no coding required.

ParseHub is a free and powerful web scraping and data extraction tool. With its advanced web scraper, extracting data is as easy as clicking on the data you need. ParseHub lets you turn any website into a spreadsheet or API.

It implements batch and streaming data processing jobs that run on any supported execution engine, so the same pipeline can execute across multiple execution environments.
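This describes Apache Beam's model: you define the pipeline once, and a runner executes it. A minimal sketch with the Python SDK, using the default local runner and placeholder data:

```python
import apache_beam as beam

# The pipeline definition is engine-agnostic; swapping the runner
# (DirectRunner, Dataflow, Flink, Spark, ...) does not change this code.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Read" >> beam.Create(["alpha", "beta", "gamma"])  # bounded source
        | "Lengths" >> beam.Map(len)                          # per-element transform
        | "Print" >> beam.Map(print)
    )
```

The same `Create`/`Map` graph would run unchanged on a streaming source; only the source transform and runner options differ.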

Extract data from websites and turn it into an API. We handle all the rotating proxies and Chrome rendering for you. Specialists would otherwise have to manage JavaScript rendering, headless-browser updates and maintenance, and proxy diversity and rotation; this is a simple API that does all of that for you.
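Calling such a service usually means one GET request with the target URL and an API key as parameters. The endpoint and parameter names below are illustrative assumptions; substitute the ones from your provider's documentation:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical endpoint -- not a real service.
API_BASE = "https://api.example-scraper.com/v1/"

def build_request_url(api_key: str, target_url: str, render_js: bool = True) -> str:
    """Build the GET URL; the service fetches target_url on our behalf."""
    params = urlencode({
        "api_key": api_key,
        "url": target_url,
        "render_js": "true" if render_js else "false",
    })
    return f"{API_BASE}?{params}"

# Fetching the rendered HTML would then be a single call:
# html = urlopen(build_request_url("MY_KEY", "https://example.com")).read()
```

The proxy rotation and headless-browser work happens server-side, so the client stays this small.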

It is free client-side Windows web scraping software that turns unstructured or semi-structured data from websites into structured data sets, no coding necessary. Extracted data can be delivered through an API, exported as CSV or Excel files, or loaded into a database.

Developer framework to orchestrate multiple services and APIs into your software application, with logic triggered by events and schedules. Build ETL processes, A/B testing, real-time alerts and personalized user experiences with custom logic.

It is a Python module that helps you build complex pipelines of batch jobs. It handles dependency resolution, workflow management, visualization, and more. It also comes with built-in Hadoop support.
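A minimal sketch of that dependency model in Luigi: each task declares its output target and its upstream requirements, and Luigi runs only what is missing. The file names and data are placeholders:

```python
import luigi

class Extract(luigi.Task):
    """Write some raw numbers to disk."""

    def output(self):
        return luigi.LocalTarget("raw.txt")

    def run(self):
        with self.output().open("w") as f:
            f.write("1\n2\n3\n")

class Summarize(luigi.Task):
    """Sum the numbers produced by Extract."""

    def requires(self):
        # Luigi resolves this dependency and runs Extract first if needed.
        return Extract()

    def output(self):
        return luigi.LocalTarget("total.txt")

    def run(self):
        with self.input().open() as f:
            total = sum(int(line) for line in f)
        with self.output().open("w") as f:
            f.write(str(total))

if __name__ == "__main__":
    luigi.build([Summarize()], local_scheduler=True)
```

Because completeness is defined by the output targets, re-running the script skips tasks whose outputs already exist.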

Build and map powerful workflows across tools to save your team time. No coding required. Create rules to define what information flows between each of your tools, in minutes.