Web Scraping and Data Extraction

ParseHub
ParseHub is a free and powerful web scraping tool. With its advanced web scraper, extracting data is as easy as clicking on the data you need. ParseHub lets you turn any website into a spreadsheet or API.
- Works with single-page and multi-page apps
- Uses machine learning for its state-of-the-art relationship engine
- Instantly shows sample data as you work

Portia
Portia is an open-source tool that lets you get data from websites. It facilitates and automates the process of data extraction. This visual web scraper works straight from your browser, so there is nothing to download or install.
- Extracts data from websites based on the user's visual selections
- Creates generic web scrapers capable of extracting data from any page with a similar structure
- Exports scraped data in CSV, JSON, JSON Lines, and XML
- A hosted version is available as a free service on Scrapy Cloud, which gives Portia the features of a cloud-based production platform, including job scaling and scheduling, data storage, QA features, and add-ons

Statistics
                ParseHub    Portia
GitHub Stars    -           9.5K
GitHub Forks    -           1.4K
Stacks          32          26
Followers       92          66
Votes           19          0

Pros & Cons
No community feedback yet for either tool.

import.io is a free, web-based platform that puts the power of the machine-readable web in your hands. Using its tools, you can create an API or crawl an entire website in a fraction of the time of traditional methods, with no coding required.

Extract data from websites and turn it into an API. The service handles rotating proxies and Chrome rendering for you. Specialists would otherwise have to manage JavaScript rendering, headless-browser updates and maintenance, and proxy diversity and rotation. This is a simple API that does all of the above for you.
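The service above is not named in this listing, so the endpoint and parameter names below are hypothetical placeholders. The sketch only illustrates the usual pattern with such APIs: you pass the target URL and your key, and the service performs the proxied, JavaScript-rendered fetch server-side.

```python
import urllib.parse

# Hypothetical endpoint -- substitute the real service's base URL.
API_ENDPOINT = "https://api.example-scraper.com/v1/scrape"

def build_scrape_request(target_url: str, api_key: str, render_js: bool = True) -> str:
    """Build the GET URL for a proxied, JS-rendering scrape of target_url.
    The parameter names (api_key, url, render_js) are illustrative."""
    params = {
        "api_key": api_key,
        "url": target_url,
        "render_js": "true" if render_js else "false",
    }
    return API_ENDPOINT + "?" + urllib.parse.urlencode(params)

# The returned string would be fetched with any HTTP client.
request_url = build_scrape_request("https://example.com/page", "MY_KEY")
print(request_url)
```

In practice the only client-side work is constructing this one request; proxy rotation and headless-Chrome rendering happen on the provider's infrastructure.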

It is a free, client-side Windows web scraping tool that turns unstructured or semi-structured data from websites into structured data sets, with no coding necessary. Extracted data can be delivered via API, exported as CSV or Excel files, or written into a database.

You don't need to write any code or install any software to extract data with Kimono. The easiest way to use Kimono is to add our bookmarklet to your browser's bookmark bar. Then go to the website you want to get data from and click the bookmarklet. Select the data you want and Kimono does the rest. We take care of hosting the APIs that you build with Kimono and running them on the schedule you specify. Use the API output in JSON or as CSV files that you can easily paste into a spreadsheet.
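The paragraph above says the hosted APIs return JSON or CSV you can paste into a spreadsheet. The payload below is a made-up example of that kind of response (the field names are illustrative, not Kimono's documented schema); the sketch shows converting such JSON output to CSV text with the standard library.

```python
import csv
import io
import json

# Hypothetical example of a hosted scraping API's JSON output.
payload = json.loads("""
{
  "name": "book_prices",
  "results": {
    "collection1": [
      {"title": "Clean Code", "price": "33.99"},
      {"title": "The Pragmatic Programmer", "price": "41.50"}
    ]
  }
}
""")

rows = payload["results"]["collection1"]

# Convert the JSON rows to CSV text, ready to paste into a spreadsheet.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["title", "price"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```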

It works with your favorite parser to provide idiomatic ways of navigating, searching, and modifying the parse tree. It commonly saves programmers hours or days of work.
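The description above matches Beautiful Soup's. Assuming that is the library meant, a minimal sketch of navigating and searching a parse tree with bs4 and Python's built-in html.parser backend:

```python
from bs4 import BeautifulSoup  # assuming the library described is Beautiful Soup

html = """
<html><body>
  <ul id="books">
    <li class="book">Clean Code</li>
    <li class="book">Refactoring</li>
  </ul>
</body></html>
"""

# "Your favorite parser" here is the stdlib html.parser; lxml or
# html5lib could be swapped in via the second argument.
soup = BeautifulSoup(html, "html.parser")

# Search the tree with a CSS selector and extract the text nodes.
titles = [li.get_text(strip=True) for li in soup.select("li.book")]
print(titles)  # expected: ['Clean Code', 'Refactoring']
```

The same tree can also be modified in place (e.g. `soup.ul.decompose()`), which is what distinguishes it from a one-shot extraction API.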

Apify is a platform that enables developers to create, customize and run cloud-based programs called actors that can, among other things, be used to extract data from any website using a few lines of JavaScript.

It is the most popular web scraping framework in Python: an open-source, collaborative framework for extracting the data you need from websites in a fast, simple, yet extensible way.

ScraperAPI is a powerful and efficient web scraping API and tool designed to empower developers, data scientists, and businesses with reliable data extraction at scale.

It is an efficient tool to scrape data from a URL. It works particularly well on e-commerce product pages, real estate listings, or even Google rankings. Use the Live Test on the dashboard to try it without writing code.

It is a convenient standalone library to scrape websites and to run end-to-end tests using real browsers.