import.io vs ParseHub vs Scrapy

Pros of import.io
  • Easy setup
  • Native desktop app
  • Free lead generation tool
  • Continuous updates
  • Features based on user suggestions

Pros of ParseHub
  • Great support
  • Easy setup
  • Handles complex websites
  • Native desktop app

Pros of Scrapy
  • (no pros listed)


Neither import.io nor ParseHub has a public GitHub repository.

    What is import.io?

import.io is a free web-based platform that puts the power of the machine-readable web in your hands. Using its tools, you can create an API or crawl an entire website in a fraction of the time traditional methods take, with no coding required.

    What is ParseHub?

ParseHub is a free and powerful web scraping tool. With its advanced web scraper, extracting data is as easy as clicking on the data you need. ParseHub lets you turn any website into a spreadsheet or an API.

    What is Scrapy?

Scrapy is the most popular web scraping framework in Python: an open-source, collaborative framework for extracting the data you need from websites in a fast, simple, yet extensible way.


What are some alternatives to import.io, ParseHub, and Scrapy?

Diffbot
Diffbot's APIs use computer vision, machine learning, and natural language processing to help developers extract and understand objects from any web page. Diffbot has determined that the entire web can be classified into approximately 18 structural page types. From this basic understanding of common page layouts, it then uses computer vision, natural language processing, and other machine learning algorithms to identify and extract the important items from within these pages.

Octoparse
Octoparse is a free client-side Windows web scraping application that turns unstructured or semi-structured data from websites into structured data sets, with no coding necessary. Extracted data can be accessed via API, exported as CSV or Excel files, or loaded into a database.

Kimono
You don't need to write any code or install any software to extract data with Kimono. The easiest way to use it is to add the Kimono bookmarklet to your browser's bookmark bar, go to the website you want to get data from, and click the bookmarklet. Select the data you want, and Kimono does the rest. Kimono hosts the APIs you build and runs them on the schedule you specify. Use the API output as JSON, or as CSV files that you can easily paste into a spreadsheet.
BeautifulSoup
BeautifulSoup works with your favorite parser to provide idiomatic ways of navigating, searching, and modifying the parse tree. It commonly saves programmers hours or days of work.
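As a sketch of the kind of work BeautifulSoup saves, the snippet below parses a small HTML document (the markup and variable names are illustrative) and pulls out its heading and links:

```python
from bs4 import BeautifulSoup

html = """
<html><body>
  <h1>Tools</h1>
  <ul>
    <li><a href="https://scrapy.org">Scrapy</a></li>
    <li><a href="https://www.crummy.com/software/BeautifulSoup/">BeautifulSoup</a></li>
  </ul>
</body></html>
"""

# Parse with the stdlib "html.parser"; lxml or html5lib also work if installed.
soup = BeautifulSoup(html, "html.parser")

# Navigate and search the parse tree idiomatically.
title = soup.h1.get_text()
links = {a.get_text(): a["href"] for a in soup.find_all("a")}

print(title)            # Tools
print(links["Scrapy"])  # https://scrapy.org
```

Note that BeautifulSoup only parses documents you already have; it is often paired with a fetching library (or with Scrapy itself) to download the pages first.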
Apify
Apify is a platform that enables developers to create, customize, and run cloud-based programs called actors that can, among other things, be used to extract data from any website using a few lines of JavaScript.