
Alternatives to AWS Step Functions

AWS Lambda, Airflow, AWS Batch, AWS Data Pipeline, and Batch are the most popular alternatives and competitors to AWS Step Functions.

What is AWS Step Functions and what are its top alternatives?

AWS Step Functions makes it easy to coordinate the components of distributed applications and microservices using visual workflows. Building applications from individual components that each perform a discrete function lets you scale and change applications quickly.
AWS Step Functions is a tool in the Cloud Task Management category of a tech stack.
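
To make the coordination model concrete, here is a minimal, hedged sketch using boto3 and the Amazon States Language: a two-step state machine that chains two Lambda-backed tasks. All ARNs, names, and the workflow itself are illustrative placeholders, not part of any real account.

```python
# Minimal sketch: define and run a two-step Step Functions state machine with boto3.
# All ARNs and names below are illustrative placeholders.
import json
import boto3

sfn = boto3.client("stepfunctions")

# Amazon States Language definition: ValidateOrder runs first, then ChargeCard.
definition = {
    "Comment": "Illustrative order-processing workflow",
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:validate-order",
            "Next": "ChargeCard",
        },
        "ChargeCard": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:charge-card",
            "End": True,
        },
    },
}

machine = sfn.create_state_machine(
    name="order-processing",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/StepFunctionsExecutionRole",
)

# Each execution is an independent, visualizable run of the workflow.
sfn.start_execution(
    stateMachineArn=machine["stateMachineArn"],
    input=json.dumps({"orderId": "12345"}),
)
```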

Top Alternatives to AWS Step Functions

  • AWS Lambda

    AWS Lambda is a compute service that runs your code in response to events and automatically manages the underlying compute resources for you. You can use AWS Lambda to extend other AWS services with custom logic, or create your own back-end services that operate at AWS scale, performance, and security. ...

  • Airflow

    Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed. ...

  • AWS Batch

    It enables developers, scientists, and engineers to easily and efficiently run hundreds of thousands of batch computing jobs on AWS. It dynamically provisions the optimal quantity and type of compute resources (e.g., CPU or memory optimized instances) based on the volume and specific resource requirements of the batch jobs submitted. ...

  • AWS Data Pipeline

    AWS Data Pipeline is a web service that provides a simple management system for data-driven workflows. Using AWS Data Pipeline, you define a pipeline composed of the “data sources” that contain your data, the “activities” or business logic such as EMR jobs or SQL queries, and the “schedule” on which your business logic executes. For example, you could define a job that, every hour, runs an Amazon Elastic MapReduce (Amazon EMR)–based analysis on that hour’s Amazon Simple Storage Service (Amazon S3) log data, loads the results into a relational database for future lookup, and then automatically sends you a daily summary email. ...

  • Batch

    Yes, we’re really free. So, how do we keep the lights on? Instead of charging you a monthly fee, we sell ads on your behalf to the top 500 mobile advertisers in the world. With Batch, you earn money each month while accessing great engagement tools for free. ...

  • Camunda

    With Camunda, business users collaborate with developers to model and automate end-to-end processes using BPMN-powered flowcharts that run with the speed, scale, and resiliency required to compete in today’s digital-first world ...

  • Postman

    It is the only complete API development environment, used by nearly five million developers and more than 100,000 companies worldwide. ...

AWS Step Functions alternatives & related posts

AWS Lambda

Automatically run code in response to modifications to objects in Amazon S3 buckets, messages in Kinesis streams, or...
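
As a rough illustration of the programming model, the sketch below is a minimal Python handler for S3 "object created" notifications; the event shape follows the standard S3 event format, and the function body is a placeholder.

```python
# Minimal sketch of a Python Lambda handler for S3 "object created" notifications.
# The record structure follows the standard S3 event format; the logic is a placeholder.
import json

def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object: s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```
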
PROS OF AWS LAMBDA
  • No infrastructure (129)
  • Cheap (83)
  • Quick (70)
  • Stateless (59)
  • No deploy, no server, great sleep (47)
  • AWS Lambda went down taking many sites with it (12)
  • Event Driven Governance (6)
  • Extensive API (6)
  • Auto scale and cost effective (6)
  • Easy to deploy (6)
  • VPC Support (5)
  • Integrated with various AWS services (3)
CONS OF AWS LAMBDA
  • Can't execute Ruby or Go (7)
  • Compute time limited (3)
  • Can't execute PHP w/o significant effort (1)

related AWS Lambda posts

Jeyabalaji Subramanian

Recently we were looking at a few robust and cost-effective ways of replicating the data that resides in our production MongoDB to a PostgreSQL database for data warehousing and business intelligence.

We set ourselves the following criteria for the optimal tool that would do this job:

  • The data replication must be near real-time, yet it should NOT impact the production database
  • The data replication must be horizontally scalable (based on the load), asynchronous & crash-resilient

Based on the above criteria, we selected the following tools to perform the end-to-end data replication:

We chose MongoDB Stitch for picking up the changes in the source database. It is the serverless platform from MongoDB. One of the services offered by MongoDB Stitch is Stitch Triggers. Using Stitch Triggers, you can execute a serverless function (in Node.js) in real time in response to changes in the database. When there are a lot of database changes, Stitch automatically "feeds forward" these changes through an asynchronous queue.

We chose Amazon SQS as the pipe / message backbone for communicating the changes from MongoDB to our own replication service. Interestingly enough, MongoDB Stitch offers integration with AWS services.

In the Node.js function, we wrote minimal functionality to communicate the database changes (insert / update / delete / replace) to Amazon SQS.

Next we wrote a minimal micro-service in Python to listen to the message events on SQS, pick up the data payload & mirror the DB changes on to the target data warehouse. We implemented the source-data-to-target-data translation by modelling target table structures through SQLAlchemy. We deployed this micro-service as AWS Lambda with Zappa. With Zappa, deploying your services as an event-driven & horizontally scalable Lambda service is dead easy.
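
A condensed sketch of what such an SQS-driven replication worker can look like is below. This is not the post's actual code: the queue URL, the mirrored table, and the change-event format are all assumptions made for illustration.

```python
# Condensed sketch of an SQS-driven replication worker (illustrative, not the authors' code).
# Queue URL, table model and the change-event format are assumptions.
import json
import boto3
from sqlalchemy import JSON, Column, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class DocumentMirror(Base):
    __tablename__ = "documents_mirror"
    id = Column(String, primary_key=True)
    body = Column(JSON)

engine = create_engine("postgresql://user:password@warehouse-host/warehouse")
Base.metadata.create_all(engine)

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/mongo-change-events"

def process_batch():
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    messages = resp.get("Messages", [])
    with Session(engine) as session:
        for msg in messages:
            change = json.loads(msg["Body"])  # e.g. {"op": "insert", "id": "...", "doc": {...}}
            if change["op"] in ("insert", "update", "replace"):
                session.merge(DocumentMirror(id=change["id"], body=change["doc"]))
            elif change["op"] == "delete":
                session.query(DocumentMirror).filter_by(id=change["id"]).delete()
        session.commit()
    # Only acknowledge (delete) messages after the warehouse transaction commits.
    for msg in messages:
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```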

In the end, we got to implement a highly scalable, near real-time Change Data Replication service that "works", and deployed it to production in a matter of a few days!

Tim Nolet

Heroku, Docker, GitHub, Node.js, hapi, Vue.js, AWS Lambda, Amazon S3, PostgreSQL, Knex.js

Checkly is a fairly young company and we're still working hard to find the correct mix of product features, price and audience.

We are focused on tech B2B, but I always wanted to serve solo developers too, so I decided to make a $7 plan.

Why $7? Simply put, it seems to be a sweet spot for tech companies: Heroku, Docker, GitHub, Appoptics (Librato) all offer $7 plans. They must have done a ton of research into this, so why not piggyback on that and try it out.

Enough biz talk, onto tech. The challenges were:

  • Slice off a portion of the functionality so a $7 plan is still profitable. We call this the "plan limits".
  • Update API and back end services to handle and enforce plan limits.
  • Update the UI to kindly state plan limits are in effect on some part of the UI.
  • Update the pricing page to reflect all changes.
  • Keep the actual processing backend, storage and APIs as untouched as possible.

In essence, we went from strictly volume based pricing to value based pricing. Here come the technical steps & decisions we made to get there.

  1. We updated our PostgreSQL schema so plans now have an array of "features". These are string constants that represent feature toggles.
  2. The Vue.js frontend reads these from the vuex store on login.
  3. Based on these values, the UI has simple v-if statements to either just show the feature or show a friendly "please upgrade" button.
  4. The hapi API has a hook on each relevant API endpoint that checks whether a user's plan has the feature enabled, or not.

Side note: We offer 10 SMS messages per month on the developer plan. However, we were not actually counting how many messages people were sending. We had to update our alerting daemon (that runs on Heroku and triggers SMS messages via AWS SNS) to actually bump a counter.

What we built is basically feature-toggling based on plan features. It is very extensible for future additions. Our scheduling and storage backend that actually runs users' monitoring requests (AWS Lambda) and stores the results (S3 and Postgres) has no knowledge of all of this and remained unchanged.
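
The actual enforcement lives in their hapi (Node.js) API hooks; purely as an illustration of the same plan-limit idea, here is a small Python sketch with made-up plan names and feature flags.

```python
# Illustrative plan-feature toggling (the real check is a hapi hook in Node.js).
# Plan names and feature flags are made up for the example.
PLAN_FEATURES = {
    "developer": {"api_checks", "sms_alerts"},
    "team": {"api_checks", "sms_alerts", "browser_checks"},
}

class PlanLimitError(Exception):
    """Raised when a request uses a feature the plan does not include."""

def require_feature(plan: str, feature: str) -> None:
    if feature not in PLAN_FEATURES.get(plan, set()):
        raise PlanLimitError(f"Plan '{plan}' does not include '{feature}'")

def create_browser_check(user: dict, payload: dict) -> dict:
    # Each relevant endpoint guards itself before touching the backend.
    require_feature(user["plan"], "browser_checks")
    return {"status": "created", "check": payload}
```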

Hope this helps anyone building out their SaaS and is in a similar situation.

Airflow

A platform to programmatically author, schedule and monitor data pipelines, by Airbnb
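
For a feel of the authoring model, here is a toy DAG, assuming Airflow 2.x and the classic PythonOperator; the task bodies are placeholders.

```python
# Toy Airflow DAG (assumes Airflow 2.x): two tasks with an explicit dependency.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("extracting data")

def load():
    print("loading data")

with DAG(
    dag_id="example_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```
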
PROS OF AIRFLOW
  • Features (53)
  • Task Dependency Management (14)
  • Beautiful UI (12)
  • Cluster of workers (12)
  • Extensibility (10)
  • Open source (6)
  • Complex workflows (5)
  • Python (5)
  • Good API (3)
  • Apache project (3)
  • Custom operators (3)
  • Dashboard (2)
CONS OF AIRFLOW
  • Observability is not great when the DAGs exceed 250 (2)
  • Running it on a Kubernetes cluster is relatively complex (2)
  • Open source, provides minimal or no support (2)
  • Logical separation of DAGs is not straightforward (1)

related Airflow posts

Data science and engineering teams at Lyft maintain several big data pipelines that serve as the foundation for various types of analysis throughout the business.

Apache Airflow sits at the center of this big data infrastructure, allowing users to “programmatically author, schedule, and monitor data pipelines.” Airflow is an open source tool, and “Lyft is the very first Airflow adopter in production since the project was open sourced around three years ago.”

There are several key components of the architecture. A web UI allows users to view the status of their queries, along with an audit trail of any modifications to the query. A metadata database stores things like job status and task instance status. A multi-process scheduler handles job requests and triggers the executor to execute those tasks.

Airflow supports several executors, though Lyft uses the CeleryExecutor to scale task execution in production. Airflow is deployed to three Amazon Auto Scaling Groups, each associated with a Celery queue.

Audit logs supplied to the web UI are powered by the existing Airflow audit logs as well as Flask signals.

Datadog, Statsd, Grafana, and PagerDuty are all used to monitor the Airflow system.


We are a young start-up with 2 developers and a team in India looking to choose our next ETL tool. We have a few processes in Azure Data Factory but are looking to switch to a better platform. We were debating Trifacta and Airflow. Or even staying with Azure Data Factory. The use case will be to feed data to front-end APIs.

AWS Batch

Fully Managed Batch Processing at Any Scale
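
Submitting work to an existing job queue is a one-call affair with boto3; in the sketch below, the queue name, job definition, and command are illustrative and must already exist in your account.

```python
# Minimal sketch: submit a containerized job to an existing AWS Batch queue.
# Queue, job definition and command are illustrative placeholders.
import boto3

batch = boto3.client("batch")

response = batch.submit_job(
    jobName="nightly-report",
    jobQueue="default-queue",
    jobDefinition="report-generator:1",
    containerOverrides={
        "command": ["python", "generate_report.py", "--date", "2024-01-01"],
        "environment": [{"name": "LOG_LEVEL", "value": "INFO"}],
    },
)
print("Submitted job:", response["jobId"])
```
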
PROS OF AWS BATCH
  • Containerized (3)
  • Scalable (3)
CONS OF AWS BATCH
  • More overhead than Lambda (3)
  • Image management (1)

related AWS Batch posts

AWS Data Pipeline

Process and move data between different AWS compute and storage services
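
A rough boto3 sketch of creating and activating a pipeline is below; it only defines a default object and an hourly schedule, and the IDs and field values are assumptions. A real pipeline for the hourly EMR-over-S3 example would add data nodes and an EMR activity.

```python
# Rough sketch: create, define and activate a Data Pipeline with boto3.
# Only a default object and an hourly schedule are defined; IDs and fields are illustrative.
import boto3

dp = boto3.client("datapipeline")

created = dp.create_pipeline(name="hourly-log-analysis", uniqueId="hourly-log-analysis-v1")
pipeline_id = created["pipelineId"]

dp.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        {
            "id": "Default",
            "name": "Default",
            "fields": [
                {"key": "scheduleType", "stringValue": "cron"},
                {"key": "schedule", "refValue": "HourlySchedule"},
            ],
        },
        {
            "id": "HourlySchedule",
            "name": "HourlySchedule",
            "fields": [
                {"key": "type", "stringValue": "Schedule"},
                {"key": "period", "stringValue": "1 hour"},
                {"key": "startAt", "stringValue": "FIRST_ACTIVATION_DATE_TIME"},
            ],
        },
    ],
)

dp.activate_pipeline(pipelineId=pipeline_id)
```
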
PROS OF AWS DATA PIPELINE
  • Easy to create DAG and execute it (1)
CONS OF AWS DATA PIPELINE
  • No cons listed yet

related AWS Data Pipeline posts

Batch

Free retention toolkit for indie developers & startups - push notifications, user analytics, reward engine, and native ads
PROS OF BATCH
  • Revenuecat (2)
CONS OF BATCH
  • No cons listed yet

related Batch posts

Camunda

The Universal Process Orchestrator

PROS OF CAMUNDA
  • No pros listed yet
CONS OF CAMUNDA
  • No cons listed yet

related Camunda posts

Postman

Only complete API development environment
PROS OF POSTMAN
  • Easy to use (490)
  • Great tool (369)
  • Makes developing REST APIs easy peasy (276)
  • Easy setup, looks good (156)
  • The best API workflow out there (144)
  • It's the best (53)
  • History feature (53)
  • Adds real value to my workflow (44)
  • Great interface that magically predicts your needs (43)
  • The best in class app (35)
  • Can save and share script (12)
  • Fully featured without looking cluttered (10)
  • Collections (8)
  • Option to run scripts (8)
  • Global/Environment Variables (8)
  • Shareable Collections (7)
  • Dead simple and useful. Excellent (7)
  • Dark theme easy on the eyes (7)
  • Awesome customer support (6)
  • Great integration with newman (6)
  • Documentation (5)
  • Simple (5)
  • The test script is useful (5)
  • Saves responses (4)
  • This has simplified my testing significantly (4)
  • Makes testing APIs as easy as 1, 2, 3 (4)
  • Easy as pie (4)
  • API-network (3)
  • I'd recommend it to everyone who works with APIs (3)
  • Mocking API calls with predefined response (3)
  • Now supports GraphQL (2)
  • Postman Runner CI Integration (2)
  • Easy to set up, test and provides test storage (2)
  • Continuous integration using newman (2)
  • Pre-request Script and Test attributes are invaluable (2)
  • Runner (2)
  • Graph (2)
  • Useful tool (1)
CONS OF POSTMAN
  • Stores credentials in HTTP (10)
  • Bloated features and UI (9)
  • Cumbersome to switch authentication tokens (8)
  • Poor GraphQL support (7)
  • Expensive (5)
  • Not free after 5 users (3)
  • Can't prompt for per-request variables (3)
  • Import swagger (1)
  • Support websocket (1)
  • Import curl (1)

related Postman posts

Noah Zoschke
Engineering Manager at Segment

We just launched the Segment Config API (try it out for yourself here) — a set of public REST APIs that enable you to manage your Segment configuration. A public API is only as good as its #documentation. For the API reference doc we are using Postman.

Postman is an “API development environment”. You download the desktop app, and build API requests by URL and payload. Over time you can build up a set of requests and organize them into a “Postman Collection”. You can generalize a collection with “collection variables”. This allows you to parameterize things like username, password and workspace_name so a user can fill their own values in before making an API call. This makes it possible to use Postman for one-off API tasks instead of writing code.

Then you can add Markdown content to the entire collection, a folder of related methods, and/or every API method to explain how the APIs work. You can publish a collection and easily share it with a URL.

This turns Postman from a personal #API utility to full-blown public interactive API documentation. The result is a great looking web page with all the API calls, docs and sample requests and responses in one place. Check out the results here.

Postman’s powers don’t end here. You can automate Postman with “test scripts” and have it periodically run a collection's scripts as “monitors”. We now have #QA around all the APIs in public docs to make sure they are always correct.

Along the way we tried other techniques for documenting APIs like ReadMe.io or Swagger UI. These required a lot of effort to customize.

Writing and maintaining a Postman collection takes some work, but the resulting documentation site, interactivity and API testing tools are well worth it.

Simon Reymann
Senior Fullstack Developer at QUANTUSflow Software GmbH

Our whole Node.js backend stack consists of the following tools:

  • Lerna as a tool for multi package and multi repository management
  • npm as package manager
  • NestJS as Node.js framework
  • TypeScript as programming language
  • ExpressJS as web server
  • Swagger UI for visualizing and interacting with the API’s resources
  • Postman as a tool for API development
  • TypeORM as object relational mapping layer
  • JSON Web Token for access token management

The main reason we have chosen Node.js over PHP is related to the following factors:

  • Made for the web and widely in use: Node.js is a software platform for developing server-side network services. Well-known projects that rely on Node.js include the blogging software Ghost, the project management tool Trello and the operating system WebOS. Node.js requires the JavaScript runtime environment V8, which was specially developed by Google for the popular Chrome browser. This guarantees a very resource-saving architecture, which qualifies Node.js especially for the operation of a web server. Ryan Dahl, the developer of Node.js, released the first stable version on May 27, 2009. He developed Node.js out of dissatisfaction with the possibilities that JavaScript offered at the time. The basic functionality of Node.js has been mapped with JavaScript since the first version, which can be expanded with a large number of different modules. The current package managers (npm or Yarn) for Node.js know more than 1,000,000 of these modules.
  • Fast server-side solutions: Node.js adopts the JavaScript "event loop" to create non-blocking I/O applications that conveniently serve simultaneous events. With the standard available asynchronous processing within JavaScript/TypeScript, highly scalable, server-side solutions can be realized. The efficient use of the CPU and the RAM is maximized and more simultaneous requests can be processed than with conventional multi-thread servers.
  • A language along the entire stack: Widely used frameworks such as React or AngularJS or Vue.js, which we prefer, are written in JavaScript/TypeScript. If Node.js is now used on the server side, you can use all the advantages of a uniform scripting language throughout the entire application development. The same language in the back- and frontend simplifies the maintenance of the application and also the coordination within the development team.
  • Flexibility: Node.js sets very few strict dependencies, rules and guidelines and thus grants a high degree of flexibility in application development. There are no strict conventions so that the appropriate architecture, design structures, modules and features can be freely selected for the development.