What is OpenAPI and what are its top alternatives?
Top Alternatives to OpenAPI
- JsonAPI
It is a format that works with HTTP. A main goal of the specification is to optimize HTTP requests, both in terms of the number of requests and the size of the data packages exchanged between clients and servers (a minimal sketch of the document shape follows this list). ...
- Postman
It is the only complete API development environment, used by nearly five million developers and more than 100,000 companies worldwide. ...
- GraphQL
GraphQL is a data query language and runtime designed and used at Facebook to request and deliver data to mobile and web apps since 2012. ...
- OData
It is an ISO/IEC approved, OASIS standard that defines a set of best practices for building and consuming RESTful APIs. It helps you focus on your business logic while building RESTful APIs without having to worry about the various approaches to define request and response headers, status codes, HTTP methods, URL conventions, media types, payload formats, query options, etc. ...
- RAML
RESTful API Modeling Language (RAML) makes it easy to manage the whole API lifecycle from design to sharing. It's concise (you only write what you need to define) and reusable. It is machine-readable API design that is actually human-friendly. ...
- gRPC
gRPC is a modern open source high performance RPC framework that can run in any environment. It can efficiently connect services in and across data centers with pluggable support for load balancing, tracing, health checking... ...
- REST
An architectural style for developing web services. A distributed system framework that uses Web protocols and technologies. ...
- Stack Overflow
Stack Overflow is a question and answer site for professional and enthusiast programmers. It's built and run by you as part of the Stack Exchange network of Q&A sites. With your help, we're working together to build a library of detailed answers to every question about programming. ...
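To make the JsonAPI entry above concrete, here is a minimal sketch of the document shape the specification defines, written as TypeScript types with an example response. The attribute and resource names are illustrative, not taken from any real API.

```typescript
// Minimal sketch of a JSON:API-style response, expressed as TypeScript types.
// The concrete resource types and attributes below are made up for illustration.
interface ResourceObject {
  type: string;
  id: string;
  attributes: Record<string, unknown>;
}

interface JsonApiDocument {
  data: ResourceObject | ResourceObject[];
  // Related resources can ride along in the same response ("compound documents").
  included?: ResourceObject[];
}

const response: JsonApiDocument = {
  data: { type: 'articles', id: '1', attributes: { title: 'Hello JSON:API' } },
  included: [{ type: 'people', id: '9', attributes: { name: 'Dan' } }],
};

console.log(JSON.stringify(response));
```

The `included` array is what lets a server bundle related resources into one response, which is how the spec reduces the number of round trips.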
OpenAPI alternatives & related posts
related JsonAPI posts
Pros of Postman
- Easy to use (490)
- Great tool (369)
- Makes developing REST APIs easy peasy (276)
- Easy setup, looks good (156)
- The best API workflow out there (144)
- It's the best (53)
- History feature (53)
- Adds real value to my workflow (44)
- Great interface that magically predicts your needs (43)
- The best in class app (35)
- Can save and share scripts (12)
- Fully featured without looking cluttered (10)
- Collections (8)
- Option to run scripts (8)
- Global/Environment Variables (8)
- Shareable Collections (7)
- Dead simple and useful. Excellent (7)
- Dark theme easy on the eyes (7)
- Awesome customer support (6)
- Great integration with newman (6)
- Documentation (5)
- Simple (5)
- The test script is useful (5)
- Saves responses (4)
- This has simplified my testing significantly (4)
- Makes testing APIs as easy as 1, 2, 3 (4)
- Easy as pie (4)
- API network (3)
- I'd recommend it to everyone who works with APIs (3)
- Mocking API calls with predefined responses (3)
- Now supports GraphQL (2)
- Postman Runner CI Integration (2)
- Easy to set up and test, and provides test storage (2)
- Continuous integration using newman (2)
- Pre-request Script and Test attributes are invaluable (2)
- Runner (2)
- Graph (2)
- Useful tool (1)
Cons of Postman
- Stores credentials in HTTP (10)
- Bloated features and UI (9)
- Cumbersome to switch authentication tokens (8)
- Poor GraphQL support (7)
- Expensive (5)
- Not free after 5 users (3)
- Can't prompt for per-request variables (3)
- Import Swagger (1)
- Support WebSocket (1)
- Import cURL (1)
related Postman posts
We just launched the Segment Config API (try it out for yourself here) — a set of public REST APIs that enable you to manage your Segment configuration. A public API is only as good as its #documentation. For the API reference doc we are using Postman.
Postman is an “API development environment”. You download the desktop app, and build API requests by URL and payload. Over time you can build up a set of requests and organize them into a “Postman Collection”. You can generalize a collection with “collection variables”. This allows you to parameterize things like `username`, `password` and `workspace_name` so a user can fill their own values in before making an API call. This makes it possible to use Postman for one-off API tasks instead of writing code.
Then you can add Markdown content to the entire collection, a folder of related methods, and/or every API method to explain how the APIs work. You can publish a collection and easily share it with a URL.
This turns Postman from a personal #API utility to full-blown public interactive API documentation. The result is a great looking web page with all the API calls, docs and sample requests and responses in one place. Check out the results here.
Postman’s powers don’t end here. You can automate Postman with “test scripts” and have it periodically run a collection’s scripts as “monitors” (a sketch of such a test script follows this post). We now have #QA around all the APIs in the public docs to make sure they are always correct.
Along the way we tried other techniques for documenting APIs like ReadMe.io or Swagger UI. These required a lot of effort to customize.
Writing and maintaining a Postman collection takes some work, but the resulting documentation site, interactivity and API testing tools are well worth it.
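For readers who have not used them, the “test scripts” mentioned above are small JavaScript snippets that Postman runs after each response; the same scripts power scheduled monitors. Below is a minimal sketch. The status check uses Postman's real `pm` API, but the asserted field name is hypothetical.

```typescript
// Runs in Postman's sandbox after the response arrives; `pm` is injected there.
declare const pm: any; // provided by Postman at runtime, declared here only for TypeScript

pm.test('status is 200', function () {
  pm.response.to.have.status(200);
});

pm.test('response carries an id field', function () {
  const body = pm.response.json();
  pm.expect(body).to.have.property('id'); // "id" is a hypothetical field
});
```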
Our whole Node.js backend stack consists of the following tools:
- Lerna as a tool for multi package and multi repository management
- npm as package manager
- NestJS as Node.js framework
- TypeScript as programming language
- ExpressJS as web server
- Swagger UI for visualizing and interacting with the API’s resources
- Postman as a tool for API development
- TypeORM as object relational mapping layer
- JSON Web Token for access token management
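As a rough illustration of how a few of the pieces in this list fit together, here is a minimal sketch wiring a NestJS controller into Swagger UI. The module, route, and port are made up for the example; the post does not describe the actual service layout.

```typescript
// Minimal sketch: a NestJS controller exposed through Swagger UI.
// The controller, route, and port are illustrative only.
import { NestFactory } from '@nestjs/core';
import { Controller, Get, Module } from '@nestjs/common';
import { DocumentBuilder, SwaggerModule } from '@nestjs/swagger';

@Controller('health')
class HealthController {
  @Get()
  check() {
    return { status: 'ok' };
  }
}

@Module({ controllers: [HealthController] })
class AppModule {}

async function bootstrap() {
  const app = await NestFactory.create(AppModule);

  // Swagger UI is served at /docs, generated from the decorated controllers.
  const config = new DocumentBuilder().setTitle('API').setVersion('1.0').build();
  const document = SwaggerModule.createDocument(app, config);
  SwaggerModule.setup('docs', app, document);

  await app.listen(3000);
}

bootstrap();
```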
The main reason we have chosen Node.js over PHP is related to the following artifacts:
- Made for the web and widely in use: Node.js is a software platform for developing server-side network services. Well-known projects that rely on Node.js include the blogging software Ghost, the project management tool Trello, and the operating system WebOS. Node.js is built on the V8 JavaScript runtime, which Google developed for the Chrome browser. This results in a very resource-efficient architecture that makes Node.js especially well suited to running a web server. Ryan Dahl, the creator of Node.js, released the first stable version on May 27, 2009; he developed it out of dissatisfaction with the possibilities JavaScript offered at the time. The core functionality of Node.js has been implemented in JavaScript since the first version and can be extended with a large number of modules; the package managers for Node.js (npm or Yarn) have access to more than 1,000,000 of them.
- Fast server-side solutions: Node.js uses the JavaScript event loop to build non-blocking I/O applications that serve many simultaneous events. With the asynchronous processing built into JavaScript/TypeScript, highly scalable server-side solutions can be built. CPU and RAM are used efficiently, so more simultaneous requests can be handled than with conventional multi-threaded servers.
- A language along the entire stack: Widely used frameworks such as React, AngularJS, or Vue.js (which we prefer) are written in JavaScript/TypeScript. If Node.js is used on the server side, you get the advantages of a single language across the entire application. Using the same language in the back end and front end simplifies maintenance of the application as well as coordination within the development team.
- Flexibility: Node.js sets very few strict dependencies, rules and guidelines and thus grants a high degree of flexibility in application development. There are no strict conventions so that the appropriate architecture, design structures, modules and features can be freely selected for the development.
Pros of GraphQL
- Schemas defined by the requests made by the user (75)
- Will replace RESTful interfaces (63)
- The future of APIs (62)
- The future of databases (49)
- Self-documenting (13)
- Get many resources in a single request (12)
- Query Language (6)
- Ask for what you need, get exactly that (6)
- Fetch different resources in one request (3)
- Type system (3)
- Evolve your API without versions (3)
- Ease of client creation (2)
- GraphiQL (2)
- Easy setup (2)
- "Open" document (1)
- Fast prototyping (1)
- Supports subscriptions (1)
- Standard (1)
- Good for apps that query at build time (SSR/Gatsby) (1)
- Describe your data (1)
- Better versioning (1)
- Backed by Facebook (1)
- Easy to learn (1)
Cons of GraphQL
- Hard to migrate from GraphQL to another technology (4)
- More code to type (4)
- Takes longer to build compared to schemaless (2)
- No support for caching (1)
- All the pros sound like NFT pitches (1)
- No support for streaming (1)
- Works just like any other API at runtime (1)
- N+1 fetch problem (1)
- No built-in security (1)
related GraphQL posts
I just finished the very first version of my new hobby project: #MovieGeeks. It is a minimalist online movie catalog for saving the movies you want to see and rating the movies you have already seen. This is just the beginning, as I am planning to add more features along the lines of sharing and discovery.
For the #BackEnd I decided to use Node.js, GraphQL and MongoDB:
- Node.js has a huge community, so it will always be a safe choice in terms of libraries and finding solutions to problems you may have.
- GraphQL because I needed to improve my skills with it and because I was never comfortable with the usual REST approach. I believe GraphQL is a better option: it feels more natural for writing APIs, it improves development velocity, and by definition it fixes the over-fetching and under-fetching problems that are so common in REST APIs (see the sketch below); on top of that, the community is getting bigger and bigger.
- MongoDB was my choice for the database as I already have a lot of experience working with it and because, despite some bad reputation it has acquired in recent months, I still believe it is a powerful database for a very long list of use cases, including the one I needed for my website.
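To illustrate the over-fetching point, here is a minimal sketch using the graphql-js reference implementation. The Movie type and the sample data are hypothetical, not taken from MovieGeeks.

```typescript
// Minimal sketch of "ask only for what you need", using graphql-js.
import { graphql, buildSchema } from 'graphql';

const schema = buildSchema(`
  type Movie {
    title: String
    year: Int
    rating: Float
  }
  type Query {
    movies: [Movie]
  }
`);

// Hypothetical in-memory data standing in for a real resolver layer.
const rootValue = {
  movies: () => [
    { title: 'Blade Runner', year: 1982, rating: 4.5 },
    { title: 'Arrival', year: 2016, rating: 4.0 },
  ],
};

// The client asks for title and rating only, so year is never sent back.
graphql({ schema, source: '{ movies { title rating } }', rootValue })
  .then((result) => console.log(JSON.stringify(result.data)));
```

Because the query names only `title` and `rating`, the `year` field is never serialized; the same schema can serve both a list view and a detail view without separate endpoints.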
When I joined NYT there was already broad dissatisfaction with the LAMP (Linux Apache HTTP Server MySQL PHP) Stack and the front end framework, in particular. So, I wasn't passing judgment on it. I mean, LAMP's fine, you can do good work in LAMP. It's a little dated at this point, but it's not ... I didn't want to rip it out for its own sake, but everyone else was like, "We don't like this, it's really inflexible." And I remember from being outside the company when that was called MIT FIVE when it had launched. And been observing it from the outside, and I was like, you guys took so long to do that and you did it so carefully, and yet you're not happy with your decisions. Why is that? That was more the impetus. If we're going to do this again, how are we going to do it in a way that we're gonna get a better result?
So we're moving quickly away from LAMP, I would say. So, right now, the new front end is React based and using Apollo. And we've been in a long, protracted, gradual rollout of the core experiences.
React is now talking to GraphQL as a primary API. There's a Node.js back end, to the front end, which is mainly for server-side rendering, as well.
Behind there, the main repository for the GraphQL server is a big table repository, that we call Bodega because it's a convenience store. And that reads off of a Kafka pipeline.
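A rough sketch of the React-to-GraphQL hop described above, using Apollo Client. The endpoint URL and the query are placeholders; the actual schema is not shown in the post.

```typescript
// Sketch: a React front end querying a GraphQL API through Apollo Client.
// The URI and query below are placeholders, not a real schema.
import { ApolloClient, InMemoryCache, gql } from '@apollo/client';

const client = new ApolloClient({
  uri: 'https://example.com/graphql', // served by the Node.js/GraphQL back end
  cache: new InMemoryCache(),
});

// Server-side rendering can use the same client to prefetch data
// before React hydrates in the browser.
client
  .query({
    query: gql`
      query HomePage {
        articles {
          headline
          url
        }
      }
    `,
  })
  .then(({ data }) => console.log(data.articles));
```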
Pros of OData
- Patterns for paging, sorting, filtering (7)
- ISO Standard (5)
- Query Language (4)
- RESTful (3)
- No overfetching, no underfetching (3)
- Get many resources in a single request (2)
- Self-documenting (2)
- Batch requests (2)
- Bulk requests ("array upsert") (2)
- Ask for what you need, get exactly that (2)
- Evolve your API by following the compatibility rules (1)
- Resource model defines conventional operations (1)
- Resource Modification Language (1)

Cons of OData
- Overwhelming, no "baby steps" documentation (1)
related OData posts
Pros of RAML
- API Specification (15)
- Human Readable (7)
- API Documentation (6)
- Design Patterns & Code Reuse (3)
- API Modeling (2)
- Automatic Generation of Mule flow (2)
- Unit Testing (2)
- API Mocking (1)
- SDK Generation (1)
related RAML posts
Pros of gRPC
- High performance (24)
- The future of APIs (15)
- Easy setup (13)
- Contract-based (5)
- Polyglot (4)

Cons of gRPC
- Garbage (2)
related gRPC posts
We just launched the Segment Config API (try it out for yourself here) — a set of public REST APIs that enable you to manage your Segment configuration. Behind the scenes the Config API is built with Go, GRPC and Envoy.
At Segment, we build new services in Go by default. The language is simple so new team members quickly ramp up on a codebase. The tool chain is fast so developers get immediate feedback when they break code, tests or integrations with other systems. The runtime is fast so it performs great at scale.
For the newest round of APIs we adopted the GRPC service #framework.
The Protocol Buffer service definition language makes it easy to design type-safe and consistent APIs, thanks to ecosystem tools like the Google API Design Guide for API standards, uber/prototool for formatting and linting .protos, lyft/protoc-gen-validate for defining field validations, and grpc-gateway for defining REST mappings.
With a well-designed .proto, it’s easy to generate a Go server interface and a TypeScript client, providing type-safe RPC between languages (a client-side sketch follows this post).
For the API gateway and RPC we adopted the Envoy service proxy.
The internet-facing segmentapis.com endpoint is an Envoy front proxy that rate-limits and authenticates every request. It then transcodes a #REST / #JSON request to an upstream GRPC request. The upstream GRPC servers are running an Envoy sidecar configured for Datadog stats.
The result is API #security , #reliability and consistent #observability through Envoy configuration, not code.
We experimented with Swagger service definitions, but the spec is sprawling and the generated clients and server stubs leave a lot to be desired. GRPC and .proto and the Go implementation feels better designed and implemented. Thanks to the GRPC tooling and ecosystem you can generate Swagger from .protos, but it’s effectively impossible to go the other way.
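To give a flavor of the generated-client side of this, here is a minimal sketch of calling a GRPC service from Node.js with @grpc/grpc-js and @grpc/proto-loader. The proto file name, package, service, and method are hypothetical; Segment's actual definitions are not shown in the post.

```typescript
// Sketch only: loads a hypothetical workspaces.proto and calls one RPC on it.
import * as grpc from '@grpc/grpc-js';
import * as protoLoader from '@grpc/proto-loader';

const packageDefinition = protoLoader.loadSync('workspaces.proto', {
  keepCase: false,
  longs: String,
  defaults: true,
});
// loadPackageDefinition returns an untyped package tree, hence the `any` cast.
const proto = grpc.loadPackageDefinition(packageDefinition) as any;

// "config.WorkspacesService" and "getWorkspace" are made-up names for illustration.
const client = new proto.config.WorkspacesService(
  'localhost:8080',
  grpc.credentials.createInsecure(),
);

client.getWorkspace({ name: 'workspaces/demo' }, (err: Error | null, res: unknown) => {
  if (err) throw err;
  console.log(res);
});
```

In the setup described in the post, a production client would go through the Envoy front proxy over TLS rather than an insecure local port.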
I used GraphQL extensively at a previous employer a few years ago and really appreciated the data-driven schema etc alongside the many other benefits it provided. At that time, it seemed like it was set to replace RESTful APIs and many companies were adopting it.
However, as of late, it seems like interest has been waning for GraphQL as opposed to increasing as I had assumed it would. Am I missing something here? What is the current perspective regarding this technology?
Currently, I'm working with gRPC and was curious as to the state of everything now.
related REST posts
Pros of Stack Overflow
- Scary smart community (257)
- Knows all (206)
- Voting system (142)
- Good questions (134)
- Good SEO (83)
- Addictive (22)
- Tight focus (14)
- Share and gain knowledge (10)
- Useful (7)
- Fast loading (3)
- Gamification (2)
- Knows everyone (1)
- Experts share experience and answer questions (1)
- Stack Overflow is to developers as Google is to net surfers (1)
- Questions answered quickly (1)
- No annoying ads (1)
- No spam (1)
- Fast community response (1)
- Good moderators (1)
- Quick answers from users (1)
- Good answers (1)
- User reputation ranking (1)
- Efficient answers (1)
- Leading developer community (1)
Cons of Stack Overflow
- Not welcoming to newbies (3)
- Unfair downvoting (3)
- Unfriendly moderators (3)
- No opinion-based questions (3)
- Mean users (3)
- Limited in the types of questions it can accept (2)
related Stack Overflow posts
Google Analytics is a great tool to analyze your traffic. To debug our software and ask questions, we love to use Postman and Stack Overflow. Google Drive helps our team to share documents. We're able to build our great products through the APIs by Google Maps, CloudFlare, Stripe, PayPal, Twilio, Let's Encrypt, and TensorFlow.