Deployment Architecture of the wicked API Portal
The API Portal relies on proven technology to manage traffic: the portal itself sits either behind an HAProxy or, on Kubernetes, behind a standard Ingress Controller, while the actual API traffic is proxied by Kong, the excellent API Gateway by Mashape.
The portal itself is implemented to be as lightweight as possible, using Node.js.
The API Portal can be deployed to any environment which runs a Docker host. This can be a single VM or a Swarm environment. You can deploy either with a docker-compose file or with an orchestrator such as Kubernetes, leveraging a Kubernetes Helm Chart.
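As a minimal sketch of the docker-compose variant, a compose file could wire the portal and the gateway together. The image names, versions and port mappings below are illustrative assumptions, not the exact values shipped with wicked:

```yaml
# Illustrative sketch only -- image names and ports are assumptions,
# not the official wicked compose file.
version: '3'

services:
  portal:
    image: haufelexware/wicked.portal   # assumed image name for the portal UI
    ports:
      - "3000:3000"
    restart: unless-stopped

  kong:
    image: kong                         # the API Gateway proxying the actual API traffic
    ports:
      - "8000:8000"   # proxy port for API consumers
      - "8001:8001"   # admin API -- do not expose publicly in production
    restart: unless-stopped
```

In a real deployment, an HAProxy (or Ingress) would sit in front of both services and terminate TLS.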
Depending on your requirements regarding high availability and your SLAs towards your clients, you are free to choose how to deploy the API Portal and Gateway.
Behind the scenes, the API Portal runs as several small containers which cooperate as microservices. Extending the portal functionality via the Portal API is simple; this is also how the "Mailer", "Chatbot" and "Kong Adapter" components work in the background.
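To give an idea of how such an extension talks to the Portal API, here is a minimal Node.js sketch. The base URL, the `/apis` path and the header name are assumptions for illustration; consult the Portal API documentation for the real interface. The request is built in a separate function so its shape can be inspected without a running portal instance:

```javascript
// Sketch of an extension (like the Mailer or Chatbot) preparing a
// Portal API call. URL, path and header names are assumptions.

// Build the options for a Portal API request; kept free of network
// I/O so the request shape can be checked without a running portal.
function buildPortalApiRequest(baseUrl, path, userId) {
    return {
        url: baseUrl.replace(/\/$/, '') + path, // avoid a double slash
        method: 'GET',
        headers: {
            // Hypothetical header conveying the acting user to the API
            'X-Authenticated-UserId': userId,
            'Accept': 'application/json'
        }
    };
}

// Example: list the registered APIs as the admin user.
const req = buildPortalApiRequest('http://portal-api:3001/', '/apis', '1');
console.log(req.url); // http://portal-api:3001/apis
```

The actual HTTP call could then be made with any Node.js client (e.g. the built-in `http` module), which is exactly the pattern the background components follow: small services polling or calling the Portal API rather than reaching into its database.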
Scaling your API Gateway is supported out of the box, and, depending on your runtime orchestration, so is setting up high availability for the Gateway. A highly available setup for the API Portal itself is planned for version 1.0.0.
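With Kubernetes as the orchestrator, scaling the Gateway can be as simple as raising the replica count of its Deployment. The resource below is a sketch; the deployment name, labels and image are illustrative assumptions:

```yaml
# Illustrative sketch -- deployment name, labels and image are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: kong
spec:
  replicas: 3        # run three gateway instances for availability
  selector:
    matchLabels:
      app: kong
  template:
    metadata:
      labels:
        app: kong
    spec:
      containers:
        - name: kong
          image: kong
```

The orchestrator then load-balances API traffic across the gateway instances and restarts any that fail.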