Replies: 6 comments
-
@levic You could try building a single Docker container which runs all the applications: Hasura, your app server, the frontend, and a gateway to manage URLs. But this is an anti-pattern in the Docker world; each container is supposed to run one process. Statically building the binary is not possible at the moment, as the ecosystem is still evolving. Also, I am not sure I understand the concerns properly. Can you elaborate on them a little bit?
-
Consider, for simplicity, a single-process, single-threaded synchronous application server (python/ruby/node/php). A typical setup would have browser requests going through nginx to the app server, the app server calling Hasura, and Hasura calling back into the same app server for webhooks.
You can increase the number of processes/threads or make it asynchronous; it doesn't make a difference: as soon as you impose any sort of concurrent-request limit, you run the risk of a deadlock. The only way to guarantee success is to ensure that webhooks are not subject to the same concurrent-request limit as regular browser requests (eg run a second application server instance on a different port dedicated to handling webhooks only, or use a single application server but configure nginx with separate limits for browser vs webhook requests).
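For illustration, here is a minimal sketch of the first variant, assuming nginx fronts both instances, the browser-facing app listens on port 8000, a dedicated webhook-only instance listens on port 8001, and Hasura's webhook calls all arrive under a /webhooks/ path prefix (the ports and the prefix are assumptions, not from the original setup):

```nginx
# Sketch only: keep Hasura webhook callbacks out of the browser-facing
# worker pool by routing them to a dedicated app-server instance.
upstream app_browser  { server 127.0.0.1:8000; }  # normal browser traffic
upstream app_webhooks { server 127.0.0.1:8001; }  # webhook-only instance

server {
    listen 80;

    # Hasura webhook callbacks (path prefix is an assumption)
    location /webhooks/ {
        proxy_pass http://app_webhooks;
    }

    # Everything else goes to the browser-facing instance
    location / {
        proxy_pass http://app_browser;
    }
}
```

With this layout a webhook call can always be served even when every browser-facing worker is blocked waiting on Hasura, which is what breaks the deadlock.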
-
@shahidhk To summarize and generalize @levic's concern here: I believe he is talking about the nastiness we would have if we don't create a third server, separate from the frontend app, to handle hooks for Hasura, and instead have a frontend which depends on (makes calls to) Hasura, which in turn depends on the frontend. You may be wondering why on earth someone would even consider such an inherently complicated architecture. This goes to show how strongly some developers want to build their app as a monolith. I believe the modern trend of microservice architecture has overshadowed the fact that monolithic architecture still has its advantages (as well as disadvantages) over microservices, and that, depending on your project, a monolith may be the better choice.
As for the claim that running multiple processes in one container is an anti-pattern: that does not seem to be entirely true. See the official Docker documentation on running multiple services in a container: https://docs.docker.com/config/containers/multi-service_container/
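One of the options described there is a small wrapper script that starts each process and stops the container when one of them exits, roughly like the sketch below (both commands are placeholders, not the real entrypoints):

```bash
#!/bin/bash
# Sketch of the wrapper-script approach for a multi-service container.
# Both commands below are placeholders for Hasura and the app server.

graphql-engine serve &     # Hasura GraphQL Engine in the background
node /app/server.js &      # app server in the background

# Exit as soon as either process exits, so the orchestrator can restart
# the whole container.
wait -n
exit $?
```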
Why are multi-service containers not generally recommended? I don't have an official answer on this, only my own judgement from that documentation page and from knowing how Docker works. Either way, I think Hasura should consider & support monolithic architectures. Thanks for reading, and thanks for the incredible OSS project! 😀
-
@levic The approach that @shahidhk suggested in #1289 (comment) worked for my Hasura + Node.js monolith. I am currently working on an npm package (nodejs) to help with both (1) running multiple services in a single container, and (2) creating the HTTP gateway service. When it's a little more ready, I will share an example project demonstrating the full setup/architecture.
-
@shahidhk - it would be great to have Hasura support monoliths. Currently Hasura doesn't play nicely with Heroku, as per @zenflow's comments. We would love to use Hasura with our Heroku setup, but it's a real burden since we have to deploy Hasura as a separate app, which isn't practical for us. In Heroku's specific case, a Hasura buildpack would solve the issue. However, a solution that might also work for other PaaS providers would be to provide the binary instead of only a docker image.
-
@levic As of version 2, the Hasura docker image ships with the apt package manager, so you can install your app and its dependencies directly into the image. Once you have Hasura GraphQL Engine & your app installed in the same image, you can refer to this documentation for options on how to run multiple services/processes in a single container: https://docs.docker.com/config/containers/multi-service_container/
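As a rough illustration only (the image tag, the apt packages, and the start script are assumptions on my part, not an official recipe), extending the Hasura image could look something like this:

```dockerfile
# Sketch: extend the official Hasura image and add a Node.js app server.
FROM hasura/graphql-engine:v2.0.0

# apt is available in the v2 image, so install the app's runtime next to it
RUN apt-get update \
    && apt-get install -y --no-install-recommends nodejs npm \
    && rm -rf /var/lib/apt/lists/*

# Copy the app plus a wrapper script that starts both processes
COPY app/ /app/
COPY start.sh /start.sh
RUN chmod +x /start.sh

# Clear any entrypoint inherited from the base image (assumption) so the
# wrapper script is what actually runs.
ENTRYPOINT []
CMD ["/start.sh"]
```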
-
(This is a half-Heroku, half-Hasura question:)
Is there a recommended way of doing atomic app deployments with Hasura + an app server (eg django or rails) on Heroku?
In a Heroku Procfile only the 'web' process type can receive HTTP requests. This is a problem if I have an app server and Hasura: I have to create two Heroku apps (one for the app server and one for Hasura) -- I now have to deal with potential race conditions as the apps may not deploy at the same time.
It's actually even worse if I use webhooks: I have to configure 3 apps if I want to avoid thread starvation. Consider a request that goes from app server (frontend request) -> hasura -> app server (webhook callback): what if the frontend app server threads are all blocked waiting on hasura and hasura tries to call a webhook?
If I could combine Hasura and the app into a single Heroku buildpack then this problem wouldn't exist: each dyno would have its own copy of the frontend app, Hasura, and a second copy of the app purely to deal with webhook callbacks (memory usage might be higher, but CPU would be identical since each frontend thread would be blocked waiting on a webhook callback thread anyway).
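For concreteness, that combined setup would presumably boil down to a Procfile whose single web entry runs a start script along these lines; the port numbers, the start-app command and the router are hypothetical placeholders, not a tested recipe:

```bash
#!/bin/bash
# Hypothetical single-dyno start script for the combined-buildpack idea:
# one dyno runs Hasura, the browser-facing app, a webhook-only copy of the
# app, and a small router bound to Heroku's $PORT in front of them all.

graphql-engine serve --server-port 8080 &   # Hasura
start-app --port 8000 &                     # app server for browser traffic
start-app --port 8001 &                     # second copy, webhooks only
start-router --listen "$PORT" &             # routes $PORT traffic to the above

# If any process dies, exit so Heroku restarts the dyno.
wait -n
exit $?
```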
My understanding of the current build process is:
- a Dockerfile that copies a single static binary, built using fpco/stack-build, into a scratch base image

Options to handle this:
1. Build a combined image (eg graphql-engine working in a subdirectory, merged with, say, a debian image). This could work, but packaging would be slow compared to a normal heroku build.
2. Ship graphql-engine as a static binary (see statically compile server binary #724) that could then just be dropped into any heroku buildpack without needing docker at all.

Are there any simpler options that I'm overlooking? Is there a better way of creating single-app Heroku deployments?