Is anyone hosting a full CRUD-type app with routing and persistence as an AWS Lambda?
And if so, how is that going?
Since you can run a container as a Lambda, and you get built-in scaling up and down, it seems like it might be a good fit.
Top comments (10)
Are things at a good place in terms of vendor lock-in in this situation?
How might one approach this without having an app that is 100% locked in to AWS?
Not saying it would need to be in a state where it could run anywhere today, but I'd hate to get to a place where migrating vendors was impossible without major changes.
Not an answer to your question, but I'm wondering about the state of things in that regard.
So, you can run a Rails app or a Go app inside a container and start it up on Lambda.
The container means it's a bit less locked in. And for low-usage stuff, you only pay per request with Lambda.
I think cold start-up might be the issue, along with where you persist things; and if you need many services that call each other, I'm not sure how that would work.
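Roughly what that looks like in Go, if it helps: write the app as a plain http.Handler and only adapt it to Lambda at the edge. This is just a sketch that assumes the awslabs aws-lambda-go-api-proxy adapter and a made-up /items route, not a drop-in setup.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"

	"github.com/aws/aws-lambda-go/lambda"
	"github.com/awslabs/aws-lambda-go-api-proxy/httpadapter"
)

// newMux builds the app as an ordinary HTTP handler; nothing here is
// Lambda-specific, which is what keeps the lock-in low.
func newMux() *http.ServeMux {
	mux := http.NewServeMux()
	mux.HandleFunc("/items", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, `{"items": []}`) // stand-in for real CRUD handlers
	})
	return mux
}

func main() {
	mux := newMux()
	// AWS_LAMBDA_RUNTIME_API is set inside the Lambda execution environment.
	if os.Getenv("AWS_LAMBDA_RUNTIME_API") != "" {
		// Running on Lambda: translate API Gateway events into plain
		// http.Requests and back.
		lambda.Start(httpadapter.New(mux).ProxyWithContext)
		return
	}
	// Running anywhere else (local dev, ECS, a plain VM): ordinary server.
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

If you ever wanted to move off Lambda, only the branch inside main would need to change.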
So assuming I'm comfortable with the specific tradeoffs of running in Lambda, there is a pretty low-risk situation here, eh?
What would you say are the best use cases? My mind goes towards... wildly variable traffic volumes, like a company that does flash sales with big spikes, or viral content sites where edge caching is not possible. Or I suppose anything where the flexibility of Lambda pays off once you're all set up?
Found this with a quick search
Lambda Containers with Rails; A Perfect Match! by Ken Collins for AWS Heroes, Dec 8 '20
Scaling up would definitely be one use case. I was thinking about the opposite, scaling down. So a REST API service that is just a side project can sit there as a lambda, idle 99% of the time, and I just pay pennies whenever it's called.
So, side projects basically 🤷
How does that work in practice? Is the first request after a while painfully slow due to cold boot up?
Yeah, because your app is just starting up. So it depends on how fast your app is to start up; Java is bad at this. I'm using Go, and it's something like 100 ms to start up in lambda, I think.
I found this: mikhail.io/serverless/coldstarts/a...
But yeah, I'm just playing around. Here is my lambda app:
earthly.dev/blog/aws-lambda-golang/
I'm not sure about the real world downsides outside of toy app land.
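One thing that softens the cold-start hit: do expensive setup at package level so only cold starts pay for it and warm invocations skip it. A minimal sketch; loadConfig here is a made-up stand-in for whatever real setup (DB pools, config loading) an app would do.

```go
package main

import (
	"context"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// Anything assigned at package level runs once per cold start and is then
// reused by every warm invocation of the same container.
var config = loadConfig()

// loadConfig stands in for expensive startup work; illustrative only.
func loadConfig() map[string]string {
	return map[string]string{"table": "items"}
}

func handler(ctx context.Context, req events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
	// Warm requests start here, so they skip the setup cost entirely.
	return events.APIGatewayProxyResponse{
		StatusCode: 200,
		Body:       `{"table":"` + config["table"] + `"}`,
	}, nil
}

func main() {
	lambda.Start(handler)
}
```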
An entire backend on a single lambda would have high cold start times, so I would recommend just using Fargate if you would rather use containers.
Otherwise I suggest using API Gateway & a lambda per route.
If you want to take a step further into AWS land, I would highly suggest using AppSync and VTL templates to have a fully serverless GraphQL API with zero lambdas.
My work currently uses the API Gateway and Lambda approach, but if I could rewrite it I would 100% use AppSync and VTL templates.
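To make the contrast with the single-container approach concrete: a lambda-per-route function is just one small handler wired to one API Gateway route, which is what keeps its cold starts short. A rough sketch for a hypothetical GET /items route (the data is hard-coded so the example stays self-contained):

```go
// Handler for a single API Gateway route, e.g. GET /items.
// Each route gets its own small binary, so cold starts stay short.
package main

import (
	"context"
	"encoding/json"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

type item struct {
	ID   string `json:"id"`
	Name string `json:"name"`
}

func listItems(ctx context.Context, req events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
	// A real service would read from DynamoDB/RDS here; static data keeps
	// this sketch self-contained.
	body, err := json.Marshal([]item{{ID: "1", Name: "example"}})
	if err != nil {
		return events.APIGatewayProxyResponse{StatusCode: 500}, err
	}
	return events.APIGatewayProxyResponse{
		StatusCode: 200,
		Headers:    map[string]string{"Content-Type": "application/json"},
		Body:       string(body),
	}, nil
}

func main() {
	lambda.Start(listItems)
}
```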
Hey Garret,
What kind of startup times do you see, and what language are you using?
How is Fargate different from, or better than, a lambda with Provisioned Concurrency?
I ended up writing up a blog post about how to do this in go:
earthly.dev/blog/aws-lambda-api-pr...