
AWS Lambda vs Celery


Overview

Celery: 1.7K stacks, 1.6K followers, 280 votes, 27.5K GitHub stars, 4.9K forks
AWS Lambda: 26.0K stacks, 18.8K followers, 432 votes

AWS Lambda vs Celery: What are the differences?

Introduction

In this article, we will explore the key differences between AWS Lambda and Celery. Both are popular technologies for executing code in a distributed, scalable manner, but some fundamental differences set them apart. Let's dive in.

  1. Scaling Methodology: AWS Lambda scales automatically based on the incoming request load, allocating resources dynamically so that each request is processed independently and in parallel. Celery, on the other hand, scales manually: users configure the number of workers and concurrent tasks and must adjust that capacity themselves based on the anticipated load (a minimal sketch of such a setup follows this list).

  2. Event-Driven vs Task Queue: AWS Lambda is an event-driven computing service that allows developers to execute code in response to events like file uploads, database changes, or API calls. It focuses on executing specific functions in response to events, making it widely used in serverless architectures. Celery, on the other hand, is a distributed task queue that enables developers to queue and execute tasks asynchronously. It provides a broader scope for task management and coordination.

  3. Execution Environment: AWS Lambda provides a managed environment where users can write and execute functions using various programming languages supported by AWS. It takes care of provisioning and managing the infrastructure required to execute the functions. Celery, on the other hand, requires users to set up their execution environment, including message brokers like RabbitMQ or Redis, and worker processes. It gives users more control over the execution environment setup.

  4. Vendor Lock-in: AWS Lambda is a cloud service provided by Amazon Web Services (AWS) and is tightly integrated with other AWS services. Users risk vendor lock-in because using Lambda means committing to the AWS ecosystem. Celery, on the other hand, is an open-source technology that works with various message brokers and backends, which provides more flexibility and avoids vendor lock-in.

  5. Pricing Model: AWS Lambda follows a pay-as-you-go pricing model, where users are charged based on the number of requests and the amount of compute time used. It provides a detailed billing structure and automatic scalability based on demand. Celery, being an open-source technology, does not have any direct pricing associated with it. However, users need to consider the infrastructure costs for hosting the message broker and worker processes.

  6. Deployment and Management: AWS Lambda provides a seamless deployment experience, as it is integrated with other AWS services like AWS CloudFormation and the AWS Serverless Application Model (SAM). It simplifies the management of serverless applications and automates deployment. Celery, being a self-hosted technology, requires users to manage deployment and infrastructure on their own, which means additional effort for deployment and configuration management.
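
To ground points 1 and 3 above, here is a minimal sketch of the setup Celery leaves to you: a message broker you run yourself and worker concurrency you choose by hand. The module name, broker URL, and task are illustrative placeholders, not taken from this page; only the Celery settings and CLI shown are real.

# Minimal Celery setup sketch: you provide the broker and decide concurrency.
# Module name, broker URL, and task body are illustrative placeholders.
from celery import Celery

app = Celery("worker_demo", broker="redis://localhost:6379/0")

# Celery does not size itself to incoming load; capacity is chosen up front
# and adjusted by the operator.
app.conf.worker_concurrency = 8          # worker processes per node
app.conf.worker_prefetch_multiplier = 4  # tasks each process reserves ahead

@app.task
def process_item(item_id):
    """Placeholder unit of work pulled off the queue by a worker."""
    return f"processed {item_id}"

# Workers are started (and scaled) manually, e.g.:
#   celery -A worker_demo worker --loglevel=info --concurrency=8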

In summary, AWS Lambda and Celery differ in their scaling methodology, event-driven vs. task-queue approach, execution environment, vendor lock-in, pricing model, and deployment and management experience. The sketch below illustrates the event-driven side of that contrast.
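
For the event-driven side, a minimal sketch of a Lambda handler: the platform invokes it once per event (here, an assumed S3 upload notification, which is the standard S3 event shape) and manages concurrency itself, whereas with Celery the same work would only run after something enqueues it, e.g. process_item.delay(...) from the sketch above. The handler itself is a made-up example, not code from this page.

# AWS Lambda sketch: the platform calls this handler once per event and
# scales concurrent invocations automatically; there is no broker or
# worker fleet to operate.
import json

def lambda_handler(event, context):
    # Standard S3 notification shape: one or more Records per event.
    keys = [record["s3"]["object"]["key"] for record in event.get("Records", [])]
    for key in keys:
        print(f"processing {key}")
    return {"statusCode": 200, "body": json.dumps({"processed": len(keys)})}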


Advice on Celery, AWS Lambda

Tim
CTO at Checkly Inc.
Sep 18, 2019

Needs advice on Heroku and AWS Lambda

When adding a new feature to Checkly or rearchitecting some older piece, I tend to pick Heroku for rolling it out. But not always, because sometimes I pick AWS Lambda. The short story:

  • Developer Experience trumps everything.
  • AWS Lambda is cheap. Up to a limit though. This impacts not only your wallet.
  • If you need geographic spread, AWS is lonely at the top.

The setup

Recently, I was doing a brainstorm at a startup here in Berlin on the future of their infrastructure. They were ready to move on from their initial, almost 100% EC2 + Chef based setup. Everything was on the table. But we crossed out a lot quite quickly:

  • Pure, uncut, self-hosted Kubernetes — way too much complexity
  • Managed Kubernetes in various flavors — still too much complexity
  • Zeit — Maybe, but no Docker support
  • Elastic Beanstalk — Maybe, bit old but does the job
  • Heroku
  • Lambda

It became clear a mix of PaaS and FaaS was the way to go. What a surprise! That is exactly what I use for Checkly! But when do you pick which model?

I chopped that question up into the following categories:

  • Developer Experience / DX 🤓
  • Ops Experience / OX 🐂 (?)
  • Cost 💵
  • Lock in 🔐

Read the full post linked below for all details

Shantha
Sep 30, 2020

Needs advice on RabbitMQ, Celery, and MongoDB

I am just a beginner at these two technologies.

Problem statement: I am getting a lakh of users from SQL Server, for whom I need to create caches in MongoDB by making different REST API requests.

Here these users can be treated as messages. Each REST API request is a task.

I am confused about whether I should go for RabbitMQ alone or Celery.

If I have to go with RabbitMQ, I prefer to use Python with the Pika module. But the challenge with Pika is that it is not thread-safe, so I am not finding a way to execute a lakh of API requests in parallel using multiple threads with Pika.

If I have to go with Celery, I don't know how I can achieve good scalability when executing these API requests in parallel.
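
Purely as an illustration of what the Celery route could look like (not a recommendation, and the broker URL, endpoint, and task name below are made-up placeholders): each REST call becomes a task, parallelism comes from the worker pool rather than from threads in the producer, and retries are handled per task.

# Sketch only: broker URL, API endpoint, and task name are placeholders.
import requests
from celery import Celery

app = Celery("cache_builder", broker="amqp://guest@localhost//")

@app.task(bind=True, max_retries=3)
def build_user_cache(self, user_id):
    """Fetch one user's data over REST; the result would be written to the MongoDB cache."""
    try:
        resp = requests.get(f"https://api.example.com/users/{user_id}", timeout=10)
        resp.raise_for_status()
        # insert resp.json() into MongoDB here
        return resp.status_code
    except requests.RequestException as exc:
        raise self.retry(exc=exc, countdown=5)

# Producer side: enqueue one task per user id pulled from SQL Server.
#   for user_id in user_ids:
#       build_user_cache.delay(user_id)
#
# Parallelism is tuned on the worker, e.g. (gevent pool needs the gevent package):
#   celery -A cache_builder worker --concurrency=50 --pool=gevent

Because these API calls are I/O-bound, Celery users typically reach for an eventlet or gevent worker pool, or simply more worker processes, to get the parallelism that Pika's thread-safety constraints make awkward to build by hand.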


Detailed Comparison

Celery
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.

AWS Lambda
AWS Lambda is a compute service that runs your code in response to events and automatically manages the underlying compute resources for you. You can use AWS Lambda to extend other AWS services with custom logic, or create your own back-end services that operate at AWS scale, performance, and security.

Key Features
  • Celery: -
  • AWS Lambda: Extend other AWS services with custom logic; Build custom back-end services; Completely Automated Administration; Built-in Fault Tolerance; Automatic Scaling; Integrated Security Model; Bring Your Own Code; Pay Per Use; Flexible Resource Model
Statistics
  • GitHub Stars: Celery 27.5K, AWS Lambda -
  • GitHub Forks: Celery 4.9K, AWS Lambda -
  • Stacks: Celery 1.7K, AWS Lambda 26.0K
  • Followers: Celery 1.6K, AWS Lambda 18.8K
  • Votes: Celery 280, AWS Lambda 432
Pros & Cons

Pros of Celery
  • Task queue (99)
  • Python integration (63)
  • Django integration (40)
  • Scheduled tasks (30)
  • Publish/subscribe (19)

Cons of Celery
  • Sometimes loses tasks (4)
  • Depends on broker (1)

Pros of AWS Lambda
  • No infrastructure (129)
  • Cheap (83)
  • Quick (70)
  • Stateless (59)
  • No deploy, no server, great sleep (47)

Cons of AWS Lambda
  • Can't execute Ruby or Go (7)
  • Compute time limited (3)
  • Can't execute PHP w/o significant effort (1)

What are some alternatives to Celery and AWS Lambda?

Kafka
Kafka is a distributed, partitioned, replicated commit log service. It provides the functionality of a messaging system, but with a unique design.

RabbitMQ
RabbitMQ gives your applications a common platform to send and receive messages, and your messages a safe place to live until received.

Amazon SQS
Transmit any volume of data, at any level of throughput, without losing messages or requiring other services to be always available. With SQS, you can offload the administrative burden of operating and scaling a highly available messaging cluster, while paying a low price for only what you use.

NSQ
NSQ is a realtime distributed messaging platform designed to operate at scale, handling billions of messages per day. It promotes distributed and decentralized topologies without single points of failure, enabling fault tolerance and high availability coupled with a reliable message delivery guarantee.

ActiveMQ
Apache ActiveMQ is fast, supports many Cross Language Clients and Protocols, comes with easy to use Enterprise Integration Patterns and many advanced features while fully supporting JMS 1.1 and J2EE 1.4. Apache ActiveMQ is released under the Apache 2.0 License.

ZeroMQ
The 0MQ lightweight messaging kernel is a library which extends the standard socket interfaces with features traditionally provided by specialised messaging middleware products. 0MQ sockets provide an abstraction of asynchronous message queues, multiple messaging patterns, message filtering (subscriptions), seamless access to multiple transport protocols and more.

Apache NiFi
An easy to use, powerful, and reliable system to process and distribute data. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic.

Azure Functions
Azure Functions is an event driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in virtually any Azure or 3rd party service as well as on-premises systems.

Google Cloud Run
A managed compute platform that enables you to run stateless containers that are invocable via HTTP requests. It's serverless by abstracting away all infrastructure management.

Gearman
Gearman allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events.
