When you build a web API, some tasks take too long to run inside a single request. Things like sending emails, processing images, or calling slow external services can time out or make your application feel unresponsive. The solution is to offload these jobs to a background worker.
This article walks through a FastAPI project on GitHub that uses ARQ and Redis to manage background tasks. It provides a solid, production-ready template for handling asynchronous jobs in your own applications.
When it comes to background tasks in FastAPI, you have a few options. It's important to understand the trade-offs.
First, FastAPI has a built-in BackgroundTasks feature. It's very simple to use and is great for "fire-and-forget" tasks that don't need to be tracked, like sending a notification email after a user signs up.
However, the built-in solution has significant limitations for more complex applications:

- Tasks run in the same process as your web server, so heavy jobs compete with request handling for CPU and memory.
- There is no way to check a task's status or retrieve its result later.
- If the server restarts or crashes, pending tasks are simply lost; there are no retries and no persistence.
For anything beyond simple, non-critical tasks, you need a dedicated task queue. This is where tools like ARQ or Celery come in. They solve all the limitations of the built-in approach by running tasks in separate worker processes.
While many Python developers reach for Celery, this project uses ARQ for several compelling reasons:
Built for asyncio: ARQ is designed for asyncio from the ground up. Your API routes are async def, and so are your ARQ tasks. This makes integration with FastAPI seamless and avoids the complexity of bridging synchronous and asynchronous code that you might encounter with Celery.
High Performance: Because it's fully asynchronous, ARQ is significantly faster than synchronous queues like RQ. It can run hundreds of jobs concurrently in a single worker without the overhead of forking processes, making it extremely efficient for I/O-bound tasks.
Powerful and Reliable Features: ARQ includes features designed for critical jobs. It supports easy retries, deferred execution, and pessimistic job timeouts. This means if a job gets stuck, ARQ will consider it failed and requeue it, ensuring tasks don't get lost.
Simplicity and Elegance: ARQ has a small and focused codebase (around 700 lines of code). This makes it easy to understand, reason about, and debug, which is a significant advantage over larger, more complex frameworks.
For a modern FastAPI application, ARQ is a better, simpler, and more direct fit.
Here’s a simple breakdown to help you decide:
Use FastAPI's BackgroundTasks when:

- The task is quick, simple, and "fire-and-forget", like sending a notification email.
- You don't need to track the task's status or retrieve its result.
- Losing the task on a server restart is acceptable.

Use a Job Queue (like ARQ) when:

- You need to check a job's status, retrieve its result, or retry failures.
- Tasks are long-running or resource-intensive and must not block the web server.
- You need reliability: jobs should survive restarts and be requeued if they get stuck.
The project is a complete example of how to connect FastAPI, ARQ, and Redis. It's not just a "hello world" example; it includes features you'd need in a real application.
Here’s what it does:

- Exposes API endpoints to enqueue jobs and check their status by job_id.
- Runs an ARQ worker that picks jobs off a Redis queue and executes them.
- Persists job metadata to a database file (jobs.db) so results can be tracked.
- Includes example tasks, from a simple add function to a retriable HTTP request.
Getting the project running is straightforward. First, you need to have Redis installed and running.
Then, clone the repository and install the dependencies.
```bash
git clone https://github.com/davidmuraya/fastapi-arq.git
cd fastapi-arq
pip install -r requirements.txt
```
Next, create a .env file in the root directory to configure the Redis connection, queue name, and database file.
```ini
REDIS_BROKER=localhost:6379
WORKER_QUEUE=my-fastapi-queue
JOBS_DB=database/jobs.db
```
Now you can run the application. You'll need two separate terminals.
In the first terminal, start the FastAPI server:
```bash
uvicorn main:app --reload --port 5000
```
In the second terminal, start the ARQ worker:
```bash
arq worker:WorkerSettings
```
The worker will automatically connect to Redis and start listening for jobs on the queue you defined.
Once everything is running, you can enqueue a job. The project includes a simple add task that adds two numbers.
```bash
curl -X POST "http://localhost:5000/tasks/add" \
  -H "Content-Type: application/json" \
  -d '{"x": 5, "y": 10}'
```
The API will respond immediately with a JSON object containing the job_id.
```json
{
  "job_id": "your-unique-job-id"
}
```
You can then use this ID to check the job's status.
```bash
curl "http://localhost:5000/jobs/your-unique-job-id"
```
The response will tell you if the job is complete and show you the result.
```json
{
  "id": "your-unique-job-id",
  "status": "complete",
  "result": 15,
  "start_time": "2025-08-18T12:00:00.000Z",
  "finish_time": "2025-08-18T12:00:01.000Z"
}
```
The project also includes examples for more complex tasks, like making a retriable HTTP request.
In a real-world application, tasks can fail for many reasons - a network glitch, a temporary API outage, or a database deadlock. A robust system must be able to handle these transient failures gracefully.
ARQ is designed with this in mind and provides powerful features for retrying failed jobs. This includes:

- A Retry exception you can raise inside a task to requeue it, optionally with a back-off delay.
- A max_tries setting that caps how many times a job is attempted.
- Job timeouts, so a stuck job is marked failed instead of hanging forever.
Building resilient, idempotent tasks is a critical part of using any background job system. For a detailed guide on how to implement these retry mechanisms in ARQ, see my article: Building Resilient Task Queues in FastAPI with ARQ Retries.
Beyond enqueuing one-off tasks, ARQ also has powerful support for recurring jobs, similar to a traditional system cron. You can define these directly in your WorkerSettings.
This is done using the cron_jobs list, which takes cron functions. Here's how you would set up a task to run at 9:12 AM, 12:12 PM, and 6:12 PM every day:
```python
# In worker.py, or wherever your WorkerSettings are defined
from arq import cron

async def run_regularly(ctx):
    print("Running a scheduled job...")

class WorkerSettings:
    # ... other settings
    cron_jobs = [
        cron(run_regularly, hour={9, 12, 18}, minute=12)
    ]
```
The syntax is intuitive. You pass your async task function to cron() and specify the schedule using parameters like hour and minute. Using a set like {9, 12, 18} tells ARQ to run the job at each of those hours.
A thoughtful detail in ARQ's design is that it avoids scheduling jobs exactly at the top of a second. By default, it uses a microsecond offset to prevent jobs from being enqueued at the same instant as other system tasks, which can reduce load spikes.
Handling background tasks is a common requirement for web applications. This project provides a clear and practical blueprint for doing it in FastAPI with ARQ. It shows how to separate your API from your workers, manage job state, and build a reliable system for handling long-running processes. If you're looking for a straightforward way to add background jobs to your async Python project, give ARQ a try.
Q: Why not just use FastAPI's built-in BackgroundTasks?
A: FastAPI's BackgroundTasks is great for simple, "fire-and-forget" tasks. However, it doesn't allow you to check a job's status, retrieve its result, or run tasks in a separate process. For reliable, scalable applications where you need to monitor jobs, handle failures, and avoid blocking the web server, a dedicated queue system like ARQ is a much better approach.
Q: What is the difference between ARQ and Celery?
A: The main difference is that ARQ is designed for asyncio from the ground up, making it a natural fit for async frameworks like FastAPI. Celery is older and was built for synchronous code, so using it with async functions requires extra configuration.
Q: Do I need Redis to use ARQ?
A: Yes, ARQ uses Redis as its message broker to manage the queue of tasks between your web application and the workers. You will need a running Redis instance.
Q: How does the worker know what code to run?
A: The ARQ worker is configured to import your task functions (from tasks.py in this project). When it receives a job from the queue, it knows which function to call and what arguments to pass to it.
Q: Can I schedule a task to run in the future?
A: Yes. ARQ supports two types of scheduling. For one-off tasks, you can use defer_until to enqueue a job that will only run after a specific datetime. For recurring tasks, you can use cron_jobs in your WorkerSettings to run jobs on a schedule, similar to a traditional cron job.
Q: Is this setup ready for production?
A: The project provides a solid foundation for a production setup. For a real-world deployment, you would run the FastAPI server and the ARQ worker as separate services (e.g., in different Docker containers) and use a production-grade Redis server.
About the Author
David Muraya is a Solutions Architect specializing in Python, FastAPI, and Cloud Infrastructure. He is passionate about building scalable, production-ready applications and sharing his knowledge with the developer community. You can connect with him on LinkedIn.
Related Blog Posts
Enjoyed this blog post? Check out these related posts!

Building Resilient Task Queues in FastAPI with ARQ Retries
How to Handle Failures and Implement Robust Retry Logic in FastAPI Background Jobs with ARQ.

How to Protect Your FastAPI OpenAPI/Swagger Docs with Authentication
A Guide to Securing Your API Documentation with Authentication

A Practical Guide to FastAPI Security
A Comprehensive Checklist for Production-Ready Security for a FastAPI Application

How to Handle File Uploads in FastAPI
A Practical Guide to Streaming and Validating File Uploads
Contact Me
Have a project in mind? Send me an email at hello@davidmuraya.com and let's bring your ideas to life. I am always available for exciting discussions.