- Fix reconnecting after disconnect (thanks to @heimtathurs)
- Add Python 3.11 support (with the `evo-aioredis` dependency instead of `aioredis`)
- Remove `pydantic` dependency
- Remove `aioredis` from dependencies to allow choosing between `aioredis` and `evo-aioredis`, a fork with Python 3.11 compatibility
- Added the ability to optionally pass `ctx` to the task, like this:

  ```python
  @task(with_ctx=True)
  def foobar(ctx):
      log.info('Foobar try %s', ctx['job_try'])
  ```

  `ctx` contains `job_id`, `job_try`, `enqueue_time`, `score`, `metadata` + all of the worker's ctx (including custom context, which can be passed via `on_startup`). Thanks to @kindermax (#426)!
- Add proper typing for functions wrapped with the `@task` decorator. Mypy will now check that parameters are passed correctly when calling `func()` and `func.delay()`
- Add `sentinel_timeout` param (defaults to 0.2) to `RedisSettings`
- Breaking change: Rename `darq.worker.Function` to `darq.worker.Task`
- Migrate naming from `job` to `task`
- Add `max_jobs` parameter to the CLI (thanks to @antonmyronyuk)
- Fixed bug with the `expires` argument: `default_job_expires` could not be replaced with `None` in `@task` or `.apply_async`
- Breaking change: Add `scheduler_ctx` param to `on_scheduler_startup` and `on_scheduler_shutdown` to share data between these callbacks. It already has `ctx['redis']`, an instance of `ArqRedis`
- Breaking change: Changed the CLI command format. Before: `darq some_project.darq_app.darq`. Now: `darq -A some_project.darq_app.darq worker`
- Breaking change: The scheduler (cron jobs) now runs separately from the worker (see the `darq scheduler` command)
- Breaking change: Changed some function signatures (renamed arguments)
- Breaking change: Remove `redis_pool` param from the `Darq` app
- Add `on_scheduler_startup` and `on_scheduler_shutdown` callbacks
- Fix some types (`cron`, `OnJobPrepublishType`)
- `on_job_prerun` now runs before the "task started" log and `on_job_postrun` now runs after the "task finished" log
- `.apply_async`: Make `args` and `kwargs` arguments optional
- Fork `arq` into the project and merge it with `darq` (it was easier to rewrite `arq` than to write a wrapper)
- Breaking change: Remove "magic" params from `.delay`. To enqueue a job with special params, `.apply_async` was added.
- Add `watch`-mode to the CLI.
- Fix: the worker will no longer run a cron job if the job's function queue does not match the worker's
- Breaking change: Changed the Darq constructor from a single config param to separate params.
- `arq_function.coroutine` now has a `.delay` method.
- Add `on_job_prepublish(metadata, arq_function, args, kwargs)` callback. `metadata` is a mutable dict, which will be available at `ctx['metadata']`.
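How a prepublish hook can inject data that later surfaces in the job's `ctx` can be sketched as follows (a simplified toy; only the callback signature comes from the release note above — the `enqueue` helper and its payload shape are assumptions):

```python
def on_job_prepublish(metadata, arq_function, args, kwargs):
    # Mutate the dict in place; the worker would expose it as ctx['metadata'].
    metadata["trace_id"] = "abc123"


def enqueue(function, args, kwargs, prepublish=None):
    """Toy enqueue: run the hook, then build the job payload."""
    metadata = {}
    if prepublish is not None:
        prepublish(metadata, function, args, kwargs)
    return {"function": function, "args": args, "kwargs": kwargs,
            "metadata": metadata}


job = enqueue("send_email", ("user@example.com",), {},
              prepublish=on_job_prepublish)
print(job["metadata"])  # {'trace_id': 'abc123'}
```

Because the hook receives the dict before the job is published, anything it writes travels with the job and is visible to every later callback.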
- Add `default_job_expires` param to Darq (if the job still hasn't started after this duration, do not run it). Default: 1 day
- Add `expires` param to `@task` (if set, it overwrites `default_job_expires`)
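The expiry rule can be illustrated with a small check (a sketch of the semantics described above, not darq's code; the `should_run` helper is an assumption):

```python
from datetime import datetime, timedelta

DEFAULT_JOB_EXPIRES = timedelta(days=1)  # darq's stated default


def should_run(enqueue_time: datetime, now: datetime,
               expires: "timedelta | None" = DEFAULT_JOB_EXPIRES) -> bool:
    """A job is skipped if it still hasn't started after `expires`.

    `expires=None` (allowed in @task / .apply_async after the fix)
    disables expiry entirely.
    """
    if expires is None:
        return True
    return now - enqueue_time <= expires


now = datetime(2020, 1, 2, 12, 0)
assert should_run(datetime(2020, 1, 2, 11, 0), now)         # 1 hour old: run
assert not should_run(datetime(2020, 1, 1, 11, 0), now)     # over a day old: skip
assert should_run(datetime(2019, 1, 1), now, expires=None)  # expiry disabled
```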
- Rewrite warm shutdown: cron is now disabled during warm shutdown; on a second signal, the warm shutdown is canceled
- Breaking change: `on_job_prerun` and `on_job_postrun` now accept `arq.worker.Function` instead of the original function (it can still be accessed at `arq_function.coroutine`)
- Fix `add_cron_jobs` method. Tests added.
- Add `on_job_prerun(ctx, function, args, kwargs)` and `on_job_postrun(ctx, function, args, kwargs, result)` callbacks.
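The order in which these hooks fire around a task run can be sketched with a toy worker loop (only the callback signatures come from the note above; the `run_job` wiring is an assumption for illustration):

```python
import asyncio


async def on_job_prerun(ctx, function, args, kwargs):
    print(f"before {function}")


async def on_job_postrun(ctx, function, args, kwargs, result):
    print(f"after {function}, result={result}")


async def run_job(ctx, function, coroutine, args, kwargs):
    # Toy worker: prerun hook -> task itself -> postrun hook (with result).
    await on_job_prerun(ctx, function, args, kwargs)
    result = await coroutine(*args, **kwargs)
    await on_job_postrun(ctx, function, args, kwargs, result)
    return result


async def add(a, b):
    return a + b


result = asyncio.run(run_job({}, "add", add, (2, 3), {}))
print(result)  # 5
```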
- Breaking change: Jobs no longer explicitly get `JobCtx` as the first argument, as in 99.9% of cases it is not needed. A future release will make it possible to optionally pass `JobCtx` in some way.
- Breaking change: All cron jobs should be wrapped in the `@task` decorator
- Directly pass `functions` to `arq.Worker`, not names.
- `.delay()` now returns the `arq_redis.enqueue_job` result (`Optional[Job]`)
- Add `py.typed` file
- Fixed `add_cron_jobs` typing
- Add `add_cron_jobs` method
- First release