multithreading - Beanstalk setup with multiple queue workers: jobs that spawn other jobs -
Is it safe for a job to spawn multiple jobs, so that vacant workers start working on them?
Currently I have this set up: 20 workers waiting for jobs to be pushed. One of the jobs sends iOS push notifications; the problem with iOS is that we can't send bulk messages.

Current behaviour: the job gets a list of specific users in a batch, fetches each device token from the DB, and starts sending the notifications.

Scenario: if one topic has 1000 users, I have 1000 users and devices, and I start sending to each device. I push a new job onto the queue and one worker picks it up, while the other workers sit vacant waiting for incoming jobs. If no other jobs come in during that time, worker 1 alone does all the sending.

That is what is working right now. Would it be safe if, instead of one big job, I created many smaller jobs that the vacant workers can pick up and work on?

P.S. All jobs are running in one tube.
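To make the setup concrete, here is a rough sketch of the fan-out being asked about, with an in-memory `queue.Queue` standing in for the beanstalkd tube and invented token names (this is an illustration of the pattern, not the actual code):

```python
import queue
import threading

jobs = queue.Queue()   # stands in for the single beanstalkd tube
sent = []
lock = threading.Lock()

def dispatcher(device_tokens, batch_size=50):
    """The 'big' job: instead of sending itself, it enqueues one job per batch."""
    for i in range(0, len(device_tokens), batch_size):
        jobs.put(device_tokens[i:i + batch_size])

def worker():
    """One of the 20 vacant workers: reserve a batch, send to each device in it."""
    while True:
        try:
            batch = jobs.get(timeout=0.1)
        except queue.Empty:
            return
        for token in batch:          # pretend-send to each device token
            with lock:
                sent.append(token)
        jobs.task_done()

tokens = [f"device-{n}" for n in range(1000)]
dispatcher(tokens)
threads = [threading.Thread(target=worker) for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(sent))  # 1000 -- every device reached, spread over 20 workers
```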
That sounds quite reasonable to me: spreading the load out among a number of workers.

There are a few things to be careful about, though, such as setting an appropriate priority. If the task that creates dozens, or hundreds, of other tasks has a higher priority than the jobs that do the sending, you can end up with potentially hundreds of thousands of jobs that the workers never get around to running, and the queue filling up.

Leaving large gaps between priority values also means you can slot in jobs that are more important. A more important customer may have a priority closer to zero, and hence be processed and sent ahead of a smaller customer.
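In beanstalkd, a lower priority number is more urgent. A small heap-based sketch shows both points above: the splitter sits behind the sends, and the gaps leave room to slot a VIP customer in closer to zero (the numbers here are purely illustrative):

```python
import heapq

# (priority, job body): in beanstalkd, a LOWER number is MORE urgent.
q = []
heapq.heappush(q, (5000, "split-topic-into-batches"))  # the job that creates jobs
heapq.heappush(q, (1000, "send-batch"))                # ordinary sends drain first
heapq.heappush(q, (100,  "send-batch-vip"))            # big customer, near zero

order = [heapq.heappop(q)[1] for _ in range(3)]
print(order)  # ['send-batch-vip', 'send-batch', 'split-topic-into-batches']
```

Because the values are spread out (100, 1000, 5000) rather than consecutive, a new job class can later be given, say, priority 500 without renumbering anything.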
Other matters to think about include the account being rate-limited: if you are limited to 10 notifications per second, then running 20 workers flat-out is a non-starter.
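If such a limit applies, the workers need a shared throttle rather than each running at full speed. A minimal token-bucket sketch, assuming a hypothetical cap of 10 notifications/second (in a real deployment the bucket state would live somewhere shared, e.g. Redis, not in one process):

```python
import time

class TokenBucket:
    """Allow at most `rate` sends per second overall, shared by all workers."""
    def __init__(self, rate):
        self.rate = rate
        self.capacity = rate          # allow a burst of up to one second's quota
        self.tokens = float(self.capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                  # caller should wait and retry

bucket = TokenBucket(rate=10)         # the hypothetical 10/second cap
allowed = sum(bucket.allow() for _ in range(100))
print(allowed)  # 10 -- the initial burst; further sends must wait for refill
```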
I would also put each new group of jobs into its own tube (running dozens of tubes is not expensive). A worker can watch a number of tubes at once (reserving the 'most important' job from among them), but you can't count the different types of job within a single tube, so splitting the types into different tubes lets you see how many jobs of each type are waiting. Thus, if the sending jobs are building up, you can slow or pause the splitting jobs for a while, or mark them with a lower priority for a while.
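As a sketch of what separate tubes buy you, here is a toy in-memory model: per-tube depth corresponds to beanstalkd's per-tube stats, and a reserve across several watched tubes picks the most urgent job among them (tube and job names are made up):

```python
# Toy model of several tubes; in beanstalkd, tubes are created on demand
# simply by using a new name, and are cheap to run in large numbers.
tubes = {
    "splitters": [(5000, "split-topic-A")],
    "sends":     [(1000, "send-batch-1"), (1000, "send-batch-2"),
                  (1000, "send-batch-3")],
}

def depth(tube):
    """Per-job-type backlog: only visible because each type has its own tube."""
    return len(tubes[tube])

def reserve(watched):
    """A worker watching several tubes reserves the most urgent job across all."""
    candidates = [(prio, name, tube) for tube in watched
                  for prio, name in tubes[tube]]
    prio, name, tube = min(candidates)   # lowest priority number wins
    tubes[tube].remove((prio, name))
    return name

print(depth("sends"))                    # 3 send jobs backed up
if depth("sends") > 2:                   # sends building up?
    pass                                 # e.g. pause the splitters for a while
print(reserve(["splitters", "sends"]))   # 'send-batch-1': 1000 beats 5000
```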
Finally, to keep the advantage of batching jobs and avoid per-job overhead, I wouldn't go all the way down to one notification per job: I'd split the 1000-off job into packets of maybe 25-50 notifications per job.
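The splitting step itself is a one-liner; assuming a packet size of 40 (any value in the 25-50 range works the same way), the 1000-device job becomes 25 queue jobs instead of 1000:

```python
def chunk(device_tokens, size=40):
    """Split a big token list into packets of `size` notifications per job."""
    return [device_tokens[i:i + size] for i in range(0, len(device_tokens), size)]

tokens = [f"device-{n}" for n in range(1000)]
packets = chunk(tokens)
# Each packet would then be put onto the queue as one job body.
print(len(packets))  # 25 jobs of 40 tokens instead of 1000 one-token jobs
```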