Running background Celery task in Flask
Tag : python , By : Lunis Neko
Date : March 29 2020, 07:55 AM
I ended up saving the ID of my task in Flask's session dictionary. See the code below:

#!/usr/bin/env python
"""Page views."""
from flask import Flask, render_template, request, session
from celerytest import add
from time import sleep

app = Flask(__name__)
app.secret_key = 'change-me'  # the session needs a secret key

@app.route('/', methods=['GET', 'POST'])
def run():
    if request.method == 'GET':
        return render_template("template.html")
    else:
        form = request.form
        n1 = str(form.get("n1"))
        n2 = str(form.get("n2"))
        # Kick off the task and remember its ID so /loading can poll it later.
        async_res = add.delay(n1, n2)
        session['TASK_ID'] = async_res.id
        return render_template("loading.html")

@app.route('/loading')
def check_if_complete():
    # Rebuild an AsyncResult from the ID stored in the session.
    async_res = add.AsyncResult(session['TASK_ID'])
    if async_res.ready():
        return render_template("template2.html", val=async_res.get())
    else:
        sleep(5)
        return render_template("loading.html")

if __name__ == '__main__':
    app.run()
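The view above imports add from a celerytest module that isn't shown. A minimal sketch of what that module might look like, assuming a local Redis broker and result backend (the URLs and the int conversion are assumptions, not part of the original answer):

#!/usr/bin/env python
"""Hypothetical celerytest.py defining the add task imported above."""
from celery import Celery

# Broker/backend URLs are assumptions; use whatever your deployment provides.
celery_app = Celery('celerytest',
                    broker='redis://localhost:6379/0',
                    backend='redis://localhost:6379/0')

@celery_app.task
def add(n1, n2):
    """Add the two numbers submitted from the Flask form."""
    return int(n1) + int(n2)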
Python Celery - How to call celery tasks inside other task
Date : March 29 2020, 07:55 AM
I'm calling a task from within another task in Django-Celery. This should work:
celery.current_app.send_task('mymodel.tasks.mytask', args=[arg1, arg2, arg3])
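For context, a minimal sketch of how that call might sit inside another task; the module path 'mymodel.tasks.mytask' comes from the answer, while the surrounding outer_task is an assumption for illustration:

from celery import shared_task, current_app

@shared_task
def outer_task(arg1, arg2, arg3):
    # send_task dispatches the inner task by name, so this module never has to
    # import it, which avoids circular imports between task modules.
    result = current_app.send_task('mymodel.tasks.mytask', args=[arg1, arg2, arg3])
    return result.id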
In Celery Python, output of a task to parallel processing task
Date : March 29 2020, 07:55 AM
From what I understand of your requirement, I would suggest that instead of making mainTask async, you call the other five tasks as Celery tasks from mainTask, i.e.:

def mainTask(msg, nc):
    decryptFunction.decryptFunc(msg)
    if len(decryptFunction.messageJson):
        # Dispatch each step as its own Celery task so they run in parallel.
        response1 = task1.delay(decryptFunction.messageJson)
        response2 = task2.delay(decryptFunction.messageJson)
        response3 = task3.delay(decryptFunction.messageJson)
        response4 = task4.delay(decryptFunction.messageJson)
        response5 = task5.delay(decryptFunction.messageJson)

def on_message(client, userdata, msg):
    result = mainTask(msg.payload.decode("utf-8"), 1)
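If the five results need to be collected once they all finish, a celery.group can fan the work out and gather the outputs. A hedged sketch; task1 through task5 are the tasks from the snippet above, while run_in_parallel and its message_json argument are hypothetical names used here for illustration:

from celery import group

def run_in_parallel(message_json):
    # Fan the decrypted payload out to all five tasks at once.
    job = group(task1.s(message_json),
                task2.s(message_json),
                task3.s(message_json),
                task4.s(message_json),
                task5.s(message_json))
    async_result = job.apply_async()
    # .get() blocks until every subtask has finished; avoid calling it
    # from inside another Celery task.
    return async_result.get()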
Advanced task formatting in Flower (Celery monitoring)
Tag : python , By : doctorbigtime
Date : March 29 2020, 07:55 AM
I found a solution to get the task's arguments as they are when called (list/dict) instead of strings. The argsrepr and kwargsrepr parameters of the apply_async method allow you to specify custom representations for the task's arguments.

import json
from celery import Task, shared_task

class FlowerTask(Task):
    def delay(self, *args, **kwargs):
        argsrepr, kwargsrepr = [], {}
        for arg in args:
            if isinstance(arg, bytes):
                argsrepr.append("<binary content>")
            elif isinstance(arg, list):
                argsrepr.append("<list ({} items)>".format(len(arg)))
            else:
                ...
        for key, value in kwargs.items():
            ...
        # Create and call the task with the reprs we just built
        new_task = super().s(*args, **kwargs)
        return new_task.apply_async(argsrepr=json.dumps(argsrepr),
                                    kwargsrepr=json.dumps(kwargsrepr))

@shared_task(base=FlowerTask)
def test_task(*args, **kwargs):
    return "OK !"

def format_task(task):
    task.args = json.loads(task.args)
    if not task.args:
        task.args = "( )"
    else:
        task.args = ', '.join(arg for arg in task.args)
    # [...]
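For completeness, Flower picks format_task up from a flowerconfig.py in the directory it is started from (or a file passed with --conf). A minimal sketch, assuming the function above is moved there and finishes by returning the task; the empty-args handling is an assumption:

# flowerconfig.py -- read by Flower at startup
import json

def format_task(task):
    # Reverse the json.dumps applied by FlowerTask.delay so the dashboard
    # shows the readable placeholders instead of a quoted JSON string.
    args = json.loads(task.args) if task.args else []
    task.args = "( )" if not args else ", ".join(str(a) for a in args)
    return task  # Flower displays whatever this function returns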
Python Celery task to restart celery worker
Tag : python , By : LinnheCreative
Date : March 29 2020, 07:55 AM