ImportError: No module named control - python

I have a Celery app, with files laid out like this:
/fetcher.py
/mirad
    celery.py
    fetcher_tasks.py
In celery.py I import fetcher_tasks.py, and in fetcher.py I call a task from fetcher_tasks.py.
I want to import celery.control in fetcher.py, but I can't get it to work. How can I do this?
This is part of my fetcher code:
from __future__ import absolute_import
import mirad.fetcher_tasks as tasks
from mirad.models.models import SourceModel
from mirad.settings import *
from mirad.celery.control import inspect
parse_feed_tasks = list()

def fetch():
    for source in SourceModel.objects(active=True):
        a = tasks.parse_feed.delay(source)

It looks like you are mixing up the celery.py module in your project, which is used to start the Celery app, with the celery package itself, from which you can import the necessary functions.
You should import the inspect function from the celery.task.control module instead (note that this module was removed in Celery 5.x; on newer versions you would use app.control.inspect() on your app instance):
from __future__ import absolute_import
import mirad.fetcher_tasks as tasks
from mirad.models.models import SourceModel
from mirad.settings import *
from celery.task.control import inspect
parse_feed_tasks = list()

def fetch():
    for source in SourceModel.objects(active=True):
        a = tasks.parse_feed.delay(source)


How to debug VSCODE python using local packages

import logging
import sys
import os
from LoggingHelper.main import LoggingHelper  # Import "LoggingHelper.main" could not be resolved Pylance(reportMissingImports)
from LoggingHelper.models import logDataclass as ldc  # Import "LoggingHelper.models" could not be resolved Pylance(reportMissingImports)
from .configs.project_configs import *
kafka_config = {}
from fastapi import FastAPI, Request  # Import "fastapi" could not be resolved Pylance(reportMissingImports)
from fastapi.responses import JSONResponse  # Import "fastapi.responses" could not be resolved Pylance(reportMissingImports)
import fastapi  # Import "fastapi" could not be resolved Pylance(reportMissingImports)
import azure.functions as func  # Import "azure.functions" could not be resolved Pylance(reportMissingImports)
import nest_asyncio
from starlette.middleware import Middleware  # Import "starlette.middleware" could not be resolved Pylance(reportMissingImports)
from starlette.middleware.cors import CORSMiddleware  # Import "starlette.middleware.cors" could not be resolved Pylance(reportMissingImports)
nest_asyncio.apply()
from TSAuthHelper import FastAPIAuthHelper  # Import "TSAuthHelper" could not be resolved Pylance(reportMissingImports)
import json
import os
import time
import traceback
import pickle
I have a local library/package directory ".python_packages\lib\site-packages" containing "LoggingHelper", "fastapi", "starlette/middleware", etc. But VS Code can't resolve them.
They work just fine if I publish them all to Azure Functions, but not locally, and I need to debug them locally in VS Code.
I have been trying to read everything I can, changing the interpreter, etc. But I'm not a developer and need some guidance.
from .python_packages/fastapi import FastAPI
from fullpath/.python_packages/fastapi import FastAPI
nothing works.
When running locally you may need to specify the major version of Python you intend to use. Normally on Linux the first line of a Python 3 program would be something like:
#!/usr/bin/python3
import os
...
Then, before importing modules that are in an unusual location, you may need to add a line (after import sys):
sys.path.append("/fullpath/to/custom/modules/dir")
After that you should be able to import modules that are in that directory.
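A minimal sketch of that approach for the layout described above, assuming the ".python_packages/lib/site-packages" folder lives in the project root and that the script runs from there (both are assumptions about this particular setup):

```python
import os
import sys

# Assumed layout: ".python_packages/lib/site-packages" sits in the project
# root, which is also the current working directory when the script runs.
pkg_dir = os.path.join(os.getcwd(), ".python_packages", "lib", "site-packages")
if pkg_dir not in sys.path:
    # Put the local folder ahead of the global site-packages on the search path.
    sys.path.insert(0, pkg_dir)
```

For the Pylance warnings specifically, you can also add that folder to the `python.analysis.extraPaths` setting in the workspace settings.json, so the editor resolves the imports even before the code runs.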

Logging; ImportError: cannot import name 'get_logger' from 'logger'

I've got a Python script from a friend who is not available at the moment.
Unfortunately I am not able to run it.
Error:
ImportError: cannot import name 'get_logger' from 'logger' (C:\Users\MyName\AppData\Local\Programs\Python\Python311\Lib\site-packages\logger\__init__.py)
I can locate the file __init__.py at the mentioned path.
Part of the script which causes this error:
import time
import sys
import requests
from bs4 import BeautifulSoup
import pandas as pd
from requests.api import post
import os
import traceback
from logger import get_logger #<----- Error
from datetime import date
dir_path = os.path.dirname(os.path.realpath(__file__))
logging = get_logger(dir_path)
logger = logging.getLogger(__name__)
logger = logging.getLogger()
[...]
Code of __init__.py:
import logging
import logging.config
import os
import sys
import types
logging.getLogger('paramiko').setLevel(logging.WARNING)
logging.getLogger('requests').setLevel(logging.WARNING)
config_file = os.path.join(os.path.dirname(__file__), 'logging.conf')
logging.config.fileConfig(config_file, disable_existing_loggers=False)
logger = logging.getLogger()
def error(self, msg, *args, **kwargs):
    self.error(msg, *args, **kwargs)
    sys.exit(1)
logger.interrupt = types.MethodType(error, logger)
logger.info = logger.info
logger.warn = logger.warn
I hope you guys can help me solve this issue.
Thanks!
What I tried:
Googled ImportError: cannot import name 'get_logger' from 'logger' - the answer was based on another issue (ImportError: cannot import name 'getLogger').
Tried to directly import logging instead of logger.
No idea what else I should try, since I usually do not develop (hi from sales).
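The __init__.py shown never defines get_logger, which is exactly why the import fails: the script expects a helper that this copy of the logger package does not provide. Judging by how the script uses the return value (it calls .getLogger(...) on it), get_logger presumably configured logging and returned the logging module itself. A hypothetical reconstruction, under that assumption, that could be added to that __init__.py:

```python
import logging

def get_logger(log_dir):
    # Hypothetical reconstruction: configure root logging, then return the
    # logging module itself, so callers can write get_logger(d).getLogger(name)
    # exactly as the script does. log_dir is unused here; the original helper
    # presumably wrote a log file into that directory.
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(name)s: %(message)s",
    )
    return logging

log = get_logger(".").getLogger("demo")
```

This is only a sketch to get the script running; the friend's real get_logger may have configured handlers from the logging.conf file referenced above.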

Irregular behaviour of celery

I am very new to Celery and I am trying to use it to schedule a function, but it does not seem to be working properly.
Here is my settings.py (along with the default settings provided by Django):
CELERY_BROKER_URL = 'amqp://guest:guest@localhost'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_RESULT_BACKEND = 'db+sqlite:///results.sqlite'
CELERY_TASK_SERIALIZER = 'json'
celery.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from celery.schedules import crontab
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mera_project.settings')
app = Celery('mera_project')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()
__init__.py:
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
tasks_notes/tasks.py (tasks_notes is my app name):
from celery.decorators import periodic_task
from celery.task.schedules import crontab
from tasks_notes.models import BudgetInfo
@periodic_task(run_every=(crontab(minute='*/15')))
def monthly_starting_value():
    print("hi")
    return 0
views.py:
from .tasks import monthly_starting_value
def app_view(request):
    abcd = monthly_starting_value.delay()
    print("new" + str(abcd))
I had expected to see zero and hi in my terminal, but instead I got a random-looking id such as new 42bf83ef-850f-4b34-af78-da696d2ee0f2, and that id keeps changing every 15 minutes.
In my `celery beat` terminal tab, I am getting something like:
WARNING/ForkPoolWorker-9] hi
Task tasks_notes.tasks.monthly_starting_value[42bf83ef-850f-4b34-af78-da696d2ee0f2] succeeded in 0.0009442089994990965s: 0
in every 15 minutes.
I have even tried `app.conf.beat_schedule` in `celery.py`, and also tried running it from the admin page, but it is not working as expected.
Where can I be wrong?
Any help is highly appreciated.
It is definitely not irregular - it behaves exactly as it should.
If you want to grab the result of a task, you should have something like:
abcd = monthly_starting_value.delay().get()
delay() returns an instance of the AsyncResult class; the "random number" you are printing is that result's task id.
Finally, do not call print() inside a task. Use the Celery logger.
Example:
import os
from celery.utils.log import get_task_logger
from worker import app
logger = get_task_logger(__name__)
@app.task()
def add(x, y):
    result = x + y
    logger.info(f'Add: {x} + {y} = {result}')
    return result

Should I write import statements in each file in python

I have three files, module_one.py, module_two.py and module_three.py, each containing one class:
# module_one.py
from module_two import ModuleTwo
from module_three import ModuleThree

class ModuleOne:
    ...

# module_two.py
from module_one import ModuleOne
from module_three import ModuleThree

class ModuleTwo:
    ...

# module_three.py
from module_one import ModuleOne
from module_two import ModuleTwo

class ModuleThree:
    ...
I have a file main.py which looks like this:
# main.py
from module_one import ModuleOne
from module_two import ModuleTwo
from module_three import ModuleThree
... # and uses them
It is the entry point of the application and uses those module classes.
In each of those module files (module_*.py), as you can see, I have to import the other modules in order to use them. Isn't there a better way to manage this? Can I write a statement in main.py so that, once those modules are loaded into memory, they can see each other and call one another without having to import each other in their own files first?
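To the question itself: yes, each file has to import what it uses. An import binds names into each module's own namespace, and there is no statement in main.py that can inject names into the other modules. The good news is that repeated imports are cheap: after the first load, a module is cached in sys.modules, so later imports are just a dictionary lookup and a name binding, not a reload. A small sketch of that caching behaviour, using the stdlib math module as a stand-in:

```python
import sys
import math

# The first import loads and caches the module; any later import of the
# same module only looks it up in sys.modules and rebinds the name.
assert "math" in sys.modules
import math as math_again
assert math_again is sys.modules["math"]  # same object, not a fresh copy
```

Note also that module_one, module_two and module_three as shown import each other in a cycle; with from-imports at module top level that is likely to raise an ImportError at startup, so it is usually better to break the cycle, or to import inside the functions that actually need the other class.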

Python - Can't import with_statement from __future__

Here are the other imports already in my app
import os
import sys
from google.appengine.ext.webapp import template
import cgi
import urllib
import wsgiref.handlers
from google.appengine.ext import db
from google.appengine.api import users
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from google.appengine.api import mail
from django.utils import simplejson as json
from datetime import datetime
import random
import string
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers
from google.appengine.api import files
Everything compiles and works fine.
But when I add this import:
from __future__ import with_statement
nothing works. I go to appspot, and the page just says "server error."
How can I successfully import with_statement?
EDIT:
I know blobstore is deprecated. with is used with blobstore. Could that be what's causing the problem? But with isn't only used with blobstore...
Try moving the import to the very top of your file (after any #!/usr/bin/python2.x line):
from __future__ imports must occur at the beginning of the file.
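A minimal sketch of the fix: only a shebang line, comments, and the module docstring may precede a __future__ import; every other import comes after it. (The shebang below is illustrative.)

```python
#!/usr/bin/python2.7
"""Module docstring may come before the __future__ import."""
from __future__ import with_statement  # must precede all other imports

import tempfile

# `with` now works (it has been on by default since Python 2.6 anyway).
with tempfile.TemporaryFile() as f:
    f.write(b"ok")
    f.seek(0)
    data = f.read()
```

Putting the __future__ import anywhere later in the file is a SyntaxError, which on App Engine would surface only as the generic "server error" page.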
