json.decoder.JSONDecodeError when migrating to Kubeflow Pipelines v2 - python

Copied from here: https://github.com/kubeflow/pipelines/issues/7608
I have a generated code file that runs against Kubeflow. It ran fine on Kubeflow v1, and now I'm moving it to Kubeflow v2. When I do this, I get the following error:
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
I honestly don't even know where to go next. It feels like something is fundamentally broken for it to fail on the very first character, but I can't see what (it happens inside the kubeflow execution).
Thanks!
Environment
How did you deploy Kubeflow Pipelines (KFP)?
Standard deployment to AWS
KFP version:
1.8.1
KFP SDK version:
1.8.12
Here are the logs:
time="2022-04-26T17:38:09.547Z" level=info msg="capturing logs" argo=true
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead:
https://pip.pypa.io/warnings/venv
[KFP Executor 2022-04-26 17:38:24,691 INFO]: Looking for component `run_info_fn` in --component_module_path `/tmp/tmp.NJW6PWXpIt/ephemeral_component.py`
[KFP Executor 2022-04-26 17:38:24,691 INFO]: Loading KFP component "run_info_fn" from /tmp/tmp.NJW6PWXpIt/ephemeral_component.py (directory "/tmp/tmp.NJW6PWXpIt" and module name "ephemeral_component")
Traceback (most recent call last):
File "/usr/local/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/local/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/usr/local/lib/python3.7/site-packages/kfp/v2/components/executor_main.py", line 104, in <module>
executor_main()
File "/usr/local/lib/python3.7/site-packages/kfp/v2/components/executor_main.py", line 94, in executor_main
executor_input = json.loads(args.executor_input)
File "/usr/local/lib/python3.7/json/__init__.py", line 348, in loads
return _default_decoder.decode(s)
File "/usr/local/lib/python3.7/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/local/lib/python3.7/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
time="2022-04-26T17:38:24.803Z" level=error msg="cannot save artifact /tmp/outputs/run_info/data" argo=true error="stat /tmp/outputs/run_info/data: no such file or directory"
Error: exit status 1
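For what it's worth, the decoder error at `char 1` is exactly what `json.loads` raises when the executor input isn't real JSON, e.g. a Python-repr-style string with single quotes. The payload below is hypothetical, just to show the failure mode:

```python
import json

# Hypothetical payload: Python repr-style with single quotes, not valid JSON.
bad_executor_input = "{'run_info': 'abc'}"

try:
    json.loads(bad_executor_input)
except json.JSONDecodeError as err:
    # Same message as in the logs above:
    print(err)  # -> Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
```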
Here are the files to reproduce:
root_pipeline_04d99580c84b47c28405a2c8bcae8703.py
import kfp.v2.components
from kfp.v2.dsl import InputPath
from kubernetes.client.models import V1EnvVar
from kubernetes import client, config
from typing import NamedTuple
from base64 import b64encode
import kfp.v2.dsl as dsl
import kubernetes
import json
import kfp
from run_info import run_info_fn
from same_step_000_ce6494722c474dd3b8bef482bb976557 import same_step_000_ce6494722c474dd3b8bef482bb976557_fn
run_info_comp = kfp.v2.dsl.component(
    func=run_info_fn,
    packages_to_install=[
        "kfp",
        "dill",
    ],
)

same_step_000_ce6494722c474dd3b8bef482bb976557_comp = kfp.v2.dsl.component(
    func=same_step_000_ce6494722c474dd3b8bef482bb976557_fn,
    base_image="public.ecr.aws/j1r0q0g6/notebooks/notebook-servers/codeserver-python:v1.5.0",
    packages_to_install=[
        "dill",
        "requests",
        # TODO: make this a loop
    ],
)
@kfp.dsl.pipeline(name="root_pipeline_compilation",)
def root(
    context: str = '', metadata_url: str = '',
):
    # Generate secrets (if not already created)
    secrets_by_env = {}
    env_vars = {
    }
    run_info = run_info_comp(run_id=kfp.dsl.RUN_ID_PLACEHOLDER)
    same_step_000_ce6494722c474dd3b8bef482bb976557 = same_step_000_ce6494722c474dd3b8bef482bb976557_comp(
        input_context_path="",
        run_info=run_info.outputs["run_info"],
        metadata_url=metadata_url
    )
    same_step_000_ce6494722c474dd3b8bef482bb976557.execution_options.caching_strategy.max_cache_staleness = "P0D"
    for k in env_vars:
        same_step_000_ce6494722c474dd3b8bef482bb976557.add_env_variable(V1EnvVar(name=k, value=env_vars[k]))
run_info.py
"""
The run_info component fetches metadata about the current pipeline execution
from kubeflow and passes it on to the user code step components.
"""
from typing import NamedTuple
def run_info_fn(
    run_id: str,
) -> NamedTuple("RunInfoOutput", [("run_info", str),]):
    from base64 import urlsafe_b64encode
    from collections import namedtuple
    import datetime
    import base64
    import dill
    import kfp

    client = kfp.Client(host="http://ml-pipeline:8888")
    run_info = client.get_run(run_id=run_id)
    run_info_dict = {
        "run_id": run_info.run.id,
        "name": run_info.run.name,
        "created_at": run_info.run.created_at.isoformat(),
        "pipeline_id": run_info.run.pipeline_spec.pipeline_id,
    }
    # Track kubernetes resources associated with the run.
    for r in run_info.run.resource_references:
        run_info_dict[f"{r.key.type.lower()}_id"] = r.key.id
    # Base64-encoded as the value is visible in the kubeflow ui.
    output = urlsafe_b64encode(dill.dumps(run_info_dict))
    return namedtuple("RunInfoOutput", ["run_info"])(
        str(output, encoding="ascii")
    )
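The encode/decode round trip that run_info_fn performs can be sketched with the stdlib alone; pickle stands in for dill here, and the dict values are made up:

```python
from base64 import urlsafe_b64encode, urlsafe_b64decode
import pickle  # stdlib stand-in for dill, which serialises the dict the same way

run_info_dict = {"run_id": "abc-123", "name": "demo-run"}  # hypothetical values

# What run_info_fn returns: a URL-safe base64 ASCII string.
encoded = str(urlsafe_b64encode(pickle.dumps(run_info_dict)), encoding="ascii")

# What the downstream step does to get the dict back:
decoded = pickle.loads(urlsafe_b64decode(encoded))
assert decoded == run_info_dict
```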
same_step_000_ce6494722c474dd3b8bef482bb976557.py
import kfp
from kfp.v2.dsl import component, Artifact, Input, InputPath, Output, OutputPath, Dataset, Model
from typing import NamedTuple
def same_step_000_ce6494722c474dd3b8bef482bb976557_fn(
    input_context_path: InputPath(str),
    output_context_path: OutputPath(str),
    run_info: str = "gAR9lC4=",
    metadata_url: str = "",
):
    from base64 import urlsafe_b64encode, urlsafe_b64decode
    from pathlib import Path
    import datetime
    import requests
    import tempfile
    import dill
    import os

    input_context = None
    with Path(input_context_path).open("rb") as reader:
        input_context = reader.read()

    # Helper function for posting metadata to mlflow.
    def post_metadata(json):
        if metadata_url == "":
            return
        try:
            req = requests.post(metadata_url, json=json)
            req.raise_for_status()
        except requests.exceptions.HTTPError as err:
            print(f"Error posting metadata: {err}")

    # Move to writable directory as user might want to do file IO.
    # TODO: won't persist across steps, might need support in SDK?
    os.chdir(tempfile.mkdtemp())

    # Load information about the current experiment run:
    run_info = dill.loads(urlsafe_b64decode(run_info))

    # Post session context to mlflow.
    if len(input_context) > 0:
        input_context_str = urlsafe_b64encode(input_context)
        post_metadata(
            {
                "experiment_id": run_info["experiment_id"],
                "run_id": run_info["run_id"],
                "step_id": "same_step_000",
                "metadata_type": "input",
                "metadata_value": input_context_str,
                "metadata_time": datetime.datetime.now().isoformat(),
            }
        )

    # User code for step, which we run in its own execution frame.
    user_code = f"""
import dill

# Load session context into global namespace:
if { len(input_context) } > 0:
    dill.load_session("{ input_context_path }")

{dill.loads(urlsafe_b64decode("gASVGAAAAAAAAACMFHByaW50KCJIZWxsbyB3b3JsZCIplC4="))}

# Remove anything from the global namespace that cannot be serialised.
# TODO: this will include things like pandas dataframes, needs sdk support?
_bad_keys = []
_all_keys = list(globals().keys())
for k in _all_keys:
    try:
        dill.dumps(globals()[k])
    except TypeError:
        _bad_keys.append(k)
for k in _bad_keys:
    del globals()[k]

# Save new session context to disk for the next component:
dill.dump_session("{output_context_path}")
"""

    # Runs the user code in a new execution frame. Context from the previous
    # component in the run is loaded into the session dynamically, and we run
    # with a single globals() namespace to simulate top-level execution.
    exec(user_code, globals(), globals())

    # Post new session context to mlflow:
    with Path(output_context_path).open("rb") as reader:
        context = urlsafe_b64encode(reader.read())
        post_metadata(
            {
                "experiment_id": run_info["experiment_id"],
                "run_id": run_info["run_id"],
                "step_id": "same_step_000",
                "metadata_type": "output",
                "metadata_value": context,
                "metadata_time": datetime.datetime.now().isoformat(),
            }
        )
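The "drop unserialisable globals" step that the generated user_code performs can be checked in isolation; stdlib pickle stands in for dill, and the sample namespace is made up:

```python
import pickle  # stand-in for dill

# Hypothetical globals() snapshot: a lambda that cannot be pickled,
# plus two values that can.
namespace = {"x": 41, "f": lambda n: n + 1, "data": [1, 2, 3]}

bad_keys = []
for key in list(namespace):
    try:
        pickle.dumps(namespace[key])
    except Exception:  # dill is more permissive; stdlib pickle rejects the lambda
        bad_keys.append(key)
for key in bad_keys:
    del namespace[key]

print(sorted(namespace))  # -> ['data', 'x']
```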
Python file executed to deploy and run the pipeline:
from sameproject.ops import helpers
from pathlib import Path
import importlib
import kfp
def deploy(compiled_path: Path, root_module_name: str):
    with helpers.add_path(str(compiled_path)):
        kfp_client = kfp.Client()  # only supporting 'kubeflow' namespace
        root_module = importlib.import_module(root_module_name)
        return kfp_client.create_run_from_pipeline_func(
            root_module.root,
            arguments={},
        )

It turns out the problem was compiling without the right execution mode. If you're getting this error, compile the pipeline in V2-compatible mode:
Compiler(mode=kfp.dsl.PipelineExecutionMode.V2_COMPATIBLE).compile(pipeline_func=root_module.root, package_path=str(package_yaml_path))

Related

AWS CDK Resolution error: Unable to resolve object tree with circular reference

I have one stack (etl_stack_2.py) that initializes a Partner construct and an OrderWorkflow construct, so partner and OrderWorkflow are siblings in the tree. partner has a job property that is used by OrderWorkflow. When I run cdk ls I get a circular reference error:
$ cdk ls
jsii.errors.JavaScriptError:
Error: Resolution error: Resolution error: Unable to resolve object tree with circular reference. Path: /Resources/${Token[data-lake-etl-infra-dev.order_workflow-dev.order_workflow_start_trigger-dev.LogicalID.100]}/Properties/actions/0/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host.
Object creation stack:
at new Intrinsic (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/intrinsic.js:21:44)
at new PostResolveToken (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/util.js:73:9)
at Object.ignoreEmpty (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/util.js:33:12)
at CfnTrigger._toCloudFormation (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/cfn-resource.js:237:44)
at /private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:74
at Object.findTokens (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:128:13)
at findAllReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:38)
at Object.resolveReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:23:19)
at Object.prepareApp (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/prepare-app.js:34:16)
at Object.synthesize (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/synthesis.js:18:19)
at App.synth (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/stage.js:78:41)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:51
at Kernel._wrapSandboxCode (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8405:20)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:25
at Kernel._ensureSync (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8381:20)
at Kernel.invoke (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7755:26)
at KernelHost.processRequest (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7456:28)
at KernelHost.run (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7394:14)
at Immediate._onImmediate (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7397:37)
at processImmediate (internal/timers.js:456:21).
Object creation stack:
at new Intrinsic (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/intrinsic.js:21:44)
at new PostResolveToken (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/util.js:73:9)
at CfnTrigger._toCloudFormation (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/cfn-resource.js:235:39)
at /private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:74
at Object.findTokens (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:128:13)
at findAllReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:38)
at Object.resolveReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:23:19)
at Object.prepareApp (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/prepare-app.js:34:16)
at Object.synthesize (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/synthesis.js:18:19)
at App.synth (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/stage.js:78:41)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:51
at Kernel._wrapSandboxCode (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8405:20)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:25
at Kernel._ensureSync (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8381:20)
at Kernel.invoke (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7755:26)
at KernelHost.processRequest (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7456:28)
at KernelHost.run (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7394:14)
at Immediate._onImmediate (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7397:37)
at processImmediate (internal/timers.js:456:21)
at resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:35:15)
at Object.resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:29:33)
at resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:113:43)
at Object.resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:29:33)
at resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:113:43)
at Object.resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:29:33)
at resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:113:43)
at Object.resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:29:33)
at resolve (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:113:43)
at Kernel._wrapSandboxCode (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8408:19)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:25
at Kernel._ensureSync (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8381:20)
at Kernel.invoke (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7755:26)
at KernelHost.processRequest (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7456:28)
at KernelHost.run (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7394:14)
at Immediate._onImmediate (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7397:37)
at processImmediate (internal/timers.js:456:21)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "app.py", line 13, in <module>
app.synth()
File "/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/aws_cdk/core/__init__.py", line 11531, in synth
return jsii.invoke(self, "synth", [options])
File "/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_kernel/__init__.py", line 122, in wrapped
return _recursize_dereference(kernel, fn(kernel, *args, **kwargs))
File "/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_kernel/__init__.py", line 320, in invoke
args=_make_reference_for_native(self, args),
File "/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_kernel/providers/process.py", line 351, in invoke
return self._process.send(request, InvokeResponse)
File "/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_kernel/providers/process.py", line 321, in send
raise JSIIError(resp.error) from JavaScriptError(resp.stack)
jsii.errors.JSIIError: Resolution error: Resolution error: Unable to resolve object tree with circular reference. Path: /Resources/${Token[data-lake-etl-infra-dev.order_workflow-dev.order_workflow_start_trigger-dev.LogicalID.100]}/Properties/actions/0/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host/node/host.
Object creation stack:
at new Intrinsic (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/intrinsic.js:21:44)
at new PostResolveToken (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/util.js:73:9)
at Object.ignoreEmpty (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/util.js:33:12)
at CfnTrigger._toCloudFormation (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/cfn-resource.js:237:44)
at /private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:74
at Object.findTokens (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/resolve.js:128:13)
at findAllReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:100:38)
at Object.resolveReferences (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/refs.js:23:19)
at Object.prepareApp (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/prepare-app.js:34:16)
at Object.synthesize (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/private/synthesis.js:18:19)
at App.synth (/private/var/folders/gd/gpl89dgn5f7c8mzqx3dc8hzh0000gr/T/jsii-kernel-hDzf8X/node_modules/#aws-cdk/core/lib/stage.js:78:41)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:51
at Kernel._wrapSandboxCode (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8405:20)
at /Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7756:25
at Kernel._ensureSync (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:8381:20)
at Kernel.invoke (/Users/honarmand/xyz/xyz-data-lake/data-lake-etl-infra/venv/lib/python3.7/site-packages/jsii/_embedded/jsii/jsii-runtime.js:7755:26)
at KernelHost.processRequest (/Users/
I'm not sure why this is happening or how to fix it. I would appreciate any insight. Here is my code:
app.py
from aws_cdk import core
from etl_stack_2 import EtlStack
app = core.App()
environments = ["dev", "staging", "pre-production", "production"]
for env in environments:
    EtlStack(app, id=f"data-lake-etl-infra-{env}", env=env)
app.synth()
etl_stack_2.py
from aws_cdk import (
aws_s3 as s3,
core
)
import yaml
from pathlib import Path
import os
from partner import Partner
from workflows.order_workflow import OrderWorkflow
class EtlStack(core.Stack):
    def __init__(self, scope: core.Construct, id: str, env: str, **kwargs) -> None:
        super().__init__(scope, id=id, **kwargs)
        assets_bucket = s3.Bucket.from_bucket_name(self, 'assets_bucket', "bucket_name")
        current_dir = os.path.dirname(os.path.abspath(__file__))
        scripts_root = str(Path(current_dir, "../..", "scripts").resolve())
        job_role = f"role-service-glue-{env}"
        partner = Partner(
            self,
            "id-123",
            env=env,
            assets_bucket=assets_bucket,
            scripts_root=scripts_root,
            job_role=job_role
        )
        OrderWorkflow(
            scope=self,
            id=f"order_workflow-{env}",
            env=env,
            partner_job=partner.job  # this is where I pass the job property to OrderWorkflow
        )
order_workflow.py
from aws_cdk import (
aws_glue as glue,
core
)
class OrderWorkflow(core.Construct):
    def __init__(self, scope: core.Construct, id: str, *,
                 env: str,
                 partner_job: glue.CfnJob,
                 **kwargs):
        super().__init__(scope, id, **kwargs)
        order_workflow = glue.CfnWorkflow(
            scope=self,
            id=f"glue-order_workflow-{env}"
        )
        start_trigger = glue.CfnTrigger(
            scope=self,
            id=f"order_workflow_start_trigger-{env}",
            actions=[partner_job],  # If I pass [] to actions instead it works fine
            type="SCHEDULED",
            description="this schedules trigger starts the order workflow",
            name="Order Workflow start",
            schedule="cron(15 * * * ? *)",
            workflow_name=order_workflow.name
        )
In OrderWorkflow, passing [glue.CfnTrigger.ActionProperty(job_name=partner_job.ref)] to actions instead of [partner_job] solved the problem. The start_trigger now looks like:
start_trigger = glue.CfnTrigger(
    scope=self,
    id=f"order_workflow_start_trigger-{env}",
    actions=[glue.CfnTrigger.ActionProperty(job_name=partner_job.ref)],
    type="SCHEDULED",
    description="this schedules trigger starts the order workflow",
    name="Order Workflow start",
    schedule="cron(15 * * * ? *)",
    workflow_name=order_workflow.name
)
If you look at the CDK documentation for CfnTrigger, the actions property needs to be a list of ActionProperty objects: https://docs.aws.amazon.com/cdk/api/latest/docs/#aws-cdk_aws-glue.CfnTrigger.html#construct-props

Python detect new connection to wifi

I saw a tutorial on YouTube (I can't link it because I can't find it anymore).
The code is supposed to detect devices that are connected to my Internet/router. I don't understand much about how the author's code works.
I also got this error in my console:
File "c:/Users/j/Desktop/Connection-Detection.py", line 6, in
IP_NETWORK = config('IP_NETWORK')
File "C:\Users\j\AppData\Local\Programs\Python\Python38-32\lib\site-packages\decouple.py", line 199, in __call__
return self.config(*args, **kwargs)
File "C:\Users\j\AppData\Local\Programs\Python\Python38-32\lib\site-packages\decouple.py", line 83, in __call__
return self.get(*args, **kwargs)
File "C:\Users\j\AppData\Local\Programs\Python\Python38-32\lib\site-packages\decouple.py", line 68, in get
raise UndefinedValueError('{} not found. Declare it as envvar or define a default value.'.format(option))
decouple.UndefinedValueError: IP_NETWORK not found. Declare it as envvar or define a default value.
PS C:\Users\j\Desktop\python\login>
That's "Detection.py"
import sys
import subprocess
import os
from decouple import config

IP_NETWORK = config('IP_NETWORK')
IP_DEVICE = config('IP_DEVICE')

proc = subprocess.Popen(['ping', IP_NETWORK], stdout=subprocess.PIPE)

while True:
    line = proc.stdout.readline()  # readline must be called, not just referenced
    if not line:
        break
    connected_ip = line.decode('utf-8').split()[3]
    if connected_ip == IP_DEVICE:
        subprocess.Popen(['say', 'Someone connected to network'])
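Separately from the decouple error, note that the loop only works if `proc.stdout.readline()` is actually called (with parentheses); otherwise `line` is the bound method object, which is always truthy and never the output. A portable sketch using a child Python process instead of ping (the sample line and split index are made up for this demo):

```python
import subprocess
import sys

# Child process that emits one line of fake ping-style output.
proc = subprocess.Popen(
    [sys.executable, "-c", "print('reply from 192.168.1.5')"],
    stdout=subprocess.PIPE,
)

while True:
    line = proc.stdout.readline()  # note the parentheses: this reads a line
    if not line:
        break
    # Index 2 picks the IP out of this sample line.
    print(line.decode("utf-8").split()[2])  # -> 192.168.1.5
```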
You need to define the environment variables in a .env file in the same directory as Detection.py.
Steps
Install python-decouple - pip install python-decouple.
Create a file called .env
Open the .env file and paste the following into it:
IP_NETWORK=YOUR_IP_NETWORK
IP_DEVICE=YOUR_IP_DEVICE
Replace YOUR_IP_NETWORK and YOUR_IP_DEVICE with your actual network and device IP addresses.
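Under the hood, decouple's config() just looks the key up in the .env file first and then in the process environment, raising if it's defined in neither. A rough stdlib sketch of that behaviour (the parser is simplified, the sample values are made up, and LookupError stands in for decouple's UndefinedValueError):

```python
import os

def config(option, env_text, default=None):
    """Simplified stand-in for decouple's config(): .env first, then os.environ."""
    values = {}
    for line in env_text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    if option in values:
        return values[option]
    if option in os.environ:
        return os.environ[option]
    if default is not None:
        return default
    raise LookupError(f"{option} not found. Declare it as envvar or define a default value.")

# Sample .env contents:
env_text = "IP_NETWORK=192.168.1.0/24\nIP_DEVICE=192.168.1.5"
print(config("IP_NETWORK", env_text))  # -> 192.168.1.0/24
```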

How to import model in Django

I want to execute a python file where I want to check if there are new rows in a csv file. If there are new rows, I want to add them to a database.
The project tree is as follows:
Thus, the file which I want to execute is check_relations.py inside relations folder.
check_relations.py is as follows:
from master.models import TraxioRelations
with open('AUTOI_Relaties.csv', 'rb') as tr:
    traxio_relations = tr.readlines()

for line in traxio_relations:
    number = line.split(';')[0]
    number_exists = TraxioRelations.objects.filter(number=number)
    print(number_exists)
The model TraxioRelations is inside models.py in master folder.
When I run python check_relations.py I get an error
Traceback (most recent call last):
File "check_relations.py", line 3, in <module>
from master.models import TraxioRelations
ImportError: No module named master.models
What am I doing wrong? How can I import model inside check_relations.py file?
I think the most practical way is to create a management command.
For example, in your master catalog:
master/
    management/
        __init__.py
        commands/
            __init__.py
            check_relations.py
in check_relations.py:
from django.core.management.base import BaseCommand
from master.models import TraxioRelations

class Command(BaseCommand):
    help = 'check_relations by file data'

    def handle(self, *args, **options):
        with open('AUTOI_Relaties.csv', 'rb') as tr:
            traxio_relations = tr.readlines()
        for line in traxio_relations:
            number = line.split(';')[0]
            number_exists = TraxioRelations.objects.filter(number=number)
            print(number_exists)
Don't forget to change the path to the AUTOI_Relaties.csv file, or move the file to the new directory.
and then you can run in shell:
./manage.py check_relations
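Django aside, the CSV handling inside handle() reduces to taking the field before the first ';' on each line, which you can sanity-check without a database (the sample file contents below are made up):

```python
# Hypothetical AUTOI_Relaties.csv contents:
csv_text = "123;Acme;BE\n456;Globex;NL\n"

# The first ';'-separated field of each non-empty line is the relation number.
numbers = [line.split(";")[0] for line in csv_text.splitlines() if line]
print(numbers)  # -> ['123', '456']
```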
When using Django, you are not supposed to run an individual module on its own. See eg https://docs.djangoproject.com/en/1.11/intro/tutorial01/#the-development-server.

ImportError in python

In clean.py I have:
import datetime
import os
from flask_script import Manager
from sqlalchemy_utils import dependent_objects
from components import db, app
from modules.general.models import File
from modules.workflow import Workflow
manager = Manager(usage='Cleanup manager')

@manager.command
def run(dryrun=False):
    for abandoned_workflow in Workflow.query.filter(Workflow.current_endpoint == "upload.upload_init"):
        if abandoned_workflow.started + datetime.timedelta(hours=12) < datetime.datetime.utcnow():
            print("Removing abandoned workflow {0} in project {1}".format(
                abandoned_workflow.id, abandoned_workflow.project.name
            ))
            if not dryrun:
                db.session.delete(abandoned_workflow)
                db.session.commit()
    for file in File.query.all():
        dependencies_number = dependent_objects(file).count()
        print("File {0} at {1} has {2} dependencies".format(file.name, file.path, dependencies_number))
        if not dependencies_number:
            file_delete(file, dryrun)
            if not dryrun:
                db.session.delete(file)
                db.session.commit()
    # List all files in FILE_STORAGE directory and delete ones that don't have records in DB
    all_files_hash = list(zip(*db.session.query(File.hash).all()))
    for file in os.listdir(app.config['FILE_STORAGE']):
        if file.endswith('.dat'):
            continue
        if file not in all_files_hash:
            file_delete(os.path.join(app.config['FILE_STORAGE'], file), dryrun)
I need to start run(). In the console I run:
python clean.py
And I get this output:
Traceback (most recent call last):
File "cleanup_command.py", line 7, in <module>
from components import db, app
ImportError: No module named 'components'
clean.py is located in- C:\App\model\clean.py
components.py is located in - C:\components.py
Workflow.py is located in - C:\modules\workflow\Workflow.py
Please, tell me what could be the problem?
The problem is that modules for import are searched in certain locations: https://docs.python.org/2/tutorial/modules.html#the-module-search-path.
In your case you can put all the source directory paths in the PYTHONPATH variable, like:
PYTHONPATH=... python clean.py
But I think it would be better to relocate your code files (i.e. put all the libs in one location).
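The same fix can also be done from inside the script by extending sys.path before the import. Here is a self-contained sketch that fabricates a components.py in a temp directory just to show the mechanism (the module contents are made up; in the question the directory would be C:\, where components.py lives):

```python
import os
import sys
import tempfile

# Create a throwaway directory containing a minimal components.py ...
d = tempfile.mkdtemp()
with open(os.path.join(d, "components.py"), "w") as f:
    f.write("db = 'db'\napp = 'app'\n")

# ... and put it on sys.path, which is what PYTHONPATH=... does externally.
sys.path.insert(0, d)
from components import db, app

print(db, app)  # -> db app
```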
To start run() when you call python clean.py, add these lines at the end of the script (note the extra import sys, which the snippet needs):
import sys

if __name__ == '__main__':
    r = run()
    ## 0-127 is a safe return range, and 1 is a standard default error
    # run() returns None, so guard against that before comparing
    if r is None or r < 0 or r > 127:
        r = 1
    sys.exit(r)
Also as Eugene Primako mentioned it is better to relocate your code files in one location.
from components import db, app
ImportError: No module named 'components'
This means it's looking for a script named components.py in the location where clean.py is placed. That is the reason you got the import error.
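To see which directories were actually searched, you can print sys.path (a quick diagnostic, not part of the original answer):

```python
import sys

# The interpreter tries each entry in order; the ImportError means none
# of these directories contains components.py. An empty string stands
# for the directory of the script being run.
for entry in sys.path:
    print(entry or "<script directory>")
```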

I got this error: "ImportError: cannot import name python". How do I fix it?

File "G:\Python25\Lib\site-packages\PyAMF-0.6b2-py2.5-win32.egg\pyamf\util\__init__.py", line 15, in <module>
ImportError: cannot import name python
How do I fix it?
If you need any more info to diagnose this problem, just ask and I'll explain.
Thanks
Code:
from google.appengine.ext.webapp.util import run_wsgi_app
from google.appengine.ext import webapp
from TottysGateway import TottysGateway
import logging
def main():
    services_root = 'services'
    #services = ['users.login']
    #gateway = TottysGateway(services, services_root, logger=logging, debug=True)
    #app = webapp.WSGIApplication([('/', gateway)], debug=True)
    #run_wsgi_app(app)

if __name__ == "__main__":
    main()
Code:
from pyamf.remoting.gateway.google import WebAppGateway
import logging
class TottysGateway(WebAppGateway):
    def __init__(self, services_available, root_path, not_found_service, logger, debug):
        # override the constructor and then call the super
        self.services_available = services_available
        self.root_path = root_path
        self.not_found_service = not_found_service
        WebAppGateway.__init__(self, {}, logger=logging, debug=True)

    def getServiceRequest(self, request, target):
        # override the original getServiceRequest method
        try:
            # try looking for the service in the services list
            return WebAppGateway.getServiceRequest(self, request, target)
        except:
            pass
        try:
            # don't know what it does but is an error for now
            service_func = self.router(target)
        except:
            if(target in self.services_available):
                # only if it is an available service import its module
                # so it doesn't access services that should be hidden
                try:
                    module_path = self.root_path + '.' + target
                    paths = target.rsplit('.')
                    func_name = paths[len(paths) - 1]
                    import_as = '_'.join(paths) + '_' + func_name
                    import_string = "from " + module_path + " import " + func_name + ' as service_func'
                    exec import_string
                except:
                    service_func = False
                if(not service_func):
                    # if it is not found load the default not-found service
                    module_path = self.rootPath + '.' + self.not_found_service
                    import_string = "from " + module_path + " import " + func_name + ' as service_func'
                # add the service loaded above
                assign_string = "self.addService(service_func, target)"
                exec assign_string
        return WebAppGateway.getServiceRequest(self, request, target)
You need to post your full traceback; what you show here isn't all that useful. I ended up digging up line 15 of pyamf/util/__init__.py. The code you should have posted is:
from pyamf import python
This should not fail unless your local environment is messed up.
Can you 'import pyamf.util' and 'import pyamf.python' in an interactive Python shell? What about if you start Python while in /tmp (on the assumption that you might have a file named 'pyamf.py' in the current directory, which is a bad thing)?
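One way to check for that kind of shadowing without guessing is to ask Python where it would load the package from (a sketch using importlib; module_origin is a hypothetical helper name):

```python
import importlib.util

def module_origin(name):
    """Return the file a top-level module would be imported from,
    or None if the module cannot be found."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

# e.g. module_origin("pyamf"): if this points at an unexpected pyamf.py
# in the current directory, that file shadows the installed package.
```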
= (older comment below) =
Fix your question. I can't even tell where line 15 of util/__init__.py is supposed to be. Since I can't figure that out, I can't answer your question. Instead, I'll point out ways to improve your question and code.
First, use the markup language correctly, so that all the code is in a code block. Make sure you've titled the code, so we know it's from util/__init__.py and not some random file.
In your error message, include the full traceback, not just the last two lines.
Stop using parens in things like "if(not service_func):" and use a space instead, so it's "if not service_func:". This is discussed in PEP 8.
Read the Python documentation and learn how to use the language. Something like "func_name = paths[len(paths) - 1]" should be "func_name = paths[-1]".
Learn about the __import__ function and don't use "exec" for this case. Nor do you need the "exec assign_string"; just call "self.addService(service_func, target)" directly.
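A sketch of the dynamic import this advice points at, replacing both exec strings with importlib and getattr (load_service is an illustrative name; the real code would plug it into getServiceRequest):

```python
import importlib

def load_service(module_path, func_name):
    """Runtime equivalent of `from <module_path> import <func_name>`,
    without assembling and exec-ing a source string."""
    module = importlib.import_module(module_path)
    return getattr(module, func_name)

# In the gateway, roughly:
#   service_func = load_service(module_path, target.rsplit('.')[-1])
#   self.addService(service_func, target)
```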