This question already has answers here:
How can I access environment variables in Python?
(16 answers)
Closed 9 years ago.
I'm new to Python and would like to reproduce a convenience I used when working in Perl.
When calling a Perl script, I usually set some environment variables (like VERBOSE, DEVELOP and DEBUG). Inside the called script I recover their values with
my $verbose=$ENV{VERBOSE};
my $develop=$ENV{DEVELOP};
my $debug=$ENV{DEBUG};
This allows print statements conditional on these variables.
Can I do the same thing in Python? Thanks to previous responses (thank you!), I know how to use os.environ[var] within the script to access the values. But I have not been able to figure out how to assign a value to a variable when I call the script from the command line, as I could when calling a Perl script.
Can values be set for such variables on the command line that invokes a Python script?
TIA
They are accessible via the os.environ dictionary.
os.environ['VERBOSE']
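A minimal sketch of the full round trip, assuming a POSIX shell and a hypothetical script name script.py: set the variables on the command line that launches the script, then read them inside with os.environ.get so missing variables fall back to a default.
# script.py -- hypothetical example: read flags passed as environment variables.
# Invoke from the shell as, e.g.:  VERBOSE=1 DEVELOP=1 DEBUG=0 python script.py
import os

# os.environ values are always strings; default to "0" when a variable is unset.
verbose = os.environ.get("VERBOSE", "0") == "1"
develop = os.environ.get("DEVELOP", "0") == "1"
debug = os.environ.get("DEBUG", "0") == "1"

if verbose:
    print("verbose output enabled")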
This question already has answers here:
Passing IPython variables as arguments to bash commands
(4 answers)
Running Bash commands in Python
(11 answers)
Closed 2 years ago.
So I found this tool: https://github.com/elceef/dnstwist and I want to pass a list of domains into it, then take the output and visualize it.
It operates through the command line, but how do I automate entering each domain and processing the output?
By the way, I am using Colab, so I would need a solution that works in Jupyter notebooks!
Thanks!
If you want to do this programmatically, I would not use the provided CLI, although that would also be possible.
Instead you can create a small Python script to do this.
Import dnstwist in your script and have a look at the main function in dnstwist.py to see how to call it.
It doesn't look too hard, but I don't know your programming skill level, of course.
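The answer above suggests importing dnstwist directly; since that module's entry points vary by version, here is a sketch of the CLI route it also mentions, driving dnstwist from the notebook with subprocess. It assumes dnstwist is installed (pip install dnstwist), that the domain list and output handling are placeholders, and that the CLI supports a --format json flag (present in recent releases; check dnstwist --help).
# Hypothetical sketch: run the dnstwist CLI once per domain and collect JSON output.
import json
import subprocess

domains = ["example.com", "example.org"]  # hypothetical input list

results = {}
for domain in domains:
    proc = subprocess.run(
        ["dnstwist", "--format", "json", domain],
        capture_output=True, text=True, check=True,
    )
    results[domain] = json.loads(proc.stdout)

# results now maps each domain to dnstwist's list of permutation records,
# ready to load into pandas or a plotting library inside the notebook.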
This question already exists:
Is there a way to run a script with 'unrecognizable' variable names & function names? [closed]
Closed 2 years ago.
I wrote a project with many separate Python source files, and I need to run it on a P2P cloud-computing service because it needs performance. I don't want the 'host' to be able to work out what the project is by reading the variable names and function names.
It's about thousands of variables and hundreds of functions, so renaming them one by one via Ctrl+R carries a high risk of errors and would take a long time.
a) Is there a procedure during compilation to make the variable names (of a copy) unrecognizable? (e.g. ahjbeunicsnj instead of placeholder_1_for_xy_csv, or kjbej() instead of save_csv())
or, alternatively, b) Is there a way to encrypt all the files and run the project encrypted (maybe as an .exe)?
Yes, it's possible. You can obfuscate the Python script, and programs like PyInstaller can create executables too. You didn't indicate that you had researched that, but it's an option.
Here's the official page on this topic, which goes into far more detail: https://wiki.python.org/moin/Asking%20for%20Help/How%20do%20you%20protect%20Python%20source%20code%3F
Here's an answer on another Stack Exchange site that's also relevant: https://reverseengineering.stackexchange.com/questions/22648/best-way-to-protect-source-code-of-exe-program-running-on-python
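For the PyInstaller route, a minimal sketch, assuming PyInstaller is installed (pip install pyinstaller) and a hypothetical entry point main.py; PyInstaller also documents a small Python API for driving it from code, equivalent to running pyinstaller --onefile main.py in a shell. Keep in mind the bundled executable still contains bytecode that can be extracted and decompiled, so pair it with an obfuscator if hiding names is the goal.
# Hypothetical sketch: bundle main.py into a single-file executable with PyInstaller.
import PyInstaller.__main__

PyInstaller.__main__.run([
    "main.py",       # hypothetical entry-point script
    "--onefile",     # bundle everything into one executable
    "--name", "myapp",
])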
This question already has answers here:
How to get local variables updated, when using the `exec` call?
(3 answers)
Closed 4 years ago.
In my main script, I import one of my own modules, which contains global variables. This main script executes another script with exec (exec(compile(open(Seq_1, "rb").read(), Seq_1, 'exec'))), and that other script imports the same module.
So my question is: do these scripts have access to the same global variables (meaning that if I modify a global variable in one, the other script is affected) or not?
Python runs a module's file the first time it is imported. On subsequent imports, Python won't re-run the file; it reuses the already-loaded module.
In practice, functions and variables defined directly on a module (not wrapped in classes) work like singletons.
This answer explains more about it. You can also refer directly to the docs, as suggested in the linked answer.
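A minimal sketch of that behaviour, using hypothetical file names (settings.py, main.py, other.py): the main script and the exec'd script both import the same settings module, so they get the same module object back from sys.modules and therefore share its globals.
# settings.py -- hypothetical shared module holding the "global" state.
# counter = 0

# other.py -- hypothetical script run via exec() from main.py.
# import settings
# settings.counter += 1                  # mutates the one shared module object
# print("in other.py:", settings.counter)

# main.py -- hypothetical main script.
import settings

settings.counter += 1                                              # counter is now 1
exec(compile(open("other.py", "rb").read(), "other.py", "exec"))   # prints 2
print("back in main.py:", settings.counter)                        # prints 2

# Caveat: `from settings import counter` copies the value into a local name,
# so rebinding that name would NOT be visible to the other script.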
This question already has answers here:
How to get environment from a subprocess?
(7 answers)
Closed 4 years ago.
I try to execute a csh script (which creates or updates environment variables) from Python, but the environment variables are not updated after returning to the shell.
Why? How can I solve it?
subprocess.call('script.csh', shell=True, executable="/bin/csh")
To set an environment variable in Python, use
os.environ['YOUR_VARIABLE'] = "your_value"
Note that environment variables must be strings.
Explanation for why you cannot do what you want to do:
Environment variables live in per-process memory. When bash (or whatever shell you have) runs a program, it uses fork(), and the child process inherits bash's variables because it is a child. What you're trying to do is the reverse: create a child process and have the parent inherit from the child, as @PM 2Ring said.
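For completeness, a sketch of the usual workaround from the linked duplicate: you can't make the child update the parent shell, but you can have the csh child print the environment it produced and copy that back into the Python process. This assumes /bin/csh exists, script.csh is the hypothetical script from the question, and no variable values span multiple lines.
# Hypothetical sketch: source script.csh in a csh child, dump its environment
# with `env`, and copy the result into this Python process's os.environ.
import os
import subprocess

out = subprocess.check_output(
    ["/bin/csh", "-c", "source script.csh && env"],
    text=True,
)

for line in out.splitlines():
    if "=" in line:
        key, _, value = line.partition("=")
        os.environ[key] = value   # updates this Python process, not the calling shell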
This question already has answers here:
How do I call a function from another .py file? [duplicate]
(19 answers)
Closed 6 years ago.
I'm new to Python, coming from Matlab like many others. I'm used to defining my functions as standalone .m files and calling them easily from a second script, as long as the function is saved somewhere within the defined Matlab path.
I've learned how to define a (user-defined) function in Python (def my_function(), etc.), but my Google searches come up short on how to define the function in a separate .py file A and call it from another script B. All the help pages I can find show how to define the function within the same script. When I try calling the function (which I've defined as a .py file in the same folder I'm using for script A), my script doesn't recognize it. Do I need to declare the function at the beginning of my script? I'm getting the sense that Python handles these things very differently from Matlab; perhaps I can't do this?
Cheers
You have to use imports.
For example, say you have the function doSomething() in functions.py and want to use it in main.py.
In main.py:
from functions import doSomething
var = doSomething()
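For completeness, a minimal sketch of the other file, assuming both files live in the same folder so Python can find functions.py on its module search path (doSomething is just the example name from above):
# functions.py -- lives in the same folder as main.py
def doSomething():
    # hypothetical body; return whatever the function should produce
    return "did something"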