Hi,
I would like to hide
DATABASE_NAME = ''
DATABASE_USER = ''
DATABASE_PASSWORD = ''
DATABASE_HOST = ''
that is, lines 13 to 17 of the default Django settings.py file, when checking it into GitHub.
I still want to check the file in, though, because I make modifications to it from time to time. I just want these four lines to always be committed empty.
You could also have an extra settings file that holds the passwords and just import it from your main settings.py.
For example:
settings.py:

DATABASE_PASSWORD = ''

try:
    from dev_settings import *
except ImportError:
    pass

dev_settings.py:

DATABASE_PASSWORD = 'mypassword'
And keep dev_settings.py out of revision control.
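(With Git, keeping it out of revision control simply means listing dev_settings.py in your .gitignore.)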
For my configuration files, I create a config.py-example and a config.py. config.py is ignored by version control. When I deploy, I just copy config.py-example to config.py and fill in the passwords.
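A minimal sketch of that layout (the concrete settings and the import line are illustrative assumptions, not from the answer above):

# .gitignore
config.py

# config.py-example  (committed template; copy to config.py on deploy and fill in)
DATABASE_NAME = ''
DATABASE_USER = ''
DATABASE_PASSWORD = ''
DATABASE_HOST = ''

# settings.py
from config import DATABASE_NAME, DATABASE_USER, DATABASE_PASSWORD, DATABASE_HOST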
Unfortunately, that's not how Git works - either a file is in version control, or it isn't.
If you don't want the info in Github, then don't check it in. You could keep a copy of the config file in a separate (private) repository elsewhere if you wanted, though.
I would recommend keeping settings.py with those four lines exactly as you've shown them, and having a separate, tiny Python script to add and remove the four bits of secret information (reading them from a file that's not part of your git repository, but rather is safely and secretly kept -- in a couple of copies, for safety -- in very secure places).
You can have a presubmit check to make sure you never, ever push a settings.py that has not been shorn of the secrets. I don't know git well enough to tell whether it has "presubmit triggers" that can modify the repo, as well as presubmit checks that merely inspect it; but if it does, it may be more convenient to run that tiny Python script from such a trigger. Indeed, in that case you might consider doing the removal/restoring of the secrets with patch, using a simple diff file as its input, so you don't have to write even a line of script for the purpose.
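A rough sketch of what such a tiny scrubbing script might look like (the file name, the key list, and the regex approach are assumptions for illustration, not part of the answer above):

# scrub_settings.py -- hypothetical helper that blanks the secret values
# in settings.py before a commit
import re

SECRET_KEYS = ("DATABASE_NAME", "DATABASE_USER",
               "DATABASE_PASSWORD", "DATABASE_HOST")

def scrub(path="settings.py"):
    with open(path) as f:
        text = f.read()
    for key in SECRET_KEYS:
        # turn e.g. DATABASE_PASSWORD = 'hunter2' back into DATABASE_PASSWORD = ''
        text = re.sub(r"^(%s\s*=\s*).*$" % key, r"\1''", text, flags=re.M)
    with open(path, "w") as f:
        f.write(text)

if __name__ == "__main__":
    scrub()

Restoring the real values would be the same idea in reverse, reading them from the secret file kept outside the repository.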
The Ignoring Errors docs currently list a way of ignoring a particular error for a particular line:
example = lambda: 'example' # noqa: E731
... and a way of ignoring all errors for an entire file:
# flake8: noqa
from foo import unused
function_that_doesnt_exist()
x = 1+ 2
... and a couple of ways, either through config or through command-line options, of disabling a particular error globally across an entire project.
But what if I want to ignore a particular error across the entirety of a single file - for instance, to disable warnings about unused imports in an __init__.py barrel file that just imports a bunch of classes so that code from other packages can import them from it in turn? The docs don't seem to hint at any syntax for this. Is it possible?
As of Flake8 3.7.0 you can do this using the --per-file-ignores option.
Command line example
flake8 --per-file-ignores="project/__init__.py:F401 setup.py:E121"
Or in your config file
per-file-ignores =
    project/__init__.py:F401
    setup.py:E121
    other_project/*:W9
See the documentation here: http://flake8.pycqa.org/en/latest/user/options.html?highlight=per-file-ignores#cmdoption-flake8-per-file-ignores
It is not possible to place a noqa comment for specific codes at the top of a file like you can for individual lines. # flake8: noqa: F401 may at first appear to work, but it's actually being detected as only # flake8: noqa, which means "ignore all messages in the file".
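To spell out the difference (illustrative comments only, restating the behaviour just described):

# flake8: noqa: F401    <- looks like "ignore only F401 in this file"
# flake8: noqa          <- ...but it is treated exactly like this blanket ignore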
Before version 3.7.0, ignoring specific errors was only implemented per-line, not per-file.
The feature was discussed in issue #324, and the project initially chose not to implement it. An implementation was proposed in a merge request, but nobody has followed up on it.
However, some extensions have emerged to address the problem:
[discontinued] flake8-per-file-ignores lets you ignore specific warnings/errors for specific files via an entry in the config.
flake8-putty claims to do the same, but hasn't been updated for a while.
I implemented a flake8 plugin, flake8-in-file-ignores, to allow adding "ignore" rules in the file itself (as opposed to the built-in config approach). The plugin uses the following syntax:
# flake8-in-file-ignores: noqa: E731,E123
Update your .flake8 file with

ignore = E123,E125,H404,H405,H803

(plus any other rules you want to skip; note that ignore applies to the whole project, not per file).
Sphinx usually builds the documentation incrementally, which means that only files that have changed are regenerated. I am wondering whether there is a way to tell Sphinx to always regenerate certain files that have not been changed directly but are influenced by changes in other files. More specifically: is there a way to tell Sphinx to always regenerate files that contain a certain directive?
The documentation I am working on relies quite frequently on directives that collect and reformat information from other pages. A clean build (make clean && make html) and/or a full build (sphinx-build -a) takes significantly longer than an incremental build. Additionally, manually keeping track of the files that contain such a directive would be complicated: the documentation is written by 10+ authors with limited experience in writing Sphinx documentation.
But even in less complex scenarios you might face this 'issue':
For instance, sphinx.ext.todo contains a directive called todolist, which collects todos from the whole documentation. If I create a file that contains all the todos of my documentation (basically an empty document containing just the todolist directive), the list is not updated until I make a clean build or alter that file.
If you want to test it yourself: create a documentation project with sphinx-quickstart and stick to the default values, except for

> todo: write "todo" entries that can be shown or hidden on build (y/n) [n]: y
Add a file in source called todos.rst and reference this file from index.rst.
Content of the index.rst:
Welcome to sphinx-todo's documentation!
=======================================

.. toctree::
   :maxdepth: 2

   todos

.. todo::

   I have to do this

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
Content of todos.rst:
.. _`todos`:

List of ToDos
=============

.. todolist::
Assuming you use the HTML output, you will notice that todos.html does not change when you add new todos to index.rst.
tl;dr: How -- if possible -- do I include files containing a specific directive (e.g. todolist) in an incremental Sphinx build, without having to keep track of them manually?
By default Sphinx only updates the output for new or changed files; the way to force a full rebuild is buried under the sphinx-build -a option.
And at the end of the documentation of the command-line options for sphinx-build:
You can also give one or more filenames on the command line after the source and build directories. Sphinx will then try to build only these output files (and their dependencies).
You could either invoke sphinx-build directly or through your makefile, depending on the makefile that shipped with your version of Sphinx (you can customize the makefile, too).
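For example, a direct invocation with an explicit filename (paths here are illustrative, assuming the source/ layout used above) might look like:

sphinx-build -b html source build/html source/todos.rst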
Just for the record: I benchmarked several solutions.
I created a function called touch_files in my conf.py. It searches for strings in the .rst files and -- if one is found -- touches the file to trigger a rebuild:
import fnmatch
import mmap
import os

def touch_files(*args):
    # recursively search the 'source' directory
    for root, dirnames, filenames in os.walk('.'):
        # check all rst files
        for filename in fnmatch.filter(filenames, '*.rst'):
            cur = os.path.join(root, filename)
            with open(cur) as f:
                # access the content directly from disk
                s = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
                # if one of the string patterns is found, update the
                # file's modification time so Sphinx rebuilds it
                if any(s.find(pattern.encode()) != -1 for pattern in args):
                    os.utime(cur, None)

# actually call the function
touch_files('.. todolist::')
touch_files can be called with a variable number of arguments and will touch a file when at least one of the arguments is found in it. I tried to optimize the function with regular expressions, but that did not achieve much. Reading the file content directly from disk with mmap seemed to have only a minor impact.
These are the results for 78 files in total, of which 36 contain one of two directives.

Command                                   Time     Comment
time make html                            2.3 s    No changes
time sh -c 'make clean && make html'      13.3 s
time make htmlfull                        9.4 s    sphinx-build -a
time make html                            8.4 s    with 'touch_files'
'touch_files'                             0.2 s    tested with testit
Result: every command was run only a few times (except 'touch_files'), so the numbers lack statistical reliability. Sphinx needs roughly 2.3 seconds just to check the documentation for changes without doing anything. A clean build requires 13.3 seconds, which is much longer than a build with sphinx-build -a. If we rebuild only 36 out of 78 files, the build process is slightly faster, although I doubt the difference is significant. The overhead of 'touch_files' is rather low; finding the strings is quite cheap compared to updating the timestamps.
Conclusion: as Steve Piercy pointed out, using sphinx-build -a seems to be the most reasonable approach, at least for my use case. However, if files that do not contain one of the directives in question take a long time to build, touch_files might still be useful.
I have a Python script with secret keys for the Twitter API. I would like to version control my script using GitHub. How do I keep my keys secret while still uploading the script to GitHub? That is, the values of these
KEY = ""
KEY_SECRET = ""
TOKEN = ""
TOKEN_SECRET = ""
should be kept secret. Maybe I could put them in another file and load them from the script, while adding that file to .gitignore? What is the correct pattern?
# project/.gitignore
passwords.py
# project/passwords.py
GITHUB_KEY = '123'
GITHUB_KEY_SECRET = 'ABC'
GITHUB_TOKEN = '456'
GITHUB_TOKEN_SECRET = 'XYZ'
# project/my_script.py
from passwords import GITHUB_KEY, GITHUB_KEY_SECRET, GITHUB_TOKEN, GITHUB_TOKEN_SECRET
KEY = GITHUB_KEY
KEY_SECRET = GITHUB_KEY_SECRET
TOKEN = GITHUB_TOKEN
TOKEN_SECRET = GITHUB_TOKEN_SECRET
As hinted at by @chishaku, a good idea is to save all your confidential information in a separate file that is not version controlled, i.e. not known to git. You do this by adding it to the .gitignore file.
With this in place, you can safely commit your code to GitHub, where everyone can see your project - yet your confidential information and passwords are nowhere to be seen!
Within your project, you can now read that file (or import it) and use the information held within.
Keep in mind that when you (or someone else) check out this project, you will have to make sure the "secret" file exists, since your project depends on it.
In my projects, creating this "secret" file is part of the deploy script. Something like:
echo '{"password": "123"}' > config.json && git checkout master
This line of code writes the (simple) settings file to config.json and only afterwards retrieves the latest code version from the master branch.
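Reading the secrets back in from your script is then straightforward (a minimal sketch, assuming the config.json written by the deploy line above):

# my_script.py -- load the secrets written by the deploy step
import json

with open("config.json") as f:
    config = json.load(f)

password = config["password"]  # '123' in the example above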
import sys
print(sys.path)
'C:\\python32\\Lib\\site-packages\\django'
'C:\\Python32'
'C:\\Python32\\lib\\site-packages'
...
For some reason my PYTHONPATH got messy and I'd like to organize it. Is it correct that I don't need the first and the last entries shown above? And how can I change this permanently (not with sys.path.remove or sys.path.append)?
I'm using Python 3.2 on Windows 8.
Path 'C:\Python32\lib\site-packages' is added to sys.path by the built-in site module.
If you want to, you can start python with the -S flag to tell the site module "Don't add site-packages".
python -S
Next, 'C:\python32\Lib\site-packages\django'.
Here's a wild guess: you installed django with pip/easy_install/msi-installer and there is a file
C:\python32\Lib\site-packages\django.pth (or something like this ending with .pth)
Quoting the docs:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path.
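For illustration (this is an assumption about what the installer created, not something taken from the question), a django.pth inside site-packages containing the single line

django

would add C:\python32\Lib\site-packages\django to sys.path, because relative entries in a .pth file are resolved against the directory the file lives in.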
You can remove the django.pth file (not recommended, see below) to remove '..../django' from sys.path.
So, the short answer: don't mess with sys.path; what's in sys.path is probably there for a good reason.
If you don't need django, then uninstall it using whatever tool you used to install it. Do the same for every package you don't need.
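(For example, if it was installed with pip: pip uninstall django.)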
As the question title suggests, I would very much like to know of a way to check the NTFS permissions of a given file or folder (hint: those are the ones you see in the "Security" tab). Basically, I need to take a path to a file or directory (on the local machine, or, preferably, on a share on a remote machine) and get the list of users/groups and the corresponding permissions for that file/folder. Ultimately, the application is going to traverse a directory tree, reading the permissions of each object and processing them accordingly.
Now, I can think of a number of ways to do that:
parse cacls.exe output -- easily done, BUT, unless I'm missing something, cacls.exe only gives the permissions in the form of R|W|C|F (read/write/change/full), which is insufficient (I need the finer-grained permissions like "List folder contents", extended permissions too)
xcacls.exe or xcacls.vbs output -- yes, they give me all the permissions I need, but they are dreadfully slow: it takes xcacls.vbs about ONE SECOND to get the permissions on a single local system file. Such speed is unacceptable
win32security (it wraps around winapi, right?) -- I am sure it can be handled like this, but I'd rather not reinvent the wheel
Is there anything else I am missing here?
Unless you fancy rolling your own, win32security is the way to go. There's the beginnings of an example here:
http://timgolden.me.uk/python/win32_how_do_i/get-the-owner-of-a-file.html
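If you do go the win32security route, the core calls look roughly like this (a minimal, untested sketch based on the pywin32 API, not the example from the link above; it only handles the standard ACE layout, does no error handling, and prints the raw access mask rather than friendly names such as "List folder contents"):

# dump_dacl.py -- rough sketch using pywin32's win32security
import win32security

def dump_dacl(path):
    sd = win32security.GetFileSecurity(
        path, win32security.DACL_SECURITY_INFORMATION)
    dacl = sd.GetSecurityDescriptorDacl()
    if dacl is None:
        print("No DACL: everyone has full access")
        return
    for i in range(dacl.GetAceCount()):
        # standard ACEs unpack as ((ace_type, ace_flags), access_mask, sid)
        (ace_type, ace_flags), access_mask, sid = dacl.GetAce(i)
        name, domain, _ = win32security.LookupAccountSid(None, sid)
        print("%s\\%s -> access mask 0x%08x" % (domain, name, access_mask))

dump_dacl(r"c:\temp")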
If you want to live slightly dangerously (!) my in-progress winsys package is designed to do exactly what you're after. You can get an MSI of the dev version here:
http://timgolden.me.uk/python/downloads/WinSys-0.4.win32-py2.6.msi
or you can just checkout the svn trunk:
svn co http://winsys.googlecode.com/svn/trunk winsys
To do what you describe (guessing slightly at the exact requirements) you could do this:
import codecs
from winsys import fs

base = "c:/temp"

with codecs.open ("permissions.log", "wb", encoding="utf8") as log:
    for f in fs.flat (base):
        log.write ("\n" + f.filepath.relative_to (base) + "\n")
        for ace in f.security ().dacl:
            access_flags = fs.FILE_ACCESS.names_from_value (ace.access)
            log.write (u" %s => %s\n" % (ace.trustee, ", ".join (access_flags)))
TJG