How do I run the Deep Dream source code?

(I downloaded the deep dream source code from https://github.com/google/deepdream)
First of all, I'm not interested in Deep Dream alone, but in machine learning, and deep learning in particular, as a whole. I know programming (though I'm by no means an expert) and Python syntax, etc. However, I'm not familiar with external libraries or how to install them properly.
Thus, I'm struggling with simply getting the source code for Deep Dream to run. Here's what I've done so far:
Installed Python, but it couldn't run the .ipynb file (nor did it include any of the required libraries), so I:
Installed Anaconda, but it didn't include Caffe, so I:
Downloaded Caffe, but it requires cuDNN(??), so I:
Downloaded cuDNN (does that require CUDA, whatever that is?)
What are the next steps? There are so many things to download and install and I have no experience with any of it except for Python programming itself.
I tried reading the install instructions, but they only confuse me further.
What are the steps I should take next in order to get it running?
Keep in mind that I'm a beginner. No hate please. Official documentation and terminology are still hard to understand. I'm simply looking for step-by-step instructions.
Thanks in advance!
Edit: I'm using Windows

[Promoted from a comment]
If you're not familiar with it, Docker is going to be your easiest option. Think of a Docker container as a portable, fully self-contained VM.
You can install Docker on almost any OS, then use it to load a container that has all the software pre-installed.
You can get Docker here, and you can get the CPU/GPU container by following the instructions here.
Note that Docker's really handy for other things too - e.g. I have containers for CentOS 6/6.5/7, RHEL, SLES, Windows, etc., for testing and as servers.
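For concreteness, here's a rough sketch of what that flow can look like using a publicly available Caffe image (the image name, paths, and notebook commands are illustrative, not the official deepdream instructions):

    docker pull bvlc/caffe:cpu
    # Mount your clone of the deepdream repo and expose Jupyter's default port:
    docker run -it -p 8888:8888 -v /path/to/deepdream:/workspace bvlc/caffe:cpu bash
    # Inside the container, install and start a notebook server, e.g.:
    # pip install jupyter && jupyter notebook --ip=0.0.0.0 --no-browser --allow-root

Then open the notebook in your browser and run dream.ipynb from the mounted folder.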


TensorFlow installation issues

System information
OS Platform and Distribution - Windows 10
TensorFlow version: latest
Python version: 3.6.4
Installed using virtualenv? pip? conda?: - virtualenv
Greetings,
I hope this is the correct place to submit an inquiry of this nature; if it is not, please forgive my confusion and point me in the right direction. I greatly appreciate your time and consideration.
I am new to Python and TensorFlow. I've done some coding with C in the past, mostly when I was in college. I am determined to learn Python and to use both Python and TensorFlow for AI and machine learning purposes.
I've had difficulties getting TensorFlow to install properly. I started by installing the latest version of Python, which didn't seem to like my attempts at installing TensorFlow, so I then went with Python 3.6.4-amd64. I installed that, created a fresh directory for my environments, then installed pip and virtualenv, then created a virtual environment to set up with TensorFlow.
One of the confusing issues I keep encountering is that when I install pip and virtualenv, and eventually TensorFlow, everything goes by default to C:\user\username\appdata\roaming\python etc. My question is: how do I prevent it from doing that? I am trying to install into the directory being used in the command prompt: I change to the fresh directory I created for my virtual environment, then activate the virtual environment, and no matter what I do it keeps sending all newly installed files into the AppData\Roaming user directory subfolders.
This is causing the incredibly annoying issue of making it impossible for me to proceed with using TensorFlow, because I get nothing but errors about missing files, directory paths, etc. I even tried manually moving some of the files over to the virtual environment directory, and that worked in some cases, but it did not solve the overall problem.
Okay, now that I've made it painfully apparent how much of an uneducated newbie I am with all of this, would someone please give me some advice? The first step is admitting you need help, and I clearly do, as I've spent several hours with my eyes glued to various articles and tutorials that have left me with more questions than answers. I truly appreciate any help you're willing to provide. Just a loner trying to figure this all out and increase my knowledge along the way. Thanks for your time.
Just moving the folder is not enough.
Once you have moved it, you must replace the original with a symbolic link to the new location. This will make Windows think the data is still located on your C: drive, while it is actually on your D: drive.
Do note that this works with AppData, but not with Program Files or the Windows folder, as that would break things like Windows Update.
To create the directory junction (symbolic link), do the following:
Open a cmd window with administrative privileges.
Navigate to c:\Users\username\appdata
Execute the following command: mklink /d local d:\appdata\local
(Replace d:\appdata\local with the actual path you moved the AppData to.)
If you cannot move or delete the original copy, create a second user, make it an administrator, log in with it, and retry the operation. This should ensure that no files are in use.
With the above issue fixed, follow the TensorFlow installation steps in Anaconda provided here.
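For reference, a minimal sketch of that Anaconda route (the environment name and Python version are illustrative):

    conda create -n tf-env python=3.6
    conda activate tf-env
    pip install tensorflow

Because packages then land inside the environment's own directory, nothing should end up scattered under AppData\Roaming.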
Hope this answers your question. Happy Learning.

Robust way to ensure other people can run my python program

I wish to place a Python program on GitHub and have other people download and run it on their computers with assorted operating systems. I am relatively new to Python, but have used it enough to have noticed that getting the assorted versions of all the included modules to work together can be problematic. I just discovered the use of requirements.txt (generated with pipreqs and installed with pip install -r /path/to/requirements.txt), but I was very surprised to notice that requirements.txt does not actually state which version of Python is being used, so it is obviously not a complete solution on its own. So my question is: what set of specifications/files/something else is needed to ensure that someone downloading my project will actually be able to run it with the fewest possible problems?
EDIT: My plan was to be guided by whichever answer got the most upvotes. But so far, after 4 answers and 127 views, not a single answer has even one upvote. If some of the answers are no good, it would be useful to see some comments as to why they are no good.
Have you considered setting up a setup.py file? It's a handy way of bundling all of your... well, setup into a single location. So all your user has to do is (a) clone your repo and (b) run pip install . to run the setup.py.
There's a great Stack Overflow discussion about this.
As well as a handy example written by the requests guy.
This should cover most use cases. If you want to make it truly distributable, then you'll want to look into publishing it on PyPI, the official distribution hub.
Beyond that, if you're asking how to make a program "OS independent", there isn't a one-size-fits-all answer. It depends on what you are doing with your code, and requires researching how your particular code interacts with each OS.
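To make that concrete, here's a minimal setup.py sketch (the project name, dependency, and version pins are placeholders):

    from setuptools import setup, find_packages

    setup(
        name="your-project",                 # placeholder metadata
        version="0.1.0",
        packages=find_packages(),
        install_requires=["requests>=2.0"],  # illustrative dependency
        python_requires=">=3.6",             # declare supported Python versions
    )

With this in place, pip install . installs the package and its dependencies in one step.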
There are many, many, many, many, many, many, many ways to do this. I'll skate over the principles behind each, and its use case.
1. A Python environment
There are many ways to do this: pipenv, conda, requirements.txt, etc.
With some of these, you can specify Python versions. With others, just specify a range of Python versions you know it works with - for example, if you're using Python 3.7, it's unlikely not to support 3.6; there are only one or two minor changes. 3.8 should work as well.
Another similar method is setup.py. These are generally used to distribute libraries - like PyInstaller (another solution I'll mention below), or numpy, or wxPython, or PyQt5, etc. - for import/command-line use. The Python packaging guide is quite useful, and there are loads of tutorials out there (google "python setup.py tutorial"). You can also specify requirements in these files.
2. A container
Docker is the big one. If you haven't heard of it, I'll be surprised. A quick google for a summary comes up with this, which I'll quote part of:
So why does everyone love containers and Docker? James Bottomley, formerly Parallels' CTO of server virtualization and a leading Linux kernel developer, explained that VM hypervisors, such as Hyper-V, KVM, and Xen, are all "based on emulating virtual hardware. That means they're fat in terms of system requirements."
Containers, however, use shared operating systems. This means they are much more efficient than hypervisors in system resource terms. Instead of virtualizing hardware, containers rest on top of a single Linux instance. This means you can "leave behind the useless 99.9 percent VM junk, leaving you with a small, neat capsule containing your application."
That should summarise it for you. (Note you don't need a specific OS for containers.)
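As a rough illustration, a minimal Dockerfile for a Python project might look like this (the base image and file names are assumptions, not a prescription):

    FROM python:3.7-slim
    WORKDIR /app
    COPY requirements.txt .
    RUN pip install -r requirements.txt
    COPY . .
    CMD ["python", "your_main_file.py"]

Anyone with Docker can then build and run the project with docker build and docker run, regardless of what's installed on their machine.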
3. An executable file
There are two main tools that do this at the time of writing: PyInstaller and cx_Freeze. Both are actively developed; both are open source.
You take your script, and the tool compiles it to bytecode, finds the imports, copies those, and creates a portable Python environment that runs your script on the target system without the end user needing Python.
Personally, I prefer PyInstaller - I'm one of the developers. PyInstaller provides all of its functionality through a command-line script, supports most libraries that you can think of, and is extendable to support more. cx_Freeze requires a setup script.
Both tools support Windows, Linux, macOS, and more. PyInstaller can create single-file exes or a one-folder bundle, whereas cx_Freeze only supports one-folder bundles. PyInstaller 3.6 supports Python 2.7 and 3.5-3.7, but 4.0 won't support Python 2. cx_Freeze has dropped Python 2 support as of the last major release (6.0, I think).
Anyway, enough about the tools' features; you can look into those yourself. (See https://pyinstaller.org and https://cx-freeze.readthedocs.io for more info.)
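For instance, a typical PyInstaller invocation looks like this (the script name is a placeholder):

    pip install pyinstaller
    pyinstaller --onefile your_script.py
    # The bundled executable ends up in the dist/ folder.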
When using this distribution method, you usually provide source code on the GitHub repo, a couple of exes (one for each platform) ready for download, and instructions on how to build the code into an executable file.
The best tool I have used so far for this is Pipenv. Not only does it unify and simplify the whole pip + virtualenv workflow for you, the developer, but it also guarantees that the exact versions of all dependencies (including Python itself) are met when other people run your project with it.
The project website does a pretty good job of explaining how to use the tool but, for completeness' sake, I'll give a short explanation here.
Once you have Pipenv installed (for instance, by running pip install --user pipenv), you can go to the directory of your project and run pipenv --python 3.7, and Pipenv will create a new virtualenv for your project, along with a Pipfile and a Pipfile.lock (more on them later). If you then run pipenv install -r requirements.txt, it will install all your packages. Now you can do pipenv shell to activate your new virtualenv, or pipenv run python your_main_file.py to simply run your project.
Now let's take a look at the contents of your Pipfile. It should be something resembling this:
[packages]
Django = "*"
djangorestframework = "*"
iso8601 = "*"
graypy = "*"
whitenoise = "*"
[requires]
python_version = "3.7"
This file holds the human-readable specification of your project's dependencies (note that it specifies the Python version too). If your requirements.txt had pinned versions, your Pipfile could have them too, but you can safely wildcard them, because the exact versions are stored in Pipfile.lock. Now you can run things like pipenv update to update your dependencies, and don't forget to commit Pipfile and Pipfile.lock to your VCS.
Once people clone your project, all they have to do is run pipenv install and Pipenv will take care of the rest (it may even install the correct version of Python for them).
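To recap, the end-to-end flow looks roughly like this (paths and file names are illustrative):

    pip install --user pipenv              # one-time install of Pipenv itself
    cd your_project
    pipenv --python 3.7                    # create the virtualenv + Pipfile
    pipenv install -r requirements.txt     # import existing requirements
    pipenv run python your_main_file.py    # run inside the environment
    # ...and on another machine, after cloning:
    pipenv install                         # recreate the exact environment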
I hope this was useful. I'm not affiliated in any way with Pipenv, just wanted to share this awesome tool.
If your program is not heavily GUI-based, or has a web GUI, then you can share the code using Google Colaboratory.
https://colab.research.google.com/
Everyone can run it with the same environment. No need for installation.
If converting all your Python scripts into one executable would help you, then my answer below should be useful...
I have been developing a large desktop application purely in Python for three years. It is a GUI-based tool built on top of the PyQt library (Python bindings for the Qt C++ framework).
I am currently using the py2exe packaging library: a distutils extension that builds standalone Windows executable programs (32-bit and 64-bit) from Python scripts. All you have to do is:
Install py2exe: pip install py2exe
Create a setup.py script: it specifies the content of the final EXE (name, icon, author, data files, shared libraries, etc.)
Execute: python setup.py py2exe
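A minimal py2exe setup.py sketch (the script name is a placeholder):

    # setup.py - builds a standalone console EXE from your_script.py
    from distutils.core import setup
    import py2exe  # registers the "py2exe" command with distutils

    setup(console=['your_script.py'])  # use windows=[...] for GUI apps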
I am also using the Inno Setup software to create an installer: creating shortcuts, setting environment variables, icons, etc.
I'll give you a very brief summary of some of the available solutions when it comes to Python packaging that you may choose from (knowledge is power):
Follow the guidelines provided at Structuring Your Project; these conventions are widely accepted by the Python community, and they're usually a good starting point when newcomers begin coding in Python. By following these guidelines, Pythonistas looking at your project/source on GitHub or similar places will know straight away how to install it. Also, uploading your project to PyPI, as well as adding CI, will be painless if you follow those rules.
Once your project is structured properly according to standard conventions, the next step might be using one of the available freezers, in case you'd like to ship your end users a package they can install without being forced to have Python installed on their machines. Be aware, though, that these tools won't provide you any code protection; put differently, extracting the original Python code from the final artifacts would be trivial in all cases.
If you still want to ship your project to your users without forcing them to install any dev dependencies, and you also care about code protection (so you don't want to consider any of the existing freezers), you might use tools such as Nuitka, Shed Skin, Cython, or similar ones. Usually, reversing code from the artifacts produced by these tools isn't trivial at all... Cracking the protection, on the other hand, is a different matter, and unless you avoid shipping a physical binary to your end user altogether, you can't do much about it other than slowing them down :)
Also, in case you'd need to use other languages in your Python project, another classic link that comes to mind is https://wiki.python.org/moin/IntegratingPythonWithOtherLanguages; adding the build systems of such tools to CI by following the rules of point 1 would be pretty easy.
That said, I'd suggest sticking to bullet point 1, as I know that will be more than good enough to get you started; that particular point should also cover many of the existing use cases for "standard" Python projects.
While this is not intended to be a full guide, by following those points you'll be able to publish your Python project to the masses in no time.
I think you can use Docker with your Python project: https://github.com/celery/celery/tree/master/docker
Kindly look through those files, and I think you can figure out how to make a Dockerfile for your own Python scripts!
Because it is missing from the other answers, I would like to add one completely different aspect:
Unit testing. Or testing in general.
Usually, it is good to have one known-good configuration. Depending on what the dependencies of the program are, you might have to test different combinations of packages. You can do that in an automated fashion with e.g. tox, or as part of a CI/CD pipeline.
There is no general rule for which combinations of packages should be tested, but usually Python 2/3 compatibility is a major issue. If you have strong dependencies on packages with major version differences, you might want to consider testing against these different versions.
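For example, a minimal tox.ini along these lines would test against two interpreters (the interpreter list and test runner are illustrative):

    [tox]
    envlist = py27,py37

    [testenv]
    deps = pytest
    commands = pytest

Running tox then builds an isolated virtualenv per interpreter and runs the test suite in each.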

Prebuilt Python Caffe for OSX

Is there any pre-built PyCaffe out there for OS X? I do see instructions on how to build it, but I'm sure I'll have a lot of difficulty trying to build all of its dependencies. So I'd appreciate it if anyone knows where I can get a prebuilt PyCaffe module. Or is it necessary that it gets fully built on the machine?
Thanks
In case you're familiar with docker, you could try: https://github.com/ryankennedyio/deep-dream-generator
On OS X, you'll need to have boot2docker or some such.
Anyway, I was able to get up and running in under five minutes with this approach, after spending an hour or so (and then giving up) downloading, building, and installing dependencies directly.

Python development setup

So, I'd like to start serious Python development, and it's proven to be a big pain. I'm not worried at all about the language itself; I like it well enough and will have no problem picking it up. But the ecosystem is driving me crazy.
First I tried to get up and running under Windows. I gave up on that after a few days, as 90% of packages don't include Windows support or install instructions. So I switched to Mac OS X, which people said was good for Python development.
More frustration ensues. I'd like to use Python as a MATLAB replacement and tool-development platform, so spyderlib seems like an excellent tool. But I've now been busy trying to build PyQt on my Mac for two days, to no avail, and I'm starting to question the wisdom of it all. Following several guides to the letter invariably ends in cryptic errors. For which platform was this dependency built? What arcane compiler flags need to be set? I don't know and I don't care; why doesn't the installer figure it out? Oh wait, there isn't one... I want to USE these tools, not first completely reverse-engineer them to find out how to build them.
There is a vast amount of implied knowledge in all the documentation I can find on these matters, with regard to both Unix and Pythonic quirks. Is there any way to scale this mountain with a manageable learning curve? Right now I have no idea what I'm doing. Or should I go back to Windows and try to coerce the Unix packages I need into cooperating?
On Mac OS X, you can get Spyder with MacPorts (see the sketch below). This should build everything needed.
If you prefer Windows, take a look at Python(x,y). It has a bunch of scientific tools pre-built, including Spyder.
Finally, the Enthought Python Distribution is worth considering for scientific work.
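As a sketch, the MacPorts route is a one-liner (the exact port name is an assumption; it varies with the Python version and may have changed):

    sudo port install py27-spyder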
Have you tried ActivePython?
Why battle with compiling the modules yourself when you can get pre-built packages from PyPM?
pypm install pyqt4 matplotlib scipy numpy
In my experience, the best platform for the kind of project you're describing is Linux. There you just install the libraries you need from the package manager, and that's it - binary packages, so no compiling is required.
If you want to stick with Mac OS X, you should install either MacPorts or Fink. They're usually easy to use. The problem is that things like Qt take forever to compile, but you won't be doing that very often.
As for installing Python modules, the best option is pip, which is a very nice replacement for easy_install and does much more. It's especially useful if you want a virtualenv setup.
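A minimal pip + virtualenv sketch (the environment name and package are illustrative):

    pip install virtualenv
    virtualenv myenv
    source myenv/bin/activate
    pip install spyder

Everything installed after activation lands inside myenv/, keeping your system Python clean.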
This is nearly the exact opposite of my experience with Python on Windows. Python itself installs with a binary installer; most add-on packages support easy_install, and others provide binary installers of their own. The IDE I use is SciTE, which uses the old DOS install model: copy the files to a directory and run the SciTE.exe file. If you get a source distribution of a Python package, go to the directory containing setup.py and run python setup.py install. Maybe that's the implied knowledge you're talking about.
You can also find many unofficial Windows binaries at http://www.lfd.uci.edu/~gohlke/pythonlibs/.
I switched to Mac a few years ago and found that it took me quite a while of googling to install all the packages I needed for Python development properly. While I installed everything, I made a list of the steps required to set up a functional system, which may be appropriate for you as well. I usually use NetCDF4, HDF5, NumPy, Matplotlib, f2py, and Fortran in combination with Python. I published my list of 22 setup steps for installing from source on my website. Installing from source is somewhat more time-consuming than using MacPorts or Fink, but it gives you a working environment that is optimized for your system.
