Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I've read several threads about the differences between Cython and SWIG and have implemented both techniques, but I'm still not sure which route is the best one to take. Basically, I have been working on a C++ library and I would like to expose its functions to Python. Cython seemed rather complicated and required me to write a .pyx file for every C++ class I needed to expose, whereas SWIG just did the work for me.

I don't have a lot of experience with either wrapping method to date, but SWIG seems like a clear winner for wrapping my C++ code. So what is the controversy? Why would I spend hours writing .pyx files in Cython to wrap my C++ when I could just write one simple .i file in SWIG? Furthermore, I use CMake as my build tool, and it was far easier to build with SWIG under CMake than with Cython.

I feel that I must be missing something, because even big projects like OpenCV are not using SWIG, and before I commit my project to SWIG I want to find out why I would rather use Cython, or nothing at all. Here is a summary:
My project:
C++ source code is primary
Python is a nice-to-have; I just want a path of least resistance to expose the majority of my C++ code.
Writing wrapper code sucks because then I have two places to maintain an API -- I just want one, the C++ code.
Any advice is greatly appreciated. Also, I was curious about xdress.org--seems like that project has died though.
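For reference, the kind of interface file SWIG needs really is tiny. A sketch, assuming a hypothetical `Counter` class declared in `counter.h` (the names are illustrative, not from the question):

```
// counter.i -- SWIG interface file for a hypothetical Counter class
%module counter

%{
#include "counter.h"   // copied verbatim into the generated wrapper code
%}

// Let SWIG parse the header and wrap everything it declares
%include "counter.h"
```

Running `swig -c++ -python counter.i` then generates the wrapper source and a `counter.py` module, which is exactly the "one simple .i file" workflow described above.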
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 5 years ago.
In C/C++ we write a makefile for building and installing a project. How do we create a makefile, or its equivalent, for a Python project?
If Python is a scripting language and we don't need a makefile, how do we integrate a Python project made up of multiple Python files (.py files)?
There is another thread with a similar question, Call a function from another file in Python, but my question is different from the one asked there. Calling a function from another file may be part of a solution, but I wanted a more complete answer, like the one described by Simon.
Python is a scripting language. This means that no compiling/recompiling is necessary, and errors are reported when you run the script. That removes one of the greatest reasons for using make.
Although you can use makefiles to replace long command lines or to manage directories/files, it is unlikely you will really need this.
If you are using C/C++ together with your Python project, however, a makefile is highly recommended.
You mention integrating. Makefiles are unlikely to be the tool you want; you need to build a module at the very least.
Modules allow you to use functions from other Python files as if they were defined in the current file. You just need to import them, and that's pretty much it.
If you want to install on other PCs, use a setup.py script to create a package. This makes your project installable, so it can be used just like an extension to Python.
Python is not like C or C++: knowing where the files live is enough. Once they are turned into modules, you just import them once and you can use the functions they provide.
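A minimal sketch of what "turn it into a module" means. The file name `mathutils.py` is invented for illustration; to keep the sketch self-contained it creates the second file at runtime, but in a real project the two files would simply sit side by side:

```python
# Suppose the project has a second file, mathutils.py, containing:
#
#     def double(x):
#         return 2 * x
#
# Then the main script just imports it -- no makefile, no compile step.
import pathlib
import sys
import tempfile

pkg_dir = pathlib.Path(tempfile.mkdtemp())
(pkg_dir / "mathutils.py").write_text("def double(x):\n    return 2 * x\n")

sys.path.insert(0, str(pkg_dir))  # make the directory importable
import mathutils

print(mathutils.double(21))  # prints 42
```

When the files are in the same directory (or on `sys.path` already), the `tempfile` machinery disappears and only the `import` line remains.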
The C/C++ makefile is mainly used for compiling the project, which is not needed with Python, as it is a scripting language (not compiled). What kind of operations do you require from your makefile? For package management, you can read about pip (a common Python package manager).
Arrange your project of several Python files into a package.
"Package" can mean either putting several files together so that others can use your code by means of an import, or actually distributing your app for others to use.
For the first, just create a folder with the .py files in it and an empty __init__.py inside that folder. For both scenarios, also read the documentation here, and maybe here too.
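A concrete picture of the first scenario, using invented names (`mytools`, `strings.py`, and `shout` are placeholders):

```
mytools/             <- the package: any folder containing an __init__.py
    __init__.py      <- can be completely empty
    strings.py       <- defines, say, shout()
    numbers.py

# elsewhere, with the parent directory of mytools/ on sys.path:
#     from mytools.strings import shout
```

That is all a basic package is; the distribution scenario additionally needs a setup.py, as the other answers describe.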
The best analogue is a shell script that executes your file. It could take care of any complicated arguments and set environment variables if necessary. To include other .py files, you can probably just import them into your main Python file. If you have more advanced needs, I would recommend asking a second question.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
Disclaimer: I'm still not sure I understand fully what setup.py does.
From what I understand, using a setup.py file is convenient for packages that need to be compiled, or to notify Distutils that the package has been installed so it can be used by other programs. setup.py is thus great for libraries or modules.
But what about super simple packages that only have a foo.py file to run? Does a setup.py file make packaging for a Linux repository easier?
Using a setup.py script is only useful if:
Your code is a C extension, and then depends on platform-specific features that you really don't want to define manually.
Your code is pure Python but depends on other modules, in which case dependencies may be resolved automatically.
For a single file or a small set of files that don't rely on anything else, writing one is not worth the hassle. As a side note, your code is likely to be more attractive to people if trying it out doesn't require a complex setup: "installing" it is then just a matter of copying a directory or a single file into one's project directory.
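For the cases where a setup script is worth having, it can stay very small. A sketch using setuptools, with placeholder metadata (the name, version, and dependency are all invented):

```python
# setup.py -- minimal packaging script; every value here is a placeholder
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    packages=find_packages(),          # discovers any folder with an __init__.py
    install_requires=["requests"],     # dependencies resolved automatically on install
)
```

The `install_requires` line is what makes the second bullet above work: installing the package pulls in its dependencies without any manual steps.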
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I have been working on a Python project for physics, using Enthought Canopy, and I wished to enhance it by converting it to Cython code. Could someone please tell me how to rewrite Python code as Cython code in Canopy? Or do I need separate software?
Just to add: it's probably better to use libraries which already incorporate Cython. NumPy, for example, has virtually any array handling you can think of and has been optimized around things like matrix multiplication. Smart people have already done the work for you, so see if you can get such a module to do what you need, and only as a last resort rewrite your code using Cython.
The Canopy Python distribution is bundled with Cython. However, to use it you will need a C compiler on your machine, which you may or may not already have. The Cython documentation has a good overview of Cython basics, including a brief description of where to get/find a C compiler for your operating system.
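The usual workflow, inside Canopy or out, is to rename a module to `.pyx`, optionally add C type declarations to hot loops, and compile it. A sketch with an invented module name (`fib.pyx`); note that plain Python is already valid Cython, so the `cdef` lines are optional speedups:

```
# fib.pyx -- valid Cython; the 'int'/'cdef' type declarations are the only
# changes from the original pure-Python function
def fib(int n):
    cdef int a = 0, b = 1, i
    for i in range(n):
        a, b = b, a + b
    return a

# build in place (from a shell, with a C compiler installed):
#   cythonize -i fib.pyx
# after which Python code can simply `import fib` and call fib.fib(10)
```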
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
Personally I think it's better to distribute .py files as these will then be compiled by the end-user's own python, which may be more patched.
What are the pros and cons of distributing .pyc files versus .py files for a commercial, closed-source python module?
In other words, are there any compelling reasons to distribute .pyc files?
Edit: In particular, if the .py/.pyc is accompanied by a DLL/SO module which is compiled against a certain version of Python.
If your proprietary bits are inside a binary DLL or SO, then there's no real value in making the interface layer a .pyc (as opposed to a .py). You can drop it altogether or distribute it as an uncompiled Python file. I don't know of any reason to distribute compiled Python files; in many cases, build environments treat them as stale byproducts and clean them out, so your program might disappear.
You should know that there are open-source and proprietary decompilers that convert Python byte code back to Python source. One such example is Mysterie's uncompyle2.
Moreover, .pyc files are not safe across Python versions, so distributing .pyc instead of .py is more trouble than benefit.
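The version coupling is easy to see from the standard library. A small sketch that byte-compiles a throwaway module (the file name is invented) and prints the interpreter's cache tag, which is what ties a .pyc to one specific Python version:

```python
import pathlib
import py_compile
import sys
import tempfile

# Write a trivial module and byte-compile it, the way CPython does on import.
src = pathlib.Path(tempfile.mkdtemp()) / "hello.py"
src.write_text("GREETING = 'hi'\n")

pyc = py_compile.compile(str(src), cfile=str(src.with_suffix(".pyc")))
print(pyc)  # path of the generated .pyc file

# The bytecode format is tied to one interpreter version:
print(sys.implementation.cache_tag)  # e.g. 'cpython-311'
```

A .pyc produced under one cache tag is not guaranteed to load under another, which is exactly why shipping .py is the safer default.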
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I write tons of Python scripts, and I find myself reusing lots of code that I've written for other projects. My solution has been to make sure the code is separated into logical modules/packages (this one's a given). I then make them setuptools-aware and publish them on PyPI. This allows my other scripts to always have the most up-to-date code, I get a warm fuzzy feeling because I'm not repeating myself, and my development, in general, is made less complicated. I also feel good that there MAY be someone out there who finds my code handy for something they're working on, but it's mainly for selfish reasons :)
To all the Pythonistas: how do you handle this? Do you use PyPI or setuptools (easy_install)? Or something else?
I have been doing the same thing. Extract common functionality, pretty the code up with extra documentation and unit tests/doctests, create an easy_install-ready setup.py, and then release on PyPI. Recently, I created a single Google Code site where I manage the source and keep the wiki up to date.
What kind of modules are we talking about here? If you're planning on distributing your projects to other python developers, setuptools is great. But it's usually not a very good way to distribute apps to end users. Your best bet in the latter case is to tailor your packaging to the platforms you're distributing it for. Sure, it's a pain, but it makes life for end users far easier.
For example, on my Debian system, I usually don't use easy_install, because it is a little bit more difficult to get eggs to work well with the package manager. On OS X and Windows, you'd probably want to package everything up using py2app and py2exe respectively. This makes life better for the end user: after all, they shouldn't know or care what language your scripts are written in. They just need them to install.
I store it all offline in a logical directory structure, with commonly used modules grouped as utilities. This makes it easier to control which versions I publish and to manage them. I also automate the build process to interpret the logical directory structure.