I want to simulate a propagating wave with absorption and reflection off some bodies in three-dimensional space. I want to do it in Python. Should I use NumPy? Are there special libraries I should use?
How can I simulate the wave? Can I use the wave equation? But what do I do when there is a reflection?
Is there a better method? Should I do it with vectors? But when the rays diverge, the intensity gets lower. Difficult.
Thanks in advance.
If you do any computationally intensive numerical simulation in Python, you should definitely use NumPy.
The most general algorithm for simulating an electromagnetic wave in arbitrarily-shaped materials is the finite-difference time-domain (FDTD) method. It solves the wave equation, one time step at a time, on a 3-D lattice. It is quite complicated to program yourself, though, and you are probably better off using a dedicated package such as Meep.
There are books on how to write your own FDTD simulations: here's one, here's a document with some code for 1-D FDTD and explanations on more than 1 dimension, and Googling "writing FDTD" will find you more of the same.
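To give a flavor of what those texts cover, here is a minimal sketch of 1-D wave-equation time stepping (a leapfrog finite-difference scheme; all parameters are illustrative, and a real EM FDTD code would update E and H fields on a staggered Yee grid):

import numpy as np

nx, nt = 200, 500
C = 0.5                          # Courant number c*dt/dx; must be <= 1 for stability
u_prev = np.zeros(nx)            # field at time step n-1
u = np.zeros(nx)                 # field at time step n
u[nx // 2] = 1.0                 # initial pulse in the middle

for _ in range(nt):
    u_next = np.zeros(nx)
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + C**2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    # the endpoints stay 0: fixed, perfectly reflecting boundaries; proper
    # absorbing boundaries (e.g., PML) are what packages like Meep provide
    u_prev, u = u, u_next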
You could also approach the problem by assuming all your waves are plane waves, then you could use vectors and the Fresnel equations. Or if you want to model Gaussian beams being transmitted and reflected from flat or curved surfaces, you could use the ABCD matrix formalism (also known as ray transfer matrices). This takes into account the divergence of beams.
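As a minimal sketch of the ABCD approach (the wavelength, distances, and focal length below are purely illustrative): propagate the complex beam parameter q of a Gaussian beam through free space and a thin lens, then read off the spot size.

import numpy as np

wavelength = 633e-9                           # m
w0 = 1e-3                                     # waist radius, m
zR = np.pi * w0**2 / wavelength               # Rayleigh range
q = 1j * zR                                   # complex beam parameter at the waist

free_space = lambda d: np.array([[1.0, d], [0.0, 1.0]])
thin_lens = lambda f: np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# system: 0.5 m of free space, an f = 0.25 m lens, then 0.25 m more
M = free_space(0.25) @ thin_lens(0.25) @ free_space(0.5)
(A, B), (C, D) = M
q_out = (A * q + B) / (C * q + D)             # the ABCD law for Gaussian beams

# 1/q = 1/R - i*wavelength/(pi*w^2), so the spot size at the output is:
w = np.sqrt(-wavelength / (np.pi * (1.0 / q_out).imag))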
If you are solving custom PDEs in 3-D, I would recommend at least a look at FiPy. It will save you the trouble of building a lot of your matrix preconditioners and solvers from scratch. It uses NumPy and/or Trilinos. Here are some examples.
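A minimal FiPy sketch, adapted from the style of its documented diffusion examples (mesh size, coefficient, boundary value, and time step are illustrative; check the FiPy docs for your version's exact API):

from fipy import Grid1D, CellVariable, TransientTerm, DiffusionTerm

mesh = Grid1D(nx=50, dx=1.0)
phi = CellVariable(name="phi", mesh=mesh, value=0.0)
phi.constrain(1.0, mesh.facesLeft)            # fixed-value boundary on the left

# transient diffusion: d(phi)/dt = div(D grad(phi))
eq = TransientTerm() == DiffusionTerm(coeff=1.0)
for _ in range(10):
    eq.solve(var=phi, dt=0.9)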
I recommend you use my project GarlicSim as the framework in which you build the simulation. You will still need to write your algorithm yourself, probably in Numpy, but GarlicSim may save you a bunch of boilerplate and allow you to explore your simulation results in a flexible way, similar to version control systems.
Don't use Python. I've tried using it for computationally expensive things and it just wasn't made for that.
If you need to simulate a wave in a Python program, write the necessary code in C/C++ and expose it to Python.
Here's a link to the C API: http://docs.python.org/c-api/
Be warned, it isn't the easiest API in the world :)
I have a time series and generate its spectrogram in Python with matplotlib.pyplot.specgram.
After I make some analysis and changes I need to convert the spectrogram back into time series.
Is there any function in matplotlib or in another library that I can use directly? If not, could you please suggest which direction I should work in?
Your warm help is appreciated.
Matplotlib is a library for plotting data. Generally if you're trying to do any computation you'd use a library suited for that.
numpy is a very popular library for doing numerical computation in Python. It just so happens they have a fairly extensive set of fft and ifft methods.
I would check them out here and see if they can solve your problem.
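As a minimal illustration of that forward/inverse round trip in NumPy (the spectrogram-specific difficulty, the lost phase, is discussed below):

import numpy as np

x = np.random.rand(1024)             # some time series
X = np.fft.rfft(x)                   # forward FFT (real input)
x_back = np.fft.irfft(X, n=x.size)   # inverse FFT recovers the signal
assert np.allclose(x, x_back)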
One thing commonly done (for example in the source separation community) is to use the phase data of the original signal (from before the transformations were applied to it); the result is much better than null or random phase, and not so far from algorithms aiming at reconstructing the phase information from scratch.
A classic reconstruction algorithm is Griffin & Lim's, described in the paper "Signal estimation from modified short-time Fourier transform". This is an iterative algorithm; each iteration requires a full STFT and inverse STFT, which makes it quite costly.
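A minimal sketch of the iteration using scipy.signal (window length and overlap are illustrative; it assumes the magnitude array came from an STFT with the same parameters, so shapes are consistent across iterations):

import numpy as np
from scipy.signal import stft, istft

def griffin_lim(mag, n_iter=50, fs=1.0, nperseg=256, noverlap=192):
    # start from random phase and iteratively enforce the target magnitude
    phase = np.exp(2j * np.pi * np.random.rand(*mag.shape))
    for _ in range(n_iter):
        _, x = istft(mag * phase, fs=fs, nperseg=nperseg, noverlap=noverlap)
        _, _, X = stft(x, fs=fs, nperseg=nperseg, noverlap=noverlap)
        phase = np.exp(1j * np.angle(X))   # keep the phase, discard the magnitude
    _, x = istft(mag * phase, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return x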
This problem is indeed an active area of research, a search for STFT + reconstruction + magnitude will yield plenty of papers aiming at improving on Griffin&Lim in terms of signal quality and/or computational efficiency.
You can find a detailed discussion here: Thread on DSP Stack Exchange
I'm trying to implement the famous multilateration algorithm in order to pinpoint a radiation emission source, given a set of arrival times for various detectors. I have the necessary data, but I'm still having trouble implementing this calculation; I am relatively new to Python.
I know that, if I were to do this by hand, I would use matrices and carry out elementary row operations in order to find my 3 unknowns (x,y,z), but I'm not sure how to code this. Is there a way to have Python implement ERO, or is there a better way to carry out my computation?
Depending on your needs, you could try:
NumPy if you're interested in numerical solutions. As far as I remember, it can solve linear equations; I don't know how it deals with non-linear resolution.
SymPy for symbolic math. It solves linear equations symbolically... according to their main page.
The two above are "generic" math packages. I doubt you will (easily) find any dedicated (and maintained) library for your specific need. There was already a question on that topic here: Multilateration of GPS Coordinates
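For the linear-system part, NumPy does the row reduction for you. A minimal sketch (A and b are placeholders; in multilateration they would come from linearizing the arrival-time equations):

import numpy as np

A = np.array([[2.0, 1.0, -1.0],
              [-3.0, -1.0, 2.0],
              [-2.0, 1.0, 2.0]])
b = np.array([8.0, -11.0, -3.0])

x, y, z = np.linalg.solve(A, b)   # raises LinAlgError if A is singular
# for non-linear systems, scipy.optimize (e.g., least_squares) is the usual route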
Long-time R and Python user here. I use R for my daily data analysis and Python for tasks heavier on text processing and shell scripting. I am working with increasingly large data sets, and these often arrive as binary or text files. What I normally do is apply statistical/machine learning algorithms and create statistical graphics. I use R with SQLite sometimes, and I write C for iteration-intensive tasks.

Before looking into Hadoop, I am considering investing some time in NumPy/SciPy, because I've heard it has better memory management (and the transition to NumPy/SciPy for someone with my background seems not that big). I wonder if anyone has experience using the two and could comment on the improvements in this area, and whether there are idioms in NumPy that deal with this issue. (I'm also aware of Rpy2 but wondering if NumPy/SciPy can handle most of my needs.) Thanks.
R's strength when looking for an environment to do machine learning and statistics is most certainly the diversity of its libraries. To my knowledge, SciPy + SciKits cannot be a replacement for CRAN.
Regarding memory usage, R uses a pass-by-value paradigm while Python uses pass-by-reference. Pass-by-value can lead to more "intuitive" code; pass-by-reference can help optimize memory usage. NumPy also lets you take "views" on arrays (a kind of subarray with no copy being made).
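For example, a minimal illustration of views:

import numpy as np

a = np.arange(12).reshape(3, 4)
v = a[:, 1:3]        # a view: no data is copied
v[0, 0] = 99         # writes through to 'a'
print(a[0, 1])       # 99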
Regarding speed, pure Python is faster than pure R for accessing individual elements in an array, but this advantage disappears when dealing with numpy arrays (benchmark). Fortunately, Cython lets one get serious speed improvements easily.
If working with Big Data, I find the support for storage-based arrays better with Python (HDF5).
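For example, with h5py (one common Python HDF5 binding; the file and dataset names here are just illustrative):

import numpy as np
import h5py

# write a large array to disk once...
with h5py.File("data.h5", "w") as f:
    f.create_dataset("x", data=np.random.rand(10000, 100))

# ...then read back only a slice, without loading the whole array
with h5py.File("data.h5", "r") as f:
    chunk = f["x"][:1000]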
I am not sure you should ditch one for the other but rpy2 can help you explore your options about a possible transition (arrays can be shuttled between R and Numpy without a copy being made).
I use NumPy daily and R nearly so.
For heavy number crunching, I prefer NumPy to R by a large margin (including R packages like 'Matrix'). I find the syntax cleaner, the function set larger, and computation quicker (although I don't find R slow by any means). NumPy's broadcasting functionality, for instance, I don't think has an analog in R.
For instance, reading in a data set from a CSV file and 'normalizing' it for input to an ML algorithm (e.g., mean-centering then re-scaling each dimension) requires just this:
import numpy as NP

data = NP.loadtxt(data1, delimiter=",")  # data1 is the path to a CSV file; 'data' is a NumPy array
data -= NP.mean(data, axis=0)            # mean-center each column (via broadcasting)
data /= NP.max(data, axis=0)             # re-scale each column
Also, i find that when coding ML algorithms, i need data structures that i can operate on element-wise and that also understand linear algebra (e.g., matrix multiplication, transpose, etc.). NumPy gets this and allows you to create these hybrid structures easily (no operator overloading or subclassing, etc.).
You won't be disappointed by NumPy/SciPy, more likely you'll be amazed.
So, a few recommendations, both general and particular, given the facts in your question:

Install both NumPy and SciPy. As a rough guide, NumPy provides the core data structures (in particular the ndarray) and SciPy (which is actually several times larger than NumPy) provides the domain-specific functions (e.g., statistics, signal processing, integration).

Install the repository versions, particularly w/r/t NumPy, because the dev version is 2.0. Matplotlib and NumPy are tightly integrated; you can use one without the other, of course, but both are the best in their respective class among Python libraries. You can get all three via easy_install, which I assume you already have.

NumPy/SciPy have several modules specifically directed at machine learning/statistics, including the Clustering package and the Statistics package.

There are also packages directed at general computation which make coding ML algorithms a lot faster, in particular Optimization and Linear Algebra.

There are also the SciKits, not included in the base NumPy or SciPy libraries; you need to install them separately. Generally speaking, each SciKit is a set of convenience wrappers to streamline coding in a given domain. The SciKits you are likely to find most relevant are: ann (approximate Nearest Neighbor) and learn (a set of ML/statistics regression and classification algorithms, e.g., Logistic Regression, Multi-Layer Perceptron, Support Vector Machine).
Is there a library of data structures and operations for quadratic bezier curves? I need to implement:
Bezier-to-bitmap conversion with arbitrary quality
optimizing Bezier curves
common operations like subtraction, extraction, rendering, etc.
Languages: C, C++, .NET, Python
Algorithms without implementation (pseudocode etc.) could be useful too, especially for optimization.
A small Python Bezier library is included in NodeBox:
http://nodebox.net/code/index.php/Bezier
There are plenty of algorithms inside Inkscape, but I have not dug into the code yet to find out how easily they could be used outside of Inkscape.
Update: Inkscape is using lib2geom:
lib2geom (2Geom in private life) was initially a library developed for Inkscape, but will provide a robust computational geometry framework for any application. It is not a rendering library, instead concentrating on high-level algorithms such as computing arc length.
lib2geom is at http://lib2geom.sourceforge.net
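If you end up rolling your own, the core evaluation of a quadratic Bezier curve is short. A minimal pure-Python sketch (points as (x, y) tuples; the segment count n controls the rendering quality):

def quad_bezier(p0, p1, p2, t):
    # quadratic Bezier: B(t) = (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2
    u = 1.0 - t
    return (u*u*p0[0] + 2*u*t*p1[0] + t*t*p2[0],
            u*u*p0[1] + 2*u*t*p1[1] + t*t*p2[1])

def flatten(p0, p1, p2, n=32):
    # approximate the curve by a polyline; rasterize the segments for a bitmap
    return [quad_bezier(p0, p1, p2, i / n) for i in range(n + 1)]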
You might want to take a look at Cairo. I am not exactly sure if it covers all your requirements but it should be able to handle rendering at least.
I am searching for a python package that I can use to simulate molecular dynamics in non-equilibrium situations. I need a setup that can handle a fairly large number of molecules in a primarily kinetic theory manner, and that can handle having solid surfaces present. With regards to the surfaces, I would need to be able to create arbitrary shapes and monitor pressure and other variables resulting from the molecular action. Alternatively, I could add the surface parts myself if I had molecules that could handle it.
Does anyone know of any packages that might be suitable?
Have you considered SimPy? SimPy is a rather generic discrete event simulation package, but it could feasibly meet your needs.
Better yet, the Molecular Modelling Toolkit (MMTK) seems more specialized...
I have used neither, but this sounds like fun. Python, as a language, seems to be in a privileged position for use in simulation software, whereby people can script the specific details of their model while relying on the framework for all the common logic, such as scheduling, visualization, and monitoring. The unknown is how well such toolkits scale when fed with agent counts commensurate with biology models (BTW, how "big" is that?)
LAMMPS and GROMACS are two well-known molecular dynamics codes. Both have some Python-based wrapper stuff, but I am not sure how much functionality the wrappers expose; they may not give you enough control over the simulation.
Google for "GromacsWrapper", or google for "lammps" and "pizza.py".
Digital Material and ASE are two molecular dynamics codes that expose a lot of functionality, but last time I looked, they were both fairly specialized and may not allow you to use the force potentials that you want.
Google for "digital material" and "cornell", or google for "ase" and "dtu".
Note to MJV: Normal MD codes take one time step at a time, moving all particles in each step. Most of the time is spent calculating the total force on each atom, which involves iterating over a list of pairs of neighboring atoms. I think the best idea is to do the force calculation and a few other basics in C++ or Fortran and then wrap that functionality in Python. (But it could be fun to see how far one can get using NumPy matrices.)
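As a hedged illustration of that inner loop in pure NumPy (a toy O(N^2) Lennard-Jones force calculation; real codes use neighbor lists and cutoffs, and the parameters here are illustrative):

import numpy as np

def lj_forces(pos, epsilon=1.0, sigma=1.0):
    diff = pos[:, None, :] - pos[None, :, :]   # (N, N, 3) displacement vectors
    r2 = np.sum(diff**2, axis=-1)              # squared distances
    np.fill_diagonal(r2, np.inf)               # no self-interaction
    inv_r6 = (sigma**2 / r2)**3
    # F_ij = 24*eps*(2*(sigma/r)^12 - (sigma/r)^6) / r^2 * (r_i - r_j)
    fmag = 24.0 * epsilon * (2.0 * inv_r6**2 - inv_r6) / r2
    return np.sum(fmag[:, :, None] * diff, axis=1)   # (N, 3) net force per atom

pos = np.random.rand(100, 3) * 10.0
forces = lj_forces(pos)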
The following programs can be used to run MD simulations:
Gromacs
AMBER
CHARMM
OpenMM
many others...
The following Python packages are useful for preparing and analysing MD trajectories:
MDTraj and the OMNIA ecosystem
MDAnalysis
ProDy
MMTK
Another generic simulations framework is my own GarlicSim. You can try that. I could help you get a simpack up if you're serious about it.
I don't know if it has all the features you need, but there is Avogadro among the KDE programs. I think it is extensible, and since it is open source you could do anything with it. http://www.kde-apps.org/content/show.php/Avogadro?content=59521
It is really advanced, and programmed by a friend of mine.
I second MMTK, but take a look at VMD, which is the best MD software I'm aware of, and is Python-scriptable (in addition to Tk). See this for examples and tutorials.
I recommend using dedicated molecular dynamics software such as Gromacs to run MD simulations; it is highly optimized for that particular purpose, and you can also run on GPUs, letting you simulate larger systems in less time.
Afterwards, run only the analysis in Python, using the generated trajectories, with packages such as:
mdtraj
pmx