So my research mates and I are trying to save a fairly large array of shape (47104, 5) into a TTree in a ROOT file. The array works fine on the Python side: we can access everything and run normal commands. But when we call root_numpy.array2root(), we get a strange error.
Object of type 'NoneType' has no len()
The code we are running for this portion is as follows:
import root_numpy as rnp
import numpy as np
import scipy
import logging

def save_array(outputArray, outputName):
    outputString = str(outputName)
    logging.info("Creating .Root file")
    rnp.array2root(outputArray, outputString, treename="Training_Variables", mode="recreate")
We added the outputString variable to make sure we were passing the filename in as a string. (In our Python terminal, we append .root to outputName so the file is saved as a .root file.)
(Screenshot of the terminal showing the exact error location inside root_numpy.)
In short, we are confused about why array2root() is asking for the len() of an object that we don't think should have a len(); it should just have a shape. Any insight would be greatly appreciated.
The conversion routines from NumPy arrays to ROOT data types expect structured arrays. See the two links below. (Not tested, but this is very likely the problem, as the routines rely on the arr.dtype.names and arr.dtype.fields attributes, which are None for a plain 2D array, hence the "NoneType has no len()" error.)
http://rootpy.github.io/root_numpy/reference/generated/root_numpy.array2tree.html#root_numpy.array2tree
http://rootpy.github.io/root_numpy/reference/generated/root_numpy.array2root.html#root_numpy.array2root
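For illustration, here is a minimal, untested sketch of converting a plain (N, 5) float array into a structured array before writing it out. The branch names (var0 ... var4) and the output filename are placeholders, not taken from the question:

import numpy as np
import root_numpy as rnp

# Plain (N, 5) array as in the question; the branch names below are made up.
plain = np.random.rand(47104, 5)

dtype = [("var%d" % i, np.float64) for i in range(plain.shape[1])]
structured = np.core.records.fromarrays(
    [plain[:, i] for i in range(plain.shape[1])], dtype=dtype)

# Now dtype.names is populated, so array2root() can map columns to branches.
rnp.array2root(structured, "training.root", treename="Training_Variables", mode="recreate")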
I am getting an error and I don't know how to handle it. I created a script that generates a barcode, and it works fine. But when I convert the script into an exe and try to use it, I get an error.
(Screenshot of the error.)
Here is my code:
import barcode
from barcode.writer import ImageWriter

EAN = barcode.get_barcode_class('code128')
with open('somefile.jpeg', 'wb') as f:
    EAN("12347859450", writer=ImageWriter()).write(f)
Please help me out of this situation.
It looks like the freezing tool you used to create the executable (cx_Freeze?) is failing to include a needed dependency, in this case apparently the FreeType library. I would investigate in that direction.
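For example, if cx_Freeze is indeed the tool in use, explicitly telling the bundler to include the barcode and Pillow packages is a common fix. This is only a sketch under that assumption, and the script name is hypothetical:

from cx_Freeze import setup, Executable

# Force cx_Freeze to bundle the barcode and Pillow (PIL) packages, whose
# font resources are what the FreeType backend loads at runtime.
setup(
    name="barcode_app",
    version="0.1",
    options={"build_exe": {"packages": ["barcode", "PIL"]}},
    executables=[Executable("generate_barcode.py")],
)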
E.g. for:
import numpy as np
x = np.array(1)
I know that the array function is part of the numpy module. However, numpy is quite big, so is there a way to find the exact file where the array function is implemented?
If you are using Visual Studio Code, put the mouse on the word, press Ctrl, and click when it turns blue (Go to Definition).
It works for me, at least.
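Without an IDE, the standard library's inspect module can also point you at the file. A small sketch (note that np.array itself is implemented in C, so getsourcefile() only works for pure-Python functions such as np.mean):

import inspect
import numpy as np

print(np.__file__)                     # location of the numpy package itself
print(inspect.getsourcefile(np.mean))  # file defining a pure-Python function
# inspect.getsourcefile(np.array) raises TypeError, since np.array is a C builtin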
It seems I need to use freeze_graph.py (unless I'm mistaken):
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/freeze_graph.py
This is the graph I want to extract:
C:\tmp\output_graph.pb
I'm looking at freeze_graph.py, but I'm having trouble understanding some of the arguments of the script, and I'm ending up with an import error.
--input_graph: the graph I want to extract.
--input_checkpoint: I don't think I need this.
--input_binary: I don't think I need this, as my inputs were image file values.
--output_node_names: the labels?
--input_meta_graph: not sure what this is.
I don't see an output argument for a text file or pickle file that will hold all the weights.
I tried running the following line from command prompt:
python3.6 C:\Users\Moondra\tensorflow\tensorflow\python\tools\freeze_graph.py --input_graph=C:\tmp\output_graph.pb
However, I'm getting an import error:
from tensorflow.python.tools import saved_model_util
ImportError: cannot import name 'saved_model_utils'
saved_model_utils seems to be in the same folder, so I'm not sure what the problem is.
Thank you.
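For what it's worth, the script can also be driven from Python rather than the command line. This is only a rough sketch assuming a TensorFlow 1.x layout; the checkpoint path and the output node name ("final_result") are placeholders, and the exact argument list may differ between versions:

from tensorflow.python.tools import freeze_graph

# All paths and the output node name below are placeholders for illustration.
freeze_graph.freeze_graph(
    input_graph=r"C:\tmp\output_graph.pb",
    input_saver="",
    input_binary=True,                   # True for a binary .pb, False for a text .pbtxt
    input_checkpoint=r"C:\tmp\model.ckpt",
    output_node_names="final_result",
    restore_op_name="save/restore_all",
    filename_tensor_name="save/Const:0",
    output_graph=r"C:\tmp\frozen_graph.pb",
    clear_devices=True,
    initializer_nodes="")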
I want to produce plots like this (example image omitted), except with many more particles. Matplotlib is woefully inadequate for the job.
Right now I am using Mayavi with Python 3.5, running through a Jupyter notebook. Since I need to plot 5x10^5 spheres, this will not be practical: time is already a limiting factor at 2x10^4 spheres.
Here is my Python code to produce the Mayavi plot. I have a NumPy array of rows [a, r, x, y, z]; what the first quantity is doesn't matter for this problem.
"""VISUALIZATION WITH MAYAVI"""
#I think this is too slow to be practical.
#view particles with mayavi
import mayavi
from mayavi import mlab
%gui qt
def plot_sphere(p): #feed it p and get back a sphere \n",
t1,R,a,b,c = p
[phi,theta] = np.mgrid[0:2*np.pi:12j,0:np.pi:12j] #increase the numbers before j for better resolution but more time
x = R*np.cos(phi)*np.sin(theta) + a
y = R*np.sin(phi)*np.sin(theta) + b
z = R*np.cos(theta) + c
return mlab.mesh(x, y, z)
#run s over all particles and plot it
def view(particles):
for p in particles:
plot_sphere(p)
view(spheres)
This code produces plots like this (screenshot omitted).
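As an aside, within Mayavi itself a single glyph-based call is usually far faster than building one mesh per particle. A rough sketch, assuming the same [a, r, x, y, z] layout, with placeholder data:

from mayavi import mlab
import numpy as np

# Placeholder for the real (N, 5) array of [a, r, x, y, z] rows.
spheres = np.random.rand(500000, 5)
a, r = spheres[:, 0], spheres[:, 1]
x, y, z = spheres[:, 2], spheres[:, 3], spheres[:, 4]

# One call for all particles: by default the sphere glyphs are scaled by the
# scalar value (here the diameter 2*r), avoiding 5e5 separate meshes.
mlab.points3d(x, y, z, 2*r, scale_factor=1.0, resolution=12)
mlab.show()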
I have been told I should look into writing my numpy arrays to .vtk files using evtk, then visualizing these in ParaView. I downloaded ParaView and read this, but perhaps my version of Python is limiting me? First, install pyevtk; okay:
I tried conda install -c opengeostat pyevtk=1.0.0, but it fails due to incompatibility with my python version. I looked for details but could not find any.
Next I downloaded the repository from https://pypi.python.org/pypi/PyEVTK/1.0.0, then used pip to install it successfully.
Next I put evtk.py, vtk.py, hl.py, and xml.py in place and tried some examples from the repository; none of them work. Seemingly there is some problem with commands of the form
from .vtk import *
I tried replacing all of these in the four .py files with
from evtk import vtk
from vtk import *
and such, but no luck. Long story short, I can't get pyevtk working to export my numpy arrays as .vtk files. I could use some help in this regard, or better yet I would love a different option to get my numpy arrays rendered by ParaView. Any help is appreciated!
OK, I solved my own problem. The image (omitted here) was made with ParaView, after converting the numpy arrays to a .vtu file using pyevtk.
Out of the box the repository did not work; there was some problem with importing the modules inside the four .py files, so I modified them all. Instead of from .vtk import *, I changed it to from vtk import *, and so on, in every module in the library. evtk.py was not able to import a class from xml.py, so I copied and pasted that class into evtk.py and deleted xml.py. After some tinkering and clueless modifying to make the errors go away, eventually it worked.
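For anyone following the same route, the high-level helper in pyevtk does most of the work. This is only a sketch, assuming the package is importable as pyevtk.hl (the older PyEVTK 1.0.0 layout uses evtk.hl), with placeholder data and an arbitrary output name:

import numpy as np
from pyevtk.hl import pointsToVTK   # older layout: from evtk.hl import pointsToVTK

# Placeholder for the real (N, 5) array of [a, r, x, y, z] rows.
spheres = np.random.rand(500000, 5)
a, r = spheres[:, 0].copy(), spheres[:, 1].copy()
x, y, z = (spheres[:, i].copy() for i in range(2, 5))  # copies: pointsToVTK wants contiguous arrays

# Writes particles.vtu; in ParaView, apply a Glyph (sphere) filter scaled by "radius".
pointsToVTK("./particles", x, y, z, data={"a": a, "radius": r})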
I've been trying to throw together a Python program that will align, crop, and create an RGB image from HST and VLA .fits data. Unfortunately, I've run into a problem: it keeps trying to open an old file that no longer exists in the folder and is not opened anywhere in the code. I've googled and googled and haven't found anything like it, so perhaps it's just common sense to most, but I can't figure it out. Here's the error message (traceback screenshot omitted):
You can see at the top that the program I'm running has the filename rgbhstvla.py. I'm not sure what the error message means. Here's the Python program as well:
import numpy as np
import pyfits
import pylab
import img_scale
from pyraf import iraf as ir

hdulist = pyfits.open('3c68.fits', mode='readonly')  # the file named in the traceback
j_img = pyfits.getdata('230UVIS.fits')
h_img = pyfits.getdata('230IR.fits')
k_img = pyfits.getdata('5GHZ.fits')

jmin, jmax = j_img.mean()+0.75*j_img.std(), j_img.mean()+5*j_img.std()
hmin, hmax = h_img.mean()+0.75*h_img.std(), h_img.mean()+5*h_img.std()
kmin, kmax = k_img.mean()+0.75*k_img.std(), k_img.mean()+5*k_img.std()

img = np.zeros((1024, 1024, 3))
img[:,:,0] = img_scale.asinh(j_img, scale_min=jmin, scale_max=jmax)
img[:,:,1] = img_scale.asinh(h_img, scale_min=hmin, scale_max=hmax)
img[:,:,2] = img_scale.asinh(k_img, scale_min=kmin, scale_max=kmax)

pylab.clf()
pylab.imshow(img)
pylab.show()
(I'm still working on the program since I'm new to Python; tips here would be nice as well, but they're mostly unnecessary, as I'm sure I'll figure it out eventually.)
Python cannot find the file 3c68.fits, which is expected to be in the current working directory, C:\Users\Brandon\Desktop\Research. Either make sure the file is in that directory, or provide an absolute path in your code.
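A minimal sketch of that suggestion, using the directory shown in the error message and checking for the file before opening it:

import os
import pyfits

# Absolute path built from the directory in the error message;
# adjust it to wherever 3c68.fits actually lives.
fits_path = r"C:\Users\Brandon\Desktop\Research\3c68.fits"

if not os.path.exists(fits_path):
    raise IOError("Expected FITS file at " + fits_path)

hdulist = pyfits.open(fits_path, mode='readonly')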