Getting FileNotFoundError when trying to open an .h5 file - python

I have an .h5 file that contains the "catvnoncat" dataset. When I try to run the following code, I get an error, which I include at the bottom. I have tried getting the dataset from three different sources to exclude the possibility of a corrupted file.
What I would like to know is: what is causing the problem?
# My Code
import h5py
train_dataset = h5py.File('train_catvnoncat.h5', "r")
# The Error
Traceback (most recent call last):
File "f:\Downloads\Compressed\coursera-deep-learning-specialization-master_2\coursera-deep-learning-specialization-master\C1 - Neural Networks and Deep Learning\Week 2\Logistic Regression as a Neural Network\lr_utils.py", line 4, in <module>
train_dataset = h5py.File('train_catvnoncat.h5', "r")
File "C:\Users\Mohsen\AppData\Local\Programs\Python\Python39\lib\site-packages\h5py\_hl\files.py", line 507, in __init__
fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
File "C:\Users\Mohsen\AppData\Local\Programs\Python\Python39\lib\site-packages\h5py\_hl\files.py", line 220, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py\h5f.pyx", line 106, in h5py.h5f.open
FileNotFoundError: [Errno 2] Unable to open file (unable to open file: name = 'train_catvnoncat.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)

Your code is looking in the current working directory, which is not where the file is.
Based on the error message, it looks like you are on Windows. Is the file 'train_catvnoncat.h5' in your Downloads folder? Find that file on your system and copy its full path. You can then update this:
train_dataset = h5py.File('train_catvnoncat.h5', "r")
to something that contains the full path, similar to this:
train_dataset = h5py.File('C:/Users/Mohsen/Downloads/train_catvnoncat.h5', "r")
Pay attention to the slashes, and make sure you replace the path with the actual location of the file on your machine.
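If you would rather keep using a relative location, a minimal sketch (assuming the .h5 file sits in the same folder as lr_utils.py) is to resolve the path against the script's own location instead of the current working directory:
import os
import h5py

# Build the path next to this script, so the file opens no matter which
# directory Python was launched from. Adjust the filename if yours differs.
script_dir = os.path.dirname(os.path.abspath(__file__))
train_dataset = h5py.File(os.path.join(script_dir, 'train_catvnoncat.h5'), "r")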

Related

OSError: Unable to open file in pixellib

I'm trying to change the background of an image using the pixellib library, but I am encountering a problem. I'm getting this error:
Traceback (most recent call last):
File "C:/Users/FEd/PycharmProjects/pythonProject/PIXEL.py", line 16, in <module>
change_bg.load_pascalvoc_model("deeplabv3_xception_tf_dim_ordering_tf_kernels.h5")
File "C:\Users\FEd\PycharmProjects\pythonProject\venv\lib\site-packages\pixellib\tune_bg.py", line 43, in load_pascalvoc_model
self.model.load_weights(model_path)
File "C:\Users\FEd\PycharmProjects\pythonProject\venv\lib\site-packages\tensorflow\python\keras\engine\training.py", line 2319, in load_weights
with h5py.File(filepath, 'r') as f:
File "C:\Users\FEd\PycharmProjects\pythonProject\venv\lib\site-packages\h5py\_hl\files.py", line 427, in __init__
swmr=swmr)
File "C:\Users\FEd\PycharmProjects\pythonProject\venv\lib\site-packages\h5py\_hl\files.py", line 190, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py\h5f.pyx", line 96, in h5py.h5f.open
OSError: Unable to open file (unable to open file: name = 'deeplabv3_xception_tf_dim_ordering_tf_kernels.h5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
I have used os.getcwd to get the current working directory, but I am still getting the same error. I found this link, which states I should use os.getcwd to get the current working directory.
Code
import pixellib
from pixellib.tune_bg import alter_bg
import os
my_dir = os.getcwd()
print(my_dir)
m1 = my_dir+"\sample.jpg"
m2 = my_dir+"\change_background.jpg"
print(m1)
print(m2)
change_bg = alter_bg()
change_bg.load_pascalvoc_model("deeplabv3_xception_tf_dim_ordering_tf_kernels.h5")
change_bg.change_bg_img(f_image_path =m1, b_image_path = m2, output_image_name="new_img.jpg")
print("done")

Cannot load BERT from local disk

I am trying to use the Hugging Face transformers API to load a locally downloaded M-BERT model, but it is throwing an exception.
I clone this repo: https://huggingface.co/bert-base-multilingual-cased
bert = TFBertModel.from_pretrained("input/bert-base-multilingual-cased")
The directory structure is that of the cloned repository, but I am getting this error:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 1277, in from_pretrained
missing_keys, unexpected_keys = load_tf_weights(model, resolved_archive_file, load_weight_prefix)
File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 467, in load_tf_weights
with h5py.File(resolved_archive_file, "r") as f:
File "/usr/local/lib/python3.7/dist-packages/h5py/_hl/files.py", line 408, in __init__
swmr=swmr)
File "/usr/local/lib/python3.7/dist-packages/h5py/_hl/files.py", line 173, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5f.pyx", line 88, in h5py.h5f.open
OSError: Unable to open file (file signature not found)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "train.py", line 81, in <module>
__main__()
File "train.py", line 59, in __main__
model = create_model(num_classes)
File "/content/drive/My Drive/msc-project/code/model.py", line 26, in create_model
bert = TFBertModel.from_pretrained("input/bert-base-multilingual-cased")
File "/usr/local/lib/python3.7/dist-packages/transformers/modeling_tf_utils.py", line 1280, in from_pretrained
"Unable to load weights from h5 file. "
OSError: Unable to load weights from h5 file. If you tried to load a TF 2.0 model from a PyTorch checkpoint, please set from_pt=True.
Where am I going wrong?
Need help!
Thanks in advance.
As was already pointed out in the comments, your from_pretrained argument should be either the id of a model hosted on huggingface.co or a local path:
A path to a directory containing model weights saved using
save_pretrained(), e.g., ./my_model_directory/.
See documentation
Looking at your stack trace, it seems like your code is run inside:
/content/drive/My Drive/msc-project/code/model.py, so unless your model is in:
/content/drive/My Drive/msc-project/code/input/bert-base-multilingual-cased/ it won't load.
I would also set the path to match the documentation example, i.e.:
bert = TFBertModel.from_pretrained("./input/bert-base-multilingual-cased/")

Reading TDMS File with python nptdms, cannot open tdms file

I am having issues getting basic functions of the nptdms module working.
First, I am just trying to open a TDMS file and print the contents of specific channels within specific groups.
I am using Python 2.7 and the nptdms quick start guide here.
Following this, I will be writing these specific pieces of data into a new TDMS file. Then, my ultimate goal is to be able to take a set of source files, open each, and write (append) to a new file. The source data files contain far more information than is needed, so I am breaking out the specifics into their own file.
The problem I have is that I cannot get past a basic error.
When running this code, I get:
Traceback (most recent call last):
File "PullTDMSdataIntoNewFile.py", line 27, in <module>
tdms_file = TdmsFile(r"C:\\Users\daniel.worts\Desktop\this_is_my_tdms_file.tdms","r")
File "C:\Anaconda2\lib\site-packages\nptdms\tdms.py", line 94, in __init__
self._read_segments(f)
File "C:\Anaconda2\lib\site-packages\nptdms\tdms.py", line 119, in _read_segments
object._initialise_data(memmap_dir=self.memmap_dir)
File "C:\Anaconda2\lib\site-packages\nptdms\tdms.py", line 709, in _initialise_data
mode='w+b', prefix="nptdms_", dir=memmap_dir)
File "C:\Anaconda2\lib\tempfile.py", line 475, in NamedTemporaryFile
(fd, name) = _mkstemp_inner(dir, prefix, suffix, flags)
File "C:\Anaconda2\lib\tempfile.py", line 244, in _mkstemp_inner
fd = _os.open(file, flags, 0600)
OSError: [Errno 2] No such file or directory: 'r\\nptdms_yjfyam'
Here is my code:
from nptdms import TdmsFile
import numpy as np
import pandas as pd
#set Tdms file path
tdms_file = TdmsFile(r"C:\\Users\daniel.worts\Desktop\this_is_my_tdms_file.tdms","r")
# set variable for TDMS groups
group_nameone = '101'
group_nametwo = '752'
# set objects for TDMS channels
channel_dataone = tdms_file.object(group_nameone, 'Payload_1')
channel_datatwo = tdms_file.object(group_nametwo, 'Payload_2')
# set data from channels
data_dataone = channel_dataone.data
data_datatwo = channel_datatwo.data
print data_dataone
print data_datatwo
Big thanks to anyone who may have encountered this before and can help point to what I am missing.
Best,
- Dan
Edit:
I solved the read issue by removing the "r" argument that I was passing after the file path.
Now I am hitting another error that I can't trace when trying to write.
from nptdms import TdmsFile, TdmsWriter, RootObject, GroupObject, ChannelObject
import numpy as np
import pandas as pd
newfilepath = r"C:\\Users\daniel.worts\Desktop\Mined.tdms"
datetimegroup101_channel_object = ChannelObject('101', DateTime, data_datetimegroup101)
with TdmsWriter(newfilepath) as tdms_writer:
    tdms_writer.write_segment([datetimegroup101_channel_object])
Returns error:
Traceback (most recent call last):
File "PullTDMSdataIntoNewFile.py", line 82, in <module>
tdms_writer.write_segment([datetimegroup101_channel_object])
File "C:\Anaconda2\lib\site-packages\nptdms\writer.py", line 68, in write_segment
segment = TdmsSegment(objects)
File "C:\Anaconda2\lib\site-packages\nptdms\writer.py", line 88, in __init__
paths = set(obj.path for obj in objects)
File "C:\Anaconda2\lib\site-packages\nptdms\writer.py", line 88, in <genexpr>
paths = set(obj.path for obj in objects)
File "C:\Anaconda2\lib\site-packages\nptdms\writer.py", line 254, in path
self.channel.replace("'", "''"))
AttributeError: 'TdmsObject' object has no attribute 'replace'
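The AttributeError comes from writer.py calling self.channel.replace(...), which suggests the second argument passed to ChannelObject is not a string. A minimal sketch of the write with a string channel name, assuming data_datetimegroup101 already holds the data to be written:
from nptdms import TdmsWriter, ChannelObject

# Sketch only: ChannelObject expects (group, channel, data) with group and
# channel as strings; a non-string channel triggers the AttributeError above.
newfilepath = r"C:\Users\daniel.worts\Desktop\Mined.tdms"
channel_obj = ChannelObject('101', 'DateTime', data_datetimegroup101)
with TdmsWriter(newfilepath) as tdms_writer:
    tdms_writer.write_segment([channel_obj])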

Python unable to open a .h5 file

I am trying to open an HDF5 file in order to read it with Python, so that I can do more things with it later. There is an error when I run the program to read the file. The program is below:
import h5py # HDF5 support
import numpy
fileName = "C:/.../file.h5"
f = h5py.File(fileName, "r")
for item in f.attrs.keys():
    print item + ":", f.attrs[item]
mr = f['/entry/mr_scan/mr']
i00 = f['/entry/mr_scan/I00']
print "%s\t%s\t%s" % ("#", "mr", "I00")
for i in range(len(mr)):
    print "%d\t%g\t%d" % (i, mr[i], i00[i])
f.close()
If I run the program I end up seeing this error:
Traceback (most recent call last):
File "TestHD5.py", line 8, in <module>
mr = f['/entry/mr_scan/mr']
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (C:\aroot\work\h5py\_objects.c:2587)
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (C:\aroot\work\h5py\_objects.c:2546)
File "C:\programs\Python27\lib\site-packages\h5py\_hl\group.py", line 166, in __getitem__
oid = h5o.open(self.id, self._e(name), lapl=self._lapl)
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper (C:\aroot\work\h5py\_objects.c:2587)
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper (C:\aroot\work\h5py\_objects.c:2546)
File "h5py\h5o.pyx", line 190, in h5py.h5o.open (C:\aroot\work\h5py\h5o.c:3417)
KeyError: 'Unable to open object (Component not found)'
Am I just missing some modules needed to read the file, or is this something else? It will open the .h5 file if I use an HDF5 file viewer program. Thank you.
Your string:
path = "C:\Users\312001\m2020\data\20170104_145626\doPoint_20170104_150016\dataset_XMIT data_20170104_150020.h5"
is full of broken/illegal escapes (newer Python versions at least warn about or reject these, but you are using Python 2, where they pass through silently), and some that actually do work as escapes, so Python thinks path is really equal to: 'C:\\Users\xca001\\m2020\\data\x8170104_145626\\doPoint_20170104_150016\\dataset_XMIT data_20170104_150020.h5' (note those \x##'s).
Your options (each illustrated below):
Use a raw string by prefixing the string literal with r.
Don't use backslashes at all; forward slashes work fine for Windows paths in Python.
Double the backslashes so each \\ stands for one literal backslash.
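For example, all three of the following spell the same (shortened, hypothetical) Windows path:
# Raw string: backslashes are taken literally.
path = r"C:\Users\312001\m2020\data\file.h5"
# Forward slashes: accepted for Windows paths.
path = "C:/Users/312001/m2020/data/file.h5"
# Doubled backslashes: each \\ is one literal backslash.
path = "C:\\Users\\312001\\m2020\\data\\file.h5"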
The answer that #NickT posted fixed the original problem I had. The problem shown in the new version is due to the group names inside the HDF5 file not matching the paths that the code uses.

How to unpack the MNIST dataset?

I want to unpack a pkl file from the MNIST dataset. I have used this code to do that:
import cPickle, gzip, numpy
# Load the dataset
f = gzip.open('mnist.pkl.gz', 'rb')
train_set, valid_set, test_set = cPickle.load(f)
f.close()
But I am getting this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/gzip.py", line 34, in open
return GzipFile(filename, mode, compresslevel)
File "/usr/lib/python2.7/gzip.py", line 94, in __init__
fileobj = self.myfileobj = __builtin__.open(filename, mode or 'rb')
IOError: [Errno 2] No such file or directory: 'mnist.pkl.gz'
The mnist.pkl.gz file is on the desktop. How to specify the path in the code?
Put /home/<username>/Desktop/ in front of 'mnist.pkl.gz', or launch the script with the terminal's working directory set to the Desktop.
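A minimal sketch of the first option, assuming the file really is on the current user's Desktop:
import cPickle, gzip, os

# Build an absolute path to the file on the Desktop (~ expands to /home/<username>).
path = os.path.join(os.path.expanduser("~"), "Desktop", "mnist.pkl.gz")
f = gzip.open(path, "rb")
train_set, valid_set, test_set = cPickle.load(f)
f.close()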
