Running in Google Colab - python

Good evening.
I want to run my Python program on Google Colab, but where should I upload my files? How do I open a file from within the notebook?

You can always upload files to Google Colab, and you can create a directory as well.
You can create a directory named data, for example, and then upload the files you want placed in it, as in the sketch below.
Note: data uploaded or created during the runtime will not be saved once the session ends.
Alternatively, you can save the files to your Google Drive and mount the drive (see Google Colab: how to read data from my google drive?) to keep your runtime files and folders directly on the drive.
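A minimal sketch of that upload flow, assuming it runs in a Colab cell (the data directory name is just an example):

import os
from google.colab import files

# Create a directory to hold the uploads (example name).
os.makedirs('data', exist_ok=True)

# files.upload() opens a browser file picker and returns {filename: bytes}.
uploaded = files.upload()
for name, content in uploaded.items():
    with open(os.path.join('data', name), 'wb') as f:
        f.write(content)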

Related

I need to open latest modified (latest time stamp) file from my shared drive (google) in colab (python)

I need to open a file from my shared drive (Google) in Colab (Python), but only the latest modified file. I tried the code below, which works after I mount my Google Drive in Colab. But I need to open the file directly from the shared folder without mounting the drive.
import glob, os
import pandas as pd

# Newest file (by creation time) matching the wildcard pattern.
path_cakare = max(glob.iglob('/content/gdrive/MyDrive/gdrive/DMS - Finance Data Pack/Chennai/DMS Cakare/CustomerMaster/*/*'), key=os.path.getctime)
dfck = pd.read_excel(path_cakare)
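One way to open the latest file without mounting is to query the Drive API directly. A rough sketch, assuming you know the shared folder's ID (FOLDER_ID below is a placeholder) and that you authenticate inside Colab:

import io
import pandas as pd
from google.colab import auth
from googleapiclient.discovery import build
from googleapiclient.http import MediaIoBaseDownload

auth.authenticate_user()
drive_service = build('drive', 'v3')

# Ask Drive for the most recently modified file in the shared folder.
resp = drive_service.files().list(
    q="'FOLDER_ID' in parents and trashed = false",
    orderBy='modifiedTime desc',
    pageSize=1,
    fields='files(id, name)',
    supportsAllDrives=True,
    includeItemsFromAllDrives=True,
).execute()
latest = resp['files'][0]

# Download its bytes and hand them to pandas.
request = drive_service.files().get_media(fileId=latest['id'])
buf = io.BytesIO()
downloader = MediaIoBaseDownload(buf, request)
done = False
while not done:
    _, done = downloader.next_chunk()
buf.seek(0)
dfck = pd.read_excel(buf)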

upload folders inside a folder to google cloud using python

Previously I was working in AWS and I am new to Google Cloud. In AWS there was a way to upload directories/folders to a bucket. I have done a bit of research on uploading a directory/folder to a Google Cloud bucket but couldn't find anything. Can someone help me? I would like to upload some folders (not files) inside a folder to Google Cloud using Python. How do I do that?
To achieve this, you need to upload the contents of each directory file by file, replicating your local paths in your GCS bucket.
Note: directories don't exist in GCS; a "directory" is simply a set of objects sharing the same path prefix, presented as a folder in the UI.
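A minimal sketch of that file-by-file walk, assuming the google-cloud-storage client library is installed and authenticated, with placeholder bucket and folder names:

import os
from google.cloud import storage

def upload_directory(bucket_name, local_dir, prefix=''):
    bucket = storage.Client().bucket(bucket_name)
    for root, _, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            # Replicate the local relative path as the object name.
            object_name = os.path.join(prefix, os.path.relpath(local_path, local_dir)).replace(os.sep, '/')
            bucket.blob(object_name).upload_from_filename(local_path)

upload_directory('my-bucket', '/path/to/parent_folder')  # placeholder names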

dump files downloaded by google Colab in temporary location to google drive

I have a json file with over 16k URLs of images, which I parse using a Python script that calls urllib.request.urlretrieve to fetch each image. I uploaded the json file to Google Drive and ran the Python script in Google Colab.
Though the files were downloaded (I checked this with a print line in the try block around urlretrieve), and it took substantial time to download them, I am unable to see where they were stored. When I ran the same script on my local machine, it stored the files in the current folder.
As an answer to this question suggests, the files may have been downloaded to some temporary location, say, on some cloud. Is there a way to dump these temporary files to Google Drive?
(Note: I had mounted the drive in the Colab notebook, yet the files don't appear to be stored in Google Drive.)
Colab stores files in a temporary location that is fresh every time you run the notebook. If you want your data to persist across sessions, you need to store it in Google Drive: mount a Drive folder in your notebook and use it as the path, and give Colab permission to access your Drive when prompted.
After mounting the drive, you can move files from Colab to Google Drive with:
!mv /content/filename /content/gdrive/My\ Drive/
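Alternatively, you can skip the move and write the downloads straight into the mounted Drive path. A small sketch with a placeholder URL and target folder:

import os
import urllib.request
from google.colab import drive

drive.mount('/content/gdrive')

# Placeholder folder and URL, not from the original post.
target_dir = '/content/gdrive/My Drive/images'
os.makedirs(target_dir, exist_ok=True)
urllib.request.urlretrieve('https://example.com/pic.jpg', os.path.join(target_dir, 'pic.jpg'))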

How to upload files to drive from Colab?

I want to know how we can upload .txt and .vcf files.
I've already mounted the drive, then done some sorting and downloading of data with wget in Colab. But I was not able to find a resource on exporting or committing changes to the drive.
Please help!
Once you are inside a particular notebook, you can use the file browser on the left to upload files to be used in the current notebook. Remember that they will be deleted once your current session ends, so you will have to upload them again when you open the notebook later. If you have uploaded them elsewhere, you can simply use !wget to download them to your notebook's temporary storage.
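For example (with a placeholder URL):
!wget https://example.com/data.xyz -P /content/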
Edit: To copy data, simply use !cp to copy the file(s) from your notebook storage to the drive once you have mounted it. For example, here is how I would copy data.xyz:
from google.colab import drive
drive.mount('/content/gdrive')  # authorize access when prompted
!cp data.xyz "gdrive/My Drive/data.xyz"
You may just as simply use !mv to move data to the drive instead of copying it. In the same way, you can copy/move data from the drive back to the Colab notebook.

How do you import a .txt (or other file type) into a Google Colab notebook directly from your own Drive?

I'm trying to import a couple of .txt files into Colab from my own Google Drive. The Colab notebook I'm using is in the same Drive folder as the files I want to upload within that notebook. While there is documentation on file I/O in External data: Local Files, Drive, Sheets, and Cloud Storage within Colab, the descriptions do not seem to fit what I'm looking for (e.g. I could upload these files from my local directory each time I use the notebook, but instead I would like a cell that loads them once and keeps them loaded via a connection to my own Drive). Also, the .txt files I want to upload contain samples of plain text for training a deep learning algorithm on sentiment analysis, and so do not appear to be convertible to a .csv for a functionally equivalent application (which might otherwise be solved by Google Sheets).
From External data: Local Files, Drive, Sheets, and Cloud Storage, the options are 1) uploading from a local directory, 2) mounting Drive locally/using an API, or 3) importing from Google Sheets or Google Cloud Storage. Of these, I have tried 1 and some of 2, but because I am still relatively new to Python and Colab, the documentation for 2 and 3 is confusing and does not clearly direct me to a solution. It is therefore possible that my problem is one of interpretation, in which case showing how 2 or 3 could solve it would be quite helpful.
I'm thinking it should be possible to refer to these .txt files within Colab since both the Colab notebook and the .txt files are in the same Drive folder, but perhaps I'm overly comparing the cell functionality of Colab with navigating files in a directory in an everyday terminal.
If you have a file called foo.txt that you want to read or write in the root of your Google Drive, first mount your Drive in Colab:
from google.colab import drive
drive.mount('/content/drive')
The above will ask you to provide your credentials via a link in the cell output. Once you have authorized access, you can run the following in another cell to write Hello Google Drive! to a text file called foo.txt in your Google Drive:
with open('/content/drive/My Drive/foo.txt', 'w') as f:
    f.write('Hello Google Drive!')
To read the text file foo.txt back from your Google Drive, you can similarly use:
with open('/content/drive/My Drive/foo.txt', 'r') as f:
    print(f.read())
If you run this reading code after having written to foo.txt, it should output Hello Google Drive!
