I recently started using GitHub and was wondering: should I work directly in the GitHub repository folder, or work in a different folder and add new files to the repository later?
GitHub is a code hosting platform for version control and collaboration. It lets you and others work together on projects from anywhere. The usual workflow is to work directly in your local clone of the repository: create a branch for your changes, commit there, and merge the branch back into master when you are done.
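If it helps to see the "work directly in the repository folder" workflow concretely, here is a minimal sketch driven from Python's `subprocess` (the repository location, file name, and commit message are made up for illustration; it assumes git is installed):

```python
import pathlib
import subprocess
import tempfile

def git(args, repo):
    # Run a git command inside the repository folder and return its output.
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

# You work directly inside the repository folder: git tracks whatever
# you create or edit there, so no separate staging folder is needed.
repo = pathlib.Path(tempfile.mkdtemp())              # stand-in for your clone
git(["init"], repo)
(repo / "script.py").write_text("print('hello')\n")  # work happens in place
git(["add", "script.py"], repo)
git(["-c", "user.name=demo", "-c", "user.email=demo@example.com",
     "commit", "-m", "Add script"], repo)
print(git(["log", "--oneline"], repo))               # the commit is recorded
```

With a real GitHub repository you would start from `git clone <url>` instead of `git init`, and finish with `git push`.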
I want to know how popular tools update their folders automatically when the tool is run locally. Let me explain the scenario below.
Tool I want to run - example
The tool's GitHub repo contains a directory (rules). I want to check whether that directory has any update; if it does, download the rules folder locally, and then the tool continues to run.
I hope the question is clear.
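One common way to implement this kind of check (a sketch of my own, not from any particular tool; the function names and the marker-file scheme are illustrative) is to compare the remote HEAD commit hash from `git ls-remote` against the hash seen last time, and only download when it has changed:

```python
import pathlib
import subprocess

def remote_head(repo_url):
    # Ask the remote for its current HEAD commit hash without cloning.
    out = subprocess.run(["git", "ls-remote", repo_url, "HEAD"],
                         check=True, capture_output=True, text=True).stdout
    return out.split()[0]

def update_rules_if_changed(repo_url, clone_dir):
    # Compare the last-seen hash with the remote's current one and
    # clone (or pull) only when they differ. Returns True if updated.
    clone_dir = pathlib.Path(clone_dir)
    marker = pathlib.Path(str(clone_dir) + ".last_commit")
    last_seen = marker.read_text().strip() if marker.exists() else ""
    head = remote_head(repo_url)
    if head == last_seen:
        return False                      # rules are already up to date
    if (clone_dir / ".git").exists():
        subprocess.run(["git", "pull"], cwd=clone_dir,
                       check=True, capture_output=True)
    else:
        subprocess.run(["git", "clone", repo_url, str(clone_dir)],
                       check=True, capture_output=True)
    marker.write_text(head)
    return True
```

A full clone fetches more than just the rules directory; for a large repo you could look at sparse checkout instead, but the hash comparison above is the core of the update check.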
How can I save a file generated by a Colab notebook directly to a GitHub repo?
Assume the notebook was opened from the GitHub repo, and that the same notebook can be saved back to that repo.
Google Colaboratory's integration with GitHub tends to be lacking; however, you can run bash commands from inside the notebook, which lets you access and modify any data it generates.
You'll need to generate a token on GitHub to grant access to the repository you want to save data to. See here for how to create a personal access token.
Once you have that token, you can run git commands from inside the notebook to clone the repository, add whatever files you need, and push them back up. This post here provides an in-depth overview of how to do it.
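In outline, that clone/commit/push sequence looks like the sketch below, runnable from a notebook cell. The token placeholder, repo URL, author identity, and file name are all illustrative, not real values:

```python
import pathlib
import subprocess

def git(args, cwd=None):
    # In a Colab cell you could equally run these as !git ... commands.
    subprocess.run(["git", *args], cwd=cwd, check=True,
                   capture_output=True, text=True)

def push_file_to_repo(repo_url, work_dir, file_name, contents):
    # Clone the repo, write the generated file into it, commit, and push.
    # repo_url would normally embed your personal access token, e.g.
    # "https://<TOKEN>@github.com/<user>/<repo>.git" (placeholders).
    work_dir = pathlib.Path(work_dir)
    git(["clone", repo_url, str(work_dir)])
    (work_dir / file_name).write_text(contents)
    git(["add", file_name], cwd=work_dir)
    git(["-c", "user.name=colab", "-c", "user.email=colab@example.com",
         "commit", "-m", f"Add {file_name} from notebook"], cwd=work_dir)
    git(["push", "origin", "HEAD"], cwd=work_dir)
```

If you share the notebook, keep the token out of the code itself; reading it from an environment variable or an input prompt is safer.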
That being said, this approach is somewhat cumbersome, and it might be preferable to configure Colab to work over an SSH connection. Once you do, you can mount a folder on the Colab instance to a folder on your local machine using sshfs. This lets you treat the Colab instance like any other folder on your machine: open it in your IDE, browse it in a file manager, and clone or update git repositories there. This goes into more depth on that.
These are the best options I was able to identify, and I hope one of them can be made to work for you.
I'm brand new to Github but experienced using Python and just wondered how to use both so that I can upload and export scripts for others to use! I have created a repository to start with! Any help would be great.
For a GitHub beginner like you, GitHub Desktop is a great way to start.
Download it and clone the repository you created, then move your files into the repository folder. In GitHub Desktop, add a quick summary of what you changed and click the "Commit" button. This records a snapshot in the history of your repository, but only on your local machine. Now click the "Push origin" button at the top of GitHub Desktop; this pushes your changes to the remote repository.

When your repository becomes more popular, people may want to contribute changes. To do so, they will open a pull request; you just need to approve it, and their changes will be merged into your repository.

I hope this helped. If anything about GitHub is unclear, feel free to ask. :)
I'm building an internal only flask-based application for work. I would like the program to update automatically from the master branch of my git repository. The application is running on a Linux based (Ubuntu server) VM. Is there a way to get the flask app to automatically pull changes and update from the master branch?
Edit: I don't want a python module updater from pip/git (most common search result)
Update: I found a module called micropython-ota-updator; the problem is that it seems to update only on reboots, while the software I'm building is meant to stay up continuously. My initial idea was to check for updates at an interval.
For anyone else having this issue:
I decided to dynamically generate GitHub links and build a function for cloning the repo. There are also some configuration files that get saved outside of the repo before cloning; once the repo is cloned, they are moved back into their proper places in the new repo. If anyone wants more information about this, let me know. I'm glad to share my discoveries!
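For reference, a rough reconstruction of that approach is sketched below (my own sketch, not the poster's code; the function names, config-file handling, and interval are illustrative):

```python
import pathlib
import shutil
import subprocess
import threading

def refresh_repo(repo_url, repo_dir, config_files):
    # Re-clone the repo while preserving config files that live inside it:
    # copy them out, wipe the old checkout, re-clone, then restore them.
    repo_dir = pathlib.Path(repo_dir)
    backup = repo_dir.parent / "config_backup"
    backup.mkdir(parents=True, exist_ok=True)
    for name in config_files:
        if (repo_dir / name).exists():
            shutil.copy2(repo_dir / name, backup / name)
    shutil.rmtree(repo_dir, ignore_errors=True)
    subprocess.run(["git", "clone", repo_url, str(repo_dir)],
                   check=True, capture_output=True)
    for name in config_files:
        if (backup / name).exists():
            shutil.copy2(backup / name, repo_dir / name)

def start_update_loop(repo_url, repo_dir, config_files, interval_s=3600):
    # Re-check at a fixed interval so the app can stay up continuously,
    # rather than only updating on reboot.
    def loop():
        refresh_repo(repo_url, repo_dir, config_files)
        timer = threading.Timer(interval_s, loop)
        timer.daemon = True               # don't block interpreter shutdown
        timer.start()
    loop()
```

A timer thread like this keeps the Flask process itself running; alternatively, a cron job plus a process manager restart would achieve the same effect outside the app.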
I have a repo with multiple sub-directories. Each sub-directory contains an app that's deployable to Heroku. To push the code off, I use git subtree, as suggested here.
This has worked well, and I want to continue in this direction by adding versioned APIs as deployable apps. The problem is that these APIs share substantial amounts of code.
Is there a way to have shared code in this setup, such that the apps are still deployable and the shared code is contained in the repo? The APIs are written in python, so something pip/virtualenv-specific would work.
Allegedly submodules would work, but I'd prefer to avoid them, as I've had bad experiences with them.
Never got an answer, so here's what I did.
I continued to use subtree instead of submodules. Then, for each sub-project, I created two requirements.txt files for pip: one for Heroku and one for running the sub-project locally. They look the same except for the shared lib.
In the Heroku-based requirements.txt, the shared lib is specified like this:
-e git+https://username:password@github.com/org/repo.git#egg=sharedlib&subdirectory=sharedlib
This tells pip to install the subdirectory sharedlib of repo from GitHub.
In the requirements.txt for running the project locally, the shared lib is specified like this:
-e ../sharedlib
This tells pip to install the shared lib from the local directory ../sharedlib inside the repo.
Hope this helps!