I am currently working on a server API. The API is used by a mobile app that is in development, so I've set up a dev server on a VPS. The API server uses Python/Django/Django REST Framework.
I work at two offices: one has my desktop workstation, and at the other I use a laptop. I commonly move from one office to the other during the day.
I have followed the many tutorials out there for setting up PyCharm to use a remote interpreter and to sync changes to a remote server. However, this setup is causing an issue for me: I have three different copies of my code that may or may not be in sync. Specifically, the server always has its code synced with whichever machine last uploaded files. For example:
1. Make a change to coolStuff.py on my desktop and save it. PyCharm pushes the saved changes to the server.
2. Get called to the other office. Drive over, open up my laptop, and start editing coolStuff.py without realizing I'm now working on an old copy, potentially overwriting the changes I made in step 1.
To get around this, I've been committing all of my changes after step 1, then pushing them up to the repo. I then need to pull the changes to the laptop. However, there are times that the push or pull is forgotten, and issues arise.
Is there a better workflow, or a tool within PyCharm, that can automate a two-way sync between a dev machine and the server? Alternatively, would it be bad practice to use cloud storage/sync to keep everything synced up (e.g. store my code directory on something like Dropbox or OneDrive)? I figure there has got to be a better way.
My apologies if this has been answered somewhere. I looked and did not find an answer to this rather specific question.
My environment:
PC: Windows 10 Home, Dual Monitors, Git Bash, PyCharm, static IP assigned
Raspberry Pi: GPS module (NMEA 0183 interface, connected via USB)
Developing a data logger using Python with the Flask and serial modules
Since my PC is much easier to develop on, I have been updating the code there, but I need to actually test it on the Raspberry Pi because it has the GPS module and is my target environment. I was pushing the changes via Git to Bitbucket and then pulling them back down to my Raspberry Pi. This was working OK, but it required me to commit and push after each update, and I was trying to eliminate a step by using my local network.
I added a remote named "Pi3" to my project on the PC, but when I tried to push the updates to my Pi I got a message saying "[remote rejected] (branch is currently checked out)". I could check out master, push the updates, and then re-check out my development branch, but by the time I do all that I might as well go through Bitbucket.
I then tried doing a pull or fetch from my Pi 3, but since I do not have an sshd (SSH server) running on my PC, it naturally failed. A quick look at setting up an SSH server on Windows 10 makes it look like a huge pain. I am willing to endure that pain for long-term gain, but I am unsure whether it would get me there even after I set it up.
I guess another solution is to script using scp to copy the files that I want but that requires maintenance as the project grows or I work on a different project. I can't be the only one that codes on one system and tests on another.
Any suggestions or feedback is appreciated.
It sounds like your workflow is based around Git, and you want new commits to be made available on the remote machine (the Pi). You might want to define a post-commit or post-receive hook to copy your latest code over to the Pi, preferably via rsync.
See the Git documentation on hooks (githooks) for more details.
Another option is to set up the remote repository on the Pi as a bare repository, so you can push to it without a checked-out branch getting in the way.
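For example, here is a minimal sketch of the bare-repository approach. It assumes a bare repo at /home/pi/project.git and a working copy at /home/pi/project on the Pi (both paths are placeholders): create the bare repo on the Pi with git init --bare, add it as a remote on the PC, and drop an executable post-receive hook into the bare repo's hooks/ directory. Git hooks can be any executable, so the hook itself can be written in Python:

```python
#!/usr/bin/env python3
# /home/pi/project.git/hooks/post-receive  (remember: chmod +x post-receive)
# Sketch of a post-receive hook: after every push to the bare repo, force-check
# the latest master out into a normal working directory on the Pi.
# The paths and branch name are assumptions -- adjust them to your layout,
# and make sure the work-tree directory already exists.
import subprocess

GIT_DIR = "/home/pi/project.git"   # the bare repository that receives pushes
WORK_TREE = "/home/pi/project"     # where the running code should live

subprocess.run(
    ["git", f"--git-dir={GIT_DIR}", f"--work-tree={WORK_TREE}",
     "checkout", "-f", "master"],
    check=True,
)
print(f"post-receive: deployed latest master to {WORK_TREE}")
```

With something like that in place, a plain git push Pi3 master from the PC both transfers the commits and refreshes the checked-out copy on the Pi, so there is no checked-out branch to reject the push and no separate pull step.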
There are a number of computers that I can access via SSH.
I was wondering whether I can keep a Python code base in a single location but execute part of it, or all of it, on each of these computers independently.
I could copy my code to each of these PCs and then execute it on each through SSH, but then it would be hard to make a change to the code, since I would have to repeat the change in every copy.
I was also wondering whether I could use them together like a cluster, since each of these PCs has a number of CPUs, though that is probably not possible, or at least not easy.
Two options quickly come to mind:
Use sshfs to mount the remote code location on the local PC and run it from there.
Use something like Git to store your configured code, and set up each of the PCs to pull the app from the remote code repo. This way you update/configure the code once and pull the update to each PC (see the sketch after the example below).
For example:
We use the second method. We have seven Raspberry Pi servers running various (isolated) tasks. One of the servers is a NAS box with a Git repo on it, where we store our configured code, and we use git pull or git clone (via SSH) to pull the app onto each local server. It works really well for us. Maybe it's an idea that could help you?
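As a rough sketch of that pull-on-each-machine idea, something like the following could run from whichever box holds the central repo. It assumes each PC already has the project cloned and passwordless SSH is set up; the hostnames, paths, and the worker.py script are all made up for illustration:

```python
#!/usr/bin/env python3
# Sketch: refresh the clone on each PC over SSH, then kick off part of the
# work there (one "shard" per machine). Hostnames, paths and worker.py are
# placeholders -- adjust them to your own setup.
import subprocess

HOSTS = ["pc1.local", "pc2.local", "pc3.local"]   # machines reachable via ssh
REPO = "~/project"                                # clone location on each host

for shard, host in enumerate(HOSTS):
    remote_cmd = (
        f"cd {REPO} && git pull --ff-only && "
        f"nohup python3 worker.py --shard {shard} > shard{shard}.log 2>&1 &"
    )
    subprocess.run(["ssh", host, remote_cmd], check=True)
    print(f"{host}: updated and started shard {shard}")
```

It is not a real cluster scheduler, but for "same code base, different slice of the work on each PC" it may be all you need.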
I have a question about updating my Django applications.
I'm working on my Django development server, and I would like to know how I could create a patch that will update the different Django projects installed on different servers.
For example:
On my dev server, I add a new query to my script. I would then like to apply this new query to all my virtual instances by updating the script, not by copying/pasting the modifications by hand. That might work for one query, but if I have 200 lines, copy/pasting would take far too long.
I want to launch a patch which makes all the updates without manually copying/pasting lines.
How is it possible to do that?
Thank you so much
You should probably consider migrating your project to a version control system.
Then, every time you change something in your local copy of the code and push the changes to the repository, you can fetch/rebase/pull those changes wherever you want (be it your server or another computer of yours), and your patches will be applied without any copy/pasting!
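As a rough illustration of what that looks like for a Django project, a small helper like the one below could apply the "patch" to every server in one go. It assumes each server already has a Git clone of the project and SSH key access; the hostnames, project path, and restart command are placeholders:

```python
#!/usr/bin/env python3
# Sketch: pull the latest code on each server, run migrations and restart the
# app. Hostnames, paths and the service name are placeholders; the restart
# step also assumes passwordless sudo on the servers.
import subprocess

SERVERS = ["app1.example.com", "app2.example.com"]
PROJECT_DIR = "/srv/myproject"

REMOTE_CMD = (
    f"cd {PROJECT_DIR} && "
    "git pull --ff-only && "
    "python3 manage.py migrate --noinput && "
    "sudo systemctl restart gunicorn"
)

for host in SERVERS:
    print(f"Updating {host} ...")
    subprocess.run(["ssh", host, REMOTE_CMD], check=True)
```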
Some Git hosting services to consider:
GitLab, which allows the creation of free private repositories
GitHub, the "old reliable", but with no free private repositories
Bitbucket (I don't use it myself, but it is widely used!)
Good luck :)
We have developed a desktop application using Python. The scenario is:
An internet connection is not available.
The desktop application would be installed on multiple PCs.
PC1 would enter data into a particular section (the other PCs, like PC2, PC3, etc., can't enter data into their respective sections until PC1 has entered its section).
When PC1 enters data in its section, we want it to be reflected on every other PC that has the desktop application installed.
How can we sync data across multiple desktop applications when there is no internet connectivity?
If we can achieve this over a local network (LAN), how would we do it?
The most common solution would be to have one machine act as the server. Each other machine would pass its data back to the server machine over the local network. The other client machines would be notified by the server that changes had been made. The server would control and store all the data.
You could either write a separate server application to run on a completely different machine, or you could have one of your client machines act as the 'host' and take on the server role itself.
Alternatively, you could work on a peer-to-peer solution, where each machine has a separate but synchronised copy of the data. That sounds harder to get right, to me.
One of the answers at Python Library/Framework for writing P2P applications recommends the Twisted framework for network applications.
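To make the first (server-based) idea a bit more concrete, here is a minimal sketch using only the Python standard library: clients on the LAN connect to one machine, send updates, and the server pushes each update to every other connected client. The port and the newline-delimited JSON "protocol" are made up for illustration, and a real version would also need the section-locking rules and persistent storage on the server:

```python
#!/usr/bin/env python3
# Minimal LAN sync server sketch (not production-grade): every update a client
# sends is relayed to all other connected clients.
import json
import socket
import threading

HOST, PORT = "0.0.0.0", 5151   # port number is arbitrary
clients = []                   # sockets of currently connected client PCs
lock = threading.Lock()

def handle(conn):
    with conn, conn.makefile("r") as reader:
        for line in reader:                    # one JSON message per line
            update = json.loads(line)
            print("received update:", update)
            with lock:
                for other in clients:
                    if other is not conn:
                        other.sendall(line.encode())   # push to the other PCs
    with lock:
        clients.remove(conn)

def main():
    with socket.create_server((HOST, PORT)) as srv:
        while True:
            conn, addr = srv.accept()
            print("client connected:", addr)
            with lock:
                clients.append(conn)
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
```

A client would open a connection with socket.create_connection(("server-ip", 5151)) and read/write one JSON object per line; Twisted (or asyncio) would give you a more robust version of the same pattern.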
I am running backbone servers behind my main server. The main server basically uses an iframe as a portal to the backbone.
I run the backbone at home on four old Apple TVs that I installed Linux on, plus one Raspberry Pi that runs the API. The API rotates loading the resource from Apple TV 1 to the next (2), to the next (3), to the next (4), and back to one (1).
The problem I face is that it's tedious to copy a file four times across the servers, especially when a new page is generated automatically and needs to be pushed out to all four servers.
Is there a way I can allow, say, an exception in Apache, or run SSH, and preferably a way to send these commands via Python?
Sorry, I'm much more of a CSS and HTML developer than a server admin.
Thanks in advance, cheers.
Syncing files can be done in a number of ways.
You can either copy the files around all the time to keep everything updated, or you can use a network filesystem to keep everything in one place.
If you decide to copy the files around, I would take a look at rsync.
If you must have Python, you can do some cool things with Fabric.
If you decide to go the network filesystem route, you should take a look at NFS.
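If you do end up driving the copy from Python, a rough sketch of the rsync approach might look like this. It assumes rsync is installed, the four backbone servers are reachable over SSH with key auth, and the hostnames and paths are placeholders:

```python
#!/usr/bin/env python3
# Sketch: push the local web root to each backbone server with rsync.
# Hostnames and paths are placeholders; assumes SSH key auth is configured.
import subprocess

SERVERS = ["appletv1.local", "appletv2.local", "appletv3.local", "appletv4.local"]
LOCAL_DIR = "/var/www/site/"    # trailing slash: copy the contents, not the dir
REMOTE_DIR = "/var/www/site/"

for host in SERVERS:
    subprocess.run(
        ["rsync", "-avz", "--delete", LOCAL_DIR, f"{host}:{REMOTE_DIR}"],
        check=True,
    )
    print(f"synced {host}")
```

Fabric can wrap the same idea if you also want to run remote commands afterwards (restarting Apache after the copy, for example), while NFS removes the copy step entirely at the cost of a single shared filesystem.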