VSCode Git integration is just awful. Is there a better way? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
I am a software architect with 23 years' commercial experience in C++/C#/Python/Cython. I was forced into using VSCode rather than PyCharm because I had to debug some particularly tricky Django templates. I use several machines, one of which is a production test machine. I changed one line of code while logged on there and pushed it because it was critical. Now my Git is totally screwed. I have worked with SourceSafe and ClearCase over the years, and I do not have a problem with Git; however, VSCode has rendered my setup useless.
I now have to save log data on my production test box, wipe the prod folder and reinstall all the code from scratch from Git. This is about a day's work and should not be necessary.
I was warming to VSCode, but I will change the way I work until VSCode/Git integration improves enormously. I will code using VSCode, but at any point I want to pull, push, fetch, or merge, I will switch to the Community Edition of PyCharm. PyCharm handles source control like a dream, but I don't want to do my day-to-day development with it.
Anybody else out there having similar problems? I simply don't have the time to write a VSCode extension to handle Git properly. I have an end-of-September deadline and I simply can't have this happen again.
Any feedback is most welcome.
/Luke

For the most part, I just use git from the command line.
However, for large projects, or for particularly tricky merge conflicts and other problems, you could use a third-party git client such as SourceTree (what I use) or GitKraken. These offer a host of features with nice interfaces.

Related

Integrating Docker into the existing TFS-based infrastructure (not web-apps) [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I need some general advice here. In our (large) company we have an already established test and build infrastructure that is based on MS TFS to build and test our products. Quick facts:
mostly C++ and C# code
system-level software
relying on lots of pre-configured build agents
supported by folks with TFS expertise
For a new Python project (which is not a web app either) we need to set up CI. My idea is to piggy-back on the existing unified infrastructure and take advantage of Docker containers. Here is my idea: on every check-in to the Python project's repository, launch a dedicated TFS build which
pulls the code
builds a Docker container or image
runs the tests inside it
and...?
And here is the question: what is the best way to interact with the outside world? How would you advise pushing results out if we're not planning to build a web app? Ideally I would like to keep only the failed tests' artifacts, and not keep all those containers from every check-in.
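One common shape for the build step is: build a per-commit image, run the tests in a throwaway container with the results written to a host-mounted directory, then delete the image. A minimal sketch of the idea, driving the plain `docker` CLI from Python; the image name, artifact path, and `pytest` invocation are all illustrative assumptions, not part of any real pipeline:

```python
import subprocess

def ci_commands(commit_sha, image_repo="myproduct-tests"):
    """Build the docker CLI argv lists for one CI run.

    image_repo, the artifact path, and the pytest call are illustrative.
    """
    tag = f"{image_repo}:{commit_sha}"
    return [
        # Build an image from the freshly pulled working copy.
        ["docker", "build", "-t", tag, "."],
        # Run the tests inside it; --rm discards the container afterwards,
        # while the volume mount keeps the JUnit results on the build agent.
        ["docker", "run", "--rm", "-v", "/ci/artifacts:/artifacts", tag,
         "pytest", "--junitxml=/artifacts/results.xml"],
        # Drop the per-commit image so images don't pile up on the agent.
        ["docker", "rmi", tag],
    ]

def run_ci(commit_sha):
    for cmd in ci_commands(commit_sha):
        subprocess.run(cmd, check=True)  # raises on the first failing step
```

A side effect of `check=True` is that a failing test run stops before the `docker rmi`, so the image for a failed build survives for inspection while green builds clean up after themselves, which is roughly the "keep only failures" behaviour asked for.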

How best to save python code? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers.
Closed 8 years ago.
I am learning with an online interpreter and would like to save bits of code in my Google Drive. Is there a good way to do this so I can easily copy and paste my work back into a webpage? If there is a better way to save it that isn't in the cloud, I would be interested in that as well.
The best way is a version control system; git and mercurial, for example, are popular choices among Python users.
You can then host your code for free on github or bitbucket, amongst other services.
If you have (or download) a text editor such as Sublime Text, you can paste your code into it, set the syntax so your code is highlighted, and save it from there. If you want something online, however, use GitHub or, for simplicity, Dropbox.
If you'd like to use Google Drive, just save your .py files and drop them into Google Drive. You can then edit them using the ShiftEdit extension (https://shiftedit.net/). But GitHub Gists or a similar product is better.
If you would like to use a VCS, you could use git with GitLab. Just head over to http://www.gitlab.com, create your personal account, and get unlimited private repositories.
Something like gist.github.com or jsfiddle.net would work, though they're both cloud-ish. You could always create a local directory and use git or hg or some other distributed version control system to manage your code. That would be a good learning experience too.

Setting up local environment for Python [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I'm trying to move from PHP to Python. I'm looking to change because Python seems a much more versatile language, able to work across a range of scenarios. The sorts of things I plan to use it for range from web-app development (using Django) to NLP, machine learning, and automation using mechanize.
One of the things I really liked about PHP was MAMP, the way it creates an htdocs folder, a localhost:8888 url, and a MySQL server, with pretty much 0 effort.
Is there something similar with Python ? I'm not necessarily looking for a GUI like MAMP (although that would be good) - what are the other options for setting up a local environment?
Python excels in this area, but as with most tools, exactly what you do depends on what you want. In particular, you certainly want virtualenv, Python's configuration- and dependency-isolation tool.
You may also want a development-configuration management tool such as buildout, but that is more controversial as there are many other great, language-agnostic tools that overlap. (For example, you may want to set up your environment using Vagrant and leave your host OS behind.)
Neither virtualenv nor buildout will set up Apache for you out of the box, but you do have the option of installing Django, Zope, or many other Python frameworks and applications with buildout recipes. There are recipes for Apache too, but most Python web development that I know of is agnostic of the httpd, so you might end up not wanting it.
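For the virtualenv part specifically, modern Python (3.3+) ships the same idea in the standard-library `venv` module, so creating an isolated environment is a few lines; a minimal sketch (the environment name is arbitrary):

```python
import sys
import venv
from pathlib import Path

# Create an isolated environment with its own interpreter and site-packages.
# with_pip=False keeps this fast and offline; in practice you would use
# with_pip=True so you can run `env_python -m pip install django` afterwards.
env_dir = Path("myproject-env")
venv.create(env_dir, with_pip=False)

# The per-environment interpreter lives under bin/ (Scripts\ on Windows).
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
env_python = env_dir / bin_dir / "python"
```

Activating the environment (`source myproject-env/bin/activate`) just puts that `bin` directory first on your PATH; invoking `env_python` directly works equally well from scripts.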

how is docker better if tools like fedora's devassistant and virtualenv exist? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I have been trying to understand docker.io's LXC containers for a while. But consider Fedora's devassistant tool together with virtualenv: virtualenv does the isolation work, and devassistant downloads all the needed dependencies by interpreting the setup configuration file. So with roughly two commands on the terminal we can set up a developer environment for OpenStack, or for any large multi-repository project, within minutes, using the right tool for the right job. How is Docker different?
virtualenv only isolates Python environments; it doesn't do process isolation.
I'm not familiar with Fedora's devassistant, but I'm pretty sure its changes are system-wide. What if on the same server you want to run Python, Ruby, Java, and Node.js apps? There might be conflicting requirements at the system level.
With Docker this is easy, because each app has its own container, you can put whatever you want in there, and the containers don't interfere with each other. Think of Docker like this: it gives each application its own VM (the container) to live in. It is similar to setting up a physical server and installing a separate VirtualBox machine on it for each application, but it is much more lightweight, and you can run it on both physical and virtual hosts.
You can also move the docker containers from one docker compatible server to another really easily.

As part of development, I am committing to github and pulling down and executing elsewhere. It feels wrong [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I have a standard-ish setup. Call it three servers - www, app and db - all fed from fabric scripts, with the whole thing on GitHub.
I have a local laptop with the repo clone. I change a file locally, and push it to github then deploy using jenkins - which pulls from github and does its business. The problem here is I can put a dozen rubbish commits up till I manage to fix all my typos.
It's not so much the round trip to GitHub that matters, but the sheer number of commits. I cannot squash them, as they have been pushed. It looks ugly. It works, sure, but it is ugly.
I don't think I can edit on the servers directly. The files are spread out a lot, and I cannot make each directory on three servers a clone of GitHub and hope to keep things sane.
And trying to write scripts that will sync the servers with my local repo is insane; the fabric files took long enough.
I cannot easily git pull from Jenkins, because I still have to commit to have Jenkins pull, and we still get ugly, ugly commit logs.
I cannot see a graceful way to do this. Ideas, anyone?
The solution is quite simple: make cleaner commits (fix typos before committing by running and testing locally, and only commit changes that belong together, not every tiny edit). It's a bit odd that you don't take the time to fix typos locally but wish to reduce the number of commits by some other means.
The solution is to not use github / jenkins to deploy to the servers.
The servers should be seen as part of the 'local' deployment (local being pre-commit)
So use the fab files directly, from my laptop.
That was harder because of pre-processing occurring on Jenkins, but that is replicable.
So, I shall take Jeff Atwood's advice here:
embrace the suck, in public.
Well I certainly sucked at that - but hey I learnt.
Will put brain in the right way tomorrow.
