pycharm can't complete remote interpreter setup for Docker - python

I'm new to Docker. I'm using Docker & docker-compose, going through a flask tutorial. The base docker image is python 2.7 slim.
It's running on Linux, Docker 1.11.2.
The application is working fine.
I want to get pycharm pro connecting to the remote interpreter, something I have never done before.
I followed the instructions for docker-compose. Initially it was failing because it could not connect to port 2376. I added this port to docker-compose.yml and the error went away.
However, trying to save the configuration now stalls/hangs with a dialog 'Getting Remote Interpreter Version'. This never completes. Also, I can't quit pycharm. This happens in Pycharm 2016.2 and 2016.3 EAP (2nd).
The help says "SFTP support is required for copying helpers to the server".
Does this mean I need to do something?

I'm not using docker-machine
The problem was that TCP access to the Docker API is not enabled by default under Ubuntu 16.04.
There are suggestions to enable TCP/IP access.
However, JetBrains gave me the simplest solution:
If you are using Linux it is most likely that Docker installed with
its default setup and Docker is expecting to be used through UNIX
domain file socket /var/run/docker.sock. And you should specify
unix:///var/run/docker.sock in the API URL field. Please comment
whether it helps!
This suggestion worked on my Ubuntu 16.04-derived distribution.
This goes into the Docker entry in PyCharm preferences under Build, Execution, Deployment.
You can also edit this while setting up a remote interpreter, but only by making a new Docker entry.
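A quick sanity check of my own (assuming the default Docker install) is to query the daemon over that socket before touching PyCharm:
docker -H unix:///var/run/docker.sock version
If both client and server versions are printed, the same unix:///var/run/docker.sock URL should work in the API URL field.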
TCP/IP Method
This method works if you want TCP/IP access, but this is a security risk. The socket approach is better, which is probably why it is the default.
https://coreos.com/os/docs/latest/customizing-docker.html
Customizing docker
The Docker systemd unit can be customized by overriding the unit that
ships with the default CoreOS settings. Common use-cases for doing
this are covered below.
Enable the remote API on a new socket
Create a file called /etc/systemd/system/docker-tcp.socket to make
Docker available on a TCP socket on port 2375.
[Unit]
Description=Docker Socket for the API
[Socket]
ListenStream=2375
BindIPv6Only=both
Service=docker.service
[Install]
WantedBy=sockets.target
Then enable this new socket:
systemctl enable docker-tcp.socket
systemctl stop docker
systemctl start docker-tcp.socket
systemctl start docker
Test that it’s working:
docker -H tcp://127.0.0.1:2375 ps
Once I thought to search for Ubuntu 16.04, I came across simpler solutions, but I did not test them.
For instance:
https://www.ivankrizsan.se/2016/05/18/enabling-docker-remote-api-on-ubuntu-16-04/
Edit the file /lib/systemd/system/docker.service
Modify the line that starts with ExecStart to look like this:
ExecStart=/usr/bin/docker daemon -H fd:// -H tcp://0.0.0.0:2375
Where my addition is the “-H tcp://0.0.0.0:2375” part. Save the
modified file. Restart the Docker service:
sudo service docker restart
Test that the Docker API is indeed accessible:
curl http://localhost:2375/version
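One caveat of my own (not from the linked article): because docker.service is a systemd unit file, the edit is usually not picked up until the unit definitions are reloaded, so you may need to run
sudo systemctl daemon-reload
before restarting Docker for the curl test to succeed.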

I - docker-compose up
I think PyCharm will run docker-compose up; have you tried running this command first in your terminal (from the directory containing your docker-compose.yml)?
If any errors occur, you will get more information in your terminal.
II - pycharm docker configuration
Otherwise it could be due to your docker machine configuration in PyCharm.
What I do to configure my machine and to be sure this one is correctly configured:
1 - run docker-machine ls in your shell
2 - copy paste the url without tcp://
3 - go to PyCharm preferences -> Build, Execution, Deployment -> Docker -> + to create a new server, fill the server name field
4 - paste the previously copied url, prefixing it with https://
5 - fill the path of your machine certificates folder
6 - tick Import credentials from Docker Machine
7 - click Detect -> your machine should appear in the selection list
8 - save this server
9 - select this server when configuring your remote interpreter, from PyCharm Preferences -> Project -> Project Interpreter -> gear icon -> Add Remote -> Docker or Docker Compose
10 - you should be able to select a service name (see the sketch after this list)
11 - save your new interpreter
12 - try running your test twice; sometimes it can take time to initialize
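If no service name appears in step 10, a quick sanity check (my own, assuming a reasonably recent docker-compose) is to ask Compose which services it sees, from the directory containing docker-compose.yml:
docker-compose config --services
The names it prints are the ones PyCharm should offer in the selection list.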

Related

How do I build docker container behind company proxy?

I am trying to build a simple Python-based Docker container. I am working at a corporation, behind a proxy, on Windows 10. Below is my Dockerfile:
FROM python:3.7.9-alpine3.11
WORKDIR ./
RUN pip install --proxy=http://XXXXXXX:8080 -r requirements.txt
COPY . /
EXPOSE 5000
CMD ["python", "application.py"]
But it's giving me the following error in cmd:
"failed to solve with frontend dockerfile.v0: failed to build LLB: failed to load cache key: failed to do request: Head https://registry-1.docker.io/v2/library/python/manifests/3.7.9-alpine3.11: proxyconnect tcp: EOF"
I've tried to figure out how to configure docker's proxy, using many links but they keep referring to a file "/etc/sysconfig/docker" which I cannot find anywhere under Windows 10 or maybe I'm not looking at the right place.
Also I'm not sure this is only a proxy issue since I've seen people running into this issue without using a proxy.
I would highly appreciate anyone's help. Working at this corporation has already made me spend >10 hours doing something that took me 10 minutes on my Mac... :(
Thank you
You're talking about the most basic of Docker functionality. Normally, it has to connect to the Docker Hub on the internet to get base images. If you can't make this work with your proxy, you can either
preload your local cache with the necessary images
set up a Docker registry inside your firewall that contains all the images you'll need
Obviously, the easiest thing, probably by far, would be to figure out how to get Docker to connect to Docker Hub through your proxy.
In terms of getting Docker on Windows to work with your proxy, might this help? - https://learn.microsoft.com/en-us/virtualization/windowscontainers/manage-docker/configure-docker-daemon
Here's what it says about configuring a proxy:
To set proxy information for docker search and docker pull, create a Windows environment variable with the name HTTP_PROXY or HTTPS_PROXY, and a value of the proxy information. This can be completed with PowerShell using a command similar to this:
In PowerShell:
[Environment]::SetEnvironmentVariable("HTTP_PROXY", "http://username:password@proxy:port/", [EnvironmentVariableTarget]::Machine)
Once the variable has been set, restart the Docker service.
In PowerShell:
Restart-Service docker
For more information, see Windows Configuration File on Docker.com.
I've also seen it mentioned that Docker for Windows allows you to set proxy parameters in its configuration GUI.
There is no need to pass proxy information in the Dockerfile.
There are predefined ARGs which can be used for this purpose.
HTTP_PROXY
HTTPS_PROXY
FTP_PROXY
You can pass the details when building the image
https://docs.docker.com/engine/reference/builder/#predefined-args
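For example, a hedged sketch using the proxy from the question (the image tag myapp is only a placeholder):
docker build --build-arg HTTP_PROXY=http://XXXXXXX:8080 --build-arg HTTPS_PROXY=http://XXXXXXX:8080 -t myapp .
Since these are predefined build args, no ARG lines are needed in the Dockerfile, and tools invoked during the build (such as pip) should pick the proxy up from the environment, making the explicit --proxy flag unnecessary.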
I do not see any run time dependency of your container on the Internet. So running the container will work without an issue.

How to Deploy Flask app on AWS EC2 Linux/UNIX instance

How do I deploy a Flask app on an AWS Linux/UNIX EC2 instance, using either
1> Gunicorn, or
2> an Apache server?
It's absolutely possible, but it's not the quickest process! You'll probably want to use Docker to containerize your Flask app before you deploy it as well, so it boils down to these steps:
Install Docker (if you don't have it), build an image for your application, and make sure you can start the container locally with the app working as intended. You'll also need to write a Dockerfile that sets your runtime, copies all your directories, and exposes port 80 (this will be handy for AWS later).
The command to build an image is docker build -t your-app-name .
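Once it builds, a minimal local run sketch (my addition; the 80:80 mapping assumes your app listens on port 80 inside the container, as suggested above) would be:
docker run -d -p 80:80 your-app-name
If the app answers on http://localhost, the image is ready to deploy.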
Once you're ready to deploy the container, head over to AWS and launch an EC2 instance with the Amazon Linux 2 AMI. You'll be required to create a security key (.pem file) and move it somewhere on your computer. This acts as your credential to log in to your instance. This is where things differ depending on your OS. On Mac, you need to cd into the directory where the key is and modify its permissions by running chmod 400 key-file-name.pem. On Windows, you have to go into the file's security settings and make sure only your account (ideally the owner of the computer) can use it, basically setting it to private. At this point, you can connect to your instance from your command prompt with the command AWS gives you when you click Connect to instance on the EC2 dashboard.
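For reference, that connect command is usually an ssh invocation along these lines (a sketch; ec2-user is the default login for Amazon Linux, and the DNS placeholder is hypothetical):
ssh -i key-file-name.pem ec2-user@<your-instance-public-dns>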
Once you're logged in, you can configure your instance to install docker and let you use it by running the following:
sudo amazon-linux-extras install docker
sudo yum install docker
sudo service docker start
sudo usermod -a -G docker ec2-user
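One step that is easy to miss (my addition, not part of the original list): the usermod group change only takes effect in new login sessions, so either log out and back in or start a shell with the docker group before using docker without sudo:
newgrp docker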
Great, now you need to copy all your files from your local directory to your instance using SCP (secure copy protocol). The long way is to use this command for each file: scp -i /path/my-key-pair.pem file-to-copy ec2-user@public-dns-name:/home/ec2-user. Another route is to install FileZilla or WinSCP to speed up this process.
Now that all your files are in the instance, build the docker container using the same command from the first step and activate it. If you go to the URL that AWS gives you, your app should be running on AWS!
Here's a reference I used when I did this for the first time; it might be helpful for you to look at too.

Pycharm Remote interpreter on Docker remote: [Errno 2] No such file or directory

As specified in the title I am trying to use Pycharm Professional (2018.2) with a python remote interpreter in a Docker machine hosted on a remote server in my LAN. I created a very simple example by following the help 'https://www.jetbrains.com/help/pycharm/using-docker-as-a-remote-interpreter.html'.
Pycharm 2018.2 is installed on a LAN pc (192.168.1.10) on a debian distro;
Docker is installed on a LAN debian server (192.168.1.22)
I was able to configure Docker as a remote interpreter and to connect to the Docker service through the PyCharm tool, but when I try to run (or debug) main.py in the Docker container I always get this:
37073edcd9d2:python -u /opt/project/main.py (null): can't open file '/opt/project/main.py': [Errno 2] No such file or directory
Process finished with exit code 2
The execution is certainly done in the remote Docker container but it seems that the file to be executed is not found. I manually attached the local volume as described on various blogs with all possible variations but I always get the same error.
These are some specifications of my configuration (screenshots): docker tool setting; project interpreter setting; Run/Debug Configuration; docker container setting with volume mapping in the Run/Debug Configuration.
Am I missing something?
Thanks. Any help is appreciated!
The problem lies in a limitation of how PyCharm manages a Docker daemon on a remote host under the hood. The volume mapping entered in the Run/Debug configuration is interpreted as a local path on the Docker host, so in this case it must be a path that exists on the remote server. For now, the only option is to make the local project folder available on the remote Docker host, for example by sharing it over SSHFS or NFS.
So:
1 - I shared the PyCharm project folder (local machine IP 192.168.1.10) using NFS;
2 - I mounted the shared folder on the server host (server IP 192.168.1.22): mount -t nfs 192.168.1.10:/home/user/PythonProjects /home/ext-user/mnt/projects
3 - in the Run/Debug configuration of PyCharm I mapped the volumes using the path mounted on the remote server (a command sketch of steps 1 and 2 follows below).
After that, the program runs without any errors.
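A rough sketch of that sharing setup, assuming an NFS server package (e.g. nfs-kernel-server on Debian-based systems) is installed on the PyCharm machine and that common export options are acceptable:
On the PyCharm machine (192.168.1.10):
echo "/home/user/PythonProjects 192.168.1.22(rw,sync,no_subtree_check)" | sudo tee -a /etc/exports
sudo exportfs -ra
On the Docker host (192.168.1.22):
sudo mount -t nfs 192.168.1.10:/home/user/PythonProjects /home/ext-user/mnt/projects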
These are some specifications of my new configuration (screenshots): Run/Debug Configuration; docker container setting with volume mapping in the Run/Debug Configuration.
I hope the solution can be useful to other people. I also hope that there are better solutions than mine :-)

How do I run pycharm within my docker container?

I'm very new to docker. I want to build my python application within a docker container. As I build the application I want to be testing / running it in Pycharm and in the container I build.
How do I connect Pycharm pro to a specific container or image (either python or Anaconda)?
When I create a project, click Pure Python, and then Add Remote, then click Docker, I get the result shown in the attached screenshot.
I'm running on Mac OS X El Capitan (10.11.6) with Docker version 1.12.1 and Pycharm Pro 2016.2.3
Docker-for-mac only supports connections over the /var/run/docker.sock socket that is listening on your OSX host.
If you try to add this to pycharm, you'll get the following message:
"Cannot connect: java.lang.ExceptionInInitializerError, caused by: java.lang.IllegalStateException: Only supported on Linux"
So PyCharm really only wants to connect to a docker daemon over a TCP socket, and has support for the recommended TLS protection of that socket. The Certificates folder defaults to the certificate folder for the default docker-machine machine, "default".
It is possible to implement a workaround to expose Docker for Mac via a TCP server if you have socat installed on your OSX machine.
On my system, I have it installed via homebrew:
brew install socat
Now that's installed, I can run socat with the following parameters:
socat TCP-LISTEN:2376,reuseaddr,fork,bind=127.0.0.1 UNIX-CLIENT:/var/run/docker.sock
WARNING: this will make it possible for any process running as any user on your whole mac to access your docker-for-mac. The unix socket is protected by user permissions, while 127.0.0.1 is not.
This socat command tells it to listen on 127.0.0.1:2376 and pass connections on to /var/run/docker.sock. The reuseaddr and fork options allow this one command to service multiple connections instead of just the very first one.
I can test that socat is working by running the following command:
docker -H tcp://127.0.0.1:2376 ps
If you get a successful docker ps response back, then you know that the socat process is doing its job.
Now, in the PyCharm window, I can put the same tcp://127.0.0.1:2376 in place. I should get a "Connection successful" message back:
This workaround will require that socat command to be running any time you want to use docker from PyCharm.
If you wanted to do the same thing, but with TLS, you could set up certificates and make them available for both pycharm and socat, and use socat's OPENSSL-LISTEN instead of the TCP-LISTEN feature. I won't go into the details on that for this answer though.

Pycharm 4.5.3 remote console 'cannot connect to remote process'

Starting today for no discernible reason, Pycharm's remote console function will not connect with my remote server.
All other functions are working as normal: SSH session, deployment config, skeletons update, file sync, etc.
I am running PyCharm 4.5.3 on Windows 7 x64 against a remote server running CentOS 6.5 x64 on AWS; note that this setup has been working fine for months until today.
The following output appears in the console window when remote console is launched, it takes a minute or so to timeout:
sftp://user@FQDN:22/home/user/Envs/lab1/bin/python2.7 -u /home/user/.pycharm_helpers/pydev/pydevconsole.py 0 0
Couldn't connect to console process.
Process finished with exit code -1
Unhelpful log output (C:\Users\user\.PyCharm40\system\log\idea.txt):
2015-07-09 17:15:07,910 [ 236325] INFO - esdk.transport.JschExecProcess - Executing ssh command: env "PYTHONIOENCODING"="UTF-8" "JETBRAINS_REMOTE_RUN"="1" "IPYTHONENABLE"="True" "PYTHONUNBUFFERED"="1" /home/user/Envs/lab1/bin/python2.7 -u /home/user/.pycharm_helpers/pydev/pydevconsole.py 0 0 for user@FQDN:22
The following troubleshooting steps have yielded nothing: Workstation / Server, reboot, Fresh Virtualenv, Different version of Python, reinstall of iPython, uninstall of iPython, reset of console / deployment configuration, connecting from a different workstation running same version of Pycharm, upgrade Pycharm from 4.5.2 to 4.5.3.
Suggestions for further troubleshooting steps gladly welcome while I wait for Jetbrains support to get around to my ticket!
The problem was within the security configuration on AWS.
My AWS instance has a friendly FQDN to go with the unfriendly AWS internal name, which appears to cause some havoc with the way their NAT works if you don't have a rule allowing traffic from the IP bound to the FQDN back into the server.
I created the issue yesterday when my office router reset to a new IP - I must've overwritten the recursive rule with the rule allowing my new office IP into the instance.
So, add a rule allowing the server public IP for all traffic and Pycharm connects again when pointed at the FQDN.
