I am writing a Python FTP server and client. I need to run the FTP server on a remote machine that is on the same network as my local machine, run the FTP client on the local machine, and connect the client to that server.
Please help!
This is my ftpserver.py:
from pyftpdlib.servers import FTPServer
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler

authorizer = DummyAuthorizer()
authorizer.add_user("lokesh", "123", "current_dir", perm="elradfmw")
authorizer.add_anonymous("current_dir", perm="elradfmw")

handler = FTPHandler
handler.authorizer = authorizer

server = FTPServer(("localhost", 8080), handler)
server.serve_forever()
This is my ftpclient.py that needs to connect to the above server:
from ftplib import FTP

host = 'localhost'
port = 8080

ftp = FTP()
ftp.connect(host, port)
ftp.login()
print(ftp.getwelcome())
print('Current Directory', ftp.pwd())
ftp.dir()
ftp.quit()
When I tested the server and client on the same machine, everything worked. But when I ran the same server on another machine and tried to connect with my client, I got this error:
error: [Errno 10061] No connection could be made because the target
machine actively refused it
If you run the client on another machine, you have to connect to the host of the server, not to "localhost":
host='<server_host>'
Run ipconfig on your Windows server machine and look for "IPv4 address".
Also make sure the port in your ftpclient file matches the port the server listens on (8080 here).
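There is a second problem hiding in the server code itself: it binds to ("localhost", 8080), so the OS will only accept connections from the server's own machine, and a remote client gets "connection refused". A minimal sketch of the fix, binding to all interfaces (only the bind address differs from the question's code):

from pyftpdlib.servers import FTPServer
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler

authorizer = DummyAuthorizer()
authorizer.add_user("lokesh", "123", "current_dir", perm="elradfmw")

handler = FTPHandler
handler.authorizer = authorizer

# "0.0.0.0" listens on all network interfaces, so remote clients can connect
server = FTPServer(("0.0.0.0", 8080), handler)
server.serve_forever()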
Here is what I am trying to achieve:
SSH into an EC2 instance, Node1.
A public IP is available for Node1.
I am using a .pem file to create the Connection with Node1.
From Node1, SSH into localhost on port 2022: ssh admin@localhost -p 2022
Now execute a command while inside localhost.
Here is the code snippet I am using:
from fabric2 import Connection

jump_host_ip = "A.B.C.D"
user = "root_user"
pem_key = "example.pem"

with Connection(jump_host_ip, user=user, connect_kwargs={"key_filename": pem_key}) as jump_host:
    with Connection('localhost', user='dummy_user', port=2022,
                    connect_kwargs={'password': 'password'}, gateway=jump_host) as local_host:
        local_host.run('ls -la')
This code is hosted on another EC2 server. When executed from that EC2 server, it throws the following exception:
paramiko.ssh_exception.AuthenticationException: Authentication failed.
But this code works when executed from a local machine (not from the EC2 server).
Is it possible that EC2 is blocking the connection to localhost through the gateway?
If yes, what should the fix be?
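For diagnosis, the same two-hop connection can be reproduced with paramiko directly, which is roughly what fabric2 does with a gateway under the hood. A sketch using the question's placeholder values; if this raises the same AuthenticationException on the EC2 server, the problem is in the environment rather than in fabric2:

import paramiko

# First hop: the jump host (placeholder values from the question)
jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect("A.B.C.D", username="root_user", key_filename="example.pem")

# Tunnel a channel from the jump host to localhost:2022, as the gateway would
channel = jump.get_transport().open_channel(
    "direct-tcpip", ("localhost", 2022), ("", 0))

# Second hop: authenticate over that channel
target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect("localhost", port=2022, username="dummy_user",
               password="password", sock=channel)

stdin, stdout, stderr = target.exec_command("ls -la")
print(stdout.read().decode())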
I have the IP address of my Apache server, running on Ubuntu 18.04. I have installed MongoDB on the server. I have a Python script to connect to the database, but it doesn't work. I have an SSH connection without authentication for the database.
I have already tried SSHTunnelForwarder, but without success.
from sshtunnel import SSHTunnelForwarder
from pymongo import MongoClient

MONGO_HOST = 'MY_IP_ADDRESS'

server = SSHTunnelForwarder(
    MONGO_HOST,
    remote_bind_address=('127.0.0.1', 27017)
)
server.start()

client = MongoClient('127.0.0.1', server.local_bind_port)
db = client.myDatabaseName
I have also tried
client = MongoClient('mongodb://MY_IP_ADDRESS/')
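One likely gap: SSHTunnelForwarder needs the SSH credentials you normally use to log in to the server; without them the tunnel never comes up, and the MongoClient call then fails. A minimal sketch under that assumption; ssh_username and ssh_pkey are hypothetical placeholders, not values from the question:

from sshtunnel import SSHTunnelForwarder
from pymongo import MongoClient

MONGO_HOST = 'MY_IP_ADDRESS'

server = SSHTunnelForwarder(
    MONGO_HOST,
    ssh_username='ubuntu',          # placeholder: your SSH login user
    ssh_pkey='/path/to/key.pem',    # placeholder: your SSH private key
    remote_bind_address=('127.0.0.1', 27017)
)
server.start()

# Connect through the local end of the tunnel, not to MY_IP_ADDRESS directly
client = MongoClient('127.0.0.1', server.local_bind_port)
db = client.myDatabaseName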
I have dockerized a simple OPC UA server. When I run it locally, I can connect to the server without problems. However, when I run the server in a Docker container, the client refuses to connect. Furthermore, when I try to set the server's endpoint to opc.tcp://localhost:4840, the server will not bind to the address when it is run inside a container; the endpoint opc.tcp://127.0.0.1:4840 must be used instead. This is not an issue when running the server locally. The library used to implement the server is https://github.com/FreeOpcUa/python-opcua and the client is https://github.com/FreeOpcUa/opcua-client-gui.
I have tried to set different endpoints without any luck.
The server implementation is:
from opcua import Server, ua

server = Server()
server.set_endpoint('opc.tcp://127.0.0.1:4840')
server.set_security_policy([ua.SecurityPolicyType.NoSecurity])
server.start()

try:
    while True:
        i = 1
finally:
    server.stop()
The Dockerfile exposes the port with EXPOSE 4840. The Docker run command is:
docker run --rm --name server -p 4840:4840 opcua
Your server in the container is only listening on 127.0.0.1, and hence only accepting connections from inside the container:
server.set_endpoint('opc.tcp://127.0.0.1:4840')
You should listen on all interfaces instead, such as:
server.set_endpoint('opc.tcp://0.0.0.0:4840')
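Putting it together, a minimal corrected version of the question's server; the endpoint is the only functional change, and the time.sleep simply avoids busy-waiting in the keep-alive loop:

from opcua import Server, ua
import time

server = Server()
# Bind to all interfaces so the port published by docker run -p is reachable
server.set_endpoint('opc.tcp://0.0.0.0:4840')
server.set_security_policy([ua.SecurityPolicyType.NoSecurity])
server.start()

try:
    while True:
        time.sleep(1)  # keep the server alive without burning CPU
finally:
    server.stop()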
Alternatively, you can use --network host in your docker run command, since localhost in the container is not your host.
I have a client.py sending data to (server_ip, 60000). The server-side code, which receives the data, sits inside a Docker container. The code is in Python and the server runs on macOS. Before migrating to Docker, I could transmit data successfully. After dockerizing the server.py code, the bind happens, but client.py fails at connection.sendall(out) with:
socket.error: [Errno 32] Broken pipe
Here is my docker-compose.yml:
version: '2'
services:
  server:
    build: ./server
    ports:
      - server_IP:60000:60000
and here is the binding inside server.py:
port = 60000
host = "localhost"
Any idea why this happens?
Well, I fixed it by setting the host on the server side to 0.0.0.0 inside Docker and removing and rebuilding the image. All works fine now.
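For completeness, a sketch of the corrected bind in server.py; only host and port appear in the question, so the surrounding socket code is assumed:

import socket

host = "0.0.0.0"  # listen on all interfaces so Docker's port mapping can reach us
port = 60000

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind((host, port))
sock.listen(1)

connection, address = sock.accept()  # accept the connection from client.py
data = connection.recv(4096)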
I am not able to connect to PostgreSQL remotely using Python and psycopg2:
Here is my code.
>>> import psycopg2
>>> conn_string = "host='localhost' dbname='mydb' user='postgres'"
>>> print "Connecting to database\n ->%s" % (conn_string)
Connecting to database
->host='localhost' dbname='mydb' user='postgres'
>>> conn = psycopg2.connect(conn_string)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/tools/lib/python2.7/site-packages/psycopg2/__init__.py", line 164, in connect
conn = _connect(dsn, connection_factory=connection_factory, async=async)
psycopg2.OperationalError: could not connect to server: Connection refused
Is the server running on host "localhost" (::1) and accepting
TCP/IP connections on port 5432?
could not connect to server: Connection refused
Is the server running on host "localhost" (127.0.0.1) and accepting
TCP/IP connections on port 5432?
The password is not set for the postgres user.
Normally, I can connect to the database by running the following on the host:
1. SSH to box
2. su - postgres
3. psql
4. \c mydb
The server runs PostgreSQL 9.1.
You're trying to connect to PostgreSQL on localhost using a script running on your computer, but there's no PostgreSQL server running there.
For this to work, you'd have to ssh to the remote server, then run your Python script there, where the PostgreSQL server is "local" relative to the Python script.
(That's why running psql works - because you're running it on the remote server, where PostgreSQL is "local" relative to psql).
Alternatively, you could:
Use an SSH tunnel to forward the remote PostgreSQL port to your local computer; or
Connect directly over TCP/IP to the remote PostgreSQL server using its host name or IP address, after enabling remote connections on the server.
Note that just putting the server's IP address or host name into the connection string instead of localhost will not work unless you also configure the server to accept remote connections. You must set listen_addresses to listen for non-local connections, add any required firewall rules, set pg_hba.conf to permit connections from remote machines, and preferably set up SSL. All this is covered in the Client Authentication chapter of the PostgreSQL user manual.
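As a rough sketch of what those server-side settings look like (the network range below is a placeholder; see the manual chapter above for the full syntax), in postgresql.conf set:

listen_addresses = '*'

and in pg_hba.conf add a line such as:

host    mydb    postgres    192.168.1.0/24    md5

then reload PostgreSQL.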
You'll probably find an SSH tunnel simpler and easier to understand.
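If you take the tunnel route, here is a minimal sketch using the sshtunnel package; the host name, SSH user, and key path are placeholders, while mydb and postgres come from the question:

from sshtunnel import SSHTunnelForwarder
import psycopg2

# Placeholder SSH details: use whatever you normally use to "SSH to box"
with SSHTunnelForwarder(
    'remote.example.com',
    ssh_username='tools',
    ssh_pkey='/home/tools/.ssh/id_rsa',
    remote_bind_address=('127.0.0.1', 5432)
) as tunnel:
    # Connect to the local end of the tunnel; PostgreSQL sees a local connection
    conn = psycopg2.connect(host='127.0.0.1', port=tunnel.local_bind_port,
                            dbname='mydb', user='postgres')
    cur = conn.cursor()
    cur.execute('SELECT version()')
    print(cur.fetchone())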