transfer files from ssh server directly to client - python

I have an SSH server which I use to store my files online. I need to make those files easy to download, so my Python script uses the paramiko library to connect to the SSH server, list the files, and display them on a web page. The problem is that I don't want to download the files to the web server's disk (it doesn't have enough space) and then send them to clients; instead I want to read each file and stream it straight through, the way it can be done in PHP.
Something like this PHP script, but in Python:
<?php
// your file to upload
$file = '2007_SalaryReport.pdf';
header("Expires: 0");
header("Last-Modified: " . gmdate("D, d M Y H:i:s") . " GMT");
header("Cache-Control: no-store, no-cache, must-revalidate");
header("Cache-Control: post-check=0, pre-check=0", false);
header("Pragma: no-cache");
header("Content-type: application/pdf");
// tell file size
header('Content-length: '.filesize($file));
// set file name
header('Content-disposition: attachment; filename='.basename($file));
readfile($file);
// Exit script. So that no useless data is output-ed.
exit;
?>

Not a programmer's answer: instead of paramiko, mount the directory containing the documents with sshfs and serve them as if they were on the local file system:
http://fuse.sourceforge.net/sshfs.html

A CGI script writes its output to stdout, which is delivered to the web server. The output starts with header lines, then one blank line (as a separator), then the content.
So you have to turn the header() calls into print statements, and replace readfile($file) with:
print
print open("2007_SalaryReport.pdf", "rb").read()
(The bare print emits the blank line that separates the headers from the content.)
See http://en.wikipedia.org/wiki/Common_Gateway_Interface
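Combining the two ideas, here is a minimal Python 3 sketch of such a CGI script that streams a file over SFTP straight to the client, without staging it on the web server's disk. The host, credentials, and remote path are placeholders, and paramiko is assumed to be installed:

```python
import sys

CHUNK = 32768  # bytes copied per iteration

def cgi_headers(filename, size):
    """Build the CGI response: header lines, then one blank line."""
    return (
        "Content-Type: application/pdf\r\n"
        "Content-Length: %d\r\n"
        "Content-Disposition: attachment; filename=%s\r\n"
        "\r\n" % (size, filename)
    )

def stream_file(remote_path, host, user, password):
    # paramiko is assumed installed; imported here so cgi_headers()
    # is usable even without it
    import paramiko
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    sftp = client.open_sftp()
    size = sftp.stat(remote_path).st_size
    sys.stdout.write(cgi_headers(remote_path.rsplit("/", 1)[-1], size))
    sys.stdout.flush()
    # copy in fixed-size chunks so the file never sits on the web
    # server's disk, or in memory as a whole
    with sftp.open(remote_path, "rb") as remote:
        while True:
            chunk = remote.read(CHUNK)
            if not chunk:
                break
            sys.stdout.buffer.write(chunk)
    sftp.close()
    client.close()

# A CGI entry point would call, for example:
# stream_file("/files/2007_SalaryReport.pdf", "ssh.example.com", "user", "secret")
```

The chunked copy keeps memory use flat regardless of file size, and Content-Length comes from sftp.stat(), so the browser can show download progress.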

Related

FTP upload to any upload center and get download link by python [duplicate]

Below is my code to upload a file to another server:
$ftp_server="host";
$ftp_user_name="";
$ftp_user_pass="";
$file = "referesh.php";//tobe uploaded
$name = "diff_adtaggeneration";
$remote_file = $name."/referesh.php";
// set up basic connection
$conn_id = ftp_connect($ftp_server);
// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);
$res = ftp_rawlist($conn_id, $name);
//print_r($_SERVER);exit;
if (count($res) > 0)
{
    ftp_put($conn_id, $remote_file, $file, FTP_ASCII);
}
else
{
    ftp_mkdir($conn_id, $name);
    ftp_put($conn_id, $remote_file, $file, FTP_ASCII);
}
ftp_close($conn_id);
The above code works perfectly: the file is successfully uploaded to the other server, a new folder named 'diff_adtaggeneration' is created in the root of the directory, and the file is uploaded into that folder. Now I need to get the full path of the uploaded file. I used print_r($_SERVER), but I know that only shows details of the current server. How do I get the full file path, from the document root, on the server the file was uploaded to (the other server)? Any help appreciated...
There's no magical way to find out the HTTP URL of a file uploaded to an FTP server that happens to share its file storage with the HTTP server.
It's a matter of configuration of both the FTP and HTTP servers: how they map virtual HTTP and FTP paths to and from actual file system paths.
The mapping can be completely virtual and non-deterministic.
Having said that, most web hostings have some root on both FTP and HTTP from which the mapping is deterministic.
If you connect over FTP, you usually land in the root folder of the HTTP hosting, or one level above it (there would be a subfolder like httpdocs or www that maps to the HTTP root).
If the FTP account is chrooted, that root folder will be / (or /httpdocs, etc.).
If you have your own domain (www.example.com), then an FTP path like /index.html maps to https://www.example.com/index.html.
That's the simplest case. It can be way more complicated, but that's something we cannot help you with; it's a question for your web hosting provider and/or administrator.

Using ftp_pwd you can get to this kind of level, but the result will be relative to the FTP login directory (possibly /public_html). If you want to access the file over HTTP, upload it into an HTTP-exposed directory.
$ftp_server="host";
$ftp_user_name="";
$ftp_user_pass="";
$file = "referesh.php";//tobe uploaded
$name = "diff_adtaggeneration";
$remote_file = $name."/referesh.php";
// set up basic connection
$conn_id = ftp_connect($ftp_server);
// login with username and password
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);
$res = ftp_rawlist($conn_id, $name);
//print_r($_SERVER);exit;
if (count($res) > 0)
{
    ftp_put($conn_id, $remote_file, $file, FTP_ASCII);
}
else
{
    ftp_mkdir($conn_id, $name);
    ftp_put($conn_id, $remote_file, $file, FTP_ASCII);
}
$remotePath = ftp_pwd($conn_id)."/".$remote_file; // e.g. /public_html/diff_adtaggeneration/referesh.php
ftp_close($conn_id);
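Since the question title asks for Python, roughly the same flow with the standard library's ftplib might look like the sketch below. The host, credentials, and HTTP root are placeholders, and as explained above, the FTP-to-HTTP path mapping itself must come from your hosting provider; it cannot be discovered over FTP:

```python
from ftplib import FTP, error_perm

def download_url(http_root, remote_path):
    """Map an FTP-relative path onto the assumed HTTP document root."""
    return "%s/%s" % (http_root.rstrip("/"), remote_path.lstrip("/"))

def upload(host, user, password, local_file, remote_dir):
    """Upload local_file into remote_dir, creating the dir if needed."""
    ftp = FTP(host)
    ftp.login(user, password)
    try:
        ftp.mkd(remote_dir)
    except error_perm:
        pass  # directory already exists
    remote_path = "%s/%s" % (remote_dir, local_file)
    with open(local_file, "rb") as f:
        ftp.storbinary("STOR " + remote_path, f)
    ftp.quit()
    return remote_path

# remote = upload("host", "user", "pass", "referesh.php", "diff_adtaggeneration")
# print(download_url("https://www.example.com", remote))
```

The resulting URL is only valid if the FTP login directory maps to the HTTP root, per the answer above.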

Signature did not match on generated Azure SAS download link?

I've been trying to get the shared access signature working for Azure Blob Storage as I want to provide time limited downloads. For that I currently use a PHP script that calls a Python script on the Azure Websites service.
This is the way I call it to test things:
<?php echo "https://container.blob.core.windows.net/bla/random.zip?"; system('azureapp\Scripts\python azureapp\generate-sas.py "folder/file.zip"'); ?>
And this is the python script I use to generate the parameters:
from azure.storage import *
import sys, datetime
file = sys.argv[1]
accss_plcy = AccessPolicy()
accss_plcy.start = (datetime.datetime.utcnow() + datetime.timedelta(seconds=-120)).strftime('%Y-%m-%dT%H:%M:%SZ')
accss_plcy.expiry = (datetime.datetime.utcnow() + datetime.timedelta(seconds=15)).strftime('%Y-%m-%dT%H:%M:%SZ')
accss_plcy.permission = 'r'
sap = SharedAccessPolicy(accss_plcy)
sas = SharedAccessSignature('containername', 'accountkey')
qry_str = sas.generate_signed_query_string(file, 'b', sap)
print (sas._convert_query_string(qry_str))
I managed to get this construct running for the most part, but my current problem is that whenever I use the generated link I am faced with this error:
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
</Message>
<AuthenticationErrorDetail>
Signature did not match. String to sign used was r 2015-01-27T22:52:17Z 2015-01-27T22:54:32Z /filepath/random.zip 2012-02-12
</AuthenticationErrorDetail>
I double-checked everything and tried to find something on Google, but sadly this aspect isn't really THAT well documented, and the error message isn't helping me either.
EDIT: forgot to leave an example link that was generated: https://container.blob.core.windows.net/filepath/random.zip?st=2015-01-27T23%3A23%3A18Z&se=2015-01-27T23%3A26%3A18Z&sp=r&sr=b&sv=2012-02-12&sig=eqCirXRjUbGGVVAYwWwARreTPr4j8wXubGx1q51AUHU%3D&
Assuming your container name is bla and the file name is random.zip, change the following code from:
<?php echo "https://container.blob.core.windows.net/bla/random.zip?"; system('azureapp\Scripts\python azureapp\generate-sas.py "folder/file.zip"'); ?>
to:
<?php echo "https://container.blob.core.windows.net/bla/random.zip?"; system('azureapp\Scripts\python azureapp\generate-sas.py "bla/random.zip"'); ?>
The reason you were running into this error is that you were not including the blob container name in your SAS calculation, so the storage service computed the SAS against the $root blob container. Since the SAS was computed for one blob container but used on another, you get this authorization error.
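For illustration, here is a sketch of how the signature is computed for SAS version 2012-02-12; the field order is inferred from the string-to-sign shown in the error message, and the function name and inputs are hypothetical. The canonicalized resource embeds the container and blob names, which is why signing one path and requesting another fails:

```python
import base64
import hashlib
import hmac

def sas_signature(account, key_b64, blob_path, permission, start, expiry):
    # string-to-sign fields for SAS version 2012-02-12, newline-joined:
    # permission, start, expiry, canonicalized resource,
    # signed identifier (empty here), signed version
    resource = "/%s/%s" % (account, blob_path)
    string_to_sign = "\n".join(
        [permission, start, expiry, resource, "", "2012-02-12"]
    )
    key = base64.b64decode(key_b64)  # the account key is base64-encoded
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")
```

Note how changing blob_path (e.g. "folder/file.zip" vs. "bla/random.zip") changes the signature completely, even with the same key and times.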

Python Sockets, download is almost 10x the size of original file, upload is 0 bytes

I'm creating a mobile application with embedded Python 2.7, using the Marmalade C++ SDK, and integrating connectivity to cloud file-transfer services.
FTP: file transfers work flawlessly.
Dropbox: authenticates, then gives me: socket [Errno 22] Invalid argument.
Google Drive: authenticates and lists metadata, but file transfers elicit some strange behavior.
I've made all the bindings to the Marmalade socket subsystem (Unix-like), but some features are unimplemented. To connect to Google Drive at all, I initially patched httplib2/__init__.py, changing all instances of:
self.sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
#to this:
self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
After this little patch I could successfully connect and download metadata from Google Drive. However:
When I upload a 7 KB file, the file appears on Google Drive, but has a file size of 0 bytes.
When I download a 7 KB file using urllib, I get a 54 KB file back.
I know it must be a misconfiguration of the socket properties, but not all properties are implemented.
Here are some standard python test outputs (test_sockets , test_httplib )
Implementation here:
Marmalade /h/std/netdb.h
Are there any that I should try as a viable replacement?
I don't have a clue.
From: unix-man setsockopt(2)
SO_DEBUG enables recording of debugging information
SO_REUSEADDR enables local address reuse
SO_REUSEPORT enables duplicate address and port bindings
SO_KEEPALIVE enables keep connections alive
SO_DONTROUTE enables routing bypass for outgoing messages
SO_LINGER linger on close if data present
SO_BROADCAST enables permission to transmit broadcast messages
SO_OOBINLINE enables reception of out-of-band data in band
SO_SNDBUF set buffer size for output
SO_RCVBUF set buffer size for input
SO_SNDLOWAT set minimum count for output
SO_RCVLOWAT set minimum count for input
SO_SNDTIMEO set timeout value for output
SO_RCVTIMEO set timeout value for input
SO_ACCEPTFILTER set accept filter on listening socket
SO_TYPE get the type of the socket (get only)
SO_ERROR get and clear error on the socket (get only)
Here is my Google upload / download / listing source code.
I'll brute-force this until the problem is resolved, hopefully. I'll report back if I figure it out.
I figured it out: it was two problems in my file-handling code.
In uploading:
database_file = drive.CreateFile()
database_file['title'] = packageName
# SetContentFile() must be called, otherwise a 0-byte file is created
database_file.SetContentFile(packageName)
database_file['parents'] = [{"kind": "drive#fileLink", 'id': str(cloudfolderid)}]
In downloading, I was using the wrong URL (webContentLink is for browsers only; use downloadUrl instead). I also needed to craft a header to authorize the download:
import urllib2
import json
url = 'https://doc-14-5g-docs.googleusercontent.com/docs/securesc/n4vedqgda15lkaommio7l899vgqu4k84/ugncmscf57d4r6f64b78or1g6b71168t/1409342400000/13487736009921291757/13487736009921291757/0B1umnT9WASfHUHpUaWVkc0xhNzA?h=16653014193614665626&e=download&gd=true'
#Parse saved credentials
credentialsFile = open('./configs/gcreds.dat', 'r')
rawJson = credentialsFile.read()
credentialsFile.close()
values = json.loads(rawJson)
#Header must include: {"Authorization" : "Bearer xxxxxxx SomeToken xxxxx"}
ConstructedHeader = "Bearer " + str(values["token_response"]["access_token"])
Header = {"Authorization": ConstructedHeader}
req = urllib2.Request( url, headers= Header )
response = urllib2.urlopen(req)
output = open("UploadTest.z.crypt",'wb')
output.write(response.read())
output.close()

Integration of Python flask server and a File system

I am doing a project that uses a file system (a 1 GB file) rather than a database. I created the front end using Angular and wrote a Python Flask server that handles requests from the client (browser) and dispatches the respective HTML files. However, I want to save all the registered users (username and password) in a 1 GB file for server-side storage. Where should I start? Any resources? Please help.
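One possible starting point is a small file-backed user store that Flask view functions can call. This is an illustrative sketch, not from the question: the file name, the JSON layout, and the hashing scheme are assumptions, and a production app should use a proper password KDF such as bcrypt rather than plain salted SHA-256:

```python
import hashlib
import json
import os

USER_FILE = "users.json"  # illustrative path, not from the question

def _load():
    """Read the whole user file, or an empty dict if it doesn't exist yet."""
    if not os.path.exists(USER_FILE):
        return {}
    with open(USER_FILE) as f:
        return json.load(f)

def _save(users):
    with open(USER_FILE, "w") as f:
        json.dump(users, f)

def _hash(password, salt):
    # salted SHA-256 for brevity; use bcrypt/scrypt/argon2 in real code
    return hashlib.sha256((salt + password).encode("utf-8")).hexdigest()

def register(username, password):
    """Store a new user; return False if the name is taken."""
    users = _load()
    if username in users:
        return False
    salt = os.urandom(8).hex()
    users[username] = {"salt": salt, "hash": _hash(password, salt)}
    _save(users)
    return True

def verify(username, password):
    """Check a login attempt against the stored salt and hash."""
    users = _load()
    rec = users.get(username)
    return bool(rec) and _hash(password, rec["salt"]) == rec["hash"]
```

A Flask route would then call register() on signup and verify() on login; for a file approaching 1 GB, an indexed format (dbm, shelve, SQLite) would scale better than rereading JSON per request.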

HowTo serve dynamic data in xml from python

I have a program that collects data from various sources and builds an excerpt in XML.
It would be easy to write that data to a file and then let Apache do the work, but I do not want to depend on a web server just for one dynamic set of XML.
The result should be displayed in a browser, with automatic self-refresh.
What I want is: how do I serve data to a browser dynamically from Python, without creating a file in the file system?
SimpleHTTPServer does not fit.
What I've accomplished so far: I can listen on a port and output XML or text, but the browser interprets it as text, not HTML and not XML.
Any suggestions are welcome.
OK, found it: just send the correct HTTP headers before the XML part. Come what may, the response will always be 200 OK with content type XML.
Here is the Python code snippet:
# wait for a connection
conn, addr = sock.accept()
try:
    request = conn.recv(4096).strip()
    if request == "TXT":
        conn.sendall(OutputText(Alerts))
    else:
        # status line and headers, then ONE blank line, then the body
        HTTPHeader = "HTTP/1.1 200 OK\r\nContent-Type: text/xml\r\n\r\n"
        conn.sendall(HTTPHeader + OutputXML(Alerts))
finally:
    # send the data and close the connection
    conn.close()
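The same result with less hand-rolled HTTP: Python 3's http.server (BaseHTTPServer in Python 2) handles the status line and headers for you. In this sketch, build_xml() stands in for the poster's OutputXML(Alerts), and the non-standard but widely supported Refresh header gives the automatic self-refresh in the browser:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_xml():
    # stands in for the poster's OutputXML(Alerts)
    return "<?xml version='1.0'?><alerts><alert>example</alert></alerts>"

class XMLHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = build_xml().encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.send_header("Refresh", "10")  # browser re-fetches every 10 s
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep the console quiet

# HTTPServer(("", 8080), XMLHandler).serve_forever()
```

Each GET rebuilds the XML in memory, so nothing is ever written to the file system, and the correct Content-Type makes the browser render it as XML.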
