What ports does pip use?

This is hopefully a quick one to answer. I'm trying to provision a box on AWS with Puppet, and one of the steps involves a pip install from a requirements file, something like this:
/usr/local/venv/ostcms/bin/pip install -r /vagrant/requirements.txt
The step fails because it can't find any of the packages in the requirements file, but when I open the AWS box's security group up to allow "All Traffic", the pip step works.
I'm trying to find the port that pip uses so I can open just that port, plus HTTP and SSH, on the box and live happily ever after.

pip talks to PyPI over plain HTTPS, so it needs outbound access on port 443 (plus port 80 if you still use any plain-HTTP index). Port 3128 is only relevant if your traffic has to go through a Squid proxy. Make sure the security group in your AWS console allows that traffic, otherwise pip will get blocked when attempting to talk to PyPI (or anywhere else it cares to download from).
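If you want to confirm from the box itself that the outbound rules are enough, a quick check in plain Python (no third-party packages; pypi.org and files.pythonhosted.org are the hosts recent pip versions talk to) might look like this:
import socket

# rough connectivity check, assuming pip's traffic is plain HTTPS on port 443
for host in ("pypi.org", "files.pythonhosted.org"):
    try:
        socket.create_connection((host, 443), timeout=5).close()
        print("OK   %s:443 reachable" % host)
    except OSError as exc:
        print("FAIL %s:443 -> %s" % (host, exc))
If both lines come back OK, the security group is not what is blocking pip.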

Quick start with SCAPY and WIRESHARK (Including drivers) (Custom WIFI Packets)

Scapy with WIFI - From setup to use
This guide is meant to help you through the setup and installation of Scapy and the Wi-Fi dongle used here.
WIRESHARK, PYTHON AND SCAPY
I spent some time with Scapy and want to share my knowledge, since there are lots of spots where things can go wrong. I am using the TP-Link Wi-Fi dongle TL-WN722N V2.
Operating System:
I found that, first of all, you need to install your own driver to be able to use frame injection and monitor mode. Windows is not an option here because monitor mode was deprecated in earlier versions. Next I tried Linux.
As I am quite new to Linux myself, I played around a little and found that most tutorials on Wi-Fi monitor mode only work for Kali Linux, which is fine if you want to use it for hacking. Another operating system for which I can confirm the drivers work is Ubuntu (currently version 20.04.2). I also tried installing the drivers on Raspbian; that does not work, but the Ubuntu Server version can be installed on a Raspberry Pi. I hope this saves you some trouble.
=> Use Kali or Ubuntu (both the desktop and server versions of Ubuntu work)
Installing The Driver
After reading the above paragraph this should be quite easy, as all the tutorials made for Kali Linux also work for Ubuntu. Below are the steps that I took to install everything:
sudo apt update
sudo apt install bc make gcc
sudo rmmod r8188eu.ko
git clone -b v5.2.20 https://github.com/aircrack-ng/rtl8812au.git
cd rtl8812au
sudo -i
echo "blacklist r8188eu" > "/etc/modprobe.d/realtek.conf"
exit
make
sudo make install
sudo modprobe 8188eu
The most common error for me appeared after calling make. This is usually due to a mismatched kernel and can be fixed by switching to Ubuntu or Kali, as it is tied to the operating system.
Do not forget to reboot before the next steps.
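To sanity-check the installation after the reboot, a couple of standard Linux commands (nothing specific to this tutorial, just the usual tools) should show the dongle and the loaded module:
lsusb                 # the TP-Link dongle should appear in the USB device list
lsmod | grep 8188eu   # the 8188eu module should be listed once the driver is loaded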
Turn On Monitor Mode
This is also somewhat tricky because, even though I was using the same operating system on the Raspberry Pi and my desktop computer, errors appeared at different spots. My solution was to simply fiddle around with the commands and try different combinations until one worked.
It is important that you use iwconfig to determine the name of your Wi-Fi dongle.
Use these commands:
ifconfig wlan0 down
airmon-ng check kill             # only useful in some situations
usermod -a -G netdev USERNAME    # in case the operation is not permitted even though you are root
iwconfig wlan0 mode monitor      # on the RPi simply use this command without turning wlan0 down
ifconfig wlan0 up
iwconfig                         # check whether you have been successful and the mode now says 'monitor'
No idea why it does not always work the same way, but you are very likely to succeed with the above commands.
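If your distribution no longer ships ifconfig/iwconfig, the newer ip/iw tools should achieve the same thing; treat the following as a sketch, since it was not tested on the exact setup above:
sudo ip link set wlan0 down
sudo iw dev wlan0 set type monitor
sudo ip link set wlan0 up
iw dev wlan0 info     # the 'type' line should now say monitor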
Using Scapy
Finally, the fun stuff. Get ready to use pip to install Scapy (python3 -m pip install scapy). Sadly, Scapy only supports Python up to version 3.8, so make sure to have the correct version installed and activated as your default Python. You also might need to run the script as root.
The code for sending packets is very straightforward:
from scapy.all import *
conf.use_pcap = True  # not quite sure whether this is optional
send(IP(dst="0.0.0.0")/UDP(dport=123, sport=200)/Raw(load="I am WIFI"), iface="wlan0", loop=1, inter=0.2)
I am not really trying to make a tutorial on how scapy itself works, only the big picture and how to set it up. It is a very interesting library and you should definitely check it out.
Most errors will arise from the import, as it is crucial to use the proper Python version! Also, the similar methods send() and sendp() troubled me a lot: I was unable to pick up anything in Wireshark when using sendp() with the same parameters as above.
The iface="wlan0" argument selects the interface via which the packets are sent. It should match the interface you found earlier with the iwconfig command and set to monitor mode.
Using Wireshark
Wireshark is an awesome tool for prototyping whatever you want to do; if something is not working, you should start looking there first. You can use it to identify what you are sending by running Wireshark on the transmitting Wi-Fi dongle, or use another Wi-Fi dongle to pick up your packets. If you have done everything correctly and run the Python script while recording with Wireshark, you should be picking up many of the "I am WIFI" messages.
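If you prefer to check reception from Python instead of (or in addition to) Wireshark, a small Scapy sniffer on the receiving interface can do the job; this assumes the interface is again called wlan0:
from scapy.all import sniff

# print a one-line summary of the first 10 frames seen on the monitor interface
sniff(iface="wlan0", count=10, prn=lambda pkt: pkt.summary())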
Thank you for going through all of this. I hope I saved some people from all-nighters trying to figure out kernels, drivers or version mismatches.
Do ask me questions.

How to set up the environment so that bluepy can scan without sudo?

I wrote a Python 3 script that scans for devices. If they match a "name", I connect to them and do some BLE stuff. The script is built on top of the bluepy module.
One thing I don't like is that I need to run the device scanning as sudo (like sudo python3 getDev.py). Any ideas how to make a user able to scan without root rights?
I guess I need to add the local user to a group, etc. Any ideas are welcome.
On Linux, the Bluetooth protocol stack needs special privileges to interact with.
These privileges are implemented through properties called capabilities; see man 7 capability for details.
The tool to assign capabilities is the program setcap.
In the case of bluepy it is the binary bluepy-helper that interacts with the Bluetooth protocol stack, so locate where the bluepy package is installed and run:
sudo setcap 'cap_net_raw,cap_net_admin+eip' ${PY_SITE_PACKAGES_DIR}/bluepy/bluepy-helper
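If you are not sure where pip put bluepy-helper, a tiny Python snippet can print the path to feed into setcap (it only inspects the installed bluepy package):
import os, bluepy

# bluepy-helper ships inside the bluepy package directory
print(os.path.join(os.path.dirname(bluepy.__file__), "bluepy-helper"))
Afterwards, running getcap on that path should list cap_net_raw and cap_net_admin, confirming that setcap worked.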

Starting TRAC server with multiple independent projects

I'm running a TRAC server (tracd service) with 3 independent projects configured. Each project has its own password file in order to keep the user management independent. TRAC is started as a Windows service as described on https://trac.edgewall.org/wiki/0.11/TracStandalone
It seems that starting the TRAC server does not work if the string length of the key 'AppParameters' in HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\tracd\Parameters is too long. The maximum key length seems to be around 260 characters.
The TRAC server can be started successfully using the following 'AppParameters' key:
C:\Python27\Scripts\tracd-script.py -p 80 --auth=',C:\Trac\Moisture\conf\.htpasswd,mt.com' --auth=',C:\Trac\Balances\conf\.htpasswd,mt.com' --auth=',C:\Trac\Weights\conf\.htpasswd,mt.com' C:\Trac\Moisture C:\Trac\Balances C:\Trac\Weights
The TRAC server does not start with the following 'AppParameters' key:
C:\Python27\Scripts\tracd-script.py -p 80 --auth='Moisture,C:\Trac\Moisture\conf\.htpasswd,mt.com' --auth='Balances,C:\Trac\Balances\conf\.htpasswd,mt.com' --auth='Weights,C:\Trac\Weights\conf\.htpasswd,mt.com' C:\Trac\Moisture C:\Trac\Balances C:\Trac\Weights
If I add a fourth project, it is not possible to start the TRAC server anymore because the string is too long. Is this problem known? Is there a workaround?
You can also shorten your command by using the -e option to specify the Trac environment parent directory rather than explicitly listing each environment path.
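Assuming all three environments live directly under C:\Trac, the command line could be shortened to something like this (adjust the paths to your layout):
C:\Python27\Scripts\tracd-script.py -p 80 --auth='Moisture,C:\Trac\Moisture\conf\.htpasswd,mt.com' --auth='Balances,C:\Trac\Balances\conf\.htpasswd,mt.com' --auth='Weights,C:\Trac\Weights\conf\.htpasswd,mt.com' -e C:\Trac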
A more extensive solution:
You could run the service with nssm.
Install nssm and put it on your path. I installed it using the Chocolatey package manager: choco install -y nssm.
Create a batch file, run_tracd.bat:
C:\Python27-x86\Scripts\tracd.exe -p 8080 env1
Run nssm install tracd (this opens the nssm service installer GUI, where you point it at the batch file).
Run nssm start tracd.
You don't have to do it exactly like this. You could skip the bat file and enter the parameters directly in the nssm GUI. I'm no Windows expert, but I like having the bat file because it's easier to edit. However, there may be security concerns that I'm unaware of, or it may be more robust to put the parameters in the nssm GUI (you don't have to worry about accidental deletion of the bat file).
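For reference, nssm can also be configured non-interactively from the command line instead of the GUI; a sketch (the service name, paths and parameters are just examples) would be:
nssm install tracd "C:\Python27-x86\Scripts\tracd.exe"
nssm set tracd AppParameters "-p 8080 env1"
nssm start tracd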

Local copy of PyPI for Python

I need to deploy a replica of PyPI on an internal network. The idea is to have all the PyPI packages in the local repository, avoiding connecting to the real PyPI repo all the time.
I used bandersnatch to mirror the files of PyPI according to PEP 381.
Then on the clients' pip I dropped an /etc/pip.conf as follows:
[global]
index-url = http://www.myserver.com/repo/PyPI/web/simple
trusted-host = www.myserver.com
On the client machines the command
pip install -v <some packages>
works using the local repo. However, the command
pip search --index http://www.myserver.com/repo/PyPI/web/simple <some packages>
doesn't work and returns:
raise HTTPError(http_error_msg, response=self)
pip._vendor.requests.exceptions.HTTPError: 404 Client Error: Not Found for url: http://www.myserver.com/repo/PyPI/web/
Here are two questions:
Is it possible to enable the pip search command without installing a local PyPI server such as pypiserver?
Moreover, is it possible to fall back to the official PyPI server if the local pip install command fails (e.g. a package is not present locally)?
Thanks
Charlie
bandersnatch just copies packages, which is not enough to have a full replica of PyPI. You need a server-side program like pypiserver, devpi, Artifactory, Nexus…
I've seen devpi use /+simple/, so you might need your reverse proxy to rewrite the URL so that Artifactory can use it (/simple is a requirement of the repo type in Artifactory).
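As a rough sketch of that approach (the exact flags depend on the pypiserver version you end up with), you could point pypiserver at the package directory that bandersnatch already maintains:
pip install pypiserver
pypi-server -p 8080 /path/to/mirror/web/packages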

How do I deploy a python application to an external server?

I have written a Python script on my local laptop which uses several third-party packages. I now want to run the script regularly (via a cron job) on an external server.
The external server most likely does not have all the dependencies installed. Is there a way to package and deploy my Python script and its dependencies in order to ensure that it will run?
I have already tried to package the script as an exe, but failed to do so.
It's not clear what kind of third-party packages you have, but for those that were installed with pip, you can do this in your dev environment:
$ pip freeze > requirements.txt
And then you can install these packages in your production environment:
$ pip install -r requirements.txt
Ideally, you will already have a virtualenv on your production box. If not, it may be well worth reading up on virtual environments before deploying your script.
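A minimal sketch of that flow on the server (the paths and the schedule below are made-up examples) might be:
python3 -m venv /opt/myscript/venv
/opt/myscript/venv/bin/pip install -r /opt/myscript/requirements.txt
# then point cron at the venv's interpreter, e.g. via crontab -e:
# 0 * * * * /opt/myscript/venv/bin/python /opt/myscript/script.py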
Just turn your computer into a server. Set up your router for port forwarding so that your server's contents are served when the router's IP is entered. You can of course purchase a DNS domain to give that IP a human-readable URL.
