IDA Pro remote automation - python

I am trying to run IDA Pro (full version) remotely through a Linux terminal so as to automate the analysis and output process. I know there are plugins such as IDAPython, and that the text-mode idal command accepts command-line flags. My question is whether it is possible to write a script, in either IDC or IDAPython, that can:
1.) Start the IDA Pro process
2.) Pass it one or more files to perform analysis on
3.) Output this file into an .html format (or .txt)
4.) Do all of this without any user interaction beyond choosing which files to send it and starting the script.
IN FURTHER DETAIL: How can I pass IDA the file I am trying to analyse via a command-line flag, alongside what I have been trying already (idal -A)? Is there a flag to output the info into a .html file that I am not seeing? Thanks

IDA command-line arguments: https://www.hex-rays.com/products/ida/support/idadoc/417.shtml
Make a Python server that receives, over a socket, (1) the file to analyze and (2) the name of the script to run.
Then, on the server, run IDA with subprocess.
Pass IDA a script to run with the -S flag (you can also pass a few arguments to the script; see the link above).
Make the script save its output to some file, and once the subprocess is done, send that file back to the user.
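A minimal sketch of that server side, assuming the target binary and the IDAPython script are already on the server's disk; the idal location, port, file names, and the "receive the file over the socket" part are placeholders left to you:

# Minimal server-side sketch. Assumes the binary and the IDAPython script are
# already on disk; host/port and all paths below are placeholders.
import socket
import subprocess

HOST, PORT = '0.0.0.0', 9000

def analyze(binary_path, script_path, output_path):
    # -A = autonomous mode (no dialogs), -S = script for IDA to run on the database.
    # The script itself is responsible for writing its results to output_path.
    subprocess.check_call(['idal', '-A', '-S' + script_path, binary_path])
    with open(output_path, 'rb') as f:
        return f.read()

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

while True:
    conn, _ = srv.accept()
    # A real server would first receive the binary and the script name here.
    report = analyze('/tmp/target.bin', '/tmp/dump_report.py', '/tmp/report.txt')
    conn.sendall(report)
    conn.close()

The script passed with -S is where you generate the .txt or .html report and write it to whatever output path the server expects.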

Related

How to run a Python script on Google Cloud using Android Studio?

I want to run a Python script on Google Cloud from Android Studio.
For example: I have an Android application which contains a button, and a Google Cloud VM instance which has a Python script.
When the button is clicked, I want the script to run and its output to be sent to storage.
How can I do that?
I suppose you're running a Linux-based distribution as your operating system. I think you could do that by sending a command to the VM via SSH and, using output redirection in Linux, directing the output to a specific file.
Let's say you have your Python script named script_one.py.
With the command python3 script_one.py > output.txt you run the script, and its output is stored in the file output.txt in the same directory as the Python script. Of course, you could use an absolute or relative path to redirect the output somewhere else.
Depending on the language you're developing your application in, this can be implemented in different ways.
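As a rough sketch of that idea from any machine with SSH access to the VM (the user, host, and paths below are placeholders, and key-based SSH access is assumed):

# Sketch only: run the script on the VM over SSH and keep the output there.
import subprocess

subprocess.check_call([
    'ssh', 'user@your-vm-external-ip',
    'python3 /home/user/script_one.py > /home/user/output.txt',
])

From the VM you could then copy output.txt to a storage bucket (for example with gsutil cp), which matches the "output sent to storage" requirement.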

Call Python function from Pig script

I am new to Pig and would like to call a local Python file from Pig after I have connected to the server via PuTTY. Below are the commands I have tried and the error messages I received:
REGISTER ‘myudf.py’ using jython as my_udf
The error message is below, and I don't know how to tell Pig the path to the .py file:
File myudf.py does not exist
Another command I tried is:
DEFINE mycommand `python myudf.py` ship(‘C:\Users\myname\Documents\code\myudf.py’);
The error message is
unexpected character ’S’
This may sound super easy but I have spent hours on it and failed. Any advice will be highly appreciated.
It sounds like the Python file is saved on your own computer, but you're running Pig on a server. The server doesn't have access to files on your computer.
You can transfer the file using software like WinSCP (assuming you're on Windows), or start a text editor on the server and copy/paste the text from your Python file into it. For example, vi myudf.py or emacs myudf.py in PuTTY will start a text editor and create a file named myudf.py on the server once you've saved.
Once you've created the Python file, you might want to include the full path to your file on the server in your REGISTER statement to avoid confusion. (pwd shows your current directory on the server.)
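As a sketch, a minimal Jython UDF could look like the following; the function name and schema are examples, and the try/except fallback is only there so the file can also be imported outside Pig for a quick test:

# myudf.py -- minimal Jython UDF sketch
try:
    outputSchema            # normally injected by Pig's Jython script engine when the file is registered
except NameError:           # fallback so the file can be imported outside Pig as well
    def outputSchema(schema):
        def decorator(func):
            return func
        return decorator

@outputSchema('word:chararray')
def to_upper(word):
    return None if word is None else word.upper()

With the file on the server, register it with an absolute server-side path, for example REGISTER '/home/youruser/myudf.py' USING jython AS my_udf; and then call my_udf.to_upper(col) inside a FOREACH ... GENERATE.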

Python script doesn't work properly under Task Scheduler

I have a program on Windows Server 2012 that:
-reads an SQL query from a text file in the same location
-imports helper functions from a file in the same location
-executes the query on an SQL server (in the same network) and saves the results
-creates a Google spreadsheet from the results (using API credentials that are in the same location)
When I log in to the server and execute the file in cmd with python myscript.py, everything is fine. However, when I try to do the same from Task Scheduler, it fails with error 0x1.
This is what I put in my Scheduler actions:
program/script - quoted full path to python.exe (which is in Anaconda folder)
Arguments - quoted full path to myscript.py
Start in - blank
I have tried running it as myself, SYSTEM, and Administrators. I also tried the "Highest privileges" and "Run whether user is logged on or not" options, and followed another solution on SO that recommended running cmd with "/c python full/path/to/myscript.py". But it's always the same.
It's very frustrating. I realize it's not strictly a coding-related issue, but I am sure many Python programmers have run into it.
Have you got the same error when starting non-Python tasks with the scheduler?
If not, I would try installing a clean Python and creating a new scheduled .bat file with:
powershell C:\Python27\python.exe C:\Python27\file.py
pause

Executing complex task for python script using bash

I launch a Python script in bash:
python /dumpgenerator.py --index=http://*website* --xml --curonly --images --path=/*website*
In order to perform many tasks, the script has to be launched in new windows with different parameters (by parameter I mean the website).
Can I launch the Python script so that it "catches" its parameters, via bash commands, from a text file which contains the website links? It's important that the sessions are launched in new console windows (every session gets its own bash and Python processes).
There's also the problem of converting a website link into a usable filesystem format when setting --path=/website. What regular expression should I use?
Example: the script is developed by https://code.google.com/p/wikiteam/. It doesn't let you archive more than one wiki at a time. If you want more wikis to be archived, you have to copy and paste the command (changing just one parameter, the website) in a new bash session.
It seems rather tedious to open a new terminal window and bash session 50 times. That's why I'm wondering how I can simplify this task.
I've found a solution. I just set up tmux following this article and ran copies of python /dumpgenerator.py.
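A small sketch of that setup, assuming tmux is installed, dumpgenerator.py sits at /dumpgenerator.py as in the question, and the wiki URLs are listed one per line in a text file (wikis.txt is a made-up name):

# Launch one detached tmux session per wiki listed in wikis.txt.
import re
import subprocess

with open('wikis.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Turn the URL into a name that is safe both as a directory and as a tmux
    # session name (tmux session names must not contain '.' or ':').
    name = re.sub(r'[^A-Za-z0-9_-]+', '_', re.sub(r'^https?://', '', url))
    cmd = ('python /dumpgenerator.py --index=' + url +
           ' --xml --curonly --images --path=/' + name)
    # One detached tmux session per dump, named after the sanitized wiki.
    subprocess.call(['tmux', 'new-session', '-d', '-s', name, cmd])

You can then attach to any of the sessions with tmux attach -t <name> to watch its progress.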

Reading a Google Calendar using a cron-activated Python script

Being new to Python on a Raspberry Pi, I downloaded the sample for accessing a Google Calendar given here: https://developers.google.com/api-client-library/python/ and made it run. I just renamed the original file and wrote some code around it. The script works fine when launched from the command line.
But when the script is called hourly via cron, an additional (or new) authentication is required: I'm told to copy a link into the browser, get the 'success code', and enter it in the raw input line the script is supposed to show me. The problem is that this message is sent to my mailbox by cron via email and the script is stopped, so I never get the chance to enter the 'success code' and have it authenticated.
Any ideas about how to allow the cron-activated script to read my calendar?
SOLVED!
I added several of my own log commands to the scripts in order to trace "where am I and what values do my variables have". After running the scripts manually via the command line and automatically via cron, I compared these logs and found that, when started by cron, several files couldn't be opened. The file names were given without any path, so they were expected to be in the directory I called the first script from on the command line. When launched by cron, these files could not be opened, even though I added the relevant paths to PATH= and PYTHONPATH=.
So what helped: try using absolute (and full) paths to any files you want to access.
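For example, a minimal sketch of that fix, building every file path from the script's own directory so it no longer depends on cron's working directory (the file names below are placeholders for whatever the sample actually uses):

# Resolve data and credential files relative to the script's own location,
# not the current working directory.
import os

BASE_DIR = os.path.dirname(os.path.abspath(__file__))

credentials_path = os.path.join(BASE_DIR, 'client_secret.json')
log_path = os.path.join(BASE_DIR, 'calendar.log')

with open(log_path, 'a') as log:
    log.write('looking for credentials at %s\n' % credentials_path)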
