Integrating OHLC values from a Python API into MT5 using MQL5 - python

I have obtained OHLC values from iqoption and am trying to find a way to use them with MT5.
Here is how I got the values:
import time
from iqoptionapi.stable_api import IQ_Option
I_want_money=IQ_Option("email","password")
goal="EURUSD"
print("get candles")
print(I_want_money.get_candles(goal,60,111,time.time()))
The above code library is here: iqoptionapi
The line I_want_money.get_candles(goal,60,111,time.time()) outputs JSON, as shown in the linked output of the command.
Now I am getting JSON in the output, so I guess it works like an API.
Meanwhile, I tried to create a custom symbol in MT5 named iqoption. Now I just want to feed the OHLC data from the API into it, so that it keeps fetching data from iqoption and displays the chart in the chart window for the custom symbol iqoption.
But I am not able to load the data into the custom symbol. Kindly help me.
Edited
This is the code for live streaming data from the iqoption:
from iqoptionapi.stable_api import IQ_Option
import logging
import time
logging.basicConfig(level=logging.DEBUG,format='%(asctime)s %(message)s')
I_want_money=IQ_Option("email","password")
I_want_money.start_candles_stream("EURUSD")
thread=I_want_money.collect_realtime_candles_thread_start("EURUSD",100)
I_want_money.start_candles_stream("USDTRY")
thread2=I_want_money.collect_realtime_candles_thread_start("USDTRY",100)
time.sleep(3)
# Do something
ans=I_want_money.thread_collect_realtime.items()
for k, v in ans:
    print(k, v)
I_want_money.collect_realtime_candles_thread_stop(thread)
I_want_money.stop_candles_stream("EURUSD")
I_want_money.collect_realtime_candles_thread_stop(thread2)
I_want_money.stop_candles_stream("USDTRY")

Ok, you need to
1. receive the feed from the broker (I hope you succeeded)
2. write it into a file
** (both - python) **
3. read and parse it
4. add it to the history centre/marketWatch
** (both - mt5) **
So, you receive data as a string after
I_want_money.get_candles(goal,60,111,time.time())
This string might be JSON or a JSON array.
The important question is of course the path where you are going to put the data. An expert in MQL4/MQL5 can access only two folders (if not applying DLLs):
C:\Users\MY_NAME_IS_DANIEL_KNIAZ\AppData\Roaming\MetaQuotes\Terminal\MY_TERMINAL_ID_IN_HEX_FORMAT\MQL4\Files
and
C:\Users\MY_NAME_IS_DANIEL_KNIAZ\AppData\Roaming\MetaQuotes\Terminal\Common\Files
in the latter case you need to open the file with the FILE_COMMON flag, e.g. const int handle=FileOpen(fileName,FILE_READ|FILE_COMMON);
In order to parse JSON, you can use the jason.mqh library https://www.mql5.com/en/code/13663 (there are a few others), but as far as I remember it has a bug: it cannot parse an array of objects correctly. To work around that, I would suggest writing each tick on a separate line.
And last, you will receive data from your Python application at random times and write it into the Common folder or the terminal's own Files folder. The MT5 robot will read the file and delete it. To avoid confusion, it is better to guarantee that each file has a unique name; either a random suffix (random.randint(1,1000)) or milliseconds from datetime can help.
So far, you have Python code:
import json
import os
from datetime import datetime

receivedString = I_want_money.get_candles(goal, 60, 111, time.time())
# use a raw string so the backslashes are not treated as escape sequences
filePath = r'C:\Users\MY_NAME_IS_DANIEL_KNIAZ\AppData\Roaming\MetaQuotes\Terminal\MY_TERMINAL_ID_IN_HEX_FORMAT\MQL4\Files\iqoptionfeed'
# colons are not allowed in Windows file names, so format the timestamp explicitly
fileName = os.path.join(filePath, "_" + goal + "_" + datetime.now().strftime("%Y%m%d%H%M%S%f") + ".txt")
file = open(fileName, "w")
for string_ in receivedString:
    file.write(json.dumps(string_) + "\n")   # one JSON object per line
file.close()
In case you created a thread, each time you receive an answer from the thread, you write such a file (see the sketch below).
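For example, here is a rough sketch of such a loop (untested; it assumes the values held in thread_collect_realtime are dicts of candles that json.dumps can serialize, and the folder path is the same placeholder as above):

import json
import os
import time
from datetime import datetime

from iqoptionapi.stable_api import IQ_Option

# same terminal Files folder as above; adjust the placeholders to your setup
FEED_DIR = r'C:\Users\MY_NAME_IS_DANIEL_KNIAZ\AppData\Roaming\MetaQuotes\Terminal\MY_TERMINAL_ID_IN_HEX_FORMAT\MQL4\Files\iqoptionfeed'
os.makedirs(FEED_DIR, exist_ok=True)

def dump_candles(symbol, candles):
    # one JSON object per line, unique file name per snapshot
    name = "_" + symbol + "_" + datetime.now().strftime("%Y%m%d%H%M%S%f") + ".txt"
    with open(os.path.join(FEED_DIR, name), "w") as f:
        for candle in candles.values():
            f.write(json.dumps(candle) + "\n")

api = IQ_Option("email", "password")
api.start_candles_stream("EURUSD")
thread = api.collect_realtime_candles_thread_start("EURUSD", 100)
try:
    while True:
        time.sleep(1)
        for symbol, candles in api.thread_collect_realtime.items():
            dump_candles(symbol, candles)
finally:
    api.collect_realtime_candles_thread_stop(thread)
    api.stop_candles_stream("EURUSD")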
Next, you need that data in MT5.
The easiest way is to loop over the existing files, make sure you can read them (or give up if you cannot), delete each file after reading, and then process the data received.
The easier and faster way would of course be to use ZeroMQ, but let us do it without DLLs.
In order to read the files, you need to set up a timer that works as fast as possible and let it run. Since you cannot make a Windows application sleep for less than 15.6 ms, your timer should sleep for roughly that long.
string path;
string folder="iqoptionfeed\\";
int OnInit()
  {
   EventSetMillisecondTimer(16);                 // poll the folder roughly every 16 ms
   path=folder+"*";
   return(INIT_SUCCEEDED);
  }
void OnDeinit(const int reason) { EventKillTimer(); }

string _fileName;
long _search_handle;
void OnTimer()
  {
   _search_handle=FileFindFirst(path,_fileName);
   if(_search_handle!=INVALID_HANDLE)
     {
      do
        {
         ResetLastError();
         FileIsExist(_fileName);
         if(GetLastError()!=ERR_FILE_IS_DIRECTORY)
            processFile(folder+_fileName);        // FileFindFirst returns only the name, so prepend the folder
        }
      while(FileFindNext(_search_handle,_fileName));
      FileFindClose(_search_handle);
     }
  }
This piece of code loops over the folder and processes each file it manages to find.
Now reading the file (two functions) and processing the message inside it:
void processFile(const string fileName)
  {
   string message;
   if(ReadFile(fileName,message))
      processMessage(message,fileName);
  }

bool ReadFile(const string fileName,string &result,const bool common=false)
  {
   const int handle=FileOpen(fileName,common?(FILE_COMMON|FILE_READ):FILE_READ);
   if(handle==INVALID_HANDLE)
     {
      printf("%i - failed to find file %s (probably doesn't exist!). error=%d",__LINE__,fileName,GetLastError());
      return(false);
     }
   Read(handle,result);
   FileClose(handle);
   if(!FileDelete(fileName,common?FILE_COMMON:0))
      printf("%i - failed to delete file %s/%d. error=%d",__LINE__,fileName,common,GetLastError());
   return(true);
  }

void Read(const int handle,string &message)
  {
   string text="";
   while(!FileIsEnding(handle) && !IsStopped())
     {
      text+=FileReadString(handle)+"\n";   // read the file line by line
     }
   //printf("%i %s - %s.",__LINE__,__FUNCTION__,text);
   message=text;
  }
And last but not least: process the obtained file.
As suggested above, the file has one JSON-formatted tick per line, separated by line breaks.
Our goal is to add it to the symbol. In order to parse JSON, jason.mqh is an available solution, but you can of course parse it manually.
void processMessage(const string message,const string fileName)
  {
   string symbolName=getSymbolFromFileName(fileName);
   if(!SymbolSelect(symbolName,true))
     {
      if(!CustomSymbolCreate(symbolName))
         return;
     }
   string lines[];
   int size=StringSplit(message,(ushort)'\n',lines);
   for(int i=0;i<size;i++)
     {
      if(StringLen(lines[i])==0)
         continue;
      CJAVal jLine(jtUNDEF,NULL);
      jLine.Deserialize(lines[i]);
      //here I assume that you receive a json line like " { "time":2147483647,"bid":1.16896,"ask":1.16906,"some_other_data":"someOtherDataThatYouMayAlsoUse" } "
      MqlTick ticks[1];                    // CustomTicksAdd expects an array of ticks
      ticks[0].time=(datetime)jLine["time"].ToInt();
      ticks[0].bid=(double)jLine["bid"].ToDbl();
      ticks[0].ask=(double)jLine["ask"].ToDbl();
      ResetLastError();
      if(CustomTicksAdd(symbolName,ticks)<0)
         printf("%i %s - failed to upload tick: %s %s %.5f %.5f. error=%d",__LINE__,__FILE__,symbolName,TimeToString(ticks[0].time),ticks[0].bid,ticks[0].ask,GetLastError());
     }
  }
string getSymbolFromFileName(const string fileName)
  {
   string elements[];
   int size=StringSplit(fileName,(ushort)'_',elements);
   if(size<2)
      return NULL;
   return elements[1];
  }
Do not forget to add debugging info and to check GetLastError() if for some reason you get errors.
Can this work in a back tester? Of course not. First, OnTimer() is not supported in the MQL tester. Next, you need some history records in order to make it run. If you do not have any history, nobody can help you unless a broker can give it to you; the best idea could be to start collecting and storing it right now, so that when the project is ready (maybe in another couple of months) you will have the data and will be able to test and optimize the strategy with the available dataset. You can apply the collected set in the tester (MQL5 is really the next step in algo trading development compared to MQL4), either manually or with something like tickDataSuite and its Csv2Fxt.ex4 file that builds HST binary files the tester can read and process; anyway, that is another question, and nobody can tell you whether your broker stores their data somewhere to provide it to you.

After re-reading what you wrote (and edited), I can see you want:
a symbol synchronized with iqoption [ through your proxy / remotely ]
The symbol could be used for backtesting
The symbol could be used for on-screen live/strategy/indicator run
That implies operations outside a strategy/indicator, which MT platforms do not allow in an automated manner. You can achieve it manually by providing a data package, parsing it to CSV and importing it into the custom symbol creator (a rough sketch of that conversion follows below); this is well documented here.
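If you go that manual route, a small helper along these lines could turn the tick files collected in the other answer into a CSV for the import dialog. This is only a sketch: the folder paths are placeholders, the tick fields (time/bid/ask) are the ones assumed above, and the column layout should be checked against what the MT5 import dialog actually expects.

import csv
import glob
import json
from datetime import datetime

# hypothetical paths; point them at the folder the tick files were written to
FEED_DIR = r'C:\path\to\MQL5\Files\iqoptionfeed'
OUT_CSV = r'C:\path\to\EURUSD_ticks.csv'

with open(OUT_CSV, 'w', newline='') as out:
    writer = csv.writer(out, delimiter='\t')
    # assumed header; verify the exact columns in the import dialog
    writer.writerow(['<DATE>', '<TIME>', '<BID>', '<ASK>'])
    for path in sorted(glob.glob(FEED_DIR + r'\_EURUSD_*.txt')):
        with open(path) as f:
            for line in f:
                if not line.strip():
                    continue
                tick = json.loads(line)
                stamp = datetime.utcfromtimestamp(tick['time'])
                writer.writerow([stamp.strftime('%Y.%m.%d'),
                                 stamp.strftime('%H:%M:%S.%f')[:-3],
                                 tick['bid'], tick['ask']])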
Unfortunately, you chose a platform that is by design meant for self-contained strategies and indicators, more for beginners than for professionals taking it seriously.
Refer to the link I provided and see for yourself. The official documentation states you can create a custom symbol via the MQL reference, yet even though it states in the foreword that this allows third-party providers, it is not referenced anywhere else and does not show any integration possibilities.
custom indicators
custom symbol properties

Related

Write a custom JSON interpreter for a file that looks like JSON but isn't, using Python

What I need to do is write a module that can read and write files that use the PDX script language. This language looks a lot like JSON but has enough differences that a custom encoder/decoder is needed to do anything with those files (without a mess of regex substitutions, which would make maintenance hell). I originally went with just reading them as txt files and using regex to find and replace things to convert them to valid JSON. This led me to my current point, where any addition to the code requires me to write far more code than I would want to, just to support some small new thing. With a custom JSON-like parser I could write code that declares what the valid key:value pairs are, then use that to handle the files. To me that will be a lot less code and a lot easier to maintain.
So what does this code look like? In general it looks like this (I tried to include all possible syntax; this is not an example of a working file):
#key = value # this is the definition for the scripted variable
key = {
    # This is a comment. No multiline comments
    function # This is a single key, usually optimize_memory

    # These are the accepted key:value pairs. The quoted version is being phased out
    key = "value"
    key = value
    key = #key # This key is using a scripted variable, defined either in the file it's in or in the `scripted_variables` folder. (see above for example on how these are initially defined)

    # type is what the key type is. Like trigger:planet_stability where planet_stability is a trigger
    key = type:key

    # Variables like this allow for custom names to be set. Mostly used for flags and such things
    [[VARIABLE_NAME]
        math_key = $VARIABLE_NAME$
    ]

    # this is inline math, I don't actually understand how this works in the script language yet as it's new. The "<" can be replaced with any math symbol.
    # Valid example: planet_stability < #[ stabilitylevel2 + 10 ]
    key < #[ key + 10 ]

    # This is used a lot to handle code blocks. Valid example:
    # potential = {
    #     exists = owner
    #     owner = {
    #         has_country_flag = flag_name
    #     }
    # }
    key = {
        key = value
    }

    # This is just a list. Inline brackets are used a lot which annoys me...
    key = { value value }
}
The major differences between JSON and PDX script are the nearly complete lack of quotation marks, the use of an equals sign instead of a colon for separation, and the absence of commas at the end of lines. Now before you ask me to change the PDX code: I can't. It's not mine. This is what I have to work with and I can't make any changes to the syntax. And no, I don't want to convert back and forth, as I have already mentioned that would require a lot of work. I have attempted to look for examples of this, however all I can find are references to converting already valid JSON to a Python object, which is not what I want. So I can't give any examples of what I have already done, as I can't find anywhere to even start.
Some additional info:
Order of key:value pairs does not technically matter, however it is expected to be in a certain order, and when not in that order it causes issues with mods and conflict solvers
bool properties always use yes or no rather than true or false
Lowercase is expected and in some cases required
Math operators are used as separators as well, e.g. >=, <=, etc.
The list of syntax is not exhaustive, but should contain most of the syntax used in the language
Past work:
My last attempts at this all revolved around converting it from a text file to a JSON file. This was a lot of work just to get a small piece of this to work.
Example:
potential = {
    exists = owner
    owner = {
        is_regular_empire = yes
        is_fallen_empire = no
    }
    NOR = {
        has_modifier = resort_colony
        has_modifier = slave_colony
        uses_habitat_capitals = yes
    }
}
And what I did to get most of the way to JSON (I couldn't find a way to add quotes):
test_string = test_string.replace("\n", ",")
test_string = test_string.replace("{,", "{")
test_string = test_string.replace("{", "{\n")
test_string = test_string.replace(",", ",\n")
test_string = test_string.replace("}, ", "},\n")
test_string = "{\n" + test_string + "\n}"
# Replace the equals sign with a colon
test_string = test_string.replace(" =", ":")
This resulted in this:
{
    potential: {
        exists: owner,
        owner: {
            is_regular_empire: yes,
            is_fallen_empire: no,
        },
        NOR: {
            has_modifier: resort_colony,
            has_modifier: slave_colony,
            uses_habitat_capitals: yes,
        },
    }
}
Very very close, yes, but in no way could I find a way to add the quotations to each word (I think I did try a regex sub, but wasn't able to get it to work, since this whole thing is just one unbroken string). That left this attempt stuck and also showed just how much work is required just to get a very simple potential block to mostly work. However this is not the method I want anymore: for one it's a lot of work, and two, I couldn't find anything to finish it. So a custom JSON-like interpreter is what I want.
The classical approach (potentially leading to more code, but also more "correctness"/elegance) is probably to build a "recursive descent parser", from a bunch of conditionals/checks, loops and (sometimes recursive?) functions/handlers to deal with each of the encountered elements/characters on the input stream. An implicit parse/call tree might be sufficient if you directly output/print the JSON equivalent, or otherwise you could also create a representation/model in memory for later output/conversion.
A related book recommendation would be "Language Implementation Patterns" by Terence Parr (me avoiding promoting my own interpreters and introductory materials :-)). In case you need further help, feel free to write to me.
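To make the recursive-descent idea concrete, here is a minimal, untested sketch in Python. It handles only `key = value`, nested `key = { ... }` blocks, bare lists and `#` comments; scripted variables, the `[[VARIABLE_NAME]` form and inline math are deliberately left out, and all names are illustrative rather than a finished library.

import re

TOKEN = re.compile(r'"[^"]*"|[{}=]|[^\s{}=#]+')

def tokenize(text):
    tokens = []
    for line in text.splitlines():
        line = line.split('#', 1)[0]          # strip comments
        tokens.extend(TOKEN.findall(line))
    return tokens

def parse_block(tokens, i):
    """Parse until '}' or end of input; return (items, next_index)."""
    items = []
    while i < len(tokens) and tokens[i] != '}':
        key = tokens[i].strip('"')
        if i + 1 < len(tokens) and tokens[i + 1] == '=':
            if tokens[i + 2] == '{':
                value, i = parse_block(tokens, i + 3)
                i += 1                         # skip the closing '}'
            else:
                value = tokens[i + 2].strip('"')
                i += 3
            items.append((key, value))         # keep pairs as a list, duplicates are allowed
        else:                                  # bare value inside a list
            items.append(key)
            i += 1
    return items, i

def parse(text):
    items, _ = parse_block(tokenize(text), 0)
    return items

if __name__ == '__main__':
    sample = '''
    potential = {
        exists = owner
        owner = {
            is_regular_empire = yes
            is_fallen_empire = no
        }
    }
    '''
    print(parse(sample))

From here you could either emit JSON directly while parsing, or keep the in-memory list/tuple model and convert it afterwards, as described above.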

How to load json files with rasdaman

I'm studying array database management systems a bit, in particular rasdaman. I understand the architecture superficially and how the system works with sets and multidimensional arrays instead of tables, as is usual in relational DBMSs. I'm trying to save my own type of data to check whether this kind of database can give me better performance for my specific problem (geospatial data in a particular format: DGGS). To do so I created my own basic type based on a struct as indicated by the documentation, created my array type, set type and finally my collection for testing, and I'm trying to insert data into this collection with the following idea:
query_executor.execute_update_from_file("insert into test_json_dict values decode($1, 'json', '{\"formatParameters\": {\"domain\": \"[0:1000]\",\"basetype\": struct { char k, long v } } })'", "...path.../rasdapy-demo/dggs_sample.json")
I'm using the rasdapy library to work from Python instead of using rasql only (I use it anyway to validate small things), but I have been fighting with error messages that give little to no information:
Internal error: RasnetClientComm::executeQuery(): illegal status value 5
My source file has this kind of data in it:
{
"N1": 6
}
A simple dict with a key and a value; I want to save both things. I also tried a bigger dict with multiple keys and values in it, but since the rasdaman decode function expects a basetype definition (if I understand correctly), I changed my data source format to a simple dict. It is obvious that I'm not writing the appropriate definition for decoding, or that my source file has the wrong format, but I haven't been able to find any examples on the web. Any ideas on how to proceed? Maybe I'm even approaching this whole thing from the wrong perspective and should try the OGC Web Coverage Service (WCS) standard? I don't understand that yet, so I have been avoiding it. Anyway, any advice or direction is greatly appreciated. Thanks in advance.
Edit:
I have been trying to load CSV data with the following format:
1 930
2 461
..
and the following query
query_executor.execute_update_from_file("insert into test_json_dict values decode($1, 'csv', '{\"formatParameters\": {\"domain\": \"[1:255]\",\"basetype\": struct { char key, long value } } })'", "...path.../rasdapy-demo/dggs_sample_4.csv")
but still no results, even though it looks quite similar to the documentation example (see the CSV/JSON examples there). What could be the issue?
It seems that my problem was trying to use the rasdapy library. The lib works fine, but when working with data formats like CSV and JSON it is best to use the rasql command line option. The documentation states:
filePaths - An array of absolute paths to input files to be decoded, e.g. ["/path/to/rgb.tif"]. This improves ingestion performance if the data is on the same machine as the rasdaman server, as the network transport is bypassed and the data is read directly from disk. Supported only for GDAL, NetCDF, and GRIB data formats.
and also it says:
As a first parameter the data to be decoded must be specified. Technically this data must be in the form of a 1D char array. Usually it is specified as a query input parameter with $1, while the binary data is attached with the --file option of the rasql command-line client tool, or with the corresponding methods in the client API.
It would be interesting to note whether rasdapy takes this into account. Anyhow, using rasql gives far better error responses, so I recommend it to anyone having a similar problem.
An example command could be:
rasql -q 'insert into test_basic values decode($1, "csv", "{ \"formatParameters\": {\"domain\": \"[0:1,0:2]\",\"basetype\": \"long\" } }")' --out string --file "/home/rasdaman/Documents/TFM/include/DGGS-Comparison/rasdapy-demo/dggs_sample_6.csv" --user rasadmin --passwd rasadmin
using this data:
1,2,3,2,1,3
After that you just have to start making it more and more complex as you need.
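If you still want to drive this from Python, one option (untested) is simply to shell out to the same rasql command shown above. The helper below just rebuilds that exact command line, so all flags come from the example; the collection name, credentials and paths are placeholders.

import subprocess

def rasql_insert_csv(collection, domain, basetype, csv_path,
                     user='rasadmin', passwd='rasadmin'):
    # build the same query string as in the rasql example above;
    # the \" sequences must reach rasql literally
    fmt = '{ \\"formatParameters\\": {\\"domain\\": \\"%s\\", \\"basetype\\": \\"%s\\" } }' % (domain, basetype)
    query = 'insert into %s values decode($1, "csv", "%s")' % (collection, fmt)
    subprocess.run(['rasql', '-q', query, '--out', 'string',
                    '--file', csv_path, '--user', user, '--passwd', passwd],
                   check=True)

rasql_insert_csv('test_basic', '[0:1,0:2]', 'long',
                 '/home/rasdaman/Documents/TFM/include/DGGS-Comparison/rasdapy-demo/dggs_sample_6.csv')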

c++ read map in binary file which created in python

I created a Python script which creates the following map (illustration):
map<uint32_t, string> tempMap = {{2,"xx"}, {200, "yy"}};
and saved it as map.out file (a binary file).
When I try to read the binary file from C++, it doesn't copy the map, why?
map<uint32_t, string> tempMap;
ifstream readFile;
std::streamsize length;

readFile.open("somePath\\map.out", ios::binary | ios::in);
if (readFile)
{
    readFile.ignore( std::numeric_limits<std::streamsize>::max() );
    length = readFile.gcount();
    readFile.clear();    // Since ignore will have set eof.
    readFile.seekg( 0, std::ios_base::beg );

    readFile.read((char *)&tempMap, length);
    for (auto &it : tempMap)
    {
        /* cout<<("%u, %s",it.first, it.second.c_str()); ->Prints map*/
    }
}
readFile.close();
readFile.clear();
It's not possible to read in raw bytes and have it construct a map (or most containers, for that matter)[1]; and so you will have to write some code to perform proper serialization instead.
If the data being stored/loaded is simple, as per your example, then you can easily devise a scheme for how this might be serialized, and then write the code to load it. For example, a simple plaintext mapping can be established by writing the file with each member after a newline:
<number>
<string>
...
So for your example of:
std::map<std::uint32_t, std::string> tempMap = {{2,"xx"}, {200, "yy"}};
this could be encoded as:
2
xx
200
yy
In which case the code to deserialize this would simply read each value 1-by-1 and reconstruct the map:
// Note: untested
#include <cstdint>
#include <filesystem>
#include <fstream>
#include <istream>
#include <map>
#include <string>
#include <utility>

auto loadMap(const std::filesystem::path& path) -> std::map<std::uint32_t, std::string>
{
    auto result = std::map<std::uint32_t, std::string>{};
    auto file = std::ifstream{path};

    while (true) {
        auto key = std::uint32_t{};
        auto value = std::string{};

        if (!(file >> key)) { break; }
        file >> std::ws;                      // skip the newline left after reading the number
        if (!std::getline(file, value)) { break; }

        result[key] = std::move(value);
    }
    return result;
}
Note: for this to work, you need your Python program to output the same format that will be read by your C++ program.
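As a rough sketch of that Python side (untested; the map.out file name and the plain-text layout simply mirror the loadMap() example above):

# write each key on one line and its value on the next,
# matching what loadMap() above expects to read
temp_map = {2: "xx", 200: "yy"}

with open("map.out", "w") as f:
    for key in sorted(temp_map):
        f.write("%d\n%s\n" % (key, temp_map[key]))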
If the data you are trying to read/write is sufficiently complicated, you may look into different serialization interchange formats. Since you're working between Python and C++, you'll need to look into libraries that support both. For a list of recommendations, see the answers to Cross-platform and language (de)serialization.
[1]
The reason you can't just read (or write) the whole container as bytes and have it work is because data in containers isn't stored inline. Writing the raw bytes out won't produce something like 2 xx\n200 yy\n automatically for you. Instead, you'll be writing the raw addresses of pointers to indirect data structures such as the map's internal node objects.
For example, a hypothetical map implementation might contain a node like:
template <typename Key, typename Value>
struct map_node
{
    Key key;
    Value value;

    map_node* left;
    map_node* right;
};
(The real map implementation is much more complicated than this, but this is a simplified representation)
If map<Key,Value> contains a map_node<Key,Value> member, then writing this out in binary will write the binary representation of key, value, left, and right -- the latter of which are pointers. The same is true with any container that uses indirection of any kind; the addresses will fundamentally differ between the time they are written and read, since they depend on the state of the program at any given time.
You can write a simple map_node to test this, and just print out the bytes to see what it produces; the pointer will be serialized as well. Behaviorally, this is the exact same as what you are trying to do with a map and reading from a binary file. See the below example which includes different addresses.
Live Example
You can use Protocol Buffers to serialize your map in Python, and the deserialization can be performed in C++.
Protocol Buffers supports both Python and C++.
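For example, the Python side might look roughly like this (a sketch: the message definition and the generated module name kv_map_pb2 are made up for illustration, produced by running protoc --python_out=. on the .proto shown in the comment):

# kv_map.proto (hypothetical):
#
#   syntax = "proto3";
#   message KvMap {
#     map<uint32, string> entries = 1;
#   }

import kv_map_pb2  # generated module name is an assumption

msg = kv_map_pb2.KvMap()
msg.entries[2] = "xx"
msg.entries[200] = "yy"

with open("map.out", "wb") as f:
    f.write(msg.SerializeToString())

# the C++ side would then call KvMap::ParseFromString / ParseFromIstream on the same bytes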

I want to find a similar API to org.sikuli.api.DesktopScreenRegion to verify if one image exists in another

I'm writing Python code for iOS automation using Appium, but got stuck on this issue:
The checkpoint is to check some cells in a tableView when changing the device location setting (the cells will be different, both in number and text): whether the text and image appear in the cell correctly and the image matches the text, e.g. a football icon matches "football". I tried to find a WebDriver API to capture a snapshot of the cell only, but all the methods I found are on the driver and can only capture the whole screen.
So we found a way to check this: first we capture the correct screenshot of the cells manually; then, while running the automation script, we use the correct cell_screenshot to check whether it appears on the screen.
Now my workmates have the Java code for this:
import org.sikuli.api.DesktopScreenRegion;
import org.sikuli.api.ImageTarget;
import org.sikuli.api.ScreenRegion;
import org.sikuli.api.Target;
import java.io.File;

public class SnapShot {
    static DesktopScreenRegion sr = new DesktopScreenRegion();
    static Target image;
    static ScreenRegion result;
    static String fileURL;

    private static String getImageURL(String fileURL) {
        return "screenshots/" + fileURL;
    }

    public static boolean imageExists(String fileURL, Double similar, int timeout) {
        timeout = timeout * 1000;
        image = new ImageTarget(new File(getImageURL(fileURL)));
        image.setMinScore(similar);
        result = sr.wait(image, timeout);
        if (result == null) {
            System.out.println("Can not find image");
            return false;
        }
        return true;
    }
}
This code uses imageExists to judge whether one image can be found inside another.
I searched for a Sikuli API for Python, but cannot find anything; it looks like there is a Java API only.
Now I'm stuck at this; could anyone help? Thanks a lot!
Sikuli supports Jython as well. In fact that is the default scripting language for Sikuli. So you can write similar logic in Jython and then call that script from Python (if at all you can). I have seen users using Java to call Sikuli scripts. Refer here for Sikuli details.
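If Jython is not an option, the same "does this small image appear inside that bigger one?" check can also be sketched in plain Python with OpenCV template matching. This is not Sikuli itself, just a substitute for the check; the file names and the 0.9 similarity threshold below are placeholders.

import cv2

def image_exists(screenshot_path, template_path, similar=0.9):
    screen = cv2.imread(screenshot_path)
    template = cv2.imread(template_path)
    result = cv2.matchTemplate(screen, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)
    return max_val >= similar     # best match score vs. required similarity

print(image_exists('screen.png', 'screenshots/cell_football.png'))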

Parsable and "linkable" format for Eclipse

I would like to get the output of a command line tool (which I'm working on) in a format which Eclipse understands.
For example there should be links to files and line numbers, and Eclipse should make it possible to jump to that file.
We can't really be bothered to create a real Eclipse extension; is there a lightweight way to achieve this?
I would imagine that even HTML/XML in some format might work, but I can't find anything clear in the docs.
EDIT
After some more attempts I understood the following: with a simple Java program this actually works:
public class Prova {
    public static void main(String[] args) {
        System.out.println("message (/home/andrea/workspace/simple/src/Prova.java:2)");
    }
}
Producing exactly the same output with a Python script instead doesn't work:
import sys

if __name__ == '__main__':
    print "message (/home/andrea/workspace/simple/src/Prova.java:2)"
And it doesn't become an actual link.
So I guess this isn't a global Eclipse feature but only works under certain conditions, and I would really like to know what they are.
Currently the only pattern detected by PyDev is something like:
print r'File "c:\path\to\file.py", line 1'
This is implemented in PyDev at: org.python.pydev.debug.ui.PythonConsoleLineTracker (github: https://github.com/aptana/Pydev/blob/master/plugins/org.python.pydev.debug/src/org/python/pydev/debug/ui/PythonConsoleLineTracker.java -- you can see the pattern used there)
Note that the link will only be created if it's able to find the file.
Patches are welcome if you want some other format to be matched (or to make that more customizable).
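As a small illustration of that pattern (the path here is a placeholder; the file must really exist on disk, otherwise no link is created):

def report(path, line, message):
    # PyDev's console line tracker only links the 'File "<path>", line <n>' form
    print(message)
    print('File "%s", line %d' % (path, line))

report(r'c:\path\to\file.py', 1, 'something worth jumping to')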
Not sure if this helps, but if you print (filename:lineNumber), Eclipse will convert it to a link.
Example:
System.out.println("message (Hello.java:2)");
