
fx-bt-scripts's Issues

dl_bt_dukascopy.py: Add -d for downloading period of specific days. [$20 awarded]

Add -d for downloading specific days (similar to -m).
Ideally, numbers could be separated either by commas or given as a range (e.g. 2-5).

File: dl_bt_dukascopy.py

--- The **[$20 bounty](https://www.bountysource.com/issues/31083726-dl_bt_dukascopy-py-add-d-for-downloading-period-of-specific-days?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).
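The comma/range syntax above can be handled by a small parser; a sketch (the helper name `parse_days` is an assumption, not part of the script):

```python
# Hypothetical helper for dl_bt_dukascopy.py: expand a -d value such as
# "2-5,10" into a sorted list of day numbers.
def parse_days(spec):
    days = set()
    for part in spec.split(','):
        if '-' in part:
            lo, hi = part.split('-', 1)
            days.update(range(int(lo), int(hi) + 1))
        else:
            days.add(int(part))
    return sorted(days)
```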

Script to create fake backtest data.

We need a CLI script gen_bt_data.py which would generate artificial backtest data.

The output format is CSV (as shown here).

Expected command-line parameters are:

  • start_date end_date start_value end_value, e.g. to generate data for 2 days:

    ./gen_bt_data.py 2014.01.01 2014.01.02 1.0 2.0
    

    which would start from the specified value 1.0 and end with 2.0 (incrementing/decrementing the values accordingly)

  • optional -d for number of digits (default: 5, e.g. 1.00000)

  • optional -s for spread in points (the difference between buy and sell), default 10 (1.00000,1.00010 in case of 5 digits)

  • optional -d for density (how many changes) per minute (default: 1), 60 means every second, 120 every half a second

  • optional -p for pattern type (possible options: none (default), wave, curve, zigzag, random)

    • all patterns except random should be generated deterministically (the same way each time)
    • specify volatility/amplitude with -v for sin/cos waves and others
  • optional -o file for output file, otherwise print to the standard output

Patterns

  • none, straight line based on start and end value

  • zigzag pattern:

  • wave

  • curve

    (screenshot: example of the curve pattern)
  • random, anything random as long as it ends up at the ending value (-v describes the volatility of the randomness)
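In the simplest case, the none pattern amounts to linear interpolation between the start and end values; a minimal sketch, assuming one tick per step and a spread given in points (the function name is hypothetical):

```python
# Sketch of the "none" (straight line) pattern for a hypothetical
# gen_bt_data.py: interpolate linearly between start and end value.
# Spread is in points; bid/ask pairs are returned.
def gen_line(start_value, end_value, n_ticks, digits=5, spread=10):
    point = 10 ** -digits
    rows = []
    for i in range(n_ticks):
        t = i / max(n_ticks - 1, 1)
        bid = round(start_value + (end_value - start_value) * t, digits)
        ask = round(bid + spread * point, digits)
        rows.append((bid, ask))
    return rows
```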

Support for all non-standard timeframes

Currently the conversion script supports only a limited number of timeframes:

timeframe_conv = {
        'm1': 1, 'm5':  5, 'm15': 15, 'm30':  30,
        'h1':60, 'h4':240, 'd1':1440, 'w1':10080, 'mn':43200
}

Change the logic so that the number of minutes is calculated automatically from the given timeframe (-t), e.g.

  • M2, M5, M6
  • H1, H2, H3
  • D1, D2
  • W1, W2
  • MN1, MN2

See Timeframes page for more details.
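The per-timeframe minute counts can be derived instead of hard-coded; a sketch, assuming timeframe strings of the form listed above:

```python
import re

# Compute minutes for an arbitrary timeframe string instead of a fixed
# lookup table, so M2, H3, D2, W2, MN2 etc. all work.
UNIT_MINUTES = {'M': 1, 'H': 60, 'D': 1440, 'W': 10080, 'MN': 43200}

def timeframe_to_minutes(tf):
    m = re.fullmatch(r'(MN|[MHDW])(\d+)', tf.upper())
    if not m:
        raise ValueError('unknown timeframe: %s' % tf)
    unit, count = m.group(1), int(m.group(2))
    return UNIT_MINUTES[unit] * count
```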

Filename

The generated FXT filename must follow the pattern SSSSSSPP_M.fxt, where:

  • SSSSSS - is the symbol under test;
  • PP - the value of the tested symbol period in minutes;
  • M - testing model (0 - "Every tick", 1 - "Control points", 2 - "Open prices only"). Currently it's only 0, see: #89

Currently the filename is generated correctly for the Every tick model (with 0); we need to add support for 1 and 2. If the model is not specified, generate the Every tick model (0) by default.
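The naming rule can be sketched as follows (a hedged illustration; the helper name is an assumption):

```python
# Sketch of the SSSSSSPP_M.fxt naming rule: symbol, period in minutes,
# and testing model (0 = Every tick, 1 = Control points, 2 = Open prices only).
def fxt_filename(symbol, period_minutes, model=0):
    return '%s%d_%d.fxt' % (symbol, period_minutes, model)
```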

Steps

  1. In some folder, clone repo with CSV files:

     git clone --branch EURUSD-2014 --single-branch https://github.com/FX31337/FX-BT-Data-EURUSD-DS
    
  2. Combine CSV data from 2 days into a single file:

     find FX*  \( -name "2014-02-02*" -o -name "2014-02-03*" \) -exec cat {} ';' | sort > ticks.csv
    

    or 4 days:

     find FX* \( -name "2014-02-02*" -o -name "2014-02-03*" -o -name "2014-02-04*" -o -name "2014-02-05*" \) -exec cat {} ';' | sort > ticks.csv
    
  3. Clone this repo:

     git clone https://github.com/FX31337/FX-BT-Scripts
    
  4. Convert the CSV file into FXT for the specified timeframes:

     ./FX-BT-Scripts/convert_csv_to_mt.py -v -i ticks.csv -f fxt4 -t M1,M2,M12,H2,H6,H12,D1,D2,W1,MN1
    

    Similarly for the hst4 format and others, if relevant.

  5. Read the generated files with:

     ./FX-BT-Scripts/convert_mt_to_csv.py -i EURUSD1_0.fxt -f fxt4 | less
    

Script to read mqlcache.dat

  1. Install MT4.
  2. Run and open folder from menu File -> Open Data Folder.

You should have the following files created:

./MQL4/Experts/mqlcache.dat
./MQL4/Indicators/mqlcache.dat
./MQL4/Libraries/mqlcache.dat
./MQL4/Scripts/mqlcache.dat

Extend read_mt_formats.py to read these files.

Est. 6-12h

Support for XAUUSD

Correct values:

2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Market constants: Digits: 2, Point: 0.01, Min Lot: 1, Max Lot: 10000, Lot Step: 1, Lot Size: 1, Margin Required: 5.5, Margin Init: 0, Stop Level: 0pts, Freeze level: 0pts
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Contract specification for XAUUSD: Profit mode: 0, Margin mode: 0, Spread: 10pts, Tick size: 0.01, Point value: 0.01, Digits: 2, Trade stop level: 0, Trade contract size: 1
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Swap specification for XAUUSD: Mode: 0, Long/buy order value: 0, Short/sell order value: 0
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Calculated variables: Pip size: 0.01, Lot size: 1, Points per pip: 1, Pip digits: 2, Volume digits: 0, Spread in pips: 10.0, Stop Out Level: 1.0
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: EA params: Risk margin: 1%
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Strategies: Active strategies: 15 of 120, Max orders: 95 (per type: 35)
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Timeframes: M1: Active, M5: Active, M15: Active, M30: Active, H1: Active, H4: Active, D1: Active, W1: Active, MN1: Active;
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Datetime: Hour of day: 0, Day of week: 3, Day of month: 1, Day of year: 153, Month: 6, Year: 2016
2016.08.14 23:22:26.256 2016.06.08 14:33  XAUUSD,M1: Account Details: Time: 2016.06.01 00:00:00; Account Balance: £1000.00; Account Equity: £1000.00; Used Margin: £0.00; Free Margin: £1000.00; No of Orders: 0 (BUY/SELL: 0/0); Risk Ratio: 1.0

UnboundLocalError: local variable 'uniBar' referenced before assignment [$15 awarded]

File: convert_csv_to_mt.py

This build shows the error:

Converting the 5m timeframe
[INFO] Trying to read data from /dev/stdin...
Traceback (most recent call last):
  File "/home/travis/.wine/drive_c/Program Files (x86)/MetaTrader 4/history/downloads/scripts/convert_csv_to_mt.py", line 415, in <module>
    HST574(CSV(args.inputFile), outputPath, timeframe, symbol)
  File "/home/travis/.wine/drive_c/Program Files (x86)/MetaTrader 4/history/downloads/scripts/convert_csv_to_mt.py", line 203, in __init__
    bars += self._packUniBar(uniBar)
UnboundLocalError: local variable 'uniBar' referenced before assignment

This could be related to the recent changes.
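The usual cause of this traceback is that uniBar is only assigned inside a loop over input ticks, so an empty (or not-yet-available) input leaves it undefined when the final flush runs. A hedged sketch of the guard (the variable name follows the traceback; the loop body is a stand-in for the real aggregation logic):

```python
# Guard the final flush so an empty input stream does not raise
# UnboundLocalError. `uniBar` mirrors the variable in the traceback.
def pack_bars(ticks, pack_uni_bar):
    bars = b''
    uniBar = None
    for tick in ticks:
        uniBar = tick          # stand-in: real code aggregates ticks into a bar
    if uniBar is not None:     # previously, at least one tick was assumed
        bars += pack_uni_bar(uniBar)
    return bars
```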

--- The **[$15 bounty](https://www.bountysource.com/issues/32293012-unboundlocalerror-local-variable-unibar-referenced-before-assignment?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

convert_csv_to_mt.py: Wrong Ask&Bid price in FXT file.

It seems the open/low/high/close values are incorrect in FXT files (see the header info), as below:

$ convert_mt_to_csv.py -i EURUSD1_0.fxt -f fxt4 -v
[INFO] Trying to read data from EURUSD1_0.fxt...
2014-01-01 22:00:00    1.37553    1.37553    1.37553    1.37553          1  2014-01-01 22:00:58  4
2014-01-01 22:00:00    1.37553    1.37553    1.37552    1.37552          1  2014-01-01 22:00:59  4

These should match the CSV file, and the logic should be similar to CSV2FXT.mq4, so that Ask and Bid prices can differ for each tick.

convert_csv_to_mt: Higher timeframes aren't correct.

Converting CSV into FXT doesn't seem to be valid for higher timeframes.

It seems all files are the same size, e.g. see the files for 2015.

For example, the highest-timeframe file EURUSD43200_0.fxt.gz is 118 MB, but it should be much smaller. This was working previously (probably in 5bbf6e1).

See build: #120782122

Testing:

  1. Clone repo.
  2. Clone/download some CSV files.
  3. Run: ./convert_csv_to_mt.py -v -i all.csv -s EURUSD -p 20 -S default -t M1,M5,M15,M30,H1,H4,D1,W1,MN -f fxt4

To read FXT format, use: convert_mt_to_csv.py instead.

Est. 4-8h

convert_csv_to_mt.py to support multiple timeframes [$20 awarded]

The convert_csv_to_mt.py script should support multiple timeframes specified from the command-line, e.g.

convert_csv_to_mt.py -f fxt4 -t M1,M5,M15,M30,H1,H4,D1,W1,MN

which will generate separate files, just as it would when run once per timeframe.

Example syntax:

convert_csv_to_mt.py -v -i foo.csv -s EURUSD -p 20 -S default -t M5 -f hst4

Where csv files to test can be found in here.

The conversions can run in sequence, each preceded by a message naming the timeframe being processed, so the output does not stall for long periods.
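The multi-timeframe behaviour could be sketched as follows (`convert_one` stands in for the existing single-timeframe conversion, which is assumed):

```python
# Run one conversion per timeframe from a comma-separated -t value,
# announcing each so CI output does not stall.
def convert_all(input_file, timeframes, convert_one):
    for tf in timeframes.split(','):
        print('Converting the %s timeframe...' % tf)
        convert_one(input_file, tf)
```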

--- The **[$20 bounty](https://www.bountysource.com/issues/29789692-convert_csv_to_mt-py-to-support-multiple-timeframes?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

FXT format contains trailing zero closing prices (Python)

A fix is required in the convert_csv_to_mt.py file. The FXT file converted from CSV data files possibly contains zero values, as per the chart below.

In the platform, this can be tested by running terminal.exe (e.g. via Wine with X11 forwarding if inside a VM), then selecting File -> Open Offline from the menu and choosing the file (hover over the item to see the file path). The FXT file needs to be copied into the platform's tester/history folder to be visible (e.g. ~/.wine/drive_c/Program Files/MetaTrader 4/). Once opened, you should see this window (zoom out or resize the window to catch the zero value):

(screenshot: offline chart showing the zero value)

convert_csv_to_mt.py: Missing `end_of_test` before `freeze_level`.

This is what is in script:

    header += pack('<d', 1.25)                                                      # MarginDivider
    header += bytearray('EUR'.ljust(12, '\x00'), 'latin1', 'ignore')                # MarginCurrency
    header += bytearray(4)                                                          # 4 Bytes of padding
    # Commission calculation
    header += pack('<d', 0.0)                                                       # CommissionBase - basic commission
    header += pack('<i', 1)                                                         # CommissionType - basic commission type {COMM_TYPE_MONEY, COMM_TYPE_PIPS, COMM_TYPE_PERCENT}
    header += pack('<i', 0)                                                         # CommissionLots - commission per lot or per deal {COMMISSION_PER_LOT, COMMISSION_PER_DEAL}
    # For internal use
    header += pack('<i', 0)                                                         # FromBar - FromDate bar number
    header += pack('<i', 0)                                                         # ToBar - ToDate bar number
    header += pack('<6i', 1, 0, 0, 0, 0, 0)                                         # StartPeriod - number of bar at which the smaller period modeling started
    header += pack('<i', 0)                                                         # SetFrom - begin date from tester settings
    header += pack('<i', 0)                                                         # SetTo - end date from tester settings
    header += pack('<i', 0)                                                         # FreezeLevel - order's freeze level in points

and here is probably the correct version (see: fxt-405-refined.mqh):

  double            margin_divider;     // Margin divider.
  char              margin_currency[12];// Margin currency.
  // 420
  //---- Commission calculation.
  double            comm_base;          // Basic commission
  int               comm_type;          // Basic commission type          { COMM_TYPE_MONEY, COMM_TYPE_PIPS, COMM_TYPE_PERCENT }
  int               comm_lots;          // Commission per lot or per deal { COMMISSION_PER_LOT, COMMISSION_PER_DEAL }
  // 436
  //---- For internal use.
  int               from_bar;           // FromDate bar number.
  int               to_bar;             // ToDate bar number.
  int               start_period[6];    // Number of bar at which the smaller period modeling started.
  int               set_from;           // Begin date from tester settings.
  int               set_to;             // End date from tester settings.
  // 476
  //----
  int               end_of_test;
  int               freeze_level;       // Order's freeze level in points.

The difference is the end_of_test field, which is missing in the script, and the 4 bytes of padding, which are probably not there (based on the header file).

Can we find out which version is correct?
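Assuming the fxt-405-refined.mqh layout is the authoritative one, the tail of the header would pack end_of_test before FreezeLevel, e.g.:

```python
from struct import pack

# Sketch of the tail of the FXT header under the assumption that the
# fxt-405-refined.mqh layout is correct: end_of_test precedes freeze_level.
def header_tail():
    tail = b''
    tail += pack('<i', 0)   # SetFrom - begin date from tester settings
    tail += pack('<i', 0)   # SetTo - end date from tester settings
    tail += pack('<i', 0)   # end_of_test (missing from the current script)
    tail += pack('<i', 0)   # FreezeLevel - order's freeze level in points
    return tail
```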

Write script to read ticks.raw file. [$30 awarded]

Write a new script in Python 3 (e.g. read_ticks?) which will print the values from the ticks.raw file to the output (similar to the other read_symbols_x scripts).

It can use any output format, but the format of each field should match its type (ask/bid should print as floats, time as a timestamp, etc.).

Related: #33, #41, #42, #43

Format reference: ticks.raw.h

Sample file: ticks.raw.zip
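A reading skeleton might look like this; note that the record layout below is a placeholder assumption and must be replaced with the actual field list from ticks.raw.h — only the fixed-size-record reading pattern is the point:

```python
from struct import Struct

# PLACEHOLDER layout (symbol, time, bid, ask); substitute the real fields
# from ticks.raw.h.
TICK = Struct('<12sI2d')  # hypothetical: char[12] symbol, uint32 time, 2 doubles

def read_ticks(path):
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(TICK.size)
            if len(chunk) < TICK.size:
                break
            symbol, t, bid, ask = TICK.unpack(chunk)
            yield symbol.rstrip(b'\x00').decode('latin1'), t, bid, ask
```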

--- The **[$30 bounty](https://www.bountysource.com/issues/32284656-write-script-to-read-ticks-raw-file?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

socket.error: [Errno 110] Connection timed out

We should catch the error and retry the download (e.g. up to 5 times).

Error:

Downloading http://www.dukascopy.com/datafeed/EOANDEEUR/2015/09/06/07h_ticks.bi5 into: download/dukascopy/EOANDEEUR/2015/10/2015-10-06--07h_ticks.bi5...
Traceback (most recent call last):
  File "/usr/lib/python3.2/urllib/request.py", line 1542, in open
    return getattr(self, name)(url)
  File "/usr/lib/python3.2/urllib/request.py", line 1720, in open_http
    return self._open_generic_http(http.client.HTTPConnection, url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1700, in _open_generic_http
    http_conn.request("GET", selector, headers=headers)
  File "/usr/lib/python3.2/http/client.py", line 970, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.2/http/client.py", line 1008, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.2/http/client.py", line 966, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.2/http/client.py", line 811, in _send_output
    self.send(msg)
  File "/usr/lib/python3.2/http/client.py", line 749, in send
    self.connect()
  File "/usr/lib/python3.2/http/client.py", line 727, in connect
    self.timeout, self.source_address)
  File "/usr/lib/python3.2/socket.py", line 415, in create_connection
    raise err
  File "/usr/lib/python3.2/socket.py", line 406, in create_connection
    sock.connect(sa)
socket.error: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./dl_bt_dukascopy.py", line 384, in <module>
    ds.download()
  File "./dl_bt_dukascopy.py", line 284, in download
    urllib.request.urlretrieve(self.url, filename=self.path)
  File "/usr/lib/python3.2/urllib/request.py", line 151, in urlretrieve
    return _urlopener.retrieve(url, filename, reporthook, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1574, in retrieve
    fp = self.open(url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1546, in open
    raise IOError('socket error', msg).with_traceback(sys.exc_info()[2])
  File "/usr/lib/python3.2/urllib/request.py", line 1542, in open
    return getattr(self, name)(url)
  File "/usr/lib/python3.2/urllib/request.py", line 1720, in open_http
    return self._open_generic_http(http.client.HTTPConnection, url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1700, in _open_generic_http
    http_conn.request("GET", selector, headers=headers)
  File "/usr/lib/python3.2/http/client.py", line 970, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.2/http/client.py", line 1008, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.2/http/client.py", line 966, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.2/http/client.py", line 811, in _send_output
    self.send(msg)
  File "/usr/lib/python3.2/http/client.py", line 749, in send
    self.connect()
  File "/usr/lib/python3.2/http/client.py", line 727, in connect
    self.timeout, self.source_address)
  File "/usr/lib/python3.2/socket.py", line 415, in create_connection
    raise err
  File "/usr/lib/python3.2/socket.py", line 406, in create_connection
    sock.connect(sa)
IOError: [Errno socket error] [Errno 110] Connection timed out

Est. 1-2h

We need requirements.txt with required pip packages.

Currently on Debian ./dl_bt_dukascopy.py --help fails with:

ImportError: No module named lzma

We need to have requirements.txt with required pip packages which can be installed via pip3 automatically.

This can be generated via:

pip3 freeze | grep lzma > requirements.txt

Then installed via:

pip3 install -r requirements.txt

Tested, but it doesn't work as expected. (Note: lzma is part of the standard library since Python 3.3, so it does not appear in pip3 freeze; on Debian the missing module usually means Python was built without liblzma support.)

Script to convert CSV into HCC

Expand the convert_csv_to_mt.py script to convert CSV input files into HCC (as implemented as part of GH-77), so the -f parameter should accept the new hcc format (similar to how it works for the other hst/fxt formats).

codemill$120

dl_bt_dukascopy.py: Support for downloading commodities and stocks. [$20 awarded]

Currently it tries to download URLs like:

http://www.dukascopy.com/datafeed/E_YHOO/2012/02/04/22h_ticks.bi5

which does not exist. So the logic needs to be improved after this change: 8731410

Example run:

./dl_bt_dukascopy.py -p E_YHOO

Check dev branch for the right commodities and stocks mapping.

This shouldn't break existing downloading of currency pairs.

Check existing dl_bt_dukascopy.php to compare the logic.

Please check this page for details (manual download).

Est. 1-2h

--- The **[$20 bounty](https://www.bountysource.com/issues/31083586-dl_bt_dukascopy-py-support-for-downloading-commodities-and-stocks?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

TimeoutError: [Errno 60] Operation timed out

Test:

./dl_bt_dukascopy.py -v -c -D . -p EURUSD -y 2007,2008,2009,2010,2011

When the connection is slow, this error can happen:

Converting into CSV (./EURUSD/2010/02/2010-02-24--04h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2010/01/24/05h_ticks.bi5 into: ./EURUSD/2010/02/2010-02-24--05h_ticks.bi5...
Traceback (most recent call last):
  File "./FX-BT-Scripts/dl_bt_dukascopy.py", line 286, in download
    urllib.request.urlretrieve(self.url, filename=self.path)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 186, in urlretrieve
    with contextlib.closing(urlopen(url, data)) as fp:
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 161, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 469, in open
    response = meth(req, response)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 579, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 501, in error
    result = self._call_chain(*args)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 441, in _call_chain
    result = func(*args)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 684, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 463, in open
    response = self._open(req, data)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 481, in _open
    '_open', req)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 441, in _call_chain
    result = func(*args)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 1210, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/urllib/request.py", line 1185, in do_open
    r = h.getresponse()
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 1171, in getresponse
    response.begin()
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 351, in begin
    version, status, reason = self._read_status()
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/http/client.py", line 313, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/socket.py", line 374, in readinto
    return self._sock.recv_into(b)
TimeoutError: [Errno 60] Operation timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./FX-BT-Scripts/dl_bt_dukascopy.py", line 397, in <module>
    ds.download()
  File "./FX-BT-Scripts/dl_bt_dukascopy.py", line 292, in download
    print("Error: %s, reason: %s. Retrying (%i).." % (err.code, err.reason, i));
AttributeError: 'TimeoutError' object has no attribute 'code'

Please fix the exception handler so the download is retried instead of the script crashing with the error above.
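The AttributeError arises because TimeoutError is a plain OSError with no .code or .reason. A hedged sketch of a retrying handler that does not assume those attributes exist (the surrounding download method, retry count, and URL handling are assumptions):

```python
import socket
import urllib.error
import urllib.request

# Retry the download, and use getattr so both HTTPError (which has
# .code/.reason) and bare OSError/TimeoutError are reported safely.
def download_with_retry(url, path, tries=5):
    for i in range(1, tries + 1):
        try:
            urllib.request.urlretrieve(url, filename=path)
            return True
        except (urllib.error.URLError, OSError, socket.timeout) as err:
            code = getattr(err, 'code', '-')
            reason = getattr(err, 'reason', err)
            print('Error: %s, reason: %s. Retrying (%i)..' % (code, reason, i))
    return False
```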

codemill$20

Python script to add/modify values in symbols.raw

Write a script in Python 3, similar to read_mt_formats.py, which instead of reading modifies existing values, adds new records, or removes them.
Suggested script filename: modify_mt_format.py (or similar)

The format is already defined in SymbolsRaw class in bstruct_defs.py which can be reused.

The feature should include:

  • adding new symbol group into file based on the given name and the name to copy values from
  • removing symbol group from the file based on the given name
  • modifying selected existing known values such as: description, altName, baseCurrency, digits, spread, profitCalcMode, contractSize, stopsLevel, marginInit, marginMaintenance, marginHedged, marginDivider, pointSize, pointsPerUnit, marginCurrency (ideally look up the available key names from SymbolsRaw._fields)

Suggested input parameters:

  • -f (file) - input file name (mandatory)
  • -t (type)/--input-type - mandatory input file type (currently only symbolsraw)
  • -k (key) - key group name to use
  • -d - delete symbol name (specified by -k param) from the file (specified by -f)
  • -a (symbol-name) - add new symbol-name group with values copied from the symbol specified by -k, e.g. -a XAGUSD -k GOLD - create a new XAGUSD group with the same values as GOLD; if no -k is specified - create empty one (zero all values); if symbol already exists, do not add, display warning and exit with code 1
  • -m name=value - modify value of param key (name) with the given value, e.g. -k EURUSD -m digits=5; add support to modify multiple name/values at the same run (e.g. multiple -m, or separated by comma); -k is required for -m;

Related: #41

symbols.raw.zip
symbols.raw-samples.zip

Can't Ctrl-C when Python script is busy.

File: convert_csv_to_mt.py

When there is a long conversion, I can't stop the script by pressing Ctrl-C; something is blocking it.

The workaround is: Ctrl-Z, then kill %1. So I believe something is wrong.
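One way to make interruption behaviour explicit is to keep the top level free of broad except clauses (a bare `except:` swallowing KeyboardInterrupt is a common cause of unkillable scripts) and translate Ctrl-C into the conventional exit code; a sketch, where `run_conversion` is a stand-in for the real conversion loop:

```python
import sys

# Translate Ctrl-C into a clean exit instead of letting some inner
# handler swallow it.
def main(run_conversion):
    try:
        run_conversion()
    except KeyboardInterrupt:
        print('Interrupted.', file=sys.stderr)
        sys.exit(130)  # 128 + SIGINT, the conventional exit code
```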

Python script to read HCC files

Extend the read_mt_formats.py and convert_mt_to_csv.py scripts (both) to read HCC files containing historical data (similar to how it's done for the other formats).

The structure of this format is explained here (note: this info could be a bit out of date).

Then, after running the script (read_mt_formats.py), I expect all values to be printed to the screen in meaningful formats, for example:

2015-01-02 10:00:00 1.205450 1.205450 1.205140 1.205160    31     9 22000000 
2015-01-02 10:01:00 1.205160 1.205210 1.205030 1.205030    31     6 35000000 
2015-01-02 10:02:00 1.205030 1.205030 1.204680 1.204730    51     9 54500000 
2015-01-02 10:03:00 1.204730 1.204750 1.203460 1.203850    81    18 65700000 

There should be two types of output format for the read_mt_formats.py script: one for the header, another for the data itself.

Then convert_mt_to_csv.py should print only the data, in CSV, in the same format as for the existing HST or FXT formats (whichever is most suitable).

For unknown fields, you can add the placeholders (e.g. unknown1, unknown2).

See also: CSVtoHCC repo.

Please find the sample files in the attachment.

EURUSD.zip

Ref: codemill$160

Script to read srv format.

Extend read_mt_formats.py to read srv format.

You can check how different formats are defined in the bstruct_defs.py script (used by read_mt_formats.py); basically you need to find the type (such as double, integer, or string) and size of the individual fields. You can guess the names, as they are not so important. The obvious part is that the first field is the name, followed by the company name, IP, and port.

Samples: srv-files.zip

Suggested command to compare two binary files:

diff -y <(xxd FXCM-EURDemo01.srv) <(xxd FXCM-GBPDemo01.srv) | colordiff

Hints: mt5demo.zip

Est. 4-8h

Add a progress bar for convert_csv_to_mt.py during conversion [$20 awarded]

When running the script in CI for big files, it fails the build because there is no output for at least 10 minutes. See this build.

The script should have some simple progress bar. Here are some suggestions, but it could be anything easy to implement (the fewer external dependencies, the better).

Could be similar to pv -p command:

20.4MiB 0:00:02 [9.96MiB/s] [   <=>                                                                                                ]

or if the total is known:

100%[======================================>] 578,432     3.28MB/s   in 0.2s   

or anything else. It should not refresh too often, so as not to slow the conversion down (e.g. once for each 1%, or every 1,000,000 ticks if the total is not known).
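A dependency-free sketch along the lines suggested above (the class name is an assumption):

```python
import sys

# Percentage indicator refreshed only when the integer percent changes,
# so it does not slow the conversion down.
class Progress:
    def __init__(self, total):
        self.total = total
        self.last = -1

    def update(self, done):
        pct = done * 100 // self.total
        if pct != self.last:
            self.last = pct
            sys.stderr.write('\r%3d%%' % pct)
            sys.stderr.flush()
```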

Testing:

git clone --single-branch --branch EURUSD-2014 https://github.com/FX31337/FX-BT-Data-EURUSD-DS.git
cd FX*
curl -o Makefile https://raw.githubusercontent.com/FX31337/FX-BT-Data-Test/master/Makefile
make

--- The **[$20 bounty](https://www.bountysource.com/issues/32291461-add-a-progress-bar-for-convert_csv_to_mt-py-during-conversion?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Script to read accounts.ini binary data.

  1. Install MT4.
  2. Run and open folder from menu File -> Open Data Folder.
  3. Go to config/ folder, you should have accounts.ini file created.

It changes on configuration changes related to accounts (e.g. File -> Login to Trade Account) or when you create a demo account (File -> Open an Account). It remembers the account number and is used to set the account leverage.

Could be similar encryption as used for DAT files, see: convert_dat.py.

xor each byte of data with that of key, cycling over key as needed

Examples: accounts-files.zip.

The file should be in plain text format.

Related: #62
Est. 8-16h
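The quoted XOR scheme can be sketched as follows; the actual key used for accounts.ini is unknown here (see convert_dat.py for how the DAT-file key is derived):

```python
# XOR each byte of data with the corresponding byte of the key, cycling
# over the key as needed. XOR is its own inverse, so the same function
# both encrypts and decrypts.
def xor_cycle(data, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
```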


Details

  • Account Database

    The database of accounts (/Config/accounts.ini) of the terminal is bound to the user account in the operating system and to the computer hardware. If you try to authorize in the terminal under another account in the operating system or try to move the terminal data to another PC, whole base of accounts will be deleted as soon as the terminal is launched. In this connection, you must keep the accounts details (login and password) in a separate safe place.

  • Databases Security

    All databases of the terminal are encrypted and protected from using in other terminals. When transferring the terminal from one PC to another, there is no possibility of using the information accumulated in it (accounts, mailing, trade history). After the authorization with an account, the trade, mail and news databases are recovered, but account details can only be restored by appealing to the broker.

convert_csv_to_mt.py: TypeError: 'NoneType' object is not subscriptable [$20]

E.g. this one works fine:

find . -name "*.csv" -print0 | sort -z | xargs -r0 cat | ./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t M1 -d foo

However sometimes conversion fails when using multiple streams with the following error:

Traceback (most recent call last):
  File "./scripts/convert_csv_to_mt.py", line 417, in <module>
    FXT(CSV(args.inputFile), outputPath, timeframe, server, symbol, spread)
  File "./scripts/convert_csv_to_mt.py", line 256, in __init__
    header += pack('<i', int(firstUniBar['barTimestamp']))                          # FromDate - Date of first tick
TypeError: 'NoneType' object is not subscriptable

Possible failing scenario (given folder with backtest data: DS-EURUSD-2014 (change the branch to see csv files):

bt_size=$(find "DS-EURUSD-2014" -name '*.csv' -print0 | du -bc --files0-from=- | tail -n1 | cut -f1)
find DS-EURUSD-2014/ -name "*.csv" -print0 | sort -z | xargs -r0 cat | pv -s $bt_size | tee &>/dev/null   \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t M1 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t M5 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t M15 -d output/) \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t M30 -d output/) \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t H1 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t H4 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t D1 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t W1 -d output/)  \
  >(./scripts/convert_csv_to_mt.py -i /dev/stdin -f hst4 -v -S default -s EURUSD -p 10 -t MN -d output/)

or given the following script (get_bt_data.sh, after this one: install_mt4.sh)

./get_bt_data.sh EURUSD 2014 DS

This happens because we're reading data from /dev/stdin and the data is not fully available yet at the time of reading (because of streaming), so the parser can receive half of a line. The Python script should therefore either support stdin streaming properly or wait until a complete line is available.
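One hedged way to make the parser tolerate a partially-written line (the function name, regex, and field layout below are illustrative, not the script's current code — the real CSV columns should be checked against the data files):

```python
import re

# Hypothetical guard: accept only complete tick lines of the form
# "YYYY.MM.DD HH:MM:SS.ffffff,bid,ask,bidvol,askvol"; anything shorter
# (e.g. a half-written line read from a still-filling pipe) is skipped
# instead of raising ValueError from strptime.
TICK_RE = re.compile(
    r'^\d{4}\.\d{2}\.\d{2} \d{2}:\d{2}:\d{2}\.\d{1,6},'
    r'[\d.]+,[\d.]+,[\d.]+,[\d.]+$'
)

def parse_or_skip(line):
    line = line.strip()
    if not TICK_RE.match(line):
        return None  # incomplete/garbled line: ignore rather than crash
    fields = line.split(',')
    return {'timestamp': fields[0],
            'bid': float(fields[1]),
            'ask': float(fields[2])}
```

With this in place a truncated fragment like `'2014.0'` (as in the ValueError below) would be dropped instead of aborting the whole conversion.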

The above syntax can be found in this commit, so to reproduce the problem, clone that repo, reset to that revision (git reset 75191cae6f1f58c656373bf3c845eb66e8af4991) and run get_bt_data.sh EURUSD 2014 DS.

Sometimes it fails on:

Traceback (most recent call last):
  File "/Users/kenorb/.wine/drive_c/Program Files/FXCM MetaTrader 4/history/d
    FXT(CSV(args.inputFile), outputPath, timeframe, server, symbol, spread)
  File "/Users/kenorb/.wine/drive_c/Program Files/FXCM MetaTrader 4/history/d
    for tick in ticks:
  File "/Users/kenorb/.wine/drive_c/Program Files/FXCM MetaTrader 4/history/d
    return self._parseLine(line)
  File "/Users/kenorb/.wine/drive_c/Program Files/FXCM MetaTrader 4/history/d
    'timestamp': time.mktime(datetime.datetime.strptime(tick[0], '%Y.%m.%d %H
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/
    tt, fraction = _strptime(data_string, format)
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/
    (data_string, format))
ValueError: time data '2014.0' does not match format '%Y.%m.%d %H:%M:%S.%f'

See syntax in this commit.

Here is a simple example which works:

cat /etc/hosts | tee >(wc -l) >(wc -l) >(wc -l) >(wc -l)

But it seems the current code of convert_csv_to_mt.py doesn't like it when we're trying to send the same data into multiple processes. See: http://stackoverflow.com/a/60955

--- Did you help close this issue? Go claim the **[$20 bounty](https://www.bountysource.com/issues/29942105-convert_csv_to_mt-py-typeerror-nonetype-object-is-not-subscriptable?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Combine all CSV into one

Currently, the dl_bt_dukascopy.py script downloads bi5 files and converts them into separate CSV files.

It would be great if it could also combine all data into one CSV file after it finishes, when an extra parameter is specified.

Similarly to the go-duka tool, which downloads CSV and combines all CSV files automatically.
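A minimal sketch of what the combining step could look like (the function name is illustrative; it assumes the script's YYYY-MM-DD--HHh file naming, so a plain lexicographic sort yields chronological order):

```python
from pathlib import Path

def combine_csv(src_dir, dest_file):
    """Concatenate all per-hour CSV files under src_dir into one file.

    Relies on the YYYY-MM-DD--HHh_ticks.csv naming scheme, so that
    sorting the paths lexicographically also sorts them by time.
    """
    parts = sorted(Path(src_dir).rglob('*_ticks.csv'))
    with open(dest_file, 'w') as out:
        for part in parts:
            out.write(part.read_text())
    return len(parts)  # number of files merged
```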


Fix Makefile to pass Travis CI [$10 awarded]

File: Makefile

So Makefile must pass, and then Travis CI (see: .travis.yml) should be happy, based on the existing logic in Makefile.

Run make test on local to test it locally.

The Makefile test includes:

  1. Initial test of Python syntax (--help) which already works.
  2. Test download of files, which almost works, but the pattern rule needs to be fixed so it runs when the CSV files haven't been downloaded yet.
  3. Test format conversion into M1/ and M5/ dirs.
  4. Test the dump tool to convert each hst and fxt file back into CSV. #10 needs to be implemented (as part of this ticket) in order to support an extra -o option (in a similar way to the other scripts).

--- The **[$10 bounty](https://www.bountysource.com/issues/26908582-fix-makefile-to-pass-travis-ci?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

dl_bt_dukascopy.py: EOFError: Compressed file ended before the end-of-stream marker was reached

Command:

./dl_bt_dukascopy.py -y 2013 -p EURUSD -c

Traceback:

Converting into CSV (download/EURUSD/2013-01-07--01h_ticks.csv)...
Traceback (most recent call last):
  File "./dl_bt_dukascopy.py", line 159, in <module>
    ds.bt5_to_csv()
  File "./dl_bt_dukascopy.py", line 102, in bt5_to_csv
    data = f.read()
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/lzma.py", line 310, in read
    return self._read_all()
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/lzma.py", line 251, in _read_all
    while self._fill_buffer():
  File "/usr/local/Cellar/python3/3.4.3/Frameworks/Python.framework/Versions/3.4/lib/python3.4/lzma.py", line 225, in _fill_buffer
    raise EOFError("Compressed file ended before the "
EOFError: Compressed file ended before the end-of-stream marker was reached

It happens randomly; in this case it happened specifically for 2013-01-07--01h_ticks.bi5.
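One possible mitigation (a sketch, not the script's current behavior): treat a truncated LZMA stream as a failed download and fetch the file again. The helper takes a fetch callable so the retry logic can be exercised without network access:

```python
import lzma

def decompress_with_retry(fetch, retries=3):
    """Decompress bi5 (LZMA) data, re-fetching when the stream turns out
    to be truncated.

    fetch   -- a zero-argument callable returning the raw bytes
               (e.g. a closure around urllib.request.urlopen(url).read())
    retries -- how many attempts before giving up (illustrative default)
    """
    for attempt in range(retries):
        data = fetch()
        try:
            return lzma.decompress(data)
        except (EOFError, lzma.LZMAError):
            # Truncated/garbled download: try again, re-raise on last try.
            if attempt == retries - 1:
                raise
```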

Mismatch in OHLC data when converting from CSV to HST

  1. Clone this repo.

  2. Download CSV file into the repo folder, e.g.

     wget https://raw.githubusercontent.com/FX31337/FX-BT-Data-EURUSD-DS/EURUSD-2014/EURUSD/2014/01/2014-01-01--23h_ticks.csv
    
  3. Convert CSV file into HST format:

     ./convert_csv_to_mt.py -v -i 2014-01-01--22h_ticks.csv -f hst4 -t M1,M5,M15,M30,H1,H4,D1,W1,MN
    
  4. Verify data by convert_mt_to_csv.py, e.g.:

     $ ./convert_mt_to_csv.py -i EURUSD60.hst -f hst4
     2014.01.01 22:00:00,1.37553,1.37553,1.37552,1.37553,1,0,0
     $ ./convert_mt_to_csv.py -i EURUSD30.hst -f hst4
     2014.01.01 22:00:00,1.37553,1.37553,1.37552,1.37553,1,0,0
     2014.01.01 22:30:00,1.37499,1.37499,1.37499,1.37499,12,0,0
    

The above data is not correct (only the first open price is correct). The high (2nd value) should be the highest tick of the period, the low the lowest tick of the period, and the close the last tick of the particular period. The same applies to all other HST timeframes.
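The expected aggregation can be sketched as follows (a minimal illustration of the rule above, not the script's actual code; ticks are assumed to be (timestamp, bid) pairs already grouped into one bar period):

```python
def ticks_to_bar(ticks):
    """Aggregate the ticks of one bar period into a single OHLC bar:
    open = first bid, high/low = extremes over the period,
    close = last bid, volume = tick count.
    """
    bids = [bid for _, bid in ticks]
    return {
        'open': bids[0],
        'high': max(bids),
        'low': min(bids),
        'close': bids[-1],
        'volume': len(bids),
    }
```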

Write script to read symgroups.raw file. [$30 awarded]

Write a new script in Python 3 which will print the values from the symgroups.raw file to the output (similar to the other read_symbols_* scripts).

It can be in any output format, but the format of each field should match its type (ask/bid should print as a float, time as a time, etc).

Related: #33, #41

Format reference: symgroups.raw.h

Check the sample files symgroups.raw-samples.zip, or generate them by installing the MT4 platform.
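A sketch of the reading loop such a script could use. The field sizes below (16-byte name, 60-byte description, 32-bit color, i.e. 80 bytes per group) are an assumption for illustration; the authoritative layout is in symgroups.raw.h and must be checked there:

```python
from struct import unpack, calcsize

# ASSUMED record layout -- verify against symgroups.raw.h:
# 16-byte group name, 60-byte description, 32-bit background color.
GROUP_FMT = '<16s60sI'
GROUP_SIZE = calcsize(GROUP_FMT)  # 80 bytes under this assumption

def read_symgroups(path):
    groups = []
    with open(path, 'rb') as f:
        while True:
            record = f.read(GROUP_SIZE)
            if len(record) < GROUP_SIZE:
                break  # end of file (or trailing partial record)
            name, desc, color = unpack(GROUP_FMT, record)
            groups.append({
                'name': name.rstrip(b'\x00').decode('latin-1'),
                'description': desc.rstrip(b'\x00').decode('latin-1'),
                'color': color,
            })
    return groups
```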

--- The **[$30 bounty](https://www.bountysource.com/issues/32284438-write-script-to-read-symgroups-raw-file?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

convert_csv_to_mt.py takes ages to convert the data [$60 awarded]

Currently conversion of CSV data from only one year (800M) into tick data format takes around 1 hour to complete (for all timeframes).

This makes CI builds fail, because the 30-minute timeout is reached while converting the data from here. This one failed after the maximum time of 50 minutes.

Testing:

git clone --single-branch --branch EURUSD-2014 https://github.com/FX31337/FX-BT-Data-EURUSD-DS.git
cd FX*
curl -o Makefile https://raw.githubusercontent.com/FX31337/FX-BT-Data-Test/master/Makefile
time make

I didn't count, but it's ~3h on MBP (OS X).

The goal is to reduce the conversion time of the data mentioned above so that it completes in a reasonable time (e.g. under half an hour if achievable, otherwise max. 50 min), so CI won't fail (the maximum build time is 50 minutes).

File: convert_csv_to_mt.py

Test:

$ time make
...
Done.

real    187m37.007s
user    185m2.853s
sys 1m7.004s

which makes ~3.11h

Result files:

-rw-r--r--  1 kenorb 4.7M Mar 29 11:35 EURUSD1.hst.gz
-rw-r--r--  1 kenorb 2.0K Mar 29 12:43 EURUSD10080.hst.gz
-rw-r--r--  1 kenorb  80M Mar 29 14:19 EURUSD10080_0.fxt.gz
-rw-r--r--  1 kenorb 9.8K Mar 29 12:34 EURUSD1440.hst.gz
-rw-r--r--  1 kenorb  80M Mar 29 14:09 EURUSD1440_0.fxt.gz
-rw-r--r--  1 kenorb 477K Mar 29 11:55 EURUSD15.hst.gz
-rw-r--r--  1 kenorb  83M Mar 29 13:25 EURUSD15_0.fxt.gz
-rw-r--r--  1 kenorb  90M Mar 29 13:04 EURUSD1_0.fxt.gz
-rw-r--r--  1 kenorb  40K Mar 29 12:24 EURUSD240.hst.gz
-rw-r--r--  1 kenorb  80M Mar 29 13:58 EURUSD240_0.fxt.gz
-rw-r--r--  1 kenorb 257K Mar 29 12:04 EURUSD30.hst.gz
-rw-r--r--  1 kenorb  82M Mar 29 13:36 EURUSD30_0.fxt.gz
-rw-r--r--  1 kenorb  597 Mar 29 12:53 EURUSD43200.hst.gz
-rw-r--r--  1 kenorb  80M Mar 29 14:30 EURUSD43200_0.fxt.gz
-rw-r--r--  1 kenorb 1.3M Mar 29 11:45 EURUSD5.hst.gz
-rw-r--r--  1 kenorb  86M Mar 29 13:15 EURUSD5_0.fxt.gz
-rw-r--r--  1 kenorb 139K Mar 29 12:14 EURUSD60.hst.gz
-rw-r--r--  1 kenorb  81M Mar 29 13:47 EURUSD60_0.fxt.gz

--- The **[$60 bounty](https://www.bountysource.com/issues/32272156-convert_csv_to_mt-py-takes-ages-to-convert-the-data?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

convert_csv_to_mt.py: Further performance improvements for conversion. [$40 awarded]

Follow-up to #37

Currently the script uses mmap to read the file into memory, leaving part of the work to the OS, but that's just a marginal speedup; the real gains came from the optimized timestamp parser and the more compact pack.
It might be possible to shave off some more time by shuffling some code around, though.
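For illustration, the kind of timestamp fast path such an optimization relies on: slicing fixed-width fields instead of calling strptime. This is a sketch, not the script's code; it assumes the fixed 'YYYY.MM.DD HH:MM:SS[.fff]' format and uses calendar.timegm (UTC) for determinism, whereas the original code used local-time time.mktime:

```python
import calendar

def fast_timestamp(s):
    """Parse 'YYYY.MM.DD HH:MM:SS[.fff]' into an epoch value by slicing
    fixed-width fields -- much cheaper than strptime for a fixed format.
    """
    tt = (int(s[0:4]), int(s[5:7]), int(s[8:10]),      # year, month, day
          int(s[11:13]), int(s[14:16]), int(s[17:19]))  # hour, min, sec
    ts = calendar.timegm(tt + (0, 0, 0))
    if len(s) > 20:              # optional fractional seconds, e.g. '.123'
        ts += float(s[19:])
    return ts
```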

--- The **[$40 bounty](https://www.bountysource.com/issues/32438471-convert_csv_to_mt-py-further-performance-improvements-for-conversion?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Allow to specify the months. [$10 awarded]

File: dl_bt_dukascopy.py

New parameter: -m. If not specified, all months are downloaded by default.
Example syntax: -m 1, -m 02, -m 1,2,5,12

Example syntax:

python3 dl_bt_dukascopy.py -v -p EURUSD -y 2014 -m 1,2,3,5
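The option handling can be sketched as below (the function name and error handling are illustrative, not the script's actual implementation):

```python
def parse_months(spec):
    """Turn -m values like '1', '02' or '1,2,5,12' into a sorted list of
    month numbers; with no spec given, all twelve months are selected.
    """
    if not spec:
        return list(range(1, 13))
    months = sorted({int(part) for part in spec.split(',')})
    if not all(1 <= m <= 12 for m in months):
        raise ValueError('month out of range: %s' % spec)
    return months
```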

Bounty: https://www.bountysource.com/issues/26905050-allow-to-specify-the-months

--- The **[$10 bounty](https://www.bountysource.com/issues/26905050-allow-to-specify-the-months?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

dl_bt_dukascopy.py: CSV file is not always generated. [$10 awarded]

Testing command:

./dl_bt_dukascopy.py -y 2014 -p EURUSD -c

on the existing folder with already existing files.

Problem:

  • the CSV file is not generated for every file.

Log:

Converting into CSV (download/EURUSD/2014-02-14--19h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/14/20h_ticks.bi5 into: download/EURUSD/2014-02-14--20h_ticks.bi5...
Converting into CSV (download/EURUSD/2014-02-14--20h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/14/21h_ticks.bi5 into: download/EURUSD/2014-02-14--21h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/14/22h_ticks.bi5 into: download/EURUSD/2014-02-14--22h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/14/23h_ticks.bi5 into: download/EURUSD/2014-02-14--23h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/15/00h_ticks.bi5 into: download/EURUSD/2014-02-15--00h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/02/15/01h_ticks.bi5 into: download/EURUSD/2014-02-15--01h_ticks.bi5...

Converting into CSV (download/EURUSD/2014-04-09--19h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/09/20h_ticks.bi5 into: download/EURUSD/2014-04-09--20h_ticks.bi5...
Converting into CSV (download/EURUSD/2014-04-09--20h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/09/21h_ticks.bi5 into: download/EURUSD/2014-04-09--21h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/09/22h_ticks.bi5 into: download/EURUSD/2014-04-09--22h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/09/23h_ticks.bi5 into: download/EURUSD/2014-04-09--23h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/10/00h_ticks.bi5 into: download/EURUSD/2014-04-10--00h_ticks.bi5...
...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/11/18h_ticks.bi5 into: download/EURUSD/2014-04-11--18h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/11/19h_ticks.bi5 into: download/EURUSD/2014-04-11--19h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/11/20h_ticks.bi5 into: download/EURUSD/2014-04-11--20h_ticks.bi5...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/11/21h_ticks.bi5 into: download/EURUSD/2014-04-11--21h_ticks.bi5...
Converting into CSV (download/EURUSD/2014-04-11--21h_ticks.csv)...
Downloading http://www.dukascopy.com/datafeed/EURUSD/2014/04/11/22h_ticks.bi5 into: download/EURUSD/2014-04-11--22h_ticks.bi5...
Converting into CSV (download/EURUSD/2014-04-11--22h_ticks.csv)...

When the file is not generated, you can see Downloading a few times in a row (without a Converting line in between).

$ find . -name 2014-04-25\*
./download/EURUSD/2014-04-25--00h_ticks.bi5
./download/EURUSD/2014-04-25--01h_ticks.bi5
./download/EURUSD/2014-04-25--02h_ticks.bi5
./download/EURUSD/2014-04-25--03h_ticks.bi5
./download/EURUSD/2014-04-25--04h_ticks.bi5
./download/EURUSD/2014-04-25--05h_ticks.bi5
./download/EURUSD/2014-04-25--06h_ticks.bi5
./download/EURUSD/2014-04-25--07h_ticks.bi5
./download/EURUSD/2014-04-25--08h_ticks.bi5
./download/EURUSD/2014-04-25--09h_ticks.bi5
./download/EURUSD/2014-04-25--10h_ticks.bi5
./download/EURUSD/2014-04-25--11h_ticks.bi5
./download/EURUSD/2014-04-25--12h_ticks.bi5
./download/EURUSD/2014-04-25--13h_ticks.bi5
./download/EURUSD/2014-04-25--14h_ticks.bi5
./download/EURUSD/2014-04-25--15h_ticks.bi5
./download/EURUSD/2014-04-25--16h_ticks.bi5
./download/EURUSD/2014-04-25--17h_ticks.bi5
./download/EURUSD/2014-04-25--18h_ticks.bi5
./download/EURUSD/2014-04-25--19h_ticks.bi5
./download/EURUSD/2014-04-25--20h_ticks.bi5
./download/EURUSD/2014-04-25--21h_ticks.bi5
./download/EURUSD/2014-04-25--21h_ticks.csv
./download/EURUSD/2014-04-25--22h_ticks.bi5
./download/EURUSD/2014-04-25--22h_ticks.csv
./download/EURUSD/2014-04-25--23h_ticks.bi5
./download/EURUSD/2014-04-25--23h_ticks.csv


--- The **[$10 bounty](https://www.bountysource.com/issues/26472268-dl_bt_dukascopy-py-csv-file-is-not-always-generated?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

convert_csv_to_mt: unmatched data error (volume limit exceeded)

When backtesting using higher timeframes, the following errors appear in the terminal log files:

TestGenerator: unmatched data error (volume limit 584 at 2014.01.02 23:00 exceeded)
TestGenerator: unmatched data error (volume limit 852 at 2014.01.02 23:30 exceeded)
TestGenerator: unmatched data error (volume limit 426 at 2014.01.02 23:45 exceeded)
TestGenerator: unmatched data error (volume limit 822 at 2014.01.03 00:00 exceeded)
TestGenerator: unmatched data error (volume limit 933 at 2014.01.03 00:15 exceeded)
TestGenerator: unmatched data error (volume limit 884 at 2014.01.03 00:30 exceeded)

CI fails: #142088830, #142088831, #142088832

It's the same as all 'unmatched data errors', only in this case the volume is unmatched (usually O/C/H/L are unmatched). It means the tester is using data from a lower timeframe to test the EA on a larger timeframe, but the volume in the larger timeframe is not the sum of volumes of the lower timeframes... Hence - unmatched data error - this is literally the problem.

To solve this problem, build all larger timeframes from M1 data using the standard period_converter script. Please search for more info (this error and its solution have been discussed many times).

Source: forum.mql4.com/32463
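The aggregation the period_converter approach implies can be sketched as follows (names are illustrative): the higher-timeframe volume must be the sum of the constituent M1 volumes, which is exactly the invariant the "volume limit exceeded" check verifies.

```python
def merge_m1_bars(m1_bars):
    """Build one higher-timeframe bar from consecutive M1 bars:
    open from the first bar, close from the last, high/low from the
    extremes, and volume as the SUM of the M1 volumes.

    Each input bar is a dict with open/high/low/close/volume keys.
    """
    return {
        'open': m1_bars[0]['open'],
        'high': max(b['high'] for b in m1_bars),
        'low': min(b['low'] for b in m1_bars),
        'close': m1_bars[-1]['close'],
        'volume': sum(b['volume'] for b in m1_bars),
    }
```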

Testing:

  1. Clone repo.

  2. Clone/download some CSV files.

  3. Run: ./convert_csv_to_mt.py -v -i all.csv -s EURUSD -p 20 -S default -t M1,M5,M15,M30,H1,H4,D1,W1,MN -f fxt4

  4. Copy HST files into platform dir at: history/default/

  5. Copy FXT files into platform dir at: tester/history/

  6. Run the test using M15 or M30 period, e.g.

    ./_VM/scripts/run_backtest.sh -v -t -e TestSpread -D5 -m1 -s0 -P M15
    

To read FXT format, use: convert_mt_to_csv.py instead.

Est. 4-8h

Python: TestGenerator: unmatched data error (low value X at Y is not reached from the least timeframe, low price Z mismatches)

After fixing #85, this error started happening at the M30 timeframe when backtesting:

TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08391 mismatches)

Similar error at TestPeriod M5 (plus volume error, similar as per: #70):

TestGenerator: unmatched data error (low value 1.08391 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08474 mismatches)
TestGenerator: unmatched data error (volume limit 590 at 2015.07.20 09:30 exceeded)

And M15:

TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08391 mismatches)

This can be tested using Docker container, e.g.

docker run ea31337/ea-tester run_backtest -v -t -T M30 -e MACD -y 2015 -m 7 -D5 -b DS -M 4.0.0.1010
  • Try testing also with M5 and M15 (-T). And all months: -m 1-12. Remove -b and -M params to avoid re-downloading the data when running it again.
  • You can login to Docker container by: docker run -it ea31337/ea-tester bash.
  • Add -x for debug.

After fixing convert_csv_to_mt.py, clone and convert the CSV data, e.g. via (see the Makefile as an example):

find . -name '*.csv' -print0 | sort -z | xargs -r0 cat | tee all.csv > /dev/null
./convert_csv_to_mt.py -v -i all.csv -f hst4 -t M1,M5,M15,M30,H1,H4,D1,W1,MN

then move the generated files to replace the existing HST files in ~/.wine/drive_c/Program Files/MetaTrader 4/history/default and re-run the test.

To read the converted HST files, you can use convert_mt_to_csv.py tool, e.g.

./convert_mt_to_csv.py -i EURUSD15.hst -f hst4

The problem is the low-value mismatch, as per the error. The fix shouldn't break the logic fixed in dbb096e, which wasn't working before as per #85. The solution should include the proper calculation of OHLCV values (open/high/low/close/volume) across the different timeframes to avoid any unmatched data errors.

See also:

Requirements

Basically, these commands need to work with no data errors after complete testing:

  • docker run ea31337/ea-tester run_backtest -v -t -M4.0.0.1010 -d 1000 -p EURUSD -m 1-12 -y 2015 -s 10 -b DS -D5 -e TestFXTHeader (see: Build #1087).
  • docker run ea31337/ea-tester run_backtest -T M1 -v -t -e MA -y 2015 -m 7 (this works)
  • docker run ea31337/ea-tester run_backtest -T M5 -v -t -e MA -y 2015 -m 7
  • docker run ea31337/ea-tester run_backtest -T M15 -v -t -e MA -y 2015 -m 7
  • docker run ea31337/ea-tester run_backtest -T M30 -v -t -e MA -y 2015 -m 7
  • docker run ea31337/ea-tester run_backtest -T M30 -v -t -e MA -y 2015 -m 1-12 (test all months just in case)

Resources

--

Est. 10h

Write Python script to read symbols.raw file. [$40 awarded]

Write a new script in Python 3 which will print the values from symbols.raw to the output (similar to read_symbols_sel.py).

It can be in any output format, but the format of each field should match its type (ask/bid should print as a float, time as a time, etc).

Related: #33

Format: symbols.raw.h.

Sample files: symbols.raw-samples.zip
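A reading-loop sketch for such a script. Only the leading symbol-name field is decoded here; the record size of 1936 bytes and the 12-byte name field are assumptions taken for illustration and must be verified against symbols.raw.h:

```python
from struct import unpack

# ASSUMED -- verify against symbols.raw.h before relying on this.
RECORD_SIZE = 1936   # assumed size of one symbol record
NAME_LEN = 12        # assumed length of the leading symbol-name field

def read_symbol_names(path):
    """Yield just the symbol names from a symbols.raw-style file,
    skipping the rest of each fixed-size record."""
    names = []
    with open(path, 'rb') as f:
        while True:
            rec = f.read(RECORD_SIZE)
            if len(rec) < RECORD_SIZE:
                break
            (raw_name,) = unpack('<%ds' % NAME_LEN, rec[:NAME_LEN])
            names.append(raw_name.rstrip(b'\x00').decode('latin-1'))
    return names
```

A full reader would decode the remaining fields the same way, with one struct format string covering the whole record.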

--- The **[$40 bounty](https://www.bountysource.com/issues/32284324-write-python-script-to-read-symbols-raw-file?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Add model generation for Control points and Open prices

Extend convert_csv_to_mt.py script to have 3 modes of FXT conversion:

  • -m 0: Every tick (the most precise method) - this is how it works by default now,

  • -m 1: Control points (the nearest less timeframe, 1 minute OHLC),

    Historical data of the nearest lower timeframe must be available. As soon as historical data of the lower timeframe appear, it is these data that are interpolated. However, the really existing OHLC prices of the lower timeframe appear as control points.

  • -m 2: Open prices only

    In this mode, the bar is opened first (Open = High = Low = Close, Volume = 1), which allows the Expert Advisor to identify the completion of the preceding price bar.

Specifying multiple modes via -m 0,1,2 should be supported as well, so that 3 files are generated in one run. This -m param applies only to the FXT format; for HST it should be ignored.

Filename

The generated FXT filename must follow the naming scheme SSSSSSPP_M.fxt, where:

  • SSSSSS - is the symbol under test;
  • PP - the value of the tested symbol period in minutes;
  • M - testing model (0 - "Every tick", 1 - "Control points", 2 - "Open prices only").

Currently it's generated correctly for the Every tick model (with 0); we need to add support for 1 and 2. If the model is not specified, generate the Every tick model (0) by default.
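The naming scheme above can be sketched as a small helper (the function name is illustrative):

```python
def fxt_filename(symbol, timeframe_minutes, model):
    """Compose the FXT name from the SSSSSSPP_M.fxt convention:
    symbol under test, period in minutes, and testing model
    (0 - Every tick, 1 - Control points, 2 - Open prices only).
    """
    if model not in (0, 1, 2):
        raise ValueError('model must be 0, 1 or 2')
    return '%s%d_%d.fxt' % (symbol, timeframe_minutes, model)
```

For example, EURUSD at M30 with the Control points model yields EURUSD30_1.fxt, matching the sample files below.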

Resources

For more explanation about the modelling modes, see:

Steps

  1. In some folder, clone repo with CSV files:

     git clone --branch EURUSD-2014 --single-branch https://github.com/FX31337/FX-BT-Data-EURUSD-DS
    
  2. Combine CSV data from 2 days into single file:

     find FX*  \( -name "2014-02-02*" -o -name "2014-02-03*" \) -exec cat {} ';' | sort > ticks.csv
    

    or 4 days:

     find FX* \( -name "2014-02-02*" -o -name "2014-02-03*" -o -name "2014-02-04*" -o -name "2014-02-05*" \) -exec cat {} ';' | sort > ticks.csv
    
  3. Clone this repo:

     git clone https://github.com/FX-Data/FX-Data-EURUSD-DS
    
  4. Convert the CSV file into FXT (for the M1, M5 and M30 timeframes), or use Docker:

     ./convert_csv_to_mt.py -v -i ticks.csv -f fxt4 -t M1,M5,M30
    

    The same applies to the hst4 format if you need it for testing.

  5. Read the generated files by:

     ./convert_mt_to_csv.py -i EURUSD1_0.fxt -f fxt4 | less
    
  6. Now generate file with control points (-m 1 to be implemented):

     ./convert_csv_to_mt.py -v -i ticks.csv -f fxt4 -t M1,M5,M30 -m 1
    

This should generate 3 files, EURUSD1_1.fxt, EURUSD5_1.fxt and EURUSD30_1.fxt, containing only control-point prices as described above.

Sample 1

M30

$ ./FX-BT-Scripts/convert_mt_to_csv.py -i EURUSD30_0.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34842,1.34842,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34827,1.34827,1.34827,1.34827,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34829,1.34829,1.34829,1.34829,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34835,1.34835,1.34835,1.34835,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:00,4
$ ./FX-BT-Scripts/convert_mt_to_csv.py -i EURUSD30_1.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34871,1.34820,1.34832,717,2014.02.02 22:29:59,0
2014.02.02 22:30:00,1.34843,1.34905,1.34818,1.34863,2299,2014.02.02 22:59:59,0
2014.02.02 23:00:00,1.34865,1.34912,1.34851,1.34871,2650,2014.02.02 23:29:59,0
2014.02.02 23:30:00,1.34867,1.34892,1.34843,1.34873,2124,2014.02.02 23:59:59,0
2014.02.03 00:00:00,1.34873,1.34878,1.34800,1.34865,3445,2014.02.03 00:29:59,0
2014.02.03 00:30:00,1.34866,1.34916,1.34836,1.34865,4396,2014.02.03 00:59:59,0
2014.02.03 01:00:00,1.34864,1.34874,1.34801,1.34855,4995,2014.02.03 01:29:59,0
2014.02.03 01:30:00,1.34855,1.34865,1.34820,1.34850,3583,2014.02.03 01:59:59,0
2014.02.03 02:00:00,1.34852,1.34853,1.34798,1.34830,3988,2014.02.03 02:29:59,0
2014.02.03 02:30:00,1.34830,1.34853,1.34809,1.34849,3820,2014.02.03 02:59:59,0
$ ./FX-BT-Scripts/convert_mt_to_csv.py -i EURUSD30_2.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34871,1.34820,1.34832,717,2014.02.02 22:29:59,0
2014.02.02 22:30:00,1.34843,1.34905,1.34818,1.34863,2299,2014.02.02 22:59:59,0
2014.02.02 23:00:00,1.34865,1.34912,1.34851,1.34871,2650,2014.02.02 23:29:59,0
2014.02.02 23:30:00,1.34867,1.34892,1.34843,1.34873,2124,2014.02.02 23:59:59,0
2014.02.03 00:00:00,1.34873,1.34878,1.34800,1.34865,3445,2014.02.03 00:29:59,0
2014.02.03 00:30:00,1.34866,1.34916,1.34836,1.34865,4396,2014.02.03 00:59:59,0
2014.02.03 01:00:00,1.34864,1.34874,1.34801,1.34855,4995,2014.02.03 01:29:59,0
2014.02.03 01:30:00,1.34855,1.34865,1.34820,1.34850,3583,2014.02.03 01:59:59,0
2014.02.03 02:00:00,1.34852,1.34853,1.34798,1.34830,3988,2014.02.03 02:29:59,0
2014.02.03 02:30:00,1.34830,1.34853,1.34809,1.34849,3820,2014.02.03 02:59:59,0

M5

$ ./convert_mt_to_csv.py -i EURUSD5_0.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34842,1.34842,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34827,1.34827,1.34827,1.34827,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34829,1.34829,1.34829,1.34829,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34835,1.34835,1.34835,1.34835,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:00,4
$ ./convert_mt_to_csv.py -i EURUSD5_1.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34821,1.34821,60,2014.02.02 22:04:59,0
2014.02.02 22:05:00,1.34823,1.34845,1.34823,1.34833,65,2014.02.02 22:09:59,0
2014.02.02 22:10:00,1.34833,1.34855,1.34831,1.34850,142,2014.02.02 22:14:59,0
2014.02.02 22:15:00,1.34850,1.34865,1.34850,1.34865,90,2014.02.02 22:19:59,0
2014.02.02 22:20:00,1.34864,1.34866,1.34857,1.34864,67,2014.02.02 22:24:59,0
2014.02.02 22:25:00,1.34862,1.34871,1.34820,1.34832,291,2014.02.02 22:29:59,0
2014.02.02 22:30:00,1.34843,1.34863,1.34821,1.34860,595,2014.02.02 22:34:59,0
2014.02.02 22:35:00,1.34860,1.34867,1.34857,1.34862,276,2014.02.02 22:39:59,0
2014.02.02 22:40:00,1.34861,1.34867,1.34859,1.34862,270,2014.02.02 22:44:59,0
2014.02.02 22:45:00,1.34865,1.34895,1.34865,1.34889,234,2014.02.02 22:49:59,0
$ ./convert_mt_to_csv.py -i EURUSD5_2.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34821,1.34821,60,2014.02.02 22:04:59,0
2014.02.02 22:05:00,1.34823,1.34845,1.34823,1.34833,65,2014.02.02 22:09:59,0
2014.02.02 22:10:00,1.34833,1.34855,1.34831,1.34850,142,2014.02.02 22:14:59,0
2014.02.02 22:15:00,1.34850,1.34865,1.34850,1.34865,90,2014.02.02 22:19:59,0
2014.02.02 22:20:00,1.34864,1.34866,1.34857,1.34864,67,2014.02.02 22:24:59,0
2014.02.02 22:25:00,1.34862,1.34871,1.34820,1.34832,291,2014.02.02 22:29:59,0
2014.02.02 22:30:00,1.34843,1.34863,1.34821,1.34860,595,2014.02.02 22:34:59,0
2014.02.02 22:35:00,1.34860,1.34867,1.34857,1.34862,276,2014.02.02 22:39:59,0
2014.02.02 22:40:00,1.34861,1.34867,1.34859,1.34862,270,2014.02.02 22:44:59,0
2014.02.02 22:45:00,1.34865,1.34895,1.34865,1.34889,234,2014.02.02 22:49:59,0

M1

$ ./convert_mt_to_csv.py -i EURUSD1_0.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34842,1.34842,1,2014.02.02 22:00:00,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:03,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:04,4
2014.02.02 22:00:00,1.34828,1.34828,1.34828,1.34828,1,2014.02.02 22:00:09,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:09,4
2014.02.02 22:00:00,1.34832,1.34832,1.34832,1.34832,1,2014.02.02 22:00:10,4
2014.02.02 22:00:00,1.34827,1.34827,1.34827,1.34827,1,2014.02.02 22:00:10,4
2014.02.02 22:00:00,1.34829,1.34829,1.34829,1.34829,1,2014.02.02 22:00:12,4
2014.02.02 22:00:00,1.34835,1.34835,1.34835,1.34835,1,2014.02.02 22:00:14,4
2014.02.02 22:00:00,1.34837,1.34837,1.34837,1.34837,1,2014.02.02 22:00:15,4
 $ ./convert_mt_to_csv.py -i EURUSD1_1.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34827,1.34833,19,2014.02.02 22:00:59,0
2014.02.02 22:01:11,1.34837,1.34842,1.34834,1.34839,23,2014.02.02 22:02:04,0
2014.02.02 22:02:05,1.34840,1.34841,1.34825,1.34827,11,2014.02.02 22:03:04,0
2014.02.02 22:03:11,1.34827,1.34832,1.34827,1.34827,2,2014.02.02 22:04:04,0
2014.02.02 22:04:05,1.34821,1.34821,1.34821,1.34821,3,2014.02.02 22:04:59,0
2014.02.02 22:05:00,1.34823,1.34845,1.34823,1.34845,16,2014.02.02 22:05:59,0
2014.02.02 22:06:26,1.34839,1.34840,1.34839,1.34840,6,2014.02.02 22:07:25,0
2014.02.02 22:07:27,1.34837,1.34841,1.34837,1.34841,8,2014.02.02 22:08:05,0
2014.02.02 22:08:06,1.34835,1.34835,1.34827,1.34827,26,2014.02.02 22:09:01,0
2014.02.02 22:09:02,1.34830,1.34833,1.34830,1.34833,8,2014.02.02 22:10:01,0
$ ./convert_mt_to_csv.py -i EURUSD1_2.fxt -f fxt4 | head
2014.02.02 22:00:00,1.34842,1.34842,1.34827,1.34833,19,2014.02.02 22:00:59,0
2014.02.02 22:01:11,1.34837,1.34842,1.34834,1.34839,23,2014.02.02 22:02:10,0
2014.02.02 22:02:05,1.34840,1.34841,1.34825,1.34827,11,2014.02.02 22:03:04,0
2014.02.02 22:03:11,1.34827,1.34832,1.34827,1.34827,2,2014.02.02 22:04:10,0
2014.02.02 22:04:05,1.34821,1.34821,1.34821,1.34821,3,2014.02.02 22:05:04,0
2014.02.02 22:05:00,1.34823,1.34845,1.34823,1.34845,16,2014.02.02 22:05:59,0
2014.02.02 22:06:26,1.34839,1.34840,1.34839,1.34840,6,2014.02.02 22:07:25,0
2014.02.02 22:07:27,1.34837,1.34841,1.34837,1.34841,8,2014.02.02 22:08:26,0
2014.02.02 22:08:06,1.34835,1.34835,1.34827,1.34827,26,2014.02.02 22:09:05,0
2014.02.02 22:09:02,1.34830,1.34833,1.34830,1.34833,8,2014.02.02 22:10:01,0

Check files from EURUSD-2014/EURUSD/2014/02 for more accurate CSV data to compare with.

The above FXT files have been uploaded below.

The above rows have the following fields: bar timestamp, open, high, low, close, volume, tick timestamp, and flag.

Sample files

  1. EURUSD_FXT_samples.zip
  2. EURUSD.ecn30_fxt_files.zip

These files can be read by the convert_mt_to_csv.py script as shown above. Alternatively, you can inspect them visually in the MT4 platform via File, Open Offline (the files need to be placed in the tester/history folder of the platform dir).


See also related issue: #86


Est. 16h

TestGenerator: unmatched data error, low value is not reached from the least timeframe [$80 awarded]

Error happening during backtesting:

M5:

TestGenerator: unmatched data error (low value 1.08391 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08474 mismatches)
TestGenerator: unmatched data error (volume limit 590 at 2015.07.20 09:30 exceeded)

M15:

TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08474 mismatches)
TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:15 is not reached from the least timeframe, low price 1.08474 mismatches)

M30

TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:00 is not reached from the least timeframe, low price 1.08391 mismatches)
TestGenerator: unmatched data error (low value 1.08294 at 2015.07.20 09:00 is not reached from the least timeframe, low price 1.08391 mismatches)

Tested data from 2015.

Testing (using VM):

  1. Clone the VM repo.

  2. Optionally provision VM by: vagrant up && vagrant ssh (or use local Linux/OSX shell)

  3. Run the test using M30 period, e.g. from the command line:

    ./scripts/run_backtest.sh -t -v -x -I TestPeriod=M30,TestModel=0 -e MACD -p EURUSD -d 2000 -s 10 -y 2015 -m 7 -D 5 -b DS -M 4.0.0.971
    
  4. Clone scripts repo.

  5. Clone/download some CSV files.

  6. Run: ./convert_csv_to_mt.py -v -i all.csv -s EURUSD -p 20 -S default -t M1 -f fxt4.

  7. Copy FXT files to platform dir at: tester/history/.

  8. To read FXT format, use: ./convert_mt_to_csv.py -i EURUSD1_0.fxt -f fxt4 | more instead.

Fix should be in convert_csv_to_mt.py file.

Similar issue: #70

The solution is to fix the converter script so that the above errors no longer appear for any timeframe and the backtest can run successfully.
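The root cause is that bars written for a higher timeframe must stay consistent with the M1 data they aggregate: an M30 bar's low must actually be reached by some M1 bar inside it, and its volume must equal the sum of the M1 volumes. A minimal sketch of such a consistent aggregation (illustrative only, not the converter's actual code), assuming bars as (timestamp, open, high, low, close, volume) tuples with Unix timestamps:

```python
def aggregate_bars(m1_bars, period_sec):
    """Aggregate M1 OHLCV bars into a higher timeframe so that each output
    bar's high/low/volume match the extremes and sum of its input bars."""
    out = {}
    for ts, o, h, l, c, v in m1_bars:
        key = ts - ts % period_sec   # start of the higher-timeframe bar
        if key not in out:
            out[key] = [key, o, h, l, c, v]
        else:
            bar = out[key]
            bar[2] = max(bar[2], h)  # high is the max of highs
            bar[3] = min(bar[3], l)  # low is the min of lows
            bar[4] = c               # close follows the last M1 bar
            bar[5] += v              # volumes sum up
    return [tuple(b) for b in sorted(out.values())]
```

Deriving every timeframe from the same M1 source this way guarantees the low/high and volume invariants that TestGenerator checks.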

Est. 6-8h

--- The **[$80 bounty](https://www.bountysource.com/issues/36644203-testgenerator-unmatched-data-error-low-value-is-not-reached-from-the-least-timeframe?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

socket.error: [Errno 104] Connection reset by peer [$15 awarded]

The following exception happened while downloading:

Traceback (most recent call last):
  File "/usr/lib/python3.2/urllib/request.py", line 1542, in open
    return getattr(self, name)(url)
  File "/usr/lib/python3.2/urllib/request.py", line 1720, in open_http
    return self._open_generic_http(http.client.HTTPConnection, url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1703, in _open_generic_http
    response = http_conn.getresponse()
  File "/usr/lib/python3.2/http/client.py", line 1052, in getresponse
    response.begin()
  File "/usr/lib/python3.2/http/client.py", line 346, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.2/http/client.py", line 308, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.2/socket.py", line 287, in readinto
    return self._sock.recv_into(b)
socket.error: [Errno 104] Connection reset by peer

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./dl_bt_dukascopy.py", line 170, in <module>
    ds.download()
  File "./dl_bt_dukascopy.py", line 82, in download
    urllib.request.urlretrieve(self.url, filename=self.path)
  File "/usr/lib/python3.2/urllib/request.py", line 151, in urlretrieve
    return _urlopener.retrieve(url, filename, reporthook, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1574, in retrieve
    fp = self.open(url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1546, in open
    raise IOError('socket error', msg).with_traceback(sys.exc_info()[2])
  File "/usr/lib/python3.2/urllib/request.py", line 1542, in open
    return getattr(self, name)(url)
  File "/usr/lib/python3.2/urllib/request.py", line 1720, in open_http
    return self._open_generic_http(http.client.HTTPConnection, url, data)
  File "/usr/lib/python3.2/urllib/request.py", line 1703, in _open_generic_http
    response = http_conn.getresponse()
  File "/usr/lib/python3.2/http/client.py", line 1052, in getresponse
    response.begin()
  File "/usr/lib/python3.2/http/client.py", line 346, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.2/http/client.py", line 308, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.2/socket.py", line 287, in readinto
    return self._sock.recv_into(b)

This exception needs to be handled so the script can retry the retrieval.
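A minimal sketch of a retry helper that could wrap the failing urlretrieve call; the names here are illustrative, not the script's actual API:

```python
import time

def retry(fn, attempts=5, delay=1, exceptions=(IOError, OSError)):
    """Call fn(); on a transient error, retry with exponential backoff.
    Re-raises the last error once all attempts are exhausted."""
    for i in range(attempts):
        try:
            return fn()
        except exceptions:
            if i == attempts - 1:
                raise
            time.sleep(delay)
            delay *= 2

# Hypothetical usage inside the download() method:
#   retry(lambda: urllib.request.urlretrieve(self.url, filename=self.path))
```

In Python 3, `socket.error` is an alias of `OSError`, so catching `(IOError, OSError)` covers the "Connection reset by peer" case above.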

--- The **[$15 bounty](https://www.bountysource.com/issues/30005790-socket-error-errno-104-connection-reset-by-peer?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Prices of XAUUSD/XAGUSD are not correct

Reproduction steps:

  1. Run: ./dl_bt_dukascopy.py -p XAUUSD -y 2013 -m 3 -d 1 -c
  2. Run: head download/dukascopy/XAUUSD/2013/03/2013-03-01--01h_ticks.csv.

Result:

2013.03.01 01:00:00.060,15.82063,15.82364,0.00,0.00
2013.03.01 01:00:00.403,15.82033,15.82333,0.00,0.00
2013.03.01 01:00:00.464,15.81909,15.82272,0.00,0.00
2013.03.01 01:00:00.733,15.81963,15.82263,0.00,0.00
2013.03.01 01:00:00.793,15.81852,15.82192,0.00,0.00
2013.03.01 01:00:00.863,15.81803,15.82166,0.00,0.00
2013.03.01 01:00:00.933,15.81873,15.82174,0.00,0.00
2013.03.01 01:00:01.120,15.8168,15.82043,0.00,0.00
2013.03.01 01:00:01.194,15.81733,15.82043,0.00,0.00
2013.03.01 01:00:01.313,15.81652,15.81988,0.00,0.00

Correct prices should be like:

Local time,Ask,Bid,AskVolume,BidVolume
01.03.2013 00:00:00.605 GMT-0000,1581.15,1580.7870000000003,400,700
01.03.2013 00:00:00.665 GMT-0000,1581.171,1580.808,400,700
01.03.2013 00:00:00.786 GMT-0000,1581.253,1580.953,300,200
01.03.2013 00:00:01.407 GMT-0000,1581.1329999999998,1580.77,400,200
01.03.2013 00:00:01.476 GMT-0000,1581.213,1580.913,300,200


Please check the XAUUSD CSV file at https://www.dukascopy.com/plugins/fxMarketWatch/?historical_data such as: XAUUSD_Ticks_01.03.2013-01.03.2013.csv.gz

Please check the similar go-duka tool for an example; I think it converts the data correctly.

The issue should be solved for XAUUSD, XAGUSD and USDRUB.
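The samples suggest the decoded prices are off by a constant factor (e.g. 15.8115 vs. 1581.15, i.e. 100x): the raw integer prices appear to be divided by a fixed 10^5, which is only correct for 5-digit symbols. A sketch of a per-symbol divisor; the digit counts below are assumptions to verify against Dukascopy's instrument metadata, not confirmed values:

```python
# Assumed price digits per symbol (verify against Dukascopy metadata);
# USDRUB is also reported as wrong and its digit count needs checking.
SYMBOL_DIGITS = {'XAUUSD': 3, 'XAGUSD': 3}
DEFAULT_DIGITS = 5

def decode_price(raw, symbol):
    """Convert a raw integer price to a float using the symbol's digits."""
    digits = SYMBOL_DIGITS.get(symbol, DEFAULT_DIGITS)
    return raw / 10 ** digits
```

With 3 digits, a raw value of 1581150 decodes to 1581.15, matching the reference CSV above.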

Write Python script to read symbols.sel file. [$50 awarded]

Write a new script in Python 3 which will print the values from symbols.sel to the output.

Then print the known values to the output such as:

  • symbol
  • digits
  • index
  • group
  • pointSize
  • spread
  • tickType
  • time
  • bid
  • ask
  • sessionHigh
  • sessionLow
  • bid_2
  • ask_2
  • etc.

It can be in any output format, but the format of each field should match its type (ask/bid should print as a float, time as a time, etc.).

See: symbols.sel.h

Check some other code examples for format details. Also this one.

Check these symbols.sel files as example.
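A minimal sketch of how such a reader could walk fixed-size records with the struct module. The 4-byte version header, the 128-byte record size, and the leading symbol/digits/index fields are assumptions to check against symbols.sel.h, not a verified layout:

```python
import struct

# Assumed layout (verify against symbols.sel.h): a 4-byte version header,
# then fixed-size records beginning with the symbol name, digits and index.
HEADER = struct.Struct('<i')
RECORD_SIZE = 128                       # assumption
SYMBOL_FIELD = struct.Struct('<12sII')  # symbol, digits, index (assumption)

def read_symbols_sel(data):
    """Yield (symbol, digits, index) tuples from symbols.sel bytes."""
    (version,) = HEADER.unpack_from(data, 0)
    offset = HEADER.size
    while offset + RECORD_SIZE <= len(data):
        symbol, digits, index = SYMBOL_FIELD.unpack_from(data, offset)
        yield symbol.rstrip(b'\x00').decode('ascii'), digits, index
        offset += RECORD_SIZE
```

The remaining fields (group, pointSize, spread, bid, ask, session highs/lows, etc.) would be added to the format string once their offsets are confirmed from the header file.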

--- The **[$50 bounty](https://www.bountysource.com/issues/31094557-write-python-script-to-read-symbols-sel-file?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github)** on this issue has been claimed at [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F20487492&utm_medium=issues&utm_source=github).

Complete dl_bt_metaquotes.py script. [$100]

Complete the dl_bt_metaquotes.py script, which downloads DAT files (an undocumented format) and converts them into plain CSV format. The converted data should be consistent; currently it is not.

It converts some of the data, but it still has some problems with unknown blocks.

Check for comments and TODOs for further info.

Please note that this task may be challenging, since the format is undocumented (unknown) and may use a custom or lesser-known compression scheme; however, most of the work is already there, so it is a matter of debugging and implementing the missing bits.

The output should be predictable and fairly simple (timestamp, open, high, low, close, volume). The script dumps partial data, but it does not yet understand all of it (i.e. its storage mechanism).

New script: dl_bt_metaquotes.py

A new script needs to be created: dl_bt_metaquotes.py, for downloading backtest data (DAT) for a given symbol (-p) and years (-y) from metaquotes.net into a given folder (-d, e.g. download/metaquotes/yyyy/mm/) and converting it to CSV (-c), similar to dl_bt_dukascopy.py.

Example URL: http://history.metaquotes.net/symbols/EURUSD/list.txt

The script should be fairly similar to: dl_bt_dukascopy.py.

Once downloaded, when the -c option is specified, the data should be converted into CSV (the same format as generated by dl_bt_dukascopy.py). Unfortunately I cannot find any documentation on the DAT format; it could possibly be similar to HST/FXT (see: convert_mt_to_csv.py).
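A minimal sketch of the download side, built around the list.txt URL pattern given above. The per-file URL scheme (symbol directory plus the file name from list.txt) is an assumption to verify against the actual contents of list.txt:

```python
import os
import urllib.request

BASE = 'http://history.metaquotes.net/symbols'

def list_url(symbol):
    """URL of the file listing for a symbol, e.g. EURUSD."""
    return '%s/%s/list.txt' % (BASE, symbol)

def download_listed_files(symbol, dest):
    """Fetch list.txt and download each listed DAT file into dest.
    Assumes each file lives under the same symbol directory."""
    os.makedirs(dest, exist_ok=True)
    listing = urllib.request.urlopen(list_url(symbol)).read().decode()
    for name in listing.split():
        url = '%s/%s/%s' % (BASE, symbol, name)   # assumed per-file URL scheme
        urllib.request.urlretrieve(url, os.path.join(dest, name))
```

The -c conversion step would then decode each downloaded DAT file once its format is reverse-engineered.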
