Neural Networks in Trading: Can We Build an Oracle for the “Whole Market”? Developing an NN Implementation to Tackle Multi-instrument Forecasting

Through the “Neural Networks in Trading” series of posts we have covered many interesting aspects of using this machine learning technique in the financial world, including input/output definitions, the results of some practical systems based on neural networks, and the steps needed to build a neural network for financial forecasting. In today’s post I want to share with you some developments I have carried out over the past few weeks that circle around the idea of building an “oracle”: a computational forecasting tool with statistically meaningful predictive power across a wide range of market instruments.

The market is a single entity, so we can assume that movements in one market instrument affect others in one way or another. For most traders it becomes evident that analysing different pairs – when talking about Forex in particular – gives a better view of the market, since it lets you interpret movements more clearly. For example, you might be tempted to enter on a strong bearish signal on the GBP/USD, but when you realize – through a broader market perspective – that the movement is actually a small bearish signal on most GBP pairs while most USD pairs show no movement at all, you would be tempted to stay out, since overall currency directionality does not fall in line with your market view. It is evident that currency correlations may exist, so it is worth exploring how strong they are and whether a simple analysis of a large number of instruments can lead to a forecast with significant predictive power.
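As a rough sketch of what such a correlation check might look like – the symbol names and the shared GBP factor below are synthetic assumptions for illustration, not the actual data behind this post – a correlation matrix over daily returns already reveals this kind of shared-currency structure:

```python
import numpy as np
import pandas as pd

# Synthetic daily returns: the two GBP pairs share a common GBP factor,
# while EURUSD is generated independently (all purely illustrative).
rng = np.random.default_rng(3)
gbp_factor = rng.normal(0, 1, 500)
data = pd.DataFrame({
    "GBPUSD": gbp_factor + rng.normal(0, 0.5, 500),
    "GBPJPY": gbp_factor + rng.normal(0, 0.5, 500),
    "EURUSD": rng.normal(0, 1, 500),
})

# Pairs sharing a currency correlate strongly; the independent pair does not.
print(data.corr().round(2))
```

On real EOD returns the off-diagonal entries would quantify exactly how much “overall currency directionality” one pair shares with the rest.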

My idea was therefore to attempt to build an “oracle” we could use to predict the outcome of the next candle over X number of instruments. First of all I carried out a careful data analysis, export and merging procedure to generate a table containing end-of-day (EOD) data for at least 19 different Forex symbols (including the majors and most minors), taking great care to ensure that each entry matched the same day across all the different instruments. With this table in hand, I carried out some processing to make the data compatible with the NN input requirements, and I then built a DLL using FANN and FreePascal, as well as a separate FreePascal program and an MT4 interface, which I could use to actually test this idea in practice.
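A minimal sketch of the day-alignment step – assuming pandas and three synthetic price series, whereas the real procedure covered at least 19 symbols with FreePascal tooling – is an inner join on the date index, which keeps only days present for every instrument:

```python
import numpy as np
import pandas as pd

# Synthetic EOD closes for three symbols (the real table held at least 19).
rng = np.random.default_rng(0)
dates = pd.bdate_range("2010-01-01", periods=6)
frames = []
for i, sym in enumerate(["EURUSD", "GBPUSD", "USDJPY"]):
    d = dates.delete(i)  # drop a different day per symbol: mismatched calendars
    closes = 1.0 + rng.normal(0, 0.01, len(d)).cumsum()
    frames.append(pd.Series(closes, index=d, name=sym))

# The inner join keeps only the dates present for every instrument, so each
# row of the merged table matches the same day across all symbols.
table = pd.concat(frames, axis=1, join="inner")
print(table.shape)
```

Any day missing for even one symbol is dropped, which is exactly the guarantee the merged EOD table needs before it can feed a multi-instrument network.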

What I wanted to predict was not the “exact” change of the next candle but simply whether the next candle would be bullish or bearish. Such a prediction has a potentially significant edge, since we can enter a long or short trade and close it at the end of the next candle if the predicted outcome changes. The final neural network topology was quite complex, using the last 10 candles for each symbol, two hidden layers of 190 neurons, and 19 output neurons containing binary outputs that forecast whether each currency pair will go up or down within the next candle.
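To make the shapes concrete, here is the described topology as an untrained forward pass in NumPy – a sketch only, since the actual implementation used FANN; treating each candle as one normalized return per symbol is my assumption, and training and input scaling are omitted:

```python
import numpy as np

N_SYMBOLS, N_LAGS = 19, 10      # 19 pairs, last 10 daily candles each
N_IN = N_SYMBOLS * N_LAGS       # 190 inputs (assumed: one return per candle)
N_HID = 190                     # two hidden layers of 190 neurons
N_OUT = N_SYMBOLS               # one up/down output per pair

rng = np.random.default_rng(1)
W1 = rng.normal(0, 0.1, (N_IN, N_HID))
W2 = rng.normal(0, 0.1, (N_HID, N_HID))
W3 = rng.normal(0, 0.1, (N_HID, N_OUT))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """x: vector of 190 normalized returns -> 19 up-probabilities."""
    h1 = np.tanh(x @ W1)
    h2 = np.tanh(h1 @ W2)
    return sigmoid(h2 @ W3)     # > 0.5 read as bullish, < 0.5 as bearish

x = rng.normal(0, 1, N_IN)
p = forward(x)
signals = p > 0.5               # boolean up/down forecast per pair
```

Thresholding the sigmoid outputs at 0.5 is what turns the network into the binary bullish/bearish oracle described above.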

I have to say that no one should ever expect an approach like this to produce extremely profitable or accurate results. Financial systems are quite chaotic, so even complex non-linear correlations tend to change with time (often very quickly), which causes the neural network to be inaccurate a large percentage of the time. It is worth saying that what we’re looking for is not the ability to say with certainty what will happen tomorrow, but simply a predictive capability that grants us a long-term statistical edge. Even if our long-term predictive power is small (just 2-5% above chance), we could still have profitable results in trading.
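For intuition on why such a small edge matters, assume a symmetric one-unit payoff per trade (a simplification that ignores spreads and variable position sizes): the expected value per trade is then 2p − 1 for win probability p, and even p = 0.52 accumulates meaningfully over many trades:

```python
def expected_edge(p_win):
    """Expected units per trade with symmetric +1/-1 payoffs."""
    return 2.0 * p_win - 1.0

# Expected cumulative units over 1000 trades at different hit rates:
# pure chance yields nothing, while 2-5% above chance adds up.
for p in (0.50, 0.52, 0.55):
    print(p, round(expected_edge(p) * 1000, 1))
```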

The first results I have obtained with this NN have been quite encouraging, with most runs achieving a significant statistical edge over a period of at least 5 years. However, several problems remain important, such as the variability between network runs (which appears even when using committees) and the painfully long testing runs caused by the inevitably large amount of training data. Certainly building a multi-threaded approach could solve the computational problem (at least in part), and gaining a better understanding of how to treat the input variables could lead to further improvements regarding variability.
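The committee idea mentioned above can be sketched as simple averaging of member outputs, which reduces forecast variance roughly in proportion to the number of independent members; the member probabilities below are random placeholders, not real network outputs:

```python
import numpy as np

rng = np.random.default_rng(2)
n_members, n_pairs = 9, 19  # e.g. 9 independently trained nets, 19 pairs

# Placeholder outputs: each member emits one up-probability per pair.
member_probs = rng.uniform(0.0, 1.0, (n_members, n_pairs))

# Committee forecast: average the probabilities, then threshold at 0.5;
# a majority vote over thresholded members is a common alternative.
committee = member_probs.mean(axis=0)
signal_avg = committee > 0.5
signal_vote = (member_probs > 0.5).sum(axis=0) > n_members // 2
```

Averaging tempers the run-to-run variability of any single trained network, though as noted above it does not eliminate it.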

An added consequence of the above developments – which I only came to consider a short while ago – is that the DLL also provides the ability to use daily data for a large variety of pairs in MT4, which allows us to effectively build systems using multi-instrument data (something MT4 does not enable by default). Not only will the above DLL be extremely useful for the exploration of forecasting techniques, but it will also provide us with an additional tool to build normal systems in MT4. The use of multi-instrument data, and potentially other data sources such as the DOW and interest rates, is therefore now an open and real possibility for our system development efforts.

In essence I believe this is a significant step towards a comprehensive NN solution for forecasting a wide variety of market instruments, and it can definitely lead to the generation of very powerful trading systems based entirely on neural networks for their logic. Certainly – due to the black-box nature of NN solutions – I will be extremely careful with the training and testing procedures of such networks (with validating tests always being constantly updated runs) in order to ensure that when we release a system based on a solution like this it will be extremely robust and reliable.

Right now I am looking forward to testing this NN solution with some alternative pre-processing steps, and to introducing some small changes in the inputs to highlight correlations dealing with swap rates (so that the network can also rely on carry trade information based on these aspects). Work definitely continues strong on NN solutions, and potentially before the year ends we will have the first introduction of NN systems and money management techniques in Asirikuy :o)

If you would like to learn more about my work in automated trading and how you too can tackle automated trading based on understanding and learning, please consider joining Asirikuy, a website filled with educational videos, trading systems, development and a sound, honest and transparent approach towards automated trading in general. I hope you enjoyed this article! :o)


5 Responses to “Neural Networks in Trading: Can We Build an Oracle for the “Whole Market”? Developing an NN Implementation to Tackle Multi-instrument Forecasting”

  1. Franco says:

    Interesting Daniel, thanks for all your hard work! I cannot see how any NN will perform better than static TA strategies, but combining the two might give even better results.

    The ultimate NN though would be one that receives feeds of every important market in the world, but that is of course impossible.

    • Fd says:

      Hi Franco,

      if you don’t mind, I would like to answer your question:

      > I cannot see how any NN will perform better than static TA strategies
      We are searching for self-adapting strategies with a low risk of failure. A NN can be made adaptive if it is trained not just once but continuously.

      > The ultimate NN though would be one that receives feeds of every important market in the world, but that is of course impossible.
      I don’t think that this would be the ultimate NN. It is crucial for success with a NN to have a very clear view of which inputs are relevant. The definition of an “important market” now might not be the same as in 10 years. And here you have the problem of deciding what is important. NNs are very bad at finding out what is important when fed with redundant, noisy input, but when they are fed with meaningful input they can adapt to the inner laws, even though these relationships do not need to be known by the programmer. Applying PCA or entropy measures to the input will not completely eliminate this problem, because even if you feed in non-random and non-redundant information, there is no guarantee that any outcome is predictable by nature. Non-random is not the same as predictable. Thus the ultimate solution would be to have a fitting model of market dynamics. Once this is achieved, one could choose the most simple implementation.

      Best regards,

      • Franco says:

        Hey Fd,

        Thank you for your reply…

        Thanks for helping me onto the right path here; I initially thought the bigger the NN the better, but I guess that is not what happens in reality.

        Determining an “important market” might be a huge challenge though; maybe someone in Asirikuy should consult a respectable economist who is familiar with the Forex market.

  2. Bruno says:

    Wow, your efforts and hard work that lead to these kinds of discoveries astonish me every time! Congratulations and thanks for this new achievement! Especially the ability to use interest rates for forecasting has huge potential in my opinion.

  3. Jacob says:

    Hi Daniel and guys,
    very interesting – thanks!
    Any possibility that one could use the COT report mentioned in this article for NN purposes?

    kind regards

