Posted by: cjforex | January 29, 2013

BE the MANAGER. Distribute risk over multiple systems.

YouTube video – BE the boss.

I posted this to a forum, and decided to paste it here as well in case any retail traders come across it.

A consideration with respect to the holy grail, and to blowing up accounts, is reliance on a single approach. With a single approach, traders are inclined to bet too heavily per trade.

Since all systems, and I mean all systems, draw down, even a minor drawdown can be deadly when you are betting heavily.

When you can place trades where each individual trade has no significance to you (because the trade size is very, very small), and you can spread your risk across many trades and many approaches, two things happen:

1. Your account drawdown becomes smoothed. This is because the drawdown of one approach is usually mitigated by the draw-up of one of your other approaches (see the sketch after this list).

2. That $99 system actually has a chance. Every system has good times, bad times, and kick-you-in-the-teeth times. The question is whether you can ride out the bad times so the system can realize the good times. You can, but only if the system has a small portion of your account and sits among other systems that can offset it.
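For what it's worth, here is a minimal Python sketch of point 1, using made-up return streams: two independent systems traded at half size each will usually show a smaller combined drawdown than either one traded alone. The numbers are hypothetical; the point is only the smoothing effect.

```python
# Minimal sketch (hypothetical numbers): two independent "systems" blended
# into one account; compare max drawdown of each alone vs. the 50/50 mix.
import numpy as np

rng = np.random.default_rng(0)

def max_drawdown(equity):
    """Largest peak-to-trough drop of an equity curve, as a fraction."""
    peaks = np.maximum.accumulate(equity)
    return np.max((peaks - equity) / peaks)

n = 1000                                  # number of trades/periods
sys_a = rng.normal(0.0005, 0.01, n)       # per-period returns, system A
sys_b = rng.normal(0.0005, 0.01, n)       # per-period returns, system B (independent)

eq_a = np.cumprod(1 + sys_a)
eq_b = np.cumprod(1 + sys_b)
eq_mix = np.cumprod(1 + 0.5 * sys_a + 0.5 * sys_b)   # half the size in each

print("max DD system A: ", round(max_drawdown(eq_a), 3))
print("max DD system B: ", round(max_drawdown(eq_b), 3))
print("max DD 50/50 mix:", round(max_drawdown(eq_mix), 3))
```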

The last crumb of this diatribe is management.

There is a BBC show, I think it's called Traders, and it's on YouTube. It's an experiment where they hire 10 people off the street and train them to work in a trading room. A three-part series, I think.

Watch it. Then BECOME the manager.

They constantly evaluated each trader: gave some traders more money, gave others less. They constantly adjusted, and they started small so no single trader could kill the account. Emulate it.
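A hedged sketch of that manager loop, with made-up numbers: start every "employee" (system) small, review it periodically, and shift capital toward the ones that have been performing, without ever letting a single one dominate. The cap and floor below are illustrative, not a recommendation.

```python
# Hypothetical illustration: periodically re-weight capital across systems
# based on a trailing performance score, clipping each weight to
# [floor, cap] before renormalising so no single system dominates.
def reallocate(scores, cap=0.25, floor=0.02):
    """scores: trailing performance per system (e.g. recent P&L or Sharpe).
    Returns capital weights summing to 1."""
    raw = [max(s, 0.0) + 1e-9 for s in scores]     # losing systems get ~zero
    total = sum(raw)
    w = [min(max(r / total, floor), cap) for r in raw]
    norm = sum(w)
    return [x / norm for x in w]

# Example review: five systems, two doing well, one losing money.
print(reallocate([1.2, 0.8, 0.1, -0.5, 0.4]))
```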

So whether you have the ability to create your own “employees” or you need to buy them for $99 each, get them. Lots of employees. Then BE the manager.

Don’t bet the farm on some new employee off the street.

http://www.hftreview.com/pg/blog/mike/read/70156/humans-vs-robots-man-fights-back

http://www.hftreview.com/pg/blog/spiron/read/69826/smart-analytics-helps-trading-desks-in-challenging-environment–big-data-in-the-financial-markets

http://www.portfolioprobe.com/2013/01/14/the-incoherence-of-risk-coherence/

Posted by: cjforex | January 13, 2013

“The Fifth Element of Effective Thinking: Change”

Yuichi Takano, Renata Sotirov
Abstract: We address the multi-period portfolio optimization problem with the constant rebalancing strategy. This problem is formulated as a polynomial optimization problem (POP) by using a mean-variance criterion. In order to solve the corresponding POPs of high degree, we develop a cutting-plane algorithm based on semidefinite programming. Our algorithm can solve problems that cannot be handled directly by known polynomial optimization solvers.
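The cutting-plane/SDP machinery is beyond a blog post, but for anyone unfamiliar with the term, here is a minimal Python sketch (with made-up returns) of what a constant rebalancing strategy is: every period the portfolio is reset to the same fixed weights. This is just the strategy being optimized, not the paper's algorithm.

```python
# Minimal illustration of constant rebalancing (not the paper's algorithm):
# each period the portfolio is reset to the same fixed target weights.
import numpy as np

weights = np.array([0.6, 0.4])             # fixed weights, restored every period
returns = np.array([[ 0.02, -0.01],        # hypothetical per-period asset returns
                    [-0.03,  0.02],
                    [ 0.01,  0.00]])

wealth = 1.0
for r in returns:
    wealth *= 1.0 + weights @ r            # portfolio return with weights reset
print("terminal wealth:", round(wealth, 4))
```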

Colin Atkinson & Gary Quek

A discrete-time model of portfolio optimization is studied under the effects of proportional transaction costs. A general class of underlying probability distributions is assumed for the returns of the asset prices. An investor with an exponential utility function seeks to maximize the utility of terminal wealth by determining the optimal investment strategy at the start of each time step. Dynamic programming is used to derive an algorithm for computing the optimal value function and optimal boundaries of the no-transaction region at each time step. In the limit of small transaction costs, perturbation analysis is applied to obtain the optimal value function and optimal boundaries at any time step in the rebalancing of the portfolio.
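Again, not the paper's derivation, but a hedged sketch of the "no-transaction region" idea it refers to: with proportional costs, you only trade when the risky fraction of wealth drifts outside a band; inside the band, the cost of trading outweighs the benefit. The boundary values below are made up for illustration.

```python
# Hypothetical illustration of a no-transaction region with proportional costs.
def rebalance_with_band(risky_fraction, lower=0.55, upper=0.65):
    """Trade only when the risky fraction leaves the band [lower, upper];
    otherwise do nothing. Returns the change in risky fraction to apply."""
    if risky_fraction < lower:
        return lower - risky_fraction      # buy risky asset up to the lower boundary
    if risky_fraction > upper:
        return upper - risky_fraction      # sell risky asset down to the upper boundary
    return 0.0                             # inside the no-transaction region

for f in (0.50, 0.60, 0.70):
    print(f, "->", round(rebalance_with_band(f), 3))
```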

Posted by: cjforex | December 17, 2012

I’ll Take Two Please

Gresham Computing plc, a leading provider of transaction control solutions to international financial institutions, today announced that its reconciliation solution Clareti Transaction Control (CTC) has achieved unprecedented transaction processing times in a series of benchmarking tests conducted with Intel. These tests included load and match into a database of over 50,000 equity trade transactions per second and were conducted in June 2012 on the Intel® Xeon® Processor E7 family, at Intel’s Computing lab in Reading, United Kingdom.

CTC was built using the GigaSpaces XAP elastic application platform as an integral part of its infrastructure. XAP is geared for big data, employing an in-memory data grid for tremendous processing speed, ‘share-nothing’ partitioning for reliability and consistency, and an event-driven architecture that enables real-time processing of massive event streams and unlimited processing scalability. With CTC, organisations can rapidly replace existing manual and semi-automatic internal controls with automated controls, secure in the knowledge that CTC can easily scale to cope with higher volumes in the case of future growth or consolidation.

In the past institutions might have needed multiple reconciliation systems to handle large volumes of transactions close to real-time. They can now process these through a single instance of CTC, making significant cost and efficiency savings. CTC has the scalability to handle the transaction processing needs of industry consolidators such as OTC central clearing bodies or reconciliation service providers – all of whom have growing demands for volume and need to get things done faster.
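The press release is about throughput, but the core operation of a reconciliation engine is easy to sketch: match records from two sources on a key and flag the breaks. The toy, single-threaded Python version below is nothing like CTC's partitioned in-memory grid; it only shows the matching step itself, with invented record fields.

```python
# Toy reconciliation: match trades from two sources on trade id and amount,
# and report breaks (mismatches or records present on only one side).
def reconcile(source_a, source_b):
    b_by_id = {t["id"]: t for t in source_b}
    matched, breaks = [], []
    for t in source_a:
        other = b_by_id.pop(t["id"], None)
        if other is not None and other["amount"] == t["amount"]:
            matched.append(t["id"])
        else:
            breaks.append(t["id"])
    breaks.extend(b_by_id)                 # ids only present in source B
    return matched, breaks

a = [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]
b = [{"id": 1, "amount": 100}, {"id": 3, "amount": 75}]
print(reconcile(a, b))                     # ([1], [2, 3])
```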

Posted by: cjforex | December 17, 2012

The Online Algorithmic Complexity Calculator

This is interesting (I quote from the site):

For a long time researchers from all disciplines have avoided the use of universal mathematical measures of information theory (beyond the traditional computable, but limited, Shannon information entropy), measures such as Kolmogorov-Chaitin complexity, Solomonoff-Levin universal induction or Bennett’s logical depth, as well as other related measures, citing the fact that they are uncomputable.

These measures are, however, upper or lower semi-computable and are therefore approachable from below or above. For example, lossless compression algorithms can approximate Kolmogorov-Chaitin complexity (a compressed string is a sufficient test of non-randomness) and applications have proven to be successful in many areas. Nevertheless, compression algorithms fail to compress short strings and do not represent an option for approximating their Kolmogorov complexity. This online calculator provides a means for approximating the complexity of binary short strings for which no other method has existed until now by taking advantage of the formal connections among these measures and putting together several concepts and results from theoretical computer science. This calculator implements our Coding Theorem Method (see the Publications). In the future it will cover a larger range of objects (e.g. non-binary and (n>1)-dimensional arrays) and other more specific tools.

http://www.complexitycalculator.com

This looks like an interesting approach, and I wonder if it can be applied to the analysis of an individual system's equity curve.
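As a crude experiment in that direction, one could binarise an equity curve's period-to-period changes and use the compressed size as a complexity proxy. This uses ordinary lossless compression (zlib), not the site's Coding Theorem Method, and the quote above warns that compression works poorly on short strings; the curves below are made up purely to show the mechanics.

```python
# Crude complexity proxy for an equity curve: binarise the up/down sequence
# and compare compressed size to original size (lower = more regular).
import random
import zlib

random.seed(0)

def compression_ratio(equity):
    """Compressed size / original size of the binarised up/down sequence."""
    bits = "".join("1" if b > a else "0" for a, b in zip(equity, equity[1:]))
    data = bits.encode("ascii")
    return len(zlib.compress(data, 9)) / max(len(data), 1)

trend = list(range(200))                        # perfectly regular curve
noisy = [random.random() for _ in range(200)]   # coin-flip up/down curve

print("trend ratio:", round(compression_ratio(trend), 3))
print("noisy ratio:", round(compression_ratio(noisy), 3))
```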

I first saw this in a blog post at http://systematicinvestor.wordpress.com

He explains it:

The Multiple Factor Model can be used to decompose returns and calculate risk. Following are some examples of the Multiple Factor Models:

Multi-factor models are used to construct portfolios with certain characteristics, such as risk, or to track indexes. When constructing a multi-factor model, it is difficult to decide how many and which factors to include. One example, the Fama and French model, has three factors: size of firms, book-to-market values and excess return on the market. Also, models will be judged on historical numbers, which might not accurately predict future values.

Multi-factor models can be divided into three categories: macroeconomic, fundamental and statistical models. Statistical models are used to compare the returns of different securities based on the statistical performance of each security in and of itself.

via Multi-Factor Model Definition | Investopedia.
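A minimal sketch of the statistical flavour, with made-up data: regress an asset's returns on a few factor returns and read off the exposures (betas) and the residual, the part of the return the factors do not explain. The factor names and numbers are hypothetical.

```python
# Hypothetical multi-factor decomposition: asset returns regressed on factor
# returns via ordinary least squares; the betas are the factor exposures.
import numpy as np

rng = np.random.default_rng(1)
T = 250
factors = rng.normal(0, 0.01, size=(T, 3))              # e.g. market, size, value
true_beta = np.array([1.1, 0.4, -0.2])
asset = factors @ true_beta + rng.normal(0, 0.005, T)   # factor returns plus noise

X = np.column_stack([np.ones(T), factors])              # intercept + factor returns
coef, *_ = np.linalg.lstsq(X, asset, rcond=None)
alpha, betas = coef[0], coef[1:]
residual = asset - X @ coef                              # what the factors miss

print("alpha:", round(alpha, 5))
print("betas:", np.round(betas, 3))
print("residual vol:", round(residual.std(), 5))
```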
