Thursday, 21 August 2014

“If-thens” generate “choices”, or how to free yourself of others’ causal claims.

I read a bit of news--mainstream, Twitter and Facebook--and I have this desperate urge to review a bit of logic.

For example, when someone declares "Do it my way or the highway," you don't have to worry.  You can reconstruct that statement into:

If it is x, then it must be y.
It is either not x or it is y.

For example:
If he wears brown shorts, then he must be a terrorist.
Either he does not wear brown shorts or he is a terrorist.

Translating the above threat, “Do it my way or the highway,” means “If you don’t do it my way then you must take the highway.”  The phrase “do it my way” is negated in the antecedent.
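The rule being used here is the standard equivalence of material implication: "if p then q" says the same thing as "either not p, or q". A minimal Python sketch checking that the two forms agree on every truth assignment:

```python
from itertools import product

def implies(p, q):
    # material implication: "if p then q" fails only when p is true and q is false
    return not (p and not q)

# "If it is x, then it must be y" agrees with "it is either not x or it is y"
# on every one of the four truth assignments.
for p, q in product([False, True], repeat=2):
    assert implies(p, q) == ((not p) or q)
```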

Let’s go further.  How about the statement:

“If you believe in me, then you will go to heaven”?

Translates to:

“Either you don’t believe in me or you will go to heaven.”

That’s proper.

But suppose that statement gets warped into:

“If you don’t believe in me then you will go to hell.”

Which translated into “or” form becomes:

“Either you believe in me or you will go to hell.”
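The warped version is not just a restyling of the original. Reading "go to hell" as "not go to heaven" (my simplifying assumption, not in the text), the warped statement is the inverse, not-B implies not-H, which does not follow from the original B implies H. A quick truth-table check:

```python
from itertools import product

def implies(p, q):
    # material implication: false only when p is true and q is false
    return not (p and not q)

# B = "you believe in me", H = "you go to heaven"; "go to hell" is read as not-H.
original = {}  # truth table of B -> H ("believe, then heaven")
warped = {}    # truth table of not-B -> not-H ("don't believe, then hell")
for B, H in product([False, True], repeat=2):
    original[(B, H)] = implies(B, H)
    warped[(B, H)] = implies(not B, not H)

# The inverse is NOT equivalent to the original: the tables disagree,
# e.g. at B=False, H=True (no belief, yet heaven anyway).
assert original != warped
assert original[(False, True)] and not warped[(False, True)]
```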

Bottom line: when you get into arguments about choices, watch out for causal-sounding statements of blame that turn out to be mere dichotomies.  Once you see this rule operating, you don't have to take sides.

You are free to mosey along.

Friday, 19 July 2013

Default Invariance:  The Three Approximations of Legal and Financial Reality

I took a break from blogging to work on a theory paper entitled "Default Invariance: A Naive Category Theory of Law and Finance."  You can see its abstract online.  I think the idea of taking the simple logical structure implied by the simplest form of a legal-financial phenomenon--namely, a financial contract with a one-period payment--and looking to the topological space implied by its terminal object will forever change the way we do law and finance.  In the paper there are three approximations of law and finance that correspond to the structures implied by the terminal objects Pay, Not-Pay and Pay & Not-Pay.  In the simplest rough-and-ready terms, these terminal objects imply a point, a line (a "risk homological chain complex") and a cyclic matrix (a "ring structure").  Each approximation defines the context-environment of the legal-financial structures.  And I'm happy to say that we may make explicit conceptual calculations which improve on ("correct") the works of three Nobel laureates in economics: Arrow, Debreu and Sharpe.  Arrow-Debreu-Sharpe basically set out the contingent claim model (Debreu, by the way, applied abstract algebraic-topological methods in his seminal work, A Theory of Value, which got rid of probability for one-period claims), which underlies everything we know and do in risk management, corporate governance, portfolio theory and practically everything else we call "modern finance theory."  So, one way to read my Default Invariance paper is that it puts Arrow-Debreu-Sharpe into the perspective of a naive category theory, and shows how our syntactical structures pre-dispose our conceptual calculations.  Anyway, you can read the paper to find out for yourself.  It's got 35 original diagrams that are meant to help "re-wire" one's own internal mapping of how the law and finance world works.

Fourth Approximation:  Taking Parts and Partitions Seriously

If we wanted to study law and finance as a physical process, we might find that a Darwinian-light version of the selection of laws and financial products appears to apply.  Recall that Darwin used the principle of random selection for the adaptation of certain macro-features that appear to differentiate species according to external environments.

Consider Lancelot Law Whyte (1965) Internal Factors in Evolution, cited by John Bonner (20 July 2013) "Evolution, by chance?" New Scientist, 26-27, 26.  As Bonner states, "His [Whyte's] thesis was straightforward. Not only is there selection of organisms in the environment--Darwinian natural selection, which is eternal--but there is also continuous internal selection during development.  Maybe the idea was too simple and straightforward to have taken root." Bonner then goes on to state his own thesis, "This fits in neatly with my contention that the shape of microorganisms is more affected by randomness than for large, complex organisms.  Being small means very few development steps, with little or no internal selection.  The effect of a mutation is likely to be immediately evident in the external morphology, so adult variants are produced with large numbers of different shapes and there is an increased chance that some of these will be untouched by natural selection. Compare this with what happens in a big, complex organism--a mammal, say. Only those mutations that occur at a late stage of development are likely to be viable--eye or hair colour in humans are obvious examples.  Any unfavourable mutation will likely be eliminated by internal selection."  He points out the evidence that the shapes of microorganisms are "less likely to be culled by natural selection" by citing Radiolaria (50,000 species), diatoms (100,000 species) and Foraminifera (270,000 species).   Then he states, "If you are a strict adaptationist, you have to find a separate explanation for each shape. If you favour my suggestion that their shapes arose through random mutation and there is little or no selection, the problem vanishes." [p. 27]

What structure is implied by Bonner's internal-versus-external environment selection thesis?  I find his terminology a bit confusing.  For what is the external environment of a micro-organism?  Isn't everything outside it in a sense a micro-structure, and therefore could be in it as well?  Perhaps we can clarify the thesis by translating the situation into a morphism f: A-->2.  Imagine the object A as a population with lots and lots of elements but having one partition, such that you can maximise or minimise either part.  The morphism f sends each element to one of the two elements of 2.  So long as the partition exists, the 2 separate values will exist in 2.  So we don't need the internal-versus-external division.  In category theory there is a theorem which gets rid of all "internal diagrams", so that anything and everything that can possibly be expressed can be done with external diagrams only.  I think the same can be said about Whyte's and Bonner's theses above.  In other words, Darwin's random selection to adaptation is preserved in the structure of parts and partition via a morphism f: A-->2.
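The morphism f: A-->2 here is just an indicator (characteristic) function, and the partition is recovered as its two preimages. A small Python sketch, where the population and the predicate are invented purely for illustration:

```python
# A morphism f: A -> 2 classifies each element of A into one of two parts.
# The two parts of the partition are exactly the preimages f^{-1}(0) and f^{-1}(1).
A = {"amoeba", "diatom", "radiolarian", "whale", "human"}

def f(x):
    # illustrative predicate: 1 if "large, complex organism", else 0
    return 1 if x in {"whale", "human"} else 0

part0 = {x for x in A if f(x) == 0}
part1 = {x for x in A if f(x) == 1}

# The partition and the morphism determine each other:
assert part0 | part1 == A and part0.isdisjoint(part1)
assert all((f(x) == 1) == (x in part1) for x in A)
```

So long as f takes both values, the two parts exist; no appeal to "internal" versus "external" is needed.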

The Fourth Approximation takes the Third Approximation of Pay and Not-Pay as parts with a partition.  The structure implied by this terminal object is "Continuous Contingencies" (CC) to "Infinitely Discrete Randomness" (IDR).  This sounds extraordinarily vague, but what it means is that that which is undifferentiable can be made into discrete unit choices.  In syntactic form:  g: CC-->IDR.  This is similar to the conceptual step of moving from "God as ubiquitous being" to "eating a properly cooked vegetarian meal is a morally correct choice of being."
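One concrete way to read g: CC-->IDR (my own illustration, not the paper's construction) is as quantization: a map that sends every point of a continuum onto one of finitely many discrete units of choice:

```python
def g(x, n=10):
    # Quantize the continuum [0, 1) into n discrete units:
    # every undifferentiated point x is sent to a discrete choice 0..n-1.
    return min(int(x * n), n - 1)

# Nearby points in the continuum collapse to the same discrete unit...
assert g(0.101) == g(0.109) == 1
# ...and the image of g is a finite, discrete set of choices.
assert {g(i / 1000) for i in range(1000)} == set(range(10))
```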

Tuesday, 25 September 2012

Lecture 1 post hoc notes - Legal Aspects of Corporate Finance

Lecture 1: Legal Aspects of Corporate Finance.  Guest instructors: Professor Edmond Curtin and PhD Candidate Rezarte Vukatana.  I walked in a few minutes late with a bundle of papers and just started talking about THEORY as if it were the most natural thing in the world.  I told them about a Russian table tennis star whose training regime included 6 hours of chalk-and-blackboard theory every day. But the main point came from Hohfeld's definition of theory: "A theory is not even a theory unless it can be used by practitioners in their practice."  I don't think I introduced myself, but I did introduce Edmond and later Rezi.  I mentioned a few themes: (1) WEAK EQUIVALENCE as the subtle equivalence of thoughts; (2) the UNITY OF SCIENCE CRITERION as the main ground for adjudicating theories--a theory should be judged on how it helps us understand the unity of all knowledge of being; (3) sign, symbol (Edmond mentioned "signifier", pointing to the picture of the green man in the exit sign--everyone turned to look); (4) HOHFELD, the undergrad chemistry student turned professor at Yale Law School, who in the early 20th century wrote only 6 articles and invented a periodic table for the law--4 JURAL OPPOSITES and 4 JURAL CORRELATIVES with enormous theoretical effects; (5) CORBIN and WILLISTON, who wrote encyclopedic tomes on contract law, and how Corbin (a Hohfeldian student) took just one jural correlative, rights versus duties, and turned that tiny, almost trivial legal distinction into 7 (or was it 9?) volumes of contract law; (6) and Where are Contracts anyway?
Shock horror to the civil law students ["on paper", "after the signature", they say], but no, says the common law jurisprudent--CONTRACTS EXIST IN THE MIND [Edmond]; horror of horrors, is this the pure subjectivism, relativism and thus total discretionary totalitarianism of the law?; (7) why some questions within professional discourse make no sense ["What's north of the north pole, eh?"], and is there a way of understanding that transcends the bounds of discourse?  Later, the astute Russian student, answering a question about "material information", asked a rhetorical question about the distinction of various risks.  Then I told a long story about Yuanjia, the great Chin Woo master, who, when cajoled by a Japanese martial artist that there are levels in the artistry of tea, replied, "The tea makes no such distinctions and is thoroughly enjoyed."   Thankfully, Edmond gave us a brief rendition of some of the essential legal principles of DERIVATIVES--how they actually create MORE RISK and MORE ANXIETY, and never less. Rezi described part of her PhD dissertation research--the theory of self-fulfilling prophecy a la Merton (?) and how this can be used to help explain the strange behaviours of very complex nodes of the financial system called intermediated securities accounts.   I passed around 3 LLM dissertations for the students' inspection and gave them a homework assignment.   I filed some prospectuses at [if you want the password, you need to contact me] with my notes, and asked the students to write 2,000 words on (1) the risks of the prospectus transaction (either Salvatore Ferragamo or Prada); and (2) whether and which parts of the selected prospectus would need to be changed under Directive 2010/73 of Nov 2010.  They'll need to review about 600 to 800 pages and email me their work by 12 noon Monday.  Nice shock therapy. 

Tuesday, 11 September 2012

Extreme Philosophy: On the Limits of Self-Referential Truth: Why Paradox Has Been Binned By Naive Category Theory

1. Here are two papers of EXTREME PHILOSOPHICAL SIGNIFICANCE: [1] Lawvere, F. William, "Diagonal arguments and cartesian closed categories with Author Commentary," Lecture Notes in Mathematics, 92 (1969), 134-145; [2] Yanofsky, Noson (2003) "A Universal Approach to Self-Referential Paradoxes, Incompleteness and Fixed Points."

2. Unless you've studied a bit of category theory, i.e., read Lawvere and Schanuel (2008, 2nd edition) and Lawvere and Rosebrugh (2003), Lawvere [1] will be very obscure even with Lawvere's commentary. But take a look and get a feel. Then look at Yanofsky [2], which explains in a more breezy (but precise) way what the genius Lawvere was up to--and which, even more cleverly, in order to reach a "wider audience", drops category theory altogether and explains Lawvere's discoveries in easy enough "set and function" language.

3. I realize that category theory is not for everybody (yet), and recently in the literature there has been push-back accusing category theory of making unjustified "foundational claims"--for example, that the entirety of mathematics can be put on a category theory footing, replacing set theory as the fundamental theory to which all other theories must bow down. But I don't think category theory as it is practiced sets out to make any really big claims like these--that would be the job of propagandists. Rather, it "solves" some rather apparent fundamental problems by "resolving" them into a diagrammatic logic. If you buy the diagrams as BEING DENOTATIVE, then you might also see how category theory IS linked to Aristotle's great work On Categories. Mac Lane in a footnote joked about how the title "category theory" came from "purloining words from the philosophers, Aristotle and Kant" [pp. 29-30 of Categories for the Working Mathematician]. He doesn't say anything more about this jokey link. 
But if you read and understand Aristotle's motive in his Categories, you can see immediately that Aristotle set up foundational problems so that they could be resolved. He analysed knowledge into what might be called "said-of" and "thing-in", and asked which abstractions are primary--that is, which properties are extended and therefore must be. He listed 10 categories [what they are appears arbitrary], and he showed how you can use these primary categories to categorize everything else, that is, that which is not so extended and universal. Now, this mental-conceptual move to abstraction in order to solve a particular problem is a natural function. Lawvere & Schanuel in Conceptual Mathematics explain this movement in terms of isomorphisms: e.g., think of how you can understand what's happening in a film even after walking into the cinema late. In medias res, you know Humphrey Bogart is playing a particular character and Audrey Hepburn is playing another, and once you sort out who's who in the film, suddenly you can follow the plot with the actors playing their roles. Similarly, being born in the middle of things, we open our eyes, stretch our arms and legs, and explore the universe, fully confident that we will be able to sort EVERYTHING out. This confidence comes from something pretty powerful within ourselves that enables us to gain knowledge. And the point here is that knowledge isn't, at rock bottom, paradoxical. It is in all likelihood isomorphic.

4. Lawvere [1] takes a swipe at the propagandists who have taken some of the great theoretical work of theorists (such as Russell, Cantor, Gödel, Tarski) and turned it into very general claims about the nature of paradox at the heart of knowledge. To put this into a general philosophical context, Aristotle's optimism was founded on his discovery of a general scientific method which, if simply re-iterated, would eventually uncover all the mysteries of the universe. 
It was based on observing that which is and translating it into propositions which could be understood. If at the heart of "proposition making" we have paradox, then this whole enterprise is doomed to failure. So, burdened with the prospect of failure, why start the programme of knowledge at all?

5. The answer by Lawvere [1] and Yanofsky [2] shows why the propagandists of paradox are simply wrong. In Yanofsky's terms, Lawvere's great little paper [1] has been largely ignored by category theorists and philosophers alike because it is written in a forbidding, unpopular formalism. Yanofsky translates the results of Lawvere's paper by saying that the classical paradoxes of self-referential truth (e.g. the Liar paradox, Russell's paradox, Gödel's incompleteness and so on) are just instances of overstepping the limitations of a discourse ("discourse" is my term). There must be a way of limiting what a discourse can say about itself. This "problem" comes up in law and finance whenever they try to talk about themselves. I call it the problem of structure. That is, there is no question in law and finance that asks, "What is the structure of law? What is the structure of statements about finance?" There is no call for self-consciousness within legal or financial practice. Rather, the call for such professional consciousness comes from without. But there is a way of understanding such questions about professional discourses from a category theory perspective. And not only do the questions about the structure of law and finance make sense, they actually direct, in some fashion, a resolution to answers about the structure of law and finance. For example, one of the things I have been harping on in this blog is that there is a fundamental structure to law and finance in the form of an individual unit which I have dubbed the "financial contract" and the "great cycle of default invariance." 
From these structures, we can explain a lot of current practice at the individual-to-individual level of financial transactions on up to historical and contemporary nausea of continuously impending financial catastrophes. It's all a matter of "mapping" and translating apparent limitations within the discourse of law and finance into a notation which allows for mental journeys and conceptual calculations. By the way, one of the virtues of seeing how paradoxes are slain in [1] and [2] is that we can recover a sense of optimism that Aristotle once had in the unity of science. Again, I say, judge the value of a theory by its contribution to the unity of science.
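The diagonal construction that Lawvere [1] generalizes and Yanofsky [2] popularizes can be seen concretely: given any assignment f of characteristic functions to the elements of a set A, the function g(a) = not f(a)(a) is missed by f, so no f is surjective onto the functions A -> Bool. A finite, exhaustive Python illustration (my own sketch of the set-and-function version, not code from either paper):

```python
from itertools import product

# Finite illustration of the diagonal argument: for ANY map f that assigns
# to each element of A a characteristic function A -> Bool, the "diagonal"
# function g(a) = not f(a)(a) is not in the image of f.
A = [0, 1, 2]

def escapes_enumeration(f):
    g = {a: not f[a][a] for a in A}
    # g differs from the row f(a) at position a itself, for every a,
    # so g equals none of the rows: f cannot be surjective onto A -> Bool.
    return all(g != f[a] for a in A)

# Exhaustively try every map f: A -> (A -> Bool); the diagonal escapes each one.
for table in product([False, True], repeat=len(A) * len(A)):
    f = {a: {b: table[a * len(A) + b] for b in A} for a in A}
    assert escapes_enumeration(f)
```

Read positively, as Lawvere does, this is not a paradox but a theorem about the limits of what a fixed discourse can enumerate about itself.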

Monday, 14 May 2012

On the Unity of Science: Law, Finance and the Philosophy of Category Theory

1.  We should measure our progress in knowledge as a species by how well individual disciplines meet the criterion of the unity of science.  This criterion was stated as an almost urgent request by Edward O. Wilson with his concept of consilience.

2.  There are so many analogies between one subject and another: the vocabularies (the objects) may be different, but the ways in which these vocabularies are used (the morphisms) are so similar that they might as well be said to be the same.  What do we mean by "same"--or better, if we capture the watercolourist's wash and call it by its technical name, "weak equivalence"?

3.  Mazur (June 12, 2007), in "When is one thing equal to some other thing?", written in tribute to one of the founders of category theory, Mac Lane, sets out an approachable essay on the question of the meaning of equivalence.  This is the deep point to which all our equations and assertions in science sink.  He sets out three approaches by which we have answered this question.

4.  The first is the "bureau of standards", where by convention we can point to something in a designated office that is an equivalent exemplar. [Id @ 4-5.]  The second is a type of universal quantification, as in Frege's definition of cardinality. [Id @ 5.]  And the third is a compromise where "we indicate what we do rather than what we say we do when quizzed about our foundations." [Ibid.]  I call this third method "promiscuous stitching": using the same needle and thread (or glue) may be all we need to make appropriate connections between subjects, disciplines and fields of knowledge. 

5.  In mathematics, you can "package" entire mathematical theories either as (1) formal systems a la the David Hilbert programme, or as (2) categories. [Id @ 6.]  On the one hand, formal systems go all the way back to Euclid and are much admired under the rubric of axiomatization.  On the other hand, categories are a relatively recent invention (the 1945 paper by Samuel Eilenberg and Saunders Mac Lane was more an announcement of a new technique than of a new view of mathematics), and their method of a sparse vocabulary and sketches of arrows betrays their deep goal, which is to reveal structure.  Mathematicians, like others engaged in doing or performing in their particular discourse, don't really "axiomatize" but rather "play games with conviction." 

6.  Somewhere in Tool and Object: A History and Philosophy of Category Theory (2007), Krömer quotes Bill Lawvere (my preferred radical category theorist) as saying something to the effect that the point is not to achieve maximal abstraction, but an optimal abstraction--a just-right abstraction that works appropriately at the level where it is most needed and used.  Of course, I am attributing a certain line of argument to Lawvere which I do not think he would disagree with.  Lawvere was motivated to find a theory of physics, to explain how things worked, but his work on the philosophy of category theory takes him on exoduses into Hegel.  I believe it is this urge to find "synthesis" with simple tools that motivates him.  He has been accused of being both revolutionary and idiosyncratic.  Revolutionary for advocating that all of mathematics can be thought of in terms of the category of categories.  And idiosyncratic because, for such a great mathematician, he and Schanuel wrote a best-selling book, Conceptual Mathematics: A First Introduction to Category Theory (2009, 2nd edition), which requires no university-level mathematics to understand.  In fact, I recommend this book to all my law and finance students who are interested in pursuing the application of category theory to the field of law and finance.

7.  The main point about the test for the unity of science (consilience) is that the most appropriate method for pursuing a rigorous apprehension of science (i.e., among the three approaches stated by Mazur: the bureau of standards, universal axiomatization, or the balanced compromise I call "promiscuous stitching") may be something so natural and simple that even our high school students can be engaged in the enterprise.

8.  For theories of law and finance, we see that the latest trends and fashions from other fields have filtered into the vocabulary of the legal or financial theorist.  For example, in the last 5 to 10 years, in both fields there has been an emphasis, at least in the titles of papers, on the concepts of "complexity" and "behaviours."  This is not to say these concepts are red herrings.  From my view, they are just another batch of ideas that come from a few equations.  Another example: what would Hart's programme of primary and secondary rules be without the notions of first-order and second-order logics emanating from the Cambridge logicians in the early part of the 20th century?  Not that Hart genuinely meant to implement the same programme, but the inspiration for an orderly resolution of the definition of the meaning of law was certainly intended to take the script from the philosophy department--and these were the ideas pre-Wittgenstein.  In finance, the initial idea of covariance goes back to Bachelier's PhD dissertation (1900), then developed as various methods for "curve fitting" against time horizons.  Very little work has been done on how the various theories of law and finance might be approached in a unified way.  But here the stumbling block may have been the limited view on the number of approaches to reach unification.  I do not mean by "unification" a form of axiomatization or foundational premises evolved and expanded in the universalistic sense that might have endeared it to Spinoza.  Rather, we have a very powerful alternative, a kind of Kantian insight that the intellectual revolution begins with the recognition that

"There are only two possible ways in which synthetic representations and their objects...can meet one another.  Either the object (Gegenstand) alone must make the representation possible, or the representation alone must make the object possible." [quoted from Mazur supra @ 20.]

9.  In law and finance theory, I would (and will) argue that one of the significant leaps in our imagination of how law and finance work together is to recognize the structure of something called "default invariance."  This is captured by or very conveniently set out with a category theory approach.  Default invariance permeates all financial contracts, and all states of the financial-regulatory-political system.


Friday, 13 April 2012


( ) Research question: Why is it important to understand the physical limits of financial trading?
( ) Application: possible training course for supervisors and regulators around the world

I shall be adding to this list of references from time to time and reorganising its entirety when appropriate:

( ) IOSCO's Recommendation for Standard Regulation on HFTs
Regulatory Issues Raised by the Impact of Technological Changes on Market Integrity and Efficiency - Consultative Report
OICU-IOSCO (July 2011)

( ) SEC Report on May 6, 2010 Flash Crash (100930)

( ) Zerohedge (100623):

( ) Nanex Analysis of the Flash Crash (100618):

    [ ] pay particular attention to "quote stuffing" which Nanex says should be banned:

( ) Nanex Analysis of the SEC's Flash Crash (May 6, 2010) Report of Sept 30, 2010 (120412):  

     ( ) Email exchange on the definition of LIQUIDITY:

    ( ) Nanex updated analysis of April 11, 2011--check out the 2nd chart especially: Waddell & Reed, whom the SEC blamed for the Flash Crash (and who accepted that blame), could NOT PHYSICALLY have been the one who perpetrated the flash crash!

Background Readings:


HFT REVIEW White papers, academic and industry reports


[ ]  SEC Probes Ties to High Speed Traders

135 articles on high frequency trading as of April 4, 2012

[ ] MiFID 2 Barnier


WHARTON article

10 articles


[ ] Conflicting Codes and Codings
How Algorithmic Trading Is Reshaping Financial Regulation
Marc Lenglet

Contemporary financial markets have recently witnessed a sea change with the ‘algorithmic revolution’, as trading automats are used to ease the execution sequences and reduce market impact. Being constantly monitored, they take an active part in the shaping of markets, and sometimes generate crises when ‘they mess up’ or when they entail situations where traders cannot go backwards. Algorithms are software codes coding practices in an IT significant ‘textual’ device, designed to replicate trading patterns. To be accepted, however, they need to comply with regulatory texts, which are nothing else but codes of conduct coding accepted practices in the markets. In this article, I draw on ethnographic fieldwork in order to open these black boxes, while trying to describe their existence as devices encapsulating several points of view. I address the question of a possible misalignment between those visions, and more specifically try to draw the consequences raised by such discrepancies as regards the future of financial regulation.
Keywords: algorithmic trading, codes of conduct, codings, financial markets, practices, regulation
Cited by: Codes and Codings in Crisis: Signification, Performativity and Excess, Theory, Culture & Society, 1 November 2011, 28: 3-23

[ ] Investigating Financial Fraud in High Frequency Trading
Afroditi Katika 
University of Manchester
Babis Theodoulidis 
University of Manchester - Manchester Business School
David Diaz 
University of Manchester - Manchester Business School; Universidad de Chile - Escuela de Economia y Negocios

December 20, 2011

Market monitoring is a very important process to the stock market. It is necessary in order to verify that all trades comply with the existing rules, as well as to detect any act of manipulation. Recently, a new kind of trading has emerged, High Frequency Trading, which allows traders to place and execute orders within milliseconds via a program running in a computer. It is doubtful whether existent market systems are capable of detecting inconsistencies in trades at this speed. This project has attempted to propose a design of a detection engine that could be incorporated into a monitoring framework so as to accommodate High Frequency trades. Understanding the field of stock markets and High Frequency Trading was a crucial part of this project. Since it is such a recent phenomenon with no confirmed cases (to the best of our knowledge) of market manipulation, we attempted to answer whether there are certain conditions that could benefit market manipulators. We used business intelligence techniques to analyse historical data and discover what sort of indications our detection engine should look for. Our results show that there have been violations of regulations that were not blocked in real-time, which proves the inefficiency of current market monitoring systems. We also prove that an extreme number of orders within seconds can delay an exchange’s processes and systems. In the end we propose a design that takes those results into consideration.

Keywords: high frequency trading, flash crash, quote stuffing, financial markets, market manipulation, fraud detection, market monitoring, market surveillance

Working Paper Series
Date posted: December 20, 2011  
Suggested Citation
Katika, Afroditi, Theodoulidis, Babis and Diaz, David, Investigating Financial Fraud in High Frequency Trading (December 20, 2011). Available at SSRN:

[ ] On the Dark Side of the Market: Identifying and Analyzing Hidden Order Placements
Nikolaus Hautsch 
Humboldt-Universität zu Berlin; CASE - Center for Applied Statistics and Economics; CFS
Ruihong Huang 
Humboldt University of Berlin

February 8, 2012

Trading under limited pre-trade transparency becomes increasingly popular on financial markets. We provide first evidence on traders' use of (completely) hidden orders which might be placed even inside of the (displayed) bid-ask spread. Employing TotalView-ITCH data on order messages at NASDAQ, we propose a simple method to conduct statistical inference on the location of hidden depth and to test economic hypotheses. Analyzing a wide cross-section of stocks, we show that market conditions reflected by the (visible) bid-ask spread, (visible) depth, recent price movements and trading signals significantly affect the aggressiveness of 'dark' liquidity supply and thus the 'hidden spread'. Our evidence suggests that traders balance hidden order placements to (i) compete for the provision of (hidden) liquidity and (ii) protect themselves against adverse selection, front-running as well as 'hidden order detection strategies' used by high-frequency traders. Accordingly, our results show that hidden liquidity locations are predictable given the observable state of the market.

Number of Pages in PDF File: 43

Keywords: limit order market, hidden liquidity, high-frequency trading, non-display order, iceberg orders

JEL Classifications: G14, C24, C25, G17

Working Paper Series
Date posted: February 13, 2012  
Suggested Citation
Hautsch, Nikolaus and Huang, Ruihong, On the Dark Side of the Market: Identifying and Analyzing Hidden Order Placements (February 8, 2012). Available at SSRN:


[ ]  BATS Take-down


[ ]  Man Vs Machine: How Each Sees The Stock Market
Submitted by Tyler Durden on 04/03/2012
available at:



[ ] Guest Post: Deconstructing Algos, Part 4 -Phase Space Reconstructions Of CNTY Busted Trades Suggests High Speed Gang-Bangs In The Market

Story - Tyler Durden - 07/31/2011 - 17:43 - 39 comments - 0 attachments

[ ] Guest Post: Deconstructing Algos, Part 2: Leveraging Chaos Into High-Frequency Arbitrage Opportunities

[ ] Story - Tyler Durden - 07/01/2011 - 00:21 - 32 comments - 0 attachments

Guest Post: Deconstructing Algos 3: Quote Stuffing As A Means Of Restoring Arbitrageable Latency; Or Is The CQS TRYING To Crash The Market?

[ ] Story - Tyler Durden - 07/08/2011 - 22:08 - 63 comments - 0 attachments

Guest Post: Deconstructing Algos, Part 1
Submitted by The World Complex Deconstructing algos, part 1 The third part ... Natural Gas Submitted by The World ComplexDeconstructing algos, part 1 The third part of the series ... are critical to the decision-making module of the algo. The reconstructed phase space The difficulties ...

[ ] Story - Tyler Durden - 06/24/2011 - 21:16 - 82 comments - 0 attachments

Gold Tumbles More Than $100 As $1700 Stops Triggered
to algo driven liquidations following the earlier described shift in sentiment, or has some assistance ... whispers?  Felt Any MFG style whispers?  Felt like a bit more than algos.  More like a company iquidation ...

Thursday, 12 April 2012

Default Invariance: Product & Sum Modules - Sans Esquisses

Default Invariance:  
Product and Sum Category Theory Models of Financial Contracts
Product Model:  Pay or Not-Pay
Sum Model: Financial Legal Remedies

1.  Recall the Arrow-Debreu-Sharpe (ADS) model, in which t0 corresponds to the initial state of a financial contract in a world of infinite contingent states, and t1 corresponds to the state of pay.

2.  We improved the ADS model to a Default Invariance model in which the t1 state is bi-valued: (a) Pay and (b) Not-Pay, i.e. P and -P.

3.  If t1 results in -P, then the world of infinite contingent states continues at t1. This is equivalent (or isomorphic) to multiplying the infinite-contingent-states by 1.

4.  If t1 results in P, then the infinite contingent states are annihilated: the certainty of payment makes the financial contract certain, and it therefore immediately disengages from the infinitely contingent states of the world. This is equivalent (or isomorphic) to multiplying the infinite-contingent-states by 0.
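Points 3 and 4 can be put in rough-and-ready computational form. Below is a minimal sketch in Python; the names `resolve` and `uncertainty` are illustrative choices of mine, not notation from the paper.

```python
# Sketch of points 3-4: Pay (P) multiplies the contingent world by 0
# (annihilation); Not-Pay (-P) multiplies it by 1 (identity morphism).

def resolve(uncertainty: float, outcome: str) -> float:
    """Return the uncertainty remaining after the t1 outcome.

    outcome: "P"  (pay)     -> multiply by 0, contract becomes certain;
             "-P" (not-pay) -> multiply by 1, contingency carries on.
    """
    factor = 0 if outcome == "P" else 1
    return uncertainty * factor

print(resolve(1.0, "P"))   # payment: uncertainty x 0 -> 0.0
print(resolve(1.0, "-P"))  # non-payment: uncertainty x 1 -> 1.0
```

The point of the toy is only that payment acts as the zero map and non-payment as the identity, which is exactly the multiplication-by-0 and multiplication-by-1 reading above.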

5.  The two-state-at-t1 Default Invariance model can be further specified in terms of Product and Sum. [Fair warning: this is going to get a bit technical.]

Definition of Product
An object P together with a pair of maps P1: P -> B1 and P2: P -> B2 is called a product of B1 and B2 if for each object X and each pair of maps f1: X -> B1 and f2: X -> B2, there is exactly one map f: X -> P for which both f1 = P1 ∘ f and f2 = P2 ∘ f. [See Lawvere & Schanuel, Conceptual Mathematics, 2009, p. 217.]

[I'll insert a diagram later. Hint: it looks like a chevron, with X on the left, an arrow from X to P labelled f, an arrow from X to B1 labelled f1, an arrow P1 from P to B1, an arrow from X to B2 labelled f2, and an arrow P2 from P to B2.]
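Until the diagram arrives, the universal property can be seen concretely in the category of sets, where the product is the cartesian product, P1 and P2 are the projections, and the unique f is the pairing of f1 and f2. This is a sketch of the textbook definition only; the particular sets and maps are illustrative.

```python
# Product in the category of sets: P = B1 x B2 with projections P1, P2.
# Given f1: X -> B1 and f2: X -> B2, the unique f: X -> P is
# f(x) = (f1(x), f2(x)), and the two triangles f1 = P1.f, f2 = P2.f commute.

def P1(pair): return pair[0]   # projection P -> B1
def P2(pair): return pair[1]   # projection P -> B2

def pairing(f1, f2):
    """The unique map f: X -> B1 x B2 induced by f1 and f2."""
    return lambda x: (f1(x), f2(x))

# Illustrative maps on a small X
X = [0, 1, 2]
f1 = lambda x: x % 2     # X -> B1
f2 = lambda x: x + 10    # X -> B2
f = pairing(f1, f2)

# Both commuting triangles hold for every x in X
assert all(P1(f(x)) == f1(x) and P2(f(x)) == f2(x) for x in X)
```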

6.  Given this definition of Product, we can now apply what we stated in an earlier blog post: payment of a financial contract makes it certain, and therefore takes it out of the realm of uncertainty, so that it no longer resides in a world of infinite contingent states. Thus, an occurrence of payment is equivalent to the value 0 in the infinite contingent world. In our Product Diagram, B1 = (Infinite-Contingency) x (0). Another way of saying this is B1 = Uncertainty x 0, which means no more uncertainty. This valuation is not what Sharpe and others had supposed; they had in fact given payment the value of 1, which leads to inconsistent and contradictory results.

7.  Also, we can see that non-payment at t1 means that the infinite-contingent-states of the world continue at t1. This is equivalent to (infinite-contingent-states) x (1). So, non-payment is actually an identity morphism. In our Default Invariance Product Diagram, B2 = 1. Strangely, f2 will then have to be equivalent to infinite contingency divided by infinite contingency.

8.  As a sum, we have the following definition:

A pair of maps j1: B1 -> S and j2: B2 -> S in a category makes S a sum of B1 and B2 if for each object Y and each pair of maps g1: B1 -> Y and g2: B2 -> Y, there is exactly one map g: S -> Y for which both g1 = g ∘ j1 and g2 = g ∘ j2. [See Lawvere & Schanuel, Conceptual Mathematics, 2009, p. 222.]
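As with the product, the sum can be seen concretely in the category of sets, where S is the tagged disjoint union, j1 and j2 are the injections, and the unique g is case analysis on the tag. The tags and example maps are illustrative, not from the paper.

```python
# Sum (coproduct) in the category of sets: S = B1 + B2, the tagged
# disjoint union, with injections j1, j2. Given g1: B1 -> Y and
# g2: B2 -> Y, the unique g: S -> Y is the case analysis [g1, g2],
# satisfying g1 = g.j1 and g2 = g.j2.

def j1(b): return ("left", b)    # injection B1 -> S
def j2(b): return ("right", b)   # injection B2 -> S

def copairing(g1, g2):
    """The unique map g: B1 + B2 -> Y induced by g1 and g2."""
    def g(tagged):
        tag, value = tagged
        return g1(value) if tag == "left" else g2(value)
    return g

# Illustrative reading: B1 = breach events, B2 = not-breach events,
# Y = legal remedies.
g1 = lambda b: f"damages for {b}"   # breach    -> remedy
g2 = lambda b: "no remedy needed"   # no breach -> remedy
g = copairing(g1, g2)

assert g(j1("late payment")) == "damages for late payment"
assert g(j2("performed")) == "no remedy needed"
```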

It is my contention that a legal remedy to a financial contract has the form of a sum as above, where Y is the legal remedy and B1 and B2 correspond to the breach and not-breach situations of the contract. This is, of course, a first approximation of a legal risk theory.

9.  The virtue of a Product and a Sum model for the fundamental and universal unit of law and finance is that by putting them together through time (that is, from t0 to t1 to t2), we begin to have a view of how law and finance may be seen under one perspective that allows for both (1) simplification and anticipation of direct results, i.e., "rough and ready" calculations that border on immediate insight through very complex legal and financial phenomena; and (2) a very detailed, rigorous bookkeeping or tracking methodology to ensure that our predictions make sense and are grounded on facts.
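The bookkeeping-through-time idea in point 9 can be sketched as a toy ledger. A contract is tracked from t0 onward; at each period the outcome Pay multiplies the remaining contingency by 0 (the contract resolves) and Not-Pay multiplies it by 1 (contingency carries forward). The ledger format is my own illustrative choice, not anything from the paper.

```python
# Toy bookkeeping sketch of point 9: fold per-period outcomes
# ("P" = pay, "-P" = not-pay) into a ledger of states from t0 onward.
# Pay multiplies contingency by 0; Not-Pay multiplies it by 1.

def track(outcomes):
    """Return a ledger [(period, contingency), ...] for the outcome list."""
    contingency = 1  # 1 = still embedded in the infinite contingent states
    ledger = [("t0", contingency)]
    for t, outcome in enumerate(outcomes, start=1):
        contingency *= 0 if outcome == "P" else 1
        ledger.append((f"t{t}", contingency))
        if contingency == 0:
            break  # payment annihilates contingency; nothing left to track
    return ledger

print(track(["-P", "P"]))  # [('t0', 1), ('t1', 1), ('t2', 0)]
```

Non-payment at t1 leaves the contract in the contingent world (and, per point 8, opens the sum-shaped remedy branch); payment at t2 closes the ledger.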