Compare commits

...

9 commits

5 changed files with 145 additions and 5 deletions

1 .gitignore vendored

@@ -1,3 +1,4 @@
/report.pdf
/result
/.pre-commit-config.yaml
/plantuml-images/


@@ -27,3 +27,5 @@ pdf-engine: pdflatex
table-of-contents: true
toc-depth: 1
number-sections: true
filters:
- pandoc-plantuml

BIN figs/remote-plots-presentation-censored.png (Stored with Git LFS) Normal file



@@ -36,6 +36,8 @@
  buildInputs = with pkgs; [
    gnumake
    pandoc
    pandoc-plantuml-filter
    plantuml
    (texlive.combine {
      inherit (texlive)
        scheme-small

142 report.md

@@ -1,5 +1,95 @@
# Exec Sum
When deciding to major in Image Processing and Image Synthesis, my main
motivator was a growing curiosity about high-performance programming, as I
thought that some of the subjects being taught would lend themselves well to
furthering my understanding of this domain.\
On my own, through a number of conference talks, I discovered that the field of
finance offers many interesting challenges aligned with those interests.
When the opportunity arose to work for IMC, a leading firm in the world of
*market-making*, I jumped at the chance to apply for my internship there. When
asked what kind of work I wanted to do during that time, I highlighted my
interest in performance, which led to the subject of writing a benchmark
framework for their new exchange connectivity layer, currently in the process of
being created and deployed.\
This felt like the perfect subject to learn more about finance, a field I had
been interested in for some time, and to be exposed to knowledgeable people who
could help cultivate that budding interest.
During my internship, I was part of the *Global Execution* team at IMC, a new
team tasked with working on projects of relevance to multiple desks (each of
which is dedicated to a specific exchange). Its first contribution is the
*Execution Gateway*, a piece of software that acts as the intermediary between
IMC's trading algorithms and external exchanges. It is meant to replace the
current pattern of each auto-trader connecting directly to the exchanges to
place orders, migrating instead to a central piece of infrastructure that
translates between internal IMC protocols and decisions, and exchange-facing
requests.
As part of this migration process, my mission was to provide a simple framework
that could be used to measure the performance of such a gateway. To do so, we
must be able to instrument one under various scenarios meant to mirror real-life
conditions, or to exercise edge cases.
This led me to first get acquainted with the components that go into running
the gateway, and with what is necessary on the client side to make use of it
through the *Execution API*, which is the interface exposed to downstream
consumers of the gateway: the trading algorithms.\
Similarly, I learned what the gateway itself needs in order to communicate with
an exchange, as I would also need to control that side in the framework to
measure the time taken for an order to travel from the client to the exchange.\
To learn about those dependencies, I studied the existing tests for the
*Execution API*, which bundle client, gateway, and exchange in a single process
to test their correct behaviour.
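
As a rough illustration of that setup, the sketch below bundles a stand-in
client, gateway, and exchange in a single Python process and times one order end
to end. The class names and wiring are my own simplification for this report,
not IMC's actual Execution API components.

```python
# Illustrative sketch only: the class names and wiring are hypothetical
# stand-ins, not IMC's Execution API. It shows the idea of bundling client,
# gateway, and exchange in one process and timing an order end to end.
import time


class MockExchange:
    """Accepts orders and records when each one arrived."""

    def __init__(self) -> None:
        self.received: dict[str, float] = {}

    def on_order(self, order_id: str) -> None:
        self.received[order_id] = time.perf_counter()


class Gateway:
    """Translates client orders into exchange-facing requests (here: forwards)."""

    def __init__(self, exchange: MockExchange) -> None:
        self.exchange = exchange

    def submit(self, order_id: str) -> None:
        self.exchange.on_order(order_id)


class Client:
    """Stands in for a trading algorithm using the gateway."""

    def __init__(self, gateway: Gateway) -> None:
        self.gateway = gateway
        self.sent: dict[str, float] = {}

    def place_order(self, order_id: str) -> None:
        self.sent[order_id] = time.perf_counter()
        self.gateway.submit(order_id)


if __name__ == "__main__":
    exchange = MockExchange()
    client = Client(Gateway(exchange))
    client.place_order("order-1")
    latency = exchange.received["order-1"] - client.sent["order-1"]
    print(f"client-to-exchange latency: {latency * 1e6:.1f} µs")
```
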
Once I was more familiar with each piece of the puzzle, I got to work on writing
the framework itself. This first required writing a few modules to provide the
data the gateway needs as input before anything can be run. With those done, the
next step was getting every component running properly. The last part was
generating some dummy load on the gateway and collecting its performance
measurements, which could then be analyzed offline, in a more exploratory
manner.\
With that work done, I had delivered the final product of my internship subject,
taken from a first *Proof-Of-Concept* to a working end result. Along the way I
learned about the specific needs surrounding the usage of those benchmarks, and
integrated them into the greater IMC ecosystem.
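
The load-generation step mentioned above could look roughly like the following
sketch: pace dummy orders at a fixed rate and dump one timestamp per order for
later analysis. The function and file names are invented for illustration; the
real framework drives the gateway rather than a no-op `send_order`.

```python
# Hypothetical sketch of the "dummy load" step: drive a gateway at a fixed
# order rate and record one send timestamp per order for offline analysis.
# `send_order` stands in for whatever call the real framework makes.
import csv
import time


def send_order(order_id: int) -> None:
    # Placeholder for submitting an order through the gateway under test.
    pass


def run_load(orders: int, rate_per_s: float, out_path: str) -> None:
    interval = 1.0 / rate_per_s
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["order_id", "send_ts_ns"])
        next_send = time.perf_counter()
        for order_id in range(orders):
            # Simple busy-wait pacing keeps the send rate roughly constant.
            while time.perf_counter() < next_send:
                pass
            writer.writerow([order_id, time.perf_counter_ns()])
            send_order(order_id)
            next_send += interval


if __name__ == "__main__":
    run_load(orders=1_000, rate_per_s=500.0, out_path="send_timings.csv")
```
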
With the initial work on the benchmark framework done, my next area of focus was
adding compatibility testing for the gateway. The point of this work was to add,
as part of the *Continuous Integration* pipeline in use at IMC, tests that
exercise the actual gateway binaries used in production, checking both forward
and backward compatibility. This work is a natural continuation of the benchmark
framework: both need to run the gateway and generate specific scenarios to test
its behaviour. This allowed me to reuse most of the code I had written for the
benchmark and apply it to writing the tests.\
The need for reliable tests meant that I had to do a lot of groundwork to ensure
that they were not flaky. This is probably the part of the process that took the
longest, with some deep investigations to understand subtle bugs and behaviours
exposed by the new tests I was attempting to integrate.
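
Conceptually, those compatibility runs boil down to exercising every pairing of
client and gateway versions, as in the simplified sketch below. The version
labels and the `run_compat_test` helper are invented here; the real CI tests
launch the actual production binaries.

```python
# Simplified sketch of forward/backward compatibility testing; the version
# labels and helper are invented, the real CI runs actual gateway binaries.
import itertools

CLIENT_VERSIONS = ["1.4", "1.5"]   # e.g. previous and current Execution API clients
GATEWAY_VERSIONS = ["2.0", "2.1"]  # e.g. production and candidate gateway binaries


def run_compat_test(client_version: str, gateway_version: str) -> bool:
    # Placeholder: a real test would start the gateway binary for
    # `gateway_version`, drive it with a `client_version` client, and
    # verify the exchanged messages; here we only report the pairing.
    print(f"running client {client_version} against gateway {gateway_version}")
    return True


if __name__ == "__main__":
    # Old client vs. new gateway covers backward compatibility,
    # new client vs. old gateway covers forward compatibility.
    for client, gateway in itertools.product(CLIENT_VERSIONS, GATEWAY_VERSIONS):
        status = "ok" if run_compat_test(client, gateway) else "FAIL"
        print(f"client {client} x gateway {gateway}: {status}")
```
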
Towards the end of my internship, I presented the work I did on the framework to
other developers in the execution teams. This was in part to showcase the work
being done by the *Global Execution* team, and to participate in the regular
knowledge sharing that happens at IMC.
Joining a company during the COVID period of quarantines and working from home,
with relatively little face-to-face contact, highlighted the need for efficient
ways of communicating with my colleagues. Being part of a productive, highly
driven team has been a pleasure.
Overall, this internship was a great reaffirmation of my career choices. It was
my first experience with a project dealing with the performance of something
actually used in production, rather than a one-off project to work on and then
leave aside. Being able to contribute something that will be used long-term, and
that will have a real impact, reassures me in my choice to become a software
engineer, and in the value of my studies at EPITA.
\newpage
# Thanks and acknowledgements
First off, I would like to thank Jelle Wissink, an engineer from the Global
@@ -49,9 +139,9 @@ In the face of continuous improvement of their system, the performance aspect of
any upgrade must be kept at the forefront of the mind in order to stay
competitive, and rise to a dominant position globally.
-My project fits into the migration of IMC's trading algorithms from their legacy
-*driver* connecting each of them directly to the exchanges, to a new central
-service being developed to translate and interface between IMC-internal
+My project fits into the migration of IMC's trading algorithms from their
+individual *drivers* connecting each of them directly to the exchanges, to a new
+central service being developed to translate and interface between IMC-internal
communication and exchange-facing orders, requests, and notifications.
Given the scale of this change, and how important such a piece of software is in
@@ -305,7 +395,8 @@ across the team. This work was two-fold:
* Do a presentation about the benchmark framework: because it only contains the
tools necessary as the basis for running benchmarks, other engineers will need
to pick it up to write new scenarios, or implement the benchmark for new
-exchanges. To that end, I FIXME
+exchanges. To that end, I presented my work, demonstrated its ease of use and
+justified some design decisions.
* How to debug problems in benchmarks and compatibility test runs: due to the
unconventional setup required to run those, investigating a problem when running
@@ -316,7 +407,41 @@ used to debug problems I encountered during their development.
## Gantt diagram
-FIXME
```plantuml
@startgantt
ganttscale weekly
Project starts 2021-03-01
-- Team 1 --
[Discovering the codebase] as [Discover] lasts 14 days
then [Proof Of Concept] as [POC] lasts 7 days
then [Model RDS Server] as [RDS] lasts 14 days
then [Framework v0] as [v0] lasts 14 days
then [Refactor to v1] as [v1] lasts 14 days
then [Initial results analysis] as [Analysis] lasts 7 days
then [Dumping internal measurements] as [EventAnalysis] lasts 7 days
then [Benchmark different gateway] as [NNF] lasts 14 days
[Compatibility testing] starts 2021-05-30
note bottom
Fixing bugs
Reducing flakiness
end note
[Compatibility testing] ends 2021-08-15
[Intern week] starts 2021-06-28
note bottom
Learn about option theory
end note
[Intern week] ends 2021-07-04
[Holidays] starts 2021-08-16
[Holidays] ends 2021-09-01
@endgantt
```
# Engineering practices
@@ -677,3 +802,10 @@ algorithms, it makes use of its execution platform to provide liquidity to
financial markets globally.
## Results & Comments
![Sample plots, with timings censored](figs/remote-plots-presentation-censored.png)
The figure above showcases a few sample results that can be obtained using the
benchmark framework. The gateway keeps track of some internal events and their
timings, and reports them back to the benchmark, which dumps the data and allows
for further analysis.
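
As an indication of what that offline analysis can look like, the small sketch
below aggregates dumped event latencies into percentiles. The file name and
column layout (`event`, `latency_ns`) are assumptions made for this example, not
the framework's actual output format.

```python
# Hedged sketch of analyzing dumped measurements offline; the file name and
# columns are assumptions, not the benchmark framework's real format.
import csv
import statistics
from collections import defaultdict


def summarize(path: str) -> None:
    # Assumed columns: `event` (name of the internal event) and `latency_ns`.
    latencies: dict[str, list[int]] = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            latencies[row["event"]].append(int(row["latency_ns"]))

    for event, values in sorted(latencies.items()):
        cuts = statistics.quantiles(values, n=100)  # 99 percentile cut points
        print(
            f"{event}: n={len(values)} "
            f"median={cuts[49]:.0f}ns p99={cuts[98]:.0f}ns"
        )


if __name__ == "__main__":
    summarize("gateway_events.csv")
```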