01/20/2023 | News release | Archived content
Over the last few years, there has been a step change in the role of data and technology in trading, risk management, and investment decision-making. Data analysis techniques previously considered emerging or experimental are becoming mainstream. Firms are deploying data science tools to improve risk assessment and business response strategies, and to bring more rigor to their operations. On-demand access to significant computational resources via the cloud, with high-performance data stores and in-memory architectures, enables firms to do more ad-hoc analysis, testing, and validation using the most granular levels of data, without the need to pre-aggregate or pre-format it. Firms, however, face the challenge of enabling their quants and data scientists to produce high-value work without compromising security or the controls on who can access and adjust official risk and PnL numbers.
Historically, risk managers have had to 'lock down' the official platforms. This created two main problems that hampered advanced analysis.
The first is duplication of work: to carry out sophisticated analysis using the firm's actual trade and/or market data, quants have sometimes needed to re-develop the pricing models implemented in the firm's 'official' PnL platforms. This is time-consuming and complex, especially for derivative products or when an accurate representation of all trade details is required.
The second is inconsistency: inevitably, the models implemented independently by quants, strats, and data scientists differ from the official ones, and the divergence typically grows with the complexity of the products. This is particularly true when the pricing models embedded in the PnL platform come from a technology provider.
Modern data science tools integrated into the latest generation of Mark to Market (MtM) platforms solve both problems.
Open-source data science tools provide many possibilities for building machine learning (ML) models and analyzing vast amounts of data. However, to be useful in real-world applications, the data underlying any analysis must come from a source system. Modelling even simple products such as bonds usually requires relatively complex building blocks: interest rate curves, reference data, exact product definitions, and market quotes for interest rates and bonds. Collecting, representing, and normalizing this data is a complicated and tedious task, and inaccuracies in modelling these components can distort or even invalidate any further analysis.
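To make the point concrete, the sketch below shows, in plain Python, the minimum building blocks needed before even a simple bond can be priced: a curve object with an interpolation convention, a product definition, and a pricing rule. All class names, conventions, and numbers here are illustrative, not part of any vendor API, and a production curve would be bootstrapped from live market quotes rather than hard-coded.

```python
# Illustrative only: the minimal objects needed to price a simple bond.
import math
from dataclasses import dataclass

@dataclass
class DiscountCurve:
    """Zero-rate curve with linear interpolation in the zero rate."""
    times: list   # year fractions of the curve pillars
    zeros: list   # continuously compounded zero rates

    def df(self, t: float) -> float:
        """Discount factor for time t (flat extrapolation beyond pillars)."""
        if t <= self.times[0]:
            return math.exp(-self.zeros[0] * t)
        for (t0, z0), (t1, z1) in zip(zip(self.times, self.zeros),
                                      zip(self.times[1:], self.zeros[1:])):
            if t <= t1:
                z = z0 + (z1 - z0) * (t - t0) / (t1 - t0)
                return math.exp(-z * t)
        return math.exp(-self.zeros[-1] * t)

@dataclass
class FixedBond:
    """Product definition: annual fixed coupons plus principal at maturity."""
    coupon: float   # annual coupon per unit notional
    maturity: int   # whole years to maturity

    def price(self, curve: DiscountCurve) -> float:
        coupons = sum(self.coupon * curve.df(t)
                      for t in range(1, self.maturity + 1))
        return coupons + curve.df(self.maturity)

# Curve pillars stand in for quotes normally bootstrapped from market data
curve = DiscountCurve(times=[1.0, 2.0, 5.0], zeros=[0.030, 0.032, 0.035])
bond = FixedBond(coupon=0.04, maturity=5)
print(f"bond price per unit notional: {bond.price(curve):.4f}")
```

Even this toy version embeds a curve convention, an implicit day count, and exact cash-flow dates; a real source system has to supply all of these consistently, which is precisely the tedious work described above.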
A new set of API tools is now emerging. These tools are designed to seamlessly integrate open-source data science packages and programming environments with more traditional MtM risk platforms, such as Quantifi. These new APIs enable innovative integration between a standalone risk platform and programming environments that quants and traders can easily use on their desktops.
This framework provides the next level of interoperability by allowing the transfer of fully calibrated complex objects (such as curves, volatility surfaces, and product or trade representations) to various parts of the risk ecosystem. Using these APIs, quants and quant traders can take an existing portfolio of trades from the risk platform and perform back-testing, custom VaR calculations, ad-hoc scenarios, or sensitivity analysis independently of the primary risk platform. Alternatively, users can simply extract the required data objects, such as curves, quotes, and reference data, and construct new trading strategies, or apply these objects to price bespoke derivatives not handled by the PnL platform.
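As a sketch of the kind of standalone analysis described above, the snippet below computes a custom historical-simulation VaR from per-trade scenario P&L vectors. In practice those vectors would be extracted from the risk platform through its API; here they are synthetic stand-ins, and none of the names refer to a real Quantifi interface.

```python
# Hypothetical standalone risk measure: 1-day historical-simulation VaR.
# The per-trade P&L vectors are synthetic stand-ins for objects that would
# be pulled from the risk platform via its API.

# 250 historical 1-day scenario P&Ls per trade (deterministic toy data)
trade_pnls = {
    "bond_A": [(-1) ** i * 0.8 + 0.01 * (i % 7) for i in range(250)],
    "swap_B": [(-1) ** (i + 1) * 0.5 + 0.02 * (i % 5) for i in range(250)],
}

def historical_var(pnl_vectors: dict, confidence: float = 0.99) -> float:
    """Loss at the given percentile of the portfolio P&L distribution,
    returned as a positive number."""
    # Aggregate trades scenario by scenario, then read off the tail quantile
    portfolio = sorted(sum(scen) for scen in zip(*pnl_vectors.values()))
    idx = int((1 - confidence) * len(portfolio))
    return -portfolio[idx]

var99 = historical_var(trade_pnls)
print(f"99% 1-day VaR: {var99:.3f}")
```

The same pattern extends to ad-hoc scenario sets or sensitivity ladders: the platform supplies consistent inputs, and the custom measure itself is only a few lines of user code.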
Another advantage of this new technology is that it allows users to work in their preferred programming environment, for example, Python in Jupyter Notebooks, or other popular programming languages and integrated development environments (IDEs). Users can perform their analyses and build advanced models using the APIs on their local machines, while the primary PnL platform operates elsewhere, even hosted on a cloud-computing platform.
Using the framework described above, users can ensure consistency between pricing trades in an 'official' MtM platform and their local development environment. This is because all the business objects required for calculations are passed directly from the platform, where they were created using the pre-defined 'official' set of pricing rules and parameters.
In addition, users benefit from a truly 'low-code' environment, where the risk platform handles the setup of complex trade pricing logic. Users can therefore focus on adding value by implementing high-level tasks such as portfolio back-testing, custom scenario or risk measure calculations, or portfolio optimization. All of this can be achieved without spending time on setting up the underlying risk factors, security, or trade details.
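For instance, once the platform has handled the trade-pricing setup, a custom parallel-shift scenario grid reduces to a short loop of high-level code. The two toy pricers below are assumptions standing in for platform-supplied trade objects; nothing here is a real platform call.

```python
# Toy stand-ins for platform-supplied pricers: each maps a parallel rate
# shift (in basis points) to a trade value.
def bond_value(shift_bp: float) -> float:
    # Bond loses ~4 units of value per 100bp rise (duration-style toy pricer)
    return 100.0 - 4.0 * shift_bp / 100.0

def payer_swap_value(shift_bp: float) -> float:
    # Payer swap gains as rates rise
    return 6.5 * shift_bp / 100.0

portfolio = [bond_value, payer_swap_value]
base = sum(trade(0.0) for trade in portfolio)

# The custom, high-level part: an ad-hoc scenario ladder
for shift in (-100, -50, 0, 50, 100):
    pnl = sum(trade(shift) for trade in portfolio) - base
    print(f"{shift:+5d}bp parallel shift -> P&L {pnl:+.2f}")
```

The user's code contains only the scenario logic; everything about how each trade is actually priced stays inside the platform-supplied objects.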
Furthermore, this framework can facilitate the transfer of objects representing any trade type supported by the MtM platform, no matter how complex. As a result, users can run analyses on portfolios consisting of mixed trades and hedges, both vanilla and derivative. By using high-level code, users avoid the extra layers of complexity inherent in low-level code. Moreover, because the process runs standalone from the central MtM platform, users can be confident that they will not negatively affect the performance of the primary platform or the integrity of its data. The framework also allows integration with popular ML libraries, which often require extensive computing power, again without affecting the performance of the primary MtM platform, while consistency between the two environments is maintained.
The MtM platform also takes care of the data dependencies (i.e., market and reference data) required to price trades. The process involves automating the trade and market data feeds and preserving the relationships and hierarchies of data from multiple sources. Users can therefore focus primarily on implementing new functionalities instead of 'cleaning' the data.
Advanced ML models can be set up using accurate trade and product representation, and consistent market data and pricing rules. This adds new levels of flexibility and robustness while ensuring consistency throughout the modelling process.
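As a minimal illustration of model-building on top of consistent scenario data, the snippet below fits a least-squares rate sensitivity to a small set of scenario P&Ls using plain Python. The numbers are invented for the sketch; a real model would be trained on scenario output drawn from the platform.

```python
# Illustrative only: recover a portfolio's rate sensitivity from scenario
# P&Ls with an ordinary least-squares fit.
shifts_bp = [-100.0, -50.0, 0.0, 50.0, 100.0]
noise = [0.010, -0.020, 0.000, 0.015, -0.005]
pnl = [0.025 * x + e for x, e in zip(shifts_bp, noise)]  # 'observed' P&L

# Closed-form simple linear regression: slope = cov(x, y) / var(x)
n = len(shifts_bp)
mx = sum(shifts_bp) / n
my = sum(pnl) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(shifts_bp, pnl))
         / sum((x - mx) ** 2 for x in shifts_bp))
print(f"estimated sensitivity: {slope:.5f} per bp")
```

The value of the consistent data pipeline is that the fitted sensitivity can be compared directly against the platform's own risk numbers, because both are derived from the same pricing rules and market data.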
Quantifi's data-science-enabled platform allows quants and traders at multiple financial institutions to automate and 'outsource' the task of manually collecting and processing data. Users can focus on implementing the required custom business logic, complementary to the functionality of the platform, and significantly reduce delivery times. The platform gives clients the ability to perform complex data analysis and flexible reporting using Python, Jupyter Notebooks, and other popular data science tools. Integrated with Quantifi's advanced model library, clients benefit from complex data-driven analysis, strategy back-testing, and ad-hoc portfolio what-if scenarios, all using mixed data sets from diverse sources.