How long can CCPs afford to wait?

Two months ago the Basel Committee decided that banks will have to set aside less capital against trades cleared through central counterparties (CCPs), in a bid to encourage banks to use them. Routing trades through CCPs makes it easier for regulators to follow the flow of banks’ trades and their exposures to each other.

This followed a joint statement by the European Central Bank and the Bank of England on the City’s clearing houses, which finally agreed that euro-denominated transactions could be cleared outside the Eurozone, whilst stressing that “CCP liquidity risk management remains first and foremost the responsibility of the CCPs themselves”1.

Read more

Posted in Uncategorized | Comments Off

Tackling the Look-through Challenges with In‑Memory Computing

In the April 2015 edition of its Global Financial Stability Report, the IMF raised concerns about potential financial stability risks posed by the asset management industry, calling for regulatory scrutiny of a sector that intermediates 40% of the world’s financial assets. Whether under regulatory or client pressure, asset managers should consider the technology implications of greater transparency in risk reporting sooner rather than later. This post delves into the implications of the look-through approach from a data management standpoint, building the case for modern in-memory aggregation technology to process massive amounts of highly granular data.
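The look-through approach can be illustrated with a toy sketch: fund-of-fund positions are expanded recursively into underlying security exposures by multiplying weights down the holding chain. The fund names, weights and the `look_through` helper below are hypothetical, purely for illustration:

```python
# Hypothetical sketch: "looking through" fund-of-fund positions to the
# underlying securities by multiplying weights down the holding chain.
def look_through(position, holdings):
    """Recursively expand a position into underlying security exposures.

    holdings maps a fund name to a list of (constituent, weight) pairs;
    any name absent from the map is treated as a terminal security.
    """
    name, amount = position
    if name not in holdings:            # terminal security: report exposure
        return {name: amount}
    exposures = {}
    for constituent, weight in holdings[name]:
        sub = look_through((constituent, amount * weight), holdings)
        for security, exposure in sub.items():
            exposures[security] = exposures.get(security, 0.0) + exposure
    return exposures

holdings = {
    "FundA": [("FundB", 0.5), ("AAPL", 0.5)],
    "FundB": [("AAPL", 0.2), ("BondX", 0.8)],
}
print(look_through(("FundA", 1_000_000), holdings))
# {'AAPL': 600000.0, 'BondX': 400000.0}
```

The data-management challenge the post alludes to is exactly this fan-out: every position can explode into thousands of granular exposures, which is where in-memory aggregation pays off.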

Read more

Posted in Big Data Analytics, Finance | Comments Off

Addressing FRTB challenges with in-memory computing

In a recent video blog published on March 18, Satyam Kancharla from Numerix* highlighted some of the issues introduced by the draft proposal for the Fundamental Review of the Trading Book (FRTB) run by the Basel Committee on Banking Supervision (BCBS). Among these challenges are the transition from Value-at-Risk to Expected Shortfall, the use of varying liquidity horizons, and revisions to the methodologies.
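The shift from Value-at-Risk to Expected Shortfall can be sketched on toy data. This is a minimal empirical estimator, not the BCBS methodology; `var_and_es` and the P&L vector are invented for illustration:

```python
# Hypothetical sketch: Value-at-Risk is a loss quantile, while Expected
# Shortfall averages the losses beyond it, so ES also captures how bad
# the tail is, not just where it starts.
def var_and_es(pnl, level_pct=97.5):
    """Return (VaR, ES) at the given confidence level from raw P&L values."""
    losses = sorted((-x for x in pnl), reverse=True)        # largest loss first
    n_tail = max(1, round(len(losses) * (100 - level_pct) / 100))
    tail = losses[:n_tail]
    return tail[-1], sum(tail) / len(tail)                  # (VaR, ES)

pnl = [float(x) for x in range(-100, 100)]                  # 200 toy observations
var, es = var_and_es(pnl)
print(var, es)   # 96.0 98.0
```

Because ES averages over the whole tail rather than reading off a single quantile, every tail scenario now contributes to the capital figure, one reason the computational load grows under FRTB.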

Read more

Posted in Finance | Comments Off

Crowded Trades: Are Clearing Houses Immune From Systemic Risk?

The sudden decision by the SNB to remove the longstanding cap on the Swiss franc against the euro took markets by surprise, causing many casualties amongst foreign exchange brokers. As the Financial Times put it on January 19, “In one of the most damaging currency swings in the modern trading era, the Swiss Franc soared in value, leaving investment banks across the world with big losses and hitting foreign exchange brokers particularly hard”.
Read more

Posted in Finance | Comments Off

How to make historical analysis work on real-time data

Historical data analysis is typically enabled by data duplication technologies. But is this method still valid today, when users need to analyze historical data that moves fast and changes rapidly throughout the day? In ActivePivot, we practically had to re-invent our core database to support customers who wanted to travel back in time and analyze large volumes of dynamic data.
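One way to support such “as of” queries without duplicating data is to version every update and replay state at a past timestamp. The `AsOfStore` below is a hypothetical minimal sketch of that idea, not ActivePivot’s actual implementation:

```python
# Hypothetical sketch of "time travel" over fast-changing data: each update
# is appended with a version timestamp, and an as-of query looks up the
# latest version at any past instant instead of overwriting rows in place.
from bisect import bisect_right

class AsOfStore:
    def __init__(self):
        self._versions = {}   # key -> ([timestamps], [values]), kept in order

    def put(self, key, value, ts):
        stamps, values = self._versions.setdefault(key, ([], []))
        stamps.append(ts)     # assumes monotonically increasing timestamps
        values.append(value)

    def get(self, key, as_of):
        stamps, values = self._versions.get(key, ([], []))
        i = bisect_right(stamps, as_of)          # versions known at as_of
        return values[i - 1] if i else None

store = AsOfStore()
store.put("EURUSD", 1.10, ts=1)
store.put("EURUSD", 1.05, ts=5)
print(store.get("EURUSD", as_of=3))   # 1.1   (latest version at t=3)
print(store.get("EURUSD", as_of=9))   # 1.05
```

The append-only layout is the key design choice: history is never destroyed by an update, so any past state can be reconstructed with a binary search rather than a full copy of the data.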
Read more

Posted in Big Data Analytics, Technology | Comments Off

From Multi-Core to Many-Core: Don’t Delay in Making Your Java Application “NUMA-Aware”

In the last post, I explained the difference between SMP and NUMA architectures as we enter the “many-core” era. I also asked the following question: “Is it reasonable to expect massive performance improvements when you run an existing application on new NUMA-enabled hardware?” The answer is yes, but the improvement is not automatic: you must be prepared to rewrite parts of your application’s code to get the best out of many-core hardware.
Read more

Posted in Big Data Analytics, Technology | Leave a comment

To NUMA or not to NUMA

When your business is analyzing big data with the goal of providing answers in a split second, you find yourself trying to squeeze every bit of speed out of your solution. Among other things, this includes finding the optimal processor architecture. This is why we have spent a lot of effort studying the memory architecture alternatives – NUMA (non-uniform memory access) and SMP (symmetric multiprocessing) – to see which one gives us the best results.

Read more

Posted in Big Data Analytics, Technology | Leave a comment

In-memory analytics: doing things radically differently

In our previous post, How In-Memory Computing is Accelerating Business Performance, we explained the disruptive potential of in-memory computing. Performance gains from faster query execution were one of the top benefits mentioned. However, in-memory computing goes well beyond performance gains, allowing organizations to do things differently and reach new levels of competitiveness. This post illustrates this with a few examples.

Read more

Posted in Big Data Analytics, e-Commerce, Supply Chain, Technology | Leave a comment

From 24 hours to 24 seconds – How In-Memory Computing is Accelerating Business Performance

Countless articles are written about Big Data every day. Beyond the hype, the Big Data phenomenon is a real change agent, delivering capabilities that were never thought possible before. Financial institutions and banks, for example, can calculate and assess their risk in near-real time, throughout the day.

To a large degree, the phenomenal performance and interactive analysis capabilities of Big Data projects are enabled by in-memory computing. In-memory databases have become the foundation of a new generation of business applications that put the power of analytics into the hands of decision makers.

Read more

Posted in Big Data Analytics, Finance, Technology | Leave a comment

Dynamic Data Bucketing: A Use Case for Complex Aggregation

Complex aggregation has become a common requirement for business users looking to analyze sophisticated metrics across multiple dimensions. There are numerous use cases for complex aggregation, such as cross-currency aggregation, which was explored in our last post. Dynamic bucketing is another example.

This post takes a deep look at the technical considerations for a successful implementation of dynamic bucketing.
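As a toy illustration of the idea (not ActivePivot’s implementation), dynamic bucketing can be sketched as aggregation into buckets whose boundaries are supplied at query time; `bucket_sum` and the sample trades below are hypothetical:

```python
# Hypothetical sketch of dynamic bucketing: maturity buckets are defined
# at query time and trades are aggregated into them on the fly, so a new
# bucket scheme never requires reloading or re-keying the data.
from bisect import bisect_right

def bucket_sum(trades, boundaries, labels):
    """Aggregate (maturity_in_days, notional) pairs into user-defined buckets.

    boundaries is a sorted list of cut-off maturities; labels names the
    len(boundaries) + 1 buckets they delimit.
    """
    assert len(labels) == len(boundaries) + 1
    totals = {label: 0.0 for label in labels}
    for maturity, notional in trades:
        totals[labels[bisect_right(boundaries, maturity)]] += notional
    return totals

trades = [(15, 100.0), (45, 200.0), (200, 50.0), (800, 25.0)]
# Same data, two bucket schemes chosen at query time:
print(bucket_sum(trades, [30, 365], ["<1M", "1M-1Y", ">1Y"]))
# {'<1M': 100.0, '1M-1Y': 250.0, '>1Y': 25.0}
print(bucket_sum(trades, [90], ["short", "long"]))
# {'short': 300.0, 'long': 75.0}
```

Because the bucket scheme is just a query parameter, two users can slice the same in-memory data with different maturity ladders simultaneously, which is the essence of the “dynamic” in dynamic bucketing.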

Read more

Posted in Big Data Analytics, Technology | Leave a comment