In previous posts, we’ve delved into the principles of multidimensional databases. Among the benefits that a multidimensional database delivers is complex aggregation, a process by which KPIs are written once and are immediately available across any dimension, through any filtering, letting users follow their train of thought.
But how does complex aggregation actually work? This post explores a concrete use case, articulates the technology challenges behind complex aggregation, and demonstrates why ETL and SQL relational databases are not a good fit.
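To make the idea concrete, here is a minimal sketch in Python with pandas – the trade data, the dimensions and the pnl KPI are all hypothetical – showing how a measure defined once can be re-aggregated along any dimension or filter:

```python
import pandas as pd

# Hypothetical trade-level facts with their dimensions (desk, currency).
trades = pd.DataFrame({
    "desk":     ["FX", "FX", "Rates", "Rates", "Credit"],
    "currency": ["EUR", "USD", "EUR", "USD", "EUR"],
    "pnl":      [12_500, -4_300, 8_100, 2_200, -1_750],
})

# The KPI is written once...
KPIS = {"pnl": "sum"}

# ...and is immediately available across any dimension, through any filter.
print(trades.groupby("desk").agg(KPIS))                # P&L by desk
print(trades.groupby(["desk", "currency"]).agg(KPIS))  # by desk and currency
print(trades[trades["currency"] == "EUR"].agg(KPIS))   # EUR slice only
```

A multidimensional database applies the same principle at scale, pre-calculating and caching aggregates so that each new slice does not rescan the raw facts.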
Real-time decision making is commonly linked with Complex Event Processing (CEP). Indeed, CEP systems can extract meaningful events from streams of data and raise alerts. However, for decision makers to have context and turn a notification into a meaningful, actionable event, CEP must be supplemented with mixed-workload and multidimensional capabilities.
Let’s take a look at what that means.
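As a rough illustration, here is a minimal CEP-style filter in Python – the event fields and the threshold are invented for the example. It shows what CEP does well (spotting a meaningful event in a stream) and what it lacks on its own: the alert says what happened, not what it means for the book.

```python
# A toy CEP-style filter over a stream of events (all field names and
# thresholds here are hypothetical, for illustration only).
LARGE_TRADE_THRESHOLD = 10_000_000

def detect_large_trades(stream):
    """Yield an alert for every trade whose notional crosses the threshold."""
    for event in stream:
        if event["notional"] >= LARGE_TRADE_THRESHOLD:
            yield {"alert": "large_trade", "trade_id": event["id"]}

trades = [
    {"id": 1, "notional": 2_000_000},
    {"id": 2, "notional": 15_000_000},  # this one triggers an alert
]
for alert in detect_large_trades(trades):
    print(alert)  # the alert flags the event; context must come from elsewhere
```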
In a previous post comparing multidimensional and relational databases, we mentioned that the decision-making imperatives of the Big Data era were disrupting the clear-cut border between OLTP and OLAP, enabling a new type of mixed-workload database that addresses both needs.
This post takes a closer look at mixed-workload systems – what they are, how they work, and what they are useful for. Continue reading
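To give a feel for the idea, here is a minimal mixed-workload sketch in Python – the store, its methods and the trade feed are all hypothetical. Transactional writes and analytical reads hit the same live data, with no ETL step between them:

```python
import random
import threading
import time
from collections import defaultdict

# A toy mixed-workload store: OLTP-style writes and OLAP-style reads
# share one in-memory structure (hypothetical, for illustration only).
class MixedWorkloadStore:
    def __init__(self):
        self._lock = threading.Lock()
        self._pnl_by_desk = defaultdict(float)

    def write_trade(self, desk: str, pnl: float) -> None:
        """Transactional write: record one trade."""
        with self._lock:
            self._pnl_by_desk[desk] += pnl

    def pnl_report(self) -> dict:
        """Analytical read: aggregate view over the live data."""
        with self._lock:
            return dict(self._pnl_by_desk)

store = MixedWorkloadStore()

def trading_feed():
    for _ in range(10_000):
        store.write_trade(random.choice(["FX", "Rates"]), random.uniform(-1e4, 1e4))

writer = threading.Thread(target=trading_feed)
writer.start()
time.sleep(0.01)
print(store.pnl_report())  # analytics run mid-stream, on data still being written
writer.join()
```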
Relational and multidimensional databases differ on almost every possible dimension: tables, columns and rows vs. cubes, measures and dimensions; queries across joined tables vs. pre-calculated aggregations across dimensions; Structured Query Language (SQL) vs. Multidimensional Expressions (MDX). And the list goes on.
Nevertheless, talk to a vendor from either camp, and they will argue that their system can successfully perform any task. Cross-dimensional analytics, for instance, can be performed using both types of systems. Continue reading
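As a rough sketch of the contrast – in Python with pandas, with invented tables and column names – the same cross-dimensional question can be answered relationally, by joining tables and aggregating at query time, or multidimensionally, by reading a cell of a pre-computed cube:

```python
import pandas as pd

# Relational style: normalised tables, joined and aggregated at query time.
sales = pd.DataFrame({"product_id": [1, 1, 2],
                      "region": ["EU", "US", "EU"],
                      "amount": [100, 200, 150]})
products = pd.DataFrame({"product_id": [1, 2],
                         "category": ["Hardware", "Software"]})

relational = (sales.merge(products, on="product_id")     # the join
                   .groupby(["category", "region"])["amount"]
                   .sum())                               # aggregation at query time

# Multidimensional style: aggregations are pre-calculated into a cube;
# the query becomes a lookup along its dimensions.
cube = sales.merge(products, on="product_id").pivot_table(
    index="category", columns="region", values="amount", aggfunc="sum")
multidimensional = cube.loc["Hardware", "EU"]   # a cell lookup, no join

print(relational)
print(multidimensional)
```

Both paths return the same number; what differs is when the work is done – at query time in the relational case, ahead of time in the multidimensional one.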
Tom Groenfeldt, financial technology blogger for the internationally renowned business and finance publication, Forbes, outlines the benefits of using real-time risk management in the face of the Big Data conundrum: www.forbes.com/sites/tomgroenfeldt
Comparisons are drawn between the trend monitoring capabilities of ActivePivot and the analytical approach used by intelligence agencies. Georges Bory, MD and co-founder at Quartet FS, highlights the breadth of the technology: “Whether it’s Homeland Security or fraud detection or operational risk or control in a trading room, you are trying to apply statistics to huge amounts of data.” Continue reading
In recent years, regulatory demands have permeated all types of financial institutions. Banks are finding themselves under increasing scrutiny to account for their actions with unprecedented speed and accuracy. The likes of Dodd-Frank, Basel III and EMIR are creating a growing need to assess, on a pre-trade basis, the credit impact of new OTC transactions. Continue reading
The collateral world is changing, and changing fast. The transition of the derivatives market from OTC to an exchange-traded, centrally cleared environment, as framed by the Dodd-Frank Act and European Market Infrastructure Regulation (EMIR) regulatory reforms, is a game changer for all market participants – dealers, prime brokers, custodians, asset managers and hedge funds alike.
The need for financial institutions to have real-time access to their exposures, pledged collateral and collateral requirements across all asset classes and counterparties is no trivial matter. Continue reading
Businesses are increasingly recognising the importance of Supply Chain Management (SCM) in the overall performance of their operations. In the Gartner “Hype Cycle for Supply Chain Management 2012”, a Gartner analyst observes that “After years of a cost-cutting focus, SCM continues to rise up the corporate agenda.” In fact, effective SCM presents a clear path to beating financial inefficiencies and retaining important customers, and is becoming increasingly important across industries. Continue reading
With increased pressure and greater scrutiny from regulatory bodies, the demands on product control teams are growing and show no signs of stopping. Teams require instant insight into Profit & Loss (P&L) data, the ability to quickly analyse a large number of KPIs, and the foresight to identify mismatches so that the best possible operational decisions are made. It is in these demanding scenarios that in-memory analytics technologies bring significant advantages to the table, giving product controllers heightened visibility across the entire supply chain. Continue reading
With poor product control cited as a factor in the 2008 fall of Lehman Brothers, it’s no surprise that over the past few years the spotlight has been shining on this element of trading activity. Today, banks face ever-increasing scrutiny from regulatory bodies, with the result that the product control function is now expected to address a widening range of issues. When the financial crisis kicked off, the FSA outlined key concerns about failings in the area of valuations and product control in its ‘Dear CEO’ letter. Consequently, product control is now recognised as playing a central role in the end-to-end trading process.
While this is broadly good news, greater investment in improved tools and processes is needed if product control is to realise its potential and meet these demands. Continue reading