Scalable Python research sandbox

The BMLL Data Lab takes the most granular order book data available and combines it with easy-to-use APIs and analytics libraries in a secure cloud environment. Perform scalable research without the burden of data sourcing, data curation or data engineering.

Highly Granular Data & Analytics

Conduct detailed analysis using full-depth order book data, comprehensive reference datasets and 500+ pre-computed analytics derived from the Level 3 order book, available at daily, 30-minute and 1-minute frequencies.
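As a rough illustration of what a binned order book metric looks like, the sketch below computes a quoted spread in basis points from best bid/ask snapshots and aggregates it to a 1-minute frequency. The sample data and column names are made up for the example and are not a BMLL schema.

```python
import pandas as pd

# Illustrative only: assume `quotes` holds best bid/ask snapshots for one
# security, indexed by event timestamp (not an actual BMLL schema).
quotes = pd.DataFrame(
    {
        "bid": [99.98, 99.99, 100.00, 99.99],
        "ask": [100.02, 100.01, 100.02, 100.03],
    },
    index=pd.to_datetime(
        ["2024-05-01 09:30:05", "2024-05-01 09:30:40",
         "2024-05-01 09:31:10", "2024-05-01 09:31:55"]
    ),
)

# Quoted spread in basis points relative to the mid price.
mid = (quotes["bid"] + quotes["ask"]) / 2
spread_bps = (quotes["ask"] - quotes["bid"]) / mid * 1e4

# Aggregate to 1-minute bins, mirroring the kind of binned analytics the
# platform pre-computes at daily, 30-minute and 1-minute frequencies.
spread_1min = spread_bps.resample("1min").mean()
print(spread_1min)
```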

Easy to Use API Libraries

Derive insights by utilising a fully customisable environment containing easy-to-use APIs purpose-built for analysing the order book, either in depth for a single venue or across multiple venues simultaneously.
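The data-access calls themselves are not documented here, so the skeleton below only sketches the pattern described: the same per-venue analysis wrapped so it runs across several venues at once. `load_order_book`, its parameters and the venue codes are placeholders, not the actual BMLL API.

```python
import pandas as pd

def load_order_book(symbol: str, venue: str, date: str) -> pd.DataFrame:
    """Placeholder for the platform's data-access call returning Level 3 events."""
    raise NotImplementedError("replace with the platform's data-access call")

def multi_venue_order_book(symbol: str, venues: list[str], date: str) -> pd.DataFrame:
    """Stack per-venue books so one analysis runs across venues simultaneously."""
    frames = []
    for venue in venues:
        book = load_order_book(symbol, venue, date)
        book["venue"] = venue
        frames.append(book)
    return pd.concat(frames).sort_index()

# Single-venue, in-depth analysis...
# book = load_order_book("SAP", venue="XETR", date="2024-05-01")
# ...or the same code applied across venues at once.
# multi = multi_venue_order_book("SAP", venues=["XETR", "CHIX", "TRQX"], date="2024-05-01")
```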

Cloud-Native Environment

Access a cloud-native solution and scale the compute resources you need through an intuitive UI. For the most compute-intensive research, utilise pre-configured cluster tooling with a few clicks.
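The specific cluster tooling is not named here, so purely as an illustration of the pattern, the sketch below uses Dask to fan a per-day computation out over a pool of workers; the analysis function and the local cluster are placeholders for whatever the platform pre-configures.

```python
from dask.distributed import Client
import pandas as pd

def analyse_one_day(date: str) -> pd.DataFrame:
    """Stand-in for a per-day order book computation."""
    return pd.DataFrame({"date": [date], "metric": [0.0]})

if __name__ == "__main__":
    # Client() starts a local cluster; a remote scheduler address could be
    # passed instead, e.g. Client("tcp://scheduler:8786").
    client = Client()
    dates = pd.date_range("2024-05-01", "2024-05-31", freq="B").strftime("%Y-%m-%d")
    futures = client.map(analyse_one_day, list(dates))
    results = pd.concat(client.gather(futures))
    print(results.head())
```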

Production and Workflow Tooling

Increase speed to insight through workflow integration tooling, such as scheduled jobs that run analyses when required. SFTP servers feed research results into your production systems or bring proprietary datasets into the environment, while version-controlled datasets and environments ensure robust, reproducible analysis.
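For example, a scheduled job might write its output locally and then push it to a downstream SFTP endpoint. The snippet below uses paramiko as one common way to do that; the host, credentials and paths are placeholders, not details of the BMLL service.

```python
import paramiko

HOST = "sftp.example.com"      # placeholder
USERNAME = "research"          # placeholder
PASSWORD = "change-me"         # placeholder; key-based auth is preferable
LOCAL_FILE = "daily_spread_analytics.csv"
REMOTE_PATH = "/inbound/daily_spread_analytics.csv"

# Open an SFTP session and deliver the research output downstream.
transport = paramiko.Transport((HOST, 22))
try:
    transport.connect(username=USERNAME, password=PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(LOCAL_FILE, REMOTE_PATH)
    sftp.close()
finally:
    transport.close()
```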

Support as a Service

Utilise dedicated quant support, pre-configured notebooks and robust documentation on every feature of the BMLL Data Lab to begin deriving insights immediately.

Request a Trial

Explore the full order book and derive insights that provide the clearest view of the market.

USE CASES

  • Optimising algorithms / SORs: Improve the performance of execution algorithms and smart order routers (SORs) by utilising high-quality historical datasets within a scalable environment to identify market trends and backtest models.
  • Execution analysis: Demonstrate best execution by utilising detailed and complete historical datasets, including trade condition codes and flags, as well as pre-computed analytics that provide statistics on the behaviour of individual securities and entire markets.
  • Risk management: Effectively backtest models, utilising a complete view of market behaviour across time and region, ensuring your models perform as expected against evolving market conditions.
  • Compliance / surveillance: Utilise Level 3 Data to monitor the full lifecycle of every order and better detect and investigate market abuse such as spoofing, layering and ramping (see the sketch after this list).
  • Regulatory change: Model and respond to the impact of regulation by utilising datasets and analytics harmonised across time and region, showing how rule changes have affected market behaviour across different regulatory zones.
  • Market microstructure research / thought leadership: Differentiate your offering and position your organisation as a thought leader by combining high-quality data with a scalable research environment to conduct detailed market microstructure analysis and generate unique insights into trading behaviours, liquidity dynamics and order flow.
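As a toy example of the surveillance use case above, the sketch below screens Level 3 events for participants who insert and then cancel nearly all of their quantity without ever trading, a crude spoofing-style signal. The event schema is assumed for illustration, and a production surveillance model would use the full order lifecycle, prices and timing.

```python
import pandas as pd

# Toy Level 3 event log; column names and values are assumptions made for
# this illustration, not BMLL's schema.
events = pd.DataFrame(
    {
        "participant": ["A", "A", "A", "A", "B", "B"],
        "event_type": ["insert", "cancel", "insert", "cancel", "insert", "trade"],
        "size": [5000, 5000, 4000, 4000, 200, 200],
    }
)

# Total quantity inserted, cancelled and traded per participant.
qty = events.pivot_table(
    index="participant", columns="event_type", values="size",
    aggfunc="sum", fill_value=0,
)

# Participants who cancel almost everything they post and never trade are a
# crude first-pass flag for spoofing/layering-style behaviour.
qty["cancel_ratio"] = qty["cancel"] / qty["insert"]
flagged = qty[(qty["cancel_ratio"] > 0.9) & (qty["trade"] == 0)]
print(flagged)
```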