Financial institutions (FIs) face unprecedented challenges. With rising consumer expectations and megatrends such as mobile banking, alternative lenders, and cloud computing, banks are under intense pressure. They need to innovate faster, offer new services, and deploy new applications to serve their customers better. At the same time, data center growth rates have become unsustainable. The need
Realizing a more productive environment for quantitative analysis in the financial services industry
Quantitative analysis refers to the use of mathematical and statistical modeling, measurement, and research to understand and quantify market behaviors. By studying these behaviors, analysts can develop models that predict the prices of financial instruments under different market scenarios. In this whitepaper sponsored by AMD, we present the results of a quantitative analysis based on
In today’s hyper-competitive business environment, data is at the heart of everything we do. Business leaders rely on up-to-date data to better understand their customers and competitors, make better, more informed decisions, and support new business initiatives. This paper examines some of the challenges with existing data pipelines and discusses five considerations for building a
COVID-19 was the first pandemic to emerge in the era of modern bioinformatics. Government and private labs were forced to dramatically scale their compute infrastructure for everything from the development of vaccines and therapeutics to genomic surveillance and variant tracking. Seqera Labs was one of the technology companies in the thick of the COVID battle.
I recently had the opportunity to work with Incorta developing an architecture guide for their Unified Data Analytics Platform. The guide is now live on Incorta’s website. This guide explores the architecture of Incorta’s unified data analytics platform and how it can augment or replace legacy analytics environments while taking a new and innovative approach
In March I worked on this article with folks at Dremio, explaining the challenge of data copies in modern “data lakehouse” environments. An organization’s data is copied for many reasons, including ingesting datasets into data warehouses, creating performance-optimized copies, and building BI extracts for analysis. Unfortunately, data replication, transformation, and movement can result in longer