gord@storytek.com 905-717-1561

HPC

July 12, 2019

Apocalyptic warnings about the risks of AI are hardly new. Stephen Hawking warned that “the development of full AI could spell the end of the human race”(1) and Elon Musk has declared unregulated AI more dangerous than nuclear weapons(2). While computers aren’t out-thinking humans just yet, we’re already confronting the first challenge of the AI age

December 5, 2018

In September 2018, members of our Cabot Partners team had the opportunity to work with talented engineers and industry experts from HPE, Cadence, Marvell, and Arm. We collaborated to develop a whitepaper and conduct preliminary benchmarks showing how new Cadence tools optimized for multi-core systems benefit from Marvell ThunderX2 Arm-based systems such as the

December 5, 2018

Carrots and sticks: How new challenges are bringing fresh opportunities for HPC, data analytics, and AI

This Cabot Partners article, sponsored by IBM, was published at HPCwire at this URL. This article is identical, but graphics are presented in higher resolution.

For decades, banks have relied on high-performance computing (HPC). When it comes to

June 18, 2016

Cabot Partners – Software Defined Infrastructure in Life Sciences

Life sciences research is advancing at a rapid pace. New techniques such as next-generation sequencing (NGS) are playing a vital role in growing scientific knowledge, facilitating the development of targeted drugs, and delivering personalized healthcare. By investigating the human genome, and studying it in the

May 9, 2015

Realizing a more cost-efficient elastic infrastructure

Most of us recall the notion of elasticity from Economics 101. Markets are about supply and demand, and when there is an abundance of supply, prices usually go down. Elasticity is a measure of how responsive one economic variable is to another, and in an elastic market, the response

April 3, 2015

Parallelizing R with BatchJobs – An example using k-means

Gord Sissons, Feng Li

Many simulations in R are long running. Analysis of statistical algorithms can generate workloads that run for hours, if not days, tying up a single computer. Given the amount of time R programmers can spend waiting for results, getting acquainted with parallelism makes