Recently I had the opportunity to work on an interesting tutorial for Seqera Labs, explaining how to adapt a machine learning inference pipeline to use an AWS Batch GPU environment provisioned using Nextflow Tower. The example uses Stable Diffusion – an open-source text-to-image model – to generate a variable number of images and
COVID-19 was the first pandemic to emerge in the era of modern bioinformatics. Government and private labs were forced to dramatically scale their compute infrastructure for everything from the development of vaccines and therapeutics to genomic surveillance and variant tracking. Seqera Labs was one of the technology companies in the thick of the COVID battle.
I recently worked on a whitepaper for Dremio, funded by Amazon Web Services, on the topic of data analytics. You can download the paper at TDWI – Building a Modern Architecture for Interactive Analytics on Amazon S3 Using Dremio | Transforming Data with Intelligence (tdwi.org)
Apocalyptic warnings about the risks of AI are hardly new. Stephen Hawking warned that “the development of full AI could spell the end of the human race”(1), and Elon Musk has declared unregulated AI more dangerous than nuclear weapons(2). While computers aren’t out-thinking humans just yet, we’re already confronting the first challenge of the AI age
Carrots and sticks: How new challenges are bringing fresh opportunities for HPC, data analytics, and AI This Cabot Partners article sponsored by IBM was published at HPCwire at this URL. This article is identical, but graphics are presented in higher resolution. ————- For decades, banks have relied on high-performance computing (HPC). When it comes to
Cabot Partners – Software Defined Infrastructure in Life Sciences Final Life sciences research is advancing at a rapid pace. New techniques such as next-generation sequencing (NGS) are playing a vital role in growing scientific knowledge, facilitating the development of targeted drugs, and delivering personalized healthcare. By investigating the human genome and studying it in the
Realizing a more cost-efficient elastic infrastructure Most of us recall the notion of elasticity from Economics 101. Markets are about supply and demand, and when there is an abundance of supply, prices usually go down. Elasticity is a measure of how responsive one economic variable is to another, and in an elastic market, the response
Parallelizing R with BatchJobs – An example using k-means Gord Sissons, Feng Li Many simulations in R are long running. Analysis of statistical algorithms can generate workloads that run for hours if not days, tying up a single computer. Given the amount of time R programmers can spend waiting for results, getting acquainted with parallelism makes
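The core idea of that post – farming independent k-means restarts out to parallel workers and keeping the best result – can be sketched in Python using the standard library's multiprocessing pool. This is a hypothetical analogue, not the R/BatchJobs code from the original article: the tiny Lloyd's-iteration k-means below is written inline so the sketch stays self-contained, where a real workload would call an optimized implementation.

```python
# Sketch: run many k-means restarts in parallel and keep the lowest-inertia
# result. Each restart is independent, so this parallelizes trivially --
# the same pattern BatchJobs uses to distribute R jobs across workers.
import random
from multiprocessing import Pool

def kmeans_run(args):
    """One k-means run from a random start; returns (inertia, centers)."""
    points, k, seed = args
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # random initial centers
    for _ in range(25):                      # fixed number of Lloyd iterations
        clusters = [[] for _ in range(k)]
        for p in points:                     # assign each point to nearest center
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        centers = [                          # recompute centers as cluster means
            (sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
            if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    inertia = sum(min((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in centers)
                  for p in points)
    return inertia, centers

if __name__ == "__main__":
    rng = random.Random(0)
    # Synthetic data: two well-separated 2-D blobs.
    points = ([(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(100)]
              + [(rng.gauss(5, 0.5), rng.gauss(5, 0.5)) for _ in range(100)])
    # Eight restarts with different seeds, run across worker processes.
    with Pool() as pool:
        results = pool.map(kmeans_run, [(points, 2, s) for s in range(8)])
    best_inertia, best_centers = min(results, key=lambda r: r[0])
    print(f"best inertia: {best_inertia:.1f}")
```

Because restarts share no state, wall-clock time drops roughly linearly with the number of workers – the same economics that make a batch scheduler attractive for long-running R simulations.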
New tooling built into BigInsights for Hadoop simplifies landing and processing raw Twitter data. If you blinked you might have missed it, but back in November 2014, IBM announced our Hadoop-in-the-cloud service called BigInsights for Hadoop. Developers can get started with free versions of BigInsights, including IBM Analytics for Hadoop on bluemix.net