According to their web site, Pervasive DataRush has a simple mission: Simplify How You Process and Analyze Big Data. The company's product is a parallel dataflow platform that eliminates performance bottlenecks in big data preparation and analytics.
Pervasive DataRush recently set a remarkable performance record on an SGI Altix system with 384 cores. Benchmarks of this type are measured in CUPS (cell updates per second), and the company achieved nearly one TeraCUPS of performance on the Smith-Waterman algorithm, which is popular in bioinformatics.
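To see where the CUPS metric comes from, here is a minimal sketch of Smith-Waterman local alignment (not Pervasive's implementation; a plain single-threaded version with assumed scoring parameters). Aligning sequences of lengths m and n requires filling an m-by-n matrix, so each matrix cell filled is one "cell update" — a TeraCUPS means roughly 10^12 of these updates per second.

```python
def smith_waterman(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local-alignment score between sequences a and b.

    Fills a (len(a)+1) x (len(b)+1) dynamic-programming matrix H;
    each assignment to H[i][j] is one cell update (the "CU" in CUPS).
    Scoring parameters here are illustrative assumptions.
    """
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            # Score for extending a diagonal (match/mismatch) alignment.
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores never drop below zero.
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACGT", "ACGT"))  # 8: four matches at +2 each
print(smith_waterman("AAA", "GGG"))    # 0: no local similarity
```

Because every cell depends only on its left, upper, and upper-left neighbors, the anti-diagonals of the matrix can be computed in parallel — which is the kind of fine-grained parallelism a dataflow engine can spread across hundreds of cores.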
In this podcast, I catch up with Davin Potts to talk about how Pervasive DataRush leverages the parallel processing capabilities of multicore processors and SMP systems to deliver extreme performance on big data. Download the podcast or subscribe on iTunes.