Large-scale simulations have been used to probe phenomena such as water transport in nanotubes and blood flow in microfluidic devices at spatiotemporal resolutions that are not yet accessible experimentally. However, the results of these simulations are subject to errors that are inherent at every stage of a computation. We are fortunate that data obtained by advanced experimental modalities now give us the capability to address such errors. But how should we integrate experimental data and simulations? I will argue that such integration goes well beyond the classical realm of validation, and I will propose a Bayesian framework for quantifying the uncertainty in simulations. I will address issues such as embedding this HPC framework in massively parallel architectures, and I will discuss the ramifications of our findings for existing and future studies in applications such as mass and heat transport in carbon nanotubes and high-throughput blood-sorting microfluidic devices.
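To make the Bayesian viewpoint concrete, the sketch below shows the basic mechanics of calibrating an uncertain simulation parameter against noisy experimental measurements via Bayes' theorem. It is a minimal illustration only: the `forward_model`, the parameter values, and the noise level are hypothetical placeholders standing in for an actual large-scale simulation, and this is not the specific framework or the applications described above.

```python
import numpy as np

# Hypothetical forward model: maps an uncertain simulation parameter theta
# to a predicted observable (e.g., a flow rate). In practice this would be
# the large-scale simulation itself; here it is a simple stand-in.
def forward_model(theta):
    return 2.0 * theta + 1.0

# Synthetic "experimental" data with a known noise level (illustrative only).
rng = np.random.default_rng(0)
theta_true, sigma = 1.5, 0.2
data = forward_model(theta_true) + sigma * rng.normal(size=10)

# Gaussian prior N(1, 1) over theta, evaluated on a grid.
theta_grid = np.linspace(0.0, 3.0, 601)
log_prior = -0.5 * ((theta_grid - 1.0) / 1.0) ** 2

# Gaussian likelihood p(data | theta) for each grid point.
residuals = data[None, :] - forward_model(theta_grid)[:, None]
log_like = -0.5 * np.sum((residuals / sigma) ** 2, axis=1)

# Posterior via Bayes' theorem, normalized numerically on the grid.
log_post = log_prior + log_like
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, theta_grid)

# Posterior mean and standard deviation quantify the remaining uncertainty
# in the simulation parameter after assimilating the experimental data.
mean = np.trapz(theta_grid * post, theta_grid)
std = np.sqrt(np.trapz((theta_grid - mean) ** 2 * post, theta_grid))
print(f"posterior mean ~ {mean:.3f}, std ~ {std:.3f}")
```

In a realistic setting each likelihood evaluation requires a full simulation, which is why embedding such a framework in massively parallel architectures becomes a central concern.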