This is a Preprint and has not been peer reviewed. The published version of this Preprint is available: https://doi.org/10.1016/j.advwatres.2018.11.016. This is version 4 of this Preprint.
Abstract
Flow and transport in porous media are driven by pore-scale processes. Particle tracking in transparent porous media allows these processes to be observed at millisecond time scales. We demonstrate an application of defocusing particle tracking using brightfield illumination and a CMOS camera sensor. The resulting images have relatively high noise levels. To address this challenge, we propose a new calibration for locating particles in the out-of-plane direction. The methodology extracts features of particle images by fitting generalized Gaussian distributions to them; the resulting fitting parameters are then linked to the out-of-plane coordinates of the particles using flexible machine learning tools. A workflow is presented that shows how to generate a training dataset of fitting parameters paired with known out-of-plane locations. Several regression models were tested on the resulting training dataset, of which a boosted regression tree ensemble produced the lowest cross-validation error. The efficacy of the proposed methodology was then examined in a laminar channel flow in a large measurement volume of 2048 μm × 1152 μm × 3000 μm (length × width × depth). The size of the test domain reflects the representative elementary volume of many fluid flow phenomena in porous media. Such large measurement depths require the collection of images at different focal levels; we acquired images at 21 focal levels spaced 150 μm apart. The error in predicting the out-of-plane location within a single slice of 240 μm thickness was 7 μm, while in-plane locations were determined with sub-pixel resolution (below 0.8 μm). The mean relative error of the velocity measurement was obtained by comparing the experimental results to an analytic model of the flow. The estimated displacement errors in the axial direction of the flow were 0.21 pixel and 0.22 pixel at flow rates of 1.0 mL/h and 2.5 mL/h, respectively.
These results demonstrate that it is possible to conduct three-dimensional particle tracking in a representative elementary volume based on a simple apparatus comprising a microscope with standard brightfield illumination and a camera with CMOS sensor.
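As a rough illustration of the calibration idea described above (not the authors' implementation), the sketch below fits a radial generalized Gaussian to synthetic defocused particle images and trains a boosted regression tree ensemble (here scikit-learn's `GradientBoostingRegressor`) to map the fitted parameters to a known out-of-plane coordinate. The intensity model, the blur-versus-depth relation in `synthetic_particle`, and all parameter values are assumptions chosen only to make the example self-contained.

```python
import numpy as np
from scipy.optimize import curve_fit
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

def gen_gaussian(r, A, alpha, beta, offset):
    """Radial generalized Gaussian: I(r) = A * exp(-(r/alpha)**beta) + offset."""
    return A * np.exp(-(r / alpha) ** beta) + offset

def synthetic_particle(z, size=33, noise=0.05):
    """Assumed defocus model: blur radius grows and the profile flattens with z (in μm)."""
    alpha = 3.0 + 0.02 * z
    beta = 2.0 - 0.003 * z
    c = size // 2
    yy, xx = np.mgrid[:size, :size]
    r = np.hypot(xx - c, yy - c)
    return gen_gaussian(r, 1.0, alpha, beta, 0.1) + noise * rng.standard_normal((size, size))

def fit_features(img):
    """Fit the radial intensity profile; the fitted (A, alpha, beta) serve as features."""
    c = img.shape[0] // 2
    yy, xx = np.mgrid[:img.shape[0], :img.shape[1]]
    r = np.hypot(xx - c, yy - c).ravel()
    p, _ = curve_fit(gen_gaussian, r, img.ravel(),
                     p0=[1.0, 3.0, 2.0, 0.1],
                     bounds=([0.0, 0.1, 0.1, -1.0], [10.0, 50.0, 10.0, 1.0]))
    return p[:3]

# Training data: particle images rendered at known focal offsets. A one-sided
# z-range is used because a symmetric blur alone cannot distinguish the sign
# of the defocus (in practice, e.g., astigmatic optics break that symmetry).
z_train = np.linspace(0.0, 300.0, 61)
X_train = np.array([fit_features(synthetic_particle(z)) for z in z_train])

model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X_train, z_train)

# Predict the depth of an unseen particle image.
z_hat = model.predict(fit_features(synthetic_particle(137.0)).reshape(1, -1))[0]
```

In an actual calibration, as in the workflow sketched in the abstract, the training images would be recorded at known stage positions, and cross-validation would be used to select among candidate regressors.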
DOI
https://doi.org/10.31223/osf.io/djy8n
Subjects
Artificial Intelligence and Robotics, Computer Sciences, Earth Sciences, Engineering, Fluid Dynamics, Hydrology, Life Sciences, Physical Sciences and Mathematics, Physics
Keywords
machine learning, astigmatic, feature extraction, particle tracking velocimetry, piv, ptv
Dates
Published: 2018-11-20 09:06
Last Updated: 2019-05-01 10:15