Summary: | Measurement issues affecting the acquisition of artifact-free shock wave pressure-time profiles are discussed. We address the importance of in-house sensor calibration and of the data-acquisition sampling rate. In-house calibration accounts for possible differences between the calibration methodology used at the manufacturing facility and that used in a specific laboratory. We found that the in-house calibration factors of brand-new sensors differ by less than 10% from the manufacturer-supplied values. Larger differences were observed for sensors that had been used in hundreds of experiments, reaching 30% for sensors close to the end of their useful lifetime. These deviations occurred even though typical overpressures in our experiments do not exceed 50 psi, whereas the sensors are rated to a maximum pressure of 1,000 psi. We demonstrate that a sampling rate of 1,000 kHz is necessary to capture correct rise-time values, whereas peak overpressure and impulse values for low-intensity shock waves (Mach number < 2) showed no statistically significant differences at lower rates. We discuss two sources of experimental error, mechanical vibration and electromagnetic interference, and their effect on the quality of waveforms recorded with state-of-the-art high-frequency pressure sensors. Preventive measures, examples of pressure-acquisition artifacts, and guidance on data interpretation are provided to help the community at large avoid these mistakes. To facilitate inter-laboratory data comparison, common reporting standards should be developed by the blast TBI research community. We noticed that the majority of the published literature limits reporting to peak overpressure, with much less attention directed toward other important parameters, i.e., duration, impulse, and dynamic pressure.
These parameters should be reported as a mandatory requirement in publications so that results can be properly compared across studies.
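The sampling-rate finding can be illustrated with a minimal numerical sketch: an idealized Friedlander-type waveform with a microsecond-scale rise is sampled at several rates, and the 10-90% rise time is estimated from the samples. The waveform parameters below (50 psi peak, 2 µs rise, 2 ms positive-phase duration) are illustrative assumptions, not values taken from the study; the point is only that a coarse sample grid cannot resolve a rise that occurs between samples.

```python
import numpy as np

def friedlander(t, p0=50.0, t_rise=2e-6, t_dur=2e-3):
    """Idealized shock waveform: linear rise over t_rise seconds,
    then a Friedlander decay. Pressures in psi, times in seconds.
    All parameter values are illustrative assumptions."""
    return np.where(
        t < t_rise,
        p0 * t / t_rise,
        p0 * np.exp(-(t - t_rise) / t_dur) * (1.0 - (t - t_rise) / t_dur),
    )

def rise_time(t, p):
    """10-90% rise time estimated from sampled data: the interval
    between the first samples exceeding 10% and 90% of the peak."""
    peak = p.max()
    i10 = np.argmax(p >= 0.1 * peak)
    i90 = np.argmax(p >= 0.9 * peak)
    return t[i90] - t[i10]

# Resample the same waveform at decreasing rates; only the highest
# rate places enough samples on the front to resolve the rise.
for fs in (1000e3, 250e3, 50e3):
    t = np.arange(0.0, 5e-3, 1.0 / fs)
    p = friedlander(t)
    print(f"{fs / 1e3:6.0f} kHz: rise time = {rise_time(t, p) * 1e6:.2f} us, "
          f"peak = {p.max():.1f} psi")
```

At 1,000 kHz the estimator lands samples on the rising front; at 50 kHz the entire rise falls between two adjacent samples, so the apparent rise time collapses to zero while the peak value changes only slightly, consistent with the observation that low rates distort rise time more than peak overpressure.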