The landscape of modern experimental physics is best conceived through the set of experimental tools that physicists use to interrogate space and matter. Historically, advances in instrumentation have been as significant as theoretical breakthroughs, because the ability to perform new experiments allows scientists to replace speculation with experimental proof. These advances are not always in hardware. Over the last decade, analysis software tailored to individual experimental techniques has been a driving factor in bringing new analysis approaches into physical spectroscopies. Across techniques as diverse as ptychography, astrophysical imaging, cryo-electron microscopy, and various super-resolution and nonlinear microscopies, instrumentation improvements require the integration of statistically sophisticated approaches to data analysis and acquisition.
Scientific data acquisition is a problem domain that has been underserved by its computational tools, despite the need to use hardware efficiently, to guarantee the validity of the recorded data, and to rapidly test ideas by configuring experiments quickly and inexpensively. High-dimensional physical spectroscopies, such as angle-resolved photoemission spectroscopy, make these issues especially apparent because, while they use expensive instruments to record large data volumes, they require only simple acquisition planning. The burden of writing data acquisition software falls to scientists, who are not typically trained to write maintainable software. In this paper, we introduce AutodiDAQt to address these shortfalls in the scientific ecosystem. AutodiDAQt addresses the essential needs of scientific data acquisition by providing simple concurrency, reproducibility, retrospection of the acquisition sequence, and automated user interface generation.
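As a concrete illustration of this declarative style, consider the following minimal, library-agnostic sketch: scan axes are declared once as data, and both the scan sequence and a description of user interface fields are derived from that single declaration. All names here are illustrative assumptions and do not reflect AutodiDAQt's actual API.

```python
# Library-agnostic sketch of declarative acquisition: the experiment is
# described as data (scan axes), and the scan plan and a UI field list
# are derived from that one description. Names are illustrative only.
from dataclasses import dataclass
from itertools import product

@dataclass
class ScanAxis:
    name: str
    start: float
    stop: float
    n_points: int

    def points(self):
        step = (self.stop - self.start) / (self.n_points - 1)
        return [self.start + i * step for i in range(self.n_points)]

# Declaring the experiment: two motor axes for a hypothetical ARPES scan.
axes = [ScanAxis("polar_angle", -15.0, 15.0, 7),
        ScanAxis("photon_energy", 20.0, 30.0, 3)]

# A UI generator can walk the same declaration to build input widgets,
# so the interface never drifts out of sync with the acquisition code.
for axis in axes:
    print(f"[UI field] {axis.name}: {axis.start}..{axis.stop} in {axis.n_points} steps")

# The scan plan is the Cartesian product of the declared axes.
for coordinate in product(*(axis.points() for axis in axes)):
    pass  # move motors to `coordinate`, then trigger detectors and record
```

Because the axes exist as plain data rather than imperative motor-stepping code, the same declaration can also be logged verbatim, which is what makes retrospection and reproducibility of the acquisition sequence cheap.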
To ground the discussion, we demonstrate its merits for angle-resolved photoemission spectroscopy and high-bandwidth spectroscopies. Finally, we discuss how AutodiDAQt enables a future of highly efficient machine-learning-in-the-loop and analysis-driven experiments, without requiring data acquisition domain expertise, by using analysis code for external data acquisition planning.
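To make the analysis-in-the-loop idea concrete, the following self-contained sketch lets an analysis routine, rather than the acquisition code, choose each successive measurement point, refining sampling where the signal changes fastest. The functions are hypothetical stand-ins for a real motor move and detector read; this is a sketch of the concept, not AutodiDAQt's API.

```python
# Sketch of analysis-driven acquisition: analysis code decides which
# point to measure next. `measure` is a stand-in for a real instrument.
import math

def measure(x: float) -> float:
    """Stand-in for a real measurement at position x."""
    return math.exp(-((x - 0.3) ** 2) / 0.02)  # a hidden spectral peak

def next_point(samples: dict) -> float:
    """Analysis step: bisect the interval whose endpoint values differ
    most, i.e. refine sampling where the signal is changing fastest."""
    xs = sorted(samples)
    a, b = max(zip(xs, xs[1:]),
               key=lambda ab: abs(samples[ab[1]] - samples[ab[0]]))
    return (a + b) / 2

samples = {x: measure(x) for x in (0.0, 0.5, 1.0)}  # coarse initial scan
for _ in range(20):  # fixed acquisition budget
    x = next_point(samples)
    samples[x] = measure(x)

print(f"{len(samples)} points acquired, densest near the peak at x = 0.3")
```

The division of labor is the point: the scientist writes `next_point` in ordinary analysis code, while the acquisition framework handles moving hardware, recording data, and logging the sequence of requested points.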