Minisymposium Presentation
Ptychography with Extreme Data
Description
X-ray microscopy via ptychographic imaging has grown rapidly over the last decade. Where early experiments could only be performed by a few expert groups with their own specialized software implementations, ptychographic experiments can nowadays be carried out at many beamlines, and various open-source software packages are being used and refined by collaborations that span individual groups. With the increased availability and widespread application of X-ray ptychography, the initial challenge of "making it work robustly" has been overcome. The new challenges are all connected to every user's wish to image more, meaning both faster and larger samples. On the experimental side, these wishes are being addressed by pushing towards beamlines with more photons that employ bigger and faster detectors. On the reconstruction side, this push towards ever more extreme data drives the implementations of the reconstruction algorithms to their limits. Hardware architecture, suitable algorithm design, and implementation start to dictate which datasets can and cannot be reconstructed. In this talk the problem of "extreme data" in ptychography is dissected, current computational limits are explored, and various ideas from the community on tackling these issues are presented and discussed.
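To give a sense of scale, the back-of-envelope sketch below estimates the raw diffraction-data volume of a single large ptychographic scan. The detector size, number of scan positions, and bytes per pixel are assumed, purely illustrative values, not figures from the talk; the point is only that such a scan easily exceeds the memory of a single GPU or workstation.

```python
# Illustrative estimate of raw data volume for one ptychographic scan.
# All parameters below are assumptions chosen for illustration only.

n_positions = 100_000          # scan points in a dense 2D scan (assumed)
det_shape = (2048, 2048)       # detector frame size in pixels (assumed)
bytes_per_pixel = 4            # e.g. 32-bit counts per pixel (assumed)

frame_bytes = det_shape[0] * det_shape[1] * bytes_per_pixel
scan_bytes = n_positions * frame_bytes

print(f"single frame : {frame_bytes / 1e6:.1f} MB")   # ~16.8 MB
print(f"full scan    : {scan_bytes / 1e12:.2f} TB")   # ~1.68 TB

# Under these assumptions a single scan already holds well over a terabyte
# of diffraction data, far more than typical GPU memory, so reconstructions
# must be chunked, streamed, or distributed across multiple nodes.
```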