
Bug: RSoXS scans using Greateyes WAXS camera encounter timeout error for large number of images per energy #33

@PriyankaKetkarBNL

Description


Background:

For RSoXS energy scans, images were being chunked at the scan level such that all images taken at a single energy were gathered into a single chunk. These chunks defined the minimum amount of data that would be downloaded from Tiled. For scans in which a large number of images were captured at a single energy, timeout errors would occur when attempting to access data using Tiled.
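To make the scale of the problem concrete, here is a back-of-the-envelope sketch. The image count per energy is an assumed, illustrative number; the 1024 x 1026 pixel size and 16-bit depth are assumptions about the Greateyes WAXS camera frames:

```python
# Illustrative numbers: 200 images at a single energy,
# each 1024 x 1026 pixels of 16-bit counts.
n_images = 200
height, width = 1024, 1026
bytes_per_pixel = 2

# Old chunking: every image taken at one energy in a single chunk,
# so the smallest unit Tiled can serve is the whole stack.
old_chunk_bytes = n_images * height * width * bytes_per_pixel

# New chunking: one image per chunk.
new_chunk_bytes = height * width * bytes_per_pixel

print(f"old chunk: {old_chunk_bytes / 1e6:.0f} MB")  # ~420 MB
print(f"new chunk: {new_chunk_bytes / 1e6:.1f} MB")  # ~2.1 MB
```

With the old scheme, even a request for a single frame forces the server to assemble and ship the full per-energy stack, which is consistent with the timeouts appearing only for scans with many images per energy.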

For those with access, details can be found in these threads:


Troubleshooting and draft developments:

Changes have been drafted in a branch; specifically, this commit sets the chunk size such that each chunk contains only a single 1024 x 1026 pixel image.

For prior scans, metadata can be updated retroactively to correct the chunk size and eliminate the timeout error.

import copy

from tiled.client import from_profile

catalog = from_profile("rsoxs")
run = catalog[94007]  # Example scan in which metadata was retroactively corrected

# Copy the existing descriptors so the chunk spec can be rewritten.
old_descriptors = copy.copy(run.primary.metadata['descriptors'])

for desc in old_descriptors:
    if 'Wide Angle CCD Detector_image' in desc['data_keys']:
        # One image per chunk: 1 along the two leading axes,
        # full extent (-1) along the two image axes.
        desc['data_keys']['Wide Angle CCD Detector_image']['chunks'] = (1, 1, -1, -1)

run.primary.update_metadata({'descriptors': old_descriptors})
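For anyone unfamiliar with the (1, 1, -1, -1) notation: it follows the dask chunking convention, where -1 means "the full extent of that axis." A small self-contained sketch (the array shape is illustrative, not taken from a real scan):

```python
import dask.array as da
import numpy as np

# Illustrative 4-D stack: 5 steps x 1 frame x 1024 x 1026 pixels.
arr = da.from_array(
    np.zeros((5, 1, 1024, 1026), dtype=np.uint16),
    chunks=(1, 1, -1, -1),  # one image per chunk, image axes kept whole
)
print(arr.chunksize)  # (1, 1, 1024, 1026)
```

Each chunk therefore holds exactly one full image, which is the granularity the drafted commit targets.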

Action items and questions

The changes detailed above would only work if the Greateyes camera images always have 4 dimensions. Historically, this has always been the case, and I can't immediately think of a reason we would have non-4D images, but I want to check whether any planned scan features might change this.
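If non-4D images ever do appear, one way to future-proof the fix is to derive the chunk spec from the array's dimensionality instead of hard-coding (1, 1, -1, -1). A minimal sketch, assuming the final two axes are always the image height and width (the helper name is hypothetical):

```python
def chunk_spec(ndim):
    """One frame per chunk along leading axes; image axes kept whole.

    Assumes the final two axes are the image height/width, so this
    works for any ndim >= 2 rather than only the 4-D case.
    """
    if ndim < 2:
        raise ValueError("expected at least 2 dimensions")
    return (1,) * (ndim - 2) + (-1, -1)

print(chunk_spec(4))  # (1, 1, -1, -1)
print(chunk_spec(3))  # (1, -1, -1)
```

The ndim could be read from the descriptor's existing 'shape' entry for each data key, so the retroactive update would not need to assume 4 dimensions at all.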

Scan ID 94007 is an example of a scan that originally had a timeout error, but the error was eliminated after retroactively updating the metadata. I can generate more test scans if needed.

My understanding is that these changes can be rolled out without coordination with Tiled or PyHyperScattering, though it would be good to inform the user community to avoid accessing data during the retroactive metadata update. Would the retroactive metadata updates impact the automated tests in PyHyperScattering? Am I missing anything else?
