Ewoks tasks

DahuJob

Ewoks task that runs a Dahu job.

Optional inputs:

- dahu_url (str): Tango URL of the Dahu device.
- dahu_parameter_file (str): Dahu parameter file path (can be relative to config_directory).
- extra_dahu_parameters (dict): Overwrite Dahu parameters.
- config_directory (str): Directory of the Dahu parameter file and other config files.
- timeout (float): Timeout in seconds when waiting for the Dahu job (Default: 3600).
- nobackup (bool): Save in the NOBACKUP directory (Default: False).
- dahu_job_index (int): Dahu job index for ordering results (Default: 0).

Outputs:

- dahu_job_index (int): Dahu job index for ordering results.
- dahu_job_id (Optional[int]): Dahu job id (None means no Dahu job was executed).
- dahu_result (Optional[dict]): Dahu result (None means no Dahu job was executed).

Identifier:

ewoksbm29.tasks.base.dahu.DahuJob

Task type:

class

Required inputs:

Optional inputs:

config_directory, dahu_job_index, dahu_parameter_file, dahu_url, extra_dahu_parameters, nobackup, timeout

Outputs:

dahu_job_id, dahu_job_index, dahu_result
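As a minimal sketch, a DahuJob node could be declared in an ewoks workflow graph by its identifier. This assumes the generic ewoks graph representation (nodes with task_type/task_identifier and default_inputs); the node id and all input values below are hypothetical placeholders, not real beamline settings.

```python
# Hypothetical workflow node referencing the DahuJob task by its identifier.
# The Tango URL, file names, and paths are illustrative placeholders.
dahu_job_node = {
    "id": "dahujob",
    "task_type": "class",
    "task_identifier": "ewoksbm29.tasks.base.dahu.DahuJob",
    "default_inputs": [
        {"name": "dahu_url", "value": "tango://host:10000/saxs/dahu/1"},
        {"name": "dahu_parameter_file", "value": "integrate.json"},
        {"name": "config_directory", "value": "/path/to/config"},
        {"name": "timeout", "value": 3600.0},
        {"name": "nobackup", "value": False},
    ],
}

graph = {"graph": {"id": "dahu_demo"}, "nodes": [dahu_job_node], "links": []}
```

Such a graph could then be handed to an ewoks execution function (e.g. ewoks.execute_graph) in an environment where ewoksbm29 and a Dahu Tango server are available.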

DahuJobWithIspybUpload

Ewoks task that runs a Dahu job and uploads the results to ISPyB.

In addition to the inputs from DahuJob:

Optional inputs:

- ispyb_metadata (dict): Scan metadata (see ISPyBMetadata).
- ispyb_url (str): WSDL end-point of the ISPyB SOAP service.

Identifier:

ewoksbm29.tasks.base.dahu_ispyb.DahuJobWithIspybUpload

Task type:

class

Required inputs:

Optional inputs:

config_directory, dahu_job_index, dahu_parameter_file, dahu_url, extra_dahu_parameters, ispyb_metadata, ispyb_url, nobackup, timeout

Outputs:

dahu_job_id, dahu_job_index, dahu_result

DahuHplcSummary

Rebuild the complete chromatogram with basic analysis.

In addition to the inputs from DahuJobWithIspybUpload:

Required inputs:

- integrated_files (List[str]): Azimuthally integrated SAXS data.

Optional inputs:

- hplc_summary_parameters (dict): Extra HPLC summary parameters (see HplcSummaryParameters).

Identifier:

ewoksbm29.tasks.hplc_summary.DahuHplcSummary

Task type:

class

Required inputs:

integrated_files

Optional inputs:

config_directory, dahu_job_index, dahu_parameter_file, dahu_url, extra_dahu_parameters, hplc_summary_parameters, ispyb_metadata, ispyb_url, nobackup, timeout

Outputs:

dahu_job_id, dahu_job_index, dahu_result

DahuIntegrate

Azimuthal integration of BM29 SAXS data.

In addition to the inputs from DahuJobWithIspybUpload:

Required inputs:

- scan_data_slice (dict): Subset of parameters related to the scan data (see IntegrateParameters).

Optional inputs:

- integrate_parameters (dict): Extra integrate parameters (see IntegrateParameters).

Identifier:

ewoksbm29.tasks.integrate.DahuIntegrate

Task type:

class

Required inputs:

scan_data_slice

Optional inputs:

config_directory, dahu_job_index, dahu_parameter_file, dahu_url, extra_dahu_parameters, integrate_parameters, ispyb_metadata, ispyb_url, nobackup, timeout

Outputs:

dahu_job_id, dahu_job_index, dahu_result

ReadScanDataSlice

Return the subset of IntegrateParameters for the SAXS scan index range covered by the Lima file with index lima_file_index.

Optional inputs (online & offline):

- lima_file_index (int): Lima file index that determines the scan slice.
- dahu_to_counter_name (dict): Map counter names to IntegrateParameters keys.
- storage_ring_current (float): Storage ring current in mA.
- exposure_time (float): Scan point exposure time in s.

Optional inputs (online):

- scan_key (str): Blissdata scan key (falls back to offline mode on failure).
- retry_timeout (float): Timeout when trying to access the Lima image dataset or waiting for the scan to be PREPARED.
- retry_period (float): Period of the retry loops.

Optional inputs (offline):

- scan_file_path (str): Bliss scan file name.
- scan_number (str): Bliss scan number.

Outputs:

- scan_data_slice (dict): Subset of IntegrateParameters related to the scan data.
- has_data (bool): scan_data_slice contains a non-empty scan data slice.
- lima_file_index (int): The Lima file index to which scan_data_slice belongs.
- next_lima_file_index (int): The next Lima file index.

Identifier:

ewoksbm29.tasks.read.ReadScanDataSlice

Task type:

class

Required inputs:

Optional inputs:

dahu_to_counter_name, exposure_time, lima_file_index, retry_period, retry_timeout, scan_file_path, scan_key, scan_number, storage_ring_current

Outputs:

has_data, lima_file_index, next_lima_file_index, scan_data_slice
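A sketch of how these outputs might be wired to DahuIntegrate in an ewoks graph, gated on has_data. This assumes the generic ewoks link format (data_mapping and conditions); the node ids are hypothetical.

```python
# Hypothetical link: pass ReadScanDataSlice's scan_data_slice output to
# DahuIntegrate's required scan_data_slice input, but only when a non-empty
# slice was actually read (has_data is True).
link = {
    "source": "read_slice",
    "target": "integrate",
    "data_mapping": [
        {"source_output": "scan_data_slice", "target_input": "scan_data_slice"},
    ],
    "conditions": [{"source_output": "has_data", "value": True}],
}
```

The next_lima_file_index output could be fed back, in the same manner, into the lima_file_index input of the next ReadScanDataSlice execution to iterate over successive Lima files.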

AccumulateDahuJobResults

Accumulate Dahu job results with the job index as a key.

Required inputs:

- dahu_job_index (int): Dahu job index for ordering results.
- dahu_job_id (Optional[int]): Dahu job id (None means no Dahu job was executed).
- dahu_result (Optional[dict]): Dahu result (None means no Dahu job was executed).

Outputs:

- dahu_results (Dict[int, dict]): Adds the Dahu result to the results from previous executions within the same workflow.

Identifier:

ewoksbm29.tasks.results.AccumulateDahuJobResults

Task type:

class

Required inputs:

dahu_job_id, dahu_job_index, dahu_result

Optional inputs:

Outputs:

dahu_results
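A sketch of how a Dahu job's bookkeeping outputs might be forwarded into this accumulator, again assuming the generic ewoks link format; the node ids are hypothetical. The helper below is a pure-Python illustration of the described accumulation semantics, not the task's actual implementation.

```python
# Hypothetical link forwarding a Dahu job's outputs into the accumulator.
link = {
    "source": "integrate",
    "target": "accumulate",
    "data_mapping": [
        {"source_output": "dahu_job_index", "target_input": "dahu_job_index"},
        {"source_output": "dahu_job_id", "target_input": "dahu_job_id"},
        {"source_output": "dahu_result", "target_input": "dahu_result"},
    ],
}


def accumulate(dahu_results, dahu_job_index, dahu_result):
    """Illustration of the described semantics: results are keyed by their
    job index so that downstream tasks can order them."""
    merged = dict(dahu_results)
    merged[dahu_job_index] = dahu_result
    return merged
```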

DahuSubtract

Subtract average integrated buffer signal from the sample signal.

In addition to the inputs from DahuJobWithIspybUpload:

Required inputs:

- sample_file (str): Azimuthally integrated SAXS data.
- buffer_files (List[str]): Azimuthally integrated SAXS data of the buffer.

Optional inputs:

- subtract_parameters (dict): Extra subtract parameters (see SubtractParameters).

Identifier:

ewoksbm29.tasks.subtract.DahuSubtract

Task type:

class

Required inputs:

buffer_files, sample_file

Optional inputs:

config_directory, dahu_job_index, dahu_parameter_file, dahu_url, extra_dahu_parameters, ispyb_metadata, ispyb_url, nobackup, subtract_parameters, timeout

Outputs:

dahu_job_id, dahu_job_index, dahu_result
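A sketch of a DahuSubtract node showing how its required inputs differ from the base DahuJob: a single sample file plus a list of buffer files. The layout again assumes the generic ewoks graph representation; all paths and the node id are hypothetical placeholders.

```python
# Hypothetical DahuSubtract node; file paths are illustrative only.
subtract_node = {
    "id": "subtract",
    "task_type": "class",
    "task_identifier": "ewoksbm29.tasks.subtract.DahuSubtract",
    "default_inputs": [
        {"name": "sample_file", "value": "/path/to/sample_integrated.h5"},
        # buffer_files is a list: several buffer measurements are averaged
        # before being subtracted from the sample signal.
        {
            "name": "buffer_files",
            "value": [
                "/path/to/buffer1_integrated.h5",
                "/path/to/buffer2_integrated.h5",
            ],
        },
    ],
}
```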