Code Invocation

  • Some of the XRStools capabilities can be accessed by invoking the XRS_swissknife script, providing as input a file in yaml format.

  • To use the wizard, the suggested invocation is

    XRS_wizard  --wroot ~/software/XRStoolsSuperResolution/XRStools/WIZARD/methods/

    the wroot argument tells where extra workflows can be found. In the instruction above we point to the workflows in the home source directory. This is practical because the wizard allows you to edit them online, so that the modifications remain in the sources, and it also lets you access extra workflows that do not come with the main distribution.

  • Depending on the details of your installation, you have the XRS_swissknife script sitting somewhere in a directory. Check the Installation page to see how to set PYTHONPATH and PATH in case of a local installation.

    The following documentation has been generated automatically from the comments found in the code.

GENERALITIES about XRS_swissknife

XRS_swissknife.generality_doc = 1

The script is called in this way

XRS_swissknife yourinput.yaml

The input file is in yaml format. In yaml format each line introduces an item and the indentation expresses the hierarchy. An example is

Fancy_Reduction :
      parameterTom : 3.14
      parameterJerry : False
      choicesBob     : [1,2,3]

In this example we create an item called Fancy_Reduction which contains three subitems.

The XRS_swissknife expects that, for each operation you want to apply, you provide an item named after the operation, with the associated subitems that provide the values of its parameters.

XRS_swissknife will do what you want provided you respect the proper indentation. One thing that helps is using emacs with python mode activated, because python uses the same indentation principle to structure code.
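Since the input is plain yaml, the example above can be inspected with a standard yaml parser. A minimal sketch, assuming PyYAML (which is not necessarily the loader XRS_swissknife itself uses):

```python
import yaml  # PyYAML -- an assumption, not necessarily XRS_swissknife's own loader

text = """
Fancy_Reduction :
      parameterTom : 3.14
      parameterJerry : False
      choicesBob     : [1,2,3]
"""

# The indented subitems become a nested mapping under the "Fancy_Reduction" key.
data = yaml.safe_load(text)
params = data["Fancy_Reduction"]
print(params)   # a plain nested dict: floats, booleans and lists are typed for you
```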

Each processing item has an additional, optional key: active. This key can be set to 0 or 1 to deactivate or activate the processing (the default is 1, active). Here is a deactivation example

Fancy_Reduction :
      active : 0
      parameterTom : 3.14
      parameterJerry : False
      choicesBob     : [1,2,3]
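A driver honouring this flag only needs to treat a missing active key as 1. A minimal sketch (hypothetical, not the actual XRS_swissknife logic):

```python
# Sketch of how a driver might honour the optional "active" key described above
# (default 1 = active). The dict mimics a parsed input file.
parsed = {
    "Fancy_Reduction": {"active": 0, "parameterTom": 3.14},
    "Other_Step": {"parameterX": 1},   # no "active" key: runs by default
}

# Keep only the items whose "active" flag is not 0.
to_run = [name for name, params in parsed.items() if params.get("active", 1)]
print(to_run)   # ['Other_Step']
```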

The following documentation has been created automatically, for each functionality, based on the documentation string written in the code for the functionality itself. You can also write mathematical expressions: \int x \, dx = \frac{x^2}{2}, and even include graphics.

Super Resolution

to fit the optical responses of all the analysers (those you selected a ROI for) and the pixel response, based on a foil scan

embedded doc :

superR_fit_responses :

  foil_scan_address : "demo_foilroi.h5:/ROI_FOIL/foil_scan/scans/Scan273"
  nref : 5             # the number of subdivisions per pixel dimension used to
                       # represent the optical response function at higher resolution
  niter_optical : 100  # the number of iterations used in the optimisation of the
                       # optical response
  beta_optical : 0.1   # the L1 norm factor in the regularisation term
                       # for the optical functions
  pixel_dim : 6        # the pixel response function is represented with a
                       # pixel_dim**2 array
  niter_pixel : 100    # the number of iterations in the pixel response optimisation
                       # phase. A negative number stands for ISTA, a positive one for FISTA
  beta_pixel : 1000.0  # L1 factor for the pixel response regularisation

  ## The used trajectories are always written with the calculated responses.
  ## They can be reloaded and used as initialisation (and frozen with do_refine_trajectory : 0).
  ## Uncomment the following line if you want to reload a set of trajectories;
  ## without this option the trajectories are initialised from the spot drifts.
  ##
  # reload_trajectories_file : "response.h5"

  ## The method first finds an estimation of the foil scan trajectory on each roi,
  ## then, based on this, obtains a fit of the optical response function,
  ## assuming a flat pixel response. Finally the pixel response is optimised.
  ##
  ## There is a final phase where a global optimisation is done in niter_global steps.
  ##
  ## Each step is composed of an optical response fit, followed by a pixel response fit.
  ## If do_refine_trajectory is different from zero, the trajectory is reoptimised at each step.
  ##
  niter_global : 20

  ## If do_refine_trajectory=1 the start and end points of the trajectory are free.
  ## If =2 the start and end points are forced to a trajectory which is obtained
  ## from a reference scan: the foil scan may be short, so one can use the scan of
  ## another object to get one (key trajectory_reference_scansequence_address).
  ##
  do_refine_trajectory : 2

  ## optional: only if do_refine_trajectory = 2
  trajectory_reference_scansequence_address : "demo_newrois.h5:/ROI_FOIL/images/scans/"
  trajectory_threshold : 0.1

  ## set to 1 if the pixel response function is forced to be symmetrical
  simmetrizza : 1

  ## where the found responses are written
  target_file : "demo_responses_bis.h5"
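The beta_* parameters above are L1 regularisation weights, and the comment on niter_pixel mentions ISTA/FISTA. In such schemes the L1 term is typically handled by a soft-thresholding proximal step after each gradient update. A generic sketch of that step (illustrative, not the XRStools code):

```python
import numpy as np

# Illustrative sketch, not the XRStools implementation: in ISTA/FISTA-style
# iterations, an L1 penalty beta * ||x||_1 is applied through its proximal
# operator, which shrinks every coefficient towards zero by beta and clips
# small ones to exactly zero (this is what makes the solution sparse).
def soft_threshold(x, beta):
    """Proximal operator of beta * ||x||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - beta, 0.0)

x = np.array([-2.0, -0.05, 0.0, 0.3, 1.5])
print(soft_threshold(x, 0.1))   # coefficients smaller than beta become zero
```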

to extrapolate the ROIs and the foil scan to a larger extent, so as to cover a larger sample

embedded doc :


This command extends the rois and creates an extrapolated foil scan.

The parameters are as follows

superR_recreate_rois :

  ### we have calculated the responses in responsefilename
  ### and we want to enlarge the scan  by a margin of 3 times
  ### the original scan on the right and on the left 
  ###  ( so for a total of a 7 expansion factor )

  responsefilename :  "responses.h5:/fit"
  nex : 3

  ## the old scan covered by the old rois
  old_scan_address : "../nonregressions/demo_imaging.hdf5:ROI_B/foil_scanXX/scans/Scan273/"

  ## where the new rois and the new scan are written
  target_filename : "newrois.h5:ROI_B_FIT8/"
  filter_rois      : 1
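The comments above say that nex : 3 adds a margin of three scan-widths on each side, for a 7-fold total extent. A one-line sketch of that arithmetic (hypothetical helper name):

```python
# Hypothetical helper illustrating the nex arithmetic described in the comments
# above: a margin of nex scan-widths on each side gives a (2*nex + 1)-fold extent.
def expanded_factor(nex):
    return 2 * nex + 1

print(expanded_factor(3))   # 7, as in the example
```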

to calculate the scalar products between a foil scan and a sample, for further use in the inversion problem

embedded doc :


This step supposes that you have:

  • already extracted 2D images with the loadscan_2Dimages command.

    The loadscan_2Dimages command has then already fulfilled the following
    requirements, which are listed here for informative purposes:

    • these images must reside at sample_address
    • under sample_address there must be a set of datagroups with name ScanZ,
      where Z is an integer. The number of these datagroups will be called ZDIM
    • inside each ScanZ there must be a set of datagroups with name N, where N
      is the ROI number
    • inside each roi datagroup there is the dataset matrix. This is a
      three-dimensional array:
      • the first dimension is YDIM: the number of steps in the Y direction
      • the other two dimensions are the dimensions of the ROI

  • obtained the optical PSF of all desired analyzers, and the maxipix
    response function. This can be done with the iPSF commands, which will
    have provided the responses for a Dirac delta placed at different
    positions along the X direction. The iPSF step has then already taken
    care of placing in the delta_address datagroup the following (listed for
    informational purposes):

    • a list of datagroups with name N, N being the number of the ROI
    • inside each datagroup a dataset called matrix exists:
      • the matrix has 3 dimensions
      • the first dimension is the number of steps done with the thin foil in
        the X direction to get super-resolution. This will be called XDIM
      • the other two dimensions are the dimensions of the ROI. They must be
        equal to those appearing in the sample data described informatively
        above
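The hierarchy described above can be mimicked with nested Python dictionaries to check the dimension bookkeeping. A sketch with made-up sizes (not XRStools code):

```python
import numpy as np

# Mock of the layout described above, with made-up sizes: under sample_address
# there are groups Scan0..Scan{ZDIM-1}, each holding one "matrix" dataset per
# ROI, of shape (YDIM, roi_rows, roi_cols).
YDIM, ROI_SHAPE = 4, (3, 5)
sample = {
    f"Scan{z}": {str(n): {"matrix": np.zeros((YDIM,) + ROI_SHAPE)}
                 for n in (0, 1)}          # two ROIs, named "0" and "1"
    for z in range(2)                      # ZDIM = 2
}

ZDIM = len(sample)
for scan in sample.values():
    for roi in scan.values():
        y, r, c = roi["matrix"].shape
        # the ROI dimensions must match those of the delta responses
        assert y == YDIM and (r, c) == ROI_SHAPE
print(ZDIM, YDIM, ROI_SHAPE)
```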

Here is an example of the dedicated section of the input file

superR_scal_deltaXimages :
   sample_address : "demo_imaging.hdf5:ROI_B_FIT8/images/scans/"
   delta_address  : "demo_imaging.hdf5:ROI_B_FIT8/scanXX/scans/Scan273/"
   target_address         : "scalprods.hdf5:scal_prods/"

   # optional

   nbin           : 5                               # defaults to 1;
                                                    # it will bin 5 xpixels into one

   roi_keys       :  [60, 64, 35, 69, 34, 24, 5, 6, 71, 70, 39, 58, 56, 33]
   # roi_keys default to all the keys present in delta_address

   orig_delta_address  : "demo_imaging.hdf5:ROI_B/foil_scanXX/scans/Scan273/"
   # defaults to None. If given the integrated image and the average line will be written
   # to check the superposability between the foil scans and the sample scans

   ## optional
   optional_solution : /mntdirect/_scisoft/users/mirone/WORKS/Christoph/XRSTools/volumes_gasket.h5:/ref408_bis_423_548/Volume0p03
   ## a     solution with dimensions  [ZDIM,YDIM,XDIM] 
   ## If given, will be used to balance analyzer factors
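The nbin option shown above bins foil PSFs along the X (superresolution) axis. An illustrative numpy sketch of such binning, with made-up shapes, assuming the axis length is divisible by nbin (how XRStools handles remainders is not shown here):

```python
import numpy as np

# Illustrative only: bin nbin consecutive X positions of a (XDIM, ny, nx)
# PSF stack into one, via reshape + sum along the binning axis.
XDIM, ny, nx, nbin = 10, 3, 4, 5
psf = np.arange(XDIM * ny * nx, dtype=float).reshape(XDIM, ny, nx)

binned = psf.reshape(XDIM // nbin, nbin, ny, nx).sum(axis=1)
print(binned.shape)   # (2, 3, 4): the superresolution axis shrinks by nbin
```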

If nbin is given, the dimension of the superresolution axis will be reduced or increased by binning together the foil PSFs. What the program will produce, under the target_address datagroup, is

  • scalDS which is an array [ZDIM,YDIM,XDIM], type "d".
  • scalDD which is the total sum of the squared data.
  • scalSS which is an array [XDIM,XDIM], type "d".

From these three quantities the volume can be reconstructed with an iterative procedure in subsequent steps.

Here is what they are:

  • scalSS is a 2D matrix: each of its elements is the scalar product of the response function for a given position of the foil along X with the response function for another position of the foil. The sum over ROIs is implicitly done.
  • scalDS is a 3D array: each of its elements is the scalar product of the sample image for a given Z,Y position of the sample with the response function for a given X position of the foil. The sum over the ROIs is implicitly done.
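For mock data, the three quantities can be written compactly with numpy. Made-up shapes, with ROI images flattened to 1D for brevity; this is an illustration of the definitions above, not the XRStools implementation:

```python
import numpy as np

# Mock illustration of the quantities described above (made-up shapes):
# R[k, x] : response of ROI k for foil position x (ROI image, flattened here)
# D[k, z, y] : sample image of ROI k at sample position (z, y)
NROI, XDIM, ZDIM, YDIM, NPIX = 3, 5, 2, 4, 6
rng = np.random.default_rng(0)
R = rng.random((NROI, XDIM, NPIX))
D = rng.random((NROI, ZDIM, YDIM, NPIX))

scalSS = np.einsum("kip,kjp->ij", R, R)    # (XDIM, XDIM), summed over ROIs
scalDS = np.einsum("kzyp,kxp->zyx", D, R)  # (ZDIM, YDIM, XDIM), summed over ROIs
scalDD = np.sum(D * D)                     # total sum of the squared data

print(scalSS.shape, scalDS.shape)          # (5, 5) (2, 4, 5)
```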

Other features


Displays doc on the operations. In the input file

help :

will trigger printing of all the available operation names

help :
    create_rois
    load_scans

will print the help on create_rois and the help about load_scans. By the way, it is the same help that you can read here, because the sphinx doc-generation tool reads the same docstrings contained in the code.



This launches the roi selector. If yamlData contains instructions on the scan to be taken, the roi selector will pop up with the image ready to be used. Otherwise you have to browse for a file yourself.

At the end you have the possibility to write the rois on a hdf5 file.

In the extreme case where you give no arguments (parameters) besides the name of the method, the input file looks like this

create_rois :

you must select an image from the menu or, from the Local menu, select the experimental scans from which an image is summed up. The selection will be operated on this image.

The experimental scans can be given in input using the following structure

create_rois :

  expdata :  "absolutepathtoaspecfile"  # this points to a spec file
  scans   : [623,624]                   # a list containing one or more scans 
                                        # for the elastic part.
                                        # They will be summed to an image
                                        # and rois will then be extracted from this image.
  roiaddress : "myfile.hdf5:/path/to/hdf5/group"  # the target destination for rois

The selection will be written if you confirm before exiting (File menu). If roiaddress is not given, you still have the possibility, after validating the selection and before quitting, to write the rois into a hdf5 file. You do this by choosing a filename for the hdf5 file and a name for the roi selection. An entry with this name will be created in the hdf5 file; all subsequent treatments done by other methods that use this roi selection (called by its name) will be reported into subentries of that entry.

The following are optional arguments. First the filter_path

filter_path : filename.h5:/groupname

It is formed by an hdf5 file name AND the path where the filter is stored. It must be a matrix dataset with 1 for good pixels, 0 for bad ones.
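Throughout these examples, addresses follow the filename:path convention. A minimal sketch of splitting such an address at the first colon (a hypothetical helper; the real parsing may differ):

```python
# Hypothetical helper: split a "filename.h5:/group" address, as used throughout
# these examples, into the hdf5 file name and the group path inside it.
def split_address(address):
    filename, _, group = address.partition(":")
    return filename, group

print(split_address("myfile.hdf5:/path/to/hdf5/group"))
# ('myfile.hdf5', '/path/to/hdf5/group')
```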

The masktype parameter can be used to create a filter mask by hand

masktype : filter

in which case the target will be written with a mask of zeros and ones. The mask will have the same size as the image and will be zero for points to be discarded.



This command harvests the selected signals. The instructions on the scans to be taken must be given in the following form (as submembers of load_scans)

load_scans :
    roiaddress :  "hdf5filename:nameofroigroup"  # the same given in create_rois
    expdata    :  "absolutepathtoaspecfile"  # this points to a spec file

    elastic_scans    : [623]
    fine_scans       : [626,630,634,638,642]
    n_loop           : 4
    long_scan        : 624

    signaladdress : "nameofsignalgroup"  # Target group for writing Relative to ROI (and in the same file)!!!!

    order : [0,1,2,3,4,5]  #  list of integers (0-5) which describes the order of modules in which the 
                           #    ROIs were defined (default is VD, VU, VB, HR, HL, HB; i.e. [0,1,2,3,4,5])

    rvd : -41              # mean tth angle of VD module (default is 0.0)
    rvu : 85               # mean tth angle of VU module (default is 0.0)
    rvb : 121.8            # mean tth angle of VB module (default is 0.0)
    rhl : 41.0             # mean tth angle of HL module (default is 0.0)
    rhr : 41.0             # mean tth angle of HR module (default is 0.0)
    rhb : 121.8            # mean tth angle of HB module (default is 0.0)




Function to extract the interesting signal after removal of the Compton profile, linear baselines, and Pearson profile. Example:

Extraction :
    active : 1
    dataadress   : "pippo.h5:/ROI_A/loaded_datas"         # where load_scans wrote data
    HFaddress    : "pippo.h5:/ROI_A/loaded_datas/HF_O"    # where compton profiles have been calculated
    # prenormrange : [ 5 , .inf ]        

    analyzerAverage :                                     #  averaging over analysers
        active : 1
        which : [0,11  , 36,59   ]
        errorweighing  : False

    removeLinearAv :                                      #  fit a linear baseline and remove it
        active  : 1
        region1 :  [520.0,532.0]   
        region2 :  None 
        ewindow : 100 
        scale : 1

    removePearsonAv:                                      # fit a Pearson and remove it
        active  : 0
        region1 :  [520.0,532.0]   
        region2 :  None  
        guess :
            Peak_position : 600.0
            FWHM          :  10
            Shape         : "Lorentzian" 
            Peak_intensity: 100.0
            linear_slope  : 1
            linear_background : 0
            scaling_factor : 1

    view   :   0
    target :   "myextraction"                            # path relative to dataadress where extracted signal will be written


function for building S(q,w) from tabulated Hartree-Fock Compton profiles, to be used in the extraction algorithm.


dataadress : "hdf5filename:full_nameof_signals_group"   # where load_scans wrote data
formulas   :  ['O']     # list of strings of chemical sum formulas of which the sample is made up
concentrations : [1.0]  # list of concentrations of how the different chemical formulas are mixed (sum should be 1)
correctasym    : [[0.0,0.0,0.0]]  #  single value or list of scaling values for the HR-correction to 
                          # the 1s, 2s, and 2p shells. one value per element in the list of formulas
hfspectrum_address : "nameofgroup" # Target group for writing Relative to dataadress (and in the same file)!!!!