Use case scenarios


Matching your computational requirements with our use case scenarios


SwiNG technical experts have tested a set of use case scenarios (cyclic analysis, data set analysis, model calibration, and parameter sweep) using local clusters and cloud infrastructures.



Do you need to:

  • Periodically re-run an application or workflow when new data is generated, coping with the peak load of each cycle.
  • Process large data sets, requiring large-scale tightly coupled systems, large-memory systems, or large, fast scratch storage.
  • Modify the input parameters of a computational model, which requires any combination of large-scale tightly coupled systems, large-memory systems, and large, fast scratch storage.
  • Assess a range of parameter combinations based on a prototypical embarrassingly parallel use case, e.g. high-energy physics.

These use case scenarios are summarised below. If you have similar computational requirements and would like our support, contact the Collaborative Distributed Support team. Interested in SwiNG's current use cases that are being tried and tested? Read more.

Cyclic analysis scenario

Requirements: periodically re-running the same application or workflow, typically when new data is generated. The infrastructure must be able to cope with the peak load of each new cycle. Examples: 'time-driven' cycles, such as processing meteorological data captured by weather and climate monitoring centres; 'event-driven' cycles, such as processing data generated by instruments like sequencers and spectrometers.
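As a minimal sketch of the event-driven variant, the cycle below re-runs a processing step only for files that arrived since the previous cycle. The directory layout, the `.dat` suffix, and the `process` callback are hypothetical stand-ins for a real instrument pipeline, not part of the SwiNG infrastructure.

```python
import pathlib
import tempfile

def pending_inputs(data_dir, processed):
    """Return data files that no previous cycle has processed yet."""
    return sorted(p for p in pathlib.Path(data_dir).glob("*.dat")
                  if p.name not in processed)

def run_cycle(data_dir, processed, process):
    """One event-driven cycle: process every new file, then record it."""
    for path in pending_inputs(data_dir, processed):
        process(path)
        processed.add(path.name)

# Demo: two cycles; the second cycle only sees the newly generated file.
with tempfile.TemporaryDirectory() as d:
    processed, log = set(), []
    (pathlib.Path(d) / "a.dat").write_text("1")
    run_cycle(d, processed, lambda p: log.append(p.name))
    (pathlib.Path(d) / "b.dat").write_text("2")
    run_cycle(d, processed, lambda p: log.append(p.name))
    print(log)  # → ['a.dat', 'b.dat'] — each file processed exactly once
```

In production the cycle would be triggered by a scheduler (time-driven) or a filesystem/instrument event (event-driven) rather than called inline.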

Data set analysis scenario

Requirements: processing a large data set and making it publicly available or storing it privately in the group’s facility. Each step of the processing workflow depends on the specific requirements of the application, e.g. a large-scale tightly coupled system, a large-memory system, or large, fast scratch storage. Example: in structural biology, modelling a protein from its amino-acid sequence generates many candidate models that have to be validated against existing databases. A workflow is used for the data processing as it allows for a well-defined sequence of steps. Current portfolio of use cases: Gweight and Rosetta. Read more.
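The "well-defined sequence of steps" above can be sketched as a minimal linear workflow runner. The step names (`generate_models`, `validate`) and the toy sequence are hypothetical placeholders, not the actual Gweight or Rosetta stages.

```python
def run_workflow(data, steps):
    """Run a fixed sequence of processing steps; each step consumes
    the output of the previous one."""
    for step in steps:
        data = step(data)
    return data

# Hypothetical stand-ins for real structural-biology stages:
# model generation produces many candidates, validation filters them.
def generate_models(sequence):
    return [f"{sequence}-model{i}" for i in range(4)]

def validate(models):
    return [m for m in models if not m.endswith("3")]  # toy acceptance rule

result = run_workflow("MKTAYIAK", [generate_models, validate])
print(result)  # → ['MKTAYIAK-model0', 'MKTAYIAK-model1', 'MKTAYIAK-model2']
```

Because the sequence is fixed, each step can be dispatched to whichever resource class it needs (tightly coupled nodes, large memory, scratch storage) without changing the workflow structure.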

Model calibration scenario

Requirements: modifying the input parameters of a computational model in order to find the best match against an observed data set (the reference model). This requires any combination of large-scale tightly coupled systems, large-memory systems, and large, fast scratch storage. Example: models with a large number of parameters are typically calibrated using global optimisation algorithms, characterised by an indefinite number of iterations over a parameter population. The algorithm eventually converges to the parameter configuration that best matches the reference model. The evaluation of each parameter combination within a population depends on the simulation model, but each combination can be processed independently, allowing for loosely coupled parallelism. A synchronisation step is required at the end of each iteration (to assess convergence and generate the new parameter population), which requires all parameter combinations to have been evaluated.
Current portfolio of use cases: A4-Mesh, GEOTop. Read more.
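The iteration structure described above (independent evaluations followed by a synchronisation step) can be sketched with a toy random-search calibration. The misfit function, the perturbation rule, and the population size are illustrative assumptions, not the A4-Mesh or GEOTop setup.

```python
import random

def calibrate(evaluate, propose, population, iterations):
    """Iterate over parameter populations: evaluate each member
    independently (loosely coupled, parallelisable), then synchronise
    to rank results and generate the next population."""
    best = None
    for _ in range(iterations):
        # Independent evaluations: each could run on a separate worker.
        scored = [(evaluate(p), p) for p in population]
        # Synchronisation step: every evaluation must finish before
        # the results can be ranked and the next population proposed.
        scored.sort(key=lambda pair: pair[0])
        if best is None or scored[0][0] < best[0]:
            best = scored[0]
        population = [propose(best[1]) for _ in population]
    return best

# Toy calibration: recover the reference value x* = 3.0.
random.seed(0)
misfit = lambda x: (x - 3.0) ** 2          # distance to the reference model
perturb = lambda x: x + random.uniform(-0.5, 0.5)
score, x = calibrate(misfit, perturb,
                     [random.uniform(-10.0, 10.0) for _ in range(20)], 50)
```

Real calibrations replace `misfit` with a full simulation run, which is why each evaluation typically occupies its own cluster node between synchronisation points.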

Parameter sweep scenario

Requirements: assessing a range of parameter combinations for a given model; a prototypical embarrassingly parallel use case. Example: this scenario has been widely supported by distributed infrastructures such as computational grids, including EGI’s support of high-energy physics use cases. Each parameter combination can be processed independently of the others, and no synchronisation is required between evaluations. Current portfolio of use cases: GEOTop, Gbiointeract. Read more.
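Because every combination is independent, a sweep needs nothing more than a pool of workers and no synchronisation between them. The toy `simulate` function and the parameter grid below are illustrative assumptions standing in for a real model run.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def simulate(params):
    """Hypothetical stand-in for one independent model run."""
    a, b = params
    return a * b

# Every combination in the sweep is independent: no synchronisation is
# needed between evaluations, so they can be farmed out freely.
grid = list(product([1, 2, 3], [10, 20]))
with ThreadPoolExecutor() as pool:
    results = dict(zip(grid, pool.map(simulate, grid)))
print(results[(3, 20)])  # → 60
```

On a grid or cloud infrastructure the thread pool would be replaced by a job scheduler submitting one task per combination, but the independence property is the same.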