Ten Simple Rules - End Users Perspective

The Committee on Credible Practice of
Modeling & Simulation in Healthcare aims to establish a task-oriented collaborative platform to outline good practice of simulation-based medicine.
Ahmet Erdemir
Posts: 77
Joined: Sun Sep 10, 2006 1:35 pm

Ten Simple Rules - End Users Perspective

Post by Ahmet Erdemir » Wed Sep 18, 2013 5:58 am

This is a discussion thread for the END-USERS task team to establish Ten Simple Rules of credible practice of modeling and simulation in healthcare.

A summary of the activity can be found at:
http://wiki.simtk.org/cpms/Ten_Simple_R ... e_Practice

Details of the End Users task team can be found at:
http://wiki.simtk.org/cpms/CPMS_Task_Teams

Please respond to this forum thread by providing your insight on which of the following candidate actions (copied from here; add more if you find it necessary) are important to establish credible practice. Ideally, you will rank these and provide justification for why one may take precedence.
  • Use version control
  • Use credible solvers
  • Explicitly list your limitations
  • Define the context the model is intended to be used for
  • Define your evaluation metrics in advance
  • Use appropriate data (input, validation, verification)
  • Attempt validation within context
  • Attempt verification within context
  • Attempt uncertainty (error) estimation
  • Perform appropriate level of sensitivity analysis within context of use
  • Disseminate whenever possible (source code, test suite, data, etc.)
  • Report appropriately
  • Use consistent terminology or define your terminology
  • Get it reviewed by independent users/developers/members
  • Learn from discipline-independent examples
  • Be a discipline-independent/specific example
  • Follow discipline-specific guidelines
  • Conform to discipline-specific standards
  • Document your code
  • Develop with the end user in mind
  • Provide user instructions whenever possible and applicable
  • Practice what you preach
  • Make sure your results are reproducible
  • Provide examples of use
  • Use data that can be traced back to its origin
  • Use competition of multiple implementations to check and balance each other
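Several of the candidate actions above (reproducibility, sensitivity analysis, uncertainty estimation) can be illustrated with a short sketch. The model, parameter names, perturbation size, and noise level below are all hypothetical placeholders, not part of any real solver; the point is the pattern: fix and report random seeds, and report how outputs respond to input changes.

```python
import random
import statistics

def model(stiffness, damping):
    """Hypothetical one-output simulation model (a stand-in, not a real solver)."""
    return stiffness / (1.0 + damping)

def sensitivity(params, rel_step=0.01):
    """One-at-a-time sensitivity: relative output change per rel_step input change."""
    baseline = model(**params)
    return {
        name: (model(**dict(params, **{name: v * (1 + rel_step)})) - baseline) / baseline
        for name, v in params.items()
    }

def uncertainty(params, rel_sd=0.05, n=1000, seed=42):
    """Crude Monte Carlo spread of the output under Gaussian input noise."""
    rng = random.Random(seed)  # fixed, reported seed: runs are reproducible
    outputs = [
        model(**{k: v * (1 + rng.gauss(0, rel_sd)) for k, v in params.items()})
        for _ in range(n)
    ]
    return statistics.mean(outputs), statistics.stdev(outputs)

params = {"stiffness": 100.0, "damping": 0.5}
print(sensitivity(params))
print(uncertainty(params))
```

Because the seed is fixed and recorded, repeating the uncertainty estimate yields identical numbers, which is the minimal form of "make sure your results are reproducible."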
Best,

Ahmet

John Rice
Posts: 8
Joined: Thu May 30, 2013 10:08 pm

Re: Ten Simple Rules - End Users Perspective

Post by John Rice » Thu Oct 10, 2013 6:07 pm

The statement of a 'rule' may be simple, and it should be. The difficulty is in interpreting what the rule means to whom, under what conditions, and for what KIND of model or simulation. (Hmmm, will there be two sets of rules, one for models and one for simulations that use a model, or will the terms of reference for the rules be Modeling AND Simulation?)

A key fault I have encountered in past efforts to establish rules (in DOD, big time) is the failure to clearly document the Spirit and Intent of each rule. The S&I is what individuals will need to commit to honor; without it, the letter of the rule will never be translated across domains, M&S types, and stakeholders who are engaged with or affected by the model or simulation without being vaporized and reconstructed to suit their own immediate purposes (or budgets). Ideally, a rule for credible creation, use, or consumption of output from a model should apply to a physical model of an airplane as well as to the computational model of its expected flight dynamics. Even more ideally, it would recognize that not all models are created; some are designated. "Company X is a model for a perfect organization." means what? Can I apply a rule about declaration of assumptions to it? Yes, but I can't think of a time when anyone has ever done that for me. As a result, the phrase is usually rhetorical, yet it is used all the time. Applying the rules should make such a statement meaningful, since surely one of the rules would be specifying the context for the model, and another documenting its assumptions.

However, the test of how well a universal rule is interpreted in the convenient language of different domains of use, or by users within the same domain, would be to determine whether the listener can recite the purpose in terms consistent with the original documented spirit and intent of the simple rule. If the spirit and intent are not documented with the simple statement, there can be no test of each hearer's interpretation, and therefore whatever they DO, thinking they are compliant, may be irrelevant or, worse, counterproductive to the rule's intent.

If done well, then I, if I want to be a simulationist in my work domain, should be able to recite the rules word for word on demand, ideally every morning when I arrive at my desk. They will be short enough to be easily cited in a simulationist's code of ethics and would likely be at the foundation of that code. So writing on the "tablet" should be easy. As with the biblical Ten Commandments, the problem is that their spirit and intent were left to discussion and individual interpretation after the fact, since they would not have fit on two stone tablets.

The hard part, though still not impossible, is writing the intent and capturing the spirit of each rule, with very broad agreement, in the same document in which the rule is first published.

Next installment, I'll try to tackle the dimensions of a diagram (I almost said model, but ....) depicting the height, width, and depth of the space containing the elements of M or S or M&S that will require interpretation of each of the rules.

John Rice
Posts: 8
Joined: Thu May 30, 2013 10:08 pm

Re: Ten Simple Rules - End Users Perspective

Post by John Rice » Fri Oct 11, 2013 7:33 pm

New rule candidate. I don't think I have seen this in your list, nor anywhere else until this book.

Thou shalt not distribute a model, or output from a model, that has ever produced unexpected results that have not been documented, investigated, and explained. (JR)


Quoted (from the introduction, page 11 I think):

"c. Ignoring Unexpected Behavior

Although a validation process is recognized to be an essential stage in any modelling and simulation project, its main thrust generally is to confirm that expected behaviour does occur. On the other hand, testing for unexpected behaviour is never possible. Nevertheless such behaviour can occur, and when it is observed there is often a tendency to dismiss it, particularly when validation tests have provided satisfactory results. Ignoring such counterintuitive, or unexpected, observations can lay the foundation for failure."
Birta, Louis G. and Arbez, G., Modelling and Simulation: Exploring Dynamic System Behaviour, Springer-Verlag London Limited, 2007.
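The rule above can be sketched as a guard in a simulation pipeline: outputs outside their expected ranges are recorded for later investigation rather than silently dropped. The quantity names, bounds, and log file below are hypothetical; this is a minimal sketch of the pattern, not a prescribed implementation.

```python
import json

def check_expectations(run_id, outputs, bounds):
    """Flag any output outside its expected range and append it to an anomaly
    log, so that unexpected results are documented, investigated, and explained
    before the model or its output is distributed. Bounds are hypothetical."""
    anomalies = [
        {"run": run_id, "quantity": name, "value": value,
         "expected": bounds[name]}
        for name, value in outputs.items()
        if not (bounds[name][0] <= value <= bounds[name][1])
    ]
    if anomalies:
        # record for investigation; do not silently discard the surprise
        with open("anomalies.jsonl", "a") as log:
            for a in anomalies:
                log.write(json.dumps(a) + "\n")
    return anomalies
```

A run that produces a value outside its declared bounds then leaves a durable record, which is the opposite of the "tendency to dismiss it" that the quote warns about.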
