Workflows & Standards & good practices

The Committee on Credible Practice of Modeling & Simulation in Healthcare aims to establish a task-oriented collaborative platform to outline good practices of simulation-based medicine.
Lealem Mulugeta
Posts: 42
Joined: Tue Dec 21, 2010 11:03 am

Workflows & Standards & good practices

Post by Lealem Mulugeta » Thu Oct 15, 2015 1:43 pm

The following is a discussion initiated by Dr. Anthony Hunt with the CPMS Co-chairs via email as a consequence of the 2015 IMAG Multiscale Modeling Consortium meeting (http://www.imagwiki.nibib.nih.gov/imag- ... um-meeting). Given the significance of the discussion to the end goals of the Committee, we felt it was most appropriate to have the discussion posted and continue further discussions via the CPMS Public Forum.

We encourage anyone interested to participate in this discussion as it will play a role in the development of the Guidelines for Credible Practice of Modeling and Simulation in Healthcare.

Lealem

*****Start of email discussion*****
From T. Hunt on Wednesday, September 16, 2015 6:56 PM
For me, there are several workflow protocols and practices that come before—and are actually more important (in terms of work product) than—a downstream choice (or avoidance) of some standard.

Ahmet, from your presentation on Thursday, I infer that the same may be true for you.

Might it be worthwhile for the CPMS to try to initiate discussions within the Consortium on good M&S workflow practices?

There may (or not), for example, be differences between a good practice to which Ahmet tries to adhere (during development of a new FE model) and the corresponding good practice to which I try to adhere (when developing an agent-based model).

The survey suggests that there should be method-agnostic good practices.

Activities exhibiting noteworthy differences should also be of interest to others.

I could say more, but I'll stop here to get your reactions.

-Tony-
---

From A. Erdemir on Mon, Sep 21, 2015 at 8:02 AM
Tony, for me as well, workflow protocols and practices come before the choice of standards. For my work, standards (their existence and our compliance with them) become an issue when we need to exchange data/models/results and when there are no appropriate format conversion tools. With this mindset, I view standards as the means to facilitate a collaborative workflow, evaluation of its outcome, and reuse of its components. Will I need one if I can convert between different file types easily? Maybe not.

The manufacturing industry built various standards for geometry exchange, e.g., in computer-aided design: STEP (an ISO standard) and IGES (an ANSI standard). Exchange of geometry in manufacturing was a significant need because different parts may be built by different suppliers and need to match at tight tolerances. Nonetheless, standards for finite element analysis models (even for meshes) have not gained much traction. I suspect that this is simply because companies do not necessarily share models with other companies and instead standardize on a specific finite element analysis software within the company (allowing exchange of models between different groups within the company in a proprietary, non-standard form).

I would be interested in someone (maybe you) providing the anatomy of an agent-based modeling & simulation study from start to end, similar to what I tried to convey with my presentation for FEA. Following that, we can go through the various steps to identify overlapping themes and some that may differ.

The way I see it, a broad workflow in M&S goes as follows (a minimal code sketch of this checklist appears after the list):

i) find data to build and evaluate the model
(from a credibility perspective, convince yourself and others that the data are useful)

ii) process the data to bring them into a usable form to incorporate into the model (derivative data)
(from a credibility perspective, convince yourself and others that your analysis is correct and useful)

iii) assemble the model using the various components of the derivative data
(from a credibility perspective, convince yourself and others that your assembly is correct, i.e., you defined the interactions between components correctly)

iv) conduct simulations under the desired cases
(from a credibility perspective, convince yourself that your simulation software operates as expected and that the cases you simulate are represented appropriately and are relevant)

v) evaluate your results
(from a credibility perspective, convince yourself that your results are believable, i.e., they have the desired realism and can be used for interpretation within context)

vi) report your work
(from a credibility perspective, convince yourself that your report reflects your previous actions and can be followed to reproduce the outcome of your work)

vii) share your data/models
(from a credibility perspective, convince yourself that the way you share your data/models does not result in potential errors in their interpretation)
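As a rough illustration (not part of the original email), the seven steps above could be captured as a checklist in code, pairing each action with its credibility question so a study record can note how each was addressed. The step labels and the credibility_checklist helper below are hypothetical, written in Python only as a sketch.

Code:

BROAD_MS_WORKFLOW = [
    ("find data",         "Are the data useful for building and evaluating the model?"),
    ("process data",      "Is the derivation of usable (derivative) data correct and useful?"),
    ("assemble model",    "Are the interactions between components defined correctly?"),
    ("run simulations",   "Does the software operate as expected, and are the simulated cases relevant?"),
    ("evaluate results",  "Are the results believable and interpretable within context?"),
    ("report work",       "Can the report be followed to reproduce the outcome?"),
    ("share data/models", "Is sharing done in a way that avoids misinterpretation?"),
]

def credibility_checklist(evidence):
    """Pair each workflow step with the recorded evidence, flagging steps not yet addressed."""
    return [(step, question, evidence.get(step, "NOT ADDRESSED"))
            for step, question in BROAD_MS_WORKFLOW]

for step, question, note in credibility_checklist({"find data": "public gait dataset, DOI recorded"}):
    print(f"{step}: {question} -> {note}")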

How different is this modeling & simulation workflow across varying simulation strategies, disciplines, etc.?

Thanks for your input.
Best,
ahm.
---

From T. Hunt on Mon, Sep 21, 2015 at 2:51 PM

Great response. Thanks for taking the time to respond. My feedback is inserted below.
Tony, for me as well, workflow protocols and practices come before the choice of standards. For my work, standards (their existence and our compliance with them) become an issue when we need to exchange data/models/results and when there are no appropriate format conversion tools. With this mindset, I view standards as the means to facilitate a collaborative workflow, evaluation of its outcome, and reuse of its components.
concur
Will I need one if I can convert between different file types easily? Maybe not.

The manufacturing industry built various standards for geometry exchange, e.g., in computer-aided design: STEP (an ISO standard) and IGES (an ANSI standard). Exchange of geometry in manufacturing was a significant need because different parts may be built by different suppliers and need to match at tight tolerances. Nonetheless, standards for finite element analysis models (even for meshes) have not gained much traction. I suspect that this is simply because companies do not necessarily share models with other companies and instead standardize on a specific finite element analysis software within the company (allowing exchange of models between different groups within the company in a proprietary, non-standard form).

I would be interested in someone (maybe you) providing the anatomy of an agent-based modeling & simulation study from start to end, similar to what I tried to convey with my presentation for FEA.

OK. I have in mind the following.
* I create a workflow diagram that provides "the anatomy of an agent-based modeling & simulation study from start to end.”
* You use the outline below and content from your FEA presentation to create a similar FEA workflow diagram (without trying to force any matches to mine).
* We email the Working Groups and ask leads if one or more Group members will volunteer to do the following:
– offer constructive/critical feedback on our workflows (with a view toward making both good-practice examples)
– generate and contribute a similar workflow diagram for their particular area of M&S

If important differences emerge, we will have learned something useful. Similarities across 3 or 4 very different M&S domains may suggest an overarching good practice. Whatever the outcome, it should make a publishable communication.

Your thoughts?

-T-
---

From T. Hunt on Tue, Sep 29, 2015 at 4:13 PM

Ahmet, please see the attachments. I attached a PDF of the .docx in case that is more convenient.

It turned out that we had separate parts of a complete workflow document, but not the whole. So creating a complete workflow was worthwhile. It looks like your steps below are a subset of a more complete workflow.

Can you take an hour or so this week to expand what you have below so that it somewhat parallels the attachment, and add examples as I did?

If we can get two or three others to do the same (I plan to ask Linderman first), my expectation is that we will begin to see workflow steps/stages that can be generalized across diverse MSM research projects. It may also become clear that several of the generalized workflow steps/stages (alone or in combination) can be restated as Best Practices.

Comments/additional thoughts?

Regards
-Tony-
Worflows 29Sep15.pdf
(83.07 KiB) Downloaded 117 times
Worflows 29Sep15.docx
(124.44 KiB) Downloaded 110 times
---

From A. Erdemir on Thu, Oct 1, 2015 at 5:53 PM


Thanks Tony for these documents. I am on the road at the moment and coming back to the country on Saturday, so please bear with me as I try to assemble a more complete workflow for FEA. In the meantime, here are some thoughts:

As I quickly glanced through your document, it reminded me of a manuscript that we wrote on reporting finite element analysis studies. Many of the reporting parameters that we listed there (under the categories Model Identification, Model Structure, Simulation Structure, Verification, Validation, Availability) parallel the components of a complete workflow, except that they need to be restated to focus on the actions of M&S rather than their reporting. See the manuscript at

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3278509/

The workflow that I provided in my previous e-mail (and in my presentation) essentially deals with various technicalities of implementation.
---

From T. Hunt on Fri, Oct 9, 2015 at 2:40 PM


Ahmet, can you please send me your enhanced workflow within the next few days? I'd like to send out the pair of workflows next week before my academic workload ramps up.
Thanks
-T-
---

From A. Erdemir on Mon, Oct 12, 2015 at 10:03 AM

Tony, I apologize for my tardy response. It took me longer than I anticipated to return to this and detail the workflow. Attached are doc and pdf versions of the workflow document that I prepared. In the future, I will most likely detail this document further by giving more concrete examples. Nonetheless, it portrays a common strategy we employ to conduct studies using finite element analysis. It will hopefully encourage others to come up with workflows for their domain and tool of choice.

Thank you for following up on this.
Best,

Ahmet
FEA-workflow-forTH.pdf
(72.9 KiB) Downloaded 116 times
---

From T. Hunt on Mon, Oct 12, 2015 at 10:59 AM

This week, I'll format my document similarly to yours. Because FE M&S is more mature than AB M&S, my document will be relatively sparse. That's fine.

Examples will help, but I think that we have enough to get the ball rolling.

-T-
---

From T. Hunt on Mon, Oct 12, 2015 at 2:31 PM

As written, it seems that the decision was made prior to your step [1] that a FE analysis (M&S) approach is best/preferred/essential. That’s not an issue. I just want to confirm that is the case.

When we have several workflows, there may be different branch points, with each branch leveraging a specialized workflow and methods. Your FE analysis would be one branch.

Do you concur? If yes, I’ll suggest a couple of steps prior to your [0]. If you like what I suggest, then you could add one (sufficient for now) example question (whatever) such that the FE analysis branch is the obvious best approach.

It’s too early for me to guess if most branches emanate from a common node (step [0] followed by your [1]).

Additional commentary welcomed.

-T-
---

From A. Erdemir on Mon, Oct 12, 2015 at 3:54 PM

It is true that, when preparing this workflow document, I took FE analysis as the preferred approach, essentially a given. I concur that there needs to be branching in relation to specialized methods.

It will be good to have a preliminary workflow that can help the user/developer decide which branch to follow. I suspect that is what you mean by steps prior to my [0]. Nonetheless, without defining the problem and metrics of interest, this may be hard to do. Therefore, it may be wiser to include my [1] and my [2] in the initial workflow to decide upon the specialized modeling strategy. Also, it may be useful (as a reminder) to add a step in my workflow after defining the problem and metrics: "Confirm FEA as the tool of choice." In that case, I would pose a question to the user, "Are you looking to solve field problems?", and give a biological example, e.g., mechanical risk of cartilage degeneration after anterior cruciate ligament deficiency, therefore exploring stresses and deformations of the cartilage. Does this make sense?

Cheers,
ahm.
---

From T. Hunt on Tue, Oct 13, 2015 at 3:17 PM


Ahmet, let me set the stage for the comments that follow. At the start of most wet-lab/clinical research workflows, it is understood that, for example, the huge variety of available/accessible analytical chemistry methods are “on the table,” should they be needed. They may not be mentioned in early draft protocols.

At some future time, it will become “a given” that a huge variety of computational M&S methods are on the table when new projects are first discussed. Rather than looking backwards at what we’ve done, for this workflow discussion, let’s look forward and imagine that we are getting close to that future vision.

When a new project is first discussed, no M&S approach/method is off the table. As tasks and issues come into focus, along with things like budgets, some M&S methods are conceptually "pushed aside" until it is clear that one or a small set of methods will be needed to achieve objectives and goals.

So, I envision a (future) research workflow starting broad, with many M&S methods on the table, and then getting progressively narrower. We are focused on two workflows (A & B) where both FE & AB methods are on the table (with other methods) early on. However, after some stage in workflow A, AB methods are off the table; only FE methods remain. Whereas, after some stage in workflow B, FE methods are off the table; only AB methods remain.

Is that scenario reasonably clear?

See the attachment. My example at step 1.1 is bone fracture union/nonunion in adults. At that stage, until more is specified, FE & AB methods would be on the table along with many others (systems M&S, pattern recognition [in proteomics big data], etc.). However, as we answer the questions, the set of remaining methods shrinks.

I think that I can envision answers (specific example answers) such that by the time we get to 6.1, only FE methods remain on the table. Do you agree? Depending on the answers, I can imagine that all but FE methods get pushed off the table earlier.

I’m sure that I can envision answers (different specific example answers) such that by the time we get to 6.1, only AB methods remain on the table.

Am I still making sense? If yes, then can we do that? We can pick different 1.1 responses and provide different examples in response to the questions. Long term, my expectation is that following best practices will lead the research team to the best M&S methods rather than having a biased modeler like me pushing them prematurely in my direction.
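As a minimal sketch (not from the original email) of this progressive narrowing, candidate methods could be kept "on the table" as a small table of capability flags and filtered as context questions are answered. The method names, flags, and the narrow_methods function are invented for illustration only.

Code:

# Each candidate method carries capability flags; answered context questions filter the table.
CANDIDATE_METHODS = {
    "finite element analysis": {"field_problem": True,  "discrete_agents": False},
    "agent-based modeling":    {"field_problem": False, "discrete_agents": True},
    "systems dynamics":        {"field_problem": False, "discrete_agents": False},
}

def narrow_methods(answers, candidates=CANDIDATE_METHODS):
    """Keep only the methods whose capabilities match every requirement answered so far."""
    remaining = dict(candidates)
    for requirement, needed in answers.items():
        remaining = {name: caps for name, caps in remaining.items()
                     if caps.get(requirement, False) == needed}
    return remaining

# A cartilage stress/deformation question leaves only FE methods on the table.
print(list(narrow_methods({"field_problem": True})))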

-T-
GeneralWorkflow 13Oct15.pdf
(50.78 KiB) Downloaded 124 times
---

From A. Erdemir on Wednesday, October 14, 2015 at 7:56 AM

Tony, my comments are below.

Ahmet, let me set the stage for the comments that follow. At the start of most wet-lab/clinical research workflows, it is understood that, for example, the huge variety of available/accessible analytical chemistry methods are “on the table,” should they be needed. They may not be mentioned in early draft protocols.


AE: Agreed and understood.
At some future time, it will become “a given” that a huge variety of computational M&S methods are on the table when new projects are first discussed. Rather than looking backwards at what we’ve done, for this workflow discussion, let’s look forward and imagine that we are getting close to that future vision.


AE: I see. Essentially, when drafting a broad workflow, we will assume that we have a toolbox full of a variety of M&S methods, where each method may have its own nuances in conducting the work targeted at solving a specific problem (as described in an M&S-strategy-specific workflow).
When a new project is first discussed, no M&S approach/method is off the table. As tasks and issues come into focus, along with things like budgets, some M&S methods are conceptually "pushed aside" until it is clear that one or a small set of methods will be needed to achieve objectives and goals.


AE: Agreed. Logistics of the environment, expertise of the developers, and the capabilities of an M&S method may cause it to be pushed aside, but until then the broad workflow should be immune to bias in the choice of M&S.
So, I envision a (future) research workflow starting broad, with many M&S methods on the table, and then getting progressively narrower. We are focused on two workflows (A & B) where both FE & AB methods are on the table (with other methods) early on. However, after some stage in workflow A, AB methods are off the table; only FE methods remain. Whereas, after some stage in workflow B, FE methods are off the table; only AB methods remain.

Is that scenario reasonably clear?


AE: It is. As of now, some items in your broad workflow (provided in your e-mail) overlap with some of the M&S-strategy-specific workflows. Yet, when we end up using a specific M&S strategy, we (or the user) may want to assume that a broad workflow has been worked through to identify a relevant M&S strategy potentially suitable to address the specific problem of interest. Nonetheless, we can reframe overlapping steps in M&S-specific workflows to relate better to the broader workflow; i.e., rather than saying "Define problem/context" in the specific workflow, we can say "Confirm this specific strategy fits the problem/context". I believe I do that in my mind anyway; for example, if I am interested in cartilage contact forces during walking, I think of the mechanics of the tissue (domains are defined) and I scan through the different modeling strategies in my toolbox (rigid-body-dynamics-based musculoskeletal movement simulations or finite element analysis). I may choose one over the other through gross selection criteria. Let's say I decided to use FEA. Before I start working on the model, I think a bit harder to ensure that I can get what I want from FEA and then move on.
See the attachment. My example at step 1.1 is bone fracture union/nonunion in adults. At that stage, until more is specified, FE & AB methods would be on the table along with many others (systems M&S, pattern recognition [in proteomics big data], etc.). However, as we answer the questions, the set of remaining methods shrinks.
AE: Agreed.

I think that I can envision answers (specific example answers) such that by the time we get to 6.1, only FE methods remain on the table. Do you agree? Depending on the answers, I can imagine that all but FE methods get pushed off the table earlier.

I’m sure that I can envision answers (different specific example answers) such that by the time we get to 6.1, only AB methods remain on the table.

AE: Agreed, ideally by 6.1 someone should decide which specific M&S workflow needs to be considered. Nonetheless, economics and expertise may be influential, particularly when multiple methods can do the work, i.e., alternatives exist. Also, one interesting concept is that the problem may dictate the use of multiple modeling and simulation strategies, e.g., use FEA to get the mechanical environment, feed the mechanical environment to an agent-based model to calculate bone adaptation, and return to FEA to rerun with changed material properties, i.e., the curse of domain/scale coupling.
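As a toy sketch (not part of the original email) of the FEA/agent-based coupling loop described in the preceding paragraph: the run_fea and run_abm functions below are stand-ins with made-up arithmetic, not real solvers, and only illustrate how the two scales could hand results back and forth.

Code:

def run_fea(material_stiffness):
    # Stand-in for a finite element solve: strain scales inversely with stiffness (toy arithmetic).
    applied_load = 100.0
    return applied_load / material_stiffness  # the "mechanical environment"

def run_abm(strain):
    # Stand-in for agent-based bone adaptation: higher strain yields a larger stiffness multiplier.
    return 1.0 + 0.5 * strain

def coupled_simulation(initial_stiffness=10.0, iterations=5):
    stiffness = initial_stiffness
    for _ in range(iterations):
        strain = run_fea(stiffness)      # scale 1: tissue mechanics
        stiffness *= run_abm(strain)     # scale 2: cellular adaptation updates material properties
    return stiffness

print(coupled_simulation())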
Am I still making sense? If yes, then can we do that? We can pick different 1.1 responses and provide different examples in response to the questions. Long term, my expectation is that following best practices will lead the research team to the best M&S methods rather than having a biased modeler like me pushing them prematurely in my direction.


AE: Agreed. I am indeed a biased modeler as well, simply based on what I know and what I have access to, but for someone who does not know and who may have access to everything, an objective workflow, immune to our own biases, will be greatly helpful.

---

From T. Hunt on Wed, Oct 14, 2015 at 11:06 AM
An observation/comment followed by a request:

You stated:
AE: Agreed, ideally by 6.1 someone should decide which specific M&S workflow needs to be considered. Nonetheless, economics and expertise may be influential, particularly when multiple methods can do the work, i.e., alternatives exist. Also, one interesting concept is that the problem may dictate the use of multiple modeling and simulation strategies, e.g., use FEA to get the mechanical environment, feed the mechanical environment to an agent-based model to calculate bone adaptation, and return to FEA to rerun with changed material properties, i.e., the curse of domain/scale coupling.


Agree. That’s how it should work. We’re not there yet.

If we can generate a set of diverse workflows that start agnostic about the M&S approach but end with creative use of one or two particular methods, I think they will prove useful in explaining to the larger community how M&S can be integrated with wet-lab methods to achieve better solutions to R&D problems.

Request (referring to [GeneralWorkflow 13Oct15.pdf]):
Please select a step 1.1 context that will lead to FEA at step 6.1. Below you stated, “e.g. mechanical risk of cartilage degeneration after anterior cruciate ligament deficiency - therefore exploring stresses and deformations of the cartilage.”

Maybe the step 1.1 context statement could be a question related to [cartilage stress and deformation] <=> [cartilage degeneration after anterior cruciate ligament deficiency]

I'll do the same for AB methods and will provide example answers to all 1-6 questions. I'll send that to you this afternoon, and then you do something similar for your context.

The next step would be to complete our specialized workflows beyond 6.1.

Given those two workflow examples, it will be easier for other MSM consortium members to contribute.

-T-
*****End of email discussion and start of transition to Public Forum*****
Attachments
FEA-workflow-forTH.doc
(36 KiB) Downloaded 103 times

C. Anthony Hunt
Posts: 23
Joined: Sun Apr 21, 2013 2:18 pm

Re: Workflows & Standards & good practices

Post by C. Anthony Hunt » Fri Oct 16, 2015 4:19 pm

Hypothesis: the diversity of questionnaire rankings is a result of different responders focusing on those portions of an overall workflow on which their own work has been focused or with which they have more experience.

Suppose that we have a set of six different example workflow documents (the current two plus four more), similarly formatted. We repeat the questionnaire but rephrase the request, something like this: given similarities and differences in M&S workflows, as reflected in the attached six examples, please rank the following rules in terms of relative importance across all workflows.

I think that we'd get a quite different and possibly more unified response.

-Tony-

Lealem Mulugeta
Posts: 42
Joined: Tue Dec 21, 2010 11:03 am

Re: Workflows & Standards & good practices

Post by Lealem Mulugeta » Mon Oct 19, 2015 12:42 am

Tony and Ahmet,

Thanks very much for initiating this discussion. I've read through much of the documents and exchange, and I generally agree with the workflow elements.

In my experience of developing and applying a variety of models for varying contexts of use, I've come up with a series of generalized workflows that are applicable to the type of model considered (FEM, lumped, multi-body dynamics, probabilistic, etc.). In reviewing the workflows you have shared thus far, there are a few key elements that I believe are being overlooked. Namely:
  1. Particularly at the very early stages of the workflow, when the domain, context, objectives and requirements are being defined, appropriate subject matter experts and end-users (e.g. physiologists, biologists, physical therapists, physicians) NEED to be involved. From first-hand experience, I've learned that if you do not have these stakeholders involved immediately, the likelihood that you will have an impact beyond your own lab or personal interest is greatly diminished. Given that the goal of the Committee is to develop guidelines and workflows that facilitate the transition of models into the clinic, the SMEs and end-users must be engaged as early as possible.
  2. I believe that it is important to explicitly state the need to develop a conceptual model prior to moving to code development. I realize code development is the fun part and we all want to get to it ASAP, but the best models are often those which start with a well-defined conceptual model.
  3. In addition, the conceptual model must go through appropriate credibility assessment to ensure that it satisfies the needs of the SMEs and end-users, and that the selected modeling approach is the most appropriate to address the problems of interest.
    • I generally find that portions #2 and #3 of the M&S development and implementation process are completely ignored. As such, I strongly recommend we consider augmenting the example workflow to explicitly include appropriate development and vetting of a conceptual model before starting code development.
  4. The ordered list format of the example workflow is somewhat misleading because almost every section of the workflow can be quite an iterative process. Using an ordered list suggests a very linear process with an explicit end point. This is rarely the case, if at all. (A rough code sketch of such an iterative loop follows this list.)
  5. The nature of truly multiscale models, particularly in healthcare applications, tends to require the integration of multiple sub-models to establish an overarching larger model. I realize it may be possible to establish a single model without having to integrate multiple models, but my experience suggests that modular models are going to be commonplace. As such, the development and implementation of modular models for integration and independent use must also be captured for completeness.
I'm sure there are a couple of other items, but these are the big items I noticed.
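As a minimal sketch (not from the original post) of the iterative, non-linear character noted in item 4: a workflow stage is repeated until its credibility assessment passes rather than executed once in a strict order. The function names, the SME review score, and the threshold are illustrative only.

Code:

def iterate_stage(perform_stage, assess_credibility, max_iterations=10):
    """Repeat a workflow stage until its credibility assessment passes (or give up)."""
    for attempt in range(1, max_iterations + 1):
        artifact = perform_stage(attempt)
        if assess_credibility(artifact):
            return artifact, attempt
    raise RuntimeError("stage did not reach an acceptable credibility level")

# Toy usage: refine a conceptual model until a (made-up) SME review score clears a threshold.
artifact, attempts = iterate_stage(
    perform_stage=lambda attempt: {"sme_review_score": 0.6 + 0.1 * attempt},
    assess_credibility=lambda a: a["sme_review_score"] >= 0.9,
)
print(f"accepted after {attempts} iterations: {artifact}")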

In this light, I have attached a graphical workflow adapted from the spaceflight biomedical modeling workflows I've developed, implemented and published/presented over the last six and a half years (see attachment). The adaptations I made were to change terminology and add granularity to the steps to mirror the equivalent process for terrestrial medical applications. I've done something similar in my work with the ASME V&V40 standards subcommittee, and it has received general acceptance.

I realize you have a desire to maintain a standardized format, but I don't think it is a good idea to stick strictly to a list format. I think graphical flow diagrams need to be included to help capture the full picture. As such, I would like to suggest that we adapt a combination of what you've drafted and the flow diagrams I've provided as an example to share with the MSM Working Group Leads.

Thanks,

Lealem
Attachments
M&S Workflow - Lealem.pdf
(1.31 MiB) Downloaded 146 times

Ahmet Erdemir
Posts: 77
Joined: Sun Sep 10, 2006 1:35 pm

Re: Workflows & Standards & good practices

Post by Ahmet Erdemir » Mon Oct 19, 2015 6:54 am

From our discussions, I have identified a few issues that we may need to clarify. As we started our discussions, our motivation appeared to be to provide our workflows as we utilize our specific modeling & simulation strategies. As our discussions evolved, we seemed to separate the workflow into a broad workflow agnostic to the modeling & simulation strategy and a specific one indicating nuances of the selected modeling & simulation strategy. As we moved on, we seemed to be discussing the modeling context's influence on the workflow. Establishing a broader workflow is advantageous (similar to our strategy with the Ten Simple Rules discussions) for unifying the modeling & simulation community. Yet, this comes at the risk of leaving out specialized nuances that are important for the day-to-day practice of modeling & simulation. Two paths for determining workflows seem to exist:
  • A broad workflow independent of context, modeling & simulation strategy, etc. Maybe during its application to a specific problem, the level (and order) at which individual components of the workflow are conducted may incorporate context-, modeling & simulation strategy-, and risk-related nuances.
  • A specific workflow for each context, modeling & simulation strategy, etc.


It may be a good time to think a little bit harder about
  • Dependency of workflow on defined context, e.g., are the modeling & simulation workflows significantly different when one conducts modeling & simulation for research vs for the clinic? Or is it just the intensity at which a broad workflow's components are accomplished?
  • Dependency of workflow on modeling & simulation strategy, e.g., are the modeling & simulation workflows significantly different when one uses agent-based modeling strategy vs finite element analysis?
  • Dependency of workflow on associated risks, e.g., are the modeling & simulation workflows significantly different when one is at risk of generating wrong scientific knowledge vs harming a patient?
  • Dependency of workflow on discipline, e.g., are the modeling & simulation workflows significantly different in systems biology vs biomechanics?
  • Dependency of workflow on environment, e.g., are the modeling & simulation workflows significantly different when one has time and money constraints?

C. Anthony Hunt
Posts: 23
Joined: Sun Apr 21, 2013 2:18 pm

Re: Workflows & Standards & good practices

Post by C. Anthony Hunt » Mon Oct 19, 2015 2:04 pm

Ahmet, you identify good points. I’ll be interested in what different MSM Working Group members have to say. Where diversity exists, we want to identify and understand it.

About your “Two paths on determining workflows:” For now, I am most interested in a project’s initial and actual workflows (and subsequent critical paths).

Initial Workflow: the completed (by the modeler) planning documents describing how the project is expected to unfold. It may be somewhat different for a modeler-initiated project and for the future projects that we envision where the modeler is invited in at some stage.

Actual Workflow: highlights (key features) of the expanding record (maintained by the modelers) of the plan for the next task, documentation of what was actually done, and observations following task completion. Of course, during any task, new insights or information or circumstance changes may require revising planning documents. Having a record of those changes and an explanation of those changes will, I believe, be important for any future user.

Post hoc Workflows: ideally, a summary of an Actual Workflow would accompany any model made available for reuse and repurposing. I know that’s not done now (we are working in that direction), but the information contained within could be very useful and it would go a long way in supporting credibility. However, an idealized, post hoc workflow may do more harm than good.
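As a minimal sketch (not from the original post) of how an Actual Workflow record could be kept: an expanding log of what was planned, what was actually done, observations, and any plan revisions, so a future user can trace changes. All field names and the example entry below are hypothetical.

Code:

import datetime

workflow_record = []

def log_task(planned, actually_done, observations, plan_revision=None):
    """Append one entry to the actual-workflow record."""
    workflow_record.append({
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
        "planned": planned,
        "actually_done": actually_done,
        "observations": observations,
        "plan_revision": plan_revision,  # None if the plan was followed as written
    })

log_task(
    planned="mesh femur geometry at 2 mm element size",
    actually_done="re-meshed at 1.5 mm after the convergence check failed at 2 mm",
    observations="peak stress converged within 5% at 1.5 mm",
    plan_revision="tightened the mesh-size requirement in the planning document",
)
print(workflow_record[-1])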

Your five bullets under “think a little bit harder:”
My current opinion is that some of that information/insight becomes clearer when one has sufficiently specified use cases and requirements. Thereafter, I expect considerable workflow similarity. Having abridged workflows from very different MSM Consortium members will enlighten us.

Regarding Lealem’s defined conceptual model:
Where conceptual models enter (and would be identified and described) will likely be different depending on model/software artifacts produced. In our abridged workflow, conceptual models come in with most sub-sub bullets starting at 1.1.1.

I think that we need to see many examples of MSM Consortium workflows. My expectation is that workflows will become based on the artifacts produced. The workflow should document or explain the relationships between conceptual models and the artifacts produced.

Lealem Mulugeta
Posts: 42
Joined: Tue Dec 21, 2010 11:03 am

Re: Workflows & Standards & good practices

Post by Lealem Mulugeta » Tue Oct 20, 2015 12:32 am

Ahmet and Tony,

You both make great points. My general experience has been that, irrespective of the type of model or context, the workflow can be kept fairly standard. However, Ahmet makes an excellent point that it may depend on the intensity and the associated risk.

For example, if you're developing a model for general hypothesis testing out of your own personal interest, then many elements can drop off. From experience, however, I find that you may still find it useful to go through the full extent of the workflow as much as possible to give yourself some confidence that you're not overlooking anything. But you will probably not iterate as much or scrutinize certain steps as strictly.

Tony also makes a good point that we will need to see many different examples. I think from these examples we may be able to come up with workflow(s) that are more or less generalized.

Regarding the conceptual model statement, I read through the sections Tony pointed to and I am not sure if we're using the term conceptual model the same way... I am personally using it in the form defined in our glossary here: http://wiki.simtk.org/cpms/Glossary_and ... tual_model

Perhaps Tony means the same thing too, and I need help understanding this.

Thanks,

Lealem

C. Anthony Hunt
Posts: 23
Joined: Sun Apr 21, 2013 2:18 pm

Re: Workflows & Standards & good practices

Post by C. Anthony Hunt » Tue Oct 20, 2015 10:58 am

Lealem,
Regarding your 10/20 conceptual model statement: I don’t find the cited glossary definition to be very helpful. The phrase “validation experiments” implies that the theory about how the phenomena of interest are generated is relatively mature, and has survived multiple scientific falsification attempts. Most biomedical research is focused on phenomena about which considerable uncertainty remains.

In our work, a conceptual model statement starts with what I get when I ask a domain expert to provide her current best idea for how a phenomenon (on which we are focused; the phenomenon we would like to model in order to improve understanding) may be generated. She may simply hand me her recent review paper, and say, “it’s all in here.” Of course, refinement is required.

Biomedical review papers are often collections of conceptual model statements (hypotheses and supporting arguments). Pick a review journal like this http://pharmrev.aspetjournals.org. Select a review (e.g., http://pharmrev.aspetjournals.org/content/67/2/441.full). It will likely contain many conceptual models. For the same phenomenon, the conceptual models of two domain experts may be different in important ways.

Considerable work is required (framing, abstraction, assumptions, constraints, …) in order for a modeler to arrive at a description of a model system and plausible generative theory (mechanistic explanation) that is a suitable starting place (a set of descriptions, text, sketches, etc. spanning equally plausible explanatory theories) for a M&S project. That work (iterative refinement of the description) is an important part of the larger workflow. Having it documented as part of the workflow will, I believe, improve downstream credibility.

Lealem Mulugeta
Posts: 42
Joined: Tue Dec 21, 2010 11:03 am

Re: Workflows & Standards & good practices

Post by Lealem Mulugeta » Tue Oct 20, 2015 11:01 pm

Hi Tony,

I suppose it is difficult to succinctly define this term in one or two sentences, particularly for such a multidisciplinary field of application. So I understand why the definition may not be helpful to you. I understand it, perhaps because I've naturally worked in an environment where this definition was formulated.

In any event, I see where you are coming from and I think we are actually saying the same thing. Moreover, I believe our views on this issue also fit with the glossary definition. I just didn't get it right away when I read through your workflow.

With that said, I think you may have misread the definition regarding "validation experiments". It is not intended to suggest a high maturity state. As the definition reads,
The collection of abstractions, assumptions, and descriptions of physical processes representing the behavior of the reality of interest from which the mathematical model OR validation experiments can be constructed
In your case, the definition would only cover the abstractions, assumptions, and descriptions of the processes/mechanisms representing the behavior of the reality of interest from which the mathematical model is to be constructed... as you have described in your response.

Cheers,

Lealem

C. Anthony Hunt
Posts: 23
Joined: Sun Apr 21, 2013 2:18 pm

Re: Workflows & Standards & good practices

Post by C. Anthony Hunt » Fri Oct 23, 2015 12:28 pm

Lealem,

Thanks for pushing the discussion about conceptual model definition. It forced me to realize that my working definition was not well-formed. So, I sought input, commentary and suggestions from colleagues. The outcome is that we are adding the following two new draft definitions to our glossary.

n. domain of discourse - The set of concepts, the lexicon, involved in adequately discussing or understanding a theory, explanation, model, hypothesis, etc.

n. conceptual model - an organization of concepts (a subset of the domain of discourse), showing which concepts are related and how they are related. A conceptual model relies on prior knowledge and understanding of the domain of discourse and can be informal or formal. Conceptual models can be weak or strong analogies to their referents. The weak-strong axis is orthogonal to the formality axis. Analogical weaknesses become more obvious as formality increases. But formal models are not necessarily strong analogies.
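As a rough, hypothetical illustration of this draft definition (not part of the original post), a conceptual model could be represented as a small set of concepts drawn from the domain of discourse plus the relations between them; the bone-biology concept names below are invented for the example.

Code:

# The domain of discourse: all concepts available for discussing the phenomenon.
domain_of_discourse = {"osteoblast", "osteoclast", "mechanical strain",
                       "bone formation", "bone resorption", "fracture callus"}

# A conceptual model: a subset of those concepts plus the relations between them.
conceptual_model = {
    "concepts": {"osteoblast", "mechanical strain", "bone formation"},
    "relations": [
        ("mechanical strain", "stimulates", "osteoblast"),
        ("osteoblast", "drives", "bone formation"),
    ],
}

assert conceptual_model["concepts"] <= domain_of_discourse  # concepts come from the domain of discourse
for subject, verb, obj in conceptual_model["relations"]:
    print(f"{subject} --{verb}--> {obj}")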

-Tony-

Lealem Mulugeta
Posts: 42
Joined: Tue Dec 21, 2010 11:03 am

Re: Workflows & Standards & good practices

Post by Lealem Mulugeta » Sun Oct 25, 2015 2:36 pm

Hi Tony,

Not a problem, and thanks for sharing your definition. I like it!

Can you also upload it to the CPMS Glossary wiki page? It will be a great source we can draw from to come up with a Committee definition. In fact, we encourage you and anyone else to look through our list of definitions and add to it where you think we are missing something. Feel free to invite your colleagues to do the same!

I think it is these kinds of discourses that allow the Healthcare M&S community to understand each other better in order to come up with a unified lexicon. Even in cases where we may find it challenging to come up with a unified vocabulary, I believe we will be much further along in understanding each other.

Thanks!

Lealem
