Survey Design

The Committee on Credible Practice of
Modeling & Simulation in Healthcare aims to establish a task-oriented collaborative platform to outline good practice of simulation-based medicine.

Survey Design

Post by Jacob Barhak » Fri Oct 25, 2013 2:41 pm

This thread is initiated to promote open discussion on the survey design. Here is a summary of the discussions so far and a snapshot of the current situation:

The committee decided to go ahead with a survey during the October 15th conference call meeting:
https://simtk.org/websvn/wsvn/cpms/doc/ ... 131015.pdf

A survey tool on Google Drive went through several iterations and can be accessed upon request from the committee co-chairs.

To shorten design time, it was decided that the survey will be anonymous and qualify for IRB exemption. An email discussion between Lealem Mulugeta, Ahmet Erdemir, and myself formed a base design. This design is not yet finalized at the time of this posting, and with this posting interested parties are invited to join the discussion and the design process.

The survey design is on the wiki and can be accessed through this URL:
http://wiki.simtk.org/cpms/Ten_Simple_R ... vey_Design

The current action items to complete the design are:
  • Assemble a list of communities/populations to invite to the survey. To this end, the committee started a wiki page of external contacts:
    http://wiki.simtk.org/cpms/Committee_Ex ... ntact_List
    Committee members are encouraged to add contacts to this list to improve the committee outreach.
  • Write the invitation letter and other recruitment material and add those to the wiki
  • Finalize the design and answer all questions in the wiki to remove all TBD marks
  • Resolve conflicts in the design, such as the issue of multiple invitations
  • Release the survey to the advisory committee and run an internal test to see if our analysis of the results seems reasonable or if we need another approach (a rough sketch of one possible summary appears at the end of this post). If there are bugs, we want to know before we release this survey further.
Committee members are therefore invited to discuss the survey and contribute to this effort.
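
To make the internal test item concrete, below is a minimal sketch of one way the test-run responses could be summarized. It is written in Python with made-up rule names and ratings, assumes the 0-5 rating scale mentioned later in this thread, and is only an illustration, not the analysis the committee has committed to.

from statistics import mean, median

# Made-up test responses: one dict per respondent, mapping rule -> rating (0-5).
responses = [
    {"Rule 1": 5, "Rule 2": 3, "Rule 3": 4},
    {"Rule 1": 4, "Rule 2": 2, "Rule 3": 5},
    {"Rule 1": 5, "Rule 2": 4, "Rule 3": 3},
]

rules = sorted(responses[0])
for rule in rules:
    ratings = [r[rule] for r in responses]
    # Simple per-rule summaries; the committee may well settle on different statistics.
    print(f"{rule}: mean={mean(ratings):.2f}, median={median(ratings)}, n={len(ratings)}")

# Rank rules by mean rating so the result can be compared against the committee's own ranking.
ranked = sorted(rules, key=lambda rule: mean(r[rule] for r in responses), reverse=True)
print("Ranking by mean rating:", ranked)

Running something like this on the internal test responses would quickly show whether the planned analysis looks reasonable or whether another approach is needed.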


Re: Survey Design

Post by Lealem Mulugeta » Sun Oct 27, 2013 3:27 pm

Hello everyone,

In preparation for the public release of the survey, we are asking the Committee Members and the Advisory Council to complete the following tasks by November 1.
  1. Please review the public survey design for the Ten Simple Rules. If you have any feedback or updates you want to make on the survey design process outlined on the wiki page, please feel free to do so via this forum thread, or update the wiki page by using the “Edit (Text)” button at the top left corner of the Survey Design wiki page.
  2. In order for the survey results to be representative of the greater community’s perspectives, we need participation from as many of the stakeholders as possible. So we are asking everyone to help us identify and contact potential communities that can participate in the survey. We’ve started a Committee External Contact List wiki page to keep track of this, so please help us update the list.
We also welcome feedback from the greater community regarding these items.

Thanks,
Lealem


Re: Survey Design

Post by Lealem Mulugeta » Sun Oct 27, 2013 3:40 pm

Hello everyone,

This is an update regarding the action items listed in the first post by Jacob:
jbarhak wrote:
  • Write the invitation letter and other recruitment material and add those to the wiki
  • Finalize design and answer all questions in the wiki to remove all TBD marks
  • Resolve conflicts in design such in the issue of multiple invitations
  • Release the survey to the advisory committee and run an internal test run to see if our analysis of results seem reasonable or if we need another approach. If there are bugs we want to know before we release this survey further.
Invitation letters: The co-chairs are drafting a skeleton letter to be used by the committee.
Survey test: This is currently planned to happen during November 1-10.
Survey design questions: All outstanding items have been filled out, but we need input from the Advisory Council and Committee Members to finalize the document. We hope to close this out by November 10th, once the survey testing is complete.

Lealem


Re: Survey Design

Post by Tina Morrison » Mon Oct 28, 2013 11:06 am

Hello Everyone,
I'm very excited about the idea of a broad-reaching survey of our colleagues (worldwide) on M&S and the "ten simple rules". My understanding is that the survey is to get to the heart of the most crucial aspects of M&S to consider regarding its utility in general, and maybe more specifically in health care.
I'm very confused about the IRB exemption portion and the complex “design” of the survey. This survey has nothing to do with patient protection. The IRB, based on my experience applying for both exemptions and approvals for clinical research, is not responsible for maintaining or ensuring responders' anonymity. I think the approach to protecting the identity of survey respondents is wonderful and robust, but it is overkill.
Many of us are volunteering our time to contribute content to the survey and its development; the aspects laid out under the design portion seem extremely burdensome. I think further discussion is warranted before we think about consent forms, etc.
I’m on board with developing a robust, broad-reaching survey but I’m not sure I understand (and at this point I don’t support) the role of the IRB exemption.
Respectfully,
Tina


Re: Survey Design

Post by Jacob Barhak » Mon Oct 28, 2013 12:24 pm

Hi Tina,

You have a point that this survey pales in comparison to some clinical matters with regard to risk. Nevertheless, some preparation process is in order if we are to reach out to a larger population.

It is better to think of the design template in the wiki as a set of questions that help make a better survey. The better prepared we are, the better the survey will be. The tool itself is easy to program - yet getting a survey right is not trivial.

The time and effort we invest now in making this survey is worthwhile - especially if we plan to address larger populations with a large diversity. From some perspectives the survey design may seem conservative and from others it may seem too liberal - it depends on many factors and on the beholder. The committee members are welcome to post their opinions. The template on the wiki helps us with this, and since history is retained it is a good tool for visible collaborative design.

Note that this is only the start of the real effort - getting responses to a survey may be harder than imagined, and we will need the help of all committee members and the advisory committee to make the proper contacts with affiliates to promote this effort.

Also, the final results may not match the committee's ranking for various reasons, such as populations not being aware of newer technologies or using different terminology. Nevertheless, such a survey may help identify those issues and provide better information on the current state of credible practices. Moreover, it will expose the invited population to new ideas - just by asking the questions.

I hope this puts things into perspective.


Re: Survey Design

Post by Lealem Mulugeta » Wed Oct 30, 2013 11:42 pm

Hi all,

The following is an email chain between Tina and Jacob that expands on the issues they discussed in their last posts.

*********Jacob -> Tina (Time Stamp: Mon 10/28/2013 4:20 PM CDT)*********
Hi Tina,

Attached (private due to IP) you will find an IRB application for exemption for a survey from the University of Michigan.

Lealem and Ahmet saw this already before we started the first design wiki.

I got permission to share this file with the committee, yet I did not ask for permission to upload it publicly - so I cannot put it on the forum.

Since you are involved with similar matters, you may wish to look at the long list of questions, which is much longer than what is in our design wiki.

Since you work at the FDA and with government regulations, you may have better access to the sources of the current regulations regarding subject participation.

If you wish to relax the rules to avoid longer processes, then you should push this from within the government and know/learn all associated aspects.

Yet from my experience, going through this process is quite helpful for the survey design. I noted this publicly in the forum.

I hope this helps you understand the issue from a broader perspective.

Jacob

*********Tina -> Jacob (Time Stamp: Mon 10/28/2013 4:23 PM CDT)*********
Is this a concern from Univ of Michigan?

If so, then my guess is there are other groups that can sponsor/launch the survey.

I read through the questions on the study design - to me it seems like this route would be necessary for a survey conducted on patients about health care or health practices; is that not the case here?

Maybe I have a completely different understanding of the role the IRB plays in research; I thought they were involved in patient protections, not individual protections regarding anonymity of a survey. Please help me understand.

Kind regards,
Tina

*********Jacob -> Tina (Time Stamp: Mon 10/28/2013 4:50 PM CDT)*********

Hi Tina,

The University of Michigan has a certification process. I went through it a few years ago, and I am no longer affiliated with the university, so if something has changed I may not be aware of it.

From what I recall, whenever human subjects are involved there is a need for some sort of approval. At U of M the IRB handles this for surveys.

When I suggested a survey I had to go through this long process just to get IRB approval - the certification process required it. If it is important to you, I can look up the details again - they should be public, as they were on the web site, and you had to study them before taking the test.

In the example you see, this is a completely anonymous survey. Yet I had to get approval that it is actually IRB exempt. This was the procedure.

The point is that we want the survey to be IRB exempt. Therefore anonymity is important. And this is not related to U of M. If I recall correctly this is federal - yet do correct me if I am wrong - things change with time as well as memory.

I hope this helps you understand the details better.

Jacob
**********End of email discussion (posted at Lealem's request to keep the discussion on the forum rather than in private email)**********


Re: Survey Design

Post by Lealem Mulugeta » Thu Oct 31, 2013 12:16 am

Dear all,

First off, please excuse this very lengthy post. I promise it is with good reason, and I am confident it will resolve the issues that Tina, Jacob, and a few others have raised regarding the survey design protocol.

First, I do want to point out that Jacob has very good intentions in insisting that we go through the lengthy "Survey Design" protocol, which is loosely based on the University of Michigan’s protocol.

Like Tina, when Jacob first raised this issue, I was surprised that his past institution expected such a rigorous protocol for collecting non-attributable information via a survey. It defied my understanding and personal experience regarding human subject research and when one needs formal IRB approval/exemption. Anyhow, after much debate with Jacob, I decided to go ahead and complete the protocol for the sake of progress with the survey effort, and because I could have been missing something about the federal regulations on such matters.

I will admit filling out the protocol was long, complicated, and felt extremely conservative. In the environment I work in, I deal with highly sensitive information all the time under extremely conservative processes. Even in such a work environment, I can’t remember coming across a situation where I had to go through such a lengthy process to gather information similar to what our survey aims to collect. But as a sanity check, I sought advice from multiple researchers at several institutions and from IRB board members to see if I was missing something regarding Jacob’s position. Their reaction was similar to mine, but we all agreed that it was best to learn the applicable federal regulations in order to make a definitive determination of how the Committee should proceed in the future on such matters. In other words, what our gut instincts and institutional experiences suggest isn’t necessarily what we should lean on; federal regulations are what we should abide by.

So I invested a substantial amount of time reading through the applicable sections of the Code of Federal Regulations, Title 45 – Public Welfare (45 CFR), to get a definitive answer on whether the Committee needs to follow the processes suggested by Jacob for this study or for similar studies in the future. Please note that this is the same regulation the U.S. Department of Health & Human Services (which includes NIH) abides by.

According to 45 CFR, to establish our IRB status we must first determine whether our study is considered “human subject research”. Two separate determinations must be made in order to evaluate whether or not the study is IRB exempt:
  1. Can the activity be considered research? If the answer is “yes”, the investigators must follow up with a second determination:
  2. Does the research involve human subjects?
Both determinations must be made using the regulatory definitions of “research” and “human subjects” in 45 CFR 46.102(a-j); only if both are satisfied is the investigation considered NOT IRB exempt. If either criterion is not met, the investigation is considered IRB exempt in accordance with the CFR. According to 45 CFR 46.102(a-j), “research” and “human subjects” are defined as follows:
  • Research - “a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge. Activities which meet this definition constitute research, whether or not they are conducted or supported under a program which is considered research for other purposes.” Investigators unsure of whether an activity constitutes human research should contact their IRB.
  • Human subjects - “living individual(s) about whom an investigator (whether professional or student) conducting research obtains (1) data through intervention or interaction with the individual, or (2) identifiable private information.” Activities in which a researcher collects private, identifiable information about third parties would meet the definition of “human subjects.”
Depending on how strictly you interpret the definition of “research”, I can see how one might consider that our survey activity meets this definition. I would be inclined to say that our work does not satisfy it, but, as I said, it can be debated. So let's look at the second criterion.

When we look at the definition of “human subjects”, either condition (1) or (2) must be met in order to satisfy the definition per 45 CFR 46.102. In other words, both conditions must be determined “false” in order for our work not to be classified as human subject research.

Under condition (1) of the “human subjects” definition, intervention and interaction with individuals are defined as:
  • Intervention - includes both physical procedures by which data are gathered (for example, venipuncture) and manipulations of the subject or the subject's environment that are performed for research purposes. – Our situation does not satisfy this.
  • Interaction - includes communication or interpersonal contact between investigator and subject. – Our situation does not satisfy this because we are not collecting the data through direct communication or interpersonal contact with the participants.
So condition (1) of the “human subjects” definition does not apply to the Committee’s survey activity.

When we consider condition (2), we need to determine whether the following four questions we are asking of the survey participants collect “identifiable private information” per the definitions listed in the CFR:
  • What is the primary setting you work in?
  • What is your primary field of academic/professional training?
  • What is your highest level of education?
  • How familiar are you with Computational Modeling and Simulation (M&S)?
According to 45 CFR 164.514(b)(2)(i), the following are considered “identifiable private information”:

(A) Names;
(B) All geographic subdivisions smaller than a State, including street address, city, county, precinct, zip code, and their equivalent geocodes, except for the initial three digits of a zip code if, according to the current publicly available data from the Bureau of the Census:
    (1) The geographic unit formed by combining all zip codes with the same three initial digits contains more than 20,000 people; and
    (2) The initial three digits of a zip code for all such geographic units containing 20,000 or fewer people is changed to 000.
(C) All elements of dates (except year) for dates directly related to an individual, including birth date, admission date, discharge date, date of death; and all ages over 89 and all elements of dates (including year) indicative of such age, except that such ages and elements may be aggregated into a single category of age 90 or older;
(D) Telephone numbers;
(E) Fax numbers;
(F) Electronic mail addresses;
(G) Social security numbers;
(H) Medical record numbers;
(I) Health plan beneficiary numbers;
(J) Account numbers;
(K) Certificate/license numbers;
(L) Vehicle identifiers and serial numbers, including license plate numbers;
(M) Device identifiers and serial numbers;
(N) Web Universal Resource Locators (URLs);
(O) Internet Protocol (IP) address numbers;
(P) Biometric identifiers, including finger and voice prints;
(Q) Full face photographic images and any comparable images; and
(R) Any other unique identifying number, characteristic, or code;

Since the four questions we are asking of the survey participants do not fall under any of these pieces of information, we are not collecting any identifiable information.

Consequently, the definition of “human subjects” is not met. This means we pass the test for IRB exemption. Based on this, I consider the "Survey Design" protocol suggested by Jacob to be informative, but the Committee is not required to follow it, since the above information from the CFR clearly shows that we are IRB exempt and we are not dealing with any attributable data. However, we can continue to use the information we have already gathered via the "Survey Design" protocol; I think we would have collected most of it anyway in one form or another. But there should be no expectation that the Committee follow it.
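
To make the chain of determinations above easier to audit, here is a minimal sketch of the decision logic as I read the CFR. The Python is purely illustrative: the abbreviated identifier categories and the field names are my own shorthand for this discussion, not regulatory text.

# Illustrative sketch of the exemption logic discussed above; the identifier
# categories are abbreviated shorthand for 45 CFR 164.514(b)(2)(i), not the regulatory text.

IDENTIFIABLE_CATEGORIES = {
    "name", "geographic subdivision smaller than a state", "dates related to an individual",
    "telephone number", "fax number", "email address", "social security number",
    "medical record number", "health plan beneficiary number", "account number",
    "certificate/license number", "vehicle identifier", "device identifier",
    "URL", "IP address", "biometric identifier", "full face photo",
    "other unique identifying number, characteristic, or code",
}

# What the four demographic questions actually collect.
survey_fields = {
    "primary work setting",
    "primary field of training",
    "highest level of education",
    "familiarity with M&S",
}

def involves_human_subjects(intervention, interaction, collected_fields):
    """Condition (1): intervention or interaction; condition (2): identifiable private information."""
    collects_identifiable = bool(collected_fields & IDENTIFIABLE_CATEGORIES)
    return intervention or interaction or collects_identifiable

def is_human_subject_research(is_research, intervention, interaction, collected_fields):
    """Human subject research only if BOTH the 'research' and 'human subjects' definitions are met."""
    return is_research and involves_human_subjects(intervention, interaction, collected_fields)

# The committee survey: arguably research, but no intervention, no direct interaction,
# and no identifiable fields -> not human subject research -> exempt per the reading above.
print(is_human_subject_research(is_research=True, intervention=False,
                                interaction=False, collected_fields=survey_fields))  # False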

*************************************************************************************************************************************
Forward Plan Recommendation: The Committee should not be subjected to processes directly taken or partially derived from other institutions. Instead, we should follow the official federal regulatory processes as listed above from this day forward.

Academic and research institutions can impose whatever processes they see fit on their investigators. Imposing other institutional processes on the Committee is not appropriate since those institutions are not the governing body of the Committee. The Committee is under IMAG, which is directed by the NIH. Therefore, it is most appropriate that we always defer to the Code of Federal Regulations listed above, and not any other institution’s protocol.

Future data collection initiatives will proceed in accordance with 45 CFR. This will ensure that we are following ethical practices, ensure the credibility of our work, and minimize subjectively imposed processes.

Unless there is a majority disagreement with my assessment and/or recommendation, I think this should be our official process. But, in general, I am strongly against imposing a protocol similar to the current one implemented for the survey design. It is overly conservative for our purposes.
*************************************************************************************************************************************

I encourage you all to review the CFR when you get a chance. I personally found it very informative, and I feel better equipped to deal with future IRB protocols in a more streamlined manner in my personal work as well as extramural activities.

Thanks again to Jacob for raising this important issue. Without the concerns that he raised, we could have been at risk of making some bad assumptions regarding where we stood with IRB processes. Now we have the correct federal regulation code we need to follow to ensure that we are conducting our work ethically, and ensure strong credibility in our work.

Thanks for your patience, everyone.

Lealem


Re: Survey Design

Post by Jerry Myers » Thu Oct 31, 2013 12:26 pm

My apologies for my late arrival to this lively debate....

Lealem - Thank you for your very complete review of the regulations. I am in agreement with your interpretation given the context of the survey.

In the end, the four survey questions mentioned are what I would term a management or standards survey, not a research survey. The intent is to get a snapshot of the cross-section of the participants, which is meant to give the survey owner an understanding of the quality of the survey. As such, I believe it does not meet criterion 1 for an IRB, as Lealem has described from the regulations.

Given the time invested in this discussion thus far, I would propose that the committee consider our due diligence performed and agree that the regulations that govern this activity for IMAG have been met. Further, we should table any discussion regarding the need for a more complex survey design protocol until it is determined that the committee will need more than this initial simple survey.


Re: Survey Design

Post by Jacob Barhak » Thu Oct 31, 2013 11:01 pm

Lealem should be congratulated for his very detailed work. He has defined well the boundaries of our operation with this survey.

Note that new technological tools may soon change these boundaries. Nevertheless, the current survey design as presented in the wiki is very reasonable considering the current definitions.

I have no experience with focus groups, so this issue should be explained to me in the context of the survey. Yet I know that there are opportunities to conduct those - I would appreciate a discussion on this topic.

We still have to create recruitment material and review it as part of the design.

We also have to test the survey internally before release. I have concerns regarding anonymity with a Google tool, yet those can be easily tested before release, and there are plenty of easy alternatives for survey tools.

I look forward to further discussion at the meeting.

Jacob


Re: Survey Design

Post by Martin Steele » Fri Nov 01, 2013 4:16 am

jbarhak wrote: The survey design is on the wiki and can be accessed through this URL:
http://wiki.simtk.org/cpms/Ten_Simple_R ... vey_Design
Jacob, I'm intentionally keeping this short (for the moment): why do you want to normalize the responses of each modeler? What justifies that? The raw results from the survey will only be categories 0, 1, 2, 3, 4, and 5, which really don't have any numerical meaning.
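
To make the question concrete, here is a rough sketch of the difference between a raw category tally and a per-respondent normalization. The numbers are made up, and the wiki's actual normalization scheme may well differ; the point is only that normalizing treats the 0-5 categories as numbers.

from collections import Counter

# One hypothetical respondent's ratings of three rules on the 0-5 categorical scale.
ratings = {"Rule 1": 5, "Rule 2": 1, "Rule 3": 4}

# Raw treatment: simply tally how many times each category was chosen.
print(Counter(ratings.values()))  # Counter({5: 1, 1: 1, 4: 1})

# Per-respondent normalization: divide by the respondent's total so every respondent
# contributes equal weight. This treats the categories as numeric values, which is
# exactly what is being questioned here.
total = sum(ratings.values())
normalized = {rule: value / total for rule, value in ratings.items()}
print(normalized)  # {'Rule 1': 0.5, 'Rule 2': 0.1, 'Rule 3': 0.4}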
