Search found 83 matches
- Mon Dec 02, 2019 9:28 am
- Forum: OpenMM
- Topic: mixture Gaussian force with custom force
- Replies: 6
- Views: 541
Re: mixture Gaussian force with custom force
I will take a look, thanks for your help!
- Tue Nov 26, 2019 12:19 pm
- Forum: OpenMM
- Topic: mixture Gaussian force with custom force
- Replies: 6
- Views: 541
Re: mixture Gaussian force with custom force
Looks great! Does "CustomTorsionForce" support multiple torsions? I found that there is an example in the documentation: CustomTorsionForce* force = new CustomTorsionForce("0.5*k*(theta-theta0)^2"); I am wondering if it is possible to define two thetas and define the following expression: CustomTorsion...
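A possible sketch (not from the thread): since CustomTorsionForce evaluates a single theta per term, one way to couple two dihedrals in a single expression is CustomCompoundBondForce, which exposes dihedral(p1,p2,p3,p4) inside the energy string. The snippet below assumes a system has already been built and uses hypothetical parameter values; the five particles cover both overlapping dihedrals from this thread.

    from simtk.openmm import CustomCompoundBondForce

    # dihedral(p1,p2,p3,p4) -> atoms (4,6,8,14); dihedral(p2,p3,p4,p5) -> atoms (6,8,14,16)
    energy = ("0.5*k*((dihedral(p1,p2,p3,p4)-theta1_0)^2"
              " + (dihedral(p2,p3,p4,p5)-theta2_0)^2)")
    force = CustomCompoundBondForce(5, energy)
    force.addPerBondParameter("k")
    force.addPerBondParameter("theta1_0")
    force.addPerBondParameter("theta2_0")
    force.addBond([4, 6, 8, 14, 16], [10.0, 1.0, -1.0])  # hypothetical k and reference angles
    system.addForce(force)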
- Tue Nov 26, 2019 10:10 am
- Forum: OpenMM
- Topic: mixture Gaussian force with custom force
- Replies: 6
- Views: 541
Re: mixture Gaussian force with custom force
It is a bonded force involving two sets of 4 atoms: the first set (defining the first dihedral) with indices (4,6,8,14) and the second set (defining the second dihedral) with indices (6,8,14,16).
- Tue Nov 26, 2019 9:08 am
- Forum: OpenMM
- Topic: mixture Gaussian force with custom force
- Replies: 6
- Views: 541
mixture Gaussian force with custom force
Hi, I am trying to implement a mixture Gaussian force expressed as $\sum_{i} V_i \exp\left(-\frac{(\phi - \phi_i)^2}{2\sigma_i^2}\right)$, where $\phi$ is a dihedral angle of 4 atoms (out of many atoms) in a protein, and $\phi_i$ and $\sigma_i$ are the Gaussian parameters. Does anyone know what might be the best way ...
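For a single dihedral, one plausible route (a sketch, not the thread's confirmed answer) is CustomTorsionForce with the mixture written out term by term. V1, phi1, sigma1, etc. are per-torsion parameters with hypothetical values here; note that theta is in radians in (-pi, pi], so the wrap-around of theta - phi_i may need care.

    from simtk.openmm import CustomTorsionForce

    # Two-component Gaussian mixture on one dihedral
    energy = ("V1*exp(-(theta-phi1)^2/(2*sigma1^2))"
              " + V2*exp(-(theta-phi2)^2/(2*sigma2^2))")
    force = CustomTorsionForce(energy)
    for name in ("V1", "phi1", "sigma1", "V2", "phi2", "sigma2"):
        force.addPerTorsionParameter(name)
    force.addTorsion(4, 6, 8, 14, [5.0, 1.0, 0.3, 2.0, -1.0, 0.5])  # hypothetical values
    system.addForce(force)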
- Tue Jul 23, 2019 10:38 am
- Forum: OpenMM
- Topic: CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
- Replies: 4
- Views: 706
Re: CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
I have tried that, but it did not work. I wonder if there is some issue related to Plumed, as all the simulations are run with it. If I run the simulations from the command line (i.e., implement a command-line API and use `subprocess.check_output()` to run each simulation), then parallelization works fine. ...
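The command-line workaround described above might look roughly like this (a sketch; run_simulation.py is a hypothetical CLI wrapper around the Simulation class, and the GPU count is arbitrary):

    import subprocess
    from multiprocessing import Pool

    def run_one(gpu_idx):
        # each call launches an independent process, so each gets a fresh CUDA context
        return subprocess.check_output(
            ["python", "run_simulation.py", "--gpu_idx", str(gpu_idx)])

    if __name__ == "__main__":
        with Pool(4) as pool:
            pool.map(run_one, range(4))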
- Mon Jul 22, 2019 11:00 am
- Forum: OpenMM
- Topic: CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
- Replies: 4
- Views: 706
Re: CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
Sure, basically I have defined the following class:

    class Simulation(object):
        def __init__(self):
            pass

        def run(self, output_dcd=None, n_checkpoints=10, gpu_idx='0', start_xyz=None,
                total_steps=1000000, interval=10000, equi_steps=100000,
                config_file='config.yml', plumed_file=None):
            spec = yaml.load(open(...
- Thu Jul 18, 2019 2:06 pm
- Forum: OpenMM
- Topic: CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
- Replies: 4
- Views: 706
CUDA_ERROR_NOT_INITIALIZED error when using Multiple GPUs with multiprocessing
Hi, I am trying to run multiple simulations on multiple GPUs in parallel. I do this by creating a bunch of Simulation objects and calling the run() function on each of them with different parameters using multiprocessing.Process(), but I get the following error message: Exception: Error initializing Contex...
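For context (an assumption about the cause, not confirmed in this thread): CUDA state generally cannot survive a fork, so a commonly suggested fix is the "spawn" start method, with each Context created inside the child process. A minimal sketch, reusing the Simulation class shown above:

    import multiprocessing as mp

    def worker(gpu_idx):
        # build the Simulation (and hence the CUDA Context) inside the child process
        sim = Simulation()
        sim.run(gpu_idx=str(gpu_idx))

    if __name__ == "__main__":
        ctx = mp.get_context("spawn")  # avoid fork-inherited CUDA state
        procs = [ctx.Process(target=worker, args=(i,)) for i in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()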
- Wed Jun 26, 2019 10:31 am
- Forum: OpenMM
- Topic: periodic boundary conditions for setup with Charmm files?
- Replies: 1
- Views: 262
Re: periodic boundary conditions for setup with Charmm files?
Never mind, I found setBox() for the psf file.
Thanks!
- Wed Jun 26, 2019 10:25 am
- Forum: OpenMM
- Topic: periodic boundary conditions for setup with Charmm files?
- Replies: 1
- Views: 262
periodic boundary conditions for setup with Charmm files?
Hi, I am trying to set up protein-ligand binding simulations with Charmm files (psf, str, prm) which already contain solvent and ions. But when I specify "nonbondedMethod=PME" in system = psf.createSystem(params, nonbondedMethod=PME, nonbondedCutoff=1*nanometer, constraints=HBonds), it generates e...
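Per the reply above, the fix found was CharmmPsfFile.setBox(), which supplies the periodic box that PME needs. A minimal sketch with hypothetical file names and box lengths:

    from simtk.openmm.app import CharmmPsfFile, CharmmParameterSet, PME, HBonds
    from simtk.unit import nanometer

    psf = CharmmPsfFile('system.psf')
    params = CharmmParameterSet('toppar.str', 'par.prm')
    psf.setBox(5.0*nanometer, 5.0*nanometer, 5.0*nanometer)  # use your actual box dimensions
    system = psf.createSystem(params, nonbondedMethod=PME,
                              nonbondedCutoff=1*nanometer, constraints=HBonds)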
- Mon Mar 18, 2019 12:29 pm
- Forum: OpenMM
- Topic: start simulation from specific frame from dcd file?
- Replies: 2
- Views: 275
Re: start simulation from specific frame from dcd file?
Great, that works, thank you!
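The working approach isn't quoted in the snippet; one common pattern (a sketch assuming mdtraj, with hypothetical file names and frame index) is to load the frame and hand its coordinates to the context:

    import mdtraj as md
    from simtk.unit import nanometer

    # load only frame 100 of the trajectory (mdtraj coordinates are in nm)
    frame = md.load_frame('traj.dcd', index=100, top='system.pdb')
    simulation.context.setPositions(frame.xyz[0] * nanometer)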