<div dir="ltr"><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><b><span style="color:black">OpenSim 4.2 Available for Download</span></b></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="color:black">We are pleased to announce </span><a href="https://simtk.org/frs/?group_id=91" title="https://simtk.org/frs/?group_id=91" target="_blank">OpenSim 4.2</a><span style="color:black">, the newest version of the OpenSim software. The new release makes it easier to use inertial measurement unit (IMU) data and the state-of-the-art direct collocation method with OpenSim. OpenSense, our workflow for working with IMU data, is now directly available in the OpenSim application (GUI), where you can visualize your IMU data and compute inverse kinematics. OpenSim Moco, our workflow for using direct collocation, is now available directly through OpenSim's Python, MATLAB, and C++ interfaces. 
Read more about the </span><a href="https://simtk-confluence.stanford.edu/pages/viewpage.action?pageId=48988768" title="https://simtk-confluence.stanford.edu/pages/viewpage.action?pageId=48988768" target="_blank">new features</a><span style="color:black"> and </span><a href="https://simtk.org/frs/?group_id=91" title="https://simtk.org/frs/?group_id=91" target="_blank">download OpenSim 4.2</a><span style="color:black">.</span></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><b><span style="color:black;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><br></span></b></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><b><span style="color:black;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial">Rice Computational Neuromechanics Lab Openings for PhD Students and Post-doctoral Researcher </span></b></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="color:black">The </span><a href="https://rcnl.rice.edu/" target="_blank">Rice Computational Neuromechanics Lab</a><span style="color:black">, directed by Dr. B.J. Fregly, is looking for two PhD students and one post-doctoral researcher to work on a new five-year project with OpenSim. The project will develop innovative software technology that allows engineers working in collaboration with clinicians to design effective personalized surgical and neurorehabilitation interventions for movement impairments. 
These impairments often result from clinical conditions such as stroke, osteoarthritis, spinal cord injury, and even cancer. The proposed software will create a virtual representation of the patient’s neuromusculoskeletal system and then apply virtual treatments to the virtual patient to identify the treatment design that will maximize post-treatment movement function. </span><a href="https://simtk.org/opportunities.php" target="_blank">Learn more and apply</a><span style="color:black">.</span></p><div><div style="color:rgb(80,0,80)"><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><b><span style="color:black"><br></span></b></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><b><span style="color:black">Apply to Participate in Virtual Office Hours for Biomechanical Modeling or Machine Learning Research Questions</span></b></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><i><span style="color:black">Applications due: April 9, 2021, 5 PM local time</span></i></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><i><span style="color:black">Meeting dates: Weeks of April 19 and April 26, 2021</span></i></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span 
style="color:black;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><br></span></p><p style="margin:0in 0in 0.0001pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial"><span style="color:black;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial">We encourage you to apply to our </span><span style="color:black">April Virtual Office Hours, which support researchers working with wearable sensors, video technology, and other modalities in rehabilitation research. These office hours are offered as part of the training efforts of our <a href="https://restore.stanford.edu/" target="_blank"><span style="color:rgb(5,99,193)">Restore</span></a> and <a href="https://mobilize.stanford.edu/" target="_blank"><span style="color:rgb(5,99,193)">Mobilize</span></a> Centers. We will offer two tracks: 1) biomechanical modeling with <a href="https://opensim.stanford.edu/" target="_blank">OpenSim</a> and <a href="https://simtk-confluence.stanford.edu/display/OpenSim/OpenSense+-+Kinematics+with+IMU+Data" target="_blank">IMUs</a> or video, and 2) machine learning, including video analysis. We support all phases of a research project, from formulating a research question, to choosing and planning appropriate methods, to troubleshooting the study as you carry it out. Individuals and teams alike are welcome and encouraged to apply. 
<a href="https://mobilize.stanford.edu/2021/03/12/apply-to-participate-in-virtual-office-hours-for-biomechanical-modeling-or-machine-learning-research-questions-2/" title="https://mobilize.stanford.edu/2021/03/12/apply-to-participate-in-virtual-office-hours-for-biomechanical-modeling-or-machine-learning-research-questions-2/" target="_blank"><span style="color:rgb(5,99,193)">Learn more and apply</span></a>.</span></p></div></div><div><br></div>-- <br><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature"><div dir="ltr"><div dir="ltr" style="color:rgb(136,136,136)"><div dir="ltr"><b><font color="#333333">Jennifer Hicks, Ph.D.</font></b><div><font color="#333333">Director of Data Science</font><span style="color:rgb(51,51,51)"> </span><span style="color:rgb(51,51,51)">|</span><span style="color:rgb(51,51,51)"> <a href="http://mobilize.stanford.edu/" style="color:rgb(17,85,204)" target="_blank">Mobilize Center</a></span></div><div><font color="#333333">Associate Director | <a href="http://www.stanford.edu/group/opensim/about/index.html" style="color:rgb(17,85,204)" target="_blank">NCSRR</a><br>R&D Manager | <a href="http://opensim.stanford.edu/" style="color:rgb(17,85,204)" target="_blank">OpenSim</a> </font></div><div><font color="#333333">Stanford University <br><a value="+16504984403" style="color:rgb(34,34,34)">650-498-4403</a> | <a href="mailto:jenhicks@stanford.edu" style="color:rgb(17,85,204)" target="_blank">jenhicks@stanford.edu</a></font></div></div></div></div></div></div>