--- a
+++ b/docs/index.html
@@ -0,0 +1,129 @@
+---
+layout: default
+overview: true
+---
+<style>
+.logos {
+  border-radius: 25px;
+  text-align: center;
+  margin-top: 0.5em;
+  -webkit-box-shadow: 0 3px 10px rgba(0,0,0,0.1);
+  -moz-box-shadow: 0 3px 10px rgba(0,0,0,0.1);
+  box-shadow: 0 3px 10px rgba(0,0,0,0.1);
+}
+.logos img {
+  display: inline-block;
+  max-width: 230px;
+  max-height: 45px;
+  width: auto;
+  height: auto;
+  padding-left: 0.9em;
+  padding-right: 0.9em;
+  margin-top: 0.3em;
+  margin-bottom: 0.3em;
+}
+</style>
+<section class="intro">
+  <div class="grid">
+    <div class="unit one-third center-on-mobiles" style="padding-top: 1em">
+      <img src="https://s3.amazonaws.com/osim-rl/videos/running.gif" />
+    </div>
+    <div class="unit two-thirds center-on-mobiles" style="padding-top: 1em; padding-left: 1em">
+      <p class="first">Reinforcement learning with musculoskeletal models in OpenSim</p>
+    </div>
+  </div>
+</section>
+<section class="features">
+  <div class="grid">
+    <div class="unit golden-large">
+      <h2>NeurIPS 2019: Learn to Move - Walk Around</h2>
+      <p>
+        Design artificially intelligent controllers for the human body that accomplish diverse locomotion tasks. Participate in the NeurIPS 2019 challenge to win prizes and fame.
+      </p>
+      <a href="/docs/nips2019/">Learn more about the challenge →</a>
+    </div>
+    <div class="unit golden-small">
+      <h2>OpenSim RL</h2>
+      <p>
+        Use our musculoskeletal reinforcement learning environment for other projects in computer science, neuroscience, biomechanics, and beyond.
+      </p>
+      <a href="/docs/home/">Learn more about osim-rl →</a>
+    </div>
+    <!--div class="clear"></div>
+    <div class="unit whole center-on-mobiles logos" style="background-color: #ffffff;">
+      <a href="https://news.stanford.edu/2017/08/07/virtual-competitors-vie-different-kind-athletic-title/"><img src="https://s3.amazonaws.com/osim-rl/logos/stanford.png" /></a>
+      <a href="https://people.eecs.berkeley.edu/"><img src="https://s3.amazonaws.com/osim-rl/logos/berkeley.png" /></a>
+      <a href="https://actu.epfl.ch/news/crowdai-the-open-data-science-challenge-platform/"><img src="https://s3.amazonaws.com/osim-rl/logos/epfl.png" /></a>
+      <a href="http://mobilize.stanford.edu/"><img src="https://s3.amazonaws.com/osim-rl/logos/mobilize.png" /></a>
+      <a href="https://www.crowdai.org/challenges/nips-2018-ai-for-prosthetics-challenge"><img src="https://s3.amazonaws.com/osim-rl/logos/crowdai.png" /></a>
+      <a href="https://news.developer.nvidia.com/nvidia-sponsors-learning-to-run-ai-competition-at-nips-2017/"><img src="https://s3.amazonaws.com/osim-rl/logos/nvidia.png" /></a>
+      <br />
+      <a href="https://aws.amazon.com/blogs/machine-learning/nips-2017-challenge-pushes-deep-learning-to-improve-surgical-outcomes/"><img src="https://s3.amazonaws.com/osim-rl/logos/aws.png" /></a>
+      <a href="https://techcrunch.com/2017/08/07/dueling-ais-compete-in-learning-to-walk-secretly-manipulating-images-and-more-at-nips/"><img src="https://s3.amazonaws.com/osim-rl/logos/tc.png" /></a>
+      <a href="https://cloud.google.com/"><img src="https://s3.amazonaws.com/osim-rl/logos/gcp.png" /></a>
+      <a href="https://www.tri.global/"><img src="https://s3-eu-west-1.amazonaws.com/kidzinski/nips-challenge/tri1.png" /></a>
+      <a href="http://opensim.stanford.edu/about/"><img src="https://s3.amazonaws.com/osim-rl/logos/ncsrr.png" /></a>
+    </div-->
+  </div>
+</section>
+<section class="quickstart">
+  <div class="grid">
+    <div class="unit one-fifth center-on-mobiles">
+      <h3>Get up and running <em>in seconds</em>.</h3>
+    </div>
+    <div class="unit four-fifths code">
+      <p class="title">Quick-start Instructions</p>
+      <div class="shell">
+        <p class="line">
+          <span class="path">~</span>
+          <span class="prompt">$</span>
+          <span class="command">conda create -n opensim-rl -c kidzik opensim python=3.6.1</span>
+        </p>
+        <p class="line">
+          <span class="path">~</span>
+          <span class="prompt">$</span>
+          <span class="command">source activate opensim-rl</span>
+        </p>
+        <p class="line">
+          <span class="path">~(opensim-rl)</span>
+          <span class="prompt">$</span>
+          <span class="command">conda install -c conda-forge lapack git</span>
+        </p>
+        <p class="line">
+          <span class="path">~(opensim-rl)</span>
+          <span class="prompt">$</span>
+          <span class="command">pip install git+https://github.com/stanfordnmbl/osim-rl.git</span>
+        </p>
+        <p class="line">
+          <span class="path">~(opensim-rl)</span>
+          <span class="prompt">$</span>
+          <span class="command">python</span>
+        </p>
+        <p class="line">
+          <span class="output">from osim.env import ProstheticsEnv<br/>
+env = ProstheticsEnv(visualize=True)<br/>
+observation = env.reset()<br/>
+for i in range(200):<br/>
+    observation, reward, done, info = env.step(env.action_space.sample())</span>
+        </p>
+      </div>
+    </div>
+    <div class="clear"></div>
+  </div>
+</section>
+<section class="free-hosting" style="padding-bottom: 1em; padding-top: 1em">
+  <div class="grid">
+    <h2>NIPS 2017: Learning to Run challenge</h2>
+    <p>In 2017 we used osim-rl in a challenge at NIPS where participants were asked to build controllers for running. They did great :)</p>
+    <div class="pane-content">
+      <div style="width: 100%; padding-bottom: 75%; position: relative;">
+        <iframe style="position: absolute; top: 0; left: 0; width: 100%; height: 100%;" src="https://www.youtube.com/embed/rhNxt0VccsE" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>
+      </div>
+    </div>
+    <div class="clear"></div>
+  </div>
+</section>
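The interaction snippet in the quick-start follows the standard Gym-style `reset`/`step` protocol, where `step` returns an `(observation, reward, done, info)` tuple. A minimal, self-contained sketch of that loop, using a hypothetical `DummyEnv` stand-in (since `ProstheticsEnv` itself requires the OpenSim conda packages to import):

```python
# Gym-style reset/step loop, as used by osim-rl environments.
# DummyEnv is a hypothetical stand-in for osim.env.ProstheticsEnv.
class DummyEnv:
    def __init__(self, horizon=50):
        self.horizon = horizon  # steps per episode
        self.t = 0

    def reset(self):
        # Restart the episode and return the initial observation.
        self.t = 0
        return [0.0]

    def step(self, action):
        # Advance one step and return the usual 4-tuple.
        self.t += 1
        observation = [float(self.t)]
        reward = 1.0
        done = self.t >= self.horizon
        return observation, reward, done, {}

env = DummyEnv()
observation = env.reset()
total_reward = 0.0
for step in range(200):
    # Unpack into distinct names so the loop variable
    # is not clobbered by the returned info dict.
    observation, reward, done, info = env.step(None)
    total_reward += reward
    if done:
        observation = env.reset()
print(total_reward)  # 200.0
```

With a real osim-rl environment, `env.action_space.sample()` replaces the `None` action, and `observation` is the musculoskeletal state the controller acts on.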