# 𝜂𝜇Sim

![Fig1_github](https://github.com/saxenalab-neuro/muSim/assets/77393494/aefcb769-7427-4654-be72-08e1d6f59642)

Training LSTMs and ANNs to perform tasks with musculoskeletal models. Environments include a monkey model performing cycling.

Please cite the following paper if using uSim/nuSim in your work:

Link to the corresponding paper: https://www.biorxiv.org/content/10.1101/2024.02.02.578628v1
## Installation

We highly recommend a Linux system for easy installation.

First, you will need to install MuJoCo (an older version). Please make sure that Anaconda and git are also installed on your system.

1. Download the library using this link: https://mujoco.org/download/mujoco210-linux-x86_64.tar.gz

2. Create a hidden folder called `.mujoco` in your home directory (replacing the path with the path on your computer):

    `mkdir /home/username/.mujoco`

3. Extract the downloaded library into the newly created hidden folder:

    `tar -xvf mujoco210-linux-x86_64.tar.gz -C ~/.mujoco/`

4. Open the .bashrc file in your home directory:

    `nano .bashrc`

5. Once in the .bashrc file, add the following line, replacing the path with your true home directory:

    `export LD_LIBRARY_PATH=/home/username/.mujoco/mujoco210/bin`

6. If your system has an NVIDIA GPU, add this line as well:

    `export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/nvidia`

7. Save, close, then source the .bashrc using the following command:

    `source ~/.bashrc`

8. Reboot your system to ensure the changes take effect.

9. Create a new environment using conda:

    `conda env create --name nusim --file=requirements.yml`

10. Activate the conda environment:

    `conda activate nusim`
**If you encounter errors, installing the following libraries may help**

1. On Linux (and possibly macOS as well), additional system packages will likely be necessary. Here is a list of possible packages:

    * patchelf
    * python3-dev
    * build-essential
    * libssl-dev
    * libffi-dev
    * libxml2-dev
    * libxslt1-dev
    * zlib1g-dev
    * libglew1.5
    * libglew-dev
    * libosmesa6-dev
    * libgl1-mesa-glx
    * libglfw3

2. Lastly, within the conda environment, additional Python packages are necessary to ensure that training can run:

    * cython
    * matplotlib
    * scipy
    * torch
    * PyYaml
    * configargparse
    * numpy
    * gym
    * pandas
    * pyquaternion
    * scikit-video
## Basic Usage for the Monkey Cycling Task

<p align="center"> <img src="https://github.com/saxenalab-neuro/muSim/assets/77393494/2073cc37-c44a-4558-82ae-a0c54e5573c4" width="50%" height="50%"> </p>

1. To train the controller, run the following in a terminal:

    `python append_musculo_targets.py`

    `python find_init_pose.py --config configs/configs.txt`

    `python main.py --config configs/configs.txt`

    This will save the controller in ./checkpoint as training iterates. The highest episode reward should reach >= 55000 for good kinematic accuracy.

    The episode reward over training iterations should look like this (there may be slight variations due to the random seed, but the trend should be similar):

    <p align="center"> <img src="https://github.com/saxenalab-neuro/muSim/assets/77393494/d3a7578c-035d-4a8c-b87b-853e3d03187c" width="50%" height="50%"> </p>
## General Usage

**Inputs:**

**Musculoskeletal Model:**

1. Save the MuJoCo musculoskeletal model in `./musculoskeletal_model/` as `musculoskeletal_model.xml`, along with the Geometry files.

    (The path to musculoskeletal_model.xml can also be specified in the configs.txt file with the *musculoskeletal_model_path* param, if not using the above default path.)

2. To convert a musculoskeletal model from OpenSim to MuJoCo, please refer to MyoConverter: https://github.com/MyoHub/myoconverter
**Experimental Kinematics:**

1. Save the experimental kinematics in `./kinematics_data/kinematics.pkl` as a Python dict object with the following format:

    ```
    dict{
        <'marker_names'> : <['marker_name_1', ..., 'marker_name_n']>,
        <'train'> : <dict_train>,
        <'test'> : <dict_test>
    }
    ```

2. <'marker_names'> contains a list of names of the experimental markers that were recorded. Each marker_name must correspond to a body name in the musculoskeletal model xml file.

3. <dict_train> and <dict_test> are Python dictionary objects that contain kinematics in the following format:

    ```
    {
        <key> : <value>,
        <key> : <value>,
        ...
        <key> : <value>
    }
    ```

    <key: int> is the integer index of the corresponding condition (starting from 0 for the first condition, for both the training and testing conditions).

    <value: numpy.ndarray> contains the kinematics for the corresponding condition with shape [num_markers/targets, num_coordinates = 3, timepoints].

    num_markers is the number of experimental markers/bodies that were recorded. The order of the markers must match the order in which the marker_names are listed. For example, if 'marker_names' = ['hand', 'elbow'], marker index 0 should contain the experimental kinematics for the hand and marker index 1 the experimental kinematics for the elbow.

    num_coordinates are the x [→], y [↑] and z [out of page] coordinates. A value of NaN for any coordinate will keep that coordinate locked.

An example of saving the experimental kinematics for the cycling task is given in ./exp_kin_cycling/saving_exp_kin.ipynb

(The path to the kinematics.pkl file can also be specified using the *kinematics_path* param in the configs.txt file.)
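The format above can be sketched in Python as follows. The marker names, condition counts, and array sizes here are illustrative placeholders, not values from the repo:

```python
import os
import pickle

import numpy as np

def make_condition(num_markers=2, timepoints=200):
    # Shape: [num_markers, num_coordinates = 3 (x, y, z), timepoints]
    return np.zeros((num_markers, 3, timepoints))

# Keys and nesting follow the kinematics.pkl format described above.
kinematics = {
    "marker_names": ["hand", "elbow"],  # must match body names in the xml model
    "train": {0: make_condition(), 1: make_condition()},  # condition indices start at 0
    "test": {0: make_condition()},
}

os.makedirs("kinematics_data", exist_ok=True)
with open("kinematics_data/kinematics.pkl", "wb") as f:
    pickle.dump(kinematics, f)
```

Setting a coordinate to `np.nan` locks that coordinate, as described above.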
**Neural Data (optional):**

1. Save the recorded neural data for the training and testing conditions in `./nusim_neural_data/neural_activity.pkl` as a Python dict object:

    ```
    dict{
        <'train'> : <dict_train>,
        <'test'> : <dict_test>
    }
    ```

2. <dict_train> and <dict_test> are Python dictionary objects that contain the neural data in the following format:

    <key: int> is the integer index of the corresponding condition, as in the kinematics file.

    <value: numpy.ndarray> contains the recorded firing rates with shape [timepoints, num_neurons], where num_neurons is the total number of recorded neurons.

Note: If this step is omitted, post-processing analyses that require recorded neural data, such as CCA, will not run, and nuSim training will also not proceed.

(nusim_data_path can also be specified in the configs.txt file.)
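A minimal sketch of saving this structure; the neuron and timepoint counts are illustrative placeholders:

```python
import os
import pickle

import numpy as np

timepoints, num_neurons = 200, 50  # illustrative sizes

# Condition indices must match those used in kinematics.pkl.
neural_activity = {
    "train": {0: np.zeros((timepoints, num_neurons)),
              1: np.zeros((timepoints, num_neurons))},
    "test": {0: np.zeros((timepoints, num_neurons))},
}

os.makedirs("nusim_neural_data", exist_ok=True)
with open("nusim_neural_data/neural_activity.pkl", "wb") as f:
    pickle.dump(neural_activity, f)
```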
**Stimulus Data (optional):**

Provide any experimental stimulus data in `./stimulus_data/stimulus_data.pkl` as a Python dict object:

```
dict{
    <'train'> : <dict_train>,
    <'test'> : <dict_test>
}
```

<dict_train> and <dict_test> are Python dictionary objects that contain the experimental stimulus data in the following format:

<key: int> is the integer index of the corresponding condition, as in the kinematics file.

<value: numpy.ndarray> contains the recorded stimulus data with shape [timepoints, num_features], where num_features is the number of features in that stimulus.
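The structure mirrors the neural-data file; as a brief sketch (sizes are illustrative placeholders):

```python
import os
import pickle

import numpy as np

timepoints, num_features = 200, 3  # illustrative sizes

stimulus = {
    "train": {0: np.zeros((timepoints, num_features))},
    "test": {0: np.zeros((timepoints, num_features))},
}

os.makedirs("stimulus_data", exist_ok=True)
with open("stimulus_data/stimulus_data.pkl", "wb") as f:
    pickle.dump(stimulus, f)
```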
**Initial Pose (optional):**

Save the initial pose (containing the qpos and qvel) as numpy arrays in `./inital_pose/` as qpos.npy and qvel.npy, each with shape [nq, ], where nq is the number of joints in the xml model.

This step is optional. If omitted, the default initial pose of the xml model will be used for CMA-ES and IK.

(initial_pose_path can also be specified in the configs.txt file.)
**Specifications:**

Provide the parameters for the various modules using the ./configs/configs.txt file. The details of each parameter/specification are given in the configs.txt file.
**General Usage:**

**Inverse Kinematics:**

1. **Append the xml model with targets:**

    Run:

    `python append_musculo_targets.py`

    This will append targets to the musculoskeletal xml file; these targets will follow the preprocessed marker kinematics during simulation.

2. **Find the initial pose for the xml model using CMA-ES and Inverse Kinematics:**

    a. Run the following command in the terminal:

    `python find_init_pose.py --config configs/configs.txt --visualize True`

    This will use inverse kinematics (IK) to find the initial pose for the xml model that matches the initial timepoint of the target kinematics.

    If you see the output 'Initial Pose found and saved', skip step 2b.

    b. Run:

    `python find_init_pose_ik_cma.py --config configs/configs.txt --visualize True`

    This will use CMA-ES optimization with IK to find a good initial pose for the xml model.

    If you see 'Initial Pose found and saved using CMA-ES and Inverse Kinematics', proceed to the next step.

    Otherwise, provide a good initial pose for the xml model that preferably starts near the initial marker/target position.
3. **Visualize the target/marker trajectories using a randomly initialized uSim network:**

    Run:

    `python main.py --config configs/configs.txt --visualize True --mode test`

    This will visualize the target trajectories using a randomly initialized uSim controller network. Make sure the target trajectories look as desired. Otherwise, change the kinematics preprocessing parameters (e.g. trajectory_scaling, center) in the ./configs/configs.txt file.
4. **Visualize the musculoskeletal model trajectory and save the corresponding sensory feedback:**

    Run:

    `python visualize_trajectories_ik.py --config configs/configs.txt --visualize True`

    This will visualize the xml model following/tracking the training target trajectories. Before proceeding, make sure that the target trajectories are feasible and lie within the bounds of the xml model. Otherwise, adjust the target trajectories using the kinematics preprocessing parameters in the configs.txt file.

    This will also save the generated sensory feedback in ./test_data/sensory_feedback_ik.pkl as a Python dict object:

    <key: int> is the integer index of the corresponding training condition.

    <value: numpy.ndarray> has shape [timepoints, num_of_state_feedback_variables].

    This can be used to obtain proprioception for training neural networks.
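A minimal sketch of reading the saved feedback back for downstream use; the helper name is ours, not part of the repo:

```python
import pickle

def load_sensory_feedback(path="./test_data/sensory_feedback_ik.pkl"):
    """Return {condition_index: ndarray of shape [timepoints, num_state_feedback_variables]}."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    for condition, feedback in data.items():
        # Each entry is one training condition's state-feedback time series.
        assert isinstance(condition, int) and feedback.ndim == 2
    return data
```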
**Training the uSim Controller using DRL:**

**(Make sure the DRL/SAC-related parameters are specified correctly in the configs.txt file.)**

1. To train the uSim controller using the provided DRL algorithm, run:

    `python main.py --config configs/configs.txt`

2. To continue training from a previous session, run:

    `python main.py --config configs/configs.txt --load_saved_nets_for_training True`
**Testing the uSim Controller:**

To test the trained uSim controller, run:

`python main.py --config configs/configs.txt --mode test --visualize True`

This will visualize the xml model performing movements for the training and testing conditions using the trained uSim controller. It will also save the files used for the post-training analyses.
**Post Training Analyses:**

After training, the following modules are used for various analyses. All of these modules are in ./Analysis.

1. **Kinematics Visualization:**

    To visualize the kinematics for the training and testing conditions, see visualize_kinematics.ipynb

2. **PCA:**

    To visualize the uSim controller's population trajectories in PCA subspace, run:

    `python collective_pca.py`

3. **Canonical Correlation Analysis (CCA):**

    See CCA.ipynb

4. **Linear Regression Analysis (LRA):**

    See LRA.ipynb

5. **Procrustes:**

    See procrustes.ipynb

6. **Fixed Point (FP) Analysis:**

    Clone the fixed-point-finder (https://github.com/mattgolub/fixed-point-finder) into ./Analysis, then run:

    `python find_fp.py`

    The fixed point analysis is based on the original implementation; refer to that GitHub repo for further information.
7. **Rotational Dynamics (requires MATLAB):**

    See and run jpca_nusim.m

    Note: The jPCA analysis is based on MM Churchland's original implementation. Please see it for further details: https://www.dropbox.com/scl/fo/duf5zbwcibsux467c6oc9/AIN-ZiFsy2Huyh8h7VMdL7g?rlkey=3o5axmq5hirel4cij7g64jc0r&e=1&dl=0

    **Important for the jPCA analysis:**

    1. Make sure that ./Analyses/jPCA_ForDistribution is included in the MATLAB path, along with all sub-directories.

    2. Make sure that the uSim test_data folder is included in the MATLAB path. The test_data folder is where the jPCA data is saved during uSim testing.
**Perturbation Analyses:**

**Selective Feedback Elimination (SFE):**

Specify the part of the sensory feedback to be eliminated in ./SAC/perturbation_specs.py using the *sf_elim* variable. Run:

`python main.py --config configs/configs.txt --mode SFE --visualize True`

**Sensory Perturbation:**

Specify the perturbation vector to be added to the selected sensory feedback in ./SAC/perturbation_specs.py, e.g. *muscle_lengths_pert*. Run:

`python main.py --config configs/configs.txt --mode sensory_pert --visualize True`

**Neural Perturbation:**

The neural perturbation adds the given perturbation to the nodes of the uSim/nuSim controller's RNN.

Specify the neural perturbation vector in perturbation_specs.py using the *neural_pert* variable. Run:

`python main.py --config configs/configs.txt --mode neural_pert --visualize True`
**Change Musculoskeletal Properties:**

To test the trained uSim controller under changed musculoskeletal properties:

1. Go to the folder ./musculoskeletal_model/. Copy and paste the xml model musculo_targets.xml, and rename the copy musculo_targets_pert.xml.

2. Change the desired musculoskeletal properties in musculo_targets_pert.xml.

3. Run:

    `python main.py --config configs/configs.txt --mode musculo_properties --visualize True`

All of the above perturbation analyses change the post-training analysis files in place. To run the post-training analyses after a perturbation, see the Post Training Analyses section.