-
Hi guys, I'm new to iCub, so please ignore any silly mistakes. I'm trying to sample joint values for eye-hand coordination in the iCub simulator. Previously, I defined a position in the iCub's search space using Cartesian coordinates, reached it with the right arm, and, when the same x, y, z coordinates were given as the fixation point, I could look at it while the head adjusted automatically. Now I want to record all the values for a position (x, y, z coordinates) in the search space that is iteratively updated: the joint angles of the right arm + wrist (7 values), the pan, tilt, and vergence of the eyes (3 values), the pitch, roll, and yaw of the neck (3 values), plus the 3 coordinate values themselves. I have the following script to run my program:
When I move to a particular location, I can see the arm configuration in a terminal that I have associated with the right_arm:
Question 1: How can I access or store the above joint angles, which are printed in a separate terminal? Question 2: What if I define a starting point and a final point in the search space for the arm to move between, with the head (eyes + neck) following the arm? Question 3: If the given points are unreachable, what value will be stored? I have been using the following functions previously: Any guidelines or comments on whether I am going in the right direction? Thank you
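For reference, a minimal sketch of the reach-and-gaze setup described above, assuming the simulator's default remote ports (/icubSim/cartesianController/right_arm for the Cartesian controller, /iKinGazeCtrl for the gaze controller); the local port names and the target coordinates are made up for illustration:

```cpp
// Minimal sketch: reach a Cartesian point with the right arm while the
// gaze controller fixates the same point. Remote port names assume the
// iCub simulator defaults; local names and the target are illustrative.
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/CartesianControl.h>
#include <yarp/dev/GazeControl.h>
#include <yarp/sig/Vector.h>

int main()
{
    yarp::os::Network net;

    // client for the arm's Cartesian controller
    yarp::os::Property armOpt;
    armOpt.put("device", "cartesiancontrollerclient");
    armOpt.put("remote", "/icubSim/cartesianController/right_arm");
    armOpt.put("local", "/demo/cart/right_arm");
    yarp::dev::PolyDriver armDrv(armOpt);

    // client for the gaze controller (iKinGazeCtrl must be running)
    yarp::os::Property gazeOpt;
    gazeOpt.put("device", "gazecontrollerclient");
    gazeOpt.put("remote", "/iKinGazeCtrl");
    gazeOpt.put("local", "/demo/gaze");
    yarp::dev::PolyDriver gazeDrv(gazeOpt);

    yarp::dev::ICartesianControl *icart;
    yarp::dev::IGazeControl *igaze;
    if (!armDrv.view(icart) || !gazeDrv.view(igaze))
        return 1;

    // hand the very same target to both controllers
    yarp::sig::Vector x(3);
    x[0] = -0.3; x[1] = 0.1; x[2] = 0.1;   // illustrative point in front of the robot

    icart->goToPositionSync(x);            // move the hand to x
    igaze->lookAtFixationPoint(x);         // fixate x with the eyes
    icart->waitMotionDone();
    igaze->waitMotionDone();

    return 0;
}
```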
-
Hi @ashrafzia

Let me suggest you to use yarpmanager to launch modules: it's much handier than relying on dedicated scripts.

Answer 1
You can read the joint angles through the yarp::dev::IEncoders motor interface (see this tutorial to find out more), or log them with yarpdatadumper.

Answer 2
Perhaps I didn't catch your exact point, but the Cartesian space (what you call the "search space") is common to both the Cartesian controller of the arm and the Gaze controller; therefore, you can provide them with the same targets and, as a result, the head and the eyes will somehow follow the arm movements.

Answer 3
Arm and Gaze controllers do employ nonlinear optimization underneath that intrinsically deals with unfeasible goals. Thus, if you ask them to attain out-of-reach targets, the optimizer will instruct the controller to move to the closest point that is still reachable.
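As a minimal sketch of Answer 1, assuming the simulator's default control-board port /icubSim/right_arm (the local port name is arbitrary), the joint angles can be read via yarp::dev::IEncoders like this:

```cpp
// Minimal sketch: read the joint angles of a part via yarp::dev::IEncoders.
// The remote port name assumes the iCub simulator defaults.
#include <cstdio>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <yarp/sig/Vector.h>

int main()
{
    yarp::os::Network net;

    yarp::os::Property opt;
    opt.put("device", "remote_controlboard");
    opt.put("remote", "/icubSim/right_arm");   // the part to read from
    opt.put("local", "/reader/right_arm");     // arbitrary local name
    yarp::dev::PolyDriver drv(opt);

    yarp::dev::IEncoders *ienc;
    if (!drv.view(ienc))
        return 1;

    int nAxes;
    ienc->getAxes(&nAxes);                     // 16 on the iCub arm;
                                               // the first 7 are shoulder, elbow, wrist

    yarp::sig::Vector q(nAxes);
    ienc->getEncoders(q.data());               // joint angles in degrees

    for (int i = 0; i < nAxes; i++)
        printf("joint %d = %g deg\n", i, q[i]);

    return 0;
}
```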
-
Hi @pattacini Thanks for your reply and guidance. I launched yarpmanager but wasn't able to run the sample application. I copied the template iCub_Simulator_Startup and saved it as my application file, since it contains all the modules that I am going to use. I get the following error: I started yarprun as: I also tried yarpmanager-console and issued the commands list app, list mod, list res, etc., but no applications or other results are displayed. I am using a pre-compiled installation. Would that be an issue?
-
In a very minimal setting where you launch every module on your local machine, you could do the following:
In the same scenario, you can profitably rely on the following shortcut, skipping the first step:
Either way, you should be ready to run the application at this point. If you still have problems with …
-
Thank you @pattacini I have managed to run the module successfully, but the yarpdatadumper_recording_example isn't working; here is the new issue. Even if I manage to run the data dumper, the values for connected parts like head, torso, and rightArm will be recorded, but how can I track the values for the pan, tilt, and vergence of the eyes? Can it be done the same way as for the other parts?
-
The head part encapsulates both the neck and the eyes encoders according to the following format:
The last line refers to the axis names used in the software, as defined in http://wiki.icub.org/wiki/ICub_joints. |
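Going by the iCub joints page linked above, the head part exposes six axes in the order neck pitch, neck roll, neck yaw, eyes tilt, eyes version (i.e. the common pan), and eyes vergence. A minimal sketch that reads the head part and splits the encoder vector accordingly, assuming the simulator's default port names:

```cpp
// Minimal sketch: read the head part and split neck vs. eye encoders.
// Axis order per http://wiki.icub.org/wiki/ICub_joints:
//   0 neck pitch, 1 neck roll, 2 neck yaw,
//   3 eyes tilt,  4 eyes version (pan), 5 eyes vergence
#include <cstdio>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <yarp/sig/Vector.h>

int main()
{
    yarp::os::Network net;

    yarp::os::Property opt;
    opt.put("device", "remote_controlboard");
    opt.put("remote", "/icubSim/head");
    opt.put("local", "/reader/head");
    yarp::dev::PolyDriver drv(opt);

    yarp::dev::IEncoders *ienc;
    if (!drv.view(ienc))
        return 1;

    int nAxes;
    ienc->getAxes(&nAxes);                 // 6 on the iCub head

    yarp::sig::Vector q(nAxes);
    ienc->getEncoders(q.data());           // degrees

    printf("neck pitch/roll/yaw     = %g %g %g\n", q[0], q[1], q[2]);
    printf("eyes tilt/pan/vergence  = %g %g %g\n", q[3], q[4], q[5]);

    return 0;
}
```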
-
Closing; feel free to reopen if needed.
-
I would add to the table of @pattacini that the axis names used in the iCub software (for example, the names returned by the …
@pattacini I took the liberty of adding this line to your table.
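For completeness, the axis names can also be queried at runtime; a minimal sketch using yarp::dev::IAxisInfo (whether this is the interface the comment above refers to is an assumption on my part):

```cpp
// Minimal sketch: query axis names at runtime via yarp::dev::IAxisInfo.
#include <cstdio>
#include <string>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/ControlBoardInterfaces.h>
#include <yarp/dev/IAxisInfo.h>

int main()
{
    yarp::os::Network net;

    yarp::os::Property opt;
    opt.put("device", "remote_controlboard");
    opt.put("remote", "/icubSim/head");
    opt.put("local", "/names/head");
    yarp::dev::PolyDriver drv(opt);

    yarp::dev::IEncoders *ienc;
    yarp::dev::IAxisInfo *iinfo;
    if (!drv.view(ienc) || !drv.view(iinfo))
        return 1;

    int nAxes;
    ienc->getAxes(&nAxes);

    for (int i = 0; i < nAxes; i++)
    {
        std::string name;
        iinfo->getAxisName(i, name);       // e.g. "neck_pitch"
        printf("axis %d: %s\n", i, name.c_str());
    }

    return 0;
}
```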