Creating gaits such as walking or trotting is quite a challenging task if you do it manually. It is therefore helpful to have a kinematic model of our cat. I want to show you my approach to creating a new gait via inverse kinematics, using the IKPY library (https://github.com/Phylliade/ikpy).
Inverse kinematics, at a glance, means that you provide a point in space and the required motor angles are calculated. In IKPY this is achieved by creating "chains", e.g. upper_arm--lower_arm--paw, and then deciding where to place the paw in space. Below you can see an image of Nybble's kinematic model, to give you a first impression.
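As a minimal sketch of the idea (it uses the same older IKPY keyword arguments and link pattern as the full script further down, so treat it as illustrative rather than canonical), a single leg could be modelled and solved like this:

import numpy as np
from ikpy.chain import Chain
from ikpy.link import URDFLink

# Minimal IK sketch: a fixed base, two rotating joints, and a fixed "paw" end effector.
arm = Chain(name='arm', links=[
    URDFLink(name="base",     translation_vector=[0, 0, 0], orientation=[0, 0, 0], rotation=[0, 0, 0]),
    URDFLink(name="shoulder", translation_vector=[0, 0, 0], orientation=[0, 0, 0], rotation=[0, 1, 0]),
    URDFLink(name="elbow",    translation_vector=[5, 0, 0], orientation=[0, 0, 0], rotation=[0, 1, 0]),
    URDFLink(name="paw",      translation_vector=[5, 0, 0], orientation=[0, 0, 0], rotation=[0, 0, 0]),
])

# Ask where the joints must go so that the paw reaches a given point in space:
angles = arm.inverse_kinematics(target_position=np.array([7, 0, -3]))
print(np.degrees(angles))  # one value per link; fixed links stay at 0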
For the first tests I created an alternative walking gait that follows a swing-stance pattern of this form:
# Walking pattern (1 == swing)
# [0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1], # LL
# [0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0], # RL
# [0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0], # LA
# [1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0], # RA
The longitudinal movement (blue) and the lift (orange) are based on sine/cosine functions that follow the phase shifts of the pattern shown above.
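To make the connection between the pattern and the curves concrete, here is a simplified sketch of the same idea (buildGait() further down is the full version; frames = 40 is just an example value):

import numpy as np
from numpy import sin, cos, pi

frames = 40
t = np.arange(frames) * 2 * pi / frames              # one full gait cycle
swing = t < pi / 2                                    # first quarter of the cycle: swing phase

longitudinal = np.where(swing, -cos(2 * t),           # quick forward swing...
                        cos(2 / 3 * (t - pi / 2)))    # ...slow backward stance
lift = np.where(swing, sin(2 * t), 0.0)               # lift the paw only while swinging

# Each limb uses the same two curves, shifted by 0, 1/4, 1/2 and 3/4 of the cycle:
shift = (np.array([0, 0.25, 0.5, 0.75]) * frames).astype(int)
legs_longitudinal = [np.roll(longitudinal, -s) for s in shift]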
This pattern, in combination with the IK model, leads to the following movement:
I also tried this new gait on Nybble for validation and it works quite well. The lift is 1 cm in amplitude, so Nybble can now step over a small obstacle.
Further updates and the Bittle-Version can be found here: https://github.com/ger01d/kinematic-model-opencat
So here's the code for the version above:
import ikpy
from ikpy.chain import Chain
from ikpy.link import OriginLink, URDFLink
import numpy as np
from numpy import sin, cos, pi
import matplotlib.pyplot as plt
import mpl_toolkits.mplot3d.axes3d as p3
deg2rad = pi/180
# Values in cm
armLength = 5
bodyLength = 12
bodyWidth = 10
distanceFloor = 7
stepLength = 5
swingHeight = 1
leanLeft = 0 # Not working, yet
leanRight = -leanLeft # Not working, yet
leanForward = 0 # cm
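# Coordinate convention used below (inferred from the translation vectors): x points
# forward, y to the robot's left, z up. The "center" link of every chain sits
# distanceFloor above the ground, so paw targets with z = 0 touch the floor.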
left_arm = Chain(name='left_arm', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="shoulder",
        translation_vector=[0, bodyWidth/2, leanLeft],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="upperArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(-179*deg2rad, -0.1*deg2rad),
    ),
    URDFLink(
        name="lowerArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    )
])
right_arm = Chain(name='right_arm', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="shoulder",
        translation_vector=[0, -bodyWidth/2, leanRight],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="upperArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(-179*deg2rad, -1*deg2rad),
    ),
    URDFLink(
        name="lowerArm",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
left_leg = Chain(name='left_leg', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="butt",
        translation_vector=[-bodyLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="hip",
        translation_vector=[0, bodyWidth/2, leanLeft],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 100*deg2rad),
    ),
    URDFLink(
        name="upperLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="lowerLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
right_leg = Chain(name='right_leg', links=[
    URDFLink(
        name="center",
        translation_vector=[leanForward, 0, distanceFloor],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="butt",
        translation_vector=[-bodyLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 0, 0],
    ),
    URDFLink(
        name="hip",
        translation_vector=[0, -bodyWidth/2, leanRight],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(0.1*deg2rad, 100*deg2rad),
    ),
    URDFLink(
        name="upperLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
        bounds=(1*deg2rad, 179*deg2rad),
    ),
    URDFLink(
        name="lowerLeg",
        translation_vector=[armLength, 0, 0],
        orientation=[0, 0, 0],
        rotation=[0, 1, 0],
    )
])
def buildGait(frames):
    frame = np.arange(0, frames)
    swingEnd = pi/2
    # longitudinalMovement
    swing = -cos(2*(frame*2*pi/frames))
    stance = cos(2/3*(frame*2*pi/frames - swingEnd))
    swingSlice = np.less_equal(frame, swingEnd/(2*pi/frames))
    stanceSlice = np.invert(swingSlice)
    longitudinalMovement = np.concatenate((swing[swingSlice], stance[stanceSlice]))
    longitudinalMovement = np.concatenate((longitudinalMovement, longitudinalMovement, longitudinalMovement, longitudinalMovement))
    # verticalMovement
    lift = sin(2*(frame*2*pi/frames))
    liftSlice = swingSlice
    verticalMovement = np.concatenate((lift[liftSlice], np.zeros(np.count_nonzero(stanceSlice))))
    verticalMovement = np.concatenate((verticalMovement, verticalMovement, verticalMovement, verticalMovement))
    return longitudinalMovement, verticalMovement
frames = 43
longitudinalMovement, verticalMovement = buildGait(frames)
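# Optional (a small addition, not part of the original script): visualize the two curves
# before running the IK loop; the calculation continues after the plot window is closed.
# plt.plot(longitudinalMovement, label="longitudinal")
# plt.plot(verticalMovement, label="vertical (lift)")
# plt.legend(); plt.show()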
# Walking pattern
# [0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1], # LL
# [0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0], # RL
# [0,0,0,0,0, 0,0,0,0,0, 1,1,1,1,1, 0,0,0,0,0], # LA
# [1,1,1,1,1, 0,0,0,0,0, 0,0,0,0,0, 0,0,0,0,0], # RA
# Phase shift between arms/legs, [RA, RL, LA, LL]
shiftFrame = np.round(np.array([0, pi/2, pi, 3*pi/2])/2/pi*frames)
shiftFrame = shiftFrame.astype(int)
for frame in range(0, frames):
    right_arm_angles = 180/pi*right_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame], -bodyWidth/2, swingHeight*verticalMovement[frame]]))
    right_arm_correction = np.array([0, -90, 90, 0])
    right_arm_angles = np.round(right_arm_angles + right_arm_correction)
    right_arm_angles = np.delete(right_arm_angles, np.s_[0, 3], axis=0)
    right_arm_angles = right_arm_angles.astype(int)
    right_leg_angles = 180/pi*right_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[1]], -bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[1]]]))
    right_leg_correction = np.array([0, 0, -90, -90, 0])
    right_leg_angles = np.round(right_leg_angles + right_leg_correction)
    right_leg_angles = np.delete(right_leg_angles, np.s_[0, 1, 4], axis=0)
    right_leg_angles = right_leg_angles.astype(int)
    left_arm_angles = 180/pi*left_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame+shiftFrame[2]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[2]]]))
    left_arm_correction = np.array([0, -90, 90, 0])
    left_arm_angles = np.round(left_arm_angles + left_arm_correction)
    left_arm_angles = np.delete(left_arm_angles, np.s_[0, 3], axis=0)
    left_arm_angles = left_arm_angles.astype(int)
    left_leg_angles = 180/pi*left_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[3]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[3]]]))
    left_leg_correction = np.array([0, 0, -90, -90, 0])
    left_leg_angles = np.round(left_leg_angles + left_leg_correction)
    left_leg_angles = np.delete(left_leg_angles, np.s_[0, 1, 4], axis=0)
    left_leg_angles = left_leg_angles.astype(int)
    # Writing sequence to file
    gait_sequence = np.concatenate((left_arm_angles, right_arm_angles, right_leg_angles, left_leg_angles))
    print(frame, " ", gait_sequence)
    f = open("gait_sequence.csv", "a")
    f.write("#")
    np.savetxt(f, gait_sequence[[0, 2, 4, 6, 1, 3, 5, 7]], fmt='%3.1i', delimiter=',', newline=", ")
    f.write("+")
    f.write("\n")
    f.close()
    # Create plot and image for each frame
    fig = plt.figure()
    ax = p3.Axes3D(fig)
    ax.set_box_aspect([3, 3/2, 1])
    ax.set_xlim3d([-20, 10])
    ax.set_xlabel('X')
    ax.set_ylim3d([-10, 10])
    ax.set_ylabel('Y')
    ax.set_zlim3d([0.0, 10])
    ax.set_zlabel('Z')
    right_arm.plot(right_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame], -bodyWidth/2, swingHeight*verticalMovement[frame]])), ax)
    right_leg.plot(right_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[1]], -bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[1]]])), ax)
    left_arm.plot(left_arm.inverse_kinematics(target_position=np.array([stepLength*longitudinalMovement[frame+shiftFrame[2]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[2]]])), ax)
    left_leg.plot(left_leg.inverse_kinematics(target_position=np.array([-bodyLength+stepLength*longitudinalMovement[frame+shiftFrame[3]], +bodyWidth/2, swingHeight*verticalMovement[frame+shiftFrame[3]]])), ax)
    figureName = "Nybble_" + str(frame)
    plt.savefig(figureName)
    #plt.show()
Nice work Gero, I am very new here; I only started learning about Arduino a couple of months ago, and I find this project and your kinematic tool very interesting and useful. I just noticed that in your code you tile longitudinalMovement and verticalMovement four times; since the gait pattern for each leg/arm only starts at 0, 1/4, 1/2 or 3/4 of the frames, tiling the original longitudinalMovement/verticalMovement twice is enough (the largest index used is frame + shiftFrame[3], which stays below 2*frames).
Jason
aijnec@163.com
I created a Python program to see how Bittle behaves. It will not work for studying his gait, and it is not a very smart program because I made it by piecing together various sample files.
The program reads a file with the behavior converted to CSV.
The yellow plate shown in the graph is Bittle's new ground, and the red arrows are its normal vectors. I would really like to modify Bittle's pose to match the new ground, but I don't know how.
The result is also output as a GIF animation file.
For some reason the CSV file could not be uploaded, so please rename action.txt to action.csv.
There is a runtime error:
Python 3.6.8
ikpy 3.2.2
Traceback (most recent call last):
File "kinematics_nybble.py", line 29, in <module>
rotation=[0, 0, 0],
TypeError: __init__() got an unexpected keyword argument 'translation_vector'
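The traceback points at the IKPY version: from IKPY 3.2 on, the URDFLink keyword arguments were renamed (translation_vector became origin_translation, orientation became origin_orientation), so the script above needs those names adjusted. A sketch of the first link adapted accordingly; only the keyword names change here, and further small API differences may remain:

from ikpy.link import URDFLink

# "center" link from the script above, rewritten for IKPY >= 3.2
center = URDFLink(
    name="center",
    origin_translation=[leanForward, 0, distanceFloor],  # was: translation_vector
    origin_orientation=[0, 0, 0],                         # was: orientation
    rotation=[0, 0, 0],
)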
Great work! I came across that same article a few days ago when I was looking for literature to understand IK and how the functions are derived. At the time I wondered how I could adapt the code to achieve or verify the same foot trajectory, or any other shape to quickly experiment with new gaits. Please keep posting your progress.
@Gero very good attempt. I was interested in the foot trajectory of your work, so I mapped the horizontal (longitudinal) movement and the vertical movement. As expected, it is a simple sine curve with a flat base:
Conversely, based on your derived gait numbers, I applied "forward kinematics" to verify that I can also work backwards and arrive at the same trajectory without using your original sine/cosine equations:
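For reference, a sketch of how such a forward-kinematics check can be done with IKPY (it reuses the right_arm chain and the correction offsets [0, -90, 90, 0] from the script above; the helper name is made up):

import numpy as np
from numpy import pi

deg2rad = pi / 180

def right_arm_paw_position(shoulder_deg, elbow_deg):
    # Undo the corrections applied before export, rebuild the full joint vector
    # (the fixed "center" and "lowerArm" links get 0), and run forward kinematics.
    joints = [0, (shoulder_deg + 90) * deg2rad, (elbow_deg - 90) * deg2rad, 0]
    frame = right_arm.forward_kinematics(joints)  # 4x4 homogeneous transform of the paw
    return frame[:3, 3]                           # x, y, z of the end effector

print(right_arm_paw_position(60, -30))  # arbitrary example angles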
The reason I'm keen to dig deeper is that I noticed the gait numbers provided in the original Bittle InstinctBittle.h tend to make Bittle drag its feet when it walks (hence @Rongzhong Li's recommendation to run Bittle without socks, which doesn't make sense in normal circumstances). The feet don't seem to "push" Bittle up when they touch the ground, resulting in unstable movement. As Li did not disclose the original equations from which the numbers in InstinctBittle.h were derived, I applied the same forward-kinematics approach and mapped out the foot trajectories of the original Bittle walk, run, crawl and trot:
Similar to your work, Li's gait numbers all have a "flat" trajectory base, i.e. Bittle does not "push" itself up when the feet touch the ground, which results in unstable forward-moving gestures. In a separate quadruped project, I came across this paper describing a more sophisticated foot trajectory:
As one can see, during the stance phase the foot trajectory is slightly convex downwards, which has a number of benefits according to the authors. As I am still awaiting a few parts from eBay for my other project, I will work on applying this revised trajectory to Bittle and see if it results in better movement.
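Purely as a sketch of that idea (pushDepth is a made-up tuning factor, and this is not code from the paper): inside buildGait(), the flat stance could be replaced with a slight downward bulge, for example:

# Inside buildGait(), replacing the flat stance part of verticalMovement:
pushDepth = 0.3                                            # fraction of swingHeight to push down
stancePhase = np.linspace(0, pi, np.count_nonzero(stanceSlice))
stanceDip = -pushDepth * sin(stancePhase)                  # 0 at touch-down and lift-off, lowest mid-stance
verticalMovement = np.concatenate((lift[liftSlice], stanceDip))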
It depends on what you want to do with your robot. I will try to make tutorials about automation, which is more suitable for easy tutorials and GUI tools; I have already used some of the Codecraft tools (I think it's them) with micro:bit or meowbit. A fun example that I have already seen here is making the robot react depending on whether you are wearing a mask or not, and there are many other fun possibilities! It's true that adding movements or postures from the community can be made easier with GUI tools, but generating movements is necessarily linked to a more complicated procedure; however, that's only a small part of the fun you can have with your robot! I agree that OpenCat is made as an educational tool for everyone, from noobs to nerds! It's cool that there are different levels from which to start learning; everyone wins! I probably wouldn't have bought the robot if we couldn't do advanced things (at my level, GUI tools are boring...), and yet I will try to help noobs have fun! (but sadly time is limited... 😩) Finally, from what I know of software engineering, I strongly doubt that Codecraft will someday release an easy tool for inverse kinematics.
In fact, it is too cumbersome to work with formulas, so graphical software could be more suitable and make Bittle and the cat products more attractive. I think Codecraft is a good starting point for generating postures and behaviors, and for this reason you should intervene with the Codecraft company so that they solve the problems related to the PREVIEW button that does not work, and also the data sequences during the insertion of movements when trying to create a Skill. Only you, @Rongzhong Li, can solicit bug fixes; it is in your company's interest above all to make the product user-friendly and suitable for everyone.
I summarize below what I understand, hoping to take stock of the situation:
1) You made a program using Python's IKPY library.
2) In your program you entered the data of a hypothetical walking pattern and plotted everything in the graph (the one with the orange and blue lines).
3) The graph is unfortunately static; the data can only be entered in the program, which then generates the graph and, when it is closed, a .csv file with the values of the hypothesized movement.
From my point of view, it would instead take a 3D view, like one of the images generated by the program, with a GUI that implements something more dynamic: interacting with the image itself, moving the object (Bittle) on the horizontal plane (unfortunately it does not fly) to generate a new position, and, even more importantly, setting new positions by grabbing a limb (shoulder or knee) with the mouse and dragging it to a new position, generating defined sequences (frames) that can then be exported and imported into the instinct.h sketch.
I looked around and found a couple of links that I will check to see if I can recover something. I post them here, but it would be interesting if a good graphics programmer gave us a hand, as I don't think it is difficult to create such software, even with the help of a spreadsheet extended with VBA or a different high-level language. Meanwhile, the links are the following:
http://gazebosim.org/
https://robodk.com/doc/it/Basic-Guide.html#Start
We need something like this instead:
As you suggested, I closed the window, and in fact it generated a sequence of numbers in the console (I'm at a good point :) ... at least I hope).
However, I do not understand what movement it has generated since, as I told you, I cannot interact with the graph:
0 [25 4 59 44 -8 31 64 5] 1 [29 2 64 33 -7 31 66 8] 2 [34 0 72 16 -6 30 67 11] 3 [39 -1 72 4 -6 36 68 14] 4 [44 -2 67 -5 -2 25 68 17] 5 [ 48 -3 57 -10 1 23 68 20] 6 [52 -2 43 -9 4 20 68 23] 7 [56 -1 27 -3 7 16 61 35] 8 [59 0 13 7 11 13 68 28] 9 [57 5 2 18 15 10 68 30] 10 [64 5 -5 28 20 7 58 44] 11 [66 8 -8 31 25 4 59 44] 12 [67 11 -7 31 29 2 64 33] 13 [68 14 -6 30 34 0 72 16] 14 [68 17 -6 36 39 -1 72 4] 15 [68 20 -2 25 44 -2 67 -5] 16 [ 68 23 1 23 48 -3 57 -10] 17 [61 35 4 20 52 -2 43 -9] 18 [68 28 7 16 56 -1 27 -3] 19 [68 30 11 13 59 0 13 7] 20 [58 44 15 10 57 5 2 18] 21 [59 44 20 7 64 5 -5 28] 22 [64 33 25 4 66 8 -8 31] 23 [72 16 29 2 67 11 -7 31] 24 [72 4 34 0 68 14 -6 30] 25 [67 -5 39 -1 68 17 -6 36] 26 [ 57 -10 44 -2 68 20 -2 25] 27 [43 -9 48 -3 68 23 1 23] 28 [27 -3 52 -2 61 35 4 20] 29 [13 7 56 -1 68 28 7 16] 30 [ 2 18 59 0 68 30 11 13] 31 [-5 28 57 5 58 44 15 10] 32 [-8 31 64 5 59 44 20 7] 33 [-7 31 66 8 64 33 25 4] 34 [-6 30 67 11 72 16 29 2] 35 [-6 36 68 14 72 4 34 0] 36 [-2 25 68 17 67 -5 39 -1] 37 [ 1 23 68 20 57 -10 44 -2] 38 [ 4 20 68 23 43 -9 48 -3] 39 [ 7 16 61 35 27 -3 52 -2] 40 [11 13 68 28 13 7 56 -1] 41 [15 10 68 30 2 18 59 0] 42 [20 7 58 44 -5 28 57 5]
... it also generated precisely the images you were referring to.
I ran kinematics_bittle.py, but besides displaying the blue and orange graph and zooming in and out, it doesn't let me close any of the plots. Is there anything else I need to load so that I can interact with the plot and close it?
Here I am! I was excited, and I just finished installing Python 3.2.9 and the IK libraries; the screenshot shows what I get after I run the Python program...
It does not allow me to do anything, so I am asking how you got from this model
to the CSV file that I loaded (correctly) into the instinct.h module. I would like to help, but for now I need your help to prepare the common material you are working on. I await your reply.
The file contains some data; how should I proceed?
Should I copy it in place of an existing predefined sequence, or create a new instinct after the existing ones and program it onto a remote-control button?
Is the file to be modified instinct.h?
Hi Flavien,
I played a little bit with your Bittle version and changed distanceFloor to 6.5 and swingHeight to 0.5:
Maybe you can find a minute to apply it on Bittle and test it. Attached you can find the gait_sequence for the Instincts.h:
Ok, I know I'm going to lose credibility by asking this, but how are you generating the animation of the gait? Is it a command in the py script, or are you using a 3rd-party app to stitch the output together?
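One way to do it (not necessarily what was used here) is to stitch the per-frame PNGs that the script saves with plt.savefig() into a GIF using Pillow:

from PIL import Image

# Combine the saved "Nybble_<frame>.png" images into a single animated GIF.
frames = 43  # same number of frames as in the gait script
images = [Image.open(f"Nybble_{i}.png") for i in range(frames)]
images[0].save("gait.gif", save_all=True, append_images=images[1:],
               duration=50, loop=0)  # 50 ms per frame, loop forever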
My modified model for bittle:
And.... YAY! It walks (kind of... 😂)
I will make a pull request later; I'm with family this weekend, so not much time.
I tried to use the code with a modified URDF model for Bittle. I'm not sure about the bounds angles and other things, but I will look deeper into the subject ;). Oh, and in the logs I see that matplotlib is not happy that you open so many plots in the processing loop (it warns that this uses a lot of memory); you should add plt.close() after plt.savefig() to save memory. Many, many thanks!
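For clarity, the suggested change at the end of the per-frame loop is just:

    plt.savefig(figureName)
    plt.close(fig)  # free the figure's memory before the next frame is drawn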
Awesome job!
Thanks @Gero, now I have to get to work to first understand how to put all the code together and then explain it to my teams. If you have already produced some material and want to share it, it will be very much appreciated.
Thanks
Wow, I am impressed by this discussion, can't wait to dive into it!
Thanks for sharing!
I updated the code a little bit: the calculation of a gait is now twice as fast, because I call the IK function only once per frame per chain. You can now plot the functions for the vertical and longitudinal movement by uncommenting line 181 (the inverse calculation starts after you close the plot).
This might improve the understanding of what the function buildGait() does.
I further noticed that it can be advantageous to change the optimization algorithm in IKPY. To do this, you have to locate the directory where Python installs the libraries and, in "inverse_kinematics.py", change line 134 from "L-BFGS-B" to e.g. "SLSQP".
The "L-BFGS-B" algorithm seems to have problems finding solutions for constrained problems (like the boundaries for Nybble's joint angles). This leads to little jumps in the end-effector (paw) position. In my tests, the "SLSQP" optimizer resulted in fewer of these jumps, which makes the movement smoother.
res = scipy.optimize.minimize(optimize_total, chain.active_from_full(starting_nodes_angles), method='SLSQP', bounds=real_bounds, options=options)
If you experience this jumpy behavior with the kinematic model, this can be a solution.
More details of the optimization possibilities can be found here:
https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html#rdd2e1855725e-6