So Friday was a milestone date for the assignment. I had intended to have the entirety of the conversion portion of the tool (the nodes and the commands) finished, along with a couple of the components.
I had a couple of components started, and I had the nodes and commands written up. Just had to do some bug testing. Honestly I’m really happy with how this is all coming together. Writing up the templates and the prototype node functionality early on really helped me out when it came to actually implementing the rest of the code. It meant I could write classes and methods that depended on functionality I hadn’t implemented yet (e.g. the Attribute method connectTo, which connects one attribute to another and which quite a few node classes depend on).
On Friday morning I did set up the Distance node class, which was my first time testing out getting an attribute array element by index (e.g. worldMatrix). Naturally it was trying to get an index from my Attribute object, and since the Attribute class didn’t have __getitem__ functionality it just threw an error. I first tried having __getitem__ return a new Attribute object, using the Maya function set’s findPlug method and passing the attribute name plus the index (e.g. ‘worldMatrix[0]’) as a string. It turns out though that the MPlug class doesn’t directly handle indexes that way! It has two methods for that: elementByLogicalIndex (which I decided to use) and elementByPhysicalIndex.
So the __getitem__ method for the Attribute class looks like this (indentation restored, since WordPress strips the formatting):

    def __getitem__(self, key):
        # for array attributes, e.g. input
        return Attribute(node=self.node, attrName=self._MPlug.elementByLogicalIndex(key))
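The distinction between the two methods matters because Maya plug arrays are sparse: elementByLogicalIndex looks up (and will create on demand) the index the user asked for, while elementByPhysicalIndex walks only the elements that actually exist. This is a toy pure-Python mock of that idea, not the real Maya API:

```python
class MockArrayPlug:
    """Toy stand-in for an MPlug array, illustrating logical vs physical indices."""

    def __init__(self):
        self._elements = {}  # logical index -> element

    def element_by_logical_index(self, i):
        # Logical lookup: the sparse index the caller asked for;
        # like Maya, create the element on demand if it doesn't exist.
        return self._elements.setdefault(i, "plug[%d]" % i)

    def element_by_physical_index(self, i):
        # Physical lookup: position among the elements that actually exist.
        return self._elements[sorted(self._elements)[i]]


plug = MockArrayPlug()
plug.element_by_logical_index(0)
plug.element_by_logical_index(7)          # sparse: indices 1-6 are never created
print(plug.element_by_physical_index(1))  # second *existing* element -> "plug[7]"
```

So asking for logical index 7 works even though only two elements exist, which is exactly the behaviour __getitem__ wants to expose.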
I’ve sent the conversion portion to a couple of classmates and my tutor to try out and asked for feedback – primarily at this stage I am looking for bugs and checking how intuitive it is to use (especially compared to pymel and maya.cmds) but I also asked that they pass on anything else they feel is noteworthy. I included a .mel file which just adds the folderpath for wherever they decided to place the project folder, imports the tool, and prints some instructions.
For my tutor the mel file failed to import the tool correctly, which is odd as it worked for both myself and another person. I’ll be looking into that more tomorrow.
It imported fine for the one classmate who has had time to test it out. He was able to create most nodes, but found some of them as well as the commands confusing. I’ll be chatting with him about it tomorrow to see how I can make the documentation more clear.
He also found three bugs: the IkSolver class froze Maya completely (my first thought was an infinite loop, but that class doesn’t contain any loops, so I’ll have to look into that tomorrow); MultiplyDivide has a missing reference to a parameter name (a simple mistake, quick fix); and nurbsSurface creates only the transform node (which is odd because it created the shape when I tested it, so I’ll be retesting that more thoroughly tomorrow as well).
Since then I’ve been working on components! I’m defining components as a commonly used combination of nodes which can be applied to various body parts, or which can be swapped out for other common combinations within a body part. So things like IK/FK joint chains, ribbon splines, space switching, variable FK, etc. I do have a number of these written up already from previous projects. Most of them use static axis orientations, though, and they all use maya.cmds, so they’ll need to be converted to use my Workbench Nodes and Commands, and I’ll have to make them abstract/dynamic enough to work with any user-defined axis (or better yet, to derive the axis from the nodes they’re being connected to).
Today I was working on insertJoints, which (as you may have guessed) inserts a variable number of joints into a joint chain. In modifying it I found that when I create joints they’re always oriented to the world! Not at all ideal. I wrote a method in the Joints class to orient a joint to another node. The easiest way I could think of to do so was to use an aim constraint (which meant I got to bugtest the Constraint class! Hurrah!). I checked cometScripts to see if he had a better way, but he used an aim constraint too, so that’s some nice validation for me. I had an issue for a while wherein the inserted joints were flipped by 180 degrees after being oriented to their children. It took me far too long to realise that I was orienting them before moving them into place, so they were being oriented as though they were at the world origin instead of at their correct positions. I probably should have started with a parent joint that was away from the origin – that would have made it obvious much quicker.
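For anyone curious what the aim constraint is actually doing under the hood: it builds an orthonormal basis where one axis points at the target and another approximates an up vector. This is just a sketch of that math in plain Python (not my Constraint class, and hypothetical names throughout), which also shows why position matters before orienting:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def aim_axes(pos, target, world_up=(0.0, 1.0, 0.0)):
    """Orthonormal basis for an aim-constraint-style orientation.

    X aims from pos toward target, Y approximates world_up, Z completes
    the right-handed basis. Returns the three axis vectors (rows of the
    rotation matrix). Note the result depends on pos: orient a joint
    while it still sits at the origin and you get the wrong answer.
    """
    x = normalize([t - p for t, p in zip(target, pos)])
    z = normalize(cross(x, list(world_up)))
    y = cross(z, x)
    return x, y, z

# Joint at the origin aiming at a child 5 units down +X:
x, y, z = aim_axes((0.0, 0.0, 0.0), (5.0, 0.0, 0.0))
print(x, y, z)  # identity basis: X=(1,0,0), Y=(0,1,0), Z=(0,0,1)
```

Moving `pos` somewhere else before calling this flips the resulting X axis relative to the origin-based answer, which is exactly the 180-degree flip I was seeing.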
I’m avoiding the insertJoints maya command, opting instead to create a joint, move it into position, orient it, and then parent it into the chain. I thought finding the correct position for the joints would be difficult, but it turned out to be remarkably simple! All I needed was the vector from the start joint to the end joint (end joint position – start joint position) and the number of joints being placed. I divide that vector by the number of joints plus one, and voila: the relative translation difference from one joint to the next.
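That spacing calculation can be sketched in a few lines of plain Python (a hypothetical helper, not the tool’s actual code):

```python
def insert_positions(start, end, count):
    """World-space positions for `count` joints evenly inserted between
    start and end. The relative step is (end - start) / (count + 1)."""
    step = [(e - s) / (count + 1) for s, e in zip(start, end)]
    return [[s + c * (i + 1) for s, c in zip(start, step)]
            for i in range(count)]

# Two joints inserted along a 9-unit chain down +X:
print(insert_positions((0, 0, 0), (9, 0, 0), 2))
# -> [[3.0, 0.0, 0.0], [6.0, 0.0, 0.0]]
```

Dividing by count + 1 rather than count is the key detail: two inserted joints split the span into three equal segments.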
Tomorrow I’ll be working on fixing the bugs that were pointed out to me and then converting the control hierarchy component that I’d written for my original autorigger concept. I’ll be adding the custom shape saver from my projection tool to it, and at some stage I’ll build a library of default controls.
I’ll probably do the character hierarchy component right after that, since most components will be using it to choose where to parent their bits and pieces.