tsStickyLips tool

This tool lets you easily create a sticky lips setup for your character. The setup is based on a wire deformer.

Installation

1. Download the script from this link: tsStickyLips.rar;

2. Copy the file tsStickyLips.pyc into your Maya scripts/ folder;

3. From the Script Editor, in a Python tab, type:

import tsStickyLips

tsStickyLips.main()

Usage

As the sticky lips deformation lies on top of the other deformations, this should be the very last step of your facial rigging.

1. Select the edges for the upper lip and press ‘<<<’;

2. Select the edges for the bottom lip and press ‘<<<’;

3. Press the ‘Create stickyLips’ button.

The script creates a copy of the geometry with the sticky lips setup. You can hide the previous geo and render with this copy.

 

Optional parameters:

The tool automatically weights the wire deformer influence areas. Two optional parameters can be tweaked to change how the weighting is done:

- Selection growth: grows the vertex selection before setting the weight influence to 1;

- Smoothness: how many times the weighting should be smoothed.
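To give an idea of what the Smoothness parameter does, here is a small illustrative sketch (not the tool's actual code — the tool smooths painted deformer weights on the mesh with the Artisan brush): each smoothing iteration can be thought of as averaging every weight with its neighbours, which progressively softens the border of the painted region.

```python
# Illustrative only: models the "Smoothness" iterations as repeated
# neighbour averaging on a 1D weight array.
def smooth_weights(weights, iterations):
    w = list(weights)
    for _ in range(iterations):
        # Jacobi-style pass: build the new list from the old values,
        # keeping the endpoints fixed.
        w = [
            w[i] if i in (0, len(w) - 1)
            else (w[i - 1] + w[i] + w[i + 1]) / 3.0
            for i in range(len(w))
        ]
    return w

print(smooth_weights([0, 0, 1, 0, 0], 1))
```

With every iteration the hard 0/1 border spreads out a little more, just like the smoothed influence area around the lips.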

 

Here’s the source code of the main procedure for the tech guys:

from maya import cmds, mel

# Note: editAttrs() and resolveName() are helper functions defined elsewhere in the tool.
def createStickyLips(top_edges, bottom_edges, selection_growth=2, smoothing=8):
    base_mesh = top_edges[0].split(".")[0]
    cmds.select(base_mesh)

    # Create curve on the top edge selection
    cmds.select(top_edges)
    top_curve = cmds.polyToCurve(form=2, degree=1)[0]
    top_curve_shape = cmds.listRelatives(top_curve, c=1)[0]

    # Create curve on the bottom edge selection
    cmds.select(bottom_edges)
    bottom_curve = cmds.polyToCurve(form=2, degree=1)[0]
    bottom_curve_shape = cmds.listRelatives(bottom_curve, c=1)[0]

    # Create wire average curve
    avg_node = cmds.createNode("avgCurves")
    cmds.setAttr(avg_node + ".automaticWeight", 0)
    avg_curve = cmds.duplicate(top_curve)[0]
    avg_curve_shape = cmds.listRelatives(avg_curve, c=1)[0]
    cmds.connectAttr(top_curve_shape + ".worldSpace[0]", avg_node + ".inputCurve1", force=1)
    cmds.connectAttr(bottom_curve_shape + ".worldSpace[0]", avg_node + ".inputCurve2", force=1)
    cmds.connectAttr(avg_node + ".outputCurve", avg_curve_shape + ".create", force=1)

    # Create duplicate mesh with inputs
    cmds.select(base_mesh)
    dup_mesh = mel.eval("polyDuplicateAndConnect;")[0]
    dup_top_edges = [x.replace(base_mesh, dup_mesh) for x in top_edges]
    dup_bottom_edges = [x.replace(base_mesh, dup_mesh) for x in bottom_edges]

    # Create wire deformer on the duplicate mesh
    wire_deformer = mel.eval("wire -gw false -en 1.000000 -ce 0.000000 -li 0.000000 -w " + avg_curve + " " + dup_mesh + ";")[0]
    base_wire = avg_curve + "BaseWire"
    base_wire_shape = cmds.listRelatives(base_wire, c=1)[0]
    cmds.connectAttr(avg_node + ".outputCurve", base_wire_shape + ".create", force=1)

    # Set wire deformer params
    cmds.setAttr(wire_deformer + ".scale[0]", 0)
    cmds.setAttr(wire_deformer + ".envelope", 1.2)
    editAttrs(wire_deformer, ["ce", "te", "li", "ro", "sc[0]"], l=1, k=0, cb=0)

    # Set weights
    cmds.select(dup_mesh)

    # Zero all weights
    mel.eval("artAttrToolScript 4 \"" + wire_deformer + "\";")
    mel.eval("artAttrPaintOperation artAttrCtx Replace;")
    mel.eval("artAttrCtx -e -value 0 `currentCtx`;")
    mel.eval("artAttrCtx -e -clear `currentCtx`;")

    # Select vertices
    cmds.select(dup_mesh)
    mel.eval("changeSelectMode -component;")
    mel.eval("hilite -r " + dup_mesh + " ;")
    mel.eval("setComponentPickMask \"Line\" true;")
    cmds.select(dup_top_edges, dup_bottom_edges)
    mel.eval("ConvertSelectionToVertices;")

    # Grow the selection before weighting
    for i in range(selection_growth):
        mel.eval("GrowPolygonSelectionRegion;")

    # Set weights for the region to 1
    mel.eval("artAttrInitPaintableAttr;")
    mel.eval("artAttrValues artAttrContext;")
    mel.eval("toolPropertyShow;")

    mel.eval("artAttrToolScript 4 \"" + wire_deformer + "\";")
    mel.eval("artAttrPaintOperation artAttrCtx Replace;")
    mel.eval("artAttrCtx -e -value 1 `currentCtx`;")
    mel.eval("artAttrCtx -e -clear `currentCtx`;")

    # Smooth n times
    cmds.select(dup_mesh)
    mel.eval("artAttrInitPaintableAttr;")
    mel.eval("artAttrValues artAttrContext;")
    mel.eval("toolPropertyShow;")
    mel.eval("artAttrPaintOperation artAttrCtx Smooth;")

    for i in range(smoothing):
        mel.eval("artAttrCtx -e -clear `currentCtx`;")

    mel.eval("changeSelectMode -object;")
    cmds.select(cl=1)

    # Group and rename all objects
    main_grp = cmds.group(em=1, w=1, n=resolveName("GRP_sticky"))
    cmds.setAttr(main_grp + ".visibility", 0)
    editAttrs(main_grp, ["tx", "ty", "tz", "rx", "ry", "rz", "sx", "sy", "sz"], l=1, k=0, cb=0)

    top_curve = cmds.rename(top_curve, resolveName("CRV_sticky_top"))
    bottom_curve = cmds.rename(bottom_curve, resolveName("CRV_sticky_bottom"))
    avg_curve = cmds.rename(avg_curve, resolveName("CRV_wire"))
    base_wire = cmds.rename(base_wire, resolveName(avg_curve + "BaseWire"))

    cmds.parent(top_curve, bottom_curve, avg_curve, base_wire, main_grp)
    if cmds.listRelatives(dup_mesh, p=1):
        cmds.parent(dup_mesh, w=1)

    wire_deformer = cmds.rename(wire_deformer, resolveName("stickyWire"))
    cmds.select(dup_mesh)

Have fun!

 

 

Procedural spider rig

Rig made for a TV mini-series.
I set up various expressions to drive the procedural walking.
Each leg has its own procedural expression, so the local distances during walking are evaluated correctly for each leg.
Model: Adam Muratoff, Anthem Fx.


Runway

While rebuilding the website from scratch, I took some spare time to create this illustration.
Enjoy it, and keep crawling around for the new website ;)


Updates

Hi all,
My work and other commitments kept me away from updating the website for a long time.
I am going to come up with a new website in the next few weeks, with updated content and free tools for you.
Stay tuned!

tsPoseSpaceDeformer demo

This is a demo of one of my latest tools. It is inspired by Michael Comet’s pose space deformer plugin, but with a substantial difference: no extra plugin is needed.
The pose space deformation is realized by creating a corrective blendshape target driven by a node network.
Here is how it works:

- Select your mesh in the binding pose. The tool will store the character’s current pose as the base pose;
- Put your character in a problematic pose and press ‘Start sculpting’. A proxy mesh (the green one) is generated and lets you correct the mesh geometry to the desired shape. Once finished, hit ‘Finalize sculpting’;
- The next phase is connecting the corrective shape to your pose. You can choose between 3 different methods: (Attribute) Expression, (Attribute) Node network, (Joint) Radial distance.
The first two work basically in the same way, except that the first is a combination of an expression and a node network, while the second is fully network based. Each attribute determining the pose
is considered as a coordinate in a hyperspace. The blend target weight is the result of the normalized projection of the current pose vector onto the “target pose – base pose” vector.
The third method, instead of working on the single attributes, works on a per-influence basis. The blendshape weight is the result of a combination of the radial distances between the base pose and the target pose for each influence determining the pose.
- You can choose which attributes/influences determine the pose in the Attribute list tab. The tool automatically proposes the influences whose coordinates changed from the base pose to the current pose.
- There is an option for pruning the blendshape weights (you can see the auto-pruned target maps in the video): very useful if you end up creating tons of corrective shapes and you don’t want to lose performance;
- From the mirror frame, you can enable/disable the creation of the mirror blendshape;
- Finally, to connect the blendshape and its mirror, just click on “Create corrective blendshape”.
- Once finished, you can tweak the weights of your targets along the deformation by editing the ramp shaders driving them.
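The projection described above can be sketched in plain Python. This is an illustrative sketch, not the tool's actual code: the function name, the 2-attribute pose vectors, and the clamping to [0, 1] are my assumptions for the example.

```python
# Illustrative sketch of the normalized-projection weight.
# Poses are tuples of driving attribute values (coordinates in the
# "pose hyperspace"); names and clamping are assumptions for this example.
def corrective_weight(current, base, target):
    d = [t - b for t, b in zip(target, base)]     # target pose - base pose
    c = [p - b for p, b in zip(current, base)]    # current pose - base pose
    denom = sum(x * x for x in d)                 # squared length of d
    if denom == 0.0:
        return 0.0                                # degenerate pose: no weight
    w = sum(x * y for x, y in zip(c, d)) / denom  # normalized projection
    return max(0.0, min(1.0, w))                  # clamp to blendshape range

# Halfway between base (0, 0) and target (90, 20) -> weight 0.5
print(corrective_weight((45.0, 10.0), (0.0, 0.0), (90.0, 20.0)))  # 0.5
```

At the base pose the weight is 0, at the target pose it is 1, and it ramps linearly along the line between them.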

Future developments:

- High res sculpting for displacement map generation and driving.

Thanks for watching

Tommaso

Instancing random objects preserving frequency

Hi all. After the post on the math involving frequency for procedural animation, it is now time to talk about randomness.

During a production in the studio we had to deal with the following problem: instance some base objects on particles randomly to form a crowd, using the Maya Instancer, while keeping control over the instance count of each object (to have some characters less frequent than others).

The Gaussian approach

The most natural attempt would be to use a Gaussian (or normal) distribution, which produces a non-uniform distribution of samples, denser around the mean value, with the variance as a measure of the width of the distribution.

Following the Gaussian approach, we can use the result of a random.gauss() function as the index of the element extracted from the list of our objects to instance:


import random

numObjs = 10
numSamples = 600
dev = 2

random.seed(2233)
l = range(numObjs)
count = [0 for x in l]

for i in range(numSamples - 1):
    while True:
        v = int(abs(random.gauss(0, dev)))
        if v < len(l):
            count[v] += 1
            break

print(count)

The list “l” contains the objects we want to instance (for the sake of simplicity, I filled it with the numbers [0, 1, …, numObjs-1]), while numSamples represents the number of objects we want to instance (i.e. the number of extractions).
To get stable results, I initialized the seed of the random function (important!).
What we are doing in the two nested loops is extracting a random value from the Gaussian distribution numSamples times, with 0 as the mean value (the bell centered at the origin) and dev as the width of the bell curve. Since we are using the result of the extraction as an index into the list containing our elements, the result must be an integer value equal to or greater than zero. Because our list has a fixed size (equal to len(l)) and the bell goes from -infinity to infinity, we repeat the extraction whenever the resulting value falls outside the list’s index range.
The code above produces the following count:

[209, 175, 112, 61, 37, 3, 2, 0, 0, 0]

As we can see, the values on the left are extracted more frequently than those on the right.
Because we are instancing objects, it would be nice to get some instances for each object.
To achieve this, we can enlarge the width of the bell curve by increasing dev:

dev=3: [141, 126, 117, 82, 53, 38, 27, 12, 1, 2]

dev=5: [92, 94, 78, 86, 68, 64, 36, 35, 25, 21]

dev=10: [67, 69, 63, 65, 67, 52, 61, 59, 43, 53]

As the numbers above suggest, the higher the deviation, the more uniform the values tend to be. By playing with the dev value, we have a certain amount of freedom in driving the extraction frequency of each sample, and thus the instancing frequency of each object.
Even if this approach is mathematically correct, it lacks control over the real count (try changing the seed and you will easily get different values), and it gives unpredictable results when the population (the number of base objects) is numerically comparable to the number of extractions (the objects to instance).
For example, let’s see this case:

numObjs= 20, numSamples=30, dev=2:

[13, 7, 4, 4, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]

We see that the bell is not large enough to give at least one instance to the last elements. Let’s try to increase dev:

dev=10: [3, 6, 1, 2, 1, 2, 1, 0, 2, 2, 2, 1, 1, 0, 0, 0, 0, 1, 1, 3]

Now we get a more even distribution, but we see some 0s around, which means some objects will never be instanced. We can even see that the highest count is the second one (while we expect it to be the first): the element we want to be the most frequent ends up less frequent than the second one. By changing the seed we could get luckier, but we certainly don’t have full control over what’s going on.

The Modulo approach

Let’s follow another approach, using the modulo operation.
If we want to evenly distribute objects extracted from a list, we can extract the element at index:

index = n mod numObjs

where n represents the number of the current extraction:


l = ["Obj1", "Obj2", "Obj3", "Obj4", "Obj5"]
count = [0, 0, 0, 0, 0]

numInstances = 10

for i in range(numInstances):
    index = i % len(l)

    # instance object l[index]
    # ...

    print(l[index])
    count[index] += 1

print(count)

This code produces the extractions Obj1, Obj2, …, Obj5, Obj1, Obj2, … and the count is obviously equal for all objects (when the number of extractions is a multiple of the number of elements).
To get a random but numerically stable extraction, we can shuffle our list:


import random

random.seed(10)
l = ["Obj1", "Obj2", "Obj3", "Obj4", "Obj5"]
random.shuffle(l)
count = [0, 0, 0, 0, 0]

numInstances = 10

for i in range(numInstances):
    index = i % len(l)

    # instance object l[index]
    # ...

    print(l[index])
    count[index] += 1

The last step is adding the information about the frequency.
Let’s set a frequency for each element and rebuild the list l, repeating each element as many times as its frequency. For instance:

f['Obj1'] = 3
f['Obj2'] = 1
f['Obj3'] = 2

Our list would be: l = ['Obj1', 'Obj1', 'Obj1', 'Obj2', 'Obj3', 'Obj3']
Now, by shuffling this list, we can use the modulo operation for the extraction while preserving the frequencies set above:


import random

random.seed(10)

f = dict()
f["Obj1"] = 3
f["Obj2"] = 1
f["Obj3"] = 2

l = list()
for k, v in f.items():
    for i in range(v):
        l.append(k)

print("Base list:", l)

random.shuffle(l)

print("Shuffled list:", l)

count = dict()
count["Obj1"] = 0
count["Obj2"] = 0
count["Obj3"] = 0

numInstances = 6

for i in range(numInstances):
    index = i % len(l)

    # instance object l[index]
    # ...

    count[l[index]] += 1

print("Objects count:", count)

The above code produces the following output:

Base list: ['Obj1', 'Obj1', 'Obj1', 'Obj3', 'Obj3', 'Obj2']
Shuffled list: ['Obj2', 'Obj1', 'Obj1', 'Obj3', 'Obj1', 'Obj3']
Objects count: {'Obj1': 3, 'Obj3': 2, 'Obj2': 1}

Let’s raise the number of instances (numInstances=600):

Objects count: {'Obj1': 300, 'Obj3': 200, 'Obj2': 100}

We can see that the frequency is preserved. Everything works so far, except for one last thing: we are getting a pseudo-random extraction that repeats the same sequence every time we run through the whole list l.
To get a more random behaviour, we can shuffle the list every time we restart from the first element, i.e. when index == 0:


import random

random.seed(10)

f = dict()
f["Obj1"] = 3
f["Obj2"] = 1
f["Obj3"] = 2

l = list()
for k, v in f.items():
    for i in range(v):
        l.append(k)

print("Base list:", l)

random.shuffle(l)

print("Shuffled list:", l)

count = dict()
count["Obj1"] = 0
count["Obj2"] = 0
count["Obj3"] = 0

numInstances = 12

for i in range(numInstances):
    index = i % len(l)
    if index == 0:
        random.seed(i)
        random.shuffle(l)

    # instance object l[index]
    # ...

    print(l[index])
    count[l[index]] += 1

print("Objects count:", count)

And don’t forget to change your seed before the shuffle() operation, or you’ll get the same list over and over.
The code above produces the following sequence:

Obj1, Obj1, Obj2, Obj1, Obj3, Obj3, Obj1, Obj2, Obj1, Obj1, Obj3, Obj3

With Objects count: {'Obj1': 6, 'Obj3': 4, 'Obj2': 2}.

Finally, we have a way to instantiate objects randomly while preserving control over their frequency.
I suggest using the Gaussian approach only when the base population is not numerically comparable to the number of instances.
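As a wrap-up, the whole modulo approach can be condensed into one small reusable function. This is an illustrative sketch (the name and signature are my own, not production code), and unlike the snippet above it reuses one seeded random.Random stream instead of reseeding at every pass, which is the more idiomatic way to get a reproducible sequence:

```python
import random

def frequency_instancer(frequencies, num_instances, seed=0):
    """Return num_instances object names, randomly ordered but with
    relative frequencies preserved (hypothetical helper for this post).

    frequencies: dict mapping object name -> integer frequency.
    """
    # Build the base list: each name repeated as many times as its frequency.
    pool = [name for name, freq in frequencies.items() for _ in range(freq)]
    rng = random.Random(seed)  # one seeded stream, no per-pass reseeding
    out = []
    for i in range(num_instances):
        index = i % len(pool)
        if index == 0:
            rng.shuffle(pool)  # reshuffle at the start of every full pass
        out.append(pool[index])
    return out

seq = frequency_instancer({"Obj1": 3, "Obj2": 1, "Obj3": 2}, 600)
# Counts are exactly proportional whenever num_instances is a multiple
# of the pool size: here 300 / 100 / 200.
```

Every full pass through the shuffled pool emits each object exactly frequency times, so the counts stay under control no matter which seed you pick.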