Morph one 3D model into another using a vertex shader? (SOLVED)

Yes, as the topic says.
I want to interpolate vertex positions from one DAE object into another. This should be possible, but I lack the know-how, so if anyone has pointers or ideas, please help me.

Right now I’m handling 3D animated objects by just replacing the meshes frame by frame, but it should be possible to interpolate between them using a vertex shader. The different frames have the exact same number of vertices and have hopefully even kept their numbering through the conversion (I build in Maya LT, duplicate the skinned mesh, export to FBX using the Game Exporter, and finally convert to DAE using the Autodesk FBX Converter).

Anyway, has anyone been playing around with this?

This is an interesting question! In GLSL, you might want to try something like this for your vertex shader:

uniform float morphFactor;
in vec3 vertPosA;
in vec3 vertPosB;
in vec4 colorA;
in vec4 colorB;
out vec4 outColor;

void main() {
  // Blend position and color between the two models.
  gl_Position = vec4(vertPosA * (1.0 - morphFactor) + vertPosB * morphFactor, 1.0);
  outColor = colorA * (1.0 - morphFactor) + colorB * morphFactor;
}

As morphFactor moves from zero to one, the positions and colors for each vertex are linearly interpolated from model ‘A’ to model ‘B’. If you’re trying to do lighting and whatnot, there’d be some additional calculations required for the normals, but this will work as a starting point.

The trick would be figuring out how to get the vertex data from each mesh into vertPosA and vertPosB, and the same for color. From a pure OpenGL standpoint I know how to do that, but I’m not sure how you’d do it with Defold. I haven’t done a ton with GLSL recently, so I wouldn’t trust that code if I were you, but hopefully it gives you some ideas.
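The blend the shader performs can also be prototyped on the CPU first. Here is a minimal Python sketch of the same interpolation (the function name and flat-list layout are my own assumptions, not anything from Defold):

```python
def lerp_vertices(pos_a, pos_b, morph_factor):
    """Linearly blend two equally sized, flat [x, y, z, ...] position lists."""
    assert len(pos_a) == len(pos_b), "both meshes need the same vertex count"
    return [a * (1.0 - morph_factor) + b * morph_factor
            for a, b in zip(pos_a, pos_b)]

# morph_factor 0.0 gives mesh A, 1.0 gives mesh B, 0.5 is halfway:
print(lerp_vertices([0.0, 0.0, 0.0], [2.0, 4.0, 6.0], 0.5))  # [1.0, 2.0, 3.0]
```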

Ah, perfect, that’s what I needed, a starting point :slight_smile:
I won’t need either color or lighting, since I keep all the colors in a texture and the UV positions are exactly the same.

Thanks!

Would you be willing to share some of the making-of for this?

Ha, well of course. First and foremost, as you can see, I would like to use mesh morphing to get a slightly less messy chunk of models and to fix some animation frame-rate issues: I animate her at 30 fps while the Defold build runs at 60 fps.
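Blending between keyframes would smooth out exactly that frame-rate mismatch: at 60 fps playback of a 30 fps animation, every other rendered frame falls exactly between two keyframes. The bookkeeping is simple (a Python sketch with a hypothetical helper, not anything from the project):

```python
def frame_blend(play_time, anim_fps=30.0):
    """Return the two keyframe indices to blend between and the
    interpolation factor, for a given playback time in seconds."""
    exact = play_time * anim_fps
    frame_a = int(exact)
    return frame_a, frame_a + 1, exact - frame_a

print(frame_blend(0.5))  # (15, 16, 0.0)
```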

I’m gonna finish it up into the most perfect realtime music video I’ve ever done now.

@Bob: I saw your music video at our company kickoff last week and I must say I was very impressed! I love both the song and the video. Keep it up!

I agree, it was really great to watch and listen to! It’s also incredibly impressive to get that working without any built-in support for 3D animations.

To do the morphing, I think you could squeeze the positions of the second mesh into the normals of the first mesh, since you don’t seem to use the normals in the shading. You would need to do that as a pre-step, maybe with a Python script. I have been playing around with mesh generation for some GPU stuff lately; here is a simple Python script that creates meshes in a grid-type shape:

import sys

template = """<?xml version="1.0" encoding="utf-8"?>
<COLLADA xmlns="http://www.collada.org/2005/11/COLLADASchema" version="1.4.1">
  <asset><contributor><author></author><authoring_tool>FBX COLLADA exporter</authoring_tool><comments></comments></contributor><created>2012-10-10T12:34:45Z</created><keywords></keywords><modified>2012-10-10T12:34:45Z</modified><revision></revision><subject></subject><title></title><unit meter="1.000000" name="centimeter"></unit><up_axis>Y_UP</up_axis></asset>
  <library_geometries>
    <geometry id="pPlane1-lib" name="pPlane1Mesh">
      <mesh>
        <source id="pPlane1-POSITION">
          <float_array id="pPlane1-POSITION-array" count="{vertex_elem_count}">
            {positions}
          </float_array>
          <technique_common>
            <accessor source="#pPlane1-POSITION-array" count="{vertex_count}" stride="3">
              <param name="X" type="float"/>
              <param name="Y" type="float"/>
              <param name="Z" type="float"/>
            </accessor>
          </technique_common>
        </source>
        <source id="pPlane1-Normal0">
          <float_array id="pPlane1-Normal0-array" count="{vertex_elem_count}">
            {normals}
          </float_array>
          <technique_common>
            <accessor source="#pPlane1-Normal0-array" count="{vertex_count}" stride="3">
              <param name="X" type="float"/>
              <param name="Y" type="float"/>
              <param name="Z" type="float"/>
            </accessor>
          </technique_common>
        </source>
        <source id="pPlane1-UV0">
          <float_array id="pPlane1-UV0-array" count="{uv_elem_count}">
            {uvs}
          </float_array>
          <technique_common>
            <accessor source="#pPlane1-UV0-array" count="{uv_count}" stride="2">
              <param name="S" type="float"/>
              <param name="T" type="float"/>
            </accessor>
          </technique_common>
        </source>
        <vertices id="pPlane1-VERTEX">
          <input semantic="POSITION" source="#pPlane1-POSITION"/>
          <input semantic="NORMAL" source="#pPlane1-Normal0"/>
        </vertices>
        <triangles count="{tri_count}" material="lambert1">
          <input semantic="VERTEX" offset="0" source="#pPlane1-VERTEX"/>
          <input semantic="TEXCOORD" offset="1" set="0" source="#pPlane1-UV0"/>
          <p> {indices}</p></triangles>
      </mesh>
    </geometry>
  </library_geometries>
  <scene>
    <instance_visual_scene url="#"></instance_visual_scene>
  </scene>
</COLLADA>
"""

def flatten(x):
    return [v for sub in x for v in sub]

def gen_model(width, height, filename):
    quads = [[x, y] for x in range(width) for y in range(height)]
    positions = ([[x, y, 0, (x+1), y, 0, x, (y+1), 0, (x+1), (y+1), 0] for x,y in quads])
    positions = flatten(positions)
    vertex_elem_count = len(positions)
    vertex_count = vertex_elem_count // 3  # integer division so the accessor count is an int
    tri_count = len(quads)*2
    normals = ([[(x+0.5)/width, (y+0.5)/height, (x+(y*width))/float(width*height)] for x,y in quads for i in range(4)])
    normals = flatten(normals)
    uvs = [0.000000, 0.000000,
           1.000000, 0.000000,
           0.000000, 1.000000,
           1.000000, 1.000000]
    indices = [3, 2, 0, 3, 0, 1]
    indices = [[(i + q*4), i] for q in range(len(quads)) for i in indices]
    indices = flatten(indices)
    with open(filename, "w") as f:
        f.write(template.format(positions = " ".join(map(str, positions)),
            normals = " ".join(map(str, normals)),
            uvs = " ".join(map(str, uvs)),
            vertex_elem_count = vertex_elem_count,
            vertex_count = vertex_count,
            uv_elem_count = str(8),
            uv_count = str(4),
            indices = " ".join(map(str, indices)),
            tri_count = tri_count))

def main():
    width = int(sys.argv[1])
    height = int(sys.argv[2])
    filename = "models/grid_{}_{}.dae".format(width, height)
    gen_model(width, height, filename)

if __name__ == "__main__":
    main()
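As a side note, the interleaved index list the script emits pairs each vertex index with a UV index, because the `<triangles>` element declares two inputs at offsets 0 and 1. The same generation, isolated (a sketch mirroring the list comprehension above):

```python
def quad_indices(num_quads):
    """Interleave (vertex_index, uv_index) pairs for the two triangles
    of each quad, four vertices per quad, four shared UVs overall."""
    base = [3, 2, 0, 3, 0, 1]
    return [v for q in range(num_quads) for i in base for v in (i + q * 4, i)]

print(quad_indices(1))  # [3, 3, 2, 2, 0, 0, 3, 3, 0, 0, 1, 1]
```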

Vertex normals feel like the perfect solution.
My take is to just expand on my exporter script in MEL, where I already bake all my animation data.
Right now these are the steps I take, and I just need to solve the final step:

  1. Loop through all the vertices of the source mesh.
  2. For each vertex, write the XYZ coordinates from the destination mesh into the vertex-normal data.

However, Maya of course averages the vertex-normal values right now, which kind of destroys them, but there should be a setting for that somewhere. I never knew what the whole “average normals” thing was about, but as always, when you dive deep into actually handling meshes yourself on the code side, you get a wonderful new perspective on it all. :slight_smile:

I guess I could also run into problems with backface culling, since it tends to rely on normals. However, if everything fails I could just output all the data in a new file, push it into the project, and read it straight in instead of trying to piggyback on the DAE format. That could be the best solution now that I think about it, and it will also work the day I actually want to use the normals for something :smiley:

Hopefully we will have a working mesh morpher for use with vertex animations any day now. In Maya LT, at least.

Maya normalizes normals (who would have thought), but at least I can store everything as vertex colors for now.

This is my MEL script for pushing all vertex coordinates from the destination mesh into the vertex colors of the source mesh.

for( $i=0; $i<16; ++$i )
{
    // Read the world-space position of the destination vertex.
    select -r morphed.vtx[$i];
    float $dstPos[3] = `xform -q -ws -t morphed.vtx[$i]`;

    // Store it as the vertex color of the matching source vertex.
    select -r mesh.vtx[$i];
    polyColorPerVertex -rgb $dstPos[0] $dstPos[1] $dstPos[2];
}

And then, to actually perform the morph (in Maya, for testing purposes), I do this:

for( $i=0; $i<16; ++$i ) 
{
    select -r mesh.vtx[$i] ;
    float $srcPos[3] = `xform -q -ws -t mesh.vtx[$i]`;
    float $dstPos[3] = `polyColorPerVertex -q -rgb`;
    float $intPos[3];
    
    $intPos[0]=$srcPos[0]+($dstPos[0]-$srcPos[0])*0.5;
    $intPos[1]=$srcPos[1]+($dstPos[1]-$srcPos[1])*0.5;
    $intPos[2]=$srcPos[2]+($dstPos[2]-$srcPos[2])*0.5;
    
    xform -ws -t $intPos[0] $intPos[1] $intPos[2] mesh.vtx[$i];
}

So it’s a start.
Now I need to figure out how to get the color data inside the exported FBX into the NORMAL field of the Collada DAE file. Any takers? :slight_smile:

I have solved it!
Finally. :slight_smile:

I’ve got a morphed mesh up and running in Defold now, so I can reduce the animation to a minimum number of keyframe meshes and interpolate between them. I just need to build a little conversion tool for the DAE files; right now I move the data manually in a text editor.
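A conversion step like that could be sketched in Python, for example by copying the POSITION float_array of the morph-target DAE into the Normal0 float_array of the source DAE. This is only a rough sketch that assumes the id naming from the grid template earlier in the thread, not the actual tool:

```python
import re

def swap_normals(src_dae, dst_dae):
    """Return the text of src_dae with its Normal0 float_array replaced by
    the POSITION float_array from dst_dae, so a vertex shader can read the
    morph target out of the normals."""
    # Grab the destination positions (first POSITION float_array).
    positions = re.search(
        r'POSITION-array"[^>]*>\s*(.*?)\s*</float_array>',
        dst_dae, re.S).group(1)
    # Paste them into the source's normal float_array.
    return re.sub(
        r'(Normal0-array"[^>]*>).*?(</float_array>)',
        lambda m: m.group(1) + positions + m.group(2),
        src_dae, count=1, flags=re.S)
```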

Thank you Ragnar for the nice normal-swap idea.
The FBX exporter clamps the RGB values to the 0–1 range, so before exporting I needed to add 2 and divide by 4. That means I’ll need to keep all vertex data in the meshes between -2 and 2.
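That packing can be sanity-checked with a quick round trip in Python (illustrative only):

```python
def pack(p):
    """Map a coordinate in [-2, 2] into the exporter's [0, 1] color range."""
    assert -2.0 <= p <= 2.0, "vertex data has to stay within [-2, 2]"
    return (p + 2.0) / 4.0

def unpack(c):
    """Invert the packing, as the vertex shader does with normal * 4 - 2."""
    return c * 4.0 - 2.0

print(pack(1.0))           # 0.75
print(unpack(pack(-1.5)))  # -1.5
```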

for( $i=0; $i<8; ++$i )
{
    select -r morphed.vtx[$i];
    float $dstPos[3] = `xform -q -ws -t morphed.vtx[$i]`;

    // Pack the coordinates from [-2, 2] into the [0, 1] color range.
    $dstPos[0]=($dstPos[0]+2)/4;
    $dstPos[1]=($dstPos[1]+2)/4;
    $dstPos[2]=($dstPos[2]+2)/4;

    select -r mesh.vtx[$i];
    polyColorPerVertex -rgb $dstPos[0] $dstPos[1] $dstPos[2];
}

Yeah, that’s the final MEL script, which puts the destination mesh’s vertex positions into the vertex color values of the source mesh.
Then I convert from FBX to DAE (and do some magic pasting in the file by hand for now),

and then, in the vertex-shader, this happens:

attribute mediump vec4 position;
attribute mediump vec4 normal;
attribute vec4 Color; // vertex color from the mesh (unused here)

uniform mediump mat4 mvp;
uniform lowp vec4 time;
varying mediump vec4 morphPosition;

void main()
{
    // Unpack the morph target stored in the normal: (v * 4 - 2) undoes
    // the (p + 2) / 4 packing; the / 100 is an additional scale tweak.
    morphPosition.x = (normal.x * 4.0 - 2.0) / 100.0;
    morphPosition.y = (normal.y * 4.0 - 2.0) / 100.0;
    morphPosition.z = (normal.z * 4.0 - 2.0) / 100.0;

    float sine_time = sin(time.x / 2.0);

    morphPosition.x = morphPosition.x * (1.0 - sine_time) + position.x * sine_time;
    morphPosition.y = morphPosition.y * (1.0 - sine_time) + position.y * sine_time;
    morphPosition.z = morphPosition.z * (1.0 - sine_time) + position.z * sine_time;

    gl_Position = mvp * vec4(morphPosition.xyz, 1.0);
}

And that will, with a nice sine, gently morph the first frame of the source mesh into the one saved in its normal values. Hurray!
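One detail for anyone copying this: sin() swings between -1 and 1, so a raw sine briefly pushes the blend factor outside [0, 1], extrapolating past both meshes. A remap keeps it in range (Python sketch of the same math):

```python
import math

def morph_factor(t):
    """Remap sin(t / 2) from [-1, 1] into [0, 1] so the blend never
    extrapolates beyond either mesh."""
    return (math.sin(t / 2.0) + 1.0) / 2.0

# The factor stays within [0, 1] for any time value:
assert all(0.0 <= morph_factor(t / 10.0) <= 1.0 for t in range(-100, 100))
```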

I dived down into how the DAE format works, and this little tool will soon be done and shared. Pew pew. Then I’ll start finishing up my animation with a bit more flair and fewer meshes :slight_smile:

Also, since the morph transform now happens outside of Maya, the data can be stored in the correct format, so we don’t have to waste precious GPU time on the unpacking multiplications.

Sorry for spamming without anyone really answering,
but here’s the tool, with a little readme on how to use it with a vertex shader. Very simple stuff.
I’m not using any MEL scripts anymore; that was just to get to know the format and try to solve it in Maya. Now it works perfectly on any DAE objects as long as they have the same number of vertices and such.

http://www.wadonk.com/wadonk-defold-daeMorpher.zip

Yay!
The final vertex shader ended up as:

uniform mediump mat4 view_proj;
uniform mediump mat4 world;
attribute mediump vec4 position;
attribute mediump vec4 normal;
attribute mediump vec2 texcoord0;
uniform lowp vec4 time;

varying mediump vec4 morphPosition;
varying mediump vec2 var_texcoord0;

void main()
{
    // Multiply view, projection and world matrices into one.
    mediump mat4 mvp = view_proj * world;

    // Pass the texture coordinate of this vertex on to the fragment shader.
    var_texcoord0 = texcoord0;

    // The morph target position is stored in the normal.
    morphPosition.x = normal.x;
    morphPosition.y = normal.y;
    morphPosition.z = normal.z;

    float sine_time = (sin(time.x / 2.0) + 1.0) / 2.0;

    morphPosition.x = morphPosition.x * (1.0 - sine_time) + position.x * sine_time;
    morphPosition.y = morphPosition.y * (1.0 - sine_time) + position.y * sine_time;
    morphPosition.z = morphPosition.z * (1.0 - sine_time) + position.z * sine_time;

    gl_Position = mvp * vec4(morphPosition.xyz, 1.0);
}

And here it is in “action”:
https://instagram.com/p/BGHePXMhjj4/

Thank you for sharing! It’s very interesting!

I have some notes and want to test using this in the future. Previously I used multiple models as frames for pixel-art-like flipbook animations, but with voxel shapes. Maybe I can use this to make smooth transitions between keyframes?

Cool, but are the voxels vertex-based?
This totally destroys the normal data, but as long as it’s vertex-position based it should work :slight_smile:

Yes, they are made with Qubicle and then exported / processed. I’ll share results when I test later.

Yeah, this is super cool and ultra inspiring! Really makes me want to spend the next 10,000 hours learning how to “art” :slight_smile:
