headus 3D tools headus 3D tools / 3D scans
Support Forums
 

Duplicating template layouts -- easiest way?

 
Dodger



Posts: 83
Joined: 14 Jan 2006

Posted: Sun May 24, 2009 7:38 pm    Post subject: Duplicating template layouts -- easiest way?

Okay, so here now I'm trying to do something different than usual...

Rather than trying to find the best UVs for a mesh, in this case I'm trying to duplicate an existing UV template so that a figure can take something else's textures (and more than that, actually).

However, the figure that I'm trying to match (in this case, DAZ's Millennium 3 figures, for starters) does *not* have optimum UVs. Also, the templates are made in a non-square aspect ratio, and are layered for multiple texture file use: there's a "Head Map" and a "Body Map", an inexplicable "Teeth Map" (gums, too) for those who need photorealistic dental renders or something, and a separate bit for eyebrows, eyelashes and pubic hair (the "Hair Trans Map" template).

So basically I'm trying to match my figure to V3/M3 texture UVs, but, of course, with different meshes.

I'm thinking there are tricks I don't know, because all I've done so far is try to make "best" UVs.

Any thoughts?

(To those who might object: no need to argue about IP nonsense -- while I firmly think that some companies are being absurd with their interpretation of IP law when they claim copyright on UV outlines, I'm not going to be distributing the Mil-3-mapped figure anyway; I'm using it to make an automatic texture converter program to ship with the figure.)
headus
Site Admin


Posts: 2894
Joined: 24 Mar 2005
Location: Perth, Australia

Posted: Sun May 24, 2009 8:35 pm

So if I read that correctly, you want to copy the UVs from one mesh to another even though the mesh structure/topology is different, but I'm assuming the overall shapes are the same? There are no tricks unfortunately ... at the moment there are only tools to copy UVs between meshes with the same topology.

It's an interesting idea though ... and maybe not too hard to do. For every vertex on mesh A you'd find the closest point on the surface of mesh B, and pull the UV from there. Seams might be a bit trickier, probably requiring some hand cleanup. I'm doing a fair bit of flying over the next few weeks, so I'll make this my "if the movie is boring" fallback entertainment :-)
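A rough sketch of that closest-point idea, simplified to nearest *vertex* rather than nearest point on the surface. All names here are illustrative, not UVLayout's internals:

```python
def transfer_uvs_nearest(verts_a, uvs_a, verts_b):
    """For each vertex of mesh B, copy the UV of the closest mesh A vertex.

    verts_a, verts_b: lists of (x, y, z) tuples
    uvs_a: list of (u, v) tuples, parallel to verts_a
    """
    def dist2(p, q):
        # Squared distance is enough for comparing which vertex is closest.
        return sum((a - b) ** 2 for a, b in zip(p, q))

    uvs_b = []
    for vb in verts_b:
        # Brute-force nearest neighbour; a k-d tree would be used in practice.
        nearest = min(range(len(verts_a)), key=lambda i: dist2(verts_a[i], vb))
        uvs_b.append(uvs_a[nearest])
    return uvs_b
```

A real tool would project onto the triangle surface and interpolate rather than snap to a vertex, but the overall shape of the loop is the same.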

Phil
Lewi



Posts: 61
Joined: 14 Jul 2006

Posted: Mon May 25, 2009 10:51 am

For now, if you can get the seams as close as possible on each model you could try the "Liquify" filter in Photoshop to stretch and warp the texture to fit the new mapping. It would probably be best to do it one shell at a time. Free transform it roughly into position and then apply the "Liquify" and tweak away.
Dodger




Posted: Mon May 25, 2009 1:12 pm

Coolness.

What I was thinking, specifically, is that, in theory, the objects wouldn't necessarily even have to have exactly the same shape/dimensions/etc., provided they were reasonably similar.

Then the idea is you could load Mesh A, the one with the UVs to match, and then load Mesh B, the one which needs UVs set. Mesh A would be loaded in a sort of "locked-readonly-edit" mode while mesh B would be loaded in "new UVs" mode, theoretically.

Then you'd cut Mesh B as closely as possible to Mesh A, drop the shells, and find their Mesh A equivalents. The seam would be stretched to match the mesh A shell target and the user might have to place some points. Then the *seam edges* would be flattened-but-held-in-place, so they could slide along the target mesh B's edges, but not otherwise move, until they were the right relative distances from one another.

Then optionally you might be able to pin a few other bits, to make sure, for instance, cheekbones or lip edges remained in the right place.

Then finally the whole thing would flatten as normal, with the edges and other bits pinned in place.

The main thing, though, would be that, due to different map aspects on different layers, it would have to treat the mesh as separate groups, one per map layer.

If, though, they really *were* exactly the same shape (or almost exactly), then I actually know a code approach that I've worked out. It would work like so:

1: A little bit of inflate should be automatically applied to the mesh to be copied from -- though it's probably a matter of trial and error how much, and whether to do it before the rest, or only after the rest doesn't work the first time.

2: Then this approach should work:

Code:

    Triangulate the donor mesh (the one the UVs are to be copied from)
    For each vertex in the recipient mesh (the mesh to receive UVs)
        For each polygon in the recipient mesh in which that vertex takes part
            Calculate the normal of the poly
        Average those normals to get the vertex normal
        Temporarily rotate the universe (both meshes) so that the positive Z axis is aligned with the vertex normal just calculated
        Temporarily translate the universe so the vertex is at X,Y,Z of 0,0,0
        For each polygon in the donor mesh
            Unless the polygon crosses both the X and Y axes and has positive Z coordinates (i.e. is in front of the recipient vertex)
                Skip it and go to the next polygon
            Calculate the barycentric weights of the recipient vertex against the poly in the local X/Y plane
            If none of the weights are negative (i.e. the vertex is inside the poly in X/Y space)
                Give the vertex UV coordinates interpolated from the donor poly's corner UVs using those weights


Then all that needs to be worked out is how to deal with the edges, since this approach will create "un-seamed" UVs with "stretchies" that jump across the seams, in places. Detect these stretchies, apply automatic seam cutting, and you're done!
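The barycentric step in the pseudocode above, written out in 2D: given a donor triangle already projected into the recipient vertex's local X/Y plane, test whether the point lies inside and interpolate its UV. Illustrative sketch only; the names are made up for the example:

```python
def barycentric_uv(tri_xy, tri_uv, px, py):
    """Return interpolated (u, v) at point (px, py), or None if outside.

    tri_xy: three (x, y) corners of the projected donor triangle
    tri_uv: the (u, v) at each corner
    """
    (x1, y1), (x2, y2), (x3, y3) = tri_xy
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    if det == 0:  # degenerate (zero-area) triangle
        return None
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    if min(w1, w2, w3) < 0:  # a negative weight means the point is outside
        return None
    u = w1 * tri_uv[0][0] + w2 * tri_uv[1][0] + w3 * tri_uv[2][0]
    v = w1 * tri_uv[0][1] + w2 * tri_uv[1][1] + w3 * tri_uv[2][1]
    return (u, v)
```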
Back to top
View user's profile Send private message
Dodger




Posted: Mon May 25, 2009 4:12 pm

Lewi wrote:
For now, if you can get the seams as close as possible on each model you could try the "Liquify" filter in Photoshop to stretch and warp the texture to fit the new mapping. It would probably be best to do it one shell at a time. Free transform it roughly into position and then apply the "Liquify" and tweak away.


Oh no, I'm talking about making a texture converter program.

Once I have two copies of my mesh, one mapped like V3 and one mapped the way mine is mapped, it's a relatively simple procedure (though somewhat resource-consuming).

Effectively I simply make a script go through and find each triangulated polygon from the "mapped-as-V3" version of my mesh, then, using the ImageMagick libraries, extract that triangle of texture-ness from the source texture. Then I use an affine transform to reshape that triangle to its new coordinates. Repeat for each triangle.

Of course there's also the complication that I have to consider different materials (edit: different groups of materials -- I don't need to treat SkinHead, Skinscalp, Eyesocket, Lips, InnerMouth, Tongue, Lacrimal* and nostrils as separate objects, as they all go on the same map) as if they were separate objects, due to the overlapping maps, but that's relatively easy to handle as well.

*Their inaccurate material name for the caruncula
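The per-triangle affine step described above boils down to solving for the 2x3 transform that maps one texture-space triangle onto another. A sketch of just that solve (the real converter would hand the resulting matrix to ImageMagick; names here are illustrative):

```python
def affine_from_triangles(src, dst):
    """Return (a, b, c, d, e, f) with x' = a*x + b*y + c, y' = d*x + e*y + f,
    mapping the three src corners onto the three dst corners."""
    (x1, y1), (x2, y2), (x3, y3) = src
    # Determinant of [[x1,y1,1],[x2,y2,1],[x3,y3,1]]
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    if det == 0:
        raise ValueError("degenerate source triangle")

    def solve(t1, t2, t3):
        # Cramer's rule for a*xi + b*yi + c = ti, i = 1..3
        p = t1 * (y2 - y3) - y1 * (t2 - t3) + (t2 * y3 - t3 * y2)
        q = x1 * (t2 - t3) - t1 * (x2 - x3) + (x2 * t3 - x3 * t2)
        r = (x1 * (y2 * t3 - y3 * t2) - y1 * (x2 * t3 - x3 * t2)
             + t1 * (x2 * y3 - x3 * y2))
        return p / det, q / det, r / det

    a, b, c = solve(dst[0][0], dst[1][0], dst[2][0])
    d, e, f = solve(dst[0][1], dst[1][1], dst[2][1])
    return a, b, c, d, e, f

def apply_affine(m, x, y):
    a, b, c, d, e, f = m
    return a * x + b * y + c, d * x + e * y + f
```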
Dodger




Posted: Mon May 25, 2009 8:08 pm

Right now here's the approach I'm taking to do this manually:

1: Take a lot of snapshots of the cuts on the mesh to copy UVs from
2: Run a quick script to square the aspect ratio of the Head and Body maps (the dental and hair trans layers are fine). (If I did any flatten without the map aspect ratios being fixed first, there would be a lot of distortion in the next step.)
3: Remap the mesh to copy UVs from slightly, by hitting "F" on everything just once so it scales the shells
4: Export this, pull it into UVMapper, and save out a template
5: Load the original with the saved template as a texture. Hide or show shells as necessary to position and scale them as closely as possible to the UVLayout scaled-but-changed ones (all without changing the UVs themselves)
6: Remap my mesh, duplicating the cuts as closely as possible, using symmetry when feasible
7: Position each shell to arrange it as closely as possible to the result I got before
8: Optimise, and go have lunch
9: Come back from lunch, find corners and pin them. Drag pinned corners to the equivalent corners of the mesh to copy UVs from
10: Optimise for a while
11: Pin midpoints and peaks along edges to their equivalent spots -- not all edges though, as my eyeballing may get things wrong
12: Optimise again
13: Repeat with midpoints between pins

etc etc
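The aspect-squaring script in step 2 above can be tiny. A sketch under one plausible convention (scale V by height/width so equal UV distances cover equal pixel distances; the right direction depends on how the maps are laid out, so treat this as illustrative):

```python
def square_uv_aspect(uvs, width, height):
    """Rescale V for a width x height template so U and V units match.
    One plausible convention only; invert the factor for the other direction."""
    aspect = height / width
    return [(u, v * aspect) for (u, v) in uvs]
```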

Right now I'm doing this and it seems to be working. Tedious, though, and I can't get a high enough resolution trace map (4k isn't enough, and it's topping out there -- when I set it higher it resets itself down to 4k) to really see what I'm doing in the area around the eyes. However, I can always pull the update into UVMapper where I get higher texture res, position the seam vertices there for those bits, pull it back into UVLayout, pin them, and repeat.

Eventually I'll have this working all the way.
Lewi




Posted: Tue May 26, 2009 1:06 am

Nicely figured out, glad it's working. It certainly would be nice to do this semi-automated; transferring textures to different topologies is tricky at the moment. The pixel density of the new map might become an issue if the warping is too large, but nothing a little massaging won't fix. Thanks for sharing your workflow and ideas. Lewi
Dodger




Posted: Tue May 26, 2009 7:26 pm

NP.
It's working, albeit...
Code:

v   e    r      y          s    l    o      w          l          y               y                         y
Dodger




Posted: Tue May 26, 2009 7:30 pm

You know what would make some of this really easier... if we could move vertices in the 3D view...

Then when I see seams that don't match up, rather than switching back to UV mode and hoping I grab the right one and move it the right way, I could just move 'em!
headus
Site Admin



Posted: Tue May 26, 2009 10:10 pm

"Then the *seam edges* would be flattened-but-held-in-place, so they could slide along the target mesh B's edges"

OK, I can see that working. So you're not necessarily after a totally automatic solution, but something to help match shell outlines after they've been cut already, right?

Phil
Dodger




Posted: Wed May 27, 2009 3:20 pm

Totally.

I can't possibly imagine a totally automatic solution for differently shaped meshes anyway. That requires human intervention, unless you happen to be inventing AI in your spare time B^)

Basically, if you have two differently but similarly shaped meshes (perhaps two people with different proportions, a different default pose, etc.), very likely with different materials (i.e. one might have SkinChest and Nipple, where the other might have SkinTorso, Aureola and Nipple) -- and since UVLayout ignores materials anyway -- then there's no way without a human to decide on the "-ness" of things. A computer has no way to know where "Nipple-ness" is or "butt-crack-ness" or whatever, and even Skynet and C-3PO wouldn't be able to conceive of it, much less a real PC. Even Data from Star Trek would be hard-pressed to grasp it, I think.

Another thing that would aid this procedure, I'm seeing right now as I'm doing it, is the idea of "suggestion pins" and reference curves.

I'm trying to think of the best way to explain this idea... I think maybe a user story would work best for this, and cover the way I'm seeing the rest, too:

User loads up both his meshes, donor and recipient, cuts his seams as closely as possible to the donor's, drops things, flattens a shell, marks his edge reference pins (corners, peaks, valley bottoms, etc.) and snaps it to the donor shell. The outline snaps to, relaxing/flattening as much as possible to preserve the edge length ratios, and the rest is left as-is for now until he flattens or optimises it.

He runs flatten and the mesh eases into a flatter shape, allowing him to see where distortion exists. Moreover, some way to see compared distortion would help (perhaps a key that hides the current shell inside a brush circle and shows the distortion underneath, or even a colour code option to show distortion difference rather than absolute distortion -- so in difference mode, if the donor mesh is red-compressed and the recipient mesh is nice minty-green perfect, the recipient would show as blue because it *needs to be compressed* to match what's going on underneath).

Then he looks in 3D mode, which would allow him to offset the donor mesh from the recipient so he can see both. He draws a line (which should not have to conform to edges) on either mesh, and it is made into some sort of 3D bezier curve constrained to the surface of the mesh (perhaps optimised to no more than 5 vertices in the bezier, or based on length). He can adjust it and, more importantly, he can adjust points on the OTHER mesh, because the bezier appears in the same relative place according to the UVs. The same adjustments can be made in UV space as well, but would be easiest to work with in 3D space.

The idea behind these reference curves is that the software would then know that each bezier point corresponds to its counterpart on the other mesh, and would move the UVs to match the curves, rather than pinning specific vertices.

"Suggestion pins" would work very similarly and I'd probably use the same underlying mechanism. These would be points that are pinned, but not absolutely. They would be allowed to move to optimise the mesh, but as a last resort and less than other vertices (except ones that are absolutely pinned).

The suggestion pins would be very handy in other circumstances as well, not just UV matching. They could be used to stretch out areas that are overlapping or kinking, without making those places absolutely fixed -- so they can still optimise, but remain generally where the user placed them. There are many times where I use the "smoosh over here" brush to move things where I want them, but the preponderance of the mesh outvotes it, and after an optimise things are back where they were before I pushed them.
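One way to read "suggestion pins" in solver terms: during relaxation, a soft-pinned vertex is averaged with its neighbours as usual, but is also pulled back toward its target with a weight (0 = free, large = nearly hard-pinned). A toy 1D sketch of that idea, entirely illustrative and nothing like UVLayout's actual solver:

```python
def relax(values, targets, weights, iterations=100):
    """Iteratively smooth a 1D chain of values with soft pins.

    values: initial floats along the chain
    targets[i]/weights[i]: soft-pin target and strength for each entry
    """
    vals = list(values)
    for _ in range(iterations):
        new = []
        for i, v in enumerate(vals):
            left = vals[i - 1] if i > 0 else v
            right = vals[i + 1] if i < len(vals) - 1 else v
            smooth = 0.5 * (left + right)  # neighbour average (the "optimise")
            w = weights[i]
            # Blend the smoothed position with the pin target by weight.
            new.append((smooth + w * targets[i]) / (1.0 + w))
        vals = new
    return vals
```

With large weights at the ends and weight 0 in the middle, the ends stay put and the middle relaxes to the average, which is the behaviour being asked for.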
Dodger




Posted: Wed May 27, 2009 3:28 pm

Oooh ooh oh, and on the shell outlines thing... some way for the software to know "this edge vertex needs to be here to make the seam work the same way as the other mesh's".

I.e. if a given vertex is at a given spot along the edge, relative to a donor mesh vertex, then the corresponding vertex on the other side of the seam should match itself to the corresponding vertex on the donor mesh's other side of the seam. This would prevent ending up with a mesh that fits perfectly but whose seams don't work right, because a texture designed for the donor, when loaded onto the recipient, would be offset and show seams.

In most cases this wouldn't even be really noticeable, so maybe it should be something that can be turned off. But, for instance, DAZ's Millennium 3 figures have a seam running right down the outside-upper part of the arm, right through prime tattoo territory. It's actually a pain, because short of using something like Projection Master in ZBrush, attempting to, for instance, reproduce my left arm -- which has a 30cm long tribal dragon head running down it -- would be nigh impossible in Photoshop, and *requires* projection software to "jump the seam" properly. An offset in this area would be disastrous for a tattoo on an existing texture, even if tiny details like skin and pores wouldn't be. (On something with more inherent edge or geometric detail, like a reptile, any seam would be a dead giveaway.)
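The "given spot along the edge" idea boils down to a normalized arc-length parameter along each side of the seam; two vertices match when their parameters match. A sketch of that parameterisation (names are illustrative):

```python
import math

def seam_parameters(points):
    """Given the ordered 3D points of one side of a seam, return each
    point's normalized arc-length position in [0, 1]."""
    total = 0.0
    cum = [0.0]
    for p, q in zip(points, points[1:]):
        total += math.dist(p, q)  # length of each seam segment
        cum.append(total)
    if total == 0:
        return [0.0] * len(points)
    return [c / total for c in cum]
```

To keep a texture from showing at the seam, a vertex at parameter t on one side would be forced to line up with the donor's UV position at parameter t on the corresponding side.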

Powered by phpBB © 2001, 2005 phpBB Group