CLO Help Center

Edge problems with textured fabric

Comments

  • Official comment
    CLO Designers

    Hello @adna. There are a couple of things you can do to solve this issue:

    • With the Edit Sewing tool, select all of the pattern pieces for the garment and increase the fold angle closer to 360.
    • Select the fabric in the Object Browser. Under Material --> Basic Parameters --> Displacement Map, lower the Shift amount (the default is -0.30; lower means closer to -1.0).

    These settings will need adjusting depending on which CLO fabric you are using (Sherpa, Knit, etc.). See the attached image.

  • jdiduch

    I don't have an answer for you, only some personal musings that may help you understand what might be going on; apologies if none of this makes sense to you, as it barely makes sense to me. I was having somewhat similar problems, though my pieces were not nearly as thick as yours, but only when bringing exported meshes into Substance Painter. When I exported the mesh as "thick" in order to preserve the effects I had built up in CLO, as well as the "double sided" pieces, the outer edges looked horrible and I didn't understand why.

    It finally dawned on me that the surface information was being governed by the UV maps, and the UV maps were exactly the size of my flat pattern pieces. The thick edges of the pieces did not have any mapped texture information, so Substance was stretching the texture data around the edge, and what should have been multiple picks and ends in the weave of the cloth ended up really distorted. My thinking right now is that the only way around this for me is to export the mesh from CLO as "thin" and then extrude the edges in ZBrush (or something) in order to create actual mesh data that I can unwrap to an extended UV map. I wish I had someone with a better understanding of this to consult with.

    What I suspect is happening in your case is something akin to this. If you define any kind of thickness for your cloth, whether in the cloth's thickness parameters or in the "additional thickness" portion of the Property Editor, you need to set the 3D preview to "thick textured surfaces" or your pieces appear to float away from each other. CLO has created space between the layers to allow for what will be artificially created thickness, so we use the "thick textured surface" view to see the pieces connect; a bit like using some of the modifiers in Blender.

    It would appear to me that you have attempted to create more dimension than CLO is able to handle correctly, so there are gaps between the edges of your pieces, and the curvature and geometry of the edges aren't being extruded correctly. Look at the shoulder of your cable knit: there seems to be some displacement happening, but not nearly enough to recreate the actual geometry of the surface of your knit. I have been experimenting with displacement maps both in CLO and Substance for similar reasons, and I'm not sure whether there is something I am missing or the technology is not sufficiently advanced to handle things like this with maps instead of extremely complex mesh sculpting.

    I'm going to go have a coffee now and then re-read what I wrote to see if any of it makes any sense. 

  • ottoline

    jdiduch, if you set your CLO3D mesh to thin, export that model to Blender, and then add exactly the same thickness to the imported thin model inside Blender, using a thickness (Solidify) modifier set to the same pattern render additional offset plus the fabric preset thickness, you will get the surfaces to the same tolerance as you see in CLO3D. You can add an edge mesh material and normals to all seams as a modifier; that way you can get an exact match without the heavy export of a double-skin surfaced mesh.
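    A minimal sketch of the Blender side of this workflow, assuming Python scripting via Blender's bpy API. The object name, the millimeter values, and the helper names are illustrative assumptions, not CLO3D or Blender defaults; the thickness arithmetic runs anywhere, while the Solidify part only runs inside Blender.

```python
# Sketch: match CLO3D's "thin" export thickness in Blender with a
# Solidify modifier. The fabric preset thickness and the pattern's
# additional rendering offset (both in mm) are hypothetical inputs.

def clo_total_thickness_mm(preset_thickness_mm, additional_offset_mm):
    """Total surface thickness CLO3D simulates for a pattern piece."""
    return preset_thickness_mm + additional_offset_mm

def add_solidify(obj, thickness_mm):
    """Blender-only: add a Solidify modifier matching the CLO thickness.

    bpy is only available inside Blender, so it is imported lazily here.
    """
    import bpy  # noqa: F401 (obj is expected to be a bpy object)
    mod = obj.modifiers.new(name="CLO_Thickness", type='SOLIDIFY')
    mod.thickness = thickness_mm / 1000.0  # Blender units are meters
    mod.offset = 0.0  # thicken evenly on both sides of the thin mesh
    return mod

# e.g. a 1.2 mm knit preset plus a 0.8 mm additional offset:
print(clo_total_thickness_mm(1.2, 0.8))  # 2.0 mm total
```

    Inside Blender you would call `add_solidify(bpy.data.objects["garment"], 2.0)` after importing the thin mesh; the `offset = 0.0` choice mirrors CLO thickening the cloth around its simulated mid-plane.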

     

    The offset surface plane within CLO3D under displacement, when rendered in V-Ray, is relative to the greyscale displacement values. This means that if you have values that fluctuate between 0 and 1 (the depth texture map), they will be treated as additional to the surface plane and will not sit equidistant on either side of the mid-plane surface (the shift value). You therefore need to compensate by dialing down the displacement height of the graphic image so its thickness is centered on the mid-plane ("thin") simulated cloth surface; hence the negative -0.50 offset value, which sets the displacement 50% under the current surface and 50% above the cloth surface. That way, when the positive values in the displacement map kick in, there is a balanced leveling of the rendered mesh at all edges (i.e. they join seamlessly as a texture).
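    The balancing act described above can be sketched numerically. The formula below is an illustrative assumption, not CLO3D's or V-Ray's actual displacement code:

```python
# Sketch: a greyscale displacement map with values in 0..1 only pushes
# geometry *outward* from the thin mid-plane surface. A shift of -0.5
# recenters the range so half the displacement sits below the surface
# and half above, letting piece edges meet level.

def displaced_offset(map_value, amount, shift):
    """Offset from the thin cloth surface for one texel.

    map_value: greyscale sample in [0, 1]
    amount:    displacement height
    shift:     fraction of `amount` added uniformly (e.g. -0.5)
    """
    return (map_value + shift) * amount

amount = 1.0
# With no shift, everything sits on or above the surface:
print(displaced_offset(0.0, amount, 0.0), displaced_offset(1.0, amount, 0.0))    # 0.0 1.0
# With shift = -0.5 the range is balanced either side of the mid-plane:
print(displaced_offset(0.0, amount, -0.5), displaced_offset(1.0, amount, -0.5))  # -0.5 0.5
```

    With the -0.5 shift, a mid-grey texel (0.5) lands exactly on the simulated cloth surface, which is why edges of adjacent pieces line up instead of both floating above their mid-planes.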

    V-Ray: interactive displacement help that you can try in real time

    How displacement works in CLO3D - Part 1

    How displacement works in CLO3D - Part 2

    The only way to get rid of the plastic-porridge look of a fluffy fabric is to actually have texture map data in the normal and specular maps that renders the light correctly at all angles.

    That means better fabric texture maps, where the data is not lost in the digitization capture process. Which is possible.

  • jdiduch

    Ottoline,

    Thank you for your comments; the chaosgroup link was very enlightening and I will attempt the Blender mod.

    J

  • ottoline

    Yes, it's a handy site reference, as it gives an interactive playbook on what affects the texture in the render output, which can be abstract to new users unfamiliar with V-Ray.

    In general, I think you need to be aware that fabric CG quality is often about digitization quality (if taken off a real-world sample), so hardware also comes into play for some of the coarser weaves and open-mesh fabrics: you can only simulate relative to the frequency of data you retain at the time of digitization. Too many of the commercial fabric textures chop out all the really great frequency in a fabric texture that leads to the subtle qualitative aspects of softness, translucency, or color variation, or simply don't have the equipment (hardware) or technical know-how to retain that at the time of image capture. That is why I created my own hardware and processes, as I was shocked at how poorly many complex fabrics were digitized and then how poorly they rendered.

    However, there are ways to capture that quickly and accurately without getting into costly hardware or having to spend too much time using parametric texture tools or software that is complex to learn. So knowledge is certainly an advantage when it comes to fabric digitization at source.

     

    Below: digitization extraction of the yarn types, and then light transport maps that allow a single yarn to be lit correctly relative to the world light (HDRI) as it travels behind a weave.

     

    In the very bottom image you can see how the dye color (and the weave intensity of the warp/weft) changes as the light moves in behind the fabric. This is exactly how the real fabric behaves, yet it is missing from 100% of commercial textures; they simply don't include this crucial image data map. Bonkers, when knitwear and loosely woven fabrics generally all have this light transport quality.
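    A toy sketch of what such a light transport map contributes, with made-up numbers and a deliberately simplified shading formula (an illustrative assumption, not a real CLO3D or V-Ray shader):

```python
# Sketch: without a light transport (transmission) map, only front
# lighting contributes, so backlit gaps in an open weave render flat.

def shade_texel(albedo, front_light, back_light, transmission):
    """Very simplified shading for one texel of a loosely woven fabric.

    albedo:       surface color intensity in [0, 1]
    front_light:  light arriving on the camera-facing side
    back_light:   light arriving behind the fabric
    transmission: per-texel light transport map value in [0, 1]
                  (0 = solid yarn, 1 = open gap in the weave)
    """
    reflected = albedo * front_light
    transmitted = transmission * back_light
    return reflected + transmitted

# A gap in the weave (transmission 0.9) glows when backlit; solid yarn
# (transmission 0.05) barely changes:
print(shade_texel(0.6, 0.2, 1.0, 0.9))   # dominated by transmitted light
print(shade_texel(0.6, 0.2, 1.0, 0.05))  # dominated by reflected light
```

    Dropping the transmission map is equivalent to forcing `transmission = 0` everywhere, which is why backlit open weaves from commercial texture sets can look solid and plastic.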

     

  • ottoline

    The other fabric digitization issue for open weaves is the need for the masking layer to match the weave precisely. Otherwise the fabric's quality can suffer, looking plastic or too solid. This means precise masking down to the hair strand, which is possible if you use the correct hardware and capture it at source. Many don't, or push it into the 'too-hard' basket of technical woe.

     

    So if you have textures that toss this level of data out at the time of digitization, you end up with no way to put that level of quality back into the texture maps. (When I digitize any fabric, I do both sides at the same time, so the maps are pixel-perfect, which is crucial.) Then they will render down to the fiber level.

     

    Above, you can see that the normal curvature on a single strand of fiber is technically correct (to the curvature) and to the actual diameter of the hair strand. This is possible with today's technology, yet all (and I mean 100% of) commercial fabric texture digitization tosses out this data. And once it's gone, you have lost the ability to make a fabric render true to the light source at the time of capture. Frequency and noise within a fabric that is accurate (a digital equivalent) is what makes it sing. Caring about how to retain this data at the time of digital capture is the magic that leads to truly great fabric renders.
