
Implement 'fuzzy skin' displacement mapping #7985

Draft · wants to merge 1 commit into base: main
Conversation

undingen
Contributor

undingen commented Jan 9, 2025

This is a work-in-progress PR which modifies the fuzzy skin logic to displace the object's skin based on the values in an image.
Unfortunately I don't have a 3D printer, so I could only test this a bit in the slicer - I would love to see a 3D print using it...

Here are some sample objects with a brick displacement map applied:
Screenshot From 2025-01-09 12-36-11

image

To enable: set "fuzzy skin noise type" to "Displacement Map" and change "fuzzy skin displacement map filepath" to the path of the image to use. The image must be a small (ideally around 128x128 px) grayscale PNG with 8 bits per pixel, used as a displacement map. Each "fuzzy skin point distance" is one pixel in the texture (adjust it to something like 0.1 to get more detail).
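As a rough illustration of that sampling scheme (a hypothetical Python sketch, not the PR's C++ code), each point-distance step along a contour reads one pixel, wrapping around when the texture is smaller than the perimeter:

```python
def sample_offsets(contour_length_mm, point_distance_mm, row):
    """Walk a contour in steps of point_distance_mm; each step reads one pixel.

    `row` is one row of an 8-bit grayscale image (values 0..255). The row is
    tiled (wrapped) so a texture shorter than the contour repeats seamlessly.
    Returns normalized displacements in 0..1.
    """
    offsets = []
    n = round(contour_length_mm / point_distance_mm)
    for i in range(n):
        pixel = row[i % len(row)]      # wrap around the texture
        offsets.append(pixel / 255.0)  # 0 = no bump, 1 = full thickness
    return offsets
```

So a smaller point distance means more samples per millimetre, i.e. more texture detail ends up in the print.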

It's based on the initial work of @Poikilos in PrusaSlicer, which unfortunately did not gain much interest: prusa3d/PrusaSlicer#8649.
There have been various feature requests for similar things: #3992 #5229

This is of course not ready for merging, but I would like to see if the community is interested in this feature and get some feedback from the main contributors on where to store the PNG. There seems to be no existing feature with the same requirements.

Update:
I was able to fix the visual artifacts I was seeing, and it's no longer based on the PR which integrates libnoise.
Screenshot From 2025-01-26 12-29-34

Also, I would appreciate it if @Poikilos could take a look. I'm not sure if it's my modifications or a problem in the additional code, but I do see some artifacts on objects depending on orientation. If a wall of the object is closer to a 45° orientation it starts to show up - worse with Arachne wall generation - it seems to be some rounding error/float precision issue? ![Screenshot From 2025-01-08 22-29-47](https://github.com/user-attachments/assets/f3eee10c-d041-45fd-84fd-5de146a5445f)

I integrated it with #7678 as an additional noise type. But if that patch does not get merged, I can modify mine to work without it. So please only look at the last commit, which is the one implemented by me.

@OskarLindgren

I don't have a 3D printer so could only test this a bit in the slicer - I would love to see a 3D print using it...

I have both an Ender 5 Pro and a P1S, so both ends of the spectrum. I would love to help you test this in the real world (especially with multi-material, to see how that affects things), to see if it's viable and what issues could arise. Please reach out to me either here or on Discord: divide05

@discip
Contributor

discip commented Jan 9, 2025

FINALLY! 😊

Awesome! 👍

Contributor

@discip left a comment

First of all: Thank you very much for this PR! 👍

I think, for consistency reasons, it would be nice to either have all displacement maps in a predefined list (Perlin, Billow, Brick ...), or to have to select them by specifying a path.

Or even better:
A separate folder in which displacement maps are stored and their respective names are listed in a drop-down menu under Fuzzy Skin Noise Type.

Please correct me if I'm wrong. 😊

(Review comment on src/slic3r/GUI/Tab.cpp: outdated, resolved)
@Keavon

Keavon commented Jan 9, 2025

I'm not familiar with the code so I can't assess how easy or hard its implementation would be, but from a product design perspective, how about a drag-and-drop experience where the texture file can be placed onto the 3D model directly, or onto the dropdown menu in the UI which is used to pick the texture? It would probably be sufficient to just store one at a time (whichever was dropped onto the model or dropdown menu most recently). The dropdown menu could say "Custom Texture" if one is active. I think the drag-and-drop experience is reasonably easy but there could also be a dropdown entry called "Browse for Texture..." which, upon being selected, immediately opens the OS file browser dialog, and once successfully selected, switches to "Custom Texture". This would be preferable to putting files into some magical directory that users need to locate.

@Keavon

Keavon commented Jan 9, 2025

The image must be a small (ideally around 128x128 px) grayscale PNG, 8 bits per pixel, used as a displacement map. Each "fuzzy skin point distance" is one pixel in the texture (adjust it to something like 0.1 to get more detail).

Question: how does this all work to map a 2D texture to a 3D model if not with a UV map or some other technique like triplanar projection? And just to bring up the idea, how feasible would someday extending this to support UV-mapped displacement textures for full control of micro details on a model?

@Poikilos

Poikilos commented Jan 9, 2025

The way to avoid artifacts in angled walls is one of the following (depending on the shape of the object, one or the other is better):

  • use the object's rotation to transform the cube projection (vector math is not my strong suit, I may need some help here)
  • add cylinder projection (the math is pretty easy here, but it works better on objects with a curved top profile such as an upright cylindrical tower, not as well for sides that are flat from top view)

@Poikilos

Poikilos commented Jan 9, 2025

The image must be a small (ideally around 128x128 px) grayscale PNG, 8 bits per pixel, used as a displacement map. Each "fuzzy skin point distance" is one pixel in the texture (adjust it to something like 0.1 to get more detail).

Question: how does this all work to map a 2D texture to a 3D model if not with a UV map or some other technique like triplanar projection?

Look up "Cube Mapping" online to get the idea of it. Primitive shapes were often used as an alternative to UV mapping in early graphics. Primitive mapping is still often used for environment maps of various sorts (backgrounds, fake [cubic] reflection maps).

  • Cube mapping is mostly a lost skill (except that it can probably still be exported from Blender), so I think the 4-sided partial cube map is good for people who just have some seamless rectangular texture and want to throw it on a model. In fact, you can create a cube map that maps to the model perfectly, such as projecting an unwrapped UV map to a cube UV map in Blender, if you arrange the cube the same way (only 4 sides are used by my implementation, so just squish the top and bottom flat - edit: in Blender's UV editor - then move the UVs so the 4 edges map to the seams of the texture; typically you would want a texture with a 4:1 aspect ratio for this, 4 sides in a row in the order you want).
  • Of course, with models such as characters, there will be issues with concave areas including "very concave" areas such as between the body and an arm since the same part of the UV will cover both. Cube mapping (or my 4-sided cube mapping) is more suitable for structures, but could look ok on an organic model.
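A minimal sketch of the 4-sided idea (hypothetical Python helpers; the face order and the 4:1 strip layout are assumptions for illustration, not the PR's actual convention):

```python
def pick_face(dx, dy):
    """Pick which of the 4 side faces an outward direction (dx, dy) hits,
    using the dominant axis, as in a cube projection.
    Face indices: 0 = +X, 1 = +Y, 2 = -X, 3 = -Y."""
    if abs(dx) >= abs(dy):
        return 0 if dx >= 0 else 2
    return 1 if dy >= 0 else 3

def strip_u(face_index, local_u):
    """Map a face-local u in [0, 1) to a global u on a 4:1 texture strip
    whose four quarters are the four side faces, left to right."""
    return (face_index + local_u) / 4.0
```

With a seamless 4:1 texture, each face reads one quarter of the strip, so the seams land on the four vertical edges of the bounding box.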

And just to bring up the idea, how feasible would someday extending this to support UV-mapped displacement textures for full control of micro details on a model?

There seem to be 2 options, and option B would probably be better (way easier, but it still requires some work on code usually found only in graphics-oriented 3D programs); B would also reside in entirely different code from this PR:

A. (hacking this code even further, as in sending even more data to the slicer, not advisable)
You have to know the z position of the layer and the 3 verts to make a calculated normal (not a stored normal if present) of the face to recover the point's position on the original face using raycasting:

  • The face and face index (as corresponds to the UV map) would have to be attached (as variables/pointers) to the sliced point or segment resulting from it, then passed along to each fuzzy point (as a vector of 3 3d points and 1 face index or pointers to those 2 things).
  • During the displacement step, that is the hard part. You have to do raycasting from the fuzzy point to the preserved face, at a straight angle to the face:
    • calculate the face normal
    • move the vector along the face so it points to the position of the fuzzy point
    • intersect the ray with the face (raycasting) to recover the triangulated position
    • go back and recover the UV from the UV mapped face with the preserved face index, using that triangulation data to do the actual UV mapping

In other words, it is not very feasible, but possible.
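For what it's worth, the UV-recovery step at the end of option A boils down to barycentric interpolation on the preserved face; a simplified 2D sketch (hypothetical helper, not code from this PR):

```python
def barycentric_uv(p, a, b, c, uv_a, uv_b, uv_c):
    """Recover the UV of point p lying on triangle (a, b, c) by
    barycentric interpolation of the per-vertex UVs (2D case for brevity)."""
    def cross(o, u, v):
        # z-component of (u - o) x (v - o): twice the signed triangle area
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    area = cross(a, b, c)
    wa = cross(p, b, c) / area   # weight of vertex a
    wb = cross(p, c, a) / area   # weight of vertex b
    wc = 1.0 - wa - wb           # weight of vertex c
    return (wa * uv_a[0] + wb * uv_b[0] + wc * uv_c[0],
            wa * uv_a[1] + wb * uv_b[1] + wc * uv_c[1])
```

In the real 3D case the fuzzy point would first be projected onto the face plane (the raycasting step above) before computing the weights.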

B. (simpler, more standard)
The only way I could see around A is to utilize a UV map before slicing (before the coarse segment is sent, which in this case would become detailed and bumpy right away), which would make the feature entirely separate from fuzzy skin, which is done after (but technically compatible, if you want extra bumpiness overlaid on the UV-mapped displacement). You'd probably have to cache a detailed version of the model with displacement applied, at minimum to the detail of the image, or maybe adjustable to higher for smoothly sampling the image. Essentially it is traditional displacement mapping in that scenario. Just send the detailed cached model to the slicing algorithm and you're done.

If B were done eventually, that would replace code in this PR. It is something I didn't think of earlier. But it is similarly tricky in that there has to be a cached model with all of the detail applied to it, and it requires more modeling code (hence significantly more code added than this PR, depending on what is available in your 3D engine).

Also, keep in mind STL does not have a UV map. That and the complexity are reasons why I didn't do it. If we at least get primitive mapping working well for various object shapes that is probably going to solve most use cases. However, if someone knows the vector math better than me and can do what I described above with raycasting, or the preferred pre-slicing displacement (option B), go for it. If we get that working, people may start to use other formats such as OBJ which support UV maps. Probably save UV maps for later, to keep an incremental approach.
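Option B, stripped to its core, is classic displacement baking; a hypothetical sketch (assuming per-vertex normals and already-sampled heights are available, which is the part that takes real work):

```python
def bake_displacement(verts, normals, heights, scale):
    """Option-B style pre-slice bake: push each vertex along its normal by
    the sampled displacement height times a scale factor. The result is an
    ordinary (much denser) mesh that any slicer can consume unchanged."""
    out = []
    for (x, y, z), (nx, ny, nz), h in zip(verts, normals, heights):
        d = h * scale
        out.append((x + nx * d, y + ny * d, z + nz * d))
    return out
```

The memory cost Poikilos warns about comes from `verts` here: the mesh must be subdivided finely enough to carry the texture's detail before this step makes sense.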

@daniiooo

daniiooo commented Jan 9, 2025

This feature looks great. I would like to try it out, but I'm having a hard time procuring decently working displacement maps.
Would you mind providing some to test it with?

@daniiooo

daniiooo commented Jan 9, 2025

Alright, I got one test print going with the brick wall, since that one worked out of the box. Sandstone didn't show up on the sliced model with the same settings.

Settings
I'm not sure if those walls should be considered overhangs so that might be something that'll need ironing out.
Sliced model
Another potentially unwanted behaviour is that when using the all walls setting for fuzzy skin, even the interior walls are displaced. That's not the case for contour or contour and hole. This very heavily affects places where only walls are present, like in photo 4.
Contour
All walls
Here are the photos of test model.
Photo 1
Photo 2
Photo 3
Photo 4

From other miscellaneous observations: like fellow users mentioned, there should be an easier way to apply the desired map, something like drag and drop or a file picker option.
Another useful option but nothing really critical would be some way to display the map on the not yet sliced model to see how the settings affect it.
Also, the fuzzy skin filepath isn't getting saved properly in the 3mf project file on Windows. After saving and reopening the project, the backslashes get removed.
For example, a path C:\Users\user\Desktop\test_1.png becomes C:UsersuserDesktoptest_1.png

For reference here is the 3mf file and gcode of the print.
displace_map_test.zip

@undingen
Contributor Author

undingen commented Jan 10, 2025

Thanks for all the input! It's encouraging to see interest in this PR.

In my opinion, adding proper UV maps or significantly increasing mapping complexity isn't the right approach. Even if we were to support UV maps, it’s likely that anyone needing such precise application would also want features like bumps on horizontal surfaces - something that isn’t feasible with the (current) fuzzy skin logic.

I believe users requiring this level of detail would benefit more from modifying the model's geometry in Blender or a similar program, which is far better suited for the task than trying to reimplement such functionality here.

If this small feature set ends up not being practical for real prints, at least the prototype serves as a useful exercise in highlighting that changing the geometry might be the better solution.

@undingen
Contributor Author

Alright I got one test print going with brick wall since this one worked out of the box. Sandstone didn't show up on sliced model with the same settings.

Thanks for your detailed report and uncovering new bugs!

There is a bug I did not spot: depending on whether wall generation is Arachne or classic, the bump map is applied negated. You tried with Arachne, which made the cement between the bricks dent out and the bricks dent in. In classic it's the opposite. I will fix this later today. This is likely also the reason why I said in the initial post that Arachne seems to have more issues - I probably did not notice that the displacement map was applied swapped...

Concerning the overhangs:
@OskarLindgren (thanks!) and I experimented with it a bit more (final results are not in yet), and we noticed that the brick displacement map we were using had way too big steps between the cement and brick parts, which made the overhangs too big. Adding a feature/option to smooth over the texture to create intermediate steps would likely help users avoid running into too-big overhangs when they use a stock displacement map from the internet.
Also, a fuzzy skin thickness of 0.6 is quite a lot - maybe sticking around 0.3 is better for now.
The other issue is the artifacts generated on the walls; I will have to investigate more why they appear: there are horizontal lines in the sliced code on flat bricks which should not really cause an issue.

I will also add a workaround for the filepath issue in windows.

This is one of the Benchys from @OskarLindgren which shows the problem with too-big overhangs when the texture switches between black and white instantly, without intermediate steps...
image

@undingen
Contributor Author

I pushed three new commits:

  • Make the displacements go both ways, like they do for the random noise ones. Before, it only returned values between 0 and fuzzy_skin_thickness; now it's -fuzzy_skin_thickness to +fuzzy_skin_thickness. This doubles the selected fuzzy skin thickness, so configured values need to be smaller now.
  • Arachne applied the offset swapped. This seems unrelated to my patch, but it was not noticeable before because the offsets were just random...
  • Reordered the menu items and hid some unused ones.
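The remapping in the first bullet, as a sketch (hypothetical helper name):

```python
def pixel_to_offset(pixel, thickness):
    """Map an 8-bit pixel (0..255) to a displacement in
    [-thickness, +thickness] instead of the old [0, thickness].
    Mid-gray (~128) thus means roughly 'no displacement'."""
    return (pixel / 255.0 * 2.0 - 1.0) * thickness
```

This is why the same thickness value now produces twice the peak-to-peak relief as before.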

Concerning the Windows file path \ issue: this seems to be a problem with the config save/load code, which does not correctly escape the string (it double-unescapes it, thereby removing the \). As a workaround, could you please manually change the \ to /? Forward slashes should be supported by Windows and will not cause issues when storing.
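The suggested workaround amounts to a one-line normalization (hypothetical helper; forward slashes are accepted by the Windows file APIs):

```python
def normalize_config_path(path):
    """Workaround sketch: store forward slashes so a naive config
    (de)serializer cannot swallow backslashes as escape characters."""
    return path.replace("\\", "/")
```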

I played around with slightly rotating a cube, and I can definitely notice that this changes how the displacement map is applied and that it does not take the rotation of the object into account: I think all coordinates are relative to the bed, and if the object is slightly rotated, x/y will only marginally increase, while the code expects them to increase by a full step.
The cube in the distance is slightly rotated; the front one is aligned to the bed axes:
image

@undingen
Contributor Author

undingen commented Jan 10, 2025

@daniiooo
With the new code (which applies the displacement map correctly, not inverted), I think it slices with fewer overhangs (I reduced the skin thickness to 0.3, but I think this behaves like the old 0.6 you used):
image

image

@daniiooo

daniiooo commented Jan 10, 2025

The artifacts look way more pronounced at low layer heights (0.08 mm in the attached picture).
Point distance: 0.08
Thickness: 0.3

Example
The workaround for the path not saving properly works. Since it's config save/load code, though, it should probably be fixed anyway, even if not in this PR. If there is anything specific that needs to be tested with actual prints, feel free to reach out.

@SoftFever added the enhancement (New feature or request) label on Jan 13, 2025
@undingen
Contributor Author

I fixed a few more artifacts :) but unfortunately I could not yet figure out what the issue is with the Voron cube (as @daniiooo pointed out). It seems like it only happens with the Arachne wall generator, and just rotating the cube by 0.1° around Z fixes it.

@undingen
Contributor Author

undingen commented Jan 20, 2025

I have now fixed the issue with the Voron cube :) @daniiooo - please let me know if you folks find new problems.
Fixed Voron Cube


In case somebody wants to try it out on Windows and does not have access to finished build files generated by the CI, I uploaded the portable archive to my Google Drive: https://drive.google.com/file/d/1sEysp08gKGIDkKuS4DGQyApaKOcdCCjF/view?usp=drive_link

@daniiooo

Had some time to check out the updated build. Printed a Voron cube at 0.12 mm layer height. Used the sandstone map like last time.
Fuzzy skin point distance: 0.1
Fuzzy skin thickness: 0.3
The print quality with the map is very high.

Photo 1
Photo 2
Photo 3(seam)
Slicer(seam)
vid.webm

I noticed that the seams don't look that great though, as shown in photo 3. I'm wondering if there would be a possibility to somehow make them look neater. Right now they are really jagged, either because of the map applied or because of the slicer logic. That might be out of scope for this enhancement though.

@yw4z
Contributor

yw4z commented Jan 25, 2025

Very nice! I got many high-quality textures from my 3D apps. Did you try 16-bit 8K images? :) 16-bit really makes a difference in quality for 3D; maybe it would be helpful for printing big parts. Also, how do you handle UV mapping?

@undingen
Contributor Author

undingen commented Jan 25, 2025

8-bit resolution is enough for this case because FDM 3D printing resolution is not that high.
E.g. assume 0.3 mm fuzzy skin thickness => a 0.6 mm range (+0.3 to -0.3) across pixel values 0 to 255. 0.6 mm / 256 ≈ 0.0023 mm per shade...
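That arithmetic, spelled out (hypothetical helper):

```python
def shade_step_mm(thickness_mm, bits):
    """Smallest displacement step one grayscale shade can represent when
    the range [-thickness, +thickness] is spread over 2**bits shades."""
    return (2.0 * thickness_mm) / (2 ** bits)
```

At 0.3 mm thickness, 8 bits already gives steps around 2.3 µm, far below what an FDM nozzle can reproduce, which is the point being made about 16-bit maps.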

The UV mapping in this PR is super simple: it just takes the object's bounding box and treats it like a cube. It calculates which face of the cube the current line (which needs "fuzzy skin" applied) is on, based on the direction the bump is pointing. It then calculates u from the x,y coordinates of the point on the line relative to the bounding box and the determined face, and v is just based on the current layer's height... This seems to work surprisingly well for the objects I tested, but it will fail for e.g. a ramp.
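A sketch of that bounding-box "cube" projection in Python (hypothetical and simplified to 2D per layer; the real code is C++ and handles more cases):

```python
def bbox_cube_uv(point, normal, bmin, bmax, z):
    """Bounding-box 'cube' projection: choose the box face the segment's
    outward normal points toward, take u as the normalized position along
    that face, and v as the layer height.
    point, normal, bmin, bmax are (x, y) pairs; z is the layer's z."""
    x, y = point
    nx, ny = normal
    if abs(nx) >= abs(ny):             # bump points toward a +X or -X face
        u = (y - bmin[1]) / (bmax[1] - bmin[1])
    else:                              # bump points toward a +Y or -Y face
        u = (x - bmin[0]) / (bmax[0] - bmin[0])
    return u, z                        # v grows with layer height
```

A ramp fails because its slanted wall keeps hitting the same box face while its true surface length per layer changes, so the texture stretches.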

@yw4z
Contributor

yw4z commented Jan 25, 2025

You are right; maybe 16-bit is overkill for fuzzy skin, but this feature could be applied separately for generating surfaces, as an alternative to SVG; that might open new possibilities. Here are a few scenarios: a user printing decorative wall tiles, or a user adding scar-like details to surfaces...

Maybe adding an option for which UV mapping to use would give some benefits: angle on the Z axis and type (cube, cylinder, sphere). A cylinder has 1 seam, compared to a cube's 4. By the way, I use some fillet/radius on edges to get a smooth transition; that gives a better result in 3D and makes the texture render seamlessly, but it requires cylinder mapping or perfectly tiled mapping on a cube.

Btw great work, thanks for your efforts

@SoftFever
Owner

@undingen
This is super cool!
Looking forward to merging it—let me know if you need anything from me to make it ready!

@SoftFever
Owner

Do you think we can move libnoise to deps?
It looks like a decent-sized library; it would be better to move it to deps so we don't have to build it every time.

@undingen
Contributor Author

Thanks for the encouraging comments!
I think libnoise-specific changes are best handled in #7678, because my PR is just building on top of that change. But if that PR is not going to get merged, because of e.g. too big an external dependency, I can rework my PR to not use it.

My main request would be guidance on:

  • The PNG should only get loaded once. In which C++ class should it be stored so it's not repeatedly loaded while slicing, but also doesn't leak? I did not find a good place to keep it, but I'm also totally new to the codebase.
  • How to integrate it into the config. It seems like there is no other config entry which is a filepath. Right now one has to paste in the filepath to the texture manually (and make sure to replace \ with / on Windows) - a file dialog would be nice - but there have also been comments about copying the texture into the 3mf?

@SoftFever
Owner

Thanks for the encouraging comments! I think libnoise-specific changes are best handled in #7678, because my PR is just building on top of that change. But if that PR is not going to get merged, because of e.g. too big an external dependency, I can rework my PR to not use it.

Ah, good to know! If I understand correctly, this PR doesn’t use libnoise, right?
If so, could we rebase this PR onto the main branch? It would make reviewing and tracking changes much easier in the future.

My main request would be guidance on:

  • The PNG should only get loaded once. In which C++ class should it be stored so it's not repeatedly loaded while slicing, but also doesn't leak? I did not find a good place to keep it, but I'm also totally new to the codebase.
  • How to integrate it into the config. It seems like there is no other config entry which is a filepath. Right now one has to paste in the filepath to the texture manually (and make sure to replace \ with / on Windows) - a file dialog would be nice - but there have also been comments about copying the texture into the 3mf?

I’ll need to go through the changes first and then get back to you.
Would you mind rebasing the changes onto the main branch first? It’s hard to figure out what changes are introduced in this PR.

@undingen
Contributor Author

undingen commented Jan 26, 2025

@SoftFever I rebased (and squashed) my PR on top of the current main branch but kept the basic abstraction the same as in the libnoise patch, to make it easy to integrate them again. This should now be much easier to review :-).

Looking forward to hearing where the image data should get stored to remove the current hack... and what else needs to get changed to get this in :)

@undingen changed the title from "WIP: Implement 'fuzzy skin' displacement mapping" to "Implement 'fuzzy skin' displacement mapping" on Jan 26, 2025
@Poikilos

Poikilos commented Jan 28, 2025

. . .

maybe adding an option for which uv mapping will give some benefits. Angle on Z axis and type (cube, cylinder, sphere). cylinder has 1 seam compared to cube has 4. . . .

Here is some code I would suggest, but maybe wait for another PR? #8226

Add new fuzzy skin type: displacement map. This feature uses an 8-bit grayscale PNG to displace the object's surface based on pixel values.

The displacement map is applied by mapping the surface points of the 3D model to the corresponding points on the 2D displacement map.

The UV mapping in this implementation is simple: it treats the object's bounding box as a cube.
It determines which face of the cube the current line is on based on the direction of the bump (which is just perpendicular to the line).
The U coordinate is calculated based on the x, y coordinates of the point on the line relative to the bounding box and the determined face.
The V coordinate is based on the current layer's height.
This method works surprisingly well for many objects but may fail for more complex shapes like ramps.

This feature is based on work by the PrusaSlicer community member Poikilos, who initially implemented displacement and cube mapping.
@undingen
Contributor Author

undingen commented Feb 5, 2025

@SoftFever just a gentle reminder that I'm looking to get feedback where the image data should get stored to remove the current hack... and what else needs to get changed to get this in :)

@SoftFever
Owner

Thank you for your patience.

I had a quick look at the proposed changes. I’m unsure whether there is a suitable place for image loading or storage here, primarily because the proposed solution is designed to function as a special fuzzy skin during the perimeter-generation step. I think PerimeterGenerator might not be an appropriate location to apply a displacement map. This approach would likely be computationally costly and could conflict with the current architecture design. While there might be a workaround, I haven’t had time to investigate this thoroughly.

Below are my high-level thoughts on displacement maps :

Ideally, we can treat displacement maps as an object manipulation (similar to scale/cut model operations). Applying a displacement map would involve baking it directly into the model’s geometry, making it a standard model-editing operation.
Alternatively, we could also use displacement maps in the rendering pipeline for visualization, while baking the actual geometry changes in the slice-volume step.

@Poikilos

Poikilos commented Feb 8, 2025

@SoftFever Just a thought: this is computationally intensive once, whereas doing a displacement map preview is computationally intensive per frame. Even if you hide it from the preview or simulate the displacement using a shader (which would be far faster), and the geometry were computed later (non-visually) as a step before slicing, the geometry required is incredibly large in memory. The reason is that it requires a higher DPI than the image itself if oversampling is desired (though even lower DPIs, such as matching the image resolution or undersampling, would still require a large amount). The reason it is more memory intensive is that you are doing the following (pseudocode):

  • vertex: fvec3 per dot (inches times dpi)
    • memory usage scales geometrically in 2D, since we are talking about surface area, not volume, but we are storing a vec3 (an object twice as wide has 4 times as many fvec3 points)
    • the entire set of points is stored in memory

In contrast, mapping during slicing:

  • sliced vertex (vec2) per dot via pixel_intensity: float = uv_map_sample(...), at the vertical dpi implied by the actual layer height at this layer (perfectly optimized) and the horizontal dpi of the existing fuzzy skin setting
    • memory usage still scales geometrically in 2D (dpi per surface area), but now we are storing a vec2 (an object twice as wide has 4 times as many fvec2 points)
    • but in this case only the sliced vertices times the slice point count are stored in memory (and may be freed or reused after instructions are written to G-code)

When I made the original code in PrusaSlicer (and no one noticed until undingen adapted it), it also conflicted with the architecture of PrusaSlicer, so I simply passed another param or two up the callstack, which could include a z offset and an image pointer that could be generated and cached before slicing even begins, such as when the image is selected in settings or settings are loaded (if the file still exists). Doing it at the slicing level was not done without regard to optimization - doing it there was itself a conscious optimization. Choosing to pass more info up the stack than the architecture provides for (changing the signature of several methods in the callstack, in that case) is what I would suggest as a reasonable trade-off to use far less memory and computation in total. Essentially, resampling a 3D model at a visually appealing DPI (as opposed to applying modifiers during slicing) has the same problem as doing it in 3D software (it only eliminates the long-term storage and upload/download usage associated with such models, not the RAM, VRAM, CPU-speed, and GPU-speed requirements on the client side). The hardware is often even more limited on 3D print servers (such as via OctoPrint plugins or on-device slicing).
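A back-of-the-envelope sketch of that memory argument (hypothetical numbers and helper; only the quadratic scaling matters):

```python
def baked_mesh_bytes(width_mm, height_mm, dpi, bytes_per_vertex=12):
    """Rough memory for a fully baked displaced surface: one float32 vec3
    (12 bytes) per dot over the surface area. Because this is per area,
    it scales with the square of the object's linear size."""
    dots_per_mm = dpi / 25.4
    n_points = (width_mm * dots_per_mm) * (height_mm * dots_per_mm)
    return n_points * bytes_per_vertex
```

Doubling an object's width and height quadruples the vertex memory, while the slicing-time approach only ever holds the current layer's points.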

I'd be glad if someone created such a feature and showed that it took less than 30 seconds per frame on an old i7 laptop with Intel graphics, but I am concerned that it will not be significantly faster (it may be somewhat faster than Blender, since there is less metadata and editing-feature code such as mesh editing). I am concerned it will take far longer to actually generate the geometry, even if no visual is shown. If a shader rather than geometry generation is used, know that it took well over 30 seconds to render frame 1 after a mesh edit was done, due to geometry caching or upload to VRAM or whatever it was doing - and possibly longer in your case, since generation would also have to be done on frame 1 after changing the input image, object scale, object rotation, etc. (though live feedback is doable if done using a shader, or, while the mouse button is down, even just a flat overlay that darkens lower parts and is then applied less economically when the mouse is released). My initial rationale for the design choices is here: prusa3d/PrusaSlicer#8649

If you mean something else by architecture than arguments available at the top of the call stack during slicing, please explain. In any case please share this information with the team.

@Keavon

Keavon commented Feb 8, 2025

Ideally, we can treat displacement maps as an object manipulation (similar to scale/cut model operations). Applying a displacement map would involve baking it directly into the model’s geometry, making it a standard model editing operation.

I have absolutely no familiarity with this project's code and I'm just a lurker here, but for what it's worth, I just wanted to say that the graphics engineer in me shuddered reading that. I'd recommend finding an alternate path to go down, since that removes the entire point of this feature compared to just applying a displacement in other software like Blender at the cost of an utterly bonkers amount of geometric detail.

@Poikilos

Poikilos commented Feb 8, 2025

@Keavon Yes, that's the short version of what I was trying to say. Graphics technology solved detail optimization using maps that apply at the latest possible place in the pipeline, instead of using more 3D geometry. I did a search, and it was invented in 1978 by James Blinn (sounds familiar; I've used Blinn shaders in a few programs). Like ray tracing, bump mapping is only now able to occur in realtime, and it is considered top-of-the-line technology (such as in parallax occlusion mapping, which is more intensive than normal mapping); we still wouldn't be able to render that in realtime if the geometry itself were nearly as detailed (though parallax occlusion mapping simulates it, that requires an expensive GPU, and there still are no real verts, even in RAM).

@SoftFever
Owner

I have absolutely no familiarity with this project's code and I'm just a lurker here, but for what it's worth, I just wanted to say that the graphics engineer in me shuddered reading that. I'd recommend finding an alternate path to go down, since that removes the entire point of this feature compared to just applying a displacement in other software like Blender at the cost of an utterly bonkers amount of geometric detail.

Thank you for highlighting this. You’re correct that displacement maps in the rendering pipeline are standard for visualization.
I did mention this in the last part of my previous response.
The discussion here is more about how to apply the displacement map to the model’s geometry, as this is what we are looking for.
I’ll share the detailed rationale in a follow-up comment.

Labels: enhancement (New feature or request)
8 participants