Post by gbordelon on Sept 30, 2019 20:32:09 GMT
I'm currently struggling to repurpose my pattern logic to be used for normal maps. A material has a pattern object for calculating colors in the lighting() function. I would like for a material to have a pattern object for calculating normal perturbations. I think that means calling pattern_at_shape() inside the normal_at() function. Obviously the pattern_at_shape() needs to handle vectors in addition to colors for the logic to be reusable.
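To make that concrete, here's roughly what I'm picturing (a Go-style sketch; NormalPattern and the vector-returning PatternAtShape are hypothetical names, not code from the book):

func (s *Shape) NormalAt(worldPoint Point) Vector {
    n := s.localNormalAt(worldPoint) // the usual shape-specific normal
    if s.Material.NormalPattern != nil {
        // Reuse the pattern machinery, but treat the result as a small
        // perturbation vector rather than a color.
        dn := s.Material.NormalPattern.PatternAtShape(s, worldPoint)
        n = n.AddVector(dn).Normalize()
    }
    return n
}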
Has anyone tried this approach and succeeded?
Post by Jamis on Sept 30, 2019 22:03:58 GMT
That's very similar to what I did, though I didn't reuse the patterns. I implemented a separate class of objects that accepted a point and a normal vector, and returned the perturbed normal vector. Each shape's material then had an optional reference to one of these objects, which gets invoked (as you suggested) in normal_at(). It works pretty well! I rendered an example with a few different perturbation functions.
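In outline, the hookup looks something like this (a Go sketch to match the code later in this thread; the names are stand-ins rather than my exact implementation):

// A perturbation object: given the point and the computed normal,
// return the perturbed normal.
type Perturber interface {
    Perturb(n Vector, pt Point) Vector
}

func (s *Shape) NormalAt(worldPoint Point) Vector {
    n := s.localNormalAt(worldPoint) // ordinary normal calculation
    if s.Material.Perturber != nil {
        n = s.Material.Perturber.Perturb(n, worldPoint)
    }
    return n
}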
Post by gbordelon on Oct 1, 2019 9:34:48 GMT
Great! Thanks. I was eventually able to replicate all the parts of that image.
Post by mrwetsnow on Nov 17, 2019 21:28:35 GMT
Would either of you mind sharing the code for a sine-wave perturbation? I think I am missing something fundamental about what I am supposed to be doing. The book talks about adding a small vector to the normal at a given point, but everything I am reading online says that the original normal doesn't matter. Yet I do need to convert the new normal (calculated from the tangent and bitangent, though I'm not sure what the right formula is) from tangent space to world space (or object space first?).
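To make the question concrete, here's the kind of thing I have in mind for the sine wave, done the way the book suggests (just a guess, with made-up amplitude/frequency values and the same Vector/Point helpers as my other code):

// Assumes import "math" plus the Vector/Point helpers used elsewhere in this thread.
func SineWavePerturb(n Vector, pt Point) Vector {
    amplitude, frequency := 0.2, 10.0 // made-up values
    // The bump height is amplitude*sin(frequency*x); its derivative along x
    // says how much to tilt the normal in that direction.
    slope := amplitude * frequency * math.Cos(frequency*pt.X())
    return n.SubVector(NewVector(slope, 0, 0)).Normalize()
}

Is that roughly the idea?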
Thank you,
Dan
Post by mrwetsnow on Nov 18, 2019 14:48:28 GMT
A few hours later, some progress. I managed to get a noise-generator bump map working with basically this simple code:
func (p *NoisePerturber) Perturb(n Vector, pt Point) Vector {
    pt = pt.Scale(p.scale)
    epsilon := 0.001
    // Sample the noise field at the point and at a small offset along each axis.
    f0 := p.n.Eval3(pt.X(), pt.Y(), pt.Z()) * p.maxNoise
    fx := p.n.Eval3(pt.X()+epsilon, pt.Y(), pt.Z()) * p.maxNoise
    fy := p.n.Eval3(pt.X(), pt.Y()+epsilon, pt.Z()) * p.maxNoise
    fz := p.n.Eval3(pt.X(), pt.Y(), pt.Z()+epsilon) * p.maxNoise
    // Finite-difference gradient of the noise; bend the normal by it.
    df := NewVector((fx-f0)/epsilon, (fy-f0)/epsilon, (fz-f0)/epsilon)
    return n.SubVector(df).Normalize()
}
This samples the noise (OpenSimplex, similar to Perlin noise) at the scaled point and at small offsets along x, y, and z, computes the gradient vector by finite differences, and adjusts the original normal directly by that gradient. All good there. Obviously, in this case, no change of coordinates is needed, since the gradient is computed in the same space as the normal.
It seems there is a second way of doing it, used with image-based height maps. Here the original normal is not used; instead a new normal at point P is calculated by:
- Mapping P(x,y,z) to (u,v) on the image map (using, I imagine, any of the various types of mapping techniques).
- Finding the normal at (u,v) based on its neighbors. This seems to work by finding the tangent and bitangent (and then the normal).
- Moving the normal from tangent space to world space. It's somewhat unclear to me how to do this right now, but it seems that you build a matrix from the vectors found above (I've sketched my guess below).
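Here's my guess at that last step (a sketch with the Vector helpers from my code; t, b, and n are the world-space tangent, bitangent, and geometric normal, and tn is the tangent-space normal read from the map):

func TangentToWorld(t, b, n, tn Vector) Vector {
    // The TBN matrix has t, b, and n as its columns, so multiplying by it
    // expresses the tangent-space vector in world coordinates:
    //   world = t*tn.x + b*tn.y + n*tn.z
    return NewVector(
        t.X()*tn.X()+b.X()*tn.Y()+n.X()*tn.Z(),
        t.Y()*tn.X()+b.Y()*tn.Y()+n.Y()*tn.Z(),
        t.Z()*tn.X()+b.Z()*tn.Y()+n.Z()*tn.Z(),
    ).Normalize()
}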
Is this basically correct?
Post by Jamis on Nov 18, 2019 15:51:32 GMT
mrwetsnow -- my code with the normal perturbations is on my other computer, so I can't get at it just now. If it would be helpful, let me know and I'll share my implementation later. As for image-based height maps, I've not actually implemented those before. I have implemented texture mapping (and wrote a bonus chapter for it, here: raytracerchallenge.com/bonus/texture-mapping.html), and it seems to me that many of the techniques there could be reused here. I think you're probably on the right track as far as thinking about how the normal would be calculated.
Post by mrwetsnow on Nov 18, 2019 15:53:34 GMT
Ok, thanks. I think I have the normal perturbation under control. I'll try and get to image-based height maps in the next few days and see how that goes. I've already implemented texture mapping, and I think you're right that there is a lot of overlap.