Physically Based: A Database of PBR Values for Real-World Materials (80.lv)
121 points by ibobev on Aug 21, 2022 | 46 comments


There is also a similar effort[1] to collect data on refractive index curves (against wavelength) for various materials, under CC0 (not CC-BY-SA, which seems a bit ridiculous for objective physical properties). That’s much less immediately useful for rendering, though.

[1] https://refractiveindex.info/


Physically based rendering (PBR) is a computer graphics approach that seeks to render images in a way that models the flow of light in the real world.

https://en.wikipedia.org/wiki/Physically_based_rendering


What's the difference between PBR and ray tracing?


Ray tracing tells you what the light hit. PBR tells you what happens to the light when it hits something.

They can be used separately or together. For example you can use PBR with rasterization instead of ray tracing (which describes many modern games), and you can use ray tracing with simplified or non-realistic materials (e.g. toon shading) instead of PBR materials.

Also, the term "PBR" has generally come to mean a set of specific techniques and equations that are still a radical simplification of what happens in the real world. These techniques are a good approximation in many cases but some materials will not be well modeled by the typical PBR equations. There are more advanced material models that can be used. These are often published at SIGGRAPH.
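To make that separation concrete, here's a minimal Python sketch (all names are mine, purely illustrative): the same visibility step — ray tracing or rasterization — can feed either a stylized or a physically based shading model.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    albedo: tuple   # surface color at the hit point, linear RGB
    n_dot_l: float  # cosine between surface normal and light direction

def shade_toon(hit: Hit) -> tuple:
    # non-physical: quantize the lighting into two bands
    level = 1.0 if hit.n_dot_l > 0.5 else 0.3
    return tuple(c * level for c in hit.albedo)

def shade_lambert(hit: Hit) -> tuple:
    # physically based diffuse: smooth, energy-respecting cosine falloff
    level = max(hit.n_dot_l, 0.0)
    return tuple(c * level for c in hit.albedo)

# the tracer/rasterizer only produces the Hit; either model can consume it
hit = Hit(albedo=(0.8, 0.2, 0.2), n_dot_l=0.7)
print(shade_toon(hit))     # banded, stylized result
print(shade_lambert(hit))  # smooth physically based result
```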


You can ray trace with PBR. Ray tracing is a technique for calculating the amount of light reaching a surface by projecting rays from the camera (there are other related techniques as well). Once you have the amount of light on a surface, you need to figure out how that surface interacts with the light. Traditionally this was done with heuristic methods like Phong shading. PBR uses more physically plausible (and measurable!) heuristics.
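One well-known example of the difference: the classic Phong specular lobe isn't normalized, so energy isn't conserved as the shininess exponent changes. A sketch, assuming the standard (n+2)/(2π) normalization factor for the Phong lobe:

```python
import math

def phong_specular(r_dot_v: float, shininess: float) -> float:
    # classic heuristic: the highlight simply dims as 'shininess' grows,
    # because nothing compensates for the narrowing lobe
    return r_dot_v ** shininess

def normalized_phong_specular(r_dot_v: float, shininess: float) -> float:
    # physically plausible variant: (n + 2) / (2 * pi) keeps the total
    # reflected energy roughly constant as the lobe narrows
    return (shininess + 2.0) / (2.0 * math.pi) * r_dot_v ** shininess
```

With the normalization, raising the exponent makes the highlight smaller but brighter, as a real glossy surface behaves.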


PBR can be used without ray tracing. Instead of the older models built from ad-hoc diffuse lighting and specular reflections (see Phong lighting, for example), PBR is a set of parameters: how metallic is the object, what is the albedo, is it a conductor or a dielectric, etc. This gives us a common way to describe materials. What you do with this is up to you. You can keep faking it with math that approximates real results, or you can do ray tracing.
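A sketch of what such a renderer-agnostic description might look like (field names follow the common metallic/roughness workflow; the specific values here are illustrative, not from the database):

```python
from dataclasses import dataclass

@dataclass
class Material:
    """Renderer-agnostic PBR material description."""
    albedo: tuple     # base color, linear RGB in [0, 1]
    metalness: float  # 0 = dielectric, 1 = conductor
    roughness: float  # 0 = mirror-smooth, 1 = fully diffuse
    ior: float = 1.5  # index of refraction (meaningful for dielectrics)

# how you shade with these values (rasterizer, ray tracer, ...)
# is entirely up to the renderer
gold = Material(albedo=(1.0, 0.77, 0.34), metalness=1.0, roughness=0.3)
glass = Material(albedo=(1.0, 1.0, 1.0), metalness=0.0, roughness=0.05, ior=1.5)
```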


I believe it is ray tracing, but focused on how light physically interacts with objects.


You can do PBR with other rendering techniques like rasterization.


It's a good effort, but somewhat naive in its approach. Material libraries like these are readily available in most renderer packages, and from many other places.

Check out the Maxwell renderer and some of their efforts to parametrize materials. They're collecting multiple values across multiple wavelengths, each measured over multiple angles of incidence to fully understand how light interacts with different materials.


I very much welcome its existence. If you're ever writing a ray tracer as a hobby project, the thought of having to find IORs/metalness/albedo for a material is pretty intimidating.

Sure, if you're doing something in a professional renderer, it's probably not necessary.


I think this is useful in that you don't need to use any specific render package.

It is nice to have a baseline of materials in an open accessible form.


Just color? https://physicallybased.info/ I would have hoped for BRDF info as well.


TIL

BRDF stands for Bidirectional Reflectance Distribution Function

> [It] is a function of four real variables that defines how light is reflected at an opaque surface. [...] The function takes an incoming light direction, ωi, and outgoing direction, ωr (taken in a coordinate system where the surface normal n lies along the z-axis), and returns the ratio of reflected radiance exiting along ωr to the irradiance incident on the surface from direction ωi.

Source: https://en.wikipedia.org/wiki/Bidirectional_reflectance_dist...
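The simplest concrete instance of that definition is the Lambertian (ideal diffuse) BRDF, which is constant for every pair of directions. A sketch:

```python
import math

def lambertian_brdf(albedo: float) -> float:
    # constant for all (wi, wr) pairs; the 1/pi normalization ensures
    # the surface never reflects more energy than 'albedo' allows
    return albedo / math.pi

def reflected_radiance(albedo: float, irradiance: float) -> float:
    # per the definition quoted above:
    # outgoing radiance = BRDF(wi, wr) * incident irradiance from wi
    return lambertian_brdf(albedo) * irradiance
```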


Could you explain your workflow using BRDF, and what application you're using? I just want to know the real world use case for this kind of data. I'm open to expanding the database to include BRDF data as well.


I contributed a tiny bit to pbrt[1], and one of the things I loved was that if you just plugged in physical values you almost always got great results with minimal tweaking.

The Octane data seems most complete at first glance (with complex IOR etc), but for things like milk and blood I expected at the very least some absorption coefficient for the translucency or similar.

[1]: https://pbrt.org/


Creating fragment shaders from scratch


I'm just building ray tracers from scratch mostly and having raw BRDF data would make that easier.


It's kind of interesting that no materials have metalness between 0 and 1. It might as well be a boolean or a tag.

I know it doesn't physically make sense to have it in between, but then I wonder why that ability is even included in a system that supposedly should help you make physically correct materials.

Maybe metalness of 0.5 could be used to describe a fine mixture of 50% metal and 50% non-metal?

Maybe it's needed to provide a blend of sorts when transitioning between a metal and a non-metal in a texture.


Blending between materials of various roughness, e.g., rust, paint peeling, etc.


I assume it's more about when you don't have the level of microscopic detail required to model the texture of a material so you just average it out. E.g. brushed steel is built from a material that has a metalness of 1, but in practice for most rendering you'll just represent the brushed microstructure using a metalness value somewhere between 0 and 1.


The main reason is having metals and non-metals in the same texture and you want the blending to look nice. Also consider cases where the material properties change with a higher frequency than the texture resolution can capture (e.g. metal-dust on a dielectric). If you don't blend this in the texture it might produce aliasing. There are also cases where you might want to use a non-binary metalness for dusty metals or rusty metals or painted metals and stuff like that.

Metalness is essentially F0, the base reflectivity of a material (looking straight at it), which can be high for materials with very high indices of refraction, like diamonds and other gemstones. Afaik diamond has a metalness of about 0.2.
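A sketch of that convention (function names are mine): a dielectric's base reflectivity follows from its IOR, and the metallic workflow blends between a fixed dielectric F0 (~4%) and the metal's tinted F0 (its albedo). Diamond at n ≈ 2.42 does land near 0.17, consistent with the figure quoted above.

```python
def f0_from_ior(n: float) -> float:
    # Fresnel reflectance at normal incidence for a dielectric in air
    return ((n - 1.0) / (n + 1.0)) ** 2

def blend_f0(albedo: float, metalness: float, dielectric_f0: float = 0.04) -> float:
    # common metallic-workflow convention: lerp between a fixed
    # dielectric F0 (~4%) and the metal's tinted F0 (its albedo)
    return dielectric_f0 * (1.0 - metalness) + albedo * metalness

print(round(f0_from_ior(2.42), 3))  # diamond: ~0.172
print(round(f0_from_ior(1.5), 3))   # ordinary glass: ~0.04
```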


What does the 'linear' term in 'sRGB linear' mean? sRGB is not a linear encoding. Does this mean that the original value was sRGB-encoded but is now linear (can be stamped directly into a vec3 for BRDF work)?


Linear vs. non-linear is confusing terminology; linearity is a relationship. Anyway, it usually means values are encoded in the sRGB color space (with sRGB primaries) without any "gamma" curve applied (also a bad term).


If this is true, for the OP, please provide raw linear values!


Like subb said, "sRGB Linear" means sRGB primaries without the sRGB EOTF (gamma curve) applied, which you could say is "raw linear" values, the most common way to input colors in 3d applications.
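For reference, a sketch of the standard per-channel sRGB transfer functions (inputs in [0, 1]), which convert between the encoded and linear forms discussed above:

```python
def srgb_to_linear(c: float) -> float:
    # inverse sRGB EOTF: decode an sRGB-encoded channel to linear
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c: float) -> float:
    # forward encode: linear channel back to the sRGB curve
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055
```

Values in a "sRGB linear" column would be used as-is in shading math, with no decode step.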


Thanks.


That jello-looking blood isn't too convincing. Real blood is opaque.


It’s not. Blood is a colloid and is actually quite transmissive. Opaque blood looks like plastic. Getting the right values is hard.

Blood, like milk, also exhibits quite a bit of subsurface scattering... it’s a hard problem to make it look good.


Nitpicking. Milk and blood are both opaque, compared to the clear red jello on that picture.


This is really really not nitpicking. This is literally what rendering people think about for a living.


The transmissivity is too high in that image, I agree btw. And ray tracing can't really do convincing blood.


Why would ray tracing not have convincing blood?


Classic ray tracing doesn't do GI (global illumination) well, and GI is important for materials like blood that exhibit lots of subsurface scattering. Also, blood is a fluid, and with fluids you're often better off using a ray marching algorithm that can handle implicit surfaces than trying to do surface reconstruction.
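A minimal sketch of what ray marching an implicit surface means (sphere tracing a signed distance function; names are illustrative):

```python
def sphere_sdf(p, center, radius):
    # signed distance from point p to the surface of a sphere
    return sum((a - b) ** 2 for a, b in zip(p, center)) ** 0.5 - radius

def ray_march(origin, direction, sdf, max_steps=128, eps=1e-4):
    # sphere tracing: the SDF value is a guaranteed-safe step size,
    # so step by it along the ray until we are within eps of the surface
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:
            return t  # hit at distance t along the ray
        t += d
    return None  # miss

# march toward a unit sphere at the origin from 5 units away along -z
hit_t = ray_march((0.0, 0.0, 5.0), (0.0, 0.0, -1.0),
                  lambda p: sphere_sdf(p, (0.0, 0.0, 0.0), 1.0))
```

A fluid simulation's density field can be queried the same way, without ever building a triangle mesh.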


Thanks for clarifying


Please consider adding an optional 'roughness' column for the materials. You include them in the shader snippets. :)

How are roughness, specular, etc, measured?


Roughness is only included in the API as a default value so that if, say, a plugin accesses the data and creates a material in an application, it gets an approximate roughness value: glass comes out glossy and concrete comes out rough.

There's too much variability in roughness values, however, since roughness is a property of an item rather than of the material itself: a drinking glass could be very rough if worn, or very glossy if new. That's why I haven't added it as a property on the website.

Hope that makes sense!


Has anyone built something like DALL-E but for 3d rendered scenes? Seems like a PBR db like this could help a lot in an effort like that.


I saw someone recently hook up output from DALL-E (or a relative of it) into kaedim3d http://www.cgchannel.com/2022/08/ai-based-web-app-kaedim-tur... and it worked


There is a lot of circumstantial evidence that suggests that Kaedim's 3D models aren't AI generated, but manually crafted: https://mobile.twitter.com/danjohncox/status/156028399204932...


Damn, it did sound too good to be true :(


Machine learning is the new hotness in computer graphics. It drives me insane (I don't do ML).


Unless I missed something, Luxrender is - sadly - missing from the list of engines ...


Noted! I'll be adding more engines to the list in a future update.


For homogeneous real-world materials


LOL, skin labeled I to V instead of giving the tones descriptive names.

Someone is afraid of the lynch mob.


....or they're just using the correct labels from Von Luschan's chromatic scale.



