There is also a similar effort[1] to collect data on refractive index curves (against wavelength) for various materials, under CC0 (not CC-BY-SA, which seems a bit ridiculous for objective physical properties). That’s much less immediately useful for rendering, though.
Ray tracing tells you what the light hit. PBR tells you what happens to the light when it hits something.
They can be used separately or together. For example you can use PBR with rasterization instead of ray tracing (which describes many modern games), and you can use ray tracing with simplified or non-realistic materials (e.g. toon shading) instead of PBR materials.
Also, the term "PBR" has generally come to mean a set of specific techniques and equations that are still a radical simplification of what happens in the real world. These techniques are a good approximation in many cases but some materials will not be well modeled by the typical PBR equations. There are more advanced material models that can be used. These are often published at SIGGRAPH.
You can ray trace with PBR. Ray tracing is a technique to calculate the amount of light on a surface by projecting rays from the camera (there are other related techniques as well). Once you have the amount of light on a surface, you need to figure out the interaction of that surface with the light. Traditionally this was done with heuristic methods like Phong shading. PBR uses more physically plausible (and measurable!) heuristics.
PBR can be used without ray tracing. Instead of the older models built from ad-hoc diffuse and specular terms (see Phong lighting, for example), PBR is a set of parameters: how metallic is the object, what is the albedo, is it a conductor or a dielectric, etc. This gives us a common way to describe materials. What you do with this is up to you. You can keep faking it with math that approximates real results, or you can do ray tracing.
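As a minimal sketch of what such a common material description looks like (the field names here are illustrative, not taken from any particular renderer):

```python
from dataclasses import dataclass

@dataclass
class Material:
    """A minimal metalness-workflow material description (hypothetical)."""
    albedo: tuple      # base color, linear RGB in [0, 1]
    metallic: float    # 0.0 = dielectric, 1.0 = conductor
    roughness: float   # 0.0 = mirror-smooth, 1.0 = fully rough

# The same description can feed a rasterizer's shading model or a ray tracer.
gold = Material(albedo=(1.0, 0.77, 0.34), metallic=1.0, roughness=0.3)
glass = Material(albedo=(1.0, 1.0, 1.0), metallic=0.0, roughness=0.05)
```

The point is the decoupling: the material parameters are the shared vocabulary, and the renderer (rasterized or ray traced) decides how to consume them.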
It's a good effort, but somewhat naive in its approach. Material libraries like these are readily available in most renderer packages, and from many other places.
Check out the Maxwell renderer and some of their efforts to parametrize materials. They're collecting multiple values across multiple wavelengths, each measured over multiple angles of incidence to fully understand how light interacts with different materials.
I very much welcome its existence. If you're ever writing a ray tracer as a hobby project, the thought of having to find IORs/metalness/albedo for a material is pretty intimidating.
Sure, if you're working in a professional renderer it's probably not necessary.
BRDF stands for Bidirectional Reflectance Distribution Function
> [It] is a function of four real variables that defines how light is reflected at an opaque surface. [...] The function takes an incoming light direction, ωi, and outgoing direction, ωr (taken in a coordinate system where the surface normal n lies along the z-axis), and returns the ratio of reflected radiance exiting along ωr to the irradiance incident on the surface from direction ωi.
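For a concrete illustration of that definition, the simplest BRDF is the Lambertian (perfectly diffuse) one, which is just a constant; a sketch assuming a scalar albedo:

```python
import math

def lambertian_brdf(albedo):
    # For a perfectly diffuse surface the BRDF is constant: albedo / pi.
    # (The 1/pi keeps the surface from reflecting more energy than it receives.)
    return albedo / math.pi

def reflected_radiance(albedo, incident_radiance, cos_theta_i):
    # L_r = f_r * L_i * cos(theta_i) for a single incoming direction.
    return lambertian_brdf(albedo) * incident_radiance * max(cos_theta_i, 0.0)

# light of radiance 1.0 hitting a 60%-reflective diffuse surface head-on
print(reflected_radiance(0.6, 1.0, 1.0))  # 0.6 / pi ≈ 0.191
```

Real PBR materials use much more elaborate BRDFs (e.g. microfacet models), but they all plug into the rendering equation at this same spot.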
Could you explain your workflow using BRDF, and what application you're using? I just want to know the real world use case for this kind of data.
I'm open to expanding the database to include BRDF data as well.
I contributed a tiny bit to pbrt[1], and one of the things I loved was that if you just plugged in physical values you almost always got great results with minimal tweaking.
The Octane data seems most complete at first glance (with complex IOR etc), but for things like milk and blood I expected at the very least some absorption coefficient for the translucency or similar.
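For context, an absorption coefficient for a translucent medium like milk or blood typically enters via the Beer-Lambert law; a minimal sketch:

```python
import math

def transmittance(sigma_a, distance):
    # Beer-Lambert absorption: fraction of light surviving after travelling
    # `distance` through a medium with absorption coefficient sigma_a
    # (per unit length). Per-wavelength sigma_a gives the medium its color.
    return math.exp(-sigma_a * distance)

print(round(transmittance(0.5, 2.0), 4))  # e^-1 ≈ 0.3679
```

A full subsurface-scattering model also needs a scattering coefficient and phase function, but absorption alone already determines how the color deepens with thickness.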
It's kind of interesting that no materials have metalness between 0 and 1. It might as well be a boolean or a tag.
I know it doesn't physically make sense to have it in between, but then I wonder why that ability is even included in a system that supposedly should help you make physically correct materials.
Maybe metalness of 0.5 could be used to describe a fine mixture of 50% metal and 50% non-metal?
Maybe it's needed to provide a kind of anti-aliasing when transitioning between a metal and a non-metal in a texture.
I assume it's more about when you don't have the level of microscopic detail required to model the texture of a material so you just average it out. E.g. brushed steel is built from a material that has a metalness of 1, but in practice for most rendering you'll just represent the brushed microstructure using a metalness value somewhere between 0 and 1.
The main reason is having metals and non-metals in the same texture and you want the blending to look nice. Also consider cases where the material properties change with a higher frequency than the texture resolution can capture (e.g. metal-dust on a dielectric). If you don't blend this in the texture it might produce aliasing. There are also cases where you might want to use a non-binary metalness for dusty metals or rusty metals or painted metals and stuff like that.
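The blending described above is usually a straight lerp in the metalness workflow; a sketch per color channel (the 0.04 dielectric F0 is the common convention, not a law):

```python
def lerp(a, b, t):
    return a + (b - a) * t

def base_reflectivity(albedo, metallic, dielectric_f0=0.04):
    # Metalness workflow: dielectrics get a fixed F0 (~0.04), metals take
    # F0 from the albedo; fractional metalness blends between the two.
    return lerp(dielectric_f0, albedo, metallic)

def diffuse_color(albedo, metallic):
    # Metals have no diffuse component, so blending fades it out.
    return lerp(albedo, 0.0, metallic)

# a half-metal texel (e.g. metal dust on a dielectric), one channel
print(base_reflectivity(0.8, 0.5))  # 0.42
```

This is why intermediate metalness values are physically meaningless per-point but still useful per-texel: the texel is averaging a sub-texel mixture.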
Metalness is essentially a proxy for F0, the base reflectivity of a material (looking straight at it), which can be high for materials with very high indices of refraction, like diamond and other gemstones. Afaik diamond has an F0 of about 0.17.
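F0 for a dielectric follows directly from its index of refraction via the Fresnel equations at normal incidence; a quick check:

```python
def f0_from_ior(n):
    # Fresnel reflectance at normal incidence for a dielectric in air:
    # F0 = ((n - 1) / (n + 1))^2
    return ((n - 1.0) / (n + 1.0)) ** 2

print(round(f0_from_ior(1.5), 3))   # window glass: 0.04
print(round(f0_from_ior(2.42), 3))  # diamond: 0.172
```

This is where the ubiquitous 0.04 default for dielectrics comes from, and why gemstones stand out: their high IOR pushes F0 several times above it.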
What does the 'linear' term in 'sRGB linear' mean? sRGB is not a linear encoding. Does this mean that the original value was sRGB encoded but is now linear (can be stamped directly into a vec3 for BRDF work)?
"Linear" vs. "non-linear" is confusing terminology; linearity is a relationship. Anyway, here it usually means values are encoded in the sRGB color space (with sRGB primaries) without any "gamma" curve applied (also a bad term).
Like subb said, "sRGB Linear" means sRGB primaries without the sRGB EOTF (gamma curve) applied, which you could say is "raw linear" values, the most common way to input colors in 3d applications.
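For reference, decoding an sRGB-encoded value into linear light uses the piecewise curve from the sRGB spec (IEC 61966-2-1):

```python
def srgb_to_linear(c):
    # sRGB EOTF: a linear segment near black, a power curve elsewhere.
    # Input and output are in [0, 1].
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# mid-gray 0.5 in sRGB encoding is only ~0.214 in linear light
print(round(srgb_to_linear(0.5), 3))  # 0.214
```

Values labeled "sRGB linear" skip this step entirely: they are already linear-light and can go straight into BRDF math.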
Classic ray tracing doesn't do GI (global illumination) well, and GI is important for materials like blood that do lots of subsurface scattering. Also, blood is a fluid, and with fluids you're often better off using a ray-marching algorithm that can handle implicit surfaces than trying to do surface reconstruction.
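A minimal sketch of what ray marching an implicit surface looks like (sphere tracing against a signed distance function; the SDF here is a toy unit sphere, not a fluid):

```python
def ray_march(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    # Sphere tracing: step along the ray by the signed distance to the
    # nearest surface; stop when close enough (hit) or too far (miss).
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        dist = sdf(p)
        if dist < eps:
            return t          # hit: distance along the ray
        t += dist
        if t > max_dist:
            break
    return None               # miss

# implicit unit sphere centered at the origin
sphere = lambda p: (p[0]**2 + p[1]**2 + p[2]**2) ** 0.5 - 1.0
print(ray_march((0.0, 0.0, -3.0), (0.0, 0.0, 1.0), sphere))  # 2.0
```

Because the surface is defined implicitly by the SDF, a fluid simulation's level set can be rendered directly this way, with no mesh reconstruction step.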
Roughness is only included in the API as a default value in case, say, a plugin accesses the data and creates a material in an application. Then it would get an approximate roughness value, so glass is glossy and concrete is rough.
There's too much variability in roughness values, however, since roughness is a property of an item rather than of the material itself: a drinking glass could be very rough if worn, or very glossy if new. That's why I haven't added it as a property on the website.
[1] https://refractiveindex.info/