Passing PointLight Info to a custom Shader with three.js
I want to create an effect like the undulating sphere described in the Aerotwist tutorial. However, in the tutorial Paul creates a fake, hard-coded GLSL light in the fragment shader. Instead, I want to pass info from a three.js PointLight instance to my shaders, manipulate vertices/normals, then perform Phong shading.
My understanding of the various levels of GPU consideration when shading a scene in three.js (sticking with Phong, for example) runs roughly from using a built-in material like MeshPhongMaterial, to reusing the stock shaders in THREE.ShaderLib, to writing a fully custom ShaderMaterial.
Q1: Is the above understanding accurate?
Q2: Is there a way to do something between levels 2 and 3? I want the ability to customize the shaders to mess with vertex positions/normals, but I don't want to write my own Phong shader when a perfectly good one is included with three.js.
Q3: If there is no such middle ground between levels 2 and 3, and I need to just go for level 3, what's the best way to go about it? Do I pass the light's position, intensity, etc. as uniforms, do my vertex/normal modifications, then finally explicitly write the Phong shading calculations?
It's very straightforward to do what you are asking with three.js.
I'm not sure exactly where this falls among your questions.
Now, for how you would do this: say that you are following this tutorial and you have this fragment shader:
// same name and type as in the vertex shader
varying vec3 vNormal;

void main() {
    // this is hard-coded; you want to pass it in from your environment
    // (it needs to be a uniform)
    vec3 light = vec3(0.5, 0.2, 1.0);

    // ensure it's normalized (for a directional light you can
    // normalize outside of the shader instead)
    light = normalize(light);

    // calculate the dot product of the light direction
    // and the vertex normal
    float dProd = max(0.0, dot(vNormal, light));

    // feed into our frag colour (greyscale RGB, full alpha)
    gl_FragColor = vec4(dProd, dProd, dProd, 1.0);
}
Here's what you need to do:

GLSL
uniform vec3 myLightPos; // comes in from JavaScript

void main() {
    // better: normalize in JavaScript and just pass the normalized vec3
    vec3 light = normalize(myLightPos);
}
JavaScript
new THREE.ShaderMaterial({
    uniforms: {
        myLightPos: {
            type: "v3",
            value: new THREE.Vector3()
        }
    },
    vertexShader: yourVertShader,
    fragmentShader: yourFragmentShader
});
Q1: Correct. Some users on this board have posted work-arounds for hacking MeshPhongMaterial, but that is not its original intent.
Q2 and Q3: Look at ShaderLib.js and you will see the "Normal Map Shader". This is a perfect template for you. Yes, you can duplicate/rename it and modify it to your liking.
It uses a Phong-based lighting model, and even accesses the scene lights for you. You call it like so:
var shader = THREE.ShaderLib[ "normalmap" ];
var uniforms = THREE.UniformsUtils.clone( shader.uniforms );

. . .

var parameters = {
    fragmentShader: shader.fragmentShader,
    vertexShader: shader.vertexShader,
    uniforms: uniforms,
    lights: true // set this flag and you have access to scene lights
};
var material = new THREE.ShaderMaterial( parameters );
See these examples: http://threejs.org/examples/webgl_materials_normalmap.html and http://threejs.org/examples/webgl_materials_normalmap2.html
For coding patterns to follow, see ShaderLib.js and ShaderChunk.js.
three.js r.67