_id: int64 (0 to 49)    text: string (length 71 to 4.19k characters)
14
glDrawArrays draws nothing. I am trying to draw a triangle using shaders in LWJGL, but nothing is being drawn on the screen, and no error is being produced. I can't figure out what I'm doing wrong. To create a VAO, I use

    int buffer = glGenBuffers();
    int vertexArray = glGenVertexArrays();
    ByteBuffer data = ByteBuffer.allocateDirect(6 * 8).order(ByteOrder.nativeOrder());
    data.putFloat(-0.5f); data.putFloat(0.5f);
    data.putFloat(-0.5f); data.putFloat(-0.5f);
    data.putFloat(0.5f);  data.putFloat(-0.5f);
    glBindBuffer(GL_ARRAY_BUFFER, buffer);
    glBufferData(GL_ARRAY_BUFFER, data, GL_STATIC_DRAW);
    glBindVertexArray(vertexArray);
    int positionAttributeLocation = glGetAttribLocation(program, "position");
    glEnableVertexAttribArray(positionAttributeLocation);
    glVertexAttribPointer(positionAttributeLocation, 2, GL_FLOAT, false, 8, 0);

and then I draw using

    glUseProgram(program);
    glDrawArrays(GL_TRIANGLES, 0, 3);

Here's my vertex shader

    #version 110
    in vec2 position;
    void main(void) {
        gl_Position = vec4(position.xy, 1, 1.0);
    }

and fragment shader

    #version 110
    void main(void) {
        gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0);
    }
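For context on the shader half of that question: GLSL #version 110 predates the in/out qualifiers for vertex inputs (those arrived in GLSL 1.30), so a 1.10 vertex shader would normally declare the attribute as shown in the sketch below. This is only an illustrative guess at one possible cause, not a confirmed diagnosis of the asker's setup.

    #version 110
    attribute vec2 position;   // "attribute", not "in", in GLSL 1.10
    void main(void) {
        // z = 0.0 keeps the triangle comfortably inside the depth range
        gl_Position = vec4(position, 0.0, 1.0);
    }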
14
vertex/pixel shaders and "materials" What relationship is there, if any, between "materials" and vertex/pixel shaders (or the "effects" that combine the latter two)? I have the impression that before the advent of HLSL, materials were explicitly handled by Direct3D or OpenGL. I also have the impression that with HLSL, many prior effects are now handled directly (and generically) via the rendering pipeline in vertex/pixel shaders. In other words, I have the impression that materials are often little more than what DirectX would call an "Effect." If so, this means that many models that specify materials would be transformed by a content pipeline into an "Effect". Is this correct? Where can I get a little more history info that either corroborates or corrects the impression I have?
14
Show the edges between the clipping plane and clipped objects. I want to utilize shaders to not only discard fragments if they are on one side of a predefined plane, but also render a contour along the intersection. My fragment shader currently does something along the lines of

    float dot = dot(world_coordinate, normalize(clipping_normal.xyz)) + clipping_normal.w;
    if (dot > 0.0f) discard;

This works, but without the desired contour. I tried comparing the dot product against values close to 0.0, but this results in a contour with varying width depending on view etc... This is what I am trying to achieve. Notice that the white contour edge where the plane intersects the sphere is of consistent width. So below is the result I currently see, with the fragment shader

    in vec4 color;
    in vec3 world_position;
    out vec4 frag_color;

    void main() {
        float dist = (dot(clipping_plane.xyz, world_position) + clipping_plane.w)
                     / dot(clipping_plane.xyz, clipping_plane.xyz);
        if (dist > 0.0f && dist < 0.05f)
            frag_color = vec4(0.0f, 0.0f, 0.0f, 1.0f);
        else if (dist < 0.0f)
            discard;
        else
            frag_color = ComputePhong(color);
    }
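For readers wondering how a view-independent contour width is usually obtained in GLSL: screen-space derivatives let you scale the distance threshold per fragment. The snippet below is a generic sketch of that idea meant to sit inside main(); u_outline_px is an invented uniform, and the rest reuses names from the question rather than the asker's actual code.

    uniform float u_outline_px;            // desired contour width in pixels (invented name)

    float dist = dot(clipping_plane.xyz, world_position) + clipping_plane.w;
    float px = fwidth(dist);               // how much dist changes across one screen pixel
    if (dist > 0.0) {
        discard;                           // clipped side
    } else if (dist > -px * u_outline_px) {
        frag_color = vec4(1.0);            // within N pixels of the plane: draw the contour
    } else {
        frag_color = ComputePhong(color);  // normal shading
    }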
14
2D Hidden Object Silhouette Shader. Right now I am not using any depth information in my engine, but I just found out that it would be neat to be able to render the silhouette of important game objects that are "behind" something with a shader, though I am quite at a loss how to do this in a 2D scene with little to no depth information. I will probably have to use depth information for something like that, but I am not quite sure how. I guess the abstract concept would be

    zn = depth of pixel that should be rendered
    zc = depth of pixel at current position in some sort of buffer
    if (zn < zc)
        render pixel with a "silhouette" color
    else
        render pixel with its intended color

Is this a proof of concept I can follow in my next few work hours? How can I use and access stuff like the Z buffer of the current framebuffer in the shader? As an example, I most recently saw this in Titan Souls. Also remember that I am thinking 2D only, so application of the painter's algorithm might or might not be desirable. EDIT: So right now, from comments and such, I got the following proof of concept:

    activate stencil mechanism to set stencil
    render obscuring objects
    deactivate stencil
    render scene, using painter's algorithm or depth checks to obscure
    activate stencil mechanism to check whether or not something got rendered within the stencil area
    render important objects again, using a different fragment shader to render pixels within the stencil as a dark shadow

amirite?
14
How to write shaders that can be compiled for DirectX, OpenGL, and Vulkan I recently finished writing the DirectX renderer for my game engine. Now I have an OpenGL, DirectX as well as a not yet finished Vulkan renderer. Well, the majority of the renderers work perfectly now but I have a problem I need a shader programming language. The problem is that OpenGL and Vulkan use GLSL but DirectX uses HLSL (and Apple's Metal API uses MSL). So I searched for a High Level Shader Language and found only C for graphics from NVIDIA. But since this project was deprecated I looked for something else Without success. It's a bit annoying to write for 3 shader programming languages at the same time, so I'm looking for a language that can be translated into the native language immediately when the game starts (or is simply compatible with a lot of rendering APIs) After several weeks of finding nothing, I decided to write my own language for it. But before I invest too much time I want to know if there is another solution to this problem.
14
OpenGL ES Shader help (Blending) Earlier I required assistance getting to grips with how to retain the alpha channel of a transparent texture in my colourised texture shader program. Whilst playing with that first version of my program (before obtaining the solution to my first requirement), I managed to enable transparency for the whole texture (effectively blending via GLSL), and I quite liked this, and I would now like to know if and how it is possible to retain this blending effect, on top of the existing output without affecting the original alpha channel as I don't know how to input this transparency via the parameter that is already being provided with the textures alpha channel. A basic example of the blending program I am referring to (minus any other functionality) is as follows... varying vec2 texCoord uniform sampler2D texSampler void main() gl FragColor vec4(texture2D(texSampler,texCoord).xyz,0.5) Where 0.5 is the transparency (blending effect) of the whole texture. This is the current version of my program, which provides the ability to colour a texture according the colour parameter passed to the program, and retains the alpha channel of the original texture. varying vec2 texCoord uniform sampler2D texSampler uniform vec3 colour void main() gl FragColor vec4(colour,1) vec4(texture2D(texSampler,texCoord).xyz,texture2D(texSampler,texCoord).w) I need to know if it is possible to apply transparency on top this program, without affecting the original alpha channel which I have already preserved. I hope this makes enough sense, I am sure it is possible, and if so I should imagine it is rather simple, but this has me stumped. Any help much appreachiated. Cheers, Chris
14
Is there an HLSL equivalent to GLSL's "map" function? Google has not helped me in this area. Here is the GLSL shader

    vec3 globalIllumination(vec3 p, vec3 n) {
        vec3 g = vec3(0.0);
        float dist;
        for (float i = 1.0; i < samples; i++) {
            dist = stepDistance * i;
            vec3 d = vec3(dist - map(p + n * dist, returnColor));
            g += returnColor * d / i;
        }
        return g;
    }

I know how to convert this to HLSL, except for that map function. I could use some help.
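A note for anyone scanning this entry: map() is not part of GLSL's standard library, so there is nothing to translate; in raymarching-style shaders it is normally a user-written scene distance function defined elsewhere in the same shader. Purely as an illustration (the asker's map clearly takes extra parameters such as returnColor, which this sketch does not attempt to reproduce):

    // illustrative user-defined scene function, not a GLSL builtin
    float map(vec3 p) {
        return length(p) - 1.0;   // e.g. the signed distance to a unit sphere
    }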
14
pixel displacement with shader Having started learning shaders and experimenting with tools like Shadertoy, I am attempting to make stereoscopic (anaglyph or autostereogram) shaders as an exercise. For this, I need to displace each pixel to the left right depending on its depth value. I am trying to generate a double image, based on the original image and the depth map, with each image pixel displaced to the left and right by a value depending on the depth map so background pixels are displaced more less (depending on the mode) than foreground pixels. It would be easy to get the color of the pixel, say, 30 px to the left or right, if I needed a displacement of 30 for this particular pixel. The problem is, this would not actually work instead, I need to change the color of the pixel 30 px to the left or right, and displace the color of the current pixel there. One solution would be to use a for loop and check each pixel in the possible displacement range for its depth value. However this has an impact on performance that becomes untenable for large displacement values. With an average eye distance of 65 mm, this can easily require hundreds of pixels of displacement range. Another solution (as per this answer) would be to generate two viewpoints and combine them, but I want to avoid doing so here. I am searching for a solution based on a single image and its depth map instead. Is there a another, more efficient way to displace pixels in such a way using shaders? Note I am using Shadertoy for ease of use, but if a solution exists but is not usable in it, for example requiring a 3D engine to implement, it is still of interest.
14
Why aren't my 2D primitives visible using a custom effect? I'm working with Monogame and rendering a triangle using the following code. vertices new new VertexPositionColor(new Vector3(100, 200, 0), Color.White), new VertexPositionColor(new Vector3(200, 100, 0), Color.White), new VertexPositionColor(new Vector3(300, 200, 0), Color.White), graphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleList, vertices, 0, 1) Ordinarily I'd apply a BasicEffect before the draw call, but this time I'm using a custom effect. For testing, I wrote the following shader code to return solid green for each pixel within the triangle. struct VertexShaderInput float4 Position POSITION float4 Color COLOR struct VertexShaderOutput float4 Position POSITION float4 Color COLOR VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output output.Color input.Color output.Position input.Position return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR return float4(0, 1, 0, 1) technique Technique0 pass Pass0 VertexShader compile vs 4 0 level 9 1 VertexShaderFunction() PixelShader compile ps 4 0 level 9 1 PixelShaderFunction() The problem is that, although the shader compiles successfully, nothing shows up. Why is my triangle invisible using this shader? Based on my understanding of vertex and pixel shaders, the above HLSL code should work correctly. In the vertex shader, since (for the moment) I'm working in strictly 2D without a camera, no transformations need to be done on the vertices (i.e. they're already in screen space). From there, the pixel shader simply returns solid green for each pixel. I've also ruled out potential backface culling issues by swapping the order of vertices, and I've tried using different semantics on my HLSL structure variables. No luck. Shaders still feel uncomfortable to me, but I feel my logic is correct based on what I've read. If that's the case, there must be some weird quirk I'm not aware of in having these HLSL shaders work at all, apart from the logic. What am I missing? Is there a better way to debug these problems short of changing random variables and hoping for the best?
14
What are the pros and cons of HLSL vs GLSL vs Cg? What are the pros and cons of the three?
14
Shader authoring editing tools for GLSL ES Since Render Monkey has been discontinued (perhaps due to the complexity of today's shading languages), there are few successors that can match its functionality. Is there any useful tool for material editing aimed at developers and artists alike, but concentrated (or with substantial support for) on embedded GLSL shaders? Render Monkey itself wasn't that flexible (I'm not aware if it allowed the usage of several textures, each with its own set of texture coordinates plus, it didn't seem to be that intuitive either). Apart from complete engines, is there such a stand alone tool that can be used together with a stand alone engine (able to interface with a custom application)?
14
cocos2d mask rotation. I've been experimenting with Ray Wenderlich's tutorial about masking a sprite using shaders with cocos2d 2.0. It works pretty well, but now I'd like to rotate the mask independently of the masked texture. Does anyone have any idea how to achieve it?
14
How many Pipelines in a Typical Rendered Scene DirectX12 I'm learning DirectX12 right now and I'm missing a few pieces of the puzzle in my own head on the overall structure of how you would setup a game. Specifically, I'm trying to get an idea of how shaders are processed through the pipeline. So in DirectX12, there seems to be a lot of work to setup a pipeline state object (PSO) and then you can assign one each of the following vertex, pixel, domain, hull, and geometry shaders. Well, it seems like there are probably many different shaders being rendered at one given time, so you would have your water shaders, glass shaders, ground, etc.. and you would need a separate pixel shader for each of these correct? So if you had about 50 pixel shaders running on a particular frame, you would need to create and configure another 50 pipeline objects? This seemed like overkill to me, so I'm just trying to figure out how this would work in a real world game engine. Thanks!
14
Why is the distance from object to eye irrelevant in illumination models? For example, in the Phong and Blinn models the light intensity does not change depending on how far away the camera is. Why is that?
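For context, the terms these models compute depend only on the surface normal and the light (and, for the specular part, the view direction, but not the view distance). A one-line Lambert sketch, purely illustrative:

    // N = surface normal, L = direction to the light; no eye distance appears anywhere
    float diffuse = max(dot(N, L), 0.0);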
14
Bilinear filter in repeating texture, HLSL I have a repeating texture that I'm using as a scroll surface. The idea is that as I pan the surface I adjust the texture coordinates, filling in what gets wrapped on the right as it disappears from the left. This is all working quite well. When bilinear filtering is switched on (D3DTEXF LINEAR) I think the repeat filters the edge of the texture with the corresponding pixels on the other side given the wrap. This is wrong of course. I've had a look around for bilinear filtering shaders that ignore edge texels but can't seem to find anything. Does anyone have an ideas about how to do this? Is there a way of configuring a sampler to ignore edge texels when filtering? Is there a bilinear filtering shader knocking around I can use as a reference?
14
Alpha is not working in diffuse light shader I am following this tutorials series on rastertek.com and I got a bit stuck on the Diffuse Lighting Tutorial. Particulary, the part that does not work for me is alpha channel of the light color. Here are the I think they are relevant code parts Light Shader cbuffer MatrixBuffer matrix worldMatrix matrix viewMatrix matrix projectionMatrix Texture2D shaderTexture SamplerState SampleType cbuffer LightBuffer float4 ambientColor float4 diffuseColor float3 lightDirection float padding struct PixelInputType float4 position SV POSITION float2 tex TEXCOORD0 float3 normal NORMAL float4 LightPixelShader(PixelInputType input) SV TARGET float4 textureColor float3 lightDir float lightIntensity float4 color textureColor shaderTexture.Sample(SampleType, input.tex) lightDir lightDirection lightIntensity saturate(dot(input.normal, lightDir)) color saturate(diffuseColor lightIntensity) color color textureColor return color PixelInputType LightVertexShader(VertexInputType input) PixelInputType output input.position.w 1.0f output.position mul(input.position, worldMatrix) output.position mul(output.position, viewMatrix) output.position mul(output.position, projectionMatrix) output.tex input.tex output.normal mul(input.normal, (float3x3)worldMatrix) output.normal normalize(output.normal) return output The problem here is the shader completely ignores the alpha channel of the light. If I set the diffuse color to red, then the texture will be reddish no matter what alpha is supplied. The only way I could make the alpha work is changing the line color color textureColor to color color color.w textureColor in the pixel shader. This however gives overall wrong result. Might it be the DirectX setup issue, here are some parts of the device initialization Set regular 32 bit surface for the back buffer. swapChainDesc.BufferDesc.Format DXGI FORMAT R8G8B8A8 UNORM ... Set up the description of the depth buffer. depthBufferDesc.Width screenWidth depthBufferDesc.Height screenHeight depthBufferDesc.MipLevels 1 depthBufferDesc.ArraySize 1 depthBufferDesc.Format DXGI FORMAT D24 UNORM S8 UINT depthBufferDesc.SampleDesc.Count 1 depthBufferDesc.SampleDesc.Quality swapChainDesc.SampleDesc.Quality depthBufferDesc.Usage D3D11 USAGE DEFAULT depthBufferDesc.BindFlags D3D11 BIND DEPTH STENCIL depthBufferDesc.CPUAccessFlags 0 depthBufferDesc.MiscFlags 0 ... Setup the raster description which will determine how and what polygons will be drawn. rasterDesc.AntialiasedLineEnable false rasterDesc.CullMode D3D11 CULL BACK rasterDesc.DepthBias 0 rasterDesc.DepthBiasClamp 0.0f rasterDesc.DepthClipEnable true rasterDesc.FillMode D3D11 FILL SOLID rasterDesc.FrontCounterClockwise false rasterDesc.MultisampleEnable false rasterDesc.ScissorEnable false rasterDesc.SlopeScaledDepthBias 0.0f What's wrong with this shader? I'm not sure what other code might be relevant, I will gladly add any at your request.
14
How to get the texture coordinate of a neighbouring pixel for a blur shader? I'm still having some trouble to get my head around fragment shaders and doing some image processing on textures. The context is a 2D sprite a simple texture painted on a quad. All done with OpenGL ES 2.0. My very basic goal is a simple blur filter using a 3x3 Kernel with average weights every pixel used is weighed 1 9th and summed up. Besides many ways to improve the performance of the fragment shader(code below) so far I'm still having some difficulties to find the right texture coordinate for the kernel. My approach so far is to use the actual size of the quad on the screen on which the texture is painted and pass those two values to the shader. This is done outside the shader and passed as a uniform to the shader program. glUniform2f( offset, 1 spriteWidth, 1 spriteHeight) This should result in the step in both directions to calculate a texture coordinate in the 0 to 1 space. The result is kind of looking good. BUT I am still struggling if this is something that could be done within the shader. Is there a way to get the size of the texture within the fragment shader? If we would be only doing this on a bitmap, I'll just go from pixel to pixel and read the color of the surrounding pixels. I am wondering if my understanding of a fragment shader is quite right It's run per rendered pixel on the screen. I found some examples for the GLSL to do this but I wasn't able to port it to OpenGL ES, so I had to start from scratch. For the sake of readability I write a bit more code in hope it's easier to understand the fragment shader varying vec2 v texCoord uniform vec2 u offset uniform sampler2D u texture const int size 3 const int KernelSize size size void main() int i, j vec4 sum vec4(0.0) vec4 intensityOfPixel vec2 texCoordForKernel for (i 0 i lt size i ) for (j 0 j lt size j ) texCoordForKernel vec2(v texCoord.x (float(i) 1.0) u offset.x, v texCoord.y (float(j) 1.0) u offset.y) intensityOfPixel texture2D(u texture, texCoordForKernel) sum intensityOfPixel 1.0 float(KernelSize) gl FragColor sum Thanks alot in advance!
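A side note on the texture-size question buried in that entry: GLSL ES 2.0 has no way to query a texture's dimensions from inside the shader, so passing 1/width and 1/height as a uniform (as the question already does) is the usual approach; textureSize() only exists from desktop GLSL 1.30 / GLSL ES 3.00 onward. Illustrative only, with u_texture mirroring the sampler name used in the question:

    // requires #version 300 es or desktop GLSL 1.30+
    ivec2 sizeInTexels = textureSize(u_texture, 0);   // 0 = mip level
    vec2 texelStep = 1.0 / vec2(sizeInTexels);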
14
Cascading Shadow Maps Spotlight Projection In all the cascading shadow map explanations I have read, the final projection into the shadow map is done using an orthographic projection. This makes sense for a directional light because a directional light "sees" things orthographically. Spot lights on the other hand "see" things with a perspective transformation. Once we have our light space bounding volume, should we be performing the final projection into the shadow map using the spotlight's projection matrix, rather than an orthographic projection matrix? Perhaps I am missing the point here. Cascading shadow maps are necessary to provide shadow maps over large areas, thus because spotlights tend to have a very limited area of effect, perhaps cascading shadow maps are not necessary for spotlights.
14
How do I create a manual object with colors for each vertex? How do I create a shaded manual object with colours for each vertex? E.g. if ogreObj is the Ogre::ManualObject:

    ogreObj->begin("BaseWhiteNoLighting", Ogre::RenderOperation::OT_TRIANGLE_LIST)

will allow me to select each vertex's colour with ogreObj->colour(r, g, b) after each ogreObj->position(x, y, z) and ogreObj->normal(x, y, z) call. However, if I change the material to BaseWhite, colour() instructions are ignored. I read that you must disable lighting in the .material script, but I need it active... Any advice? ANSWER: This Ogre forum's thread has a simple .material script that works for this purpose:

    material Voxel/Default
    {
        technique
        {
            pass
            {
                diffuse vertexcolour
                specular vertexcolour
                ambient vertexcolour
                lighting on
            }
        }
    }
14
bug in webgl phong shader lighting rotates with object I'm working on a simple phong shader in webgl, and I think I'm getting close but something is still wrong. Dead give away if I have a billboard and have it roll (so it spins like a wheel), the part of the billboard that is lit up spins with it (. This confuses me, because it seems like a problem with the model matrix, but the transform makes all the positions amp rotations correct, and lighting math is done entirely in world coordinates , just the lighting wrong. Ditto with the view matrix, I can move around and look freely and everything is located in its proper place, just lit wrong. Here are my shaders (minus the definitions for space, and with the lighting in the model matrix moved into GPU for clarity) if you prefer reading in github https github.com nickgeorge quantum blob master index.html L41 lt script id "fragment shader" type "x shader x fragment" gt void main(void) vec3 lightWeighting if (!uUseLighting) lightWeighting vec3(1.0, 1.0, 1.0) else vec3 lightDirection normalize(vLightPosition.xyz vPosition.xyz) float directionalLightWeighting max(0.0, dot( normalize(vTransformedNormal), lightDirection)) lightWeighting uAmbientColor uPointLightingColor directionalLightWeighting vec4 fragmentColor if (uUseTexture) fragmentColor texture2D(uSampler, vec2(vTextureCoord.s, vTextureCoord.t)) else fragmentColor uColor gl FragColor vec4(fragmentColor.rgb lightWeighting, fragmentColor.a) lt script gt lt script id "vertex shader" type "x shader x vertex" gt void main(void) vPosition uModelMatrix vec4(aVertexPosition, 1.0) TODO Move back to CPU vLightPosition uModelMatrix vec4(uPointLightingLocation, 1.0) gl Position uPerspectiveMatrix uViewMatrix vPosition vTextureCoord aTextureCoord vTransformedNormal normalize(uNormalMatrix aVertexNormal) lt script gt Thanks a lot, and let me know if there's anything else useful to add.
14
Phaser Shader Chain I want to implement lighting via shadowmaps. I see process as 1) render something to RenderTexture1(size as game) 2) create RenderTexture2 (custom size) 3) add it to Image2 (custom size) 4) apply "generate lightmap" shader to that image, with RenderTexture1 as additional channel 5) create RenderTexture3(size as game) 6) add it to Image3(size as game) 7) Apply "generate light from lightmap" shader, with rendered lightmap as additional channel I have a problem on last step, seems texture passing as additional channel is not rendered with shader. however it's ok in image. code create() ... this.sourceRT this.game.make.renderTexture(this.game.width, this.game.height) this.shadowMapRT this.game.make.renderTexture(this.SHADER SIZE, this.SHADER SIZE) this.shadowMapImage this.game.add.image(0, 0, this.shadowMapRT) this.shadowMapImage.filters this.shadowTexureShader this.lightingRT this.game.make.renderTexture(this.game.width, this.game.height) this.lightingImage this.game.add.sprite(200, 0, this.lightingRT) this.lightingImage.filters this.shadowCastShader this.shadowTexureShader.uniforms.iChannel0.value this.sourceRT this.shadowCastShader.uniforms.iChannel0.value this.shadowMapRT update() this.light.x this.game.input.activePointer.x this.light.y this.game.input.activePointer.y this.sourceRT.renderRawXY(this.renderGroup, 0, 0, true) this.renderGroup is a group with shadow casters this.shadowTexureShader.uniforms.uLightPosition.value this.light.x, this.light.y this.shadowCastShader.uniforms.uLightPosition.value this.light.x, this.light.y Additional re explanation with picture I take texture0, apply shader to it (texture1 as an additional channel), to get shadowmap. Then i want to take texture1, apply shader2 to it, with texture2 (given by shader as additional channel) to get shadows from objects. Problem at the question mark instead of getting texture2 (processed by shader1) i get texture0 (still unprocessed). I want to get rendered texture2 in my shader2.
14
Axis Aligned Billboards in shader. Hi, I need to implement the following effect using vertex shaders. Basically it's a shader for a particle laser beam that rotates the particle along its own y axis until it is "best" visible (roughly). My idea was: Take the "y" axis of the particle model (it's a rectangle) and transform it to view space ("vy"). Calculate the vector orthogonal to "vy" and the eye vector ("w"), to get the direction on the screen in which the "x" of the particle should be oriented. Change the particle's vertex model coordinates using "w" instead of x.
14
Is the series of books "GPU Gems" still good for a modern OpenGL 3.0 approach? I noticed that this series of free books from Nvidia is really popular and well known in the dev world; the problem is that it is simply old. Would you recommend reading those 3 books to a developer who wants to improve his skills with the programmable pipeline and is starting to do so in 2012?
14
Pixel Shader stage did not run I can't figure out why the pixel shader won't run. I'm using the Blinn Phong per pixel shader from here. Only change I've made is that I pass an aditional color per vertex which gets multiplied by the light color in the pixel shader. So far the Graphics Analyzer shows me a valid IA result and the same (okay looking) result in the VS stage. I disabled depth stencil for testing and though everything looks rather fine I always see "Stage did not run. No output." when I inspect the captured frame. This is my Projection matrix projection Matrix.PerspectiveFovLH((float) Math.PI 4.0f, viewports 0 .Width viewports 0 .Height, 0.1f, 100f) Values are Width 1346, Height 800 My View is calculated like this rotation Quaternion.RotationYawPitchRoll(Yaw, Pitch, Roll) Vector3.Transform(ref target, ref rotation, out target) Vector3 up Vector3.UnitY Vector3.Transform(ref up, ref rotation, out up) view Matrix.LookAtLH(Position, target, up) Whereas Position X 50, Y 50, Z 300, Target X 50, Y 50, Z 0 and Yaw, Pitch, Roll are all 0. World Matrix is currently Matrix.Identity. The quad I try to render spans itself from X 0, Y 0 to X 100, Y 100 with 4 vertices and 6 indices. I first thought that SV POSITION must be a normalized value since it's obviously a SV coordinate and someone posted in another thread that this solved his problem but then the VS stage does not show anything at all and PS still won't run. It's been quite some years since I last worked with DirectX so I'm not sure anymore why this happens.
14
Retrieving DirectX9 Shader Input Parameters (Vertex Input Layout). Is it possible to retrieve the vertex input layout from an effect file in Direct3D9? In Direct3D11 I can achieve this by using

    D3D11_SIGNATURE_PARAMETER_DESC paramDesc;
    ID3D11ShaderReflection* pReflector;
    pReflector->GetInputParameterDesc(paramIndex, &paramDesc);

to get info about the input parameters. I wish there was a way in Direct3D9 to do the same.
14
Distort a quad with a World Matrix Is it possible to distort a quad multiplying its vertices by a specific world matrix? See the picture to understand what kind of distortion I need Please note, that the quad is in 3d space and I particularly need it's VERTICES to be distorted and NOT to rotate a perspective camera. The quad is laying on a plane with Z 0, all of the quad's vertices have their Z components equal to 0 and they should have their Z component equal to 0 after the transformation. I know that it can quite easily be done by moving the vertices in a vertex shader, but the reason why I need to distort the quad using a World matrix is because I am rendering hundreds of such quads with a single draw call. Some of them need to be distorted and some dont. So I need a uniform vertex shader for every case and before doing branching in the shader I would like to know if such distortion can be done with a World matrix.
14
What do shaders encompass? I'm researching shaders as I'm thinking about doing them for my final year project at Uni. I've looked at a lot of examples online and I think I get it. It's something that you apply to an object or scene in order to create a desired effect without changing the original object/scene, I think. I know that there are different types of shaders, but that's the basic goal, right? Furthermore, what do shaders encompass? Are weather systems in games shaders? Particle effects? Landslides? I don't know where the line is drawn between animations/cinematics and shaders. Also, I guess there are shaders which affect something in the game world, for example, a tower blowing up in BF4. The explosion itself would be shaders, but the way the tower affects the world around it would be physics and collisions. Am I right?
14
How to have qt5 work on desktop like on mobile devices (es2, correct glsl version)? I am looking for help information concerning this issue My work I have an opengl es2 render engine that works on an iOS app. I almost managed to make it work by calling the same openGL rendering engine from a Qt(4.8 first, then 5.0 since today) app with a QGLWidget. To support es2 function set my widget also inherits QGLFunctions (QtOpenGLFunctions with qt 5.0.0). However I still have issues with shaders I could not compile them on qt with glCompileShader because of almost every keywords (lowp, vec4, vec2 ) were returning compilations errors. So I compiled it with QOpenGLShader program, but I had to specify version 120 , the closest to es2. But still, my sprites don t show up the right size and the only potentially influencing different pieces of code with iOS are in shader compilation version. I think the issue in the shader is that gl PointSize is not taken into account. Is there any better way for me to have shaders compiled on qt like they are on iOS ? ( I know glsl es2 version is coming from version 120 but I don t know to what extent they differ ). The hello gl es2 example did not help me because glsl version also returns 1.20. I ll happily receive any hints, thank you ! My shaders, working well on iOS but not on Qt const GLchar vShaderStr ifdef BUILD DESKTOP " version 120 n" necessary for Qt to compile shaders endif BUILD DESKTOP "attribute lowp vec4 Position n" "attribute mediump vec2 TextureCoord n" "attribute lowp float Weight n" "uniform mat4 MVP n" "varying mediump vec2 TextureCoordOut n" "void main(void) n" " n" " gl Position MVP Position n" " TextureCoordOut TextureCoord n" " gl PointSize Weight n" " n" const GLchar fShaderStr ifdef BUILD DESKTOP " version 120 n" necessary for Qt to compile shaders endif BUILD DESKTOP "varying mediump vec2 TextureCoordOut n" "uniform sampler2D Sampler n" "uniform bool IsSprite n" "uniform lowp vec3 TextureColor n" "uniform lowp float Opacity n" "void main(void) n" " n" " lowp vec4 textureColorResult n" " textureColorResult texture2D(Sampler, IsSprite ? gl PointCoord TextureCoordOut) n" " gl FragColor IsSprite ? vec4(mix(textureColorResult.rgb,TextureColor, 1.0), n" " textureColorResult.a Opacity) textureColorResult n" " n" Edit replaced preprocessor QT OPENGL LIB with custom BUILD DESKTOP as the first is no more declared in qt5.
14
How to create a heat haze effect in Libgdx? I would like to create a heat haze effect for a 2D game I am making. Do you have any ideas or suggestions on how to get that effect? I am using Libgdx (OpenGL 2.0). Thank you in advance!
14
DirectX 9 Light projection I am trying to see changes of component 'z' from light space. In vertex shader component 'z' divide 'w' is not 0. But after sending float4 with texcoord1 to pixel shader its 0. All matrices are good. Here is the code float4x4 MatWorld float4x4 MatView float4x4 MatProjection float4 LightViewMatrix float4 LightProjectionMatrix float4 LightPosition texture tex0 sampler2D ShadowMap sampler state texture tex0 struct VertexOut float4 position POSITION float2 tex TEXCOORD0 float4 lightViewPosition TEXCOORD1 struct VertexIn float4 position POSITION float2 tex TEXCOORD0 VertexOut VSLight(VertexIn In) VertexOut Out In.position.w 1 Out.position mul(In.position, MatWorld) Out.lightViewPosition Out.position Out.lightViewPosition mul(Out.lightViewPosition, LightViewMatrix) Out.lightViewPosition mul(Out.lightViewPosition, LightProjectionMatrix) Out.position mul(Out.position, MatView) Out.position mul(Out.position, MatProjection) Out.tex In.tex return Out float4 PSLight(float4 Color COLOR, float2 tex TEXCOORD0, float4 lightViewPosition TEXCOORD1) COLOR float depth lightViewPosition.z lightViewPosition.w if (depth 0) return float4(0, 0, 0, 1) return float4(depth, 0, 0, 1) technique T1 pass P0 VertexShader compile vs 2 0 VSLight() PixelShader compile ps 2 0 PSLight() All objects are black. Sorry my english is slightly poor.
14
How to detect syntax errors in Unity shader scripting. Hi. I'm following a shader scripting tutorial and I often make syntax mistakes. Like with other languages, can I have Visual Studio automatically underline them in red?
14
How to read neighbor pixels in GLSL? I'm using SFML 2.1; it's much more straightforward for me, so I can jump directly to learning the shading language. I'm trying to do something similar to Conway's Game of Life. I already learned that I will need to use 2 textures, since I need to read and write at the same time. My questions: How do I read neighbor pixels? Do I need to pass a vec2 from the vertex shader to the fragment shader like this?

    // vertex shader
    out vec2 pixelpos;
    void main() {
        pixelpos = gl_Position;
    }

    // fragment shader
    in sampler2d mytexture;
    in vec2 pixelpos;
    void main() {
        if (mytexture[pixelpos.x][pixelpos.y].r > 0.5) ...
    }

How do I write to a texture?
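For orientation, the usual pattern (not the asker's code; u_state and u_texelSize are names invented for this sketch) is to pass the interpolated texture coordinate from the vertex shader, offset it by one texel per neighbour, and "write" by rendering into the second texture via a framebuffer or render texture rather than from the fragment shader itself:

    varying vec2 v_texCoord;                 // interpolated 0..1 coordinate from the vertex shader
    uniform sampler2D u_state;               // current generation
    uniform vec2 u_texelSize;                // 1.0 / texture resolution, set by the application

    void main() {
        float alive = texture2D(u_state, v_texCoord).r;
        float left  = texture2D(u_state, v_texCoord + vec2(-u_texelSize.x, 0.0)).r;
        float right = texture2D(u_state, v_texCoord + vec2( u_texelSize.x, 0.0)).r;
        // the remaining six neighbours follow the same offset pattern
        gl_FragColor = vec4(vec3(alive), 1.0);   // placeholder: the actual life/death rule goes here
    }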
14
ShaderBytecode Compiler one technique multiple passes I have an effect code with a basic structure like technique TechniqueName pass FirstPass Profile fx 4 0 VertexShader RenderFirstVS GeometryShader null PixelShader RenderFirstPS pass SecondPass Profile fx 4 0 VetrexShader RenderSecondVS GeometryShader null PixelShader RenderSecondPS pass ThirdPass Profile fx 4 0 VertexShader RenderThirdVS GeometryShader null PixelShader RenderThirdPS Now I tried to compile this with using (BinaryReader reader new BinaryReader(stream)) CompilationResult result ShaderBytecode.Compile(reader.ReadBytes((int)stream.Length), "fx 4 0") if (result.HasErrors) throw new Exception(result.Message, new Exception(result.ResultCode.ToString())) Data result.Bytecode.Data stream is new MemoryStream(Encoding.Default.GetBytes(effectContent)). The Data Property (byte ) is about 1 KiB large but if I try to load it via context.InputAssembler.InputLayout new InputLayout(device, effect.Data, someElements) it crashes with following exception D3D11 ERROR ID3D11Device CreateInputLayout Input Signature in bytecode could not be parsed. Data may be corrupt or in an unrecognizable format. STATE CREATION ERROR 161 CREATEINPUTLAYOUT UNPARSEABLEINPUTSIGNATURE Any idea how I can fix this or why the error is thrown? I do not want to use multiple shaders because I reuse many parameters I do not want to reassign in a deferred shading setup.
14
What is the math behind the light effect in Krakatoa? I'd like to know the math behind the light effect in Krakatoa (click here for an example). The light source is traveling with the particles, but how is shading done? Is it something simple, like Phong shading? Is it possible to implement such an effect in real time on the GPU?
14
How should I implement multi pass rendering in a game engine? I have done multi pass rendering, before, and understand how it works. I made a simple example, which rendered a basic scene with shadows. This was all part of one file. Now, I am trying to figure out is how to put it into my game engine. Currently, my game engine uses a single pass. It is in a hierarchical structure, and uses Direct3D 9. I have a graphics component, which will load and draw a 3D model. In my game loop, I update all of the entities in the world, then I call the draw function for each one. This draw function gets the vertex buffer, index buffer and texture or material, and draws the 3D model using a shader. This works fine. To do multi pass rendering, to allow for shadows, I will need to draw each model multiple times. It doesn't seem right, to me, that in each models draw function I should put the second pass code this will then be completed before the next models first pass. How should I implement multi pass rendering in a game engine?
14
Understanding diffuse lighting in The Division. Here is a screenshot of the main character walking under a bright lamp. His hat, which was originally dark grey, turns completely white. My question is: how can such a light source do this without making everything around it super bright? Because that's what my humble attempts resulted in. I can only think of two options: they either use a standard diffuse formula (it has got to be diffuse light, as it does not react to camera movement) and have a light source that fades extremely quickly (faster than the standard quadratic attenuation model), or they use some clever shaders for clothes to make them "catch light" that fast. Maybe there is an easier solution? Here are two more shots; the floor is not nearly as bright as the character that goes under the lamp. UPDATE: I think I got my problem: they have a different attenuation formula. I think if I improve my attenuation I would be able to achieve similar results.
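For readers comparing attenuation models: one widely cited alternative to the classic constant/linear/quadratic falloff is a windowed inverse-square term used in physically based pipelines. The sketch below is illustrative only and is not a claim about what The Division actually ships:

    // d = distance to the light, radius = the light's cutoff radius
    float window = pow(clamp(1.0 - pow(d / radius, 4.0), 0.0, 1.0), 2.0);
    float attenuation = window / (d * d + 1.0);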
14
Shader effect similar to Metro 2033 gasmask I was thinking about effects in games the other day and I was reminded of the Gasmask effect from Metro 2033. Once you put the gasmask on it blurred a bit in the corners and could ice up and even get cracked. I assume that something like that is done using a shader. I have been experimenting a bit with game development, so far mostly playing with existing rendering engines and adding physics support etc. I would like to learn more about this sort of effect. Can someone give me a simple example of a shader that would alter the entire scene like this. Or if not a shader then an idea on how it would be done. Thanks. Edit Include screenshot of the metro 2033 gasmask effect.
14
Shader registers for different graphics cards. The title is not very explicative, so I'll try to make myself clear. I have two working PCs (on which I work): a desktop that runs an NVIDIA GT440 and a laptop with a RADEON HD 4650. I have a shader for multipass rendering that uses both POSITION0 and POSITION1 registers for the vertex shader output:

    struct VS_OUT
    {
        float4 Position      : POSITION0;
        float4 WorldPosition : POSITION1;
        // More variables for normal mapping and more
    };

This code works just fine on my desktop, but when I run it on my laptop the rendering is screwed. I finally tracked down the problem to the POSITION1 output parameter; if I change it to TEXCOORD0 it works. Now... I don't like to do things I do not understand, so I'd like to understand WHY it happens. It's something GPU related for sure, but I'd be glad if you could link me some papers, articles or anything that explains how the registers work, and why not... even something in depth about the programmable pipeline. Thank you for your time and happy coding!
14
Replace each white spot of image to radial gradient Recently, I read an article about a sun shader (XNA Sun Shader) and decided to implement it using OpenGL ES 2.0, but I encountered a problem with the shader I have two textures, one of them is the fire gradient texture The other one is a texture which will have each white part filled using the gradient So, I'm going to have a result similar to the image below (do not pay attention to the texture being rendered on a sphere mesh) I really hope that somebody knows how to implement this shader. As an alternative, how do I make it in photoshop?
14
Is there a way to make the boundary between materials wavy? I have a sea trading game that I'm working on developing. Right now, my world looks like this There are 4 different "biomes", with more to be added. Internally, this is a large mesh which has 4 different types of materials added to it, to make it work. Each region has a material associated with it. The problem that I'm trying to overcome is to make it look less squary, I believe the process is known as bitmasking. The traditional way to do so I believe is to use images to show the boundary between each one, having one for each possible curve. That doesn't work with my current mesh architecture so far as I can tell. What I'm wondering is if there is a way to make the edges of the mesh to be curvy instead of straight lines. I assume it would have to be a shader of some kind, but my mind isn't quite figuring out the specifics of how to make it work. Any tips? Thanks!
14
Blinn, Normal maps Fresnel? So something came up today when I was going over my Blinn shader. As I've been taught, the half angle vector is calculated in the fragment shader, and is equal to normalize(lightDir viewDir). This seems to work fine. When adding a normal map, this still seems to works, except I use tangent space vectors and dot the half angle with the normal from the normal map. But here's what I don't understand when using normal maps, shouldn't the light amp view directions be re computed for each fragment? Seeing as the interpolation between the vectors is no longer linear, lightDir and viewDir can't possibly be correct anymore? Somehow I feel like the per pixel normal should play a part in calculating the view light directions. Am I missing something? This came up when working on a fresnel effect which also uses the half angle vector. Without any changes to the half angle vector, the fresnel effect can't possibly know anything about the normal map, which seems wrong (and actually looks wrong). Here's the fresnel calculation Fresnel (Schlick approximation) float base 1 dot(viewDir, halfAngle) float exp pow(base, 5) float fresnel exp fZero (1.0 exp) Given that neither viewDir nor halfAngle incorporate the normals from the normal map, the fresnel effect is computed based on interpolated normals and looks "blocky". Maybe this isn't a problem for the Blinn computation, but I thought it might be too.
14
How to send data from compute shader to vertex shader I have some shaders, every shader has the same constant buffer, constant buffer cbuffer cbPerFrame register(b0) float3 gEyePosW float4x4 gView float4x4 gProj float gDim float3 gVoxelOffset float gVoxelSize I think it's a waste to send these same constant buffer several times, so I want to pack them in one Constant Buffer, and share them to other vertex shaders. I have tried send constant buffer to compute shader, but they don't work in vertex shader. Do I need to pack these data in one structured buffer? Is it faster than send constant buffer several times? Is deferred shading relevant to this problem? Which solution do we usually use to solve this problem?
14
Is there a way to set a shader define in LÖVE? I know I can use shader:send() to change variables, but is there a way to set/change a #define?
14
Overlap color between objects I'm currently trying to build a game with Ogre3D that is basically a moving vehicle that leaves a green trail (2D manual mesh) in it's path, what i'm trying to achieve is exactly what this image shows My problem is that i need to change, by some method technique, the color of the intersected path where the two meshes overlap (red area). I've been searching around the Ogre forum and found this thread http www.ogre3d.org forums viewtopic.php?f 2 amp t 47674, I've replicated that solution in my code but now on the screen i only see the intersected path. I'm a total newbie in stencil buffers and in Ogre generally, so I'm still not sure if this is the best approach to solve my problem. should I try another method rather than applying a stencil buffer? vertex fragment shader code that could help? Any advice or direction that you could provide will be very appreciated. Thanks a lot UPDATE According to JasonPh's answers i've managed to start adding some code 1) Create manual texture Ogre TextureManager tmgr Ogre TextureManager getSingletonPtr() gkString mMapTextureName "pathTexture" if (!tmgr gt resourceExists(mMapTextureName)) Ogre TexturePtr ptr tmgr gt createManual(mMapTextureName, Ogre ResourceGroupManager DEFAULT RESOURCE GROUP NAME, Ogre TEX TYPE 2D, 480, Width 640, Height 1, Depth 0, Ogre PF A8R8G8B8, Ogre TU RENDERTARGET) ptr gt createInternalResources() ptr gt load() Ogre RenderTexture pathTexture ptr gt getBuffer() gt getRenderTarget() gkEngine engine gkEngine getSingletonPtr() Ogre Camera camera engine gt getActiveScene() gt getMainCamera() gt getCamera() pathTexture gt addViewport(camera) pathTexture gt getViewport(0) gt setClearEveryFrame(true) pathTexture gt getViewport(0) gt setBackgroundColour(Ogre ColourValue Black) pathTexture gt getViewport(0) gt setOverlaysEnabled(false) pathTexture gt setAutoUpdated(true) 2) Create material from scratch and use the previously created texture. This material is then assigned to my "path" entity. Ogre MaterialManager mmgr Ogre MaterialManager getSingletonPtr() mMaterialName uniqueMaterialName("pathMaterial") mMaterial mmgr gt create(mMaterialName, "General") Ogre Technique tec mMaterial gt getTechnique(0) tec gt setSchemeName("ShaderGeneratorDefaultScheme") Ogre Pass pass tec gt getPass(0) pass gt setVertexProgram("pathMaterial vs", false) pass gt setFragmentProgram("pathMaterial fs", false) pass gt setCullingMode(Ogre CULL NONE) pass gt setColourWriteEnabled(true) pass gt setLightingEnabled(true) Ogre TextureUnitState tus pass gt createTextureUnitState() tus gt setTextureFiltering(Ogre TFO NONE) tus gt setTextureAddressingMode(Ogre TextureUnitState TAM CLAMP, Ogre TextureUnitState TAM CLAMP, Ogre TextureUnitState TAM CLAMP) tus gt setTexture(tmgr gt getByName(mMapTextureName)) mMaterial gt prepare() mMaterial gt load() 3) Fragment shader code uniform sampler2D pathTexture void main() vec4 color texture2D(pathTexture, gl TexCoord 0 .xy) if(color vec4(0.0, 1.0, 0.0, 1) green gl FragColor vec4(1.0, 0.0, 0.0, 1) red else gl FragColor vec4(0.0, 1.0, 0.0, 1) green This still needs some fixes to work, so new questions have emerged 1) Do I realy need to render "pathTexture" on screen to this to work? Maybe that texture could only be used to decide pixel colors an then discard it? 2) To only use "pathTexture" as an "input" for my shader, should I add a second pass on my material file with "pathTexture" as a texture unit? Thanks!
14
Get a warning about an extension when compiling GLSL code with the glslc compiler shipped with the Vulkan SDK. Trying to compile my pixel shader, and a warning is generated:

    D:\CS\ComputerGraphics\vulkan\WindowsProject1> D:\ProgrammingTools\vulkan\Bin\glslc shader.frag -o frag.spv
    shader.frag:3: warning: '#extension' : extension not supported: GL_KHR_vulkan_glsl
    1 warning generated.

The following is my pixel shader code:

    #version 460
    #extension GL_ARB_separate_shader_objects : enable
    #extension GL_KHR_vulkan_glsl : enable
    #extension GL_EXT_debug_printf : enable

    // Input
    layout(location = 0) in vec3 fragColor;
    layout(location = 1) in vec2 fragTexCoord;
    // Output
    layout(location = 0) out vec4 outColor;
    // uniform
    layout(set = 0, binding = 2) uniform sampler2D texSampler;

    void main() {
        outColor = texture(texSampler, fragTexCoord);
    }

How do I fix this?
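For context (a general observation about glslang-based compilers, not a guaranteed fix for the asker's setup): glslc already treats the input as Vulkan-flavoured GLSL, so the GL_KHR_vulkan_glsl behaviour is enabled implicitly and the explicit directive only triggers that warning; the usual remedy is simply to drop that one line, roughly:

    #version 460
    #extension GL_ARB_separate_shader_objects : enable
    #extension GL_EXT_debug_printf : enable
    // GL_KHR_vulkan_glsl is implied when compiling with glslc; no #extension line needed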
14
Which is worse: too many VertexShader instructions, or FragmentShader instructions? I want to have a better understanding of how to create shaders with optimal performance and realize where some bottlenecks can occur. Is it usually favorable (when possible) to delegate most of the work to the VertexShader, since in certain situations it may only need to process a few vertices, vs. having the Fragment shader process the same thing potentially on thousands of pixels? For example, if I have scrolling UV values for a texture, is it best that I pass this scroll amount to the Vertex Shader so that it writes the adjusted start UV + scroll UV to a Variant register for the Fragment shader to read, or to just use a Fragment Constant with the scroll UV amount stored in it, which will be added to the start UVs read from a Variant register? Is passing values to one kind of Constant register (Vertex vs. Fragment) faster than the other?
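Purely as an illustration of the vertex-side variant described there (names such as u_scroll, a_uv and u_tex are invented for this sketch):

    // vertex shader: add the scroll once per vertex
    attribute vec4 a_position;
    attribute vec2 a_uv;
    uniform vec2 u_scroll;
    varying vec2 v_uv;
    void main() {
        v_uv = a_uv + u_scroll;        // the interpolator carries the already-offset UV
        gl_Position = a_position;
    }

    // fragment shader: the per-pixel cost is then just the texture fetch
    varying vec2 v_uv;
    uniform sampler2D u_tex;
    void main() {
        gl_FragColor = texture2D(u_tex, v_uv);
    }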
14
Need help transforming DirectX 9 skybox hlsl shader to DirectX 11 I am in the middle of implementing a skybox to my game. I have been following this tutorial http rbwhitaker.wikidot.com skyboxes 2. I am using MonoGame as a framework and in order to support both Windows and Windows 8 metro I need to compile the shader with pixel and vertex shader 4. compile vs 4 0 level 9 1 compile ps 4 0 level 9 1 However some of the hlsl syntax has been updated with DX10 and DX11. I need to update this hlsl code float4x4 World float4x4 View float4x4 Projection float3 CameraPosition Texture SkyBoxTexture samplerCUBE SkyBoxSampler sampler state texture lt SkyBoxTexture gt magfilter LINEAR minfilter LINEAR mipfilter LINEAR AddressU Mirror AddressV Mirror struct VertexShaderInput float4 Position POSITION0 struct VertexShaderOutput float4 Position POSITION0 float3 TextureCoordinate TEXCOORD0 VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output float4 worldPosition mul(input.Position, World) float4 viewPosition mul(worldPosition, View) output.Position mul(viewPosition, Projection) float4 VertexPosition mul(input.Position, World) output.TextureCoordinate VertexPosition CameraPosition return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 return texCUBE(SkyBoxSampler, normalize(input.TextureCoordinate)) technique Skybox pass Pass1 VertexShader compile vs 2 0 VertexShaderFunction() PixelShader compile ps 2 0 PixelShaderFunction() I quess I need to change Texture into TextureCube, change sampler, swap texCUBE() with TextureCube.Sample() and change PixelShader return semantic to SV Target0. I'm very new in shader languages and any help is appreciated!
14
Apply a grain effect to all the elements of a level. I'm currently experimenting a little bit with level design. Let's say that I've got a room composed of wall and floor tiles. I'd like to apply to all these elements a sort of "grain" effect, similar to concrete. I wonder which is the better way to go... the only thing I can imagine is working with textures directly, so for each element create the texture that already contains the "grain" effect. I've also tried to apply a generic screen shader, but the result is not good because the grain is fixed to the screen; I wanted it to be fixed to the walls and floor. Obviously creating all the textures is a long process and I have to keep UVs absolutely proportional for each element. I'm OK with that solution, actually, but I wonder if there is any other way to apply this effect all at once in a simpler and "safer" way, something like a decal that maybe also adds some imperfections.
14
Rotate mesh to normal. I have some instanced geometry (basic tube meshes) laid out in a grid, and I have a noise texture (normal map) that I want to use to rotate my instances with. So each pixel in my texture is a normal, and I want to rotate each instance by the corresponding normal in a shader. How can I achieve that in a shader only? Unless I am mistaken, there should only be a rotation around the X and Z axes.
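One common way to express such a rotation in a vertex shader is to build an orthonormal basis around the sampled normal and multiply the instance's local position by it. The sketch below is generic (u_normalMap, a_instanceUV and the other names are invented), assumes the normals are stored in the usual 0..1 encoding, and assumes the hardware allows texture fetches in the vertex shader:

    attribute vec3 a_position;        // vertex position in the tube's local space (Y up)
    attribute vec2 a_instanceUV;      // where this instance samples the normal map
    uniform sampler2D u_normalMap;
    uniform mat4 u_viewProjection;

    void main() {
        vec3 n = normalize(texture2D(u_normalMap, a_instanceUV).xyz * 2.0 - 1.0);
        // build a basis whose Y axis is the sampled normal
        vec3 t = normalize(cross(vec3(0.0, 0.0, 1.0), n));   // degenerate if n is parallel to +Z
        vec3 b = cross(n, t);
        mat3 rotate = mat3(t, n, b);
        gl_Position = u_viewProjection * vec4(rotate * a_position, 1.0);
    }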
14
OpenGL ES 2.0 not drawing images with shadows I'm using OpenGL ES 2 to program a simple game 2D for Android mobile phones. I'm coding the rendering portion of the software, using the GLES20 default library. All my sprites are rendered from a large image containing all the pictures of what I need. I use very basic fragment and vertex shaders which are the following Vertex shader uniform mat4 uMVPMatrix attribute vec4 vPosition attribute vec2 a texCoord varying vec2 v texCoord void main() gl Position uMVPMatrix vPosition v texCoord a texCoord Fragment shader precision mediump float varying vec2 v texCoord uniform sampler2D s texture void main() vec4 dst texture2D(s texture, v texCoord) gl FragColor dst Those are the shaders I use and everything is rendering fine. I also set the blending function to GLES20.glBlendFunc(GLES20.GL SRC ALPHA,GLES20.GL ONE MINUS SRC ALPHA) The problem is that I needed a sort of shadow around an object, so I decided to solve this problem simply making the game image with already the shadow in it. So, as you can see, here is a piece of my image with a large orange outside shadow so I get the effect that the main object (the two columns in the middle) is glowing. (Ignore the black background, I put it there to make the shadow visible, it's not there in the real game image) Anyway, when I render this image I get this result As you can clearly see, there is not any sort of shadow, even it is there in the source image. What is really strange from my point of view is that I don't generate the shadow using the shaders, I created it in a very static (and probably not very effective) way using Photoshop. How can I make the outer orange glow visible ? Thank you in advice
14
VS2013 Compiling Shaders with Shader Model 5.0 When I try to compile two HLSL files included in my project, the compilation fails with an error Error error X4502 invalid vs 2 0 input semantic 'INSTANCE' However, I notice it's trying to use shader model 2.0 when I'm trying to use 5.0 Why is the shader compiler trying to use the 2.0 model when I've told VS to use 5.0? Or have I misunderstood?
14
How do I pass an object location into a vertex shader? I am using Blender Game Engine. I want to create a large flat plane, and deform it locally near a moving object. So far (despite being a beginner at shaders) I've written a vertex shader for the plane which moves the vertices to their correct positions (constant positions, for now). I cannot find a way to swap that constant location with an object's location updated every frame, while the shader is running. I am not even sure if it's possible. I only want to access a specific object's center from the shader.
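I can't speak for every Blender version, but the general pattern is to declare a uniform vec3 in the plane's vertex shader and push the moving object's world position into it every logic tick from Python. A hedged sketch follows; the BGE calls are written from memory (getShader, setUniform3f, worldPosition) and the object and uniform names are made up, so treat it as a starting point rather than verified API usage.

import bge

def update(cont):
    # attach this to the plane with an Always sensor in true-pulse mode
    scene = bge.logic.getCurrentScene()
    mover = scene.objects["Mover"]           # the object whose centre drives the deformation (assumed name)
    plane = cont.owner

    for mat in plane.meshes[0].materials:
        shader = mat.getShader()
        if shader is not None:
            x, y, z = mover.worldPosition
            # matches "uniform vec3 deformCenter;" declared in the plane's vertex shader
            shader.setUniform3f("deformCenter", x, y, z)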
14
Difference between Material and Shader In games, materials often only influence the visual appearance of objects. The visual appearance is affected by shaders. So, regarding terminology, is there a difference between materials and shaders? Should you write one shader for one material?
14
Godot error(10) expected '(' after Identifier When trying to write up a new shader for a material, I've been trying to create a variable to use to alter speed and test animation. Now, the below outputs fine until I try to assign a value to number. shader type canvas item uniform float time factor 1.0 uniform vec2 amplitude vec2(10.0,5.0) uniform sampler2D frame1 uniform sampler2D frame2 uniform sampler2D frame3 uniform float speed 1.0 float number 1.9 void vertex() VERTEX.x sin(TIME time factor VERTEX.x VERTEX.y) amplitude.x VERTEX.y cos(TIME time factor VERTEX.y VERTEX.x) amplitude.y I had seen tutorials demonstrating that this would have worked fine in previous versions.
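For what it's worth, the usual cause of this error in Godot 3's shading language is that globals outside any function have to be either uniforms or constants, so a bare float declaration makes the parser expect a function definition instead. A minimal sketch of the two ways around it, assuming nothing else in the shader changes:

shader_type canvas_item;

uniform float speed = 1.0;
const float number = 1.9;          // a compile-time constant is allowed at global scope
// uniform float number = 1.9;     // ...or expose it to the inspector / GDScript instead

void vertex() {
    // 'number' can now be used like any other value
    VERTEX.x += sin(TIME * number + VERTEX.y) * 10.0;
}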
14
DirectX 9 Light projection I am trying to see changes of component 'z' from light space. In vertex shader component 'z' divide 'w' is not 0. But after sending float4 with texcoord1 to pixel shader its 0. All matrices are good. Here is the code float4x4 MatWorld float4x4 MatView float4x4 MatProjection float4 LightViewMatrix float4 LightProjectionMatrix float4 LightPosition texture tex0 sampler2D ShadowMap sampler state texture tex0 struct VertexOut float4 position POSITION float2 tex TEXCOORD0 float4 lightViewPosition TEXCOORD1 struct VertexIn float4 position POSITION float2 tex TEXCOORD0 VertexOut VSLight(VertexIn In) VertexOut Out In.position.w 1 Out.position mul(In.position, MatWorld) Out.lightViewPosition Out.position Out.lightViewPosition mul(Out.lightViewPosition, LightViewMatrix) Out.lightViewPosition mul(Out.lightViewPosition, LightProjectionMatrix) Out.position mul(Out.position, MatView) Out.position mul(Out.position, MatProjection) Out.tex In.tex return Out float4 PSLight(float4 Color COLOR, float2 tex TEXCOORD0, float4 lightViewPosition TEXCOORD1) COLOR float depth lightViewPosition.z lightViewPosition.w if (depth 0) return float4(0, 0, 0, 1) return float4(depth, 0, 0, 1) technique T1 pass P0 VertexShader compile vs 2 0 VSLight() PixelShader compile ps 2 0 PSLight() All objects are black. Sorry my english is slightly poor.
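Not a full answer, but one thing that stands out in the listing: LightViewMatrix and LightProjectionMatrix are declared as float4, so mul(position, LightViewMatrix) is a four-component dot product rather than a matrix transform, which would flatten the light-space z and w values. A hedged sketch of the corrected declarations and vertex shader, assuming the application really does feed 4x4 matrices:

float4x4 LightViewMatrix;        // was float4
float4x4 LightProjectionMatrix;  // was float4

VertexOut VSLight(VertexIn In)
{
    VertexOut Out;
    In.position.w = 1.0f;
    float4 worldPos = mul(In.position, MatWorld);
    // light-space position used by the pixel shader for the depth comparison
    Out.lightViewPosition = mul(mul(worldPos, LightViewMatrix), LightProjectionMatrix);
    Out.position = mul(mul(worldPos, MatView), MatProjection);
    Out.tex = In.tex;
    return Out;
}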
14
Vertex normals in the geometry shader using directx I'm in directx 11 with the geometry shader. Is it possible to calculate vertex normals? Just one line segment per vertex, in the geometry shader? I did vertex normals per face, and this is what I got. My exercise in the book wants something like
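For the drawing part: a geometry shader invoked per point only ever sees that one vertex, so it cannot average neighbouring face normals by itself; the usual setup is to keep (or precompute) per-vertex normals in the vertex data and let the geometry shader merely emit one line per vertex along that normal, in a separate debug pass drawn with a point-list topology. A hedged HLSL sketch, where the struct names and constant buffer layout are assumptions:

struct VSOut { float3 posW : POSITION; float3 normalW : NORMAL; };
struct GSOut { float4 posH : SV_Position; };

cbuffer cbPerFrame : register(b0)
{
    float4x4 gViewProj;
    float    gLineLength;   // how long each normal line should be
};

[maxvertexcount(2)]
void NormalLineGS(point VSOut gin[1], inout LineStream<GSOut> stream)
{
    GSOut v0, v1;
    v0.posH = mul(float4(gin[0].posW, 1.0f), gViewProj);
    v1.posH = mul(float4(gin[0].posW + normalize(gin[0].normalW) * gLineLength, 1.0f), gViewProj);
    stream.Append(v0);
    stream.Append(v1);
}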
14
Forward rendering and separation of shaders logic I'm currently playing with writing a rendering engine and implementing a forward rendering pipeline, and I have a few doubts about how things should be implemented regarding the render passes as well as the rendering of multiple lights. So I'm wondering which is the better approach here An uber shader that is generated dynamically to adjust for the number of lights that currently affect the active scene. This has the nice benefit of needing only one render pass for the lighting shader, so much less binding occurs, which is a plus. Separate lighting shaders for each type of light, with a render pass for each mesh with each type of light, combining the results at the end (possibly with 2 render targets that we ping pong between) using additive blending. This has the nice benefit of separation and more modular and maintainable shader code, but on the other hand there are more render passes and more binding going on, not to mention the ping ponging of additive blending between the render targets. Other ideas that I haven't heard or thought about, which I would love to hear. Of course there's always room for optimization of both techniques, such as occlusion tests or testing which point spot lights affect which mesh (relevant to the second technique). So I'm wondering what the take of modern rendering game engines is; maybe I didn't list their approach, and I would love to hear about it. Also, if you side with one technique or the other, I would like to have your thoughts on the pros and cons. Thanks in advance!
14
Refractive glass shader I have an infinite hexagonal floor, generated by tessellating a point grid in a tessellation shader pair Note that this is a flat wireframe the "shadows" are a lighting trick Now, I'd like to make this appear to be thick, refractive glass, but am unsure how to proceed. The first thing that came to mind is to set a uniform containing the requested "thickness" of the blocks When calculating lighting, use Snell's law to calculate the optical path length a ray would take through the hex block, if it were actually as thick as the "thickness" uniform says, and sum the alpha over that length. That would give the transparency, but doesn't handle things like internal reflection TIR, etc. I haven't tried that yet, so I'm not sure what the visual result would be. Ultimately, for this particular level, I'm trying to get that glass, hexagonal floor look used in Tron Legacy during the disc battle. (See this image for an example.) Suggestions?
14
Replacing 4 additive sprite layers with a single shader. Just can't get it right I'm using directx9 and have 4 textures I want to draw on top of each other. If I do this PDevice SetRenderState(D3DRS SRCBLEND,D3DBLEND SRCALPHA) PDevice SetRenderState(D3DRS DESTBLEND,D3DBLEND ONE) SetTexture(0,texture1) DrawSprite() SetTexture(0,texture2) DrawSprite() SetTexture(0,texture3) DrawSprite() SetTexture(0,texture4) DrawSprite() I get EXACTLY the cool effect I want, but I suspect this level of blatant overdraw is inefficient. I'm sure this can be compressed into a single draw call somehow, and run faster. The trouble is, everything I try does not look the same, whether I try a fixed function multitexturing approach setting the 4 textures as different texture stages, or whether I try to implement it inside an fx file as a shader. Nothing has the same effect. Is it even possible? I'm guessing within a shader it could be done by just multiplying the sampled texture at each stage with the current screen value, but tbh I'm not sure if that is accessible to the shader (or how). My current code works, but it bugs me that I'm drawing 4 huge textures separately :D
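Since all four layers are drawn as the same sprite, the four passes add dest + rgb1*a1 + ... + rgb4*a4 into the backbuffer, and that sum can be produced in one pass: sample each layer, pre-multiply it by its own alpha, add them in the pixel shader, and switch the blend to ONE / ONE so the blend stage adds the result just once. A hedged ps_2_0-style sketch, where the register assignments are assumptions:

sampler2D tex1 : register(s0);
sampler2D tex2 : register(s1);
sampler2D tex3 : register(s2);
sampler2D tex4 : register(s3);

// use with SetRenderState(D3DRS_SRCBLEND, D3DBLEND_ONE) and
//          SetRenderState(D3DRS_DESTBLEND, D3DBLEND_ONE)
float4 CombinedPS(float2 uv : TEXCOORD0) : COLOR0
{
    float4 a = tex2D(tex1, uv);
    float4 b = tex2D(tex2, uv);
    float4 c = tex2D(tex3, uv);
    float4 d = tex2D(tex4, uv);
    float3 sum = a.rgb * a.a + b.rgb * b.a + c.rgb * c.a + d.rgb * d.a;
    return float4(sum, 1.0f);    // alpha is irrelevant because both blend factors are ONE
}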
14
I have a frag shader, one with an empty s lightMap, how and why is it affecting the output? I have an image of concrete rocks of different shades of colors, and I'm applying this shader, but without referencing s lightMap's uniform in my program precision mediump float varying vec2 v texCoord uniform sampler2D s baseMap uniform sampler2D s lightMap void main() vec4 baseColor vec4 lightColor baseColor texture2D( s baseMap, v texCoord ) lightColor texture2D( s lightMap, v texCoord ) gl FragColor baseColor (lightColor 0.25) It displays a picture with a variable light color added to the final pixel color, but s lightMap isn't even linked into the program. What is happening in this case? At first I thought it would just use baseColor as the FragColor, but the addition of the 0.25 gives a non negative result for (lightColor 0.25). I'm confused one minute I think lightColor would be set to a texture of 1's, the next an array of 0's. Or is it just random data? It doesn't appear at all random in the picture it looks like it's obeying a rule of shading. I'd like to mimic this effect in code that's not broken. Here's the unshaded image Example "gimped" image of how it has variable shades (though this is an invert)
14
D3D12 ConstantBuffer Shader receives wrong values I'm having trouble with one constant buffer struct CameraConstData urd Matrix projection 64 ( 16 floats) urd Matrix view 64 ( 16 floats) urd Vec3 viewPosition 12 ( 3 floats) urd Vec3 viewDir 12 ( 3 floats) 104 bytes (26 x 4) float offset 26 Inside the shader it's defined like desc heap cbv cbuffer CameraConstBuffer register(b0) float4x4 projectionMatrix float4x4 viewMatrix float3 viewPos float3 viewDir Now viewDir is always filled with the wrong buffer indices. viewDir.x is the passed y value, viewDir.y is the passed z value, viewDir.z is 0.0. I checked it in the shader like float3 vpos normalize(float4(viewDir, 0.0)).xyz float value vpos.x uses the passed y value float value vpos.y uses the passed z value color float4(vpos, 1.0) GPU debugging shows that the constant buffer looks fine I checked the vectors against the CPU side and they match. So why is the shader reading indices 36 38 instead of 35 37? It's always affecting the 4th member of the struct. If I switch viewPosition with viewDir, viewPosition is the one that comes out wrong.
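This looks like the classic HLSL constant-buffer packing rule: members are packed into 16-byte registers and a float3 may not straddle a register boundary, so in the shader viewDir starts at byte 144 (float index 36) while the tightly packed CPU struct puts it at byte 140 (index 35). A sketch of a CPU layout that matches, with explicit padding; the field types are kept from the post and the padding names are my own:

struct CameraConstData
{
    urd::Matrix projection;    // bytes   0..63
    urd::Matrix view;          // bytes  64..127
    urd::Vec3   viewPosition;  // bytes 128..139
    float       pad0;          // bytes 140..143, keeps viewDir on a 16-byte boundary
    urd::Vec3   viewDir;       // bytes 144..155, where the shader actually reads it
    float       pad1;          // bytes 156..159, cbuffer sizes are rounded up to 16 bytes anyway
};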
14
Is it possible with DirectX11 to have pixel shader output an integer rather than float4? I am trying to implement object picking via a shaders. My intent is to create a texture2d that I would write out ID values describing individual objects. Following drawing the objects, I would query back the pixel value at the focus point and read out the ID of the object. I've created my texture2d as follows Create a very tiny selection texture. This will record the color ID as unique identifier of a selectable object. D3D11 TEXTURE2D DESC desc desc.Width 1 desc.Height 1 desc.MipLevels 1 desc.ArraySize 1 desc.Format DXGI FORMAT R32 UINT 16 bit integer, hold up to 65536 possible unique IDs. desc.SampleDesc.Count 1 desc.SampleDesc.Quality 0 desc.Usage D3D11 USAGE STAGING We will write via GPU and read to CPU! desc.BindFlags D3D11 BIND RENDER TARGET desc.CPUAccessFlags D3D11 CPU ACCESS READ desc.MiscFlags 0 HRESULT result m deviceResources gt GetD3DDevice() gt CreateTexture2D( amp desc, 0, amp m colorRenderTarget) if (FAILED(result)) OutputDebugString(L"PickRenderer Unable to create Texture2D") Is it possible to then to make a pixel shader that would write out an R32 integer value, rather than the float values for RGBA? If so, how would I best do this? My pixel shader currently writes float RGB values out as follows A constant buffer that stores the model transform. cbuffer ModelConstantBuffer register(b0) float4x4 modelToWorld float3 color Instead use unsigned int32 here if possible? Per vertex data used as input to the vertex shader. struct PixelShaderInput min16float4 pos POSITION Can this use a different return value for uint? min16float4 main(PixelShaderInput IN) SV TARGET return min16float4(color, 1.f) Thanks, I've been trying to find an answer to this, but it seems very unusual to write out non float values if it is possible.
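Yes: SV_Target is not restricted to float4; with an R32_UINT render target view bound, the pixel shader can return a uint directly, as in the sketch below (the constant buffer holding the per-object ID is my own assumption). Separately, note that a D3D11_USAGE_STAGING texture cannot also carry D3D11_BIND_RENDER_TARGET; the usual pattern is to render into a DEFAULT-usage R32_UINT target and copy the pixel of interest into a small staging texture for CPU readback.

cbuffer PickingConstants : register(b1)
{
    uint gObjectID;    // set per object before each draw (assumed)
};

struct PixelShaderInput
{
    float4 pos : SV_Position;
};

uint main(PixelShaderInput IN) : SV_Target
{
    return gObjectID;
}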
14
Writing a shader once without using CG? Is there a tool that can convert a fragment shader from hlsl to glsl or glsl to hlsl? I do not want to use cg since it is not able to work on mobile platforms. Is there a tool that can make it so I only write a shader once? Thanks
14
How do you make a water shader? I'm working on a minecraft like world and recently saw this video with a water shader. I searched but couldn't find any good info on how to do something similar. Maybe I'm just googling the wrong thing. Does anyone have any resources on how to do something like this? minecraft water shader video
14
Vertex Shader Fundamental Workings I understand that water ripples (e.g. a stone thrown into a pond) are often handled with vertex shaders. My first question is are the ripples nothing more than an algorithm that is a function of time? If yes, it means that the size and diameter of the ripples is not "additive." It means water vertices do not statefully "remember" their previous "disturbance" positions and accumulate more translation info. Rather it means that, as a function of time, the position of "disturbed" water vertices is freshly computed each frame per unit time. If no, it means that the vertices do accumulate disturbance translation information the vertices are stateful. I hope the answer is "yes," because that actually makes sense to me. If the answer is no, then I feel it creates a tremendous burden on the CPU GPU to keep track of all the state per vertex. If the answer is "neither," do tell. :) My second question is, assuming a "yes" above, how does such a "water disturbance shader algorithm" account for continuous interaction with irregular shapes? For example, please look at the video 40 second mark showing a car crashing through water. It is not so clear how the vertex shader knows how to make a rectangular disturbance shape (the shape of the car). Perhaps, over simplifying, the vertex shader takes both time and a vector to generate the ripples, where the vector is the speed direction of the car (and the shader code always makes a car shaped rectangle no matter what). Is this the right high level understanding of how this water trick works?
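To illustrate the "pure function of time" reading with a concrete, made-up example: nothing is stored per vertex, the flat grid is displaced from scratch every frame from a couple of uniforms. For irregular shapes such as the car, a common extension is to render the disturbances into a small offscreen "ripple" texture each frame and have the vertex shader sample that instead of a single centre point.

uniform float u_time;      // seconds since the splash
uniform vec2  u_center;    // where the stone hit, in the water plane
uniform mat4  u_mvp;
attribute vec3 a_position; // undisturbed flat water grid

void main()
{
    float d = distance(a_position.xz, u_center);
    // expanding ring: the phase depends on distance and time, the amplitude fades with distance
    float height = 0.15 * sin(10.0 * d - 4.0 * u_time) * exp(-2.0 * d);
    gl_Position = u_mvp * vec4(a_position.x, a_position.y + height, a_position.z, 1.0);
}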
14
Screen Space reflections not tracing correctly GLSL I've been trying to implement screen space reflections for the past couple of days, however it's been difficult finding specific implementation instructions or guides. Most of the hits on the subject that I can find relate to UE4 or Unity, or a sample implementation in HLSL derived from GLSL work both using different coordinate systems for the y and z axes. Following what little I could find, I've been using the following shader version 430 layout (std430, binding 2) buffer Camera Frag mat4 pMatrix mat4 InvPMatrix vec2 ScreenSize uniform sampler2D ViewNormalMap uniform sampler2D DepthMap uniform sampler2D LightMap in vec2 TexCoord layout (location 0) out vec4 FragColor vec2 RayCast(vec3 dir, inout vec3 hitCoord, out float dDepth) dir 0.25f for(int i 0 i lt 20 i) hitCoord dir vec4 projectedCoord pMatrix vec4(hitCoord, 1.0) projectedCoord.xy projectedCoord.w projectedCoord.xy projectedCoord.xy 0.5 0.5 float depth texture(DepthMap, projectedCoord.xy).r dDepth hitCoord.z depth if(dDepth lt 0.0) return projectedCoord.xy return vec2(0.0f) void main(void) vec3 View Normal texture(ViewNormalMap, TexCoord).xyz float View Depth texture(DepthMap, TexCoord).r vec3 ScreenPos 2.0f vec3(TexCoord, View Depth) 1.0f vec4 View Pos InvPMatrix vec4(ScreenPos, 1.0f) View Pos View Pos.w Reflection vector vec3 reflected normalize(reflect(normalize(View Pos.xyz), normalize(View Normal))) Ray cast vec3 hitPos View Pos.xyz float dDepth float minRayStep 0.1f vec2 coords RayCast(reflected max(minRayStep, View Pos.z), hitPos, dDepth) FragColor textureLod(LightMap, coords, 0) I get the scene projected further down, but it isn't upside down. Notice the vases and the plants on top of them. Its like the scene gets resampled and pushed downwards. Can anyone help me understand why I'm not getting the expected results? I've never done any raytracing before, so perhaps I'm misunderstanding how this is supposed to work.
14
OpenGL ES Shader help (Blending) Earlier I required assistance getting to grips with how to retain the alpha channel of a transparent texture in my colourised texture shader program. Whilst playing with that first version of my program (before obtaining the solution to my first requirement), I managed to enable transparency for the whole texture (effectively blending via GLSL), and I quite liked this, and I would now like to know if and how it is possible to retain this blending effect on top of the existing output, without affecting the original alpha channel, as I don't know how to input this transparency via the parameter that is already being provided with the texture's alpha channel. A basic example of the blending program I am referring to (minus any other functionality) is as follows... varying vec2 texCoord uniform sampler2D texSampler void main() gl FragColor vec4(texture2D(texSampler,texCoord).xyz,0.5) Where 0.5 is the transparency (blending effect) of the whole texture. This is the current version of my program, which provides the ability to colour a texture according to the colour parameter passed to the program, and retains the alpha channel of the original texture. varying vec2 texCoord uniform sampler2D texSampler uniform vec3 colour void main() gl FragColor vec4(colour,1) vec4(texture2D(texSampler,texCoord).xyz,texture2D(texSampler,texCoord).w) I need to know if it is possible to apply transparency on top of this program, without affecting the original alpha channel which I have already preserved. I hope this makes enough sense I am sure it is possible, and if so I should imagine it is rather simple, but this has me stumped. Any help much appreciated. Cheers, Chris
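A minimal sketch of what I think is being asked for: keep the texture's own alpha exactly as before and scale it by one extra uniform, so 1.0 leaves the sprite untouched and smaller values fade the whole thing (the uniform name is an assumption):

varying vec2 texCoord;
uniform sampler2D texSampler;
uniform vec3 colour;
uniform float opacity;   // 0.0 .. 1.0, set from the application

void main()
{
    vec4 tex = texture2D(texSampler, texCoord);
    // the texture's alpha is preserved, then scaled by the global opacity
    gl_FragColor = vec4(colour, 1.0) * vec4(tex.rgb, tex.a * opacity);
}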
14
Cocos2d x Differences between applying shader to child node and entire scene? I'm beginning with shaders. I wonder what happens if I apply a shader to a single node. Is gl FragCoord (0.5,0.5) the bottom left of the screen or the bottom left of the node? I get some wrong calculations when I want to draw something in the center of the node (not the screen). Many shader tutorials come with sample code that draws at the center using gl FragCoord.xy resolution.xy if it's 0.5, it's the center. In this case that might be wrong, because I want to draw in the center of the node. Finally, what about texture2D I use texture2D instead of texture are they the same? I've applied this code on the node and it prints out the whole screen in a stretched version (resolution device resolution) vec2 xy gl FragCoord.xy resolution.xy vec4 texColor texture2D(CC Texture0,xy) gl FragColor texColor
14
How can I replicate the look of Zelda BotW in my own shaders? At the moment I really want to develop a game with a shading technique similar to the one used in Zelda BOTW, however I do not know how I can achieve that look. To me it looks like they are using a cel shading technique with rim lighting, but what I do not know is how to design the textures or use them to achieve similar results, or even whether I've got the right basic idea. I would really like to know how this kind of aesthetic could be replicated, and whether it would be possible using a tool like Substance Designer (with a PBR workflow) or not.
14
GLSL Issue replacing ternary operator with mix I was expecting these two code snippets to do the same thing return vec3( 1.0 b.r gt a.r ? 0.0 1.0 ((1.0 b.r) a.r), 1.0 b.g gt a.g ? 0.0 1.0 ((1.0 b.g) a.g), 1.0 b.b gt a.b ? 0.0 1.0 ((1.0 b.b) a.b) ) and return mix( ONE3 ((ONE3 b) a), ZERO3, vec3(greaterThanEqual(ONE3 b, a)) ) ONE3 vec3(1.0,1.0,1.0) ZERO3 vec3(0.0,0.0,0.0) For some reason, they have different outputs. Do you know why? (a can have zeros sometimes)
14
Particles not rendering over projectors I am using projectors for shadows...When I use particles for bike speed up i.e., nitro speed the particles get cutout by those shadows.... Here is screenshot of it, Here is my shader code of projectors , Shader "Projector Projector Multiply Black" Properties ShadowTex("Cookie", 2D) "gray" TexGen ObjectLinear ShadowStrength("Strength",float) 1 Subshader Tags "RenderType" "Transparent" "Queue" "Transparent 100" Pass ZWrite Off Fog Mode Off Blend DstColor Zero CGPROGRAM pragma vertex vert pragma fragment frag pragma fragmentoption ARB fog exp2 pragma fragmentoption ARB precision hint fastest include "UnityCG.cginc" struct v2f float4 pos SV POSITION float2 uv Main TEXCOORD0 sampler2D ShadowTex float4x4 unity Projector float ShadowStrength v2f vert(appdata tan v) v2f o o.pos mul(UNITY MATRIX MVP, v.vertex) o.uv Main mul(unity Projector, v.vertex).xy return o half4 frag(v2f i) COLOR half4 tex tex2D( ShadowTex, i.uv Main) half strength (1 tex.a ShadowStrength) tex (strength,strength,strength,strength) return tex ENDCG Here is my particle code, Simple additive particle shader. Shader "Custom Particle additive" Properties MainTexture ("Particle Texture (Alpha8)", 2D) "white" Category Tags "Queue" "Transparent" "IgnoreProjector" "True" "RenderType" "Transparent" Blend SrcAlpha One Cull Off Lighting Off ZWrite Off Fog Color (0,0,0,0) BindChannels Bind "Color", color Bind "Vertex", vertex Bind "TexCoord", texcoord SubShader Pass SetTexture MainTexture combine primary, texture primary
14
How should I implement multi pass rendering in a game engine? I have done multi pass rendering before and understand how it works. I made a simple example which rendered a basic scene with shadows. This was all part of one file. Now I am trying to figure out how to put it into my game engine. Currently, my game engine uses a single pass. It is in a hierarchical structure and uses Direct3D 9. I have a graphics component which will load and draw a 3D model. In my game loop I update all of the entities in the world, then I call the draw function for each one. This draw function gets the vertex buffer, index buffer and texture or material, and draws the 3D model using a shader. This works fine. To do multi pass rendering, to allow for shadows, I will need to draw each model multiple times. It doesn't seem right to me to put the second pass code in each model's draw function, since it would then be completed before the next model's first pass. How should I implement multi pass rendering in a game engine?
14
GLSL 2d Per Pixel Lighting First time writing shaders, and having a bit of an issue getting per pixel lighting to work. The problem is pretty simple, but I'm afraid am doing something wrong here. Vertex Shader version 150 uniform mat4 proj ortho projection matrix uniform mat4 world world transformation (usually only rotation) in vec2 pos position for this vertex out vec2 ex pos pass position to frag shader void main() gl Position proj world vec4(pos, 0.0, 1.0) ex pos pos Fragment Shader version 150 uniform vec2 light pos light position uniform vec3 light color light color uniform vec3 intensity intensity value to apply to the light uniform float attenuation c constant attenuation uniform float attenuation l linear attenuation uniform float attenuation q quadratic attenuation in vec2 ex pos interpolated position out vec4 frag final fragment color value void main() compute the current distance from light float d distance(light pos, ex pos) compute attenuation for this point vec3 frag intensity intensity (attenuation c attenuation l d attenuation q pow(d,2) ) blend mode is additive frag vec4(frag intensity light color, 1.0) So, this seems pretty straightforward, but I am getting a hopelessly dark polygon (i.e., the additive fragment color always ends up 0,0,0). I have tested that simply setting d to any positive value causes the entire light polygon area to be uniformly lit, so I'm thinking that my distance computation is just wrong, but I'm not sure how it could be. I did have this working a bit earlier by doing vertex lighting instead of fragment, so I'm pretty sure my uniforms are being set correctly. Is it the interpolated position that is causing a problem? Thanks for any help.
14
Is there a successor to RenderMonkey? I'm starting with GLSL shader programming and have been looking into RenderMonkey. Sadly, AMD no longer supports it. Why? Is there a successor to it?
14
DirectX shader how to spread raytracer computation over multiple frames? I'm playing around making a shadertoy style SDF raytracer in HLSL and to make it run faster on high resolutions (1080p and up) I'd like to spread the computation over multiple frames. Right now I have a fragment shader that does raytracing for every pixel of a screen sized quad. I've noticed that if I discard (clip()) the fragment for every second pixel, I get 0 framerate improvement despite effectively only rendering half the image. But if I lower the render texture resolution to half, I see improvement obviously. Also if I clip a contiguous half of the fragment and render the other half, I get a similar performance boost. This must mean that the gpu thread groups are by default allocated in a tiled fashion on the screen, and that all threads of a group must be done computing before the (other) resources of the thread group can be used again. Does anyone know of any magic that can force a fragment shader to only allocate resources for say half the pixels of the screen? (but keep the render target size) Alternatively I suppose I could assign my own thread groups in a Compute shader system where I manage all this stuff myself. Is that the only way to do it? And how would I handle the render target then? (know any resources?) I'm also pretty sure I've seen people make this kind of grainy multi frame raytracers years ago before compute shaders were a thing.
14
WebGL two different approaches to Point Light what is the difference? So I'm studying WebGL, and after directional light I'm moving on to point light. I've seen two different approaches Take the directional light (diffuse and specular components) and multiply it by a falloff attenuation, for example the attenuation function by Tom Madams, which takes into account the distance from the fragment surface point to the light point. This approach is shown in this webgl light walkthrough tutorial, and I find it easier to understand. It also seems to be the same approach as this OpenGL tutorial (in the section Point Light). The other approach listed here does not involve a light attenuation falloff, but calculates the light direction from each fragment position to the light point source. It seems a little bit more complicated to me. Is there something that I have totally misunderstood? What is the difference between these two approaches? When should I use the first and when the second? Thanks!
14
What is the math behind the light effect in krakatoa? I'd like to know the math behind the light effect in krakatoa (click here for an example). Light source is traveling with particles, but how is shading done? Is it something simple, like Phong shading? Is it possible to implement such effect in real time on GPU?
14
Normal maps in 3dsMax OgreMax how to view associated tangent vectors? Reading this post I found out that 3dsMax might not use the same tangents when rendering normal mapped objects. In my case, I was using Ogre and OgreMax to analyze results and they were very inconsistent with what Max was showing. I am able to "colour map" the tangent space in Ogre using shaders, but in Max I can't find out how it is computed wrt to the actual geometry. I also posted the question on the Ogre forums, but nobody answered. Has anyone encountered such an issue up until now? How did you go around it? For further information, this is the Ogre post.
14
Can't import shaders textures to Godot from Blender 2.8 I'm using this shader for my models and then I import them like this This is the original model in Blender and this is the result in Godot I think I might have missed some step, but I'm not sure which one I followed the tutorial video on how to import, and other similar guides, but the result is the same. Any ideas of what I'm doing wrong? It looks like my model is not importing the shader and textures correctly
14
How can I pass a texture to a custom deferred lighting model in Unity? I've replaced the Internal DeferredShading.shader with my own shader, and it's working fine, but I want to add a uniform texture for it to sample from. I've tried adding a texture as a property and assigning it a default value in the inspector but at run time the shader still uses the default value given in the shader (e.g. white). Is it possible to do this?
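One approach worth trying, sketched below: since the deferred lighting shader isn't attached to a material you control per object, globals set with Shader.SetGlobalTexture are the most reliable way in that I know of. Declare the sampler in the custom Internal-DeferredShading replacement without a Properties entry and feed it from a script; the property and class names below are assumptions.

using UnityEngine;

public class DeferredLightingTextureFeeder : MonoBehaviour
{
    public Texture2D lightingRamp;   // assign in the inspector

    void OnEnable()
    {
        // the custom deferred shader just declares:  sampler2D _CustomLightRamp;
        Shader.SetGlobalTexture("_CustomLightRamp", lightingRamp);
    }
}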
14
VS2012 C Unable to ID3D11Device CreateVertexShader when loading compiled CSO file I have been trying to get a tutorial Direct X 11 application running for some time now. I'm being stymied when I try to create the vertex shader. Setting the Direct X Debug setting reveals that I am getting this error. D3D11 ERROR ID3D11Device CreateVertexShader Encoded Vertex Shader size doesn't match specified size. STATE CREATION ERROR 166 CREATEVERTEXSHADER INVALIDSHADERBYTECODE I have compiled the hlsl shader code using Visual Studio 2012 Express into "shaders.cso" and that appears to load okay. According to the HRESULT return from D3DReadFileToBlob. However, the CreateVertexShader call returns "E INVALIDARG One or more arguments are invalid." After this the program fails on a Draw attempt. I have checked the size of the blob buffer and it is the same size as the shaders.cso file. I have included the code which sets this up below. void InitPipeline() HRESULT hr load and compile the two shaders ID3DBlob VS nullptr, PS nullptr ID3DBlob errorBlob nullptr const D3D SHADER MACRO defines "EXAMPLE DEFINE", "1", NULL, NULL UINT flags D3DCOMPILE ENABLE STRICTNESS if defined( DEBUG ) defined( DEBUG ) flags D3DCOMPILE DEBUG endif hr D3DReadFileToBlob(L"shaders.cso", amp VS) encapsulate both shaders into shader objects auto vbs VS gt GetBufferSize() auto vbp VS gt GetBufferPointer() hr dev gt CreateVertexShader(VS gt GetBufferPointer(), VS gt GetBufferSize(), nullptr, amp pVS) This function returns "E INVALIDARG One or more arguments are invalid." and pVS is returned as NULL set the shader objects devcon gt VSSetShader(pVS, 0, 0) create the input layout object D3D11 INPUT ELEMENT DESC ied "POSITION", 0, DXGI FORMAT R32G32B32 FLOAT, 0, 0, D3D11 INPUT PER VERTEX DATA, 0 , "COLOR", 0, DXGI FORMAT R32G32B32A32 FLOAT, 0, 12, D3D11 INPUT PER VERTEX DATA, 0 , dev gt CreateInputLayout(ied, 2, VS gt GetBufferPointer(), VS gt GetBufferSize(), amp pLayout) devcon gt IASetInputLayout(pLayout) For reference this is the shader code. cbuffer ConstantBuffer float4x4 matFinal struct VOut float4 position SV POSITION float4 color COLOR VOut main(float4 position POSITION, float4 color COLOR) VOut output output.position mul(matFinal, position) output.color color return output I'll admit to not knowing much about shader code. Honestly, I'd prefer to do without it at this stage of re learning D3D, but it appears to be required to use DX11. Can anyone shed any light on this?
14
How is this particular HLSL condition treated with respect to compile or run time evaluation? Let's say I have this very simple pixel shader (cbuffers and other stuff omitted) float4 PS(VertexOut pin, uniform bool useLighting) SV Target float4 retColor gDiffuseMap.Sample( sampler0, pin.Tex ) if (useLighting) retColor retColor float4(gAmbientLight, 1.0f) return retColor and two techniques such as technique11 TexTech pass P0 SetVertexShader( CompileShader( vs 4 0, VS())) SetGeometryShader(NULL) SetPixelShader(CompileShader( ps 4 0, PS(false))) technique11 TexLitTech pass P0 SetVertexShader( CompileShader(vs 4 0, VS())) SetGeometryShader(NULL) SetPixelShader(CompileShader(ps 4 0, PS(true))) The way I understand it, the useLighting condition is evaluated during compile time and each technique will have its own version of the pixel shader function without any branching. That means the useLighting condition wouldn't have any runtime penalties. Is that correct? So it's kind of like C preprocessing? Why can the pin variable just be left out like that in the CompileShader call? It makes sense, of course, I'm just wondering if this is some special HLSL or Effect Framework syntax?
14
What is the difference between these two shaders in terms of performance? I have implemented a two pass Gaussian blur shader in GLSL like this ... const float weights float (0.382928, 0.241732, 0.060598, 0.005977, 0.000229) vec2 offset vec2(1.0f) vec2(textureSize(image, 0)) vec3 result texture(image, vec3(io textureCoordinates, layer)).rgb weights 0 if(horizontal) for(int i 1 i lt weights.length() i ) result texture(image, vec3(io textureCoordinates vec2(offset.x float(i), 0.0f), layer)).rgb weights i result texture(image, vec3(io textureCoordinates vec2(offset.x float(i), 0.0f), layer)).rgb weights i else for(int i 1 i lt weights.length() i ) result texture(image, vec3(io textureCoordinates vec2(0.0f, offset.y float(i)), layer)).rgb weights i result texture(image, vec3(io textureCoordinates vec2(0.0f, offset.y float(i)), layer)).rgb weights i ... Where variable horizontal and layer are uniforms (the image is a texture2DArray, but this is not that important). I use a 9x9 kernel (variable weights, the first weight is the center) and first I blur the image horizontally, then I change the horizontal uniform to false and rerun the shader to blur the image vertically. It works fine, but the performance is really bad. I don't know why, because I don't do anything special. The variable horizontal is a uniform, so it remains the same for every pixel in a draw call, so each shader core executes the the same path for every pixel. Then I changed the code a little bit, like this ... const float weights float (0.382928, 0.241732, 0.060598, 0.005977, 0.000229) vec2 offset vec2(1.0f) vec2(textureSize(image, 0)) vec3 result texture(image, vec3(io textureCoordinates, layer)).rgb weights 0 vec2 ofs horizontal ? vec2(offset.x, 0) vec2(0, offset.y) for(int i 1 i lt weights.length() i ) result texture(image, vec3(io textureCoordinates ofs float(i), layer)).rgb weights i result texture(image, vec3(io textureCoordinates ofs float(i), layer)).rgb weights i ... It's almost the same, I just use horizontal in another place, but the performance is much better. Can someone explain what is the difference between the two shaders what causes a big difference in performance? I thought that maybe the first version executes both sides of the if else, which means two times more texture sampling, however the second version is much faster even if I execute it two times more. I'm using WebGL 2 and Nvidia GTX 1050M.
14
How to get the texture coordinate of a neighbouring pixel for a blur shader? I'm still having some trouble to get my head around fragment shaders and doing some image processing on textures. The context is a 2D sprite a simple texture painted on a quad. All done with OpenGL ES 2.0. My very basic goal is a simple blur filter using a 3x3 Kernel with average weights every pixel used is weighed 1 9th and summed up. Besides many ways to improve the performance of the fragment shader(code below) so far I'm still having some difficulties to find the right texture coordinate for the kernel. My approach so far is to use the actual size of the quad on the screen on which the texture is painted and pass those two values to the shader. This is done outside the shader and passed as a uniform to the shader program. glUniform2f( offset, 1 spriteWidth, 1 spriteHeight) This should result in the step in both directions to calculate a texture coordinate in the 0 to 1 space. The result is kind of looking good. BUT I am still struggling if this is something that could be done within the shader. Is there a way to get the size of the texture within the fragment shader? If we would be only doing this on a bitmap, I'll just go from pixel to pixel and read the color of the surrounding pixels. I am wondering if my understanding of a fragment shader is quite right It's run per rendered pixel on the screen. I found some examples for the GLSL to do this but I wasn't able to port it to OpenGL ES, so I had to start from scratch. For the sake of readability I write a bit more code in hope it's easier to understand the fragment shader varying vec2 v texCoord uniform vec2 u offset uniform sampler2D u texture const int size 3 const int KernelSize size size void main() int i, j vec4 sum vec4(0.0) vec4 intensityOfPixel vec2 texCoordForKernel for (i 0 i lt size i ) for (j 0 j lt size j ) texCoordForKernel vec2(v texCoord.x (float(i) 1.0) u offset.x, v texCoord.y (float(j) 1.0) u offset.y) intensityOfPixel texture2D(u texture, texCoordForKernel) sum intensityOfPixel 1.0 float(KernelSize) gl FragColor sum Thanks alot in advance!
14
Cylinder scrolling wrap around world using shaders? I am making a 2D game using OpenGL ES 2.0 (libgdx), where the world will wrap around itself on the Y axis. Apart from implementing the "modulo" method in my rendering code, I'm wondering whether it's possible to render such a world using only shaders? How would I go about handling the edge case in GLSL? EDIT Basically, I want to move this part of the code to the GPU, if possible.
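For the texture-lookup side of it, a hedged GLSL ES fragment sketch: if a horizontal strip of the world lives in one texture, the shader can do the modulo itself with fract(), so the seam never needs special-casing (u_camOffset is an assumed uniform holding the camera scroll expressed in texture space). Setting the texture's wrap mode to GL_REPEAT achieves the same thing without any shader change.

varying vec2 v_texCoord;
uniform sampler2D u_world;
uniform float u_camOffset;   // camera scroll in texture space

void main()
{
    // fract() is the modulo step: 1.2 wraps to 0.2, -0.1 wraps to 0.9
    vec2 uv = vec2(fract(v_texCoord.x + u_camOffset), v_texCoord.y);
    gl_FragColor = texture2D(u_world, uv);
}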
14
Why is clip space 3 dimensional? A vertex shader is basically a transformation function that converts a vertex in your world space to a space that can be rendered on screen. Since the screen is a 2 dimensional surface, what's the purpose of the intermediate clip space? Why not have the vertex shader directly transform a 3d vertex to a 2d point on the raster surface?
14
Basic equation for pure metallic reflectance I'm working on a live wallpaper that shows a pure metallic object. Since it's a live wallpaper, I can make a ton of approximations...this isn't a full blown scene in a game world. My shader only has to support this one material. People won't expect extreme detail, etc. So, for example, I don't even bother with diffuse since a pure conductive material barely has any. So I'm trying to implement a reflective conductive surface that pulls the reflection from a cubemap representing the lighting environment. On a very basic level it looks like this vec3 fresnelSchlick (vec3 f0, float cosTheta) return mix(f0, vec3(1.0), pow(1.0 cosTheta, 5.0)) ... float NdV dot(normal, viewDir) vec2 brdf texture2D(u brdf, vec2(u roughness, NdV)).rg from a LUT vec3 reflection textureCube(u cubemap, reflect(viewDir, normal)) gl FragColor reflection (fresnelSchlick(materialColor, NdV) brdf.x brdf.y) The problem with my simple math here is that the range of possible final pixel colors does not produce a proper looking range of hues. Suppose my material color is 0050ff and the environment is monochrome white. Then all of the possible colors that could be seen are This doesn't allow for bright looking reflections at the light sources in the envmap. I found this example material render for comparison. The cubemap in the example here is close to monochrome white. But at the bright areas of the reflection, there is a lot of red component in the color. With my limited range of colors, my object looks dull and lifeless. Can anyone point out what I'm missing in my equation?
14
Disable depth testing for only some faces I have some meshes and I need to be able to draw some parts of them without depth testing. Is it possible to turn off depth testing from within the shader, so that some parts will be rendered no matter what depth is in the z buffer? Or can I transform Z W somehow to fake this?
14
HLSL equivilant to "Object" data from "Texture Coordinate" node in Blender I mocked up a shader how I wanted it with the node editor in Blender. Now I'm trying to write it in HLSL. In Blender there is a node group called "Texture Coordinate". If I use the "uv" node from the group it behaves like a normal unlit frag shader but if I use the "object" node to give coordinates to the texture, it ignores the uv data and just maps the texture like an image overlayed onto the object. This is actually the effect I want. However, I can't find a way to replicate this in HLSL. As far as I can see I can use TEXCOORD0 and POSITION as texture coordinates to produce uv mapping and world mapping respectively for a texture onto an object. Maybe what I want is object mapping? And if it matters I'm using a generated texture
14
Compute min max position of tile in compute shader I'm trying to implement tiled deferred lighting with OpenGL compute shaders. For that, I need to compute the minimum and maximum position of each tile. My first approach was to use atomic operations for the computation, like this shared int minX shared int minY shared int minZ shared int maxX shared int maxY shared int maxZ ... atomicMin(minX, positionWorld.x) atomicMax(minY, positionWorld.y) barrier() now the shared variables contain the min max position per tile But that runs painfully slowly (I guess because the atomic operations aren't well suited to parallel processing). So what would be the best fastest way to do this?
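One commonly used alternative is a shared-memory tree reduction: every invocation writes its own position into shared arrays, then the group halves the active range log2(N) times, which avoids serialising on atomics altogether. A hedged sketch assuming a 16x16 work group, one invocation per pixel of the tile:

shared vec3 sMin[256];
shared vec3 sMax[256];

void reduceTileBounds(vec3 positionWorld)
{
    uint idx = gl_LocalInvocationIndex;
    sMin[idx] = positionWorld;
    sMax[idx] = positionWorld;
    memoryBarrierShared();
    barrier();

    // classic tree reduction: 128, 64, 32, ... active invocations per step
    for (uint offset = 128u; offset > 0u; offset >>= 1u)
    {
        if (idx < offset)
        {
            sMin[idx] = min(sMin[idx], sMin[idx + offset]);
            sMax[idx] = max(sMax[idx], sMax[idx + offset]);
        }
        memoryBarrierShared();
        barrier();
    }
    // sMin[0] / sMax[0] now hold the tile's bounds
}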
14
How to have qt5 work on desktop like on mobile devices (es2, correct glsl version)? I am looking for help information concerning this issue My work I have an opengl es2 render engine that works on an iOS app. I almost managed to make it work by calling the same openGL rendering engine from a Qt(4.8 first, then 5.0 since today) app with a QGLWidget. To support es2 function set my widget also inherits QGLFunctions (QtOpenGLFunctions with qt 5.0.0). However I still have issues with shaders I could not compile them on qt with glCompileShader because of almost every keywords (lowp, vec4, vec2 ) were returning compilations errors. So I compiled it with QOpenGLShader program, but I had to specify version 120 , the closest to es2. But still, my sprites don t show up the right size and the only potentially influencing different pieces of code with iOS are in shader compilation version. I think the issue in the shader is that gl PointSize is not taken into account. Is there any better way for me to have shaders compiled on qt like they are on iOS ? ( I know glsl es2 version is coming from version 120 but I don t know to what extent they differ ). The hello gl es2 example did not help me because glsl version also returns 1.20. I ll happily receive any hints, thank you ! My shaders, working well on iOS but not on Qt const GLchar vShaderStr ifdef BUILD DESKTOP " version 120 n" necessary for Qt to compile shaders endif BUILD DESKTOP "attribute lowp vec4 Position n" "attribute mediump vec2 TextureCoord n" "attribute lowp float Weight n" "uniform mat4 MVP n" "varying mediump vec2 TextureCoordOut n" "void main(void) n" " n" " gl Position MVP Position n" " TextureCoordOut TextureCoord n" " gl PointSize Weight n" " n" const GLchar fShaderStr ifdef BUILD DESKTOP " version 120 n" necessary for Qt to compile shaders endif BUILD DESKTOP "varying mediump vec2 TextureCoordOut n" "uniform sampler2D Sampler n" "uniform bool IsSprite n" "uniform lowp vec3 TextureColor n" "uniform lowp float Opacity n" "void main(void) n" " n" " lowp vec4 textureColorResult n" " textureColorResult texture2D(Sampler, IsSprite ? gl PointCoord TextureCoordOut) n" " gl FragColor IsSprite ? vec4(mix(textureColorResult.rgb,TextureColor, 1.0), n" " textureColorResult.a Opacity) textureColorResult n" " n" Edit replaced preprocessor QT OPENGL LIB with custom BUILD DESKTOP as the first is no more declared in qt5.
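One thing that may explain the wrong sprite sizes specifically, independent of the GLSL version question: on desktop OpenGL the gl_PointSize written in the vertex shader is ignored unless program point size is enabled, whereas on OpenGL ES 2 it is always active, which would make the iOS build behave differently from the Qt desktop build. A hedged sketch of the desktop-only state to try, reusing the post's BUILD DESKTOP define:

#ifdef BUILD_DESKTOP
    glEnable(GL_PROGRAM_POINT_SIZE);   // same enum value as GL_VERTEX_PROGRAM_POINT_SIZE
    // on a compatibility-profile context, point sprites / gl_PointCoord may also need:
    // glEnable(GL_POINT_SPRITE);
#endif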
14
What do shaders encompass? I'm researching shaders as I'm thinking about doing them for my final year project at Uni. I've looked at a lot of examples online and I think I get it. It's something that you apply to an object or scene in order to create a desired effect without changing the original object scene, I think. I know that there are different types of shaders, but that's the basic goal, right? Furthermore, what do shaders encompass? Are weather systems in games shaders? Particle effects? Landslides? I don't know where the line is drawn between animations cinematics and shaders. I also guess there are shaders which affect something in the game world, for example, a tower blowing up in BF4. The explosion itself would be shaders, but the way the tower affects the world around it would be physics and collisions. Am I right?
14
DirectX11 how to use textures and samplers in slots in shaders I have a system to render many objects, but I don't know how to render more than one object with the same shader let me explain. I have a sphere and a cylinder, but both objects can be rendered by different shaders, for example Shader1 a shader to render an object using 1 texture Shader2 a shader to render an object using 2 textures Both objects need to coexist in space, so if I want to render the sphere with shader1 I use shader1 resources context->PSSetShaderResources(0, 1, &texture) context->PSSetSamplers(0, 1, &sampler) shader2 resources context->PSSetShaderResources(1, 1, &texture) context->PSSetSamplers(1, 1, &sampler) context->PSSetShaderResources(2, 1, &texture) context->PSSetSamplers(2, 1, &sampler) In Shader1 the resources are referenced like this Texture2D colorTexture register(t0) SamplerState sampler register(s0) And in Shader2 the resources are referenced like this Texture2D colorTexture register(t1) SamplerState sampler register(s1) Texture2D colorTexture register(t2) SamplerState sampler register(s2) But what if I need to use shader1's resources in shader2? How do I manage those resources, or do I need to replicate shader2 with shader1's registers? This is the simplest example it is part of a very complex system with many many shaders and many textures, but I don't know in which slot the resources will be set this can be absolutely generic, for example slot 5 will be used for texture 1 of shader2. It should be possible to render many objects with minimal shader changes, but the resources could be updated at any time. I'm using DirectX 11.