Using Ogre particle point billboards with shaders. I'm learning about using Ogre particles and had some questions about how the point type particles work. Q: I believe point type particles are implemented as a single position. Is a single vertex passed to the vertex shader? Q: If only one vertex is passed to the vertex shader, then what gets sent to the fragment shader? Q: Can I pass the particle size to the shader, perhaps with a custom parameter?
What can I do with the 4th component of gl_Position? When I set gl_Position I usually assign it as gl_Position = vec4(in_position, 1.0), where in_position is a vector of three components representing a vertex of my model. But looking through tutorials I cannot find anything explaining what the 4th component of the gl_Position vec4 does, aside from making the vector big enough for matrix transformations to be applied to it. Q: What can I do with the 4th component of gl_Position, and what does it influence in the rendering process?
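For context, a minimal sketch of how the fourth (w) component is normally produced and consumed; the uniform and attribute names here are placeholders, not from the question:

    #version 330 core

    layout(location = 0) in vec3 in_position; // hypothetical attribute name
    uniform mat4 u_mvp;                       // hypothetical combined matrix

    void main()
    {
        // A perspective projection matrix writes the view-space depth into w.
        gl_Position = u_mvp * vec4(in_position, 1.0);

        // After the vertex shader, fixed-function hardware divides by w
        // (the "perspective divide"): ndc = gl_Position.xyz / gl_Position.w.
        // w also drives clipping and perspective-correct interpolation of
        // the varyings, which is why you normally leave it alone.
    }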
How to send data from a compute shader to a vertex shader? I have several shaders, and every shader uses the same constant buffer: cbuffer cbPerFrame : register(b0) { float3 gEyePosW; float4x4 gView; float4x4 gProj; float gDim; float3 gVoxelOffset; float gVoxelSize; }. I think it's a waste to upload the same constant buffer several times, so I want to pack the data into one constant buffer and share it with the other vertex shaders. I have tried sending the constant buffer to the compute shader, but it doesn't work in the vertex shader. Do I need to pack this data into one structured buffer? Is that faster than sending the constant buffer several times? Is deferred shading relevant to this problem? Which solution is usually used to solve this?
How to sample a texture with a specified LOD in SM 2.0? That is my question: in SM 3.0 there is the HLSL intrinsic function tex2Dlod to fetch a color from a texture2D at specified coords and a specified level of detail. But after a long search on the web, I can't find an equivalent in SM 2.0 (ps_4_0_level_9_3)...
Need help with a grab shader. How can I make a grab shader similar to https://docs.unity3d.com/Manual/SL-GrabPass.html in GLSL? The idea is that you grab the exact pixels from the background (e.g. from a composited render target) and put them on the rendered object. So it's like 100% transparency, because it always shows the pixels right behind. Later, if you distort the pixels, it can give very cool effects like https://www.shadertoy.com/view/Mdc3Rl. So far I have a texture of the background (plane rendered to a render target), and an object (a cube) right in front of it. My vertex shader for the cube is gl_Position = WVP * spos, and I have no idea how to write the fragment shader to copy the pixels from the background. Any help appreciated.
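A minimal sketch of the usual approach, assuming the background has already been rendered to a texture and that its size is available in a uniform; u_backgroundTex and u_resolution are placeholder names, not from the question:

    #version 330 core

    uniform sampler2D u_backgroundTex; // hypothetical: the composited render target
    uniform vec2 u_resolution;         // hypothetical: render target size in pixels

    out vec4 fragColor;

    void main()
    {
        // gl_FragCoord gives this fragment's position in window pixels;
        // dividing by the resolution turns it into 0..1 screen UVs, i.e.
        // the texel directly behind the object.
        vec2 screenUV = gl_FragCoord.xy / u_resolution;

        // Add an offset here (noise, normal map, sine wave, ...) to distort.
        fragColor = texture(u_backgroundTex, screenUV);
    }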
2D game view camera zoom, rotation and offset using 'Filter'/'Shader' processing? I wish to add the ability to zoom in, zoom out, rotate and move the view in a top-down view over a collection of points and lines on a large 2D map. I split the map into a grid so I only need to render the points that are 'near' the camera. My question is, how do I render a point A(Xp, Yp) assuming the following details: the offset of the camera POV from the origin of the map is (Xc, Yc), meaning the camera center is positioned on top of that point (if there's a point at (Xc, Yc) it is positioned in the center of the screen); the rotation angle is alpha; the scale is S. Read my answer first; I am thinking there is a more optimized solution, thanks. My question is how to include the following improvement: I read in the AS3 Bible book that, in regards to ShaderInput, "You can use these methods to coerce Pixel Bender to crunch huge sets of data masquerading as images, without doing too much work on the ActionScript side to make them look like images." Meaning that if I am performing the same linear function on a lot of items, I can do it all at once if I use shaders correctly and save processing time. Does anyone know how that is accomplished? Here is a sample of what I mean: http://wonderfl.net/c/eFp0
How to write shaders that can be compiled for DirectX, OpenGL, and Vulkan. I recently finished writing the DirectX renderer for my game engine. Now I have an OpenGL, a DirectX, and a not-yet-finished Vulkan renderer. The majority of the renderers work perfectly now, but I have a problem: I need a shader programming language. The problem is that OpenGL and Vulkan use GLSL but DirectX uses HLSL (and Apple's Metal API uses MSL). So I searched for a high-level shader language and found only Cg (C for Graphics) from NVIDIA, but since that project was deprecated I looked for something else, without success. It's a bit annoying to write for 3 shader programming languages at the same time, so I'm looking for a language that can be translated into the native language when the game starts (or is simply compatible with a lot of rendering APIs). After several weeks of finding nothing, I decided to write my own language for it. But before I invest too much time I want to know if there is another solution to this problem.
Algorithm for a rain effect (Shaders) I'm trying to implement a rain effect using shader, I use godot 3 alpha, which uses a simplified GLSL 3.0 language. But I'm just finding very complex examples for me, I understand little about shaders, I wanted to achieve something very basic even to be able to walk. I thought of doing the following, moving the texture with just a few painted pixels. Yet I still do not know where to start. I found this code, I really wanted to understand what is done in it http www.glslsandbox.com e 36547.0 I managed to translate the code for godot 3, but it does not work very well, I believe this line is to blame p 0.5 0.35 sin(11.0 fract(sin((s p scale) mat2(vec2(7,3),vec2(6,5))) 5.0)) f What really is done that line? I can not understand what he does. I did so shader type canvas item uniform float direction uniform float velocity 1.0 uniform float intensity float snow(vec2 uv, float scale, float time) float w smoothstep(1.0,0.0, uv.y (scale 10.0)) if(w lt 0.1) return 0.0 uv time scale uv.y time velocity scale VELOCITY uv.x sin(uv.y time 0.5) scale uv scale vec2 s floor(uv) vec2 f fract(uv) vec2 p float k 3.0 float d vec2 t (s p vec2(scale)) mat2(vec2(7.0, 3.0), vec2(6.0,5.0)) p.y (0.5 0.35 sin(11.0 fract(sin((s.y p.y scale) t.y) 5.0)) f.y) p.x f.x d length(p) k min(d,k) k smoothstep(0.0,k,sin(f.x f.y) 0.01) return k w void fragment() vec2 uv (COLOR.xy 2.0 SCREEN UV.xy) min(SCREEN UV.x,SCREEN UV.y) float c smoothstep(1.0,0.3,clamp(uv.y 0.3 0.8,0.0,0.75)) c snow(uv, 30.0, TIME) 0.3 c snow(uv, 20.0, TIME) 0.5 c snow(uv, 15.0, TIME) 0.8 c snow(uv, 10.0, TIME) c snow(uv, 8.0, TIME) c snow(uv, 6.0, TIME) c snow(uv, 5.0, TIME) c snow(uv, 2.0, TIME) COLOR vec4(vec3(c),0.5) and this happens I have discovered that by decreasing the value I am passing to the scale, it increases the "particles", I mean how octaves in a noise, the smaller the number last, the greater the peaos generated. But my screen seems to be divided in two, cut by a diagonal line, I do not understand much because. I realize that here is the problem vec2 uv (COLOR.xy 2.0 SCREEN UV.xy) min(SCREEN UV.x,SCREEN UV.y) Up 1 I managed to improve with this vec2 uv UV now it's like this but what he wanted was to be able to pass three uniforms, direction, velocity and intensity, and to be able to control the direction (right equer), the speed with which they fall, the intensity (higher or lower particles, more or less drops). But I still do not understand how well it is done to do that. Even more that what I will do is rain, I have to understand what is done with the colors to get the idea of rain and not snow. Up 2 I was stirring here and the velocity, intensity and direction I was able to configure passing a uniform. But I can not change the color of the drops, it always comes out black and white. How would you put a blue color for example? I try to change the color to red for example using the mix() function vec3 col2 vec3(1.0, 0.0, 0.0) vec3 resultColor mix(vec3(c), col2, 0.7) COLOR vec4(vec3(resultColor),0.3) and I get this I wanted to change only the color of the drops? R I got it like this COLOR vec4(vec3(0.3, 0.3, c),0.5) but I'm not very fond of the result, it's very bright, and my breasts are too big, I'll fix it.
Direct3D Techniques and Windows 8. Under Windows 8, what is the alternative to this code? technique11 Light0Tex { pass P0 { ... } } technique11 Light1Tex { pass P0 { ... } } And: Light1Tech = mFX->GetTechniqueByName("Light1"); ... Light1Tech->GetPassByIndex(p)->Apply(0, md3dImmediateContext);
Transparency behaviour on PowerVR. I'm doing some graphics optimization tests, especially on PowerVR transparency. I made a scene with two groups of simple objects. One group has the shape modelled in geometry using a diffuse shader, and the other uses quads with a transparency shader. The camera looks through a plane towards a group. The test was about the "lateness" of PowerVR's Z tests, and whether to use real geometry instead of transparencies for e.g. leaves or flowers. The results were not wholly what I expected. The transparency group benefitted from changing the plane from transparent to diffuse, increasing the FPS by 20 (expected). But why is the group without the plane in front so much faster than when the plane is in front? (Not expected; FPS was 38, screenshotting lowered it.) Tests on the geometry group were the opposite of what I had learnt in the past few days: there was only a 2 FPS difference between transparent and diffuse planes (not expected). Shouldn't the transparent plane in front cause the whole scene to render, since the Z test is at the bottom of the pipeline?
Getting a warning about an extension when compiling GLSL code with the glslc compiler shipped with the Vulkan SDK. Trying to compile my pixel shader, a warning is generated: D:\CS\ComputerGraphics\vulkan\WindowsProject1> D:\ProgrammingTools\vulkan\Bin\glslc shader.frag -o frag.spv — shader.frag:3: warning: '#extension' extension not supported: GL_KHR_vulkan_glsl — 1 warning generated. The following is my pixel shader code: #version 460; #extension GL_ARB_separate_shader_objects : enable; #extension GL_KHR_vulkan_glsl : enable; #extension GL_EXT_debug_printf : enable; // input: layout(location = 0) in vec3 fragColor; layout(location = 1) in vec2 fragTexCoord; // output: layout(location = 0) out vec4 outColor; // uniform: layout(set = 0, binding = 2) uniform sampler2D texSampler; void main() { outColor = texture(texSampler, fragTexCoord); } How do I fix this?
Need help transforming a DirectX 9 skybox HLSL shader to DirectX 11. I am in the middle of implementing a skybox in my game. I have been following this tutorial: http://rbwhitaker.wikidot.com/skyboxes-2. I am using MonoGame as a framework, and in order to support both Windows and Windows 8 Metro I need to compile the shader with pixel and vertex shader 4: compile vs_4_0_level_9_1, compile ps_4_0_level_9_1. However, some of the HLSL syntax has been updated with DX10 and DX11. I need to update this HLSL code: float4x4 World; float4x4 View; float4x4 Projection; float3 CameraPosition; Texture SkyBoxTexture; samplerCUBE SkyBoxSampler = sampler_state { texture = <SkyBoxTexture>; magfilter = LINEAR; minfilter = LINEAR; mipfilter = LINEAR; AddressU = Mirror; AddressV = Mirror; }; struct VertexShaderInput { float4 Position : POSITION0; }; struct VertexShaderOutput { float4 Position : POSITION0; float3 TextureCoordinate : TEXCOORD0; }; VertexShaderOutput VertexShaderFunction(VertexShaderInput input) { VertexShaderOutput output; float4 worldPosition = mul(input.Position, World); float4 viewPosition = mul(worldPosition, View); output.Position = mul(viewPosition, Projection); float4 VertexPosition = mul(input.Position, World); output.TextureCoordinate = VertexPosition - CameraPosition; return output; } float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0 { return texCUBE(SkyBoxSampler, normalize(input.TextureCoordinate)); } technique Skybox { pass Pass1 { VertexShader = compile vs_2_0 VertexShaderFunction(); PixelShader = compile ps_2_0 PixelShaderFunction(); } } I guess I need to change Texture into TextureCube, change the sampler, swap texCUBE() with TextureCube.Sample(), and change the pixel shader return semantic to SV_Target0. I'm very new to shader languages and any help is appreciated!
How can I make a shader effect that looks like a lightly shaded pencil drawing? I want to make a shader effect using OpenGL ES 2.0 that looks like this image. I'm not sure whether this image was painted or is the result of some filter, but I want to create a shader that produces images that look similar. I want to know a shader algorithm that produces this kind of output!
How can I avoid applying textures to a fragment in a shader when the surface normal is (0, 1, 0)? I have a small GLSL shader with a vertex shader and a fragment shader. I want to avoid applying textures to faces that have a normal equal to (0, 1, 0). Is this possible? These are my shaders. Vertex: #version 120; varying vec2 UV; varying vec3 normal; void main() { UV = gl_MultiTexCoord0.xy; gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; normal = gl_Normal; } Fragment: #version 120; varying vec2 UV; uniform sampler2D diffuseMap; void main(void) { gl_FragColor = texture2D(diffuseMap, UV); }
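A sketch of one way to do it, assuming the normal is passed through to the fragment shader as in the question; the threshold and the fallback color are illustrative choices, not from the post:

    #version 120

    varying vec2 UV;
    varying vec3 normal;
    uniform sampler2D diffuseMap;

    void main(void)
    {
        // 1.0 when the (normalized) normal points almost straight up, else 0.0.
        float isUp = step(0.999, dot(normalize(normal), vec3(0.0, 1.0, 0.0)));

        vec4 texColor  = texture2D(diffuseMap, UV);
        vec4 flatColor = vec4(1.0); // whatever should appear on untextured faces

        // Skip the texture on up-facing fragments without a branch.
        gl_FragColor = mix(texColor, flatColor, isUp);
    }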
Premultiplied Alpha And Alpha Testing. I have a shader that is supposed to work with either alpha blending or alpha testing, but the color values being passed in are premultiplied alpha values. Is there an easy, standard way to have it produce the "correct" results for both alpha blending and alpha testing when using premultiplied alpha? For example, if the final result of my shader is RGBA(1,0,0,0.75) in straight alpha and therefore RGBA(0.75,0,0,0.75) in premultiplied alpha, the result should remain RGBA(1,0,0,0.75) when alpha testing. Perhaps one option is "dividing out" the alpha channel if we are alpha testing, i.e. divide the R by A above so that we get 0.75 / 0.75 = 1.0, our original red value. But this becomes non-deterministic when the alpha goes to 0. Maybe I can clarify with an example: say the back buffer is RGBA(1,1,1,1), and I am rendering with a partially transparent green color RGBA(0,1,0,0.75) (straight alpha). The color is passed in as premultiplied alpha, i.e. RGBA(0,0.75,0,0.75). When alpha blending, I blend using ONE / INVSRCALPHA and get a final color of RGBA(0.25,1,0.25,1). So my green channel stays at 100%, because the back buffer had green at 100%, and so did my original green color, so there is no way to get less than 100% on the green channel. When alpha testing, there is no blending going on. Instead, my green color will be rendered straight to the back buffer as RGBA(0,0.75,0,1). But the color I wanted was RGBA(0,1,0,1).
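A minimal sketch of the "divide out the alpha" idea from the question, guarded against alpha near 0; the uniform names are placeholders and the cutoff value is illustrative:

    #version 330 core

    uniform bool  u_alphaTest;   // hypothetical: true when alpha testing
    uniform float u_alphaCutoff; // e.g. 0.5

    in  vec4 premultColor; // premultiplied-alpha color from earlier stages
    out vec4 fragColor;

    void main()
    {
        if (u_alphaTest)
        {
            // Discard as usual, then recover straight alpha for the surviving
            // fragments; max() avoids the divide-by-zero the question mentions.
            if (premultColor.a < u_alphaCutoff)
                discard;
            fragColor = vec4(premultColor.rgb / max(premultColor.a, 1e-5),
                             premultColor.a);
        }
        else
        {
            // Alpha blending path: leave it premultiplied and use
            // ONE / ONE_MINUS_SRC_ALPHA blending in the pipeline state.
            fragColor = premultColor;
        }
    }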
Forward rendering and separation of shader logic. I'm currently playing with writing a rendering engine and implementing a forward rendering pipeline. I have a few doubts about how things should be implemented regarding the render passes as well as the rendering of multiple lights. So I'm wondering what the better approach is here. (1) An uber shader that is generated dynamically to adjust for the number of lights that currently affect the active scene. This has the nice benefit of needing only one render pass for the lighting shader, so much less binding occurs, which is a plus. (2) A separate lighting shader for each type of light, and a render pass for each mesh with each type of light, combining the results at the end (possibly with 2 render targets that we ping-pong between) using additive blending. This has the nice benefit of separation and more modular and maintainable shader code, but on the other hand there are more render passes and more binding going on, not to mention the ping-pong of additive blending between the render targets. (3) Other ideas that I haven't thought about and would love to hear. Of course there's always room for optimization of both techniques, such as occlusion tests, or testing which point/spot light affects which mesh (relevant to the second technique). So I'm wondering what the take of modern rendering/game engines is on this; maybe I didn't list their approach and would love to hear about it. Also, if you side with one technique or the other, I would like to hear your thoughts on the pros and cons. Thanks in advance!
Does it make sense to do more calculations in the fragment shader if there are more vertices than pixels? I'm very new to graphics programming, and as I understand it vertex shaders are called per vertex and fragment shaders per pixel (ignoring anti-aliasing). When it comes to optimization, all sources I refer to say that calculations that could be done in either shader should be done in the vertex shader, and it makes sense: 5 * 8 done 3 times is way better than 5 * 8 done 300,000 times (640 * 480, for example). And that's fine at that scale, but the question almost poses itself: what happens when the number of vertices exceeds the resolution? Here's a practical example: a typical AAA game could have up to 3,000,000 polys rendering on the screen; for the sake of simplicity let's say it's a very detailed triangle strip, which would give it exactly 3,000,002 vertices, while a full HD screen has 1920 * 1080, about 2,000,000 pixels. As you can tell, a difference of a million operations could be a substantial performance increase. Is my thought process sound, or does my ignorance of some details leave something to be desired?
Draw cube in GLSL shader. I am working on a voxel engine. Currently my cubes are rendered as VBOs. I thought it might be better to load only the coordinates of the voxels to the shader and have it draw a cube itself. Can someone tell me if that's possible, and if so, how?
Sprite outline based on light position in UE4. I have a simple sprite (for example a black box) and I would like to add an outline to it (the red line in the example image), but based on the light direction. I know UE4 uses a deferred renderer, so I can't access the lights, but I can pass information as parameters to the material (angle, distance, intensity...). But I'm still not sure how to compute which visible corner is lit by the light. Can you please point me to how to achieve this effect? Thank you.
OpenGL ES 2.0: Repository of Quality Shaders. Could I kindly ask you to suggest a repository of high-quality OpenGL (OpenGL ES 2.0) vertex and fragment shaders, please? I am looking for per-pixel lighting shaders (such as Phong) and similar. It would be nice to see more of them, to be able to choose between quality vs. shader performance.
Are there technical reasons to use short variable names in shader code? I have been reading and writing both GLSL and Cg for the past few years, and have noticed a trend. In programming, we are generally advised to be as meaningful and concise as possible with variable names, but I see this suggestion being ignored almost every time I look at shaders. Here is a random example I grabbed off of Shadertoy: vec3 tex_wmcs(vec2 p) { float s = mix(0.2, 1.2, sqrt(smoothNoise2(p * 3.0))); vec3 col = mix(vec3(0.15,0.2,0.2) * 0.4, vec3(0.2,0.15,0.11) * 0.7 * s, sqrt(max(0.0, sm...; vec2 p2 = p * vec2(4.0,10.0) + vec2(smoothNoise2(p * 24.0), smoothNoise2...; p2.x += floor(p2.y) * 0.5; float bh = pow(0.5 + 0.5 * cos(floor(p2.y) * 14.0) * cos(floor(p2.x) * 1.0), 2.0); float brick = brickt(p2); vec3 bn = normalize(vec3(brickt(p2 + vec2(1e-3, 0.0)) - brick, b...; ... } Is there a practical reason so many graphics programmers minify their code to such a point? Is there a technical downside to spelling out brickNormal instead of leaving it as bn, or using brick_tangent or brickTexture instead of brickt?
Depth of Field Blur: weighted sampling? I've been studying Intel's fantastic article titled "An investigation of fast real-time GPU-based image blur algorithms" (here), wherein they state that Gaussian blur would need to be customized for proper use in achieving a depth of field effect. The relevant part states: "They are all generic algorithms that are fully applicable to effects like a Bloom HDR filter. However, in a specific scenario, for example for use in Depth of Field, they would require additional customization (such as weighted sampling to avoid haloing/bleeding) which can impact relative performance." However, they don't go into detail about how to compute the weighted sampling. Does anyone know what this means, or know of relevant documentation or example code? I have been using Gaussian blur for DoF and have indeed seen the bleeding and other artifacts, so I've been looking for ways to solve these problems.
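For illustration, a sketch of one common reading of "weighted sampling": each Gaussian tap is additionally weighted by how plausible it is as a contributor, e.g. by comparing its depth to the center pixel so sharp foreground pixels don't bleed into a blurred background. All names and the weighting heuristic here are assumptions, not taken from the Intel article:

    #version 330 core

    uniform sampler2D u_color;  // hypothetical: scene color
    uniform sampler2D u_depth;  // hypothetical: linear depth, larger = farther
    uniform vec2      u_texel;  // 1.0 / resolution
    in  vec2 uv;
    out vec4 fragColor;

    void main()
    {
        const int RADIUS  = 4;
        float centerDepth = texture(u_depth, uv).r;

        vec3  sum  = vec3(0.0);
        float wsum = 0.0;

        for (int x = -RADIUS; x <= RADIUS; ++x)
        for (int y = -RADIUS; y <= RADIUS; ++y)
        {
            vec2  offs  = vec2(x, y) * u_texel;
            float gauss = exp(-float(x * x + y * y) / 8.0); // spatial Gaussian
            float d     = texture(u_depth, uv + offs).r;

            // Down-weight samples that are much closer to the camera than the
            // center pixel; those are what cause the halo/bleeding artifacts.
            float w = gauss * clamp(1.0 - 10.0 * max(centerDepth - d, 0.0), 0.0, 1.0);

            sum  += texture(u_color, uv + offs).rgb * w;
            wsum += w;
        }
        fragColor = vec4(sum / max(wsum, 1e-5), 1.0);
    }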
What are the pros and cons of HLSL vs. GLSL vs. Cg? What are the pros and cons of the three?
Rotate mesh to normal. I have some instanced geometry (basic tube meshes) laid out in a grid, and I have a noise texture (normal map) that I want to use to rotate my instances. So each pixel in my texture is a normal, and I want to rotate each instance with the corresponding normal in a shader. How can I achieve that in a shader only? Unless I am mistaken, there should only be a rotation around the X and Z axes.
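A sketch of one way to do this per instance in the vertex shader: sample the normal, build an orthonormal basis whose "up" axis is that normal, and rotate the instance's vertices by it. The texture-fetch coordinates and all names are placeholders, not from the question:

    #version 330 core

    layout(location = 0) in vec3 in_position;       // tube vertex, modeled with +Y up
    layout(location = 1) in vec3 in_instanceOffset; // per-instance grid position

    uniform sampler2D u_normalMap; // hypothetical noise/normal texture
    uniform vec2      u_gridSize;  // instances per row/column
    uniform mat4      u_viewProj;

    void main()
    {
        // One texel per instance; gl_InstanceID picks this instance's normal.
        vec2 uv = (vec2(gl_InstanceID % int(u_gridSize.x),
                        gl_InstanceID / int(u_gridSize.x)) + 0.5) / u_gridSize;
        vec3 n  = normalize(textureLod(u_normalMap, uv, 0.0).xyz * 2.0 - 1.0);

        // Build a right-handed basis with n as the new Y axis.
        vec3 helper    = abs(n.y) < 0.99 ? vec3(1.0, 0.0, 0.0) : vec3(0.0, 0.0, 1.0);
        vec3 tangent   = normalize(cross(helper, n));
        vec3 bitangent = cross(tangent, n);
        mat3 rot       = mat3(tangent, n, bitangent); // columns: new X, Y, Z

        vec3 worldPos = rot * in_position + in_instanceOffset;
        gl_Position   = u_viewProj * vec4(worldPos, 1.0);
    }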
Creating a Blender-like cavity shader. I'm trying to achieve in Godot something similar to the cavity effect from Blender (this can be enabled in the viewport options; the cube on the left is displayed by a viewport with this option enabled). How would you replicate this?
What is the minimum set of shaders I need to run basic calculations on the GPU? I read that the hull shader, domain shader, geometry shader and pixel shader are optional. So, is the vertex shader optional too? If not, what does a basic vertex shader look like? Just a simple pass-through? Is the vertex shader necessary to tell what kind of data structure (fans, strips or meshes) is used? What can I do with just the vertex shader? Do the fixed functions work without any programming of a programmable stage?
How would one construct a realistic "infrared vision" effect? How would you go about constructing a realistic infrared vision effect with shaders? By realistic I mean one that looks realistic, like this example. I have an idea about making a texture to determine how much heat a material emits and then determining, using the dot product of the normal and view vectors, how much of that heat reaches the viewer, but I'm not even sure that this is how thermal vision works, so I wanted to check if there's a better approach before starting to implement something that might be entirely wrong.
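Purely as an illustration of the idea described in the question (not a claim about how real thermal imaging works), a sketch that reads a per-material "heat" texture, optionally scales it by a view-dependent term, and maps the result through a false-color ramp; every name here is a placeholder:

    #version 330 core

    uniform sampler2D u_heatMap;   // hypothetical: how much heat the material emits
    uniform sampler2D u_colorRamp; // 1D ramp: black -> blue -> red -> yellow -> white

    in  vec2 uv;
    in  vec3 worldNormal;
    in  vec3 viewDir;     // from surface towards the camera
    out vec4 fragColor;

    void main()
    {
        float heat = texture(u_heatMap, uv).r;

        // Optional view-dependent falloff, as suggested in the question.
        float facing = max(dot(normalize(worldNormal), normalize(viewDir)), 0.0);
        heat *= mix(0.7, 1.0, facing);

        // Look the intensity up in the false-color ramp.
        fragColor = texture(u_colorRamp, vec2(heat, 0.5));
    }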
DirectX11: Using Multiple Shaders. I currently have a scenario where I am rendering terrain with a shadow map. I have two passes: one for the depth buffer to create the shadow map (which is rendered to a texture), and a second that takes the shadow map texture as input and actually shadows the terrain; the terrain gets drawn once for each pass. Within the second shader I also texture the terrain. I now want to split the second shader so that shadowing and texturing are separate shaders. How do I go about getting the output from the shadow shader to the texture shader (or vice versa)? Or do I simply do it in the same way as with the shadows (rendering to a texture and passing it as an input)?
Direction vector in raycasting. When I read about how to get the direction vector in raycasting, for example on this site http://www.daimi.au.dk/~trier/?page_id=98, they first render the mesh with front-face culling and then with back-face culling, and then subtract the back face from the front face to get a direction vector for each pixel. But isn't this too much work just to get the direction vector? Isn't it simpler and faster to take the vertex position (in world coordinates) and subtract the camera position in the fragment shader to get the direction vector? This should give the exact same answer, but we skip the back-face and front-face rendering.
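A sketch of the alternative the question proposes, computing the per-pixel ray direction directly from the interpolated world-space position and a camera-position uniform; the names are placeholders:

    #version 330 core

    // The vertex shader would pass the world-space position through:
    //   worldPos = (u_model * vec4(in_position, 1.0)).xyz;

    uniform vec3 u_cameraPosWorld; // hypothetical uniform

    in  vec3 worldPos;   // interpolated point on the volume's bounding mesh
    out vec4 fragColor;

    void main()
    {
        // Ray from the eye through this fragment's point on the front faces.
        vec3 rayDir = normalize(worldPos - u_cameraPosWorld);

        // ... march through the volume along rayDir starting at worldPos ...
        fragColor = vec4(rayDir * 0.5 + 0.5, 1.0); // visualize the direction
    }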
Screen effects and antialiasing. I have been working on a game for a while using GLUT for basic window creation. I was rendering to an offscreen buffer so that I could implement various effects like screen bulging, motion blur, refraction, etc. I also used the screen texture with antialiasing (FXAA). Now I have changed from GLUT to SFML. I switched on the built-in antialiasing and it looked much better than my version, but now I don't have the screen in a texture, so I can't use the screen effects. So my question is: how do people normally deal with this issue? Can I take advantage of SFML's antialiasing functionality and retain my effects? I thought about using glReadPixels, but that seems way too slow. Does SFML do offscreen rendering behind the scenes, and can I access that texture? This is not specific to SFML: how do AAA games do it? Do they always implement their own antialiasing techniques?
Specular intensity of non-metals (plastics) in the metalness PBS workflow. The specular color in the metalness/roughness workflow is usually defined as follows: float3 specColor = lerp(0.03f, albedoColor, metallic). The Cook-Torrance BRDF is given with the following formula. The final color will be Ci = diffColor * (n.l) + Ks * specColor * cook. My question is: won't the specular be too dim for non-metals? The Cook-Torrance term will be multiplied by 0.03 in the case of dielectrics, which makes the specular component virtually non-existent. This doesn't seem realistic to me, because smooth plastic reflects a lot of light, almost as much as metals. What am I missing?
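For reference, a sketch of the Schlick Fresnel term that is usually the relevant piece here: F0 of roughly 0.03-0.04 is only the reflectance at normal incidence, and it rises towards 1.0 at grazing angles, so dielectric speculars are dim head-on but far from non-existent overall. This is a generic illustration, not the questioner's exact BRDF:

    // Schlick's approximation of the Fresnel factor.
    // f0 is the reflectance at normal incidence (~vec3(0.03) for plastics,
    // the albedo color for metals); cosTheta = clamp(dot(H, V), 0.0, 1.0).
    vec3 fresnelSchlick(vec3 f0, float cosTheta)
    {
        return f0 + (vec3(1.0) - f0) * pow(1.0 - cosTheta, 5.0);
    }

    // Example: for a dielectric with f0 = 0.03, a grazing angle (cosTheta -> 0)
    // pushes the factor towards 1.0, i.e. near mirror-like reflection.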
How to get Pixel Coordinates of certain colors in a Texture? I have a relatively big Texture, and I try to find a certain color pixels pattern eg. White, Black, White, Green . They are lying next to each other, If I use Texture2d.Getpixels() on every pixel every frame its possible, but way too slow. Is there a "real good" way to do this? For example a shader way? The "big idea" is to make a game which is "remote playable". For example through Twitch or of a streamed youtube video. The user should not be connected to the server in any way, Instead he is sending messages or not. For this I have to find a health bar. This health bar has the left upper corner in black white black green pixels lying next to each other. In the shader I CAN find these colors, but I dont really know the coordinates. Also I think there "should" be a solution with a ... compute shader? I dont really want to render new stuff. Just check pixel colors. I just want to know where these pixel colors are. I imagined something like this If PixelColorIsWhite(x,y) If PixelColorIsBlack(x 1,y) If PixelColorIsWhite(x 2,y) If PixelColorIsGreen(x 3,y) return x 20,y 100 because the player is 20 pixels right and 100 pixels below In a shader i guess i can access the current coordinate, but I cant access the other pixels. Its paralell processing... right? So its impossible to access the "entire thing" in one go , right? I think it must work somehow with... getting the big "chunk" of texture data, a very long array of pixels, giving it to a compute shader, and letting the CS find it. I hope I could clarify the problem now ) Update I found out it (at least) should work with a compute shader. I feel like im so close to the solution, yet im getting errors. ( So... This is my C Code public ComputeShader shader public Material thisMat public RenderTexture ScreenTex public int x public int y public const string INPUTTEX "InputTexture" private void Update() RunShader() public void RunShader() RenderTexture tex new RenderTexture(1024, 786, 24) tex.enableRandomWrite true tex.Create() shader.SetTexture(0, INPUTTEX, ScreenTex) shader.SetFloat(" ScreenWidth", Camera.main.pixelWidth) shader.SetFloat(" ScreenHeight", Camera.main.pixelHeight) int data new int 2 ComputeBuffer buffer new ComputeBuffer(data.Length, sizeof(int) 2) buffer.SetData(data) shader.SetBuffer(0, "readWriteIntBuffer", buffer) thisMat.mainTexture tex int kernelHandle shader.FindKernel("CSMain") shader.SetTexture(kernelHandle, "Result", tex) shader.Dispatch(kernelHandle, 1024 8, 786 8, 1) buffer.GetData(data) x data 0 y data 1 buffer.Release() And this is my ComputeShader Code pragma kernel CSMain RWTexture2D lt float4 gt Result RWStructuredBuffer lt int gt readWriteIntBuffer Texture2D lt float4 gt InputTexture float ScreenWidth float ScreenHeight numthreads(8, 8, 1) void CSMain(uint3 id SV DispatchThreadID) if (id.x gt (uint)( ScreenWidth ScreenHeight)) return int y id.x int( ScreenWidth) int x id.x int( ScreenWidth) readWriteIntBuffer 0 999 some debug testnumbers that show me that there were no colors found readWriteIntBuffer 1 998 if (InputTexture float2(x,y) .r 1.0f amp amp InputTexture float2(x, y) .g 0.0f amp amp InputTexture float2(x, y) .b 0.0f) float2 coordinates float2((float)id.x (float) ScreenWidth, (float)id.y (float) ScreenHeight) readWriteIntBuffer 0 coordinates.x readWriteIntBuffer 1 coordinates.y This Computeshader checks if a single pixel in a texture is red and writs it in a buffer. The buffer gets read by C with getdata. it seems we're almost getting there! 
But "something" seems to be wrong, since I don't seem to get the colors I want. The compute shader does seem to work, though, since I also get the triangle pattern in my material.
What are mental ray shaders and can I use them in my own game? I'm using the FBX SDK to import and display FBX models in my OpenGL app. It works fine so far with a simple Phong shader and displays basic models. However, I have some FBX models that have custom (Mental Ray) Arch & Design materials. Can I render these models with such materials in my app? My (completely noob) understanding of how this works is that the Arch & Design thing is a shader. A graphic designer edited some data fields, and the way the model is rendered is that such data is passed to the Arch & Design shader during rendering, and the shader takes care of it (i.e. does all the lighting calculations etc.). Is it possible for me to use that flashy Arch & Design shader in my app, so that I manually parse the FBX model and send the data fields to the shader? How do I get hold of it? Any clarification will be appreciated; you can probably tell that I'm somewhat confused.
How to reproduce the 3ds Max Gradient Ramp effect? The material definition of a mesh is composed of these three components: Self-Illumination, Reflection and Refraction. Each of these components has a Gradient Ramp as a map, and the mapping mode is set to spherical environment. I'm searching for a way to reproduce these effects in a shader (the shader language doesn't matter). Is it possible? My first idea was to save the Gradient Ramp as a texture (you can see the result in the image below, reflection component) and use it in the shader through spherical environment mapping, but the result is wrong. In this image you can see the original rendering (3ds Max, reflection component), and in this image you can see my rendering. The underlying idea seems to be correct; in fact there's a correlation between the two images, but something goes wrong. Thanks for any suggestion.
Lens Distortion not working with Unity 5.3.5. I have downloaded the latest SDK from this link: https://github.com/googlevr/gvr-unity-sdk. I imported the package into my project. From the prefab I added GvrMain to my scene, set the Distortion Correction to None as mentioned in the video below, and edited the "GvrDistortion.cginc" shader: https://www.youtube.com/watch?v=yJVkdsZc9YA. But the lens distortion is not working for me. I have a doubt whether to edit the same folder or to create a new shader and include the existing shader file. I am using Unity 5.3.5 and the GoogleVRForUnity package.
HLSL Shore fading in a water shader I am modifying an old water shader for the game "MTA San Andreas" (the multiplayer modification). MTA uses HLSL and comes with some builtin predefined variables and functions. The water shader itself is already modified by me, but now I want to add shore fading to it. Objects slightly below water surface should be visible, but objects deeper in the water should be completely invisible and that effect should be configurable. I found a water shader that has shore fading included, but it was covered within very complex other things in a very large file. I managed to extract most of the necessary information (i think) but I get an error message in the game Invalid PS 3 0 input semantic "POSITION" in water.fx 139,23 I know that this line 138 has no errors and the only reason why the game throws this error is actually line 268... Debugging HLSL is a pain, horrible language. And there may be much more wrong stuff. This is the water shader https pastebin.com VDFxS7RX I started my modifications in line 268, where objdepth seems to be "wrong" and causes the compile error float objdepth input.position.z input.position.w float nonlinearobjdepth objdepth objdepth 1.0 max(1.0 objdepth, 0.000000001) float planardepth tex2D(SamplerDepth, input.textureCoords.xy).r float depth tex2D(SamplerDepth, input.textureCoords.xy).r if (nonlinearobjdepth gt depth) depth planardepth float scenedepth depth planardepth 1.0 max(1.0 planardepth,0.000000001) depth 1.0 max(1.0 depth,0.000000001) float depthfact (depth objdepth) WaterParameters1.z depthfact depthfact WaterParameters1.x depthfact depthfact (depthfact 0.50 1.0) if (scenedepth gt 0.99999) depthfact 1.0 float backside saturate(WaterParameters1.w 10.0) depthfact 1.0 backside float shorefade (planardepth objdepth) shorefade shorefade WaterParameters1.x shorefade saturate(13.40 shorefade 1000.0 0.05) depthfact shorefade finalColor.a shorefade return finalColor
What is an efficient way to manage uniforms in a game? Most engines on the market have their drawbacks, and it's difficult to find a simple, lightweight one that's open source and doesn't put you through a rather complex learning process. Writing one is a difficult task on its own, but it might not be a bad idea if what you want that engine to do is support a specific kind of game (e.g. 2.5D games on mobile devices). So, in searching for a good game engine architecture, I've found a few logistical issues. Consider this scenario: objects. Each object is comprised of two principal structures of render information: a model (geometry, mainly) and a material (that tells the object what textures and what shaders to use). Of course, it is natural to allow an object to switch its material definition on the fly. But a material encapsulates the shaders, so these drag along with them some slots for uniforms and vertex attributes. Since a uniform, for example, can be object-specific (color, specular exponent, etc.), global or superglobal (lights, weather conditions, fog, wind, etc.), or specific to a group of objects (they all have, let's say, a reflectivity factor), it means that it's wrong to put them either in the object's property region or in the material's property region. It's clear that both uniforms and attributes are always declared in the shader sections of a material, but where their values come from is an enigma within the frame of a fairly general rendering engine. You have to allow for the existence of numerous types (by semantics!) of uniforms: position, colour, bone matrices, indices, lighting parameters, etc. The big question now: how would you suggest organizing and managing uniforms? (Especially the information flow: they're declared by shaders, but their values are supplied by apparently different types of entities, some renderable, some more abstract, being themselves controllers or managers.)
DirectX11: how to use textures and samplers in slots in shaders. I have a system to render many objects, but I don't know how to render more than one object with the same shader. Let me explain: I have a sphere and a cylinder, but both objects can be rendered by different shaders. Example: Shader1 renders an object using 1 texture; Shader2 renders an object using 2 textures. Both objects need to coexist in space, so if I want to render the sphere with Shader1 I use, for Shader1's resources: context->PSSetShaderResources(0, 1, &texture); context->PSSetSamplers(0, 1, &sampler); and for Shader2's resources: context->PSSetShaderResources(1, 1, &texture); context->PSSetSamplers(1, 1, &sampler); context->PSSetShaderResources(2, 1, &texture); context->PSSetSamplers(2, 1, &sampler); In Shader1 the resources are referenced like this: Texture2D colorTexture : register(t0); SamplerState sampler : register(s0); And in Shader2 the resources are referenced like this: Texture2D colorTexture : register(t1); SamplerState sampler : register(s1); Texture2D colorTexture : register(t2); SamplerState sampler : register(s2); But what if I need to use Shader1's resources in Shader2? How do I manage those resources, or do I need to replicate Shader2 with Shader1's registers? This is the simplest example; it is part of a much more complex system with many shaders and many textures, and I don't know in which slot the resources will be set. This can be absolutely generic, for example slot 5 will be used for texture 1 of Shader2. It should be possible to render many objects with minimal shader changes, but the resources could be updated at any time. I'm using DirectX 11.
HLSL Circle all white. I have been trying to get my shader code (HLSL) to draw a simple circle, but after a day and a half I am getting nowhere. It seems people use x^2 + y^2 = r^2 and remap the texcoords, but I only get a white quad. struct VertexShaderStruct { float4 Position : POSITION0; float2 Tex0 : TEXCOORD0; }; VertexShaderStruct VertexShaderFunction(VertexShaderStruct input) { VertexShaderStruct output; float4 worldPosition = mul(input.Position, World); float4 viewPosition = mul(worldPosition, View); output.Position = mul(viewPosition, Projection); output.Tex0 = input.Tex0; return output; } float4 PixelShaderFunction(VertexShaderStruct input) : COLOR0 { float dx = 2 * input.Tex0.x - 1; float dy = 2 * input.Tex0.y - 1; float hyp = (dx * dx + dy * dy); return (hyp < 1) ? circleColor : otherColor; } I define circleColor as blue and otherColor as white, so it seems the hyp < 1 test always fails.
Interactive texture modification (such as fluid mixing) in Unity. I am working on a game where I want to allow users to mix multiple colors (similar to what is shown in this video: https://youtu.be/11UFYyv8hjs?t=316). I have the following questions: I expect that this is done using shaders; what kind of shader do I have to use for such an implementation? What would the logic be to create this type of shader? Are there other ways this feature could be implemented? Are there any open source tutorials or examples? I would appreciate any suggestions and thoughts on this topic. Thank you.
Second pass in multipass effect is ignored I am trying to render my vertecies in 2 passes, but it seems I am doing something wrong, because only one pass applying. I cannot make the second one work despite that second pass is applying. Here is my code var effectByteCode ShaderBytecode.CompileFromFile( "Content Effects SpriteBatchEffect.fx", "fx 5 0") spriteBatchEffect new Effect(this.graphicsDevice, effectByteCode) pass spriteBatchEffect.GetTechniqueByName("SpriteBatch").GetPassByIndex(0) Layout from VertexShader input signature var passSignature pass.Description.Signature layout new InputLayout(graphicsDevice, passSignature, inputElements) Effect code Texture2D Texture SamplerState TextureSampler matrix OrthoMatrix struct VertexInputType float4 position SV POSITION float4 color COLOR float2 tex TEXCOORD0 struct PixelInputType float4 position SV POSITION float4 color COLOR float2 tex TEXCOORD0 PixelInputType SpriteVertexShader(VertexInputType input) PixelInputType output output.color float4(0, 0, 0, 0) output.position float4(0, 0, 0, 0) Change the position vector to be 4 units for proper matrix calculations. input.position.w 1.0f output.position mul(input.position, OrthoMatrix) Store the texture coordinates for the pixel shader. output.tex input.tex output.color input.color return output float4 SpritePixelShader(PixelInputType input) SV TARGET return Texture.Sample(TextureSampler, input.tex) input.color float4 SpritePixelShader2(PixelInputType input) SV TARGET return Texture.Sample(TextureSampler, input.tex) input.color return float4(1,0,0,0) technique10 SpriteBatch pass P0 SetGeometryShader(0) SetVertexShader(CompileShader(vs 4 0, SpriteVertexShader())) SetPixelShader(CompileShader(ps 4 0, SpritePixelShader())) pass P1 SetGeometryShader(0) SetVertexShader(CompileShader(vs 4 0, SpriteVertexShader())) SetPixelShader(CompileShader(ps 4 0, SpritePixelShader2())) And how I am using it graphicsDevice.SetVertexBuffers(0, vertexBufferBinding) var technique tempEffect.GetTechniqueByName("SpriteBatch") for (int i 0 i lt technique.Description.PassCount i ) var localpass technique.GetPassByIndex(i) if (localpass.IsValid) localpass.Apply(graphicsDevice) graphicsDevice.DrawIndexed(indexCount, 0, 0) Could someone point me why second pass is not apply? What I am missing here? Note that I am not using SharpdDX toolkit just plain SharpDX C
Rain effect using DirectX 9 capabilities. Is it possible to achieve something similar to nVidia's rain demo using only shader model 3.0 capabilities? If yes, could you point out a few documents or web resources that are suitable candidates and do not require a heavy programming load (e.g. not more than two hard weeks of programming for one single person)? It would be nice if the answer could also contain a pro/con phrase for each proposed idea (e.g. post-processing rain shader vs. a particle-based effect).
HLSL SetVertexShader Texture2DArray Sample. I want to do some texture samples in the vertex shader, but it seems this cannot be done in the same way as in the pixel shader. The code is basically: Texture2DArray gTexture; in VS(): gTexture.Sample(samPoint, float3(x, y, z)); in PS(): gTexture.Sample(samPoint, float3(x, y, z)); technique11 main { pass P0 { SetVertexShader(CompileShader(vs_4_0, VS())); SetGeometryShader(NULL); SetPixelShader(CompileShader(ps_4_0, PS())); } } The PS() will compile and work fine, but I also need to do this sort of thing in VS(). When putting this code in the VS() I get: Error X4532: cannot map expression to vs_4_0 instruction set... I have done some googling and it looks like you can sample a texture in the VS, but I can't put enough detail together to make this work. Any help on this would be good. Just for reference, I am using DirectX 11 and VS2015.
What does this shader error in LÖVE2D mean? What could I have done wrong to get the following error message when trying to create a GLSL shader in LÖVE2D? I'm sincerely clueless about what the mistake could be, though I suspect it is a compile-time error because it happens during love.load, in which I require the shader from another file (and not during love.draw). The full code can be found here.
Pixel Shader stage did not run. I can't figure out why the pixel shader won't run. I'm using the Blinn-Phong per-pixel shader from here. The only change I've made is that I pass an additional color per vertex, which gets multiplied by the light color in the pixel shader. So far the Graphics Analyzer shows me a valid IA result and the same (okay-looking) result in the VS stage. I disabled depth/stencil for testing, and though everything looks rather fine I always see "Stage did not run. No output." when I inspect the captured frame. This is my projection matrix: projection = Matrix.PerspectiveFovLH((float)Math.PI / 4.0f, viewports[0].Width / viewports[0].Height, 0.1f, 100f); values are Width 1346, Height 800. My view is calculated like this: rotation = Quaternion.RotationYawPitchRoll(Yaw, Pitch, Roll); Vector3.Transform(ref target, ref rotation, out target); Vector3 up = Vector3.UnitY; Vector3.Transform(ref up, ref rotation, out up); view = Matrix.LookAtLH(Position, target, up); whereas Position is (X 50, Y 50, Z 300), Target is (X 50, Y 50, Z 0), and Yaw, Pitch, Roll are all 0. The world matrix is currently Matrix.Identity. The quad I try to render spans from (X 0, Y 0) to (X 100, Y 100) with 4 vertices and 6 indices. I first thought that SV_POSITION must be a normalized value, since it's obviously an SV coordinate and someone posted in another thread that this solved his problem, but then the VS stage does not show anything at all and the PS still won't run. It's been quite some years since I last worked with DirectX, so I'm not sure anymore why this happens.
DirectX 11: using tessellation and the geometry shader in a single pass. First of all, sorry for my poor English! With DirectX 11, I'm trying to create a random map entirely on the GPU. Using the hull shader stage, I'm managing LOD with tessellation. Using the domain shader stage, I'm generating the map (based on Perlin noise). Now my goal is to compute normals in the geometry shader (normal per vertex). For that, I must use vertex adjacency, as the geometry shader is capable of. But here is the problem: for tessellation, my primitives must be D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST, but for a geometry shader with 6 vertices (triangle primitive with adjacency) I must use D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST_ADJ. I think I'm missing something... It must be possible to tessellate and use the results in the geometry shader. However, it works with 3 points, but I cannot use the 3 others (they are 0.0, 0.0, 0.0)... Thank you in advance for any help.
Fresnel shader excluding one axis. I'm trying to get an outline-like effect as shown on the right in this video. Using Fresnel seems to make the whole thing go white at certain angles. How do I prevent that? I'm using Amplify to make this. Here's my current setup.
How can I use shaders to make a square have a waving effect? I'm new to using shaders for fancy effects and I'm struggling with them. I'm using DirectX 11 and HLSL. I have this square in the middle of the screen; it's just a square that I've created using 4 vertices. I want to give it a waving effect on the sides, but I don't know how to do it. This is for a little 2D game I'm making, so I have an orthographic projection matrix, which is the one I use; the view matrix is set to identity: DirectX::XMMATRIX view = DirectX::XMMatrixIdentity(); DirectX::XMMATRIX projection = DirectX::XMMatrixOrthographicOffCenterLH(0.0f, SCREEN_WIDTH, SCREEN_HEIGHT, 0.0f, 0.0f, 1.0f); This is the constant buffer I set each frame, so I can pass a float time variable to the shader: struct constant_buffer { DirectX::XMMATRIX WVP; float time; /* this value starts at 0 and increases by 1 each frame until 359 degrees */ }; constant_buffer cbuffer; The shader that I'm currently using is very simple, it just applies the matrix to each vertex to transform it: struct in_pshader { float4 position : SV_POSITION; float4 color : COLOR; }; cbuffer constant_buffer { float4x4 WVP; float time; }; in_pshader vshader(float4 position : POSITION, float4 color : COLOR) { in_pshader output; output.position = mul(position, WVP); output.color = color; return output; } float4 pshader(float4 position : SV_POSITION, float4 color : COLOR) : SV_TARGET { return color; } I'm guessing there's some sine function involved, which is why I decided to pass an angle per frame as input. I tried many combinations, but I'm such a newbie that I can't get it right. Any help on this would be very appreciated.
Vertex Shader Fundamental Workings. I understand that water ripples (e.g. a stone thrown into a pond) are often handled with vertex shaders. My first question is: are the ripples nothing more than an algorithm that is a function of time? If yes, it means the size and diameter of the ripples are not "additive": water vertices do not statefully "remember" their previous "disturbance" positions and accumulate more translation info. Rather, as a function of time, the positions of "disturbed" water vertices are freshly computed each frame per unit time. If no, it means that the vertices do accumulate disturbance translation information; the vertices are stateful. I hope the answer is "yes", because that actually makes sense to me. If the answer is no, then I feel it creates a tremendous burden on the CPU/GPU to keep track of all the state per vertex. If the answer is "neither", do tell. :) My second question is, assuming a "yes" above, how does such a "water disturbance shader algorithm" account for continuous interaction with irregular shapes? For example, please look at the video at the 40-second mark showing a car crashing through water. It is not so clear how the vertex shader knows how to make a rectangular disturbance shape (the shape of the car). Perhaps, over-simplifying, the vertex shader takes both time and a vector to generate the ripples, where the vector is the speed/direction of the car (and the shader code always makes a car-shaped rectangle no matter what). Is this the right high-level understanding of how this water trick works?
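To illustrate the stateless, function-of-time reading of the first question, a sketch of a vertex shader that displaces a water grid with a ripple depending only on the current time and a disturbance center passed in as uniforms; all names are placeholders, and real water systems often combine this with a simulated height texture instead:

    #version 330 core

    layout(location = 0) in vec3 in_position;   // flat water grid vertex

    uniform mat4  u_mvp;
    uniform float u_time;          // seconds since the splash
    uniform vec2  u_rippleCenter;  // XZ position of the disturbance
    uniform float u_amplitude;     // tuning constants, purely illustrative
    uniform float u_wavelength;
    uniform float u_speed;

    void main()
    {
        // No per-vertex state: the height is recomputed from scratch every frame
        // as a function of distance to the disturbance and the elapsed time.
        float d      = distance(in_position.xz, u_rippleCenter);
        float phase  = (d / u_wavelength - u_time * u_speed) * 6.2831853;
        float decay  = exp(-0.5 * d) * exp(-1.0 * u_time);   // fade with distance/time
        float height = u_amplitude * sin(phase) * decay;

        gl_Position = u_mvp * vec4(in_position + vec3(0.0, height, 0.0), 1.0);
    }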
Transform texture coordinates when using a shader. Assuming I define four vertices of a quad with texture coordinates that cover a whole texture or a region of a texture, I can animate these coordinates by setting a transform using SetTransform(D3DTS_TEXTURE0, &texTrans): scaling, translating, etc. If I render using a shader and still want to animate the coordinates, presumably I can pass in the same transformation matrix and multiply the coordinates in the vertex shader? Instead of Output.TextureUV = vTexCoord0 in the vertex shader, do Output.TextureUV = mul(vTexCoord0, texTrans). Is this the correct way to render an animated sprite with a shader?
Correctly Implementing SSAO I am trying to implement Normal Oriented SSAO and i'm having an issue with the results. Portions of the screen are inverted wrong, typically half way through the screen but not always, it seems tied to the camera, but can also be anything lol. Here is the resulting render And here is the SSAO Shader void main() vec3 WorldPosition texture(GPositionsTexture, In TexCoords).xyz vec3 WorldNormal normalize(UnScaleNormal( texture( GNormalsTexture, In TexCoords ).xyz )) vec3 Random texture(KernalNoiseTexture, In TexCoords KernelNoiseScale).xyz vec3 Tangent normalize(Random WorldNormal dot(Random, WorldNormal)) vec3 BiTangent cross(WorldNormal, Tangent) mat3 TBN mat3(Tangent, BiTangent, WorldNormal) float Occlusion 0.0 for (int i 0 i lt KERNEL SIZE i) get sample position vec3 Sample inverse(TBN) Kernel i Sample Sample Radius WorldPosition project sample position vec4 Offset vec4(Sample, 1.0) Offset CamData.Projection CamData.View Offset Offset.xy Offset.w Offset.xy Offset.xy 0.5 0.5 get sample depth float SampleDepth texture(GPositionsTexture, Offset.xy).z range check amp accumulate float RangeCheck abs(WorldPosition.z SampleDepth) lt Radius ? 1.0 0.0 Occlusion (SampleDepth lt Sample.z ? 1.0 0.0) RangeCheck Occlusion 1 (Occlusion KERNEL SIZE) Out Diffuse pow(Occlusion, Power)
2D day night mapping I'm looking for this kind of effect MINUS the lights and snow (Another problem). It needs to change depending on the time of year. Doesn't need snow or city lights. Now I'm pretty new to shaders (learnt them yesterday in my spare time) but so far I have achieved This moves across the screen display light on both sides. Now I'm completely lost, as to how I can make it seem like a light map I.E. it needs to be more square sharper (As it's going off the edges) amp have the bottom and top times of year.. I though maybe I could pass sphere mesh to the vertex? Or do something with the map normal. Or maybe use blending with two textures but I've looked around and it looks extremely difficult. Code I have so far Fragment Pixel shader attributes from vertex shader varying vec4 vColor varying vec2 vTexCoord our texture samplers uniform sampler2D u texture diffuse map uniform sampler2D u normals normal map values used for shading algorithm... uniform vec2 Resolution resolution of screen uniform vec3 LightPos light position, normalized uniform vec4 LightColor light RGBA alpha is intensity uniform vec4 AmbientColor ambient RGBA alpha is intensity uniform vec3 Falloff attenuation coefficients uniform float lightX X Position of light. Can also feed Y for times of the year. void main() RGBA of our diffuse color vec4 DiffuseColor texture2D(u texture, vTexCoord) RGB of our normal map vec3 NormalMap texture2D(u normals, vTexCoord).rgb int numberOfLights 3 vec3 lightPoses 3 vec3 lightDirections 3 LightX goes to 0, then back to 1. lightPoses 0 vec3(lightX 1.0, LightPos.y, LightPos.z) lightPoses 1 vec3(lightX, LightPos.y, LightPos.z) lightPoses 2 vec3(lightX 1.0, LightPos.y, LightPos.z) TODO Introduce one extra light for top and bottom. OR Figure out how to squash the Y. TODO Needs to be sharper light. vec3 Sum vec3(0.0) Go though both lights. for(int index 0 index lt numberOfLights index ) The delta position of light vec3 LightDir vec3(lightPoses index .xy (gl FragCoord.xy Resolution.xy), LightPos.z) Correct for aspect ratio LightDir.x LightDir.x (Resolution.x Resolution.y) Make it bigger. (smaller the value the bigger.) LightDir vec3(0.55, 0.4, 1.0) Determine distance (used for attenuation) BEFORE we normalize our LightDir float D length(LightDir) normalize our vectors vec3 N normalize(NormalMap 2.0 1.0) vec3 L normalize(LightDir) Pre multiply light color with intensity Then perform "N dot L" to determine our diffuse term vec3 Diffuse (LightColor.rgb LightColor.a) max(dot(N, L), 0.0) pre multiply ambient color with intensity vec3 Ambient AmbientColor.rgb AmbientColor.a Because there are more lights, take off total ambient power. Ambient vec3(1.0 float(numberOfLights)) Calculate attenuation (The amount of fade the light has.) float Attenuation 1.0 (Falloff.x (Falloff.y D) (Falloff.z D D)) the calculation which brings it all together vec3 Intensity Ambient Diffuse Attenuation vec3 FinalColor DiffuseColor.rgb Intensity Sum FinalColor gl FragColor vec4(Sum, DiffuseColor.a) Vertex Shader combined projection and view matrix uniform mat4 u projTrans "in" attributes from our SpriteBatch attribute vec4 a position attribute vec4 a color attribute vec2 a texCoord0 "out" varyings to our fragment shader varying vec4 vColor varying vec2 vTexCoord void main() vColor a color vTexCoord a texCoord0 gl Position u projTrans a position
The number of shaders a large game or game engine has. I'm wondering about the scale, basically: the number of shaders a large game or game engine has. I've seen some Metal repos, but they typically just have 1 or 2 shaders for small demos. I think I've seen a few with 5 or 10, and one with maybe 20. But I haven't seen anything with 100 or 1,000 shaders or anything like that. I'm wondering what the scale is like for typical large games or game engines: whether they are on the order of 1-10, 100, or 1000. And if it's a large number, a quick idea of the various tasks they are used for would be interesting, but not required :). Thank you!
How to have a gradient blur on objects in Unity? This is related to a previous question I've asked here: How to blur the entire scene except a specific spot in Unity? At the time I managed to solve my problem with a "hard" cutoff between the blurred and non-blurred objects. But now I need the same effect with some sort of "transition" between them, like the objects still being blurred at the edges of the circle "mask" I'm using and becoming "non-blurred" as they get closer to the center. As it is a mobile game, performance is really important, so is there any way of doing that without destroying my FPS? Thanks for your attention.
Why doesn't my simple HLSL shader work? I'm using Monogame to draw 2D primitives to the screen. To do that, rather than use included structures like VertexPositionColor, I wrote my own vertex class for 2D. public struct VertexColor IPositionable, IVertexType private static readonly VertexDeclaration Declaration new VertexDeclaration ( new VertexElement(0, VertexElementFormat.Vector2, VertexElementUsage.Position, 0), new VertexElement(8, VertexElementFormat.Color, VertexElementUsage.Color, 0) ) public VertexColor(Vector2 position, Color color) Position position Color color public Vector2 Position get set public Color Color get set public VertexDeclaration VertexDeclaration gt Declaration Here's my associated HLSL shader, meant to simply translate vertices from normalized space to screen space. define VShaderModel vs 4 0 level 9 1 define PShaderModel ps 4 0 level 9 1 matrix Projection struct VertexShaderInput float4 Position SV Position0 float4 Color Color0 struct VertexShaderOutput float4 Position SV Position0 float4 Color Color0 VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output (VertexShaderOutput)0 output.Position mul(input.Position, Projection) output.Color input.Color return output float4 PixelShaderFunction(VertexShaderOutput input) Color return input.Color technique Technique0 pass Pass0 VertexShader compile VShaderModel VertexShaderFunction() PixelShader compile PShaderModel PixelShaderFunction() Where Projection is an orthographic matrix computed in code and passed to the shader based on current window size. However, when I attempt to draw primitives, I receive the following exception An error occurred while preparing to draw. This is probably because the current vertex declaration does not include all the elements required by the current vertex shader. The current vertex declaration includes these elements SV Position0, COLOR0. Now, I'm familiar with this exception and I understand what it means. The exception is telling me that my shader does not include all required elements based on the current vertex declaration. However, as you can see, my shader does include those two elements (SV Position0 and Color0). What am I missing here?
14
The way of avoiding branching for textured and not textured objects I know that branching is an expensive operation on the GPU (not as much as it used to be, but still). The most common situation where I use branching is when I have both textured and non-textured models/objects rendered with a single shader (switching between shaders is also quite expensive, and the shader only differs in 1-2 small parts). For example, here's some pixel shader code for rendering 2D objects cbuffer constBufferPerObject float4 textureCoordORColor hasTexture is true => it's texCoord(x,y), color otherwise bool hasTexture ... float4 PS(VS OUTPUT input) SV TARGET if(hasTexture) input.TexCoord was calculated in VS (it's meaningless if hasTexture is false) return ObjTexture.Sample(ObjSamplerState, input.TexCoord) else return textureCoordORColor What would be a smart way to rewrite the PS in order to avoid branching here?
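Two hedged ways to drop the branch. Option A removes the flag entirely: bind a 1x1 white dummy texture for untextured objects and always sample, modulating by a per-object colour. Option B keeps one shader and blends the two results with lerp, with the flag stored as 0.0/1.0. objectColor and hasTextureAsFloat are illustrative names, not members of the existing constant buffer:

// Option A: 1x1 white dummy texture bound for untextured objects.
float4 PS_WhiteDummy(VS_OUTPUT input) : SV_TARGET
{
    return ObjTexture.Sample(ObjSamplerState, input.TexCoord) * objectColor;
}

// Option B: evaluate both terms and select with lerp instead of branching.
// hasTextureAsFloat replaces the bool and holds 0.0 or 1.0.
float4 PS_Blend(VS_OUTPUT input) : SV_TARGET
{
    float4 texColor = ObjTexture.Sample(ObjSamplerState, input.TexCoord);
    // flag = 0: the constant is interpreted as a colour, as in the original design.
    return lerp(textureCoordORColor, texColor, hasTextureAsFloat);
}

Option A is usually preferred: the 1x1 texture costs almost nothing and every object goes down exactly the same shader path.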
14
How to determine vertex index using Shader Model 3 or lower? I need something like SV VertexId (added in Shader Model 4) in an HLSL shader to determine which vertex is currently being handled. Unfortunately, I can compile only vs 3 0 or lower. The objective is to change the position of one specific vertex using HLSL. I can't edit the mesh and can't pass any data from the game engine, my capabilities are limited to the HLSL shader. The shader is written for only one mesh (a human face), and I need to change its shape a bit (for example, close an eye or make it smile). I already tried to locate the vertex by TEXCOORD, but had no idea how to separate it from the other triangles' vertices connected to it (placed at the same point where many triangles meet). if ( VS.TC[0] > 0.8828125 && VS.TC[1] > 0.6328125 && VS.TC[1] < 0.671875 ) Upper lip center VS.Position += VS.Normal * uUpLip Could you, please, give me any advice on how to identify and move only one specific vertex in HLSL? I need something like moving a vertex in Blender's Edit Mode (where neighbouring triangles stay connected at one point), but my attempt with TEXCOORD makes them move in different directions. Thanks
14
D3D11 SetShader States I have some questions regarding the XXSetShader and what happens after, for instance I would like to know if when XXSetShader is called the subsequent calls would be bound to that particular shader, like PSSetShaderRsources. Because at load time I am bounding the resource views needed for that particular shader, then the next one and so on, but what I found is that the resources were not bound, I need to set them again every time the XXSetShader is called. Am I doing something wrong? it's not supposed to work like that, the purpose of this is to have the least change states at runtime. Thanks.
14
HLSL Shore fading in a water shader I am modifying an old water shader for the game "MTA San Andreas" (the multiplayer modification). MTA uses HLSL and comes with some builtin predefined variables and functions. The water shader itself is already modified by me, but now I want to add shore fading to it. Objects slightly below water surface should be visible, but objects deeper in the water should be completely invisible and that effect should be configurable. I found a water shader that has shore fading included, but it was covered within very complex other things in a very large file. I managed to extract most of the necessary information (i think) but I get an error message in the game Invalid PS 3 0 input semantic "POSITION" in water.fx 139,23 I know that this line 138 has no errors and the only reason why the game throws this error is actually line 268... Debugging HLSL is a pain, horrible language. And there may be much more wrong stuff. This is the water shader https pastebin.com VDFxS7RX I started my modifications in line 268, where objdepth seems to be "wrong" and causes the compile error float objdepth input.position.z input.position.w float nonlinearobjdepth objdepth objdepth 1.0 max(1.0 objdepth, 0.000000001) float planardepth tex2D(SamplerDepth, input.textureCoords.xy).r float depth tex2D(SamplerDepth, input.textureCoords.xy).r if (nonlinearobjdepth gt depth) depth planardepth float scenedepth depth planardepth 1.0 max(1.0 planardepth,0.000000001) depth 1.0 max(1.0 depth,0.000000001) float depthfact (depth objdepth) WaterParameters1.z depthfact depthfact WaterParameters1.x depthfact depthfact (depthfact 0.50 1.0) if (scenedepth gt 0.99999) depthfact 1.0 float backside saturate(WaterParameters1.w 10.0) depthfact 1.0 backside float shorefade (planardepth objdepth) shorefade shorefade WaterParameters1.x shorefade saturate(13.40 shorefade 1000.0 0.05) depthfact shorefade finalColor.a shorefade return finalColor
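The compile error is almost certainly the POSITION input in the pixel shader: ps_3_0 cannot read the POSITION semantic, so the projected position has to be copied into a spare TEXCOORD by the vertex shader and read from there. A hedged sketch (gWorldViewProjection and the exact input layout are assumptions, keep your own):

struct PSInput
{
    float4 Position     : POSITION0; // consumed by the rasterizer, not readable in ps_3_0
    float2 TexCoord     : TEXCOORD0;
    float4 ProjectedPos : TEXCOORD7; // duplicate of Position that the pixel shader can read
};

PSInput WaterVS(float4 position : POSITION0, float2 uv : TEXCOORD0)
{
    PSInput o;
    o.Position     = mul(position, gWorldViewProjection); // assumed WVP matrix name
    o.ProjectedPos = o.Position;                          // same value, readable semantic
    o.TexCoord     = uv;
    return o;
}

// in the pixel shader:
// float objdepth = input.ProjectedPos.z / input.ProjectedPos.w;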
14
Using Jump Flooding Alghorithm for pathfinding? I am creating a distance to river texture. I am using the Jump Flood Alghorithm in a compute shader to do this using simple distance() and it works well. See code below. I want to augment this by adding blockers walls so raw distance to seed might be blocked by pixels inbetween. I need to adapt it to a more pathfinding alghorithm. Is this possible to do with JFA? Or do I need some other solution? cbuffer JFAConstants register( CBUFFER REGISTER EXTRA ) int2 gSampleOffset RWTexture2D lt uint2 gt gJumpFloodRiverMap register( UAV REGISTER 0 ) void UpdateClosest( inout int2 currentValue, int2 currentTexcoord, int2 jumpTexcoord ) int2 jumpValue gJumpFloodRiverMap jumpTexcoord if ( IsZero( jumpValue ) ) return float jumpDistance distance( (float2)currentTexcoord, (float2)jumpValue ) if ( IsZero( currentValue ) ) gJumpFloodRiverMap currentTexcoord jumpValue currentValue jumpValue return float currentDistance distance( (float2)currentTexcoord, (float2)currentValue ) if ( jumpDistance lt currentDistance ) gJumpFloodRiverMap currentTexcoord jumpValue currentValue jumpValue numthreads(TERRAIN NORMAL THREADS AXIS, TERRAIN NORMAL THREADS AXIS, 1) void cs main(uint3 groupID SV GroupID, uint3 dispatchTID SV DispatchThreadID, uint3 groupTID SV GroupThreadID, uint groupIndex SV GroupIndex) int2 texCoord dispatchTID.xy int2 currentValue gJumpFloodRiverMap texCoord int2 texCoordT texCoord int2( 0, gSampleOffset.y ) int2 texCoordTR texCoord gSampleOffset int2 texCoordR texCoord int2( gSampleOffset.x, 0 ) int2 texCoordBR texCoord int2( gSampleOffset.x, gSampleOffset.y ) int2 texCoordB texCoord int2( 0, gSampleOffset.y ) int2 texCoordBL texCoord gSampleOffset int2 texCoordL texCoord int2( gSampleOffset.x, 0 ) int2 texCoordTL texCoord int2( gSampleOffset.x, gSampleOffset.y ) UpdateClosest( currentValue, texCoord, texCoordT ) UpdateClosest( currentValue, texCoord, texCoordTR ) UpdateClosest( currentValue, texCoord, texCoordR ) UpdateClosest( currentValue, texCoord, texCoordBR ) UpdateClosest( currentValue, texCoord, texCoordB ) UpdateClosest( currentValue, texCoord, texCoordBL ) UpdateClosest( currentValue, texCoord, texCoordL ) UpdateClosest( currentValue, texCoord, texCoordTL )
14
Simple coherent noise function to use in a GLSL shader I'm looking for a simple (but especially fast) coherent noise function to use it in a shader written in GLSL. I don't need it to be excessively smooth or good looking, I just need that it has the following properties Passing in the same input value will always return the same output value. A small change in the input value will produce a small change in the output value. A large change in the input value will produce a random change in the output value. I really need it to be fast, as it will be called once for each pixel by the GPU (to have an idea of how fast it should be, I tried Perlin Noise and it crashed my application). What method should I use? I'd also like if the same pattern didn't repeat over time.
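A hedged, minimal option: hash-based value noise (not Perlin). It is cheap, deterministic for the same input, locally smooth, and effectively random over large jumps; the sin() hash constants are the well-known ones and can show precision artifacts on some mobile GPUs at very large coordinates:

// Canonical one-liner hash: same input always gives the same output.
float hash(vec2 p) {
    return fract(sin(dot(p, vec2(12.9898, 78.233))) * 43758.5453);
}

float valueNoise(vec2 p) {
    vec2 i = floor(p);
    vec2 f = fract(p);
    vec2 u = f * f * (3.0 - 2.0 * f);        // smoothstep fade between lattice points

    float a = hash(i);
    float b = hash(i + vec2(1.0, 0.0));
    float c = hash(i + vec2(0.0, 1.0));
    float d = hash(i + vec2(1.0, 1.0));

    return mix(mix(a, b, u.x), mix(c, d, u.x), u.y);
}

// To avoid visible repetition over time, offset the domain with a time uniform:
//   float n = valueNoise(uv * scale + vec2(u_time * 0.1, u_time * 0.07));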
14
How to convert this shader from Worldspace into Localspace? Based on this question (LINK) Shader quot Unlit WorldspaceTiling quot Properties MainTex ( quot Texture quot , 2D) quot white quot SubShader Tags quot RenderType quot quot Opaque quot LOD 100 Pass CGPROGRAM pragma vertex vert pragma fragment frag include quot UnityCG.cginc quot struct appdata float4 vertex POSITION float2 uv TEXCOORD0 struct v2f float2 uv TEXCOORD0 float4 vertex SV POSITION sampler2D MainTex float4 MainTex ST v2f vert (appdata v) v2f o o.vertex mul(UNITY MATRIX MVP, v.vertex) Gets the xy position of the vertex in worldspace. float2 worldXY mul( Object2World, v.vertex).xy Use the worldspace coords instead of the mesh's UVs. o.uv TRANSFORM TEX(worldXY, MainTex) return o fixed4 frag (v2f i) SV Target fixed4 col tex2D( MainTex, i.uv) return col ENDCG So this has these two issues a) If the object moves through worldspace (eg. a moving platform), the texture will appear to crawl along it. b) Rotating the object won't rotate the texture, so it will tile across it diagonally. How is this code supposed to be to fix these two problems? EDIT This is what I'm trying to achieve over all I have no idea if my current method is good or bad, but its basically Use a flat mesh created in blender with UVs unwrapped. They're all basically a bunch of quads with UVs having literally the shape same as the mesh itself. Attach a material to each mesh in Unity with a particular texture. Resize, rotate and align mesh into correct size and position. Use as physics object in a 2D sideview game. May have additional meshes attached to it as non interactive decorations, but otherwise using the same method to build them.
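A hedged rewrite of the vertex function only (the rest of the shader can stay as it is): tile in the object's local space so the pattern translates and rotates with the platform, and optionally multiply by the object's world scale so the tiles keep a constant world size. This assumes the quads lie in the object's local XY plane; use xz if yours are oriented differently:

v2f vert (appdata v)
{
    v2f o;
    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);

    // Local-space coordinates move and rotate together with the object.
    float2 localXY = v.vertex.xy;

    // Optional: keep the tile size constant in world units by applying the
    // object's scale, taken from the first two columns of _Object2World.
    float2 worldScale = float2(length(_Object2World._m00_m10_m20),
                               length(_Object2World._m01_m11_m21));

    float2 tiled = localXY * worldScale;
    o.uv = TRANSFORM_TEX(tiled, _MainTex);
    return o;
}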
14
How to apply blur effect to 2D textures using shaders? I'm making a 2D game in Monogame and I'd like to learn how to apply the blur effect for layers that are "out of focus" in the deep background or very close in the foreground using shaders. By manipulating the bitmap array of a Texture2D (it's pretty slow), I figured it would be better to draw each layer onto its own Texture2D and blur the whole thing instead of blurring individual textures. But then I learned there are shaders which are supposed to make this much faster. I've never used shaders for anything yet. How would I use shaders for the blur effect? Most of the shader tutorials for monogame xna I've seen so far deal only with 3D or do something I don't understand in 2D which doesn't seem to do the blur effect. Following DMGregory's suggestion in the comments, I decided to try the second shader which is Pixel shader applies a one dimensional gaussian blur filter. This is used twice by the bloom postprocess, first to blur horizontally, and then again to blur vertically. sampler TextureSampler register(s0) define SAMPLE COUNT 15 float2 SampleOffsets SAMPLE COUNT float SampleWeights SAMPLE COUNT float4 PixelShaderF(float2 texCoord TEXCOORD0) COLOR0 float4 c 0 Combine a number of weighted image filter taps. for (int i 0 i lt SAMPLE COUNT i ) c tex2D(TextureSampler, texCoord SampleOffsets i ) SampleWeights i return c technique GaussianBlur pass Pass1 if SM4 PixelShader compile ps 4 0 level 9 1 PixelShaderF() elif SM3 PixelShader compile ps 3 0 PixelShaderF() else PixelShader compile ps 2 0 PixelShaderF() endif Populating it with some random floats and Vector2s in the range from 0 to 1, then 0 to 100, then 100 to 100 and it just doesn't produce any effect. I tried moving the line blurEffect.CurrentTechnique.Passes 0 .Apply() before and after all the drawing methods and in between, but nothing happens. I had trouble loading it at first but then I figured it out with the content folder thing, so that doesn't seem to be the issue, or else it would probably throw an exception on the loading methods. I've seen the exception for setting the effect parameters incorrectly, so seems like I'm doing that right too. I think the problem might be with the values themselves. I don't know what I'm supposed to use there and why the 15 samples and not some other number. Ideally I would like a complete noob guide to shaders on this one example of gaussian blur applied to everything (or only to certain background textures).
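The shader above does nothing visible until SampleWeights/SampleOffsets contain a real kernel (the offsets must be in texture coordinates, i.e. multiples of 1/width or 1/height) and until the effect is actually active while the sprites are drawn. A hedged C# sketch following the XNA bloom sample this shader comes from (the method names are mine):

// Lives in your Game class; dx/dy pick the blur direction:
// (1f / textureWidth, 0) for the horizontal pass, (0, 1f / textureHeight) for the vertical pass.
void SetBlurParameters(Effect blurEffect, float dx, float dy)
{
    const int sampleCount = 15;
    float[] weights = new float[sampleCount];
    Vector2[] offsets = new Vector2[sampleCount];

    weights[0] = Gaussian(0);
    offsets[0] = Vector2.Zero;
    float total = weights[0];

    // Symmetric taps on both sides of the centre texel.
    for (int i = 0; i < sampleCount / 2; i++)
    {
        float weight = Gaussian(i + 1);
        float offset = i * 2 + 1.5f;            // sample between two texels

        weights[i * 2 + 1] = weight;
        weights[i * 2 + 2] = weight;
        offsets[i * 2 + 1] = new Vector2(dx, dy) * offset;
        offsets[i * 2 + 2] = new Vector2(dx, dy) * -offset;
        total += weight * 2;
    }

    for (int i = 0; i < sampleCount; i++)
        weights[i] /= total;                    // normalise so overall brightness is kept

    blurEffect.Parameters["SampleWeights"].SetValue(weights);
    blurEffect.Parameters["SampleOffsets"].SetValue(offsets);
}

static float Gaussian(float n, float theta = 4.0f) =>
    (float)((1.0 / Math.Sqrt(2 * Math.PI * theta)) * Math.Exp(-(n * n) / (2 * theta * theta)));

Render the background layer to a RenderTarget2D, run this once with (1f/width, 0) and once with (0, 1f/height), and pass blurEffect into SpriteBatch.Begin(...) for those passes; calling Passes[0].Apply() on its own generally has no effect because SpriteBatch sets its own shaders when the batch flushes.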
14
How do I get HDRP rendering enabled so that the shader graph will work I'm using Unity 2019.2.17 Personal, and I imported the Shader Graph package and created a PBR shader graph. I get a warning that the render pipeline is not compatible with the master node. I assume it means I need to be using HDRP or LWRP. I can't figure out how to enable HDRP. How do I do this?
14
Draw cube in GLSL shader I am working on a voxel engine. Currently my cubes are rendered as VBOs. I thought it may be better to load only the coordinates of the voxels to the shader and make it draw a cube itself. Can someone tell me if that's possible, and if so, how?
14
How to change FoV in vertex shader? Is it possible to change the field of view in a vertex shader? I zoom my camera by changing the FoV value. However, this also causes the skybox to be zoomed, which is not what I want. Is it possible to change the FoV value of the projection matrix in the vertex shader? Specifically, I pass the projection matrix by uniform to a GLSL shader. I want to change the FoV value of the projection matrix (to make a new projection matrix) in the vertex shader of the skybox.
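A hedged sketch of patching the projection inside the skybox vertex shader; all names are placeholders. In practice it is usually simpler (and cheaper) to build a second, fixed-FoV projection matrix on the CPU and pass it as a separate uniform used only by the skybox:

#version 330
in vec3 a_position;          // placeholder attribute name
uniform mat4 u_view;
uniform mat4 u_proj;         // the zoomed camera projection
uniform float u_skyFov;      // vertical FoV (radians) wanted for the skybox

void main()
{
    mat4 proj = u_proj;
    float f = 1.0 / tan(u_skyFov * 0.5);
    float aspect = proj[1][1] / proj[0][0]; // recover the aspect ratio from the matrix
    proj[0][0] = f / aspect;                // overwrite the two FoV-dependent entries
    proj[1][1] = f;

    vec4 pos = proj * u_view * vec4(a_position, 1.0);
    gl_Position = pos.xyww;                 // optional: pin the skybox to the far plane
}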
14
HLSL SetVertexShader Texture2DArray Sample I want to do some texture samples in the vertex shader, but it seems this cannot be done in the same was as when using the Pixel shader. The code is basically.. Texture2DArray gTexture VS() gTexture.Sample(samPoint, float3(x, y, z) PS() gTexture.Sample(samPoint, float3(x, y, z) technique11 main pass P0 SetVertexShader(CompileShader(vs 4 0, VS())) SetGeometryShader(NULL) SetPixelShader(CompileShader(ps 4 0, PS())) The PS() will compile and work fine but I also need to do this sort of thing in VS(). When putting this code in the VS() I get Error X4532 cannot map expression to vs 4 0 instruction set .... I have done some googling and it looks like you can sample a texture in the VS but I cant get enough detail together to make this work. Any help on this would be good. Just for reference I am using DirectX11 VS2015
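Sample() needs the screen-space derivatives that only exist during pixel shading, which is what error X4532 is complaining about. In vs_4_0 you can fetch with SampleLevel (mip level chosen explicitly) or Load instead; a minimal sketch:

Texture2DArray gTexture;
SamplerState samPoint;

float4 FetchInVertexShader(float2 uv, float slice)
{
    // The last argument is the mip level; 0 reads the full-resolution layer.
    return gTexture.SampleLevel(samPoint, float3(uv, slice), 0.0f);
}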
14
Debugging Shader Code? I'm writing a game engine, and when I use a perspective camera I get a black screen. I am not going to ask exactly why this is because there would be a lot of code to share and, frankly, I think that's a bit petty a question even to bother you all with. The trouble is that I don't know how to debug it. All that changes is my projection matrix, and if my projection matrix looks fine, I don't know why it doesn't work. Ideally I'd print out the values of various things as the shader did its calculations, but GLSL inconveniently doesn't have a printf() function. So my question is how do I debug my problem? The only thing I can think of is checking as many values as I can client side and then programming by permutation, but I've done that and gotten nowhere. Is there a way I can see what's happening in the video card? Is there a completely different technique I could be using? I'm using GLSL version 420 (and features specific to that version), so I don't think that glslDevil is an option, considering that it was last updated in 2010. EDIT I managed to solve my problem through some completely unrelated debugging.
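Lacking printf, the usual trick is to turn the quantity you distrust into a colour and read it off the screen; combined with glGetShaderInfoLog/glGetProgramInfoLog and glGetError on the client side this catches most black-screen projection bugs. A minimal sketch (the 0..100 range for w is an assumption, scale it to whatever you expect):

#version 420
in float v_debugValue;   // set in the vertex shader, e.g. v_debugValue = gl_Position.w;
out vec4 fragColor;

void main()
{
    fragColor = vec4(clamp(v_debugValue / 100.0, 0.0, 1.0),  // red: magnitude of the value
                     v_debugValue < 0.0 ? 1.0 : 0.0,         // green: flags a negative w (vertex behind the camera)
                     0.0, 1.0);
}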
14
How to get Pixel Coordinates of certain colors in a Texture? I have a relatively big Texture, and I try to find a certain color pixels pattern eg. White, Black, White, Green . They are lying next to each other, If I use Texture2d.Getpixels() on every pixel every frame its possible, but way too slow. Is there a "real good" way to do this? For example a shader way? The "big idea" is to make a game which is "remote playable". For example through Twitch or of a streamed youtube video. The user should not be connected to the server in any way, Instead he is sending messages or not. For this I have to find a health bar. This health bar has the left upper corner in black white black green pixels lying next to each other. In the shader I CAN find these colors, but I dont really know the coordinates. Also I think there "should" be a solution with a ... compute shader? I dont really want to render new stuff. Just check pixel colors. I just want to know where these pixel colors are. I imagined something like this If PixelColorIsWhite(x,y) If PixelColorIsBlack(x 1,y) If PixelColorIsWhite(x 2,y) If PixelColorIsGreen(x 3,y) return x 20,y 100 because the player is 20 pixels right and 100 pixels below In a shader i guess i can access the current coordinate, but I cant access the other pixels. Its paralell processing... right? So its impossible to access the "entire thing" in one go , right? I think it must work somehow with... getting the big "chunk" of texture data, a very long array of pixels, giving it to a compute shader, and letting the CS find it. I hope I could clarify the problem now ) Update I found out it (at least) should work with a compute shader. I feel like im so close to the solution, yet im getting errors. ( So... This is my C Code public ComputeShader shader public Material thisMat public RenderTexture ScreenTex public int x public int y public const string INPUTTEX "InputTexture" private void Update() RunShader() public void RunShader() RenderTexture tex new RenderTexture(1024, 786, 24) tex.enableRandomWrite true tex.Create() shader.SetTexture(0, INPUTTEX, ScreenTex) shader.SetFloat(" ScreenWidth", Camera.main.pixelWidth) shader.SetFloat(" ScreenHeight", Camera.main.pixelHeight) int data new int 2 ComputeBuffer buffer new ComputeBuffer(data.Length, sizeof(int) 2) buffer.SetData(data) shader.SetBuffer(0, "readWriteIntBuffer", buffer) thisMat.mainTexture tex int kernelHandle shader.FindKernel("CSMain") shader.SetTexture(kernelHandle, "Result", tex) shader.Dispatch(kernelHandle, 1024 8, 786 8, 1) buffer.GetData(data) x data 0 y data 1 buffer.Release() And this is my ComputeShader Code pragma kernel CSMain RWTexture2D lt float4 gt Result RWStructuredBuffer lt int gt readWriteIntBuffer Texture2D lt float4 gt InputTexture float ScreenWidth float ScreenHeight numthreads(8, 8, 1) void CSMain(uint3 id SV DispatchThreadID) if (id.x gt (uint)( ScreenWidth ScreenHeight)) return int y id.x int( ScreenWidth) int x id.x int( ScreenWidth) readWriteIntBuffer 0 999 some debug testnumbers that show me that there were no colors found readWriteIntBuffer 1 998 if (InputTexture float2(x,y) .r 1.0f amp amp InputTexture float2(x, y) .g 0.0f amp amp InputTexture float2(x, y) .b 0.0f) float2 coordinates float2((float)id.x (float) ScreenWidth, (float)id.y (float) ScreenHeight) readWriteIntBuffer 0 coordinates.x readWriteIntBuffer 1 coordinates.y This Computeshader checks if a single pixel in a texture is red and writs it in a buffer. The buffer gets read by C with getdata. it seems we're almost getting there! 
But "something" seems to be wrong, since I dont seem to get the colors i want. But the compute shader seems to work, since i also get the triange pattern in my material.
14
.rrr subscript for float type I'm a little confused by this construction float sdcolor MyColor.r MyColor has float4 type float rcolor sdcolor.rrr .ggg or .bbb doesn't work return float4(rcolor , rcolor , rcolor , 1) Hence the question: how is it possible to subscript anything from a float type? What is the meaning of the .rrr subscript?
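For reference, in HLSL a scalar behaves like a one-component vector, so the only valid swizzle letters on a float are x and r; ".rrr" is a replicate swizzle that broadcasts that single component into a float3, while ".ggg" fails because a float has no second component to read. A tiny illustration:

float  s  = MyColor.r;   // scalar
float3 v  = s.rrr;       // == float3(s, s, s)
float3 v2 = s.xxx;       // same thing, other letter set
// float3 bad = s.ggg;   // error: 'g' would be the second component, which a float lacks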
14
GLSL Editor and Debugger for Mac OS X with ES2 support Is there a GLSL editor for the Mac? I need it for iOS OpenGL ES 2 shaders. What is the best way to debug shaders? Regards
14
Component wise GLSL vector branching I'm aware that it usually is a BAD idea to operate separately on GLSL vec's components separately. For example use instrinsic functions, they do the calculation on 4 components at a time. float dot v1.x v2.x v1.y v2.y v1.z v2.z WRONG float dot dot(v1, v2) RIGHT Multiply one by one is bad too, since the ALU can do the 4 components at a time. vec3 mul vec3(v1.x v2.x, v1.y v2.y, v1.z v2.z) WRONG vec3 mul v1 v2 RIGHT I've been struggling thinking, are there equivalent operations for branching? For example vec4 Overlay(vec4 v1, vec4 v2, vec4 opacity) bvec4 less lessThan(v1, vec4(0.5)) vec4 blend for(int i 0 i lt 4 i) if(less i ) blend i 2.0 v1 i v2 i else blend i 1.0 2.0 (1.0 v1 i ) (1.0 v2 i ) return v1 (blend v1) opacity This is a Overlay operator that works component wise. I'm not sure if this is the best way to do it, since I'm afraid these for and if can be a bottleneck later. Tl dr, Can I branch component wise? If yes, how can I optimize that Overlay function with it?
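A branch-free, fully component-wise rewrite: evaluate both Overlay branches for all four channels and select per component with mix() driven by a 0/1 mask built from lessThan(). The loop and the if disappear entirely:

vec4 Overlay(vec4 v1, vec4 v2, vec4 opacity)
{
    vec4 mask  = vec4(lessThan(v1, vec4(0.5)));            // 1.0 where v1 < 0.5, else 0.0
    vec4 low   = 2.0 * v1 * v2;                            // "if" branch, all channels
    vec4 high  = 1.0 - 2.0 * (1.0 - v1) * (1.0 - v2);      // "else" branch, all channels
    vec4 blend = mix(high, low, mask);                     // per-component select
    return v1 + (blend - v1) * opacity;
}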
14
XNA games C# application executable works on one Win7 machine but not the other Our company wrote a game in XNA Game Studio 4 almost ten years ago. We tried to reinstall it on Win7 with only the executable. Both machines have XNA Game Studio 4.0 installed. Below are the environment parameters I can find. The one on the laptop is not rendering the effect (I am new to this, this is my guess). Laptop: Win7 Service Pack 1, DirectX 11, AMD FirePro M4000 Mobility Pro graphics (chip type AMD FirePro 0x682D). Desktop: Win7 Service Pack 1, DirectX 11, integrated Intel HD Graphics 4600 GPU (chip type Intel(R) HD Graphics family). We don't know what makes these two executables behave differently. Part of the window image is just not shown. On a second check of the source code, the difference between the image that can be shown and the one that cannot be shown on the failing laptop is that the working one uses Bitmap and the non-working one uses Texture2D with a Microsoft.Xna.Framework.Graphics Effect with passes. Any ideas? Thanks.
14
ShaderBytecode Compiler one technique multiple passes I have an effect code with a basic structure like technique TechniqueName pass FirstPass Profile fx 4 0 VertexShader RenderFirstVS GeometryShader null PixelShader RenderFirstPS pass SecondPass Profile fx 4 0 VetrexShader RenderSecondVS GeometryShader null PixelShader RenderSecondPS pass ThirdPass Profile fx 4 0 VertexShader RenderThirdVS GeometryShader null PixelShader RenderThirdPS Now I tried to compile this with using (BinaryReader reader new BinaryReader(stream)) CompilationResult result ShaderBytecode.Compile(reader.ReadBytes((int)stream.Length), "fx 4 0") if (result.HasErrors) throw new Exception(result.Message, new Exception(result.ResultCode.ToString())) Data result.Bytecode.Data stream is new MemoryStream(Encoding.Default.GetBytes(effectContent)). The Data Property (byte ) is about 1 KiB large but if I try to load it via context.InputAssembler.InputLayout new InputLayout(device, effect.Data, someElements) it crashes with following exception D3D11 ERROR ID3D11Device CreateInputLayout Input Signature in bytecode could not be parsed. Data may be corrupt or in an unrecognizable format. STATE CREATION ERROR 161 CREATEINPUTLAYOUT UNPARSEABLEINPUTSIGNATURE Any idea how I can fix this or why the error is thrown? I do not want to use multiple shaders because I reuse many parameters I do not want to reassign in a deferred shading setup.
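An input layout has to be created from the vertex shader's input signature, not from the compiled effect blob, which is most likely why CreateInputLayout rejects the data. A hedged SharpDX sketch (effectSource and someElements stand in for your own source string and input elements); the Effects11 wrapper also exposes the same signature per pass:

// Compile the VS entry point on its own, purely to obtain an input signature.
var vsResult = ShaderBytecode.Compile(effectSource, "RenderFirstVS", "vs_4_0");
using (var signature = ShaderSignature.GetInputSignature(vsResult.Bytecode))
{
    context.InputAssembler.InputLayout = new InputLayout(device, signature, someElements);
}
// With the SharpDX Effects11 wrapper the equivalent is roughly:
// effect.GetTechniqueByIndex(0).GetPassByIndex(0).Description.Signature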
14
What will happen if the argument of mix() or clamp() is above 1 or below 0? There are two magnificent intrinsics, mix() in GLSL and lerp() in HLSL, which are used to implement linear interpolation. Let's say we have a variable float v ? where ? can be anywhere from -FLOAT_MAX to FLOAT_MAX, and then we do gl_FragColor mix(value1, value2, v) So, the question is does it work the correct way under GL or DirectX? Should I EXPLICITLY normalize the value of v like this gl_FragColor mix(value1, value2, clamp(v, 0.0, 1.0))
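For reference, the GLSL specification defines mix(x, y, a) as x*(1.0 - a) + y*a with no clamping, and lerp() in HLSL behaves the same way, so an out-of-range v extrapolates past value1/value2 rather than being undefined. Clamp only when extrapolation is unwanted:

// v = 2.0 overshoots: the result is value1 + 2.0 * (value2 - value1)
vec4 extrapolated = mix(value1, value2, 2.0);
// the clamped version never leaves the value1..value2 range
vec4 clamped = mix(value1, value2, clamp(v, 0.0, 1.0));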
14
DirectX 9 Light projection I am trying to see changes of component 'z' from light space. In vertex shader component 'z' divide 'w' is not 0. But after sending float4 with texcoord1 to pixel shader its 0. All matrices are good. Here is the code float4x4 MatWorld float4x4 MatView float4x4 MatProjection float4 LightViewMatrix float4 LightProjectionMatrix float4 LightPosition texture tex0 sampler2D ShadowMap sampler state texture tex0 struct VertexOut float4 position POSITION float2 tex TEXCOORD0 float4 lightViewPosition TEXCOORD1 struct VertexIn float4 position POSITION float2 tex TEXCOORD0 VertexOut VSLight(VertexIn In) VertexOut Out In.position.w 1 Out.position mul(In.position, MatWorld) Out.lightViewPosition Out.position Out.lightViewPosition mul(Out.lightViewPosition, LightViewMatrix) Out.lightViewPosition mul(Out.lightViewPosition, LightProjectionMatrix) Out.position mul(Out.position, MatView) Out.position mul(Out.position, MatProjection) Out.tex In.tex return Out float4 PSLight(float4 Color COLOR, float2 tex TEXCOORD0, float4 lightViewPosition TEXCOORD1) COLOR float depth lightViewPosition.z lightViewPosition.w if (depth 0) return float4(0, 0, 0, 1) return float4(depth, 0, 0, 1) technique T1 pass P0 VertexShader compile vs 2 0 VSLight() PixelShader compile ps 2 0 PSLight() All objects are black. Sorry my english is slightly poor.
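One thing worth checking before anything else (hedged, since only the posted excerpt is visible): LightViewMatrix and LightProjectionMatrix are declared as float4, so the two mul() calls against them are not 4x4 transforms at all and the value interpolated into TEXCOORD1 degenerates, which would explain z/w coming out as 0 in the pixel shader. Declaring them as matrices should restore a meaningful depth:

float4x4 LightViewMatrix;
float4x4 LightProjectionMatrix;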
14
Where should shaders and lights be in a component based entity system? Where should I put the shader and the light shadow calculation? Should that be a component too? And should the rendering system know how to handle them or should there be a separate light system? I'm specifically talking about about a 2D system, but it should be the same in for 3D I think.
14
Strauss model no specular component I implemented the Strauss model with the metalness, transparency and smoothness parameters, taking the formulas from the book "Programming vertex geometry and pixel shaders", this is how I implemented it float fresnel(float x) float Kf 1.12 float result 1.0 pow(x Kf,2.0) 1.0 (Kf Kf) result 1.0 pow(1.0 Kf,2.0) 1.0 (Kf Kf) return result float shadow(float x) float Ks 1.01 float result 1.0 pow(1.0 Ks,2.0) 1.0 pow(x Ks,2.0) result 1.0 pow(1.0 Ks,2.0) 1.0 (Ks Ks) return result vec4 Strauss(vec3 L, vec3 N, vec3 V, float smoothness, float metalness, float transparency) vec3 H reflect(L,N) float HdotV max(dot(H,V),0.0) float NdotL max(dot(N,L),0.0) float NdotV max(dot(N,V),0.0) float k 0.1 Diffuse component vec3 diffuse NdotL (1.0 metalness smoothness) (1.0 pow(smoothness,3.0)) (1.0 transparency) vec3(materialDiffuse) Specular component float r (1.0 transparency) (1.0 pow(smoothness,3.0)) (1.0 transparency) float j fresnel(NdotL) shadow(NdotL) shadow(NdotV) float reflect min(1.0,r j (r k)) vec3 C1 vec3(1.0,1.0,1.0) vec3 Cs C1 metalness (1.0 fresnel(NdotL)) (vec3(materialSpecular) C1) vec3 specular pow( HdotV, 3.0 (1.0 smoothness)) reflect Cs return max(lightDiffuse vec4(diffuse,1.0) ,0.0) max(lightSpecular vec4(specular,1.0), 0.0) The parameters metalness, smoothness and transparency are uniforms. I try to vary them, but I cannot get something similar to this result This is what I get with roughness 0.9, transparency 0.1 and metalness 0.1 It is practically equal to a lambert shader! I tried a lot of parameters combination, but I never get a shiny teapot like it should be. I also debugged the shader, and it seems like the specular component is zero in many parts of the teapot. I tried to add an if that makes look red all the areas where the specular component is zero, that's the result I checked many times the formulas and the implementation, and it seems like I've done all correctly. What's the problem?
14
I don't think my shaders are working, looking for help Using sharpdx(directx 11) developing on UWP. This is a link to a previous question of not being able to compile the shader files(written in hlsl) How to compile shader files in UWP Later I have found some method to avoid the original exception(unable to open or find shader files) but still the shaders seem not to be working. My method to quot avoid the original exception quot is to change the file path to System.IO.Path.Combine(Windows.ApplicationModel.Package.Current.InstalledLocation.Path, quot ProjectName quot ) If the method works, I would surely post it as an answer to the previous question The current problem is no display on the screen(all black), while I can check the stored data in the vertex buffer using graphic diagnostics tools built in visual studio. Here is my code for the shaders, problem may be going from here because this is the first shader file I've written. Please have a look and discuss may things go wrong. pixel shader file Pixel PS.fx struct VertexOut float4 PosH SV POSITION float4 Color COLOR float4 PS(VertexOut pin) SV Target return pin.Color vertes shader file Transf VS.fx cbuffer dataBuffer register(b0) matrix ViewProjection struct VertexIn float3 PosL POSITION float4 Color COLOR struct VertexOut float4 PosH SV POSITION float4 Color COLOR VertexOut VS(VertexIn vin) VertexOut vout Transform to homogeneous clip space. vout.PosH mul(float4(vin.PosL, 1.0f), ViewProjection) Just pass vertex color into the pixel shader. vout.Color vin.Color return vout where I compiled them byte vertexShaderByteCode ShaderBytecode.CompileFromFile(this.path quot Transf VS.fx quot , quot VS quot , quot vs 5 0 quot ) this.vertexShader new D3D11.VertexShader( device, vertexShaderByteCode ) this.inputSignature new ShaderSignature(vertexShaderByteCode) this.pixelShader new D3D11.PixelShader( device, ShaderBytecode.CompileFromFile(this.path quot Pixel PS.fx quot , quot PS quot , quot ps 5 0 quot ) ) data structure definition public struct ScatterVertex SharpDX.Vector3 Position SharpDX.Color4 Color inputlayout initialization this.inputLayout new D3D11.InputLayout( this.device, inputSignature, new SharpDX.Direct3D11.InputElement new D3D11.InputElement( quot Position quot , 0, SharpDX.DXGI.Format.R32G32B32 Float, 0 ), new D3D11.InputElement( quot Color quot , 0, SharpDX.DXGI.Format.R32G32B32A32 Float, I'm not very sure whether or why I should use this option 0 ) ) Here's the projection matrix in the camera class, which is later sent to the constant buffer private Matrix View get return Matrix.LookAtLH(this.Eye, this.Target, this.Up) private Matrix Proj get return Matrix.PerspectiveFovLH(this.Fov, this.Aspect, this.Near, this.Far) public Matrix WorldViewProject get Matrix wvp (this.View this.Proj) wvp.Transpose() return wvp If you need anything else to solve this, just let me know and I'll manage to get it. Thank you!
14
Simple square vertex lifting shader I am trying to rebuild the fur effect in Viva Pinata. Here each square becomes a pattern of fur. I imagine the process to be like this... you lift one end of the triangles. Now I need to achieve "lifting one end of the square". I can use either a vertex, fragment, or geometry shader. However, I am clueless when it comes to determining which vertex is the "end of the square", so that I know which vertex to lift up.
14
Rain effect using DirectX 9 capabilities Is it possible to achieve something similar to nVidia's rain demo using only shader model 3.0 capabilities? If yes, could you point out a few documents web resources that are suitable candidates and do not require a heavy programming load (e.g. not more than two hard weeks of programming for one single person)? It would be nice if the answer could also contain a pro con phrase for the proposed idea (e.g. postprocessing rain shader vs. a particle based effect).
14
Other displacement deformation methods than vertex displacement? Using vertex displacement on low res models lead to artifacts, and I imagine tessellation might be a bit slow, if I want it pixel perfect. I know signed distance fields can be used, though I doubt that's fast either. Are there any other approaches? Is using tessellation fast enough for a large scene with hundreds of constantly deforming objects?
14
Is it possible to retrieve shader function names associated with a technique pass using the DirectX Effect API? For example, given the pass pass p0 SetVertexShader(CompileShader(vs 4 0, VSFunction())) SetPixelShader(CompileShader(ps 4 0, PSFunction())) Is it possible to retrieve the names VSFunction and PSFunction? It doesn't look like any of the associated shader descriptors actually contain the name of the entry point.
14
Shader optimization Cg HLSL pseudo and via multiplication HLSL Cg do not allow texture fetching inside conditional blocks. To get around this I am first checking a variable and performing some computations, afterwards I set a float flag to 0.0 or 1.0, depending on the computations. I'd like to trigger a texture fetch only if the flag is 1.0 or not null. I kind of hoped this would do the trick float4 TU0 atlas colour pseudoBool tex2Dlod(TU0 texture, float4(tileCoord, 0, mipLevel)) I want to know if pseudoBool is 0, will the texture fetch function still be called and produce overhead? I was hoping to prevent it from getting executed via this trick that usually works in plain C C .
14
Why can't my .exe find my .fx file? I'm having a problem with my .fx file in my D3D 11 application. I can run the application just fine from Visual Studio, but when I run the .exe it fails when trying to load the .fx file. I've checked the file name in the code and tried all sorts of paths but I can't figure this out, why is this happening?
14
How does the Apple Metal API distinguish uniforms from vertex buffers? I am not sure how Metal distinguishes uniforms from vertex buffers. As far as I know, the code for passing uniforms in buffers is the same as for vertices [self.commandEncoder setVertexBuffer:positionBuffer offset:0 atIndex:0] [self.commandEncoder setVertexBuffer:uniformBuffer offset:0 atIndex:1] In the shader code there is no keyword defining some struct as uniforms, so how does Metal know? In OpenGL GLSL there was a uniform keyword, which was clear to me, but I can't figure out how it is solved in Metal.
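As far as I can tell Metal simply does not distinguish them: both are plain MTLBuffers bound to vertex-buffer slots, and the shader's parameter list decides how each slot is interpreted. A hedged sketch of a matching vertex function (struct and function names are placeholders):

#include <metal_stdlib>
using namespace metal;

struct VertexIn  { packed_float3 position; };
struct Uniforms  { float4x4 modelViewProjection; };
struct VertexOut { float4 position [[position]]; };

vertex VertexOut basic_vertex(
    const device VertexIn* vertices [[ buffer(0) ]],   // slot 0: setVertexBuffer(positionBuffer, atIndex: 0)
    constant Uniforms&     uniforms [[ buffer(1) ]],   // slot 1: setVertexBuffer(uniformBuffer,  atIndex: 1)
    uint vid [[ vertex_id ]])
{
    VertexOut out;
    out.position = uniforms.modelViewProjection * float4(float3(vertices[vid].position), 1.0);
    return out;
}

So the "uniform-ness" comes only from the constant address space (read-only, uniform across the draw) and from which index you used in setVertexBuffer; there is no dedicated keyword as in GLSL.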
14
Do shader compilers typically know not to look up unused texture channels? I have a texture that's from the color attachment of an FBO in OpenGL ES 2.0, so I have limited control over the number of channels in the image. Suppose I only need the color from the R channel of this texture. If I immediately swizzle out the R in the fragment shader like this float brightness texture2D(u texture, v texCoord).r can I expect the GPU to know not to bother looking up G, B, and A? I figure this is something that might vary from device to device, depending on the optimizations of the shader compiler, but to me this looks like a very obvious and easy optimization. I'd like to know if I can be reasonably confident that almost all GPUs will make the optimization. This shader is a multiple texture shader, and I'd hate to get surprised by some percentage of devices running it much slower than what I've tested on. If I cannot, then I may want to take pains to use a different method of generating my image so I can ensure a texture with fewer channels.
14
Wrap texels between desired values I'm using a basic pixel shader here uniform sampler2D texture uniform float pixel threshold void main() float factor 1.0 (pixel threshold 0.001) vec2 pos floor(gl TexCoord 0 .xy factor 0.5) factor gl FragColor texture2D(texture, pos) gl Color It works great in my game. But on a new enemy it doesn't. This enemy is different because it uses a texture atlas one big texture holding more frames than just the character. The sampling gets messed up because factorapproaches 0 so does the xy coord therefore sampling some pixels outside of the character's sheet. Likewise when pixel threshold gets small, the xy coord goes outside of the character's spot in the texture. Clamping is not the solution because we want it to wrap around. I've tried many ways of wrapping but not seem to offer the desired effect of the original. I tried using 4 uniforms left,top,width,height and passing in the masking values used when drawing the sprite from the spritesheet. vector2 size mySprite.getSize() float left mySprite.getTextureRect().left size.x float top mySprite.getTextureRect().top size.y float width mySprite.getTextureRect().width size.x float height mySprite.getTextureRect().height size.y myShader.setUniformf("left", left) ... myShader.setUniformf("height", height) How can I have the pixel blur shader stay within the texel ranges provided by the uniform? Thanks.
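A hedged way to use those four uniforms: convert the atlas UV into the sprite's local 0..1 space, snap there, wrap with fract() so the coordinate never leaves the frame, then map back into the atlas. left/top/width/height are assumed to be in 0..1 atlas UV space, as computed in the host code above:

uniform sampler2D texture;
uniform float pixel_threshold;
uniform float left;    // sub-rect origin and size in 0..1 atlas UV space
uniform float top;
uniform float width;
uniform float height;

void main()
{
    float factor = 1.0 / (pixel_threshold + 0.001);

    // Work in the sprite's local 0..1 space so snapping stays inside the frame.
    vec2 local   = (gl_TexCoord[0].xy - vec2(left, top)) / vec2(width, height);
    vec2 snapped = floor(local * factor + 0.5) / factor;
    snapped      = fract(snapped);                        // wrap instead of clamping

    vec2 pos = vec2(left, top) + snapped * vec2(width, height);
    gl_FragColor = texture2D(texture, pos) * gl_Color;
}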
14
What is a fragment in 3D graphics programming? What is a fragment in a fragment shader? Wikipedia says that In general, a fragment can be thought of as the data needed to shade the pixel, plus the data needed to test whether the fragment survives to become a pixel (depth, alpha, stencil, scissor, window ID, etc.) So is it textures, vertices or something else?
14
How can I make a tendril flame like aura visual effect? I am a bit new to UE4 and I'm trying to get a tendril flame like aura like the picture below. Does anyone know how I would go about this? Should I use post processing or particles?
14
DirectX11 Using Multiple Shaders I currently have a scenario where I am rendering terrain with a shadow map. I have two passes, one for the depth buffer to create the shadow map (which is rendered to a texture) and a second that takes the shadow map texture as input and actually shadows the terrain, the terrain gets drawn once for each pass. Within the second shader I also texture the terrain. I now want to split the second shader so that shadows and texturing are separate shaders. How do I go about getting the output from the shadow shader to the texture shader (or vice versa), or do I simply do it in the same way as I have with the shadows (rendering to a texture, passing it as an input).
14
Writing a leather shader I'm trying to write a leather material shader. I have a normal map, bump map (grayscaled), specular map, diffuse map, cube maps. I have done the following version 100 precision highp int precision highp float uniform sampler2D diffuseColorMap uniform sampler2D ambientOcclusionMap uniform sampler2D normalMap uniform sampler2D specularMap uniform sampler2D bumpMap uniform samplerCube envMap varying vec2 texCoord 2 varying vec3 viewWorld uniform float reflectionFactor uniform float diffuseFactor uniform float opacity varying vec3 eyeVector varying mat3 world2Tangent varying vec3 lightVec varying vec3 halfVec varying vec3 eyeVec void main() vec3 normal 2.0 texture2D (normalMap, texCoord 0 ).rgb 1.0 normal normalize (normal) compute diffuse lighting float lamberFactor max (dot (vec3(0.0,0.0,1.0), normal), 0.1) vec3 bumpNormal texture2D(bumpMap, texCoord 0 ).rgb vec3 normalWorld normalize(world2Tangent bumpNormal) vec3 refDir viewWorld 2.0 dot(viewWorld,normalWorld) normalWorld vec4 diffuseMaterial texture2D (diffuseColorMap, texCoord 0 ) vec4 diffuseLight vec4(1.0,1.0,1.0,1.0) In doom3, specular value comes from a texture vec4 specularMaterial texture2D (specularMap, texCoord 0 ) vec4 specularLight vec4(1.0,1.0,1.0,1.0) float shininess pow (max (dot (halfVec, normal), 0.0), 2.0) vec4 reflection textureCube(envMap, refDir) gl FragColor diffuseMaterial diffuseLight lamberFactor gl FragColor specularMaterial specularLight shininess gl FragColor reflection 0.3 The result is shown My question is how would I use the bump map (Grayscale) to the result of the reflection or what's wrong in my shader ?
14
Fragment shader for lighting in isometric perspective What I'm trying to do is to achieve basic lighting in 2D from an isometric perspective. Here I have 2 textures that are used as tiles for the ground Color Normal map I have a fragment shader that was adjusted a bit to my needs, but mostly made from this tutorial https github.com mattdesl lwjgl basics wiki ShaderLesson6 Here is my fragment shader uniform sampler2D texture uniform sampler2D normal texture values used for shading algorithm... uniform vec2 Resolution resolution of screen uniform vec4 AmbientColor ambient RGBA alpha is intensity uniform vec3 LightPos 3 light position, normalized uniform vec4 LightColor 3 light RGBA alpha is intensity uniform vec3 Falloff 3 attenuation coefficients void main() vec4 DiffuseColor texture2D(texture, gl TexCoord 0 .xy) vec3 NormalMap texture2D(normal texture, gl TexCoord 0 .xy) vec3 Sum vec3(0.0) for (int i 0 i lt 3 i ) The delta position of light vec3 LightDir vec3(LightPos i .xy (gl FragCoord.xy Resolution.xy), LightPos i .z) Correct for aspect ratio LightDir.x Resolution.x Resolution.y Determine distance (used for attenuation) BEFORE we normalize our LightDir float D length(LightDir) normalize our vectors vec3 N normalize(NormalMap 2.0 1.0) vec3 L normalize(LightDir) Pre multiply light color with intensity Then perform "N dot L" to determine our diffuse term vec3 Diffuse (LightColor i .rgb LightColor i .a) max(dot(N, L), 0.0) pre multiply ambient color with intensity vec3 Ambient AmbientColor.rgb AmbientColor.a calculate attenuation float Attenuation 1.0 (Falloff i .x (Falloff i .y D) (Falloff i .z D D)) the calculation which brings it all together vec3 Intensity Ambient Diffuse Attenuation vec3 FinalColor DiffuseColor.rgb Intensity Sum FinalColor gl FragColor gl Color vec4(Sum, DiffuseColor.a) This shader WORKS with traditional 2D perspective. That's because in traditional perspective the ground tiles normal map would have a color scheme closer to something like this, just wouldn't be tilted to fit an isometric perspective of course. The issue How do I go about adjusting the math in my fragment shader to fit an isometric perspective and work with the "greenish" ground normal maps. Here's what I'm currently getting with this shader. Light should be equally going in all directions, but it's as if it's going down a slope.
14
Need help transforming DirectX 9 skybox hlsl shader to DirectX 11 I am in the middle of implementing a skybox to my game. I have been following this tutorial http rbwhitaker.wikidot.com skyboxes 2. I am using MonoGame as a framework and in order to support both Windows and Windows 8 metro I need to compile the shader with pixel and vertex shader 4. compile vs 4 0 level 9 1 compile ps 4 0 level 9 1 However some of the hlsl syntax has been updated with DX10 and DX11. I need to update this hlsl code float4x4 World float4x4 View float4x4 Projection float3 CameraPosition Texture SkyBoxTexture samplerCUBE SkyBoxSampler sampler state texture lt SkyBoxTexture gt magfilter LINEAR minfilter LINEAR mipfilter LINEAR AddressU Mirror AddressV Mirror struct VertexShaderInput float4 Position POSITION0 struct VertexShaderOutput float4 Position POSITION0 float3 TextureCoordinate TEXCOORD0 VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output float4 worldPosition mul(input.Position, World) float4 viewPosition mul(worldPosition, View) output.Position mul(viewPosition, Projection) float4 VertexPosition mul(input.Position, World) output.TextureCoordinate VertexPosition CameraPosition return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 return texCUBE(SkyBoxSampler, normalize(input.TextureCoordinate)) technique Skybox pass Pass1 VertexShader compile vs 2 0 VertexShaderFunction() PixelShader compile ps 2 0 PixelShaderFunction() I quess I need to change Texture into TextureCube, change sampler, swap texCUBE() with TextureCube.Sample() and change PixelShader return semantic to SV Target0. I'm very new in shader languages and any help is appreciated!
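A hedged sketch of the Shader Model 4 side of the conversion (only the parts that change are shown; the vertex shader body can stay as it is). TextureCube plus SamplerState replaces the texture/samplerCUBE pair, Sample replaces texCUBE, and the pixel shader output becomes SV_Target0; whether the sampler-state block is honoured depends on the effect framework MonoGame's MGFX uses, so set filtering from C# if it is ignored:

TextureCube SkyBoxTexture;
SamplerState SkyBoxSampler
{
    Filter = MIN_MAG_MIP_LINEAR;
    AddressU = Mirror;
    AddressV = Mirror;
};

float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target0
{
    return SkyBoxTexture.Sample(SkyBoxSampler, normalize(input.TextureCoordinate));
}

technique Skybox
{
    pass Pass1
    {
        VertexShader = compile vs_4_0_level_9_1 VertexShaderFunction();
        PixelShader  = compile ps_4_0_level_9_1 PixelShaderFunction();
    }
}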
14
Understanding diffuse lighting in The Division Here is a screenshot of the main character walking under a bright lamp. His hat, which was originally dark grey, turns completely white. My question is how can such a light source do this without making everything around it super bright? Because that's what my humble attempts resulted in. I can only think of two options: either they use a standard diffuse formula (it has got to be diffuse light as it does not react to camera movement) and have a light source that fades extremely quickly (faster than the standard quadratic attenuation model), or they use some clever shaders for clothes to make them "catch light" so fast. Maybe there is an easier solution? Here are two more shots, the floor is not nearly as bright as the character that goes under the lamp. UPDATE I think I got my problem they have a different attenuation formula. I think if I improve my attenuation I would be able to achieve similar results.
14
HLSL texture sampler always returns white I'm facing in problem in HLSL with Monogame that I can't figure out. The gist is that sampling from a texture seems to always return white rather than the texture's actual color. My pixel shader code is below, and then I'll say more. texture2D fillTexture sampler fillSampler sampler state Texture lt fillTexture gt sampler s0 struct VertexShaderOutput float4 Color COLOR0 float4 Position SV POSITION0 float2 TexCoords TEXCOORD0 float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 return tex2D(fillSampler, input.TexCoords) technique Technique0 pass Pass0 PixelShader compile ps 4 0 PixelShaderFunction() First, in my C code, I'm using the SpriteBatch to draw a large rectangle. That part works fine. Second, I have a smaller texture that I'm passing as a parameter into my pixel shader (fillTexture). No matter what I try, the resulting large rectangle is pure white. I've already verified that my shader is being applied, as the following code results in a red block. float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 return float4(1, 0, 0, 1) I've also verified that my texture coordinates within the pixel shader function are correct. The following code produces the image below. float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 float2 texCoords input.TexCoords return float4(texCoords.x, texCoords.y, 0, 1) I've also verified that the fill texture itself has valid colors (i.e. it isn't pure white itself). Apart from that, I've tried every combination I've seen. I've tried using sampler2D rather than sampler, fillTexture.Sample rather than tex2D, even compiling with ps 4 0 level 9 1 and level 9 3. I've tried using SamplerState rather than sampler state, binding registers, everything I can think of. Any help would be greatly appreciated. Thank you )
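A frequent cause with SpriteBatch plus a custom effect, offered as a hedged guess: when the batch flushes it binds the sprite's own texture to sampler s0, which can replace whatever you expected in slot 0, and an effect parameter that is never assigned ends up sampling an unbound texture. Putting the fill texture in an explicit second register and assigning it through the parameter usually fixes the all-white output:

texture2D fillTexture;
sampler2D fillSampler : register(s1) = sampler_state   // keep s0 free for the SpriteBatch texture
{
    Texture = <fillTexture>;
};

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    return tex2D(fillSampler, input.TexCoords);
}

And on the C# side, before drawing:

effect.Parameters["fillTexture"].SetValue(fillTex);
// fallback if the parameter gets optimised away: GraphicsDevice.Textures[1] = fillTex;
spriteBatch.Begin(SpriteSortMode.Deferred, null, null, null, null, effect);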
14
Converting Shader to ShaderGraph for URP conversion I know there are a bunch of threads asking for help with converting old shaders to ShaderGraph, and this one is no different. I spent a lot of time trying to get a shader to cast shadows from sprites that also works with transparency. Eventually landing on this thread I was able to work out the following shader that did what I wanted Shader quot Custom SpriteShadowWithAlpha quot Properties PerRendererData MainTex( quot Texture quot , 2D) quot white quot EffectColor1( quot Effect Color quot , Color) (1,1,1,1) Crossfade( quot Fade quot , float) 0 FlashColor( quot Flash Color quot , Color) (1,1,1,1) FlashAmount( quot Flash Amount quot ,Range(0.0,1.0)) 0 Cutoff( quot Alpha Cutoff quot , Range(0,1)) 0.9 Color ( quot Color quot , Color) (1,1,1,1) Toggle( ALPHABLEND ON) ALPHABLEND ON( quot Enable Dithered Shadows quot , Float) 0.0 SubShader Tags quot Queue quot quot Transparent quot quot IgnoreProjector quot quot True quot quot RenderType quot quot TransparentCutOut quot quot PreviewType quot quot Plane quot quot CanUseSpriteAtlas quot quot True quot Cull Off Lighting Off ZWrite Off Blend SrcAlpha OneMinusSrcAlpha CGPROGRAM pragma surface surf Lambert alpha blend fullforwardshadows alphatest Cutoff pragma target 3.0 struct Input fixed2 uv MainTex fixed4 color COLOR sampler2D MainTex fixed4 EffectColor1 fixed Crossfade fixed4 FlashColor float FlashAmount void surf(Input IN, inout SurfaceOutput o) fixed4 col tex2D( MainTex, IN.uv MainTex) fixed4 returnColor lerp(col, col EffectColor1, Crossfade) EffectColor1.a col (1.0 EffectColor1.a) o.Albedo returnColor.rgb IN.color.rgb o.Alpha col.a IN.color.a o.Albedo lerp(o.Albedo, FlashColor.rgb, FlashAmount) ENDCG Fallback quot Standard quot After that I found that using the built in render pipeline was the difference between 40 and 7 GPU utilization when converted to URP using a different sprite shadow shader. Problem is the URP shader I found did make sprites cast shadows, but all the lighting was off, and messing with the X rotation on the sprites to get them to stand off the ground would cause them to explode into balls of white light from the directional light....I don't get why. So I set off to try and convert my old working Shader to ShaderGraph I have been trying to deduce what parts of this code corelate to what blocks exist in ShaderGraph. I understand when 2 variables have a between them I need to use a multiply block After about 3 hours of linking different blocks I was left with....nothing and a mess I was wondering if anyone could point me in the right direction as to what blocks correspond to what variables. Like the code void surf(Input IN, inout SurfaceOutput o) I am not too sure what sort of block this would be, it seems like a function, and I have no clue how to make that visually or the block of quot Tags quot no idea where to start there or the toggling of ALPHABLEND ON