FXC Error X3501 'main' entrypoint not found I am trying to compile a vertex shader using VS2013, but every time I try, FXC returns the following error: error X3501: 'main': entrypoint not found. I've reduced the vertex shader to its simplest form and yet I'm still getting the same result. DefaultVS.hlsl: #include "Include.hlsl" cbuffer CameraTransform { float4x4 ViewProjMat; } VS_OUT main(VS_IN input) { VS_OUT result; result.Position = mul(input.Position, mul(input.WorldMat, ViewProjMat)); return result; } Include.hlsl: struct VS_IN { float4 Position : POSITION; float4x4 WorldMat : INSTANCE_TRANSFORM; }; struct VS_OUT { float4 Position : SV_POSITION; }; And the properties of both files: Zi, E "main", Od, Fo "Path To Output DefaultVS.cso", vs "5 0", nologo for DefaultVS.hlsl, and Zi, Od, Fo "Path To Output Include.cso", nologo for Include.hlsl.
Fragment shader operations before vector transformations I feel like I'm misunderstanding how to work with vector fragment shaders. My vector shader is as follows: uniform mat4 uVMatrix; // view (camera transformations) uniform mat4 uMMatrix; // model (object transformations) uniform mat4 uPMatrix; // projection attribute vec4 aVertexPosition; // passed in attribute vec4 aVertexColor; varying vec4 vColor; void main() { gl_Position = uPMatrix * uVMatrix * uMMatrix * aVertexPosition; vColor = aVertexColor; // pass the vertex's color to the fragment shader } Pretty simple. Right now I just have a simple square that I'm transforming in 3D space and drawing. I want to make a see-through circle in the middle of the square, as follows. This square has 4 vertices that I transform in the vector shader. Here's my fragment shader: precision mediump float; // how precise to be with floats varying vec4 vColor; // interpolated from the vertices void main() { gl_FragColor = vColor; } Now, I've seen I have access to gl_FragCoord, but that coordinate is after all the vector transformations, right? How can I manipulate the pixels in the square, before all the projection transformations, etc.? I don't think I can do it in the vector shader, as there's only 4 vectors...
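Since the goal is to work in the square's own space rather than in screen space, one approach is to give the quad a 0..1 coordinate of its own and cut the hole per fragment. A minimal sketch, assuming an extra attribute and varying that the shaders above do not have yet:

```glsl
// Hypothetical names: aQuadCoord would be a new per-vertex attribute holding (0,0)..(1,1)
// across the square, forwarded from the vertex shader as vQuadCoord.
precision mediump float;
varying vec4 vColor;
varying vec2 vQuadCoord;   // interpolated 0..1 position inside the square

void main() {
    float d = distance(vQuadCoord, vec2(0.5)); // distance from the square's centre
    if (d < 0.25) discard;                     // see-through circle, radius 0.25
    gl_FragColor = vColor;
}
```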
Low level GPU code and Shader Compilation Bear with me, because I will raise several questions at once. I still feel, though, that overall this can be treated as one question that may be answered succinctly. I recently dove into solidifying my understanding of the assembly language, low level memory operations, CPU structure, and program optimizations. This also sparked my interest in how higher level shading languages, GLSL and HLSL in particular, are compiled and optimized, as well as what formats they are reduced to before machine code is generated (assuming they are not converted directly into machine code). After a bit of research into this, the best resource I've found is this presentation from ATI about the compilation of and optimizations for HLSL. I also found sample ARB assembly code. This sort of addressed my original curiosity, but it raised several other questions. The assembler code in the ATI presentation seems like it contains instructions specifically targeted for the GPU, but is this merely a hypothetical example created for the purpose of conceptual understanding, or is this code really generated during shader compilation? If so, is it possible to inspect it, or even write it in place of the higher level syntax? My initial searches for an answer to the last question tell me that this may be disallowed, but I have not dug too deep yet. Also, along the same lines, are GLSL shader programs compiled into ARB assembly code before machine code is generated, and is it possible to write direct ARB assembly? Lastly, and perhaps what I am most interested in finding out are there comprehensive resources on shader compilation and low level GPU code? I have been unable to find any thus far. I ask simply because I am curious )
Subtract waves from tilemap I'm wondering how I could create a shader that would take a randomly generated shape like this and turn it into something more like this, essentially just creating a top-down view of ocean waves that would flow in and out. I would prefer to be able to subtract this shape from a tilemap. Any resources or pseudo code on how I could get started with this would be great.
What are Hull, Domain and Geometry shaders used for? I've done my fair share of 3D game programming for my (former) employer, and also in my own custom engines for my own indie games. Initially, I started with Direct3D 9 and D3DX9, which pretty much did everything for me and didn't require me to think in terms of shaders at all. After that, I wrote my first Direct3D 9 shaders, but mostly used one very simple shader for everything I did. In the most recent iteration of my game engine, I moved to Direct3D 11, and with that I created lots of shaders. I did GPU skinning, GPU calculated particles, lots of lighting and post processing effects, all on the GPU. Really cool stuff. So far I have only used vertex and pixel/fragment shaders. Even though there are still lots of things I haven't done, I believe I have a solid knowledge of what the vertex and pixel/fragment shaders do, and how that all fits into the entire 3D pipeline. Catching up with more recent developments, I've become very interested in the newer shader stages. That is, the Geometry Shader, and even newer, the Hull and Domain shaders. I have never used these stages, but from what I know, the Geometry shader, if enabled, is run after the vertex shader, once for each transformed vertex (or once per primitive?) and allows you to discard vertices (and primitives?), and create new ones (which I guess go back to the beginning of the pipeline?). My guess is that the main use of the geometry shader would be to programmatically generate geometry on the GPU. A common usage would be to create billboard quads based on a single vertex, but I don't really visualize many other common scenarios apart from generating fractals and other stuff you can generate 100% programmatically. As for the Hull and Domain shaders, it seems like they're related to tessellation (creating smoother surfaces out of rough surfaces?), and must be used together or not at all. The term "patch" also seems to be common here. Would anybody care to explain to me, in practical terms, what these newish shader stages are for, how they fit into the 3D pipeline, and in which cases I should consider using them?
Trouble projecting pixel back to worldspace in PixelShader DX11 For the last two days, I've been trying to get some code working to project pixels back to world space in my pixel shader. I'm working on a fairly basic deferred renderer, and I'm using this world space position to calculate the distance to a light, so I can calculate the light falloff intensity. My first thought was to take the view and projection matrix from the g buffer pass, multiply them, take the inverse, transpose and then write to the constant buffer. .cpp file: auto gBufferWorld = DirectX::XMMatrixIdentity(); auto gbufferTransform = DirectX::XMMatrixMultiply( gBufferWorld, gBufferView ); gbufferTransform = DirectX::XMMatrixMultiply( gbufferTransform, gBufferProj ); gbufferTransform = DirectX::XMMatrixInverse( nullptr, gbufferTransform ); pixel shader code: float4 pixelWorldPos = float4( input.TexCoord.x * 2.f - 1.f, ( 1.f - input.TexCoord.y ) * 2.f - 1.f, ( depth - nearClip ) / ( farClip - nearClip ), 1.f ); pixelWorldPos = mul( pixelWorldPos, invViewProjMatrix ); pixelWorldPos /= pixelWorldPos.w; But I end up getting totally incorrect values. I also tried using just the TexCoords for pixelWorldPos.xy, using the depth value sampled from the gbuffer for pixelWorldPos.z, and still, nothing seems to give me the correct world position. I also tried using trigonometric functions to achieve this; I don't have the exact code on hand anymore, but it looked something like... float tanHalfY = tanf( fovY / 2.f ); float tanHalfZ = tanHalfY * aspectRatio; // In Pixel Shader: float3 dir = normalize( float3( x * tanHalfX, y * tanHalfY, 1.f ) ); float4 pixelWorldPos = mul( float4( dir * depth, 1.f ), invViewMatrix ); pixelWorldPos = normalize( pixelWorldPos ); Of course I could just write an extra GBuffer texture containing each pixel's worldspace position, since it would be trivial to compute in the GBuffer pass, but I'm unsure if the extra VRAM consumption is worth it, since I can just calculate it in the final lighting pass. I was hoping someone can point me in the right direction; I really didn't expect to have so much trouble with something that I thought would be so trivial.
HLSL's Tex2D for GLSL? I am trying to port an HLSL shader to GLSL. I'm just not quite sure how to convert this line: outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(-4.0 * pxSz.x * blurSize, 0)).a * 0.05; It's mostly the tex2D I'm having trouble with; in GLSL, it seems to work differently. I'm porting a horizontal blur: texture al_tex; sampler2D s = sampler_state { texture = <al_tex>; }; int tWidth; int tHeight; float blurSize = 5.0; float4 ps_main(VS_OUTPUT Input) : COLOR0 { float2 pxSz = float2(1.0/tWidth, 1.0/tHeight); float4 outC = 0; float outA = 0; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(-4.0 * pxSz.x * blurSize, 0)).a * 0.05; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(-3.0 * pxSz.x * blurSize, 0)).a * 0.09; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(-2.0 * pxSz.x * blurSize, 0)).a * 0.12; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(-pxSz.x * blurSize, 0)).a * 0.15; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(0, 0)).a * 0.16; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(pxSz.x * blurSize, 0)).a * 0.15; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(2.0 * pxSz.x * blurSize, 0)).a * 0.12; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(3.0 * pxSz.x * blurSize, 0)).a * 0.09; outA += Input.Color.a * tex2D(s, Input.TexCoord.xy + float2(4.0 * pxSz.x * blurSize, 0)).a * 0.05; outC.a = outA; return outC; } Thanks
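For reference, a hedged sketch of what one of those taps can look like on the GLSL side. The names s, texCoord, vertColor, pxSz and blurSize are placeholders standing in for the HLSL sampler, Input.TexCoord, Input.Color and the existing uniforms:

```glsl
uniform sampler2D s;      // the sampler bound to al_tex
uniform vec2 pxSz;        // 1.0 / texture size
uniform float blurSize;
varying vec2 texCoord;
varying vec4 vertColor;

void main() {
    float outA = 0.0;
    // HLSL's tex2D(s, uv) maps to texture2D(s, uv) here (or texture(s, uv) in GLSL 1.30+)
    outA += vertColor.a * texture2D(s, texCoord + vec2(-4.0 * pxSz.x * blurSize, 0.0)).a * 0.05;
    outA += vertColor.a * texture2D(s, texCoord).a * 0.16;
    outA += vertColor.a * texture2D(s, texCoord + vec2( 4.0 * pxSz.x * blurSize, 0.0)).a * 0.05;
    gl_FragColor = vec4(0.0, 0.0, 0.0, outA);   // remaining taps follow the same pattern
}
```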
Metaballs created through raycasting in fragment shader billboards are still recognizable Good day everyone, I'm currently programming a fluid simulation through use of metaballs and I have a very specific problem/question. The steps I use for metaball simulation: 1. Simulate particles' positions 2. Create billboards that always face the camera with the particles' positions at their centers 3. Cast a ray from the camera to every fragment/pixel and determine the value of every particle's contribution along the ray. Currently, I have nice looking metaballs with lighting; however, when you look closely you can see the edges of a billboard that is in front of another metaball, because the colors are slightly off. The fragments that fail the raycasting test get discarded, but they are obviously still somehow interfering and I just don't know how or why. What I want to know, because I have thought about it and didn't come to a conclusion: since the actual fragments of the billboards where there is no implicit surface are discarded, that means that it is not the color value of the "obstructing" discarded billboard that gets changed, but instead that of the actual goal billboard. Where are possible points of conflict that may create this behaviour? Thanks in advance! EDIT I think I am wrong about my earlier assumption: the raycast used to determine the implicit surface most likely hits a metaball BEHIND the billboard, so it actually doesn't get discarded... it then takes the normal of the actual intersection, but uses the light direction vector that is made by using the fragment's position, not the actual intersection. So, my guess is that I need a better condition to let the raycast know it didn't hit anything, instead of continuing and hitting a ball that is a little further away.
How to mix effects together? Let's say I have a terrain effect that contains multitexturing and lighting. Now a water effect. It must be different, but must also be affected by light. Another one, the player effect, must also be affected by light, but doesn't share much with the others. How should all of these be mixed together to create the proper effect? Should I write multiple techniques in a single effect file, or should I split it into as many files as possible and choose which are needed when rendering?
How can I replicate the color limitations of the NES with an HLSL pixel shader? Since 256 color mode is deprecated and no longer supported under Direct3D mode, I got the idea to use a pixel shader instead to simulate the NES palette of all possible colors, so that fading objects and whatnot don't have smooth fade-outs with alpha channels. (I know objects couldn't really fade out on the NES, but I have all objects that do fade in and out on a solid black background, which would be possible with palette swapping. Also, the screen fades in and out when you pause, which I know was also possible with palette swapping, as it was done in a few Mega Man games.) Problem is, I know next to nothing about HLSL shaders. How do I do it?
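The usual trick is a palette-quantisation pass over the rendered frame: sample the scene, then snap each pixel to the nearest entry of the console's palette. A rough sketch of the idea (written in GLSL here; the HLSL version is the same logic with tex2D/float3 syntax, and the four-entry palette below is only a placeholder for the real NES palette, which you would supply as a constant array or a lookup texture):

```glsl
#version 130
uniform sampler2D scene;   // the rendered frame
in vec2 uv;
out vec4 fragColor;

const int PALETTE_SIZE = 4;                  // placeholder; the NES has roughly 50+ usable colours
const vec3 palette[PALETTE_SIZE] = vec3[](
    vec3(0.0), vec3(0.33, 0.33, 0.33), vec3(0.66, 0.66, 0.66), vec3(1.0));

void main() {
    vec3 c = texture(scene, uv).rgb;
    float best = 1e9;
    vec3 snapped = palette[0];
    for (int i = 0; i < PALETTE_SIZE; ++i) {  // nearest palette entry by squared distance
        vec3 d = c - palette[i];
        float dist2 = dot(d, d);
        if (dist2 < best) { best = dist2; snapped = palette[i]; }
    }
    fragColor = vec4(snapped, 1.0);
}
```

Fades then collapse to the nearest available colours automatically, which is what removes the smooth alpha ramps.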
Shader Graph Surface type change Is there any way to control what surface type is used in Shader Graph instead of changing it in the PBR Master node? I'm creating a dissolve effect for albedo/emission/transparent surfaces (https://www.youtube.com/watch?v=taMp1g1pBeE). Then I'm making 3 materials based on that one shader that I can control with variables. It works fine for Opaque, but when I want to use it for glass I need to change Surface to Transparent in PBR Master, and Opaque surfaces are getting bugged due to the Two Sided setting. Can I control what type of surface is used based on some variable, or maybe expose it to the inspector as in standard materials?
Unreal 4 why does reducing scalability take so long to compile? I have almost nothing in my scene. In fact, I reduced my scalability settings from "High" to "Medium" before even loading my level layout, and doing this has crashed the program multiple times. It didn't crash this time, but it is telling me it is compiling over 5000 shaders (!) and my objects' materials are not showing up in the viewport. 1) That is a ton of shaders; what shaders are these? I only made like 8 shaders. 2) What is making this take so long? It is taking forever, over 10 minutes already. 3) Is this normal?
Use of the xyY color space? What's the use of the xyY colorspace in games? I'm not sure what's the advantage of using it in shader programming or elsewhere.
Failed to pass uniform in Metal shader modifier I'm trying to write a simple shader able to pass the color to be used for drawing in the fragment shader, through a uniform. I load the shader modifier and pass the uniform: let fragmentShaderPath = bundle.pathForResource("Cube", ofType: "fragment")! let fragmentShader = try String(contentsOfFile: fragmentShaderPath) let shaderModifiers: [String: String] = [SCNShaderModifierEntryPointFragment: fragmentShader] cube.geometry?.shaderModifiers = shaderModifiers SCNTransaction.begin() cube.geometry?.setValue(NSValue(SCNVector4: SCNVector4Make(0.0,1.0,0.0,1.0)), forKey: "myColor") SCNTransaction.commit() And this is my fragment shader modifier code: #include <metal_stdlib> using namespace metal; float4 myColor; _output.color = myColor; But nothing happens; the object is drawn as black. If I try declaring myColor as a uniform, I get an error.
OpenGL ES 2.0 not drawing images with shadows I'm using OpenGL ES 2 to program a simple 2D game for Android mobile phones. I'm coding the rendering portion of the software, using the GLES20 default library. All my sprites are rendered from a large image containing all the pictures of what I need. I use very basic fragment and vertex shaders, which are the following. Vertex shader: uniform mat4 uMVPMatrix; attribute vec4 vPosition; attribute vec2 a_texCoord; varying vec2 v_texCoord; void main() { gl_Position = uMVPMatrix * vPosition; v_texCoord = a_texCoord; } Fragment shader: precision mediump float; varying vec2 v_texCoord; uniform sampler2D s_texture; void main() { vec4 dst = texture2D(s_texture, v_texCoord); gl_FragColor = dst; } Those are the shaders I use and everything is rendering fine. I also set the blending function to GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA). The problem is that I needed a sort of shadow around an object, so I decided to solve this problem simply by making the game image with the shadow already in it. So, as you can see, here is a piece of my image with a large orange outside shadow, so I get the effect that the main object (the two columns in the middle) is glowing. (Ignore the black background; I put it there to make the shadow visible. It's not there in the real game image.) Anyway, when I render this image I get this result. As you can clearly see, there is not any sort of shadow, even though it is there in the source image. What is really strange from my point of view is that I don't generate the shadow using the shaders; I created it in a very static (and probably not very effective) way using Photoshop. How can I make the outer orange glow visible? Thank you in advance
Why doesn't my simple HLSL shader work? I'm using Monogame to draw 2D primitives to the screen. To do that, rather than use included structures like VertexPositionColor, I wrote my own vertex class for 2D. public struct VertexColor : IPositionable, IVertexType { private static readonly VertexDeclaration Declaration = new VertexDeclaration( new VertexElement(0, VertexElementFormat.Vector2, VertexElementUsage.Position, 0), new VertexElement(8, VertexElementFormat.Color, VertexElementUsage.Color, 0)); public VertexColor(Vector2 position, Color color) { Position = position; Color = color; } public Vector2 Position { get; set; } public Color Color { get; set; } public VertexDeclaration VertexDeclaration => Declaration; } Here's my associated HLSL shader, meant to simply translate vertices from normalized space to screen space. #define VShaderModel vs_4_0_level_9_1 #define PShaderModel ps_4_0_level_9_1 matrix Projection; struct VertexShaderInput { float4 Position : SV_Position0; float4 Color : Color0; }; struct VertexShaderOutput { float4 Position : SV_Position0; float4 Color : Color0; }; VertexShaderOutput VertexShaderFunction(VertexShaderInput input) { VertexShaderOutput output = (VertexShaderOutput)0; output.Position = mul(input.Position, Projection); output.Color = input.Color; return output; } float4 PixelShaderFunction(VertexShaderOutput input) : Color { return input.Color; } technique Technique0 { pass Pass0 { VertexShader = compile VShaderModel VertexShaderFunction(); PixelShader = compile PShaderModel PixelShaderFunction(); } } Where Projection is an orthographic matrix computed in code and passed to the shader based on the current window size. However, when I attempt to draw primitives, I receive the following exception: An error occurred while preparing to draw. This is probably because the current vertex declaration does not include all the elements required by the current vertex shader. The current vertex declaration includes these elements: SV_Position0, COLOR0. Now, I'm familiar with this exception and I understand what it means. The exception is telling me that my shader does not include all required elements based on the current vertex declaration. However, as you can see, my shader does include those two elements (SV_Position0 and Color0). What am I missing here?
Recreating this flat shaded look I'll keep it short. How does one achieve the effect depicted in the image below? Is it feasible to do in realtime? It looks deceptively simple, but it probably isn't. Are there any keywords I can search for to get more information about programming the shaders to achieve this look? Thanks.
Is there a successor to RenderMonkey? I'm starting with GLSL shader programming and have been looking into RenderMonkey. Sadly, AMD no longer supports it. Why? Is there a successor to it?
How to implement color changing fragment shader? I have a background of a given size and filled with a given color. I want to change it with an animation effect, starting from the center and spreading out until it covers the whole background. The new color should fade/blend smoothly into the existing color of the background, some kind of radial gradient that changes the color and then spreads out over the whole background. I am working with SpriteKit on iOS and I am really sure that the best way to implement this is with fragment shaders, which are new in the iOS 8 SpriteKit SDK. I have done some work with shaders and understand how they work, but I am asking for help more on the mathematics behind this.
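A hedged sketch of the fragment-shader math. SpriteKit's SKShader takes GLSL-style source; the built-in names used below (v_tex_coord, u_time) and the custom uniforms (u_old_color, u_new_color, u_speed) are assumptions to be checked against, or declared through, the SKShader/SKUniform documentation:

```glsl
void main() {
    vec2 fromCentre = v_tex_coord - vec2(0.5);          // node-local position, centred
    float radius = u_time * u_speed;                    // wavefront radius grows over time
    float band = 0.15;                                  // width of the soft blend band
    float t = smoothstep(radius - band, radius, length(fromCentre));
    gl_FragColor = mix(u_new_color, u_old_color, t);    // new colour inside, old outside
}
```

The mathematics is just the distance from the centre compared against a radius that grows with time, with smoothstep providing the soft edge between the two colours.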
How to apply a shader to only one side of an object in Unity3D So I have this rather simple surface shader that essentially cuts a hole in the object based on another texture. Shader "Custom/NewSurfaceShader" { Properties { _MainTex("Texture (RGB)", 2D) = "white" {} _SliceGuide("Slice Guide (RGB)", 2D) = "white" {} _SliceAmount("Slice Amount", Range(0.0, 1.0)) = 0.5 } SubShader { Tags { "RenderType" = "Opaque" } Cull Off CGPROGRAM #pragma surface surf Lambert addshadow struct Input { float2 uv_MainTex; float2 uv_SliceGuide; float _SliceAmount; }; sampler2D _MainTex; sampler2D _SliceGuide; float _SliceAmount; void surf(Input IN, inout SurfaceOutput o) { clip(tex2D(_SliceGuide, IN.uv_SliceGuide).rgb - _SliceAmount); o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb; } ENDCG } Fallback "Diffuse" } This produces the following effect. Now, as cool as this looks, I want to apply it only to, let's say, one side of the cube and its opposite side. Any suggestions on how I might go about doing this will be appreciated.
How can I create an efficient bloom shader with GLSL? I have searched the net for resources related to rendering a bloom effect using GLSL, but haven't found anything. Although the tutorial at Philip Rideout's website is a good one, it performs very poorly on my Nvidia GPU. Can anyone guide me as to how should I approach this problem and build a fairly efficient implementation of a bloom effect?
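The structure that usually performs well is: a bright-pass into a half- or quarter-resolution target, a separable Gaussian blur (one horizontal pass, one vertical pass) over that small target, then an additive combine with the scene; blurring at reduced resolution is typically what makes it cheap. A sketch of one horizontal pass, where u_image, u_texelSize and v_uv are placeholder names:

```glsl
uniform sampler2D u_image;     // bright-pass result, ideally at 1/2 or 1/4 resolution
uniform vec2 u_texelSize;      // 1.0 / texture dimensions
varying vec2 v_uv;

void main() {
    float w[5];                // example Gaussian weights for a 9-tap kernel
    w[0] = 0.227027; w[1] = 0.194594; w[2] = 0.121621; w[3] = 0.054054; w[4] = 0.016216;
    vec3 c = texture2D(u_image, v_uv).rgb * w[0];
    for (int i = 1; i < 5; ++i) {
        vec2 off = vec2(u_texelSize.x * float(i), 0.0);
        c += texture2D(u_image, v_uv + off).rgb * w[i];
        c += texture2D(u_image, v_uv - off).rgb * w[i];
    }
    gl_FragColor = vec4(c, 1.0);
}
```

The vertical pass is identical with the offset applied on y, and repeating the blur on progressively smaller targets is a common way to get a wide, soft glow without huge kernels.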
GPU optimization question pre computed or procedural? I'm learning shader programming and need some general direction. I want to add noise to my laser beam. Which is the best way to handle it? I could pre compute an image and pass it to the shader. I could then use the image to change the opacity and easily animate the smoke by changing the offset of the texture lookup. I could also generate noise in the shader and do the same thing the texture was used for. Is it generally better to avoid I/O to the graphics card, or the opposite?
How can I create a wind visual effect like Ori? I'm working on a mobile game in Cocos2D. The game mechanic is very similar to Ori and the Blind Forest's parachute (Kuro's feather) mechanic. I want to be able to render a visual effect for wind similar to Ori's (see here) to make parachuting feel better, but I'm a bit at a loss on how to do it. Do I make a custom shader that draws everything from scratch? Do I start with a wind texture and apply some transformations to it? I think adding particles that blow around in the wind is pretty simple but the ribbony wind field look in addition to particles would feel much better. In case the video dies (but an image doesn't do the effect justice, it's very dynamic)
Simple square vertex lifting shader I am trying to rebuild the fur effect in Viva Pinata, where each square becomes a pattern of fur. I imagine the process to be like this... you lift one end of the triangles. Now I need to achieve "lifting one end of the square". I can do either a vertex, fragment, or geometry shader. However I am clueless when it comes to determining which vertex is the "end of the square", so that I know which vertex to lift up.
Per Texel lighting? (not per vertex or per pixel) In my game, I have a 3D shader that does lighting using color ramps on models with very low resolution textures. Basically, the vertex shader calculates color values for direct light and ambient occlusion, and then I look up in a color ramp what color to draw, and multiply this by the texture map. Here's some pseudocode for that. Vertex shader: color_vert.r = ambient_occlusion(vertex); color_vert.g = direct_lighting(vertex); color_vert.b = shadow_lighting(vertex); Fragment shader: // First, the color is whatever the direct lighting was color_fragment = direct_lighting_ramp_lookup(color_vert.g); // Multiply by ambient occlusion and shadow color_fragment *= ambient_lighting_ramp_lookup(color_vert.r); color_fragment *= shadow_lighting_ramp_lookup(color_vert.b); // Multiply by the surface texture color_fragment *= texture_lookup(color_vert.uv); Now, this kind of lighting would be fine if I had very high resolution textures, but unfortunately my textures are very low resolution. So each texel (that is, the block of pixels corresponding to one pixel of the texture) is quite large, and lighting gets interpolated across the texel. Here's an image of what that looks like. And here's a closeup to show you what I mean. Notice how the color is smoothly getting interpolated across the low resolution texels? How can I write my shader such that the lighting gets clamped on a per texel basis? Is such a thing even possible? The only solution I can think of is to first generate a lightmap for the whole scene with a 1:1 mapping to texels, but I'm afraid that would be too slow and use too much memory.
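One commonly used approach, sketched below under the assumption that the lighting (or at least its inputs) can be moved to the fragment shader: quantise the interpolated UV to the centre of the texel being shaded and reconstruct the shading position there from screen-space derivatives, so every fragment inside a texel evaluates the same lighting. All names are placeholders and the lighting itself is reduced to a single point-light dot product:

```glsl
uniform vec2 u_texSize;        // low-res texture size in texels
uniform vec3 u_lightPos;
uniform sampler2D u_texture;
varying vec2 v_uv;
varying vec3 v_worldPos;
varying vec3 v_normal;

void main() {
    // centre of the texel this fragment falls in
    vec2 texelUV = (floor(v_uv * u_texSize) + 0.5) / u_texSize;

    // rebuild the world position at that texel centre from screen-space derivatives
    vec3 dpdx = dFdx(v_worldPos), dpdy = dFdy(v_worldPos);
    vec2 duvdx = dFdx(v_uv),      duvdy = dFdy(v_uv);
    float det = duvdx.x * duvdy.y - duvdx.y * duvdy.x;
    vec3 dPdu = ( dpdx * duvdy.y - dpdy * duvdx.y) / det;
    vec3 dPdv = (-dpdx * duvdy.x + dpdy * duvdx.x) / det;
    vec3 texelPos = v_worldPos + dPdu * (texelUV.x - v_uv.x) + dPdv * (texelUV.y - v_uv.y);

    // lighting evaluated from the texel-centre position (stand-in for the ramp lookups);
    // for fully constant results the normal needs the same treatment, or a flat face normal
    vec3 L = normalize(u_lightPos - texelPos);
    float diffuse = max(dot(normalize(v_normal), L), 0.0);
    gl_FragColor = vec4(texture2D(u_texture, v_uv).rgb * diffuse, 1.0);
}
```

The alternative with less per-fragment work is essentially the lightmap idea mentioned above: bake the three lighting channels into a texture on the same texel grid as the albedo map and point-sample it.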
GLSL Shader Editors for Linux Are there any good IDE's for linux that lets us edit GLSL shaders and visualize their effect? Note Shader Designer By Typhoon Labs is a good option but I am looking for alternatives as this software has some issues with Ubuntu 11.10
What's the most efficient way to implement low poly flat shading style? I want to know what's the most efficient way to implement low poly style flat shading. I have checked this post, however the options listed in the page don't feel perfect to me. I list them below for reference. Duplicate each vertex so that same vertex can associate with a different normal. Use flat to disable interpolation. Evaluate normal at fragment shader using dFdx and dFdy. To me, option 1 wastes some vertex attributes, since duplicate vertices have the same position. Option 3 perform unnecessary calculation on every fragment even though the normal should all be the same. Option 2 feels conceptually right to me, but I don't know what normal value should be saved at each vertex. This is also the approach hinted in book 'Real Time Rendering' (4e, p120). My question is what's the best way to implement it? Given so many commercial games used this style, I want to know how it is typically implemented by real world games?
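For what it's worth, option 2 in isolation looks like the sketch below: the value stored per vertex is simply the face normal of the one triangle for which that vertex is the provoking vertex (by default the last vertex of the triangle), and `flat` stops it from being interpolated. The index buffer then has to be arranged so each triangle gets a distinct provoking vertex, which usually still needs far fewer duplicates than option 1.

```glsl
// vertex shader (GLSL 3.30), minimal sketch
#version 330 core
layout(location = 0) in vec3 aPos;
layout(location = 1) in vec3 aFaceNormal;  // face normal of the triangle this vertex "provokes"
uniform mat4 uMVP;
flat out vec3 vNormal;                     // flat: no interpolation across the triangle
void main() {
    vNormal = aFaceNormal;
    gl_Position = uMVP * vec4(aPos, 1.0);
}
```

```glsl
// fragment shader
#version 330 core
flat in vec3 vNormal;
out vec4 fragColor;
void main() {
    fragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);  // visualise the per-face normal
}
```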
Spherical procedural terrain shader based on slope I've created a spherical terrain object out of 6 sphere-projected (normalised) planes; each plane has been heightmapped after being normalised. I'm looking to create a CG shader which will firstly calculate the slope of the heightmapped terrain. I understand this will need to look at the normals to calculate it; can anyone elaborate on this for me? The next problem is that I want to do the slope calculations based upon the terrain "flat" rather than normalised. Would it be best to either reverse the normalisation of the vertices during the slope calculations, or pre calculate and store the vertices' slope data (on the flat plane) during the mesh generation? This is an example of what I want to achieve with the shader. Any suggestions would be brilliant. Thanks, C. Update in response to opatut's answer: I've output the current slope data based on how I "think" it was suggested, and these were the results. I think my issue is that as each plane begins to curve, that curve is included as a slope, which is the obvious thing to happen. I want to "flatten" out each of the 6 planes again for the slope calculations.
What can I do with the 4th component of gl_Position? When I set gl_Position I usually assign it as gl_Position = vec4(in_position, 1.0), where in_position is a vector of three components representing a vertex of my model. But looking up tutorials and such, I cannot find anything explaining what the 4th component of the gl_Position vec4 is doing, aside from making the vector big enough so matrix transformations can be applied to it. Q: What can I do with the 4th component of gl_Position? What does it influence in the rendering process?
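In short, after the vertex shader the GPU performs the perspective divide, ndc = gl_Position.xyz / gl_Position.w, so w is what produces perspective foreshortening; it also feeds clipping and perspective-correct interpolation of varyings. A tiny illustration:

```glsl
// These two assignments land on the same on-screen position, because only the
// ratio xyz / w survives the perspective divide:
gl_Position = vec4(in_position, 1.0);
gl_Position = vec4(in_position * 2.0, 2.0);
// A perspective projection matrix works by copying the view-space depth into w,
// so farther vertices are divided by a larger number and shrink toward the centre.
```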
Kinect User Silhouette Shader I have this usermap from the Kinect's depth data (size is 320x280) and I want to display it in my game. The problem, of course, is that it's ugly (first image) and I want to have a beautiful effect like in the second image. The usermap is put on a billboard as a Texture2D, and I've been thinking I should use a fragment shader. I'm wondering what kind of shader techniques I should use so I can possibly make the same effect as shown in the second image. Usermap: http://www.azerdev.com/darkmachine/usermap.png Glowing man: http://www.azerdev.com/darkmachine/silhouette.jpg
Overlap color between objects I'm currently trying to build a game with Ogre3D that is basically a moving vehicle that leaves a green trail (2D manual mesh) in it's path, what i'm trying to achieve is exactly what this image shows My problem is that i need to change, by some method technique, the color of the intersected path where the two meshes overlap (red area). I've been searching around the Ogre forum and found this thread http www.ogre3d.org forums viewtopic.php?f 2 amp t 47674, I've replicated that solution in my code but now on the screen i only see the intersected path. I'm a total newbie in stencil buffers and in Ogre generally, so I'm still not sure if this is the best approach to solve my problem. should I try another method rather than applying a stencil buffer? vertex fragment shader code that could help? Any advice or direction that you could provide will be very appreciated. Thanks a lot UPDATE According to JasonPh's answers i've managed to start adding some code 1) Create manual texture Ogre TextureManager tmgr Ogre TextureManager getSingletonPtr() gkString mMapTextureName "pathTexture" if (!tmgr gt resourceExists(mMapTextureName)) Ogre TexturePtr ptr tmgr gt createManual(mMapTextureName, Ogre ResourceGroupManager DEFAULT RESOURCE GROUP NAME, Ogre TEX TYPE 2D, 480, Width 640, Height 1, Depth 0, Ogre PF A8R8G8B8, Ogre TU RENDERTARGET) ptr gt createInternalResources() ptr gt load() Ogre RenderTexture pathTexture ptr gt getBuffer() gt getRenderTarget() gkEngine engine gkEngine getSingletonPtr() Ogre Camera camera engine gt getActiveScene() gt getMainCamera() gt getCamera() pathTexture gt addViewport(camera) pathTexture gt getViewport(0) gt setClearEveryFrame(true) pathTexture gt getViewport(0) gt setBackgroundColour(Ogre ColourValue Black) pathTexture gt getViewport(0) gt setOverlaysEnabled(false) pathTexture gt setAutoUpdated(true) 2) Create material from scratch and use the previously created texture. This material is then assigned to my "path" entity. Ogre MaterialManager mmgr Ogre MaterialManager getSingletonPtr() mMaterialName uniqueMaterialName("pathMaterial") mMaterial mmgr gt create(mMaterialName, "General") Ogre Technique tec mMaterial gt getTechnique(0) tec gt setSchemeName("ShaderGeneratorDefaultScheme") Ogre Pass pass tec gt getPass(0) pass gt setVertexProgram("pathMaterial vs", false) pass gt setFragmentProgram("pathMaterial fs", false) pass gt setCullingMode(Ogre CULL NONE) pass gt setColourWriteEnabled(true) pass gt setLightingEnabled(true) Ogre TextureUnitState tus pass gt createTextureUnitState() tus gt setTextureFiltering(Ogre TFO NONE) tus gt setTextureAddressingMode(Ogre TextureUnitState TAM CLAMP, Ogre TextureUnitState TAM CLAMP, Ogre TextureUnitState TAM CLAMP) tus gt setTexture(tmgr gt getByName(mMapTextureName)) mMaterial gt prepare() mMaterial gt load() 3) Fragment shader code uniform sampler2D pathTexture void main() vec4 color texture2D(pathTexture, gl TexCoord 0 .xy) if(color vec4(0.0, 1.0, 0.0, 1) green gl FragColor vec4(1.0, 0.0, 0.0, 1) red else gl FragColor vec4(0.0, 1.0, 0.0, 1) green This still needs some fixes to work, so new questions have emerged 1) Do I realy need to render "pathTexture" on screen to this to work? Maybe that texture could only be used to decide pixel colors an then discard it? 2) To only use "pathTexture" as an "input" for my shader, should I add a second pass on my material file with "pathTexture" as a texture unit? Thanks!
Transform luminosity (Watt) to be usable in shader I'm studying the color and luminosity of stars. I'm trying to figure out how to transform the luminosity I perceive at a given distance into something usable in a shader. For instance, given the radius and temperature of a star, with the inverse square law on distance (meters) I obtain the perceived luminosity of the star in Watts (brute force, without any scattering). For example, the luminosity I obtain for the Sun on Earth is about 1361 W and an RGB color of about w 1361 r 255 g 245 b 242. The astronomy and physics are quite OK, but how do I transform these values into usable shader formulas or values? Best practices? Samples? I guess I will simply take the above values as a light reference and normalize the intensity (1361) to be 1 on each RGB component value. I'll keep you posted.
Difference between Material and Shader In games, materials often only influence the visual appearance of objects. The visual appearance is effected by shaders. So regarding to terminology is there a difference between materials and shaders? Should you write one shader for one material?
What is a Fragment Pipe? I remember someone saying "24 fragment pipes on nVidia 7800" in a presentation. Am I correct in saying that a fragment is the data that can generate a pixel in the frame buffer? Or are fragments the same thing as pixels? I'm getting confused here. What is a fragment pipe?
Pixel Shader from Visual Studio Graphics Diagnostics I want to check my pixel shader variables in Graphics Diagnostics, but unfortunately when I click start debugging in pixel history (mentioned below), a new tab opens and says the source is not available. Do I have to do anything before I can start debugging my shader in the Graphics Diagnostics tool? Thanks. Just to mention, I use DirectX 11 and C
Has anyone ever made a shader that produces something similar to pixel art? I find that pixel art is really pleasing to look at it has this kind of crisp, satisfying perfection to it. I've been following some beginner pixel art tutorials, when I got an idea what if someone made a shader (in GLSL for example) that took 3D scenes and did the best possible job at making it look like it was drawn by a pixel artist? The concepts of anti aliasing and shading should be relatively easy to perform. Constraining your colour palette would be very simple as well. Of course, no computer program could outdo the manual work of a great pixel artist, but has anyone ever tried this before? Is this idea at all possible? Are there any games that use it? Have there been (documented) failed attempts?
How to determine vertex index using Shader Model 3 or lower? I need something like SV_VertexID (added in Shader Model 4) in an HLSL shader to determine which vertex is currently being handled. Unfortunately, I can compile only vs_3_0 or lower. The objective is to change the position of one specific vertex using HLSL. I can't edit the mesh and can't pass any data from the game engine; my capabilities are limited to the HLSL shader. The shader is written only for one mesh (a human face), and I need to change its shape a bit (for example, close an eye or make it smile). I already tried to locate the vertex by TEXCOORD, but had no idea how to separate it from the other triangles' vertices connected to it (placed at the same point where many triangles meet). if ( VS.TC[0] > 0.8828125 && VS.TC[1] > 0.6328125 && VS.TC[1] < 0.671875 ) // Upper lip center VS.Position += VS.Normal * uUpLip Could you please give me any advice on how to identify and move only one specific vertex in HLSL? I need something like moving a vertex in Blender's Edit Mode (where neighbouring triangles remain connected at the shared point), but my attempt with TEXCOORD makes them move in different directions. Thanks
Transform texture coordinates when using shader Assuming I define four vertices of a quad with texture coordinates that cover a whole texture or a region of a texture, I can animate these coordinates by setting a transform using SetTransform( D3DTS_TEXTURE0, &texTrans ) and scaling, translating, etc. If I render using a shader and still want to animate the coordinates, presumably I can pass in the same transformation matrix and multiply the coordinates in the vertex shader? Instead of Output.TextureUV = vTexCoord0 in the vertex shader, do Output.TextureUV = mul( vTexCoord0, texTrans ). Is this the correct way to render an animated sprite with a shader?
Vertex Displacement Distortion Correction I am developing for Mobile VR using GoogleVRSDK and Unity. My target platform is Android. I have a shader which displaces vertices to create a reverse lens distortion. Following is my shader, Shader "Unlit Cube" Properties Color("Main Color", Color) (1,1,1,1) Category Tags "Queue" "Geometry" "IgnoreProjector" "True" "RenderType" "Opaque" Blend Off AlphaTest off Cull off Lighting Off ZWrite On ZTest LEqual Fog Mode Off SubShader Pass CGPROGRAM pragma target 3.5 pragma target 2.0 pragma only renderers gles2 pragma only renderers gles pragma vertex VertexProgram pragma fragment FragmentProgram pragma multi compile GVR DISTORTION include "GvrDistortion.cginc" struct VertexInput half4 vertex POSITION half4 texcoord TEXCOORD0 struct v2f half4 vertex SV POSITION half4 uv TEXCOORD0 v2f VertexProgram (VertexInput v) v2f o o.uv v.texcoord o.vertex undistortVertex(v.vertex) return o fixed4 Color fixed4 FragmentProgram (v2f fragment) COLOR return Color ENDCG Following is my GvrDistortion.cginc if defined(GVR DISTORTION) float4x4 Undistortion float MaxRadSq float NearClip float4x4 RealProjection float4x4 FixProjection float distortionFactor(float rSquared) float ret 0.0 ret rSquared (ret Undistortion 1 1 ) ret rSquared (ret Undistortion 0 1 ) ret rSquared (ret Undistortion 3 0 ) ret rSquared (ret Undistortion 2 0 ) ret rSquared (ret Undistortion 1 0 ) ret rSquared (ret Undistortion 0 0 ) return ret 1.0 Convert point from world space to undistorted camera space. float4 undistort(float4 pos) Go to camera space. pos mul(UNITY MATRIX MV, pos) if (pos.z lt NearClip) Reminder Forward is Z. Undistort the point's coordinates in XY. float r2 clamp(dot(pos.xy, pos.xy) (pos.z pos.z), 0, MaxRadSq) pos.xy distortionFactor(r2) return pos Multiply by no lens projection matrix after undistortion. float4 undistortVertex(float4 pos) return mul( RealProjection, undistort(pos)) Surface shader hides away the MVP multiplication, so we have to multiply by FixProjection inverse(VP) RealProjection and then by inverse(M), in order to cancel it out and leave our own transform in place. float4 undistortSurface(float4 pos) float4 proj mul( FixProjection, undistort(pos)) return mul(unity WorldToObject, proj) else Distortion disabled. Just do the standard MVP transform. float4 undistortVertex(float4 pos) return mul(UNITY MATRIX MVP, pos) Surface shader hides away the MVP multiplication, so just return pos. float4 undistortSurface(float4 pos) return pos endif This shader works completely as excepted in Samsung S6 and LG G3 and the Unity Editor. But we are working for a phone named Venus. In that phone shader renders but the displacement of vertices doesn't happen. I tried disabling conditional compiling. I tried changing render targets and forcing different OpenGL versions in render. Nothing worked. I would love to hear a solution. Thank you. The specs of the Venus, GPU Adreno(TM) 405 Runs OpenGL ES 3.0 GPU SM 4.0 and VRAM 512MB RM 2gigs Screen 1080x1920 60hz dpi 480 Android 5.1 API 22 CPU is ARMv7 VFPv3 NEON (8 cores) The shader in LG G3 The shader in Venus
Shader Resource Binding I'm trying to set a constant buffer in my shader with a value, but I'm getting nothing, no results. Code in shader: cbuffer MatrixBuffer : register(b0) { float4 test; } In the main part: #pragma pack(push,1) struct CB_GBUFFER_UNPACK { D3DXVECTOR4 test; }; #pragma pack(pop) D3D11_BUFFER_DESC cbDesc; ZeroMemory( &cbDesc, sizeof(cbDesc) ); cbDesc.Usage = D3D11_USAGE_DYNAMIC; cbDesc.BindFlags = D3D11_BIND_CONSTANT_BUFFER; cbDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE; cbDesc.ByteWidth = sizeof( CB_GBUFFER_UNPACK ); device->CreateBuffer( &cbDesc, NULL, &m_pGBufferUnpackCB ); D3D11_MAPPED_SUBRESOURCE MappedResource; ic->Map( m_pGBufferUnpackCB, 0, D3D11_MAP_WRITE_DISCARD, 0, &MappedResource ); CB_GBUFFER_UNPACK* pGBufferUnpackCB = (CB_GBUFFER_UNPACK*)MappedResource.pData; pGBufferUnpackCB->test.x = 10.0f; pGBufferUnpackCB->test.y = 10.0f; pGBufferUnpackCB->test.z = 10.0f; pGBufferUnpackCB->test.w = 0.0f; ic->Unmap( m_pGBufferUnpackCB, 0 ); ic->PSSetConstantBuffers( 0, 1, &m_pGBufferUnpackCB ); What seems to be my problem?
glDrawArrays draws nothing I am trying to draw a triangle using shaders in LWJGL, but nothing is being drawn on the screen, and no error is being produced. I can't figure out what I'm doing wrong. To create a vao, I use: int buffer = glGenBuffers(); int vertexArray = glGenVertexArrays(); ByteBuffer data = ByteBuffer.allocateDirect(6 * 8).order(ByteOrder.nativeOrder()); data.putFloat(-0.5f); data.putFloat(0.5f); data.putFloat(-0.5f); data.putFloat(-0.5f); data.putFloat(0.5f); data.putFloat(-0.5f); glBindBuffer(GL_ARRAY_BUFFER, buffer); glBufferData(GL_ARRAY_BUFFER, data, GL_STATIC_DRAW); glBindVertexArray(vertexArray); int positionAttributeLocation = glGetAttribLocation(program, "position"); glEnableVertexAttribArray(positionAttributeLocation); glVertexAttribPointer(positionAttributeLocation, 2, GL_FLOAT, false, 8, 0); and then I draw using glUseProgram(program); glDrawArrays(GL_TRIANGLES, 0, 3); Here's my vertex shader: #version 110 in vec2 position; void main(void) { gl_Position = vec4(position.xy, 1, 1.0); } and fragment shader: #version 110 void main(void) { gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); }
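One thing worth ruling out (an assumption, since no compile log is shown): the `in`/`out` interface qualifiers were only introduced in GLSL 1.30, so a `#version 110` shader that declares `in vec2 position` can fail to compile silently unless glGetShaderInfoLog / glGetProgramInfoLog are checked. A 1.10-compatible vertex shader would use `attribute` instead:

```glsl
#version 110
attribute vec2 position;

void main(void) {
    // z = 0.0 keeps the triangle comfortably inside the clip volume
    gl_Position = vec4(position, 0.0, 1.0);
}
```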
Why would an ambient occlusion (AO) shader's performance be dependent on light direction? One of my favourite games recently implemented ambient occlusion as a graphics feature, which appears to perform very well in most circumstances except during sunrise and sunset. As someone who is getting into shader programming, I'm intrigued by this. My understanding of AO is that you precompute the diffuse shading for a range of incident vectors, then use that information to produce a correct shadowing scale for real time use by enumerating over all the lights and computing how their position in relation to the object compares to each of the incident vectors, essentially interpolating across the closest values. Since the occlusion is pre computed against a model, taking complex shapes and self occlusion into account, you don't need to raytrace and the performance is much better. So here's where I get lost since the diffuse map is precomputed, and you're just comparing light positions to those maps for all objects within the screen space, why would performance suffer for cases where the ambient light (the sun) is close to the horizon? It seems like this shouldn't matter surely the performance of such a shader scales against vertices, not the angle of the light? The game uses the latest version of Unity (they just bumped up a minor release) but I don't know if they use the built in Unity SSAO shader. Performance hit is major 28fps at 1440p during "day" hours, dropping to 12 14fps during sunrise set. This only occurs when AO is enabled. I'm not interested in finding a solution I'm sure they'll fix it sooner or later I'd just like to get a better idea of how AO performance scales, what its pain points are, and why this kind of behaviour might appear.
Is multitexturing really just "using more than one texture"? This might seem stupid, but it bugs me. From what I understand, multitexturing is just using more than 1 texture per shader (usually to blend them somehow). So instead of creating 1 texture, I create 2 or more which is... pretty obvious! Why is there a special term for that? Is there more to it than just "using more than 1 texture"? Am I missing something?
Animated light effects in games I have been wondering for quite some time how certain animated texture effects are done, specifically those involving light effects. Some good examples of what I mean are the green bridges in Darkspore, which have a pulsating light effect (pic), the light bridges in Portal, or some spell effects from Oblivion (pic). Are those effects purely done with shaders? And if so, how would you go about doing them? If I, for example, wished to create an effect which would send little lightning bolts from a character model's feet to its top as an illustration that he got electrocuted, how would that work? About me: although I am proficient in programming, I am mostly a beginner when it comes to shaders. I understand most of the math behind it, but have almost no practical experience with them. I hope you can clear things up for me a bit! Thank you for reading, and in advance for any help! Christoph
glsl wrong light direction I'm practicing the Phong lighting model with glsl, and here's my shaders vertex shader version 330 core layout (location 0) in vec3 aPos layout (location 1) in vec2 aTexCoord layout (location 2) in vec3 aNormal out vec2 TexCoord out vec3 LightColor uniform mat4 model layout (std140) uniform coord mats mat4 view mat4 projection vec3 lightPos vec3 lightColor void main() vec3 vertex position vec3(model vec4(aPos, 1.0)) vec3 vertex position vec3(view model vec4(aPos, 1.0)) vec3 vertex normal normalize(vec3(model vec4 (aNormal, 1.0))) vec3 vertex normal normalize(vec3(view model vec4 (aNormal, 1.0))) vec3 light position vec3(vec4(lightPos, 1.0)) vec3 light position vec3(view vec4(lightPos, 1.0)) vec3 light dir normalize(light position vertex position) float diff max(dot( light dir,vertex normal), 0.0) vec3 diffuse diff lightColor vec3 ambient 0.1 lightColor LightColor diffuse ambient gl Position projection view model vec4(aPos, 1.0) TexCoord vec2(aTexCoord.x, aTexCoord.y) fragment shader version 330 core out vec4 FragColor in vec2 TexCoord in vec3 LightColor texture sampler uniform sampler2D texture1 void main() FragColor vec4(LightColor,1.0) texture(texture1, TexCoord) The diffuse light works right, that is, faces which are facing to the light source, are bright, when backing the light source, they are dark(basically ambient color). Now I want to calculate the specular light, in order to get the camera position for free, I changes the coordinate to camera view coordinate(using the commented code, just multiply the view matrix to the vertex position, normal and light position), but weird things happened all faces which are facing the light is dark, all faces which are backing to the light is bright. How could this happened?
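A hedged guess at the cause, based on the commented-out view-space lines above: the normal is being transformed as a point (w = 1.0), so the view matrix's translation gets added to it and the resulting "normal" points the wrong way as soon as the camera sits away from the origin. Directions should be transformed with w = 0.0 or, more robustly, with the inverse-transpose of the upper 3x3:

```glsl
// inside the vertex shader (view-space variant)
mat3 normalMatrix   = transpose(inverse(mat3(view * model)));
vec3 vertex_normal  = normalize(normalMatrix * aNormal);
vec3 light_position = vec3(view * vec4(lightPos, 1.0));   // a position: w = 1.0 is correct here
```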
infer half vector length in BRDF it's my first question on stack. Is it possible to infer length of the half angle vector for specular lighting from N L and N V without the whole view and light vectors? I may be completely off track, but I have this gut feeling it's possible... Why? I'm working on a skin shader and I'm already doing one texture lookup with N L N E and one texture lookup for specular with N H N V. The latter one can be transformed into N L N E lookup if only I had the half vector length. Doing so could simplify the shader a bit and move some operations into the pre computed lookup texture. It would make a huge difference since I'm trying to squeeze as much functionality as possible to a single pass mobile version so instruction count matters. Thanks. Edit, for clarity, I'm doing something like this float NdotL dot(s.Normal, lightDir) float NdotE dot(s.Normal, viewDir) fixed3 diffAndTransl 2.0 tex3D( Lookup3d, half3(diffNdotL, NdotE 0.5 0.5, depth)).rgb now we have diffuse with SSS and backscattering from lookup everything below could be just a lookup if I had length(h) information when calculating the lookup float3 h lightDir viewDir float hLen length(h) float3 H normalize(h) float3 H h hLen float NdotH dot(s.Normal, H) float NdotH (NdotL NdotE) hLen float EdotH dot(viewDir, H) fixed ph by fresnelReflectance tex3D( Lookup3d, half3(NdotH 0.5 0.5, EdotH 0.5 0.5, s.Gloss)).a half spec saturate(NdotL) max(ph by fresnelReflectance hLensq, 0 ) this could go into lookup as well I was thinking I could infer length(h) from NdotL and NdotE when generating the specular lookup to reduce above code. In my mind I was imagining something like a triangle on the plane made by lightDir and viewDir vectors could be thought of in 2D space. Don't the two dot products give me the (half?) angle and since both vectors are of length one, wouldn't it be possible to infer the length of the last side? I tried some trigonometric voodoo programming but with no luck (chord of theta inferred from dot products?). Does anybody have any idea if it's at all possible? Maybe I just need to sleep more. ps. I know tex3D won't work well on mobile, just testing the grounds...
How can I make a tendril flame like aura visual effect? I am a bit new to UE4 and I'm trying to get a tendril flame like aura like the picture below. Does anyone know how I would go about this? Should I use post processing or particles?
Any good books on graphics programming? I've been looking for a book that takes a bottom up approach for graphics programming. So something that starts with 2d filtering, maybe moving into normal mapping, then ambient occlusion, etc. I ask because I've been lazy for the last few years in game development and always used an engine that handles this. I want to start writing some shaders for my games instead of relying on the cryptic ones that I've borrowed in the past. I think a very strong knowledge in this will help and I'm a bottom up kind of learner so please help me out! I know the GPU Gems series is great, but they seem to be more like a cookbook than a bottom up approach that I want. You tend to get more scattered theory from cookbooks instead of building on theory from previous chapters. EDIT Preferably something you've read! I can search amazon for this, but it's hard to get a unbiased review that way.
My lighting stays the same when I go indoors? My problem can be seen here: notice how the shadows don't change intensity or color when moving indoors into the cave. The only thing that changes is the highlights when I move under the directional light. I was thinking about adding global illumination, but I don't think that would work with a toon shader. How can I get my lighting to become darker once I enter an indoor area, e.g. a cave? Thanks!
Starting with shaders and particles I'm working on a flow field with particle systems and starting to learn about shaders in three.js. The current state of my work is available here. As this amount of calculation is quite heavy for a big number of particles, I am wondering: is it possible to move it to the vertex shader? (I'm just a beginner with no experience with shaders, only what I've read, so please be patient.) The calculations require the previous position to add further displacement and to move the particle further according to the noise value at the given position, but I don't think it is possible to output anything from the shader?
Rain effect using DirectX 9 capabilities Is it possible to achieve something similar to nVidia's rain demo using only shader model 3.0 capabilities? If yes, could you point out a few documents web resources that are suitable candidates and do not require a heavy programming load (e.g. not more than two hard weeks of programming for one single person)? It would be nice if the answer could also contain a pro con phrase for the proposed idea (e.g. postprocessing rain shader vs. a particle based effect).
pixel displacement with shader Having started learning shaders and experimenting with tools like Shadertoy, I am attempting to make stereoscopic (anaglyph or autostereogram) shaders as an exercise. For this, I need to displace each pixel to the left right depending on its depth value. I am trying to generate a double image, based on the original image and the depth map, with each image pixel displaced to the left and right by a value depending on the depth map so background pixels are displaced more less (depending on the mode) than foreground pixels. It would be easy to get the color of the pixel, say, 30 px to the left or right, if I needed a displacement of 30 for this particular pixel. The problem is, this would not actually work instead, I need to change the color of the pixel 30 px to the left or right, and displace the color of the current pixel there. One solution would be to use a for loop and check each pixel in the possible displacement range for its depth value. However this has an impact on performance that becomes untenable for large displacement values. With an average eye distance of 65 mm, this can easily require hundreds of pixels of displacement range. Another solution (as per this answer) would be to generate two viewpoints and combine them, but I want to avoid doing so here. I am searching for a solution based on a single image and its depth map instead. Is there a another, more efficient way to displace pixels in such a way using shaders? Note I am using Shadertoy for ease of use, but if a solution exists but is not usable in it, for example requiring a 3D engine to implement, it is still of interest.
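For what it's worth, single-image approaches generally have to flip the scatter into a gather, much like parallax or relief mapping: each output pixel searches along the displacement axis for the source pixel whose depth-driven offset lands on it, and bounding the search plus exiting on the first hit is what usually keeps the loop affordable even for large maximum disparities. A rough Shadertoy-style sketch, where shiftFor and MAX_SHIFT are placeholders for whatever the stereo model needs:

```glsl
const int MAX_SHIFT = 64;                       // maximum disparity searched, in pixels

float shiftFor(float depth) {                   // placeholder depth -> disparity mapping
    return depth * float(MAX_SHIFT);
}

vec3 displaced(sampler2D image, sampler2D depthMap, vec2 uv, vec2 texel, float dir) {
    for (int i = 0; i < MAX_SHIFT; ++i) {
        vec2 srcUV = uv - dir * float(i) * vec2(texel.x, 0.0);
        float d = texture(depthMap, srcUV).r;
        if (abs(shiftFor(d) - float(i)) < 1.0)  // this source pixel maps onto the current one
            return texture(image, srcUV).rgb;   // first (nearest) hit wins
    }
    return texture(image, uv).rgb;              // fallback when nothing claims this pixel
}
```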
Why is H (blinn) used instead of R (phong) in specular shading? I can't find a good reason for this anywhere. The reflection vector used in Phong has a simple basis in physics. But the half vector used in Blinn seemingly has no rational basis, and does not constitute a proper reflection. And yet it is used in every so-called "physically based" shading function. If there is a good physical basis for it, I'd like to know. What I've been able to find are a few reasons. It's faster: there's mixed information on this, but even so it would have been a great reason... in the year 1998. It handles angles higher than 90 degrees better: as far as I can tell the only reason for this is because the Phong term has been used improperly. The dot product of the reflection and the view gives a value between -1 and 1. Usually this value is clamped to 0 to 1; this is the direct cause of the 90 degree problem. Re-normalize the value instead of clamping it and you get the full 180 degree coverage. I refuse to believe a simple * 0.5 + 0.5 operation has eluded the graphics world for 40 years. It handles edges better: the edge "problem" also exists in the Blinn solution, just to a lesser degree. The main cause is improper simulation of area lighting at the terminator, which should be essential for any "physically based" shader. But even in simpler situations a sigmoid function can approximate a soft terminator line correctly. Multiplying into a Lambert term is incorrect, as it attenuates the specular term improperly; this could cancel out a Fresnel term and lead to further errors. It has long reflections at the edge: it seems to me that while anisotropic reflections may be realistic, Blinn is not the correct way to implement them, as they only appear at the edge. It is merely a happy coincidence that an error in the H term happens to look realistic. None of these reasons are satisfactory; I want to sort out this madness. I want to clarify that I am not talking about Blinn and Phong specifically, but instead about the vector components H and R, which are used as the basis for these shaders as well as others.
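For reference, the two quantities being compared:

```latex
R = 2\,(N \cdot L)\,N - L, \qquad
H = \frac{L + V}{\lVert L + V \rVert}, \qquad
\text{Phong: } \max(R \cdot V, 0)^{n}, \qquad
\text{Blinn: } \max(N \cdot H, 0)^{n'}
```

When L, V and N lie in one plane, the angle between N and H is exactly half the angle between R and V, which is why a Blinn exponent roughly 2 to 4 times the Phong exponent gives a similarly sized highlight; away from that coplanar case the two lobes genuinely differ in shape, most visibly at grazing angles, which is where most of the arguments above live.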
The number of shaders a large game or game engine has Wondering the scale basically. The number of shaders a large game or game engine has. I've seen some metal repos but they typically just have 1 or 2 shaders for small demos. I think I've seen a few with 5 or 10, 1 with maybe 20. But I haven't seen anything where there is like 100 or 1,000 shaders or anything like that. Wondering what the scale is like for typical large games or game engines, if they are on the order of 1 10, or 100, or 1000, sorta thing. And if it's a large number, maybe a quick idea of what the various tasks they are used for would be interesting but not required ). Thank you!
VS2013 Compiling Shaders with Shader Model 5.0 When I try to compile two HLSL files included in my project, the compilation fails with an error Error error X4502 invalid vs 2 0 input semantic 'INSTANCE' However, I notice it's trying to use shader model 2.0 when I'm trying to use 5.0 Why is the shader compiler trying to use the 2.0 model when I've told VS to use 5.0? Or have I misunderstood?
Phong shading blows out things close to white OpenGL is there a way around this? I'm doing per pixel lighting using the Phong shading model. In pseudo code this looks like: vec4 ambientColor = light.ambientIntensity * material.diffuse; vec4 diffuseColor = light.diffuseIntensity * material.diffuse; vec4 specularColor = light.specularIntensity * material.specular; finalColor = ambientColor + diffuseColor + specularColor; And you can add in a shadow map or light attenuation easily, and even color the light via the light.xIntensity values. My problem is that I have a cloud texture that's all white values with varying levels of alpha. This is textured on a sphere. When rendering, what happens is that at the very apex of the light, i.e. the point closest to the light source, the following seems to be happening. When my light.ambientIntensity is 0.2,0.2,0.2,0.2, I get an ambientColor on the whitest parts of the texture of, say, 0.2,0.2,0.2,0.2, and on the darkest parts of the texture it could be 0.2,0.2,0.2,0.0. With a light.diffuseIntensity of 1,1,1,1 I will get a diffuseColor ranging from 0,0,0,0 (when we're nowhere near the light, i.e. on the other side of the sphere) through to 1,1,1,1 when we're on the tip closest to the light and the diffuse color of the material is also full white. I can skip specular here, because the end result is now 1.2,1.2,1.2,1.2. The problem is that I'm now losing some detail, because any values that should have been 0.8,0.8,0.8,0.8 are now fully opaque white. I understand that this would be desirable in some situations, like using HDR, but in this particular case it's not the result I want. I'm using OpenGL 4.4, but I assume this problem and a solution are API agnostic.
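Two common ways to keep the summed terms from clipping, sketched under the assumption that the goal is plain LDR output rather than HDR (names are placeholders): either rebalance the intensities so ambient plus diffuse cannot exceed 1, or compress the result at the end instead of letting it clip.

```glsl
vec3 lit = ambientColor.rgb + diffuseColor.rgb + specularColor.rgb;

// (a) hard clamp: cheapest, but everything at or above 1.0 still flattens to white
vec3 clamped = clamp(lit, 0.0, 1.0);

// (b) Reinhard-style compression: bright values approach 1.0 asymptotically,
//     so 0.8 and 1.2 remain distinguishable instead of both clipping
vec3 toneMapped = lit / (lit + vec3(1.0));

finalColor = vec4(toneMapped, finalAlpha);   // finalAlpha: however alpha is being handled
```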
14
I have a frag shader, one with an empty s lightMap, how and why is it effecting the output? I have an image of concrete rocks of different shades of colors, and I'm applying this shader, but without referencing s lightMap's uniform in my program precision mediump float varying vec2 v texCoord uniform sampler2D s baseMap uniform sampler2D s lightMap void main() vec4 baseColor vec4 lightColor baseColor texture2D( s baseMap, v texCoord ) lightColor texture2D( s lightMap, v texCoord ) gl FragColor baseColor (lightColor 0.25) it displays a picture with variable light color added to the final pixel color, but s lightMap isn't even linked into the program, what is happening in this case, at first I thought it would just do the baseColor as the FragColor, but the addition of the 0.25 makes a non negative result to (lightColor 0.25). I'm confused, one minute I think lightColor would be set to a texture of 1's the next an array of 0's. Or is it just random data? It doesn't appear at all random in the picture, it looks like it's obeying a rule of shading. I'd like to mimic this effect in code that's not broken. Here's the unshadded image Example "gimped" image of how it has variable shades (though this is an invert)
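A likely explanation, hedged since it depends on the GL implementation: sampler uniforms that are never set default to 0, so both s baseMap and s lightMap end up reading texture unit 0 -- lightColor is then just the base texture again, which is why the result looks like rule-following shading rather than random data. A sketch of binding each sampler to its own unit explicitly (program handle, texture ids and the underscored uniform names are assumptions):

    // OpenGL ES 2.0 style
    GLint baseLoc  = glGetUniformLocation(program, "s_baseMap");
    GLint lightLoc = glGetUniformLocation(program, "s_lightMap");
    glUseProgram(program);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, baseTextureId);
    glUniform1i(baseLoc, 0);                      // s_baseMap reads unit 0
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(GL_TEXTURE_2D, lightTextureId);
    glUniform1i(lightLoc, 1);                     // s_lightMap reads unit 1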
14
Issues to display a texture loaded with libPNG with opengles 3.0 I'm trying to load a texture loaded with libPNG on a XY coordinates. I know the width and the height of the texture (32x32). The texture is already loaded successfully and I have his Id. Here is my function to print any (32x32) texture on a XY position void printTexture(GLuint idTexture, GLfloat x, GLfloat y) glActiveTexture(GL TEXTURE0) glBindVertexArray(this gt VAO 0 ) glUniform1i(this gt mTagShaderHandle, 0) to execute a particular code on the Pixel Shader glUniform1i(this gt mTextShaderHandle, 0) the uniform Sampler2D from my Fragment Shader bound to 0. GLfloat xpos x GLfloat ypos y GLfloat w 32 GLfloat h 32 GLfloat vertices 6 4 xpos, ypos h, 0.0, 0.0 , xpos, ypos, 0.0, 1.0 , xpos w, ypos, 1.0, 1.0 , xpos, ypos h, 0.0, 0.0 , xpos w, ypos, 1.0, 1.0 , xpos w, ypos h, 1.0, 0.0 glBindTexture(GL TEXTURE 2D, idTexture) checkGlError("glBindBuffer") glBufferSubData(GL ARRAY BUFFER, GL ZERO, sizeof(vertices), vertices) glDrawArrays(GL TRIANGLE STRIP, GL ZERO, 6) glBindVertexArray(GL ZERO) glBindTexture(GL TEXTURE 2D, GL ZERO) Here is how I initialize my VAO and VBO glBindVertexArray(this gt VAO 0 ) glBindBuffer(GL ARRAY BUFFER, this gt VBO 0 ) glBufferData(GL ARRAY BUFFER, sizeof(GLfloat) 6 4, GL ZERO, GL DYNAMIC DRAW) glEnableVertexAttribArray(GL ZERO) glVertexAttribPointer(GL ZERO, 4, GL FLOAT, GL FALSE, 4 sizeof(GLfloat), GL ZERO) glBindVertexArray(GL ZERO) Here is how I set up my texture glTexImage2D(GL TEXTURE 2D, GL ZERO, GL RGB, temp width, temp height, GL ZERO, GL RGB, GL UNSIGNED BYTE, image data) glTexParameteri(GL TEXTURE 2D, GL TEXTURE MIN FILTER, GL LINEAR) glTexParameteri(GL TEXTURE 2D, GL TEXTURE MAG FILTER, GL LINEAR) glTexParameteri(GL TEXTURE 2D, GL TEXTURE WRAP S, GL CLAMP TO EDGE) glTexParameteri(GL TEXTURE 2D, GL TEXTURE WRAP T, GL CLAMP TO EDGE) glTexParameteri(GL TEXTURE 2D, GL TEXTURE WRAP R, GL CLAMP TO EDGE) And my Vertex Shader const char gVertexShader " version 320 es n" "layout (location 0) in vec4 vertex n" "out vec2 TexCoords n" "uniform mat4 projection n" "uniform int tag " "void main() n" " gl Position projection vec4(vertex.xy, 0.0, 1.0) n" " if (tag gt 0) n" " TexCoords vertex.zw n" " else n" " TexCoords vertex.xy n" here is the code that should be executed by the pixel shader program because Tag 0. " n" " n" The texture is not displaying at all. I tried a lot of things but nothing change. I should keep this pixel shader because I have some functions to print texts so that's why I have the if else statement on my Pixel Shader code. What's wrong with my code ? Thanks a lot for help.
14
HLSL's Tex2D for GLSL? I am trying to port a HLSL shader to GLSL. I'm just not quite sure how to convert this line outA Input.Color.a tex2D(s, Input.TexCoord.xy float2( 4.0 pxSz.x blurSize,0)).a 0.05 It's mostly the Tex2D I'm having trouble with. In GLSL, it seems to work differently. I'm porting a horizontal blur texture al tex sampler2D s sampler state texture lt al tex gt int tWidth int tHeight float blurSize 5.0 float4 ps main(VS OUTPUT Input) COLOR0 float2 pxSz float2(1.0 tWidth,1.0 tHeight) float4 outC 0 float outA 0 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2( 4.0 pxSz.x blurSize,0)).a 0.05 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2( 3.0 pxSz.x blurSize,0)).a 0.09 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2( 2.0 pxSz.x blurSize,0)).a 0.12 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2( pxSz.x blurSize,0)).a 0.15 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2(0,0)).a 0.16 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2(pxSz.x blurSize,0)).a 0.15 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2(2.0 pxSz.x blurSize,0)).a 0.12 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2(3.0 pxSz.x blurSize,0)).a 0.09 outA Input.Color.a tex2D(s, Input.TexCoord.xy float2(4.0 pxSz.x blurSize,0)).a 0.05 outC.a outA return outC Thanks
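For reference, a sketch of how one tap translates, assuming the sampler becomes a sampler2D uniform and the interpolated colour/texcoord become varyings (the uniform and varying names here are made up):

    // GLSL 1.20-ish: tex2D(s, uv) becomes texture2D(s, uv); in GLSL 1.30+ use texture(s, uv)
    uniform sampler2D s;
    uniform vec2  pxSz;        // 1/width, 1/height, as in the HLSL version
    uniform float blurSize;
    varying vec2  vTexCoord;   // Input.TexCoord
    varying vec4  vColor;      // Input.Color

    void main() {
        float outA = 0.0;
        outA += vColor.a * texture2D(s, vTexCoord + vec2(-4.0 * pxSz.x * blurSize, 0.0)).a * 0.05;
        // ...repeat for the other eight offsets/weights exactly as in the HLSL...
        gl_FragColor = vec4(0.0, 0.0, 0.0, outA);
    }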
14
Does texture splatting always sample 4 x N textures per fragment (regardless of the weights)? Texture splatting usually is done by vertex painting, where each channel R G B A is assigned as a different texture weight. Due to the way shaders are executed, doesn't it mean that the fragment shaders will ALWAYS sample 4 textures even if the weights are (1,0,0,0) or (0,1,0,0)? Lets say some parts of the terrain are almost exclusively grass and the other part of the terrain is exclusively dirt, which means the times when more than one texture sample is necessary is very, very rare on the transitions, but regardless 4 textures are sampled? Or do GPUs perform some sort of optimization to minimize the impact of this?
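As written (four unconditional samples) the answer is yes: all four textures are fetched for every fragment. A hedged sketch of the usual mitigation relies on dynamic branching -- it only pays off when an entire warp/wavefront takes the same branch, which is typically true inside large single-material regions, and sampling with mipmaps inside non-uniform control flow needs textureLod or explicit gradients to stay well defined:

    // GLSL sketch; weights come from the vertex colours, texture names are placeholders
    vec4 splat = vec4(0.0);
    if (weights.r > 0.001) splat += weights.r * texture(grassTex, uv);
    if (weights.g > 0.001) splat += weights.g * texture(dirtTex,  uv);
    if (weights.b > 0.001) splat += weights.b * texture(rockTex,  uv);
    if (weights.a > 0.001) splat += weights.a * texture(sandTex,  uv);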
14
How do I set the UV coordinate from a texture value in HLSL? I have a UV render pass, where RG is the image and B is always set to 255 I want to set the UV pass texture value to UV Coordinate when I test it, the result should pixelate noise, like the below image I have tested it in other shader languages like cgprogramm, GLSL test in Unity or useing Zdepth for mipmap but I can not get an anti aliased result all the results are the same float4x4 World float4x4 View float4x4 Projection float4x4 WorldInverseTranspose float4 AmbientColor float4(1, 1, 1, 1) float AmbientIntensity 0.1 float3 DiffuseLightDirection float3(1, 0, 0) float4 DiffuseColor float4(1, 1, 1, 1) float DiffuseIntensity 1.0 float Shininess 200 float4 SpecularColor float4(1, 1, 1, 1) float SpecularIntensity 1 float3 ViewVector float3(1, 0, 0) texture ModelTexture texture UvTexture sampler2D textureSampler sampler state Texture (ModelTexture) MinFilter Point MagFilter Point AddressU Wrap AddressV Wrap sampler2D textureSamplerUV sampler state Texture (UvTexture) MinFilter Linear MagFilter Linear AddressU Clamp AddressV Clamp struct VertexShaderInput float4 Position POSITION0 float4 Normal NORMAL0 float2 TextureCoordinate TEXCOORD0 struct VertexShaderOutput float4 Position POSITION0 float2 Color COLOR0 float3 Normal TEXCOORD0 float2 TextureCoordinate TEXCOORD1 VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output float4 worldPosition mul(input.Position, World) float4 viewPosition mul(worldPosition, View) output.Position mul(viewPosition, Projection) float4 normal normalize(mul(input.Normal, WorldInverseTranspose)) float lightIntensity dot(normal, DiffuseLightDirection) output.Color saturate(DiffuseColor DiffuseIntensity lightIntensity) output.Normal normal output.TextureCoordinate input.TextureCoordinate return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 is here float4 textureColorUV tex2D(textureSamplerUV, input.TextureCoordinate) float2 cord float2(textureColorUV 0 ,textureColorUV 1 ) float4 textureColor tex2D(textureSampler, cord) return saturate(textureColor) technique Textured pass Pass1 VertexShader compile vs 1 1 VertexShaderFunction() PixelShader compile ps 2 0 PixelShaderFunction() How do I set the UV coordinate from a texture value in HLSL?
14
Rendering different materials in a voxel terrain Each voxel datapoint in my terrain model is made up of two properties density and material type. Each is stored as an unsigned integer value (but the density is interpreted as a decimal value between 0 and 1). The density values are used to generate a mesh with the marching cubes algorithm. My current idea for rendering these different materials on the terrain mesh is to store eleven extra attributes in each vertex six material values corresponding to the materials of the voxels that the vertices lie between, three decimal values that correspond to the interpolation each vertex has between each voxel, and two decimal values that are used to determine where the fragment lies on the triangle. The material and interpolation attributes are the exact same for each vertex in the triangle. The fragment shader samples each texture that corresponds to each material and then uses the aforementioned couple of decimal values to interpolate between these samples and obtain the final textured color of the fragment. It should work fine, but it seems like a big memory hog. I won't be able to reuse vertices in the mesh with indexing, and each vertex will have a lot of data associated with it. It also seems pretty slow. What are some ways to improve or replace this technique for drawing materials on a voxel terrain mesh?
14
Normal Matrix in plain English I'm into shader language with WebGL and GLSL. I've seen some tutorials about the normal matrix and I don't really understand it. I mean, I think I'm ok with the math such as modelViewMatrix mat4.multiply(camera.view, modelMatrix) inverseModelViewMatrix mat4.invert(this.modelViewMatrix) normalMatrix mat3.fromMat4(inverseModelViewMatrix) normalMatrix mat3.transpose(this.normalMatrix) But why do I need it? Where can I find a case where I can see the difference between using it and not using it? Or when don't I need it?
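A small case where the difference is visible, sketched below: a non-uniform scale in the model matrix (say 2 on x, 1 elsewhere). Transforming normals by the plain model-view skews them so they are no longer perpendicular to the surface; the inverse-transpose fixes that. If the model-view only contains rotations, translations and uniform scale, the two give the same direction (up to length) and the normal matrix isn't strictly needed.

    // WebGL 1 vertex shader sketch
    uniform mat4 uModelViewMatrix;
    uniform mat3 uNormalMatrix;        // transpose(inverse(mat3(modelView))), built as in the question
    attribute vec3 aNormal;
    varying vec3 vNormal;

    void main() {
        vNormal = normalize(uNormalMatrix * aNormal);              // stays perpendicular under non-uniform scale
        // vNormal = normalize(mat3(uModelViewMatrix) * aNormal);  // skewed normals -> wrong lighting
        // ...position transform as usual...
    }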
14
Which coordinate space is the canonical default for each shader pipeline stage? I'm working with Direct3D 11 and HLSL. I use four different shaders (vertex, hull, domain and pixel). I always have troubles using the right coordinate space in my shaders. Could somebody identify the appropriate space for the vertex, hull, domain and pixel shader stages?
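There is no single space mandated by D3D11, but a common convention when tessellating is sketched as comments below -- treat it as a convention, not a rule:

    // VS : object space in, transform to world (or world-view); do NOT project yet,
    //      since the hull/domain stages want un-projected control points.
    // HS : passes control points through in that same space and outputs tess factors.
    // DS : interpolates control points, applies displacement, then multiplies by
    //      view-projection -- the SV_Position it writes must be in clip space.
    // PS : SV_Position arrives in screen/viewport pixel coordinates; other interpolants
    //      are in whatever space the domain shader wrote them in.
    float4 DomainToClip(float3 worldPos, float4x4 viewProj)
    {
        return mul(float4(worldPos, 1.0f), viewProj);   // done once, at the end of the DS
    }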
14
Are shaders always faster? Will using shaders or some other way of talking directly to the GPU always be faster than drawing to the screen in whatever language a game is being (mainly) written in? I guess that code with lots of conditionals is slower on many, (most? all?) GPUs but is that the only heuristic that can be reasonably applied? I understand that there will be edge cases with different hardware and measuring is the only way to know for sure, but I'm trying to figure out whether I should do everything that looks feasible to do in a shader, in a shader, given I don't want to implement it twice to find out which is faster.
14
HLSL How to flip geometry horizontally I want to flip my asymmetric 3d model horizontally in the vertex shader along an arbitrary plane parallel to the YZ plane. This should switch everything for the model from the left hand side to the right hand side (like flipping it in Photoshop). Doing it in the pixel shader would be a huge computational cost (extra RT, more fullscreen samples...), so it must be done in the vertex shader. Once more this is NOT reflection, I need to flip THE WHOLE MODEL. I thought I could simply do the following Turn off culling. Run the following code in the vertex shader input.Position mul(input.Position, World) World 3 0 holds x value of the model's pivot in the World. if (input.Position.x < World 3 0 ) input.Position.x World 3 0 input.Position.x else input.Position.x input.Position.x World 3 0 ... The model is never drawn. Where am I wrong? I presume that messes up the index buffer. Can something be done about it? P.S. it's INSANELY HARD to format code here. Thanks to Panda I found my problem. SOLUTION Do this before anything else in the vertex shader. Position.x 1 To invert along the object's YZ plane.
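For an arbitrary mirror plane x = p (parallel to YZ), the hedged general form of the fix is the reflection x' = 2p - x applied before the other transforms; note that mirroring flips triangle winding, so culling has to be disabled or switched for that draw:

    // HLSL vertex shader sketch; planeX is the x of the mirror plane in the same
    // space as input.Position (planeX = 0 reproduces the Position.x *= -1 solution above)
    float planeX = 0.0f;
    input.Position.x = 2.0f * planeX - input.Position.x;
    // ...then the usual World/View/Projection multiplies...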
14
outline object effect How can I achieve an outline effect similar to the ones found in League of Legends or Diablo III? Is it done using a shader? How? I would prefer answers that are not tied up to any particular engine but one that I can adapt to whatever engine I'm working on.
14
How can I create a shader that will reproduce this lighting effect on terrain? Notice the way in which the major light source in each image reflects off the ground, as a function of the distance between the light source and the viewer (?). Is this a (bumped) specular map? The effect is seen all over in World of Warcraft I specifically remember it on the snow in Dun Morogh, and on the shores of Darrowmere lake. Looking directly toward the sun is required. The effect is also commonly on the surface of water in RL and in CG. Specifically, I want to build the effect where the reflection is more intense on the ground surface closer to the light source, and falls off as it approaches the viewer.
14
How do you add turbulence to a particle system using noise? I have implemented a basic particle system using transform feedback in openGL, hoping to replicate a dust cloud. I have looked at methods of adding turbulence, such as vortices found here. But they do not give the desired effect. The guide discusses the idea of using perlin or simplex noise to add turbulence. But I don't understand how this is done? Would you use a pre generated noise texture and take sample data from it? Or calculate data at run time in the shader? What is the basic idea behind implementing noise turbulence? Edit My main question would be, how do you get a turbulent velocity from noise values?
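One common approach, sketched under the assumption that a 3D simplex noise function snoise(vec3) is available in the update shader (e.g. the Ashima webgl-noise implementation): sample the noise at the particle position each step to build a pseudo-random vector field and add a scaled offset to the velocity. Curl noise is the refinement that makes this field divergence-free so particles don't bunch up, but the plain version already gives a turbulent look.

    // Transform-feedback update pass sketch (GLSL); names are placeholders
    in vec3 oldPosition;
    in vec3 oldVelocity;
    uniform float uDt;
    uniform float uTime;
    uniform float uTurbulence;               // turbulence strength

    vec3 turbulence(vec3 p) {
        // three decorrelated lookups give a 3D vector per position
        return vec3(snoise(p + vec3( 0.0, 13.7, uTime)),
                    snoise(p + vec3(41.3,  0.0, uTime)),
                    snoise(p + vec3( 0.0, 71.9, uTime)));
    }

    void main() {
        vec3 vel = oldVelocity + uTurbulence * turbulence(oldPosition * 0.1) * uDt;
        vec3 pos = oldPosition + vel * uDt;
        // ...write pos and vel to the transform feedback outputs...
    }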
14
Matcap and BRDF Shading I just would like to know what's the difference between the Matcap shaders used in ZBrush, for example, and a Bidirectional Reflectance Distribution Function (BRDF) shader. Are the two techniques the same? Is Matcap done using a BRDF or are they different?
14
How does one multiply by a constant in a VS1.1 assembly vertex shader? I'd like to multiply a vector by 1 e.g. mul r0, r0, 1 When I try this, I get this error message SimpleShaderA.vsh(20,17) error X2000 syntax error unexpected integer '1l' How can I specify 1 in my (first) shader program?
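Immediate literals can't appear as instruction operands in vs_1_1 assembly; the usual fix is to define a constant register with def and multiply by that. Sketch, using c4 on the assumption it is otherwise unused:

    // vs_1_1 sketch
    def c4, -1.0, -1.0, -1.0, -1.0   // constant register holding the scalar
    mul r0, r0, c4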
14
How do I pass an object location into a vertex shader? I am using Blender Game Engine. I want to create a large flat plane, and deform it locally near a moving object. So far (despite being a beginner at shaders) I've written a vertex shader for the plane which moves the vertices to their correct positions (constant positions, for now). I cannot find a way to swap that constant location with an object's location updated every frame, while the shader is running. I am not even sure if it's possible. I only want to access a specific object's center from the shader.
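It is possible: the shader can't read another object on its own, but a Python controller can push the object's position into a uniform every frame. A rough sketch for the 2.7x BGE API -- the object and uniform names are placeholders, and the exact way you obtained the BL_Shader may differ (e.g. you may already hold the object returned by setSource when the shader was created):

    # Python, run every frame (Always sensor with true pulse, module mode)
    import bge

    def update(cont):
        scene = bge.logic.getCurrentScene()
        plane = cont.owner                     # the flat plane with the custom shader
        mover = scene.objects["Deformer"]      # assumed name of the moving object

        for mesh in plane.meshes:
            for mat in mesh.materials:
                shader = mat.getShader()       # BL_Shader previously set up with setSource()
                if shader is not None:
                    x, y, z = mover.worldPosition
                    shader.setUniform3f("objectPos", x, y, z)   # read as "uniform vec3 objectPos"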
14
Component wise GLSL vector branching I'm aware that it usually is a BAD idea to operate separately on GLSL vec's components separately. For example use instrinsic functions, they do the calculation on 4 components at a time. float dot v1.x v2.x v1.y v2.y v1.z v2.z WRONG float dot dot(v1, v2) RIGHT Multiply one by one is bad too, since the ALU can do the 4 components at a time. vec3 mul vec3(v1.x v2.x, v1.y v2.y, v1.z v2.z) WRONG vec3 mul v1 v2 RIGHT I've been struggling thinking, are there equivalent operations for branching? For example vec4 Overlay(vec4 v1, vec4 v2, vec4 opacity) bvec4 less lessThan(v1, vec4(0.5)) vec4 blend for(int i 0 i lt 4 i) if(less i ) blend i 2.0 v1 i v2 i else blend i 1.0 2.0 (1.0 v1 i ) (1.0 v2 i ) return v1 (blend v1) opacity This is a Overlay operator that works component wise. I'm not sure if this is the best way to do it, since I'm afraid these for and if can be a bottleneck later. Tl dr, Can I branch component wise? If yes, how can I optimize that Overlay function with it?
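Yes -- the per-component select can be done without the loop by building a 0/1 mask from lessThan() and choosing with mix(), which keeps everything in 4-wide operations. Sketch below, assuming the original return line was a lerp from v1 to the blended value by opacity:

    vec4 Overlay(vec4 v1, vec4 v2, vec4 opacity)
    {
        vec4 mask  = vec4(lessThan(v1, vec4(0.5)));           // 1.0 where v1 < 0.5, else 0.0
        vec4 low   = 2.0 * v1 * v2;                           // "if" branch
        vec4 high  = 1.0 - 2.0 * (1.0 - v1) * (1.0 - v2);     // "else" branch
        vec4 blend = mix(high, low, mask);                    // component-wise select
        return v1 + (blend - v1) * opacity;
    }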
14
Is there an alternative to decals that will let me put bullet holes on a complicated object? I'm trying to figure out a way to do "bullet holes" but actually apply the texture at the shader level? If I'm dealing with a model which doesn't have a "flat" face I'd like to be able to wrap the "decal" across the model. Anyone point me at a tutorial that will do something similar to this.
14
Rendering terrain only with GPU This is not about generating plane geometry and then applying a shader on it. Instead, I want a big single flat plane, then apply a shader on it. The vertex shader has a uniform vec3 realPlanePosition and calls a height calculation function Vertex shader code uniform vec3 realPlanePosition varying vPosition float heightCalculation(vec2 verticePosition) Just for pseudo code (more black magic append here). return magicNoise2D(verticePosition.x, verticePosition.y) void main(void) vec3 vPosition position vec2 verticePosition vec2(vPosition.x realPlanePosition.x, vPosition.y realPlanePosition.z) vPosition.z heightCalculation(verticePosition) gl Position projectionMatrix modelViewMatrix vec4(vPosition, 1.0) In my plane update code, I just change realPlanePosition with player position, for example. The plane moves on (x, z) (X rotation PI 2.0). The only thing the vertex shader updates is the z (due to plane rotation) value of gl Position. Is this good practice? Also, how do I detect collisions with it? And finally, how can I control other terrain objects' generation (rocks, trees, ...)
14
Does GLSL copy function arguments by value? My question is about passing variables to a GLSL shader function. I'm not sure how that works and what the performance implications are. Say I've got a function that accepts a "vec4" variable. The question is: is that variable copied on entry? I guess it has an impact on performance if so. And if it happens to be that way, is there a way to pass only references like in C/C++?
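For reference, GLSL parameters use value/copy-out semantics rather than references: in (the default) is copied in at the call, out is copied back to the caller's variable on return, and inout does both. Compilers routinely optimize the copies away, and there is no C++-style reference to ask for. A tiny sketch:

    vec4 scaledCopy(in vec4 v)    { v *= 2.0; return v; }   // caller's variable is untouched
    void scaleInPlace(inout vec4 v) { v *= 2.0; }           // caller's variable is updated on return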
14
Vertex shader are evil for performance? I found that the vertex shaders are sometimes very useful, especially because they can generate geometries and extract and use a lot of informations from just 1 image. The problem is that my project still isn't that big and looking to bigger project or just to commercial games, seems like the vertex shaders are rarely used. Am I wrong or the pixel shaders are much more used than the vertex shaders? This has something to do with performance issues and compatibility issues? What about some possible bottleneck like the memory controller CPU ?
14
How do I create a manual object with colors for each vertex? How do I create a shaded manual object with colours for each vertex? E.g. if ogreObj is the Ogre ManualObject ogreObj->begin("BaseWhiteNoLighting", Ogre RenderOperation OT TRIANGLE LIST) will allow me to select each vertex's colour with ogreObj->colour(r, g, b) after each ogreObj->position(x, y, z) and ogreObj->normal(x, y, z) call. However, if I change the material to BaseWhite, colour() instructions are ignored. I read that you must disable lighting in the .material script, but I need it active... Any advice? ANSWER This Ogre forum's thread has a simple .material script that works for this purpose material Voxel Default technique pass diffuse vertexcolour specular vertexcolour ambient vertexcolour lighting on
14
XNA games C application executable work on one win7 not the other one Our company wrote a game in XNA studio 4 almost ten years ago. we try to reinstall it in win7 with only the executable. Both installed XNA Game Studio 4.0. Below is the environments parameter I can find. The one on the laptop is not rendering the effect(I am new to this, this is my guess). Laptop Desktop OS win7 Service Pack1 win7 Service Pack1 DirectX version 11 11 Graphic Card AMD firepro M4000 Mobility Prographics integrated GPU Intel HD graphics 4600 Chip Type AMD FirePro(0x682D) Intel(R) HD graphics family We don't know what make these two executable behave differently. Part of the window image just not shown. Second check on the source code, the different between the image that can be show and can not be shown on the failed Laptop is working one is using Bitmap and the not working on is using Texture2D with Microsoft.Xna.Framework.Graphics Effect with passes. Any ideas. Thanks.
14
WebGL geometry calculations I have a dynamic surface in WebGL, that is animated in vertex shader. I want other objects to interact with this surface (for example, an object riding on dynamic terrain). What's the best way to do this? Should all these calculations be done on CPU? Is there a way to calculate this stuff on GPU? Basically, what I want is vertex shader with access to other (already transformed) vertices that would be perfect.
14
How to get Pixel Coordinates of certain colors in a Texture? I have a relatively big Texture, and I try to find a certain color pixels pattern eg. White, Black, White, Green . They are lying next to each other, If I use Texture2d.Getpixels() on every pixel every frame its possible, but way too slow. Is there a "real good" way to do this? For example a shader way? The "big idea" is to make a game which is "remote playable". For example through Twitch or of a streamed youtube video. The user should not be connected to the server in any way, Instead he is sending messages or not. For this I have to find a health bar. This health bar has the left upper corner in black white black green pixels lying next to each other. In the shader I CAN find these colors, but I dont really know the coordinates. Also I think there "should" be a solution with a ... compute shader? I dont really want to render new stuff. Just check pixel colors. I just want to know where these pixel colors are. I imagined something like this If PixelColorIsWhite(x,y) If PixelColorIsBlack(x 1,y) If PixelColorIsWhite(x 2,y) If PixelColorIsGreen(x 3,y) return x 20,y 100 because the player is 20 pixels right and 100 pixels below In a shader i guess i can access the current coordinate, but I cant access the other pixels. Its paralell processing... right? So its impossible to access the "entire thing" in one go , right? I think it must work somehow with... getting the big "chunk" of texture data, a very long array of pixels, giving it to a compute shader, and letting the CS find it. I hope I could clarify the problem now ) Update I found out it (at least) should work with a compute shader. I feel like im so close to the solution, yet im getting errors. ( So... This is my C Code public ComputeShader shader public Material thisMat public RenderTexture ScreenTex public int x public int y public const string INPUTTEX "InputTexture" private void Update() RunShader() public void RunShader() RenderTexture tex new RenderTexture(1024, 786, 24) tex.enableRandomWrite true tex.Create() shader.SetTexture(0, INPUTTEX, ScreenTex) shader.SetFloat(" ScreenWidth", Camera.main.pixelWidth) shader.SetFloat(" ScreenHeight", Camera.main.pixelHeight) int data new int 2 ComputeBuffer buffer new ComputeBuffer(data.Length, sizeof(int) 2) buffer.SetData(data) shader.SetBuffer(0, "readWriteIntBuffer", buffer) thisMat.mainTexture tex int kernelHandle shader.FindKernel("CSMain") shader.SetTexture(kernelHandle, "Result", tex) shader.Dispatch(kernelHandle, 1024 8, 786 8, 1) buffer.GetData(data) x data 0 y data 1 buffer.Release() And this is my ComputeShader Code pragma kernel CSMain RWTexture2D lt float4 gt Result RWStructuredBuffer lt int gt readWriteIntBuffer Texture2D lt float4 gt InputTexture float ScreenWidth float ScreenHeight numthreads(8, 8, 1) void CSMain(uint3 id SV DispatchThreadID) if (id.x gt (uint)( ScreenWidth ScreenHeight)) return int y id.x int( ScreenWidth) int x id.x int( ScreenWidth) readWriteIntBuffer 0 999 some debug testnumbers that show me that there were no colors found readWriteIntBuffer 1 998 if (InputTexture float2(x,y) .r 1.0f amp amp InputTexture float2(x, y) .g 0.0f amp amp InputTexture float2(x, y) .b 0.0f) float2 coordinates float2((float)id.x (float) ScreenWidth, (float)id.y (float) ScreenHeight) readWriteIntBuffer 0 coordinates.x readWriteIntBuffer 1 coordinates.y This Computeshader checks if a single pixel in a texture is red and writs it in a buffer. The buffer gets read by C with getdata. it seems we're almost getting there! 
But "something" seems to be wrong, since I dont seem to get the colors i want. But the compute shader seems to work, since i also get the triange pattern in my material.
14
How does the GPU know how to form triangles for a given mesh? I have just begun learning shader programming. What I learned is that the rasteriser groups three vertices to form a triangle for doing further operations. If that's true, how does the rasteriser determine the appropriate vertices which form a triangle in the mesh geometry? Is it the 3D software which saves geometry in the proper order in a model file like .3ds, .obj or .x, or is it the GPU which internally triangulates the geometry irrespective of the order the vertices are passed?
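In short, both halves of the question have an answer: the modelling tool (or exporter) decides the triangulation and stores it as vertex/index data, and the GPU simply consumes that data according to the primitive topology the application sets -- every three indices (or vertices, if non-indexed) form one triangle, with no re-triangulation on the GPU side. A minimal sketch with made-up data:

    /* four shared vertices, two triangles described purely by indices */
    float positions[] = { /* v0, v1, v2, v3 ... */ 0 };
    unsigned short indices[] = { 0, 1, 2,   2, 1, 3 };
    /* OpenGL example: every consecutive group of three indices is rasterised as one triangle */
    /* glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, 0); */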
14
Where should shaders and lights be in a component based entity system? Where should I put the shader and the light shadow calculation? Should that be a component too? And should the rendering system know how to handle them or should there be a separate light system? I'm specifically talking about about a 2D system, but it should be the same in for 3D I think.
14
Early Z culling Ogre For Ogre experienced people, but also experts in the field Early Z culling is sometimes quite desirable, and that's what I tried to do in Ogre by using a two pass material. The first one is writing to the Z Buffer, but not to the frame buffer. This is how it looks like pass EarlyZ texture unit TU0 ambient diffuse texture texture TU0 TEXTURE tex coord set 0 filtering trilinear cull software none cull hardware none lighting off colour write off shading flat scene blend alpha blend alpha rejection greater equal 200 depth bias 5 5 ugly hack without it, objects tend to flicker The biggest problem I get is with alpha objects and shadows. For example, now I can't get tree impostors to cast correct shadows instead of blocks. Although they are rendered correctly, the PSSM isn't working correctly, so the shadows tend to look like stencil shadows. Any ideas on how to fix it? As many people said is it possible to perform early Z culling and still have transparent objects in the scene? If yes, some hints to do it in Ogre? Here are some screenshots
14
pixel shader and vertex shader problem in visual studio 2008 I have just installed XNA for VB2008 professional, now i try to run my first game development and i get this error, any way around this, am using windows xp sp3 my system configuration Host Name USER PC OS Name Microsoft Windows XP Professional OS Version 5.1.2600 Service Pack 3 Build 2600 OS Manufacturer Microsoft Corporation OS Configuration Standalone Workstation OS Build Type Uniprocessor Free Registered Owner user Registered Organization user pc Product ID 55274 640 1011873 23081 Original Install Date 1 26 2011, 7 00 52 PM System Up Time 0 Days, 10 Hours, 31 Minutes, 22 Seconds System Manufacturer Hewlett Packard System Model HP d330 uT(DC579AV) System type X86 based PC Processor(s) 1 Processor(s) Installed. 01 x86 Family 15 Model 2 Stepping 9 GenuineIntel 2593 Mhz BIOS Version COMPAQ 20031003 Windows Directory C WINDOWS System Directory C WINDOWS system32 Boot Device Device HarddiskVolume1 System Locale en us English (United States) Input Locale en us English (United States) Time Zone N A Total Physical Memory 1,527 MB Available Physical Memory 689 MB Virtual Memory Max Size 2,048 MB Virtual Memory Available 2,008 MB Virtual Memory In Use 40 MB Page File Location(s) C pagefile.sys Domain WORKGROUP Logon Server USER PC Hotfix(s) 10 Hotfix(s) Installed.
14
Logic behind a cubic reflection I'm creating a cube in WebGL made of mirros that have to reflect the object inside of it. At the moment I'm stuck because I don't understand the correct logic to implement to achieve the result. I tried put a camera in every face of the cube in order to have a cubemap to apply. The problems come when I try to rotate the cube I'm not able to make the cameras follow the rotation. What's the correct logic to implement? These images could give the idea of what I'm trying to do
14
One draw call with one big mesh VS many draw calls with many little meshes I have read that in order to optimize WebGL application, one should reduce an amount of draw calls. But does it mean that computing a one big mesh from all single meshes on CPU by modifying vertices position (which I heard is much slower than GPU while talking about matrix multiplications) and calling the draw call once would be faster than drawing every single mesh using model matrix inside GPU, calling the draw call for each one of them?
14
Screen effects and antialiasing I have been working on a game for a while using glut for basic window creation. I was rendering to an offscreen buffer so that I could implement various effects like screen bulging, motion blur, refraction, etc. I also used the screen texture with antialiasing (fxaa). Now I have changed from glut to sfml. I switched on the in built antialiasing and it looked much better than my version, but now I don't have the screen in a texture so I can't use the screen effects. So my question is, how to people normally deal with this issue? Can I take advantage of sfml's antialiasing functionality and retain my effects? I thought about using glReadPixels, but that seems way too slow. Does sfml do offscreen rendering behind the scenes and can I access that texture? This is not specific to sfml. How do AAA games do it? Do they always implement their own antialiasing techniques?
14
Does it make sense to do more calculations in the fragment shader if there are more vertices than pixels? I'm very new to graphics programming, and as I understand it vertex shaders are called per vertex and fragment shaders are per pixel (ignoring anti aliasing). When it comes to optimization all sources I refer to say that calculations that could be done in either shaders should be done in the vertex shader, and it makes sense 5 8 done 3 times is way better than 5 8 done 300'000 (640 480 for example). And that's fine for this scale but the question almost poses itself what happens when the amount of vertices exceeds the resolution. Here's a practical example a typical AAA game could have upto 3'000'000 polys rendering on the screen, for the sake of simplicity let's say it's a very detailed triangle strip, that would give it exactly 3'000'002 vertices while a full HD screen has 1920 1080 2'000'000 pixels. As you can tell a difference of a million operation could be a substantial performance increase. Is my thought process sound, or does my ignorance of some details leave something to be desired?
14
Godot error(10) expected '(' after Identifier When trying to write up a new shader for a material, I've been trying to create a variable to use to alter speed and test animation. Now, the below outputs fine until I try to assign a value to number. shader type canvas item uniform float time factor 1.0 uniform vec2 amplitude vec2(10.0,5.0) uniform sampler2D frame1 uniform sampler2D frame2 uniform sampler2D frame3 uniform float speed 1.0 float number 1.9 void vertex() VERTEX.x sin(TIME time factor VERTEX.x VERTEX.y) amplitude.x VERTEX.y cos(TIME time factor VERTEX.y VERTEX.x) amplitude.y I had seen tutorials demonstrating that this would have worked fine in previous versions.
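As far as I can tell the parser objects because, in the Godot shading language, globals can only be uniforms, varyings, constants or functions -- a plain initialized global float isn't allowed. Declaring it const (or making it a uniform if it should be tweakable from the material) compiles; a sketch:

    const float number = 1.9;      // compile-time constant at global scope
    // uniform float number = 1.9; // alternative if it should be editable per material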
14
Axis Aligned Billboards in shader Hi, I need to implement the following effect using vertex shaders. Basically it's a shader for a particle laser beam that rotates the particle around its own y axis until it is "best" visible (roughly). My idea was Take the "y" axis of the particle model (it's a rectangle) and transform it to view space ("vy"). Calculate the vector orthogonal to "vy" and the eye vector ("w"), to get the direction on the screen in which the "x" of the particle should be oriented. Change the particle vertices' model coordinates using "w" instead of x.
14
Need help transforming DirectX 9 skybox hlsl shader to DirectX 11 I am in the middle of implementing a skybox in my game. I have been following this tutorial http rbwhitaker.wikidot.com skyboxes 2. I am using MonoGame as a framework and in order to support both Windows and Windows 8 metro I need to compile the shader with pixel and vertex shader 4. compile vs 4 0 level 9 1 compile ps 4 0 level 9 1 However some of the hlsl syntax has been updated with DX10 and DX11. I need to update this hlsl code float4x4 World float4x4 View float4x4 Projection float3 CameraPosition Texture SkyBoxTexture samplerCUBE SkyBoxSampler sampler state texture <SkyBoxTexture> magfilter LINEAR minfilter LINEAR mipfilter LINEAR AddressU Mirror AddressV Mirror struct VertexShaderInput float4 Position POSITION0 struct VertexShaderOutput float4 Position POSITION0 float3 TextureCoordinate TEXCOORD0 VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output float4 worldPosition mul(input.Position, World) float4 viewPosition mul(worldPosition, View) output.Position mul(viewPosition, Projection) float4 VertexPosition mul(input.Position, World) output.TextureCoordinate VertexPosition CameraPosition return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR0 return texCUBE(SkyBoxSampler, normalize(input.TextureCoordinate)) technique Skybox pass Pass1 VertexShader compile vs 2 0 VertexShaderFunction() PixelShader compile ps 2 0 PixelShaderFunction() I guess I need to change Texture into TextureCube, change the sampler, swap texCUBE() with TextureCube.Sample() and change the PixelShader return semantic to SV Target0. I'm very new to shader languages and any help is appreciated!
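A sketch of what the SM4-style declarations usually look like (untested here, so treat it as a starting point): the texture and sampler become separate objects, texCUBE becomes Texture.Sample, and the output semantic becomes SV_Target0. The filtering and addressing that used to live in sampler_state are then set from the MonoGame side via a SamplerState on the device rather than inside the shader.

    TextureCube  SkyBoxTexture : register(t0);
    SamplerState SkyBoxSampler : register(s0);

    float4 PixelShaderFunction(VertexShaderOutput input) : SV_Target0
    {
        return SkyBoxTexture.Sample(SkyBoxSampler, normalize(input.TextureCoordinate));
    }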
14
Converting Shader to ShaderGraph for URP conversion I know there are a bunch of threads asking for help with converting old shaders to ShaderGraph, and this one is no different. I spent a lot of time trying to get a shader to cast shadows from sprites that also works with transparency. Eventually landing on this thread I was able to work out the following shader that did what I wanted Shader "Custom SpriteShadowWithAlpha" Properties PerRendererData MainTex("Texture", 2D) "white" EffectColor1("Effect Color", Color) (1,1,1,1) Crossfade("Fade", float) 0 FlashColor("Flash Color", Color) (1,1,1,1) FlashAmount("Flash Amount", Range(0.0,1.0)) 0 Cutoff("Alpha Cutoff", Range(0,1)) 0.9 Color ("Color", Color) (1,1,1,1) Toggle( ALPHABLEND ON) ALPHABLEND ON("Enable Dithered Shadows", Float) 0.0 SubShader Tags "Queue" "Transparent" "IgnoreProjector" "True" "RenderType" "TransparentCutOut" "PreviewType" "Plane" "CanUseSpriteAtlas" "True" Cull Off Lighting Off ZWrite Off Blend SrcAlpha OneMinusSrcAlpha CGPROGRAM pragma surface surf Lambert alpha blend fullforwardshadows alphatest Cutoff pragma target 3.0 struct Input fixed2 uv MainTex fixed4 color COLOR sampler2D MainTex fixed4 EffectColor1 fixed Crossfade fixed4 FlashColor float FlashAmount void surf(Input IN, inout SurfaceOutput o) fixed4 col tex2D( MainTex, IN.uv MainTex) fixed4 returnColor lerp(col, col EffectColor1, Crossfade) EffectColor1.a col (1.0 EffectColor1.a) o.Albedo returnColor.rgb IN.color.rgb o.Alpha col.a IN.color.a o.Albedo lerp(o.Albedo, FlashColor.rgb, FlashAmount) ENDCG Fallback "Standard" After that I found that using the built in render pipeline was the difference between 40 and 7 GPU utilization when converted to URP using a different sprite shadow shader. Problem is the URP shader I found did make sprites cast shadows, but all the lighting was off, and messing with the X rotation on the sprites to get them to stand off the ground would cause them to explode into balls of white light from the directional light....I don't get why. So I set off to try and convert my old working Shader to ShaderGraph I have been trying to deduce what parts of this code correlate to what blocks exist in ShaderGraph. I understand when 2 variables have a between them I need to use a multiply block After about 3 hours of linking different blocks I was left with....nothing and a mess I was wondering if anyone could point me in the right direction as to what blocks correspond to what variables. Like the code void surf(Input IN, inout SurfaceOutput o) I am not too sure what sort of block this would be, it seems like a function, and I have no clue how to make that visually or the block of "Tags" no idea where to start there or the toggling of ALPHABLEND ON
14
Iris wipe shader not properly working I'm working on creating an iris wipe transition, like the ones you see in old cartoons a fully transparent circle closes on a certain point, leaving a full screen of a solid color. Additionally, the background around the circle fades in from full transparency as well. I decided that shaders would be easier than creating vertices to build the iris effect. I may have been wrong. EDIT I've made some progress since asking this question there is now an iris effect, but it opens and closes around a blank CornflowerBlue screen before displaying the screen, and the background around the circle is always full transparency. Shader sampler TextureSampler register(s0) float2 irisCenter float radius float4 backColor float4 PixelShaderFunction(float4 pos SV POSITION, float4 color1 COLOR0, float2 coords TEXCOORD0) COLOR0 float4 p pos float2 c irisCenter float r radius float alpha abs(1 step(pow(p.x c.x, 2) pow(p.y c.y, 2), r r)) return float4(backColor.r, backColor.g, backColor.b, alpha backColor.a) technique Technique1 pass Pass1 PixelShader compile ps 3 0 PixelShaderFunction() The alpha variable should be deciding if a pixel is inside the circle and should be transparent, or if the pixel is outside the circle and shouldn't be. irisCenter, radius, and backColor are all set on each frame by public void Draw() if (isInitialized amp amp (isRunning dir EffectDirection.Backward)) float r color.R 255f float g color.G 255f float b color.B 255f float a (color.A currentFadeLevel) 255f var irisEffect GameServices.Effects "IrisEffect" irisEffect.Parameters "irisCenter" .SetValue(irisCenter) irisEffect.Parameters "radius" .SetValue(irisRadius) irisEffect.Parameters "backColor" .SetValue(new Vector4(r, g, b, a)) quadRenderer.Render(irisEffect) IrisEffect.cs QuadRenderer.cs Instead of behaving as an iris wipe, it first draws a full screen rectangle for half the effect, and then a full screen Color.CornflowerBlue rectangle (I guess the quad from QuadRenderer isn't drawing with transparency). I'm not sure what's wrong as I'm new to HLSL. Question 1 How can I get the coordinates of the current pixel being processed by the shader? Question 2 What am I doing wrong? Question 3 How do I draw the rest of the quad transparently?
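On question 1, a hedged note: under XNA's ps_3_0 the pixel position isn't delivered through SV_POSITION (that is a D3D10+ input semantic); the VPOS semantic is the usual way to get screen-space pixel coordinates, roughly as below. The full-screen quad also needs to be drawn with alpha blending enabled for the transparent centre to show through.

    float4 PixelShaderFunction(float2 screenPos : VPOS,
                               float4 color1    : COLOR0,
                               float2 coords    : TEXCOORD0) : COLOR0
    {
        float2 d = screenPos - irisCenter;
        float outside = step(radius * radius, dot(d, d));   // 1 outside the circle, 0 inside
        return float4(backColor.rgb, outside * backColor.a);
    }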
14
How does the Apple Metal API distinguish uniforms from vertex buffers? I am not sure how Metal distinguishes uniforms from vertex buffers. As far as I know, the code for passing uniforms in buffers is the same as for vertices self.commandEncoder setVertexBuffer positionBuffer offset 0 atIndex 0 self.commandEncoder setVertexBuffer uniformBuffer offset 0 atIndex 1 In the shader code there is no keyword defining some struct as uniforms, so how does Metal know? In OpenGL GLSL there was the uniform keyword, which was clear to me, but I can't figure out how it is solved in Metal.
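It doesn't distinguish them at the API level -- both calls just bind a buffer to a slot in the vertex argument table, and the shader decides what each slot means through its [[buffer(n)]] attributes and address-space qualifiers. A minimal Metal sketch matching indices 0 and 1 from the question (the struct contents are made up):

    #include <metal_stdlib>
    using namespace metal;

    struct Uniforms { float4x4 modelViewProjection; };

    vertex float4 vertex_main(const device packed_float3 *positions [[buffer(0)]],  // setVertexBuffer ... atIndex:0
                              constant Uniforms          &uniforms  [[buffer(1)]],  // setVertexBuffer ... atIndex:1
                              uint                        vid       [[vertex_id]])
    {
        return uniforms.modelViewProjection * float4(float3(positions[vid]), 1.0);
    }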
14
Direction vector in raycasting When I read about how to get the direction vector in raycasting, for example on this site http www.daimi.au.dk trier ?page id 98 they first render the mesh with front face culling and then with back face culling, and then subtract the back face from the front to get a direction vector for each pixel. But isn't this too much work to get the direction vector? Isn't it simpler and faster to just take the vertex position (in world coordinates) and subtract the camera position in the fragment shader to get the direction vector? This should give the exact same answer but we skip the back face and front face rendering.
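For the simple case it is indeed enough, as the sketch below shows: interpolated world-space position minus camera position gives the per-pixel ray direction. What the front/back-face passes buy you in the linked technique is the entry and exit point of the ray through the volume (so both the start position and the march length), plus correct behaviour when the camera sits inside the bounding box, which the single subtraction doesn't provide on its own.

    // GLSL fragment sketch
    in vec3 vWorldPos;          // vertex position transformed to world space
    uniform vec3 uCameraPos;    // camera position in world space

    void main() {
        vec3 rayDir = normalize(vWorldPos - uCameraPos);
        // ...ray-march the volume along rayDir from the surface hit point...
    }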
14
Screen Space reflections not tracing correctly GLSL I've been trying to implement screen space reflections for the past couple of days, however it's been difficult finding specific implementation instructions or guides. Most of the hits on the subject that I can find relate to UE4 or Unity, or a sample implementation in HLSL derived from GLSL work both using different coordinate systems for the y and z axes. Following what little I could find, I've been using the following shader version 430 layout (std430, binding 2) buffer Camera Frag mat4 pMatrix mat4 InvPMatrix vec2 ScreenSize uniform sampler2D ViewNormalMap uniform sampler2D DepthMap uniform sampler2D LightMap in vec2 TexCoord layout (location 0) out vec4 FragColor vec2 RayCast(vec3 dir, inout vec3 hitCoord, out float dDepth) dir 0.25f for(int i 0 i lt 20 i) hitCoord dir vec4 projectedCoord pMatrix vec4(hitCoord, 1.0) projectedCoord.xy projectedCoord.w projectedCoord.xy projectedCoord.xy 0.5 0.5 float depth texture(DepthMap, projectedCoord.xy).r dDepth hitCoord.z depth if(dDepth lt 0.0) return projectedCoord.xy return vec2(0.0f) void main(void) vec3 View Normal texture(ViewNormalMap, TexCoord).xyz float View Depth texture(DepthMap, TexCoord).r vec3 ScreenPos 2.0f vec3(TexCoord, View Depth) 1.0f vec4 View Pos InvPMatrix vec4(ScreenPos, 1.0f) View Pos View Pos.w Reflection vector vec3 reflected normalize(reflect(normalize(View Pos.xyz), normalize(View Normal))) Ray cast vec3 hitPos View Pos.xyz float dDepth float minRayStep 0.1f vec2 coords RayCast(reflected max(minRayStep, View Pos.z), hitPos, dDepth) FragColor textureLod(LightMap, coords, 0) I get the scene projected further down, but it isn't upside down. Notice the vases and the plants on top of them. Its like the scene gets resampled and pushed downwards. Can anyone help me understand why I'm not getting the expected results? I've never done any raytracing before, so perhaps I'm misunderstanding how this is supposed to work.
14
My lighting stays the same when I go indoors? My problem can be seen here notice how the shadows don't change intensity or color when moving indoors to the cave. The only thing that changes is the highlights when I move under the directional light. I was thinking about adding global illumination but I don't think that would work with a toon shader. How can I get my lighting to become darker once I enter indoors e.g a cave. Thanks!