How can I create a screen space distortion effect? I am trying to create a screen space distortion effect using shaders. This image demonstrating the effect I'm after is from Nvidia Let's consider that I have access to the background texture. How can I create this kind of effect? What should I do in the vertex and fragment function? It seems that I need to retrieve screen space normals of my object, but how?
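For orientation, the usual shape of this effect is: write the refractive object's view-space normal (either in its own pass or alongside the main pass), then in the object's fragment shader offset the background lookup by the normal's screen-space xy. A minimal GLSL sketch of that last step; all uniform and varying names here are illustrative assumptions, and the vertex shader is assumed to pass the view-space normal and the clip-space position through.

    uniform sampler2D uBackground;   // the already-rendered scene behind the object
    uniform float uStrength;         // distortion strength, e.g. 0.02

    varying vec3 vViewNormal;        // normal transformed into view space in the vertex shader
    varying vec4 vClipPos;           // clip-space position (copy of gl_Position)

    void main() {
        // Screen-space UV of this fragment.
        vec2 screenUv = vClipPos.xy / vClipPos.w * 0.5 + 0.5;

        // Shift the background lookup along the screen-space projection of the normal.
        vec2 offset = normalize(vViewNormal).xy * uStrength;

        gl_FragColor = texture2D(uBackground, screenUv + offset);
    }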
DirectX11 Using Multiple Shaders I currently have a scenario where I am rendering terrain with a shadow map. I have two passes, one for the depth buffer to create the shadow map (which is rendered to a texture) and a second that takes the shadow map texture as input and actually shadows the terrain, the terrain gets drawn once for each pass. Within the second shader I also texture the terrain. I now want to split the second shader so that shadows and texturing are separate shaders. How do I go about getting the output from the shadow shader to the texture shader (or vice versa), or do I simply do it in the same way as I have with the shadows (rendering to a texture, passing it as an input).
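For what it's worth, chaining separate shaders this way generally does work exactly like the shadow-map case described above: pass 1 renders its result into a texture, pass 2 binds that texture as an input and composites. A hedged sketch in GLSL terms (the HLSL equivalent is a Texture2D/SamplerState pair), assuming the shadow pass wrote a screen-sized texture of scalar shadow factors and both passes cover the same screen area; all names are placeholders.

    uniform sampler2D uShadowResult;  // output of the shadow pass, rendered to a texture
    uniform sampler2D uTerrainTex;    // terrain albedo

    varying vec2 vUv;

    void main() {
        vec3 albedo  = texture2D(uTerrainTex, vUv).rgb;
        float shadow = texture2D(uShadowResult, vUv).r;  // assumed scalar shadow factor
        gl_FragColor = vec4(albedo * shadow, 1.0);
    }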
Fragment shader for lighting in isometric perspective What I'm trying to do is to achieve basic lighting in 2D from an isometric perspective. Here I have 2 textures that are used as tiles for the ground: a color map and a normal map. I have a fragment shader that was adjusted a bit to my needs, but mostly made from this tutorial: https://github.com/mattdesl/lwjgl-basics/wiki/ShaderLesson6 Here is my fragment shader:

    uniform sampler2D texture;
    uniform sampler2D normal_texture;

    // values used for shading algorithm...
    uniform vec2 Resolution;      // resolution of screen
    uniform vec4 AmbientColor;    // ambient RGBA -- alpha is intensity
    uniform vec3 LightPos[3];     // light position, normalized
    uniform vec4 LightColor[3];   // light RGBA -- alpha is intensity
    uniform vec3 Falloff[3];      // attenuation coefficients

    void main() {
        vec4 DiffuseColor = texture2D(texture, gl_TexCoord[0].xy);
        vec3 NormalMap = texture2D(normal_texture, gl_TexCoord[0].xy).rgb;

        vec3 Sum = vec3(0.0);

        for (int i = 0; i < 3; i++) {
            // The delta position of light
            vec3 LightDir = vec3(LightPos[i].xy - (gl_FragCoord.xy / Resolution.xy), LightPos[i].z);

            // Correct for aspect ratio
            LightDir.x *= Resolution.x / Resolution.y;

            // Determine distance (used for attenuation) BEFORE we normalize our LightDir
            float D = length(LightDir);

            // normalize our vectors
            vec3 N = normalize(NormalMap * 2.0 - 1.0);
            vec3 L = normalize(LightDir);

            // Pre-multiply light color with intensity
            // Then perform "N dot L" to determine our diffuse term
            vec3 Diffuse = (LightColor[i].rgb * LightColor[i].a) * max(dot(N, L), 0.0);

            // pre-multiply ambient color with intensity
            vec3 Ambient = AmbientColor.rgb * AmbientColor.a;

            // calculate attenuation
            float Attenuation = 1.0 / (Falloff[i].x + (Falloff[i].y * D) + (Falloff[i].z * D * D));

            // the calculation which brings it all together
            vec3 Intensity = Ambient + Diffuse * Attenuation;
            vec3 FinalColor = DiffuseColor.rgb * Intensity;
            Sum += FinalColor;
        }

        gl_FragColor = gl_Color * vec4(Sum, DiffuseColor.a);
    }

This shader WORKS with a traditional 2D perspective. That's because in traditional perspective the ground tile's normal map would have a color scheme closer to something like this, just wouldn't be tilted to fit an isometric perspective of course. The issue: How do I go about adjusting the math in my fragment shader to fit an isometric perspective and work with the "greenish" ground normal maps? Here's what I'm currently getting with this shader. Light should be going equally in all directions, but it's as if it's going down a slope.
Infinite world floor grid shader I am trying to render an infinite world floor grid, similar to this question. My project is using SceneKit with Metal Shading Language but the concepts are no doubt similar between GLSL HLSL. Here is an example project where I attempt to render a plane as a full screen quad with the grid lines drawn using a fragment shader. Full shader code can be found here. Grid Rendering A simple vertex shader sets the position and grid coordinate for the plane f.position scn node.modelViewProjectionTransform float4(v.position, 1.0) f.ij v.position.xy The fragment shader then determines which color to render the grid or floor fragment half4 floor fragment(FragmentIn f stage in ) float2 fractional abs(fract(f.ij 0.5)) float2 partial fwidth(f.ij ) float2 point smoothstep( partial, partial, fractional) float saturation 1.0 saturate(point.x point.y) return half4(mix(backgroundColor.rgb, gridColor.rgb, saturation), 1.0) I then remove any MVP transforms and simply set the vertex position as is in clip space to fill the screen. This does have the intended effect of filling the screen but mangles the grid lines. inside floor vertex() f.position float4(v.position, 1.0) f.coordinate v.position.xy Ray Casting As suggested in the comments, the fragment position should be projected into world space and intersected with the floor plane to find a new vector on the plane. I have an intersect(plane ray) method returns a hit with a distance and a point on the plane or otherwise false if the ray is orthogonal to the plane. It is my understanding that by multiplying the position in clip space with the inverseViewProjectionTransform the result will be in camera space. The plane is also converted to camera space with the modelViewTransform and I then test my ray against this which appears to work but with incorrect results. fragment half4 floor fragment(Fragment f stage in , constant SCNSceneBuffer amp scn frame buffer(0) , constant NodeBuffer amp scn node buffer(1) ) f.ij is in the vertex position in clip space ( 1, 1) to (1, 1) convert position into camera space float4 position (scn frame.inverseViewProjectionTransform float4(f.ij.x, f.ij.y, 0.0, 1.0)) create ray from camera with direction Ray ray Ray .origin float3(0.0), .direction normalize(position.xyz) convert floor plane from world space to camera space float3 worldFloor (scn node.modelViewTransform float4(float3(0.0, 5.0, 0.0), 1.0)).xyz hit test ray against floor plane Plane plane Plane .position worldFloor, .normal float3(0.0, 1.0, 0.0) RayHitTest hitTest intersect(plane, ray) if(!hitTest.hit) return f.backgroundColor grab xz values to determine floor fragment color float2 uv hitTest.vector.xz float2 fractional abs(fract(uv 0.5)) float2 partial fwidth(uv) float2 point smoothstep( partial, partial, fractional) float saturation 1.0 saturate(point.x point.y) return half4(mix(f.backgroundColor.rgb, f.gridColor.rgb, saturation), 1.0) This video shows the final results where you can see that the grid is not correctly aligned with the world axis. The floor plane is positioned at 0.0, 5.0, 0.0 and the rectangle at 0.0, 4.5, 0.0 which should be aligned with each other but this is not the case. It looks to me that the perspective is off as the grid seems much larger than is should be. Can anyone please point out the obvious mistake(s) I am making?
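One detail that often causes exactly this kind of correctly shaped but wrongly scaled grid is unprojecting a clip-space position without the perspective divide. A hedged sketch of the usual ray construction, written in GLSL terms since that is what I can state most safely (the Metal version is the same math using scn_frame.inverseViewProjectionTransform); uInvViewProj and uCameraPos are assumed uniforms.

    // Build a world-space ray through a fragment of a full-screen quad.
    uniform mat4 uInvViewProj;   // inverse of view * projection
    uniform vec3 uCameraPos;     // camera position in world space

    vec3 rayDirectionFor(vec2 ndc)   // ndc in [-1, 1]
    {
        // Unproject a point on the far plane and divide by w before using it.
        vec4 farPoint = uInvViewProj * vec4(ndc, 1.0, 1.0);
        farPoint /= farPoint.w;
        return normalize(farPoint.xyz - uCameraPos);
    }

The missing divide by w, and mixing camera-space and world-space quantities in the plane test, are often the culprits when a grid looks stretched or far too large.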
HLSL Circle all white I have been trying to get my shader code (HLSL) to draw a simple circle, but after a day and a half I am getting nowhere. It seems people are using x^2 + y^2 = r^2 and remapping texcoords, but I only get a white quad.

    struct VertexShaderStruct
    {
        float4 Position : POSITION0;
        float2 Tex0 : TEXCOORD0;
    };

    VertexShaderStruct VertexShaderFunction(VertexShaderStruct input)
    {
        VertexShaderStruct output;
        float4 worldPosition = mul(input.Position, World);
        float4 viewPosition = mul(worldPosition, View);
        output.Position = mul(viewPosition, Projection);
        output.Tex0 = input.Tex0;
        return output;
    }

    float4 PixelShaderFunction(VertexShaderStruct input) : COLOR0
    {
        float dx = 2 * input.Tex0.x - 1;
        float dy = 2 * input.Tex0.y - 1;
        float hyp = (dx * dx + dy * dy);
        return (hyp <= 1) ? circleColor : otherColor;
    }

I define circleColor as blue and otherColor as white, so it seems hyp <= 1 always fails.
Using multiple shaders I'm currently studying opengl shaders but I can't figure out something how to apply different shaders to the objects, for example, a teapot rendered using toon shader and another one in the same scene using a very reflective surface and other distorted from a noise function, like in this video http www.youtube.com watch?v 1ogg4ZfdBqU Another one is applying a bloom shader in a scene and a motion blur shader afterwards. How to achieve those effects when you can only have one vertex shader and one fragment shader? Is there any trick such as using more than one shader program?
Any good books on graphics programming? I've been looking for a book that takes a bottom-up approach to graphics programming. So something that starts with 2D filtering, maybe moving into normal mapping, then ambient occlusion, etc. I ask because I've been lazy for the last few years in game development and have always used an engine that handles this. I want to start writing some shaders for my games instead of relying on the cryptic ones that I've borrowed in the past. I think a very strong knowledge in this will help, and I'm a bottom-up kind of learner, so please help me out! I know the GPU Gems series is great, but they seem to be more like a cookbook than the bottom-up approach that I want. You tend to get more scattered theory from cookbooks instead of building on theory from previous chapters. EDIT: Preferably something you've read! I can search Amazon for this, but it's hard to get an unbiased review that way.
Partial Shader Signatures HLSL D3D11 C++ I had been debugging a problem I was having in a single shader file with 2 functions in it. I'm using DirectX 11, vs_5_0 and ps_5_0. I have stripped it down to its basic components to understand what was going wrong with the shaders, because the differently named components of the Pixel and Vertex shaders were swapping the data being input:

    void QuadVertex(inout float4 position : SV_Position,
                    inout float4 color : COLOR0,
                    inout float2 tex : TEXCOORD0)
    {
        // ViewProjection is a 4x4 matrix, just included here to show the simple passthrough of the data
        position = mul(position, ViewProjection);
    }

And a Pixel Shader:

    float4 QuadPixel(float4 color : COLOR0,
                     float2 tex : TEXCOORD0) : SV_Target0
    {
        // color is filled with position data and tex is filled with color values from the Vertex Shader
        return color;
    }

The ID3D11InputLayout and associated C++ code correctly compiles the shaders and sets them up with some simple primitive data:

    data[0].Position.x = 0.0f * 210;
    data[0].Position.y = 1.0f * 160;
    data[0].Position.z = 0.0f;
    data[1].Position.x = 0.0f * 210;
    data[1].Position.y = 0.0f * 160;
    data[1].Position.z = 0.0f;
    data[2].Position.x = 1.0f * 210;
    data[2].Position.y = 1.0f * 160;
    data[2].Position.z = 0.0f;
    data[0].Colour = Colors::Red;
    data[1].Colour = Colors::Red;
    data[2].Colour = Colors::Red;
    data[0].Texture = Vector2::Zero;
    data[1].Texture = Vector2::Zero;
    data[2].Texture = Vector2::Zero;

When used with the shader, the float4 color always ended up with the position data, and the float2 tex always ended up with the color data. After a moment, I figured out that the shader's input and output signatures needed to be in the correct order and the correct format and be laid out in the exact order of the output from the Vertex Shader, regardless of the semantics:

    float4 QuadPixel(float4 pos : SV_Position,
                     float4 color : COLOR0,
                     float2 tex : TEXCOORD0) : SV_Target0
    {
        return color;
    }

After finding this out, my question is: Why don't the semantics map the appropriate components when going from Vertex Shader to Pixel Shader? Is there any way that I can make it so certain semantics are always mapped to other semantics, or do I always have to follow the rigid shader signature (in this case Position, Color, and Texture)? As a side note for why I'm asking: I know that when using XNA, my shader signatures for functions could differ in position and even drop items from Vertex Shader to Pixel Shader function parameters, having only the COLOR0 and TEXCOORD0 components being used (and it would still match up correctly). However, I also know that XNA relied on a DX9 (and maybe a little DX10) implementation, and maybe this kind of flexibility no longer exists in DX11?
WebGL two different approaches to Point Light what is the difference? So I'm studying WebGL and, after directional light, I'm approaching point light. I've seen two different approaches: 1) Take the directional light (diffuse and specular components) and multiply them by a falloff attenuation, for example with the attenuation function by Tom Madams, which in any case takes into account the distance from the fragment surface point to the light point. This approach is shown in this WebGL light walkthrough tutorial, and I find it easier to understand. It also seems to be the same approach as this OpenGL tutorial (in the section Point Light). 2) The other approach listed here does not involve a light attenuation falloff but calculates the light direction from each fragment position and the point light source. It seems a little bit more complicated to me. Is there something that I have totally misunderstood? What is the difference between these two approaches? When to use the first and when to use the second? Thanks!
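For reference, a minimal GLSL sketch of how the two ideas usually combine in practice: the light direction is computed per fragment (the second approach) and the result is scaled by a distance falloff (the first approach). The names and the constant/linear/quadratic coefficients are illustrative, not taken from the linked tutorials.

    uniform vec3 uLightPos;     // point light position in world space
    uniform vec3 uLightColor;
    uniform float kc, kl, kq;   // hypothetical constant/linear/quadratic falloff coefficients

    varying vec3 vWorldPos;
    varying vec3 vNormal;

    void main() {
        vec3 toLight = uLightPos - vWorldPos;          // per-fragment light direction
        float d = length(toLight);
        vec3 L = toLight / d;
        float diff = max(dot(normalize(vNormal), L), 0.0);
        float att = 1.0 / (kc + kl * d + kq * d * d);  // distance falloff
        gl_FragColor = vec4(uLightColor * diff * att, 1.0);
    }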
Subtract waves from tilemap I'm wondering how I could create a shader that would turn a randomly generated shape like this And turn it into something more like this Essentially just creating a top down view of ocean waves that would flow in and out. I would prefer to be able to subtract this shape from a tilemap. Any resources or pseudo code on how I could get started with this would be great.
Fallback for R32F alpha blending I'm rendering an aggregation of values into an R32F format texture, using alpha blending to accomplish the sum operation. R32F alpha blending is not supported on all hardware. What kind of fallback options do I have for accomplishing the addition when the hardware doesn't support blending?
outline object effect How can I achieve an outline effect similar to the ones found in League of Legends or Diablo III? Is it done using a shader? How? I would prefer answers that are not tied up to any particular engine but one that I can adapt to whatever engine I'm working on.
How to create a depth recording shader for shadow mapping? I am attempting to implement shadow maps. I am first going to implement spotlights over directional lights (I am aware this is harder). Since I was already rendering the geometry, my attempt was to recycle my old shaders and modify them to create the depth maps. This resulted in the following vertex shader:

    #version 450

    layout(location = 0) in vec3 position; // (x,y,z) coordinates of a vertex

    layout(std430, binding = 3) buffer data_buffer
    {
        vec4 cubes_info[]; // first 3 values are position of object
    };

    uniform mat4 view = mat4(1);  // Camera orientation and position
    uniform mat4 proj = mat4(1);  // The projection parameters (FOV, viewport dimensions)

    void main()
    {
        gl_Position = proj * view * (vec4(position, 1.0) + vec4(vec3(cubes_info[gl_InstanceID]), 0));
    }

And fragment shader:

    #version 450

    layout(location = 0) out float fragmentdepth;

    void main()
    {
        fragmentdepth = gl_FragCoord.z;
    }

As you can see, both shaders are pretty basic; they are a little more complex than they would otherwise be since they are using instancing, but overall they are simple. However, the output of these shaders is [first screenshot], where the original image is [second screenshot]. As you can see, the depth is being lost. I think this is because my perspective, projection and model matrices are identical for both shaders. In other words, I think that the camera I use for rendering the second image does not store the depth information. I wanted to know: how can you generate a similar projection matrix that stores the depth?
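One thing worth keeping in mind when eyeballing a depth render like this: a perspective depth buffer is non-linear, so most of the scene ends up packed near 1.0 and looks uniformly white even when depth is being written correctly. A quick way to check is to linearize the depth for display; a hedged GLSL sketch, where the near/far values are assumptions standing in for the ones used in your projection matrix.

    // Visualization helper only: remaps non-linear depth (gl_FragCoord.z style)
    // to an approximately linear 0..1 range between the near and far planes.
    float linearizeDepth(float depth, float near, float far)
    {
        float z = depth * 2.0 - 1.0;   // back to NDC
        return (2.0 * near) / (far + near - z * (far - near));
    }

    // e.g. for debugging in the fragment shader above:
    // fragmentdepth = linearizeDepth(gl_FragCoord.z, 0.1, 100.0);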
Why is clip space 3 dimensional? A vertex shader is basically a transformation function that converts a vertex in your world space to a space that can be rendered on screen. Since the screen is a 2-dimensional surface, what's the purpose of the intermediate clip space? Why not have the vertex shader directly transform a 3D vertex to a 2D point on the raster surface?
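For reference, the step the question is asking about is the perspective divide that happens after the vertex shader: the shader outputs a 4-component clip-space position, and the fixed-function stage later divides by w to obtain normalized device coordinates,

$$(x_{ndc},\; y_{ndc},\; z_{ndc}) = \left(\frac{x_c}{w_c},\; \frac{y_c}{w_c},\; \frac{z_c}{w_c}\right),$$

which is also where the retained z and w components are used for depth testing, clipping, and perspective-correct interpolation.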
OpenGL ES 2.0 How to batch draw particles that have unique translations, rotations, scales, and alphas? I've combined all of my vertex data for many particles into a single array. How would I batch draw all of those particles in a manner that preserves their unique translations? Any code examples would be greatly appreciated.
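Since OpenGL ES 2.0 has no instancing, the usual batching trick is to bake the per-particle data into per-vertex attributes (every corner of a particle's quad carries that particle's transform) and apply the transform in the vertex shader. A minimal GLSL sketch of the vertex side; the attribute names are assumptions.

    attribute vec2 aCorner;     // quad corner offset, e.g. (-0.5,-0.5) .. (0.5,0.5)
    attribute vec2 aCenter;     // particle position, duplicated on all 4 corners
    attribute float aRotation;  // per-particle rotation in radians
    attribute float aScale;     // per-particle scale
    attribute float aAlpha;     // per-particle alpha

    uniform mat4 uViewProjection;

    varying float vAlpha;

    void main() {
        float c = cos(aRotation);
        float s = sin(aRotation);
        vec2 corner = mat2(c, s, -s, c) * (aCorner * aScale);  // rotate and scale the corner
        gl_Position = uViewProjection * vec4(aCenter + corner, 0.0, 1.0);
        vAlpha = aAlpha;
    }

The cost is duplicating the per-particle values four times per quad (once per corner), which is usually acceptable for 2D particles and lets the whole array go out in a single draw call.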
Matcap and BRDF Shading I just would like to know what's the difference between the Matcap shaders used in ZBrush, for example, and a Bidirectional Reflectance Distribution Function (BRDF) shader. Are these two techniques the same? Is Matcap done using a BRDF, or are they different?
Reflections based on distance from plane Let's consider, for example, a surface like a volleyball court: we can see that the legs and shoes of the players are reflected, with a blur effect, but the bodies and the stadium are not (nor is any object that isn't near the court). I've already made a reflection effect, but it works as a specular reflection, and I need to achieve an effect like the photo above. So, I would like to make a reflection that is based on the distance between the object and the plane; in this manner a close object would reflect more than an object that is positioned far away from the plane. What is the best way to achieve this effect? My first idea was to use the depth value (taken from the reflected camera) and use that value to blend between reflection and court, but I don't know if that is a correct way. Edit: as rendering engine I use Ogre, which already provides a reflection system that mirrors the camera through a plane (and I can select which models to draw from the reflected camera). After a render-to-texture pass I can blend the reflected texture with the original plane. So, if possible, I'm looking for a way that best suits my system.
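The idea sketched in the question (fade the reflection using data from the reflected camera) would look roughly like this in GLSL terms. This is a sketch only: it assumes the reflected-camera pass also stored a linear distance-to-plane (or linearized depth) per pixel, and all uniform names and the falloff distance are illustrative, not Ogre-specific.

    uniform sampler2D uCourtTex;        // base court texture
    uniform sampler2D uReflectionTex;   // render-to-texture result from the reflected camera
    uniform sampler2D uReflectionDist;  // assumed: distance of reflected geometry from the plane
    uniform float uFadeDistance;        // how far above the plane reflections stay visible

    varying vec2 vUv;         // UV into the court texture
    varying vec4 vProjCoord;  // projective coords into the reflection texture

    void main() {
        vec2 reflUv = vProjCoord.xy / vProjCoord.w * 0.5 + 0.5;
        vec3 court = texture2D(uCourtTex, vUv).rgb;
        vec3 refl  = texture2D(uReflectionTex, reflUv).rgb;

        // Objects close to the plane reflect strongly, far ones fade out.
        float dist = texture2D(uReflectionDist, reflUv).r;
        float strength = clamp(1.0 - dist / uFadeDistance, 0.0, 1.0);

        gl_FragColor = vec4(mix(court, refl, strength * 0.5), 1.0);
    }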
Bilinear filter in repeating texture, HLSL I have a repeating texture that I'm using as a scroll surface. The idea is that as I pan the surface I adjust the texture coordinates, filling in what gets wrapped on the right as it disappears from the left. This is all working quite well. When bilinear filtering is switched on (D3DTEXF LINEAR) I think the repeat filters the edge of the texture with the corresponding pixels on the other side given the wrap. This is wrong of course. I've had a look around for bilinear filtering shaders that ignore edge texels but can't seem to find anything. Does anyone have an ideas about how to do this? Is there a way of configuring a sampler to ignore edge texels when filtering? Is there a bilinear filtering shader knocking around I can use as a reference?
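One approach sometimes used when the wrap-around filtering at a tile edge is unwanted is to keep the wrap addressing but clamp the coordinate actually sampled to half a texel inside the tile, so the bilinear footprint never straddles the seam. A hedged sketch in GLSL terms (the HLSL version is the same math with Texture2D.Sample); this trades the seam blend for a one-texel-wide clamp at each tile border.

    uniform sampler2D uTex;
    uniform vec2 uTexSize;   // texture dimensions in texels

    vec4 sampleNoEdgeBleed(vec2 uv)
    {
        vec2 halfTexel = 0.5 / uTexSize;
        // Wrap manually, then keep the lookup away from the outermost texel
        // row/column so the filter never mixes in texels from the opposite edge.
        vec2 clamped = clamp(fract(uv), halfTexel, vec2(1.0) - halfTexel);
        return texture2D(uTex, clamped);
    }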
Disney BRDF where is metallic factor input into BRDF? Burley states about metallic parameter "This is a linear blend between two different models. ..." What are the two models? I can't see this described in the Frostbite or the Unreal papers either. Is it simply a blend between the diffuse and specular terms or a scaling factor on the specular term? Disney Paper
Is there a successor to RenderMonkey? I'm starting with GLSL shader programming and have been looking into RenderMonkey. Sadly, AMD no longer supports it. Why? Is there a successor to it?
Issues compiling .fx shader to MGFX I've been trying to port Catalinzima's 2D lighting example over to MonoGame to try and get some basic understanding around it and adapt it for use in my engine. I've been trying to convert the two .fx files provided with it to MonoGame-compatible .mgfxo files. The first file ran through the converter and compiles successfully. However, the second file ran into issues converting (cmd dump):

    C:\Program Files (x86)\MSBuild\MonoGame\v3.0>2MGFX.exe resolveShadowsEffect.fx /DEBUG
    C:\Program Files (x86)\MSBuild\MonoGame\v3.0\resolveShadowsEffect.fx(213,8): error X5608: Compiled shader code uses too many arithmetic instruction slots (85). Max. allowed by the target (ps_2_0) is 64.
    (1,1): error X5609: Compiled shader code uses too many instruction slots (99). Max. allowed by the target (ps_2_0) is 96.
    Failed to compile the input file 'resolveShadowsEffect.fx'

So I thought fine, I'll just change the PixelShader model to ps_3_0. Now this converts fine, but fails to compile in Visual Studio with the following errors: "A namespace cannot directly contain members such as fields or methods (1, 1)" and "Unexpected character '' (1, 5)". In the Visual Studio error message this character shows as ' ' but it's not the same when I paste it out. I've tried comparing the first lines of the files, but have not been able to notice any differences. I'm pretty new to using shaders, so I haven't been able to debug what's going on effectively. Any ideas? You can find the original shader files and compiled versions here: http://puu.sh/9ZfMe/efe8fbaf9d.zip
Unreal 4 why does reducing scalability take so long to compile? I have almost nothing in my scene in fact, I reduced my scalability settings from "High" to "Medium" before even loading my level layout and doing this has crashed the program multiple times. It didn't crash this time, but it is telling me it is compiling over 5000 shaders (!) and my objects' materials are not showing up in the viewport. 1) that is a ton of shaders, what shaders are these? I only made like 8 shaders. 2) What is making this take so long? it is taking forever, over 10 minutes already. 3) Is this normal?
How do multiple shaders in a DirectX Engine work? As I have been learning DirectX 11, I have been looking at a few tutorials online, and in the sample code provided there are more shaders than the usual pixel and vertex shaders (along with the geometry shaders etc.). Based on my understanding there should be a minimum of 2 shaders (vertex & pixel) which work together. With these additional shaders, how do they work with the vertex and pixel shaders? In these additional shaders there are also PS and VS functions, so could someone explain what's going on here? As soon as I call the return function in one shader, is the data passed to the next?
Loading a vertex shader compiled by Visual Studio 2012 I've got an extremely simple vertex shader that Visual Studio 2012 compiles into a .cso file. Now I want to load this file and create a vertex shader on the graphics device using the ID3D11Device::CreateVertexShader function. So far I have the following code:

    ifstream vs_stream;
    size_t vs_size;
    char* vs_data;

    vs_stream.open(vsCompiledPath, ifstream::in | ifstream::binary);
    if (vs_stream.good())
    {
        vs_stream.seekg(0, ios::end);
        vs_size = size_t(vs_stream.tellg());
        vs_data = new char[vs_size];
        vs_stream.seekg(0, ios::beg);
        vs_stream.read(&vs_data[0], vs_size);
        vs_stream.close();

        result = device->CreateVertexShader(&vs_data, vs_size, 0, &m_vertexShader);
        if (FAILED(result))
            return false;
    }
    else
    {
        return false;
    }

I seem to be able to load the file fine (I get sane values in vs_size and vs_data), however CreateVertexShader returns E_INVALIDARG and sets m_vertexShader to null. How can I fix this error? Note that I can't use anything in the d3dx headers since they are not supported in the Windows 8 App Store.
I don't think my shaders are working, looking for help Using SharpDX (DirectX 11), developing on UWP. This is a link to a previous question about not being able to compile the shader files (written in HLSL): How to compile shader files in UWP. Later I found a method to avoid the original exception (unable to open or find shader files), but the shaders still seem not to be working. My method to avoid the original exception is to change the file path to:

    System.IO.Path.Combine(Windows.ApplicationModel.Package.Current.InstalledLocation.Path, "ProjectName")

If the method works, I will surely post it as an answer to the previous question. The current problem is that there is no display on the screen (all black), while I can check the stored data in the vertex buffer using the graphics diagnostics tools built into Visual Studio. Here is my code for the shaders; the problem may well start here because this is the first shader file I've written. Please have a look and discuss what may be going wrong.

Pixel shader file Pixel_PS.fx:

    struct VertexOut
    {
        float4 PosH : SV_POSITION;
        float4 Color : COLOR;
    };

    float4 PS(VertexOut pin) : SV_Target
    {
        return pin.Color;
    }

Vertex shader file Transf_VS.fx:

    cbuffer dataBuffer : register(b0)
    {
        matrix ViewProjection;
    };

    struct VertexIn
    {
        float3 PosL : POSITION;
        float4 Color : COLOR;
    };

    struct VertexOut
    {
        float4 PosH : SV_POSITION;
        float4 Color : COLOR;
    };

    VertexOut VS(VertexIn vin)
    {
        VertexOut vout;
        // Transform to homogeneous clip space.
        vout.PosH = mul(float4(vin.PosL, 1.0f), ViewProjection);
        // Just pass vertex color into the pixel shader.
        vout.Color = vin.Color;
        return vout;
    }

Where I compiled them:

    byte[] vertexShaderByteCode = ShaderBytecode.CompileFromFile(this.path + "Transf_VS.fx", "VS", "vs_5_0");
    this.vertexShader = new D3D11.VertexShader(device, vertexShaderByteCode);
    this.inputSignature = new ShaderSignature(vertexShaderByteCode);
    this.pixelShader = new D3D11.PixelShader(device,
        ShaderBytecode.CompileFromFile(this.path + "Pixel_PS.fx", "PS", "ps_5_0"));

Data structure definition:

    public struct ScatterVertex
    {
        SharpDX.Vector3 Position;
        SharpDX.Color4 Color;
    }

Input layout initialization:

    this.inputLayout = new D3D11.InputLayout(
        this.device,
        inputSignature,
        new SharpDX.Direct3D11.InputElement[]
        {
            new D3D11.InputElement("Position", 0, SharpDX.DXGI.Format.R32G32B32_Float, 0),
            new D3D11.InputElement("Color", 0, SharpDX.DXGI.Format.R32G32B32A32_Float,
                // I'm not very sure whether or why I should use this option
                0)
        });

Here's the projection matrix in the camera class, which is later sent to the constant buffer:

    private Matrix View { get { return Matrix.LookAtLH(this.Eye, this.Target, this.Up); } }
    private Matrix Proj { get { return Matrix.PerspectiveFovLH(this.Fov, this.Aspect, this.Near, this.Far); } }
    public Matrix WorldViewProject
    {
        get
        {
            Matrix wvp = (this.View * this.Proj);
            wvp.Transpose();
            return wvp;
        }
    }

If you need anything else to solve this, just let me know and I'll manage to get it. Thank you!
What is the minimum of shaders I need to use to run basic calculations on the GPU? I read that the Hull Shader, Domain Shader, Geometry Shader and Pixel Shader are optional. So, is the Vertex Shader optional too? If not: What does a basic Vertex Shader look like? Just a simple pass-through? Is the Vertex Shader necessary to tell what kind of data structure (fans, strips or meshes) is used? What can I do with just the vertex shader? Do the fixed functions work without any help from a programmable stage?
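For reference, a pass-through vertex shader really is as small as it sounds; a minimal GLSL sketch (the question is D3D-flavoured, but the idea is identical in HLSL):

    #version 330 core
    layout(location = 0) in vec3 aPosition;

    void main()
    {
        // No transformation at all: positions are assumed to already be in clip space.
        gl_Position = vec4(aPosition, 1.0);
    }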
Why would an ambient occlusion (AO) shader's performance be dependent on light direction? One of my favourite games recently implemented ambient occlusion as a graphics feature, which appears to perform very well in most circumstances except during sunrise and sunset. As someone who is getting into shader programming, I'm intrigued by this. My understanding of AO is that you precompute the diffuse shading for a range of incident vectors, then use that information to produce a correct shadowing scale for real time use by enumerating over all the lights and computing how their position in relation to the object compares to each of the incident vectors, essentially interpolating across the closest values. Since the occlusion is pre computed against a model, taking complex shapes and self occlusion into account, you don't need to raytrace and the performance is much better. So here's where I get lost since the diffuse map is precomputed, and you're just comparing light positions to those maps for all objects within the screen space, why would performance suffer for cases where the ambient light (the sun) is close to the horizon? It seems like this shouldn't matter surely the performance of such a shader scales against vertices, not the angle of the light? The game uses the latest version of Unity (they just bumped up a minor release) but I don't know if they use the built in Unity SSAO shader. Performance hit is major 28fps at 1440p during "day" hours, dropping to 12 14fps during sunrise set. This only occurs when AO is enabled. I'm not interested in finding a solution I'm sure they'll fix it sooner or later I'd just like to get a better idea of how AO performance scales, what its pain points are, and why this kind of behaviour might appear.
Making a crosshatching effect in Unity Shader Graphs I'm learning Shader Graph and am trying to experiment with toon shading effects. One thing I'd love to do is make a traditional art styled crosshatching effect either inside shadows or at the edge of shadows. However, I'm lost as to how to do that. I have a sample crosshatch texture I can use as a texture2D but I'm unsure how to use this as an input in shadows and to which nodes to hook it up to. I'm very new to dabbling in shaders so I'd appreciate any tips.
Make part of albedo transparent I have a shader which creates a circle inside of a plane mesh. I would like to get rid of the parts around the circle, which are the r and b parts of the ALBEDO, but I can't seem to figure out how to do it. The only thing I've managed to find is ALPHA, but that changes the transparency of the entire shader and not just parts of it.

    shader_type spatial;

    float circle(vec2 position, float radius, float feather)
    {
        return smoothstep(radius, radius + feather, length(position - vec2(0.5)));
    }

    void fragment()
    {
        ALBEDO = vec3(0, circle(UV - vec2(0), 0.5, 0.005), 0);
    }

Which currently looks like [screenshot].
How many Pipelines in a Typical Rendered Scene DirectX12 I'm learning DirectX12 right now and I'm missing a few pieces of the puzzle in my own head on the overall structure of how you would setup a game. Specifically, I'm trying to get an idea of how shaders are processed through the pipeline. So in DirectX12, there seems to be a lot of work to setup a pipeline state object (PSO) and then you can assign one each of the following vertex, pixel, domain, hull, and geometry shaders. Well, it seems like there are probably many different shaders being rendered at one given time, so you would have your water shaders, glass shaders, ground, etc.. and you would need a separate pixel shader for each of these correct? So if you had about 50 pixel shaders running on a particular frame, you would need to create and configure another 50 pipeline objects? This seemed like overkill to me, so I'm just trying to figure out how this would work in a real world game engine. Thanks!
OpenGL GLSL LWJGL return value from shader I'm trying to do some bone animation. I don't want the whole skeleton to be loaded to the GLSL shader every time, even though it's not needed, because one vertex is parented to one bone, not more. My question is simple: (how) can I return some value from the vertex shader back to my code? I'm using LWJGL (Java).
Fragment shader operations before vertex transformations I feel like I'm misunderstanding how to work with vertex and fragment shaders. My vertex shader is as follows:

    uniform mat4 uVMatrix;   // view (camera transformations)
    uniform mat4 uMMatrix;   // model (object transformations)
    uniform mat4 uPMatrix;   // projection

    attribute vec4 aVertexPosition;   // passed in
    attribute vec4 aVertexColor;

    varying vec4 vColor;

    void main() {
        gl_Position = uPMatrix * uVMatrix * uMMatrix * aVertexPosition;
        vColor = aVertexColor;   // pass the vertex's color to the fragment shader
    }

Pretty simple. Right now I just have a simple square that I'm transforming in 3D space and drawing. I want to make a see-through circle in the middle of the square, as follows [image]. This square has 4 vertices that I transform in the vertex shader. Here's my fragment shader:

    precision mediump float;   // how precise to be with floats

    varying vec4 vColor;       // interpolated from the vertices

    void main() {
        gl_FragColor = vColor;
    }

Now, I've seen I have access to gl_FragCoord, but that coordinate is after all the vertex transformations, right? How can I manipulate the pixels in the square before all the projection transformations, etc.? I don't think I can do it in the vertex shader, as there are only 4 vertices...
Gamma adjustment slider implementation Various online sources talk in sufficient detail about gamma correction. By following them, I achieved a rendering pipeline that looks somewhat like this:

    // Both are set to 2.2
    uniform float gammaIn;
    uniform float gammaOut;

    void main()
    {
        vec3 color = pow(texture(material.albedo, texCoord).rgb, vec3(gammaIn));

        // Do lighting and then HDR tone mapping here.
        color = ...;

        FragColor = pow(color, vec3(1.0 / gammaOut));
    }

That is the easy part. Now, many games implement a gamma adjustment slider that is meant to accommodate display differences between the monitors that players may have. This raises some questions that I couldn't find any definitive answer to: 1) Which value should the adjustment slider affect? I reckon it will be gammaOut, because gammaIn deals with decoding of the sRGB picture, and assuming that all textures are sRGB, this should always be the constant 2.2. With this approach a lower gammaOut means a darker picture. 2) What's a reasonable value range for the slider? Should it start at 1.0 or somewhere higher? Where should it end? 3) If I were to display the "barely visible / invisible" comparison picture to help the user with doing the adjustment correctly, what should be the color values for the "should be barely visible" dark picture and its background, and ditto for the "should be invisible" bright picture and its background?
Highlighting shader but using touch points mouse I have done the following highlighting shader using GLSL. What I want to do is getting the mouse coordinates and "start" the highlighting from that point. is that possible ? What's the math or the idea behind doing that ? https youtu.be 8etJho4agpg
Can't import shaders textures to Godot from Blender 2.8 I'm using this shader for my models and then I import them like this [screenshots]. This is the original model in Blender, and this is the result in Godot. I think I might have missed some step, but I'm not sure which one. I followed the tutorial video on how to import, and other similar guides, but the result is the same. Any ideas of what I'm doing wrong? It looks like my model is not importing the shader and textures correctly.
Linking one uniform variable to many shaders Let's say that I have 3 programs, and in each of those programs there is a view matrix uniform, which should be the same in all those programs. Right now, when my camera moves, I need to re-upload the modified matrix to every program separately. Is it possible to create some kind of global uniform which is constant for all programs linked to it, so I could just upload the matrix once? I tried creating a globalUniforms object which looked kinda like this:

    var globalUniforms = {
        program: null,
        // (...)
        vMatrixUniform: null,
        // (...)
        initialize: function() {
            vMatrixUniform = gl.getUniformLocation(this.program, 'uVMatrix');
        }
    };

So I could just link it to the proper programs like this:

    program.vMatrixUniform = globalUniforms.vMatrixUniform;

and then pass the matrix like this:

    if (camera.isDirty.viewMatrix !== false) {
        camera.isDirty.viewMatrix = false;
        gl.uniformMatrix4fv(globalUniforms.vMatrixUniform, false, camera.viewMatrix.element);
    }

but unfortunately it throws an error:

    Uncaught exception: gl.INVALID_VALUE was caused by call to getUniformLocation
    called from line 272, column 2 in () in mysite/js/mesh.js:
    vMatrixUniform = gl.getUniformLocation(this.program, 'uVMatrix')

Summing up: is there a more efficient way of managing shaders which follows my logic?
Water cut off effect in UE4 Working on a side scroller game in Unreal Engine 4 and I need to create a water cut off effect like seen in the screenshot of Ori and the Blind forest. Basically need a side perspective on water bodies like lakes and ponds where I can see objects underwater as well as above water. Water may be subject to ripples and disturbances (caused by player or fish swimming in the water). Thanks
How can I pass a texture to a custom deferred lighting model in Unity? I've replaced the Internal DeferredShading.shader with my own shader, and it's working fine, but I want to add a uniform texture for it to sample from. I've tried adding a texture as a property and assigning it a default value in the inspector but at run time the shader still uses the default value given in the shader (e.g. white). Is it possible to do this?
Strange depth map projection I'm trying to implement depth only SSAO and for that, I render a depth map into a texture and pass it to my SSAO shader which then uses it. The problem is that when I try to output the depth map values from SSAO fragment shader (for testing purpose), I get something really weird. Here is the result And, here is how it normally looks rendered before giving it to the SSAO shader The depth buffer seems OK, so I guess it comes from a transformation done in the SSAO shader. Here are the shaders (I'm using the bgfx library, but the shader language is very similar to GLSL) Here is the way I output the depth from my SSAO for testing Vertex Shader input a position, a texcoord0 output v texcoord0 include ".. common common.sh" void main() gl Position mul(u modelViewProj, vec4(a position, 1.0) ) v texcoord0 a texcoord0 Fragment Shader input v texcoord0 include ".. common common.sh" SAMPLER2D(s depth, 0) float readDepth( in vec2 coord ) if BGFX SHADER LANGUAGE HLSL float z texture2D( s depth, coord ) else float z texture2D( s depth, coord ) 2.0 1.0 endif BGFX SHADER LANGUAGE HLSL return z void main() float depth readDepth( v texcoord0 ) ... Some computation ... gl FragColor vec4(vec3 splat(depth), 1) And here the shader for storing the depth into a texture Vertex Shader input a position include ".. common common.sh" void main() gl Position mul(u modelViewProj, vec4(a position, 1.0) ) Fragment Shader void main() gl FragColor gl FragCoord.z gl FragCoord.w My guess is that I may be projecting the depth texture in a wrong way. Thanks in advance for any help.
How do I control which calculations are done on the CPU and which are done on the GPU? My current understanding is that anything done in a shader file is done on the GPU, and anything done in my (Java, in my case) code is done on the CPU. Is this an accurate description?
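As a concrete illustration of the split described above: anything computed in your Java code and handed over via a uniform runs once on the CPU, while anything inside the shader body runs on the GPU for every vertex or fragment it is invoked for. A tiny GLSL example, where the uniform stands in for a value the Java side computed and uploaded:

    // CPU side (e.g. Java/LWJGL): computed once per frame and uploaded with glUniform1f.
    uniform float uTimeSeconds;

    varying vec2 vUv;

    void main() {
        // GPU side: this runs once for every fragment that gets rasterized.
        float pulse = pow(sin(uTimeSeconds) * 0.5 + 0.5, 2.0);
        gl_FragColor = vec4(vec3(pulse) * vUv.x, 1.0);
    }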
Fresnel shader excluding one axis I'm trying to get an outline like effect shown on the right in this video. Using fresnel seems to make the whole thing go white at certain angles. How do I prevent that? I'm using amplify to make this. Here's my current setup
HLSL equivilant to "Object" data from "Texture Coordinate" node in Blender I mocked up a shader how I wanted it with the node editor in Blender. Now I'm trying to write it in HLSL. In Blender there is a node group called "Texture Coordinate". If I use the "uv" node from the group it behaves like a normal unlit frag shader but if I use the "object" node to give coordinates to the texture, it ignores the uv data and just maps the texture like an image overlayed onto the object. This is actually the effect I want. However, I can't find a way to replicate this in HLSL. As far as I can see I can use TEXCOORD0 and POSITION as texture coordinates to produce uv mapping and world mapping respectively for a texture onto an object. Maybe what I want is object mapping? And if it matters I'm using a generated texture
FXC Error X3501 'main' entrypoint not found I am trying to compile a vertex shader using VS2013, but every time I try, FXC returns the following error:

    error X3501: 'main': entrypoint not found

I've reduced the vertex shader to its simplest form and yet I'm still getting the same result.

DefaultVS.hlsl:

    #include "Include.hlsl"

    cbuffer CameraTransform
    {
        float4x4 ViewProjMat;
    };

    VS_OUT main(VS_IN input)
    {
        VS_OUT result;
        result.Position = mul(input.Position, mul(input.WorldMat, ViewProjMat));
        return result;
    }

Include.hlsl:

    struct VS_IN
    {
        float4 Position : POSITION;
        float4x4 WorldMat : INSTANCE_TRANSFORM;
    };

    struct VS_OUT
    {
        float4 Position : SV_POSITION;
    };

And the properties of both files:

    /Zi /E"main" /Od /Fo"Path\To\Output\DefaultVS.cso" /vs"5_0" /nologo
    /Zi /Od /Fo"Path\To\Output\Include.cso" /nologo
How can I remove branches from a fragment shader function? I have a fragment shader where I've carefully managed to remove most branching decisions, as I have found out through research here that they are bad. But I have one function that I just can't work out how to do without them. The function takes in an HSV vector and 'expands' it. By that I mean any Value (the V of HSV) from 0.0 to 0.5 gets expanded to 0.0 to 1.0. Any Value from 0.5 to 1.0 gets desaturated, so that it becomes more white. At V = 1.0 any input colour is white. Here is my working function. Can anyone help me de-'if' it?

    // expand doubles the level up to 0.5 -> 1.0
    // for over 0.5 the saturation is reduced from where it is now (V = 0.5) to 0.0 (V = 1.0)
    vec3 expand()
    {
        vec3 expanded;
        float newSaturation;
        float newValue;
        vec3 HSV = rgb2hsv(inputColor);   // x = Hue, y = Sat, z = Val, w = bugger all

        if (HSV.z < 0.5)
        {
            newValue = HSV.z * 2.0;   // double value: 0.0 to 0.5 -> 0.0 to 1.0
            newSaturation = HSV.y;    // no need to touch saturation
        }
        else
        {
            newValue = 1.0;           // value is max
            // need to de-saturate proportional to level over 0.5: (1.0 - ((v - 0.5) * 2)) * s
            newSaturation = (1.0 - ((HSV.z - 0.5) * 2.0)) * HSV.y;
        }

        expanded = hsv2rgb(vec3(HSV.x, newSaturation, newValue));
        return expanded;
    }
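Since the two branches only differ in how newValue and newSaturation are derived, one common branch-free pattern is to compute both candidates and blend with step()/mix(). A hedged sketch of that idea, assuming the same rgb2hsv/hsv2rgb helpers and inputColor as above:

    vec3 expandNoBranch()
    {
        vec3 HSV = rgb2hsv(inputColor);

        // 0.0 when V < 0.5, 1.0 otherwise.
        float upper = step(0.5, HSV.z);

        // Candidate results for both halves of the range.
        float lowValue  = HSV.z * 2.0;
        float lowSat    = HSV.y;
        float highValue = 1.0;
        float highSat   = (1.0 - (HSV.z - 0.5) * 2.0) * HSV.y;

        float newValue      = mix(lowValue, highValue, upper);
        float newSaturation = mix(lowSat,   highSat,   upper);

        return hsv2rgb(vec3(HSV.x, newSaturation, newValue));
    }

Whether this actually beats the original depends on the hardware; on modern GPUs a small coherent branch is often no slower, so it is worth profiling both.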
Curved Meters and Gauges I'm wondering how people here on GameDev stack exchange would handle curved meter GUI elements for things such as life bars or energy bars. My thought on the matter was that you could use a shader with a cutoff value and an image which has one channel dedicated to masking the image (alpha) and one that also has a gradient which I compare to a uniform float to determine whether or not the fragment should be fully transparent. However, I've been running into some strange behavior when writing this shader. Specifically, the shader's output has a weird artifact where the cutoff begins that looks almost like ripped paper the line that indicates the end of the meter has a sloppy contour. This image has some distortion effect going on with the pixels where the lifebar is supposed to end via the cutoff uniform. There's definitely got to be a better way of doing this same thing in a more tactful way. Shader code is below CGPROGRAM pragma vertex vert pragma fragment frag sampler2D MainTex float4 Color float Cutoff struct Vert IN float4 loc POSITION float4 texcoords TEXCOORD0 struct Frag IN float4 pos SV POSITION float4 uv TEXCOORD0 Frag IN vert( Vert IN input ) Frag IN output output.uv input.texcoords output.pos mul( UNITY MATRIX MVP, input.loc ) return output float4 frag( Frag IN input ) COLOR float4 value Color float4 valueFromMask tex2D( MainTex, input.uv.xy ) value.w valueFromMask.w float desiredTransparency step( valueFromMask.x, Cutoff ) value.w min( value.w, desiredTransparency ) return value ENDCG (Again, the red channel of this image is actually a gradient mask used to determine where the cutoff point should be. Other channels were going to be used for something else (like special scrolling patterns or what not) ) Example of the asset used in the shader I'm assuming there's probably something I'm missing when using the step function that could help ease the fade dropoff in order to make a better looking end result? What do you think is a good method for making curved meter GUI elements? Is there something wrong with the shader code presented above that causes this strange page tearing artifact?
DirectX 11, using Tessellation Geometry shader in a single pass Before all, sorry for my poor english ! With DirectX 11, i'm trying to create a random map full with GPU. Using Hull shader stage, I'm managing LOD with tessellation. Using Domain shader stage, I'm generating the map (based on perlin noise). Now my goal, is to compute normals in the geometry shader (normal on vertex). For that, I must use vertex adjency, like geometry is capable of. But here is the problem... For tessellation, my primitives must be D3D11 PRIMITIVE TOPOLOGY 3 CONTROL POINT PATCHLIST. But for geometry shader with 6 vertex (triangle primitive and adjency), I must use D3D11 PRIMITIVE TOPOLOGY TRIANGLELIST ADJ. Think I'm missing something... It must be possible to tessellate and use the results in the geometry shader... However, it's working with 3 points, but I cannot use the 3 others (they are 0.0, 0.0, 0.0).... Thank you in advance for any help )
HLSL How to flip geometry horizontally I want to flip my asymmetric 3D model horizontally in the vertex shader along an arbitrary plane parallel to the YZ plane. This should switch everything for the model from the left-hand side to the right-hand side (like flipping it in Photoshop). Doing it in the pixel shader would be a huge computational cost (extra RT, more fullscreen samples...), so it must be done in the vertex shader. Once more: this is NOT reflection, I need to flip THE WHOLE MODEL. I thought I could simply do the following: turn off culling, then run the following code in the vertex shader:

    input.Position = mul(input.Position, World);
    // World[3][0] holds the x value of the model's pivot in the World.
    if (input.Position.x < World[3][0])
        input.Position.x = World[3][0] + input.Position.x;
    else
        input.Position.x = input.Position.x - World[3][0];
    ...

The model is never drawn. Where am I wrong? I presume that messes up the index buffer. Can something be done about it? P.S. it's INSANELY HARD to format code here. Thanks to Panda I found my problem. SOLUTION: Do this before anything else in the vertex shader to invert along the object's YZ plane:

    Position.x *= -1;
PBR texture conversion Is it possible to convert from specular glossiness to metallic roughness textures for UE4?
D3D12 ConstantBuffer Shader receives wrong values I'm having trouble with one constant buffer:

    struct CameraConstData
    {
        urd::Matrix projection;   // 64 bytes (16 floats)
        urd::Matrix view;         // 64 bytes (16 floats)
        urd::Vec3 viewPosition;   // 12 bytes (3 floats)
        urd::Vec3 viewDir;        // 12 bytes (3 floats)
        // 104 bytes (26 * 4), float offset 26
    };

Inside the shader it's defined like:

    // desc heap cbv
    cbuffer CameraConstBuffer : register(b0)
    {
        float4x4 projectionMatrix;
        float4x4 viewMatrix;
        float3 viewPos;
        float3 viewDir;
    }

Now viewDir is always filled with the wrong buffer indices: viewDir.x is the passed y value, viewDir.y is the passed z value, and viewDir.z is 0.0. I checked it in the shader like:

    float3 vpos = normalize(float4(viewDir, 0.0)).xyz;
    float value = vpos.x;   // uses the passed y value
    float value = vpos.y;   // uses the passed z value
    color = float4(vpos, 1.0);

GPU debugging shows that the constant buffer looks fine, and I checked the vectors against the CPU side and they match. So why is the shader reading indices 36-38 instead of 35-37? It's always affecting the 4th member of the struct. If I switch viewPosition with viewDir, viewPosition is wrong instead.
How can I run a shader over the entire screen without interfering with other running programs? How can I run a shader over the entire screen without interfering with other running programs? Specifically, I'd like to adjust the screen output with usability tweaks for my severely colorblind nephew. He has trouble playing certain games. I'd like to avoid hooking into DirectX, and just run a shader over the entire screen. Is this something I can do in Windows without draining an unreasonable amount of resources? I can use whatever language or tool is most practical for this, but I'm most comfortable with Java and HLSL.
What game development tool can I use on a computer that does not support shaders? I tried SFML, but found out that my computer does not support shaders. Given this restriction, what could I use instead?
Why are faces being drawn like this with my custom shader? I've been writing a custom surface shader which allows for vertex colors (with alphas) to be set programmatically. From test runs of the shader itself, it works perfectly fine, and I am able to set the color of vertices programmatically. Unfortunately, I ran into some other issues. Currently, the shader is rendering mesh faces quite strangely, and I cannot figure out why The shader code itself currently looks like this Shader "Custom VertexColorSurface" Properties Color ("Color", Color) (1,1,1,1) MainTex ("Albedo (RGB)", 2D) "white" SubShader Tags "RenderType" "Cutout" CGPROGRAM pragma surface surf Lambert alpha pragma target 3.0 struct Input float2 uv MainTex float4 color COLOR sampler2D MainTex fixed4 Color void surf(Input IN, inout SurfaceOutput OUT) OUT.Albedo tex2D( MainTex, IN.uv MainTex).rgb Color.rgb IN.color.rgb OUT.Alpha tex2D( MainTex, IN.uv MainTex).a Color.a IN.color.a OUT.Specular 0.2 OUT.Gloss 1.0 ENDCG FallBack "Diffuse" Currently the two solutions I've tried have been using ZWrite On and ColorMask 0 in a Pass block, and while both of these methods fix the face rendering issues, they totally screw up alpha blending, which is something I need. What might be causing this issue, and how do I fix it?
Why aren't my 2D primitives visible using a custom effect? I'm working with Monogame and rendering a triangle using the following code. vertices new new VertexPositionColor(new Vector3(100, 200, 0), Color.White), new VertexPositionColor(new Vector3(200, 100, 0), Color.White), new VertexPositionColor(new Vector3(300, 200, 0), Color.White), graphicsDevice.DrawUserPrimitives(PrimitiveType.TriangleList, vertices, 0, 1) Ordinarily I'd apply a BasicEffect before the draw call, but this time I'm using a custom effect. For testing, I wrote the following shader code to return solid green for each pixel within the triangle. struct VertexShaderInput float4 Position POSITION float4 Color COLOR struct VertexShaderOutput float4 Position POSITION float4 Color COLOR VertexShaderOutput VertexShaderFunction(VertexShaderInput input) VertexShaderOutput output output.Color input.Color output.Position input.Position return output float4 PixelShaderFunction(VertexShaderOutput input) COLOR return float4(0, 1, 0, 1) technique Technique0 pass Pass0 VertexShader compile vs 4 0 level 9 1 VertexShaderFunction() PixelShader compile ps 4 0 level 9 1 PixelShaderFunction() The problem is that, although the shader compiles successfully, nothing shows up. Why is my triangle invisible using this shader? Based on my understanding of vertex and pixel shaders, the above HLSL code should work correctly. In the vertex shader, since (for the moment) I'm working in strictly 2D without a camera, no transformations need to be done on the vertices (i.e. they're already in screen space). From there, the pixel shader simply returns solid green for each pixel. I've also ruled out potential backface culling issues by swapping the order of vertices, and I've tried using different semantics on my HLSL structure variables. No luck. Shaders still feel uncomfortable to me, but I feel my logic is correct based on what I've read. If that's the case, there must be some weird quirk I'm not aware of in having these HLSL shaders work at all, apart from the logic. What am I missing? Is there a better way to debug these problems short of changing random variables and hoping for the best?
How many active shaders at one frame in the game should I typically use? 5 or more like 100? How many shaders are usually active, at the same time in one scene, in modern games? I know that multiple shaders are being used, with the games switching between them in each frame, and it's common to draw objects via the shader Draw all objects with shader one Change from shader one to shader two Draw all objects with shader two Still, I know it's not as simple, especially with effects like a glow effect for whole scene, render to texture, etc., but I guess we can assume it works that way most of the time, right? The "group by shader" approach is good, because switching shaders is an expensive operation. From one side, you cannot have too many shaders, because you want to render the scene fast. On the other hand, you need many different shaders (or uber shader with branches quite similar) for skin, metal, water etc. How many (and which) different shaders would the theoretical, modern, third person, 3D detective game for PC (DirectX 11, if it matters) use? It would be 5, 20 or more like 100 active shaders, counting only active, at some "frame X"? I know it's not one number, but I wonder what scale and factors are important, in consideration for a PC game. In my sample game, I would use about 9 11 per frame (count it as different, small shaders or one uber shader doesn't matter now) Skin shader Eye shader (not too much? but they are different) Metal shader Ground shader Snow rain shader (if required) Water shader (if the water exists in scene) Glow shader (only when some special effects are involved) Light emiter shader (street lamps etc.) Standard shader (for all other, just standard shading) Standard shader with normal maps 2D shader (for GUI etc.) Is it "much" or "not many"? Did I forget about some important shaders that I would need?
14
PBR texture conversion Is it possible to convert from specular glossiness to metallic roughness textures for UE4?
14
Special relativity shader in GLSL I'm trying to implement a GLSL shader which helps understanding special relativity Lorentz Transformation. Let's take two axis aligned inertial observer O and O' . The observer O' is in motion w.r.t observer O with velocity v (v x,0,0). When described in terms of O' coordinates, an event P' (x',y',z',ct') has transformed coordinates (x,y,z,ct) L (x',y',z',ct') where L is a 4x4 matrix called Lorentz transformation which helps us writing the coordinates of event P' in O coordinates. (for details look http en.wikipedia.org wiki Lorentz transformation Boost in the x direction) I've wrote down a first preliminary vertex shader that apply the Lorentz transformation given the velocity to every vertex, but I can't get the transformation to work correctly. vec3 beta vec3(0.5,0.0,0.0) float b2 (beta.x beta.x beta.y beta.y beta.z beta.z ) 1E 12 float g 1.0 (sqrt(abs(1.0 b2)) 1E 12) Lorentz factor (boost) float q (g 1.0) b2 http en.wikipedia.org wiki Lorentz transformation Matrix forms vec3 tmpVertex (gl ModelViewMatrix gl Vertex).xyz float w gl Vertex.w mat4 lorentzTransformation mat4( 1.0 beta.x beta.x q , beta.x beta.y q , beta.x beta.z q , beta.x g , beta.y beta.x q , 1.0 beta.y beta.y q , beta.y beta.z q , beta.y g , beta.z beta.x q , beta.z beta.y q , 1.0 beta.z beta.z q , beta.z g , beta.x g , beta.y g , beta.z g , g ) vec4 vertex2 (lorentzTransformation) vec4(tmpVertex,1.0) gl Position gl ProjectionMatrix (vec4(vertex2.xyz,1.0) ) This shader should apply to every vertex and perform the non linear Lorentz transformation, but the transformation it performs is clearly different from what I'd expect (in this case a length contraction on x axis). Has somebody already worked on special relativity shader for 3D videogame?
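For reference, and assuming the usual convention with the time coordinate first, the general boost for velocity beta = v/c acting on (ct, x, y, z) is, with gamma = 1/sqrt(1 - beta^2) and q = (gamma - 1)/beta^2:
$$\Lambda = \begin{pmatrix} \gamma & -\gamma\beta_x & -\gamma\beta_y & -\gamma\beta_z \\ -\gamma\beta_x & 1 + q\beta_x^2 & q\beta_x\beta_y & q\beta_x\beta_z \\ -\gamma\beta_y & q\beta_y\beta_x & 1 + q\beta_y^2 & q\beta_y\beta_z \\ -\gamma\beta_z & q\beta_z\beta_x & q\beta_z\beta_y & 1 + q\beta_z^2 \end{pmatrix}$$
Two things are worth checking against the shader: the code orders its coordinates as (x, y, z, ct), so the gamma row and column must move accordingly, and the time-space terms carry a minus sign for a boost by +v (the inverse boost flips them to +gamma*beta). Also note that Lorentz-transforming vertices only gives length contraction; what a camera would actually see (Terrell rotation, aberration) additionally requires accounting for light travel time.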
14
How to determine vertex index using Shader Model 3 or lower? I need something like SV_VertexID (added in Shader Model 4) in an HLSL shader to determine which vertex is currently being handled. Unfortunately, I can compile only vs_3_0 or lower. The objective is to change the position of one specific vertex using HLSL. I can't edit the mesh and can't pass any data from the game engine; my capabilities are limited to the HLSL shader. The shader is written for a single mesh (a human face), and I need to change its shape a bit (for example, close an eye or make it smile). I already tried to locate the vertex by TEXCOORD, but had no idea how to separate it from the vertices of the other triangles connected to it (placed at the same point where many triangles meet). if ( VS.TC[0] > 0.8828125 && VS.TC[1] > 0.6328125 && VS.TC[1] < 0.671875 ) Upper lip center VS.Position += VS.Normal * uUpLip Could you please give me any advice on how to identify and move only one specific vertex in HLSL? I need something like moving a vertex in Blender's Edit Mode (with the neighbouring triangles still connected at one point), but my attempt with TEXCOORD makes them move in different directions. Thanks
14
Using Ogre particle point billboards with shaders I'm learning about using Ogre particles and had some questions about how the point type particles work. Q. I believe point type particles are implemented as a single position. Is one single vertex is passed to the vertex shader? Q. If one vertex is passed to the vertex shader then what gets sent to the fragment shader? Q. Can I pass the particle size to the shader? Perhaps with a custom parameter?
14
Performance of static const variable in shader I am using static const variables based on uniform variables. For example uniform uint uSampleCount static const float invSampleCount 1.0 float(uSampleCount) Is this static recalculated for every vertex and fragment, or are shader frameworks smart enough to initialize this once before the shader starts processing? Does using or not using static and or const make a difference?
14
Calculating distance from viewer to object in a shader Good morning, I'm working through creating the spherical billboards technique outlined in this paper. I'm trying to create a shader that calculates the distance from the camera to all objects in the scene and stores the results in a texture. I keep getting either a completely black or white texture. Here are my questions I assume the position that's automatically sent to the vertex shader from ogre is in object space? The gpu interpolates the output position from the vertex shader when it sends it to the fragment shader. Does it do the same for my depth calculation or do I need to move that calculation to the fragment shader? Is there a way to debug shaders? I have no errors but I'm not sure I'm getting my parameters passed into the shaders correctly. Here's my shader code void DepthVertexShader( float4 position POSITION, uniform float4x4 worldViewProjMatrix, uniform float3 eyePosition, out float4 outPosition POSITION, out float Depth ) position is in object space outPosition is in camera space outPosition mul( worldViewProjMatrix, position ) calculate distance from camera to vertex Depth length( eyePosition position ) void DepthFragmentShader( float Depth TEXCOORD0, uniform float fNear, uniform float fFar, out float4 outColor COLOR ) clamp output using clip planes float fColor 1.0 smoothstep( fNear, fFar, Depth ) outColor float4( fColor, fColor, fColor, 1.0 ) fNear is the near clip plane for the scene fFar is the far clip plane for the scene
14
OpenGL UseProgram() fails I have a rather strange exception on my application using OpenTK on Linux (ArchLinux with Mono 3.2.8) with GL.UseProgram(). I wrote a class to combine multiple files to one program public sealed class Shader IContent public string Name get return name readonly string name readonly ShaderElement elements int programId public Shader(string name, params ShaderElement elements) this.name name this.elements elements this.programId 1 public void Compile() programId GL.CreateProgram() for (int i 0 i lt elements.Length i ) var info new FileInfo(elements i .File) if (info.Exists) int shaderId GL.CreateShader(elements i .Type) using (var reader info.OpenText()) GL.ShaderSource(shaderId, reader.ReadToEnd()) GL.AttachShader(programId, shaderId) else throw new FileNotFoundException("Shader file not found.", info.Name) GL.LinkProgram(programId) public void Apply() if (programId 1) throw new InvalidOperationException("Compile() has to be called before Apply()") GL.UseProgram(programId) public void Unapply() GL.UseProgram(0) public void Dispose() GL.DeleteProgram(programId) My sample shader is very short because I try to understand how OpenGL works Vertex precision highp float void main() gl FrontColor gl Color gl Position ftransform() Fragment precision highp float void main() gl FragColor gl Color (1 clamp(gl FragCoord.z 0.05, 0, 1)) gl FragColor.w 1 A sample call for Apply() public override void Draw(double interpolation) GL.MatrixMode(MatrixMode.Modelview) GL.LoadMatrix(ref translate) GL.MatrixMode(MatrixMode.Projection) GL.LoadMatrix(ref projection) GL.Color3(Color.Red) simpleShader.Apply() sphere.Draw() simpleShader.Unapply() Running the application under Windows does not generate any exception. The following is crashing running under Linux First call of Draw() gt First call of Shader.Apply() gt Draws geometry for first frame gt First call of Shader.Unapply() Second call of Draw() gt Second call of Shader.Apply() gt GL throws InvalidOperation Error. Any ideas why Windows behaves different to Linux and how this can be resolved?
14
Spherical fragment shader shockwave regardless of screen dimensions I've been working on a shockwave shader based on some examples I've looked at. My primary issue is that the screen resolution dictates the shape of the shockwave. I need it to be spherical, but with default screen shapes it's an oval. I can resolve the issue by reverse the resolution ratio calc but that winds up stretching the actual texture. https www.shadertoy.com view XlfBWj void mainImage( out vec4 fragColor, in vec2 fragCoord ) float offset (iTime floor(iTime)) iTime float CurrentTime (iTime) (offset) vec3 WaveParams vec3(10.0, 0.8, 0.1 ) vec2 center vec2(0.5, 0.5) vec2 uv fragCoord.xy iResolution.xy float ratio iResolution.y iResolution.x WaveCentre.y ratio texCoord.y ratio float dist distance(uv, center) vec4 Color texture(iChannel0, uv) if (dist lt CurrentTime WaveParams.z amp amp dist gt CurrentTime WaveParams.z) The pixel offset distance based on the input parameters float Diff (dist CurrentTime) float ScaleDiff (1.0 pow(abs(Diff WaveParams.x), WaveParams.y)) float DiffTime (Diff ScaleDiff) The direction of the distortion vec2 dir normalize(uv center) Perform the distortion and reduce the effect over time uv ((dir DiffTime) (CurrentTime dist 40.0)) Color texture(iChannel0, uv) fragColor Color
14
Cycling through colors with the same brightness OK, so I have a game I am working on with a bunch of colored particles. Each of them gets its own color, but they are all close to a specific shade. To explain that better: I am using the HSV color space, and I have a global variable that cycles from 0 to 360 (H), and each particle adds or subtracts a random amount from that. I use HSV because it is the easiest way for me to cycle through colors, since you can just change H. This looks very nice and all, however the particles are supposed to be pure, colored light. Right now they are just circles colored with the shade, but later on I might add glow effects and other shading techniques. The problem is that not all hues (I don't touch S or V) are equal in terms of the brightness our eyes perceive, and this causes problems. For example, when all of them are shades of blue, things look pretty dark. Obviously in the shader I have to convert these HSV colors to RGB, and doing some math I quickly saw what was wrong... the colors didn't have equal luminance! As I understand it, luminance is roughly 0.2126*color.r + 0.7152*color.g + 0.0722*color.b, which is why everything is off. Is there a way to still get all the different chroma (degrees on the color wheel) while maintaining a high luminance? Perhaps I should work with a different color space than HSV, or perhaps there is a formula that finds the color at degree x on the color wheel with a luminance of y? Thanks much!
14
How to achieve "concept art" graphics? I'm currently working on a 3D Game and I want a unique graphic style. Concept Arts have a cool "painted" look. You can clearly see that this picture is painted on a graphic tablet. (https i.stack.imgur.com pnyzB.jpg) Source Destiny Concept Arts How can you achieve that kind of graphics? I guess you draw the texture just like the concept art and apply a light cel shader?
14
WebGL fragment shader pixel position I have written a fragment shader, on a shape which is a square made with gl.TRIANGLE STRIP, and which displays a radar To build the radar, I'm using distance from the center of the shape. Currently, I'm using gl FragCoord and have to pass the shader the size of the window, the position of the center of the radar, in order to determine where I am in the shape. I'm not really OK with that.. I would like to know if there is a better way to make the shader, to make it independent from display variables. For example I thought I could use gl PointCoord whose x and y coordinates goes from 0. to 1., but it requires to use GL POINTS... I could give it a try by using GL POINTS, but I'm not sure it's THE best solution. For example, I could make a 3d scene, with lot of cubes, and display this radar on the faces of that cube. If the shader isn't independent from gl FragCoord, it feels like it will be a nightmare to make it work. Is there a way to get the relative position of the pixel we are giving the color, in the shape (triangle) we are rendering? Do you have suggestions tips?
14
OpenGL GLSL LWJGL return value from shader I'm trying to do some bone animation. I don't want the whole skeleton to be uploaded to the GLSL shader every time, since most of it isn't needed: each vertex is parented to exactly one bone, no more. My question is simple: (how) can I return some value from the vertex shader back to my code? I'm using LWJGL Java.
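There is no direct return value from a vertex shader, but transform feedback can capture per-vertex outputs into a buffer object that Java can then read back. A rough LWJGL-flavoured sketch, assuming a vertex shader output declared as out float outValue and a program object that has not been linked yet (the names program, vao, vertexCount and outValue are all illustrative):
import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;
import org.lwjgl.opengl.GL15;
import org.lwjgl.opengl.GL20;
import org.lwjgl.opengl.GL30;

public final class FeedbackReadback {
    /** Runs the vertex shader over vertexCount points and returns one captured float per vertex. */
    public static FloatBuffer capture(int program, int vao, int vertexCount) {
        // Must be declared before linking so GL knows which output to record.
        GL30.glTransformFeedbackVaryings(program, new CharSequence[] { "outValue" }, GL30.GL_INTERLEAVED_ATTRIBS);
        GL20.glLinkProgram(program);
        GL20.glUseProgram(program);

        // Buffer that will receive one float per vertex.
        int tbo = GL15.glGenBuffers();
        GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, tbo);
        GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertexCount * 4L, GL15.GL_STATIC_READ);
        GL30.glBindBufferBase(GL30.GL_TRANSFORM_FEEDBACK_BUFFER, 0, tbo);

        // Run only the vertex stage, no rasterization needed.
        GL30.glBindVertexArray(vao);
        GL11.glEnable(GL30.GL_RASTERIZER_DISCARD);
        GL30.glBeginTransformFeedback(GL11.GL_POINTS);
        GL11.glDrawArrays(GL11.GL_POINTS, 0, vertexCount);
        GL30.glEndTransformFeedback();
        GL11.glDisable(GL30.GL_RASTERIZER_DISCARD);

        // Read the captured values back into Java.
        FloatBuffer result = BufferUtils.createFloatBuffer(vertexCount);
        GL15.glGetBufferSubData(GL15.GL_ARRAY_BUFFER, 0, result);
        return result;
    }
}
That said, for one-bone-per-vertex skinning the usual approach is the other way round: upload the bone matrices as a uniform array and store a bone index per vertex, so no read-back is needed at all.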
14
How to sample png heightmap in HLSL I am working on a simple parallax shader and I have a problem with sampling the height texture for a value. My approach samples the texture and gives me the RGB values of the picture: tex2D(TextureHeightSampler, input.BaseOutput.TexCoordAndViewDistance.xy) But I am not sure how to convert the RGB data to a height value. How do I solve that?
14
Dynamic Shader Linkage in DirectX12? Do the dynamic shader linking concepts introduced in DirectX11 ShaderModel 5.0 still work in DirectX12? In the documentation the reflection API is still available (e.g. ID3D12LibraryReflection), claiming it is provided especially for the shader linking technology, but I couldn't find other important interfaces like ID3D11FunctionLinkingGraph or ID3D11ClassLinkage in DX12.
14
Basic equation for pure metallic reflectance I'm working on a live wallpaper that shows a pure metallic object. Since it's a live wallpaper, I can make a ton of approximations...this isn't a full blown scene in a game world. My shader only has to support this one material. People won't expect extreme detail, etc. So, for example, I don't even bother with diffuse since a pure conductive material barely has any. So I'm trying to implement a reflective conductive surface that pulls the reflection from a cubemap representing the lighting environment. On a very basic level it looks like this vec3 fresnelSchlick (vec3 f0, float cosTheta) return mix(f0, vec3(1.0), pow(1.0 cosTheta, 5.0)) ... float NdV dot(normal, viewDir) vec2 brdf texture2D(u brdf, vec2(u roughness, NdV)).rg from a LUT vec3 reflection textureCube(u cubemap, reflect(viewDir, normal)) gl FragColor reflection (fresnelSchlick(materialColor, NdV) brdf.x brdf.y) The problem with my simple math here is that the range of possible final pixel colors does not produce a proper looking range of hues. Suppose my material color is 0050ff and the environment is monochrome white. Then all of the possible colors that could be seen are This doesn't allow for bright looking reflections at the light sources in the envmap. I found this example material render for comparison. The cubemap in the example here is close to monochrome white. But at the bright areas of the reflection, there is a lot of red component in the color. With my limited range of colors, my object looks dull and lifeless. Can anyone point out what I'm missing in my equation?
14
How to access material properties in metal shader I am struggling to find any clear information on how to access the properties of a material inside a metal shader, specifically the diffuse color. My geometry sources are defined with vertices, normals and texture coordinates let sources SCNGeometrySource(vertices vertices), SCNGeometrySource(normals normals), SCNGeometrySource(textureCoordinates textureCoordinates) let elements meshIndices.map SCNGeometryElement(indices 0, primitiveType .triangles) self.init(sources sources, elements elements) Inside my shader I can access the vertex position and normal using the semantic attributes typedef struct float3 position attribute(SCNVertexSemanticPosition) float3 normal attribute(SCNVertexSemanticNormal) Vertex Whilst SCNVertexSemanticColor is a valid attribute, these are not defined on a per vertex basis in my case with the color instead being derived from the diffuse property of the material attached to the shader program. Is there a way to access the properties of the material attached to my geometry? I would assume that creating a struct and binding it to an input attribute is the correct way forward but I am unsure how these are surfaced to the shader. typedef struct float3 diffuse materials? how to reference material.diffuse.contents? Material I assume that these properties must be available to the shader as everything is rendered as expected if you leave SceneKit to do its own thing. Do I need to bind these properties as uniforms myself or is there a specific argument I can use to access them?
14
Iris wipe shader not properly working I'm working on creating an iris wipe transition, like the ones you see in old cartoons a fully transparent circle closes on a certain point, leaving a full screen of a solid color. Additionally, the background around the circle fades in from full transparency as well. I decided that shaders would be easier than creating vertices to build the iris effect. I may have been wrong. EDIT I've made some progress since asking this question there is now an iris effect, but it opens and closes around a blank CornflowerBlue screen before displaying the screen, and the background around the circle is always full transparency. Shader sampler TextureSampler register(s0) float2 irisCenter float radius float4 backColor float4 PixelShaderFunction(float4 pos SV POSITION, float4 color1 COLOR0, float2 coords TEXCOORD0) COLOR0 float4 p pos float2 c irisCenter float r radius float alpha abs(1 step(pow(p.x c.x, 2) pow(p.y c.y, 2), r r)) return float4(backColor.r, backColor.g, backColor.b, alpha backColor.a) technique Technique1 pass Pass1 PixelShader compile ps 3 0 PixelShaderFunction() The alpha variable should be deciding if a pixel is inside the circle and should be transparent, or if the pixel is outside the circle and shouldn't be. irisCenter, radius, and backColor are all set on each frame by public void Draw() if (isInitialized amp amp (isRunning dir EffectDirection.Backward)) float r color.R 255f float g color.G 255f float b color.B 255f float a (color.A currentFadeLevel) 255f var irisEffect GameServices.Effects "IrisEffect" irisEffect.Parameters "irisCenter" .SetValue(irisCenter) irisEffect.Parameters "radius" .SetValue(irisRadius) irisEffect.Parameters "backColor" .SetValue(new Vector4(r, g, b, a)) quadRenderer.Render(irisEffect) IrisEffect.cs QuadRenderer.cs Instead of behaving as an iris wipe, it first draws a full screen rectangle for half the effect, and then a full screen Color.CornflowerBlue rectangle (I guess the quad from QuadRenderer isn't drawing with transparency). I'm not sure what's wrong as I'm new to HLSL. Question 1 How can I get the coordinates of the current pixel being processed by the shader? Question 2 What am I doing wrong? Question 3 How do I draw the rest of the quad transparently?
14
How to send data from compute shader to vertex shader I have some shaders, every shader has the same constant buffer, constant buffer cbuffer cbPerFrame register(b0) float3 gEyePosW float4x4 gView float4x4 gProj float gDim float3 gVoxelOffset float gVoxelSize I think it's a waste to send these same constant buffer several times, so I want to pack them in one Constant Buffer, and share them to other vertex shaders. I have tried send constant buffer to compute shader, but they don't work in vertex shader. Do I need to pack these data in one structured buffer? Is it faster than send constant buffer several times? Is deferred shading relevant to this problem? Which solution do we usually use to solve this problem?
14
D3D11 SetShader States I have some questions regarding the XXSetShader calls and what happens after them. For instance, I would like to know whether, once XXSetShader is called, subsequent calls such as PSSetShaderResources are bound to that particular shader. At load time I am binding the resource views needed for one shader, then the next one, and so on, but what I found is that the resources were not bound: I need to set them again every time XXSetShader is called. Am I doing something wrong? It's not supposed to work like that, is it? The whole point of this is to have the fewest state changes at runtime. Thanks.
14
Is it possible to retrieve shader function names associated with a technique pass using the DirectX Effect API? For example, given the pass pass p0 SetVertexShader(CompileShader(vs_4_0, VSFunction())) SetPixelShader(CompileShader(ps_4_0, PSFunction())) is it possible to retrieve the names VSFunction and PSFunction? It doesn't look like any of the associated shader descriptors actually contain the name of the entry point.
14
Why would an ambient occlusion (AO) shader's performance be dependent on light direction? One of my favourite games recently implemented ambient occlusion as a graphics feature, which appears to perform very well in most circumstances except during sunrise and sunset. As someone who is getting into shader programming, I'm intrigued by this. My understanding of AO is that you precompute the diffuse shading for a range of incident vectors, then use that information to produce a correct shadowing scale for real time use by enumerating over all the lights and computing how their position in relation to the object compares to each of the incident vectors, essentially interpolating across the closest values. Since the occlusion is pre computed against a model, taking complex shapes and self occlusion into account, you don't need to raytrace and the performance is much better. So here's where I get lost since the diffuse map is precomputed, and you're just comparing light positions to those maps for all objects within the screen space, why would performance suffer for cases where the ambient light (the sun) is close to the horizon? It seems like this shouldn't matter surely the performance of such a shader scales against vertices, not the angle of the light? The game uses the latest version of Unity (they just bumped up a minor release) but I don't know if they use the built in Unity SSAO shader. Performance hit is major 28fps at 1440p during "day" hours, dropping to 12 14fps during sunrise set. This only occurs when AO is enabled. I'm not interested in finding a solution I'm sure they'll fix it sooner or later I'd just like to get a better idea of how AO performance scales, what its pain points are, and why this kind of behaviour might appear.
14
How do I mask a height based fog? I've been trying to implement a 3D (with the Y axis being utilized) fog of war system similar to what XCOM uses. There is only one hint that really seems to nail it, but I can't read the actual function that was in place. http etiennecarrier.com zombie tycoon 2 fog of war shader The key difference here is that I'm also trying to apply this to a multi leveled building. So the question is... How do you go about implementing a height based fog shader, with a masking texture to prevent it from drawing in certain world spaces.
14
Screen effects and antialiasing I have been working on a game for a while using glut for basic window creation. I was rendering to an offscreen buffer so that I could implement various effects like screen bulging, motion blur, refraction, etc. I also used the screen texture with antialiasing (fxaa). Now I have changed from glut to sfml. I switched on the in built antialiasing and it looked much better than my version, but now I don't have the screen in a texture so I can't use the screen effects. So my question is, how to people normally deal with this issue? Can I take advantage of sfml's antialiasing functionality and retain my effects? I thought about using glReadPixels, but that seems way too slow. Does sfml do offscreen rendering behind the scenes and can I access that texture? This is not specific to sfml. How do AAA games do it? Do they always implement their own antialiasing techniques?
14
GLSL Issue replacing ternary operator with mix I was expecting these two code snippets to do the same thing: return vec3( 1.0 - b.r > a.r ? 0.0 : 1.0 - ((1.0 - b.r) / a.r), 1.0 - b.g > a.g ? 0.0 : 1.0 - ((1.0 - b.g) / a.g), 1.0 - b.b > a.b ? 0.0 : 1.0 - ((1.0 - b.b) / a.b) ) and return mix( ONE3 - ((ONE3 - b) / a), ZERO3, vec3(greaterThanEqual(ONE3 - b, a)) ) where ONE3 = vec3(1.0,1.0,1.0) and ZERO3 = vec3(0.0,0.0,0.0). For some reason, they have different outputs. Do you know why? (a can have zeros sometimes)
14
XNA game C# application executable works on one Win7 machine but not another. Our company wrote a game in XNA Game Studio 4 almost ten years ago. We are trying to reinstall it on Windows 7 using only the executable. Both machines have XNA Game Studio 4.0 installed. Below are the environment parameters I can find. The laptop is the one not rendering the effect (I am new to this, so this is my guess). Laptop: OS Windows 7 Service Pack 1, DirectX version 11, graphics card AMD FirePro M4000 Mobility Pro, chip type AMD FirePro (0x682D). Desktop: OS Windows 7 Service Pack 1, DirectX version 11, integrated GPU Intel HD Graphics 4600, chip type Intel(R) HD Graphics Family. We don't know what makes these two executables behave differently: part of the window image is simply not shown. On a second look at the source code, the difference between the images that do and do not show on the failing laptop is that the working ones are drawn from a Bitmap, while the non-working ones use a Texture2D with a Microsoft.Xna.Framework.Graphics Effect with passes. Any ideas? Thanks.
14
Apply a grain effect to all the elements of a level I'm currently experimenting a little bit with level design. Let's say I have a room composed of wall and floor tiles. I'd like to apply to all these elements a sort of "grain" effect, similar to concrete. I wonder which is the better way to go... The only thing I can imagine is working with the textures directly, so for each element I create a texture that already contains the "grain" effect. I've also tried applying a generic screen shader, but the result is not good because the grain is fixed to the screen, whereas I want it fixed to the walls and floor. Obviously creating all the textures is a long process, and I have to keep the UVs absolutely proportional for each element. I'm OK with that solution, actually, but I wonder if there is any other way to apply this effect all at once in a simpler and "safer" way, something like a decal that perhaps also adds some imperfections.
14
I don't think my shaders are working, looking for help Using sharpdx(directx 11) developing on UWP. This is a link to a previous question of not being able to compile the shader files(written in hlsl) How to compile shader files in UWP Later I have found some method to avoid the original exception(unable to open or find shader files) but still the shaders seem not to be working. My method to quot avoid the original exception quot is to change the file path to System.IO.Path.Combine(Windows.ApplicationModel.Package.Current.InstalledLocation.Path, quot ProjectName quot ) If the method works, I would surely post it as an answer to the previous question The current problem is no display on the screen(all black), while I can check the stored data in the vertex buffer using graphic diagnostics tools built in visual studio. Here is my code for the shaders, problem may be going from here because this is the first shader file I've written. Please have a look and discuss may things go wrong. pixel shader file Pixel PS.fx struct VertexOut float4 PosH SV POSITION float4 Color COLOR float4 PS(VertexOut pin) SV Target return pin.Color vertes shader file Transf VS.fx cbuffer dataBuffer register(b0) matrix ViewProjection struct VertexIn float3 PosL POSITION float4 Color COLOR struct VertexOut float4 PosH SV POSITION float4 Color COLOR VertexOut VS(VertexIn vin) VertexOut vout Transform to homogeneous clip space. vout.PosH mul(float4(vin.PosL, 1.0f), ViewProjection) Just pass vertex color into the pixel shader. vout.Color vin.Color return vout where I compiled them byte vertexShaderByteCode ShaderBytecode.CompileFromFile(this.path quot Transf VS.fx quot , quot VS quot , quot vs 5 0 quot ) this.vertexShader new D3D11.VertexShader( device, vertexShaderByteCode ) this.inputSignature new ShaderSignature(vertexShaderByteCode) this.pixelShader new D3D11.PixelShader( device, ShaderBytecode.CompileFromFile(this.path quot Pixel PS.fx quot , quot PS quot , quot ps 5 0 quot ) ) data structure definition public struct ScatterVertex SharpDX.Vector3 Position SharpDX.Color4 Color inputlayout initialization this.inputLayout new D3D11.InputLayout( this.device, inputSignature, new SharpDX.Direct3D11.InputElement new D3D11.InputElement( quot Position quot , 0, SharpDX.DXGI.Format.R32G32B32 Float, 0 ), new D3D11.InputElement( quot Color quot , 0, SharpDX.DXGI.Format.R32G32B32A32 Float, I'm not very sure whether or why I should use this option 0 ) ) Here's the projection matrix in the camera class, which is later sent to the constant buffer private Matrix View get return Matrix.LookAtLH(this.Eye, this.Target, this.Up) private Matrix Proj get return Matrix.PerspectiveFovLH(this.Fov, this.Aspect, this.Near, this.Far) public Matrix WorldViewProject get Matrix wvp (this.View this.Proj) wvp.Transpose() return wvp If you need anything else to solve this, just let me know and I'll manage to get it. Thank you!
14
Normal Matrix in plain English I'm learning shader programming with WebGL and GLSL. I've seen some tutorials about the normal matrix and I don't really understand it. I mean, I think I'm OK with the math, such as modelViewMatrix = mat4.multiply(camera.view, modelMatrix), inverseModelViewMatrix = mat4.invert(this.modelViewMatrix), normalMatrix = mat3.fromMat4(inverseModelViewMatrix), normalMatrix = mat3.transpose(this.normalMatrix). But why do I need it? Where can I find a case where I can see the difference between using it and not using it? Or when do I not need it?
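In plain English: the normal matrix exists so normals stay perpendicular to the surface when the model-view matrix contains non-uniform scale (or shear). If t is a tangent vector and n its normal, then n is perpendicular to t before transforming, and it must stay that way afterwards. Transforming tangents with the upper-left 3x3 of the model-view matrix M and normals with some matrix G, the requirement is:
$$(G\mathbf{n})^{\top}(M\mathbf{t}) = \mathbf{n}^{\top} G^{\top} M\, \mathbf{t} = 0 \ \text{ for all tangents} \ \Longleftrightarrow\ G^{\top} M = I \ \Longleftrightarrow\ G = (M^{-1})^{\top}$$
which is exactly the inverse-transpose the code above builds. When M is only rotation plus uniform scale, the inverse-transpose is proportional to M itself, so using M directly gives the same direction after normalization; that is the case where you will not see a difference. Scale a model by, say, (1, 3, 1) and light it, and the difference becomes obvious.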
15
How can I update a Stage based HUD from another Stage's Actor in libGDX? I feel like this is a simple problem, but I'm having issues finding the correct search terms. I have a Screen that includes two Stages. The first, stage, contains a number of objects deriving from Actor and added via the standard stage.addActor(Actor). This is for the actual game objects. The second, hudStage, consists of a single Table actor. This table includes a button as well as two Labels. This is for the static user interface HUD. I'm handling touch events on my custom actors in the game stage like the following public class GameObject extends Actor private final static String TAG GameObject.class.getName() ... public GameObject() ... addListener(new InputListener() Override public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) Gdx.app.log(TAG, "touchDown on GameObject actor lt " x "," y " gt ") return super.touchDown(event, x, y, pointer, button) ) This works perfectly and logs what I need. However, I would now like to update the HUD with information about the object the user touched. My screen has defined the Label I want to update (added via hudStage), but I'm not sure how to go from an actor in stage to the screen, or to the hudStage label. public class GameScreen implements Screen Stage stage Stage hudStage This is the label I want to update. Label infoLabel ... I know I can call setText(String) on the label, but I'm not sure the best way to bubble the click up. Plenty of examples have simplified versions where objects are defined in the main class (in my case, GameScreen) and are able to refer to them that way, but I've already broken these apart. I started to look at creating a custom event that my actor could trigger, and that the stage would handle, but documentation is sparse and I'm not sure this is required. I've also thought about scrapping the idea of having a label in the hudStage and instead having my actor draw text where the HUD would have. However, from what I continue to see, I think Scene2D is the right direction to go for the UI. Thanks! EDIT What I'm looking for An actor in GameScreen.stage is target of a touchDown event. Logic in actor fires and updates object. touchDown event in actor finishes by updating infoLabel in GameScreen.hudStage or updates String value in GameScreen (if easier, I'll go this route).
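One clean way to bubble the touch up without the actor knowing about stages at all is to hand the actor a small callback when it is created; the screen implements the callback and updates its own hudStage label. A sketch using the names from this question (GameObjectListener is invented here for illustration, it is not a libGDX type):
import com.badlogic.gdx.scenes.scene2d.Actor;
import com.badlogic.gdx.scenes.scene2d.InputEvent;
import com.badlogic.gdx.scenes.scene2d.InputListener;

// Hypothetical callback interface, defined by the game, not by libGDX.
interface GameObjectListener {
    void onTouched(GameObject object, float x, float y);
}

class GameObject extends Actor {
    GameObject(final GameObjectListener listener) {
        addListener(new InputListener() {
            @Override
            public boolean touchDown(InputEvent event, float x, float y, int pointer, int button) {
                listener.onTouched(GameObject.this, x, y);  // bubble the event up to whoever created us
                return true;                                // claim the touch
            }
        });
    }
}

// In GameScreen's constructor or show():
// stage.addActor(new GameObject(new GameObjectListener() {
//     @Override
//     public void onTouched(GameObject object, float x, float y) {
//         infoLabel.setText("Touched at " + x + ", " + y);  // hudStage's label, updated directly
//     }
// }));
A custom scene2d Event fired through the stage also works, but a plain Java callback is usually all a HUD update needs, and it keeps the actor free of any reference to the HUD.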
15
How to match font size with screen resolution? So I'm working on a game using LibGDX, and I have a problem. To make my game fit most resolutions, I created a base asset for each aspect ratio, for example, a main menu background image, I made it in 800X600 for 4 3, 1280X720 for 16 9 etc. Now I am trying to incorporate TextButtons into the game, for the options My problem is that I can't figure out which font sizes match with which screen resolutions. Is there a way to figure this out, or do I have to go one by one through each of the resolutions I have and just manually match the text to the resolution?
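If the assets already come in one variant per aspect ratio, the font can follow the same idea: pick the point size relative to the real screen height instead of hard-coding it per resolution. A sketch assuming the gdx-freetype extension is available (the font path and the 720-pixel reference height are arbitrary choices):
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.graphics.g2d.freetype.FreeTypeFontGenerator;
import com.badlogic.gdx.graphics.g2d.freetype.FreeTypeFontGenerator.FreeTypeFontParameter;

public final class Fonts {
    // Design the UI against a reference height, then scale the point size with the real screen.
    private static final float REFERENCE_HEIGHT = 720f;

    public static BitmapFont menuFont(int sizeAtReferenceHeight) {
        FreeTypeFontGenerator generator =
                new FreeTypeFontGenerator(Gdx.files.internal("fonts/menu.ttf")); // illustrative path
        FreeTypeFontParameter parameter = new FreeTypeFontParameter();
        parameter.size = Math.round(sizeAtReferenceHeight * Gdx.graphics.getHeight() / REFERENCE_HEIGHT);
        BitmapFont font = generator.generateFont(parameter);
        generator.dispose(); // the generator is no longer needed once the font is baked
        return font;
    }
}
The alternative is to run the whole UI through a Viewport with a fixed virtual size, in which case one pre-baked font size works everywhere and only gets scaled as a texture, at some cost in sharpness.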
15
How does mass work in Box2D I'm making a 2D platformer shooter type game in LibGDX, and I'm wondering about Box2D's mass system. I haven't used mass for anything in the game so far, and everything has worked OK. We move the player, enemies, and bullets around with body.applyLinearImpulse(new Vector2(impulseX, impulseY), body.getWorldCenter()) and even have a variable jump height system, and it all works fine. To change the speed or impact of objects, we just change the impulse. I've tried using applyForce() but it won't move the body. Everything is usually just a simple polygon, but when I'm creating fixtures there's an option to set the density, and bodies have the option to set the massData, but I don't know what either of these properties does. The gravity of the world is around 11. Can anyone tell me how I should be handling mass with my objects in the game? Thanks.
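Roughly, what the two knobs do: Box2D computes a dynamic body's mass from its fixtures (density times shape area, summed over all fixtures), and MassData lets you override that computed result by hand. Impulses change velocity immediately, while a force is integrated over each small timestep, so a force has to be applied every step and be numerically much larger than an impulse to have a similar effect, which is the usual reason applyForce() appears to do nothing. A small sketch of where density enters (the values are arbitrary):
import com.badlogic.gdx.physics.box2d.*;

public final class MassExample {
    public static Body createBox(World world, float halfWidth, float halfHeight, float density) {
        BodyDef bodyDef = new BodyDef();
        bodyDef.type = BodyDef.BodyType.DynamicBody;
        Body body = world.createBody(bodyDef);

        PolygonShape shape = new PolygonShape();
        shape.setAsBox(halfWidth, halfHeight);

        FixtureDef fixtureDef = new FixtureDef();
        fixtureDef.shape = shape;
        fixtureDef.density = density;          // mass per area in Box2D's 2D world
        body.createFixture(fixtureDef);
        shape.dispose();

        // Mass is derived from the fixtures: area * density, summed.
        System.out.println("mass = " + body.getMass());
        return body;
    }
}
In practice, if impulses already feel right, mass only matters for how bodies push each other around in collisions; keeping densities roughly proportional to how "heavy" things should feel is usually enough.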
15
LibGDX TiledMap Unit Scale I'm working on a 2D rpg style game and I'm currently rewriting my game to get a more clean and structured code. Some thing that has been bugging me is that I have no idea how unit scales work with the OrthogonalTiledMapRenderer, it's kind of like a "Hey it works, I don't know how, but it works" solution. Basically to scale my map up, since it's a pixel game I use renderer new OrthogonalTiledMapRenderer(map, 4) which seems completely wrong since I've seen other examples using things such as 1 32f. However using that makes me unable to even find my map in game when moving the camera. I've looked up on unit scales but I have a hard time grasping why and how I should use them and if they are really necessary. And by using the code above, which I'm guessing I use the wrong way, does that bring any problems? Thanks in advance!
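The unit scale is simply "how many world units is one pixel of the map": with 32-pixel tiles, 1/32f makes each tile exactly one world unit across. The map only seems to vanish with that value because the camera is still sized in pixels, so it is looking at a region hundreds of units wide while the whole map now spans a few dozen; passing 4 works because it keeps everything in (scaled-up) pixels. A sketch of the 1/32f route, assuming 32x32 tiles and a made-up map file name:
import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.OrthographicCamera;
import com.badlogic.gdx.maps.tiled.TiledMap;
import com.badlogic.gdx.maps.tiled.TmxMapLoader;
import com.badlogic.gdx.maps.tiled.renderers.OrthogonalTiledMapRenderer;

public class MapExample extends ApplicationAdapter {
    private TiledMap map;
    private OrthogonalTiledMapRenderer renderer;
    private OrthographicCamera camera;

    @Override
    public void create() {
        map = new TmxMapLoader().load("level1.tmx");               // illustrative file name
        renderer = new OrthogonalTiledMapRenderer(map, 1 / 32f);   // one tile == one world unit
        camera = new OrthographicCamera();
        camera.setToOrtho(false, 30, 20);                          // show 30 x 20 tiles, not pixels
    }

    @Override
    public void render() {
        Gdx.gl.glClearColor(0, 0, 0, 1);
        Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
        camera.update();
        renderer.setView(camera);
        renderer.render();
    }
}
Neither choice is wrong; the payoff of 1/32f is that Box2D, the camera and the map all speak the same roughly metre-sized units, which avoids scattering conversion factors through the code.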
15
Using native code with Libgdx, (Raknet) I was searching for a game engine (Java maybe C ) with built in real time networking library to develop 2D cross platform (Android iOS) multiplayer game. Actually I have not found well suited solution apart from Unity, however I don't like some of the features of Unity. (Here is Discussion) After a while I decided to use Libgdx and tried to find a network libray that meets my requirements and I found Raknet as network engine which is written in C and cross platform. So here is my question Is there any way to use Raknet with Libgdx, or any other solutions you recommend as engine or anything else? And also is it possible to use any C library with Libgdx like Android's NDK?
15
How do I use Libgdx viewport for UI? I am using LibGDX to make a game for Android and I am using 2 Viewports. One for UI and one for the actual game content. My problem is I can't find a good way to scale my UI using a Viewport. This is a picture of my UI using a FitViewport As you can see it adds space on the top and bottom to fit the screen, but I want my pause button to stay on the top of the screen. I tried using a StretchViewport and that worked well for the pause button, but then the joysticks were no longer circle because they were distorted. Is there a way to make a StretchViewport that only stretches the middle? (since nothing will be in the middle I don't care about that. Should I make a custom Viewport? And if so, how would I go about that? Thank you for any help!
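A common split is to give the two stages different viewport types: keep whatever the game world needs, and give the HUD an ExtendViewport, which preserves aspect ratio (so the joysticks stay round) but grows the world in one direction until the screen is filled, so there are no bars and a Table pinned with top() sits at the real top edge. A sketch, with arbitrary virtual sizes:
import com.badlogic.gdx.Screen;
import com.badlogic.gdx.scenes.scene2d.Stage;
import com.badlogic.gdx.utils.viewport.ExtendViewport;
import com.badlogic.gdx.utils.viewport.FitViewport;

public abstract class HudScreen implements Screen {
    protected final Stage gameStage = new Stage(new FitViewport(800, 480));     // letterboxed gameplay
    protected final Stage hudStage  = new Stage(new ExtendViewport(800, 480));  // fills the screen, no distortion

    @Override
    public void resize(int width, int height) {
        gameStage.getViewport().update(width, height, true);
        hudStage.getViewport().update(width, height, true);
    }
}
Anchor the pause button with a full-screen Table (setFillParent(true), then top().right()) so it follows the viewport's actual top edge rather than a fixed y coordinate; the joysticks anchored with bottom() get the same treatment.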
15
LibGDX ImageButton resizing How can I resize an ImageButton? It shows up very big on screen. As far as I know it is an Actor, and it is not resized automatically. I can't find the method for resizing it. Any ideas?
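Which knob works depends on who is doing the layout: a free-floating actor honours setSize(), but inside a Table the cell decides, and with no cell size the button falls back to the drawable's minimum size (the full texture size, hence "very big"). A sketch of the usual options (96f is an arbitrary target size, and the cast in the last option assumes the drawable really is a TextureRegionDrawable):
import com.badlogic.gdx.scenes.scene2d.Stage;
import com.badlogic.gdx.scenes.scene2d.ui.ImageButton;
import com.badlogic.gdx.scenes.scene2d.ui.Table;
import com.badlogic.gdx.scenes.scene2d.utils.TextureRegionDrawable;

final class ButtonSizing {
    static void layout(Stage stage, ImageButton button) {
        // Option 1: actor added directly to the stage, just set its bounds.
        button.setSize(96f, 96f);

        // Option 2: actor inside a Table, size the cell; the table overrides setSize() during layout.
        Table table = new Table();
        table.setFillParent(true);
        table.top().right();
        table.add(button).size(96f, 96f).pad(10f);
        stage.addActor(table);

        // Option 3: change the source of the preferred size, the drawable's min size.
        TextureRegionDrawable drawable = (TextureRegionDrawable) button.getImage().getDrawable();
        drawable.setMinWidth(96f);
        drawable.setMinHeight(96f);
    }
}
In a scene2d.ui layout, option 2 is usually the one you want, since the Table will keep re-imposing its own sizes on every layout pass.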
15
How to get this type of movement in LibGDX? I want objects in my game to move randomly, but always from the right side of the screen to the left side. The movement must be as smooth and natural as possible, as if they were following a path. Can I obtain this with the wander steering behavior without having objects go in every direction randomly? (That's what I have right now.)
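The wander behaviour can be tamed (small wander radius and rate on top of a constant leftward velocity), but if all that is needed is "drift left while meandering a little", a hand-rolled path is often easier to control. A sketch that gives each object a steady leftward speed and a randomised sine-wave vertical drift (all the ranges are arbitrary tuning values):
import com.badlogic.gdx.math.MathUtils;
import com.badlogic.gdx.math.Vector2;

/** Moves right-to-left at a steady pace while drifting vertically on a smooth, randomised wave. */
public class Drifter {
    private final Vector2 position;
    private final float speed;        // world units per second, to the left
    private final float amplitude;    // how far it may drift vertically
    private final float frequency;    // how fast the drift oscillates, in cycles per second
    private final float phase = MathUtils.random(0f, MathUtils.PI2);
    private final float baseY;
    private float time;

    public Drifter(float startX, float startY, float speed) {
        this.position = new Vector2(startX, startY);
        this.baseY = startY;
        this.speed = speed;
        this.amplitude = MathUtils.random(20f, 60f);
        this.frequency = MathUtils.random(0.5f, 1.5f);
    }

    public void update(float delta) {
        time += delta;
        position.x -= speed * delta;  // always right to left
        position.y = baseY + amplitude * MathUtils.sin(phase + frequency * time * MathUtils.PI2);
    }

    public Vector2 getPosition() { return position; }
}
Layering two or three sines with different random frequencies and amplitudes makes the path look less obviously periodic while keeping it perfectly smooth.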
15
libgdx automove Actor does not work I am trying to make my sprite start moving from the moment it is created and do its animation. It draws on the screen right and animates but doesn't move. It should start moving towards the direction it is facin this means right and if collision detected to stop the animation do whatever and continue the animation. Here is my code public class grt extends ApplicationAdapter Misheva mshv Override public void create () mshv new Misheva() mshv.init() Override public void render () mshv.drawIt() mshv.act(Gdx.graphics.getDeltaTime()) Override public void dispose() mshv.disposer() Misheva.java public class Misheva extends Actor int state TextureAtlas atl TextureRegion walk SpriteBatch batch Animation anim float stateTime float elapsed 0 int horizontalspeed 10 public void reset () state 0 stateTime 0 Override public void act(float delta) super.act(delta) stateTime delta switch(state) case 0 setX(getX() horizontalspeed delta) public void drawIt() Gdx.gl.glClearColor(1, 1, 1, 1) Gdx.gl.glClear(GL20.GL COLOR BUFFER BIT) batch.begin() elapsed Gdx.graphics.getDeltaTime() batch.draw(anim.getKeyFrame(elapsed,true), 0, 0) batch.end() public void init () atl new TextureAtlas(Gdx.files.internal("data tomaton.atlas")) batch new SpriteBatch() walk new TextureRegion 3 walk 0 atl.findRegion("tomato") walk 1 atl.findRegion("t walk2") walk 2 atl.findRegion("t walk1") anim new Animation(1 10f,walk) protected void disposer() batch.dispose() atl.dispose()
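The position is being updated (act() runs and setX() advances), but drawIt() pins every frame to (0, 0) and uses a separate elapsed accumulator, so the movement never reaches the screen. A corrected drawIt() along these lines should make it visible (a sketch, keeping the rest of the class as is):
public void drawIt() {
    Gdx.gl.glClearColor(1, 1, 1, 1);
    Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
    batch.begin();
    // Draw at the actor's current position and reuse the time accumulator that act() advances.
    batch.draw(anim.getKeyFrame(stateTime, true), getX(), getY());
    batch.end();
}
Alternatively, add the actor to a Stage, override draw(Batch, float) instead of a custom drawIt(), and let stage.act() / stage.draw() drive it from the render loop; that is the more idiomatic scene2d route and removes the need for the actor to own its own SpriteBatch.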
15
Use Ashley with Scene2d in libGDX? I want to use Ashley, but I also want to use actions and listeners with Scene2d. I read that it is not a good idea, but why? I could create a system that calls stage.act() and stage.draw() in its update method, or is that a bad idea? Thanks.
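Mixing them is workable as long as one side owns each piece of state; the usual friction is that scene2d actions mutate actor positions while Ashley systems mutate components, so pick one authority per property and copy in one direction only. Driving the stage from a system keeps the frame order explicit, for example (a sketch; the priority value is arbitrary):
import com.badlogic.ashley.core.EntitySystem;
import com.badlogic.gdx.scenes.scene2d.Stage;

/** Bridges Ashley and scene2d: the stage is just another system that gets ticked each frame. */
public class StageSystem extends EntitySystem {
    private final Stage stage;

    public StageSystem(Stage stage, int priority) {
        super(priority);      // run after gameplay systems so the UI sees up-to-date state
        this.stage = stage;
    }

    @Override
    public void update(float deltaTime) {
        stage.act(deltaTime);
        stage.draw();
    }
}
This works well when scene2d is only the presentation layer (UI, tweens, input listeners) and Ashley owns the simulation.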
15
Libgdx Box2D convert Box2D coordinates into screen coordinates. I'm new to libGDX and I have a little problem. I created a body in Box2D and I want to assign a sprite to it. For that I need to get the position of the body, then update the sprite's position from it in the main loop, right? So I did that, but the sprite doesn't align with the body; it ends up in a corner. What should I pass to body.setPosition() and to the sprite's position? P.S. Sorry, but I can't post a screenshot :(
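The corner offset is almost always the centre-versus-corner mismatch: a Box2D body's position is its local origin (the centre, for a shape built with setAsBox), while a Sprite's setPosition() places its bottom-left corner. On top of that, Box2D works in metres, so a pixels-per-metre factor is needed whenever the camera is in pixels. A sketch of the per-frame sync (PPM 32 is an arbitrary project choice):
import com.badlogic.gdx.graphics.g2d.Sprite;
import com.badlogic.gdx.math.MathUtils;
import com.badlogic.gdx.math.Vector2;
import com.badlogic.gdx.physics.box2d.Body;

public final class SpriteSync {
    public static final float PPM = 32f;  // pixels per metre chosen for the project

    /** Centres the sprite on the body each frame (body origin is the centre, sprite origin is bottom-left). */
    public static void update(Sprite sprite, Body body) {
        Vector2 p = body.getPosition();    // metres
        sprite.setPosition(p.x * PPM - sprite.getWidth() / 2f,
                           p.y * PPM - sprite.getHeight() / 2f);
        sprite.setRotation(body.getAngle() * MathUtils.radiansToDegrees);
    }
}
Also note the body itself is placed via its BodyDef position or setTransform(), not a setPosition() call; in the game loop it is the sprite that follows the body, never the other way round. An alternative is to keep the camera in metres and size the sprite in metres, in which case no PPM factor is needed at all.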
15
How to implement own UI widget in libGDX? I want to implement my own widget, here is an image what I want to get. Instead of numbers there will be some names, but I think it doesn't matter. I'm pretty new in libGDX, so could you please briefly describe me plan, how can I do something like this (if there will be some code, it would be awesome)? I tried to google it, but didn't find something relevant. Thanks in advance!
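For a list like the one pictured, the cheapest route is often no custom widget at all: a Table (wrapped in a ScrollPane if it needs to scroll) of TextButtons styled with a background drawable. When a genuinely custom widget is wanted, the scene2d pattern is to extend Widget (or Table), draw in draw() and report a preferred size so layouts know how big to make it. A minimal sketch (the names, sizes and the 10-pixel text inset are all arbitrary):
import com.badlogic.gdx.graphics.g2d.Batch;
import com.badlogic.gdx.graphics.g2d.BitmapFont;
import com.badlogic.gdx.scenes.scene2d.ui.Widget;
import com.badlogic.gdx.scenes.scene2d.utils.Drawable;

/** A minimal custom widget: a background with a left-aligned label, usable inside any Table. */
public class NamePlate extends Widget {
    private final Drawable background;
    private final BitmapFont font;
    private String text;

    public NamePlate(Drawable background, BitmapFont font, String text) {
        this.background = background;
        this.font = font;
        this.text = text;
    }

    @Override
    public void draw(Batch batch, float parentAlpha) {
        validate();  // let scene2d run layout first
        background.draw(batch, getX(), getY(), getWidth(), getHeight());
        font.draw(batch, text, getX() + 10f, getY() + getHeight() / 2f);
    }

    @Override public float getPrefWidth()  { return 160f; }  // what a Table uses to size the cell
    @Override public float getPrefHeight() { return 40f; }
}
Attach a ClickListener to it exactly as you would to any other actor, and stack several of them in a one-column Table to get the list in the picture.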
15
Tile Texture If Dimensions of Sprite Are Larger Than Texture I have a sprite that acts as a wall in a game that I am building. I would like to have the brick texture repeat itself instead of stretch to fit the height of the screen. I tried Texture lSideTexture new Texture(Gdx.files.internal("wall.png")) lSideTexture.setWrap(Texture.TextureWrap.Repeat, Texture.TextureWrap.Repeat) lSideSprite new Sprite(lSideTexture) lSideSprite.setPosition( 50, 100 (height width) 2) lSideSprite.setSize(5,100 (height width)) But I am still getting a texture that has been stretched to fit the dimensions rather than repeated. Any ideas?
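Repeat wrap only kicks in when the texture coordinates actually go past 1.0, and a Sprite always maps the full region to 0..1, so the image gets stretched regardless of the wrap mode. Drawing the Texture directly with explicit UVs is one way around that (a sketch for inside render(), between batch.begin() and batch.end(); swap v and v2 if the bricks come out upside down):
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.SpriteBatch;

// setup, once
Texture wall = new Texture(Gdx.files.internal("wall.png"));
wall.setWrap(Texture.TextureWrap.Repeat, Texture.TextureWrap.Repeat);

// drawing: same dimensions as lSideSprite in the question
float w = 5f;
float h = 100f * (height / width);
float repeats = h / w;                      // square brick tile, repeat along the long side

batch.draw(wall, -50f, -h / 2f, w, h,
           0f, repeats, 1f, 0f);            // u, v, u2, v2; values beyond 1 trigger the repeat
The other common approach is to skip wrapping entirely and let a TiledDrawable (scene2d.utils) or a loop of full-tile draws cover the area, which also works when the tile sits inside a texture atlas where Repeat cannot be used.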
15
How to sort entities by nearest entity distance? I have created a comparator class called DistanceSquaredComparator that implements Comparator<Entity>. In the overridden compare method, I get the squared distance of each nearby position to the owner position, then return its signum as an int. Now the problem is I don't know if this is an efficient way of sorting the array by nearest distance. Entity owner @Override public int compare(Entity o1, Entity o2) Vector2 ownerPosition = Mapper.transform.get(owner).position Vector2 nearby1 = Mapper.transform.get(o1).position Vector2 nearby2 = Mapper.transform.get(o2).position return (int) Math.signum(nearby2.dst2(ownerPosition) - nearby1.dst2(ownerPosition)) usage DistanceSquaredComparator comparator = new DistanceSquaredComparator() ImmutableArray<Entity> players = ... ImmutableArray<Entity> enemies = ... Array<Entity> sortedEnemies = ... ... fill sortedEnemies for (Entity player : players) comparator.setOwner(player) sortedEnemies.sort(comparator) In the image below, the green circle is the observer while the red circles are the observables.