Why can't I add a float4 read from a Texture2D.Sample in the vertex shader? These lines compile fine:

```hlsl
float4 offset = HeightMap.Sample(HeightSampler, input.Texcoord);
input.Position.xyzw += float4(0, 1, 0, 0);
```

These do not (any use of offset together with the input/output position causes the error):

```hlsl
float4 offset = HeightMap.Sample(HeightSampler, input.Texcoord);
input.Position.xyzw += offset;
```

No matter what I try, using the offset values to change the positions breaks it:

```hlsl
float4 offset = HeightMap.Sample(HeightSampler, input.Texcoord);
matrix<float, 4, 4> offs = { 1, 0, 0, 0,
                             0, 1, 0, 0,
                             0, 0, 1, 0,
                             offset.x, offset.y, offset.z, 1 };
output.Position = input.Position;
output.Position = mul(output.Position, offs);
```

Debug output:

```
(32,9): error X4532: cannot map expression to vs_5_0 instruction set
(62,19): There was an error compiling expression
(49): Error compiling effect hr=S_OK (0x00000000)
```

Anyone have an idea why?
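For context, the usual cause: `Texture2D.Sample` needs screen-space derivatives to pick a mip level, and those only exist in pixel shaders, so vs_5_0 cannot map the instruction. A minimal sketch of the common fix, reusing the names above with an explicit LOD:

```hlsl
// SampleLevel works in vertex shaders because the mip level is explicit (0 here).
float4 offset = HeightMap.SampleLevel(HeightSampler, input.Texcoord, 0);
output.Position = input.Position + float4(0, offset.x, 0, 0); // example displacement along Y
```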
Exporting models from Crocotile3D to Unreal Engine 4 (.obj): textures are always blurry, but I want crisp, clean textures. Can anyone help me? I am exporting models with textures (.obj) from Crocotile3D to Unreal Engine 4. I want them to be crisp and clear, as in the left image (Crocotile editor), but I only get blurry results, as in the right image (Unreal editor). Whether I export the model at scale x1 or greater doesn't make any difference. My textures are 16x16 pixels. I have watched endless videos and tinkered with settings but have been stuck for two days. Any help or advice will be much appreciated!
Tool to convert textures to power of two? I'm currently porting a game to a new platform; the problem is that the old platform accepted non-power-of-two textures and this new platform doesn't. To add to the headache, the new platform has much less memory, so we want to use the tools provided by the vendor to compress the textures, which of course only accept power-of-two inputs. The current workflow is to convert the non-power-of-two textures to DDS with texconv, then run the vendor's compression tools in a batch. So, does anyone know of a tool to convert textures to their nearest power-of-two counterparts? Thanks
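For what it's worth, texconv itself may already cover this step: recent DirectXTex builds have a `-pow2` switch that fits each dimension to a power of two during conversion (worth confirming with `texconv -?` on your build). A sketch of the batch step, filenames hypothetical:

```
texconv -pow2 -o converted textures\*.png
```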
OpenGL ES texture rendered not as expected with disproportional aspect ratio triangles. I cannot seem to understand how texture coordinates work. I try to render a texture onto two triangles and this is what I get, where the expected output is a normal continuous image, as you can imagine. Triangle coordinates (where w = 1080, h = 1920): `0, 0, w, 0, 0, h, w, 0, 0, h, w, h`. Texture coordinates: `0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0, 0`. Texture parameters:

```java
gl.glTexEnvf(GL10.GL_TEXTURE_ENV, GL10.GL_TEXTURE_ENV_MODE, GL10.GL_REPLACE);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
gl.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
```
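For reference, the pairing above looks mismatched on the second triangle: following the first triangle's convention (u = x/w, v = 1 - y/h), position (0, h) must carry UV (0, 0) and (w, h) must carry (1, 0), but the listed order gives them the other way around. A sketch of consistently paired arrays under that assumption:

```java
// position:  (0,0)   (w,0)   (0,h)     (w,0)   (0,h)   (w,h)
float[] vertices  = { 0, 0,   w, 0,   0, h,     w, 0,   0, h,   w, h };
// matching UVs with v flipped, as in the first triangle:
float[] texCoords = { 0, 1,   1, 1,   0, 0,     1, 1,   0, 0,   1, 0 };
```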
Where can I find animated textures for free? I'm looking for some animated textures, especially water textures that show movement. After a bit of searching, I could only find this list. Although those textures are certainly not bad, there are only 4 to choose from. I was wondering if there is another website with free animated textures. For static textures, I can find hundreds of websites which offer them for free, but for animated textures, websites seem to be rare. Does anyone know of a website that offers free animated textures?
Are global shader variables slower than texture lookups? I want to send quite a bit of data to the GPU; the data will never change (or will change very rarely) after it has been created. Is there a performance impact to using global shader variables, or should I pack the data into a texture and perform lookups?
Shimmering when scrolling/panning a texture. I understand shimmering in textures is caused by sub-pixel accuracy aliasing against the regular grid of pixels on the screen when drawn. I have a good example of it here, whereby I'm attempting to scroll/pan a 2D texture by modifying the texture coordinates rather than by moving any actual geometry. The texture is "wrapped", and my idea was to take the bit falling off the left (or right) and fill it in on the other side, to create a single scrolling/panning surface of (almost) infinite size. Unfortunately, with random noise on the texture, when I'm zoomed all the way out the shimmering is kind of horrific. When I zoom in a bit it seems OK, which is kind of confusing to me. But anyway, is there a filter of some kind I can put into my shader to prevent this shimmering? I'm at something of a loss here. (To complicate matters, my texture is actually just L16 luminance and I'm palettising it in the shader: the L16 is sampled with LINEAR, the palette texture with POINT, though I'm pretty sure this makes little difference to the shimmering.) Please note that with the video, YouTube has done its absolute best to remove any detail from it, meaning it's difficult to see the shimmer under the encoding artifacts.
Removing texture wrapping artifacts in OpenGL ES. I'm drawing a 2D array of cubes in 3D space using OpenGL ES. The texture that I have bound is spritesheet-style, that is, multiple textures packed into one. If you look closely at the front of the image below, you can see 1-pixel-thick lines that are lighter than the crates, and thicker lines in the back of the image. They are not supposed to be there, and are what I believe to be pixels from another texture in the spritesheet. I'm using GL_NEAREST_MIPMAP_NEAREST and GL_NEAREST for the min and mag filters, and I've tried all of the different clamping modes and nothing changes (GL_CLAMP_TO_EDGE, GL_REPEAT, GL_MIRRORED_REPEAT). At first I thought it was float precision error, but now I think it's something else (since using `precision highp float` in the shader makes no difference).
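For context, a sketch of the usual fix: with mipmapping, adjacent sprites bleed into each other's downsampled levels regardless of wrap mode, so atlases typically need padding around each sprite and/or UVs inset by half a texel. All names below are hypothetical:

```c
/* Inset each sprite's UV rectangle by half a texel so the sampler
 * never reads the neighbouring sprite, even at minified mip levels. */
float half_texel = 0.5f / atlas_size;                    /* atlas_size in texels */
float u0 =  tile_x      * tile_size / atlas_size + half_texel;
float v0 =  tile_y      * tile_size / atlas_size + half_texel;
float u1 = (tile_x + 1) * tile_size / atlas_size - half_texel;
float v1 = (tile_y + 1) * tile_size / atlas_size - half_texel;
```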
Why are texture atlases called texture atlases and not image atlases? More specifically, is there anything about texture atlases specific to textures? From what I've read of them, they might just as well be called image atlases. Images are not the same as textures, but textures can be created out of images.
UE4: apply texture on landscape based on angle. I am creating a material for my landscape. I know that one could, for example, use the landscape's heightmap as a mask to apply textures at certain heights, but I want to apply textures at certain angles. For example, applying a cliff texture if the angle at that point in the landscape is larger than 45 degrees. Thanks in advance!
GLSL: pack floats into an RGBA texture. I want to compose conventional triangle-based models and particles with a ray-traced scene at a reasonable frame rate. WebGL does not let you write gl_FragDepth in the fragment shader. You cannot have multiple render targets, but you can render to an RGBA texture and then use that texture as input to another draw op. So I can render my ray-traced scene and store the depth as input to the next stage. This is obviously far from ideal, but workable. (I'd love to be wrong about this; it is gathered from trial and error rather than any definitive source, so please correct any flawed assumptions.) How can you pack/unpack a float and, ideally, some flag bits and such, into an RGBA texture efficiently in a GLSL fragment shader?
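For context, a widely used depth-packing sketch; it assumes the value is in [0, 1), and the last channel could instead be reserved for flag bits at the cost of precision:

```glsl
// Spread a [0,1) float across the four 8-bit channels of gl_FragColor.
vec4 packFloat(float v) {
    vec4 enc = vec4(1.0, 255.0, 65025.0, 16581375.0) * v;
    enc = fract(enc);
    enc -= enc.yzww * vec4(1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0);
    return enc;
}

// Reassemble the float from the RGBA sample in the next pass.
float unpackFloat(vec4 rgba) {
    return dot(rgba, vec4(1.0, 1.0 / 255.0, 1.0 / 65025.0, 1.0 / 16581375.0));
}
```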
How to expose a child node's texture from the parent in Godot. I built a simple scene that I instantiate at will: simply a KinematicBody2D as root, containing a Sprite and a CollisionShape2D. There is a script linked to the root node that basically exposes a few properties to describe an orbit and make it move in orbit around its parent. The point is to make a very basic solar system simulation, or anything that requires a node to move around another in a circular fashion. The movement works fine and I've been playing around my scene tree, instancing a bunch and watching them move around as moons of planets, etc. But the texture is always the same. If I make children editable, I can go change the texture of the sprite, but I'd like to expose that property from the root node so I can access it directly from the editor. I tried adding an exposed property with a setter and a getter to assign and retrieve the texture, but it seems to crash Godot completely with this code:

```gdscript
export(Texture) onready var texture setget texture_set, texture_get

func texture_set(newtexture):
    $Sprite.texture = newtexture

func texture_get():
    return $Sprite.texture
```

Is there a solution for that? Or simply another way?
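For context, a pattern that avoids the crash, sketched for Godot 3.x as an assumption: combining `export`, `onready` and `setget` on one property is what tends to blow up, because the setter fires in the editor and before the node enters the tree. Dropping `onready` and guarding the setter works around it:

```gdscript
export(Texture) var texture setget texture_set

func texture_set(new_texture):
    texture = new_texture          # direct write does not re-trigger the setter
    if has_node("Sprite"):         # the setter can run before children exist
        $Sprite.texture = new_texture

func _ready():
    $Sprite.texture = texture      # apply the editor-assigned value once ready
```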
Other procedural material generators for Unity? In the new version of Unity, 3.4, Unity announced that they would now support procedural materials (which is awesome, by the way). While I was researching it, I found this in the manual: "Allegorithmic's Substance Designer can be used to create Procedural Materials, but there are other applications (3D modeling apps, for example) that incorporate the Substance technology and work just as well with Unity." I don't have Allegorithmic's Substance Designer and don't plan on buying it soon. What other applications or 3D modeling apps can make procedural materials that work in Unity? Edit: I found that Allegorithmic has a program called Player, but it is Windows-only and I'm on Mac.
Best way to load multiple TextureRegions from multiple bitmaps in AndEngine. I've seen a lot of examples of AndEngine usage where some bitmaps are mapped into some TextureRegions and loaded into an Atlas. But what if I have a lot of bitmaps, where each bitmap is a sprite sheet with several sprites for a given entity? How can I extract multiple texture regions from each bitmap and load all that into a single Atlas?
How do I stop projected textures appearing on the "back" of the mesh as well? I want to create blood wounds on my characters' bodies by using projected textures. I've watched some commentaries on games like Left 4 Dead and they say they use projected textures for the blood. But the way projected textures work is that if you project a texture onto a rigged character, say his chest, it will also appear on his back. So what's the trick? How do you get projected textures to appear only on one "side" of the mesh? I use the Panda3D game engine, if that helps.
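For context, the usual trick is to reject surfaces that face away from the projector by comparing the surface normal with the projection direction; a shader-style sketch (names hypothetical, not Panda3D-specific):

```glsl
// projDir points from the projector into the scene.
float facing = dot(normalize(worldNormal), -projDir);
// Only surfaces facing the projector receive the decal; back faces get zero.
vec4 blood = texture2D(bloodTex, projUV) * step(0.0, facing);
```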
Projecting a moon texture onto the background. I want to display a 2D moon image on the background. I have a normalized direction for the center of my moon texture, and the normalized view direction from the camera for each pixel. I'm working in HLSL and I need to find the UV for the texture. How can I do this without the texture being distorted? I tried a variety of spherical UV mapping equations but none of them worked. This post works, but only if d is (0,0,1) in a right-handed Y-up coordinate system: "How can I calculate the U,V texture coordinates on a disk at infinity given only a view vector and a vector pointing to the disk's center?" It seems like you have to rotate the up/right vectors with the moon's location, but how?
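A sketch of the basis-rotation idea the question hints at, assuming moonDir and viewDir are normalized and moonAngularRadius is the disk's half-angle (all names hypothetical):

```hlsl
// Orthonormal frame around the moon's direction (assumes moonDir is not +Y).
float3 right = normalize(cross(float3(0, 1, 0), moonDir));
float3 up    = cross(moonDir, right);

// Coordinates of the view ray in that frame, remapped to [0,1].
// sin(moonAngularRadius) is the disk's apparent radius on the unit sphere.
float2 uv = float2(dot(viewDir, right), dot(viewDir, up));
uv = uv / sin(moonAngularRadius) * 0.5 + 0.5;

// Sample only where the pixel actually looks at the disk:
// dot(viewDir, moonDir) > cos(moonAngularRadius) bounds it.
```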
How can I spread my texture evenly across a cylinder in Unreal Engine 4? Specifically, in the image below, I would like the texture on the cylinder (on the left) to match the one on the box (on the right). I need many sides because I want the cylinder to be very smooth.
How to achieve highly detailed textures on buildings/architecture? I have a bit of experience when it comes to 3D modeling, but something I never really wrapped my head around is environment modeling. You know, you walk down a road and you see lots of buildings; they all look good, with great texture detail, and yet they don't go over the texture budget and the framerate is nice. Now I'm working on a little prototype, and I wanted to test how to make assets for it. This is supposed to be one floor of a fantasy tower. I did the modeling in Blender (with a rough UV), then I exported it into Quixel and made a simple texture, 4K resolution because why not at this stage. The problem is, when I look at it in-game or in Blender, it's quite awful. This is a single texture and a single material for the whole mesh. I was expecting this result, because the wall is some 20 meters long, so the texel density is really low. But this is already a 4K texture, when I should really be using a 512 or 1K texture. I mean, when you look at modern games, stuff like this looks sharp enough, with lower-res textures than what I'm using now. So, basically, what is the right approach to something like this? I was thinking about something like this: have the "core" with a low-res tiling texture, and then details as separate meshes that I would build in the engine (UE4 in this case). But I'm kind of worried about performance. I mean, one floor would be at least 15 meshes, so 15 draw calls, so for ten floors it would be 150 draw calls or something; I could barely make a village this way. I know some games do fully modular buildings, like Fallout 4: they have a bunch of "kits" made of 2x2m blocks that snap together, and each block has its texture and it looks good. But I don't really need that level of granularity, and I want more individual-looking buildings that don't look copy-pasted. And even if I did go that way, if one building is like 100 pieces alone, that's some 100 draw calls; how does that even work in-game? An example from RDR2: how would you go about making something like this? It can't possibly be one single mesh with just one texture. How can I achieve this kind of result?
34 | " SRGB" suffix for BC texture format doesn't result in sRGB to linear correction at sampling I am working on a 3D engine as a hobby (Direct3D 11). Currently I am trying to implement sRGB gt linear gt sRGB color space conversions via texture formats with quot SRGB quot suffix. So, my textures are supposed to be sRGB images (for example, DDS files compressed in BC1 UNORM SRGB format), output is also gamma corrected (thanks to R8G8B8A8 UNORM SRGB frame buffer) and all shader calculations are made in linear color space. The problem is, whether I use sRGB or non sRGB format for the DDS file, after sampling the texture I get exactly the same color values inside the shader that are stored in original image. But if I get it correctly, sampler should apply implicit gamma correction (pow(color, 2.2f)) for the input values. So, for example, if I want to output the same color as is in the map (lets say, 0.5f for R channel), I sample it from the texture (0.5f becomes 0.218f after transferring to linear color space by sampler), then I do nothing with it inside the shader and send to output. As the frame buffer has sRGB format, merger (or some other part of pipeline, that does it) will apply gamma re correction to our value (pow(color, 1.0f 2.2f)), and 0.218f will become 0.5f again. I will get the same image as I had on the input. The output colors are definitely gamma corrected, as I see clear difference when I change frame buffer format from sRGB to non sRGB. But as no input correction is applied, the final image looks brightened comparing to the input one. As I said, I checked the value in pixel shader after texture.Sample() call, and it is exactly the same as colorpicker shows for the source texture. When I swap the texture format to non sRGB, nothing changes (I use Visual Studio 2013 Update 4 for format changing). To load DDS files I use DDSTextureLoader. I also called Direct3D methods directly to create resource and view from the file, but nothing changed. Both resource and shader resource view have quot SRGB quot format, as I can see in Graphics Debugger, so they are definitely sRGB ones. I've read at MSDN that in Direct3D 11 setting texture format is enough for sampler to recognize sRGB image. Is there something I am doing wrong or missing, or understand incorrectly? Maybe someone had similar issues? Any advice would be highly appreciated! |
Why does loading 250 MB of compressed texture data throw an out-of-memory error on Windows? I am making an SDL2/OpenGL 2D game for Windows with a lot of pre-rendered sprites. When testing on my laptop I got an "SDL Surface creation failed: out of memory" error while loading assets. Windows' application manager on the other machine shows that my game under stress uses less than 170 MB of RAM, and MSI Afterburner shows that VRAM usage for the entire system is below 800 MB. My laptop has 8 GB of RAM and 2048 MB of VRAM (GeForce 840M). I am obviously doing something terribly wrong, but I don't know whether my RAM usage counting technique is wrong, or maybe for some reason I have less memory than the system reports, or something else entirely is happening; I don't know where to search. I will gladly post some code if needed, but I don't know which parts will be useful in diagnosing the problem. One more thing: I am using GL_COMPRESSED_RGBA_ARB for all textures.

Edited: It's not exactly the same situation as what's described here. I've already measured VRAM usage, and the numbers are way below the available amount, yet I still get the out-of-memory error. So the question is "How do I measure VRAM usage properly", not "How do I measure VRAM usage", as in the other question. (I am not a native English speaker, so maybe there is some cloud in my title; I will gladly accept corrections.)

Edited: I reckon the image load procedure might be useful here, since it is heavily related to the issue:

```cpp
Game::Texture* Game::Asset::GetTex(const UString& path)
{
    Texture* ret = assetsTex[path];
    if (ret)
        return ret;

    const UString& absolutePath = GetAbsolutePath(path);
    SDL_Surface* surface = IMG_Load(absolutePath.GetCStr());
    if (!surface)
        ERR(U("Failed to load surface %1"), UString::FromA(IMG_GetError())); // Here's my problem!

    int bpp = surface->format->BytesPerPixel;
    UInt8* pixels = (UInt8*)surface->pixels;
    if (bpp != 3 && bpp != 4)
    {
        WARN(U("Texture '%1' is not 24 bpp or 32 bpp! Aborting load..."), path);
        SDL_FreeSurface(surface);
        return nullptr;
    }

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    if (bpp == 3)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_ARB, surface->w, surface->h, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, pixels);
    else if (bpp == 4)
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COMPRESSED_RGBA_ARB, surface->w, surface->h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    ret = new Texture(tex, surface->w, surface->h);
    SDL_FreeSurface(surface);
    assetsTex[path] = ret;
    return ret;
}
```

Edited: I moved from SDL_image to stb_image for image loading, but I still get the out-of-memory error message.

Edited: I've installed Visual Studio 2015 on my laptop and found out that Task Manager shows the application eating up 650 MB at the moment of the out-of-memory error. GetProcessMemoryInfo returns PeakWorkingSetSize = 720 MB and WorkingSetSize = 650 MB as well, but the Visual Studio 2015 UI (Debug > Diagnostic Tools, memory usage) shows that the application actually consumed 1.7 GB. Even if this is true and both former numbers are not, there should still be plenty of room for allocations (the application is 32-bit, and there are 8 GB of on-board memory, not counting the page file). I tried to convert the PNGs to TGA and load those, to check whether PNG decompression was eating memory, but the result was exactly the same: the same out-of-memory error. The out-of-memory error is thrown in the PNG load function, which has nothing to do with OpenGL and the GPU at all. I am retagging and retitling the post.
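One thing worth checking given the 32-bit build, sketched as an assumption rather than a confirmed diagnosis: a 32-bit Windows process gets only 2 GB of user address space by default no matter how much RAM is installed, and heap fragmentation can make a 250 MB allocation fail well before that cap. The MSVC linker flag below (or the editbin equivalent for an existing binary) raises the limit to 4 GB on 64-bit Windows:

```
link /LARGEADDRESSAWARE ...other options...
editbin /LARGEADDRESSAWARE game.exe
```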
Replicating a number of sprites without letting the app slow down and crash. In a simple drag-and-drop game, does making a new sprite via a constructor that takes a texture as a parameter make the game slower and deplete more memory until it crashes, or does making a new texture via a constructor cause the slowdown? I'm working on a revision of a drag-and-drop puzzle game where the only goal is to stack the most objects you can before they fall off the platform. Is it a good idea to create another sprite (with a texture as a parameter) every time the player drags and drops another object? Here's what the start of the dry run looked like; I set it to debug mode so that the platform completely blocks the pit for the replication test. Afterwards, when the count reaches from hundreds into thousands, the game gets slower. When the game crashes, here's the result via logcat:

```
12-11 16:38:53.953 E/AudioTrack(25701): AudioFlinger could not create track, status: -22
12-11 16:38:53.953 E/SoundPool(25701): Error creating AudioTrack
12-11 16:38:53.993 D/dalvikvm(25701): GC_EXPLICIT freed 947K, 14% free 14414K/16672K, paused 2ms+4ms, total 96ms
12-11 16:39:16.363 E/AudioTrack(25701): AudioFlinger could not create track, status: -22
12-11 16:39:16.363 E/SoundPool(25701): Error creating AudioTrack
12-11 16:39:16.433 D/dalvikvm(25701): GC_FOR_ALLOC freed 2120K, 15% free 14418K/16848K, paused 47ms, total 47ms
```

By the way, I'm using the BodyEditor library for Box2D (a physics engine) for the bodies and rendering.
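For context, a sketch of the usual pattern (assuming libGDX, which the Box2D and logcat details suggest): load each image once as a Texture and share it between sprites, because a Sprite is a lightweight object but every new Texture uploads another copy of the bitmap to the GPU:

```java
// Loaded once, e.g. at level start.
Texture crateTexture = new Texture(Gdx.files.internal("crate.png"));

// Created per dropped object: cheap, reuses the same GPU texture.
Sprite crate = new Sprite(crateTexture);
crate.setPosition(dropX, dropY);
```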
Ogre3D 2.1: how to apply a texture to a mesh? I have searched forums and tried to read the Ogre 2.1 samples, but I still have no clue how to apply a texture to a mesh. Here is what I've done so far. I use Easy Ogre Exporter to export the scene (actually I just need the model and texture). I got the following files: modelRoot.mesh, modelRoot.skeleton, model.material, model.tga. Note: I also use OgreMeshTool_d.exe to upgrade the mesh from v1 to v2. I render this model in Ogre. In the file resources2.cfg:

```
[Essential]
Zip=../Data/DebugPack.zip
```

I add the model files (the 4 files I got from Easy Ogre Exporter in step 1 above) into the DebugPack.zip file, and I use this code to add the model to the Ogre scene:

```cpp
void MeshHelper::CreateMesh(Ogre::String szFileName, Ogre::Vector3* scale)
{
    // Bring the mesh into the scene
    m_meshItem = m_sceneManager->createItem(szFileName,
        Ogre::ResourceGroupManager::AUTODETECT_RESOURCE_GROUP_NAME,
        Ogre::SCENE_DYNAMIC);
    m_meshSceneNode = m_sceneManager->getRootSceneNode(Ogre::SCENE_DYNAMIC)
        ->createChildSceneNode(Ogre::SCENE_DYNAMIC);
    m_meshSceneNode->attachObject(m_meshItem);
    if (nullptr != scale)
        m_meshSceneNode->scale(scale->x, scale->y, scale->z);
}

// ...
meshHelper.CreateMesh("modelRoot.mesh", &meshScale);
```

But I can only render the model without the texture. Please help me apply these to my model: model.material, model.tga. Thanks for reading :)
Generating mipmaps for a volume texture. Does texconv from DirectXTex support generating mipmaps for volume textures? If not, what are the alternatives? I am asking because I have tried to create the mipmaps with `texconv.exe -m 3 my.dds`, but the result looks wrong (I am viewing it in the DDS plugin for Photoshop). Now I wonder whether the DDS plugin is bugged so that it displays only the mipmap for the first texture, or whether texconv doesn't support it. EDIT: Actually, taking a deeper look at this image, the mipmap is somehow blended from these 2 main textures. Why? (high-resolution texture here)
Difference between texture arrays and multiple single textures? I've just learnt that DirectX 10 and above have a feature called "texture arrays", which, basically, is just a normal array of textures (shader resources), declared in a shader like:

```hlsl
Texture2D myTextures[2];
```

What I've been using so far is multiple separate textures:

```hlsl
Texture2D myFirstTexture;
Texture2D mySecondTexture;
```

Is there any practical (performance, memory, etc.) difference between the two?
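Background that may help frame the comparison: an HLSL array of Texture2D like the above is still N separate bind points, while a Texture2DArray is a single resource whose slices share one binding and can be selected per pixel. A sketch:

```hlsl
Texture2D      myTextures[2]; // two views bound to two register slots
Texture2DArray myTexArray;    // one view, N slices in one resource

// Sampling a Texture2DArray: the third coordinate selects the slice.
float4 c = myTexArray.Sample(samp, float3(uv, (float)sliceIndex));
```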
Reading from a Texture2D resource in DirectX 11. Hi, I am trying to read data from a resource, which I used to do without any problems, but suddenly it is not working. First I make an immutable resource that has data in it, which here is XMFLOAT4(1,1,1,1). Next I make a staging resource for reading. Lastly, I call Map/Unmap to read and store the data into outputArr (all HRESULTs checked already).

```cpp
int WIDTH = 10, HEIGHT = 2;

ID3D11Texture2D* resource; // create texture
D3D11_TEXTURE2D_DESC texDesc;
texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
texDesc.Usage = D3D11_USAGE_IMMUTABLE;
texDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
texDesc.Width = WIDTH;
texDesc.Height = HEIGHT;
texDesc.CPUAccessFlags = 0;
texDesc.ArraySize = 1;
texDesc.MipLevels = 1;
texDesc.SampleDesc.Count = 1;
texDesc.SampleDesc.Quality = 0;
texDesc.MiscFlags = 0;

XMFLOAT4* initValues = new XMFLOAT4[WIDTH * HEIGHT];
for (int i = 0; i < WIDTH * HEIGHT; i++)
    initValues[i] = XMFLOAT4(1, 1, 1, 1);

D3D11_SUBRESOURCE_DATA data;
data.pSysMem = initValues;
data.SysMemPitch = sizeof(XMFLOAT4) * WIDTH;
data.SysMemSlicePitch = 0;
device->CreateTexture2D(&texDesc, &data, &resource);

ID3D11Texture2D* staging; // create texture for reading
D3D11_TEXTURE2D_DESC stgDesc;
stgDesc.BindFlags = 0;
stgDesc.Usage = D3D11_USAGE_STAGING;
stgDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
stgDesc.Width = WIDTH;
stgDesc.Height = HEIGHT;
stgDesc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
stgDesc.ArraySize = 1;
stgDesc.MipLevels = 1;
stgDesc.SampleDesc.Count = 1;
stgDesc.SampleDesc.Quality = 0;
stgDesc.MiscFlags = 0;
device->CreateTexture2D(&stgDesc, nullptr, &staging);

XMFLOAT4* outputArr = new XMFLOAT4[WIDTH * HEIGHT];

// READ
dContext->CopyResource(staging, resource);
D3D11_MAPPED_SUBRESOURCE mappedResource;
ZeroMemory(&mappedResource, sizeof(D3D11_MAPPED_SUBRESOURCE));
dContext->Map(staging, 0, D3D11_MAP_READ, 0, &mappedResource);
outputArr = reinterpret_cast<XMFLOAT4*>(mappedResource.pData);
std::vector<XMFLOAT4> testV;
for (int y = 0; y < HEIGHT; y++)
    for (int x = 0; x < WIDTH; x++)
    {
        int idx = y * WIDTH + x;
        testV.push_back(outputArr[idx]);
    }
dContext->Unmap(staging, 0);
```

It turns out that only when WIDTH is a multiple of 16 (HEIGHT doesn't seem to matter here) does it copy the data into ALL elements of the array; otherwise it fills in just 0 until the next 16th element. For example, if width × height is 10 × 2, the first 10 elements of outputArr have proper data and the next 6 elements have just 0, then another 10 elements with data, 6 elements with 0, and so on. I haven't had any problems dealing with resources before, and I'm still struggling. My humble assumption is that there might be a specific alignment in the width of the resource that I'm missing, or a silly mistake in my process. Hope anyone can find something in this question. Thanks.
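For reference, the "multiple of 16" symptom is what mapped row padding looks like: Map returns rows aligned to mappedResource.RowPitch, which can be wider than WIDTH * sizeof(XMFLOAT4), so the data must be copied row by row rather than read as one contiguous block. A sketch against the code above:

```cpp
dContext->Map(staging, 0, D3D11_MAP_READ, 0, &mappedResource);
const uint8_t* src = static_cast<const uint8_t*>(mappedResource.pData);
for (int y = 0; y < HEIGHT; ++y)
{
    // Step by RowPitch (driver-chosen), copy only the meaningful bytes per row.
    memcpy(&outputArr[y * WIDTH], src + y * mappedResource.RowPitch,
           WIDTH * sizeof(XMFLOAT4));
}
dContext->Unmap(staging, 0);
```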
HLSL: an array of textures and sampler states. The shader must switch between multiple textures depending on the alpha value of the original texture for each pixel. Now, this would work fine if I didn't have to worry about sampler states. I have created my array of textures and can select a texture based on the alpha value of the pixel. But how do I create an array of sampler states and link it to my array of textures? I attempted to treat the sampler state as a function by adding the `(int i)`, but that didn't work. Also, I can't use Texture.Sample since this is shader model 2.0:

```hlsl
// shader model 2.0 (DX9)
texture subTextures[255];

SamplerState MeshTextureSampler(int i)
{
    Texture = (subTextures[i]);
};

float4 SampleCompoundTexture(float2 texCoord, float4 diffuse)
{
    float4 SelectedColor = SAMPLE_TEXTURE(Texture, texCoord);
    int i = SelectedColor.a;
    texture SelectedTx = subTextures[i];
    return tex2D(MeshTextureSampler(i), texCoord) * diffuse;
}
```
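For context, a sketch of the usual workaround: ps_2_0 has no dynamic sampler indexing at all, so each texture needs its own statically declared sampler, selected with ordinary comparisons (or, more commonly, the sub-images are packed into one atlas so a single sampler suffices). Register assignments below are assumptions:

```hlsl
sampler2D samplerA : register(s0); // alpha-keyed "selector" texture
sampler2D samplerB : register(s1); // alternative texture

float4 SampleCompoundTexture(float2 texCoord, float4 diffuse)
{
    float4 selected = tex2D(samplerA, texCoord);
    // Both fetches happen; the comparison just picks the result (ps_2_0-safe).
    float4 color = (selected.a < 0.5)
        ? tex2D(samplerA, texCoord)
        : tex2D(samplerB, texCoord);
    return color * diffuse;
}
```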
OpenGL ES 2.0: multiple textures fade out one by one. I have multiple textures: A is drawn on top of B, B is on top of C, and so on (like Picture 1). I want to fade out A first; then B will appear and fade out as well. But when A fades out, the background color appears on top of B (like Picture 2). Why? What should I do? I have enabled blending like this:

```c
glClearColor(0, 104.0 / 255.0, 55.0 / 255.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glEnable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
```

and the fragment shader:

```glsl
void main(void) {
    gl_FragColor = texture2D(Texture, TexCoordOut);
    gl_FragColor *= alpha;
}
```

Picture 1: the background color is green; A and B are both rectangles, A on top of B. Picture 2: when I fade out A, the result looks like this. I also tried glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), but it doesn't work either. Update 1, shaders:

```glsl
// vertex shader
attribute vec4 Position;
uniform mat4 Projection;
uniform mat4 Modelview;
varying vec4 DestinationColor;
attribute vec2 TexCoordIn; // New
varying vec2 TexCoordOut;  // New

void main(void) {
    gl_Position = Projection * Modelview * Position;
    TexCoordOut = TexCoordIn; // New
}
```

```glsl
// fragment shader
uniform highp float alpha;
uniform sampler2D Texture;      // New
varying highp vec2 TexCoordOut; // New

void main(void) {
    gl_FragColor = texture2D(Texture, TexCoordOut);
    gl_FragColor.a *= alpha;
}
```
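For context, a sketch of the usual cause: with depth testing on, the quad drawn first (A) writes depth even where it is transparent, so B behind it fails the depth test and the clear color shows through. Drawing blended quads back to front with depth writes disabled avoids that:

```c
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); /* premultiplied alpha */
glDepthMask(GL_FALSE);  /* still test against opaque depth, but don't write it */
/* draw C, then B, then A (back to front) */
glDepthMask(GL_TRUE);
```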
Tools for assembling textures into DDS files. There are plenty of tools for making images; I'm not looking for one of those. I have many tools for creating an image: tools for compressing images, generating mipmaps, and even for poking at their basic data format. My issue is with texture assembly. DDS files support cubemaps, array textures, and even cubemap arrays, but I don't know of a tool that can pack a series of images into a cubemap or the like. What tools are available for doing this kind of thing?
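For reference, DirectXTex ships a companion tool, texassemble, aimed at exactly this kind of packing (cubemaps, arrays, volumes); a cubemap sketch with hypothetical face filenames:

```
texassemble cube -o environment.dds posx.png negx.png posy.png negy.png posz.png negz.png
```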
How to change a texture from a .pvr.ccz in Cocos2d? I'm using a .pvr.ccz file that the TexturePacker application created for me. I created a sprite whose image is chosen from this .pvr.ccz file, like below:

```cpp
CCSprite* mySprite = CCSprite::createWithSpriteFrameName("first.png");
```

But how can I change its texture using the .pvr.ccz file? I know the solution below for changing the texture of a sprite, but it only works when second.png is a normal PNG file, not compressed inside a .pvr.ccz file:

```cpp
mySprite->setTexture(CCTextureCache::sharedTextureCache()->addImage("second.png"));
```
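One sketch of the sprite-frame route (cocos2d-x 2.x API, assuming TexturePacker also emitted a .plist next to the .pvr.ccz):

```cpp
// Load the sheet once; this registers every frame name from the compressed texture.
CCSpriteFrameCache::sharedSpriteFrameCache()->addSpriteFramesWithFile("sheet.plist");

// Swap the sprite to another frame of the same .pvr.ccz sheet.
mySprite->setDisplayFrame(
    CCSpriteFrameCache::sharedSpriteFrameCache()->spriteFrameByName("second.png"));
```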
How to get the (human) face position from a webcam texture. I am searching for a way to calculate the current position of the face, in virtual-world coordinates, from a webcam texture. I was thinking about solving this by taking the average pupillary distance (which is 60 mm); with this number I somehow need to get the actual position of the face (its center), but I don't know exactly how. Any ideas?
How to make part of a texture not scale? I am new to libGDX. I was working through an example for learning and got stuck on this question; I searched Google but could not find an answer. How can I make part of the texture not scale? As you can see in the image, I am reducing the height of the bottom pipes, so the top pipe is scaled down to fit the given height, but I want the top part (the cap) of the pipe not to be scaled. In this example the whole pipe is one texture. To fix this I thought of splitting the cap and body into separate textures and setting the height only for the body, but I figured there should be another solution, which is why I'm asking here (see the sketch below).
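For context, a sketch of one standard libGDX answer: a NinePatch splits the texture into regions so the caps keep their pixel size while only the middle stretches (the split values here are hypothetical and depend on the art):

```java
// A 24px top cap that is never scaled; the body below it stretches to fit.
NinePatch pipe = new NinePatch(new TextureRegion(pipeTexture), 0, 0, 24, 0);
pipe.draw(batch, x, y, pipeWidth, desiredHeight);
```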
How do I make a texture appear the same way on all faces of a cube? My problem seems so simple that before asking the question here I did a search on Google and this site: https://forums.unrealengine.com/development-discussion/content-creation/33665-applying-and-manipulating-textures-on-actors, https://www.youtube.com/watch?v=p6O4AgSwTmQ, "Trouble applying a texture to a cube", "How to produce a texture to represent a vector field", "Why do I get a blank material in Unreal Engine 4?", "How to create smoke that spreads outward in all directions?" But nothing seemed related to my problem. I imported this image.png, created a material, and added the image to it. But I did not get the expected result in the game (I wish this yellow line were under the pink arrow that I drew). Innocently I thought the solution was obvious, so I added another image.png and added it to the material. To my surprise, the result was worse: the yellow line did not even appear. I realized this happens because the floor of my game is a "stretched" cube, so the image I added is appearing correctly on one of its faces, just not the one I want. I'd like to know how to make it look the same on all faces, or some value I can change so that it appears correctly on the face I want. EDIT 1 (attempt using a rotator): I put Time into the input to see if any value would be compatible with what I want, but none is. I tried with both texture images, and both presented the same behavior. Notice that while spinning, it looks like the axis rotates around the top-left of the floor; for some moments the yellow line disappears.
Minecraft mapping from a Tibia OTBM file. I am new to Minecraft modding, but I program for a living and it looks quite simple to get around. I have this fantasy, and a quick internet search shows I'm not the only one. For those of you who do not know, Tibia is an MMORPG first released in 1997. It is a 2D tile-based game and I would like to see it in 3D; since it is tile-based, it fits nicely with the Minecraft architecture. Tibia is a closed-source game, but there is TibiaOT, which is developed by the community, and its map file format is OTBM (Open Tibia Binary Map). This is a set of 15 2D grids, where each grid represents a floor (the height of the grid); the file comes with two more files containing monster spawns and default item locations like chests. There is a Tibia texture pack for Minecraft called "Tibian texture pack", so we have the materials covered. All in all we have the following: a native Tibia map and a Minecraft texture pack. People have been building Tibia maps manually for years now, which seems silly because we have a binary file for the real map. What we need is a way to map the OTBM file to a Minecraft map. I am looking for tools and any other tips on how to digest the files into a Minecraft map file. An explanation of the Minecraft map file format would also be very helpful.
How can I pack repeated textures into one texture efficiently? I have an .obj mesh with some textures; several of them are repeated (the UVs are not within [0, 1]). I'd like to merge all those textures into one texture and transform the UVs of the .obj accordingly.
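One workaround sketch, for context: keep each tiled texture in its own atlas cell and emulate wrapping in the shader by applying fract() to the original UVs before remapping into the cell. cellOrigin/cellSize are hypothetical per-material constants, and note that mipmapping across cell borders still needs padding:

```glsl
uniform sampler2D atlas;
uniform vec2 cellOrigin; // cell's lower-left corner in atlas UV space
uniform vec2 cellSize;   // cell's extent in atlas UV space
varying vec2 tiledUV;    // original repeating UVs, e.g. 0..4

void main() {
    vec2 wrapped = fract(tiledUV);                 // emulate GL_REPEAT per cell
    gl_FragColor = texture2D(atlas, cellOrigin + wrapped * cellSize);
}
```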
Why are examples of game textures often shown on a sphere model? I have recently started working with Unity and was looking for textures for the background (castle walls and similar pieces) of a game that I am thinking about eventually building. I noticed that textures are often presented on a ball shape, as in this sample image. Why are the texture patterns not shown as simply flat squares? I guess there is a reason behind this that has to do with mapping to 3D surfaces, but I was not sure, so I wanted to check here. Why are textures often represented on spherical shapes in examples and previews?
Displaying 2 textures on the same surface? I'm messing around with XNA in a very simple Minecraft-esque world I'm making, and I'm wondering how I can do the "breaking the block" visuals in XNA. If you're unfamiliar with the game: you begin interacting with the cube, and based on what tool you're using, the block takes X amount of time to break. There are about 10 different stages of cracking that the cube displays before it breaks. I'm curious how to display those 10 stages of cracking on top of the original skin of the block. I know the game works using a sprite sheet to color the blocks. This sprite sheet does not include every possible design at every possible stage of breaking; each stage is a different sprite with some transparency. In short, how can I quickly and efficiently overlay a texture on a surface that already has a texture? Do I have to combine them? If so, how?
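For reference, the overlay doesn't need to be combined into one texture ahead of time: drawing the block and then the semi-transparent crack sprite at the same position with alpha blending composites them on the GPU. A minimal XNA-style sketch (names hypothetical):

```csharp
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend);
spriteBatch.Draw(blockTexture, blockRect, Color.White);        // base skin
spriteBatch.Draw(crackStages[stage], blockRect, Color.White);  // transparent crack overlay
spriteBatch.End();
```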
MonoGame: texture anti-aliasing results in transparent edges. This is a follow-up question to a previous issue I faced with a game I'm trying to make. It's in a 2D isometric perspective and the levels are created from individual tiles. A simple 3x3 map can be seen here. The tiles are individually added to the sprite batch, and (at least, this is what I think is happening) each one gets anti-aliased on its edges, resulting in a small blur which creates the thin edges that you can see in the picture. To make matters worse, I actually want to provide somewhat more complex tiles in my game than what is displayed above. For example, I want to be able to show roads. To do so, I wanted to layer multiple sprites on top of each other, resulting in the picture below. I think you can see where I'm going here: the same problem I described with the aliasing is now occurring around each texture. So sad. I know I can use point sampling to make the textures align pixel-perfect, but even though this does work, the (relatively) high-resolution textures really do not seem fit for such a setting; I mean, these edges simply need to be anti-aliased. So, my question to you all is: is there a way to stitch my tiles (and tile pieces) together so that anti-aliasing does not ruin it with those annoying edges, while still producing a nicely anti-aliased result? I hope there is, and that someone here can show me how. Thanks, Rutger. Edit 1: Changed the title to be more representative of the problem (I hope?).
Computing UV coordinates on a sphere approximated from an octahedron. Right now I'm texturing the octahedron with the formula from Wikipedia ("Finding UV on a sphere"). That unfortunately leads to heavy distortion right at the beginning, which of course leads to a lot of distortion when splitting up the triangles while approximating the sphere. The upper and lower triangles are okay; the left and right ones are distorted. Now I'm looking for different formulas to calculate the UVs. Are there any other easy formulas, or would another approach be much more complicated?
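For reference, the mapping in question written out as a shader helper, plus the detail that usually matters: if u is computed per vertex, triangles spanning a large longitude range interpolate it linearly and smear the texture, so evaluating the formula per pixel (or subdividing more before texturing) removes most of the distortion:

```glsl
const float PI = 3.14159265;

// Standard sphere mapping: p is a point on (or near) the unit sphere.
vec2 sphereUV(vec3 p) {
    vec3 d = normalize(p);
    float u = 0.5 + atan(d.z, d.x) / (2.0 * PI); // longitude
    float v = 0.5 - asin(d.y) / PI;              // latitude
    return vec2(u, v);
}
```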
Avoid double compression of resources. I am using .pngs for my textures and a virtual file system in a .zip file for my game project. This means my textures are compressed and decompressed twice. What are the solutions to this double-compression problem? One solution I've heard about is to use .tgas for textures, but it seems like ages since I've heard that. Another solution is to implement decompression on the GPU and, since that is fast, forget about the overhead.
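One common answer in sketch form: leave the PNGs as they are (already deflated) and add them to the archive with the "store" method, so the zip layer doesn't deflate them a second time:

```
zip -0 -r assets.zip textures/
```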
Best Practices for Texturing Large Objects. Kind of a 3D graphics newbie here. This is the biggest prop I've ever tried to texture, and I'm kind of stumped. How do you get a decent resolution on large meshes like this without exporting enormous maps? Texture atlases? Tiling textures? And what's the best way to set up the UVs, if so?
Converting DDS textures to TGA. I have a model which has textures in DDS format, but my game takes textures in TGA format. How can I convert the textures from DDS to TGA?
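One route in sketch form, assuming DirectXTex's texconv is available; it decodes DDS (including BC-compressed files) and -ft selects the output file type (filename hypothetical):

```
texconv -ft tga -o out model_diffuse.dds
```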
So, I need to create varied animal textures for my game. How would I go about doing that? For example, one of the creatures planned is an isopod-like animal. Would I need to get pictures of real animals and adjust them to my needs?
Alpha blending is too smooth. I was given an exercise to create a color map from a heightfield and given normals, using alpha blending. I have to blend 4 textures (deep/flat, deep/steep, high/flat, high/steep) depending on the height and slope. First I calculate the 3 alpha values (the lowest layer is fully opaque and does not need to be blended: the grass texture); slope = 1.0f - normal.z at the current pixel:

```cpp
void calcAlphas(float height, float slope, float& alpha1, float& alpha2, float& alpha3)
{
    alpha1 = (1 - height) * slope; // low & steep  -> dirt texture
    alpha2 = height;               // high & flat  -> pebbles texture
    alpha3 = height * slope;       // high & steep -> rock texture
}
```

After that I fetch the color of all four textures at the current pixel and calculate the final color:

```
final = color3 * alpha3
      + (1 - alpha3) * (color2 * alpha2
      + (1 - alpha2) * (color1 * alpha1
      + (1 - alpha1) * color0))
```

The result looks pretty good, but it's very smooth. To compare my result, I have a pattern showing how it should be: the original vs. my own color map.
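A sketch of one common fix for the over-smooth look: push each alpha toward 0 or 1 before blending so the transition bands narrow (the exponent is a tuning assumption):

```cpp
#include <cmath>

// Remaps a in [0,1] with fixed points at 0, 0.5 and 1;
// k > 1 sharpens the transition around 0.5, k = 1 leaves it unchanged.
float sharpen(float a, float k)
{
    float p = std::pow(a, k);
    return p / (p + std::pow(1.0f - a, k));
}
```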
How do I read the color of a pixel from a texture (cocos2d-js)? How do I read the color of a pixel at (x, y) from a texture in cocos2d-js?
What would theoretically be the perfect texture size? I am more of a newbie at game development, but I deal with optimization of games very often at work, and because I am interested in it. As far as I know, a larger texture size means fewer draw calls if you make several objects share the same texture, but it takes up more VRAM. You also have to consider how sharp you want your textures to be. The more I thought about the topic, the more it seemed there should be a theoretically perfect texture size; correct me if I am wrong. Thought experiment: suppose you always want the same sharpness in your game, and you have 4 objects that take the exact same amount of UV space but have different textures. Then there would be a sweet spot between texture size and number of objects sharing the same texture, if your goal is best performance. In this case there would be 4 possibilities: 1 texture that all objects share; 2 textures, each shared by two of the objects; 3 textures, which would be a mix of the above; or 4 textures, where every object has its own. If performance is your main goal, then there would be only one variable to change, depending on how sharp you want your game to look, wouldn't there? Hope you understood my thoughts.
Distortion of color space in UDK (material editor). Question: using UDK's material editor, I wish to use a texture as a two-dimensional array of values for mathematical computations (basically, the color components of each pixel in the texture alter the behavior of some transformations applied to another texture). I want to retain rather precise color values for my mathematical operations to keep their meaning, but in my experiments I noticed a heavy distortion of the color space: dark colors are darker than they should be, and light colors are lighter than they should be. Additional information: this effect is best shown by the following material, which takes a simple linear gradient texture and applies a rounding operation on the colors. If the colors were correctly preserved, the preview on the left should have 10 bands of equal length (a black band, a 10% gray band, a 20% gray band, etc., with each band being 10% of the width of the image); however, the black band is huge, the next (10% gray) one rather large, and the bands get smaller and smaller as they get lighter. In this example I take the red channel of the image, but taking the other channels gives similar results: the red and blue channels are identical, the distortion is slightly different with the green channel, and taking a combination of these logically produces small bands of purple and green between any two bands. I guess this is a compression issue with the original texture, but I don't know which parameters I should change so the colors are preserved when I use the texture in the material editor. Above is the texture I used, created with The Gimp. The colors are correct; it is a linear gradient (the color of the pixels in column 0 is (0,0,0), the pixels in column 255 are (255,255,255), and I manually checked at many points in the image that the pixels in column x are (x,x,x)). Above are the properties for this texture. Nothing special here, just the defaults. Specifically, the Adjust Brightness Curve and Adjust RGB Curve properties are set to 1.0 (by default), which seems correct since they are the power to which the HSV and RGB curves are raised. The distortion I observed looks a lot like the values were raised to a given power.
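A worked check that matches the band sizes described, assuming the engine decodes the texture as sRGB (approximately a 2.2 power) before the material math runs:

$$\text{linear} = \text{stored}^{2.2}, \qquad \text{band rounding to } 0 \iff \text{stored} < 0.05^{1/2.2} \approx 0.256$$

i.e. roughly the first quarter of the gradient collapses into the black band, which is what the preview shows. If that assumption holds, disabling the texture's sRGB flag (or authoring the gradient in linear space) would remove the distortion.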
Need to know the origin and coordinates for a 2D texture and 2D/3D vertices in WebGL. Long story short, I know my coordinates are off and I believe my indices might be off. I'm trying to render a simple 2D rectangle with a texture in WebGL. Here's the code I have for the VBO/IBO:

```javascript
rectVertices.vertices = new Float32Array([
    -0.5, -0.5, // Vertex 1, bottom left
     0.0,  0.0, // UV 1
    -0.5,  0.5, // Vertex 2, top left
     0.0,  1.0, // UV 2
     0.5,  0.5, // Vertex 3, top right
     1.0,  1.0, // UV 3
     0.5, -0.5, // Vertex 4, bottom right
     1.0,  0.0  // UV 4
]);
rectVertices.indices = new Int16Array([0, 1, 2, 2, 0, 3]);
```

I'm assuming the vertices go like this:

```
(-0.5,  0.5)  (0.5,  0.5)
(-0.5, -0.5)  (0.5, -0.5)
```

with the origin in the middle, and the texture coordinates go like this:

```
(0.0, 1.0)  (1.0, 1.0)
(0.0, 0.0)  (1.0, 0.0)
```

So, as you can see, I'm all messed up. I'm also using gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true). Here's the output of the program; the texture I'm using is this. So, I guess I need to know the origins, but the triangle strip looks way off. I am doing this as well:

```javascript
// create VBO and IBO
vbo = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bufferData(gl.ARRAY_BUFFER, triangleVertices.vertices, gl.STATIC_DRAW);
ibo = gl.createBuffer();
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, triangleVertices.indices, gl.STATIC_DRAW);
```

and

```javascript
gl.bindBuffer(gl.ARRAY_BUFFER, vbo);
gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, ibo);
gl.vertexAttribPointer(vertexAttribute,  2, gl.FLOAT, false, FLOAT * 2, FLOAT * 0); // position
gl.vertexAttribPointer(textureAttribute, 2, gl.FLOAT, false, FLOAT * 2, FLOAT * 2); // texture
gl.enableVertexAttribArray(vertexAttribute);
gl.enableVertexAttribArray(textureAttribute);
gl.drawElements(gl.TRIANGLE_STRIP, 6, gl.UNSIGNED_SHORT, 0);
```

It almost seems as though the vertices and the UV coordinates are getting mixed up.
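For reference, one concrete bug in the setup above, sketched: with interleaved position + UV data there are 4 floats (16 bytes) per vertex, so both attribute pointers need a stride of FLOAT * 4, not FLOAT * 2. Also, indices 0,1,2 / 2,0,3 describe two independent triangles, which is gl.TRIANGLES rather than gl.TRIANGLE_STRIP:

```javascript
var FLOAT = 4; // bytes per float
gl.vertexAttribPointer(vertexAttribute,  2, gl.FLOAT, false, FLOAT * 4, 0);         // position
gl.vertexAttribPointer(textureAttribute, 2, gl.FLOAT, false, FLOAT * 4, FLOAT * 2); // UV
gl.drawElements(gl.TRIANGLES, 6, gl.UNSIGNED_SHORT, 0);
```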
How are minimaps rendered for open worlds? When I've created minimaps for small-to-medium levels before, I would use a large texture mapped to a plane for the minimap and world map. The minimap/world map camera would then view this texture from a top-down view. For larger, open-world style environments, though, I'm facing a performance vs. fidelity issue. For example, a large environment requires a large texture to fit everything in, but at the same time the texture also needs a high enough resolution for when the player decides to zoom in on a specific point in the map. I can make the map high-resolution with enough detail up close, but this creates quite a large texture. Or I can make a lower-res map which is more efficient but becomes blurry/pixelated when zooming in. Is this just a balance that I have to play with, or is there a better way to handle this? For example, creating a simple 3D mesh whose shape would act as the map, or using a vector graphic instead of a texture. How is this usually done?
How to texture voxel terrain without triplanar texturing? How can a voxel terrain (marching cubes) be textured without triplanar mapping? The goal is to have more artistic freedom. I think I could unwrap the mesh while extracting the isosurface, then use projective painting. But I do not know how to handle terrain modifications without breaking the texture. I also guess that virtual texturing could help here. Links on these matters would be appreciated.
Gradient banding in Unity (Android). In an Android game, I have a gradient background that changes color over time. This is achieved by creating a 1x2-pixel texture in code and stretching it over a large quad with bilinear filtering. This all works fine in the editor, but as soon as it is viewed on an Android device, the banding is ridiculously noticeable and annoying. Since this texture is generated in code, I cannot set the compression to "truecolor" like most other posts suggest, nor can I tick the "Generate Mipmap" option. Here is how the gradient looks on the phone. Code to generate the gradient (stripped of unimportant stuff):

```csharp
Texture2D gradient;
MeshRenderer render;
Color HSV1, HSV2;
Color primaryColor, secondaryColor;
float HSVdifferece;

void Start()
{
    render = GetComponent<MeshRenderer>();
}

void Update()
{
    HSV1.r += 0.01F * Time.deltaTime;
    HSV2.r = HSV1.r;
    if (HSV1.r > 1) HSV1.r = HSV1.r - 1;
    if (HSV2.r > 1) HSV2.r = HSV2.r - 1;

    primaryColor = Color.HSVToRGB(HSV1.r, HSV1.g, HSV1.b - HSVdifferece);
    secondaryColor = Color.HSVToRGB(HSV2.r, HSV1.g - HSVdifferece, HSV1.b - HSVdifferece);

    gradient.SetPixel(0, 0, primaryColor);
    gradient.SetPixel(0, 1, secondaryColor);
    gradient.filterMode = FilterMode.Bilinear;
    gradient.Apply();
    render.material.SetTexture("_MainTex", gradient);
}
```

The color banding is really obvious and annoying. How can I fix this?
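Two things that often matter here, sketched under the assumption of a Unity 5-era API: a scripted Texture2D is uncompressed anyway, so the banding more likely comes from Android's 16-bit default display buffer (Player Settings has a "Use 32-bit Display Buffer" toggle), and the texture itself can be created with an explicit 8-bit-per-channel format:

```csharp
// Explicit 32-bit color; no mip chain needed for a 1x2 gradient.
gradient = new Texture2D(1, 2, TextureFormat.RGBA32, false);
gradient.wrapMode = TextureWrapMode.Clamp; // keep the two ends from bleeding into each other
```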
How to create a Texture2D from an array of pixels in DirectX 11. I currently have a ray-casting camera that holds an array of pixel colors (XMFLOAT4 color) that is 1024x768. I am trying to create a Texture2D in DirectX 11 that is also 1024x768 and simply copy over the pixels to the texture. I then want to apply the texture to a fullscreen quad to essentially display what geometry my rays are intersecting with.

```cpp
D3D11_TEXTURE2D_DESC textureDesc = {0};
textureDesc.Width = 1024;
textureDesc.Height = 768;
textureDesc.MipLevels = 1;
textureDesc.ArraySize = 1;
textureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
textureDesc.SampleDesc.Count = 1;
textureDesc.SampleDesc.Quality = 0;
textureDesc.Usage = D3D11_USAGE_DEFAULT;

ID3D11Texture2D* fullScreenTexture = nullptr;
ThrowIfFailed(mGame->Direct3DDevice()->CreateTexture2D(&textureDesc, nullptr, &fullScreenTexture),
              "ID3D11Device::CreateTexture2D() failed.");
```

Assuming I have `XMFLOAT4 pixels[1024 * 768]` filled in appropriately with colors, I'm not sure how I would take my pixel data and apply it to the texture. Any help would be greatly appreciated.
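For context, a sketch of the initial-data route: CreateTexture2D accepts the pixels directly through D3D11_SUBRESOURCE_DATA, as long as the description also binds the texture for shader use (this reuses the textureDesc above and adds BindFlags):

```cpp
textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE; // so it can be sampled on the quad

D3D11_SUBRESOURCE_DATA init = {};
init.pSysMem = pixels;                      // the XMFLOAT4[1024 * 768] array
init.SysMemPitch = 1024 * sizeof(XMFLOAT4); // bytes per row of source data

ThrowIfFailed(mGame->Direct3DDevice()->CreateTexture2D(&textureDesc, &init, &fullScreenTexture),
              "ID3D11Device::CreateTexture2D() failed.");
```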
Pre-rendering light on a texture based on a bump map. Using a grayscale bump map and N sources of colored light, what is the algorithm to render the light on the textured surface, assuming I have the angle(s) and distance of each light source? (I am pre-rendering in software.) I have the part where it renders light based on the position and angle of a light source. My question is twofold: (1) How do I decide on an RGB color based on multiple different light sources? Is it simply additive by nature? (2) What do I do about cases where the angle is not a sufficient way to determine intensity? For instance, when there is a deep cavity in the surface and a higher, elevated area is blocking the light to that lower, shaded area? How do you handle something like that?
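A sketch of the usual answer to part (1): in linear color space, light contributions are indeed additive; each light adds a Lambert diffuse term scaled by its color and distance falloff, and the sum is clamped (or tone-mapped) at the end:

$$C = \operatorname{clamp}\!\left(\rho \sum_{i=1}^{N} L_i \, \max\!\left(0,\ \hat{n}\cdot\hat{l}_i\right) \frac{1}{d_i^{2}}\right)$$

where $\rho$ is the surface albedo, $L_i$ the color of light $i$, $\hat{n}$ the normal derived from the bump map, $\hat{l}_i$ the direction to light $i$, and $d_i$ its distance. Part (2) is a visibility problem the normal alone cannot express: since the height field is available when pre-rendering, one option is to ray-march from each texel toward each light and zero that light's term wherever the ray dips below the surface.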
Texture size of characters and geosets. I am a 3D artist, not a game developer, but I am interested in dabbling nonetheless. Would anybody know the texture sizes currently used for games like WoW or Guild Wars 2? Don't worry, I'm not trying to make a massive game that takes teams of hundreds; I'm simply interested in the efficiency of MMO-type games. From what I can find on the net, it's roughly 512 x 512 textures for the base character, although it's surprisingly hard to find the information and I don't have the games installed anymore to check. And if anybody can explain geosets to a noob like me, that would be great. My general understanding is that the model is separated into pieces such as head, hands, legs, etc., and then the model has meshes and textures swapped somehow to show, say, a new pair of gloves. Many thanks.
What is it called when you combine textures for an object? I am trying to gain a better understanding of modeling and texturing for Unreal 4 and Blender. My question is: what is the process/result called when packing textures into one image, as shown below?
Which game engine can be used for pixel manipulation? Here is the gist of where I am going with this. I work for a printer company, and basically all of our input files (or output files for virtual printers) are bitmap files. We currently have a tool, built in Java over 10 years ago, that works but chokes on our high-end 1600x1600 files. So I was thinking, why not utilize a game engine's built-in ability to harness the power of a GPU? I am thinking all I would have to do is create a 1600x1600 plane, feed it the image file, and draw the scene. I figure this is the easy part. However, the hard part is, for example, reaching into the image and clicking the mouse on a particular pixel: getting RGB data, converting to CMYK, making changes, converting back to RGB, and then redrawing the scene. Not only that, but also pulling the modified texture out of memory and saving it back into a bitmap. Is this even possible? I haven't used a game engine since college, and that was OpenGL 2.0. Just figured this looked like a good place to ask. Thanks.
Grainy texture from a distance. I am using SharpDX, a C# wrapper over DirectX 11, to render terrain. While I am able to render the terrain correctly, I noticed that moving around creates a lot of visual noise and makes the texture of the ground very grainy. I have attached a picture of how the texture looks on the terrain (make sure to view it full-screen for clarity), along with the dirt texture I am using and its normal map. I thought I set up the mipmaps appropriately in this code (note that fileName is just the path to the terrain bitmap shown above):

```csharp
public bool Initialize(SharpDX.Direct3D11.Device device, string fileName)
{
    try
    {
        using (var texture = LoadFromFile(device, new SharpDX.WIC.ImagingFactory(), fileName))
        {
            ShaderResourceViewDescription srvDesc = new ShaderResourceViewDescription()
            {
                Format = texture.Description.Format,
                Dimension = SharpDX.Direct3D.ShaderResourceViewDimension.Texture2D,
            };
            srvDesc.Texture2D.MostDetailedMip = 0;
            srvDesc.Texture2D.MipLevels = 1;
            TextureResource = new ShaderResourceView(device, texture, srvDesc);
            device.ImmediateContext.GenerateMips(TextureResource);
            // TextureResource = ShaderResourceView.FromFile(device, fileName);
        }
        return true;
    }
    catch
    {
        return false;
    }
}

public Texture2D LoadFromFile(SharpDX.Direct3D11.Device device, ImagingFactory factory, string fileName)
{
    using (var bs = LoadBitmap(factory, fileName))
        return CreateTexture2DFromBitmapSource(device, bs);
}

public BitmapSource LoadBitmap(ImagingFactory factory, string filename)
{
    var bitmapDecoder = new SharpDX.WIC.BitmapDecoder(
        factory,
        filename,
        SharpDX.WIC.DecodeOptions.CacheOnDemand);
    var result = new SharpDX.WIC.FormatConverter(factory);
    result.Initialize(
        bitmapDecoder.GetFrame(0),
        SharpDX.WIC.PixelFormat.Format32bppPRGBA,
        SharpDX.WIC.BitmapDitherType.None,
        null,
        0.0,
        SharpDX.WIC.BitmapPaletteType.Custom);
    return result;
}

public Texture2D CreateTexture2DFromBitmapSource(SharpDX.Direct3D11.Device device, BitmapSource bitmapSource)
{
    // Allocate a DataStream to receive the WIC image pixels
    int stride = bitmapSource.Size.Width * 4;
    using (var buffer = new SharpDX.DataStream(bitmapSource.Size.Height * stride, true, true))
    {
        // Copy the content of the WIC image to the buffer
        bitmapSource.CopyPixels(stride, buffer);
        return new SharpDX.Direct3D11.Texture2D(device, new SharpDX.Direct3D11.Texture2DDescription()
        {
            Width = bitmapSource.Size.Width,
            Height = bitmapSource.Size.Height,
            ArraySize = 1,
            BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | BindFlags.RenderTarget,
            Usage = SharpDX.Direct3D11.ResourceUsage.Default,
            CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
            Format = SharpDX.DXGI.Format.R8G8B8A8_UNorm,
            MipLevels = 1,
            OptionFlags = ResourceOptionFlags.GenerateMipMaps, // GenerateMipMap
            SampleDescription = new SharpDX.DXGI.SampleDescription(1, 0),
        }, new SharpDX.DataRectangle(buffer.DataPointer, stride));
    }
}
```

Here is my sampler state code:

```csharp
SamplerStateDescription samplerDesc = new SamplerStateDescription()
{
    Filter = Filter.MinMagMipLinear,
    AddressU = TextureAddressMode.Clamp,
    AddressV = TextureAddressMode.Clamp,
    AddressW = TextureAddressMode.Clamp,
    MipLodBias = 0,
    MaximumAnisotropy = 1,
    ComparisonFunction = Comparison.Always,
    BorderColor = new Color4(0, 0, 0, 0), // Black border.
    MinimumLod = 0,
    MaximumLod = float.MaxValue
};
SamplerState = new SamplerState(device, samplerDesc);
```

Is there anything wrong here?
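One inconsistency worth flagging as a sketch: both the Texture2DDescription and the SRV pin MipLevels to 1, so GenerateMips has no chain to fill and the sampler only ever reads the top level, which aliases exactly like this when minified. Under the same SharpDX API as above, the usual combination is:

```csharp
// In the Texture2DDescription: 0 = let D3D allocate the full mip chain
// (keep BindFlags ShaderResource | RenderTarget and OptionFlags GenerateMipMaps).
MipLevels = 0,

// In the SRV description: -1 = expose every mip level, then fill them.
srvDesc.Texture2D.MipLevels = -1;
device.ImmediateContext.GenerateMips(TextureResource);

// Optional but very effective on ground planes seen at grazing angles:
// Filter = Filter.Anisotropic, MaximumAnisotropy = 8 in the sampler description.
```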
Creating a large ground texture for mobile. Does anyone have any insight on creating and texturing a large mesh that will be used for a mobile game's level map? Specifically, I was wondering how games like Vainglory achieve this. I am familiar with the idea of splat maps; however, this is where my knowledge of the subject ends. Is the use of splats on mobile a good solution? Should the mesh be broken down into small sections which are UV'd and textured separately? Neither of these things?
Accessing and changing texture data in SlimDX. How can I access and change texture data in SlimDX? I have a Texture2D and a Texture3D, and I need to be able to go in and change either a single pixel or a group of pixels.
How to calculate the projection of 360° equirectangular slides? My goal is to have a light source inside a transparent sphere that has an equirectangular texture acting as a 360° slide. I want to get the projection onto arbitrary objects as a texture image on those objects, in high resolution. This will happen on a web server. So my input should be the equirectangular image and the object (and optionally the position of the light source), and my output should be texture files for each flat surface of the arbitrary object. What algorithms can I use to make this as simple and fast as possible?
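A sketch of the core lookup, for context: for every texel of a receiving surface, take the direction from the light position through the surface point and convert it to equirectangular UVs (GLSL-style, names hypothetical):

```glsl
const float PI = 3.14159265;

vec3 d = normalize(worldPos - lightPos);           // light -> surface direction
float u = 0.5 + atan(d.z, d.x) / (2.0 * PI);       // longitude
float v = 0.5 - asin(d.y) / PI;                    // latitude
vec4 slide = texture2D(equirectSlide, vec2(u, v)); // projected color at this texel
```

Baking this into per-face texture files is then a matter of rasterizing each flat surface in its own UV space and running the lookup per texel.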
34 | What are the texture coordinates for a tetrahedron EDIT This is the WebGL code for initializing the tetrahedron points. You may want to skip to the second code block, because you may be able to answer without this.

    // Making a tetrahedron with equal sides
    // Using rotation matrices to determine the points

    // 120 degrees
    var q = Math.PI / 2.0 * 4.0 / 3.0;

    // Transformation matrix for X axis rotation
    var rotationArrayX = [
        1.0, 0.0, 0.0, 0.0,
        0.0, Math.cos(q), Math.sin(q), 0.0,
        0.0, -Math.sin(q), Math.cos(q), 0.0,
        0.0, 0.0, 0.0, 1.0
    ];
    var rotationMatrixX = mat4.create(rotationArrayX);

    var d = vec3.create(new Array(0.0, 1.0, 0.0)); // Topmost point
    var a = vec3.create();
    mat4.multiplyVec3(rotationMatrixX, d, a);

    // Now we have the top most point and the first point of the base.
    // After rotating the vector A with 120 degrees two times, we have the 3 base points.

    // 120 degrees
    // Transformation matrix for Y axis rotation
    var rotationArrayY = [
        Math.cos(q), 0.0, -Math.sin(q), 0.0,
        0.0, 1.0, 0.0, 0.0,
        Math.sin(q), 0.0, Math.cos(q), 0.0,
        0.0, 0.0, 0.0, 1.0
    ];
    var rotationMatrixY = mat4.create(rotationArrayY);

    // Calculating points B and C
    var b = vec3.create();
    mat4.multiplyVec3(rotationMatrixY, a, b);
    var c = vec3.create();
    mat4.multiplyVec3(rotationMatrixY, b, c);

    // The remaining point is the top point
    var vertices = new Array();
    // bottom
    vertices.push(a); vertices.push(b); vertices.push(c);
    // front
    vertices.push(b); vertices.push(c); vertices.push(d);
    // right
    vertices.push(c); vertices.push(a); vertices.push(d);
    // left
    vertices.push(a); vertices.push(b); vertices.push(d);

How should I imagine texturing these triangles? Is this a valid set of texture coordinates?

    var textureCoords = [
        // bottom
        0.5, 1.0, 0.0, 0.0, 1.0, 0.0,
        // front
        0.5, 1.0, 0.0, 0.0, 1.0, 0.0,
        // right
        0.5, 1.0, 0.0, 0.0, 1.0, 0.0,
        // left
        0.5, 1.0, 0.0, 0.0, 1.0, 0.0,
    ];

I based this on http://www.codeguru.com/forum/showpost.php?p=1542703&postcount=2

    (0.5, 1)
       Texture Image
    (0, 0)      (1, 0)

Thanks in advance!
34 | DirectX9 texture flashing or disappearing I am learning DirectX 9, and with these test models in my scene this weird texture flashing happens. If you look at the bottom of the picture you will see a blue area that is supposed to be filled with the ground texture I have. Every time I move my camera, the ground texture flashes very quickly and parts of it disappear many times a second. I know it's not a problem with my video card. What could cause this?
34 | GLSL pack floats into an RGBA texture I want to compose conventional triangle-based models and particles with a ray-traced scene at a reasonable frame rate. WebGL does not let you write gl_FragDepth in the fragment shader. You cannot have multiple render targets, but you can render to an RGBA texture and then use that texture as input to another draw op. So I can render my ray-traced scene and store the depth as input to the next stage. This is obviously far from ideal, but workable. I'd love to be wrong about this, and this is gathered from trial and error rather than any definitive source, so please correct any flawed assumptions. How can you pack/unpack a float and, ideally, some flag bits and such, into an RGBA texture efficiently in a GLSL fragment shader?
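To make the arithmetic concrete, here is the fixed-point packing scheme I have been experimenting with, written out in C# so it can be run and checked (porting to GLSL is mechanical: fract() instead of % 1f, a vec4 instead of a byte array). This is my own sketch, not an authoritative recipe:

    using System;

    static class DepthPacking
    {
        // Pack a float in [0, 1) into four bytes, 8 bits of extra precision
        // per channel; the same math works per-channel on a GLSL vec4.
        static byte[] Pack(float value)
        {
            float r = value;
            float g = (r * 255f) % 1f;   // fract(value * 255)
            float b = (g * 255f) % 1f;   // fract(value * 65025)
            float a = (b * 255f) % 1f;   // fract(value * 16581375)
            // Remove the part that the next channel already stores.
            r -= g / 255f;
            g -= b / 255f;
            b -= a / 255f;
            return new[]
            {
                (byte)Math.Round(r * 255f), (byte)Math.Round(g * 255f),
                (byte)Math.Round(b * 255f), (byte)Math.Round(a * 255f)
            };
        }

        // Weighted sum undoes the packing, to roughly 1/255^4 precision.
        static float Unpack(byte[] rgba)
        {
            return rgba[0] / 255f
                 + rgba[1] / (255f * 255f)
                 + rgba[2] / (255f * 255f * 255f)
                 + rgba[3] / (255f * 255f * 255f * 255f);
        }

        static void Main()
        {
            byte[] enc = Pack(0.37f);
            Console.WriteLine(Unpack(enc)); // prints ~0.37
        }
    }

Flag bits would presumably have to steal bits from the last channel at the cost of precision; I have not worked that part out yet.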
34 | SDL2 Textures bleeding 1px border around tile maps SDL_RenderCopyEx taking integer arguments https://forums.libsdl.org/viewtopic.php?t=9486 This post gives a good indication of my question. Basically, if you set SDL2's logicalScale (or otherwise) and render textures at native window resolution, they appear fine. However, with tile maps, if you resize the window in any way you get a bleed, where an integer rounding issue creates a 1px border around certain tiles. Is my only option to create a 1px border around all my images to stop this bleed/rounding error? Or a semi-transparent border with the main color? What are my options? Is this solved in any of the latest SDL 2.X.Y releases? EDIT A simpler method I have used is reducing my images from 64x64px to 62x62px in SDL2 (not the actual sprite), using each image's own sprite as a 1px border, and using render scaling to scale up that 1px, which stops the bleed. It reduces the quality on background images ever so slightly, but it requires no tweaking of any code or sprites... but again, I'm wondering if there's a more elegant solution.
34 | What does the max texture unit mean? My graphics card is an Intel UHD 620, and it says that it has 24 texture units (as far as I know). Using SDL2, creating and loading 25 textures works fine, so am I misunderstanding what the max texture unit count actually means? If I were to launch a program that uses more than 24 texture units, would it crash if I tried to draw them all at the same time, or would it just be really slow, having to cycle between them in the shader program?
34 | How Would You Animate a 3d Character's Face (With Textures) So I'd like to do a low-poly style of modeling, and one of the corners I'd like to cut is on head geometry. Specifically, making the details of the face a texture (like a Mii or a character in Animal Crossing, for example). How would you go about animating the face? And getting it onto the model in the first place? I'm using Unreal Engine to learn and working on my 3d modeling skills at the same time, so knowing a general workflow would give me some structure to work with.
34 | Substance Painter Jagged edges when painting normals I am painting normals on a normal layer using the Textures tab to make bumps and crevices, but the effect is very jagged. How can I fix this? N.B. the Texture Set is already at 4K, so it's not a resolution issue.
34 | How do I convert a horizontal, panoramic skymap texture into a "dome" version of itself? How do I convert the following skymap image into a "dome" version like the one below? I'm looking to make use of Adobe Photoshop for the task at hand. However, I'm at a loss as to which tools to use and how to go about it. So, how do I convert a horizontal, panoramic skymap texture into a "dome" version of itself?
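For reference, my current understanding of the underlying mapping (an assumption based on the standard rectangular-to-polar transform, which is what Photoshop's Filter > Distort > Polar Coordinates applies to a square image) is:

    % Each pixel (x, y) of the square dome image, with centre (c_x, c_y) and
    % radius R, samples the W x H panorama at (u, v):
    \theta = \operatorname{atan2}(y - c_y,\; x - c_x), \qquad
    r = \frac{\sqrt{(x - c_x)^2 + (y - c_y)^2}}{R}, \qquad
    u = \frac{\theta + \pi}{2\pi}\, W, \qquad
    v = r\, H

That is, the top row of the panorama collapses to the dome's centre (the zenith) and the bottom row becomes the dome's rim (the horizon), with longitude swept around the circle.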
34 | 3ds max "thread stitching" plugin for cloth or furniture I'm trying to find a plugin or way to add 3D thread stitching to a mesh. It would be perfect if you could create thread stitches along edges in "edit poly" mode. I know that you can somehow do this in ZBrush with customized alpha brushes, but in my opinion it's a bit complicated. I could also imagine aligning an editable spline along a mesh edge. It's important to me that this "unknown tool" would also add the displacement of the canvas and the stitches, so I can use "render to texture" to give seams a realistic look. This way you could always create perfect stitches with realistic-looking displacement, like in the pictures below, without too much effort. http://www.esunroof.com/contrast_stitch.jpg http://t1.gstatic.com/images?q=tbn:ANd9GcQplgCvhur_ynthz15Qdaqpv1QRwkG4K6sjhVjDqzzxm6OaY9Bt It would be awesome if you know a way to do this. Thanks a lot! :)
34 | Why are textures always square powers of two? What if they aren't? Why are the resolutions of textures in games always a power of two (128x128, 256x256, 512x512, 1024x1024, etc.)? Wouldn't it be smart to save on the game's file size and make the texture exactly fit the UV-unwrapped model? What would happen if there were a texture that was not a power of two? Would it be incorrect to have a texture be something like 256x512 or 512x1024? Or would this cause the problems that non-power-of-two textures may cause?
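For context, here is a small C# helper of the sort I imagine asset pipelines use to round a dimension up to the nearest power of two before padding or rescaling (my own sketch, not taken from any particular engine):

    using System;

    static class Pow2
    {
        // Round n up to the next power of two, e.g. 300 -> 512, 512 -> 512.
        static int NextPowerOfTwo(int n)
        {
            int p = 1;
            while (p < n) p <<= 1;
            return p;
        }

        static void Main()
        {
            Console.WriteLine(NextPowerOfTwo(300)); // 512
            Console.WriteLine(NextPowerOfTwo(512)); // 512
        }
    }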
34 | How to create a texture2d from an array of pixels in DirectX11 I currently have a ray casting camera that holds an array of pixel colors (XMFLOAT4 color) that is 1024x768. I am trying to create a Texture2D in DirectX11 that is also 1024x768 and simply copy over the pixels to the texture. I then want to apply the texture to a fullscreen quad to essentially display what geometry my rays are intersecting with.

    D3D11_TEXTURE2D_DESC textureDesc = { 0 };
    textureDesc.Width = 1024;
    textureDesc.Height = 768;
    textureDesc.MipLevels = 1;
    textureDesc.ArraySize = 1;
    textureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
    textureDesc.SampleDesc.Count = 1;
    textureDesc.SampleDesc.Quality = 0;
    textureDesc.Usage = D3D11_USAGE_DEFAULT;

    ID3D11Texture2D* fullScreenTexture = nullptr;
    ThrowIfFailed(mGame->Direct3DDevice()->CreateTexture2D(&textureDesc, nullptr, &fullScreenTexture),
        "ID3D11Device::CreateTexture2D() failed.");

Assuming I have XMFLOAT4 pixels[1024 * 768] filled in appropriately with colors, I'm not sure where/how I would take my pixel data and apply it to the texture. Any help would be greatly appreciated.
34 | Difference between texture.Load and texture.Sample(PointSampler, ...) in HLSL DX Conceptually, I understand the difference between Load and Sample, but I want to know whether point sampling and Load essentially do the same thing when it comes to selecting a texture value (ignoring out-of-bounds situations).
34 | Streaming webcam via websocket performance issues (code included) I finally figured out how I can stream my webcam, but I'm not getting the best performance. In low resolutions it works fine, but as soon as I turn the resolution up, the performance gets worse. I think Unity can't handle the encoding at that quality and speed, so the game gets slowed down. Here is my code:

    Texture2D screenshot = new Texture2D(Screen.width, Screen.height, TextureFormat.ARGB32, false);
    screenshot.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    screenshot.Apply();
    ws.Send(System.Convert.ToBase64String(screenshot.EncodeToJPG(quality)));

Hopefully there is something to improve.
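One variant I have been sketching (untested, and based on my own assumption that the readback, JPEG encode and Base64 steps dominate the cost) is to blit into a half-size temporary RenderTexture before reading back, so everything downstream touches a quarter of the pixels. This has to live on a Camera for OnRenderImage to fire:

    using UnityEngine;

    public class FrameStreamer : MonoBehaviour
    {
        public int quality = 50;

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            // Downscale to half resolution before the expensive readback/encode.
            int w = src.width / 2, h = src.height / 2;
            RenderTexture small = RenderTexture.GetTemporary(w, h, 0);
            Graphics.Blit(src, small);

            RenderTexture.active = small;
            Texture2D tex = new Texture2D(w, h, TextureFormat.RGB24, false);
            tex.ReadPixels(new Rect(0, 0, w, h), 0, 0);
            tex.Apply();
            RenderTexture.active = null;
            RenderTexture.ReleaseTemporary(small);

            byte[] jpg = tex.EncodeToJPG(quality);
            Destroy(tex);
            // ws.Send(System.Convert.ToBase64String(jpg)); // 'ws' as in my code above

            Graphics.Blit(src, dest); // keep the normal image on screen
        }
    }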
34 | What is it called when you combine textures for an object I am trying to gain a better understanding of modeling and texturing for Unreal 4 and Blender. My question is: what is the process (and its result) called when packing textures into one image, as shown below?
34 | FPS drop after moving from Textures to TextureAtlas in LibGDX I started using LibGDX some time ago, and I was making a test project to get used to this library. I created some images, added them to the assets folder, and loaded each image as a Texture using the AssetManager. Everything was working fine and I had 60 FPS. I wanted to work in a more efficient way, so I packed all my images into an atlas using the TexturePacker tool. I loaded the atlas using the AssetManager again and started using TextureRegions instead of Textures. After this change I started to notice sudden drops in FPS, from 60 to 50 and even 30 once. I tried to change the pixel format to RGBA4444, and I made sure that the min and mag filters were both set to Nearest, but I still see those annoying frame drops. I'm not doing anything heavy in the game itself; it's currently some actors on a stage. I have some MoveActions and an Animation, but nothing special yet. Does anyone have a clue what could cause the FPS drop? Thanks
34 | Replicating a number of sprites without letting the app slow down and crash I'm working on a revision of a simple drag-and-drop puzzle game where the only goal is to stack as many objects as you can before they fall off the platform. Does creating a new sprite (with a texture as a constructor parameter) every time the player drags and drops another object make the game slower and deplete more memory until it crashes, or is it creating a new texture each time that slows the game down? Here's the start of the dry run; I set it to debug mode so that the platform completely blocks the pit for the replication test. When the object count reaches the hundreds to thousands, the game gets slower. When the game crashes, here's the result via Logcat:

    12-11 16:38:53.953 E AudioTrack(25701): AudioFlinger could not create track, status: -22
    12-11 16:38:53.953 E SoundPool(25701): Error creating AudioTrack
    12-11 16:38:53.993 D dalvikvm(25701): GC_EXPLICIT freed 947K, 14% free 14414K/16672K, paused 2ms+4ms, total 96ms
    12-11 16:39:16.363 E AudioTrack(25701): AudioFlinger could not create track, status: -22
    12-11 16:39:16.363 E SoundPool(25701): Error creating AudioTrack
    12-11 16:39:16.433 D dalvikvm(25701): GC_FOR_ALLOC freed 2120K, 15% free 14418K/16848K, paused 47ms, total 47ms

By the way, I'm using the BodyEditor library for Box2D (a physics engine) and its rendering.
34 | Implement spherical mapping for texture coordinates I am using a texture of a world map and I am trying to put that image on a sphere made up of many triangles. Each triangle has points a, b, c with their own (x, y, z) coordinates. I am trying to use the coordinate system conversion formulas from Wikipedia. This is my world-to-spherical-coordinates function:

    function worldToSpherical(p) {
        const r = Math.sqrt(Math.pow(p[0], 2) + Math.pow(p[1], 2) + Math.pow(p[2], 2));
        const u = Math.atan2(p[1], p[0]);
        const i = Math.atan2(Math.sqrt(Math.pow(p[0], 2) + Math.pow(p[1], 2)), p[2]);
        const s = r * Math.sin(i) * Math.cos(u);
        const t = r * Math.sin(i) * Math.sin(u);
        return [s, t];
    }

But this is what my output looks like: It seems to be wrapping around twice. Am I using the wrong formula, or using it wrong?
34 | Any reason not to combine an AO Map and Cavity Map into one texture? If an Ambient Occlusion Map is for mapping shadows based on closeness of neighboring geometry, and a Cavity Map is for mapping smaller shadows based on angle or distance, then is there any reason not to mix these into one unified shadow map? It seems desirable to do so for saving a game from having to load extra textures, but I ask in case I'm overlooking something. This article gives an overview of the differences between the two types of maps, but it never mentions combining them. This made me wonder if there was some reason to keep them as separate images. Thanks in advance for any insight. |
34 | How to texture a VBO? I'm not sure which way is best for my purposes, which are having a cube textured with an image. Also, I've been following a blend of tutorials, so I'm not sure which way is the most correct and what I'm doing wrong (currently the texture doesn't fit right). Do I use glDrawArrays or glDrawElements? For pointers, do I say gl.glVertexPointer(3, GL.GL_FLOAT, 0, vertices) and gl.glTexCoordPointer(2, GL.GL_FLOAT, 0, textureData), or gl.glVertexPointer(3, GL.GL_FLOAT, 0, 0) and gl.glTexCoordPointer(2, GL.GL_FLOAT, 0, 0)? When do I bind/unbind the buffers (gl.glBindBuffer(GL.GL_ARRAY_BUFFER, VBOVertices), gl.glBindBuffer(GL.GL_ARRAY_BUFFER, vboTextureCoordHandle))? Before I establish the pointers, or not in my drawing method at all? Do I need to specify the UV/texture coordinates in the shader? If I'm getting a weirdly stretched texture, does that mean my fragment shader and uniform setup are working correctly? Thank you! I can supply more code, but I think I just need more information on the basics.
34 | How do I get an instance of KX_PolygonMaterial in Blender? I've got a question concerning using Python in Blender's Game Engine. Before I start, I want to state that I'm trying to change the color of an object in Blender's game engine. To do this, I'm attempting to find a way to update the texture of the object (I basically want two or three states: red, (yellow), green). What I'm doing right now is:

    scene = GameLogic.getCurrentScene()
    pingMeter = scene.objects['Ping Meter']
    mesh = pingMeter.meshes
    materials = mesh[0].materials
    material = materials[0]

However, when I do print(material.__class__.__name__) it outputs KX_BlenderMaterial. Shouldn't I be getting KX_PolygonMaterial if I'm running the Blender Game Engine? Is there any way to change color or texture with KX_BlenderMaterial? I can't find anything in the documentation. Can I get an instance of KX_PolygonMaterial out of the code above? ...or should I just take a different approach altogether? Thanks! EDIT I'm using Blender 2.65, which uses Python 3, in case anyone is wondering.
34 | Creating a single texture for all materials in Substance Painter with an ID map; the ID map doesn't bake properly I am trying to create a single texture for multiple materials in Substance Painter according to this tutorial: https://www.youtube.com/watch?v=HMP2xpGHimY I have two models made in Blender: one with several materials (for generating the ID map), another with one material. Mesh for the ID map. Main mesh. Both are identical in geometry and scale. Generating ID map settings (it is a mobile game, so I use only an unlit texture and don't require other maps and channels except BaseColor). Painting the model with masks created with the ID map, and the actual problem. ID map. Model's materials in Blender (everything is great here). How can I handle this issue?
34 | Why is my texture displaying incorrect colours using DX11? I am trying to load my data from a binary file (PPM) and create a texture using this data. It is important that I learn to do it this way, as I am eventually going to pack all of my textures into a single binary file and then index them, so creating the texture from pure binary data is something I need to be able to do. It seems that the texture is drawing correctly, but the colours are incorrect. I saved my image as .ppm just for this test application. Here is the code to load my data:

    ppm ppm;
    ppm.read(std::string("textureppm.ppm"));

    // just to ensure the data is correct
    uint32_t val = ppm.pixels[0];
    unsigned char r = (val & 0xFF000000) >> 24;
    unsigned char g = (val & 0x00FF0000) >> 16;
    unsigned char b = (val & 0x0000FF00) >> 8;
    unsigned char a = (val & 0x000000FF);

    ID3D11ShaderResourceView* texSRV = nullptr;

    D3D11_SUBRESOURCE_DATA initData = { &ppm.pixels, ppm.width * sizeof(uint32_t), 0 };

    D3D11_TEXTURE2D_DESC desc;
    desc.Width = ppm.width;
    desc.Height = ppm.height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* tex;
    HRESULT hr = getDevice()->CreateTexture2D(&desc, &initData, &tex);
    if (SUCCEEDED(hr))
    {
        D3D11_SHADER_RESOURCE_VIEW_DESC SRVDesc;
        SRVDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
        SRVDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
        SRVDesc.Texture2D.MipLevels = 1;
        hr = getDevice()->CreateShaderResourceView(tex, &SRVDesc, &texSRV);
        if (FAILED(hr))
            throw 0;
        else
            setTexture(texSRV);
    }

I have packed each byte into a uint32_t, as it seems that is the format required by DXGI_FORMAT_R8G8B8A8_UNORM. Here is the packing:

    uint32_t ppm::CreateRGBA(unsigned char r, unsigned char g, unsigned char b, unsigned char a)
    {
        uint32_t value = 0;
        int r2 = (r & 0xff) << 24;
        int g2 = (g & 0xff) << 16;
        int b2 = (b & 0xff) << 8;
        int a2 = (a & 0xff);
        value = r2 | g2 | b2 | a2;
        return value;
    }

This code produces the following texture: When the original texture is: Does anyone know what I am doing wrong?
34 | Basic terrain shader without using external texture I have this (right now I have the height map in an x-by-x 2D array, and in a 1D vector too). What I am trying to achieve is something like this: without using any textures, only plain colors. So basically smooth transitions and some shadow (using shaders). My vertex shader looks like this:

    #version 330
    layout (location = 0) in vec3 Position;
    layout (location = 1) in vec3 Normal;
    layout (location = 2) in vec3 Color;

    out vec3 fragmentNormal;
    out vec4 ex_color, pos;
    out vec3 N;
    out vec3 v;

    void main()
    {
        pos = vec4(Position, 1);
        ex_color = vec4(Color, 1);
        fragmentNormal = Normal;
        v = vec3(gl_ModelViewMatrix * pos);
        N = normalize(gl_NormalMatrix * Normal);
        gl_Position = gl_ModelViewProjectionMatrix * vec4(Position, 1);
    }

I have normals for all the vertices. Color is set simply in the C++ code based on height. Here is the fragment shader:

    in vec3 N;
    in vec3 v;
    in vec4 ex_color;

    void main(void)
    {
        vec3 L = normalize(gl_LightSource[0].position.xyz - v);
        vec4 Idiff = gl_FrontLightProduct[0].diffuse * max(dot(N, L), 0.0);
        Idiff = clamp(Idiff, 0.0, 1.0);
        gl_FragColor = Idiff * ex_color;
    }

So I guess my problem is what formula I should use to mix the colors. I think I don't need to set the colors in the C++ code but in the shaders. Update: Here is the wireframe of the terrain. Update 2: Based on Babis' answer, the result is: So the gradient is not "projected" onto the surface as I would like. What could cause this? Maybe my question wasn't clear.
34 | How can I create random noise that is seamless across modular game assets? Does anyone know how I can create a texture for a modular tile, such that when the tiles are arranged, random weathering effects such as scratches will be continuous across seams? An example photo below shows the ideal result. The scratches are continuous between different tiles. Is creating weathering effects like this possible using any procedural software? |
34 | Interact with texture flat surface (Unity) I don't know exactly how to put this into words, so googling did nothing for me. Imagine a bowling game. On older bowling lanes there was a schematic of the pins above the lane, lighting up all the pins that are still standing, while the ones that already fell are no longer lit. How can I achieve this in Unity? Is there any way to do this without having to do something silly like creating hundreds of textures with all the possible combinations and swapping them out? I'm still very new to Unity, so I'd love to know the best approach to something like this. Thank you very much in advance.
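For what it's worth, the simplest approach I can think of (my own untested sketch, with hypothetical names) is one small indicator object per pin on the overhead display, toggled on and off, so no texture swapping is needed at all:

    using UnityEngine;

    // Sketch: each pin light on the display is its own GameObject (a quad
    // or sprite placed over the board texture). 'pinLights' is hypothetical.
    public class PinDisplay : MonoBehaviour
    {
        public GameObject[] pinLights = new GameObject[10];

        // Call after each throw with the current state of the ten pins.
        public void Refresh(bool[] standing)
        {
            for (int i = 0; i < pinLights.Length; i++)
                pinLights[i].SetActive(standing[i]); // lit only while the pin stands
        }
    }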
34 | Finding texture coordinates for plane I am creating a 3D ray tracer and want to add textured planes. My planes are stored using a position vector and a normal vector. I want to use a square texture and map it repetitively onto the plane. How do I figure out a transformation from points (3D vectors) on the plane to points (2D vectors) on a texture? Once I know how to do this transformation, I don't think making it repetitive will be hard, as we can just apply modulo arithmetic. I thought I could make a basis on the plane by calculating a vector B1 perpendicular to the normal, and then another one, B2, perpendicular to the normal and B1. But once I have these two vectors, I still don't know how to express the point (which is the argument of the function below) in this basis.

    Vector3 position, normal;
    Bitmap texture;

    public override Vector3 GetColor(Vector3 point)
    {
        int u = ?;
        int v = ?;
        return texture.GetPixel(u, v);
    }
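In case it helps frame the question, my current understanding (an assumption I'd like confirmed) is that once B1 and B2 are normalized, expressing the point in that basis is just two dot products, something like the following; 'tileSize', the world-space size of one texture repeat, is a hypothetical constant:

    // Sketch, assuming a Vector3 type with a Dot helper (e.g. System.Numerics).
    Vector3 d = point - position;          // from the plane's origin to the hit point
    float pu = Vector3.Dot(d, B1);         // coordinate along B1, in world units
    float pv = Vector3.Dot(d, B2);         // coordinate along B2
    // Fractional part gives the repeat; works for negative coordinates too.
    float fu = pu / tileSize - (float)Math.Floor(pu / tileSize);
    float fv = pv / tileSize - (float)Math.Floor(fv = pv / tileSize);
    int u = (int)(fu * (texture.Width - 1));
    int v = (int)(fv * (texture.Height - 1));
    return texture.GetPixel(u, v);

Sorry, typo in the sketch above: the fv line should of course mirror the fu line, i.e. float fv = pv / tileSize - (float)Math.Floor(pv / tileSize);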
35 | How do I fix a "missing mcp.cfg" error? I am trying to use the Mod Coder Pack (MCP) to create mods of my own. I have downloaded MCP version 9.4 from www.modcoderpack.com. The error I have encountered: when I try to run a file called decompile.sh in the Mac terminal, the terminal responds with ERROR:root:!! Missing mcp.cfg. I have checked my MCP folder and made sure that it contains the file mcp.cfg; the file is in my folder. I am currently running a MacBook Pro with macOS Mojave version 10.14.4.
35 | How do I add a custom mob to Minecraft? Basically I decided to make my own mob. So far I have:
- created my mob's entity class
- created my mob's model class
- drawn the model
- added the addMapping function call within the EntityList class

I'm stuck on what to do next. I've tried finding the code that deals with passive animal spawning in the world, but I can't seem to find it. Help greatly appreciated.
35 | How can I read a portion of one Minecraft world file and write it into another? I'm looking to read block data from one Minecraft world and write the data into certain places in another. I have a Minecraft world, let's say "TemplateWorld", and a 2D list of Point objects. I'm developing an application that should use the x and y values of these Points as x and z reference coordinates from which to read constant-sized areas of blocks from the TemplateWorld. It should then write these blocks into another Minecraft world at constant y coordinates, with x & z coordinates determined based on each Point's index in the 2D list. The issue is that, while I've found a decent amount of information online regarding Minecraft world formats, I haven't found what I really need: more of a breakdown by hex address of where everything is. For example, I could have the TemplateWorld actually be a .schematic file rather than a world; I just need to be able to read the bytes of the file, know that the actual block data always starts at a certain address (or after a certain instance of FF, etc.), and how it's stored. Once I know that, it's easy as pie to just read the bytes and store them.
35 | How to pass arguments with BungeeCord Bukkit plugin messaging I am trying to send a plugin message from Bukkit to BungeeCord, but cannot figure out how to send arguments. Here is the code from the Bukkit plugin, which sends the message:

    ByteArrayDataOutput out = ByteStreams.newDataOutput();
    out.writeUTF("BungeeCord");
    out.writeUTF("Argument");

    // If you don't care about the player
    Player player = Iterables.getFirst(Bukkit.getOnlinePlayers(), null);
    // Else, specify them
    Player plr = Bukkit.getPlayerExact("spacegeek224");
    plr.sendPluginMessage(p, "BungeeCord", out.toByteArray());

Here is the code in the main class of my BungeeCord plugin:

    @Override
    public void onEnable() {
        this.getProxy() /* ProxyServer.getInstance() */ .getPluginManager().registerListener(this, new ChannelListener());
        this.getProxy() /* ProxyServer.getInstance() */ .registerChannel("Return");
    }

And finally, here is the code for the ChannelListener:

    package net.spacegeek224.metro.util;

    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.DataInputStream;
    import java.io.DataOutputStream;
    import java.io.IOException;

    import net.md_5.bungee.api.ProxyServer;
    import net.md_5.bungee.api.chat.BaseComponent;
    import net.md_5.bungee.api.chat.ComponentBuilder;
    import net.md_5.bungee.api.config.ServerInfo;
    import net.md_5.bungee.api.event.PluginMessageEvent;
    import net.md_5.bungee.api.plugin.Listener;
    import net.md_5.bungee.event.EventHandler;

    public class ChannelListener implements Listener {

        @EventHandler
        public void onPluginMessage(PluginMessageEvent e) {
            if (e.getTag().equalsIgnoreCase("BungeeCord")) {
                DataInputStream in = new DataInputStream(new ByteArrayInputStream(e.getData()));
                try {
                    String channel = in.readUTF(); // channel we delivered
                    if (channel.equals("BungeeCord")) {
                        ProxyServer.getInstance().broadcast(new ComponentBuilder(e.getReceiver().toString()).create());
                    } else {
                        ProxyServer.getInstance().broadcast(new ComponentBuilder(e.getReceiver().toString() + " " + channel).create());
                    }
                } catch (IOException e1) {
                    e1.printStackTrace();
                }
            }
        }

        public void sendToBukkit(String channel, String message, ServerInfo server) {
            ByteArrayOutputStream stream = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(stream);
            try {
                out.writeUTF(channel);
                out.writeUTF(message);
            } catch (IOException e) {
                e.printStackTrace();
            }
            server.sendData("Return", stream.toByteArray());
        }
    }

I have tried many things, including Google, and looking at what other methods are available, but have come up with nothing.
35 | Is there a way to drain durability from an item instead of a sword, for instance? I am modding Minecraft 1.8 using Eclipse with Forge, and I just wondered: is there a way to drain durability from a battery, for instance, instead of from a piece of armor, when the player is hit? I have tried things like getting an item from the player's inventory and attempting to reduce its durability, so that would be a possibility; however, I was unable to find a way to do so.
35 | How can I mod Minecraft 1.7.9? I've looked up a lot of tutorials on YouTube, and all of them only work for versions of Minecraft prior to 1.7.9. I first got a Minecraft Coder Pack (MCP) off of this website, but then realized that it only decompiles Minecraft 1.6.4. Then I found a more recent MCP (that's not on the website for some reason), version 9.03, downloaded here. This decompiles Minecraft version 1.7.2 (when I followed this video's instructions and ran the decompile.bat file, it said "Json file not found in C:\Users\mike\AppData\Roaming\.minecraft\versions\1.7.2\1.7.2.json"). Basically, I can't decompile Minecraft 1.7.9, but I can decompile older versions. However, I don't have any older versions downloaded onto my computer; I have only 1.7.9. Then I tried using Forge, but realized that most videos were using versions of Minecraft prior to 1.6.4, meaning they use the bin folder that does not exist anymore. Even after trying to figure that out as well, the decompiling would never work. I tried to do what this video did, but couldn't replicate it. Then I finally looked at this video about using Forge, and I could replicate it, but it didn't decompile Minecraft; it just set up a workspace in Eclipse that I'm not sure how to use. TL;DR I can decompile Minecraft 1.6.4 and 1.7.2, but I can't decompile version 1.7.9. Should I download an older version of Minecraft, wait for an MCP for 1.7.9, or something else? Is there something I'm missing, whereby I actually can decompile and mod Minecraft 1.7.9?
35 | How do I use the Minecraft Coder Pack on Linux I downloaded the Minecraft Coder Pack to mod and decompile the game, but how do I use it on Linux? Some sources seem to give specific directions, but they are not clear, and they involve executing the .bat files; my Ubuntu does not recognize .bat files. How do I extract a class file after renaming a Minecraft version jar to .zip and unzipping it? And how do I decompile one of those class files using the Minecraft Coder Pack? I have been wondering what at least some of the Minecraft source code looks like.
35 | Source code of Minecraft servers? Is there any way to get the source code of Minecraft servers? I tried decompiling but I get very obfuscated arguments, classes, and methods. If the answer is no, how did services like Bukkit and Spigot create their 'servers'? |
35 | How much more do I have to learn? So I started learning Java some time ago, and now I am understanding packages. My main goal is Minecraft modding. Can someone with modding experience please tell me how hard (or easy) Minecraft modding is? And how much time will it take for me to get to a level where I can start modding?
35 | No MODS button in Windows Minecraft when selecting Forge profile I have installed a Minecraft launcher in Windows that supports Minecraft 1.14. I have installed Forge 1.14.4 (forge 28.0.13). I have created the mods folder and placed a mod jar in it, namely Xaeros Minimap 1.17.4 for Forge 1.14.4. I have selected the Forge profile in the launcher. When running the launcher, I have to remove the JVM option -XX:+CMSIncrementalMode in order to run it (also when running the regular profile). I have Java 11 installed. But after launching, there is no MODS button in the menu. What could be the problem?
35 | How to get an AbstractClientPlayer for all players? I'm working on a Minecraft Forge mod (1.8.8). I have a custom (ownable) entity and want to set its texture to the texture of its owner, because it's a mini version of the owner. I found out that I can get the texture of players with AbstractClientPlayer.getLocationSkin(), but I can't figure out how to access either EntityOtherPlayerMP or EntityPlayerSP, which extend AbstractClientPlayer, in my custom renderer. Is there a way to get all AbstractClientPlayers, regardless of SP or MP? I can access the GameProfile, and I have the EntityPlayer of the owner.
35 | How can I use IntelliJ to make Minecraft mods? I'm starting out creating Minecraft mods with my son. I've seen one YouTube tutorial which sets up the project with Eclipse. Since I don't like Eclipse much, how would I set up IntelliJ or Android Studio (if feasible) to develop Minecraft mods? My son specifically wants to create roller coaster mods.