(February 2016) Monthly WIP screenshot thread

Kicking off this month’s thread with some debris particle testing.
Before:


After:

Yee :smile:

Every time a module is obliterated, a debris object is spawned: a particle emitter that emits 7 particles, waits until they are all dead (this takes about a minute) and then deletes itself. No lag whatsoever. (For now :frowning:)
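The whole thing is basically this (a simplified sketch, not the exact code - debrisMat and rootNode are assumed to exist and the influencer setup is left out):

ParticleEmitter debris = new ParticleEmitter("Debris", ParticleMesh.Type.Triangle, 7);
debris.setMaterial(debrisMat);
debris.setParticlesPerSec(0);        // don't emit continuously, only on demand
debris.setLowLife(50f);
debris.setHighLife(60f);             // particles live up to about a minute
// Control that removes the emitter once the longest-lived particle must be dead.
debris.addControl(new AbstractControl() {
    private float age;
    @Override
    protected void controlUpdate(float tpf) {
        age += tpf;
        if (age > 60f) {
            spatial.removeFromParent();
        }
    }
    @Override
    protected void controlRender(RenderManager rm, ViewPort vp) {
    }
});
rootNode.attachChild(debris);
debris.emitAllParticles();           // fire all 7 particles at once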

Another note: all three current weapons have been completely redone, both effect-wise and in the way hits are calculated. Lasers and plasma guns are now hitscan, and the flak cannons spawn a jBullet rigid-body shell that listens for its collision and detonates, damaging everything within a few blocks’ radius. (video incoming)
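In rough code terms it works like this (just a sketch - muzzlePosition, aimDirection, shipsNode, shellNode, physicsSpace, applyDamage and detonate are placeholders, not the actual implementation):

// Hitscan: cast a ray from the muzzle along the aim direction and damage the first thing hit.
Ray ray = new Ray(muzzlePosition, aimDirection);
CollisionResults results = new CollisionResults();
shipsNode.collideWith(ray, results);
if (results.size() > 0) {
    Geometry hit = results.getClosestCollision().getGeometry();
    applyDamage(hit);
}

// Flak shell: a rigid body that detonates when jBullet reports a collision involving it.
shellNode.addControl(new RigidBodyControl(0.2f));
physicsSpace.add(shellNode);
physicsSpace.addCollisionListener(new PhysicsCollisionListener() {
    @Override
    public void collision(PhysicsCollisionEvent event) {
        if (event.getNodeA() == shellNode || event.getNodeB() == shellNode) {
            detonate(shellNode.getWorldTranslation());   // area damage + cleanup
        }
    }
});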

Edit: I just realised that it looks like each piece of the debris is its own particle, which is not the case. In fact it looks like this (2x2):

These were adapted from Rebel Galaxy’s assets just for testing out my code but aren’t going to be used for obvious reasons.

23 Likes

That plasma… wow! How is it done?

These ones?

Well, they are two alpha-blended (not additively blended) quads arranged at 90 degrees to each other in a + fashion, with textures that repeat in the y direction, adjusted via texture coords. It was slightly tricky to make, since the beam needs to stay the same but become shorter if it hits anything, so I have to modify the texture coords according to the beam’s length.

Finally, I wrote a very simple shader that moves the texture along the y axis as fast as it can while still looking good.
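On the Java side it roughly boils down to this (a sketch only - TILE_LENGTH, the quad layout and the ScrollOffset parameter are made up, not the real names):

// v coordinate proportional to the current beam length, so the texture repeats
// along the beam (Repeat wrap mode assumed on the beam texture).
float tiles = beamLength / TILE_LENGTH;   // TILE_LENGTH = world length of one texture repeat
float[] texCoords = {
    0, 0,
    1, 0,
    1, tiles,
    0, tiles
};
beamMesh.setBuffer(VertexBuffer.Type.TexCoord, 2, texCoords);
beamTexture.setWrap(Texture.WrapMode.Repeat);

// The scrolling itself can be driven by time, either via g_Time inside the shader
// or by updating a material parameter every frame:
beamMat.setFloat("ScrollOffset", timer.getTimeInSeconds() * scrollSpeed);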

The texture I made for it:

Oh and the gif is running at a slightly reduced speed so the beam moves even faster at 60fps.

Btw, this was my inspiration:

Everspace in UE4, reminding me how good my game won’t ever look… :stuck_out_tongue:

13 Likes

That’s pretty cool, though. Seems like if you could randomly perturb the sides of the quads you’d come even closer to the effect. Let those look more like bits of electricity that way.

Still a neat effect either way. Note: re: the texture coordinate, if you just make the texture coordinate the length then you could probably calculate the actual coordinate in the shader using % (modulo) or something.

P.S.: I had not heard of everspace before… so thanks for that, too. :slight_smile:

I got the feeling that February is going to become a very good month here… :chimpanzee_closedlaugh:

1 Like

That sounds like it might look good, but I only have the two quads atm and I’d need to redesign the thing quite a bit to get those extra subdivisions. :confused:

Well, I calculate that when setting the coord, but that would surely be more optimal. Is there any way of getting the quad dimensions in the frag shader? I figure that’s something I have to pass in myself, since the shader has no idea what the geometry looks like.
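Something like this is what I mean, I guess (parameter name invented; it would also need to be declared in the .j3md):

// Java side: push the beam/quad length into the shader whenever it changes.
beamMat.setFloat("QuadLength", beamLength);

// The fragment shader then sees it as:  uniform float m_QuadLength;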

Yeah, that’s true. But it would look really cool! :smile:

I may have misunderstood how things were set up, or projected forward possible solutions to the other issue. And I guess the % would have to be done in the frag shader, so you still might be better off just setting the right texture coordinate.

But it’s mainly because you have voxel spacecraft (which is a different design choice).
With voxel spacecraft you can do things that other games can’t.
You could plan a version 2.0 which uses “AAA segments” instead of voxels.
There is a project called “shipyard” on Blendswap.
Something like that.
It comes at the cost of not having maximum creative freedom (not Minecraft-like anymore).
:chimpanzee_smile:

If you meant the gfx effects (eye candy) - I don’t see anything that you could not achieve (most of the effects shown in the .gif are well-known things).
:chimpanzee_smile:

If you want to cut through the asteroids, you could use either CSG or something like the isosurface demo from pspeed (there is already someone experimenting with it - he tries to find out how to make a terrain editor with that - I think…). :chimpanzee_cool:

But it’s not completely voxel-based, and I am planning to have larger modules for higher tiers. As I always say, it’s kind of like TerraTech in space with not-so-comic graphics.

It’s not that the engine can’t do it or that it can’t be done, it’s just that I have no idea what I’m doing. I literally started programming about a year and a half ago.
Then there’s some polish that some games have that just makes them look right, and I can’t seem to figure out what it is. Perhaps it’s the lens flares or some filter. Perhaps it’s a consistent art style…

Well, I’m still deciding if it’s worth the time and effort I could be putting into more important gameplay aspects. I can always make asteroids glow red and explode, which should look about as good.
Oh, and I need some actual asteroids first -.-’

CSG can help with that too. Some games look cheap if solid objects just dissolve into the particles of an explosion. Better to cut the thing into 3 or 4 pieces dynamically and have them become new asteroids (like in the original “Asteroids” game).

But it’s like you said: The gameplay you want will define the features that you need. :chimpanzee_smile:

Look on Blendswap - lots of free Blender assets…

Well, I was thinking of procedurally generating them. Actually, never mind, that’s probably too resource-intensive…

CSG can help with that - combine some 100 basic shapes over and over again (and use 1 to 4 combine operations). Other procedural generation can work too (use metaballs or just simple vertex shifting as a start). :chimpanzee_smile:
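A quick sketch of the vertex-shifting variant (numbers arbitrary, rockMat assumed to exist):

// Start from a sphere and push each vertex radially by a position-based pseudo-random amount
// (position-based so duplicated seam vertices get the same offset and the mesh stays closed).
Sphere base = new Sphere(16, 16, 2f);
FloatBuffer pos = base.getFloatBuffer(VertexBuffer.Type.Position);
for (int i = 0; i < pos.limit(); i += 3) {
    Vector3f v = new Vector3f(pos.get(i), pos.get(i + 1), pos.get(i + 2));
    float n = FastMath.sin(v.x * 12.9898f + v.y * 78.233f + v.z * 37.719f) * 43758.5453f;
    n -= (float) Math.floor(n);              // pseudo-random in [0,1), stable per position
    v.multLocal(1f + (n - 0.5f) * 0.4f);     // +/- 20% radial jitter
    pos.put(i, v.x).put(i + 1, v.y).put(i + 2, v.z);
}
base.updateBound();                          // normals are left stale - fine for a quick test
Geometry asteroid = new Geometry("Asteroid", base);
asteroid.setMaterial(rockMat);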

Btw: for a beginner in both coding and game coding, your stuff looks pretty neat. Imagine you in 2 or 3 years, teamed up with a 3D environment artist…

2 Likes

@MoffKalast

A great start for February

1 Like

I decided to add another UI theme, this one called Glass. The Glass theme uses a tiled normal map to distort the background image and then uses a color balance formula to modify the background color. I had to add a second UV layer to my NinePartQuad, which NiftyGUI elements are drawn with, so I could tile the normal map. This theme runs a good deal faster than my Frost theme.
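The second UV layer itself is just an extra TexCoord2 buffer on the mesh (a bare-bones sketch of the idea, not the NinePartQuad itself):

// A plain quad's second UV set; the real NinePartQuad uses nine sub-quads instead.
float[] uv2 = {
    0, 0,
    1, 0,
    1, 1,
    0, 1
};
quadMesh.setBuffer(VertexBuffer.Type.TexCoord2, 2, uv2);
// The shader reads it through inTexCoord2 and scales it by the repeat factors
// (see the fragment shader snippet further down) to tile the normal map.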

Right now there are two settings, low and high, but high probably won’t make it into the final game. The low setting does not display windows behind one another, while the high setting does. The problem I’m having with the high setting is that there is a small amount of flickering in some small areas. I’m not sure exactly what’s causing this; it’s not z-fighting, as the windows are drawn back to front with depth write disabled. I’m guessing it has to do with the fact that I’m distorting the same texture that’s being drawn to, and maybe the buffer updates the texture at different rates, or the pixels are drawn in a different order from one frame to the next.

9 Likes

Yeah yeah yeah. Get this on Steam or other sales platforms and jME will soon be as popular as the other engines. :chimpanzee_closedlaugh:

It might be float precision. Yesterday I toyed with the Bullet physics engine and found out that setting the precision of the engine to a very high value (1/1000 instead of 1/60) causes strange artifacts - suddenly the player bounces off and is shot into the sky. It’s most certainly because float has only 7 decimal digits of precision and 4 decimal places are already taken by the 1/1000. Float sucks. And float in GLSL sucks even more since it is not really float but some kind of “short float”.
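(The setting I mean, for reference - shown here with the default value:)

// jME/Bullet step size ("accuracy"); the default is 1/60 s per physics tick.
BulletAppState bulletAppState = new BulletAppState();
stateManager.attach(bulletAppState);
bulletAppState.getPhysicsSpace().setAccuracy(1f / 60f);   // 1/1000 means ~17 physics steps per 60 fps frame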

Hey @Tryder, what’s your game about? It looks really sleek and cool, but I’m slightly confused about the media player. In a what-would-you-need-that-for sense. :stuck_out_tongue:

@MoffKalast It’s not really about anything, it’s all game with no storyline :wink:

It’s a turn-based strategy game that’s focused primarily on combat; there’s no tech tree or researching. You start out with two NightHawk attack craft and a Cargo Ship. Cargo Ships hold up to four multi-purpose assemblies, which can be used individually to repair damaged craft, or, with a full load, a Cargo Ship can be used to construct a Space Station, which harvests energy from stars and ore from asteroids and is used to build additional craft.

I don’t plan on including any music with the game; instead I wrote a media player and playlist manager/editor so gamers can listen to their own music in either MP3 or Ogg format while playing the game.

@Ogli Yeah I’ll keep looking into it, but it’s a cool effect either way. I just tried converting the Mesh that is used to draw the UI elements to use a DoubleBuffer for vertex positions rather than a FloatBuffer, but there was no change in the flickering.

It’s actually just one or two pixels here and there that seem to be changing from one frame to the next. The same problem exists in my Frost theme, but there those pixels just get blurred out of existence so it’s not noticeable; with the Glass theme, if the window is positioned just right, those pixels get stretched into a larger area, which then becomes noticeable. It appears as though the issue lies on the edges where two triangles meet.

2 Likes

I did not mean the vertex positions. For that, float could be enough.
I meant the normal-map turbulence which you are doing (in the fragment shader).
I think there you need a lot of precision, which is a problem in GLSL itself.

EDIT: it might also be that you are reaching into an area outside the background. Imagine that: you want to get a color from a nearby position, but you are reaching over the edge, so there is nothing there. This would always appear next to the edges or corners of your UI elements, if that is really the cause of the problem.

I set the background texture to MirroredRepeat so UV coordinates > 1 or < 0 just wrap. The normals for the mesh are all the same and do not change. The shader is using a normal map, yes, but it’s not used to modify the face normals; it’s used to offset the background image:

// Sample the tiled normal map using the second UV set.
vec4 norm = texture2D(m_Normal, texCoord2 * vec2(m_repeat_x, m_repeat_y));

// Maximum offset of m_RefractionStrength pixels, expressed in normalized screen coordinates.
vec2 offset = vec2((1.0 / m_screenX) * m_RefractionStrength, (1.0 / m_screenY) * m_RefractionStrength);
// Remap the normal map's red/green channels from [0,1] to [-1,1] and scale the offset by them.
offset *= vec2(norm.r, norm.g) * vec2(2.0, 2.0) - vec2(1.0, 1.0);
// Convert the world texture coordinate to screen space and apply the distortion offset.
vec2 wtc = vec2((worldTexCoord.x / m_screenX), (worldTexCoord.y / m_screenY)) - offset;

// Sample the (distorted) background behind the UI element.
vec3 back = texture2D(m_background, wtc).rgb;

Made with GIMP and Blender. I used GIMP to create a black and white tileable cloud, then used that as a displace map on a sub-divided plane in Blender and saved out the Normal pass after rendering.

P.S. When rendering with Blender Internal the normal pass is in a different format. For anyone wanting to make normal maps that way: you need to take the normal pass and separate it into individual RGBA components, then multiply both the red and green channels by 0.5 and add 0.5. The blue channel is inverted, so it needs to be multiplied by -0.5 and then have 0.5 added; then recombine, plug it into the composite node and save.

1 Like

Here’s a short clip that demonstrates the effect and the flickering. When the Music Player window comes up you can see the flickering in the lower right of the Music Player window.