I feel like I am on to a good thing at the moment. I’m pretty excited.
I’ll be developing prototype stuff one day a week, working on capabilities three days a week, and the last day will be Unity stuff.
Over the weekend I was looking at mouse look, and I found that composing quaternion rotations isn’t well suited to an FPS-style view. Plain rotations about the x and y axes do the job instead: pitching about the x axis and yawing about the y axis gives the ‘feeling’ of a human looking around in a complete circle. If you stand, look straight up and turn on the spot, the world spins around your view, as if the camera were looking down the -x axis. That’s the effect we actually want when controlling the camera with the mouse. With quaternions, rotating around the x axis and then the y axis introduces some z-roll, which is undesirable.
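The idea can be sketched in a few lines of C++ (the names and the sensitivity value here are illustrative, not from the actual engine): yaw and pitch are stored as two separate angles and the view direction is rebuilt from them each frame, so z-roll can never accumulate.

```cpp
#include <algorithm>
#include <cmath>

// Minimal sketch of an Euler-angle mouse look, assuming mouse deltas in
// pixels and an assumed sensitivity factor. Because yaw and pitch are
// tracked independently and the forward vector is rebuilt from them,
// no roll can ever creep in -- the problem the quaternion approach had.
struct MouseLook {
    float yawDeg   = 0.0f;  // rotation about the world y axis
    float pitchDeg = 0.0f;  // rotation about the camera's x axis

    void update(float dx, float dy, float sensitivity = 0.1f) {
        yawDeg   += dx * sensitivity;
        pitchDeg -= dy * sensitivity;
        // Clamp so the camera can look straight up/down but no further;
        // letting pitch pass 90 degrees gives the upside-down effect.
        pitchDeg  = std::clamp(pitchDeg, -90.0f, 90.0f);
    }

    // Forward vector derived from yaw/pitch only; z-roll cannot appear.
    void forward(float out[3]) const {
        const float yaw   = yawDeg   * 3.14159265f / 180.0f;
        const float pitch = pitchDeg * 3.14159265f / 180.0f;
        out[0] = std::cos(pitch) * std::sin(yaw);
        out[1] = std::sin(pitch);
        out[2] = -std::cos(pitch) * std::cos(yaw);
    }
};
```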
So for the ‘Idiot Robots’ game, we’re going to use first-person-shooter-style controls, with the mouse controlling direction and keys controlling movement. The gameplay will be a cross between the old Commodore 64 game Impossible Mission, which had you dodging robots to find microfilm parts to assemble a code, and Frogger, which had you dodging traffic and jumping over floating logs. A high degree of mobility is required, and this gives me plenty of opportunity for introducing fun game mechanics without having to worry too much about frills like weapons and enemy animations.
The main capability I’m working on at present is importing Wavefront .obj files from Blender into C++.
Monday afternoon: Ah, I perhaps bit off more than I could chew by trying to import a model with materials and multiple primitives. I’ll start with a simple box and try again.
Monday night: I’m still working on parsing the Wavefront file format. The issue is that Wavefront uses the character / to divide the integers for faces, while the standard C++ stream and string functions only split tokens on whitespace, so something like 1/2/3 comes through as a single string. I can’t rely on the normal extraction operators and will need to split each face token up myself. This is tedious but only needs to be done once.
I’m able to parse the vertex and normal data correctly, so once face data is being stored, I should be able to display the models sans materials and textures.
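Here’s a sketch of the kind of face parsing involved (the function and struct names are invented for illustration). Rather than stepping through characters one by one, std::getline with ‘/’ as the delimiter splits each corner token; note that a token like 4//6 has an empty texture slot, and that OBJ indices are 1-based:

```cpp
#include <sstream>
#include <string>
#include <vector>

// One corner of a face: vertex / texture / normal indices (0 = absent).
struct FaceVertex { int v = 0, vt = 0, vn = 0; };

// Parse one Wavefront face line, e.g. "f 1/2/3 4//6 7".
std::vector<FaceVertex> parseFaceLine(const std::string& line) {
    std::istringstream in(line);
    std::string tag, token;
    in >> tag;                       // consume the leading "f"
    std::vector<FaceVertex> face;
    while (in >> token) {            // one token per corner, e.g. "1/2/3"
        std::istringstream t(token);
        std::string part;
        FaceVertex fv;
        if (std::getline(t, part, '/') && !part.empty()) fv.v  = std::stoi(part);
        if (std::getline(t, part, '/') && !part.empty()) fv.vt = std::stoi(part);
        if (std::getline(t, part, '/') && !part.empty()) fv.vn = std::stoi(part);
        face.push_back(fv);
    }
    return face;
}
```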
I haven’t looked at materials and texturing yet. You may remember I wanted to do a UV unwrap tutorial for Blender, but that tutorial didn’t exist. I’ll have to learn this in the near future.
Tuesday night: My wife had a semi-holiday, so I spent most of the day out in town enjoying the cherry blossoms and the castle view with her. In the evening I got back and worked some more on the parser.
I’m still unable to parse face data, but the situation is improving from gibberish to strings of numbers that are just missing some data. I’m confident I’ll get it working tomorrow.
Wednesday midday: Parsing is done.
Now I need to work out how to pipe the vertex, normal and face data into a buffer and send it to the GPU to draw. I’ll also need to actually write the code to place this object in the world. These steps SHOULD be significantly easier than parsing the file, so, hopefully, I’ll have good news before the end of the day.
Wednesday afternoon: I’m more or less done (I hope) but out of time. I might get a chance to look at it when I get back from work tonight, otherwise first thing tomorrow morning.
Wednesday at midnight: It took a bus ride into town and sitting down with a pen to figure out what was going wrong. Well, it’s time for bed; here’s where I’m at right now.
Well now, it’s been such a long time since we proudly displayed a plain box!
This really is just a plain box. There’s no material or texture attached to it. I’ll need to implement this at some point. Out of curiosity, I painted it with the earth-shader.
As expected, without texture coordinates, the box didn’t get textured, although it was interesting to see the curved-shading effect on the edges of the cube.
I then imported my fighter01 blend object to see how badly it was going to come out – I figured if it was horribly mangled, I would be able to sleep on it and let my subconscious do the work.
Huh. That’s unexpectedly perfect. What a sweet way to finish off the day.
Thursday midday: The next objective is to get the materials working. Changing the material properties will allow the shaders attached to the models to interact in interesting ways – for example, we can have two green surfaces, where one has a stronger reflective property and a weaker diffuse property. One will look like plastic, and the other will look like metal.
So first of all, I’ve coloured our box in with some different materials so I can see how the Wavefront format handles multiple materials on a single object.
A long time ago, I was using a ‘colour’ property in my shaders. It’s now time to create a materials class and the associated properties. This step should be reasonably straightforward.
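A minimal sketch of what such a class could look like — the field names mirror the Wavefront .mtl keywords (Ka/Kd/Ks/Ns), but everything else here is an assumption, not the engine’s actual class:

```cpp
#include <map>
#include <string>

// Material properties as parsed from a .mtl file.
struct Material {
    float ambient[3]  = {1.0f, 1.0f, 1.0f};  // Ka
    float diffuse[3]  = {1.0f, 1.0f, 1.0f};  // Kd
    float specular[3] = {0.0f, 0.0f, 0.0f};  // Ks
    float shininess   = 32.0f;               // Ns
};

// Handler mapping the "usemtl <name>" references in the .obj file to
// material records parsed from the .mtl file.
class MaterialHandler {
public:
    void add(const std::string& name, const Material& m) { table_[name] = m; }
    const Material& get(const std::string& name) const { return table_.at(name); }
    bool has(const std::string& name) const { return table_.count(name) != 0; }
private:
    std::map<std::string, Material> table_;
};
```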
Thursday night: It’s continuing on basically as expected. I’ve finished the parsing and made a material class and a material handler class. My new vertices will need to store material data as well, which means changing my vertex buffer handler. No real problems encountered so far.
Parsing is done. Now I’m building the pipeline from data to vertex. There are a few points to take care of; we have to make sure the right material is assigned to the appropriate vertex.
Here’s where I’ll leave it for tonight.
The colours are off. I have to work out if there’s a problem with how I’m reading them in (unlikely, as I tested it exhaustively) or with the way I’m assigning the specular/diffuse lighting in the shader (likely, since I just hacked it together half asleep). I’ll figure it out tomorrow.
However, at this stage we’re officially getting data from Blender and transmitting it pretty accurately to the GPU. Incidentally, I discovered a ‘bug’ in Blender. I’m unable to assign separate colours by vertex unless the entire quad/triangle has the same colour. GLSL supports colour interpolation between vertices, as we discovered in one of the first images I put up here. This is probably a drawback of using free software. For my purposes, this is a non-issue. By the time I’m at a level where I care a great deal about the correct interpolation of colours across a triangle, I should be able to afford to buy a 3D graphics tool that can support it.
The colours are now correct – the error was in my vertex attribute pointer call. I had incorrectly ordered the interleaving of the ambient/diffuse/specular data, so the ambient data was being partially overwritten by the texture coordinate (which is not currently used), the diffuse was being partially overwritten by the ambient, and the specular was being overwritten by the diffuse.
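The fix amounts to keeping one consistent layout table. In this sketch (the attribute order is an assumption), each offset is just the running sum of the attribute sizes before it, and every glVertexAttribPointer call shares the same stride, so the overwriting bug can’t happen:

```cpp
#include <cstddef>

// Per-vertex attribute widths, in floats.
constexpr int kPos = 3, kNorm = 3, kTex = 2, kAmb = 3, kDiff = 3, kSpec = 3;
constexpr int kStrideFloats = kPos + kNorm + kTex + kAmb + kDiff + kSpec;

// Each offset is the sum of everything that precedes it in the vertex.
constexpr std::size_t offPos  = 0;
constexpr std::size_t offNorm = offPos  + kPos;
constexpr std::size_t offTex  = offNorm + kNorm;
constexpr std::size_t offAmb  = offTex  + kTex;   // ambient starts AFTER texcoord
constexpr std::size_t offDiff = offAmb  + kAmb;
constexpr std::size_t offSpec = offDiff + kDiff;

// Each call then looks like (illustrative):
//   glVertexAttribPointer(ambientLoc, kAmb, GL_FLOAT, GL_FALSE,
//                         kStrideFloats * sizeof(float),
//                         (void*)(offAmb * sizeof(float)));
```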
It’s late and I’m exhausted – I had work this afternoon and was happy to sort this out. I’ll look at tidying up the shader code and figuring out my next step when I have some more time. I’m itching to complete the obj format process and implement texture coordinates since it may not be too complex, but I’m wary of getting further behind schedule. My goal of making a prototype engine is nearly realised and I don’t strictly speaking need models with auto-importing textures.
Saturday: It’s tough to get any work done on the weekend as it’s the only real time I get to spend with my wife. Nonetheless, I plugged the lighting contributions from the material into the shader, with basically the expected results. I probably won’t get a chance to revisit this until Monday, but that will be the start of a new month, and it will be time to review what is done and evaluate what will be done next.
Here’s the cube – the red surface is lit. Because it’s a perfect cube with faces at ninety degrees from each other, and the light source is perpendicular to the red face, the other faces aren’t lit at the moment. You can see the specular bloom in the bottom right corner that shows where the light is currently directed.
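That observation is exactly Lambert’s cosine law: the diffuse contribution is the dot product of the surface normal and the light direction, clamped at zero. Written out in plain C++ for illustration (in the engine this lives in GLSL), it might be sketched as:

```cpp
#include <algorithm>

// Lambert diffuse factor: brightness scales with the cosine of the
// angle between the unit surface normal n and the unit light
// direction l, clamped at zero so faces at 90 degrees or more from
// the light stay completely dark.
float diffuseFactor(const float n[3], const float l[3]) {
    const float dot = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    return std::max(dot, 0.0f);
}
```

A face whose normal is perpendicular to the light direction gets a factor of zero, which is why only the red face of the cube is lit.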
I then re-loaded the fighter model. I messed around with the specular lighting property to try and get a good metallic look; the ‘shininess’ value is set high and the ‘light strength’ (reflective value) is set low.
You can see the different surfaces being lit due to their angle to the light, although because the reflection value is poor there doesn’t appear to be a specular bloom. Nonetheless this achieved my objective so I’m happy with it.
The next capability I want to develop is importing textures that I have added to my Blender objects.
I downloaded this crate image from OpenGameArt.org.
I then pasted this texture onto the crate object, and tested how it appeared in Blender’s OpenGL engine before trying it in my own.
In addition to seeing the crate is pasted correctly, I’m happily able to visually confirm that sides of the cube with no light remain in darkness. All is well. I will probably want to add an ambient lighting property eventually.
I’ve already dealt with pasting textures onto a simple surface before, so this cube shouldn’t be much trouble. More complicated shapes, such as the fighter, or models that have multiple components and textures, may be too much effort to justify at this stage. I’m mainly interested in fairly simple models that look good and render well. For Idiot Robots, I’ll be happy with three boxes that can rotate independently.
A quick play around with Blender and exporting shows that the default export simply writes the address of the texture in the material file. This won’t do: I need texture coordinates so that I can correctly wrap the texture to the object. This will, not surprisingly, require completing a UV unwrap tutorial. Luckily, these are not hard to find.
So I’ll be back here tomorrow morning with the monthly review, and some decisions about what I’m doing in April.