I was originally trying to use the SOIL library to get my images loaded, but I eventually ditched it in favor of “lodepng”, a very lightweight PNG loading library. I personally prefer the .png graphics format for pretty much everything, a habit from my days as a Multimedia student: it’s lossless, well supported, and has a reasonable memory footprint.
SOIL apparently hasn’t been updated since VS8, and I’m using VC13, and that was apparently too big a jump for the code to make. The only pre-compiled binaries I could find were for MinGW, so it would have worked with my Eclipse environment (presuming I’d been able to get GLEW working there in the first place, which to be honest was my preference). I spent time looking into the problem and uncovered some good learning points, so I don’t consider the time wasted.
I did waste a bit of time trying to jump straight into loading a texture from a file onto the surface of an object. I bit off more than I could chew; for anybody following in my footsteps, my advice is:
1 – Create a texture coordinate variable in your vertex. Just set the texture coordinate to the x/y of your vertex so you can immediately see if it’s working.
2 – Then worry about updating the texture coordinates based on what may or may not be in a texture.
I had a small bug where I’d assigned my vertex indices to the last position in the element array buffer, so whenever I add a new vertex property I need to manually update the position of the index. I need to find a way to automate this or engineer the problem out somehow.
Anyway, I had a few interesting warp effects as I was adding the texture coordinates so I thought I’d share them.
This is what you get when you send the un-normalized position data in as the texture coordinate (i.e., before the model view matrix is applied). I’m not really sure what is going on here, but the point where the lines converge does not change – moving the cube around, the lines and colours stayed exactly where they were. Bizarre.
The strange sphere segment is because at that point I was only testing the texture against a single face of the sphere. Remember, my sphere is actually a cube that’s been warped into a sphere! You can see the strange convergence always appears in the center of the screen regardless of the orientation of the scene.
Once we got this far, it was pretty trivial to say ‘OK, let’s put in the NORMALIZED points this time’, and we get the famous red/green blend of vec2(x, y) with both coordinates running from 0 to 1.
Here’s a high angle/low angle view.
First attempt at ‘striping’ the target. It was supposed to put a thin stripe down the middle. Hmm, something doesn’t seem quite right.
A bit of fiddling and mission complete. This was the objective for today, and I played around with a few more angles/modes to see if it was really working.
In the shot on the right you can see some fairly serious aliasing. I do know how to repair this, using in-between steps rather than a sharp YELLOW/BLUE choice, but I’m out of time for tonight.
This is transparency, using GLSL’s discard statement.
However, there are some things that need to be implemented to make it look more real:
1 – Backfaces (the far side of the sphere, its inside, which is not drawn) need to be lit.
2 – Shadows of the bars on THIS side need to appear on those backfaces.
Backfaces are relatively trivial from my brief review; shadows are not.
I want to get the brick pattern effect shown in the OpenGL orange book (we’re actually 3/4 there already) with anti-aliasing up.
I will implement backfaces on the testsphere to somewhat improve the transparency effect.
I’m going to skip shadows for now; I want to finish off the purpose of this capability, which is to get a texture loaded from a file on to the surface of the sphere. I’d like to make it a globe; then I can chuck in a couple of stars in the background and make myself a pretty little galaxy shot to put on my desktop.
Then it’s time to fire up Blender and start porting models in and animating them.
I had a closer look at Unity, too. I have to say I’m pretty impressed. I’m considering jumping into Unity once I’ve scratched off enough capabilities here, or perhaps using it as a rapid prototyping tool.
For now my main objective is to get programmed up enough to be employable somewhere, ideally in Adelaide, but if Blizzard is hiring (hey look, they are!) maybe I’ll have to go to America. I haven’t even gotten back from Japan yet though X__X
Anyway, the situation is looking good for now. I’ve been very busy due to one of my co-workers abruptly breaking contract and fleeing the country, requiring me to pick up her shifts. So that’s…three days I’m working from 9:30 am until 9:30 pm. Wuwu.
I also turned 34 on Sunday, not incidentally :)
Well, I’m pretty happy I got this knocked out, given that I’ve only been able to afford scraps of time in between work and other unavoidable events.
I wonder what it’s like to work full time in a programming job? It must be pretty amazing. I really love solving problems and coding and learning new things. It’s my dream to work in a job that is also my hobby.
Well, gotta start somewhere, I guess. I’ll take what I can get.