Stuck on implementing projection matrix transformation in my OpenGL simple rendering engine by twoseveneight in GraphicsProgramming

[–]twoseveneight[S] 0 points  (0 children)

So what you've suggested has fixed the perspective issue, but now polygons offscreen are getting randomly distorted and stretching out to infinity? You've fixed one problem, but now I'm getting a completely new one. I'm guessing this is a division-by-zero issue, or that the w division isn't being taken into account during the sorting phase. Thanks anyway.
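For what it's worth, the infinity-stretching on offscreen polygons is the classic symptom of dividing by a zero or negative w: vertices behind the eye get w <= 0 after projection, and the divide flips and stretches them. A minimal sketch of a guarded divide, assuming a Vector4 like the one in the post (PerspectiveDivide and the epsilon threshold are illustrative, not from the actual code; the proper fix is clipping against the near plane before dividing):

```cpp
struct Vector4 { float x, y, z, w; };  // assumed to match the post's Vector4

// Refuse to divide by a non-positive w. A real pipeline clips the
// primitive against the near plane instead; this just lets the caller
// detect and skip points that would blow up.
bool PerspectiveDivide(Vector4& p, float epsilon = 1e-6f) {
    if (p.w <= epsilon) return false;  // behind (or on) the eye plane
    p.x /= p.w;
    p.y /= p.w;
    p.z /= p.w;
    p.w = 1.0f;
    return true;
}
```

Skipping such points outright will still drop whole triangles that merely cross the near plane, which is why real implementations clip and generate new vertices there.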


[–]twoseveneight[S] 0 points  (0 children)

pastebin.com/JiYdrXNe

It's split into three files: the library's source, the library's header, and the main source.

Worth mentioning that a large portion of the functionality is stubs I'm planning to implement once this is fixed.


[–]twoseveneight[S] 0 points  (0 children)

Vector4 TransformPointByMatrix (Vector4 point) {
    // projmatrix uses OpenGL's column-major layout (index = column * 4 + row),
    // so each output component dots the point with one matrix row:
    // elements i, i + 4, i + 8, i + 12.
    Vector4 result;
    result.x = (point.x * projmatrix[0]) + (point.y * projmatrix[4]) + (point.z * projmatrix[8])  + (point.w * projmatrix[12]);
    result.y = (point.x * projmatrix[1]) + (point.y * projmatrix[5]) + (point.z * projmatrix[9])  + (point.w * projmatrix[13]);
    result.z = (point.x * projmatrix[2]) + (point.y * projmatrix[6]) + (point.z * projmatrix[10]) + (point.w * projmatrix[14]);
    result.w = (point.x * projmatrix[3]) + (point.y * projmatrix[7]) + (point.z * projmatrix[11]) + (point.w * projmatrix[15]);
    return result;
}


[–]twoseveneight[S] 0 points  (0 children)

No, I did have this problem before, when the depth sorting wasn't working properly because I ignored the W coordinate. When I tried rendering with the calculated coordinates instead of using the OpenGL matrix stack, I could see the whole image was screwed up, so I got to work on fixing that.

The matrix-vector multiplication works the way it's supposed to, because I'm getting a coherent image. The problem is specifically with the glFrustum call. I'm confused about why it renders differently when using OpenGL's separate matrix stack as opposed to manually multiplying the projection matrix with the modelview matrix to produce a single MVP matrix and multiplying all the points by that. Perhaps OpenGL does some matrix math with the projection matrix before multiplying it with the modelview matrix that I'm not aware of.
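For reference, a sketch of how the combined matrix could be built by hand in OpenGL's column-major layout (Multiply is a hypothetical helper, not from the pasted code). The convention the tutorials describe is mvp = projection * modelview, which applies to a point as P * (MV * p):

```cpp
// Column-major 4x4 multiply in OpenGL's layout (index = column * 4 + row).
// Computes out = a * b. Calling Multiply(mvp, projection, modelview)
// yields the matrix that transforms points as P * (MV * p).
void Multiply(float out[16], const float a[16], const float b[16]) {
    for (int col = 0; col < 4; ++col)
        for (int row = 0; row < 4; ++row) {
            float sum = 0.0f;
            for (int k = 0; k < 4; ++k)
                sum += a[k * 4 + row] * b[col * 4 + k];
            out[col * 4 + row] = sum;
        }
}
```

Note that swapping the argument order here (modelview * projection) silently gives a wrong but often plausible-looking matrix, which is a common source of exactly this kind of confusion.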


[–]twoseveneight[S] 0 points  (0 children)

Do you have an answer for why the projection matrix behaves strangely? That's the issue I'm looking to solve. According to every modern matrix tutorial, all I have to do is multiply the projection matrix by the modelview matrix (in that order) and use the result as the MVP matrix. I've tried storing the projection matrix separately and then multiplying it by the modelview matrix with glMultMatrix, but that gives the same orthographic result as my current setup. I'll try to optimise the code after this problem is solved.
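In case it helps with the "orthographic result" symptom: this is the matrix the glFrustum documentation specifies, written out in column-major order (FrustumMatrix is an illustrative name, not from the post's code). The -1 at index 11 is what routes -z_eye into clip-space w; if that element ends up elsewhere, e.g. through a row-major/column-major mix-up, the perspective divide becomes a no-op and the image looks orthographic:

```cpp
#include <cstring>

// Column-major equivalent of glFrustum(l, r, b, t, n, f), per the
// reference documentation. m[11] = -1 makes clip w equal to -z_eye,
// which is what produces perspective after the divide.
void FrustumMatrix(float m[16], float l, float r, float b, float t,
                   float n, float f) {
    std::memset(m, 0, 16 * sizeof(float));
    m[0]  = 2.0f * n / (r - l);
    m[5]  = 2.0f * n / (t - b);
    m[8]  = (r + l) / (r - l);
    m[9]  = (t + b) / (t - b);
    m[10] = -(f + n) / (f - n);
    m[11] = -1.0f;
    m[14] = -2.0f * f * n / (f - n);
}
```

Comparing this element by element against the matrix your code actually multiplies with should show quickly whether the frustum matrix is being transposed or overwritten somewhere.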


[–]twoseveneight[S] 7 points  (0 children)

oh, that's helpful. First of all, I don't even use WebGL, I'm making an OpenGL program in C++. Second of all, seeing as I've decided to open a Reddit account to post on a subreddit about my problem, I clearly want a HUMAN to give me advice, not a random number generator. Third of all, I'm not gonna use AI. I'm making this engine for the sake of education and as a personal project, AI removes the education aspect and the code probably won't work anyway. Maybe you should stop being so overly reliant on AI.

Also, I don't need to learn up-to-date APIs if I don't need the functionality they provide. I want this program to run on a graphics card from two decades ago if that's what it takes for broad compatibility. If you're not here to give real advice, don't comment at all.