23/04/2020

# Basic Lighting

Once again, many things have happened, as seen in the tweet below. In the past three days I have added the coordinate system with camera and transformations, refactored the rendering API to a more immediate pipeline, and added basic ambient + diffuse + specular shading. All that refactoring of vertex arrays has definitely come in handy and allowed me to fix a couple of bugs as I understood more about what exactly a vertex array is.

# Vertex Arrays are not Arrays of Vertices

Well, I knew that much coming into it, but I was lost on the specifics. Here's what I now understand.

**Vertex Array**

A vertex array is an object that, when currently bound, captures the state of vertex attributes and index buffers. Binding it again will rebind that captured state.
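To make that concrete, here's a minimal sketch of creating and binding a vertex array. It assumes the raw `gl` crate bindings and an already-created OpenGL context; the function is illustrative, not my actual renderer code.

```rust
// Minimal sketch, assuming the raw `gl` crate bindings and a live OpenGL
// context created elsewhere (e.g. by the windowing library).
fn create_vertex_array() -> u32 {
    let mut vao: u32 = 0;
    unsafe {
        // Ask OpenGL for one vertex array object handle.
        gl::GenVertexArrays(1, &mut vao);
        // While this VAO is bound, attribute and index-buffer state changes
        // are captured by it, until it is unbound or another VAO is bound.
        gl::BindVertexArray(vao);
    }
    vao
}
```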

There are only four OpenGL functions whose state changes a currently bound vertex array will capture:

# 👉 glEnableVertexAttribArray(...) / glDisableVertexAttribArray(...)

Enable/disable vertex attributes, e.g., enable location = 0 so I can send vertex positions to my vertex shader.
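As a tiny sketch (same `gl` crate assumption, with the VAO already bound):

```rust
// Hypothetical helper: enable attribute slot 0, the `layout(location = 0)`
// position input in the vertex shader. The enabled flag is stored in the
// currently bound vertex array.
fn enable_position_attribute() {
    unsafe {
        gl::EnableVertexAttribArray(0);
    }
}
```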

# 👉 glVertexAttribPointer(...)

Tell OpenGL what part of the vertex buffer is the attribute, e.g., the first 3 floats are the position.

**Important!**

Calling this will cause the vertex array to implicitly capture the currently bound vertex buffer, so it knows where the data lives.
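Here's a sketch of that implicit capture, assuming the `gl` crate and a made-up layout of three position floats followed by three normal floats per vertex (my actual layout may differ):

```rust
use std::mem::size_of;

// Hypothetical helper: describe a position + normal vertex layout to the VAO.
// Assumes `vao` and `vbo` (with interleaved data uploaded) were created elsewhere.
fn describe_vertex_layout(vao: u32, vbo: u32) {
    unsafe {
        gl::BindVertexArray(vao);
        // The vertex buffer must be bound to GL_ARRAY_BUFFER *before* the
        // glVertexAttribPointer calls below...
        gl::BindBuffer(gl::ARRAY_BUFFER, vbo);

        let stride = (6 * size_of::<f32>()) as i32; // 3 position + 3 normal floats

        // ...because each call records "attribute N lives at this offset in the
        // buffer currently bound to GL_ARRAY_BUFFER" into the bound VAO.
        gl::VertexAttribPointer(0, 3, gl::FLOAT, gl::FALSE, stride, std::ptr::null());
        gl::EnableVertexAttribArray(0);

        gl::VertexAttribPointer(
            1,
            3,
            gl::FLOAT,
            gl::FALSE,
            stride,
            (3 * size_of::<f32>()) as *const _,
        );
        gl::EnableVertexAttribArray(1);
    }
}
```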

# 👉 glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ...)

Bind (or unbind) an index buffer.
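Sketch, same assumptions as before:

```rust
// Hypothetical helper: attach an index buffer to a VAO. While the VAO is
// bound, this binding is captured, so a later glDrawElements call can find it.
fn attach_index_buffer(vao: u32, ibo: u32) {
    unsafe {
        gl::BindVertexArray(vao);
        gl::BindBuffer(gl::ELEMENT_ARRAY_BUFFER, ibo);
    }
}
```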

**Unbinding Matters**

Make sure to unbind the vertex array before unbinding the index buffer; otherwise (since the same function both binds and unbinds), the vertex array will capture the unbound state.

Knowing all this now, I have been very careful to bind and unbind everything in the right order, so I don't capture the wrong state or leave my vertex array bound.
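Putting it together, the order I now follow looks roughly like this (a sketch under the same `gl` crate assumption, not the renderer's actual code):

```rust
// Rough setup/teardown order for one mesh: bind the VAO first so it captures
// everything, and unbind it first so it doesn't capture the unbinding.
fn setup_then_unbind(vao: u32, vbo: u32, ibo: u32) {
    unsafe {
        gl::BindVertexArray(vao);
        gl::BindBuffer(gl::ARRAY_BUFFER, vbo);
        gl::BindBuffer(gl::ELEMENT_ARRAY_BUFFER, ibo);

        // Tightly packed vec3 positions at attribute 0 (illustrative layout).
        gl::VertexAttribPointer(0, 3, gl::FLOAT, gl::FALSE, 0, std::ptr::null());
        gl::EnableVertexAttribArray(0);

        // Unbind the VAO *first*. If GL_ELEMENT_ARRAY_BUFFER were unbound while
        // the VAO was still bound, the VAO would capture the unbound state and
        // lose its index buffer.
        gl::BindVertexArray(0);
        gl::BindBuffer(gl::ARRAY_BUFFER, 0);
        gl::BindBuffer(gl::ELEMENT_ARRAY_BUFFER, 0);
    }
}
```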

Please, please, this is supposed to be a happy occasion, let's not bicker and argue about who unbound who. We are here today, to witness the Critical Path progress.

# Critical Path v1.1

1. Coordinate System
    - Entity object to encapsulate a Transform component and Renderer data.
        - Just have a list of transforms, as the rendering data is only cubes.
    - Camera component to handle view and projection transforms, and move around the scene.
        - The camera automatically orbits the scene; no manual control.
    - The nalgebra-glm crate has all the required math functionality (see the sketch at the end of this post).
2. 🚧 Forward Lighting
    - Light component with color, strength, and type (directional or spotlight).
        - Just using spotlights for now.
    - Material component to encapsulate ambient, diffuse, specular, and shininess values.
    - 🚧 Create a GLSL function to calculate lighting from a material's values and all the light source values.
3. 🚧 Geometry
    - I'm going to skip model loading for now and just use some pre-defined cube vertex data.
    - 🚫 Maybe I'll render the light sources as little spheres.
    - 💎 Use instanced rendering to more easily manage vertex/index/transform data.
4. Deferred Shading
    - I will need to learn how to use framebuffers.
    - I might need to do something with the depth buffer.
    - I will need to learn about Multiple Render Targets (MRT).
    - Update the lighting calculation to take input from the G-buffer framebuffer.
| Mark | Description |
| ---- | ----------- |
|      | Done        |
| 🚧   | WIP         |
|      | Blocked     |
| 🚫   | Removed     |
| 💎   | New         |
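Since the coordinate system is done and forward lighting is in progress, here's a rough sketch of how I picture those pieces fitting together. None of this is my actual code; the struct and field names are made up, and I'm only assuming nalgebra-glm (imported as `glm`) for the math.

```rust
use nalgebra_glm as glm;

// Transform for each cube; only a position for now, since everything is a cube.
struct Transform {
    position: glm::Vec3,
}

// Orbiting camera that produces the view and projection matrices.
struct Camera {
    fov_y: f32,
    aspect: f32,
    near: f32,
    far: f32,
}

impl Camera {
    // Orbit around the origin at a fixed radius, driven by elapsed time.
    fn view(&self, time: f32) -> glm::Mat4 {
        let radius = 10.0;
        let eye = glm::vec3(time.cos() * radius, 3.0, time.sin() * radius);
        glm::look_at(&eye, &glm::vec3(0.0, 0.0, 0.0), &glm::vec3(0.0, 1.0, 0.0))
    }

    fn projection(&self) -> glm::Mat4 {
        // Note: nalgebra-glm takes aspect first, unlike C++ GLM.
        glm::perspective(self.aspect, self.fov_y, self.near, self.far)
    }
}

// Spotlight-style light source for the forward lighting pass.
struct Light {
    position: glm::Vec3,
    color: glm::Vec3,
    strength: f32,
}

// Per-object material values fed into the GLSL lighting function.
struct Material {
    ambient: glm::Vec3,
    diffuse: glm::Vec3,
    specular: glm::Vec3,
    shininess: f32,
}

// Model-view-projection for one cube, ready to upload as a uniform.
fn mvp(camera: &Camera, transform: &Transform, time: f32) -> glm::Mat4 {
    let model = glm::translation(&transform.position);
    camera.projection() * camera.view(time) * model
}
```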