Week of January 19, 2014

#1
Sunday, January 19, 2014

Big day for nodes, HUDs, and all things shiny (well, I couldn't really think of a third thing so...) :monkey:

At long last, the interface is integrated with the rest of the rendering pieces of the game. It used to be a mere module, throwing itself directly on the screen. Now, however, it holds a place among the elements of the exalted "render chain," having full access to the many buffers, bits, and barnacles (??) of the rest of the rendering subsystems! :D

There are several exciting advantages to this, although perhaps the most exciting is the fact that I can now use fancy, shader-based blending to draw the UI. Instead of just additive / subtractive / alpha, I can get fancy for the sake of visibility. I wrote a blend mode today that interpolates between additive and subtractive blending based on the luminance of the target frame - it works wonders! I can see the UI now whether I'm staring into a bright sun or the cold, dark abyss of space. And all without any kind of background overlay! I can also do neat things like fade out the edges of the node viewports as I blend them onto the screen, which provides a lovely, continuous HUD effect :clap:
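
For the curious, the math boils down to something like this (just a rough, CPU-side sketch with made-up names; the real version lives in a fragment shader):

```cpp
#include <algorithm>

// Sketch of the blend: lerp between additive and subtractive blending based
// on the luminance of the already-rendered frame at that pixel.
struct Color { float r, g, b; };

static float Clamp01(float x) { return std::max(0.0f, std::min(1.0f, x)); }

// Perceptual luminance of the frame: ~0 = deep space, ~1 = staring into a sun.
static float Luminance(const Color& c) {
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

Color BlendUI(const Color& frame, const Color& ui) {
    float t = Luminance(frame);
    // (1 - t) * (frame + ui) + t * (frame - ui)  ==  frame + (1 - 2t) * ui:
    // the UI brightens the frame over dark backgrounds and darkens it over
    // bright ones, so it stays visible without any background overlay.
    return { Clamp01(frame.r + (1.0f - 2.0f * t) * ui.r),
             Clamp01(frame.g + (1.0f - 2.0f * t) * ui.g),
             Clamp01(frame.b + (1.0f - 2.0f * t) * ui.b) };
}
```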

I had a nice surprise today when I first played with compositing the UI onto the rendered frame. I was puzzled when I noticed that the text responded to the compositing just like all the other nodal elements :wtf: For a long time (probably around 9 or 10 months) I'd been under the impression that something deep within SFML would prevent me from rendering text to a texture. I thought it always rendered directly to the backbuffer, ignoring any custom framebuffer setup. I could have sworn I tested this theory :think: And yet...when I tried compositing the UI with subtractive blending, I was immediately surprised to see the white text turn black, just as it should have. This implied that the text was going through the same pipeline as everything else...i.e., being rendered to the UI's target texture! :shock:
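
(For anyone wondering what that looks like in practice, here's a rough sketch of the SFML 2.x calls involved; the texture size and font path are made up:)

```cpp
#include <SFML/Graphics.hpp>

int main() {
    // Render text into an offscreen texture instead of straight to the
    // backbuffer, so it can be composited like any other nodal element.
    sf::RenderTexture uiTarget;
    if (!uiTarget.create(1024, 512)) return 1;        // hypothetical UI layer size

    sf::Font font;
    if (!font.loadFromFile("ui_font.ttf")) return 1;  // hypothetical font asset
    sf::Text label("Hardpoint Viewer", font, 24);

    uiTarget.clear(sf::Color::Transparent);
    uiTarget.draw(label);        // drawn into the texture, not the screen
    uiTarget.display();

    // The resulting texture can be composited with a custom blend mode,
    // post-processed, or even mapped onto a mesh in the world.
    const sf::Texture& uiTexture = uiTarget.getTexture();
    (void)uiTexture;
    return 0;
}
```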

It might seem small, but to me this tiny little thing opens up immense possibilities. I could now render a nodal UI onto a game object, if I wanted! I could create textures with custom text on them, perhaps markings on a station or ship. I could apply post-processing to the UI, and finally achieve that lovely glowing text effect that I see being heavily-abused by every single Scaleform UI in existence :roll: So much potential, all because text works just like it should! Hooray!!! :D

PS ~ Sorry for the late log! Went to bed entirely too late last night :ghost: :ghost:
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#2
Monday, January 20, 2014

Wow...amazing day, but not the one I was expecting! :o Another one of those days where I intended to fight a battle on one side of the world, but a war cropped up on the other :ghost:

A while ago I mentioned that the engine currently has a rather high frame time variance, due to the fact that simulation occurs only on some frames. I also mentioned that one solution would be to spread out simulation such that different objects get simulated on different frames, thus reducing the workload variance. Sounds pretty simple in theory, except that...it's not at all :shock:

The problem with doing that is that you now have every object living in a different "time zone," so to speak. Things are already complicated thanks to the fact that simulation must occur at a fixed rate while rendering occurs at a variable rate, hence rendering needs to interpolate world state to make it look as though simulation is continuous. Now, mix in the fact that different objects are being simulated on different frames, hence, have different interpolants?? It gets complicated fast :ghost:

Needless to say, it was a day of much pain. But the good kind of pain. The pain where you know you'll thank yourself for it later :) In the end, we won the war. The engine is capable of simulating every object in its own "time zone" and smoothly interpolating the results. This actually comes with the lovely benefit that I can define a different simulation frequency on a per-object basis! That's great news, as it means I can really turn down the frequency for static objects like asteroids, as they hardly require any simulation at all. No need to waste cycles on them :)
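
If you're curious how per-object "time zones" can work, here's a minimal sketch (names invented, much simpler than the real engine): each object carries its own step size and accumulator, and rendering interpolates between its last two simulated states.

```cpp
#include <vector>

struct State  { float position = 0, velocity = 0; };

struct Object {
    State prev, curr;          // last two simulated states
    float simPeriod = 1/60.f;  // seconds between steps -- different per object!
    float accumulator = 0;     // time since the last step
};

void Simulate(State& s, float dt) { s.position += s.velocity * dt; }

void Update(std::vector<Object>& objects, float frameDt) {
    for (Object& o : objects) {
        o.accumulator += frameDt;
        // Static stuff (asteroids...) gets a huge simPeriod and steps rarely,
        // which spreads the simulation workload across frames.
        while (o.accumulator >= o.simPeriod) {
            o.prev = o.curr;
            Simulate(o.curr, o.simPeriod);
            o.accumulator -= o.simPeriod;
        }
    }
}

float RenderPosition(const Object& o) {
    // Every object has its own interpolant, since each lives in its own time zone.
    float alpha = o.accumulator / o.simPeriod;
    return o.prev.position + alpha * (o.curr.position - o.prev.position);
}
```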

Anyway, who cares about all this nonsense. The result? So much smoothness. Silky, silky smoothness :D The frame variance is down to ~10-20% during normal play, quite a bit better than the ~100%+ that I measured a while ago. You can definitely feel it. I've always loved games that felt smooth. Probably part of the reason why I still love old games so much (when they're running at 100+ FPS, they really don't have any choice but to be smooth!) I want LT to be like that. Really, really smooth and immersive experience. And here we have it :D

Ok! Back to the HUD ;)

(But first, back to sleep...it was another really late night :crazy: :ghost: )
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#3
Tuesday, January 21, 2014

Finally coming up on one HUD / UI theory to rule them all! This time, though, I'm beginning to look at the HUD as the general idea of containing secondary interfaces within an interface. What I've realized is that a HUD widget is no different from, for example, a box that shows information about your ship in the hardpoint viewer! The key realization is that any given node can have multiple "streams" of information that flow out of it, and these streams can be hooked up to secondary interfaces. What a beautiful unification! It means that the HUD is not a special concept! Just one of many applications of these information streams.
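
To make that a little more concrete, here's a toy sketch of the idea (entirely made up, not the engine's actual code): a node exposes named streams, and the HUD is just one more consumer of them.

```cpp
#include <functional>
#include <map>
#include <string>

struct Widget { std::string text; };   // stand-in for a real sub-interface

struct Node {
    // Each stream produces a widget describing one slice of the node's data.
    std::map<std::string, std::function<Widget()>> streams;
};

int main() {
    Node ship;
    ship.streams["status"] = [] { return Widget{"Hull 98%"}; };
    ship.streams["cargo"]  = [] { return Widget{"12 / 40 units"}; };

    // The hardpoint viewer's info box and the HUD hook the very same stream;
    // the HUD is not a special concept.
    Widget hudBox = ship.streams["status"]();
    (void)hudBox;
    return 0;
}
```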

I'm about halfway on the implementation of this idea. I know, I know...it's taken me a while. But..it's going to be worth it. All of this is going to be so unified, clean, easy to manage...I get all warm and fuzzy inside just thinking about it :D

Now, I want to get something off my chest.

I got derailed for maybe 30 minutes or so today thinking about life, the universe, and everything :) Throughout the entirety of development, I've kept a list of things that I would do differently, were I to do this whole thing again. In general, these are things that are either really huge changes that would take too much time to make at this point, or moderately-large changes that aren't worth the benefit they would bring at this point. Today I've added a new element to this list - perhaps the biggest one yet: "c instead of c++." That's a bit shocking. Even as I write it I am slightly scared by it. I have loved c++ so deeply for so long. But over time, I've written so much of it, I've come to understand it so deeply...and, as I march closer and closer to my dream of collapsing everything into simplicity, I am beginning to loathe the complexity of it. I find myself thinking more and more in terms of simple, simple units of functionality. Function and data. That's what c is. C++ adds so much that I have used without much thought. But now that I think about it, everything could have been done more simply with just functions and data.

I won't talk specifics, because I'd need a full page for my thoughts :) But if I were to do it again, I would write LT in c with a sprinkling of c++. I would use the necessities of c++ like constructors and destructors. I would use templates for generic programming, but I wouldn't get fancy with them. I would drop inheritance, virtual functions, member functions, and pretty much everything that c++ added to classes (again, minus ctor and dtor). I honestly would love to see what the engine would look like today had I had this clarity from the beginning. Well...the LT2 engine will be a fun one ;)

(Well, actually, as you might already know, LT2 will be written in a node-based programming language :D But the nodal compiler will emit mostly c, rather than c++ ;) )

Gonna be a good day of code...I can feel it :)

PS ~ I really am sorry about the unpredictability of dev log times as of late :( I really wish I could beat my sleep pattern back into something predictable...but... :oops:
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#4
Wednesday, January 22, 2014

Got it :cool: !! Recursive nodal viewports, multiple "data streams" from the same node...I finally got it :D Next up: implement a basic version of the HUD as just a node. Bonus points: figure out how input works within this framework (when do the sub-interfaces receive input?) Anddd later: figure out how sub-interfaces should be laid out. Yes yes, many questions but finally some answers as well! ;)

Well, sorry to be a rip-off today but...I don't have a whole lot more to say and am rather itchy to get back to it! The end of the month is coming too quickly for my taste and there's still so much dev fun to be had before then :geek:

PS ~ 42 users online as I'm posting this :shock: Is it me or is LT getting a bit popular lately? :D
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#5
Thursday, January 23, 2014

Very productive day. Still not productive enough. More hours!! Need more hours!!! :cry:

Worked hard on trying to get those sub-interfaces looking good. Made some solid progress - probably the best idea that I had all day was to use a blurred version of the rendered frame as a background for the interface areas. This results in a glassy look to the panes, which is quite pleasing! A bit inspired by iOS 7 ;) What I really need to do though is apply some kind of 3D effect to them, as the 2D panes just stand out a bit too much for a HUD. I'm currently debating how I want to do that :monkey:
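
The per-pixel idea is roughly this (a tiny sketch with invented names, not engine code): the pane's background is just a pre-blurred copy of the rendered frame, sampled at the same screen position and lightly tinted.

```cpp
struct Color { float r, g, b, a; };

// Stand-in for sampling the blurred copy of the frame at screen UV (u, v).
Color SampleBlurredFrame(float u, float v) { return { 0.1f * u, 0.1f * v, 0.2f, 1.0f }; }

Color GlassPanePixel(float u, float v, const Color& tint) {
    Color bg = SampleBlurredFrame(u, v);
    float k = tint.a;   // how strongly the pane's own tint shows through
    return { bg.r * (1 - k) + tint.r * k,
             bg.g * (1 - k) + tint.g * k,
             bg.b * (1 - k) + tint.b * k,
             1.0f };
}
```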

Now for something completely different! :) I spent a bit of time today looking into SDL, which is the better-known alternative to SFML when it comes to all the basic constructs on which the engine rests (window management, input devices, timers, threads, locks, etc). Truth be told, I've become a bit concerned about SFML over the past year. At first it seemed clean, elegant, and robust. But after more and more use, there are a few areas that have started to trouble me. Keyboard support for OS X is downright broken (some keys just won't work on Mac). Performance of input polling is poor on Linux (which has forced me to write a lot of workaround code so that I don't lose milliseconds to input). It's also missing some features that I wish I had: real support for different keyboard layouts (AZERTY, etc), more extensive device information (so that profiles can automatically be loaded for Xbox controller vs. Thrustmaster stick, etc.), condition variables, system information, and so on. When I started poking about with SDL, it didn't take long to realize that it had basically everything that I wished for and more. The only thing it doesn't do is render text. But I'm sure I can handle that, since everybody and their mother has written a tutorial on using the FreeType library :roll:

So today, I've dabbled a little in understanding the library and think it would probably only take a few hours to replace SFML with SDL in the engine. It's dead-easy to understand since it's a C API. I won't do this today, nor will I do it immediately, but I'll start replacing pieces over time. It might sound like a big task, but luckily I've done a good job of isolating dependencies on external libraries, so I actually won't have to change much code :)

One tiny interesting tidbit: someone asked a long time ago whether LT would support joystick / gamepad rumble / haptic feedback. I said no because SFML does not support it. But SDL does, and I successfully wrote a quick test program today to make my gamepad rumble. So cool :D That'll be so much fun to feel when I accelerate and fire weapons!
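
For reference, a quick rumble test with SDL2 looks more or less like this (my own sketch, not the exact program I wrote; the strength and duration are arbitrary):

```cpp
#include <SDL.h>

int main(int, char**) {
    if (SDL_Init(SDL_INIT_JOYSTICK | SDL_INIT_HAPTIC) != 0) return 1;
    if (SDL_NumJoysticks() < 1) { SDL_Quit(); return 1; }

    SDL_Joystick* pad  = SDL_JoystickOpen(0);
    SDL_Haptic* haptic = SDL_HapticOpenFromJoystick(pad);
    if (haptic && SDL_HapticRumbleInit(haptic) == 0) {
        SDL_HapticRumblePlay(haptic, 0.75f, 500);   // 75% strength for 500 ms
        SDL_Delay(600);                             // let it finish buzzing
    }

    if (haptic) SDL_HapticClose(haptic);
    SDL_JoystickClose(pad);
    SDL_Quit();
    return 0;
}
```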

Alright, here's to hoping that, somehow, this day contains more hours than yesterday :crazy:
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#6
Friday, January 24, 2014

In. The. Zone.

Ridiculous. Absolutely ridiculous to do so much work in one day :crazy: But I'm not even tapped out yet. Bedtime is long past but I'm still too excited to rest. Things are moving so quickly, everything is coming together. Finally.

The HUD is glorious. Well, at least I think so :) It's better than I ever imagined. I finally figured the 3D projection out, after playing with several different techniques. The winner is: projection to a cylinder. The distortion is subtle enough to not get in your way, but still makes a large perceptual difference. I believe this interface, because my mind knows that it's actually painted on a 3D surface, not just slapped onto the screen. But it's even better than that, because there's a distinct 'visor' or 'viewscreen' feeling to having it projected onto a cylinder. I can almost believe that I'm looking at a holographic viewscreen overlaid on a large window in my spaceship. Immersion++ :) So much potential for pushing things further as well...like I said, this is just the beginning. A cylindrical HUD would be conducive to swiveling your head and looking at other pieces of the display that are hidden from your forward view (props to Gazz for the idea of being able to "look around" your HUD). Introducing a separation between head angle and world camera angle will be fantastic for Oculus Rift head motion and joystick hats. The math is absolutely too simple :geek:
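
(For the math-curious, the mapping is roughly the following; this is my own sketch of the general technique, not the engine's code. A flat HUD coordinate gets wrapped onto a cylinder around the camera, then perspective-projected back to the screen, which is where the subtle 'visor' curvature comes from.)

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// hudX, hudY in [-1, 1]; halfAngle is how far around the cylinder the HUD
// wraps; radius is the cylinder's distance from the eye.
Vec3 HudToCylinder(float hudX, float hudY, float halfAngle, float radius) {
    float theta = hudX * halfAngle;                // wrap x around the cylinder
    return { radius * std::sin(theta),
             hudY * radius * std::tan(halfAngle),  // keep vertical scale comparable
             -radius * std::cos(theta) };          // camera looks down -z
}

// Standard perspective projection back to normalized screen coordinates.
void ProjectToScreen(const Vec3& p, float focalLength, float& sx, float& sy) {
    sx = focalLength * p.x / -p.z;
    sy = focalLength * p.y / -p.z;
}
```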

HUD postprocessing. Amazing what we can do since the HUD is drawn through a real render pipeline now. I only scratched the surface earlier, and I'm still only scratching the surface. But after today we have subtle, glowing noise. Subtle distortion lines. A bit of chromatic aberration. An occasional rolling flicker that brings the interface out of phase briefly and back again, giving a lovely impression of the interface being 'alive' (definitely something that we could wire up to the ship's integrity so that your HUD starts glitching out when your ship is failing). A handful of effects makes all the difference in the world.
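
Two of those effects, per-pixel, look roughly like this (a CPU-side sketch with a stand-in sampler and invented names; the real versions would be fragment shader code):

```cpp
#include <cmath>

struct Color { float r, g, b, a; };

// Stand-in for sampling the HUD's rendered texture at UV (u, v).
Color SampleHUD(float u, float v) { return { u, v, 0.5f, 1.0f }; }

Color PostProcessHUD(float u, float v, float time) {
    // Chromatic aberration: fetch each channel at a slightly shifted position,
    // stronger toward the edges of the screen.
    float shift = 0.004f * (u - 0.5f);
    Color c;
    c.r = SampleHUD(u + shift, v).r;
    c.g = SampleHUD(u, v).g;
    c.b = SampleHUD(u - shift, v).b;
    c.a = SampleHUD(u, v).a;

    // Rolling flicker: a narrow band sweeps down the screen and briefly dims
    // the interface, as if it drops out of phase (easy to wire to ship integrity).
    float band = std::fmod(time * 0.2f, 1.0f);
    if (std::fabs(v - band) < 0.01f) { c.r *= 0.6f; c.g *= 0.6f; c.b *= 0.6f; }
    return c;
}
```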

So much other stuff done too. But so much left to do. Can't go to sleep. You know, sometimes you just get in that state of mind where all the neurons are firing in the right patterns and every minute of work is as productive as ten normal ones. So if I can just stay awake for another day or so, I'll be fitting in two weeks of work before the update ;) :ghost:
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#7
Saturday, January 25, 2014

Sadly, I didn't succeed in staying up all day :( That's ok, it was still a productive one!

It's almost too easy to build stuff with the node UI + HUD + new node renderer :) Pretty much any widget I can dream up that uses simple shapes, text, and simple animation can be cooked up in minutes. I'm working on some new basic shapes for the node renderer, like bars and arcs and such. These are all really easy and fun to whip up. It'll be a lot of fun to build bunches of helpful little things that you can install in your HUD if you so please. Fuzzy dice widget, anyone? :P
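
As an example of how simple these primitives are, an arc widget is basically just this (my own sketch, not the node renderer's real API):

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Tessellate an arc (think shield / health rings) into a line strip.
std::vector<Vec2> TessellateArc(Vec2 center, float radius,
                                float startAngle, float endAngle, int segments) {
    std::vector<Vec2> points;
    points.reserve(segments + 1);
    for (int i = 0; i <= segments; ++i) {
        float t = startAngle + (endAngle - startAngle) * i / segments;
        points.push_back({ center.x + radius * std::cos(t),
                           center.y + radius * std::sin(t) });
    }
    return points;
}

// A bar is even simpler: a unit quad scaled by (value / maxValue) along one axis.
```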

I started on several of the "real" HUD nodes today, including the notification display, the center reticle, as well as some abstract bar graph thingy at the bottom of the screen (hey, abstract bar graph thingies have lots of potential uses you know :monkey: ). I'm expanding the repertoire cautiously right now, because I know that the infrastructure needs to be perfect before I explode into a flurry of widget-building. I made the mistake of expanding too aggressively with interface content many, many times before (not the least of which was during the LTP development), and each time I have regretted not perfecting the architecture before doing so. The node UI is really close to perfection...

...but I'm still not at 100% understanding with respect to how to handle all of the 2D / 3D / view / camera stuff that's going on. It's a fairly complex situation since the interface is 3D, projected onto a 3D surface, and using a different camera angle for every nodal viewport. That's a lot of different coordinate spaces flying around, and even with all the practice I've had in that stuff, the conceptual load of it all can still be a bit daunting :crazy: I think I'm in need of one more sweeping simplification :think: Not sure what it'll be but I do hope it hits me sooner rather than later :|

It's worth mentioning that Graphics Josh came up with a cute little trick today for making an ultra-immersive mode for the interface. He calls it "glass HUD." You'll probably never guess what it's supposed to look like... :P :roll: But it's an awesome little effect that maintains the visibility of the HUD (..mostly) while blending in really well with the world. I'll probably be playing with this mode most of the time...since I care more about the world than the UI ;) I think the minimalists among you will really love it!!

Onward and upward! :wave:

---

PS ~ Interesting mini-story concerning the writing of this dev log. I was typing it while lying down on the couch, which is a seriously-dangerous affair, since the couch also serves as my bed. I almost dozed off in the middle of writing, but...being dedicated to providing at least semi-timely logs, I kept my hand on the keyboard and methodically pressed back and forth on the arrow keys to keep a grasp on consciousness while my mind tried to slip into a dream. It worked better than I expected!! After a few weird daydreams, my brain finally gave up and returned consciousness to me, with an extra boost of energy to boot! Weird, interesting, and energizing...all at once! Try it sometime - keep mashing your keyboard while you're dozing off :D :ghost:
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford

Re: Week of January 19, 2014

#8
Summary of the Week of January 19, 2014
  • Implemented per-object simulation interpolation, leading to a major reduction in frame time variance and a major increase in smoothness! :clap:
  • Implemented UI compositing with render chain
  • Implemented 3D HUD projection
  • Implemented sub-interfaces in the nodal UI
  • Had a lot of fun with post-processing effects on the HUD :D
  • Implemented GlassHUD....for the minimalist in all of us
“Whether you think you can, or you think you can't--you're right.” ~ Henry Ford
