I've been working with my own system using CPU-managed hitdefs (rectangles for mouse events), since I didn't want to redraw everything for the GL feedback buffer...
(basically I can do all the hitdef position management once during the frame loop)

now that modern gl has come into play, I would like to build a better management system that does the event processing asynchronously from the frame using the GPU...

but while knowing this is possible, I have no idea where to start and what to research without getting confused...

does anyone know the "proper" approach to building a modern asynchronous GPU-based UI manager??
(for a specific language, I guess C will do)

10 Months
Discussion Span
Last Post by DarkPikachu


You may have to write more about your target system and most of all, the end goal.

I've seen folk spend far too much time worrying about CPU cycles when the product delivery date had already passed. At that point it's time to stop counting and start delivering a working system and work the CPU/GPU load later. Some engineers melt down and you may have to tell them the doors will close if we don't ship. They may be fine with that so your next stop is HR.

So what is the proper approach? I think "make it work" is the first goal as well as what else you can do to meet your Product Requirements.

About half the time I'm in on a project, the requirements were either not done, or were too short or too long.

For example, the debate over the UI can be the UI must have 3D or a skeuomorph design. "Apple officially shifted from skeuomorphism to a more simplified design" and Microsoft went all 3D and now has moved to a flat style.

What you call modern has to be well defined. Today the designs I see are mostly simple and flat.


heh, when I say "modern" I'm not referring to style in any sense (personally I think the modern flat style is disgusting, but that's just me) ;)

anyways, I've never really been limited by time, it's my enemy tbqh, programming is just a hobby for me so I'm as free as I need to be, but I care more about quality than anything so yeh...
I got time, hit me ;)

basically what I want to do is stop using the disgusting, outdated FFP, and the only thing holding me back is my CPU-based hitdefs...
I might be able to use the GL feedback buffer if I can tie it in asynchronously with the position updates, but I really want a single-pass update for everything, whereas the feedback buffer is strictly two-pass, at least with the FFP...

if I had to guess, I'd have to send the mouse click, position, and velocity to a particular buffer (or buffers) defined for that...
idrk what I'm doing here, but idk what to research that focuses on my needs :/


FFP? https://en.wikipedia.org/wiki/Fresh_frozen_plasma was the first hit, and while I've worked on embedded systems for decades, our systems still range from plain old character interfaces (show a list, ask for a number from the list) to really nice polished graphics.

Fast forward to today and we haven't built a GUI from scratch now that we have so many ready-to-use systems (small? look at Raspberry Pis).

Style is one area where I've learned to throw handcuffs on the dev team fast, as they can go out of control trying interface after interface, draining both time and budget.

By now you get the idea I mostly manage these projects and the proper way is set by the project owner. Said project owner must get the screens and how they expect it to work into the requirements document first. If they don't, projects rarely go well.


I thought everyone knew when referring to OpenGL's ancient FFP that it stood for Fixed-Function Pipeline. :/

I'm trying to drop this in favor of a more modern approach with buffers...

but IDK how to interface the event system with the modern stuff...

also something I forgot to mention, for a Target system, as long as it can handle buffers it'll be supported.

btw sorry if my wording is confusing you...
for one I'm on a phone (no internet yet), and for two, it's a side effect of my particular autism (I confuse myself while thinking and end up with a descriptive mess of a sentence which I don't realize is a mess until much later)...


My interface with OpenGL is mostly at a high level. For the Raspberry, once we determine to code and use OpenGL, we rarely have to go very deep.

I get the feeling you are working on some stripped single board computer or chip and working at the barest of bare metal systems.

At the office we've done such before, but the cost is usually too high on the software team, so we pay a little more for hardware (already noted, the Raspberry) and this cuts the cost of the software. Some call this heresy and some maintain we should all write in assembly so we don't waste CPU cycles or RAM.

Yes, the years have marched on and we rarely see a sparse target system anymore. I do want to note that I know sparse as my smallest solution was an Atmel which had no RAM. Just a few registers and a few kilobytes of flash memory. Few on the team write assembler so I took that job and loved it.

Back to topic. I see you understand that "modern" is up for grabs as to definition and it's deeply personal.


honestly I don't really have a target system... as long as it's capable of running OpenGL 3x (I know my older Pentium 4 is), it should be able to work...

in any case I don't need to go deep for hardware support...
it's rather real-time interface support I'm trying to achieve...
in any case, I'm trying to build the UI under a modern gl system.
(there's only 1 use case of the word "modern" in modern-gl with shaders, so it's really easy to understand)

but idk how to design a UI under modern gl...

about the only thing I know how to do with modern gl is transformation on the GPU via a bone/matrix array...
(I need something better since data needs to be synced with the CPU interface, but doesn't use per-frame bandwidth)
^ I want to update data on the GPU, not obtain from the CPU, but that's for another topic... :P
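(for the transform-array part specifically: a per-widget matrix block in a UBO keeps the data resident on the GPU, so the CPU only touches it when a widget actually moves instead of re-sending everything per frame. a vertex-shader sketch, assuming a GL 3.3 core context — the block layout, array cap, and all names here are my own, not from any particular engine:)

```glsl
#version 330 core

// one unit quad, instanced once per widget
layout(location = 0) in vec2 a_corner;      // (0,0)..(1,1)

// per-widget transforms live on the GPU; the CPU only updates
// this block (glBufferSubData) when a widget actually moves
layout(std140) uniform WidgetTransforms {
    mat4 u_xform[256];                      // hypothetical fixed cap
};

flat out int v_widget_id;

void main()
{
    v_widget_id = gl_InstanceID;
    gl_Position = u_xform[gl_InstanceID] * vec4(a_corner, 0.0, 1.0);
}
```

one glDrawArraysInstanced call then draws the whole UI, with no per-frame vertex re-upload.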

I just need to know how to build a GPU-based UI...
I think passing events to the GPU is too much work, but idk what the middle point would be...


ohhhh boy... haha
while that looks interesting AF, I'm more used to doing everything myself...
I mean, if it was released as a free, non-trial, offline .deb installer, I'd give it a try, but all in all, it seems a bit much for all I really need to do...

you might laugh at me when I say my current main focus is Python 3.4 for a long list of reasons... heh
but basically what I'm doing is building my own engine, and all I really want to know is the modern gl code that operates on the buffers a UI would be built upon, or rather how a UI itself would be built...
all I really need is a good source of non-confusing documentation... :P

basically what NeHe was, but for something like GLFW+FT (or at least SDL) that wasn't based on the old FFP...

I'm asking in general, because my engine is non-standard... kinda like Blender, if you might know how its code works...

Blender's code is a bit much for me to understand though... I just need a simple example of how things should be done.

also, sorry for posting at 4AM for you. XD
for me it was around 7AM, and after sleeping on it, I kinda knew how to better describe myself :)

but yeah, I'm more concerned about just building a UI atm, focusing on usability and style later, because when you're targeting noobs, you really need to go for what they're looking for out of the many mirrored standards you're trying to follow.

yeah it's kinda filth, but I'm making it work well and be nice to use :)
it won't be what Blender started out as, that's for sure. ;)

I reference Blender so much because what I'm working on is designed similar, but simpler and more verbose (kinda contradictory, but it's to the point of what my program needs to be)


I see where there are a few paths to make a GUI, but before we get to the nuts and bolts (which I can't, due to the sheer magnitude of code in even a stripped-down lightweight manager like OpenBox), we always have to pick our goals on a project.

My take on your question was the target is some small resource limited single board or chip system. SOC systems (system on a chip) are now plentiful and I use the Raspberry as a starting point about how the world has changed so much from just a decade ago.

So here we are. You have some ideas to kick around and there is no one true path when it comes to what you are doing. Here the TTM (time to market) is always short so we have to push a little more into the hardware so the software can be delivered in time.

I wonder if you were around in the 90's when, say, Borland's VGA graphics were used to make primitive GUIs? https://www.thefreecountry.com/sourcecode/gui.shtml is just a start on that history.


heh, well I was born in '91, so I wonder if that counts ;)

but yeah I'm not really focused on a target system...
I can't afford an R-Pi, but I do happen to have a random 400MHz Celeron (P II) with 384MB RAM running WinXP Black lying around, as well as a few 600 and 800MHz (P III) CPUs if I ever need something similar ;P

but yeah, one thing's for certain: I knew there would be a lot of code to write for just an example, that's why I was asking for something existing ;)
like seriously, the best example I have to go by right now related to modern gl is the iqm GPU demo.
getting into extending that with a modern UI using mostly the GPU would just be perplexing...

I'm looking into those libs to see if I can find anything modern-level using less CPU.


That modern look is up to you to define. To me BeOS was great stuff, and Haiku keeps it alive as open source.
If I wanted to run something light on old gear it would be a LOAF distro or another OS that could use OpenBox.

While you may be kicking around writing all this from scratch, that's a lot of years till you get something complete.


not once have I talked about "modern looking" other than to respond that I think it's disgusting. :P
I'm not referring to a modern UI in that sense...

if it helps, here's one of the few things I'm working on using the FFP-based UI I've been talking about:

I want to know what to do to move that away from the FFP onto modern gl??

btw, I know it looks "modern"...
right now it's just a very basic and very slow system...
I will be improving it and it won't look so noobish once I'm done.

Edited by DarkPikachu


The reason the modern UI is in the discussion is that it's in your topic title. Since GL is almost always OpenGL, and I have to think you are not going to re-write GL, I read your topic title as asking about a modern UI that you wrote calling GL.

That said, your question "I want to know what to do to move that away from the FFP onto modern gl??" is something I can't comment on except you begin the research and work.

That is, while I have worked on UIs over the years on very small systems (SoCs today are more common), we rarely re-write the libraries.
As such I don't think I can help you here, but will share what we do today.


the topic title states "modern gl" which for anyone knowledgeable means I don't intend to use the FFP commonly used in NeHe examples.

granted NeHe examples also use GLUT which also isn't modern...
SDL2 is the modern version of that, though I prefer GLFW+FT since there's no bulk. :P

I don't intend to write any libraries, all I want to do is build a UI with modern gl.
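(on the single-pass point from earlier: with multiple render targets, the visible color and the widget ID can be written in the same draw call, so picking doesn't need a second pass at all. a fragment-shader sketch, assuming a GL 3.3 core context and an integer (GL_R32I) second attachment — all names here are mine:)

```glsl
#version 330 core

flat in int  v_widget_id;   // from the vertex shader, e.g. gl_InstanceID
in      vec4 v_color;

// attachment 0: what the user sees
// attachment 1: an integer ID texture the CPU can read the cursor
// pixel from -- drawing and picking happen in the same pass
layout(location = 0) out vec4 frag_color;
layout(location = 1) out int  frag_id;

void main()
{
    frag_color = v_color;
    frag_id    = v_widget_id;
}
```

the FBO just needs both attachments listed in glDrawBuffers; the ID attachment is then the thing you read back asynchronously.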


English can be a bear. I read it as a question about a modern UI using GL. My mistake. Since the UI can be modern, and no one has chatted about modern GL, I read it differently and apologize for that.

To me there is OpenGL 2.x and 3.x. So if you want to start again, try keeping your code strictly to the latest OpenGL. Would that be modern GL?


GL 3.x+ is usually what's meant by "modern gl", but even 4.x still carries the FFP in the compatibility profile (the core profile drops it)...

basically, as I've come to know OpenGL, you're either using the FFP, or you're using modern buffers and shaders...

although I do believe there's a difference there as well, since 4.x has had some improvements... not sure though...

Minecraft doesn't even use modern gl...
Optifine HD does though, but I don't know how well...
and even then, a lot of mods don't use modern gl.
I'll even admit, I still prefer FFP-based display lists for FT-fonts instead of modern texture atlases :P

how this relates to the op is because I don't know how to move a UI away from the slow FFP.

er, sorry, it's not the FFP itself that's slow, it's the per-frame work needed to update both the rectangles and the hitdefs...

basically I want to move away from my hitdefs, which tie me to the FFP, and figure out how to do everything under a more modern approach...

Edited by DarkPikachu


Here's something I learned long ago. Back when I first started doing a bit of 3D programming I tried without a GPU but just onboard graphics. It showed the usual onboard graphics were pretty pathetic.

OpenGL from what I see will leverage the GPU if it's supported.

While I didn't want to point out the topic you lead with, English again misled me as to what you were asking. Your topic ends with modern gl ui. So I read it in English as adjective, adjective, and the last word being the thing we were going to discuss (the UI).
Again I apologize I took that as an English reading rather than French where you lead with what you want to talk about. "Cow brown" vs. brown cow.

So again I apologize for thinking all about the UI and hope you can get others to pile in here.

As to FFP I re-read where it's deprecated but still in the latest OpenGL as they don't want to break everything at once.

Maybe you'll get more responders over at that gamedev forum.


yeah I understand English can be a B ;)

but as for the other gamedev forum, I've tried asking various questions on various gamedev forums without ever really getting anywhere...
I'd prowl the net again if I wasn't exclusively restricted to this dumb phone, but I'm still waiting for internet...
as for right now, this forum (daniweb) has given me the most response to most of my perplexing questions, with Gribouillis receiving most of the credit for answering them ;)
unfortunately I don't think he could help me in this situation... heh

thanks for trying though, and sorry for the misunderstanding ;)

I really should have used a TNG reference as to sentence structure. Tea, Earl Grey, Hot.