Are you on a laptop? Take a look at disabling the USB power saver settings. I've heard those can cause big problems, but I'm honestly not sure how to fix them; I think it varies from laptop to laptop, but it might be findable in the BIOS somewhere.
I've been trying to get my desktop running as a server of my own, to understand that part of things better. At this point I'm expecting less trouble setting that up than I've had with Ubuntu's seeming disdain for wireless internet.
I understand it: Ethernet is faster, there are hundreds of wireless adapters out there, and no one writes Linux drivers because it's an extra thing to support, but arglblgergkguhgff.
Anyway - the reason I was trying to get regular OpenGL to work is that it would let us do some nice animations, like circles representing arguments opening up into rectangles with rounded edges. I assumed, erroneously, that OpenGL would be well suited to vector-like graphics. It's not. We probably just want to go with minimal animations at this point and get a prototype out the door :\
There's probably a way to do it using regular OpenGL. I should probably learn OpenGL ES, or at least parts of it, because yeah, it's a lot more flexible to be able to do these things yourself without being tied to a library. I definitely wasn't trying to discourage you from that if you already have the knowledge; I just got excited seeing a toolset I've used and more or less understand.
You could do the same sort of thing with PolygonSprite as far as I can see: use getVertices() and a tweening method to move each vertex out towards your target shape, creating and destroying new sprites each update. Punch a box-shaped hole in the polygon (or decompose it into two polygons joined at their edges if the triangulation can't handle holes) and you have a border. Or leave it solid and source the texture from the node's text data, and you have the interior. Playing around with TextureRegions, you should be able to make the border maintain rotation and scale, so you avoid weird distortions if you're using a meaningful pattern like wood or something like that.
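To make the vertex-tweening part concrete, here's a rough sketch of what I mean, assuming libGDX. The ShapeTween name, the smooth interpolation curve, and re-triangulating with EarClippingTriangulator every frame are all my own choices rather than anything PolygonSprite forces on you, and both outlines have to have the same number of vertices for the one-to-one tween to work:

import com.badlogic.gdx.graphics.g2d.PolygonRegion;
import com.badlogic.gdx.graphics.g2d.PolygonSprite;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.math.EarClippingTriangulator;
import com.badlogic.gdx.math.Interpolation;

/** Tweens a flat outline (x,y pairs) from a start shape towards a target shape
 *  and wraps the result in a fresh PolygonSprite each frame. Both outlines
 *  must have the same number of vertices. */
public class ShapeTween {
    private final TextureRegion texture;
    private final float[] start, target, current;
    private final EarClippingTriangulator triangulator = new EarClippingTriangulator();
    private final float duration;
    private float elapsed;

    public ShapeTween(TextureRegion texture, float[] start, float[] target, float duration) {
        this.texture = texture;
        this.start = start;
        this.target = target;
        this.current = new float[start.length];
        this.duration = duration;
    }

    /** Steps the tween by delta seconds and returns a new sprite for this frame;
     *  the previous frame's sprite is simply dropped and garbage collected. */
    public PolygonSprite update(float delta) {
        elapsed = Math.min(elapsed + delta, duration);
        float alpha = Interpolation.smooth.apply(elapsed / duration);
        for (int i = 0; i < current.length; i++) {
            current[i] = start[i] + (target[i] - start[i]) * alpha;
        }
        // Re-triangulate the interpolated outline and build the new sprite from it.
        short[] triangles = triangulator.computeTriangles(current).toArray();
        return new PolygonSprite(new PolygonRegion(texture, current.clone(), triangles));
    }
}

Then each render() you'd just do something like tween.update(Gdx.graphics.getDeltaTime()).draw(polygonSpriteBatch). Rebuilding the region every frame is a bit wasteful, but it matches the make-and-destroy approach above and should be fine for a handful of nodes in a prototype; if garbage churn ever becomes a problem, the same idea works by keeping one mesh and updating its vertices in place.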
But yeah, if nothing else, it's completely possible to drop GDX's classes into something that works directly with OpenGL to build closer to the metal, or I wouldn't be suggesting this.
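As a minimal sketch of that mixing (again assuming libGDX): raw GL ES calls go through Gdx.gl20 right alongside the batch classes. The texture filename and the hexagon outline here are just placeholders:

import com.badlogic.gdx.ApplicationAdapter;
import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.graphics.GL20;
import com.badlogic.gdx.graphics.Texture;
import com.badlogic.gdx.graphics.g2d.PolygonRegion;
import com.badlogic.gdx.graphics.g2d.PolygonSprite;
import com.badlogic.gdx.graphics.g2d.PolygonSpriteBatch;
import com.badlogic.gdx.graphics.g2d.TextureRegion;
import com.badlogic.gdx.math.EarClippingTriangulator;

public class MixedGlApp extends ApplicationAdapter {
    private PolygonSpriteBatch batch;
    private Texture texture;
    private PolygonSprite sprite;

    @Override
    public void create() {
        batch = new PolygonSpriteBatch();
        texture = new Texture("wood.png"); // placeholder asset name
        // A simple hexagonal node body, triangulated on the fly.
        float[] outline = { 100, 100, 300, 100, 350, 200, 300, 300, 100, 300, 50, 200 };
        short[] tris = new EarClippingTriangulator().computeTriangles(outline).toArray();
        sprite = new PolygonSprite(new PolygonRegion(new TextureRegion(texture), outline, tris));
    }

    @Override
    public void render() {
        // Raw GL ES state calls through Gdx.gl20, sitting right next to GDX's batch classes.
        Gdx.gl20.glClearColor(0.1f, 0.1f, 0.1f, 1f);
        Gdx.gl20.glClear(GL20.GL_COLOR_BUFFER_BIT);
        Gdx.gl20.glEnable(GL20.GL_BLEND);
        Gdx.gl20.glBlendFunc(GL20.GL_SRC_ALPHA, GL20.GL_ONE_MINUS_SRC_ALPHA);

        batch.begin();
        sprite.draw(batch);
        batch.end();
    }

    @Override
    public void dispose() {
        batch.dispose();
        texture.dispose();
    }
}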