Editor: DonovanBaarda
Time: 2014/10/28 15:11:44 GMT+11
I just want to run Qt5's qtcreator to do UI work on OpenTrack. The problem is my current development platforms are a windows 7 box and a Debian linux box.
It turns out qtcreator is a graphical IDE, and thus needs more than an ssh terminal to run. So here's what I tried:
Install cygwin on the windows 7 box. I can run X11, ssh to the linux box, and run qtcreator there! Except it doesn't work::

    $ qtcreator
    libGL error: failed to load driver: swrast
    <...it just hangs with nothing happening, and needs ^C to exit>
    $ export LIBGL_DEBUG=verbose
    $ qtcreator
    libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/tls/swrast_dri.so
    libGL: OpenDriver: trying /usr/lib/x86_64-linux-gnu/dri/swrast_dri.so
    libGL: driver does not expose __driDriverGetExtensions_swrast(): /usr/lib/x86_64-linux-gnu/dri/swrast_dri.so: undefined symbol: __driDriverGetExtensions_swrast
    libGL: Can't open configuration file /home/abo/.drirc: No such file or directory.
    libGL: Can't open configuration file /home/abo/.drirc: No such file or directory.
    libGL error: failed to load driver: swrast
    function is no-op
    <... nothing again, needs another ^C>
    $ export LIBGL_ALWAYS_INDIRECT=1
    $ qtcreator
    function is no-op
    <... and still nothing...^C>
It's just not working at all. Thinking maybe cygwin's GLX stuff is dodgy, I ran glxgears. It creates the window with the gears, but then proceeds to claim 50~2000 fps with no visible changes in the gears window at all. That's with and without LIBGL_ALWAYS_INDIRECT=1. So maybe the GLX stuff is a bit dodgy.
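For reference, here is a rough recipe for probing what GLX support an x-server offers, run from the app-server side. It assumes Debian's mesa-utils package is installed; the DISPLAY value is a placeholder for whatever your ssh session actually sets::

    # Probe the x-server's GLX support from the app-server side.
    # glxinfo and glxgears come from the Debian mesa-utils package.
    export DISPLAY=localhost:10.0    # placeholder: use your real display

    # Does the x-server advertise GLX, and which renderer do we get?
    glxinfo | grep -iE "direct rendering|opengl renderer|glx version"

    # Trace exactly which DRI driver libGL tries to load and why it fails:
    LIBGL_DEBUG=verbose glxgears

    # Force indirect (GLX) rendering, skipping the local DRI drivers:
    LIBGL_ALWAYS_INDIRECT=1 glxgears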
Install xrdp, tightvnc, and LXDE on the linux box. Run windows remote desktop to access it and get a whole linux desktop, running on the linux box, in a window on the windows box. It doesn't work there either::

    $ qtcreator
    Qt: XKEYBOARD extension not present on the X server.
    Xlib: extension "RANDR" missing on display ":10.0".
    Could not initialize GLX
    Aborted
    <... hey, at least it crashes instead of hangs.>
Note LIBGL_DEBUG=verbose and LIBGL_ALWAYS_INDIRECT=1 had no effect. A search of the internet suggested tightvnc has no GLX support. It's unclear if the alternative vnc4server does, with some bugs/posts suggesting it once did but now doesn't. The latest thing everyone recommends is TurboVNC with VirtualGL for hardware rendering, but it's commercial with no Debian debs. I tried vnc4server and it didn't help.
I guess that "XKEYBOARD extension not present" warning was serious.
If anyone has any hints why so many of these attempts didn't work, I'd love to hear them.
UPDATE: Fixed!
--------------

After doing a heap of reading I've fixed this and figured out why this didn't work. It's all to do with the history of X. First, some terminology, since X has done an excellent job of confusing the terms "client" and "server":
app-server
    Where the applications run. These applications talk to an x-server for their display.

x-server
    The X display. Note this is usually the machine with the monitor attached, but VNC confuses this a bit.

vnc-server
    The thing that converts X into VNC. It is always on the same machine as the x-server, and often also **is** the x-server, but some vnc-servers work by screen-scraping a real x-server.

vnc-client
    The VNC display. This is pretty much always where the monitor is attached.
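To make the roles concrete, this is what a typical remote-X session looks like in those terms (hostname and username are made up for illustration)::

    # Sitting at the machine with the monitor (the x-server), ssh with X
    # forwarding into the machine where the applications run (the
    # app-server):
    ssh -X abo@app-server.example.com

    # Now, on the app-server, X applications ("clients" in X-speak)
    # render back over the ssh tunnel to the x-server in front of you:
    echo $DISPLAY     # typically something like localhost:10.0
    qtcreator &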
`GLX <http://en.wikipedia.org/wiki/GLX>`_
+++++++++++++++++++++++++++++++++++++++++++++

Also known as "indirect rendering". When accelerated 3D was first added to X, it worked by sending OpenGL from the app-server to the x-server, and the x-server would do the 3D rendering with its own 3D hardware. This matched the common hardware arrangement, where the x-servers have the fancy video cards and the app-servers are headless. It also scales well, since each x-server adds its own rendering hardware. The problem is this requires the x-server to understand the extra GLX extensions, and the app-server side to support using them. Not all 3D hardware is equal, so the app-server and x-server have to negotiate what extensions exist and try to work with what's available. Sometimes they fail to negotiate enough common functionality to give working 3D at all. Also, some extensions are nearly impossible to make work when the app-server and x-server are not on the same machine.
`Virtual GL <http://en.wikipedia.org/wiki/VirtualGL>`_
++++++++++++++++++++++++++++++++++++++++++++++++++++++++

This is a newer approach that does the hardware rendering on the app-server side. It works by intercepting the GLX calls on the app-server, rendering them (maybe on yet another 3D-render-server?), and then sending the rendered results to the x-server. Sending the rendered results is more efficient than sending the OpenGL (note: this was probably not true in the past, when 3D scenes were simpler), and doing the rendering on the app-server allows fancy stuff that can't be done remotely. This means the x-server doesn't need to understand 3D at all, but the app-server does. It allows you to efficiently share centralized rendering hardware on the app-server between multiple x-servers, but it also means it doesn't scale as well without x-servers adding their own rendering hardware. I also bet it's a bitch to make the "share rendering hardware between multiple x-servers" bit work correctly, and I bet much video hardware and many drivers assume they talk to a real screen. AFAIK there are no Debian packages of Virtual GL.
Old School: Software rendering
++++++++++++++++++++++++++++++

Before GLX, 3D had to be done using software rendering (swrast) on the app-server, sending the rendered results to the x-server. Note this is effectively what Virtual GL does; it just does it in a convoluted way, via intercepting GLX, to use hardware rendering. This means the x-server doesn't need to understand GLX extensions, but the app-server side needs to be able to do software rendering. Software rendering avoids any issues with trying to share 3D hardware and/or trick it into rendering to a virtual screen. It's old, but tried and tested.
Adding VNC into the mix
+++++++++++++++++++++++

VNC is a really simple low-level screen/keyboard-over-the-network protocol. It was designed around the idea of screen-scraping and exporting an existing display, and that's how it works on things like Windows. In Debian there is x11vnc, which does this and exports an existing x-server screen. This avoids all the rendering and protocol complications and leaves them up to whatever normally renders the screen. This is one way to get hardware rendering on the app-server for a remote display: run the app-server, x-server, and x11vnc on the same machine with 3D hardware. There might be some minor issues with VNC sending 3D updates, since the hardware rendering might bypass some of the hooks VNC uses to detect screen changes, but VNC polling options can usually work around this.
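A minimal x11vnc invocation along these lines might look like this (assuming the existing x-server is on display :0)::

    # On the one machine with the 3D hardware, a running x-server, and
    # the apps: export the existing screen over VNC.
    x11vnc -display :0 -forever -shared

    # If hardware-rendered 3D bypasses the change-detection hooks, fall
    # back to polling the framebuffer instead of trusting XDAMAGE:
    x11vnc -display :0 -forever -shared -noxdamage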
Using VNC to export an existing x-server doesn't nicely support multiple user logins, since each vnc-client connection shares the same x-server screen. For this reason most people use a vnc-server that is also its own x-server. These are also known as x-proxies, since they translate X from the app-server into VNC for the vnc-client. These are configured to spawn a new x-server and session for each user that connects with a vnc-client. On Debian there are tightvnc and vnc4server. In theory these could do hardware 3D rendering using GLX inside their x-server implementation. I've seen mixed reports that vnc4server is capable of this, but it looks like tightvnc definitely can't. However, even if your vnc-server can, it's normally running on the app-server, which is unlikely to have the fancy 3D hardware you have in the machine the screen is attached to.
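A sketch of the per-user x-proxy workflow with the Debian packages mentioned (the hostname is hypothetical, and the display number depends on what's free)::

    # On the app-server, each user spawns their own x-proxy x-server;
    # vnc4server picks the next free display number (say :1).
    vnc4server -geometry 1280x1024 -depth 24

    # From the machine with the monitor, attach a vnc-client to it:
    xtightvncviewer app-server.example.com:1

    # Kill the per-user session when finished:
    vnc4server -kill :1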
There is also TurboVNC, which is an x-proxy vnc-server designed to work with Virtual GL. This is mostly just an x-proxy vnc-server with GLX support for Virtual GL, but it also includes some optimizations to VNC for sending rendered 3D efficiently. There are no Debian packages of TurboVNC.
Making it work
++++++++++++++

So to make 3D work, you need your app-server and your x-server to either both support GLX well enough, or do the rendering on the app-server side, either in software or using Virtual GL. If you are using VNC, then you either need to use a screen-scraping vnc-server (x11vnc) to scrape an x-server with working GLX, or an x-proxy vnc-server with working GLX (TurboVNC), or do the rendering in the app-server (software rendering).
On Debian, the package libgl1-mesa-glx on the app-server will make all apps using 3D try to talk GLX to their x-server. This is the package that is generally preferred by anything that depends on OpenGL support.
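A couple of commands to check which libGL flavour an app-server actually has installed and will load (package names as on Debian)::

    # Which libGL flavour is installed on the app-server?
    dpkg -l 'libgl1-mesa-*' | grep '^ii'

    # Which libGL.so will a 3D app actually load?
    ldd "$(command -v glxgears)" | grep -i libgl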
Unfortunately it seems that the cygwin x-server doesn't quite support GLX well enough for things like qtcreator or evolution to run with libgl1-mesa-glx. It almost supports it enough for glxgears to run, and it renders the first frame, but then fails to update it. Note that cygwin's x-server does support GLX well enough to run accelerated glxgears when run on the same cygwin machine. It also seems that libgl1-mesa-glx doesn't support falling back to software rendering if GLX negotiation with the x-server fails.
Note Debian's Xorg x-servers do seem to support GLX well enough to run qtcreator, so switching from cygwin to a proper linux desktop will fix this. If you want to stick with cygwin's x-server, you will need to switch your app-server to software rendering.
If you want to use a vnc-client, you either need to run x11vnc (and not have per-user sessions), or use tightvnc or vnc4server and switch your app-server to software rendering.
For software rendering on Debian, you need libgl1-mesa-swx11 installed on your app-server, and this conflicts with libgl1-mesa-glx, or any of the other libgl1-\*-glx supporting libraries. This means your app-server can support either swrast software rendering **or** GLX, but not both.
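On Debian the switch looks something like this (apt will prompt to remove the conflicting package)::

    # Switch the app-server to swrast software rendering; apt will offer
    # to remove the conflicting libgl1-mesa-glx at the same time.
    sudo apt-get install libgl1-mesa-swx11

    # Switching back to GLX indirect rendering later works the same way:
    sudo apt-get install libgl1-mesa-glx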
Note this is a bit crap. If you have a mix of x-servers, some of which support GLX well and others that don't, then ideally the app-server should be able to fall back to software rendering only when required.
There is also a glx-diversions package which supports "plugging in accelerated implementations from GPU vendors via alternatives", but this appears to be more the x-server side than the app-server side.
I suspect some Debian bugs are in order here, but in the tangled web of different packages involved I'm not sure exactly what bugs to file against what.