My current guess is that you don't use virtual-8086 mode but rather emulate it in protected mode (as user code), mostly to run the VESA BIOS int 0x10 handler, much as an assembler interpreter would, in order to set the video mode and ask the BIOS where it has put the framebuffer.
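For what it's worth, here is a rough sketch of what such a call could look like. The regs structure and the bios_int() helper are made-up stand-ins for whatever real-mode interpreter the driver actually uses; the VBE function numbers (0x4f01 "return mode info", 0x4f02 "set mode"), the PhysBasePtr offset, and AX coming back as 0x004f on success are from the VBE spec:

  #include <stdint.h>

  /* Hypothetical register file handed to the real-mode interpreter. */
  struct regs { uint16_t ax, bx, cx, dx, es, di; };

  /* Hypothetical interpreter entry point: runs the BIOS's int handler
     instruction by instruction, returns non-zero if emulation fails. */
  int bios_int(int num, struct regs *r);

  /* Ask the VESA BIOS where the framebuffer of 'mode' lives, then set
     the mode. 'buf' must sit in the low 1 MiB of the emulated address
     space so the BIOS can reach it through es:di. */
  uint32_t set_vbe_mode(uint16_t mode, uint8_t *buf)
  {
      struct regs r = { 0 };

      /* VBE function 0x4f01: write a 256-byte mode-info block to es:di. */
      r.ax = 0x4f01;
      r.cx = mode;
      r.es = (uint16_t)((uintptr_t)buf >> 4);
      r.di = (uint16_t)((uintptr_t)buf & 0xf);
      if (bios_int(0x10, &r) || r.ax != 0x004f)
          return 0;

      /* VBE 2.0+ stores the 32-bit physical framebuffer address
         ('PhysBasePtr') at offset 40 of the mode-info block. */
      uint32_t fb = (uint32_t)buf[40]       | (uint32_t)buf[41] << 8
                  | (uint32_t)buf[42] << 16 | (uint32_t)buf[43] << 24;

      /* VBE function 0x4f02: set the mode; bit 14 asks for the linear
         framebuffer instead of banked VGA windows. */
      r.ax = 0x4f02;
      r.bx = mode | (1 << 14);
      if (bios_int(0x10, &r) || r.ax != 0x004f)
          return 0;

      return fb;
  }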
avl.diff applied.
Much better! From there it should be easy to change the video mode depth to an available one:

[init -> vesa_drv] Found: VESA BIOS version 3.0
[init -> vesa_drv] OEM: Brookdale-G Graphics Chip Accelerated VGA BIOS
[init -> vesa_drv] graphics mode 1024x768@...64... not supported
[init -> vesa_drv] Supported mode list
[init -> vesa_drv] 0x109 132x25@...167...
[init -> vesa_drv] 0x10a 132x43@...167...
[init -> vesa_drv] 0x10b 132x50@...167...
[init -> vesa_drv] 0x10c 132x60@...167...
[init -> vesa_drv] 0x101 640x480@...168...
[init -> vesa_drv] 0x103 800x600@...168...
[init -> vesa_drv] 0x105 1024x768@...168...
[init -> vesa_drv] 0x111 640x480@...64...
[init -> vesa_drv] Searching in default vesa modes
[init -> vesa_drv] Could not set vesa mode 1024x768@...64...
[init -> nitpicker] C++ runtime: Genode::Parent::Service_denied
[init -> nitpicker] void* abort(): abort called
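Concretely, the fallback could look like the sketch below. struct vbe_mode and pick_mode() are invented names; the real driver would fill such a list by calling VBE function 0x4f01 for every mode number the BIOS reported:

  #include <stdint.h>
  #include <stddef.h>

  /* Hypothetical record for one entry of the BIOS's mode list. */
  struct vbe_mode { uint16_t number, w, h; uint8_t depth; };

  /* Pick the requested mode if available, else fall back to any depth
     the BIOS offers at the same resolution. */
  const struct vbe_mode *pick_mode(const struct vbe_mode *list, size_t n,
                                   uint16_t w, uint16_t h, uint8_t depth)
  {
      const struct vbe_mode *fallback = NULL;

      for (size_t i = 0; i < n; i++) {
          if (list[i].w != w || list[i].h != h)
              continue;                  /* wrong resolution */
          if (list[i].depth == depth)
              return &list[i];           /* exact match, use it */
          if (!fallback)
              fallback = &list[i];       /* same resolution, other depth */
      }
      return fallback;                    /* NULL: resolution not offered */
  }

With the log above, it would miss the exact match for 1024x768 at the requested depth and hand back the 0x105 entry instead, so nitpicker would still get a framebuffer rather than the Service_denied abort.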
Thanks, it is quite fun to see a problem like this get fixed by exchanging emails! I guess you don't need the full log, but if you want it, just ask.