Hi! I implemented a touch-screen driver in os/src/input/ft5406 and can get the coordinates x and y. I then use

  _ev_queue.add(Input::Event(Input::Event::MOTION, 0, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::PRESS, Input::BTN_TOUCH, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::RELEASE, Input::BTN_TOUCH, _abs_x, _abs_y, 0, 0));

to add the events to the event queue. Now the question is: how can I pass these events on to Linux? The boot modules are:

  set boot_modules {
    core init timer omap4_gpio_drv omap4_i2c_drv omap4_fb_drv
    input_ft5406_drv sd_card_drv usb_drv nic_bridge nitpicker nit_fb
    part_blk terminal terminal_log l4android root-ginger.gz
  }

Thank you very much!
Hello,
I guess that you are using the 'l4android.run' script as a basis? This run script starts the nitpicker GUI server by default, which gets connected directly to the input driver (i.e., your driver). Do you see the mouse cursor moving according to your touch input? If yes, then you are almost there. ;-)
  _ev_queue.add(Input::Event(Input::Event::MOTION, 0, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::PRESS, Input::BTN_TOUCH, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::RELEASE, Input::BTN_TOUCH, _abs_x, _abs_y, 0, 0));
Those events are passed to the nitpicker GUI server, which, in turn, routes them to one of its clients. The particular client that you are interested in is the instance of 'nit_fb' that is used for displaying Android. If you see the mouse cursor moving but the events are not coming through to Android, they are getting stuck somewhere in this chain. In general, you could instrument those components (nitpicker, nit_fb) to print a message once they receive an input event.
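As a rough sketch, such an instrumentation point could look like this. It is purely hypothetical: it assumes a spot in nitpicker or nit_fb where an 'Input::Event' is at hand, and the 'trace_event' helper name is made up.

  /* hypothetical debug helper, called wherever an Input::Event is received */
  #include <base/printf.h>
  #include <input/event.h>

  static void trace_event(Input::Event const &ev)
  {
    PDBG("event type=%d code=%d ax=%d ay=%d",
         (int)ev.type(), ev.code(), ev.ax(), ev.ay());
  }

If such a message shows up in nitpicker but not in nit_fb (or vice versa), that narrows down where the events get lost.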
I guess the problem lies in the nitpicker GUI server, which normally responds to mouse-button events, in particular the left mouse button. It interprets those events by setting the input-routing policy according to where the user clicked with the mouse. However, you are merely submitting 'BTN_TOUCH' events. So nitpicker won't interpret those. The easiest fix would be to let the driver generate 'BTN_LEFT' events instead of 'BTN_TOUCH' events. This way, nitpicker would see the kind of events it is expecting. Alternatively, we could change nitpicker to also respond to 'BTN_TOUCH' events.
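For illustration, the change in your driver could look like the following sketch. It merely adapts the three calls you posted, assuming the same '_ev_queue', '_abs_x', and '_abs_y':

  _ev_queue.add(Input::Event(Input::Event::MOTION, 0, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::PRESS, Input::BTN_LEFT, _abs_x, _abs_y, 0, 0));
  _ev_queue.add(Input::Event(Input::Event::RELEASE, Input::BTN_LEFT, _abs_x, _abs_y, 0, 0));

'Input::BTN_LEFT' is defined in 'input/keycodes.h' alongside 'Input::BTN_TOUCH', so only the key code changes.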
Cheers Norman
Hi,
On Wed, May 08, 2013 at 12:02:52PM +0200, Norman Feske wrote:
I guess the problem lies in the nitpicker GUI server, which normally responds to mouse-button events, in particular the left mouse button. It interprets those events by setting the input-routing policy according to where the user clicked with the mouse. However, you are merely submitting 'BTN_TOUCH' events. So nitpicker won't interpret those. The easiest fix would be to let the driver generate 'BTN_LEFT' events instead of 'BTN_TOUCH' events. This way, nitpicker would see the kind of events it is expecting. Alternatively, we could change nitpicker to also respond to 'BTN_TOUCH' events.
In order for the Android input system to work correctly, you should report BTN_TOUCH events to Android. On L4Re, the L4Linux fb driver has the option 'l4fb.touchscreen', which makes the fb driver implicitly convert BTN_LEFT events into BTN_TOUCH events.
However, the best solution is for the (native) input driver to conform to Linux's multitouch protocol.
Best, Matthias.