Hi,
I am trying to implement a small map viewer similar to what leaflet does in HTML5 + JavaScript.
So far preliminary tests for rendering are ok.
The issue I face is in the way scroll events are handled. These can be generated by the mouse wheel or by a touch screen alike.
The action to apply differs depending on the source:
- on scroll with mouse wheel, the map zooms in and out.
- on scroll with touch gesture, the map pans in some direction (zoom on touch with the usual zoom gesture is handled by zoom events).
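To make the intent concrete, here is a minimal sketch of the dispatch I am aiming for, assuming the source could somehow be identified. `zoomBy` and `panBy` are hypothetical helpers on my map object, not real API:

```javascript
// Classify a scroll-like event by its type. "wheel" events come from a
// mouse wheel or trackpad; "touchmove" events come from a finger drag.
function classifyScrollSource(evt) {
  if (evt.type === 'wheel') return 'wheel';
  if (evt.type === 'touchmove') return 'touch';
  return 'unknown';
}

// Intended wiring (browser only; map.zoomBy / map.panBy are hypothetical):
//
// mapEl.addEventListener('wheel', evt => {
//   evt.preventDefault();
//   map.zoomBy(-Math.sign(evt.deltaY));   // wheel => zoom in/out
// });
//
// mapEl.addEventListener('touchmove', evt => {
//   map.panBy(/* delta from the previous touch position */);  // touch => pan
// });
```

The problem is that this only works when the two input paths arrive as distinct event types, which is exactly what I cannot rely on.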
Usually on mobile you do not have a mouse attached, so the scroll event can only come from the touch screen.
But on desktop it can come from both (we have a couple of touch screens around for data input, and recent laptops come with touch screens).
In the API, there does not seem to be a way to differentiate the source of the event, i.e. to know whether it comes from the mouse wheel or the touch screen.
Any clue?