I have an input binding like this:
mouse_trigger {
    input: MOUSE_BUTTON_LEFT
    action: "touch"
}
touch_trigger {
    input: TOUCH_MULTI
    action: "touch"
}
I’m on a Windows 10 laptop with a touch screen. Neither tapping the screen nor clicking the trackpad results in the “touch” action being dispatched.
If I change my input binding to this:
mouse_trigger {
    input: MOUSE_BUTTON_LEFT
    action: "touch"
}
touch_trigger {
    input: TOUCH_MULTI
    action: "touch1"
}
Then clicking the trackpad correctly dispatches the “touch” action. Tapping the screen also dispatches a “touch” action - so, ok, I guess Windows treats the touchscreen as mouse input, which is fine. (However, the screen taps seem a little flaky - sometimes I don’t get a “touch” action with released=true.)
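With the renamed binding, my workaround is to just accept both action ids in the script. A minimal sketch (the action names match my bindings above; everything else is standard Defold API):

```lua
-- Treat "touch" (mouse trigger) and "touch1" (touch trigger) as the same input.
local ACTION_TOUCH = hash("touch")
local ACTION_TOUCH_MULTI = hash("touch1")

function init(self)
    -- Route input to this script.
    msg.post(".", "acquire_input_focus")
end

function on_input(self, action_id, action)
    if action_id == ACTION_TOUCH or action_id == ACTION_TOUCH_MULTI then
        if action.pressed then
            print("touch down at", action.x, action.y)
        elseif action.released then
            print("touch up")
        end
    end
end
```

This works, but it duplicates the mapping at the script level, which is exactly what I thought the input binding abstraction was supposed to handle for me.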
In any case - why can’t I have both a touch trigger and a mouse trigger dispatch the same action? Isn’t that the whole point of the input binding abstraction? Is this a bug?
I’m using Editor2, but since this concerns action dispatch at runtime, I suspect it’s more of an engine thing.