Hi,
Thanks for this useful project. Apologies for the lack of clarity, but this is more of a question and potentially a feature request. I have successfully integrated this library and can drag elements (e.g. `Image`s) around as outlined by this project.
However, I also want to provide my own complex `TouchHandler` logic on the canvas to allow other types of user interaction besides drag animations, such as free-form drawing.
I do see in https://github.com/enzomanuelmangano/react-native-skia-gesture/blob/main/src/canvas/canvas.tsx#L81 that my "top-level" `TouchHandler` will also be called when supplied.
However, what I am trying to figure out is how to have `TouchInput`s be "consumed" if, for example, they intersect a `Touchable` component as defined in this library, and otherwise process the touch input with my own custom logic.
Currently I am having difficulty differentiating between a `TouchInput` that will interact with a `Touchable` element on the canvas and one that lands in empty canvas space, where I should apply my own touch logic.
Roughly, I am thinking that when this library's internal `TouchHandler` finds that a touch intersects a `Touchable`, it should prevent propagation up to the parent `TouchHandler`; when it does not, it should call my parent `TouchHandler` so I can perform my custom logic there.
Any feedback or suggestions would be most appreciated.