I'm having trouble figuring out how to handle touches and gestures in views with custom renderers when those views sit inside a Xamarin.Forms (XF) ScrollView.
The view hierarchy is fairly simple:
<AbsoluteLayout x:Name="SlideView">
    <ScrollView x:Name="ControlScrollView">
        <ContentView x:Name="ControlView" HeightRequest="100">
        </ContentView>
    </ScrollView>
</AbsoluteLayout>
The content of ControlView is set programmatically to an AbsoluteLayout, containing a bunch of Views that each have a custom renderer. These Views implement native labels, buttons, and more complex controls. So far so good; the problem starts when I try to intercept touch events in the custom renderers.
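The programmatic setup looks roughly like this (a sketch; the control names `MyLabel`, `MyButton`, and `MySeekBar` are illustrative stand-ins for the real custom-rendered Views, and the bounds are made up):

```csharp
// Illustrative sketch: each of these Views has its own custom renderer.
var layout = new AbsoluteLayout();
layout.Children.Add(new MyLabel(),   new Rectangle(0, 0, 200, 30));   // native label
layout.Children.Add(new MyButton(),  new Rectangle(0, 40, 120, 40));  // native button
layout.Children.Add(new MySeekBar(), new Rectangle(220, 0, 40, 100)); // vertical seek bar
ControlView.Content = layout;
```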
On Android, this works more or less as expected, behaving the same way as the native iPhone app from which this was ported: vertical swipes are interpreted as a scroll action and the ScrollView moves, while horizontal swipes and taps are correctly picked up by the Android GestureDetectors. When a swipe starts on a vertical SeekBar, no scrolling takes place and the vertical gesture moves the SeekBar's pip instead.
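For reference, the Android renderers intercept gestures roughly like this (a sketch with illustrative names, not the actual project code):

```csharp
// Sketch: a GestureDetector wired into the native view's Touch event.
// Leaving vertical moves unhandled lets the parent ScrollView intercept them.
public class MyControlRenderer : ViewRenderer<MyControl, Android.Views.View>
{
    GestureDetector _detector;

    protected override void OnElementChanged(ElementChangedEventArgs<MyControl> e)
    {
        base.OnElementChanged(e);
        if (Control == null && e.NewElement != null)
        {
            var native = new Android.Views.View(Context);
            SetNativeControl(native);
            _detector = new GestureDetector(Context, new SwipeListener());
            native.Touch += (s, args) =>
            {
                // Handled = false for vertical movement -> ScrollView scrolls;
                // horizontal swipes and taps stay with this control.
                args.Handled = _detector.OnTouchEvent(args.Event);
            };
        }
    }

    class SwipeListener : GestureDetector.SimpleOnGestureListener
    {
        public override bool OnFling(MotionEvent e1, MotionEvent e2, float vx, float vy)
            => Math.Abs(vx) > Math.Abs(vy); // claim only horizontal flings
    }
}
```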
On iOS, scrolling doesn't work at all unless I set InputTransparent = true on either the ContentView or the AbsoluteLayout inside it. But once I do that, the iOS custom renderers no longer receive any touches: neither the TouchesXXX overrides nor the UIGestureRecognizers get anything. Note that the UIView has ExclusiveTouch = false and the UIGestureRecognizers have CancelsTouchesInView = false.
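The iOS side is set up roughly as follows (again a sketch with illustrative names); this is the code that stops receiving anything once InputTransparent is set upstream:

```csharp
// Sketch: iOS renderer with the settings described above.
public class MyControlRenderer : ViewRenderer<MyControl, UIView>
{
    protected override void OnElementChanged(ElementChangedEventArgs<MyControl> e)
    {
        base.OnElementChanged(e);
        if (Control == null && e.NewElement != null)
        {
            var native = new UIView { ExclusiveTouch = false };
            SetNativeControl(native);

            var tap = new UITapGestureRecognizer(OnTap)
            {
                // Should allow touches to continue to the view itself,
                // yet nothing arrives once InputTransparent = true upstream.
                CancelsTouchesInView = false
            };
            native.AddGestureRecognizer(tap);
        }
    }

    void OnTap(UITapGestureRecognizer g)
    {
        // Never fires with InputTransparent = true on the parent.
    }

    public override void TouchesBegan(NSSet touches, UIEvent evt)
    {
        base.TouchesBegan(touches, evt);
        // Also never called once InputTransparent = true is set.
    }
}
```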
Is there a way to have the iOS app deliver touch events to both the XF ScrollView and the native iOS gesture recognizers?