Original address: https://www.jianshu.com/p/0b439487b4f9
One requirement in our project was to capture the coordinates of every tap on the screen, app-wide. The solution is to create a subclass of UIApplication and inspect touches in its overridden sendEvent: method:
```objectivec
#import "MRApplication.h"

@interface MRApplication ()
// Tracks whether the current touch sequence moved (i.e. is a swipe, not a tap).
@property (nonatomic, assign) BOOL isMoved;
@end

@implementation MRApplication

- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        UITouch *touch = [event.allTouches anyObject];
        if (touch.phase == UITouchPhaseBegan) {
            self.isMoved = NO;
        }
        if (touch.phase == UITouchPhaseMoved) {
            // The finger moved, so treat this sequence as a swipe rather than a tap.
            self.isMoved = YES;
        }
        if (touch.phase == UITouchPhaseEnded) {
            // Report only single-finger taps that did not move.
            if (!self.isMoved && event.allTouches.count == 1) {
                CGPoint locationPointWindow = [touch preciseLocationInView:touch.window];
                NSLog(@"TouchLocationWindow:(%.1f,%.1f)", locationPointWindow.x, locationPointWindow.y);
            }
            self.isMoved = NO;
        }
    }
    // Always forward the event so normal touch delivery continues.
    [super sendEvent:event];
}

@end
```
In fact, the touch object already carries the view it landed in. If you want coordinates relative to that view rather than to the window, pass touch.view instead:

```objectivec
CGPoint locationPointView = [touch preciseLocationInView:touch.view];
```
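More generally, a window-space point can be translated into any other view's coordinate space with UIView's convertPoint:fromView:. A minimal sketch (targetView is a hypothetical view you want coordinates for):

```objectivec
// Convert the window-space touch point into targetView's coordinate space.
CGPoint pointInWindow = [touch preciseLocationInView:touch.window];
CGPoint pointInTarget = [targetView convertPoint:pointInWindow fromView:touch.window];
```

This is useful when the tap should be interpreted relative to a view other than the one that was actually hit.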
Note: MRApplication must be registered in main.m as the app's principal class; after that it intercepts every touch event in the whole app. The if checks above filter out swipes and multi-touch; without them, sendEvent: also receives the UIEvents for those gestures.
```objectivec
#import <UIKit/UIKit.h>
#import "AppDelegate.h"
#import "MRApplication.h"

int main(int argc, char * argv[]) {
    @autoreleasepool {
        // Pass MRApplication as the principal class so it replaces UIApplication.
        return UIApplicationMain(argc, argv,
                                 NSStringFromClass([MRApplication class]),
                                 NSStringFromClass([AppDelegate class]));
    }
}
```