A few lines of code solve the problem; read on for the two approaches.
When it comes to preventing screenshots on iOS, most people will say it can't be done. That's what I told my boss too, but his answer was: iQiyi has done it! (for paid videos)
1. A little research makes it clear why iQiyi can block screenshots: its copyrighted videos are delivered as encrypted video streams, so when a screenshot is taken, the protected content does not appear.
2. So all we need to do is render the region we want to protect from screenshots as an encrypted-stream video.
First, I turn the image into video frames (pixel buffers):
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    CGSize size = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = @{
        (id)kCVPixelBufferCGImageCompatibilityKey: @YES,
        (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES
    };
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          size.width,
                                          size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess) {
        NSLog(@"Failed to create pixel buffer");
        return NULL;
    }
    // Draw the CGImage into the pixel buffer's memory through a bitmap context.
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height,
                                                 8, 4 * size.width, rgbColorSpace,
                                                 kCGImageAlphaPremultipliedFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer; // caller is responsible for CVPixelBufferRelease
}
Next, I write those frames out as a video file:
+ (void)imageToMP4:(UIImage *)img completion:(void (^)(NSData *))handler {
    NSError *error = nil;
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    NSString *tmpDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"tmp"];
    NSString *videoOutputPath = [tmpDirectory stringByAppendingPathComponent:@"test_output.mp4"];
    // Remove any leftover file from a previous run.
    if ([fileMgr fileExistsAtPath:videoOutputPath] &&
        ![fileMgr removeItemAtPath:videoOutputPath error:&error]) {
        NSLog(@"Unable to delete file: %@", [error localizedDescription]);
    }
    CGSize imageSize = img.size;
    NSUInteger fps = 5;
    NSArray *imageArray = @[img];
    NSLog(@"videoOutputPath========%@", videoOutputPath);
    NSLog(@"Start building video from defined frames.");
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc]
        initWithURL:[NSURL fileURLWithPath:videoOutputPath]
           fileType:AVFileTypeMPEG4  // match the .mp4 extension
              error:&error];
    // Required: put the moov atom at the front of the file (faststart)
    // so playback can begin without scanning the whole file.
    videoWriter.shouldOptimizeForNetworkUse = YES;
    NSParameterAssert(videoWriter);
    NSDictionary *videoSettings = @{
        AVVideoCodecKey: AVVideoCodecTypeH264,
        AVVideoWidthKey: @((int)imageSize.width),
        AVVideoHeightKey: @((int)imageSize.height)
    };
    AVAssetWriterInput *videoWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                       sourcePixelBufferAttributes:nil];
    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];
    // Start a session.
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    int frameCount = 0;
    double numberOfSecondsPerFrame = 1;
    double frameDuration = fps * numberOfSecondsPerFrame; // timescale units per frame
    for (UIImage *frameImage in imageArray) {
        CVPixelBufferRef buffer = [self pixelBufferFromCGImage:frameImage.CGImage];
        BOOL appendOK = NO;
        int j = 0;
        while (!appendOK && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                NSLog(@"Processing video frame (%d,%lu)", frameCount, (unsigned long)imageArray.count);
                CMTime frameTime = CMTimeMake(frameCount * frameDuration, (int32_t)fps);
                appendOK = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                if (!appendOK && videoWriter.error != nil) {
                    NSLog(@"Unresolved error %@,%@.", videoWriter.error, videoWriter.error.userInfo);
                }
            } else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!appendOK) {
            printf("error appending image %d after %d attempts\n", frameCount, j);
        }
        if (buffer) {
            CVPixelBufferRelease(buffer); // pixelBufferFromCGImage returns a +1 reference
        }
        frameCount++;
    }
    // Finish the session.
    [videoWriterInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        if (handler) {
            handler([NSData dataWithContentsOfFile:videoOutputPath]);
        }
    }];
}
That's it. Some readers will find that the image comes out distorted, and that is the biggest pitfall here: the drawn width must be an integer multiple of 16. So, one more method:
+ (UIImage *)composite_Picture:(UIImage *)image {
    // Round the width up to the next multiple of 16 and scale the height
    // to preserve the original aspect ratio.
    CGFloat targetWidth = ((int)image.size.width / 16 + 1) * 16;
    CGSize imageContSize = CGSizeMake(targetWidth,
                                      targetWidth / (image.size.width / image.size.height));
    // Open a graphics context.
    UIGraphicsBeginImageContext(imageContSize);
    [image drawInRect:CGRectMake(0, 0, imageContSize.width, imageContSize.height)];
    // Grab the redrawn image.
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    // Close the context.
    UIGraphicsEndImageContext();
    NSLog(@"%f, %f", newImage.size.width, newImage.size.height);
    return newImage;
}
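To actually protect anything, the generated file still has to be played on screen over the sensitive region. A minimal sketch of that glue step, assuming the two methods above and an ordinary AVPlayerLayer-based player; the `showProtectedImage:inView:` name and the temporary file path are illustrative, not part of the original demo:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Illustrative helper: pad the image, build the MP4, then loop-less
// play it in an AVPlayerLayer that covers the target view.
- (void)showProtectedImage:(UIImage *)image inView:(UIView *)container {
    UIImage *padded = [[self class] composite_Picture:image]; // width -> multiple of 16
    [[self class] imageToMP4:padded completion:^(NSData *data) {
        NSString *path = [NSTemporaryDirectory()
            stringByAppendingPathComponent:@"protected.mp4"];
        [data writeToFile:path atomically:YES];
        dispatch_async(dispatch_get_main_queue(), ^{
            AVPlayer *player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:path]];
            AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
            layer.frame = container.bounds;
            layer.videoGravity = AVLayerVideoGravityResizeAspect;
            [container.layer addSublayer:layer];
            [player play];
        });
    }];
}
```

Keep a strong reference to the player (for example in a property) if you need it to live beyond this scope.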
Please run it on a real device: in the demo, tap "generate video" first, then take a screenshot. If this helped you, please give it a like, thanks!
Demo: https://gitee.com/shyzine/i-os-anti-screenshot.git
Approach 2:
Any control added to the returned view is protected from screenshots. This works because the system hides the content of a secure text field (isSecureTextEntry) in screenshots and screen recordings, so we borrow its internal content view as a container.
Swift:
static func makeSecView() -> UIView {
    let field = UITextField()
    field.isSecureTextEntry = true
    // The secure field's first subview is the canvas the system
    // excludes from screen captures.
    guard let view = field.subviews.first else {
        return UIView()
    }
    view.subviews.forEach { $0.removeFromSuperview() }
    view.isUserInteractionEnabled = true
    return view
}
Objective-C:
- (UIView *)getBgView {
    UITextField *bgTextField = [[UITextField alloc] init];
    bgTextField.secureTextEntry = YES;
    UIView *bgView = bgTextField.subviews.firstObject;
    [bgView setUserInteractionEnabled:YES];
    return bgView;
}
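Using it is just a matter of treating the returned view as a normal container: everything placed inside it disappears from captures. A brief usage sketch; the frame values and label text are made up for illustration:

```objectivec
// Illustrative usage: put sensitive UI inside the secure container.
UIView *secureView = [self getBgView];
secureView.frame = CGRectMake(20, 100, 280, 120);
[self.view addSubview:secureView];

UILabel *secretLabel = [[UILabel alloc] initWithFrame:secureView.bounds];
secretLabel.text = @"This text will not appear in screenshots";
[secureView addSubview:secretLabel];
```

Note that this relies on the private subview structure of UITextField, which Apple may change in a future iOS release; the Swift version's guard at least covers the case where the subview is missing.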