Most efficient way to use a video as a background in iOS

Axe*_*l K 18 iphone video user-interface objective-c ios

Perhaps you have noticed one of the latest trends in iOS apps: using a video as the background, mostly on login or "first launch" screens. Yesterday I tried to mimic this with a very simple test project (just one view controller), and I'm happy with the result except for the performance. When trying it out in the iOS Simulator (on a simulated iPhone 6), CPU usage fluctuated between 70% and 110%. That seems very unreasonable for a simple login screen.

Here's how it looks in action: http://oi57.tinypic.com/nqqntv.jpg

The question is: is there a more CPU-efficient way to achieve this? How do apps like Vine, Spotify and Instagram do it?

Before you answer: the method I used was a full-HD video played with MPMoviePlayerController:

- (void)viewDidLoad {
    [super viewDidLoad];

    // find movie file
    NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];

    // load movie
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    self.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.moviePlayer.view.frame = self.view.frame;
    self.moviePlayer.scalingMode = MPMovieScalingModeAspectFill;
    [self.view addSubview:self.moviePlayer.view];
    [self.view sendSubviewToBack:self.moviePlayer.view];
    [self.moviePlayer play];

    // loop movie
    [[NSNotificationCenter defaultCenter] addObserver: self
                                             selector: @selector(replayMovie:)
                                                 name: MPMoviePlayerPlaybackDidFinishNotification
                                               object: self.moviePlayer];
}

#pragma mark - Helper methods

-(void)replayMovie:(NSNotification *)notification
{
    [self.moviePlayer play];
}

Of course, the edges of the video could be trimmed, so the resolution would be more like 700x1080 instead of 1920x1080, but would that make a huge difference in performance? Or should I compress the video with a specific format and settings to achieve the best performance? Maybe there is an entirely alternative approach?
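On the compression side, re-encoding the asset at a lower resolution with hardware-decoder-friendly H.264 settings is usually where most of the win is, since the device then decodes fewer pixels per frame. A sketch of such an encode with ffmpeg (file names are hypothetical, and `-crf`/`scale` are starting points to tune, not definitive values):

```sh
# Downscale to 720 px wide (height follows, kept even), strip the audio
# track, and encode H.264 baseline profile, which iOS decodes in hardware.
ffmpeg -i arenaVideo-1080p.mp4 -vf "scale=720:-2" -an \
  -c:v libx264 -profile:v baseline -level 3.1 -crf 28 \
  -movflags +faststart arenaVideo.mp4
```

`-movflags +faststart` moves the index to the front of the file so playback can begin before the whole file is read.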

Actually, I tried using a GIF as described in this article: https://medium.com/swift-programming/ios-make-an-awesome-video-background-view-objective-c-swift-318e1d71d0a2

The problems with that were:

  • Creating a GIF from a video takes a lot of time and effort
  • I saw no significant drop in CPU usage when I tried it
  • Supporting multiple screen sizes is a complete pain with this approach (at least when I tried it - with Auto Layout and Size Classes enabled - I couldn't get the GIF to scale properly across devices)
  • The video quality suffers

And*_*ius 20

The best way is to use AVFoundation; then you control the video layer itself.

In your header file, declare @property (nonatomic, strong) AVPlayerLayer *playerLayer;

- (void)viewDidLoad {
      [super viewDidLoad];


      [self.view.layer addSublayer:self.playerLayer];

      // loop movie
      [[NSNotificationCenter defaultCenter] addObserver: self
                                             selector: @selector(replayMovie:)
                                             name: AVPlayerItemDidPlayToEndTimeNotification 
                                             object:nil];
}
-(AVPlayerLayer*)playerLayer{
      if(!_playerLayer){

         // find movie file
         NSString *moviePath = [[NSBundle mainBundle] pathForResource:@"arenaVideo" ofType:@"mp4"];
         NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
         _playerLayer = [AVPlayerLayer playerLayerWithPlayer:[[AVPlayer alloc]initWithURL:movieURL]];
         _playerLayer.frame = CGRectMake(0,0,self.view.frame.size.width, self.view.frame.size.height);
         [_playerLayer.player play];

      }
    return _playerLayer;
}
-(void)replayMovie:(NSNotification *)notification
{
    // AVPlayer stops on the last frame, so rewind before playing again
    [self.playerLayer.player seekToTime:kCMTimeZero];
    [self.playerLayer.player play];
}

Swift 2.0

lazy var playerLayer:AVPlayerLayer = {

    let player = AVPlayer(URL:  NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("LaunchMovie", ofType: "mov")!))
    player.muted = true
    player.allowsExternalPlayback = false
    player.appliesMediaSelectionCriteriaAutomatically = false
    var error:NSError?

    // This is needed so it doesn't cut off the user's audio (if they're listening to music, etc.)
    do {
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryAmbient)
    } catch let error1 as NSError {
        error = error1
    } catch {
        fatalError()
    }
    if error != nil {
        print(error)
    }

    var playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = self.view.frame
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    playerLayer.backgroundColor = UIColor.blackColor().CGColor
    player.play()
    NSNotificationCenter.defaultCenter().addObserver(self, selector:"playerDidReachEnd", name:AVPlayerItemDidPlayToEndTimeNotification, object:nil)
    return playerLayer
    }()

override func viewDidLoad() {
    super.viewDidLoad()
    self.view.layer.addSublayer(self.playerLayer)
}
override func viewWillDisappear(animated: Bool) {
    NSNotificationCenter.defaultCenter().removeObserver(self)
}
// If orientation changes
override func willAnimateRotationToInterfaceOrientation(toInterfaceOrientation: UIInterfaceOrientation, duration: NSTimeInterval) {
    playerLayer.frame = self.view.frame
}
func playerDidReachEnd(){
    self.playerLayer.player!.seekToTime(kCMTimeZero)
    self.playerLayer.player!.play()

}

Tested on iOS 7 - iOS 9
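As a side note beyond what this answer covers: if you can target iOS 10 or later, AVPlayerLooper replaces the notification-based replay entirely and avoids the brief stall at the loop point. A minimal sketch in modern Swift, assuming the same bundled "arenaVideo.mp4" asset as above:

```swift
import AVFoundation
import UIKit

class BackgroundVideoController: UIViewController {
    // Keep strong references; AVPlayerLooper stops if deallocated.
    private let queuePlayer = AVQueuePlayer()
    private var looper: AVPlayerLooper?

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let url = Bundle.main.url(forResource: "arenaVideo", withExtension: "mp4") else { return }

        // The looper refills the queue player with copies of the template item.
        looper = AVPlayerLooper(player: queuePlayer, templateItem: AVPlayerItem(url: url))
        queuePlayer.isMuted = true

        let playerLayer = AVPlayerLayer(player: queuePlayer)
        playerLayer.frame = view.bounds
        playerLayer.videoGravity = .resizeAspectFill
        view.layer.insertSublayer(playerLayer, at: 0)

        queuePlayer.play()
    }
}
```

Because the looper pre-queues the next item, no notification observer or seekToTime call is needed.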


bol*_*nad 2

I realize this is an old post, but since I have some experience with lowering CPU usage in iOS apps, I'll reply.

The first place to look is the AVFoundation framework.

Implementing AVPlayer should help bring the CPU load down a bit.

But the best solution is to use Brad Larson's GPUImage library, which leverages OpenGL and will greatly reduce CPU usage. Download the library and there are examples of how to use it. I recommend looking at GPUImageMovieWriter.
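For pure playback (as opposed to recording output, which is what GPUImageMovieWriter does), the GPUImage classes involved would be GPUImageMovie as the source and GPUImageView as the display target. The sketch below is an assumption based on GPUImage's sample projects, not code from the answer above, written in Swift 2 syntax to match the rest of this page:

```swift
// Assumes GPUImage is integrated and "arenaVideo.mp4" is bundled.
// movieFile must be kept as a strong property, or processing stops
// as soon as it is deallocated.
let movieURL = NSBundle.mainBundle().URLForResource("arenaVideo", withExtension: "mp4")!
let movieFile = GPUImageMovie(URL: movieURL)
movieFile.playAtActualSpeed = true
movieFile.shouldRepeat = true   // loop without any notification handling

// GPUImageView renders via OpenGL, keeping the work on the GPU
let movieView = GPUImageView(frame: view.bounds)
view.insertSubview(movieView, atIndex: 0)

movieFile.addTarget(movieView)
movieFile.startProcessing()
```

A filter (e.g. a blur or tint, as login screens often use) could be inserted between the source and the view with addTarget without moving the work back onto the CPU.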