I want to stream a song from a server, and this seems to work as well as anything I've found, but there appear to be some things this approach can't do.

I currently instantiate an AVPlayer:
player = [[AVPlayer alloc] initWithURL:[NSURL URLWithString:@"http://media.soundcloud.com/stream/songID?stream_token=myToken"]];
and call play on it when my app decides the buffer is sufficient.

That sounds perfect, but I've found that if I drop my reference to the AVPlayer instance, playback stops, and if I push the app to the background, the sound fades out and pauses until the app returns to the foreground.

Is it possible to have the song keep playing in the background with AVPlayer?

If not, should I be looking at HTTP Live Streaming?

I'm quite new to this side of iOS and would greatly appreciate any guidance!
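For reference, "the buffer is sufficient" can be observed rather than guessed: AVPlayerItem exposes a KVO-compliant isPlaybackLikelyToKeepUp flag. Background behavior itself is governed by the audio session category and the "audio" entry under UIBackgroundModes, not by AVPlayer, so AVPlayer can keep playing in the background once those are set. A minimal Swift sketch, reusing the placeholder URL from the question:

```swift
import AVFoundation

// Observe buffering readiness and start playback only once the item
// reports it is likely to keep up. The URL is the question's placeholder.
let item = AVPlayerItem(url: URL(string: "http://media.soundcloud.com/stream/songID?stream_token=myToken")!)
let player = AVPlayer(playerItem: item)

var keepUpObservation: NSKeyValueObservation?
keepUpObservation = item.observe(\.isPlaybackLikelyToKeepUp) { item, _ in
    if item.isPlaybackLikelyToKeepUp {
        player.play()
    }
}
```

The observation token must be kept alive (here, in a `var`) for the callback to keep firing.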
I'm using the following code to play a sound from a URL on the internet:
var audioPlayer = AVPlayer()
...
let audioSession = AVAudioSession.sharedInstance()
audioSession.setCategory(AVAudioSessionCategoryAmbient, error: nil)
let url = trackFileName
let playerItem = AVPlayerItem( URL:NSURL( string:url ) )
audioPlayer = AVPlayer(playerItem:playerItem)
audioPlayer.rate = 1.0
audioPlayer.play()
I'm trying to make sure it keeps playing in the background, which is why I'm using "AVAudioSession.sharedInstance()". When I try it in the simulator and lock the screen, it keeps playing the way it should. But if I play it on a device, the sound switches off as soon as the screen auto-locks or I lock it myself.

What am I missing in my code?
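One relevant detail: the ambient category is silenced when the screen locks, whereas the playback category keeps running, assuming the app also declares the "audio" entry under UIBackgroundModes in Info.plist. A minimal sketch of that session setup in current Swift:

```swift
import AVFoundation

// Session setup usually needed for lock-screen/background playback.
// .ambient follows the ringer and stops on lock; .playback does not
// (provided Info.plist declares the "audio" background mode).
func activatePlaybackSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default, options: [])
        try session.setActive(true)
    } catch {
        print("Audio session error:", error)
    }
}
```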
I have an array of [AVAsset], and I'm trying to combine all of those assets into a single asset so I can play the video back seamlessly (I tried using AVQueuePlayer, but it doesn't play the assets seamlessly).

Below is what I have so far, but when I try to play the final composition, it only plays the first track, even though it reports that it has all the tracks and the total duration equals all the tracks combined.

Am I missing a step, even though it looks like all the tracks are in the composition? Maybe I need to handle the AVPlayer differently if the AVPlayerItem has multiple tracks?
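For comparison, a commonly suggested layout is a single shared composition track that every clip is appended to, rather than one new track per clip as in the loop below. A sketch using current AVFoundation names:

```swift
import AVFoundation

// Build one mutable video track and append every clip at a running cursor.
// This is the usual layout for seamless back-to-back playback.
func makeComposition(from clips: [AVAsset]) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { return composition }

    var cursor = CMTime.zero
    for clip in clips {
        let source = clip.tracks(withMediaType: .video)[0]
        try track.insertTimeRange(
            CMTimeRange(start: .zero, duration: clip.duration),
            of: source,
            at: cursor)
        cursor = CMTimeAdd(cursor, clip.duration)
    }
    return composition
}
```

This is a sketch, not the question's exact code; it assumes all clips have at least one video track.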
let playerLayer: AVPlayerLayer = AVPlayerLayer()
lazy var videoPlayer: AVPlayer = AVPlayer()
var videoClips = [AVAsset]()
let videoComposition = AVMutableComposition()
var playerItem: AVPlayerItem!
var lastTime: CMTime = kCMTimeZero
for clipIndex in videoClips {
    let videoCompositionTrack = videoComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    do {
        try videoCompositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, clipIndex.duration),
                                                  ofTrack: clipIndex.tracksWithMediaType(AVMediaTypeVideo)[0],
                                                  atTime: lastTime)
        lastTime = CMTimeAdd(lastTime, clipIndex.duration)
    } catch {
        print("Failed to insert track")
    }
}
print("VideoComposition Tracks: \(videoComposition.tracks.count)") // Shows multiple tracks
playerItem = AVPlayerItem(asset: videoComposition)
print("PlayerItem Duration: \(playerItem.duration.seconds)") // Shows …

I'm playing back a video recorded with AVCapture. I save the video's URL in a string named outputFileURL. I tried to play the video using the AVPlayerLayer approach. The code I use is:
AVPlayer *avPlayerq = [AVPlayer playerWithURL:outputFileURL];
avPlayerq.actionAtItemEnd = AVPlayerActionAtItemEndNone;
AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:avPlayerq];
videoLayer.frame = self.view.bounds;
[self.view.layer addSublayer:videoLayer];
[avPlayerq play];
But the video I get is not full screen. Can anyone help me fix that?
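A Swift sketch of the usual fix for a raw AVPlayerLayer: ask the layer to fill rather than letterbox, and re-apply its frame on every layout pass so rotation keeps it truly full screen. The class and property names are illustrative stand-ins for the question's code:

```swift
import AVFoundation
import UIKit

final class PlaybackViewController: UIViewController {
    // Stand-in for the question's avPlayerq.
    let avPlayer = AVPlayer()
    private let videoLayer = AVPlayerLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        videoLayer.player = avPlayer
        videoLayer.videoGravity = .resizeAspectFill // crop, don't letterbox
        view.layer.addSublayer(videoLayer)
    }

    // A one-time frame assignment goes stale; re-apply it whenever the
    // view lays out.
    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        videoLayer.frame = view.bounds
    }
}
```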
I'm trying to show a video on the login screen, as seen in apps like Spotify.

What I'm doing

To do this, I'm using an AVPlayer:
self.videoPlayer = AVPlayer(playerItem: item)
self.videoView.player = self.videoPlayer
self.videoPlayer.play()
The videoView is a custom UIView class described here.

I set the AVPlayerLayer's videoGravity to AVLayerVideoGravityResizeAspectFill:
self.videoView.playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
The problem

However, my video still gets letterboxed when I try to fill the view's bounds:

What I want

I want the video to fill the entire bounds without any black bars. I don't care if part of the video gets clipped:

Additional information

When I looked at the scaling-mode property of the deprecated MPMoviePlayerController, I found the following description for the aspectFill property:

Scale the movie uniformly until it fills the visible bounds of the view. Content at the edges of the larger of the two dimensions is clipped so that the other dimension fits the view exactly. The movie's aspect ratio is preserved.

From that description, this is exactly the behavior I want. But, as stated, my video gets letterboxed. Am I doing something wrong, or did Apple stop supporting this kind of scaling? Do I have to implement the scaling myself if I don't care about part of the video being clipped?

Any help is appreciated, thanks.
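Apple has not dropped this scaling; letterboxing usually means the AVPlayerLayer's frame doesn't actually match the view's bounds when the gravity is applied. For comparison, a minimal layerClass-backed player view, which keeps the AVPlayerLayer exactly at the view's bounds at all times (the linked custom class may differ from this sketch):

```swift
import AVFoundation
import UIKit

// Back the view with an AVPlayerLayer so the layer's geometry always
// tracks the view's bounds; resizeAspectFill then has a correct rect to fill.
final class VideoView: UIView {
    override static var layerClass: AnyClass { AVPlayerLayer.self }

    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    var player: AVPlayer? {
        get { playerLayer.player }
        set { playerLayer.player = newValue }
    }
}

// Usage: videoView.playerLayer.videoGravity = .resizeAspectFill
```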
I want to play a video stored on my device, and I've tried both AVPlayer and MPMoviePlayerController. The video URL is: file:///var/mobile/Containers/Data/Application/73695F3E-9351-447B-BB8A-0B4A62CE430F/Documents/08-03-201711:48:50.3gp. The problem is that I get a blank screen.
AVPlayer *player = [AVPlayer playerWithURL:fileURL];
AVPlayerViewController *playerViewController = [AVPlayerViewController new];
[player pause];
[player play];
playerViewController.player = player;
[self addChildViewController: playerViewController];
[self.view addSubview: playerViewController.view];
//[playerViewController.player play];//Used to Play On start
[self presentViewController:playerViewController animated:YES completion:nil];
I think the problem is with the link. What I'm doing is taking the local directory link and appending the file name to it.
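If the link is the problem, building the URL with URL APIs instead of string concatenation is the usual fix: URL(string:) has traditionally returned nil (or produced an unplayable URL) for names containing unescaped spaces, while fileURLWithPath/appendingPathComponent handle such names. A Swift sketch with an illustrative file name:

```swift
import Foundation

// Build a proper file URL for a video in the Documents directory.
func localVideoURL(named name: String) -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    return documents.appendingPathComponent(name)
}

let url = localVideoURL(named: "08-03-2017 11:48:50.3gp")
// Pass this URL object (not its string form) to AVPlayer's playerWithURL:.
```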
I'm new to using AVPlayer in iOS Swift and got it working. But I want the video to play inside a UIView; right now the video takes over the whole page by default. I've been trying a few things and nothing works — here is my code. I have other content on that page, which is why I want the AVPlayer inside the UIView called newView, and when I put the code below into it, it gives me an error. I've been following this example of positioning an AVPlayerLayer in a UIView layer: AVPlayerLayer Position.
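One common way to keep the player inside newView is child view-controller containment; calling present(_:animated:), as the code below does, always takes over the screen. A sketch using modern API names, with a placeholder URL:

```swift
import AVKit
import UIKit

// Embed an AVPlayerViewController inside an existing container view
// instead of presenting it full screen.
final class EmbeddedPlayerViewController: UIViewController {
    @IBOutlet weak var newView: UIView!
    private let playerController = AVPlayerViewController()

    override func viewDidLoad() {
        super.viewDidLoad()
        playerController.player = AVPlayer(url: URL(string: "https://example.com/clip.mp4")!)
        addChild(playerController)                    // register the child first
        playerController.view.frame = newView.bounds  // size to the container's bounds, not its frame
        newView.addSubview(playerController.view)
        playerController.didMove(toParent: self)      // complete the containment
        playerController.player?.play()
    }
}
```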
import UIKit
import AVFoundation
import WebKit
import AVKit
class ExampleTable: UIViewController {
    @IBOutlet weak var newView: UIView!
    let avPlayerViewController = CustomAVPLayerC()
    var playerView: AVPlayer?
    var AVLayer: AVPlayerLayer?

    override func viewDidAppear(_ animated: Bool) {
        let movieURL: NSURL? = NSURL(string: "my url")
        playerView = AVPlayer(url: movieURL! as URL)
        avPlayerViewController.player = playerView
        playerView?.isMuted = true
        avPlayerViewController.view.frame = newView.frame
        newView.addSubview(avPlayerViewController.view)
        addChildViewController(avPlayerViewController)
        self.present(self.avPlayerViewController, animated: true) {
            self.avPlayerViewController.player?.play()
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
    }
}

My code runs fine in the simulator and streams the URL without a problem. However, when it's run on an actual device, it doesn't play the audio. My code is as follows:
var playerItem: AVPlayerItem?
var player: AVPlayer?
func initPlayer() {
    let url: URL = URL(string: "https://storage.googleapis.com/purelightdatabucket/samples%2F60-Energy%20UPLIFT.mp3")!
    self.playerItem = AVPlayerItem(url: url)
    self.player = AVPlayer(playerItem: self.playerItem!)
    let playerLayer = AVPlayerLayer(player: self.player!)
    playerLayer.frame = CGRect(x: 0, y: 0, width: 10, height: 50) // actually this player layer is not visible
    self.view.layer.addSublayer(playerLayer)
}
I created a splash video using an AVPlayerViewController. It works on every device except my iPhone X. I've tried changing the video gravity, the frame, and everything else, but it won't work. Any ideas about this? Here is the sample code:
guard let videoPath = Bundle.main.path(forResource: "Redtaxi-splash", ofType: "mov") else {
    return
}
let videoURL = URL(fileURLWithPath: videoPath)
let player = AVPlayer(url: videoURL)
playerViewController = AVPlayerViewController()
playerViewController?.player = player
playerViewController?.showsPlaybackControls = false
playerViewController?.view.frame = view.frame
playerViewController?.view.backgroundColor = .white
playerViewController?.view.contentMode = .scaleAspectFill
NotificationCenter.default.addObserver(self, selector: #selector(playerDidFinishPlaying(note:)),
                                       name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: player.currentItem)
view.addSubview((playerViewController?.view)!)
playerViewController?.player?.play()
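On iPhone X-class devices, a one-time view.frame assignment can differ from the final laid-out bounds. A sketch of pinning the splash player with Auto Layout instead, so it tracks the full screen through layout changes (names are illustrative):

```swift
import AVKit
import UIKit

// Pin the player controller's view edge-to-edge with constraints rather
// than copying the parent's frame once.
func pinToFullScreen(_ playerViewController: AVPlayerViewController, in parent: UIViewController) {
    guard let playerView = playerViewController.view else { return }
    playerView.translatesAutoresizingMaskIntoConstraints = false
    parent.view.addSubview(playerView)
    NSLayoutConstraint.activate([
        playerView.topAnchor.constraint(equalTo: parent.view.topAnchor),
        playerView.bottomAnchor.constraint(equalTo: parent.view.bottomAnchor),
        playerView.leadingAnchor.constraint(equalTo: parent.view.leadingAnchor),
        playerView.trailingAnchor.constraint(equalTo: parent.view.trailingAnchor),
    ])
    playerViewController.videoGravity = .resizeAspectFill // fill the notch-era screen too
}
```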
I'm writing a music player app in Swift with audio streaming via AVPlayer, and everything works fine.

However, when I try to add MPRemoteCommandCenter to my app, I get a lot of errors, and I don't even know why.
func setupPlayer() {
    let item = AVPlayerItem(url: musicURL)
    self.player = AVPlayer.init(playerItem: item)
    self.player.play()
    self.player.volume = 1
    self.player.addPeriodicTimeObserver(forInterval: CMTimeMakeWithSeconds(1, preferredTimescale: 1), queue: DispatchQueue.main, using: { (time) in
        if self.player.currentItem?.status == .readyToPlay {
            self.reloadNowPlayingInfo()
            let currentTime = self.player.currentTime().seconds
            self.playingTime.text = currentTime.getTimeString()
            self.playerSlider.value = currentTime/duration
        }
    })
}
func reloadNowPlayingInfo() {
    var info = [String: Any]()
    info[MPMediaItemPropertyTitle] = self.titleText
    info[MPMediaItemPropertyArtwork] = MPMediaItemArtwork.init("some image")
    info[MPMediaItemPropertyPlaybackDuration] = seconds
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = currentSecs
    info[MPMediaItemPropertyArtist] = "Artist name"
    MPNowPlayingInfoCenter.default().nowPlayingInfo …
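Two things in the truncated snippet above commonly cause errors: getTimeString() is not a standard API (a hypothetical helper is sketched below), and MPMediaItemArtwork has no string initializer — its real initializers are init(image:) and init(boundsSize:requestHandler:). Separately, MPRemoteCommandCenter needs command handlers registered for the lock-screen controls to do anything. A minimal sketch:

```swift
import AVFoundation
import MediaPlayer

// A guess at the mm:ss helper the code above assumes.
extension Double {
    func getTimeString() -> String {
        let total = Int(self)
        return String(format: "%02d:%02d", total / 60, total % 60)
    }
}

// Register handlers so the remote (lock-screen) play/pause buttons work.
func setupRemoteCommands(for player: AVPlayer) {
    let center = MPRemoteCommandCenter.shared()
    center.playCommand.addTarget { _ in
        player.play()
        return .success
    }
    center.pauseCommand.addTarget { _ in
        player.pause()
        return .success
    }
}
```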