iOS Swift - Merging .wav files and converting them to .mp3

Lib*_*tal 18 audio mp3 lame ios swift

I want to merge two or more .wav files into one and then convert it to .mp3, and I want to do this in Swift (or at least have the option to include it in a Swift project).

Merging two .wav files in Swift is not a problem (here is my example). What I don't know is how to add the lame library to a Swift project and how to use it (how to change the Objective-C lame usage syntax so it can be used from Swift).

Since I was stuck on the Swift side, I tried the lame library with Objective-C. I found sample code that converts .caf to .mp3, so I gave it a try. This is what I tried:

- (void) toMp3
{
    NSString *cafFilePath = [[NSBundle mainBundle] pathForResource:@"sound" ofType:@"caf"];

    NSString *mp3FileName = @"Mp3File";
    mp3FileName = [mp3FileName stringByAppendingString:@".mp3"];
    NSString *mp3FilePath = [[NSHomeDirectory() stringByAppendingFormat:@"/Documents/"] stringByAppendingPathComponent:mp3FileName];

    NSLog(@"%@", mp3FilePath);

    @try {
        int read, write;

        FILE *pcm = fopen([cafFilePath cStringUsingEncoding:1], "rb");  //source
        fseek(pcm, 4*1024, SEEK_CUR);                                   //skip file header
        FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb");  //output

        const int PCM_SIZE = 8192;
        const int MP3_SIZE = 8192;
        short int pcm_buffer[PCM_SIZE*2];
        unsigned char mp3_buffer[MP3_SIZE];

        lame_t lame = lame_init();
        lame_set_in_samplerate(lame, 44100);
        lame_set_VBR(lame, vbr_default);
        lame_init_params(lame);

        do {
            read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
            if (read == 0)
                write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
            else
                write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);

            fwrite(mp3_buffer, write, 1, mp3);

        } while (read != 0);

        lame_close(lame);
        fclose(mp3);
        fclose(pcm);
    }
    @catch (NSException *exception) {
        NSLog(@"%@",[exception description]);
    }
    @finally {
        [self performSelectorOnMainThread:@selector(convertMp3Finish)
                               withObject:nil
                            waitUntilDone:YES];
    }
}

- (void) convertMp3Finish
{
}

But the result is just an .mp3 full of noise.

So I need to solve three problems:

  • Create a correct mp3 from a caf file in Objective-C
  • Change the code so it works with wav files
  • And change it so that it can be used from Swift

I know there are many questions about encoding and converting mp3 on iOS, but I cannot find one with a Swift example, and I cannot find an example with working Objective-C code (only the code above). Thanks for any help.

Lib*_*tal 11

I want to post my working solution, since I got so many upvotes and the answer from naresh didn't help me much.

  1. I generated the lame.framework library from this project: https://github.com/wuqiong/mp3lame-for-iOS
  2. I added the library to my Swift project (Build Phases -> Link Binary With Libraries)
  3. I created a wrapper for using the C functions in Objective-C, and through the bridging header I use it from Swift.
  4. For concatenating the wav files I use AVAssetExportSession in Swift

Now the source code. First the wrapper: it's a class that converts a .wav file to .mp3. There could be many improvements (for example parameters for the output file and other options), but I think everyone can adapt it. I guess this could be rewritten in Swift, but I wasn't sure how to do it. So it is an Objective-C class:

#import "AudioWrapper.h"
#import "lame/lame.h"

@implementation AudioWrapper

+ (void)convertFromWavToMp3:(NSString *)filePath {


    NSString *mp3FileName = @"Mp3File";
    mp3FileName = [mp3FileName stringByAppendingString:@".mp3"];
    NSString *mp3FilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:mp3FileName];

    NSLog(@"%@", mp3FilePath);

    @try {
        int read, write;

        FILE *pcm = fopen([filePath cStringUsingEncoding:1], "rb");  //source
        fseek(pcm, 4*1024, SEEK_CUR);                                   //skip file header
        FILE *mp3 = fopen([mp3FilePath cStringUsingEncoding:1], "wb");  //output

        const int PCM_SIZE = 8192;
        const int MP3_SIZE = 8192;
        short int pcm_buffer[PCM_SIZE*2];
        unsigned char mp3_buffer[MP3_SIZE];

        lame_t lame = lame_init();
        lame_set_in_samplerate(lame, 44100);
        lame_set_VBR(lame, vbr_default);
        lame_init_params(lame);

        do {
            read = fread(pcm_buffer, 2*sizeof(short int), PCM_SIZE, pcm);
            if (read == 0)
                write = lame_encode_flush(lame, mp3_buffer, MP3_SIZE);
            else
                write = lame_encode_buffer_interleaved(lame, pcm_buffer, read, mp3_buffer, MP3_SIZE);

            fwrite(mp3_buffer, write, 1, mp3);

        } while (read != 0);

        lame_close(lame);
        fclose(mp3);
        fclose(pcm);
    }
    @catch (NSException *exception) {
        NSLog(@"%@",[exception description]);
    }
    @finally {
        [self performSelectorOnMainThread:@selector(convertMp3Finish)
                               withObject:nil
                            waitUntilDone:YES];
    }
}

+ (void)convertMp3Finish
{
    // Callback dispatched to the main thread once encoding has finished.
}

@end
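A rough, untested sketch of what a Swift rewrite of that wrapper might look like (assuming lame.h is exposed through the bridging header shown further down; convertWavToMp3 is a placeholder name, not part of the original solution, and the snippet uses current Swift syntax rather than the Swift 1.x syntax of the rest of this answer):

import Foundation

// Sketch only: the same LAME loop driven directly from Swift.
// Paths, buffer sizes, and the 4 KB header skip mirror the Objective-C wrapper above.
func convertWavToMp3(wavPath: String, mp3Path: String) {
    guard let pcm = fopen(wavPath, "rb"),
          let mp3 = fopen(mp3Path, "wb") else { return }
    defer { fclose(pcm); fclose(mp3) }

    // Skip the file header exactly as the Objective-C wrapper does.
    fseek(pcm, 4 * 1024, SEEK_CUR)

    let pcmBufferSize = 8192
    let mp3BufferSize = 8192
    var pcmBuffer = [Int16](repeating: 0, count: pcmBufferSize * 2)
    var mp3Buffer = [UInt8](repeating: 0, count: mp3BufferSize)

    let lame = lame_init()
    lame_set_in_samplerate(lame, 44100)
    lame_set_VBR(lame, vbr_default)
    lame_init_params(lame)

    var read = 0
    repeat {
        // Read interleaved 16-bit stereo samples from the wav file.
        read = fread(&pcmBuffer, 2 * MemoryLayout<Int16>.size, pcmBufferSize, pcm)
        let write: Int32
        if read == 0 {
            write = lame_encode_flush(lame, &mp3Buffer, Int32(mp3BufferSize))
        } else {
            write = lame_encode_buffer_interleaved(lame, &pcmBuffer, Int32(read),
                                                   &mp3Buffer, Int32(mp3BufferSize))
        }
        fwrite(&mp3Buffer, Int(write), 1, mp3)
    } while read != 0

    lame_close(lame)
}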

The Swift AudioHelper class for concatenating audio files, with the method that calls the .wav-to-.mp3 conversion:

import UIKit
import AVFoundation


protocol AudioHelperDelegate {
    func assetExportSessionDidFinishExport(session: AVAssetExportSession, outputUrl: NSURL)
}

class AudioHelper: NSObject {

    var delegate: AudioHelperDelegate?

    func concatenate(audioUrls: [NSURL]) {

        //Create AVMutableComposition object. This object will hold our multiple AVMutableCompositionTracks.
        var composition = AVMutableComposition()
        var compositionAudioTrack:AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())

        //create new file to receive data
        var documentDirectoryURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask).first! as! NSURL
        var fileDestinationUrl = NSURL(fileURLWithPath: NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))
        println(fileDestinationUrl)

        StorageManager.sharedInstance.deleteFileAtPath(NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav"))

        var avAssets: [AVURLAsset] = []
        var assetTracks: [AVAssetTrack] = []
        var durations: [CMTime] = []
        var timeRanges: [CMTimeRange] = []

        var insertTime = kCMTimeZero

        for audioUrl in audioUrls {
            let avAsset = AVURLAsset(URL: audioUrl, options: nil)
            avAssets.append(avAsset)

            let assetTrack = avAsset.tracksWithMediaType(AVMediaTypeAudio)[0] as! AVAssetTrack
            assetTracks.append(assetTrack)

            let duration = assetTrack.timeRange.duration
            durations.append(duration)

            let timeRange = CMTimeRangeMake(kCMTimeZero, duration)
            timeRanges.append(timeRange)

            compositionAudioTrack.insertTimeRange(timeRange, ofTrack: assetTrack, atTime: insertTime, error: nil)
            insertTime = CMTimeAdd(insertTime, duration)
        }

        //AVAssetExportPresetPassthrough => concatenation
        var assetExport = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetPassthrough)
        assetExport.outputFileType = AVFileTypeWAVE
        assetExport.outputURL = fileDestinationUrl
        assetExport.exportAsynchronouslyWithCompletionHandler({
            self.delegate?.assetExportSessionDidFinishExport(assetExport, outputUrl: fileDestinationUrl!)
        })
    }

    func exportTempWavAsMp3() {

        let wavFilePath = NSTemporaryDirectory().stringByAppendingPathComponent("resultmerge.wav")
        AudioWrapper.convertFromWavToMp3(wavFilePath)
    }
}
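A hypothetical call site for the helper might look like the following (same Swift 1.x-era syntax as the class above; MergeViewController, first.wav and second.wav are placeholders, not part of the original solution):

import UIKit
import AVFoundation

// Hypothetical view controller that merges two bundled wav files and then
// converts the merged result to mp3 once the export finishes.
class MergeViewController: UIViewController, AudioHelperDelegate {

    let audioHelper = AudioHelper()

    func mergeAndConvert() {
        audioHelper.delegate = self
        let firstUrl = NSBundle.mainBundle().URLForResource("first", withExtension: "wav")!
        let secondUrl = NSBundle.mainBundle().URLForResource("second", withExtension: "wav")!
        audioHelper.concatenate([firstUrl, secondUrl])
    }

    // Called by AudioHelper once AVAssetExportSession has written resultmerge.wav.
    func assetExportSessionDidFinishExport(session: AVAssetExportSession, outputUrl: NSURL) {
        audioHelper.exportTempWavAsMp3()
    }
}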

The bridging header contains:

#import "lame/lame.h"
#import "AudioWrapper.h"
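(If Xcode does not create and wire up the bridging header automatically, its path has to be set manually in the target's Objective-C Bridging Header build setting, SWIFT_OBJC_BRIDGING_HEADER.)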


nar*_*esh 6

We have dedicated classes for reading and writing media from/to files, AVAssetReader and AVAssetWriter, and with the help of AVAssetExportSession you can export it as an mp3 file. Or you can use https://github.com/michaeltyson/TPAACAudioConverter
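As a sketch of that suggestion: note that AVAssetExportSession's built-in presets produce AAC (.m4a) output rather than mp3, so the minimal export below writes .m4a (current Swift syntax; exportAsM4A and the URLs are hypothetical, not part of this answer):

import AVFoundation

// Minimal sketch: export an audio asset with AVAssetExportSession.
// The AppleM4A preset writes AAC into an .m4a container; names and paths are placeholders.
func exportAsM4A(from sourceURL: URL, to destinationURL: URL,
                 completion: @escaping () -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetAppleM4A) else {
        return
    }
    export.outputFileType = .m4a
    export.outputURL = destinationURL
    export.exportAsynchronously(completionHandler: completion)
}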