I am developing an application in which I need to pass captured audio through to the output audio jack while simultaneously recording and saving video. I studied Apple's aurioTouch sample code and implemented the audio pass-through. I also implemented video recording with AVCaptureSession. Each of these two features works perfectly on its own, but when I combine them the audio pass-through stops working, because the audio session is taken over by AVCaptureSession. I have also tried passing the audio data through from the AVCaptureSession delegate method. Here is my code:
OSStatus err = noErr;

// Pull the PCM data for this sample buffer into an AudioBufferList backed by a retained CMBlockBuffer.
AudioBufferList audioBufferList;
CMBlockBufferRef blockBuffer;
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

CMItemCount numberOfFrames = CMSampleBufferGetNumSamples(sampleBuffer); // corresponds to the number of CoreAudio audio frames

// Build a sample-time-only timestamp for the render call.
currentSampleTime += (double)numberOfFrames;
AudioTimeStamp timeStamp;
memset(&timeStamp, 0, sizeof(AudioTimeStamp));
timeStamp.mSampleTime = currentSampleTime;
timeStamp.mFlags |= kAudioTimeStampSampleTimeValid;

AudioUnitRenderActionFlags flags = 0;

// App delegate from the aurioTouch sample; render the RemoteIO input bus (1) into the buffer list.
aurioTouchAppDelegate *THIS = (aurioTouchAppDelegate *)[[UIApplication sharedApplication] delegate];
err = AudioUnitRender(self.rioUnit, &flags, &timeStamp, 1, numberOfFrames, &audioBufferList);
if (err) { printf("PerformThru: error %d\n", …
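For context, the usual cause of this kind of conflict is that AVCaptureSession reconfigures the shared audio session when it starts running, which silences a RemoteIO-based pass-through. The following is only an illustrative sketch of one workaround, not code from the question: it assumes iOS 7 or later and a hypothetical captureSession object created elsewhere, and it configures the application audio session for play-and-record while telling the capture session not to touch it.

// Sketch only (assumes iOS 7+). Requires <AVFoundation/AVFoundation.h>.
// captureSession is a hypothetical AVCaptureSession set up elsewhere.
NSError *sessionError = nil;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                    error:&sessionError];
[audioSession setActive:YES error:&sessionError];

if ([captureSession respondsToSelector:@selector(setUsesApplicationAudioSession:)]) {
    captureSession.usesApplicationAudioSession = YES;                    // share the app's audio session
    captureSession.automaticallyConfiguresApplicationAudioSession = NO;  // keep the capture session from reconfiguring it
}

With that configuration the RemoteIO unit and the capture session share the same, manually configured audio session, which is one way to keep the pass-through alive while recording.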
I am using this function to upload an image to a server as JSON. To do so, I first convert the image to NSData and then to an NSString using Base64. The approach works fine when the image is not very large, but when I try to upload a 2 MB image it breaks: even though the didReceiveResponse and didReceiveData delegate methods are called, the server never receives my image and gets (null). At first I thought it was a timeout problem, but even setting the timeout to 1000.0 did not help. Any ideas? Thank you for your time!
Here is my current code:
- (void) imageRequest {
    // Build the POST request for the upload endpoint.
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://www.myurltouploadimage.com/services/v1/upload.json"]];

    // Load the previously saved PNG from the Documents directory.
    NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *path = [NSString stringWithFormat:@"%@/design%i.png", docDir, designNum];
    NSLog(@"%@", path);
    NSData *imageData = UIImagePNGRepresentation([UIImage imageWithContentsOfFile:path]);

    // Encode the image bytes as a Base64 string (Base64 is a helper class, not part of Foundation).
    [Base64 initialize];
    NSString *imageString = [Base64 encode:imageData];

    // Wrap the encoded string in a JSON body of the form { "design" : "<base64 string>" }.
    NSArray *keys = [NSArray arrayWithObjects:@"design", nil];
    NSArray *objects = [NSArray arrayWithObjects:imageString, nil];
    NSDictionary *jsonDictionary = [NSDictionary dictionaryWithObjects:objects forKeys:keys];
    NSError *error;
    NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDictionary options:kNilOptions error:&error];

    [request setHTTPMethod:@"POST"]; …
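The snippet above is cut off before the JSON body is attached to the request. As a rough sketch only (not the asker's code; the header values and timeout are assumptions), the remainder of such a request is commonly completed and sent as shown below. Note that Base64-encoding a multi-megabyte image inflates the body by roughly a third, so a streamed or multipart upload is usually more robust for large files than Base64-in-JSON.

// Sketch only: attach the JSON body built above and send the request asynchronously.
[request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
[request setValue:[NSString stringWithFormat:@"%lu", (unsigned long)[jsonData length]] forHTTPHeaderField:@"Content-Length"];
[request setHTTPBody:jsonData];
[request setTimeoutInterval:120.0];    // assumed value; the question mentions trying 1000.0

// NSURLConnection starts loading immediately with this initializer; the delegate
// then receives didReceiveResponse / didReceiveData as described in the question.
NSURLConnection *connection = [[NSURLConnection alloc] initWithRequest:request delegate:self];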
I am developing an application that needs to check whether a string has already been saved to the database. This seems like a simple operation, but it takes half a second to return a response, which I think is far too long. My question is whether there is any way to shorten that time. Thanks for your attention.
Here is my current code:
- (BOOL) isDeleted:(int)i {
    // Wildcard pattern matched against the "deletedpics" attribute, e.g. "deleted.*.42".
    NSString *value = [NSString stringWithFormat:@"deleted.*.%i", i];

    MyAppDelegate *appDelegate = (MyAppDelegate *)[[UIApplication sharedApplication] delegate];
    NSManagedObjectContext *context = [appDelegate managedObjectContext];

    // Fetch every "Deleted" object whose deletedpics attribute matches the pattern.
    NSString *entityName = @"Deleted";
    NSEntityDescription *entityDesc = [NSEntityDescription entityForName:entityName inManagedObjectContext:context];
    NSFetchRequest *request = [[NSFetchRequest alloc] init];
    [request setEntity:entityDesc];
    NSPredicate *pred = [NSPredicate predicateWithFormat:@"(deletedpics like %@)", value];
    [request setPredicate:pred];

    NSError *error;
    NSArray *objects = [context executeFetchRequest:request error:&error];

    // The string is considered deleted if at least one matching object was found.
    BOOL returnBool = [objects count] >= 1 ? YES : NO;
    return returnBool;
}
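A commonly suggested way to speed up an existence check like this (a sketch reusing the entityDesc, pred, and context from the code above, not the asker's code) is to ask Core Data for a count instead of fetching the matching objects, and to stop at the first match. Indexing the deletedpics attribute in the data model, and replacing the wildcard "like" pattern with an exact comparison where possible, tend to help as well.

// Sketch only: count matching rows instead of materializing managed objects.
NSFetchRequest *countRequest = [[NSFetchRequest alloc] init];
[countRequest setEntity:entityDesc];
[countRequest setPredicate:pred];
[countRequest setFetchLimit:1];            // one matching row is enough for a YES/NO answer
[countRequest setIncludesSubentities:NO];

NSError *countError = nil;
NSUInteger matches = [context countForFetchRequest:countRequest error:&countError];
BOOL exists = (matches != NSNotFound && matches > 0);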
The result I need is the following:
(
"some_key" = {
"another_key" = "another_value";
};
);
To achieve that I have this code, but it does not work:
NSDictionary *dictionary = [[NSDictionary alloc] initWithObjectsAndKeys:@"another_value", @"another_key", nil];
NSMutableArray *array = [[NSMutableArray alloc] init];
[array setValue:dictionary forKey:@"some_key"];
Any ideas? Thanks!
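For what it is worth, setValue:forKey: sent to an NSMutableArray does not insert an entry; it forwards setValue:forKey: to each element of the (here still empty) array, so nothing is added. A minimal sketch of one way to build the structure shown above (an illustration, not code from the question):

// Sketch: an array whose single element is a dictionary mapping "some_key" to the inner dictionary.
NSDictionary *inner = [NSDictionary dictionaryWithObjectsAndKeys:@"another_value", @"another_key", nil];
NSDictionary *outer = [NSDictionary dictionaryWithObjectsAndKeys:inner, @"some_key", nil];
NSMutableArray *result = [NSMutableArray arrayWithObject:outer];
NSLog(@"%@", result);   // logs the nested structure shown above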
Looking at the list of Socket.IO implementations in other languages, there are two options for Objective-C. My question is about the advantages and disadvantages of these two libraries. socket.IO-objc seems more complete and better documented, but I would also like to know what advantages AZSocketIO offers, to decide whether they are enough to pick it for my project. If anyone has used either of them and can give me some advice, I would really appreciate it. Thanks!