I am developing an application that will not work if it is terminated; it has some background tasks. I want to show a local notification when the app gets terminated. Some apps do this, which means it must be possible, but I cannot find a way to do it.
I tried setting up a local notification in the app delegate's applicationWillTerminate: method, and I also added an observer for the app-termination notification in my view controller, but when the app is actually terminated, neither method gets called.
- (void)applicationWillTerminate:(UIApplication *)application
{
    // Called when the application is about to terminate. Save data if appropriate.
    // See also applicationDidEnterBackground:.
    NSLog(@"terminated");

    UIApplication *app = [UIApplication sharedApplication];
    // Intended to fire 15 seconds after termination.
    NSDate *date = [[NSDate date] dateByAddingTimeInterval:15];

    UILocalNotification *alarm = [[UILocalNotification alloc] init];
    if (alarm) {
        alarm.fireDate = date;
        alarm.timeZone = [NSTimeZone defaultTimeZone];
        alarm.repeatInterval = 0;
        alarm.alertBody = @"This app does not work if terminated";
        alarm.alertAction = @"Open";
        [app scheduleLocalNotification:alarm];
        // Also try to present one immediately.
        [app presentLocalNotificationNow:alarm];
    }
}
Any help would be great.
Thanks in advance!!!
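For context, the workaround usually suggested for this situation, sketched below, is to schedule the notification from applicationDidEnterBackground: (which is reliably called) and cancel it again when the app becomes active, since applicationWillTerminate: is not guaranteed to run when a suspended app is killed. This is only an illustrative sketch, not the original poster's code; the 15-second delay and the cancel-all call are assumptions.

// Sketch of the workaround, in the app delegate.
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    // Schedule while the app is still running; the system stores the
    // notification, so it fires even if the app is killed afterwards.
    UILocalNotification *alarm = [[UILocalNotification alloc] init];
    alarm.fireDate = [[NSDate date] dateByAddingTimeInterval:15]; // illustrative delay
    alarm.timeZone = [NSTimeZone defaultTimeZone];
    alarm.alertBody = @"This app does not work if terminated";
    alarm.alertAction = @"Open";
    [application scheduleLocalNotification:alarm];
}

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    // The app is running again, so the warning is no longer needed.
    [application cancelAllLocalNotifications];
}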
I have a project in which I have to record sound from a Bluetooth headset and play it back through the iPhone's built-in speaker. I searched a lot and came up with this code.
UInt32 allowBluetoothInput = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryEnableBluetoothInput,
                        sizeof(allowBluetoothInput),
                        &allowBluetoothInput);
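As a side note, a minimal sketch of the same routing done with the AVAudioSession API is shown below; this is an assumption about the intended setup (record from the Bluetooth headset, play through the built-in speaker), not code from the question.

#import <AVFoundation/AVFoundation.h>

NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// PlayAndRecord is needed for simultaneous input and output;
// AllowBluetooth makes Bluetooth headsets available as an input route.
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionAllowBluetooth
               error:&sessionError];

// Force playback out of the built-in speaker instead of the headset.
[session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker
                           error:&sessionError];

[session setActive:YES error:&sessionError];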
------------ Audio recorder start and stop code ------------
- (IBAction)Record:(id)sender
{
    UIButton *btn = (UIButton *)sender;
    if ([btn isSelected])
    {
        [audioRecorder stop];
        [btn setSelected:NO];
        [btn setTitle:@"Start Recording" forState:UIControlStateNormal];
    }
    else
    {
        [audioRecorder record];
        [btn setSelected:YES];
        [btn setTitle:@"Stop Recording" forState:UIControlStateNormal];
    }
}
I am using AVAudioRecorder after this. There seems to be something else I am missing here.
-------- Recorder setup code --------
NSURL *soundFileURL = [NSURL fileURLWithPath:AUDIO_FILE];

// Note: 16 looks like an intended bit depth (AVLinearPCMBitDepthKey) rather than a
// valid encoder bit rate; it is left here as in the original question.
NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
                                [NSNumber numberWithInt:16], AVEncoderBitRateKey,
                                [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                                [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                nil];

NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc] …

I am sharing a video on Facebook using Facebook SDK 3.1.1. The code I am using is given below.
FBRequestConnection *newConnection = [[FBRequestConnection alloc] init];

// For each fbid in the array, we create a request object to fetch
// the profile, along with a handler to respond to the results of the request.
NSString *fbid = @"me";

// Create a handler block to handle the results of the request for fbid's profile.
FBRequestHandler handler =
    ^(FBRequestConnection *connection, id result, NSError *error) {
        // Output the results of the request.
        [self requestCompleted:connection forFbID:fbid result:result error:error];
    };
…

I am capturing video in my iOS application using AVCaptureConnection. Afterwards I add some images to the video as CALayers. Everything works fine, but I get a black frame at the very end of the resulting video; no frames carrying the actual audio/video are affected. For the audio, I extract it, change its pitch, and then add it back using AVMutableComposition. Below is the code I am using. Please help me figure out what I am doing wrong, or what else I need to add.
cmp = [AVMutableComposition composition];
AVMutableCompositionTrack *videoComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioComposition = [cmp addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *sourceAudioTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

[videoComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration])
                          ofTrack:sourceVideoTrack
                           atTime:kCMTimeZero
                            error:nil];
[audioComposition insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration])
                          ofTrack:sourceAudioTrack
                           atTime:kCMTimeZero
                            error:nil];

animComp = [AVMutableVideoComposition videoComposition];
animComp.renderSize = CGSizeMake(320, 320);
animComp.frameDuration = CMTimeMake(1, 30);
animComp.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

// to gather the audio part of the video
NSArray *tracksToDuck = [cmp tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *trackMixArray = [NSMutableArray array];
for …

I have this code in which I am trying to create a new customer on my site using the Prestashop schema, but I keep getting an error in the response.
// Load the XML body from a bundled template file.
NSString *xmlPath = [[NSBundle mainBundle] pathForResource:@"Login" ofType:@"xml"];
NSString *xmlStr = [[NSString alloc] initWithContentsOfFile:xmlPath encoding:NSUTF8StringEncoding error:nil];

// Percent-escape the XML so it can be appended as a query-string parameter.
NSString *encodedurlstring = (__bridge NSString *)CFURLCreateStringByAddingPercentEscapes(NULL,
                                 (__bridge CFStringRef)xmlStr,
                                 NULL,
                                 (CFStringRef)@"!*'();:@&=+$,/?%#[]",
                                 kCFStringEncodingUTF8);

NSString *urlStr = [NSString stringWithFormat:@"http://passkey:@farma-web.it/api/customers/?Xml=%@", encodedurlstring];
NSURL *webURL = [NSURL URLWithString:urlStr];

NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:webURL];
[request setHTTPMethod:@"POST"];
[request setValue:@"text/xml" forHTTPHeaderField:@"Content-Type"];

// Send the request synchronously and log the raw response.
NSData *returnData = [NSURLConnection sendSynchronousRequest:request returningResponse:nil error:nil];
NSString *response = [[NSString alloc] initWithData:returnData encoding:NSUTF8StringEncoding];
NSLog(@"response - %@", response);
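As a side note before the XML below: a minimal sketch of sending the same XML as the HTTP body instead of a percent-escaped query-string parameter. Whether the web service expects it this way is an assumption, not something confirmed in the question; it reuses the xmlStr loaded above.

// Sketch: post xmlStr (loaded above) as the request body rather than a ?Xml= parameter.
NSMutableURLRequest *bodyRequest = [NSMutableURLRequest requestWithURL:
    [NSURL URLWithString:@"http://passkey:@farma-web.it/api/customers/"]];
[bodyRequest setHTTPMethod:@"POST"];
[bodyRequest setValue:@"text/xml" forHTTPHeaderField:@"Content-Type"];
[bodyRequest setHTTPBody:[xmlStr dataUsingEncoding:NSUTF8StringEncoding]];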
The XML I am attaching is:
<prestashop>
<customers>
<customer>**I DO NOT KNOW WHAT TO WRITE HERE**</customer>
<email>abc@abc.com</email>
<passwd>12344321</passwd> …

I have to detect the face shape - oval, rectangular, diamond, heart - in my app. I have done face detection using OpenCV and the Core Image framework, but it does not tell me the type of the face.
What I really need to do is detect the type of face and then show celebrity faces that have the same face shape, to display a match with her/his face.
Any help would be great.
Thanks in advance!
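There is no built-in API that classifies face shape, so anything here can only be a rough sketch: the snippet below shows how the face bounding box can be read with Core Image's CIDetector, plus a made-up aspect-ratio heuristic standing in for a real shape classifier. The photo variable, the threshold, and the labels are invented for illustration.

#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

// photo is assumed to be the UIImage being analysed.
CIImage *ciImage = [CIImage imageWithCGImage:photo.CGImage];
CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                          context:nil
                                          options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];

for (CIFaceFeature *face in [detector featuresInImage:ciImage]) {
    // Placeholder heuristic: the bounding-box aspect ratio is one crude input a
    // shape classifier could use; 0.85 is an arbitrary illustrative threshold.
    CGFloat aspect = face.bounds.size.width / face.bounds.size.height;
    NSString *guess = (aspect > 0.85) ? @"round/oval-ish" : @"long/rectangular-ish";
    NSLog(@"face at %@ -> %@ (aspect %.2f)", NSStringFromCGRect(face.bounds), guess, aspect);
}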
I want to create a drawing game in which multiple users can play and guess the picture that one player is drawing. How can I sync the drawing from one device to other devices around the world?
I am using iCarousel with the type set to iCarouselTypeRotary. My problem is that the images keep repeating: I have only 4 images, and after the last image the first one repeats and the carousel keeps going. I want it to stop, the way the iCarouselTypeCoverFlow2 type does. Please help; I cannot use any other type, I want this style only for the carousel.
Thanks in advance.
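For reference, a minimal sketch of how wrapping is usually turned off in iCarousel: the library exposes an iCarouselOptionWrap option through its carousel:valueForOption:withDefault: delegate method. Whether this is available depends on the iCarousel version being used, so treat it as an assumption to verify.

// In the iCarousel delegate: return NO for the wrap option so the rotary
// carousel stops at the last item instead of looping back to the first one.
- (CGFloat)carousel:(iCarousel *)carousel valueForOption:(iCarouselOption)option withDefault:(CGFloat)value
{
    if (option == iCarouselOptionWrap) {
        return NO;
    }
    return value;
}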