My iPhone app needs to resolve an address from the user's latitude and longitude. reverseGeocodeLocation works fine, but the results come back in English.
Is there a way to localize the results into other languages?
I couldn't find anything about this from Apple or anywhere else.
The code I'm using is:
CLGeocoder *geocoder = [[[CLGeocoder alloc] init] autorelease];
CLLocation *location = [[[CLLocation alloc]
    initWithLatitude:coord.latitude longitude:coord.longitude] autorelease];
[geocoder reverseGeocodeLocation:location completionHandler:^(NSArray *placemarks, NSError *error) {
    NSLog(@"reverseGeocodeLocation:completionHandler: Completion Handler called!");
    if (error) {
        NSLog(@"Geocode failed with error: %@", error);
        [self displayError:error];
        return;
    }
    if (placemarks && placemarks.count > 0) {
        // Use the first (best) match to build a display address
        CLPlacemark *topResult = [placemarks objectAtIndex:0];
        NSString *addressTxt = [NSString stringWithFormat:@"%@ %@, %@ %@",
            [topResult subThoroughfare], [topResult thoroughfare],
            [topResult locality], [topResult administrativeArea]];
    }
}];
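Two approaches are commonly suggested. On iOS 11 and later, CLGeocoder accepts a locale directly via reverseGeocodeLocation:preferredLocale:completionHandler:, which is the supported way to request localized placemarks. The sketch below assumes that API is available (the fr_FR locale is just an example):

CLGeocoder *geocoder = [[CLGeocoder alloc] init];
CLLocation *location = [[CLLocation alloc] initWithLatitude:coord.latitude
                                                  longitude:coord.longitude];
// Ask for results localized to French (any locale identifier works here)
NSLocale *french = [NSLocale localeWithLocaleIdentifier:@"fr_FR"];
[geocoder reverseGeocodeLocation:location
                 preferredLocale:french
               completionHandler:^(NSArray<CLPlacemark *> *placemarks, NSError *error) {
    if (placemarks.count > 0) {
        NSLog(@"Localized locality: %@", placemarks[0].locality);
    }
}];

On older systems the workaround people reported was temporarily overriding the "AppleLanguages" key in NSUserDefaults before geocoding, so the geocoder picks up the desired language; that trick is undocumented and unsupported, so treat it with caution.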
I'm trying to build a karaoke app that records backing music from a file together with the microphone. I also want to apply a filter effect to the microphone input.
I can do all of the above with The Amazing Audio Engine SDK, but I can't figure out how to add the microphone input as a channel so that I can apply a filter to it (and not to the background music).
Any help would be appreciated.
My current recording code:
- (void)beginRecording {
    // Init recorder
    self.recorder = [[AERecorder alloc] initWithAudioController:_audioController];
    NSString *documentsFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)
                                    objectAtIndex:0];
    NSString *filePath = [documentsFolder stringByAppendingPathComponent:@"Recording.aiff"];
    // Start the recording process
    NSError *error = NULL;
    if ( ![_recorder beginRecordingToFileAtPath:filePath
                                       fileType:kAudioFileAIFFType
                                          error:&error] ) {
        // Report error
        return;
    }
    // Receive both audio input and audio output. Note that if you're using
    // AEPlaythroughChannel, mentioned above, you may not need to receive the input again.
    [_audioController addInputReceiver:_recorder];
    [_audioController addOutputReceiver:_recorder];
}
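One way to get a filterable mic channel with The Amazing Audio Engine is to route the input through an AEPlaythroughChannel, which turns the microphone into a regular channel, and then attach an AEAudioUnitFilter to that channel only. The sketch below assumes TAAE 1.x initializer signatures and uses Apple's Reverb2 Audio Unit purely as an example effect:

// Route the mic through a playthrough channel so it becomes a regular channel
self.playthrough = [[AEPlaythroughChannel alloc] initWithAudioController:_audioController];
[_audioController addInputReceiver:_playthrough];   // feed mic audio into the channel
[_audioController addChannels:@[_playthrough]];     // make it part of the channel graph

// Apply a reverb Audio Unit to the mic channel only; the backing-track
// channel is left untouched
AudioComponentDescription reverb = AEAudioComponentDescriptionMake(
    kAudioUnitManufacturer_Apple, kAudioUnitType_Effect, kAudioUnitSubType_Reverb2);
NSError *error = nil;
AEAudioUnitFilter *filter = [[AEAudioUnitFilter alloc] initWithComponentDescription:reverb
                                                                    audioController:_audioController
                                                                              error:&error];
if (filter) {
    [_audioController addFilter:filter toChannel:_playthrough];
}

Since the recorder above is added as an output receiver, it should then capture the filtered mic mixed with the backing music.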
I have an app that repeatedly prints the following error to the log:
<Error>: ImageIO: readTag : tag-id '0000' is bad (type = 1 count = 1) - ignoring...
and
<Error>: ImageIO: processAPP1 Failed to read tag #4 in mainIFD.
Can anyone point out how to fix it and why it happens?
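These messages usually mean the image being decoded carries malformed EXIF/TIFF metadata in its APP1 segment; ImageIO logs the bad tags and skips them, so decoding itself typically still succeeds. If the noise matters, one workaround is to re-encode the image, which drops the metadata along with the broken tags. A minimal sketch, assuming the offending file's path is known (the variable name is a placeholder):

// Re-encoding a UIImage does not preserve EXIF metadata, so the malformed
// tags that trigger the ImageIO warnings are discarded
UIImage *original = [UIImage imageWithContentsOfFile:imagePath];
NSData *clean = UIImageJPEGRepresentation(original, 0.9);
[clean writeToFile:imagePath atomically:YES];

Note this loses all metadata (orientation flags, GPS, etc.), so only do it when that is acceptable.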
我使用在[此处] [1]找到的代码将图像附加到使用拍摄的视频上UIImagePickerController。
视频是纵向的,并且可以正常播放,但是一旦我使用AVURLASSet它,就将其方向改为横向而不是纵向,我找不到原因了吗?
谁能指出我正确的方向?
我的代码:
-(IBAction)addWaterMark:(id)sender {
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:tempPath] options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:clipVideoTrack
                                    atTime:kCMTimeZero
                                     error:nil];
    [compositionVideoTrack setPreferredTransform:clipVideoTrack.preferredTransform];
    CGSize videoSize = [videoAsset naturalSize];
    NSLog(@"%f %f", videoSize.width, videoSize.height);
}
At this point the NSLog prints 480 360 instead of the correct size of the video at tempPath.
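The likely explanation is that naturalSize reports the video's stored pixel dimensions before the track's preferredTransform is applied: iPhone cameras record portrait video as landscape frames (e.g. 480 x 360) plus a 90-degree rotation transform. A sketch of recovering the displayed size, assuming the asset's first video track carries the rotation:

// The track stores portrait video as landscape pixels plus a rotation in
// preferredTransform; apply the transform to naturalSize to get the size
// the video is actually displayed at
AVAssetTrack *track = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize natural = track.naturalSize;   // e.g. 480 x 360 for a portrait recording
CGSize transformed = CGSizeApplyAffineTransform(natural, track.preferredTransform);
CGSize displaySize = CGSizeMake(fabs(transformed.width), fabs(transformed.height)); // e.g. 360 x 480

The absolute values are needed because a rotation transform can produce negative components.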