AVPlayer plays on the simulator but not on a real device - objective-c

I'm implementing a basic audio player in order to play remote audio files. The files are in MP3 format.
The code I wrote works fine on the simulator but not on a real device. However, the same URL works fine when I load it in Safari on the same device, so I can't see what I'm missing.
Below is my code:
self.musicPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:urlTrack]];
[self.musicPlayer play];
Something extremely simple. The music player property is defined as:
@property (nonatomic, retain) AVPlayer *musicPlayer;
I also tried using an AVPlayerItem but the result is the same. Here is the code I have used
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:[NSURL URLWithString:urlTrack]];
self.musicPlayer = [AVPlayer playerWithPlayerItem:playerItem];
[self.musicPlayer play];
Finally I tried to use the code below
self.musicPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:urlTrack]];
NSLog(#"Player created:%d",self.musicPlayer.status);
[self.musicPlayer addObserver:self forKeyPath:#"status" options:0 context:nil];
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    NSLog(@"Player created: %d", self.musicPlayer.status);
    if (object == self.musicPlayer && [keyPath isEqualToString:@"status"]) {
        if (self.musicPlayer.status == AVPlayerStatusReadyToPlay) {
            [self.musicPlayer play];
        } else if (self.musicPlayer.status == AVPlayerStatusFailed) {
            // something went wrong
        }
    }
}
When observeValueForKeyPath: is invoked, the player status is 1 (AVPlayerStatusReadyToPlay) and play is executed, but there is still no sound.
I tried several files like:
http://www.nimh.nih.gov/audio/neurogenesis.mp3
http://www.robtowns.com/music/blind_willie.mp3
Any idea?
Thanks!

Check the spelling of your filename. The device's file system is case-sensitive; the simulator's is not...
Also, check whether the ring/silent switch is set to silent: with the default audio session category you won't hear any sound while it is. To prevent that, use
NSError *_error = nil;
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: &_error];
right before you initialize the player.
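Putting the two pieces together, a minimal sketch (assuming urlTrack holds the remote MP3 URL string, as in the question):
// Configure the audio session before creating the player, so playback
// stays audible even when the ring/silent switch is set to silent.
NSError *_error = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&_error];
[[AVAudioSession sharedInstance] setActive:YES error:&_error];
if (_error) {
    NSLog(@"Audio session error: %@", _error);
}
self.musicPlayer = [AVPlayer playerWithURL:[NSURL URLWithString:urlTrack]];
[self.musicPlayer play];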

I tried all of these things, but none of them worked. The only thing that worked for me was setting the AVAudioSession category to playback.
Here is the code I used (Swift):
private var audioPlayer: AVAudioPlayer!

func play(name: String) {
    guard let path = Bundle.main.path(forResource: name, ofType: "mp3") else {
        print("cannot find path")
        return
    }
    let url = URL(fileURLWithPath: path)
    do {
        try AVAudioSession.sharedInstance().setCategory(.playback)
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer.prepareToPlay()
        audioPlayer.play()
    } catch {
        print("something went wrong: \(error)")
    }
}
Please do not omit this line from the code above:
try AVAudioSession.sharedInstance().setCategory(.playback)
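Depending on the device state, you may also need to activate the session before starting playback, with try AVAudioSession.sharedInstance().setActive(true).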

Related

Can't load Safari contentBlocker because the app group's NSUserDefaults can't be accessed

I am making an iOS 9 Safari AdBlocker app.
I am using the AdBlockPlusSafari module.
The app works fine on the simulator.
But when I try to run it on a device (iPhone 6), it fails to reload the contentBlocker.
[SFContentBlockerManager reloadContentBlockerWithIdentifier:self.contentBlockerIdentifier
                                          completionHandler:^(NSError *error) {
    if (error) {
        NSLog(@"Error in reloadContentBlocker: %@", error);
    }
    dispatch_async(dispatch_get_main_queue(), ^{
        wSelf.reloading = NO;
        [wSelf checkActivatedFlag];
        if (completion) {
            completion(error);
        }
    });
}];
It gives this error:
Error Domain=ContentBlockerErrorDomain Code=3 "(null)"
It is caused by accessing the values in NSUserDefaults (the App Group suite).
- (instancetype)init
{
    if (self = [super init])
    {
        _bundleName = [[[[[NSBundle mainBundle] bundleIdentifier] componentsSeparatedByString:@"."] subarrayWithRange:NSMakeRange(0, 2)] componentsJoinedByString:@"."];
        NSString *group = [NSString stringWithFormat:@"group.%@.%@", _bundleName, @"AdBlockerPro"];
        NSLog(@"Group name: %@", group);
        _adblockProDetails = [[NSUserDefaults alloc] initWithSuiteName:group];
        [_adblockProDetails registerDefaults:
            @{ AdblockProActivated: @NO,
               AdblockProEnabled: @YES }];
        _enabled = [_adblockProDetails boolForKey:AdblockProEnabled];
        _activated = [_adblockProDetails boolForKey:AdblockProActivated];
    }
    return self;
}
The App Group name is the same in the host app and the Safari extension, but when the extension accesses the settings in NSUserDefaults, it gives me this error.
In the project settings under Capabilities, I set up everything for App Groups, and the app ID includes the app group name exactly.
This happens only on a device; on the simulator it works fine. I can't find the reason for this error.
Please help me if you have experienced this. Looking forward to your help.
I found the reason myself.
I had put an NSMutableDictionary in the app group container and was writing a file into the extension bundle, which is prohibited by Apple.
So I deleted everything from AdBlockManager (which interacts with the app group) except the Boolean flag variables, and moved the file handling to NSFileManager, following http://www.atomicbird.com/blog/sharing-with-app-extensions
Finally, the app extension is working for me on the device.
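For reference, a minimal sketch of the NSFileManager approach described above; the group identifier and file name are placeholders, not taken from the project:
// Write shared data into the app group container instead of the
// extension bundle (writing into a bundle is what Apple prohibits).
NSURL *container = [[NSFileManager defaultManager]
    containerURLForSecurityApplicationGroupIdentifier:@"group.com.example.AdBlockerPro"]; // placeholder
NSURL *fileURL = [container URLByAppendingPathComponent:@"blockerList.json"];
NSData *data = [NSJSONSerialization dataWithJSONObject:@{ @"enabled": @YES }
                                               options:0
                                                 error:NULL];
[data writeToURL:fileURL atomically:YES];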
Good luck!

iCloud key-value sync iOS8 Xcode 6

Trying to fix an issue with iCloud. Here are two versions of my code.
- (void)viewDidLoad {
    [super viewDidLoad];
    [self checkICloudData];
}
version 1
- (void)checkICloudData
{
    NSFileManager *fileManager = [NSFileManager defaultManager];
    NSURL *iCloudURL = [fileManager URLForUbiquityContainerIdentifier:nil];
    NSLog(@"iCloud URL is %@", [iCloudURL absoluteString]);
    if (iCloudURL) {
        NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(updateICloudData:)
                                                     name:NSUbiquitousKeyValueStoreDidChangeExternallyNotification
                                                   object:store];
        [store synchronize];
    } else {
        NSLog(@"iCloud is not supported or enabled");
        [self loadDataFromBundle];
    }
}
iCloudURL always returns nil, and the other methods are never called.
version 2
- (void)checkICloudData
{
    NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(updateICloudData:)
                                                 name:NSUbiquitousKeyValueStoreDidChangeExternallyNotification
                                               object:store];
    [store synchronize];
}
- (void)updateICloudData:(NSNotification *)notification
{
    NSDictionary *userInfo = [notification userInfo];
    NSNumber *changeReason = [userInfo objectForKey:NSUbiquitousKeyValueStoreChangeReasonKey];
    NSInteger reason = -1;
    if (!changeReason) {
        return;
    } else {
        reason = [changeReason integerValue];
    }
    if ((reason == NSUbiquitousKeyValueStoreServerChange) || (reason == NSUbiquitousKeyValueStoreInitialSyncChange)) {
        NSArray *changedKeys = [userInfo objectForKey:NSUbiquitousKeyValueStoreChangedKeysKey];
        for (NSString *key in changedKeys) {
            if ([key isEqualToString:casesKey]) {
                [self iCloudNotification];
            } else {
                [self loadDataFromBundle];
            }
        }
    }
}
With that version, iCloud sync works fine on the iPad but doesn't work on the iPhone. In the iPhone's iCloud Drive settings I see my app, just as on the iPad.
My iCloud settings:
Member Center - iCloud enabled; Compatibility - Include CloudKit support
Target Capabilities - under Services, only Key-Value storage is checked
The entitlements dictionary created by default contains the key com.apple.developer.icloud-container-identifiers with an empty array, and the key com.apple.developer.ubiquity-kvstore-identifier with the correct string value of my app ID
So why does iCloudURL always return nil? And why does the second version work correctly on the iPad while my iPhone never receives NSUbiquitousKeyValueStoreDidChangeExternallyNotification?
Well, the problem was with my iPhone after an iOS upgrade.
When I deleted the iCloud profile from the iPhone's settings and added it again, the iPhone started to sync with iCloud.
Now the second version of the code works on both devices.
Hope it helps somebody solve this kind of problem.
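For reference, a minimal sketch of pushing a value through the key-value store once syncing works (casesKey is the key from the code above); a change made on one device arrives on the other via the externally-changed notification observed in version 2:
NSUbiquitousKeyValueStore *store = [NSUbiquitousKeyValueStore defaultStore];
[store setObject:@"some value" forKey:casesKey]; // queued for upload to iCloud
[store synchronize];                             // hint that now is a good time to sync
NSString *value = [store stringForKey:casesKey]; // local read, always available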

Play a paused AVAudioRecorder file

in my program I want the user to be able to:
record his voice,
pause the recording process,
listen to what he recorded
and then continue recording.
I have managed to get to the point where I can record and play the recordings with AVAudioRecorder and AVAudioPlayer. But whenever I try to record, pause recording and then play, the playing part fails with no error.
I can guess that the reason it's not playing is that the audio file hasn't been saved yet and is still in memory or something.
Is there a way I can play paused recordings?
If there is, please tell me how.
I'm using Xcode 4.3.2.
If you want to play the recording, then yes you have to stop recording before you can load the file into the AVAudioPlayer instance.
If you want to be able to play back some of the recording, then add more to the recording after listening to it, or record into the middle... then you're in for some trouble.
You have to create a new audio file and then combine them together.
This was my solution:
// Generate a composition of the two audio assets that will be combined into
// a single track
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
// grab the two audio assets as AVURLAssets according to the file paths
AVURLAsset *masterAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.masterFile] options:nil];
AVURLAsset *activeAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.newRecording] options:nil];
NSError *error = nil;
// grab the portion of interest from the master asset
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, masterAsset.duration)
                    ofTrack:[[masterAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:&error];
if (error)
{
    // report the error
    return;
}
// append the entirety of the active recording
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, activeAsset.duration)
                    ofTrack:[[activeAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:masterAsset.duration
                      error:&error];
if (error)
{
    // report the error
    return;
}
// now export the two files
// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there
AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                         presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession)
{
    // report the error
    return;
}
NSString *combined = @"combined file path"; // create a new file for the combined file
// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:combined]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A;          // output file type
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // export status changed, check to see if it's done, errored, waiting, etc
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            break;
        case AVAssetExportSessionStatusCompleted:
            break;
        case AVAssetExportSessionStatusWaiting:
            break;
        default:
            break;
    }
    NSError *error = nil;
    // your code for dealing with the now combined file
}];
I can't take full credit for this work, but it was pieced together from the input of a couple of others:
AVAudioRecorder / AVAudioPlayer - append recording to file
(I can't find the other link at the moment)
We had the same requirements for our app as the OP described, and ran into the same issue (i.e., the recording has to be stopped, instead of paused, if the user wants to listen to what she has recorded up to that point). Our app (project's Github repo) uses AVQueuePlayer for playback and a method similar to kermitology's answer to concatenate the partial recordings, with some notable differences:
implemented in Swift
concatenates multiple recordings into one
no messing with tracks
The rationale behind the last item is that simple recordings with AVAudioRecorder will have one track, and the main reason for this whole workaround is to concatenate those single tracks in the assets (see Addendum 3). So why not use AVMutableComposition's insertTimeRange method instead, which takes an AVAsset rather than an AVAssetTrack?
Relevant parts: (full code)
import UIKit
import AVFoundation

class RecordViewController: UIViewController {

    /* App allows volunteers to record newspaper articles for the
       blind and print-impaired, hence the name.
    */
    var articleChunks = [AVURLAsset]()

    func concatChunks() {
        let composition = AVMutableComposition()

        /* `CMTimeRange` to store total duration and know when to
           insert subsequent assets.
        */
        var insertAt = CMTimeRange(start: kCMTimeZero, end: kCMTimeZero)

        repeat {
            let asset = self.articleChunks.removeFirst()
            let assetTimeRange =
                CMTimeRange(start: kCMTimeZero, end: asset.duration)

            do {
                try composition.insertTimeRange(assetTimeRange,
                                                of: asset,
                                                at: insertAt.end)
            } catch {
                NSLog("Unable to compose asset track.")
            }

            let nextDuration = insertAt.duration + assetTimeRange.duration
            insertAt = CMTimeRange(start: kCMTimeZero, duration: nextDuration)
        } while self.articleChunks.count != 0

        let exportSession =
            AVAssetExportSession(
                asset: composition,
                presetName: AVAssetExportPresetAppleM4A)

        exportSession?.outputFileType = AVFileType.m4a
        exportSession?.outputURL = /* create URL for output */
        // exportSession?.metadata = ...

        exportSession?.exportAsynchronously {
            switch exportSession?.status {
            case .unknown?: break
            case .waiting?: break
            case .exporting?: break
            case .completed?: break
            case .failed?: break
            case .cancelled?: break
            case .none: break
            }
        }

        /* Clean up (delete partial recordings, etc.) */
    }
This diagram helped me figure out what expects what and what inherits from where. (NSObject is implicitly the superclass wherever there is no inheritance arrow.)
Addendum 1: I had my reservations regarding the switch part instead of using KVO on AVAssetExportSessionStatus, but the docs are clear that exportAsynchronously's callback block "is invoked when writing is complete or in the event of writing failure".
Addendum 2: Just in case if someone has issues with AVQueuePlayer: 'An AVPlayerItem cannot be associated with more than one instance of AVPlayer'
Addendum 3: Unless you are recording in stereo, but mobile devices have only one input as far as I know. Also, fancy audio mixing would require the use of AVCompositionTrack. A good SO thread: Proper AVAudioRecorder Settings for Recording Voice?
RecordAudioViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>
@interface record_audio_testViewController : UIViewController <AVAudioRecorderDelegate> {
    IBOutlet UIButton *btnStart;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIActivityIndicatorView *actSpinner;
    BOOL toggle;
    // Variables setup for access in the class:
    NSURL *recordedTmpFile;
    AVAudioRecorder *recorder;
    NSError *error;
}

@property (nonatomic, retain) IBOutlet UIActivityIndicatorView *actSpinner;
@property (nonatomic, retain) IBOutlet UIButton *btnStart;
@property (nonatomic, retain) IBOutlet UIButton *btnPlay;

- (IBAction)start_button_pressed;
- (IBAction)play_button_pressed;
@end
RecordAudioViewController.m
@synthesize actSpinner, btnStart, btnPlay;

- (void)viewDidLoad {
    [super viewDidLoad];
    // Start the toggle in true mode.
    toggle = YES;
    btnPlay.hidden = YES;
    // Instantiate an instance of the AVAudioSession object.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Set up the audioSession for playback and record.
    // We could just use record and then switch it to playback later, but
    // since we are going to do both, let's set it up once.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    // Activate the session
    [audioSession setActive:YES error:&error];
}

- (IBAction)start_button_pressed {
    if (toggle)
    {
        toggle = NO;
        [actSpinner startAnimating];
        [btnStart setTitle:@"Stop Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        // Begin the recording session.
        // Error handling removed. Please add to your own code.

        // Set up the dictionary object with all the recording settings that this
        // recording session will use.
        // It's not clear to me which of these are required and which are the bare minimum.
        // This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

        // Now that we have our settings, we are going to instantiate an instance of our recorder.
        // Generate a temp file for use by the recording.
        // This sample was one I found online and seems to be a good choice for making a tmp file that
        // will not overwrite an existing one.
        // I know this is a mess of collapsed things into 1 call. I can break it out if need be.
        recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
        NSLog(@"Using File called: %@", recordedTmpFile);

        // Set up the recorder to use this file and record to it.
        recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
        // Use the recorder to start the recording.
        // I'm not sure why we set the delegate to self yet.
        // Found this in another example, but I'm fuzzy on this still.
        [recorder setDelegate:self];
        // We call this to start the recording process and initialize
        // the subsystems so that when we actually say "record" it starts right away.
        [recorder prepareToRecord];
        // Start the actual recording.
        [recorder record];
        // There is an optional method for recording for a limited time, see
        // [recorder recordForDuration:(NSTimeInterval)10]
    }
    else
    {
        toggle = YES;
        [actSpinner stopAnimating];
        [btnStart setTitle:@"Start Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        NSLog(@"Using File called: %@", recordedTmpFile);
        // Stop the recorder.
        [recorder stop];
    }
}

- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc. that aren't in use.
}

- (IBAction)play_button_pressed {
    // The play button was pressed...
    // Set up the AVAudioPlayer to play the file that we just recorded.
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    [avPlayer prepareToPlay];
    [avPlayer play];
}

- (void)viewDidUnload {
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;
    // Clean up the temp file.
    NSFileManager *fm = [NSFileManager defaultManager];
    [fm removeItemAtPath:[recordedTmpFile path] error:&error];
    // Release the remaining objects (never call -dealloc directly).
    [recorder release];
    recorder = nil;
    recordedTmpFile = nil;
}

- (void)dealloc {
    [super dealloc];
}
@end
RecordAudioViewController.xib
Add two buttons: one to begin recording and another to play the recording.

Using MKReverseGeocoder in a singleton class (ARC)

I'm in the process of creating a singleton class for weather functions (all in one place) so that I can change/update/recall weather data throughout my entire app via a single allocated object.
I have it working except for one weird little bug. I'm using MKReverseGeocoder on devices running iOS < 5; CLGeocoder works fine on iOS 5+. Basically, this is what happens in the app:
In the app delegate, at launch, I create the shared instance of my weather singleton. This instance doesn't do anything unless there is a UIView that expects to report weather results. When a view that reports weather is loaded, if the location isn't already statically set in the preferences, the app attempts to pull your current location.
This works, and I can log the coordinates of the device's current location. After that is done, I immediately try to reverse-geocode those coordinates. The method MKReverseGeocoder start does get executed, and I can log that the instance's isQuerying property is true, so I know it's attempting to geocode the coordinates. (Yes, I set the delegate to my shared instance, and the instance is of type MKReverseGeocoderDelegate.)
Now this is the weird part. If I launch the app for the first time and add a weather UIView to the screen for the first time, MKReverseGeocoder starts but never calls the delegate methods. If I then close the app and open it again (a second time), the current location gets looked up, MKReverseGeocoder does call the delegate methods, and everything works. It just doesn't want to work on the first launch, no matter how many times I call it (I have a button that can initiate the lookup).
It's totally baffling. The iOS 5 CLGeocoder works fine on the initial launch and every subsequent launch. MKReverseGeocoder does not work on the initial launch (it never calls the delegate methods), but it does on subsequent launches.
Below is the relevant code:
- (void)reverseGeocoder:(MKReverseGeocoder *)geocoder didFailWithError:(NSError *)error {
    NSLog(@"error getting location: %@", error);
    self.reverseGeocoder = nil;
    [[NSNotificationCenter defaultCenter] postNotificationName:@"errorGettingLocation"
                                                        object:nil
                                                      userInfo:[NSDictionary dictionaryWithObject:error forKey:@"error"]];
}

- (void)reverseGeocoder:(MKReverseGeocoder *)geocoder didFindPlacemark:(MKPlacemark *)pm
{
    // update placemark and get weather from correct placemark
    placemark = pm;
    NSLog(@"singleton placemark: %@", placemark);
    [self getLocationFromPlacemark];
    self.reverseGeocoder = nil;
}
- (void)getReverseGeoCode:(CLLocation *)newLocation
{
    NSLog(@"reverseGeocode starting for location: %@", newLocation);
    NSString *ver = [[UIDevice currentDevice] systemVersion];
    float ver_float = [ver floatValue];
    if (ver_float < 5.0) {
        //reverseGeocoder = nil;
        self.reverseGeocoder = [[MKReverseGeocoder alloc] initWithCoordinate:newLocation.coordinate];
        self.reverseGeocoder.delegate = self;
        [self.reverseGeocoder start];
        if (self.reverseGeocoder.isQuerying) {
            NSLog(@"self.reverseGeocoder querying");
            NSLog(@"self.reverseGeocoder delegate %@", self.reverseGeocoder.delegate);
            NSLog(@"self %@", self);
        } else {
            NSLog(@"geocoder not querying");
        }
    }
    else {
        [reverseGeocoder5 reverseGeocodeLocation:newLocation completionHandler:^(NSArray *placemarks, NSError *error) {
            if ([placemarks count] > 0) {
                placemark = [placemarks objectAtIndex:0];
                [self getLocationFromPlacemark];
            }
            else {
                if (error) {
                    NSLog(@"error reverseGeocode: %@", [error localizedDescription]);
                }
            }
        }];
    }
}
Also, I am declaring reverseGeocoder as a (nonatomic, strong) property and synthesizing it. Remember, this works fine on the second launch after a clean install (when there is a weather UIView already loaded). The NSLog calls aren't even getting hit (which is why I'm assuming the delegate methods aren't getting called).
Any input would be GREATLY appreciated!
Thanks!
I ended up figuring this out... well, kind of.
Instead of using MKReverseGeocoder, I'm just doing a static lookup to Google Maps, as in How to deal with MKReverseGeocoder / PBHTTPStatusCode=503 errors in iOS 4.3?
Works perfectly.
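For illustration, a rough sketch of that kind of static lookup; the endpoint and parameters follow Google's public geocoding web service and are my assumption, not taken from the linked answer (coordinate is a CLLocationCoordinate2D):
NSString *urlString = [NSString stringWithFormat:
    @"http://maps.googleapis.com/maps/api/geocode/json?latlng=%f,%f&sensor=true",
    coordinate.latitude, coordinate.longitude];
NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:urlString]];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Synchronous fetch on a background queue (works back to iOS 4).
    NSError *error = nil;
    NSData *data = [NSURLConnection sendSynchronousRequest:request returningResponse:NULL error:&error];
    dispatch_async(dispatch_get_main_queue(), ^{
        if (data) {
            NSString *json = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
            NSLog(@"geocode response: %@", json);
        } else {
            NSLog(@"geocode failed: %@", error);
        }
    });
});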
MKReverseGeocoder is deprecated as of iOS 5.
You have to import CoreLocation and use CLGeocoder instead. Below is some sample code:
CLGeocoder *geoCoder = [[CLGeocoder alloc] init];
CLLocation *location = [[CLLocation alloc] initWithLatitude:37.33188 longitude:-122.029497];
[geoCoder reverseGeocodeLocation:location completionHandler:^(NSArray *placemarks, NSError *error) {
    if (error) {
        NSLog(@"fail %@", error);
    } else {
        // Check the error before touching the placemarks array.
        CLPlacemark *place = [placemarks objectAtIndex:0];
        NSLog(@"return %@", place.addressDictionary);
    }
}];
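Note that CLGeocoder is available starting with iOS 5.0, and its completion handler is invoked on the main thread, so it is safe to update the UI directly from the block.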

Application doesn't work properly when installed with ipa file

I am playing a YouTube video on the iPad in a webView, using this code:
NSString *htmlString = [NSString stringWithFormat:@"<html>\
    <body>\
    <div id=\"player\"></div>\
    <script>\
        var tag = document.createElement('script');\
        tag.src = \"http://www.youtube.com/player_api\";\
        var firstScriptTag = document.getElementsByTagName('script')[0];\
        firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);\
        var done = false;\
        var player;\
        function onYouTubePlayerAPIReady() {\
            player = new YT.Player('player', {\
                height: '%i',\
                width: '%i',\
                videoId: '%@',\
                events: {\
                    'onReady': onPlayerReady,\
                    'onStateChange': onPlayerStateChange\
                }\
            });\
        }\
        function onPlayerReady(evt) {\
            evt.target.playVideo();\
        }\
        function onPlayerStateChange(evt) {\
            if (evt.data == 0)\
            {\
                window.location = \"http:\\end\";\
            }\
        }\
        function resizePlayer(width, height)\
        {\
            player.setSize(width, height);\
        }\
    </script>\
    </body>\
    </html>",
    height, width, videoID];
The problem is that when I install my app via Xcode it works fine, but when I install it from an .ipa file it doesn't.
The problem you are having could depend on the specific device and iOS version (there are subtle differences between UIWebView implementations), more than on using an .ipa file.
So you might try to reproduce the environment where the UIWebView fails to interpret your HTML snippet correctly. Also, don't forget to implement webView:didFailLoadWithError:, and have a look at a way to intercept JavaScript errors inside UIWebViews and display them on the console.
Hope this helps.
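For example, a minimal sketch of that delegate hook (assuming your controller is the web view's delegate), to surface load errors that would otherwise fail silently outside of Xcode:
- (void)webView:(UIWebView *)webView didFailLoadWithError:(NSError *)error {
    // Log any load failure so it shows up in the device console.
    NSLog(@"UIWebView failed to load: %@", error);
}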
I have found what caused my problem.
To let my application catch the moment the video reaches its end, I wrote JavaScript code that redirects the page to another URL. Then I implemented the UIWebViewDelegate method -(BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType. The main idea was to close the view when it tries to navigate to a special link, but I made one mistake that, for some unknown reason, didn't show up when I launched the app from Xcode.
Code with the error:
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    NSString *url = [[request URL] absoluteString];
    if ([url isEqualToString:@"http://youtube.com/end"])
    {
        [self onCloseVideo];
        [self unsubscribe];
        return NO;
    }
    // here, on the else branch, I had to return YES, but I didn't
}
Code without error:
- (BOOL)webView:(UIWebView *)webView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
    NSString *url = [[request URL] absoluteString];
    BOOL shouldStartRequest = YES;
    if ([url isEqualToString:@"http://youtube.com/end"])
    {
        [self onCloseVideo];
        shouldStartRequest = NO;
    }
    return shouldStartRequest;
}
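This also explains why the bug only showed up in the .ipa build: falling off the end of a non-void Objective-C method is undefined behavior, so the value actually returned depends on whatever happens to be in the return register. Debug builds run from Xcode and Release builds packaged into an .ipa are compiled at different optimization levels, so that garbage value can happen to be YES in one configuration and NO in the other.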