I am erasing part of an audio file using the following code:
NSMutableData *wave = [NSMutableData dataWithContentsOfURL:self.recordedFileURL options:NSDataReadingUncached error:nil];
NSUInteger length = [wave length];
Byte *byteData = (Byte *)malloc(length);
memcpy(byteData, [wave bytes], length);
NSMutableData *data = [NSMutableData dataWithBytes:byteData length:length];
free(byteData); // dataWithBytes: copies the buffer, so the malloc'd copy must be freed
// Drop everything from the cut point to the end of the file.
[data replaceBytesInRange:NSMakeRange(length * rangeToCut, length - (length * rangeToCut)) withBytes:NULL length:0];
[data writeToFile:[self.recordedFileURL path] atomically:YES];
It erases correctly, but when I resume recording and append the resumed part to the old part like the following:
NSMutableData *part2 = [NSMutableData dataWithContentsOfURL:self.soundFileURL options:NSDataReadingUncached error:nil];
NSFileHandle *file = [NSFileHandle fileHandleForWritingToURL:oldURL error:nil];
if (file) {
    [file seekToEndOfFile];
    [file writeData:part2];
    [file closeFile]; // flush and release the handle
}
then the audio files are appended successfully, but the resumed part of the audio has so much distortion that it is impossible to listen to.
Please help me figure out what is going wrong here.
Is your sample size 16 bits or more? If you cut the audio in the middle of a sample, the rest of the stream will be just noise. You need to be sure that length*rangeToCut is a multiple of the sample size (for multi-channel audio, of the whole frame size).
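For example, here is a minimal sketch that rounds the cut point down to a whole frame before truncating; the channel count and bit depth are assumptions and must match your actual recording format:
// Assumed format: 2 channels of 16-bit samples -- adjust to your recorder settings.
NSUInteger channels = 2;
NSUInteger bytesPerSample = 2; // 16 bits
NSUInteger bytesPerFrame = channels * bytesPerSample;
NSUInteger cutPoint = (NSUInteger)(length * rangeToCut);
cutPoint -= cutPoint % bytesPerFrame; // round down to a frame boundary
[data replaceBytesInRange:NSMakeRange(cutPoint, length - cutPoint) withBytes:NULL length:0];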
I am using the following code to convert NSData to a byte array. It works fine in the simulator. On a device it allocates memory like crazy, up to 600 MB (on the addObject: line inside the loop), and crashes. The file I am reading is a 30 MB "zip" file. The error I see in the output window is "memory issue".
NSData *data = [[NSData alloc] initWithContentsOfFile:file];
const unsigned char *bytes = [data bytes];
NSUInteger length = [data length];
NSMutableArray *byteArray = [NSMutableArray array];
for (NSUInteger i = 0; i < length; i++) {
    @autoreleasepool {
        [byteArray addObject:[NSNumber numberWithUnsignedChar:bytes[i]]];
    }
}
I am using this byte array inside an NSDictionary and call dataWithJSONObject: on the dictionary to post the JSON to a REST web service.
For a more memory-efficient way to convert NSData to a byte array, see How to convert NSData to byte array in iPhone?
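The short version: work on the raw buffer directly instead of boxing every byte in an NSNumber, which is what exhausts memory here. A sketch:
const uint8_t *raw = [data bytes];
NSUInteger count = [data length];
for (NSUInteger i = 0; i < count; i++) {
    uint8_t b = raw[i]; // no per-byte object allocation
    // ... process b ...
}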
If your goal is ultimately to post the binary data to a web service in JSON, try the following:
Base64-encode the data in a String, as described here. This can be passed in the JSON body (see the sketch after these two options).
Alternatively, if you have flexibility on changing the server-side of this transfer, avoid base64-encoded data in a JSON and rather directly post the binary content using an HTTP POST or PUT. This will be more efficient.
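A minimal sketch of the first option, using NSData's built-in encoder (available since iOS 7); the "file" key is an assumed payload shape:
NSData *data = [NSData dataWithContentsOfFile:file];
NSString *base64 = [data base64EncodedStringWithOptions:0];
NSDictionary *payload = @{@"file": base64}; // assumed payload shape
NSData *json = [NSJSONSerialization dataWithJSONObject:payload options:0 error:NULL];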
For the second approach, I can think of two ways to do this:
Here is an example for the scenario of sending an image, but any file type will work so long as you can load it into an NSData object.
Or, something like this should work:
NSData *data = [NSData dataWithContentsOfFile:file];
NSURL *url = [NSURL URLWithString:@"your_url_here"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
[request setHTTPMethod:@"PUT"];
NSDictionary *headers = @{@"Content-Type": @"application/octet-stream"};
[request setAllHTTPHeaderFields:headers];
[request setHTTPBody:data];
// Use an NSURLSession task to execute the request
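For completeness, a minimal sketch of executing that request with NSURLSession; the error handling is illustrative:
NSURLSessionDataTask *task = [[NSURLSession sharedSession] dataTaskWithRequest:request completionHandler:^(NSData *body, NSURLResponse *response, NSError *error) {
    if (error) {
        NSLog(@"Upload failed: %@", error.localizedDescription);
    }
}];
[task resume]; // tasks start suspended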
Lastly, if the data can be compressed on the client before encoding and sending, that would be even better!
I hope you find this helpful.
I have an NSData object containing 16300 bytes, and I am writing it to a file. The write is successful, but when I read the file back from the path, I get only 44 bytes.
// writing to path
NSError *err = nil;
if (![audioData writeToFile:recorderFilePath options:NSDataWritingAtomic error:&err]) {
    NSLog(@"Error writing to file: %@", [err localizedDescription]);
}
// reading from path
NSData *paddata = [NSData dataWithContentsOfFile:filePath];
Any help will be appreciated.
Thank you.
Use the code below to write your audio file and read it back from the path.
Write the audio file to the path:
[fileData writeToFile:audioFilePath atomically:YES];
Read it back from the path:
NSURL *soundFileURL = [NSURL fileURLWithPath:audioFilePath];
NSData *readData = [NSData dataWithContentsOfURL:soundFileURL]; // read the whole file back
In the recorder application, I'm trying to capture stereo speech data into a file.
// Copy each channel's buffer out of the callback's AudioBufferList.
for (int i = 0; i < bufferList->mNumberBuffers; i++) {
    memcpy(bufferList->mBuffers[i].mData, audio->mBuffers[i].mData, byteCount);
}
The above code copies out the recorded speech data. The file writing goes as follows:
NSString *root = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
NSString *filePath = [root stringByAppendingPathComponent:@"mic_in.raw"];
if (![[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
    [[NSData data] writeToFile:filePath atomically:YES]; // create an empty file on first use
}
NSData *myData = [NSData dataWithBytes:audio->mBuffers[0].mData length:byteCount];
NSFileHandle *handle = [NSFileHandle fileHandleForWritingAtPath:filePath];
[handle truncateFileAtOffset:[handle seekToEndOfFile]];
[handle writeData:myData];
myData = [NSData dataWithBytes:audio->mBuffers[1].mData length:byteCount];
[handle truncateFileAtOffset:[handle seekToEndOfFile]];
[handle writeData:myData];
[handle closeFile];
The stereo speech is recorded in non-interleaved format.
The saved file contents are not correct: for 15 seconds of speech, only about 2.5 seconds are saved, and even that data is garbled.
The file writing for mono speech works fine.
I'm not sure what is wrong in the stereo file writing.
The stereo file writing issue was resolved as follows.
The very frequent file operations in the callback were causing the data loss.
So, in the callback, the data (after conversion to interleaved format) is appended to a large in-memory buffer, and at the end of the recording the complete buffer is written to the file in one shot.
This approach resolved the stereo capture issue.
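A minimal sketch of that approach, assuming 16-bit non-interleaved input; recordingBuffer is an assumed preallocated NSMutableData owned by the recorder:
// In the callback: interleave the two channels and append to memory only.
SInt16 *left = (SInt16 *)audio->mBuffers[0].mData;
SInt16 *right = (SInt16 *)audio->mBuffers[1].mData;
NSUInteger frames = byteCount / sizeof(SInt16);
for (NSUInteger i = 0; i < frames; i++) {
    SInt16 frame[2] = { left[i], right[i] };
    [recordingBuffer appendBytes:frame length:sizeof(frame)];
}
// When recording stops: a single write for the whole take.
[recordingBuffer writeToFile:filePath atomically:YES];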
I'm developing an iPhone app and I'm relatively new to Objective-C, so I hope someone can give me a clue.
What I'm doing is reading a file in chunks and encoding the chunks into base64, and everything works fine. The problem is that the line NSString *str = [data base64EncodedString]; takes a little time because I'm encoding chunks of 256 KB. With a single file that is no problem, but I'm encoding image files, so if I encode 10 images that is a lot of chunks per image, and the process can be slow.
this is the process:
*Get the file.
*Read a chunk of 256 KB from the file.
*Encode the chunk to base64.
*Save the encoded chunk and repeat until there are no more bytes to read from the file.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:referenceURL resultBlock:^(ALAsset *asset)
{
    NSUInteger chunkSize = 262144;
    uint8_t *buffer = calloc(chunkSize, sizeof(*buffer));
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSUInteger length = [rep size];
    self.requestsToServer = [[NSMutableArray alloc] init];
    NSUInteger offset = 0;
    do {
        NSUInteger bytesCopied = [rep getBytes:buffer fromOffset:offset length:chunkSize error:nil];
        if (bytesCopied == 0) break; // guard against looping forever on a read error
        offset += bytesCopied;
        NSData *data = [[NSData alloc] initWithBytes:buffer length:bytesCopied];
        NSString *str = [data base64EncodedString];
        // After this I add str to an NSMutableURLRequest and store the request
        // in an NSMutableArray for later use.
    } while (offset < length);
    free(buffer);
    buffer = NULL;
}
failureBlock:^(NSError *error)
{
}];
I want to start other threads so I can encode the chunks in parallel and know when the whole process finishes; that way, while one chunk is encoding, another 3 or 4 chunks can be encoding at the same time.
How can I implement this safely, and is it a good idea?
Thanks for your time.
Look at NSOperation and NSOperationQueue.
http://developer.apple.com/library/ios/DOCUMENTATION/Cocoa/Reference/NSOperationQueue_class/Reference/Reference.html
Simply create one NSOperation per chunk, pass each one the chunk it needs to encode, and queue them up.
You can tell the queue how many operations can run simultaneously.
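A minimal sketch of that pattern; chunks is an assumed NSArray of NSData chunks, and the queue width of 4 is illustrative:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 4; // how many chunks encode in parallel
for (NSData *chunk in chunks) {
    [queue addOperationWithBlock:^{
        NSString *encoded = [chunk base64EncodedStringWithOptions:0];
        // hand 'encoded' off to whatever builds the upload request
    }];
}
// Block (off the main thread) until every chunk has been encoded.
[queue waitUntilAllOperationsAreFinished];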
There are lots of good options for doing chunks of work in parallel for iOS.
Take a look at Apple's Concurrency Programming Guide to get you going.
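For instance, GCD's dispatch_apply gives you the same fan-out with very little ceremony; chunks is the same assumed array as above:
NSMutableArray *encoded = [NSMutableArray arrayWithCapacity:chunks.count];
for (NSUInteger i = 0; i < chunks.count; i++) {
    [encoded addObject:[NSNull null]]; // placeholders so results keep chunk order
}
dispatch_apply(chunks.count, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(size_t i) {
    NSString *s = [chunks[i] base64EncodedStringWithOptions:0];
    @synchronized (encoded) {
        encoded[i] = s;
    }
});
// dispatch_apply returns only after all iterations have finished.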
I've tried the NSTask > NSData method, but the CPU/memory overhead is extremely large for anything over 1GB, so I need to find a way to do this like, say, an FTP server does it.
EDIT: How does Remote Desktop's file copy do it?
I think I got it. I had to read it into memory in small byte-size (HAHA, GET THE PUN?) pieces and transfer it over that way. Keep in mind that this only works for files, not directories. I tested it on a 450 MB file and it copied in about 3 minutes with the exact same byte count as the source. It was a video, and while I was streaming it to the client I was able to play it as well. Nifty, huh?
Without further ado, here's the code I used, slightly patched up to do a simple file-copy instead of over the network.
[[NSFileManager defaultManager] createFileAtPath:@"/path/to/file/dest" contents:nil attributes:nil];
NSFileHandle *output = [NSFileHandle fileHandleForWritingAtPath:@"/path/to/file/dest"];
uint64_t offset = 0;
uint32_t chunkSize = 8192;
NSFileHandle *handle = [NSFileHandle fileHandleForReadingAtPath:@"/path/to/file/source"];
NSAutoreleasePool *autoreleasePool = [[NSAutoreleasePool alloc] init];
NSData *data = [handle readDataOfLength:chunkSize];
NSLog(@"Entering Loop.");
while ([data length] > 0) {
    [output seekToEndOfFile];
    [output writeData:data];
    offset += [data length];
    NSLog(@"Switching Loop.");
    // Drain the pool each pass so the autoreleased NSData chunks get freed.
    [autoreleasePool release];
    autoreleasePool = [[NSAutoreleasePool alloc] init];
    [handle seekToFileOffset:offset];
    data = [handle readDataOfLength:chunkSize];
}
NSLog(@"Exited Loop.");
[handle closeFile];
[autoreleasePool release];
[output closeFile];
// Note: output came from a convenience constructor, so it is autoreleased;
// releasing it again here would be an over-release under MRC.