How do I recursively copy a directory using Vala? - file-io

I'm new to Vala, so this may be a stupid question.
According to #vala on gimpnet, it is not possible to recursively copy directories using GLib.File.copy. At the moment I am using:
Posix.system("cp -r absolutesource absolutedestination")
Is there a better method of doing this?

As I told you in IRC, you can just write a function to do it yourself by calling GLib.File.copy for each file you want to copy. Here is a basic example:
public bool copy_recursive (GLib.File src, GLib.File dest, GLib.FileCopyFlags flags = GLib.FileCopyFlags.NONE, GLib.Cancellable? cancellable = null) throws GLib.Error {
    GLib.FileType src_type = src.query_file_type (GLib.FileQueryInfoFlags.NONE, cancellable);
    if ( src_type == GLib.FileType.DIRECTORY ) {
        dest.make_directory (cancellable);
        src.copy_attributes (dest, flags, cancellable);
        string src_path = src.get_path ();
        string dest_path = dest.get_path ();
        GLib.FileEnumerator enumerator = src.enumerate_children (GLib.FileAttribute.STANDARD_NAME, GLib.FileQueryInfoFlags.NONE, cancellable);
        for ( GLib.FileInfo? info = enumerator.next_file (cancellable) ; info != null ; info = enumerator.next_file (cancellable) ) {
            copy_recursive (
                GLib.File.new_for_path (GLib.Path.build_filename (src_path, info.get_name ())),
                GLib.File.new_for_path (GLib.Path.build_filename (dest_path, info.get_name ())),
                flags,
                cancellable);
        }
    } else if ( src_type == GLib.FileType.REGULAR ) {
        src.copy (dest, flags, cancellable);
    }
    return true;
}
Also, it's worth noting that you might want to use one of the functions in GLib.Process instead of Posix.system.
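For example, GLib.Process.spawn_command_line_sync is the usual replacement for Posix.system; it wraps the C function g_spawn_command_line_sync. A rough sketch of that idea at the C level (the command string is just your original cp -r call, and the helper name is made up):

/* Sketch only: the C call that Vala's GLib.Process.spawn_command_line_sync wraps.
 * Unlike Posix.system, failures are reported through a GError. */
#include <glib.h>

static gboolean
copy_with_spawn (const gchar *src, const gchar *dest, GError **error)
{
    gchar *quoted_src  = g_shell_quote (src);
    gchar *quoted_dest = g_shell_quote (dest);
    gchar *cmd = g_strdup_printf ("cp -r %s %s", quoted_src, quoted_dest);
    gint exit_status = 0;

    gboolean ok = g_spawn_command_line_sync (cmd, NULL, NULL, &exit_status, error)
               && g_spawn_check_exit_status (exit_status, error);

    g_free (cmd);
    g_free (quoted_src);
    g_free (quoted_dest);
    return ok;
}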

I came across this question looking for something similar in C using GLib/GIO.
Here's my attempt at converting the Vala code above to C:
gboolean
copy_recursive(GFile *src, GFile *dest, GFileCopyFlags flags, GCancellable *cancellable, GError **error) {
    GFileType src_type = g_file_query_file_type(src, G_FILE_QUERY_INFO_NONE, cancellable);
    if (src_type == G_FILE_TYPE_DIRECTORY) {
        g_file_make_directory(dest, cancellable, error);
        g_file_copy_attributes(src, dest, flags, cancellable, error);
        GFileEnumerator *enumerator = g_file_enumerate_children(src, G_FILE_ATTRIBUTE_STANDARD_NAME, G_FILE_QUERY_INFO_NONE, cancellable, error);
        for (GFileInfo *info = g_file_enumerator_next_file(enumerator, cancellable, error); info != NULL; info = g_file_enumerator_next_file(enumerator, cancellable, error)) {
            const char *relative_path = g_file_info_get_name(info);
            copy_recursive(
                g_file_resolve_relative_path(src, relative_path),
                g_file_resolve_relative_path(dest, relative_path),
                flags, cancellable, error);
        }
    } else if (src_type == G_FILE_TYPE_REGULAR) {
        g_file_copy(src, dest, flags, cancellable, NULL, NULL, error);
    }
    return TRUE;
}
As a bonus, here's a function that recursively deletes a directory
gboolean
delete_recursive(GFile *file, GCancellable *cancellable, GError **error) {
    GFileType file_type = g_file_query_file_type(file, G_FILE_QUERY_INFO_NONE, cancellable);
    if (file_type == G_FILE_TYPE_DIRECTORY) {
        GFileEnumerator *enumerator = g_file_enumerate_children(file, G_FILE_ATTRIBUTE_STANDARD_NAME, G_FILE_QUERY_INFO_NONE, cancellable, error);
        for (GFileInfo *info = g_file_enumerator_next_file(enumerator, cancellable, error); info != NULL; info = g_file_enumerator_next_file(enumerator, cancellable, error)) {
            delete_recursive(
                g_file_resolve_relative_path(file, g_file_info_get_name(info)),
                cancellable, error);
        }
        g_file_delete(file, cancellable, error);
    } else if (file_type == G_FILE_TYPE_REGULAR) {
        g_file_delete(file, cancellable, error);
    }
    return TRUE;
}
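If it helps, here is a minimal, untested driver showing how the two helpers above could be wired up; the paths are placeholders, and you would compile against gio-2.0, for example: gcc demo.c $(pkg-config --cflags --libs gio-2.0).

/* Hypothetical driver for the copy_recursive/delete_recursive helpers above. */
#include <gio/gio.h>

int main (void) {
    GError *error = NULL;
    GFile *src  = g_file_new_for_path ("/tmp/source-dir");       /* placeholder paths */
    GFile *dest = g_file_new_for_path ("/tmp/destination-dir");

    copy_recursive (src, dest, G_FILE_COPY_NONE, NULL, &error);
    if (error != NULL) {
        g_printerr ("copy failed: %s\n", error->message);
        g_clear_error (&error);
    }

    g_object_unref (src);
    g_object_unref (dest);
    return 0;
}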

Related

Record RTSP stream with FFmpeg in iOS

I've followed iFrameExtractor to successfully stream RTSP in my Swift project. The project also has a recording function; it basically uses avformat_write_header, av_interleaved_write_frame and av_write_trailer to save the RTSP source into an mp4 file.
When I run this project on my device, the RTSP streaming works fine, but the recording function always generates a blank mp4 file with no image and no sound.
Could anyone tell me what step I am missing?
I'm using an iPhone 5 with iOS 9.1 and Xcode 7.1.1.
FFmpeg is version 2.8.3, compiled following the CompilationGuide – FFmpeg instructions.
The following is the sample code in this project.
The function that generates every frame:
-(BOOL)stepFrame {
// AVPacket packet;
int frameFinished=0;
static bool bFirstIFrame=false;
static int64_t vPTS=0, vDTS=0, vAudioPTS=0, vAudioDTS=0;
while(!frameFinished && av_read_frame(pFormatCtx, &packet)>=0) {
// Is this a packet from the video stream?
if(packet.stream_index==videoStream) {
// 20130525 albert.liao modified start
// Initialize a new format context for writing file
if(veVideoRecordState!=eH264RecIdle)
{
switch(veVideoRecordState)
{
case eH264RecInit:
{
if ( !pFormatCtx_Record )
{
int bFlag = 0;
//NSString *videoPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/test.mp4"];
NSString *videoPath = @"/Users/liaokuohsun/iFrameTest.mp4";
const char *file = [videoPath UTF8String];
pFormatCtx_Record = avformat_alloc_context();
bFlag = h264_file_create(file, pFormatCtx_Record, pCodecCtx, pAudioCodecCtx,/*fps*/0.0, packet.data, packet.size );
if(bFlag==true)
{
veVideoRecordState = eH264RecActive;
fprintf(stderr, "h264_file_create success\n");
}
else
{
veVideoRecordState = eH264RecIdle;
fprintf(stderr, "h264_file_create error\n");
}
}
}
//break;
case eH264RecActive:
{
if((bFirstIFrame==false) &&(packet.flags&AV_PKT_FLAG_KEY)==AV_PKT_FLAG_KEY)
{
bFirstIFrame=true;
vPTS = packet.pts ;
vDTS = packet.dts ;
#if 0
NSRunLoop *pRunLoop = [NSRunLoop currentRunLoop];
[pRunLoop addTimer:RecordingTimer forMode:NSDefaultRunLoopMode];
#else
[NSTimer scheduledTimerWithTimeInterval:5.0//2.0
target:self
selector:@selector(StopRecording:)
userInfo:nil
repeats:NO];
#endif
}
// Record audio when 1st i-Frame is obtained
if(bFirstIFrame==true)
{
if ( pFormatCtx_Record )
{
#if PTS_DTS_IS_CORRECT==1
packet.pts = packet.pts - vPTS;
packet.dts = packet.dts - vDTS;
#endif
h264_file_write_frame( pFormatCtx_Record, packet.stream_index, packet.data, packet.size, packet.dts, packet.pts);
}
else
{
NSLog(#"pFormatCtx_Record no exist");
}
}
}
break;
case eH264RecClose:
{
if ( pFormatCtx_Record )
{
h264_file_close(pFormatCtx_Record);
#if 0
// 20130607 Test
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void)
{
ALAssetsLibrary *library = [[ALAssetsLibrary alloc]init];
NSString *filePathString = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/test.mp4"];
NSURL *filePathURL = [NSURL fileURLWithPath:filePathString isDirectory:NO];
if(1)// ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:filePathURL])
{
[library writeVideoAtPathToSavedPhotosAlbum:filePathURL completionBlock:^(NSURL *assetURL, NSError *error){
if (error) {
// TODO: error handling
NSLog(#"writeVideoAtPathToSavedPhotosAlbum error");
} else {
// TODO: success handling
NSLog(#"writeVideoAtPathToSavedPhotosAlbum success");
}
}];
}
[library release];
});
#endif
vPTS = 0;
vDTS = 0;
vAudioPTS = 0;
vAudioDTS = 0;
pFormatCtx_Record = NULL;
NSLog(#"h264_file_close() is finished");
}
else
{
NSLog(#"fc no exist");
}
bFirstIFrame = false;
veVideoRecordState = eH264RecIdle;
}
break;
default:
if ( pFormatCtx_Record )
{
h264_file_close(pFormatCtx_Record);
pFormatCtx_Record = NULL;
}
NSLog(#"[ERROR] unexpected veVideoRecordState!!");
veVideoRecordState = eH264RecIdle;
break;
}
}
// Decode video frame
avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);
}
else if(packet.stream_index==audioStream)
{
// 20131024 albert.liao modfied start
static int vPktCount=0;
BOOL bIsAACADTS = FALSE;
int ret = 0;
if(aPlayer.vAACType == eAAC_UNDEFINED)
{
tAACADTSHeaderInfo vxAACADTSHeaderInfo = {0};
bIsAACADTS = [AudioUtilities parseAACADTSHeader:(uint8_t *)packet.data ToHeader:&vxAACADTSHeaderInfo];
}
@synchronized(aPlayer)
{
if(aPlayer==nil)
{
aPlayer = [[AudioPlayer alloc]initAudio:nil withCodecCtx:(AVCodecContext *) pAudioCodecCtx];
NSLog(@"aPlayer initAudio");
if(bIsAACADTS)
{
aPlayer.vAACType = eAAC_ADTS;
//NSLog(@"is ADTS AAC");
}
}
else
{
if(vPktCount<5) // The voice is listened once image is rendered
{
vPktCount++;
}
else
{
if([aPlayer getStatus]!=eAudioRunning)
{
dispatch_async(dispatch_get_main_queue(), ^(void) {
@synchronized(aPlayer)
{
NSLog(@"aPlayer start play");
[aPlayer Play];
}
});
}
}
}
};
@synchronized(aPlayer)
{
int ret = 0;
ret = [aPlayer putAVPacket:&packet];
if(ret <= 0)
NSLog(@"Put Audio Packet Error!!");
}
// 20131024 albert.liao modfied end
if(bFirstIFrame==true)
{
switch(veVideoRecordState)
{
case eH264RecActive:
{
if ( pFormatCtx_Record )
{
h264_file_write_audio_frame(pFormatCtx_Record, pAudioCodecCtx, packet.stream_index, packet.data, packet.size, packet.dts, packet.pts);
}
else
{
NSLog(#"pFormatCtx_Record no exist");
}
}
}
}
}
else
{
//fprintf(stderr, "packet len=%d, Byte=%02X%02X%02X%02X%02X\n",\
packet.size, packet.data[0],packet.data[1],packet.data[2],packet.data[3], packet.data[4]);
}
// 20130525 albert.liao modified end
}
return frameFinished!=0;
}
avformat_write_header:
int h264_file_create(const char *pFilePath, AVFormatContext *fc, AVCodecContext *pCodecCtx,AVCodecContext *pAudioCodecCtx, double fps, void *p, int len )
{
int vRet=0;
AVOutputFormat *of=NULL;
AVStream *pst=NULL;
AVCodecContext *pcc=NULL, *pAudioOutputCodecContext=NULL;
avcodec_register_all();
av_register_all();
av_log_set_level(AV_LOG_VERBOSE);
if(!pFilePath)
{
fprintf(stderr, "FilePath no exist");
return -1;
}
if(!fc)
{
fprintf(stderr, "AVFormatContext no exist");
return -1;
}
fprintf(stderr, "file=%s\n",pFilePath);
// Create container
of = av_guess_format( 0, pFilePath, 0 );
fc->oformat = of;
strcpy( fc->filename, pFilePath );
// Add video stream
pst = avformat_new_stream( fc, 0 );
vVideoStreamIdx = pst->index;
fprintf(stderr,"Video Stream:%d",vVideoStreamIdx);
pcc = pst->codec;
avcodec_get_context_defaults3( pcc, AVMEDIA_TYPE_VIDEO );
// Save the stream as origin setting without convert
pcc->codec_type = pCodecCtx->codec_type;
pcc->codec_id = pCodecCtx->codec_id;
pcc->bit_rate = pCodecCtx->bit_rate;
pcc->width = pCodecCtx->width;
pcc->height = pCodecCtx->height;
if(fps==0)
{
double fps=0.0;
AVRational pTimeBase;
pTimeBase.num = pCodecCtx->time_base.num;
pTimeBase.den = pCodecCtx->time_base.den;
fps = 1.0/ av_q2d(pCodecCtx->time_base)/ FFMAX(pCodecCtx->ticks_per_frame, 1);
fprintf(stderr,"fps_method(tbc): 1/av_q2d()=%g",fps);
pcc->time_base.num = 1;
pcc->time_base.den = fps;
}
else
{
pcc->time_base.num = 1;
pcc->time_base.den = fps;
}
// reference ffmpeg\libavformat\utils.c
// For SPS and PPS in avcC container
pcc->extradata = malloc(sizeof(uint8_t)*pCodecCtx->extradata_size);
memcpy(pcc->extradata, pCodecCtx->extradata, pCodecCtx->extradata_size);
pcc->extradata_size = pCodecCtx->extradata_size;
// For Audio stream
if(pAudioCodecCtx)
{
AVCodec *pAudioCodec=NULL;
AVStream *pst2=NULL;
pAudioCodec = avcodec_find_encoder(AV_CODEC_ID_AAC);
// Add audio stream
pst2 = avformat_new_stream( fc, pAudioCodec );
vAudioStreamIdx = pst2->index;
pAudioOutputCodecContext = pst2->codec;
avcodec_get_context_defaults3( pAudioOutputCodecContext, pAudioCodec );
fprintf(stderr,"Audio Stream:%d",vAudioStreamIdx);
fprintf(stderr,"pAudioCodecCtx->bits_per_coded_sample=%d",pAudioCodecCtx->bits_per_coded_sample);
pAudioOutputCodecContext->codec_type = AVMEDIA_TYPE_AUDIO;
pAudioOutputCodecContext->codec_id = AV_CODEC_ID_AAC;
// Copy the codec attributes
pAudioOutputCodecContext->channels = pAudioCodecCtx->channels;
pAudioOutputCodecContext->channel_layout = pAudioCodecCtx->channel_layout;
pAudioOutputCodecContext->sample_rate = pAudioCodecCtx->sample_rate;
pAudioOutputCodecContext->bit_rate = 12000;//pAudioCodecCtx->sample_rate * pAudioCodecCtx->bits_per_coded_sample;
pAudioOutputCodecContext->bits_per_coded_sample = pAudioCodecCtx->bits_per_coded_sample;
pAudioOutputCodecContext->profile = pAudioCodecCtx->profile;
//FF_PROFILE_AAC_LOW;
// pAudioCodecCtx->bit_rate;
// AV_SAMPLE_FMT_U8P, AV_SAMPLE_FMT_S16P
//pAudioOutputCodecContext->sample_fmt = AV_SAMPLE_FMT_FLTP;//pAudioCodecCtx->sample_fmt;
pAudioOutputCodecContext->sample_fmt = pAudioCodecCtx->sample_fmt;
//pAudioOutputCodecContext->sample_fmt = AV_SAMPLE_FMT_U8;
pAudioOutputCodecContext->sample_aspect_ratio = pAudioCodecCtx->sample_aspect_ratio;
pAudioOutputCodecContext->time_base.num = pAudioCodecCtx->time_base.num;
pAudioOutputCodecContext->time_base.den = pAudioCodecCtx->time_base.den;
pAudioOutputCodecContext->ticks_per_frame = pAudioCodecCtx->ticks_per_frame;
pAudioOutputCodecContext->frame_size = 1024;
fprintf(stderr,"profile:%d, sample_rate:%d, channles:%d", pAudioOutputCodecContext->profile, pAudioOutputCodecContext->sample_rate, pAudioOutputCodecContext->channels);
AVDictionary *opts = NULL;
av_dict_set(&opts, "strict", "experimental", 0);
if (avcodec_open2(pAudioOutputCodecContext, pAudioCodec, &opts) < 0) {
fprintf(stderr, "\ncould not open codec\n");
}
av_dict_free(&opts);
#if 0
// For Audio, this part is no need
if(pAudioCodecCtx->extradata_size!=0)
{
NSLog(#"extradata_size !=0");
pAudioOutputCodecContext->extradata = malloc(sizeof(uint8_t)*pAudioCodecCtx->extradata_size);
memcpy(pAudioOutputCodecContext->extradata, pAudioCodecCtx->extradata, pAudioCodecCtx->extradata_size);
pAudioOutputCodecContext->extradata_size = pAudioCodecCtx->extradata_size;
}
else
{
// For WMA test only
pAudioOutputCodecContext->extradata_size = 0;
NSLog(#"extradata_size ==0");
}
#endif
}
if(fc->oformat->flags & AVFMT_GLOBALHEADER)
{
pcc->flags |= CODEC_FLAG_GLOBAL_HEADER;
pAudioOutputCodecContext->flags |= CODEC_FLAG_GLOBAL_HEADER;
}
if ( !( fc->oformat->flags & AVFMT_NOFILE ) )
{
vRet = avio_open( &fc->pb, fc->filename, AVIO_FLAG_WRITE );
if(vRet!=0)
{
fprintf(stderr,"avio_open(%s) error", fc->filename);
}
}
// dump format in console
av_dump_format(fc, 0, pFilePath, 1);
vRet = avformat_write_header( fc, NULL );
if(vRet==0)
return 1;
else
return 0;
}
av_interleaved_write_frame:
void h264_file_write_frame(AVFormatContext *fc, int vStreamIdx, const void* p, int len, int64_t dts, int64_t pts )
{
AVStream *pst = NULL;
AVPacket pkt;
if ( 0 > vVideoStreamIdx )
return;
// may be audio or video
pst = fc->streams[ vStreamIdx ];
// Init packet
av_init_packet( &pkt );
if(vStreamIdx ==vVideoStreamIdx)
{
pkt.flags |= ( 0 >= getVopType( p, len ) ) ? AV_PKT_FLAG_KEY : 0;
//pkt.flags |= AV_PKT_FLAG_KEY;
pkt.stream_index = pst->index;
pkt.data = (uint8_t*)p;
pkt.size = len;
pkt.dts = AV_NOPTS_VALUE;
pkt.pts = AV_NOPTS_VALUE;
// TODO: mark or unmark the log
//fprintf(stderr, "dts=%lld, pts=%lld\n",dts,pts);
// av_write_frame( fc, &pkt );
}
av_interleaved_write_frame( fc, &pkt );
}
av_write_trailer:
void h264_file_close(AVFormatContext *fc)
{
if ( !fc )
return;
av_write_trailer( fc );
if ( fc->oformat && !( fc->oformat->flags & AVFMT_NOFILE ) && fc->pb )
avio_close( fc->pb );
av_free( fc );
}
Thanks.
It looks like you're using the same AVFormatContext for both the input and output?
In the line
pst = fc->streams[ vStreamIdx ];
You're assigning the AVStream* from the AVFormatContext connected with your input (the RTSP stream), but then later on you're trying to write the packet back to the same context with av_interleaved_write_frame( fc, &pkt );. I tend to think of a context as a file, which has helped me navigate this type of thing better. I do something identical to what you're doing (not on iOS though), where I use a separate AVFormatContext for each of the input (RTSP stream) and the output (mp4 file). If I'm correct, I think what you need to do is initialize a separate output AVFormatContext and write to that instead.
The following code (without all the error checking) is what I do to take an AVFormatContext * output_format_context = NULL and the AVFormatContext * input_format_context that I had associated with the RTSP stream and write from one to the other. This is after I have fetched a packet, etc., which in your case you already seem to be doing (I just take the packet from av_read_frame and re-package it).
This is code that could be in your write frame function (but it also does include the writing of the header).
AVFormatContext * output_format_context;
AVStream * in_stream_2;
AVStream * out_stream_2;
// Allocate the context with the output file
avformat_alloc_output_context2(&output_format_context, NULL, NULL, out_filename.c_str());
// Point to AVOutputFormat * output_format for manipulation
output_format = output_format_context->oformat;
// Loop through all streams
for (i = 0; i < input_format_context->nb_streams; i++) {
// Create a pointer to the input stream that was allocated earlier in the code
AVStream *in_stream = input_format_context->streams[i];
// Create a pointer to a new stream that will be part of the output
AVStream *out_stream = avformat_new_stream(output_format_context, in_stream->codec->codec);
// Set time_base of the new output stream to equal the input stream one since I'm not changing anything (can avoid but get a deprecation warning)
out_stream->time_base = in_stream->time_base;
// This is the non-deprecated way of copying all the parameters from the input stream into the output stream since everything stays the same
avcodec_parameters_from_context(out_stream->codecpar, in_stream->codec);
// I don't remember what this is for :)
out_stream->codec->codec_tag = 0;
// This just sets a flag from the format context to the stream relating to the header
if (output_format_context->oformat->flags & AVFMT_GLOBALHEADER)
out_stream->codec->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
}
// Check NOFILE flag and open the output file context (previously the output file was associated with the format context, now it is actually opened).
if (!(output_format->flags & AVFMT_NOFILE))
avio_open(&output_format_context->pb, out_filename.c_str(), AVIO_FLAG_WRITE);
// Write the header (not sure if this is always needed, but for h264 I believe it is).
avformat_write_header(output_format_context,NULL);
// Re-getting the appropriate stream that was populated above (this should allow for both audio/video)
in_stream_2 = input_format_context->streams[packet.stream_index];
out_stream_2 = output_format_context->streams[packet.stream_index];
// Rescaling pts and dts, duration and pos - you would do as you need in your code.
packet.pts = av_rescale_q_rnd(packet.pts, in_stream_2->time_base, out_stream_2->time_base, (AVRounding) (AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
packet.dts = av_rescale_q_rnd(packet.dts, in_stream_2->time_base, out_stream_2->time_base, (AVRounding) (AV_ROUND_NEAR_INF | AV_ROUND_PASS_MINMAX));
packet.duration = av_rescale_q(packet.duration, in_stream_2->time_base, out_stream_2->time_base);
packet.pos = -1;
// The first packet of my stream always gives me negative dts/pts so this just protects that first one for my purposes. You probably don't need.
if (packet.dts < 0) packet.dts = 0;
if (packet.pts < 0) packet.pts = 0;
// Finally write the frame
av_interleaved_write_frame(output_format_context, &packet);
// ....
// Write header, close/cleanup... etc
// ....
This code is fairly bare bones and doesn't include the setup (which it sounds like you're doing correctly anyway). I would also imagine this code could be cleaned up and tweaked for your purposes, but it works for me to rewrite the RTSP stream into a file (in my case many files, but code not shown).
The code is C code, so you might need to make minor tweaks to keep it Swift-compatible (for some of the library function calls maybe). I think overall it should be compatible though.
Hopefully this helps point you in the right direction. This was cobbled together thanks to several sample code sources (I don't remember where), along with warning prompts from the libraries themselves.
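For the "Write header, close/cleanup" step left as a comment above, the finishing sequence on the output context is roughly the following (a sketch against the same API generation, using the output_format_context variable from the snippet):

// Sketch of the cleanup step referred to above: finalize the output file.
av_write_trailer(output_format_context);
if (!(output_format_context->oformat->flags & AVFMT_NOFILE))
    avio_closep(&output_format_context->pb);   // closes the file opened with avio_open
avformat_free_context(output_format_context);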

Resolving SRV records with iOS SDK

I want to resolve DNS SRV records using the iOS SDK.
I've already tried the high-level Bonjour APIs Apple is providing, but they're not what I need. Now I'm using DNS SD.
void *processQueryForSRVRecord(void *record) {
DNSServiceRef sdRef;
int context;
printf("Setting up query for record: %s\n", record);
DNSServiceQueryRecord(&sdRef, 0, 0, record, kDNSServiceType_SRV, kDNSServiceClass_IN, callback, &context);
printf("Processing query for record: %s\n", record);
DNSServiceProcessResult(sdRef);
printf("Deallocating query for record: %s\n", record);
DNSServiceRefDeallocate(sdRef);
return NULL;
}
This works as long as it gets only correct SRV records (for example: _xmpp-server._tcp.gmail.com), but when the record is typed wrong, DNSServiceProcessResult(sdRef) goes into an infinite loop.
Is there a way to stop DNSServiceProcessResult or must I cancel the thread calling it?
Use good old select(). This is what I have at the moment:
- (void)updateDnsRecords
{
if (self.dnsUpdatePending == YES)
{
return;
}
else
{
self.dnsUpdatePending = YES;
}
NSLog(#"DNS update");
DNSServiceRef sdRef;
DNSServiceErrorType err;
const char* host = [self.dnsHost UTF8String];
if (host != NULL)
{
NSTimeInterval remainingTime = self.dnsUpdateTimeout;
NSDate* startTime = [NSDate date];
err = DNSServiceQueryRecord(&sdRef, 0, 0,
host,
kDNSServiceType_SRV,
kDNSServiceClass_IN,
processDnsReply,
&remainingTime);
// This is necessary so we don't hang forever if there are no results
int dns_sd_fd = DNSServiceRefSockFD(sdRef);
int nfds = dns_sd_fd + 1;
fd_set readfds;
int result;
while (remainingTime > 0)
{
FD_ZERO(&readfds);
FD_SET(dns_sd_fd, &readfds);
struct timeval tv;
tv.tv_sec = (time_t)remainingTime;
tv.tv_usec = (remainingTime - tv.tv_sec) * 1000000;
result = select(nfds, &readfds, (fd_set*)NULL, (fd_set*)NULL, &tv);
if (result == 1)
{
if (FD_ISSET(dns_sd_fd, &readfds))
{
err = DNSServiceProcessResult(sdRef);
if (err != kDNSServiceErr_NoError)
{
NSLog(#"There was an error reading the DNS SRV records.");
break;
}
}
}
else if (result == 0)
{
NBLog(#"DNS SRV select() timed out");
break;
}
else
{
if (errno == EINTR)
{
NBLog(#"DNS SRV select() interrupted, retry.");
}
else
{
NBLog(#"DNS SRV select() returned %d errno %d %s.", result, errno, strerror(errno));
break;
}
}
NSTimeInterval elapsed = [[NSDate date] timeIntervalSinceDate:startTime];
remainingTime -= elapsed;
}
DNSServiceRefDeallocate(sdRef);
}
}
static void processDnsReply(DNSServiceRef sdRef,
DNSServiceFlags flags,
uint32_t interfaceIndex,
DNSServiceErrorType errorCode,
const char* fullname,
uint16_t rrtype,
uint16_t rrclass,
uint16_t rdlen,
const void* rdata,
uint32_t ttl,
void* context)
{
NSTimeInterval* remainingTime = (NSTimeInterval*)context;
// If a timeout occurs the value of the errorCode argument will be
// kDNSServiceErr_Timeout.
if (errorCode != kDNSServiceErr_NoError)
{
return;
}
// The flags argument will have the kDNSServiceFlagsAdd bit set if the
// callback is being invoked when a record is received in response to
// the query.
//
// If kDNSServiceFlagsAdd bit is clear then callback is being invoked
// because the record has expired, in which case the ttl argument will
// be 0.
if ((flags & kDNSServiceFlagsMoreComing) == 0)
{
*remainingTime = 0;
}
// Record parsing code below was copied from Apple SRVResolver sample.
NSMutableData * rrData = [NSMutableData data];
dns_resource_record_t * rr;
uint8_t u8;
uint16_t u16;
uint32_t u32;
u8 = 0;
[rrData appendBytes:&u8 length:sizeof(u8)];
u16 = htons(kDNSServiceType_SRV);
[rrData appendBytes:&u16 length:sizeof(u16)];
u16 = htons(kDNSServiceClass_IN);
[rrData appendBytes:&u16 length:sizeof(u16)];
u32 = htonl(666);
[rrData appendBytes:&u32 length:sizeof(u32)];
u16 = htons(rdlen);
[rrData appendBytes:&u16 length:sizeof(u16)];
[rrData appendBytes:rdata length:rdlen];
rr = dns_parse_resource_record([rrData bytes], (uint32_t) [rrData length]);
// If the parse is successful, add the results.
if (rr != NULL)
{
NSString *target;
target = [NSString stringWithCString:rr->data.SRV->target encoding:NSASCIIStringEncoding];
if (target != nil)
{
uint16_t priority = rr->data.SRV->priority;
uint16_t weight = rr->data.SRV->weight;
uint16_t port = rr->data.SRV->port;
[[FailoverWebInterface sharedInterface] addDnsServer:target priority:priority weight:weight port:port ttl:ttl]; // You'll have to do this in with your own method.
}
}
dns_free_resource_record(rr);
}
Here's the Apple SRVResolver sample from which I got the RR parsing.
This Apple sample mentions that it may block forever, yet strangely enough suggests using an NSTimer when trying to add a timeout yourself. I think using select() is a much better way.
I have one to-do left: implement cache flushing with DNSServiceReconfirmRecord. But I won't do that now.
Be aware, this code is working, but I'm still testing it.
You need to add libresolv.dylib to your Xcode project's 'Linked Frameworks and Libraries'.
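For the DNSServiceReconfirmRecord to-do mentioned above, the call would presumably live inside processDnsReply, since it needs the record data the callback just received; a rough, untested sketch (passing 0 for the flags is an assumption):

// Rough sketch only: ask mDNSResponder to re-verify (and flush, if stale) a
// record, using the arguments processDnsReply already receives.
static void reconfirmSrvRecord(uint32_t interfaceIndex, const char *fullname,
                               uint16_t rrtype, uint16_t rrclass,
                               uint16_t rdlen, const void *rdata)
{
    DNSServiceReconfirmRecord(0 /* flags: assumption */, interfaceIndex,
                              fullname, rrtype, rrclass, rdlen, rdata);
}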

How to use libgit2 to fetch from a Git remote?

I read fetch.c and am trying to update the content of a local repository (just like "git fetch"), but git_remote_connect returns -1.
err:Unexpected HTTP status code: 401
Where do I set the credentials when connecting to the remote? What is wrong with the code? Thanks.
- (IBAction)Fetch:(id)sender {
git_remote *remote = NULL;
const git_error *err = NULL;
int ret = -1;
bool invoked = false;
git_repository *repo = NULL;
NSArray *str = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docPath = [str objectAtIndex:0];
NSString *localPath = [docPath stringByAppendingPathComponent:@"abc/.git"];
NSLog(@"localPath:%@", localPath);
ret = git_repository_open(&repo, [localPath UTF8String]);
NSLog(@"git_repository_open ret:%d",ret);
err = giterr_last();
if(err == NULL)
{
NSLog(@"NULL");
}
else
{
NSLog(@"err:%s", err->message);
}
ret = git_remote_load(&remote, repo, "origin");
NSLog(@"git_remote_load ret:%d", ret);
err = giterr_last();
if(err == NULL)
{
NSLog(@"No error");
}
else
{
NSLog(@"err:%s", err->message);
return;
}
ret = git_remote_load(&remote, repo, "origin");
NSLog(@"git_remote_load ret:%d", ret);
ret = git_remote_connect(remote, GIT_DIRECTION_FETCH);
NSLog(@"git_remote_connect ret:%d", ret);
err = giterr_last();
if(err == NULL)
{
NSLog(@"No error");
}
else
{
NSLog(@"err:%s", err->message);
return;
}
ret = git_remote_download(remote, &transferProgressCallback, &invoked);
NSLog(@"git_remote_download ret:%d", ret);
err = giterr_last();
if(err == NULL)
{
NSLog(@"No error");
}
else
{
NSLog(@"err:%s", err->message);
return;
}
ret = git_remote_update_tips(remote);
NSLog(@"git_remote_update_tips ret:%d", ret);
err = giterr_last();
if(err == NULL)
{
NSLog(@"No error");
}
else
{
NSLog(@"err:%s", err->message);
return;
}
}
}
Here is the remote config:
[remote "origin"]
url = http://remote_path/git/share.git
fetch = +refs/heads/*:refs/remotes/origin/*
[branch "master"]
remote = origin
merge = refs/heads/master
I found the function below to set the credentials that the remote requires:
bool invoked = false;
git_remote_set_cred_acquire_cb(remote, cred_acquire_cb, &invoked);
The code of cred_acquire_cb is below:
static int cred_acquire_cb(git_cred **cred, const char *url, unsigned int allowed_types, void *payload)
{
char *_remote_user = "user";
char *_remote_pass = "pass";
*((bool*)payload) = true;
if ((GIT_CREDTYPE_USERPASS_PLAINTEXT & allowed_types) == 0 ||
git_cred_userpass_plaintext_new(cred, _remote_user, _remote_pass) < 0)
return -1;
return 0;
}
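For what it's worth, the overall call order with the credential callback in place looks roughly like this, against the same old libgit2 API used above (a hedged sketch, not verified against a specific release); the important part is that the callback is set before git_remote_connect, so the 401 challenge can be answered:

git_remote *remote = NULL;
bool invoked = false;

if (git_remote_load(&remote, repo, "origin") == 0) {
    // Register the credential callback before connecting.
    git_remote_set_cred_acquire_cb(remote, cred_acquire_cb, &invoked);
    if (git_remote_connect(remote, GIT_DIRECTION_FETCH) == 0) {
        git_remote_download(remote, &transferProgressCallback, &invoked);
        git_remote_update_tips(remote);
        git_remote_disconnect(remote);
    }
    git_remote_free(remote);
}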

SecKeychain load item

I want to store SMTP data from my Mac OS X application using the keychain. I read Apple's Keychain Services Programming Guide and wrote this method to store the data:
- (BOOL)saveSMPTData
{
OSStatus err;
SecKeychainItemRef item = nil;
SecProtocolType protocol = kSecProtocolTypeSMTP;
const char *accessLabelUTF8 = [KEYCHAIN_NAME UTF8String];
const char *serverNameUTF8 = [self.serverName UTF8String];
const char *usernameUTF8 = [self.username UTF8String];
const char *passwordUTF8 = [self.password UTF8String];
SecAccessRef access = createAccess(KEYCHAIN_NAME);
SecKeychainAttribute attrs[] = {
{ kSecLabelItemAttr, (int)strlen(accessLabelUTF8), (char *)accessLabelUTF8 },
{ kSecAccountItemAttr, (int)strlen(usernameUTF8), (char *)usernameUTF8 },
{ kSecServerItemAttr, (int)strlen(serverNameUTF8), (char *)serverNameUTF8 },
{ kSecProtocolItemAttr, sizeof(SecProtocolType), (SecProtocolType *)&protocol }
};
SecKeychainAttributeList attributes = { sizeof(attrs) / sizeof(attrs[0]), attrs };
err = SecKeychainItemCreateFromContent(kSecInternetPasswordItemClass,
&attributes,
(int)strlen(passwordUTF8),
passwordUTF8,
NULL,
access,
&item);
if (access) CFRelease(access);
if (item) CFRelease(item);
return (err == noErr);
}
SecAccessRef createAccess(NSString *accessLabel)
{
OSStatus err;
SecAccessRef access = nil;
NSArray *trustedApplications = nil;
SecTrustedApplicationRef myself;
err = SecTrustedApplicationCreateFromPath(NULL, &myself);
trustedApplications = [NSArray arrayWithObjects:(__bridge id)myself, nil];
err = SecAccessCreate((__bridge CFStringRef)accessLabel,
(__bridge CFArrayRef)trustedApplications, &access);
if (err) return nil;
return access;
}
Of course I also want to load them. My first try looks like this:
- (BOOL)loadDataFromKeychain
{
uint32_t serverNameLength = 0;
const char *serverName = NULL;
uint32_t usernameLength = 0;
const char *username = NULL;
uint32_t passwordLength = 0;
void **password = NULL;
OSStatus err = SecKeychainFindInternetPassword(NULL,
serverNameLength, serverName,
0, NULL,
usernameLength, username,
0, NULL,
0, 0,
0,
&passwordLength, password,
NULL); // How do I get the ItemRef?
return (err == noErr);
}
But this does not work, and I think I know why not. I don’t know how to get the SecKeychainItemRef for the SecKeychainFindInternetPassword method.
Maybe anyone can help me?
Instead of declaring password a void **, declare it a void * and pass &password for the second-to-last parameter.
You probably don't need the SecKeychainItemRef for what you're trying to accomplish.
By the way, have you tried using Keychain Access to verify the items are getting into the keychain?
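To make that concrete, a sketch of the corrected lookup (server name and account are placeholders; the returned buffer belongs to the Security framework and must be released with SecKeychainItemFreeContent):

const char *serverNameUTF8 = "smtp.example.com";     // placeholder
const char *usernameUTF8 = "user@example.com";       // placeholder
UInt32 passwordLength = 0;
void *password = NULL;

OSStatus err = SecKeychainFindInternetPassword(NULL,
                                               (UInt32)strlen(serverNameUTF8), serverNameUTF8,
                                               0, NULL,
                                               (UInt32)strlen(usernameUTF8), usernameUTF8,
                                               0, NULL,
                                               0, kSecProtocolTypeSMTP,
                                               kSecAuthenticationTypeDefault,
                                               &passwordLength, &password,
                                               NULL);   // item ref not needed here
if (err == noErr)
{
    // ... use passwordLength bytes at password ...
    SecKeychainItemFreeContent(NULL, password);
}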

How to badge file and folder using Cocoa

I want to badge a file and folder with some color (image). How can this be achieved?
I tried with Icon Services, and it works for files, but it is not working for folders.
I saw this behavior working in Dropbox (on 10.4, 10.5 and 10.6). How can this be done?
The blog post Cocoa Tutorial: Custom Folder Icons came very close for me, but it was not working as expected.
Is there another solution other than icon service?
The following function is the solution I found for the problem:
BOOL AddBadgeToItem(NSString* path,NSData* tag)
{
FSCatalogInfo info;
FSRef par;
FSRef ref;
Boolean dir = false;
if (tag&&(FSPathMakeRef([path fileSystemRepresentation],&par,&dir)==noErr))
{
HFSUniStr255 fork = {0,{0}};
SInt16 refnum = kResFileNotOpened;
FSGetResourceForkName(&fork);
if (dir)
{
NSString *name = #"Icon\r";
memset(&info,0,sizeof(info));
((FileInfo*)(&info.finderInfo))->finderFlags = kIsInvisible;
OSErr error = FSCreateResourceFile(&par,[name lengthOfBytesUsingEncoding:NSUTF16LittleEndianStringEncoding],(UniChar*)[name cStringUsingEncoding:NSUTF16LittleEndianStringEncoding],kFSCatInfoFinderXInfo,&info,fork.length, fork.unicode,&ref,NULL);
if( error == dupFNErr )
{
// file already exists; prepare to try to open it
const char *iconFileSystemPath = [[path stringByAppendingPathComponent:@"\000I\000c\000o\000n\000\r"] fileSystemRepresentation];
OSStatus status = FSPathMakeRef((const UInt8 *)iconFileSystemPath, &ref, NULL);
if (status != noErr)
{
fprintf(stderr, "error: FSPathMakeRef() returned %d for file \"%s\"\n", (int)status, iconFileSystemPath);
}
}else if ( error != noErr)
{
return NO;
}
}
else
{
BlockMoveData(&par,&ref,sizeof(FSRef));
if (FSCreateResourceFork(&ref,fork.length,fork.unicode,0)!=noErr)
{
//test
if (FSOpenResourceFile(&ref,fork.length,fork.unicode,fsRdWrPerm,&refnum)!=noErr) {
return NO;
}
if (refnum!=kResFileNotOpened) {
UpdateResFile(refnum);
CloseResFile(refnum);
if (FSGetCatalogInfo(&par,kFSCatInfoFinderXInfo,&info,NULL,NULL,NULL)==noErr) {
((ExtendedFileInfo*)(&info.extFinderInfo))->extendedFinderFlags = kExtendedFlagsAreInvalid;
FSSetCatalogInfo(&par,kFSCatInfoFinderXInfo,&info);
}
}
//Test end
return NO;
}
}
OSErr errorr = FSOpenResourceFile(&ref,fork.length,fork.unicode,fsRdWrPerm,&refnum);
if (errorr!=noErr) {
return NO;
}
if (refnum!=kResFileNotOpened) {
CustomBadgeResource* cbr;
int len = [tag length];
Handle h = NewHandle(len);
if (h) {
BlockMoveData([tag bytes],*h,len);
AddResource(h,kIconFamilyType,128,"\p");
WriteResource(h);
ReleaseResource(h);
}
h = NewHandle(sizeof(CustomBadgeResource));
if (h) {
cbr = (CustomBadgeResource*)*h;
memset(cbr,0,sizeof(CustomBadgeResource));
cbr->version = kCustomBadgeResourceVersion;
cbr->customBadgeResourceID = 128;
AddResource(h,kCustomBadgeResourceType,kCustomBadgeResourceID,"\p");
WriteResource(h);
ReleaseResource(h);
}
UpdateResFile(refnum);
CloseResFile(refnum);
if (FSGetCatalogInfo(&par,kFSCatInfoFinderXInfo,&info,NULL,NULL,NULL)==noErr) {
((ExtendedFileInfo*)(&info.extFinderInfo))->extendedFinderFlags = kExtendedFlagHasCustomBadge;
FSSetCatalogInfo(&par,kFSCatInfoFinderXInfo,&info);
}
}
}
return NO;
}
You can do this using the -setIcon:forFile:options: method on NSWorkspace, which lets you just specify an NSImage to apply to the file/folder at the path you give it.
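For example, assuming you have already composed the badged image, the call itself is just the following (path and image are placeholders):

NSImage *badgedImage = [NSImage imageNamed:@"BadgedFolder"];   // placeholder image
NSString *path = @"/Users/me/Documents/SomeFolder";            // placeholder path
if (![[NSWorkspace sharedWorkspace] setIcon:badgedImage forFile:path options:0])
{
    NSLog(@"setIcon:forFile:options: failed for %@", path);
}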
The intended approach is to create a Finder Sync app extension.