Deploy artifacts to a remote server using MSBuild

I have all the artifacts collected in the folder C:\artefacts; in this folder there are, for example, two projects: C:\artefacts\proj1 and C:\artefacts\proj2. I need to copy all the files from C:\artefacts\proj2 to the server \\10.77.0.3\Proj2, where Proj2 is a shared folder. If I run
"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:contentPath=C:\artefacts\proj2,includeAcls=true -dest:contentPath=\\10.77.0.3\Proj2
then everything is copied. But if I write this in MSBuild:
<MSBuild Projects="artefacts\proj2\proj2.sln"
         Properties="OutDir=\\10.77.0.3\C$\Proj2;Configuration=$(Configuration)"
         ContinueOnError="false"/>
the system reports that there is no permission to write to \\10.77.0.3\C$\Proj2.

Why isn't msdeploy enough? MSDeploy was built to transfer files onto web servers.
But okay, if you want to do it your way: if your servers are in a domain, check which account TeamCity's build agents run under and verify that this account has write permissions on 10.77.0.3. Note also that your MSBuild snippet writes to the administrative share \\10.77.0.3\C$\Proj2, which requires administrator rights on the target machine, whereas your working msdeploy command used the ordinary share \\10.77.0.3\Proj2.
If your servers are not in a domain, this can be a problem. You could always grant Everyone permissions on that directory, but I would not recommend it.
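A quick way to check, on the build agent itself, which account the build runs as and whether it can write to the share (the share path is the one from the question; the probe file name is made up):

```shell
rem Run these in a TeamCity Command Line build step so they execute as the agent's account.
whoami

rem Try to create and then delete a probe file on the share; an "Access is denied"
rem error here confirms the problem is share permissions, not MSBuild itself.
echo test > \\10.77.0.3\Proj2\write-probe.txt
del \\10.77.0.3\Proj2\write-probe.txt
```

If `whoami` shows a local system account, that account has no identity on the remote machine, which is the usual cause of this error.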

Related

MarkLogic - mlcp export of JSON documents

MarkLogic version: 9.0-6.2
MLCP version: 9.0.6
I am trying to export a collection (customer) of JSON documents with mlcp export, using the command below. I get no response from mlcp after I execute the script, and the output file is not created.
mlcp.sh export -ssl \
-host localhost \
-port 8010 \
-username uname \
-password pword \
-mode local \
-out_file_path /test/TestFiles/customer.txt \
-collection_filter customer \
-output_type document
I verified that data-hub-STAGING (port 8010) has a collection named customer with 100 JSON documents. The Linux user has execute permission on the script and write permission on the output path. The username and password are correct.
Interestingly, when I run the mlcp command below from the same directory, with the same Linux user, I am able to import the documents into data-hub-STAGING:
mlcp.sh import -ssl \
-host localhost \
-port 8010 \
-username uname \
-password pword \
-mode local \
-input_file_path /test/TestFiles/Customer \
-input_file_type documents \
-output_collections customer \
-output_uri_prefix /customer/ \
-output_uri_suffix ".json"
Is anything wrong with my export command?
Looks like you have a typo: the option is -output_file_path, not -out_file_path. Also, the value that follows that option is treated as a directory name, and the directory should not exist yet.
HTH!
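For reference, this is the export command with the corrected option applied (host, port, credentials, and flags are the ones from the question; the output directory name is an example and must not exist before the run):

```shell
mlcp.sh export -ssl \
    -host localhost \
    -port 8010 \
    -username uname \
    -password pword \
    -mode local \
    -output_file_path /test/TestFiles/customerExport \
    -collection_filter customer \
    -output_type document
```

mlcp creates the /test/TestFiles/customerExport directory itself and writes the exported documents into it.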

DLL TeamFoundation error with VS2015

I am trying to run a project built with TFS 2013 and VS 2013.
All of my colleagues can execute the build of this project, but I can't (I'm on VS 2015).
I understood that the problem comes from different TeamFoundation DLLs (present in the GAC for VS 2013, and located in C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TeamFoundation\Team Explorer for VS 2015).
There is also a problem with the workspace:
vs2013 = C:\Users\Username\AppData\Local\Microsoft\Team Foundation\5.0\Cache
vs2015 = C:\Users\Username\AppData\Local\Microsoft\Team Foundation\6.0\Cache
What can I do if I want to execute this build?
Could you provide the detailed error message from when you built your project? Do you mean that VS 2015 can't find the DLLs in your project?
According to your information, DLLs in the GAC won't be checked into source control. You can view them in the references of the solution, and they won't affect your build. However, if the project references some Team Foundation DLLs or third-party DLLs, you may get an error like "XX.dll can't be found" in your project when using VS 2015.
As a workaround for your situation, create a folder under the solution and add the Team Foundation DLLs or third-party DLLs to it. Check them into source control, and remember to edit the build definition to add the folder to the source settings. Then you can queue the build successfully.
Moreover, C:\Users\Username\AppData\Local\Microsoft\Team Foundation\6.0\Cache is the TFS cache: 5.0 stands for TFS 2013 and 6.0 stands for TFS 2015. This should have no effect on the build result; it just means you are using TFS 2015 instead of TFS 2013.

project delivery to a remote computer using MSBuild

I configured MSBuild for delivery to IIS. In the same MSBuild.xml I set up delivery for 4 more programs. Now everything works: TeamCity launches MSBuild.xml,
which builds 5 projects on the same computer, after which one application is delivered to the IIS server on another computer. Now I need another app to go to a different computer, not to IIS but simply to a specific folder.
Here is the build step for the project that I want to put on the remote computer instead of "OutDir=%(BuildArtifactsDir.FullPath)\DBUpgrader\":
<MSBuild Projects="DatabaseUpgrader\DatabaseUpgrader\DatabaseUpgrader.sln"
Properties="OutDir=%(BuildArtifactsDir.FullPath)\DBUpgrader\;Configuration=$(Configuration)"
ContinueOnError="false"/>
I did not quite understand (could you describe it in detail or give an example?). I have all the artifacts collected in the folder C:\artefacts; in this folder there are, for example, two projects: C:\artefacts\proj1 and C:\artefacts\proj2. I need to copy all the files from C:\artefacts\proj2 to the server \\10.77.0.3\Proj2, where Proj2 is a shared folder. If I run
"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync -source:contentPath=C:\artefacts\proj2,includeAcls=true -dest:contentPath=\\10.77.0.3\Proj2
then everything is copied. But if I write this in MSBuild:
<MSBuild Projects="artefacts\proj2\proj2.sln"
         Properties="OutDir=\\10.77.0.3\C$\Proj2;Configuration=$(Configuration)"
         ContinueOnError="false"/>
the system reports that there is no permission to write to \\10.77.0.3\C$\Proj2.
You can either build (or copy after the build) to a UNC path or a share, e.g. <MSBuild Properties="OutputPath=\\Foo\C$\Bar" ...>, or use msdeploy with the dirPath provider, since it sounds like you already use msdeploy for the IIS deployment.
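A sketch of the msdeploy variant with the dirPath provider, built from the command already shown in the question (the source and destination paths are the question's; the msdeploy.exe location may differ on your machine):

```shell
rem Sync a plain directory to a network share, outside of IIS.
"C:\Program Files (x86)\IIS\Microsoft Web Deploy V3\msdeploy.exe" ^
    -verb:sync ^
    -source:dirPath=C:\artefacts\proj2 ^
    -dest:dirPath=\\10.77.0.3\Proj2
```

This keeps the build's OutDir local and moves the copy-to-server step out of MSBuild, so the build itself never needs write access to the remote machine.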

BufferedReader in PyLucene

I am using PyLucene to build a search system, and I am using TREC data to test it. I have successfully written the indexer and searcher code. Now I want to use TREC topics to evaluate my system. For this there is a class named TrecTopicsReader which reads the queries from a TREC-formatted topics file. But its readQueries(BufferedReader reader) method needs a BufferedReader object for the topics file passed to it.
How can I do this in PyLucene? BufferedReader is not available in PyLucene's JCC wrapping.
After waiting for someone to answer, I also asked this question on the PyLucene developer mailing list.
Andi Vajda replied there. I am answering this question on Andi's behalf.
Quoting Andi:
In the PyLucene Makefile, find the jcc invocation and add java.io.BufferedReader to the long command line (don't forget the trailing \ as needed) and rebuild PyLucene.
More information:
In the PyLucene Makefile you will find the line GENERATE=$(JCC) $(foreach jar,$(JARS),--jar $(jar)) \. Within it there should be a line like --package java.io; add the class you want (BufferedReader) to the JCC invocation so that it becomes available to Python code.
Then compile and install PyLucene again. (You can find information about compilation and installation in PyLucene's documentation, or you can also use this.)
Also, to construct a BufferedReader from a file you will need FileReader, so add that as well.
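After editing the GENERATE line, the rebuild is the standard PyLucene build sequence (a sketch; run it from your PyLucene source directory, whose name below is an example, and targets may vary slightly between versions):

```shell
cd pylucene-x.y.z     # your PyLucene source tree (directory name is an example)
make                  # re-runs JCC with the edited GENERATE line and rebuilds the wrappers
sudo make install     # installs the rebuilt lucene module into your Python environment
```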
Just for completeness: after adding these lines, my GENERATE looks like this:
GENERATE=$(JCC) $(foreach jar,$(JARS),--jar $(jar)) \
$(JCCFLAGS) --use_full_names \
--package java.lang java.lang.System \
java.lang.Runtime \
--package java.util java.util.Arrays \
java.util.Collections \
java.util.HashMap \
java.util.HashSet \
java.util.TreeSet \
java.lang.IllegalStateException \
java.lang.IndexOutOfBoundsException \
java.util.NoSuchElementException \
java.text.SimpleDateFormat \
java.text.DecimalFormat \
java.text.Collator \
--package java.util.concurrent java.util.concurrent.Executors \
--package java.util.regex \
--package java.io java.io.StringReader \
java.io.InputStreamReader \
java.io.FileInputStream \
java.io.BufferedReader \
java.io.FileReader \
--exclude org.apache.lucene.sandbox.queries.regex.JakartaRegexpCapabilities \
--exclude org.apache.regexp.RegexpTunnel \
--python lucene \
--mapping org.apache.lucene.document.Document 'get:(Ljava/lang/String;)Ljava/lang/String;' \
--mapping java.util.Properties 'getProperty:(Ljava/lang/String;)Ljava/lang/String;' \
--sequence java.util.AbstractList 'size:()I' 'get:(I)Ljava/lang/Object;' \
org.apache.lucene.index.IndexWriter:getReader \
--version $(LUCENE_VER) \
--module python/collections.py \
--module python/ICUNormalizer2Filter.py \
--module python/ICUFoldingFilter.py \
--module python/ICUTransformFilter.py \
$(RESOURCES) \
--files $(NUM_FILES)
Doing this alone doesn't suffice; you also have to compile the Lucene benchmark lib, which is not included in the installed libs by default, because TrecTopicsReader lives in the benchmark API.
To compile and install benchmark:
You have to modify the build.xml inside the main lucene folder (the one containing the benchmark folder), and then include the resulting jar in the main Makefile so that it is installed into the Python libs as an egg.
build.xml:
You have to make three modifications. For simplicity, follow jar-test-framework and, wherever it appears, create a similar pattern for jar-benchmark.
The three changes are:
1) Replace <target name="package" depends="jar-core, jar-test-framework, build-modules, init-dist, documentation"/> with <target name="package" depends="jar-core, jar-test-framework, jar-benchmark, build-modules, init-dist, documentation"/>
2) For the rule
<target name="jar" depends="jar-core,jar-test-framework"
description="Jars core, codecs, test-framework, and all modules">
<modules-crawl target="jar-core"/>
</target>
replace it with
<target name="jar" depends="jar-core,jar-test-framework, jar-benchmark"
description="Jars core, codecs, test-framework, and all modules">
<modules-crawl target="jar-core"/>
</target>
3) Add the following target/rule after the target named jar-test-framework
<target name="jar-benchmark">
<ant dir="${common.dir}/benchmark" target="jar-core" inheritAll="false">
<propertyset refid="uptodate.and.compiled.properties"/>
</ant>
</target>
Makefile:
Here, too, you have to make three modifications. For simplicity, follow HIGHLIGHTER_JAR and add similar rules for BENCHMARK_JAR. The three changes are:
1) Find JARS+=$(HIGHLIGHTER_JAR) and add JARS+=$(BENCHMARK_JAR) after it in a similar manner.
2) Find HIGHLIGHTER_JAR=$(LUCENE)/build/highlighter/lucene-highlighter-$(LUCENE_VER).jar and add BENCHMARK_JAR=$(LUCENE)/build/benchmark/lucene-benchmark-$(LUCENE_VER).jar after this line in a similar manner.
3) Find the rule $(ANALYZERS_JAR): and add another rule for $(BENCHMARK_JAR): after it:
$(BENCHMARK_JAR): $(LUCENE_JAR)
	cd $(LUCENE)/benchmark; $(ANT) -Dversion=$(LUCENE_VER) compile
For completeness, here are my final Makefile and build.xml files.

NetBeans doesn't recognize Makefile.am

I'm coming from an Objective-C/Xcode background.
I'm used to working with C projects already imported into Xcode, but now I want to analyse an existing implementation of an algorithm I'm interested in integrating into my project.
The thing is, this project is written entirely in C and has nothing to do with Objective-C/Xcode.
I'm not sure what the best way is to view a purely C project on a Mac, so I installed NetBeans for C/C++.
The problem is that when I try to create a New Project in NetBeans and select C/C++ Project with Existing Sources, it complains that
no make files or configure scripts were found
in the root directory, although the project clearly has a Makefile.am.
I know that the Balsa project is written for Linux, but I'm not interested in building the binary; I just want to look at the source code in an IDE kind of way (i.e. I can click on a function call and see where it's implemented, etc.).
So, in short, my question is: why isn't NetBeans recognising my Makefile.am?
and just for reference here is the content of the Makefile.am
#intl dir needed for tarball --disable-nls build.
DISTCHECK_CONFIGURE_FLAGS=--disable-extra-mimeicons --without-gnome --without-html-widget
SUBDIRS = po sounds images doc libbalsa libinit_balsa src
# set tar in case it is not set by automake or make
man_MANS=balsa.1
pixmapdir = $(datadir)/pixmaps
pixmap_DATA = gnome-balsa2.png
desktopdir = $(datadir)/applications
desktop_in_files = balsa.desktop.in balsa-mailto-handler.desktop.in
desktop_DATA = balsa.desktop balsa-mailto-handler.desktop
#INTLTOOL_DESKTOP_RULE#
balsa_extra_dist = \
GNOME_Balsa.server.in \
HACKING \
balsa-mail-style.xml \
balsa-mail.lang \
balsa.1.in \
balsa.spec.in \
bootstrap.sh \
docs/mh-mail-HOWTO \
docs/pine2vcard \
docs/vconvert.awk \
$(desktop_in_files) \
gnome-balsa2.png \
intltool-extract.in \
intltool-merge.in \
intltool-update.in \
mkinstalldirs
if BUILD_WITH_G_D_U
balsa_g_d_u_extra_dist = gnome-doc-utils.make
endif
if !BUILD_WITH_UNIQUE
serverdir = $(libdir)/bonobo/servers
server_in_files = GNOME_Balsa.server
server_DATA = $(server_in_files:.server.in=.server)
$(server_in_files): $(server_in_files).in
sed -e "s|\#bindir\#|$(bindir)|" $< > $@
endif
EXTRA_DIST = \
$(balsa_extra_dist) \
$(balsa_g_d_u_extra_dist)
if BUILD_WITH_GTKSOURCEVIEW2
gtksourceviewdir = $(BALSA_DATA_PREFIX)/gtksourceview-2.0
gtksourceview_DATA = balsa-mail.lang \
balsa-mail-style.xml
endif
DISTCLEANFILES = $(desktop_DATA) $(server_DATA) \
intltool-extract intltool-merge intltool-update \
gnome-doc-utils.make
dist-hook: balsa.spec
cp balsa.spec $(distdir)
#MAINT#RPM: balsa.spec
#MAINT# rm -f *.rpm
#MAINT# $(MAKE) distdir="$(PACKAGE)-#BALSA_VERSION#" dist
#MAINT# cp $(top_srcdir)/rpm-po.patch $(top_builddir)/rpm-po.patch
#MAINT# rpm -ta "./$(PACKAGE)-#BALSA_VERSION#.tar.gz"
#MAINT# rm $(top_builddir)/rpm-po.patch
#MAINT# -test -f "/usr/src/redhat/SRPMS/$(PACKAGE)-#VERSION#-#BALSA_RELEASE#.src.rpm" \
#MAINT# && cp -f "/usr/src/redhat/SRPMS/$(PACKAGE)-#VERSION#-#BALSA_RELEASE#.src.rpm" .
#MAINT# -for ping in /usr/src/redhat/RPMS/* ; do \
#MAINT# if test -d $$ping ; then \
#MAINT# arch=`echo $$ping |sed -e 's,/.*/\([^/][^/]*\),\1,'` ; \
#MAINT# f="$$ping/$(PACKAGE)-#VERSION#-#BALSA_RELEASE#.$$arch.rpm" ; \
#MAINT# test -f $$f && cp -f $$f . ; \
#MAINT# fi ; \
#MAINT# done
#MAINT#snapshot:
#MAINT# $(MAKE) distdir=$(PACKAGE)-`date +"%y%m%d"` dist
#MAINT#balsa-dcheck:
#MAINT# $(MAKE) BALSA_DISTCHECK_HACK=yes distcheck
## to automatically rebuild aclocal.m4 if any of the macros in
## `macros/' change
bzdist: distdir
#test -n "$(AMTAR)" || { echo "AMTAR undefined. Run make bzdist AMTAR=gtar"; false; }
-chmod -R a+r $(distdir)
$(AMTAR) chojf $(distdir).tar.bz2 $(distdir)
-rm -rf $(distdir)
# macros are not used any more by current configure.in, see also
# post by Ildar Mulyukov to balsa-list, 2006.06.27
# ACLOCAL_AMFLAGS = -I macros
UPDATE
I tried this answer, but I got the following:
autoreconf --install
configure.in:250: warning: macro `AM_GLIB_GNU_GETTEXT' not found in library
glibtoolize: putting auxiliary files in `.'.
glibtoolize: copying file `./ltmain.sh'
glibtoolize: putting macros in AC_CONFIG_MACRO_DIR, `m4'.
glibtoolize: copying file `m4/libtool.m4'
glibtoolize: copying file `m4/ltoptions.m4'
glibtoolize: copying file `m4/ltsugar.m4'
glibtoolize: copying file `m4/ltversion.m4'
glibtoolize: copying file `m4/lt~obsolete.m4'
glibtoolize: Remember to add `LT_INIT' to configure.in.
glibtoolize: Consider adding `-I m4' to ACLOCAL_AMFLAGS in Makefile.am.
glibtoolize: `AC_PROG_RANLIB' is rendered obsolete by `LT_INIT'
configure.in:250: warning: macro `AM_GLIB_GNU_GETTEXT' not found in library
configure.in:249: error: possibly undefined macro: AC_PROG_INTLTOOL
If this token and others are legitimate, please use m4_pattern_allow.
See the Autoconf documentation.
configure.in:250: error: possibly undefined macro: AM_GLIB_GNU_GETTEXT
configure.in:301: error: possibly undefined macro: AC_MSG_ERROR
autoreconf: /usr/bin/autoconf failed with exit status: 1
I'm looking into using the suggestions in the output..
Interesting. I just tried downloading Balsa and noticed that they distribute the Makefile.am and configure.in files instead of a ready-to-run configure script. You could let the package maintainers know they aren't doing anyone any favors by not precompiling their own autotools sources.
Makefile.am is not a real Makefile. It's the input from which automake generates Makefile.in, which in turn gets translated into a real Makefile by the configure script.
Try the following steps:
Download the sources to balsa again clean. Then from the command prompt type the following:
autoreconf --install
(If you don't have autoreconf, you likely need to install the autotools packages - ughh...)
That should generate the configure script. Then type:
./configure
It complained about some missing GMime dependencies, so I didn't see it actually generate a Makefile. Once you get to the point where a Makefile is generated, you should be able to point NetBeans to "open project from existing sources".
As per abbood's request...
NetBeans is not very good for C development. One approach would be to build an Xcode project around the source base. The maintainers of the project may even accept the Xcode project as a contribution.