I have a pretty simple application that uses a web service to transfer data to my server's database. It is very important for me to keep this application as one single file, without extra XML files it needs in order to work, but that is currently the case. I think the XML file holds the connection information for the web service, so without it the application crashes. Is there a way to get the application to work without this XML file, or a way to embed the XML inside the exe itself?
Any way to accomplish this would be much appreciated.
It sounds like what you want to do is invoke the web service programmatically (without a Visual-Studio-added web reference).
I imagine all of the configuration settings that exist in an app.config file can be hard-coded in one way or another; the various config sections will just require varying degrees of ingenuity to make it happen.
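For example, here is a minimal sketch of building a WCF client entirely in code, so no <system.serviceModel> section (and hence no config file) has to ship next to the exe. The IDataService contract and the endpoint URL are hypothetical stand-ins for whatever your service actually exposes:

    using System;
    using System.ServiceModel;

    // Hypothetical contract; reuse the interface your service actually defines.
    [ServiceContract]
    public interface IDataService
    {
        [OperationContract]
        string Echo(string text);
    }

    class Program
    {
        static void Main()
        {
            // Everything app.config would normally supply is created in code.
            var binding = new BasicHttpBinding();
            var address = new EndpointAddress("http://example.com/DataService.svc");

            var factory = new ChannelFactory<IDataService>(binding, address);
            IDataService client = factory.CreateChannel();
            Console.WriteLine(client.Echo("hello"));
            ((IClientChannel)client).Close();
            factory.Close();
        }
    }

If the service is an older ASMX web service, the generated proxy works similarly: its Url property can be set in code, so the endpoint does not have to live in a config file.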
Related
I have a task to do: I need to build a WCF service that allows a client to import a file into a database using the server backend. To do this, I need to communicate to the server the settings, the events needed to start and configure the import, and, most importantly, the file to import. The problem is that these files can be extremely large (much bigger than 2 GB), so it is not possible to send them via the browser as they are. The only thing that comes to mind is to split these files and send them to the server one piece at a time.
I also have another requirement: I need to be 100% sure that these files are not corrupted, so I also need to implement some sort of policy for detecting and, if possible, recovering from errors.
Do you know if there is some sort of API or DLL that can help me achieve these goals, or is it better to write the code myself? And in that case, what would be the optimal size of the packets?
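One hedged sketch, in C#, of the client-side splitting the question describes: read fixed-size chunks and hash each one so the server can verify it on arrival and request a resend on mismatch. UploadChunk is a hypothetical operation on your service contract, and 4 MB is only a starting point to tune against the binding's maxReceivedMessageSize:

    using System;
    using System.IO;
    using System.Security.Cryptography;

    class ChunkedUploader
    {
        const int ChunkSize = 4 * 1024 * 1024; // 4 MB; tune for your network

        static void Upload(string path)
        {
            using (var stream = File.OpenRead(path))
            using (var sha = SHA256.Create())
            {
                var buffer = new byte[ChunkSize];
                long offset = 0;
                int read;
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Per-chunk digest: the server recomputes it and asks for
                    // a resend if the two values differ.
                    byte[] digest = sha.ComputeHash(buffer, 0, read);

                    // client.UploadChunk(path, offset, read, buffer, digest);
                    // ^ hypothetical WCF operation.

                    offset += read;
                }
            }
        }
    }

Because each chunk carries its offset, the transfer can also resume after a dropped connection instead of restarting from zero.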
With the help of this: CocoaLumberjack FileLogger logging to multiple files, I am able to create multiple log files (with the same name, in multiple directories).
But I need to use DDLog in one of my projects where it is required to write multiple log files in the same directory with different names.
Is there any way to attain this?
DDFileLogger uses a logFileManager for managing log files. By default it uses DDLogFileManagerDefault. You can create your own file manager that conforms to the DDLogFileManager protocol and provides whatever behavior you need.
The easiest way of doing this is to copy DDLogFileManagerDefault and change it to fit your needs.
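In practice, subclassing DDLogFileManagerDefault and overriding its two naming hooks is often enough; copying the whole class is rarely necessary. A minimal Objective-C sketch, where the NetworkLog prefix and the logsDir variable are hypothetical:

    #import <CocoaLumberjack/CocoaLumberjack.h>

    // One manager per log "stream"; give each instance its own file-name prefix.
    @interface PrefixedLogFileManager : DDLogFileManagerDefault
    @end

    @implementation PrefixedLogFileManager

    - (NSString *)newLogFileName {
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        formatter.dateFormat = @"yyyy'-'MM'-'dd'--'HH'-'mm'-'ss";
        NSString *stamp = [formatter stringFromDate:[NSDate date]];
        return [NSString stringWithFormat:@"NetworkLog %@.log", stamp];
    }

    - (BOOL)isLogFile:(NSString *)fileName {
        return [fileName hasPrefix:@"NetworkLog"] && [fileName hasSuffix:@".log"];
    }

    @end

    // Usage: point a DDFileLogger at the shared directory through the manager;
    // a second subclass (or a configurable prefix) covers the other log file.
    PrefixedLogFileManager *manager =
        [[PrefixedLogFileManager alloc] initWithLogsDirectory:logsDir];
    DDFileLogger *fileLogger = [[DDFileLogger alloc] initWithLogFileManager:manager];
    [DDLog addLogger:fileLogger];

Because isLogFile: only matches its own prefix, each manager's rolling and cleanup leave the other logger's files in the same directory alone.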
Recently we managed to solve a data-transfer problem by finding out there is an additional .xsl file we could use. Since .xsl files seem to be the main way of controlling information flow in dcm4chee (besides the JMX configuration, of course), I am wondering whether there is some kind of list or index that enumerates all the .xsl files one could use and their places in the workflow.
I mean, it would be nice to know exactly at which points we could influence the process.
I tried to google for something like that, but no success so far.
Any help will be appreciated.
An online list of demo .xsl files is available at
https://svn.code.sf.net/p/dcm4che/svn/dcm4chee/dcm4chee-arc/trunk/dcm4jboss-hl7/src/etc/conf/dcm4chee-hl7
Which one will be used is configurable through the JMX console, and the configuration is then written back into the configuration files.
So searching for .xsl, both as a file extension and as a full-text search pattern, in your dcm4chee installation directory will point you exactly to the places where the workflow can be influenced.
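For example (a sketch; /opt/dcm4chee stands in for your actual installation directory):

    # Every stylesheet shipped with the installation
    find /opt/dcm4chee -name '*.xsl'

    # Every place the configuration refers to a stylesheet
    grep -rn '\.xsl' /opt/dcm4chee/server/default/conf

Each hit from the second command is a configuration point where a custom stylesheet can be plugged in.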
It is a Java-style, Linux-style open-source project, so don't expect it to be too user friendly or Google friendly. I'm not aware of any easy-to-consume overview table/list/index, but it would be nice to have one. It is open source, so maybe you can add one.
This is the first time I am working on a big project for a client, so I was not sure how to solve this problem. I have come up with two different ideas, but I need a professional's opinion about which one is better. :)
Situation:
There is an application which runs on various clients' iPads. Application data is stored in a giant XML file. This XML file is shared among all clients via a server: the server holds a centralised copy, and each client has its own copy. Once a client makes changes to its XML copy, it updates the server copy, and the other clients then update their copies from the updated server copy.
Only one client can make changes at a time. To enforce this, I have logic by which a client must obtain ownership from the server before it starts editing the XML, and the server only grants ownership to one client at a time.
Visual representation: (diagram omitted)
Now, on the client side, I have to design the logic by which I update my client copy and upload it to the server. There are two options.
Option 1:
In option 1, I directly manipulate the XML file using the GDataXML parser and upload that copy to the server. For persistence, I save the client copy in the Documents directory on the iPad.
Option 2:
In option 2, I read the XML file and create a Core Data representation of it for local storage. Whenever I update data inside Core Data, I change the XML file too and then upload that file to the server. Double the work, but I guess better persistence.
Now, which one is more robust and advisable? Personally I was planning to go with option 2, because it seems more robust: I am persisting application data in Core Data. Option 1 seems like less work, but I don't know how good the persistence would be.
Sorry for the lengthy question,
Thanks for any input given.
There are a number of factors which would influence selecting the second option over the first.
How big is the XML file? If you need to work with very large documents, you may need to incrementally parse the XML (SAX-style) into Core Data; see the sketch after this list. This will allow you to access the document's contents without loading it all into memory at once.
Do you need to run complex queries on the data? If so, you may be better off using Core Data fetch predicates rather than XPath or XSLT.
Are you already using Core Data? Depending on how the XML data is structured, it might be simpler overall to import the data into your existing persistent store.
Otherwise, you can probably make do with parsing the entire document and either traversing the resulting tree or querying it with XPath.
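To make the SAX point concrete, here is a minimal event-driven parse using Foundation's NSXMLParser. Record and fileURL are hypothetical; the idea is to create or update managed objects inside the callback instead of building a whole tree in memory:

    #import <Foundation/Foundation.h>

    @interface RecordXMLImporter : NSObject <NSXMLParserDelegate>
    @end

    @implementation RecordXMLImporter

    - (void)parser:(NSXMLParser *)parser
    didStartElement:(NSString *)elementName
       namespaceURI:(NSString *)namespaceURI
      qualifiedName:(NSString *)qName
         attributes:(NSDictionary *)attributeDict {
        if ([elementName isEqualToString:@"Record"]) {
            // Insert or update the matching NSManagedObject here, saving the
            // context in batches so memory stays flat for huge documents.
        }
    }

    @end

    // Usage:
    RecordXMLImporter *importer = [[RecordXMLImporter alloc] init];
    NSXMLParser *parser = [[NSXMLParser alloc] initWithContentsOfURL:fileURL];
    parser.delegate = importer;
    [parser parse];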
If you need to create an object graph based on what you get from the server and show it to the user (which you most probably need to do), you should stick with the second option, since it allows easy and robust data persistence.
If you do not need to present the user with any data from the XML file, you can, of course, just store it in the Documents directory.
So, if this is a client application and it has at least some visual representation of the data from the XML file, you should use Core Data.
If you want regular updates of the data, then use Core Data.
I have an SQL query in my controller action (select * from table ... where ... et al). This query is fired when the user submits a page.
It is about 50 lines of code and takes in 3 parameters.
For example: select * from employee where empDate='params.empDt' and empNum=params.empNum order by params.sort asc.
In the above case, the query takes in 3 parameters (params.empDt, params.empNum, and params.sort) dynamically and is executed.
Since it is a huge query, I am looking at externalising it to an .sql file. The externalised file would then be read in the service and the query executed.
So I created the .sql file at grails-app/conf/sql/read_date.sql.
When I read this file and run the app using run-app, it works fine. I am able to read the file and execute the query.
However, when I create a WAR and deploy it on Tomcat, the application doesn't read the file, and I get a FileNotFoundException.
java.io.FileNotFoundException: grails-app/conf/sql/read_date.sql (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:106)
Any inputs?
Well, you see, the thing is that the grails-app/conf structure does not exist within the WAR. If you use something like 7-Zip to open the WAR and look around in it, you will see that all the classes from conf are now in WEB-INF/classes. The directory structure is only really guaranteed during development. If you create the SQL file in web-app/sql, this might solve the problem for you, as the directories under web-app are preserved during WAR generation. web-app contains things like the JavaScript files required by the application, which are then accessible via http://myhost:port/AppName/sql/my.sql. If I were you, though, I would store the SQL in the database.
Hope this helps.
John
I can't really see the point of putting the actual SQL in an external file. Just put the SQL statements in the service class and things will be fine. There is probably no increase in speed from having the SQL statements in an external file (which would need to be read from disk every time the query is executed). If you need to change the SQL statement often (which would be a good reason to externalise it), you could evaluate the possibility of reading it from the database itself (from a text field, or something similar)...
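For illustration, a minimal sketch of keeping the statement in a Grails service with bound parameters, using groovy.sql.Sql and the injected dataSource. EmployeeService and findEmployees are hypothetical names; the whitelist is there because the ORDER BY column is an identifier and cannot be bound as a placeholder:

    import groovy.sql.Sql

    class EmployeeService {
        def dataSource  // injected by Grails

        List findEmployees(String empDt, String empNum, String sortCol) {
            // Placeholders guard against SQL injection; the sort column is
            // validated against a whitelist instead.
            def allowed = ['empDate', 'empNum']
            String sort = allowed.contains(sortCol) ? sortCol : 'empDate'
            Sql sql = new Sql(dataSource)
            try {
                String query = 'select * from employee where empDate = ? ' +
                               'and empNum = ? order by ' + sort + ' asc'
                return sql.rows(query, [empDt, empNum])
            } finally {
                sql.close()
            }
        }
    }

Note that interpolating params straight into the statement, as in the question's example, is vulnerable to SQL injection regardless of where the SQL text lives.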
If you really want to use the external text file approach from a service or controller, the easiest method is probably to use ServletContext.getResource to get a reference to the data packaged in your web app. Since getResource takes a context-relative URL, you can use
getResource("/someSQLstatements.txt") and trust it to work regardless of the location of your web application on the server's local filesystem or the path it's mapped to by the servlet container. That should work well. (See also: http://www.velocityreviews.com/forums/t131134-specifying-path-for-file-to-be-read-by-servlet-with-tomcat.html)
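A sketch of that approach in a Grails service, assuming the file has been moved to web-app/sql as suggested above (ServletContextHolder is the Grails 1.x/2.x way to reach the servlet context):

    import org.codehaus.groovy.grails.web.context.ServletContextHolder

    String loadQuery() {
        // Context-relative path; resolves in both run-app and a packed WAR.
        def stream = ServletContextHolder.servletContext
                                         .getResourceAsStream('/sql/read_date.sql')
        if (stream == null) {
            throw new FileNotFoundException('/sql/read_date.sql not found in web app')
        }
        try {
            return stream.getText('UTF-8')
        } finally {
            stream.close()
        }
    }

getResourceAsStream avoids the java.io.File problem entirely, because it never assumes a particular working directory on the local filesystem.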
have fun.