How to copy an *.exe file from one computer and paste it to another computer over LAN - vb.net

For example, I have a client-server application that often gets updated (it's an exe file). If I download the update on the server machine, then the same update should be transferred to the client machines, or vice versa.
At the moment the update is downloaded on all machines individually. My idea is that the update should be downloaded only on the server, and I'm planning to add an option in the client to copy the *.exe file (the update) directly from the server and place it in the installation path.
How can I make this happen?
NOTE: the update is a self-extracting file.

There is already a technology for achieving this called ClickOnce. The client application can be "published" to a share that is accessible to all the clients, then each time the client is executed a version check is done - if a later version is detected on the share then it is downloaded before execution continues.
You can read more about this here: ClickOnce Security and Deployment. Creating a ClickOnce package and publishing it is a feature already built into Visual Studio, so you do not need to write any code.

You have to write an application that will be split into different parts:
Detect file changes either on the client or the server machine
Perform the copy of the exe file to the server or the other clients
In any case, you can't tell an exe file to update itself.
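As a rough sketch of the copy step (the share path, install path, and use of file version numbers below are assumptions for illustration, not part of the question), a small separate updater exe could compare versions against the server share, copy the newer build, and then relaunch the main application - a running exe cannot overwrite itself:

' Minimal updater sketch - all paths are examples.
Imports System.Diagnostics
Imports System.IO

Module Updater
    Sub Main()
        Dim serverCopy As String = "\\SERVER\Updates\MyApp.exe"       ' assumed share on the server
        Dim localCopy As String = "C:\Program Files\MyApp\MyApp.exe"  ' assumed installation path

        ' Compare the embedded file versions; only copy when the server has a newer build.
        Dim serverVersion As New Version(FileVersionInfo.GetVersionInfo(serverCopy).FileVersion)
        Dim localVersion As New Version(FileVersionInfo.GetVersionInfo(localCopy).FileVersion)

        If serverVersion > localVersion Then
            ' The main application must be closed at this point, otherwise the copy will fail.
            File.Copy(serverCopy, localCopy, True)
        End If

        ' Hand control back to the (possibly updated) application.
        Process.Start(localCopy)
    End Sub
End Module

The same idea works in the other direction (the server pulling from a client) as long as the machine doing the copying has read access to the share and write access to the installation folder.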

Related

What would cause SSIS to ignore Package Configuration Connections?

I have a very simple SSIS package that has 2 connections defined in the Connection Manager section: an MS Access data source and an MS SQL data source destination. All this package does is truncate a table in the SQL destination and import data from MS Access into the SQL table. This works as expected during development within VS2013.
Now, I also have enabled Package Configurations for the package and have a couple of XML Configuration files (1 for each Connection) in a folder on the root of the C: drive. The Configuration file connections differ based on the server where they reside, but the folder structure exists on both servers so the package can execute against the server from which it is run.
I've checked the box to enable Package Configurations and deployed the package to 2 different servers: one for Development and the other for QA. When I execute the package via the SSMS Integration Services package execution on my Development server, the package utilizes the Development table. But when I execute the same package in my QA environment, it also utilizes the Development table.
Since the Development connection is the one that is embedded in the package via the Connection Manager, it appears (presumably anyway) that the package is using the embedded connection and ignoring the configuration files.
I have alternatively explicitly added the path to the configuration file within the Execute Package Utility in the Configurations section to see if it made any difference, but the results are the same. The configuration file is not acknowledged. So it again appears that the package is using the embedded connections that are defined in the Connection Managers.
I suppose I "may" be able to remove the Connections from the package in the Connection Managers section and turn off validations during Design time and then deploy again in effort of forcing the package to use the Config files but that doesn't seem like the way to go and a hack at best; provided that it would even work.
Not that I think it should make a difference but to provide more detail, here is a bit more concerning my Server Configuration:
Development - SQL 2014 [ServerName]
Quality Assurance - SQL 2014 [ServerName]\[InstanceName]
I don't recall ever having this issue before, hence my reason for posting.
Ok, since I am working against a deadline, I was hoping to acquire an answer sooner rather than later. But since that wasn't the case, and because I've seen variations of this question before without a definitive answer (at least one that satisfies this scenario), I performed some tests and am posting this for others who may also need this information.
The following conditions will cause Configuration Files to be ignored even if Package Configurations are enabled in an SSIS package. These findings are based on actual tests and affirmed to be true for SQL 2014, although prior versions may also be affected.
Disclaimer: these tests focused on the Configuration Files as they pertained to actual server connections (e.g. connection strings) and not any other variables, although it's conceivable that any other values within the configuration file would also be affected.
1. Execution of the package from within SSMS while connected to the Integration Services component and selecting Run Package. The noted behavior is that whatever connection value was acquired prior to deployment to the server is the one that will be used, irrespective of the Configuration Files.
Note: this holds true even if configurations are added in the Configurations section prior to execution. Although there is mention that the configurations are not imported and cannot be edited, the fact is they were not used during testing either.
2. If a SQL job is of type SQL Server Integration Services Package and no Configuration File references are actually added to the Configurations tab, the job will execute under whatever values were used during the last build within BIDS prior to deployment (embedded values).
3. If multiple configuration files are used by the package but some are omitted in the Configurations tab of the job, the job will use those Configuration Files that are designated but will default to the last values used in development (embedded values) for those which are not present in the context of the job.
Some of these behaviors are not very obvious, and I'd imagine it could be a frustrating puzzle for someone who, following the rules of most online tutorials on Package Configuration files, expected more straightforward results.
Identifying the root cause was a time-consuming testing task for me, and although I'm not an expert, I'm certainly far from a novice with SSIS.
At any rate, I hope this saves someone else hours of work and investigation.
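If you need to guarantee that a particular configuration file is honoured regardless of the conditions above, one workaround is to execute the package programmatically and apply the .dtsConfig yourself through the SSIS managed runtime. This is only a sketch: the package and config paths are placeholders, and it assumes a reference to the Microsoft.SqlServer.ManagedDTS assembly.

' Sketch: load a package and explicitly apply an environment-specific .dtsConfig before executing.
Imports Microsoft.SqlServer.Dts.Runtime

Module RunWithConfig
    Sub Main()
        Dim app As New Application()

        ' Load the package from the file system (LoadFromSqlServer works similarly for deployed packages).
        Dim pkg As Package = app.LoadPackage("C:\Packages\ImportAccess.dtsx", Nothing)

        ' Force the QA connection strings onto the package at run time.
        pkg.ImportConfigurationFile("C:\SSIS_Config\QA_Connections.dtsConfig")

        Dim result As DTSExecResult = pkg.Execute()
        Console.WriteLine("Package finished with result: " & result.ToString())
    End Sub
End Module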

"Install" SAP custom program in SAP System

I've created a SAP program and I want to deploy it in another SAP system.
I know I can import the Transport Request files with the created program to the new system but I'm looking for other options.
Is it possible to "install"/import my program to another SAP system?
Regards
I can only think you don't want to use the transport system because the systems are not part of the same landscape? If so, you can still use the transport system, you just need to manually move the required files around.
But, there is another approach you can follow - using SAPlink. It's an open source program that allows you to download ABAP source, dictionary objects, etc. from one system into files and then upload them into another system. Of course, both systems will need to have SAPlink installed for this to work.
This is somewhat by design; SAP is the largest OTS system available, and there have to be some controls to ensure that people cannot install software if they are not specifically given the authorization to do so.
Even using SAPlink (which mjturner suggests) requires you to have the ability to install that software first, and I doubt you will find it in very many productive landscapes, so that likely won't be an option.
Assuming you have developer authorization, you can always download the source code from your development SAP system and then upload it from within the ABAP editor (SE38) using Utilities -> More Utilities -> Upload/Download. Note that this doesn't work in the class editor, so cut and paste is another option.
There are three ways to move transports from one system to another.
1. Moving the transport files from the directories “data” and “cofiles” manually.
When the transport is released in the source system, SAP automatically puts the transport files into the transport directory on the file system. These files can easily be copied to the second system and imported via transaction “STMS”.
2. Using CAR files
CAR files are packed files, like a zip file. They contain the data and cofiles.
car.exe -cvf packedFile.car data\R900000.XXX cofiles\K900000.XXX
(car.exe is a SAP standard tool, XXX is the system ID)
These CAR files can be imported via transaction SAINT. This allows importing files from the frontend into the data and cofiles directories without direct access to the file system. After importing the file via SAINT, the transport can be imported using STMS. This is the common way to transport software to other systems outside the current landscape.
3. Using SAR and PAT files
These files are more special. They allow software to be installed as an add-on in SAP, which is required if the program should be certified by SAP. They have to be created using the AAK (SAP Add-On Assembly Kit). Unfortunately, I have not created these files myself yet, but it seems to be very complex to get this running, because there are some checks which have to be passed. The files can be imported via transaction SPAM (upload) and SAINT (import).

SFTP automation Using WinSCP or FileZilla

So, as part of my daily jobs, I have to transfer one file from our customer's server to our internal server, and any responses back.
Each customer effectively has one file up and one file down each day.
I have an SFTP server here that I can use and that is already used manually for a few sites.
I'm looking to automate as many sites as possible using batch files on a scheduled task.
Initially, I'm looking at automating the internal side of the process.
We simply have a requests folder that needs to import from the SFTP server (then delete the original on the SFTP server) and a responses folder which needs to copy to a 'sent' folder and then export to the SFTP server (also deleting the original).
On the SFTP server I have a "to site" and "from site" folder. Each file is site specific followed by a variable. So SiteNameImport.<variable> and SiteNameExport.<variable>
EDIT:
I'm asking this as I'm a novice at scripting and basically have no idea what to do.
I've tried reading the automation guide on WinSCP website but a lot of it means nothing to me.
FileZilla doesn't support automation; you're better off with WinSCP. They have some scripting examples here, as well as any other information you'll need to build the script's functionality. You'll just need to add the specifics (like deleting sent files and so on). CuteFTP is another solution you can script with, but I believe you have to pay for a licence. I suggest VBScript; examples can be found here for VBScript.
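If a plain WinSCP script feels too opaque, the WinSCP .NET assembly (WinSCPnet.dll) exposes the same engine to VB.NET. The following is only a starting sketch: the host name, credentials, fingerprint, and folder/file names are placeholders for your own values, and error handling beyond Check() is omitted.

' Sketch of one site's daily import/export using the WinSCP .NET assembly.
Imports System.IO
Imports WinSCP

Module SftpSync
    Sub Main()
        Dim options As New SessionOptions With {
            .Protocol = Protocol.Sftp,
            .HostName = "sftp.example.com",
            .UserName = "user",
            .Password = "secret",
            .SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxxxxxxxx..."  ' required when scripting SFTP
        }

        Using session As New Session()
            session.Open(options)

            ' Import: pull request files down, deleting the originals on the SFTP server.
            session.GetFiles("/from site/SiteNameImport.*", "C:\Requests\", True).Check()

            ' Export: keep a copy in 'sent', then push responses up and delete the local originals.
            For Each f As String In Directory.GetFiles("C:\Responses", "SiteNameExport.*")
                File.Copy(f, Path.Combine("C:\Sent", Path.GetFileName(f)), True)
            Next
            session.PutFiles("C:\Responses\SiteNameExport.*", "/to site/", True).Check()
        End Using
    End Sub
End Module

A small console exe built around this can then be run from the scheduled task in place of a batch file, one task (or one loop over a site list) per customer.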

How to close/unshare a file being used by another user?

My .NET app updates a certain .xls file located in a shared folder inside the domain. Using VB.NET, how can I check if that file is being used by another user and unshare/close it, so my code can update the file properly?
NOTE: the .exe file (my app) will be executed on the server where the shared folder is.
Windows and NTFS are designed to prevent this scenario. Unless you:
Get the Excel application which is holding the .xls file open to close it
Shut down the server's network connection assuming the Excel file is opened by client PCs - a highly undesirable step to take
Discover the PC on which the Excel.exe instance is running and use the Windows Taskkill command-line command to kill all Excel.exe instances (since you won't necessarily know which one has it open) on the machine - another undesirable step to take
Have a look at this answer, but it uses third-party utilities and doesn't explain how these utilities do what they do, though you may be able to Process.Start them.
you just can't get Windows/NTFS to release the file lock.
A possible alternative, which I'm not sure will work with an xls (as opposed to an xlsx), is to make the file shared, though this has some limitations.
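For the "check whether the file is in use" half of the question, a common approach is simply to try to open the file exclusively and treat a failure as "still locked". This is only a detection sketch - it cannot release anyone else's lock - and the path is a placeholder.

Imports System.IO

Module LockCheck
    ' Returns True when the file can be opened exclusively, i.e. no other user/process has it open.
    Function IsFileAvailable(path As String) As Boolean
        Try
            Using fs As New FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None)
                Return True
            End Using
        Catch ex As IOException
            ' Another process (e.g. Excel on a client PC) still has the file open or locked.
            Return False
        End Try
    End Function
End Module

Your update code could poll IsFileAvailable("\\server\share\report.xls") and only write once it returns True, but it still has to wait for whoever has the workbook open to close it.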

VB.Net 2005 Setup Project Application Data Folder no Content

Under VB.Net 2005, I created a Setup Project which produced installation files for a Windows application. I placed the mdb file in the user's Application Data folder and remapped the DataDirectory to Environment.SpecialFolder.ApplicationData at runtime. Unfortunately, sometimes the system just did not create the mdb file in the Application Data directory. For example, I installed the program on Windows 7 under a normal user account and chose in the installer that the program would be made available to everyone. After installation, the mdb file would be created automatically for the Administrator account but not for the normal user account. On XP, the whole thing worked fine.
I am tempted to write a routine to check for and create the mdb file if it is not present, but why does this happen and what other options do I have?
If the application is installed for all users instead of once for each user that logs onto the machine, then you will eventually run into a problem where the database isn't available for the signed on user.
In order to resolve this, you would probably be better off checking for the existence of the database on application startup and then either copying a standard version from the application directory to the current user's data directory or creating a new db from scratch in place. I would recommend the first approach.
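A minimal sketch of that first approach, run at application startup before the connection string is used (the folder and file names here are examples only):

' Sketch: make sure the per-user mdb exists, copying the template shipped next to the exe if not.
Imports System.IO

Module DbBootstrap
    Sub EnsureDatabase()
        ' Per-user location the app reads from (matches the remapped DataDirectory).
        Dim userDataDir As String = _
            Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "MyApp")
        Dim userDb As String = Path.Combine(userDataDir, "MyData.mdb")

        ' Pristine copy placed in the application folder by the setup project.
        Dim templateDb As String = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "MyData.mdb")

        If Not File.Exists(userDb) Then
            Directory.CreateDirectory(userDataDir)   ' harmless if the folder already exists
            File.Copy(templateDb, userDb)
        End If
    End Sub
End Module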