Lotus Notes agent writing output to external location - lotus-domino

I need a Lotus Notes scheduled agent that performs an export of data, not to the Notes server itself, but to a shared location on the network. Any ideas?

I used to do that in two different steps:
Create the file locally on the Domino server.
Use a scheduled task or cron job to copy the file to the required network destination.
The reason I wanted to split it this way is that I could create a dedicated user for the scheduled task only. That user has to have the rights to run the task on the Domino server as well as write access to the network share. Additionally, the agent won't fail if there is an issue with connectivity to the network share.
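For the first step, a minimal sketch of such an export agent, written as a Java agent (a LotusScript agent works just as well; the view name, item names, and output path below are made-up placeholders):

    import lotus.domino.*;
    import java.io.FileWriter;
    import java.io.PrintWriter;

    // Step 1: scheduled agent that writes the export to a path local to the
    // Domino server. The copy to the network share (step 2) is handled by a
    // separate scheduled task / cron job running under its own dedicated user.
    public class JavaAgent extends AgentBase {
        public void NotesMain() {
            try {
                Session session = getSession();
                Database db = session.getAgentContext().getCurrentDatabase();
                View view = db.getView("ExportView");                    // placeholder view name

                PrintWriter out = new PrintWriter(new FileWriter("D:\\exports\\export.csv"));
                Document doc = view.getFirstDocument();
                while (doc != null) {
                    out.println(doc.getItemValueString("CustomerName") + ";" +
                                doc.getItemValueString("OrderTotal"));   // placeholder item names
                    Document next = view.getNextDocument(doc);
                    doc.recycle();                                       // release the Notes handle
                    doc = next;
                }
                out.close();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }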

You could map the shared location as a drive (on the server side) and write the file to it.
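Assuming the mapping is done for the account the Domino server runs under (a drive mapped in an interactive user session is not visible to the service), the agent can then write to the drive letter directly. A tiny illustrative fragment with made-up paths:

    import java.io.FileWriter;
    import java.io.PrintWriter;

    // Assumes the share was mapped on the server beforehand, e.g. with
    //   net use Z: \\fileserver\exports /persistent:yes
    // run under the account the Domino service uses.
    public class MappedDriveExport {
        public static void main(String[] args) throws Exception {
            PrintWriter out = new PrintWriter(new FileWriter("Z:\\notes\\export.csv"));
            out.println("field1;field2");   // whatever the export produces
            out.close();
        }
    }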

Related

Transfer CSV files from an SFTP server automatically to SQL Server as separate tables

My job is to automate a workflow that transfers all the CSV files stored in an SFTP server folder and loads each one directly into a SQL Server database as a separate table. Is there any way I can complete this task? I have been searching on the internet but found nothing specific. I was also considering SSIS packages as an option, but they do not have an SFTP task.
I would use an SSIS package to handle this, even though the SFTP component is the tricky part.
There are a few ways to manage this. You can import an external tool such as a Cozyroc SFTP extension, which enables you to set up an SFTP connector. Some of their components are chargeable, so you would have to check whether it is free for your usage. There may be other similar external tools; I have only used Cozyroc in the past.
You can also write a simple command-line batch script using a free Windows utility such as WinSCP. You can then call a Script Task within SSIS, which in turn calls the batch script.
Once you have executed the SFTP step, you can loop through the downloaded files and upload them to your database server.
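Outside of SSIS, the same download-then-load loop can be sketched in plain code. A hedged Java example using the JSch library for the SFTP part (host, credentials, folder names, and the table-naming rule are all made up; the per-file database load is left as a placeholder):

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;
    import java.util.Vector;

    // Download every CSV from an SFTP folder, then load each file into its own
    // SQL Server table (one table per file, named after the file).
    public class SftpCsvToSql {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            Session session = jsch.getSession("etl_user", "sftp.example.com", 22);
            session.setPassword("secret");
            session.setConfig("StrictHostKeyChecking", "no");  // acceptable in a sketch only
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();

            Vector<?> entries = sftp.ls("/outgoing/*.csv");
            for (Object o : entries) {
                ChannelSftp.LsEntry entry = (ChannelSftp.LsEntry) o;
                String name = entry.getFilename();
                String local = "C:\\staging\\" + name;
                sftp.get("/outgoing/" + name, local);          // download the file

                String table = name.replaceAll("\\.csv$", ""); // e.g. orders.csv -> table "orders"
                System.out.println("would load " + local + " into table " + table);
                // The actual load could be a JDBC batch insert or a BULK INSERT
                // statement run against SQL Server; omitted here.
            }

            sftp.disconnect();
            session.disconnect();
        }
    }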

How do backups work in DirectAdmin?

I'm sure there are a good number of developers here who use DirectAdmin, and I have a quick question.
I've always used cPanel, and I'm now on a server that is using DirectAdmin instead. Where in DirectAdmin can you generate a full backup of the account at the user level?
Also, do DirectAdmin backups include everything related to the account like cPanel backups do? For example, not only the files and databases but also the cron jobs, DNS zones, email accounts, etc.?
And where are the backups stored by default? Is there an option to send the backups to a remote server via FTP like you can with cPanel?
There are two different backup systems built into DA:
Admin Tools | System Backup. This tool lets you back up configuration data and arbitrary directories, locally or using FTP or SCP.
Admin Tools | Admin Backup/Transfer. This tool is oriented toward backing up data account by account, in one archive per account, in a format that you can use to restore from (in the same tool) on the original or another DA server (e.g. if you want to transfer to a new server). You can back up locally and/or via FTP.
Both options can also be scheduled via cron.
Depending on your level of access, only one of these might be available to you. This page has further info for non-administrators: http://www.site-helper.com/backup.html.
You can improve your DirectAdmin backups with an incremental backup plugin that supports local and remote backup locations; please check the setup guide here.

Data copying in SQL Server database in Cloud

My question is specific to a feature in our application. Our environment is currently dedicated hosting, and there is an initiative to move everything to the cloud.
There is a data replication scheduler job (the job triggers copy scripts saved in .bat files) configured on the SQL Server. This job runs every night and copies data in a few tables from one database instance (live) to another (duplicate) on the same server. The idea behind copying the data to a duplicate DB is to run complex, application-specific reporting queries against an additional DB without interrupting the main (live) DB.
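In essence, each nightly run boils down to refreshing a few reporting tables in the duplicate database from the live one. A rough sketch of what the scripts do (server, database, and table names are made up; the real logic lives in the .bat files):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    // Nightly refresh: empty each reporting table in the duplicate database and
    // re-copy it from the live database on the same instance.
    public class NightlyCopy {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:sqlserver://dbserver;databaseName=DuplicateDB;integratedSecurity=true";
            try (Connection con = DriverManager.getConnection(url);
                 Statement st = con.createStatement()) {
                for (String table : new String[] {"Orders", "OrderLines", "Customers"}) {
                    st.executeUpdate("TRUNCATE TABLE DuplicateDB.dbo." + table);
                    st.executeUpdate("INSERT INTO DuplicateDB.dbo." + table +
                                     " SELECT * FROM LiveDB.dbo." + table);
                }
            }
        }
    }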
Since we are moving to the cloud, we won't have access to the Windows servers on which SQL Server is installed, and we cannot configure Windows Scheduler jobs that trigger the .bat scripts to do the DB copy.
Do you have any suggestion to handle this?
Appreciate your help!

How to create a document on a server in another Lotus Notes network?

A public Domino server has a publicly available Lotus Notes database. That database has a form that an unauthenticated user can fill out and submit using his/her browser.
This publicly available form is only used for the POST request, and the data must not be stored on that publicly available server. Instead, I need to connect to a database on an internal server and create the document there.
The obvious solution is a LotusScript agent, but from when I worked on Notes I remember that non-user agents were prevented from opening databases on another server for security reasons. I certainly cannot introduce a secure server setup; I need to find a way to do this that fits the current setup. The servers are in two different Notes networks, but mail is routed between them, so if I don't find a better solution I will probably mail the document.
Any ideas? I have not worked with the latest Notes servers. Is there anything in 8.5 that can help here?
In the server document, on the Security tab, there is an option called "Trusted servers". If you can put the external server into that field, then the agent will be allowed to write directly into databases on the internal server.
If you are not able/allowed to do this, then you have to write to a "local" database (on the external server) and replicate that database to the internal server, either by using a console command (NotesSession.SendConsoleCommand), with the Replicate method of the NotesDatabase class (not sure if this will work, due to the same security restrictions), or via scheduled replication.
If the database itself cannot be replicated to the external server, then you should use a container database and let an agent on the internal server copy the data to the internal database.
And the last possibility you already mentioned: compose the document and send it via mail. Make the target database a mail-in database and simply send your data there with NotesDocument.Send...
One of these options should solve your problem.
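To make the first and the last option concrete, here is a hedged sketch written as a Java agent on the external server (the answer names the LotusScript classes; the Java back-end classes expose the same calls, and the server, file, form, and mail-in names below are placeholders):

    import lotus.domino.*;

    // Triggered by the public form submission (e.g. as a WebQuerySave agent).
    public class JavaAgent extends AgentBase {
        public void NotesMain() {
            try {
                Session session = getSession();
                Document src = session.getAgentContext().getDocumentContext(); // the submitted document

                // Option 1: direct write. Only works if this external server is
                // listed in the "Trusted servers" field as described above.
                Database internal = session.getDatabase("Internal/Acme", "apps/requests.nsf");
                if (internal != null && internal.isOpen()) {
                    Document copy = internal.createDocument();
                    copy.replaceItemValue("Form", "Request");
                    copy.replaceItemValue("Subject", src.getItemValueString("Subject"));
                    copy.save(true, false);
                    return;
                }

                // Last option: mail the data to a mail-in database on the internal server.
                Document mail = session.getAgentContext().getCurrentDatabase().createDocument();
                mail.replaceItemValue("Form", "Request");
                mail.replaceItemValue("Subject", src.getItemValueString("Subject"));
                mail.replaceItemValue("SendTo", "Requests Inbox");  // the mail-in database's mail-in name
                mail.send();
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }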

Processing cubes automatically and daily on Microsoft Analysis Services

I've followed the steps at the site http://www.dotnetspider.com/resources/24960-How-Process-SSAS-Cubes-Automatically.aspx
It works in the development phase, but I need to change the target of the cube in the deployment environment.
I opened the package file and edited it manually, but it doesn't work...
I don't know if it is an authentication problem. But my question is: how do I parameterize the target of the cube that I want to process?
Thanks.
Note: I'm not an expert in Analysis Services, but I need to execute this job.
The best way is, in SSIS, to base your Analysis Services connection on an expression:
Create a variable @[User::Server] to hold the name of your Analysis Services server.
Add an expression to your Analysis Services connection, pointing the ServerName property to that variable.
Add a Package Configuration to your package, so you can have different configurations depending on where you want to deploy the package.
I did this:
Make a DOMAIN\USER an administrator of SQL Server Analysis Services.
Give the same DOMAIN\USER the fixed server role "sysadmin" on the SQL Server.
Create a new credential in SQL Server with the login data of this DOMAIN\USER.
Create a proxy on SQL Server with the newly created credential, and allow it to run Analysis Services Command and Analysis Services Query steps.
Create your SQL Server Agent job with a step that executes a query to process the cube, and select the created proxy to run it.