Pentaho repository import missing database connections

I am trying to migrate my Kettle repositories from one server (A) to another (B). I exported a repository from server A as ServerA.XML and imported ServerA.XML into server B.
The import finished and all the KTR and KJB files are now available in server B. My problem: in server A the database connections are declared, and you can see them under
Tools->Repository->Explore->Connections. I could see them in server A, but after importing ServerA.XML into server B I am not able to see the database connection details in Tools->Repository->Explore->Connections.
What am I missing here?
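Before digging further, it may be worth checking whether the connection definitions actually made it into the export file at all. A minimal check on server A, assuming the repository export stores connections as <connection> elements the way .ktr files do (an assumption about the export format of your PDI version):
> grep -c "<connection>" ServerA.XML
If the count is zero, the connections were never part of the export and the problem is on the export side; if it is non-zero, the problem is on the import side.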

Related

How to set up a local database from a remote database in a Visual Studio ASP.NET project?

I am working on an ASP.NET project which accesses a remote SQL Server database. My objective is to set up this remote database on my local system.
I have tried 2 ways, but I'm facing problems with both:
(1) Using the SQL Server Import and Export Wizard.
I get an error saying that the connection state is closed, even though I am able to fetch all the tables of the remote database.
(2) Schema and data comparison from the SQL Server Object Explorer inside Visual Studio.
Here I get a database-user-related error: user or group not found.
I have tried other things over the last 4 days as well but have not found any clue to solve it. I am new to ASP.NET and SQL. Please help; any helpful answer will always be appreciated. Thanks.
I have tried Dennis1679's answer, but the following issue comes up:
One or more unsupported elements were found in the schema used as part of a data package. Error SQL71564: Error validating element [username]: The element [username] has been orphaned from its login and cannot be deployed.
Install SQL Server Management Studio, connect to the remote database, right-click on the database, and choose Export Data-tier Application. Then connect to your local instance and import the data-tier application:
Connect to the remote database.
Right-click on the database and choose Export Data-tier Application.
Follow the wizard and save the .bacpac somewhere.
Connect to your local SQL Server, (localdb)\MSSQLLocalDB.
Choose Import Data-tier Application.
Select the previously saved .bacpac and import it. Now your local instance should have the same database.
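If you'd rather script those steps than click through the wizard, SqlPackage.exe (installed with SSMS/SSDT) can export and import a .bacpac from the command line. A rough sketch; the server names, database name, credentials, and file path are placeholders to replace with your own:
> SqlPackage.exe /Action:Export /SourceServerName:remote.example.com /SourceDatabaseName:MyDb /SourceUser:myuser /SourcePassword:mypassword /TargetFile:C:\temp\MyDb.bacpac
> SqlPackage.exe /Action:Import /SourceFile:C:\temp\MyDb.bacpac /TargetServerName:(localdb)\MSSQLLocalDB /TargetDatabaseName:MyDb
The first command exports the remote database to a .bacpac; the second imports it into the local (localdb)\MSSQLLocalDB instance.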

Kafka data copy from one server to another

My Kafka version is 0.8.2.
I want to copy data from server A to server B, and then have Kafka on server B work the same as it did on server A.
I tried copying all the data in $log.dirs, which includes the kafka-logs and the ZooKeeper data,
but it doesn't work.
Is it possible to just copy some data files and then have it work on server B the same as on server A?
Much like copying a SQL dump file from production and loading it into my local MySQL server, after which it works the same as the production server.
Try Kafka's kafka-mirror-maker, like this:
> bin/kafka-mirror-maker.sh --consumer.config consumer.properties --producer.config producer.properties --whitelist my-topic
https://kafka.apache.org/documentation.html#basic_ops_mirror_maker
consumer.properties is server A's config; you can set zookeeper.connect to server A's ZooKeeper.
producer.properties is server B's config; you can set bootstrap.servers to server B's brokers.
--whitelist is the topic you want to copy.
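As a rough sketch (host names, ports, and the group id are placeholders), the two property files could look something like this:
consumer.properties (points the mirror maker's consumer at server A's ZooKeeper):
zookeeper.connect=serverA:2181
group.id=mirror-maker-group
auto.offset.reset=smallest
producer.properties (points the mirror maker's producer at server B's broker):
bootstrap.servers=serverB:9092
With auto.offset.reset=smallest the consumer starts from the earliest available offsets the first time it runs, so existing messages in the whitelisted topic are copied as well.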

SQL Azure does not support SSIS packages, so what are the options for moving an on-premises solution to SQL Azure?

We have an on-premises DB solution where an SSIS package scrubs the data from the main DB and populates a ReportDB. The scrubbing and migration of the data is done by the SSIS package. Now we are planning to move this to SQL Azure, and we know that SSIS is not supported there. I am looking for options we can use.
Option A
Change your connection strings from using an OLE DB provider to ADO.NET, and go from (potentially) using Windows authentication to definitely using SQL Server authentication. Verify everything continues to work and/or mitigate the issues, e.g. Lookup components only support OLE DB providers or the Cache Connection Manager, so you'd need to preload the CCM; change the connection type selector in an Execute SQL Task from OLE DB to ADO; etc.
Everything looks good in a local environment? Deploy the database objects to the SQL Azure DB, change the connection strings in SSIS, and run the packages (locally) as normal. All should be well and good.
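For illustration only (the server, database, and credentials below are placeholders, not real values), the switch described in Option A roughly amounts to replacing an OLE DB connection string like the first line with an ADO.NET one like the second:
Provider=SQLNCLI11;Data Source=OnPremServer;Initial Catalog=ReportDB;Integrated Security=SSPI;
Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=ReportDB;User ID=reportuser;Password=...;Encrypt=True;
Note the move from Integrated Security (Windows authentication) to an explicit SQL Server login, which is what SQL Azure expects.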
Option B
You'll still need to make all the changes from Option A, but this will allow you to move your on-premises SSIS processing to "the cloud."
Spin up an Azure VM and install a version of SQL Server that matches your SSIS packages. Deploy the packages to that server and schedule their execution. Ensure there are firewall exceptions allowing the VM instance to reach the Azure SQL DB.
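For the "schedule execution" part, one common route is to run the deployed package with dtexec from a SQL Server Agent job or a Windows scheduled task on that VM; a minimal sketch with a hypothetical package path:
> dtexec /File "C:\SSIS\ScrubToReportDB.dtsx"
The path and package name are assumptions; packages deployed to the file system (package deployment model) can be run this way.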
Option C
Rewrite everything in something else and deploy it as an Azure worker. Or co-mingle the main DB tables with the report DB and rewrite it in T-SQL.
Are you able to use SQL Server 2016 Data Masking as an alternative to your current scrubbing?

How to Connect my SQL Server 2012 to store and get its database from a storage server?

I have SQL Server 2012 installed in our office,
and its storage is going to be full.
What I want to know is how I can connect my SQL Server to another server to store its databases and files
there.
That is, I want a connection to another server just for storing my database files.
If I create a new database, it should also be created on the new server.
That server will not be running any databases; it will just be a storage location.
How can I create a connection from my SQL Server to a storage server?
What I mean is that the second server will act purely as secondary storage for my SQL Server:
if my current server gets full, it should automatically store its data on the other server.
Please provide a detailed answer, because this is the first time I have faced this issue.

Internal error importing a BACPAC file to Windows Azure SQL Database

I am using MSSQL 2014, version CTP2.
I am running my databases through Windows Azure SQL Database management.
I've created a BACPAC file of my local database via MSSQL and want to import it into Windows Azure.
I create the BACPAC file and upload it to my BLOBSTORAGE container. Then:
In Windows Azure, click SQL DATABASES.
At the bottom left, click New.
Click IMPORT.
Choose my BACPAC URL, which is the file contained in my BLOBSTORAGE.
Name the DB and choose the server.
Click the tick.
I then get this error:
Error encountered during the service operation. Could not import package. Internal Error. The database platform service with type Microsoft.Data.Tools.Schema.Sql.Sql120DatabaseSchemaProvider is not valid. You must make sure the service is loaded, or you must provide the full type name of a valid database platform service. Internal Error. The database platform service with type Microsoft.Data.Tools.Schema.Sql.Sql120DatabaseSchemaProvider is not valid. You must make sure the service is loaded, or you must provide the full type name of a valid database platform service.
I can create a new database via Windows Azure, which is fine, but I'm trying to get my own DB up because it has all my data in it.
What am I doing wrong?
I believe the issue was that I needed to assign the same DB owner to my DB on my local machine as the one that was set up in my Azure setup.
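If the portal import still fails after fixing the owner, importing the .bacpac straight into the Azure server with SqlPackage.exe is another route worth trying; a rough sketch, where the server name, database name, credentials, and file path are placeholders:
> SqlPackage.exe /Action:Import /SourceFile:C:\temp\MyDb.bacpac /TargetServerName:yourserver.database.windows.net /TargetDatabaseName:MyDb /TargetUser:azureadmin /TargetPassword:mypassword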