In SharePoint 2010 BCS, how can I use an external data column based on the external content type DEPARTMENT in an external list created from the external content type EMPLOYEE? In the database scenario, the DEPARTMENT id is a foreign key in the Employee table.
Thanks in advance!!
Related
I have an Access database containing the names of employees.
I want to create a folder for each employee and name it from the name field in the Access database.
I want the result below:
every name in the database gets a folder named after it, using VB.NET.
I only know how to create the folders in VB.NET.
I have some data that is returned from a stored procedure or table in Azure SQL Server.
The client has a formatted Excel file, and I need to export the data from the stored procedure into that formatted Excel file.
The Excel file has predefined columns whose names differ from the DB column names.
Is it possible to use an Azure resource such as Azure Data Factory or Logic Apps to export the data from SQL Server and append it to the Excel file?
You can achieve this with the Azure Data Factory Data Flow activity.
The source is the SQL Database, where the column names come from the table.
Connect the sink (target) and, in the Mapping tab, uncheck Auto Mapping and add the output column names required at the target.
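If you prefer to handle the renaming before the sink rather than in the Mapping tab, an alternative is to alias the columns in the source query so they already match the Excel headers. A minimal sketch, assuming hypothetical table, column, and header names (dbo.Employee, EmpName, DeptId, "Employee Name", "Department") that are not from the question:

```sql
-- Sketch only: rename DB columns in the source query to match the Excel
-- sheet's predefined headers. All names below are placeholder assumptions.
SELECT
    EmpName AS [Employee Name],
    DeptId  AS [Department]
FROM dbo.Employee;
```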
Is there a way to import a lookup table of account numbers from Excel and only pull results from the database that match the account numbers listed in the Excel lookup table? My lookup table contains thousands of account numbers, so I can't manually type in the values I want to filter for. I am using SQL Server Management Studio to pull data from a SQL Server database.
Yes. In SSMS, in Object Explorer, right-click the database and choose Tasks --> Import Data. From there you can use the wizard to import the spreadsheet as a table in your database.
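Once the spreadsheet has been imported as a table, you can filter by joining to it. A minimal sketch, assuming hypothetical names for the imported lookup table (dbo.AccountLookup), the main table (dbo.Transactions), and the shared column (AccountNumber):

```sql
-- Sketch only: return rows from the main table whose account numbers
-- appear in the imported Excel lookup table. Names are assumptions.
SELECT t.*
FROM dbo.Transactions AS t
INNER JOIN dbo.AccountLookup AS l
    ON l.AccountNumber = t.AccountNumber;
```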
I have data in an external table. Now I am copying the data from the external table to a newly created table in a database. What kind of table will the table in the database be: a managed table or an external table? I need your help to understand the concept behind this question.
Thanks,
Madan Mohan S
Hive tables get their type, "managed" or "external", at the time of their creation, not when data is inserted.
So the table employees is external because it was created with CREATE EXTERNAL TABLE in the DDL and a LOCATION for the data file was provided.
The table emp is a managed table because EXTERNAL was not used in the DDL and no location for the data was needed.
The difference is that if the employees table is dropped, the data it was reading from the LOCATION path is not deleted. An external table is therefore useful when the data is read by multiple tools, e.g. Pig: if a Pig script is reading the same location, it will keep working even after the employees table is dropped.
But emp is managed (in other words, both the metadata and the data are managed by Hive), so when emp is dropped the data is also deleted. After dropping it, if you check the Hive warehouse directory you will no longer find an "emp" HDFS directory.
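A minimal HiveQL sketch of the two kinds of DDL, assuming placeholder columns and an HDFS path that are not taken from the question:

```sql
-- External table: Hive tracks only the metadata; the files under LOCATION
-- survive a DROP TABLE. Schema and path below are assumptions.
CREATE EXTERNAL TABLE employees (
    id   INT,
    name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/employees';

-- Managed table: no EXTERNAL keyword and no LOCATION; Hive stores the data
-- in its warehouse directory and deletes it when the table is dropped.
CREATE TABLE emp (
    id   INT,
    name STRING
);

-- Copying rows from the external table; emp remains a managed table
-- regardless of where the data originally came from.
INSERT INTO TABLE emp
SELECT id, name FROM employees;
```

This also answers the original question: the new table's type is decided by its own CREATE statement, so emp is managed even though its rows came from an external table.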
I have created an SSIS package (see below) to import data from an external SQL query into a SharePoint 2007 list. The data imports fine, but when the package is run again to update the data it duplicates the records. I'm guessing that, as there is no link between the SharePoint ID of the imported records and the data from my SQL query, the routine has no idea what to update and just creates a new record. How do I prevent this and allow my data to be updated in the SharePoint list?
If you set the key ID field in your SharePoint list destination, it will perform an update; otherwise the default is an insert. It sounds like you have not mapped the ID.
You can either:
Set (map) the ID column, which forces the SharePoint destination component to perform an update. Have a look at this example by Chris Kent.
Or limit your source SELECT statement based on the last record inserted into the SharePoint list. Prior to the data flow task, you would need to select the max (date or key?) from SharePoint and set an expression on your data source so that this value is included in the WHERE clause, selecting only new records; see the sketch after these options. This has the added benefit of limiting the amount of data traveling across the network, and your existing insert setup would still work.
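A minimal sketch of the source query for the second option, assuming a hypothetical source table and a parameter holding the most recent value found in the SharePoint list; none of these names come from the question:

```sql
-- Sketch only: select just the rows newer than the latest record already in
-- the SharePoint list. dbo.SourceRecords, ModifiedDate, and
-- @LastSharePointDate are placeholder assumptions; in SSIS the parameter
-- value would be supplied via an expression on the data source from a
-- package variable populated before the data flow task runs.
SELECT *
FROM dbo.SourceRecords
WHERE ModifiedDate > @LastSharePointDate;
```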