Converting HANA tables to hdb tables - hana

Can anyone help on how to convert HANA SQL tables to .hdb tables and use them? To start, I exported the table in .csv format, but after that I am not sure how to turn it into an .hdb table. Can someone describe the process?

I'm not really sure what you're going for, but using hdb tables is as easy as creating table_name.hdbtable with exactly the same definition (i.e. COLUMN TABLE ...) as the table was created with in the "classic" schema. See the SAP help on hdbtables.
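For instance, a design-time file might look like the following (file and table names here are hypothetical; this is the HDI-style syntax, which is the runtime DDL minus the CREATE keyword):

```sql
-- Hypothetical design-time artifact: src/MyTable.hdbtable
-- Same definition as the runtime COLUMN TABLE in the classic schema
COLUMN TABLE MY_TABLE (
  ID    INTEGER       NOT NULL,
  NAME  NVARCHAR(100),
  PRIMARY KEY (ID)
)
```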

You can use the SAP HANA developer CLI's massConvert functionality to convert one or more tables to hdbtable.
Note that this will only take care of the table structure. If you have data that you want to keep, you will have to copy it manually, for example via a CSV export/import.
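For the data copy, HANA's EXPORT/IMPORT statements are one option. A sketch, assuming a server-side path and table names that are purely illustrative (syntax details can vary by HANA version):

```sql
-- Export the existing table's data to CSV on the server...
EXPORT "MYSCHEMA"."MY_TABLE" AS CSV INTO '/tmp/my_table_export' WITH REPLACE;

-- ...then, once the .hdbtable artifact has been deployed, load the data back
IMPORT "MYSCHEMA"."MY_TABLE" AS CSV FROM '/tmp/my_table_export' WITH REPLACE;
```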

Related

Auto-match fields to columns in SQL*Loader

I'm trying to load some data from CSV into Oracle 11g database tables through sqlldr.
So I was wondering if there's a way to load that data by matching the columns described in the ctl file to the table columns by name - a sort of auto-match, with no sequential order or FILLER command.
Does anyone know anything about that? I've been searching the documentation and forums but haven't found a thing.
Thank you, guys
Alas, you're on 11g. What you're looking for is a new feature in 12c: SQL*Loader Express Mode. It allows us to load a comma-delimited file into a table without defining a Loader control file; instead, Oracle uses the data dictionary (ALL_TAB_COLUMNS) to figure out the mapping.
Obviously there are certain limitations. Perhaps the biggest one is that external tables are the underlying mechanism, so Express Mode requires the same privileges, including privileges on directory objects. I think this reduces the helpfulness of the feature, because many people need to use SQL*Loader precisely because their DBAs or sysadmins won't grant them the privileges necessary for external tables.
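For reference, the simplest Express Mode run is along the lines of `sqlldr scott TABLE=emp` (loading emp.dat), and because it builds an external table under the hood, the session needs directory-object privileges roughly like these (directory name and user are hypothetical):

```sql
-- Privileges Express Mode / external tables rely on
CREATE DIRECTORY load_dir AS '/u01/app/oracle/load';
GRANT READ, WRITE ON DIRECTORY load_dir TO scott;
```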

What is the easiest way to query a CSV file in Oracle SQL Developer?

I have a fairly simple CSV file that I would like to use within a SQL query. I'm using Oracle SQL Developer, but none of the solutions I have found on the web so far seems to have worked. I don't need to store the data (unless I can use temp tables?), just to query it and show the results.
Thank You!
You need to create an EXTERNAL TABLE. This essentially maps a CSV (or indeed any flat file) to a table. You can then use that table in queries. You will not be able to perform DML on the external table.
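A minimal sketch, assuming a two-column CSV with a header row exposed through a directory object (all names here are hypothetical):

```sql
-- Directory object pointing at the folder that holds the CSV
CREATE DIRECTORY csv_dir AS '/data/csv';
GRANT READ ON DIRECTORY csv_dir TO app_user;

-- External table mapping emp.csv to queryable columns
CREATE TABLE emp_ext (
  emp_id   NUMBER,
  emp_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1                                -- skip the header row
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('emp.csv')
)
REJECT LIMIT UNLIMITED;

-- Query it like any other table (read-only; no DML allowed)
SELECT emp_id, emp_name FROM emp_ext;
```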

Hive schema format?

Does anyone know where to look for an example of a hive schema?
Say you want to define a table with 500 columns and you've put those 500 columns in an Excel spreadsheet - how can you get Hive to read this schema from the spreadsheet and create the table we want?
We don't necessarily need to tie ourselves to a spreadsheet - I am just using that as an example.
thanks.
Well, it's really a nice requirement; I have faced a couple of issues like this myself. But as far as I know there is no direct way to do it. You need to write some code (e.g. in Java) that reads from the source (in your case an Excel spreadsheet), generates the CREATE TABLE statement, and executes it in Hive.
You might check GitHub for some open-source projects that have this facility, but Hive itself does not do it.
Hope it helps...!!!
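Whatever generates it, the end product is just a HiveQL CREATE TABLE statement - one line per spreadsheet row. A hypothetical generated result (column names and types are made up for illustration):

```sql
-- Hypothetical output of such a generator
CREATE TABLE wide_table (
  customer_id   BIGINT,
  customer_name STRING,
  signup_date   STRING
  -- ...the remaining ~497 columns generated the same way
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
```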

SSIS - Exporting multiple files from one SQL table

I have about 30 files that will need to be exported from one SQL table. What I have now creates a SQL table for each file, then exports the contents of that individual table. This works fine, but I really didn't want to have 30 tables on the server. Is there a way to export from one table using 30 different SQL queries?
Thanks in advance
This really depends on your data needs and how complicated the export is, but generally people do not create an export table per export. A Data Flow Task's source can be a table (which you are currently using), a stored procedure, or a view.
I would need to know more about your structure to advise more accurately, but one table per export is definitely not the best solution.
I would use a parameterized stored procedure if possible.
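A sketch of that approach, assuming the 30 exports differ only by a filter value (table, column, and procedure names here are hypothetical):

```sql
-- One stored procedure serves all 30 exports; each SSIS data flow
-- (or each iteration of a Foreach Loop) passes a different @Region value
CREATE PROCEDURE dbo.GetExportSlice
  @Region VARCHAR(10)
AS
BEGIN
  SET NOCOUNT ON;
  SELECT OrderId, Region, Amount
  FROM dbo.Orders
  WHERE Region = @Region;
END;
```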

SQL Server Schema to Schema Migration

I would like to know the best approach for migrating existing DB data to a new DB with an entirely different structure. I want to copy the data from my old DB and insert it into the new DB, where the table names and column names are entirely different. I am using SQL Server 2008.
You should treat this as an ETL problem, not a migration, as the two schemas are entirely different. The proper tool for this is SSIS. SSIS allows you to create data flows that map columns from one table to another, add derived columns, perform splits, merges, etc. If possible, you should create source queries that return results close to the schema of the target database, so you need fewer transformations.
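For example, a source query that renames and reshapes old-schema columns so the SSIS data flow needs little further transformation (all table and column names are hypothetical):

```sql
-- Old-schema data shaped to match the new schema's column names
SELECT
  c.cust_no               AS CustomerId,
  c.fname + ' ' + c.lname AS FullName,
  c.created_dt            AS SignupDate
FROM OldDb.dbo.customers AS c;
```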
With this you will have to migrate most of it manually by running scripts; AFAIK it will not synchronize automatically. But using SSMS you can map tables between the two different DBs. Hope that will help.