I'm trying to load some data from CSV files into Oracle 11g database tables through sqlldr.
So I was thinking: is there a way to load that data by matching the columns described in the .ctl file to the table columns by name? Just like an auto-match, with no positional order or FILLER clauses.
Does anyone know anything about that? I've been searching the documentation and forums but haven't found anything.
Thank you, guys
Alas, you're on 11g. What you're looking for is a new feature in 12c: SQL*Loader Express Mode. This allows us to load a comma-delimited file into a table without defining a control file; instead Oracle uses the data dictionary (ALL_TAB_COLUMNS) to figure out the mapping.
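For reference, an Express Mode run can be as simple as naming the target table. The credentials and table name below are placeholders, and by default the data is expected in a file called <table>.dat in the current directory:

sqlldr scott/tiger TABLE=emp

SQL*Loader then derives the field list from the table's columns and generates the load for you (by default using an external table under the covers).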
Obviously there are certain limitations. Perhaps the biggest one is that external tables are the underlying mechanism, so it requires the same privileges, including privileges on DIRECTORY objects. I think this reduces the usefulness of the feature, because many people need to use SQL*Loader precisely because their DBAs or sysadmins won't grant them the privileges necessary for external tables.
Is there a way to modify the SQL server to give the virtual information_schema database a different name by default?
Or is information_schema a standard so that software knows where to look and query for information?
I'm using 10.5.15-MariaDB - MariaDB Server
I'd like to rename it to .information_schema so that the database doesn't show up in the middle of the databases list on my CMS.
I don't have control over hiding/displaying databases by name.
information_schema cannot be modified or deleted.
If I were you, I would avoid any attempt to modify the default databases.
Here is what I found searching similar questions:
INFORMATION_SCHEMA is a database within each MySQL instance, the place that stores information about all the other databases that the MySQL server maintains. The INFORMATION_SCHEMA database contains several read-only tables. They are actually views, not base tables, so there are no files associated with them, and you cannot set triggers on them. Also, there is no database directory with that name. (source)
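As for the second part of the question: yes, INFORMATION_SCHEMA is defined by the SQL standard, which is exactly why software knows to query it for metadata. A typical lookup (the database name is a placeholder) looks like this:

SELECT table_name, table_type
FROM information_schema.tables
WHERE table_schema = 'your_database';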
I need to get the DDL of a particular database object (a schema, table, column, etc.). Is there a way to extract it from the system catalog tables using SQL?
I tried to find a table in information_schema or pg_catalog with the required information, but I didn't find one.
There is a way to extract it from the system catalogs, but the method depends on what type of object it is, and is not easy.
The "pg_dump" knows how to do it. I would just use that, rather than reinventing things. You can get just the DDL (exclude the data itself) using "-s" option. Then you can fish out the DDL for your specific desired object using your favorite text editor. If the object is a table, you can tell pg_dump to dump just that table, but for other objects you can't.
I have a fairly simple CSV file that I would like to use within a SQL query. I'm using Oracle SQL Developer, but none of the solutions I have found on the web so far seem to work. I don't need to store the data (unless I can use temp tables?), just to query it and show the results.
Thank You!
You need to create an EXTERNAL TABLE. This essentially maps a CSV (or indeed any flat file) to a table. You can then use that table in queries. You will not be able to perform DML on the external table.
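A minimal sketch of such a table, assuming a CSV of sales data. The DIRECTORY object (data_dir), file name, and columns are all made up, and you need at least READ privilege on that directory (and usually WRITE, for the log and bad files); SKIP 1 skips a header row:

CREATE TABLE sales_ext (
  sale_id NUMBER,
  product VARCHAR2(100),
  amount  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    SKIP 1
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('sales.csv')
)
REJECT LIMIT UNLIMITED;

-- then query it like any other table
SELECT product, SUM(amount) FROM sales_ext GROUP BY product;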
How do I copy data from multiple tables within one database to another database residing on a different server?
Is this possible through a BTEQ Script in Teradata?
If so, provide a sample.
If not, are there other options for doing this besides using a flat file?
This is not possible with BTEQ alone, since you mentioned the two databases reside on different servers.
There are two solutions for this.
Arcmain - You run an ARCMAIN backup first, which creates files containing the data from your tables. Then you run an ARCMAIN restore, which restores the data from those files.
TPT - Teradata Parallel Transporter. This is a more advanced tool. Unlike ARCMAIN it does not create intermediate files; it moves the data directly between the two Teradata servers. (Wikipedia)
If I am understanding your question, you want to move a set of tables from one DB to another.
You can use the following syntax in a BTEQ Script to copy the tables and data:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH DATA AND STATS;
Or just the table structures:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH NO DATA AND NO STATS;
If you get really savvy, you can write a BTEQ script that dynamically builds the statement above in a SELECT, exports the results to a file, and then runs that newly exported file, all within a single BTEQ script.
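A rough sketch of that dynamic-build idea, assuming the tables live in OldDB and the copies should go to NewDB (database names, file name, and width are illustrative only):

.SET WIDTH 500
.EXPORT REPORT FILE = copy_tables.btq
SELECT 'CREATE TABLE NewDB.' || TRIM(TableName) ||
       ' AS OldDB.' || TRIM(TableName) ||
       ' WITH DATA AND STATS;' (TITLE '')
FROM dbc.TablesV
WHERE DataBaseName = 'OldDB'
AND TableKind = 'T';
.EXPORT RESET
.RUN FILE = copy_tables.btq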
There are a bunch of other options you can use with CREATE TABLE <...> AS <...>;. You would be best served by reviewing the Teradata manuals for more details.
There are a few more options which will allow you to copy from one table to another.
Possibly the simplest way would be to write a smallish program that uses one of the communication layers (ODBC, the .NET Data Provider, JDBC, CLI, etc.) to run a SELECT against the source and an INSERT against the target. This would require some work, but it has less overhead than learning to write TPT scripts, and you would not need any DBA permissions to write your own.
Teradata also sells other applications that hide the complexity of some of these tools. Teradata Data Mover provides an abstraction layer over tools like ARCMAIN and TPT. Access to this tool is most likely restricted to DBA types.
If you want to move data from one server to another, you can do it with a flat file.
First, export the data from the source table to a flat file using a utility such as BTEQ or FastExport.
Then load that file into the target table using MultiLoad, FastLoad, or a BTEQ script.
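A rough BTEQ-only sketch of that flat-file round trip. The logons, table, and columns are made up, and for large volumes FastExport/FastLoad or MultiLoad will be much faster:

.LOGON source_tdpid/source_user,source_password
.EXPORT DATA FILE = customer.dat
SELECT cust_id, cust_name FROM OldDB.Customer;
.EXPORT RESET
.LOGOFF

.LOGON target_tdpid/target_user,target_password
.IMPORT DATA FILE = customer.dat
.REPEAT *
USING (cust_id INTEGER, cust_name VARCHAR(100))
INSERT INTO NewDB.Customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF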
I've been writing a library management Java app lately. Up until now, the main library database has been stored in a .txt file, which is read into an ArrayList in Java for creating and editing records, with the changes saved back to the .txt file afterwards. A very primitive method indeed. Having since heard about SQL, I'm considering porting my existing .txt database to MySQL, but I have absolutely no idea how SQL, and specifically MySQL, works, except that it can interact with Java code. Can you suggest any books to buy or websites to visit? Would the book Head First SQL help, especially for using Java code to interact with the SQL database? I should mention that I'm already comfortable with using 3rd-party APIs.
View from 30,000 feet:
First, you'll need to figure out how to represent the text-file data using appropriate SQL tables and fields. Here is a good overview of the different SQL data types. If each line of your file is a single, self-contained library record, you'll only need to create one table. This is definitely the simplest way to do it, as the conversion can work line by line. If the records contain a lot of duplicated data, the more appropriate approach is to create multiple tables so that your database doesn't store the same data twice. You would then link these tables together using IDs.
When you've decided how to split up the data, you create a MySQL database, and within that database, you create the tables (a database is just something that holds multiple tables). Connecting to your MySQL server with the console and creating a database and tables is described in this MySQL tutorial.
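A minimal sketch of what that might look like for a library, with made-up table and column names: one table for books and one for loans, linked by an ID.

CREATE DATABASE library;
USE library;

CREATE TABLE book (
  book_id  INT AUTO_INCREMENT PRIMARY KEY,
  title    VARCHAR(200) NOT NULL,
  author   VARCHAR(100),
  pub_year SMALLINT
);

CREATE TABLE loan (
  loan_id  INT AUTO_INCREMENT PRIMARY KEY,
  book_id  INT NOT NULL,          -- each loan points at one book
  borrower VARCHAR(100),
  due_date DATE,
  FOREIGN KEY (book_id) REFERENCES book (book_id)
);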
Once you've got the database created, you'll need to write the code to access it. The link from OMG Ponies shows the simplest way to use JDBC to connect to your database. You then use that connection to create a Statement object and execute queries to insert, update, select, or delete data. If you're selecting data, you get a ResultSet back and can read the rows from it. Here's a tutorial on using JDBC to run a SELECT and work with the ResultSet.
Your first code should probably be a Java utility that reads the text file and inserts all the data into the database. Once you have the data in place, you'll be able to update the main program to read from the database instead of the file.
Know that the connection between a program and a SQL database goes through a 'connection program' (in Java, a JDBC driver). You write an instruction as an SQL statement, say
Select * from Customer order by name;
and then set up to retrieve data one record at a time. Or in the other direction, you write
Insert into Customer (name, addr, ...) values (x, y, ...);
and either replace x, y, ... with actual values or bind them to the connection according to the interface.
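The 'bind' variant is usually written with placeholders; the connection layer (for example a JDBC PreparedStatement) supplies the actual values separately, which also protects you from SQL injection:

Insert into Customer (name, addr) values (?, ?);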
With this understanding you should be able to read pretty much any book or JDBC API description and get started.