I'm fairly new to using databases and have been tasked with creating an automated process that uploads Excel files into an Oracle database.
I was told that the user should put the files into a dedicated folder and that a process should then upload the files automatically. After checking Stack Overflow and the internet, however, it looks to me like there is no way to do the upload with just PL/SQL.
Do I need to use other external tools to achieve this, or am I just looking in the wrong place?
The reason I want to do it with just PL/SQL is that I don't have SYS rights on the server or a way to install any tools right now.
You can upload CSV files stored in a folder directly into Oracle tables using SQL*Loader. But the files need to be stored as CSV, not XLS, and transferred to a folder on a server that has at least an Oracle client installed. In this case, your user should save the files as CSV, and you then need a pick-up process that moves them to a server where you can run the SQL*Loader job.
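As a rough sketch, assuming the CSV has already been produced and copied to the server, the SQL*Loader side could look like this (the file, table, and column names are made up, and the control file would have to match your real CSV layout):

    -- load_orders.ctl (hypothetical control file)
    LOAD DATA
    INFILE 'C:\incoming\orders.csv'
    APPEND
    INTO TABLE orders_staging
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( order_id,
      customer_name,
      order_date DATE "YYYY-MM-DD"
    )

and then run it with something like:

    sqlldr userid=app_user/secret@ORCL control=load_orders.ctl log=load_orders.log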
However, if you want to keep using Excel and you have no option to move the files, Oracle Application Express (APEX), which is free and included with Oracle Database, can load Excel files directly and automatically into tables. You would have to create a small application in APEX with a page for doing this; it is completely out of the box and quite easy. If you use APEX 18.1 or higher, the feature is built in; if you use APEX 5.1.4, you need to install a plugin. In this setup the user is responsible for uploading the Excel file through the APEX web application, or you can use the APEX_DATA_PARSER package to do it without manual intervention. Keep in mind, though, that if you use the API, the files must be accessible to the database.
Apex Data Parser 19c
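For reference, a query against APEX_DATA_PARSER (documented at the link above) typically looks something like this; the bind variable and file name are placeholders for wherever you stage the uploaded file:

    SELECT line_number, col001, col002, col003
      FROM TABLE(
             apex_data_parser.parse(
               p_content   => :file_blob,     -- BLOB holding the .xlsx file
               p_file_name => 'upload.xlsx'   -- the name is used to detect the file type
             ));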
Let me know if you have more doubts about it.
Regards
Related
Is there any way to upload a SQL DB file stored locally on disk to Azure SQL Server? It's already populated with data, and I'd like to use it for learning the Azure environment and building web apps alongside it. NOTE: this is not a SQL Server DB file; it's just a regular standalone DB file with data in it.
There's a tool called Data Migration Assistant, which you can download here. It's quite intuitive to use.
https://www.microsoft.com/en-us/download/details.aspx?id=53595
Please note:
I am a game programmer, so backend development isn't my forte. There are times, however, when I work with our database at my job. Please don't shoot me if my question is ridiculous.
Is there a way to create a local MySQL file and access it through PHP or C#?
I know you can make a local webpage on your machine (pretty much for testing purposes) and access multiple locally created files.
I assume that something similar would work with MySQL. (Are the login credentials also stored within the file?) I remember seeing a few online tutorials that offered a download of both the PHP and the database file, but I can't seem to find them now.
I've searched for this, but all the relevant results involved downloading MySQL and hosting a server, which is a bit more than I wanted to do.
So if it's possible to create a local MySQL file, how do you do so?
The tools I intend on using while doing this:
PHP/jQuery/HTML and C#
For MyISAM tables, there is one directory per database inside the MySQL data directory, which contains several (usually three) files per table. For InnoDB tables, everything is contained in a few files directly inside the data directory.
The location of the MySQL data directory is usually set in my.cnf using the datadir parameter.
The login credentials are stored in a special database called "mysql" which is in that data directory like any other database.
However, you have to install and run MySQL to access those files; you cannot access them with PHP or any other client API alone. If you want to do something like that, you are better off using SQLite.
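If MySQL is running, you can see both of these details from any client session, for example:

    -- Location of the data directory (the datadir setting from my.cnf)
    SHOW VARIABLES LIKE 'datadir';

    -- The login credentials live in the special "mysql" system database
    SELECT User, Host FROM mysql.user;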
MySQL is a database engine; you need to install it before you can use it, unlike SQLite, which stores its database in files. Maybe that is something more to your liking. I know there are libraries that support SQLite for PHP; I'm not sure about the rest.
With SQLite you don't need to install anything.
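To illustrate the difference: an SQLite database is just a single file that gets created on first use. For example, after starting the command-line shell with sqlite3 gamedata.db, ordinary SQL creates and fills the file-based database (the file and table names here are made up), and PHP (via PDO) or C# can then open that same file directly:

    -- ordinary SQL, run against a database that is nothing more than the file gamedata.db
    CREATE TABLE players (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    INSERT INTO players (name) VALUES ('Alice');
    SELECT * FROM players;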
MySQL can be used as an embedded database, but you will need to contact them in order to purchase a copy of it.
We would like to allow users to run custom Oracle 11g SQL scripts that have been created for them, complete with parameter prompts, and get a CSV extract of the resulting dataset. Right now I just use SQL*Plus and SQL Developer for this, but those tools would also allow the creation of custom scripts, and we do not want users to be able to write their own queries.
In many cases we intend to fulfill this need with Crystal Reports/Crystal Server, but we use CR XI, and sometimes very WIDE extracts are difficult to create because of the page size limitations. It also has a limit on the number of concurrent users, and sometimes we may need more.
Does anyone know of a FREE tool that lets users execute Oracle SQL scripts and get file exports as a result, yet will NOT allow them to create new scripts?
NOTE: We have a Citrix environment and therefore are able to limit where the script files are located and what access users have to those files and folders.
Given that a SQL script is just a text file, I'm not sure I see how this could be possible, but perhaps I'm missing something about how you see a tool like SQL*Plus allowing the creation of custom scripts. If you give me any tool that runs SQL scripts, I can always open my favorite text editor, write a SQL script, and have your tool run it (assuming that you allow users to create new files in your Citrix environment or to map a file from their local machines).
Personally, I'd probably create a small APEX application in the database that would present a menu letting users pick an export. Behind the scenes, the APEX app would run whatever SELECT was necessary (I'd generally build a CLOB in the database rather than a file on the file system, unless you're making a great deal of use of SQL*Plus formatting commands in your scripts) and would allow the user to download the file (or use some alternate delivery mechanism such as email).
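A bare-bones sketch of the CLOB approach (the table and column names here are invented, and a real version would handle quoting, NULLs, and bind parameters):

    DECLARE
      l_csv CLOB := 'ORDER_ID,CUSTOMER,AMOUNT' || CHR(10);  -- header row
    BEGIN
      FOR r IN (SELECT order_id, customer, amount FROM orders) LOOP
        l_csv := l_csv || r.order_id || ',' || r.customer || ',' || r.amount || CHR(10);
      END LOOP;
      -- hand l_csv to the APEX page as a file download, e-mail it, etc.
    END;
    /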
I use JasperReports for that: http://jasperforge.org/index.php?q=project/jasperreports
I was wondering if there is a way to automatically append to a script file all the changes I am making to my columns, tables, relationships, etc.
The thing is, I am making a lot of different changes on a TEST db, and the idea is to apply this change script when I move the test DB to production, hence keeping the production data but applying all schema and object changes.
Is there an easy way to do this? Can it also migrate database diagram changes?
I have seen how you can create a change script each time you make a change, but this means I have to copy and paste it into a master file. Actually pretty easy!
I was just wondering if I was missing something?
Do not make changes to the test server using the UI. Write scripts and keep them under source control. You can test your scripts starting from backups of the live data, and you can tune your scripts until they achieve the desired result. Then you can check in the scripts for reference and later apply them to the live server. See the article Version Control and Your Database.
BTW, check out the SSMS Tools Pack; I think it may do what you want (I'm not sure). My advice stands nonetheless: version your schema, use explicitly created/saved scripts, use source control.
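As a minimal sketch of that approach (the version number, table, and columns are invented; the linked article describes the idea in detail), each checked-in change script guards itself against being run twice:

    -- One-time setup: record which change scripts have already been applied
    CREATE TABLE SchemaVersion (
        VersionNumber int      NOT NULL PRIMARY KEY,
        AppliedOn     datetime NOT NULL DEFAULT GETDATE()
    );

    -- A typical change script: safe to re-run against any copy of the database
    IF NOT EXISTS (SELECT 1 FROM SchemaVersion WHERE VersionNumber = 42)
    BEGIN
        ALTER TABLE Customers ADD MiddleName nvarchar(50) NULL;
        INSERT INTO SchemaVersion (VersionNumber) VALUES (42);
    END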
There's no way to directly generate a "delta" script in SSMS.
However, if you script out the entire database, including data, to SQL every time you publish changes, using the SQL Server Database Publishing Wizard, you should be able to extract diffs between the versions and get your deltas that way.
If money is no object, you can purchase Visual Studio Team System Database Architect edition and use its fantastic database comparison tools to generate and version control exactly the diffs you want.
Try using tablediff, which came with SQL Server 2005.
SQL Server 2005 TableDiff Utility
tablediff Utility
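If I remember the switches correctly, a typical call looks something along these lines (the server, database, and table names are placeholders); the -f option writes a T-SQL script that would make the destination table match the source:

    tablediff -sourceserver TESTSRV -sourcedatabase MyDb -sourceschema dbo -sourcetable Customers
              -destinationserver PRODSRV -destinationdatabase MyDb -destinationschema dbo -destinationtable Customers
              -f C:\diffs\Customers_fix.sql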
Our process is that when a developer is done with a change, they script it out and check it into Subversion. In Subversion we have folders for Tables, Stored Procs, Data, etc. They script the change so it is repeatable (i.e., it doesn't insert the new data if it is already there). This is important to do anyway, so you keep the history of changes for a given object in the database.
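For example, a repeatable data script in that style simply checks before it inserts (the names below are invented):

    -- Safe to run more than once: the row is only inserted if it is missing
    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'SHIPPED')
    BEGIN
        INSERT INTO dbo.OrderStatus (StatusCode, Description)
        VALUES ('SHIPPED', 'Order has been shipped');
    END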
In the past, we would just list each of the files that we wanted scripted out in a text file (e.g., FileListV102.txt). When we were ready to make a release, we would do a “get latest” on all of the files (from VSS back then). We then had a simple utility that would read the “file list” file and open each of those files in turn, concatenating them into an output file. That is pretty easy to code.
We outgrew that, and now we have a release management tool (which can be found here and will be on sale mid-September) that takes all of the files and creates one big SQL script file out of them. It does this in the order you would expect based on the folder names, so files found in the "Tables" folder are run before those in the "Data" folder, etc.
Either way, once you are done you have a big SQL script file that you can then apply to a fresh copy of production and that is what you test against.
I know I'm way late to the party, but I just wanted to add that there are dozens of third-party products out there. Some are very good, some are very cheap or free, and some are a mixture. I listed 22 here:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
We have been using a relatively new piece of software called Kal Admin.
It has a change management feature and makes distributing selected changes to other databases very easy. We used to do this by comparing two databases, but that did not satisfy our need for change tracking.
BTW, Kal Admin has metadata and data compare capabilities as well.
I have a VB project that runs on SQL Server 2005. While making the setup file for it, how do I include the DB?
You don't.
Typically you have a DB generation script that is run either as part of setup or as part of the first run of the application.
You also need to consider migrations (changes to the DB when new releases of your application are published).
Consider using MigratorDotNet or RikMigrations to solve these problems in a separate installer/upgrade program if you are still using VB6.
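A trimmed-down example of what such a generation script might contain (the database, schema, and table names are placeholders); the setup or the application's first run would execute it, e.g. via sqlcmd or ADO:

    -- Create the database and its objects only if they do not exist yet
    IF DB_ID(N'MyAppDb') IS NULL
        CREATE DATABASE MyAppDb;
    GO
    USE MyAppDb;
    GO
    IF OBJECT_ID(N'dbo.Customers', N'U') IS NULL
        CREATE TABLE dbo.Customers (
            CustomerId int IDENTITY(1,1) PRIMARY KEY,
            Name       nvarchar(100) NOT NULL
        );
    GO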
I disagree; you could include the database. Simply distribute the .MDF file with your application.
Of course, the setup application would have to know how to attach the database to an existing SQL Server RDBMS.
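Attaching the shipped .MDF (plus the .LDF, if you include one) boils down to a single T-SQL statement run against the local instance; the path and names below are only examples:

    -- Attach the distributed data file to the existing SQL Server instance
    CREATE DATABASE MyAppDb
        ON (FILENAME = N'C:\Program Files\MyApp\Data\MyAppDb.mdf')
        FOR ATTACH;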
Both methods given in the above answers will work; I have tried them both. However, using a DB generation script reduces the size of the final deployment files considerably. I would launch the script on the first run of the application and not in the setup itself.
I'll second Jack on this one.
In my experience, installs that require an actual database file tend to have more issues, both when updating and on first install, than those that run scripts. As Jack mentioned, another bonus is the reduced file size.
You can create whole-database scripts by right-clicking on the required database and selecting the script database option. Note, however, that this will only create the tables and fields and will not replicate any data.