I need some help with a problem:
I have a *.bak file and I need to check the contents of a certain table in it; however, due to circumstances I cannot restore it on the server.
I tried to open it using LibreOffice Calc without success, and because of workplace circumstances using Excel is not an option.
How can I open and search for the data in this *.bak file?
You won't be able to see anything by opening a *.bak file directly. If you want to verify the contents, I suggest you make a database script export (structure and data), compare it with the previous version (the one without that table), and check whether the table is referenced inside the script.
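If the exported script is large, a short script can do the searching for you. Here is a minimal sketch in Python, where export.sql and my_table are placeholders for your exported script and the table you care about:

    # Minimal sketch: search an exported database script for references to a table.
    # "export.sql" and "my_table" are placeholders; substitute your own names.
    import re

    table_name = "my_table"
    pattern = re.compile(rf"\b{re.escape(table_name)}\b", re.IGNORECASE)

    with open("export.sql", encoding="utf-8", errors="replace") as script:
        for line_no, line in enumerate(script, start=1):
            if pattern.search(line):
                print(f"{line_no}: {line.rstrip()}")

Every hit is printed with its line number, so you can see both the table definition and any other objects that reference it.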
Related
I am working on a system that gets reloaded frequently. When the system is reloaded, a lot of files can be deleted or changed back to a previous stable build version (it's a dev environment). The only thing that doesn't get reloaded is the database.
I have been tasked with putting insert and delete .sql scripts into the environments so that, if required, someone can run them at a given time, usually two weeks after the files have been added to the system.
Initially I thought of creating a directory to keep the files and a table where one of the columns stores the path to each .sql file. This would allow someone to query the database and, using the path, execute the desired script. The problem is that the environment is constantly reloading, which could result in losing the files.
Am I correct in assuming that the right way to approach this is to use the BLOB datatype to store the .sql files? The database isn't reloaded, so no files would be lost. I am new to SQL and I'm unsure what the correct approach is. Would VARCHAR also work? Is one datatype more efficient than the other?
I wouldn't put SQL scripts in a SQL database; it doesn't feel right.
SQL scripts usually go in a Git repository so that they can be versioned and have better visibility.
I have a repository connected with the integrated source control in Oracle. I did a full export of my database, without any table data, into separate directories, and it looks fine. The problem is that every time I want to change something, I have to export the stored procedure or table SQL code and then upload it to the repository, which is hard because at the end of the day I'm not sure how many changes I made and I can forget some of them. A full export without data could have been the solution, but I don't have time to wait 20-25 minutes for the export at the end of every day. Is there any way to export only the changes made on the current day, or after the last export? Or maybe to export the SQL code directly on each compilation inside Oracle management studio? The database is not on my computer; it's located on a server that I'm connected to.
Here is what my Git folder looks like, organized into separate folders.
You need to work the other way around. To change a package for example, open the corresponding source code file from your Git repository in your development tool (SQL Developer, PL/SQL Developer etc), make your changes, test, save the file, check in with ticket number and comment. As a rule you should not edit stored code directly in the database. (PL/SQL Developer has a checkbox "Allow editing of database source", which I generally leave unchecked. Probably other tools have something similar.)
I'm working with a company trying to set up a new database system, as their old database software vendor has gone out of business. All the data is in a .fb file that is encrypted (you used to have to get backups 'unlocked' before they would let you use them).
I've managed to get a copy of the database (I think it's unencrypted, as I copied it while the database was open and then changed the copied file's permissions using the terminal).
The problem is that it's a .fb file and I can't find a way to 'open' it to browse the data...
Any ideas?
Generally speaking, data stored in relational databases isn't just stored as ASCII CSV files, so you won't be able to simply open a .fb file in a text editor and grab the data.
If you're still able to query the database, you will need to have the FrontBase server generate a dump of the data into a flat file.
See the FrontBase documentation for backup and restore, specifically section 4.9.1, Exporting Schema and Content Data:
WRITE ALL OUTPUT('<output-directory>' [,'YES']);
How can I automatically run my Excel file every day, reload data from the database, and append it to the existing data?
Do I need to use a VBA script, or is there another way?
I have an idea that might work:
Write a script in any backend language of your choice (Java, Python, ...) that creates an active connection to the database.
Run it periodically to query a database table (MySQL, etc.) and store newly entered data in variables.
Then append that data to the Excel file as you normally would with any file.
Most languages have packages to handle Excel files, so I think that should do it.
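To make that concrete, here is a minimal sketch in Python, assuming the mysql-connector-python and openpyxl packages; the connection details, table, columns and workbook path are placeholders you would replace with your own. You could then schedule the script to run once a day with cron or Windows Task Scheduler.

    # Minimal sketch: pull newly entered rows from a MySQL table and append them
    # to an existing Excel workbook. All names and credentials are placeholders.
    import mysql.connector
    from openpyxl import load_workbook

    EXCEL_PATH = "report.xlsx"

    conn = mysql.connector.connect(
        host="localhost", user="report_user", password="secret", database="mydb"
    )
    cursor = conn.cursor()

    wb = load_workbook(EXCEL_PATH)
    ws = wb.active

    # Assumes column A of the sheet holds the numeric id of rows already copied,
    # so only rows newer than the last one present are fetched.
    last_id = ws.cell(row=ws.max_row, column=1).value or 0
    cursor.execute(
        "SELECT id, name, amount FROM sales WHERE id > %s ORDER BY id", (last_id,)
    )

    for row in cursor.fetchall():
        ws.append(list(row))

    wb.save(EXCEL_PATH)
    cursor.close()
    conn.close()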
Good luck
I inherited some old records for a company I volunteer for. One of the old files is a SQL dump from their old web page, and I would like to get the data from one of its tables into Excel for their use.
-- MySQL dump 10.11
The dump drops the table if it exists, creates the table anew, and then inserts all of the data.
Is there some easy way I can get this data into Excel on my PC? I don't have SQL Server or anything like that installed. I assumed there was some easy way to get a CSV or Excel file out of it, but so far I have failed to find one that doesn't involve first loading the dump into some SQL server.
Unfortunately, I don't think there is any way to export a dump file directly into an Excel or .CSV file. The reason for this is that the dump file is actually a collection of SQL statements (the CREATE TABLE and INSERT statements you describe) rather than the raw data itself. SQL servers do this to prevent a whole list of problems that can occur when you try to manipulate raw data manually.
Lucky for you, MySQL offers a free version of their server. You can find it here: http://dev.mysql.com/downloads/
I think you are best off downloading this and restoring your file as a new database. This has the added benefit of giving you complete control over the data from that point on. Exporting to Excel would be easy at that point; however, you may find it a lot more fulfilling to continue using the MySQL server.
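If all you need afterwards is a file Excel can open, a few lines of Python will export a table from the restored database to CSV. This is only a sketch, assuming the mysql-connector-python package, with placeholder credentials, database and table names:

    # Minimal sketch: export one table from the restored MySQL database to a CSV
    # file that Excel can open. Credentials and names below are placeholders.
    import csv
    import mysql.connector

    conn = mysql.connector.connect(
        host="localhost", user="root", password="secret", database="restored_db"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM members")

    with open("members.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(cursor.column_names)  # header row from the cursor metadata
        writer.writerows(cursor.fetchall())   # all data rows

    cursor.close()
    conn.close()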
Hope this helped.