When I load my initial data via SQL statements in .sql files and then deploy the app on Heroku, the SQL files aren't executed and no data is found in the database by default. Why does this happen, and how do I solve it?
Heroku does not use the SQLite database. You have to use Heroku's shared PostgreSQL database, since Heroku is a 'production' environment.
When you push a Django app up to Heroku, it overrides the database settings in your settings.py file. You have to run a syncdb or a South migration against the production PostgreSQL database your app now uses.
By the way, you will need to install Postgres in your development environment and pip install psycopg2 for Postgres/Python support.
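A rough sketch of what that looks like from the command line, assuming the Heroku CLI is installed (the fixture and SQL file names below are placeholders): run the schema sync against the Heroku database, then load your initial data either as a Django fixture or by piping the SQL file into the Heroku Postgres console.

heroku run python manage.py syncdb
heroku run python manage.py loaddata initial_data.json
heroku pg:psql < initial_data.sql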
My setup is a local machine for development with the dev DB and a number of remote servers holding the production database. While looking for a system to manage the versions of my SQLite database I found Liquibase, but I can't tell whether it will work for what I need: updating the schema of the production databases when I release a new version, applying the changes configured in Liquibase's changelog file for that version. Of course, all the rest of the code is under Git, so if Liquibase only needs the changelog files I can put them in the repository; if it needs anything else, that could become a problem.
Yes, it should work. If you are running Liquibase against a database for the first time, it will run all the migrations and record what it has applied by creating a separate tracking table for itself in that database. You should, however, verify that the structure is the same locally and in production so the migrations won't cause errors.
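If the changelog files are in the repository, applying a release to a production server is roughly a single command along these lines (paths, file names and the JDBC driver jar are placeholders; SQLite needs its JDBC driver on the classpath):

liquibase --driver=org.sqlite.JDBC --classpath=sqlite-jdbc.jar \
  --url=jdbc:sqlite:/path/to/production.db --changeLogFile=db/changelog.xml update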
Our team is using a Django environment to develop a website. The main issue is that one team member recently updated one of the databases and the change will not carry through to MySQL. We are literally on the same branch, but the database tables are completely different. We are using the current versions of Django, Python, and MySQL as the development environment, and GitHub to share work.
The truth is that Git never syncs your MySQL database: the MySQL database is local to your system. If you use SQLite, its database file can be committed and synced through Git. If you both need to access the same data, you need a database in the cloud so that you are both on the same page.
Also, you need to apply the migrations to get the same tables; this is independent of the system, but it does depend on which migrations have been created and applied.
This will create the same tables and columns. Just run this in the terminal:
python3 manage.py makemigrations
python3 manage.py migrate
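If the tables still differ after that, it can help to compare which migrations have actually been applied on each machine; recent Django versions provide a command for this:

python3 manage.py showmigrations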
Is it possible to create a package or replace an existing package in a local database using a package from a different database without having to export it from the remote database?
Basically, I have two environments/servers (DEV and QA).
The developers who work on the packages use the development environment, and I would like to update the same packages in the QA environment using the packages in DEV (ignore any possible issues for now, e.g. compilation failures).
Is it possible to frequently update the package in QA using the package in Dev as the source (instead of compiling from an .sql file)? Maybe a database link?
Yes, it's possible. You could create a process on your target system that uses the DBMS_METADATA package on the remote system to fetch the DDL for the desired package spec and body, and then uses dynamic SQL on the local system to compile the fetched code.
Alternatively, you could use a tool such as Oracle's SQL Developer for migrating code, using either the database diff functionality to detect differences and prepare the appropriate DDL scripts, or the Cart functionality to pick and choose what gets migrated. However, I'm not sure how well the SQL Developer approach can be automated.
How can I completely disable Entity Framework migrations? I mean, what are the best practices when you are almost ready to go live with your web app? I'm worried that some automatic script will reset my DB :)
Migrations are normally used on the development database. For production, I would suggest that you instead generate a SQL script using the Update-Database -Script command, which can then be run against your live database when you need to update it. It would be dangerous to have your development application pointing at a live DB and to run migrations directly.
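For example, in the Package Manager Console (assuming EF6-style code-first migrations; the migration names below are placeholders), you can script everything from an empty database or just the delta between two migrations:

Update-Database -Script -SourceMigration: $InitialDatabase
Update-Database -Script -SourceMigration: "MigrationA" -TargetMigration: "MigrationB"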
I have an eshop running PrestaShop 1.3.6. On my local copy I've updated to 1.4 first and then to 1.4.1...
Now I would like to update on the server... is it possible to just upload the files from my local 1.4.1 install, adjust the settings file and run the update script from 1.4.1 directly (without the intermediate step to 1.4)?
I can see there are database update scripts for each version, so it should be safe to do it like that, but I want to be sure before I run it on the server.... thanks
I usually do major upgrades this way:
1. Take a snapshot of the current site (tar.gz) and back up the database using the mysqldump tool (for compatibility).
2. Download all the files and set up the site on your local server machine using the database dump (via the mysql command) and the downloaded snapshot. Adjust settings if necessary.
3. Perform the upgrade on your local site, thoroughly test it, and test it again with your code and theme.
4. Repackage your updated files and database (tar.gz and mysqldump) and upload them to the server.
5. Erase the old site and untar the upgraded site into its folder to take its place.
6. Replace the old database with the upgraded one (using the mysql command on the server).
7. Adjust settings if necessary. Test and run it! :)
That should be all. If you're more advanced you could optimize most of the steps. Give me a shout if you need all the useful commands to back up and restore files & DB.
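For reference, the backup and restore parts of the steps above typically come down to commands along these lines (user, database and path names are placeholders):

mysqldump -u dbuser -p shop_db > shop_db.sql
tar -czf shop_files.tar.gz /path/to/shop
mysql -u dbuser -p shop_db < shop_db.sql
tar -xzf shop_files.tar.gz -C /path/to/shop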