Heroku rake db:migrate aborted with 'PG::Error:' - ruby-on-rails-3

I'm trying to run my app on Heroku, and I can't get past the 'rake db:migrate' command.
I keep getting this error:
rake aborted!
An error has occurred, this and all later migrations canceled:
PG::Error: ERROR: must be owner of extension plpgsql
Any ideas?
P.S. This is the full error log: http://pastebin.com/iYeiMD2y

It's attempting to populate the database from your schema. The problem is that it's trying to add comments to PostgreSQL extensions that your database user does not own. These three lines will cause problems:
COMMENT ON EXTENSION plpgsql IS 'PL/pgSQL procedural language';
COMMENT ON EXTENSION pg_trgm IS 'text similarity measurement and index searching based on trigrams';
COMMENT ON EXTENSION unaccent IS 'text search dictionary that removes accents';
In order to comment on any database object, you need to be the owner of that object. See the PostgreSQL documentation for more information.
Heroku also doesn't allow you to create or modify extensions. They provide a list of extensions and full-text search dictionaries that are available for use, and all of the extensions in your schema dump are on that list.
Remove or comment out the lines in the schema dump that create extensions and add comments on extensions, and that should get you past this error.
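For example, if your app uses a SQL schema dump (db/structure.sql), the offending lines can simply be commented out. A sketch; the exact contents depend on how your schema was dumped:
-- COMMENT ON EXTENSION plpgsql IS 'PL/pgSQL procedural language';
-- COMMENT ON EXTENSION pg_trgm IS 'text similarity measurement and index searching based on trigrams';
-- COMMENT ON EXTENSION unaccent IS 'text search dictionary that removes accents';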

Related

'DBT docs generate' does not populate model column-level comments in the catalog

I use dbt-snowflake 1.1.0 with the corresponding dbt-core 1.1.0.
I added documentation for my models in yaml files, i.e.:
> models/stage/docs.yml
version: 2
models:
  - name: raw_weblogs
    description: Web logs of customer interaction, broken down by attribute (bronze). The breakdown is performed using regular expressions.
    columns:
      - name: ip
        description: IP address from which the request reached the server (might be direct customer IP or the address of a VPN/proxy).
...
Although these details show up correctly in the dbt UI when I run dbt docs generate and then dbt docs serve, they are not listed in target/catalog.json:
cat target/catalog.json | grep identity
(no results)
According to the dbt documentation, I understand that column comments should be part of catalog.json.
LATER EDIT: I tried running dbt --debug docs generate and it seems that all data is retrieved directly from the target environment (in my case, Snowflake). Looking at the columns of my model in Snowflake, they indeed do NOT have any comments posted on them in Snowflake.
It thus seems to me that the underlying error might be with the fact that dbt run does not correctly persist the column metadata to Snowflake.
After further investigation, I found that the reason for the missing comments was indeed that comments are written to catalog.json during dbt docs generate based on what is retrieved from the database, while dbt docs serve populates the UI by combining the information in catalog.json with metadata (in my case, the documented column comments) from the local dbt models.
The solution to persist such metadata in the database with dbt run was to add the following dbt configuration:
> dbt_project.yml
models:
  <project>:
    ...
    +persist_docs:
      relation: true
      columns: true
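If you would rather enable this per model instead of project-wide, the same flags can also be set from a model's config block; a minimal sketch (persist_docs takes the same relation/columns options):
{{ config(persist_docs={"relation": true, "columns": true}) }}
Either way, the comments only reach Snowflake on the next dbt run; after that, dbt docs generate will pick them up into catalog.json.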

Check if doctrine relations are correct

Is it possible to check if the relations between models are correct?
I'm looking for a CLI command or something like the Symfony2 profiler which shows wrong relations.
There is a built-in command that validates that the mapping files are correct and in sync with the database:
./bin/doctrine help orm:validate-schema
'Validate that the mapping files are correct and in sync with the
database.'
In the Symfony2 Doctrine bundle there are two commands instead:
doctrine:schema:validate
The doctrine:schema:validate checks the current mappings for valid
forward and reverse mappings.
and
doctrine:mapping:info
The doctrine:mapping:info shows basic information about which
entities exist and possibly if their mapping information contains
errors or not.
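With the Symfony2 console these can be run as, for example (assuming the standard app/console entry point):
php app/console doctrine:schema:validate
php app/console doctrine:mapping:info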
"Using custom column definition trigger schema update every time":
"This is a known limitation that we cannot fix."
https://github.com/doctrine/dbal/issues/2666#issuecomment-283772609

Execute service builder generated sql file on postgresql

I would like to execute the SQL files generated by Service Builder, but the problem is that the SQL files contain types like LONG, VARCHAR, etc.
Some of these types don't exist in PostgreSQL (for example, LONG corresponds to BIGINT).
Is there a simple way to convert the SQL files' structures so that they can be run on PostgreSQL?
Execute ant build-db on the plugin and you will find an sql folder with various vendor-specific scripts.
Daniele is right; using the build-db task is obviously the right way to do it.
But... I remember a similar situation some time ago: I had only the Liferay pseudo-SQL file and needed to create proper DDL. I managed to do it in the following way:
You need to have Liferay running on your desktop (or on the machine where the source SQL file is), as this operation requires the portal Spring context to be fully wired.
Go to Configuration -> Server Administration -> Script
Change language to groovy
Run the following script:
import com.liferay.portal.kernel.dao.db.DB
import com.liferay.portal.kernel.dao.db.DBFactoryUtil

// Get the PostgreSQL-specific DB implementation and translate the pseudo-SQL file
DB db = DBFactoryUtil.getDB(DB.TYPE_POSTGRESQL)
db.buildSQLFile("/path/to/folder/with/your/sql", "filename")
The first parameter is the path and the second is the filename without the .sql extension; the file on disk should still have the proper extension, i.e. it must be called filename.sql.
This will produce a tables folder next to your filename.sql containing a single tables-postgresql.sql with your PostgreSQL DDL.
As far as I remember, Service Builder uses the same method to generate database-specific code.

In Rails 3.2.3, can a migration to add index be created on the command line?

It might be convenient to put all the lines we type into bash into a script that creates the whole initial project: all the scaffold creations, rake db:migrate, and even the git commands, so that if we ever need to recreate the same project under Rails 4 or 5, it will be very simple.
But for a migration file that adds an index to a table column, is there a way to specify that on the command line, or somehow automate it? Otherwise we will need to put in the command that creates the migration file and then manually edit that file by hand -- if everything can be put in a script, that would be kind of neat.
(or if the index can be specified in the scaffold line, that might be even better)
When generating your model's migration, use index after the field name for a regular index and uniq for a unique index.
Example:
$ rails g resource Widget name:index part_number:uniq
You can probably do something similar when generating just a migration.
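If your Rails version's generator doesn't support those modifiers, a migration that only adds indexes is small enough to script anyway; a minimal sketch (hypothetical widgets table and column names, matching the resource above):
class AddIndexesToWidgets < ActiveRecord::Migration
  def change
    # plain index on name, unique index on part_number
    add_index :widgets, :name
    add_index :widgets, :part_number, :unique => true
  end
end
The reversible def change style with add_index works in Rails 3.1+, so it fits the Rails 3.2.3 setup in the question.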

go up to a specific database version in rails

How do I go up to a specific database version from an empty database in Rails?
In my case, I recently reset the whole database, so all the tables have already been dropped.
My migration files are as follows:
20111127152636_create_users.rb
20120110100458_create_cars.rb
20120131003026_add_birth_date_to_users.rb
What command do I have to call to get to the second-latest version, which is 20120110100458?
I have tried rake db:migrate:up version=20120110100458.
Unfortunately, it didn't give the result I expected; no tables were created at all.
If you want to run the first two migrations, use
rake db:migrate VERSION=20120110100458
(it will run 20111127152636_create_users.rb and 20120110100458_create_cars.rb).
Note that rake db:migrate:up expects the uppercase VERSION environment variable and, even then, runs only the single migration you name rather than everything up to it, which is why your earlier attempt created no tables.
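You can then confirm where the schema ended up (assuming the standard Rails rake tasks):
rake db:version
# Current version: 20120110100458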