We have a table in BigQuery and wanted to change a column name. So we queried all of the columns and used an alias to rename the one we wanted to change. It appears to have worked; however, the table schema in the browser tool still shows the old column name. We refreshed the project, and it still shows the old column name.
We renamed the field from "has_info" to "ct_info":
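The query was shaped roughly like the sketch below (the dataset, table, and other column names are placeholders; only the has_info → ct_info alias is the real one), with the result written back over the original table:

    -- Sketch only: mydataset.mytable and the other columns are placeholders.
    SELECT
      id,                   -- placeholder column
      created_at,           -- placeholder column
      has_info AS ct_info   -- the column being renamed via an alias
    FROM mydataset.mytable;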
Is this a bug in the UI?
This is an issue with the UI. The problem is that the UI has cached the schema from the old version of the table, and doesn't realize that it has changed. A complete browser-level reload of the table will fix the problem.
I'll file a bug to see if we can improve this.
Related
I'm trying to create a table with the create-table GUI. It doesn't let me change the column names; most of them are still set to 'COLUMN'.
I tried editing them in the GUI, but the names revert to 'COLUMN'.
It doesn't always automatically create the DDL, and when it doesn't, it doesn't change/edit my table.
I've tried installing a different version, but no luck.
Okay, so I am working in Web Forms, but I assume the problem would also apply to MVC, since both have the option of creating a users database on project creation. I accidentally deleted a data table and updated the database, instead of deleting the database itself, because I was trying to recreate it with the seed data. I didn't realize that deleting a data table would do something different from deleting the database itself. The only backup I have is pretty old, so I would prefer a different way to fix things if one exists. How can I fix this?
I have it working now. To fix it, I went to my backup (although a brand-new project would have worked just as well) and copied the SQL script for the data table that I had deleted. Then I went to my broken project, created a new data table in the spot where the old one was, and replaced its code with the code from the backup. I saved it, hit update, and it updated the database for me.
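In case it helps anyone else, the script I copied was just the table's CREATE TABLE definition. A sketch of the kind of thing it contains (the table and columns below are made up; the real ones come from your backup):

    -- Hypothetical example of the table definition copied from the backup;
    -- replace it with the script for the table you actually deleted.
    CREATE TABLE [dbo].[UserProfile] (
        [UserId]   INT           IDENTITY (1, 1) NOT NULL,
        [UserName] NVARCHAR (56) NOT NULL,
        PRIMARY KEY CLUSTERED ([UserId] ASC)
    );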
I am having a few issues with Pentaho Spoon: I want to copy a table from one database to another.
When I click on "copy table" in the tool menu, it auto-creates the transformation for that, but when I run it I get these issues:
The "truncate table" option is ticked, which is why I get an error that my table does not exist.
I have to un-tick that manually. Even then I get an error because the target table has not been created; I have to click on SQL and then execute the query. Is there any way to do that automatically?
The third problem is that the table Pentaho creates doesn't detect the date field, so the data type comes out as UNKNOWN. I have to change that to VARCHAR manually. Is there any way to fix that, or to default to VARCHAR?
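For reference, the DDL I get from the SQL button looks something like this (table and column names here are just examples, not my real ones), and I edit the UNKNOWN column before executing it:

    -- Illustrative sketch only; the generated script has the date field
    -- typed as UNKNOWN, which I change to VARCHAR by hand before running it.
    CREATE TABLE target_table
    (
      id INTEGER
    , name VARCHAR(100)
    , created_date VARCHAR(30)   -- generated as UNKNOWN
    );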
The UNKNOWN data type is typically a driver issue. What database are you using and do you have the right driver?
There is no way to automate the creation of the table within PDI - it is deliberate that it does not do this. You could, however, integrate PDI with a tool that does; something like dbDeploy is a good idea.
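As a rough illustration of the dbDeploy route (everything below is a made-up example, not anything PDI generates): you keep numbered delta scripts that create the target tables, and dbDeploy applies whichever ones have not been run yet before the PDI transformation loads the data.

    -- 001_create_target_table.sql : hypothetical dbDeploy delta script
    -- that creates the table the transformation will load into.
    CREATE TABLE target_table
    (
      id INTEGER,
      name VARCHAR(100),
      created_date DATE
    );

    --//@UNDO
    DROP TABLE target_table;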
UPDATE: There is now a way to automatically create tables; you can follow the blueprint here:
https://github.com/mattcasters/blueprints
Am I missing something here?
I have a .NET 4.5.1 project with an Entity Framework model created from a SQL Server 2005 database (the connection type is also SQL Server), via "ADO.NET Entity Data Model".
This works fine - that is, until I update the database. If I add new tables or add columns to existing tables, all is well, but if I delete a column, the update just doesn't work properly. It also throws errors when I alter a column's type.
This same problem was reported a long time ago: http://blog.jongallant.com/2012/08/entity-framework-manual-update.html#.UytNrvldVD0, but it seems so ridiculous that I can't believe I'm not missing something. Surely I'm doing something wrong? How can I get the model to update properly?
I had the same problem. I opened up the model, removed the changed tables from the diagram, then updated from the database and re-added the tables.
EF brought the tables back in with the correct structure. Save the model, rebuild the project, and everything turns up as expected.
I am not sure why it doesn't detect the deleted columns. I believe there are also issues if you change a column's data type, but I haven't tested that. The above solution has worked for me so far.
I am using an HSQL driver to connect to my database. I can connect without any errors, but I can't see any of my tables in the table tree under PUBLIC. I can create new tables, which do appear, but I can't see the ones that already exist. Also, when I check the .script file, I can't see the new tables. Something strange is going on, but I can't work out what.
Is anyone able to help?
I've worked it out now. It was the format of the connection string for HSQL.
I needed to add :file: to the string, like this: jdbc:hsqldb:file:
I was also putting .script at the end of the file name, like this: jdbc:hsqldb:.script
That was creating temporary files of the form .script.lck, .script.log, etc.
Dropping the .script from the end of the file name opened up the database and allowed me to see the tables. Now my problem is that I can't get any updates to commit: the updates run with no errors in the console, but when I close the file and check the .script file, the data is unchanged. Permissions, I guess.
I'm only posting this answer to help others who might get stuck at the same point.
Try here:
How can I list all tables in a database with Squirrel SQL?
Where most of us fail is in choosing the right schema in the Catalog dropdown (just above the Objects tab). But there are other ideas if you follow the related question.
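If the dropdown route doesn't get you there, you can also ask HSQLDB directly from the SQL tab; something along these lines (assuming HSQLDB 2.x and that your tables live in the PUBLIC schema) lists them:

    -- List the tables in the PUBLIC schema via the standard metadata views;
    -- older HSQLDB versions expose similar data via INFORMATION_SCHEMA.SYSTEM_TABLES.
    SELECT table_name
    FROM   information_schema.tables
    WHERE  table_schema = 'PUBLIC';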