How to delete some data from External Table Databricks - apache-spark-sql

I am trying to delete some data from Azure SQL from Databricks using JDBC, but it generates an error each time. My query is very simple: delete from table1 where date > '2022-05-01'.
I searched many documents online but did not find an appropriate solution for this. Please find my code below.
jdbcUsername = "userName"
jdbcPassword = "password"  # these come from Azure Key Vault
jdbcHostname = "host server name"
jdbcPort = "1433"
jdbcDatabase = "db_test"
jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
    "user": jdbcUsername,
    "password": jdbcPassword,
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver"
}
pushdown_delete_query = f"(delete from table1 where date>'2022-05-01') table_alias"
print(pushdown_delete_query)
spark.read.jdbc(url=jdbcUrl, table=pushdown_delete_query, properties=connectionProperties)
The query returns the error: com.microsoft.sqlserver.jdbc.SQLServerException: A nested INSERT, UPDATE, DELETE, or MERGE statement must have an OUTPUT clause
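The error makes sense once you know that spark.read.jdbc wraps its table argument in a SELECT subquery, so the server receives a nested DELETE and rejects it. DML has to go over a plain JDBC connection instead; from PySpark you can reach java.sql.DriverManager through the driver's JVM gateway. A minimal sketch, assuming the jdbcUrl, jdbcUsername, and jdbcPassword variables from the question (the helper functions themselves are illustrative, not a Spark API):

```python
def build_delete(table, column, cutoff):
    # Build the DELETE statement; cutoff is assumed to be a validated ISO date
    # string, since JDBC-side parameter binding is not used in this sketch.
    return f"DELETE FROM {table} WHERE {column} > '{cutoff}'"

def run_delete(spark, jdbc_url, user, password, stmt):
    # Reach java.sql.DriverManager in the driver JVM via py4j and execute
    # the statement directly, bypassing spark.read.jdbc's SELECT wrapper.
    driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
    conn = driver_manager.getConnection(jdbc_url, user, password)
    try:
        conn.prepareStatement(stmt).executeUpdate()
    finally:
        conn.close()

# Usage (with the variables defined in the question):
# run_delete(spark, jdbcUrl, jdbcUsername, jdbcPassword,
#            build_delete("table1", "date", "2022-05-01"))
```

Note that this runs entirely on the driver node, which is fine for a one-off DELETE.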

Related

Snowflake merge into update is not working after binding a JSON object, but it works when tested with simple update query

I am trying to merge some data from a source to a target table by binding a JSON object to the query.
The query is within a JavaScript stored procedure:
var merge_data = "MERGE INTO UNPIVOTED_DATA UD " +
    "USING (SELECT KEY AS TABLE_COL, VALUE AS TABLE_COL_VAL from lateral flatten(input => parse_json(?))) ST " +
    "ON (ST.TABLE_COL = UD.TABLE_FIELD_NAME) " +
    "WHEN MATCHED THEN UPDATE SET UD.ODK_FIELD_NAME = ST.TABLE_COL_VAL";
var merge_data_stmt = snowflake.createStatement({sqlText: merge_data, binds: [JSON.stringify(full_table_fields_array)]});
var merge_data_rs = merge_data_stmt.execute();
merge_data_rs.next();
return merge_data_rs;
The JSON has the following shape:
var a = {"field_name1": "field/name1", "something_else": "something/else"};
This is a small extract of the full JSON object.
If I test the FLATTEN query by itself, it works just fine and flattens the object:
SELECT KEY AS TABLE_COL, VALUE AS TABLE_COL_VAL from lateral flatten(input => parse_json(?))
Is the problem in the JSON.stringify call?

perl - update from two databases

I have two databases in MariaDB, and I want to update across both of them.
# Connect to database1.
my $db1 = DBI->connect("DBI:mysql:database=db1;host=ip",
    "login", 'password',
    {'RaiseError' => 1});
# Connect to database2.
my $db2 = DBI->connect("DBI:mysql:database=db2;host=ip",
    "login", 'password',
    {'RaiseError' => 1});
This query does not work:
my $query3 = $db1->prepare("
    UPDATE worldmap.worldmap_table t1
    SET t1.severity = 1000
    WHERE t1.host IN
        (SELECT h.name
         FROM host_inventory AS i, hosts AS h
         WHERE i.hostid = h.hostid AND h.available = 1)");
$query3->execute;
Thanks for your response.
Use a single connection. Since both databases live on the same MariaDB server, one connection can reference tables in either of them using the db.table syntax.
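In MariaDB, a single connection can qualify tables across databases as db1.table and db2.table with no extra setup. As a runnable illustration of the same "one connection, qualified table names" idea, here is a sketch using Python's stdlib sqlite3 (SQLite needs an explicit ATTACH, which MariaDB does not; all table and host names are invented for the demo):

```python
import sqlite3

# The first in-memory database plays the role of db1.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE worldmap_table (host TEXT, severity INTEGER)")
conn.execute("INSERT INTO worldmap_table VALUES ('web01', 0), ('db01', 0)")

# Attach a second database under the name db2 (MariaDB would already see
# every database on the server through the one connection).
conn.execute("ATTACH DATABASE ':memory:' AS db2")
conn.execute("CREATE TABLE db2.hosts (name TEXT, available INTEGER)")
conn.execute("INSERT INTO db2.hosts VALUES ('web01', 1), ('db01', 0)")

# One statement, one connection, two databases via qualified names.
conn.execute("""
    UPDATE worldmap_table
    SET severity = 1000
    WHERE host IN (SELECT name FROM db2.hosts WHERE available = 1)
""")
rows = conn.execute(
    "SELECT host, severity FROM worldmap_table ORDER BY host"
).fetchall()
```

Only the available host gets its severity updated; the cross-database subquery does all the work in a single statement.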

(web2py) add extra fields auth_user that reference another field

I am using web2py and I would like to add extra fields to auth_user. Some of these fields are references to other tables. For example:
auth.settings.extra_fields['auth_user']= [
Field('country', 'reference countries')]
db.define_table(
'countries',
Field('name'),
format = '%(name)s'
)
but I receive this error:
cannot resolve reference countries in auth_user definition
Can anyone help me? How can I link the auth_user table with another table?
All the best
You need to make sure your db.define_table call runs before the auth tables are defined, like this:
db.define_table('bank',
    Field('name'),
    format='%(name)s')

auth.settings.extra_fields['auth_user'] = [
    Field('bank', 'reference bank',
          label=T('Bank'),
          notnull=True,
          required=True,
          requires=IS_IN_DB(db, db.bank.id, '%(name)s')),
]

auth.define_tables(username=True, signature=True)
custom_auth_table = db[auth.settings.table_user_name]
auth.settings.table_user = custom_auth_table

How to generate list of tables for DB using RoseDB

I have to list the tables for a given database using Rose::DB. I know the MySQL command for it:
SHOW TABLES IN db_name;
How do I implement this with Rose::DB? Please help.
It's not really a Rose::DB-specific question. Simply use the database handle how you would normally in DBI:
package My::DB {
    use Rose::DB;
    our @ISA = qw(Rose::DB);

    My::DB->register_db(
        domain => 'dev',
        type   => 'main',
        driver => 'mysql',
        ...
    );

    My::DB->default_domain('dev');
    My::DB->default_type('main');
}
use Carp;

my $db  = My::DB->new();
my $sth = $db->dbh->prepare('SHOW TABLES');
$sth->execute || croak "query failed";
while (my $row = $sth->fetchrow_arrayref) {
    print "$row->[0]\n";
}

Titanium: How to check whether a table exists in a database?

How can I check whether a specific table exists in my database before executing a query?
Example: I want to check whether the Detail table exists in InfoDB.
I want to do something like:
var createDB = Titanium.Database.open('InfoDB');
if(Detail exists in InfoDB)
then
var rs = createDB.execute('SELECT * FROM Detail');
Thanks...
Try this:
var createDB = Titanium.Database.open('InfoDB');
var result = createDB.execute('SELECT name FROM sqlite_master WHERE type="table" AND name="your_table_name"');
if (result.isValidRow()) {
    // table found
    var rs = createDB.execute('SELECT * FROM Detail');
}
result.close();
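The sqlite_master lookup is not Titanium-specific; the same existence check works in any SQLite client. A quick sketch using Python's stdlib sqlite3, with table names invented for the demo:

```python
import sqlite3

def table_exists(conn, name):
    # sqlite_master lists every table, index, view, and trigger in the file;
    # filtering on type = 'table' and the name gives a cheap existence check.
    row = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' AND name = ?",
        (name,),
    ).fetchone()
    return row is not None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Detail (id INTEGER PRIMARY KEY)")
```

With this setup, table_exists(conn, "Detail") returns True while table_exists(conn, "Missing") returns False, so you can guard the SELECT without relying on a thrown error.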
Solved! I used an alternate way: try...catch instead.
var createDB = Titanium.Database.open('InfoDB');
try {
    var rs = createDB.execute('SELECT * FROM Detail');
} catch (err) {
    alert(err);
}
If you would rather skip the check when creating a table:
db.execute('CREATE TABLE IF NOT EXISTS Detail (..columns..)')
or when dropping a table:
db.execute('DROP TABLE IF EXISTS Detail')
This avoids "table already exists" and "no such table" errors.