I back up around 100 databases daily to BACPAC files using the AzureRM module for Windows PowerShell.
For some reason, 20 of these databases have started to throw a strange error:
Could not export schema and data from database. One or more errors occurred. One or more errors occurred. One or more errors occurred. One or more errors occurred. One or more errors occurred. Failed to convert parameter value from a Int16 to a DateTime. Invalid cast from 'Int16' to 'DateTime'.
This issue started about a week ago, always with the same 20 databases. I tried performing the backup with the Az module instead of AzureRM, and also through the Azure Portal, but the same error is shown.
I think it's a bug in the Azure cmdlets, because Int16 isn't a data type in Azure SQL.
Please help; I need to back up all databases daily.
Please check that your source and destination tables have the same data types.
It sounds like you might have a column set to Int16 (smallint) on the source and DateTime on the server.
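As a diagnostic sketch (not a confirmed fix), assuming you have direct query access to one of the failing databases, you can list every smallint column and compare each against the corresponding column the export expects:

-- List all smallint (Int16) columns so they can be checked against
-- any destination columns that expect a datetime value.
SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE DATA_TYPE = 'smallint'
ORDER BY TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME;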
I need to make a backup of my SQL Server database. When I try, I get this error:
System.Data.SqlClient.SqlError: Read on "c:..." failed: 23(...)(Data error (cyclic redundancy error))
Now, I'm trying to run this command:
DBCC CheckDB ('MYDATABASE') WITH NO_INFOMSGS, ALL_ERRORMSGS
But I get this error:
Msg 8921, Level 16, State 1, Line 18
Check terminated. A failure was detected while collecting facts. Possibly tempdb out of space or a system table is inconsistent. Check previous errors.
What can I do? I just need to make a backup.
I'm using Microsoft SQL Server Management Studio.
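Since Msg 8921 explicitly mentions tempdb running out of space, one quick sanity check before retrying CHECKDB is to see how much free space the tempdb files actually have. A minimal sketch:

-- Report size and free space (in MB) for each tempdb file;
-- FILEPROPERTY('SpaceUsed') returns the count of used 8 KB pages.
USE tempdb;
SELECT name,
       size / 128.0 AS size_mb,
       size / 128.0 - CAST(FILEPROPERTY(name, 'SpaceUsed') AS int) / 128.0 AS free_mb
FROM sys.database_files;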
First of all, check the service account used for the SQL Server instance in Services.
Ensure the service account has sufficient read/write permissions on the exact backup location on the physical disk.
Ensure the user you are using to log in to the SQL instance has sufficient permissions to perform a backup.
As a final option to recover the data, create another blank database with the same tables on a different machine in a different SQL instance, then export the whole database to the new one using Management Studio (right-click the database > Tasks > Export Data).
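One more option worth knowing, though it isn't mentioned above: if the goal is simply to salvage any backup from a disk throwing CRC errors, the BACKUP statement supports CONTINUE_AFTER_ERROR, which lets the backup proceed past damaged pages. A minimal sketch, using the database name from the question and a hypothetical target path:

-- Emergency backup that does not stop on read/checksum failures;
-- restore it carefully and run CHECKDB on the restored copy.
BACKUP DATABASE MYDATABASE
TO DISK = N'D:\Backups\MYDATABASE_emergency.bak'
WITH CONTINUE_AFTER_ERROR, INIT;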
I experienced a strange error while loading a CSV file (comma-separated values) containing ~1.2 million rows into a SQL Server 2016 columnstore table using an SSIS package. I get the following error rarely, especially on datetime columns.
After I simply restarted the failed ETL, it worked fine. We load the same file in different environments, and the error appeared in only one environment on the same day.
I have added an error output and will wait to see what happens next time. Meanwhile, I would like to ask the experts whether there is any known issue with SQL columnstore tables or SSIS when loading datetime values.
But the error occurs while inserting the data, so it could be more of a database-side issue.
PackageName > Sequence Container > DFL Transactions. SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Invalid date format".
PackageName > Sequence Container > DFL Transactions. There was an error with DST Transactions.Inputs[OLE DB Destination Input].Columns[Account_FromDate] on DST Transactions.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed the specified type.".
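"Conversion failed because the data value overflowed the specified type" on a date column usually means a value outside the target type's range (for example, datetime only covers 1753-01-01 through 9999-12-31). A sketch for finding offending rows, assuming the file is first landed in a hypothetical staging table that still holds the date as text:

-- Rows whose Account_FromDate text does not fit in datetime;
-- TRY_CONVERT returns NULL instead of raising an error.
SELECT Account_FromDate
FROM dbo.StageTransactions
WHERE Account_FromDate IS NOT NULL
  AND TRY_CONVERT(datetime, Account_FromDate) IS NULL;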
I'm having trouble on an Azure SQL Database where I'm trying to read the DB audit logs.
Both functions, sys.fn_get_audit_file and sys.fn_xe_file_target_read_file, should be able to read a file.
But whatever I do, I get empty result sets. Even if I specify a non-existent file, I receive a table with zero records instead of an error.
So I'm afraid it's something else.
My login is in the db_owner role.
Any suggestions?
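For reference, a call of the sort presumably being attempted; the storage account, server, and path here are hypothetical placeholders:

-- Read audit records straight from blob storage; a wildcard
-- pattern is allowed in the first argument.
SELECT event_time, action_id, succeeded, statement
FROM sys.fn_get_audit_file(
    'https://mystorage.blob.core.windows.net/sqldbauditlogs/myserver/mydb/SqlDbAuditing_Audit/*.xel',
    DEFAULT, DEFAULT);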
I found that I could only read XEL files by using the same server and same database context that they were created for. So for example, consider the following scenario:
ServerA is the Azure Synapse instance I was creating the audit XEL files from, all related to DatabaseA
ServerB is a normal SQL instance that I want to read the XEL files on
Test 1:
Using ServerB, try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 2:
Using ServerB, download the XEL files locally, and try to read from the local copy
Result: 0 rows returned, no error message
Test 3:
Using ServerA, with the current DB = 'master', try to read file directly from blob storage
Result: 0 rows returned, no error message
Test 4:
Using ServerA, with the current DB = 'DatabaseA', try to read file directly from blob storage
Result: works perfectly
Because I really wanted to read the files from ServerB, I also tried a CREATE CREDENTIAL there that was able to read and write to my blob storage account. Unfortunately, that didn't make any difference; a repeat of Test 1 gave the same result as before.
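For completeness, that credential attempt was presumably of this shape (the container URL and SAS token are placeholders); this is the server-level credential form used for BACKUP TO URL, named after the container path:

-- Server-level credential keyed to the container URL; the secret is a
-- SAS token scoped to the audit log container (value elided here).
CREATE CREDENTIAL [https://mystorage.blob.core.windows.net/sqldbauditlogs]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=...';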
Situation: I have created an SSIS package that has an OLE DB source, which is an Oracle database, and an OLE DB destination, which is SQL Server. I use the [Oracle Provider for OLE DB] connection.
Problem: when I execute the package, it runs short and returns only 220,000 out of 4 million records. The package runs with no errors and no warnings; it completes successfully but will not go past 220,000 records. I found one similar problem on this site, but it pointed to a date format issue, and this table has no date data types in its structure.
Troubleshooting so far:
I extracted the table as a flat file and ran the package to the same destination table; this runs fine. All 4 million records load from the flat file to the destination OK.
I have tried running the package as a fast load and as a normal load - no change.
I have tried different buffer combinations and Auto Adjust Buffer Size - no change.
I have uninstalled and reinstalled VS, Oracle 12c, and SSDT - no change.
I thought maybe it could be a memory or size issue, but no luck; I load many other tables that are larger memory-wise. (A quick way to double-check the source and destination counts is sketched below.)
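As a hedged diagnostic (ORACLE_LNK, SRC_SCHEMA.BIG_TABLE, and dbo.BigTable are all placeholder names), compare the row count Oracle reports with what actually landed in SQL Server, to confirm the shortfall is real rather than a counting artifact:

-- Row count on the Oracle side, via a linked server pass-through query.
SELECT cnt AS source_rows
FROM OPENQUERY(ORACLE_LNK, 'SELECT COUNT(*) AS cnt FROM SRC_SCHEMA.BIG_TABLE');

-- Row count on the SQL Server destination side.
SELECT COUNT(*) AS destination_rows
FROM dbo.BigTable;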
Environment Specs:
VS V 15.9.14
Oracle Developer Tools for Visual Studio 12.2.0.1.0
SSDT 15.1.61906.3120
SSIS 15.0.1301.433
SQL Server 2016 13.0.4 - SP1
Has anyone dealt with something like this? What are some things I could try or look into?
Thanks!
We currently have a pipeline running in our production environment that has an activity that copies data from an on-prem SQL database to an Azure SQL database. This pipeline is replicated in the dev and QA environments but doesn't fail in those environments. I wanted to get a bit more insight into what this error means.
Message=A database operation failed with the following error: 'PdwManagedToNativeInteropException ErrorNumber: 46724,
"PDW" is short for Parallel Data Warehouse and suggests you might be using the MPP product Azure SQL Data Warehouse, rather than a SQL DB as you mentioned. Is that correct?
This error appears when the defined size of a column, such as a varchar length or an int, is overflowed by the incoming data.
Try increasing the size of the affected data types and columns, then rerun the pipeline.
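For instance, widening a destination column would look like this (the table and column names are hypothetical):

-- Widen the destination column so incoming values no longer overflow it.
ALTER TABLE dbo.Transactions
ALTER COLUMN Description varchar(2000) NULL;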
I recreated the pipeline and fixed it in my Data Factory.