Azure SqlExceptions - sql

I have a program that uploads about 1 GB of data to a SQL Azure database.
I use SqlBulkCopy to upload this data. I upload about 8,000,000 entities in total, on average 32,000 entities at a time, with a maximum of about 1,200,000 in a single batch.
I am receiving a lot of SqlExceptions with error code 4815.
At first I thought this might be because I was uploading too many entities at a time and Azure was throttling my connection or employing DDoS defenses, but I changed my program to submit only 25,000 entities with each SqlBulkCopy, and I got even more errors! A lot more!
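For context, a minimal sketch (not the asker's actual code; the table name and connection string are hypothetical) of what this upload pattern typically looks like with SqlBulkCopy, with an explicit batch size and timeout:

    // Minimal sketch of the batching pattern described above; names are hypothetical.
    // BatchSize makes each chunk commit separately instead of one huge transaction,
    // and BulkCopyTimeout raises the default 30-second limit.
    using System.Data;
    using System.Data.SqlClient;

    static void UploadBatch(DataTable batch, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Entities"; // hypothetical target table
                bulkCopy.BatchSize = 25000;                     // rows committed per batch
                bulkCopy.BulkCopyTimeout = 600;                 // seconds
                bulkCopy.WriteToServer(batch);
            }
        }
    }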

I have had good results using BCP to move large amounts of data into SQL Azure. The SQL Azure migration wizard uses this approach behind the scenes. This blog post is a bit dated, but the concepts are sound when it comes to importing a lot of data:
Brute Force Migration of Existing SQL Server Databases to SQL Azure
The question does not specify the source of the data, so obviously this will not work for you if you are not importing from another database.
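As a rough illustration (this is my sketch, not code from the linked post; server, table, and credential values are placeholders), the approach amounts to exporting with bcp in native format and importing with a bounded batch size, which you can also drive from .NET:

    // Rough sketch (placeholder names): export from the source SQL Server in native
    // format, then import into SQL Azure, committing every 10,000 rows (-b).
    using System.Diagnostics;

    static void RunBcp()
    {
        // bcp out: dump the source table to a native-format data file.
        Process.Start("bcp",
            "SourceDb.dbo.Orders out orders.dat -n -S source-server -T")
            .WaitForExit();

        // bcp in: load the file into SQL Azure with a batch size of 10,000 rows.
        Process.Start("bcp",
            "TargetDb.dbo.Orders in orders.dat -n -b 10000 " +
            "-S yourserver.database.windows.net -U youruser@yourserver -P <password>")
            .WaitForExit();
    }

The same two commands can of course be run directly from a command prompt; wrapping them in Process.Start is only useful if you want to script the migration from .NET.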

In my case, I got a 4815 when the data I was sending in one of the fields was larger than the field size in the table definition... sending 13 characters into a VARCHAR(11).
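If it helps, one way to track down the offending value before the copy is to compare your in-memory rows against the destination's declared column sizes. A sketch, assuming a DataTable named batch whose column names match a hypothetical dbo.Entities target:

    // Sketch: list values that would overflow the destination columns (error 4815).
    // "batch", "connectionString", and the table name are assumptions for illustration.
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    var columnSizes = new Dictionary<string, int>();
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        var cmd = new SqlCommand(
            @"SELECT COLUMN_NAME, CHARACTER_MAXIMUM_LENGTH
              FROM INFORMATION_SCHEMA.COLUMNS
              WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'Entities'
                AND CHARACTER_MAXIMUM_LENGTH > 0", connection);
        using (var reader = cmd.ExecuteReader())
            while (reader.Read())
                columnSizes[reader.GetString(0)] = reader.GetInt32(1);
    }

    foreach (DataRow row in batch.Rows)
        foreach (var column in columnSizes)
            if (batch.Columns.Contains(column.Key) &&
                row[column.Key] is string value && value.Length > column.Value)
                Console.WriteLine(
                    $"{column.Key}: {value.Length} chars, column allows {column.Value}");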

Related

What is the best way to copy a large SQL database from Azure Managed Instance to an Azure single database?

Hello folks, first post on Stack; by the way, this is a wonderful community and it helps out a lot.
As mentioned in the title, what is the best way to copy such a large database? We have a roughly 500 GB database, and I am currently moving it from a managed instance to an Azure single database using SSMS ("Deploy Database to Microsoft Azure SQL Database"), which is taking 22 hours right now. I feel like I'm back in the early '20s.
It's all in the same subscription and the same network configuration. As far as I know, the process is that SSMS creates a BACPAC file and then imports it into the single database, but 16 hours is just too long. So do you know of a better option to do this more quickly? I have a lot more databases to copy, some of them even larger.
Have you considered using an ETL tool such as Azure Data Factory? It has good performance for migrating large amounts of data; see the copy performance reference table in its documentation.
It supports both Azure SQL Database and Azure SQL Managed Instance. See these tutorials:
Copy and transform data in Azure SQL Database by using Azure Data Factory
Copy and transform data in Azure SQL Managed Instance by using Azure Data Factory
It may cost some money, but it saves a lot of time. As we all know, time is money.
HTH.

How to increase Oracle SQL database or web service performance?

I have been given a task to improve the performance of an Oracle SQL database or of the web service that uses it. The web service requires billions of rows of data from the Oracle database and needs to load all of that data on each startup. The data is mostly read-only and very rarely needs to be updated or written.
It is a very old codebase, which is why the solution was built to load all of the data into memory to improve performance, and that is now slowing down development: the first launch takes 30+ minutes, and if for some reason the in-memory cached data becomes corrupted, I have to reload it from the database, which means another 30+ minutes of waiting.
My task is to improve this process. I have the flexibility to change the SQL database to something else that could help speed this up. Do you have any suggestions? Thanks in advance!
You can try MySQL. To my knowledge, MySQL has no limitation on the size of the database. I've attached a comparison between MySQL and Oracle that you can look at: Comparison

How to insert data from one Azure SQL Database into a different Azure SQL Database?

I realize that Azure SQL Database does not support doing an insert/select from one db into another, even if they're on the same server. We receive data files from clients and we process and load them into a "load database". Once the load is complete, based upon various rules, we then move the data into a production database of which there are about 20, all clones of each other (the data only goes into one of the databases).
We are looking for a solution that will allow us to move the data. There can be 500,000 records in a load file, so moving them one by one is not really feasible.
Have you tried Elastic Query? Here is the Getting Started guide for it. Currently you cannot perform remote writes, but you can always read data from remote tables.
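To illustrate roughly what that looks like in practice (all object names, columns, and credentials below are hypothetical, not taken from the question): because the external table is read-only, you define it in the production database pointing back at the load database, then do a local INSERT ... SELECT.

    // Rough Elastic Query sketch, executed against the production database.
    // All names, credentials, and the connection string are hypothetical.
    using System.Data.SqlClient;

    string[] statements =
    {
        "CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>'",

        "CREATE DATABASE SCOPED CREDENTIAL LoadDbCred " +
        "WITH IDENTITY = '<sql login>', SECRET = '<password>'",

        @"CREATE EXTERNAL DATA SOURCE LoadDbSource WITH (
            TYPE = RDBMS,
            LOCATION = 'yourserver.database.windows.net',
            DATABASE_NAME = 'LoadDatabase',
            CREDENTIAL = LoadDbCred)",

        @"CREATE EXTERNAL TABLE dbo.LoadRecords (
            Id INT, ClientId INT, Payload NVARCHAR(400))
          WITH (DATA_SOURCE = LoadDbSource)",

        // Remote writes are not supported, so the write happens on the local table.
        "INSERT INTO dbo.ProductionRecords (Id, ClientId, Payload) " +
        "SELECT Id, ClientId, Payload FROM dbo.LoadRecords"
    };

    using (var connection = new SqlConnection(productionConnectionString))
    {
        connection.Open();
        foreach (var sql in statements)
            using (var command = new SqlCommand(sql, connection))
                command.ExecuteNonQuery();
    }

In practice the DDL above would normally be run once (for example from SSMS), with only the final INSERT ... SELECT repeated for each load.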
Hope this helps!
Silvia Doomra

What file format can be used to save/access data instead of a database?

There is a situation in my company where we are developing a lightweight .NET web application with as few dependencies as possible. The application will be hosted on the client's server; however, there will not be any internet connection, and they will use the application locally.
We do not want any type of database installation on the client machine; we want to keep the client side as simple as possible. For this purpose we want to save and access data in a file, as the data on the client side will not exceed 100,000 rows. We are also concerned about the speed of accessing the data.
So I want to ask: how should the data be saved in a file so that it can be accessed quickly, and what should the file format be?
Can I use a database file that does not require any database installation on the client side?
You could save all the data to a JSON file, but this will become increasingly slow and prone to corruption as the data grows.
Also, have a look at SQLite.
You can try SQL Server Compact Edition or SQLite. Both are file-based solutions and fit your needs.
The advantage of using these two is that you can perform almost all normal database queries against them, and data retrieval will be very fast. You can also optimize the data storage, create tables, and so on.
You can use SQLite, which is heavily used in such scenarios (among others, it is used by Chrome and Firefox). It is even public domain, so there are no license costs.
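For example, a minimal sketch using the Microsoft.Data.Sqlite package (the file name and table are only examples): the whole database lives in a single file next to the application, with nothing to install on the client.

    // Sketch: a single-file SQLite database, no server or installation required.
    // Assumes the Microsoft.Data.Sqlite NuGet package; all names are examples only.
    using System;
    using Microsoft.Data.Sqlite;

    using (var connection = new SqliteConnection("Data Source=appdata.db"))
    {
        connection.Open();

        var create = connection.CreateCommand();
        create.CommandText =
            "CREATE TABLE IF NOT EXISTS Customers (Id INTEGER PRIMARY KEY, Name TEXT NOT NULL)";
        create.ExecuteNonQuery();

        var insert = connection.CreateCommand();
        insert.CommandText = "INSERT INTO Customers (Name) VALUES ($name)";
        insert.Parameters.AddWithValue("$name", "Contoso");
        insert.ExecuteNonQuery();

        var select = connection.CreateCommand();
        select.CommandText = "SELECT Id, Name FROM Customers";
        using (var reader = select.ExecuteReader())
            while (reader.Read())
                Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
    }

With an index on whatever column you filter by, lookups should stay fast at the 100,000-row scale you describe.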

GAE Datastore Large Amounts of Data

Background:
I'm working on a project that's starting out with a large SQL dump that I have to import into a new database. This dump is about 1.5 GB of plain text, so quite a lot of information. My client wants me to use Google App Engine and its Datastore, which (a) I'm not so fond of and (b) doesn't really play well with SQL dumps. Before I go through the trouble of making that happen...
Question:
What is a cloud-hosted database solution that can efficiently handle large quantities of data (and ideally is lower-cost)? In particular, which database solution could I just import my SQL dump into as-is?
Does your client have any particular reason to use the Datastore? If you already have the SQL dump, I think it would be easier to use Google Cloud SQL from GAE.