Error creating table in Azure Storage Explorer - azure-storage

I am getting the error
"Type error: Cannot read property 'statuscode' null at process response callback"
while creating a table in Microsoft Azure Storage Explorer when using an external storage account, while locally the table is created perfectly. Please help.

Please check whether your network connection is stable. The error message indicates that the response status could not be read from Azure Storage, which usually points to a network problem. If the network is fine, please try to create the storage table again. If the issue still reproduces, try to troubleshoot in the following ways:
Download the latest version of the Microsoft Azure Storage tools.
Create another storage account and try again.
Try it from another computer to rule out a local issue.
If none of these help, the SDK sketch below can help isolate whether the problem is Storage Explorer itself or the account and connection.
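As a cross-check (not part of the original answer), here is a minimal Python sketch that creates a table directly with the azure-data-tables SDK, assuming you have the external account's connection string; if this succeeds while Storage Explorer still fails, the problem is in the tool rather than in the account or the network:

```python
# Minimal sketch: create a table directly with the Azure Tables SDK.
# Assumes: pip install azure-data-tables, and a valid connection string for the
# external storage account (the placeholder below is hypothetical).
from azure.data.tables import TableServiceClient
from azure.core.exceptions import ResourceExistsError, HttpResponseError

CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=<account>;"
    "AccountKey=<key>;EndpointSuffix=core.windows.net"
)

service = TableServiceClient.from_connection_string(CONNECTION_STRING)

try:
    # Same operation Storage Explorer performs when you click "Create Table".
    service.create_table("mytesttable")
    print("Table created - account and network are reachable.")
except ResourceExistsError:
    print("Table already exists - connectivity is fine.")
except HttpResponseError as err:
    # If this fails too, the problem is the account or network, not Storage Explorer.
    print("Request failed:", err)
```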

Related

I am trying to create a table in sandbox and continue to receive this error: Unexpected error Tracking number: c7749408936598037

I am new to BigQuery and am doing the Data Analytics Certificate through Google. I am trying to use data from that course to create a table and keep receiving this error. I have tried with multiple data sets, uploading them as CSV files with auto-detect schema:
Unexpected error
Tracking number: c7749408936598037
If you are using a public dataset, you cannot create a table in it.
You need your own GCP account; try the free trial account if you have not done so already.
I'm working on the Google Analytics Certificate as well (using the Sandbox for BigQuery). I found that the error pops up, but if I close out of the table creation window and refresh the page, the table is included under the dataset.
I'm not sure what is causing this, but I was able to load the CSV files I have needed so far. Hope this works for you as well!
As stated above, simply close the "Create Table" window, confirm that is what you want to do, refresh the page in your browser, and look under the dataset: the table you created will actually be there.
Weird bug.
I think you should try dropping the .csv file into Google Cloud Storage first.
Create Table --> Create table from Google Cloud Storage --> Select your source path
Then try to browse to your project name.
If you still get the error, upload the file to a Google spreadsheet instead and then browse to the project name.
I did the same thing and it's working. (A scripted version of the Cloud Storage route is sketched below.)
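For reference (not part of the original answer), loading the CSV from Cloud Storage can also be done outside the console UI. This is a minimal Python sketch using the google-cloud-bigquery client; the project, bucket, file, and table names are hypothetical placeholders, and it assumes the client library is installed and authenticated:

```python
# Minimal sketch: load a CSV from Google Cloud Storage into a BigQuery table.
# Assumes: pip install google-cloud-bigquery, and application-default credentials
# (e.g. `gcloud auth application-default login`). Names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.course_data"   # project.dataset.table
source_uri = "gs://my-bucket/course_data.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    autodetect=True,       # same as "auto detect schema" in the console
    skip_leading_rows=1,   # skip the CSV header row
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()          # wait for the load job; raises on failure

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}.")
```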

How to delete a corrupt database from azure sql

I have tried to delete a SQL database from the Azure portal. It looks like it has failed part way through. The database doesn't show up under the list of SQL servers in the Azure portal. However, if I log in to the server through SSMS, it is still there. I now can't delete the database or create a new one with that name.
I've tried deleting the database with a query and get an error saying the database doesn't exist. If I try to create it, either from the Azure portal or SSMS, I get an error saying it already exists.
I had a similar problem once with SSL settings, where Azure reported that the resource was still linked to the app even though it wasn't, so I was not able to delete it. After a couple of weeks of back and forth with support, we removed it through Azure Resource Explorer (https://resources.azure.com).
How to:
Once you are logged in, set the mode to Read/Write.
Find your resource in the search box.
Click Actions (POST/DELETE); these should be available now since you have set Read/Write.
Click Delete.
Hopefully this helps anyone who has corrupted resources in Azure. The same delete can also be issued programmatically, as sketched below.
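As an alternative to Resource Explorer (not part of the original answer), the same management-plane DELETE can be issued from code. A minimal Python sketch using the azure-identity and azure-mgmt-sql packages, with hypothetical subscription, resource group, server, and database names:

```python
# Minimal sketch: delete an Azure SQL database through the management plane,
# which is the same ARM DELETE that Resource Explorer sends.
# Assumes: pip install azure-identity azure-mgmt-sql, and that you are signed in
# (e.g. `az login`). All names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"
credential = DefaultAzureCredential()

sql_client = SqlManagementClient(credential, subscription_id)

poller = sql_client.databases.begin_delete(
    resource_group_name="my-resource-group",
    server_name="my-sql-server",
    database_name="corrupt-db",
)
poller.result()  # block until ARM finishes the delete (raises if it is rejected)
print("Database resource deleted.")
```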

How to upload BACPAC file to Blob in Azure

So frustrating... Microsoft does not offer help and you have to figure out every small thing in the portal...
My question is: I am trying to import my DB to Azure. I created a BACPAC file from the .bak, and when I try to import there's the "*Storage" field; when I click on "Configure required settings" it opens the "Storage Accounts" tab but gives the message "Not found".
Here's the thing: I created the storage account, the container, and the blob, but the import just doesn't find it. So annoying. And the Azure portal doesn't say how to create them or how to upload the file.
Also, when I create the storage account and the blob, there's no option to upload the BACPAC file. How do I do it? What's going on with Microsoft?? So unclear...
To summarize:
How do I create a storage account/container/blob so that the DB import will see it?
How do I upload the BACPAC to Azure Blob Storage? I couldn't find a way to do that in the portal.
Thanks!!
Not 100% following your example, but if you have a later version of SQL Server Management Studio, you can right-click the database you want to deploy and click 'Tasks > Export Data-tier Application'. In there you can connect to a storage account: when you press Connect, you will have to type the name of the account from the portal and provide a key. You can then export directly to storage, and once you connect to the Azure instance you can right-click the Databases node and choose 'Import Data-tier Application'. If you need to create a storage account in the first place, you do that in the portal, but it sounds like you already have one.
If you want to browse your storage and drop a file in directly, I use Azure Storage Explorer. There are various tools out there, some free and some not, that do this. You could of course code your own interface, as the APIs are published.
Think of the portal as the administration of your subscription. When you want to use the services (not configure them), you'll need to look at the toolsets.
Azure Storage Explorer makes navigating and uploading/downloading files in storage accounts super simple. Visual Studio works well too, and for DB workloads SQL Server Management Studio has Azure integration. If all else fails, PowerShell gives you the finest level of control.
1 - Create an Azure storage account
2 - Create a blob container in the new storage account
3 - Access the container and upload your BACPAC file
4 - Access the SQL server, go to "Import database" and use the "Select the backup" link to point to the BACPAC file in the blob container that you want to use to create the database (a scripted version of steps 2 and 3 is sketched below)
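For completeness (not part of the original answer), steps 2 and 3 can also be scripted. A minimal Python sketch using the azure-storage-blob package; the connection string, container, and file names are hypothetical placeholders:

```python
# Minimal sketch: create a container and upload a BACPAC file to Azure Blob Storage.
# Assumes: pip install azure-storage-blob, and the storage account's connection
# string copied from the portal. Names and paths below are hypothetical.
from azure.storage.blob import BlobServiceClient
from azure.core.exceptions import ResourceExistsError

CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=<account>;"
    "AccountKey=<key>;EndpointSuffix=core.windows.net"
)

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)

# Step 2: create the container (ignore the error if it already exists).
try:
    service.create_container("bacpacs")
except ResourceExistsError:
    pass

# Step 3: upload the BACPAC as a blob.
blob = service.get_blob_client(container="bacpacs", blob="mydatabase.bacpac")
with open("mydatabase.bacpac", "rb") as data:
    blob.upload_blob(data, overwrite=True)

# The portal's "Import database" dialog can now point at this blob.
print("Uploaded:", blob.url)
```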

Unable to deploy database to Azure

I created an MS SQL database in SSMS 2012, connected successfully to Azure, and am trying to deploy the DB to the cloud.
I am encountering the following errors (please see the screenshot):
Numerous "Unsupported property" errors: not supported when used as part of a data package.
You're likely using a feature that is not supported in Azure SQL Database. Please refer to this list of unsupported features to help you pinpoint the problem:
http://msdn.microsoft.com/en-us/library/azure/ff394115.aspx
This happened to me too. In my case, I had changed the schema of a table after creating it for the first time. After deleting that table, the database deployed correctly. Usually this error occurs when schema validation fails.
Regards,
Manoj Bojja

Azure Sql Reporting Service external Image

We are working with SQL Azure Reporting Services and have a situation where we need to display a client logo on a report. We pass the image path (URL) as a parameter to the report, which works fine on normal Windows Server Reporting Services, but when we move to SQL Azure Reporting it fails to show the image on the report; e.g. the image path can be like "http://p.lui.li/img-30718_images_j-r-full.jpg". Any help will be highly appreciated.
From what I can tell, this is currently not supported in Azure SQL Reporting. I tried it in a sample report and couldn't get it to work even when using Azure Blob storage. As an alternative you can upload the image to the reporting server, but external linking is still not implemented. I would vote here:
http://www.mygreatwindowsazureidea.com/forums/169380-business-analytics-sql-reporting/suggestions/2395600-support-external-image-in-reporting-services
This is similar to the Azure CORS problem that is finally going to be fixed soon. The nice thing is that when they finally fix it, you will get notified if you voted for it.
I solved this issue by moving the image into the Azure SQL database and then populating the report image from the database instead of from the URL; a sketch of loading the image into the database follows.
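To illustrate that database route (not part of the original answer), here is a minimal Python sketch that stores the logo as VARBINARY(MAX) so the report can bind its image to a database field; the connection string, table, and file names are hypothetical, and it assumes pyodbc with an ODBC Driver for SQL Server installed:

```python
# Minimal sketch: store a client logo in an Azure SQL database as VARBINARY(MAX)
# so a report can use a database-sourced image instead of an external URL.
# Assumes: pip install pyodbc, ODBC Driver 18 for SQL Server, and hypothetical
# server/database/table/file names.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=mydb;Uid=myuser;Pwd=<password>;Encrypt=yes;"
)

with open("client_logo.jpg", "rb") as f:
    logo_bytes = f.read()

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()

# One-time setup: a table that holds one logo per client (hypothetical schema).
cursor.execute(
    "IF OBJECT_ID('dbo.ClientLogos') IS NULL "
    "CREATE TABLE dbo.ClientLogos (ClientId INT PRIMARY KEY, Logo VARBINARY(MAX))"
)
cursor.execute(
    "INSERT INTO dbo.ClientLogos (ClientId, Logo) VALUES (?, ?)",
    1, pyodbc.Binary(logo_bytes),
)
conn.commit()
conn.close()
```

In the report itself, the image's source would then be set to "Database" and bound to the Logo field with the appropriate MIME type.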