Cross database query, looping through databases - LINQPad

Since I have a lot of databases, I'd like to loop through them, executing a LINQ query/update on each of them.
Is it possible to do something like this?
foreach (var r in master.sysdatabases)
{
    from b in r.chicken
    select b.age;
}
I have the premium edition for cross database support.

If all your databases have an identical "chicken" table, you can do this:
var builder = new SqlConnectionStringBuilder (Connection.ConnectionString);
foreach (var db in sys.Databases)
{
    builder.InitialCatalog = db.Name;
    var dc = new TypedDataContext (builder.ToString());
    try
    {
        var query =
            from b in dc.Chickens
            select b.Age;
        query.Dump();
    }
    catch { ... }
}

I was able to solve it like this, but I honestly don't like using string-concatenated queries.
var r = (from b in Sysdatabases select b.Name).ToList();
foreach (var i in r)
{
    try
    {
        var o = ExecuteQuery<string>("select urls from " + i + ".dbo.website_setting");
        Console.WriteLine(o);
    }
    catch (Exception) { }
}
The try/catch around the query is there for databases where the table does not exist (the master database, etc.).
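If you'd rather not rely on catching exceptions, one possible variation (still string-built SQL, so only a sketch) is to ask each database whether the table exists before querying it. The website_setting table and urls column come from the snippet above, and the INFORMATION_SCHEMA check is an assumption about your setup:
var dbNames = (from b in Sysdatabases select b.Name).ToList();
foreach (var dbName in dbNames)
{
    // Only query databases that actually contain the table (assumed check via INFORMATION_SCHEMA).
    var hasTable = ExecuteQuery<int>(
        "select count(*) from [" + dbName + "].INFORMATION_SCHEMA.TABLES " +
        "where TABLE_NAME = 'website_setting'").First() > 0;
    if (!hasTable) continue;

    foreach (var url in ExecuteQuery<string>(
        "select urls from [" + dbName + "].dbo.website_setting"))
    {
        Console.WriteLine(url);
    }
}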

Related

How to escape _ wildcard within Google app script sql?

The function that runs a Standard SQL query within the Apps Script throws an error when _ is used within the SQL. It is used in the condition filter to look for all names containing _x_. Backslashes break the Apps Script when used.
Within Google Apps Script: var sql1 = 'sql string';
Within SQL: WHERE lower(name) like "%\_x\_%"
Update: I managed to find a workaround using REGEXP_CONTAINS(LOWER(name), r"(_x_)") but am still interested to know whether it works with the regular LIKE clause.
I reproduced your case using modified sample code from the documentation.
I queried against a sample dataset using where ... like "%_", then wrote the results to a Google spreadsheet. The table I am querying in BigQuery is:
Row  id
1    _id_1212
2    id1212
The code I am using is below:
/**
 * Runs a BigQuery query and logs the results in a spreadsheet.
 */
function runQuery() {
  // Replace this value with the project ID listed in the Google
  // Cloud Platform project.
  var projectId = 'project_id';
  // Modified query.
  var request = {
    // It will also work for where id LIKE "%\_id\_%".
    query: 'SELECT * from `project_id.dataset.table` where id LIKE "%_id_%";',
    // Configure the query to use Standard SQL.
    useLegacySql: false
  };
  var queryResults = BigQuery.Jobs.query(request, projectId);
  var jobId = queryResults.jobReference.jobId;
  // Check on status of the Query Job.
  var sleepTimeMs = 500;
  while (!queryResults.jobComplete) {
    Utilities.sleep(sleepTimeMs);
    sleepTimeMs *= 2;
    queryResults = BigQuery.Jobs.getQueryResults(projectId, jobId);
  }
  // Get all the rows of results.
  var rows = queryResults.rows;
  while (queryResults.pageToken) {
    queryResults = BigQuery.Jobs.getQueryResults(projectId, jobId, {
      pageToken: queryResults.pageToken
    });
    rows = rows.concat(queryResults.rows);
  }
  if (rows) {
    var spreadsheet = SpreadsheetApp.create('BigQuery Results');
    var sheet = spreadsheet.getActiveSheet();
    // Append the headers.
    var headers = queryResults.schema.fields.map(function(field) {
      return field.name;
    });
    sheet.appendRow(headers);
    // Append the results.
    var data = new Array(rows.length);
    for (var i = 0; i < rows.length; i++) {
      var cols = rows[i].f;
      data[i] = new Array(cols.length);
      for (var j = 0; j < cols.length; j++) {
        data[i][j] = cols[j].v;
      }
    }
    sheet.getRange(2, 1, rows.length, headers.length).setValues(data);
    Logger.log('Results spreadsheet created: %s',
        spreadsheet.getUrl());
  } else {
    Logger.log('No rows returned.');
  }
}
The output:
id
_id_1212
Both where id LIKE "%_id_%" and where id LIKE "%\_id\_%" work when I set the query to use StandardSQL (useLegacySql: false).
In addition, the error GoogleJsonResponseException: API call to bigquery.jobs.query failed with error: Syntax error: Illegal escape sequence: \_ will be thrown when trying to escape the underscore using a double backslash such as where id LIKE "%\\_id\\_%".

RavenDB select and update multiple rows

I am trying to select and update multiple rows from RavenDB, but it keeps updating the same rows, namely the first 100; nothing changes.
Here is my code. How can I select some rows, update some fields on each of them, and repeat until my job is finished?
var currentEmailId = 100;
using (var session = store.OpenSession())
{
    var goon = true;
    while (goon)
    {
        var contacts = session.Query<Contacts>().Where(f => f.LastEmailId < currentEmailId).Take(100);
        if (contacts.Any())
        {
            foreach (var contact in contacts)
            {
                EmailOperation.Send(contact, currentEmailId);
                contact.LastEmailId = currentEmailId;
            }
            session.SaveChanges();
        }
        else
        {
            goon = false;
        }
    }
}
It's probably because you're doing a query immediately after saving changes, without letting the indexes update after SaveChanges; thus, you're getting back the same items. To fix that, you can tell SaveChanges to wait until the indexes are updated. Your code would look something like this:
var goon = true;
var currentEmailId = 100;
while (goon)
{
    using (var session = store.OpenSession())
    {
        var contacts = session.Query<Contacts>()
            .Where(f => f.LastEmailId < currentEmailId)
            .Take(100);
        if (contacts.Any())
        {
            foreach (var contact in contacts)
            {
                EmailOperation.Send(contact, currentEmailId);
                contact.LastEmailId = currentEmailId;
            }
            // Wait for the indexes to update when calling SaveChanges.
            session.Advanced.WaitForIndexesAfterSaveChanges(TimeSpan.FromSeconds(30), false);
            session.SaveChanges();
        }
        else
        {
            goon = false;
        }
    }
}
If you're updating many contacts at once, you may wish to consider using streaming query results combined with BulkInsert to update many Contacts en masse.
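For the streaming + bulk-insert route, the rough shape is sketched below. This assumes the RavenDB 4.x client API (session.Advanced.Stream and store.BulkInsert) and reuses the Contacts and EmailOperation names from the question, so treat it as an outline rather than drop-in code:
var currentEmailId = 100;
using (var session = store.OpenSession())
using (var bulkInsert = store.BulkInsert())
{
    var query = session.Query<Contacts>()
        .Where(f => f.LastEmailId < currentEmailId);

    // Stream avoids the normal query page-size limits; BulkInsert writes the
    // modified documents back without tracking them in the session.
    using (var stream = session.Advanced.Stream(query))
    {
        while (stream.MoveNext())
        {
            var contact = stream.Current.Document;
            EmailOperation.Send(contact, currentEmailId);
            contact.LastEmailId = currentEmailId;

            // Storing under the existing id overwrites the original document.
            bulkInsert.Store(contact, stream.Current.Id);
        }
    }
}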

Google BigQuery returns only partial table data with C# application using .NET Client Library

I am trying to execute a query (a basic select statement with 10 fields). My table contains more than 500k rows, but the C# application returns a response with only 4260 rows, whereas the Web UI returns all the records.
Why does my code return only partial data, and what is the best way to select all the records and load them into a C# DataTable? A code snippet would be very helpful.
using Google.Apis.Auth.OAuth2;
using System.IO;
using System.Threading;
using Google.Apis.Bigquery.v2;
using Google.Apis.Bigquery.v2.Data;
using System.Data;
using Google.Apis.Services;
using System;
using System.Security.Cryptography.X509Certificates;

namespace GoogleBigQuery
{
    public class Class1
    {
        private static void Main()
        {
            try
            {
                Console.WriteLine("Start Time: {0}", DateTime.Now.ToString());
                String serviceAccountEmail = "SERVICE ACCOUNT EMAIL";
                var certificate = new X509Certificate2(@"KeyFile.p12", "notasecret", X509KeyStorageFlags.Exportable);
                ServiceAccountCredential credential = new ServiceAccountCredential(
                    new ServiceAccountCredential.Initializer(serviceAccountEmail)
                    {
                        Scopes = new[] { BigqueryService.Scope.Bigquery, BigqueryService.Scope.BigqueryInsertdata, BigqueryService.Scope.CloudPlatform, BigqueryService.Scope.DevstorageFullControl }
                    }.FromCertificate(certificate));
                BigqueryService Service = new BigqueryService(new BaseClientService.Initializer()
                {
                    HttpClientInitializer = credential,
                    ApplicationName = "PROJECT NAME"
                });
                string query = "SELECT * FROM [publicdata:samples.shakespeare]";
                JobsResource j = Service.Jobs;
                QueryRequest qr = new QueryRequest();
                string ProjectID = "PROJECT ID";
                qr.Query = query;
                qr.MaxResults = Int32.MaxValue;
                qr.TimeoutMs = Int32.MaxValue;
                DataTable DT = new DataTable();
                int i = 0;
                QueryResponse response = j.Query(qr, ProjectID).Execute();
                string pageToken = null;
                if (response.JobComplete == true)
                {
                    if (response != null)
                    {
                        int colCount = response.Schema.Fields.Count;
                        if (DT == null)
                            DT = new DataTable();
                        if (DT.Columns.Count == 0)
                        {
                            foreach (var Column in response.Schema.Fields)
                            {
                                DT.Columns.Add(Column.Name);
                            }
                        }
                        pageToken = response.PageToken;
                        if (response.Rows != null)
                        {
                            foreach (TableRow row in response.Rows)
                            {
                                DataRow dr = DT.NewRow();
                                for (i = 0; i < colCount; i++)
                                {
                                    dr[i] = row.F[i].V;
                                }
                                DT.Rows.Add(dr);
                            }
                        }
                        Console.WriteLine("No. of records read: {0} # {1}", DT.Rows.Count.ToString(), DateTime.Now.ToString());
                        while (true)
                        {
                            int StartIndexForQuery = DT.Rows.Count;
                            Google.Apis.Bigquery.v2.JobsResource.GetQueryResultsRequest SubQR = Service.Jobs.GetQueryResults(response.JobReference.ProjectId, response.JobReference.JobId);
                            SubQR.StartIndex = (ulong)StartIndexForQuery;
                            //SubQR.MaxResults = Int32.MaxValue;
                            GetQueryResultsResponse QueryResultResponse = SubQR.Execute();
                            if (QueryResultResponse != null)
                            {
                                if (QueryResultResponse.Rows != null)
                                {
                                    foreach (TableRow row in QueryResultResponse.Rows)
                                    {
                                        DataRow dr = DT.NewRow();
                                        for (i = 0; i < colCount; i++)
                                        {
                                            dr[i] = row.F[i].V;
                                        }
                                        DT.Rows.Add(dr);
                                    }
                                }
                                Console.WriteLine("No. of records read: {0} # {1}", DT.Rows.Count.ToString(), DateTime.Now.ToString());
                                if (null == QueryResultResponse.PageToken)
                                {
                                    break;
                                }
                            }
                            else
                            {
                                break;
                            }
                        }
                    }
                    else
                    {
                        Console.WriteLine("Response is null");
                    }
                }
                int TotalCount = 0;
                if (DT != null && DT.Rows.Count > 0)
                {
                    TotalCount = DT.Rows.Count;
                }
                else
                {
                    TotalCount = 0;
                }
                Console.WriteLine("End Time: {0}", DateTime.Now.ToString());
                Console.WriteLine("No. of records read from the Google BigQuery service: " + TotalCount.ToString());
            }
            catch (Exception e)
            {
                Console.WriteLine("Error Occurred: " + e.Message);
            }
            Console.ReadLine();
        }
    }
}
This sample query gets its results from a public dataset. The table contains 164656 rows, but the response returns only 85000 rows the first time, so I have to query again to get the next set of results (I don't know whether this is the only way to get all of them).
This sample has only 4 fields, and even then it does not return all rows. In my case the table contains more than 15 fields, and I get a response of ~4000 rows out of ~10k, so I need to query again and again for the remaining results. Selecting 1000 rows takes up to 2 minutes with my approach, so I am looking for the best way to select all the records in a single response.
Answer from user Pentium10:
There is no way to run a query and get a large response in a single shot. You can either paginate the results, or you can create a job to export to files and then use the generated files in your app. Exporting is free.
Steps to run a large query and export the results to files stored on GCS:
1) Set allowLargeResults to true in your job configuration. You must also specify a destination table with the allowLargeResults flag.
Example:
"configuration":
{
"query":
{
"allowLargeResults": true,
"query": "select uid from [project:dataset.table]"
"destinationTable": [project:dataset.table]
}
}
2) Now your data is in the destination table you set. You need to create a new job and set the extract (export) configuration to export the table to file(s); a rough sketch using the .NET client follows after these steps. Exporting is free, but you need to have Google Cloud Storage activated to put the resulting files there.
3) In the end you download your large files from GCS.
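To make step 2 concrete with the .NET client library used elsewhere in this question, a rough sketch of such an extract job might look like the following. The bucket, project, dataset and table names are placeholders, the Service and ProjectID variables are assumed to be the same ones set up in the question's code, and the property names should be verified against your version of Google.Apis.Bigquery.v2:
// Sketch: export the destination table written by the query job to GCS.
Job extractJob = new Job
{
    Configuration = new JobConfiguration
    {
        Extract = new JobConfigurationExtract
        {
            // Table that the allowLargeResults query wrote to (placeholder names).
            SourceTable = new TableReference
            {
                ProjectId = "project",
                DatasetId = "dataset",
                TableId = "table"
            },
            // The wildcard lets BigQuery shard a large export into multiple files.
            DestinationUris = new[] { "gs://your-bucket/export-*.json" },
            DestinationFormat = "NEWLINE_DELIMITED_JSON"
        }
    }
};
// Insert the job, poll it until it reports DONE, then download the files from GCS.
Job startedJob = Service.Jobs.Insert(extractJob, ProjectID).Execute();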
Now it's my turn to design the solution for better results.
Hoping this might help someone. One could retrieve the next set of paginated results using PageToken. Here is sample code for how to use PageToken, although I liked the idea of exporting for free. Here I write rows to a flat file, but you could add them to your DataTable instead; obviously, keeping a large DataTable in memory is a bad idea, though.
public void ExecuteSQL(BigqueryService bqservice, String ProjectID)
{
    string sSql = "SELECT r.Dealname, r.poolnumber, r.loanid FROM [MBS_Dataset.tblRemitData] R left join each [MBS_Dataset.tblOrigData] o on R.Dealname = o.Dealname and R.Poolnumber = o.Poolnumber and R.LoanID = o.LoanID Order by o.Dealname, o.poolnumber, o.loanid limit 100000";
    QueryRequest _r = new QueryRequest();
    _r.Query = sSql;
    QueryResponse _qr = bqservice.Jobs.Query(_r, ProjectID).Execute();
    string pageToken = null;
    if (_qr.JobComplete != true)
    {
        // Job not finished yet! Expecting more data.
        while (true)
        {
            var resultReq = bqservice.Jobs.GetQueryResults(_qr.JobReference.ProjectId, _qr.JobReference.JobId);
            resultReq.PageToken = pageToken;
            var result = resultReq.Execute();
            if (result.JobComplete == true)
            {
                WriteRows(result.Rows, result.Schema.Fields);
                pageToken = result.PageToken;
                if (pageToken == null)
                    break;
            }
        }
    }
    else
    {
        List<string> _fieldNames = _qr.Schema.Fields.ToList().Select(x => x.Name).ToList();
        WriteRows(_qr.Rows, _qr.Schema.Fields);
    }
}
The Web UI automatically flattens the data. This means that you see multiple rows for each nested field.
When you run the same query via the API, it won't be flattened, and you get fewer rows, as the nested fields are returned as objects. You should check whether this is the case for you.
The other thing is that you do indeed need to paginate through the results; "Paging through list results" in the documentation explains this.
If you want to do only one job, then you should write your query output to a table, then export the table as JSON, and download the export from GCS.

WPF application LINQ to SQL getting data

I'm making a WPF application with a datagrid that displays some SQL data.
Now I'm making a search field, but it doesn't seem to work:
Contactpersoon is an nvarchar and Bedrijf is an nvarchar, but LeverancierPK is an INT.
How can I combine them in my search?
If I convert LeverancierPK to a string, then I can use Contains, but that gives me an error.
// Initialization
PRCEntities vPRCEntities = new PRCEntities();
var vFound = from a in vPRCEntities.tblLeveranciers
             where a.LeverancierPK.ToString().Contains(vWoord) ||
                   a.Contactpersoon.Contains(vWoord) ||
                   a.Bedrijf.Contains(vWoord)
             orderby a.LeverancierPK
             select a;
myDataGrid_Leveranciers.ItemsSource = vFound;
Thanks
If you don't care about pulling all the records back from the DB (and in your own answer you pulled everything back anyway), then you can just do a .ToList() before the where clause.
var vFound = vPRCEntities.tblLeveranciers.ToList()
    .Where(a => a.LeverancierPK.ToString().Contains(vWoord) ||
                // Guard against null Contactpersoon values (as in your own answer below).
                (a.Contactpersoon != null && a.Contactpersoon.Contains(vWoord)) ||
                a.Bedrijf.Contains(vWoord))
    .OrderBy(a => a.LeverancierPK);
This code does what I was looking for, but I think it could be a lot shorter.
PRCEntities vPRCEntities = new PRCEntities();
var vFound = from a in vPRCEntities.tblLeveranciers
             orderby a.LeverancierPK
             select a;
myDataGrid_Leveranciers.ItemsSource = null;
myDataGrid_Leveranciers.Items.Clear();
foreach (var item in vFound)
{
    if (item.Bedrijf.Contains(vWoord))
    {
        myDataGrid_Leveranciers.Items.Add(item);
    }
    else
    {
        if (item.LeverancierPK.ToString().Contains(vWoord))
        {
            myDataGrid_Leveranciers.Items.Add(item);
        }
        else
        {
            if (item.Contactpersoon != null)
            {
                if (item.Contactpersoon.Contains(vWoord))
                {
                    myDataGrid_Leveranciers.Items.Add(item);
                }
            }
        }
    }
}
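For what it's worth, a shorter version of the same client-side filtering could look like the sketch below. It keeps the behaviour above (pull everything, filter in memory, order by LeverancierPK) and the same property names, but folds the nested ifs into a single condition and assigns ItemsSource instead of adding items one by one; treat it as a suggestion rather than tested code:
PRCEntities vPRCEntities = new PRCEntities();

// Pull everything client-side so ToString()/Contains work, then filter once.
var vFound = vPRCEntities.tblLeveranciers
    .ToList()
    .Where(a => a.Bedrijf.Contains(vWoord) ||
                a.LeverancierPK.ToString().Contains(vWoord) ||
                (a.Contactpersoon != null && a.Contactpersoon.Contains(vWoord)))
    .OrderBy(a => a.LeverancierPK)
    .ToList();

myDataGrid_Leveranciers.ItemsSource = vFound;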

LINQ Right Outer Join Problem

I am writing a right outer join query in SQL Server 2005 and it's working fine, but I am not able to convert it to LINQ.
Here is my query:
select b.number, COUNT(*) AS [AudioCount] from audios a
right join months b on DATEPART(Month, a.[RecordedDate]) = b.number
group by number
Please help me convert it to LINQ.
Thanks & Regards,
Anil Saklania
EDIT: Corrected query.
Depending on what you are looking for, I have inverted it to be a left join from months to audio. This will let you return a count of zero when a month has no audio recordings. I used Paolo's original testing data to test this out.
var audioMonths = from month in ListOfMonths
                  join audio in ListOfAudios on
                      month.number equals audio.RecordedDate.Month into audioLeftJoin
                  from audio in audioLeftJoin.DefaultIfEmpty()
                  select new
                  {
                      Month = month.number,
                      AudioId = audio != null ? audio.someProperty : null // Need some property on the audio object to see if it exists
                  };
var monthAudioCount = from audioMonth in audioMonths
                      group audioMonth by audioMonth.Month into grouping
                      select new
                      {
                          Month = grouping.Key,
                          AudioCount = grouping.Count(audioMonth => audioMonth.AudioId != null)
                      };
First, some notes from the book LINQ Pocket Reference by J. & B. Albahari:
1. Using an extra from translates to a SelectMany.
2. An into clause translates to a GroupJoin when it appears directly after a join clause.
Both of the excellent solutions above, by Mike and by Paolo, utilize a second, extra from clause in the query because that translates to a SelectMany.
With SelectMany, a "sequence of sequences" (a sequence of audio sequences) is converted into a single flat collection result set. Then, to count the audios, that flat output collection is grouped by month in a second step. Both solutions above do this, and it works, but it also necessitates careful checking for nulls.
EXPLOITING THE NATURAL HIERARCHY.
A cleaner alternative is to use a GroupJoin instead of SelectMany. GroupJoin yields a hierarchical result set, rather than the flat result set of SelectMany. The hierarchical result set needs no grouping, of course, so we eliminate the second step.
Best of all, by utilizing the hierarchical result set of GroupJoin, we don't have to check for nulls.
Thus we achieve another clean left outer join with the following code, borrowing Paolo's data:
static void Main(string[] args)
{
    var ListOfAudios = new List<Audio>() {
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 01, 01) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 01, 02) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 02, 01) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 02, 02) }
    };
    var ListOfMonths = new List<Month>() {
        new Month() { number = 1, someMonthProperty = "testMonth" },
        new Month() { number = 2, someMonthProperty = "testMonth" },
        new Month() { number = 3, someMonthProperty = "testMonth" }
    };
    var q = from month in ListOfMonths
            join audio in ListOfAudios on month.number equals audio.RecordedDate.Month
                into hierarch
            select new
            {
                MonthNum = month.number,
                AudioCnt = hierarch.Count()
            };
    foreach (var m in q)
    {
        Console.WriteLine("{0} - {1}", m.MonthNum, m.AudioCnt);
    }
    Console.ReadLine();
}
As per some of the comments to your question, there are probably more straightforward ways to do what you want than translating your query to LINQ. However, just as an exercise, here's one way to write it:
var res = from audio in ListOfAudios
          join month in ListOfMonths
              on audio.RecordedDate.Month equals month.number into joinAudioMonth
          from j in joinAudioMonth.DefaultIfEmpty()
          group j by j.number into g
          select new
          {
              number = g.Key,
              cnt = g.Count()
          };
EDIT:
the code above does not do a RIGHT JOIN as you requested; here's a revised one based on Mike's answer. This one does not rely on a property of the Audio object (which might be null even if the object itself exists). But I'm being nitpicky; Mike's answer is basically the correct one.
var audioMonths =
    from month in ListOfMonths
    join audio in ListOfAudios on
        month.number equals audio.RecordedDate.Month into monthAudioJoin
    from joined in monthAudioJoin.DefaultIfEmpty()
    select new
    {
        Month = month.number,
        J = joined
    };
var res = from audioMonth in audioMonths
          group audioMonth by audioMonth.Month into grouping
          select new
          {
              number = grouping.Key,
              cnt = grouping.Count(a => a.J != null)
          };
and here's how I tested it:
public class Audio
{
    public string someProperty { get; set; }
    public DateTime RecordedDate { get; set; }
}

public class Month
{
    public string someMonthProperty { get; set; }
    public int number { get; set; }
}

public static void Main(string[] args)
{
    var ListOfAudios = new List<Audio>() {
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 01, 01) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 01, 02) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 02, 01) },
        new Audio() { someProperty = "test", RecordedDate = new DateTime(2011, 02, 02) }
    };
    var ListOfMonths = new List<Month>() {
        new Month() { number = 1, someMonthProperty = "testMonth" },
        new Month() { number = 2, someMonthProperty = "testMonth" },
        new Month() { number = 3, someMonthProperty = "testMonth" }
        // ...
    };
    var audioMonths =
        from month in ListOfMonths
        join audio in ListOfAudios on
            month.number equals audio.RecordedDate.Month into monthAudioJoin
        from joined in monthAudioJoin.DefaultIfEmpty()
        select new
        {
            Month = month.number,
            J = joined
        };
    var res = from audioMonth in audioMonths
              group audioMonth by audioMonth.Month into grouping
              select new
              {
                  number = grouping.Key,
                  cnt = grouping.Count(a => a.J != null)
              };
    foreach (var r in res)
    {
        Console.WriteLine("{0} - {1}", r.number, r.cnt);
    }
}