I am trying to insert a record into a table using Entity Framework. I know how to do this and have done it before, but I modified the database by adding a new table, "PasswordRecovery", then updated my .edmx file, and then tried to insert a record into that table:
PasswordRecovery OPasswordRecovery = new PasswordRecovery
{
    userId = user.Id,
    url = token,
    requestDateTime = DateTime.Now,
    isRecoverd = false
};
context.PasswordRecoveries.Add(OPasswordRecovery);
context.SaveChanges();
But SaveChanges() throws the exception "Invalid object name 'dbo.PasswordRecoveries'".
Please help me.
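A minimal diagnostic sketch, assuming Entity Framework 6 and the same context and entity as in the snippet above: logging the SQL that EF generates shows exactly which table name it targets, so it can be compared with the table that was actually created in the database.

// Diagnostic sketch (assumes EF 6): add this line before the insert above.
// It logs the SQL EF generates, so the table name EF targets
// ("dbo.PasswordRecoveries") can be compared with the table that was
// actually created in the database ("PasswordRecovery").
context.Database.Log = Console.WriteLine;

context.PasswordRecoveries.Add(OPasswordRecovery);
context.SaveChanges(); // the logged INSERT shows the exact table EF expects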
Related
I am trying to upload JSON data to one of the tables created under a dataset in BigQuery, but it fails with "Google.GoogleApiException: 'Google.Apis.Requests.RequestError
Not found: Table currency-342912:sampleDataset.currencyTable [404]'".
The service account is created with the roles BigQuery Admin/DataEditor/DataOwner/DataViewer.
The roles are also applied to the table itself.
Below is the snippet:
public static void LoadTableGcsJson(string projectId = "currency-342912", string datasetId = "sampleDataset", string tableId = "currencyTable ")
{
    // Read the service account key json file
    string dir = Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.FullName + "\\" + "currency-342912-ae9b22f23a36.json";
    GoogleCredential credential = GoogleCredential.FromFile(dir);
    string toFileName = Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.FullName + "\\" + "sample.json";
    BigQueryClient client = BigQueryClient.Create(projectId, credential);
    var dataset = client.GetDataset(datasetId);

    using (FileStream stream = File.Open(toFileName, FileMode.Open))
    {
        // Create and run job
        BigQueryJob loadJob = client.UploadJson(datasetId, tableId, null, stream); // This throws the error
        loadJob.PollUntilCompleted();
    }
}
Permissions for the table are set for the service account "sampleservicenew", as shown in the screenshot.
Any leads on this are much appreciated.
Your issue might reside in your user credentials. Please follow these steps to check your code:
Check that the user you are using to execute your application has access to the table you want to insert data into.
Check that your JSON tags match your table columns.
Check that your JSON inputs are correct (table name, dataset name).
Use a dummy table to perform a quick test of your credentials and data integrity.
These steps will help you identify what could be missing on your side. I performed the following operations to reproduce your case:
I created a table on BigQuery based on the values of your JSON data:
create or replace table `projectid.datasetid.tableid` (
    IsFee BOOL,
    BlockDateTime TIMESTAMP,
    Address STRING,
    BlockHeight INT64,
    Type STRING,
    Value INT64
);
I created a .json file with your test data:
{"IsFee":false,"BlockDateTime":"2018-09-11T00:12:14Z","Address":"tz3UoffC7FG7zfpmvmjUmUeAaHvzdcUvAj6r","BlockHeight":98304,"Type":"OUT","Value":1}
{"IsFee":false,"BlockDateTime":"2018-09-11T00:12:14Z","Address":"tz2KuCcKSyMzs8wRJXzjqoHgojPkSUem8ZBS","BlockHeight":98304,"Type":"IN","Value":18}
I built and ran the code below:
using System;
using Google.Cloud.BigQuery.V2;
using Google.Apis.Auth.OAuth2;
using System.IO;

namespace stackoverflow
{
    class Program
    {
        static void Main(string[] args)
        {
            string projectid = "projectid";
            string datasetid = "datasetid";
            string tableid = "tableid";
            string safilepath = "credentials.json";

            var credentials = GoogleCredential.FromFile(safilepath);
            BigQueryClient client = BigQueryClient.Create(projectid, credentials);

            using (FileStream stream = File.Open("data.json", FileMode.Open))
            {
                BigQueryJob loadJob = client.UploadJson(datasetid, tableid, null, stream);
                loadJob.PollUntilCompleted();
            }
        }
    }
}
Output:

Row  IsFee  BlockDateTime            Address                               BlockHeight  Type  Value
1    false  2018-09-11 00:12:14 UTC  tz3UoffC7FG7zfpmvmjUmUeAaHvzdcUvAj6r  98304        OUT   1
2    false  2018-09-11 00:12:14 UTC  tz2KuCcKSyMzs8wRJXzjqoHgojPkSUem8ZBS  98304        IN    18
Note: You can use the code above to quickly test your credentials and the integrity of the data to insert.
I also made use of the following documentation:
Load Credentials from a file
Google.Cloud.BigQuery.V2
Load Json data into a new table
I have a column InternalKey in my database table. When I do an update, all columns get updated except the InternalKey column.
Here is my code. Is InternalKey a keyword?
vmdpkeys.LastOperationBy = user.Id;
vmdpkeys.LastOperationOn = DateTime.Now;
var _mapeddpkeys = _mapper.Map<SysDefaultPostingKeys>(vmdpkeys);
_mapeddpkeys.InternalKey = vmdpkeys.DefaultPostingKey.Replace(" ", String.Empty);
_defaultpostingkeysrepository.Update(_mapeddpkeys);
Can you check the migration files and the ModelSnapshot file to see whether InternalKey has been successfully added in your DB?
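For reference, a minimal sketch of what to look for, assuming EF Core code-first migrations and that InternalKey is a string column on SysDefaultPostingKeys (the entity name is taken from the mapping call above; the table and namespace names are illustrative):

// In the generated migration, the column should be added roughly like this:
migrationBuilder.AddColumn<string>(
    name: "InternalKey",
    table: "SysDefaultPostingKeys",
    nullable: true);

// And the *ModelSnapshot.cs file should list the property on the entity:
// modelBuilder.Entity("MyApp.Models.SysDefaultPostingKeys", b =>
// {
//     b.Property<string>("InternalKey");
// });

If InternalKey does not show up in either file, it was never added to the EF model when the migration was generated.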
I need to insert records into an Oracle DB table that already has records in it, using the table's sequence.
I tried using RQL, which creates an auto-generated id for the primary key, but sometimes those generated ids already exist in the database, and as a result a constraint violation error is thrown.
The ATG documentation provides an alternative called "Overriding RQL-Generated SQL", but I didn't manage to make it work for insert statements:
GSARepository repo =
    (GSARepository) request.resolveName("/examples/TestRepository");
RepositoryView view = repo.getView("canard");

Object[] params = new Object[4];
params[0] = new Integer(25);
params[1] = new Integer(75);
params[2] = "french";
params[3] = "greek";

Builder builder = (Builder) view.getQueryBuilder();
String str = "SELECT * FROM usr_tbl WHERE (age_col > 0 AND age_col < 1 "
    + "AND EXISTS (SELECT * FROM subjects_tbl WHERE id = usr_tbl.id AND subject "
    + "IN (2, 3)))";
RepositoryItem[] items =
    view.executeQuery(builder.createSqlPassthroughQuery(str, params));
Is there any way to use the table's sequence for insert statements via the ATG Repository API?
Eventually, I did not manage to make that work, but I found the following solution: I retrieved the sequence number as shown below and then used it in the RQL insert statement.
RepositoryView view = getRestServiceDetailsRepository().getView("wsLog");
String sql = "select log_seq.nextval from dual";
Object[] params = {};
Builder builder = (Builder) view.getQueryBuilder();
Query query = builder.createSqlPassthroughQuery(sql, params);
RepositoryItem[] items = view.executeQuery(query);
if (items != null && items.length > 0) {
    items[0].getRepositoryId(); // the repository id carries the selected sequence value
}
I'm trying to add new values to my GridView that are later passed to the Cache, a DataSet, and the underlying SQL database.
Here is my code, but I can't figure out what to type on the line "dataRow["ID"] =", as you can see. Everything else works fine, and the other values are added to the database if I just give "ID" any number that doesn't already exist.
protected void insertStudent_Click(object sender, EventArgs e)
{
    DataSet dataSet = (DataSet)Cache["DATASET"];
    //DataRow dataRow = dataSet.Tables["Students"].Rows.Find(e.Keys["ID"]);
    dataSet.Tables["Students"].PrimaryKey = new DataColumn[] { dataSet.Tables["Students"].Columns["ID"] };
    DataRow dataRow = dataSet.Tables["Students"].NewRow();
    dataRow["ID"] =
    dataRow["FirstName"] = ((TextBox)GridView1.FooterRow.FindControl("txtFirstName")).Text;
    dataRow["LastName"] = ((TextBox)GridView1.FooterRow.FindControl("txtLastName")).Text;
    dataRow["Gender"] = ((DropDownList)GridView1.FooterRow.FindControl("DropDownListGender")).SelectedValue;
    dataRow["Course"] = ((DropDownList)GridView1.FooterRow.FindControl("DropDownListCourse")).SelectedValue;
    dataRow["Grade"] = ((DropDownList)GridView1.FooterRow.FindControl("DropDownListGrade")).SelectedValue;
    Cache.Insert("DATASET", dataSet, null, DateTime.Now.AddHours(24), System.Web.Caching.Cache.NoSlidingExpiration);
    dataSet.Tables["Students"].Rows.Add(dataRow);
    GridView1.DataSource = (DataSet)Cache["DATASET"];
    GridView1.DataBind();
}
As per Andrei in the comment above, set up your ID column in the table as:
CREATE TABLE sample (
    ID INT IDENTITY(1,1) NOT NULL,
    FirstName VARCHAR(50)
    -- ...and the rest of your columns
)
There is no need to add a value to ID; it will increment by itself. Insert the other values, and when you read from the database, the ID column will already be filled in.
P.S. Do not include the ID column when inserting the other values into the table.
Google 'SQL INCREMENT' for more information.
The answer to this question is to use AutoIncrement on the ID column in your cached DataSet. Then, when you save to the DB, the added rows will get their correct IDs in the DB.
dataSet.Tables["Students"].Columns["ID"].AutoIncrement = true;
In my Windows Forms application, I'm using a SQL Server Compact database. I have a function in which I want to update the columns 'id' and 'name' in table 'owner', unless the specified id does not exist, in which case I want new values inserted.
For example, my current table has 'id' 1 and 2. It might or might not have 'id' 3. The user enters data to insert/update id 3.
I want my query to do something like this:
UPDATE owner
SET name = @InputN
WHERE id = 3

IF @@ROWCOUNT = 0
    INSERT INTO owner (id, name) VALUES (3, @InputN)
How should I define my query in order to make this work in SQL Server Compact Edition?
You should do it in your form code. This way you don't even need to check whether there is an id with the value 3; it will check by itself and update the row if it exists, and if not, you won't get any errors.
RSSql.UpdateNonQueryParametric("update owner set name=? where id=3", newname);

public static void UpdateNonQueryParametric(string query, params Object[] parameters)
{
    // Wrap each value in a SqlCeParameter
    SqlCeParameter[] param = new SqlCeParameter[parameters.Length];
    for (int i = 0; i < parameters.Length; i++)
    {
        param[i] = new SqlCeParameter();
        param[i].Value = parameters[i];
    }

    _cnt = new SqlCeConnection();
    _cnt.ConnectionString = ConnectionString;
    _cmd = new SqlCeCommand();
    _cmd.Connection = _cnt;
    _cmd.CommandType = System.Data.CommandType.Text;
    _cmd.CommandText = query;
    _cmd.Parameters.AddRange(param);

    if (_cnt.State != System.Data.ConnectionState.Open)
        _cnt.Open();

    _cmd.ExecuteNonQuery();
    _cmd.Dispose();

    if (_cnt.State != System.Data.ConnectionState.Closed)
        _cnt.Close();
    _cnt.Dispose();
}
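SQL Server Compact cannot run multi-statement batches, so the IF @@ROWCOUNT pattern from the question cannot be sent as a single query; a common workaround is to do the check in C#: run the UPDATE, and fall back to an INSERT when no rows were affected. A minimal sketch, assuming an open SqlCeConnection named cnt and an input value inputName (both names are placeholders):

// Sketch of "update, else insert" for SQL Server Compact.
int affected;
using (var update = new SqlCeCommand("UPDATE owner SET name = @name WHERE id = @id", cnt))
{
    update.Parameters.AddWithValue("@name", inputName);
    update.Parameters.AddWithValue("@id", 3);
    affected = update.ExecuteNonQuery();
}

if (affected == 0)
{
    // No existing row with that id, so insert a new one instead.
    using (var insert = new SqlCeCommand("INSERT INTO owner (id, name) VALUES (@id, @name)", cnt))
    {
        insert.Parameters.AddWithValue("@id", 3);
        insert.Parameters.AddWithValue("@name", inputName);
        insert.ExecuteNonQuery();
    }
}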