I am able to reproduce something I would have bet was not possible :-)
An Entity Framework 5 based application populates an entity set named "dbSetName" from a stored procedure result, like this:
DbContext.Database.Initialize(false);
IDbCommand cmd = DbContext.Database.Connection.CreateCommand();
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = sprocName;
DbContext.Database.Connection.Open();
using (DbDataReader reader = (DbDataReader)cmd.ExecuteReader())
{
    var tmpresult = ((IObjectContextAdapter)this.DbContext).ObjectContext.Translate<T>(reader, dbSetName, MergeOption.AppendOnly);
The reader is a SqlDataReader, and DbContext derives from System.Data.Entity.DbContext - nothing fancy.
When I execute the stored procedure in SSMS, it returns a single result set - the output of the first and only stand-alone SELECT in its body - consisting of 3383 records.
When enumerated, tmpresult also contains 3383 records, but consistently and without apparent reason, a few of the records I see in the raw output are replaced by duplicates of other records from the raw output.
For example, the raw output looks like this:
Row Values
1.1 1.2 1.3
2.1 2.2 2.3
3.1 3.2 3.3
4.1 4.2 4.3
5.1 5.2 5.3
...
But the result in tmpresult consistently looks like this:
Row Values
1.1 1.2 1.3
2.1 2.2 2.3
3.1 3.2 3.3
2.1 2.2 2.3
5.1 5.2 5.3
...
I can stop in the debugger and verify that I am executing the raw query against the same SQL Server/database/stored procedure that EF uses... Bizarre...
I'm not sure what additional info to show, but I hope someone can explain this.
ADDITIONAL INFO: Replacing ObjectContext.Translate<T> with SqlQuery<T> removes the issue. So do I now debug whatever assembly implements ObjectContext.Translate<T>?
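For reference, a minimal sketch of the SqlQuery<T> variant (using the same sprocName and entity type T as in the snippet above):

// Sketch of the SqlQuery<T> workaround; sprocName and T are the same
// placeholders used in the question.
var tmpresult = DbContext.Database
    .SqlQuery<T>("EXEC " + sprocName)
    .ToList(); // materialize while the connection is open

One likely reason the behaviour differs: SqlQuery<T> on Database does not attach its results to the context, whereas Translate<T> with an entity set name and MergeOption.AppendOnly does, and in that mode EF returns an already-tracked entity whenever a row carries an entity key it has seen before - which looks exactly like the duplication described above.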
We recently updated our Kafka version from 0.10 to 1.0, and I am updating the deprecated code
KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(Serdes.Long(), mySerde, sub_topic)
    .groupByKey(Serdes.Long(), mySerde)
    .reduce(myReducer, my_store);
to this
KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(sub_topic, Consumed.with(Serdes.Long(), mySerde))
    .groupByKey(Serialized.with(Serdes.Long(), mySerde))
    .reduce(myReducer, Materialized.as(my_store));
My stream throws an error while serializing in groupByKey. Serialized.with() does not use the key serde provided and falls back to the default byte-array serde, which then encounters my Long key and throws a cast error.
Has anyone else encountered this error in Kafka 1.0.0? The first code with the outdated version of Kafka works fine, but updating the code to use Serialized.with() does not seem to work. Any help is greatly appreciated.
Can you share the stack trace? I actually think the issue is with reduce() -- you need to specify the Serdes via Materialized again.
This is a regression in the new API that was fixed recently in trunk: https://github.com/apache/kafka/pull/4919 The upcoming 2.0 release will contain the fix.
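To illustrate the suggestion above, a minimal sketch of the workaround with the serdes declared on Materialized as well (mySerde, myReducer, my_store and sub_topic are the names from the question; KeyValueStore and Bytes come from the Kafka Streams/common APIs):

KTable<Long, myClass> myKTable = this.streamBuilder
    .stream(sub_topic, Consumed.with(Serdes.Long(), mySerde))
    .groupByKey(Serialized.with(Serdes.Long(), mySerde))
    // declare the serdes on Materialized too, so the store does not fall
    // back to the default byte-array serdes
    .reduce(myReducer,
            Materialized.<Long, myClass, KeyValueStore<Bytes, byte[]>>as(my_store)
                .withKeySerde(Serdes.Long())
                .withValueSerde(mySerde));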
I've just started implementing the binary protocol API for OrientDB in C++. The OrientDB version in use is "orientdb-community-2.2.29", on Windows 10 x64 with Java 1.8. When I query "select * from XXXX" on the example DB, server-side exceptions are thrown and no record is serialized to the client. Here are the logs after a successful connection and query:
2017-12-03 14:14:12:561 INFO {db=Site} /0:0:0:0:0:0:0:1:2520 - Writing bytes (4+0=4 bytes): null [OChannelBinaryServer]$ANSI{green {db=Site}} Error on unmarshalling record #73:0 (java.lang.NullPointerException)
java.lang.NullPointerException
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.getRecordBytes(ONetworkProtocolBinary.java:2894)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.writeRecord(ONetworkProtocolBinary.java:2907)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.writeIdentifiable(ONetworkProtocolBinary.java:2697)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.serializeValue(ONetworkProtocolBinary.java:1639)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.command(ONetworkProtocolBinary.java:1584)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.executeRequest(ONetworkProtocolBinary.java:660)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.sessionRequest(ONetworkProtocolBinary.java:394)
at com.orientechnologies.orient.server.network.protocol.binary.ONetworkProtocolBinary.execute(ONetworkProtocolBinary.java:217)
at com.orientechnologies.common.thread.OSoftThread.run(OSoftThread.java:81)
2017-12-03 14:14:12:561 WARNI {db=Site} Cannot serialize record: XXXX#73:0{Name:[2],IDs:[1]} v3 [ONetworkProtocolBinary]
Before the "null" bytes are written, the record ID, position, and record version are serialized and received correctly on the client side; querying from Studio or the console also works like a charm. I've tried changing the class property to STRING or EMBEDDEDMAP, with the same problem.
Thanks in advance for help :-)
Fortunately, I found the mistake on my own: the wrong serialization implementation was configured. The correct configuration is ORecordSerializerBinary, not ONetworkProtocolBinary.
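In other words, the string the client sends as the serialization implementation during the binary-protocol handshake must name the record serializer, not the network protocol class. A rough sketch of the idea in C++ (writeProtocolString is a hypothetical helper for the protocol's length-prefixed strings, not part of any OrientDB library):

#include <cstdint>
#include <string>
#include <vector>

// Hypothetical helper that appends a length-prefixed string to the
// request buffer, as the OrientDB binary protocol expects.
void writeProtocolString(std::vector<uint8_t>& buf, const std::string& s);

void appendSerializerName(std::vector<uint8_t>& openRequest) {
    // Wrong: ONetworkProtocolBinary is the server-side protocol handler.
    // writeProtocolString(openRequest, "ONetworkProtocolBinary");

    // Correct: ask the server to use the binary record serializer.
    writeProtocolString(openRequest, "ORecordSerializerBinary");
}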
I have a .rrd database which collects data from a temperature gauge. Now I have a second gauge, so I'd like to add it to the existing .rrd database. I have tried many times with the "rrdtool tune" command, but when I afterwards run "rrdtool info" on the database, the new data source (the second gauge) I tried to add is not there.
How can I do this?
The command you need is, as you say, rrdtool tune. The documentation is available online at https://oss.oetiker.ch/rrdtool/doc/rrdtune.en.html
The ability to extend an RRA and to add or remove a DS was only added late in RRDTool 1.4. Check that you are not using an older version of RRDTool; if you are, you will not be able to use this feature until you upgrade.
I just checked, and I see I'm using RRDTool 1.4, so I should not have any problems. Anyway, the command I used is:
/usr/bin/rrdtool tune TEMPCucina.rrd DS:METEOTEMPEXT:GAUGE:1200:U:U RRA:AVERAGE:0.5:1:180000
I got this back from the computer:
DS[TEMPCucina] typ: GAUGE hbt: 1200 min: nan max: nan
But it seems that the change was not written into TEMPCucina.rrd.
And if I run the following command:
rrdtool info TEMPCucina.rrd
I just get the following, and it seems that no new gauge has been created
filename = "TEMPCucina.rrd"
rrd_version = "0003"
step = 60
last_update = 1510780261
header_size = 556
ds[TEMPCucina].index = 0
ds[TEMPCucina].type = "GAUGE"
ds[TEMPCucina].minimal_heartbeat = 1200
ds[TEMPCucina].min = NaN
ds[TEMPCucina].max = NaN
ds[TEMPCucina].last_ds = "18"
ds[TEMPCucina].value = 1,8000000000e+01
ds[TEMPCucina].unknown_sec = 0
rra[0].cf = "AVERAGE"
rra[0].rows = 30000
rra[0].cur_row = 1304
rra[0].pdp_per_row = 1
rra[0].xff = 0,0000000000e+00
rra[0].cdp_prep[0].value = NaN
rra[0].cdp_prep[0].unknown_datapoints = 0
(when I try to write I get this, but I don't know how to proceed at this point)
ERROR: TEMPCucina.rrd: illegal attempt to update using time 1510780527 when last update time is 1510780527 (minimum one second step)
I finally did it, although I wasn't able to use rrdtool tune.
I eventually found here how to dump the database, how to modify the dump, and how to restore it to its original location (which also let me correct some data).
This is not what I was looking for, but it solved my problem, so I want to share it.
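For completeness, a sketch of that dump/edit/restore workflow with the file names from this thread (the exact XML edits depend on your RRD layout):

# export the RRD to editable XML
rrdtool dump TEMPCucina.rrd > TEMPCucina.xml

# edit TEMPCucina.xml by hand: duplicate every per-DS element for the new
# gauge (the top-level <ds> block, the <ds> entry inside each RRA's
# <cdp_prep>, and one extra <v>NaN</v> value in every <row>)

# rebuild an RRD from the edited XML and swap it into place
rrdtool restore TEMPCucina.xml TEMPCucina-new.rrd
mv TEMPCucina-new.rrd TEMPCucina.rrd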
I have the following model:
class PlatformVersion(models.Model):
    platform = models.ForeignKey(Platform)
    version = models.FloatField(db_index=True)
    display_version = models.CharField(max_length=32, db_index=True)
    is_enabled = models.BooleanField(default=False)
Initial DB seeding code:
def add_version(self, platform, version, display_version):
    try:
        return PlatformVersion.objects.get(platform=platform, version=version)
    except PlatformVersion.DoesNotExist:
        return PlatformVersion.objects.create(platform=platform, version=version,
                                              display_version=display_version,
                                              is_enabled=True)
self.add_version(platform, 9.0, '9.0')
self.add_version(platform, 9.1, '9.1')
self.add_version(platform, 9.2, '9.2')
self.add_version(platform, 9.3, '9.3')
As you can see, I'm adding minor versions only, skipping versions like 9.0.1, 9.0.2, etc.
And I have the following use case: the client sends a request with a platform version that could be, for example, 9.0, 9.0.0, 9.0.1, or 10.0 (missing from the DB entirely), etc. I then need to query the matching PlatformVersion instance from the DB:
for 9.0 -> 9.0
for 9.0.0 -> 9.0
for 9.0.1 -> 9.0
for 9.0.999 -> 9.0
for 9.1.1 -> 9.1
for 10.0 (10.* missing in DB) -> closest version = 9.3 for my initial seed data.
I'm sure I could write such code myself, but there will be a very large number of such queries per second, so I care about efficiency.
Can you please suggest the best option for performance?
OK, I've realized that it's better to store the major and minor versions as separate integer fields in my model and filter by their values.
BTW, I think it's also very bad practice to store versions like "x.x" in a float field. If single-field storage is required, it is better to store the version in a DecimalField, because float and decimal representations differ.
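A minimal sketch of that approach, with integer major/minor fields and a closest-match fallback (the field names and the resolve_version helper are my own illustration, not the original code):

class PlatformVersion(models.Model):
    platform = models.ForeignKey(Platform)
    major = models.PositiveIntegerField(db_index=True)
    minor = models.PositiveIntegerField(db_index=True)
    display_version = models.CharField(max_length=32, db_index=True)
    is_enabled = models.BooleanField(default=False)

def resolve_version(platform, raw_version):
    # "9.0.1" -> major=9, minor=0; any patch component is ignored
    parts = raw_version.split('.')
    major = int(parts[0])
    minor = int(parts[1]) if len(parts) > 1 else 0

    versions = PlatformVersion.objects.filter(platform=platform)
    exact = versions.filter(major=major, minor=minor).first()
    if exact is not None:
        return exact
    # no exact match (e.g. 10.0): fall back to the closest lower version
    return versions.filter(major__lte=major).order_by('-major', '-minor').first()

With db_index on both integer columns, the exact-match lookup stays a cheap indexed query even at a high request rate.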
This error appeared when I tried to select from an empty table in MS SQL Server 2005: "either BOF or EOF is True, or the current record has been deleted". I used TADOConnection and TADODataSet in Delphi 5 to connect to the database and retrieve data.
Conn := TADOConnection.Create(nil);
DataSet := TADODataSet.Create(nil);
Conn.ConnectionString := 'Provider=SQLOLEDB.1;Password=sa;' +
'Persist Security Info=True;' +
'User ID=user;Initial Catalog=mydb;' +
'Data Source=MYPC\SQLEXPRESS;' +
'Use Procedure for Prepare=1;' +
'Auto Translate=True;Packet Size=4096;' +
'Workstation ID=MYPC;' +
'Use Encryption for Data=False;' +
'Tag with column collation when possible=False';
Conn.LoginPrompt := False;
Conn.Open;
DataSet.Connection := Conn;
DataSet.CommandText := 'SELECT * FROM MYTABLE';
DataSet.Open;
DataSet.Free;
Conn.Free;
Is there a way to check whether a database table is empty without triggering this error?
This error originally occurred with an update to MDAC_TYP (to 2.6, from memory). According to an old Borland advisory:
"This is a bug in the SQL Server provider. Set
CursorLocation = clUseClient to eliminate the error."
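Applied to the question's code, a minimal sketch of that workaround (clUseClient comes from the ADODB unit, ShowMessage from Dialogs; IsEmpty is the standard TDataSet check):

DataSet.Connection := Conn;
DataSet.CursorLocation := clUseClient; // avoid the SQL Server provider bug
DataSet.CommandText := 'SELECT * FROM MYTABLE';
DataSet.Open;
try
  if DataSet.IsEmpty then
    ShowMessage('MYTABLE is empty');
  // otherwise process the records here
finally
  DataSet.Close;
end;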
There was an ADOExpress patch available from Borland, but the link no longer works. Embarcadero now hosts it here: ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/adoexpress/d5adoupdate2.exe (thanks for the official link, Jeroen!)
I would recommend downloading and installing all patches listed on the Embarcadero site, assuming you can find them.
Download the ADO update for Delphi 5 here: ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/adoexpress/d5adoupdate2.exe
Make sure you also have the regular update installed:
English Enterprise
English Professional
English Standard
There are more updates (Corba, original ADO install) and languages (French, German), but these should get you going.
There also used to be http://info.borland.com/devsupport/delphi/download_files/zlibupdate.zip, but it is not on the ftpd servers.
--jeroen
It was a long time ago, but I recall that this problem in Delphi 5 was resolved by a Delphi update. The early versions had serious problems with the ADO components.
P.S. I also see that your code uses atypical runtime creation of components and does not use a container such as a data module or form for visual work with components (usually not a good idea). It is also sometimes useful to run simple queries via ADOConnection.Execute, as sketched below. If you do not use visual components, handling ADO's Recordset object is much the same as handling Delphi's TADODataSet.
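For the ADOConnection.Execute route mentioned above, a small sketch of checking for an empty table without opening a dataset (assumes ADOInt in the uses clause for _Recordset; the query and message are illustrative):

var
  RS: _Recordset;
begin
  RS := Conn.Execute('SELECT COUNT(*) FROM MYTABLE');
  // the single returned field holds the row count
  if RS.Fields[0].Value = 0 then
    ShowMessage('MYTABLE is empty');
end;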
Since I spent a whole day on this, here is a wrap-up of what I did in the end.
Delphi 5 Pro Installation with ADO Express
Uninstall Delphi 5 and also delete the installation directory
Install Delphi 5 Pro (I only have a German Delphi Pro edition available)
Install Delphi 5 Pro Update (I used the German update)
Install Delphi 5 ADO Express
Install Delphi 5 ADO Express Update Pack 1
Install Delphi 5 ADO Express Update Pack 2
And the download links (many thanks again Jeroen Wiert Pluimers):
Delphi 5 Pro updates:
ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/delphi5/D5ProUpdate.exe
ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/delphi5/german/d5proupdate.exe
ADOExpress Update Packs 1 and 2:
ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/adoexpress/D5ADOUpgrade.exe
ftp://ftpd.embarcadero.com/pub/delphi/devsupport/updates/adoexpress/d5adoupdate2.exe
And for future reference, some screenshots showing how to build the download links (case sensitive):