I have an Access .MDB file with 125 fields in one table. If I open this MDB in Access, I can see all 125 fields. If I open the same table in an open-source MDB viewer, I see data in all 125 fields, but the catalog of fields is truncated to 50 fields.
(Screenshot: fields list truncated)
I need to open this MDB from my .NET Framework 4.8 program. I do this with the OleDb.OleDbConnection provider and this connection string:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=G:\my.mdb
Unexpectedly, I can see only the first 50 fields of data. Any attempt to read one of fields 51 to 125 through ADO.NET fails:
at System.Data.ProviderBase.FieldNameLookup.GetOrdinal(String fieldName)
at System.Data.OleDb.OleDbDataReader.GetOrdinal(String name)
at System.Data.OleDb.OleDbDataReader.get_Item(String name)
at AtiConverter.Module1._Closure$__._Lambda$__12-4(OleDbDataReader X) in G:\Projects\ATI\Access\Converter1\Converter1\AccessRead.vb:line 63
at AtiConverter.RawSqlQuery.RawSqlQuery[T](OleDbConnection Connection, String SqlQuery, Func`2 RowMapperFunc, Boolean NeedPrint) in G:\Projects\ATI\Access\Converter1\Converter1\RawSqlQuery.vb:line 23
I'm using an ordinary Windows 11 machine with the latest updates.
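For what it's worth, here is a minimal diagnostic sketch (the table name MyTable is a placeholder, not from the original post) that dumps the column count and names the Jet provider actually exposes through the reader, which helps narrow down whether the truncation happens in the provider or in the mapping code:

using System;
using System.Data.OleDb;

class DumpColumns
{
    static void Main()
    {
        const string connectionString =
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=G:\my.mdb";

        using (var connection = new OleDbConnection(connectionString))
        // "MyTable" is a hypothetical table name; replace it with the real one.
        using (var command = new OleDbCommand("SELECT * FROM MyTable", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // FieldCount shows how many columns the provider really returns.
                Console.WriteLine("FieldCount = " + reader.FieldCount);
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    Console.WriteLine("{0,3}: {1}", i, reader.GetName(i));
                }
            }
        }
    }
}

Note that the Microsoft.Jet.OLEDB.4.0 provider only exists as a 32-bit component, so the program has to run as an x86 process for the connection to open at all.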
Related
I'm trying to use mv.NET by BlueFinity. I made some integration packages with it for importing data from a D3 multivalue database into MS SQL Server 2012, but I seem to be having some trouble with the mapping.
The VOYAGES table has some commentX fields in the D3 application that are acting quite unwieldy, and the INSERT fails after a certain number of rows with the following message:
>Error: 0xC0047062 at INSERT, mvNET Source[354]: System.Exception: Error #8: dataReader[0] = LTPAC002 ci.BufferColumnIndex = 52, ci.ColumnName = COMMGROUP(Error #8: dataReader[0] = LTPAC002 ci.BufferColumnIndex = 52, ci.ColumnName = COMMGROUP(The value is too large to fit in the column data area of the buffer.))
at mvNETDataSource.mvNETSource.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
Error: 0xC0047038 at INSERT, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on mvNET Source returned error code 0x80131500. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
The value is too large to fit in the column data area of the buffer. I tried changing the input/output types but can't seem to get it right.
In the SQL table the columns are of type ntext.
In the .dtsx job the data type for the columns is Unicode string [DT_WSTR] with length 4000; I guess these were auto-detected.
The import worked for other D3 files like this; I'm not sure why it fails for these comment fields.
Running the query in the mv.NET Data Manager (on the D3 server) times out after 240 seconds, so maybe this is the underlying issue?
Any ideas on how to proceed? Thank you.
The most likely reason is that the COMMGROUP column does not have the correct data type, or some record in the source does not fit in the output type.
To find the record causing the error, you have to turn on 'Redirect row' (an error-output property of the failing component) and send the redirected rows to a .txt, .csv, or .tsv file, then check the data.
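Once the redirected rows are in a delimited file, a quick throwaway check like the sketch below (the file path and tab delimiter are assumptions; adjust them to whatever your error output writes) will report the maximum value length per column, so you can spot which values exceed the DT_WSTR length of 4000:

using System;
using System.IO;

class MaxFieldLengths
{
    static void Main()
    {
        // Hypothetical path to the file produced by the error-output destination.
        const string path = @"C:\temp\redirected_rows.tsv";
        const char delimiter = '\t';

        int[] maxLengths = new int[0];
        foreach (var line in File.ReadLines(path))
        {
            var fields = line.Split(delimiter);
            if (maxLengths.Length < fields.Length)
                Array.Resize(ref maxLengths, fields.Length);
            for (int i = 0; i < fields.Length; i++)
                maxLengths[i] = Math.Max(maxLengths[i], fields[i].Length);
        }

        // Any column whose maximum length is over 4000 will not fit
        // in a DT_WSTR(4000) buffer column.
        for (int i = 0; i < maxLengths.Length; i++)
            Console.WriteLine("Column {0}: max length {1}", i, maxLengths[i]);
    }
}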
The exception is being thrown from mv.NET, so I suggest you call (or ask your reseller to call) BlueFinity support and ask them about this. You're paying for support; you might as well use it. Those programs shouldn't be allowed to throw exceptions like that.
D3 doesn't export Unicode; that might be one issue. But if the Data Manager times out, then I suspect something is wrong in the connectivity into D3. Open a Connection Monitor from the Session Monitor and watch the connection when you make the request. I'm guessing it's either hanging or, more probably, falling into BASIC Debug.
Make sure all D3-side programs related to this are either all Flash-compiled, or all Not Flashed. Your app code will fall into Debug if it's not Flashed but MVNET.BP is.
If it's your program that's in Debug, fix it. If you're not sure which program it is, LIST-RUNTIME-ERRORS in DM.
If it's an MVNET.BP program, again, work with BlueFinity. If you are using MVSP for connectivity, the Connection Monitor may be useless; you'll need to change to an IP (Telnet) connection to see the raw data exchange.
I deleted an unused connection string and have now wasted a lot of hours trying to get things back to a working state. The rest of the program worked like a charm and the connection string was never used, since I was working on another database.
Error 1 'dbGIPEXConnectionString' is not a member of 'GIP_Eindwerk.My.MySettings'. D:\GIP\GIP+Eindwerk\GIP+Eindwerk\dbGIPEXDataSet.Designer.vb 2544 47 GIP+Eindwerk
Error 2 'dbGIPEXConnectionString' is not a member of 'GIP_Eindwerk.My.MySettings'. D:\GIP\GIP+Eindwerk\GIP+Eindwerk\dbGIPEXDataSet.Designer.vb 2957 47 GIP+Eindwerk
Error 3 'dbGIPEXConnectionString' is not a member of 'GIP_Eindwerk.My.MySettings'. D:\GIP\GIP+Eindwerk\GIP+Eindwerk\dbGIPEXDataSet.Designer.vb 3583 47 GIP+Eindwerk
Error 4 'dbGIPEXConnectionString' is not a member of 'GIP_Eindwerk.My.MySettings'. D:\GIP\GIP+Eindwerk\GIP+Eindwerk\dbGIPEXDataSet.Designer.vb 3901 47 GIP+Eindwerk
All the errors point to code that looks like this:
Private Sub InitConnection()
Me._connection = New Global.System.Data.OleDb.OleDbConnection()
Me._connection.ConnectionString = Global.GIP_Eindwerk.My.MySettings.Default.dbGIPEXConnectionString
End Sub
I've tried removing
<Connection AppSettingsObjectName="MySettings" AppSettingsPropertyName="dbGIPEXConnectionString" ConnectionStringObject="" IsAppSettingsProperty="true" Modifier="Assembly" Name="dbGIPEXConnectionString (MySettings)" PropertyReference="ApplicationSettings.GIP_Eindwerk.My.MySettings.GlobalReference.Default.dbGIPEXConnectionString" Provider="System.Data.OleDb" />
in the dbGIPEXDataSet.xsd. It didn't change anything.
Cleaning the project also changed nothing.
I would suggest that you simply delete your Data Source and create a new one. You would have to add instances of the DataSet and table adapters back to your forms in the designer, but any of your own code will continue to work as long as the names stay the same.
Here is some sample code of how I am creating/connecting/working with my database
string connection = @"Data Source='C:\test.sdf';Max Database Size=4000;"
+ "Max Buffer Size=4096;";
File.Delete(@"C:\test.sdf");
using (var engine = new SqlCeEngine(connection))
{
engine.CreateDatabase();
engine.Compact("Data Source=; Case Sensitive=True; Max Database Size=4000;");
}
using (var dbConn = new SqlCeConnection(connection))
{
// Create tables, indexes, etc, and insert loads of data here
// Somewhere in the loading of data I get
// the "Database file is larger..." exception
}
Here is my question: the database file size at the point of the exception is a mere 368 MB (386,879,488 bytes, to be exact, according to the file properties). Do I need to add the Max Database Size setting to the Compact statement?
Any other ideas on what could be wrong?
The default value for Max Database Size is 256 MB, so yes, you would need to add this to the connection string if the file size grows beyond that.
As ErikEJ said, this is how the connection string has to be:
"Data Source=MyData.sdf;Max Database Size=256;Persist Security Info=False;"
where you can replace 256 with the size you need.
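For instance, a minimal sketch based on the code above (the test.sdf path and the sizes are taken from the question, not from a verified limit) that carries Max Database Size through both the main connection string and the Compact destination string looks like this:

using System.Data.SqlServerCe;

class CreateWithMaxSize
{
    static void Main()
    {
        // Max Database Size is specified in MB.
        const string connectionString =
            @"Data Source='C:\test.sdf';Max Database Size=4000;Max Buffer Size=4096;";

        using (var engine = new SqlCeEngine(connectionString))
        {
            engine.CreateDatabase();
            // Repeat the size option in the destination connection string
            // so the compacted file keeps the same limit.
            engine.Compact("Data Source=;Case Sensitive=True;Max Database Size=4000;");
        }

        using (var connection = new SqlCeConnection(connectionString))
        {
            connection.Open();
            // Create tables and load data here.
        }
    }
}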
I am trying to update the schema on SQL Server CE 3.5 using Castle ActiveRecord, and I have run into an error; the exception is at the bottom.
I believe this is not a new issue, but I could not find a workaround for it. When trying to update the schema using ActiveRecord I get an exception. It seems that the wrong GetSchema call is being made: it should be System.Data.SqlServerCe.SqlCeConnection.GetSchema() instead of DbConnection.GetSchema(). I got this information from here.
I am looking for a resolution, not just information that it is an MS problem. It is my problem right now, and possibly other people's as well.
Exception:
System.NotSupportedException: The method is not supported.
at System.Data.Common.DbConnection.GetSchema(String collectionName, String[] restrictionValues)
at NHibernate.Dialect.Schema.AbstractDataBaseSchema.GetTables(String catalog, String schemaPattern, String tableNamePattern, String[] types)
at NHibernate.Tool.hbm2ddl.DatabaseMetadata.GetTableMetadata(String name, String schema, String catalog, Boolean isQuoted)
at NHibernate.Cfg.Configuration.GenerateSchemaUpdateScript(Dialect dialect, DatabaseMetadata databaseMetadata)
at NHibernate.Tool.hbm2ddl.SchemaUpdate.Execute(Action`1 scriptAction, Boolean doUpdate)
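For reference, a minimal probe (the .sdf path is a placeholder) that issues the same restricted GetSchema call NHibernate makes should reproduce the exception on CE 3.5:

using System;
using System.Data.SqlServerCe;

class SchemaProbe
{
    static void Main()
    {
        using (var connection = new SqlCeConnection(@"Data Source=C:\test.sdf"))
        {
            connection.Open();
            try
            {
                // The restricted overload is the one NHibernate's schema update calls;
                // on SQL CE 3.5 it ends up in DbConnection.GetSchema, which throws.
                var tables = connection.GetSchema("Tables",
                    new string[] { null, null, null, "TABLE" });
                Console.WriteLine("GetSchema returned {0} rows.", tables.Rows.Count);
            }
            catch (NotSupportedException ex)
            {
                Console.WriteLine("Not supported: " + ex.Message);
            }
        }
    }
}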
SQL Server CE 4.0 implements GetSchema(). If you can't upgrade I guess you're SOL...
In the following Access VBA code, I am trying to write a value to a memo field called "Recipient_Display":
oRec1.Fields("RECIPIENT_DISPLAY") = Left(sRecipientDisplayNames, Len(sRecipientDisplayNames) - 2)
When the string contains 2036 characters, the write completes. Above this number I get the following error:
Run-time error '-2147217887 (80040e21)':
Could not update; currently locked by another session on this machine.
What is the significance of this number 2036 and is there a property I can adjust that will allow the above update to take place?
Are you sure that it is the string that is the problem? That message is common when working with an unsaved project and ADO.
BTW you should be using DAO with Access and VBA.