I have an odd situation after a VS upgrade from 2013 to 2015, outside of runtime.
App Type: MS Lightswitch HTML Client
DB Type: Oracle
Framework: 4.5
Story: I upgraded VS and replaced ODP.Net with the 2015 version. Works fine.
Then I converted my application. There were a lot of things to fix, but most were pretty easily remedied. I tested the application and it worked as expected, so I published to the test server and everything checked out. Success! So I thought.
I want to continue developing the site. As I make db changes, they need to be reconciled to the intrinsic db in my project.
After clicking 'Update Database' I see this. So far so good.
What's expected is that after I hit 'Finish', all changes to the selected table should be pulled into the lsml files. But this is what I get.
I've read a few places like "The given key was not present in the dictionary, what key? [closed]", but these all look like runtime remediations.
If I go back to the update screen and hit 'Previous', I get this.
I sifted through every freakin lsml file in a text editor looking for where the provider is assigned. No luck. I also created a new project to compare; nothing stood out. I also tried adding another data source, which works fine, so ODP.Net itself is not the issue. I am lost on what to do now. I searched all over the site and Google for every error message with various tags. At this point I reach out to you, or anyone that may know what this is about.
Thanks ahead of time!
Note for future users upgrading a VS LightSwitch project with an Oracle db.
Since a new version of ODP.Net is required (in my case the 2015 version), the provider name is going to change. To ensure LS knows the new provider, the data source lsml file needs to be updated. In my case I used Git to help out. This is how I resolved it.
Steps:
After converting your project and replacing ODP.Net with the current version:
1. Create a new data source using the new provider.
2. Save the project and re-open it. This will cause LightSwitch to recompile.
3. Open File Explorer and navigate to the ProjectName.server folder. In a text editor (I used Notepad), open the lsml files under ProjectName.server; there should be two lsml files (one for the pre-existing data source and another for the new one), or more if you have multiple sources.
4. Copy the connection properties of the new data source to a new temp file on your desktop.
5. Roll back the entire solution using Git or other source control.
6. Use the text editor to open the lsml file for the original data source.
7. Update the GUID for DataProviderName with the value from the temp file in step 4.
Note: The connection string GUID should be left alone, as it should match the GUID in your web.config file.
<DataService.ConnectionProperties>
  <ConnectionProperty
    Name="DataProviderName"
    Value="9d8fdbb9-xxxx-4787-xxxx-49831d34ad4b" />
  <ConnectionProperty
    Name="ProviderInvariantName"
    Value="Oracle.ManagedDataAccess.Client" />
  <ConnectionProperty
    Name="ConnectionStringGuid"
    Value="36e67aca-xxxx-41a7-xxxx-a4546761b30d" />
  <ConnectionProperty
    Name="ProviderManifestToken"
    Value="12.1" />
</DataService.ConnectionProperties>
Finally, reload the project and the changes should take effect, allowing you to once again update your data source.
Thanks
I am building a Java app that uses an SQLite database to hold most of its data. For the end-user, the database would be almost entirely read-only, with very occasional edits. I'll (theoretically) be displaying/distributing it through my GitHub page, so my question is:
What's the best way to load the database into GitHub? (I'm using IntelliJ with DataGrip.)
I'd prefer to be able to update the database when I commit/push, instead of having to overwrite the whole file. The closest question I can find is "How to include MySQL database schema on GitHub?", but there could potentially be hundreds or thousands of entries, so I can't just rebuild the tables when the user installs the app.
I'm applying for entry-level developer jobs, and this project is going to be my main portfolio piece during job-hunting. I'm trying to make sure it is not only functional but also makes a good impression. Any help is (very) greatly appreciated.
EDIT:
After moving my .db file into the folder connected to GitHub (same level as my src folder), apparently I can now commit/push it with the rest of my files. How do I make sure that the connection from my Java code to the database stays valid once it is loaded onto another user's system? Can I just stick with
connection = DriverManager.getConnection("jdbc:sqlite:mydatabase.db");
or do I need to rework the path?
Upon starting, if your application can't find a corresponding SQLite database file, have it create one. Then do an initial load of your tables from CSV, JSON, or XML files, as in the sketch below.
You can upload these files to Git, as they are text formats.
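A minimal sketch of that bootstrap in Java, assuming the sqlite-jdbc driver is on the classpath; the items table, its schema, and the items.csv seed file are hypothetical placeholders:
import java.io.BufferedReader;
import java.io.FileReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class DbBootstrap {
    public static Connection open() throws Exception {
        boolean firstRun = Files.notExists(Paths.get("mydatabase.db"));
        // Relative path: SQLite resolves it against the process working directory.
        Connection conn = DriverManager.getConnection("jdbc:sqlite:mydatabase.db");
        if (firstRun) {
            try (Statement st = conn.createStatement()) {
                st.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)");
            }
            // Seed the new database from a CSV file shipped with the app (format: id,name).
            try (BufferedReader in = new BufferedReader(new FileReader("items.csv"));
                 PreparedStatement ps = conn.prepareStatement("INSERT INTO items VALUES (?, ?)")) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] cols = line.split(",", 2);
                    ps.setInt(1, Integer.parseInt(cols[0]));
                    ps.setString(2, cols[1]);
                    ps.executeUpdate();
                }
            }
        }
        return conn;
    }
}
As for the path question: "jdbc:sqlite:mydatabase.db" is resolved against the directory the app is launched from, so it stays valid on another user's system as long as the database (or the seed files) ship next to the app; otherwise, build the path from a known location such as the user.home system property.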
We recently upgraded from TFS 2010 to TFS 2015. Everything appears to be fine post-upgrade, but we are getting the error "The item is locked in workspace (null);(null)." on some source control files. It looks like we have some orphaned locks that need to be tracked down and cleaned up, but the tbl_Lock table does not exist in the database, so the following select query won't work:
select * FROM tbl_Lock l
LEFT JOIN tbl_PendingChange pc
ON l.PendingChangeId = pc.PendingChangeId
WHERE pc.PendingChangeId IS NULL
Does anyone know how to detect and remove these locks in TFS 2015?
I also installed the TFS power tools, and neither Visual Studio 2015 nor the power tools are picking up the locks.
Updated:
BTW, when I run the SELECT query to find out where PendingChangeId is NULL, I get back no rows. I think the trick is the LEFT JOIN: pc.PendingChangeId would be NULL when tbl_PendingChange had no record for the PendingChangeId on tbl_Lock (and thus the lock was orphaned). So I'd still need to know where the PendingChangeId should normally be joined to in TFS 2015, to identify which files have a bad lock. (Or where a workspace no longer exists, which may be another possible source for the issue.)
And I also still need to know how to clean up those bad locks. I'd prefer to do this using the tools, either via the GUI or the command line, but I could also do it programmatically using the API or the TFS Object Model for TFS 2015.
I really would rather only touch the database directly as a last resort, and I would also rather use tf vc destroy on the item as a last resort, since that would wipe out all history on the files.
Update 2
Aha! I think I found a way to identify the files, and it looks like my thinking about what happened may be correct. Unfortunately, I had to probe the database using a READ UNCOMMITTED query to find the information. I couldn't get at this information programmatically or using the tools. (They all showed or acted like the file is not checked out.) The query that I used on TFS 2015 was:
select pc.* from tbl_PendingChange pc
left join tbl_Workspace ws on pc.WorkspaceId = ws.WorkspaceId
where ws.WorkspaceId is null
This returned the three files that have the (null);(null) lock on our database, because the WorkspaceId listed on tbl_PendingChange does not exist anymore on tbl_Workspace.
How did this happen? Our CI server uses temporary TFS workspaces. I think what happened after the upgrade is that our CI server went to check out a file and apply an update to it (for example, to increment version numbers as part of the build process). It checked out the file, but failed to apply the update. (Our tools like working with server workspaces, but it may have ended up with a local workspace, and thus the file was still checked in locally but checked out on the server, so the change to the file couldn't be applied.) The code that we are using performs a workspace.Delete operation when the process completes, so the workspace was deleted, even though it still had the file checked out! This created an orphan record in tbl_PendingChange that isn't linked to any workspace, and thus the file is still locked with pending changes. But the GUI and tools aren't seeing it as such, because they don't realize the pending change's workspace is nonexistent.
So this brings me back around to how do I fix this? If someone knows of a way to get at these orphaned pending changes, I'd appreciate it. I tried using:
// Connect to the project collection and get the version-control service.
TfsTeamProjectCollection tfsTeamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(szProjectUri));
VersionControlServer versionControlServer = tfsTeamProjectCollection.GetService<VersionControlServer>();

string[] items = new[] { ... server item path ... };

// Query pending sets for those items across all workspaces and owners (nulls = no filter).
PendingSet[] queryPendingSets = versionControlServer.QueryPendingSets(items, RecursionType.None, null, null);
PendingSet[] getPendingSets = versionControlServer.GetPendingSets(items, RecursionType.None);
but these aren't finding the orphans.
Update 3
I finally installed Team Foundation Sidekicks 2015 and gave it a try, the Status Sidekick specifically, but then the other tools as well. It's finding pending changes, but not the orphaned ones.
You can use Team Foundation Sidekicks to search for and undo locks with the following steps:
1. Install the tool and launch it.
2. Select the TFS server to connect to.
3. Select "Tools\Status Sidekick".
4. Set the "Search criteria" to the information you want.
5. Click the "Search" button.
6. Select the locked file and click the "Unlock lock" button.
You can use the command below to undo the pending changes:
tf undo "file_path" /workspace:workspace_name
Or you can just use the command below to delete the old workspace:
tf workspace /delete /server:your_tfs_server workspace;username
From the Visual Studio 2015 GUI:
File -> Source Control -> Advanced -> Workspaces...
In the dialog that comes up, check "Show remote workspaces"; the locked workspace should show up in the window. Then select it and click "Remove".
For details, please check this blog; for more ways to resolve this, you can refer to the similar question: What do you do if the file in TFS is locked by someone else?
Update:
Regarding the SQL query: it's looking for pc.PendingChangeId IS NULL. You can query tbl_PendingChange under the collection database in a similar way. However, this is not a recommended approach, since operating directly on the TFS database is discouraged.
The following command cleared up the pending changes that were orphaned:
tf vc destroy <itemspec> /startcleanup
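For reference, a concrete invocation looks like this, with a hypothetical server path as the itemspec:
tf vc destroy "$/MyProject/Build/VersionInfo.cs" /startcleanup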
After running this command, the file was able to be added back to TFS, and it could be checked in and out and edited as normal. Running the query:
select pc.* from tbl_PendingChange pc
left join tbl_Workspace ws on pc.WorkspaceId = ws.WorkspaceId
where ws.WorkspaceId is null
also showed that the pending change record related to this file was gone as well.
Microsoft's documentation on this command can be found at https://msdn.microsoft.com/en-us/library/bb386005.aspx. Before using this command, you should review the documentation carefully and be sure to understand the consequences of using it.
Because this command permanently removes files, and potentially all their history, from TFS (and does so recursively), you need to take precautions and be absolutely certain that you are targeting the command correctly. So before using this command, I would recommend taking the following additional precautions:
Stop all user and external access to TFS and any other software that may be running from the machine.
Make sure to run a full backup of TFS and any other databases located on the machine.
If you can, take a point-in-time snapshot of the server.
That way if something goes horribly wrong, you will have one or more points to fall back on.
I'm having a bit of an issue with EF6 in VS2013. I had to modify the connection string for the database-first model in a Web Application project, so I followed the advice in the best answer for "How should I edit an Entity Framework connection string?" and deleted it from my Web.config file.
At first it seemed to work fine: I deleted the connection string, then from the Entity Designer I ran "Update Model from Database" and re-created the connection string, but then my build failed with multiple errors similar to:
Public Sub New() has multiple definitions with identical signatures
After some digging I figured out that when I re-created the connection string, EF created a second Model.Context.vb file named Model.Context1.vb, and both are still referenced somewhere. Since then I've opened every file in the folder containing my EF model with Notepad, searching for a reference to Context.vb or Context1.vb, and have come up empty. If I remove either file, my build fails stating the file cannot be found. As a workaround I opened the Context.vb file and removed all the code so there are no duplicates, but I'd like to fix it properly by removing the reference to the file and deleting it, if anyone knows how I can go about doing that.
I got it sorted out. After attempting to restore an older version of the EF files from source control and still running into the same issue, I realized the reference was probably in a project file.
In [projectName].vbproj I found these two entries:
<Compile Include="Data\schedulerModel.Context1.vb">
  <AutoGen>True</AutoGen>
  <DesignTime>True</DesignTime>
  <DependentUpon>schedulerModel.Context.tt</DependentUpon>
</Compile>
<Content Include="Data\schedulerModel.Context.tt">
  <Generator>TextTemplatingFileGenerator</Generator>
  <DependentUpon>schedulerModel.edmx</DependentUpon>
  <LastGenOutput>schedulerModel.Context1.vb</LastGenOutput>
</Content>
I removed the first entry and dropped the 1 from the Context1.vb file name in the second, then opened the project and ran a rebuild without issue. The corrected entry is shown below.
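For clarity, here is the second entry reconstructed with the 1 dropped:
<Content Include="Data\schedulerModel.Context.tt">
  <Generator>TextTemplatingFileGenerator</Generator>
  <DependentUpon>schedulerModel.edmx</DependentUpon>
  <LastGenOutput>schedulerModel.Context.vb</LastGenOutput>
</Content>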
I had the same problem but a slightly different resolution. For whatever reason, updating the edmx file one time seemed to remove a seemingly important line from the project file: the line reading <LastGenOutput>MyEntityModel.Context.cs</LastGenOutput>.
I re-added the line to my project file and updating the model didn't result in any more duplicate context files. The whole block looked like the following when fixed:
<Content Include="MyEntityModel.Context.tt">
<Generator>TextTemplatingFileGenerator</Generator>
<DependentUpon>MyEntityModel.edmx</DependentUpon
<LastGenOutput>MyEntityModel.Context.cs</LastGenOutput>
</Content>
Just thought I'd add my findings to this, as it has been driving me to distraction for a few weeks: every time I updated my model from the database, I got "duplicate" context files, designer files, etc., and then hundreds of errors. However, the new stored procedure or table or whatever I had added was only present in the new "Context1" files, not the originals, so when I wound it back I had to go through the same process again, and so on.
Then finally a light went on when I thought of ... Source Control! I use TFS, and I found that unless ALL model-related files (that's the Context, Designer, and Service files) are checked out before doing the update, EF generates new versions of almost everything, presumably because it can't modify one of the files, which are read-only due to source control.
The key then is to fix the project file as stated in the answers above, before getting everything checked out and THEN doing the update. If you don't get that tag right in the proj file, it goes and does it all wrong again even though everything is checked out.
Hope this helps - my sanity is slowly returning anyway.
Ade
We have custom content types that were created as extensions of the ATTypes; two of them extend the ATFile type and one extends the ATImage type. We recently upgraded from Plone 4.2 to Plone 4.3.2, and just discovered we are not using blob storage at all. No wonder our Data.fs is HUGE. So, I have been trying to migrate these custom types.
I have followed all of the steps explained in this example and the product's notes from PyPI, these Plone instructions, and used the example from the PyPI page for archetypes.schemaextender. (Sorry, since I'm still a noob my reputation won't let me post more than 2 links.)
In the end, I created an extender script that just extends the ATFile type, changing the FileField to a BlobField, roughly along the lines of the sketch below. It seems to be working for new items. I can add a new CustomFileType and it appears to be uploading the file to blob storage, and my new upload field is showing (I changed the description as a quick way to verify which one it was using).
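For reference, a minimal sketch of that kind of extender, assuming archetypes.schemaextender and plone.app.blob are installed and that the extender class gets registered as an adapter in ZCML; the adapted interface here (IATFile) is a stand-in, and in practice it would be your custom type's own marker interface:
from archetypes.schemaextender.field import ExtensionField
from archetypes.schemaextender.interfaces import ISchemaExtender
from plone.app.blob.field import BlobField
from Products.ATContentTypes.interfaces import IATFile
from zope.component import adapts
from zope.interface import implements

class ExtensionBlobField(ExtensionField, BlobField):
    """A BlobField that schemaextender can attach to an existing type."""

class CustomFileExtender(object):
    implements(ISchemaExtender)
    adapts(IATFile)  # stand-in: use the custom type's marker interface here

    fields = [
        ExtensionBlobField('file',
            primary=True,  # replace the primary file field with a blob-aware one
        ),
    ]

    def __init__(self, context):
        self.context = context

    def getFields(self):
        return self.fields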
However, I am having a problem migrating all existing content items to move the binary files over to blob storage. I tried the generic migrate() script, then I created my own migrator and walker as suggested in the above resources. It doesn't seem like it is doing anything, though. When printing results for each item it tries migrating, I do see this returned for each item:
DEBUG ATCT.migration Migrating /site/path/to/custom/file/filename.ext (CustomFile -> Blob)
When I navigate to the custom file type in the site, where it usually shows the link to the file, it is just empty. Then going to edit, it treats it as if there is no file there. As a check, I disabled the extender, restarted, and reloaded the custom file. The file was there now. So it looks like the script I am running just isn't moving that file over to where it should be now.
I feel like I am missing something simple, and it is right there, but I can't seem to find it. All of this is learn as I go and a bit over my head, so hopefully someone can easily set me straight.
If I need to provide any additional information leave a comment and I will try to provide what you need.
UPDATE
I used the Red Turtle objects as examples to migrate my custom types, as suggested by keul. I still was not able to get the file to migrate to blob storage within the type itself. So, I tried a different approach: I created a new custom type "CustomBlob" that mimics the setup of my CustomFile type, and extended only this new type to be blob-aware. Then I migrated the CustomFiles to CustomBlob, did a complete clear and rebuild, and packed the ZEO. The migration seemed to work for the most part: the blobstorage grew by an expected amount, and the new types worked. However, the Data.fs didn't go down in size. I would have thought that the binary files that were stored in Data.fs would be removed during the migration. Am I understanding this incorrectly? How can I remove these files so the Data.fs size goes down appropriately?
Not sure if this is the best solution, but here is how I was able to get this to work.
I created temporary content types parallel to each type (for CustomImage I made CustomImageBlob, and so on). I made the new types blob-aware only, and migrated all types to their parallels; a sketch of the migration pattern follows below. Then I enabled the extender for the original types to make them blob-aware, and migrated back. It is a little redundant and time-consuming, but I just could not get the files to migrate to blob storage when migrating a type to itself.
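A rough sketch of one leg of that migration (CustomFile to CustomFileBlob), assuming the Products.contentmigration package that plone.app.blob's own migrations build on; the type names match the ones above, and the 'file' field name is an assumption:
from Products.contentmigration.archetypes import ATItemMigrator
from Products.contentmigration.walker import CustomQueryWalker

class CustomFileToBlobMigrator(ATItemMigrator):
    src_portal_type = 'CustomFile'
    src_meta_type = 'CustomFile'
    dst_portal_type = 'CustomFileBlob'
    dst_meta_type = 'CustomFileBlob'

    # Copy the binary payload into the blob-aware field on the new object.
    def migrate_data(self):
        value = self.old.getField('file').get(self.old)
        self.new.getField('file').getMutator(self.new)(value)

def migrate(portal):
    walker = CustomQueryWalker(portal, CustomFileToBlobMigrator, full_transaction=True)
    walker.go()
    return walker.getOutput()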
Providing this as the best answer so far in case it helps someone else, or might encourage someone to find a better solution. Thanks for the tip, keul; it definitely helped me get to this solution.
Every time I want to change some properties in some class I get the following error messages:
Microsoft Cursor Engine [-2147217864]
Row cannot be located for updating. Some values may have been changed since it was last read.
ADODB.Recordset[-2146825069]
Operation is not allowed in this context.
How can I solve them?
Even though this question was posted a long time ago:
Now and then this error occurs in my projects, too.
Every time I try to edit specific elements in Enterprise Architect projects, I get exactly the same error messages. The only solution to this is to delete the element completely and create it again.
#TomO:
When you are importing a package, is this from XMI or are you importing a source code directory?
I import only via XMI file.
What are you using as a repository?
I'm using a PostgreSQL-server-based repository, which I access via an ODBC driver.
In your ODBC Data Source Configuration do you have "Return matched rows instead of affected rows" and "Allow big result sets"?
Could you specify where I can find these options? Perhaps this is outdated, because I can't find any of these options under the Options/Datasource menu in my ODBC driver.
If you are importing from XMI, are you stripping the GUIDs on import? This is always a good idea if you are making a copy of an existing folder in your model, as having two elements with the same GUID is not ideal ;-)
I strip GUIDs when I'm exporting and again when I'm importing XMI files.
I would really appreciate any help concerning this topic.
If possible, I might need a little more information. When you are importing a package, is this from XMI or are you importing a source code directory? What are you using as a repository? Given the error, I am assuming it is not the local EAP file.
In your ODBC Data Source Configuration do you have "Return matched rows instead of affected rows" and "Allow big result sets"?
If you are importing from XMI, are you stripping the GUIDs on import? This is always a good idea if you are making a copy of an existing folder in your model, as having two elements with the same GUID is not ideal ;-)
I have also noticed that you asked this on Apr 14th; sorry it has taken me so long to find your request. I hope this helps!
Are you accessing your EA repository as a cloud repository? If so, you could try switching to accessing the repository as an ODBC data source, and this problem might be solved. I think it is a bug in the Sparx Enterprise Architect cloud service.