Since EPiServer 7, upgrading to a newer version has involved:
Updating all EPiServer.* NuGet packages
Running PM > Update-EPiServer - to upgrade the local database
Running PM > Export-EPiServer - to produce a set of database upgrade script files that can be run on other servers.
However, if only the upgraded EPiServer solution is built and deployed by Octopus Deploy, the database will not be upgraded, meaning the site will not run.
Currently I run the EPiServerPackage manually on a server in each of our environments after a deployment.
I'm trying to decide on the cleanest way to include the /EPiUpdatePackage folder and contents that running the Export-EPiServer command produces (a rough post-deployment script sketch follows this list) so that it will be:
Checked into source control
Turned into a NuGet package on the build server
Deployed by Octopus Deploy so that it can be remotely executed on the server the script is deployed to
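For context, the kind of Octopus post-deployment step I have in mind would look roughly like this. It is only a sketch: the update.bat entry point and the Octopus installation-directory variable are assumptions on my part, not something produced by the steps above.
# PostDeploy.ps1 - rough sketch of an Octopus convention script.
# Assumes the EPiUpdatePackage folder is packed inside the deployed package
# and exposes an update.bat entry point (verify against your exported folder).
$installDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
$updateScript = Join-Path (Join-Path $installDir "EPiUpdatePackage") "update.bat"
if (Test-Path $updateScript) {
    Write-Host "Running EPiServer database update: $updateScript"
    & $updateScript
    if ($LASTEXITCODE -ne 0) { throw "EPiServer database update failed ($LASTEXITCODE)" }
} else {
    Write-Warning "EPiUpdatePackage update script not found; skipping database upgrade."
}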
As per Eric Herlitz's suggestion, I have simply used <episerver.framework updateDatabaseSchema="true"> in the web.config transforms for the environments I wish to have the database automatically upgraded on.
I'm unsure if this will present a problem if the SQL connection string user does not have the required level of permissions. However, in my case this is working correctly.
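If you share the same permissions concern, a quick pre-flight check along these lines can show what the connection-string login is actually allowed to do. This is only a sketch: the server, database, and credential values are placeholders, and it assumes the SqlServer module's Invoke-Sqlcmd is available.
# Check whether the login the site uses can perform schema changes
$query = @"
SELECT IS_ROLEMEMBER('db_owner')                                    AS IsDbOwner,
       HAS_PERMS_BY_NAME(DB_NAME(), 'DATABASE', 'ALTER ANY SCHEMA') AS CanAlterSchema;
"@
Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Database "EPiServerDb" `
    -Username "episerver_app" -Password "<password>" -Query $query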
We are using an on-premises installation of Microsoft Visual Studio Team Foundation Server, version 16.131.
We have a Development continuous integration build definition that includes running some unit tests on a very old application. As such, the unit tests require a database to run (I know, not an ideal situation for unit tests...).
One of the artifacts of the build is a DACPAC of the database.
I'd like to deploy any database changes from that DACPAC as part of the build definition, before running the unit-test steps. That way, any tests that are added or changed and depend on database changes will (hopefully) pass.
Any ideas if this is possible, and if so, how can I publish from the DACPAC within the build definition?
We are using a PowerShell step in the build/release pipeline.
In the PowerShell script you call SqlPackage:
&"C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"Dacpac_Artifact\DACPAC\DB.dacpac" /Profile:"Dacpac_Artifact\DACPAC\FileWithPublishPrife.publish.xml"
You could also have a look at this Microsoft DevLabs task; it also contains a sub-task to deploy a DACPAC remotely, so you could deploy to another machine (or use localhost to deploy on the build server itself if you want). There are also other marketplace tasks you could use - just navigate to the store and search for "sql" and you'll see a dozen.
IIS Web App Deployment Using WinRM
Short Question
Is there a way to programmatically download all of the Nuget packages needed to build/publish a solution and output them into a single directory that can be moved to a local Nuget repository?
Some Background
Where I work, most development is done on an air-gapped network (no internet). On a recent project, I was able to develop on our internet-enabled network. This happens to be the first application developed on the internet-enabled network and the first ASP.NET Core app we've ever developed. The solution builds, runs, and publishes just fine on the internet-facing network. I am now trying to move the solution to the air-gapped network, but I am having issues getting all of the dependencies moved over.
At first, the solution wouldn't build because of missing ASP.NET Core NuGet packages, so I copied ALL of the NuGet packages from the local cache on the machine that I used to develop the application to the local NuGet repository on the air-gapped network. Now the application builds, but I can't seem to publish the web project (ASP.NET Core). I'm getting 25+ errors along the lines of:
Unable to find package runtime.any.System.Diagnostics.Tools. No packages exist with this id in source(s)...
Unable to find Microsoft.NETCore.App with version (>= 2.1.6)...
Unable to find...
I'm also getting a runtime error "This page cannot be displayed" when I try to run from Visual Studio (using IIS Express), but I'm not sure if that's a related issue. The unit/integration tests run fine.
I could try manually downloading each NuGet package from NuGet.org and moving them over to the air-gapped network, but it takes hours to get things moved from one network to another. Is there any way I can automate the retrieval of all NuGet packages required to build/publish a solution, so that I can make a single transfer from one network to the other instead of moving what I have and waiting to see what breaks? Preferably I'd like an exe or a PowerShell script that could look at a .sln file and drop all the necessary NuGet packages into a specified directory.
On your internet-enabled dev environment, you can browse to the root directory of your project and use:
dotnet restore --packages .\packages\
This will use the directory called packages as the local cache for the solution, and all NuGet files will be copied to it.
You can then include the same switch in your build to ensure that the local cache is used:
dotnet build --packages .\packages\
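To get those restored packages onto the air-gapped feed, one approach is to collect the .nupkg files that the restore cache keeps alongside the extracted packages and carry those across. This is only a sketch; the folder paths and share name are placeholders.
# Gather the .nupkg files from the solution-local cache into a flat folder
New-Item -ItemType Directory -Force .\transfer\nupkgs | Out-Null
Get-ChildItem .\packages\ -Recurse -Filter *.nupkg | Copy-Item -Destination .\transfer\nupkgs\
# On the air-gapped side, push the flat folder into the local repository
# (or simply register the folder itself as a package source)
nuget init .\transfer\nupkgs\ \\airgap-share\nuget-feed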
I answered two very similar questions a few weeks ago. Have a look at these answers to see if they help
Command to download a Nuget package with all dependencies to a folder
Is it possible to create a cache of nuget packages for computers without internet?
I have created a sample ASP.NET 5 application (pretty much the example one from New Solution) and pushed it to Git hosted on Visual Studio Team Services (formerly Visual Studio Online). I want to set up continuous integration to an Azure Web App (formerly Azure Website). I have tried to set it up from the Azure portal itself; it did create a new build definition, but it fails to build ASP.NET 5. I have found a guide on how to do this, but it never really worked for me. I get errors like these, e.g.:
Error parsing solution file at C:\a\1\s\Frontend\src\Frontend\Frontend.xproj: Exception has been thrown by the target of an invocation.
Predefined type 'System.Void' is not defined or imported
Another problem is that it seems to take a lot of time to install dnvm, get packages, etc. So, all in all, it's a pain to make it work.
So are there real alternatives for this, or, more importantly, is Microsoft planning to implement something like "Build ASP.NET 5" and "Deploy ASP.NET 5 to Azure" tasks to make it as easy as it is with current ASP.NET 4 apps? I really hope that will be an option soon, since it's quite impossible to work with the current build system.
For "System.Void" issue, please check the runtime version in "global.json" file and make sure it is consistent with the dependencies in "project.json" file.
For the dnvm install issue: since the ASP.NET 5 runtime environment isn't installed on the VSTS hosted build agent for now, and different users may use different runtime versions, it requires the user to add a "PreBuild" PowerShell step that reads the runtime version from the "global.json" file and then installs it. If you can make sure that you will always use only one version (for example, 1.0.0-rc1-update1), you can deploy your own build agent and install 1.0.0-rc1-update1 on it; then you can skip the dnvm installation during the build process.
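A minimal version of that PreBuild step might look like the following. It is a sketch only: it assumes dnvm is already available on the agent's PATH, that global.json sits next to the script, and error handling is omitted.
# Read the runtime version from global.json and install it via dnvm
$globalJson = Get-Content "$PSScriptRoot\global.json" -Raw | ConvertFrom-Json
$version = $globalJson.sdk.version   # e.g. "1.0.0-rc1-update1"
Write-Host "Installing DNX runtime $version"
& dnvm install $version -r clr
& dnvm use $version -r clr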
Take a look at http://riffer.eu/wordpress/?p=112. There I have a solution for ASP.NET Core RC1.
Amazingly, you need only two PowerShell scripts - no compiling / Visual Studio necessary.
This is edited from the OP. This is a VB.NET 4.0 WinForms application. There is a MySQL data source involved with this project. The target CPU is set to Any CPU.
Problem: When running this application on any computer that has VS 2010 installed along with the MySQL connector, it runs flawlessly. When installing on a virgin system (i.e. no developer environment installed), but that machine does have .NET Framework 4.0 installed and a MySQL server without the connector installed, the application fails immediately. So to fix the issue I install the MySQL connector MSI. This immediately fixes the issue on the client system and it runs.
The problem is that, as you can see below from my installer setup, the two needed DLL files for MySQL are actually included in the installation package, so they should not need to be installed separately. So why is it that, using the installer from the images, I still need to install the MySQL connector? Any ideas?
Below is a screenshot of the references the program uses, and from what I believe I do not need to deploy any of those DLL files with my application other than the two MySQL DLL files. So why is this failing? Below are images showing the project references as well as the installer files that are being installed in the application folder. As shown in the image, the two MySQL DLL files are to be put in the application folder. There is also a screenshot showing each DLL's properties for the application folder.
You answered your own question.
but that machine does have .NET Framework 4.0 installed and a MySQL server without the connector installed, the application fails immediately.
You don't need to install the connector MSI package, but you do need to include the two DLL files in the application's directory. Any time you have a dependency, you need to deploy it with your application.
Edit: solution quoted from my comment:
From your update it sounds like you have a version mismatch on the assemblies, and the references are set to Specific Version = True. Check the version number of the assemblies on your developer machine in the output directory, and check the version you are installing on the client system. (You can just hover over the DLL to read the version in the ToolTip.) You can try to set Specific Version to false by right-clicking your reference and selecting Properties, or simply ensure you deploy the same version of the assemblies. Your program is looking for the versions it's compiled against.
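To confirm the mismatch theory quickly, a comparison like the one below works; the paths are placeholders for your build output directory and the client's install folder.
# Compare the assembly version of the MySQL DLL in the dev output and on the client
$devDll    = "C:\Projects\MyApp\bin\Release\MySql.Data.dll"
$clientDll = "C:\Program Files (x86)\MyApp\MySql.Data.dll"
[System.Reflection.AssemblyName]::GetAssemblyName($devDll).Version
[System.Reflection.AssemblyName]::GetAssemblyName($clientDll).Version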
I have developed software using Access and SQL Server 2008 and am now trying to make a setup file.
What would be a possible way to do this?
I tried a Visual Studio 2008 Setup and Deployment project, but after installing from the MSI file and running the software it shows an error:
"Microsoft.ACE.OLEDB.12.0 provider is not registered"
Please help.
You have to do a few things to set up your application:
Install the .NET framework if required
Install SQL Server 2008 if required
Install your application
Define/configure the connection from your application to the SQL Server instance
Create your database/schema in the SQL Server instance.
Ignoring the SQL Server problem for a moment, the easiest way to deal with the .NET Framework and installing the application would be to use a setup project, which should be available from within VS.NET under Other Project Types | Setup and Deployment. There are hooks in there to give you options for installing dependencies, of which the .NET Framework is one.
OK, you now have a tool to create setups (there are several others; e.g. I'm currently using WiX, which I like so far - it is very capable but can rapidly become complex). The problem now is that the installer you need to build will depend on how and where your application is to be deployed. Do you want to ship a complete, self-contained application on a disc? Is it to be downloaded internally within a business or distributed over the internet? Each of these suggests a different set of packages - at one end "everything", at the other the smallest possible pieces pulled down as required - or perhaps even a different packaging method (e.g. ClickOnce).
Next up is SQL Server. You can get a redistributable package for SQL Server 2008 Express, so distributing it is not a problem; however, you have to determine whether the user has an existing instance they want to use or whether they want to install a new one.
Once you've got an installed instance, you need to be able to create, and then maintain (update), the database/schema within that instance. I suggest you do that using code (see here: How to create "embedded" SQL 2008 database file if it doesn't exist?). Which brings us to another point: you not only have to be able to install the application the first time, but you also need to make sure that a) you provide a means to uninstall the application and b) you can neatly do an upgrade in place.
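For illustration only (the linked answer does this from application code; here it's sketched as an installer-side or first-run script, and the instance and database names are placeholders), the create-if-missing idea boils down to something like:
# Create the database only if it does not already exist on the local Express instance
sqlcmd -S ".\SQLEXPRESS" -E -Q "IF DB_ID('MyAppDb') IS NULL CREATE DATABASE [MyAppDb];"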
I hope there are enough pointers there to get you moving. In terms of testing this, virtual machines are your friend: they give you the capability to create multiple environments in which to test your deployment, and the ability to quickly roll back to a clean environment to test again. It's virtually impossible to properly test an installer on a dev box (I've found this out the hard way), as it will already have all the dependencies for your application installed.
Pick your tools and that should let you ask more focused questions.