Organising frameworks for client, server, and plugins - objective-c

I am working on a large project that comprises a server, some plugins for controlling devices that can be loaded into the server process, and also clients that can connect to the server. I am looking for the best practice for structuring the framework(s) that will ultimately need to exist.
There are header files that are shared between client, server, and plugins, and some headers that are specific to each aspect of the system. Sometimes headers are only shared between say client and server, or server and plugin. Similarly, there is common code that can be shared between all three aspects of the project, and also code that would only be needed by one particular aspect.
When the project is finished, we will need to release a client application and plugin developer API for third parties to develop against.
I am not sure how to properly structure the Frameworks that would be needed to support this.
Do I need to have 2 separate frameworks? Or can I have 1 framework that includes all the headers and provides 2 separate dylibs?
If I do need 2 separate frameworks, what do I do with the header files that are shared between all aspects of the system? I do not want to copy them into each framework, since that invites versioning problems.
Would a 3rd headers-only Framework be a reasonable option?
Can anyone recommend a best practice for structuring this kind of thing with Frameworks on OS X?

Framework = library + headers for that library
Each framework only needs to include header files for the interfaces you want to expose. Even if common headers were used to build all three frameworks, you are not obliged to bundle them.
A three-framework approach would be fine, even if one of the frameworks bundles only common headers and no library at all. For example: if you install Qt on your Mac, you will see it split across many frameworks, but headers are never duplicated among them. There are also some frameworks that contain only headers and no code (QtScript.framework, for instance).
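To make that concrete, here is one possible on-disk layout (the framework names are hypothetical, and the internal Versions/ symlink structure of real framework bundles is omitted for brevity):

    MyProjectCommon.framework/
        Headers/            shared headers only, no dylib
    MyProjectClient.framework/
        Headers/            client-facing headers
        MyProjectClient     client dylib
    MyProjectPlugin.framework/
        Headers/            plugin developer API headers
        MyProjectPlugin     server/plugin dylib

The client application links against MyProjectClient.framework, third-party plugin developers build against MyProjectPlugin.framework, and both pick up the shared headers from MyProjectCommon.framework, so nothing is duplicated.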

Related

How to use shared library in ASP.Net Core MVC running on IIS

I'm looking into using ASP.Net Core MVC for some of my new projects. I work on a team of developers for a very large organization, and we each individually write a lot of small web apps. Due to the size of our organization, we have a lot of rules that we have to follow, and sometimes those rules change, completely out of our control. So this is what we have used in the past projects, all running on IIS:
ASP Classic - Each IIS root folder has a shared folder, containing a lot of commonly used .asp files. These files are mostly the same on each server, but can point to different databases for dev/test/prod environments. These library files are used for common things like authentication, authorization, encryption, sending emails, etc... Each application would be in a sibling folder to the shared folder, and include files like "..\shared\library.asp"
ASP.Net / MVC - The closest thing we could find was the GAC. Everybody says not to use the GAC, but for our purposes it does exactly what we need. We built a DLL library, and store it in the GAC of each web server. We then put local configuration (dev/test/prod environment specific stuff) information on the global web.config of each IIS server. Application specific information would be stored in that application's local web.config file.
The beauty of these two systems is that when things change, we can simply update the global libraries, and every application that depends on them picks up the new code without needing a recompile. We have many applications running on many web servers. This may not be ideal, but for our needs it works perfectly, considering the rules can change at a moment's notice and recompiling every application would be a huge ordeal. We just have to be sure never to introduce breaking changes into our libraries, which is simple enough. We have zero problems with how it works.
Now, on to ASP.Net Core. Is there an elegant way to do this? It seems like Core doesn't support the GAC, nor does it support web.config. Everything wants to use appsettings.json. Is there a way to create an appsettings.json at the root level of IIS, and have it set global variables like environment="dev", authdatabase="devsql" etc? And can we store a .Net Core/Standard DLL in a shared folder, and have every app load it with a path like "..\shared\library.dll"? The closest thing I could find to do this with .Net framework was the GAC, but I'm not really finding any answers for this with Core. I appreciate any help, thanks!
sometimes things change, and we can simply go update the global libraries, and every application that depends on them will adapt to the new code without needing a recompile
Note that this is exactly one of the reasons why GAC deployment is usually avoided. If you update a dependency and it happens to contain a breaking change of any kind, applications will start to break randomly, without you having any control over it.
Usually, when you update a dependency, you should retest every application that depends on it before deploying. That is why dependency updates (e.g. via NuGet) are deliberate choices you need to make.
.NET Core avoids this in general by never sharing assemblies between applications and by allowing different versions side-by-side. That way, you can update applications one by one without affecting others.
This is actually a primary reason why .NET Core was made in the first place: The .NET Framework is shipped with Windows, and is a global thing. All applications will always use the same framework version. So whenever Microsoft ships an update to the .NET Framework, they have to be incredibly careful not to break applications. And that is incredibly difficult because countless applications depend on all kinds of things in the framework. Even fixing a possibly obvious bug can break stuff.
With .NET Core and side-by-side dependencies, this is no longer a problem because updates will not automatically break applications that still depend on older versions. It is a developer’s explicit choice to update an application, shipping newer dependencies.
So you should actually embrace this and start to develop your applications independently. If you have common dependencies, consider creating (private) NuGet packages for those, so that applications can depend on them and so that you have a good way to update them properly.
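As a rough sketch of that workflow (the package name and feed path are hypothetical), even a plain network share can act as a private NuGet feed:

    # pack the shared library into a .nupkg and publish it to a folder feed
    dotnet pack MyOrg.Shared.csproj -o \\buildserver\nuget-feed

    # in each application, take a dependency on it from that feed
    dotnet add package MyOrg.Shared --source \\buildserver\nuget-feed

Each application then ships its own copy of the library, and picking up a new version becomes a deliberate update-and-redeploy step rather than a global change.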

Can Worklight Shell and Inner Applications be used to share common code across applications?

My Worklight project contains two separate hybrid mobile applications. The applications have in common a good deal of HTML, CSS and JavaScript. I need a way to encapsulate the common code so that it can be shared by both applications.
Can Worklight Shell and Inner Applications help me to share common code across applications?
If so, where can I find detailed documentation, example code or tutorials that use Shell and Inner Applications for this purpose?
If not, is there another way to share code across Worklight applications?
Yes, you can reuse code: you can add common CSS, JavaScript functions, and plugins to the Shell, and they will be applied to all the projects that are built on top of it.
For a hybrid app, you can add CSS, JS, and images under <<WLProject>>/components/<<ShellName>>/common
Any plugins related to Android or iOS go into the respective platform folders under <<WLProject>>/components/<<ShellName>>
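For example, the shared assets could be organized like this (ShellName and file names are placeholders, and the exact per-platform folder names depend on your Worklight version):

    <<WLProject>>/components/<<ShellName>>/
        common/
            css/shell.css
            js/shell-utils.js
            images/logo.png
        android/        Android-specific plugins and resources
        iphone/         iOS-specific plugins and resources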
Cheers !!
Shell development in Worklight could potentially solve this for you, but it adds considerable complexity, and I am not convinced the price is worth it.
You can review shell development in the Advanced Topics section of the IBM Worklight Getting Started web page.

How to organize mixed HTTP server + web client Dart project files?

I'm planning to create a pure Dart application where both the HTTP server and the web client side are written in Dart. Coming from Java and Eclipse, the ideal would be that I can open the whole project hierarchy in Dart Editor and be able to run the server which serves the client files and debug both sides of the app (server side with the Dart VM and client side with Dartium).
I've fired up Dart Editor, and after creating a simple command-line application as the basis for the server side I got confused by the project layout.
The direct server-side code files (web server bootstrap class, handler and filter classes) definitely go into the project's bin/ folder. Server-side dependencies go into the project's pubspec.yaml file.
The problem arises when the server has to access the client application files (.dart files, static page source, etc.) in order to serve them to the browser. The easiest solution would be to create a web folder inside the server project and put the client web files there, but this way (as far as I understood) server-side dependencies are inherited by the client because we are still in the same pubspec scope. I don't want this.
I thought about creating a client library in the project's lib/ folder and putting the web files there, but I don't know how good a practice it is to put a complete web application in there. I guess I would have to put HTML and other static client files into the asset/ subfolder of lib. I'm afraid that I'd lose the IDE's web application assistance this way.
I might also be able to put the client into a separate project, organize it like a Dart webapp project with its very own pubspec.yaml, and then make it a dependency of the server application somehow. I don't know whether the server could access web files in the other project for serving that way. This is probably the best way of doing it, because it provides a clean separation of the client and server files.
Can somebody enlighten me what's the correct way of doing this?
Some more explanation.
Say I'm going with the separate-project approach, as others are already suggesting in the answers, but I'd still like to run a server during development that can serve the client without any fancy hacks. The server has to access the client files in the other project. It doesn't matter whether it's JavaScript or Dart; the static files are there anyway. And during development I wish to serve the Dart files, since Dartium speeds up development significantly with its ability to run Dart directly.
With Java and Maven I can make the client package a runtime dependency of the server and simply serve the client files from the classpath. Does Dart support accessing a pub dependency's internal files in a similar way, or is the only option to put everything into the client's asset folder or go with the relative-path hack?
This is work in progress:
prepare a Dart app for server-side deployment
To improve the development experience you may use a symlink as a workaround so that you have the client files available in a directory of the server package.
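For example, from inside the server package (the paths are hypothetical):

    # expose the client package's web files inside the server package
    ln -s ../my_client/web client_web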
I suggest creating a feature request at http://www.dartbug.com/new for better support.
I would go for two separate projects.
You won't need to make the client package a dependency on the server package.
The server only needs to know where the directory with the build output of the client package is.
Which files to serve is usually requested by the client.
The client requests e.g. index.html, and all further dependencies (.dart, .html, .js, .img, .css, ...) are hard-coded in this file, so the server should not need to know any further details beforehand.
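A minimal dart:io sketch of that idea (the build output path is hypothetical, and content types and path sanitization are omitted):

    import 'dart:io';

    // Directory containing the client package's build output.
    const clientBuildDir = '../my_client/build/web';

    Future<void> main() async {
      final server = await HttpServer.bind(InternetAddress.anyIPv4, 8080);
      await for (final request in server) {
        // Map "/" to index.html; everything else to the matching file.
        final path = request.uri.path == '/' ? '/index.html' : request.uri.path;
        final file = File('$clientBuildDir$path');
        if (await file.exists()) {
          // pipe() closes the response once the file has been streamed.
          await file.openRead().pipe(request.response);
        } else {
          request.response.statusCode = HttpStatus.notFound;
          await request.response.close();
        }
      }
    }

A real server would also need to set Content-Type headers and reject paths containing ".." to prevent directory traversal.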
I'd suggest organising two separate projects. There are a few things you might profit from with this approach. The most obvious is that there is no coupling between client and server; you get a very clear separation. Another is that your server can evolve independently of the client. Dart applications need to be compiled to JavaScript, so in the end you will have a Dart server app serving JavaScript files (+ maybe Dart files if you decide to do so). Some of the packages that you use on the server side are not available in Dartium, and you don't want to have to deal with that dependency mess. Your server might also consist of more than just one app; maybe it will have a module in Java or some other language. Keeping the two projects separate gives you a lot more flexibility.
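For reference, with the Dart 1.x toolchain of that era the compile step was run inside the client project:

    pub build

which writes the JavaScript output to build/web/, and that directory is what the server ends up serving as static files.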

How to get RepositoryTool.jar in Apache ACE?

The Apache ACE documentation refers to a RepositoryTool.jar that can be used to manage the repository. But I could not find this tool in the Apache ACE distribution. Where can I download this tool?
The page you're referring to is part of the old site (the new one is located at http://ace.apache.org), and refers to tooling you probably shouldn't be using anymore: it was used before there were other ways to interact with the repository, mainly for development purposes.
Depending on your needs, you can use the repository in a number of ways:
If you need to programmatically read and write the repositories (remember that they're only XML), use the HTTP API available for that.
You can do the same thing from code, see Repository and its implementations.
If you want to edit 'meaningful' ACE data (such as linking distributions and targets), use the Client REST API. This is probably the option you want.

Should I use a single framework codebase for multiple sites or one for each site?

Here's the situation:
Multiple sites are built using a certain framework.
All sites are hosted on the same server.
Each site has its own database
Since all the sites are hosted in the same location and they all have the framework code in common, I could easily install the framework once on the server and have each site use the same files as their framework code.
This way I only have 1 framework installation that is used by all the websites.
The second option is to have each site work with its own installation of the framework.
Pros and cons of option 1:
Only have to maintain 1 codebase for the framework of multiple websites
Framework updates instantly apply for all the websites
Should 1 site have different needs of the framework or have code that's no longer compatible with the latest framework version, custom framework compatibility patches become required. (there is not always time or budget to keep legacy projects compatible with the latest framework version)
Pros and cons of option 2:
Seperate framework for each site to maintain
Framework updates have to be applied seperately for each site
Should 1 site have different needs of the framework, or have no budget to be made compatible with the latest framework update, we simply don't update that site's framework installation.
If it's really necessary, a site could quickly modify its framework to match the needs without interfering with other sites on the server.
So option 1 seems easier to maintain, while option 2 is much more flexible. I don't know what's most important.
Which of the 2 options is the overall the best choice? Or are there more options possible?
I approach this slightly differently.
I'd have a dir structure like this:
/sites/
/site1/
/site2/
[etc]
/framework/
/1.0/
/1.1/
/1.x/
[etc]
Then within each site have a /framework mapping which points to the version of the framework that site is using. That way each site's framework version is independent of the others: if one site needs a different framework version from the others (stuck on an old version, or needing a new version before the other sites have been tested with it), then you have control at that level of granularity. Equally, changing the codebase in /framework/nightly/ (for example) will "automatically" update all sites whose /framework mapping points to that bleeding-edge version of the codebase.
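On a typical setup that mapping can simply be a symlink per site (paths as in the layout above):

    # site1 pinned to 1.0, site2 tracking 1.1
    ln -s /framework/1.0 /sites/site1/framework
    ln -s /framework/1.1 /sites/site2/framework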
Personally, I will always have each site using its own framework codebase, or at least a shared codebase frozen at a set version. The problem with sharing the framework is that with each update you would have to test every site using that shared codebase to ensure it still works as expected. And what happens when the framework deprecates a feature you use in one of your sites? It could prevent you from updating, leaving you open to security issues.
Option 2 does give you more of a maintenance overhead but it's the safest approach in my opinion.
It depends on a couple of factors.
How many developers are working on the sites? Is the framework well supported and documented? Is there a better framework available (better in terms of documentation, overhead/memory footprint, and community support)? Which choice would look better 12 months or longer down the road?
Choice 1 is appealing for its consistency and familiarity.
Choice 2 is appealing for its potential, learning and future growth.