Best way to use ActiveRecord models in multiple applications - ruby-on-rails-3

I have 2 applications that are going to be built (possibly a 3rd, an API), and all of them will use the same database. What is the best way to use the same models across all applications?
Also, what are some of the caveats you have experienced or foresee with this approach? Looking for the best solution to this.

Rails, as opinionated software, prefers a single app rather than shared models. I've tried it both ways. Are you going to just have copies of the models that can get out of sync? Are you still planning to use Rails' migrations? When you have multiple apps, migrations become difficult. Do you use just one app for migrations? Then you lose the ability to check in migrations along with the code they refer to. Automating builds can become very difficult. You can possibly find a way to share migrations too, but that requires some source code management sleight-of-hand which ultimately makes it more difficult to do separate apps and still get everything Rails has to offer. At that point you may want to look into Sinatra.
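For reference, the usual way people do share models across apps is to extract them into a gem (or Rails engine) that every application depends on. A rough sketch of that approach follows (the gem name and models are invented for illustration); note it still leaves you with the migration-syncing problem described above:

    # shared_models/lib/shared_models.rb -- a hypothetical gem holding the shared models
    require "active_record"

    module SharedModels
      # Maps to the shared "customers" table by default.
      class Customer < ActiveRecord::Base
        has_many :orders, class_name: "SharedModels::Order"
      end

      class Order < ActiveRecord::Base
        belongs_to :customer, class_name: "SharedModels::Customer"
      end
    end

    # In each application's Gemfile, point at the same gem:
    #   gem "shared_models", path: "../shared_models"   # or a git source
    #
    # Migrations still have to live (and run) somewhere, which is exactly the
    # synchronization headache described above.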
On the other hand, there's a lot of organizing you can do in a single app that keeps the model domain shared, yet separates the controllers, such as using namespaces or engines. I'd recommend those techniques.
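A minimal sketch of that namespacing approach in a single Rails 3 app (the Admin/Api split and the Customer model are invented for illustration):

    # config/routes.rb -- one app, two logical "applications" as namespaces
    MyApp::Application.routes.draw do
      namespace :admin do
        resources :customers
      end
      namespace :api do
        resources :customers, only: [:index, :show]
      end
    end

    # app/controllers/admin/customers_controller.rb
    class Admin::CustomersController < ApplicationController
      def index
        @customers = Customer.all   # same model, admin-facing views
      end
    end

    # app/controllers/api/customers_controller.rb
    class Api::CustomersController < ApplicationController
      def index
        render json: Customer.all   # same model, JSON-only interface
      end
    end

The models, migrations, and schema live in one place; only the controllers and views are split.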

Related

Can you switch programming languages with the same database

Say I have a python/Django website fully built. I now want to re-create that website with Ruby on Rails or some other language. Is this possible to keep the same database? Or would I have to transfer data between the two databases?
Yes, it is possible to keep the same database in the new application that you used in the old. You can even have multiple applications use the same database at the same time.
However, you should not just port everything over query for query. There are likely subtleties in the old application that will be easy to miss... places where logic one would normally expect to live in the database instead lives in the application. There are also likely decisions regarding database structure that were made to accommodate quirks or abilities of the old environment that no longer make sense for the new.
As a result, this is a common point at which you might also create a service layer. With a service layer, neither application talks to the database directly. Instead, they both talk to a service application that mediates access to the DB.
This new service layer helps keep business logic consistent across applications, without it drifting between the two platforms. It helps avoid duplicating work. It also helps with performance, by creating an obvious place to put things like a caching layer, and by making it easier to scale data access across multiple servers.
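As a rough illustration, here is what the client side of such a service layer might look like in Ruby; the service URL, resource names, and error handling are all hypothetical:

    # A thin client that both applications use instead of opening their own
    # database connections. Business rules live behind the service, not here.
    require "net/http"
    require "json"
    require "uri"

    class CustomerServiceClient
      BASE_URL = "http://data-service.internal/api/v1"  # hypothetical service app

      def find(id)
        get("/customers/#{id}")
      end

      def search(params = {})
        get("/customers", params)
      end

      private

      def get(path, params = {})
        uri = URI("#{BASE_URL}#{path}")
        uri.query = URI.encode_www_form(params) unless params.empty?
        response = Net::HTTP.get_response(uri)
        raise "service error: #{response.code}" unless response.is_a?(Net::HTTPSuccess)
        JSON.parse(response.body)
      end
    end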

Why was CakePHP designed to use inheritance over composition, even though that's mostly considered bad design?

CakePHP applications built at our company tend to become unmaintainable as they grow more complex. I figure that one specific reason is inheritance, which makes the functions in child classes depend heavily on their parent classes and vice versa (implementing the template method pattern). Why is CakePHP designed this way, and not friendly to using dependency injection, strategies, or factory patterns?
The framework is not as badly designed as you claim. Sure, there are probably things that could be done better, but I would like to see a more substantial critique, including solid arguments and examples. I assume you're not using the framework as it was intended.
Let me quote the first paragraph from this page.
According to Eric Evans, Domain-driven design (DDD) is not a technology or a methodology. It’s a different way of thinking about how to organize your applications and structure your code. This way of thinking complements very well the popular MVC architecture. The domain model provides a structural view of the system. Most of the time, applications don’t change, what changes is the domain. MVC, however, doesn’t really tell you how your model should be structured. That’s why some frameworks don’t force you to use a specific model structure, instead, they let your model evolve as your knowledge and expertise grows.
You're not showing code (for a reason?) so I guess your problem comes from stuffing everything into the table objects in src/Model/Table/ or doing something similar.
But you're totally free to create a folder structure like
/src/Service
/src/Model/Domain
and then simply instantiate services as you need them in your controller actions. A service could be, for example, \App\Service\User\Registration, using objects from \App\Model\Domain\User.
I agree that the framework doesn't, in fact, provide any recommendation or template structure for how this could look. There is an ongoing discussion about exactly this topic. Because of the lack of such a structure, I've started working on a plugin that provides one. The plugin doesn't require DI containers, but suggests their use for people who want them.
Given the whole fancy topic around DI and DDD, so far I would say there is not one single right way, but different paths, as long as the code is easy to maintain. And honestly, as long as that goal is achieved, I really don't care what you call it. :) I think many people tend to make this topic too academic instead of simply trying to be practical.
Not everybody even needs that structure. It depends on whether you're building a RAD CRUD application or a more complex app. Not every application needs a DDD approach. There are so many shades of gray when it comes to designing the business layer that no matter how the framework did it, somebody would always complain about it.
I personally have almost never missed a DI container in CakePHP, not even in the biggest project, a hospital management solution with more than ~560 database tables, and it just worked well.
I would suggest asking a more specific question about your approach, showing how you structured your code, and then asking for advice on how to improve it, instead of blaming the tool you're using without providing context.
Unfortunately, CakePHP v3 cannot compare to Zend 3/Laminas, Symfony, or Laravel. It is 7-8 years behind the other frameworks. If you have been using Cake for years, or it is your first and last framework, it is normal not to realise that. But if you have to use it after Zend 3... Cake seems like a really bad ecosystem.
Bad documentation
Bad ORM
Poor Routing system
Bad Templating engine
Bad idea to mix Data Mapper and Active Record
DIC is totally missing
Components - not good but not terrible
...
And many more things that should not be underestimated, like the lack of GOOD tutorials, plugins/addons/packages.
The above make developers follow bad practices that add a lot of technical debt.
If you care only that it works, but not how it works and why it is bad, Cake will fit you OK.
Cake cannot scale as well as Symfony/Laminas if you are doing a big project (yes, AWS/GC can help with scaling a lot of things, but not with scaling source code).
Cake doesn't allow you rapid development like Laravel/Symfony on a decent-sized project.
I'm wondering who would start a new project today using Cake, and WHY, as it has zero benefits over the other frameworks.
Probably only devs who have used only Cake for the last decade and do not want to start learning new technologies, or devs who think SOLID is just fancy hype with zero benefits, like design patterns, DRY, and KISS.
The CakePHP framework provides database interaction via Active Record, which means there is high coupling between the business layer and the database layer. This has negative effects on unit testing, and because of that the framework is not friendly to Dependency Injection. The same issue applies to the Factory pattern: the high coupling mentioned above makes it harder to use mock objects in unit testing.
Hope it helps!
Alberto

Umbraco Hive and Services Layer

I'm experimenting with the new Umbraco 5 hive, and I'm a bit confused.
I'm plugging in an existing Linq to SQL services layer, which I developed for a webforms site.
I don't know much about the repository pattern, my services handle all connections with the data context, and work very well.
I have made a few repositories that plug in to the hive, and handle conversion of my entities to the Umbraco TypedEntity type.
These repositories reference my existing services layer to retrieve, add, update and delete. The services also handle other entity-specific functions, which will not be used by the hive.
Now, it's nice to plug in these services and just reference them in the hive repositories, but it seems I may be doing things the wrong way round, according to the official repository pattern as I have read about it.
I know there are no hard and fast rules, but I would appreciate comments on what I'm doing to achieve this functionality.
I've asked this here instead of the Umbraco forum, as I want a wider perspective.
Cheers.
I personally feel that the Hive is overkill. With the ability to use your own classes directly within razor macros, I think the best approach is to forego the hive altogether and simply use your classes. Why would you trade all of the power of your existing service just to make it fit into the hive interface?
If you're writing a library for other Umbraco developers, you may need to do this, but it's my personal opinion that the hive is over-engineered at worst and a layer of abstraction aimed at newish developers at best.
So, if I were to advise you, I would say to consider the more general principles: "Keep It Simple" and "You Aren't Gonna Need It". If the interface they give you offers a tangible benefit, implement it. If not, consider what you really gain for all of that work.

Whither NetTiers?

I used NetTiers in a number of projects a job or two back. I found it extremely useful for generating back-end interfaces in ASP.NET webforms. The business and data layers were also pretty sweet. I typically use NHibernate, but I think it may be overkill on these particular projects in terms of the time it will take to get running.
Since then, I've been working on projects where practically everything is end-user facing. However, I've recently gotten a side project that will have a lot of back-end administrative stuff and was wondering if NetTiers is still as well-maintained and clean as it was a couple of years back. It doesn't appear to be, but I don't know if that means that it has actually been abandoned or if it has merely been moved elsewhere. Or is there another product (preferably a set of CodeSmith templates) that might work better for me? All I really need is a clean ActiveRecord model that can hit a SQL database on the backend and generate simple user interfaces for CRUD screens for most of my model objects. I need something that will do deep-loading of object graphs kind of like NetTiers will do as well.
Any suggestions?
I'm currently supporting a large NetTiers application and my experience has generally been one of frustration. I inherited the project and took over maintenance of the templates, fixing a number of bugs in the templates and applying some post-generation scripts to the generated files. IMHO the generated code is overly verbose, suffers from massive duplication, and would benefit from more use of generics. The templates I'm working with didn't dispose of resources correctly (the newer template versions may be better). At one point I considered upgrading to a newer version but the size of the exercise put me off. Useful documentation is difficult to find and getting answers to NetTiers questions is not straight forward. The overall impression I have is one of gradual decline.
If you're just after a simple .Net stack for generating a UI from a SQL database I suggest you take a look at ASP.NET MVC3 with MvcScaffolding and Entity Framework. Add AutoMapper and Munq for DI.
We have been using NetTiers for several years now. I think it tends to look overwhelming to first-time users, in terms of the quantity of stuff generated, and there are a couple of limitations around the DeepLoad functionality and circularities. I too have the feeling that there have not been many updates lately, but overall I've had a great experience using NetTiers with CodeSmith, and of all the tools I've tried, it's clearly our favorite, with huge productivity gains. We use views, custom stored procedures, the indexes, etc.
In a comment to another reply: we've tried AutoMapper and moved away from it because it fails silently when object structures change. And we moved away from Entity Framework because we don't like hand-coding our DALs. :)

Code generators or ORMs?

What do you suggest for Data Access layer? Using ORMs like Entity Framework and Hibernate OR Code Generators like Subsonic, .netTiers, T4, etc.?
For me, this is a no-brainer: you generate the code.
I'm going to go slightly off topic here, because there's a bigger underlying fallacy at play. The fallacy is that these ORM frameworks solve the object/relational impedance mismatch. This claim is a barefaced lie.
I find the best way to resolve the object/relational impedance mismatch is to either use OOP exclusively and use an object database or use the idioms of the relational database exclusively and ignore OOP.
The abstraction "everything is a table" is to me, much more powerful than the abstraction "everything is a class." It takes less code, less intellectual effort and leads to faster code when you code to the database rather than to an object model.
To me this seems obvious. If your application is data driven then surely your code should be data driven too? Yet to say this is hugely controversial.
The central problem here is that OOP becomes a really leaky abstraction when used in conjunction with a database. Code that looks perfectly sensible when written to the idioms of OOP looks completely insane when you see the traffic that code generates at the database. When that messiness becomes a performance problem, OOP is the first casualty.
There is really no way to resolve this. Databases work with sets of data. OOP focuses on instances of classes. Trying to marry the two is always going to end in divorce.
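To make the sets-versus-instances point concrete, here is a contrived ActiveRecord-style example (the Order model and its columns are invented): the object-oriented loop loads every row and issues one UPDATE per instance, while the set-based version is a single statement.

    # Instance-at-a-time: loads every pending order, then one UPDATE per matching row
    Order.where(status: "pending").each do |order|
      order.update_attributes(status: "expired") if order.created_at < 30.days.ago
    end

    # Set-at-a-time: the same change expressed against the whole set, one UPDATE
    Order.where(status: "pending").where("created_at < ?", 30.days.ago).
      update_all(status: "expired")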
So to answer your question, I believe you should generate your classes and try to make them map the underlying database structure as closely as possible.
Perhaps controversially, I've always felt that using code generators for the ADO.NET plumbing is fundamentally solving the wrong problem.
At some point, hopefully not too long after learning about Connection Strings, SqlCommands, DataAdapters, and all that, we notice that:
Such code is ugly
It is very boring to write
It's very easy to miss something if you're doing it by hand
It has to be repeated every time you want to access the database
So, the problem to solve is "how to do the same thing lots of times fast"?
I say no.
Using code generators to make this process quick still means that you have a ton of code, all the same, all over your business classes (or your data access layer, if you separate that from your business logic).
And then, if you want to do something generically (like track stored procedure usage, for instance), you end up having to customise your code generator if it doesn't already have the feature you want. And even if it does, you still have to regenerate everything all the time.
I like to do things once, not many times, no matter how fast I can do them.
So I rolled my own Data Access class that knows how to add parameters, set up and close connections, manage transactions, and other cool stuff. It only had to be written once, and calling its methods from a Business object that needs some database stuff done consists of one line of code.
When I needed to make the application support multithreaded database accesses, it required a change to the Data Access class only, and all the business classes just do what they already did.
There is no right answer; it all depends on your project. As Simon points out, if your application is entirely data driven, then it might make sense, depending on the size and complexity of the domain, to use a non-OOP paradigm. I had a lot of success building a system using a Transaction Script pattern, which passed XML messages around the system.
However, this system started to break down after five or six years as the application grew in size and complexity (5 or 6 web apps, several web services, tons of COM+ components, legacy and .NET code, 8+ databases with 800+ tables and 4,000+ procedures). No one knew what anything was, and duplication was running rampant.
There are other ways to alleviate the maintenance burden than OOP; however, if you have a very complex domain then having a rich domain model is ideal IMHO, as it allows the business rules to be expressed in nicely encapsulated components.
To answer your question: avoid code generators if you can. Code generators are a recipe for disaster, but if you do go with code generation, do not modify the generated code. Also be sure to have a good process in place that makes it easy for developers to get the newly generated code.
I recommend either of the following: an ORM, or hand-rolling a lightweight DAL. I am currently transitioning a project to NHibernate off my hand-rolled DAL and am having a lot of success; however, I like having both options available. Furthermore, if you properly separate your concerns (data access from business layer from presentation), you can have a single service layer that talks to a DAO (Data Access Object) which for one object is an ORM and for another is hand-rolled. I like this flexibility, as it allows me to apply the best tool to the job.
I like NHibernate over a hand-rolled DAL because, while my DAL does abstract away most of the ADO.NET code, you still have to write the code that maps a data reader to an object, or takes an object and creates the parameters.
I've always preferred to go the code generator route, especially in C# where you can make use of extended classes to add functionality to the basic data objects.
Hate to say this, but it depends. If you find an ORM tool that fits your needs, go for it. We wrote our own system in small steps while developing the application. We are using C++ and there are not that many tools out there anyway. Ours ended up being an XML description of the database, from which the SQL generation script, the basic object layer, and the metadata were generated.
Do your homework and evaluate these tools, and you will find a good fit for your needs.