Other engineering disciplines have rigorous processes which address the major risks - risk-analysis

Other engineering disciplines have rigorous processes that address the major risks involved in undertaking any project. If software engineering is to borrow such a practice, how can it add value?

Related

How do CRUD-oriented data abstractions in interfaces introduce operational and semantic coupling?

I am going through the book "Patterns for API Design: Simplifying Integration with Loosely Coupled Message Exchanges" and came across this paragraph. I am having a hard time understanding it:
Some software engineering and object-oriented analysis and design (OOAD) methods balance processing and structural aspects in their steps, artifacts, and techniques; some put a strong emphasis on either computing or data. Domain-driven design (DDD) [Evans 2003; Vernon 2013], for instance, is an example of a balanced approach. Entity-relationship diagrams focus on data structure and relationships rather than behavior. If a data-centric modeling and API endpoint identification approach is chosen, there is a risk that many CRUD (create, read, update, delete) APIs operating on data are exposed, which can have a negative impact on data quality because every authorized client may manipulate the provider-side data rather arbitrarily. CRUD-oriented data abstractions in interfaces introduce operational and semantic coupling.
There is actually a lot I do not understand in this, but in particular I am having difficulty with this part:
CRUD-oriented data abstractions in interfaces introduce operational and semantic coupling.
I know what CRUD is, but what do "data abstractions" mean in this context? How do they relate to the endpoint? What is operational and semantic coupling?
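To make the question concrete, here is a sketch of what I think the book is contrasting; the endpoint names, the db stand-in, and the order rules are my own invented illustration, not from the book:

    # A tiny in-memory stand-in for provider-side storage (my invention).
    db = {"42": {"status": "shipped", "items": ["book"]}}

    # CRUD-style endpoint (think PUT /orders/{id}): clients can overwrite any
    # field, so every client has to know what each field means (semantic
    # coupling, as I read it) and which updates must happen in which order
    # (operational coupling).
    def update_order(order_id, payload):
        db[order_id].update(payload)  # arbitrary manipulation of provider data

    update_order("42", {"status": "paid", "total": -1})  # nothing validates this

    # Intent-revealing alternative (think POST /orders/{id}/cancel): the
    # provider exposes a business operation and enforces its own invariants.
    def cancel_order(order_id):
        order = db[order_id]
        if order["status"] == "shipped":  # provider-side rule, hidden from clients
            raise ValueError("shipped orders cannot be cancelled")
        order["status"] = "cancelled"

Is that roughly the distinction the paragraph is drawing?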

Is studying system engineering valuable to software engineers?

I'm not sure how to put it, but I just learnt about systems engineering, and it is about designing complex systems. It is generally oriented toward manufacturing rather than software companies, but I was wondering whether the concepts and techniques of systems engineering might be of help to a software engineer.

Factors affecting the performance, safety and security of object-oriented programs?

How does object-oriented programming improve the performance, safety and security of programs, and what factors affect their performance and security?
OOP doesn't improve performance per se. In fact, it has been a long-standing criticism that OOP increases overall resource consumption. It was a trade-off: performance/optimization in exchange for productivity/maintainability.
In terms of security, while procedural programs can be secure, the advent of OOP and the increased reusability that comes with it has spread good, reusable coding practices. At the end of the day, a program that can be seamlessly maintained, is built on top of reusable patterns, and is developed with good security practices provides an easier foundation for detecting and fixing security holes.
In summary, OOP doesn't provide any direct advantage in the areas you're asking about, but it provides a solid foundation for writing better code in most business cases. That's all.
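A minimal sketch of that trade-off (a hypothetical account example, not from any particular codebase): the object-oriented version pays for an extra method call but concentrates validation in one reviewable place.

    # Procedural style: any code can set the balance to anything, including
    # an invalid state; every call site is a potential security review item.
    account = {"balance": 100}
    account["balance"] = -999  # nothing stops this

    # OO style: the indirection costs a method call, but validation lives in
    # exactly one place, which is easier to audit and harden.
    class Account:
        def __init__(self, balance):
            self._balance = balance

        def withdraw(self, amount):
            if amount <= 0 or amount > self._balance:
                raise ValueError("invalid withdrawal")  # single checkpoint
            self._balance -= amount

    acct = Account(100)
    acct.withdraw(30)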

Swarm Intelligence - what kinds of problems are effectively solved?

I am looking for examples of practical problems (or implementations, applications) that are effectively solved using swarm intelligence algorithms. I found that multi-criteria optimization is one example. Are there any others?
IMHO swarm-intelligence should be added to the tags
Are you looking for toy problems or more for real-world applications?
In the latter category, I know variants on swarm intelligence algorithms are used in Hollywood for CGI animations such as large (animated) armies charging across the fields of battle.
Related, but more towards the toy-problem end of the spectrum: you can model large crowds with similar algorithms and use this, for example, to simulate disaster scenarios. AFAIK the Dutch institute TNO has research groups on this topic, though I couldn't find an English link just by googling.
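As a toy illustration of the kind of local rules these crowd models use (a generic boids-style sketch in one dimension, not TNO's or any studio's actual code):

    import random

    # Each agent follows three local rules relative to nearby agents:
    # cohesion (drift toward their centre), separation (avoid crowding),
    # and alignment (match their average velocity).
    def step(agents, radius=5.0):
        new_states = []
        for x, v in agents:
            neigh = [(x2, v2) for x2, v2 in agents if x2 != x and abs(x2 - x) < radius]
            if neigh:
                centre = sum(x2 for x2, _ in neigh) / len(neigh)
                avg_v = sum(v2 for _, v2 in neigh) / len(neigh)
                v += 0.01 * (centre - x)                    # cohesion
                v += 0.05 * sum(x - x2 for x2, _ in neigh)  # separation
                v += 0.10 * (avg_v - v)                     # alignment
            new_states.append((x + v, v))
        return new_states

    agents = [(random.uniform(0, 100), random.uniform(-1, 1)) for _ in range(50)]
    for _ in range(100):
        agents = step(agents)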
One suggestion for a place to start further investigation would be this PDF book:
http://www.cs.vu.nl/~schut/dbldot/collectivae/sci/sci.pdf
That book also has an appendix (B) with some sample projects you could try and work on.
If you want to get a head start, there are several frameworks (for scientific use) for multi-agent systems such as swarm intelligence (most of them are written in Java, I think). Some of them include sample apps too. For example, have a look at these:
Repast:
http://repast.sourceforge.net/repast_3/
Swarm.org:
http://swarm.org/
Netlogo:
http://ccl.northwestern.edu/netlogo
I will interpret your question as: what kinds of real-world problems can SI solve?
There are a lot. Swarm intelligence is based on the complex behaviour of swarms, where agents coordinate and cooperate by executing very simple rules, generating an emergent, self-organized complex behaviour. The agents often go through a deliberation process to make efficient decisions, and the emergent behaviour of the swarm allows it to find patterns, learn, and adapt to its environment. Therefore, real-world applications based on SI are those that require coordination and cooperation techniques, optimization processes, exploratory analysis, dynamic problems, etc. Some of these are:
Optimization techniques (mathematical functions, for example; see the sketch further below)
Coordination of a swarm of robots (to organize inventory for example)
Routing in communication networks (this is also dynamic combinatorial optimization)
Data analysis (usually exploratory, like clustering). SI has a lot of applications in data mining and machine learning, allowing SI algorithms to find interesting patterns in big data sets.
NP-hard problems in general
I'm sure there are a lot more. You should check the book "Swarm Intelligence: From Natural to Artificial Systems"; it is the foundational book on the subject.
Take care.

The environments a project cycles through

What are the environments a software product can go through? Up to now I've only seen:
design
development
testing
staging
UAT
performance
production
Anything else?
You are right. The traditional way of developing software (called waterfall) follows these steps. However, in the past ten years many methodologies have been created, and they have improved the software development process we use today.
If you don't know about methodologies like Extreme Programming (XP), Test-Driven Development (TDD), Scrum, Kanban, Behaviour-Driven Development (BDD), Agile Unified Process, Feature-Driven Development (FDD), and other agile methodologies (very common these days), don't worry: there is plenty of material on the Internet. Some of these methodologies focus on building and testing software at the source-code level (TDD, BDD); others focus more on managing the entire process (Scrum, Kanban).
But the main idea shared by these methodologies is that requirements change during the process, so it is necessary to interleave development and testing in small iterations, delivering a piece of software with valuable functionality in each short cycle, instead of following an inflexible, traditional path that produces software nobody needs.
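As a tiny, hypothetical illustration of the source-code-level focus of TDD: you write the failing test first, then just enough code to make it pass.

    import unittest

    # Step 1 (red): this test is written before the code exists and fails.
    # Step 2 (green): write just enough code, like add() below, to pass it.
    # Step 3 (refactor): clean up while the test keeps you safe.
    def add(a, b):
        return a + b

    class TestAdd(unittest.TestCase):
        def test_add_two_numbers(self):
            self.assertEqual(add(2, 3), 5)

    if __name__ == "__main__":
        unittest.main()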
One of the other phases I have seen is performance testing. This phase is driven by performance measurement, based on the expected SLAs for the product. It is a way of benchmarking the product after UAT and before production.
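A minimal sketch of what such an SLA-driven benchmark might look like; the 200 ms budget and the request handler are invented for illustration:

    import time

    SLA_SECONDS = 0.2  # invented budget: "95% of requests under 200 ms"

    def handle_request():
        time.sleep(0.05)  # stand-in for the real operation under test

    durations = []
    for _ in range(100):
        start = time.perf_counter()
        handle_request()
        durations.append(time.perf_counter() - start)

    durations.sort()
    p95 = durations[int(0.95 * len(durations))]
    print("95th percentile:", p95, "OK" if p95 <= SLA_SECONDS else "SLA breach")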