ZF2 Server Requirements/Recommendations - Apache

Hi, I searched for the recommended, or at least the minimum, requirements to run a nifty ZF2 application.
I know I need PHP 5.3.3 or higher, but what about the RAM and CPU requirements?
I found nothing. Magento, for example, needs a LOT of resources to run smoothly, but I found nothing comparable for ZF2.
Do you have any experience with this? Let me know.

This type of question is really one that only you will be able to answer, by loading up your application and testing it. Depending on your application, you may need more resources than a typical application would.
You mentioned Magento so here is a link to their requirements:
http://www.magentocommerce.com/system-requirements
However, this still won't fully answer your question. There are many factors involved here, such as the web server you will be running, other applications running on the same machine, and the number of users you will have at any given time. There really is no true calculator to define what resources you'll need.
ZF2 specifically does not have many system requirements but you can find more information on that here: http://zf2.readthedocs.org/en/latest/user-guide/overview.html
If you're just throwing a machine together to test with, then you'd probably be fine with a 20+ GB hard drive and 2 GB of RAM to get started.
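One practical way to get your own numbers is to watch CPU and RAM on the box while you click through (or load-test) the app. A minimal sketch, assuming the third-party psutil package is installed (pip install psutil); the sampling length is arbitrary:

    # Rough sketch: sample system CPU and RAM while you exercise the ZF2 app.
    # Requires the psutil package (an assumption: pip install psutil).
    import psutil

    SAMPLES = 30  # watch for roughly 30 seconds; adjust as needed
    for _ in range(SAMPLES):
        cpu = psutil.cpu_percent(interval=1)   # CPU % over the last second
        mem = psutil.virtual_memory()          # system-wide memory stats
        print(f"cpu={cpu:5.1f}%  ram_used={mem.used / 2**20:7.1f} MiB  ({mem.percent:.1f}%)")

The peak values you see under a realistic click-through are a far better sizing guide than any generic ZF2 figure.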

Related

How to start with a DBMS project

We are supposed to make a web-based project using Postgres. The topic is a library management system, where a user can search on a website whether a book is available in the library, and if it is, where it is shelved, and so on.
The problem is just that I don't know anything about web development. I have pretty good knowledge of SQL, but I'm a bit confused there too, because I don't know whether I'll just run the queries on my laptop in Postgres and link it "somehow" to the website, or whether I'll have to upload my data to some server (e.g., Firebase in the case of Android development) to be used by my website.
So briefly, I have just two questions:
How should I start, because I have no idea where to begin (I do have all the data needed, by the way)?
About Postgres, will the queries run on my laptop or on some server?
Please help me with this. Online resources are more than welcome, because I was unable to find any. Thank you!
First of all, you should take a look at some design patterns in order to learn the theory of how to build (web) apps the right way. You can visit Martin Fowler's web site and read about them.
Once you've studied those, here is my advice: if you have Java expertise, I'd start by learning Spring Boot, which has every piece you need to achieve your goal. It follows lots of design patterns (MVC, Repository, DAO, AOP, IoC/DI...) and lets you follow others (DTO). Either way, choose a suitable template engine (I like Thymeleaf) or a front-end framework (Angular 2...).
Hope it helps.
Welcome to the development world. When you're starting out it seems very confusing, but it isn't that bad.
Start slow; there are many tutorials out there that will help. Just do a bit of Googling.
To answer your questions:
How should I start, because I have no idea where to begin (I do have all the data needed, by the way)?
-- Google "simple website with Postgres DB". You will need the database and a web server installed on your machine; both will also be used when you host the website.
About Postgres, will the queries run on my laptop or on some server?
-- They will run wherever you have installed the database.
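To make the "linking" part concrete, here is a tiny sketch of a web endpoint that queries a local Postgres database. It assumes Flask and psycopg2 are installed, and the library_db database with its books table (title, shelf_location, available) is a made-up example matching your description:

    # Tiny sketch: a web endpoint backed by a local Postgres database.
    # Assumes Flask and psycopg2 are installed; database and table names are examples.
    import psycopg2
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def get_connection():
        # Connects to Postgres on this machine; point host elsewhere for a remote server.
        return psycopg2.connect(dbname="library_db", user="postgres",
                                password="secret", host="localhost")

    @app.route("/search")
    def search_books():
        title = request.args.get("title", "")
        conn = get_connection()
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT title, shelf_location, available FROM books "
                "WHERE title ILIKE %s",
                (f"%{title}%",),
            )
            rows = cur.fetchall()
        conn.close()
        return jsonify([
            {"title": t, "location": loc, "available": avail}
            for t, loc, avail in rows
        ])

    if __name__ == "__main__":
        app.run(debug=True)  # then open http://localhost:5000/search?title=python

During development both the database and this app run on your laptop; when you deploy, they run on whatever server you install them on.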
hope this helps :)

Easiest API/methodology to learn for creating web applications that run MapReduce on Hadoop?

I have Hadoop 1.0.4 running on Ubuntu 11.04, configured with Eclipse. I want to make a web application to run Hadoop jobs. Maybe Cassandra, HBase, or Hive might be a way as well, but I don't have much time to learn all of these thoroughly and I want to do it as quickly as possible. Any advice on which one might prove the easiest to get started with?
I don't know if this question really qualifies to be here on SO in its current form. That is the reason I didn't write this initially, but there are plenty of SO experts out there to decide that (they can do it much better than me) :)
Having said that, I would like to share a few things with you based on my personal experience, so that you proceed down the correct path. First of all, Hadoop jobs (MapReduce) and Hive are actually not a good fit for web-service style use cases. They are most suitable for offline, batch-processing workloads. HBase/Cassandra can be used, though, if you have real-time needs (like web services).
Coming back to your actual question: before diving into Hadoop, Hive, HBase, etc., I would suggest you get a good grasp of web services first (if you are new to web services as well). The reason is that a web service has a much wider scope of applicability than tools like Hadoop, Hive, and HBase. Those tools are specific to particular use cases and cannot be used everywhere, but web services are used almost everywhere and with any number of different things, like RDBMSs, NoSQL datastores, and so on. So if you know web service concepts, you definitely have that extra edge. To begin with, you can visit these links:
Web Services Tutorial by W3Schools (nice and easy; serves as a quick-start guide).
For a detailed tutorial, you can visit the Oracle web services tutorial.
This link on IBM developerWorks has references to some really good web services learning material.
You might find this one really helpful to start with (it shows how to create web services using Eclipse).
And you can obviously Google web service tutorials anytime.
One last thing: although it's not mandatory to be a pro in things like Hadoop, Hive, and HBase, having a decent understanding of the concepts will help you develop your solution in a much better manner and allow you to think in the correct direction.
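To give a feel for how small a "web service that runs a Hadoop job" can be, here is a bare-bones sketch using only the Python standard library and Hadoop Streaming. The hadoop command, the streaming jar path, and the mapper/reducer/HDFS paths are placeholders for whatever your 1.0.4 install actually uses, and a real service would track job status rather than fire and forget:

    # Bare-bones sketch: a WSGI endpoint that submits a Hadoop Streaming job.
    # The jar path, scripts, and HDFS paths below are placeholders, not real values.
    import json
    import subprocess
    from wsgiref.simple_server import make_server

    HADOOP_CMD = ["hadoop", "jar",
                  "/usr/local/hadoop/contrib/streaming/hadoop-streaming-1.0.4.jar"]

    def app(environ, start_response):
        if environ["PATH_INFO"] == "/run-job" and environ["REQUEST_METHOD"] == "POST":
            # Submit the streaming job asynchronously; mapper.py/reducer.py and the
            # input/ and output/ paths are hypothetical.
            proc = subprocess.Popen(HADOOP_CMD + [
                "-mapper", "mapper.py", "-reducer", "reducer.py",
                "-input", "input/", "-output", "output/",
                "-file", "mapper.py", "-file", "reducer.py",
            ])
            body = json.dumps({"status": "submitted", "pid": proc.pid}).encode()
            start_response("202 Accepted", [("Content-Type", "application/json")])
            return [body]
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]

    if __name__ == "__main__":
        make_server("", 8000, app).serve_forever()

That is the whole "web service" part; everything else (HBase vs. Hive vs. plain MapReduce) is about what the job itself does.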
HTH.

How to set up a wiki for emergency response?

The company I work for is currently preparing to expand and take on emergency support (after-hours support).
We currently have a wiki set up with a lot of information.
However, bits and pieces are scattered across the entire wiki (depending on which department owns them, etc.).
Whoever is on support that night/weekend needs to be able to help the customer with their problem quickly. E.g., if our server is being very slow, there needs to be a troubleshooting guide of some sort so that person can dig straight into it.
I have Googled quite a bit but was not able to find anything useful. So here is the question:
How would you structure your wiki (by topic, by symptoms, by solutions?) to minimize the time a person has to spend looking for information?
Personally, I was thinking of using some sort of syntax such as:
Symptom: high CPU utilization
Keywords: slow server, high CPU usage
That way, when you search the wiki, the page would most likely come up in the results. But what if the issue is more software related, such as a misconfiguration?
This is the wiki (MediaWiki) API URL:
http://en.wikipedia.org/w/api.php
You also need a web service that accepts the parameters the search will be performed on, as well as a UI page. The web service will hold whatever logic is needed.
See also: Is there a Wikipedia API?
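If your internal wiki is MediaWiki-based (an assumption), the same api.php search endpoint is the simplest way to wire symptoms/keywords to pages. A minimal, stdlib-only sketch:

    # Minimal sketch: keyword search against a MediaWiki-style api.php endpoint.
    # API_URL points at Wikipedia only as an example; use your own wiki's api.php.
    import json
    import urllib.parse
    import urllib.request

    API_URL = "https://en.wikipedia.org/w/api.php"

    def search_wiki(keywords, limit=5):
        params = urllib.parse.urlencode({
            "action": "query",
            "list": "search",
            "srsearch": keywords,
            "srlimit": limit,
            "format": "json",
        })
        with urllib.request.urlopen(f"{API_URL}?{params}") as resp:
            data = json.load(resp)
        # Each hit carries a "title" you can turn into a link to the wiki page.
        return [hit["title"] for hit in data["query"]["search"]]

    if __name__ == "__main__":
        print(search_wiki("slow server high cpu usage"))

With the Symptom:/Keywords: convention from the question, the on-call person can paste the symptom straight into a search box backed by a call like this.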

Setting up a web developer lab for learning purposes

I'm not a developer by profession. Therefore, I'm not exposed to real world technical problems that face professional developers. I read/heard about web farms, integration between different systems, load balancing ... etc.
Therefore, I was wondering if there are ways for an individual developer to create an environment that simulates real-world situations with a minimal number of machines, for things like:
web farms & caching
simulating many users accessing your website (stress/load tests?)
Performance
load balancing
anything you think I should consider.
By the way, I have a server machine and one PC, and I don't mind investing in tools and software.
PS. I'm using Microsoft technologies for development but I hope this is not a limiting factor.
Thanks
Since I am new, SO will not let me post more than one link, so I compiled a list of links for you at a pastebin here.
There are a lot of tools for this. I like
http://www.acme.com/software/http_load/
and http://curl-loader.sourceforge.net/
They can both simulate many queries to your server. Run them from another machine.
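If you'd rather script it yourself, here is a rough, standard-library-only sketch of the same idea; TARGET_URL and the request counts are placeholders, and it should run from a second machine so it doesn't compete with the server for CPU:

    # Rough sketch: generate concurrent HTTP load and report basic timings.
    # TARGET_URL and the counts are placeholders; run this from another machine.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET_URL = "http://your-test-server/"  # hypothetical target
    CONCURRENCY = 20
    REQUESTS = 200

    def fetch(_):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                resp.read()
            return time.perf_counter() - start, None
        except Exception as exc:
            return time.perf_counter() - start, exc

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        results = list(pool.map(fetch, range(REQUESTS)))

    ok = [t for t, err in results if err is None]
    errors = len(results) - len(ok)
    if ok:
        print(f"ok={len(ok)} errors={errors} avg={sum(ok)/len(ok):.3f}s max={max(ok):.3f}s")
    else:
        print(f"all {errors} requests failed")

Dedicated tools like the two above (or JMeter) will give you far richer reports, but a script like this is enough to see how your single server behaves as concurrency climbs.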

PyAMF backend choices!

I've been using PyAMF to write a backend for a Flex app that will request different groups of hundreds of different images, depending on what the client needs. I have been using the "simple_server" WSGI server that PyAMF supplies while developing the Flex code. Now I'm ready to write a robust backend that will be able to pull images from a MySQL database and send them as fast and as efficiently as possible to many concurrent clients.
The PyAMF documentation is great because they supply many examples to follow, however I am confused about what kind of backend I am trying to create.
Do I want a SocketServer or a WSGI server or something like Twisted or web2py or Tornado? Are these even all different? :) Should I be using Apache modules instead (mod_wsgi or modjy or mod_python)?
I realize that this probably touches on many open debates, so maybe you could just point me to any good summaries of these debates?
It's great to have so many options, but how do I choose?
The short answer is, of course, that it depends on the requirements of your project.
How many concurrent connections is "a lot"?
How much programmer time can you throw at the problem?
How much hardware can you throw at the problem?
...etc...
If you plan to have lots of concurrent clients, it's hard to beat Twisted in the Python world. However, you'll have to deal with your database asynchronously to avoid blocking, and depending on how complex your database interactions are, this can be a bit of a pain. You're basically limited to either using twisted.enterprise.adbapi or coming up with your own twisted-ORM integration.
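To make that concrete, here is a minimal sketch of what the twisted.enterprise.adbapi route looks like; the connection details and the images table/columns are made up, and a real PyAMF service would hook the Deferred into its AMF response rather than printing:

    # Minimal sketch: non-blocking DB access via twisted.enterprise.adbapi.
    # Connection details and the "images" table are hypothetical examples.
    from twisted.enterprise import adbapi
    from twisted.internet import reactor

    # "MySQLdb" is the DB-API driver module name; the pool runs queries in threads.
    dbpool = adbapi.ConnectionPool(
        "MySQLdb", host="localhost", user="user", passwd="secret", db="images_db"
    )

    def got_rows(rows):
        print(f"fetched {len(rows)} image blobs")
        reactor.stop()

    def failed(err):
        err.printTraceback()
        reactor.stop()

    # runQuery returns a Deferred, so the reactor (and your AMF endpoints)
    # never block while the query executes.
    d = dbpool.runQuery("SELECT data FROM images WHERE group_id = %s", (42,))
    d.addCallbacks(got_rows, failed)

    reactor.run()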
If you'd rather have "easy" database code (i.e. you want to use an ORM), you're better off going with a (TurboGears/Pylons/plain WSGI) project, probably hosted using Apache and mod_wsgi. This can be a pretty scalable solution, and you get a lot of stuff for free using these frameworks, but it may be more than you need.
I would avoid using one of the many plain Python WSGI servers out there (wsgiref, paster, etc.) in production if you really want high performance.
Good Luck!