CMS Settings, .INI file or MySQL? - optimization

I'm looking for the best solution to store the settings for a website, like the limit of posts per user, the limit of users online, ranks, and the minimum number of posts required to do something.
Like here: if you're new you can't thumb a post up or down, or whatever. So how would you store all of these?
I thought of creating a table with constants in MySQL, but I don't think adding another MySQL query on every page refresh is the best solution.

Why not? After all, MySQL can handle a large number of requests, and I've seen much more complex queries than checking access rights. A MySQL query ultimately reads from a file just like checking an INI file does, only in an optimized way. I'm guessing that if you don't expect a huge amount of traffic, you'll be fine with a database.
Here it's a matter of preference. I prefer to do this in MySQL because I don't like to parse files and find querying a database easier. Also, editing rows is easier than changing values in a text file.
I'd say your first thought was spot-on. Put constants into a database.
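To make that concrete, here is a minimal sketch of such a constants table (the table, column and setting names are made up for illustration); one small SELECT per request is cheap, and you can still cache the result in the application if it ever shows up as a bottleneck:
-- Illustrative names only; adjust to your schema.
CREATE TABLE settings (
    name  VARCHAR(64)  PRIMARY KEY,
    value VARCHAR(255) NOT NULL
);

INSERT INTO settings (name, value) VALUES
    ('max_posts_per_user', '500'),
    ('max_users_online',   '200'),
    ('min_posts_to_vote',  '10');

-- One query per request fetches everything at once.
SELECT name, value FROM settings;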

Related

Database schema sample for storing social media post

I am building a small social networking website and I have a question about the database schema:
How should I store the posts (text) made by a user?
I'll have a separate POST table and will link the USERS table to it through a USERS_POST table.
But every time the posts on a user's profile are displayed, won't the system have to search the entire USERS_POST table for that user's id?
What else should I do?
Similarly, how should I store the multiple places the user has worked or studied?
I understand this is broad, but I am new to databases. :)
First, don't worry too much: start by making it work and see where you get performance problems. The database might be a lot quicker than you expect, and it is often much easier to see what the best solution is once you have an actual query that is too slow.
Regarding your design: if a post is never linked to more than one user, then forget the USERS_POST table and put the user id in the POST table. In either case an index on the user id will help (by avoiding a scan of the whole table) when the database grows large.
Multiple places for a single user would go in an additional table, for instance USERS_PLACES: give it a user_id column to link it to USERS, plus other columns for the data you wish to store per place.
BTW, in PostgreSQL you might want to keep all object names (tables, columns, ...) lowercase, because unless you take care to always quote them like "USERS", PostgreSQL will fold them to lowercase, which can be confusing.
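A rough sketch of that layout in PostgreSQL, following the lowercase convention above (all names are examples, and it assumes a users table with a user_id primary key):
-- Sketch only; table and column names are illustrative.
CREATE TABLE posts (
    post_id    BIGSERIAL PRIMARY KEY,
    user_id    BIGINT NOT NULL REFERENCES users (user_id),
    body       TEXT NOT NULL,
    created_at TIMESTAMP NOT NULL DEFAULT now()
);

-- This index is what keeps "all posts of user X" fast as the table grows.
CREATE INDEX posts_user_id_idx ON posts (user_id);

CREATE TABLE users_places (
    user_id    BIGINT NOT NULL REFERENCES users (user_id),
    place_name TEXT NOT NULL,
    place_type TEXT,   -- e.g. 'work' or 'study'
    started_on DATE,
    ended_on   DATE
);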

Updating friendly name of a liferay page through SQL

Is there a way to update the Liferay's site page's friendly name through a SQL script?
We generally do this in the control panel through admin user.
While #steven35's answer might do the job, you're hitting a pet peeve of mine. On a different level: you're doing it right if you do it in the Control Panel or through the API, and you should not look for ways to write to Liferay's database directly. It might work for the moment, but it might also fail in unforeseen ways, sometimes long after your update.
There have been enough examples of this happening. If you change data while Liferay is running, the cache will not be updated. If these values are also indexed in the search index, they won't be updated there either, and later lookups might not find the correct page until you reindex everything. The same value might be stored somewhere else, or translated. Numerous conditions can fail, and there's always one condition more than you expect and cater for. That one condition might break your neck.
Granted, the friendly name of a page might not fall into the most complex of these cases, but just don't get into the habit of writing to Liferay's database. Or, if you do, don't complain when future upgrades fail or require extra work because the database contains values that the API didn't expect. The problem is that during the next upgrade (if you do it in, say, one year) you'll long have forgotten that you manually changed data in the database and will blame Liferay for the problems during your upgrade.
Changing data is exactly what the UI and the API are for.
Friendly URLs are stored in LayoutFriendlyURL.friendlyURL in your Liferay database, so the following query should work:
UPDATE "yourdatabase"."LayoutFriendlyURL" SET "friendlyURL"="/newurl" WHERE "layoutFriendlyURLId"=12345;
You will also need to update the Layout table accordingly to match the new friendly URL.
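If you do go to the database anyway (see the caveats above), a sketch of that matching Layout update, assuming a stock schema where LayoutFriendlyURL.plid points at Layout.plid; column names can differ between Liferay versions, so verify them first:
-- Sketch only; assumes LayoutFriendlyURL.plid references Layout.plid.
UPDATE "yourdatabase"."Layout"
SET "friendlyURL" = '/newurl'
WHERE "plid" = (
    SELECT "plid" FROM "yourdatabase"."LayoutFriendlyURL"
    WHERE "layoutFriendlyURLId" = 12345
);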

SocialEngine db structure/explanation

I'm new to SocialEngine, and a client of mine asked me to modify his website, which runs on SocialEngine. He has been using this CMS for over 6 years now, and in all that time it has never been modified.
I'm trying to reduce the size of the database tables, which are really huge.
For example,
the whole db is around 2G,
engine4_user_logins is 435MiB,
engine4_user_fields_search is 615MiB and
engine4_authorization_allow is 234MiB.
I searched for the SocialEngine db structure but couldn't find it. I'm not asking you to explain every single table in the database. My questions are:
Is it safe to empty these tables? And why are these tables so full?! Is it because nothing has been done to them for such a long time?!
It's safe to clear engine4_user_logins; however, the data in that table is used in the statistics section of SocialEngine's admin panel, so if you clear it you'll lose stats data. Don't clear the other two tables: SE uses engine4_authorization_allow for item permissions and engine4_user_fields_search for searching.
engine4_authorization_allow and engine4_user_fields_search get populated when items (blogs, users, groups, photos, etc.) are created in SocialEngine. That's why they're huge.
If the client is having performance issues, I suggest tweaking the server configuration or upgrading the server.
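If you decide to trim the login history rather than wipe it completely, something along these lines would keep recent rows for the admin stats while shrinking the table (a sketch only; the timestamp column name is an assumption, so check the actual schema and back the table up first):
-- Assumed column name; verify against the real engine4_user_logins schema.
DELETE FROM engine4_user_logins
WHERE `timestamp` < DATE_SUB(NOW(), INTERVAL 90 DAY);

-- Or, if the stats data really doesn't matter:
-- TRUNCATE TABLE engine4_user_logins;

-- Reclaim the freed space afterwards.
OPTIMIZE TABLE engine4_user_logins;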

RAILS3: Full-text Search Word Docs?

My company has a collection of about 3500 highly-structured Word docs (and growing) that contain multiple choice questions from one of our products. I've been tasked with writing a front-end that will let people find and use these in other products. There is some metadata on them that would go in a database, but we'd also like full-text search.
I've been given the option of using for the front-end either MS Access (because I know it well) or Rails (because I'm supposed to be learning it). I've done one Rails app and prefer to continue with it.
Rather than load the documents into the database, I thought it made more sense to just have them on the file system and store paths to them in the database.
I know I can use Ferret to search database fields but what's the best way to add full-text searching to a Rails app for a pile of files on the filesystem?
I'm not sure if there are any gems that would search the Word files for you. Although you have mentioned that you do not want to load the entire documents into the database, you might look into just copying the text contents of each file into your db. You can use the win32ole library for this (http://ruby-doc.org/stdlib/libdoc/win32ole/rdoc/classes/WIN32OLE.html). If I had to implement this, I would run a cron job every night (or whatever frequency seems appropriate) that refreshes the database content with the changes in the Word files.
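If the app happens to sit on MySQL, one possible shape for this is a table holding the file path plus the extracted text, with a FULLTEXT index refreshed by that nightly job (a sketch with made-up names; Ferret could index the same column instead):
-- Illustrative only; names are invented, and the text is assumed to be
-- extracted from the .doc files by the nightly job.
CREATE TABLE documents (
    id        INT AUTO_INCREMENT PRIMARY KEY,
    file_path VARCHAR(500) NOT NULL,
    body_text MEDIUMTEXT,
    FULLTEXT KEY ft_body (body_text)
) ENGINE=MyISAM;  -- InnoDB supports FULLTEXT from MySQL 5.6 on

SELECT id, file_path
FROM documents
WHERE MATCH(body_text) AGAINST ('photosynthesis' IN NATURAL LANGUAGE MODE);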

"Safely" allow users to search with SQL

For example, I've often wanted to search Stack Overflow with
SELECT whatever FROM questions WHERE
views * N + votes * M > answers AND NOT(answered) ORDER BY views;
or something like that.
Is there any reasonable way to allow users to use SQL as a search/filter language?
I see a few problems with it:
Accessing/changing stuff (a carefully set-up user account should fix that)
SQL injection (given the previous point, the worst they should be able to do is get back junk and crash their session).
DOS attacks with pathological queries
What indexes do you give them?
Edit: I'd like to allow joins and what not as well.
Accessing/changing stuff
No problem: just run the query as a crippled user account that only has SELECT permissions.
SQL injection
Just sanitize the query
DOS attacks
Time out the query and throttle access by IP. I guess you can also throttle CPU usage on some servers.
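For the time-out part, MySQL 5.7+ can cap SELECT statements directly; a sketch (the 2000 ms value is arbitrary, and the query reuses the asker's placeholder names):
-- Applies to SELECT statements only; value is in milliseconds.
SET SESSION max_execution_time = 2000;

-- Or per query, via an optimizer hint:
SELECT /*+ MAX_EXECUTION_TIME(2000) */ whatever FROM questions;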
If you do SQLEncode your users' input (and make sure to remove all ; as well!), I see no huge safety flaw (other than that we're still handing nukes out to psychos...) in having three input boxes - one for table, one for columns and one for conditions. They won't be able to have strings in their conditions, but queries like your example should work. You will do the actual pasting together of the SQL statement, so you'll be in control of what is actually executed. If your setup is good enough you'll be safe.
BUT, I wouldn't for my life let my user enter SQL like that. If you want to really customize search options, give either a bunch of flags for the search field, or a bunch of form elements that can be combined at will.
Another option is to invent some kind of "markup language", sort of like Markdown (which SO uses for formatting all these questions and answers...), that you can translate to SQL. Then you can make sure that only "harmless" selects are performed, and you can protect user data etc.
In fact, if you ever implement this, you should see if you could run the commands from a separate account on the SQL server, which only has access to the very basic needs, and obviously only read access.
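As a concrete illustration of such a read-only account on MySQL (a sketch; the user, schema and table names are placeholders):
-- Placeholder names; grant SELECT on exactly the tables you want exposed.
CREATE USER 'search_ro'@'%' IDENTIFIED BY 'use-a-strong-password';
GRANT SELECT ON myapp.questions TO 'search_ro'@'%';
-- No INSERT/UPDATE/DELETE, and no access to any other schema.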
Facebook does this with FQL. See the blog post or presentation.
I just thought of a strong sanitization approach that could be used to restrict what is allowed:
Use MySQL and grab its lex/yacc files
use the lex file as is
gut the yacc file to only the things you want to allow
use action rules that spit out the input on success.