So, I'm building this API where every client uses the same system (controllers, models). Each client has their own database and a website that connects to that database.
Now here's the problem: the API must serve all of these websites with different database connections, and since I'm new to Laravel I'm having a hard time. What's the best way to do that?
Should I save each database's settings in a file (for example, client id 1 gets a folder containing a file named 1.env with its database info)?
PS: I'm using OAuth2.
The solution I found was: each client gets their own database, and there is a "users" folder with a subfolder per client containing a file with that client's database info. At login we consult a global table that tells us which database that user belongs to; the next step is to load that database connection and check the user/password. It was a quick solution.
However, in my research I stumbled onto "multi-tenancy", which seems like a much better approach; you can find more about it at the link.
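For illustration, here is a minimal sketch of that lookup-then-connect flow, written in plain JDBC rather than Laravel; the master database, the master_clients table, and its column names are all assumptions made up for the example:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Illustration of the "global table -> per-client database" lookup described above.
// Table and column names (master_clients, db_url, db_user, db_pass) are placeholders.
public class TenantLookup {

    // The shared "global" database that knows which database each client uses.
    private static final String MASTER_URL = "jdbc:mysql://localhost/master";

    public static Connection connectForClient(int clientId) throws Exception {
        try (Connection master = DriverManager.getConnection(MASTER_URL, "api", "secret");
             PreparedStatement ps = master.prepareStatement(
                     "SELECT db_url, db_user, db_pass FROM master_clients WHERE client_id = ?")) {
            ps.setInt(1, clientId);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new IllegalArgumentException("Unknown client: " + clientId);
                }
                // Open a connection to that client's own database; checking the
                // user/password would then happen against this connection.
                return DriverManager.getConnection(
                        rs.getString("db_url"), rs.getString("db_user"), rs.getString("db_pass"));
            }
        }
    }
}
```

In Laravel the usual equivalent is to do the lookup on a default connection and then point a second, per-tenant connection at the result at runtime, which is essentially what the multi-tenancy packages mentioned above automate.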
I'm currently working on a bot for some anarchy servers, and it's a lot more reliable for it to read the server log than the plain chat. I need a way to access that log, but I don't know how. Is there even a way to do this without admin access?
No, there is actually no way to view this. Because it's a file located on the server, you can only view it if you have (direct) file access to the server. The only way to get the latest.log file is to contact the server owner, but I doubt the owner of a server like Hypixel will give you that log file.
You can make a Minecraft plugin in Java that acts as an API server; you can then make it read the file and return it. Of course, you would want to protect it with some type of authorization. You can use an HTTP server; there is an example that allows commands to be executed, but you could easily work off that. A rough sketch follows.
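A rough sketch of that idea, assuming a Bukkit/Spigot-style plugin and the JDK's built-in com.sun.net.httpserver; the port, the /log path, and the token are placeholders:

```java
import com.sun.net.httpserver.HttpServer;
import org.bukkit.plugin.java.JavaPlugin;

import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.file.Files;
import java.nio.file.Paths;

// Minimal plugin that exposes logs/latest.log over HTTP, protected by a shared token.
public final class LogApiPlugin extends JavaPlugin {

    private static final String TOKEN = "change-me";   // placeholder shared secret
    private HttpServer server;

    @Override
    public void onEnable() {
        try {
            server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/log", exchange -> {
                String auth = exchange.getRequestHeaders().getFirst("Authorization");
                if (!("Bearer " + TOKEN).equals(auth)) {
                    exchange.sendResponseHeaders(401, -1);   // reject unauthenticated callers
                    return;
                }
                // Path is relative to the server's working directory.
                byte[] log = Files.readAllBytes(Paths.get("logs/latest.log"));
                exchange.sendResponseHeaders(200, log.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(log);
                }
            });
            server.start();
        } catch (Exception e) {
            getLogger().severe("Could not start log API: " + e.getMessage());
        }
    }

    @Override
    public void onDisable() {
        if (server != null) {
            server.stop(0);
        }
    }
}
```

The bot would then fetch http://your-server:8080/log with the matching Authorization header.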
As MCTzOCK mentioned, you can't view it without asking the server owner for permission.
I have a database of files that are already tagged. Now I would like to upload these files to an ownCloud or Nextcloud server and pass on my existing tags so that they show up as tags in the respective system. I haven't yet been able to find a way to do this in the documentation; does anyone have an idea how I could do it?
Thanks!
I just made the source code of the (remote) file-tagging micro-service for Nextcloud available on GitHub (https://github.com/julianthome/taggy). The implementation consists of two parts: 1) the taggy client, for uploading files to the Nextcloud server and for invoking the taggy server; 2) the taggy server, for adding the specified tags to the uploaded files.
I will polish the code further over the next few days. I am also planning to add SSL support, which is important because the username and password are currently transmitted to the taggy server unencrypted. The server uses these credentials to check whether the user can be properly authenticated before tagging any files.
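If you want to talk to the server directly instead, the tag assignment itself is a single WebDAV call. A minimal sketch, assuming Nextcloud's systemtags-relations endpoint and that you already know the file's id and the tag's id (both are normally looked up first, via a PROPFIND for oc:fileid and a listing of /remote.php/dav/systemtags, which is left out here); server, credentials, and ids are placeholders, so check the details against the Nextcloud documentation:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Assigns an existing system tag to an already-uploaded file on a Nextcloud server.
public class AssignTag {

    public static void main(String[] args) throws Exception {
        String server = "https://cloud.example.com";
        String user = "alice";
        String password = "app-password";
        long fileId = 1234;   // oc:fileid of the uploaded file (placeholder)
        long tagId = 42;      // id of the already-created system tag (placeholder)

        String auth = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(server + "/remote.php/dav/systemtags-relations/files/"
                        + fileId + "/" + tagId))
                .header("Authorization", "Basic " + auth)
                .PUT(HttpRequest.BodyPublishers.noBody())
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("Status: " + response.statusCode()); // 201 means the tag was assigned
    }
}
```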
Please let me know if you have other ideas, suggestions or feedback ;)
Kind regards
I've got a DotNetNuke system (v 5.6) that's hosting several different portals, and I'd like to move one of them to another hosting provider. What's the easiest way to do this?
Every web site I find that claims to explain how to move a DotNetNuke site essentially says "Copy the entire database over to the new system." That's great if you've only got one portal in the database, but I've got a dozen of them. I only want to move one portal, not all of them.
Exporting the site to a .template is another popular suggestion. This exports the structure of the site (all the tab definitions, for example), but it doesn't include any of the actual HTML content. As such, that's essentially worthless.
There must be a reasonable way to do this short of trying to strip one individual portal's data out of every single DNN table. Right?
When you export a site template, you can include the content of the site, as well (for the modules that support portability, which includes the standard HTML module). This is how the default site template has all of its content. When you do this, there will be a .template.resources file that you'll need, as well as the .template file.
The other option is to do a full backup and restore, and then remove the other sites once you've restored. If you have significant content in a module that doesn't support portability, I think this will be your best bet.
FYI, I did find a solution from someone over on the DotNetNuke forums.
Create a 2nd version of that install, then delete all the other portals. Move the install with the one portal. We've done this several times with installs with lots of portals and it works just fine. Yeah, there's still some noise left in the db, but it's a quick and effective way of doing things.
Edit: note that this will give you an install with one portal. You can't detach a portal from one install and reattach it to an existing install (well, you can, but basically you have to export the portal as a template and that isn't 100%).
This is the approach I took, and sure enough, it works.
In a nutshell:
Mirror the files for the web site to another server.
Mirror the DNN database to another server.
Log in as Host on the new setup and delete all the portals except the one you want to migrate.
Delete any module definitions that are not in use by the remaining portal.
Open up your favorite SQL tool and delete any entries in the Users and UserProfile tables that no longer have a matching row in the UserPortals table (see the sketch after these steps). DNN does not remove these by default, which is frustrating.
Hop into Windows Explorer and delete all of the Portal folders you no longer need (i.e., /Portal/1, /Portal/2, etc.).
Back up the database using Enterprise Manager to create a .bak file.
Make a .zip of the entire DNN installation folder.
You now have a .bak that contains the database and a .zip that contains the files. Send those off to the new hosting company, and you should be all set. Just make sure to update your web.config so the connection string points to the database server at the new host.
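The orphaned-user cleanup above is the only step that needs raw SQL. Here is a sketch of those deletes, wrapped in JDBC only to keep the example self-contained; the connection string is a placeholder, and the table/column names assume a standard DNN 5.x schema with no object qualifier prefix:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Removes Users/UserProfile rows that no longer belong to any remaining portal.
// Run this only against the copy of the database you are about to migrate.
public class PurgeOrphanedUsers {

    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://localhost;databaseName=DNN;user=sa;password=secret";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Child table first, then the Users table itself.
            stmt.executeUpdate(
                "DELETE FROM UserProfile WHERE UserID NOT IN (SELECT UserID FROM UserPortals)");
            stmt.executeUpdate(
                "DELETE FROM Users WHERE UserID NOT IN (SELECT UserID FROM UserPortals)");
        }
    }
}
```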
It's just that easy. ;)
Our users' web passwords, usernames, first names, last names, etc. are in the dbo.contacts table in our CRM. This is great for the CRM and our CRM-compatible apps, but I would love to query these accounts with software that can only query LDAP.
Is it possible to tell OpenLDAP, "Hey, create logins using this table*," and to update this information periodically, since the information obviously changes over time? My scripting-fu isn't very strong, but I've worked with PHP and web services, and I would just like to get LDAP talking to this table so I can get serious about single sign-on.
Thanks.
*This can be a live connection to the CRM db via ODBC/ADO, a CSV file, or a connection via web services.
This has nothing to do with OpenLDAP specifically. LDAP clients can use the add request to add entries, assuming the client's authorization state allows adding users under the base object chosen by the client. There is a standalone modify client called ldapmodify; see "LDAP: Mastering ldapmodify" for more information.
Be aware that some versions of the OpenLDAP ldapmodify tool are broken in that they incorrectly allow values with trailing spaces (which is illegal). The directory server base64-encodes these values, which is probably not what was intended.
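If you end up scripting the sync yourself, the add request mentioned above is straightforward to drive programmatically. Here is a sketch that reads the CRM table over JDBC and adds entries via JNDI; the URLs, credentials, base DN, column names, and inetOrgPerson mapping are all assumptions to adapt:

```java
import javax.naming.Context;
import javax.naming.directory.BasicAttribute;
import javax.naming.directory.BasicAttributes;
import javax.naming.directory.InitialDirContext;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Hashtable;

// Periodically copy rows from the CRM's dbo.contacts table into an LDAP directory.
// All names (columns, base DN, URLs, credentials) are placeholders.
public class ContactsToLdap {

    public static void main(String[] args) throws Exception {
        // Bind to the directory as a user allowed to add entries under the base DN.
        Hashtable<String, String> env = new Hashtable<>();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
        env.put(Context.PROVIDER_URL, "ldap://ldap.example.com:389");
        env.put(Context.SECURITY_AUTHENTICATION, "simple");
        env.put(Context.SECURITY_PRINCIPAL, "cn=admin,dc=example,dc=com");
        env.put(Context.SECURITY_CREDENTIALS, "secret");
        InitialDirContext ldap = new InitialDirContext(env);

        String crmUrl = "jdbc:sqlserver://crm;databaseName=CRM;user=sync;password=secret";
        try (Connection crm = DriverManager.getConnection(crmUrl);
             Statement stmt = crm.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT username, firstname, lastname, email FROM dbo.contacts")) {

            while (rs.next()) {
                BasicAttributes attrs = new BasicAttributes(true); // case-insensitive attribute ids
                BasicAttribute objectClass = new BasicAttribute("objectClass");
                objectClass.add("inetOrgPerson");
                attrs.put(objectClass);
                attrs.put("uid", rs.getString("username"));
                attrs.put("cn", rs.getString("firstname") + " " + rs.getString("lastname"));
                attrs.put("sn", rs.getString("lastname"));
                attrs.put("givenName", rs.getString("firstname"));
                attrs.put("mail", rs.getString("email"));

                String dn = "uid=" + rs.getString("username") + ",ou=people,dc=example,dc=com";
                ldap.createSubcontext(dn, attrs); // the LDAP "add" request
            }
        }
        ldap.close();
    }
}
```

Re-running this as-is will fail on entries that already exist; for the periodic refresh you would either delete and re-add, or call modifyAttributes for DNs that are already present.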
I have this problem: I can create a setup for my app/database, but anyone can open the backend file, which is an MS Access database. I want to make it so the database can only be accessed through the frontend app. Please help me ...
You will need to create a separate front-end app that accesses the database, and keep the database file somewhere secure where direct access to it is limited/restricted. You should also password-protect the file in case someone does manage to get it. See tip #10 here: https://web.archive.org/web/1/http://blogs.techrepublic%2ecom%2ecom/10things/?p=552 for info about password protection (though you might find the other tips useful as well).
Whatever you do, don't rely on Access password protection. It can be cracked in minutes by tools freely available on the internet.
I would get Garry Robinson's book Real World Microsoft Access Database Protection and Security.
It is the most comprehensive guide to securing an Access database that there is.
A first step, however, would be to put the backend file in a restricted folder.