Unable to finish testing ColdFusion 11 website migrated from ColdFusion 9 due to CFIDE\scripts

My team recently made the migration from ColdFusion 9 to ColdFusion 11. We are currently in the testing phase. During our testing we discovered that numerous drop-down lists were not being populated. Those lists are populated based on the selection made in other lists.
At first we looked into cfselect, but discovered that wasn't the issue. As we dug deeper we found that the scripts (cfform.js, cfmessage.js, cfajax.js, cf.css, and several others) we leverage for functionality were no longer accessible due to the CFIDE lockdown. After doing some research online, we did the following with the help of our WebOps team:
1. Create a folder in D:\CF11\cfusion\wwwroot\ called cfM_scripts.
2. Move the scripts folder from CFIDE into cfM_scripts.
3. In IIS, right-click the website devtest.mysite.com and select Add Virtual Directory.
4. Name the alias /cfM_scripts.
5. Point the virtual directory to D:\CF11\cfusion\wwwroot\cfM_scripts\scripts\.
6. In the CF Administrator settings, set "Default ScriptSrc Directory" to /cfM_scripts/scripts.
Despite one of our system admins doing all of the above, we are still stuck with the same problem. I know best practices say not to use those built-in script files; however, we just want to get the site working properly in testing before we start any major changes.
Was this done correctly? If not, what did we miss? Is there another workaround to gain access to those files?

Your strategy should be good.
You might want to check a couple of things in your test environment.
Add the CFIDE virtual directory back to the site (and remove the global scripts value from cfadmin), the way it used to be on CF9, and verify that everything works. This will confirm that the missing CFIDE virtual directory really is the problem.
Try adding the scriptSrc attribute to a <cfform> tag and see if that works.
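For example, a minimal test along these lines (a sketch; it assumes the /cfM_scripts virtual directory from the question resolves, since scriptSrc takes the URL of the directory that contains cfform.js):
<!--- scriptsrc points at the directory holding cfform.js and friends --->
<cfform name="testForm" scriptsrc="/cfM_scripts/scripts/">
    <!--- a required cfinput forces the validation script to load --->
    <cfinput type="text" name="testField" required="yes" message="Field is required.">
</cfform>
If the form validates in the browser, the scripts are reachable and only the default path was wrong.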
If the missing CFIDE virtual directory is the problem, you may want to consider implementing a solution like this instead.
Copy the full CFIDE folder {cfusionroot}\cfusion\wwwroot\CFIDE to a new folder. It can be outside of the {cfusionroot} folder structure.
Remove all of the 'unsecure' folders from the new CFIDE folder, leaving essentially just the scripts folder.
Add this new safe folder as your CFIDE virtual directory.
This solution avoids having to change your cfadmin setting or add the scriptSrc attribute to every one of your <cfform> tags, because the CFIDE virtual directory still exists with its /cfide/scripts folder.
This should be safe because /cfide/adminapi, /cfide/administrator, etc. will be gone from your copy of the CFIDE virtual directory.
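On Windows the copy-and-prune could look like this (a sketch; D:\SafeCFIDE is a made-up target, and only three of the folders to delete are shown -- remove everything except scripts):
:: Copy the whole CFIDE tree, then delete the sensitive folders.
robocopy D:\CF11\cfusion\wwwroot\CFIDE D:\SafeCFIDE /E
rd /s /q D:\SafeCFIDE\administrator
rd /s /q D:\SafeCFIDE\adminapi
rd /s /q D:\SafeCFIDE\componentutils
:: ...repeat for the remaining folders, leaving only D:\SafeCFIDE\scripts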
Use the F12 network tab in a browser to make sure that the status of all requests is 200 (success). The script requests should be listed there.

Related

Can't open Fossil repo over web

I've been struggling with this problem for a couple of days, but can't seem to fix it. I think I'm almost there... but... not quite :(
This is where I am at.
I'm on a headless Debian server, running Virtualmin/Webmin for creating my domains/users etc. I don't know if this complicates things, but I'm happy to modify the config files manually (via Webmin or via ssh/vim).
I am attempting to run Fossil as a CGI service under Apache.
It's an internal site, named homeserver.net. I can reach the default pages just fine, and can add and create links etc. as I want to.
Please note that the solution to my problem is at the end of the question.
The files are located on disk at the path below, which tallies with my Apache document root:
/home/homeserver/www
I would like to use Fossil both for the internal site and, later on, for any dev work that I practice on, in separate files. So I created a new directory for these repositories:
/home/homeserver/repos/web/site.fossil
/home/homeserver/repos/dev/ [no repository yet!]
Reading the instructions on the Fossil page, I created a short CGI file called 'fos_repo.cgi' that reads:
#!/usr/bin/fossil
directory: /home/homeserver/repos
notfound: http://www.homeserver.net/site404.htm
When I open the link to
www.homeserver.net/cgi-bin/fos_repo.cgi
I get redirected to the 404 page that I have written. So the script is clearly being read and working.
From reading the fossil pages I understand that I should be able to use the following link to open/access the repo.
www.homeserver.net/cgi-bin/repos/web/site
I'm not sure why this isn't working...
So far I have tried the following:
I served the repository from the CLI, with the server running in the background:
fossil server site.fossil &
I thought maybe the file should be inside the main repo directory, not inside a subdirectory, so I moved it; it now lives in
/home/homeserver/repos/site.fossil
I tried creating an alias to the file in Apache:
Alias /home/homeserver/repos/web/site.fossil /home/homeserver/www/repos
When I browse to
www.homeserver.net/repos/site
I get nothing, but going to
www.homeserver.net/repos/site.fossil
will attempt to download the file (which is binary).
So I think I'm getting somewhere, but I'm not sure what I'm missing.
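(Note: Apache's Alias directive takes the URL path first and the filesystem path second, so the line above has its arguments swapped. Even in the corrected order, sketched below, Alias only serves files statically, which is why the browser offers site.fossil as a download instead of running Fossil; CGI execution only happens through the fos_repo.cgi script.)
# URL prefix first, filesystem path second; this serves files, it does not run them
Alias /repos /home/homeserver/repos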
I've used fossil before, but I ran it as a local server, and started it up as and when I needed it.
I'm running it like this so that I can eventually push the site out to a live VPS (maybe even end up hosting the Fossil site on the VPS as well).
PS: I really liked Fossil when I used it before; I loved the whole integrated wiki and bug tracker, and the fact that I could simply copy the file to my external drive to do a backup. I don't really want to change to something else, but if I have to...
Thanks in advance.
David
Edit: trying other options.
So I thought I would try the single-repository method shown on the Fossil page, and adjusted my CGI script accordingly.
Now when I navigate to www.homeserver.net/cgi-bin/fos_repo.cgi I get the following message returned:
SQLITE_CANTOPEN: cannot open file at line 30276 of [f5b5a13f73]
SQLITE_CANTOPEN: os_unix.c:30276: (21) open(/home/homeserver/repos)
However, if I ssh to the server and start it manually with
fossil server site.fossil
I can get to the server with www.homeserver.net:8081
So either I have a problem with my SQLite usage under Apache, or something else is wrong. Please help.
Solution
For reasons of simplicity I've decided to go with a single CGI file for each repo.
My initial directory structure was as follows:
/home/homeserver/www
/home/homeserver/www/repos
/home/homeserver/www/repos/web # for web site development
/home/homeserver/www/repos/dev # for other development
I think part of my problem was that I was hoping that, by having the directory: line point to my repos/ location, Fossil would find the site.fossil file (located in repos/web) and the dev.fossil file (located in repos/dev).
Obviously this didn't work.
The reason I wanted it to look like this was for separation of the information on my system.
For some reason I had decided that pointing Fossil at repos/ would automatically give me a nice Fossil-style front page with links to my repositories. However, after using the directory: version and getting the following error message
Unable to find or open the project repository
I realised that I was still going to need to write my own front page for the repositories, and that my expectation was a little too much.
So I've decided to run with a single CGI file pointing at each repo that I need to make.
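For anyone else, the per-repo script ends up being just two lines (assuming the repository sits at the path from the directory listing above):
#!/usr/bin/fossil
repository: /home/homeserver/www/repos/web/site.fossil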
Instead of
www.homeserver.net/cgi-bin/repos/web/site
try
www.homeserver.net/cgi-bin/repos.cgi/index
Reading your (very long) question again, I suggest trying
www.homeserver.net/cgi-bin/fos_repo.cgi/index
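If you ever go back to the one-script-for-many-repos layout, a directory-mode script like this sketch should also work (the repolist line, which newer Fossil versions accept, makes the script serve an automatic index of your repositories, roughly the front page you were hoping for):
#!/usr/bin/fossil
directory: /home/homeserver/www/repos
repolist
notfound: http://www.homeserver.net/site404.htm
Note that the script name stays in the URL: www.homeserver.net/cgi-bin/fos_repo.cgi/web/site would address repos/web/site.fossil.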

Changing the WebLogic Domain location

I am currently setting up a new dev environment, and have come to the final stage where I am trying to run a build.
However, one of the Ant targets tries to create a directory, currently set to "C:\workspace\domains\Online". For security reasons (so they say, anyway...) we do not have full access to the C: drive, so I have my domain set up in an alternate location. Where is this Domain Home/Root variable kept?
Well, in my own domain the file <domain>/bin/setDomainEnv.bat contains the following line:
set DOMAIN_HOME=D:\Oracle\Middleware\user_projects\domains\domain_name
However, since you are using an Ant build file to create your domain, maybe something is hardcoded in it, or the location is one of the properties passed to the file.
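For illustration, a sketch of how a build file usually makes this overridable (domain.home is a hypothetical property name, not something taken from your build):
<!-- default can be overridden per machine: ant -Ddomain.home=D:\path\to\domains\Online -->
<property name="domain.home" value="C:\workspace\domains\Online"/>
<target name="create-domain-dir">
    <mkdir dir="${domain.home}"/>
</target>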
The Ant build file, which had been supplied by somebody else, did have the path hardcoded! Not sure why they couldn't follow the convention. It just so happened that the hardcoded location was the same as my old workspace, hence the confusion.
Thanks all.

Drupal 7: problems with file permissions and IMCE in sites/default/files directory

I have looked around a great deal on the Drupal forum and elsewhere but I cannot yet resolve this.
I have had to reinstall a large, fully functional site (Drupal 7.18) onto a new server. This has gone very smoothly. However, I do not seem to be able to set permissions for my sites/default/files directory in a manner that keeps it accessible and safe when browsing using the IMCE file browser.
Usually I set sites/default/files (and subdirectories within it) as 755, with files within these directories as 664. This works well on many other Drupal 7 sites I have built.
HOWEVER in this case, with these permissions I get the message "Unable to get a working directory for the file browser".
Only by setting directory permissions as 777 can I browse the files in these directories using IMCE - and I know that is really bad practice on shared hosting.
Please can someone advise on troubleshooting this? I have spent hours but I am getting nowhere.
I wonder if the ownership of the files and directories themselves is wrong. If they are wrong, can anyone direct me to step-by-step instructions for changing them?
Examining the 'problem' files and directories using FireFTP, I see that both user and group names are the same as the FTP username that was given to me by my web host.
Looking at another Drupal site that works properly, I see that files and directories in sites/default/files are set to user 531/group 528.
Thanks in anticipation! I am running D7.18 on PHP 5.2.10 with extensions enabled. Everything else seems to be working very well indeed. However, I am not sure I have the Apache or Linux skills needed to resolve this, or even to ask my hosts the correct questions ...
755 means that only the owner of the files and directories can write to them, so you could try changing the directory permissions to 775 so that both the owner and the group can write.
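For example, something along these lines (a sketch; www-data is a guess at the group your web server and PHP run as -- ask your host which one actually applies):
# give the web server group write access to the files tree
chgrp -R www-data sites/default/files
# directories 775, files 664
find sites/default/files -type d -exec chmod 775 {} +
find sites/default/files -type f -exec chmod 664 {} +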
If you are using shared hosting I suggest you ask your hosting provider to help as they will have a better understanding of the users and groups on the server.
Cheers

After moving my Drupal 7 site to another host, modules appear without having been downloaded

I'm sorry for such a complex subject line. My problem is this:
I tried to move my Drupal 7 site from one server to another.
I uploaded a fresh install to my new site.
I backed up my old database and imported it into the new site.
I uploaded my settings.php file to the new site.
When I opened my new site's modules page, all the third-party modules I had installed on my old site were there without any loss :)
But when I check the folders
/sites/all/modules
/sites/default
/modules
I couldn't find any files belonging to these modules.
The modules show up in my new site's admin panel and I can create content using them, but I can't see their files using a file manager etc.
Thank you
Because you didn't disable the modules before moving your deployment, I think they're still marked as enabled in your database, and the same goes for the registration of their content types. Since the module files don't actually exist, you won't get their functionality.
Restoring the module files under /sites/all/modules or /sites/default/modules should leave them working as before.
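A sketch of that restore, assuming you still have the old docroot and have drush available (both paths are illustrative):
# copy the contributed modules from the old site into the new one
cp -R /path/to/old-site/sites/all/modules/. /path/to/new-site/sites/all/modules/
# clear all caches so Drupal rescans the module files
drush cc all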

Can't read or write to directory with CFFILE despite 777 permissions

ColdFusion is installed on a Unix system I don't have direct access to, but I can get insight into it by sitting with a network team.
The problem is this: I have 3 folders I need read and write access to, but I only have access to 1 of them, and only for reading. This is via ColdFusion; I can get into them fine as the user they are assigned to (which is also the user the CF server runs as, the "www" user).
I CAN read and write to the temporary file directory, the place files are stored before they are moved to the destination directory (SERVER-INF/ etc etc etc), but that's not helpful. I have tried having the network people set the permissions of the other folders to the same thing, but with no results. The current settings of the folder I can access are rwxrws--- and the other folders are rwxrwxr-x, so I should have more permissions (the "s" is not a mistake in the first folder).
We have tried setting the other folders to 777 and did not even get read capability. Does the server need to be restarted on a Unix box after setting new permissions for ColdFusion to be able to get to them? I'm out of ideas right now; I'll take any new suggestions.
TL;DR
All using ColdFusion
temp directory - can read and write to
folder 1 - can read from (including subdirectories)
folder 2 - cannot read or write to (permission denied)
folder 3 - cannot read or write to (permission denied)
Goal: Get upload functionality working.
Edit: the server is using Apache.
Just a random guess... Have you checked that the paths you are trying to access are fully correct? They should be absolute for file operations, and the www user must have execute (x) permission on all directories along the path in order to enter them.
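For example (a sketch; /path/to/folder2 stands in for one of your real folders):
# namei (util-linux) prints owner, group and mode for every path component
namei -l /path/to/folder2
# or check each level by hand
ls -ld /path /path/to /path/to/folder2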
The problem ended up being that a restart was required after setting the new folder permissions. We didn't think this would be an issue on a Unix box, but ColdFusion apparently disagreed. After the restart everything worked.