How to browse source code in the Developers Console on Google Compute Engine - repository

I used Click to Deploy to set up a MEAN stack on Google Compute Engine. Everything is fine; it runs in the cloud and on my local machine.
When I try to browse the source code in the Developers Console, I get the "Getting started" screen for the cloud source tools.
Should it not have been set up automatically when I created the instance?
When I SSH into the instance, I can see the contents of /opt/myApp.
I would like to see this under "Browse source code" as well.
If I try to git clone the Cloud Repository, it is empty.
I'm currently using the trial version. Maybe this is the reason I cannot browse or clone the Cloud Repository?

This is a really good suggestion; it would be very interesting to integrate Click to Deploy with more Google Cloud Platform services. The MEAN Click to Deploy doesn't install much of a sample application, which is why it doesn't make sense to put it in a code repository. The application stack itself (MongoDB, Express, Angular, Node), which is really the only thing installed on the MEAN Click to Deploy server, is not something you would put in a repository.
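That said, if you want your own code in /opt/myApp to show up under Browse source code, you can push it to the project's Cloud Repository yourself. A rough sketch with the 2015-era tooling; PROJECT_ID is a placeholder for your project, the remote URL is the usual Cloud Source Repositories endpoint (verify it for your project), and you may need to set up git credentials with the gcloud tool first:
cd /opt/myApp
git init                      # start tracking the app directory
git add . && git commit -m "Initial import of the MEAN app"
git remote add google https://source.developers.google.com/p/PROJECT_ID/
git push google master        # the files should then appear under Browse source code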

Related

Serve Oracle Service Cloud Customer Portal locally?

I am working on customizing the Oracle Service Cloud (OSvC) Customer Portal, but OSvC provides only WebDAV to connect to it, so it is very time-consuming to edit files and then upload them to WebDAV, even for a single-word change.
I am looking for a way to serve it locally, make the desired changes, and then upload the modified code to WebDAV.
After examining the file structure, I cannot work out which framework it uses. I tried websites like https://builtwith.com/ and WhatRuns, but they were not able to find anything useful either.
I did find some CodeIgniter files in the file structure, but the layout is quite different from the standard CodeIgniter folder structure.
The short answer is no, you will not be able to run Customer Portal locally. While it is a fork of CodeIgniter from many years ago, there are server-side dependencies that will prevent you from running it in a local sandbox.
That said, it is possible to automate many of the manual tasks of interacting with WebDAV for change testing. If you edit locally, you can use scripting hooks or even RPA robots to automate some of the manual file movement. Personally, I have a flow where I edit remotely in my test environment with an editor (like VSCode or Nova) that can connect to a remote server via WebDAV and edit files directly in the development area of a site. Then, when finished, I have a script that pulls down the latest version of all files and lets me commit the changes to Git for SCM.
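The push-and-pull steps of such a flow can be simple curl calls; a minimal sketch, where the site URL, the path under the development area, and the credential variables are placeholders for your own instance:
# Push a locally edited file up to the development area:
curl --user "$OSVC_USER:$OSVC_PASS" \
     --upload-file views/pages/home.php \
     "https://yoursite.custhelp.com/dav/cp/customer/development/views/pages/home.php"
# Pull the current remote copy back down and record it in Git:
curl --user "$OSVC_USER:$OSVC_PASS" \
     -o views/pages/home.php \
     "https://yoursite.custhelp.com/dav/cp/customer/development/views/pages/home.php"
git add views/pages/home.php && git commit -m "Sync home.php from the dev area"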
Another option is RPA. You can develop a robot to automate the manual tasks in your workflow. Personally, I think scripting is a better solution than RPA, since you can automate all of the actions via a script or a shell, but it's another option to consider.
Another way of "live editing" the OSvC CP code is to connect to WebDAV via software that supports it, such as Mountain Duck, which uploads your code to OSvC on save.
Or use Windows Explorer, which supports connecting to WebDAV and treating it like a network drive: go to My Computer -> Computer -> Map Network Drive, enter https://yoursite.custhelp.com/dav/cp, click Next, and you'll be prompted to log in with your OSvC credentials.
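The same mapping can also be scripted from a Windows command prompt instead of the Explorer dialogs; a one-line sketch, assuming the WebClient service is running and the Z: drive letter is free:
net use Z: https://yoursite.custhelp.com/dav/cp /user:your_osvc_login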

Jenkins Execution Issue

A Selenium script, when executed from Eclipse on a local machine, clicks an UPLOAD button in the browser to upload some files from the local machine to the application running in the browser.
This upload step fails when the script is executed from Jenkins, because the Jenkins server cannot access the local machine.
Any idea how to overcome this issue?
Accessing these files on your local machine from the server is a bad idea (unless the machine is a Jenkins slave), though it can be achieved from your code or a command-line tool by providing the machine's address and credentials.
The better solutions are to add these files to the Jenkins server as well, or to commit them to your repository and check them out before the test (see the sketch below).
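A minimal sketch of the repository approach, with a hypothetical fixtures repository and a Maven test invocation; the point is that the files end up in the Jenkins workspace, so the Selenium code can reference them via a workspace-relative path rather than a path on your personal machine:
# Pre-test "Execute shell" build step on the Jenkins job:
git clone https://example.com/team/test-fixtures.git "$WORKSPACE/fixtures"
mvn test -Dupload.dir="$WORKSPACE/fixtures"   # the test reads the upload path from this property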
A shared folder on a filer-type server or similar could also do the trick.
Hope this helps. In any case, there's no substitute for your own research, e.g. Google and Stack Overflow.

Where did expiring S3 URLs go in the AWS console?

I used to be able to create expiring (signed) URLs directly from within the AWS console very easily: just right-click the item, choose Web URL, and voilà.
This feature seems to have disappeared sometime this year. Does anyone know how to do it now? I am not looking for a programmatic solution; I just occasionally need to whip up an expiring URL for a file, about once or twice a year. I really don't want to bother with a third-party tool either; I want to know how one does it from within the AWS console now, in 2015.
This option was never available in the AWS Management Console.
If you use Actions -> Open on the file from within the console, it generates a signed URL, but there is no way to edit its parameters.
It is, however, available in:
The AWS Toolkit for Visual Studio
Cloudberry Explorer PowerShell
Bucket Explorer
GitHub amazon-s3-url-signer
GitHub s3-signed-url
Your own code! Signing Amazon S3 URLs (see the sketch below)
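As an illustration of the last option, a rough sketch of building a pre-signed GET URL with the legacy V2 query-string signature that S3 accepted in 2015; BUCKET, KEY, AWS_ACCESS_KEY, and AWS_SECRET_KEY are placeholders, and the exact string-to-sign rules are in the S3 documentation:
EXPIRES=$(( $(date +%s) + 3600 ))            # URL valid for one hour
SIGNATURE=$(printf "GET\n\n\n${EXPIRES}\n/${BUCKET}/${KEY}" \
  | openssl dgst -sha1 -hmac "$AWS_SECRET_KEY" -binary \
  | openssl base64 \
  | sed 's/+/%2B/g; s|/|%2F|g; s/=/%3D/g')   # URL-encode +, / and = in the signature
echo "https://${BUCKET}.s3.amazonaws.com/${KEY}?AWSAccessKeyId=${AWS_ACCESS_KEY}&Expires=${EXPIRES}&Signature=${SIGNATURE}"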

Amazon Git Eb Tools not deploying

I followed the help page here
and I created my application through the eb command-line tools. I can see the application on the command line; however, when I log into my AWS Management Console, select the region, and click on Elastic Beanstalk, I see the welcome screen asking me to create a new application. My question is: why can't I see my application in the web interface? The application is running on the web, because the URL is live when I visit it, but I see this message: "Could not find rake-10.0.2 in any of the sources (Bundler::GemNotFound)" (if anyone knows how to fix that, it would also help).
Thanks in advance
It turns out that somehow Amazon had changed my secret keys, and my instances were not showing. When I updated the keys on my local machine, everything began working correctly.
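In the same situation, it is worth confirming what the CLI itself sees before checking the console; a hypothetical check with the old eb command-line tools (command names and options vary by version):
eb init      # re-enter your access key, secret key, and region
eb status    # should list the environment if the credentials and region are right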

How do I publish PHP source code to a local web server in Rational Team Concert?

I'll be using RTC in the near future here at work. My question is: where does it put the files that team members will be working on? I understand that each programmer works on the project's files and pushes changes to the main repository. We have a local web server where we test our work (PHP). So, do we have to configure RTC to publish the files to the web server, or must the RTC server be installed on the web server so it can save the files there?
We use Rational Team Concert almost exactly as you describe, and it works brilliantly. My small team of web developers collaborates on website source code and delivers it to two different streams depending on its readiness: production-stream and staging-stream. Then we have defined two builds that check out the source code, move some things around, and push the files to the web servers via SCP. So, with a few clicks we kick off a staging build, watch it finish in about two minutes and everyone can see the changes on the staging server. When the code is ready for prime-time, the change sets are delivered to production-stream and the production build is kicked off, which is configured to copy the files to the production web server.
But even before a staging or production build is run, any of us can simply configure a local web server in RTC using the Eclipse PDE and Web Tools add-ons and see the site running in localhost as we develop.
All our work is done within Rational Team Concert, from planning, to bug tracking, to source control, to builds. It's very well-suited for website management.
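The deploy step of such a build can be quite small; a rough sketch with hypothetical host names, paths, and workspace names (the scm command options vary by RTC version):
# Load the sources from the build workspace, then copy them to the web server:
scm load -r https://rtc.example.com:9443/ccm "staging-build-workspace" -d build/
scp -r build/website/* deploy@staging.example.com:/var/www/html/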
Your understanding is correct: you work on files locally, and they are uploaded to the server when you check in. Bear in mind that check-in in RTC terms really means backing your files up to the server; it is the Deliver command that shares the files with others (it is worth a quick look at the articles on jazz.net that explain how SCM works).
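The distinction also shows up in the scm command-line tool; a minimal sketch with a hypothetical file name (options vary by RTC version):
scm checkin src/index.php   # backs the change up to your repository workspace only
scm deliver                 # shares the pending change sets with the team's stream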
One way to publish to your PHP server is to make that part of a build, or a build in its own right (which RTC also handles, in conjunction with your favourite build tool). The build would copy the files to the PHP server. The advantage of doing this as a build is that you know exactly which versions of your files are being copied, and you can reproduce the copy at any point in the future.
You do not need to install the RTC server on the PHP server.
You can also try posting on the forums on http://jazz.net/ if you have questions on RTC.
Hope that helps.
Another alternative would be to use the command line interface to accept all changes into a workspace and run that with a cron job.
To handle discarded change sets, you'd probably want to use something like:
scm workspace replace-components <workspace-name> stream <uuid-of-stream> --all
after you had initially loaded the workspace on your web server.
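A minimal crontab sketch of that approach, assuming the workspace was initially loaded at /var/www/html and that the scm tool is logged in to the repository (the URI and schedule are placeholders):
# Accept incoming changes into the loaded workspace every 15 minutes:
*/15 * * * * cd /var/www/html && scm accept -r https://rtc.example.com:9443/ccm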