Data migration from Liferay Portal to WebSphere Portal

I need to migrate the content of the list below from Liferay Portal to WebSphere Portal, and we will use IBM Connections for the portlets.
a. Calendar
b. Discussion boards/blogs/forums
Can anyone suggest an API that can be used for migrating the data?
I have two options in mind, but I don't see results on the web:
a Liferay API that will export the data/content (LAR?) for import into WebSphere Portal, or
export the data/content to a CSV file and write a program that creates the new blogs, forums, etc. in WebSphere Portal.
thank you,
-Jhei

If you have a small data set, it would be good to create a migration app that uses RESTful web services (or direct DB access, if you have it) to fetch the data from Liferay and push it to WebSphere.
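A minimal sketch of the fetch-and-transform step. The Liferay field names follow its BlogsEntry JSON web service (`/api/jsonws/blogs.blogsentry/get-group-entries`), but the portal-neutral target schema here is hypothetical; adjust it to whatever your WebSphere/IBM Connections importer expects:

```python
import json

def to_portable_record(liferay_entry):
    """Map one Liferay blog entry (a JSON dict) to a portal-neutral record.

    The target keys ("title", "body", "author", "created") are a made-up
    intermediate schema -- replace them with your importer's real fields.
    """
    return {
        "title": liferay_entry.get("title", ""),
        "body": liferay_entry.get("content", ""),
        "author": liferay_entry.get("userName", ""),
        "created": liferay_entry.get("createDate", ""),
    }

# Example entry, abbreviated, as returned by the Liferay JSON web service.
sample = {
    "title": "Welcome post",
    "content": "<p>Hello</p>",
    "userName": "jhei",
    "createDate": "1388534400000",
}
record = to_portable_record(sample)
print(json.dumps(record))
```

Once entries are in this neutral shape, the push side (REST call or CSV write) can stay independent of Liferay's field names.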

Related

App for transporting SAP Commerce content between environments

We have a request from our business to create a custom application capable of extracting CMS content out of one SAP Commerce environment and loading it up into another SAP Commerce environment. The idea is that our content admins could "transport" a new page from QA to production without having to manually recreate it. Also, such a tool could be used in "refreshing" certain parts of one environment with the content of another.
I believe that we could write such a tool ourselves, and I found one other Stack Overflow post mentioning a technique to do this:
Can SAP commerce cloud (Hybris) export content?
We could create an OCC endpoint for both retrieving cms content and putting cms content.
My question to you all is does there exist a third party application that could do this for us already? It would be beneficial to our organization if we could purchase a solution that did this for us rather than writing and maintaining the app ourselves.
SAP just released an extension for db sync, check here.
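If you do end up writing the OCC-based tool yourself, the transport boils down to a GET from the source environment and a PUT to the target. A sketch, where the `/occwebservices/v2/...` path is entirely hypothetical and stands in for whatever CMS endpoint your custom OCC extension exposes:

```python
import json
import urllib.request

# Hypothetical endpoint path -- replace with the CMS content endpoint your
# own OCC extension actually exposes.
CMS_PAGE_PATH = "/occwebservices/v2/{site}/cms/pages/{page_id}"

def page_url(base, site, page_id):
    """Build the endpoint URL for one CMS page on one environment."""
    return base.rstrip("/") + CMS_PAGE_PATH.format(site=site, page_id=page_id)

def transport_page(source_base, target_base, site, page_id, token):
    """Fetch one CMS page from the source environment and PUT it to the target."""
    get_req = urllib.request.Request(
        page_url(source_base, site, page_id),
        headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(get_req) as resp:
        page = json.load(resp)
    put_req = urllib.request.Request(
        page_url(target_base, site, page_id),
        data=json.dumps(page).encode(),
        method="PUT",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})
    urllib.request.urlopen(put_req)

url = page_url("https://qa.example.com", "electronics", "homepage")
print(url)
```

The maintenance cost the question worries about is real, though: component references, restrictions, and media attached to a page all need the same treatment, which is why a supported extension like SAP's db-sync tool is attractive.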

How to use data from Azure Active Directory in Vue.js

I want to use Azure Active Directory for identity. I searched for a way to integrate it into a Vue.js single-page application, and the result was oidc-client-js.
Now I am looking for a way to store sensitive information about the user in a cookie rather than in the browser's local or session storage.
Many thanks for your support in advance!
I'm also grateful for alternatives to oidc-client-js.

Service Account for Google Data Studio to Access HTML Files on Google Cloud Storage

I have some HTML files uploaded into a Google Cloud Storage bucket that I would like to embed through an iframe on my dashboard in Google Data Studio.
This works just fine when I open access to the world on the bucket (or resource) by granting the allUsers permission.
However, I would prefer to only allow access through Google Data Studio. How can this be achieved?
I was thinking of adding a permission for the Service Account of Google Data Studio, but don't really know how to configure this correctly.
I don't believe this is possible right now.
However, a complex solution I can think of is to use a combination of Community Connectors and Community Viz:
Build a community connector that uses your own GCP service account to read the HTML files on GCS and send back the raw HTML content as data.
Build a community viz that can take the HTML data from the connector and render the HTML.
If you have multiple HTML files, you can set up filters in Data Studio so that each viz renders only one HTML file.
Code samples for Community Connector and viz are available here.
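The connector itself is written in Apps Script, but the GCS read it performs is just one authenticated HTTP GET against the Cloud Storage JSON API. A sketch of building that request URL (bucket and object names are placeholders; the request must carry a Bearer token for your service account):

```python
from urllib.parse import quote

def gcs_media_url(bucket, object_name):
    """URL that returns the raw object bytes (alt=media) via the GCS JSON API.

    The object name must be percent-encoded, including any "/" separators.
    Send the request with an "Authorization: Bearer <token>" header obtained
    for the service account (e.g. devstorage.read_only scope).
    """
    return ("https://storage.googleapis.com/storage/v1/b/"
            f"{quote(bucket, safe='')}/o/{quote(object_name, safe='')}"
            "?alt=media")

url = gcs_media_url("my-dashboard-bucket", "pages/panel.html")
print(url)
```

The response body is the HTML itself, which the connector can return as a field for the community viz to render.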

What a client has to enable to use my app to edit their Google Sheet

Let's say I wrote a simple desktop app that reads some data from a Google Sheet, goes off and finds the answer and comes back and writes it into the Google Sheet. Now hundreds of people have sheets and want to use my program to process them. I post my program and let them download it for free.
Here is the sequence I think the end user needs to follow to get it to work:
1. Go to the GCP console and create a new project
2. Enable the Google Drive and Google Sheets APIs
3. Create a credentials JSON file with access as the Project Editor.
4. Create the Sheet if needed.
5. Copy the email address out of the JSON file and share the Sheet with it.
6. Make sure the JSON file is named right and in the right directory so the app finds it.
That all seems like a lot just so an app can read and write to a Google Sheet. Is there a simpler interface I am missing?
Answer:
There are steps that will always need to be completed when a user runs an application that accesses the Google APIs as themselves; however, depending on the case, some of these may be simplified or circumvented.
More information:
There are a few things that you will need to bear in mind when creating an application for others to use. I'll summarise these points here and explain them in a bit more detail:
To use a G Suite API, you need to own a Google Cloud Platform (GCP) project with that API associated.
In order to use a Google API (such as Drive or Sheets), it must be enabled within the project that will use it.
If a user is running an application as themself, they will need to authenticate the application with their own credentials.
As per Google's documentation for Setting up a project:
Every application that uses Google APIs needs an associated Google Cloud Platform project. This project is a set of configuration settings that define how your application interacts with Google services and what resources it can use.
As a result, if your users will be running the application themselves, from a version of the app which is unpublished and not in the G Suite Marketplace, then yes - they will need to create a GCP project, enable the respective APIs, and use their personal credential file to run your application.
In reality this isn't the way G Suite applications should be created, distributed, and maintained, and I can't recommend that you do it this way. The G Suite Marketplace exists to provide a means of application distribution while removing the need for each user to create a GCP project and authenticate.
Things you can do:
With this in mind, there are a couple of things you can do that will make the process slightly easier/shorter. Please bear in mind this isn't recommended for application distribution, and is more for testing purposes:
You can combine points 1, 2 and 3 in your question by having users click the Enable the Google Sheets API button on the Python Quickstart page. This button creates a new GCP project, enables the API and gives a credential file download link in one fell swoop.
Rather than having the user create the sheet themselves, you can code a sheet-existence check in your program and create the sheet if it doesn't already exist:
from googleapiclient import discovery
# assuming you already have a `credentials` object from your authentication code
sheetsService = discovery.build("sheets", "v4", credentials=credentials)
# build the request body; the title here is just an example
sheetBody = {"properties": {"title": "My App Sheet"}}
# create the sheet and keep its ID for later calls
newSheet = sheetsService.spreadsheets().create(body=sheetBody).execute()
print(newSheet["spreadsheetId"])
The https://www.googleapis.com/auth/drive.file scope allows your application to only access files created by itself which gives your application an extra level of security and trust.
In the sheetBody contents you can include the email address of the account running the application, if applicable. Run a Files: list request of the Drive API and read the owners.emailAddress property of the response, which saves reading the address from the JSON file.
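A sketch of that Files: list call and the response handling. The `driveService` lines assume the same `credentials` object as the Quickstart; the extraction helper is illustrative:

```python
def owner_email(files_list_response):
    """Pull the first owner's address out of a Drive files.list response.

    Assumes the request asked for fields="files(owners(emailAddress))",
    so each file entry carries an "owners" list of {"emailAddress": ...}.
    """
    files = files_list_response.get("files", [])
    if not files:
        return None
    owners = files[0].get("owners", [])
    return owners[0]["emailAddress"] if owners else None

# The request itself would look like (with the Quickstart's `credentials`):
#   driveService = discovery.build("drive", "v3", credentials=credentials)
#   resp = driveService.files().list(
#       pageSize=1, fields="files(owners(emailAddress))").execute()
sample_resp = {"files": [{"owners": [{"emailAddress": "user@example.com"}]}]}
print(owner_email(sample_resp))
```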
References:
G Suite Developer:
G Suite APIs
Set Up Your Project
Build your app
Google Sheets API v4:
Python Quickstart
Google Drive API v3:
Files resource

Create external link to BIM360 document

I am researching to see if this is possible. I am new to the Forge API and Revit/BIM360 in general, so my apologies if I am not using the correct terminology. I have successfully made API calls from Postman to BIM360 to pull back project information.
What I would like to do is construct a URL from within our ERP application that passes the external project ID and the sheet name to a web application, and have that web application take the user to the BIM360 markup viewer for that item.
So the user would be looking at a work order in our manufacturing system, click the link for "View Production Ticket", and a screen would pop up, ask the user to log in to BIM360 if they haven't already, and open the BIM360 viewer.
To do so, I would advise you to integrate the following Forge APIs into your ERP system, rather than embedding BIM360 in the ERP system directly:
Forge OAuth API to fetch a 3-legged token with user login (their Autodesk account)
Forge Data Management API to obtain hubs, projects, items, versions, and the derivative file links of the model file in your BIM360 account.
View your model (derivative URN) via the Forge Viewer
Here is a tutorial for you: http://learnforge.autodesk.io/ Hope it helps!
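The first step, the 3-legged login, starts with the ERP link sending the user to Forge's authorize endpoint. A sketch of building that URL (the client ID and callback are placeholders; after sign-in, Forge redirects to the callback with a `code` you exchange for an access token):

```python
from urllib.parse import urlencode

AUTHORIZE_URL = "https://developer.api.autodesk.com/authentication/v1/authorize"

def login_url(client_id, redirect_uri, scope="data:read"):
    """Build the 3-legged OAuth authorization URL for a Forge app.

    The user signs in with their Autodesk account; Forge then redirects to
    `redirect_uri` with a `code` query parameter for the token exchange.
    """
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": scope,
    }
    return AUTHORIZE_URL + "?" + urlencode(params)

url = login_url("YOUR_CLIENT_ID", "https://erp.example.com/forge/callback")
print(url)
```

With the resulting token, the Data Management calls (hubs, projects, items, versions) return the derivative URN that the Forge Viewer page loads.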