Importing product image data from Hybris 5.2 to 6.7 - e-commerce

Can you please let me know the best approach to migrate product image data in Hybris? Should this be done using ImpEx, or is there another convenient approach?

Here is an approach you can use to migrate your data, though it may take some effort to implement.
If you are not really bothered about URL changes after importing the media/images, then I would suggest using the Hybris OOTB import/export wizard (System > Tools > Import / Export), where you can generate an export script for any ItemType and re-import the generated ImpEx and media into the target system.
For example:
# ---- Extension: core ---- Export Type: Media ----
"#% impex.setTargetFile( ""Media.csv"" );"
insert_update Media;&Item;#media[translator=de.hybris.platform.impex.jalo.media.MediaDataTranslator];altText;catalog(id)[allownull=true];catalogVersion(catalog(id),version)[unique=true,allownull=true];code[unique=true,allownull=true];convertedMedias(catalogVersion(catalog(id),version),code);dataPK;deniedPrincipals(uid);derivedMedias(&Item);description;internalURL;location;locationHash;mediaContainer(catalogVersion(catalog(id),version),qualifier);mediaFormat(qualifier);metaData(&Item);metaDataDataPK;mime;original(catalogVersion(catalog(id),version),code);originalDataPK;permittedPrincipals(uid);realFileName;removable[allownull=true];size;subFolderPath;supercategories(catalogVersion(catalog(id),version),code)
"#% impex.exportItemsFlexibleSearch( ""SELECT {PK} FROM {Media!} WHERE {catalogVersion} IN (8796054355417)"");"
In the above script, you can change the query to fetch only image media, for example:
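Here is a possible variant of the export query as a sketch (the catalogVersion PK is environment-specific, and the mime filter assumes your image media carry an image/* MIME type):
"#% impex.exportItemsFlexibleSearch( ""SELECT {PK} FROM {Media!} WHERE {catalogVersion} IN (8796054355417) AND {mime} LIKE 'image%'"");"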
There is also an advancedexport extension available that serves the same purpose with some additional functionality.
You will have to handle the extract, transform, and load phases on your own based on your business requirements.

Related

How to dynamically map functions to serverless.yml

My team has been working with Serverless and we're trying to establish a new company standard for file organization that eases collaborative development.
We plan to isolate each lambda/function handler in its own folder, alongside the function's .yml file and other necessary configs.
Example expected directory structure (lean):
-- /app
--- /functions
---- /func_a
----- func_a.py
----- func_a.yml
---- /func_b
----- func_b.py
----- func_b.yml
- serverless.yml
The problem so far is that we have to manually declare all the external function config files in the serverless.yml file, which defeats the whole purpose of the idea.
The question is: is there a way to automate this import?
What we've searched so far:
Wildcard path - does not work for file variables, e.g. ${file(./app/functions/*.yml)} or ${file(./app/functions/{any+})}
Extending configuration using custom plugins - does not seem to be able to modify the functions list. We only found information about: defineTopLevelProperty, defineCustomProperties, defineFunctionEvent, defineFunctionEventProperties, defineFunctionProperties, defineProvider.
Info from here: Serverless Doc - the base schema link is broken, so there is no other information aside from what is on that page.
Options we have considered:
Maybe there is a plugin that does this? We didn't find any.
Create an isolated custom script (Python) that is called before running sls deploy and creates the final serverless.yml file from a template by traversing all the folders.
What is the best and most natural approach to this?
I think that for your use case, you might be better off using the JS/TS configuration file format instead of YAML. That allows you to use regular JS to define your config, which makes importing such parts of the configuration much easier. See the TS template for an example of how to use it: https://github.com/serverless/serverless/blob/master/lib/plugins/create/templates/aws-nodejs-typescript/serverless.ts
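As a rough sketch of that idea (assuming each function folder exposes its config from an index.ts that default-exports the same properties its .yml file used to hold; the service name and runtime below are placeholders), serverless.ts can assemble the functions map by walking the folders:
import type { AWS } from '@serverless/typescript';
import * as fs from 'fs';
import * as path from 'path';

// Collect one function definition per folder under app/functions.
// Each folder's index.ts is assumed to default-export the properties
// (handler, events, ...) it previously kept in its .yml file.
const functionsDir = path.join(__dirname, 'app', 'functions');
const functions: AWS['functions'] = Object.fromEntries(
  fs.readdirSync(functionsDir).map((name) => [
    name,
    require(path.join(functionsDir, name)).default,
  ]),
);

const serverlessConfiguration: AWS = {
  service: 'my-service',                            // placeholder
  provider: { name: 'aws', runtime: 'python3.8' },  // placeholder runtime
  functions,
};

module.exports = serverlessConfiguration;
If you would rather keep the per-function .yml files, the same loop could read them with a YAML parser (for example js-yaml) instead of requiring TS modules.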

Need to export dataset from Collibra

I have Collibra Data Governance set up in my company and I am trying to download a dataset from it using the API. I am not able to find any tutorial/link that explains how to use the API to download files from Collibra with a script. Everything I find shows how to import into Collibra, not how to export.
Any guidance/link will be highly appreciated.
In the REST Core APIs, the Output Module can be used to export data as JSON, CSV, Excel, or XML. Please refer to https://developer.collibra.com/rest/rest-core-api/
Also follow the Hitchhiker's Guide to the Output Module documentation for more info on how to create a TableViewConfig/ViewConfig: https://developer.collibra.com/rest/output-module/
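For example, here is a minimal sketch (Node 18+ / TypeScript, using the built-in fetch) that posts a ViewConfig to the Output Module and prints the CSV it returns. The endpoint path and the ViewConfig shape are assumptions based on the docs linked above and should be checked against your Collibra version; the host, credentials and field names are placeholders:
// Sketch: export a CSV via the Collibra Output Module REST API.
// Endpoint path and ViewConfig structure are assumptions - verify
// against your Collibra version and the Output Module docs.
const baseUrl = 'https://your-instance.collibra.com';        // placeholder host
const auth = Buffer.from('username:password').toString('base64');

const viewConfig = {
  TableViewConfig: {
    Resources: { Asset: { name: 'assets', Signifier: { name: 'assetName' } } },
    Columns: [{ Column: { fieldName: 'assetName' } }],
  },
};

async function exportCsv(): Promise<void> {
  const response = await fetch(`${baseUrl}/rest/2.0/outputModule/export/csv`, {
    method: 'POST',
    headers: {
      Authorization: `Basic ${auth}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(viewConfig),
  });
  if (!response.ok) throw new Error(`Export failed: ${response.status}`);
  console.log(await response.text()); // the CSV payload
}

exportCsv().catch(console.error);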

Reading and writing Hippo content

What is the best way to read and write Hippo content programmatically? I want to build a migration tool that writes some pages and binary files to Hippo. I am currently using the JCR API to create nodes in the repository; is there a better approach?
Have you tried:
http://import-tool.forge.onehippo.org/
(you can check out the source code and use it as a reference if needed)
Another one you could check is:
https://forge.onehippo.org/svn/restimporter/
(no documentation other than:
https://forge.onehippo.org/svn/restimporter/trunk/README.txt
)
Hope this helps.

WSO2 Gadget Gen Tool

I have an external Hadoop cluster (CDH4) with Hive. I used the Gadget Gen tool (BAM 2.3.0) to create a simple table gadget, but no data is populated when I add the gadget to a dashboard using the URL supplied by the Gadget Gen tool.
Here are my data source settings from the Gadget Generator Wizard:
jdbc:hive://x.x.x.x:10000/default
org.apache.hadoop.hive.jdbc.HiveDriver
I added the following jar files to make sure I had everything required for the JDBC connection and restarted wso2server:
hive-exec-0.10.0-cdh4.2.0.jar hive-jdbc-0.10.0-cdh4.2.0.jar
hive-metastore-0.10.0-cdh4.2.0.jar hive-service-0.10.0-cdh4.2.0.jar
libfb303-0.9.0.jar commons-logging-1.0.4.jar slf4j-api-1.6.4.jar
slf4j-log4j12-1.6.1.jar hadoop-core-2.0.0-mr1-cdh4.2.0.jar
I see map reduce jobs running on my cluster during step 2 and 3 of the wizard (and the wizard shows me previews of the actual data), but I don't see any jobs submitted after the gadget is generated.
Any help appreciated.
The Gadget Gen tool is for RDBMS databases such as MySQL, H2, etc. You can't provide a Hive URL to the Gadget Gen tool and run it.
Generally in WSO2 BAM, Hive is used to summarize the collected data stored in Cassandra and write the summarized final result to an RDBMS database. Then, with the Gadget Gen tool, the gadget XMLs are created by pointing to the RDBMS database holding the final result.
You can find more information on WSO2 BAM 2.3.0 documentation. http://docs.wso2.org/wiki/display/BAM230/Gadget+Generation+Tool
Make sure the URL generated for the location of the gadget XML has the correct IP/host name. Check whether the gadget XML is actually present at the registry location in the generated URL. You do not have to worry about the Hive / Hadoop / Cassandra side, as it is not relevant to the gadget; only the RDBMS (H2 by default) data matters. Hopefully your problem will be resolved once the gadget location is corrected.

Importing OpenERP translations

I wanted to ask if there is a good way to import many translation files into the OpenERP server. I know I can import one file at a time using the administration menu, but that would take a lot of time.
Thanks
Hello, I understand your concern that this takes a lot of time and hangs the UI.
If you are a technical person, then look at the server parameters for internationalization.
OpenERP internationalisation options:
Use these options to translate OpenERP to another language. See the i18n section of the user manual. Option '-d' is mandatory. Option '-l' is mandatory in case of import.
--load-language=LOAD_LANGUAGE
specifies the languages for the translations you want
to be loaded
-l LANGUAGE, --language=LANGUAGE
specify the language of the translation file. Use it
with --i18n-export or --i18n-import
--i18n-export=TRANSLATE_OUT
export all sentences to be translated to a CSV file, a
PO file or a TGZ archive and exit
--i18n-import=TRANSLATE_IN
import a CSV or a PO file with translations and exit.
The '-l' option is required.
--i18n-overwrite overwrites existing translation terms on updating a
module or importing a CSV or a PO file.
--modules=TRANSLATE_MODULES
specify modules to export. Use in combination with
--i18n-export
Hope this fixes your issue.