Add new custom option value using web service in Magento API

I'm integrating Adempiere with Magento to synchronize products. I want to create custom options, but I'm getting this error:
XML-RPC Error: SQLSTATE[23000]: Integrity constraint violation: 1048 Column 'price_type' cannot be null, query was: INSERT INTO `catalog_product_option_type_price` (`option_type_id`, `store_id`, `price`, `price_type`) VALUES (?, ?, ?, ?)
I have already set the value with Product_DATA.put("price_type", "percent"). Code:
public int CreateCustomOptions(String sessionid, Integer product_id, String Option, String o_value, String o_sku, String price)
{
    HashMap Product_DATA = new HashMap();
    Product_DATA.put("title", o_value);
    Product_DATA.put("price", price);
    Product_DATA.put("price_type", "percent");
    Product_DATA.put("sku", o_sku);
    Vector ARGS = new Vector();
    ARGS.add(sessionid);
    ARGS.add("product_custom_option_value.add");
    ARGS.add(new Object[] {110, Product_DATA}); // 110 is the custom option id
    this.newRequest(remoteHost, "", sessionid, "call", ARGS);
    Object RESULT = this.sendRequest();
    return 0;
}

Try using a (String) cast to force the type, or use String.valueOf:
http://www.tutorialspoint.com/java/java_string_valueof.htm
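For instance, a minimal sketch of forcing the price to a String before adding it to the map (rawPrice here is a hypothetical numeric value coming from Adempiere; Product_DATA is the map from the question):

// Convert the value to a String before it is serialized over XML-RPC,
// so Magento does not receive a null/untyped price or price_type.
double rawPrice = 10.5;                               // hypothetical numeric source
Product_DATA.put("price", String.valueOf(rawPrice));  // forces a String, e.g. "10.5"
Product_DATA.put("price_type", "percent");            // valid values: "fixed" or "percent"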

Related

Not able to upload JSON data to BigQuery tables using C#

I am trying to upload JSON data to a table created under a dataset in BigQuery, but it fails with "Google.GoogleApiException: 'Google.Apis.Requests.RequestError
Not found: Table currency-342912:sampleDataset.currencyTable [404]"
The service account is created with the roles BigQuery Admin/DataEditor/DataOwner/DataViewer.
The roles are also applied to the table.
Below is the snippet
public static void LoadTableGcsJson(string projectId = "currency-342912", string datasetId = "sampleDataset", string tableId = "currencyTable ")
{
    // Read the service account key json file
    string dir = Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.FullName + "\\" + "currency-342912-ae9b22f23a36.json";
    GoogleCredential credential = GoogleCredential.FromFile(dir);
    string toFileName = Directory.GetParent(Directory.GetCurrentDirectory()).Parent.Parent.FullName + "\\" + "sample.json";
    BigQueryClient client = BigQueryClient.Create(projectId, credential);
    var dataset = client.GetDataset(datasetId);
    using (FileStream stream = File.Open(toFileName, FileMode.Open))
    {
        // Create and run job
        BigQueryJob loadJob = client.UploadJson(datasetId, tableId, null, stream); // This throws the error
        loadJob.PollUntilCompleted();
    }
}
Permissions for the table are set using the service account "sampleservicenew" from the screenshot.
Any leads on this are much appreciated.
Your issue might reside in your user credentials. Please follow these steps to check your code:
1. Check that the user you are using to execute your application has access to the table you want to insert data into.
2. Check that your JSON tags match your table columns.
3. Check that your inputs are correct (table name, dataset name).
4. Use a dummy table to perform a quick test of your credentials and data integrity.
These steps will help you identify what could be missing on your side. I performed the following operations to reproduce your case:
I created a table on BigQuery based on the values of your JSON data:
create or replace table `projectid.datasetid.tableid` (
    IsFee BOOL,
    BlockDateTime TIMESTAMP,
    Address STRING,
    BlockHeight INT64,
    Type STRING,
    Value INT64
);
I created a .json file with your test data:
{"IsFee":false,"BlockDateTime":"2018-09-11T00:12:14Z","Address":"tz3UoffC7FG7zfpmvmjUmUeAaHvzdcUvAj6r","BlockHeight":98304,"Type":"OUT","Value":1}
{"IsFee":false,"BlockDateTime":"2018-09-11T00:12:14Z","Address":"tz2KuCcKSyMzs8wRJXzjqoHgojPkSUem8ZBS","BlockHeight":98304,"Type":"IN","Value":18}
Then I built and ran the code below:
using System;
using Google.Cloud.BigQuery.V2;
using Google.Apis.Auth.OAuth2;
using System.IO;

namespace stackoverflow
{
    class Program
    {
        static void Main(string[] args)
        {
            String projectid = "projectid";
            String datasetid = "datasetid";
            String tableid = "tableid";
            String safilepath = "credentials.json";
            var credentials = GoogleCredential.FromFile(safilepath);
            BigQueryClient client = BigQueryClient.Create(projectid, credentials);
            using (FileStream stream = File.Open("data.json", FileMode.Open))
            {
                BigQueryJob loadJob = client.UploadJson(datasetid, tableid, null, stream);
                loadJob.PollUntilCompleted();
            }
        }
    }
}
Output:

Row  IsFee  BlockDateTime            Address                               BlockHeight  Type  Value
1    false  2018-09-11 00:12:14 UTC  tz3UoffC7FG7zfpmvmjUmUeAaHvzdcUvAj6r  98304        OUT   1
2    false  2018-09-11 00:12:14 UTC  tz2KuCcKSyMzs8wRJXzjqoHgojPkSUem8ZBS  98304        IN    18
Note: You can use the code above to run quick tests of your credentials and of the integrity of the data to insert.
I also made use of the following documentation:
Load Credentials from a file
Google.Cloud.BigQuery.V2
Load Json data into a new table

Insert into Kudu table by DataStage

I am writing to enquire about a problem in my process:
I have a Kudu table, and when I try to insert through DataStage (11.5 or 11.7) a new row whose size is bigger than 500 characters, using the Impala JDBC driver, I receive this error:
Fatal Error: The connector failed to execute the statement: INSERT INTO default.tmp_consulta_teste (idconsulta, idcliente, idinstituicao, idunidadeinst, datahoraconsulta, desccpfcnpj, idcentral, idcontrato, idusuario, valorconsulta, descretornoxml, idintegracaosistema, nomeservidor) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?). The reported errors are:
[SQLSTATE HY000] java.sql.SQLException: [Cloudera]ImpalaJDBCDriver Error getting the parameter data type: HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE.
How can I fix it? I need to load that information.
I had a similar problem, where the error I received was:
Servlet.service() for servlet [dispatcherServlet] in context with path [] threw
exception [Request processing failed; nested exception is
org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback;
uncategorized SQLException for SQL [update service set comments =? where service_name
="Zzzzz";]; SQL state [HY000]; error code [500352]; [Simba]
[ImpalaJDBCDriver](500352) Error getting the parameter data type:
HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE; nested exception is
java.sql.SQLException: [Simba][ImpalaJDBCDriver](500352) Error getting the parameter
data type: HIVE_PARAMETER_QUERY_DATA_TYPE_ERR_NON_SUPPORT_DATA_TYPE] with root cause
I referred to the last answer in the link below:
https://community.cloudera.com/t5/Support-Questions/HIVE-PARAMETER-QUERY-DATA-TYPE-ERR-NON-SUPPORT-DATA-TYPE/td-p/48849
I did the following:
1. Ensured the table is a Kudu table.
2. Used jdbcTemplate.batchUpdate instead of jdbcTemplate.query in order to use a PreparedStatement, and called setObject on the PreparedStatement:
jdbcTemplate.batchUpdate(UpdateComment, new BatchPreparedStatementSetter() {
    @Override
    public int getBatchSize() {
        return 1;
    }

    @Override
    public void setValues(PreparedStatement ps, int i) throws SQLException {
        ps.setObject(1, comments);
    }
});
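For reference, a self-contained sketch of the same setObject approach with plain JDBC (the connection URL, host, column subset, and helper method are illustrative assumptions, not from the original post):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class KuduInsertSketch {
    public static void main(String[] args) throws Exception {
        // Illustrative Impala JDBC URL; requires the Cloudera Impala JDBC driver on the classpath.
        String url = "jdbc:impala://impala-host:21050/default";
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO default.tmp_consulta_teste (idconsulta, descretornoxml) VALUES (?, ?)")) {
            // setObject (rather than typed setters) mirrors the fix described above.
            ps.setObject(1, 1);
            ps.setObject(2, buildLongString(600)); // a value longer than 500 characters
            ps.executeUpdate();
        }
    }

    // Hypothetical helper that builds a string longer than the given length.
    private static String buildLongString(int length) {
        StringBuilder sb = new StringBuilder();
        while (sb.length() < length) {
            sb.append("<retorno>data</retorno>");
        }
        return sb.toString();
    }
}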

SQLException when using ActiveJDBC

I use ActiveJDBC with an Oracle 11g DB. When I call saveIt, I get a java.sql.SQLException. When I get an instance or a list of instances, everything is OK.
What am I doing wrong?
Exception in thread "main" org.javalite.activejdbc.DBException: java.sql.SQLException: Invalid argument(s) in call, query: INSERT INTO dept (DEPTNO, DNAME, LOC) VALUES (?, ?, ?), params: 45, sdfa, fdg
at oracle.jdbc.driver.AutoKeyInfo.getNewSql(AutoKeyInfo.java:187)
at oracle.jdbc.driver.PhysicalConnection.prepareStatement(PhysicalConnection.java:5704)
at org.javalite.activejdbc.DB.execInsert(DB.java:598)
at org.javalite.activejdbc.Model.insert(Model.java:2698)
at org.javalite.activejdbc.Model.save(Model.java:2597)
at org.javalite.activejdbc.Model.saveIt(Model.java:2524)
at JavaHomeTask.Dept.addPersistence(Dept.java:72)
at JavaHomeTask.App.addRow(App.java:103)
at JavaHomeTask.App.main(App.java:50)
Caused by: java.sql.SQLException: Invalid argument
... 9 more
And here is my code:
public void addPersistence() throws IOException {
    BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
    Dept d = new Dept();
    String value;
    for (String s : getAttributesNames()) {
        System.out.println("Enter " + s + " and press Enter button:");
        value = reader.readLine();
        d.set(s, value);
    }
    d.saveIt();
}

public List<String> getAttributesNames() {
    return Arrays.asList("DEPTNO", "DNAME", "LOC");
}
The reason for the problem is that ActiveJDBC uses the id column as the primary key of a table to recognize which operation, INSERT or UPDATE, should be used. If a table doesn't have such a column, the programmer should specify the PK manually using the @IdName("nameOfColumn") annotation. You can find more information here.
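A minimal sketch of that fix for the Dept model from the question (assuming DEPTNO is the table's primary key):

import org.javalite.activejdbc.Model;
import org.javalite.activejdbc.annotations.IdName;

// DEPT has no "id" column, so tell ActiveJDBC which column is the
// primary key; saveIt() can then choose between INSERT and UPDATE.
@IdName("DEPTNO")
public class Dept extends Model {
}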

Magento API : XML-RPC Error: Default option value is not defined

I'm integrating Magento with Adempiere.
I can create/update products successfully. The products have some attributes, and I want to update the values in those attributes.
Attribute set Instance
{scope=global, code=manufacturer, attribute_id=64, required=0, type=select}
My Code:
Product_DATA.put("sku",p.getSku());
Product_DATA.put("manufacturer",new Object[]{"Zipped"});
Vector ARGS = new Vector();
ARGS.add(SESSION_KEY);
ARGS.add(new String ("catalog_product.create"));
ARGS.add(new Object[] {p.getType().getType(),4,p.getSku(),Product_DATA});
this.newRequest(remoteHost, "", SESSION_KEY, "call", ARGS);
Object RESULT = (Object) this.sendRequest();
Create Code:
HashMap label_DATA = new HashMap<>();
label_DATA.put("store_id", 0); // store id 0 or 1; note the original expression 0/1 is integer division and always yields 0
label_DATA.put("value", "bpartner");
HashMap Vendor_DATA = new HashMap<>();
Vendor_DATA.put("label", new Object[] {label_DATA});
Vendor_DATA.put("order", 0);
Vendor_DATA.put("is_default", 0);
Vector ARGS1 = new Vector();
ARGS1.add(sessionid);
ARGS1.add("product_attribute.addOption");
ARGS1.add(new Object[] {"manufacturer", new Object[] {Vendor_DATA}});
this.newRequest(remoteHost, "", sessionid, "call", ARGS1);
Object RESULT1 = this.sendRequest();
Error while creating:
XML-RPC Error: Default option value is not defined
I want to create/update this "manufacturer" attribute in the Product window. Can anyone please help resolve this?

In SQL Server 2008 I am able to pass a table-valued parameter to my stored procedure from NHibernate. How to achieve the same in Oracle?

I have created a table as a type in SQL Server 2008.
SQL Server 2008 supports passing a table-valued parameter as an IN parameter to a stored procedure, and this is working fine.
Now I have to implement the same approach in Oracle.
I did it through a PLSQLAssociativeArray, but the limitation of associative arrays is that they are homogeneous (every element must be of the same type), whereas a table-valued parameter in SQL Server 2008 can combine columns of different types.
How can I achieve the same in Oracle?
Following are my type and stored procedure in SQL Server 2008:
CREATE TYPE [dbo].[EmployeeType] AS TABLE(
    [EmployeeID] [int] NULL,
    [EmployeeName] [nvarchar](50) NULL
)
GO
CREATE PROCEDURE [dbo].[TestCustom] @location EmployeeType READONLY
AS
    insert into Employee (EMP_ID, EMP_NAME)
    SELECT EmployeeID, EmployeeName
    FROM @location;
GO
Call from NHibernate
var dt = new DataTable();
dt.Columns.Add("EmployeeID", typeof(int));
dt.Columns.Add("EmployeeName", typeof(string));
dt.Rows.Add(new object[] { 255066, "Nachi11" });
dt.Rows.Add(new object[] { 255067, "Nachi12" });
ISQLQuery final = eventhistorysession.CreateSQLQuery("Call TestCustom @pLocation = :id");
IQuery result = final.SetStructured("id", dt);
IList finalResult = result.List();
In Oracle, the equivalent type and procedure look like this:
CREATE OR REPLACE TYPE employeeType AS OBJECT (employeeId INT, employeeName VARCHAR2(50));
CREATE TYPE ttEmployeeType AS TABLE OF employeeType;
CREATE PROCEDURE testCustom (pLocation ttEmployeeType)
AS
BEGIN
    INSERT
    INTO   employee (emp_id, emp_name)
    SELECT *
    FROM   TABLE(pLocation);
END;
As I understand it, it is not possible to use Oracle object table parameters (see @Quassnoi's answer above for an example) with either NHibernate or ODP.NET. The only collection type supported by ODP.NET is the PLSQLAssociativeArray.
However, one could easily achieve the same result as with SQL Server TVPs using associative arrays. The trick is to define an array for each parameter instead of a single one for the whole table.
I'm posting a complete proof-of-concept solution as I haven't been able to find one.
Oracle Schema
The schema includes a table and a packaged insert procedure. It treats each parameter as a column and assumes each array is at least as long as the first one.
create table test_table
(
    foo number(9),
    bar nvarchar2(64)
);
/
create or replace package test_package as
    type number_array is table of number(9) index by pls_integer;
    type nvarchar2_array is table of nvarchar2(64) index by pls_integer;
    procedure test_proc(p_foo number_array, p_bar nvarchar2_array);
end test_package;
/
create or replace package body test_package as
    procedure test_proc(p_foo number_array, p_bar nvarchar2_array) as
    begin
        forall i in p_foo.first .. p_foo.last
            insert into test_table values (p_foo(i), p_bar(i));
    end;
end test_package;
/
nHibernate Mapping
<sql-query name="test_proc">
begin test_package.test_proc(:foo, :bar); end;
</sql-query>
nHibernate Custom IType
I've borrowed the concept from a great SQL Server related answer and modified the class slightly to work with ODP.NET. As IType is huge, I only show the implemented methods; the rest throws NotImplementedException.
If anyone wants to use this in production code, please be aware that I've not tested this class extensively, even though it does what I immediately need.
public class OracleArrayType<T> : IType
{
    private readonly OracleDbType _dbType;

    public OracleArrayType(OracleDbType dbType)
    {
        _dbType = dbType;
    }

    public SqlType[] SqlTypes(IMapping mapping)
    {
        return new[] { new SqlType(DbType.Object) };
    }

    public bool IsCollectionType
    {
        get { return true; }
    }

    public int GetColumnSpan(IMapping mapping)
    {
        return 1;
    }

    public void NullSafeSet(IDbCommand st, object value, int index, ISessionImplementor session)
    {
        var s = st as OracleCommand;
        var v = value as T[];
        if (s != null && v != null)
        {
            s.Parameters[index].CollectionType = OracleCollectionType.PLSQLAssociativeArray;
            s.Parameters[index].OracleDbType = _dbType;
            s.Parameters[index].Value = value;
            s.Parameters[index].Size = v.Length;
        }
        else
        {
            throw new NotImplementedException();
        }
    }

    // IType boiler-plate implementation follows.
The constructor parameter specifies the base type of the array (i.e. if you are passing an array of strings, pass OracleDbType.NVarchar2). There probably is a way to deduce the DB type from the value type, but I'm not sure yet how to do that.
Extension Method for IQuery
This wraps the type creation:
public static class OracleExtensions
{
    public static IQuery SetArray<T>(this IQuery query, string name, OracleDbType dbType, T[] value)
    {
        return query.SetParameter(name, value, new OracleArrayType<T>(dbType));
    }
}
Usage
To tie all this together, this is how the class is used:
using (var sessionFactory = new Configuration().Configure().BuildSessionFactory())
using (var session = sessionFactory.OpenSession())
{
    session
        .GetNamedQuery("test_proc")
        .SetArray("foo", OracleDbType.Int32, new[] { 11, 21 })
        .SetArray("bar", OracleDbType.NVarchar2, new[] { "bar0", "bar1" })
        .ExecuteUpdate();
}
The result of select * from test_table after running the code:
FOO  BAR
---  ----
11   bar0
21   bar1