Unbalanced journal entry error when creating vendor bills in Odoo 13

Recently I have been trying to create vendor bills in Odoo 13 from code, but apparently every time I run it I get the message "Cannot create unbalanced journal entry. Ids: [177] Differences debit - credit: [3.5]".
3.5 is my unit_price.
My code is below:
payment_move = self.env['account.move'].create({
    'type': 'in_invoice',
    'partner_id': self.client.id,
})
payment_move.invoice_line_ids.create({
    'move_id': payment_move.id,
    'quantity': 1.0,
    'price_unit': 3.5,
    'account_id': self.env['account.account'].search(
        [('user_type_id', '=', self.env.ref('account.data_account_type_expenses').id)],
        limit=1).id,
})
Thanks in advance.
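For reference, a common way around this in Odoo 13 is to create the lines inside the same create() call using One2many (0, 0, vals) commands, so the ORM can build the balancing payable line before the balance check runs. A minimal sketch, untested, reusing the fields from the question:
payment_move = self.env['account.move'].create({
    'type': 'in_invoice',
    'partner_id': self.client.id,
    # (0, 0, vals) creates the line together with the move, so Odoo computes
    # taxes and the balancing payable line before validating the entry
    'invoice_line_ids': [(0, 0, {
        'quantity': 1.0,
        'price_unit': 3.5,
        'account_id': self.env['account.account'].search(
            [('user_type_id', '=', self.env.ref('account.data_account_type_expenses').id)],
            limit=1).id,
    })],
})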

Related

How can I get a message from the Binance API when my order is filled

I created a limit buy order.
If this buy order is filled, opening my long position, I want to create another order immediately.
So basically, I want to get a message from the Binance server when my order is filled.
Is there any function to do so?
I am using WebSocket via the python-binance library, so it would be perfect if that functionality exists in python-binance.
Thank you.
You can do it by checking the order status. The code below will do the trick.
import time

from binance.spot import Spot  # note: binance-connector's client, not python-binance

# API_KEY, API_SECRET and sell_price are placeholders
spot_client = Spot(api_key=API_KEY, api_secret=API_SECRET)

# Post a new sell order
params = {
    'symbol': 'BTCUSDT',
    'side': 'SELL',
    'type': 'LIMIT',
    'timeInForce': 'GTC',
    'quantity': 0.001,
    'price': sell_price,
}
sell_response = spot_client.new_order(**params)
oid = sell_response['orderId']

# Poll the order every 10 seconds until it is filled
while True:
    order_check = spot_client.get_order('BTCUSDT', orderId=oid)
    if order_check['status'] == 'FILLED':
        print("Sell Order Filled")
        break
    time.sleep(10)
Binance does not currently offer push notifications when orders are created, canceled, or filled through the REST API.
You can do it through user data streams. The pointer below may help you:
https://dev.binance.vision/t/new-order-notification/2261
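For completeness, a minimal sketch of the user data stream route with python-binance's ThreadedWebsocketManager (API_KEY/API_SECRET are placeholders; the 'e', 'X' and 'i' fields come from Binance's executionReport payload):
from binance import ThreadedWebsocketManager

def handle_user_message(msg):
    # executionReport events carry the order status in the 'X' field
    if msg.get('e') == 'executionReport' and msg.get('X') == 'FILLED':
        print(f"Order {msg['i']} filled")  # 'i' is the order id

twm = ThreadedWebsocketManager(api_key=API_KEY, api_secret=API_SECRET)
twm.start()
twm.start_user_socket(callback=handle_user_message)
twm.join()  # keep the listener alive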

How to adjust the stock of a product tracked by lots in Odoo via API

OK, so I have been banging my head against this problem for way too long now.
I want to sync the stock levels of a product that is tracked by lots between the webshop and Odoo. For this reason I need to be able to make a stock adjustment of a lot via the API (in this case in Python).
I have found this possible way of doing it:
odoo(
    'stock.move',
    'create',
    [{
        "name": "Webshop stock adjustment",
        "company_id": 1,
        "location_id": 8,  # warehouse
        "location_dest_id": 14,  # virtual location
        "product_id": batch["product_id"][0],
        "product_uom": 1,
        "lot_ids": [batch["id"]],  # I am searching for the id by the lot name beforehand
        "product_uom_qty": 1,
        "quantity_done": 1,
        "state": "done"
    }]
)
This, however, results in two moves! One move has the correct lot, and the other has no lot specified. The latter move is faulty, of course, as the product is tracked by lots. It produces a faulty lot entry whose quantity I can't change by hand, as the field is invalid. Worse, it results in wrong stock levels (the original post linked a screenshot of the problematic bookings).
I have tried to just create a stock.move.line, like so:
odoo(
'stock.move.line',
'create',
[{
"company_id": 1,
"display_name": "Webshop adjustment", # does not appear
"location_id": location_id,
"location_dest_id": location_dest_id,
"product_id": batch["product_id"][0],
"product_uom_id": 1,
"lot_id": batch["id"],
"product_uom_qty": quantity,
"qty_done": quantity,
"state": "done" # has no effect
}]
)
However, that results in a line with no effect (screenshot omitted).
I have also tried to find the stock adjustment wizard, but the only one I found in the code, as opposed to the UI, doesn't have a field for lots.
I'd be happy for any input on how to solve this problem!
Meanwhile I managed to solve this problem reliably. I needed to implement a server-side function for it, rather than mucking around with the external API.
The function expects vals in the format below. It reduces whichever batch needs to go out first.
[{
    'sku': sku,
    'qty': quantity,
}]
@api.model
def reduce_lots(self, vals):
    log(vals)  # log() is a local helper; replace with your own logging
    for product_req in vals:
        product = self.env['product.product'].search(
            [['default_code', '=', product_req['sku']]], limit=1
        )
        if len(product) == 0:
            continue
        # Not used below; _action_assign() applies the removal strategy itself,
        # but this shows which on-hand quants exist, ordered by removal date.
        lots = self.env['stock.quant'].search(
            ['&', ('product_id', '=', product.id), ('on_hand', '=', True)],
            order='removal_date asc'
        )
        move = self.env['stock.move'].create({
            # 'order' is optional in vals; falls back to a generic name
            'name': product_req.get('order', 'Webshop stock adjustment'),
            'location_id': 8,  # Our warehouse
            'location_dest_id': 14,  # Virtual location, customer. To increase stock, swap the two ids.
            'product_id': product.id,
            'product_uom': product.uom_id.id,
            'product_uom_qty': product_req['qty'],
        })
        move._action_confirm()
        move._action_assign()
        product_req['lots'] = []
        for line in move.move_line_ids:
            line.write({'qty_done': line.product_uom_qty})
            product_req['lots'].append({
                '_qty': line.product_uom_qty,
                '_lot_id': line.lot_id.name,
                '_best_before': line.lot_id.removal_date,
            })
        move._action_done()
    return vals
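From the webshop side, the method can then be invoked like the create() calls earlier, assuming reduce_lots was registered on stock.move (the model name, the SKU, and the odoo() RPC helper are assumptions here):
result = odoo('stock.move', 'reduce_lots', [[{'sku': 'SKU-123', 'qty': 2}]])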

How to get the weight of a product via the eBay API

Currently I am working with ebaysdk. The problem I am facing is the weight of the product: how can I get the product weight? I used the Trading API, but the weight of most products equals 0. Is there any way to get every product's weight? I requested like this:
response = api_trading.execute('GetItem',
    {"ItemID": "184097524395", 'DestinationPostalCode': '2940', 'DestinationCountryCode': 'GB'})
Some items have their weight included in the "Item Specifics" section, which can only be extracted by setting "IncludeItemSpecifics" to "true" in the Trading API call:
response = api_trading.execute('GetItem' , {"ItemID":"254350593466", "IncludeItemSpecifics": "true"})
Then, one possible way to get the details of interest is by looping through the list of dictionaries:
for d in response.dict()['Item']['ItemSpecifics']['NameValueList']:
    print(d)
The weight Name and Value will be in one of those dictionaries:
...
{'Name': 'Item Length', 'Value': '1', 'Source': 'ItemSpecific'}
{'Name': 'Item Weight', 'Value': '2.25', 'Source': 'ItemSpecific'}
{'Name': 'Item Width', 'Value': '1', 'Source': 'ItemSpecific'}
...
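To pick out just the weight from that list, something like this should work (a sketch, keyed on the 'Item Weight' name shown above):
specifics = response.dict()['Item']['ItemSpecifics']['NameValueList']
# next() falls back to None when the seller never entered a weight
weight = next((d['Value'] for d in specifics if d['Name'] == 'Item Weight'), None)
print(weight)  # e.g. '2.25'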
Source:
https://developer.ebay.com/devzone/guides/features-guide/default.html#development/ItemSpecifics.html
It seems that if the seller does not indicate a weight in the first place, the API has nothing to show for it.
If shipping is free, most likely the seller did not bother entering the weight of the item. So perhaps look for products whose shipping is not free; the seller will likely have indicated the weight for shipping calculations:
"ShippingDetails": {"ShippingType": "Calculated"}
I also tried GetItem, and it can show the weight of the item as long as a weight is available. I also tried GetItemShipping, which can likewise show the weight if available, but it needs DestinationPostalCode.
Source: https://github.com/timotheus/ebaysdk-python/issues/304
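And a sketch of the GetItemShipping variant mentioned above; the response field names are from memory of the Trading API docs, so verify them against your own output:
response = api_trading.execute('GetItemShipping', {
    'ItemID': '184097524395',
    'DestinationPostalCode': '2940',
    'DestinationCountryCode': 'GB',
})
# With calculated shipping, the package weight is reported under
# ShippingDetails -> CalculatedShippingRate (WeightMajor/WeightMinor)
print(response.dict()['ShippingDetails']['CalculatedShippingRate'])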

pandas to_gbq claims a schema mismatch while the schemas are exactly the same. On GitHub all the issues are claimed to have been solved in 2017

I am trying to append a table to a table in a different dataset through pandas, pulling the data from BigQuery and sending it to another BigQuery dataset. While the table schema is exactly the same, I get the error:
pandas_gbq.gbq.InvalidSchema: Please verify that the structure and data types in the DataFrame match the schema of the destination table.
This error occurred earlier when I went for table overwrites, but in this case the datasets are too large to do that (and it is not a sustainable solution).
df = pd.read_gbq(query, project_id="my-project", credentials=bigquery_key,
                 dialect='standard')
pd.io.gbq.to_gbq(df, dataset, projectid,
                 if_exists='append',
                 table_schema=[{'name': 'Date', 'type': 'STRING'},
                               {'name': 'profileId', 'type': 'STRING'},
                               {'name': 'Opco', 'type': 'STRING'},
                               {'name': 'country', 'type': 'STRING'},
                               {'name': 'deviceType', 'type': 'STRING'},
                               {'name': 'userType', 'type': 'STRING'},
                               {'name': 'users', 'type': 'INTEGER'},
                               {'name': 'sessions', 'type': 'INTEGER'},
                               {'name': 'bounceRate', 'type': 'FLOAT'},
                               {'name': 'sessionsPerUser', 'type': 'FLOAT'},
                               {'name': 'avgSessionDuration', 'type': 'FLOAT'},
                               {'name': 'pageviewsPerSession', 'type': 'FLOAT'}],
                 credentials=bigquery_key)
The schema in BigQuery is as follows:
Date STRING
profileId STRING
Opco STRING
country STRING
deviceType STRING
userType STRING
users INTEGER
sessions INTEGER
bounceRate FLOAT
sessionsPerUser FLOAT
avgSessionDuration FLOAT
pageviewsPerSession FLOAT
I then get the following error:
Traceback (most recent call last):
  File "..file.py", line 63, in <module>
    main()
  File "..file.py", line 57, in main
    updating_general_data(bigquery_key)
  File "..file.py", line 46, in updating_general_data
    credentials=bigquery_key)
  File "..\AppData\Local\Programs\Python\Python37-32\lib\site-packages\pandas\io\gbq.py", line 162, in to_gbq
    credentials=credentials, verbose=verbose, private_key=private_key)
  File "..\AppData\Local\Programs\Python\Python37-32\lib\site-packages\pandas_gbq\gbq.py", line 1141, in to_gbq
    "Please verify that the structure and "
pandas_gbq.gbq.InvalidSchema: Please verify that the structure and data types in the DataFrame match the schema of the destination table.
To me it seems there is a one-to-one match. I've seen other threads about this, but they mainly discuss date formats, even though the date is already a string in this case and is declared as STRING in table_schema as well.
The ultimate solution to this: instead of manually specifying the schema, which is always prone to type-casting/naming errors, it is best to get the schema from the destination table itself. So create a client using the latest version of the API:
from google.cloud import bigquery
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    'credentials.json')
project_id = 'your_project_id'
client = bigquery.Client(credentials=credentials, project=project_id)
Get the table you want to write/append to:
table = client.get_table('your_dataset.your_table')
table
Generate schema from the table:
generated_schema = [{'name':i.name, 'type':i.field_type} for i in table.schema]
generated_schema
Rename your dataframe accordingly:
data.columns = [i.name for i in table.schema]
Pass the same schema while pushing it to BigQuery:
data.to_gbq(project_id='your_project_id',
            destination_table='your_dataset.your_table',
            credentials=service_account.Credentials.from_service_account_file(
                'credentials.json'),
            table_schema=generated_schema,
            progress_bar=True,
            if_exists='replace')
I had real trouble with this, and fixed it by letting pandas-gbq create the table rather than creating it in the UI and trying to match the schema.
I had a column in my dataframe called "No." Deleting the period got rid of the issue and the schema was inferred.
Most likely the problem arises because the column names in the DataFrame and the schema do not match.
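If you want to see exactly which names differ before pushing, a quick diff helps (a sketch reusing the client from the answer above; df is your DataFrame):
bq_names = [field.name for field in client.get_table('your_dataset.your_table').schema]
print(set(df.columns) - set(bq_names))  # columns only in the DataFrame
print(set(bq_names) - set(df.columns))  # columns only in BigQuery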

Add an extra column on the tax report of Odoo 10

I need an extra column on the Odoo tax report. Currently there are two columns, Net and Tax. I need to add a column named Gross. This view isn't like other QWeb views, and I'm not getting it (screenshot omitted).
The module is account_reports, and the file generating the report is, I think, account_generic_tax_report.
It would be great if anyone could suggest what to do.
Regards.
You can append new column names to the columns_header list in the function get_column_names, in addons/account_reports/models/account_generic_tax_report.py:
columns_header = [{},
                  {'name': '%s \n %s' % (_('NET'), self.format_date(dt_to, dt_from, options)),
                   'class': 'number', 'style': 'white-space: pre;'},
                  {'name': _('TAX'), 'class': 'number'},
                  {'name': _('New Field'), 'class': 'number'}]
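Note that every report line must then also get a matching third value; wherever the same file builds a line's 'columns' list with the net and tax amounts, append the gross as well. A rough sketch only (the variable names here are assumptions, not the actual code):
'columns': [
    {'name': self.format_value(net)},
    {'name': self.format_value(tax_amount)},
    {'name': self.format_value(net + tax_amount)},  # the new GROSS column
],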