I am using django-piston and curl to POST a file to specific phone numbers in my database, but I'm having problems uploading the file.
This is the POST I make with curl and the crash report I get back:
C:\curl>curl -F "phone_number=03219455375" -F "file=@C:/file.txt" http://localhost:8000/api/uploadfile.json
Piston/0.2.2 (Django 1.2.4) crash report:
Method signature does not match.
Resource does not expect any parameters.
Exception was: 'InMemoryUploadedFile' object is not subscriptable
Handler.py:
if request.POST:
    phone_number = request.POST['phone_number']
    file_name = request.FILES['file']
    if phone_number == "":
        return rc.BAD_REQUEST
    else:
        upload2folder = os.path.join(UPLOAD_ROOT, phone_number)
        if os.path.exists(upload2folder):
            print "Hello"
            open(os.path.join(upload2folder, file_name), 'wb').write(file_name.file.read())
        else:
            os.mkdir(upload2folder)
            #open(os.path.join(upload2folder, file_name), 'wb').write(file_name.file.read())
    return rc.CREATED
else:
    return rc.BAD_REQUEST
Please help!
OK, I found this helpful: http://groups.google.com/group/django-piston/browse_thread/thread/6f3f964b8b3ccf72/bd1658121bb1874c?show_docid=bd1658121bb1874c
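For reference, here is a rough sketch of how the handler could read the upload via Django's UploadedFile API. The handler name, the *args/**kwargs signature, and the use of uploaded.name / uploaded.chunks() are my assumptions rather than something taken from the linked thread; UPLOAD_ROOT is the setting from the code above:
import os
from piston.handler import BaseHandler
from piston.utils import rc

class UploadFileHandler(BaseHandler):  # hypothetical handler name
    allowed_methods = ('POST',)

    def create(self, request, *args, **kwargs):
        phone_number = request.POST.get('phone_number', '')
        uploaded = request.FILES.get('file')
        if not phone_number or uploaded is None:
            return rc.BAD_REQUEST

        upload2folder = os.path.join(UPLOAD_ROOT, phone_number)
        if not os.path.exists(upload2folder):
            os.mkdir(upload2folder)

        # uploaded.name is the original filename; chunks() streams the content
        dest = open(os.path.join(upload2folder, uploaded.name), 'wb')
        for chunk in uploaded.chunks():
            dest.write(chunk)
        dest.close()
        return rc.CREATED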
I have an API which prints the success or error message on the console. I am new to Python and trying to read that response. Google turns up many examples using subprocess, but I don't want to run or call any command or subprocess. I just want to read the output after the API call below.
This is the response in console when success
17:50:52 | Logged in!!
This is the github link for the sdk and documentation
https://github.com/5paisa/py5paisa
This is the code
from py5paisa import FivePaisaClient

email = "myemailid@gmail.com"
pw = "mypassword"
dob = "mydateofbirth"
cred = {
    "APP_NAME": "app-name",
    "APP_SOURCE": "app-src",
    "USER_ID": "user-id",
    "PASSWORD": "pw",
    "USER_KEY": "user-key",
    "ENCRYPTION_KEY": "enc-key"
}

client = FivePaisaClient(email=email, passwd=pw, dob=dob, cred=cred)
client.login()
In general it is bad practice to read a value back from STDOUT. There are ways to do it, but it's tricky (STDOUT isn't made for that). The problem doesn't come from you but from the API, which is poorly designed: it should return a value, e.g. True or False (at least), to tell you whether you are logged in, and it doesn't.
So, according to their documentation it is not possible to know if you're logged in, but you may be able to tell by checking the client_code attribute on the client object.
If client.client_code is equal to something then you should be logged in, and if it is equal to something else then not. You can compare its value after a successful login and after a failed one (wrong credentials, for instance). Then you can put a condition on it: if it is None or False or 0 (you will have to check this yourself), the login failed.
Can you try doing the following with a successful and failed login:
client.login()
print(client.client_code)
Source of the API:
# Login function:
# (...)
message = res["body"]["Message"]
if message == "":
    log_response("Logged in!!")
else:
    log_response(message)
self._set_client_code(res["body"]["ClientCode"])
# (...)

# _set_client_code function:
def _set_client_code(self, client_code):
    try:
        self.client_code = client_code  # <<<< That's what we want
    except Exception as e:
        log_response(e)
Since this question asks how to capture "stdout", one way to accomplish it is to intercept the log message before it hits stdout.
The minimum code to capture a log message within a Python script looks like this:
#!/usr/bin/env python3
import logging

logger = logging.getLogger(__name__)

class RequestHandler(logging.Handler):
    def emit(self, record):
        if record.getMessage().startswith("Hello"):
            print("hello detected")

handler = RequestHandler()
logger.addHandler(handler)
logger.warning("Hello world")
Putting it all together you may be able to do something like this:
import logging

from py5paisa import FivePaisaClient

email = "myemailid@gmail.com"
pw = "mypassword"
dob = "mydateofbirth"
cred = {
    "APP_NAME": "app-name",
    "APP_SOURCE": "app-src",
    "USER_ID": "user-id",
    "PASSWORD": "pw",
    "USER_KEY": "user-key",
    "ENCRYPTION_KEY": "enc-key"
}

client = FivePaisaClient(email=email, passwd=pw, dob=dob, cred=cred)

class PaisaClient(logging.Handler):
    def __init__(self):
        super().__init__()
        self.loggedin = False  # this is the variable we can use to see if we are "logged in"

    def emit(self, record):
        if record.getMessage().startswith("Logged in!!"):
            self.loggedin = True

    def login(self):
        client.login()

# install our handler on the root logger so the py5paisa log records reach it
# tutorial here: https://betterstack.com/community/questions/how-to-disable-logging-from-python-request-library/
c = PaisaClient()
logging.basicConfig(handlers=[c], level=0, force=True)
c.login()
I am trying to upload 2 files (an audio and an image file) along with some data. I am very new to Flask, but after reviewing other people's FileStorage issues, I am not sure what I am doing wrong.
class FiguresResource(Resource):
    parser = reqparse.RequestParser()
    parser.add_argument(
        'thing',
        type=str)
    parser.add_argument(
        'image_file',
        type=werkzeug.datastructures.FileStorage,
        location=UPLOAD_FOLDER)
    parser.add_argument(
        'audio_file',
        type=werkzeug.datastructures.FileStorage,
        location=UPLOAD_FOLDER)

    def post(self):
        db = connect(MONGODB_DB, host=MONGODB_HOST, port=MONGODB_PORT)
        data = self.parser.parse_args()
        image = data['image_file']
        audio = data['audio_file']
        fig = Figure(
            data['thing'],
            image.filename,
            get_file_size(image),
            audio.filename,
            get_file_size(audio)
        )
        image.save(image.filename)
        audio.save(audio.filename)
        fig.save()
        db.close()
When I try to send the data I get a 500 'Internal Server Error' from my requesting client. The Flask-RESTful server throws:
File "/home/joe/Projects/PyKapi-venv/kapi/resources/figure_resource.py", line 53, in post
image.filename,
AttributeError: 'NoneType' object has no attribute 'filename'
127.0.0.1 - - [20/May/2018 17:12:25] "POST /figures HTTP/1.1" 500 -
I thought the issue was in my HTTP request, but I'm not so sure now. I was initially sending my request with Postman, but have recently switched to curl. This is my curl command:
curl -F thing=Fruits -F image_file=@/home/joe/Projects/Pics/Fruits.jpg -F audio_file=@/home/joe/Projects/Audio/Fruits http://127.0.0.1:5000/figures
[UPDATE] I was using the location argument of the RequestParser wrong.
The location argument expects the part of the request to pull the value from (e.g. 'form' or 'files'), not a filesystem path. So I changed it as follows:
class FiguresResource(Resource):
    parser = reqparse.RequestParser()
    parser.add_argument(
        'thing',
        type=str,
        location='form')
    parser.add_argument(
        'image_file',
        type=FileStorage,
        location='files')
    parser.add_argument(
        'audio_file',
        type=FileStorage,
        location='files')

    def post(self):
        data = self.parser.parse_args()
        image = data['image_file']
        audio = data['audio_file']
        image_path = join(IMAGE_FOLDER, image.filename)
        audio_path = join(AUDIO_FOLDER, audio.filename)
        db = connect(MONGODB_DB, host=MONGODB_HOST, port=MONGODB_PORT)
        if Figure.objects(visual_aid_path=image_path):
            db.close()
            return {"message": "Visual Aid file with that name already exists"}
        if Figure.objects(audio_aid_path=audio_path):
            db.close()
            return {"message": "Audio file with that name already exists"}
        fig = Figure(
            data['thing'],
            image_path,
            audio_path
        )
        image.save(image_path)
        audio.save(audio_path)
        image.close()
        audio.close()
        fig.save()
        db.close()
I have a question about using apollo-upload-client with graphene-django. Here I've discovered that apollo-upload-client adds the operations to formData, but here graphene-django only tries to get the query parameter. The question is: where and how should this be fixed?
If you're referring to the data that has a header like (when viewing the HTTP from Chrome tools):
Content-Disposition: form-data; name="operations"
and data like
{"operationName":"MyMutation","variables":{"myData"....}, "query":"mutation MyMutation"...},
the graphene-python library interprets this and assembles it into a query for you, inserting the variables and removing the file data from the query. If you are using Django, you can find all of the uploaded files in info.context.FILES when writing a mutation.
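For illustration (my addition, not part of the answer above), here is a rough sketch of a graphene-django mutation that reads an uploaded file from info.context.FILES; the mutation name, its argument, and the '0' file key are made up for the example:
import graphene

class UploadAvatar(graphene.Mutation):
    # Hypothetical mutation: the file itself arrives in the multipart request,
    # not as a GraphQL argument, so we read it from the Django request object.
    class Arguments:
        description = graphene.String(required=False)

    ok = graphene.Boolean()

    def mutate(root, info, description=None):
        # apollo-upload-client puts the files into the multipart body; in
        # graphene-django they show up on the Django request (info.context)
        uploaded = info.context.FILES.get('0')  # key as assigned by the client's "map"
        if uploaded is None:
            return UploadAvatar(ok=False)
        # do something with uploaded (a Django UploadedFile)
        return UploadAvatar(ok=True)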
Here's my solution to support the latest apollo-upload-client (8.1). I recently had to revisit my Django code when I upgraded from apollo-upload-client 5.x to 8.x. Hope this helps.
Sorry I'm using an older graphene-django but hopefully you can update the mutation syntax to the latest.
Upload scalar type (passthrough, basically):
class Upload(Scalar):
    '''A file upload'''

    @staticmethod
    def serialize(value):
        raise Exception('File upload cannot be serialized')

    @staticmethod
    def parse_literal(node):
        raise Exception('No such thing as a file upload literal')

    @staticmethod
    def parse_value(value):
        return value
My upload mutation:
class UploadImage(relay.ClientIDMutation):
    class Input:
        image = graphene.Field(Upload, required=True)

    success = graphene.Field(graphene.Boolean)

    @classmethod
    def mutate_and_get_payload(cls, input, context, info):
        with NamedTemporaryFile(delete=False) as tmp:
            for chunk in input['image'].chunks():
                tmp.write(chunk)
            image_file = tmp.name
        # do something with image_file
        return UploadImage(success=True)
The heavy lifting happens in a custom GraphQL view. Basically it injects the file object into the appropriate places in the variables map.
def maybe_int(s):
    try:
        return int(s)
    except ValueError:
        return s


class CustomGraphqlView(GraphQLView):
    def parse_request_json(self, json_string):
        try:
            request_json = json.loads(json_string)
            if self.batch:
                assert isinstance(request_json, list), (
                    'Batch requests should receive a list, but received {}.'
                ).format(repr(request_json))
                assert len(request_json) > 0, 'Received an empty list in the batch request.'
            else:
                assert isinstance(request_json, dict), 'The received data is not a valid JSON query.'
            return request_json
        except AssertionError as e:
            raise HttpError(HttpResponseBadRequest(str(e)))
        except BaseException:
            logger.exception('Invalid JSON')
            raise HttpError(HttpResponseBadRequest('POST body sent invalid JSON.'))

    def parse_body(self, request):
        content_type = self.get_content_type(request)
        if content_type == 'application/graphql':
            return {'query': request.body.decode()}
        elif content_type == 'application/json':
            return self.parse_request_json(request.body.decode('utf-8'))
        elif content_type in ['application/x-www-form-urlencoded', 'multipart/form-data']:
            operations_json = request.POST.get('operations')
            map_json = request.POST.get('map')
            if operations_json and map_json:
                operations = self.parse_request_json(operations_json)
                map = self.parse_request_json(map_json)
                for file_id, f in request.FILES.items():
                    for name in map[file_id]:
                        segments = [maybe_int(s) for s in name.split('.')]
                        cur = operations
                        while len(segments) > 1:
                            cur = cur[segments.pop(0)]
                        cur[segments.pop(0)] = f
                logger.info('parse_body %s', operations)
                return operations
            else:
                return request.POST
        return {}
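As a usage note (my addition, not from the original answer), the custom view would replace the stock GraphQLView in the URL configuration, roughly like this; the import path and the graphiql flag are assumptions:
# urls.py -- adjust the import path to wherever CustomGraphqlView lives
from django.urls import path
from myproject.views import CustomGraphqlView  # hypothetical module path

urlpatterns = [
    path('graphql', CustomGraphqlView.as_view(graphiql=True)),
]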
I have made a web2py web application. The API endpoints exposed are as follows:
"/comments[comments]"
"/comments/id/{comments.id}"
"/comments/id/{comments.id}/:field"
"/comments/user-id/{comments.user_id}"
"/comments/user-id/{comments.user_id}/:field"
"/comments/date-commented/{comments.date_commented.year}"
"/comments/date-commented/{comments.date_commented.year}/:field"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/:field"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/:field"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}/:field"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}/{comments.date_commented.minute}"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}/{comments.date_commented.minute}/:field"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}/{comments.date_commented.minute}/{comments.date_commented.second}"
"/comments/date-commented/{comments.date_commented.year}/{comments.date_commented.month}/{comments.date_commented.day}/{comments.date_commented.hour}/{comments.date_commented.minute}/{comments.date_commented.second}/:field"
"/comments/complaint-id/{comments.complaint_id}"
"/comments/complaint-id/{comments.complaint_id}/:field"
The comments model is as follows
models/db.py
db.define_table(
    'comments',
    Field('user_id', db.auth_user),
    Field('comment_made', 'string', length=2048),
    Field('date_commented', 'datetime', default=datetime.now),
    Field('complaint_id', db.complaints),
    Field('detailed_status', 'string', length=2048),
)
I have been successful in retrieving a single comment via the following request:
localhost:8000/api/comments/id/1.json
Now I wish to retrieve all the comments. I am not able to figure out how to use /comments[comments] to retrieve all comments.
I have tried
localhost:8000/api/comments.json
But it gives "invalid path" as output.
I have realized that requests such as http://localhost:8000/api/comments/complaint-id/1.json also give "invalid path" as output.
Please help.
EDIT:
Controllers/default.py
@request.restful()
def api():
    response.view = 'generic.' + request.extension

    def GET(*args, **kargs):
        patterns = 'auto'
        parser = db.parse_as_rest(patterns, args, kargs)
        if parser.status == 200:
            return dict(content=parser.response)
        else:
            raise HTTP(parser.status, parser.error)

    def POST(*args, **kargs):
        return dict()

    return locals()
routes.py in the main web2py folder to change the default application:
routers = dict(
    BASE = dict(
        default_application='GRS',
    )
)
Another observation:
I added another endpoint as below:
def comm():
    """Comments api substitute"""
    rows = db().select(db.comments.ALL)  ## this line shows error
    # rows = db(db.comments.id > 0).select()
    # rows = [[5, 6], [3, 4], [1, 2]]
    # for row in rows:
    #     print row.id
    return dict(message=rows)
Even now I am not able to retrieve all comments with "/comm.json". This gives a web2py error ticket which says "need more than 1 value to unpack" on the line "rows=db().select(db.comments.ALL)". Are the above "invalid path" responses and this error related in some way?
I have a small web server written using Twisted. One of the things I want to do is have it return a result from another web server as the response to loading a page. That is, the response to render_GET() at server A (via http://A.com/resource) should be the content of a different URL at server B (via http://B.com/resource2). The content returned by server B is dynamic, so I can't just cache it.
Right now, server A can render pages just fine; it just can't render this remote resource. I've tried with Agent(), but I can't seem to get the response from B, let alone forward it to A. I know that somewhere I have to take the request from render_GET and later write() and finish() it. That's done in the cbBody callback, which gets called but can't get to the original request to populate it.
Here's a piece of the code for server A's resource handler:
def render_GET(self, request):
    # try with canned content just to test the whole thing
    bmpServer = BMPServer(ServerBURL,
                          "xyzzy",
                          "plugh")
    d = bmpServer.postNotification({"a": 123}, request)
    print "Deferred", d
    return NOT_DONE_YET
And this is the other code at server A:
theRequest = None

def cbRequest(response, args):
    print "response called"
    print response
    print args
    print 'Response version:', response.version
    print 'Response code:', response.code
    print 'Response phrase:', response.phrase
    print 'Response headers:'
    print pformat(list(response.headers.getAllRawHeaders()))
    d = readBody(response)
    d.addCallback(cbBody)
    return d

def cbBody(body):
    print "Response body:"
    print body
    theRequest.write(body)
    theRequest.finish()
    theRequest = None

def cbError(failure):
    print type(failure.value), failure  # catch error here
    print failure.value.reasons[0].printTraceback()

class BMPServer(object):
    def __init__(self, url, arg1, arg2):
        self.url = url
        self.arg1 = arg1
        self.arg2 = arg2

    def postNotification(self, message, request):
        theRequest = request
        bmpMessage = {'arg1': self.token,
                      'arg2': self.appID,
                      'message': message}
        print "Sending request to %s" % self.url
        print "Create agent"
        agent = Agent(reactor)
        print "create request deferred"
        print "url = %s" % self.url
        d = agent.request('POST', self.url,
                          Headers({'User-Agent': ['Twisted Web Client Example']}),
                          MessageProducer(bmpMessage))
        print "adding callback"
        d.addCallbacks(cbRequest, cbError)
        print "returning deferred"
        return d
When I run this as standalone code (outside of the resource, using react() for example), it works fine. However, when I try to include it as shown above, it just never seems to receive the data. I've got Wireshark running, so I can see the response being returned from server B, but the data never shows up in cbRequest().
For example, here's the output I see:
Sending request to http://localhost:8888/postMGCMNotificationService
Create agent
create request deferred
url = http://serverB:8888/postService
Message producer: body = {"arg2": "plugh", "arg1": "xyzzy", "message": {"a": 1}}
adding callback
returning deferred
testAgent: returning deferred
<Deferred at 0x10b54d290>
Writing body now
response called
<twisted.web._newclient.Response object at 0x1080753d0>
Response version: ('HTTP', 1, 1)
Response code: 200
Response phrase: OK
Response headers:
Response body:
{"result": false}
^CUnhandled error in Deferred:
Unhandled Error
Traceback (most recent call last):
File "/Library/Python/2.7/site-packages/Twisted-13.1.0_r39314-py2.7-macosx-10.8-intel.egg/twisted/web/_newclient.py", line 1151, in _bodyDataFinished_CONNECTED
self._bodyProtocol.connectionLost(reason)
File "/Library/Python/2.7/site-packages/Twisted-13.1.0_r39314-py2.7-macosx-10.8-intel.egg/twisted/web/client.py", line 1793, in connectionLost
self.deferred.callback(b''.join(self.dataBuffer))
File "/Library/Python/2.7/site-packages/Twisted-13.1.0_r39314-py2.7-macosx-10.8-intel.egg/twisted/internet/defer.py", line 382, in callback
self._startRunCallbacks(result)
File "/Library/Python/2.7/site-packages/Twisted-13.1.0_r39314-py2.7-macosx-10.8-intel.egg/twisted/internet/defer.py", line 490, in _startRunCallbacks
self._runCallbacks()
--- <exception caught here> ---
File "/Library/Python/2.7/site-packages/Twisted-13.1.0_r39314-py2.7-macosx-10.8-intel.egg/twisted/internet/defer.py", line 577, in _runCallbacks
current.result = callback(current.result, *args, **kw)
File "AServer.py", line 85, in cbBody
print theRequest
exceptions.UnboundLocalError: local variable 'theRequest' referenced before assignment
Looking at this a little more, it seems that if I could figure out a way to get the request over to cbBody() this would all work just fine.
You can pass extra arguments to callbacks on a Deferred:
d.addCallback(f, x)
When d fires, the result is f(result of d, x). You can pass as many positional or keyword arguments as you like this way. See the API documentation for Deferred for more details.
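Applied to the code in the question, that might look roughly like this. It is only a sketch reusing the question's own names (BMPServer, MessageProducer, cbError); the idea is to pass the original render_GET request along the callback chain instead of relying on the theRequest global:
from twisted.internet import reactor
from twisted.web.client import Agent, readBody
from twisted.web.http_headers import Headers

def cbBody(body, request):
    # body is the result of readBody(); request is the extra argument passed along
    request.write(body)
    request.finish()

def cbRequest(response, request):
    d = readBody(response)
    d.addCallback(cbBody, request)  # forward the original request to cbBody
    return d

class BMPServer(object):
    def __init__(self, url, arg1, arg2):
        self.url = url
        self.arg1 = arg1
        self.arg2 = arg2

    def postNotification(self, message, request):
        bmpMessage = {'arg1': self.arg1,
                      'arg2': self.arg2,
                      'message': message}
        agent = Agent(reactor)
        d = agent.request('POST', self.url,
                          Headers({'User-Agent': ['Twisted Web Client Example']}),
                          MessageProducer(bmpMessage))
        # callbackArgs hands the original request through to cbRequest
        d.addCallbacks(cbRequest, cbError, callbackArgs=(request,))
        return d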