I'm trying to send a group of files to a remote server through no-Ack's Python bindings for libssh2, but I am totally lost regarding the library's usage due to the lack of documentation.
I've tried using the C docs for libssh2, unsuccessfully.
Since I'm using Python 3.2, paramiko and pexpect are out of the question.
Can anyone help?
EDIT: I just found some code in the comments on no-Ack's blog post.
import libssh2, socket, os
SERVER = 'someserver'
username = 'someuser'
password = 'secret!'
sourceFilePath = 'source/file/path'
destinationFilePath = 'dest/file/path'
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((SERVER, 22))
session = libssh2.Session()
session.startup(sock)
session.userauth_password(username, password)
sourceFile = open(sourceFilePath, 'rb')
channel = session.scp_send(destinationFilePath, 0o644, os.stat(sourceFilePath).st_size)
while True:
    data = sourceFile.read(4096)
    if not data:
        break
    channel.write(data)
exitStatus = channel.exit_status()
channel.close()
Seems to work fine.
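Since the original question is about a group of files, here is a rough sketch of looping the same scp_send logic over several files (the file-pair list is just a placeholder):

filePairs = [
    ('source/file/one', 'dest/file/one'),
    ('source/file/two', 'dest/file/two'),
]

for sourceFilePath, destinationFilePath in filePairs:
    with open(sourceFilePath, 'rb') as sourceFile:
        # one SCP channel per file, exactly as in the single-file example above
        channel = session.scp_send(destinationFilePath, 0o644, os.stat(sourceFilePath).st_size)
        while True:
            data = sourceFile.read(4096)
            if not data:
                break
            channel.write(data)
        channel.close()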
And here's how to get files with libssh2 in Python 3.2. Major kudos to no-Ack for showing me this. You'll need the Python 3 bindings for libssh2: https://github.com/wallunit/ssh4py
import libssh2, socket, os
SERVER = 'someserver'
username = 'someuser'
password = 'secret!'
sourceFilePath = 'source/file/path'
destinationFilePath = 'dest/file/path'
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect((SERVER, 22))
session = libssh2.Session()
session.startup(sock)
session.userauth_password(username, password)
(channel, (st_size, _, _, _)) = session.scp_recv(sourceFilePath, True)
destination = open(destinationFilePath, 'wb')
got = 0
while got < st_size:
    data = channel.read(min(st_size - got, 1024))
    got += len(data)
    destination.write(data)
exitStatus = channel.get_exit_status()
channel.close()
To do this in Python (i.e. without wrapping scp through subprocess.Popen or similar), you can use the Paramiko library.
Relevant: https://stackoverflow.com/a/69596/1270589
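A minimal sketch of that Paramiko/SFTP approach (the host, credentials, and paths are placeholders, not values from the question):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect('someserver', username='someuser', password='secret!')

sftp = client.open_sftp()
sftp.put('source/file/path', 'dest/file/path')  # upload; use sftp.get() for downloads
sftp.close()
client.close()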
The approach below is easy, but it is not universal: it works if you run it on Linux and doesn't work on Windows. Tell me if you know how to make it work across all OS platforms.
import os
os.system("sshpass -p 'your password' scp /opt/pysftp_server.txt root@172.19.113.87:/home")
I've got some code which uses StringSession to talk to the Telegram API using telethon.
In my unit tests, I'm trying to instantiate a mocked TelegramClient, passing it a StringSession(myvalue) object as the first parameter. The real code works fine, but I need a fake session string for 'myvalue', to use in my unit tests (where I have a mocked telegram client).
How can I create a dummy value for 'myvalue' which will successfully execute StringSession(myvalue)?
Currently, my tests are dying here:
self = <telethon.sessions.string.StringSession object at 0x7f0777492ad0>
string = 'dummyxxx'
    def __init__(self, string: str = None):
        super().__init__()
        if string:
            if string[0] != CURRENT_VERSION:
                raise ValueError('Not a valid string')
            string = string[1:]
            ip_len = 4 if len(string) == 352 else 16
>           self._dc_id, ip, self._port, key = struct.unpack(
                _STRUCT_PREFORMAT.format(ip_len), StringSession.decode(string))
E           struct.error: unpack requires a buffer of 275 bytes
If you don't need a valid session to start with, you can also use MemorySession instead:
from telethon.sessions import MemorySession
session = MemorySession()
# use session variable when creating the client
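For example (a rough sketch; the api_id and api_hash values are placeholders, not from the question):

from telethon import TelegramClient
from telethon.sessions import MemorySession

# Nothing is persisted to disk, which is usually fine for tests that never actually connect.
client = TelegramClient(MemorySession(), 12345, '0123456789abcdef0123456789abcdef')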
Someone posted an answer which helped point me in the right direction, but they later deleted it for some reason.
In case it helps anyone else, here is the code that worked for me:
import struct
import base64
from telethon.sessions import StringSession
_STRUCT_PREFORMAT = '>B{}sH256s'
CURRENT_VERSION = '1'
dc_id = 1
ip = b'\x7f\x00\x00\x01' # 127.0.0.1
port = 80
key = b'\x00' * 256
string = StringSession.encode(struct.pack(
    _STRUCT_PREFORMAT.format(len(ip)),
    dc_id,
    ip,
    port,
    key
))
myvalue = CURRENT_VERSION + string
# Create the StringSession object using the dummy value to confirm it works
session = StringSession(myvalue)
print(myvalue)
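And a rough sketch of how the dummy value might be used in a test with a mocked client (the patch target and the surrounding test layout are illustrative assumptions, not from the question):

from unittest import mock

# 'myvalue' is the dummy session string built above; StringSession(myvalue)
# now parses without raising struct.error.
with mock.patch('telethon.TelegramClient') as MockClient:
    client = MockClient(StringSession(myvalue), 12345, 'dummy-api-hash')
    client.connect()  # goes to the mock, never to Telegram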
Is there a way to access a REST API with a pure Lua script?
I need to make both GET and POST requests and display the response.
I already tried:
local api = nil
local function iniit()
    if api == nil then
        -- body
        api = require("http://api.com")
            .create()
            .on_get(function ()
                return {name = "Apple",
                        id = 12345}
            end)
    end
end
On Linux and Mac we can easily install LuaRocks and then install the curl package. That's the easiest way on Unix-like OSes.
-- HTTP GET
local curl = require('curl')

curl.easy{
    url = 'api.xyz.net?a=data',
    httpheader = {
        "X-Test-Header1: Header-Data1",
        "X-Test-Header2: Header-Data2",
    },
    writefunction = io.stderr -- use io.stderr:write()
}
:perform()
:close()
On Windows I faced several problems: I couldn't install LuaRocks correctly, then the luarocks install command didn't work correctly, etc.
First, download Lua from the official site and create a directory structure like the one described here:
http://fuchen.github.io/dev/2013/08/24/install-luarocks-on-windows/
Then I downloaded LuaDist:
http://luadist.org/
The extracted LuaDist folder had the same structure as the Lua folder, so I merged the LuaDist folder and the Lua folder.
Finally we can use socket.http:
local http = require("socket.http")
local ltn12 = require("ltn12")  -- needed for the source/sink helpers below

local request_body = [[login=user&password=123]]
local response_body = {}

local res, code, response_headers = http.request{
    url = "http://api.xyz.net?a=data",
    method = "GET",
    headers =
    {
        ["Content-Type"] = "application/x-www-form-urlencoded";
        ["Content-Length"] = #request_body;
    },
    source = ltn12.source.string(request_body),
    sink = ltn12.sink.table(response_body),
}
print(res)
print(code)
if type(response_headers) == "table" then
    for k, v in pairs(response_headers) do
        print(k, v)
    end
end

print("Response body:")
if type(response_body) == "table" then
    print(table.concat(response_body))
else
    print("Not a table:", type(response_body))
end
If you do these steps correctly, this will work.
I'm trying to replicate the following successful cURL operation with Grinder.
curl -X PUT -d "title=Here%27s+the+title&content=Here%27s+the+content&signature=myusername%3A3ad1117dab0ade17bdbd47cc8efd5b08" http://www.mysite.com/api
Here's my script:
from net.grinder.script import Test
from net.grinder.script.Grinder import grinder
from net.grinder.plugin.http import HTTPRequest
from HTTPClient import NVPair
import hashlib
test1 = Test(1, "Request resource")
request1 = HTTPRequest(url="http://www.mysite.com/api")
test1.record(request1)
log = grinder.logger.info
test1.record(log)
m = hashlib.md5()
class TestRunner:
    def __call__(self):
        params = [NVPair("title", "Here's the title"), NVPair("content", "Here's the content")]
        params.sort(key=lambda param: param.getName())

        ps = ""
        for param in params:
            ps = ps + param.getValue() + ":"
        ps = ps + "myapikey"

        m.update(ps)
        params.append(NVPair("signature", ("myusername:" + m.hexdigest())))
        request1.setFormData(tuple(params))

        result = request1.PUT()
The test runs okay, but it seems that my script doesn't actually send any of the params data to the API, and I can't work out why. There are no errors generated, but I get a 401 Unauthorized response from the API, indicating that a successful PUT request reached it, but obviously without a signature the request was rejected.
This isn't exactly an answer, more of a workaround that I came up with, that I've decided to post since this question hasn't yet received any responses, and it may help anyone else trying to achieve the same thing.
The workaround is basically to use the httplib and urllib modules to build and make the PUT request instead of the HTTPClient module.
import hashlib
import httplib, urllib
....
params = [("title", "Here's the title"),("content", "Here's the content")]
params.sort(key=lambda param: param[0])
ps = ""
for param in params:
ps = ps + param[1] + ":"
ps = ps + "myapikey"
m = hashlib.md5()
m.update(ps)
params.append(("signature", "myusername:" + m.hexdigest()))
params = urllib.urlencode(params)
print params
headers = {"Content-type": "application/x-www-form-urlencoded"}
conn = httplib.HTTPConnection("www.mysite.com:80")
conn.request("PUT", "/api", params, headers)
response = conn.getresponse()
print response.status, response.reason
print response.read()
conn.close()
(Based on the example at the bottom of this documentation page.)
Refer to the multi-part form posting example in the Grinder script gallery, but change the POST to a PUT. It works for me.
from HTTPClient import Codecs, NVPair
from jarray import zeros

files = ( NVPair("self", "form.py"), )
parameters = ( NVPair("run number", str(grinder.runNumber)), )

# This is the Jython way of creating an NVPair[] Java array
# with one element.
headers = zeros(1, NVPair)

# Create a multi-part form encoded byte array.
data = Codecs.mpFormDataEncode(parameters, files, headers)
grinder.logger.output("Content type set to %s" % headers[0].value)

# Call the version of PUT that takes a byte array.
result = request1.PUT("/upload", data, headers)
I have developed numerous iOS apps over the years so know Objective C reasonably well.
I'd like to build my first web service to offload some of the most processor-intensive functions.
I'm leaning towards using my Mac as the server, which comes with Apache. I have configured this and it appears to be working as it should (I can type the Mac's IP address and receive a confirmation).
Now I'm trying to decide on how to build the server-side web service, which is totally new to me. I'd like to leverage my Objective C knowledge if possible. I think I'm looking for an Objective C-compatible web service engine and some examples how to connect it to browsers and mobile interfaces. I was leaning towards using Amazon's SimpleDB as the database.
BTW: I see Apple have Lion Server, but I cannot work out if this is an option.
Any thoughts/recommendations are appreciated.
There are examples of simple web servers out there written in ObjC such as this and this.
That said, there are probably "better" ways of doing this if you don't mind using other technologies. This is a matter of preference, but I've used Python, MySQL, and the excellent web.py framework for these sorts of backends.
For example, here's an example web service (some redundancies omitted...) using the combination of technologies described. I just run this on my server, and it takes care of url redirection and serves JSON from the db.
import web
import json
import MySQLdb
urls = (
    "/equip/gruppo", "gruppo",  # GET = get all gruppos, POST = save gruppo
    "/equip/frame", "frame"
)

class StatusCode:
    (Success, SuccessNoRows, FailConnect, FailQuery, FailMissingParam, FailOther) = range(6)

# top-level class that handles db interaction
class APIObject:
    def __init__(self):
        self.object_dict = {}  # top-level dictionary to be turned into JSON
        self.rows = []
        self.cursor = ""
        self.conn = ""

    def dbConnect(self):
        try:
            self.conn = MySQLdb.connect(host='localhost', user='my_api_user', passwd='api_user_pw', db='my_db')
            self.cursor = self.conn.cursor(MySQLdb.cursors.DictCursor)
        except:
            self.object_dict['api_status'] = StatusCode.FailConnect
            return False
        else:
            return True

    def queryExecute(self, query):
        try:
            self.cursor.execute(query)
            self.rows = self.cursor.fetchall()
        except:
            self.object_dict['api_status'] = StatusCode.FailQuery
            return False
        else:
            return True

class gruppo(APIObject):
    def GET(self):
        web.header('Content-Type', 'application/json')
        if self.dbConnect() == False:
            return json.dumps(self.object_dict, sort_keys=True, indent=4)
        else:
            if self.queryExecute("SELECT * FROM gruppos") == False:
                return json.dumps(self.object_dict, sort_keys=True, indent=4)
            else:
                self.object_dict['api_status'] = StatusCode.SuccessNoRows if len(self.rows) == 0 else StatusCode.Success
                data_list = []
                for row in self.rows:
                    # create a dictionary with the required elements
                    d = {}
                    d['id'] = row['id']
                    d['maker'] = row['maker_name']
                    d['type'] = row['type_name']
                    # append to the object list
                    data_list.append(d)
                self.object_dict['data'] = data_list
                # return to the client
                return json.dumps(self.object_dict, sort_keys=True, indent=4)
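To actually serve this (one of the parts omitted above), you would add the standard web.py entry point; a minimal sketch, assuming the classes above live in the same file:

if __name__ == "__main__":
    app = web.application(urls, globals())
    app.run()  # web.py serves on port 8080 by default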
Is it possible to use Redis's MOVE command to move all keys from one database to another? The MOVE command only moves one key, but I need to move all the keys in the database.
I would recommend taking a look at the following alpha-version app to back up and restore Redis databases (you can install it via gem install redis-dump). You could redis-dump your database and then redis-load it into another database via the --database argument.
redis-dump project
If this doesn't fit your purposes, you may need to make use of a scripting language's redis bindings (or alternatively throw something together using bash / redis-cli / xargs, etc). If you need assistance along these lines then we probably need more details first.
I've written a small Python script to move data between two Redis servers (it only supports list and string types, and you must have the Python Redis client installed):
'''
Created on 2011-11-9
#author: wuyi
'''
import redis
from optparse import OptionParser
import time
def mv_str(r_source, r_dest, quiet):
    keys = r_source.keys("*")
    for k in keys:
        if r_dest.keys(k):
            print "skipping %s"%k
            continue
        else:
            print "copying %s"%k
            r_dest.set(k, r_source.get(k))

def mv_list(r_source, r_dest, quiet):
    keys = r_source.keys("*")
    for k in keys:
        length = r_source.llen(k)
        i = 0
        while (i<length):
            print "add queue no.:%d"%i
            v = r_source.lindex(k, i)
            r_dest.rpush(k, v)
            i += 1

if __name__ == "__main__":
    usage = """usage: %prog [options] source dest"""
    parser = OptionParser(usage=usage)
    parser.add_option("-q", "--quiet", dest="quiet",
                      default=False, action="store_true",
                      help="quiet mode")
    parser.add_option("-p", "--port", dest="port",
                      default=6380,
                      help="port for both source and dest")
    parser.add_option("", "--dbs", dest="dbs",
                      default="0",
                      help="db list: 0 1 120 220...")
    parser.add_option("-t", "--type", dest="type",
                      default="normal",
                      help="available types: normal, lpoplist")
    parser.add_option("", "--tmpdb", dest="tmpdb",
                      default=0,
                      help="tmp db number to store tmp data")

    (options, args) = parser.parse_args()

    if not len(args) == 2:
        print usage
        exit(1)

    source = args[0]
    dest = args[1]
    if source == dest:
        print "dest must not be the same as source!"
        exit(2)

    dbs = options.dbs.split(' ')
    for db in dbs:
        r_source = redis.Redis(host=source, db=db, password="", port=int(options.port))
        r_dest = redis.Redis(host=dest, db=db, password="", port=int(options.port))
        print "______________db____________:%s"%db
        time.sleep(2)

        if options.type == "normal":
            mv_str(r_source, r_dest, options.quiet)
        elif options.type == "lpoplist":
            mv_list(r_source, r_dest, options.quiet)

        del r_source
        del r_dest
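For example (the script filename is mine, not the author's), copying databases 0 and 1 from one host to another could look like:

python move_redis.py --port 6379 --dbs "0 1" source-host dest-host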
You can try my own tool, rdd.
It's a command-line utility that can dump a database to a file, work on it (filter, match, merge, ...), and load it back into a Redis instance.
Take care, it's at the alpha stage: https://github.com/r043v/rdd/
Now that Redis has scripting with Lua, you can easily write a command that loops through all the keys, checks their type, and moves them accordingly to a new database.
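If you'd rather stay client-side, here is a rough sketch of the same idea using the redis-py bindings instead of a server-side Lua script (host, port, and database numbers are placeholders):

import redis

# Connect to the source database.
r = redis.Redis(host='localhost', port=6379, db=0)

# MOVE each key from db 0 to db 1 on the same server.
# MOVE works for any key type, so no per-type handling is needed,
# but it returns 0 (and does nothing) if the key already exists in the target db.
for key in r.scan_iter('*'):
    r.move(key, 1)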
I suggest you try it as below:
1. Copy the RDB file to another directory.
2. Rename the RDB file.
3. Modify the Redis configuration file of the new instance so it points at that RDB file.