My two classes define a Protocol for the connection strategy and a factory to open the connection. I start a reactor that connects to localhost on a port, but I get the following error:
Connection failed. Reason: [Failure instance: Traceback (failure with no frames): <class 'twisted.internet.error.ConnectionRefusedError'>: Connection was refused by other side: 61: Connection refused.
from twisted.internet import protocol, task, reactor
from twisted.internet.protocol import ClientFactory
from twisted.internet.endpoints import TCP4ClientEndpoint, connectProtocol

class TestClass(protocol.Protocol):
    def __init__(self):
        self._make_connection = self.transport.write("Connect to the transport")
        self.cnt_lost = self.transport.loseConnection()
        self._tst = self.transport.getPeer()

    def test_transport(self):
        self._make_connection
        self._tst
        self.cnt_lost

class EchoClientFactory(ClientFactory):
    def startedConnecting(self, connector):
        print('Started to connect.')

    def buildProtocol(self, addr):
        print('Connected.')
        return TestClass()

    def clientConnectionLost(self, connector, reason):
        print('Lost connection. Reason:', reason)

    def clientConnectionFailed(self, connector, reason):
        print('Connection failed. Reason:', reason)

reactor.connectTCP('127.0.0.1', 8050, EchoClientFactory())
reactor.run()
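The ConnectionRefusedError itself only means that nothing is accepting connections on 127.0.0.1:8050. A minimal sketch of a listener to run first, so the client above has something to connect to (the Echo name and the choice of port 8050 are just for illustration):

# Sketch: a server that accepts connections on 127.0.0.1:8050 and echoes
# whatever it receives, so the client above is not refused.
from twisted.internet import protocol, reactor

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        self.transport.write(data)  # echo the bytes straight back

class EchoServerFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Echo()

reactor.listenTCP(8050, EchoServerFactory())
reactor.run()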
Not much of an answer, but I managed to fix the issue above. However, I do not get any data back from the server. For example, I should be getting:
Connect to the transport
Instead, I just get:
Connected
Nor do I get the remote address of the connection.
Here's what I have tried:
Listen on the connection:
class TestClass(protocol.Protocol):
    def recieved_data(self, data):
        self.transport.write(data)

class readClientFactory(ClientFactory):
    def buildProtocol(self, addr):
        print('Connected.')
        return TestClass()

reactor.listenTCP(8070, readClientFactory())
reactor.run()
Connect to the port:
class readClass(protocol.Protocol):
    def connectionmade(self):
        self.transport.write(b"Connect to the transport")
        self.transport.getPeer()

    def test_transport(self):
        self.transport.loseConnection()

class readClientFactory(ClientFactory):
    def buildProtocol(self, addr):
        print('Connected.')
        return readClass()

    def clientConnectionLost(self, connector, reason):
        print('Lost connection. Reason:', reason)
        reactor.stop()

    def clientConnectionFailed(self, connector, reason):
        print('Connection failed. Reason:', reason)
        reactor.stop()

reactor.connectTCP('127.0.0.1', 8070, readClientFactory())
reactor.run()
output:
Connected.
It should be:
Connected.
Connect to the transport
--- Then some stuff about the ip
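For reference, Twisted only calls hook methods with the exact names connectionMade and dataReceived; the lowercase connectionmade and the misspelled recieved_data above are never invoked. A minimal sketch of the client with the expected method names (port 8070 and the echoing server are carried over from the snippets above; the class names are illustrative):

# Sketch: same client as above, but using the method names Twisted actually calls.
from twisted.internet import protocol, reactor
from twisted.internet.protocol import ClientFactory

class ReadClass(protocol.Protocol):
    def connectionMade(self):
        # Called by Twisted once the TCP connection is established.
        self.transport.write(b"Connect to the transport")
        print('Peer:', self.transport.getPeer())

    def dataReceived(self, data):
        # Called by Twisted whenever the server sends bytes back.
        print('Received:', data)
        self.transport.loseConnection()

class ReadClientFactory(ClientFactory):
    def buildProtocol(self, addr):
        print('Connected.')
        return ReadClass()

    def clientConnectionLost(self, connector, reason):
        print('Lost connection. Reason:', reason)
        reactor.stop()

    def clientConnectionFailed(self, connector, reason):
        print('Connection failed. Reason:', reason)
        reactor.stop()

reactor.connectTCP('127.0.0.1', 8070, ReadClientFactory())
reactor.run()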
Related
I wrote a simple test to validate an HTTPS proxy in Scrapy, but it didn't work.
class BaiduSpider(scrapy.Spider):
    name = 'baidu'
    allowed_domains = ['baidu.com']
    start_urls = ['http://www.baidu.com/']

    def parse(self, response):
        if response.status == 200:
            print(response.text)
The middlewares file looks like this:
class DynamicProxyDownloaderMiddleware(object):
    def process_request(self, request, spider):
        request.meta['proxy'] = 'https://183.159.88.182:8010'
And the settings file:
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
    'requestTest.middlewares.DynamicProxyDownloaderMiddleware': 100,
}
When I use the requests library, the HTTPS proxy works, but after switching to Scrapy it doesn't, which confuses me. Does anybody know why?
The relevant line from the log file:
[scrapy.downloadermiddlewares.retry] DEBUG: Retrying <GET http://www.baidu.com/> (failed 1 times): TCP connection timed out: 10060:
the proxy address is https://183.159.88.182:8010
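For comparison, this is roughly how the same proxy can be exercised with the requests library outside Scrapy (a sketch: the proxy address is the one from the question and may no longer be reachable, and the 10-second timeout is an arbitrary choice):

# Sketch: verify the proxy with plain requests, mirroring the claim that it
# works outside Scrapy. Both schemes are mapped to the same proxy endpoint.
import requests

proxies = {
    "http": "https://183.159.88.182:8010",
    "https": "https://183.159.88.182:8010",
}

resp = requests.get("http://www.baidu.com/", proxies=proxies, timeout=10)
print(resp.status_code)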
I want to get the certificate hash, but I have no idea how to get the server's peer certificate, either from the request or from the response. The server I send the request to sets the Connection: close header, so retrieving the original SSL socket from the response doesn't work.
Currently no way, sorry.
You can check a cert hash easily, though: https://docs.aiohttp.org/en/stable/client_advanced.html#ssl-control-for-tcp-sockets
The following example uses SHA-256 fingerprint check:
fingerprint = b'...'  # the SHA-256 digest of the DER-encoded certificate: 32 bytes (256/8)
r = await session.get('https://example.com',
                      ssl=aiohttp.Fingerprint(fingerprint))
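One way to obtain such a fingerprint for pinning is to hash the server's DER-encoded certificate with the standard library (a sketch; example.com is a placeholder host):

# Sketch: compute the 32-byte SHA-256 fingerprint of a server certificate
# so it can be passed to aiohttp.Fingerprint().
import hashlib
import ssl

pem = ssl.get_server_certificate(("example.com", 443))  # fetch the cert as PEM text
der = ssl.PEM_cert_to_DER_cert(pem)                      # convert to DER bytes
fingerprint = hashlib.sha256(der).digest()               # 32-byte digest
print(fingerprint.hex())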
I've come up with this solution/hack
import aiohttp

class WrappedResponseClass(aiohttp.ClientResponse):
    def __init__(self, *args, **kwargs):
        super(WrappedResponseClass, self).__init__(*args, **kwargs)
        self._peer_cert = None

    async def start(self, connection, read_until_eof=False):
        try:
            # Grab the DER-encoded peer certificate from the underlying SSL object.
            self._peer_cert = connection.transport._ssl_protocol._extra['ssl_object'].getpeercert(True)
        except Exception:
            pass
        return await super(WrappedResponseClass, self).start(connection, read_until_eof)

    @property
    def peer_cert(self):
        return self._peer_cert

session = aiohttp.ClientSession(otherargs..., response_class=WrappedResponseClass)
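A hedged usage sketch of the wrapper above, tying it back to the original goal of a certificate hash. The URL is a placeholder, peer_cert is only populated if the private-attribute lookup in start() succeeded, and the start() signature may differ between aiohttp versions:

# Sketch: fetch a page with the wrapped response class and hash the DER-encoded
# peer certificate it captured, if any.
import asyncio
import hashlib

async def main():
    async with aiohttp.ClientSession(response_class=WrappedResponseClass) as session:
        resp = await session.get("https://example.com")
        if resp.peer_cert is not None:
            print(hashlib.sha256(resp.peer_cert).hexdigest())

asyncio.run(main())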
The following works for me with aiohttp 3.8.3:
async with aiohttp.ClientSession() as session:
    r = await session.get('https://bbc.com')
    cert = r.connection.transport.get_extra_info('peercert')
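Note that 'peercert' returns the decoded certificate as a dict; to hash the certificate, the DER bytes can be pulled from the underlying SSL object instead (a sketch, assuming the connection has not been released yet; bbc.com is carried over from the example above):

# Sketch: same idea, but reading the DER-encoded peer certificate from the
# transport's SSL object and computing its SHA-256 hash.
import asyncio
import hashlib
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        r = await session.get('https://bbc.com')
        ssl_object = r.connection.transport.get_extra_info('ssl_object')
        der_cert = ssl_object.getpeercert(binary_form=True)
        print(hashlib.sha256(der_cert).hexdigest())

asyncio.run(main())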
This is the echo server code that communicates with the echo client:
from twisted.internet import protocol, reactor

class Echo(protocol.Protocol):
    def dataReceived(self, data):
        self.transport.write(data)

class EchoFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return Echo()

reactor.listenTCP(8000, EchoFactory())
reactor.run()
This is the echo client:
from twisted.internet import reactor, protocol

class EchoClient(protocol.Protocol):
    def connectionMade(self):
        self.transport.write(b"Hello, world!")

    def dataReceived(self, data):
        print("Server said:", data)
        self.transport.loseConnection()

class EchoFactory(protocol.ClientFactory):
    def buildProtocol(self, addr):
        return EchoClient()

    def clientConnectionFailed(self, connector, reason):
        print("Connection failed.")
        reactor.stop()

    def clientConnectionLost(self, connector, reason):
        print("Connection lost.")
        reactor.stop()

reactor.connectTCP("localhost", 8000, EchoFactory())
reactor.run()
The echo server and echo client above communicate with each other, but I want server-to-server communication: another echo server should come up and communicate with the first echo server.
You need to build a proxy client, attach it to one of the servers, and communicate with the other server via that proxy client.
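A minimal sketch of that idea, assuming the goal is to let one echo server forward what it receives to a second server through an embedded client. The class names and the second server's port 8001 are illustrative, not from Twisted or the question:

# Sketch: an echo server that also relays incoming data to another server
# through a short-lived client connection (the "proxy client").
from twisted.internet import protocol, reactor

class ForwarderClient(protocol.Protocol):
    """Client protocol that sends one payload to the other server."""
    def __init__(self, payload):
        self.payload = payload

    def connectionMade(self):
        self.transport.write(self.payload)
        self.transport.loseConnection()

class ForwardingEcho(protocol.Protocol):
    """Echo protocol that also relays data to a second server."""
    def dataReceived(self, data):
        self.transport.write(data)  # echo back to the original peer
        # open a client connection to the other server and relay the data
        factory = protocol.ClientFactory()
        factory.buildProtocol = lambda addr: ForwarderClient(data)
        reactor.connectTCP("localhost", 8001, factory)

class ForwardingEchoFactory(protocol.Factory):
    def buildProtocol(self, addr):
        return ForwardingEcho()

reactor.listenTCP(8000, ForwardingEchoFactory())
reactor.run()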
SSLTest.testError passes, but Trial raises an exception after tearDown. For comparison, RegularTest.testOk works fine.
I have not found any Twisted bug that explains this, so I assume I'm doing something wrong, given how easy this is to reproduce. Any ideas?
Here's the code:
from twisted.web import resource
from twisted.internet import ssl, reactor
from twisted.web.server import Site
from twisted.web.client import Agent, WebClientContextFactory
from twisted.trial.unittest import TestCase

class DummyServer(resource.Resource):
    isLeaf = True

    def render(self, request):
        return 'hello world'

class SSLTest(TestCase):
    def setUp(self):
        site = Site(DummyServer())
        SSLFactory = ssl.DefaultOpenSSLContextFactory('../server.key',
                                                       '../server.crt')
        port = reactor.listenSSL(0, site, contextFactory=SSLFactory)
        self.port = port
        self.portNumber = port._realPortNumber

    def tearDown(self):
        self.port.stopListening()

    def testError(self):
        def finished(result):
            self.assertEquals(result.code, 200)
        url = 'https://127.0.0.1:%s' % self.portNumber
        agent = Agent(reactor, WebClientContextFactory())
        d = agent.request('GET', url)
        d.addCallback(finished)
        return d

class RegularTest(TestCase):
    def setUp(self):
        site = Site(DummyServer())
        port = reactor.listenTCP(0, site)
        self.port = port
        self.portNumber = port._realPortNumber

    def tearDown(self):
        self.port.stopListening()

    def testOk(self):
        def finished(result):
            self.assertEquals(result.code, 200)
        url = 'http://127.0.0.1:%s' % self.portNumber
        agent = Agent(reactor)
        d = agent.request('GET', url)
        d.addCallback(finished)
        return d
Here's stdout:
$ trial trialerror.py
trialerror
RegularTest
testOk ... [OK]
SSLTest
testError ... [OK]
[ERROR]
===============================================================================
[ERROR]
Traceback (most recent call last):
Failure: twisted.trial.util.DirtyReactorAggregateError: Reactor was unclean.
Selectables:
<TLSMemoryBIOProtocol #0 on 51135>
trialerror.SSLTest.testError
-------------------------------------------------------------------------------
Ran 2 tests in 0.018s
FAILED (errors=1, successes=2)
Jonathan Lange wrote about this problem and its solutions. You may also want to consider not using real network connections in your unit tests: Agent already works, and so do Site, reactor.listenSSL, etc. Try to write unit tests that exercise your code, not lots and lots of code from the libraries your code depends on.
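If the tests do keep using real connections, one approach sometimes used to avoid the dirty-reactor error is sketched below. It is not part of the quoted answer and is not guaranteed across Twisted versions: give the Agent an explicit connection pool, read the response body so the client-side TLS connection can finish, and close everything via cleanups. It reuses DummyServer and the imports from the question's code:

# Sketch: explicit connection pool plus body read, so the client connection is
# closed before trial checks the reactor for leftover selectables.
from twisted.web.client import Agent, HTTPConnectionPool, WebClientContextFactory, readBody

class SSLCleanupTest(TestCase):
    def setUp(self):
        site = Site(DummyServer())
        SSLFactory = ssl.DefaultOpenSSLContextFactory('../server.key', '../server.crt')
        self.port = reactor.listenSSL(0, site, contextFactory=SSLFactory)
        self.portNumber = self.port._realPortNumber
        self.pool = HTTPConnectionPool(reactor, persistent=False)
        self.addCleanup(self.pool.closeCachedConnections)
        self.addCleanup(self.port.stopListening)

    def testRequest(self):
        url = 'https://127.0.0.1:%s' % self.portNumber
        agent = Agent(reactor, WebClientContextFactory(), pool=self.pool)
        d = agent.request('GET', url)

        def check(response):
            self.assertEqual(response.code, 200)
            return readBody(response)  # drain the body so the connection can close

        d.addCallback(check)
        return d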
I am writing a TCP proxy with the Twisted framework and need simple client failover: if the proxy cannot connect to one backend, it should connect to the next one in the list. I have been using
reactor.connectTCP(host, port, factory) for the proxy, but it does not raise an error when it cannot connect. How can I detect that it cannot connect and try another host, or should I use some other connection method?
You can use a Deferred to do that:
from twisted.internet import defer, reactor
from twisted.internet.protocol import ClientFactory

class MyClientFactory(ClientFactory):
    protocol = ClientProtocol  # your protocol class, defined elsewhere

    def __init__(self, request):
        self.request = request
        self.deferred = defer.Deferred()

    def handleReply(self, command, reply):
        # Handle the reply
        self.deferred.callback(0)

    def clientConnectionFailed(self, connector, reason):
        self.deferred.errback(reason)

def send(_, host, port, msg):
    factory = MyClientFactory(msg)
    reactor.connectTCP(host, port, factory)
    return factory.deferred
d = defer.Deferred()
d.addErrback(send, host1, port1, msg1)
d.addErrback(send, host2, port2, msg2)
# ...
d.addBoth(lambda _: print("finished"))
This will trigger the next errback if the previous attempt fails; otherwise it skips ahead to the print callback.
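A sketch of how the chain above might be driven, reusing send() and MyClientFactory from this answer. Note that the errback chain only starts once the initial Deferred is fired with a failure; the backend addresses are placeholders:

# Sketch: try each backend in turn; a failed connection triggers the next
# errback, a successful reply skips straight to the addBoth callback.
from twisted.internet import defer, reactor
from twisted.python.failure import Failure

def try_backends(backends, msg):
    d = defer.Deferred()
    for host, port in backends:
        d.addErrback(send, host, port, msg)
    d.addBoth(lambda _: print("finished"))
    # Kick off the chain: the first errback (first backend) runs immediately.
    d.errback(Failure(Exception("start failover")))
    return d

try_backends([("10.0.0.1", 9000), ("10.0.0.2", 9000)], b"hello")
reactor.run()  # stop the reactor elsewhere once the work is done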