
How to get client's IP in a python thrift server

I'm writing a thrift service in python and I would like to understand how I can get the client's IP address from within the handler functions.

Thanks, Love.

You need to obtain the transport and read the peer address from there. I'm not sure how to do this exactly in Python, but there's a mailing list thread and there's this JIRA ticket, THRIFT-1053, describing a solution for C++/Java.
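The core idea is independent of Thrift itself: once you can reach the raw socket object underneath the transport, `socket.getpeername()` returns the client's `(ip, port)` pair. A minimal stdlib-only sketch of that mechanism (no Thrift involved, just plain sockets):

```python
# Stdlib-only illustration: the thrift transport classes merely wrap a raw
# socket, and getpeername() on the accepted connection yields the client's
# address.
import socket
import threading

def peer_of_one_connection(server_sock):
    conn, _ = server_sock.accept()
    ip, port = conn.getpeername()  # the connected client's address
    conn.close()
    return ip

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # ephemeral port
server.listen(1)

result = {}
t = threading.Thread(
    target=lambda: result.update(ip=peer_of_one_connection(server)))
t.start()

client = socket.create_connection(server.getsockname())
t.join()
client.close()
server.close()

print(result["ip"])  # → 127.0.0.1
```

In the real server the only hard part is unwrapping the buffered/framed transport layers to reach that socket, which is what the answers below do.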

This is the relevant part from the mailing list thread:

I did it by decorating the TProcessor, like this pseudo-code.

-craig

class TrackingProcessor implements TProcessor {
  private final TProcessor processor;

  TrackingProcessor(TProcessor processor) { this.processor = processor; }

  public boolean process(TProtocol in, TProtocol out) throws TException {
    TTransport t = in.getTransport();
    InetAddress ia = t instanceof TSocket ?
        ((TSocket) t).getSocket().getInetAddress() : null;
    // Now you have the IP address, so do whatever you want with it.

    // Delegate to the processor we are decorating.
    return processor.process(in, out);
  }
}
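The same decorator pattern translates to Python. In the sketch below the Thrift classes are stubbed out so the example is self-contained; with the real `thrift` package you would wrap the actual `TProcessor` the same way and reach the raw socket through the input protocol's transport (`TSocket` keeps it in `.handle` in the Python library), unwrapping any buffered/framed transports first.

```python
# Rough Python port of the Java decorator above. FakeSocketTransport and
# FakeProtocol are stand-ins for the real thrift TSocket/TProtocol, used
# here only so the sketch runs on its own.
class FakeSocketTransport:           # stand-in for thrift's TSocket
    def __init__(self, peername):
        self._peer = peername
    def getpeername(self):
        return self._peer

class FakeProtocol:                  # stand-in for TProtocol
    def __init__(self, trans):
        self.trans = trans

class TrackingProcessor:
    """Decorates another processor and records each caller's IP."""
    def __init__(self, processor):
        self.processor = processor
        self.last_client_ip = None

    def process(self, iprot, oprot):
        trans = iprot.trans
        # With the real library: unwrap buffered/framed transports until you
        # reach the TSocket, then call trans.handle.getpeername().
        if hasattr(trans, "getpeername"):
            self.last_client_ip = trans.getpeername()[0]
        # Delegate to the processor we are decorating.
        return self.processor.process(iprot, oprot)

class EchoProcessor:                 # the processor being decorated
    def process(self, iprot, oprot):
        return True

iprot = FakeProtocol(FakeSocketTransport(("192.0.2.7", 4242)))
tracker = TrackingProcessor(EchoProcessor())
tracker.process(iprot, None)
print(tracker.last_client_ip)  # → 192.0.2.7
```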

This is a bit old, but I'm currently solving the same problem. Here's my solution with thriftpy:

import thriftpy
from thriftpy.thrift import TProcessor, TApplicationException, TType
from thriftpy.server import TThreadedServer
from thriftpy.protocol import TBinaryProtocolFactory
from thriftpy.transport import TBufferedTransportFactory, TServerSocket

class CustomTProcessor(TProcessor):
    def process_in(self, iprot):
        api, msg_type, seqid = iprot.read_message_begin()
        if api not in self._service.thrift_services:
            iprot.skip(TType.STRUCT)
            iprot.read_message_end()
            return api, seqid, TApplicationException(TApplicationException.UNKNOWN_METHOD), None  # noqa

        args = getattr(self._service, api + "_args")()
        args.read(iprot)
        iprot.read_message_end()
        result = getattr(self._service, api + "_result")()

        # convert kwargs to args
        api_args = [args.thrift_spec[k][1] for k in sorted(args.thrift_spec)]

        # get the client's IP address from the transport's underlying socket
        client_ip, client_port = iprot.trans.sock.getpeername()

        def call():
            f = getattr(self._handler, api)
            return f(*(args.__dict__[k] for k in api_args), client_ip=client_ip)

        return api, seqid, result, call

class PingPongDispatcher:
    def ping(self, param1, param2, client_ip):
        return "pong %s" % client_ip

pingpong_thrift = thriftpy.load("pingpong.thrift")
processor = CustomTProcessor(pingpong_thrift.PingService, PingPongDispatcher())
server_socket = TServerSocket(host="127.0.0.1", port=12345, client_timeout=10000)
server = TThreadedServer(processor,
                         server_socket,
                         iprot_factory=TBinaryProtocolFactory(),
                         itrans_factory=TBufferedTransportFactory())
server.serve()

Remember that every method in the dispatcher will be called with the extra parameter client_ip.
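The dispatch trick in `CustomTProcessor` boils down to building a closure that forwards the decoded arguments plus a `client_ip` keyword to the handler method. A self-contained sketch of just that step (names are illustrative, not thriftpy API):

```python
# Mirrors the inner `call()` closure in the custom processor above: the
# handler method is looked up by name and invoked with the positional
# arguments plus an injected client_ip keyword.
class PingPongDispatcher:
    def ping(self, param1, param2, client_ip):
        return "pong %s" % client_ip

def make_call(handler, api, args, client_ip):
    f = getattr(handler, api)
    return lambda: f(*args, client_ip=client_ip)

call = make_call(PingPongDispatcher(), "ping", ("a", "b"), "203.0.113.9")
print(call())  # → pong 203.0.113.9
```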

Here's the same wrapper, translated to C#. This answer is just about the only good Google result for this issue, so I figure that others might have an easier time translating it to their language of choice given two points of reference instead of one. For the record, this worked perfectly for me.

(Note that "in" and "out" are reserved words in C#.)

using Thrift;
using Thrift.Protocol;
using Thrift.Transport;
using System.Net;

class TrackingProcessor : TProcessor
{
    private TProcessor processor;
    public TrackingProcessor(TProcessor processor)
    {
        this.processor = processor;
    }

    public Boolean Process(TProtocol inProt, TProtocol outProt)
    {
        TTransport t = inProt.Transport;
        IPAddress ip = ((IPEndPoint)((TSocket)t).TcpClient.Client.RemoteEndPoint).Address;
        //Presumably we want to do something with this IP
        return processor.Process(inProt, outProt);
    }
}

The only way I found to get the TProtocol in the service handler is to extend the processor and create one handler instance for each client, keyed by its transport/protocol. Here's an example:

public class MyProcessor implements TProcessor {

    // Maps sockets (by remote address) to processors
    private static final Map<String, Service.Processor<ServiceHandler>> PROCESSORS =
            Collections.synchronizedMap(new HashMap<String, Service.Processor<ServiceHandler>>());

    // Maps sockets (by remote address) to handlers
    private static final Map<String, ServiceHandler> HANDLERS =
            Collections.synchronizedMap(new HashMap<String, ServiceHandler>());

    @Override
    public boolean process(final TProtocol in, final TProtocol out)
            throws TException {

        // Get the socket for this request.
        final TTransport t = in.getTransport();
        // Note that this cast will fail if the transport is not a socket,
        // so you might want to add some checking.
        final TSocket socket = (TSocket) t;
        final String clientRemote =
                socket.getSocket().getRemoteSocketAddress().toString();

        // Get the existing processor for this socket, if any.
        Service.Processor<ServiceHandler> processor = PROCESSORS.get(clientRemote);
        // If there's no processor, create a processor and a handler for
        // this client and link them to this new socket.
        if (processor == null) {
            // Inform the handler of its socket.
            final ServiceHandler handler = new ServiceHandler(socket);
            processor = new Service.Processor<ServiceHandler>(handler);
            PROCESSORS.put(clientRemote, processor);
            HANDLERS.put(clientRemote, handler);
        }
        return processor.process(in, out);
    }
}

Then you need to tell Thrift to use this processor for incoming requests. For a TThreadPoolServer it goes like this:

final TThreadPoolServer.Args args = new TThreadPoolServer.Args(new TServerSocket(port));
args.processor(new MyProcessor());
final TThreadPoolServer server = new TThreadPoolServer(args);

The PROCESSORS map might look superfluous, but it is not, since there is no way to get the handler back from a processor (i.e. there is no getter).

Note that it is your ServiceHandler instance that needs to keep track of which socket it is associated with. Here I pass it in the constructor, but any way will do. Then, when the ServiceHandler's IFace implementation is called, it will already have the associated socket.

This also means you will have one instance of MyProcessor and ServiceHandler for each connected client, which I think is not the case with base Thrift, where only one instance of each class is created.

This solution also has a quite annoying drawback: you need to figure out a way to remove obsolete data from the PROCESSORS and HANDLERS maps, otherwise they will grow indefinitely. In my case each client has a unique ID, so I can check whether there are obsolete sockets for that client and remove them from the maps.
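One way to do that eviction, sketched here in stdlib-only Python (the class and key names are placeholders, not Thrift API): keep a third map from client ID to its current socket key, and drop the client's previous entries whenever it reconnects.

```python
# Eviction sketch for the per-client maps described above: on reconnect,
# the client's stale socket entries are removed before the new ones are
# registered, so the maps stay bounded by the number of live clients.
class ServiceHandler:
    def __init__(self, socket_key):
        self.socket_key = socket_key

PROCESSORS = {}
HANDLERS = {}
CLIENT_SOCKETS = {}   # client id -> its current socket key

def register(client_id, socket_key):
    # Evict the client's previous socket entries, if any.
    stale = CLIENT_SOCKETS.get(client_id)
    if stale is not None and stale != socket_key:
        PROCESSORS.pop(stale, None)
        HANDLERS.pop(stale, None)
    CLIENT_SOCKETS[client_id] = socket_key
    handler = HANDLERS.setdefault(socket_key, ServiceHandler(socket_key))
    PROCESSORS.setdefault(socket_key, handler)  # processor stubbed by handler
    return handler

register("client-1", "10.0.0.5:50001")
register("client-1", "10.0.0.5:50002")   # reconnect: old entry evicted
print(sorted(HANDLERS))  # → ['10.0.0.5:50002']
```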

PS: the Thrift folks need to figure out a way to let the service handler get the protocol used for the current call (for example by allowing you to extend a base class instead of implementing an interface). This would be very useful in many scenarios.

Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0. If you repost, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.

粤ICP备18138465号  © 2020-2024 STACKOOM.COM