
Consuming SOAP and REST WebServices at the same time in PHP

My objective is to consume various Web Services and then merge the results.

I was doing this with PHP cURL, but as the number of Web Services increased, my service slowed down, since the process waited for each response before making the request to the next Web Service.

I solved this issue using curl_multi and everything was working fine.
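For reference, a minimal sketch of that curl_multi pattern; the function name and the URL list passed in are placeholders, not the actual endpoints:

```php
<?php
// Minimal sketch of the curl_multi approach: all requests are dispatched
// at once and driven to completion concurrently instead of one at a time.
function fetchAll(array $urls): array
{
    $multi   = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_TIMEOUT        => 10,
        ]);
        curl_multi_add_handle($multi, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        $status = curl_multi_exec($multi, $running);
        if ($running && curl_multi_select($multi) === -1) {
            usleep(1000); // select failed; back off briefly instead of spinning
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

    return $results;
}
```

The returned array is keyed the same way as the input, so the per-service results are easy to merge afterwards.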

Now I have a new problem: I have new Web Services to add to my service that use the SOAP protocol, and I can't make simultaneous requests anymore, because I don't use cURL for SOAP Web Services; I use SoapClient.

I know that I could build the XML with the SOAP directives myself and then send it with cURL, but this seems like bad practice to me.

In short, is there some way to consume REST and SOAP Web Services simultaneously?

I would first try a unified, asynchronous Guzzle setup, as others have said. If that doesn't work out, I suggest not using process forking or multithreading. Neither is simple to use or maintain. For example, mixing Guzzle and threads requires special attention.

I don't know the structure of your application, but this might be a good case for a queue. Put a message into a queue for each API call and let multiple PHP daemons read from the queue and make the actual requests. The code can be organized to use cURL or SoapClient depending on the protocol or endpoint, instead of trying to combine them. Simply start as many daemons as you want requests made in parallel. This avoids all of the complexity of thread or process management and scales easily.

When I use this architecture I also keep a "semaphore" in a key-value store or database. Start the semaphore with a count of the API calls to be made. As each call completes, the count is decremented. Each process checks whether the count has hit zero, and then you know all of the work is done. This is only really necessary when there's a subsequent task, such as calculating something from all of the API results or updating a record to let users know the job is done.
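The countdown logic itself is simple. A sketch, with an in-memory array standing in for the key-value store (the class and method names are illustrative; in a real multi-daemon setup the decrement must be atomic, e.g. Redis DECR):

```php
<?php
// Illustrative countdown "semaphore". The in-memory array stands in for a
// key-value store or database; with multiple daemon processes sharing the
// counter, the decrement must be atomic (e.g. Redis DECR).
class CallCountdown
{
    private array $store = [];

    // Record how many API calls the job will make.
    public function start(string $jobId, int $totalCalls): void
    {
        $this->store[$jobId] = $totalCalls;
    }

    // Called by a worker when one API call finishes. Returns true only for
    // the call that brings the count to zero, i.e. when all work is done.
    public function complete(string $jobId): bool
    {
        return --$this->store[$jobId] === 0;
    }
}
```

The worker that sees `complete()` return true kicks off the follow-up task (merging the results, notifying the user, and so on).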

Now, this setup sounds more complicated than process forking or multithreading, but each component is easily testable and it scales across servers.

I've put together a PHP library that helps build the architecture I'm describing. It's basic pipelining that allows a mix of synchronous and asynchronous processes. The async work is handled by a queue and semaphore. API calls that need to happen in sequence each get a Process class. API calls that can be made concurrently go into a MultiProcess class. A ProcessList sets up the pipeline.

Yes, you can.

Use an HTTP client (e.g. Guzzle, Httpful); most of them follow PSR-7, so you have a contract up front. Most importantly, they have plenty of plugins for SOAP and REST.

For example, if you choose Guzzle as your HTTP client, it has a SOAP plugin. REST is just calling a service over HTTP, so you don't need an extra package for that; use Guzzle itself.

Write your API calls in an async (non-blocking) way; that will increase performance. One solution is to use promises.

Read more

It's not something PHP is good at, and you can easily hit edge-case crash bugs by doing it, but PHP CAN do multithreading: check PHP pthreads and pcntl_fork. (Neither works on a webserver behind php-fpm / mod_php, by the way, and pcntl_fork only works on Unix systems (Linux/BSD); it won't work on Windows.)

However, you'd probably be better off switching to a master process -> worker processes model with proc_open & co. This works behind webservers both in php-fpm and mod_php, does not depend on pthreads being installed, even works on Windows, and won't crash the other workers if a single worker crashes. You can also drop PHP's curl_multi interface (which IMO is very cumbersome to get right) and keep using the simple curl_exec & co functions. (Here's an example of running several instances of ping: https://gist.github.com/divinity76/f5e57b0f3d8131d5e884edda6e6506d7 - but I'm suggesting using the PHP CLI for this, e.g. proc_open('php workerProcess.php', ...); I have done it several times before with success.)
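A self-contained sketch of that master -> worker pattern; the inline `php -r` snippets stand in for real worker scripts such as `php workerProcess.php`, and the array command form of proc_open requires PHP 7.4+:

```php
<?php
// Master -> worker sketch with proc_open: each worker is a separate PHP
// process, so a crash in one worker cannot take down the others.
// Requires PHP 7.4+ for the array form of the command argument.
function runWorkers(array $workerCode): array
{
    $procs = [];
    foreach ($workerCode as $key => $code) {
        // Launch the worker; all workers end up running concurrently.
        $proc = proc_open(['php', '-r', $code], [1 => ['pipe', 'w']], $pipes);
        $procs[$key] = ['proc' => $proc, 'stdout' => $pipes[1]];
    }

    // Collect each worker's stdout; reading blocks only until that
    // particular worker has finished.
    $out = [];
    foreach ($procs as $key => $p) {
        $out[$key] = stream_get_contents($p['stdout']);
        fclose($p['stdout']);
        proc_close($p['proc']);
    }
    return $out;
}
```

For example, `runWorkers(['sum' => 'echo 40 + 2;'])` returns `['sum' => '42']`; in the real setup each worker would make its cURL or SoapClient call and print (or store) the result.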

You could run a cronjob.php with crontab and start other PHP scripts asynchronously:

// cronjob.php
$files = [
    'soap-client-1.php',
    'soap-client-2.php',
    'soap-client-3.php',
];

foreach($files as $file) {
    $cmd = sprintf('/usr/bin/php -f "%s" >> /dev/null &', $file);
    system($cmd);
}

soap-client-1.php

$client = new SoapClient('http://www.webservicex.net/geoipservice.asmx?WSDL');

$parameters = array(
    'IPAddress' => '8.8.8.8',
);
$result = $client->GetGeoIP($parameters);
// @todo Save result

Each PHP script starts a new SOAP request and stores the result in the database. You can then process the data by reading the results back from the database.

This seems like an architecture problem. You could instead consume each service with a separate file/URL and pull JSON from those into an HTML5/JS front-end. That way, your service is divided into many asynchronous chunks, and the speed of each can be tweaked separately.

Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0; if you repost, please credit this site or the original source.

 