
Consuming SOAP and REST WebServices at the same time in PHP

My objective is to consume various web services and then merge the results.

I was doing this using PHP cURL, but as the number of web services increased, my service slowed, because the process waited for each response before making the request to the next web service.

I solved this issue using curl_multi and everything was working fine.
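For reference, the curl_multi pattern looks roughly like this (a minimal sketch; the URLs are placeholders):

$urls = ['https://api.example.com/a', 'https://api.example.com/b'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run all requests in parallel.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Collect the responses.
$responses = [];
foreach ($handles as $ch) {
    $responses[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);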

Now I have a new problem: I have new web services to add that use the SOAP protocol, and I can't make the requests simultaneously anymore, because I don't use cURL for SOAP web services; I use SoapClient.

I know that I could build the SOAP XML envelope myself and send it with cURL, but that seems like bad practice to me.

In short, is there some way to consume REST and SOAP Web Services simultaneously?

I would first try a unified, asynchronous Guzzle setup, as others have said. If that doesn't work out, I suggest not using process forking or multithreading. Neither is simple to use or maintain. For example, mixing Guzzle and threads requires special attention.

I don't know the structure of your application, but this might be a good case for a queue. Put a message into a queue for each API call and let multiple PHP daemons read from the queue and make the actual requests. The code can be organized to use cURL or SoapClient depending on the protocol or endpoint, instead of trying to combine them. Simply start as many daemons as you want to make requests in parallel. This avoids all of the complexity of threading or process management and scales easily.
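A minimal sketch of one such daemon, assuming the phpredis extension and a Redis list named api_calls holding JSON job messages (the queue name and message shape are illustrative, not prescribed):

// worker.php - run several of these in parallel
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

while (true) {
    // Block until a job arrives; blPop returns [listName, payload].
    [, $payload] = $redis->blPop(['api_calls'], 0);
    $job = json_decode($payload, true);

    if ($job['protocol'] === 'soap') {
        $client = new SoapClient($job['wsdl']);
        $result = $client->__soapCall($job['operation'], [$job['params']]);
    } else {
        $ch = curl_init($job['url']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        curl_close($ch);
    }
    // @todo store $result and decrement the semaphore (see below)
}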

When I use this architecture I also keep track of a "semaphore" in a key-value store or database. Start the semaphore with a count of the API calls to be made. As each call completes, the count is reduced. Each process checks whether the count has hit zero, and then you know all of the work is done. This is only really necessary when there's a subsequent task, such as calculating something from all of the API results or updating a record to let users know the job is done.
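With Redis, that countdown looks something like this (the key name is illustrative):

// Producer: set the counter once, before queueing the jobs.
$redis->set('job:results:remaining', $numberOfApiCalls);

// Worker: after each API call completes. decr is atomic,
// so exactly one worker will observe the count reaching zero.
$remaining = $redis->decr('job:results:remaining');
if ($remaining === 0) {
    // All API calls are done; kick off the subsequent task here.
}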

Now this setup sounds more complicated than process forking or multithreading, but each component is easily testable and it scales across servers.

I've put together a PHP library that helps build the architecture I'm describing. It's basic pipelining that allows a mix of synchronous and asynchronous processes. The async work is handled by a queue and semaphore. API calls that need to happen in sequence would each get a Process class. API calls that could be made concurrently go into a MultiProcess class. A ProcessList sets up the pipeline.

Yes, you can.

Use an HTTP client (e.g. Guzzle, Httpful). Most of them follow PSR-7, so you get a common contract. Most importantly, they have plenty of plugins for SOAP and REST.

For example, if you choose Guzzle as your HTTP client, it has SOAP plugins. REST is just plain HTTP calls, so you don't need an extra package for that; just use Guzzle itself.

Write your API calls in an async (non-blocking) way; that will increase the performance. One solution is to use promises.
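A minimal sketch with Guzzle promises (the endpoint URLs are placeholders; on older versions of guzzlehttp/promises, Utils::settle() is available as the \GuzzleHttp\Promise\settle() function instead):

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client();

// Fire both requests without blocking.
$promises = [
    'users'  => $client->getAsync('https://api.example.com/users'),
    'orders' => $client->getAsync('https://api.example.com/orders'),
];

// Wait for all of them; settle() records each outcome instead of throwing.
$results = Utils::settle($promises)->wait();

if ($results['users']['state'] === 'fulfilled') {
    echo $results['users']['value']->getBody();
}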


It's not something PHP is good at, and you can easily run into edge-case crash bugs doing it, but PHP CAN do multithreading: check pthreads and pcntl_fork. (Neither of which works on a web server behind php-fpm / mod_php, by the way, and pcntl_fork only works on Unix systems (Linux/BSD); it won't work on Windows.)

However, you'd probably be better off switching to a master process -> worker processes model with proc_open & co. This works behind web servers under both php-fpm and mod_php, does not depend on pthreads being installed, works even on Windows, and a single worker crashing won't take down the others. You can also drop PHP's curl_multi interface (which IMO is very cumbersome to get right) and keep using the simple curl_exec & co. functions. (Here's an example running several instances of ping: https://gist.github.com/divinity76/f5e57b0f3d8131d5e884edda6e6506d7 - but I'm suggesting using the PHP CLI for this, e.g. proc_open('php workerProcess.php', ...); I have done this several times before with success.)
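A minimal sketch of that master -> worker model, assuming a workerProcess.php that does one unit of work and prints its result to stdout:

// master.php
$workers = [];
for ($i = 0; $i < 4; $i++) {
    $spec = [
        0 => ['pipe', 'r'],              // worker stdin (unused here)
        1 => ['pipe', 'w'],              // worker stdout, read by the master
        2 => ['file', '/dev/null', 'w'], // discard worker stderr (Unix-style path)
    ];
    $proc = proc_open('php workerProcess.php', $spec, $pipes);
    fclose($pipes[0]);
    $workers[] = ['proc' => $proc, 'stdout' => $pipes[1]];
}

// All workers are now running in parallel; collect their output.
// stream_get_contents blocks until that worker closes stdout (i.e. exits).
foreach ($workers as $w) {
    echo stream_get_contents($w['stdout']);
    fclose($w['stdout']);
    proc_close($w['proc']);
}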

You could run a cronjob.php with crontab and start the other PHP scripts asynchronously:

// cronjob.php
$files = [
    'soap-client-1.php',
    'soap-client-2.php',
    'soap-client-3.php',
];

foreach ($files as $file) {
    // Redirect all output and background the process, so system()
    // returns immediately instead of waiting for the script to finish.
    $cmd = sprintf('/usr/bin/php -f %s > /dev/null 2>&1 &', escapeshellarg($file));
    system($cmd);
}

soap-client-1.php

// Make one blocking SOAP call; each script handles a single service.
$client = new SoapClient('http://www.webservicex.net/geoipservice.asmx?WSDL');

$parameters = array(
    'IPAddress' => '8.8.8.8',
);
$result = $client->GetGeoIP($parameters);
// @todo Save $result to the database

Each PHP script starts a new SOAP request and stores the result in the database. You can then process the data by reading the results from the database.

This seems like an architecture problem. You should instead consume each service with a separate file/URL and pull the JSON from those into an HTML5/JS front-end. That way, your service is divided into many asynchronous chunks, and the speed of each can be tweaked separately.
