High-level HTTP client library for native C/C++ in Win32

Are there no "high-level" HTTP libraries for native C/C++ in Win32 or am I just looking in the wrong places?

By "high-level" I mean an API that lets me do HTTP web requests/responses in C++ with "about the same" abstraction level as the .NET framework (but note that using C++/CLI is not an option for me).

How to do something like this (with about the same amount of code) in C/C++ in Win32 without using .NET? As a reference, I include a code sample to show how I'd do it in C#.

byte[] fileBytes = null;
bool successfulDownload = false;
using (WebClient client = new WebClient())
{
    WebProxy proxy = WebProxy.GetDefaultProxy();
    client.Proxy = proxy;
tryAgain:
    try
    {
        fileBytes = client.DownloadData(fileUrl);
        successfulDownload = true;
    }
    catch (WebException wEx)
    {
        if (wEx.Response != null && wEx.Response is HttpWebResponse)
        {
            string username = null, password = null;
            bool userCanceled = false;
            HttpStatusCode statusCode = ((HttpWebResponse)wEx.Response).StatusCode;
            switch (statusCode)
            {
                case HttpStatusCode.ProxyAuthenticationRequired:
                    // This is just a convenience function defined elsewhere
                    GetAuthenticationCredentials(fileUrl, true,
                        out username, out password, out userCanceled);
                    if (!userCanceled)
                    {
                        client.Proxy.Credentials = new NetworkCredential(username, password);
                        goto tryAgain;
                    }
                    break;
                case HttpStatusCode.Unauthorized:
                    // This is just a convenience function defined elsewhere
                    GetAuthenticationCredentials(fileUrl, false,
                        out username, out password, out userCanceled);
                    if (!userCanceled)
                    {
                        client.Credentials = new NetworkCredential(username, password);
                        goto tryAgain;
                    }
                    break;
            }
        }
    }
}

Win32 provides the Internet* functions.

http://msdn.microsoft.com/en-us/library/aa385473(VS.85).aspx

You'll need (IIRC — I haven't touched these APIs in over 10 years) an InternetOpen to get a session handle, then InternetOpenUrl and InternetReadFile .
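A minimal sketch of that call sequence (Windows only, compiled against wininet.lib; the URL is a placeholder, and error handling plus proxy-authentication retries via InternetErrorDlg are omitted):

```c
#include <windows.h>
#include <wininet.h>
#include <stdio.h>
#pragma comment(lib, "wininet.lib")

int main(void)
{
    /* INTERNET_OPEN_TYPE_PRECONFIG makes WinInet use the system
       (IE) proxy settings, similar to WebProxy.GetDefaultProxy(). */
    HINTERNET hInet = InternetOpenA("MyApp/1.0",
        INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0);
    if (!hInet)
        return 1;

    HINTERNET hUrl = InternetOpenUrlA(hInet, "http://example.com/file.bin",
        NULL, 0, INTERNET_FLAG_RELOAD, 0);
    if (hUrl) {
        char buffer[4096];
        DWORD bytesRead;
        /* Read the body in chunks until InternetReadFile reports 0 bytes. */
        while (InternetReadFile(hUrl, buffer, sizeof(buffer), &bytesRead)
               && bytesRead > 0) {
            fwrite(buffer, 1, bytesRead, stdout);
        }
        InternetCloseHandle(hUrl);
    }
    InternetCloseHandle(hInet);
    return 0;
}
```

It is noticeably lower-level than the C# version: authentication failures show up as HTTP status codes you query yourself with HttpQueryInfo, rather than as typed exceptions.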

I think libcurl matches those requirements. And then some.

This example shows how to get an HTTP page, storing it in memory only. It's a bit more code than your example, but it's in C.
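The core of that approach is a write callback that appends each received chunk to a growing buffer — roughly like this (a condensed sketch, assuming libcurl is installed; the URL is a placeholder):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

struct memory { char *data; size_t size; };

/* libcurl calls this for each chunk of the response body. */
static size_t write_cb(void *contents, size_t size, size_t nmemb, void *userp)
{
    size_t realsize = size * nmemb;
    struct memory *mem = (struct memory *)userp;
    char *p = realloc(mem->data, mem->size + realsize + 1);
    if (!p)
        return 0;  /* returning less than realsize aborts the transfer */
    mem->data = p;
    memcpy(mem->data + mem->size, contents, realsize);
    mem->size += realsize;
    mem->data[mem->size] = '\0';
    return realsize;
}

int main(void)
{
    struct memory chunk = { NULL, 0 };
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, write_cb);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &chunk);
        CURLcode res = curl_easy_perform(curl);
        if (res == CURLE_OK)
            printf("downloaded %lu bytes\n", (unsigned long)chunk.size);
        else
            fprintf(stderr, "error: %s\n", curl_easy_strerror(res));
        curl_easy_cleanup(curl);
    }
    free(chunk.data);
    curl_global_cleanup();
    return 0;
}
```

Proxies are also covered: libcurl honours the http_proxy environment variable, and CURLOPT_PROXY / CURLOPT_PROXYUSERPWD let you set the proxy and its credentials explicitly.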

Apart from libcurl/curlpp (which is flexible and powerful but I find very...clunky) there are two C++ libraries I'm keeping my eye on. Both are quite new and based on Boost::ASIO. However, neither supports proxies (as best I can tell).

cpp-netlib ( blog ) is perhaps more mature (I know it's had some real-world testing) but is currently lacking timeouts (I'm working on it!). Example:

network::http::request  request("http://google.com");
network::http::client   client;
network::http::response response;

response = client.get(request);
if (response.status() == 200)
{
    std::cout << body(response);
}

Urdl ( documentation ) is written by the ASIO creator and has timeouts (but was only announced last month ). It uses a different model, opting to work with streams:

urdl::istream is("http://google.com");
std::string line;
while (std::getline(is, line))
{
    std::cout << line << std::endl;
}

I agree that C++ doesn't have great support for HTTP but both of these libraries show a lot of promise.

POCO also has cross platform networking components.

The examples give an FTP program as something like this (without the error-checking fluff):

Poco::Net::FTPStreamFactory::registerFactory();
std::ofstream localFile(inputFile, std::ios_base::out | std::ios_base::binary);
Poco::URI uri(inputURL);
std::auto_ptr<std::istream> ptrFtpStream(Poco::Net::URIStreamOpener::defaultOpener().open(uri));
Poco::StreamCopier::copyStream(*ptrFtpStream.get(), localFile);

You are not looking in the wrong places. That's just the sad reality. That's why there's a C++ wrapper for libcurl called curlpp .

Below is an example of how to retrieve a web page and print it to the standard output stream.

#include <iostream>

#include <curlpp/curlpp.hpp>
#include <curlpp/Easy.hpp>
#include <curlpp/Options.hpp>


using namespace curlpp::options;

int main(int, char **)
{
  try
  {
    // That's all that is needed to do cleanup of used resources (RAII style).
    curlpp::Cleanup myCleanup;

    // Our request to be sent.
    curlpp::Easy myRequest;

    // Set the URL.
    myRequest.setOpt<Url>("http://example.com");

    // Send request and get a result.
    // By default the result goes to standard output.
    myRequest.perform();
  }

  catch(curlpp::RuntimeError & e)
  {
    std::cout << e.what() << std::endl;
  }

  catch(curlpp::LogicError & e)
  {
    std::cout << e.what() << std::endl;
  }

  return 0;
}

QtNetwork, the networking module of Qt, is also a possibility.
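For comparison, a download with QtNetwork looks roughly like this (a sketch against the Qt 5 API; the URL is a placeholder). Note that QNetworkAccessManager is asynchronous, so an event loop is required:

```cpp
#include <QCoreApplication>
#include <QNetworkAccessManager>
#include <QNetworkReply>
#include <QNetworkRequest>
#include <QTextStream>
#include <QUrl>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    QNetworkAccessManager manager;

    // The finished signal fires once the whole response has arrived.
    QObject::connect(&manager, &QNetworkAccessManager::finished,
                     [&app](QNetworkReply *reply) {
        if (reply->error() == QNetworkReply::NoError)
            QTextStream(stdout) << reply->readAll();
        reply->deleteLater();
        app.quit();
    });

    manager.get(QNetworkRequest(QUrl("http://example.com")));
    return app.exec();  // run the event loop until quit() is called
}
```

Proxy support is built in: QNetworkProxyFactory::setUseSystemConfiguration(true) picks up the system proxy settings, much like the .NET WebProxy in the question.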
