
How can I read the directory contents on a remote server with PHP?

I have a URL, http://www.mysite.com/images, and the images directory allows directory listings. How can I get the files in that directory with PHP?

Here is an example if you need to read the images over HTTP and the server is Apache:

<?php
$url = 'http://www.mysite.com/images';

// Fetch the directory-listing page that Apache generates
$html = file_get_contents($url);

// Match the file links inside the listing's table cells;
// the file name is captured from the href attribute
$count = preg_match_all('/<td><a href="([^"]+)">[^<]*<\/a><\/td>/i', $html, $files);

for ($i = 0; $i < $count; ++$i) {
    echo "File: " . $files[1][$i] . "<br />\n";
}
?>
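Depending on the listing, the matched hrefs may also include Apache's column-sort links (such as ?C=N;O=D) and the parent-directory link. A small follow-up filter, sketched here against a hypothetical $hrefs array, drops those:

```php
<?php
// Hypothetical hrefs as they might come back from the pattern above
$hrefs = ['?C=N;O=D', '?C=M;O=A', '/images/', 'photo1.jpg', 'photo2.png'];

// Keep only plain file names: drop sort links (which start with "?")
// and directory links (which contain "/")
$files = array_values(array_filter($hrefs, function ($href) {
    return strpos($href, '?') !== 0 && strpos($href, '/') === false;
}));

print_r($files); // photo1.jpg and photo2.png remain
```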

If it is the same server your PHP is running on, you can use opendir() and readdir().
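For the same-server case, a minimal sketch of the opendir()/readdir() approach follows. The list_dir() helper name and the "images" path are assumptions of mine; adjust them to your layout:

```php
<?php
// List the plain entries of a local directory with opendir()/readdir()
function list_dir($dir) {
    $entries = [];
    if ($handle = opendir($dir)) {
        while (false !== ($entry = readdir($handle))) {
            // Skip the "." and ".." pseudo-entries
            if ($entry === '.' || $entry === '..') {
                continue;
            }
            $entries[] = $entry;
        }
        closedir($handle);
    }
    sort($entries); // readdir() order is not guaranteed
    return $entries;
}

$dir = __DIR__ . '/images'; // assumed path
if (is_dir($dir)) {
    foreach (list_dir($dir) as $file) {
        echo "File: $file<br />\n";
    }
}
```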

I know this question is very old, but just to get into the swing of using this forum I thought I'd add my view. I found the following happened (referring to the original answer that uses a regex).

My HTML turned out to be formatted like this:

<td>
<a href="bricks.php">bricks.php</a>
</td>

So I ended up using this:

$count = preg_match_all('/<a href="([^"?\/]+)">[^<]*<\/a>/i', $html, $files);

I wanted to use the following (which tested OK in online regex testers, but failed to find a match in the PHP code):

$count = preg_match_all('/<td>(?:[\w\n\f])<a href="([^"]+)">[^<]*<\/a>(?:[\w\n\f])<\/td>/i', $html, $files);

(The failure is likely because the (?:[\w\n\f]) group has no quantifier, so it matches exactly one character, and the class includes neither \r nor a space, so it cannot span the real whitespace between the tags.)

You can use a regex to take the URLs from the listing. (No, you can't use DOMDocument, as the listing is not valid HTML.)

You need FTP access (an FTP account for that URL). If you have it, you can log into the server over FTP and use:

opendir()

and

readdir()

to accomplish what you are trying to do.
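The FTP route can reuse the very same opendir()/readdir() calls through PHP's ftp:// stream wrapper, assuming allow_url_fopen is enabled. A sketch, where the credentials and host are placeholders and read_listing() is a helper name of my own:

```php
<?php
// Read a directory listing through any stream wrapper PHP supports,
// including ftp:// (the @ suppresses the warning when opendir() fails)
function read_listing($dir) {
    $entries = [];
    if ($handle = @opendir($dir)) {
        while (false !== ($entry = readdir($handle))) {
            // Skip the "." and ".." pseudo-entries, if the server sends them
            if ($entry !== '.' && $entry !== '..') {
                $entries[] = $entry;
            }
        }
        closedir($handle);
    }
    return $entries;
}

// The same call works for a local path or an FTP URL, e.g.:
// $files = read_listing('ftp://user:password@www.mysite.com/images/');
```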


If you do not have access to the server, you will need to scrape the site's HTML, which gets more complex, so I'll let somebody else tackle that one. But a Google search for "scrape html site" or something similar turns up plenty of pre-written functions that can do similar things.

e.g. http://www.thefutureoftheweb.com/blog/web-scrape-with-php-tutorial

http://www.bradino.com/php/screen-scraping/

Though a latecomer, this one seems a bit more reader-friendly, if not faster:

$url = 'http://whatevasite/images/';

// Strip all tags, leaving only the plain text of the listing
$no_html = strip_tags(file_get_contents($url));

// Everything after the "Parent Directory" label is the file list;
// this relies on the listing's exact text and whitespace, so it is fragile
$arr = explode('Parent Directory', $no_html);
$files = trim($arr[1]);
$files = explode("\n ", $files);
var_dump($files);

