php get all files from a remote directory
I have searched and searched for 3+ hours this morning, and tried over 10 different setups for how to grab and display a list of images from a URL, and none of them worked correctly. I would either end up with no info displaying, or a 500 error. Can someone point me to an example or help me out here on how to do this properly?
file_get_contents is not a viable option.
Example directory: http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/
Files I know are in that directory: 001.jpg, 002.jpg, 003.jpg
I would like the output to be the exact URL to each file.
Let me know if more info is needed; I'm not 100% sure exactly how to explain it right lol.
Edit:
OK, so what I guess I actually want to do is check the URL for all the image tags and display a list with the full URL to each image. I'm new to working with this URL + images + PHP stuff, so please don't hit me too hard with the downvote hammer without leaving a comment lol.
Code I tried:
<?php
/*
Credits: Bit Repository
URL: http://www.bitrepository.com/
*/
$url = $location; // NOTE: $location must be defined before this point
// Fetch page
$string = FetchPage($url);
// Regex that extracts the images (full tag);
// group 1 captures the src value
$image_regex = '/<img[^>]*src=[\"\'](.*)[\"\']/Ui';
preg_match_all($image_regex, $string, $out, PREG_PATTERN_ORDER);
$img_tag_array = $out[0];
echo "<pre>"; print_r($img_tag_array); echo "</pre>";
// SRC values captured by group 1 of the same match
$images_url_array = $out[1];
echo "<pre>"; print_r($images_url_array); echo "</pre>";
// Fetch Page Function
function FetchPage($path)
{
    $file = fopen($path, "r");
    if (!$file)
    {
        exit("There was a connection error!");
    }
    $data = '';
    while (!feof($file))
    {
        // Extract the data from the file / url
        $data .= fgets($file, 1024);
    }
    return $data;
}
?>
and it returned a blank page.
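One common cause of a blank page here: `fopen()` on a URL (like `file_get_contents()`) only works when the host enables `allow_url_fopen`. Since `file_get_contents()` was ruled out anyway, a cURL-based fetch is one alternative worth trying. This is only a sketch under that assumption; it requires the php-curl extension, and the function name `fetchPageCurl` is mine, not from the original code:

```php
<?php
// Sketch of a cURL-based page fetch, for hosts where allow_url_fopen is
// disabled (which makes fopen()/file_get_contents() on URLs fail).
// Requires the php-curl extension.
function fetchPageCurl(string $url): string
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body as a string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_TIMEOUT, 15);          // don't hang forever
    $body = curl_exec($ch);
    if ($body === false) {
        $err = curl_error($ch);
        curl_close($ch);
        exit('There was a connection error: ' . $err);
    }
    curl_close($ch);
    return $body;
}
```

You could then swap `FetchPage($url)` above for `fetchPageCurl($url)` without changing anything else.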
Based loosely on the code you already tried (but which was riddled with problems), this grabs the full contents of the URL $url, parses out the <img> src attributes, and then outputs them.
Because this particular web host uses a <base href=""/> tag to reset the base part of all URLs on the page, I've added a $base variable, which you should set to the contents of the base tag.
Additionally, it looks like this particular web host has some pretty smart anti-hotlinking in place, so not all images may be visible.
But! Give it a whirl, and let me know if it does what you need it to, and if you have any questions.
<?php
$url  = 'http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/';
$base = 'http://www.webtoonlive.com/';
// Pull in the external HTML contents
$contents = file_get_contents( $url );
// Use a regular expression to match all <img src="..." /> tags
preg_match_all( '/<img[^>]*src=[\"\'](.*)[\"\']/Ui', $contents, $out, PREG_PATTERN_ORDER );
foreach ( $out[1] as $k => $v ){ // Step through all SRCs
    // Prepend the $base URL unless the src is already absolute.
    // strpos() returns 0 when the string starts with 'http://';
    // it never returns true, so comparing against true would
    // prepend $base even to absolute URLs.
    if ( strpos( $v, 'http://' ) !== 0 ) $v = $base . $v;
    // Output a link to the URL
    echo '<a href="' . $v . '">' . $v . '</a><br/>';
}
Sample output:
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/000.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/001.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/002.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/003.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/004.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/005.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/006.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/007.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/008.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/009.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/010.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/011.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/012.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/013.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/014.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/015.jpg
http://www.webtoonlive.com/webtoon/fantasy_world_survival/ch02/016.jpg
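As a side note: if the regex approach ever proves brittle, PHP's bundled DOMDocument can do the same <img> src extraction more robustly. A minimal sketch, where the function name extractImageUrls is my own and the base-prefixing logic mirrors the loop above:

```php
<?php
// Hedged alternative to the regex: parse <img> tags with DOMDocument
// (bundled with PHP). The function name extractImageUrls is illustrative.
function extractImageUrls(string $html, string $base): array
{
    $doc = new DOMDocument();
    // Suppress warnings from imperfect real-world HTML
    @$doc->loadHTML($html);
    $urls = [];
    foreach ($doc->getElementsByTagName('img') as $img) {
        $src = $img->getAttribute('src');
        if ($src === '') {
            continue;
        }
        // Prepend the base only when the src is not already absolute
        if (strpos($src, 'http://') !== 0 && strpos($src, 'https://') !== 0) {
            $src = rtrim($base, '/') . '/' . ltrim($src, '/');
        }
        $urls[] = $src;
    }
    return $urls;
}

// Quick demonstration on an inline snippet
$html = '<html><body>'
      . '<img src="webtoon/fantasy_world_survival/ch02/001.jpg"/>'
      . '<img src="http://example.com/002.jpg"/>'
      . '</body></html>';
print_r(extractImageUrls($html, 'http://www.webtoonlive.com/'));
```

You would feed it the same `$contents` and `$base` as the regex version; the payoff is that attribute order, quoting style, and stray `>` characters inside attributes no longer matter.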