I have a webpage with some images that I load from Google Drive using this PHP code:
    $albumURL = $url . "host/" . $fila['album'] . "/";
    @$doc->loadHTMLFile($albumURL);
    $xpath   = new DOMXPath($doc);
    $anchors = $xpath->query('//a');

    $first = true;
    foreach ($anchors as $element) {
        $src = $element->getAttribute("href");
        if ($src != "http://drive.google.com") {
            if ($first) {
                $first = false;
                echo '<div class="item active"><img src="' . $url . $src . '"></div>' . "\n";
            } else {
                echo '<div class="item"><img src="' . $url . $src . '"></div>' . "\n";
            }
        }
    }
The problem is that my page loads very slowly; I'm guessing it is because of the number of images I request. Here is a link to my page: http://arreglosnavidad.host22.com/arreglos.php
I don't know if there is any way of compressing those images using PHP or HTML/CSS.
Of course, it would be better if I didn't have to upload those images to the server, both for CRUD management and to keep them up to date.
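To make the "compress with PHP" idea concrete, here is a minimal sketch using PHP's GD extension: it fetches an image, scales it down, re-encodes it as a lower-quality JPEG, and caches the result on disk. The function name, cache location, and the width/quality values are illustrative assumptions, not from the original post.

```php
<?php
// Hypothetical helper: download an image, shrink it, recompress it,
// and cache the result so repeat page loads are fast.
// Requires the GD extension.
function fetchCompressed(string $imageUrl, int $maxWidth = 800, int $quality = 70): ?string {
    $cacheFile = sys_get_temp_dir() . '/' . md5($imageUrl) . '.jpg';
    if (!file_exists($cacheFile)) {
        $data = @file_get_contents($imageUrl);
        if ($data === false) {
            return null;                      // download failed
        }
        $img = @imagecreatefromstring($data);
        if ($img === false) {
            return null;                      // not a decodable image
        }
        if (imagesx($img) > $maxWidth) {
            // Height defaults to -1, so the aspect ratio is preserved.
            $img = imagescale($img, $maxWidth);
        }
        imagejpeg($img, $cacheFile, $quality); // quality 0-100, lower = smaller file
        imagedestroy($img);
    }
    return $cacheFile;
}
```

A thumbnail script (say, a hypothetical `thumb.php?u=...`) could call this and then `readfile()` the returned path with a `Content-Type: image/jpeg` header, so the `<img>` tags point at the small cached copies instead of the full-size Drive files.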
Some options
GET requests for a large number of images increase client-to-server traffic. You can combine several images into one big sprite image and use the background position in your CSS so that each div displays only the section (sub-image) it needs from the consolidated image. Check this link for the trick:
https://css-tricks.com/css-sprites/
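A minimal sketch of the sprite trick from that link; the class names, file name, and pixel offsets below are made up for illustration, assuming 100x100 thumbnails laid out in a single row inside `sprite.png`:

```css
/* sprite.png is one big image containing all the thumbnails side by side */
.item-thumb {
  width: 100px;
  height: 100px;
  background-image: url('sprite.png');
}
.item-thumb.n1 { background-position: 0 0; }      /* first 100px slice  */
.item-thumb.n2 { background-position: -100px 0; } /* second 100px slice */
```

In the HTML you would then replace each `<img>` with something like `<div class="item-thumb n2"></div>`, so the browser makes one request for the sprite instead of one per image.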