The page is now working pretty well and the majority of errors are fixed. I've tried to move as much of the styling into CSS as possible and to clean out unused CSS. The calendar and contact form are working without errors, but between those features and the lightbox there are a lot of CDN CSS and JS files. The site now uses a two-stage image srcset: one version for smaller devices and another for larger devices. I may add a third stage for sizes in between.
After running the site through two page optimization websites, I found several suggestions: enable gzip compression for all text-based files, enable caching for longer periods (particularly for images), and enable keep-alive, all through the .htaccess file, plus improving server response time. It also looks possible to improve performance by:
- Minimize HTML, CSS, and JS - haven't done this for the main page yet.
- Combine some smaller CSS files into "styles.min.css" and serve from the website (lightbox, etc.).
- Combine some smaller JS files into "script.min.js" and serve from the website (lightbox, etc.).
- Delay load of all non-blocking JS until after the first page has rendered.
- Delay the Avail Calendar iframe load until after the first page has rendered.
- Delay loading of reCaptcha until after the first page has rendered, perhaps using the "invisible" reCaptcha.
- Delay loading of below-the-fold images by first loading a base64 placeholder and then swapping in the real image with JavaScript.
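For reference, the .htaccess changes mentioned above (gzip, longer cache lifetimes for images, keep-alive) might look something like the following sketch. The exact modules available depend on the host, so treat this as an illustration rather than a drop-in file:

```apache
# Enable gzip compression for text-based files (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Cache images for a long period (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>

# Ask the server to keep connections alive (requires mod_headers)
<IfModule mod_headers.c>
  Header set Connection keep-alive
</IfModule>
```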
Our website so far: https://mysticseacaptain.com
Web page optimize sites:
- https://varvy.com/pagespeed/
- https://gtmetrix.com/
- https://www.webpagetest.org/
Minimize HTML, CSS, JS
- https://htmlcompressor.com/compressor/
Defer parse of JS
- https://varvy.com/pagespeed/defer-loading-javascript.html - I tried this and it doesn't seem to work.
- https://premium.wpmudev.org/forums/topic/how-do-i-defer-parsing-of-javascript-to-iprove-performace - some notes
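The common pattern described in the links above is to inject non-critical script elements only after the window load event, so they cannot block the first render. Here is a minimal sketch of that idea; the injectScript helper and the script.min.js filename are my own illustrations, not code from those pages:

```javascript
// Append a <script> element to the page so the browser fetches and parses
// the file only when this is called (e.g. after window load).
// "doc" is passed in explicitly, which also makes the helper easy to test.
function injectScript(doc, url) {
  var s = doc.createElement('script');
  s.src = url;
  s.async = true;
  doc.body.appendChild(s);
  return s;
}

// On a real page the wiring would be:
// window.addEventListener('load', function () {
//   injectScript(document, 'script.min.js');
// });
```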
Delay load of Avail Calendar iframe
- http://www.aaronpeters.nl/blog/iframe-loading-techniques-performance
- https://www.experts-exchange.com/questions/23986112/Delay-iFrame-load-until-after-parent-window-loads.html
- http://stackoverflow.com/questions/17298863/delay-an-iframe-from-loading-its-src-until-a-function-is-called
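The techniques in the links above boil down to the same idea: keep the real URL out of the iframe's src attribute until the page has loaded. A minimal sketch of that idea (the data-src attribute name and example URL are assumptions, not the exact code from those pages):

```javascript
// Copy a deferred URL from a data-src attribute into src, so the iframe
// only starts loading when this runs (e.g. on window load).
function activateDeferredFrame(frame) {
  var deferred = frame.getAttribute('data-src');
  if (deferred) {
    frame.setAttribute('src', deferred);
  }
  return frame;
}

// In the page: <iframe data-src="https://example.com/availcalendar"></iframe>
// window.onload = function () {
//   activateDeferredFrame(document.querySelector('iframe[data-src]'));
// };
```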
Delay load of reCaptcha
- Perhaps use the invisible reCaptcha
Delay load of images
- https://varvy.com/pagespeed/defer-images.html
- But twinstream advises that deferred images will be ignored by Google image rankings. (Can anyone confirm this?)
A note about optimizing image files: I have done this four or five times with JPEG, each time starting from the original files and doing a single-step compression (sometimes a resize as well) at 30%, 40%, or 50% quality depending on the outcome (review and selection). The web page optimizers still say my images need further compression and are failing the site on image size. That is simply not realistic; I am not going to crank these images down to 20% as they suggest, since they have already taken a hit in quality.
I would be very interested in other peoples experiences, suggestions and approach to this aspect of web design.
I am particularly interested in details for techniques that work.
Well, I guess there isn't much interest in this aspect of web design, or perhaps not too much to say because it is so site-specific.
I did work my way through all of these issues, basically finding the fastest way to load the website using the tools I had.
One goal was to defer the AvailCalendar and reCaptcha until after the DOM had loaded. I found that because the site is served over HTTPS, the available JavaScript threw a cross-site blocking error, presumably a protection against outside script injection.
The reCaptcha could not be delayed until after the DOM loaded, or the captcha would not appear or work, so it is best to load it early, just after the CSS in the head. This seems to give about a 1% improvement over placing it at the bottom of the body.
The JavaScript to delay the iframe worked and prevented it from loading until everything else had loaded. The late loading improved PageSpeed without lengthening site load times, which is fine because users will most likely find their way to the Calendar tab later anyway.
I minimized everything and moved all style tags into CSS (except one). I folded the small, loose CSS (validator, lightbox, etc.) into the styles.min.css file, and did the same for the JS, moving all loose scripts along with the validator and lightbox code into the script.min.js file, to reduce the number of files and HTTP calls.
If there were something I could do to load all the images together, that might help, but I haven't found anything so far.
I was also looking into an Apache module called mod_concat, which helps serve multiple CSS and JS files with a simple server tool, but it appears to be for earlier server versions, and it would change what I have done, since I've already joined the smaller CSS and JS files to speed things up.
I'm not especially proud of the PageSpeed score at 68% and YSlow at about 84%, but that is the best that can be done at this point, and I am done with it for now. Also, bootstrap.min.css is now big enough that it is render-blocking.
Website: https://mysticseacaptain.com
Optimized with https://gtmetrix.com/ and https://www.webpagetest.org/
Just thought I would report the outcome.
By the way, some of these tools keep reporting that the images are not compressed enough, and I think those readings are wrong. If I compress them further they will become bad images.
It's probably quiet because you are not asking questions about Bootstrap Studio; instead you are asking people to help you optimise your site, which is not what this forum is for.
Yes, you have a point, but since Bootstrap adds quite an additional load in the form of extra CSS and JS files, I believe the subject needs to be properly addressed here in the forum. I have tried to post some helpful links and details about my experience optimizing a Bootstrap Studio website as guidance for others.
Compared to my previous websites, which were often well under 1 MB (with perhaps a few over) yet were flagged "not mobile friendly" even though they were very light, this Bootstrap website is considerably larger (it includes more photos). I have found there are a number of hardware and technical constraints on optimization, but it can be done.
I found that the iframe AvailCalendar and the Google reCaptcha for the PHP contact form were the most costly. I was able to delay the iframe, but the reCaptcha did not lend itself to that approach.
In summary, Chris, I respectfully disagree about what the intent of the forum is, insofar as Bootstrap needs to be optimized and there may be some opportunities for Bootstrap Studio to help in some ways with that chore. There is no chance of that happening if we don't discuss it. Furthermore, Bootstrap Studio already has made some assumptions about optimization in its design, and exporting, and therefore this discussion is certainly "fair game" in the forum. Yes, I can understand why you might not want to add anything to this discussion.
It is an interesting method, using JavaScript to load the images after first loading the base64 versions with the HTML. I think you are correct that the replaced images will still be indexed by Google, since JavaScript is indexed. It may be an issue of priority ranking, as they may fall lower on the ladder?
I also found it interesting when researching this idea that Google itself converts its ranked images to base64, obviously to display image search results faster on its own results page. When you click an image and it opens in its own window, the actual picture and the actual image link are shown. My research turned up very few, if any, base64 images that websites themselves are using (without the fast-loading conversion).
My research indicated that base64 is generally to be used for images that have no importance. Base64 conversion can actually create files larger than the original image. My understanding is also that loading base64 images gains points for responsiveness, since the images load with the HTML.
Eventually I will begin some tests... thank you for the information.
Thank you Twinstream. I'll do some research on this too.
I've observed that the website optimizing tools linked above generally download the "src" version of each image (the larger version, in my case 1024x768) and do not test the mobile or in-between images served by the "srcset" attribute. Having chosen to download the largest version provided, these tools then "fail" the images and advise me to optimize them, when the test has not even considered the much smaller mobile version.
After some testing of loading images as base64 and then using JavaScript to replace the base64 image with the actual image link in the src, I have found that this does indeed still allow Google to index the image as a standard image file. My image ranking was in the top 10 after only two weeks.
Twinstream, what great news! It was your idea. I assume you found this technique saved time to page display. Thank you for adding this valuable information; it is very helpful. Can you point me in the direction of your website so I can see how you did it? Thanks again, and congratulations on the ranking! Best, Rick
I have not had time to study the effects of performance yet as this test was performed with only two images which are two of many. As this is my busy season I have little time at the moment that allows me to test further.
Here is the code I used, though, to replace the base64 images in the src with the URL from the srcset.
<script>
// After everything has loaded, replace each image's base64 placeholder
// src with the real URL stored in its srcset attribute.
function init() {
    var imgDefer = document.getElementsByTagName('img');
    for (var i = 0; i < imgDefer.length; i++) {
        if (imgDefer[i].getAttribute('srcset')) {
            imgDefer[i].setAttribute('src', imgDefer[i].getAttribute('srcset'));
        }
    }
}
window.onload = init;
</script>
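One caveat worth noting, as a hypothetical refinement rather than part of the code above: a srcset attribute may contain several comma-separated candidates ("small.jpg 480w, large.jpg 1024w"), in which case copying the whole attribute into src would produce a broken URL. A small helper can extract just the first candidate's URL before the swap:

```javascript
// Extract the URL of the first candidate from a srcset value, so that
// e.g. "small.jpg 480w, large.jpg 1024w" yields "small.jpg".
// A srcset holding a single bare URL is returned unchanged.
function firstSrcsetUrl(srcset) {
  return srcset.split(',')[0].trim().split(/\s+/)[0];
}
```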