Photo resize: client-side or server-side?

I am creating a photo-gallery site. I want each photo to have 3 or 4 instances at different sizes (including the original photo).
Is it better to resize a photo on the client side (using Flash or HTML5) and upload all the instances of the photo to the server separately? Or is it better to upload the photo to the server only once and resize it there using server resources (for example, GD)?
What would be your suggestions?
It would also be interesting to know how big sites do this. For example, 500px.com (that site creates 4 instances for each photo, and everything works fast enough) or Facebook.

There are several schools of thought on this topic; it really comes down to how many images you have and how likely it is that the images will be viewed more than once. It is most common for all of the image sizes to be created using a tool like Adobe Photoshop, GIMP, Sizzlepig or GD (locally or on a server, not necessarily the web server), then to upload all the assets to the server.
Resizing before you host the image takes some of the strain off the end user's web browser and, more importantly, reduces the amount of bandwidth required to host the site (especially useful when you are running a large site and paying per GB transferred).
To answer your part about really big sites, some do image scaling ahead of time, others do it on the fly, but typically it's done server side.
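To illustrate the server-side route, here is a minimal sketch in Ruby using the mini_magick gem (the gem choice, the size list, and the file-naming scheme are my assumptions; GD in PHP follows the same pattern):
require "mini_magick"  # assumes the mini_magick gem and ImageMagick are installed

# Hypothetical size list: each variant is generated once, at upload time.
SIZES = { "thumb" => 160, "medium" => 640, "large" => 1280 }

def generate_variants(original_path)
  SIZES.each do |name, width|
    image = MiniMagick::Image.open(original_path)
    image.resize "#{width}x#{width}>"   # ">" = only shrink, never enlarge
    image.write original_path.sub(/(\.\w+)\z/, "_#{name}\\1")
  end
end

generate_variants("uploads/photo.jpg")  # writes photo_thumb.jpg, photo_medium.jpg, photo_large.jpg
Generating the variants once at upload time means every later page view is just a static file read.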

Related

Performance issue for Rails: How to send gzip assets

I am using the Rails asset pipeline in a production environment. I have configured nginx to send files in gzip format, and the files seem to be served gzipped. I guess the browser decodes them automatically, hence I am not able to tell whether the JS files are coming in gzip format or not.
I ran the following command and I am getting "Content-Encoding: gzip" in the response:
curl -v -H 'Accept-Encoding: gzip' -o /dev/null http://www.example.com/assets/style.css 2>&1 | grep -i Content-Encoding
Here is the nginx configuration I use to serve the assets gzipped:
location ~ ^/(assets)/ {
    root /home/webuser/app/woa/public;
    gzip_static on;
    expires max;
    add_header Cache-Control public;
    # access_log /dev/null;
}
How can I tell whether the files are actually being served in gzip format or not?
Also, please suggest any other options that could help improve the performance of the site.
Not a direct answer to your first question (if you solved it, please do explain), but for improving site performance remember the Performance Golden Rule:
80-90% of the end-user response time is spent on the front-end. Start there.
Below is a non-exhaustive list of areas of improvement for increasing performance in a Rails app:
Diagnosing the Problem:
YSlow / Google Page Speed
Useful diagnostic tools for identifying performance issues are YSlow and Google Page Speed. They are browser extensions that diagnose and identify common issues slowing down your app (particularly on the front end).
Back-end Tools
For your Rails back-end I recommend incorporating tools such as Bullet & NewRelic directly into your development processes, so that while you're developing you can spot bad queries immediately while they are still easy to fix.
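As a concrete starting point, a typical Bullet setup looks like this (these are Bullet's documented options; pick the notification channels you prefer):
# config/environments/development.rb
config.after_initialize do
  Bullet.enable = true
  Bullet.alert = true            # JavaScript alert in the browser when an N+1 fires
  Bullet.bullet_logger = true    # write findings to log/bullet.log
  Bullet.rails_logger = true     # also write them to the Rails log
end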
Check Server Console Logs
Checking your server logs is an effective method for diagnosing which components of your Rails app are taking the longest. E.g. below are sample logs from two unrelated production Rails apps running in my local development environment:
# app1: slowish
Rendered shared/_header.html.erb (125.9ms)
Rendered clients/details.html.erb within layouts/application (804.6ms)
Completed 200 OK in 3655ms (Views: 566.9ms | ActiveRecord: 1236.9ms)
# app2: debilitatingly slow
Rendered search/_livesearch_division_content.js.erb (5390.0ms)
Rendered search/livesearch.js.haml (34156.6ms)
Completed 200 OK in 34173ms (Views: 31229.5ms | ActiveRecord: 2933.4ms)
App1 & App2 both suffer from performance issues, but App2's performance issues are clearly debilitatingly slow. (34 seconds!) But with these server logs, I know for App1 that I should look into clients/details.html.erb, and that for App2 I absolutely need to investigate search/livesearch.js.haml.
Improving Front-end Performance
Budget your page size strictly
To maintain fast load times you need to reduce the amount/size of your page assets (JS/CSS/images). So think about your page size like a budget. For example, Hootsuite recently declared that their home page now has a strict page-size budget of 1 MB. No exceptions. Now check out their page. Pretty fast, isn't it?
Easy wins for reducing your page size include stripping out unused JS or CSS files, including them only where needed, and changing static images into much smaller vectors.
Serve smaller image resolutions based on screen width
Image loading is a large cause of slow page load times. A large 5 MB image used in the background of your splash page can easily be brought down to 200-400 KB in size and still be of high enough quality to be nearly indistinguishable from the higher-resolution original. The difference in page load times will be dramatic.
You should make the same improvements to user-uploaded images as well. E.g. if your website's user avatars are 50px by 50px in size, but a user uploads a 5 MB image for their avatar, then it's essential that you serve the image at a lower file size and resolution to fit exactly how it will be shown on your site.
CarrierWave, Fog, and RMagick are popular gems used with Amazon S3 to achieve better image loading. With that collection of packages you can dynamically serve smaller image resolutions based upon the screen size of each user. You can then use media queries so that mobile devices get served smaller image resolutions than your users with Retina screens.
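For example, a CarrierWave uploader can declare fixed versions that are generated once at upload time (a minimal sketch; the uploader name, version names, and sizes are placeholders):
# app/uploaders/avatar_uploader.rb
class AvatarUploader < CarrierWave::Uploader::Base
  include CarrierWave::RMagick   # use RMagick for processing
  storage :fog                   # store on S3 via the fog gem

  # Each version is generated once, when the image is uploaded.
  version :thumb do
    process resize_to_fill: [50, 50]
  end

  version :retina_thumb do
    process resize_to_fill: [100, 100]
  end
end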
Use a Content Delivery Network to speed up asset loading
Adding on to the last point, you can speed up asset/image loading times by using a Content Delivery Network (CDN) such as CloudFront. CDNs distribute assets across many servers, then serve assets to your users from the server located closest to the user making the request.
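Pointing Rails at a CDN is then a one-line change, after which asset URLs are served from the CloudFront domain (the domain below is a placeholder):
# config/environments/production.rb
config.action_controller.asset_host = "https://d111111abcdef8.cloudfront.net"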
Fingerprint Static Assets
When static assets are fingerprinted, a user's browser caches a copy of them on the first visit, meaning they no longer need to be downloaded again for subsequent requests.
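The asset pipeline handles this for you; in Rails 4 digests are on by default, while in Rails 3 you enable them explicitly:
# config/environments/production.rb (Rails 3; already the default in Rails 4)
config.assets.digest = true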
Move Javascript files to the bottom of the page
JavaScript files placed at the bottom of the page load after the rest of the page has rendered. If JavaScript assets are placed at the top of the page, the page will remain blank while the user's browser attempts to load them. Fortunately, Rails will place JavaScript files at the bottom of your page if you use the asset pipeline or specify them using the javascript_include_tag.
EDIT: Most modern browsers now optimize Javascript loading automatically so you can mostly ignore this advice.
Improving Back-end Performance
Cache, Cache, Cache!
Among all backend performance optimizations, caching is among the most effective for producing dramatic performance gains. A well-implemented caching regime can greatly minimize the damage of inefficient queries in your backend under heavy load. Content that is accessed frequently, yet changes relatively infrequently, benefits the most from caching.
Caching is so powerful that it brought down the page load times of App2 mentioned above from 34 seconds to less than a second in production. There is simply no other performance enhancement on the back-end that can come even close to what we got from caching.
Overall, when doing performance optimization with caching, start high then go low. The gains you will get will be greater for less effort.
From high to low, some types of caching available to you are:
HTTP caching (nothing is cached on your server; a user's browser caches content locally based on HTTP headers)
Page caching (memcache)
Action Caching (memcache)
Fragment caching (memcache) or Russian doll caching (a favoured technique for caching with fragments; see the sketch below)
Model caching (memcache)
To learn more about caching, a good place to start is here: http://guides.rubyonrails.org/caching_with_rails.html
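As one example from that guide, Rails 4-style fragment caching with Russian doll nesting looks like this in a view (Post and Comment are placeholder models):
<%# app/views/posts/show.html.erb %>
<% cache @post do %>
  <h1><%= @post.title %></h1>
  <% @post.comments.each do |comment| %>
    <% cache comment do %>  <%# nested "Russian doll" fragment %>
      <p><%= comment.body %></p>
    <% end %>
  <% end %>
<% end %>
Touching a comment expires only that inner fragment plus its parent, so the rest of the cached page is reused.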
Index Everything
If you are using SQL for your database layer, make sure that you specify indexes on join tables for faster lookups on large associations used frequently. You must add them during migrations explicitly since indexing is not included by default in Rails.
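For example, a migration that indexes a foreign key used in lookups (the table and column names are placeholders):
# db/migrate/20140101000000_add_index_to_comments.rb (timestamp is a placeholder)
class AddIndexToComments < ActiveRecord::Migration
  def change
    add_index :comments, :post_id   # speeds up post.comments lookups on large tables
  end
end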
N+1 queries
A major performance killer for Rails apps using relational (SQL) databases is N+1 queries. If you see in your logs that your app is making many database reads/writes for a single request, it's often a sign you have N+1 queries. N+1 queries are easy to miss during development but can rapidly cripple your app as your database grows (I once dealt with an app that had twelve N+1 queries. After accumulating only ~1000 rows of production data, some pages began taking over a minute to load).
Bullet is a great gem for catching N+1 queries early as you develop your app. A simple method for resolving N+1 queries in your Rails app is to eager load the associated Model where necessary. E.g. Post.all changes to Post.includes(:comments).all if you are loading all the comments of each post on the page.
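To make the difference concrete, here is the same loop with and without eager loading:
# N+1: one query for the posts, then one more query per post for its comments
Post.all.each do |post|
  post.comments.each { |comment| puts comment.body }
end

# Eager loaded: two queries in total, no matter how many posts there are
Post.includes(:comments).each do |post|
  post.comments.each { |comment| puts comment.body }
end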
Upgrade to Rails 4 and/or Ruby 2.1.x or higher
The newer versions of Rails contain numerous performance improvements that can speed up your app (such as Turbolinks).
Ruby 2.1.x+ has much better garbage collection than older versions of Ruby, and reports from people upgrading have noted notable performance increases.
I am missing many improvements here, but these are a few that I can recommend. I will add more when I have time.

Using AWS S3 for photo storage

I'm going to be using S3 to store user-uploaded photos. Obviously, I won't be serving the image files to user agents without resizing them down. However, no single size will do, as some thumbnails will be smaller than the larger previews. So, I was thinking of making a standard set of dimensions scaling from the lowest 16x16 to a highest of 1024x1024. Is this a good way to solve this problem? What if I need a new size later on? How would you solve this?
Pre-generating different sizes and storing them in S3 is a fine approach, especially if you know what sizes you need, are likely to use all of the sizes for all of the images, and don't have so many images and sizes that the storage cost is excessive.
Here's another approach I use when I don't want to pre-generate and store all the different sizes for every image, or when I don't know what sizes I will want to use in the future:
Store the original size in S3.
Run a web server that can generate any desired size from the original image on request.
Stick a CDN (CloudFront) in front of the web server.
Now, your web site or application can request a URL like /16x16/someimage.jpg from CloudFront. The first time this happens, CloudFront will get the resized image from your web server, but then CloudFront will cache the image and serve it for you, greatly reducing the amount of traffic that hits your web server.
Here's a service that resizes images from arbitrary URLs, serving them through CloudFront: http://filter.to
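For reference, a minimal sketch of such a resizing server in Ruby, using Sinatra and mini_magick (the URL scheme and directory layout are assumptions; a real server must also validate the requested dimensions):
require "sinatra"
require "mini_magick"

# GET /16x16/someimage.jpg -> resized copy of originals/someimage.jpg
get %r{\A/(\d+)x(\d+)/(.+)\z} do |width, height, name|
  halt 400 if name.include?("..")            # crude path-traversal guard
  image = MiniMagick::Image.open(File.join("originals", name))
  image.resize "#{width}x#{height}"
  content_type "image/jpeg"
  image.to_blob
end
With CloudFront in front, this code only runs on the first request for each size; every later request is a cache hit.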
This sounds like a good approach. Depending on your application you should define a set of thumbnail sizes that you always generate. But also store the original user file, in case your requirements change later. When you want to add a new thumbnail size, you can iterate over all the original files and generate the new thumbnails from them. This option gives you flexibility later.

Managing images for e-commerce

I am working on an e-shop project.
In my design, each product needs a picture in three sizes:
480 * 480
290 * 290
200 * 200
Which one is better?
Asking the e-shop admin to upload a picture for each of the above sizes.
Asking them to upload a single 480 * 480 picture, then generating the other sizes via ASP.NET.
Requiring your site admin to upload three separate images simply pushes unnecessary work onto the admin, and this generally results in them not bothering to create and upload three separate images for each product, leaving you with an e-commerce site full of missing images.
You're far better off using the technology to make the administrator's life easier: require them to upload a single image and automate the creation of the smaller sizes. Automating manual tasks is the whole point of the IT industry, and not doing so where possible rather defeats the purpose of building these systems.
There's not really any issue with CPU usage, as you only need to generate the two smaller images once, at the point of upload, or not at all by using CSS to resize (though that is not an optimal use of bandwidth). I'd either create the two smaller images when the original is uploaded by the admin and store them, or create them on the fly the first time they are requested and then put them in a cache.
Uploading all three images will reduce the CPU overhead. You can then use the processing power to make your site more responsive.
In my opinion, asking the admin to prepare three sizes of each image is a time-consuming process because it must be repeated for every product, so generating them would be better.
On the other hand, just uploading one big image and then showing the small images with CSS classes can be useful (if the visitor will see all the images all the time).

Debugging a slow site: long delay between connection and data sending

I ran a test from Pingdom Tools to check the loading time of my website. The result is that I have a lot of files that, in spite of being very small (5 KB), take a long time (1 second or more) to load, because there is a big delay between the beginning of the connection and the beginning of the data download (in Pingdom Tools, this shows up as a very large green bar).
Have a look at this for example: http://tools.pingdom.com/default.asp?url=http%3a%2f%2fwww.giochigratis-online.net%2f&id=5691308
How can I lower the "green bar" time? Is this an Apache problem (for example, the maximum number of parallel connections, or something similar), or a hardware problem? CPU-limited, bandwidth-limited, or something else?
I see that many other websites have very small green bars; how do they reduce the delay between the connection and the actual data sending?
Thanks!
P.S.: the site is built with Drupal; homepage generation takes about 700 ms.
P.P.S.: I tested 3 other websites on the same server: same problem.
I think it could be a problem with the maximum number of parallel connections, as you mentioned, on either the server or the client side. For instance, Firefox defaults to network.http.max-connections-per-server = 15 (see here), while you have >70 files to download from your domain and another 40 from Facebook.
You can reduce the number of loaded images by generating sprites, i.e. a single image consisting of multiple small images, and then using CSS to display the right parts of it in the places you want. This is widely used, e.g. by Google; see http://www.google.com/images/nav_logo83.png
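A minimal sprite sketch (the sprite file name and offsets are placeholders): one HTTP request fetches the whole sprite, and CSS shifts the background to show each icon:
.icon        { background: url(/images/sprite.png) no-repeat; width: 16px; height: 16px; }
.icon-home   { background-position: 0 0; }       /* first 16x16 tile */
.icon-search { background-position: -16px 0; }   /* second tile, shifted left */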

Best way to manage a potentially huge photo library with the iPhone SDK

I'm developing an app with a list of products. I want to let the user have one picture for each product.
Now, the problem is what to do next. I think the best way is for the photos to get synced when the user connects the device to their computer and iTunes, and to access them from the app (something like /photos/catalog/ref1.jpg).
The other option is to put them in my SQLite database, but I worry that it will get too big. I have data + pictures; the data changes a lot, but the pictures are rarely modified (at most, I expect the user to take 2-3 new pictures each time).
I would just use the network connection available on the device, and not bother with sync through iTunes.
When you download the images, write them to the app's Documents folder, then load them from there. Network usage vs. disk space will be a concern. Keep in mind some carrier networks can be crazy expensive for data transfer.
If the images are named with a systematic format, then you can maintain them by comparing the image identifiers against your data, pruning out the older or irrelevant ones.
Do the math and ballpark just how much disk space you think it would take for a local copy of all the images.