I'm looking to embed multiple SoundCloud objects on my website -- it's the primary content, but it takes way too long to load. SoundCloud's own website doesn't take nearly as much time to load and they have more than 5 objects on a single page. What are some ways I can speed up the SoundCloud objects?
Define "way too long to load". Is it loading (albeit slowly) or is it not?
If you're embedding through SoundCloud, the objects should load at the same speed as on SoundCloud.com itself, except for the data on your own webpage, which goes through your web host and then back to the user's computer.
If I understand your question correctly, though, it may be a problem with your web host's connection speed. Is your web host a free one? Did you check the upload/download speed before purchasing it?
Or is some part of your embedding code wrong? (If it doesn't load at all, that is.)
Cheers.
For a messenger app I store the latest messages in Redis.
They will be kept for 24 hours. Along with each message I have a thumbnail image.
Is it a good approach to store the thumbnail (2KB each) along with the message in Redis? It would make fetching the messages much faster since I get the message and the image in one transaction.
Or should the thumbnail be stored in S3 despite the fact that I need an additional PUT and GET request per message?
Edit:
The thumbnails are different per message. A message consists of a text and a link to an image. While the full-resolution image is stored on S3, the message saved in Redis contains only a link to it.
The client is an iOS app. The app collects all messages from Redis. If the message contains an image, only the thumbnail should be shown before downloading the full resolution file.
The application design must support thousands of requests per second.
See WhatsApp for an example of this behavior.
Edit:
I calculated the AWS cost for both options.
Redis: Redis would cost 3k USD for 120 million messages.
S3: An additional PUT request per message would double the S3 cost: 10k USD for 1 billion messages per month.
Let's assume this is your requirement:
An iOS instant messaging app;
There will be 1k/s messages;
If the message contains preview-able information, like video/img, a thumbnail should be displayed.
Some inferred conditions:
There might be 3k/s messages during peak;
There might be 3k/s preview-able messages during peak.
I assume the rest of your system is well designed and won't be a bottleneck. 1k messages/s means you need to do at least 1k writes per second to Redis, which is nothing for Redis. You are then asking whether you should also store the thumbnail of preview-able content in Redis, and my quick, personal answer is no.
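To make that concrete, here is a minimal sketch of that write path in Node with the ioredis client, storing only the text and an S3 link with a 24-hour TTL; the key layout and message shape are my own assumptions, not something from the question:

```typescript
// Minimal sketch: keep only the message text and an S3 link in Redis,
// with a 24-hour TTL, instead of the thumbnail bytes themselves.
// Assumes Node.js with the "ioredis" package; the key layout is hypothetical.
import Redis from "ioredis";

const redis = new Redis(); // connects to localhost:6379 by default

interface ChatMessage {
  id: string;
  roomId: string;
  text: string;
  imageUrl?: string; // link to the full-resolution image on S3, if any
}

async function storeMessage(msg: ChatMessage): Promise<void> {
  const key = `msg:${msg.roomId}:${msg.id}`;
  // EX 86400 expires the message after 24 hours, matching the requirement above.
  await redis.set(key, JSON.stringify(msg), "EX", 86400);
}
```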
Client Aspect
The first question you should ask yourself is: does response time really matter for the client in this case? Would a missing preview be a big problem and cause a major user-experience degradation? Is there a way to tolerate a slower response time while maintaining a relatively good user experience?
I believe users won't be too unhappy if they don't see a preview of a video/image right away, as long as the video/image link itself isn't missing. I agree that a missing image preview causes some UX degradation, but why would you display something that says "I'm bad, please blame me"? You can simply display the image whenever the full thumbnail arrives.
Server Aspect
The first question you should ask is: does caching give you any benefit over uploading? And does caching introduce any problems?
Since you might not have good control over thumbnail size, pushing thumbnails to Redis might take longer and consume more resources than you expect, and that may interfere with writing text messages to Redis. Also, if you store the thumbnail in Redis, clients have to fetch it through your server, which is one more request and a large response.
Suggestions
Don't store thumbnails in Redis; just generate the thumbnail and upload it to S3. Trust Amazon, they are good most of the time.
But wait, are we done? Absolutely not. Why do we need to upload the image to our server first, then ask the server to generate the thumbnail and upload it? Why can't we just do it on the client side?
Yes, that's another solution: compress the picture on the client, upload the thumbnail and the full-size image to S3, get back a link, and send the link to the server. The server then forwards this link to the other client, which fetches the image from S3.
In this way, your server won't be flooded by huge images, even during peak.
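One common way to implement that flow is to have the server hand the client a short-lived presigned PUT URL, so the upload goes straight to S3. A minimal sketch using the AWS SDK for JavaScript v3; the bucket name, key scheme, and 5-minute expiry are assumptions:

```typescript
// Sketch: issue a presigned PUT URL so the client uploads the image
// (and its thumbnail) directly to S3, bypassing the chat server.
// Bucket name, object key scheme, and the 5-minute expiry are assumptions.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" });

async function createUploadUrl(messageId: string, kind: "thumb" | "full"): Promise<string> {
  const command = new PutObjectCommand({
    Bucket: "my-chat-images", // hypothetical bucket
    Key: `images/${messageId}/${kind}.jpg`,
    ContentType: "image/jpeg",
  });
  // The client PUTs the bytes to this URL; it expires after 300 seconds.
  return getSignedUrl(s3, command, { expiresIn: 300 });
}
```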
Concerns
There are of course quite a few concerns: how do you handle upload failures? How do you handle malicious abuse? How do you handle duplicated images (like stickers)? How do you link an image to a chat room?
I will leave these questions to you, since some of them are business-logic related :)
Last Words
Do load test and benchmark with a realistic simulation of traffic and good logging, so you know where the bottleneck is and can optimize wisely.
And always remember: make it run first, then make it right, and make it fast only if you have enough motivation and a strong reason. Premature optimization is the root of all evil, and a waste of time.
I was wondering how services like http://www.inspectlet.com/ store their video sessions. By the looks of it, I don't think it's a WebRTC implementation. What I was able to figure out is that there is an active Express socket handling the communication, but in that case they would have to store the page and track all the events from the DOM. I just wanted to confirm that this is the approach they are following.
Looking at the event listeners on the page, it looks like there are a lot of bindings. For example, the <body> has scroll, keyup, and change events bound to a function. I'm sure it also has mousemove, mouseclick, etc. All of this is likely stored in a Javascript variable (object, probably) and then AJAXed every so often to http://hn.inspectlet.com/adddata with the data parameters. Here is a sample of what is being sent:
http://hn.inspectlet.com/adddata?d=mr,212941,46,337,46,1277)mr,213248,163,498,163,1438)mr,213560,144,567,144,1507)mr,213873,138,240,138,1180)mr,214188,169,184,169,1124)mr,214504,158,520,158,1460)mr,214816,231,487,231,1427)mr,215130,329,197,329,1137)mr,215444,894,289,894,1229)mr,215755,903,295,903,735)s,215755,440,0)&w=259769975&r=494850609&sd=1707&sid=1660474937&pad=3&dn=dn&fadd=false&oid=99731212&lpt=212629
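That pattern is straightforward to reproduce: buffer events in a JavaScript variable and flush them to a collection endpoint on a timer. A minimal sketch; the endpoint and payload format are placeholders, not Inspectlet's actual protocol:

```typescript
// Sketch: buffer user events and periodically flush them to a collector.
// The "/adddata" endpoint and the JSON payload here are hypothetical.
type TrackedEvent = { type: string; t: number; x?: number; y?: number };

const buffer: TrackedEvent[] = [];

document.addEventListener("mousemove", (e) =>
  buffer.push({ type: "mr", t: Date.now(), x: e.pageX, y: e.pageY })
);
document.addEventListener("scroll", () =>
  buffer.push({ type: "s", t: Date.now(), y: window.scrollY })
);

// Flush every 2 seconds; sendBeacon survives page unloads better than XHR.
setInterval(() => {
  if (buffer.length === 0) return;
  const payload = JSON.stringify(buffer.splice(0, buffer.length));
  navigator.sendBeacon("/adddata", payload);
}, 2000);
```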
As suggested in Adam's answer, they track many events in the page and send them either via a websocket or post/get request (XHR) to their servers.
I am not sure what Inspectlet does specifically, but in general, such a solution needs to follow these general steps:
When the page is fully loaded (probably hooking DOMContentLoaded) they send the page data to the server. They also hook a MutationObserver in order to track all changes to the DOM on the page, so when something changes dynamically, the tracking script can 'record' the change and send it to the server.
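A minimal sketch of those two hooks, with the initial snapshot on DOMContentLoaded and a MutationObserver for later changes; the endpoints and the (very crude) mutation serialization are placeholders:

```typescript
// Sketch: send an initial DOM snapshot, then record subsequent mutations.
// The "/snapshot" and "/mutations" endpoints are placeholders.
window.addEventListener("DOMContentLoaded", () => {
  // 1. Initial snapshot of the page.
  navigator.sendBeacon("/snapshot", document.documentElement.outerHTML);

  // 2. Record every later change to the DOM.
  const observer = new MutationObserver((mutations) => {
    const records = mutations.map((m) => ({
      type: m.type, // "childList" | "attributes" | "characterData"
      target: m.target.nodeName,
      added: m.addedNodes.length,
      removed: m.removedNodes.length,
    }));
    navigator.sendBeacon("/mutations", JSON.stringify(records));
  });

  observer.observe(document.documentElement, {
    childList: true,
    attributes: true,
    characterData: true,
    subtree: true,
  });
});
```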
The SaaS application, in turn, has a player that parses all this raw data and plays it back. This usually requires a virtual file system for assets (images, CSS, scripts) and handling JS so it doesn't post back again (replaying XHR would have bad results for the tracked websites), instead playing back the mutations as they were captured (recorded).
As for the data, HTML pages compress really well, especially when you can make assumptions about the data (since you know it's HTML), so yes, they actually cache a lot (similar to what Google does in that regard, but Google does it for the entire web, not just 'customers').
The DOM mutations need to be stored as well. This is up to the algorithm; they can be stored either as plain text or using a smarter data model. Storing them in plain text is obviously not the most cost-effective option.
The above is an abstraction; there are many edge cases to handle in order to implement such a solution, as well as a lot of mathematical and algorithmic thinking around heatmaps to make them accurate.
After a long search I was able to find a promising new solution, which solves all the complex parts of building such a service. It is still under development, but it solves the problem. Below are the links to it:
https://www.rrweb.io/
https://github.com/rrweb-io/rrweb
In the quest for a fast (initial) page load, I'm wondering if there is a way to organize the page load without using any JavaScript.
I don't know if you really need the deferral capabilities of JavaScript if you could just order the way the content of your page loads (header --> content --> sidebars --> elements further down...).
Or am I wrong about getting good page-speed ratings without deferring anything?
Of course you can write a page without using JS at all (for example, pure HTML, as in the beginning of the web), and if you use CSS3 your page can still look modern and dynamic. But it is hard to imagine any rich, modern functionality without client-side scripting (like auto-loading data on scroll via AJAX, collecting statistics, using local storage, etc.).
If you are thinking about the fastest way to load the initial page, think about caching static objects like images and JS files, and about serving static content first to give the perception of a fast load, loading dynamic data in the background with JS. However, this is quite a broad topic and there are different techniques for achieving this effect; I would recommend reading about the architectures of modern high-load web apps like Facebook, YouTube, Twitter, and so on.
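One common version of that technique is to ship the static HTML first and pull the dynamic parts in after the page has rendered. A minimal sketch; the /api/feed endpoint and the #feed element are assumptions:

```typescript
// Sketch: render the static page first, then fill in dynamic content
// in the background. The /api/feed endpoint and #feed element are assumptions.
window.addEventListener("DOMContentLoaded", () => {
  // The static shell is already visible at this point.
  fetch("/api/feed")
    .then((res) => res.json())
    .then((items: { title: string }[]) => {
      const feed = document.getElementById("feed");
      if (!feed) return;
      for (const item of items) {
        const li = document.createElement("li");
        li.textContent = item.title;
        feed.appendChild(li);
      }
    })
    .catch(() => {
      /* the page still works without the dynamic data */
    });
});
```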
I'm currently building an ASP.NET MVC 3 web app that handles lots of images, let's say up to 100 per page.
At the moment the web app itself serves the images. The reason is that I want to make sure only authorized, logged-on users can download them. This approach suffers performance-wise, because on the one hand the browser loads the images sequentially, and on the other hand it does not scale very well.
Therefore I would like to introduce an external WCF web service on another host that serves the images, and only the images. This works very well, but at the moment I have no idea how to make the download URL secure.
In my page from, let's say, "www.imageviewer.com", I would like to have many image tags like so:
<img src="imageservice.imageviewer.com/Download/someID" />
I know I could send some encrypted security information within the download URL, like a user ID or other security tokens, and do some processing with that. But this would not prevent the user (or another user) from downloading the image in another browser without being logged on.
I would like a session-based solution: only with a valid session, after logging on to the web app, should the browser be able to download the image from the web service.
Any ideas how to solve this?
Are webservices the new regex?
Some people, when confronted with a problem, think "I know, I'll use a webservice". Now they have two problems.
Please describe in what way you think a web service will make your images load faster. There will be more overhead (XML (un)packing, another layer of code), and since a web service call is no more or less an HTTP request than what your browser does when requesting an image, you will still run into the browser's connection limit.
A browser does not really load images sequentially, but rather loads about two to eight at a time from the same domain.
Loading the images from different subdomains is a more common approach, and adding some lazy loading will speed it up even more. You can then still secure it using information stored in your session, cookies, or headers.
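One way to tie the image host to the session without a full web-service layer is to sign each image URL with an expiry and verify the signature before serving the file. Since the question is about an ASP.NET/WCF stack, treat the following Node/TypeScript sketch as pseudocode for the approach; the shared secret, URL layout, and 10-minute lifetime are assumptions:

```typescript
// Sketch: session-aware, expiring image URLs.
// The shared secret, URL layout, and 10-minute lifetime are assumptions.
import { createHmac, timingSafeEqual } from "crypto";

const SECRET = process.env.IMAGE_URL_SECRET ?? "change-me";

// Called by the web app when it renders the page for a logged-on user.
function signImageUrl(imageId: string, userId: string): string {
  const expires = Math.floor(Date.now() / 1000) + 600; // valid for 10 minutes
  const sig = createHmac("sha256", SECRET)
    .update(`${imageId}:${userId}:${expires}`)
    .digest("hex");
  return `https://img1.imageviewer.com/Download/${imageId}?u=${userId}&e=${expires}&s=${sig}`;
}

// Called by the image host before streaming the file.
function verifyImageUrl(imageId: string, userId: string, expires: number, sig: string): boolean {
  if (expires < Math.floor(Date.now() / 1000)) return false; // link expired
  const expected = createHmac("sha256", SECRET)
    .update(`${imageId}:${userId}:${expires}`)
    .digest("hex");
  if (sig.length !== expected.length) return false;
  return timingSafeEqual(Buffer.from(sig), Buffer.from(expected));
}
```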
If, on the other hand, you just want an answer to your question and no friendly advice: you could secure your service with a custom user name and password validator, overriding the validator to validate against your known user credentials.
I want to make an app where the users can post messages that will be displayed on a website. The users would need to create a username and password to be able to post.
The app would be like Twitter, but users would only be able to post through the app and read the last few posts, and would not be able to write private messages.
The website would function like a huge cloud of thoughts where everyone could go and read what others have written. Once a post hits the cloud, it can't be deleted. Only I could delete posts.
All posts would have different color and font size, it would look like a huge tag cloud on the website.
How do I make an app and a website like this?
David H
The tutorial application for Google App Engine is an unstyled version of what you describe. They'll even host it for you for free (up to a non-trivial level of usage).
Creating the tag cloud is not very hard, but without knowing your preferred language it is hard to point you to helpful libraries (there are plenty out there).
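For the tag cloud itself, the core is just mapping each word's frequency to a font size. A minimal TypeScript sketch; the 12-48 px range and the linear scaling are arbitrary choices:

```typescript
// Sketch: map word frequencies to font sizes for a simple tag cloud.
// The 12-48 px range and linear scaling are arbitrary choices;
// in a real app, escape the words before inserting them into HTML.
function tagCloud(counts: Map<string, number>): string {
  const values = [...counts.values()];
  const min = Math.min(...values);
  const max = Math.max(...values);

  return [...counts.entries()]
    .map(([word, count]) => {
      // Scale linearly between 12px and 48px (all 12px if every count is equal).
      const size = max === min ? 12 : 12 + ((count - min) / (max - min)) * 36;
      return `<span style="font-size:${size.toFixed(0)}px">${word}</span>`;
    })
    .join(" ");
}

// Usage: tagCloud(new Map([["thoughts", 12], ["school", 3], ["music", 7]]));
```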
Getting people to use it will be the hard part.
added in response to comment:
Good luck with your endeavor. I would be surprised if you weren't able to learn everything you need to know and have a working web app by the time school starts. I found a simple stand-alone tag cloud library that explains what it does and will run on GAE. So now even that part is in place for you.
I'm tempted to make some pathetic reference to the sorts of computing that I did prior to high school, but I expect you probably have SD cards with more computational power than I had available to me. Kids these days! ;)