Backend framework that can use ImageMagick and S3

I'm new to the web development field, and this is the first time I've designed a system, so any advice would be appreciated.
Currently, I am working on a video-hosting platform using AWS S3, so the backend framework needs to be connected to S3. When a user downloads an image from the backend, they should be able to edit the image with ImageMagick or something similar.
Here is my question:
Is there a good backend framework that makes uploading images to S3 easier? I think Rails' Active Storage works really well in this regard, but the problem with Rails is that it can't use the full range of ImageMagick functions.
What I have to care about is the connection to S3 and the ability to use an image-editing system.
For now, I plan to use Ruby on Rails, but I would like to know about other options and their pros and cons.
Thank you.

Related

Creating thumbnails for images on S3

I have a quite common situation, I suppose. I have a website located on Amazon EC2 and I'd like to move all dynamic files to Amazon S3. Everything seems OK, except for two points:
I'm using the PDFNet library with their WebViewer. To display PDF files in the browser, WebViewer uses a special ".xod" format, and PDFNet provides functionality to convert PDF files to xod. Consider the case where a PDF file was uploaded to S3 and no xod file was created (I'm going to use Lambda to avoid this in the future, but still). In this case, do I have to download the file to my local machine, convert it to a xod file, and upload the xod file to S3? I don't see any other way to do it, but it can take a lot of traffic.
The second problem is almost the same, but it concerns thumbnails. Currently I dynamically resize thumbnails depending on the required resolution, and I'd like to keep doing that. AWS Lambda is not suitable in this case; what is the best way to do it?
Why do you say that Lambda is not suitable here?
For point 1: PDFNet provides a Java library, and you can now write a Lambda function in Java, so you can use that to scale the conversion out.
For point 2: Amazon's tutorial (http://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html) gives a detailed example of how to resize images when they are uploaded to S3. The example is in Node.js; you can write a Java version as well if you like.
Note that if you want custom logic for decision making, you can add user-defined metadata while uploading the file to S3 (http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html#User-DefinedMetadata), which you can use in your Lambda function to make decisions while resizing; see the sketch below.
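The tutorial's example is in Node.js; here is a rough Python equivalent, as a minimal sketch rather than production code. It assumes Pillow is bundled with the deployment package, and the "thumb-width" metadata key and the "-thumbs" destination bucket are made up for illustration:

import io
import urllib.parse

import boto3
from PIL import Image  # Pillow must be bundled with the deployment package

s3 = boto3.client("s3")

def handler(event, context):
    # Triggered by an S3 "object created" event
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])

    obj = s3.get_object(Bucket=bucket, Key=key)
    # User-defined metadata set at upload time drives the resize decision;
    # "thumb-width" is a hypothetical key chosen for this example
    width = int(obj["Metadata"].get("thumb-width", "150"))

    img = Image.open(io.BytesIO(obj["Body"].read())).convert("RGB")
    img.thumbnail((width, width))  # resizes in place, preserving aspect ratio
    buf = io.BytesIO()
    img.save(buf, format="JPEG")

    # Write to a separate bucket so the upload does not re-trigger this function
    s3.put_object(Bucket=bucket + "-thumbs", Key=key, Body=buf.getvalue())

Writing the thumbnails to a different bucket (or at least a different prefix) matters: if the function wrote back to the bucket that triggers it, every thumbnail would fire the Lambda again.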

How to upload images taken by a Raspberry Pi to AWS IoT

I am trying to program a Raspberry Pi so it can take a picture every 10 seconds and upload it to DynamoDB through AWS IoT. So far I have programmed the Pi to take a picture every 10 minutes, but I have not been able to send it to AWS IoT. I have been working on this for weeks now. Can anybody help me, please? I would really appreciate it. I am very new to programming. Thank you in advance.
Things I have already done:
I have created a Thing in AWS IoT
I have also created a certificate and the related resources.
I have also created a table in DynamoDB
I need help with what code I need to add to what I have right now, so that the pictures taken by the Pi are uploaded to DynamoDB instead of being saved on the Pi. If you can direct me to other websites or places where I can get help, that would be really appreciated.
Here is my code:
ROLL=$(cat /var/tlcam/series)   # current roll/series number
SAVEDIR=/var/tlcam/stills       # directory where stills are saved

while true; do
  # timestamped filename based on the current UTC date and time
  filename="$ROLL-$(date -u +"%d%m%Y_%H%M-%S").jpg"
  /opt/vc/bin/raspistill -o "$SAVEDIR/$filename"
  sleep 4
done
I believe you want to use S3 instead of DynamoDB. The item size limit in DynamoDB is 64 KB, which would be a very small picture; S3 will allow you to store objects up to 5 TB in size. (See: Storing a lot of images S3 vs DynamoDB.)
S3 has several SDKs available (aws.amazon.com/code), but since you are using a Raspberry Pi, I'd assume you would want to use Python or the CLI. You can find some Python examples using S3 here: boto3.readthedocs.org/en/latest/guide/s3.html. You can also find examples of using the CLI here: docs.aws.amazon.com/cli/latest/reference/s3api/index.html
These SDKs will let you upload images to S3 and download them again (say, to a web interface or app); a sketch follows.
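For example, here is a minimal boto3 sketch that pushes the stills taken by the script above to S3. It assumes AWS credentials have already been configured (e.g. with aws configure), and the bucket name is a placeholder:

import os

import boto3

s3 = boto3.client("s3")
BUCKET = "my-timelapse-bucket"   # hypothetical bucket name
SAVEDIR = "/var/tlcam/stills"    # same directory the camera script writes to

for name in sorted(os.listdir(SAVEDIR)):
    path = os.path.join(SAVEDIR, name)
    s3.upload_file(path, BUCKET, name)  # object key = filename
    os.remove(path)                     # free up space on the Pi

You could run this from cron, or call it at the end of each loop iteration in the camera script, so pictures land in S3 instead of accumulating on the Pi.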

Where can I upload data permanently?

Assume I have a thesis or similar and want to give the audience the possibility to download the code and test it.
Is there a platform for uploading it professionally that will also keep it there permanently (of course, it should not be deleted within a couple of months)?
Thanks for your help...
You can use services such as GitHub or Bitbucket. These allow you to upload code and even give you version control. Users can download your code directly and use it if they need to.

API call to upload a Graph Stylesheet file to Neo4j

Using Graph Stylesheets in Neo4j is nice, but I don't like the manual upload procedure.
Is there any way to perform that upload with the Neo4j API?
Cheers
Unfortunately not; it's stored in the browser's local storage, and right now there is no functionality to tie it to the database.
It should be possible to create, say, a Chrome extension that allows management of .grass files, history, etc., but I'm not knowledgeable enough to know how to tie it in.

Is there an automated way to push all my JavaScript/CSS/images to S3 every time I do a website push?

So I am in the process of moving all the thumbnails of my major sites to S3, and now I am thinking about how I can consistently put all the CSS/JS/images that power the actual sites there as well. It's easy enough to upload everything the first time, but I am trying to think of a way to automate the process every time I push to production.
Does anyone have any clever ways of doing this?
I used to use s3sync to compare and update the assets just before uploading the site files, using a bash script to iterate through my files (something like the sketch below).
This works well, but when the number of files to compare gets big (let's say thousands), the process starts being really slow. If you have a small architecture (in terms of assets), this will do the trick.
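As an illustration, here is a minimal Python sketch of the same compare-and-update idea using boto3 instead of s3sync. It assumes non-multipart uploads (so the S3 ETag equals the file's MD5); the bucket and directory names are placeholders:

import hashlib
import os

import boto3

s3 = boto3.client("s3")
BUCKET = "my-site-assets"   # hypothetical bucket
ASSET_DIR = "public"        # hypothetical local asset directory

def local_md5(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

# Map of key -> ETag for everything already in the bucket
remote = {}
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        remote[obj["Key"]] = obj["ETag"].strip('"')

# Upload only files that are new or whose content has changed
for root, _, files in os.walk(ASSET_DIR):
    for name in files:
        path = os.path.join(root, name)
        key = os.path.relpath(path, ASSET_DIR)
        if remote.get(key) != local_md5(path):
            s3.upload_file(path, BUCKET, key)

Listing the bucket once up front keeps the comparison to one API round-trip per thousand objects, which helps with the slowness mentioned above.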
To make this better, I would recommend Capistrano or some other assistant that helps you deploy; this way you can run it all at once:
upload the assets
deploy your files
On the other hand, you could take a look at CloudFront (Amazon's CDN) and set it up using a custom origin; this way you don't need to worry about uploading the files to S3, since they will be pulled automatically on demand. The downside of this approach is caching: if you need to update a file and keep the same name, you have to expire (invalidate) the object. You can do this in CloudFront, but you will need a script for the task; a sketch of such a script follows.
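A minimal sketch of such an expiration script with boto3; the distribution ID and object path below are hypothetical placeholders:

import time

import boto3

cloudfront = boto3.client("cloudfront")

# Invalidate (expire) a changed object so CloudFront re-fetches it
# from the origin on the next request
cloudfront.create_invalidation(
    DistributionId="E1234567890ABC",  # hypothetical distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/css/site.css"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
)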
Depending on the traffic (and other factors, of course), one path or the other will fit best.