How to upload more than one file under the same IPFS hash - file-upload

I have created a website that uploads a single image to IPFS and generates a hash for it.
Now I want to upload a folder of two or more images. I did it with Pinata, but I want to upload through my website.
This is the HTML (App.tsx):
<form onSubmit={onSubmit}>
  <input type="file" multiple webkitdirectory mozdirectory />
  <input type="submit" className="button-62" />
</form>
This is the onSubmit function:
const onSubmit = async (e: React.FormEvent<HTMLFormElement>) => {
  e.preventDefault();
  const form = e.target as HTMLFormElement;
  const files = (form[0] as HTMLInputElement).files;
  if (!files || files.length === 0) {
    return alert("No file selected");
  }
  const filesAsArray = Array.from(files);
  if (filesAsArray[0].webkitRelativePath === "") {
    // A single file or a flat list of files
    filesAsArray.forEach(async (file: File) => {
      const result = await (ipfs as IPFSHTTPClient).add(file);
      console.log("results-cid : ", result.cid);
      console.log("results-path : ", result.path);
      console.log(`full-url : https://ipfs.io/ipfs/${result.path}`);
      // Functional update avoids dropping entries when several
      // uploads resolve before the state refreshes
      setUploadedFile((prev) => [
        ...prev,
        {
          cid: result.cid,
          path: result.path,
        },
      ]);
    });
  } else {
    // A directory was selected
    filesAsArray.forEach(async (file: File) => {
      console.log("file", file);
      const result = await (ipfs as IPFSHTTPClient).add({
        path: `images/${file.name}`,
        content: file,
      });
      console.log("result", result);
      console.log("results-cid : ", result.cid);
      console.log("results-path : ", result.path);
    });
  }
};
This is the output in the console:
folder
App.tsx:64 file File {name: '1.jpg', lastModified: 1649998567608, lastModifiedDate: Fri Apr 15 2022 10:26:07 GMT+0530 (India Standard Time), webkitRelativePath: 'images/1.jpg', size: 102265, …}
App.tsx:63 folder
App.tsx:64 file File {name: '2.jpg', lastModified: 1651732741324, lastModifiedDate: Thu May 05 2022 12:09:01 GMT+0530 (India Standard Time), webkitRelativePath: 'images/2.jpg', size: 19522, …}
App.tsx:71 result {path: 'images', cid: CID, size: 102332}
App.tsx:72 results-cid : CID {code: 112, version: 0, multihash: Digest, bytes: Uint8Array(34), byteOffset: 0, …}
App.tsx:73 results-path : images
App.tsx:71 result {path: 'images', cid: CID, size: 19589}
App.tsx:72 results-cid : CID {code: 112, version: 0, multihash: Digest, bytes: Uint8Array(34), byteOffset: 0, …}
App.tsx:73 results-path : images
Basically, it is generating two hashes (one node/directory per image) and storing each image in its own separate `images` directory. I want to store both images in the same directory:
HASH 1 QmSK11ykHorPMxkwXQvj61Y2UoWxvJG7UdVhH8zTUmJ1hZ
HASH 2 QmWCjf19j3zZXAgjkRW1W2c5kECXEpWWYmSoUHq6XVw6t9

If you are using the IPFS JavaScript client, you might as well try the Mutable File System (MFS) API, which basically lets you treat IPFS as a file system (note that file.path above was just to demonstrate that you can put a path there):
await ipfs.files.mkdir('/path/to/imageDir')
files.forEach(async (file) => {
  await ipfs.files.cp(file.path, `/path/to/imageDir/${file.name}`)
})
const { cid, type } = await ipfs.files.stat('/path/to/imageDir')
Check out the docs for more info.
Alternatively, you can try the web3.storage directory-wrapping API, which gives you a nice abstraction over IPFS.
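Another option, staying with ipfs-http-client, is to add all the files in one addAll call with wrapWithDirectory: true, which yields a single root CID covering every image. The helper below is only a sketch of the entry-building step (which is pure JavaScript); the addAll call itself needs a running node:

```javascript
// Build the { path, content } entries that ipfs.addAll accepts.
// webkitRelativePath (set when a directory is picked) keeps the folder layout;
// the "images/" fallback mirrors the hard-coded prefix in the question.
function toIpfsEntries(files) {
  return files.map((file) => ({
    path: file.webkitRelativePath || `images/${file.name}`,
    content: file,
  }));
}

// Usage sketch (requires a live node; `ipfs` is the IPFSHTTPClient instance):
// for await (const result of ipfs.addAll(toIpfsEntries(filesAsArray), {
//   wrapWithDirectory: true,
// })) {
//   // The final result (path === '') is the wrapping directory's CID
//   console.log(result.path, result.cid.toString());
// }
```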

Related

Merging two components data into one

I want to re-create the ls -AlF program, but in a way that I like, and I want to use NoFlo to do it.
This is the graph (graphs/ListDirectory.fbp) that I made:
ReadDir(filesystem/ReadDir)
Stat(filesystem/Stat)
SplitByStatType(SplitByStatType)
Display(core/Output)
ReadDir OUT -> IN Stat
ReadDir ERROR -> IN Display
Stat OUT -> IN SplitByStatType
Stat ERROR -> IN Display
SplitByStatType DIRECTORY -> IN Display
SplitByStatType FILE -> IN Display
'.' -> SOURCE ReadDir
This is the component components/SplitByStatType.js:
const noflo = require('noflo')

exports.getComponent = () => {
  const component = new noflo.Component()
  component.description = 'Splits directories and files.'
  component.icon = 'directory'
  component.inPorts.add('in', {
    datatype: 'object',
  })
  component.outPorts.add('file', {
    datatype: 'object',
  })
  component.outPorts.add('directory', {
    datatype: 'object',
  })
  component.outPorts.add('blockdevice', {
    datatype: 'object',
  })
  component.outPorts.add('characterdevice', {
    datatype: 'object',
  })
  component.outPorts.add('fifo', {
    datatype: 'object',
  })
  component.outPorts.add('socket', {
    datatype: 'object',
  })
  component.outPorts.add('error', {
    datatype: 'object',
  })
  component.process((input, output) => {
    if (!input.hasData('in')) return
    const data = input.getData('in')
    const { isFile, isDirectory, isBlockDevice, isCharacterDevice, isFifo, isSocket } = data
    if (isFile) {
      output.send({
        file: data,
      })
    }
    if (isDirectory) {
      output.send({
        directory: data,
      })
    }
    if (isBlockDevice) {
      output.send({
        blockdevice: data,
      })
    }
    if (isCharacterDevice) {
      output.send({
        characterdevice: data,
      })
    }
    if (isFifo) {
      output.send({
        fifo: data,
      })
    }
    if (isSocket) {
      output.send({
        socket: data,
      })
    }
    // TODO: Else, error?
    output.done()
  })
  return component
}
What would you call this component, and/or has someone made it already?
Can I do this without implementing my own component, using already existing components?
How do I tie together the filename and the stat so that I can process them in another component and print one line for each?
What I want to end up with is one line per node, with directories first (sorted, with a trailing /) and files last (also sorted, with files beginning with '.' first).
noflo-assembly has a Merge utility for doing exactly that. But it is also quite easy to do with a regular component:
const noflo = require('noflo');

exports.getComponent = () => {
  const c = new noflo.Component();
  c.addInport('a');
  c.addInport('b');
  c.addOutport('out');
  return c.process((input, output) => {
    // Check that we have data for both A and B inputs
    if (!input.hasData('a', 'b')) {
      return;
    }
    // Read both inputs
    const [a, b] = input.getData('a', 'b');
    // Send combined data out
    output.sendDone({
      out: {
        a,
        b,
      },
    });
  });
};
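A hypothetical wiring of such a merge component into the original graph (the component and port names here are assumptions for illustration, not from noflo-assembly) could pair each filename with its stat before display:

```
ReadDir(filesystem/ReadDir)
Stat(filesystem/Stat)
Combine(myproject/Merge)
Display(core/Output)

ReadDir OUT -> A Combine
ReadDir OUT -> IN Stat
Stat OUT -> B Combine
Combine OUT -> IN Display
'.' -> SOURCE ReadDir
```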

React native : can't unzip the file I get with rn-fetch-blob

I'm trying to download a zip file with rn-fetch-blob; once I have the file, I unzip it with react-native-zip-archive.
It often works well, but sometimes the unzipFile() function I've created can't unzip the file, as if it were corrupted.
Has anyone run into this problem before?
Here is my code :
downloadZipFile(res => {
  unzipFile(res.path(), (success, path) => {
    if (success !== false) {
      db = SQLite.openDatabase({
        name: "addb.sqlite",
        location: "default",
        createFromLocation: path
      }).then(DB => {
        db = DB;
        db.transaction(tx => {
          tx.executeSql(
            "SELECT * FROM sqlite_master",
            [],
            (tx, results) => {
              console.log("Logs sqlite_master");
              const rows = results.rows;
              for (let i = 0; i < rows.length; i++) {
                console.log(_getCurrentDate());
                datas.push({
                  ...rows.item(i)
                });
              }
              console.log(datas);
              callback(true);
            },
            (tx, err) => {
              console.log(err)
            }
          );
        });
      });
    } else {
      console.log("Can't create database");
      callback(false);
    }
  });
});
And the functions I used :
export function downloadZipFile(callback) {
  RNFetchBlob.config({
    fileCache: true
  })
    .fetch("GET", "MY LINK")
    .then(res => {
      console.log("The file saved to ", res.path());
      callback(res);
    })
    .catch((errorMessage, statusCode) => {
      // error handling
      console.log("error: " + errorMessage + " and status code: " + statusCode);
    });
}
export function unzipFile(sourcePath, callback) {
  const charset = "UTF-8";
  const targetPath = "/data/user/0/com.myapp/databases/";
  unzip(sourcePath, targetPath, charset)
    .then(path => {
      console.log(`unzip completed at ${path}`);
      callback(true, path);
    })
    .catch(error => {
      console.log("there is an error: " + error);
      callback(false, null);
    });
}
Other information:
The file is a database that I have to put in the application's "databases" folder. I put a console.log(path) everywhere in the unzipFile() function to see whether the file is really created when I try to unzip it, and it seems to be there. And when the file is impossible to unzip, it has the same size as the ones that work.
rn-fetch-blob calls an API which copies an existing remote database and zips it as an .axd file. Is there any problem with this format? Could the API be the problem?
The .axd file created by the API is used by an existing application and seems to work correctly there. Moreover, when we download the file without rn-fetch-blob (by pasting the link into a browser), it works correctly every time I tried.
I tried downloading the file directly; the API always sent me the same file (a zip file or an .axd file), and it worked without problems (20 tries). Could the problem be the time it takes to download the file? With the API it takes 5 or 6 seconds; without it, 2 seconds. But I think my unzipFile() function only starts once the file is downloaded, no? And as I said, when I put a console.log(path) in the unzipFile() function, the file is there, with the same size as the others...
I don't know how to make it work every time; I hope someone can help me :)
Thanks!
I tried putting a for (let i = 1; i < 101; i++) loop to run the download 100 times:
it worked 97 times out of 100, then 96 times out of 100...
Then I added a timer, to be sure the file had finished downloading; it worked 3 times out of 100...
Then I deleted the timer, and now it almost never works, maybe 5 times out of 100...
I really don't understand what the problem is :(
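Since the failures look like truncated downloads, one possible mitigation (an assumption, since the root cause isn't confirmed) is to compare the saved file's size against the server's Content-Length header before attempting to unzip. The comparison itself is plain JavaScript:

```javascript
// Returns true when the downloaded size matches the server-reported size.
// A mismatch suggests a truncated download rather than a corrupt archive.
function isDownloadComplete(contentLength, actualSize) {
  const expected = parseInt(contentLength, 10);
  if (Number.isNaN(expected)) return true; // header missing: cannot verify
  return expected === actualSize;
}

// Usage sketch with rn-fetch-blob (hypothetical wiring around the code above):
// RNFetchBlob.config({ fileCache: true })
//   .fetch('GET', 'MY LINK')
//   .then(async (res) => {
//     const info = res.info();
//     const stat = await RNFetchBlob.fs.stat(res.path());
//     if (isDownloadComplete(info.headers['Content-Length'], Number(stat.size))) {
//       unzipFile(res.path(), callback);
//     } else {
//       // retry the download instead of unzipping a partial file
//     }
//   });
```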

How to check if file type is a specified file type(Image) in vue.js?

Hello. I want to check my file input's type to see whether it's a JPG or not. I worked with vee-validate, which was fine, but what I'd like is to do it the same way I did the file-size check, based on a tutorial.
Here's the code:
<input type="file" @change="updateMelliCodeFrontScan" name="mellicode_front_url" class="form-input" >
Here's the Vue method:
updateMelliCodeFrontScan(e){
  // console.log('uploading');
  let file = e.target.files[0];
  let reader = new FileReader();
  // let vm = this;
  if (file['size'] < 200000) {
    reader.onloadend = (file) => {
      // console.log('RESULT', reader.result)
      this.form.mellicode_front_url = reader.result;
    }
    reader.readAsDataURL(file);
  } else {
    swal({
      type: 'error',
      title: 'Size limited.',
      text: 'size limit',
    })
  }
},
So I want to do it like this:
I want to add another if for the file type, based on the extension/type, like file['type'].
I used file['type'] === jpg and it didn't work.
if (file['size'] < 200000) {
  reader.onloadend = (file) => {
    // console.log('RESULT', reader.result)
    this.form.mellicode_front_url = reader.result;
  }
  reader.readAsDataURL(file);
} else {
  swal({
    type: 'error',
    title: 'Size limited.',
    text: 'size limit',
  })
}
It's because the browser reports the MIME type, image/jpeg, in files[0]['type']:
if (files[0]['type'] === 'image/jpeg')
Try this; it works.
For checking multiple file types, you can use this:
if (['image/png', 'image/jpeg', 'image/svg+xml'].includes(files[0]['type'])) {
}
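Combining the size check from the question with the type check above gives one validation step before the FileReader runs. This is a sketch; validateImageFile, ALLOWED_TYPES, and MAX_BYTES are illustrative names and values, not from the question:

```javascript
// Validate a File before reading it: allowed MIME type and under a size cap.
const ALLOWED_TYPES = ['image/jpeg', 'image/png', 'image/svg+xml'];
const MAX_BYTES = 200000; // same cap the question uses

function validateImageFile(file) {
  if (!ALLOWED_TYPES.includes(file.type)) {
    return { ok: false, reason: 'type' };
  }
  if (file.size >= MAX_BYTES) {
    return { ok: false, reason: 'size' };
  }
  return { ok: true };
}
```

In the Vue method, you could call validateImageFile(file) first and only fall through to reader.readAsDataURL(file) when result.ok is true, showing the swal error otherwise.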

Upload larger files to S3 using not multipart but put upload

I want to upload and encode a video file using S3, Elastic Transcoder, and a Lambda function.
The Lambda function and settings work fine, but when I upload a video file (mp4), it is automatically uploaded as 'multipart' (because it's larger than 5 MB).
And maybe because of that, jobs on Elastic Transcoder run three times, so I get two unnecessary jobs on each upload (the first job successfully encodes the video file).
Is there any way to avoid this, such as forcing larger files (about 60 MB) to be uploaded not with 'multipart' but with 'put'? I'd prefer to do this from the browser console, for non-engineers.
This is the Lambda function that submits the job to Elastic Transcoder. When I upload video files, the 'put' event doesn't fire but the 'multipart' event does. The 'put' event works for smaller files like images or text files, just for testing.
console.log('Loading event');
var aws = require('aws-sdk');
var s3 = new aws.S3({apiVersion: '2006-03-01'});
var ets = new aws.ElasticTranscoder({apiVersion: '2012-09-25', region: 'us-west-2'});

exports.handler = function(event, context) {
  console.log('Received event:');
  console.log(JSON.stringify(event, null, ' '));
  var bucket = event.Records[0].s3.bucket.name;
  var key = event.Records[0].s3.object.key;
  var fileName = key.split('.')[0];
  s3.getObject({Bucket: bucket, Key: key}, function(err, data) {
    if (err) {
      console.log('error getting object ' + key + ' from bucket ' + bucket +
        '. Make sure they exist and your bucket is in the same region as this function.');
      context.done('error', 'error getting file' + err);
    } else {
      console.log("### JOB KEY ### " + key);
      ets.createJob({
        PipelineId: '***',
        Input: {
          Key: key,
          FrameRate: 'auto',
          Resolution: 'auto',
          AspectRatio: 'auto',
          Interlaced: 'auto',
          Container: 'auto',
        },
        Output: {
          Key: fileName + '.m3u8',
          ThumbnailPattern: fileName + '-thumbs-{count}',
          PresetId: '1351620000001-200035',
          Rotate: 'auto'
        }
      }, function(error, data) {
        if (error) {
          console.log(error);
        } else {
          console.log('Job submitted');
        }
      });
    }
  });
};
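If the bucket's notification configuration subscribes to several overlapping event types, one upload can invoke the function more than once. One possible mitigation (an assumption about the cause, not a confirmed diagnosis) is to guard inside the handler on each record's eventName; the values below follow the S3 event notification documentation:

```javascript
// Act only on object-creation records. S3 sets eventName to values like
// "ObjectCreated:Put" or "ObjectCreated:CompleteMultipartUpload".
function isObjectCreatedEvent(record) {
  return typeof record.eventName === 'string' &&
    record.eventName.startsWith('ObjectCreated:');
}

// Usage sketch inside the handler above:
// var records = event.Records.filter(isObjectCreatedEvent);
// if (records.length === 0) { return context.done(); }
```

Deduplicating on the object key and version (e.g. in DynamoDB) would be a more robust guard, since Lambda may also retry the same event.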

Blueimp jQuery File Upload - how to change upload directory

How can I change the upload directory?
I want to change the file upload directory dynamically, e.g. upload each user's files to his/her own folder.
Thanks.
You can store the directory in a $_SESSION variable or in a $_COOKIE, and then read the saved value in the file /php/index.php:
$uplDir = $_SESSION["uploadDirectory"] . '/';
$option = array(
    /* some options */
    'upload_dir' => $uplDir,
    /* .... */
);
$upload_handler = new UploadHandler($option);
P.S. Remember the session_start(); at the beginning.
Alternatively, you can send the directory via the form data parameters in the JS file:
<script>
$(function () {
  $('#fileupload').fileupload({
    dataType: 'json',
    formData: [{ name: 'custom_dir', value: '/save/file/here/' }],
    done: function (e, data) {
      $.each(data.result.files, function (index, file) {
        $('<p/>').text(file.name).appendTo(document.body);
      });
    }
  });
});
</script>
Then, in the upload handler definition:
require('UploadHandler.php');
$custom_dir = $_SERVER['DOCUMENT_ROOT'] . $_REQUEST['custom_dir'];
$upload_handler = new UploadHandler(array('upload_dir' => $custom_dir));
Note that taking a directory straight from the request is a path-traversal risk; validate custom_dir against a whitelist on the server before using it.