Upload large file to Azure blob storage via REST API Put Block Blob - react-native

I am using React Native to build a mobile application for Android and iOS.
Since no library currently exists that supports the Azure Storage API in React Native (the available ones all depend on browser APIs that React Native does not provide), I use the REST API to interact with Azure Storage, and it works fine for e.g. list containers, list blobs, get blob and put blob.
In order to upload a large file, I tried to use the same mechanism with the Put Block API (as described here: https://learn.microsoft.com/en-us/rest/api/storageservices/put-block), without success: the request fails with error code 403.
I would appreciate your help.
Thank you.
My code for uploading a single block:
private createAuthorizationHeader(canonicalizedString: string) {
  const hmac = CryptoJS.HmacSHA256(canonicalizedString, CryptoJS.enc.Base64.parse(this.config.accountKey));
  const sig = CryptoJS.enc.Base64.stringify(hmac);
  const authorizationHeader = `SharedKey ${this.config.accountName}:${sig}`;
  return authorizationHeader;
}

async putBlockBlob(containerName: string, blobPath: string, blobContent: string, blockIndex: number) {
  const requestMethod = 'PUT';
  const urlPath = `${containerName}/${blobPath}`;
  const dateInRfc1123Format = new Date(Date.now()).toUTCString();
  const storageServiceVersion = '2019-12-12';
  const blobLength: number = blobContent.length;
  const blockId = Buffer.from(`block-${blockIndex}`).toString('base64');
  const blobType = 'BlockBlob';
  // StringToSign =
  // VERB + "\n" +
  // Content-Encoding + "\n" +
  // Content-Language + "\n" +
  // Content-Length + "\n" +
  // Content-MD5 + "\n" +
  // Content-Type + "\n" +
  // Date + "\n" +
  // If-Modified-Since + "\n" +
  // If-Match + "\n" +
  // If-None-Match + "\n" +
  // If-Unmodified-Since + "\n" +
  // Range + "\n" +
  // CanonicalizedHeaders +
  // CanonicalizedResource;
  const canonicalizedHeaders = `x-ms-date:${dateInRfc1123Format}\nx-ms-version:${storageServiceVersion}`;
  const canonicalizedResource = `/${this.config.accountName}/${urlPath}\nblockid:${blockId}\ncomp:block`;
  const stringToSign = `${requestMethod}\n\n\n${blobLength}\n\napplication/octet-stream\n\n\n\n\n\n\n${canonicalizedHeaders}\n${canonicalizedResource}`;
  const uriStr = `${urlPath}?comp=block&blockid=${blockId}`;
  const authorizationHeader = this.createAuthorizationHeader(stringToSign);
  const header = {
    'cache-control': 'no-cache',
    'x-ms-date': dateInRfc1123Format,
    'x-ms-version': storageServiceVersion,
    Authorization: authorizationHeader,
    'Content-Length': `${blobLength}`,
    'Content-Type': 'application/octet-stream',
  };
  try {
    return axios
      .create({ baseURL: `https://${this.config.accountName}.blob.core.windows.net/` })
      .request({
        method: requestMethod,
        url: uriStr,
        data: blobContent,
        headers: header,
      })
      .then((response) => response.data)
      .catch((err) => {
        throw err;
      });
  } catch (err) {
    console.log(err);
    throw err;
  }
}

I believe the issue is caused by a missing newline character between the Range field and CanonicalizedHeaders.
Can you try changing the following line of code:

  const stringToSign = `${requestMethod}\n\n\n${blobLength}\n\napplication/octet-stream\n\n\n\n\n\n\n${canonicalizedHeaders}\n${canonicalizedResource}`;

to:

  const stringToSign = `${requestMethod}\n\n\n${blobLength}\n\napplication/octet-stream\n\n\n\n\n\n\n\n${canonicalizedHeaders}\n${canonicalizedResource}`;

This should let you upload the data to the Azure storage server.
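Note that Put Block only stages blocks: the blob does not become readable until the staged block IDs are committed with Put Block List (https://learn.microsoft.com/en-us/rest/api/storageservices/put-block-list). Here is a minimal sketch of building the commit body; the helper name is mine, but the XML element names come from the REST docs:

```javascript
// Build the XML body for Put Block List. Each staged block ID is listed in
// the order the blocks should appear in the final blob; <Latest> tells the
// service to use the most recently staged block with that ID.
function buildBlockListXml(blockIds) {
  const entries = blockIds.map((id) => `<Latest>${id}</Latest>`).join('');
  return `<?xml version="1.0" encoding="utf-8"?><BlockList>${entries}</BlockList>`;
}
```

The commit request is a PUT to `${urlPath}?comp=blocklist`, signed the same way as the block uploads. Also be aware that all block IDs within a blob must have the same length before base64 encoding, so a fixed-width index (e.g. `block-${String(blockIndex).padStart(6, '0')}`) is safer than `block-${blockIndex}`, which changes length at index 10.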
Upload the file to the server using a SAS token (with rn-fetch-blob):
export const uploadMedia = async (params: any, callBack: any) => {
  const SAS_URL: any = `https://${blobUrl}.blob.core.windows.net`; // blobUrl: your storage account name
  const CONTAINER: any = "";
  const SAS_TOKEN: any = "";
  const { fileType, localUri } = params;
  const userId = "set user ID here";
  const fileName = String(fileType).concat(customIdGenerator(7));
  const assetPath = `${SAS_URL}/${CONTAINER}/${userId}/${fileName}`;
  HEADER["x-ms-blob-content-type"] = CONST_HEADER(fileType);
  return await RNFetchBlob.fetch(
    "PUT",
    `${assetPath}?${SAS_TOKEN}`,
    HEADER,
    RNFetchBlob.wrap(localUri)
  )
    ?.uploadProgress(callBack)
    .then(() => {
      return assetPath;
    });
};
fileType can be 'video' | 'image' | 'pdf':

  let params: any = {
    fileType: 'image',
    localUri: image,
  };
Generate a custom ID for uniqueness, or you can also use a UUID:
const customIdGenerator = (length: any) => {
  var result = "";
  var characters =
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
  var charactersLength = characters.length;
  for (var i = 0; i < length; i++) {
    result += characters.charAt(Math.floor(Math.random() * charactersLength));
  }
  return result;
};
Set the Content-Type header for the different file types:
const CONST_HEADER = (type: any) => {
  return type == 'image'
    ? `image/png`
    : type == 'video'
      ? 'video/mp4'
      : type == 'pdf' && 'application/pdf';
};
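One caveat with CONST_HEADER above: the nested ternary evaluates to `false` for any type other than 'image', 'video' or 'pdf'. A lookup table with a fallback avoids that; this is my own sketch, and the 'application/octet-stream' default is an assumption rather than part of the original answer:

```javascript
// Map file types to Content-Type values; unknown types fall back to a
// generic binary type instead of `false`.
const MIME_BY_TYPE = {
  image: 'image/png',
  video: 'video/mp4',
  pdf: 'application/pdf',
};

const contentTypeFor = (type) => MIME_BY_TYPE[type] || 'application/octet-stream';
```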

Related

ethers.js, Swap on uniswapV3 failed tx

I'm trying to use the exactInput() function of the UniV3 router interface, but the transaction fails when I execute the code: https://goerli.etherscan.io/tx/0xb0d5e4b491610b9db8d98cc938008ba2a4e1a06e67b05ed87ac6c0ca3ad61dab
I know the ETH sent shows 0 in this one, but it fails even when specifying an amount, and I don't know what to change.
I have checked many examples out there and can't spot the mistake; could someone give me some advice?
const {abi: V3SwapRouterABI} = require('@uniswap/v3-periphery/artifacts/contracts/interfaces/ISwapRouter.sol/ISwapRouter.json')
const { ethers } = require("ethers")
require("dotenv").config()
const INFURA_URL_TESTNET = process.env.INFURA_URL_TESTNET
const PRIVATE_KEY = process.env.PRIVATE_KEY
const WALLET_ADDRESS = process.env.WALLET_ADDRESS
// now you can call sendTransaction
const wethToken= "0xB4FBF271143F4FBf7B91A5ded31805e42b2208d6"
const Uni= "0x1f9840a85d5aF5bf1D1762F925BDADdC4201F984"
const UniswapRouter="0x68b3465833fb72A70ecDF485E0e4C7bD8665Fc45"
const UniV3Contract = new ethers.Contract(
  UniswapRouter,
  V3SwapRouterABI
)
const provider = new ethers.providers.JsonRpcProvider(INFURA_URL_TESTNET)
const wallet = new ethers.Wallet(PRIVATE_KEY)
const signer = wallet.connect(provider)
const FEE_SIZE = 3
function encodePath(path, fees) {
  if (path.length != fees.length + 1) {
    throw new Error('path/fee lengths do not match')
  }
  let encoded = '0x'
  for (let i = 0; i < fees.length; i++) {
    // 20 byte encoding of the address
    encoded += path[i].slice(2)
    // 3 byte encoding of the fee
    encoded += fees[i].toString(16).padStart(2 * FEE_SIZE, '0')
  }
  // encode the final token
  encoded += path[path.length - 1].slice(2)
  return encoded.toLowerCase()
}
async function getToken() {
  const path = encodePath([wethToken, Uni], [3000])
  const deadline = Math.floor(Date.now()/1000) + (60*10)
  const params = {
    path: path,
    recipient: WALLET_ADDRESS,
    deadline: deadline,
    amountIn: ethers.utils.parseEther('0.01'),
    amountOutMinimum: 0
  }
  const encodedData = UniV3Contract.interface.encodeFunctionData("exactInput", [params])
  const txArg = {
    to: UniswapRouter,
    from: WALLET_ADDRESS,
    data: encodedData,
    gasLimit: ethers.utils.hexlify(1000000)
  }
  const tx = await signer.sendTransaction(txArg)
  console.log('tx: ', tx)
  const receipt = await tx.wait()
  console.log('receipt: ', receipt)
}

module.exports = { getToken }
You will need to remove the deadline. The new router 0x68b3465833fb72A70ecDF485E0e4C7bD8665Fc45 (SwapRouter02) moved the deadline into the multicall function, since the router is designed to be called via multicall.
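As a quick sanity check of the path encoding used above: each hop packs a 20-byte token address followed by a 3-byte fee, so a 0.3% pool (fee = 3000 = 0xbb8) should show up as 000bb8 between the two addresses. Here is a self-contained copy of the question's encodePath with dummy addresses (the real token addresses don't matter for the check):

```javascript
const FEE_SIZE = 3;

// Same logic as the question's encodePath: addresses are concatenated as
// 20-byte hex strings with each pool fee packed into 3 bytes between them.
function encodePath(path, fees) {
  if (path.length !== fees.length + 1) {
    throw new Error('path/fee lengths do not match');
  }
  let encoded = '0x';
  for (let i = 0; i < fees.length; i++) {
    encoded += path[i].slice(2); // 20-byte address
    encoded += fees[i].toString(16).padStart(2 * FEE_SIZE, '0'); // 3-byte fee
  }
  encoded += path[path.length - 1].slice(2); // final token
  return encoded.toLowerCase();
}

const tokenA = '0x' + 'a'.repeat(40);
const tokenB = '0x' + 'b'.repeat(40);
// encodePath([tokenA, tokenB], [3000]) → '0x' + 40×'a' + '000bb8' + 40×'b'
```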

React-native-fs : How to use readDir recursively using .map()?

I am trying to get all the files and directories inside a folder using react-native-fs.
I created a function to collect all files and directories in a folder recursively, and I call it this way:

  const data = await scanDir(path);

I first tried using the .map() function, but my function returns only some of the elements:
async function scanDir(pathOfDirToScan, data = {directory: [], files: []}) {
  const readedFilesAndDir = await FS.readDir(pathOfDirToScan);
  Object.keys(readedFilesAndDir).map(async key => {
    if (readedFilesAndDir[key].isDirectory()) {
      const directoryPath = pathOfDirToScan + '/' + readedFilesAndDir[key].name;
      data.directory.push(directoryPath);
      data = await scanDir(directoryPath, data);
    } else {
      data.files.push(pathOfDirToScan + '/' + readedFilesAndDir[key].name);
    }
  });
  return data;
}
It seems my function returns the data as soon as the first level of map callbacks has started, even though the callbacks keep running afterwards.
I then tried a for loop, and it works as intended:
async function scanDir(pathOfDirToScan, data = {directory: [], files: []}) {
  const readedFilesAndDir = await FS.readDir(pathOfDirToScan);
  for (let i = 0; i < readedFilesAndDir.length; i++) {
    if (readedFilesAndDir[i].isDirectory()) {
      const directoryPath = pathOfDirToScan + '/' + readedFilesAndDir[i].name;
      data.directory.push(directoryPath);
      data = await scanDir(directoryPath, data);
    } else {
      data.files.push(pathOfDirToScan + '/' + readedFilesAndDir[i].name);
    }
  }
  return data;
}
What should I do to make the function work properly using .map()?
FS.readDir(dirpath) returns an array of objects, as per the docs, so Object.keys(obj) is not required for iteration; readedFilesAndDir.map() will do the task. Note, however, that .map() with an async callback produces an array of promises, so you must await them all (e.g. with Promise.all), otherwise the function returns before the recursion has finished.
Here is your own code with those corrections. Hope it helps:

  async function scanDir(pathOfDirToScan, data = {directory: [], files: []}) {
    const readedFilesAndDir = await FS.readDir(pathOfDirToScan);
    await Promise.all(readedFilesAndDir.map(async eachItem => {
      if (eachItem.isDirectory()) {
        const directoryPath = pathOfDirToScan + '/' + eachItem.name;
        data.directory.push(directoryPath);
        data = await scanDir(directoryPath, data);
      } else {
        data.files.push(pathOfDirToScan + '/' + eachItem.name);
      }
    }));
    return data;
  }
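The underlying rule here: an async callback handed to .map() returns a promise immediately, so the enclosing function finishes before the callbacks do unless you await them all. A minimal, framework-free sketch of the pattern (the names are mine, not from react-native-fs):

```javascript
// .map() with an async callback yields an array of promises;
// Promise.all is what makes the caller actually wait for all of them.
async function collect(items, asyncFn) {
  return Promise.all(items.map(asyncFn));
}
```

For example, collect([1, 2, 3], async (x) => x * 2) resolves to [2, 4, 6] only once every callback has settled; the for loop in the question achieves the same thing by awaiting each item sequentially.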

Ionic 4 Image upload using Angular HTTP

I use Ionic 4 and Angular 7 with PHP as the back-end.
I am trying to upload files (images/videos/PDFs/audio). Is there a general way to send them?
I tried to send an image using the camera plugin: it returns the URI, and the image displays fine in the app using an img tag, but I can't get the file itself to send it via FormData.
openCamera() {
  const options: CameraOptions = {
    quality: 100,
    destinationType: this.camera.DestinationType.FILE_URI,
    encodingType: this.camera.EncodingType.JPEG,
    mediaType: this.camera.MediaType.PICTURE,
    sourceType: this.camera.PictureSourceType.PHOTOLIBRARY
  };
  this.camera.getPicture(options).then((imageData) => {
    this.imageData = imageData;
    this.image = (<any>window).Ionic.WebView.convertFileSrc(imageData);
    // this.image works fine in an img tag
    this.sendMsg(this.image);
  }, (err) => {
    // Handle error
    alert('error ' + JSON.stringify(err));
  });
}

sendMsg(file?) {
  const data = new FormData();
  data.set('group_id', this.groupId);
  data.set('text', this.msg);
  if (file) {
    data.set('file', this.image);
    data.set('text', '');
  }
  this.messeges.push(data);
  this._messengerService.postMsg(data).subscribe(
    res => {
      console.log('res ', res);
      if (res.success === true) {
        console.log('data added ', res);
      }
    }
  );
}
I want to use the URI to get the actual file.
The Ionic Native camera plugin returns only base64 data (or a file URI). Since your API expects FormData, you need to convert the base64 data to a Blob first:
dataURItoBlob(dataURI) {
  // convert the base64/URL-encoded data component to raw binary data held in a string
  var byteString;
  if (dataURI.split(',')[0].indexOf('base64') >= 0)
    byteString = atob(dataURI.split(',')[1]);
  else
    byteString = unescape(dataURI.split(',')[1]);
  // separate out the mime component
  var mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
  // write the bytes of the string to a typed array
  var ia = new Uint8Array(byteString.length);
  for (var i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }
  return new Blob([ia], { type: mimeString });
}
and then use the resulting Blob when building the FormData:
profileUpdate(options) {
  this.camera.getPicture(options).then((imageData) => {
    let base64Image = 'data:image/jpg;base64,' + imageData;
    let data = this.dataURItoBlob(base64Image);
    let formData = new FormData();
    formData.append('profile', data, "filename.jpg");
    // here you pass the formData to your API
  });
}
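The MIME extraction buried inside dataURItoBlob can be pulled out and checked on its own; the helper name below is mine:

```javascript
// Extract the MIME type from a data URI such as "data:image/jpg;base64,...."
// using the same split logic as dataURItoBlob.
function mimeFromDataUri(dataURI) {
  return dataURI.split(',')[0].split(':')[1].split(';')[0];
}
```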

Large file upload for office 365(StartUpload,ContinueUpload,FinishUpload) not working as expected - SharePoint

When I try to upload a large file in chunks using the 3 new methods (StartUpload, ContinueUpload, FinishUpload), the final uploaded file is corrupt and its size is also greater than the actual file. I have used the REST API to upload the large file.
The steps I followed are as follows:
Create HTML for the input file:
<input name="FileUpload" type="file" id="uploadInput" className="inputFile" multiple="false" onchange="upload(this.files[0])" />
The method below is the starting point of the code.
Create a global variable for the site URL:

  var Tasks = {
    urlName: window.location.origin + "/",
    siteName: '/sites/ABC',
  };

Calling the upload() method:

First create a dummy file of size 0 in the folder, to initiate the large file upload.
Then create a FileReader object and start creating chunks of the file, each with 3 parameters (offset, length, method (i.e. start/continue/finishupload)), pushing the chunks into an array.
Create a unique ID for the upload, i.e. uploadID.
Finally, call the uploadFile method.
function upload(file) {
  var docLibraryName = "/sites/ABC/Shared Documents";
  var fileName = $("#uploadInput").val().replace(/C:\\fakepath\\/i, '');
  var folderName = "";
  createDummaryFile(docLibraryName, fileName, folderName);
  var fr = new FileReader();
  var offset = 0;
  var total = file.size;
  var length = 1000000 > total ? total : 1000000;
  var chunks = [];
  fr.onload = evt => {
    while (offset < total) {
      if (offset + length > total)
        length = total - offset;
      chunks.push({
        offset,
        length,
        method: getUploadMethod(offset, length, total)
      });
      offset += length;
    }
    for (var i = 0; i < chunks.length; i++)
      console.log(chunks[i]);
    if (chunks.length > 0) {
      const id = getGuid();
      uploadFile(evt.target.result, id, docLibraryName, fileName, chunks, 0);
    }
  };
  fr.readAsArrayBuffer(file);
}
function createDummaryFile(libraryName, fileName, folderName) {
  return new Promise((resolve, reject) => {
    var endpoint = Tasks.urlName + Tasks.siteName + "/_api/web/GetFolderByServerRelativeUrl('" + libraryName + "/" + folderName + "')/Files/add(url=@TargetFileName,overwrite='true')?" +
      "&@TargetFileName='" + fileName + "'";
    const headers = {
      "accept": "application/json;odata=verbose"
    };
    performUpload(endpoint, headers, libraryName, fileName, folderName, convertDataBinaryString(0));
  });
}
function S4() {
  return (((1 + Math.random()) * 0x10000) | 0).toString(16).substring(1);
}

function getGuid() {
  return (S4() + S4() + "-" + S4() + "-4" + S4().substr(0, 3) + "-" + S4() + "-" + S4() + S4() + S4()).toLowerCase();
}

// check position for selecting method
function getUploadMethod(offset, length, total) {
  if (offset + length + 1 > total) {
    return 'finishupload';
  } else if (offset === 0) {
    return 'startupload';
  } else if (offset < total) {
    return 'continueupload';
  }
  return null;
}
Upload file method:
Convert the ArrayBuffer into blob chunks to start uploading the file.
Start the actual upload of the file chunks using the methods and the 1 MB offsets created earlier (uploadFileChunk method).
Loop over the chunks, calling the same method for each one.
function uploadFile(result, id, libraryPath, fileName, chunks, index) {
  const data = convertFileToBlobChunks(result, chunks[index]);
  var response = uploadFileChunk(id, libraryPath, fileName, chunks[index], data);
  index += 1;
  if (index < chunks.length)
    uploadFile(result, id, libraryPath, fileName, chunks, index);
}

function convertFileToBlobChunks(result, chunkInfo) {
  var arrayBuffer = chunkInfo.method === 'finishupload' ? result.slice(chunkInfo.offset) : result.slice(chunkInfo.offset, chunkInfo.offset + chunkInfo.length);
  return convertDataBinaryString(arrayBuffer);
}

function convertDataBinaryString(data) {
  var fileData = '';
  var byteArray = new Uint8Array(data);
  for (var i = 0; i < byteArray.byteLength; i++) {
    fileData += String.fromCharCode(byteArray[i]);
  }
  return fileData;
}
uploadFileChunk method to actually start uploading the file chunks:
Build the endpoint string: for startupload there is no fileOffset, while continueupload and finishupload include the fileOffset.
Call the performUpload method to start uploading via the REST API.
function uploadFileChunk(id, libraryPath, fileName, chunk, data) {
  new Promise((resolve, reject) => {
    var offset = chunk.offset === 0 ? '' : ',fileOffset=' + chunk.offset;
    var folderName = "";
    var endpoint = Tasks.urlName + Tasks.siteName + "/_api/web/getfilebyserverrelativeurl('" + libraryPath + "/" + fileName + "')/" + chunk.method + "(uploadId=guid'" + id + "'" + offset + ")";
    const headers = {
      "Accept": "application/json; odata=verbose",
      "Content-Type": "application/octet-stream"
    };
    performUpload(endpoint, headers, libraryPath, fileName, folderName, data);
  });
}

function performUpload(endpoint, headers, libraryName, fileName, folderName, fileData) {
  new Promise((resolve, reject) => {
    var digest = $("#__REQUESTDIGEST").val();
    $.ajax({
      url: endpoint,
      async: false,
      method: "POST",
      headers: headers,
      data: fileData,
      binaryStringRequestBody: true,
      success: function (data) {},
      error: err => reject(err.responseText)
    });
  });
}
Please suggest why the uploaded file is corrupted and why its size differs from the actual file. Thanks in advance.
I had the same problem with this code. I changed convertFileToBlobChunks to just return the ArrayBuffer:
function convertFileToBlobChunks(result, chunkInfo) {
  var arrayBuffer = chunkInfo.method === 'finishupload' ?
    result.slice(chunkInfo.offset) : result.slice(chunkInfo.offset, chunkInfo.offset + chunkInfo.length);
  return arrayBuffer;
}
I also removed "Content-Type": "application/octet-stream" from the header.
After doing that it uploaded fine.
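The chunk bookkeeping from the question can be verified in isolation, which helps rule it out as the source of corruption. Below is a self-contained copy of the question's getUploadMethod plus its chunk-building loop; the 2.5 MB total and 1 MB chunk size are illustrative values of mine:

```javascript
// Same method-selection logic as the question's getUploadMethod.
function getUploadMethod(offset, length, total) {
  if (offset + length + 1 > total) return 'finishupload';
  if (offset === 0) return 'startupload';
  if (offset < total) return 'continueupload';
  return null;
}

// Same chunking loop as in upload(): walk the file in fixed-size steps,
// shrinking the final chunk to fit the remaining bytes.
function buildChunks(total, size) {
  const chunks = [];
  let offset = 0;
  let length = size > total ? total : size;
  while (offset < total) {
    if (offset + length > total) length = total - offset;
    chunks.push({ offset, length, method: getUploadMethod(offset, length, total) });
    offset += length;
  }
  return chunks;
}
```

For a 2.5 MB file with 1 MB chunks this yields startupload, continueupload, finishupload with a final 0.5 MB chunk, which is the expected sequence; the corruption therefore comes from the binary-string conversion rather than the chunking, consistent with the answer above.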

Express Deprecated

I have a photo app that uploads photos to AWS. When testing the photo upload feature on my localhost, the terminal prints the following deprecation warning:

  express deprecated res.send(status, body): Use
  res.status(status).send(body) instead aws/aws.js:50:18

My photos DO save to AWS; I'm just wondering what this warning is and how to fix it. Below is the AWS code that the warning refers to.
'use strict';
var AWS = require('aws-sdk'),
  crypto = require('crypto'),
  config = require('./aws.json'),
  createS3Policy,
  getExpiryTime;

getExpiryTime = function () {
  var _date = new Date();
  return '' + (_date.getFullYear()) + '-' + (_date.getMonth() + 1) + '-' +
    (_date.getDate() + 1) + 'T' + (_date.getHours() + 3) + ':' + '00:00.000Z';
};

createS3Policy = function(contentType, callback) {
  var date = new Date();
  var s3Policy = {
    'expiration': getExpiryTime(),
    'conditions': [
      ['starts-with', '$key', 'images/'],
      {'bucket': config.bucket},
      {'acl': 'public-read'},
      ['starts-with', '$Content-Type', contentType],
      {'success_action_status' : '201'}
    ]
  };
  // stringify and encode the policy
  var stringPolicy = JSON.stringify(s3Policy);
  var base64Policy = Buffer.from(stringPolicy, 'utf-8').toString('base64');
  // sign the base64 encoded policy
  var signature = crypto.createHmac('sha1', config.secretAccessKey)
    .update(Buffer.from(base64Policy, 'utf-8')).digest('base64');
  // build the results object
  var s3Credentials = {
    s3Policy: base64Policy,
    s3Signature: signature,
    AWSAccessKeyId: config.accessKeyId
  };
  // send it back
  callback(s3Credentials);
};

exports.getS3Policy = function(req, res) {
  createS3Policy(req.query.mimeType, function (creds, err) {
    if (!err) {
      return res.send(200, creds);
    } else {
      return res.send(500, err);
    }
  });
};
Replace res.send(statusCode, "something") with res.status(statusCode).send("something")
This should do it for your code:
exports.getS3Policy = function(req, res) {
  createS3Policy(req.query.mimeType, function (creds, err) {
    if (!err) {
      return res.send(creds); // status 200 is not needed here; Express defaults to it
    } else {
      return res.status(500).send(err);
    }
  });
};