Dropbox returns a 403 error when logging in using dropboxuploader.php

Hey, I'm using dropboxuploader.php to log into Dropbox. Everything was working fine, but when I came into work yesterday I could no longer connect. This is what Dropbox returns to me:
HTTP/1.1 100 Continue
HTTP/1.1 403 Forbidden
Server: nginx/1.2.3
Date: Thu, 04 Oct 2012 08:44:36 GMT
Content-Type: text/html
Transfer-Encoding: chunked
Connection: keep-alive
It seems you tried to do something we can't verify. Did you log into a different Dropbox account in a different window? Try clicking here to go back to the page you came from, or just go home.

Replace the login function with the code below and it should work:
protected function login() {
    // Fetch the login page and scrape the hidden CSRF token "t".
    $data = $this->request('https://www.dropbox.com/login');
    $str = '<input type="hidden" name="t" value="';
    $start = strpos($data, $str);
    $val = "";
    if ($start !== false) {
        $val = substr($data, $start + strlen($str), 24);
    }
    // Re-submit the credentials together with the token.
    $data = $this->request('https://www.dropbox.com/login', true, array(
        'login_email'    => $this->email,
        'login_password' => $this->password,
        't'              => $val,
    ));
    if (stripos($data, 'location: /home') === false) {
        throw new Exception('Login unsuccessful.');
    }
    $this->loggedIn = true;
}

Alternatively, just update your DropboxUploader file instead of applying the fix yourself:
https://github.com/jakajancar/DropboxUploader
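As a side note, the fixed-length substr(..., 24) assumes the token is always exactly 24 characters. A regex-based extraction is less brittle; this is a sketch that assumes the login page still embeds the token as a hidden input named "t" (the helper name extractToken is hypothetical):

```php
<?php
// Sketch: extract the hidden CSRF token "t" with a regex instead of a
// fixed-length substr, so a change in token length does not break login.
// The markup pattern is assumed from the snippet in the answer above.
function extractToken(string $html): string
{
    if (preg_match('/name="t"\s+value="([^"]*)"/', $html, $m)) {
        return $m[1];
    }
    return '';
}
```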

Related

How do I implement the sendgrid.env file?

I am trying to implement the SendGrid email API.
I have followed all the instructions I can find to set it up, but all I get is this:
Response status: 401
Response Headers
- HTTP/1.1 401 Unauthorized
- Server: nginx - Date: Wed, 02 Mar 2022 21:56:45 GMT
- Content-Type: application/json
- Content-Length: 88
- Connection: keep-alive
- Access-Control-Allow-Origin: https://sendgrid.api-docs.io
- Access-Control-Allow-Methods: POST
- Access-Control-Allow-Headers: Authorization, Content-Type, On-behalf-of, x-sg-elas-acl
- Access-Control-Max-Age: 600
- X-No-CORS-Reason: https://sendgrid.com/docs/Classroom/Basics/API/cors.html
- Strict-Transport-Security: max-age=600; includeSubDomains
I installed the SendGrid helper library: I ran the Composer command, which created a composer.json file in the root of my project and installed the helper library for PHP, along with its dependencies, in a new directory named vendor.
I then used the following to create sendgrid.env:
echo "export SENDGRID_API_KEY='SG.XXX....XXXX'" > sendgrid.env
echo "sendgrid.env" >> .gitignore
source ./sendgrid.env
The API key works fine when I test it using curl --request POST, and the email comes through OK.
But I have tried every combination I can think of to integrate sendgrid.env (changing its location, removing the single quotes, etc.), and I just get the same error message every time.
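For what it's worth, `source ./sendgrid.env` only exports the variable into that one shell session; a PHP process started by Apache or PHP-FPM never inherits it, which would leave getenv('SENDGRID_API_KEY') returning false and produce exactly this 401. A minimal sketch of reading the key from the file at runtime instead (the helper name loadSendgridKey and the file location are assumptions):

```php
<?php
// Sketch: parse sendgrid.env directly, since variables exported with
// `source` in a terminal are invisible to a web-server-launched PHP process.
// Assumes the file format produced by the question's echo command:
//   export SENDGRID_API_KEY='SG.XXX....XXXX'
function loadSendgridKey(string $envFile): ?string
{
    $contents = @file_get_contents($envFile);
    if ($contents === false) {
        return null;
    }
    if (preg_match("/SENDGRID_API_KEY='([^']+)'/", $contents, $m)) {
        putenv('SENDGRID_API_KEY=' . $m[1]); // make getenv() work downstream
        return $m[1];
    }
    return null;
}
```

With something like this, `new \SendGrid(loadSendgridKey(__DIR__ . '/sendgrid.env'))` would work regardless of the shell environment; a library such as vlucas/phpdotenv is the more common way to do the same thing.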
Here is my php script to send the email:
declare(strict_types=1);

require 'vendor/autoload.php';

use \SendGrid\Mail\Mail;

$email = new Mail();
// Replace the email address and name with your verified sender
$email->setFrom('paul@xxx.com', 'Paul xxx');
$email->setSubject('Sending with Twilio SendGrid is Fun');
// Replace the email address and name with your recipient
$email->addTo('paul@yyy.com', 'Paul yyy');
$email->addContent(
    'text/html',
    '<strong>and fast with the PHP helper library.</strong>'
);

$sendgrid = new \SendGrid(getenv('SENDGRID_API_KEY'));
try {
    $response = $sendgrid->send($email);
    printf("Response status: %d\n\n", $response->statusCode());
    $headers = array_filter($response->headers());
    echo "Response Headers\n\n";
    foreach ($headers as $header) {
        echo '- ' . $header . "\n";
    }
} catch (Exception $e) {
    echo 'Caught exception: ' . $e->getMessage() . "\n";
}
This is the structure of the files (screenshot not reproduced here).
My domain and email have been verified on Sendgrid.
I have a feeling that there's another step involved to get the API key from the sendgrid.env file?
Thanks in advance.

ASP.NET Core server serves static files without specifying any charset in the Content-Type response header

In my Blazor Server app (.NET 6.0) I serve some static files and show them by embedding them in an iframe (so the browser deals with the file type directly; it can be an image, video, sound, PDF, etc.).
I noticed an encoding problem with accented characters in the .txt and .html files when they are shown in the iframe.
I tried to insert a <meta charset> tag inside the head of the iframe, but got the same result.
I noticed these response headers on the HTTP call to the file:
accept-ranges: bytes
content-encoding: br
content-type: text/plain
date: Thu, 23 Dec 2021 13:15:47 GMT
etag: "1d7f7ff334b008b"
last-modified: Thu, 23 Dec 2021 13:15:48 GMT
server: Kestrel
vary: Accept-Encoding
I'm surprised there is no utf-8 specified in the Content-Type header; I'm wondering if this is the source of my problem. I expected something like content-type: text/plain; charset=utf-8.
I tried to play with the StaticFileOptions in Startup to change the headers, but even passing empty options breaks the app at startup:
app.UseStaticFiles(new StaticFileOptions {
});
// Even doing this breaks the app: when it starts, blazor.server.js comes back 404 on the client side.
So I can't really do anything here.
To serve my static files, I use a virtual directory in this manner:
app.UseFileServer(new FileServerOptions
{
    FileProvider = new PhysicalFileProvider(Sys.Web.AppliIs.Path_Webdav),
    RequestPath = new PathString("/" + Sys.Web.AppliIs.WEBDAV_FOLDER),
    EnableDirectoryBrowsing = false
});
I noticed I don't have the encoding problem when I open the link directly in Chrome; it only happens inside my iframe, and I can't explain that.
Thanks for your help.
I was able to override OnPrepareResponse to add the charset. I don't know why, but I had to use an ISO charset to resolve the encoding problems:
var lOptions = new FileServerOptions
{
    FileProvider = new PhysicalFileProvider(Sys.Web.AppliIs.Path_Webdav),
    RequestPath = new PathString("/" + Sys.Web.AppliIs.WEBDAV_FOLDER),
    EnableDirectoryBrowsing = false,
};
lOptions.StaticFileOptions.OnPrepareResponse = (context) =>
{
    var headers = context.Context.Response.Headers;
    var contentType = headers["Content-Type"];
    contentType += "; charset=ISO-8859-1";
    headers["Content-Type"] = contentType;
};
app.UseFileServer(lOptions);
For me the subject is closed, but if anybody knows why I have to specify this charset, I'm still interested.
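One possible explanation: the charset label has to match how the files were actually saved on disk, and files written by older Windows tools are often ANSI (Windows-1252/Latin-1) rather than UTF-8, which would be why the ISO-8859-1 label fixes the accents. As an alternative to appending to the header in OnPrepareResponse, the charset can be declared in the MIME mapping itself; a sketch (config fragment, reusing lOptions and app from the code above; the extension choices are assumptions) using FileExtensionContentTypeProvider:

```csharp
using Microsoft.AspNetCore.StaticFiles;

// Sketch: bake the charset into the content-type mapping instead of
// rewriting the header after the fact. Pick the charset that matches
// the real on-disk encoding of the files (ISO-8859-1 for ANSI files,
// utf-8 for files saved as UTF-8).
var provider = new FileExtensionContentTypeProvider();
provider.Mappings[".txt"] = "text/plain; charset=ISO-8859-1";
provider.Mappings[".html"] = "text/html; charset=ISO-8859-1";
lOptions.StaticFileOptions.ContentTypeProvider = provider;
app.UseFileServer(lOptions);
```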

Ionic CORS Error, But Server Has CORS Enabled

I have an Ionic 4 app that uses a lambda API hosted on AWS. CORS is enabled on the API Gateway. The following snippet is from a curl request to the API.
< content-type: application/json
< content-length: 42
< date: Sat, 16 Feb 2019 02:19:25 GMT
< x-amzn-requestid: 47a5fcac-3191-11e9-af42-d387861aa6ad
< access-control-allow-origin: *
< access-control-allow-headers: Content-Type,X-Amz-Date,Authorization,X-Api-Key,X-Amz-Security-Token
< x-amz-apigw-id: VK7vFGc4oAMFTqg=
< access-control-allow-methods: POST,OPTIONS
This post discusses a few possible workarounds (changing the content type, etc.), but they don't work.
Changing the Content-Type header to text/plain or removing that header altogether makes no difference.
The following error also appears in the Ionic console:
Cross-Origin Read Blocking (CORB) blocked cross-origin response
https://mycoolapi.com/GetLegal with MIME type application/json.
See https://www.chromestatus.com/feature/5629709824032768 for more details.
The following is my service code.
getLegal(data: any) {
    return new Promise((resolve, reject) => {
        let httpHeaders = new HttpHeaders().set('Content-Type', 'application/json');
        this.httpClient.post(this.apiUrl + '/GetLegal', JSON.stringify(data), {
            headers: httpHeaders,
        })
        .subscribe(res => {
            resolve(new LegalResponse(res));
        }, (err) => {
            console.log("Oops, there has been an error");
            reject(err);
        });
    });
}
Help?
This ended up being a bug on the Amazon side. The curl snippet was from a GET method, which was sending the CORS headers; the POST method was not. After redeploying the API without changing anything, the GET method no longer sent the CORS headers and the POST method did. The application is working, for now.
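A mismatch like this (curl shows the CORS headers but the browser request fails) can be caught early by curling the exact method and preflight the browser uses, rather than a plain GET. A sketch, using the URL from the question and assuming a typical local Ionic dev origin:

```shell
# What the browser sends before the POST: an OPTIONS preflight.
curl -i -X OPTIONS https://mycoolapi.com/GetLegal \
  -H "Origin: http://localhost:8100" \
  -H "Access-Control-Request-Method: POST" \
  -H "Access-Control-Request-Headers: Content-Type"

# Then the POST itself, to compare which CORS headers each method returns.
curl -i -X POST https://mycoolapi.com/GetLegal \
  -H "Origin: http://localhost:8100" \
  -H "Content-Type: application/json" \
  -d '{}'
```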

PhantomJS returns status 200 on localhost but 403 on live server

I have to scrape HTML documents from a given URL. On my localhost the PhantomJS script returns the page fine, but on the live server I get a 403 Forbidden status.
scraper.js
var system = require('system');
var page = require('webpage').create();

var url = system.args[1];
page.open(url, function(status) {
    if (status === "success") {
        var content = page.content;
        console.log(content);
    }
    phantom.exit();
});
PhantomJS command:
phantomjs scraper.js http://www.submarino.com.br/produto/126862765/
The scraper works fine on other pages, but the domains www.submarino.com.br and www.americanas.com.br don't work. I know it has something to do with Akamai. The response with the error output is:
Response (#1, stage "start"): {"body":"","bodySize":300,"contentType":"text/html","headers":[{"name":"Server","value":"AkamaiGHost"},{"name":"Mime-Version","value":"1.0"},{"name":"Content-Type","value":"text/html"},{"name":"Content-Length","value":"300"},{"name":"Expires","value":"Wed, 10 Aug 2016 00:38:13 GMT"},{"name":"Date","value":"Wed, 10 Aug 2016 00:38:13 GMT"},{"name":"Connection","value":"close"},{"name":"Set-Cookie","value":"MobileOptOut=1; path=/; domain=submarino.com.br\nb2wChannel=INTERNET; path=/; domain=submarino.com.br"},{"name":"Vary","value":"Accept-Encoding, User-Agent"}],"id":1,"redirectURL":null,"stage":"start","status":403,"statusText":"Forbidden","time":"2016-08-10T00:38:13.540Z","url":"http://www.submarino.com.br/produto/126862765/"}
Response (#1, stage "end"): {"contentType":"text/html","headers":[{"name":"Server","value":"AkamaiGHost"},{"name":"Mime-Version","value":"1.0"},{"name":"Content-Type","value":"text/html"},{"name":"Content-Length","value":"300"},{"name":"Expires","value":"Wed, 10 Aug 2016 00:38:13 GMT"},{"name":"Date","value":"Wed, 10 Aug 2016 00:38:13 GMT"},{"name":"Connection","value":"close"},{"name":"Set-Cookie","value":"MobileOptOut=1; path=/; domain=submarino.com.br\nb2wChannel=INTERNET; path=/; domain=submarino.com.br"},{"name":"Vary","value":"Accept-Encoding, User-Agent"}],"id":1,"redirectURL":null,"stage":"end","status":403,"statusText":"Forbidden","time":"2016-08-10T00:38:13.541Z","url":"http://www.submarino.com.br/produto/126862765/"}
When it works fine it returns:
Response (#1, stage "start"): {"body":"","bodySize":30076,"contentType":"text/html;charset=UTF-8","headers":[{"name":"Content-Encoding","value":"gzip"},{"name":"Content-Type","value":"text/html;charset=UTF-8"},{"name":"Server","value":"Apache-Coyote/1.1"},{"name":"X-Powered-By","value":"JSF/1.2"},{"name":"x-tid","value":"CATALOGO-0d4d336f-c0f1-4b71-9663-28fa89b5c123"},{"name":"Cache-Control","value":"max-age=1800"},{"name":"Expires","value":"Wed, 10 Aug 2016 01:10:18 GMT"},{"name":"Date","value":"Wed, 10 Aug 2016 00:40:18 GMT"},{"name":"Connection","value":"keep-alive"},{"name":"Set-Cookie","value":"MobileOptOut=1; path=/; domain=submarino.com.br\nb2wChannel=INTERNET; path=/; domain=submarino.com.br"},{"name":"Vary","value":"Accept-Encoding, User-Agent"}],"id":1,"redirectURL":null,"stage":"start","status":200,"statusText":"OK","time":"2016-08-10T00:40:18.388Z","url":"http://www.submarino.com.br/produto/126862765/"}
Response (#1, stage "end"): {"contentType":"text/html;charset=UTF-8","headers":[{"name":"Content-Encoding","value":"gzip"},{"name":"Content-Type","value":"text/html;charset=UTF-8"},{"name":"Server","value":"Apache-Coyote/1.1"},{"name":"X-Powered-By","value":"JSF/1.2"},{"name":"x-tid","value":"CATALOGO-0d4d336f-c0f1-4b71-9663-28fa89b5c123"},{"name":"Cache-Control","value":"max-age=1800"},{"name":"Expires","value":"Wed, 10 Aug 2016 01:10:18 GMT"},{"name":"Date","value":"Wed, 10 Aug 2016 00:40:18 GMT"},{"name":"Connection","value":"keep-alive"},{"name":"Set-Cookie","value":"MobileOptOut=1; path=/; domain=submarino.com.br\nb2wChannel=INTERNET; path=/; domain=submarino.com.br"},{"name":"Vary","value":"Accept-Encoding, User-Agent"}],"id":1,"redirectURL":null,"stage":"end","status":200,"statusText":"OK","time":"2016-08-10T00:40:18.390Z","url":"http://www.submarino.com.br/produto/126862765/"}
I attempted cURLing this site from hurl.it and other cURL services, and they can access the URL. Is there something I can do? This is driving me crazy!
Most likely it's a geographic or suspicious-IP-range restriction. I tried to open the URL just now and was also denied the page, then accessed it via an American proxy and was able to open it. Just use an American or Brazilian proxy.
Also, when scraping, it's important to mimic real browser behaviour as closely as possible, so I'd suggest you add user-agent and viewport emulation to your script:
page.viewportSize = { width: 1280, height: 800 };
page.settings.userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.106 Safari/537.36";
Also, be sure to subscribe to error and console messages so you are aware of any errors and messages from the target page:
page.onConsoleMessage = function(msg) {
    console.log('CONSOLE: ' + msg);
};
page.onError = function(msg, trace) {
    console.log(msg);
    trace.forEach(function(item) {
        console.log('  ', item.file, ':', item.line);
    });
};

How to get the Location header value from a Fetch request in the browser

From a ReactJS/Redux front-end app, I'm trying to get the Location header value of a REST API response.
When I Curl this :
curl -i -X POST -H "Authorization: Bearer MYTOKEN" https://url.company.com/api/v1.0/tasks/
I get this answer:
HTTP/1.1 202 ACCEPTED
Server: nginx
Date: Fri, 12 Aug 2016 15:55:47 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 0
Connection: keep-alive
Location: https://url.company.com/api/v1.0/tasks/status/idstatus
When I make a fetch request in ReactJS:
var url = 'https://url.company.com/api/v1.0/tasks/';
fetch(url, {
    method: 'post',
    credentials: 'omit',
    headers: {
        'Authorization': `Bearer ${token}`
    }
});
I don't have any headers in the response object (the screenshot showed an empty Headers object in the Fetch response).
I tried all the response.headers functions I found in https://developer.mozilla.org/en-US/docs/Web/API/Headers :
response.headers.get('Location');
But since headers is empty, I get empty results.
Do you know why I can't get a proper Headers object filled with the header values?
Thanks to John, I've found the answer.
I just had to add
Access-Control-Expose-Headers: Location
to my response headers and it worked, so now I can access the Location value with:
response.headers.get('Location');
Thx John!
Besides exposing the Location header on the server, I could also access the location in my React application with:
response.headers.location;
(Note: this property-style access works with clients such as axios that expose headers as a plain object; the Fetch API's Headers object only supports response.headers.get('Location').)