Access local storage with different origin from document the storage is tied to - selenium

Yahoo.com has multiple origins with local storage on a single web page, as seen below. I am attempting to access the local storage that is not stored under the page URL https://www.yahoo.com, but the window.localStorage property only exposes the storage for the page's own origin.
Navigating to one of the other origins to access its local storage creates a new, empty set of local storage that is not shared with the original page and does not contain the desired data. This suggests to me that, despite being stored under different origins, all of this data belongs to the one web page; I just cannot figure out how to access it other than through the Developer Tools UI.
My end goal is to access it through Selenium, or through the local files Geckodriver creates, but I am posing the more general question.
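If the other origins' data belongs to iframes embedded in the page, one route is to switch Selenium into a frame and run the script there: the script then executes under the frame's origin and sees that origin's localStorage. A minimal sketch, assuming Firefox with geckodriver; the frame index 0 is a placeholder for whichever frame holds the data:

```python
# Sketch: read localStorage from a cross-origin iframe by switching
# into the frame first, so the script runs under the frame's origin.

DUMP_LS_JS = """
var out = {};
for (var i = 0; i < window.localStorage.length; i++) {
    var k = window.localStorage.key(i);
    out[k] = window.localStorage.getItem(k);
}
return out;
"""

def dump_local_storage(driver):
    """Return the current browsing context's localStorage as a dict."""
    return driver.execute_script(DUMP_LS_JS)

if __name__ == "__main__":
    from selenium import webdriver

    driver = webdriver.Firefox()
    driver.get("https://www.yahoo.com")
    top = dump_local_storage(driver)       # origin: www.yahoo.com
    driver.switch_to.frame(0)              # placeholder: pick the frame you need
    framed = dump_local_storage(driver)    # origin: that iframe's origin
    driver.switch_to.default_content()
    driver.quit()
```

If the data is not inside any iframe, this won't reach it; the other route is reading the profile Geckodriver creates on disk (recent Firefox versions keep per-origin localStorage under storage/default/<origin>/ in the profile directory), but that layout is an implementation detail of Firefox rather than a stable interface.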

Related

Google Cloud Storage: Alternative to signed URLs for folders

Our application's data storage is backed by Google Cloud Storage (and S3 and Azure Blob Storage). We need to give arbitrary outside tools access to this storage (uploads from local disk using CLI tools, unloads from analytical databases like Redshift, Snowflake, and others). The specific use case is that users need to upload multiple big files (think of it much like m3u8 playlists for streaming video: one m3u8 playlist and thousands of small video files). The tools and users MAY not be affiliated with Google in any way (they may not have a Google account). We also absolutely need the data transfer to go directly to the storage, bypassing our servers.
In S3 we use federation tokens to give access to a part of the S3 bucket.
So model scenario on AWS S3:
customer requests some data upload via our API
we give the customer S3 credentials that are scoped to s3://customer/project/uploadId, allowing upload of new files
client uses any tool to upload the data
client uploads s3://customer/project/uploadId/file.manifest, s3://customer/project/uploadId/file.00001, s3://customer/project/uploadId/file.00002, ...
other data (be it other uploadId or project) in the bucket is safe because the given credentials are scoped
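The scoped-credentials step above can be sketched with boto3's get_federation_token; the bucket and prefix names are the question's placeholders, and the policy shape is standard IAM:

```python
import json

def scoped_upload_policy(bucket, prefix):
    """IAM policy that only allows uploading new objects under one prefix,
    so other uploadIds/projects in the bucket stay out of reach."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/{prefix}/*"],
        }],
    }

if __name__ == "__main__":
    import boto3

    sts = boto3.client("sts")
    creds = sts.get_federation_token(
        Name="upload-session",
        Policy=json.dumps(scoped_upload_policy("customer", "project/uploadId")),
        DurationSeconds=3600,
    )["Credentials"]
    # Hand AccessKeyId / SecretAccessKey / SessionToken to the client tool.
```

The effective permissions are the intersection of the passed policy and the caller's own permissions, which is what keeps the rest of the bucket safe.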
In ABS we use STS token for the same purpose.
GCS does not seem to have anything similar, except for signed URLs. Signed URLs, however, have the problem that each one refers to a single file. That would either require us to know in advance how many files will be uploaded (we don't) or require the client to request a signed URL for each file separately (a strain on our API, and slow).
ACLs seemed like a solution, but they are tied only to Google-related identities, which can't be created on demand quickly. Service accounts are also an option, but their creation is slow and, as I understand it, they are generally discouraged for this use case.
Is there a way to create short-lived credentials that are limited to a subset of a GCS bucket?
The ideal scenario would be for the service account our app uses to generate a short-lived token that only has access to a subset of the bucket, but nothing like that seems to exist.
Unfortunately, no. For retrieving objects, signed URLs need to refer to exact objects; you'd need to generate one per object.
The gsutil signurl command does accept wildcards: for example, to target the objects under a folder you could run gsutil signurl -d 120s key.json gs://bucketname/folderName/**, which matches every object under folderName. However, it produces a separate signed URL for each matched file, not a single URL for the entire folder/subdirectory.
Reason: subdirectories are just an illusion of folders in a bucket; they are really object names that contain a '/'. Every file in a "subdirectory" therefore gets its own signed URL, and there is no way to create a single signed URL that makes a whole subdirectory temporarily available.
There is an ongoing feature request for this: https://issuetracker.google.com/112042863. Please raise your concern there and watch for updates.
For now, one way to accomplish this would be to write a small App Engine app that clients download from instead of fetching directly from GCS. It would check authentication according to whatever mechanism you're using and then, if the check passes, generate a signed URL for the resource and redirect the user.
Reference : https://stackoverflow.com/a/40428142/15803365
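Since GCS only signs individual objects, a practical workaround for the upload case is to enumerate the expected object names up front and sign them in a batch, returning the whole map from one API call. A sketch using the google-cloud-storage client; the bucket and folder names are the question's placeholders, and service-account credentials are assumed:

```python
from datetime import timedelta

def upload_object_names(prefix, count):
    """Enumerate the object names for a multi-part upload: one manifest
    plus numbered parts (mirrors the m3u8-style layout in the question)."""
    names = [f"{prefix}/file.manifest"]
    names += [f"{prefix}/file.{i:05d}" for i in range(1, count + 1)]
    return names

if __name__ == "__main__":
    from google.cloud import storage

    client = storage.Client()  # assumes service-account JSON credentials
    bucket = client.bucket("bucketname")
    signed = {
        name: bucket.blob(name).generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=2),
            method="PUT",
        )
        for name in upload_object_names("folderName", 2)
    }
    # Return `signed` to the client; it PUTs each file to its own URL.
```

This still means one URL per object, but the client only makes one request to your API, which removes most of the strain mentioned in the question.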

Delayed upload to OneDrive

I created a web service consisting of a server part and a client part. The server part generates a file which should be saved automatically to OneDrive. The file is quite large and its content changes frequently (which prevents me from saving the file after each modification).
I checked the onedrive api and currently see the following two solutions:
Authenticate the client via the „token flow“. In this case, the problem is that there is (as far as I know) no reliable way to trigger the upload to OneDrive before the browser is closed. (It should work on desktop, iOS, …)
Authenticate the server via the „code flow“. In this case the server can save the file to OneDrive even after the browser has been closed. The problem is that the server has to keep a record of all authenticated users and their long-term refresh tokens, which could be a huge security risk I would like to avoid.
Are there any other solutions to this problem?
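For context on the code-flow variant: once the server holds a token, the upload itself is a single Microsoft Graph request. A sketch, with the token acquisition elided and the file path being a placeholder:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def simple_upload_url(path):
    """Graph 'simple upload' endpoint, suitable for files up to ~4 MB.
    Larger files should use an upload session (createUploadSession)."""
    return f"{GRAPH}/me/drive/root:/{path}:/content"

if __name__ == "__main__":
    import requests

    token = "..."  # obtained server-side via the OAuth code flow
    with open("output.bin", "rb") as f:
        resp = requests.put(
            simple_upload_url("reports/output.bin"),
            headers={"Authorization": f"Bearer {token}"},
            data=f,
        )
    resp.raise_for_status()
```

Since the file here is described as large, the upload-session variant is the one that would actually apply; the simple PUT is shown only to keep the sketch short.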

local storage value on checkout page

I have stored a value in local storage in Shopify, but when I went to the checkout page the value from local storage was gone. How can I get the local storage value on the checkout page?
Unless you have your own HTTPS domain set up, you will be checking out from a different domain. Like cookies, localStorage is domain-dependent: you can only retrieve data from the domain you stored it in. And because you have only limited control over the checkout, you will find it difficult to implement most of the common cross-domain communication strategies.
You are pretty limited as far as affecting the checkout goes. You get access to the order again on the Thank You page, and script tags from apps become active again there, but you will still be on the Shopify domain, so you cannot access any of the original local storage.

Unable to access shared folder in one drive for business

We are building an application using the OneDrive API v2.0.
But we are unable to see or access folders that other users have shared with the current user when requesting /me/drive/root/children.
Can someone please suggest how to access folders shared by someone else? Our API will go on to create folders and files inside these shared folders.
Shared files don't show up in the user's drive (/me/drive/root in your example) but are available through the sharedWithMe view:
GET /drive/sharedWithMe
This returns a collection of items that have been shared with the owner of the drive (or the current user in this case). Each item returned will include the remoteItem property which contains a reference to the actual item. If you want to access the actual item, then you need to build a request like this:
GET /drives/{item.remoteItem.parentReference.driveId}/items/{item.remoteItem.id}
Which will access the actual shared item (not the link to the remote item which is returned by sharedWithMe).
There's more information about Shared With Me on the OneDrive dev portal.
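The two requests above can be chained; here is a sketch using the Microsoft Graph endpoints (which mirror the OneDrive API paths above), with token acquisition elided:

```python
GRAPH = "https://graph.microsoft.com/v1.0"

def remote_item_url(shared_item):
    """Build the URL of the actual shared item from a sharedWithMe entry,
    using its remoteItem reference as described above."""
    remote = shared_item["remoteItem"]
    drive_id = remote["parentReference"]["driveId"]
    return f"{GRAPH}/drives/{drive_id}/items/{remote['id']}"

if __name__ == "__main__":
    import requests

    headers = {"Authorization": "Bearer <token>"}  # token acquisition elided
    shared = requests.get(f"{GRAPH}/me/drive/sharedWithMe", headers=headers).json()
    for item in shared.get("value", []):
        print(remote_item_url(item))
```

Requests against the resulting URL (children listings, folder creation, and so on) then operate on the actual shared item rather than on the link in the user's own drive.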
The request Greg specified works fine. I'm not sure about the scopes he mentioned, but to access the shared files you need to add the current user (the one making the request) as a secondary administrator of the other user (in this case the user who shared the file) in SharePoint My Site administration.
I suggest using the Graph API for accessing files from OneDrive, as its responses are more consistent than those of the OneDrive API. Both APIs, though, require the secondary-administrator setting to access the files.

How can you rewrite an internal link access from external hostname?

What if I have some internal links on my XAMPP web server that need to be accessed externally (by requests from the internet)?
Let's say http://mylocalsite.local/login/ is a link on the site.
If I click this link from http://externalwebsiteaddress.com, which redirects to my web server, can the internal link (mylocalsite) be rewritten to http://externalwebsiteaddress.com/login? Currently, external requests give a "Cannot access this page" error because the hostname is internal.
I don't want the internal link to redirect to the external link when accessed from within the network.
Basically, if the links are internal, I need a way to serve the request from the external website successfully, without having to physically change the links on the server.
In other words:
If the user is accessing from an external internet link, redirect to the proper folder using the external link so it resolves properly.
EX: All links start with http://externalwebsiteaddress/wherever
If the user is accessing from inside the network, use the local links.
EX: All links start with http://internaladdress/wherever
Would this be a mod_rewrite job? I am confusing myself way too much.
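One thing worth noting before reaching for mod_rewrite: it only matches incoming request URLs, so it cannot change the links inside the HTML your server sends back. To rewrite the emitted links for external visitors, one option is mod_substitute, which filters the response body. A sketch, untested against your setup; the hostnames are the ones from the question:

```apache
# Rewrite internal hostnames in outgoing HTML. Requires mod_substitute
# (and mod_filter) to be enabled.
AddOutputFilterByType SUBSTITUTE text/html
Substitute "s|http://mylocalsite.local|http://externalwebsiteaddress.com|i"
```

To keep internal clients on the internal links, place these directives only in the virtual host that serves external traffic (or guard them with an Apache 2.4 <If> block on the Host header). The cleaner long-term fix is to emit relative links in the first place, so the same HTML works under either hostname.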