As per the official documentation, I created the eventual directory and the symbolic links (_add pointing to the filestore and _pre pointing to _pre) inside it, but the automatic migration does not happen. I am running the Docker container of Artifactory Pro version 6.23.13. I waited overnight for the migration, but it never started, and Artifactory was only serving 4 artifacts.
Answering my own question: I had initially created the eventual directory and links under /var/opt/jfrog/artifactory, which is the home directory mounted into the Docker container. It turns out there is another path inside the container, /opt/jfrog/artifactory, and creating the directory and links under that path did the trick.
I have a project built in ReactJS, and I am using S3 and CloudFront for deployment.
The issue is that whenever I deploy code, it takes a long time for the changes to show up; sometimes I have to manually clear browser cookies to see the latest changes. Do I need to change any S3 or CloudFront settings?
Follow these steps:
Go to CloudFront.
Create an invalidation of the objects for your distribution.
Add the entry /* so every cached object is invalidated. This can also be scripted, as in the sketch below.
Reference: https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-serving-outdated-content-s3/
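If you deploy often, a minimal sketch of automating the same invalidation with the AWS SDK for JavaScript (v2) is below; the distribution ID is a placeholder and credentials are assumed to already be configured in your environment.

// invalidate.js -- sketch: create a CloudFront invalidation for all paths
const AWS = require('aws-sdk');
const cloudfront = new AWS.CloudFront();

cloudfront.createInvalidation({
  DistributionId: 'E1EXAMPLE123',              // placeholder: your distribution ID
  InvalidationBatch: {
    CallerReference: `deploy-${Date.now()}`,   // must be unique per invalidation request
    Paths: { Quantity: 1, Items: ['/*'] }      // invalidate every cached object
  }
}, (err, data) => {
  if (err) console.error('Invalidation failed:', err);
  else console.log('Invalidation created:', data.Invalidation.Id);
});

You could run this as the last step of your deploy script, right after syncing the new build to S3.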
I am using an S3 bucket to host my website. Whenever I release a new version, I want all clients to download it from S3 instead of reading it from their browser cache. I know I can set an expiry time on the objects in the S3 bucket, but that is not an ideal solution, since users would still see cached content for a period of time. Is there a way to force the browser to re-download content when it changes in the S3 bucket?
Irrespective of whether you are hosting on an S3 bucket or any other server, caching can be controlled by appending a content hash to the file name.
For example, your JS bundle name would look like bundle.7e2c49a622975ebd9b7e.js.
When you deploy again, it changes to some other hash value, e.g. bundle.205199ab45963f6a62ec.js.
Because the file name changes, the browser knows a new file has arrived and downloads it again instead of using its cached copy.
This can be done easily with any popular bundler such as webpack, Gulp, or Grunt.
webpack example:
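A minimal sketch of a webpack configuration that does this, assuming webpack 5 with html-webpack-plugin; the entry point, output directory, and template path are placeholders for your project.

// webpack.config.js -- sketch: content-hashed bundle names
const path = require('path');
const HtmlWebpackPlugin = require('html-webpack-plugin');

module.exports = {
  entry: './src/index.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.[contenthash].js',  // hash changes whenever the bundle content changes
    clean: true                           // webpack 5: remove stale hashed bundles on each build
  },
  plugins: [
    // regenerates index.html so it always references the freshly hashed bundle name
    new HtmlWebpackPlugin({ template: './src/index.html' })
  ]
};

With this setup only index.html needs a short (or no-cache) expiry; the hashed bundles can be cached aggressively because a new deploy always produces new file names.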
I am working on a client project where an Akamai CDN is configured. They use Amazon S3 for hosting.
Problem:
I committed the code to a branch and can see the changes deployed in the codebase on the server.
I then hit the server URL in a browser to verify my code change.
I could not see the UI change as expected.
I observed that the CSS file URL includes query parameters (i.e.: server.com/css/filename.css??browserId=other&themeId=AbcTheme_WAR_abctheme&?t=125786954258&languageId=en_US&b=8569&t=1259648753695).
I then open the same URL in the browser, but with the query parameters removed from the file URL.
This time I can see my changes in the same file.
Questions:
Is this an issue related to the CDN?
Is the CDN managing different versions of the same file to be served?
If so, my changes should be merged into the latest version of the file that the webpage references, i.e. the one with the URL query parameters.
I know the CDN takes time to refresh pages, but I am trying to verify my changes 48 hours after the deployment.
Any help would be appreciated.
Thanks.
My question is similar to this one, but the solutions provided haven't helped me: Force applicationCache to reload cached files
Here's the rundown. I currently have a Sencha Touch application hosted on S3, and there's a problem that requires an update to the index.html file.
In order to enable offline access to the app, I've cached index.html in cache.appcache. Below is my cache.appcache file:
CACHE MANIFEST
# 127476e50461cf415c27fb33d81914faab1fc687
index.html
# 364c8e0f0cc7c9922d0019d083b4abba7d519e1c
resources/images/ajax-loader.gif
# 4028c1082f32387af25e2399aae7173ed0a51cf4
resources/images/cloud_download.png
# 40454710d633ca15b65d891d3842d3ef8b2136bf
resources/images/delete1.png
# 62c6a1ec578fa7d1d7a3117c2a84c5195c33ddb8
resources/images/loading.png
# ad85882c6285881966307da8da97ff597de9a486
resources/images/loadingbg.gif
# d2abb7549cd282c1e3fec6e9249d1e51ad5ec75d
resources/images/logo.png
FALLBACK:
NETWORK:
*
In hindsight, to enable offline access I should probably have left index.html as a non-cached network file with a fallback to some 'offline.html' file, but the app has been deployed for a while and I need to make a change to index.html. I just can't get the file to update, not even on my local machine by clearing the cache, and not by using private browsing as per the link above. I need to be able to change the file without the user having to do anything to receive the changes.
Here's what I've tried:
1) I removed index.html from the cache manifest and uploaded it. When I did that, and reloaded, the browser picked up the updated cache manifest and downloaded the files:
Application Cache Progress event (0 of 6) http://m.example.com/resources/images/loading.png (index):1
Application Cache Progress event (1 of 6) http://m.example.com/resources/images/ajax-loader.gif (index):1
Application Cache Progress event (2 of 6) http://m.example.com/resources/images/cloud_download.png (index):1
Application Cache Progress event (3 of 6) http://m.example.com/resources/images/loadingbg.gif (index):1
Application Cache Progress event (4 of 6) http://m.example.com/resources/images/delete1.png (index):1
Application Cache Progress event (5 of 6) http://m.example.com/resources/images/logo.png (index):1
Application Cache Progress event (6 of 6) http://m.example.com/ (index):1
Thankfully that means the browser isn't caching the manifest file, but unfortunately, even though it re-downloaded the index, the file didn't update. I've confirmed the file was properly uploaded to the S3 bucket, and if I download it the changes are there; but even after reloading the browser and clearing the browser cache multiple times, viewing the source still shows the old index.html. Note that if I go to http://m.example.com/?bla it works, so I know S3 is serving the correct file (although I haven't ruled out an S3 request cache), but http://m.example.com/ is still broken.
I'm guessing that, although the appcache is re-downloading the file, it is still cached at the browser level, so the appcache is just downloading the browser's cached version, although clearing the browser cache doesn't fix the issue.
2) I never set any Expires headers on the files in S3, so I'm not sure whether S3 sets very long expiration headers by default, but I've tried adding Expires: -1 to index.html and it doesn't help (see the sketch of setting cache headers after this list).
3) I've also tried uploading a new file called index2.html and changing the index document of the S3 bucket to index2.html, but I still get the old copy.
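For what it's worth, here is a sketch of how the cache headers could be set explicitly when re-uploading index.html, using the AWS SDK for JavaScript (v2); the bucket name and file path are placeholders for my setup.

// upload.js -- sketch: re-upload index.html with explicit no-cache headers
const fs = require('fs');
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.putObject({
  Bucket: 'm.example.com',                    // placeholder: your bucket name
  Key: 'index.html',
  Body: fs.readFileSync('index.html'),
  ContentType: 'text/html',
  CacheControl: 'no-cache, must-revalidate',  // ask browsers/CDNs to revalidate every time
  Expires: new Date(0)                        // sets the Expires header to the epoch
}, (err) => {
  if (err) console.error('Upload failed:', err);
  else console.log('index.html uploaded with no-cache headers');
});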
Not only do I need to get this working on my dev machine, I also need to fix the issue in existing users' browsers, ideally without them having to do anything. I'm starting to think my only option is changing the app URL, which I'd really rather not do. The index page seems so hard-wired into the browser that I'm not sure even pointing m.example.com at a new IP address would help. Does anyone have any ideas how I could solve this?
Update: I tried watching the Network tab in the Chrome console while pushing up a new cache manifest and reloading the page. Unfortunately, cache manifest requests don't seem to show up in the Network tab even when they are being re-downloaded.
OK, so after a bit more searching I discovered some idiosyncrasies of appcache, such as the fact that whatever page references your cache manifest is automatically cached by default, regardless of your settings. I would have thought that if the manifest was updated, the browser would at least re-download index.html, but apparently not. I used the solution found in the link below, which describes a workaround where you attach your manifest to another page and reference that page from your source page via an iframe, so that index.html itself is no longer cached:
My HTML5 Application Cache Manifest is caching everything
After doing that, to avoid future caching issues I made a duplicate page at offline.html and added a fallback pointing index.html to offline.html. So index.html is now never cached and won't leave me stranded with an unusable domain, and when offline the app falls back to the offline page, which can happily be cached for offline access. Phew! A sketch of the resulting manifest is below.
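For reference, here is a sketch of what the reworked manifest could look like, attached to the small helper page loaded from index.html via an iframe rather than to index.html itself. The FALLBACK namespace is my assumption of how the index-to-offline failover would be expressed; the asset list is carried over from the original manifest above.

CACHE MANIFEST
# bump this comment (e.g. a new hash) on every deploy so clients re-fetch
offline.html
resources/images/ajax-loader.gif
resources/images/cloud_download.png
resources/images/delete1.png
resources/images/loading.png
resources/images/loadingbg.gif
resources/images/logo.png
FALLBACK:
/ offline.html
NETWORK:
*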