I'm attempting to download and use the Google Landmark v2 dataset using TensorFlow Federated with the following code:
train, test = tff.simulation.datasets.gldv2.load_data(gld23k=True)
At some point during the download this error occurs:
ValueError: Incomplete or corrupted file detected. The md5 file hash does not match the provided value of 825975950b2e22f0f66aa8fd26c1f153 images_000.tar.
I've tried on Google Colab and on my personal machine, but the same error occurs each time.
Is there any way to get around this issue?
Thanks, any help is appreciated.
I am using Hypermapper while doing some work on Google Colaboratory. All the information about Hypermapper can be found on this site: Hypermapper
I want to plot the Pareto front, and everything seems to go just fine, but I cannot find where Hypermapper stores the image of it. Does anyone know where Hypermapper usually stores its output files when it is used on Colab?
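One way to narrow this down from a notebook cell is to search the Colab filesystem for recently written image files. This is only a sketch: the search root (/content, the default Colab working directory) and the file extensions are assumptions, not anything specific to Hypermapper.

import os, time

# Walk the Colab working directory and collect image files that
# Hypermapper may have written.
candidates = []
for root, _dirs, files in os.walk("/content"):
    for name in files:
        if name.lower().endswith((".png", ".pdf", ".jpg")):
            path = os.path.join(root, name)
            candidates.append((os.path.getmtime(path), path))

# Print the most recently modified image files first.
for mtime, path in sorted(candidates, reverse=True)[:20]:
    print(time.ctime(mtime), path)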
with strategy.scope():
    model = transformer(vocab_size=VOCAB_SIZE,
                        num_layers=NUM_LAYERS,
                        units=UNITS,
                        d_model=D_MODEL,
                        num_heads=NUM_HEADS,
                        is_encoder=True,
                        dropout=DROPOUT)
    model.load_weights("path")
I get this error:
InvalidArgumentError: Unsuccessful TensorSliceReader constructor: Failed to get matching files on path: UNIMPLEMENTED: File system scheme '[local]' not implemented (file: 'path')
TL;DR: You need to use Cloud Storage (GCS), which is a Google Cloud Platform (GCP) service.
As stated in the Cloud TPU documentation (https://cloud.google.com/tpu/docs/troubleshooting/trouble-tf#cannot_use_local_filesystem), TPU servers do not have access to your local storage; they can only see files in GCS buckets.
You need to place all the data used for training a model (or inference, depending on your intent) in a GCS bucket, including pretrained weights and the dataset. Note that GCS is a paid service, albeit not very pricey (and first-time users get a trial period).
The links to the official GCP docs below might help you get started:
Create storage buckets
Connecting to Cloud Storage Buckets
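As a minimal sketch, reusing the model-building code from the question: once the weights have been copied into a bucket, the only change is pointing load_weights at a gs:// URI (the bucket name and object path below are placeholders, not your actual paths).

with strategy.scope():
    model = transformer(vocab_size=VOCAB_SIZE,
                        num_layers=NUM_LAYERS,
                        units=UNITS,
                        d_model=D_MODEL,
                        num_heads=NUM_HEADS,
                        is_encoder=True,
                        dropout=DROPOUT)
    # TPU workers can read GCS objects, so load weights from a gs:// URI.
    model.load_weights("gs://your-bucket/path/to/weights")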
I am new to Google Colab, and I am wondering if it is possible to save graphs and/or data (results) from Google Colab into a Google Docs file, so I can create a report.
Thanks for the help,
Harm
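I am not aware of a direct Colab-to-Google-Docs export, but a common workaround is to mount Google Drive, save figures and result tables there, and then insert them into a Docs report by hand. A minimal sketch, where the file names and the example data are placeholders:

from google.colab import drive
import matplotlib.pyplot as plt
import pandas as pd

# Mount Google Drive so files written here persist outside the Colab VM.
drive.mount('/content/drive')

# Save a figure (placeholder plot) to Drive as an image for the report.
plt.plot([1, 2, 3], [4, 5, 6])
plt.savefig('/content/drive/MyDrive/report_figure.png', dpi=150)

# Save tabular results (placeholder values) as CSV, which can be pasted
# or imported into the report.
pd.DataFrame({'metric': ['accuracy'], 'value': [0.93]}) \
  .to_csv('/content/drive/MyDrive/report_results.csv', index=False)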
I've uploaded to Google Cloud Platform a model that I trained and exported on lobe.ai. Now I want to send it a test request with an image so I can use it in my web application. How do I do this?
With your TensorFlow model (I deduce this from your tags), you have two options:
Either you test locally,
Or you deploy your model on AI Platform in online prediction mode.
In both cases, you have to submit the binary image plus your features as a JSON instance that matches your model's inputs.
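A rough sketch of the online prediction route: base64-encode the image and send it as a JSON instance. The project name, model name, and input key are assumptions here; they must match your deployed model and its serving signature.

import base64
from googleapiclient import discovery

# Read the test image and base64-encode it, as required for binary inputs.
with open('test.jpg', 'rb') as f:
    image_b64 = base64.b64encode(f.read()).decode('utf-8')

# The key ('image_bytes' here) must match the input name of your model's
# serving signature; adjust it to whatever Lobe exported.
instances = [{'image_bytes': {'b64': image_b64}}]

# Call AI Platform online prediction; project and model names are placeholders.
service = discovery.build('ml', 'v1')
name = 'projects/your-project/models/your-model'
response = service.projects().predict(name=name, body={'instances': instances}).execute()
print(response)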
Job No: swift-sphinx-624:job_Ja1iYkl8OdF83J9xU5CIQJFlomM
is failing. I tried making the dataset public, to no avail. Any info more descriptive than 'backend error' would be greatly appreciated.
Really sorry, but I just don't have anything more to give; the error message is so uninformative.
You should not upload a file that big directly to BigQuery. The proper way is to upload the file to Google Cloud Storage and then import it from there; this process is much more reliable.
Google's services can fail, and we have to account for that. A minimal sketch of this two-step workflow appears after the links below.
Load from Cloud Storage
https://developers.google.com/bigquery/loading-data-into-bigquery#loaddatagcs
Resumable uploads to Cloud Storage
https://developers.google.com/storage/docs/gsutil/commands/cp#resumable-transfers
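A minimal Python sketch of that workflow using the Cloud client libraries, assuming placeholder bucket, dataset, and table names and a CSV source file:

from google.cloud import storage, bigquery

# 1) Upload the large file to a Cloud Storage bucket (names are placeholders).
storage_client = storage.Client()
bucket = storage_client.bucket('your-bucket')
blob = bucket.blob('data/big_file.csv')
blob.upload_from_filename('big_file.csv')  # resumable upload for large files

# 2) Load it into BigQuery from GCS instead of pushing it from your machine.
bq_client = bigquery.Client()
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = bq_client.load_table_from_uri(
    'gs://your-bucket/data/big_file.csv',
    'your-project.your_dataset.your_table',
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish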
I hope this helps!