What is the .optimizer package in PyTorch? - optimization

I am trying to write my own optimizer for PyTorch and am looking at the source code https://pytorch.org/docs/stable/_modules/torch/optim/sgd.html#SGD
to get started. When I try to run the code for SGD, I get an error on the line
from .optimizer import Optimizer, required. I've searched everywhere, but I'm not sure where to obtain the .optimizer package. Any help here is greatly appreciated.

Here from .optimizer import ... is a relative import: it means import optimizer.py from the same directory as the current .py file, so in this case it refers to torch/optim/optimizer.py inside the PyTorch source tree. There is no separate .optimizer package to install; in your own code you can simply import the base class with from torch.optim import Optimizer.
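For example, a minimal sketch of a custom optimizer that subclasses torch.optim.Optimizer (the class name MySGD and the plain gradient-descent update are illustrations, not the actual torch.optim.SGD implementation):

import torch
from torch.optim import Optimizer  # no relative .optimizer import needed outside the torch source tree

class MySGD(Optimizer):
    def __init__(self, params, lr=0.01):
        defaults = dict(lr=lr)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group['params']:
                if p.grad is None:
                    continue
                # plain gradient descent: p <- p - lr * grad
                p.add_(p.grad, alpha=-group['lr'])
        return loss

It can then be used like any built-in optimizer, e.g. opt = MySGD(model.parameters(), lr=0.1), with opt.step() called after loss.backward().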

Related

FileNotFoundError in Python3 (Code editor: PyCharm)

I have imported
import numpy as np
and I have used
xy = np.loadtxt('./Desktop/wine.csv', delimiter=',', dtype=np.float32, skiprows=1)
but Python 3 is not able to read the file and I really do not know why. Can anyone help me, please?
Can you try again by specifying the full file path?
/home/username/Desktop/wine.csv
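A minimal sketch of the same call with an absolute path (assuming the file really is in the user's Desktop folder; the path is a placeholder):

import os
import numpy as np

# build an absolute path instead of relying on the current working directory
path = os.path.expanduser('~/Desktop/wine.csv')
xy = np.loadtxt(path, delimiter=',', dtype=np.float32, skiprows=1)

The original './Desktop/wine.csv' only works if the script is started from the home directory, because a relative path is resolved against the current working directory.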

Python: ImportError: /usr/local/lib/python3.8/lib-dynload/math.cpython-38-x86_64-linux-gnu.so: file too short

I'm a beginner in Python and I am trying to import the math module, but I get the following error and I don't understand what it is about:
ImportError: /usr/local/lib/python3.8/lib-dynload/math.cpython-38-x86_64-linux-gnu.so: file too short
Can someone help me, please?
Thanks

Accessing already downloaded dataset with tensorflow_datasets API

I am trying to work with the recently published tensorflow_datasets API to train a Keras model on the Open Images Dataset. The dataset is about 570 GB in size. I downloaded the data with the following code:
import tensorflow_datasets as tfds
import tensorflow as tf
open_images_dataset = tfds.image.OpenImagesV4()
open_images_dataset.download_and_prepare(download_dir="/notebooks/dataset/")
After the download was complete, the connection to my Jupyter notebook was somehow interrupted, but the extraction seemed to have finished as well; at least all downloaded files had a counterpart in the "extracted" folder. However, I am not able to access the downloaded data now:
tfds.load(name="open_images_v4", data_dir="/notebooks/open_images_dataset/extracted/", download=False)
This only gives the following error:
AssertionError: Dataset open_images_v4: could not find data in /notebooks/open_images_dataset/extracted/. Please make sure to call dataset_builder.download_and_prepare(), or pass download=True to tfds.load() before trying to access the tf.data.Dataset object.
When I call the function download_and_prepare() it only downloads the whole dataset again.
Am I missing something here?
Edit:
After the download the folder under "extracted" has 18 .tar.gz files.
This is with tensorflow-datasets 1.0.1 and tensorflow 2.0.
The folder hierarchy should be like this:
/notebooks/open_images_dataset/extracted/open_images_v4/0.1.0
All the datasets have a version. The data can then be loaded like this:
ds = tfds.load('open_images_v4', data_dir='/notebooks/open_images_dataset/extracted', download=False)
I didn't have open_images_v4 data. I put cifar10 data into a folder named open_images_v4 to check what folder structure tensorflow_datasets was expecting.
The solution to this was to also use the "data_dir" parameter when initializing the dataset:
builder = tfds.image.OpenImagesV4(data_dir="/raid/openimages/dataset")
builder.download_and_prepare(download_dir="/raid/openimages/dataset")
This way the dataset is downloaded and extracted in the same directory. Before, it was (unnoticed by me) extracting to the default directory, which is under /home/.../. That's what caused the error, as there wasn't enough space left under my home directory.
After the extraction, the folder structure is exactly as Manoj-Mohan described.
The above solution didn't work for me; this is what I used instead:
builder = tfds.builder(name='folder_name', data_dir=data_dir)
builder.download_and_prepare(download_dir="/home/...")
ds = builder.as_dataset()

Google Colab issue importing and using different class files

I am trying to use Google Colab for my project, for which I have to upload a few Python files because I need those class files. But while executing the main function, it constantly throws the error 'module object has no attribute'. Is there some memory issue with Colab or what? Help would be much appreciated.
import numpy as np
import time
import tensorflow as tf
import NN
import Option
import Log
import getData
import Quantize
AttributeError: 'module' object has no attribute 'NN'
I uploaded all files using following code :
from google.colab import files
src = list(files.upload().values())[0]
open('Option.py','wb').write(src)
import Option
But it always gives me an error on one or another of the files I am importing.
The updated version of Colab (available for a few weeks now) can save the uploaded files without you having to call open(fname, 'wb').write(src).
So you only have to upload your 5 files: NN.py, Option.py, Log.py, getData.py, and Quantize.py (and probably other dependencies and data), then try importing each one, e.g. import NN, to see if there's any error.
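On older runtimes that don't persist uploads automatically, a small sketch that writes every uploaded file to disk before importing might look like this (the module names are the ones from the question):

from google.colab import files

# select NN.py, Option.py, Log.py, getData.py and Quantize.py in the upload dialog
uploaded = files.upload()
for name, data in uploaded.items():
    with open(name, 'wb') as f:
        f.write(data)  # save each file next to the notebook so it is importable

import NN
import Option

This avoids hard-coding each file name as in open('Option.py','wb').write(src).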

Do I have to specify import when Python script is being run in Ipython?

I am writing a script that I know I will run in IPython.
I start IPython as ipython --pylab.
This imports numpy, matplotlib, etc.
So do I have to specify these import statements again in my script?
I did not, and my script did not run.
Thanks for your help.
--pylab imports numpy both as * and as np. I always use np. to minimize confusion.
If I need numpy in my script, I include the usual import numpy as np line. This lets me run the script from the shell. I can also run it with run ... from within IPython, or import it from IPython or from some other script.
I've never tried omitting the import numpy line, and I don't see any need to start. It doesn't save any time or space. Make a habit of importing what you need, and don't assume the environment will do it for you.
Functions that you define and edit from within IPython don't need their own import statements.
I just tried this script:
def foo1(x):
    return np.sum(x)

def foo2(x):
    return x.sum()
Obviously I can load it with 'run'. And foo2(np.array([1,2,3])) works because the array uses its own method. But foo1 produces a NameError: global name 'np' is not defined.
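Adding the usual import at the top of the script, as suggested above, fixes that NameError and lets the script run both from the shell and from inside IPython. A quick sketch:

import numpy as np

def foo1(x):
    return np.sum(x)

def foo2(x):
    return x.sum()

print(foo1(np.array([1, 2, 3])))  # 6
print(foo2(np.array([1, 2, 3])))  # 6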