Facing an error while trying to use the English package of spaCy

I need to use spaCy in my program. I have installed spaCy, but I am facing an error while running it. The error is given below.
import error: cannot import name 'en'
I tried to download en, but the issue is still not resolved.
Please help me to resolve it.
Thanks in advance.

Please try the steps below.
1. Open cmd using "Run as administrator".
2. Run the command:
pip install -U spacy
3. Download the English package:
python -m spacy download en
4. Load it:
import spacy
spacy.load('en')
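Note: on newer spaCy versions (v3 and later) the 'en' shortcut has been removed, so if the steps above still fail, downloading and loading the model by its full package name is safer. A minimal sketch, assuming spaCy v3+ and the small English model (the sample sentence is just an example):
python -m spacy download en_core_web_sm
# load the model by its full package name and run a quick test
import spacy
nlp = spacy.load("en_core_web_sm")
doc = nlp("This is a test sentence.")
print([token.text for token in doc])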

Related

Importing TensorFlow "async" syntax error

I am trying to use the imageai module for a project and ran the line "from imageai.Detection import ObjectDetection". However, when I do so, this error appears:
File /Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/tensorflow/python/pywrap_tensorflow_internal.py:114
def TFE_ContextOptionsSetAsync(arg1, async):
^
SyntaxError: invalid syntax
I found someone who had the same issue here: https://github.com/tensorflow/tensorflow/issues/20690, but I'm not quite sure how to edit the last file in the trace where the error occurs. Does anyone have any tips on how to do this? Thanks!
I have tried looking at the GitHub issue above but am not sure how to approach it.
ImageAI uses PyTorch as its backend, so you need to install all the required libraries before installing and importing the imageai module.
Please use the commands below to install imageai on your system:
pip install cython pillow>=7.0.0 numpy>=1.18.1 opencv-python>=4.1.2 torch>=1.9.0 --extra-index-url https://download.pytorch.org/whl/cpu torchvision>=0.10.0 --extra-index-url https://download.pytorch.org/whl/cpu pytest==7.1.3 tqdm==4.64.1 scipy>=1.7.3 matplotlib>=3.4.3 mock==4.0.3
pip install imageai --upgrade
Now, import ObjectDetection from imageai:
from imageai.Detection import ObjectDetection
Please refer to this link for more details.
Note: You can easily install imageai in Google Colab with this code
!pip install imageai
from imageai.Detection import ObjectDetection
Hint: Please use the code below to install TensorFlow, import it, and check its version:
pip install tensorflow
import tensorflow as tf
tf.__version__
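For reference, once the installation succeeds, a minimal detection sketch based on the ImageAI documentation looks like this; the checkpoint file name and image paths are placeholders you would replace with your own downloaded model and images:
from imageai.Detection import ObjectDetection

detector = ObjectDetection()
detector.setModelTypeAsRetinaNet()
# placeholder path: download a RetinaNet checkpoint from the ImageAI releases first
detector.setModelPath("retinanet_resnet50_fpn_coco-eeacb38b.pth")
detector.loadModel()

# run detection on a local image and save an annotated copy
detections = detector.detectObjectsFromImage(
    input_image="input.jpg",
    output_image_path="output_detected.jpg"
)
for detection in detections:
    print(detection["name"], ":", detection["percentage_probability"])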

install "pip undetected-chromedriver" for selenium python

I'm trying to make an autofiller using Selenium, but it didn't work out, so I decided to use undetected-chromedriver to finish the automation.
I am having some difficulty importing undetected-chromedriver.
I already downloaded it by running the command: pip install undetected-chromedriver
But when I add import undetected_chromedriver as uc, the interpreter doesn't recognize it.
Below is the error message after trying to import undetected-chromedriver:
import undetected_chromedriver as uc
ModuleNotFoundError: No module named 'undetected_chromedriver'
Use one of the following commands to check if the undetected-chromedriver package is in the list of installed packages:
pip list
or
pip3 list
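You can also run pip show to see which environment the package landed in; the Location field points at the site-packages directory that was used:
pip show undetected-chromedriver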
Try the following
# navigate into the project directory with your python script
cd presearch
# create virtual environment
python3 -m venv venv
# activate the virtual environment
source venv/bin/activate
# install required pip packages
pip3 install undetected-chromedriver
If you have multiple Python versions installed, check whether you actually installed it for the right one.
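Once the import works inside the activated virtual environment, a minimal usage sketch looks like this (the target URL is just an example):
import undetected_chromedriver as uc

# start a Chrome session with the patched driver
driver = uc.Chrome()
driver.get("https://example.com")
print(driver.title)
driver.quit()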

XLNetTokenizer requires the SentencePiece library but it was not found in your environment

I am trying to implement XLNet on Google Colaboratory, but I get the following issue.
ImportError:
XLNetTokenizer requires the SentencePiece library but it was not found in your environment. Checkout the instructions on the
installation page of its repo: https://github.com/google/sentencepiece#installation and follow the ones
that match your environment.
I have also tried the following steps:
!pip install -U transformers
!pip install sentencepiece
from transformers import XLNetTokenizer
tokenizer = XLNetTokenizer.from_pretrained('xlnet-base-cased-spiece.model')
Thank you for your help in advance.
After running
!pip install transformers
and
!pip install sentencepiece
please restart your runtime and then execute the rest of your cells.
I got the same error in Google Colab. Restarting the runtime did it for me.
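For reference, after restarting the runtime a quick check along these lines should confirm the tokenizer loads (using the standard 'xlnet-base-cased' model name rather than a local spiece.model path; the sample sentence is just an example):
from transformers import XLNetTokenizer

# this only succeeds once sentencepiece is installed and the runtime has been restarted
tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
print(tokenizer.tokenize("Hello, how are you?"))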

Spacy linking not working

I am using rasa.ai to build a bot. So far it was working fine, but this morning I installed this requirement, then installed spaCy with the command below.
python -m spacy download en_core_web_md
It all seemed fine, with successful linking. Now when I run my bot with the command below
python -m rasa_nlu.train --config config_spacy.yml --data data/training-rasa.json --path projects
I am getting this error:
FileNotFoundError: [Errno 2] No such file or directory:'/Users/usename/anaconda3/lib/python3.6/site-packages/spacy/data/en/vocab/strings.json'
To me this seems like a spaCy linking error, but I don't understand why, because the linking was successful in the spaCy installation above.
Any suggestions?
Turns out the requirements file was pulling in an older version of spaCy. So I had to run pip install rasa_nlu[spacy] to get the latest spaCy (>2). That resolved the problem.
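For completeness, the commands that line up with that fix look roughly like this (a sketch, assuming spaCy 2.x, where the link subcommand still exists):
pip install rasa_nlu[spacy]
python -m spacy download en_core_web_md
# re-create the 'en' shortcut link that the rasa_nlu config looks up
python -m spacy link en_core_web_md en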

Object Detection API error: "ImportError: cannot import name anchor_generator_pb2"

I'm trying to get TensorFlow's new Object Detection API working. I've followed the installation instructions, but when running the command
python object_detection/builders/model_builder_test.py
I get the following error
from object_detection.protos import anchor_generator_pb2
ImportError: cannot import name anchor_generator_pb2
I've looked inside object_detection.protos, and there doesn't seem to be anything named anchor_generator_pb2. Has anyone else managed to get this command to run, or solved this issue?
A step in the installation instructions was missed: the following needs to be run from models/research:
protoc object_detection/protos/*.proto --python_out=.
You need to rerun the command below after that:
pip install .
This worked for me.
Run these from your models/research directory:
python setup.py install
protoc -I=./ --python_out=./ ./object_detection/protos/*.proto
export PYTHONPATH=$PYTHONPATH:`pwd`:`pwd`/slim
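A quick way to confirm the protos compiled correctly (run from models/research after the steps above):
# should print the message instead of raising ImportError
python -c "from object_detection.protos import anchor_generator_pb2; print('protos compiled OK')"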