Are there any solutions to add a captcha to django-allauth?

Are there any solutions to use a captcha with django-allauth?
I want to use a captcha on the registration form for standard email+password registrations.

I too needed to do this with django-allauth and found implementing the django-recaptcha package to be relatively simple.
Configure django-recaptcha
Sign up for a recaptcha account.
Plug your settings in
# settings.py
RECAPTCHA_PUBLIC_KEY = 'xyz'
RECAPTCHA_PRIVATE_KEY = 'xyz'
RECAPTCHA_USE_SSL = True # Defaults to False
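The package also needs to be listed in INSTALLED_APPS; in the django-recaptcha versions I have used, the app label is 'captcha' (the same module the field is imported from below):
# settings.py
INSTALLED_APPS = [
    # ... your other apps ...
    'captcha',  # django-recaptcha's app label (in the versions I've used)
]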
Customize SignupForm
After installing django-recaptcha, I followed some guidelines for customizing the SignupForm.
from django import forms
from captcha.fields import ReCaptchaField

class AllAuthSignupForm(forms.Form):
    captcha = ReCaptchaField()

    def save(self, request, user):
        user = super(AllAuthSignupForm, self).save(request)
        return user
You also need to tell allauth to use this form, in settings.py:
ACCOUNT_SIGNUP_FORM_CLASS = 'myapp.forms.AllAuthSignupForm'
Wire up signup form template
{{ form.captcha }} and {{ form.captcha.errors }} should be available on the signup template context at this point.
That was it! Seems like all the validation logic is tucked into the ReCaptchaField.

To get the ReCaptcha field to the bottom of the form, simply add the other fields before the captcha field.
So what was user, email, captcha, password1, password2 becomes user, email, password1, password2, captcha with this form:
from allauth.account.forms import SignupForm, PasswordField
from django.utils.translation import ugettext_lazy as _
from captcha.fields import ReCaptchaField

class UpdatedSignUpForm(SignupForm):
    password1 = PasswordField(label=_("Password"))
    password2 = PasswordField(label=_("Password (again)"))
    captcha = ReCaptchaField()

    def save(self, request):
        user = super(UpdatedSignUpForm, self).save(request)
        return user
You then just need to add this form into the settings.py file as described in the previous answer.

You can also take a look at Form.field_order.
So a simple sign up form using django-allauth, with captcha and the fields ordered as you wish, would look like this:
from allauth.account.forms import SignupForm
from captcha.fields import ReCaptchaField

class MyCustomSignupForm(SignupForm):
    captcha = ReCaptchaField()
    field_order = ['email', 'password1', 'captcha']
In this case, the captcha will be at the very end.

The accepted answer is mostly fine, but produced this error for me when submitting the sign up form:
save() missing 1 required positional argument: 'user'
To fix this, your custom form could look like this:
class AllAuthSignupForm(forms.Form):
    captcha = ReCaptchaField()

    def signup(self, request, user):
        pass
NOTE: django-allauth also warns if the custom signup form does not have a signup method.
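Putting this fix together with the settings from the first answer, a self-contained version (module path assumed) would look like this:
# myapp/forms.py
from django import forms
from captcha.fields import ReCaptchaField

class AllAuthSignupForm(forms.Form):
    captcha = ReCaptchaField()

    def signup(self, request, user):
        # allauth calls this after the user is created; nothing extra is needed here
        pass

# settings.py
ACCOUNT_SIGNUP_FORM_CLASS = 'myapp.forms.AllAuthSignupForm'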

Related

How to update a field value without refreshing the page in Odoo

I'm creating a new section on the settings page of my module, where I have a value and an update icon that is meant to refresh the field value when I click on it.
But when I click the icon to run the function, the page reloads and the field never shows the value, even though the value is printed in the terminal by a logger. Does anyone have any suggestions?
My XML code:
<button type="object" name="refresh_credits" class="btn-link" icon="fa-refresh"/>
<span class="btn-link">Credits</span>
<field name="new_credits"/>
My python code inside a class:
class ResConfigSettings(models.TransientModel):
_inherit = 'res.config.settings'
new_credits = fields.Integer()
def refresh_credits(self):
data_details_credits = self.env['show.credits'].content_credits_info()
_logger.info(self.env['show.credits'].content_credits_info()[4])
self.new_credits = data_details_credits[4]
You have to override the set_values and get_values methods to update values in the settings; you cannot update them directly in General Settings.
Below is an example:
def set_values(self):
    super(ResConfigSettings, self).set_values()
    self.env['ir.config_parameter'].sudo().set_param(
        'MODULE_NAME.new_credits', self.new_credits)

@api.model
def get_values(self):
    res = super(ResConfigSettings, self).get_values()
    config = self.env['ir.config_parameter'].sudo()
    config_key = config.get_param('MODULE_NAME.new_credits')
    if config_key:
        res.update(
            new_credits=int(config_key)  # config parameters are stored as strings
        )
    return res
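Applying the same idea to the button handler from the question, refresh_credits can persist the refreshed value as a config parameter so it is still there after the settings page reloads (a sketch; 'MODULE_NAME' and the show.credits model are taken from the snippets above):
def refresh_credits(self):
    data_details_credits = self.env['show.credits'].content_credits_info()
    # Store the value in ir.config_parameter so get_values() picks it up
    # when the settings page is re-rendered after the button click.
    self.env['ir.config_parameter'].sudo().set_param(
        'MODULE_NAME.new_credits', data_details_credits[4])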
For a better understanding, you can also watch this video:
Add fields general settings

Django 2.0 - Rendering wrong template (with no error)

I'm trying to load a template, visit_form.html, which is a DetailView with a form within it. Each time I click a link from main.html, the wrong template gets loaded -> main_detail.html. I have cleared the browser cache and invalidated caches.
The goal is to have MainVisitDisplay render visit_form.html, but all I get is main_detail.html. When I change the location of the main_detail.html template, it throws a "TemplateDoesNotExist" error, looking for main_detail.html.
My MWE is:
urls.py
from django.conf.urls import url
from . import views
from django.urls import path

urlpatterns = [
    path('', views.index, name='index'),
    path('main/', views.MainListView.as_view(), name='main'),
    path('main/<int:pk>/', views.MainDetailView.as_view(), name='main_detail'),
    path('visit/add/<int:pk>/', views.MainVisitDisplay.as_view(), name='visit_form'),
]
views.py
class MainVisitDisplay(DetailView):
    model = Main
    template = "visit_form.html"

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['form'] = VisitForm()
        return context


class MainDetailView(generic.DetailView):
    template_name = "clincher/main_detail.html"
    model = Main
The url tag used in the main.html template:
{% url 'clincher:visit_form' main.id %}
This was really simple: use template_name = "template_name.html", NOT template = "template_name.html". Since template is not a recognized attribute, the DetailView falls back to its default template name (app/model_detail.html, which here resolves to clincher/main_detail.html), which is why the other template kept rendering. Also, apparently, Django 2.0 does not cache templates, but feel free to confirm or deny this.
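For reference, the corrected view from the question:
class MainVisitDisplay(DetailView):
    model = Main
    template_name = "visit_form.html"  # template_name, not template
    # (depending on your template layout this may need the app prefix,
    #  e.g. "clincher/visit_form.html", like the other view uses)

    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['form'] = VisitForm()
        return context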

Using Scrapy to scrape data after form submit

I'm trying to scrape content from a listing detail page that can only be viewed by clicking the 'view' button, which triggers a form submit. I am new to both Python and Scrapy.
Example markup
<li><h3>Abc Widgets</h3>
<form action="/viewlisting?id=123" method="post">
<input type="image" src="/images/view.png" value="submit" >
</form>
</li>
My solution in Scrapy is to extract the form actions and then use Request to fetch each page, with a callback to parse it for the desired content. However, I have hit a few issues:
First, I'm getting the following error: "request url must be str or unicode".
Secondly, when I hardcode a URL to work around the above issue, my parsing function seems to return what looks like a list.
Here is my code, with the real URLs redacted:
from scrapy.spiders import Spider
from scrapy.selector import Selector
from scrapy.http import Request
from wfi2.items import Wfi2Item

class ProfileSpider(Spider):
    name = "profiles"
    allowed_domains = ["wfi.com.au"]
    start_urls = ["http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=WA",
                  "http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=VIC",
                  "http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=QLD",
                  "http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=NSW",
                  "http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=TAS",
                  "http://example.com/wps/wcm/connect/internet/wfi/Contact+Us/Find+Your+Local+Office/findYourLocalOffice.jsp?state=NT"
                  ]

    def parse(self, response):
        hxs = Selector(response)
        forms = hxs.xpath('//*[@id="area-managers"]//*/form')
        for form in forms:
            action = form.xpath('@action').extract()
            print "ACTION: ", action
            request = Request(url=action, callback=self.parse_profile)
            yield request

    def parse_profile(self, response):
        hxs = Selector(response)
        profile = hxs.xpath('//*[@class="contentContainer"]/*/text()')
        print "PROFILE", profile
I'm getting the following error "request url must be str or unicode"
Please have a look at the Scrapy documentation for extract(). It says: "Serialize and return the matched nodes as a list of unicode strings" (emphasis mine).
The first element of that list is probably what you want, so you could do something like:
request = Request(url=response.urljoin(action[0]), callback=self.parse_profile)
secondly when I hardcode a URL to overcome the above issue it seems my parsing function is returning what looks like a list
According to the documentation, xpath() returns a SelectorList. Add extract() to the xpath call and you'll get a list of the text tokens. Eventually you'll want to clean up and join the elements of that list before further processing.
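Putting both points together, a sketch of parse_profile that extracts the text nodes and joins them into one cleaned-up string:
def parse_profile(self, response):
    hxs = Selector(response)
    # extract() gives a list of unicode strings; strip and join them
    tokens = hxs.xpath('//*[@class="contentContainer"]/*/text()').extract()
    profile = ' '.join(t.strip() for t in tokens if t.strip())
    print "PROFILE", profile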

PHPUnit, fill an input that is not contained in a form

My question is really simple.
I work on a site where, to reach the login form, the user first needs to fill in an email input, then click a 'next' button, and only then access the 'real' login form.
I'm trying to build a unit test with PHPUnit, but I can't find the method to fill this input, since it isn't contained in a form.
$mail = self::randomMail(10, "#yop.com");
$input = $crawler->filter('input#emailValue')->first();
$input-> ???;
$link = $crawler->filter('div#btn-next')->first()->link();
$crawler = $client->click($link);

"Element is not currently visible and so may not be interacted" while trying to click on sub-menus

I am automating the Flipkart site. My aim is to log in to the site and then log out.
I successfully logged in, but I am unable to log out because the logout link is under a sub-menu (mouse hover).
I have attached a screenshot.
I tried all the scenarios I could think of, such as using the Actions class and JavascriptExecutor.
Using JavascriptExecutor works fine only if I manually place the cursor on the sub-menu; otherwise it throws an error.
I had no problem getting this to work.
Here is the script, using the getting-started-with-selenium framework:
@Config(url="http://flipkart.com", browser=Browser.FIREFOX)
public class TestFlikPart extends AutomationTest {

    @Test
    public void testLoginLogout() {
        String username = "<username>";
        String password = "<password>";

        click(By.cssSelector("a[href*='/login']"))
            .setText(By.cssSelector("input[name='email']"), username)
            .setText(By.cssSelector("input[name='password']"), password)
            .click(By.cssSelector("input[type='submit'][value='Login']"))
            .validatePresent(By.cssSelector("li.greeting-link > a"))
            .hoverOver(By.cssSelector("li.greeting-link > a"))
            .click(By.cssSelector("ul.account-dropdown a[href*='/logout']"))
            // should be logged out now.
            .validatePresent(By.cssSelector("a[href*='/login']"));
    }
}
I think MrTi is correct: you might have forgotten to call .perform() on the action. As further explanation of the hoverOver() method, this is what it contains:
actions.moveToElement(driver.findElement(by)).perform();
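If you are not using that framework, the same hover-then-click can be done with a plain Actions chain. Here is a sketch using Selenium's Python bindings, with the selectors borrowed from the Java example above and assuming you are already logged in:
from selenium import webdriver
from selenium.webdriver.common.action_chains import ActionChains

driver = webdriver.Firefox()
driver.get("http://flipkart.com")
# ... log in first, as in the example above ...

# Hover over the greeting link so the account dropdown becomes visible,
# then click the logout entry. Without perform() the hover never happens.
greeting = driver.find_element_by_css_selector("li.greeting-link > a")
ActionChains(driver).move_to_element(greeting).perform()
driver.find_element_by_css_selector("ul.account-dropdown a[href*='/logout']").click()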