I'm trying to implement two mutations in graphql-ruby - one for creating a resource and one for editing it. In most scenarios they both take exactly the same parameters from the client, so I want to avoid duplicating the mutations and instead have the arguments specified in a reusable class/module.
I'm using graphql-ruby 1.8 and the new class based API, and have started with this:
class Mutations::ApplicationMutation < GraphQL::Schema::Mutation
  # ... common to every mutation ...
end

class Mutations::CreateResourceMutation < Mutations::ApplicationMutation
  argument :name, String, required: true
  argument :description, String, required: false
  argument :create_only_field, String, required: false
end

class Mutations::UpdateResourceMutation < Mutations::ApplicationMutation
  argument :name, String, required: true
  argument :description, String, required: false
  argument :update_only_field, String, required: false
end
In this basic example, the name and description arguments are the same in both mutations. I've extracted the resolver out into another class so that it is reusable, but I'm not sure of the best way to tackle the arguments.
I imagine an ActiveSupport::Concern would work, but it doesn't feel like the right approach here; then again, I'm very new to GraphQL in general, so I may be missing something.
I can suggest a nice solution: create a complex (input object) argument and use it in both mutations.
Here is the code:
# graphql/inputs/resource_input.rb
class Inputs::ResourceInput < GraphQL::Schema::InputObject
  graphql_name 'ResourceInput'
  description 'An input object representing the arguments for a resource'
  argument :name, String, required: true
  argument :description, String, required: false
  argument :create_only_field, String, required: false
  argument :update_only_field, String, required: false
end
and in the mutations you can then declare a single argument:
argument :resource_input, Inputs::ResourceInput, 'The resource complex input', required: true
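In the mutation itself the input object then arrives as a single keyword argument in resolve. A rough sketch of the create mutation (Resource here is an assumed ActiveRecord model, and the resolve body is only illustrative):
class Mutations::CreateResourceMutation < Mutations::ApplicationMutation
  argument :resource_input, Inputs::ResourceInput, required: true

  def resolve(resource_input:)
    # resource_input is an Inputs::ResourceInput instance; to_h returns a snake_cased hash
    Resource.create!(resource_input.to_h.slice(:name, :description, :create_only_field))
  end
end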
Hope this will help you to avoid dupes
I have a class deriving from pydantic.BaseModel and would like to create a "fake" attribute, i.e. a computed property. The property keyword does not seem to work with Pydantic the usual way. Below is the MWE, where the class stores value and defines a read/write property called half with the obvious meaning. Reading the property works fine with Pydantic, but the assignment fails.
I know Pydantic modifies low-level details of attribute access; perhaps there is a different way to define a computed field in Pydantic?
import pydantic

class Object(object):
    def __init__(self, *, value):
        self.value = value
    half = property(lambda self: .5*self.value, lambda self, h: setattr(self, 'value', h*2))

class Pydantic(pydantic.BaseModel):
    class Config:
        extra = 'allow'
    value: float
    half = property(lambda self: .5*self.value, lambda self, h: setattr(self, 'value', h*2))

o, p = Object(value=1.), Pydantic(value=1.)
print(o.half, p.half)
o.half = p.half = 2
print(o.value, p.value)
outputs (value=1. was not modified by assigning half in the Pydantic case):
0.5 0.5
4 1.0
I happened to be working on the same problem today. Officially it is not supported yet, as discussed here.
However, I did find the following example which works well:
from pydantic import BaseModel, validator

class Person(BaseModel):
    first_name: str
    last_name: str
    full_name: str = None

    @validator("full_name", always=True)
    def composite_name(cls, v, values, **kwargs):
        return f"{values['first_name']} {values['last_name']}"
Do make sure the derived field comes after the fields you want to derive it from, otherwise the values dict will not contain the needed values (e.g. full_name comes after first_name and last_name, which are fetched from values).
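For example, with the class above (the names are just placeholders):
person = Person(first_name="Jane", last_name="Doe")
print(person.full_name)  # -> "Jane Doe"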
Instead of using a property, here's an example which shows how to use pydantic.root_validator to compute the value of an optional field:
https://daniellenz.blog/2021/02/20/computed-fields-in-pydantic/
I've adapted this for a similar application:
import typing
from pydantic import BaseModel, PositiveInt, conlist, constr, root_validator

class Section(BaseModel):
    title: constr(strip_whitespace=True)
    chunks: conlist(min_items=1, item_type=Chunk)  # Chunk is the author's own model, defined elsewhere
    size: typing.Optional[PositiveInt] = None
    role: typing.Optional[typing.List[str]] = []
    license: constr(strip_whitespace=True)

    @root_validator
    def compute_size(cls, values) -> typing.Dict:
        if values["size"] is None:
            values["size"] = sum([
                chunk.get_size()
                for chunk in values["chunks"]
            ])
        return values
In this case each element of the discriminated union chunks has a get_size() method to compute its size. If the size field isn't specified explicitly in serialization (e.g., input from a JSON file) then it gets computed.
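The Chunk model itself isn't shown in the original post; a minimal hypothetical stand-in that would satisfy the validator above (it would have to be defined before Section) could look like this:
from pydantic import BaseModel, constr

class Chunk(BaseModel):
    text: constr(strip_whitespace=True)

    def get_size(self) -> int:
        # purely illustrative: size here is just the character count of the chunk
        return len(self.text)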
I created a pip package that lets you easily define computed properties.
Here you can check it out:
https://pypi.org/project/pydantic-computed/
Using the package, the example of computing half of a value would look like this:
from pydantic import BaseModel
from pydantic_computed import Computed, computed

class SomeModel(BaseModel):
    value: float
    value_half: Computed[float]

    @computed("value_half")
    def compute_value_half(value: float):
        return value / 2
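Assuming the decorator wires things up the way the package README suggests, usage would then be as simple as:
model = SomeModel(value=10)
print(model.value_half)  # -> 5.0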
The official docs say that a class can be built dynamically like so:
constant A := Metamodel::ClassHOW.new_type( name => 'A' );
A.^add_method('x', my method x(A:D:) { say 42 });
A.^compose;
A.new.x(); # x will be only called on instances
But what if I am building a class and don't assign it to a constant but rather store it in a variable (for instance when I need to create a bunch of classes in a loop), like so:
my $x = Metamodel::ClassHOW.new_type( name => 'some custom string' );
$x.^add_method('x', my method ($y:) { say $y });
$x.^compose;
But in this case I can call method x both on the class ($x.x) and on an instance ($x.new.x), though I want it to be callable only on instances.
I tried to define method like so:
$x.^add_method('x', my method ($y:D:) { say $y });
but that produces an error:
Invalid typename 'D' in parameter declaration.
Of course I can check the definedness of the invocant inside the method, but I want some compile-time guarantees (I want to believe that type checking is done at compile time).
I tried to play with signatures and parameters but couldn't find a way to create an invocant parameter; more importantly, I am not sure how to attach a signature that I have in a variable to a method.
Change:
my $x = ...
to:
my constant x = my $ = ...
In full:
my constant x = my $ = Metamodel::ClassHOW.new_type( name => 'some custom string' );
x.^add_method('x', my method (x:D $y:) { say $y });
x.^compose;
x = Metamodel::ClassHOW.new_type( name => 'another custom string' );
...
I want some compile-time guarantees (I want to believe that type checking is done at compile time).
By making the constant's RHS be a variable declaration, you blend static compile-time aspects with dynamic run-time ones.
(BTW, the my before constant is just me being pedantic. A plain constant like you've used is equivalent to our constant which is less strict than my constant.)
I note the error message for a non-instance is different:
Type check failed in binding to parameter '$y';
expected type some custom string cannot be itself
At a guess that's because the usual message comes from Mu or Any and your class isn't inheriting from either of them.
I am not sure how to assign signature which I have in a variable to some method.
I'll leave that part unanswered.
The best way I can think of to produce a method with types substituted into a signature with Raku today is to use a parametric role to help out, like this:
my role Helper[::T] {
    method foo(T $inv:) {}
}
my &meth = Helper.^parameterize(Int).^pun.^lookup("foo");
say &meth.signature;
Which outputs (Int $inv: *%_). Substitute Int with the type you are building.
I know how to use the apply function on a normal Kotlin class but have not been able to use it with a data class:
data class Person(name: String)

val person = Person().apply {
    name = "Tony Stark"
}
I get a compile message of:
No value passed for parameter 'name'
The issue is that name is only a constructor parameter and is never made a property, which is invalid for a data class anyway (every primary constructor parameter of a data class must be marked val or var). Fix like this:
data class Person(val name: String)
The apply function works the same with any class, but there are some errors in your code snippet:
The parameter in the Person constructor is not marked var or val, so the class has no name property. It is better to make it a var so that the value can be changed inside apply.
The class's constructor takes one parameter, but you call it with no arguments; that is an error unless the parameter has a default value. A combined fix is sketched below.
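Putting the two fixes together, a minimal sketch that makes the original snippet compile (the empty-string default is an assumption, added so the no-argument call Person() works):
data class Person(var name: String = "")

fun main() {
    val person = Person().apply {
        name = "Tony Stark"
    }
    println(person)  // Person(name=Tony Stark)
}
If name should stay read-only, the more idiomatic route is to skip apply and pass the value directly: Person(name = "Tony Stark").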
I am using Dapper to query a flat list of items from a database, into a POCO class as follows:
Public Class Node
    Public Property Name As String
    Public Property ParentNodeName As String
    Public Property Children As IEnumerable(Of Node)
End Class
I am trying to use the accepted answer to this question, in order to create a tree out of the flat list.
The only caveat is that I am using VB.NET.
I have tried a straightforward port of the C# solution:
nodes.ForEach(Function(n) n.Children = nodes.Where(Function(ch) ch.ParentNodeName = n.Name).ToList)
but it does not compile with the error
Error BC30452 Operator '=' is not defined for types 'List(Of Node)' and 'List(Of Node)'.
The = symbol is interpreted as an equality operator, while I meant to use the assignment operator.
I have pasted the C# code into the Telerik converter, and the converted code is:
Private Shared Function BuildTree(ByVal items As List(Of Category)) As IEnumerable(Of Category)
    items.ForEach(Function(i) CSharpImpl.__Assign(i.Categories, items.Where(Function(ch) ch.ParentId = i.Id).ToList()))
    Return items.Where(Function(i) i.ParentId Is Nothing).ToList()
End Function

Private Class CSharpImpl
    <Obsolete("Please refactor calling code to use normal Visual Basic assignment")>
    Shared Function __Assign(Of T)(ByRef target As T, value As T) As T
        target = value
        Return value
    End Function
End Class
It uses a helper class to solve the issue, but suggests refactoring to avoid it.
Hence the questions:
Is there a general way to disambiguate the equality = from the assignment = in VB.NET, without resorting to a helper class and a dedicated assignment function?
In this specific case, is there a simple refactor I can use to get rid of the issue?
That's because of VB.NET's distinction between functions and subroutines.
Instead of
nodes.ForEach(Function(n) n.Children = nodes.Where(Function(ch) ch.ParentNodeName = n.Name).ToList)
use
nodes.ForEach(Sub(n) n.Children = nodes.Where(Function(ch) ch.ParentNodeName = n.Name).ToList)
When you use Function, the lambda expression is expected to return a value; in your case the compiler thinks it should return a Boolean, which is why the = is parsed as a comparison.
Since you want a lambda expression that does not return anything (here, an assignment statement), you have to use Sub.
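Applied to the Node class from the question, a sketch of the full helper could then look like this (it assumes a root node is one whose ParentNodeName is Nothing):
Private Function BuildTree(nodes As List(Of Node)) As IEnumerable(Of Node)
    ' Sub(...) makes the lambda body a statement, so = is parsed as an assignment
    nodes.ForEach(Sub(n) n.Children = nodes.Where(Function(ch) ch.ParentNodeName = n.Name).ToList())
    Return nodes.Where(Function(n) n.ParentNodeName Is Nothing).ToList()
End Function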
I ran into a problem using Java objects in Jython today, because Jython tries to be intelligent and automatically creates properties for (simple) getter/setter methods: for each such method, an attribute is created with the leading get/set removed and the next letter converted to lowercase:
//java code
class MyClass {
    public List<Thing> getAllThings() { ... }
    public List<Thing> getSpecificThings(String filter) { ... }
    public void setSomeThing(SomeThing x) { ... }
    [...]
}
#jython code
obj = MyClass()
hasattr(obj, "allThings") #-> True
hasattr(obj, "specificThings") #-> False because getSpecificThings has a param
hasattr(obj, "someThing") #-> False BUT
"someThing" in dir(obj) #-> True
The last line summarizes my problem here: the result of dir contains these fields (even when executed on obj.class instead of obj). I need a list of all methods callable on the object, which for my objects is basically the result of dir without these properties, filtered to exclude everything inherited from java.lang.Object and everything starting with an underscore (the purpose of this is to automagically convert some Python classes to Java equivalents, e.g. dicts to Maps). In theory I could use __dict__, which doesn't contain them, but that would mean I'd have to recursively evaluate the base classes' __dict__s too, which I would like to avoid.
What I am currently doing is checking whether the attribute actually exists and then whether it has an argslist attribute (meaning it is a method), which is true for every dir entry except for the generated properties:
for entry in dir(obj):
    # skip things starting with an underscore or inherited from Object
    if entry.startswith("_") or entry in dir(java.lang.Object): continue
    # check if the dir entry is a fake setter property
    if not hasattr(obj, entry): continue
    # check if the dir entry has an argslist attribute (false for getter props)
    e = getattr(obj, entry)
    if not hasattr(e, "argslist"): continue
    # start actual processing of the entry...
The problem with this approach is that the objects in question are interfaces to beans and a getSomething method typically fetches data from a database, so the getattr call for a property makes a roundtrip to the DB which can take multiple seconds and waste tons of memory.
Can I stop jython from generating these properties? If not, does anybody have an idea how I can filter out the properties without accessing them first? The only thing I could think of was checking if dir contains a method named get/set<property>, but this seems hackish and could generate false positives, which must be avoided.
The answer was easier than anticipated. While hasattr for the property is True for an object instance, it is False for the object's class if the get-method in question is not static: the class doesn't have the property, since you can't execute the method on it. The updated loop now looks like this:
for entry in dir(obj):
    # skip things starting with an underscore or inherited from Object
    if entry.startswith("_") or entry in dir(java.lang.Object): continue
    # check if the dir entry is a fake property
    if not hasattr(obj.class, entry): continue
    # check if the dir entry has an argslist attribute (false if not a method)
    e = getattr(obj, entry)
    if not hasattr(e, "argslist"): continue
    # start actual processing of the entry...