Spring Data Solr: how to set multiValue to false when declaring a field - spring-data-solr

I am playing around with Spring Data Solr with Schemaless Solr. There are some dots I am not able to connect: when and how are the schema fields being created?
The following page states
Automatic schema population will inspect your domain types whenever the application context is refreshed and populate new fields to your index based on the properties configuration. This requires solr to run in Schemaless Mode.
Use @Indexed to provide additional details like specific solr types to use.
It also goes on to show a curl request:
// curl ../solr/collection1/schema/fields -X POST -H 'Content-type:application/json'
However, when I run a simple example with a field annotated with @Indexed, I do not see the /schema/fields API being called on Solr. How are these fields created?
The reason I am asking is that the fields seem to be created automatically with multiValued=true, and I did not see the @Indexed annotation taking multiValued as a parameter. How can I force Spring Data Solr to declare fields as non-multiValued when it creates them?
Now, all of this is really to resolve this exception I am seeing.
java.lang.IllegalArgumentException: [Assertion failed] - this argument is required; it must not be null
at org.springframework.util.Assert.notNull(Assert.java:115)
at org.springframework.util.Assert.notNull(Assert.java:126)
at org.springframework.data.solr.core.convert.MappingSolrConverter$SolrPropertyValueProvider.readValue(MappingSolrConverter.java:426)
at org.springframework.data.solr.core.convert.MappingSolrConverter$SolrPropertyValueProvider.readCollection(MappingSolrConverter.java:601)
at org.springframework.data.solr.core.convert.MappingSolrConverter$SolrPropertyValueProvider.readValue(MappingSolrConverter.java:440)
at org.springframework.data.solr.core.convert.MappingSolrConverter$SolrPropertyValueProvider.readValue(MappingSolrConverter.java:412)
at org.springframework.data.solr.core.convert.MappingSolrConverter$SolrPropertyValueProvider.getPropertyValue(MappingSolrConverter.java:395)
at org.springframework.data.solr.core.convert.MappingSolrConverter.getValue(MappingSolrConverter.java:206)
at org.springframework.data.solr.core.convert.MappingSolrConverter$1.doWithPersistentProperty(MappingSolrConverter.java:194)
at org.springframework.data.solr.core.convert.MappingSolrConverter$1.doWithPersistentProperty(MappingSolrConverter.java:186)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithProperties(BasicPersistentEntity.java:309)
at org.springframework.data.solr.core.convert.MappingSolrConverter.read(MappingSolrConverter.java:186)
at org.springframework.data.solr.core.convert.MappingSolrConverter.read(MappingSolrConverter.java:174)
at org.springframework.data.solr.core.convert.MappingSolrConverter.read(MappingSolrConverter.java:149)
at org.springframework.data.solr.core.SolrTemplate.convertSolrDocumentListToBeans(SolrTemplate.java:560)
at org.springframework.data.solr.core.SolrTemplate.convertQueryResponseToBeans(SolrTemplate.java:552)
at org.springframework.data.solr.core.SolrTemplate.createSolrResultPage(SolrTemplate.java:369)
at org.springframework.data.solr.core.SolrTemplate.doQueryForPage(SolrTemplate.java:300)
at org.springframework.data.solr.core.SolrTemplate.queryForPage(SolrTemplate.java:308)
at org.springframework.data.solr.repository.support.SimpleSolrRepository.findAll(SimpleSolrRepository.java:111)
at org.springframework.data.solr.repository.support.SimpleSolrRepository.findAll(SimpleSolrRepository.java:106)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at
My guess is that the exception is happening because values are being returned as a collection?
I tried to step through the code to understand what is happening. The field causing the problem is "name", whose value is [product-1]. The exception occurs while unmarshalling a Solr document into a POJO.
First we go inside the following method:
private <T> T readValue(Object value, TypeInformation<?> type, Object parent) {
    if (value == null) {
        return null;
    }

    Assert.notNull(type);

    Class<?> rawType = type.getType();
    if (hasCustomReadTarget(value.getClass(), rawType)) {
        return (T) convert(value, rawType);
    }

    Object documentValue = null;
    if (value instanceof SolrInputField) {
        documentValue = ((SolrInputField) value).getValue();
    } else {
        documentValue = value;
    }

    if (documentValue instanceof Collection) {
        return (T) readCollection((Collection<?>) documentValue, type, parent);
    } else if (canConvert(documentValue.getClass(), rawType)) {
        return (T) convert(documentValue, rawType);
    }

    return (T) documentValue;
}
When this method is called, the value is a collection and the type is java.lang.String. The if (documentValue instanceof Collection) branch is therefore taken, which leads to the following method being executed:
private Object readCollection(Collection<?> source, TypeInformation<?> type, Object parent) {
    Assert.notNull(type);

    Class<?> collectionType = type.getType();
    if (CollectionUtils.isEmpty(source)) {
        return source;
    }

    collectionType = Collection.class.isAssignableFrom(collectionType) ? collectionType : List.class;

    Collection<Object> items;
    if (type.getType().isArray()) {
        items = new ArrayList<Object>();
    } else {
        items = CollectionFactory.createCollection(collectionType, source.size());
    }

    TypeInformation<?> componentType = type.getComponentType();

    Iterator<?> it = source.iterator();
    while (it.hasNext()) {
        items.add(readValue(it.next(), componentType, parent));
    }

    return type.getType().isArray() ? convertItemsToArrayOfType(type, items) : items;
}
In this method, we end up calling type.getComponentType(), which returns null because the target type is a plain String; that null is then passed as the type into readValue(), whose Assert.notNull(type) fails.
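To make the mismatch concrete, here is a minimal standalone sketch (my own demo against the spring-data-commons API, not code from the question):

import org.springframework.data.util.ClassTypeInformation;
import org.springframework.data.util.TypeInformation;

public class ComponentTypeDemo {
    public static void main(String[] args) {
        // The POJO declares "name" as a plain String...
        TypeInformation<?> type = ClassTypeInformation.from(String.class);

        // ...and a plain String has no component/element type, so this prints null.
        // readCollection() passes exactly this null into readValue(), where
        // Assert.notNull(type) throws the IllegalArgumentException shown above.
        System.out.println(type.getComponentType());
    }
}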
What am I missing in all this?
My code is as follows. Launch and config class:
@Configuration
@ComponentScan
@EnableAutoConfiguration
@EnableSolrRepositories(schemaCreationSupport = true, basePackages = { "com.example.solrdata" }, multicoreSupport = true)
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
Model class:
@SolrDocument(solrCoreName = "collection1")
public class Product {

    @Id
    String id;

    @Indexed
    String name;

    public Product(String id, String name) {
        this.id = id;
        this.name = name;
    }

    public Product() {
        super();
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
Repository:
public interface ProductRepository extends SolrCrudRepository<Product, String> {}
Test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(Application.class)
public class SolrProductRepositoryTest {

    @Autowired
    private ProductRepository repo;

    @Before @After
    public void setup() {
        repo.deleteAll();
    }

    @Test
    public void testCRUD() {
        assertEquals(0, repo.count());
        Product product = new Product("1", "product-1");
        repo.save(product);
        assertEquals(1, repo.count());
        Product product2 = repo.findOne(product.getId());
        assertEquals(product2.getName(), product.getName());
    }
}
And finally, my POM:
<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>1.3.3.RELEASE</version>
</parent>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-solr</artifactId>
    </dependency>
</dependencies>

When using Solr 4.10 with the defaults, the JSONResponseWriter in solrconfig.xml unfortunately uses text/plain as the content type.
<queryResponseWriter name="json" class="solr.JSONResponseWriter">
    <str name="content-type">text/plain; charset=UTF-8</str>
</queryResponseWriter>
This causes the content-type negotiation of SolrSchemaRequest to fail silently and skip the schema update step, so Solr's default field-type guessing kicks in instead.
Setting the content-type to application/json allows fields to be added according to the bean definition.
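For illustration, the corrected solrconfig.xml entry would be the same writer as above with only the content type changed:

<queryResponseWriter name="json" class="solr.JSONResponseWriter">
    <str name="content-type">application/json; charset=UTF-8</str>
</queryResponseWriter>

With that in place, the schema update creates fields matching the annotated bean: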
@SolrDocument(solrCoreName = "collection1")
public static class SomeDomainType {

    @Indexed @Id
    String id;

    @Indexed
    String indexedStringWithoutType;

    @Indexed(name = "namedField", type = "string", searchable = false)
    String justAStoredField;

    @Indexed
    List<String> listField;

    @Indexed(type = "tdouble")
    Double someDoubleValue;
}
Before the schema update:
{
    responseHeader: {
        status: 0,
        QTime: 86
    },
    fields: [
        {
            name: "_version_",
            type: "long",
            indexed: true,
            stored: true
        },
        {
            name: "id",
            type: "string",
            multiValued: false,
            indexed: true,
            required: true,
            stored: true,
            uniqueKey: true
        }
    ]
}
After the schema update:
{
    responseHeader: {
        status: 0,
        QTime: 1
    },
    fields: [
        {
            name: "_version_",
            type: "long",
            indexed: true,
            stored: true
        },
        {
            name: "id",
            type: "string",
            multiValued: false,
            indexed: true,
            required: true,
            stored: true,
            uniqueKey: true
        },
        {
            name: "indexedStringWithoutType",
            type: "string",
            multiValued: false,
            indexed: true,
            required: false,
            stored: true
        },
        {
            name: "listField",
            type: "string",
            multiValued: true,
            indexed: true,
            required: false,
            stored: true
        },
        {
            name: "namedField",
            type: "string",
            multiValued: false,
            indexed: false,
            required: false,
            stored: true
        },
        {
            name: "someDoubleValue",
            type: "tdouble",
            multiValued: false,
            indexed: true,
            required: false,
            stored: true
        }
    ]
}

Related

Spring Batch - Unable to deserialize the execution context - OffsetDateTime - cannot deserialize

I'm trying to create a Spring Batch job with multiple steps, passing objects from step to step.
To do this I use an ExecutionContext that I promoted from the step to the job context.
On the first run there is no problem: data flows correctly from step to step.
On subsequent runs, I get this error:
"Unable to deserialize the execution context" Caused by: com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of java.time.OffsetDateTime (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
I write to the context in an ItemWriter like so:
@Override
public void write(List<? extends Employee> items) throws Exception {
    ExecutionContext stepContext = this.stepExecution.getExecutionContext();
    List<Employee> e = new ArrayList<Employee>();
    e.addAll(items);
    stepContext.put("someKey", e);
}
And read it back in an ItemReader (from another step) with:
@BeforeStep
public void retrieveInterstepData(StepExecution stepExecution) {
    JobExecution jobExecution = stepExecution.getJobExecution();
    ExecutionContext jobContext = jobExecution.getExecutionContext();
    this.someObject = (List<Employee>) jobContext.get("someKey");
}
I checked the Spring Batch context tables in the database, and my dates (LocalDate, OffsetDateTime, ...) are stored like:
"LocalDate": {
"year": 2019,
"month": "OCTOBER",
"dayOfMonth": 30,
"monthValue": 10,
"era": ["java.time.chrono.IsoEra", "CE"],
"dayOfWeek": "WEDNESDAY",
"dayOfYear": 303,
"leapYear": false,
"chronology": {
"id": "ISO",
"calendarType": "iso8601"
}
}
"OffsetDateTime": {
"offset": {
"totalSeconds": 0,
"id": "Z",
"rules": {
"fixedOffset": true,
"transitionRules": ["java.util.Collections$UnmodifiableRandomAccessList", []],
"transitions": ["java.util.Collections$UnmodifiableRandomAccessList", []]
}
},
"month": "OCTOBER",
"year": 2019,
"dayOfMonth": 28,
"hour": 13,
"minute": 42,
"monthValue": 10,
"nano": 511651000,
"second": 36,
"dayOfWeek": "MONDAY",
"dayOfYear": 301
}
I guess it's Jackson's choice to store it like that (I customized nothing), but it seems that Jackson can't read its own format on the next run?!
My stubs are generated from Swagger with "swagger-codegen-maven-plugin" and configOptions/dateLibrary=java8, so I can't change them.
I tried to add
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
</dependency>
And
@PostConstruct
public void init() {
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
}
In the @SpringBootApplication class, but with no change.
Any ideas? Either to store dates more simply, like "2019-11-04", or to make Jackson read its own format?
Your ObjectMapper should be set on the Jackson2ExecutionContextStringSerializer used by the job repository. You can extend DefaultBatchConfigurer and override createJobRepository:
@Bean
public JobRepository createJobRepository() throws Exception {
    ObjectMapper objectMapper = new ObjectMapper().registerModule(new JavaTimeModule());
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);

    Jackson2ExecutionContextStringSerializer defaultSerializer = new Jackson2ExecutionContextStringSerializer();
    defaultSerializer.setObjectMapper(objectMapper);

    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(dataSource);
    factory.setTransactionManager(transactionManager);
    factory.setSerializer(defaultSerializer);
    factory.afterPropertiesSet();
    return factory.getObject();
}
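For orientation, a minimal sketch of an enclosing class this override could live in (the class name and constructor injection are my illustrative assumptions, not part of the original answer):

@Configuration
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    private final DataSource dataSource;
    private final PlatformTransactionManager transactionManager;

    public CustomBatchConfigurer(DataSource dataSource, PlatformTransactionManager transactionManager) {
        super(dataSource);
        this.dataSource = dataSource;
        this.transactionManager = transactionManager;
    }

    // the createJobRepository() method from the snippet above goes here
}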
EDIT:
My bad, I just saw that I have a

@Bean
public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
    return new DefaultBatchConfigurer(dataSource);
}

which provided two BatchConfigurers to Spring.
Thanks!
ORIGINAL:
Thanks, it seems promising, but I can't find where to extend and use it, or on which class.
I have a batch configuration class:
@Configuration
@EnableConfigurationProperties(BatchProperties.class)
public class BatchDatabaseConfiguration {

    @Value("${spring.datasource.driver-class-name}")
    private String driverClassName;

    @Value("${spring.datasource.url}")
    private String dbURL;

    @Bean("batchDataSource")
    public DataSource batchDataSource() {
        final DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(dbURL);
        return dataSource;
    }

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "batchTransactionManager")
    public PlatformTransactionManager batchTransactionManager(@Qualifier("batchDataSource") DataSource dataSource) {
        DataSourceTransactionManager tm = new DataSourceTransactionManager();
        tm.setDataSource(dataSource);
        return tm;
    }
}
And a class with the job definition:
@Configuration
@EnableBatchProcessing
public class ExtractionJobConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job creationJob() {
        ...
    }

    [...]
}
And the main class:
@EntityScan(basePackages = { "..." })
@SpringBootApplication
@EnableAsync
public class App {

    public static void main(String[] args) {
        ApplicationContext ctx = SpringApplication.run(App.class, args);
    }
}
What do you think?
I also read that Spring Batch 4.2.0+ allows customising the ObjectMapper used by Jackson2ExecutionContextStringSerializer (https://jira.spring.io/browse/BATCH-2828).
Is that what you propose? (I can't find other information.)

Using jackson mixin class for a list of objects

I'm having a problem deserializing the following JSON:
{
    "GrpHdr": {
        "MsgId": "Message-1",
        "CreDtTm": "2018-03-02T10:15:30+01:00[Europe/Paris]",
        "NbOfTxs": "1",
        "InitgPty": {
            "Nm": "Remitter"
        }
    },
    "PmtInf": [
        {
            "PmtInfId": "1"
        },
        {
            "PmtInfId": "2"
        }
    ]
}
I have created a MixIn class:
public abstract class CustomerCreditTransferInitiationMixIn {

    public PaymentInstructions paymentInstructions;

    @JsonCreator
    public CustomerCreditTransferInitiationMixIn(
            @JsonProperty("GrpHdr") GroupHeader GrpHdr,
            @JsonProperty("PmtInf") List<PaymentInstruction> PmtInf) {
        this.paymentInstructions = PaymentInstructions.valueOf(PmtInf);
    }

    @JsonProperty("GrpHdr")
    abstract GroupHeader getGroupHeader();

    @JsonProperty("PmtInf")
    abstract List<PaymentInstruction> getPaymentInstructions();
}
I have no trouble deserializing the group header in this case, mapping the different names. But the PmtInf case confuses me: it is a JSON array that I want to deserialize into a list of PaymentInstruction objects, while each PmtInf element is a single PaymentInstruction.
I have created a test:
@Test
public void JacksonMixinAnnotationTestJsonIsoFileFromTester() throws JsonProcessingException, Throwable {
    CustomerCreditTransferInitiation customerCreditTransferInitiation;
    String jsonFile = "testWithShortNames";
    InputStream inputStream = new ClassPathResource(jsonFile + ".json").getInputStream();

    ObjectMapper objectMapper = buildMapper();
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    objectMapper.registerModule(new JavaTimeModule());
    objectMapper.addMixIn(CustomerCreditTransferInitiation.class, CustomerCreditTransferInitiationMixIn.class);
    objectMapper.addMixIn(GroupHeader.class, GroupHeaderMixIn.class);
    objectMapper.addMixIn(PaymentInstruction.class, PaymentInstructionMixIn.class);
    objectMapper.addMixIn(PartyIdentification.class, PartyIdentificationMixIn.class);

    customerCreditTransferInitiation = objectMapper.readValue(inputStream, CustomerCreditTransferInitiation.class);

    // GroupHeader
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getMessageId());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getCreationDateTime());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getNumberOfTransactions());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty());
    Assert.assertNotNull(customerCreditTransferInitiation.getGroupHeader().getInitiatingParty().getName());

    // PaymentInstructions
    Assert.assertNotNull(customerCreditTransferInitiation.getPaymentInstructions());
}
Getting the following error:
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException:
Unrecognized field "PmtInfId" (class com.seb.payment.iso.domain.PaymentInstruction), not marked as ignorable
(19 known properties: "paymentInformationId", "paymentMethod", "created", "paymentTypeInformation", "controlSum", "debtorAgent", "instructionForDebtorAgent", "numberOfTransactions", "requestExecutionTime", "debtorAccount", "creditTransferTransactions", "debtorAgentAccount", "batchBooking", "poolingAdjustmentDate", "ultimateDebtor", "chargeBearerType", "debtor", "chargesAccount", "chargesAccountAgent")
at [Source: UNKNOWN; line: -1, column: -1]
(through reference chain: com.seb.payment.iso.domain.CustomerCreditTransferInitiation["PmtInf"]->com.seb.payment.iso.domain.PaymentInstruction["PmtInfId"])
In our case we have implemented our own deserializers in an abstract iterable. On:

ObjectReader objectReader = ObjectMapperFactory.instance().readerFor(this.itemClass);

the mix-in classes are lost.
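A hedged sketch of a likely fix, assuming ObjectMapperFactory.instance() returns a different ObjectMapper than the one the mix-ins were registered on; addMixIn() is per-mapper configuration, so the reader has to come from the configured mapper:

// Mix-ins live on the ObjectMapper they were registered on; build the reader
// from that mapper instead of from a freshly obtained instance.
ObjectReader reader = objectMapper.readerFor(PaymentInstruction.class);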

Add type definition for type not used in any api method

I have an API endpoint that takes a custom header.
This header is an object that looks like this:
User: {"UserId":"someguid"}
If I have the type as a parameter in an API method, I can do the following:
public class AddFileParamTypes : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        if (operation.operationId == "FileDelivery_Post")
        {
            operation.consumes.Add("multipart/form-data");
            operation.parameters.Add(new Parameter
            {
                name = "file",
                required = true,
                type = "file",
                @in = "formData"
            });
            operation.parameters.Add(new Parameter
            {
                name = "User",
                required = true,
                schema = new Schema() { @ref = "#/definitions/User" },
                @in = "header"
            });
        }
    }
}
But the User type will not be a parameter for any method, so how do I add the definition to Swashbuckle?
I found out; it was simple but lacking in documentation:
public class AddFileParamTypes : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        if (operation.operationId == "FileDelivery_Post")
        {
            operation.consumes.Add("multipart/form-data");
            operation.parameters.Add(new Parameter
            {
                name = "file",
                required = true,
                type = "file",
                @in = "formData"
            });
            operation.parameters.Add(new Parameter
            {
                name = "User",
                required = true,
                schema = schemaRegistry.GetOrRegister(typeof(User)),
                @in = "header"
            });
        }
    }
}

SerializationException of Avro Date Object (Date LogicalType)

I have a publisher that accepts a GenericRecord.
@Override
public Future<RecordMetadata> publish(GenericRecord genericRecord) {
    Future<RecordMetadata> recordMetadataFuture =
            getPublisher().send(new ProducerRecord<>(producerConfiguration.getProperties()
                    .getProperty(ProducerConfiguration.PROPERTY_NAME_TOPIC), "sample.key", genericRecord));
    return recordMetadataFuture;
}

private KafkaProducer<String, GenericRecord> getPublisher() {
    return new KafkaProducer<>(producerConfiguration.getProperties());
}
And I have the following Avro schema:
{
    "type" : "record",
    "name" : "SampleDate",
    "namespace" : "com.sample.data.generated.avro",
    "doc" : "sample date",
    "fields" : [
        {
            "name" : "sampleDate",
            "type" : {
                "type" : "int",
                "logicalType" : "date"
            }
        }
    ]
}
I have built my own serializer:
Date Serializer:
@Component
public class SampleDateSerializer implements Serializer<GenericRecord> {

    private AvroGenericSerializer serializer;

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        serializer = new AvroGenericSerializer(SampleDate.SCHEMA$);
    }

    @Override
    public byte[] serialize(String topic, GenericRecord data) {
        return serializer.serialize(data);
    }

    @Override
    public void close() {
    }
}
Generic Serializer:
public class AvroGenericSerializer {

    private EncoderFactory avroEncoderFactory;
    private DecoderFactory avroDecoderFactory;
    private GenericDatumWriter<GenericRecord> avroWriter;
    private GenericDatumReader<GenericRecord> avroReader;

    public AvroGenericSerializer(Schema schema) {
        avroEncoderFactory = EncoderFactory.get();
        avroDecoderFactory = DecoderFactory.get();
        avroWriter = new GenericDatumWriter<>(schema);
        avroReader = new GenericDatumReader<>(schema);
    }

    public byte[] serialize(GenericRecord data) {
        final ByteArrayOutputStream stream = new ByteArrayOutputStream();
        final BinaryEncoder binaryEncoder = avroEncoderFactory.binaryEncoder(stream, null);
        try {
            avroWriter.write(data, binaryEncoder);
            binaryEncoder.flush();
            stream.close();
            return stream.toByteArray();
        } catch (IOException e) {
            throw new RuntimeException("Can't serialize Avro object", e);
        }
    }

    public GenericRecord deserialize(byte[] bytes) {
        try {
            return avroReader.read(null, avroDecoderFactory.binaryDecoder(bytes, null));
        } catch (IOException e) {
            throw new RuntimeException("Can't deserialize Avro object", e);
        }
    }
}
However, when testing my publisher class, I am encountering the following error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class com.sample.data.generated.avro.SampleDate to class com.sample.message.serialize.SampleDateSerializer specified in value.serializer
Debugging the code, I found that GenericDatumWriter.write() fails because the following call returns null:

Conversion conversion = this.getData().getConversionByClass(datum.getClass(), logicalType);

which delegates to org.apache.avro.generic.GenericData:

public <T> Conversion<T> getConversionByClass(Class<T> datumClass, LogicalType logicalType) {
    Map conversions = (Map) this.conversionsByClass.get(datumClass);
    return conversions != null ? (Conversion) conversions.get(logicalType.getName()) : null;
}
In this regard, is there a way for me to populate the GenericData.conversionsByClass map so that it returns the correct converter to use for the date logicalType?
I solved it by passing a GenericData object to my GenericDatumWriter.
My Generic Serializer now looks like this:
public AvroGenericSerializer(Schema schema) {
    avroEncoderFactory = EncoderFactory.get();
    avroDecoderFactory = DecoderFactory.get();

    final GenericData genericData = new GenericData();
    genericData.addLogicalTypeConversion(new TimeConversions.DateConversion());

    avroWriter = new GenericDatumWriter<>(schema, genericData);
    avroReader = new GenericDatumReader<>(schema);
}
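A hedged usage sketch, assuming a GenericRecord built against the schema from the question (with Avro 1.9+, TimeConversions.DateConversion maps java.time.LocalDate; the Avro 1.8 conversion used Joda-Time):

// With the DateConversion registered, a field declared as
// {"type": "int", "logicalType": "date"} can be populated with a LocalDate;
// the writer converts it to the underlying int (days since epoch) on write.
GenericRecord record = new GenericData.Record(schema);
record.put("sampleDate", java.time.LocalDate.now());
byte[] bytes = serializer.serialize(record);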

Swashbuckle 5 and multipart/form-data HelpPages

I am stuck trying to get Swashbuckle 5 to generate complete help pages for an ApiController with a POST request that takes multipart/form-data parameters. The help page for the action comes up in the browser, but it contains no information about the parameters passed in the form. I have created an operation filter and enabled it in SwaggerConfig; the page that includes the URI parameters, return type, and other info derived from the XML comments shows up, but nothing specified in the operation filter about the form parameters appears.
I must be missing something. Are there any suggestions on what I may have missed?
Operation filter code:
public class AddFormDataUploadParamTypes : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        if (operation.operationId == "Documents_Upload")
        {
            operation.consumes.Add("multipart/form-data");
            operation.parameters = new[]
            {
                new Parameter
                {
                    name = "anotherid",
                    @in = "formData",
                    description = "Optional identifier associated with the document.",
                    required = false,
                    type = "string",
                    format = "uuid"
                },
                new Parameter
                {
                    name = "documentid",
                    @in = "formData",
                    description = "The document identifier of the slot reserved for the document.",
                    required = false,
                    type = "string",
                    format = "uuid"
                },
                new Parameter
                {
                    name = "documenttype",
                    @in = "formData",
                    description = "Specifies the kind of document being uploaded. This is not a file name extension.",
                    required = true,
                    type = "string"
                },
                new Parameter
                {
                    name = "emailfrom",
                    @in = "formData",
                    description = "An optional email origination address used in association with the document if it is emailed to a receiver.",
                    required = false,
                    type = "string"
                },
                new Parameter
                {
                    name = "emailsubject",
                    @in = "formData",
                    description = "An optional email subject line used in association with the document if it is emailed to a receiver.",
                    required = false,
                    type = "string"
                },
                new Parameter
                {
                    name = "file",
                    @in = "formData",
                    description = "File to upload.",
                    required = true,
                    type = "file"
                }
            };
        }
    }
}
With Swashbuckle v5.0.0-rc4 the methods listed above do not work, but by reading the OpenAPI spec I managed to implement a working solution for uploading a single file. Other parameters can easily be added:
public class FileUploadOperationFilter : IOperationFilter
{
    public void Apply(OpenApiOperation operation, OperationFilterContext context)
    {
        var isFileUploadOperation =
            context.MethodInfo.CustomAttributes.Any(a => a.AttributeType == typeof(YourMarkerAttribute));
        if (!isFileUploadOperation) return;

        var uploadFileMediaType = new OpenApiMediaType()
        {
            Schema = new OpenApiSchema()
            {
                Type = "object",
                Properties =
                {
                    ["uploadedFile"] = new OpenApiSchema()
                    {
                        Description = "Upload File",
                        Type = "file",
                        Format = "binary"
                    }
                },
                Required = new HashSet<string>()
                {
                    "uploadedFile"
                }
            }
        };

        operation.RequestBody = new OpenApiRequestBody
        {
            Content =
            {
                ["multipart/form-data"] = uploadFileMediaType
            }
        };
    }
}
I presume you figured out what your problem was. I was able to use your posted code to make a perfect-looking Swagger UI interface, complete with the file [BROWSE...] input controls.
I only modified your code slightly so it is applied when it detects my preferred ValidateMimeMultipartContentFilter attribute, stolen from Damien Bond. Thus, my slightly modified version of your class looks like this:
public class AddFormDataUploadParamTypes<T> : IOperationFilter
{
    public void Apply(Operation operation, SchemaRegistry schemaRegistry, ApiDescription apiDescription)
    {
        var actFilters = apiDescription.ActionDescriptor.GetFilterPipeline();
        var supportsDesiredFilter = actFilters.Select(f => f.Instance).OfType<T>().Any();
        if (supportsDesiredFilter)
        {
            operation.consumes.Add("multipart/form-data");
            operation.parameters = new[]
            {
                //other parameters omitted for brevity
                new Parameter
                {
                    name = "file",
                    @in = "formData",
                    description = "File to upload.",
                    required = true,
                    type = "file"
                }
            };
        }
    }
}
Here's my Swagger UI:
FWIW:
My NuGets
<package id="Swashbuckle" version="5.5.3" targetFramework="net461" />
<package id="Swashbuckle.Core" version="5.5.3" targetFramework="net461" />
Swagger Config Example
public class SwaggerConfig
{
    public static void Register()
    {
        var thisAssembly = typeof(SwaggerConfig).Assembly;

        GlobalConfiguration.Configuration
            .EnableSwagger(c =>
            {
                c.Schemes(new[] { "https" });

                // Use "SingleApiVersion" to describe a single version API. Swagger 2.0 includes an "Info" object to
                // hold additional metadata for an API. Version and title are required but you can also provide
                // additional fields by chaining methods off SingleApiVersion.
                //
                c.SingleApiVersion("v1", "MyCorp.WebApi.Tsl");
                c.OperationFilter<MyCorp.Swashbuckle.AddFormDataUploadParamTypes<MyCorp.Attr.ValidateMimeMultipartContentFilter>>();
            })
            .EnableSwaggerUi(c =>
            {
                // If your API supports ApiKey, you can override the default values.
                // "apiKeyIn" can either be "query" or "header"
                //
                //c.EnableApiKeySupport("apiKey", "header");
            });
    }
}
UPDATE March 2019
I don't have quick access to the original project above, but, here's an example API controller from a different project...
Controller signature:
[ValidateMimeMultipartContentFilter]
[SwaggerResponse(HttpStatusCode.OK, Description = "Returns JSON object filled with descriptive data about the image.")]
[SwaggerResponse(HttpStatusCode.NotFound, Description = "No appropriate equipment record found for this endpoint")]
[SwaggerResponse(HttpStatusCode.BadRequest, Description = "This request was fulfilled previously")]
public async Task<IHttpActionResult> PostSignatureImage(Guid key)
You'll note that there's no actual parameter representing my file in the signature; you can see below that I just spin up a MultipartFormDataStreamProvider to suck out the incoming POST'd form data.
Controller Body:
var signatureImage = await db.SignatureImages.Where(img => img.Id == key).FirstOrDefaultAsync();
if (signatureImage == null)
{
    return NotFound();
}

if (!signatureImage.IsOpenForCapture)
{
    ModelState.AddModelError("CaptureDateTime", $"This equipment has already been signed once on {signatureImage.CaptureDateTime}");
}

if (!ModelState.IsValid)
{
    return BadRequest(ModelState);
}

string fileName = String.Empty;
string ServerUploadFolder = System.Web.Hosting.HostingEnvironment.MapPath("~/App_Data/");
DirectoryInfo di = new DirectoryInfo(ServerUploadFolder + key.ToString());
if (di.Exists == true)
    ModelState.AddModelError("id", "It appears an upload for this item is either in progress or has already occurred.");
else
    di.Create();

var fullPathToFinalFile = String.Empty;
var streamProvider = new MultipartFormDataStreamProvider(di.FullName);
await Request.Content.ReadAsMultipartAsync(streamProvider);

foreach (MultipartFileData fileData in streamProvider.FileData)
{
    if (string.IsNullOrEmpty(fileData.Headers.ContentDisposition.FileName))
    {
        return StatusCode(HttpStatusCode.NotAcceptable);
    }

    fileName = cleanFileName(fileData.Headers.ContentDisposition.FileName);
    fullPathToFinalFile = Path.Combine(di.FullName, fileName);
    File.Move(fileData.LocalFileName, fullPathToFinalFile);
    signatureImage.Image = File.ReadAllBytes(fullPathToFinalFile);
    break;
}

signatureImage.FileName = streamProvider.FileData.Select(entry => cleanFileName(entry.Headers.ContentDisposition.FileName)).First();
signatureImage.FileLength = signatureImage.Image.LongLength;
signatureImage.IsOpenForCapture = false;
signatureImage.CaptureDateTime = DateTimeOffset.Now;
signatureImage.MimeType = streamProvider.FileData.Select(entry => entry.Headers.ContentType.MediaType).First();
db.Entry(signatureImage).State = EntityState.Modified;

try
{
    await db.SaveChangesAsync();
    //cleanup...
    File.Delete(fullPathToFinalFile);
    di.Delete();
}
catch (DbUpdateConcurrencyException)
{
    if (!SignatureImageExists(key))
    {
        return NotFound();
    }
    else
    {
        throw;
    }
}

char[] placeHolderImg = paperClipIcon_svg.ToCharArray();
signatureImage.Image = Convert.FromBase64CharArray(placeHolderImg, 0, placeHolderImg.Length);
return Ok(signatureImage);
Extending @bkwdesign's very useful answer...
His/her code includes:
//other parameters omitted for brevity
You can actually pull all the parameter information (for the non-multipart-form parameters) from the arguments passed to the filter. Inside the check for supportsDesiredFilter, do the following:
if (operation.parameters.Count != apiDescription.ParameterDescriptions.Count)
{
    throw new ApplicationException("Inconsistencies in parameters count");
}

operation.consumes.Add("multipart/form-data");

var parametersList = new List<Parameter>(apiDescription.ParameterDescriptions.Count + 1);
for (var i = 0; i < apiDescription.ParameterDescriptions.Count; ++i)
{
    var schema = schemaRegistry.GetOrRegister(apiDescription.ParameterDescriptions[i].ParameterDescriptor.ParameterType);
    parametersList.Add(new Parameter
    {
        name = apiDescription.ParameterDescriptions[i].Name,
        @in = operation.parameters[i].@in,
        description = operation.parameters[i].description,
        required = !apiDescription.ParameterDescriptions[i].ParameterDescriptor.IsOptional,
        type = apiDescription.ParameterDescriptions[i].ParameterDescriptor.ParameterType.FullName,
        schema = schema,
    });
}

parametersList.Add(new Parameter
{
    name = "fileToUpload",
    @in = "formData",
    description = "File to upload.",
    required = true,
    type = "file"
});

operation.parameters = parametersList;
First it checks to make sure that the two arrays being passed in are consistent. Then it walks through them, pulling out the required info to build the collection of Swashbuckle Parameters.
The hardest thing was to figure out that the types needed to be registered in the "schema" in order to show up in the Swagger UI. But this works for me.
Everything else I did was consistent with @bkwdesign's post.