How do I solve this spec loader error in Apache Isis?

I am using the incode platform as a test platform. I have added an entity, Provider, in a separate module. Provider has a reference to itself via a parent property, and I want to be able to use autocomplete, or to select a provider from a list, when filling in the parent field.
Whenever I add the autoComplete option to the @DomainObject annotation and reference the ProviderRepository, I get a build error as below.
[DEBUG] Meta model invalid
org.apache.isis.core.metamodel.specloader.validator.MetaModelInvalidException: 1: @DomainObject annotation on domainapp.modules.provider.dom.Provider specifies unknown repository 'domainapp.modules.provider.dom.ProviderRepository'
at org.apache.isis.core.metamodel.specloader.validator.ValidationFailures.assertNone(ValidationFailures.java:51)
at org.apache.isis.core.metamodel.specloader.SpecificationLoader.validateAndAssert(SpecificationLoader.java:252)
at org.apache.isis.core.runtime.system.session.IsisSessionFactoryBuilder$1.run(IsisSessionFactoryBuilder.java:206)
at org.apache.isis.core.runtime.system.session.IsisSessionFactory$1.call(IsisSessionFactory.java:322)
at org.apache.isis.core.runtime.system.session.IsisSessionFactory$1.call(IsisSessionFactory.java:319)
at org.apache.isis.core.runtime.system.session.IsisSessionFactory.doInSession(IsisSessionFactory.java:353)
at org.apache.isis.core.runtime.system.session.IsisSessionFactory.doInSession(IsisSessionFactory.java:319)
at org.apache.isis.core.runtime.system.session.IsisSessionFactory.doInSession(IsisSessionFactory.java:306)
at org.apache.isis.core.runtime.system.session.IsisSessionFactoryBuilder.buildSessionFactory(IsisSessionFactoryBuilder.java:201)
at org.apache.isis.tool.mavenplugin.IsisMojoAbstract.execute(IsisMojoAbstract.java:65)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[DEBUG] flushTransaction
[DEBUG] ... calling #PreDestroy method: org.apache.isis.objectstore.jdo.datanucleus.service.support.TimestampService: close
Here is the Provider entity
package domainapp.modules.provider.dom;

//import domainapp.modules.provider.ProviderModule;
import domainapp.modules.provider.ProviderModule.PropertyDomainEvent;

import java.math.BigDecimal;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.Comparator;
import java.util.List;
import java.util.Set;
import java.util.SortedSet;
import java.util.TreeSet;
import javax.jdo.annotations.IdentityType;
import javax.jdo.annotations.VersionStrategy;
import javax.validation.constraints.Digits;
import javax.xml.bind.annotation.adapters.XmlJavaTypeAdapter;
import com.google.common.base.Objects;
import com.google.common.base.Predicate;
import com.google.common.collect.Ordering;
import org.joda.time.LocalDate;
import org.apache.isis.applib.Identifier;
import org.apache.isis.applib.NonRecoverableException;
import org.apache.isis.applib.RecoverableException;
import org.apache.isis.applib.annotation.Action;
import org.apache.isis.applib.annotation.Auditing;
import org.apache.isis.applib.annotation.Collection;
import org.apache.isis.applib.annotation.CommandReification;
import org.apache.isis.applib.annotation.DomainObject;
import org.apache.isis.applib.annotation.Editing;
import org.apache.isis.applib.annotation.InvokeOn;
import org.apache.isis.applib.annotation.InvokedOn;
import org.apache.isis.applib.annotation.MemberOrder;
import org.apache.isis.applib.annotation.Optionality;
import org.apache.isis.applib.annotation.Parameter;
import org.apache.isis.applib.annotation.Programmatic;
import org.apache.isis.applib.annotation.Property;
import org.apache.isis.applib.annotation.Publishing;
import org.apache.isis.applib.annotation.RestrictTo;
import org.apache.isis.applib.annotation.SemanticsOf;
import org.apache.isis.applib.annotation.Title;
import org.apache.isis.applib.annotation.Where;
import org.apache.isis.applib.security.UserMemento;
import org.apache.isis.applib.services.actinvoc.ActionInvocationContext;
import org.apache.isis.applib.services.eventbus.EventBusService;
import org.apache.isis.applib.services.eventbus.ObjectCreatedEvent;
import org.apache.isis.applib.services.eventbus.ObjectLoadedEvent;
import org.apache.isis.applib.services.eventbus.ObjectPersistedEvent;
import org.apache.isis.applib.services.eventbus.ObjectPersistingEvent;
import org.apache.isis.applib.services.eventbus.ObjectRemovingEvent;
import org.apache.isis.applib.services.eventbus.ObjectUpdatedEvent;
import org.apache.isis.applib.services.eventbus.ObjectUpdatingEvent;
import org.apache.isis.applib.services.i18n.TranslatableString;
import org.apache.isis.applib.services.message.MessageService;
import org.apache.isis.applib.services.repository.RepositoryService;
import org.apache.isis.applib.services.scratchpad.Scratchpad;
import org.apache.isis.applib.services.title.TitleService;
import org.apache.isis.applib.services.user.UserService;
import org.apache.isis.applib.services.wrapper.HiddenException;
import org.apache.isis.applib.services.wrapper.WrapperFactory;
import org.apache.isis.applib.services.xactn.TransactionService;
import org.apache.isis.applib.util.ObjectContracts;
import org.apache.isis.applib.util.TitleBuffer;
import org.apache.isis.applib.value.Blob;
import org.apache.isis.applib.value.Clob;
import org.apache.isis.schema.utils.jaxbadapters.PersistentEntityAdapter;
import org.isisaddons.wicket.fullcalendar2.cpt.applib.CalendarEvent;
import org.isisaddons.wicket.fullcalendar2.cpt.applib.CalendarEventable;
import org.isisaddons.wicket.gmap3.cpt.applib.Locatable;
import org.isisaddons.wicket.gmap3.cpt.applib.Location;
import org.isisaddons.wicket.gmap3.cpt.service.LocationLookupService;
//import org.apache.isis.applib.services.eventbus.PropertyDomainEvent;
import lombok.Builder;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.Setter;
import lombok.ToString;
@javax.jdo.annotations.PersistenceCapable(
        identityType = IdentityType.DATASTORE,
        schema = "simple"
)
@javax.jdo.annotations.DatastoreIdentity(
        strategy = javax.jdo.annotations.IdGeneratorStrategy.IDENTITY,
        column = "id")
@javax.jdo.annotations.Version(
        strategy = VersionStrategy.DATE_TIME,
        column = "version")
@javax.jdo.annotations.Queries({
        @javax.jdo.annotations.Query(
                name = "findByName",
                value = "SELECT "
                        + "FROM domainapp.modules.provider.dom.Provider "
                        + "WHERE name.indexOf(:name) >= 0 ")
})
@javax.jdo.annotations.Unique(name = "Provider_name_UNQ", members = {"name"})
@DomainObject(
        autoCompleteRepository = ProviderRepository.class, // for drop-downs, unless autoCompleteNXxx() or choicesNXxx() present
        autoCompleteAction = "autoComplete",
        // updatedLifecycleEvent = ToDoItem.UpdatedEvent.class,
        auditing = Auditing.ENABLED
)
@XmlJavaTypeAdapter(PersistentEntityAdapter.class)
@EqualsAndHashCode(of = {"name"})
@ToString(of = {"name"})
public class Provider implements Comparable<Provider>, Locatable {

    /**
     * The fields of the Provider entity: name String (required), website String,
     * twitter String, facebook String, email String, primaryLocation String,
     * country String, city String, state String, street String, apiKey String,
     * apiUser String, endpointUrl String, logo ImageBlob
     */
    @Builder
    public Provider(final String name) {
        setName(name);
    }

    @javax.jdo.annotations.Column(allowsNull = "false", length = 40)
    @Title(prepend = "Provider: ")
    @Property(editing = Editing.DISABLED)
    @Getter
    @Setter
    private String name;

    // parent
    @javax.jdo.annotations.Column(allowsNull = "true")
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private Provider parent;
    // notes
    @javax.jdo.annotations.Column(allowsNull = "true", length = 4000)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String notes;

    // website
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String website;

    // facebook
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String facebook;

    // twitter
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String twitter;

    // instagram
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String instagram;

    // email
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String email;
    // primaryLocation: copy the way location is done in todoapp, i.e. the entity implements Locatable
    // public static class LocationDomainEvent extends ProviderModule.PropertyDomainEvent<Location> { }
    private Double locationLatitude;
    private Double locationLongitude;

    @Property(
            // ISIS-1138: Location value type not parsed from string, so fails to locate constructor
            // domainEvent = LocationDomainEvent.class,
            optionality = Optionality.OPTIONAL,
            // editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    public Location getLocation() {
        return locationLatitude != null && locationLongitude != null
                ? new Location(locationLatitude, locationLongitude)
                : null;
    }

    public void setLocation(final Location location) {
        locationLongitude = location != null ? location.getLongitude() : null;
        locationLatitude = location != null ? location.getLatitude() : null;
    }

    @MemberOrder(name = "location", sequence = "1")
    public Provider updateLocation(final String address) {
        final Location location = this.locationLookupService.lookup(address);
        setLocation(location);
        setAddress(address);
        return this;
    }
    // address
    @MemberOrder(name = "location", sequence = "2")
    @javax.jdo.annotations.Column(allowsNull = "true", length = 200)
    @Property(
            editing = Editing.DISABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String address;

    // city String,
    // state String,
    // street String,

    // apiKey
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String apiKey;

    // apiUser
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String apiUser;

    // endpointUrl
    @javax.jdo.annotations.Column(allowsNull = "true", length = 40)
    @Property(
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private String endpointUrl;

    // logo ImageBlob
    // public static class LogoDomainEvent extends PropertyDomainEvent<Blob> { }
    @javax.jdo.annotations.Persistent(defaultFetchGroup = "false", columns = {
            @javax.jdo.annotations.Column(name = "logo_name"),
            @javax.jdo.annotations.Column(name = "logo_mimetype"),
            @javax.jdo.annotations.Column(name = "logo_bytes", jdbcType = "BLOB", sqlType = "LONGVARBINARY")
    })
    @Property(
            // domainEvent = LogoDomainEvent.class,
            optionality = Optionality.OPTIONAL,
            editing = Editing.ENABLED,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    @Getter
    @Setter
    private Blob logo;
    // updateName
    @Action(
            semantics = SemanticsOf.IDEMPOTENT,
            command = CommandReification.ENABLED,
            publishing = Publishing.ENABLED
    )
    public Provider updateName(
            @Parameter(maxLength = 40) final String name) {
        setName(name);
        return this;
    }

    public String default0UpdateName() {
        return getName();
    }

    public TranslatableString validate0UpdateName(final String name) {
        return name != null && name.contains("!")
                ? TranslatableString.tr("Exclamation mark is not allowed")
                : null;
    }

    @Action(semantics = SemanticsOf.NON_IDEMPOTENT_ARE_YOU_SURE)
    public void delete() {
        final String title = titleService.titleOf(this);
        messageService.informUser(String.format("'%s' deleted", title));
        repositoryService.remove(this);
    }

    @Override
    public int compareTo(final Provider other) {
        return ObjectContracts.compare(this, other, "name");
    }

    @javax.inject.Inject
    RepositoryService repositoryService;
    @javax.inject.Inject
    TitleService titleService;
    @javax.inject.Inject
    MessageService messageService;
    @javax.inject.Inject
    private LocationLookupService locationLookupService;
    @javax.inject.Inject
    private ProviderMenu providerMenu;
    @javax.inject.Inject
    private ProviderRepository providerRepository;
}
And here is the ProviderRepository
package domainapp.modules.provider.dom;

import java.util.List;
import org.apache.isis.applib.annotation.DomainService;
import org.apache.isis.applib.annotation.NatureOfService;
import org.apache.isis.applib.annotation.Programmatic;
import org.apache.isis.applib.query.QueryDefault;
import org.apache.isis.applib.services.registry.ServiceRegistry2;
import org.apache.isis.applib.services.repository.RepositoryService;

@DomainService(
        nature = NatureOfService.DOMAIN,
        repositoryFor = Provider.class
)
public class ProviderRepository {

    public List<Provider> listAll() {
        return repositoryService.allInstances(Provider.class);
    }

    public List<Provider> findByName(final String name) {
        return repositoryService.allMatches(
                new QueryDefault<>(
                        Provider.class,
                        "findByName",
                        "name", name));
    }

    public Provider create(final String name) {
        final Provider object = new Provider(name);
        serviceRegistry.injectServicesInto(object);
        repositoryService.persist(object);
        return object;
    }

    //region > autoComplete (programmatic)
    @Programmatic // doesn't need to be part of the metamodel
    public List<Provider> autoComplete(final String description) {
        return findByName(description);
    }
    //endregion

    @javax.inject.Inject
    RepositoryService repositoryService;
    @javax.inject.Inject
    ServiceRegistry2 serviceRegistry;
}
I have been unable to figure out what the issue is.

What's happening here is that you are running the Apache Isis Maven plugin, and it is using an AppManifest (defined in the pom.xml) that - I am guessing - doesn't reference the new module containing these classes.
I'm guessing you probably cloned module-simple for your new module. In that case you can fix things in one of three ways:
- add the new module to the manifest referenced by the plugin in your new module (not that of module-simple)
- use -Dskip.isis-validate on the mvn command line
- remove the plugin, by removing the mavenmixin-isis-validate
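For the first option, the idea is that the AppManifest which the isis-maven-plugin validates against must list the new module, so that Provider and ProviderRepository are part of the metamodel. A rough sketch, assuming an Isis 1.16-style AppManifestAbstract; the class names here (MyAppManifest, SimpleModule, ProviderModule) are assumptions to adapt to your project:

```java
// Sketch only: module/class names are placeholders, not the actual project's.
public class MyAppManifest extends AppManifestAbstract {

    public static final Builder BUILDER = Builder.forModules(
            SimpleModule.class,    // whatever the manifest already lists
            ProviderModule.class   // the new module, so its entities and services are scanned
    );

    public MyAppManifest() {
        super(BUILDER);
    }
}
```

Once the module is registered, the specification loader can resolve ProviderRepository and the autoCompleteRepository reference should validate.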
HTH
Dan

Related

How can I implement the Android Car API in Android Studio to read out my EV's battery percentage?

I'm currently trying to display the battery percentage of my EV via Android Auto. I can't manage to get CarInfo carInfo = getCarContext().getCarService(CarHardwareManager.class).getCarInfo(); to run. I used useLibrary 'android.car' in my automotive build.gradle and tried importing many things, but that didn't help.
This is my first file:
package com.example.aatest4;
import android.car.Car;
import android.car.CarInfoManager;
import android.content.Intent;
import androidx.annotation.NonNull;
import androidx.car.app.CarAppMetadataHolderService;
import androidx.car.app.CarAppService;
import androidx.car.app.CarContext;
import androidx.car.app.Screen;
import androidx.car.app.Session;
import androidx.car.app.hardware.CarHardwareManager;
import androidx.car.app.hardware.info.CarInfo;
import androidx.car.app.validation.HostValidator;
//import android.content.ServiceConnection;
public class HelloWorldService extends CarAppService {

    @NonNull
    @Override
    public HostValidator createHostValidator() {
        return HostValidator.ALLOW_ALL_HOSTS_VALIDATOR;
    }

    public Session onCreateSession() {
        return new Session() {
            @NonNull
            @Override
            public Screen onCreateScreen(@NonNull Intent intent) {
                CarInfo carInfo = getCarContext().getCarService(CarHardwareManager.class).getCarInfo();
                return new HelloWorldScreen(getCarContext());
            }
        };
    }
}
and this is my second:
package com.example.aatest4;
//import android.car.Car;
//import android.car.CarInfoManager;
import androidx.annotation.NonNull;
import androidx.car.app.CarContext;
import androidx.car.app.Screen;
import androidx.car.app.model.Pane;
import androidx.car.app.model.PaneTemplate;
import androidx.car.app.model.Row;
import androidx.car.app.model.Template;
public class HelloWorldScreen extends Screen {

    //public HelloWorldScreen() {
    public HelloWorldScreen(CarContext carContext) {
        super(carContext);
    }

    @NonNull
    @Override
    public Template onGetTemplate() {
        String akku = String.valueOf(50);
        Row row = new Row.Builder().setTitle(akku + "% ").addText(akku + "%").build();
        return new PaneTemplate.Builder(new Pane.Builder().addRow(row).build())
                .setTitle("Akkuanzeige").build();
        // TODO: center?
    }
}

Accessing TableRow columns in BigQuery Apache Beam

I am trying to:
1. Read JSON events from Cloud Pub/Sub.
2. Load the events from Cloud Pub/Sub into BigQuery every 15 minutes using file loads, to save cost on streaming inserts.
3. Route each event based on the "user_id" and "campaign_id" fields in the JSON: "user_id" is the dataset name and "campaign_id" is the table name. The partition name comes from the event timestamp.
4. Use the same schema for all tables.
I am new to Java and Beam. I think my code mostly does what I am trying to do, and I just need a little help here.
But I am unable to access the "campaign_id" and "user_id" fields in the JSON message, so my events are not routing to the correct tables.
package ...;

import com.google.api.services.bigquery.model.TableRow; // was javafx.scene.control.TableRow, which is the wrong TableRow
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.DynamicDestinations;
import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.ValueInSingleWindow;
import org.joda.time.Duration;
import org.joda.time.Instant;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.text.SimpleDateFormat;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method.FILE_LOADS;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition.WRITE_APPEND;

public class ClickLogConsumer {

    private static final int BATCH_INTERVAL_SECS = 15 * 60;
    private static final String PROJECT = "pure-app";

    public static PTransform<PCollection<String>, PCollection<TableRow>> jsonToTableRow() {
        return new JsonToTableRow();
    }

    private static class JsonToTableRow
            extends PTransform<PCollection<String>, PCollection<TableRow>> {
        @Override
        public PCollection<TableRow> expand(PCollection<String> stringPCollection) {
            return stringPCollection.apply("JsonToTableRow", MapElements.<String, TableRow>via(
                    new SimpleFunction<String, TableRow>() {
                        @Override
                        public TableRow apply(String json) {
                            try {
                                InputStream inputStream = new ByteArrayInputStream(
                                        json.getBytes(StandardCharsets.UTF_8.name()));
                                // OUTER is used here to prevent an EOF exception
                                return TableRowJsonCoder.of().decode(inputStream, Coder.Context.OUTER);
                            } catch (IOException e) {
                                throw new RuntimeException("Unable to parse input", e);
                            }
                        }
                    }));
        }
    }

    public static void main(String[] args) throws Exception {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create(); // 'options' was undefined in the original snippet
        Pipeline pipeline = Pipeline.create(options);
        pipeline
                .apply(PubsubIO.readStrings().withTimestampAttribute("timestamp").fromTopic("projects/pureapp-199410/topics/clicks"))
                .apply(jsonToTableRow())
                .apply("WriteToBQ",
                        BigQueryIO.writeTableRows()
                                .withMethod(FILE_LOADS)
                                .withWriteDisposition(WRITE_APPEND)
                                .withCreateDisposition(CREATE_IF_NEEDED)
                                .withTriggeringFrequency(Duration.standardSeconds(BATCH_INTERVAL_SECS))
                                .withoutValidation()
                                .to(new DynamicDestinations<TableRow, String>() {
                                    @Override
                                    public String getDestination(ValueInSingleWindow<TableRow> element) {
                                        String tableName = "campaign_id"; // the JSON message in Pub/Sub has a "campaign_id" field; how do I access it here?
                                        String datasetName = "user_id"; // the JSON message in Pub/Sub has a "user_id" field; how do I access it here?
                                        Instant eventTimestamp = element.getTimestamp();
                                        String partition = new SimpleDateFormat("yyyyMMdd").format(eventTimestamp.toDate());
                                        return String.format("%s:%s.%s$%s", PROJECT, datasetName, tableName, partition);
                                    }

                                    @Override
                                    public TableDestination getTable(String table) {
                                        return new TableDestination(table, null);
                                    }

                                    @Override
                                    public TableSchema getSchema(String destination) {
                                        return getTableSchema(); // helper defined elsewhere
                                    }
                                }));
        pipeline.run();
    }
}
I arrived at the above code based on reading:
1. https://medium.com/myheritage-engineering/kafka-to-bigquery-load-a-guide-for-streaming-billions-of-daily-events-cbbf31f4b737
2. https://shinesolutions.com/2017/12/05/fun-with-serializable-functions-and-dynamic-destinations-in-cloud-dataflow/
3. https://beam.apache.org/documentation/sdks/javadoc/2.0.0/org/apache/beam/sdk/io/gcp/bigquery/DynamicDestinations.html
4. BigQueryIO - Write performance with streaming and FILE_LOADS
5. Inserting into BigQuery via load jobs (not streaming)
Update
import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.api.services.bigquery.model.TimePartitioning;
import com.google.common.collect.ImmutableList;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.TableDestination;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.SimpleFunction;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method.FILE_LOADS;
import static org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition.WRITE_APPEND;

public class ClickLogConsumer {

    private static final int BATCH_INTERVAL_SECS = 15 * 60;
    private static final String PROJECT = "pure-app";

    public static PTransform<PCollection<String>, PCollection<TableRow>> jsonToTableRow() {
        return new JsonToTableRow();
    }

    private static class JsonToTableRow
            extends PTransform<PCollection<String>, PCollection<TableRow>> {
        @Override
        public PCollection<TableRow> expand(PCollection<String> stringPCollection) {
            return stringPCollection.apply("JsonToTableRow", MapElements.<String, TableRow>via(
                    new SimpleFunction<String, TableRow>() {
                        @Override
                        public TableRow apply(String json) {
                            try {
                                InputStream inputStream = new ByteArrayInputStream(
                                        json.getBytes(StandardCharsets.UTF_8.name()));
                                // OUTER is used here to prevent an EOF exception
                                return TableRowJsonCoder.of().decode(inputStream, Coder.Context.OUTER);
                            } catch (IOException e) {
                                throw new RuntimeException("Unable to parse input", e);
                            }
                        }
                    }));
        }
    }

    public static void main(String[] args) throws Exception {
        PipelineOptions options = PipelineOptionsFactory.fromArgs(args).create(); // 'options' was undefined in the original snippet
        Pipeline pipeline = Pipeline.create(options);
        pipeline
                .apply(PubsubIO.readStrings().withTimestampAttribute("timestamp").fromTopic("projects/pureapp-199410/topics/clicks"))
                .apply(jsonToTableRow())
                .apply(BigQueryIO.write()
                        .withTriggeringFrequency(Duration.standardSeconds(BATCH_INTERVAL_SECS))
                        .withMethod(FILE_LOADS)
                        .withWriteDisposition(WRITE_APPEND)
                        .withCreateDisposition(CREATE_IF_NEEDED)
                        .withSchema(new TableSchema().setFields(
                                ImmutableList.of(
                                        new TableFieldSchema().setName("timestamp").setType("TIMESTAMP"),
                                        new TableFieldSchema().setName("exchange").setType("STRING"))))
                        .to((row) -> {
                            String datasetName = row.getValue().get("user_id").toString();
                            String tableName = row.getValue().get("campaign_id").toString();
                            return new TableDestination(String.format("%s:%s.%s", PROJECT, datasetName, tableName), "Some destination");
                        })
                        .withTimePartitioning(new TimePartitioning().setField("timestamp")));
        pipeline.run();
    }
}
How about: String tableName = element.getValue().get("campaign_id").toString() and likewise for the dataset.
Besides, for inserting into time-partitioned tables, I strongly recommend using BigQuery's Column-Based Partitioning, instead of using a partition decorator in the table name. Please see "Loading historical data into time-partitioned BigQuery tables" in the javadoc - you'll need a timestamp column. (note that the javadoc has a typo: "time" vs "timestamp")
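The routing logic itself can be sketched independently of Beam: a TableRow behaves like a Map, so the destination string is just built from its fields. The field names "user_id" and "campaign_id" come from the question; the sample values and the class name DestinationSketch are made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Mirrors the suggested getDestination(): dataset from "user_id", table from
// "campaign_id", and a partition decorator derived from the event date.
public class DestinationSketch {
    static final String PROJECT = "pure-app";

    static String destinationFor(Map<String, Object> row, String partition) {
        String datasetName = row.get("user_id").toString();
        String tableName = row.get("campaign_id").toString();
        // BigQuery table spec: project:dataset.table$partition
        return String.format("%s:%s.%s$%s", PROJECT, datasetName, tableName, partition);
    }

    public static void main(String[] args) {
        Map<String, Object> row = new HashMap<>();
        row.put("user_id", "u42");
        row.put("campaign_id", "spring_sale");
        System.out.println(destinationFor(row, "20180501"));
        // prints: pure-app:u42.spring_sale$20180501
    }
}
```

In the actual pipeline the same lookup happens on element.getValue() inside getDestination(), as suggested above.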

EclipseLink: "Missing class for indicator field value" without inheritance

I have a problem using MOXy to convert a JSON string to an XML-mapped object.
Here is the exception I get when I do this conversion:
Exception [EclipseLink-43] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: Missing class for indicator field value [TENANT] of type [class java.lang.String].
Descriptor: XMLDescriptor(fr.niji.nates.webservices.macd.ws.COMPONENTTYPE --> [])
at org.eclipse.persistence.exceptions.DescriptorException.missingClassForIndicatorFieldValue(DescriptorException.java:940)
at org.eclipse.persistence.internal.oxm.QNameInheritancePolicy.classFromRow(QNameInheritancePolicy.java:278)
[...]
Here is the class COMPONENTTYPE:
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlSeeAlso;
import javax.xml.bind.annotation.XmlType;
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "COMPONENT_TYPE")
@XmlSeeAlso({
        COMPONENTDETAILTYPE.class,
        MACDRESULTTYPE.Created.class
})
public class COMPONENTTYPE {

    @XmlAttribute(name = "type", required = true)
    protected String type;

    @XmlAttribute(name = "dbid", required = true)
    protected int dbid;

    public String getType() {
        return type;
    }

    public void setType(String value) {
        this.type = value;
    }

    public int getDbid() {
        return dbid;
    }

    public void setDbid(int value) {
        this.dbid = value;
    }
}
The problem seems to be only with the "type" attribute.
Does anyone have an idea?
Thanks,
The solution I found is to add the @XmlDiscriminatorNode annotation to the class:
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlAttribute;
import javax.xml.bind.annotation.XmlSeeAlso;
import javax.xml.bind.annotation.XmlType;
import org.eclipse.persistence.oxm.annotations.XmlDiscriminatorNode;
@XmlAccessorType(XmlAccessType.FIELD)
@XmlType(name = "COMPONENT_TYPE")
@XmlSeeAlso({
        COMPONENTDETAILTYPE.class,
        fr.niji.nates.webservices.macd.ws.MACDRESULTTYPE.Created.class
})
@XmlDiscriminatorNode("##type")
public class COMPONENTTYPE {
[...]

RDF Jena API: comparing subject, predicate and object

I created a SPARQL extension function. How can I compare the subject, predicate and object of an RDF file with the parameters of the function, which are (subject, property, object)?
This is my function, but it does not display any result.
import static com.sun.org.apache.xalan.internal.xsltc.compiler.util.Type.String;
import java.*;
import java.io.*;**strong text**
import java.util.HashMap;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;
import org.apache.jena.datatypes.xsd.XSDDatatype;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.RDFNode;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.rdf.model.Statement;
import org.apache.jena.rdf.model.StmtIterator;
import org.apache.jena.sparql.expr.NodeValue;
import org.apache.jena.sparql.function.Function;
import org.apache.jena.sparql.function.FunctionBase2;
import org.apache.jena.sparql.function.FunctionFactory;
import static org.apache.jena.vocabulary.RDF.Nodes.object;
import static org.apache.jena.vocabulary.RDF.Nodes.predicate;
import static org.apache.jena.vocabulary.RDF.Nodes.subject;
public class haschild extends FunctionBase2 implements FunctionFactory {
    private Map _cache;

    public haschild() {
        _cache = new HashMap();
    }

    @Override
    public NodeValue exec(NodeValue v1, NodeValue v2) {
        String value = v2.asString();
        String value2 = v1.asString();
        Model model = ModelFactory.createDefaultModel();
        InputStream is = null;
        try {
            is = new BufferedInputStream(
                    new FileInputStream("C:\\fichier rdf/journal.webscience.org-vivo.rdf"));
        } catch (FileNotFoundException ex) {
            Logger.getLogger(haschild.class.getName()).log(Level.SEVERE, null, ex);
        }
        model.read(new InputStreamReader(is), "");
        StmtIterator iter = model.listStatements();
        // walk the subject, predicate and object of each statement
        while (iter.hasNext()) {
            Statement stmt = iter.nextStatement();    // get the next statement
            Resource subject = stmt.getSubject();     // get the subject
            Property predicate = stmt.getPredicate(); // get the predicate
            RDFNode object = stmt.getObject();        // get the object
            if ((value2.equals(subject.toString()))
                    && (object.toString().equals(value))
                    && (predicate.toString().equals("http://www.w3.org/2000/01/rdf-schema#HasChild")))
                return NodeValue.TRUE;
            return NodeValue.FALSE;
        }
        return null;
    }

    @Override
    public Function create(String uri) {
        throw new UnsupportedOperationException("Not supported yet.");
    }
}
Compare the Nodes that back the various items and use .equals on those.
See NodeValue#asNode and RDFNode.asNode.
If you want value equality (e.g. 1 equals +1.0e0), convert the RDFNode.asNode to a NodeValue and use NodeValue.sameAs.
NodeValue.asString returns the SPARQL syntax form of the value; it is not for use in comparisons.
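A minimal sketch of the difference between term equality and value equality, assuming Jena ARQ is on the classpath (the literal values are illustrative):

```java
import org.apache.jena.sparql.expr.NodeValue;

public class NodeCompareSketch {
    public static void main(String[] args) {
        NodeValue one = NodeValue.makeInteger(1);
        NodeValue oneDouble = NodeValue.makeDouble(1.0);

        // Term equality compares the underlying Nodes (lexical form + datatype),
        // so "1"^^xsd:integer and "1.0e0"^^xsd:double are different terms.
        boolean sameTerm = one.asNode().equals(oneDouble.asNode());

        // Value equality: 1 and 1.0e0 denote the same numeric value.
        boolean sameValue = NodeValue.sameAs(one, oneDouble);

        System.out.println("sameTerm=" + sameTerm + " sameValue=" + sameValue);
    }
}
```

Inside the extension function above, that would mean comparing v1.asNode() / v2.asNode() against stmt's nodes instead of comparing their asString() forms.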

Neo4j error caused by Lucene (Too many open files)

I've just started evaluating Neo4j to see how well it fits our use case.
I'm using the embedded Java API to insert edges and nodes into a graph.
After creating around 5,000 nodes I get the following error (using Neo4j 2.1.6 and 2.1.7 on OS X Yosemite):
org.neo4j.graphdb.TransactionFailureException: Unable to commit transaction
Caused by: javax.transaction.xa.XAException
Caused by: org.neo4j.kernel.impl.nioneo.store.UnderlyingStorageException: java.io.FileNotFoundException: /Users/mihir.k/IdeaProjects/Turant/target/neo4j-hello-db/schema/label/lucene/_8zr.frq (Too many open files)
Caused by: java.io.FileNotFoundException: /Users/mihir.k/IdeaProjects/Turant/target/neo4j-hello-db/schema/label/lucene/_8zr.frq (Too many open files)
I've looked at numerous similar Stack Overflow questions and other related threads online. They all suggest increasing the maximum open files limit.
I've tried doing that.
These are my settings:
kern.maxfiles: 65536
kern.maxfilesperproc: 65536
However this hasn't fixed the error.
While the Neo4j code runs, I monitored open files with lsof | wc -l. The code always breaks when around 10,000 files are open.
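One thing to note: kern.maxfiles and kern.maxfilesperproc are system-wide ceilings; the JVM is bound by the per-process soft limit of the shell that launches it, which on OS X often defaults to a much smaller value (commonly 10240, suspiciously close to the ~10,000 figure above). The commands below are illustrative; output varies per machine:

```shell
# Per-process soft limit inherited by processes started from this shell
ulimit -n

# launchd's soft and hard maxfiles limits on OS X
launchctl limit maxfiles

# Raise the soft limit for this shell session only, then start the app
ulimit -n 65536
```

Raising ulimit only affects processes launched from that same shell session, so it has to be run before starting the JVM.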
The following is the main class that deals with Neo4j:
import java.io.File;
import java.io.Serializable;
import java.util.Map;
import java.util.concurrent.TimeUnit;
import org.neo4j.graphdb.*;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.schema.Schema;
import org.neo4j.graphdb.schema.IndexDefinition;
import org.neo4j.graphdb.index.UniqueFactory;
import org.neo4j.graphdb.index.Index;
import org.neo4j.graphdb.index.IndexHits;
public class Neo4jDB implements Serializable {
    private static final String DB_PATH = "target/neo4j-hello-db-spark";
    IndexDefinition indexDefinition;
    private static GraphDatabaseFactory dbFactory;
    public static GraphDatabaseService db;

    public static void main(String[] args) {
        System.out.println("Life is a disease, sexually transmitted and irrevocably fatal. Stop coding and read some Neil Gaiman.");
    }

    public void startDbInstance() {
        db = new GraphDatabaseFactory().newEmbeddedDatabase(DB_PATH);
    }

    public Node createOrGetNode(LabelsUser360 label, String key, String nodeName, Map<String, Object> propertyMap) {
        System.out.println("Creating/Getting node");
        try (Transaction tx = db.beginTx()) {
            Node node;
            if (db.findNodesByLabelAndProperty(label, key, nodeName).iterator().hasNext()) {
                node = db.findNodesByLabelAndProperty(label, key, nodeName).iterator().next();
            } else {
                node = db.createNode(label);
                node.setProperty(key, nodeName);
            }
            for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
                node.setProperty(entry.getKey(), entry.getValue());
            }
            tx.success();
            return node;
        }
    }

    public void createUniquenessConstraint(LabelsUser360 label, String property) {
        try (Transaction tx = db.beginTx()) {
            db.schema()
              .constraintFor(label)
              .assertPropertyIsUnique(property)
              .create();
            tx.success();
        }
    }

    public void createOrUpdateRelationship(RelationshipsUser360 relationshipType, Node startNode, Node endNode, Map<String, Object> propertyMap) {
        try (Transaction tx = db.beginTx()) {
            if (startNode.hasRelationship(relationshipType, Direction.OUTGOING)) {
                Relationship relationship = startNode.getSingleRelationship(relationshipType, Direction.OUTGOING);
                for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
                    relationship.setProperty(entry.getKey(), entry.getValue());
                }
            } else {
                Relationship relationship = startNode.createRelationshipTo(endNode, relationshipType);
                for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
                    relationship.setProperty(entry.getKey(), entry.getValue());
                }
            }
            tx.success();
        }
    }

    public void registerShutdownHook(final GraphDatabaseService graphDb) {
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                db.shutdown();
            }
        });
    }
}
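One possible contributor worth checking, offered as a guess rather than a confirmed diagnosis: in Neo4j 2.x, findNodesByLabelAndProperty returns a ResourceIterable, and every ResourceIterator obtained from it holds index resources (including Lucene file handles) until it is exhausted or closed. createOrGetNode above obtains two iterators per call and closes neither. A sketch of the same method that takes a single iterator and closes it via try-with-resources (logic otherwise unchanged):

```java
public Node createOrGetNode(LabelsUser360 label, String key, String nodeName, Map<String, Object> propertyMap) {
    try (Transaction tx = db.beginTx();
         ResourceIterator<Node> it =
                 db.findNodesByLabelAndProperty(label, key, nodeName).iterator()) {
        Node node;
        if (it.hasNext()) {
            node = it.next();
        } else {
            node = db.createNode(label);
            node.setProperty(key, nodeName);
        }
        for (Map.Entry<String, Object> entry : propertyMap.entrySet()) {
            node.setProperty(entry.getKey(), entry.getValue());
        }
        tx.success();
        return node;
    }
}
```

ResourceIterator extends AutoCloseable precisely so it can be managed this way; leaking them is a classic cause of "Too many open files" in embedded Neo4j.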
There is another Neo4jAdapter class that implements the domain-specific logic. It uses the Neo4jDB class to add/update nodes, properties and relationships:
import org.apache.lucene.index.IndexWriter;
import org.codehaus.jackson.map.ObjectMapper;
import org.json.*;
import org.neo4j.graphdb.*;
import org.neo4j.graphdb.factory.GraphDatabaseFactory;
import org.neo4j.graphdb.schema.IndexDefinition;
import java.io.*;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;
public class Neo4jAdapter implements Serializable {
    static Neo4jDB n4j = new Neo4jDB();
    public static GraphDatabaseService db = Neo4jDB.db;

    public void begin() {
        n4j.startDbInstance();
    }

    public static void main(String[] args) {}

    public String graphPut(String jsonString) {
        System.out.println("graphput called");
        HashMap<String, Object> map = jsonToMap(jsonString); // JSON deserializer
        Node startNode = n4j.createOrGetNode(...);
        Node endNode = n4j.createOrGetNode(...);
        Map<String, Object> propertyMap = new HashMap<String, Object>();
        propertyMap.put(....);
        try (Transaction tx = Neo4jDB.db.beginTx()) {
            Relationship relationship = startNode.getSingleRelationship(...);
            if (relationship != null) {
                Integer currentCount = (Integer) relationship.getProperty("count");
                Integer updatedCount = currentCount + 1;
                propertyMap.put("count", updatedCount);
            } else {
                Integer updatedCount = 1;
                propertyMap.put("count", updatedCount);
            }
            tx.success();
        }
        n4j.createOrUpdateRelationship(RelationshipsUser360.BLAH, startNode, endNode, propertyMap);
        return "Are you sponge worthy??";
    }
}
Finally, there is a Spark app that calls the "graphput" method of the Neo4jAdapter class. The relevant code snippet (Scala + Spark) is:
val graphdb : Neo4jAdapter = new Neo4jAdapter()
graphdb.begin()
linesEnriched.foreach(a=>graphdb.graphPutMap(a))
where 'a' is a JSON string and linesEnriched is a Spark RDD (basically a set of strings).