class OntopSPARQLEngine extends AnyRef

A SPARQL engine based on Ontop as a SPARQL-to-SQL rewriter.

Linear Supertypes
AnyRef, Any

Instance Constructors

  1. new OntopSPARQLEngine(spark: SparkSession, databaseName: String, partitions: Set[RdfPartitionComplex], ontology: Option[OWLOntology])

    spark

    the Spark session

    databaseName

    an existing Spark database that contains the tables for the RDF partitions

    partitions

    the RDF partitions

    ontology

    an (optional) ontology that will be used for query optimization and rewriting
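
    Example:

    A minimal construction sketch, not part of the original documentation. The database name "sansa", the partition placeholder and the import paths for the helper types are assumptions and may differ between SANSA versions; the partitions are assumed to have been computed and registered as Spark tables beforehand.

      import org.apache.spark.sql.SparkSession
      import org.semanticweb.owlapi.model.OWLOntology
      import net.sansa_stack.rdf.common.partition.core.RdfPartitionComplex  // assumed import path

      val spark = SparkSession.builder()
        .appName("Ontop SPARQL-to-SQL example")
        .getOrCreate()

      // previously computed RDF partitions (placeholder in this sketch)
      val partitions: Set[RdfPartitionComplex] = ???

      // no ontology-based query optimization/rewriting in this sketch
      val ontology: Option[OWLOntology] = None

      // "sansa" is an assumed name of an existing database that holds the partition tables
      val engine = new OntopSPARQLEngine(spark, "sansa", partitions, ontology)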

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. val blankNodeStrategy: BlankNodeStrategy.Value
  6. def clear(): Unit

    Free resources, e.g. unregister Spark tables.

  7. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  8. val databaseName: String
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def execAsk(query: String): Boolean

    Executes an ASK query on the provided dataset partitions.

    query

    the SPARQL query

    returns

    true or false depending on the result of the ASK query execution

    Exceptions thrown

    org.apache.spark.sql.AnalysisException if the query execution fails
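
    Example:

    A hypothetical usage sketch; engine is assumed to be an OntopSPARQLEngine constructed as shown above, and the IRI belongs to an example vocabulary.

      // ASK whether any resource of the example type ex:Person exists
      val exists: Boolean = engine.execAsk(
        "ASK { ?s a <http://example.org/Person> }")
      println(s"person found: $exists")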

  12. def execConstruct(query: String): RDD[Triple]

    Executes a CONSTRUCT query on the provided dataset partitions.

    query

    the SPARQL query

    returns

    an RDD of triples

    Exceptions thrown

    org.apache.spark.sql.AnalysisException if the query execution fails
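
    Example:

    A hypothetical usage sketch; engine is assumed to be an already constructed OntopSPARQLEngine and the vocabulary is made up for illustration.

      import org.apache.jena.graph.Triple
      import org.apache.spark.rdd.RDD

      // derive new triples from the partitioned data
      val triples: RDD[Triple] = engine.execConstruct(
        """CONSTRUCT { ?s a <http://example.org/Agent> }
          |WHERE     { ?s a <http://example.org/Person> }""".stripMargin)
      triples.take(10).foreach(println)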

  13. def execSelect(query: String): RDD[Binding]

    Executes a SELECT query on the provided dataset partitions and returns the solution bindings as an RDD.

    query

    the SPARQL query

    returns

    an RDD of solution bindings

    Exceptions thrown

    org.apache.spark.sql.AnalysisException if the query execution fails
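
    Example:

    A hypothetical usage sketch; engine is assumed to be an already constructed OntopSPARQLEngine and the ex:name property is made up for illustration.

      import org.apache.jena.sparql.engine.binding.Binding
      import org.apache.spark.rdd.RDD

      val bindings: RDD[Binding] = engine.execSelect(
        "SELECT ?s ?name WHERE { ?s <http://example.org/name> ?name }")
      bindings.take(10).foreach(println)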

  14. def execute(query: String): DataFrame

    Executes the given SPARQL query on the provided dataset partitions.

    query

    the SPARQL query

    returns

    a DataFrame with the resulting bindings as columns

    Exceptions thrown

    org.apache.spark.sql.AnalysisException if the query execution fails
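
    Example:

    A hypothetical usage sketch; engine is assumed to be an already constructed OntopSPARQLEngine and the query uses an example vocabulary.

      import org.apache.spark.sql.DataFrame

      val df: DataFrame = engine.execute(
        "SELECT ?s ?name WHERE { ?s <http://example.org/name> ?name }")
      df.show(10, truncate = false)  // one column per projected variable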

  15. def executeDebug(query: String): (DataFrame, Option[OntopQueryRewrite])

    Executes the given SPARQL query on the provided dataset partitions, additionally exposing the intermediate Ontop query rewrite for debugging.

    query

    the SPARQL query

    returns

    a DataFrame with the raw result of the SQL query execution and the query rewrite object for processing the intermediate SQL rows (None if the SQL query was empty)

    Exceptions thrown

    org.apache.spark.sql.AnalysisException if the query execution fails
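
    Example:

    A hypothetical usage sketch; engine is assumed to be an already constructed OntopSPARQLEngine.

      val (rawDf, rewriteOpt) = engine.executeDebug(
        "SELECT ?s ?p ?o WHERE { ?s ?p ?o } LIMIT 10")
      rawDf.show()  // raw SQL result, before post-processing of the rows
      rewriteOpt match {
        case Some(rewrite) => println(s"query rewrite available: $rewrite")
        case None          => println("the generated SQL query was empty")
      }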

  16. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  17. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  18. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  19. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  20. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  23. var ontology: Option[OWLOntology]
  24. val partitions: Set[RdfPartitionComplex]
  25. val rdfDatatype2SQLCastName: Map[RDFDatatype, DataType]
  26. val spark: SparkSession
  27. def stop(): Unit

    Shutdown of the engine, i.e. all open resources will be closed.
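
    Example:

    A minimal sketch; engine is assumed to be an already constructed OntopSPARQLEngine.

      // release all resources held by the engine, e.g. at the end of a job
      engine.stop()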

  28. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  29. def toString(): String
    Definition Classes
    AnyRef → Any
  30. val typeFactory: TypeFactory
  31. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  33. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
