
A

accumulate(A, T) - Method in interface net.sansa_stack.hadoop.generic.Accumulating
 
accumulate(Dataset, Quad) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset.AccumulatingDataset
 
accumulatedValue(A) - Method in interface net.sansa_stack.hadoop.generic.Accumulating
 
accumulatedValue(Dataset) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset.AccumulatingDataset
 
Accumulating<T,G,A,U> - Interface in net.sansa_stack.hadoop.generic
 
accumulating - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
AccumulatingDataset() - Constructor for class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset.AccumulatingDataset
 
aggregate(boolean, Flowable<U>, List<U>) - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Modify a flow to perform aggregation of items into records according to the specification. The complex part here is to correctly combine the two flows: the first group of the splitAggregateFlow needs to be skipped, as it is handled by the previous split's processor; if there are no further groups in splitFlow then no items are emitted at all (because all items belong to a previous split); ONLY if the splitFlow owned at least one group is the first group in the tailFlow emitted.
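
A minimal sketch of this combination rule, using RxJava Flowables with plain Strings standing in for record groups (the names and types here are illustrative, not the actual implementation):

    import io.reactivex.rxjava3.core.Flowable;

    public class AggregateSketch {
        // Combine the groups parsed from this split with the head of the next split's flow.
        static Flowable<String> combine(Flowable<String> splitGroups, Flowable<String> tailGroups) {
            // The first group belongs to the previous split's processor, so skip it.
            Flowable<String> owned = splitGroups.skip(1);
            return owned.isEmpty().flatMapPublisher(nothingOwned ->
                    nothingOwned
                            ? Flowable.empty()                              // no owned group: emit nothing at all
                            : Flowable.concat(owned, tailGroups.take(1)));  // otherwise also emit the tail's first group
        }

        public static void main(String[] args) {
            // Cold sources, so subscribing 'owned' twice (isEmpty, then concat) is safe here.
            combine(Flowable.just("g0", "g1", "g2"), Flowable.just("t0", "t1"))
                    .blockingForEach(System.out::println);                  // prints g1, g2, t0
        }
    }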

B

BASE_IRI_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
baseIri - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
baseIriKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
buffer - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
bytesRead - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
bytesRead - Variable in class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 

C

classify(T) - Method in interface net.sansa_stack.hadoop.generic.Accumulating
 
classify(Quad) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset.AccumulatingDataset
 
close() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
close() - Method in class net.sansa_stack.hadoop.util.InputStreamWithCloseIgnore
 
close() - Method in class net.sansa_stack.hadoop.util.InputStreamWithCloseLogging
 
close() - Method in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
close() - Method in interface net.sansa_stack.hadoop.util.ReadableByteChannelDecorator
 
close() - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelWithoutCloseOnInterrupt
 
codec - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
createAccumulator(G) - Method in interface net.sansa_stack.hadoop.generic.Accumulating
 
createAccumulator(Node) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset.AccumulatingDataset
 
createRecordFlow() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
createRecordReader(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
createRecordReaderActual(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
createRecordReaderActual(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.FileInputFormatTrigDataset
 
createRecordReaderActual(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.FileInputFormatTrigQuad
 
createRecordReaderActual(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.jena.rdf.turtle.FileInputFormatTurtleTriple
 
creationStackTrace - Variable in class net.sansa_stack.hadoop.util.InputStreamWithCloseLogging
 
currentKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
currentValue - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

D

datasetFlow - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
decompressor - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
delegate - Variable in class net.sansa_stack.hadoop.util.ReadableByteChannelDecoratorBase
 

F

FileInputFormatRdfBase<T> - Class in net.sansa_stack.hadoop.jena.rdf.base
Base class for unit testing of reading an RDF file with an arbitrary number of splits.
FileInputFormatRdfBase(Lang, String) - Constructor for class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
FileInputFormatTrigDataset - Class in net.sansa_stack.hadoop.jena.rdf.trig
 
FileInputFormatTrigDataset() - Constructor for class net.sansa_stack.hadoop.jena.rdf.trig.FileInputFormatTrigDataset
 
FileInputFormatTrigQuad - Class in net.sansa_stack.hadoop.jena.rdf.trig
 
FileInputFormatTrigQuad() - Constructor for class net.sansa_stack.hadoop.jena.rdf.trig.FileInputFormatTrigQuad
 
FileInputFormatTurtleTriple - Class in net.sansa_stack.hadoop.jena.rdf.turtle
 
FileInputFormatTurtleTriple() - Constructor for class net.sansa_stack.hadoop.jena.rdf.turtle.FileInputFormatTurtleTriple
 
FileOutputFormatTrig2<TKey> - Class in net.sansa_stack.hadoop.output
Not yet used; the idea is to provide an improved version of Elephas' TrigOutputFormat.
FileOutputFormatTrig2() - Constructor for class net.sansa_stack.hadoop.output.FileOutputFormatTrig2
 
FileSplitUtils - Class in net.sansa_stack.hadoop.util
 
FileSplitUtils() - Constructor for class net.sansa_stack.hadoop.util.FileSplitUtils
 
fileSystem - Variable in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
findFirstPositionWithProbeSuccess(Seekable, Predicate<Long>, Matcher, boolean, Predicate<Seekable>) - Static method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Uses the matcher to find candidate probing positions and returns the first position where probing succeeds.
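
A minimal sketch of this probing scheme, assuming the parser probe is represented by a hypothetical LongPredicate that attempts a parse from a given offset:

    import java.util.function.LongPredicate;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    class ProbeSketch {
        // Return the first candidate offset (proposed by the regex) from which probing succeeds, or -1.
        static long findFirstProbeSuccess(CharSequence buffer, Pattern recordStartPattern, LongPredicate tryParseFrom) {
            Matcher m = recordStartPattern.matcher(buffer);
            while (m.find()) {
                long candidate = m.start();
                if (tryParseFrom.test(candidate)) {   // probe: does the actual parser accept this position?
                    return candidate;
                }
            }
            return -1;                                // no valid record start found in this buffer
        }
    }
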
findNextRecord(Pattern, Seekable, long, long, long, long, Predicate<Long>, Predicate<Seekable>) - Static method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

G

generateFileSplits(Path, long, int) - Static method in class net.sansa_stack.hadoop.util.FileSplitUtils
Utility method to create a specific number of splits for a file.
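
A minimal sketch of the split arithmetic presumably behind this utility, dividing a file of a given length into a requested number of roughly equal splits (parameter meaning assumed from the signature):

    long totalLength = 1_000_000L;                                 // file length in bytes
    int numSplits = 4;                                             // requested number of splits
    long splitSize = (totalLength + numSplits - 1) / numSplits;    // ceiling division
    for (int i = 0; i < numSplits; i++) {
        long start = i * splitSize;
        long length = Math.min(splitSize, totalLength - start);
        System.out.printf("split %d: start=%d length=%d%n", i, start, length);
    }
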
getBytesRead() - Method in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
getBytesRead() - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 
getCurrentKey() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
getCurrentValue() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
getDecodedStreamFromSplit(FileSplit, Configuration) - Static method in class net.sansa_stack.hadoop.util.FileSplitUtils
Utility method to open a decoded stream from a split.
getDelegate() - Method in interface net.sansa_stack.hadoop.util.ReadableByteChannelDecorator
 
getDelegate() - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelDecoratorBase
 
getFileExtension() - Method in class net.sansa_stack.hadoop.output.FileOutputFormatTrig2
 
getName() - Method in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
getPos() - Method in interface net.sansa_stack.hadoop.util.SeekableDecorator
 
getProgress() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
getRecordWriter(StreamRDF, Writer, Configuration) - Method in class net.sansa_stack.hadoop.output.FileOutputFormatTrig2
 
getSeekable() - Method in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
getSeekable() - Method in interface net.sansa_stack.hadoop.util.SeekableDecorator
 
getSeekable() - Method in class net.sansa_stack.hadoop.util.SeekableInputStream
 
getSplits(JobContext) - Method in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
getStream(Writer, Configuration) - Method in class net.sansa_stack.hadoop.output.FileOutputFormatTrig2
 

H

hashCode() - Method in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
headerBytesKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

I

identity() - Static method in interface net.sansa_stack.hadoop.generic.Accumulating
Identity accumulator - turns each item into a group that contains only the item and whose value is the item
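
A minimal sketch of this identity behaviour, using a simplified local interface that mirrors Accumulating's four callbacks (the exact signatures are assumed):

    interface Acc<T, G, A, U> {
        G classify(T item);
        A createAccumulator(G groupKey);
        void accumulate(A accumulator, T item);
        U accumulatedValue(A accumulator);
    }

    static <T> Acc<T, T, T, T> identitySketch() {
        return new Acc<T, T, T, T>() {
            public T classify(T item)              { return item; }     // each item is its own group
            public T createAccumulator(T groupKey) { return groupKey; } // the accumulator is the item
            public void accumulate(T acc, T item)  { }                  // nothing further to add
            public T accumulatedValue(T acc)       { return acc; }      // the value is the item itself
        };
    }
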
in - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
initialize(InputSplit, TaskAttemptContext) - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Read out configuration parameters (prefixes, length thresholds, ...) and examine the codec in order to set an internal flag indicating whether the stream will be encoded.
initRecordFlow() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
InputStreamWithCloseIgnore - Class in net.sansa_stack.hadoop.util
 
InputStreamWithCloseIgnore(InputStream) - Constructor for class net.sansa_stack.hadoop.util.InputStreamWithCloseIgnore
Constructs a new ProxyInputStream.
InputStreamWithCloseLogging - Class in net.sansa_stack.hadoop.util
Utility class to debug a "stream already closed" exception.
InputStreamWithCloseLogging(InputStream, BiConsumer<? super Throwable, ? super Throwable>) - Constructor for class net.sansa_stack.hadoop.util.InputStreamWithCloseLogging
 
InputStreamWithZeroOffsetRead - Class in net.sansa_stack.hadoop.util
Workaround for HADOOP-17453: read(bts, off, len) with off != 0 is broken in several versions of BZip2Codec. Invoking read with a non-zero offset creates an intermediate buffer which is read into with a zero offset; the content of the intermediate buffer is then copied to the requesting buffer bts at the appropriate offset.
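
A minimal sketch of this workaround, as a FilterInputStream that always delegates reads at offset 0 and then copies into the caller's array (illustrative class, not the actual implementation):

    import java.io.FilterInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    class ZeroOffsetReadSketch extends FilterInputStream {
        ZeroOffsetReadSketch(InputStream in) { super(in); }

        @Override
        public int read(byte[] bts, int off, int len) throws IOException {
            if (off == 0) {
                return in.read(bts, 0, len);           // no workaround needed
            }
            byte[] tmp = new byte[len];                // intermediate buffer, read at offset 0
            int n = in.read(tmp, 0, len);
            if (n > 0) {
                System.arraycopy(tmp, 0, bts, off, n); // copy to the requested offset
            }
            return n;
        }
    }
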
InputStreamWithZeroOffsetRead(InputStream) - Constructor for class net.sansa_stack.hadoop.util.InputStreamWithZeroOffsetRead
Constructs a new ProxyInputStream.
interrupted - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
InterruptingReadableByteChannel - Class in net.sansa_stack.hadoop.util
ReadableByteChannel whose read method is guaranteed to return at a specific position (before crossing it), such as a split boundary.
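
A minimal sketch of such a bounded channel, assuming the Seekable reports positions in the same byte stream being read (illustrative only):

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.ByteBuffer;
    import java.nio.channels.ReadableByteChannel;
    import org.apache.hadoop.fs.Seekable;

    class BoundedAtPositionSketch implements ReadableByteChannel {
        private final InputStream in;
        private final Seekable seekable;
        private final long interruptPos;
        private boolean open = true;

        BoundedAtPositionSketch(InputStream in, Seekable seekable, long interruptPos) {
            this.in = in; this.seekable = seekable; this.interruptPos = interruptPos;
        }

        @Override public int read(ByteBuffer dst) throws IOException {
            long pos = seekable.getPos();
            if (pos >= interruptPos) return -1;                               // boundary reached: report EOF
            int limit = (int) Math.min(dst.remaining(), interruptPos - pos);  // never read past the boundary
            byte[] buf = new byte[limit];
            int n = in.read(buf, 0, limit);
            if (n > 0) dst.put(buf, 0, n);
            return n;
        }
        @Override public boolean isOpen() { return open; }
        @Override public void close() throws IOException { open = false; in.close(); }
    }
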
InterruptingReadableByteChannel(InputStream, Seekable, long) - Constructor for class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
InterruptingReadableByteChannel(InputStream, Seekable, long, Consumer<InterruptingReadableByteChannel>) - Constructor for class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
interruptPos - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
interruptPosFoundCallback - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
isEncoded - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
isInEofState - Variable in class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 
isOpen() - Method in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
isOpen() - Method in interface net.sansa_stack.hadoop.util.ReadableByteChannelDecorator
 
isOpen() - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelWithoutCloseOnInterrupt
 
isSplitable(JobContext, Path) - Method in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 

L

lang - Variable in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
Input language
lang - Variable in class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfBase
 
lang - Variable in class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfNonAccumulatingBase
 
LocatorHdfs - Class in net.sansa_stack.hadoop.jena.locator
Support for resources stored on Hadoop file systems (HDFS).
LocatorHdfs(FileSystem) - Constructor for class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
LocatorHdfs(FileSystem, String[]) - Constructor for class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
log() - Method in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
logClose(String) - Static method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
logUnexpectedClose(String) - Static method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

M

maxRecordLength - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
maxRecordLengthKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
minRecordLength - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
minRecordLengthKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

N

net.sansa_stack.hadoop.generic - package net.sansa_stack.hadoop.generic
 
net.sansa_stack.hadoop.jena.locator - package net.sansa_stack.hadoop.jena.locator
 
net.sansa_stack.hadoop.jena.rdf.base - package net.sansa_stack.hadoop.jena.rdf.base
 
net.sansa_stack.hadoop.jena.rdf.trig - package net.sansa_stack.hadoop.jena.rdf.trig
 
net.sansa_stack.hadoop.jena.rdf.turtle - package net.sansa_stack.hadoop.jena.rdf.turtle
 
net.sansa_stack.hadoop.output - package net.sansa_stack.hadoop.output
 
net.sansa_stack.hadoop.util - package net.sansa_stack.hadoop.util
 
nextKeyValue() - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

P

parse(Callable<InputStream>) - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Create a flowable from the input stream.
parse(Callable<InputStream>) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
parse(Callable<InputStream>) - Method in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
parse(Callable<InputStream>) - Method in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
PARSED_PREFIXES_LENGTH_DEFAULT - Static variable in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
performOpen(String) - Method in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
prefixBytes - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
PREFIXES_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
PREFIXES_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
PREFIXES_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
PREFIXES_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
prefixesLengthMaxKey - Variable in class net.sansa_stack.hadoop.jena.rdf.base.FileInputFormatRdfBase
 
prefixesMaxLengthKey - Variable in class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfBase
 
prefixesMaxLengthKey - Variable in class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfNonAccumulatingBase
 
probeRecordCount - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
probeRecordCountKey - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

R

rawStream - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
read(byte[], int, int) - Method in class net.sansa_stack.hadoop.util.InputStreamWithZeroOffsetRead
 
read(ByteBuffer) - Method in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
read(ByteBuffer) - Method in interface net.sansa_stack.hadoop.util.ReadableByteChannelDecorator
 
read(ByteBuffer) - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 
read(ByteBuffer) - Method in class net.sansa_stack.hadoop.util.ReadableByteChannelWithoutCloseOnInterrupt
 
ReadableByteChannelDecorator<T extends ReadableByteChannel> - Interface in net.sansa_stack.hadoop.util
 
ReadableByteChannelDecoratorBase<T extends ReadableByteChannel> - Class in net.sansa_stack.hadoop.util
 
ReadableByteChannelDecoratorBase(T) - Constructor for class net.sansa_stack.hadoop.util.ReadableByteChannelDecoratorBase
 
ReadableByteChannelWithConditionalBound<T extends ReadableByteChannel> - Class in net.sansa_stack.hadoop.util
Readable byte channel wrapper that before every read checks for an end-of-file (eof) condition.
ReadableByteChannelWithConditionalBound(T, Predicate<? super ReadableByteChannelWithConditionalBound<T>>) - Constructor for class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 
ReadableByteChannelWithoutCloseOnInterrupt - Class in net.sansa_stack.hadoop.util
ReadableByteChannel over an InputStream that, unlike the channel returned by Channels.newChannel, does not close the stream when the reading thread is interrupted.
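
A minimal sketch of the idea: implement ReadableByteChannel directly over the InputStream instead of going through AbstractInterruptibleChannel (as the channel returned by Channels.newChannel does), so a thread interrupt does not close the underlying stream:

    import java.io.IOException;
    import java.io.InputStream;
    import java.nio.ByteBuffer;
    import java.nio.channels.ReadableByteChannel;

    class NonInterruptingChannelSketch implements ReadableByteChannel {
        private final InputStream in;
        private volatile boolean open = true;

        NonInterruptingChannelSketch(InputStream in) { this.in = in; }

        @Override public int read(ByteBuffer dst) throws IOException {
            byte[] buf = new byte[dst.remaining()];
            int n = in.read(buf, 0, buf.length);   // plain blocking read; interrupts leave the stream open
            if (n > 0) dst.put(buf, 0, n);
            return n;
        }
        @Override public boolean isOpen() { return open; }
        @Override public void close() throws IOException { open = false; in.close(); }
    }
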
ReadableByteChannelWithoutCloseOnInterrupt(InputStream) - Constructor for class net.sansa_stack.hadoop.util.ReadableByteChannelWithoutCloseOnInterrupt
 
RECORD_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
RECORD_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
RECORD_MAXLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
RECORD_MINLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
RECORD_MINLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
RECORD_MINLENGTH_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
RECORD_PROBECOUNT_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
RECORD_PROBECOUNT_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
RECORD_PROBECOUNT_KEY - Static variable in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
RecordReaderGenericBase<U,G,A,T> - Class in net.sansa_stack.hadoop.generic
A generic record reader that uses a callback mechanism to detect a consecutive sequence of records that must start in the current split and which may extend over any number of successor splits.
RecordReaderGenericBase(String, String, String, Pattern, String, String, Accumulating<U, G, A, T>) - Constructor for class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
RecordReaderGenericRdfBase<U,G,A,T> - Class in net.sansa_stack.hadoop.jena.rdf.base
 
RecordReaderGenericRdfBase(String, String, String, String, Pattern, Lang, Accumulating<U, G, A, T>) - Constructor for class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfBase
 
RecordReaderGenericRdfNonAccumulatingBase<T> - Class in net.sansa_stack.hadoop.jena.rdf.base
 
RecordReaderGenericRdfNonAccumulatingBase(String, String, String, String, Pattern, Lang) - Constructor for class net.sansa_stack.hadoop.jena.rdf.base.RecordReaderGenericRdfNonAccumulatingBase
 
RecordReaderTrigDataset - Class in net.sansa_stack.hadoop.jena.rdf.trig
RecordReader for the TriG RDF format that groups consecutive quads having the same graph IRI into Datasets.
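
A minimal sketch of this grouping rule with the Jena API, where 'emit' is a hypothetical sink for completed Dataset records:

    import java.util.List;
    import java.util.function.Consumer;
    import org.apache.jena.graph.Node;
    import org.apache.jena.query.Dataset;
    import org.apache.jena.query.DatasetFactory;
    import org.apache.jena.sparql.core.Quad;

    class GroupByGraphSketch {
        static void groupByGraph(List<Quad> quads, Consumer<Dataset> emit) {
            Dataset current = DatasetFactory.create();
            Node currentGraph = null;
            for (Quad q : quads) {
                if (currentGraph != null && !currentGraph.equals(q.getGraph())) {
                    emit.accept(current);                 // graph IRI changed: record boundary
                    current = DatasetFactory.create();
                }
                currentGraph = q.getGraph();
                current.asDatasetGraph().add(q);
            }
            if (currentGraph != null) {
                emit.accept(current);                     // flush the final group
            }
        }
    }
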
RecordReaderTrigDataset() - Constructor for class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
RecordReaderTrigDataset.AccumulatingDataset - Class in net.sansa_stack.hadoop.jena.rdf.trig
 
RecordReaderTrigQuad - Class in net.sansa_stack.hadoop.jena.rdf.trig
 
RecordReaderTrigQuad() - Constructor for class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
RecordReaderTurtleTriple - Class in net.sansa_stack.hadoop.jena.rdf.turtle
 
RecordReaderTurtleTriple() - Constructor for class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
 
recordStartPattern - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Regex pattern used to search for candidate record starts; it avoids having to invoke the actual parser (which may start a new thread) at every single character position.

S

SCHEME_NAMES - Static variable in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 
seek(long) - Method in interface net.sansa_stack.hadoop.util.SeekableDecorator
 
seekable - Variable in class net.sansa_stack.hadoop.util.InterruptingReadableByteChannel
 
seekable - Variable in class net.sansa_stack.hadoop.util.SeekableInputStream
 
SeekableDecorator - Interface in net.sansa_stack.hadoop.util
 
SeekableInputStream - Class in net.sansa_stack.hadoop.util
Combines Hadoop's Seekable and InputStream into one class
SeekableInputStream(InputStream, Seekable) - Constructor for class net.sansa_stack.hadoop.util.SeekableInputStream
Constructs a new ProxyInputStream.
seekToNewSource(long) - Method in interface net.sansa_stack.hadoop.util.SeekableDecorator
 
setStreamToInterval(long, long) - Method in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
Seek to a given offset and prepare to read up to the 'end' position (exclusive). For non-encoded streams this just performs a seek on the stream and returns start/end unchanged.
split - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
splitEnd - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
splitLength - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
splitStart - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 
stackTraceConsumer - Variable in class net.sansa_stack.hadoop.util.InputStreamWithCloseLogging
 
stream - Variable in class net.sansa_stack.hadoop.generic.RecordReaderGenericBase
 

T

testForEof - Variable in class net.sansa_stack.hadoop.util.ReadableByteChannelWithConditionalBound
 
trigFwdPattern - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigDataset
 
trigFwdPattern - Static variable in class net.sansa_stack.hadoop.jena.rdf.trig.RecordReaderTrigQuad
 
turtleRecordStartPattern - Static variable in class net.sansa_stack.hadoop.jena.rdf.turtle.RecordReaderTurtleTriple
Syntactic constructs in Turtle can start with: base / @base, prefix / @prefix, <foo> (an IRI), [ ] (a blank node), or foo: (a CURIE). TODO: Anything missing?
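
A minimal sketch of a pattern matching these candidate starts (illustrative only, not the actual field value):

    import java.util.regex.Pattern;

    // base/@base, prefix/@prefix, an IRI "<...>", a blank node "[", or a prefixed name such as "foo:"
    Pattern candidateStart = Pattern.compile(
            "@?(?:base|prefix)\\b|<[^>]*>|\\[|\\w*:",
            Pattern.CASE_INSENSITIVE);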

W

wrap(InputStream, Function<? super Throwable, String>, Consumer<? super String>) - Static method in class net.sansa_stack.hadoop.util.InputStreamWithCloseLogging
Convenience method for e.g.

X

xschemeNames - Variable in class net.sansa_stack.hadoop.jena.locator.LocatorHdfs
 

Copyright © 2016–2021 Smart Data Analytics (SDA) Research Group. All rights reserved.