class CommandLineApp extends AnyRef
Main application that parses command line arguments and invokes the appropriate extractor.
Instance Constructors
- new CommandLineApp(conf: CmdAppConf)
  - conf: Scallop option reader constructed with class CmdAppConf
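A minimal construction sketch, assuming CmdAppConf extends Scallop's ScallopConf and accepts the raw argument sequence; the flag names shown are hypothetical:

```scala
// Hypothetical sketch: CmdAppConf is assumed to extend org.rogach.scallop.ScallopConf
// and to accept the raw command line arguments. The flag names are illustrative only.
val conf = new CmdAppConf(Seq("--input", "records.root", "--output", "out"))
val app  = new CommandLineApp(conf)
```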
Value Members
- final def !=(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def ##(): Int
  - Definition Classes: AnyRef → Any
- final def ==(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def asInstanceOf[T0]: T0
  - Definition Classes: Any
- def clone(): AnyRef
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native() @HotSpotIntrinsicCandidate()
- final def eq(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- def equals(arg0: Any): Boolean
  - Definition Classes: AnyRef → Any
- final def getClass(): Class[_]
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def handler(): Any
  Prepares for invoking extractors.
  - returns: Any
- def hashCode(): Int
  - Definition Classes: AnyRef → Any
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def isInstanceOf[T0]: Boolean
  - Definition Classes: Any
- final def ne(arg0: AnyRef): Boolean
  - Definition Classes: AnyRef
- final def notify(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- final def notifyAll(): Unit
  - Definition Classes: AnyRef
  - Annotations: @native() @HotSpotIntrinsicCandidate()
- def process(): Any
  Processes the handler.
  - returns: Any
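A sketch of the typical call sequence, assuming the caller validates arguments first and then runs the handler; whether process() invokes handler() internally is not documented here, so both calls are shown for illustration only:

```scala
// Illustrative flow only; everything beyond the class's own method names is an assumption.
app.verifyArgumentsOrExit() // throws IllegalArgumentException on invalid paths
app.handler()               // prepare the extractors
val result = app.process()  // run the prepared handler; result type is Any
```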
- def saveCsv(d: Dataset[Row]): Unit
  Routine for saving a Dataset obtained from querying DataFrames to CSV. Files may be merged according to the options specified in the 'partition' setting.
  - d: generic dataset obtained from querying a DataFrame
  - returns: Unit
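The exact write options are not specified here; a rough sketch of what the CSV path could look like with Spark's DataFrameWriter, assuming the 'partition' setting maps to a coalesce count (outputPath and partitions are illustrative parameters, not part of this class):

```scala
import org.apache.spark.sql.{Dataset, Row}

// Sketch only: merge output files by coalescing to a configured partition count,
// then write as CSV with a header row.
def saveCsvSketch(d: Dataset[Row], outputPath: String, partitions: Int): Unit =
  d.coalesce(partitions)
    .write
    .option("header", "true")
    .csv(outputPath)
```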
- def saveParquet(d: Dataset[Row]): Unit
  Routine for saving a Dataset obtained from querying DataFrames to Parquet. Files may be merged according to the options specified in the 'partition' setting.
  - d: generic dataset obtained from querying a DataFrame
  - returns: Unit
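A matching sketch for the Parquet case, under the same assumptions as the CSV sketch above:

```scala
// Sketch only: same merge strategy as the CSV sketch, written as Parquet.
def saveParquetSketch(d: Dataset[Row], outputPath: String, partitions: Int): Unit =
  d.coalesce(partitions)
    .write
    .parquet(outputPath)
```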
- def setAppName(): String
  Sets the application name.
  - returns: String
- def setSparkContext(sc: SparkContext): Unit
  Sets the Spark context to be used.
  - sc: either a brand new or an existing Spark context
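A brief usage sketch; the SparkConf settings shown (the app name string and the local master) are assumptions for a local run, not values required by this class:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Reuse an existing context, or create one as below for a local run.
val sc = new SparkContext(new SparkConf().setAppName("command-line-app").setMaster("local[*]"))
app.setSparkContext(sc)
```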
- final def synchronized[T0](arg0: ⇒ T0): T0
  - Definition Classes: AnyRef
- def toString(): String
  - Definition Classes: AnyRef → Any
- def verifyArgumentsOrExit(): Unit
  Verifies the validity of the command line arguments regarding input and output files. All input files must exist, and output files must not exist, for this check to pass; an exception is thrown if the condition is not met.
  - returns: Unit
  - Exceptions thrown: IllegalArgumentException if the condition is not met
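One way a caller might handle the failure case (illustrative; the exact error message is not specified by this class):

```scala
try {
  app.verifyArgumentsOrExit()
} catch {
  case e: IllegalArgumentException =>
    // e.g. an input file is missing or an output file already exists
    System.err.println(e.getMessage)
    sys.exit(1)
}
```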
- final def wait(arg0: Long, arg1: Int): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
- final def wait(arg0: Long): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... ) @native()
- final def wait(): Unit
  - Definition Classes: AnyRef
  - Annotations: @throws( ... )
Deprecated Value Members
- def finalize(): Unit
  - Attributes: protected[lang]
  - Definition Classes: AnyRef
  - Annotations: @throws( classOf[java.lang.Throwable] ) @Deprecated @deprecated
  - Deprecated: see corresponding Javadoc for more information.